Teaching AI to read fetal ultrasound in low- and middle-income countries

Northwestern Medicine and Google are collaborating on a project to bring fetal ultrasound to developing countries by combining AI (artificial intelligence), low-cost hand-held ultrasound devices, and a smartphone.

The project will develop algorithms that enable AI to read ultrasound images from these devices taken by lightly trained community health workers, or even by pregnant people at home, to assess the welfare of both the birth parent and the child.

Image credit: Northwestern University

“We want to make getting a high-quality fetal ultrasound as easy as taking your temperature,” said Dr. Mozziyar Etemadi, assistant professor of anesthesiology at Northwestern University Feinberg School of Medicine and project leader at Northwestern. He is also a Northwestern Medicine physician and medical director of Advanced Technologies for the health system.

The World Health Organization recommends an ultrasound evaluation before 24 weeks of pregnancy. It helps in assessing the health of the birth parent and child, planning the course of the pregnancy, and monitoring early risks and complications. Worldwide, an estimated 300,000 maternal deaths and 2.5 million perinatal deaths occur each year, with 94% occurring in low- and middle-income countries.

Ultrasound technology is becoming more portable and more affordable. Still, up to half of all birth parents in developing countries receive no ultrasound screening during pregnancy.

This is because existing hand-held devices require a trained technician to precisely manipulate the ultrasound probe to capture the correct images. The images must then be interpreted by a radiologist or a specially trained obstetrician. Trained technicians and physicians are in limited supply in many underserved communities and developing countries. This is where AI comes in.

“The new AI algorithm will be able to take the many imperfect ultrasound images captured with inexpensive, handheld ultrasound devices and interpret them as if they were perfect images,” Etemadi said. The raw ultrasound images will be sent to a smartphone, where the AI will isolate important features such as the age and position of the fetus.

The low-cost device will capture the images and send them to a smartphone, where the AI will provide a reading on factors such as the age and condition of the fetus.

“Training a new AI requires a lot of data, and in this case the data doesn’t exist anywhere, because amateur ultrasound images aren’t routinely collected,” Etemadi said. “We first have to build this database of amateur ultrasounds with these wireless probes.”

Google Health can then develop an AI that interprets the fetal images.

In the first phase of developing the algorithm, Etemadi and his team will conduct research with pregnant patients from Northwestern Medicine in which the patients perform their own ultrasounds with a low-cost hand-held device. Northwestern technicians will also perform fetal ultrasounds on the patients, and family members may even take part. This will be followed by each patient’s routine diagnostic fetal ultrasound. All images and other pregnancy-related data will be loaded into a database.

Study participants will use handheld ultrasound devices that come pre-installed with Google Health’s custom application to collect, process, and deliver a fetal ultrasound “blind sweep.” A blind-sweep ultrasound consists of six freehand sweeps of the probe across the abdomen to produce a series of images.
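The article does not describe how the blind-sweep images would be processed, but the general shape of such a pipeline can be sketched. The following is a hypothetical illustration only: every name, the pooling step, and the toy linear mapping are assumptions standing in for whatever trained model Google Health ultimately builds.

```python
# Hypothetical sketch of a blind-sweep pipeline: six freehand sweeps,
# each a sequence of image frames, are pooled into one feature vector
# and mapped to a gestational-age estimate. The "model" here is a toy
# placeholder, not the project's actual algorithm.
from dataclasses import dataclass
from typing import List

@dataclass
class Sweep:
    frames: List[List[float]]  # each frame: a flattened grayscale image

def pool_frames(sweeps: List[Sweep]) -> List[float]:
    """Average all frames from all sweeps into one mean feature vector."""
    frames = [f for s in sweeps for f in s.frames]
    n = len(frames)
    width = len(frames[0])
    return [sum(f[i] for f in frames) / n for i in range(width)]

def estimate_gestational_age(sweeps: List[Sweep]) -> float:
    """Stand-in for a learned regressor mapping pooled features to weeks."""
    pooled = pool_frames(sweeps)
    brightness = sum(pooled) / len(pooled)
    # Toy linear mapping (placeholder for a trained model).
    return 20.0 + 10.0 * (brightness - 0.5)

# Six sweeps of two tiny dummy frames each.
sweeps = [Sweep(frames=[[0.5] * 4, [0.6] * 4]) for _ in range(6)]
age_weeks = estimate_gestational_age(sweeps)
```

In a real system, the regressor would be a neural network trained on the paired professional and amateur scans the study is collecting; the point of the sketch is only that the six sweeps are combined before any single estimate is produced.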

The goal is to collect a comprehensive set of data and related information, including reports on fetal-growth restriction, placental location, gestational age, and other relevant conditions and risk factors. Data will be collected from all three trimesters and from a diverse representative group of patients. The study will collect ultrasound images from several thousand patients over the next year. All patients will consent to inclusion.

The AI will receive professional and amateur images covering the many conditions clinicians typically want to monitor, such as the age of the fetus and whether it has a heart defect. With side-by-side capture of the same pregnancies, the AI can learn to interpret the amateur images more precisely.

“In the developing world, people are far away from health care, sometimes a walk of several days,” Etemadi said. “People don’t get health care, or when they do, it’s too late.”

“The real power of this AI tool will be to enable a first point of care, so lightly trained community health providers can scan birth parents without patients having to travel to a city. The AI will help inform what to do next – whether the patient is fine or needs to go to a higher level of care. We truly believe it will save the lives of a lot of birth parents and babies.”

Source: Northwestern University

