When most toddlers diagnosed with autism sit still, their heads sway very slightly back and forth.

That movement is an extremely reliable way to distinguish between a neurotypical child and a child with autism, research has shown. But it’s almost imperceptible, even to a seasoned clinician.

For a computer trained to see it, though, the movement is obvious.


“A computer just gives you a beautiful readout of how many times the head moves back and forth,” said Geraldine Dawson, director of Duke University’s Center for Autism and Brain Development.

Dawson and her colleagues are testing an app that Duke has patented to harness that power. It uses artificial intelligence to analyze toddlers’ movements, eye positions and facial expressions, among other things, to help predict who has autism.

Children with autism display a number of subtle behaviors, like the head sway, that can be used to accurately diagnose kids early, Dawson said. When they’re watching a video, for example, they spend less time looking at the people on the screen than their neurotypical peers do.

Using a recently awarded $12 million National Institutes of Health grant, Dawson’s lab will test the digital app to see whether computers can offer a more accessible and objective way of diagnosing autism.

On a phone screen, the app displays short videos designed by the researchers to gauge toddlers’ social interest. At the same time, it observes the child’s movements and marks behaviors associated with autism in real time.

“By using a computer, you’re using a very objective technique that is able to really pick up on the subtleties in a way that’s much more reliable,” Dawson said of the app, which is not yet used by clinicians.

Left Behind

Earlier is almost always better when it comes to autism intervention.

Early treatment is often associated with a higher IQ and a higher likelihood the child will be able to learn in a traditional school setting, Dawson said.

“It is so effective, especially when it’s done early, that many children no longer meet diagnostic criteria for autism,” said Dr. Theresa Flynn, vice president of the North Carolina Pediatric Society.

Yet not all children receive early diagnoses or therapy. In the United States, Black children are typically diagnosed with autism three years later than white children. Girls are diagnosed about a year and a half later than boys, on average.

The differences are partly due to disparities in health care access, Dawson said. Children who don’t go to the doctor don’t get screened for autism.

Dawson said she hopes the Duke app, which can be used to capture data in clinics and homes, could help make this process more accessible to those families.

“The remarkable thing is the fact that we’re really just using a smartphone,” she said. “There’s no equipment, no research assistant or physician or anything.”

Improving Screening?

Many researchers, including Dawson, believe disparities in diagnosis are partly due to a flawed screening system that relies on measures susceptible to human bias.

Primary care providers typically screen toddlers for autism using a 20-question survey that asks parents questions like: “Does your child play pretend or make-believe?” and “Is your child interested in other children?”

While survey-based screening is important, it has clear blind spots, Dawson said. One study involving 26,000 children found that the survey is less accurate for girls, children of color and those from lower-income households.

Artificial intelligence circumvents some of those biases by focusing more on objective measures, like eye position, rather than a parent’s interpretation of their child’s behavior, Dawson said.

Flynn, the North Carolina Pediatric Society vice president, noted, however, that no computer algorithm is truly objective.

One study published last year in Nature Medicine found that an algorithm used to screen chest X-rays routinely missed signs of disease in female, Black and Hispanic patients, incorrectly labeling them as healthy. Other researchers have raised concerns that algorithms tasked with detecting skin cancer, many of which are trained primarily on photos of white patients, might do worse at detecting the cancer on Black patients.

Dawson said her research has shown that the autism app’s algorithm is equally good at identifying autism-related behaviors in kids of different races and ethnicities. In future experiments her lab will test whether this holds true in a larger population, she said.

Dr. Kristin Sohl, who heads a subcommittee on autism for the American Academy of Pediatrics, said there’s a certain amount of human expertise that’s still essential for diagnosing autism.

“While these tools are great, I still think there’s an overlay of that clinical judgment that’s required,” she said.

© 2022 The News & Observer
Distributed by Tribune Content Agency, LLC
