A mobile app successfully distinguished toddlers diagnosed with autism from typically developing toddlers based on their eye movements while watching videos, according to a study funded by the National Institutes of Health. The findings suggest that the app could one day screen infants and toddlers for autism and refer them for early intervention, when chances for treatment success are greatest.
The study appears in JAMA Pediatrics and was conducted by Geraldine Dawson, Ph.D., director of the NIH Autism Center of Excellence at Duke University, and colleagues. Funding was provided by NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) and National Institute of Mental Health.
Studies have found that the human brain is hard-wired for social cues, with a person’s gaze automatically focusing on social signals. In autism, attention to social stimuli is reduced, and researchers have sought to screen for autism in young children by tracking their eye movements while they view social stimuli. However, the equipment used for visual tracking is expensive and requires specially trained personnel, limiting its use outside of laboratory settings.
The current study enrolled 933 toddlers ages 16 to 38 months during a well-child primary care visit. Of these children, 40 were later diagnosed with autism. During the visit, the children watched short videos on a mobile device showing people smiling and making eye contact or engaging in conversation. Researchers recorded the children’s gaze patterns with the device’s camera and measured them using computer vision and machine learning analysis. Children with autism spectrum disorder (ASD) were much less likely than typically developing children to focus on social cues and visually track the conversations in the videos.
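The article does not describe the study's computer vision pipeline in detail, but the core idea of scoring where a child looks relative to a "social" region of the screen can be sketched simply. The Python example below is a hypothetical, simplified stand-in, not the study's actual software: it assumes per-frame gaze coordinates have already been estimated from the camera video, and the SOCIAL_ROI rectangle and social_attention_fraction function are illustrative names chosen for this sketch.

```python
"""Illustrative sketch only: reduces per-frame gaze estimates to a single
'social attention' score. The real study used its own computer vision and
machine learning analysis; this is a simplified, hypothetical stand-in."""
import numpy as np

# Hypothetical screen layout: the actor's face occupies this rectangle
# (normalized screen coordinates, 0..1 on both axes).
SOCIAL_ROI = {"x_min": 0.30, "x_max": 0.70, "y_min": 0.10, "y_max": 0.55}


def social_attention_fraction(gaze_xy: np.ndarray) -> float:
    """Fraction of valid gaze samples that fall inside the social ROI.

    gaze_xy: array of shape (n_frames, 2) with normalized (x, y) gaze
    estimates; rows containing NaN (tracking dropouts) are ignored.
    """
    valid = gaze_xy[~np.isnan(gaze_xy).any(axis=1)]
    if len(valid) == 0:
        return float("nan")
    in_roi = (
        (valid[:, 0] >= SOCIAL_ROI["x_min"]) & (valid[:, 0] <= SOCIAL_ROI["x_max"])
        & (valid[:, 1] >= SOCIAL_ROI["y_min"]) & (valid[:, 1] <= SOCIAL_ROI["y_max"])
    )
    return float(in_roi.mean())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic example: one viewer's gaze clusters near the face region,
    # another's is scattered across the whole screen.
    focused = rng.normal(loc=[0.5, 0.3], scale=0.08, size=(300, 2))
    scattered = rng.uniform(low=0.0, high=1.0, size=(300, 2))
    print("focused viewer:", round(social_attention_fraction(focused), 2))
    print("scattered viewer:", round(social_attention_fraction(scattered), 2))
```

On the synthetic data, the focused viewer scores near 1.0 while the scattered viewer scores near the ROI's share of the screen (about 0.18), illustrating the kind of group separation the researchers measured, although the actual analysis relied on richer features and trained models.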
The authors concluded that, pending confirmation by larger studies, this eye-tracking app featuring specially designed videos and computer vision analysis could be a viable method for identifying young children with autism.