New Study On Visual/Verbal Interaction Needs More Participants


Most of us look at our companion’s face when they’re speaking. “There’s a lot of information on the face, not just identity,” explains Julia Irwin, an associate professor of psychology at Southern Connecticut State University. “I know how you feel, but I can also see the visible articulation of your speech.” But children with autism often avoid eye contact and don’t look at others’ faces during verbal exchanges. Does this reduce their ability to detect information, especially in a noisy environment? Does it make learning to talk more difficult? These questions prompted Irwin to spearhead a unique study — using new technology — called Listening to Faces. The research is gaining traction, prompting participation and hope for new therapies in the future.

“We have fifty children in the study already, although that’s about half of what we’d like to have,” says Irwin, who is looking for participants between the ages of 6 and 12 who have autism or speech problems. The study also includes neurotypical kids. After a testing session at SCSU — when kids get comfortable with the process — they head to Haskins Laboratories, at Yale University in New Haven, where they put on a cap lined with electrodes. “We measure brain activity while they watch videos of people talking,” says Irwin, also a senior scientist at the lab, noting that the electrodes are non-invasive. “We pair that with an eye tracker. So, we can actually say when a child is looking at a face, is their brain detecting change in what’s being said?”

Irwin and 25 student helpers have also developed an iPad app called Listening to Faces. For six months, every participant in the study takes a tablet home to use the app, which shows faces on the screen saying words as background noise is increased. “The goal is for us to provide — with our basic data — an effective, useful tool to help increase understanding of speech,” says Irwin, who believes this game can “train” a child to watch a face, inspiring new, targeted intervention starting at a young age. This mother of two has a history of working on speech issues as director of the LEARN (Language and Early Assessment Research Network) Center at Haskins.

The research, federally funded for 12 years, evolves as new technologies are added, and it inspires enthusiasm from the families involved. Irwin is excited, hopeful that she and her team are on the verge of important discoveries that will provide significant emotional, social and linguistic help for youngsters in the future: “I think that is really a good example of the community and the scientific community working together so that we can benefit kids as they develop.”

For more information, email listeningtofaces@haskins.yale.edu. To see the research, watch Monday’s Fox CT Morning News.