
Over the past decade, researchers at the University of Kansas (KU) have developed an innovative virtual reality (VR) system aimed at helping students with disabilities, particularly those with autism, enhance their social skills. Now, this groundbreaking project is set to reach new heights with the addition of artificial intelligence (AI) components, thanks to a five-year, $2.5 million grant from the U.S. Office of Special Education Programs.
The project, titled Increasing Knowledge and Natural Opportunities With Social Emotional Competence (iKNOW), aims to create an extended reality (XR) experience that offers students more natural and immersive social interactions. Building on the success of the existing VR system, Virtual Reality Opportunity to Integrate Social Skills (VOISS), which has already proven effective in improving social skills for students with disabilities, iKNOW will integrate advanced AI technologies to extend these capabilities further.
The original VOISS system includes 140 unique learning scenarios designed to teach 183 social skills in virtual school environments, such as classrooms, hallways, cafeterias, and buses. These scenarios are accessible through various platforms, including iPads, Chromebooks, and Oculus VR headsets, allowing students to practice social skills across multiple settings.
Introducing AI to Enhance Realism and Interaction
The new iKNOW system will leverage large language models and AI technologies to enable more natural user interactions. Instead of responding to pre-recorded narratives by pushing buttons, students can engage in real-time conversations, with AI accurately transcribing spoken language and generating appropriate video responses from avatars. This approach aims to make social skill practice more relatable and effective.
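The conversational loop described above, speech in, transcription, then an AI-generated avatar response, can be sketched in miniature. The code below is purely illustrative and is not drawn from the iKNOW codebase: a production system would call a real speech-to-text service and a large language model, and all function names here are hypothetical stubs.

```python
# Illustrative sketch of a "human in the loop" conversation turn.
# NOT the iKNOW implementation: the transcription and language-model
# steps are stubbed so the overall flow is runnable.

def transcribe(audio_text: str) -> str:
    """Stub for speech-to-text: pretend the audio has already been decoded."""
    return audio_text.strip().lower()


def generate_avatar_reply(utterance: str) -> str:
    """Stub for an LLM call: pick a modeled social response by keyword.

    A real system would prompt a large language model to model the
    target social behavior instead of using fixed rules.
    """
    if "hello" in utterance or "hi" in utterance:
        return "Hi! It's nice to meet you. How is your day going?"
    if "?" in utterance:
        return "That's a good question. Let's think about it together."
    return "Thanks for sharing that. Can you tell me more?"


def conversation_turn(audio_text: str) -> str:
    # 1. Transcribe the student's spoken input.
    # 2. Generate an avatar response that models the desired social skill.
    return generate_avatar_reply(transcribe(audio_text))


print(conversation_turn("Hello there!"))
```

The key design point the article describes is replacing button presses on pre-recorded narratives with this open-ended turn-taking loop, so the student speaks freely and the avatar answers in kind.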
Amber Rowland, an assistant research professor at KU’s Center for Research on Learning and one of the grant’s co-principal investigators, explained, “Avatars in iKNOW can have certain reactions and behaviors based on what we want them to do. They can model the practices we want students to see. The system will harness AI to ensure students have more natural interactions and put them in the role of the ‘human in the loop’ by allowing them to speak, and it will respond like a normal conversation.”
Monitoring Progress and Ensuring Realism
iKNOW will also include a real-time student progress monitoring system, providing valuable feedback to students, educators, and families. This system will track metrics such as the length and frequency of spoken responses, the number of keywords used, and areas where students may have struggled, helping to enhance overall understanding and progress.
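The per-session metrics the article lists, response count, response length, and target-keyword usage, could be tracked with a small accumulator like the one below. This is a hypothetical sketch for illustration, not the iKNOW monitoring system; the class and field names are invented.

```python
# Hypothetical sketch of per-session progress metrics (not from iKNOW):
# counts responses, averages their length, and tallies target keywords.

from dataclasses import dataclass, field


@dataclass
class SessionMetrics:
    target_keywords: set[str]                      # skills vocabulary to watch for
    responses: list[str] = field(default_factory=list)

    def record(self, response: str) -> None:
        """Log one spoken response from the student."""
        self.responses.append(response)

    def summary(self) -> dict:
        """Aggregate the metrics an educator or family might review."""
        words = [r.split() for r in self.responses]
        keyword_hits = sum(
            1
            for ws in words
            for w in ws
            if w.lower().strip(".,!?") in self.target_keywords
        )
        return {
            "response_count": len(self.responses),
            "avg_words_per_response": (
                sum(len(ws) for ws in words) / len(words) if words else 0.0
            ),
            "keyword_hits": keyword_hits,
        }


session = SessionMetrics(target_keywords={"please", "thank", "sorry"})
session.record("Can I please borrow a pencil?")
session.record("Thank you!")
print(session.summary())
# → {'response_count': 2, 'avg_words_per_response': 4.0, 'keyword_hits': 2}
```

A real monitoring system would also need to flag where a student struggled, for example turns with long pauses or no keywords, which this sketch omits.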
To ensure a realistic experience, all avatar voices in iKNOW are provided by real middle school students, educators, and administrators. This approach spares students the potential discomfort of practicing social skills with classmates in supervised sessions, allowing them to rehearse until they are confident enough to transfer these skills to real-life situations.
Benefits for Teachers and Students
Maggie Mosher, an assistant research professor at KU’s Achievement & Assessment Institute and co-principal investigator for the grant, emphasized the system’s potential to lighten teachers’ workloads and provide practical student tools. “It will leverage our ability to take something off of teachers’ plates and provide tools for students to learn these skills in multiple environments,” Mosher said.
Mosher’s doctoral dissertation compared VOISS to other social skills interventions and found that it produced statistically significant, valid improvements in social skills and knowledge across multiple domains. This research, published in high-impact journals, underscores the system’s effectiveness.
Future Prospects and Collaboration
The iKNOW project, supported by one of four OSEP Innovation and Development grants, will be presented at the annual I/ITSEC conference, the world’s largest modeling, simulation, and training event. The research team, including principal investigator Sean Smith, professor of special education, and their colleagues, will showcase their work there.
The team has successfully implemented VOISS in schools nationwide, and iKNOW is set to expand its reach. Interested parties can learn more about the system, access demonstrations and videos, and contact developers through the iKNOW website.
Additionally, iKNOW will provide resources for teachers and families through a dedicated website called iKNOW TOOLS (Teaching Occasions and Opportunities for Learning Supports), supporting the generalization of social skills across real-world settings.
“By combining our research-based social emotional virtual reality work (VOISS) with the increasing power and flexibility of AI, iKNOW will further personalize the learning experience for individuals with disabilities along with struggling classmates,” said Sean Smith. “Our hope and expectation is that iKNOW will further engage students to develop the essential social emotional skills to then apply in the real world to improve their overall learning outcomes.”