Colleges and universities are increasingly partnering with artificial intelligence (AI) companies to make course materials accessible to students of all ability levels, as required by the Americans with Disabilities Act (ADA). AI systems, paired with human experts, can accelerate the delivery of transcriptions to students, lighten the workload of human accessibility officers, reduce costs for institutions, enhance personalized learning, and provide big data insights, reports Education Dive.
Integrated systems aiding in government-mandated compliance
The ADA and the Rehabilitation Act require all public colleges and universities that receive government funding to provide transcription services. But as more classrooms and lessons incorporate video content—which can take humans hours to transcribe—schools’ accessibility departments are facing mounting backlogs of transcription requests. Speech-to-text AI systems like Verbit are helping accessibility officers process and fulfill requests more efficiently for people who are deaf, hard of hearing, or who have difficulty taking notes.
Cynthia Curry, director of the National Center on Accessible Educational Materials for Learning, says technology has historically created many barriers to learning. “If there can be systems built within technology to automatically, accurately, and consistently make sure that the technology is being delivered in a way that’s inherently accessible to all learners, that’s really exciting.”
AI systems requiring human moderation
AI systems are not fully accurate, however, and still require a human layer of quality control. For example, a speech-to-text system will have trouble filtering noises in a full classroom or decoding a lecturer’s jargon. Human transcribers can correct AI mistakes and help teachers understand what caused them. “We can start to, not hand over the reins to artificial intelligence, but figure out good and healthy partnerships,” said Christopher Phillips, electronic and information technology accessibility coordinator at Utah State University.
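One common way to structure this kind of human–AI partnership is confidence-based routing: the machine keeps the segments it is sure about, and everything else goes to a person. The sketch below is illustrative only; the segment format, confidence scores, and 0.85 threshold are assumptions, not details of Verbit's or any other vendor's system.

```python
# Hypothetical sketch: route low-confidence speech-to-text segments
# to a human review queue. All data and the threshold are invented.

CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff for auto-approval

def route_segments(segments):
    """Split transcript segments into auto-approved and human-review lists."""
    auto_ok, needs_review = [], []
    for seg in segments:
        if seg["confidence"] >= CONFIDENCE_THRESHOLD:
            auto_ok.append(seg)
        else:
            needs_review.append(seg)
    return auto_ok, needs_review

segments = [
    {"text": "Welcome to Biology 101.", "confidence": 0.97},
    {"text": "Today we cover mitochondrial [unclear]", "confidence": 0.62},
]
auto_ok, needs_review = route_segments(segments)
# The noisy, jargon-heavy segment lands in the human queue.
```

The appeal of this design is that human effort concentrates exactly where the model is weakest—noisy rooms, specialized vocabulary—rather than being spread across an entire lecture.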
Machine learning shaping educator resource selection
AI analyses also can flag accessibility issues within course materials—and in doing so, teach educators how to refine their resource selection and presentation. AI systems like Blackboard Ally scan course materials, flagging obstacles to accessibility like untagged PDF formatting, under-organized text that screen readers might struggle to interpret, or suboptimal image contrasts that challenge low-vision students.
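The contrast check mentioned above is one of the few accessibility rules that can be stated precisely: WCAG defines a relative-luminance formula and a minimum 4.5:1 contrast ratio for body text. The snippet below implements those published formulas as a minimal sketch of the kind of check a scanner like Blackboard Ally might run; the sample colors are made up.

```python
# Illustrative contrast check using the WCAG 2.x relative-luminance
# and contrast-ratio formulas. Sample colors are invented.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color (0-255 channels), per WCAG."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum 21:1 and passes the 4.5:1 AA threshold.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Light gray on white falls well short and would be flagged.
print(contrast_ratio((200, 200, 200), (255, 255, 255)) >= 4.5)  # False
```

Checks like untagged PDFs or heading structure are harder to reduce to a formula, which is why the scan results still feed back to a human conversation with the instructor.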
Human experts can use these analyses to train educators about evaluating primary sources for accessibility. “A lot of instructors may not have considered before (or) even thought that their material could be presented in a different format, so just having that in front of (them) can be really powerful,” said Curry.
Enhancing learning for ESL students
In addition, machine learning systems can benefit students without disabilities. Echo360’s closed-captioning service can help students search video transcripts and learn the same content in two formats, which aids information retention. Online community college students learning English as a second language can benefit from Voxy, which learns about students and adapts to their skill levels and interests to create personalized learning paths, often in a fraction of the time that it would take their instructors. Katie Nielson, chief education officer at Voxy, said, “With language learning, the more personalized you can make content recommendations, the better.”
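Transcript search is the simplest of these features to picture: once a lecture has timestamped captions, finding every mention of a term is a plain text lookup. The toy example below is not Echo360's implementation; the entry format and sample transcript are assumptions for illustration.

```python
# Toy transcript search of the kind a captioning service might offer.
# The (timestamp, text) entries are invented sample data.

def search_transcript(entries, keyword):
    """Return the (timestamp, text) entries containing keyword, case-insensitively."""
    kw = keyword.lower()
    return [(t, text) for t, text in entries if kw in text.lower()]

transcript = [
    ("00:00:12", "Today we introduce photosynthesis."),
    ("00:04:31", "Photosynthesis converts light energy into chemical energy."),
    ("00:09:05", "Next week: cellular respiration."),
]
hits = search_transcript(transcript, "photosynthesis")
# hits contains the first two entries; a student can jump straight
# to 00:00:12 or 00:04:31 instead of scrubbing through the video.
```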
Using big data for major insights
Overall, the data captured by these systems can provide insights into how students interact with educational resources.
“There’s going to be all kinds of relationships we had no idea really existed about how learning takes place,” said Fred Singer, CEO of Echo360. “You’re going to get to this whole other level, and it’s because we’re starting to automate this whole process instead of it all just being random and not data driven.”