By Anand Raman, Group Product Manager, Azure AI
This blog post was co-authored by Anny Dow, Product Marketing Manager, Azure Cognitive Services.
As schools and organizations around the world prepare for a new school year, remote learning tools have never been more critical. Educational technology, and especially AI, has a huge opportunity to facilitate new ways for educators and students to connect and learn.
Today, we are excited to announce the general availability of Immersive Reader, and shine a light on how new improvements to Azure Cognitive Services can help developers build AI apps for remote education that empower everyone.
Make content more accessible with Immersive Reader, now generally available
Immersive Reader is an Azure Cognitive Service within the Azure AI platform that helps users read and comprehend text. With today's general availability, developers and partners can embed Immersive Reader directly into their products, enabling students of all abilities to translate text into more than 70 languages, have text read aloud, focus attention through highlighting and other design elements, and more.
Immersive Reader has become a critical resource for distance learning, with more than 23 million people every month using the tool to improve their reading and writing comprehension. Between February and May 2020, when many schools moved to a distance learning model, we saw a 560 percent increase in Immersive Reader usage. As the education community embarks on a new school year in the Fall, we expect to see continued momentum for Immersive Reader as a tool for educators, parents, and students.
With the general availability of Immersive Reader, we are also rolling out the following enhancements:
· Immersive Reader SDK 1.1: Updates include support for reading a page aloud automatically, pre-translating content, and more. Learn about SDK updates.
· New Neural Text-to-Speech (TTS) languages: Immersive Reader is adding 15 new Neural Text-to-Speech voices, enabling students to have content read aloud in even more languages. Learn about the new Neural Text-to-Speech languages.
· New Translator languages: Translator is adding five new languages that will also be available in Immersive Reader—Odia, Kurdish (Northern), Kurdish (Central), Pashto, and Dari. Learn about the latest Translator languages.
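To give a sense of how these SDK capabilities come together, the sketch below shows how a developer might assemble the content payload and launch options for Immersive Reader, including the automatic read-aloud and pre-translation behaviors described above. This is a minimal illustration, not a definitive integration: the token and subdomain are placeholders you would obtain from your own Azure resource, and the helper function names here are hypothetical.

```javascript
// Sketch: preparing content and options for an Immersive Reader launch.
// The helper names (buildImmersiveReaderContent, buildLaunchOptions) are
// illustrative, not part of the SDK itself.

// Build the content payload: a title plus one or more chunks of text or HTML.
function buildImmersiveReaderContent(title, text, lang) {
  return {
    title: title,
    chunks: [
      {
        content: text,
        lang: lang,             // language of this chunk, e.g. "en"
        mimeType: "text/plain", // could also be "text/html" for rich content
      },
    ],
  };
}

// Options reflecting the SDK 1.1 enhancements described above:
// start read-aloud automatically and pre-translate the content on launch.
function buildLaunchOptions(targetLanguage) {
  return {
    readAloudOptions: { autoplay: true },
    translationOptions: { language: targetLanguage },
  };
}

const content = buildImmersiveReaderContent(
  "Geography lesson",
  "The longest river in the world is the Nile.",
  "en"
);
const options = buildLaunchOptions("es"); // translate into Spanish on launch

// In the browser, the actual launch would use the Immersive Reader SDK,
// with a token and subdomain issued for your Azure resource:
//   import { launchAsync } from "@microsoft/immersive-reader-sdk";
//   launchAsync(token, subdomain, content, options);
```

In practice, the token should be minted server-side against your Immersive Reader resource rather than embedded in client code, so that credentials never ship to the browser.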
Today, we’re adding new partners who are integrating Immersive Reader to make content more accessible: Code.org and SAFARI Montage.
Code.org is a nonprofit dedicated to expanding access to computer science in schools. To ensure that students of all backgrounds and abilities can access their resources and course content, Code.org is integrating Immersive Reader into their platform.
“We’re thrilled to partner with Microsoft to bring Immersive Reader to the Code.org community. The inclusive capabilities of Immersive Reader to improve reading fluency and comprehension in learners of varied backgrounds, abilities, and learning styles directly aligns with our mission to ensure every student in every school has the opportunity to learn computer science.” – Hadi Partovi, Founder and CEO of Code.org
SAFARI Montage, a leading learning object repository, is integrating Immersive Reader to make it possible for students of any language background or accessibility needs to engage with content, and enable families who don’t speak the language of instruction to be more involved in their students’ learning journeys.
“Immersive Reader is a crucial support for CPS students and families. During remote learning, particularly for our younger learners, student learning is often supported by parents, guardians, or other caregivers. Since Immersive Reader can be used to translate the student-facing instructions in our digital curriculum, families can support student learning in over 80 languages, making digital learning far more equitable and accessible than ever before! In addition, read-aloud and readability supports are game-changers for diverse learners.” – Giovanni Benincasa, UX Manager, Department of Curriculum, Instruction, and Digital Learning, Chicago Public Schools.