Microsoft is set to revolutionize communication with a new AI interpreter feature for Teams, allowing participants to engage in meetings using their preferred languages. Announced on November 19, the update supports near-real-time voice interpretation in nine languages: Mandarin, English, French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish. A limited testing phase is currently underway, with broader availability expected in 2025 for users with a Microsoft 365 Copilot license.
Microsoft Teams adds functionality to help users converse in different languages during meetings
The AI interpreter is intended to make interpretation more widely accessible, offering a lower-cost alternative to human interpreters. According to Nicole Herskowitz, corporate vice president of productivity and collaboration for Copilot, the goal is to provide “a high-quality translation experience” for companies that may not have the resources for traditional interpreting services.
Consent from users is required for the interpreter to simulate their voices during meetings, and anyone can opt out of voice replication, in which case the AI uses a default interpretation voice instead. The feature is designed to make multilingual meetings accessible through a seamless translation experience. Microsoft is expanding these capabilities in response to industry demand; Google, Salesforce, and Zoom have all introduced similar AI-driven products to enhance the user experience.
The technology behind the AI interpreter stems from a rising trend in voice simulation. Notably, New York Mayor Eric Adams used AI voice simulation to promote local events in languages he does not speak fluently, reaching residents he otherwise could not address directly. The technology also extends beyond business applications, helping people with speech impairments communicate effectively again.
While the AI interpreter in Teams aims to provide accurate translations, Herskowitz cautions that it may not always achieve 100% accuracy. To mitigate potential errors, Teams will soon offer a multi-language transcription service, allowing users to view the original spoken language alongside the interpreted version. This dual-display feature is meant to help participants verify the interpretation and follow the conversation more easily.
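Microsoft has not detailed how this view will be structured, but the idea of pairing each utterance with its interpretation can be pictured with a minimal sketch; the field and function names below are hypothetical and do not reflect any Teams API.

```python
from dataclasses import dataclass

@dataclass
class TranscriptSegment:
    """One utterance shown in both the speaker's language and the viewer's language.
    All field names are hypothetical; they only illustrate the dual-display idea."""
    speaker: str
    source_lang: str       # e.g. "ja-JP", the language actually spoken
    source_text: str       # what was said, verbatim
    target_lang: str       # e.g. "en-US", the viewer's preferred language
    interpreted_text: str  # the AI interpreter's rendering

def render_dual_view(segments: list[TranscriptSegment]) -> str:
    """Print the original and interpreted text together so viewers can
    spot-check the interpretation against the source."""
    lines = []
    for seg in segments:
        lines.append(f"{seg.speaker} [{seg.source_lang}]: {seg.source_text}")
        lines.append(f"{' ' * len(seg.speaker)} [{seg.target_lang}]: {seg.interpreted_text}")
    return "\n".join(lines)

if __name__ == "__main__":
    demo = [
        TranscriptSegment("Aiko", "ja-JP", "来週の計画を確認しましょう。",
                          "en-US", "Let's review next week's plan."),
    ]
    print(render_dual_view(demo))
```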
Microsoft also plans to enrich the Teams experience with Copilot Actions, which will automate recurring tasks such as summarizing client interactions or compiling updates ahead of meetings. This functionality is currently in private testing, reflecting Microsoft’s commitment to streamlining workflows for users.
Moreover, Microsoft is developing tools that let users create custom AI bots tailored to specific queries, drawing on knowledge from dedicated file repositories. This can improve team efficiency by giving quick access to information relevant to particular projects or clients; a simplified sketch of that retrieval pattern appears below.
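Microsoft has not published how these bots work internally, but the general pattern of grounding answers in a designated set of files can be sketched with a naive keyword search. Everything here, including the folder name and helper functions, is an illustrative assumption rather than Microsoft's implementation; a production system would use a proper index and a language model on top of the retrieved excerpts.

```python
import pathlib
import re

def load_repository(folder: str) -> dict[str, str]:
    """Read every .txt/.md file in a project folder into memory.
    (A real system would build a searchable index instead.)"""
    docs = {}
    for path in pathlib.Path(folder).glob("**/*"):
        if path.is_file() and path.suffix in {".txt", ".md"}:
            docs[str(path)] = path.read_text(encoding="utf-8", errors="ignore")
    return docs

def keyword_score(query: str, text: str) -> int:
    """Count how often query words appear in a document (naive relevance score)."""
    words = [w.lower() for w in re.findall(r"\w+", query)]
    body = text.lower()
    return sum(body.count(w) for w in words)

def answer(query: str, docs: dict[str, str], top_k: int = 2) -> str:
    """Return the most relevant excerpts for a query -- the raw material a bot
    would hand to a language model to ground its answer."""
    ranked = sorted(docs.items(), key=lambda kv: keyword_score(query, kv[1]), reverse=True)
    excerpts = [f"From {name}:\n{text[:300]}" for name, text in ranked[:top_k]]
    return "\n\n".join(excerpts) if excerpts else "No matching documents found."

if __name__ == "__main__":
    repository = load_repository("./project_docs")  # hypothetical folder of client files
    print(answer("What did the client agree to in the last meeting?", repository))
```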
The AI interpreter’s capabilities set it apart from existing interpretation solutions. While platforms like Zoom have previously introduced options for live interpreters, Microsoft’s approach utilizes voice simulation technology to create a more engaging and personalized experience. Zoom’s interpretation feature, available since 2022, focuses on real-time audio channels for human interpreters and captioning in multiple languages.
By early 2025, users can also expect Teams Super Resolution, which improves video call clarity during poor internet connectivity, along with AI image-enhancement tools that will help developers refine visual content.
Featured image credit: Microsoft