Friday, March 14, 2025

Apple's AirPods AI + iOS 19 = real-time translation. July '25 release


Update

According to Perplexity.ai:

Apple AirPods Live Translation

The upcoming live translation feature for AirPods will leverage the earbuds' microphones and the iPhone's Translate app to facilitate seamless cross-language communication. Users will be able to hear translations through their AirPods while the original audio plays through the phone's speaker, eliminating the need to pass devices back and forth during conversations. This enhancement builds upon Apple's existing Translate app, introduced in iOS 14, by integrating it directly with AirPods for a more streamlined experience. The feature is expected to support multiple languages, though specific details on language availability have not yet been announced.


...After searching Perplexity, I found EZDubs and AI Phone Translator.

EZDubs offers voice emulation as I described in my requirements, and AI Phone Translator advertises no lag and 99% accuracy. EZDubs' lag leaves room to hear the speaker's voice first, followed by the emulated voice translation. The app offers a 15+ minute free trial and costs $14.99/mo. thereafter.

My Summary of Apple's Pending Releases

From what I understand, Apple's AirPods will achieve real-time translation using the iPhone's native Translate app and AI (upgraded Siri voice control to invoke the app) to communicate on a call (rather than out loud in person using the app on the phone).

It's the same real-time translation capability as an Android device working with Google Translate and earbuds for a more seamless translation experience (the earbuds allow for voice control via Google Assistant instead of invoking the app on the device by hand).

So, I guess this means that on either platform, you can use Google Translate's conversation mode or Apple's Translate app to speak and hear translations through the phone.

Currently, I can speak to Google Translate from my AirPods, but the translated voice only plays from the device, not back through my AirPods. I've not tried using the app on a call.


My ideal future requirements (short-list):

  • AirPods or earbuds work with any device
  • AirPods or earbuds work with Google Translate as well as Apple's Translate app
  • Mute the speaker's words while the translated voice plays, to prevent overlap
  • Send a text translation using voice only
  • Send a text translation using voice only while driving, e.g. "Hey Siri, translate a text to Jose in Spanish"..."Okay, what do you want to say?"...and confirm the message in my language before sending (see the sketch after this list)
  • Emulate the tone and rhythm of the person speaking (i.e. will Apple acquire or match ElevenLabs' technological capability?)
  • AirPods and earbuds eventually contain all of the phone's voice functionality, enabling platform-agnostic voice communication, translation, and AI assistance at any distance
  • Incorporate voice capability into glasses like Meta's Ray-Bans, adding AR and screen capabilities
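Nothing ties these pieces together today, but as a thought experiment, here is a rough Swift sketch of how the hands-free "translate a text while driving" idea above could be approximated with Apple's App Intents framework (which powers Siri shortcuts). The TranslateAndSend intent, the translateToSpanish stub, and the print placeholder standing in for actually sending the message are all hypothetical, not an existing Apple feature.

import AppIntents

// Hypothetical "translate and send a text" Siri shortcut -- a sketch of the
// hands-free flow wished for above, not an existing Apple feature.
struct TranslateAndSend: AppIntent {
    static var title: LocalizedStringResource = "Translate And Send"

    @Parameter(title: "Recipient")
    var recipient: String

    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Placeholder translation step; a real implementation would call an
        // actual translation engine before handing the text to Messages.
        let translated = translateToSpanish(message)
        print("Would send to \(recipient): \(translated)")

        // Confirm back in the speaker's own language before "sending."
        return .result(dialog: "Sending to \(recipient): \(message)")
    }

    // Stub standing in for a real translation engine.
    private func translateToSpanish(_ text: String) -> String { text }
}

Siri prompts for any missing parameters by voice, which is what makes this kind of intent usable while driving; confirming the message in the speaker's own language is handled here by the returned dialog.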

-

Apple's real-time translation with AirPods AI + iOS 19 (July '25 public beta release)

https://www.reuters.com/technology/apple-plans-airpods-feature-that-can-live-translate-conversations-bloomberg-news-2025-03-13/

 https://www.cnet.com/tech/apple-reportedly-planning-airpods-that-can-translate-languages-live/

https://www.tomsguide.com/phones/iphones/ios-19

From Perplexity.ai:

Based on the reported information, Apple's upcoming live translation feature for AirPods will process translations on the iPhone, not within the AirPods themselves.

The system is expected to work as follows:

  1. The AirPods' microphones will capture the spoken audio.

  2. The iPhone will process and translate the audio using its Translate app and AI capabilities.

  3. The translated audio will be sent back to the AirPods for playback to the user.

This approach leverages the iPhone's more powerful processing capabilities and existing translation technology. The AirPods will primarily serve as input and output devices, while the iPhone acts as the translation hub.

This integration is expected to be part of the iOS 19 update, creating a seamless experience between AirPods and iPhones for real-time language translation. By processing translations on the iPhone, Apple can potentially offer more accurate and faster translations, as well as support for a wider range of languages, compared to processing within the AirPods themselves.
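Apple hasn't confirmed any of this, but the reported capture / translate / play-back loop maps onto building blocks that already exist on iOS. Below is a minimal Swift sketch of that assumed pipeline, using SFSpeechRecognizer for on-phone transcription and AVSpeechSynthesizer for playback through the connected AirPods; the translateToEnglish function is a hypothetical stand-in for whatever translation engine Apple actually wires in, and the real feature would presumably be far more integrated (automatic audio routing, the Translate app's own models, etc.).

import AVFoundation
import Speech

// A minimal sketch of the reported flow: earbud mic -> on-phone speech
// recognition -> on-phone translation -> spoken playback to the earbuds.
// Microphone and speech-recognition permissions are required at runtime,
// and audio-session setup is omitted for brevity.
final class TranslationRelay {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "es-ES"))
    private let synthesizer = AVSpeechSynthesizer()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    // Step 1: capture audio from the active input route (e.g. the AirPods
    // microphone) and stream it into a speech-recognition request.
    func start() throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        self.request = request

        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        try audioEngine.start()

        // Step 2: the phone, not the earbuds, does the heavy lifting --
        // transcribe the captured speech, then translate the final phrase.
        task = recognizer?.recognitionTask(with: request) { [weak self] result, _ in
            guard let self = self, let result = result, result.isFinal else { return }
            let source = result.bestTranscription.formattedString
            let translated = self.translateToEnglish(source)

            // Step 3: speak the translation; with AirPods connected, the
            // synthesized audio is routed back to them as the output device.
            let utterance = AVSpeechUtterance(string: translated)
            utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
            self.synthesizer.speak(utterance)
        }
    }

    // Finish the capture; the recognizer then delivers its final result.
    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
    }

    // Hypothetical placeholder: stands in for whatever translation engine
    // Apple actually uses (the Translate app / on-device models).
    private func translateToEnglish(_ text: String) -> String {
        return text // no-op stub, for illustration only
    }
}

This is only a sketch of the division of labor described above: the earbuds act as microphone and speaker, while every compute-heavy step (recognition, translation, synthesis) runs on the phone.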

-

Google's current real-time translation:

Google Translate conversation mode is processed on the device, not in the earbuds.

"Live Translate" (Released 2020): 

Samsung Galaxy S24 or Pixel Phone + Pixel Buds + Google Translate = Real-time translation.

Google Pixel Buds: The Live Translate feature is available on Google Pixel Buds Pro, Pixel Buds A-Series, and Pixel Buds 2 when used with an Android device running Android 6.0 or later.

https://www.amazon.com/Google-Pixel-Buds-Pro-Headphones/dp/B0B1N7Z8B3

From Perplexity.ai:

To use the real-time translation feature:

  1. Wear your Pixel Buds and connect them to your phone.

  2. Activate Google Assistant by saying "Hey Google" or pressing and holding the earbud.

  3. Say "Help me speak [language]" to launch the conversation mode in Google Translate.

  4. Press and hold the earbud to speak in your native language.

  5. Use the Google Translate app on your phone to have the other person respond.

It's important to note that while the Pixel Buds enable a more seamless translation experience, the actual translation processing occurs on the connected Android device, not within the earbuds themselves.