Meta’s New AI Translator Can Preserve Your Accent, Tone, and Voice

Real-time translation is nothing new, but most tools, even those powered by AI, end up making you sound like a robot devoid of emotion. That is exactly what Meta AI's new Seamless translation models have managed to overcome.

Meta unveiled SeamlessM4T, its multimodal AI translation model, in August. The tool supports nearly 100 languages for text and 36 for speech. With an improved "v2" architecture, Meta is now expanding and refining the technology for conversational translation, aiming to add the spontaneity and expressiveness that authentic cross-language conversations depend on.

One of the two new additions is "SeamlessExpressive," which, as the name suggests, carries your vocal expression over into the translated speech. This covers elements such as pitch, volume, emotional tone (excitement, sadness, or whispers), speech rate, and pauses.

This is potentially transformative, addressing the historically robotic quality of translated speech, with implications for both everyday communication and content production. The supported languages are English, Spanish, German, French, Italian, and Chinese, although at the time of writing the demo page lacks Italian and Chinese.

SeamlessExpressive works alongside another AI model, SeamlessStreaming, which does its part of the magic by cutting the delay between speech and its live translation to as little as two seconds. In other words, you don't have to wait for the speaker to finish talking before you start hearing the translation.

Meta highlights the challenge of differing sentence structures across languages, which led it to develop a dedicated algorithm. The algorithm analyzes partial audio input to decide whether there is enough context to start generating translated output or whether the model should keep listening.
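Conceptually, this is a read/write policy: at each step the system either reads more source audio or writes a piece of the translation. The sketch below is a minimal, hypothetical illustration of that loop in Python; the chunk size, the has_enough_context check, and the translate_prefix function are stand-ins invented for illustration, not Meta's actual algorithm or API.

```python
# Minimal, hypothetical sketch of a streaming read/write translation loop.
# has_enough_context and translate_prefix are placeholders, not Meta's code.

from typing import List

CHUNK_MS = 320  # assumed size of each incoming audio chunk, in milliseconds


def has_enough_context(audio_buffer: List[bytes]) -> bool:
    """Decide whether the buffered audio carries enough context to translate.

    A real system would use the model's own confidence; here we simply wait
    until roughly two seconds of audio have accumulated.
    """
    return len(audio_buffer) * CHUNK_MS >= 2000


def translate_prefix(audio_buffer: List[bytes]) -> str:
    """Placeholder for partial speech translation of the buffered audio."""
    return f"<translation of {len(audio_buffer) * CHUNK_MS} ms of audio>"


def streaming_translate(audio_chunks: List[bytes]) -> List[str]:
    """Interleave READ (buffer more audio) and WRITE (emit translation) steps."""
    buffer: List[bytes] = []
    outputs: List[str] = []
    for chunk in audio_chunks:
        buffer.append(chunk)  # READ: take in more source audio
        if has_enough_context(buffer):
            outputs.append(translate_prefix(buffer))  # WRITE: emit partial output
            buffer.clear()  # start accumulating the next segment
    if buffer:  # flush whatever remains once the speaker stops
        outputs.append(translate_prefix(buffer))
    return outputs


if __name__ == "__main__":
    fake_stream = [b"\x00" * 1024 for _ in range(20)]  # stand-in audio chunks
    for piece in streaming_translate(fake_stream):
        print(piece)
```

In this toy version the decision to write is a simple time threshold; the point is only to show how translation can begin on partial input instead of waiting for the full sentence.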

This latest advancement in Meta's "Seamless Communication" suite looks notably impressive, appearing to surpass the mobile interpreter tools offered by companies like Google and Samsung.

The timeline for a public release remains undisclosed, but one can envision Meta incorporating these models into its smart glasses in the future, taking their practicality to a new level.
