
Meta Brings the Voices of Judi Dench, John Cena and Keegan-Michael Key to AI Chatbot


New York (CNN)

Facebook and Instagram users can now talk with voices that sound a lot like John Cena and Judi Dench. They won't be speaking with the real actors, however, but with an artificial intelligence chatbot.

Parent company Meta (META) announced Wednesday that it is adding voice chat and celebrity voices to its artificial intelligence chatbot, Meta AI. Now, instead of just messaging the chatbot, users can have real-time conversations and choose from a selection of computer-generated or celebrity voices.

The company worked with Cena and Dench, as well as actors Kristen Bell, Awkwafina and Keegan-Michael Key, to train the chatbot to mimic their voices.

The update comes as Meta attempts to keep its AI chatbot — which lets users chat on Facebook, Instagram, WhatsApp, and Threads — on par with rival products, including ChatGPT, which is rolling out its own advanced voice mode. Meta CEO Mark Zuckerberg said Meta AI is on track to become “the most used AI assistant in the world” by the end of this year, likely helped by the more than 3 billion people who use the company’s apps every day, though it’s not clear how Meta measures chatbot usage or how often people use the tool.

Rival OpenAI came under fire earlier this year when it demonstrated its own real-time voice mode feature for ChatGPT, because one of the demo voices sounded remarkably like actress Scarlett Johansson, who said she had been asked to join the project but declined. OpenAI denied that the voice, dubbed Sky, was based on Johansson, but discontinued its use. In contrast to that debacle, Meta appears to have formed formal partnerships with the actors whose voices were used to train the tool.

Zuckerberg announced the new voice mode during his keynote speech at the company’s annual Meta Connect conference, where he also talked about other AI developments, a new, lower-cost version of Meta’s Quest headsets and updates to the company’s augmented reality line of Ray-Ban eyewear.

Other notable announcements: Meta now lets social media influencers create AI versions of themselves. Previously, influencers could train AI to have text conversations with their followers; now, followers can have full, quasi-video conversations with AI versions of influencers using the tool.

Meta’s AI technology will also automatically translate and dub foreign-language Reels (Meta’s short-form videos) for viewers. So if you speak English and a Reel originally made in, say, Spanish shows up in your feed, it will appear as if it were made in English, complete with adjustments to the speaker’s mouth to make the dubbing look natural.

And you may start seeing more AI-generated content in your Facebook and Instagram feeds. Meta says it will generate and share AI-generated images to users’ feeds based on their “interests or current trends,” a feature it calls “crafted for you.” (It’s not clear whether users will be able to opt out of this if they prefer to see only content from their real, human friends.)

Meta’s AR glasses will also get live, AI-powered translation. A user could have a conversation with someone speaking a foreign language and hear the translation to their own language in their ear within seconds, Zuckerberg said.

Zuckerberg also previewed “Orion,” a prototype for a more advanced pair of smart glasses that would essentially pack the power of an AR headset — like Meta Quest or Apple’s Vision Pro — into a pair of mostly normal-looking (if a bit bulky) glasses.

But there’s a big difference between the Orion and headsets like the Quest or Vision Pro. With existing AR headsets, users stare at a screen that uses a camera to display emails or photos overlaid on the user’s environment, a technology known as “passthrough.” But the Orion’s lenses are actually see-through, using holograms to make it appear as if your email inbox or text messages or even a live 3D representation of a friend are floating in space next to you.

Zuckerberg called them “the most advanced glasses the world has ever seen,” but they’re not yet available for consumers to buy. The CEO said Meta will continue to experiment with the glasses internally and make them available to select outside developers to build software for them ahead of a final consumer release.