Meta Connect 2022: Researchers Create More Realistic-Looking Avatars

During the company’s annual event, Meta showcased its ongoing work

Ben Wodecki, Junior Editor - AI Business

October 13, 2022


Researchers at Meta are experimenting with AI to make improvements to metaverse avatars and interfaces.

During the company’s annual Connect event, Meta showcased ongoing work in which researchers are combining AI with technologies such as electromyography to create more intuitive interfaces and more realistic-looking avatars.

Earlier this year, Meta researchers unveiled Pixel Codec Avatars (PiCA), a deep generative model capable of generating realistic 3D renderings of human faces.

At Connect 2022, Meta showcased further work on Codec Avatars – including Instant Codec Avatars, which are designed to be created from a smartphone scan in far less time.

Despite being billed as ‘instant,’ the generation process still takes a few hours, though Meta said at the event that it aims to cut that time down in the future.

The same approach could also be applied to generating models of physical objects for use in VR.

At the event, Meta CEO Mark Zuckerberg demonstrated the tech by scanning a teddy bear with a smartphone. After some processing time, a model of the bear was generated and imported into VR, producing a high-fidelity version of the bear that users could interact with.

“Neither approach is real time yet and each has its limitations,” said Michael Abrash, the chief scientist at Meta’s Reality Labs. “But the goal is to let you quickly and easily make physical objects a part of your virtual world.”

Carnegie Mellon Alliance

Meta also revealed a partnership with Carnegie Mellon University to develop tools for visually impaired individuals.

On display were technologies that can create virtual versions of physical spaces, giving visually impaired people better directions and navigation to their destinations.

Scientists from both organizations created a 3D map of Pittsburgh International Airport using techniques including neural radiance fields and inverse rendering. The map can be accessed via a smartphone app.

The research is largely conducted through the company’s Reality Labs, a Meta Platforms business tasked with producing the next generation of VR and AR hardware and software.

“With Reality Labs, we’re inventing a new computing platform — one built around people, connections and the relationships that matter,” the company said.

Meta described its research work as developing “foundational technologies for future devices and the metaverse.”

This article first appeared in IoT World Today’s sister publication AI Business.

About the Author

Ben Wodecki

Junior Editor - AI Business

Ben Wodecki is the junior editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to junior editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others.
