Snap launches new AI tools for augmented reality

Snap has released a more advanced version of its generative AI technology to improve Snapchat’s augmented reality (AR) features. With the new Lens Studio, these effects are more realistic and easier to create.

In a bid to stay ahead of its social media rivals, Snap rolled out the latest version of its generative AI technology on Tuesday.

This new version lets users see more realistic special effects when they film themselves using their phones.

Augmented reality, or AR, is a technology that uses a camera to show the real world and then adds computer images on top of it. Snap was one of the first companies to bring AR to the market.

Snap is betting that the new generation of creative opt-in features, called lenses, will bring more people to Snapchat and earn it more money from advertisers.

With Snap’s AI tools, AR developers can now build smarter lenses, and, according to the company’s announcement, Snapchat users will be able to add those lenses to their content.

Snap Inc., the Santa Monica, California-based company that owns Snapchat, announced an improved version of its developer program, Lens Studio.

Bobby Murphy, the company’s CTO, said the upgraded Lens Studio would make it easier to create AR effects, cutting production time from weeks to hours and letting developers take on more complex work.

“It’s fun for us that these tools not only let people be more creative, but they’re also simple to use,” Murphy said in an interview, adding that newcomers can quickly create something unique.

Generative AI Tools Enhance Lens Studio for AR Development

Lens Studio also includes generative AI tools, such as Lens Brand and Dev QQ; if a developer has a question, the AI can help them find the answer.

Another tool lets artists describe an object in text and turns that description into a 3D object, which the artist can use right away to build an AR lens without having to model it from scratch.

In its early stages, AR technology could only make simple changes, like adding a hat to a person in a video. The new tools will let Snap’s AR developers make lenses that look even more realistic than today’s.

Murphy said that one example is a hat added by a lens that moves with a participant’s head and matches the lighting in the video.

Murphy stated that Snap plans to utilize its technology for applications beyond facial mapping and body mapping, which are currently challenging to achieve.


