Snap is once again pushing the boundaries of AR. Whether with Dress Up, its new virtual fashion hub, a new AR image-processing tool, or Lens Cloud, a back-end for its developer platform, Snap is demonstrating once more that the capabilities of AR are limitless.
Here is a recap of the new features announced by Snap:
People are now accustomed to shopping while socializing. One study has shown that AR-guided purchases lead to a 25% decrease in returns, and that interacting with products in AR drives a 94% higher conversion rate. Social shopping, or social commerce (the transformation of digital platforms that support social interactions, such as social media networks, into new distribution and sales channels for brands), is a trend that is here to stay. Snap's latest in-app shopping feature, Dress Up, is a new hub for everything AR shopping.
This catalog lets users virtually try on clothes and accessories with their camera through Snap's AR Lenses, take a picture, and send it to friends. Unlike a regular Amazon product page, this shopping experience is fun and experiential. Carolina Havas, Snap's head of AR strategy and product marketing, said: “Now, there’s a really core utilitarian use case that we’re also focused on driving … there’s also a huge area of fashion that’s all about self-expression and asking friends for advice and having fun with friends.”
This feature is perhaps the most disruptive: Snap can now turn any product image into a 3D product image, allowing users to take a full-body selfie and try on almost anything. The accompanying AR SDK (software development kit) will bring catalog-powered Shopping Lenses to retailers' own product pages, giving their customers easy access to virtual try-on technology. The company will launch the feature soon; it will work in iOS and Android apps, with a future version compatible with web platforms.
Puma will be the global partner for the technology, allowing Snapchatters to virtually try on sneakers by just pointing their phone at their feet. Snap's AR Image Processing technology in its 3D asset manager will also be available to retailers, making it easier to build augmented-reality shopping experiences. Soon, brands will be able to turn their product SKUs into Shopping Lenses quickly and cost-effectively, Snap claims.
Snap is a massive community of users that counts 300M DAU*. It also gathers a community of creators: more than 250,000 of them have built over 2.5M Lenses that have been viewed over 5 trillion times. More than 300 creators have reached over a billion views on their Lenses, and hundreds of developers across the world have been using Spectacles to build Lenses. To continue supporting its developer platform, the company is launching Lens Cloud.
This server-side component will let developers create dynamic multiplayer experiences and location-based Lenses. Developers will also be able to use Lens Cloud to store and load assets. The platform can also act like a memory card: developers can start a Lens project, leave it, and pick up where they left off later. In addition, applications for the Creator Marketplace are now open to everyone.
*Daily active users
Over 1 billion Bitmoji avatars have been created to date. To further personalize users' in-app experience, Snapchat is introducing emoji-focused polls that users can share in Snaps and Stories, a fun way to engage with friends and gauge their responses to chosen questions. Snapchat is also adding a new option to one-on-one messaging that lets users reply to any message in a separate chat thread. Like other social apps, Snap is rolling out Bitmoji character reactions for a more personal touch. And finally, Snap has updated its video- and audio-calling interface to “make live conversations more fun”.
Snap's dedicated AR innovation lab, Ghost, will be updated with a new machine-learning track. Thanks to a new analytics package, developers will be able to understand how users interact with their experiences during sessions. Next-level graphics will be possible using ray tracing coupled with a machine-learning-enabled environment. The first use cases include a hyper-realistic Tiffany & Co. jewelry model and a Disney Lightyear experience.
This new set of camera and editing tools includes a quick-edit feature, a camera speed adjuster, a green-screen mode, and a dual-camera mode for filming with the selfie and world-facing cameras at the same time.
To enhance the experience of going to concerts and festivals, Snap is signing a multi-year partnership with Live Nation. The partnership aims to create a deeper connection between artists and fans through immersive AR and to let performers tell their stories in a new format. Attendees at select concerts and festivals will be able to use AR to find and connect with friends, unlock landmark experiences, and try on AR merchandise.
Last but not least, a pocket-size drone named Pixy may well be the coolest addition to users' creative toolbox. It will take off from the user's hand, float, orbit, and follow them to record videos and take photos. The content will be uploaded directly to Snapchat Memories, making it easier to post social media content on the platform.