You might also be stuck at home (thx Corona) and probably find yourself in way too many Zoom meetings these days. It’s just natural that we try to explore how we can augment our video-conferencing to have some fun while we are in this situation.
Snap, the company behind the well-known social media app Snapchat, recently launched Snap Camera, which allows you to augment your video stream with a “lens” – an AR overlay on top of your video. The output (raw video plus augmented elements) can then be used as a video source in Zoom and other video-conferencing apps. And these are not just static overlays – lenses are pretty smart little add-ons. For example, they can detect face gestures, like smiles, and react accordingly.
This looks quite playful, and it definitely is. But it made me curious how these AR experiences, for Snap Camera or the Snapchat mobile application, can actually be created. So I entered the dev world of Snap and began looking into the Snap Lens Studio, the developer environment for all these AR lenses that you can use in the Snap Ecosystem.
In short: I was blown away by the depth of tooling that is available. And just this past week, at Snap’s dev event, they released some cool new features such as Snap Minis, which now allow developers to create HTML5-based “experiences” right within the Snapchat application – for example, buying tickets right from Snapchat. With such a clear indication that the Snap world of AR is very relevant for Customer Experience, I started to look deeper.
I am sharing my experiences here – this blog post gives you a first overview. I highly recommend stepping into this new world of AR, Snapchat, their lenses and the tooling behind it.
Snap Camera, Snapchat & Spectacles
Before I share my experiences creating AR experiences with Snap Lens Studio, let’s first explore where the lenses and AR effects can be used. All three options are shown below – there are Snap Camera, Snapchat on mobile and also the Spectacles lenses.
Snap Camera is Snap’s latest software creation and lets you browse so-called Lenses – AR experiences that you can overlay onto your webcam video stream. The Lenses you can choose here are the same ones that are available in the Snapchat mobile app. With Snap Camera, Snap makes these Lenses available to all desktop video-conferencing users – and of course all this happens at the best time possible, during the Corona pandemic, while many of us are required to work from home and therefore find ourselves in constant Zoom meetings…
The Spectacles lenses – as far as I can tell without having tried them – let you capture AR content and apply AR effects similar to (or the same as) the lenses to the content you create. In my opinion, videos created with the Spectacles come close to using the rear camera in the Snapchat mobile app, which limits the type of lenses you can use: they typically don’t use face tracking and instead focus on augmenting the world around you, whereas in selfie mode you’d typically augment your own face.
Lens Studio is the authoring/creation environment for the AR experiences and filters that can be explored with Snap Camera, Snapchat on Mobile or even the Spectacles Lenses.
It integrates really well with the Snap ecosystem. Once you have connected your Snapchat mobile app, for example, you can preview and test the lens you’re working on directly on your phone. But I also found the built-in preview very usable.
Lens Studio is a really well-developed IDE, but for many newcomers to the AR world, the many 3D-geometry-specific terms will be a challenge. To ease the path into AR development, Snap offers new developers so-called templates, which you can simply modify to your needs. They have also created a great ecosystem of ambassadors – such as Ben, whom we will talk to for Labs Talk – and lots of introductory tutorials and videos.
Just to show you what AR experiences you can create and which features are readily available for building them, I will briefly explain some of my sample projects:
Reacting to Face Events
I put this example first to show that you can do more than modify templates with custom logos or images – the experiences you create can be very specific and dynamic.
Face and Hand Tracking
Using tracking, a lens developer can very easily overlay an image onto a specific point on a face or hand. The overlaid object does not just follow the face – its position on the face/hand also stays in sync. In the face-tracking example below, the Labs Talk logo is positioned in the upper area of my face and it stays there, even if I tilt, rotate and move my head.
The same is true for hand tracking – you choose the position via a hand-tracking object. The overlaid object can be anything: I chose an image and text, and most likely complex 3D geometry can be overlaid, too.
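The underlying idea is easy to sketch outside of Lens Studio: each frame, the overlay’s transform is re-derived from the tracked point, so a fixed offset (“upper area of the face”) rotates and moves along with the face or hand. The snippet below is a plain-JavaScript illustration of that, assuming a 2D anchor with a rotation angle – it is not the Lens Studio API (which provides Head Binding and tracking components that do this for you), and all names are made up.

```javascript
// Plain-JS sketch of "pin an overlay to a tracked point".
// NOT the Lens Studio API - just the idea: re-derive the overlay
// transform every frame from the tracked anchor plus a fixed offset.

// anchor: { x, y, rotation } with rotation in radians.
// offset: fixed offset in the anchor's local space (e.g. "above the face").
function overlayTransform(anchor, offset) {
  // Rotate the offset by the anchor's rotation so the overlay stays
  // glued to the same spot even when the face/hand tilts.
  const cos = Math.cos(anchor.rotation);
  const sin = Math.sin(anchor.rotation);
  return {
    x: anchor.x + offset.x * cos - offset.y * sin,
    y: anchor.y + offset.x * sin + offset.y * cos,
    rotation: anchor.rotation, // overlay tilts together with the face/hand
  };
}

// Logo pinned slightly above the tracked point (normalized coordinates).
const logoOffset = { x: 0, y: -0.2 };

// Face upright: overlay sits straight above the anchor.
console.log(overlayTransform({ x: 0.5, y: 0.4, rotation: 0 }, logoOffset));
// Face tilted by 90 degrees: the offset rotates with it.
console.log(overlayTransform({ x: 0.5, y: 0.4, rotation: Math.PI / 2 }, logoOffset));
```

In a real lens you never write this math yourself – attaching an object to a Head Binding or hand-tracking object gives you the synced transform for free.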
Bringing it all together
The next example essentially combines head binding with behaviors. Head binding is used to overlay a hat and glasses onto my face. Then, behavior scripts are used to modify the objects in the scene: when I open my mouth, some text objects showing a dynamic countdown to a date become visible and a background animation starts.
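The countdown text itself is just date arithmetic. Here is a minimal plain-JavaScript sketch of it – in an actual lens this logic would live in a Lens Studio script writing to a Text component on each update, and the target date here is purely hypothetical.

```javascript
// Minimal countdown-to-a-date sketch: compute the days/hours/minutes/
// seconds remaining and format them as the string an overlay would show.
function countdownText(targetMs, nowMs) {
  // Clamp at zero so the countdown stops instead of going negative.
  let remaining = Math.max(0, Math.floor((targetMs - nowMs) / 1000));
  const days = Math.floor(remaining / 86400);
  remaining %= 86400;
  const hours = Math.floor(remaining / 3600);
  remaining %= 3600;
  const minutes = Math.floor(remaining / 60);
  const seconds = remaining % 60;
  return `${days}d ${hours}h ${minutes}m ${seconds}s`;
}

// Example: countdown to a (made-up) date; JS months are 0-based.
const target = Date.UTC(2020, 6, 1); // 2020-07-01T00:00:00Z
console.log(countdownText(target, Date.UTC(2020, 5, 29, 12, 30, 15)));
// → "1d 11h 29m 45s"
```

Hooking this up to “mouth opened” is then just a matter of toggling the text object’s visibility from the corresponding face event.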
The next small thing: Minis
We’re just scratching the surface, really, but I hope the overview so far has shown you that there is a really vast ecosystem of content creators behind Snapchat – and really great tooling that not only appeals to professional 3D artists, but will most likely also motivate many kids and “citizen developers” to express themselves via their own custom lenses.
At Snap’s 2020 partner event for developers, which was held virtually, many new things were announced. The news I find most interesting from the perspective of our team, Customer Experience Labs, is Snap Minis. Snap Minis are small HTML5-based applications that you can launch right within a Snapchat experience.
Snap Minis integrates an HTML5-based browser into Snapchat and lets content creators/page authors use the well-known HTML5 elements, tools and features (I assume the typical HTML5/CSS/JavaScript mix) to build experiences such as buying tickets, meditating together, or even registering to vote.
I hope to get access to this feature soon and will then definitely try it out and let you know how it works in detail. Until then, I hope you enjoyed diving into the world of AR. Be sure to subscribe to Labs Talk for an upcoming episode about Snap Lens Studio together with Ben Knutsen, and get in touch via Twitter in case you want to continue the discussion!