Introducing Lens Studio 3.0
Jun 11, 2020
Today at the Snap Partner Summit we announced even more ways for creators to transform their world! You can now integrate your own machine learning models into Lens Studio with SnapML.
With SnapML, you can now create your own Lens features with neural networks that you have trained. By adding these ML models to your Lenses, the Snapchat Camera will be even better at transforming the world around you! Not an ML developer? Don’t worry, we’ve added templates to this new version of Lens Studio so anyone can get started. What will you create?
Style Transfer
Apply an art style to the camera feed using machine learning to alter the world as you see it.
Custom Segmentation
Use a custom segmentation mask to apply effects to certain parts of a scene.
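To see the idea behind a segmentation mask, here is a minimal sketch in plain JavaScript (Lens Studio scripts are JavaScript). The per-pixel blend below is an illustrative stand-in for what the Lens renderer does, not the template's actual API:

```javascript
// Illustrative sketch: restrict an effect to the part of a frame that a
// segmentation mask covers. `pixels` is a flat RGB array and `mask` holds
// one value in [0, 1] per pixel from a hypothetical segmentation model.
function applyMaskedTint(pixels, mask, tint) {
    var out = pixels.slice();
    for (var p = 0; p < mask.length; p++) {
        if (mask[p] > 0.5) { // pixel is inside the segmented region
            for (var c = 0; c < 3; c++) {
                // 50/50 blend between the original pixel and the tint color
                out[p * 3 + c] = Math.round((pixels[p * 3 + c] + tint[c]) / 2);
            }
        }
    }
    return out;
}

// Two pixels: only the first is inside the mask, so only it gets tinted.
var tinted = applyMaskedTint([0, 0, 0, 100, 100, 100], [1, 0], [200, 0, 0]);
```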
Object Detection
Attach an image to a custom-detected object.
Classification
Understand what’s in a scene and react to what the camera sees.
Ground Segmentation
Replace the ground with a material and occlude objects not on the ground.
Use these templates as a starting point for integrating additional models, or add your own.
Hand Gestures
Use different hand poses, like open palm, closed palm, and more, to trigger custom effects.
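Conceptually, this means mapping detected pose names to effect callbacks. A minimal JavaScript sketch of that dispatch pattern follows; the pose names and the dispatcher itself are illustrative assumptions, not the template's actual hooks:

```javascript
// Illustrative sketch: route detected hand poses to custom effects.
// In Lens Studio the Hand Gestures template wires the tracker up for you;
// this stand-alone dispatcher just shows the pattern.
function createPoseDispatcher() {
    var handlers = {};
    return {
        // Register an effect callback for a named pose.
        on: function (pose, effect) {
            handlers[pose] = effect;
        },
        // Called whenever a pose is reported; returns true if it was handled.
        dispatch: function (pose) {
            if (handlers[pose]) {
                handlers[pose]();
                return true;
            }
            return false;
        }
    };
}

var dispatcher = createPoseDispatcher();
var lastEffect = null;
dispatcher.on("open_palm", function () { lastEffect = "confetti"; });
dispatcher.on("closed_palm", function () { lastEffect = "sparkles"; });
dispatcher.dispatch("open_palm"); // lastEffect is now "confetti"
```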
Face Landmarks
Trigger effects by tracking specific coordinates on the face.
Face Expressions
Use Lens Studio’s new facial expression tracking to drive blendshapes on 3D models.
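Driving blendshapes amounts to copying tracked expression weights onto matching blendshape weights each frame. The JavaScript sketch below shows the shape of that mapping; the expression names and the `setWeight` method are assumptions for illustration, not Lens Studio's real API:

```javascript
// Illustrative sketch: apply tracked expression weights to a 3D model's
// blendshapes, clamping each value to the valid range [0, 1] first.
function applyExpressions(weights, model) {
    Object.keys(weights).forEach(function (name) {
        var w = Math.min(1, Math.max(0, weights[name]));
        model.setWeight(name, w);
    });
}

// Minimal stand-in for a 3D model with blendshapes.
var model = {
    weights: {},
    setWeight: function (name, w) { this.weights[name] = w; }
};

applyExpressions({ jawOpen: 0.4, browRaise: 1.3 }, model);
// model.weights.browRaise has been clamped to 1
```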
Foot Tracking
In partnership with Wannaby, we’ve created a foot tracking template powered by their ML model, so anyone can easily create Lenses that interact with their feet!
New Update, New Look
We’ve updated Lens Studio’s UI and added two themes to make it easier to customize and navigate.
Procedural Textures
Access and create textures from script with Procedural Textures.
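A procedural texture is just pixel data computed in code rather than loaded from an image. The JavaScript sketch below builds an RGBA buffer with a horizontal gradient; the gradient fill is an arbitrary example, and the function is a generic stand-in rather than Lens Studio's texture API:

```javascript
// Illustrative sketch: compute an RGBA pixel buffer procedurally, the kind
// of data a scripted texture could expose.
function makeGradient(width, height) {
    var pixels = new Uint8Array(width * height * 4);
    for (var y = 0; y < height; y++) {
        for (var x = 0; x < width; x++) {
            var i = (y * width + x) * 4;
            // Shade ramps from 0 (left edge) to 255 (right edge).
            var shade = Math.round((x / (width - 1)) * 255);
            pixels[i] = shade;     // R
            pixels[i + 1] = shade; // G
            pixels[i + 2] = shade; // B
            pixels[i + 3] = 255;   // fully opaque
        }
    }
    return pixels;
}
```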
Colored Shadows
Grey shadows, be gone! Use any color of the rainbow to add depth to your Lens.
3D Eyeball Tracking
Replace or augment eyes in full 3D, including realistic gaze tracking.
More Material Editor Nodes
Further customize your materials with new matrix, array, and lighting nodes!
Team Lens Studio