Lens Fest 2025: Awards & Lensathon Recap 🏆
Earlier this month Snap threw their yearly AR-focused event – Lens Fest. It wasn’t just an opportunity for the company’s CTO Bobby Murphy to share more about their recent updates and upcoming releases, but also one of the best and biggest community meetups we have throughout the whole year!
Zuza, Lenslist’s COO, went to the event, so we’re here to share some of the highlights for all of you who couldn’t attend.
We won’t cover the news and announcements part (Snap’s newsroom provided their recap here, and here you can watch the keynote as well as all the breakout sessions!), but we will focus on what matters most to us: who won the Lens Fest Awards, who were the finalists of the Lensathon, and of course what the Lens Creators and developers who took part in the event think about the whole thing. Let’s get started!
The Theme & Main Takeaways
The overall theme of the event was celebrating a decade of Lenses and their makers. Throughout the last ten years, as Snap’s article states, “our developer community — 400,000 people who have built more than 4 million Lenses — have turned this idea into one of the largest and most advanced ecosystems for creative expression.”
Aside from this main theme, the key focus areas were clearly game Lenses and, of course, Spectacles, with Specs, the new consumer version of the device, set to be released next year. So in case you’re wondering where to put your time and effort in designing your future as a Lens Creator, these two paths should be top of mind for you.
Another subject that got the spotlight in the keynote was, of course, AI: all the integrations and tools that will make it easier and faster to build Lenses, as well as the genAI features enabling Creators to publish a new type of content with Lens Studio.
Last, but certainly not least, monetizing your work as a Lens Creator came up again and again. Bobby, as well as other speakers, focused on four key pillars during their talks:
➡️ Lens Creator Rewards, now expanded with Lens+ Payouts, allowing creators to make money based on engagement from Lens+ and Snapchat Platinum subscribers (Snap’s newsroom).
➡️ Challenge Tags (powered by Lenslist!), an ongoing program of monthly themed challenges for Creators building Snapchat Lenses.
➡️ Spectacles Community Challenges (powered by us as well!), another regular program rewarding top Spectacles Lenses.
➡️ Commerce Kit for Spectacles, which allows Spectacles Developers to integrate in-Lens purchases directly into their AR experiences. It enables the creation of monetized applications where users can buy digital products or premium features through secure payment processing on the Spectacles device.
Lens Fest Awards
This year, there were 9 categories in which both individual creators and studios competed for bragging rights and the coolest Snap- and Spectacles-themed awards. The Snap team selected 3 finalists for each category out of hundreds of submissions. Then Raag Harshavat, Developer Relations Specialist, and Addison Black, Spectacles Producer, announced the winners right there on the Lens Fest stage. See all of the winners below!
Best Engaging Lens: Mohamad El Asmar with Hug Me Please
Best Artistic Lens: Valerii Pidhurskyi with Dualooped World
Best Innovative Lens: Yegor Ryabtsov with Trajectory: The Object Liberation Front
We spoke with Yegor about his recent successes, including the Lens Fest Awards and reaching the finals of the Spectacles Lensathon. He shared insights on staying at the forefront of innovation in the AR/XR space and what drives him to keep experimenting, exploring new features, and continuously expanding both his technical and creative skills.
Yegor Ryabtsov
It was a great honour to be nominated for the Lens Fest Awards. Trajectory was competing in the Best Game and Most Innovative Lens categories, and at first it took me by surprise that it won for innovation – after all, it’s a game first and foremost. However, when I look back at the process of creating it, it was definitely more of an attempt to make something that hadn’t really been seen before within AR as a medium, which involved a lot of unorthodox creative and technical solutions.

For me, Spectacles have been indispensable in unlocking experimentation around true, wearable, spatial AR. I’m not aware of any other platform or hardware available to developers right now that allows you to do so much, or feel so close to that ultimate vision of a future where the virtual and the real blend seamlessly to shape our daily lives. The platform also gets new features all the time, so in a way it already serves as a roadmap for what might be worth learning if you want to succeed in this space.

But you also quickly realise that there are no solidified rules yet for this whole medium – especially when it comes to design and interaction patterns. So while it’s exciting to discover new technology and integrate it into your AR work, it’s even more rewarding to find new ways of making people feel at home in these experiences. After all, if we’re not building this for humans then what’s the point?

I’m also very big on R&D, and behind everything I release there are dozens of prototypes that never make it to production but still serve their purpose – to teach me what’s possible and what’s not. For example, it’s easier than ever to integrate AI into your experiences, or use it to aid production. However, some things that seem great on paper simply don’t work the way you imagine. I’ve tried many times to develop AI agents that could perform just as well as a scripted NPC in telling a story, and sometimes they come close – but I’ve also realised that a hybrid approach works best. The story, the script, the characters, their essence and idiosyncrasies still need to come from a human author, or else it all feels stale. Where AI shines is in scaling that vision, adding variation, and allowing a single creator to build more ambitious things in less time.

Ultimately, all the shiny new tools are the same as the old ones, in the sense that you can use them for anything, or not use them at all. The way and degree to which you use them should be guided by something deeper than the aggregate of those technologies. It’s an incredible time for scientifically oriented minds and for making computers do amazing things – but we’re still here and (still) in human form. And maybe tending to that – to what makes us the way we are – might just be the most innovative thing one can do right now.
Staying on top of everything happening in the AR/XR space isn’t easy – not only do you need to distinguish genuine innovation from fleeting fads, but also decide which of the real developments truly deserve your attention. I know there’s a lot happening in the AR fashion space, and it fascinates me, but does it captivate me strongly enough to dedicate weeks and months of attention to it? Probably not, and that’s okay. Instead, I’ve always been drawn to world-facing, interactive, wearable AR – even back when mobile AR was the only option – and I know I want my work to explore structures of meaning, communicate ideas, and tell stories. So I follow that.
Best Utility Lens: Krunal Gediya with Home Automation
Best Game Lens: Krunal Gediya with Tower Defence (that’s right, Krunal brought home two Lens Fest Awards!)
After winning not one but two Lens Fest Awards, Krunal shared the secrets behind his success and his insights on the practical potential of AR Lenses, particularly when it comes to Spectacles.
Krunal Gediya
Thank you so much for the wishes. Winning two awards was an unexpected and truly surreal moment for me, the real krazyy feeling. This experience has doubled my confidence in Lens Studio and Snap as a whole. I genuinely believe it is one of the best creative IDEs to build the craziest ideas with ease, whether it’s XR experiences or fully interactive 3D games.

We also experimented with the live translation feature on Specs OS 2.0, using it to converse with a friend who came to Lens Fest from Japan and didn’t speak much English. It was such a heartwarming moment, seeing real-time subtitles appear as we spoke, allowing us to truly understand each other in our native languages ✨

I have many more ideas brewing, from custom wearables that track and visualize body vitals using Specs BLE to navigation tools designed for visually impaired individuals. I am excited to bring these concepts to life in the coming months. Experiences like these, going beyond entertainment, are what will make AR Spectacles stand out, enabling deeper and more personalized connections with technology. Less likes, more love 💛
I have always been drawn to the utility side of AR, as I believe that is where the key to mass adoption lies. My winning lens prototype reinforced this belief, proving that AR is not just about blending virtual elements into our surroundings but about unlocking new ways to interact with our physical environment. With the new Specs and their enhanced capabilities, such utilities will bring devices closer to users in more personal and meaningful ways.
Best Snapchat Lens: Ruya Baraz with Inner Portal
Best Spectacles Lens: Harry Banda with Card Master
Lensathon
This year, 80 creators and developers were invited to compete in the in-person Lensathon and were divided into two tracks: Games and Spectacles. Creators formed their teams and had the two days preceding the actual Lens Fest event (in practice probably closer to ~24 hours!) to come up with prize-worthy Lenses.
They started with ideation, went through all the design and development stages, and finished by pitching their projects to judges from the Snap team. Then, the three teams from each track with the highest-scoring projects showcased their Lenses on the Lens Fest stage, pitching to the final judging panel made up of Snap’s internal development team. See which teams made it to the final, and hear from the top competitors and winners representing their teams!
Games Track
Team Knockout: Mabu Yussif, Seki Bacsain, Atsushi Ishikawa, Vivek Thakur with Knockout Combo
What made the Knockout Team so successful? Mabu shared their journey and offered insights on Snapchat’s increasing focus on games and game creation within Lens Studio.
Mabu Yussif
Snapchat focusing more on games and game creation in Lens Studio really made me realise how limitless the possibilities are. Our game Knockout was such a fun project to build with my amazing team Seki, Aoe, and Vivek, and I genuinely enjoyed every part of the process. For Knockout, we went beyond and combined Mixamo and Maya to create custom Bitmoji animations and built unique mechanics using triggers. We also integrated other tools to improve interactivity and polish the overall experience, making the game both smooth and exciting to play. We had a heavy focus on a polished yet fun finish. Overall, this project really opened my eyes to how much creative potential Lens Studio has, especially for social and interactive gaming experiences on mobile. Keeping it mobile-based also made the creation process feel much more accessible to a global community. I’m really happy to see Snapchat pushing for more mobile-based games!
Many people think Lens Studio is mainly for creating filters or AR effects, but this experience showed me that you can build complex yet fun mobile games too, going beyond the traditional style of games we currently see being produced. Using Lens Studio’s Turn-Based Lens asset, we made it possible for players to compete with their friends directly on Snapchat, which made the gameplay really engaging.
Team Hackstreet Bois: Yassin Benhaddou, Ines Hilz, Maya Pruitt, Friso van Waveren with Epic Dance Battle
🏆 And the winners are…
Team Mouth Cannon Crew: Sallia Goldstein, Ben Knutson, Ruya Baraz, Hart Woolery, who created and won with their Tongue Topple Lens
We asked the Mouth Cannon Crew to share more about the project, their team, the creative process, and particularly the innovative machine learning component they developed.
Hart Woolery
When we created our team, we thought it might be fun to incorporate ML in some way. Since Snap has a wide range of face-tracking ML already, we focused on something that was not yet tracked: the user’s tongue.
From there we decided to create a game based on a tongue-driven aim mechanic, and settled on a tower-destruction theme. We needed to collect data quickly and efficiently, so I built a simple Lens Studio project to gather both images and tongue-tip keypoint data. We went around the Lensathon area to collect samples from various participants and Snap staff.
After collecting around 250 datapoints, we trained the model on a cloud-based notebook, using images as model inputs and heatmaps as model outputs. Once the model was trained, we were able to convert it to ONNX format and import it into Lens Studio. The model’s decoded keypoint plus the mouth center were used to determine an aim vector for the game. While I focused more on the ML side, our team was able to create the game and build the mechanics around aiming with your tongue.
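For readers curious how a pipeline like this fits together, here’s a minimal Python/PyTorch sketch of the kind of workflow Hart describes. It’s purely illustrative and not the team’s actual code: the architecture, crop size, heatmap resolution, and all names are our assumptions. It trains a tiny tongue-tip heatmap model, exports it to ONNX so it could be imported into Lens Studio as an ML asset, and decodes the heatmap together with the mouth center into an aim vector.

```python
# Illustrative sketch only, not the Mouth Cannon Crew's code.
# Assumptions: 256x256 face crops, one tongue-tip keypoint rendered as a
# Gaussian heatmap at 1/8 resolution, and a tiny PyTorch encoder + 1x1 head.
import torch
import torch.nn as nn

class TongueTipNet(nn.Module):
    """Tiny conv net mapping an RGB crop to a single-channel keypoint heatmap."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(64, 1, 1)  # 256 -> 32 spatial resolution

    def forward(self, x):
        return self.head(self.encoder(x))

model = TongueTipNet()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# images: (N, 3, 256, 256) crops; heatmaps: (N, 1, 32, 32) Gaussian targets.
# A real run would load the ~250 collected samples here instead of random data.
images = torch.rand(8, 3, 256, 256)
heatmaps = torch.rand(8, 1, 32, 32)

for epoch in range(10):
    optimizer.zero_grad()
    loss = criterion(model(images), heatmaps)
    loss.backward()
    optimizer.step()

# Export to ONNX for import into Lens Studio.
torch.onnx.export(model, torch.rand(1, 3, 256, 256), "tongue_tip.onnx",
                  input_names=["image"], output_names=["heatmap"])

def decode_aim(heatmap: torch.Tensor, mouth_center: tuple) -> tuple:
    """Argmax-decode the heatmap into a tongue-tip keypoint, then build a
    normalized aim vector pointing from the mouth center toward the tip."""
    _, _, h, w = heatmap.shape
    idx = heatmap.flatten().argmax().item()
    ty, tx = divmod(idx, w)
    tip = (tx / w, ty / h)
    dx, dy = tip[0] - mouth_center[0], tip[1] - mouth_center[1]
    norm = max((dx * dx + dy * dy) ** 0.5, 1e-6)
    return (dx / norm, dy / norm)
```

In the Lens itself, the decoding step would of course run on-device against the imported ONNX model; the Python version above is just the quickest way to show the idea end to end.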
Spectacles Track
Team Marshmellow: Pavlo Tkachenko, Yegor Ryabtsov, Stijn Spanhove with Fireside Tales
Team CARTDB: Nicholas Ross, Guillaume Dagens, Nigel Hartman, Uttam Grandhi with CARTDB
Representing the CARTDB Team, Guillaume Dagens shared his thoughts on the value of being a member of the AR community, taking part in events like Lens Fest, and serving as a Spectacles expert, especially with the upcoming launch of the new Spectacles in 2026.
Guillaume Dagens
For me, this whole ride has been something raw and personal. I’ve always felt like an outsider, hypersensitive to every flicker of light, every sound, every shift in the air, like the world’s volume was turned up too loud, and I couldn’t find the remote.
Meanwhile, I’d watch others online jetting off to wild events, being sponsored, treated like golden children of the digital age… while my own life felt like a mental gulag: four walls, no money, no future, and the quiet paranoia of a man trying to break out through sheer imagination.
So when I got officially invited to the Spectacles Lensathon and Lens Fest, with everything taken care of by Snap, that meant the world to me. I’m so grateful and feel so privileged to have lived that experience: the Hyatt Hotel, the Snap offices, the goodies, the food, and so much genuine kindness.
Suddenly, I wasn’t the ghost in the room anymore, I was part of the storm. Meeting other creators who live and breathe this strange art of light and code was like finding my own tribe (even though I always feel a little out of place in any crowd, but that’s on me).
And when I brought that trophy home, yeah, I’ll admit it, I cried a little. Tears of joy, disbelief, maybe relief that the grind hadn’t been for nothing after all. And then there was Lens Fest, and man, that was a trip.
The food, the panels, the people, the energy, it felt like stumbling into a futuristic carnival where everyone speaks in shaders and dreams in polygons. The staff were so kind, the vibe was warm, and for the first time in a long while, I felt like I belonged in this strange, beautiful world of yellow.
If there’s one truth I took home, it’s that Snap might just be the best company on the planet when it comes to treating people like actual human beings. I used to despise corporate culture, suits, buzzwords, fake smiles, but Snap changed my mind. You folks are the good kind of crazy.
Now I’m just burning for the next Lens Fest. I can’t wait to dive back into that electric chaos, to see everyone again, to get my hands on the new next-gen Spectacles, and to push Lens Studio into whatever wild future comes next.
And yeah, I’ll say it: I hope we, the devs who’ve been sweating pixels since day one, get to hold those new Specs in our hands. Not out of greed, but because we’ve been dreaming with you since the beginning.
If they’ve got stronger performance, better thermals, bigger waveguides — then God help us all… because the future’s going to look insanely good.
🏆 And the 1st Place winners are…
Team Where is Liam?: Candice Branchereau, Marcin Polakowski, Inna Horobchuk, Volodymyr Kurbatov, who built and won the Lensathon with their Decisionator Lens
With the project utilizing Lens Studio’s AI features, we asked Inna Horobchuk about the team’s approach to implementing AI and how they handled the pressure of creating the Lens within such a tight timeframe.
Inna Horobchuk
Our team consisted of four people: my partner in crime, Volodymyr Kurbatov, in cooperation with Marcin Polakowski and Candice Branchereau from Flat Pixel, so we had a good balance of technical and creative backgrounds.
A day before, I had a discussion with my husband about AI, how far it might go in the future, and how much we rely on AI now. It was just silly talk, but after we shared some jokes during lens brainstorming, the team picked up on this topic, and together we developed the concept of Decisionator.
It was not the first idea that came to us during brainstorming, but we trusted the process and decided to spend more time discussing before jumping into execution. We tried to picture each idea from start to finish: its technical part, and how the main concept and presentation would sound.
Honestly, I liked all of the ideas. Mostly, we were just joking around, imagining a new world with our lenses, and we picked the one we would love to keep developing, as it was witty and the most enjoyable. As my French colleague said, the Decisionator was “simply elegant”, and I think everyone agreed. We did have lots of fun during lens testing.
The main idea is to use AI for minor decision-making during the day to avoid decision fatigue at the end of the day.
In other words, every tiny decision we make during the day drains our ability to make decisions. For example, lots of famous people wear similar clothing every day to avoid unnecessary choices. On the other hand, AI that knows our preferences can choose for us.
It does not mean it will choose the same thing we would, but rather the right thing for a specific user, based on their age, zodiac sign, diet, preferences, goals, context, and previous decisions. Our lens also creates a personal profile in the cloud to store all the data of the Specs owner. It means that Spectacles from different users will make different decisions.
I think Decisionator is sort of a hands-up on the AI topic, meant to spark discussions about human decisions as reflections of personality. The main question is: what dose of “ourselves” can we delegate to AI?
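To make the mechanics a bit more tangible, here’s a small, purely illustrative Python sketch of profile-conditioned decision-making in the spirit Inna describes. None of it is the team’s code: the profile fields follow her list, while the OpenAI client and every identifier are stand-ins for whatever cloud AI and storage the actual Lens uses.

```python
# Illustrative sketch only, not the Decisionator implementation. It mimics the idea
# described above: a per-user profile (age, zodiac sign, diet, goals, previous
# decisions) kept server-side, and an AI model that picks one small option for that
# specific user. The OpenAI client is a stand-in for the Lens's real cloud AI.
import json
from dataclasses import dataclass, field, asdict

from openai import OpenAI  # stand-in model provider for this sketch


@dataclass
class UserProfile:
    user_id: str
    age: int
    zodiac_sign: str
    diet: str
    goals: list
    previous_decisions: list = field(default_factory=list)


def decide(client, profile, question, options):
    """Ask the model to pick one option for this user, given their profile."""
    prompt = (
        "You make tiny everyday decisions for a user so they avoid decision fatigue.\n"
        f"User profile: {json.dumps(asdict(profile))}\n"
        f"Question: {question}\n"
        f"Options: {options}\n"
        "Reply with exactly one option from the list."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    choice = response.choices[0].message.content.strip()
    # Store the decision so future choices can take past ones into account.
    profile.previous_decisions.append(f"{question} -> {choice}")
    return choice


if __name__ == "__main__":
    client = OpenAI()  # expects OPENAI_API_KEY in the environment
    me = UserProfile("snap-user-1", 29, "Leo", "vegetarian", ["sleep more", "eat lighter"])
    print(decide(client, me, "What should I have for lunch?", ["ramen", "salad", "pizza"]))
```

The key design point the team describes is that the profile, not the question, carries most of the signal, which is why two Specs owners asking the same thing can get different answers.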
Congratulations again to both the finalists and the winners! Watching you pitch your projects on stage was a blast, just like connecting with so many Creators in person. Big thank you to the Snap team for inviting us – we’re already looking forward to Lens Fest 2026! 💛