Throughout the pandemic, virtual try-on has offered a sweet sense of relief to fashion and beauty brands unable to demo their products offline.
From seeing how a new pair of sunglasses might look on your face to testing out new lipstick shades in anticipation of a mask-free future, these AR- and AI-enabled experiences have brought a small part of the traditional retail experience to lockdown. Even retailers like Wayfair and IKEA rolled out 3D and augmented reality furniture visualization tools so shoppers could ‘try before they buy’.
Only, they haven’t all been brilliant—particularly where fashion is concerned.
In ‘real life’, dressing rooms are the heart of shopping experiences. Searching for the right styles, size and fit can make or break purchasing decisions and—without advanced technology able to realistically dress an individual’s body in augmented reality—the latter is far more likely online.
Until now, perhaps.
At today’s Snap Partner Summit, Snapchat introduced a range of brand new AR try-on experiences that add more to the smartphone shopping experience than ever before.
“We believe shopping is more than the transaction,” says Carolina Arguelles, Global Product Marketing Lead, Augmented Reality at Snap, “it’s really about the experience and we believe a better experience—one that’s more immersive, connected with your friends, and emotional—is not only going to increase that buyer confidence but also build a longer term loyalty to brands and businesses.”
Partnering with fashion giants like Farfetch and Prada to start, Snap’s new machine learning technology will use a ‘3D Body Mesh’ to replicate real-life fits as Snapchatters try on virtual clothes via the camera, with voice-enabled controls letting the app know they’re looking to browse and try on in AR.
All of which was developed not only to build shopper confidence, but to act as a solution for businesses and consumers alike.
“70% of consumers feel that finding clothes online that fit is really difficult, and returns are a $550 billion problem for businesses,” says Arguelles. “Everything we’ve announced today focuses on improving the realism of these experiences and how natural it feels to interact with them.”
In addition to 3D body mesh enhancements and body tracking, a new cloth simulation machine learning algorithm has been put in place to continually recreate the movement of fabrics.
“With features like Snap ML [machine learning], which is embedded into our Lens Studio technology, we’re basically allowing machine learning experts from around the world to take the algorithms that they built, plug that into our platform, and make our cameras smarter,” she says.
Case in point: shoe try-on. Though the algorithm is actually powered by Snap partner Wanna, brands like Dior have already seen a 6.2x return on ad spend using the tech in their Snapchat promotions, benefitting all parties.
Ordinarily, try-on has been leveraged through advertising alone, where businesses could guarantee their ads would be seen, and by whom. That approach has been rather successful: brands like NYX Cosmetics have driven over 60 million try-ons of their beauty experiences in a single day.
“We believe it’s important to complement our advertising with a permanent and sustainable place where people can discover these experiences, though, so today’s launch of public profiles for businesses is the most critical launch for businesses on Snapchat,” she says.
With Public Profiles for Business, companies will be able to save their content permanently, creating something of a virtual storefront for their brands. And it’s 100% free to publish.
“Brands can hire experts to create these experiences but, if they’re savvy, they can also just plug 3D models they would have from their product development cycle into Lens Studio for free,” she says, emphasizing how important it is for the company to democratize access to AR experiences.
“We’re also making these DIY templates for AR shopping in our new tool, called Lens Web Builder, where small emerging beauty brands can simply upload the hex colour SKU of a specific item, the finish of the item—matte, glossy, whatever—and build experiences for free by just tapping a few buttons.”
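To make the DIY-template idea concrete, here is a minimal sketch of how a hex colour SKU and a finish might be turned into a renderable overlay colour. The finish-to-opacity presets, function name, and alpha values are illustrative assumptions for demonstration, not Lens Web Builder’s actual pipeline.

```python
# Hypothetical sketch: turn a brand's hex colour SKU plus a finish
# into an (r, g, b, a) overlay colour. The FINISH_ALPHA presets are
# assumed values, not Snap's real ones.

FINISH_ALPHA = {"matte": 0.85, "glossy": 0.65}  # assumed opacity per finish

def hex_to_overlay(hex_sku: str, finish: str) -> tuple:
    """Convert '#RRGGBB' to an (r, g, b, a) tuple with channels in 0-1."""
    h = hex_sku.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) / 255 for i in (0, 2, 4))
    return (r, g, b, FINISH_ALPHA.get(finish, 0.75))

# Example: a raspberry lipstick SKU rendered with a matte finish.
print(hex_to_overlay("#B03060", "matte"))
```

The point is only that a single hex code plus a finish choice carries enough information to drive a simple AR colour overlay, which is what makes the “a few taps” workflow plausible for small brands.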
And while all of this could seem like a large retail-facing investment for a camera app to take on, such investments have already offered high returns.
Last year, Snap bought conversion-boosting shopping technology leader Fit Analytics for a cool $124.4 million, and has already driven $2 million in incremental sales for advertising partners like American Eagle, via virtual Snapchat pop-up shops, through a single campaign.
“Personalization is so important — how do we make this personal to you, to your face, to your body, to your room—and that’s where try-on and features like fit recommendations with Fit Analytics really come into play,” Arguelles adds.
But they’re not the only ones. Earlier this week Wacoal, the nation’s leading intimate apparel brand, launched a revolutionary fit solution for the ever-so-tricky bra buying process.
The mybraFit app is the first of its kind, utilizing artificial intelligence (AI) and an augmented reality bra fit studio to determine an accurate bra size within minutes.
“Wacoal has always been obsessed with fit,” says Miryha Fantegrossi, Vice President of Merchandising and Design at Wacoal America. “Traditionally, we’ve delivered our fit experience in stores, but as we saw consumer shopping behaviour shifting, I really stayed awake at night wondering how we’d bring our product message and benefits to the customer as she shops online.”
The difficulty, where bra fit is concerned, is that the size charts and broad bra size calculators available online have never been accurate enough to guarantee a fit or replicate a true, tailored experience.
“When we started down this path we really, legitimately didn’t know how we were going to do it, we just knew we wanted to offer her something better,” she admits.
Most important to the process, in Fantegrossi’s mind, was making sure the customer had to do as little as possible to achieve a perfect fit. No measuring tape. No consultation. No fuss. No fee.
In total, the development process took two years.
“Bra sizing is specific to the millimetre, so accuracy is super important. We went through several stages of research and validation, looking at the accuracy of potential tech partners, and landed on Sizer,” she says.
Sizer’s technology, which has now been woven into Wacoal’s own proprietary bra sizing algorithm, simply takes a digital image of the customer in three different poses, via smartphone, and calculates her results.
Fantegrossi knew it wouldn’t just be relevant to a time-poor generation (who, perhaps, couldn’t think of anything worse than having a stranger measure their chest), but be the best possible way to plug her team’s 70-plus years of fit intelligence into something tangible.
“Our size recommendation is now completely different to what the rest of the industry is using,” Fantegrossi says, “we matched great knowledge with great tech. Old sizing charts, the ones that were originally developed in the 1930s off of Victorian shirt sizes, can’t compete.”
Both the algorithm and the style recommendation engine, which presents the customer with a wardrobe of perfectly chosen pieces after her fitting, have patents pending.
Of course, expertise and patents aren’t always enough. And beauty brands, arguably, have it hardest.
While faces are fairly easy to map, meaning it’s not terribly hard to place color on a face, using AR and AI to suggest color is challenging from a technical standpoint.
“I’ve been searching for an intuitive foundation shade finder tool since I started Cult Beauty, in 2008, and nothing has lived up to the experience of having a professional match you in daylight,” says Alexia Inge, founder of Cult Beauty.
“There are so many variables from light, skin tones, prevalent undertones, device, screen, OS, to product density, oxidation, as well as preferences for coverage levels, finish, brand and skin-type,” she says. “Then, if you layer personal nuances like wanting a slightly lighter or darker tone than ‘true’, you have the sort of brief that sends most developers running for the hills.”
Still, she was desperate to offer such a service to her digital-first Cult Beauty customers, so set out searching for a diamond in the virtual rough. Which is where MIME came in.
In 2016, MIME launched as a “Shazam for makeup” where someone could take a photo of any color and receive product recommendations (toenail polish, lipstick, et al) that matched.
Eventually, they realized users were searching for products to match their skin above all else, so decided to shift focus onto the complex task of foundation matching.
By the time Inge discovered MIME and met with its founder, Chris Merkle, she was sold. “I was struck by Chris’ real understanding, not only of light detection and machine learning, but also his grasp of the emotional connection that people have with their complexion product choices, the psychology of foundation,” she says. “It’s as personal as underwear, as potentially loaded as the term, ‘nude’, as transformational as a great haircut and an instant tool that lends confidence to deep insecurities. Getting it wrong not only wastes money and time but also, and most importantly, trust.”
After a few months finessing the algorithms and product offering with Cult Beauty’s own, the brands launched MatchMe, a foundation-matching AI experience that requires little more than a selfie.
After the customer self-classifies their skin type and foundation needs, the app runs their selfie data through Cult Beauty’s inventory and selects shades that are not only the closest match but also suit their desired coverage, finish, skin type, and color preference (around 75% of shoppers like a perfect shade match, while 25% may prefer a shade lighter or darker than their ‘true’ color).
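The filter-then-rank step described above can be sketched in a few lines. Everything here is an assumption for illustration: the field names, the simple Euclidean colour distance, and the `offset` parameter standing in for the lighter/darker-than-‘true’ preference; MatchMe’s real matching is proprietary.

```python
# Hedged sketch of inventory shade matching: filter by stated
# preferences, then rank by colour distance to the estimated skin
# tone. Data shapes and the distance metric are illustrative.
from math import dist

def match_shades(skin_rgb, inventory, finish, coverage, offset=0.0):
    """Return matching shades sorted by closeness to skin_rgb.
    offset > 0 shifts the target lighter, offset < 0 darker,
    mimicking shoppers who prefer a non-'true' match."""
    target = tuple(min(1.0, max(0.0, c + offset)) for c in skin_rgb)
    candidates = [s for s in inventory
                  if s["finish"] == finish and s["coverage"] == coverage]
    return sorted(candidates, key=lambda s: dist(s["rgb"], target))

inventory = [
    {"name": "Shade 120", "rgb": (0.82, 0.64, 0.52),
     "finish": "matte", "coverage": "full"},
    {"name": "Shade 140", "rgb": (0.74, 0.56, 0.45),
     "finish": "matte", "coverage": "full"},
]
print(match_shades((0.72, 0.55, 0.44), inventory, "matte", "full")[0]["name"])
```

Filtering before ranking keeps the hard perceptual problem (colour distance) separate from the easy categorical one (coverage and finish), which is roughly the split the article describes.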
“I identified an opportunity where we could understand lighting better than many of our competitors,” says Merkle. “If you can’t understand the environmental lighting, you throw all color accuracy out the window. We knew that before we could be experts at any product recommendation, we had to understand light.”
MIME’s Color Absorption Model, which understands how foundation blends into the skin and even dries/oxidizes once out of its packaging, is also unlike anything else in the industry. “This allowed us to create a color model that goes beyond an alpha transparency layer technique that most virtual try-on tools use,” he says. “As a founder, I had to know that our recommendation would be the perfect shade when the customer opened the package at home and put it on their skin.”
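The distinction Merkle draws can be illustrated with a toy comparison. The first function is the flat alpha-overlay technique he says most try-on tools use; the second is an assumed, much-simplified absorption-style blend in which the skin tone shows through the pigment, not MIME’s proprietary model.

```python
# Illustrative contrast between a flat alpha overlay and a toy
# absorption-style blend. The absorption formula is an assumption
# for demonstration, not MIME's Color Absorption Model.

def alpha_blend(skin, product, alpha=0.6):
    """Standard alpha compositing: product painted over skin."""
    return tuple(alpha * p + (1 - alpha) * s for p, s in zip(product, skin))

def absorption_blend(skin, product, coverage=0.6):
    """Toy absorption model: mix a pigment-skin interaction term
    (geometric mean) with the bare skin tone, so the underlying
    complexion influences the rendered colour."""
    return tuple((s * p) ** 0.5 * coverage + s * (1 - coverage)
                 for p, s in zip(product, skin))

skin = (0.72, 0.55, 0.44)        # example skin tone, linear RGB in 0-1
foundation = (0.80, 0.62, 0.50)  # example foundation shade
print(alpha_blend(skin, foundation))
print(absorption_blend(skin, foundation))
```

Even in this toy version, the overlay converges on the product colour regardless of the wearer, while the absorption blend stays anchored to the skin underneath, which is why the latter better predicts how a shade looks once applied.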
Behind the scenes, however, color-matching isn’t as time-consuming as you might think. Though each recommended shade must be shipped to MIME for scanning before it can be included in the algorithm, it typically only takes one to two days to complete the analysis. Currently, over 60% of Cult Beauty’s foundations feature in MatchMe.
“In the past, there has been a large focus on gamification and not quite enough practical consumer problem solving, and if nothing else 2021 will be the year of time-saving pragmatism,” says Inge, proud of their collective accomplishment.
After a wholly unexpected year of lockdowns and retail doomsdays, it seems AR and AI might just become essential to helping fashion and beauty brands not only survive, but thrive.
“The good news, and the reason why we’re seeing such positive momentum, is that it isn’t just because of a pandemic,” Arguelles adds.
“The challenge of wanting to reach people wherever they are has been there well before the pandemic, and will continue to be with the rise of e-commerce. How do you recreate that emotional connection that you have when you physically walk in the store? But in a much more sustainable way? It’s an exciting investment, and it’s really just the beginning.”