Retailers have turned to AI to replace photoshoots and predict what people will want to buy and wear in the future
Julie Bornstein spent two years quietly building AI shopping app THE YES, planning to launch it in March 2020. Then the pandemic struck – and changed what people were wearing. “Right now, we’re in a heavy comfort zone,” Bornstein says. The pandemic has driven up demand for tracksuit bottoms and other work-from-home clothing. But as vaccines allow people more freedom, the trend is expected to reverse.
THE YES is part of a new wave of companies using AI to personalise how people shop online. It pulls items of clothing from brands’ and retailers’ websites and shows them in a feed within the app. Think of it as a clothing version of Tinder: if users like the dress being shown, they tap “yes”. If they’re not interested, they tap “no”. But, unlike Tinder, it can improve the items it shows over time using artificial intelligence and machine learning.
Every like and dislike is fed back to the underlying machine learning models to inform each personalised feed of items users can then buy, and no two people’s recommendations are the same. “AI is simply the ability to understand consumer behaviour and act on it,” says Bornstein, the former chief operating officer of personal styling service Stitch Fix. “The problem with e-commerce is that the infrastructure doesn’t exist to do that today. You need to rebuild the tech stack.”
During its two years in stealth mode, THE YES built a system to pull in and standardise product data from existing brands, and developed its recommendation algorithms. Each time a customer installs the app, they’re asked a series of questions about what they like and dislike. Their recommendations are then refined as they “yes” or “no” the products they’re shown. “We factor in hundreds of data points”, Bornstein says. These include preferred brands, price range, size and item silhouettes.
Since it launched in May 2020, there have been more than seven million “yes” and “no” entries into THE YES’ recommendation system, and Bornstein says the firm is on the tenth version of its algorithm. “Really, what we’re doing is ranking the web according to each user.”
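THE YES hasn’t published how its models work, but the swipe-and-rank loop the company describes can be sketched in a few lines. Everything below – the class, the attribute names, the toy catalogue and the simple linear scoring – is an illustrative assumption, not the firm’s actual system.

```python
# Illustrative sketch only: a toy recommender in the spirit of THE YES's
# yes/no feed. Each item is a bag of attributes (brand, price band,
# silhouette); the model keeps a per-user weight for each attribute,
# nudged up on every "yes" and down on every "no". All names are hypothetical.
from collections import defaultdict

class ToyFeed:
    def __init__(self, learning_rate=1.0):
        self.weights = defaultdict(float)  # attribute -> learned preference
        self.lr = learning_rate

    def record(self, item_attrs, liked):
        """Feed one swipe back into the model, as the app does with each tap."""
        delta = self.lr if liked else -self.lr
        for attr in item_attrs:
            self.weights[attr] += delta

    def score(self, item_attrs):
        return sum(self.weights[a] for a in item_attrs)

    def rank(self, catalogue):
        """Return items best-first: 'ranking the web according to each user'."""
        return sorted(catalogue, key=lambda item: -self.score(catalogue[item]))

catalogue = {
    "floral dress": {"brand:A", "price:mid", "silhouette:fitted"},
    "tracksuit bottoms": {"brand:B", "price:low", "silhouette:relaxed"},
    "blazer": {"brand:A", "price:high", "silhouette:fitted"},
}
feed = ToyFeed()
feed.record({"brand:A", "price:mid", "silhouette:fitted"}, liked=True)
feed.record({"brand:B", "silhouette:relaxed"}, liked=False)
print(feed.rank(catalogue))  # items sharing liked attributes rank first
```

In a real system the scoring would be far richer – hundreds of data points, learned embeddings, repeated retraining – but the shape of the loop is the same: every tap updates the model, and the feed is re-sorted per user.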
Aside from THE YES, which currently operates only in the US but plans to expand to the UK, a wave of firms is deploying AI in a bid to transform the fashion industry. Research published by Google’s Cloud business in November 2020 found retailers were looking to use AI in ten different areas of their business, from demand prediction to customer loyalty schemes and product personalisation. Analysis from Meticulous Research estimates AI in retail will be worth $19 billion by 2027, and companies have used the pandemic to speed up its adoption.
Fashion retailers have turned to AI to make their businesses more efficient, replace photoshoots and predict what people will want to buy and wear in the future. Startup Finesse is using AI to trawl the web to predict what the next trend may be, then using algorithmic design to produce small runs of clothing within 25 days. The firm says it uses 3D modelling software for all of its gender-neutral clothing to reduce costs and cut the waste created during sampling.
With people unable to go out and buy – or even try on – clothes, development of the technology behind virtual fitting rooms has accelerated. During the pandemic, Israel-based Zeekit has used its AI to let brands including ASOS, Macy’s and adidas hold virtual photoshoots. The AI maps clothes onto people’s bodies – either models or potential customers who upload their own photos to an app.
“With our algorithms, it takes one second per model to render apparel on models,” says Larissa Posner, a former model and now CEO of StyleScan. The company has built virtual try-on software that lets people upload a photo and see new clothes rendered on their own image – it’s being used by retailers to show clothes on a range of models, and it is working on expanding to consumers. A set of custom-built neural networks detects people’s joints and can understand material types and movement.
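StyleScan’s networks are proprietary, so the sketch below only illustrates the final geometric step, assuming a pose detector has already returned joint coordinates: a garment image’s own shoulder anchor points are aligned to the detected shoulders with a simple scale-and-translate transform. The function names, coordinates and the two-joint simplification are all hypothetical.

```python
# Hypothetical sketch of the warping step behind virtual try-on. A real
# pipeline would use many joints, rotation, and cloth deformation; this toy
# version aligns a garment to two detected shoulder keypoints only.
def fit_garment(garment_anchors, body_joints):
    """Map garment pixel coords so its shoulder anchors land on the body's.

    garment_anchors / body_joints: dicts with 'l_shoulder' and 'r_shoulder'
    as (x, y) tuples. Returns a function mapping garment (x, y) -> body (x, y).
    """
    gl, gr = garment_anchors["l_shoulder"], garment_anchors["r_shoulder"]
    bl, br = body_joints["l_shoulder"], body_joints["r_shoulder"]
    scale = (br[0] - bl[0]) / (gr[0] - gl[0])  # ratio of shoulder widths
    def transform(pt):
        # Translate relative to the left shoulder, then scale to the body.
        return (bl[0] + (pt[0] - gl[0]) * scale,
                bl[1] + (pt[1] - gl[1]) * scale)
    return transform

# A garment photographed with shoulders at x=10 and x=30 is laid onto a
# body whose detected shoulders sit at x=100 and x=140 (twice as wide).
warp = fit_garment(
    {"l_shoulder": (10, 20), "r_shoulder": (30, 20)},
    {"l_shoulder": (100, 50), "r_shoulder": (140, 50)},
)
print(warp((20, 20)))  # garment midpoint lands midway between the shoulders
```

The detection side – finding those joints in an uploaded photo in the first place – is where the custom neural networks Posner describes would do the heavy lifting.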
“They just ship us one size of their physical apparel,” Posner says, “and then they access the backend and see all the models digitally dressed.” One of StyleScan’s investors is a production studio in Los Angeles where it conducts photoshoots with models.
Posner says StyleScan is working with brands, all of which have “over $100 million in annual revenue,” to build its technology into their existing apps. The goal is to add a small icon that people can tap on, upload some photos, and then see themselves wearing the potential purchases. It’s also working on a way to replace virtual dressing rooms with short videos and not just still images, Posner says. “Eventually, we’re going to see influencers being digitally dressed.”