London-based Snap Fashion uses algorithms to help you find outfits you’ve seen on someone else, and tells you where to buy them. “We’ve all had that moment when you admire a stranger’s look but have no idea where to get it,” says founder and CEO Jenny Griffiths. “So I thought, why don’t we search and shop using photos?” If you see an outfit you like, snap a picture or copy the link and upload it to Snap Fashion’s iOS app, which will find the closest match. Launched last September, Snap Fashion hit 100,000 users in April. It was conceived while Griffiths was studying machine vision at Bristol University. Although lacking fashion experience, she hit on the idea of a visual search engine: a set of algorithms that break down images rather than words.
Snap Fashion’s technology strips out a photo’s extraneous visual data and focuses on colour, cut and texture: sleeve length, style, neckline. The app then combs through the inventories of 170 retailers including Net-A-Porter, Topshop and French Connection to find a match. Snap Fashion earns a commission from the retailer when someone buys an item through the app. The algorithms also incorporate a technology called background segmentation, which enables the app to recognise images in any context, from catwalk snaps to catalogue photos. “You can have a go with any photo,” says Griffiths. “Even by taking a picture of a model you’ve seen on TV.”
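The article doesn’t disclose Snap Fashion’s actual algorithms, so the following is only a minimal sketch of the general pipeline it describes: segment the garment from the background, describe it by colour and texture, and pick the closest item from a catalogue. It uses standard OpenCV and NumPy calls; the file paths, the central-rectangle seeding for GrabCut and the 70/30 score weighting are illustrative assumptions, not the company’s method.

```python
import cv2
import numpy as np

def segment_foreground(img):
    """Crude background segmentation with GrabCut, seeded by a central rectangle
    (assumes the garment is roughly centred in the photo)."""
    mask = np.zeros(img.shape[:2], np.uint8)
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    h, w = img.shape[:2]
    rect = (w // 8, h // 8, 3 * w // 4, 3 * h // 4)
    cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    # Keep definite and probable foreground pixels only.
    return np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)

def describe(img, mask):
    """Colour descriptor (HSV histogram) plus a simple texture proxy
    (histogram of edge orientations over the foreground)."""
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    colour = cv2.calcHist([hsv], [0, 1], mask, [30, 32], [0, 180, 0, 256])
    colour = cv2.normalize(colour, colour).flatten()
    grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(grey, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(grey, cv2.CV_32F, 0, 1)
    angles = np.arctan2(gy, gx)[mask > 0]
    texture, _ = np.histogram(angles, bins=16, range=(-np.pi, np.pi), density=True)
    return colour, texture

def similarity(a, b):
    """Blend colour-histogram correlation with texture-histogram distance.
    The 0.7/0.3 weighting is an arbitrary choice for this sketch."""
    colour_score = cv2.compareHist(a[0], b[0], cv2.HISTCMP_CORREL)
    texture_score = 1.0 - np.linalg.norm(np.asarray(a[1]) - np.asarray(b[1]))
    return 0.7 * colour_score + 0.3 * texture_score

def best_match(query_path, catalogue_paths):
    """Return the catalogue image whose descriptor is closest to the query photo."""
    query = cv2.imread(query_path)
    q_desc = describe(query, segment_foreground(query))
    scores = []
    for path in catalogue_paths:
        item = cv2.imread(path)
        scores.append((similarity(q_desc, describe(item, segment_foreground(item))), path))
    return max(scores)[1]

# Hypothetical usage: match a street snap against a handful of retailer product shots.
# print(best_match("street_snap.jpg", ["dress_a.jpg", "dress_b.jpg", "dress_c.jpg"]))
```

In a production system the catalogue descriptors would be precomputed and indexed rather than recomputed per query, and the hand-built histograms would likely give way to learned features, but the segment-describe-compare structure is the part the article gestures at.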