Google, with its strong focus on generative AI, is launching a new feature on Google Shopping that displays clothing on a range of real-life fashion models. This innovative virtual try-on tool for apparel uses an image of an outfit and predicts how it would look on different models in various poses.
Powering this feature is a new diffusion-based model developed by Google. Diffusion models, a family that includes the text-to-image generators Stable Diffusion and DALL-E 2, learn to subtract noise from an initial image made entirely of noise, gradually moving it closer to a target image.
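To make the idea concrete, here is a minimal, purely illustrative sketch of that iterative denoising loop. It is not Google's model: a real diffusion model would use a trained neural network to predict the noise at each step, whereas this toy `predict_noise` function cheats by computing the gap to a known target directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": a trained model would learn a whole distribution of images;
# here we fix a single target purely for illustration.
target = np.linspace(0.0, 1.0, 8)

def predict_noise(x, target):
    # Stand-in for the trained denoising network: it returns the
    # residual noise separating the current sample from the target.
    return x - target

def denoise(steps=10):
    # Sampling starts from pure noise, as in real diffusion models.
    x = rng.normal(size=target.shape)
    for _ in range(steps):
        # Subtract a fraction of the predicted noise at each step,
        # gradually moving x toward the target image.
        x = x - 0.5 * predict_noise(x, target)
    return x

sample = denoise()
# The residual error halves every step, so after 10 steps the sample
# sits very close to the target.
print(np.abs(sample - target).max() < 1e-2)
```

Each iteration replaces `x` with `0.5 * x + 0.5 * target`, so the distance to the target shrinks geometrically; real samplers use learned, time-dependent noise estimates instead of this closed-form shortcut.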
The model was trained on numerous image pairs, each showing a person wearing the same outfit in two distinct poses. To combat visual defects and enhance the model's robustness, training also included randomly matched pairs of garment and person images.
Starting today, U.S. shoppers using Google Shopping can virtually try on women's tops from brands like Anthropologie, Everlane, H&M, and LOFT, indicated by the new "Try On" badge on Google Search. Men's tops will be launched later in the year.
While virtual try-on tech isn't a new concept, Google's use of generative AI takes it a step further. Although the feature may face pushback from fashion models, Google has stressed its commitment to using real, diverse models spanning sizes XXS to 4XL and representing different ethnicities, skin tones, body shapes, and hair types.
Alongside the virtual try-on rollout, Google is launching filtering options for clothing searches on Shopping, powered by AI and visual matching algorithms. These filters let users narrow their searches across stores using inputs like color, style, and pattern.