Six Big Companies That Use Visual Search

Companies that use visual search are able to achieve next-level ease and opportunity for people trying to figure things out.

Searching with images allows people to find what they’re looking for without having the words to describe it. This is a powerful modern affordance, especially for those who don’t know the necessary vocabulary or how to optimize their searches online. It also breaks down language barriers, so people across cultures and countries can search for anything they see, such as a food from another country that they enjoyed but can’t name.

While the basic idea is the same, companies use and implement visual search in different ways. Here are a few cool examples of trials and triumphs in visual search technology from several prominent organizations.


Google

Google is an obvious example when you think of visual search. The Google Images search engine itself has a “search by image” function. The Google Lens mobile application enables Android users to perform visual searches by simply pointing their camera at an object.

The company first released a visual search app in 2009 called Google Goggles, which was essentially discontinued earlier this year. Google’s current artificial intelligence-powered application, Lens, combines its computer vision and natural language processing with its search engine capabilities.

When users point the app at objects, colored dots appear over them to show Google has detected them. If a user taps one of those dots, a window will appear with more information about that object. This might include its brand, creator, author, price, etc., depending on what the object is.

Google Lens detects a dog and identifies its breed.


Google says Lens can identify apparel and home goods, barcodes, business cards, books, event flyers, landmarks/buildings, paintings in museums, plants, and animals. When users scan apparel, Lens will find online listings of the same or similar products. If it’s a business card, Lens saves the information to a new contact. When Lens detects an event flyer, it adds that event to the user’s calendar. And when the user scans a plant or animal, a window pops up to describe more about the species/breed.

Google Lens is now built into 10 Android phones through a button within the camera app. Other Androids can access it through Google Assistant. Apple users can also use Google Lens visual search on existing photos in their camera roll through Google Photos.


Snapchat

Snapchat announced in late September that it is working on a visual search tool, which will allow users to search Amazon in real time by focusing their camera on a product.

When users press down on an object through Snapchat’s camera, it will generate a thumbnail of that object’s product page on Amazon. Users will be able to click the thumbnail to open the Amazon page on the app or mobile website.

Snapchat determines whether there is an object, and Amazon finds the object’s unique identifiers to search product listings.


In this example video, produced by Josh Constine of TechCrunch, one Snapchat user searches for a pair of sneakers by focusing the camera on the shoe itself. A product page for a pair of the shoes pops up on the screen. Another user searches for a bottle of foundation by scanning the barcode of a bottle they already have. Several Amazon listings show up within the Snapchat application.

Forbes reported Snapchat’s tool could help visual search catch on in the mainstream by making it more natural. It may also further normalize shopping on social media.


Amazon

Obviously, Amazon is half the equation in Snapchat’s visual search update. But just days before the Snapchat partnership was announced, Amazon also added an image browsing and search tool to some of its shopping categories.

Using the browsing tool, called Scout, users glance through a variety of products and click thumbs up or down to narrow Amazon’s results to things they’d be more interested in purchasing. Amazon currently offers this feature for products in seven categories: furniture, home décor, lighting, kitchen and dining, patio, bedding, and women’s shoes.

In the “coffee tables” section of home décor, we start with this:

By liking only the elephant coffee table and disliking all the other starting tables, we wind up with a spread of more nature- and animal-themed pieces:

For these seven categories, users can browse items they might like using images instead of keywords. Digital Commerce 360 reported other categories will be added to Scout search soon, such as handbags, women’s apparel, and toys.
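Amazon has not published how Scout works internally, but the visible behavior of thumbs-up/thumbs-down narrowing can be sketched as re-ranking a catalog by votes. In this hypothetical sketch, each product carries a set of descriptive tags, and candidates are scored by how many tags they share with liked items minus how many they share with disliked ones:

```python
# Hypothetical sketch of Scout-style narrowing. The tag-based scoring here
# is invented for illustration; Amazon's actual model is not public.

def score(item, liked, disliked):
    """Count attributes shared with liked items, minus those shared with disliked."""
    s = 0
    for other in liked:
        s += len(item["tags"] & other["tags"])
    for other in disliked:
        s -= len(item["tags"] & other["tags"])
    return s

def rerank(catalog, liked, disliked):
    """Order items not yet voted on so the most vote-compatible come first."""
    voted = {i["name"] for i in liked + disliked}
    remaining = [i for i in catalog if i["name"] not in voted]
    return sorted(remaining, key=lambda i: score(i, liked, disliked), reverse=True)
```

Under this sketch, liking only the elephant coffee table (tagged, say, animal and wood) and disliking a glass modern table would push other animal- and wood-themed pieces to the top, matching the behavior described above.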

Forever 21

Through its mobile website and app, Forever 21 offers a “Discover Your Style” module. The module allows customers to narrow their clothing searches based on simple icon representations of styles.

Shoppers can adjust features like color, neckline, hem length, and general type using the module, narrowing their options.
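The narrowing behavior described above amounts to faceted filtering on visual attributes. Donde Search's platform internals aren't public, so this is only an illustrative sketch of the visible behavior, with made-up attribute names:

```python
# Illustrative only: attribute names and values are invented; the real
# module is powered by Donde Search's proprietary platform.

def discover(products, **wanted):
    """Keep products matching every attribute the shopper has selected."""
    return [p for p in products
            if all(p.get(attr) == value for attr, value in wanted.items())]
```

Each icon the shopper taps would add one more attribute constraint, progressively narrowing the result set.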

Forever 21 debuted this feature in its app in May for dresses and tops. Over the next month, the company saw a 20 percent increase in average purchase value in those categories, plus an increase in sales conversions. After that, Forever 21 picked up the pace to integrate visual search across its website. By the end of August, it had applied “Discover Your Style” visual search to all women’s clothing.

To create this visual search, Forever 21 partnered with Donde Search, whose B2B platform’s algorithm recreates the way shoppers think about products to improve recommendations. The platform uses artificial intelligence, computer vision, and natural language processing to do so.


eBay

In October 2017, eBay launched a mobile feature to let users search its site for products using images from the internet, social media, their camera roll, and directly from their camera. More recently, the company implemented an internal visual search. Now, if a user is interested in a certain item on eBay, they can drag and drop that listing’s image into the eBay search bar to find similar products.

eBay uses convolutional neural networks to process images uploaded to its site. The network converts them to vector representations that are more easily compared to one another for similarity.
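The comparison step can be sketched in a few lines. Assuming a convolutional network has already mapped each image to a fixed-length vector (the actual network eBay uses is not reproduced here), finding similar listings reduces to comparing vectors, commonly with cosine similarity:

```python
from math import sqrt

# Sketch of the similarity step only: the embedding vectors below stand in
# for the output of some CNN; similar images yield vectors pointing the
# same direction, so a higher cosine similarity means a closer match.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(query, catalog):
    """Return the listing id whose embedding is closest to the query image's."""
    return max(catalog, key=lambda lid: cosine_similarity(query, catalog[lid]))
```

In production, a catalog of millions of listings would use an approximate nearest-neighbor index rather than this exhaustive scan, but the underlying comparison is the same.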

What’s more, in August eBay Motors launched a feature to let shoppers find car parts via a diagram of their vehicle. If a shopper doesn’t know the name of the part they need to fix their car, they can open an interactive schematic diagram of the vehicle and tap the image of the part. Then they’ll be redirected to eBay’s stock page of the original equipment and aftermarket parts.
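Behind a tappable diagram like that sits a simple idea: each part occupies a known region of the schematic, and a tap is hit-tested against those regions. The part names, coordinates, and layout below are invented for illustration; eBay's actual implementation is not public:

```python
# Hypothetical hit-test for a tappable parts schematic. Part names and
# bounding boxes are made up for this sketch.

PART_REGIONS = [
    {"part": "alternator", "bbox": (120, 40, 200, 110)},
    {"part": "radiator",   "bbox": (10, 30, 110, 150)},
]

def part_at(x, y):
    """Map a tap coordinate on the schematic to the part drawn there."""
    for region in PART_REGIONS:
        x1, y1, x2, y2 = region["bbox"]
        if x1 <= x <= x2 and y1 <= y <= y2:
            return region["part"]
    return None
```

Once the tapped part is identified, the app can redirect to that part's listings page, as described above.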

Auto parts are a prime example of how visual search can save the day when a person doesn’t possess the necessary vocabulary to enter a keyword search for what they need.


Wayfair

Wayfair's first visual search tool in 2017.

Visual search seemed like an obvious fit for Wayfair customers to find furniture they want.

On the company’s technology blog, Cung Tran wrote in mid-2017, “While we offer great text and faceted search, those features can only go so far. It can be difficult to precisely describe an item in mind, especially in a way search engines can understand. But there’s one type of search query that’s very explicit, specific, and interesting — an image that you upload from your phone.”

He was announcing Wayfair’s first iteration of visual search. It consisted of a tool on the company’s app that let a user snap a photo, crop out an object, and search for similar products.

A year later, Wayfair’s tech team has made significant improvements to the tool’s usability. Now Wayfair’s Object Detection locates and classifies objects within an image, identifying their positions with small white dots. Users can tap between those dots to look at products similar to the various objects in an image. In the future, the tech team plans to implement real-time object detection.
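The dot-placement step described above can be sketched independently of the detector itself. Assuming an upstream object detection model (Wayfair's actual model is not public) returns labeled boxes with confidence scores, placing a tappable dot at the center of each confident detection looks something like:

```python
# Sketch of the dot-placement step only. The detection tuples stand in for
# the output of some object detector: (label, confidence, bounding box).

def detection_dots(detections, min_score=0.6):
    """Place one tappable dot at the center of each confident detection."""
    dots = []
    for label, score, (x1, y1, x2, y2) in detections:
        if score >= min_score:
            dots.append({"label": label, "x": (x1 + x2) / 2, "y": (y1 + y2) / 2})
    return dots
```

Tapping a dot would then trigger a similarity search scoped to that one object's crop of the photo.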

Wayfair visual search in 2018.


Visual search has different applications depending on end users’ needs. These are just a few prominent companies testing out their own capabilities, some of which are still in beta. Though many current uses are somewhat niche and tucked away within company sites and apps, visual search is poised to become an increasingly central part of people’s daily web searches.

Paxtyn Merten


Paxtyn is a student at Northeastern University studying journalism and data science.