Shutterstock, Inc. has made its visual search features, first introduced for desktop use in March, available for mobile use. Reverse Image Search for mobile invites users to capture images of the world around them on their mobile phones and then upload those photos via the Shutterstock app to search Shutterstock’s collection of more than 80 million images for similar content and style.
Since it launched its first mobile app five years ago, Shutterstock has invested in creating easy-to-use mobile technology. Bringing machine learning to mobile is the next step toward a more mobile-centric future for images. As users upload photos captured on their phones to search Shutterstock’s collection, the neural network on Shutterstock’s back end studies and learns which types of images are most popular for mobile use rather than desktop use. Over time, it will grow to understand authentic photography taken in more natural settings. The data collected will reveal emerging trends and best practices for imagery on mobile devices.
This is the latest Shutterstock innovation designed to enable next-generation search and discovery experiences by expanding beyond keywords.
“When we unveiled Reverse Image Search this past spring, we knew that it was a perfect fit for our mobile application -- it’s arguably one of the best use cases for computer vision technology in general,” said Shutterstock CEO and founder Jon Oringer. “It’s so easy to take a picture and everyone stores hundreds or thousands of them on their phones. Now you can use those photos to help you search for and find better-quality, more suitable images for your professional needs.”
Computer vision is the ability of a computer to break an image down into its primary characteristics, both visual and conceptual, so that they can be represented numerically. The technology relies on pixel data within images - rather than metadata collected through keywords and tagging - to help identify and surface relevant content.
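To make that idea concrete, the sketch below shows how a reverse image search can work in principle: each image's pixel data is reduced to a numeric vector, and a query photo is matched against an index of those vectors by similarity rather than by keywords. This is a minimal, hypothetical illustration, not Shutterstock's actual system; it assumes a simple color-histogram feature standing in for the neural-network representation the company describes, and the functions embed and most_similar are invented here for demonstration.

```python
# Minimal sketch of pixel-based reverse image search (assumption: a color
# histogram stands in for a learned neural-network embedding).
import numpy as np

def embed(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Turn raw pixel data (H x W x 3, values 0-255) into a numeric vector:
    a normalized per-channel color histogram."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    vec = np.concatenate(hists).astype(float)
    return vec / (np.linalg.norm(vec) + 1e-9)

def most_similar(query_vec: np.ndarray, index: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k most similar images by cosine similarity."""
    scores = index @ query_vec  # vectors are unit length, so dot product = cosine
    return np.argsort(scores)[::-1][:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in "collection": 1,000 random 64x64 RGB images.
    collection = rng.integers(0, 256, size=(1000, 64, 64, 3), dtype=np.uint8)
    index = np.stack([embed(img) for img in collection])

    # A phone snapshot is embedded the same way and matched against the
    # index, with no keywords or tags involved.
    query = collection[42]
    print(most_similar(embed(query), index))  # image 42 should rank first
```

In a production system the histogram would be replaced by a learned embedding and the brute-force comparison by an approximate nearest-neighbor index, but the principle is the same: similarity is computed from the image content itself rather than from attached metadata.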