Emily Sermons

Adobe Stock

Visual Search Controls

HERE'S THE GIST

Adobe Stock’s Visual Search Controls allow users to find images with content, color, and composition similar to those of a source image.

01. Problem & Background

Aspects of Similarity

In 2016, Adobe Stock followed in the footsteps of search giants like Google, Bing, and Pinterest and launched a visual search feature. Like the ones that came before it, the feature let users upload an image and see similar-looking images within the Adobe Stock library. It added one twist: you could also choose an image you liked from your Adobe Stock search results and run a visual search on it. For a website that sold stock photography, it seemed only natural that you should be able to search with images rather than words.

However, as time went on, we found that our visual search feature just wasn’t performing as we expected. We expected the number of images purchased after a visual search to be much higher than the number purchased after a regular keyword search. To our surprise, the conversion numbers were almost identical. We also saw large numbers of users abandoning the visual search experience once they were in it.

The team began to investigate: while some visual searches returned satisfactory results, many others returned odd, seemingly incorrect result sets that didn’t match the user’s intent. When users received image results that didn’t meet their expectations, all they could do was scratch their heads and wonder what went wrong.

A year after the feature launched, the Stock team decided to go back to the drawing board to understand why it wasn’t working the way they expected it to. They formulated two hypotheses. The first was algorithmic: the machine learning model needed to be trained on a wider variety of images before the feature could be updated. The second was that the visual search feature had no way to capture user intent.

So many components make up an image, and there are even more reasons a user might like any one of those components. Instead of just asking our users what images they liked, was there a way to also ask them what it was about the image that they liked?

It was from the combination of these two hypotheses that the next generation of visual search on Adobe Stock was born: Visual Search Controls, which allowed users to find images with similar content, similar color, or similar composition to their chosen source image.
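Conceptually, this kind of aspect-aware search ranks the library along whichever feature space the user selects. Here is a minimal sketch of that idea in Python; the random vectors, the eight-dimensional embeddings, and the rank_by_aspect helper are all hypothetical stand-ins for the actual learned models, not Adobe's implementation:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_by_aspect(query, library, aspect, top_k=5):
    """Rank library images against the query along one aspect.

    `aspect` is "content", "color", or "composition"; each image carries a
    separate (hypothetical) feature vector per aspect.
    """
    scored = [(image_id, cosine_similarity(query[aspect], feats[aspect]))
              for image_id, feats in library]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

# Toy data: random vectors stand in for learned embeddings.
rng = np.random.default_rng(0)
ASPECTS = ("content", "color", "composition")
library = [(f"img_{i}", {a: rng.random(8) for a in ASPECTS}) for i in range(100)]
query = {a: rng.random(8) for a in ASPECTS}
print(rank_by_aspect(query, library, "composition"))
```

The point of the sketch is only that "similar by color" and "similar by composition" are separate nearest-neighbor problems over separate feature spaces, which is what made asking users *which* aspect they cared about possible.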

 

02. Project Setup

Passing the Baton

My involvement in this project began in a very non-traditional way. When the project kicked off, I was the Experience Designer working on Libraries and Asset Management for Adobe Stock. While I had previously done a few one-off projects for Adobe Stock search, our team had another senior designer devoted to search for the site. About mid-way through the Visual Search Controls project, that designer announced he was leaving the team. I became the new designer leading search design and worked closely with him during his last month at Adobe to transfer knowledge and finish the feature before its big launch at Adobe MAX 2018.

Prior to my involvement, the search designer had completed research on the project and user tested some UI design concepts. When I came onto the project, he was finishing up the latest round of usability testing, and we reviewed the results together.


When participants tried out the prototype that allowed them to find similar images by content, color, and composition, the concept clicked quickly. They were able to articulate what kind of images they would expect if they selected each of the three options. Several interviewees could even point to past projects where the feature would have helped them in their stock image search.

 

03. User Validation & Algorithm Adjustments

Machine Learning Curve

My two biggest contributions to this project came after most of the design work had wrapped. First, I needed to ensure that the results users received while using the feature actually matched their expectations. While our engineering team finished the new algorithm, I got to work on a series of surveys to learn more about how people understood the concepts we were working with.

I started off as simply as I could: I surveyed members of our internal Adobe Design team who had searched for and purchased Stock imagery in the last 12 months, and received 55 responses. The survey asked them to define, in their own words, what the content, color, and composition of an image meant to them. It then showed them three different stock images and asked them to describe the content, color, and composition of each.

Their answers showed how users interpreted the concepts we were introducing, and which (if any) were trickiest. As it turned out, most users grasped content and color quite easily.

When talking about composition, however, users gave varied responses. Most understood that the composition of the image had something to do with the way the subjects were arranged in the photo. But many expanded on it further: some believed that the size of the image subjects fell under composition, while others focused heavily on words like “foreground” and “background.” I also found that when asked to identify the elements of composition in particular images, users’ answers varied depending on whether they were viewing a landscape image or an image with a clear subject.

I brought the results of this early reconnaissance to my product manager and the engineers training the machine learning algorithm for composition, and we agreed that we needed to research our users’ expectations further before finalizing the feature. We planned a prototype that tested two versions of the composition algorithm to determine which one (if either) provided visual search results matching our users’ expectations.

We reached out to the same 55 survey respondents with another survey, which asked them to play with our new prototype and perform any visual search they liked. For each visual search they ran, the prototype returned two sets of results side by side; respondents answered which of the two result sets contained compositions most similar to their source photo. They explained their answers and could submit the survey as many times as they liked.
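Aggregating those head-to-head answers was, conceptually, simple vote counting. The sketch below is a hypothetical illustration of that tally; the record format and sample rows are invented, not our actual survey data:

```python
from collections import Counter

# Hypothetical records: (respondent_id, preferred_algorithm, free-text reason).
responses = [
    ("r01", "A", "Subject placement matched my source photo more closely."),
    ("r01", "B", "B kept the horizon line in the same position."),
    ("r02", "A", "A preserved the foreground/background split."),
]

votes = Counter(preference for _, preference, _ in responses)
total = sum(votes.values())
for algorithm, count in votes.most_common():
    print(f"Algorithm {algorithm}: {count}/{total} comparisons ({count / total:.0%})")
```

The free-text reasons mattered as much as the counts: they told us *when* each algorithm won, not just how often.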

We came out with two major findings from this round of surveys:

First, both algorithms provided results that matched users’ expectations, but one had a slight edge over the other. We used our users’ written responses to understand why one algorithm sometimes worked better, and we also recorded the instances where the winning algorithm fell short. We used that latter set of cases to further train the winning algorithm on similar images and shore up its accuracy.


Second, we noticed an interesting phenomenon among our users’ search queries. One or two users used the feature in a way we did not expect: they might type a keyword search for “mountain” but upload a source image containing a vibrant color palette. The results were images of mountains rendered only in those bright, vibrant hues. We found this phenomenon fascinating: Adobe Stock had long provided a filter that allowed users to select a particular color they’d like to see in their images, but this was the first time that users could effectively upload a “color palette” to our site and find images that matched that palette or mood. Seeing this helped us understand how we wanted to showcase the feature and think creatively about the future of our visual search offering. You can view an example of this use case in the video below (I recommend using the highest video quality setting):
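While the video shows the experience itself, the underlying idea is easy to sketch: an image's overall color distribution can be summarized as a histogram, and two histograms can be compared directly. Below is a minimal, hypothetical illustration using histogram intersection; it is not Adobe Stock's actual color model:

```python
import numpy as np

def color_histogram(image: np.ndarray, bins: int = 4) -> np.ndarray:
    """Normalized 3D RGB histogram for an H x W x 3 uint8 image."""
    hist, _ = np.histogramdd(image.reshape(-1, 3), bins=bins, range=[(0, 256)] * 3)
    return hist.ravel() / hist.sum()

def palette_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Histogram intersection: 1.0 means identical color distributions."""
    return float(np.minimum(color_histogram(a), color_histogram(b)).sum())

# Toy example with random "images" standing in for real photos.
rng = np.random.default_rng(0)
img_a = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
img_b = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(palette_similarity(img_a, img_b))
```

A representation like this ignores *where* colors appear, which is exactly why a "mountain" keyword plus a palette image could combine so cleanly: the keyword constrains content while the color signal constrains mood.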

 

04. Engineering Handoff & Designs

Going Off the Grid

As we worked on the user validation of the feature and finalized our algorithm, I simultaneously tackled my next big hurdle: ensuring our front end development team could implement the designs that the search designer before me had proposed.

We hit a snag almost immediately: the proposed design would break the complicated grid structure that organized the assets on our search results page. Since we were working on a tight timeline to launch the feature at our big Adobe MAX conference, we knew the design would have to be adjusted to launch in time. My biggest priority was making sure the design stayed easy to use and understand, but I couldn’t ignore the reality of our situation: we didn’t have enough time to user test an entirely new design if we were going to make our MAX deadline.

The design compromise we needed to make was a tough one: we could keep the original design we wanted, but we would need to move the Visual Search Control element outside of the search results grid. This meant losing vertical space above the fold. It was one of the hardest compromises our team was asked to make for the MAX timeline, and one we had to vet through both our product team and our users. You can compare the intended design against the adjusted one below:

In the end, we made the compromise: I looked at the average height of our images to make sure the element took up the least vertical space possible while still retaining its user-friendly mechanics. I worked closely with our lead front-end developer to make sure the design covered all of the cases for images and videos with varying sizes, aspect ratios, and locales.

See examples of the final design below:

 

05. A/B Test & Customer Interviews

It's Alive!

In September of 2018, we launched our new Visual Search Controls feature as an A/B test visible to 50% of our user base. We did this pre-launch for two reasons: first, we wanted the feature live before the big MAX reveal so that our engineering team had time to catch and fix any bugs on the live site. Second, we used the opportunity to bring the now-live feature to several of our Enterprise customer groups to get feedback for future iterations.
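For context on the 50% split: rollouts like this are commonly done by bucketing each user deterministically, so a given user always sees the same variant across sessions. Here is a minimal sketch of that hash-based approach; the function name and salt string are hypothetical, not our production code:

```python
import hashlib

def in_test_group(user_id: str, rollout_percent: int = 50) -> bool:
    """Deterministic bucketing: the same user always lands in the same variant."""
    digest = hashlib.sha256(f"visual-search-controls:{user_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < rollout_percent

print(in_test_group("user-12345"))  # stable across sessions and page loads
```

Stability matters for a test like this one: a user who flips between the old and new search experience would pollute both the conversion metrics and the qualitative feedback.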

I traveled to New York with the Product Manager and Engineering Manager for search to introduce customers to the feature, watch them use it in their real workflows, and ask questions as we observed. Many of these customers continued to use the feature with their teams after the interviews and sent more feedback as they experimented.

The feedback we received from these customers was very positive, and all of our hard work felt amply rewarded. Customers were able to understand the mechanics of the feature and find ways to relate it back to their daily projects — not only did users think that the feature worked well, but they seemed to really enjoy continuing to play and experiment after the interview.

To comply with our legal agreements with our enterprise customers, we cannot showcase actual audio clips from our September interview sessions on a public platform. Please contact me if you’d like to discuss their feedback in more specificity.

As an art director, sometimes a designer will come to me with an image that’s almost the right idea, but something’s still off. A tool like this will help them get to the right solution faster.
— Art Director, Adobe Enterprise Client
 

06. Demo & Afterthoughts

What's Up Next?

At the time of writing, Visual Search Controls is still running on the site as an A/B test, and it will be available to all users at MAX 2018 in mid-October. The first iteration of this feature has been met with high praise throughout the Adobe Stock team and beyond; we’re glad to showcase another example of how Adobe Stock search is pushing the boundaries of machine learning features, both in the stock photography space and in the larger context of visual search giants like Google, Pinterest, and Bing.

Looking back on this project, our team and I faced challenges that we overcame to the best of our ability. Two of my biggest constraints were:

  • Needing to own and execute a design that I didn’t create. This was an unconventional design project for me, as I needed to execute the details and vision of another designer after he left the team. When I picked up the project, there were still hard questions to answer: did the results produced by the feature’s algorithm match users’ expectations? Could the design be built by engineering? How could it scale for future iterations of the feature? In taking on this project with a tight timeline, I tried not to focus on how I would have designed the feature and instead did my best to execute a solution that was intuitive and made our users feel successful.

  • Not being able to execute the original design vision fully due to engineering constraints. When we realized our timeline couldn’t accommodate the work it would take to adjust our image results grid, we worked together to make the design easier to implement. We did this in a way that kept us on schedule, preserved the original intent of the design that had been validated with our users, and minimized the impact to the above-the-fold space on our page.

Looking forward, our team has big plans for the visual search feature, many of which can’t be discussed publicly. As the designer on search, I am glad that I get to pivot my thought process on this feature to a more holistic approach. With the V1 of this feature, I was most concerned with executing on a teammate’s vision while ensuring it was user friendly and fit within our larger timelines. Now, I get to think about how we can expand our visual search feature to help our users home in on their image results with even more specificity and ingenuity.

Below is a video featuring the power of the new Visual Search Controls feature (I recommend using the highest video quality setting). This feature is also live on our site for 50% of users to experiment with prior to MAX 2018.