The winner of TechCrunch’s Disrupt SF 2017 Hackathon gives us a glimpse into what shopping at physical stores will be like in a future where voice assistants break out of the home. Hundreds of engineers and designers competed to create the most interesting project they could in 24 hours. The winner, Alexa Shop Assist, imagines a future where devices like the Amazon Echo or Echo Dot are scattered throughout a store to help shoppers easily navigate the space and get answers to questions without needing to hunt down an employee.
The winning team created an Alexa skill that combines Amazon’s voice assistant with their own speaker recognition algorithm to help store shoppers. The demo involved a customer in a store asking Alexa “where can I find engine oil for my Prius,” to which Alexa responded with the aisle number. While a voice-powered store directory is impressive, the demo really shone when the shopper asked Alexa “which one should I get” while in the engine oil aisle.
The Alexa skill, using the team’s custom speaker recognition algorithm, identified the shopper as the person from earlier who asked about engine oil. Already knowing the shopper was looking for engine oil for a Prius, the application responded with the correct oil grade to buy.
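TechCrunch doesn’t describe how the team’s speaker recognition works, but the behavior above implies two pieces: matching a new utterance’s voice signature against previously heard speakers, and keeping a context record per shopper so a follow-up question like “which one should I get” can be answered. A minimal sketch of that pattern, assuming hypothetical voice embeddings (the vectors, threshold, and `ShopAssistSession` class are illustrative, not the team’s actual code):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two voice-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

class ShopAssistSession:
    """Toy per-speaker context store, keyed by voice embedding.

    Any Echo in the store can pass in the embedding of the current
    utterance; if it matches a known shopper, that shopper's earlier
    context (e.g. "engine oil for a Prius") comes back.
    """

    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.speakers = []  # list of (embedding, context-dict) pairs

    def identify(self, embedding):
        # Find the closest previously heard speaker above the threshold.
        best_context, best_sim = None, self.threshold
        for known_embedding, context in self.speakers:
            sim = cosine_similarity(embedding, known_embedding)
            if sim >= best_sim:
                best_context, best_sim = context, sim
        if best_context is None:
            # New shopper: enroll them with an empty context.
            best_context = {}
            self.speakers.append((embedding, best_context))
        return best_context

# First question, at the front of the store: enrolls the shopper.
session = ShopAssistSession()
ctx = session.identify([0.9, 0.1, 0.2])
ctx["looking_for"] = "engine oil"
ctx["vehicle"] = "Prius"

# Later, a near-identical voice heard by a different Echo in the oil aisle
# resolves to the same shopper, so the skill already knows the vehicle.
ctx_later = session.identify([0.88, 0.12, 0.19])
print(ctx_later["vehicle"])
```

Real speaker recognition would derive the embeddings from audio with a trained model; the point of the sketch is only the session logic that lets any device in the store pick up where another left off.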
Amazon is already testing a convenience store without any cashiers, so it’s easy to see Alexa being used to help customers navigate those stores, further reducing the staff needed. Easy access to a large pool of knowledge covering every product in a store would be quite beneficial to shoppers as well.