
Tuesday, September 24, 2019

Amazon adds tool to aid visually impaired customers in the kitchen


Dive Brief:

  • Amazon's Alexa can now identify household pantry items for visually impaired customers through a feature called "Show and Tell," which uses the camera on their Echo Show devices, the company wrote in a blog post published Monday.
  • To use the feature, customers hold an item in front of their Echo Show and ask, "Alexa, what am I holding?" Alexa then identifies the item using computer vision and machine learning (an illustrative image-classification sketch follows this list).
  • The Show and Tell feature is now available to U.S. customers who have a first- or second-generation Echo Show device.
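
Amazon hasn't published details of the model behind Show and Tell, so the snippet below is only a rough sketch of the general "hold up an item, get a label" step, built on an off-the-shelf ImageNet classifier from torchvision rather than anything Amazon ships; the image path and response wording are placeholders.

```python
# Illustrative only: classify a photo of a held-up item with a pretrained
# MobileNetV2 from torchvision. This is NOT Amazon's Show and Tell model,
# just a sketch of the kind of image-classification step the feature implies.
import torch
from PIL import Image
from torchvision import models

weights = models.MobileNet_V2_Weights.IMAGENET1K_V1
model = models.mobilenet_v2(weights=weights)
model.eval()
preprocess = weights.transforms()  # resize, crop, normalize as the weights expect

def what_am_i_holding(image_path: str) -> str:
    """Return a spoken-style guess for the object in the photo."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)          # shape: (1, 3, 224, 224)
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)
    conf, idx = probs.max(dim=1)
    label = weights.meta["categories"][idx.item()]  # human-readable class name
    return f"It looks like {label} (confidence {conf.item():.0%})."

# Example usage with a placeholder image path:
# print(what_am_i_holding("pantry_item.jpg"))
```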

Dive Insight:

Amazon said in the blog post that the concept arose from feedback the company received from blind and low vision customers. From there, the company collaborated with the Vista Center for the Blind and Visually Impaired in Santa Cruz, California, and blind Amazon employees to research, develop and test the feature.

Show and Tell combines a smart speaker with computer vision, which could theoretically let users add identified items to a cart or reorder them for purchase. Amazon didn't say in the blog post whether Show and Tell will connect to voice ordering.
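
Since Amazon has not described any such hand-off, the following is purely a hypothetical illustration of what mapping a recognized label to a reorder prompt might look like; the catalog entries, SKUs and suggest_reorder function are all invented.

```python
# Hypothetical glue code only: Amazon has not announced any link between
# Show and Tell and voice ordering. This sketch just shows how a recognized
# label could, in principle, be mapped to a reorder request.
PANTRY_CATALOG = {
    # recognized label -> placeholder product ID (both values are invented)
    "granola": "SKU-GRANOLA-001",
    "tomato soup": "SKU-SOUP-042",
}

def suggest_reorder(recognized_label: str) -> str:
    """Turn a vision result into a (hypothetical) reorder prompt."""
    sku = PANTRY_CATALOG.get(recognized_label.lower())
    if sku is None:
        return f"I found {recognized_label}, but it isn't in your usual pantry list."
    return f"Would you like to reorder {recognized_label} ({sku})?"

print(suggest_reorder("granola"))
```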

According to the National Institutes of Health, the number of visually impaired or blind Americans is expected to double to more than 8 million by 2050, and another 16.4 million are expected to have trouble seeing. Vision impairment is just one example of a disability affecting a significant group of consumers who could be aided by technology.

Recent advances in computer vision have opened up the market for tools like Show and Tell. At Groceryshop in Las Vegas, AT&T showcased Aira, a service that aids visually impaired shoppers in the grocery store. By wearing Aira glasses or using the Aira app, customers can talk to professionals who use the camera feed to guide their shopping trip. Last year, Wegmans became the first retailer to enable Aira in its stores.

A similar product is Microsoft’s Seeing AI app, which is powered by machine learning and can read text from images and describe objects seen in a photo. The app isn't strictly for grocery shopping, but it can be used to support the task.  
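
Seeing AI's internals aren't public, so the snippet below only illustrates the generic read-text-from-an-image task the app performs, using the open-source Tesseract engine via pytesseract rather than Microsoft's models; the image path is a placeholder.

```python
# Illustrative only: read text from a product photo with the open-source
# Tesseract OCR engine. Microsoft's Seeing AI uses its own models; this is
# just a generic example of the text-from-image task described above.
from PIL import Image
import pytesseract  # requires the Tesseract binary to be installed

def read_label_text(image_path: str) -> str:
    """Return the text Tesseract can find in a photo of a package label."""
    image = Image.open(image_path)
    return pytesseract.image_to_string(image).strip()

# Example usage with a placeholder image path:
# print(read_label_text("cereal_box.jpg"))
```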

Voice ordering, another technology that can aid visually impaired shoppers, is still relatively new to the retail industry, accounting for around 1% of all e-commerce sales, according to eMarketer. But experts say it can be particularly useful when it comes to replenishment and repeat ordering — which, of course, caters to grocery shoppers. Peapod added an "Ask Alexa" feature last year, and in April, Walmart partnered with Google to launch voice ordering for customers with Google Assistant. The platform is supposed to get smarter with each grocery order a person makes.
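
To show what a voice replenishment flow involves at the skill level, here is a minimal sketch using the publicly documented Alexa Skills Kit SDK for Python (ask-sdk-core); the intent name, slot, and cart response are invented placeholders, and this is not Peapod's or Walmart's implementation.

```python
# Minimal sketch of a voice-reorder intent with the Alexa Skills Kit SDK for
# Python. "ReorderItemIntent", the "Item" slot, and the cart logic are
# placeholders for illustration only.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

class ReorderItemIntentHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_intent_name("ReorderItemIntent")(handler_input)

    def handle(self, handler_input):
        slots = handler_input.request_envelope.request.intent.slots or {}
        item_slot = slots.get("Item")
        item = item_slot.value if item_slot and item_slot.value else "your usual order"
        speech = f"Okay, I've added {item} to your cart."  # placeholder cart logic
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(ReorderItemIntentHandler())
lambda_handler = sb.lambda_handler()  # entry point when deployed on AWS Lambda
```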






via https://www.aiupnow.com

Jessica Dumont, Khareem Sudlow