
Over a month ago, Tesla held its second annual AI Day (you’ll be forgiven if you’ve already forgotten, since Tesla’s CEO is making much larger headlines with his latest business venture). While critics panned the event as overblown, it was a great example of how artificial intelligence (AI) is marketed (hyping customer-facing AI) versus how it is actually used (behind the scenes).
According to research co-authored by Matthew Schneider, PhD, an associate professor in Drexel University’s LeBow College of Business, the benefits of AI for companies – specifically in retailing – may not be as pronounced as the popular press has suggested, but AI can still be a valuable tool for retailers, particularly in non-customer-facing applications.
Published in the Journal of Retailing in 2021, the study drew on past research and interviews with senior retail managers to examine how retailers should adopt AI and which factors they should weigh when doing so. The research team also compared customer-facing and non-customer-facing AI applications.
For the study, researchers defined “AI” as “a system’s ability to interpret external data correctly, to learn from such data and to use those learnings to achieve specific goals and tasks through flexible adaptation.”
The customer-facing AI the authors use as an example is AI-powered “nudge bots” that interact with, influence and make suggestions to online shoppers. This sort of automated guidance, called customer journey management, matters to retailers because it pushes customers to complete purchases and has been shown to increase overall sales. But risks such as privacy violations and bias are more likely to be noticed by customers in customer-facing applications and can damage customers’ perceptions of the retail brand. The authors therefore predict that retailers are more likely to adopt non-customer-facing AI first.
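The paper describes nudge bots conceptually rather than prescribing an implementation, but a minimal rule-based sketch suggests how such a system might decide when to prompt a shopper. Everything here (the session fields, the idle-time thresholds and the message text) is an illustrative assumption, not the study’s method:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Illustrative sketch only: the study describes nudge bots conceptually.
# All fields, thresholds and messages below are assumptions.

@dataclass
class CartSession:
    items_in_cart: int
    last_activity: datetime
    viewed_checkout: bool

def choose_nudge(session: CartSession, now: datetime) -> Optional[str]:
    """Return a nudge message for a stalled shopping session, or None."""
    idle = now - session.last_activity
    # Shopper reached checkout but stalled: remind them to finish.
    if session.viewed_checkout and idle > timedelta(minutes=10):
        return "Your cart is ready -- complete checkout in one click."
    # Items sitting in the cart but no checkout yet: a gentler prompt.
    if session.items_in_cart > 0 and idle > timedelta(minutes=30):
        return "Still thinking it over? Your items are saved in your cart."
    return None  # No nudge: avoid pestering active shoppers.
```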
“Our research appears prescient. SoftBank discontinued ‘Pepper,’ a customer-facing robot. But ‘Whiz’ – non-customer-facing – is doing okay,” said Abhijit Guha, PhD, an associate professor at the University of South Carolina and the study’s lead author, who co-wrote the paper with Schneider.
In the publication, the examples of non-customer-facing applications include using AI to assist customer service reps in responding to requests, to optimize supply chains and to analyze very large amounts of data.
Using AI does not come without risks. Schneider points to two areas of concern when adopting customer-facing AI applications: privacy and bias. Retailers are less likely to adopt in-store customer-facing AI applications because in-store customers interact directly with the technology, making any privacy or bias problems immediately visible to them.
“Retailers will use video cameras that detect emotions in stores, called facial analysis,” said Schneider. “Those cameras usually save all the data to a server. Then AI analyzes it and can detect whether somebody was pleased or not so retailers can use this to make different decisions within their store, like which products to put on display.”
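The article sketches this pipeline only at a high level (camera, server, analysis, store-level decision). For readers curious what a minimal version looks like, the snippet below pairs OpenCV’s stock face detector with a placeholder emotion classifier; the classifier is hypothetical, standing in for a trained model, and nothing here reflects any specific retailer’s system:

```python
import cv2  # OpenCV for video capture and face detection

# Minimal sketch of the pipeline Schneider describes (camera -> detection ->
# emotion score -> store-level decision). The emotion classifier below is a
# hypothetical placeholder; real systems use a trained model.

def classify_emotion(face_pixels) -> str:
    """Hypothetical stand-in for a trained emotion model."""
    return "pleased"  # a real model would predict from the face crop

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

counts = {"pleased": 0, "not_pleased": 0}
capture = cv2.VideoCapture(0)  # in-store camera feed
ok, frame = capture.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        label = classify_emotion(gray[y:y + h, x:x + w])
        counts[label] = counts.get(label, 0) + 1
capture.release()

# Aggregated counts -- not identities -- would drive decisions such as
# which products to put on display.
print(counts)
```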
But there is a risk: customers don’t know what happens next with the saved data, which conflicts with the retailer’s promise of privacy to its customers.
“The initial purpose for collecting data may be very different from the second, third or last purpose, particularly if AI measures thousands of patterns,” said Schneider. “All they need to do is switch a button and say, ‘now, use the detected emotion to see if people are more likely to steal from the store.’”
According to Schneider, bias is closely related to privacy in retail uses of AI. He explains that if retailers know a customer’s age, race and other demographic features, they can probably figure out who that person is.
“With privacy, what you do is synthetically change these features about [the customers] so they can’t be identified. So, now you can’t really make a decision based upon their age or race, unless there are other features that are also correlated to these demographics,” said Schneider. “Future research should continue along these lines to identify different ways to increase privacy, and enable retailers to draw insights with less bias.”
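The study does not prescribe a specific anonymization method, but randomized response is one standard way to “synthetically change” a demographic feature: each value is kept or randomly replaced, so no single record can be trusted, while the overall distribution can still be estimated. A minimal sketch, with made-up age bands and an assumed keep probability:

```python
import random

# Sketch of the idea Schneider describes: perturb demographic fields so
# individuals can't be re-identified while aggregate patterns stay roughly
# intact. Randomized response is one standard technique; the study does not
# prescribe a specific method, so this is illustrative.

AGE_BANDS = ["<25", "25-44", "45-64", "65+"]

def randomize(value: str, categories: list, keep_prob: float = 0.75) -> str:
    """With probability keep_prob keep the true value; otherwise replace it
    with a uniformly random category. Any single record is now deniable."""
    if random.random() < keep_prob:
        return value
    return random.choice(categories)

customers = [{"age_band": "25-44"}, {"age_band": "65+"}, {"age_band": "<25"}]
synthetic = [
    {"age_band": randomize(c["age_band"], AGE_BANDS)} for c in customers
]
# The retailer analyzes `synthetic`: because any value may have been flipped,
# no record ties back to a specific shopper with certainty, yet the overall
# age distribution can still be estimated (with known noise).
```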
For customer-facing AI applications, there is the ethical concern that bias in the system could lead companies to take discriminatory actions. The researchers found that concerns about this problem may reduce the likelihood that companies adopt AI. For example, Walmart was called out for locking up beauty products marketed to Black customers and was accused of racial discrimination. “While Walmart has since ended this practice, such outcomes are still possible at other retailers if AI is used to recommend products to lock up,” Schneider said.
Another example of risky customer-facing AI adoption is in-store robots – like Giant’s Marty.
“These robots are going around counting soup cans for restocking inventory or looking for spills so people don’t slip. But customers are left guessing whether it’s monitoring them (privacy) or following them if they’re going to steal (bias),” said Schneider.
He further explained that these perceptions can hurt a business: “Even though the AI robot might be saving some broken bones and lawsuits by cleaning up a spill or restocking shelves quicker than humans could, there could be a net loss in business due to decreased customer trust in the retailer.”
While there are concerns about data privacy, bias and ethics when a company considers adopting AI systems, Schneider and his co-authors believe there is real value in using AI applications. Despite this caution, the authors are optimistic about AI’s impact on retailing and believe “retailers that can suitably harness the power of AI will thrive.”
For future solutions, Schneider said, retailers want usable data without privacy and bias issues. “The questions are not about whether AI should be adopted by retailers, but about how AI should be adopted, and who should oversee the ill effects of AI.”
“We can’t expect the people that created the problems to fix the problems,” said Schneider. “We need people who are primarily versed in the philosophical and statistical aspects of privacy and bias, while not intending to destroy the business value of the data.”
Media interested in speaking with Schneider should contact Annie Korp, Assistant Director, News and Media Relations, at 215-571-4244 or amk522@drexel.edu.