Technology/Services

Are AI Cameras at Convenience Stores Moral?

‘AI ethics is a big can of worms,’ and ‘the industry has a responsibility to educate the public on how video data is being used,’ according to retailers and leaders in the technology space

Artificial intelligence (AI) is an increasingly common topic of conversation among convenience-store retailers. As the technology develops, retailers have new opportunities to stitch together a personalization journey that flows seamlessly as a customer walks through a c-store.

When a convenience-store chain offers a loyalty program, that program’s effectiveness depends on how well the chain knows its customers. Personalization, in other words, uses a customer’s history and tendencies to offer deals that make sense to that particular individual.

But if a convenience store is investing in technology, particularly cameras, that essentially tracks customers from the time they arrive on the lot, is that an invasion of privacy? How far can a convenience-store chain go to incentivize consumer purchases through artificial intelligence?

C-store retailers and leaders from technology companies met at the Dover Fueling Solutions conference in Austin, Texas, last week to discuss such controversies.


Gray Taylor (right), executive director at Conexxus, Alexandria, Virginia, moderated a panel discussion on AI where panelists talked about ethics. Panelists included Sridhar Sudarsan (left), chief technology officer at SparkCognition, based in Austin; Nate Wootten (second from left), partner and vice president of commercial strategy at WillowTree, Charlottesville, Virginia; Juichia Che (third from left), principal cloud solution architect at Microsoft, Redmond, Washington; and Poornima Bethmangalkar (fourth from left), industry group head of industrial and manufacturing at Happiest Minds Technologies, Bangalore, India.

One conversation stemmed from AI company SparkCognition’s technology, which can detect repeat visitors through face detection. It can be used to identify two types of consumers: loyal customers and people a retailer doesn’t want in the store because they have a history of stealing.

“In both cases, as a business owner, you want to know [who your customers are],” said Sudarsan. “But you’re not going to spend hours having people flag these videos, so that’s a problem. That’s cost prohibitive and not doable.”

Instead of performing this task manually, AI can detect these categories of consumers, but storing that data in a database is a privacy concern, he said.

His advice: use AI, but don’t store the data. The system, he said, is not picking out individual faces.

“When you look at that [data], you will not know it’s a face, let alone the person’s face [or] which person’s face,” Sudarsan said. “Being able to apply innovations like this really helps solve the business problem that you’re trying to solve, as well as prevent yourself from being vulnerable to security or privacy concerns that are out there.”
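The pattern Sudarsan describes, matching repeat visitors against stored features that cannot be reversed into a recognizable face, can be sketched roughly as below. This is an illustrative assumption about how such a system might work, not SparkCognition’s actual implementation; the function names and the 0.6 similarity threshold are invented for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_visitor(embedding, gallery, threshold=0.6):
    """Return the index of a matching stored vector, or None.

    Only opaque feature vectors are stored -- no images, names or
    other PII -- so looking at the database, as Sudarsan puts it,
    "you will not know it's a face."
    """
    for i, stored in enumerate(gallery):
        if cosine(embedding, stored) >= threshold:
            return i
    return None
```

In this kind of design, the raw frame is discarded as soon as the vector is computed, which is what keeps the stored data from being a privacy liability.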

Another set of panelists dove into the morality of AI during a session on frictionless customer experience.


Steve Van Vlack (right), director of business development at Dover Fueling Solutions (DFS), led a panel and spoke with Jarrett Nasca (second from right), chief marketing officer at Grubbrr, a self-ordering systems and digital kiosk software company based in Boca Raton, Florida; Scott Langdoc (left), global head of convenience and energy retail at Amazon Web Services, which offers a cloud-computing platform and is based in Seattle; and Toby Awalt (second from left), vice president of marketing at AI touchless self-checkout company Mashgin, based in Palo Alto, California.

“Because we can do all this cool stuff, should we?” Van Vlack from DFS asked the panelists.

The industry has a responsibility to educate the public on how video data is being used, Langdoc said.

“We like to say the words anonymous and aggregated,” he said. “We’re not talking about personal video identification of unique people. We’re talking about the collective information when we’re watching somebody shop and do a just-walk-out transaction. It’s not about that person. It’s about the activity and actions that person is taking from the time they enter to the time they leave.”

Privacy is a fundamental concern when that kind of technology is being used to improve the shopping experience, Langdoc said.

Beyond the protection of not storing the data, the idea of analyzing customers via video is not far removed from what already exists, Awalt said.

“A lot of these providers are tapping into existing cameras in stores,” he said. “You’re not rolling in and saying ‘we’ve got extra things watching you now’; it’s the same security cameras that they once were.”

The existing security cameras are now being enabled by a smarter technology to record data in a way that’s safe from personally identifiable information, Awalt said.

“The video is staying local, and it’s only the tallies of actions that [retailers are] interested in that are being sent up to the cloud as aggregate reports,” he said. “[It’s] the same amount of footage that you normally have of a human on your security camera, we’re just abstracting value from that footage and not storing it anywhere.”
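The edge-aggregation pattern Awalt describes, with footage analyzed on-site and only anonymous tallies uploaded, might look something like the sketch below. The event labels and report shape are illustrative assumptions, not Mashgin’s actual data format.

```python
from collections import Counter

def summarize_events(events):
    """Collapse a stream of per-shopper action labels into an
    aggregate report.

    `events` is a list of action labels emitted by an on-site
    analyzer (e.g. "enter", "pick_item", "checkout"). No frames,
    faces or identities appear in the output -- only this small
    dictionary of tallies would be sent to the cloud.
    """
    return dict(Counter(events))

# The video itself never leaves the store; only the tallies do.
report = summarize_events(["enter", "pick_item", "pick_item", "checkout"])
```

The design choice here is that aggregation happens before anything crosses the network, so the cloud only ever sees counts of actions, never footage.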

Another use case for AI cameras that might have retailers questioning morality is license-plate detection.

When a vehicle enters the convenience-store lot, an AI camera can recognize the car’s license plate, evaluate how dirty the car is and offer a coupon for the car wash, Langdoc said. Through another camera, the system also knows whether the car wash is open.

“Imagine the ability with all those additional measurement points how you can create a practical, personalized offer,” Langdoc said. “You have to be very careful; there’s an opt-in element to it.”
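The opt-in element Langdoc emphasizes amounts to a gate in front of any personalized offer: a plate read only triggers a coupon if the driver has enrolled. The sketch below is a hypothetical illustration; the lookup set and offer text are invented for the example.

```python
def offer_for_plate(plate, opted_in_plates, car_wash_open, car_is_dirty):
    """Return a personalized offer for an enrolled driver, or None.

    Drivers who have not opted in get no tracking-based offer,
    regardless of what the cameras can see.
    """
    if plate not in opted_in_plates:
        return None  # no enrollment, no personalization
    if car_wash_open and car_is_dirty:
        return "car wash coupon"
    return None

# An enrolled plate with a dirty car and an open wash gets the coupon;
# an unenrolled plate gets nothing.
```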


No matter the use case, personal or industrial, AI ethics and policy implementation face a long road ahead. These frameworks are currently being blueprinted, and there’s an overarching set of principles that should guide AI usage, said Che of Microsoft.

“AI ethics is a big can of worms, and actually, an endless can,” she said. “We all know we shouldn’t use AI for lawless behaviors. After blueprinting, we have researchers, analysts and policy makers look into what’s currently out there in AI innovation, in terms of the models and how we use the data, and how we can revise the blueprint to better fit what’s out there.”

The European Union approved the Artificial Intelligence Act in March to restrict high-risk uses of the technology.

“It’s an iterative process where we have to be able to come up with policies, review them and evaluate them,” Che said. “It’s a process that everybody should be held accountable for, and everybody should contribute to, because as users of AI, we’re also responsible for how AI is used and how the policies are shaped.”

