Tuesday, October 26, 2021

Cashier-less technology could detect shoplifting, but prejudice abounds

As the pandemic continues to rage around the world, it is becoming clear that COVID-19 will last longer than some health experts initially predicted. Partly due to the slow rollout of vaccines, the rapid spread of new strains, and politically charged rhetoric around social distancing, the novel coronavirus is likely to become endemic, requiring changes in the way we live our lives.

Some of these changes can occur in physical retail stores, where tactile surfaces such as counters, cash, credit cards, and bags are potential vectors for the virus to spread. The pandemic appears to have renewed interest in cashier-less technology like Amazon Go, Amazon’s chain of stores that allows shoppers to collect and purchase items without interacting with a store clerk. Indeed, Walmart, 7-Eleven, and cashier-less startups including AiFi, Standard, and Grabango have expanded their presence over the past year.

But as cashier-less technology becomes standard, it risks being used for purposes other than payment, especially shoplifting detection. While shoplifting detection may not seem problematic at first glance, case studies show it to be susceptible to bias and other flaws that could, at worst, lead to false positives.

Synthetic data sets

Most cashier-less platforms rely on cameras, among other sensors, to monitor the individual behaviors of customers as they shop. Video footage from the cameras feeds into machine learning classification algorithms, which identify, for example, when a shopper picks up an item and places it in a shopping cart. In a session at Amazon’s re:MARS conference in 2019, Dilip Kumar, vice president of Amazon Go, explained that Amazon engineers use errors such as missed item detections to train the machine learning models that power the cashier-less experience of its Go stores. Synthetic datasets increase the diversity of training data and, apparently, the robustness of the models, which use both geometry and deep learning to ensure that transactions are associated with the correct customer.
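Amazon has not published the internals of this pipeline, so the sketch below is purely illustrative: it assumes a toy 3D convolutional classifier (one common choice for short video clips) and invented action labels, just to show the shape of the “video in, shopper action out” step described above.

```python
# Illustrative sketch only: Amazon has not disclosed its Go architecture, so
# the model, labels, and tensor shapes below are assumptions for demonstration.
import torch
import torch.nn as nn

# Hypothetical action labels for a shopper-behavior classifier.
ACTIONS = ["pick_up_item", "put_back_item", "no_action"]

class ClipClassifier(nn.Module):
    """Tiny 3D-CNN mapping a short video clip to one of the actions above."""
    def __init__(self, num_actions: int = len(ACTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),  # (B, 3, T, H, W) -> (B, 16, T, H, W)
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                     # pool over time and space -> (B, 16, 1, 1, 1)
        )
        self.head = nn.Linear(16, num_actions)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(clip).flatten(1))

# A random batch: 4 clips of 8 RGB frames at 64x64 resolution. A production
# system would use longer clips, higher resolution, and per-shopper tracking.
clips = torch.randn(4, 3, 8, 64, 64)
logits = ClipClassifier()(clips)
print(logits.argmax(dim=1))  # predicted action index for each clip
```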

The problem with this approach is that synthetic datasets, if poorly audited, can encode biases that machine learning models then learn to amplify. In 2015, a software engineer discovered that the image recognition algorithms deployed in Google Photos, Google’s photo storage service, labeled Black people as “gorillas.” More recently, Google’s Cloud Vision API mislabeled thermometers held by people with darker skin as guns. And countless experiments have shown that image classification models trained on ImageNet, a popular (but problematic) dataset containing photos scraped from the Internet, automatically learn human-like biases about race, gender, weight, and more.
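One safeguard the paragraph above implies is auditing a dataset’s subgroup balance before training. The sketch below is a hypothetical version of such a check; the field names and categories are assumptions, not any vendor’s real schema.

```python
# Hypothetical pre-training audit: measure how training samples are
# distributed across subgroups before fitting a model on them.
from collections import Counter

samples = [
    {"label": "pick_up_item", "group": "lighter_skin"},
    {"label": "pick_up_item", "group": "darker_skin"},
    {"label": "put_back_item", "group": "lighter_skin"},
    # ... a real audit would iterate over the full dataset
]

by_group = Counter(s["group"] for s in samples)
total = sum(by_group.values())
for group, count in by_group.items():
    print(f"{group}: {count} samples ({count / total:.1%})")
    # A group far below its real-world share is a red flag: a model trained
    # on this data may behave worse, or simply differently, for that group.
```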

Jerome Williams, professor and senior administrator at Rutgers University’s Newark campus, told NBC that a theft detection algorithm could end up unfairly targeting people of color, who are arrested on suspicion of shoplifting more often than white shoppers. A 2006 study of toy stores found that not only were middle-class white women often given preferential treatment, but the police were never called on them, even when their behavior was aggressive. And in a recent survey of Black shoppers published in the Journal of Consumer Culture, 80% of those surveyed said they had faced racial stigma and stereotypes while shopping.
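A standard way to quantify the kind of disparity Williams describes is to compare false positive rates across groups, that is, how often genuinely innocent shoppers in each group get flagged. The numbers below are invented purely to show the computation.

```python
# Toy fairness check: per-group false positive rate of a hypothetical
# shoplifting detector. All data here is made up for illustration.
def false_positive_rate(flagged, shoplifted):
    """FPR = innocent shoppers flagged / all innocent shoppers."""
    false_pos = sum(f and not s for f, s in zip(flagged, shoplifted))
    innocent = sum(not s for s in shoplifted)
    return false_pos / innocent if innocent else 0.0

# flagged: detector raised an alert; shoplifted: ground truth.
groups = {
    "group_a": ([True, False, True, False], [False, False, True, False]),
    "group_b": ([True, True, False, False], [False, False, False, False]),
}
for name, (flagged, shoplifted) in groups.items():
    print(f"{name}: FPR = {false_positive_rate(flagged, shoplifted):.2f}")
# A persistent gap between groups means innocent shoppers in one group are
# flagged disproportionately often, which is the failure mode critics warn about.
```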

“The people who get caught for shoplifting are not an indication of who’s shoplifting,” Williams told NBC. In other words, Black shoppers who feel scrutinized in stores may be more likely to appear nervous while shopping, which …
