Newly released university research explores how grocers and other retailers can pair in-store cameras with advanced artificial intelligence (AI) to read facial expressions (a raised eyebrow, widened eyes, a smile) and use those cues to improve store layouts, RetailWire reports.
“Emotion-recognition algorithms work by employing computer vision techniques to locate the face and identify key landmarks on the face, such as corners of the eyebrows, tip of the nose and corners of the mouth,” Kien Nguyen of Australia’s Queensland University of Technology said in a press release.
He added, “Other behaviors like staring at a product and reading the box of a product are a gold mine for marketing to understand the interest of customers in a product.”
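As a rough illustration of the landmark-detection step Nguyen describes, here is a minimal sketch using the open-source MediaPipe Face Mesh model. The researchers do not name a specific tool, so the library choice, the input file name and the landmark indices are all assumptions for demonstration only.

```python
import cv2
import mediapipe as mp

# Hypothetical input: a single BGR frame from an in-store camera.
frame = cv2.imread("shopper_frame.jpg")

# Approximate indices into MediaPipe's face mesh for a few of the
# landmarks Nguyen mentions (eyebrow corners, nose tip, mouth corners).
LANDMARKS_OF_INTEREST = {
    "left_eyebrow_inner": 55,
    "right_eyebrow_inner": 285,
    "nose_tip": 1,
    "mouth_left_corner": 61,
    "mouth_right_corner": 291,
}

with mp.solutions.face_mesh.FaceMesh(static_image_mode=True,
                                     max_num_faces=1) as face_mesh:
    # MediaPipe expects RGB input, while OpenCV loads images as BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    h, w = frame.shape[:2]
    landmarks = results.multi_face_landmarks[0].landmark
    for name, idx in LANDMARKS_OF_INTEREST.items():
        lm = landmarks[idx]
        # Landmark coordinates are normalized to [0, 1]; scale to pixels.
        print(f"{name}: ({int(lm.x * w)}, {int(lm.y * h)})")
```

Emotion-recognition systems typically feed landmark positions like these, or the face crop itself, into a classifier that maps expression changes to categories such as happiness or surprise.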
The researchers note that while the use of facial recognition in retail settings remains controversial, mainly because of privacy concerns, the footage can be anonymized so that customers are examined only at an aggregate level.
When it comes to designing store layouts, the researchers say the conventional approach is a “passive reaction” to customer behavior, such as making decisions from sales data. But that methodology has its drawbacks, they note.
“Importantly, the conventional design process does not reflect (1) how customers actually navigate through store aisles, (2) how much time customers actually spend in each section, and (3) what visible emotion (e.g., happiness) customers exhibit in response to a product,” the researchers wrote.
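To make the aggregate-level analysis concrete, here is a minimal sketch, using made-up anonymized records rather than data from the study, of how per-visit observations of section dwell time and detected emotion could be rolled up into section-level summaries instead of being tied to individual shoppers.

```python
from collections import defaultdict

# Hypothetical anonymized observations: no shopper identity is kept,
# only the section visited, seconds spent there, and the dominant
# expression detected while in that section.
observations = [
    {"section": "bakery", "dwell_seconds": 45, "emotion": "happy"},
    {"section": "bakery", "dwell_seconds": 20, "emotion": "neutral"},
    {"section": "cereal", "dwell_seconds": 90, "emotion": "neutral"},
    {"section": "cereal", "dwell_seconds": 60, "emotion": "happy"},
    {"section": "frozen", "dwell_seconds": 15, "emotion": "neutral"},
]

dwell_totals = defaultdict(float)
visit_counts = defaultdict(int)
emotion_counts = defaultdict(lambda: defaultdict(int))

for obs in observations:
    section = obs["section"]
    dwell_totals[section] += obs["dwell_seconds"]
    visit_counts[section] += 1
    emotion_counts[section][obs["emotion"]] += 1

# Report average dwell time and the most common emotion per section.
for section in dwell_totals:
    avg_dwell = dwell_totals[section] / visit_counts[section]
    top_emotion = max(emotion_counts[section], key=emotion_counts[section].get)
    print(f"{section}: avg dwell {avg_dwell:.0f}s over "
          f"{visit_counts[section]} visits, most common emotion: {top_emotion}")
```

Summaries like these would let a retailer compare sections on how customers actually move, linger and react, rather than relying on sales data alone.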
For the full story, click here.