Three QUT researchers are part of an international research team that has identified new ways for retailers to use artificial intelligence in concert with in-store cameras to better understand consumer behaviour and tailor store layouts to maximise sales.
In research published in Artificial Intelligence Review, the team proposes an AI-powered store layout design framework that lets retailers take advantage of recent advances in AI and its sub-fields of computer vision and deep learning to monitor the physical shopping behaviours of their customers.
Any shopper who has retrieved milk from the farthest corner of a shop knows that an efficient store layout presents merchandise so as to attract customer attention to items they had not intended to buy, increase browsing time, and make related or viable alternative products easy to find by grouping them together.
A well-thought-out layout has been shown to correlate positively with increased sales and customer satisfaction. It is one of the most effective in-store marketing tactics, directly influencing customer decisions and boosting profitability.
QUT researchers Dr Kien Nguyen and Professor Clinton Fookes from the School of Electrical Engineering & Robotics and Professor Brett Martin from the QUT Business School teamed up with Dr Minh Le, from the University of Economics, Ho Chi Minh City, Vietnam, and Professor Ibrahim Cil from Sakarya University, Serdivan, Turkey, to conduct a comprehensive review of existing approaches to in-store layout design.
Dr Nguyen said improving supermarket layout design, through understanding and prediction, is a vital tactic to improve customer satisfaction and increase sales.
“Most importantly this paper proposes a comprehensive and novel framework to apply new AI techniques on top of the existing CCTV camera data to interpret and better understand customers and their behaviour in store,” Dr Nguyen said.
“CCTV offers insights into how shoppers travel through the store: the route they take and the sections where they spend more time. This research proposes drilling down further, noting that people express emotion through observable facial expressions such as raising an eyebrow, widening their eyes or smiling.”
Understanding customers’ emotions as they browse could give marketers and managers a valuable tool for gauging customer reactions to the products they sell.
“Emotion recognition algorithms work by employing computer vision techniques to locate the face and identify its key landmarks, such as the corners of the eyebrows, the tip of the nose and the corners of the mouth,” Dr Nguyen said.
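The paper surveys these techniques at a high level; as a rough illustration only, the sketch below shows how off-the-shelf tools can locate a face and its landmarks in a single frame. It assumes the open-source dlib library and its publicly available 68-point landmark model; the file names and landmark indices follow common convention rather than anything specified by the researchers.

import cv2
import dlib

# Detector and pre-trained 68-point landmark model (the model file is assumed
# to have been downloaded separately).
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

frame = cv2.imread("cctv_frame.jpg")             # hypothetical still from CCTV footage
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for face in detector(gray):                      # locate each face in the frame
    landmarks = predictor(gray, face)            # 68 facial keypoints
    # Indices follow the widely used iBUG 68-point convention.
    nose_tip = landmarks.part(30)
    mouth_left, mouth_right = landmarks.part(48), landmarks.part(54)
    brow_inner_left, brow_inner_right = landmarks.part(21), landmarks.part(22)
    # A downstream emotion classifier would typically work from the geometry
    # of these points (for example, mouth-corner lift suggesting a smile).
    print(nose_tip.x, nose_tip.y)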
“Understanding customer behaviour is the ultimate goal for business intelligence. Obvious actions like picking up products, putting products into the trolley and returning products to the shelf have attracted great interest from smart retailers.
“Other behaviours, like staring at a product or reading its packaging, are a gold mine for marketers wanting to understand customer interest in a product,” Dr Nguyen said.
Along with understanding emotions through facial cues and customer characterisation, layout managers could employ heatmap analytics, human trajectory tracking and customer action recognition to inform their decisions. This type of knowledge can be extracted directly from video and helps explain customer behaviour at the store level without any need to know individual identities.
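As a simple illustration of the aggregate, identity-free analytics described above, the sketch below accumulates tracked floor positions into a dwell-count heatmap. The grid size, store dimensions and sample positions are assumptions for illustration, not figures from the study.

import numpy as np

GRID_W, GRID_H = 40, 30                    # coarse floor-plan grid, assumed
heatmap = np.zeros((GRID_H, GRID_W))

def add_positions(heatmap, positions, store_w_m=20.0, store_h_m=15.0):
    """Accumulate (x, y) floor positions, in metres, into grid cells."""
    for x, y in positions:
        col = min(int(x / store_w_m * GRID_W), GRID_W - 1)
        row = min(int(y / store_h_m * GRID_H), GRID_H - 1)
        heatmap[row, col] += 1             # dwell counts only: no identities stored
    return heatmap

# Positions would normally stream in per frame from a person tracker;
# these values are made up for the example.
add_positions(heatmap, [(2.5, 3.1), (2.6, 3.2), (12.0, 7.4)])
busiest = np.unravel_index(heatmap.argmax(), heatmap.shape)
print("Busiest cell (row, col):", busiest)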
Professor Clinton Fookes said the team had proposed the Sense-Think-Act-Learn (STAL) framework for retailers.
“Firstly, ‘Sense’ is to collect raw data, such as video footage from a store’s CCTV cameras, for processing and analysis. Store managers routinely do this with their own eyes; however, new approaches allow us to automate this aspect of sensing and to perform it across the entire store,” Professor Fookes said.
“Secondly, ‘Think’ is to process the collected data through advanced AI, data analytics and deep learning techniques, much as humans use their brains to process incoming data.
“Thirdly, ‘Act’ is to use the knowledge and insights from the second phase to improve and optimise the supermarket layout. The process operates as a continuous learning cycle.
“An advantage of this framework is that it allows retailers to evaluate store design predictions such as traffic flow and behaviour when customers enter the store, or the popularity of displays placed in different areas of the store,” Professor Fookes said.
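To make the cycle concrete, the sketch below expresses Sense-Think-Act-Learn as a plain control loop. The function bodies are stubs and the names are placeholders; the published framework describes the phases conceptually rather than prescribing any particular code structure.

def sense():
    """Collect raw data, e.g. CCTV frames (stub)."""
    return []

def think(frames):
    """Run video analytics and deep learning models over the frames (stub)."""
    return {"hot_zones": [], "dwell_times": {}}

def act(insights):
    """Turn insights into layout changes or recommendations (stub)."""
    return ["move promotional display closer to the entrance"]

def learn(recommendations, insights):
    """Feed observed outcomes back so the next cycle improves (stub)."""
    return None

while True:                         # continuous learning cycle
    frames = sense()
    insights = think(frames)
    recommendations = act(insights)
    learn(recommendations, insights)
    break                           # single pass shown for illustration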
“Stores like Woolworths and Coles already routinely use AI-powered algorithms to better serve customer interests and wants and to provide personalised recommendations. This is particularly true at the point of sale and through loyalty programs. This is simply another example of using AI to deliver better data-driven store layouts and design, and to better understand customer behaviour in physical spaces.”
Dr Nguyen said data could be filtered and cleaned to improve quality and privacy, then transformed into a structured form. As privacy was a key concern for customers, data could be de-identified or anonymised, for example by analysing customers at an aggregate level.
“Since there is an intense flow of data from the CCTV cameras, a cloud-based system is a suitable approach for processing and storing the video data for supermarket layout analysis,” he said.
“The intelligent video analytics layer in the ‘Think’ phase plays the key role in interpreting the content of images and videos.”
Dr Nguyen said layout managers could consider store design variables (for example, space design, point-of-purchase displays, product placement and placement of cashiers), employees (for example, their number and placement) and customers (for example, crowding, visit duration, impulse purchases, use of furniture, waiting queue formation and receptivity to product displays).
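For readers who want to record these variables in a structured form for analysis, the sketch below groups them the same way as the list above. The field names, types and defaults are illustrative assumptions, not definitions from the paper.

from dataclasses import dataclass, field

@dataclass
class StoreDesign:
    space_design: str = "grid"                 # e.g. grid, free-flow, racetrack
    pop_display_count: int = 0                 # point-of-purchase displays
    product_placement: dict = field(default_factory=dict)   # product -> aisle/shelf
    cashier_locations: list = field(default_factory=list)

@dataclass
class Staffing:
    employee_count: int = 0
    placements: list = field(default_factory=list)           # e.g. ["deli", "checkout 3"]

@dataclass
class CustomerObservations:
    crowding_index: float = 0.0
    avg_visit_minutes: float = 0.0
    impulse_purchase_rate: float = 0.0
    avg_queue_length: float = 0.0
    display_receptivity: float = 0.0           # e.g. share of passers-by who stop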
Media contact:
Pat Whyte, QUT Media, 07 3138 1150, patrick.whyte@qut.edu.au
After hours: 0407 585 901, media@qut.edu.au