Powered by Azure Cognitive Services, ARKit 2, Firebase, and React
An AR eye-tracking and face-analysis platform that lets e-commerce companies track users' focal points on screen and predict their demographics and emotions while they shop online. Analysis data is stored and visualized dynamically. Built as a proof of concept at BizHacks.
First Place and Deloitte and Best Buy Prize winner out of 400+ participants.
Visual.Eyes.Demo.mp4
Click the image above to watch a demo.
The iOS app tracks the user's eyes, calculates the focal area, and displays it on screen. It also takes a facial snapshot every 2 seconds and sends it to the Azure Face API from Cognitive Services to predict the user's age, gender, and emotion over a set period. All user data generated by the app is uploaded to Firebase in real time, grouped by unique username and upload time.
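The Face API step above boils down to one HTTP request per snapshot plus a small reduction over the returned attributes. Below is a minimal JavaScript sketch of that request and of picking the dominant emotion; the helper names, region, and key are assumptions for illustration, not taken from the app's source.

```javascript
// Build the Azure Face API "detect" request for one facial snapshot.
// Region and API key are placeholders; the endpoint shape follows the
// public Face v1.0 detect API.
function buildFaceDetectRequest(imageBytes, { region = "westus", apiKey = "YOUR_KEY" } = {}) {
  const params = new URLSearchParams({
    returnFaceAttributes: "age,gender,emotion",
  });
  return {
    url: `https://${region}.api.cognitive.microsoft.com/face/v1.0/detect?${params}`,
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": apiKey,
      "Content-Type": "application/octet-stream",
    },
    body: imageBytes,
  };
}

// Reduce one face's attribute scores to its highest-scoring emotion.
function dominantEmotion(faceAttributes) {
  const entries = Object.entries(faceAttributes.emotion);
  entries.sort((a, b) => b[1] - a[1]);
  return entries[0][0];
}
```

In the app itself this call is made from Swift, but the request fields and the attribute payload are the same either way.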
In addition, a client website demo serves as a data visualization tool, displaying accumulated user emotions as a pie chart. It is built with React and JavaScript; it reads data from Firebase and renders it as graphs with Plotly.
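The pie chart amounts to counting how often each emotion appears in the records pulled from Firebase and handing Plotly a labels/values pair. A small sketch of that aggregation step, assuming each record carries an `emotion` field (the record shape is a guess, not taken from the project):

```javascript
// Fold per-snapshot emotion records into the trace object a Plotly
// pie chart expects: one label per distinct emotion, one count each.
function emotionsToPieData(records) {
  const counts = {};
  for (const { emotion } of records) {
    counts[emotion] = (counts[emotion] || 0) + 1;
  }
  return {
    labels: Object.keys(counts),
    values: Object.values(counts),
    type: "pie",
  };
}
```

On the site, the resulting trace would then be passed to something like `Plotly.newPlot("chart", [trace])` after the Firebase read resolves.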