About
How it works
We use the Microsoft Computer Vision Emotion API to perform facial and emotion recognition on a recorded lecture video. The video is analyzed frame by frame to detect trends in student emotion as the class progresses. Once the instructor streams or uploads a video, they receive visual feedback on students over the course of the class, along with reports on statistics such as attendance, where students sit in the room, and a breakdown of each emotion recognized by the API.
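As a rough sketch of the reporting step, the snippet below aggregates per-frame API results into two of the statistics mentioned above: attendance and an emotion breakdown. The `FRAMES` sample data, the function names, and the exact response shape are illustrative assumptions, loosely modeled on the Emotion API's JSON output (a bounding box plus a score per emotion for each detected face), not the project's actual code.

```python
from collections import Counter

# Hypothetical per-frame results, shaped like the Emotion API's JSON
# response: each detected face has a bounding box and per-emotion scores.
FRAMES = [
    [  # frame 0: two faces detected
        {"faceRectangle": {"left": 40, "top": 30, "width": 50, "height": 50},
         "scores": {"happiness": 0.8, "neutral": 0.15, "sadness": 0.05}},
        {"faceRectangle": {"left": 200, "top": 35, "width": 48, "height": 48},
         "scores": {"happiness": 0.1, "neutral": 0.7, "sadness": 0.2}},
    ],
    [  # frame 1: one face detected
        {"faceRectangle": {"left": 42, "top": 31, "width": 50, "height": 50},
         "scores": {"happiness": 0.2, "neutral": 0.6, "sadness": 0.2}},
    ],
]

def attendance(frames):
    """Estimate attendance as the peak face count across all frames."""
    return max(len(faces) for faces in frames)

def emotion_breakdown(frames):
    """Tally the dominant emotion of every detected face across frames."""
    counts = Counter()
    for faces in frames:
        for face in faces:
            counts[max(face["scores"], key=face["scores"].get)] += 1
    return dict(counts)

print(attendance(FRAMES))         # → 2
print(emotion_breakdown(FRAMES))  # → {'happiness': 1, 'neutral': 2}
```

Tracking each face's `faceRectangle` across frames is also what would make it possible to map emotions to seat locations in the classroom.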