Intel working on AI that detects students' emotions in virtual classrooms
Intel's "emotion AI" is designed to detect how students feel, which has drawn criticism.
What you need to know
- Intel and Classroom Technologies are working on a set of artificial intelligence tools that can identify the emotions of students in virtual classrooms.
- The AI feature could inform teachers when a student is confused or bored during instruction.
- The technology has been met with pushback on moral and ethical grounds.
Intel and Classroom Technologies are working on tools that use artificial intelligence (AI) to detect the mood of children in virtual classrooms. The feature could be used to tell a teacher if a student is bored, confused, or distracted. As reported by Tom's Hardware (via Protocol), the AI tool has been met with resistance due to the ethical and moral ramifications of monitoring students and assessing their emotional states with AI.
The feature uses facial recognition, speech recognition, and other technologies to record people's expressions. AI is then used to determine how the person in question is feeling.
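At a high level, systems like this reduce rich inputs (video frames, audio) to a single predicted label. A minimal sketch of that final classification step, with fabricated scores standing in for a real model's output (the emotion list, scores, and function names here are illustrative assumptions, not Intel's implementation):

```python
import math

# Hypothetical candidate labels; a real system's taxonomy is unknown.
EMOTIONS = ["engaged", "bored", "confused", "distracted"]

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def label_student(logits):
    """Collapse the probability distribution into one word --
    the reductive step critics object to."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=lambda i: probs[i])
    return EMOTIONS[best], probs[best]

# Example: made-up scores from a single video frame.
label, confidence = label_student([0.2, 1.4, 1.1, -0.3])
print(label, round(confidence, 2))  # a nuanced state becomes one word
```

Note that even when the model is uncertain (here "bored" wins with well under half the probability mass), the teacher still sees only the single winning label, which is exactly the oversimplification described below.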
Some believe that it can be detrimental to label people with a single word or description. Humans experience a myriad of emotions, so simplifying someone's state to "happy" or "bored" may be counterproductive.
Additionally, people are nuanced, and expressions are not universal. For example, the same facial expression may mean different things for separate individuals.
The software, dubbed "emotion AI," integrates with Zoom through Class, which is a software product from Classroom Technologies. Since Zoom is used frequently in education, it would be simple to implement AI tech into many virtual classrooms.
In addition to questions surrounding the accuracy and helpfulness of the technology, "emotion AI" critics question the morality of student surveillance.
Sinem Aslan, a research scientist at Intel, stated the intention behind the tech was not surveillance. "We did not start this technology as a surveillance system. In fact, we don't want this technology to be a surveillance system."
An advocacy group called Fight for the Future called on Zoom to stop using "emotion AI" in an open letter earlier this month.
Updated April 19, 2022, at 5:15 p.m. ET: Intel has provided the following statement: "At Intel, we believe AI can help drive beneficial advancements in medicine, industry and society, and empower us with the right tools and enable a responsible, inclusive and sustainable future. Intel's adaptive learning research is rooted in social science and our multi-disciplinary research team works with students, teachers, parents and other stakeholders in education to explore how human-AI collaboration in education can help support individual learners' needs, provide more personalized experiences and improve learning outcomes. As with all research projects, Intel and its collaborators abide by strict data privacy policies and adhere to ongoing oversight."
Sean Endicott brings nearly a decade of experience covering Microsoft and Windows news to Windows Central. He joined our team in 2017 as an app reviewer and now heads up our day-to-day news coverage. If you have a news tip or an app to review, hit him up at email@example.com.
What's the difference between a teacher reading students' faces and a robot doing it? The robot does it faster, and the robot doesn't have the teacher's biases. The robot will, however, have the biases of the people who train it. While I had my fair share of great teachers, I also had many bad ones. In those cases, I'd rather have had the robot.
"The computer says your eyes were not focused on the study material for the requisite 87.5% of the one-hour class time during which eye contact is required. You have failed this course." "Your facial data is being analyzed by third-party companies, which you have indirectly consented to by proxy as part of being a student using this software." C'mon, Andrew, you're smarter than this. There are infinite ways this sort of tech could be far more destructive than a plain old bad teacher.
You watch too much sci-fi.
How would teachers feel if this "tool" were pointed at them, used by school administrators to assess their quality of instruction and interest in their subject?