What you need to know
- A Microsoft Research project studied the use of AI to read people's non-verbal communications during virtual meetings.
- The study used a bot within Teams calls to identify various emotions.
- The study suggests positive results from using AI to enhance communication.
A Microsoft study used an AI tool to monitor people's expressions and non-verbal communication during video calls. The tool, called AffectiveSpotlight, uses a neural network to classify people's facial expressions. It was tested against random selection as a way to help presenters see people's reactions. The study was recently highlighted by NewScientist.
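Microsoft hasn't published AffectiveSpotlight's code, but the core idea of a neural network classifying facial expressions can be sketched in a few lines. The snippet below is a minimal illustration only: the emotion labels and the `classify_expression` helper are assumptions for the example, and the logits stand in for the output of a real face-analysis model, not the paper's actual network.

```python
import numpy as np

# Hypothetical emotion classes; the real AffectiveSpotlight labels may differ.
EMOTIONS = ["neutral", "confusion", "engagement", "smile", "head-nod"]

def softmax(logits: np.ndarray) -> np.ndarray:
    """Convert raw network outputs into a probability distribution."""
    exps = np.exp(logits - logits.max())
    return exps / exps.sum()

def classify_expression(logits: np.ndarray) -> tuple[str, float]:
    """Map one face's network logits to its most likely emotion label.

    `logits` stands in for the output of a (hypothetical) neural network
    run on a single cropped face from the video feed.
    """
    probs = softmax(logits)
    idx = int(np.argmax(probs))
    return EMOTIONS[idx], float(probs[idx])

# Example: fake logits for one audience member's face.
label, confidence = classify_expression(np.array([0.2, 2.1, 0.5, 0.3, 0.1]))
print(f"{label} ({confidence:.0%})")  # e.g. "confusion (71%)"
```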
Generally, people are good at reading non-verbal communication. It's normal to pick up on subtle, and not-so-subtle, facial expressions and other cues within conversations. That type of communication tends to suffer on video calls: not only are video feeds smaller than people appear in real life, but we're also often looking at several people at once on a grid.
Microsoft's website has a page on human-computer interaction that includes several publications. One publication, titled "AffectiveSpotlight: Facilitating the Communication of Affective Responses from Audience Members during Online Presentations," shares observations on using AI to monitor people's expressions.
Its summary states:
The ability to monitor audience reactions is critical when delivering presentations. However, current videoconferencing platforms offer limited solutions to support this. This work leverages recent advances in affect sensing to capture and facilitate communication of relevant audience signals. Using an exploratory survey (N=175), we assessed the most relevant audience responses such as confusion, engagement, and head-nods. We then implemented AffectiveSpotlight, a Microsoft Teams bot that analyzes facial responses and head gestures of audience members and dynamically spotlights the most expressive ones.
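The abstract's description of a bot that "dynamically spotlights the most expressive" audience members suggests a simple ranking step: score each participant's recently detected responses and surface the highest scorer to the presenter. Here is a rough sketch of that idea; the weights, data structures, and `pick_spotlight` function are all invented for illustration, since the paper does not publish its scoring rule.

```python
from dataclasses import dataclass

# Invented weights: how much each detected response contributes to an
# "expressiveness" score. These numbers are assumptions, not the paper's.
RESPONSE_WEIGHTS = {"confusion": 1.0, "engagement": 0.8, "head-nod": 0.6}

@dataclass
class AudienceMember:
    name: str
    # Responses detected in the last few seconds, as (label, confidence) pairs.
    recent_responses: list[tuple[str, float]]

def expressiveness(member: AudienceMember) -> float:
    """Sum weighted detection confidences over the recent window."""
    return sum(RESPONSE_WEIGHTS.get(label, 0.0) * conf
               for label, conf in member.recent_responses)

def pick_spotlight(audience: list[AudienceMember]) -> AudienceMember:
    """Choose which audience member to surface to the presenter this cycle."""
    return max(audience, key=expressiveness)

audience = [
    AudienceMember("A", [("engagement", 0.7)]),
    AudienceMember("B", [("confusion", 0.9), ("head-nod", 0.5)]),
]
print(pick_spotlight(audience).name)  # "B": confusion outweighs engagement
```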
The results of the study suggest that the AI helped presenters:
In a within-subjects study with 14 groups (N=117), we observed that the system made presenters significantly more aware of their audience, speak for a longer period of time, and self-assess the quality of their talk more similarly to the audience members, compared to two control conditions (randomly-selected spotlight and default platform UI).
During testing, AffectiveSpotlight highlighted 40 percent of participants during talks, significantly fewer than the 87 percent selected under the random-spotlight condition.
This is only a single publication, so it may be some time before we see this kind of AI integrated into everyday calls. Still, it shows the promise of using AI to enhance communication.