How to Give Virtual Reality All the Feels – Periódico Página100 – Noticias de popayán y el Cauca

Imagine if your friendly robot was actually able to smile, smirk, or cry. That day may be closer than you think.

Virtual reality is here: People can strap on headsets and sensors and interact with CGI avatars of themselves and others in video games, military training simulations, engineering design spaces and even virtual medical labs.

But these avatars have a problem. They tend to be crude, expressionless, graphical sock puppets, lacking that human ability to express subtle facial emotion.

Guillermo Bernal, a Ph.D. student at the Massachusetts Institute of Technology, might have just figured out a solution. In 2017, Bernal launched Emotional Beasts, an effort to pair off-the-shelf virtual reality hardware with a freely available game engine to create emotive VR avatars.

For his work, MIT awarded Bernal the 2019 Harold and Arlene Schnitzer Prize in the Visual Arts.

“If you go to any state-of-the-art virtual reality platform, you’ll see avatars with faces that are static masks,” Bernal, 32, said in a statement announcing the prize. “I’d like to give them facial expressions, to show whether they are happy or surprised or even angry.”

But that’s easier said than done. For an avatar to express simmering rage or embarrassed arousal or cautious curiosity, the computer running the avatar must be able to sense the same emotions in the live human subject.

Then there are the potential secondary effects. If you teach a computer to read emotions for the purposes of, say, setting up a virtual classroom for refugees or facilitating remote mental-health diagnoses, what’s to stop some retailer from using the same tech to figure out which images and ads excite you?

What, in other words, will stop someone from using emotive VR to sell us stuff? For one thing, the law. As Bernal refines his technology, he’s also grappling with the ethical and legal implications of its possible success.

Bernal first needs to equip a VR headset with sensors that could accurately read a wide range of human emotions. He started with a Vive VR headset, which retails for around $500, and modified the overall equipment rig with additional sensors.

The Vive already has a camera for tracking the wearer’s eye movements and a microphone for registering voice commands. To that, Bernal added what he described as “bio-signal sensors.”

Dry electrodes measure skin conductance on the forehead, an indicator of how much the wearer is sweating, which in turn indicates how agitated they are. Bernal also added a heart-rate sensor on the user’s temple and tweaked the microphone to register different tones of voice.

Combined, the sensors allow the Emotional Beasts setup to “get insights into the respondents’ physical state, anxiety and stress levels (arousal), and … determine how changes in their physiological state relate to their actions and decisions,” Bernal wrote in a research paper.
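As a rough illustration of how readings like these could be blended into a single arousal estimate, here is a minimal sketch. The baselines, ceilings, and weights are hypothetical choices for the example, not values from Bernal’s system:

```python
# Hypothetical sketch: fusing three bio-signal readings into one 0..1 arousal
# score. All signal names, baselines, ceilings, and weights are illustrative,
# not taken from the Emotional Beasts project.

def normalize(value, baseline, ceiling):
    """Map a raw reading onto 0..1 relative to an assumed resting baseline."""
    if ceiling <= baseline:
        raise ValueError("ceiling must exceed baseline")
    return min(max((value - baseline) / (ceiling - baseline), 0.0), 1.0)

def arousal_score(skin_conductance_us, heart_rate_bpm, voice_pitch_hz):
    """Weighted blend of three normalized signals into one arousal estimate."""
    sc = normalize(skin_conductance_us, baseline=2.0, ceiling=20.0)   # microsiemens
    hr = normalize(heart_rate_bpm, baseline=60.0, ceiling=140.0)      # beats/min
    vp = normalize(voice_pitch_hz, baseline=100.0, ceiling=300.0)     # fundamental frequency
    return 0.5 * sc + 0.3 * hr + 0.2 * vp
```

A real system would calibrate those baselines per user and per session, since resting heart rate and skin conductance vary widely from person to person.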

The headset plugs into a cluster of microcontrollers that run custom Python algorithms and feed the readings into a VR space that Bernal created using the Unreal graphics engine.
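The sensor-to-engine hop in a pipeline like this might look like the following sketch: a loop on the microcontroller side that packages readings as timestamped JSON and streams them to the game engine. The port, packet format, and `read_sensors()` stub are assumptions for illustration, not the actual Emotional Beasts protocol:

```python
# Illustrative sketch of streaming bio-signal readings to a game engine over
# UDP. The address, payload format, and sensor stub are hypothetical.

import json
import socket
import time

ENGINE_ADDR = ("127.0.0.1", 9001)  # hypothetical port the VR scene listens on

def read_sensors():
    """Stub standing in for real electrode/heart-rate/microphone drivers."""
    return {"skin_conductance": 4.2, "heart_rate": 72, "voice_pitch": 180.0}

def make_packet(readings):
    """Timestamped JSON payload the engine can parse into avatar parameters."""
    return json.dumps({"t": time.time(), **readings}).encode("utf-8")

def stream(iterations=10, hz=30):
    """Send a burst of sensor packets at a fixed rate, then close the socket."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for _ in range(iterations):
            sock.sendto(make_packet(read_sensors()), ENGINE_ADDR)
            time.sleep(1 / hz)
    finally:
        sock.close()
```

UDP suits this kind of telemetry because a dropped packet costs only one frame of expression data; the next reading arrives a fraction of a second later anyway.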

By 2018 he had a working prototype. But modifying the hardware and writing the software is just part of the solution. Bernal also needs data. Specifically, a library of how diverse groups of people all over the world express different emotions.

The way a young, upper-middle-class white guy in America shows anger on his face isn’t necessarily the way a middle-aged, working-class Chinese woman might do so.

“Emotions are more complex and socially determined than the simple positive-negative, strong-weak arousal model suggests,” a team led by New York University’s Meredith Whittaker warned in a 2018 research paper. “Even distinguishing fear, anxiety and disgust on physiological grounds turns out to be extremely problematic.”

To accurately project a user’s emotions onto a VR avatar, the system must speak the user’s emotional language. “Data is king in everything we’re doing,” Bernal told The Daily Beast. Right now, he added, “the data is not there.”

So the next step, this summer, is to create a tough, easy-to-use version of the Emotional Beasts system that Bernal can share with testers all over the world. They would play with the new VR setup and feed the resulting data into an online repository so that Bernal can begin building his global library of emotional expressions.

There could be ethical and legal obstacles. Some jurisdictions already give consumers veto power over commercial use of their physiological data, and that could limit the scope of Bernal’s data set, to say nothing of complicating any wider roll-out of empathetic VR.

“The California Consumer Privacy Act, for instance, covers biometric information and does not allow an exception based on the idea that one’s face or other biometric information is publicly available,” Mark MacCarthy, a Georgetown University professor and privacy expert, told The Daily Beast. “The Illinois Biometric Information Privacy Act prohibits companies from gathering, using, or sharing biometric information without informed opt-in consent. So, figuring out your emotions from the way you look or walk or your heartbeat needs your permission.”

Bernal said he’s aware of the hurdles he’ll have to clear. “We need to have those dialogues,” he told The Daily Beast.
