
Why Companies-And DARPA-Are Using AI To Predict Human Emotion | Forbes

The Pentagon’s research arm has pumped $1 million into a contract to build an AI tool meant to decode and predict the emotions of allies and enemies. It even wants the AI app to advise generals on major military decisions. DARPA’s backing is the starting pistol for a race among the government and startups to use AI to predict emotions, but the science behind it is deeply controversial. Some say it’s entirely unproven, making military applications that much riskier.

The previously unreported work is being carried out under a DARPA project dubbed PRIDE, short for the Prediction and Recognition of Intent, Decision and Emotion. The aim is to create an AI that can understand and predict the reactions of a group, rather than an individual, and then offer guidance on what to do next. Think of a military leader who wants to know how a political faction or a whole country would react should he or she take an aggressive action against its leader. “In PRIDE, the emotion detection is not for an individual. It’s more as a collective group and even at a national level,” says Dr. Kalyan Gupta, president and founder of Knexus. “To think about, you know, whether a nation state is either angry or agitated.” And it’s no small-fry initiative; the plan is for PRIDE to provide recommendations for “international courses of action,” according to a contract description.

While DARPA’s project is largely looking at sentiment elicited from text and information posted online, a handful of startups, from the U.K. to Silicon Valley, claim they can understand both what people are feeling and how they will feel in the future by looking at their faces.

In the Farringdon, London, offices of Element Human, 36-year-old founder Matt Celuszak grandly claims such emotion detection is about to cause a “shift change in how people live their lives and where humanity is evolving.” His company works with clients to hone the quality of their video ads by showing them to a small audience and having algorithms look for signs of emotion, whether that’s mild amusement or abject terror. It has been operating largely in stealth until now, though it has been testing its tech with several major publishers, from CNN and Time Inc. to the BBC.

Read the full profile on Forbes: https://www.forbes.com/sites/thomasbrewster/2020/07/15/the-pentagons-1-million-question-can-ai-predict-an-enemys-emotions/#4dc0610832b4

Subscribe to FORBES: https://www.youtube.com/user/Forbes?sub_confirmation=1

Stay Connected
Forbes newsletters: https://newsletters.editorial.forbes.com
Forbes on Facebook: http://fb.com/forbes
Forbes Video on Twitter: http://www.twitter.com/forbes
Forbes Video on Instagram: http://instagram.com/forbes
More From Forbes: http://forbes.com

Forbes covers the intersection of entrepreneurship, wealth, technology, business and lifestyle with a focus on people and success.
