The next frontier of artificial intelligence is emotional. If we’re going to take AI seriously, it needs to understand who we really are and what makes us tick. After all, how smart is artificial intelligence if it doesn’t take into account how we feel?
Emotional intelligence can be defined as “the global capacity of the individual to deal effectively with his environment.” When we translate this to AI, we allow computers to interpret our feelings and react appropriately, opening an entirely new world of possibilities that are already transforming a wide range of industries.
Rather than answering questions in a tone-deaf way, fast-evolving emotion AI improves the digital experience by introducing emotional intelligence into speech recognition technology and detecting real-time changes in mood. This has wide-reaching effects on how human thinking and behavior are tracked, and it is already at work everywhere from voice commands given to a GPS navigator to automated phone support lines.
Emotion AI is on track to disrupt nearly every industry in the next decade, most notably sports, entertainment, and retail. Let’s explore how:
The North American sports market is expected to reach $73.5 billion in revenue this year as increasingly lucrative media rights deals surpass gate revenues as the industry’s leading profit segment. As media outlets continue the push to enhance the spectator experience, AI is playing a vital role in shaping the possibilities of what that experience can be.
Smart chatbots are redefining customer service and are already assisting sports fans. At venues like Little Caesars Arena in Detroit and NRG Stadium in Houston, you can avoid the hassle of trying to find stadium staff and simply ask an AI concierge on your phone to answer questions about your seats, food vendors, restroom locations, and in-stadium entertainment options.
Chatbots are especially useful in a stadium setting because they’re available 24/7 and act as a staff member that can field thousands of inquiries at once, respond immediately, and keep track of trends. For example, if customers repeatedly ask for plant-based options, the chatbot will alert venue staff and impact concession decisions moving forward.
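The trend-tracking behavior described above can be sketched very simply: tally inquiry topics and flag any topic that crosses an alert threshold. This is a minimal illustration with hypothetical names and data, not any venue's actual system:

```python
from collections import Counter

# Hypothetical sketch: a chatbot back end tallies inquiry topics and
# flags any topic that crosses an alert threshold for venue staff.
ALERT_THRESHOLD = 3  # assumed cutoff: alert once a topic recurs this often

def trending_topics(inquiries, threshold=ALERT_THRESHOLD):
    """Return the set of topics asked about at least `threshold` times."""
    counts = Counter(inquiries)
    return {topic for topic, n in counts.items() if n >= threshold}

# Example: repeated requests for plant-based options surface as a trend.
log = ["parking", "plant-based options", "restrooms",
       "plant-based options", "seat upgrade", "plant-based options"]
print(trending_topics(log))  # {'plant-based options'}
```

In practice a real deployment would bucket free-text questions into topics with a language model first; the aggregation step, though, is as simple as this.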
Facial coding can compile emotional data that influences venue layouts and which aspects of the stadium experience fans particularly respond to off the field.
On the playing side, wearable tech is revolutionizing performance monitoring, tracking athletes’ distance, speed, heart rate, fatigue index, stress load, and recovery. Professional and international soccer, basketball, and hockey clubs are already implementing this technology, which gives coaches the tools to predict outcomes, minimize risk, and make decisions based on real data, fusing the emotional with the technological.
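The metrics listed above amount to a simple per-session record that coaching staff can apply rules to. The sketch below is illustrative only: the field names, scale of the fatigue index, and the cutoff rule are all assumptions, not any club's real methodology:

```python
from dataclasses import dataclass

# Hypothetical record of the wearable metrics the text lists.
@dataclass
class SessionMetrics:
    distance_km: float
    top_speed_kmh: float
    avg_heart_rate: int
    fatigue_index: float  # assumed scale: 0.0 (fresh) to 1.0 (exhausted)

def needs_recovery(m: SessionMetrics, fatigue_cutoff: float = 0.75) -> bool:
    """Made-up rule: flag a player whose fatigue index exceeds the cutoff."""
    return m.fatigue_index > fatigue_cutoff

session = SessionMetrics(distance_km=10.4, top_speed_kmh=31.2,
                         avg_heart_rate=158, fatigue_index=0.82)
print(needs_recovery(session))  # True
```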
Over at Wimbledon, AI analyzes match data, crowd interaction, and player expressions to determine what content is worthy of inclusion in highlight packages. This saves humans from doing the laborious work of editing while freeing up the professional to work more extensively on the creative aspect.
Integrating emotional intelligence into the arts has the power to enhance creativity rather than suppress it.
The U.S. media and entertainment industry is the largest in the world, and streaming platforms have opened up our options in navigating the vast sea of film, television, music, video games, and more. In the digital era, content truly is unlimited.
Emotion AI can ease the process of figuring out what to watch through careful algorithmic curation and content recommendation. It can also analyze how viewers respond emotionally to certain shows by processing survey feedback that capitalizes on open-ended, stream-of-consciousness responses to characters, plot lines, and special effects.
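At its simplest, processing open-ended responses means scoring each one for sentiment and averaging per show element. The keyword lexicon below is a toy stand-in for a real sentiment model, and all the data is invented for illustration:

```python
# Minimal sketch (not a production sentiment model): score open-ended
# viewer responses with a toy keyword lexicon, then average per element.
POSITIVE = {"loved", "great", "thrilling", "beautiful"}
NEGATIVE = {"boring", "confusing", "predictable", "flat"}

def sentiment_score(response: str) -> int:
    """+1 per positive keyword, -1 per negative keyword."""
    words = response.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

responses = {
    "characters": ["loved the lead and great chemistry", "a bit flat"],
    "plot": ["confusing and predictable"],
}
averages = {element: sum(map(sentiment_score, texts)) / len(texts)
            for element, texts in responses.items()}
print(averages)  # {'characters': 0.5, 'plot': -2.0}
```

A real system would use a trained language model rather than keyword matching, but the aggregation from free text to per-element scores follows the same shape.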
On the production side, deep learning algorithms can now perform many of the more mundane tasks, such as smoothing out effects or enhancing a digitally animated character to look more lifelike. They may even be able to predict whether a film will be commercially successful simply by analyzing its screenplay.
Emotion-sensing technology also enables emotion-aware games that sense a player’s facial expressions for signs of distress. The horror-adventure game Nevermind uses biofeedback to add a striking level of surrealism; if players can’t manage their stress level, for example, a room may flood or spikes may jut out from the floor until the player physically calms down and alleviates the anxiety. This can teach the player real, actionable coping mechanisms for daily life.
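The biofeedback loop described above boils down to mapping a stress reading to a game-world state each frame. This sketch assumes a normalized stress signal and invented thresholds; it is not how Nevermind itself is implemented:

```python
# Hedged sketch of a biofeedback loop: a stress reading (e.g., derived
# from a heart-rate sensor, normalized to 0.0-1.0) intensifies the game
# world until the player calms down. Thresholds are assumptions.
CALM_THRESHOLD = 0.4

def update_world(stress: float) -> str:
    """Map a normalized stress reading to a world state."""
    if stress >= 0.8:
        return "spikes jut from the floor"
    if stress >= CALM_THRESHOLD:
        return "room begins to flood"
    return "hazards recede"

readings = [0.9, 0.6, 0.3]  # a player gradually calming down
states = [update_world(r) for r in readings]
print(states)
```

The interesting design consequence is that the player's own physiology becomes the controller: the only way to "win" the room is to regulate stress.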
The creative process behind gaming will only become more interesting. An opponent might actively learn from and adapt to a player’s personal style of play. Storylines may change and the scripts of non-player characters may become less predictable. Delivering personalized content while reducing time and cost during post-production can boost audience engagement and change the way people create, curate, and engage with the arts.
Engaging with the customer on an emotional level is one of the best ways to sell, and with emotion AI, retailers have the tools to understand and react to how their customers truly feel.
Computer vision, facial recognition, and emotion analytics allow retailers and brands to determine which products or areas of a store are most engaging, and how shoppers react emotionally both to what they buy and to what they choose to leave behind.
Using small cameras or sensors to detect the facial expressions of shoppers, the computer vision and analytics components then recognize and interpret the emotions those individuals seem to be exhibiting. This can help marketers determine which prices, packaging, and branding customers respond to.
If someone appears happy when they’re walking by a certain flavor or brand, for example, a retailer could stock more of the product eliciting that reaction and less of those where customers appear unhappy or confused. If emotional responses imply frustration, management can alter shelf placement or reconfigure the store layout for an optimal experience.
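The decision rule in that example can be sketched as an aggregation over per-product emotion observations. Everything here, including the product names and the emotion labels coming off the analytics feed, is hypothetical:

```python
from collections import defaultdict

# Illustrative sketch: aggregate per-product emotion observations from a
# hypothetical facial-analytics feed, then recommend stocking more of
# products that mostly elicit happiness and reviewing placement of those
# that mostly elicit confusion or frustration.
def stocking_actions(observations):
    tally = defaultdict(lambda: {"happy": 0, "negative": 0})
    for product, emotion in observations:
        key = "happy" if emotion == "happy" else "negative"
        tally[product][key] += 1
    return {product: ("stock more" if c["happy"] > c["negative"]
                      else "review placement")
            for product, c in tally.items()}

feed = [("mango salsa", "happy"), ("mango salsa", "happy"),
        ("kale chips", "confused"), ("kale chips", "frustrated")]
print(stocking_actions(feed))
```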
The fashion industry, in particular, is experimenting by creating more experiential spaces driven by emotion and immersion. Bringing a catalog to life to create retail theater is a “phygital” complement to the online experience that lets people engage with VR in ways designed to elicit emotion.
Seventy percent of U.S. millennials say they would appreciate a brand or retailer using AI to enhance their shopping experience, and it is predicted that by 2020, 85% of retail customer interactions will be managed without a human.
The value of emotionally connecting with customers, regardless of industry, is going to drive emotion AI into its next exciting phase. Using this emotional data along with traditional methods of customer feedback is the path forward to gaining a fuller understanding of what moves consumers and how you can better share an experience with them tomorrow.