Artificial Intelligence (AI) is at a crucial juncture: by integrating emotional intelligence with existing AI technology, it could become truly transformational. Wondering what difference it will make? Artificial emotional intelligence, or Emotion AI, is the idea of making devices capable of recognizing and responding to emotions the way humans do. This can be done in several ways, such as by interpreting changes in facial expressions, gestures, physiology and speech. Over the years, artificial emotional intelligence has seen a tremendous increase in applications across gaming, automotive, robotics, advertising, retail, healthcare and education, to name a few. Of these, Emotion AI for the automobile industry is particularly intriguing.
Automotive companies are increasingly devising methods to interpret drivers’ emotional input, provide human-like responses in real time and deliver a more personalized in-car experience. These responses can include alternate routes, pitstop suggestions along the route based on user interests, audio system management and climate control. Emotion AI can analyze body posture and facial expressions to gauge the driver’s emotional state and level of attention. If the driver is found distracted, the car can respond with pre-crash actions, such as tightening seat belts or pre-charging the brakes, or switch to an automated driving mode and take control of the vehicle to improve safety.
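The escalation logic described above can be sketched in a few lines. This is a purely illustrative example; the `DriverState` fields, thresholds and action names are hypothetical assumptions, not taken from any real vehicle system.

```python
from dataclasses import dataclass


@dataclass
class DriverState:
    """Hypothetical signals a driver-monitoring camera might report."""
    gaze_on_road: bool      # is the driver looking at the road?
    eyes_closed_ms: int     # how long the eyes have been closed, in ms
    posture_slumped: bool   # body posture suggests incapacitation


def safety_response(state: DriverState) -> str:
    """Map a monitored driver state to an escalating safety action."""
    # Severe signs (prolonged eye closure, slumped posture):
    # hand control to the automated driving mode.
    if state.eyes_closed_ms > 2000 or state.posture_slumped:
        return "engage_automated_driving"
    # Mild distraction: take pre-crash precautions such as
    # tightening seat belts and pre-charging the brakes.
    if not state.gaze_on_road:
        return "precrash_prepare"
    return "none"
```

An attentive driver maps to `"none"`, a distracted one to `"precrash_prepare"`, and a drowsy or slumped driver triggers the automated-driving takeover.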
Eyeris, Affectiva, Honda and Toyota are among the companies showing strong interest in this area. Eyeris, a leader in vision AI for emotion recognition, recently launched a technology named Human Behavior Understanding (HBU) artificial intelligence. The company’s EmoVu software, integrated with cameras mounted in the vehicle, monitors the driver’s facial micro-expressions, body movements and activities. It can also detect signs of exhaustion, such as eyes rolling downward or backward, and suggest safer driving alternatives.
Further, Toyota and Honda have taken their first tentative steps towards building emotionally aware vehicles through their concept cars: Toyota’s Concept-i and Honda’s NeuV. These cars aim to improve safety by anticipating a driver’s needs through analysis of facial expressions, voice tone and driving habits. Such vehicles can offer suggestions in different situations, such as adjusting the seat or making a stop along the way. Affectiva has also recently launched Automotive AI to help cars measure the emotional and cognitive states of a vehicle’s occupants in real time.
In 2017, Shaanxi University of Science and Technology published patent CN107458381A, which discloses a facial expression recognition system for motor vehicles. The driver’s emotions, such as anger, disgust, fear, happiness, sadness and surprise, are determined from inputs of the speech recognition system and the driver’s facial expressions. If the driver is deemed emotionally unfit to drive, the microprocessor outputs an ignition-refusal signal to the ignition controller, preventing accidents caused by the driver’s emotional instability.
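A decision rule of this kind, combining the two input channels before refusing ignition, might look like the following sketch. The emotion labels come from the patent’s list, but the fusion rule, intensity threshold and function name are hypothetical assumptions for illustration only.

```python
# Negative emotions from the patent's six-emotion list that could
# indicate the driver is unfit to drive.
NEGATIVE_EMOTIONS = {"anger", "disgust", "fear", "sadness"}


def allow_ignition(facial_emotion: str, speech_emotion: str,
                   intensity: float) -> bool:
    """Return False (refuse ignition) only when both the facial and
    speech channels agree on a strong negative emotion.

    intensity: assumed 0.0-1.0 confidence/strength score from the
    recognition system (hypothetical parameter).
    """
    unstable = (facial_emotion in NEGATIVE_EMOTIONS
                and speech_emotion in NEGATIVE_EMOTIONS
                and intensity >= 0.8)
    return not unstable
```

Requiring agreement between the two channels is one plausible way to reduce false refusals from a single noisy sensor; the actual patented logic may differ.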
In 2011, Toyota published patent US7982620B2, which discloses a system to reduce boredom while driving and thereby ensure safety. The system includes a vehicle environment monitor that reports the driver’s physiological state, such as yawning or frustration, and driving conditions, such as straight roads and light traffic. The system then initiates human-machine voice interactions or interactive displays to relieve boredom.
In 2013, Honda published patent US9751534B2, which describes a method to monitor the behavioral state of a driver using different sensors. On detecting an unstable driver state, a response system activates an automatic brake prefill system, provides warnings of potential collision threats, or plays appropriate music to adjust the driver’s mood and ensure safe driving.
In 2003, Ford filed patent US7138922B2, disclosing a drowsy driver detection system. On detecting drowsiness, the system gives the vehicle operator an audible or visual warning along with directions to the nearest rest area, exit, gas station or other place where the operator can rest.
In 2017, Zhiche Youxing Technology published patent CN106803423A, which discloses a human-machine interactive voice control method based on the driver’s emotional state. The driver’s current emotional state is determined from facial expressions, driving behavior, speech speed and tone of voice. An intelligent system may then activate the music system or change the tone of the navigation system’s audio to improve the driver’s mood.
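The mapping from a detected emotion to in-car responses described above can be pictured as a simple lookup. The specific emotion labels, playlist names and tone values below are illustrative assumptions, not details from the patent.

```python
def hmi_adjustment(emotion: str) -> dict:
    """Map a detected driver emotion to hypothetical in-car responses:
    a music selection and a navigation-voice tone."""
    actions = {
        "anger":   {"music": "calming_playlist", "nav_tone": "soft"},
        "sadness": {"music": "upbeat_playlist",  "nav_tone": "cheerful"},
        "fear":    {"music": "calming_playlist", "nav_tone": "reassuring"},
    }
    # For neutral or positive states, leave the cabin settings alone.
    return actions.get(emotion, {"music": "keep_current",
                                 "nav_tone": "default"})
```

In a real system the table would be replaced by a learned policy tuned per driver, but the principle of emotion-conditioned responses is the same.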
In the coming years, artificial emotional intelligence is projected to grow into a multibillion-dollar technology that transforms entire industries. Challenges and obstacles remain before the best and safest in-vehicle experience can be delivered. However, companies such as Ford, Tesla, Google and Apple are researching at a rapid pace towards a completely safe, automated, personalized and emotionally intelligent next generation of vehicles. With the power of artificial emotional intelligence, the future is a vehicle that can engage with people and make interactions more conversational and relational.
The featured image is intended for representational purposes only and has been sourced from https://www.flickr.com/photos/jurvetson/7408464122