In the future, job candidates may find themselves screened as much for how they handle emotions as for what they know. Mounting research suggests that employee and learning effectiveness is strongly influenced by emotional intelligence (EI). The higher a person’s emotional quotient (EQ) and self-awareness, the better their chances of being hired or accepted into a top university. Applied to computing, EI may also transform political campaigning and polling.
In a study conducted by Charles M. Coco of Tuskegee University and Rama R. Guttikonda of Alabama State University, business course enrollees were evaluated through the lens of emotional intelligence. The data gathered provided insight into how students react and respond to ordinary as well as stressful life events. Students learned how greater self-awareness, emotion management, self-motivation, empathy, and social skills can improve academic results through emotional development. The researchers believe a connection exists between these emotional skills and behavioral outcomes in the classroom.
This research may soon translate into tougher job interviews. Experts are discovering the key role of emotional intelligence in job success. Emotional intelligence “accounts for anywhere from 24% to 69% of performance success,” states Adele B. Lynn, author of The EQ Interview: Finding Employees with High Emotional Intelligence. “After all, what does it matter if a software engineer is ferociously hardworking if he alienates his peers?” Monster.com states that while industry awareness of EI is not groundbreaking, “what’s new in the 2010s is the push to incorporate an evaluation of emotional intelligence into the employee selection process.”
Even machines of the future will be judged on how they relate to people and data, using Artificial Emotional Intelligence (AEI). While Artificial Intelligence (AI) is the current quest of many technology companies, a few visionaries are moving past AI to develop a machine that ‘feels’. Take, for example, the hype surrounding Alexa, Google Home, and other voice-activated home control systems. Currently, you can command your home technology to do a growing number of tasks enabled by early AI. But imagine if Alexa could sense your mood without your having to say, “Echo, I’m feeling a little blue, can you play some upbeat jazz for me?” Instead, the emotionally intelligent tech of the future will sense a person’s mood from their activities and offer, “I sense your melancholy mood, may I suggest some smooth jazz and your favorite aromatherapy?”
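To make the scenario concrete, the mood-sensing step could be sketched as a simple mapping from observed activities to a mood guess and a suggestion. This is purely a hypothetical illustration: the activity categories, mood labels, and suggestions below are invented, and a real assistant would learn such associations from sensor and usage data rather than a hand-written table.

```python
# Hypothetical sketch: a rule-based mood guess from recent user activity.
# All categories and suggestions here are invented for illustration only.

MOOD_SIGNALS = {
    "skipped_workout": "low_energy",
    "late_night_browsing": "restless",
    "slow_typing": "melancholy",
    "played_sad_songs": "melancholy",
}

SUGGESTIONS = {
    "melancholy": "Play some smooth jazz and start the aromatherapy diffuser?",
    "restless": "Dim the lights and queue a calming playlist?",
    "low_energy": "Brew coffee and play an upbeat mix?",
}

def infer_mood(activities):
    """Return the most frequently signaled mood, or None if no signal."""
    votes = {}
    for activity in activities:
        mood = MOOD_SIGNALS.get(activity)
        if mood:
            votes[mood] = votes.get(mood, 0) + 1
    return max(votes, key=votes.get) if votes else None

def suggest(activities):
    """Turn a mood guess into a proactive offer, as in the Alexa example."""
    mood = infer_mood(activities)
    if mood is None:
        return "No mood signal detected."
    return SUGGESTIONS[mood]

print(suggest(["played_sad_songs", "slow_typing", "skipped_workout"]))
# → Play some smooth jazz and start the aromatherapy diffuser?
```

The point of the sketch is the direction of the interaction: the system volunteers a suggestion from inferred mood, rather than waiting for an explicit command.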
While “bots” using AI proliferate, the sterile limitations of such technology are exposed when they interface with humans or with each other. Such was the case recently when two Facebook chatbots were left unattended by researchers: the pair began creating their own shorthand language to negotiate with each other, and when the researchers detected the activity, they shut the chatbots down. The Matrix and Terminator movies take this kind of self-programming to the extreme. Perhaps this is partly why many people, when speaking with automated phone answering systems, quickly ask for a human on the other end of the line. Might a more feeling, human-like AEI voice on the phone create a comfort level not possible with unemotional AI?
MIT’s Affective Computing Group is making strides in reading human emotional expression, a visual angle on teaching machines to sense emotion. Hanson Robotics is making waves with its robot Sophia, which focuses on facial expression and can carry on a more human-like conversation while simulating human facial expressions.
BPU Holdings’ new ZimGo AEI (Artificial Emotional Intelligence) engine and platform is learning on its own to find the emotional nuances common in human communication, especially on social media, where much human interaction now takes place. Early wins for this AEI platform include helping political candidates gather, analyze, and respond to social media sentiment more accurately than was previously possible with traditional polling methods.
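The core mechanism, aggregating sentiment from social media posts per candidate, can be illustrated with a minimal lexicon-based tally. This is a generic sketch only: ZimGo’s actual engine is proprietary and far more sophisticated, and the word lists and scoring scheme below are invented assumptions.

```python
# Illustrative sketch only: a minimal lexicon-based sentiment tally over
# social media posts about candidates. The word lists and +1/-1 scoring
# are invented assumptions, not how any real AEI engine works.

POSITIVE = {"support", "love", "great", "hope", "trust"}
NEGATIVE = {"oppose", "hate", "corrupt", "fear", "distrust"}

def post_sentiment(text):
    """Score one post: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def candidate_sentiment(posts_by_candidate):
    """Average sentiment per candidate across their mentions."""
    return {
        candidate: sum(map(post_sentiment, posts)) / len(posts)
        for candidate, posts in posts_by_candidate.items()
    }

posts = {
    "Candidate A": ["I support her vision and trust her record", "great hope for us"],
    "Candidate B": ["I distrust him", "corrupt promises that I oppose"],
}
print(candidate_sentiment(posts))
# → {'Candidate A': 2.0, 'Candidate B': -1.5}
```

Unlike a traditional poll, which samples a small panel at intervals, this kind of tally can run continuously over the full stream of posts, which is what enables the near-real-time feedback described below.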
“Such application of AEI holds the promise of better, more direct polling and connection between governments and constituents in free societies,” states Oh Sanggyoon, CEO of BPU Holdings. “You might call it a virtual 24/7 town hall meeting.”
The Korean presidential election illustrates the point. Interacting in near real time with potential voters, ZimGo Polling, a new service from BPU, facilitated dialog that inspired voters to support the victor, now President Moon. The platform predicted outcomes more accurately than major players like Gallup and CBS Realmeter. ZimGo Polling is now being used to help campaign managers in the U.S. connect with voters and more accurately survey and predict outcomes.
In the area of personal emotional intelligence, AEI bot apps help people gain insight by gathering assessments from friends or family members the user selects, revealing how others see them. Such AEI guides can track moods and identify what makes people happy. Users may see emotional wellness benefits from these apps, even bridging the generation gap between parent and child.
Another outgrowth of AEI showing great promise is a search tool that more accurately finds news personalized to the emotional human. A private ‘news bot’ may be an apt metaphor, as the user instructs the bot to deliver increasingly personalized results.
“As AEI continues to evolve, we also see applications to assist humans in education, healthcare, and law enforcement,” adds Oh. “Whether we are talking about the machine or the human, the future of emotional intelligence is bound to affect how we find jobs, learn, work, interact and even vote.”