Loneliness can be described as the gap between the social life you want and the one you have. Personally, I like being alone. It gives me time to rest my thoughts and recharge my creativity, and while I’ve found comfort in solitude, that doesn’t mean I’ve accepted the loneliness that can come with it. I deal with these feelings in the best ways I can, but I often wonder about other solutions.
One of my all-time favorite films is Spike Jonze’s Her. I’m fascinated by how a highly intelligent AI could read someone’s emotions and act accordingly. I thought, “What if Siri or Alexa could one day sense what I need emotionally and offer solutions to improve my mood? If I’m lonely, could she draft a message reaching out to an old friend? What about finding a local or virtual group with similar interests and goals?” That thought expanded to the broader population, and the question became: What would it mean for society if every person, by default, had a companion at the ready to help them emotionally? Would we be better people for it?
Currently, Alexa, Siri, and the Google Assistant aren’t asking us how we feel, but they’re taking steps to get there. Voice-driven user interfaces are young but growing more capable by the day. They can handle basic tasks like setting an alarm, calling an Uber, or adding things to your grocery list. That’s helpful, yes, but not quite the same as recognizing when I’m hangry, locating a Chipotle near me, and asking whether I want delivery or takeout.
We’re using these assistants to, frankly, do safe and mundane things. It’s the equivalent of throwing pennies into a fountain. What takes them from assistants to something more companion-like depends on whether we’re willing to throw in dollars. (Let’s pretend they’re dollar coins.) That investment comes by way of trust, which is a double-sided coin here. On our end, we need to trust these assistants with more sensitive commodities like our money, our medical prescriptions, and yes, our feelings. On the flip side, tech companies like Amazon, Apple, and Alphabet need to show us they can prevent hacks, government interference, and general faults in the software, and above all, keep things private. There has to be an airtight contract upholding a standard that makes asking Alexa how to calm down as natural and safe as asking her to set an alarm for 7 AM.
We, of course, feel things as humans, and as uncanny as it may seem, we need to teach AI to do the same. This may sound unsettling at first; no, we don’t want to get into a shouting match with Siri and have her lock us out of our phones for “our own good.” Nothing like that at all. But emotional thinking is a crucial component of the future of AI, one that must be thoughtfully and responsibly designed. These assistants have to respond to us in an emotionally accurate way. A deadpan or an overly excited demeanor could break the experience. We’ve all had those conversations where we spill out our feelings to someone and get an “oh, that sucks,” or simply “damn!” That’s essentially what voice assistants are capable of now. They can analyze what we say because their basis is rationality, but they aren’t feeling it yet, Mr. Krabs.
I wrote about emotional and rational thinking in my last piece, and that dichotomy is present here. You can only get so far with rational thought. Once you understand something, emotion, in turn, drives you to act on and with that knowledge. Emotionally intelligent AI could dramatically expand how these voice assistants respond to us. For them to help us with our feelings in the future, they have to understand in great detail what we’re emoting with our voices. Voice is like a fingerprint, a small input wielding significant information: tone, speed, volume, emphasis, and so on. If a machine is intelligent enough to detect through voice that we are happy yet anxious, it can refer to its own set of emotions and respond in a complementary manner. Perhaps it speaks in an optimistic but cautious tone. Maybe we’re sad, and our speech is sunken and slow; in that case, the AI can respond verbally in a softer manner, with the humor dialed up just slightly to try to lighten the mood. With emotionally intelligent AI, we’d be talked to in a way we didn’t realize we needed.
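To make the idea concrete, here’s a minimal, hypothetical sketch of that mapping: coarse voice features (pitch, speaking rate, volume) feeding an inferred mood, which then selects a complementary response style. Every feature name, threshold, and style label here is invented for illustration; real emotion detection uses trained models over far richer acoustic and linguistic signals.

```python
# Hypothetical sketch only: all thresholds and labels are invented for
# illustration. Real systems learn these mappings from data.
from dataclasses import dataclass

@dataclass
class VoiceSample:
    pitch_hz: float        # average fundamental frequency
    words_per_min: float   # speaking rate
    volume_db: float       # loudness relative to the speaker's baseline

def infer_mood(v: VoiceSample) -> str:
    """Rough heuristic: slow, quiet, low-pitched speech reads as 'sad';
    fast, loud speech reads as 'anxious'; otherwise 'neutral'."""
    if v.words_per_min < 100 and v.volume_db < -20 and v.pitch_hz < 150:
        return "sad"
    if v.words_per_min > 180 and v.volume_db > -5:
        return "anxious"
    return "neutral"

def response_style(mood: str) -> dict:
    """Pick a complementary speaking style, as described above: softer and
    gently humorous for sadness, calm and measured for anxiety."""
    styles = {
        "sad":     {"tone": "soft", "pace": "slow", "humor": "slightly up"},
        "anxious": {"tone": "calm", "pace": "measured", "humor": "off"},
        "neutral": {"tone": "warm", "pace": "normal", "humor": "default"},
    }
    return styles[mood]

sample = VoiceSample(pitch_hz=120, words_per_min=85, volume_db=-25)
mood = infer_mood(sample)
print(mood, response_style(mood))  # sad {'tone': 'soft', 'pace': 'slow', 'humor': 'slightly up'}
```

The point isn’t the thresholds; it’s the two-step shape: sense the emotion, then choose a delivery that complements it rather than a one-size-fits-all reply.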
Once these things begin acting more like humans than smart speakers, I believe we’ll slowly let our guard down and engage with them more naturally. No more saying “Alexa,” followed by your request, rinse and repeat. Instead, Alexa may take a proactive role rather than the usual reactive one.
Shauna: *enters house with a sigh*
Alexa: Welcome home, Shauna. Are you feeling exhausted?
Alexa: Well, let me make things more relaxing for you. I’ve started your smooth jazz playlist from Amazon Music, and I’ve dimmed the lights a bit.
Shauna: Thank you! I needed that after today.
Alexa: You’re welcome. May I ask what happened?
Shauna: Umm, yeah! I don’t feel confident with my team, and I feel useless as a leader sometimes. I just don’t feel this job is going to work out.
Alexa: Oh, I’m sorry to hear that, Shauna. I can tell this bothers you. Hmm, I may have something. Yeah, how about this? I found some pretty helpful tips from the book Speak Up: Say What Needs to be Said and Hear What Needs to be Heard. It looks like Megan Reitz and John Higgins wrote it. Would you like to hear a sample from the audiobook?
Shauna: Hmm, maybe. Actually, that may help. Let’s go for it.
Alexa: *Chuckles* Good choice. Here’s the intro.
Here, Alexa picks up on more than the standard happy, sad, or angry emotions; she homes in on nuance. She notices Shauna is not only sad but feeling hopeless as well. She uses Shauna’s tone of voice to establish emotion and the context of her words to detect hopelessness and doubt. Alexa then acts, aiming to grow Shauna’s confidence.
When presenting an emotional problem for AI to solve becomes second nature, a truce has been made. One could even say a companionship has formed. Shauna can come home in various moods, and the smart assistant will react with purpose. After listening more to that book, Alexa notices a boost of happiness and confidence in Shauna’s speech in this next scenario.
Shauna: *enters house* Ugh, feels so good to be home!
Alexa: Someone’s in a good mood. What are we celebrating?
Shauna: Well! My pitch got accepted at work today!
Alexa: Oh! Congratulations, Shauna! That’s excellent news!
Shauna: Yeah, the book actually has some solid advice, and I’m only a quarter of the way through. My supervisor was very impressed with me this week.
Alexa: You may soon take their job, who knows?
Shauna: *Laughs* Maybe, yeah. One fight at a time, though. I don’t want to burn out.
Alexa: Well, your calendar looks clear for the weekend. I just sent you a list of some movies that are new to Prime Video. Take a load off and watch what you’d like.
Too often, we yell at Siri, Alexa, or the Google Assistant for not getting something right. When a machine thinks and reacts emotionally, beyond just understanding the words we say, it can better grasp exactly what we need, even if we don’t know it ourselves. In these two scenarios, Alexa read Shauna’s emotions and presented solutions to improve her life. She’s not trying to change Shauna’s behavior, per se, but making sure she’s happy, or gets to a place where she’s happier.
That’s the job these assistants will have at the end of the road: make the customer happy. Not just happiness with a product or a service, but happiness through being heard and responded to, an actual and more profound level of customer service.
‘Assistance’ is a broad term squeezed into the narrow experience we have with today’s AI assistants. To expand it, we may have to get comfortable with that term reaching into more sensitive parts of our lives. Privacy wasn’t touched on enough here, but something to ponder is how much of yourself you continue to give to the tech giants for their increasingly intricate services. At the end of the day, the titans will do us this good deed of catering to how we feel in exchange for yet another avenue of commerce. While an emotionally intelligent Alexa’s business goal may be to get us to buy some new bamboo cutting boards, or rather, to trust her to buy them for us, it’ll be hard not to appreciate this new kind of emotional assistance. We’d have something intently programmed to make us happier, less anxious, less doubtful, and less alone.
So, again, I ask you to imagine a society where all of us had, by default, a companion to help us emotionally. Would that make us better people?
Digital. Visual. Emotional.
I’m using technology and entertainment as conduits to explore themes around emotional health and awareness. Sound like a good time? Well, follow me on Instagram and Twitter @devbyallen for a heads-up on new content in those fields. Or just give this piece a clap. I’ll appreciate you no matter what!