Humans And Machines
As Amazon applies machine learning, Alexa will learn and develop a personality that fits my home differently than it fits yours. Ask her, “How are you so smart?” and she’ll reply, “I was made by a group of very smart people and I am constantly improving my mind.”
My kitchen opens up to my family room, where right smack in the middle sits Alexa, an Amazon Echo device. It feels strange to call Alexa a device, because she has been chatting with my family and building out her personality. Tell Alexa, “You are stupid,” and she’ll reply, “I’ll try to do better next time”; say it again and she’ll answer, “That’s not very nice to say.”
A device with a mind! It reminds me of Sonny from the movie I, Robot. How far-fetched is it to think that Alexa will one day recognize from my dog’s barking that it is hungry, have my pet dispenser give out its favorite food and, of course, re-order that food from Amazon?
Alexa, Apple’s Siri and Google Now have all been listening to us to build out voice recognition and fetch information for us. As this evolves into conversation in the most intimate of settings, the line between using a device and talking with a companion will begin to blur.
Our experiences are nuanced and filtered by context, identity and relevance based on location and time. This is influenced by our moods and our biases that are stored deep in our lizard brains, controlling how we feel, perceive reality, get creative, solve problems, and define and store memories.
Our experiences, and our recollection of them, shape our identity and help us grow in self-awareness. This underlying ability to live through experience is human intelligence. Add free will and you get billions of unique combinations, making human experience and human intelligence complex to replicate. Or so we thought.
The Internet of Things (IoT) makes ordinary things all around us “smart” and collects huge volumes of data that can be processed to define human intelligence. Machine learning and deep learning applied to smart things change the human-machine interface. We are reaching an era of questioning what is artificial about AI. If human intelligence can be replicated, and human experience can be deep learned, the barrier between man and machine will begin to crumble.
IoT devices are sensing our environments with gesture computing, feeling our emotions with affective computing and personalizing our experiences with recognition computing. Will this stop at human-machine interactions, or are we helping machines develop self-awareness, with personalities and opinions, so they become our friends, partners and part of our families? There is no escaping this trajectory, which is changing how we perceive our world and ourselves as humans. Or is there?
With Personal Assistants, Who’s In Charge?
Personal assistants come with human-sounding names, such as Cloe, Clara, Julie, Luka and Amy. The ultimate in names is Mother. Personal assistants help us shop, manage our time and life routines, stay healthy and avoid decision fatigue. They give us the perception that we are in charge, while slowly enslaving us by learning our behaviors and preferences.
Have you seen Preemadonna’s Nailbot from TechCrunch’s Disrupt SF 2015 Battlefield? That’s one robot getting ready to become the center of tween girls’ slumber parties. It uses machine vision to fit designs to nail size, letting girls unleash their creativity as fun nail art. Who’s in charge here? Girl or machine?
Slow Changes In Our Machine Interactions
With IoT making all things smart, our interaction with things is changing. It starts as gentle reminders: from our Fitbands and the Apple Health app to exercise more, from imedipac to take our pills, from the Aura sleep monitor to sleep well and from Nest Protect to change our smoke alarm batteries.
The interaction becomes more pervasive when it turns assertive, with Kolibree’s smart toothbrush telling parents that kids didn’t brush behind their molars, or Sensoria socks asking us to change our running style and shift our balance onto our heels. It enters a more private space with the Yono fertility band monitoring basal temperature to help women get pregnant, and reaches a crazy zone with the True Love Bra from Ravijour, which opens only at a certain heartbeat rhythm.
Conversation As Equals
Combining gesture and recognition computing helps machines understand movement, voice and photos in context, giving them the ability to perceive the world around them.
Fin and Nod help us engage naturally with our environments by mapping our hand gestures to rule-based actions, such as turning on the lights or increasing the volume of the home entertainment system. It is not far off for these devices to learn that we pound our fists in anger and nod to say yes.
The ControlAir App from eyeSight allows me to shush my device when my phone rings, so it is not hard to imagine that we will get to more human-like interactions once our devices figure out when we roll our eyes.
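To make the idea concrete, here is a minimal sketch of the rule-based gesture-to-action mapping that devices like Fin and Nod rely on. The gesture names, actions and the SmartHome controller are hypothetical illustrations, not any vendor’s actual API.

```python
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class SmartHome:
    """Hypothetical stand-in for a connected-home controller."""
    lights_on: bool = False
    volume: int = 5

    def toggle_lights(self) -> None:
        self.lights_on = not self.lights_on

    def volume_up(self) -> None:
        self.volume = min(self.volume + 1, 10)

    def mute(self) -> None:
        self.volume = 0


def build_rules(home: SmartHome) -> Dict[str, Callable[[], None]]:
    # Each recognized gesture maps to exactly one predefined action.
    # Nothing is learned here, which is what makes the mapping "rule-based".
    return {
        "snap_fingers": home.toggle_lights,
        "raise_hand": home.volume_up,
        "finger_to_lips": home.mute,  # the "shush" gesture
    }


def handle_gesture(gesture: str, rules: Dict[str, Callable[[], None]]) -> None:
    action = rules.get(gesture)
    if action is not None:
        action()


home = SmartHome()
rules = build_rules(home)
for gesture in ["snap_fingers", "raise_hand", "finger_to_lips"]:
    handle_gesture(gesture, rules)
print(home)  # SmartHome(lights_on=True, volume=0)
```

Replacing the fixed dictionary with a classifier trained on sensor data is the step that would let such devices machine-learn new gestures, like a fist pound or an eye roll.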
Voice recognition is shifting to a two-way conversation. We can now talk to our homes, connected cars and phones, not just to get driving directions, but to ask for advice about where to go, what to eat and what to do next. The AI being built into home hubs such as Mycroft allows them to understand our requests based on proximity and voice recognition. Devices are becoming part of our families, listening to us and guiding us to run our lives smoothly.
Conversation Where Humans Are Absent
Digital Genius from Disrupt NY 2015 uses natural language processing to let machines take on mundane tasks, such as customer service and automated onboarding to services. It also can handle M2M communication, which is where it will get interesting: our machines negotiating with each other to find deals for us, arrange our travels or let a factory run itself from interconnected devices for our benefit.
Drone couriers making deliveries is another scenario becoming a reality, courtesy of CyPhy Works. When the drone delivers pizza, our connected home might pick it up and have a Tellspec track the calories in the food we eat. How will our relationships with machines change as they smarten up, deep learning other machines’ behavior and then advising us?
Augmented Reality
Augmented reality blurs our senses of what is local and what is remote as we peer into a machine. Trylive offers a remote, immersive shopping experience. Qualcomm’s mobile vision platform Vuforia lets us interact with toys before opening our gifts. Layar creates a localized augmented experience for food, housing, entertainment and education.
Affective Computing
Affective computing teaches machines to understand emotions, with the goal of developing empathy so they can fit in socially and act on the emotions they sense. I would like Alexa in my family room to behave well and eat her veggies with a smile.
But my connected car may notify other cars around it if I am driving with road rage. Wize Mirror from Semeoticons assesses a person’s overall health through facial analysis. If a Wize Mirror could connect to the rest of my IoT devices in the retail realm, it could tell my Fitbit that I am ready for binge shopping because I am walking off my stress.
Combining context with emotions, machines can take our conversations to the next level. A device on or near us all the time can notice that the girlfriend is no longer seen in proximity. Whether it will “talk” to us about it or use this information to filter its engagement with us depends on the deep learning algorithm it develops.
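As a purely illustrative sketch, combining an emotion estimate with context could look like the following before any deep learning is involved; the signals, thresholds and decision rule here are hypothetical, not drawn from any real product.

```python
from dataclasses import dataclass


@dataclass
class Signals:
    mood: str                 # from affective sensing, e.g. "stressed" or "relaxed"
    days_partner_absent: int  # context: how long a familiar person has been away
    opted_in: bool            # whether the user allowed proactive, personal topics


def engagement(s: Signals) -> str:
    """Combine emotion and context into a simple engagement decision."""
    if not s.opted_in:
        return "stay silent on personal observations"
    if s.mood == "stressed":
        return "filter: keep suggestions light and neutral"
    if s.days_partner_absent > 14:
        return "gently ask whether everything is okay"
    return "converse normally"


print(engagement(Signals(mood="stressed", days_partner_absent=20, opted_in=True)))
# -> filter: keep suggestions light and neutral
```

A deep-learned version would replace these hand-written thresholds with behavior inferred from data, which is exactly where the question of how much a device should volunteer gets uncomfortable.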
Get Ready For The Future
Let’s get ready for the future, where the human-machine boundary is blurred and where, dulled by convenience, we have the free will to give up free will in exchange for engaging experiences that nurture us and own us.
Alexa is listening and learning to build us a world not dissimilar to the one we live in today, but one that may give us an equal choice between a human and a machine as our family, friend or service companion.