Why the Future will be Warm and Creepy: Mom as Perfect UX for Human Machine Interaction

[Image: mother and baby, captioned "Perfect Interaction"]

No, your mom isn't creepy. This post is about future interaction between humans and computers, and it paints an alternative to the classic science fiction models of cold and organized (Star Trek) and cold and creepy (The Matrix).

I've been chatting on Twitter with @moyalynne about perfect interaction and UX. The conversation was spawned by a comment from @NicoleLazzaro, who said:

 @NicoleLazzaro: Children who watch TV today they expect a response. Tablets are so more engaging. ~ @asrarasheed #digitalkids

This led to a sprawling discussion about interactive user experiences… My first thought was how much the word "interactive" has evolved over the past decade; I look back to when it essentially meant CD-ROMs.

Now the expectation is touch on glass, tilt, multitouch gestures and more.

In thinking about the evolution of UX in interactivity platforms, I'd like to frame it in a way that will help us understand how far we've come and where we're going. How do we define "perfect interaction"?

Mobile UX: The “Bubble”

My thinking on what people look for in mobile UX, particularly gaming experiences, centers on something I call "The Bubble". When you are inside "The Bubble" you are fully protected from the evils of the outside world. Developers should seek to form a perfect bubble around their players. Any clunky interaction "breaks the bubble". This is why services like http://crittercism.com are so important: the worst and most infuriating way to break the bubble is an app crash.
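To make that concrete, here's a rough sketch of how crash-reporting services of this kind typically hook into an app: install a default uncaught-exception handler so the crash can be recorded before the bubble pops. This is not Crittercism's actual SDK, just an illustrative Android snippet with a made-up class name:

```java
import android.app.Application;
import android.util.Log;

// Hypothetical Application subclass illustrating how crash-reporting SDKs
// generally capture "bubble-popping" crashes: by installing a default
// uncaught-exception handler before anything else runs.
public class BubbleApp extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        final Thread.UncaughtExceptionHandler previous =
                Thread.getDefaultUncaughtExceptionHandler();
        Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
            @Override
            public void uncaughtException(Thread thread, Throwable error) {
                // A real SDK would persist this report and upload it on the
                // next launch; here we only log it.
                Log.e("BubbleApp", "Bubble popped: uncaught crash", error);
                if (previous != null) {
                    previous.uncaughtException(thread, error); // let the system finish crashing
                }
            }
        });
    }
}
```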

Mobile Ads break the “Bubble”

Many app developers are rejecting mobile advertising in its current form. This is because mobile banner ads break the bubble of the app. Getting kicked out of an app into a mobile browser is a jarring experience. Most apps aren't about browsing or reading, so transplanting a browser-based advertising concept is largely a failure. Advertising will need to happen "inside" the bubble, presumably as MRAID-compliant HTML5/JS dynamic "ads", though think of them more as branded interactive experiences. Better yet, brand advertisers could create bubbles of their own.

So how do we go beyond the "bubble" stage of interactivity? This is what people got excited about with http://foursquare.com, which overlays the "in the world" experience with the experience in the bubble. Interestingly, the bubble is permeable.

The Bubble is the Womb

To introduce the metaphor, imagine that the bubble is the womb. Now imagine that nutrition can get into the womb via the umbilical cord. In this way, for example, text messaging is a welcome stream of data that can flow into and out of the bubble. That's why advertising by alert and notification, such as http://airpush.com, will ultimately fail. As you know, young people HATE the app on their devices that generates "phone calls". Nothing is more invasive and "bubble popping" than having someone ring your phone and force you to talk with them, with all your icky emotions and weird quirks. Just send text messages; it's much cooler.

Birth, or how the Bubble pops naturally

So "phone calling", software crashes and today's mobile advertising pop the bubble. But how do we emerge from this stage of interactivity naturally?

If you take this metaphor further, imagine that the next stage is birth.

Mom’s Face is the first UX for information transfer

After we're born, we don't see well, but we interact with our caregiver (probably mom or someone acting in that role) mainly by crying (sending messages) and looking at the caregiver's face (receiving messages). The arms take the place of the womb for protection, and the breast takes the place of the umbilical cord. The face is the first UX for information transfer. Most of the information is about regulating the sympathetic/parasympathetic nervous system (fight-or-flight vs. rest-and-digest); the mother effectively serves as a "threat coprocessor" for the baby via facial action coding.

Skin to Skin, Face to Face

But the interaction moves up a level pretty quickly… skin-to-skin touch is next, serving a communication purpose but also elevating oxytocin levels. Face-to-face interaction comes next, for both threat detection and social bonding. Of course we understand Facebook to be bringing more facial interaction through profile pictures, picture tagging and photo sharing. But I'm talking about when your devices are able to read your mood and needs by looking at you. Android Ice Cream Sandwich can already "recognize" the phone owner and unlock the device based on face recognition.
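As a small illustration of what "looking at you" means at the API level, here is a minimal sketch using Android's long-standing android.media.FaceDetector class. This isn't the Ice Cream Sandwich lock-screen code (that isn't a public API); the class name and the bitmap argument are hypothetical, and a mood-aware app would go well beyond this:

```java
import android.graphics.Bitmap;
import android.graphics.PointF;
import android.media.FaceDetector;
import android.util.Log;

// Minimal sketch: find up to MAX_FACES faces in a bitmap, e.g. a frame
// grabbed from the front camera. FaceDetector expects an RGB_565 bitmap.
public class FaceFinder {
    private static final int MAX_FACES = 4;

    public static void findFaces(Bitmap frame) {
        Bitmap rgb565 = frame.copy(Bitmap.Config.RGB_565, false);
        FaceDetector detector =
                new FaceDetector(rgb565.getWidth(), rgb565.getHeight(), MAX_FACES);
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        int found = detector.findFaces(rgb565, faces);

        for (int i = 0; i < found; i++) {
            PointF midpoint = new PointF();
            faces[i].getMidPoint(midpoint);          // point between the eyes
            float eyeDistance = faces[i].eyesDistance();
            // A real "mood-aware" app would hand these face regions to an
            // emotion classifier; here we only note that a face is present.
            Log.i("FaceFinder", "Face at " + midpoint.x + "," + midpoint.y
                    + " with eyes " + eyeDistance + "px apart");
        }
    }
}
```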

All your Face are Belong to Us

Emotional analysis of the face is moving ahead quickly. The seminal work of Dr. Paul Ekman is helpful here.

Steve Jurvetson posted the image below of the Fraunhofer Face Detector.

[Image: Fraunhofer Face Finder]

We’re getting there.

Machines are bad at faces but good at sensors

One of the things we love about computer technology is how it makes the invisible visible. One way is through "big data" processing; the other is through sensors that are unavailable to unmodified humans. The emergent cybernetic organism doesn't need to rely solely on evolutionary interfaces like the human face.

The average Android device has a dozen or more sensors. Don't think just of multitouch or front and back cameras; think accelerometer, barometer (yes, they have those), GPS, temperature, battery sensors, WiFi, light sensors and magnetism. Combining software with these sensors allows a camera to become a heart rate monitor or a bar code reader. By reading things like heart rate, blood pressure and galvanic skin response, you can get a better sense of the emotional state of the user.
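As a rough sketch of what's already exposed, here is how an Android app can enumerate the device's sensors and subscribe to one of them via the standard SensorManager API. The Activity name is hypothetical, and the light sensor is just one example; any sensor type the hardware offers could be swapped in:

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

import java.util.List;

// Minimal sketch: list every sensor the device exposes, then listen to the
// ambient light sensor as one example of "making the invisible visible".
public class SensorTourActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);

        List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);
        for (Sensor sensor : sensors) {
            Log.i("SensorTour", sensor.getName() + " (vendor: " + sensor.getVendor() + ")");
        }

        Sensor light = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
        if (light != null) {
            sensorManager.registerListener(this, light, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Ambient light in lux; pressure, temperature, etc. arrive the same way.
        Log.i("SensorTour", "Ambient light: " + event.values[0] + " lux");
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }
}
```

The camera-as-heart-rate-monitor trick works the same way in spirit: sample a raw signal (the red channel of a fingertip pressed over the lens) and let software extract the invisible rhythm.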

An example of an excellent human-machine interface is the tongue camera. This technology allows blind people to see by interfacing with the tongue through an array of movable pins. Amazing, and in this context important because the tongue is an evolutionary interface being commandeered (thanks to neuroplasticity) to serve a completely different purpose than the one it evolved for.

The Internet will kiss your boo-boo and make it better

Why is the emotional state of the user important? Because life outside of the bubble is bright, painful and threatening. The ability to detect and ameliorate these negatives enables the newly born organism to attain at least some measure of comfort in this new environment. If your pain is recognized by someone, it takes some of the sting away. This is why mom can kiss your skinned knee to make it all better.

And on to mind-reading

IBM predicts that we will have mind-reading computers within five years. This matters especially for applications that assist people who don't have the use of their limbs, for example, and there is tremendous advancement there.

The future will be warm, not cold

So the model for "interactive" in the future will be Mom. Watch for better haptic feedback to emerge (better than fingers on glass for sure; think skin on skin), better emotional communication between humans and machines (including mind reading), and ways for the machine parts of us to help us assess threats, acquire nutrition and, alas, find reproductive partners. Not that your mom helps you find a date on a Saturday night (I hope not), but that we'll increasingly relate to technology as a means to meet the whole spectrum of human needs. Technology will not supply all the needs, but it will increasingly serve as the interface.

Siri is not enough.

Just because the future will be warm and have a human face does not mean it won't be creepy as hell

As a final note, I’m not pollyannaish enough to think all of this will be good. I just wanted to sound this closing note of caution in case the tone of amazement at our achievements in technology is misinterpreted as a blanket approval of all uses of technology. We will be heading into new territory in privacy, ethics and legislation here.
