Post on 21-Apr-2017
HELLO
Greg Carley, Head of Product Strategy
Chaotic Moon (chaoticmoon.com)
Creative Technology Studio located in Austin, TX
Everything else: https://about.me/gregcarley
ACCENTURE
Digital / Strategy / Technology / Consulting / Operations

FJORD
Service Design + Living Services

CHAOTIC MOON
Product Realization + Human UI
Design & Innovation + Human Expectations
Interactive
“Digitalization of Everything”
Interactions with machines will evolve to be less screen dependent and more appealing to all of the human senses
EXPLICIT
• Lots of manual action required
• Singular-purpose apps
• Limited data
• Only appeals to a few senses
• Screen-based
Digitalization of Everything: People, Places and Things (IoT). When done right, experiences will feel magical.

IMPLICIT
• Automatic
• Default everything
• Based on privacy
• Interoperability
• Appeals to many senses
• Screens only used when necessary
FIVE TECHNOLOGY DRIVERS
1. Data and Analytics
2. Cloud Connectivity
3. Connected Sensors
4. Mobile Technology
5. User Interfaces
Just as we have moved away from point-and-click devices toward touch screens, next we will see our bodies being increasingly used as both a controller and an interface.
DESIGNING FOR HUMAN BANDWIDTH
DESIGNERS WILL NEED TO ASK THEMSELVES, WHAT IS THE QUICKEST, MOST RELIABLE WAY TO GET INFORMATION INTO AND OUT OF THE HUMAN BODY? — TO DESIGN FOR A HUMAN INTERFACE, SUCH AS THE BODY, DESIGNERS WILL NEED TO MEASURE BANDWIDTH AGAINST USABILITY.
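One way to make "bandwidth versus usability" concrete is to estimate the information rate of each input channel. The sketch below does that back-of-the-envelope comparison; every rate and bits-per-symbol figure is an assumption chosen for illustration, not a measurement.

```python
# Rough, illustrative estimate of human I/O bandwidth per modality.
# All rates below are assumptions for the sake of comparison, not data.

def bits_per_second(symbols_per_minute: float, bits_per_symbol: float) -> float:
    """Convert a symbol rate into an approximate information rate."""
    return symbols_per_minute / 60.0 * bits_per_symbol

# Assumed symbol rates and per-symbol information content:
modalities = {
    # ~40 words/min typing, ~10 bits of information per word (assumption)
    "typing":  bits_per_second(40, 10),
    # ~130 words/min speech, same assumed bits per word
    "speech":  bits_per_second(130, 10),
    # ~30 discrete gestures/min, ~4 bits each (16 distinguishable gestures)
    "gesture": bits_per_second(30, 4),
}

for name, rate in sorted(modalities.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} ~{rate:.1f} bits/s")
```

Under these assumptions speech wins on raw bandwidth, but that is exactly where usability pushes back: a high-bandwidth channel that is socially awkward or error-prone may lose to a slower one.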
Responding to natural human behavior will become a more important element of design as we move into a mixed environment of screens and smart objects without screens.
CONVERSATIONS WITH DATA
HUMAN UI — THE COMBINATION OF BODY LANGUAGE + HUMAN SENSES TO NATURALLY COMMUNICATE WITH MACHINES AROUND US
— BODY LANGUAGE REMAINS A VITAL WAY IN WHICH WE TRANSMIT MEANING AND EMPHASIS.
THE RISE OF PCS, LAPTOPS AND MOBILE PHONES HAS CREATED A REVOLUTION IN REMOTE COMMUNICATIONS: A WORLD WHERE BODY LANGUAGE SEEMS LESS IMPORTANT.
Although there are cultural variations, all over the world we ‘read’ other people through their body language, be it consciously or subconsciously.
BODY LANGUAGE CONSIDERATIONS
Body language will manifest itself in a number of ways: 1) Gestures 2) Intent 3) Face Speed
GAMING HAS ALREADY BEEN DOING IT
Speaking with our bodies can eliminate friction points from daily life. When implemented in a way that is useful to people, gestures will become something we barely think about.
SKELETAL GESTURES
Designers will have the opportunity to create standards for gestures, but will need to be sensitive to specific cultural meanings and ‘gesture conflict.’
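The "gesture conflict" problem can be sketched as a registry that refuses gestures with known culture-specific sensitivities and flags any gesture bound to more than one command. All gesture names, locales, and sensitivity data here are hypothetical placeholders.

```python
# Minimal sketch of a gesture registry that surfaces 'gesture conflicts':
# the same physical gesture mapped to multiple commands, or a gesture
# flagged as sensitive in a target locale. All entries are illustrative.

from collections import defaultdict

# Hypothetical sensitivity table; real entries would come from research.
CULTURALLY_SENSITIVE = {
    ("thumbs_up", "example_locale"),  # placeholder entry
}

class GestureRegistry:
    def __init__(self):
        self._bindings = defaultdict(set)  # gesture -> set of commands

    def register(self, gesture: str, command: str, locale: str = "default"):
        if (gesture, locale) in CULTURALLY_SENSITIVE:
            raise ValueError(f"{gesture!r} is sensitive in locale {locale!r}")
        self._bindings[gesture].add(command)

    def conflicts(self) -> dict:
        """Return gestures bound to more than one command."""
        return {g: cmds for g, cmds in self._bindings.items() if len(cmds) > 1}

reg = GestureRegistry()
reg.register("swipe_left", "dismiss")
reg.register("swipe_left", "previous_track")  # two meanings -> conflict
reg.register("wave", "greet")
print(reg.conflicts())
```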
HAND GESTURES
We use a wide array of subtle gestures and signals in our daily interactions with each other to signify intent. The UI of intent is very important when navigating around the physical world.
INTENT
AUTOMATED AND ROBOTIC OBJECTS WILL NEED TO EXPRESS CLEAR INTENT
Self-driving cars, for instance, need to express to pedestrians waiting to cross the road whether or not it is safe for them to go.
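The pedestrian example can be sketched as a small state-to-signal mapping: the vehicle's internal state is translated into an explicit, unambiguous message for the person waiting to cross. The states and signal strings are invented for illustration; real vehicles would use regulated external displays, not this logic.

```python
# Illustrative sketch: mapping a self-driving car's state to an explicit
# intent signal for pedestrians. States and messages are hypothetical.

from enum import Enum

class CarState(Enum):
    CRUISING = "cruising"
    YIELDING = "yielding"
    STOPPED = "stopped"

def pedestrian_signal(state: CarState, pedestrian_detected: bool) -> str:
    """Translate vehicle state into a human-readable intent signal."""
    if not pedestrian_detected:
        return "no signal"
    if state is CarState.STOPPED:
        return "SAFE TO CROSS"
    if state is CarState.YIELDING:
        return "SLOWING - WAIT"
    return "DO NOT CROSS"

print(pedestrian_signal(CarState.STOPPED, True))   # SAFE TO CROSS
print(pedestrian_signal(CarState.CRUISING, True))  # DO NOT CROSS
```

The point of the sketch is that intent is a deliberate output of the system, not something pedestrians should have to infer from the car's motion.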
As we begin to interact with objects embedded in our environment, and as that process becomes more human, we will become less tolerant of delays.
FACE SPEED
HUMANIZED OBJECTS WILL BE EXPECTED TO RESPOND QUICKLY
HUMAN SENSES
SIGHT SOUND TOUCH SMELL TASTE
Proprioception – Senses that perceive the body's own position, motion, and state
• Temperature • Direction • Balance • Pain • Kinesthetic
Interoception Internal senses that are normally stimulated from within the body
• Tension • Pressure • Stretch • Itch • Chemoreceptors • Thirst • Hunger
Perception – Not based on a specific sensory organ, e.g. time
+ Body Language Considerations: Gesture, Intent and Face Speed
SIGHT (Ophthalmoception) – Technically two senses, given the two distinct types of receptors present: one for color (cones) and one for brightness (rods).
EYES Inputs • Cameras • Contacts • Eye Implant
EYES Outputs • Screens
(TVs, Tablets, Phones) • Glass
(Windows, Mirrors, Visors)
• Eye Wear (Glasses, Goggles)
• Projectors
EYES Application • GUI and NUI • AR/VR/MR • Facial Recognition • Mood • Emotional Resonance • Image/video/object
recognition • Gesture Control • Heat Mapping • Night Vision • Eye Tracking
SOUND (Audioception) – Detecting vibrations along some medium, such as air or water, that is in contact with your eardrums.
MOUTH Inputs • Microphone • Sonar
EAR Outputs • Speaker • Headphones • Hearing Aid
EARS & MOUTH Application • Voice Simulation • Text to Speech • Translation • Recording • Noise Cancellation • Volume
TOUCH (Tactioception) – The body's ability to feel physical sensations. This sense spans several distinct modalities, including pressure, temperature, pain, and even itch.
SKIN Inputs • Tactile Perception • Texture • Pressure • Temperature • Motion/Movement
SKIN Outputs • Haptic Feedback • Vibrations
SKIN Application • Clothing • Wearables • Tattoos • Implants • Navigation
Tech Tats Sentari
SMELL (Olfacoception) – Yet another of the senses that works off of a chemical reaction. This sense combines with taste to produce flavors.
NOSE Inputs • Sensors • Digital Device
NOSE Outputs • Digital Scent • Breath Analysis • Freshness Analysis • Health Detection • Odor Diffusion
NOSE Application • Clothing • Respirator • Breathable Wear • Nose Ring • Nose Plug • Connected Home
TASTE (Gustaoception) – Sometimes argued to be five senses by itself due to the differing types of taste receptors (sweet, salty, sour, bitter, and umami), but generally referred to as one sense.
MOUTH Inputs • Sensors • Tooth Implant • e-cigarette • Ingestable • Eating Utensils
MOUTH Outputs • Taste Sensing • Taste Simulator • Teach Cooking
MOUTH Application • Taste Profile • Meal Creation • Diet Recommendation • Medical
MEASURING THE MOOD OF AN EVENT
We collected data from each individual portrait, live tweets from the event, and facial-detection camera data, and analyzed it with Watson User Modeling. We represented this analysis through an evolving animated visual that details the Big Five personality traits.
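Combining per-source personality readings into one event-level "mood" can be sketched as a weighted average of Big Five trait scores across the three data sources. All scores and weights below are fabricated for illustration; the actual analysis used Watson User Modeling, whose API is not reproduced here.

```python
# Sketch: blending per-source Big Five scores into one event-level mood.
# Every score and weight is invented for illustration.

BIG_FIVE = ["openness", "conscientiousness", "extraversion",
            "agreeableness", "neuroticism"]

def blend_scores(sources, weights):
    """Weighted average of trait scores (0..1) across data sources."""
    total = sum(weights.values())
    return {
        trait: sum(weights[s] * scores[trait] for s, scores in sources.items()) / total
        for trait in BIG_FIVE
    }

# Hypothetical per-source trait scores:
sources = {
    "portraits": {t: 0.6 for t in BIG_FIVE},
    "tweets":    {t: 0.4 for t in BIG_FIVE},
    "cameras":   {t: 0.5 for t in BIG_FIVE},
}
weights = {"portraits": 1.0, "tweets": 2.0, "cameras": 1.0}

print(blend_scores(sources, weights))  # each trait = (0.6 + 0.8 + 0.5) / 4 = 0.475
```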
STATE FARM POCKET AGENT
• We gave Watson eyes: using facial detection, we tracked various actions from visitors near the tasting booth, like smiling and attention.
• Individual conversations: we built a custom iPad app that allowed Watson to get to know visitors through a short, engaging conversation, using speech-to-text technology.
• Concept expansion: we analyzed the Twitter feeds of event visitors to generate a list of topics, along with each individual's interest level in each topic.
WHAT WE DELIVERED
Chaotic Moon Confidential
GIVING WATSON EYES
We will use facial detection to track various actions from visitors near the tasting booth, like smiling and attention. By utilizing this third-party technology, we have embraced the natural extensibility of the Watson platform, combining an unrelated technology with object detection to give Watson vision into the physical world.
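Tallying the tracked actions can be sketched as a simple aggregation over per-frame detection records. The frame data below is fabricated; a real pipeline would consume the output of a facial-detection SDK rather than hand-written dictionaries.

```python
# Illustrative sketch: counting facial-detection events (smiles, attention)
# across a stream of per-frame detections. The frames are fabricated data.

from collections import Counter

def summarize(frames):
    """Count how many frames showed each tracked expression."""
    counts = Counter()
    for frame in frames:
        if frame.get("smiling"):
            counts["smiles"] += 1
        if frame.get("attending"):
            counts["attention"] += 1
    return counts

frames = [
    {"smiling": True,  "attending": True},
    {"smiling": False, "attending": True},
    {"smiling": True,  "attending": False},
]
print(summarize(frames))  # 2 smiles, 2 attention frames
```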
COLLECTIVE PORTRAIT
GET IN TOUCH
AUSTIN OFFICE: 319 Congress, Suite 200, Austin, TX 78701
EMAIL: greg@chaoticmoon.com
PHONE: 512.420.8800
CHAOTICMOON.COM