VSMM 2016 Keynote: Using AR and VR to create Empathic Experiences
USING AR AND VR TO CREATE EMPATHIC EXPERIENCES
Mark Billinghurst
[email protected]
October 19th 2016
VSMM 2016
Hiroshi Ishii – AWE 2014
1977 – Star Wars
Augmented Reality
Augmented Reality
• Defining Characteristics
  • Combines real and virtual images: both can be seen at the same time
  • Interactive in real time: the virtual content can be interacted with
  • Registered in 3D: virtual objects appear fixed in space
ARToolKit (1999)
First open source AR SDK http://artoolkit.sourceforge.net/
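As a flavor of how marker tracking works, here is a minimal sketch of an ARToolKit-style detect-and-register loop, written against OpenCV's ArUco module as a stand-in rather than the ARToolKit API itself (ArUco names changed in OpenCV 4.7; this follows the earlier contrib interface, and the camera intrinsics are placeholder values):

import cv2
import numpy as np

# Placeholder intrinsics; a real system loads these from camera calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        # Pose = rotation + translation of each marker relative to the camera:
        # the "registered in 3D" property that keeps virtual content fixed in space.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, 0.05, camera_matrix, dist_coeffs)  # assumes 5 cm markers
        for rvec, tvec in zip(rvecs, tvecs):
            # Stand-in for rendering virtual content: draw the marker's 3D axes
            # over the live video, combining real and virtual in real time.
            cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs, rvec, tvec, 0.05)
    cv2.imshow("AR", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()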
Shared Space (1999)
• Face-to-face interaction, Tangible AR metaphor
• Easy collaboration with strangers
• Users acted the same as if handling real objects
MagicBook (2001)
• First AR story book
• Transitional AR to VR interface
AR Tennis (2005)
• First collaborative AR game on mobile phone
Mobile AR Advertising (2007)
• First mobile AR ad campaign (Saatchi & Saatchi)
GeoBoids (2012)
• Outdoor AR game
• Capture creatures
• Use whistling to attract creatures
• Google Glass/handheld AR
GeoBoids Demo
https://www.youtube.com/watch?v=x3MKR3xVRM0
Pokemon GO..
Pokemon GO Effect
• Fastest app to reach $500 million in revenue, only 63 days after launch; over $1 billion in 6 months
• Over 500 million downloads, more than 25 million DAU
• Nintendo stock price up 50% (a gain of $9 billion USD)
AR Business Today
• Around $600 million USD in 2014 (over $2B in 2015)
• 70-80+% games and marketing
“Do you want to sell sugar water for the rest of your life or do you want to come with me and change the world?"
Steve Jobs to John Sculley 1983
Mark’s Midlife Crisis
• Got married
• Took a sabbatical at Google
• Looked for new opportunities
• Resigned my job
• Moved to a new country
• Now creating a new research group
From Hiroshi Ishii
Interaction Technology
[Chart: interaction becoming more natural over time – punch card (1950), keyboard (1960), mouse (1980), speech (1990), gesture (2000), emotion (2010), and eventually thought]
Physiological Sensing
• Emotiv, Empatica
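As a toy illustration of what physiological sensing feeds into, here is a sketch that flags arousal events in a galvanic skin response (GSR) stream; the sample stream and threshold are hypothetical, and real devices such as the Empatica E4 expose their own streaming APIs:

from collections import deque

def arousal_events(samples, window=32, threshold=0.15):
    """Yield sample indices where GSR rises sharply above its recent baseline."""
    recent = deque(maxlen=window)
    for i, value in enumerate(samples):
        if len(recent) == window:
            baseline = sum(recent) / window
            if value - baseline > threshold:
                yield i  # a candidate arousal event
        recent.append(value)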
Interaction Technology
[Chart repeated, annotated: punch card through gesture are explicit interaction; emotion and thought are implicit]
Content Capture
[Chart: content capture becoming more realistic over time – photo (1850), film (1900), live video (1940), panorama (1990), 360 video (2000), 3D space capture (2010)]
Live Streaming/3D Space Capture
• Google Project Tango, Samsung Project Beyond
Content Capture
[Chart repeated, annotated: moving from 2D static capture toward immersive, live experience capture]
Networking
[Chart: network bandwidth (log scale) over time – 100 b/s (1980), 10 Kb/s, 1 Mb/s, 100 Mb/s (2005) and beyond]
Network Innovation
Universal Connectivity
Networking
[Chart repeated, annotated: rising bandwidth enables richer media – text, then audio, then video, approaching natural communication]
Holoportation
• Augmented Reality + 3D capture + high bandwidth
• http://research.microsoft.com/en-us/projects/holoportation/
Holoportation Demo
https://www.youtube.com/watch?v=7d59O6cfaM0
Natural Collaboration
Implicit Understanding
Experience Capture
Empathic Computing
EMPATHIC COMPUTING
Systems that Create and Share Understanding
Empathy
“Seeing with the Eyes of another,
Listening with the Ears of another,
and Feeling with the Heart of another.”
Alfred Adler
Empathic Computing
1. Understanding: Systems that can understand your feelings and emotions
2. Experiencing: Systems that help you better experience the world of others
3. Sharing: Systems that help you better share the experience of others
Sensors
VR
AR
1. Understanding: Affective Computing
• Rosalind Picard – MIT Media Lab
• Systems that recognize emotion
Appliances That Make You Happy
• Jun Rekimoto – University of Tokyo/Sony CSL
• Smile detection + smart appliances
Happiness Counter Demo
https://vimeo.com/29169237
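A hedged sketch of the smile-detection half of such a system, using OpenCV's stock Haar cascades (the detector used in the actual Happiness Counter is not specified here, and the appliance hook in the comment is illustrative):

import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def is_smiling(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        # A high minNeighbors value keeps false smile detections down.
        if len(smile_cascade.detectMultiScale(roi, 1.7, 20)) > 0:
            return True  # e.g. let the smart appliance respond to the smile
    return False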
2. Experiencing: Virtual Reality
"Virtual reality offers a whole different medium to tell stories that really connect people and create an empathic connection."
Nonny de la Peña http://www.emblematicgroup.com/
Using VR for Empathy
• Project Syria (USC, 2014)
  • The experience of terrorism
• Project Homeless (2015)
  • The experience of homelessness
Demo: Project Syria
https://www.youtube.com/watch?v=Uuszow5giaQ
CHILDHOOD
• Kenji Suzuki, University of Tsukuba
• What does it feel like to be a child?
• VR display + cameras moved to a child's eye height + hand restrictors
CHILDHOOD Demo
https://vimeo.com/128641932
3. Sharing
Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?
"Movies are like a machine that generates empathy."
Roger Ebert
Technical Requirements
• Basic requirements
  • Make the technology transparent
  • Wearable, unobtrusive
• Technology for transmitting
  • Sights, sounds, feelings of another
  • Audio, video, physiological sensors
Wearable AR for Empathic Interfaces
• Wearable AR can:
  • Be unobtrusive
  • Capture emotion
  • Share sights and sounds
  • Provide two-way communication
  • Enhance interaction in the real world
Changing Perspective
• CamNet (1992)
  • British Telecom
  • Wearable teleconferencing: audio, video
  • Remote collaboration: sends task-space video
• Similar CMU study (1996)
  • Cut performance time in half
WACL: Remote Expert Collaboration
• Wearable camera/laser pointer
• Independent pointer control
• Remote panorama view
WACL: Remote Expert Collaboration
• Remote expert view
  • Panorama viewing, annotation, image capture
Kurata, T., Sakata, N., Kourogi, M., Kuzuoka, H., & Billinghurst, M. (2004). Remote collaboration using a shoulder-worn active camera/laser. In Proceedings of the Eighth International Symposium on Wearable Computers (ISWC 2004), Vol. 1, pp. 62-69.
Example: Google Glass
• Camera + processing + display + connectivity
• Ego-vision collaboration (but with a fixed view)
Social Panoramas (ISMAR 2014)
• Capture and share social spaces in real time
• Supports independent views into the panorama
Implementation
• Google Glass
  • Captures a live image panorama (compass + camera)
• Remote device (tablet)
  • Immersive viewing, live annotation
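As a rough sketch of the panorama side, OpenCV's stitcher can stand in for the Glass capture pipeline (the real system built the panorama incrementally from compass plus camera data rather than batch-stitching a set of frames):

import cv2

def build_panorama(frames):
    """Stitch a list of overlapping BGR frames into one shareable panorama."""
    stitcher = cv2.Stitcher_create()
    status, pano = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return pano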
Interface
Glass View
Tablet View
Social Panorama
https://www.youtube.com/watch?v=vdC0-UV3hmY
Lessons Learned
• Good
  • Communication was easy and natural
  • Users enjoyed having view independence
  • Capturing a panorama on Glass was very natural
  • Sharing the panorama enhanced the shared experience
• Bad
  • Difficult to support equal input
  • Need to provide awareness cues
JackIn – Live Immersive Video Streaming
• Jun Rekimoto – University of Tokyo/Sony CSL
JackIn Hardware
• Wide-angle cameras – 360-degree video capture
• Live video stitching
JackIn Demo
https://www.youtube.com/watch?v=mxFuQcYL4D8
Joseph Tame – Tokyo Marathon
• Live streaming from the Tokyo Marathon
• http://josephta.me/en/tokyo-marathon/
Capturing Space: Real World Capture
• Hands-free AR
• Portable scene capture (color + depth)
  • Projector/Kinect combo, remote-controlled pan/tilt
• Remote expert annotation interface
Remote Expert View
Example: CoSense (CHI 2015)
• Real-time sharing: emotion, video, and audio
• Wearable (sender): sends emotion and view
• Desktop (receiver): sees remote view and emotion
Google Glass + e-Health 2.0 board
Implementation
Data Capture → Feature Detection → Emotion Recognition → Emotion Representation → Empathic User Interface (sketched below)
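A minimal sketch of this pipeline, with illustrative thresholds and labels that are assumptions rather than values from the CoSense system:

def extract_features(gsr_samples, heart_rate_bpm):
    """Raw physiological capture -> simple summary features."""
    return {
        "gsr_mean": sum(gsr_samples) / len(gsr_samples),
        "hr": heart_rate_bpm,
    }

def classify_emotion(features):
    """Features -> a coarse emotion label; thresholds are illustrative."""
    aroused = features["gsr_mean"] > 2.0 or features["hr"] > 95
    return "excited" if aroused else "calm"

def representation(label):
    """Map the label to a simple glyph the Glass wearer can glance at."""
    return {"excited": "(!)", "calm": "(~)"}[label]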
Hardware
User Interface
Wearable Interface
• Google Glass + e-Health + Spydroid + SSI
• Measures GSR, pulse oximetry, ECG, voice pitch
• Shares video and audio remotely
• Representative emotions sent back to the Glass user
Desktop Interface
Lessons Learned
• Good
  • System was wearable
  • Sender and receiver mirrored emotion
  • Minimal cues provided the best experience
• Bad
  • System delays
  • Need for a good stimulus
  • Difficult to represent emotion
Gaze and Video Conferencing
• Gaze tracking
  • Implicit communication cue
  • Shows intent
• Task-space collaboration
  • HMD + camera + gaze tracker
• Expected results
  • Gaze cues reduce the need for communication
  • Allow the remote collaborator to respond faster
Equipment
• Custom eye-tracker
• Head-mounted camera
• Head-mounted display
Experiment Setup
• Lego assembly
• Two assembly areas
• Remote expert
Experiment Design
• 4 conditions varying eye-tracking/pointer support
• 13 pairs of subjects
• Measures
  • Performance time
  • Likert-scale ratings, rankings, user preference
Task Performance
[Chart: performance time in seconds per condition]
Ranking
[Chart: median ranking values per condition]
Key Results
• Both the pointer and eye-tracking visual cues helped participants perform significantly faster
• The pointer cue significantly improved the perceived quality of collaboration and co-presence
• Eye tracking improved collaboration quality and the sense of being focused for local users, and enjoyment for remote users
• The Both condition was ranked best for user experience, while the None condition was worst
Empathy Glasses (CHI 2016)
• Combines eye tracking, a display, and face-expression sensing
• Implicit cues: eye gaze, face expression
Pupil Labs + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
AffectiveWear – Emotion Glasses
• Photo sensors to recognize expression
• User calibration
• Machine learning
• Recognizes 8 face expressions
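A sketch of the calibrate-then-classify idea, assuming a small k-NN classifier over raw photo-sensor readings (the sensor count, label set, and classifier choice here are assumptions, not details from the AffectiveWear paper):

from sklearn.neighbors import KNeighborsClassifier

# Illustrative set of 8 expressions; the paper's exact labels may differ.
EXPRESSIONS = ["neutral", "smile", "laugh", "frown", "angry",
               "surprised", "sad", "wink"]

def calibrate(samples, labels):
    """samples: per-frame photo-sensor readings; labels: expression names."""
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(samples, labels)
    return clf

def recognize(clf, reading):
    """Classify one new sensor reading into an expression label."""
    return clf.predict([reading])[0]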
Integrated System
• Local user
  • Video camera
  • Eye tracking
  • Face expression
• Remote helper
  • Remote pointing
System Diagram
• Two monitors on the remote user's side
• Scene + emotion display
Empathy Glasses Demo
Ranking Results
"I ranked the (A) condition best, because I could easily point to communicate, and when I needed it I could check the facial expression to make sure I was being understood.”
[Charts: ranking results for Q2 (communication) and Q3 (understanding partner), shown for the HMD (local user) and computer (remote user) sides]
Lessons Learned
• Pointing really helps in remote collaboration
  • Makes the remote user feel more connected
• Gaze looks promising
  • Shows the context of what a person is talking about
  • Helps establish shared understanding/awareness
• Face expression
  • Used as an implicit cue to show comprehension
• Limitations
  • Limited implicit cues
  • The task was a poor emotional trigger
  • AffectiveWear needs improvement
Empathic VR Environments
• Player and viewer
  • Viewer slaved to the player
• Share emotional signals
  • Heart rate, GSR
• Remote affect measurement
Emotion Classification
• Classify emotions from physiological data
  • 8 emotional states
• Collaboration with Sensaura
  • http://www.sensauratech.com
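Sensaura's classifier is proprietary; as a stand-in, here is a sketch that maps valence/arousal estimates (each in [-1, 1]) to one of 8 emotion labels via circumplex-model octants, with an illustrative label set:

import math

# Octants counter-clockwise from positive valence; labels are illustrative.
OCTANTS = ["happy", "excited", "tense", "angry",
           "sad", "bored", "relaxed", "content"]

def emotion_label(valence, arousal):
    """Pick the octant of the (valence, arousal) point on the circumplex."""
    angle = math.degrees(math.atan2(arousal, valence)) % 360
    return OCTANTS[int(angle // 45)]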
Heart Rate for Different VR Apps
• Games: theBlu, BrookHaven, Night Café (quiet scene – scary scene – peaceful scene)
• Devices: HTC Vive, Empatica E4

Mean heart rate: theBlu 84.89 BPM | BrookHaven 98.23 BPM | Night Café 73.01 BPM
Sharing VR Experiences
• Player controls viewer position (not view)
• Measure and share physiological cues
[Figure: Player and Viewer views]
Demo Video
AR and VR for Empathic Computing
• VR systems are ideal for trying experiences:
  • Strong storytelling medium
  • Provide total immersion/3D experience
  • Easy to change virtual body scale and representation
• AR systems are ideal for live sharing:
  • Allow overlay on the real-world view/can share viewpoints
  • Support remote annotation/communication
  • Enhance real-world tasks
Looking to the Future
What’s Next?
Scaling Up
• Seeing the actions of millions of users in the world
• Augmentation at the city/country level
AR + Smart Sensors + Social Networks
• Track population at city scale (mobile networks)
• Match population data to external sensor data
• Mine data for applications
Example: MIT SENSEable City Lab
http://senseable.mit.edu/wikicity/rome/
Example: CSIRO WeFeel Tool
• Mining global Twitter feeds for emotion
• http://wefeel.csiro.au
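In the same spirit, a toy lexicon-based sketch of emotion mining over a tweet stream (WeFeel's actual vocabulary and pipeline are far richer; this lexicon is purely illustrative):

from collections import Counter

LEXICON = {
    "love": "joy", "happy": "joy", "goal": "joy",
    "afraid": "fear", "worried": "fear",
    "angry": "anger", "hate": "anger",
    "sad": "sadness", "miss": "sadness",
}

def emotion_counts(tweets):
    """Count emotion-word hits across a batch of tweet texts."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            if word in LEXICON:
                counts[LEXICON[word]] += 1
    return counts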
European Cup – July 10th
GOAL!
Research Challenges
• How to capture emotion?
• How to measure empathy?
• Interface/interaction models?
• How to communicate emotion?
• How to create strong empathic bonds?
• How to scale up to city/country level?
Potential Applications
• Education
• Sports training
• Rich life logging
• Remote meeting support
• Psychological treatments
• Virtual travel/entertainment
• Surrogate adventure tourism
• First responders (stress, team cohesion)
CONCLUSION
Harvard Grant Study
• $20 million, 75-year study
  • 268 Harvard graduates
  • 456 disadvantaged people
  • Led by George Vaillant
• What makes us happy?
  • The warmth of relationships throughout life has the greatest positive impact on "life satisfaction"
“The seventy-five years and twenty million dollars expended on the Grant Study points to a straightforward five-word conclusion: Happiness is love. Full stop.” George Vaillant
Conclusions
• Empathic Computing
  • Sharing what you see, hear and feel
• AR/VR enables empathic experiences
  • Removing the technology
  • Changing perspective
  • Sharing space/experience
• Many directions for future research