Slide 1

Next Generation User Interfaces
Virtual and Augmented Reality
Prof. Beat Signer
Department of Computer Science
Vrije Universiteit Brussel
beatsigner.com
April 15, 2024

Slide 2

Mixed Reality
▪ Reality–Virtuality continuum
  ▪ introduced by Paul Milgram and Fumio Kishino in 1994
▪ Merging of real and virtual environments
  ▪ physical and digital objects co-exist and interact in real time
  ▪ mixed reality is the spectrum between the real environment and the purely virtual environment
  ▪ augmented reality and augmented virtuality (e.g. digital twins)
[Figure: Reality–Virtuality continuum: Real Environment, Augmented Reality (AR), Augmented Virtuality (AV), Virtual Environment (VR); AR and AV form the Mixed Reality (MR) spectrum]

Slide 3

Virtual Reality
▪ Virtual Reality (VR) is an artificial environment which is experienced through sensory stimuli (e.g. sight or sound) provided by a computer and in which a user’s actions partially determine what happens in the environment
▪ Main issues
  ▪ create acceptable substitutes for real world objects or environments
  ▪ sense the virtual environment
  ▪ navigate through the virtual environment
  ▪ interact with the virtual environment
▪ Opportunities
  ▪ experience situations that would be too dangerous or expensive in the real world

Slide 4

The Sword of Damocles
▪ First virtual reality (VR) system and augmented reality (AR) head-mounted display (HMD)
  ▪ developed by Ivan Sutherland and his student Bob Sproull in 1968
  ▪ binocular display with simple stereoscopic view of wireframe rooms
  ▪ heavy HMD attached to a mechanical arm suspended from the ceiling
    - head tracking to change the perspective shown by the software

Slide 5

Virtual Reality Applications
▪ Architecture
  ▪ experience and navigate in virtual buildings
▪ Education
  ▪ visualisation and interaction with complex data
▪ Medicine
  ▪ training
  ▪ virtual robotic surgery
▪ Engineering
▪ Military
▪ Entertainment

Slide 6

Virtual Reality Applications …
▪ Sport
▪ Simulations
▪ Modelling
▪ Information Systems
▪ Fashion
▪ Games
▪ …

Slide 7

Immersion
▪ Perceptual immersion (physical immersion or sensory immersion) is the perception of being physically present in a non-physical virtual environment, created by the surrounding images, sound and other stimuli provided by the VR system
  ▪ panoramic 3D vision
  ▪ 3D surround sound
  ▪ touch and force feedback
  ▪ taste
  ▪ smell
  ▪ direct connection to the human nervous system

Slide 8

Non-Immersive Virtual Reality
▪ Non-immersive virtual environments show a real-time 3D environment on a desktop screen
  ▪ typical applications: CAD, simulations, 3D computer games, …
  ▪ also called "desktop virtual reality"
▪ Continuum from non-immersive to partially immersive and fully immersive systems

Slide 9

Virtual Reality Technologies
▪ Large screens
▪ Binocular Omni-Orientation Monitor (BOOM)
▪ Cave Automatic Virtual Environment (CAVE)
▪ Head-mounted Display (HMD)

Slide 10

Large Screens
▪ Large (panoramic) screen displays
  ▪ flat displays
  ▪ cylindrical displays (up to 360°)
  ▪ hemispherical or spherical displays
▪ Displays can be implemented with or without stereoscopy

Slide 11

BOOM
▪ Binocular Omni-Orientation Monitor
▪ Head-coupled stereoscopic display device
▪ Screens are housed in a box that is attached to a multi-link arm
  ▪ user looks into the box and can move the box to different positions
  ▪ head tracking is realised via sensors in the joints of the arm holding the box (see the forward kinematics sketch below)
[Image: http://www.umich.edu/~vrl/intro/AndreOnBoom.jpg]
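
The joint-sensor tracking mentioned above can be illustrated with a simple forward-kinematics computation: given the angles reported by the joint sensors and the known link lengths of the arm, the position of the viewing box (and thus the user's head) follows directly. Below is a minimal planar sketch in TypeScript; the interface, link lengths and angle values are illustrative assumptions, not the actual BOOM implementation.

```typescript
// Minimal 2D forward kinematics sketch for a two-link arm (hypothetical values).
// Each joint sensor reports an angle; link lengths are known from the arm's geometry.
interface Joint {
  angle: number;   // joint angle in radians, measured by the joint sensor
  length: number;  // length of the link attached to this joint (metres)
}

// Position of the end of the arm (the viewing box) relative to the arm's base.
function endEffectorPosition(joints: Joint[]): { x: number; y: number } {
  let x = 0, y = 0, theta = 0;
  for (const j of joints) {
    theta += j.angle;                 // accumulate rotations along the chain
    x += j.length * Math.cos(theta);
    y += j.length * Math.sin(theta);
  }
  return { x, y };
}

// Example: two 0.5 m links, joint angles 30° and -45° (illustrative only).
console.log(endEffectorPosition([
  { angle: Math.PI / 6, length: 0.5 },
  { angle: -Math.PI / 4, length: 0.5 },
]));
```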

Slide 12

CAVE
▪ A Cave Automatic Virtual Environment (CAVE) provides the illusion of immersion by projecting stereo images on the walls and floor of a room-sized cube
▪ Users wear stereo glasses
▪ Head tracking system continuously adjusts the stereo projection to the current position of the leading viewer (see the frustum sketch below)
  ▪ only one view for multiple users
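
How the projection is adjusted can be sketched as follows: for each wall, the renderer builds an asymmetric (off-axis) view frustum whose apex sits at the tracked eye position, so the image on the wall stays perspectively correct as the viewer moves. The TypeScript sketch below handles a single wall lying in the z = 0 plane; the setup and names are simplifying assumptions, not a specific CAVE API.

```typescript
// Off-axis frustum extents for one CAVE wall in the z = 0 plane (simplified sketch).
// The wall spans [xLeft, xRight] x [yBottom, yTop] in wall coordinates; the tracked
// eye sits at (eye.x, eye.y, eye.z) with eye.z > 0 in front of the wall.
interface Vec3 { x: number; y: number; z: number; }

function offAxisFrustum(
  wall: { xLeft: number; xRight: number; yBottom: number; yTop: number },
  eye: Vec3,
  near: number  // near-plane distance of the rendering camera
) {
  // Similar triangles: scale the wall extents (relative to the eye) down to the near plane.
  const s = near / eye.z;
  return {
    left:   (wall.xLeft   - eye.x) * s,
    right:  (wall.xRight  - eye.x) * s,
    bottom: (wall.yBottom - eye.y) * s,
    top:    (wall.yTop    - eye.y) * s,
    near,
  };
  // These extents feed a standard glFrustum-style projection matrix; for stereo the
  // same computation is done per eye, offset by half the interpupillary distance.
}

// Example: a 3 m x 3 m wall, eye 1.5 m in front of it, slightly off to the left.
console.log(offAxisFrustum(
  { xLeft: -1.5, xRight: 1.5, yBottom: 0, yTop: 3 },
  { x: -0.4, y: 1.7, z: 1.5 },
  0.1
));
```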

Slide 13

Head-mounted Display (HMD)
▪ A head-mounted display is a lightweight virtual reality device that the user wears to have video information directly displayed in front of their eyes
  ▪ one or two small displays (LCD, OLED) embedded in the helmet, glasses or visor
  ▪ lenses are used to give the perception that the images are coming from farther away
    - moves the virtual image to a distance that allows the eye to focus comfortably (see the worked example below)
  ▪ connected to a computer or stand-alone solutions such as the Meta Quest 3
[Image: Meta Quest 3 virtual reality (mixed reality) headset]
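
The effect of the lens can be quantified with the thin-lens equation. The numbers below are purely illustrative and not taken from any particular headset.

```latex
% Thin-lens equation: 1/f = 1/d_o + 1/d_i. Placing the display just inside the
% focal length (d_o < f) makes the lens form a magnified virtual image at
% distance |d_i| in front of the eye.
\[
  \frac{1}{d_i} = \frac{1}{f} - \frac{1}{d_o}
  \quad\Rightarrow\quad
  d_i = \frac{f\,d_o}{d_o - f}
\]
% Illustrative numbers: f = 40\,\mathrm{mm}, d_o = 38\,\mathrm{mm}
\[
  d_i = \frac{40 \cdot 38}{38 - 40}\,\mathrm{mm} = -760\,\mathrm{mm}
\]
% The negative sign indicates a virtual image, here about 0.76 m away, which the
% eye can focus on comfortably even though the physical display sits only a few
% centimetres from the face.
```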

Slide 14

Navigation and Interaction
▪ Virtual reality is about creating computer-generated scenes in which a user can navigate and interact
▪ Navigation is the ability to move around and explore the features of the virtual environment (3D scene)
  ▪ walk within a virtual building
▪ Interaction involves the selection and moving of objects in a scene
  ▪ open a virtual door
  ▪ move an atom
  ▪ …

Slide 15

VR Navigation Techniques
▪ Grabbing in the air
  ▪ user grabs points in the virtual world and drags and rotates them at will
▪ Lean-based velocity
  ▪ lean forwards or backwards to move in the virtual world (see the sketch after this list)
▪ Path drawing
  ▪ specify a path to be followed
▪ Walking in place
  ▪ might be supported via a treadmill
[Image: Virtuix Omni]
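
One way to realise lean-based velocity is to map the tracked head's horizontal offset from a calibrated rest position onto a forward or backward speed, with a small dead zone so that normal postural sway does not move the viewpoint. A minimal TypeScript sketch follows; the thresholds and gains are illustrative assumptions.

```typescript
// Lean-based velocity: map the head's forward/backward lean onto travel speed.
// leanOffset: tracked head position minus calibrated rest position, in metres
// (positive = leaning forwards). Returns a travel velocity in metres per second.
function leanVelocity(
  leanOffset: number,
  deadZone = 0.05,   // ignore small postural sway (metres)
  gain = 4.0,        // metres per second of travel per metre of lean
  maxSpeed = 3.0     // clamp to a comfortable maximum speed
): number {
  if (Math.abs(leanOffset) < deadZone) return 0;
  // Remove the dead zone so the velocity ramps up smoothly from zero.
  const effective = leanOffset - Math.sign(leanOffset) * deadZone;
  return Math.max(-maxSpeed, Math.min(maxSpeed, effective * gain));
}

// Example: a 15 cm forward lean yields (0.15 - 0.05) * 4 = 0.4 m/s forward travel.
console.log(leanVelocity(0.15));
```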

Slide 16

Video: Disney HoloTile Treadmill

Slide 17

VR Interaction Techniques
▪ Two types of interaction techniques
  ▪ non-immersive interaction
    - e.g. via mouse or joystick
  ▪ immersive interaction
    - e.g. using wearable devices or capturing of limb positions
▪ Gesture recognition
  ▪ datagloves
  ▪ optical, vision-based gesture recognition
[Image: Power Glove for Nintendo, Mattel, 1989]

Slide 18

VR Interaction Techniques …
▪ Interaction with the virtual world through
  ▪ virtual hand
    - map the position of a user’s hand to the virtual reality environment
  ▪ ray casting
    - a virtual light ray leaves the user’s hand and selects the first object that it hits (see the sketch after this list)
  ▪ image plane
    - an object is selected with a pointing gesture and manipulated via the other hand
[Image: Ray casting]
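
The ray-casting selection described above boils down to intersecting a ray (origin at the hand, direction along the pointing axis) with the objects in the scene and keeping the closest hit. Below is a self-contained TypeScript sketch that uses spheres as stand-in scene objects; the scene representation and names are illustrative assumptions.

```typescript
// Ray-casting selection: pick the first (closest) object hit by a ray from the hand.
interface Vec3 { x: number; y: number; z: number; }
interface Sphere { id: string; centre: Vec3; radius: number; }

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

// Distance along the (normalised) ray to the sphere, or null if the ray misses it.
function raySphere(origin: Vec3, dir: Vec3, s: Sphere): number | null {
  const oc = sub(origin, s.centre);
  const b = dot(oc, dir);
  const c = dot(oc, oc) - s.radius * s.radius;
  const disc = b * b - c;
  if (disc < 0) return null;            // ray misses the sphere
  const t = -b - Math.sqrt(disc);       // nearer of the two intersection points
  return t >= 0 ? t : null;             // ignore objects behind the hand
}

// Select the closest object hit by the ray leaving the user's hand.
// pointingDir must be a normalised direction vector.
function select(handPos: Vec3, pointingDir: Vec3, scene: Sphere[]): Sphere | null {
  let best: { obj: Sphere; t: number } | null = null;
  for (const obj of scene) {
    const t = raySphere(handPos, pointingDir, obj);
    if (t !== null && (best === null || t < best.t)) best = { obj, t };
  }
  return best ? best.obj : null;
}
```

In a real engine the scene objects would carry arbitrary geometry and the intersection test would come from the engine's physics or picking facilities; the closest-hit selection logic stays the same.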

Slide 19

Video: Haptic PIVOT

Slide 20

Augmented Reality
"Augmented Reality (AR) is a variation of Virtual Environments (VE), or Virtual Reality as it is more commonly called. VE technologies completely immerse a user inside a synthetic environment. While immersed, the user cannot see the real world around him. In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with the real world. Therefore, AR supplements reality, rather than completely replacing it. Ideally, it would appear to the user that the virtual and real objects coexisted in the same space […]"
Ronald T. Azuma, A Survey of Augmented Reality, Presence: Teleoperators and Virtual Environments, 6(4), 1997

Slide 21

Augmented Reality Applications
▪ Maintenance
▪ Architecture
▪ Education
▪ Medicine
▪ Entertainment
▪ Navigation
▪ Gaming
▪ Advertising
▪ …

Slide 22

Augmented Reality Techniques
▪ Video compositing
▪ Head-up displays
▪ Direct projection
▪ Magic lens metaphor
▪ Magic mirror metaphor
▪ Magic eyeglass metaphor

Slide 23

Video Compositing
▪ Virtual information is overlaid on a video stream of a real scene (see the compositing sketch below)
  ▪ can happen in real time or in post-processing
[Image: LiberoVision]
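
At its core, overlaying virtual content on a camera frame is per-pixel alpha compositing: where the rendered overlay is opaque it replaces the camera pixel, and semi-transparent overlay pixels are blended. A minimal TypeScript sketch operating on RGBA byte buffers (as used, for example, by an HTML canvas ImageData object); the frame layout is an assumption for illustration.

```typescript
// Composite a rendered RGBA overlay onto a camera frame of the same size.
// Both buffers hold width * height * 4 bytes in RGBA order (values 0..255).
function compositeOver(camera: Uint8ClampedArray, overlay: Uint8ClampedArray): Uint8ClampedArray {
  const out = new Uint8ClampedArray(camera.length);
  for (let i = 0; i < camera.length; i += 4) {
    const a = overlay[i + 3] / 255;            // overlay alpha for this pixel
    for (let c = 0; c < 3; c++) {
      // Standard "over" operator: overlay in front of the camera image.
      out[i + c] = overlay[i + c] * a + camera[i + c] * (1 - a);
    }
    out[i + 3] = 255;                          // resulting video frame is opaque
  }
  return out;
}
```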

Slide 24

Video: Augmented Live Music Performance

Slide 25

Head-up Displays (HUDs)
▪ Head-up displays are used in civil and military aircraft as well as in some cars
▪ The overlaid information is generally not directly connected to the objects seen through the window
  ▪ weak blend between virtuality and reality

Slide 26

SixthSense (Direct Projection)
▪ Wearable augmented reality interface
  ▪ small camera and projector
  ▪ developed at MIT Media Lab
    - Pranav Mistry and Pattie Maes
▪ Visionary wearable augmented reality system
  ▪ what happened to the SixthSense?

Slide 27

Video: SixthSense

Slide 28

Wikitude World Browser (Magic Lens)
▪ The WIKITUDE World Browser presents information about nearby physical landmarks as well as content added by other users
▪ Real-time augmentation of the mobile phone camera view
  ▪ location-based augmented reality based on GPS, compass and accelerometer (see the bearing sketch below)
▪ WIKITUDE SDK for augmented reality applications
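
The placement of a landmark label in such a location-based magic lens can be reduced to comparing the compass heading of the phone with the great-circle bearing from the phone's GPS position to the landmark; if the difference falls within the camera's horizontal field of view, the label is drawn at the corresponding screen offset. The simplified TypeScript sketch below assumes a flat screen mapping and an illustrative field-of-view value.

```typescript
// Location-based AR: decide where (and whether) to draw a landmark label on screen.
interface GeoPos { lat: number; lon: number; }  // degrees

const toRad = (d: number) => (d * Math.PI) / 180;
const toDeg = (r: number) => (r * 180) / Math.PI;

// Initial great-circle bearing from the observer to the landmark, in degrees from north.
function bearing(from: GeoPos, to: GeoPos): number {
  const phi1 = toRad(from.lat), phi2 = toRad(to.lat);
  const dLon = toRad(to.lon - from.lon);
  const y = Math.sin(dLon) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Horizontal screen position (0..screenWidth) of the landmark, or null if off screen.
function screenX(observer: GeoPos, heading: number, landmark: GeoPos,
                 fovDeg = 60, screenWidth = 1080): number | null {
  // Signed angle between where the camera points and where the landmark lies.
  let delta = bearing(observer, landmark) - heading;
  delta = ((delta + 540) % 360) - 180;       // normalise to [-180, 180)
  if (Math.abs(delta) > fovDeg / 2) return null;
  return (delta / fovDeg + 0.5) * screenWidth;
}
```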

Slide 29

Sky Map
▪ Mobile phone application (originally developed by Google) that can be used as a magic lens to get information about stars in the sky
▪ Real-time augmentation of the sky based on the mobile phone’s position (e.g. via GPS) and orientation

Slide 30

ARToolKit
▪ Tracking library to overlay virtual imagery
▪ Calculates real-time camera position and orientation relative to square physical markers (see the sketch below)
▪ Fast enough for real-time AR applications
▪ Free and open source
  ▪ https://github.com/artoolkitx
▪ Multiple spinoffs
  ▪ ARTag, MRToolkit, osgART, ARToolKitPlus
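
Conceptually, once a marker tracker such as ARToolKit has estimated the 4×4 transform from marker coordinates to camera coordinates, virtual content is placed by pushing its vertices through that transform (followed by the camera projection). The TypeScript sketch below illustrates only this placement step; `detectMarkerPose` is a hypothetical stand-in, not the actual ARToolKit API.

```typescript
// Place virtual points defined in marker coordinates into camera coordinates,
// given the 4x4 marker-to-camera transform estimated by a tracker per video frame.
type Mat4 = number[];          // 16 values, row-major, rigid transform assumed
type Vec3 = [number, number, number];

function transformPoint(m: Mat4, p: Vec3): Vec3 {
  const [x, y, z] = p;
  return [
    m[0] * x + m[1] * y + m[2]  * z + m[3],
    m[4] * x + m[5] * y + m[6]  * z + m[7],
    m[8] * x + m[9] * y + m[10] * z + m[11],
  ];
}

// Hypothetical tracker call (stand-in for a real marker-tracking library).
declare function detectMarkerPose(frame: ImageData): Mat4 | null;

function augmentFrame(frame: ImageData, cubeCorners: Vec3[]): Vec3[] | null {
  const pose = detectMarkerPose(frame);   // marker-to-camera transform, or null
  if (!pose) return null;                 // marker not visible in this frame
  // Express the virtual cube (modelled in marker coordinates, e.g. sitting on the
  // marker) in camera coordinates; projection to pixel coordinates would follow.
  return cubeCorners.map(c => transformPoint(pose, c));
}
```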

Slide 31

Video: IKEA Augmented Reality

Slide 32

Magic Mirror Metaphor
▪ Technically, the concept of a magic mirror is very similar to the magic lens, except for the orientation of the camera
▪ Typically used to overlay information on the user
  ▪ e.g. in fashion

Slide 33

Magic Eyeglass Techniques
▪ See-through head-mounted displays
▪ Virtual images mixed with a real view of the world
▪ Three kinds of see-through HMDs
  ▪ optical see-through HMDs
  ▪ video see-through HMDs
  ▪ virtual retinal displays

Slide 34

Optical See-through HMDs
▪ With optical see-through HMDs, the virtual images are produced on semi-transparent surfaces (LCD panels) or reflected on semi-transparent mirrors
  ▪ tracking of the head to create the virtual world
  ▪ examples: Microsoft HoloLens 2, Magic Leap 2
[Image: Microsoft HoloLens 2, https://niteeshyadav.com/blog/]

Slide 35

Video See-through HMDs
▪ With video see-through HMDs, real video images are captured by one or multiple video cameras installed in the unit and overlaid with computer graphics (virtual) images
  ▪ examples: Meta Quest 3, Apple Vision Pro, HTC Vive XR, Varjo XR-4
[Image: Apple Vision Pro, https://niteeshyadav.com/blog/]

Slide 36

Virtual Retinal Displays (VRD)
▪ Virtual retinal display (VRD) or retinal scan display (RSD)
▪ Projects three modulated light beams directly onto the retina of the eye, producing a rasterised image
▪ Illusion of seeing the source image like a conventional display floating in space in front of the eye
  ▪ example: Google Glass

Slide 37

Google Glass
▪ Android-based headset
  ▪ Bluetooth connection to mobile phone
  ▪ display with "picture in picture" experience
  ▪ camera and microphone
  ▪ voice and touch-based interaction
▪ What about using voice navigation in public space?
  ▪ do people for example use Siri in public space?
▪ What about privacy and safety?
  ▪ recording pictures and movies or recognising people’s faces
▪ What about the social acceptance of Google Glass?

Slide 38

Video: Looking Through Google Glass

Slide 39

Microsoft HoloLens 2
▪ Lenses to combine superimposed images
  ▪ optical waveguides
▪ Integrated processor
  ▪ all the processing happens on the HoloLens
▪ Depth camera and other sensors to "understand" the environment

Slide 40

Video: HoloLens 2 on Stage

Slide 41

Video: HoloLens 2 at Philips

Slide 42

Video: Dynamics 365 Remote Assist

Slide 43

WebXR
▪ New Web-based specification (W3C) for the development of VR as well as AR applications
  ▪ implement once and run on different platforms and devices
▪ Device API (see the session sketch after this list)
  ▪ see-through AR on mobile phones, VR headsets, AR headsets, …
▪ Room Tracking API
  ▪ detection of surfaces and walls
  ▪ inside-out or outside-in tracking
▪ API for native features
  ▪ controllers, hand tracking, spatial audio, haptic feedback, …
▪ You will get more details in the WebXR exercise session
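
To make the "implement once, run on different devices" idea concrete, the sketch below shows a minimal WebXR session setup in TypeScript. It assumes a browser with WebXR support and, for type checking, WebXR type definitions such as @types/webxr; the feature list and rendering details are illustrative.

```typescript
// Minimal WebXR Device API sketch: request an immersive session and run a frame loop.
async function startImmersiveVR(canvas: HTMLCanvasElement): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
    console.log('Immersive VR is not supported on this device or browser');
    return;
  }
  // The same code path could request 'immersive-ar' on see-through capable devices.
  const session = await navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking'],
  });

  // WebXR renders through a WebGL layer attached to the session.
  const gl = canvas.getContext('webgl2', { xrCompatible: true })!;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  // 'local' reference space: origin near the viewer's starting position.
  const refSpace = await session.requestReferenceSpace('local');

  // Per-frame loop: query the viewer pose and draw one view per eye.
  const onFrame = (_time: number, frame: XRFrame) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      for (const view of pose.views) {
        // view.projectionMatrix and view.transform drive the per-eye rendering here.
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```

Because the same session and frame-loop code runs on phones, VR headsets and AR headsets, only the session mode and the rendering of the views differ between target devices.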

Slide 44

The Future of Augmented Reality?
▪ New forms of less distracting augmentations
  ▪ e.g. augmented contact lenses
▪ Improvements in tracking technologies
  ▪ e.g. see Microsoft HoloLens
▪ What about the social acceptance?
[Image: B.A. Parviz, University of Washington]

Slide 45

References
▪ P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays, IEICE Transactions on Information Systems, December 1994
  ▪ https://www.researchgate.net/publication/231514051_A_Taxonomy_of_Mixed_Reality_Visual_Displays
▪ SixthSense
  ▪ https://www.youtube.com/watch?v=nZ-VjUKAsao
▪ Looking Through Google Glass: Real Life Example
  ▪ https://www.youtube.com/watch?v=d-y3bEjEVV8
▪ Place IKEA Furniture in Your Home
  ▪ https://www.youtube.com/watch?v=vDNzTasuYEw

Slide 46

References …
▪ R. T. Azuma, A Survey of Augmented Reality, Presence: Teleoperators and Virtual Environments, 6(4), 1997
  ▪ https://www.cs.unc.edu/~azuma/ARpresence.pdf
▪ Augmented Reality During Live Music Performances
  ▪ https://www.youtube.com/watch?v=nyVs_5TfN4c
▪ L. Hoste and B. Signer, Expressive Control of Indirect Augmented Reality During Live Music Performances, Proceedings of NIME 2013, 13th International Conference on New Interfaces for Musical Expression, Daejeon, Republic of Korea, May 2013
  ▪ https://beatsigner.com/publications/hoste_NIME2013.pdf

Slide 47

References …
▪ Haptic PIVOT by Microsoft Research
  ▪ https://www.youtube.com/watch?v=kj3RdeJUJos
▪ Microsoft HoloLens 2 on Stage
  ▪ https://www.youtube.com/watch?v=uIHPPtPBgHk
▪ Microsoft HoloLens 2 at Philips
  ▪ https://www.youtube.com/watch?v=3dHBBdRf9VE
▪ Dynamics 365 Remote Assist and HoloLens 2
  ▪ https://www.youtube.com/watch?v=d3YT8j0yYl0
▪ Interview with Fabien Bénétou on WebXR
  ▪ https://www.inthepocket.com/blog/our-conversation-with-fabien-benetou

Slide 48

References …
▪ Disney HoloTile
  ▪ https://www.youtube.com/watch?v=68YMEmaF0rs

Slide 49

Next Lecture: Data Physicalisation