WebXR: Beyond WebGL

yomotsu

November 25, 2018

Transcript

  1. WebXR: Beyond WebGL (HTML5 Conference 2018 | 2018-11-25)

  2. Frontend Engineer at PixelGrid, Inc. Akihiro Oyamada @yomotsu

  3. https://www.youtube.com/watch?v=ttDLGV5IJCM

  4. This will be possible on every mobile phone

  5. 1. WebXR Device API

  6. • The API for both AR and VR (the WebVR API will be replaced by the WebXR Device API) • Close relationship with Khronos' OpenXR WebXR Device API
  7. • Currently only available in Chrome Canary with flags • Only works over https or on localhost As of Nov. 2018
  8. Connect your Android Chrome to localhost with DevTools on desktop

  9. FYI: an Origin Trials token is required except on localhost https://github.com/GoogleChrome/OriginTrials/blob/gh-pages/developer-guide.md
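Given how fragile availability is at this point, it may help to feature-detect before calling into the API. A minimal sketch of such a check (the helper name is made up, and the navigator-like object is passed in as a parameter so the logic is easy to test outside a browser):

```javascript
// Hypothetical helper: report which XR API (if any) a navigator-like object
// exposes, without actually calling into it. As of late 2018 this
// distinguishes the new WebXR Device API from the older WebVR API.
function detectXRApi(nav) {
  if (nav.xr && typeof nav.xr.requestDevice === 'function') return 'webxr';
  if (typeof nav.getVRDisplays === 'function') return 'webvr';
  return 'none';
}
```

In a page you would call `detectXRApi( navigator )` before attempting `requestDevice()`, and fall back to a non-XR experience when it returns 'none'.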

  10. None
  11. None
  12. Consists of 2 sources

  13. WebGL scene (under your control) XR imagery (camera input) WebGL framebuffer
  14. WebGL framebuffer

  15. How does it work?

  16. How does it work? const device = await navigator.xr.requestDevice();

  18. XRSession device.requestSession( { immersive: false, outputContext: xrContext, environmentIntegration: true, } );
  19. XRSession const glCanvas = document.createElement( 'canvas' ); const gl = glCanvas.getContext( 'webgl', { xrCompatible: true } ); xrSession.baseLayer = new XRWebGLLayer( xrSession, gl ); WebGL context to be used as a source for XR imagery
  20. The framebuffer can be filled with XR imagery gl.bindFramebuffer( gl.FRAMEBUFFER, xrSession.baseLayer.framebuffer );
  21. Draw your 3D scene on it

  22. None
  23. frameOfReference Real-world coordinates

  24. XRFrame (delta) frameOfReference

  25. xrFrame.views[n].projectionMatrix const pose = xrFrame.getDevicePose( frameOfRef ); pose.getViewMatrix( view ) frameOfReference XRFrame (delta) Position and Orientation
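A side note on these matrices: the view matrix returned by pose.getViewMatrix( view ) is the inverse of the camera's world transform, which is why the three.js code later applies its inverse to camera.matrix. For a rigid transform that inverse is cheap, and as an illustration the device position can be read straight out of a column-major, 16-element view matrix. This is a hypothetical helper for intuition, not part of the API:

```javascript
// For a rigid view matrix V = [R | t] (column-major, as WebGL/WebXR use),
// the camera world position is p = -R^T * t, where R is the 3x3 rotation
// part (elements 0-2, 4-6, 8-10) and t is the translation (elements 12-14).
function cameraPositionFromViewMatrix(m) {
  const t = [m[12], m[13], m[14]];
  return [
    -(m[0] * t[0] + m[1] * t[1] + m[2] * t[2]),   // row 0 of R^T = column 0 of R
    -(m[4] * t[0] + m[5] * t[1] + m[6] * t[2]),   // row 1 of R^T = column 1 of R
    -(m[8] * t[0] + m[9] * t[1] + m[10] * t[2]),  // row 2 of R^T = column 2 of R
  ];
}
```

For example, a camera sitting at (0, 0, 2) with no rotation produces a view matrix that translates by (0, 0, -2), and the helper recovers (0, 0, 2) from it.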
  26. Demo

  27. Dive into the code!

  28. const width = window.innerWidth; const height = window.innerHeight; navigator.xr.requestDevice().then( ( device ) => { const outputCanvas = document.getElementById( 'xrCanvas' ); outputCanvas.width = width; outputCanvas.height = height; const xrContext = outputCanvas.getContext( 'xrpresent' ); // session request must be done in user action such as click window.addEventListener( 'click', onEnterAR ); async function onEnterAR() { const xrSession = await device.requestSession( { outputContext: xrContext, environmentIntegration: true, } ); const renderer = new THREE.WebGLRenderer(); renderer.autoClear = false; renderer.setSize( width, height );
  31. window.addEventListener( 'click', onEnterAR ); async function onEnterAR() { const xrSession = await device.requestSession( { outputContext: xrContext, environmentIntegration: true, } ); const renderer = new THREE.WebGLRenderer(); renderer.autoClear = false; renderer.setSize( width, height ); // bind gl context to XR session const gl = renderer.getContext(); gl.setCompatibleXRDevice( xrSession.device ); xrSession.baseLayer = new XRWebGLLayer( xrSession, gl ); const scene = new THREE.Scene(); const camera = new THREE.PerspectiveCamera(); camera.matrixAutoUpdate = false; const box = new THREE.Mesh( new THREE.BoxBufferGeometry( .2, .2, .2 ), new THREE.MeshNormalMaterial()
  34. xrSession.baseLayer = new XRWebGLLayer( xrSession, gl ); const scene = new THREE.Scene(); const camera = new THREE.PerspectiveCamera(); camera.matrixAutoUpdate = false; const box = new THREE.Mesh( new THREE.BoxBufferGeometry( .2, .2, .2 ), new THREE.MeshNormalMaterial() ); scene.add( box ); const frameOfRef = await xrSession.requestFrameOfReference( 'eye-level' ); xrSession.requestAnimationFrame( onDrawFrame ); function onDrawFrame( timestamp, xrFrame ) { const session = xrFrame.session; // xrSession === xrFrame.session const pose = xrFrame.getDevicePose( frameOfRef ); session.requestAnimationFrame( onDrawFrame ); gl.bindFramebuffer( gl.FRAMEBUFFER, session.baseLayer.framebuffer );
  35. scene.add( box ); const frameOfRef = await xrSession.requestFrameOfReference( 'eye-level' ); xrSession.requestAnimationFrame( onDrawFrame ); function onDrawFrame( timestamp, xrFrame ) { const session = xrFrame.session; // xrSession === xrFrame.session const pose = xrFrame.getDevicePose( frameOfRef ); session.requestAnimationFrame( onDrawFrame ); gl.bindFramebuffer( gl.FRAMEBUFFER, session.baseLayer.framebuffer ); if ( ! pose ) return; // if the session is for both right and left eyes, the length of views would be 2; if not, the length is 1. xrFrame.views.forEach( ( view ) => { const viewport = session.baseLayer.getViewport( view ); renderer.setSize( viewport.width, viewport.height );
  37. function onDrawFrame( timestamp, xrFrame ) { const session = xrFrame.session; // xrSession === xrFrame.session const pose = xrFrame.getDevicePose( frameOfRef ); session.requestAnimationFrame( onDrawFrame ); gl.bindFramebuffer( gl.FRAMEBUFFER, session.baseLayer.framebuffer ); if ( ! pose ) return; // if the session is for both right and left eyes, the length of views would be 2; if not, the length is 1. xrFrame.views.forEach( ( view ) => { const viewport = session.baseLayer.getViewport( view ); renderer.setSize( viewport.width, viewport.height ); camera.projectionMatrix.fromArray( view.projectionMatrix ); const viewMatrix = new THREE.Matrix4().fromArray( pose.getViewMatrix( view ) ); camera.matrix.getInverse( viewMatrix ); camera.updateMatrixWorld( true ); renderer.clearDepth(); renderer.render( scene, camera );
  39. if ( ! pose ) return; // if the session is for both right and left eyes, the length of views would be 2; if not, the length is 1. xrFrame.views.forEach( ( view ) => { const viewport = session.baseLayer.getViewport( view ); renderer.setSize( viewport.width, viewport.height ); camera.projectionMatrix.fromArray( view.projectionMatrix ); const viewMatrix = new THREE.Matrix4().fromArray( pose.getViewMatrix( view ) ); camera.matrix.getInverse( viewMatrix ); camera.updateMatrixWorld( true ); renderer.clearDepth(); renderer.render( scene, camera ); } ); } } } );
  40. Hit test

  41. source: https://codelabs.developers.google.com/codelabs/ar-with-webxr/index.html#4 Find the intersection point of a ray with a real-world surface
  42. Demo

  43. The code

  44. const width = window.innerWidth; const height = window.innerHeight; const startButton = document.getElementById( 'startButton' ); navigator.xr.requestDevice().then( ( device ) => { const outputCanvas = document.getElementById( 'xrCanvas' ); outputCanvas.width = width; outputCanvas.height = height; const xrContext = outputCanvas.getContext( 'xrpresent' ); // session request must be done in user action such as click startButton.addEventListener( 'click', onEnterAR ); async function onEnterAR() { startButton.style.display = 'none'; const xrSession = await device.requestSession( { The same as the previous one…
  45. renderer.render( scene, camera ); } ); } window.addEventListener( 'click', onClick ); // use Raycaster to make ray origin and direction const raycaster = new THREE.Raycaster(); // onClick must be async, since hitTest will be done with await async function onClick() { const x = 0; const y = 0; raycaster.setFromCamera( { x, y }, camera ); const origin = new Float32Array( raycaster.ray.origin.toArray() ); const direction = new Float32Array( raycaster.ray.direction.toArray() ); const hits = await xrSession.requestHitTest( origin, direction, frameOfRef ); if ( hits.length ) { Add a click action
  47. window.addEventListener( 'click', onClick ); // use Raycaster to make ray origin and direction const raycaster = new THREE.Raycaster(); // onClick must be async, since hitTest will be done with await async function onClick() { const x = 0; const y = 0; raycaster.setFromCamera( { x, y }, camera ); const origin = new Float32Array( raycaster.ray.origin.toArray() ); const direction = new Float32Array( raycaster.ray.direction.toArray() ); const hits = await xrSession.requestHitTest( origin, direction, frameOfRef ); if ( hits.length ) { const hit = hits[ 0 ]; const hitMatrix = new THREE.Matrix4().fromArray( hit.hitMatrix ); const box = new THREE.Mesh(
  48. const raycaster = new THREE.Raycaster(); // onClick must be async, since hitTest will be done with await async function onClick() { const x = 0; const y = 0; raycaster.setFromCamera( { x, y }, camera ); const origin = new Float32Array( raycaster.ray.origin.toArray() ); const direction = new Float32Array( raycaster.ray.direction.toArray() ); const hits = await xrSession.requestHitTest( origin, direction, frameOfRef ); if ( hits.length ) { const hit = hits[ 0 ]; const hitMatrix = new THREE.Matrix4().fromArray( hit.hitMatrix ); const box = new THREE.Mesh( new THREE.BoxBufferGeometry( .2, .2, .2 ), new THREE.MeshNormalMaterial() );
  50. raycaster.setFromCamera( { x, y }, camera ); const origin = new Float32Array( raycaster.ray.origin.toArray() ); const direction = new Float32Array( raycaster.ray.direction.toArray() ); const hits = await xrSession.requestHitTest( origin, direction, frameOfRef ); if ( hits.length ) { const hit = hits[ 0 ]; const hitMatrix = new THREE.Matrix4().fromArray( hit.hitMatrix ); const box = new THREE.Mesh( new THREE.BoxBufferGeometry( .2, .2, .2 ), new THREE.MeshNormalMaterial() ); box.position.setFromMatrixPosition( hitMatrix ); scene.add( box ); } } }
  52. Demo

  53. How about iOS?

  54. 2. AR Quick Look

  55. • Released with iOS 12 • Works exclusively in Safari (doesn't work even in iOS Chrome) • Special HTML syntax • Apple's proprietary AR Quick Look
  56. <a href="./3d-model.usdz" rel="ar"> <img src="./thumb.jpg" alt=""> </a>

  57. <a href="./3d-model.usdz" rel="ar"> <img src="./thumb.jpg" alt=""> </a> Link to usdz with rel="ar" Must contain one <img> or <picture>
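On browsers without AR Quick Look this markup degrades to a plain link, so it can be useful to feature-detect and offer a fallback. Support is reflected by relList.supports( 'ar' ) on an anchor element; a small sketch (the helper name is made up, and the anchor-like object is injected so the logic is testable; in a real page you would pass document.createElement( 'a' )):

```javascript
// Hypothetical helper: returns true when the given anchor element reports
// support for rel="ar" (AR Quick Look on iOS 12 Safari).
function supportsARQuickLook(anchor) {
  return !!(
    anchor.relList &&
    typeof anchor.relList.supports === 'function' &&
    anchor.relList.supports('ar')
  );
}
```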
  58. Demo

  59. USDZ

  60. • Stands for “Universal Scene Description” archived with Zip • 3D model format for AR Quick Look • Created by Apple and Pixar USDZ
  61. • With “usdz_converter”, an Xcode command-line tool (macOS only) • Vectary (web service), which cannot configure details such as size How to prepare USDZ
  62. • Up to 10M polygons • Up to 10 seconds of animation • Up to 2048×2048 texture size Limitations
  63. $ xcrun usdz_converter ./my-model.obj my-model.usdz -color_map albedo.jpg -metallic_map metallic.jpg -roughness_map roughness.jpg -normal_map normal.jpg -ao_map ao.jpg -emissive_map emissive.jpg USDZ Converter Terminal.app
  65. $ xcrun usdz_converter ./my-model.obj my-model.usdz -color_map albedo.jpg -metallic_map metallic.jpg -roughness_map roughness.jpg -normal_map normal.jpg -ao_map ao.jpg -emissive_map emissive.jpg Terminal.app File input Output name Input option name Option value
  66. https://www.vectary.com/ Vectary

  67. https://www.vectary.com/

  68. https://www.vectary.com/ USDZ export

  69. • Apple's proprietary format (hopefully a temporary spec until the WebXR Device API lands) • Some limitations of USDZ • Just pops up and shows in AR (cannot be used for games and the like)
  70. 3D model format for Web apps?

  71. 3. glTF

  72. • Stands for GL Transmission Format • Open-standard 3D model format • The JPEG of 3D • Maintained by Khronos What is glTF
  73. • JSON format as the container with binary payloads, or a packed single binary file called glb • Animation supported • Extensible, just like the WebGL spec (like Adobe Fireworks PNG) What is glTF
  74. None
  75. • Supported by many 3D modeling tools • Several WebGL libraries support glTF loading (such as three.js, Babylon.js, Cesium) • Microsoft Paint 3D, Office and others use glb as their 3D model format • Adobe Animate has a glTF exporter • Facebook's 3D posts use glb • VRM: a glTF-extended format for humanoid avatars (for Virtual YouTubers, VRChat and others) glTF of the present
  76. https://www.youtube.com/watch?v=H2XoeQmkchw

  77. https://www.youtube.com/watch?v=zzDM42PdqZk

  78. Load glTF in three.js

  79. const width = window.innerWidth; const height = window.innerHeight; const scene = new THREE.Scene(); const camera = new THREE.PerspectiveCamera( 45, width / height, 0.001, 100 ); camera.position.set( 0, 0, 0.5 ); const renderer = new THREE.WebGLRenderer(); renderer.setSize( width, height ); renderer.gammaInput = true; renderer.gammaOutput = true; document.body.appendChild( renderer.domElement ); scene.add( new THREE.HemisphereLight( 0xffffff, 0x332222 ) );
  80. document.body.appendChild( renderer.domElement ); scene.add( new THREE.HemisphereLight( 0xffffff, 0x332222 ) ); const loader = new THREE.GLTFLoader(); loader.load( './models/barger/barger.gltf', function ( gltf ) { scene.add( gltf.scene ); } ); ( function anim () { requestAnimationFrame( anim );
  81. scene.add( gltf.scene ); } ); ( function anim () { requestAnimationFrame( anim ); renderer.render( scene, camera ); } )();
  82. None
  83. Demo

  84. glTF Animation
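The slides do not show the animation code itself; this is a minimal sketch of how glTF animation clips are typically played with three.js's AnimationMixer. The helper name is an assumption, and the THREE object is passed in as a parameter rather than imported, so the flow stays self-contained and testable:

```javascript
// Hedged sketch: start every animation clip found in a loaded glTF.
// gltf is the object GLTFLoader hands to its callback (gltf.scene, gltf.animations).
function playAllClips(THREE, gltf) {
  const mixer = new THREE.AnimationMixer(gltf.scene); // drives clips on the scene root
  for (const clip of gltf.animations) {
    mixer.clipAction(clip).play();
  }
  return mixer; // call mixer.update( deltaSeconds ) inside the render loop
}
```

Inside the anim() loop from the previous slides you would then advance the mixer each frame with the elapsed time, e.g. via a THREE.Clock.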

  85. Demo

  86. glTF is very popular as the standard

  87. Security issues

  88. • Gaze tracking: virtual keyboard input can be inferred from gaze direction in a VR environment • Trusted environment: motion sickness caused by low FPS and pose-tracking errors • Fingerprinting: identify a user's room shape or even face shape from depth data
  89. Conclusion

  90. XR is coming to the Web

  91. • A Web API (in development) • For both VR and AR • Just works in web browsers, no add-ons or installation required WebXR Device API
  92. • AR for iOS • Using USDZ • Basic AR features AR Quick Look
  93. • The standard • 3D model format in JSON or binary • Can be seen in many places • Loaders are available in JavaScript glTF
  94. Web will be connected to the real world Source: https://www.netflix.com/jp/title/80182418 © Shirow Masamune / Production I.G / Kodansha / Ghost in the Shell Production Committee
  95. gl.finish(); @yomotsu