Slide 1

Slide 1 text

Implementing spatial effects and displaying Immersive Video on visionOS. Hattori Satoshi

Slide 2

Slide 2 text

CyberAgent, Inc. / Cyber AI Productions / Virtual Force. Hattori Satoshi (satoshi0212, @shmdevelop)

Slide 3

Slide 3 text

visionOS 30 Days Challenge

Slide 4

Slide 4 text

CONTENTS
01 Examples of spatial effects on visionOS
02 Implementing spatial effects
03 Implementing an Immersive Video viewer

Slide 5

Slide 5 text

01 Examples of spatial effects on visionOS

Slide 6

Slide 6 text

01 Examples of spatial effects on visionOS: visionOS apps whose spatial staging is highly regarded

Slide 7

Slide 7 text

01 Examples of spatial effects on visionOS: What If…? An Immersive Story ・3D CG characters displayed in real space ・portal display and VR display ・hand tracking, etc.

Slide 8

Slide 8 text

01 Examples of spatial effects on visionOS: Kung Fu Panda: School of Chi ・3D CG characters ・portal display plus 3D objects in front of it ・interaction driven by hand movement

Slide 9

Slide 9 text

01 Examples of spatial effects on visionOS: Disney+ ・2D video playback ・3D video playback ・a high-quality VR viewing environment

Slide 10

Slide 10 text

01 Examples of spatial effects on visionOS: GUCCI ・high-quality 2D video ・a rich set of spatial effects ・effects staged in step with the video's progress

Slide 11

Slide 11 text

01 Examples of spatial effects on visionOS: a closer look at GUCCI's building blocks and spatial presentation

Slide 12

Slide 12 text

image from "GUCCI" application

Slide 13

Slide 13 text

image from "GUCCI" application

Slide 14

Slide 14 text

Video / controller / spatial effects

Slide 15

Slide 15 text

image from "GUCCI" application

Slide 16

Slide 16 text

image from "GUCCI" application

Slide 17

Slide 17 text

image from "GUCCI" application

Slide 18

Slide 18 text

image from "GUCCI" application

Slide 19

Slide 19 text

image from "GUCCI" application

Slide 20

Slide 20 text

image from "GUCCI" application

Slide 21

Slide 21 text

image from "GUCCI" application

Slide 22

Slide 22 text

image from "GUCCI" application

Slide 23

Slide 23 text

Examples of spatial effects on visionOS

Slide 24

Slide 24 text

CONTENTS
01 Examples of spatial effects on visionOS
02 Implementing spatial effects
03 Implementing an Immersive Video viewer

Slide 25

Slide 25 text

02 Implementing spatial effects

Slide 26

Slide 26 text

02 Implementing spatial effects: what we will build, a GUCCI-style presentation

Slide 27

Slide 27 text

Line particles + fireworks, rain particles + black background, Environment (in-app)

Slide 28

Slide 28 text

02 Implementing spatial effects: the control scheme adopted here is to embed effect-control tags at specified times in the HLS stream

Slide 29

Slide 29 text

02 Implementing spatial effects: the plan
1. Create an HLS stream with metadata
2. Create the spatial effects
3. Play the video
4. Display the spatial effects

Slide 30

Slide 30 text

02 Implementing spatial effects / 1. Create an HLS stream with metadata: prepare the source video

Slide 31

Slide 31 text

No content

Slide 32

Slide 32 text

No content

Slide 33

Slide 33 text

02 Implementing spatial effects / 1. Create an HLS stream with metadata: create the ID3 tags and the macro file

Slide 34

Slide 34 text

https://developer.apple.com/streaming/

Slide 35

Slide 35 text

https://developer.apple.com/documentation/http-live-streaming/using-apple-s-http-live-streaming-hls-tools

Slide 36

Slide 36 text

02 Implementing spatial effects: ID3 Tag Generator

Slide 37

Slide 37 text

$ id 3 taggenerator -o reset.id 3 -t "c_reset" -o | -output- fi le < fi le> Speci fi es the path where the generated ID 3 tag is written. -t | -text Inserts a text frame with the given string.

Slide 38

Slide 38 text

$ id 3 taggenerator -o reset.id 3 -t "c_reset" $ id 3 taggenerator -o line_on.id 3 -t "c_on_line_particle" $ id 3 taggenerator -o line_o ff .id 3 -t "c_o ff _line_particle" $ id 3 taggenerator -o rain_on.id 3 -t "c_on_rain_particle" $ id 3 taggenerator -o rain_o ff .id 3 -t "c_o ff _rain_particle" $ id 3 taggenerator -o fi reworks_on.id 3 -t "c_on_ fi reworks_particle" $ id 3 taggenerator -o fi reworks_o ff .id 3 -t "c_o ff _ fi reworks_particle" $ id 3 taggenerator -o env_ 0 1 _on.id 3 -t "c_on_env_ 0 1 " $ id 3 taggenerator -o env_ 0 1 _o ff .id 3 -t "c_o ff _env_ 0 1 "

Slide 39

Slide 39 text

No content

Slide 40

Slide 40 text

02 Implementing spatial effects / 1. Create an HLS stream with metadata: create Macro.txt

Slide 41

Slide 41 text

Macro.txt (each line is: a time in seconds, the keyword id3, and the path to a tag file)

0 id3 ./reset.id3
2 id3 ./line_on.id3
10 id3 ./line_off.id3
11.5 id3 ./env_01_on.id3
20.5 id3 ./env_01_off.id3
21 id3 ./rain_on.id3
30 id3 ./rain_off.id3
32 id3 ./fireworks_on.id3
40 id3 ./fireworks_off.id3
44 id3 ./reset.id3

Slide 42

Slide 42 text

02 Implementing spatial effects / 1. Create an HLS stream with metadata: generate the HLS resources with Media File Segmenter

Slide 43

Slide 43 text

02 Implementing spatial effects: Media File Segmenter

Slide 44

Slide 44 text

$ mediafilesegmenter -f ./output/ -i index.m3u8 -B media- -t 1 \
    -M ./macro.txt ./SpatialEffects001.mov

-f | -file-base path
    Directory to store the media and index files.
-i | -index-file fileName
    This option defines the index file name. The default is prog_index.m3u8.
    It is recommended that the index file have an extension of .m3u8 or .m3u.
-B | -base-media-file-name name
    This option defines the base name of the media files. The default is
    fileSequence. The current sequence number of the file is appended, and an
    extension added. For example, specifying name as AppleMediaFile will
    generate file names that look like AppleMediaFile12.ts.
-t | -target-duration duration
    Specifies a target duration for the media files. The default duration is
    10 seconds. The duration is calculated by looking at the PTS/DTS in the
    source file.
-M | -meta-macro-file file
    Specifies the macro file to be used to insert timed metadata into the stream.

Slide 45

Slide 45 text

No content

Slide 46

Slide 46 text

02 Implementing spatial effects: the plan
1. Create an HLS stream with metadata
2. Create the spatial effects
3. Play the video
4. Display the spatial effects

Slide 47

Slide 47 text

02 Implementing spatial effects / 2. Create the spatial effects: Reality Composer Pro

Slide 48

Slide 48 text

No content

Slide 49

Slide 49 text

No content

Slide 50

Slide 50 text

No content

Slide 51

Slide 51 text

No content

Slide 52

Slide 52 text

02 Implementing spatial effects / 2. Create the spatial effects: line particles

Slide 53

Slide 53 text

No content

Slide 54

Slide 54 text

No content

Slide 55

Slide 55 text

No content

Slide 56

Slide 56 text

02 Implementing spatial effects / 2. Create the spatial effects: rain and the black background

Slide 57

Slide 57 text

No content

Slide 58

Slide 58 text

No content

Slide 59

Slide 59 text

No content

Slide 60

Slide 60 text

02 Implementing spatial effects / 2. Create the spatial effects: fireworks

Slide 61

Slide 61 text

No content

Slide 62

Slide 62 text

No content

Slide 63

Slide 63 text

No content

Slide 64

Slide 64 text

02 Implementing spatial effects / 2. Create the spatial effects: Environment

Slide 65

Slide 65 text

No content

Slide 66

Slide 66 text

No content

Slide 67

Slide 67 text

https://developer.apple.com/documentation/realitykit/construct-an-immersive-environment-for-visionos

Slide 68

Slide 68 text

02 Implementing spatial effects: the plan
1. Create an HLS stream with metadata
2. Create the spatial effects
3. Play the video
4. Display the spatial effects

Slide 69

Slide 69 text

02 Implementing spatial effects / 3. Play the video

Slide 70

Slide 70 text

No content

Slide 71

Slide 71 text

No content

Slide 72

Slide 72 text

import SwiftUI

@main
struct SpatialEffectsVideoPlayerApp: App {
    @State private var appModel = AppModel()
    @State private var playerViewModel = AVPlayerViewModel()
    @State private var surroundingsEffect: SurroundingsEffect? = .semiDark

    var body: some Scene {
        WindowGroup {
            if playerViewModel.isPlaying {
                AVPlayerView(viewModel: playerViewModel)
            } else {
                ContentView()
                    .environment(appModel)
            }
        }
        .windowResizability(.contentSize)
        .windowStyle(.plain)

        ImmersiveSpace(id: appModel.immersiveSpaceID) {
            ImmersiveView()
                .environment(appModel)
                .environment(playerViewModel)
                .onAppear { appModel.immersiveSpaceState = .open }
                .onDisappear { appModel.immersiveSpaceState = .closed }
                .preferredSurroundingsEffect(surroundingsEffect)
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}

SpatialEffectsVideoPlayerApp.swift
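AppModel and ContentView are referenced above but never shown in the deck. As orientation only, here is a minimal sketch of an AppModel consistent with the members the App struct touches; the shape follows Apple's visionOS app template and is an assumption, not the talk's actual code.

import Observation

// Hypothetical AppModel (not shown in the deck); fields inferred from usage above.
@Observable
final class AppModel {
    enum ImmersiveSpaceState { case closed, inTransition, open }

    // ID the App struct passes to ImmersiveSpace(id:).
    let immersiveSpaceID = "ImmersiveSpace"
    // Flipped in onAppear/onDisappear of the immersive content.
    var immersiveSpaceState = ImmersiveSpaceState.closed
}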

Slide 73

Slide 73 text

(Same code as Slide 72: SpatialEffectsVideoPlayerApp.swift.)

Slide 74

Slide 74 text

import SwiftUI

struct AVPlayerView: UIViewControllerRepresentable {
    let viewModel: AVPlayerViewModel

    func makeUIViewController(context: Context) -> some UIViewController {
        return viewModel.makePlayerViewController()
    }

    func updateUIViewController(_ uiViewController: UIViewControllerType, context: Context) {
        // Update the AVPlayerViewController as needed
    }
}

AVPlayerView.swift

Slide 75

Slide 75 text

import AVKit
import Observation

@Observable
final class AVPlayerViewModel: NSObject {
    private(set) var isPlaying: Bool = false
    private var avPlayerViewController: AVPlayerViewController?
    private var avPlayer = AVPlayer()
    private let videoURL: URL? = {
        URL(string: "https://satoshi0212.github.io/hls/resources/index.m3u8")
    }()

    func makePlayerViewController() -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = avPlayer
        controller.delegate = self
        self.avPlayerViewController = controller
        self.avPlayerViewController?.delegate = self
        controller.modalPresentationStyle = .fullScreen
        return controller
    }

    func play() {
        guard !isPlaying, let videoURL else { return }
        isPlaying = true
        let item = AVPlayerItem(url: videoURL)
        // Attach a metadata output so timed ID3 tags surface via the delegate.
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: DispatchQueue.main)
        item.add(metadataOutput)
        avPlayer.replaceCurrentItem(with: item)
        avPlayer.play()
    }

    func reset() {
        guard isPlaying else { return }
        isPlaying = false
        avPlayer.replaceCurrentItem(with: nil)
    }
}

AVPlayerViewModel.swift

Slide 76

Slide 76 text

(Same code as Slide 75: AVPlayerViewModel.swift.)

Slide 77

Slide 77 text

(Same code as Slide 75: AVPlayerViewModel.swift.)

Slide 78

Slide 78 text

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    @Environment(AVPlayerViewModel.self) private var playerViewModel
    @State var immersiveViewModel = ImmersiveViewModel()

    var body: some View {
        ZStack {
            RealityView { content in
                let entity = Entity()
                content.add(entity)
                immersiveViewModel.setup(entity: entity)
            }
            .gesture(SpatialTapGesture().targetedToAnyEntity()
                .onEnded { value in
                    if value.entity.name == "StartButton" {
                        playerViewModel.play()
                    }
                }
            )
            .onChange(of: playerViewModel.isPlaying, initial: false) { _, newValue in
                immersiveViewModel.rootEntity?.getFirstChildByName(name: "StartButton")?.isEnabled = !newValue
            }
            .onDisappear {
                playerViewModel.reset()
            }
            .transition(.opacity)
            ...
        }
    }
}

ImmersiveView.swift

Slide 79

Slide 79 text

(Same code as Slide 78: ImmersiveView.swift.)

Slide 80

Slide 80 text

No content

Slide 81

Slide 81 text

02 Implementing spatial effects / 3. Play the video: confirm that metadata is received

Slide 82

Slide 82 text

extension AVPlayerViewModel: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        if let item = groups.first?.items.first,
           let metadataValue = item.value(forKey: "value") as? String {
            print("Metadata value: \(metadataValue)")
            // videoAction = VideoAction(rawValue: metadataValue) ?? .none
        }
    }
}

AVPlayerViewModel.swift

Slide 83

Slide 83 text

No content

Slide 84

Slide 84 text

🎉

Slide 85

Slide 85 text

02 Implementing spatial effects: the plan
1. Create an HLS stream with metadata
2. Create the spatial effects
3. Play the video
4. Display the spatial effects

Slide 86

Slide 86 text

02 Implementing spatial effects / 4. Display the spatial effects

Slide 87

Slide 87 text

No content

Slide 88

Slide 88 text

No content

Slide 89

Slide 89 text

02 Implementing spatial effects / 4. Display the spatial effects: build the effect views with a common structure

Slide 90

Slide 90 text

import SwiftUI
import RealityKit

struct LineParticleView: View {
    static let viewName = "LineParticleView"
    @State var viewModel = LineParticleViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

LineParticleView.swift

Slide 91

Slide 91 text

import SwiftUI
import RealityKit

struct RainParticleView: View {
    static let viewName = "RainParticleView"
    @State var viewModel = RainParticleViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

RainParticleView.swift

Slide 92

Slide 92 text

import SwiftUI
import RealityKit

struct FireworksParticleView: View {
    static let viewName = "FireworksParticleView"
    @State var viewModel = FireworksParticleViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

FireworksParticleView.swift

Slide 93

Slide 93 text

import SwiftUI
import RealityKit

struct Env01View: View {
    static let viewName = "Env01View"
    @State var viewModel = Env01ViewModel()

    var body: some View {
        RealityView { content in
            let entity = Entity()
            content.add(entity)
            viewModel.setup(entity: entity)
        }
    }
}

Env01View.swift

Slide 94

Slide 94 text

02 Implementing spatial effects / 4. Display the spatial effects: the ViewModels also share a common structure

Slide 95

Slide 95 text

import RealityKit
import Observation
import RealityKitContent

@MainActor
@Observable
final class LineParticleViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        Task {
            guard let scene = try? await Entity(named: "LineParticle", in: realityKitContentBundle),
                  let particleEntity = scene.findEntity(named: "ParticleEmitter") else { return }
            particleEntity.name = "lineParticle"
            particleEntity.position = [0.0, 1.2, -0.8]
            rootEntity?.addChild(particleEntity)
        }
    }
    ...

LineParticleViewModel.swift

Slide 96

Slide 96 text

(Same code as Slide 95: LineParticleViewModel.swift.)

Slide 97

Slide 97 text

(Same code as Slide 95: LineParticleViewModel.swift.)

Slide 98

Slide 98 text

protocol LiveSequenceOperation {
    func reset() async
    func play() async
    func fadeIn() async
    func fadeOut() async
}
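Not from the talk: since every ViewModel below repeats near-identical fadeIn/fadeOut bodies, one possible refinement is a derived protocol that exposes the root entity and supplies the fades as default implementations. A sketch under that assumption (the protocol name and the requirement are hypothetical; it relies on the setOpacity Entity extension shown on a later slide):

import RealityKit

// Hypothetical refactoring, not the talk's code: expose the root entity so
// the fade methods can live in one protocol extension instead of being
// repeated in every conforming ViewModel.
protocol FadingSequenceOperation: LiveSequenceOperation {
    var rootEntity: Entity? { get }
}

extension FadingSequenceOperation {
    func fadeIn() async {
        await rootEntity?.setOpacity(1.0, animated: true, duration: 0.4)
    }

    func fadeOut() async {
        await rootEntity?.setOpacity(0.0, animated: true, duration: 0.4)
    }
}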

Slide 99

Slide 99 text

(Same code as Slide 95: LineParticleViewModel.swift.)

Slide 100

Slide 100 text

...
    func reset() {
        rootEntity?.opacity = 0.0
    }

    func play() {
        rootEntity?.getFirstChildByName(name: "lineParticle")?.isEnabled = true
    }

    func fadeIn() {
        Task { await rootEntity?.setOpacity(1.0, animated: true, duration: 0.4) }
    }

    func fadeOut() {
        Task { await rootEntity?.setOpacity(0.0, animated: true, duration: 0.4) }
    }
}

LineParticleViewModel.swift
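getFirstChildByName(name:) is used throughout the deck but never shown; RealityKit's built-in findEntity(named:) behaves similarly. A hypothetical recursive implementation, assuming a depth-first lookup:

import RealityKit

// Hypothetical helper (not shown in the deck); findEntity(named:) from
// RealityKit could serve the same purpose.
extension Entity {
    func getFirstChildByName(name: String) -> Entity? {
        if self.name == name { return self }
        for child in children {
            if let match = child.getFirstChildByName(name: name) {
                return match
            }
        }
        return nil
    }
}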

Slide 101

Slide 101 text

02 Implementing spatial effects / 4. Display the spatial effects: a supplementary note on fade-in and fade-out

Slide 102

Slide 102 text

(Same code as Slide 100: LineParticleViewModel.swift.)

Slide 103

Slide 103 text

@MainActor
func setOpacity(_ opacity: Float,
                animated: Bool,
                duration: TimeInterval = 0.2,
                delay: TimeInterval = 0,
                completion: (() -> Void) = {}) async {
    guard animated, let scene else {
        self.opacity = opacity
        return
    }

    // Ensure the entity has an OpacityComponent to animate.
    if !components.has(OpacityComponent.self) {
        components[OpacityComponent.self] = OpacityComponent(opacity: 1.0)
    }

    let animation = FromToByAnimation(name: "Entity/setOpacity",
                                      to: opacity,
                                      duration: duration,
                                      timing: .linear,
                                      isAdditive: false,
                                      bindTarget: .opacity,
                                      delay: delay)
    do {
        let animationResource: AnimationResource = try .generate(with: animation)
        let animationPlaybackController = playAnimation(animationResource)
        // Intended to wait for this animation's PlaybackTerminated event.
        // Note: building the filtered async sequence below does not actually
        // suspend, so completion runs right after the animation starts.
        let filtered = scene.publisher(for: AnimationEvents.PlaybackTerminated.self)
            .filter { $0.playbackController == animationPlaybackController }
        _ = filtered.values.filter { await $0.playbackController.isComplete }
        completion()
    } catch {
        print("Could not generate animation: \(error.localizedDescription)")
    }
}

Entity.swift

Slide 104

Slide 104 text

(Same code as Slide 103: Entity.swift.)

Slide 105

Slide 105 text

02 Implementing spatial effects / 4. Display the spatial effects: implementing the ViewModels other than LineParticleViewModel

Slide 106

Slide 106 text

@MainActor
@Observable
final class RainParticleViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0

        // Black sky box: an inverted unlit sphere that blacks out the surroundings.
        let skyBoxEntity = Entity()
        skyBoxEntity.components.set(ModelComponent(
            mesh: .generateSphere(radius: 1000),
            materials: [UnlitMaterial(color: .black)]
        ))
        skyBoxEntity.scale *= .init(x: -1, y: 1, z: 1)
        rootEntity?.addChild(skyBoxEntity)

        Task {
            if let scene = try? await Entity(named: "RainParticle", in: realityKitContentBundle),
               let particleEntity = scene.findEntity(named: "ParticleEmitter") {
                particleEntity.name = "rainParticle"
                particleEntity.position = [0.0, 3.0, -2.0]
                rootEntity?.addChild(particleEntity)
            }
        }
    }
    ...

RainParticleViewModel.swift

Slide 107

Slide 107 text

(Same code as Slide 106: RainParticleViewModel.swift.)

Slide 108

Slide 108 text

...
    func reset() {
        rootEntity?.opacity = 0.0
    }

    func play() {
        rootEntity?.getFirstChildByName(name: "rainParticle")?.isEnabled = true
    }

    func fadeIn() {
        Task { await rootEntity?.setOpacity(1.0, animated: true, duration: 1.4) }
    }

    func fadeOut() {
        Task { await rootEntity?.setOpacity(0.0, animated: true, duration: 1.4) }
    }
}

RainParticleViewModel.swift

Slide 109

Slide 109 text

@MainActor
@Observable
final class FireworksParticleViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        Task {
            guard let scene = try? await Entity(named: "Fireworks", in: realityKitContentBundle) else { return }
            rootEntity?.addChild(scene)
        }
    }
    ...
}

FireworksParticleViewModel.swift

Slide 110

Slide 110 text

@MainActor
@Observable
final class Env01ViewModel: LiveSequenceOperation {
    private var rootEntity: Entity?

    func setup(entity: Entity) {
        rootEntity = entity
        rootEntity?.opacity = 0.0
        Task {
            guard let scene = try? await Entity(named: "Env_01", in: realityKitContentBundle) else { return }
            rootEntity?.addChild(scene)
        }
    }
    ...
}

Env01ViewModel.swift

Slide 111

Slide 111 text

02 Implementing spatial effects / 4. Display the spatial effects: create and place the views

Slide 112

Slide 112 text

No content

Slide 113

Slide 113 text

@MainActor
@Observable
class ImmersiveViewModel {
    private(set) var rootEntity: Entity?

    let lineParticleView: LineParticleView = .init()
    let rainParticleView: RainParticleView = .init()
    let fireworksParticleView: FireworksParticleView = .init()
    let env01View: Env01View = .init()

    @ObservationIgnored
    private lazy var effectViewModels: [String: LiveSequenceOperation] = {
        return [
            LineParticleView.viewName: self.lineParticleView.viewModel,
            RainParticleView.viewName: self.rainParticleView.viewModel,
            FireworksParticleView.viewName: self.fireworksParticleView.viewModel,
            Env01View.viewName: self.env01View.viewModel,
        ]
    }()
    ...

ImmersiveViewModel.swift
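processVideoAction, shown later, calls play(viewName:), fadeIn(viewName:), and fadeOut(viewName:), which the deck never defines. They presumably dispatch through the effectViewModels dictionary above; a hypothetical sketch:

// Hypothetical dispatch helpers (not shown in the deck); inferred from the
// calls in processVideoAction. Each one forwards to the named view's ViewModel.
extension ImmersiveViewModel {
    func play(viewName: String) async {
        await effectViewModels[viewName]?.play()
    }

    func fadeIn(viewName: String) async {
        await effectViewModels[viewName]?.fadeIn()
    }

    func fadeOut(viewName: String) async {
        await effectViewModels[viewName]?.fadeOut()
    }
}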

Slide 114

Slide 114 text

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    @Environment(AVPlayerViewModel.self) private var playerViewModel
    @State var immersiveViewModel = ImmersiveViewModel()

    var body: some View {
        ZStack {
            RealityView { content in
                let entity = Entity()
                content.add(entity)
                immersiveViewModel.setup(entity: entity)
            }
            .gesture(SpatialTapGesture().targetedToAnyEntity()
                .onEnded { value in
                    if value.entity.name == "StartButton" {
                        playerViewModel.play()
                    }
                }
            )
            .onChange(of: playerViewModel.videoAction, initial: true) { oldValue, newValue in
                immersiveViewModel.processVideoAction(oldValue: oldValue, newValue: newValue)
            }
            .onChange(of: playerViewModel.isPlaying, initial: false) { _, newValue in
                immersiveViewModel.rootEntity?.getFirstChildByName(name: "StartButton")?.isEnabled = !newValue
            }
            .onDisappear {
                playerViewModel.reset()
            }
            .transition(.opacity)

            // place effect views
            immersiveViewModel.lineParticleView
            immersiveViewModel.rainParticleView
            immersiveViewModel.fireworksParticleView
            immersiveViewModel.env01View
        }
    }
}

ImmersiveView.swift

Slide 115

Slide 115 text

(Same code as Slide 114: ImmersiveView.swift.)

Slide 116

Slide 116 text

02 Implementing spatial effects / 4. Display the spatial effects: the views and ViewModels are now created and placed; next, toggle each view according to the metadata tag

Slide 117

Slide 117 text

enum VideoAction: String {
    case none
    case c_reset
    case c_on_line_particle
    case c_off_line_particle
    case c_on_rain_particle
    case c_off_rain_particle
    case c_on_fireworks_particle
    case c_off_fireworks_particle
    case c_on_env_01
    case c_off_env_01
}

Slide 118

Slide 118 text

extension AVPlayerViewModel: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        if let item = groups.first?.items.first,
           let metadataValue = item.value(forKey: "value") as? String {
            print("Metadata value: \(metadataValue)")
            videoAction = VideoAction(rawValue: metadataValue) ?? .none
        }
    }
}

AVPlayerViewModel.swift

Slide 119

Slide 119 text

(Same code as Slide 118: AVPlayerViewModel.swift.)

Slide 120

Slide 120 text

(Same code as Slide 114: ImmersiveView.swift.)

Slide 121

Slide 121 text

func processVideoAction(oldValue: VideoAction = .none, newValue: VideoAction = .none) {
    // avoid continuous firing of actions other than reset action
    if newValue != .c_reset && oldValue == newValue { return }

    switch newValue {
    case .none:
        break
    case .c_reset:
        resetAction()
    case .c_on_line_particle:
        Task {
            await play(viewName: LineParticleView.viewName)
            await fadeIn(viewName: LineParticleView.viewName)
        }
    case .c_off_line_particle:
        Task {
            await fadeOut(viewName: LineParticleView.viewName)
        }
    case .c_on_rain_particle:
    ...

ImmersiveViewModel.swift

Slide 122

Slide 122 text

(Same code as Slide 121: ImmersiveViewModel.swift.)
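Slide 121's switch is cut off after the c_on_rain_particle case. The remaining cases presumably mirror the line-particle pair for rain, fireworks, and the environment; a hypothetical completion of the switch:

    // Hypothetical continuation (the slide is truncated); assumed symmetric
    // to the line-particle cases shown above.
    case .c_on_rain_particle:
        Task {
            await play(viewName: RainParticleView.viewName)
            await fadeIn(viewName: RainParticleView.viewName)
        }
    case .c_off_rain_particle:
        Task { await fadeOut(viewName: RainParticleView.viewName) }
    case .c_on_fireworks_particle:
        Task {
            await play(viewName: FireworksParticleView.viewName)
            await fadeIn(viewName: FireworksParticleView.viewName)
        }
    case .c_off_fireworks_particle:
        Task { await fadeOut(viewName: FireworksParticleView.viewName) }
    case .c_on_env_01:
        Task {
            await play(viewName: Env01View.viewName)
            await fadeIn(viewName: Env01View.viewName)
        }
    case .c_off_env_01:
        Task { await fadeOut(viewName: Env01View.viewName) }
    }
}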

Slide 123

Slide 123 text

02 Implementing spatial effects / 4. Display the spatial effects: all the pieces are in place

Slide 124

Slide 124 text

No content

Slide 125

Slide 125 text

🎉

Slide 126

Slide 126 text

02 Implementing spatial effects: this chapter implemented GUCCI-style coordination between a 2D video and spatial effects. The use of spatial effects is likely to grow in many settings beyond video.

Slide 127

Slide 127 text

CONTENTS
01 Examples of spatial effects on visionOS
02 Implementing spatial effects
03 Implementing an Immersive Video viewer

Slide 128

Slide 128 text

03 Implementing an Immersive Video viewer

Slide 129

Slide 129 text

03 Implementing an Immersive Video viewer: Apple Immersive Video is 8K 3D video content with a 180-degree field of view and Spatial Audio, viewable on Apple Vision Pro. https://www.apple.com/jp/apple-vision-pro/

Slide 130

Slide 130 text

03 Implementing an Immersive Video viewer: "Submerged", the first scripted short film captured in Apple Immersive Video. Viewers board a submarine whose crew fights desperately to survive a fierce torpedo attack. (Released on 2024/10/11.) https://www.apple.com/jp/newsroom/2024/07/new-apple-immersive-video-series-and-films-premiere-on-vision-pro/

Slide 131

Slide 131 text

03 Implementing an Immersive Video viewer: Apple's standard production workflow, expected to become available in 2024-2025. https://www.blackmagicdesign.com/jp/media/release/20240611-02

Slide 132

Slide 132 text

03 Implementing an Immersive Video viewer: with existing equipment it is already possible to produce something very similar. https://www.youtube.com/watch?v=C3EKrDuBNf4

Slide 133

Slide 133 text

03 Implementing an Immersive Video viewer: the specification is "ISO Base Media File Format and Apple HEVC Stereo Video" (Video Extended Usage box hierarchy). https://developer.apple.com/av-foundation/Stereo-Video-ISOBMFF-Extensions.pdf

Slide 134

Slide 134 text

03 Implementing an Immersive Video viewer: sample conversion implementation. https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc_and_spatial_video

Slide 135

Slide 135 text

Implementing an Immersive Video viewer

Slide 136

Slide 136 text

03 Implementing an Immersive Video viewer: implementation reference, with thanks. https://github.com/mikeswanson/SpatialPlayer

Slide 137

Slide 137 text

03 Implementing an Immersive Video viewer: the viewer

Slide 138

Slide 138 text

03 Implementing an Immersive Video viewer: retrieving video information

Slide 139

Slide 139 text

static func getVideoInfo(asset: AVAsset) async -> VideoInfo? {
    let videoInfo = VideoInfo()

    guard let videoTrack = try? await asset.loadTracks(withMediaType: .video).first else {
        print("No video track found")
        return nil
    }
    guard let (naturalSize, formatDescriptions, mediaCharacteristics) =
            try? await videoTrack.load(.naturalSize, .formatDescriptions, .mediaCharacteristics),
          let formatDescription = formatDescriptions.first else {
        print("Failed to load video properties")
        return nil
    }

    videoInfo.size = naturalSize
    // Stereo (MV-HEVC) content carries the stereo multiview characteristic.
    videoInfo.isSpatial = mediaCharacteristics.contains(.containsStereoMultiviewVideo)

    let projection = VideoTools.getProjection(formatDescription: formatDescription)
    videoInfo.projectionType = projection.projectionType
    videoInfo.horizontalFieldOfView = projection.horizontalFieldOfView

    return videoInfo
}
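The VideoInfo type assigned above never appears on a slide. A minimal hypothetical shape, inferred purely from the properties getVideoInfo sets and the view code reads later:

import CoreGraphics
import CoreMedia
import Observation

// Hypothetical VideoInfo model (not shown in the deck); properties inferred
// from the assignments in getVideoInfo.
@Observable
final class VideoInfo {
    var size: CGSize = .zero
    var isSpatial: Bool = false
    var projectionType: CMProjectionType?
    var horizontalFieldOfView: Float?
}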

Slide 140

Slide 140 text

(Same code as Slide 139.)

Slide 141

Slide 141 text

(Same code as Slide 139.)

Slide 142

Slide 142 text

(Same code as Slide 139.)

Slide 143

Slide 143 text

static func getProjection(formatDescription: CMFormatDescription) -> (
    projectionType: CMProjectionType?, horizontalFieldOfView: Float?) {
    var projectionType: CMProjectionType?
    var horizontalFieldOfView: Float?

    if let extensions = CMFormatDescriptionGetExtensions(formatDescription) as Dictionary? {
        if let projectionKind = extensions["ProjectionKind" as CFString] as? String {
            projectionType = CMProjectionType(fromString: projectionKind) ?? .rectangular
        }
        if let horizontalFieldOfViewValue = extensions[kCMFormatDescriptionExtension_HorizontalFieldOfView] as? UInt32 {
            // The extension stores thousandths of a degree; convert to degrees.
            horizontalFieldOfView = Float(horizontalFieldOfViewValue) / 1000.0
        }
    }
    return (projectionType, horizontalFieldOfView)
}
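CMProjectionType(fromString:) is not an SDK initializer and the deck does not show it. A hypothetical mapping from the ProjectionKind extension strings (the HalfEquirectangular value matches the caption two slides later):

import CoreMedia

// Hypothetical helper (not in the SDK, not shown in the deck): map the
// ProjectionKind format-description extension string to CMProjectionType.
extension CMProjectionType {
    init?(fromString string: String) {
        switch string {
        case "Rectangular":         self = .rectangular
        case "Equirectangular":     self = .equirectangular
        case "HalfEquirectangular": self = .halfEquirectangular
        case "Fisheye":             self = .fisheye
        default:                    return nil
        }
    }
}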

Slide 144

Slide 144 text

(Same code as Slide 143.) Caption: ProjectionKind is HalfEquirectangular, as expected.

Slide 145

Slide 145 text

(Same code as Slide 143.) Caption: HorizontalFieldOfView is as expected.

Slide 146

Slide 146 text

03 Implementing an Immersive Video viewer: building the hemisphere mesh for projection

Slide 147

Slide 147 text

static func makeVideoMesh(videoInfo: VideoInfo) async -> (
    mesh: MeshResource, transform: Transform)? {
    let horizontalFieldOfView = videoInfo.horizontalFieldOfView ?? 65.0

    let mesh = VideoTools.generateVideoSphere(
        radius: 10000.0,
        sourceHorizontalFov: horizontalFieldOfView,
        sourceVerticalFov: 180.0,
        clipHorizontalFov: horizontalFieldOfView,
        clipVerticalFov: 180.0,
        verticalSlices: 60,
        horizontalSlices: Int(horizontalFieldOfView) / 3)

    let transform = Transform(
        scale: .init(x: 1, y: 1, z: 1),
        rotation: .init(angle: -Float.pi / 2, axis: .init(x: 0, y: 1, z: 0)),
        translation: .init(x: 0, y: 0, z: 0))

    return (mesh: mesh!, transform: transform)
}

Slide 148

Slide 148 text

static func generateVideoSphere(
    radius: Float,
    sourceHorizontalFov: Float,
    sourceVerticalFov: Float,
    clipHorizontalFov: Float,
    clipVerticalFov: Float,
    verticalSlices: Int,
    horizontalSlices: Int
) -> MeshResource? {
    // Vertices
    ...
    // Normals
    ...
    // UVs
    ...
    // Indices
    ...

    var meshDescriptor = MeshDescriptor(name: "proceduralMesh")
    meshDescriptor.positions = MeshBuffer(vertices)
    meshDescriptor.normals = MeshBuffer(normals)
    meshDescriptor.primitives = .triangles(indices)
    meshDescriptor.textureCoordinates = MeshBuffer(uvCoordinates)

    let mesh = try? MeshResource.generate(from: [meshDescriptor])
    return mesh
}

Slide 149

Slide 149 text

// Vertices
var vertices: [simd_float3] = Array(
    repeating: simd_float3(),
    count: (verticalSlices + 1) * (horizontalSlices + 1))

let verticalScale: Float = clipVerticalFov / 180.0
let verticalOffset: Float = (1.0 - verticalScale) / 2.0
let horizontalScale: Float = clipHorizontalFov / 360.0
let horizontalOffset: Float = (1.0 - horizontalScale) / 2.0

for y: Int in 0...horizontalSlices {
    let angle1 = ((Float.pi * (Float(y) / Float(horizontalSlices))) * verticalScale)
        + (verticalOffset * Float.pi)
    let sin1 = sin(angle1)
    let cos1 = cos(angle1)
    for x: Int in 0...verticalSlices {
        let angle2 = ((Float.pi * 2 * (Float(x) / Float(verticalSlices))) * horizontalScale)
            + (horizontalOffset * Float.pi * 2)
        let sin2 = sin(angle2)
        let cos2 = cos(angle2)
        vertices[x + (y * (verticalSlices + 1))] =
            SIMD3(sin1 * cos2 * radius, cos1 * radius, sin1 * sin2 * radius)
    }
}

Slide 150

Slide 150 text

// Normals
var normals: [SIMD3<Float>] = []
for vertex in vertices {
    normals.append(-normalize(vertex)) // Invert to show on inside of sphere
}

// UVs
var uvCoordinates: [simd_float2] = Array(repeating: simd_float2(), count: vertices.count)
let uvHorizontalScale = clipHorizontalFov / sourceHorizontalFov
let uvHorizontalOffset = (1.0 - uvHorizontalScale) / 2.0
let uvVerticalScale = clipVerticalFov / sourceVerticalFov
let uvVerticalOffset = (1.0 - uvVerticalScale) / 2.0

for y in 0...horizontalSlices {
    for x in 0...verticalSlices {
        var uv: simd_float2 = [
            (Float(x) / Float(verticalSlices)),
            1.0 - (Float(y) / Float(horizontalSlices)),
        ]
        uv.x = (uv.x * uvHorizontalScale) + uvHorizontalOffset
        uv.y = (uv.y * uvVerticalScale) + uvVerticalOffset
        uvCoordinates[x + (y * (verticalSlices + 1))] = uv
    }
}

Slide 151

Slide 151 text

(Same code as Slide 150.)

Slide 152

Slide 152 text

// Indices
// (The extracted slide text is truncated here; the loop below is a
// reconstruction assuming the standard two-triangles-per-quad indexing
// over the vertex grid built above.)
var indices: [UInt32] = []
for y in 0..<horizontalSlices {
    for x in 0..<verticalSlices {
        let current = UInt32(x + (y * (verticalSlices + 1)))
        let next = UInt32(x + ((y + 1) * (verticalSlices + 1)))
        indices.append(contentsOf: [current + 1, current, next,
                                    current + 1, next, next + 1])
    }
}

Slide 153

Slide 153 text

(Same code as Slide 148.)

Slide 154

Slide 154 text

03 Implementing an Immersive Video viewer: playback

Slide 155

Slide 155 text

@State private var player: AVPlayer = AVPlayer()
@State private var videoMaterial: VideoMaterial?

// Build the playback scene: load the asset, read its video info, generate
// the hemisphere mesh, and map the video onto it as a VideoMaterial.
RealityView { content in
    guard let url = viewModel.videoURL else { return }
    let asset = AVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)

    guard let videoInfo = await VideoTools.getVideoInfo(asset: asset) else { return }
    viewModel.videoInfo = videoInfo
    viewModel.isSpatialVideoAvailable = videoInfo.isSpatial

    guard let (mesh, transform) = await VideoTools.makeVideoMesh(videoInfo: videoInfo) else { return }

    videoMaterial = VideoMaterial(avPlayer: player)
    guard let videoMaterial else { return }

    let videoEntity = Entity()
    videoEntity.components.set(ModelComponent(mesh: mesh, materials: [videoMaterial]))
    videoEntity.transform = transform
    content.add(videoEntity)

    player.replaceCurrentItem(with: playerItem)
    player.play()
}

Slide 156

Slide 156 text

(Same code as Slide 155.)

Slide 157

Slide 157 text

(Same code as Slide 155.)

Slide 158

Slide 158 text

(Same code as Slide 155.)

Slide 159

Slide 159 text

(Same code as Slide 155.)

Slide 160

Slide 160 text

03 Implementing an Immersive Video viewer

Slide 161

Slide 161 text

03 Implementing an Immersive Video viewer: hoping this spreads as a new way of watching video

Slide 162

Slide 162 text

Wrap up
01 Examples of spatial effects on visionOS
02 Implementing spatial effects
03 Implementing an Immersive Video viewer