ARKitを触りながら振り返る (Looking back at ARKit, hands-on)
iOSDC Japan 2022 After Talk LT1 slides
hira · October 11, 2022
Transcript
Sansan, Inc., Technology Division, Mobile Application Group · hira
Looking back at ARKit, hands-on · WWDC22 · iOSDC Japan 2022 After Talk
About me • @elu • joined Sansan, Inc. as a new graduate • iOS engineer 📱 • likes: games 🎮 and scenery 🏞
Agenda: ARKit 6 topics / Object Capture / Motion Capture / extras

ARKit 6 (WWDC22)
• ARKit has received updates every year since its first release • the iPad Pro gained a TrueDepth camera • the Pro lineup later gained a LiDAR scanner • chip performance has improved enough to allow far more complex processing
• 4K support • camera enhancements • Plane anchor improvements • motion capture improvements • Location anchor region expansion
Topics from the "Discover ARKit 6" session
• 4K support 👉 more machine power makes the heavier processing feasible
• Camera enhancements 👉 still images can be captured during an ARSession; HDR (a sketch follows below)
• Plane anchor improvements 👉 plane detection accuracy is improved
• Motion capture improvements 👉 better overall accuracy; ears can now be detected
• Location anchor region expansion 👉 more regions supported, Tokyo added
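As a minimal sketch of the HDR part of the camera enhancements, assuming the iOS 16 additions ARVideoFormat.isVideoHDRSupported and videoHDRAllowed (the helper name is made up):

import ARKit

// Hypothetical helper: opt into HDR video only when the selected format supports it.
func makeHDRWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if config.videoFormat.isVideoHDRSupported {
        config.videoHDRAllowed = true
    }
    return config
}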
4K support
• Previously, the camera image was downscaled before being processed and rendered
• there simply wasn't enough memory and compute to process 4K frames
• with the latest hardware power 💪, 4K processing is now possible
4K support: Before
Capture at 4K → average and downscale (binning) → process on a fixed interval (≈60 fps) → render
(https://developer.apple.com/videos/play/wwdc2022/)
4K support: After, 4K mode 💪
Capture at 4K → average and downscale → process on a fixed interval → render the 4K image
(https://developer.apple.com/videos/play/wwdc2022/)
• Plain video capture alone can reach 4K at 60 fps
• because ARKit layers extra processing on top, 30 fps is the limit
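To see the concrete caps on a given device, the supported video formats can be listed; a small sketch using the standard ARKit API (not from the slides):

import ARKit
import AVFoundation

// Print every world-tracking video format the device offers, including the 4K one
// (which reports 30 fps on supported devices).
for format in ARWorldTrackingConfiguration.supportedVideoFormats {
    print("\(format.imageResolution) @ \(format.framesPerSecond) fps, device: \(format.captureDeviceType.rawValue)")
}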
Plane anchor improvements
• For objects attached to a plane anchor, the plane's rotation angle, width, and height can now be recognized
(https://developer.apple.com)
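A short sketch of reading that new plane geometry, assuming the iOS 16 ARPlaneExtent API (width, height, rotationOnYAxis) that this slide appears to describe:

import ARKit

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let plane as ARPlaneAnchor in anchors {
        // iOS 16 exposes the plane's size together with its rotation around the Y axis.
        let extent = plane.planeExtent
        print("width: \(extent.width), height: \(extent.height), rotation: \(extent.rotationOnYAxis)")
    }
}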
Location anchor region expansion
• Built on a localization technique called VPS (Visual Positioning System)
• localization needs scan data of the surrounding space
• position and orientation are estimated from GPS, other sensors, and the camera image
• several more cities were added, including Tokyo
(https://developer.apple.com)
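A minimal sketch of placing a location anchor with the geotracking API (ARGeoTrackingConfiguration and ARGeoAnchor); arView is the same ARView used elsewhere in the deck, and the coordinate is a made-up example:

import ARKit
import CoreLocation

// Check whether VPS coverage is available where the user currently is.
ARGeoTrackingConfiguration.checkAvailability { available, _ in
    guard available else { return }
    arView.session.run(ARGeoTrackingConfiguration())

    // Hypothetical coordinate (around Tokyo Station), purely for illustration.
    let coordinate = CLLocationCoordinate2D(latitude: 35.6812, longitude: 139.7671)
    arView.session.add(anchor: ARGeoAnchor(coordinate: coordinate))
}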
An apology
warning: libobjc.A.dylib is being read from process memory. This indicates that LLDB could not find the on-disk shared cache for this device. This will likely reduce debugging performance.
When I tried to debug on a real device, launch took an extremely long time and the ARSession kept crashing partway through, so I couldn't prepare a demo.
• rm -r ~/Library/Developer/Xcode/iOS DeviceSupport
• restored the device from the iOS beta back to the release version
• removed the Xcode beta so only the release version remained
• 👆 somehow this resolved it
Object Capture (WWDC21)
Object Capture
• Added to RealityKit at WWDC21
• photograph an object from many angles
• the shots are combined into a 3D object
• requirements: a LiDAR-equipped device on a recent iOS for capture; an M1 Mac or a Mac with an AMD GPU for reconstruction
(https://developer.apple.com)
Object Capture: ARKit and RealityKit
• ARKit is the underlying framework: scanning and displaying the object
• RealityKit integrates the virtual with the real: object manipulation, processing and computation, 3D rendering
(https://developer.apple.com)
Object Capture
• Apple provides an official sample
• let's look at how it captures and processes the photos
/// - Tag: WillBeginCapture
func photoOutput(_ output: AVCapturePhotoOutput,
                 willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
    maxPhotoProcessingTime = resolvedSettings.photoProcessingTimeRange.start
        + resolvedSettings.photoProcessingTimeRange.duration
}

/// - Tag: WillCapturePhoto
func photoOutput(_ output: AVCapturePhotoOutput,
                 willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
    willCapturePhotoAnimation()

    // Retrieve the gravity vector at capture time.
    if motionManager.isDeviceMotionActive {
        gravity = motionManager.deviceMotion?.gravity
        logger.log("Captured gravity vector: \(String(describing: self.gravity))")
    }

    guard let maxPhotoProcessingTime = maxPhotoProcessingTime else { return }

    // Show a spinner if processing time exceeds one second.
    let oneSecond = CMTime(seconds: 1, preferredTimescale: 1)
    if maxPhotoProcessingTime > oneSecond {
        photoProcessingHandler(true)
    }
}

Saves the gravity vector at capture time.
/// - Tag: DidFinishProcessingPhoto
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    photoProcessingHandler(false)

    if let error = error {
        print("Error capturing photo: \(error)")
        photoData = nil
    } else {
        // Cache the HEIF representation of the data.
        photoData = photo
    }

    // Cache the depth data, if it exists, as a disparity map.
    logger.log("DidFinishProcessingPhoto: photo=\(String(describing: photo))")
    if let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32),
       let colorSpace = CGColorSpace(name: CGColorSpace.linearGray) {
        let depthImage = CIImage(cvImageBuffer: depthData.depthDataMap,
                                 options: [.auxiliaryDisparity: true])
        depthMapData = context.tiffRepresentation(of: depthImage,
                                                  format: .Lf,
                                                  colorSpace: colorSpace,
                                                  options: [.disparityImage: depthImage])
    } else {
        logger.error("colorSpace .linearGray not available... can't save depth data!")
        depthMapData = nil
    }
}

Saves the depth map.
if let photoOutputConnection = self.photoOutput.connection(with: .video) {
    photoOutputConnection.videoOrientation = videoPreviewLayerOrientation
}

var photoSettings = AVCapturePhotoSettings()

// Request HEIF photos if supported and enable high-resolution photos.
if self.photoOutput.availablePhotoCodecTypes.contains(.hevc) {
    photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
}

// Turn off the flash. The app relies on ambient lighting to avoid specular highlights.
if self.videoDeviceInput!.device.isFlashAvailable {
    photoSettings.flashMode = .off
}

// Turn on high-resolution, depth data, and quality prioritization mode.
photoSettings.isHighResolutionPhotoEnabled = true
photoSettings.isDepthDataDeliveryEnabled = self.photoOutput.isDepthDataDeliveryEnabled
photoSettings.photoQualityPrioritization = self.photoQualityPrioritizationMode

// Request that the camera embed a depth map into the HEIC output file.
photoSettings.embedsDepthDataInPhoto = true

// Specify a preview image.
if !photoSettings.__availablePreviewPhotoPixelFormatTypes.isEmpty {
    photoSettings.previewPhotoFormat = [
        kCVPixelBufferPixelFormatTypeKey: photoSettings.__availablePreviewPhotoPixelFormatTypes.first!,
        kCVPixelBufferWidthKey: self.previewWidth,
        kCVPixelBufferHeightKey: self.previewHeight
    ] as [String: Any]
    logger.log("Found available previewPhotoFormat: \(String(describing: photoSettings.previewPhotoFormat))")
} else {
    logger.warning("Can't find preview photo formats! Not setting...")
}

// Tell the camera to embed a preview image in the output file.
photoSettings.embeddedThumbnailPhotoFormat = [
    AVVideoCodecKey: AVVideoCodecType.jpeg,
    AVVideoWidthKey: self.thumbnailWidth,
    AVVideoHeightKey: self.thumbnailHeight
]

DispatchQueue.main.async {
    self.isHighQualityMode = photoSettings.isHighResolutionPhotoEnabled
        && photoSettings.photoQualityPrioritization == .quality
}

Camera and photo settings.
Each shot stores the captured image, the depth map, and the gravity vector (X, Y, Z).
let inputFolderUrl = URL(fileURLWithPath: inputFolder, isDirectory: true)
let configuration = makeConfigurationFromArguments()
logger.log("Using configuration: \(String(describing: configuration))")

// Try to create the session, or else exit.
var maybeSession: PhotogrammetrySession? = nil
do {
    maybeSession = try PhotogrammetrySession(input: inputFolderUrl,
                                             configuration: configuration)
    logger.log("Successfully created session.")
} catch {
    logger.error("Error creating session: \(String(describing: error))")
    Foundation.exit(1)
}

Just hand it the folder of images.
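After creating the session, the sample kicks off reconstruction and listens to its output stream. A compressed sketch of that flow using the public PhotogrammetrySession API rather than the sample verbatim (the output path is made up, and session is the one created above):

import Foundation
import RealityKit

guard let session = maybeSession else { Foundation.exit(1) }

// Ask for a .usdz model at medium detail; the destination path is only an example.
let request = PhotogrammetrySession.Request.modelFile(
    url: URL(fileURLWithPath: "/tmp/model.usdz"),
    detail: .medium)

do {
    try session.process(requests: [request])
} catch {
    logger.error("Error starting processing: \(String(describing: error))")
    Foundation.exit(1)
}

// Results arrive as an async stream of progress updates, errors, and completed requests.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(fraction)")
        case .requestComplete(_, let result):
            print("Finished: \(String(describing: result))")
        case .requestError(_, let error):
            print("Request failed: \(error)")
        case .processingComplete:
            print("All requests complete")
        default:
            break
        }
    }
}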
let entity = try! ModelEntity.load(named: "test.usdz")
let anchorEntity = AnchorEntity(plane: .any)
anchorEntity.addChild(entity)
arView.scene.addAnchor(anchorEntity)

Adding the Entity to the scene.
• A 3D object is built from a set of photos
• the more image information there is, the better the accuracy
• so what changed now that ARKit supports 4K capture?
let config = ARWorldTrackingConfiguration()

// Returns the highest-quality format the device can handle.
if let hiResCaptureVideoFormat =
    ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
    config.videoFormat = hiResCaptureVideoFormat
}

// Returns a format only when the device can handle 4K.
if let res4kCaptureVideoFormat =
    ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
    config.videoFormat = res4kCaptureVideoFormat
}

Video formats for the ARSession.
arView.session.captureHighResolutionFrame { arFrame, error in
    guard let imageBuffer = arFrame?.capturedImage else { return }

    let ciImage = CIImage(cvPixelBuffer: imageBuffer)
    let w = CGFloat(CVPixelBufferGetWidth(imageBuffer))
    let h = CGFloat(CVPixelBufferGetHeight(imageBuffer))
    let rect = CGRect(x: 0, y: 0, width: w, height: h)
    let context = CIContext()

    guard let cgImage = context.createCGImage(ciImage, from: rect) else { return }
    // rotated(by:) is a custom UIImage extension from the talk's project, not a UIKit API.
    let uiimage = UIImage(cgImage: cgImage).rotated(by: 90.0 * CGFloat.pi / 180)
    UIImageWriteToSavedPhotosAlbum(uiimage, self, nil, nil)
}

Grabbing a frame during the ARSession. ARSession takes care of the settings for you, and ARKit's methods are easy to follow.
Motion Capture (WWDC19)
Motion Capture
• A feature introduced with ARKit at WWDC19
• pose estimation from 2D images plus depth information from the LiDAR scanner
• the estimated joint data can be read from an ARAnchor

ARAnchor
• The position and orientation of a real-world object as seen from the virtual space
• acts as a coordinate point in the world coordinate system

Kinds of ARAnchor
• ARPlaneAnchor: position and shape of a plane detected in the real world
• ARImageAnchor: a pre-registered reference image (AR marker)
• ARObjectAnchor: a physical object detected from a scanned polygon mesh
• ARBodyAnchor: a person's joint data
• ARFaceAnchor: the parts and orientation of a person's face
• ARMeshAnchor, ARGeoAnchor, ARAppClipCodeAnchor, …
let entity = try! ModelEntity.load(named: "test.usdz")
let anchorEntity = AnchorEntity(plane: .any)
anchorEntity.addChild(entity)
arView.scene.addAnchor(anchorEntity)

Adding the Entity to the scene.

[Architecture diagrams: an ARSession with its Configuration produces ARFrames and ARAnchors; these drive either ARSCNView (SceneKit) or ARView (RealityKit), where a Scene holds AnchorEntity, ModelEntity, and Entity nodes.]
Motion Capture: how do you run it?
• Apple has an official sample, "Capturing Body Motion in 3D"
• prepare a suitable model
• start the ARSession with an ARBodyTrackingConfiguration
• load the model
Preparing the model
let configuration = ARBodyTrackingConfiguration()
arView.session.run(configuration)

arView.scene.addAnchor(characterAnchor)

if let entity = try? Entity.loadBodyTracked(named: "character/robot") {
    self.character = entity
}

Start the session with body tracking, place the model's anchor in the Scene, and load the model as a BodyTrackedEntity.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors {
        guard let bodyAnchor = anchor as? ARBodyAnchor else { continue }

        let bodyPosition = simd_make_float3(bodyAnchor.transform.columns.3)
        characterAnchor.position = bodyPosition + characterOffset
        characterAnchor.orientation = Transform(matrix: bodyAnchor.transform).rotation

        if let character = character, character.parent == nil {
            characterAnchor.addChild(character)
        }
    }
}

Connecting the model to the ARBodyAnchor: the joint names already match on the model side, so you just overlay it on the anchor.
Extras
• People Occlusion • RoomPlan
People Occlusion
• Uses depth information to work out whether a person is in front of or behind virtual content
• ARWorldTrackingConfiguration
• config.frameSemantics = [.personSegmentationWithDepth]
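Pieced together as a runnable sketch of that configuration, with a stock availability check added (supportsFrameSemantics is standard ARKit; arView is assumed from the surrounding slides):

import ARKit

let config = ARWorldTrackingConfiguration()
// Person segmentation with depth is only available on newer chips, so check first.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
arView.session.run(config)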
let cameraAnchor = AnchorEntity(.camera)
self.scene.addAnchor(cameraAnchor)
cameraAnchor.addChild(background)
background.position = [0, 0, -5]

Places the background 5 m behind the camera.
Because the person's distance from the camera is recognized, they disappear behind the background once they step more than 5 m back.
• With Zoom's virtual background, furniture behind you occasionally bleeds into the frame
• here the person's depth is actually recognized; it isn't just person extraction from a flat image
• this year's WWDC also had a topic on turning the iPhone into a webcam
• looking forward to future updates
RoomPlan
• Floor mapping with the camera and LiDAR, built on top of ARKit
• a 3D floor plan of a room can be produced with astonishingly little code
• Apple's sample is available: "Create a 3D model of an interior room by guiding the user through an AR experience"
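The finished scan comes back through a RoomCaptureViewDelegate once capture stops. A minimal sketch of that receiving side to pair with the capture snippet that follows, based on the public RoomPlan API (the class name and export path are made up):

import UIKit
import RoomPlan

final class RoomScanViewController: UIViewController, RoomCaptureViewDelegate {
    // Assumes roomCaptureView.delegate = self was set when the view was created.

    // Returning true lets RoomPlan run its default post-processing on the raw scan data.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
        return true
    }

    // Called once post-processing has produced a CapturedRoom.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        guard error == nil else { return }
        // Export the room as USDZ; the destination is only an example.
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("room.usdz")
        try? processedResult.export(to: url)
    }
}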
let roomCaptureView = RoomCaptureView(frame: view.frame)
view.addSubview(roomCaptureView)

roomCaptureView.captureSession.run(configuration: .init())
roomCaptureView.captureSession.stop()

When the scan finishes, the room's captured data is delivered through a delegate.

ARKit wrap-up
• The feature set has matured, and simple things can now be tried with very little setup
• the range of what's possible has widened, e.g. driving a VTuber-style character model with just an iPhone
• even casual experiments come out accurate enough to feel genuinely high quality, which makes it fun. If you've never touched ARKit, please give it a try!

Thanks