Slide 1

Slide 1 text

Build Your Own Camera on macOS — Core Media I/O Extensions. Satoshi Hattori, iOSDC, @shmdevelop

Slide 2

Slide 2 text

A talk about implementing virtual cameras on macOS.

Slide 3

Slide 3 text

"The material my self of six months ago would have cried tears of joy over" 😂

Slide 4

Slide 4 text

Satoshi Hattori — CyberAgent, CyberAI Productions; xR Guild leader; AR Next Expert. Recent interests: visionOS, Vision Pro, Unreal Engine, NDI, video workflows built around iPhone.

Slide 5

Slide 5 text

https://github.com/satoshi0212/visionOS_30Days — Aside: I did the visionOS_30Days challenge.

Slide 6

Slide 6 text

The emotional pull of augmenting reality

Slide 7

Slide 7 text

I love virtual cameras because they augment reality, they implement camera functionality in software, and they can hook into hardware too.

Slide 8

Slide 8 text

First, thanks are in order! https://github.com/noppefoxwolf/cifilter.cam

Slide 9

Slide 9 text

First, thanks are in order! https://qiita.com/fuziki/items/… (article ID lost in the slide export)

Slide 10

Slide 10 text

Agenda: About virtual cameras on macOS / Build a simple implementation and run it / Displaying camera video and compositing onto it / Further feature extensions / Demo

Slide 11

Slide 11 text

(Same agenda as Slide 10.)

Slide 12

Slide 12 text

About virtual cameras on macOS

Slide 13

Slide 13 text

About virtual cameras on macOS — before macOS 12.3: Device Abstraction Layer (DAL) Plug-Ins

Slide 14

Slide 14 text

About virtual cameras on macOS — https://speakerdeck.com/satoshi0212/macos… (slug partially lost in the slide export)

Slide 15

Slide 15 text

About virtual cameras on macOS — https://developer.apple.com/videos/play/wwdc2022/… (session number lost in the slide export)

Slide 16

Slide 16 text

About virtual cameras on macOS — placing an object to check positioning. https://developer.apple.com/documentation/coremediaio

Slide 17

Slide 17 text

(Same as Slide 16.)

Slide 18

Slide 18 text

About virtual cameras on macOS — macOS 12.3 and later: Core Media I/O Extensions
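For orientation, the extension target's entry point is tiny. A minimal sketch of what the template's main.swift looks like — assuming an ExtensionProviderSource class like the one Apple's Camera Extension template generates; this is macOS-only:

```
import Foundation
import CoreMediaIO

// Entry point of the Camera Extension target: create the provider
// source, hand its provider to the system, and keep the process alive.
// ExtensionProviderSource is the template-generated provider class.
let providerSource = ExtensionProviderSource(clientQueue: nil)
CMIOExtensionProvider.startService(provider: providerSource.provider)
CFRunLoopRun()
```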

Slide 19

Slide 19 text

About virtual cameras on macOS — https://developer.apple.com/documentation/coremediaio/creating_a_camera_extension_with_core_media_i_o

Slide 20

Slide 20 text

Host app / Extension

Slide 21

Slide 21 text

https://developer.apple.com/documentation/coremediaio/creating_a_camera_extension_with_core_media_i_o

Slide 22

Slide 22 text

(Same as Slide 21.)

Slide 23

Slide 23 text

Agenda: About virtual cameras on macOS / Build a simple implementation and run it / Displaying camera video and compositing onto it / Further feature extensions / Demo

Slide 24

Slide 24 text

(Same agenda as Slide 23.)

Slide 25

Slide 25 text

Build a simple implementation and run it

Slide 26

Slide 26 text

Build a simple implementation and run it: create a new project; add the System Extension capability and App Groups; add the Camera Extension target and configure its App Group; add Install and Uninstall buttons; build and place the .app under /Applications; run it and grant the security permissions.
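The App Groups step above boils down to both targets sharing the same entitlement. A sketch of the relevant fragment of each target's .entitlements file, using a hypothetical group ID (the real one must match the App Group you configure in Xcode for both targets):

```
<!-- Hypothetical App Group ID; use the same group in the host app
     and the extension so they can share UserDefaults. -->
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.application-groups</key>
<array>
    <string>group.example.MyCreativeCamera</string>
</array>
```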

Slide 27

Slide 27 text

No content

Slide 28

Slide 28 text

No content

Slide 29

Slide 29 text

No content

Slide 30

Slide 30 text

No content

Slide 31

Slide 31 text

No content

Slide 32

Slide 32 text

No content

Slide 33

Slide 33 text

No content

Slide 34

Slide 34 text

No content

Slide 35

Slide 35 text

No content

Slide 36

Slide 36 text

No content

Slide 37

Slide 37 text

No content

Slide 38

Slide 38 text

No content

Slide 39

Slide 39 text

import SwiftUI
import SystemExtensions

struct ContentView: View {
    let extensionID: String = "tokyo.shmdevelopment.MyCreativeCamera.Extension"

    var body: some View {
        HStack {
            Button {
                let activationRequest = OSSystemExtensionRequest.activationRequest(
                    forExtensionWithIdentifier: extensionID, queue: .main)
                OSSystemExtensionManager.shared.submitRequest(activationRequest)
            } label: {
                Text("Install")
            }
            Button {
                let deactivationRequest = OSSystemExtensionRequest.deactivationRequest(
                    forExtensionWithIdentifier: extensionID, queue: .main)
                OSSystemExtensionManager.shared.submitRequest(deactivationRequest)
            } label: {
                Text("Uninstall")
            }
        }
        .padding()
    }
}

ContentView.swift

Slide 40

Slide 40 text

(Same code as Slide 39.)

Slide 41

Slide 41 text

No content

Slide 42

Slide 42 text

No content

Slide 43

Slide 43 text

No content

Slide 44

Slide 44 text

No content

Slide 45

Slide 45 text

No content

Slide 46

Slide 46 text

🎉

Slide 47

Slide 47 text

Build a simple implementation and run it: updates after the first install take a little extra ceremony.
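When an update doesn't take, the systemextensionsctl tool (macOS-only, run from Terminal) helps inspect and clear installed extensions; note that `reset` requires SIP to be disabled, so it is a development-machine-only move:

```
# List system extensions currently known to the OS, with their state
$ systemextensionsctl list

# Wipe all installed system extensions
# (requires SIP disabled; development machines only)
$ systemextensionsctl reset
```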

Slide 48

Slide 48 text

No content

Slide 49

Slide 49 text

Agenda: About virtual cameras on macOS / Build a simple implementation and run it / Displaying camera video and compositing onto it / Further feature extensions / Demo

Slide 50

Slide 50 text

(Same agenda as Slide 49.)

Slide 51

Slide 51 text

Displaying camera video and compositing onto it

Slide 52

Slide 52 text

Displaying camera video

Slide 53

Slide 53 text

On the Extension side

Slide 54

Slide 54 text

No content

Slide 55

Slide 55 text

No content

Slide 56

Slide 56 text

No content

Slide 57

Slide 57 text

No content

Slide 58

Slide 58 text

import AVFoundation
import CoreImage
import CoreImage.CIFilterBuiltins

ExtensionProvider.swift

Slide 59

Slide 59 text

extension AVCaptureDevice.DiscoverySession {
    static func faceTimeDevice() -> AVCaptureDevice {
        let discoverySession = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.builtInWideAngleCamera],
            mediaType: .video,
            position: .unspecified
        )
        let devices = discoverySession.devices
        let device = devices.filter({ $0.manufacturer == "Apple Inc." && $0.modelID.hasPrefix("FaceTime ") }).first!
        return device
    }
}

ExtensionProvider.swift

Slide 60

Slide 60 text

let input: AVCaptureDeviceInput = {
    let device = AVCaptureDevice.DiscoverySession.faceTimeDevice()
    return try! AVCaptureDeviceInput(device: device)
}()

let output: AVCaptureVideoDataOutput = {
    let output = AVCaptureVideoDataOutput()
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    return output
}()

lazy var session: AVCaptureSession = {
    var session = AVCaptureSession()
    session.addInput(input)
    output.setSampleBufferDelegate(self, queue: .main)
    session.addOutput(output)
    return session
}()

ExtensionProvider.swift

Slide 61

Slide 61 text

(Same code as Slide 60.)

Slide 62

Slide 62 text

func startStreaming() {
    guard let _ = _bufferPool else { return }
    _streamingCounter += 1
    _timer = DispatchSource.makeTimerSource(flags: .strict, queue: _timerQueue)
    _timer!.schedule(deadline: .now(), repeating: 1.0 / Double(kFrameRate), leeway: .seconds(0))
    _timer!.setEventHandler {
        var err: OSStatus = 0
        let now = CMClockGetTime(CMClockGetHostTimeClock())
        var pixelBuffer: CVPixelBuffer?
        err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, self._bufferPool, self._bufferAuxAttributes, &pixelBuffer)
        if err != 0 {
            os_log(.error, "out of pixel buffers \(err)")
        }
        if let pixelBuffer = pixelBuffer {
            CVPixelBufferLockBaseAddress(pixelBuffer, [])
            var bufferPtr = CVPixelBufferGetBaseAddress(pixelBuffer)!
            let width = CVPixelBufferGetWidth(pixelBuffer)
            let height = CVPixelBufferGetHeight(pixelBuffer)
            let rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
            memset(bufferPtr, 0, rowBytes * height)
            let whiteStripeStartRow = self._whiteStripeStartRow
            if self._whiteStripeIsAscending {
                self._whiteStripeStartRow = whiteStripeStartRow - 1
                self._whiteStripeIsAscending = self._whiteStripeStartRow > 0
            } else {
                self._whiteStripeStartRow = whiteStripeStartRow + 1
                self._whiteStripeIsAscending = self._whiteStripeStartRow >= (height - kWhiteStripeHeight)
            }
            bufferPtr += rowBytes * Int(whiteStripeStartRow)
            for _ in 0..  // … (the rest of the handler is cut off in the slide export)

Slide 63

Slide 63 text

(Same startStreaming() code as Slide 62.)

Slide 64

Slide 64 text

(Same startStreaming() code as Slide 62.)

Slide 65

Slide 65 text

self._streamSource.stream.send(sbuf,
                               discontinuity: [],
                               hostTimeInNanoseconds: UInt64(timingInfo.presentationTimeStamp.seconds * Double(NSEC_PER_SEC)))

ExtensionProvider.swift
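The hostTimeInNanoseconds argument is just the presentation timestamp converted to whole nanoseconds. That arithmetic can be sketched in isolation (1_000_000_000 is the value of NSEC_PER_SEC; the function name here is illustrative, not from the project):

```swift
import Foundation

// Convert a presentation timestamp given in seconds into the whole
// nanoseconds that send(_:discontinuity:hostTimeInNanoseconds:) expects.
func hostTimeInNanoseconds(fromSeconds seconds: Double) -> UInt64 {
    UInt64(seconds * 1_000_000_000)
}

print(hostTimeInNanoseconds(fromSeconds: 1.5))  // 1500000000
```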

Slide 66

Slide 66 text

(Same startStreaming() code as Slide 62.)

Slide 67

Slide 67 text

func startStreaming() {
    session.startRunning()
}

ExtensionProvider.swift (class ExtensionDeviceSource)

Slide 68

Slide 68 text

extension ExtensionDeviceSource: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        let inputImage = CIImage(cvImageBuffer: sampleBuffer.imageBuffer!)
        let filter = CIFilter.sepiaTone()
        filter.inputImage = inputImage
        ciContext.render(filter.outputImage!, to: sampleBuffer.imageBuffer!)
        _streamSource.stream.send(sampleBuffer, discontinuity: .time, hostTimeInNanoseconds: 0)
    }

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        _streamSource.stream.send(sampleBuffer, discontinuity: .sampleDropped, hostTimeInNanoseconds: 0)
    }
}

ExtensionProvider.swift

Slide 69

Slide 69 text

(Same code as Slide 68.)

Slide 70

Slide 70 text

Showing the camera video

Slide 71

Slide 71 text

No content

Slide 72

Slide 72 text

🎉

Slide 73

Slide 73 text

Dynamically compositing an image onto the camera video

Slide 74

Slide 74 text

Code shared by the host app and the Extension

Slide 75

Slide 75 text

// lineHeight() and ciImage are helper extensions defined elsewhere in the project
static func generate(text: String, size: CGFloat = 100, imageSize: NSSize = NSSize(width: 1920, height: 1080)) -> CIImage? {
    let font = NSFont(name: "HiraginoSans-W9", size: size) ?? NSFont.systemFont(ofSize: size)
    let image = NSImage(size: imageSize, flipped: false) { (rect) -> Bool in
        let paragraphStyle = NSMutableParagraphStyle()
        paragraphStyle.alignment = .center
        paragraphStyle.lineBreakMode = .byCharWrapping
        let numberOfLines: CGFloat = CGFloat(text.split(separator: "\n").count)
        let rectangle = NSRect(x: 0,
                               y: imageSize.height - font.lineHeight() * numberOfLines,
                               width: imageSize.width,
                               height: font.lineHeight() * numberOfLines)
        let textAttributes = [
            .strokeColor: NSColor.black,
            .foregroundColor: NSColor.white,
            .strokeWidth: -1,
            .font: font,
            .paragraphStyle: paragraphStyle
        ] as [NSAttributedString.Key: Any]
        (text as NSString).draw(in: rectangle, withAttributes: textAttributes)
        return true
    }
    return image.ciImage
}

Utility.swift

Slide 76

Slide 76 text

On the Extension side

Slide 77

Slide 77 text

private var textImage: CIImage?

// Create the image once during initialization, just to confirm it displays
textImage = Utility.generate(text: "Hello 仮想カメラ!")

ExtensionProvider.swift

Slide 78

Slide 78 text

private let filterComposite = CIFilter(name: "CISourceOverCompositing")

private func compose(bgImage: CIImage, overlayImage: CIImage?) -> CIImage? {
    guard let filterComposite = filterComposite,
          let overlayImage = overlayImage else { return bgImage }
    filterComposite.setValue(overlayImage, forKeyPath: kCIInputImageKey)
    filterComposite.setValue(bgImage, forKeyPath: kCIInputBackgroundImageKey)
    return filterComposite.outputImage
}

ExtensionProvider.swift

Slide 79

Slide 79 text

ExtensionProvider.swift

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    let inputImage = CIImage(cvImageBuffer: sampleBuffer.imageBuffer!)
    let filter = CIFilter.sepiaTone()
    filter.inputImage = inputImage
    ciContext.render(filter.outputImage!, to: sampleBuffer.imageBuffer!)
    _streamSource.stream.send(sampleBuffer, discontinuity: .time, hostTimeInNanoseconds: 0)
}

Slide 80

Slide 80 text

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    let inputImage = CIImage(cvImageBuffer: sampleBuffer.imageBuffer!)
    if let compositedImage = compose(bgImage: inputImage, overlayImage: textImage) {
        ciContext.render(compositedImage, to: sampleBuffer.imageBuffer!)
    }
    _streamSource.stream.send(sampleBuffer, discontinuity: .time, hostTimeInNanoseconds: 0)
}

ExtensionProvider.swift

Slide 81

Slide 81 text

No content

Slide 82

Slide 82 text

🎉

Slide 83

Slide 83 text

Agenda: About virtual cameras on macOS / Build a simple implementation and run it / Displaying camera video and compositing onto it / Further feature extensions / Demo

Slide 84

Slide 84 text

(Same agenda as Slide 83.)

Slide 85

Slide 85 text

Further feature extensions

Slide 86

Slide 86 text

Hooking up the host app

Slide 87

Slide 87 text

Host app / Extension

Slide 88

Slide 88 text

Host app / Extension — UserDefaults in the same App Group

Slide 89

Slide 89 text

This time we add and use the Defaults library: https://github.com/sindresorhus/Defaults
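Under the hood this is App Group–scoped UserDefaults. A minimal sketch without the library, using a hypothetical suite name (the real suite must match the App Group entitlement shared by the host app and the extension):

```swift
import Foundation

// Hypothetical App Group ID; in the real project, both targets must
// open the suite that matches their shared App Group entitlement.
let suiteName = "group.example.MyCreativeCamera"

// What the Defaults library wraps: App Group–scoped UserDefaults that
// the host app writes and the extension reads (each side constructs
// UserDefaults(suiteName:) with the same name).
let shared = UserDefaults(suiteName: suiteName)!
shared.set("Hello", forKey: "message")
shared.set(false, forKey: "isBypass")

print(shared.string(forKey: "message") ?? "")  // Hello
```

The Defaults library adds type-safe keys and async observation on top of exactly this mechanism.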

Slide 90

Slide 90 text

No content

Slide 91

Slide 91 text

No content

Slide 92

Slide 92 text

No content

Slide 93

Slide 93 text

No content

Slide 94

Slide 94 text

No content

Slide 95

Slide 95 text

Code shared by the host app and the Extension

Slide 96

Slide 96 text

import Defaults

extension Defaults.Keys {
    public static let isBypass = Key("isBypass", default: false, suite: defaultsSuite)
    public static let message = Key("message", default: "Hello World", suite: defaultsSuite)
}

DefaultsKeys.swift

Slide 97

Slide 97 text

On the host app side

Slide 98

Slide 98 text

import SwiftUI
import SystemExtensions
import Defaults

struct ContentView: View {
    @Default(.message) var message

    var body: some View {
        VStack {
            HStack {
                Button {
                    let activationRequest = OSSystemExtensionRequest.activationRequest(
                        forExtensionWithIdentifier: extensionID, queue: .main)
                    OSSystemExtensionManager.shared.submitRequest(activationRequest)
                } label: {
                    Text("Install")
                }
                Button {
                    let deactivationRequest = OSSystemExtensionRequest.deactivationRequest(
                        forExtensionWithIdentifier: extensionID, queue: .main)
                    OSSystemExtensionManager.shared.submitRequest(deactivationRequest)
                } label: {
                    Text("Uninstall")
                }
                Defaults.Toggle("Bypass", key: .isBypass)
            }
            .padding()
            HStack {
                TextEditor(text: $message)
                    .font(.system(size: 16))
                    .border(Color.gray, width: 1)
            }
            .padding()
        }
    }
}

ContentView.swift

Slide 99

Slide 99 text

(Same code as Slide 98.)

Slide 100

Slide 100 text

On the Extension side

Slide 101

Slide 101 text

private func observeSettings() {
    Task {
        for await isBypass in Defaults.updates(.isBypass) {
            self.isBypass = isBypass
        }
    }
    Task {
        for await message in Defaults.updates(.message) {
            textImage = Utility.generate(text: message)
        }
    }
}

ExtensionProvider.swift

Slide 102

Slide 102 text

(Same code as Slide 101.)

Slide 103

Slide 103 text

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if isBypass {
        _streamSource.stream.send(sampleBuffer, discontinuity: .time, hostTimeInNanoseconds: 0)
        return
    }
    let inputImage = CIImage(cvImageBuffer: sampleBuffer.imageBuffer!)
    if let compositedImage = compose(bgImage: inputImage, overlayImage: textImage) {
        ciContext.render(compositedImage, to: sampleBuffer.imageBuffer!)
    }
    _streamSource.stream.send(sampleBuffer, discontinuity: .time, hostTimeInNanoseconds: 0)
}

ExtensionProvider.swift

Slide 104

Slide 104 text

(Same code as Slide 103.)

Slide 105

Slide 105 text

Reflecting text input

Slide 106

Slide 106 text

No content

Slide 107

Slide 107 text

🎉

Slide 108

Slide 108 text

Further feature extensions

Slide 109

Slide 109 text

Speech recognition

Slide 110

Slide 110 text

On the host app side

Slide 111

Slide 111 text

No content

Slide 112

Slide 112 text

No content

Slide 113

Slide 113 text

No content

Slide 114

Slide 114 text

No content

Slide 115

Slide 115 text

class ViewModel: NSObject, ObservableObject, SpeechRecognizerDelegate {
    @Default(.message) var message: String
    let speechRecognizer = SpeechRecognizer()

    func setUpSpeechRecognizer() {
        speechRecognizer.setup(delegate: self)
    }

    func startSpeechRecognize() {
        guard !speechRecognizer.started else { return }
        speechRecognizer.start()
    }

    func stopSpeechRecognize() {
        speechRecognizer.stop()
    }

    func speechTextUpdate(value: String) {
        message = value
    }
}

ContentView.swift

Slide 116

Slide 116 text

import AVFoundation
import Speech

public class SpeechRecognizer {
    (...)
    public func setup(delegate: SpeechRecognizerDelegate) {
        self.delegate = delegate
        (...)
        recognizer = SFSpeechRecognizer(locale: Locale(identifier: "ja-JP"))
    }
    (...)
    private func startRecognitionTask() {
        recognitionTask = recognizer.recognitionTask(with: recognitionRequest) { result, error in
            var isFinal = false
            if let foundResult = result {
                self.previousResult = foundResult
                isFinal = foundResult.isFinal
                print(foundResult.bestTranscription.formattedString)
                self.delegate?.speechTextUpdate(value: foundResult.bestTranscription.formattedString)
            }
            if error != nil {
                print("\(error!.localizedDescription)")
                self.stop()
            }
            if isFinal {
                print("FINAL RESULT reached")
                self.stop()
            }
        }
    }
}

SpeechRecognizer.swift (excerpt)

Slide 117

Slide 117 text

(Same code as Slide 116.)

Slide 118

Slide 118 text

No content

Slide 119

Slide 119 text

🎉

Slide 120

Slide 120 text

Agenda: About virtual cameras on macOS / Build a simple implementation and run it / Displaying camera video and compositing onto it / Further feature extensions / Demo

Slide 121

Slide 121 text

(Same agenda as Slide 120.)

Slide 122

Slide 122 text

Demo

Slide 123

Slide 123 text

When you get stuck: when updating the app, uninstalling and reinstalling first is recommended; rebooting the OS fixes things surprisingly often; did you remember to enable the App Sandbox checkbox? Did you remember to add the required entries to Info.plist?

Slide 124

Slide 124 text

Summary — About virtual cameras on macOS: DAL → Core Media I/O Extensions. Build a simple implementation and run it: adding the Camera Extension target, and the update ceremony. Displaying camera video and compositing onto it: hooking into the physical camera, processing in software. Further feature extensions: linking with a control app, speech recognition.

Slide 125

Slide 125 text

GiveAway: I've published today's demo implementation on GitHub — https://github.com/satoshi0212/MyCreativeCamera