
20170829_iOSLT_機械学習とVision.framework

shtnkgm
September 20, 2017


An explanation and demo of the Vision framework added in iOS 11, together with some machine-learning fundamentals.


Transcript

  1. What is a neural network?
     - A type of machine-learning method
     - A mathematical model of the network of neurons in the human brain
     - Abbreviated NN (variants include DNN [1], RNN [2], CNN [3])

     [1] Deep Neural Network
     [2] Recurrent Neural Network
     [3] Convolutional Neural Network
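The "mathematical model" mentioned in the slide above can be illustrated with a single artificial neuron: a weighted sum of inputs passed through a nonlinear activation. The following is a minimal sketch in Swift; the weights, bias, and choice of sigmoid activation are illustrative, not taken from the deck.

```swift
import Foundation

// Sigmoid activation: squashes any real number into the range (0, 1).
func sigmoid(_ x: Double) -> Double {
    return 1.0 / (1.0 + exp(-x))
}

// A single artificial neuron: weighted sum of inputs plus a bias,
// passed through the activation function.
func neuron(inputs: [Double], weights: [Double], bias: Double) -> Double {
    let weightedSum = zip(inputs, weights).map(*).reduce(0, +) + bias
    return sigmoid(weightedSum)
}

// Illustrative weights only; a real network learns these during training.
let output = neuron(inputs: [1.0, 0.0], weights: [0.5, -0.5], bias: 0.0)
print(output) // a value between 0 and 1
```

A full network chains many such neurons in layers; frameworks like Core ML ship the trained weights so the device only runs this forward pass.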
  2. What Vision can recognize (1)
     - Face Detection and Recognition
     - Barcode Detection
     - Image Alignment Analysis
     - Text Detection
     - Horizon Detection
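Each detector in the list above follows the same request/handler pattern in Vision. As a minimal sketch using face detection (the `cgImage` input is assumed to exist; error handling is kept deliberately minimal):

```swift
import Vision
import CoreGraphics

// Sketch: run face detection on an already-obtained CGImage.
func detectFaces(in cgImage: CGImage) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is in normalized coordinates (0...1, origin at bottom-left)
            print("Face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Swapping `VNDetectFaceRectanglesRequest` for, say, `VNDetectBarcodesRequest` or `VNDetectTextRectanglesRequest` gives the other detectors with the same structure.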
  3. Setting up the camera capture session:

```swift
private func startCapture() {
    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto

    // Specify the input
    let captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    guard let input = try? AVCaptureeDeviceInput(device: captureDevice) else { return }
    guard captureSession.canAddInput(input) else { return }
    captureSession.addInput(input)

    // Specify the output
    let output: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
    output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoQueue"))
    guard captureSession.canAddOutput(output) else { return }
    captureSession.addOutput(output)

    // Configure the preview layer
    guard let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) else { return }
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    previewLayer.frame = view.bounds
    view.layer.insertSublayer(previewLayer, at: 0)

    // Start capturing
    captureSession.startRunning()
}
```
  4. Delegate called for every captured frame:

```swift
extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        // Convert the CMSampleBuffer to a CVPixelBuffer
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // The Vision.framework work (the image-recognition part) goes here
    }
}
```
  5. Create the image-recognition request:

```swift
// Create the image-recognition request (arguments are the model and a completion handler)
let request = VNCoreMLRequest(model: model) { [weak self] (request: VNRequest, error: Error?) in
    guard let results = request.results as? [VNClassificationObservation] else { return }
    // Show the top three classifications with their confidence values.
    // An identifier can contain several comma-separated labels, so keep only the first one.
    let displayText = results.prefix(3)
        .flatMap { "\(Int($0.confidence * 100))% \($0.identifier.components(separatedBy: ", ")[0])" }
        .joined(separator: "\n")
    DispatchQueue.main.async { self?.textView.text = displayText }
}
```
  6. The finished image-recognition code:

```swift
guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
guard let model = try? VNCoreMLModel(for: Resnet50().model) else { return }
let request = VNCoreMLRequest(model: model) { [weak self] (request: VNRequest, error: Error?) in
    guard let results = request.results as? [VNClassificationObservation] else { return }
    let displayText = results.prefix(3)
        .flatMap { "\(Int($0.confidence * 100))% \($0.identifier.components(separatedBy: ", ")[0])" }
        .joined(separator: "\n")
    DispatchQueue.main.async { self?.textView.text = displayText }
}
try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
```
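One practical note on the snippet above: it builds the `VNCoreMLModel` inside the per-frame delegate callback, which is relatively expensive. A common refactor (my suggestion, not shown in the deck) is to create the model once and reuse it for every frame. `Resnet50` here is the class Xcode generates from the `Resnet50.mlmodel` file, as in the slide:

```swift
import Vision

final class ClassifierHolder {
    // Build the Vision wrapper around the Core ML model once;
    // each captured frame then reuses this instance.
    static let model: VNCoreMLModel? = try? VNCoreMLModel(for: Resnet50().model)
}
```

The delegate callback would then read `ClassifierHolder.model` instead of calling `VNCoreMLModel(for:)` on every frame.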
  7. References (1)
     - Build more intelligent apps with machine learning. / Apple
     - Vision / Apple Developer Documentation
     - [WWDC2017] Trying out text detection with Vision.framework [iOS11]
     - Developing a Momoiro Clover face-classification app with Keras + iOS11 CoreML + Vision Framework
     - [Core ML] Creating an .mlmodel file / Fenrir
  8. References (2)
     - [iOS 11] Trying image classification with CoreML (the pattern without Vision.Framework) #WWDC2017
     - Scene classification with Places205-GoogLeNet / fabo.io
     - Spoke about "iOS and deep learning" at the iOSDC rejected-talks meetup
     - [iOS 10][Neural networks] Understanding BNNS, added to Accelerate, through OSS ~XOR edition~