Slide 1

Slide 1 text

The Latest in Developing for watchOS By: Conrad Stoll

Slide 2

Slide 2 text

No content

Slide 3

Slide 3 text

Agenda
- Apple Watch History
- Improvements in watchOS 4
- Building an Apple Watch App with CoreML

Slide 4

Slide 4 text

The Apple Watch was released on April 24, 2015.

Slide 5

Slide 5 text

11/18/2014 - WatchKit
6/8/2015 - watchOS 2
6/13/2016 - watchOS 3
6/5/2017 - watchOS 4

Slide 6

Slide 6 text

A lot has changed in 2.5 years.

Slide 7

Slide 7 text

Native Watch Apps For the last 2 years!

Slide 8

Slide 8 text

HTTP Networking WatchConnectivity MMWormhole

Slide 9

Slide 9 text

Digital Crown

Slide 10

Slide 10 text

Glances

Slide 11

Slide 11 text

Complications

Slide 12

Slide 12 text

No content

Slide 13

Slide 13 text

We've come a long way!

Slide 14

Slide 14 text

Let's compare to iPhone

Slide 15

Slide 15 text

The iPhone was released on June 29, 2007.

Slide 16

Slide 16 text

3/6/2008 - iPhone SDK
7/18/2008 - iPhone OS 2
6/17/2009 - iPhone OS 3
6/21/2010 - iOS 4

Slide 17

Slide 17 text

So what changed?

Slide 18

Slide 18 text

Copy & Paste

Slide 19

Slide 19 text

Multitasking

Slide 20

Slide 20 text

The iPad

Slide 21

Slide 21 text

Push Notifications

Slide 22

Slide 22 text

In-App Purchase

Slide 23

Slide 23 text

Think about the features we now take for granted when building iOS apps that debuted on the platform during its first 2-3 years

Slide 24

Slide 24 text

We're going to discuss examples where building Apple Watch apps is challenging...

Slide 25

Slide 25 text

Think about how similar these challenges sound to the first 2-3 years of developing for the iPhone.

Slide 26

Slide 26 text

But don't panic!

Slide 27

Slide 27 text

Think about how excited we were to build apps then.

Slide 28

Slide 28 text

And remember that every year it gets better!

Slide 29

Slide 29 text

How will watchOS 4 help developers build better Apple Watch apps? [1]

[1] Apple Watch Developers

Slide 30

Slide 30 text

4 ways that Apple Watch apps can be better.

Slide 31

Slide 31 text

1 Perform faster.

Slide 32

Slide 32 text

Unified Process Runtime

Slide 33

Slide 33 text

Apps simply run faster
No changes required

Slide 34

Slide 34 text

No content

Slide 35

Slide 35 text

Better touch latency
Better pan performance
Faster app launch

Slide 36

Slide 36 text

2 Always be up to date.

Slide 37

Slide 37 text

Frontmost App State

Slide 38

Slide 38 text

Apps should work best when they're being used.

Slide 39

Slide 39 text

WatchConnectivity Performance
URLSession Performance
Prioritized Tasks
Receiving Notifications

Slide 40

Slide 40 text

No content

Slide 41

Slide 41 text

Extend Frontmost Time

WKExtension.shared().isFrontmostTimeoutExtended = true

Slide 42

Slide 42 text

Example: Grocery List

Slide 43

Slide 43 text

Frontmost while Shopping

// MARK: Table View Action Methods
override func table(_ table: WKInterfaceTable, didSelectRowAt rowIndex: Int) {
    ...
    if remainingItems == 0 {
        WKExtension.shared().isFrontmostTimeoutExtended = false
    } else {
        WKExtension.shared().isFrontmostTimeoutExtended = true
    }
}

Slide 44

Slide 44 text

No content

Slide 45

Slide 45 text

Additional Capabilities while Frontmost

Slide 46

Slide 46 text

Haptics and Audio
Useful for Workouts

Slide 47

Slide 47 text

Play a haptic every mile!

func computeThatUserHasGoneTheExtraMile() {
    WKInterfaceDevice.current().play(.success)
}

Slide 48

Slide 48 text

Other Useful Workout Features

Slide 49

Slide 49 text

Enable Water Lock for Workout Apps

WKExtension.shared().enableWaterLock()

Slide 50

Slide 50 text

Hardware Pause and Resume Buttons

#pragma mark - HKWorkoutSessionDelegate

- (void)workoutSession:(HKWorkoutSession *)workoutSession didGenerateEvent:(HKWorkoutEvent *)event {
    if (event.type == HKWorkoutEventTypePauseOrResumeRequest) {
        if (self.paused) {
            [self signalResumeRun];
            [[WKInterfaceDevice currentDevice] playHaptic:WKHapticTypeStart];
        } else if (self.running) {
            [self signalPauseRun];
            [[WKInterfaceDevice currentDevice] playHaptic:WKHapticTypeStop];
        }
    }
}

Slide 51

Slide 51 text

New Background Mode: Location

Slide 52

Slide 52 text

Store Routes in HealthKit

self.routeBuilder = [[HKWorkoutRouteBuilder alloc] initWithHealthStore:healthStore device:device];

// Add Locations to Route Builder
[self.routeBuilder insertRouteData:locations completion:^(BOOL success, NSError * _Nullable error) { }];

// Save HKWorkout
// Add Distance Samples to Workout

// Finish the Route
[self.routeBuilder finishRouteWithWorkout:workout metadata:metadata completion:^(HKWorkoutRoute * _Nullable workoutRoute, NSError * _Nullable error) { }];

Slide 53

Slide 53 text

Simpler Snapshots

Slide 54

Slide 54 text

Easier to perform all of your updates with one task

Slide 55

Slide 55 text

Example: WatchConnectivity Task

// Watch app wakes up
func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
    for task in backgroundTasks {
        switch task {
        case let connectivityTask as WKWatchConnectivityRefreshBackgroundTask:
            // Activate WatchConnectivity Session
            // Update Root Interface Controller
            // Reload Complication
            connectivityTask.setTaskCompletedWithSnapshot(true)
        default:
            task.setTaskCompletedWithSnapshot(false)
        }
    }
}

Slide 56

Slide 56 text

Snapshot Reasons

typedef NS_ENUM(NSInteger, WKSnapshotReason) {
    WKSnapshotReasonAppScheduled = 0,
    WKSnapshotReasonReturnToDefaultState,
    WKSnapshotReasonComplicationUpdate,
    WKSnapshotReasonPrelaunch,
    WKSnapshotReasonAppBackgrounded
};

Slide 57

Slide 57 text

ThermalState

Slide 58

Slide 58 text

Check Current System Load

extension ProcessInfo {
    public enum ThermalState : Int {
        case nominal
        case fair
        case serious
        case critical
    }
}

Slide 59

Slide 59 text

3 UI Design Flexibility.

Slide 60

Slide 60 text

Overlapping UI Elements

Slide 61

Slide 61 text

No content

Slide 62

Slide 62 text

Nesting Elements
- Picker View
- SpriteKit Scene
- Pan Gesture
- Buttons and Groups

Slide 63

Slide 63 text

Last Year - Beer Button

Slide 64

Slide 64 text

Needed to use a picker view behind a button

Slide 65

Slide 65 text

Vertical Page Layout

Slide 66

Slide 66 text

No content

Slide 67

Slide 67 text

Code Sample Vertical Paging

WKInterfaceController.reloadRootPageControllers(withNames: names,
                                                contexts: contexts,
                                                orientation: .vertical,
                                                pageIndex: 1)

Slide 68

Slide 68 text

Set Initial Page Index
Great for workout apps

Slide 69

Slide 69 text

Incremental Table Loading or Infinite Scrolling

Slide 70

Slide 70 text

watchOS Table View Best Practices
- Load 5-10 cells initially
- Try not to reload the whole table
- Add more content when the user scrolls

Slide 71

Slide 71 text

Top and Bottom Scrolling Callbacks

class WKInterfaceController : NSObject {
    func interfaceOffsetDidScrollToTop()
    func interfaceOffsetDidScrollToBottom()
}

Slide 72

Slide 72 text

Autorotating Interface

WKExtension.shared().isAutorotating = true

Slide 73

Slide 73 text

4 Work without an iPhone.

Slide 74

Slide 74 text

No content

Slide 75

Slide 75 text

Most apps are designed to assume there is an iPhone...

Slide 76

Slide 76 text

We need Apple Watch apps to work without an iPhone

Slide 77

Slide 77 text

Audio Playback
Podcasts, Audiobooks, Music

Slide 78

Slide 78 text

AVAudioPlayer* and MusicKit

Slide 79

Slide 79 text

Recording Audio in the background* with custom UI

Slide 80

Slide 80 text

Bluetooth Accessories

Slide 81

Slide 81 text

Standalone apps need to process their own data

Slide 82

Slide 82 text

CoreML and Accelerate

Slide 83

Slide 83 text

Goal for this year: Ship an app using CoreML

Slide 84

Slide 84 text

Game: Snowman

Slide 85

Slide 85 text

Draw letters to guess a word or phrase

Slide 86

Slide 86 text

Apple Watches don't have keyboards...!

Slide 87

Slide 87 text

Uses Machine Learning for Handwriting Recognition

Slide 88

Slide 88 text

Public MNIST dataset for digits
Lots of tutorials online
Recently extended to include letters!
Tutorials work for digits and letters!

Slide 89

Slide 89 text

A model needs to learn how to recognize letters a user might draw.

Slide 90

Slide 90 text

Training the model needs to happen before the game is shipped.

Slide 91

Slide 91 text

Training does not happen on the watch.

Slide 92

Slide 92 text

Training does not change no matter how many letters a user draws. Training is generic for all users.*

Slide 93

Slide 93 text

Download and Compile MLModel

@interface MLModel (MLModelCompilation)
+ (nullable NSURL *)compileModelAtURL:(NSURL *)modelURL
                                error:(NSError **)error;
@end

Slide 94

Slide 94 text

Tools
- Python
- scikit-learn
- Keras
- TensorFlow
- coremltools

Slide 95

Slide 95 text

Resources
- Machine Learning Guide Podcast: http://ocdevel.com/podcasts/machine-learning

Slide 96

Slide 96 text

Dataset Extended MNIST

Slide 97

Slide 97 text

Extended MNIST
- 387,361 training letter images [2]
- 23,941 testing images
- 28x28 pixels

[2] https://arxiv.org/pdf/1702.05373.pdf

Slide 98

Slide 98 text

Model 1 Support Vector Machine

Slide 99

Slide 99 text

SVM models are easy to get started with

Slide 100

Slide 100 text

Model 1

num_samples = 10000
images = all_images[0:num_samples]
labels = all_labels[0:num_samples]

from sklearn import svm
clf = svm.SVC()
clf.fit(images, labels)

Slide 101

Slide 101 text

Accuracy: 69%
Size: 50 MB
Too big for Apple Watch!
Poor user experience
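An accuracy figure like the 69% above is just the fraction of held-out test images the classifier labels correctly (with scikit-learn, `clf.score(test_images, test_labels)` computes it directly). A minimal plain-Python sketch of that calculation, using made-up predictions rather than the talk's data:

```python
# Sketch only: accuracy = correct predictions / total predictions.
# The predicted/actual lists below are made-up letter indices.
def accuracy(predicted, actual):
    """Fraction of predictions that match the true labels."""
    correct = sum(1 for p, a in zip(predicted, actual) if p == a)
    return correct / len(actual)

predicted = [1, 2, 2, 4, 5]   # model output (letter indices)
actual    = [1, 2, 3, 4, 5]   # ground-truth labels
print(accuracy(predicted, actual))  # 0.8
```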

Slide 102

Slide 102 text

No content

Slide 103

Slide 103 text

Model 2 SVM, Split Alphabet

Slide 104

Slide 104 text

Model 2

half_letters = [1,2,3,6,7,9,10,13,16,19,21,23,24]
ind = [val in half_letters for val in labels]
labels_2 = labels[ind]
images_2 = images[ind][:]

from sklearn import svm
clf = svm.SVC()
clf.fit(images_2, labels_2)
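The boolean-mask filtering that splits the alphabet can be sketched in plain Python (the slide relies on numpy, where `labels[ind]` applies the mask directly; here toy lists stand in for the real label and image arrays):

```python
# Plain-Python sketch of the split-alphabet filtering (toy data, no numpy).
# half_letters holds the label indices kept for one of the two SVMs.
half_letters = [1, 2, 3, 6, 7, 9, 10, 13, 16, 19, 21, 23, 24]

labels = [1, 4, 2, 26, 13, 5]            # toy label list
images = ['a', 'd', 'b', 'z', 'm', 'e']  # toy stand-ins for image rows

keep = [label in half_letters for label in labels]  # boolean mask
labels_2 = [l for l, k in zip(labels, keep) if k]
images_2 = [i for i, k in zip(images, keep) if k]

print(labels_2)  # [1, 2, 13]
print(images_2)  # ['a', 'b', 'm']
```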

Slide 105

Slide 105 text

Accuracy: 78%
Size: 8 MB, x2
Works on Apple Watch!
Accuracy could be better

Slide 106

Slide 106 text

Rule of thumb: if you can achieve ~80% accuracy with an SVM, then the information you're trying to get at is present in your data.

Slide 107

Slide 107 text

Model 3 Dimensionality Reduction with PCA

Slide 108

Slide 108 text

Reduce the number of variables the model needs to consider

Slide 109

Slide 109 text

Convert 784 variables into 25 variables

Slide 110

Slide 110 text

784 Variables

Slide 111

Slide 111 text

25 Variables

Slide 112

Slide 112 text

PCA

from sklearn.decomposition import PCA
pca = PCA(n_components=components)
pca.fit(images)

# PCA matrix...Save this for later
mat = pca.components_

Slide 113

Slide 113 text

Model 3

# Saved from previous step
mat = pca.components_

import numpy as np
images_pca = np.matmul(images, mat.transpose())

from sklearn import svm
clf = svm.SVC()
clf.fit(images_pca, labels)

Slide 114

Slide 114 text

Using PCA in watchOS

import Accelerate

// Input: 784
// Output: 25
func transform(from input: [NSNumber]) -> [NSNumber] {
    let image = input.map { $0.floatValue } // 784
    let mat = pcaMatrix // Saved from Python
    var result = [Float](repeating: 0.0, count: 25)
    vDSP_mmul(image, 1, mat, 1, &result, 1, 1, 25, 784)
    return result.map { NSNumber(value: $0) }
}
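What vDSP_mmul computes here is a single 1x784 times 784x25 matrix product: each of the 25 outputs is a dot product of the image vector with one PCA component. A plain-Python sketch of the same arithmetic, with tiny made-up dimensions so the numbers are easy to check by hand:

```python
# Sketch of the projection vDSP_mmul performs: a 1xP image row times a
# PxN (transposed PCA) matrix yields a 1xN reduced vector.
def project(image, mat_t):
    """image: length-P list; mat_t: P x N matrix (list of rows) -> length-N list."""
    n = len(mat_t[0])
    return [sum(image[p] * mat_t[p][j] for p in range(len(image)))
            for j in range(n)]

image = [1.0, 2.0, 3.0]   # P = 3 "pixels" (784 in the real app)
mat_t = [[1.0, 0.0],      # P x N with N = 2 components (25 in the real app)
         [0.0, 1.0],
         [1.0, 1.0]]
print(project(image, mat_t))  # [4.0, 5.0]
```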

Slide 115

Slide 115 text

Accuracy: 87% Size: 5MB

Slide 116

Slide 116 text

Ok, now we're !

Slide 117

Slide 117 text

Model 4 Convolutional Neural Network

Slide 118

Slide 118 text

Tutorials for Keras and Tensorflow
- Simple Convolutional Neural Network for MNIST [3]
- Deep Learning in Python [4]
- Several talks at 360iDev 2017

[3] http://machinelearningmastery.com/handwritten-digit-recognition-using-convolutional-neural-networks-python-keras/
[4] https://elitedatascience.com/keras-tutorial-deep-learning-in-python

Slide 119

Slide 119 text

Keras Model

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D

def baseline_model():
    model = Sequential()
    model.add(Conv2D(30, (5, 5), padding='valid', input_shape=(28,28,1), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(15, (3, 3), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.2))
    model.add(Flatten())
    model.add(Dense(128, activation='relu'))
    model.add(Dense(50, activation='relu'))
    model.add(Dense(num_classes, activation='softmax'))
    # Compile model
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model

Slide 120

Slide 120 text

Model 4

# Build the model
model = baseline_model()

# Fit the model
model.fit(X_train, y_train, validation_split=.1, epochs=10, batch_size=200, verbose=True)

Slide 121

Slide 121 text

Go get some ☕

Slide 122

Slide 122 text

Accuracy: 95% Size: 245 KB

Slide 123

Slide 123 text

Exporting Model to CoreML

Slide 124

Slide 124 text

coremltools

import coremltools

coreml_model = coremltools.converters.keras.convert(model,
                                                    input_names=['imageAlpha'],
                                                    output_names=['letterConfidence'])

coreml_model.author = 'Kate Bonnen and Conrad Stoll'
coreml_model.license = 'MIT'
coreml_model.short_description = "Recognize the hand-drawn letter from an input image."
coreml_model.input_description['imageAlpha'] = 'The input image alpha values, from top down, left to right.'
coreml_model.output_description['letterConfidence'] = 'Confidence for each letter ranging from index 1 to 26. Ignore index 0.'

coreml_model.save('letters_keras.mlmodel')

Slide 125

Slide 125 text

Importing a CoreML Model

Slide 126

Slide 126 text

Code Generation for your Model

// Generated by CoreML
class letters_keras {
    var model: MLModel

    convenience init() {
        let bundle = Bundle(for: letters_keras.self)
        let assetPath = bundle.url(forResource: "letters_keras", withExtension: "mlmodelc")
        try! self.init(contentsOf: assetPath!)
    }

    func prediction(imageAlpha: MLMultiArray) throws -> letters_kerasOutput {
        let input_ = letters_kerasInput(imageAlpha: imageAlpha)
        return try self.prediction(input: input_)
    }
}

Slide 127

Slide 127 text

Building the Game

Slide 128

Slide 128 text

Snowman

Slide 129

Slide 129 text

Drawing a Letter Anywhere

Slide 130

Slide 130 text

Overlapping Interface Layers
- Game Interface
- SpriteKit Drawing Path
- Pan Gesture
- Confirmation Interface

Slide 131

Slide 131 text

Pan Gesture Captures Path

@IBAction func didPan(sender: WKPanGestureRecognizer) {
    let location = sender.locationInObject()

    updateRecognitionLine(for: location, currentRecognizer: currentRecognizer)

    if sender.state == .ended {
        addSegmentAndWaitForConfirmation(with: currentRecognizer)
    }
}

Slide 132

Slide 132 text

Setting up a Path Shape Node

let line = SKShapeNode()
line.fillColor = SKColor.clear
line.isAntialiased = false
line.lineWidth = strokeWidth
line.lineCap = .round
line.strokeColor = UIColor.white
lineNode = line

let scene = SKScene(size: size)
scene.addChild(line)
scene.backgroundColor = UIColor.clear
drawScene.presentScene(scene)

Slide 133

Slide 133 text

Updating the Shape Node's Path

func updateRecognitionLine(for location: CGPoint, currentRecognizer: Recognizer) {
    // Add the point to our path
    let path = currentRecognizer.addPoint(location)

    // Update the node's path
    lineNode?.path = path
}

Slide 134

Slide 134 text

Making a Prediction

Slide 135

Slide 135 text

Prediction Steps
- Stroke Path to Image
- Center and Crop Image
- Get Pixel Alpha Values Between 0 and 1
- Convert to Vector
- Send to CoreML

Slide 136

Slide 136 text

Stroke Path to Image

UIGraphicsBeginImageContextWithOptions(CGSize(width: drawingWidth, height: drawingHeight), false, 0.0)
let context = UIGraphicsGetCurrentContext()!
context.setStrokeColor(UIColor.black.cgColor)

path.lineJoinStyle = .round
path.lineCapStyle = .round
path.lineWidth = strokeWidth
path.stroke(with: .normal, alpha: 1)

Slide 137

Slide 137 text

Compute Center and Crop Image
- Letter centered
- 2px padding on every side
- Square aspect ratio
- Must be 28x28 pixels
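The slides don't show the center-and-crop code itself, so here is a hypothetical sketch of how those constraints could combine: take the stroke's bounding box, pad it by 2px, then expand the shorter side so the crop is square before scaling down to 28x28. The function name and details are assumptions, not the talk's code:

```python
# Hypothetical sketch (not the talk's code) of the center-and-crop math:
# pad the bounding box by 2px, then grow the short side to make it square.
def square_crop(min_x, min_y, max_x, max_y, padding=2):
    """Return (x, y, side) of a square crop around the bounding box."""
    w = max_x - min_x + 2 * padding
    h = max_y - min_y + 2 * padding
    side = max(w, h)
    # Center the shorter dimension inside the square
    x = min_x - padding - (side - w) / 2
    y = min_y - padding - (side - h) / 2
    return x, y, side

# A tall stroke: the crop widens to match the height
print(square_crop(10, 20, 40, 90))  # (-12.0, 18.0, 74)
```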

Slide 138

Slide 138 text

Get Image Alpha Values

extension UIImage {
    func getPixelAlphaValue(at point: CGPoint) -> CGFloat {
        guard let cgImage = cgImage,
              let pixelData = cgImage.dataProvider?.data else { return 0.0 }

        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let pixelInfo: Int = ((cgImage.bytesPerRow * Int(point.y)) + (Int(point.x) * bytesPerPixel))

        // We don't need to know about color for this
        // let b = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        // let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        // let r = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)

        // All we need is the alpha values
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)
        return a
    }
}

Slide 139

Slide 139 text

Matching Training and Input Data Structure

let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

a is between 0 and 1, not between 0 and 255

Slide 140

Slide 140 text

Get Every Pixel's Alpha Value

extension UIImage {
    func pixelAlpha() -> [NSNumber] {
        var pixels = [NSNumber]()
        for w in 0...Int(self.size.width) - 1 {
            for h in 0...Int(self.size.height) - 1 {
                let point = CGPoint(x: w, y: h)
                let alpha = getPixelAlphaValue(at: point)
                let number = NSNumber(value: Float(alpha))
                pixels.append(number)
            }
        }
        return pixels
    }
}
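The loop order above determines the flattened pixel order: the outer loop walks columns (x, left to right) and the inner loop walks rows (y, top down), matching the model's "from top down, left to right" input description. A plain-Python version of the same ordering:

```python
# Plain-Python version of the slide's flattening loop. Outer loop over x
# (columns, left to right), inner loop over y (rows, top down), so the flat
# index of pixel (x, y) in a 28x28 image is x * 28 + y.
def flatten(width, height, alpha_at):
    pixels = []
    for x in range(width):
        for y in range(height):
            pixels.append(alpha_at(x, y))
    return pixels

# Toy 3x2 "image" whose alpha value encodes its own coordinates as 10*x + y
pixels = flatten(3, 2, lambda x, y: 10 * x + y)
print(pixels)             # [0, 1, 10, 11, 20, 21]
print(pixels[2 * 2 + 1])  # 21: the pixel at x=2, y=1
```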

Slide 141

Slide 141 text

Convert to MLMultiArray

import CoreML

let alphaValues = drawing.generateImageVectorForAlphaChannel()
let multiArray = try! MLMultiArray(shape: [1,28,28], dataType: MLMultiArrayDataType.double)

for (index, number) in alphaValues.enumerated() {
    multiArray[index] = number
}

Slide 142

Slide 142 text

Here's what an MLMultiArray looks like

Slide 143

Slide 143 text

No content

Slide 144

Slide 144 text

Make a Prediction with CoreML

import CoreML

let model = letters_keras()
let prediction = try! model.prediction(imageAlpha: multiArray)
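The prediction's letterConfidence output holds one score per letter, with index 0 unused per the model description earlier. Turning that into a guessed letter means taking the argmax over indices 1 to 26; a sketch (the helper name is made up, and the confidence values are fake):

```python
# Sketch of mapping the model's letterConfidence output to a guessed letter.
# Index 0 is unused (per the model description); indices 1-26 map to A-Z.
def best_letter(confidence):
    # Skip index 0, then take the argmax of the remaining 26 scores
    idx = max(range(1, len(confidence)), key=lambda i: confidence[i])
    return chr(ord('A') + idx - 1)

confidence = [0.0] * 27
confidence[3] = 0.9    # strongest score at index 3 -> 'C'
confidence[20] = 0.05
print(best_letter(confidence))  # C
```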

Slide 145

Slide 145 text

Demo

Slide 146

Slide 146 text

Thank You

Slide 147

Slide 147 text

Conrad Stoll @conradstoll