Machine Learning as a Service
How to deploy ML Models as APIs without going nuts
Anand Chitipothu
@anandology
rorodata
Slide 2
Hello!
Anand Chitipothu
@anandology
Co-founder of rorodata
Platform-as-a-Service designed
for data scientists
Slide 3
Overview
— Overview of Machine Learning
— Why Machine Learning as a Service
— Traditional Approach
— Importance of Right Abstractions
— Our Approach
— Future work
— Summary
Slide 4
Machine Learning
Slide 5
Machine Learning as a Service
Slide 6
Model -> API
Slide 7
ML as a Service: The Traditional Approach
Slide 8
Origami Flapping Bird
Slide 9
Flapping Bird - Crease Patterns
Slide 10
Model -> API
Slide 11
Model -> API
Slide 12
Model -> Function -> API
Slide 13
Function -> API
Slide 14
Firefly!
An open-source tool that turns Python functions into an API.
https://github.com/rorodata/firefly
pip install firefly-python
Slide 15
Step 1: Write your function
# sq.py
def square(n):
    """compute square of a number."""
    return n*n

You just need to make sure the arguments and return
value are JSON-friendly.
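To make the JSON-friendly constraint concrete: built-in ints, floats, strings, lists, and dicts serialize cleanly, while types such as NumPy scalars do not. A quick check with the standard json module (a sketch for illustration, not part of firefly):

```python
import json

def square(n):
    """compute square of a number."""
    return n*n

# Built-in types round-trip through JSON without trouble:
print(json.dumps(square(4)))  # prints 16

# A numpy.int64 return value, by contrast, would make json.dumps
# raise TypeError, so convert such values with int() or float()
# before returning them from an API function.
```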
Slide 16
Step 2: Run it as an API
$ firefly sq.square
http://127.0.0.1:8000/
...
Slide 17
Step 3: Use it
>>> import firefly
>>> api = firefly.Client("http://127.0.0.1:8000/")
>>> api.square(n=4)
16
Firefly comes with a built-in client for accessing the
API.
Slide 18
Behind the Scenes
It is a RESTful API!
$ curl -d '{"n": 4}' http://127.0.0.1:8000/square
16
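What happens here can be sketched in a few lines: the JSON request body is decoded into keyword arguments, the function is called, and the result is serialized back to JSON. The dispatch function below is illustrative only, not firefly's actual source:

```python
import json

def square(n):
    """compute square of a number."""
    return n*n

def dispatch(func, request_body):
    """Sketch of the request handling: JSON body -> kwargs -> JSON result."""
    kwargs = json.loads(request_body)
    result = func(**kwargs)
    return json.dumps(result)

print(dispatch(square, '{"n": 4}'))  # prints 16
```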
Slide 19
What about ML models?
Write your predict function and run it as an API.
# image_recognition.py
import joblib

model = joblib.load("model.pkl")

def predict(image):
    ...
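Filled out, such a predict function might look like the sketch below. The StubModel class stands in for a real model loaded from model.pkl, and the payload shape ({"url": ...}) follows the client call shown on the next slide; all names here are illustrative assumptions:

```python
import urllib.request

class StubModel:
    """Stand-in for a model loaded via joblib.load("model.pkl")."""
    def predict(self, data):
        # A real model would decode the image bytes and run inference here.
        return "cat" if data else "unknown"

model = StubModel()

def predict(image):
    """Accept a JSON-friendly payload like {"url": "..."} and return a label."""
    data = b""
    if "url" in image:
        with urllib.request.urlopen(image["url"]) as response:
            data = response.read()
    return model.predict(data)
```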
Slide 20
api = firefly.Client("http://127.0.0.1:8080")
response = api.predict(image={"url": "http://example.com/foo.jpg"})
Slide 21
Deployment?
How to deploy the API on a server?
Slide 22
Deployment Platform (1/2)
Write a config file in the project specifying what
services to run.
project: image-recognition
runtime: python3-keras
services:
  - name: api
    function: image_recognition.predict
Slide 24
Behind the Scenes
— It builds the Docker images
— Starts the specified services
— Provides URL endpoints with HTTPS
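Conceptually, the image built for this project would resemble the Dockerfile below. This is an illustrative sketch, not the platform's actual output; the base image and commands are assumptions:

```
# Illustrative only: roughly what the built image contains.
FROM python:3.6

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt firefly-python

COPY . .

# Serve the predict function as the "api" service.
CMD ["firefly", "image_recognition.predict"]
```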
Slide 25
Custom Dependencies?
Write a requirements.txt file!
The platform borrows the repo2docker interface from
the Binder project.
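For example, a requirements.txt for the image-recognition project might pin the model's dependencies. The package versions below are made up for illustration:

```
keras==2.1.6
tensorflow==1.8.0
joblib==0.12.0
```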
Slide 26
Building an App
Slide 27
Boom!
Slide 28
Cross-Origin-Resource-Sharing
Just add an additional field in your service
description.
services:
  - name: api
    function: image_recognition.predict
    cors_allow_origins: "*"
Slide 29
Instance Size & Scale
Specify instance size and scale to deploy.
services:
  - name: api
    function: image_recognition.predict
    cors_allow_origins: "*"
    size: S2
    scale: 4
Slide 30
Open-Source Version
A light version of the platform for deploying to your
own server.
http://github.com/rorodata/rorolite
Same API!
Slide 31
Firefly - Upcoming Features
— Extensible API
— Integration with Python 3 type hints
— Automatic API documentation
— Support for gRPC
Slide 32
Firefly - New API (WIP)
app = Firefly()

@app.function
def detect_outliers(df: pd.DataFrame) -> pd.DataFrame:
    ...

@app.function
def detect_faces(image: Image) -> List[Image]:
    ...

if __name__ == "__main__":
    app.run()
Slide 33
Summary
— Deploying Machine Learning models need not be
very complex!
— The key is to focus on reducing the complexity of
the interface.
— Our work is open-source, please build on top of
it!
Slide 34
Thank you
Slides - http://bit.ly/ml-service
Anand Chitipothu
@anandology
rorodata