
Micro Service Architecture with Machine Learning Application

Swall0w
November 07, 2018


These slides were presented at Plone Conference 2018, Tokyo.

Machine Learning (ML) applications have become one of the most important features of web applications, so deploying them properly is crucial. This talk shows how to deploy an ML application as a REST API, and an efficient architecture that makes the ML application easier to update.


Transcript

  1. Who am I
     • Research
       • Master student in SIT
       • Object Detection for Robotics | Computer Vision
     • Startup
       • Co-Founder of FAST ACCOUNTING CO.,LTD
       • Research Development for OCR, Architecture
     • Social Media
       • Twitter: https://twitter.com/Swall0wTech
       • GitHub: https://github.com/Swall0w
         https://github.com/Swall0w/microservice-ml-demo
  2. Overview
     • Introduction to Machine Learning (ML)
     • How to deploy an ML application on Flask
     • Benefits of Micro Service Architecture (microservices)
     • How to realize it
     • Demo
  3. Introduction to Machine Learning
     Definition: "Machine Learning is a field of artificial intelligence that uses
     statistical techniques to give computer systems the ability to learn from
     data, without being explicitly programmed." (from Wikipedia)
     → Algorithms that extract meaningful patterns from data.
     Applications:
     • Email spam and malware filtering
     • Machine translation
     • Forecasting, such as sales
     • Product recommendations
     → ML applications have become an important part of the Web.
  4. Let's deploy an ML application
     Details:
     • Web framework: Flask
     • ML framework: Chainer, ChainerCV
     • ML application: object detection
     Simple architecture: Internet → Web application → (REST API) → ML model
  5. How to use an ML model

     Minimal code:

        import chainer
        from chainercv.datasets import voc_bbox_label_names
        from chainercv.links import YOLOv3
        from chainercv import utils


        def main(image):
            # Load model
            model = YOLOv3(
                n_fg_class=len(voc_bbox_label_names),
                pretrained_model='voc0712')

            img = utils.read_image(image, color=True)
            # predict method
            bboxes, labels, scores = model.predict([img])
            bbox, label, score = bboxes[0], labels[0], scores[0]
  6. How to use an ML model
     Example: input image and output image. Results of the returned variables:

        bbox:  [[101.38964, 197.98589, 362.8902, 403.87808]]
        label: [11]
        score: [0.9999248]
  7. How to deploy an ML model

        import flask

        app = flask.Flask(__name__)
        # Load model
        model = YOLOv3(
            n_fg_class=len(voc_bbox_label_names),
            pretrained_model='voc0712')

        @app.route("/predict", methods=["POST"])
        def predict():
            data = {"success": False}

            if flask.request.method == "POST":
                if flask.request.files.get("image"):
                    image = flask.request.files["image"].read()
                    image = prepare_image(image)
                    # predict method
                    bboxes, labels, scores = model.predict([image])

                    data["predictions"] = pack_result_each_item(
                        bboxes, labels, scores)
                    data["success"] = True

            return flask.jsonify(data)

        if __name__ == "__main__":
            app.run()
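The slide calls `prepare_image` and `pack_result_each_item` without showing their bodies. As a hypothetical sketch of what `pack_result_each_item` might do, assuming ChainerCV's per-image arrays and JSON-friendly field names of my own choosing:

```python
# Hypothetical sketch: the deck does not show this helper, so the output
# field names ("box", "label", "score") are assumptions.
def pack_result_each_item(bboxes, labels, scores):
    """Convert ChainerCV-style per-image result arrays into JSON-friendly dicts."""
    # ChainerCV's predict() returns one entry per input image; take the first.
    bbox, label, score = bboxes[0], labels[0], scores[0]
    return [
        {"box": [float(v) for v in box],
         "label": int(lbl),
         "score": float(conf)}
        for box, lbl, conf in zip(bbox, label, score)
    ]


preds = pack_result_each_item(
    [[[101.4, 198.0, 362.9, 403.9]]], [[11]], [[0.9999]])
# preds[0]["label"] == 11
```

Casting to built-in `float`/`int` matters because NumPy scalars are not JSON-serializable by `flask.jsonify` on older Flask versions.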
  8. Problems with this code
     This simple ML application works well, but the model is difficult to update:
     • Parameters
     • Algorithm
     → Separate the ML application from Flask.
  9. Benefits of microservices
     Separate the ML application from Flask:
     Internet → Web application → (REST API) → ML model
     Microservices allow a system to be divided into smaller, individual and
     independent services. This makes a system:
     • Simpler to deploy
     • Simpler to understand
     • Flexible in using technologies
  10. How do we connect services
     Web application ↔ ML model:
     • Transport protocol → synchronous communication
     • Serialization format → Protocol Buffers (gRPC)
     gRPC benefits:
     • Multiple languages (Python, C++, …)
     • Quick to use because of its Interface Definition Language
     • Faster than HTTP + JSON
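The "faster than HTTP + JSON" point rests largely on binary serialization being more compact than text. A minimal stdlib sketch of that idea, using `struct` as a stand-in for Protocol Buffers' fixed-width float encoding (this is an illustration, not protobuf itself):

```python
import json
import struct

# Four bounding-box coordinates, as on the detection result slide.
box = [101.38964, 197.98589, 362.8902, 403.87808]

# Text encoding: JSON spells out every digit as characters.
as_json = json.dumps({"box": box}).encode("utf-8")

# Binary encoding: four little-endian 32-bit floats, similar in spirit
# to protobuf's `repeated float` wire format (but not identical to it).
as_binary = struct.pack("<4f", *box)

print(len(as_json), len(as_binary))  # the binary form is 16 bytes
```

Protocol Buffers adds small field tags on the wire, but the ratio is similar; the schema lives in the `.proto` file instead of being repeated in every message.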
  11. How to use gRPC

     Write the Interface Definition Language file, detection.proto:
     • service: data access classes
     • message: key-value pairs

        syntax = "proto3";

        service MLServer {
          rpc predict(stream Chunk) returns (Reply) {}
        }

        message Chunk {                         // For image
          bytes buffer = 1;
        }

        message Boundingbox {
          repeated float box = 1;
        }

        message Reply {                         // For result
          repeated Boundingbox boxes = 1;
          repeated string classes = 2;
          repeated float confs = 3;
        }

     Generate the modules with a command:
     • detection_pb2.py
     • detection_pb2_grpc.py
  12. Implement server and client

     Service code:

        import grpc

        import detection_pb2
        import detection_pb2_grpc


        class Servicer(detection_pb2_grpc.MLServerServicer):
            def __init__(self, predictor):
                self._predictor = predictor

            def predict(self, request_iterator, context):
                image = save_chunks_to_file_object(request_iterator)
                # predict
                boxes, classes, confs = self._predictor.predict(image)
                bboxes = [detection_pb2.Boundingbox(box=box) for box in boxes]
                return detection_pb2.Reply(boxes=bboxes, classes=classes,
                                           confs=confs)

     Client code:

        class MLClient(object):
            def __init__(self, address, parse=None):
                channel = grpc.insecure_channel(address)
                self.stub = detection_pb2_grpc.MLServerStub(channel)
                self._parse = parse

            def predict(self, in_file):
                chunks_generator = get_virtual_file_chunks(in_file)
                response = self.stub.predict(chunks_generator)
                if self._parse is not None:
                    response = self._parse(response)
                return response
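Both sides rely on the chunk helpers `get_virtual_file_chunks` and `save_chunks_to_file_object`, which the slides do not show. A self-contained sketch of what they might look like, using a namedtuple as a stand-in for the generated `detection_pb2.Chunk` message (which has a `buffer` bytes field); the 64 KiB chunk size is an assumption:

```python
import io
from collections import namedtuple

# Stand-in for detection_pb2.Chunk; real code would use the generated class.
Chunk = namedtuple("Chunk", ["buffer"])

CHUNK_SIZE = 64 * 1024  # bytes per streamed message (assumed size)


def get_virtual_file_chunks(data, chunk_size=CHUNK_SIZE):
    """Yield Chunk messages from raw image bytes for client-side streaming."""
    stream = io.BytesIO(data)
    while True:
        piece = stream.read(chunk_size)
        if not piece:
            break
        yield Chunk(buffer=piece)


def save_chunks_to_file_object(request_iterator):
    """Reassemble streamed chunks on the server into one file-like object."""
    out = io.BytesIO()
    for chunk in request_iterator:
        out.write(chunk.buffer)
    out.seek(0)
    return out
```

Streaming the image as chunks keeps individual gRPC messages small, which matters because gRPC enforces a maximum message size (4 MB by default).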
  13. How does the web application change

        import flask

        app = flask.Flask(__name__)
        object_detection = detection_grpc.MLClient('localhost:8888',
                                                   parse=chainercv_parse)

        @app.route("/predict", methods=["POST"])
        def predict():
            data = {"success": False}

            if flask.request.method == "POST":
                if flask.request.files.get("image"):
                    image = flask.request.files["image"].read()

                    predictions = object_detection.predict(image)

                    data["predictions"] = predictions
                    data["success"] = True

            return flask.jsonify(data)

     The web application no longer needs to know anything about the ML
     algorithm.
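The `chainercv_parse` callback passed to `MLClient` is not shown on the slides. A hypothetical sketch of it, turning a gRPC `Reply` into the JSON-serializable prediction list; a `SimpleNamespace` stands in for the protobuf message here, and the output field names are assumptions:

```python
from types import SimpleNamespace


def chainercv_parse(reply):
    """Turn a gRPC Reply (boxes/classes/confs) into JSON-serializable dicts."""
    return [
        {"box": list(box.box), "label": cls, "score": conf}
        for box, cls, conf in zip(reply.boxes, reply.classes, reply.confs)
    ]


# Usage with a stand-in Reply object:
reply = SimpleNamespace(
    boxes=[SimpleNamespace(box=[101.4, 198.0, 362.9, 403.9])],
    classes=["dog"],
    confs=[0.9999])
preds = chainercv_parse(reply)
# preds[0]["label"] == "dog"
```

Keeping this parsing in a pluggable callback means the Flask side stays free of any ChainerCV-specific result layout, which is the decoupling the slide is arguing for.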
  14. Conclusion
     Internet → Web application → (gRPC) → ML model (REST API)
     • Introduction to Machine Learning (ML)
     • How to deploy an ML application on Flask
     • Deployment as microservices:
       • Simpler to deploy
       • Simpler to understand
       • Flexible in using technologies
  15. Thank you for listening
     Source code and samples are available on GitHub:
     https://github.com/Swall0w/microservice-ml-demo