
Introduction of AnnoOps for mass production with multi-task and large-volume / annops_english

Yachi
April 30, 2020


Transcript

  1. Introduction of AnnoOps for mass production with multi-task and large-volume

    Toyota Research Institute Advanced Development (TRI-AD) Yusuke Yachide 2020.04.30
  2. Introduction • Yachi (Yusuke Yachide, Ph.D.) ◦ Software Platform / MLTools

    ▪ MLOps development ▪ AnnoOps (annotation platform & service) ▪ Arene development @yachide yachide-yusuke-23a27035/ Arene
  3. ?

  4. Our goal is to make the world’s safest car. James

    Kuffner (CEO) The future of car is software. Nikos Michalakis (VP of software platform)
  5. None
  6. MLTools' mission: Increase ML model deployment on vehicles

  7. Today's topic: Introduction of annotation work at TRI-AD ※ Annotation: creating training data / ground-truth data

  8. What are the annotation requirements? Volume: 1,000 ~ XX0,000 Accuracy: several

    percent Data: image, video, LiDAR, etc. Task: 20 projects (point, box, segmentation, etc.) Delivery date: 1–2 months to half a year
  9. Multi-task / Large volume

  10. Policy: Employing several annotation vendors → Issue: a vendor-agnostic annotation platform is needed

  11. A review of the annotation process ① Rule making… defining the annotation

    rule for annotators ② Project creation… creating a project for annotation ③ Annotation… annotating data ④ Inspection… inspecting annotation data ⑤ Delivery… releasing annotation data
  12. Issues in the vendor-agnostic approach (today's scope) ① Rule making…

    defining the annotation rule for annotators ② Project creation… creating a project for annotation ③ Annotation… annotating data ④ Inspection… inspecting annotation data ⑤ Delivery… releasing annotation data 1. Standardize annotation quality… making a standard rule by analyzing each vendor's characteristics 2. Vendor-agnostic platform… handling different annotation formats 3. Easy to deliver and share annotation data… unifying the annotation format
  13. 0. Premise: rule and inspection are two sides of the

    same coin. A bad rule → a defective product; a defective product → feedback reflected into the rule
  14. 0. Premise: know your annotation vendors. Vendor A: good

    productivity / excellent accuracy. Vendor B: excellent productivity / good accuracy. (Charts: NG rate and #annotated data over time for each vendor, from the start of annotation with a new rule.) Both vendors have more than 100 annotators, so both standardizing a rule and respecting each vendor's unique case studies are important
  15. 1. Standardize a rule → use a flowchart. Example: traffic

    sign annotation (Ministry of Land, Infrastructure and Transport signs). A flowchart gives a language-agnostic standard rule. • Subjectivity is impossible to unify fully → clarify which parts are subjective and which are not • ML engineers carry the annotation procedure in their heads → it must be pulled out of them. Flowchart: Road surface? → Static object? → Shape: rectangle / triangle / circle? Yes at each step → Annotation; otherwise → No annotation
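A flowchart like this can be written down as an explicit decision procedure, which keeps the rule language-agnostic and mechanically checkable. A minimal sketch; the function name, parameters, and branch directions are assumptions for illustration, not TRI-AD's actual rule:

```python
# Hypothetical encoding of the slide's traffic-sign flowchart:
# each box becomes an explicit yes/no question, so the rule can be
# applied consistently without relying on natural language.

def should_annotate_sign(above_road_surface: bool,
                         is_static: bool,
                         shape: str) -> bool:
    """Follow the flowchart: three 'Yes' answers lead to annotation."""
    if not above_road_surface:   # assumed branch direction
        return False             # -> no annotation
    if not is_static:             # moving objects are not signs
        return False
    # Only the standard sign shapes reach the 'Annotation' box.
    return shape in {"rectangle", "triangle", "circle"}

print(should_annotate_sign(True, True, "circle"))   # → True
print(should_annotate_sign(True, False, "circle"))  # → False
```

Encoding the questions this way also makes the subjective parts visible: anything that cannot be phrased as a yes/no check is exactly the part that needs clarification in the rule.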
  16. 1. Case study: dig into vendor responses. Vendor A:

    good productivity, low NG rate → many questions asked in the background! (Charts: #questions and NG rate vs. #annotated data over time.) Normally we put NG cases into the case study, but for Vendor A we put its Q&A into it instead.
  17. 1. Effect of rule-making efforts (charts: #questions, NG rate, and #annotated data

    over time). Vendor A: productivity improved (10% increase). Vendor B: NG rate decreased (30% decrease)
  18. 2. Vendor-agnostic annotation platform: unlabeled data format converter → project

    generation & submission → annotation → downloader → labeled data format converter
  19. 2. Vendor-agnostic annotation platform: unlabeled data format converter → project

    generation & submission → annotation → downloader → labeled data format converter. Each tool is different: • each vendor has its own tool (multiple input/output formats) • different annotation units (whether a project with multiple annotation tasks at a time is allowed or not)
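The labeled-data format converter is the piece that absorbs the "multiple in/out format" problem. A minimal sketch with two invented vendor record layouts; the keys and the unified schema are assumptions, not the actual formats:

```python
# Hypothetical converter: normalize two invented vendor bbox formats
# into one unified annotation record.

from dataclasses import dataclass

@dataclass
class UnifiedBox:
    label: str
    x: float       # top-left corner
    y: float
    width: float
    height: float

def from_vendor_a(rec: dict) -> UnifiedBox:
    # Vendor A (assumed layout): corner + size, keys "cls" and "bbox"=[x, y, w, h]
    x, y, w, h = rec["bbox"]
    return UnifiedBox(rec["cls"], x, y, w, h)

def from_vendor_b(rec: dict) -> UnifiedBox:
    # Vendor B (assumed layout): two corners, keys "label", "x1".."y2"
    return UnifiedBox(rec["label"], rec["x1"], rec["y1"],
                      rec["x2"] - rec["x1"], rec["y2"] - rec["y1"])

a = from_vendor_a({"cls": "car", "bbox": [10, 20, 30, 40]})
b = from_vendor_b({"label": "car", "x1": 10, "y1": 20, "x2": 40, "y2": 60})
print(a == b)  # → True: both normalize to the same unified record
```

The unified record is what the rest of the pipeline (inspection, delivery, the dataset DB) sees, so adding a new vendor only means writing one more converter pair.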
  20. Example: tool difference. Vendor B needs to create

    separate projects chained in a pipeline (BBox project → segmentation project). Vendor A supports simultaneous BBox and segmentation annotation
  21. 2. Vendor-agnostic annotation platform: a control layer in front of the

    format converters, project generation & submission, and downloader absorbs the tool difference. Vendor A's tool annotates multiple tasks simultaneously; Vendor B's tool is non-simultaneous and processes multiple single-task annotation projects in series
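The control layer can hide this difference behind one submission call: the same task list becomes one multi-task project for one vendor and a series of single-task projects for the other. A hedged sketch; the vendor behaviors and all names are assumptions based on the slide:

```python
# Hypothetical control layer: Vendor A's tool (assumed) takes all tasks
# in one project; Vendor B's tool (assumed) needs one project per task,
# processed in series (e.g. BBox project -> Segmentation project).

def submit_vendor_a(tasks):
    # One project carrying every annotation task simultaneously.
    return [("project-1", tuple(tasks))]

def submit_vendor_b(tasks):
    # One single-task project per annotation task, run in series.
    return [(f"project-{i + 1}", (task,)) for i, task in enumerate(tasks)]

SUBMITTERS = {"vendor_a": submit_vendor_a, "vendor_b": submit_vendor_b}

def submit(vendor: str, tasks):
    """Dispatch the same task list to the vendor-specific project layout."""
    return SUBMITTERS[vendor](tasks)

print(submit("vendor_a", ["bbox", "segmentation"]))
# → [('project-1', ('bbox', 'segmentation'))]
print(submit("vendor_b", ["bbox", "segmentation"]))
# → [('project-1', ('bbox',)), ('project-2', ('segmentation',))]
```

Callers upstream only ever say "annotate these tasks"; how that maps onto projects stays inside the control layer.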
  22. 3. Easy delivery and sharing of the dataset: unlabeled data format converter

    → project generation & submission → annotation → downloader → labeled data format converter → MLOps dataset DB. The unified annotation format makes delivery easy and lets us standardize the dataset loader
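Once every delivery arrives in one unified format, a single standardized loader can feed all downstream ML projects. A minimal sketch, assuming a simple JSON layout invented for illustration:

```python
# Hypothetical standardized dataset loader: because every delivery uses
# the same unified format, one loader serves all downstream pipelines.

import json

def load_dataset(path: str):
    """Yield (image, annotations) pairs from an (assumed) unified
    delivery file: {"samples": [{"image": ..., "annotations": [...]}]}."""
    with open(path) as f:
        data = json.load(f)
    for sample in data["samples"]:
        yield sample["image"], sample["annotations"]
```

With a loader like this, consumers never need to know which vendor produced a given delivery.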
  23. Summary At TRI-AD, for multi-task and large-volume annotation requirements

    like ➔ Volume: 1,000 ~ XX0,000 ➔ Accuracy: several percent ➔ Data: image, video, LiDAR, etc. ➔ Task: 20 projects (point, box, segmentation, etc.) ➔ Delivery date: 1–2 months to half a year we built a vendor-agnostic annotation platform to use multiple vendors easily. A team of 7–8 people handles these massive annotation projects
  24. Silicon Valley “Innovation” Japanese “Craftsmanship” NOW HIRING