Slide 1

Slide 1 text

Introduction of AnnoOps: mass production with multi-task and large-volume annotation
Toyota Research Institute Advanced Development (TRI-AD)
Yusuke Yachide, 2020.04.30

Slide 2

Slide 2 text

Introduction
● Yachi (Yusuke Yachide, Ph.D.)
○ Software Platform / MLTools
■ MLOps development
■ AnnoOps (annotation platform & service)
■ Arene development
@yachide / yachide-yusuke-23a27035 / Arene

Slide 3

Slide 3 text

?

Slide 4

Slide 4 text

“Our goal is to make the world’s safest car.” (James Kuffner, CEO)
“The future of the car is software.” (Nikos Michalakis, VP of Software Platform)

Slide 5

Slide 5 text

No content

Slide 6

Slide 6 text

MLTools’ mission: increase ML model deployment on vehicles

Slide 7

Slide 7 text

Today’s topic: introduction of annotation work at TRI-AD
※ Annotation: the creation of training data (ground-truth labels)

Slide 8

Slide 8 text

What are the annotation requirements?
● Volume: 1,000 ~ XX0,000 items
● Accuracy: within several percent
● Data: image, video, LiDAR, etc.
● Tasks: ~20 projects (point, box, segmentation, etc.)
● Delivery date: 1-2 months to half a year

Slide 9

Slide 9 text

Multi-task / Large volume

Slide 10

Slide 10 text

Policy: employ several annotation vendors
Issue: a vendor-agnostic annotation platform is needed

Slide 11

Slide 11 text

A review of the annotation process
① Rule making: defining annotation rules for annotators
② Project creation: creating a project for annotation
③ Annotation: annotating the data
④ Inspection: inspecting the annotated data
⑤ Delivery: releasing the annotation data

Slide 12

Slide 12 text

Issues in the vendor-agnostic approach (today’s scope)
① Rule making: defining annotation rules for annotators
② Project creation: creating a project for annotation
③ Annotation: annotating the data
④ Inspection: inspecting the annotated data
⑤ Delivery: releasing the annotation data
1. Standardize annotation quality: make a standard rule by analyzing each vendor’s characteristics
2. Vendor-agnostic platform: handle the vendors’ different annotation formats
3. Easy delivery and sharing of annotation data: unify the annotation format

Slide 13

Slide 13 text

0. Premise: rules and inspection are two sides of the same coin
● A bad rule leads to defective annotations
● Defective annotations are fed back into the rule

Slide 14

Slide 14 text

0. Premise: know your annotation vendors
● Vendor A: good productivity / excellent accuracy
● Vendor B: excellent productivity / good accuracy
[Charts: NG rate and #annotated data over time for each vendor, after starting annotation with a new rule]
Both vendors have more than 100 annotators, so standardizing the rule and respecting each vendor’s unique case studies are both important.

Slide 15

Slide 15 text

1. Standardize the rule → use flowcharts
Example: traffic sign annotation (signs defined by the Ministry of Land, Infrastructure and Transport)
● A flowchart is a language-agnostic standard rule
● Subjectivity cannot be fully unified → clarify which parts are subjective and which are not
● The ML engineer has the annotation procedure in mind → it needs to be pulled out of the ML engineer
[Flowchart: On the road surface? → Static object? → Shape: rectangle / triangle / circle? → Yes at each step: annotate; otherwise: no annotation]
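A flowchart rule like the one above can be expressed directly as code, which makes the decision path unambiguous. A minimal sketch, assuming hypothetical attribute names (`on_road_surface`, `is_static`, `shape`); the real rule lives in the vendor-facing flowchart document:

```python
def should_annotate(obj: dict) -> bool:
    """Walk the traffic-sign decision flowchart for one candidate object."""
    if not obj.get("on_road_surface"):   # Road surface? No -> skip
        return False
    if not obj.get("is_static"):         # Static object? No -> skip
        return False
    # Shape check: only the sign shapes defined by the rule are annotated.
    return obj.get("shape") in {"rectangle", "triangle", "circle"}

print(should_annotate({"on_road_surface": True, "is_static": True, "shape": "circle"}))  # True
```

Encoding the rule this way also lets each branch be covered by the case-study examples collected from vendors.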

Slide 16

Slide 16 text

1. Case study: dig into vendor responses
Vendor A: good productivity, low NG rate → many questions asked in the background!
[Charts: #questions and NG rate vs. #annotated data over time]
Normally NG cases go into the case study, but for Vendor A we put its Q&A into the case study instead.

Slide 17

Slide 17 text

1. Effect of the rule-making efforts
[Charts: #questions and NG rate vs. #annotated data over time]
● Vendor A: productivity improved (10% increase)
● Vendor B: NG rate decreased (30% decrease)

Slide 18

Slide 18 text

2. Vendor-agnostic annotation platform
Pipeline: unlabeled data format converter → project generation & submission → annotation → downloader → labeled data format converter

Slide 19

Slide 19 text

2. Vendor-agnostic annotation platform
Pipeline: unlabeled data format converter → project generation & submission → annotation → downloader → labeled data format converter
Each vendor’s tool (Tool #2, Tool #3, …) is different:
● Each vendor has its own tool (multiple input/output formats)
● Different annotation units
● Whether one project may contain multiple annotation tasks at a time differs
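The format-converter stages above can be sketched as one adapter per vendor behind a common interface. This is an illustrative design only; the class and key names (`VendorAdapter`, `image_uri`, `result`) are hypothetical, not TRI-AD's actual API:

```python
from abc import ABC, abstractmethod

class VendorAdapter(ABC):
    """One adapter per annotation vendor hides its tool-specific formats."""

    @abstractmethod
    def to_vendor_format(self, internal_items: list) -> list:
        """Convert internal unlabeled items into the vendor's upload format."""

    @abstractmethod
    def from_vendor_format(self, vendor_labels: list) -> list:
        """Convert downloaded vendor labels into the unified internal format."""

class VendorA(VendorAdapter):
    # Hypothetical: Vendor A expects "image_uri" and nests labels under "result".
    def to_vendor_format(self, internal_items):
        return [{"image_uri": it["uri"]} for it in internal_items]

    def from_vendor_format(self, vendor_labels):
        return [{"uri": v["image_uri"], "labels": v["result"]} for v in vendor_labels]
```

Adding a new vendor then means writing one adapter, while the project-generation and downloader stages stay unchanged.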

Slide 20

Slide 20 text

Example: tool differences
● Vendor A: simultaneous BBox and segmentation annotation in one project
● Vendor B: needs separate projects run in a pipeline (BBox project → segmentation project)

Slide 21

Slide 21 text

2. Vendor-agnostic annotation platform
Pipeline: unlabeled data format converter → project generation & submission → annotation → downloader → labeled data format converter
The control layer handles both cases:
● Vendor A’s tool: simultaneous annotation (multiple tasks in one project)
● Vendor B’s tool: non-simultaneous, with multiple annotation tasks (multiple single-task projects processed in series)
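The control logic above can be sketched as a single dispatch function: one combined project for a Vendor A-style tool, a series of single-task projects for a Vendor B-style tool. The attribute and method names (`supports_multitask`, `submit_project`) are assumptions for illustration:

```python
def submit(vendor, tasks: list, data: list) -> list:
    """Return the project IDs created for the given annotation tasks."""
    if vendor.supports_multitask:
        # Simultaneous: one project carries all tasks (e.g. bbox + seg).
        return [vendor.submit_project(tasks, data)]
    # Serial: one project per task, each stage consuming the same data.
    return [vendor.submit_project([t], data) for t in tasks]

class DummyVendor:
    """Stand-in vendor client used only to exercise the dispatch logic."""
    def __init__(self, supports_multitask: bool):
        self.supports_multitask = supports_multitask
        self._n = 0
    def submit_project(self, tasks, data):
        self._n += 1
        return f"prj-{self._n}-{'+'.join(tasks)}"

print(submit(DummyVendor(True), ["bbox", "seg"], ["img1"]))   # one combined project
print(submit(DummyVendor(False), ["bbox", "seg"], ["img1"]))  # two serial projects
```

Keeping this branch in the platform, rather than in each project definition, is what makes the rest of the pipeline vendor-agnostic.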

Slide 22

Slide 22 text

3. Easy delivery and sharing of datasets
Pipeline: unlabeled data format converter → project generation & submission → annotation → downloader → labeled data format converter → MLOps dataset DB
● Easy delivery with a unified annotation format
● Standardized dataset loader
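A unified annotation format is what makes a single standardized loader possible: every record, regardless of vendor, carries the same keys. A minimal sketch with a hypothetical JSON-lines schema (`uri`, `task`, `labels`); the real schema is not specified in these slides:

```python
import json

def load_records(lines) -> list:
    """Parse unified annotation records: one JSON object per line."""
    records = []
    for line in lines:
        rec = json.loads(line)
        # Every record, from any vendor, must expose the same three keys.
        assert {"uri", "task", "labels"} <= rec.keys()
        records.append(rec)
    return records

sample = '{"uri": "s3://bucket/img.png", "task": "bbox", "labels": [[0, 0, 10, 10]]}'
print(load_records([sample])[0]["task"])  # bbox
```

With this contract, the MLOps dataset DB and every training pipeline can share one loader instead of one per vendor format.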

Slide 23

Slide 23 text

Summary
At TRI-AD, for multi-task and large-volume annotation requirements such as:
➔ Volume: 1,000 ~ XX0,000 items
➔ Accuracy: within several percent
➔ Data: image, video, LiDAR, etc.
➔ Tasks: ~20 projects (point, box, segmentation, etc.)
➔ Delivery date: 1-2 months to half a year
we built a vendor-agnostic annotation platform so that multiple vendors can be used easily.
A team of 7-8 people handles these massive annotation projects.

Slide 24

Slide 24 text

Silicon Valley “Innovation”
Japanese “Craftsmanship”
NOW HIRING