
OOP 2020 - Die Rolle von Architektur im Zeitalter von KI und autonomen Systemen (The Role of Architecture in the Age of AI and Autonomous Systems)

AI and autonomous systems are on the rise and are highly dependent on data. The topic is omnipresent and is usually examined from a very algorithmic, data-centric, code-level perspective. What is often missing is a view of the overall architecture of systems that contain AI. Our talk helps software architects keep the big picture in mind and ask the essential questions. To do so, we follow the steps of data collection, machine learning, and model usage in order to examine architectural aspects systematically.

Dominik Rost

February 05, 2020



Transcript

  1. Dr. Matthias Naab, Dr. Dominik Rost. 05.02.2020, OOP 2020 | München: Die Rolle von Architektur im Zeitalter von KI und autonomen Systemen
  2. Dominik Rost / Matthias Naab: Software Architect / Software Architect; Data Science ¯\_(ツ)_/¯; Artificial Intelligence ¯\_(ツ)_/¯; Autonomous Systems ¯\_(ツ)_/¯
  3. Information Sources: Company Websites, Success Stories, etc.; Technologies, Tech Tutorials, etc.; Architecture: ?
  4. Goals of this Talk: We elaborate the topic for software architects.
     ◼ Create: a "big picture" of the architecture of ML-based systems; an architecture language for AI-based systems
     ◼ Foundation for: structured thinking about and designing ML-based systems; talking to AI experts and data scientists; judging existing concepts and technologies and filling your own toolbox
  5. Approach. Top-Down: system decomposition according to architecture views (functions, data, deployment, …). Bottom-Up: explanation and classification of major concepts (ML concepts, process steps, …).
  6. Example: Autonomous Driving. Information partially based on "Tesla Autonomy Day", https://www.youtube.com/watch?v=-b041NXGPZ8
  7. Some Terms: What is Artificial Intelligence? In 5 Minutes: https://www.youtube.com/watch?v=2ePf9rue1Ao
  8. Source: AWS Summit Berlin 2019: ML Crash Course; 15 Algorithms, 60 Minutes, No Equations
  9. Engineering Traditional Systems vs. ML-based Systems. Traditional: Input Data + Program → Output. ML-based: Input Data + Expected Output → Model. Mix of dimensions :(
  10. Engineering Traditional Systems vs. ML-based Systems. Traditional: at DevTime, Requirements drive Traditional Software Engineering (Methods & Tools), which produces the Software System; at RunTime, the Software System turns Input Data into Output Data. ML-based: at DevTime, Requirements and Training Data drive SE for ML-Based Systems together with Machine Learning / Data Science (Methods & Tools), which produce a Software System based on ML containing an ML Component (ML-Training); at RunTime, that system turns Input Data into Output Data (ML-Inference).
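To make the contrast concrete, here is a minimal sketch (not from the talk): in the traditional style the developer writes the program from the requirements; in the ML-based style, input data plus expected output produce the model at DevTime, which is then used for inference at RunTime. The spam example, data, and all names are illustrative assumptions.

```python
# Traditional: a developer encodes the mapping from input to output by hand.
def is_spam_traditional(subject: str) -> bool:
    # The "program" is written explicitly from requirements.
    return "free money" in subject.lower()

# ML-based: the mapping is derived from input data plus expected output (labels).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

subjects = ["free money now", "meeting at 10", "win free money", "project update"]
labels = [1, 0, 1, 0]                  # expected output, provided at DevTime

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(subjects, labels)            # ML-Training (DevTime)
print(model.predict(["free money"]))   # ML-Inference (RunTime)
```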
  11. Scope and Focus wrt. AI / ML. Traditional Software Engineering (Methods & Tools) is used to develop the Software System; SE for ML-Based Systems and Data Science (Methods & Tools) are used to develop the Software System based on ML and its ML Component. Focus: a system of substantial size, complexity, and quality requirements.
  12. Out of Scope:
     ◼ Foundations of ML
     ◼ Algorithms in detail
     ◼ Topology design of NNs
     ◼ Detailed technologies in ML
     ◼ Data analytics with respective tools
     ◼ Detailed architecture of autonomous driving systems
  13. Data Flow through a Software System based on ML: Input Data → Data Pre-Processing → ML Component → Data Post-Processing → Output Data, all inside the Software System based on ML.
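A minimal sketch of this flow, assuming hypothetical pre_process/post_process functions and an sklearn-style model: the pre- and post-processing stages wrap the ML component so that the rest of the system only sees domain-level input and output.

```python
import numpy as np

def pre_process(raw_image: np.ndarray) -> np.ndarray:
    """Data Pre-Processing: normalize and reshape into the model's input format."""
    return (raw_image.astype("float32") / 255.0).reshape(1, -1)

def post_process(raw_scores: np.ndarray) -> str:
    """Data Post-Processing: turn raw model output into a domain-level answer."""
    labels = ["no_obstacle", "obstacle"]
    return labels[int(np.argmax(raw_scores))]

def handle_request(raw_image: np.ndarray, ml_component) -> str:
    """Input Data -> Pre-Processing -> ML Component -> Post-Processing -> Output Data."""
    features = pre_process(raw_image)
    raw_scores = ml_component.predict_proba(features)   # inference only
    return post_process(raw_scores)
```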
  14. Multiple ML Components in a System: the Software System based on ML may contain several ML components (ML C1 … ML C4) between Input Data and Output Data. Architecture decision: how many ML components, and which ones?
  15. Example Autonomous Driving: Alternative Functions. Alternative A: one Software System based on ML with a single "Driving" component between Sensors/Cameras and Driving Actuators, plus Data Pre-/Post-Processing. Alternative B: separate ML components for Driving Area Detection, Obstacle Detection, and Road Sign Detection, combined by a "Steering" function. Alternative C: even finer-grained components, e.g. Road Marking Detection, …
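As a sketch of this architecture decision (component names are taken from the slide, all interfaces are hypothetical), Alternative A hands the whole task to one end-to-end component, while Alternative B composes several ML components with conventional steering logic:

```python
from typing import Protocol

class MLComponent(Protocol):
    """Architectural view of an ML component: a black box with a predict operation."""
    def predict(self, features): ...

# Alternative A: one end-to-end "Driving" component.
def drive_end_to_end(sensor_frame, driving: MLComponent):
    return driving.predict(sensor_frame)          # sensors in, actuator commands out

# Alternative B: several ML components plus conventional "Steering" logic.
def drive_composed(sensor_frame, area: MLComponent, obstacles: MLComponent,
                   signs: MLComponent, steering_logic):
    detections = {
        "driving_area": area.predict(sensor_frame),
        "obstacles": obstacles.predict(sensor_frame),
        "road_signs": signs.predict(sensor_frame),
    }
    return steering_logic(detections)              # deterministic, testable code
```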
  16. Logical Structure of an ML Component (Generalized, Neural Network).
     ◼ ML Model (fixed in inference): Weights (trained), Topology (layers, nodes, relationships), Hyperparameters
     ◼ Code / Logic: Basic Neural Network Logic, Learning / Training Logic
     ◼ Data: Training Data, Input Data, Output Data, Config Data, Activations, State (optional; e.g. in Recurrent Neural Networks with feedback relationships)
     Architecturally, the ML Component can be treated as a black box; it is the unit of training.
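A rough sketch of these parts as a data structure; the grouping follows the slide, but the attribute names and methods are our own illustration, not a prescribed interface.

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Optional

@dataclass
class MLComponent:
    """Illustrative grouping of the parts named on the slide."""
    # ML Model (fixed during inference)
    topology: Dict[str, Any]            # layers, nodes, relationships
    weights: List[float]                # trained parameters
    hyperparameters: Dict[str, Any]     # e.g. learning rate, batch size
    # Data
    state: Optional[Any] = None         # e.g. hidden state of a recurrent network
    activations: Optional[Any] = None   # transient, produced during a forward pass

    def infer(self, input_data):
        """Basic neural-network logic: topology + weights applied to input data."""
        raise NotImplementedError

    def train_step(self, training_data):
        """Learning / training logic: adjusts weights (and possibly topology)."""
        raise NotImplementedError
```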
  17. ML Component Example: Topology of a Convolutional Neural Network (CNN). Decisions about the topology of the neural network are mainly made by data scientists; architects need a basic understanding to judge the external implications. https://www.easy-tensorflow.com/tf-tutorials/convolutional-neural-nets-cnns
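For illustration, a small CNN topology written in Keras; the input shape (28x28 grayscale), layer sizes, and ten output classes are assumptions, since the slide only points to the general layer structure.

```python
import tensorflow as tf
from tensorflow.keras import layers

cnn = tf.keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),   # convolution extracts local features
    layers.MaxPooling2D((2, 2)),                    # pooling reduces spatial resolution
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),         # class probabilities
])
cnn.summary()  # architects mostly care about size, input/output shapes, compute cost
```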
  18. Differences between Types of Learning / Training. Supervised Learning: the ML Component is trained on Training Data plus Labels. Unsupervised Learning: the ML Component is trained on Training Data only. Reinforcement Learning (active): the ML Component interacts with an Environment (e.g. simulated or real) via Actions and receives Observations and Rewards.
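A brief sketch of how the three styles differ in their data contracts, using scikit-learn for the supervised and unsupervised cases; the reinforcement-learning loop is only shown in comments because agent and env are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X = np.random.rand(100, 4)
y = (X[:, 0] > 0.5).astype(int)

# Supervised: training data plus labels.
supervised = LogisticRegression().fit(X, y)

# Unsupervised: training data only; structure is discovered.
unsupervised = KMeans(n_clusters=2, n_init=10).fit(X)

# Reinforcement (active): an agent interacts with an environment.
# for step in range(1000):
#     action = agent.choose(observation)
#     observation, reward, done = env.step(action)   # environment returns feedback
#     agent.learn(observation, reward)
```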
  19. One Learning / Training Step (operating on the ML Component from slide 16: weights, topology, hyperparameters, activations, state). 1) Feed selected training data into the NN. 2) The training logic analyses the output data. 3) Adjust (by the learning logic or the data scientist): weights, add / delete neurons, add / delete relationships, functions of neurons, hyperparameters.
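The same three-step loop, sketched without any framework for a single linear "neuron" with a fixed topology; only the weights are adjusted here, and the learning rate stands in for the hyperparameters. All values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))                 # selected training data (one batch)
y_true = X @ np.array([1.0, -2.0, 0.5])      # expected output
weights = np.zeros(3)
learning_rate = 0.1                          # a hyperparameter

y_pred = X @ weights                         # 1) feed training data into the "network"
error = y_pred - y_true                      # 2) training logic analyses the output data
gradient = X.T @ error / len(X)
weights -= learning_rate * gradient          # 3) adjust the weights
print("loss:", float(np.mean(error ** 2)))
```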
  20. Overall Lifecycles / Workflows and Data Involved. ML-Training (DevTime): Data Collection → Data Ingestion → Data Preparation → Model Selection & Training & Evaluation → Model Persistence. Model Deployment hands the model over to ML-Inference (RunTime): Data Ingestion → Data Preparation → Inference. Training works on large amounts of data, is computing-intensive, and follows an exploratory approach; inference works on concrete input data and is comparably cheap, "just computation".
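A minimal sketch of the DevTime/RunTime split with scikit-learn and joblib; the model type, file name, and dataset are assumptions. The point is that only a persisted model artifact crosses the deployment boundary.

```python
# DevTime: model selection & training & evaluation, then model persistence.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
joblib.dump(model, "model-v1.joblib")        # Model Persistence

# RunTime (possibly on another machine): model deployment + inference only.
deployed = joblib.load("model-v1.joblib")    # Model Deployment
prediction = deployed.predict(X[:1])         # Inference: "just computation"
```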
  21. Feedback Data and Optimization (Batch Learning): new training data from live operation flows from ML-Inference (RunTime) back into ML-Training (DevTime); the optimized model is then deployed again.
  22. Example Autonomous Driving. Tesla: data collection from the current fleet, which drives in the real world but not autonomously yet. Collected data: camera images and driving situations, labelled from driver behaviour / steering, from explicit user feedback, and from additional sensors (e.g. radar); partially human pre-processed. Central Data Collection and Learning: Data Preparation → Model Selection & Training & Evaluation → Model Persistence; the cars are instructed which data to collect, and the optimized driving-functions model is deployed back to the fleet.
  23. Example Autonomous Driving: the Software System based on ML ("Driving", with Data Pre-/Post-Processing) is connected to Central Data Collection and Learning (Data Preparation, Model Selection & Training & Evaluation, Model Persistence).
     ▪ Architects need the overall system perspective
     ▪ Strong integration between the runtime system (cars) and development time (learning and improvement)
     ▪ Continuous improvement and deployment
     ▪ Learning from the pre-phase of autonomous driving and continuously during operation
  24. Feedback Data and Optimization (Online Learning). ML-Inference (RunTime): Data Ingestion → Data Preparation → Inference. ML-Training / Retraining (RunTime): Model Training / Optimization → Model Persistence → Model Deployment, fed by new training data from live operation. Learning can happen at defined points in time (rather than after every inference). The data science work is still done at DevTime: the model is selected and initially trained there; at runtime, the model is only optimized.
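As one concrete way to realize this (an assumption, not the slide's prescription), scikit-learn's partial_fit allows incremental optimization of an already selected model with new data from live operation.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# DevTime: model selection and initial training.
rng = np.random.default_rng(0)
X0 = rng.normal(size=(200, 5))
y0 = (X0[:, 0] > 0).astype(int)
model = SGDClassifier()
model.partial_fit(X0, y0, classes=np.array([0, 1]))

# RunTime: at defined points in time, optimize the existing model with new data
# from live operation (no model selection happens here anymore).
X_live = rng.normal(size=(50, 5))
y_live = (X_live[:, 0] > 0).astype(int)
model.partial_fit(X_live, y_live)            # incremental optimization only
```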
  25. Data Aspects:
     ◼ Large amounts of data are needed for training; the amount depends on the application area, the available data, and the ML models / algorithms
     ◼ Very different types and formats of data: text, images, video, audio, …
     ◼ → they require very different treatment and result in very different computational load
  26. Example Autonomous Driving: Data Aspects in Autonomous Driving.
     ◼ Data needs: large data, varied data, real data
     ◼ Collect data from the fleet
     ◼ Create simulation data
     ◼ Cover edge and unusual cases
     Image: https://www.youtube.com/watch?v=-b041NXGPZ8
  27. Design Alternatives: Deployment Options. Training runs on powerful server hardware (Training HW). For inference, the ML Component can be deployed on the client, on the server, or split across client and server; with online learning, the ML-Training (RunTime) parts (Model Training / Optimization, Model Persistence, Model Deployment) have to be placed as well.
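A sketch of the "ML component on the server" alternative, exposing inference behind an HTTP endpoint with Flask; the endpoint name, payload format, and model file are assumptions. The client-side alternative would instead ship the persisted model to the device.

```python
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model-v1.joblib")           # same artifact as produced at DevTime

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]    # e.g. {"features": [[0.1, 0.2, ...]]}
    return jsonify(prediction=model.predict(features).tolist())

if __name__ == "__main__":
    app.run(port=8080)
```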
  28. Example Autonomous Driving: Multiple Instances of Systems (Cars), each a Software System based on ML with its own ML Component; training happens on powerful server hardware, new training data comes from live operation, and the optimized model is deployed back to the fleet.
     ◼ Learning strategies: online learning in each car? Batch learning in a central system only? Can cars communicate?
     ◼ Compare: learning of typing recognition on a mobile phone
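One possible realization of the "batch learning in a central system" strategy is a federated-averaging-style aggregation of per-car updates; this is a generic sketch and explicitly not a claim about how Tesla or any fleet actually does it.

```python
import numpy as np
from typing import List

def aggregate_updates(local_weights: List[np.ndarray]) -> np.ndarray:
    """Average the weight vectors reported by the individual cars."""
    return np.mean(np.stack(local_weights), axis=0)

# Each car only collects data or computes a local update; the central system aggregates.
fleet_updates = [np.random.normal(size=10) for _ in range(1000)]   # one vector per car
global_weights = aggregate_updates(fleet_updates)
# The optimized model (global_weights) is then deployed back to every car.
```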
  29. Available Technologies as Services / Libraries for ML: different levels of reuse (shown against the logical structure of the ML Component from slide 16; ordered by increasing degree of freedom, knowledge needed, and effort needed):
     ◼ Fully trained model, immutable (as API or library) [e.g. a service for image tagging]
     ◼ Fully trained model, retrainable (as API or library) [e.g. a service for image tagging]
     ◼ Predefined topology (as API or library) [e.g. predefined CNNs]
     ◼ Basic ML model (as library) [e.g. general NN logic]
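Two of these reuse levels sketched with Keras (library choice and models are our assumptions): reusing a fully trained, predefined network versus building a model from the basic library upwards.

```python
import tensorflow as tf

# Fully trained model reused as a library: little freedom, little knowledge and effort needed.
pretrained = tf.keras.applications.MobileNetV2(weights="imagenet")

# Basic ML model (as library): full freedom, full knowledge and effort needed.
custom = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
custom.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```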
  30. Microsoft AI & ML Technologies, mapped onto these reuse levels: fully trained model, immutable (as API or library); fully trained model, retrainable (as API or library); predefined topology (as API or library) [e.g. predefined CNNs]; basic ML model (as library) [e.g. general NN logic]. https://www.credera.com/wp-content/uploads/2018/04/The-Microsoft-AI-platform.png
  31. Quality Attributes in ML-based Systems (1/2):
     ◼ As a technology, ML inherently aims more at realizing functionality than at realizing quality attributes (in contrast to, e.g., communication middleware or blockchain)
     ◼ However, ML can be used to support achieving some quality attributes (e.g. certain aspects of security, by detecting attack patterns with ML)
     ◼ The usage of ML has significant impact on quality attributes and thus needs architectural treatment
     ◼ One key aspect: missing comprehensibility / explainability of what is happening in the ML component
     ◼ Safety, reliability: conflicts with safety standards, needs counter-measures
     ◼ UX: explaining to the user what happens / integrating the user into the overall flow
  32. Quality Attributes in ML-based Systems (2/2):
     ◼ Fulfil the respective quality attributes of the system, respecting its overall "scale": performance (latency, throughput), scalability, …; considering the runtime system, but also the devtime / learning system
     ◼ Quality attribute settings differ completely between systems: playing Go against the world champion is massive power on a single complex task; calculating a model for all product recommendations of Amazon is massive power on many smaller tasks
     ◼ Provide an adequate execution environment: sufficient computing power and storage capacity
     ◼ Provide the right data with adequate frequency and latency
     ◼ The architect has to know the requirements / implications of the ML algorithm / model
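Since latency and throughput requirements have to be checked against the concrete ML component, here is a minimal measurement sketch; the model and data are placeholders.

```python
import time
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.random.rand(1000, 20)
y = (X[:, 0] > 0.5).astype(int)
model = LogisticRegression().fit(X, y)

runs = 1000
start = time.perf_counter()
for _ in range(runs):
    model.predict(X[:1])                 # single-request latency path
elapsed = time.perf_counter() - start
print(f"avg latency: {elapsed / runs * 1000:.2f} ms, "
      f"throughput: {runs / elapsed:.0f} req/s")
```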
  33. Conclusion: What does it mean for me? What can I do?
     ◼ Keep an eye on the architectural big picture, even if there is ML in the system ;-)
     ◼ Understand the very nature of ML-based systems
     ◼ Learn from existing systems and their solution approaches
     ◼ Remember the essentials of software architecture: achieving quality attributes, dealing with uncertainty, organizing and distributing work
     ◼ Fill your toolbox with knowledge about patterns and technologies in the ML area
     ◼ Start working with data scientists / data engineers and establish a common language
  34. Dr. Matthias Naab, Dr. Dominik Rost. 05.02.2020, OOP 2020 | München: Die Rolle von Architektur im Zeitalter von KI und autonomen Systemen