
Autonomous Vehicles (AVs): Basics and Testing Challenges


The session will be held as part of Exactpro’s ongoing AI Testing Talks event series. Session expert, Julia Emelianova, PhD, Researcher, Exactpro, will cover:
📌the motivation behind and the process of autonomous vehicle development,
📌the main principles of automated driving and the existing navigation challenges,
📌testing approaches and the main testing objectives for autonomous vehicles.

Follow us on
LinkedIn https://www.linkedin.com/company/exactpro-systems-llc
Twitter https://twitter.com/exactpro

Exactpro
July 14, 2022


Transcript

  1. 1 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    exactpro.com
    Autonomous Vehicles (AVs):
    Basics and Testing Challenges
    Julia Emelianova
    PhD, Researcher, Exactpro
    14 JULY | 1:30 PM SLST


  2. 2 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    Contents
    1. Motivation for Building Autonomous Vehicles (AVs) 4
    2. Autonomous Vehicles: History 5
    2.1. Phase 1. Foundational Research 5
    2.2. Phase 2. Grand Challenges 7
    2.3. Phase 3. Commercial Development 8
    3. Levels of Automation 9
    4. Automated and Connected Driving 11
    4.1. Internet of Vehicles (IoV) - Big Data Architecture 11
    4.2. Vehicle-to-everything (V2X) Communication 12
    4.3. Common AV Sensor Setup 13
    4.4. Main Parts of the AV Navigation Process 16


  3. 3 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    Contents
    5. AV Navigation Challenges 17
    5.1. Perception Challenges 17
    5.2. Localization Challenges 18
    5.3. Prediction and Decision Making Challenges 19
    5.4. Data Processing Challenges 20
    6. AV Testing 21
    6.1. Test Instances Based on X-In-the-Loop (XiL) Approaches 21
    6.2. AV Evolutionary Design and Testing Flow as a V-Model 23
    6.3. Main Testing Objectives 24
    6.4. Examples of AV Simulators 25
    6.5. Scenario-Based SiL Testing Approach 27
    6.6. Examples for Attack Simulation 34
    6.7. Metrics for AV Testing 38


  4. 4 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    1. Motivation for Building Autonomous Vehicles (AVs)
    An Autonomous Vehicle (AV) is a self-driving car that moves safely with little or no human input.
    Expected benefits include:
    - greater personal independence
    - saving money
    - increased productivity
    - reduced traffic congestion
    - environmental gains
    - greater road safety


  5. 5 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    2. Autonomous Vehicles: History
    2.1. Phase 1. Foundational Research
    From 1980 to 2003, university research centers, sometimes in partnership with transportation agencies and automotive companies, undertook basic studies of autonomous transportation. Two main technology concepts emerged from this work:
    1. The development of automated highway systems, in which vehicles depend significantly on the highway infrastructure to guide them:
    - following magnets embedded in the roadway to maintain lane centering
    - tracking a radar-reflective stripe, etc.
    National Automated Highway Systems Consortium Demo '97 in San Diego
    https://www.ri.cmu.edu/pub_files/pub2/thorpe_charles_1997_2/thorpe_charles_1997_2.pdf
    https://www.youtube.com/watch?v=RlZEeIC_2lI
    https://www.jstor.org/stable/10.7249/j.ctt5hhwgz.11?seq=2


  6. 6 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    2. Autonomous Vehicles: History
    2.1. Phase 1. Foundational Research
    2. The development of both semi-autonomous and autonomous vehicles that depend little on the highway infrastructure:
    - the vision-guided vehicles VaMoRs (1986), VaMP and VITA-2 (the Prometheus project, 1987-1995), built by a team led by Ernst Dickmanns
    https://www.lifehacker.com.au/2018/12/today-i-discovered-the-self-driving-car-trials-of-the-1980s/
    - the Navlab models 1-5 vehicles developed at Carnegie Mellon University (1984-1995)
    https://www.digitaltrends.com/cars/first-self-driving-car-ai-navlab-history/
    Images: VaMoRs, 1986; VaMP, 1995; Navlab models 1-5, 1984-1995


  7. 7 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    2. Autonomous Vehicles: History
    2.2. Phase 2. Grand Challenges
    2003 - 2007, DARPA Challenges:
    1. 2004 - self-navigate 240 km of desert roadway (no car completed the route)
    2. 2005 - self-navigate 212 km (5 cars completed the course)
    3. 2007 - the Urban Challenge in a mock city environment (4 cars completed the route within the allotted 6-hour time limit)
    Over the course of these challenges, better software and improved camera, radar and laser sensors advanced road following and collision avoidance. The autonomous systems sensed the environment and made the driving decisions on their own.
    2005 DARPA Grand Challenge
    https://en.wikipedia.org/wiki/DARPA_Grand_Challenge


  8. 8 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    2. Autonomous Vehicles: History
    2.3. Phase 3. Commercial Development
    Since 2007


  9. 9 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    3. Levels of Automation
    https://www.sae.org/binaries/content/assets/cm/content/blog/sae-j3016-visual-chart_5.3.21.pdf
    https://en.wikipedia.org/wiki/Advanced_driver-assistance_system
    https://www.asam.net/index.php?eID=dumpFile&t=f&f=2776&token=6e8da9b58594ecefb20dcc06c0ac55480df14c67
    SAE J3016 (published in 2014)
    The Society of Automotive Engineers (SAE) defines 6 levels of driving automation, ranging from 0 (fully manual) to 5 (fully autonomous).
    Advanced Driver-Assistance System (ADAS): any of the groups of electronic technologies that assist drivers with driving and parking functions.


  10. 10 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    3. Levels of Automation
    Robotaxis with SAE Level 4 automation are already used for passenger transportation in several cities in both the USA and China.
    Mercedes-Benz became the world's first manufacturer to have a Level 3 autonomous driving system approved for European public roads, starting from May 17, 2022.
    https://www.therobotreport.com/mercedes-rolls-out-level-3-autonomous-driving-tech-in-germany/
    https://waymo.com/waymo-driver/
    https://www.autox.ai/en/index.html


  11. 11 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    4. Automated and Connected Driving
    4.1. Internet of Vehicles (IoV) - Big Data Architecture
    MAN - Metropolitan Area Network
    RFID - Radio Frequency IDentification
    https://www.sciencedirect.com/science/article/pii/S1877050918304083


  12. 12 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    4. Automated and Connected Driving
    4.2. Vehicle-to-everything (V2X) Communication
    https://www.thalesgroup.com/en/markets/digital-identity-and-security/iot/industries/automotive/use-cases/v2x


  13. 13 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    4. Automated and Connected Driving
    4.3. Common AV Sensor Setup
    - Inertial Measurement Units (IMU) or Inertial Navigation Systems (INS)
    - Global Positioning Systems (GPS) and Differential GPS (DGPS)
    - Odometry Sensors
    https://arxiv.org/pdf/1910.07738.pdf
    https://www.cpp.edu/~ftang/courses/CS521/notes/sensing.pdf
    https://www.ansys.com/content/dam/resource-center/article/ansys-advantage-autonomous-vehicles-aa-V12-i1.pdf (page 32)


  14. 14 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    4. Automated and Connected Driving
    4.3. Common AV Sensor Setup
    https://techgameworld.com/mercedes-drive-pilot-approved-in-germany/


  15. 15 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    4. Automated and Connected Driving
    4.3. Common AV Sensor Setup
    RADAR:
    - object detection at a long distance and through fog or clouds
    - resolution of several meters at a distance of 100 m
    - a noisy sensor
    LiDAR:
    - a high level of accuracy for 3D mapping
    - resolution of a few centimeters at a distance of 100 m
    Camera:
    - colour detection
    - unable to measure distances directly
    https://www.fierceelectronics.com/components/lidar-vs-radar
    https://higherlogicdownload.s3.amazonaws.com/AUVSI/14c12c18-fde1-4c1d-8548-035ad166c766/UploadedImages/2017/PDFs/Proceedings/BOs/Bo6-1.pdf
    https://www.researchgate.net/figure/A-possible-combination-of-sensors-for-all-weather-navigation_tbl2_346038646/download


  16. 16 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    4. Automated and Connected Driving
    4.4. Main Parts of the AV Navigation Process
    https://neptune.ai/blog/self-driving-cars-with-convolutional-neural-networks-cnn
    https://www.sciencedirect.com/science/article/abs/pii/S0968090X18302134


  17. 17 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    5. AV Navigation Challenges
    5.1. Perception Challenges
    ● Object labeling and classification
    ● Segmentation
    ● Correspondence problem
    ● Scene understanding
    https://neptune.ai/blog/self-driving-cars-with-convolutional-neural-networks-cnn
    https://www.youtube.com/watch?v=aQwqD5cB2ck
    The same object
    Labeling tools and ML/AI approaches help to address these challenges.


  18. 18 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    5. AV Navigation Challenges
    5.2. Localization Challenges
    ● Positioning and Global localization challenges (GPS, INS)
    ● Detection of road lane markings and traffic signs
    https://www.jstage.jst.go.jp/article/essfr/9/2/9_131/_pdf/-char/en
    https://neptune.ai/blog/self-driving-cars-with-convolutional-neural-networks-cnn


  19. 19 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    5. AV Navigation Challenges
    5.3. Prediction and Decision Making Challenges
    https://arxiv.org/abs/1912.11676
    ● Occupancy and flow prediction
    ● Pedestrian behavior prediction
    ● Self-driving car motion prediction
    Task examples:
    - adjusting speed when anticipating a curve
    - avoiding collisions
    - following a lane or path
    - controlling speed
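
    To make the last two tasks in the list above concrete, here is a minimal, hypothetical sketch of curve-anticipating speed control. The function names, gains and limits are illustrative assumptions, not part of any real AV stack.

```python
# Minimal, hypothetical sketch of "adjusting speed when anticipating a curve"
# and "controlling speed". Gains and limits are illustrative assumptions.
def target_speed(curvature_1pm, speed_limit_mps=25.0, max_lateral_accel_mps2=2.0):
    """Cap speed so that lateral acceleration v^2 * curvature stays within a comfort bound."""
    if curvature_1pm <= 1e-6:  # effectively straight road
        return speed_limit_mps
    return min(speed_limit_mps, (max_lateral_accel_mps2 / curvature_1pm) ** 0.5)

def throttle_command(current_speed_mps, desired_speed_mps, k_p=0.5):
    """Simple proportional controller; negative output means braking."""
    return max(-1.0, min(1.0, k_p * (desired_speed_mps - current_speed_mps)))

print(target_speed(curvature_1pm=0.02))                                   # ~10 m/s before a tight curve
print(throttle_command(current_speed_mps=15.0, desired_speed_mps=10.0))   # -1.0 (brake)
```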


  20. 20 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    5. AV Navigation Challenges
    5.4. Data Processing Challenges
    https://intellias.com/the-emerging-future-of-autonomus-driving/
    https://www.youtube.com/watch?v=lCohTPSFj3I (at 17:10)
    ● Large data volumes
    ● Ultra-high computing requirements
    ● Identification problems
    ● Difficulties in finding valuable data


  21. 21 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.1. Test Instances Based on X-In-the-Loop (XiL) Approaches
    https://www.avl.com/documents/10138/2699442/GSVF18_Validation+of+X-in-the-Loop+Approaches.pdf/0b13c98a-7e6d-45e7-baab-2a8d80403c38
    https://www.researchgate.net/publication/311919670_Autonomous_vehicles_testing_methods_review


  22. 22 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.1. Test Instances Based on X-In-the-Loop (XiL) Approaches
    https://www.avl.com/documents/10138/2699442/GSVF18_Validation+of+X-in-the-Loop+Approaches.pdf/0b13c98a-7e6d-45e7-baab-2a8d80403c38


  23. 23 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.2. AV Evolutionary Design and Testing Flow as a V-Model
    https://www.researchgate.net/publication/311919670_Autonomous_vehicles_testing_methods_review


  24. 24 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.3. Main Testing Objectives
    Description | Purpose(s)
    Search for the best sensor setup (number of sensors and their positioning on the vehicle) | Perception
    Check the correctness of sensor data annotation | Scene understanding, labeling and segmentation
    Check the decision making, planning and control algorithm model | Prediction and decision making
    Possible attack simulations and search for vulnerabilities | Security research
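
    As an illustration of how one objective from the table above (checking the decision making, planning and control algorithm model) could be turned into an automated check, here is a minimal Python sketch. run_scenario() is an assumed placeholder for a simulator run, not a real simulator API, and the thresholds are illustrative.

```python
# Hypothetical sketch: expressing a decision-making/planning check as an automated test.
# run_scenario() stands in for a real SiL simulation run; thresholds are illustrative.
def run_scenario(scenario_name):
    # In a real setup this would launch a simulation and collect the results.
    return {"collisions": 0, "reached_goal": True, "max_decel_mps2": -3.1}

def test_unprotected_left_turn():
    result = run_scenario("unprotected_left_turn_with_oncoming_traffic")
    assert result["collisions"] == 0
    assert result["reached_goal"] is True
    assert result["max_decel_mps2"] > -6.0  # no emergency-level braking

test_unprotected_left_turn()
print("scenario checks passed")
```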


  25. 25 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.4. Examples of AV Simulators
    Open-source
    Simulator | Owner
    SVL Simulator | LG Electronics America R&D Centre
    Apollo Simulation Platform | Baidu
    CARLA Simulator | CARLA Team
    Udacity AV unity simulator | Udacity
    SUMO | Eclipse Foundation
    AirSim | Microsoft
    SUMMIT |
    PGDrive |
    Closed-source
    Simulator | Owner
    Prescan | Siemens
    CarSim | Mechanical Simulation Corporation
    Carcraft simulator | Waymo
    Virtual Testing Suite platform | Aurora
    VIRES VTD simulator | VIRES Simulationstechnologie GmbH
    rFpro’s simulation software | rFpro


  26. 26 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.4. Examples of AV Simulators
    Scene understanding, labeling and segmentation testing
    https://www.youtube.com/watch?v=NksyrGA8Cek
    VIRES VTD simulator
    CARLA simulator
    https://carla.readthedocs.io/en/latest/ref_sensors/
    https://www.mdpi.com/2076-3417/12/1/281/htm


  27. 27 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    Simulation Cycle
    https://www.claytex.com/tech-blog/automated-testing-methodologies-for-autonomous-vehicles/
    Two main approaches to scenario generation:
    1. Manual
    2. Automatic


  28. 28 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    https://www.asam.net/standards/simulation-domain-overview/
    In 2018, the Association for Standardisation of Automation and Measuring Systems (ASAM) started a standardisation process to enable collaborative data development and data exchange between different tools and simulators, making autonomous vehicle testing more flexible and easier.


  29. 29 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    ASAM recorded data and scenarios workflow
    https://www.asam.net/project-detail/asam-openlabel-v100/


  30. 30 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    Scenario generation is the process of taking a variable set from the Test Manager and creating a test case in simulation.
    Each simulation includes:
    ● Ego Vehicle with the system under test, i.e. the controller
    ● Road Network
    ● Road Features
    ● Traffic
    ● Weather
    A minimal configuration sketch follows below.
    Data Layer Model for Scenario Description
    https://www.pegasusprojekt.de/files/tmpl/Pegasus-Abschlussveranstaltung/15_Scenario-Database.pdf
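
    Below is a minimal, hypothetical sketch of such a scenario description expressed as plain Python data. The field names are illustrative only and do not follow any particular simulator or ASAM OpenSCENARIO schema.

```python
# Hypothetical scenario description covering the elements listed above:
# ego vehicle (system under test), road network, road features, traffic and weather.
from dataclasses import dataclass, field

@dataclass
class EgoVehicle:
    model: str = "sedan"
    controller: str = "system_under_test"   # the planner/controller being tested
    target_speed_kmh: float = 50.0

@dataclass
class Scenario:
    road_network: str = "two_lane_highway"  # reference to a map description
    road_features: list = field(default_factory=lambda: ["on_ramp", "speed_limit_sign"])
    traffic: list = field(default_factory=lambda: [
        {"type": "car", "behavior": "lane_follow", "initial_gap_m": 30.0},
        {"type": "pedestrian", "behavior": "cross_road", "trigger_distance_m": 15.0},
    ])
    weather: dict = field(default_factory=lambda: {"rain": 0.2, "fog": 0.0, "time_of_day": "noon"})
    ego: EgoVehicle = field(default_factory=EgoVehicle)

scenario = Scenario()
print(scenario.ego.controller, scenario.weather)
```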


  31. 31 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    Example of Test Cases generation
    https://www.researchgate.net/publication/311919670_Autonomous_vehicles_testing_methods_review


  32. 32 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    Example of Test Cases generation
    SVL Simulator


  33. 33 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.5. Scenario-Based SiL Testing Approach
    Test areas:
    1. Configuration of the system
       a. Environmental Conditions
          i. Weather Conditions
          ii. Road Conditions
          iii. Illumination
       b. Traffic Infrastructure
          i. Traffic Speed / Driving Modes
          ii. Agents Diversity
       c. Physical Infrastructure
          i. Roadway Types
          ii. Roadway Surfaces
          iii. Roadway Geometry
          iv. Geographic Area
       d. Operational Constraints
          i. Speed Limit
          ii. Traffic Conditions
    2. Configuration of the ego vehicle
       a. Vehicle Model
       b. Sensor Configuration
    3. Traffic actions
       a. Safety scenarios (regular traffic actions)
       b. Traffic accidents and road emergencies
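
    To illustrate how test cases can be generated automatically from a few of the configuration dimensions listed above, here is a minimal Python sketch. The parameter values are illustrative assumptions, not an actual test matrix.

```python
# Hypothetical combinatorial test-case generation over a few configuration dimensions.
import itertools

weather = ["clear", "rain", "fog"]
roadway_type = ["highway", "urban", "rural"]
speed_limit_kmh = [30, 50, 90]
traffic_action = ["regular_traffic", "sudden_braking", "pedestrian_crossing"]

test_cases = [
    {"weather": w, "roadway_type": r, "speed_limit_kmh": s, "traffic_action": t}
    for w, r, s, t in itertools.product(weather, roadway_type, speed_limit_kmh, traffic_action)
]

print(len(test_cases), "generated test cases")  # 3 * 3 * 3 * 3 = 81
print(test_cases[0])
```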


  34. 34 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.6. Examples for Attack Simulation
    Attack generation against sensors on the Perception and Prediction levels
    Attack against camera
    https://dl.acm.org/doi/10.1145/3372297.3423359


  35. 35 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.6. Examples for Attack Simulation
    https://dl.acm.org/doi/10.1145/3372297.3423359
    Attacking Tesla via a digital billboard
    Attack generation against sensors on the Perception and Prediction levels
    Attack against camera


  36. 36 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.6. Examples for Attack Simulation
    https://sites.google.com/view/cav-sec/planfuzz
    https://www.ndss-symposium.org/wp-content/uploads/2022-177-paper.pdf
    Semantic Denial-of-Service (DoS) vulnerabilities are commonly exposed in practical AD systems because of their strong tendency to avoid safety incidents. They arise from the program code of the AD planning and decision-making modules.
    Lane-following DoS attack: in this scenario, the AD vehicle keeps cruising in its current lane while static or dynamic obstacles are located outside the current lane boundaries. The victim AD vehicle permanently stops because of off-the-road cardboard boxes.
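
    As a simplified, hypothetical illustration of this pattern (not code from Apollo or any real AD stack), the sketch below shows how an over-conservative planning rule can be exploited: the planner halts whenever any obstacle falls within a large lateral buffer, even if the obstacle is entirely off the road.

```python
# Simplified, hypothetical illustration of a semantic DoS in a planning rule.
# Lane width and buffer values are made up; real planners are far more complex.
LANE_HALF_WIDTH_M = 1.75
SAFETY_BUFFER_M = 2.5  # over-conservative lateral buffer around the lane

def should_stop(obstacle_lateral_offsets_m):
    """Stop if any obstacle lies within the lateral buffer of the lane center."""
    return any(abs(d) < LANE_HALF_WIDTH_M + SAFETY_BUFFER_M
               for d in obstacle_lateral_offsets_m)

# Cardboard boxes placed ~2 m outside the lane boundary (clearly off the road)
boxes = [LANE_HALF_WIDTH_M + 2.0, LANE_HALF_WIDTH_M + 2.2]
print(should_stop(boxes))  # True: the vehicle stops even though its lane is clear
```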


  37. 37 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.6. Examples for Attack Simulation
    https://sites.google.com/view/cav-sec/planfuzz
    Lane-changing DoS attack on Apollo: in this scenario, the victim AD vehicle gives up a necessary lane change even though the target lane is empty and the attacking vehicle following it shows no intention of changing into that lane.


  38. 38 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.7. Metrics for AV Testing
    https://www.researchgate.net/publication/348116055_Quality_Metrics_and_Oracles_for_Autonomous_Vehicles_Testing
    1. Driving quality metrics based on mutual configuration parameters of:
    ● the ego vehicle
    ● infrastructure elements (signs, borders, facilities)
    ● other traffic participants (vehicles, bikes, pedestrians)
    Examples: speed, acceleration, position, steering, braking and collisions.
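
    For illustration, here is a minimal Python sketch computing a few of the driving quality metrics named above (speed, braking, collisions) from a simulated trajectory log. The log format and values are assumptions for the example.

```python
# Hypothetical driving-quality metrics from a simulated trajectory log.
samples = [
    # (time_s, speed_mps, collision_flag)
    (0.0, 0.0, False),
    (1.0, 5.0, False),
    (2.0, 9.0, False),
    (3.0, 2.0, False),  # hard braking between t=2 s and t=3 s
]

max_speed = max(speed for _, speed, _ in samples)
collisions = sum(1 for _, _, hit in samples if hit)
# Longitudinal acceleration between consecutive samples
accels = [(s2 - s1) / (t2 - t1)
          for (t1, s1, _), (t2, s2, _) in zip(samples, samples[1:])]
hardest_braking = min(accels)  # most negative value

print(f"max speed: {max_speed} m/s, hardest braking: {hardest_braking} m/s^2, collisions: {collisions}")
```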


  39. 39 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    6. AV Testing
    6.7. Metrics for AV Testing
    https://arxiv.org/abs/1912.11676
    2. Evaluation Metrics for vehicle behavior prediction
    a. Intention prediction metrics:
    ● Accuracy
    ● Precision
    ● Recall
    ● F1 Score
    ● Negative Log Likelihood (NLL)
    ● Average Prediction Time
    b. Trajectory prediction metrics:
    ● Final Displacement Error (FDE)
    ● Mean Absolute Error (MAE) or Root Mean Square Error (RMSE)
    ● Minimum of K Metric
    ● Cross Entropy
    ● Computation Time
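
    As a worked example, the sketch below computes two of the trajectory prediction metrics listed above, Final Displacement Error (FDE) and Root Mean Square Error (RMSE), on made-up sample trajectories.

```python
# Hypothetical example of FDE and RMSE for a predicted vs. ground-truth trajectory.
import math

predicted    = [(0.0, 0.0), (1.0, 0.1), (2.1, 0.3), (3.0, 0.8)]  # (x, y) per time step
ground_truth = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.2), (3.2, 0.5)]

def displacement(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# FDE: displacement between the final predicted and final ground-truth points
fde = displacement(predicted[-1], ground_truth[-1])

# RMSE: root of the mean squared displacement over all time steps
rmse = math.sqrt(sum(displacement(p, q) ** 2
                     for p, q in zip(predicted, ground_truth)) / len(predicted))

print(f"FDE = {fde:.3f} m, RMSE = {rmse:.3f} m")
```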


  40. 40 BUILD SOFTWARE TO TEST SOFTWARE
    AI Testing Talks
    Thank You!
    Questions?
