01 - Introduction Smart Robotics Undergrad 1
https://www.youtube.com/watch?v=B8R148hFxPw
Institute of Automotive Technology
Faculty for Mechanical Engineering
Technical University of Munich
1- 4
Introduction
Prof. Dr. Markus Lienkamp
The four automotive mega trends: Connected Cars, Autonomous Driving, Electric Powertrain, Shared Mobility
1- 7
Additional Slides
Some key technologies of these four mega trends are shown below. Note that these trends influence each other;
one example is the use of self-driving robotaxis in shared / smart mobility concepts.
[Diagram: key technologies per mega trend]
Connected Cars: Over the Air, Cloud Computing, Vehicle to X, Big Data
Autonomous Driving: Artificial Intelligence, Sensor Technology
Electric Powertrain: Smart Grids, Energy Storage
Shared Mobility: Mobility as a Service, Intermodal Mobility
1- 8
Levels of Vehicle Autonomy – SAE / BASt
[Diagram: responsibility shifts from driver (human) to vehicle (machine) with increasing level: lateral or longitudinal control, traffic monitoring, takeover readiness, full autonomy]
Level 0 – No Automation: the driver performs all driving tasks. Example: Blind Spot Warning, LDW
Level 1 – Driver Assistance: the vehicle is guided by the driver, but some driving-assist features may be included. Example: LKA or ACC
Level 2 – Partial Automation: the vehicle has combined automated functions, such as acceleration and steering, but the driver must
maintain control of all driving tasks and monitor the environment at all times. Example: LKA and ACC
Level 3 – Conditional Automation: the vehicle can drive autonomously, but the driver must be ready to take back control at any
time when given notice. Example: Traffic Jam Pilot
Level 4 – High Automation: the vehicle is capable of performing all driving functions under certain conditions; the driver has the
option to take control. Example: local driverless taxi, autonomous racing
Level 5 – Full Automation: the vehicle is capable of performing all driving functions under all conditions; the driver may still have
the option to take control. Example: fully autonomous car
1- 10
Highway Pilot
Concept
▪ Highly automated freeway travel
by algorithm (Level 3-4)
▪ Traffic Jam Pilot up to 60 km/h
Increased productivity
Travel time can be used to work or to rest. The connected car becomes a car office.
https://www.dezeen.com/2014/02/21/driverless-car-concept-vehicle-xchange-by-rinspeed/
https://www.autoweek.com/news/technology/a1831286/five-levels-driving-autonomy-autoweek-explains/
1- 11
Robotaxis
Concept
▪ Autonomous cars cruise 24/7
▪ Passengers are picked up on demand and dropped off at their destination
▪ No need for stations or depots
▪ Relieves city centers of private cars
Makeover of the townscape
▪ Parking areas become obsolete
▪ Reduction of the total car volume to 1/3
https://www.sueddeutsche.de/auto/verkehrsanalyse-muenchen-ist-wieder-deutschlands-stau-hauptstadt-1.4832763
https://www.continental.com/de/produkte-und-innovationen/innovationen/unsere-vision-fuer-morgen/robo-taxi-214388
1- 12
Motivation – Access to Mobility
Ensure safety for all ages
[Chart: all crashes per million miles traveled and fatal crashes per 100 million miles traveled, plotted over driver age from 16 to 85]
https://www.cnbc.com/2018/06/13/alphabet-waymo-testing-early-riders-interview-with-saswat-panigrahi.html
http://archive.boston.com/lifestyle/health/blog/inpractice/2012/02/driving_while_old.html
1- 13
Motivation – Vision Zero, 2050
[Chart: road fatalities in Germany per year, falling from 11,300 in 1990 to 2,724 in 2020]
1- 14
Clustering of Competences
High complexity
Software algorithms as well as car manufacturing are sophisticated and require experience.
Reduced cost
Huge investments in R&D and cost savings from industrialization.
Shared risk
Risk of technical decisions and potential risks from liability and warranty claims.
Exemplary clusters of competences between OEM, supplier, mobility provider and tech firm in the field of autonomous driving
1- 16
Additional Slides
Autonomous driving is one of the most complex development challenges in the automotive industry. The broad range
of required skills and capabilities barely exists in-house at any traditional OEM, supplier or tech player. The latter are
well positioned when it comes to software development and agile working principles that shorten development
cycles and time to market, but lack experience with industrializing and scaling a real hardware business like
building cars. On the other side, OEMs and traditional automotive suppliers often struggle with the transformation
towards a new agile product and software development system with significantly shorter cycle times for E/E and
software-related functions. Cross-industry partnerships are an (almost inevitable) prerequisite to mitigate these complex
challenges and to close a company's own technology blind spots. Many of the major stakeholders have therefore engaged
in partnerships. In addition to the lack of technological or process expertise, there are several other reasons to join
forces: reduced development costs and risk sharing between partners are further important drivers for the emergence
of such cooperations. Lastly, from a topline perspective, a larger addressable customer base and the associated revenue
potential have to be mentioned.
1- 17
Introduction
Prof. Dr. Markus Lienkamp
Kröger, 2016 - Automated Driving in Its Social, Historical and Cultural Contexts
John McCarthy, 1969 – Computer Controlled Cars (http://jmc.stanford.edu/commentary/progress/cars.pdf)
1- 19
Additional Slides
The first attempt at a driverless vehicle in public traffic was realized by Francis P. Houdina, a former U.S. Army
electrical engineer, in New York City in 1925. A modified Chandler sedan, later called the American Wonder, received
radio signals via an antenna that controlled its speed and direction. The car's operator sat in a vehicle directly behind.
However, the journey ended with a crash into another car full of photographers. From today's viewpoint,
this was not truly autonomous driving, but rather teleoperated driving.
In 1969, John McCarthy, one of the founding fathers of artificial intelligence, wrote an essay titled “Computer-
Controlled Cars” about the software and functional architecture of autonomous vehicles, similar to modern AVs.
McCarthy referred to an “automatic chauffeur” capable of navigating a public road via a “television camera input that
uses the same visual input available to the human driver.” He wrote that users should be able to enter a destination
using a keyboard, which would then prompt the car to immediately drive them there. Additional commands would
allow users to change the destination, stop at a restroom or restaurant, slow down, or speed up in the case of an
emergency. No such vehicle was built, but McCarthy’s essay laid out the mission for other researchers to work
toward.
Reference:
https://www.digitaltrends.com/cars/history-of-self-driving-cars-milestones/
1- 20
Milestones of Autonomous Driving
Prof. Ernst Dickmanns from the University of the Federal Armed Forces in Munich (UniBW) developed the first
visually guided autonomous cars with digital processors on board. In 1984, his team conceptualized the first
vehicle that used dynamical models for visual autonomous guidance: the VaMoRs (Versuchsfahrzeug für autonome
Mobilität und Rechnersehen) was a 5-ton van (Mercedes 508 D) able to carry the bulky computers and
cameras of the time. In summer 1987, the VaMoRs drove 20 km autonomously at speeds of up to 96 km/h (60 mph),
guided only by cameras, without radar or GPS. The technology was based on a spatiotemporal
dynamic model called the 4-D approach, which added the category of time to the three dimensions of space and
integrated feedback of prediction errors.
The concept of vision-based autonomous driving gained momentum with the EUREKA-PROgraMme for a European
Traffic of Highest Efficiency and Unprecedented Safety (PROMETHEUS) of the European Union (1987–1994).
In the context of the PROMETHEUS project, Dickmanns' team developed two S-Class (W 140) robotic vehicles
together with Mercedes-Benz: VaMP (UniBw Munich) and VITA-2 (DBAG). During the final event in October 1994 in France,
the twin robot vehicles drove more than 1000 km autonomously on three-lane highways around Paris, in the middle of
heavy traffic and at speeds of up to 130 km/h. The system was based on real-time evaluation of image sequences
captured by four cameras (320 x 240 pixels). Steering, throttle and brakes were controlled automatically through
computer commands. The next year, Dickmanns' team piloted a Mercedes S-Class from Munich to Denmark, a trip of
more than 1,600 kilometers at a maximum speed of 180 km/h with, as Dickmanns notes, "about 95% of the
distance…traveled fully automatically."
Reference:
M. Maurer, B. Lenz, H. Winner, und J. C. Gerdes, Autonomous Driving: Technical, Legal and Social Aspects. s.l.: Springer,
2016.
1- 22
Additional Slides
Announced on 30 July 2002, the first DARPA (Defense Advanced Research Projects Agency) Grand Challenge, held
in the Mojave Desert in the United States and mandated by the US Congress, offered a $1 million prize in support of the
goal of making one-third of the Armed Forces' ground combat vehicles unmanned by 2015. It was an autonomous robotic
ground vehicle competition over a roughly 150-mile course. None of the vehicles travelled the whole length; the Red Team
of Carnegie Mellon University travelled farthest, completing 11.9 km. Hence, no team could claim the prize, as none
reached even 5% of the total distance.
In June 2004, DARPA announced the second Grand Challenge, a 132-mile (212 km) off-road course with a $2 million
prize, double the previous one. With lessons learned and improved vehicles, 23 finalists competed in October 2005. It was
a challenging run that included three tunnels, more than 100 turns and a steep pass with sharp drop-offs. The Stanford
Racing Team won the $2 million prize with a winning time of 6 hours and 53 minutes, followed by the Red Team of
Carnegie Mellon University. In total, five teams completed the course.
The Urban Challenge, the third installment in the series, was announced in May 2006 and held on November 3, 2007,
at the former George Air Force Base in Victorville, California. Building on the success of the 2004 and 2005 Grand
Challenges, this event targeted vehicles capable of driving in traffic without a human driver, maneuvering through
complex situations like parking, passing, and negotiating intersections. The event was groundbreaking as the first time
autonomous vehicles interacted with both highly automated and conventional cars in the traffic of an urban environment.
The competition was tougher this time, with a 60-mile (97 km) urban course. It was won by "Boss" of Tartan Racing
of Carnegie Mellon University at an average speed of 22.5 km/h in a complex urban environment, with driving
time limited to a total of six hours.
Reference:
https://automotiveindianews.com/milestones-development-autonomous-driving/
1- 23
Milestones of Autonomous Driving
Bertha-Benz Ride
Pforzheim – Mannheim, 2013
Intelligent drive system with
close-to-production sensors
https://www.volkswagen-newsroom.com/de/volkswagen-news-international-ces-asia-3159
https://media.daimler.com/marsMediaSite/en/instance/ko/Pioneering-achievement-Autonomous-long-distance-drive-in-rural-and-Mercedes-Benz-S-Class-INTELLIGENT-DRIVE-drives-autonomously-in-the-tracks-of-Bertha-Benz.xhtml?oid=9904223
1- 24
Milestones of Autonomous Driving
Tesla Autopilot
Tesla Model S, 2015
Highway Pilot (Level 2) based on
radar and camera perception
Adaptive Cruise Control and Lane
Change on Freeway
https://fortune.com/2015/12/17/tesla-mobileye/
https://www.youtube.com/watch?v=jdPIdNS2LUk&feature=emb_logo
https://www.theguardian.com/technology/2020/feb/25/tesla-driver-autopilot-crash
1- 25
Milestones of Autonomous Driving
[Chart: SAE level reached over the years 2015–2025 – Level 2: Model S, A8, Lexus LS, ID3, iX; Level 3: Legend, S-Klasse; Level 4: Waymo One; adaptation of the legal framework in progress]
1- 27
Additional Slides
Today, Level 2 systems are not only standard in premium vehicles, but are also used in lower vehicle classes. Common
systems are Lane Keeping Assistance (LKA) and Adaptive Cruise Control (ACC) on highways. The driver is allowed to take
their hands off the wheel for short periods, but has to supervise the car in every situation. For Level 3 and Level 4 systems, an
adaptation of the legal framework is necessary in many countries. Furthermore, these systems result in higher costs for the
customer and more liability cases for the OEM. Therefore, the unofficial "Level 2+" emerged, which exceeds the functionality of
typical Level 2 systems but still requires the supervision of the human driver. OEMs use this level to point out the enhanced
robustness of their Level 2 systems or to refer to advanced functionalities such as Lane Change Assistant or cloud-based services.
The progression from Level 3 to Level 4 is not a steady one. Classic rule-based ADAS functions reach their limits with Level 3
requirements. Linear "if-then" conditions would need to consider every possible use case or combination of use cases in any given
traffic situation, which is virtually impossible in urban environments (Levels 4 and 5). Apart from confined spaces such as highways,
traffic situations are highly dynamic and complex. For this reason, self-learning systems based on artificial intelligence (AI) that
mimic human decision-making processes are critical for meeting the demand for complex scene interpretation, behavior
prediction and trajectory planning. AI is becoming a key technology in all areas along the automotive value chain and is
paramount for the success of Level 4+ AD systems.
The Tesla Autopilot, which was introduced with the Model S in 2015, was among the first Level 2 systems. The Audi A8 is capable
of driving at Level 3 up to 60 km/h, but due to legal aspects the system was not offered to customers. BMW announced a Level 3
system for the iX, but withdrew the announcement because of technical and legal aspects. Honda received the type designation for
Level 3 in Japan and plans to introduce the "Traffic Jam Pilot" in the Honda Legend in the first half of 2021. Mercedes-Benz
developed a Level 3 system, which is already in the certification process with the KBA. Waymo started a Level 4 robotaxi service
in the greater area of Phoenix, Arizona, though it has to be mentioned that, e.g., the local weather conditions lower the algorithmic
requirements. Many tech companies such as Zoox (Amazon), Apple and Baidu already test or develop autonomous vehicle
software, but as of today it is unclear at which level of autonomy, and when, their vehicles will be offered to the public.
References:
https://www.autonews.com/cars-concepts/audi-quits-bid-give-a8-level-3-autonomy
https://europe.autonews.com/automakers/lexus-prepares-introduce-level-2-autonomy
https://jesmb.de/4449/
https://global.honda/newsroom/news/2020/4201111eng.html
https://www.automobilwoche.de/article/20201009/AGENTURMELDUNGEN/310089895/ohne-sicherheitsfahrer-waymo-oeffnet-robotaxi-dienst-fuer-mehr-nutzer
1- 28
Introduction
Prof. Dr. Markus Lienkamp
AUTOSAR layered software architecture (top to bottom):
Application Layer
Runtime Environment (middleware)
Basic Software: Services Layer, ECU Abstraction Layer, Microcontroller Abstraction Layer, Complex Drivers
Microcontroller
Adapted from: https://www.autosar.org/fileadmin/user_upload/standards/classic/4-3/AUTOSAR_EXP_LayeredSoftwareArchitecture.pdf
1- 30
Additional Slides
AUTOSAR, the Automotive Open System Architecture, was initiated in 2003 to create a common standardized
software architecture for designing automotive electronic control units (ECUs). The AUTOSAR architecture is based
on a 3-layered architecture model, developed jointly by the stakeholders of the automotive industry, including
the automobile manufacturers, the suppliers, and the tool developers.
Application Layer:
This layer contains the software components with their functional code. The functional code is developed
independently of the vehicle bus and the hardware used. An exception are the software components for
sensors and actuators, which depend on the sensor and actuator concept.
1- 31
Perception
X = Lectures
1- 32
Perception I: Mapping & Localization
Vehicle ego state: IMU, wheel encoders, GPS (distance to satellites)
Perceive environment: camera, LiDAR, radar
Map representations
Depending on the target algorithm, different map representations are preferable.
1- 33
Perception II: Mapping & Localization
SLAM
Simultaneous Localization and Mapping is the central concept to solve the
dual problem of ego-localization and map generation. The most common
SLAM algorithms build upon a Kalman Filter, a Particle Filter and a graph-
based approach.
[Diagram: sensor data → SLAM front-end (feature extraction, data association) → SLAM back-end (map estimation, optimization) → estimate]
1- 34
Perception III: Detection
Detection tasks
Based on environmental sensors, the three main tasks are road detection, traffic light and sign detection, and object detection.
Sensor-dependent algorithms
The state of the art of detection algorithms for camera, LiDAR, and RADAR is presented, all of which are based on deep learning.
Sensor fusion
Methods to fuse multiple sensors to enhance the overall detection performance are presented.
1- 35
Lecture Overview
[Diagram: software pipeline from Sensors through Perception (2 Mapping, 3 Localization, 4 Detection) and 5 Prediction to Actuators; numbers = lectures]
1- 36
Prediction
Intention estimation
Motion prediction estimates the intention and future positions of dynamic objects and quantifies the associated uncertainty.
[Diagram: ego vehicle among other traffic participants; challenges: interaction, stochastic processes, incomplete object detection, latent space]
Prediction methods
Prediction algorithms can be divided into different classes of methods.
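The simplest class of prediction models is purely physics-based; a minimal sketch of constant-velocity prediction (purely illustrative; real predictors also model intention, interaction and uncertainty):

```python
# Constant-velocity prediction: the simplest motion model for estimating
# the future positions of a dynamic object (illustrative sketch).

def predict_positions(x, y, vx, vy, dt, steps):
    """Propagate an object's position assuming constant velocity."""
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

# A vehicle at (0, 0) moving at 10 m/s forward, predicted 3 s ahead at 1 Hz.
horizon = predict_positions(0.0, 0.0, 10.0, 0.0, dt=1.0, steps=3)
print(horizon)  # [(10.0, 0.0), (20.0, 0.0), (30.0, 0.0)]
```

Such a model degrades quickly at intersections, which is exactly where intention estimation and interaction-aware methods become necessary.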
1- 37
Lecture Overview
[Diagram: pipeline extended with Planning (Global, 6 Behavior, 7 Local); numbers = lectures]
1- 38
Planning I: Global Planning
Navigation Task
In hierarchical planning approaches,
global planning contains the navigation
task, i.e. the route to go from A to B.
Behavioral Planning
The idea of a state machine to switch
between discrete behavior models
enables situation-dependent planning.
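The state-machine idea can be sketched in a few lines; the states and events below are hypothetical, chosen only to illustrate situation-dependent switching between discrete behaviors:

```python
# A toy behavioral state machine: discrete driving behaviors with
# situation-dependent transitions (hypothetical states and events).

TRANSITIONS = {
    ("LANE_FOLLOW", "slow_leader"): "LANE_CHANGE",
    ("LANE_FOLLOW", "red_light"): "STOP",
    ("LANE_CHANGE", "lane_change_done"): "LANE_FOLLOW",
    ("STOP", "green_light"): "LANE_FOLLOW",
}

def step(state, event):
    """Return the next behavior, staying in the current one if no rule fires."""
    return TRANSITIONS.get((state, event), state)

state = "LANE_FOLLOW"
state = step(state, "red_light")    # -> STOP
state = step(state, "slow_leader")  # no rule while stopped -> stays STOP
state = step(state, "green_light")  # -> LANE_FOLLOW
print(state)  # LANE_FOLLOW
```

Each discrete behavior then parameterizes the local planner, which produces the actual trajectory.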
1- 39
Planning II: Local Planning
Sensor-based planning
With the input of the global route, the
local planning module determines a
kinematically feasible, collision-free
trajectory.
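One elementary building block of such a local planner is collision checking of candidate trajectories; the sketch below uses circular obstacles and made-up numbers:

```python
# Collision checking of candidate trajectories: a core step of local
# planning. Trajectory points are 2D; obstacles are circles (simplified).

import math

def collision_free(trajectory, obstacles, safety_margin=0.5):
    """Reject a trajectory if any point enters an obstacle's inflated radius."""
    for px, py in trajectory:
        for ox, oy, r in obstacles:
            if math.hypot(px - ox, py - oy) < r + safety_margin:
                return False
    return True

straight = [(float(x), 0.0) for x in range(10)]
swerve   = [(float(x), 0.4 * x) for x in range(10)]
obstacles = [(5.0, 0.0, 1.0)]  # parked car at (5, 0), radius 1 m

print(collision_free(straight, obstacles))  # False: drives through obstacle
print(collision_free(swerve, obstacles))    # True: lateral offset clears it
```

A real planner additionally checks kinematic feasibility (curvature, acceleration limits) before selecting among the collision-free candidates.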
1- 40
Lecture Overview
[Diagram: pipeline extended with 8 Control; numbers = lectures]
1- 41
Motion Control
[Figure: circle with radius]
Control Task
Based on a planned trajectory, the motion control module determines the required command signals to convert this
trajectory into vehicle motion and to handle external disturbances.
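One classic geometric tracking controller that matches the circle-with-radius picture is pure pursuit: steer along the circular arc through a lookahead point on the trajectory. This is a common textbook approach, not necessarily the controller used in the lecture:

```python
# Pure pursuit: a classic geometric path-tracking controller. The steering
# angle follows from the circular arc connecting the rear axle to a
# lookahead point on the planned trajectory (bicycle model).

import math

def pure_pursuit_steering(lookahead_x, lookahead_y, wheelbase):
    """Steering angle for a lookahead point given in the vehicle frame
    (x forward, y left)."""
    ld2 = lookahead_x**2 + lookahead_y**2   # squared lookahead distance
    curvature = 2.0 * lookahead_y / ld2     # 1/R of the connecting arc
    return math.atan(wheelbase * curvature)

# Point straight ahead -> no steering; point to the left -> steer left.
print(round(pure_pursuit_steering(10.0, 0.0, wheelbase=2.7), 3))  # 0.0
print(pure_pursuit_steering(10.0, 2.0, wheelbase=2.7) > 0)        # True
```

External disturbances (wind, road grade) are handled on top of this geometric law, e.g. by feedback on the remaining tracking error.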
1- 42
Lecture Overview
[Diagram: pipeline extended with 9 Safety; numbers = lectures]
1- 44
Safety Assessment
User Acceptance
Automotive safety and security requirement testing
Implementation
1- 45
Lecture Overview
[Diagram: pipeline extended with 10 Teleoperated Driving; numbers = lectures]
1- 46
Teleoperated Driving
Remote Control
Remote control of the automated vehicle in traffic situations which are too complex or lie outside the
operational design domain (ODD) is the task of Teleoperated Driving.
1- 47
Lecture Overview
[Diagram: pipeline extended with 11 End-to-End; numbers = lectures]
1- 48
End-to-End (E2E) and Combined Modules
Imitation Learning
Behavior cloning and (inverse) reinforcement learning are methods to realize integral software approaches.
1- 49
Lecture Overview
[Diagram: pipeline extended with 12 Human-Machine-Interface; numbers = lectures]
1- 50
From Driver to Passenger
Human Factors
The shift from driver to passenger
requires a re-definition of HMI1
within as well as between vehicle
and environment to ensure safety,
comfort and confidence during
autonomous rides.
Customer Needs
New customer needs arise, which are a crucial aspect in reaching acceptance for the new technology
on the consumer market.
1 Human-Machine-Interface
1- 51
Lecture Overview
[Diagram: complete pipeline: Sensors → Perception (2 Mapping, 3 Localization, 4 Detection) → 5 Prediction → Planning (Global, 6 Behavior, 7 Local) → 8 Control → Actuators; plus 9 Safety, 10 Teleoperated Driving, 11 End-to-End, 12 Human-Machine-Interface; numbers = lectures]
1- 52
Introduction
Prof. Dr. Markus Lienkamp
https://www.bosch-mobility-solutions.com/en/
1- 54
Sensor Types
https://www.bosch-mobility-solutions.com/en/
https://www.car-bock.de/ABS-sensor-FA-VW-Golf4-left
1- 55
Inertial Measurement Unit (IMU)
▪ 6 Degrees of Freedom (DOF)
▪ 3-axis linear accelerometer (x, y, z): measurement based on capacitive change of micromechanical structures due to mechanical forces (F = m a)
▪ 3-axis rate gyroscope (φ, θ, ψ): measurement based on the Coriolis principle, i.e. the inertial force of oscillations in a rotating system
▪ High accuracy and high sampling rate
▪ "Sensor drift": accumulating position error
https://vrtracker.xyz/handling-imu-drift/
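The quoted sensor drift can be made concrete: a small constant accelerometer bias, double-integrated into position, yields an error that grows quadratically with time (illustrative numbers):

```python
# IMU "sensor drift": a constant accelerometer bias, integrated twice,
# produces a position error that grows quadratically with time.

def drift_position_error(bias, dt, steps):
    """Double-integrate a pure bias (no true motion) into position error."""
    v, x = 0.0, 0.0
    for _ in range(steps):
        v += bias * dt  # velocity error accumulates linearly
        x += v * dt     # position error accumulates quadratically
    return x

# A tiny 0.01 m/s^2 bias sampled at 100 Hz:
e_10s  = drift_position_error(0.01, dt=0.01, steps=1000)   # after 10 s
e_100s = drift_position_error(0.01, dt=0.01, steps=10000)  # after 100 s
print(e_100s / e_10s > 90)  # True: ~100x more error for 10x more time
```

This is why an IMU alone cannot localize a vehicle for long and is always fused with absolute references such as GNSS or map-based localization.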
https://www.bosch-mobility-solutions.com/
1- 56
Global Navigation Satellite System (GNSS)
Position is calculated from the time-of-flight of satellite signals (distances to satellites at known positions).
Additional geo-referenced signals allow correction of the time-of-flight position calculation.
https://www.uavnavigation.com/support/kb/general/general-system-info/global-navigation-satellite-system-gnss
1- 57
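Flattened to 2D with ranges to two known anchors, the time-of-flight principle reduces to intersecting circles (a sketch only; a real GNSS fix solves 3D position plus the receiver clock bias from at least four satellites):

```python
# GNSS position fix, flattened to 2D: the receiver position follows from
# ranges to anchors at known positions (illustrative geometry sketch).

import math

def trilaterate_2d(p1, r1, p2, r2):
    """Intersect two circles; return the intersection with larger y."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (r1**2 - r2**2 + d**2) / (2 * d)      # distance along the base line
    h = math.sqrt(r1**2 - a**2)               # offset from the base line
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ex, ey = (x2 - x1) / d, (y2 - y1) / d     # unit vector p1 -> p2
    c1 = (mx - h * ey, my + h * ex)
    c2 = (mx + h * ey, my - h * ex)
    return max(c1, c2, key=lambda p: p[1])

# True position (3, 4); anchors at (0, 0) and (6, 0), both 5 m away.
pos = trilaterate_2d((0.0, 0.0), 5.0, (6.0, 0.0), 5.0)
print(round(pos[0], 2), round(pos[1], 2))  # 3.0 4.0
```

The geo-referenced correction signals mentioned above effectively shrink the error of the ranges r1, r2 before this geometric step.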
Sensor Types
Perception sensors
▪ Detection of semantic information for ADAS and autonomous driving
Radar – Specifications
• Short-range (24 GHz): blind spot monitoring
• Long-range (77 GHz): distance control
• Distance measurement: 0.5 – 250 m
• Field of view:
  • Horizontal: 20° for long-range (<250 m) and 60° – 120° for mid-range (<60 m)
  • Vertical: no vertical resolution
• Resolution:
  • Distance: ~ 0.3 m
  • Relative velocity: ~ 0.1 m/s
  • Azimuth angle: ~ 1°
Benefits
High robustness against weather (rain) and ambient light
Accurate measurement of distance and velocity (Direct speed measurement)
Cheap
Most used sensor for detection today (ACC, Collision avoidance system)
Drawbacks
No measurement of lateral velocity, object size or type
Low resolution
Only for moving objects
No contrast/color
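The "direct speed measurement" comes from the Doppler effect: the radial velocity of a target follows directly from the frequency shift of the returned wave. A sketch with illustrative values for a 77 GHz radar:

```python
# Radar's direct speed measurement: relative radial velocity from the
# Doppler frequency shift of the returned wave (illustrative values).

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz):
    """v = f_d * c / (2 * f_c); the factor 2 accounts for the two-way path."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A 77 GHz long-range radar observing a ~5.1 kHz Doppler shift:
v = radial_velocity(5133.0, 77e9)
print(round(v, 1))  # 10.0 m/s closing speed
```

Because only the radial component shifts the frequency, lateral velocity remains unobservable, which is exactly the drawback listed above.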
1- 61
LiDAR
+ 3D image generation of
environment
- High costs
https://www.rfwireless-world.com/Terminology/Advantages-and-Disadvantages-of-LiDAR.html
https://www.nhregister.com/technology/businessinsider/article/Lyft-wants-to-bring-self-driving-cars-to-Boston-11199145.php
1- 62
LiDAR – Visualization
LiDAR - Specifications
• Distance measurement: 1 – 200 m (Short Range: <30m)
• Field of view:
• Horizontal: 30° for Long-Range and up to 360° for Short-Range
• Vertical: 30°
• Resolution:
• Distance: ~ 0.02 m
• Relative velocity (via tracking): ~ 0.3 m/s
• Horizontal angle: ~ 0.1°
• Vertical angle: ~ 0.8°
Benefits
Measurement of object size (width, height) and lateral velocity
3D image generation of detected objects and mapping the surroundings
Accurate depth information
Higher resolution than radar
Drawbacks
Struggles to detect black vehicles
High costs
High data rate
No contrast/color information
Weak performance in rain
1- 64
Camera
https://www.continental-automotive.com/de-DE/Passenger-Cars/Autonomous-Mobility/Enablers/Cameras/Stereo-Camera
1- 65
Camera – Depth estimation
A stereo depth map encodes the actual depth of every pixel. Red represents points that are close and green denotes those that are far away.
Only a few pixels (e.g. at the left image border or at left edges of objects) are without a depth measurement. Typically, this is due to
left-to-right occlusion or failed depth consistency checks.
Camera - Specifications
• Distance measurement: 1 – 200 m (Short Range: <30m)
• Field of view:
Horizontal: 30° - 45°, Fish-eye: 360°
• Resolution:
• No direct measurement, Depends on image resolution and algorithm
Benefits
Imitation of human perception; road traffic relies on visual perception
Detection of road lanes, traffic sign and object classification possible (Roads are designed for human eyes)
Price: mass product from consumer electronics
High resolution
Small (Package)
Drawbacks
No direct measurement of position and velocity
Data processing of images is complex → but major improvements through Deep Neural Networks
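The depth map above is computed from disparity: for a calibrated stereo rig, depth is Z = f · B / d. A minimal sketch with made-up rig parameters:

```python
# Stereo depth from disparity: a camera has no direct distance measurement
# until geometry is applied. Pinhole model: Z = f * B / d.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point from its pixel disparity between left/right images."""
    return focal_px * baseline_m / disparity_px

# Illustrative stereo rig: 1000 px focal length, 30 cm baseline.
print(depth_from_disparity(1000.0, 0.30, 20.0))  # 15.0 m
print(depth_from_disparity(1000.0, 0.30, 2.0))   # 150.0 m
```

Since far points produce only small disparities, a fixed pixel-matching error translates into large depth errors at range, which is why stereo depth degrades for distant objects.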
1- 68
Comparison
                       RADAR   LiDAR   Camera
Range                  ++      o       +
Resolution             -       +       ++
Field of view          -       ++      +
Velocity               ++      o       -
3D-Perception          -       ++      o
Object features        -       o       ++
Robustness (weather)   ++      o       -
Cost                   +       --      ++
Package                +       --      +
1- 69
Level 0 – Blind Spot Warning
Sensors: short-range RADAR, ultrasonic
1- 70
Additional Slides
Blind Spot Warning:
Two ultrasonic sensors on each side of the vehicle serve as electronic eyes and monitor the space in the adjacent
lane, allowing the system to cover the dangerous blind spot. If another vehicle is situated in the monitored area, the
driver is alerted to the potential danger by means of a warning sign in the side mirror. If the driver fails to spot or
ignores this warning and activates the turn signal to change lanes, the system can also trigger an audible warning.
The system recognizes stationary objects on or alongside the road, such as guardrails, masts or parked vehicles, as
well as the driver's own overtaking maneuvers – and does not trigger the warning in this case.
Reference:
https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/blind-spot-detection/
1- 71
Level 1 – Adaptive Cruise Control (ACC)
Sensors: long-range RADAR, ultrasonic
1- 72
Additional Slides
ACC:
A radar sensor is usually at the core of the adaptive cruise control (ACC). Installed at the front of the vehicle, the
system permanently monitors the road ahead. As long as the road ahead is clear, ACC maintains the speed set by
the driver. If the system spots a slower vehicle within its detection range, it gently reduces speed by releasing the
accelerator or actively engaging the brake control system. If the vehicle ahead speeds up or changes lanes, the ACC
automatically accelerates to the driver’s desired speed.
Standard ACC can be activated from speeds of around 30 km/h (20 mph) upwards and supports the driver, primarily
on cross-country journeys or on freeways. The ACC stop & go variant is also active at speeds below 30 km/h (20
mph). It can maintain the set distance to the preceding vehicle even at very low speeds and can decelerate to a
complete standstill. If the vehicle has automatic transmission, and the traffic hold-up is only brief, ACC stop & go can
set the vehicle in motion once again. When the vehicle remains stopped longer, the driver needs only to reactivate the
system, for example by briefly stepping on the gas pedal to return to ACC mode. In this way, ACC stop & go supports
the driver even in heavy traffic and traffic jams.
Since ACC is a comfort and convenience system, brake interventions and vehicle acceleration only take place within
defined limits. Even with ACC switched on, it remains the driver’s responsibility to monitor the speed and distance
from the vehicle in front.
To increase comfort and safety of this function, a multi purpose camera can be installed in addition to the radar
sensor. By this, for instance, ACC can, thanks to the lateral measuring accuracy of the multi purpose camera, detect a
vehicle entering the driver’s own lane – either planned or unplanned – much earlier, enabling the system to respond
more dynamically. For a better and more robust understanding of the scene, data of the radar sensor and the camera
can be merged.
Reference:
https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/adaptive-cruise-control/
1- 73
Level 2 – ACC and Lane Keep Assist (LKA)
Sensors: mono camera, long-range RADAR, ultrasonic
1- 74
Additional Slides
LKA:
Lane keeping assist uses a video camera to detect the lane markings ahead of the vehicle and to monitor the
vehicle's position in its lane. If the vehicle’s distance to the lane markings falls below a defined minimum, the system
steps in. In vehicles with electric power steering, it gently, but noticeably countersteers in order to keep the vehicle in
the lane. In vehicles without electric power steering, it achieves the same effect by utilizing the electronic stability
program (ESP) to brake individual wheels.
Drivers can override the function at all times, so they retain control of the vehicle. If they activate the turn signal in
order to intentionally change lanes or turn, the system does not intervene.
Reference:
https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/lane-keeping-assist/
1- 75
Level 3 – 5
Sensors: LiDAR, stereo camera, long-range RADAR, ultrasonic
1- 76
Additional Slides
More details on the introduced sensors are given in the lecture "Advanced Driver Assistant Systems in Vehicles", also
offered by the Institute of Automotive Technology, TUM.
https://www.mw.tum.de/en/ftm/teaching/courses/advanced-driver-assistant-systems-in-vehicles/
References
https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driving-safety-systems/electronic-stability-program/inertial-sensor/
https://www.car-bock.de/ABS-sensor-FA-VW-Golf4-left
https://vrtracker.xyz/handling-imu-drift/
https://www.flaticon.com/
https://jumbonews.co.uk/news/1850510/global-automotive-ultrasonic-radar-market-2020-industry-scenariodevelopment-analysis-strategies-growth-factors-and-forecast-to-2025/
https://www.japanautomotivedaily.com/2017/05/25/ricoh-denso-develop-worlds-smallest-adas-stereo-camera/
https://velodynelidar.com/products/ultra-puck/
https://www.continental-automotive.com/en-gl/Passenger-Cars/Autonomous-Mobility/Enablers/Radars
1- 77
Challenges of Detection
Challenges
▪ Conditions of illumination
▪ Weather conditions
▪ Static obstacles
▪ Reflection
1- 78
Sensors – Safety Concepts
Diversity
▪ Definition: implementation of components with physically or technically
distinct operating principles.
▪ Every sensor has its benefits.
▪ Different types of sensors are applied to enhance robustness against
weather conditions and to obtain more information.
→ Common concept for perception sensors
Redundancy
▪ Definition: The duplication of critical components or functions of a system.
▪ Faulty detection should be minimized.
▪ Overlapping sensors of the same or different type to verify measurements.
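A minimal sketch of redundancy in action: with three overlapping range sensors, a median vote rejects a single faulty reading (hypothetical values):

```python
# Redundancy via voting: overlapping sensors verify each other. With three
# redundant measurements, the median is robust to one faulty outlier.

def fused_measurement(readings):
    """Median-vote a list of redundant range readings (meters)."""
    s = sorted(readings)
    return s[len(s) // 2]

# Two sensors agree on ~25 m; the third fails and reports 0.3 m.
print(fused_measurement([25.1, 0.3, 24.9]))  # 24.9: the outlier is outvoted
```

Diversity complements this scheme: if the redundant sensors share a physical principle, a common-cause fault (e.g. heavy rain blinding all LiDARs) can still defeat the vote.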
1- 79
Actuators for Autonomous Driving
[Diagram: control loop: Software (controller) → Actuation → Vehicle → Measurement → back to Software]
1- 80
Additional Slides
Most vehicles on the road today already have all of the actuators needed to control the vehicle for automation. Up to
Level 4, these actuators are selectable by human input or by control input from software. Hence, they are designed for
manual driving as well and are already part of ADAS systems. The most common actuators are electric motors. This
group of actuators is called "X-by-Wire" when there is no mechanical connection between control input and
component. However, to ensure reliability, systems like electric power steering still have a mechanical
connection, in contrast to pure "Steer-by-Wire" systems.
Throttle actuation is achieved in most modern vehicles via electronic control between the pedal and the drive train
(Drive-by-Wire). Electric assisted power steering has become the norm, already displacing hydraulically driven power
steering systems throughout the industry. Finally, brake actuation is achieved by way of electronic stability control,
which has been required by law in the European Union since 2014 on all new passenger vehicles under 3,500 kg.
Reference:
https://www.robsonforensic.com/articles/autonomous-vehicles-sensors-expert/
1- 81
Actuators – Constraint: Safety
Fail safe
Property: Causing a device to revert to a
safe condition in the event of a
breakdown or malfunction.
Reliability
The system must achieve at least the same reliability as conventional
systems with a human driver.
Concepts
Redundancy
Diversity
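The fail-safe property can be sketched in software as a command watchdog; the interface and the deadline below are assumptions, not a production design:

```python
import time

class FailSafeActuator:
    """Reverts to a safe command when the controller misses its deadline."""

    def __init__(self, timeout_s=0.1, safe_command=0.0):
        self.timeout_s = timeout_s          # assumed watchdog deadline
        self.safe_command = safe_command    # e.g. zero steering torque
        self._last_command = safe_command
        self._last_update = time.monotonic()

    def update(self, command):
        self._last_command = command
        self._last_update = time.monotonic()

    def output(self):
        # Watchdog: fall back to the safe state on a stale command.
        if time.monotonic() - self._last_update > self.timeout_s:
            return self.safe_command
        return self._last_command

act = FailSafeActuator(timeout_s=0.05)
act.update(0.3)          # hypothetical controller command
fresh = act.output()     # command is still fresh
time.sleep(0.06)
stale = act.output()     # deadline missed, reverts to the safe state
```

Production systems implement this in hardware and redundant ECUs; the sketch only shows the principle of reverting to a safe condition on malfunction.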
1- 82
Actuators – Constraint: Control
[Diagram: actuator dynamics: control variable over time, input signal u vs. delayed output y]

Levels of the driving task:
✓ Navigation: road network → itinerary (time horizon: h – min)
✗ Command: driving space with road and traffic → vehicle trajectory (time horizon: min – s)
✗ Stabilisation: longitudinal and lateral dynamics, road surface (time horizon: s – ms)
Adapted from: Donges, E.: Aspekte der Aktiven Sicherheit bei der Führung von Personenkraftwagen. Automobil‐Industrie 27, 183–190 (1982) 1- 85
Complexity of Road Traffic
Stochastic Behavior
The intentions of other traffic participants are unknown and non-deterministic.

Unlimited ODD1
In Level 5, the number of possible scenarios is unlimited.

[Chart: software complexity in million lines of code: Space Shuttle 0.4, Boeing 787 Dreamliner 14, Level 1-2 100, Level 5 off the chart]
[Bar chart: km per disengagement, testing environment California: 16,830; 10,411; 2,604; 426; 231; 158; 66; 41; 39; 31; 4; 1]
https://www.dmv.ca.gov/portal/vehicle-industry-services/autonomous-vehicles/disengagement-reports/ 1- 87
Benchmark in passenger safety
[Bar chart: km per disengagement by company, reporting years 2019 and 2020: leading values 47,912 and 45,633; remaining values 16,830; 10,411; 2,604; 426; 231; 158; 66; 41; 39; 31; 4; 1]
https://www.dmv.ca.gov/portal/vehicle-industry-services/autonomous-vehicles/disengagement-reports/ 1- 88
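The metric plotted in these DMV charts is simply autonomous kilometres divided by the number of disengagements; a sketch with made-up fleet numbers:

```python
def km_per_disengagement(total_km, disengagements):
    """California DMV benchmark: autonomous kilometres driven per takeover."""
    if disengagements == 0:
        return float("inf")   # no takeover recorded in the reporting period
    return total_km / disengagements

# Hypothetical fleet: 1.2 million autonomous km, 25 disengagements
rate = km_per_disengagement(1_200_000, 25)
```

Note that the metric is only a rough proxy for safety, since companies test in very different operational design domains and apply different disengagement policies.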
Additional Slides
https://www.dmv.ca.gov/portal/vehicle-industry-services/autonomous-vehicles/disengagement-reports/ 1- 89
Friction Estimation
Degradation of components
Tires and other components degrade in performance over their lifetime,
which is another reason for a decreasing friction value.
https://www.abs-autoservice.de/presse-aktuelles/maengel-bereifung-und-bremsen-sind-
haeufig-ursache-fuer-unfaelle-mit-personenschaeden/
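As a hedged illustration (not on the slide), a first-order bound on the tire-road friction coefficient follows from the achieved braking deceleration of a point-mass model on a level road:

```python
G = 9.81  # gravitational acceleration in m/s^2

def friction_utilization(decel_mps2):
    """Lower bound for the tire-road friction coefficient mu.

    During full braking on a level road, a point-mass model gives
    mu >= a_x / g, so the achieved deceleration bounds mu from below.
    """
    return decel_mps2 / G

mu_dry = friction_utilization(8.8)   # strong braking, dry asphalt (assumed)
mu_wet = friction_utilization(4.4)   # reduced deceleration, wet road (assumed)
```

Real friction estimators fuse tire-slip, drivetrain and weather information; the point here is only that degraded components lower the achievable deceleration and hence the usable friction.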
1- 90
Validation
Explainable AI
▪ Do AI algorithms cause unhedgeable
risks?
▪ Explainability of AI is a major trend
that targets this issue.
1- 91
Additional Slides
• Average distance driven per test vehicle: 66,000 km per year.
• So, roughly 100,000 test vehicles would be required for a single OEM to validate one software release per year.
Additionally, 2-3 test drivers per vehicle are required. Total cost: ~10^9 €.
• And this was just one software release; the effort would recur after every software update.
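The fleet-size estimate above can be reproduced in two lines; the total validation mileage of 6.6 billion km is an assumption implied by the slide's figures (100,000 vehicles at 66,000 km each):

```python
km_per_vehicle_year = 66_000      # average test mileage per vehicle (slide)
required_km = 6.6e9               # assumed validation mileage for one release

vehicles_needed = required_km / km_per_vehicle_year
drivers_needed = vehicles_needed * 2.5     # 2-3 test drivers per vehicle
```

This order of magnitude is why purely mileage-based validation is considered infeasible and why simulation and scenario-based testing are pursued instead.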
1- 92
Additional Slides
Cyber Security
To ensure a comprehensive cybersecurity environment, a multi-layered approach is required that leverages existing
cybersecurity frameworks and encourages the industry to adopt best practices that improve the security posture of its
vehicles.
• A risk-based prioritized identification and protection process for critical vehicle systems
• Timely detection and rapid response to potential vehicle cybersecurity incidents.
• Architectures, methods, and measures that design-in cybersecurity and cyber resiliency, facilitating rapid recovery
from incidents when they occur.
• Methods for effective intelligence and information sharing across the industry to facilitate quick adoption of
industry-wide lessons learned.
• Creation of standards that articulate best practices.
From: https://cyberstartupobservatory.com/cybersecurity-connected-autonomous-vehicles/
1- 93
Social Acceptance
https://www.statista.com/aboutus/our-research-commitment
1- 94
Customer Value
Motion Sickness
A high level of motion comfort is
required to enable reading, sleeping
or working during the ride, in addition
to a suitable road surface.
Driving Pleasure
The driving pleasure of manual driving
counteracts the demand for highly
automated passenger cars.
https://www.health.harvard.edu/diseases-and-conditions/coping-with-motion-sickness
https://www.bmw-m.com/de/topics/magazine-article-pool/m-donuts-zum-donut-day.html
1- 95
Legal Aspects
Liability law
▪ Currently, three-pillar liability system
between drivers, owners and
manufacturers
▪ New division of liability for Level 4-5
1 Straßenverkehrs-Ordnung (German Road Traffic Regulations)
2 Straßenverkehrs-Gesetz (German Road Traffic Act) 1- 96
Additional Slides
The system is allowed to drive, but a person in the vehicle is required for supervision. The use case is not limited to
specific scenarios, but the location of the application is limited. Exemplary applications: shuttle transports,
people movers, hub-to-hub traffic, transport services in rural areas, automated valet parking (e.g. via smartphone).
1- 97
Ethical Aspects
Rational algorithm
▪ The algorithm acts rationally and
follows its programmed logic.
▪ How should the algorithm be
programmed for situations with
inevitable accidents?
Ethical dilemma
Opposing imperatives of:
▪ No weighting of human lives is
acceptable.
▪ The imperative of damage minimization.
1- 98
Business Model
https://www.ticketsnipers.com/ticket/cvc-21456-pedestrian-crosswalk-violation 1- 100
https://medium.com/rahuls-personal-blog/the-great-indian-road-traffic-4a1a5571661e
Our Approach
Unstructured Environment (Limited ODD1, Level 4):
▪ Vehicle application
▪ Evaluation of applicability
▪ Validation of edge cases

Structured Environment (Unlimited ODD1, Level 5):
▪ Addition of road rules
▪ Extension of generic algorithms
Mega Trends – CASE
Connected Cars, Autonomous Driving, Shared Mobility, Electric Powertrain.

Levels of Automation
The SAE and BASt classifications, two important concepts spanning Level 0 to Level 5: from full driver responsibility
to full autonomy with no driver intervention required; takeover-readiness shifts from the human driver to the vehicle.

Milestones
A brief overview of the history of
driverless cars and the status quo is
given.
1- 103
Summary – What did we learn today
[Lecture overview diagram: numbered chapters including Detection, Behavior, End-to-End, Human-Machine-Interface and Open Challenges; lines of code in modern vehicles: 500+ million]
1- 104
Literature
J. Betz et al., “A Software Architecture for an Autonomous Racecar,” in 2019 IEEE 89th
Vehicular Technology Conference (VTC2019-Spring), 2019, pp. 1–6.
J. Betz et al., “A Software Architecture for the Dynamic Path Planning of an
Autonomous Racecar at the Limits of Handling,” in 2019 IEEE International Conference
on Connected Vehicles and Expo (ICCVE), 2019, pp. 1–8.
S. Pendleton et al., “Perception, Planning, Control, and Coordination for Autonomous
Vehicles,” Machines, vol. 5, no. 1, p. 6, 2017, doi: 10.3390/machines5010006.
M. Maurer, B. Lenz, H. Winner, and J. C. Gerdes, Eds., Autonomous Driving: Technical,
Legal and Social Aspects. Springer, 2016.
D. Watzenig and M. Horn, Eds., Automated Driving. Springer International Publishing,
2017.
A. Faisal, T. Yigitcanlar, M. Kamruzzaman, and G. Currie, “Understanding autonomous
vehicles: A systematic literature review on capability, impact, planning and policy,”
JTLU, vol. 12, no. 1, 2019, doi: 10.5198/jtlu.2019.1405.
1- 105