Advanced Vehicle Control and Safety

Chuck Thorpe
Carnegie Mellon University
1. The Problem
• Vehicles and highways have greatly improved safety: total
fatalities are down approximately 30% over the past 35
years
• Even with those improvements, there are still
approximately 40,000 fatalities / year in the US
• People haven’t improved: in 90% of all accidents, the
driver is a contributing cause
The Solution
• The Intelligent Vehicle Initiative (IVI) is a USDOT
program to use advanced electronics to improve vehicles,
with the dominant concern being safety.
• This tutorial is arranged around a series of advanced
functions, such as vehicle detection, that contribute to safer
and more intelligent vehicles. For each function, the
tutorial discusses a set of possible technologies.
• The next set of slides shows the “user services” for the IVI
advanced vehicle control and safety systems. The
following charts show which technology functions support
each user service.
• Note the synergy: each technical function supports many
user services.
IVI User Services categories:
• Safety (directly contributing to vehicle safety):
– rear end collision warning
– roadway departure warning
– lane change / merge collision warning
– intersection collision warning
– railroad crossing collision warning
– vision enhancement
– location-specific warnings
– collision notification
• Safety Impacting (potential to distract or aid the driver):
– navigation and routing
– real-time traffic information
– driver comfort and convenience features
More Services
• Commercial Vehicle Services:
– vehicle stability
– vehicle diagnostics
– driver condition monitoring
– cargo identification
– automated transactions
– safety recorder
• Transit:
– obstacle and pedestrian detection
– precision docking
– passenger monitoring
– passenger information
More Services
• Specialty Vehicles:
– full automation
• Supporting Services:
– low friction warning
– longitudinal control
– lateral control
Technical functions
• There is a set of common vehicle functions that underlie
those user services:
– sensing the position of other vehicles
– sensing obstacles
– sensing the position of the lane relative to your own vehicle
– sensing vehicle position and motion
– estimating braking performance
– communication
– reliability
– miscellaneous functions
– sensor-friendly vehicles and roadways
• The rest of this section shows how each of these functions
supports the various user services.
[Charts: Safety 1, Safety 2, Safety Impacting, Commercial Vehicle, Transit, Specialty, and Supporting. Each chart is a matrix marking which technical functions (other vehicles, obstacles, lane position, control, position + motion, braking, communications, reliability, miscellaneous, clutter) support the user services in that category: rear-end, roadway departure, lane change, intersection, and railroad crossing collision warning; vision enhancement, location-specific warnings, collision notification, smart restraints; navigation/routing, real-time traffic, driver comfort; stability, driver condition, vehicle diagnostics, cargo ID, automated transactions, safety recorder; obstacle/pedestrian detection, precision docking, passenger monitoring, passenger information; full automation; low friction warning, longitudinal control, lateral control.]
Section 1 Questions:
• How many accidents occurred in the most recent year for
which statistics are available? Hint: http://www.ohs.fhwa.dot.gov/info/saffacts.html and
http://www.census.gov/statab/www/
• How many fatalities?
• What was the dollar cost of those accidents?
• What kind of economic justification is there for the various
AVCSS services?
• Are there other on-vehicle functions that would be useful?
2. Sensing Other Vehicles
• Other vehicles need to be sensed in front for
adaptive cruise control and forward collision
warning; on the sides, for blind spot and lane
change / merge warning; and behind, for backup
warning and for lane change / merge warning of
overtaking vehicles.
• Sensing has to work in all weather, and at a variety
of ranges
2.1 Basic Geometry
Sensing straight ahead is
not sufficient; on a
curving road, a forward-looking sensor needs to
have a wide field of
view, and sensed vehicle
position needs to be
combined with road
geometry to know
whether the lead vehicle
is in your lane, another
lane, or on the shoulder.
2.2 Targets and Clutter
• Other objects in the field of view can include roadside
signs, parked cars, overpasses, guard rails, etc; this is
referred to in the radar literature as “clutter”.
• Adaptive Cruise Control (ACC) systems, which are only
concerned with moving vehicles, can reject any stopped
object as clutter.
• Rear-end collision warning systems need to sense stopped
vehicles, and so need high-acuity sensing of vehicles and
lanes in order to separate targets (other vehicles) from
clutter.
2.3 Radar
• Radar is an excellent choice for seeing big metal objects through fog,
snow, or light rain
• The currently approved frequency is 77 GHz. Radar works at the speed
of light, so sensing is almost instantaneous.
• A simple radar measures a single spot, with no information on bearing angle.
More sophisticated versions sweep the beam mechanically, or use two
or more beams and various processing schemes to measure bearing and
range.
• Typical resolution (closest objects that can be distinguished) is 1 meter
in range, 3 degrees in bearing.
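As a quick check on “sensing is almost instantaneous,” here is a minimal sketch (not from the slides) of the radar round-trip time at the speed of light:

```python
SPEED_OF_LIGHT_M_S = 3.0e8  # m/s, the rule-of-thumb value quoted in the Section 2 questions

def radar_round_trip_time_s(range_m: float) -> float:
    """Time for a radar pulse to travel out to a target and back."""
    return 2.0 * range_m / SPEED_OF_LIGHT_M_S

# A target 150 m away means a 300 m round trip: about one microsecond.
print(radar_round_trip_time_s(150.0))  # -> 1e-06 seconds
```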
Radar Data
Data from a scanning radar.
Top image is video of the
scene, bottom is radar data,
with corresponding locations
marked. The radar data is range
(horizontal) and bearing angle
(vertical; up is left, down is
right). Brightness indicates
strength of return. Car A is
close and at the center of the radar
return (the video image does
not extend as far to the right as
the radar); B is further and left;
C is further yet and is barely
visible above the roof of A; D
is much further and has a
bearing between A and B.
2.4 Ladar
• Ladar, lidar, and laser rangefinder are all synonyms. They refer to
measuring distance using the travel time of a laser beam. The laser can
be scanned over the scene with mirrors to produce a “range image”.
• Lasers can be focused to very small spots (fractions of a degree), so
they have much better resolution than radar. Instead of sensing a blob
with radar, a ladar can make many measurements as it scans, and can
measure fine details of shape.
• Since ladar is near visible light, it is blocked by the same kinds of
effects that impede human vision: fog, snow, and heavy rain will block
the signals.
Ladar Data
• The figures on the next page show data from a high-resolution
scanning laser rangefinder. Each picture is 480,000 pixels (points),
each corresponding to a separate ladar measurement.
• The top picture shows the reflectance data: this is the amount of laser
energy returned from that point in the scene, and is roughly equivalent
to a flash photo.
• The lower picture shows range data. Brightness encodes range: points
that are further away are displayed more brightly.
• Note the fine details of shape and appearance visible in this data. It is
possible to build a computer program that can identify which objects
are cars, and which direction they are facing; this can give early
warning of which vehicles may be on a collision course.
Ladar Data
2.5 Sonar
• Sonar works by measuring the time of flight of sound.
• Sound travels (relatively) slowly through air and is hard to
focus, so sonar is only useful for detecting objects at
ranges of a few meters or less.
• Sonars are inexpensive, and work in most weather
conditions. The initial mass market application was in
Polaroid auto-focus cameras.
• Sonars are commercially available for blind spot sensors
and back-up warning sensors.
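For contrast with radar, a minimal sketch (assuming a nominal speed of sound in air; not from the slides) of the sonar round-trip time, which is part of why sonar is limited to short ranges and modest update rates:

```python
SPEED_OF_SOUND_M_S = 343.0  # m/s in air at roughly 20 C (assumed value)

def sonar_round_trip_time_s(range_m: float) -> float:
    """Time for a sonar ping to reach an object and echo back."""
    return 2.0 * range_m / SPEED_OF_SOUND_M_S

# An object 5 m away takes about 29 ms to ping, versus roughly one
# microsecond for the 150 m radar round trip computed earlier.
print(sonar_round_trip_time_s(5.0))
```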
Side and Rear Sensors
This bus is equipped with rear and side sensors (sonars and a radar) for blind spot coverage.
2.6 Communications
• If all vehicles on a roadway are equipped with ITS features, inter-vehicle communications can be used to determine relative positions.
• Each vehicle can broadcast its current location, derived from GPS or
other positioning systems.
• Vehicles can also broadcast other information, such as speeds, intent to
change lanes, or onset of emergency braking. This is crucial in
decreasing inter-vehicle spacing to increase throughput, while
maintaining safety.
• This kind of scheme is most appropriate for high-end IVI systems,
such as automated highways.
• The picture on the next page shows a “platoon” of tightly-spaced
automated vehicles, developed by the PATH program at UC Berkeley.
Platoons rely on communications 20 times a second to keep all
vehicles moving smoothly together.
Platoon
2.7 Driver models
• Sensing the current location of a nearby vehicle is not enough:
it would be even better to predict future actions of the
vehicle. Unless that vehicle is fully automated, it is
necessary to model the behavior of that driver.
• As shown in the next slide (and as everyone knows from
personal experience), there is a great deal of variability in
people’s driving behavior.
• If a particular vehicle can be observed for some time, that
driver’s behavior can be estimated, and used to predict
future actions.
Driver Differences
The five drivers plotted
here each have
different behaviors for
one important
component of driving:
average lane position.
They have different
mean lane positions
when the road is
straight, and cut the
corners by different
amounts when the road
curves
[Chart: lane position for each driver on left curves, straight road, and right curves.]
Section 2 Questions:
• What are the advantages and disadvantages
of using radar vs. ladar?
• The speed of light is about 3*10^8 m/sec,
or, for a rule of thumb, a foot / nanosecond.
How long does it take a radar pulse to go to
and from an object 150 m away?
• Find two manufacturers of automotive or
truck radars on the www
3. Sensing Obstacles
• Obstacle detection is much more difficult than
vehicle detection: obstacles can be small, non-metallic, and much harder to see.
• Obstacles can be stationary or moving (e.g. deer
running across the road)
• For a passenger car at highway speeds, obstacles
need to be detected 100 m ahead. For trucks, the
distance is even longer.
• Obstacle detection is one of the most challenging
tasks for an intelligent vehicle
3.1 Obstacles on the Road
• State DOTs report cleaning up construction debris, fuel spills, car
parts, tire carcasses, and so forth.
• State highway patrols receive reports of washing machines, other home
appliances, ladders, pallets, deer, etc.
• A survey commissioned by a company that builds litter-retrieval
machines reports 185 million pieces of litter / week.
• Rural states report up to 35% of all rural crashes involve animals,
mostly deer but also including moose and elk as well as farm animals.
• A non-scientific survey of colleagues indicates that people have hit tire
carcasses, mufflers, deer, dogs, even a toilet.
3.2 Sensors
• Ladar, in its high-resolution scanning formats, is
useful for seeing small objects
• A variant is to use the reflectance channel of a
ladar, and to look for bright returns, which
probably come from objects sticking up out of the
roadway.
• Sonar has insufficient range
• Advanced radar and stereo vision systems may
work
3.3 Polarimetric radar
• Radar can be polarized in the same way as light.
• Just as polarized sunglasses help reduce light reflected
from shallow angles (glare), polarized radar transmitters
and receivers can separate the return from different
polarization directions; this provides cues to distinguish
horizontal surfaces from vertical surfaces.
• Polarimetric radars built at U of Michigan are much better
than ordinary radar at separating small obstacles from
ground clutter.
• There is also some evidence that polarimetric radar will
give different returns for wet or snowy roads, giving some
information on road conditions.
3.4 Stereo vision
• Stereo works by finding the same point in
two or more cameras. Intersecting the lines
of view from the cameras gives the 3D
location of the object.
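A minimal sketch of the triangulation idea using the standard rectified two-camera relation Z = f·B/d (the systems shown on the next slides use more elaborate multi-camera processing); the focal length, baseline, and disparity below are assumed illustration values:

```python
def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from disparity for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Assumed parameters: 800-pixel focal length, 30 cm baseline.
# A disparity of 2.4 pixels corresponds to a point about 100 m away,
# which shows why long-range stereo needs sub-pixel matching accuracy.
print(stereo_depth_m(800.0, 0.30, 2.4))
```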
Stereo Guided Segmentation
• Low-resolution stereo for detection and recognition of nearby objects,
used for side-looking sensors on a bus.
• Left: Original image. Center: depth map from stereo; brighter is close.
Right: “blobs” of pixels at the same distance. The overlays on the
original image show detected objects, two pedestrians and a car.
• Further processing can examine each blob to separate people from
fixed obstructions, and generate appropriate driver warnings
Long-Range Stereo
Top: One of three images
from a stereo set. The
objects on the road are 15
cm tall at a range of 100
m from the camera.
Bottom: detected objects
in black. Besides the
obstacles on the road, the
system has found the
person, the sign, grass
along the road, and a
distant dip in the road
Section 3 Questions:
• Look up the connection between posted speeds
and vertical curvature in the AASHTO handbook.
Is the line of sight for a human driver, going over
the crest of a hill, better or worse than for a sensor
mounted in the front bumper?
• For extra credit, go out and run over obstacles
with your car, and decide what is the largest object
you would be willing to hit, and therefore the
smallest object that needs to be detected.
4. Sensing Lane Position
• Knowing lane position is necessary for
automated guidance and for lane departure
warning systems. It is also important for
rear-end collision warning, to know which
lane your vehicle is in as well as which lane
preceding vehicles are in.
• Requirements are somewhat different for
each application.
4.1 Requirements
– reliability: high for warning systems, extremely high
for automated guidance
– availability: must be available nearly 100% for
automated guidance; lower availability acceptable for
warning systems provided a warning is given
– weather: should operate in most weather, warn and
disable if not operating
– accuracy: absolute accuracy of better than 30 cm
needed; no high-frequency jitter allowed for control
applications
– range: rear-end warning requires knowing lane position
of leading vehicle, to approx. 100m
4.2 Magnetics
UC Berkeley has pioneered the use of
permanent magnets, buried in the center
of the road, for lateral guidance. The
magnets can be inexpensive magnets, as
shown here, for most applications; or
more expensive but much smaller
magnets for bridge decks where drilling
large holes would damage the structure.
The magnets are sensed by
magnetometers underneath the front and
rear bumpers of the vehicle to provide
lateral position information.
The magnets can be installed north pole
up or down, providing a simple binary
code that can indicate e.g. map location.
More Magnets
An obvious advantage of
magnets is that they are not
affected by weather. Here,
they are used to mark the
edge of the shoulder, to
provide a visual indicator
to the snow plow operator.
Besides buried magnets, there are also efforts to place magnets in
lane marking tape. This would be less expensive to install, but
requires more sophisticated sensing, since the magnets are not
directly underneath the vehicle’s sensors.
4.3 Buried cables
The oldest way to perform automated
guidance, going back to the 1950’s, is
to follow a buried cable. The automated
trucks at the Westrack pavement test
site use two cables for redundancy, with
pickup coils mounted in triangular
frames at both front and back of the
truck. Buried cables are all-weather,
and the signal on the cable can be used
to send messages (e.g. “speed limit
change”). But cable installation and
maintenance are difficult.
4.4 Radar reflective surfaces
• Collision avoidance radar can
be used for lateral control with
modified lane-marking tape.
• Frequency-dependent tape
properties can provide
distance and other information
[Diagram: (a) radar illuminating a radar-reflective stripe with high-frequency and low-frequency illumination; (b) the punched tape pattern.]
• Conventional lane marking
tape (3M Corp.) punched with
specific hole pattern to provide
frequency-selective retroreflection
4.5 Vision
Typical vision system for
lane tracking.
The detected position of the
solid line is shown by the
blue dots; the detected
dashed line by dark and
light blue dots. Overlaid
on the image is data from
other sensors, showing the
location of radar targets:
yellow X for right lane, red
X for current lane.
Experimenter interface
shown at bottom.
Section 4 Questions:
• What would be the relative advantages of
magnetics vs. vision?
• What is the disadvantage of buried cables?
5. Sensing vehicle position and motion
• An estimate of vehicle motion, and position on a map, can
be used in several ways, depending on the resolution. For
example:
– coarse position (10s of meters) can be used to predict that a corner
is coming up
– medium position (meters) can be used to warn a driver to slow
down, based on the design speed of the upcoming curve
– fine positioning (cm) can be used to tell if the driver is drifting out
of their lane through the curve
• Several different technologies provide ways of measuring
absolute position and motion, at a variety of resolutions.
5.2 GPS
• The Global Positioning System is a satellite-based
navigation system, originally developed by the US
military. It works by broadcasting very accurate time
signals from a constellation of orbiting satellites. A
ground-based receiver can compare the times from several
satellites; the difference in apparent times gives the
difference in time-of-flight of the signals from the
satellites, and therefore the difference in distance to each
satellite. Simple geometry gives the location of the ground-based unit and an accurate time.
More GPS
• This simple picture is distorted by two phenomena
– The US government deliberately introduces distortions
into the civilian version of the signal, in order to reduce
the accuracy of the system for potential enemies
– Local atmospheric effects refract the signals by varying
amounts
• The result is that raw GPS has an accuracy of only
10s of meters.
Differential GPS
• In Differential GPS, a base station has a GPS receiver at a
known location. It continually compares its known position
with the GPS reported position. The difference is the error
caused by selective availability and atmospheric distortion.
The base station broadcasts the correction terms to mobile
units. By applying the correction, the mobile units can
reduce their errors.
• The accuracy of DGPS is on the order of a few meters.
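A minimal sketch of the differential correction (names and the flat local coordinates are illustrative, not from any particular GPS receiver): the base station broadcasts known-minus-measured position, and the mobile unit adds it to its own raw fix.

```python
from typing import Tuple

Position = Tuple[float, float]  # (easting_m, northing_m) in a simplified local frame

def dgps_correction(base_known: Position, base_measured: Position) -> Position:
    """Correction = where the base station really is minus where GPS says it is."""
    return (base_known[0] - base_measured[0], base_known[1] - base_measured[1])

def apply_correction(mobile_measured: Position, correction: Position) -> Position:
    """The mobile unit adds the broadcast correction to its raw fix."""
    return (mobile_measured[0] + correction[0], mobile_measured[1] + correction[1])

# Illustrative numbers: both receivers see roughly the same error, so the
# corrected mobile position lands within a few meters of the true value.
corr = dgps_correction(base_known=(1000.0, 2000.0), base_measured=(1006.5, 1994.8))
print(apply_correction(mobile_measured=(1506.0, 2495.2), correction=corr))
```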
Carrier Phase GPS
• In carrier phase systems, the base station and the
mobile units watch both the broadcast time code,
and the actual waveforms of the carriers. By
counting waveforms, they can synchronize their
positions with each other to a fraction of a
wavelength.
• A good carrier-phase system, with good
conditions, can achieve accuracies of 2 cm or
better.
GPS Difficulties
• GPS requires a clear view of at least 4 satellites.
For aircraft applications, or in flat, open terrain,
this is not a problem.
• In mountainous terrain, or in urban canyons, GPS
signals can be blocked or (worse) can reflect from
tall objects and cause mistaken readings.
• Carrier-phase GPS is very sensitive to losing lock
on the satellites, and can become confused even
going under a large road sign.
Bottom line on GPS
• GPS is very useful for many applications.
• It is not yet 100% reliable, so is not ready
for control applications.
• Research continues on filling in gaps in
GPS coverage, and integrating GPS with
other sensors, so there is hope for the future.
Maps
• Accurate position is not useful unless combined with accurate maps.
• The first generation of digital maps was produced from paper maps,
and therefore is no more accurate than the paper products. Typical
quoted accuracies are 14 meters. This is sufficient for in-vehicle
navigation systems; until more sophisticated uses arise, there is little
market demand for high accuracy.
• The next generations of maps will be produced directly from aerial
photos and verified by driving selected routes with accurate GPS, so
the accuracies will improve.
• To be completely useful, maps should have additional information,
such as design speed of curves, grade of slopes, etc. This would aid
e.g. in warning drivers of excessive speed when entering a curve.
5.3 Inertial
• Inertial sensing measures acceleration, then integrates
acceleration to give velocity and again to give position.
• Since position is doubly-integrated, small errors in
acceleration build up rapidly.
• Inertial measurement is good for sensing braking forces or
for comparing wheel speed with ground speed and
calculating slip during braking.
• High-precision inertial navigation is not yet affordable for
the automotive market.
• Inertial measurement is useful to fill in short-term gaps in
GPS or other measurements.
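A small numeric sketch (with an assumed bias value) of why double integration is unforgiving: a constant accelerometer bias turns into a position error that grows with the square of time.

```python
def position_error_from_bias_m(bias_m_s2: float, elapsed_s: float) -> float:
    """Position error from integrating a constant acceleration bias twice:
    error = 0.5 * bias * t^2."""
    return 0.5 * bias_m_s2 * elapsed_s ** 2

# An assumed 0.01 m/s^2 (about 1 milli-g) bias already gives ~18 m of error
# after one minute, which is why inertial sensing is best used to bridge
# short gaps in GPS or other absolute measurements.
print(position_error_from_bias_m(0.01, 60.0))
```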
5.4 Other sensors
• “Dead reckoning” uses estimates of distance travelled and
direction of travel.
• Odometry uses wheel encoders to measure distance
traveled. It is susceptible to errors due to tire slip, incorrect
estimates of wheel circumference due to changes in tire
inflation, etc. Road Rally enthusiasts can calibrate their
odometry to 0.1%; this is not practical for most vehicles.
• Standard compasses are affected by nearby metallic
objects, such as bridges or buildings.
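A minimal dead-reckoning update, assuming distance increments from wheel encoders and a heading from a compass or gyro (the function and variable names are illustrative):

```python
import math

def dead_reckon_step(x_m: float, y_m: float, heading_rad: float, distance_m: float):
    """Advance an (x, y) position estimate by distance_m along the current heading."""
    return (x_m + distance_m * math.cos(heading_rad),
            y_m + distance_m * math.sin(heading_rad))

# Illustrative run: 1000 one-meter steps due east. A 1% odometry error over
# this 1 km would already be ~10 m of position error, so dead reckoning
# needs periodic absolute fixes from GPS or landmarks.
x, y = 0.0, 0.0
for _ in range(1000):
    x, y = dead_reckon_step(x, y, heading_rad=0.0, distance_m=1.0)
print(x, y)
```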
More Sensors
• Image correlators directly measure vehicle motion by
watching the ground move by under the vehicle. These
systems are accurate to better than 0.1%
• Doppler radar is used in precision agriculture applications,
where it is important to measure the speed of farm
equipment even with significant tire slip.
Section 5 Questions:
• Why can’t you just use a magnetic compass
for heading?
• What’s the cheapest GPS unit you can find
on the web?
• Why would Japan have a higher market
penetration of GPS and moving map
displays than the US?
6. Predicting Braking Performance
• Braking performance is key to setting many
parameters in automated control and in driver
warning systems.
• To set safe following distance, ideally the system
should know its own braking capability; the
braking capability of the lead car; and the reaction
time of the automated system or of the human driver.
• Braking performance of vehicles on identical
roadways can vary by a factor of 4.
6.1 Basic formulas
• The basic formulas for the time and distance required to
bring a car to a stop are
• Time = reaction time + speed / deceleration
• Distance = speed * reaction time + ½ speed² / deceleration (a worked example follows this list)
• Typical highway speeds are approximately 30 meters /
second; typical reaction times range from 100 milliseconds
for a fast computer-controlled sensor and brake actuator, to
up to 2 seconds for a human driver. The dominant
unknown factor is deceleration, or braking performance.
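A short sketch of the two formulas above, with the speed and braking levels drawn from the numbers used elsewhere in this section:

```python
G = 9.81  # m/s^2, to convert braking levels quoted in g into decelerations

def stopping_time_s(speed_m_s: float, reaction_s: float, decel_g: float) -> float:
    """Time = reaction time + speed / deceleration."""
    return reaction_s + speed_m_s / (decel_g * G)

def stopping_distance_m(speed_m_s: float, reaction_s: float, decel_g: float) -> float:
    """Distance = speed * reaction time + speed^2 / (2 * deceleration)."""
    return speed_m_s * reaction_s + speed_m_s ** 2 / (2.0 * decel_g * G)

# A sedan at a typical highway speed of 30 m/s, with a 1.0 s reaction time
# and 0.7 g braking, needs roughly 95 m to stop.
print(stopping_distance_m(30.0, 1.0, 0.7))
```

The same two functions can be applied to the scenarios in the Section 6 questions.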
6.2 Wheel speeds and slip
Typical force vs. slip curve. As the
brakes are applied, the tires begin to slip,
which results in deceleration force. As
the slip increases, the force increases to
some maximum. After that point, the
wheels begin to lock and skid, and the
braking force decreases. Note that the
curves for wet and dry pavement start
off very close to each other, but reach
different peaks. This means that gently
tapping the brakes is not enough to tell
surface type, and therefore it is difficult
to predict maximum braking
performance without attempting hard
braking.
[Plot: braking force (g) vs. slip (%), with separate curves for dry and wet surfaces.]
6.3 Surface condition sensing
• Several methods have been attempted to sense current
surface conditions:
– infrared spectrophotometers, tuned to detect differences between
ice, water, and dry pavement
– microphones in the wheel wells listening for water splash sounds
– roadside mini-weather stations with sensors built into the pavement
– careful instrumentation of all wheels of a car, looking for incipient
slip on the driving wheels
• None of the methods is completely successful yet.
Section 6 Questions:
• Have you ever encountered “black ice” that you
couldn’t tell was there?
• Calculate stopping distance for the following
parameters:
– Truck with 1.0 sec reaction time and 0.3 g braking
– Sedan with 1.0 sec reaction time and 0.7 g braking
– Sedan with sleepy driver, 1.5 sec reaction time and 0.7 g braking
– Sedan with poor brakes, 1.0 sec reaction and 0.5 g braking
– Sports car with professional driver, 0.5 sec and 1.0 g
• Which factors dominate stopping distance?
7. Reliability
• Reliability engineering in intelligent vehicles is
difficult. Several characteristics of automobiles
are much different than, e.g., aircraft:
– Cost sensitivity: Usual practices that involve triplex
redundancy of critical components may not be
affordable in automobiles.
– Equipment used until end-of-life: In most safety-critical
tasks, preventive maintenance schedules call
for replacing electronics before the end of their
design life. In the automotive environment, many
components are never replaced until they fail.
More Reliability
– Operation in uncontrolled environment: Vehicles
operate in harsh environments, with relatively
unskilled and untrained operators.
– Very large scale of deployment: An extremely
improbable event, one that occurs once in 10^9 hours,
would cause one failure in 73 years in the US
commercial air fleet. That same probability would
cause a failure once every 4.5 days in the US
automotive fleet, due to the much higher number of
vehicles. Even though the risk to a passenger might
be the same in both cases, the public perception of
risk could be much higher for cars.
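A back-of-the-envelope check of this scale argument; the fleet exposure figures below are assumptions chosen only to roughly reproduce the 73-year and 4.5-day intervals quoted above, not official statistics.

```python
MTBF_HOURS = 1.0e9  # "extremely improbable": one event per 10^9 operating hours

def years_between_failures(fleet_hours_per_year: float) -> float:
    """Expected interval between fleet-wide occurrences, in years."""
    return MTBF_HOURS / fleet_hours_per_year

# Assumed exposure: ~1.4e7 flight hours/year for the US commercial air fleet,
# ~8e10 vehicle hours/year for the US automotive fleet.
print(years_between_failures(1.4e7))         # roughly 70 years between events
print(years_between_failures(8.0e10) * 365)  # roughly 4.6 days between events
```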
7.1 Redundancy
• Duplex redundancy refers to having two copies of a
subsystem (e.g. computer). If a failure is detected in one
system, the other can be used.
• Triplex redundancy has three copies. For computers, the
output of all three can be compared, and the majority wins;
this provides automatic detection and correction of single
errors.
• Heterogeneous redundancy refers to doing the same
function with different means. For instance, if a steering
actuator fails on an automated vehicle, some steering
authority is available by differentially applying the right or
left brakes.
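A minimal sketch of triplex majority voting on identical computer outputs (a production automotive voter would compare within tolerances and also flag the disagreeing channel):

```python
def triplex_vote(a, b, c):
    """Return the majority value from three redundant channels.
    A single faulty channel is outvoted; if all three disagree, fail safe."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("no majority -- enter fail-safe mode")

# A single faulty channel (42) is masked by the two agreeing channels (41).
print(triplex_vote(41, 42, 41))  # -> 41
```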
7.2 System-level solutions
• System-level solutions build safety in by
considering the entire system. In
automated highways, the California PATH
approach of Platoons is designed to mitigate
the effects of an (unlikely) crash by having
vehicles so closely spaced that any collision
would be at a small relative velocity.
Section 7 Questions:
• How reliable is your car? Your computer?
Would you trust your life to them?
• Describe heterogeneous redundancy, and
give an example.
8. Emerging technologies
• A number of other technologies are being
developed that will support intelligent vehicles.
• Some, such as electronic controls, are being
developed for other purposes (e.g. handling), but
will be useful for intelligent vehicles.
• As drivers become more accustomed to electronics
in vehicles, prices will fall, consumer acceptance
will increase, and the pace of adoption of new
technology could accelerate.
8.1 Control
• Current IVI applications are focused on driver assistance
rather than vehicle control; nevertheless, partial and full
automation will eventually be important.
• A wide variety of standard and advanced control
techniques are being applied to road vehicles.
• Vehicles to date have been designed for human control, not
automated control. For example, current steering system
geometry is designed for “good handling”, i.e. predictable
response for humans. The underlying hardware may need
to be modified for optimal automatic control.
Difficulties
• Automated control is especially difficult in some
situations:
– Emergency maneuvers: Control systems optimized for smooth
performance at cruise will not work for abrupt maneuvers in
emergency situations.
– Equipment failure: Special controllers need to be designed to cope
with tire blowout or loss of power brakes or power steering.
– Heavy vehicles: The load, and the distribution of the load, vary
much more for a heavy truck than for a passenger car. Truck
controllers need to be much more adaptable than light vehicle
controllers.
More Difficulties
– Low speeds: Engine and transmission dynamics are hardest to
model at slow speeds. Applications such as automated snow plows
or semi-automated buses will require careful throttle control
design.
– Low-friction surfaces: As addressed above, it is difficult to predict
the effective coefficient of friction on a particular road surface.
This affects not only braking performance but also the design of
throttle and steering controllers.
8.2 Actuation
• Full or partial automation will require actuators, i.e.
computer-controlled motors that can move the throttle,
brake, and steering.
• The state of the art is rapidly improving: vehicles are
available on the market with electronic fuel injection,
electronic power steering, and electronic power brakes, all
driven by performance and weight improvements for
manually-driven cars. This makes it much easier to add
computer control.
• Special-purpose actuators will still be needed in some
applications, such as quick-response throttles for closely-spaced platoons of cars.
8.3 Driver condition
• It is important to assess driver alertness,
both in a drowsy driver warning system,
and in an automated system that is
preparing to return control to the driver.
• Alertness can be sensed indirectly, by
watching lane-keeping performance; or
directly, by watching for eye blink rate and
closure.
Perclose
Measuring percentage of time
eyes are closed. This system
illuminates the face with two
IR wavelengths, one of which
reflects from the retina.
Subtracting the images will
create a blank image (if the
eyes are closed) or an image
with two bright spots (if the
eyes are open).
Top left: image with retinal reflections
Top right: no retinal reflections
Bottom left: difference image, note
two bright dots for reflections
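A toy sketch of the two-wavelength subtraction step, using NumPy arrays in place of camera frames; the threshold and image values are assumed for illustration:

```python
import numpy as np

def count_retinal_spots(bright_pupil_img: np.ndarray, dark_pupil_img: np.ndarray,
                        threshold: float = 50.0) -> int:
    """Subtract the two IR images and count pixels bright enough to be
    retinal reflections; the count is near zero when the eyes are closed."""
    diff = bright_pupil_img.astype(float) - dark_pupil_img.astype(float)
    return int(np.count_nonzero(diff > threshold))

# Toy 4x4 "frames": two pixels return strongly only at the retina-reflecting
# wavelength, so the eyes are judged open.
bright = np.zeros((4, 4))
bright[1, 1] = bright[1, 3] = 200.0
dark = np.zeros((4, 4))
print(count_retinal_spots(bright, dark) > 0)  # -> True (eyes open)
```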
8.4 Communications
• Infrastructure-based ITS projects are building roadway-to-vehicle
communications for traffic and routing information.
• Dedicated Short Range Communication (DSRC) is being developed
for warning of local conditions, such as ice, sharp curves, changes in
speed limit, or stopped traffic out of sight around a bend.
• Vehicle-vehicle communications will be increasingly important for
collision warning systems. The lead vehicle can communicate speed,
braking, intent to change lanes, traffic status ahead, etc.
• In the platoon version of full automation, vehicles need to
communicate with low latency, e.g. 20 times / second. This creates
interesting research questions on creating local nets, on managing both
inter- and intra-platoon communication, and on reliable
communications in urban canyons and other difficult environments.
Communications Technologies
• Most communications schemes rely on radio,
using a variety of bands.
• Schemes currently under research include:
– modulating the radar reflectivity of a surface, so radar-equipped
trailing vehicles can get information as well as range
– powering a transponder with radar energy, again to communicate
to a following radar-equipped vehicle
– modulating LED brake lights so trailing vehicles equipped with
detectors tuned to that particular wavelength can pick up
information
Section 8 Questions:
• What makes vehicle control difficult?
• What makes communications difficult?
9. Sensor-friendly roadways and vehicles
• On-board sensing would work better if the
environment were designed for sensing.
• Current roadways have significant variability
(Botts' dots, painted lines, thermoplastic stripes,
etc).
• Current roadways have many objects that cause
radar “clutter” (returns from objects that are not of
interest), such as guard rails, roadside signs,
bridge overpasses
9.1 Dealing with clutter
• Clutter can be:
– Moved: Sign posts could be placed farther from the travel
lanes.
– Masked: Radar Absorbing Material (RAM) could be applied to
objects such as bridge abutments.
– Marked: Polarizing reflectors, or filters that absorb only a
narrow frequency band, could be applied to large objects. They
would then still appear in a radar return, but would be marked
in the radar signal as known fixed objects.
– Mapped: The locations and signatures of fixed objects could be
stored in a map, and provided to individual vehicles.
9.2 Path prediction
• “Path Prediction” refers to estimating where the
vehicle’s current lane goes, so an obstacle
detection system knows where to look for stopped
cars and other obstructions.
• Sensor-friendly systems will improve path
prediction by enhancing lane visibility.
• They will also improve obstacle detection by
reducing clutter from off-road objects and
increasing returns from other vehicles.
Sensor-Friendly Features
• Besides clutter suppression, sensor-friendly
systems can improve visibility:
– Lane markings can be improved with pigments
that reflect radar or near-visible wavelengths
– Vehicle visibility can be improved with radar
reflectors, either fixed or modulated for
communications
Microstrip patch retroreflector antenna
• Without a stable aiming point, radar-based vehicle
tracking is difficult; the lead vehicle appears to wander.
• The OSU patch retro-reflector provides a distinctive,
wideband vehicle marker. Its compact form factor is easily
attached to vehicles.
• Angle-invariant return provides aim-point stability.
• Wide bandwidth permits good range resolution.
[Plot: reflector return as a function of frequency (GHz) and angle.]
Section 9 Questions:
• List four ways of handling clutter.
• How can sensor-friendly features help with
the path prediction problem?
10. Comments and conclusions
• Many of these technologies work best in combination: e.g.
lane tracking aids both lane departure and rear-end
warnings.
• Many of these work best with some infrastructure
assistance: e.g. lane departure systems need at least good
road delineation, and can take advantage of better
markings.
• In many cases, the technology is approaching readiness;
the remaining obstacles to deployment are legal and
institutional.
Acknowledgements
• My thanks to the CMU Navlab group, and the Automated Highways
Tech Team. Much of the research described here was supported by
NHTSA and FHWA.
• Photo credits: thanks to Liang Zhao (stereo), Todd Williamson (stereo),
Richard Grace (perclose), Gerald Stone and California PATH (magnets
and snowplow), Bill Stone and California PATH (platoon), Colin
Ashmore (buried wire), Umit Ozguner, Jon Young, and Brian A.
Baertlein (radar reflective surfaces and microstrip antenna), K2T Inc
(ladar), Dirk Langer (radar), Parag Batavia (driver differences chart),
Assistware Technologies (vision system) and Todd Jochem (bus). All
pictures copyright by their owners; reproduced by permission.