Virtual Sensors for Vehicle Dynamics
Applications
U. Forssell, S. Ahlqvist, N. Persson, F. Gustafsson*
NIRA Dynamics AB
Teknikringen 1F
58330 Linköping, Sweden
Phone: +46/13/329800 Fax: +46/13/329829
Email: urban.forssell@nira.se
stefan.ahlqvist@nira.se
niclas.persson@nira.se
*Dept. of Electrical Engineering
Linköping University
58183 Linköping, Sweden.
Phone: +46/13/282706 Fax: +46/13/282622
Email: fredrik@isy.liu.se
Keywords: adaptive filter, inertial sensors, sensor fusion, signal processing
Abstract
This paper discusses sensor fusion as a means to compute virtual sensor signals
for certain vehicle attitude quantities, in particular vehicle yaw rate. It is shown
how sensor fusion can be used to increase the performance and availability of
standard sensors commonly available in a modern car. Results from tests performed with a real vehicle are presented.
1 Introduction
NIRA Dynamics is a research and development company specialising in signal
processing for vehicle dynamics applications. This paper presents some on-going
activities within our company directed at creating virtual sensors using advanced
sensor fusion software.
The sensor fusion idea is quite general and has many different application areas,
see e.g. [1] and the references therein. As the name indicates, sensor fusion is
about fusing information from several different physical sensors. The goal is to
compute new virtual sensor signals using information from the existing, physical
sensors. The virtual sensors can in principle be of two different types:
1. High-precision and self-calibrating sensors, i.e. improved versions of the physical sensors. The goal is either to achieve higher performance using existing sensors or to reduce system cost by replacing expensive sensors with cheaper ones and using sensor fusion to restore signal quality.
2. Soft sensors, i.e. sensors that have no direct physical counterpart among the sensors used but can be created using intelligent software solutions.
Sensor fusion is used in, for example, navigation, target tracking, aircraft attitude
estimation and various other military applications to achieve exactly these goals.
The primary aim of our research and development efforts is to develop unique sensor fusion based systems for vehicles, in particular for vehicle dynamics applications, and the challenge is to utilise this potential to both improve performance and reduce system cost.
Fig. 1. Sensor fusion in vehicles. (Block diagram: wheel speed (ABS), gyro, accelerometer, and engine signals feed a sensor integration unit whose virtual sensor outputs, e.g. road friction, high precision yaw rate, and tire pressure, serve a control unit (anti-locking brakes, anti-spin, dynamic stability, adaptive cruise), a diagnosis unit providing degraded/"limp home" functionality, and an HMI (audio/visual warnings, information displays).)
Figure 1 contains a schematic picture of how the sensor fusion ideas may be
applied to vehicles. To the left we have different types of sensors available in a
modern high-end car: wheel speed sensors (ABS), yaw and/or roll rate gyros,
accelerometers, and various engine (and powertrain) related signals. These signals
are fed into a sensor integration unit, which merges the information from the
different sensors and allows the computation of the virtual sensor signals. These,
in turn, may be used as inputs to various control systems, such as anti-spin
systems and adaptive cruise control systems, or in some Human/Machine Interface
(HMI), for example a display on the dashboard.
The possibility of computing virtual sensor signals is of course very appealing, but sensor fusion also gives us tools to improve fault diagnosis of the physical sensors. The reason is that, by using sensor fusion, we introduce analytical redundancy, which can be used to detect and isolate different sensor faults. The redundancy also implies that we can reconfigure the system if one or more sensors break down, to achieve so-called degraded, or "limp home", functionality. Classical designs rely on hardware redundancy to achieve these goals, which is a very expensive solution compared to using sensor fusion software.
The objective of this paper is to discuss sensor fusion algorithms in general and
sensor fusion algorithms for vehicle attitude estimation problems in particular. See
also the accompanying paper [2]. A number of active safety systems can benefit
from higher quality state information about vehicle attitude (speed, position,
orientation, etc.):

- Anti-lock braking and anti-spin systems need accurate velocity information to compute the slip.
- Anti-spin systems for AWD vehicles need absolute vehicle velocity information for computing the optimal slip.
- Dynamic stability systems need an accurate and high-bandwidth yaw rate signal to control the body slip angle (i.e. the angle between the vehicle's heading and its velocity vector).
- Adaptive cruise control systems need accurate yaw rate information for situation awareness.
- Smart airbag systems need accurate velocity and acceleration information to control the release of the airbag during e.g. a vehicle roll-over.
A problem in many vehicle projects, however, is that the high-precision sensors
that are needed in order to meet the functional specifications typically have to be
replaced by less accurate sensors due to cost considerations. Hence, there is a huge
potential for using sensor fusion technology to create high-precision virtual
sensors at a very modest cost in this area. To exemplify our ideas we will in this
paper concentrate on the problem of estimating a high precision yaw rate signal
from a standard, low-cost gyro using sensor fusion.
The rest of the paper is organised as follows. Next, in Section 2, we review some
basic algebraic relations, which reveal much of the core ideas of the sensor fusion
approach. Then we go on to discuss the details of how to estimate a high-precision yaw rate signal in Section 3. Section 4 contains some test results and,
finally, Section 5 summarises the paper.
2 Theoretical Foundation
The core idea behind our sensor fusion solution can be highlighted using the
following example. Consider two different sensors measuring the same time-varying physical quantity x. Each sensor provides a measurement y_i(t) of x(t) that is corrupted by an offset b_i, scaled by a known function of time c_i(t). The measurements can be expressed algebraically as the equations:

y_1(t) = x(t) + c_1(t) b_1
y_2(t) = x(t) + c_2(t) b_2

At a single time instant these two equations contain three unknowns; the system is therefore underdetermined and the offsets cannot be eliminated directly. When two sets of measurements y_1(1), y_2(1) and y_1(2), y_2(2) are available, there are two more equations but only one more unknown, i.e. four equations and four unknowns. Thus, the offsets and the values x(1), x(2) can be solved for, provided there is no linear dependency in the data. In this example, the linear independence condition is:

c_1(1)/c_1(2) ≠ c_2(1)/c_2(2)

If, for example, c_1 is constant and c_2(t) is the velocity v_x(t), linear independence is obtained when the velocity has changed between the two measurements. This leads to observability, and under these conditions we can resolve all unknowns and hence also determine the sought quantity x without error.
In practice, measurement noise is added to each of the observations. In order to suppress the noise, a number of observation samples large enough to constitute an overdetermined system of equations is collected and solved in the least squares sense. In a real-time application, this should be implemented using a recursive filter, preferably a Kalman filter, into which the sampled observations are fed. Under certain identifiability assumptions (such as persistence of excitation), the Kalman filter gives consistent estimates of the sought quantity (or quantities). As always in adaptive filtering, there is a trade-off between noise suppression and tracking ability, which must be handled with care for optimal performance of the adaptive filter [1]. We will not go into the details of this here for the sake of conciseness.
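To make the batch least squares idea concrete, the following Python sketch stacks N samples of the two measurement equations and solves for the offsets and the time-varying quantity in one least squares problem. The simulated signals, noise levels and the choice c_1(t) = 1, c_2(t) = v_x(t) are illustrative assumptions, not data or code from the actual system.

import numpy as np

# Two sensors measure the same time-varying quantity x(t):
#   y_1(t) = x(t) + c_1(t) b_1 + noise,   y_2(t) = x(t) + c_2(t) b_2 + noise
# Stacking N samples gives an overdetermined linear system in the unknowns
# x(1), ..., x(N), b_1, b_2, which is solved in the least squares sense.
rng = np.random.default_rng(0)
N = 200
t = np.arange(N)

x_true = np.sin(0.05 * t)              # the sought quantity (e.g. yaw rate)
c1 = np.ones(N)                        # constant offset scaling (sensor 1)
c2 = 10.0 + 5.0 * np.sin(0.01 * t)     # varying scaling, e.g. velocity v_x(t)
b1, b2 = 0.3, -0.02                    # unknown offsets

y1 = x_true + c1 * b1 + 0.05 * rng.standard_normal(N)
y2 = x_true + c2 * b2 + 0.05 * rng.standard_normal(N)

# Unknown vector theta = [x(1), ..., x(N), b_1, b_2]
A = np.zeros((2 * N, N + 2))
A[:N, :N] = np.eye(N)
A[:N, N] = c1                          # rows corresponding to y_1
A[N:, :N] = np.eye(N)
A[N:, N + 1] = c2                      # rows corresponding to y_2
y = np.concatenate([y1, y2])

theta, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated offsets:", theta[N], theta[N + 1])

In a real-time application the same model would instead be written in state-space form and updated recursively with a Kalman filter, as discussed above.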
Next we will see how these basics translate into a more explicit form when
discussing high precision yaw rate signal estimation.
3 High Precision Yaw Rate Sensor
This section describes our ideas for how to compute a high precision yaw rate signal. As indicated in figure 2, the idea is to utilise available information from existing sensors in a modern car and to use sensor fusion to compute an improved yaw rate signal.
The core of our patent pending system [2,3] is an adaptive filter that estimates the error in the signal from the yaw rate gyro and removes it from the original signal by a simple subtraction, yielding a bias and scale factor free yaw rate signal, cf. figure 2. A feature of this solution is that the bandwidth of the original signal is preserved and, in the case of limited computational power, the adaptive filter can run at a very moderate rate (it should still be high enough to capture the most important temperature drifts etc. of the sensor).
Fig. 2. Computation of the high precision yaw rate signal. (The sensor integration unit uses the wheel speed and lateral accelerometer signals to form an error estimate, which is subtracted from the yaw rate gyro signal to give the high precision yaw rate.)
The details of our solution are as follows. For simplicity of the explanation, we assume here that there is no lateral movement of the vehicle. In the relations

\dot{\psi} = \frac{v_x}{R} = v_x R^{-1}

a_y = \frac{v_x^2}{R} = v_x^2 R^{-1} = v_x \dot{\psi}

\dot{\psi} is the yaw rate from the gyro, v_x is the velocity of the vehicle in the x-direction, and a_y is the acceleration in the y-direction. The curve radius R, defined as the distance to the centre of the rear wheel axle (of length L), is computed according to the relation

\frac{v_{rr}}{v_{rl}} = \frac{\omega_{rr} r_{rr}}{\omega_{rl} r_{rl}} = \frac{R - L/2}{R + L/2}

The angular wheel velocities \omega_{rl}, \omega_{rr} of the respective rear wheels are received from the ABS, and the inverse R^{-1} of R is solved for in order to avoid numerical problems when driving straight ahead. The wheel radii ratio is subject to an offset:

\frac{r_{rl}}{r_{rr}} = 1 + \delta_{ABS}
The influence of the offset on the denominator is negligible, which results in the following expression for the inverse curve radius:

R^{-1} \approx \frac{2}{L} \cdot \frac{(\omega_{rl}/\omega_{rr})(1 + \delta_{ABS}) - 1}{\omega_{rl}/\omega_{rr} + 1} = R_m^{-1} + \frac{2}{L} \cdot \frac{\omega_{rl}/\omega_{rr}}{\omega_{rl}/\omega_{rr} + 1} \, \delta_{ABS}

wherein the computable quantity

R_m^{-1} = \frac{2}{L} \cdot \frac{\omega_{rl}/\omega_{rr} - 1}{\omega_{rl}/\omega_{rr} + 1}

is used for the inverse curve radius. Finally, the velocity at the centre of the rear wheel axle is

v_x = \frac{\omega_{rl} + \omega_{rr}}{2} \, r

where r denotes the nominal wheel radius.
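As an illustration, the computable quantities above follow directly from the rear wheel angular velocities. The Python sketch below (the axle length L and nominal wheel radius r are assumed example values, not parameters from the paper) forms R_m^{-1}, v_x and the derived yaw rate measurement y_2 = v_x R_m^{-1}:

def abs_yaw_rate_measurement(omega_rl, omega_rr, L=1.5, r=0.3):
    """Form the measurable inverse curve radius R_m^{-1}, the speed v_x at the
    centre of the rear axle and the yaw rate measurement y_2 = v_x * R_m^{-1}
    from the rear wheel angular velocities (rad/s), the rear axle length L (m)
    and the nominal wheel radius r (m)."""
    q = omega_rl / omega_rr                     # wheel speed ratio
    Rm_inv = (2.0 / L) * (q - 1.0) / (q + 1.0)  # computable inverse curve radius
    v_x = 0.5 * (omega_rl + omega_rr) * r       # speed at the rear axle centre
    return Rm_inv, v_x, v_x * Rm_inv

# Example: roughly 20 m/s with slightly different left/right wheel speeds.
Rm_inv, v_x, y2 = abs_yaw_rate_measurement(omega_rl=66.0, omega_rr=67.0)
print(Rm_inv, v_x, y2)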
Thus, in a practical implementation of the system depicted in figure 2, the sensor measurements are:
1. y_1(t) from a yaw rate sensor, i.e. the gyro signal;
2. y_2(t) = v_x R_m^{-1} from the ABS sensors, with R_m^{-1} computed as above; and possibly
3. y_3(t) from a lateral acceleration sensor.
All these sensor measurements are subject to an offset and measurement noise, as given by the relations:

y_1(t) = \dot{\psi}(t) + \delta_{YR} + e_1(t)

y_2(t) = v_x R_m^{-1} + e_2(t) = \dot{\psi}(t) - v_x \cdot \frac{2}{L} \cdot \frac{\omega_{rl}/\omega_{rr}}{\omega_{rl}/\omega_{rr} + 1} \, \delta_{ABS} + e_2(t)

y_3(t) = v_x(t) \dot{\psi}(t) + \delta_{ACC} + e_3(t)

where \delta_{YR} and \delta_{ACC} are the gyro and accelerometer offsets, respectively, and \delta_{ABS} is an offset that depends on the relative tire radii of the left and right wheels. In this model, the offsets may be estimated on-line using e.g. a recursive
least squares method or a Kalman filter. We prefer the latter due to the Kalman
filter’s advantageous tuning flexibility in multi-parameter estimation problems.
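To illustrate how such an on-line estimator could be structured, the following Python sketch implements a small Kalman filter for the offsets, driven only by the gyro and ABS measurements y_1 and y_2 from the model above. The offsets are modelled as slowly drifting random walks; the noise levels and the default axle length are illustrative assumptions, not the tuning of the actual system.

import numpy as np

class YawRateOffsetFilter:
    """Kalman filter sketch for the offset states [delta_YR, delta_ABS].

    With y_1 = psi_dot + delta_YR + e_1 (gyro) and
         y_2 = psi_dot - v_x * k * delta_ABS + e_2 (ABS), k = (2/L) * q / (q + 1),
    the difference y_1 - y_2 = delta_YR + v_x * k * delta_ABS + noise is linear
    in the offsets. The corrected yaw rate is y_1 minus the estimated gyro offset."""

    def __init__(self, q_var=1e-8, r_var=1e-3):
        self.theta = np.zeros(2)        # offset estimates [delta_YR, delta_ABS]
        self.P = np.diag([1.0, 1e-2])   # initial covariance
        self.Q = q_var * np.eye(2)      # random-walk drift of the offsets
        self.R = r_var                  # measurement noise variance

    def update(self, y1, y2, v_x, omega_rl, omega_rr, L=1.5):
        q = omega_rl / omega_rr
        k = (2.0 / L) * q / (q + 1.0)
        H = np.array([1.0, v_x * k])    # regressor for the difference y_1 - y_2
        z = y1 - y2
        self.P = self.P + self.Q                        # time update
        S = H @ self.P @ H + self.R                     # innovation variance
        K = self.P @ H / S                              # Kalman gain
        self.theta = self.theta + K * (z - H @ self.theta)
        self.P = self.P - np.outer(K, H @ self.P)
        return y1 - self.theta[0]       # high precision yaw rate

Subtracting the estimated gyro offset from y_1 corresponds to the subtraction in figure 2. The lateral accelerometer measurement y_3 can be brought in analogously, for instance via the combination y_3 - v_x y_1, which is also linear in the offsets.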
Remarks
1. The important question of identifiability, that is, under what conditions the offsets can be estimated, is answered by studying the rank of the matrix to be inverted in the least squares solution. For the accelerometer sensor, the matrix is given by:

\begin{pmatrix} 1 & \frac{1}{N}\sum_{t=1}^{N} \frac{1}{v_x(t)} \\ \frac{1}{N}\sum_{t=1}^{N} \frac{1}{v_x(t)} & \frac{1}{N}\sum_{t=1}^{N} \frac{1}{v_x(t)^2} \end{pmatrix}

In short, this matrix has full rank if and only if the velocity changes during the time horizon. Furthermore, the more variation, the better the estimate. Similarly, the offsets are identifiable from the yaw rate and ABS sensors if the velocity or the curve radius changes at any time. (A numerical illustration of this rank condition is sketched after these remarks.)
2. Another issue in a practical implementation of the system is that of wheel spin and other abnormal driving conditions, which must be handled separately. Due to space limitations, we do not go into the details of this here. In general, these kinds of problems are handled by turning off or slowing down the adaptation for some (or all) of the parameters during such conditions.
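The rank condition in Remark 1 can also be checked numerically. The Python sketch below (with made-up velocity profiles) forms the matrix from Remark 1 for a constant and for a varying velocity and inspects its rank and conditioning:

import numpy as np

def accel_identifiability_matrix(v_x):
    """Form the matrix from Remark 1 for a given velocity profile v_x(t).
    It loses rank when the velocity is constant over the data window."""
    inv_v = 1.0 / np.asarray(v_x, dtype=float)
    return np.array([[1.0, inv_v.mean()],
                     [inv_v.mean(), (inv_v ** 2).mean()]])

constant = accel_identifiability_matrix(np.full(100, 20.0))           # constant speed
varying = accel_identifiability_matrix(np.linspace(10.0, 30.0, 100))  # accelerating

print(np.linalg.matrix_rank(constant))   # 1: offsets not identifiable
print(np.linalg.matrix_rank(varying))    # 2: identifiable
print(np.linalg.cond(varying))           # conditioning improves with more variation

One possible use of such a check is to slow down or freeze the adaptation when the excitation is poor, in the spirit of Remark 2.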
4 Test Results
To demonstrate the performance of the high precision yaw rate function we now show the results of two different tests:
1. A 105 second drive on a public road, with rapid lane changes back and forth in the first part, then an aggressively driven roundabout, and then a drive through the same bend with hard ABS braking to a complete stop. The results are plotted in figures 3-6.
2. A test drive comprising four laps in a large roundabout, where the improvement in long-term drift is clearly visible, cf. figure 7.
Test Drive Number 1
A map of the measurement drive is shown in figure 3. The measurements show that the direct estimate (HPY Direct, solid line) follows the true yaw rate instantaneously but is noisy, see figures 4 and 5. A filtered version is also shown in figure 4; it attenuates the noise strongly but lags approximately 40 ms in time, which gives an effective bandwidth limit of 25 Hz. However, it should be pointed out that there is no theoretical limit on this bandwidth in our implementation; the filtering is simply a compromise between acceptable time lag and acceptable noise attenuation. The zoomed part (Fig. 4) is taken at a high yaw rate (the roundabout part). For these measurements the temperature gradient is close to zero, as the gyro had been left on for a long time before data collection started. The offset estimate is stable (Fig. 6), although some influence of the aggressive driving style is visible in the estimate.
Fig. 3. Test drive 1
Fig. 4. High precision estimate, direct and low-pass filtered (yaw rate [deg/sec] vs. time [s]).

Fig. 5. Zoom of the filtered estimate (yaw rate [deg/sec] vs. time [s]).

Fig. 6. Offset estimate (yaw rate gyro offset [deg/sec] vs. time [s]).
Test Drive Number 2
To demonstrate the performance improvements achieved in terms of yaw rate drift
we performed a test where a standard Volvo S80 with a production-type yaw rate
gyro was fitted with our high precision yaw rate system and driven four laps in a
large roundabout. Figure 7 contains trace plots from our inertial navigation system
using the raw yaw rate signal (red) and using our system (blue).
Fig. 7. Navigation performance with and without our system
The performance improvement is large: the drift is decreased by a factor of 10 in this test (the standard gyro has a drift of about 2 degrees per second; our system less than 0.2 degrees per second). In figure 7, note that with the raw yaw rate signal (red) the estimated path leaves the roundabout with a heading error of approximately 180 degrees (top right instead of bottom left), whereas the blue trace almost exactly represents the true vehicle path. It should also be noted that in this test the vehicle speed was computed without knowledge of the exact tire radius.
5 Summary and Conclusions
We have discussed the usefulness of sensor fusion technology for vehicle applications, in particular vehicle dynamics applications such as vehicle attitude estimation. As an example of the feasibility of our ideas, we have discussed in detail how a high precision yaw rate signal can be computed from a standard rate gyro signal, standard wheel speed sensors, and possibly a lateral acceleration signal. The level of performance achieved is a yaw rate signal with a drift of less than 0.2 degrees per second, using a standard rate gyro with a drift of about 2 degrees per second.
References
[1] F. Gustafsson, Adaptive Filtering and Change Detection, John Wiley & Sons, 2000.
[2] F. Gustafsson et al., "Sensor Fusion for Accurate Computation of Yaw Rate and Absolute Velocity", SAE paper 2001-01-1064, 2001.
[3] Swedish patent application SE0001353-2, "Sensor Fusion System".