USING AEROSPACE TECHNOLOGY TO IMPROVE OBSTACLE DETECTION UNDER ADVERSE ENVIRONMENTAL CONDITIONS FOR CAR DRIVERS

A. Amditis1, L. Andreone2, E. Bekiaris1
1. Institute of Communication and Computer Systems (ICCS) / National Technical University of Athens, 9, Iroon Politechniou str., 15773 Zografou, Athens, Greece, angelos@esd.ece.ntua.gr
2. Fiat Research Centre, Strada Torino 50, 10043 Orbassano (TO), Italy, l.andreone@crf.it

Abstract

Microwave radar and far infrared sensor technologies have been used in aerospace applications for decades, but both their technical limitations and their high prices have prohibited their use in the automotive field. Instead, low-cost, lower-performance sensors have been used in this field. Thus, the typical automotive radar sensor is limited to the detection of moving targets and cannot recognise stationary obstacles, while the available vision enhancement systems have behaved poorly in low-visibility cases. Road safety and efficiency considerations have demanded the introduction of such improved environmental knowledge acquisition and have led to the development of the first generation of ADAS (Advanced Driver Assistance Systems). As most fatal accidents occur at night or are influenced by rain, ice and fog, an improved obstacle detection system covering the next 200 meters has been estimated to lead to an accident reduction rate of up to 25% and to a significant reduction in traffic congestion due to poor visibility conditions. The latest developments in the aerospace field regarding such sensors now open a window for their automotive application. Low-cost microwave radars have emerged at the 77 GHz frequency, able to see 200 m ahead, whereas far infrared sensor cost has been reduced by up to 30%, by substituting the rare materials used before (i.e. germanium) with new materials of lower cost (i.e. TEX glass) and by using more compact and less costly microbolometer sensors. In the described application, two such sensors (a microwave radar and a far infrared sensor) of aerospace origin are being redesigned and integrated to meet the relevant automotive application requirements (in terms of range, volume, accuracy, cost, etc.). Even more important, emphasis is placed on designing a completely new HMI, so that any car driver can use it. Indeed, this constitutes a challenge equal to the technical one, since the airplane pilot is trained to interpret and understand complex sensor output, while the car driver is not. To make things worse, the reaction margins of car drivers are usually smaller than those of airplane pilots. For this application, an innovative HMI is being designed and tested, in order to present the fused output of the two sensors to the driver in an understandable way, without requiring him/her to take his/her eyes off the road. A number of image presentation techniques and obstacle symbols/metaphors will be comparatively tested in the laboratory by three car manufacturers and several research institutes Europe-wide. The results of this work are expected to bring a new dimension not only to automotive obstacle detection technology but also to its HMI concept and to user acceptance.

1 Introduction

Microwave radar and far infrared sensor technologies have been used in aerospace applications for decades, but both their technical limitations and their high prices have prohibited their use in the automotive field. Instead, low-cost, lower-performance sensors have been used in this field.
Such systems were first introduced for defence applications, allowing pilots to have a clear view of the traffic and tactical scenario in front of them, even at large distances and in adverse weather conditions. Soon, similar systems were also introduced on commercial aircraft for a number of tasks, significantly improving safety and efficiency. Such applications include:

- Weather radars. Microwave radars have been used for many years to detect weather phenomena ahead of the aircraft, allowing the pilot to have a clear image of the situation in front of him.
- Traffic and obstacle detection systems. Radars are also used to detect the traffic around the aircraft, in order to avoid possible collisions between aircraft. In recent years, IR sensors have also been used for similar purposes, covering mostly short distances, night-time and adverse weather conditions; in these cases IR sensors work better than microwave radars.
- Safe approach, landing and taxiing of an aircraft in near-zero visibility. A system that assists the pilot in these tasks requires an accurate "picture" of the outside world, including the detection of terrain and obstacles. For many years, military aircraft have used Forward Looking Infra Red (FLIR) sensors superimposed on a Head Up Display (HUD) or Head Mounted Display (HMD) to create what is known as an Enhanced Vision System (EVS). This type of design can greatly increase the view outside the aircraft when flying at night and in poor visibility. Unfortunately, the FLIR is susceptible to certain densities of fog that render the sensor inoperative. To overcome this problem, the use of Millimetre Wave Radar (MMW) has been considered. The two types of sensor are complementary to each other, and current state-of-the-art systems attempt to merge the two different sets of data for common display to the pilot.

In all the above cases the two sensors were handled separately, in terms of both HMI and the fusion of their data and findings. Only during the last few years have engineers started to work seriously on merging the functions of these two sensors, starting from defence applications. One interesting project involving the merging of radar and infrared sensor data is currently being undertaken for aircraft applications by the Institut fuer Flugmechanik of DLR in Braunschweig, to develop a system ensuring a safe landing in reduced visibility conditions caused by fog, rain and snow. According to the forecasts, at least five years will be needed before this system is in use (1).

Since the mid-80s, several efforts have been undertaken to transfer these technologies to the automotive field as well, in order to develop anti-collision driver support systems able to provide effective support to vehicle drivers, especially in cases of reduced visibility. Indeed, around 37% of road accidents with injury (2,3,4,5,6,7) occur in conditions of limited visibility, like darkness and fog. Moreover, accident statistics at EU level (2) show that the numbers and causes of vision-related accidents are comparable across the different EU countries: the worse environmental conditions in the North are compensated by the better ability of local drivers to cope with them. This is thus a problem of truly pan-European dimension, calling for common technological action within Europe and beyond.
2 The Performance Levels and Problems of Stand-Alone Sensors

Currently, far infrared and radar sensors have been employed separately in various prototypes to support the driver in different reduced-visibility conditions. However, mature as they may be in technological terms, they have not yet succeeded in presenting the complete traffic picture to the driver, especially under particularly adverse environmental conditions. The current lack of success in the technology transfer from the aerospace field to the automotive one is due to the very different requirements of automotive applications, in terms of scenario complexity, visibility range and human machine interaction requirements (car drivers are far less trained and skilled in the use of information devices than airplane pilots). The different technological approaches used so far, and their limitations and shortcomings, are summarised below:

- Ultrasound techniques, useful only at very short distances, since the technology is strongly dependent on propagation medium variations.
- CCD / CMOS sensors and image processing devices, active in the visibility range, but not offering an appreciable benefit in any reduced-visibility condition.
- Near infrared sensors (laser radar, infrared sensors), which do not offer an appreciable benefit in fog.
- Far infrared sensors (sensitive in the range of 8-14 µm), providing thermal images of the scenario independently of the light conditions; the enhanced visibility in fog and heavy rain depends on droplet dimension.
- Microwave range sensors (radar), operating properly in any poor-visibility condition and providing the detection of objects on the road ahead. Extracting information on stationary objects requires powerful signal processing, and their functionality is limited in complex scenarios like urban traffic.

If one focuses on the benefits and shortcomings of radar and infrared sensors, however, one can remark that the two technologies are fully complementary, as shown in the following figure (a bullet denotes appropriate functioning, a question mark a specific problem).

[Table comparing the anticollision radar, the far IR sensor and the EUCLIDE system across adverse weather conditions (fog, snow, night, heavy rain), traffic scenarios (urban) and information presentation (number of lanes, traffic density, day or night).]

Figure: Strengths and weaknesses of anticollision radar and far IR sensors, when used in automotive applications
1. The performance depends on the type of fog
2. Decrease of performance in heavy snow and rain
3. Object detection problems in complex traffic scenarios, like in cities and inside tunnels
4. No enhanced perception of the external environment, only warnings

3 EUCLIDE integrated approach

The main limitations of state-of-the-art microwave radar sensors are the detection of objects not belonging to the road, like bridges, the difficulty of extracting road geometry, and the relatively rough classification of object types. However, this information can be extracted by image processing of the infrared sensor data. The far infrared detector provides an enhanced image of the road ahead (up to 200 m, depending on the selected optical field of view) without the need for external illumination. The development of appropriate real-time image processing algorithms will allow the extraction of essential features to support radar object tracking. This will be particularly important for overhead object alarm suppression at long ranges.
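To make the idea of supporting the radar tracker with infrared image features more concrete, the following minimal Python sketch associates radar plots with infrared detections by azimuth and discards plots whose implied height places them above the road, as a bridge or gantry would be. The data structures, thresholds and the simple trigonometric height estimate are illustrative assumptions only, not the EUCLIDE algorithms.

```python
import math
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical, simplified data structures; the actual EUCLIDE plot format is not described here.
@dataclass
class RadarPlot:
    range_m: float        # radial distance measured by the 77 GHz radar
    azimuth_deg: float    # bearing of the plot relative to the vehicle axis

@dataclass
class IrDetection:
    azimuth_deg: float    # bearing of the warm object in the far infrared image
    elevation_deg: float  # elevation angle derived from its vertical position in the image

def suppress_overhead_alarms(radar_plots: List[RadarPlot],
                             ir_detections: List[IrDetection],
                             max_azimuth_gap_deg: float = 1.0,
                             clearance_m: float = 4.5) -> List[RadarPlot]:
    """Keep only radar plots that look like obstacles on the road surface.

    A plot is discarded when the matching infrared detection implies a height
    above an assumed bridge/gantry clearance (illustrative threshold only).
    """
    confirmed: List[RadarPlot] = []
    for plot in radar_plots:
        # Nearest infrared detection in azimuth, if any lies within the gating window.
        match: Optional[IrDetection] = min(
            (d for d in ir_detections
             if abs(d.azimuth_deg - plot.azimuth_deg) <= max_azimuth_gap_deg),
            key=lambda d: abs(d.azimuth_deg - plot.azimuth_deg),
            default=None)
        if match is None:
            confirmed.append(plot)  # no infrared evidence: keep the radar plot as it is
            continue
        # Crude height estimate above the sensor: range times tangent of elevation.
        height_m = plot.range_m * math.tan(math.radians(match.elevation_deg))
        if height_m <= clearance_m:
            confirmed.append(plot)  # on or near the road: treated as a real obstacle
        # otherwise the plot is assumed to be an overhead structure and is suppressed
    return confirmed
```

A real plot-level fusion stage would also have to handle measurement uncertainty and association over time; the sketch only shows the single-frame gating logic described above.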
On the other hand, really heavy rain and snow can degrade the infrared image, since big droplets are seen as "spots" of different temperature. In fog, the behaviour of the infrared sensor depends on the droplet dimension, so that its advantage (in terms of increased visibility) may be reduced in the presence of particular types of fog made of large droplets. Fusing the infrared sensor data with radar-recognised objects may, however, significantly reduce the droplet problem in such cases, by filtering out those thermal image features that do not correspond to a radar-recognised object. In addition, the driver cannot easily interpret the thermal images of an infrared sensor, nor can he/she gain an enhanced perception of the traffic conditions from discrete visual or audio warnings on obstacles ahead issued by a radar sensor. When the two data sources are merged, however, much more intuitive and continuous traffic scene feedback can be provided to the driver. For the above reasons, the EUCLIDE Consortium decided to merge state-of-the-art radar and infrared sensors, both in terms of data fusion and of the user interface to the driver, to support the driver in reduced visibility due to night and adverse weather conditions, but also to warn the driver when dangerous situations occur even in good visibility, thus addressing driver distraction too.

4 EUCLIDE system specifications

The EUCLIDE vision enhancement system will offer the following innovative functionalities to the driver: the possibility to distinguish obstacles from what is outside and above the road, to identify the type of obstacle, and to give the driver an enhanced perception of the road ahead. To achieve this goal, the following developments are planned:

- The 77 GHz scanning radar, developed for collision warning and avoidance, will be optimised to achieve improved object handling and classification, road estimation and prediction, together with a variety of modifications to the tracking subsystem.
- The high-resolution far infrared camera will be miniaturised in terms of camera and lens electronics and will be equipped with a real-time image processing board.
- Novel automotive sensor data fusion software will be developed, based upon relevant military technology, for plot-level sensor fusion.

However, the greatest innovation will rest with the new system's human machine interaction, which will feature an efficient and effective way to combine different warning signals, supporting the driver by providing the right help only when needed (both in conditions of reduced visibility and in good visibility, to minimise driver distraction). The aim is to develop an HMI that is effective and easy to use, open and interoperable so that it can be adopted by on-vehicle advanced driver assistance system functions, that avoids information overload, and that is characterised by an open architecture for future expansions. The permanent thermal image display of the infrared camera will be replaced by reduced optical information for driver support and driver warning. The warning and support strategies will be managed in an innovative way, using optical and acoustic media; a simplified illustration of such a strategy is sketched below. The human machine interface development will be carried out and continuously verified in close interaction between the partners responsible for the HMI design and those responsible for its technical and ergonomic evaluation.
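Purely as an illustration of what providing "the right help only when needed" might look like, the following Python sketch escalates from no warning, to a visual symbol on the head-up display, to a combined visual and acoustic alert as the time-to-collision shrinks. The function names, thresholds and the visibility adjustment are hypothetical and are not taken from the EUCLIDE specification.

```python
from enum import Enum

class WarningLevel(Enum):
    NONE = 0             # nothing shown, to avoid information overload
    VISUAL = 1           # obstacle symbol on the head-up display
    VISUAL_ACOUSTIC = 2  # symbol plus acoustic alert for imminent danger

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Time until the gap to the obstacle closes; infinite if the gap is opening."""
    return distance_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

def select_warning(distance_m: float,
                   closing_speed_mps: float,
                   visibility_reduced: bool,
                   caution_ttc_s: float = 4.0,
                   alarm_ttc_s: float = 2.0) -> WarningLevel:
    """Pick a warning level from the time-to-collision; thresholds are illustrative.

    In reduced visibility the caution threshold is raised, reflecting the idea of
    giving help earlier when the driver cannot see the obstacle directly.
    """
    ttc = time_to_collision(distance_m, closing_speed_mps)
    caution_s = caution_ttc_s * (1.5 if visibility_reduced else 1.0)
    if ttc <= alarm_ttc_s:
        return WarningLevel.VISUAL_ACOUSTIC
    if ttc <= caution_s:
        return WarningLevel.VISUAL
    return WarningLevel.NONE
```

For example, an obstacle 60 m ahead closed at 20 m/s gives a time-to-collision of 3 s: below the caution threshold even in clear conditions, so the visual symbol would be shown, while the acoustic alert would be reserved for closer or faster-closing targets.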
The technological step forward is to be found both in the definition of driver warning strategies and in the design and development of a head-up display based human machine interface, to be applied both to images and symbols, with the following advantages:

- Simplification of the set-up and electronic control, achieved by avoiding the use of two optical elements for image projection and magnification (such a set-up would be critical especially in terms of on-vehicle integration and tolerances).
- System component cost reduction.
- Component miniaturisation.
- Use of mass-production components for cost effectiveness, to allow a reasonable time to market.
- Usability for displaying essential on-vehicle information.
- An open on-vehicle architecture, applicable to essential on-vehicle information, like vehicle failure warnings, and to other on-vehicle warning support systems.
- Optimisation in terms of binocular vision adaptability.

5 Conclusions

The EUCLIDE Consortium aims to merge a microwave radar and a far infrared sensor, initially developed for and widely used in aerospace applications and later adapted to the automotive environment, achieving the required functionality improvements (in terms of obstacle recognition and localisation, rain/fog droplet filtering and user interface optimisation) to support car drivers under all types of adverse visibility scenarios (i.e. heavy rain, fog, night). Although the state of the art emanates from the aerospace field, the automotive environment poses much higher requirements on obstacle localisation precision and user interface design. Therefore, extensive sensor improvement, data fusion and user interface re-design need to take place. The sensors thus developed will be installed on a highly dynamic driving simulator (of DaimlerChrysler in Berlin) and on two cars (a city car of Fiat and a luxury car of Volvo). A series of simulator and test-track tests will be performed, estimating the new system's reliability, efficiency, usability and user acceptance under various parameters, such as type of road, traffic conditions, road illumination, meteorological conditions, light conditions and vehicle speed range. The results of the project may, however, be fed back to the aerospace sector, since the merging of such sensors is currently the state of the art within that application field too, leading to aerospace systems that are more accurate and easier for pilots to operate.

6 References

1. Bild der Wissenschaft, 2/2000
2. Promote Road Safety in Europe, COM(97) 131 fin.
3. Rapport Annuel: Sécurité routière 1996. (Institut Belge pour la Sécurité Routière)
4. Federal Statistics Institute, Offenbach, Germany, 1996
5. Great Britain - Transport Statistics Report. (Department of Transport, 1996)
6. Northern Ireland - Road Traffic Accident Statistics: Annual Report, 1996. (Central Statistics Unit)
7. Sweden - Swedish National Road Administration, 1996