Implementing Codesign in Xilinx Virtex II Pro
Betim Çiço, Hergys Rexha
Department of Informatics Engineering, Faculty of Information Technologies, Polytechnic University of Tirana

Outline
- Introduction
- The main algorithm proposed
- Spotlights
- Static lights
- Light pairs
- System implementation
- Results
- Conclusions

Introduction
- The future of automotive systems? => video-based driver assistance.
- Complex algorithms are required, and they need to be executed faster and faster.
- One solution is dedicated hardware; its drawback is the lack of flexibility.
- Today's driver-assistance systems offer features such as lane departure warning or cruise control, using cameras and radar sensors.

Introduction (cont'd)
- But we want more: the proposed vision-based concept should also help in more complex situations, such as urban traffic.
- Separation into repetitive operations and high-level application code:
  - Repetitive operations are accelerated by coprocessors.
  - High-level application code is implemented fully programmable on standard CPU cores.
- Achievements: fast execution and flexibility.

The main algorithm
- An algorithm to detect cars by their taillights.
- Scenario: tunnel or night-time driving on roads with separated lanes for each direction.
- Properties of the algorithm:
  - It searches for light pairs that stem from cars.
  - It compares the properties of car lights with those of the tunnel lights.
  - It searches for the illumination of a license plate.
- We demonstrate that the best way to implement this algorithm is to divide it into a hardware part and a software part.

Cont'd
- The image is recorded by a video camera (25 grayscale frames per second).
- The image is then processed by the subsequent engines and software parts.
- The result is car identification by taillights.

Basic Schematic

Schematic Summary
- Hardware-implemented parts:
  - PlateSearchEngine (license plate of the car)
  - Spotlight Engine
  - Labeling Engine
- Software-implemented parts:
  - Programmable software that performs several actions, as shown in the scheme.

Spotlight Engine
- From the camera, each frame goes to the Spotlight Engine and to the PlateSearch Engine.
- Depending on a threshold, the Spotlight Engine produces a binary image.

Spotlights
- Bright regions in the image that are ideally round and surrounded by dark pixels.
- Taillights of cars typically appear as spotlights.
- The proposed algorithm applies a simple shape filter to the image.
- We define two pixel sets relative to the current pixel (CP): PS and PF.
- If all pixels in PF are darker than all pixels in PS, the CP is a spotlight pixel:
  lum(PS) > lum(PF) + threshold
  (a minimal C sketch of this test is given after the Static lights slide).

Labeling Engine
- Takes as input the binary image produced by the Spotlight Engine.
- Searches the binary image for regions of connected white pixels.
- Creates a label for each region.
- Outputs a list of labels; each label corresponds to one spotlight.

Static lights
- We now have a list of extracted spotlights, with the following properties for each spot:
  - Bounding rectangle
  - Position in the image (coordinates of the center of the bounding rectangle)
  - Total brightness
  - Number of pixels
- Static lights are lights that do not move relative to the road.
- The direction of a light's motion vector can be used to determine static lights.
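The following is a minimal C sketch of the spotlight shape filter described on the Spotlights slide. The exact shapes of the pixel sets PS and PF are not specified on the slides, so the inner block and the outer ring used below, as well as all function and parameter names, are illustrative assumptions.

```c
/* Spotlight shape filter sketch: 8-bit grayscale frame, row-major order.
 * PS is assumed to be the 3x3 block around the current pixel (CP) and PF
 * the surrounding ring at Chebyshev distance 3; both shapes are assumptions. */
#include <stdint.h>
#include <stdlib.h>

#define RADIUS_INNER 1   /* PS: pixels within this distance of CP   */
#define RADIUS_OUTER 3   /* PF: pixels on the ring at this distance */

/* CP is a spotlight pixel if every pixel of PS is brighter than every pixel
 * of PF by more than 'threshold', i.e. min(PS) > max(PF) + threshold
 * (the slide's condition lum(PS) > lum(PF) + threshold, checked pairwise). */
static int is_spotlight_pixel(const uint8_t *img, int width, int height,
                              int x, int y, int threshold)
{
    if (x < RADIUS_OUTER || y < RADIUS_OUTER ||
        x >= width - RADIUS_OUTER || y >= height - RADIUS_OUTER)
        return 0;                          /* skip the image border */

    int min_ps = 255, max_pf = 0;
    for (int dy = -RADIUS_OUTER; dy <= RADIUS_OUTER; dy++) {
        for (int dx = -RADIUS_OUTER; dx <= RADIUS_OUTER; dx++) {
            int v = img[(y + dy) * width + (x + dx)];
            int d = abs(dx) > abs(dy) ? abs(dx) : abs(dy);  /* Chebyshev distance */
            if (d <= RADIUS_INNER && v < min_ps)
                min_ps = v;                /* darkest pixel of PS   */
            else if (d == RADIUS_OUTER && v > max_pf)
                max_pf = v;                /* brightest pixel of PF */
        }
    }
    return min_ps > max_pf + threshold;
}

/* Produces the binary image handed to the Labeling Engine:
 * 255 for spotlight pixels, 0 otherwise. */
void spotlight_filter(const uint8_t *img, uint8_t *bin,
                      int width, int height, int threshold)
{
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            bin[y * width + x] =
                is_spotlight_pixel(img, width, height, x, y, threshold) ? 255 : 0;
}
```

The resulting binary image is what the Labeling Engine scans for regions of connected white pixels.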
Static lights (cont'd)
- For each light in the current frame, a close-by light is searched for in the previous frame.

Finding light pairs
- We have to consider:
  - Distance between the lights, y-component: both taillights of a car are expected to be at the same height on a flat road.
  - Distance between the lights, x-component (a vehicle that is far away from the camera appears nearer to the top of the image than a closer vehicle).
  - Existence of an additional light in between the pair: otherwise the two outer lights of four spotlights could be considered as one candidate pair.
- Finally, including the continuity vector, we come up with the best candidates for light pairs (a C sketch of these pair checks is given at the end of this document).

System Implementation
- The system was implemented on a Xilinx Virtex-II Pro FPGA.
- This FPGA also features two embedded 300 MHz PowerPC CPU cores, one of which was used to run an embedded Linux operating system.

Results
- For the Spotlight Engine: a 7x increase in speed compared to a Pentium 4 processor.
- A 173x increase in speed compared to a PowerPC processor.

Conclusions
- As this work has shown, today's video-based driver-assistance systems need to be both real-time and flexible.
- We used a new type of design in which hardware and software cooperate to provide both real-time performance and flexibility.
- What is more important: "By implementing hardware coprocessors we let repetitive tasks be handled by hardware engines, while the software part takes care of the interface between parts of the system and of some computations on a small amount of data."
- With codesign we relieve the processor of load during the computations.

Thank you...
bcico@icc-al.org, hrexha@gmail.com
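Backup: a minimal C sketch of the pair checks from the "Finding light pairs" slide (same height, plausible horizontal spacing, no additional light in between). The Spot record mirrors the spotlight properties listed on the Static lights slide; the tolerance values are illustrative assumptions, and the continuity-vector step is omitted.

```c
/* Light-pair candidate test sketch. The Spot fields follow the spotlight
 * properties on the Static lights slide; MAX_DY, MIN_DX and MAX_DX are
 * assumed tolerances, not values taken from the slides. */
#include <stdlib.h>

typedef struct {
    int  cx, cy;        /* center of the bounding rectangle (pixels) */
    int  w, h;          /* size of the bounding rectangle            */
    long brightness;    /* total brightness of the spotlight         */
    int  pixels;        /* number of pixels in the region            */
} Spot;

#define MAX_DY   4      /* taillights of one car sit at roughly the same height */
#define MIN_DX  10      /* plausible horizontal spacing ...                     */
#define MAX_DX 200      /* ... of a taillight pair                              */

/* Returns 1 if spots a and b form a candidate taillight pair; a and b are
 * assumed to point into the 'spots' array produced by the Labeling Engine. */
int is_candidate_pair(const Spot *a, const Spot *b,
                      const Spot *spots, int n_spots)
{
    if (abs(a->cy - b->cy) > MAX_DY)                /* not on the same height */
        return 0;

    int dx = abs(a->cx - b->cx);
    if (dx < MIN_DX || dx > MAX_DX)                 /* spacing not plausible  */
        return 0;

    int left  = a->cx < b->cx ? a->cx : b->cx;
    int right = a->cx < b->cx ? b->cx : a->cx;

    for (int i = 0; i < n_spots; i++) {             /* reject the two outer     */
        const Spot *s = &spots[i];                  /* lights of four spotlights */
        if (s == a || s == b)
            continue;
        if (s->cx > left && s->cx < right && abs(s->cy - a->cy) <= MAX_DY)
            return 0;                               /* additional light in between */
    }
    return 1;
}
```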