Autonomous Navigation of Electric Vehicles
By Giridhar S. Chavan

Design Thinking

"Design is not what it looks like or feels like. Design is how it works." – Steve Jobs

(Design thinking material adapted from "Design Thinking in 10 Minutes" by Ken Baldauf, FSU Innovation Hub, www.innovation.fsu.edu)

Design thinking rests on four pillars:
• Empathize with people's needs,
• Collaborate with others across disciplines, skill sets, and perspectives,
• Include every idea in visible form for evaluation, and
• Repeat, iterating and testing solutions to perfect them, always with human needs at the center.

The design thinking process moves from a problem, framed as a design challenge, through a problem space and a solution space to a design solution.

Different organizations brand the design thinking steps differently:
• Design Thinkers Group: Empathize | (Re)frame | Ideate | Prototype | Test
• Stanford d.school: Empathize | Define | Ideate | Prototype | Test
• Ideo: Gather | Generate | Make | Share
• Ideo v2.0: Inspiration | Ideation | Implementation
• Luma Institute: Looking | Understanding | Making
• IBM: Observe | Reflect | Make
• FSU Innovation Hub: Empathize | Ideate | Build

Problem Statement
How might we improve the grocery shopping experience in a manner that positively impacts people and the environment?

The power of design thinking can unlock a new e-mobility ecosystem for environmental gain.

How will understanding the need for collaborative design impact your strategy now?
The IPCC (Intergovernmental Panel on Climate Change) estimates that traffic accounts for 24% of carbon emissions worldwide today. Think about that. By redesigning the way we move people and goods from A to B on a macro scale, we can set the world on a path to reducing almost one quarter of global emissions.

Does building an e-mobility ecosystem mean more than consumer vehicles alone?
Yes. It means careful consideration of geographic location, population density and human traffic, so that the ecosystem scales appropriately and creates environments that promote convenience and enjoyable travel experiences. Lifestyle considerations matter too: bike racks, easy on-and-off entrances, entertainment systems, and smart charging ports for all users, as well as demographic design considerations. Are we designing for improved mobility, safety and assurance for elderly travelers as well as for young families?

What will moving beyond design thinking mean for you?
For companies seeking investment opportunities, consider initiating outreach to the wide diversity of public transportation bodies, electrification initiatives and private-sector startups working to effect change. Consider where you can fast-track the game-changing ideas and technologies that will begin to super-charge the e-mobility ecosystem that is coming.

Summary
There is opportunity for EV manufacturers, battery producers and utilities embarking on charging-station network construction to align and improve charging deliverables and pricing.
And at the macro level, there is opportunity for every transit mode to design more efficient ways for greener, data-driven vehicles and transit systems to align and intersect for 21st-century consumer travel. This is an enormous, complex and multifaceted challenge, and a true watershed moment to disengage from the models of the past and reinvent a more efficient, interconnected and greener future.

Autonomous vehicle sensors
Autonomous vehicles would be impossible without sensors: they allow the vehicle to see and sense everything on the road and to collect the information needed to drive safely. This information is processed and analyzed to build a path from point A to point B and to send the appropriate instructions to the car's controls, such as steering, acceleration, and braking. Most of today's automotive manufacturers use three types of sensors in autonomous vehicles: cameras, radars, and lidars.

The information collected by these sensors, including the actual path ahead, traffic jams, and any obstacles on the road, can also be shared between cars connected through M2M technology. This is called vehicle-to-vehicle (V2V) communication, and it can be an incredibly helpful resource for driving automation.

What is M2M communication?
We live in a world with more internet of things (IoT) devices than people. These devices are interconnected through wired and wireless systems, which allows them to exchange information with no or minimal human assistance. This is called machine-to-machine (M2M) communication.

Lidar
Lidar (Light Detection and Ranging) sensors work much like radar systems, except that they use laser light instead of radio waves. Apart from measuring the distances to objects on the road, lidar can create 3D images of the detected objects and map the surroundings. Moreover, lidar can be configured to build a full 360-degree map around the vehicle rather than relying on a narrow field of view. These two advantages have led autonomous vehicle manufacturers such as Google, Uber, and Toyota to choose lidar systems for their vehicles.

Automotive Radar
Automotive radars detect the speed and range of objects in the vicinity of the car. An automotive radar consists of a transmitter and a receiver: the transmitter sends out radio waves that hit an object and bounce back to the receiver, from which the object's distance, speed and direction are determined. Automotive radar sensors fall into two categories:

Short-range radar (SRR): Short-range radars use the 24 GHz band and serve short-range applications such as blind-spot detection, parking assistance, obstacle detection and collision avoidance. These radars need a steerable antenna with a large scanning angle, creating a wide field of view.

Long-range radar (LRR): Long-range radars use the 77 GHz band (76–81 GHz) and provide better accuracy and better resolution in a smaller package. They are used to measure the distance to and speed of other vehicles and to detect objects within a wider field of view, e.g. for cross-traffic alert systems.
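To make the sensor geometry above concrete, here is a minimal Python sketch, under illustrative assumptions, of the basic arithmetic involved: a radar turns a round-trip echo delay into range and a Doppler frequency shift into relative speed, and a lidar's polar (range, azimuth) measurements map to Cartesian points for a 360-degree map. The function names and numeric values are hypothetical, not taken from any production system.

import math

C = 3.0e8  # speed of light, m/s

def radar_range(round_trip_time_s: float) -> float:
    """Range from echo delay: the wave travels out and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def radar_relative_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative speed from the Doppler shift, assuming a 77 GHz long-range radar."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

def lidar_point(range_m: float, azimuth_deg: float) -> tuple:
    """One 2D point of a 360-degree lidar scan, in the vehicle's frame."""
    a = math.radians(azimuth_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

# Illustrative numbers: an echo returning after 1 microsecond is 150 m away;
# a ~5.13 kHz Doppler shift at 77 GHz corresponds to ~10 m/s closing speed.
print(radar_range(1e-6))             # 150.0
print(radar_relative_speed(5133.0))  # ~10.0
print(lidar_point(20.0, 90.0))       # ~(0.0, 20.0)

Note the factor of two in both radar formulas: it appears because the transmitted wave covers the sensor-to-object distance twice, once out and once back.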
Long-range applications need directive antennas that provide a higher resolution within a more limited scanning range. LRR systems provide ranges of 80 m to 200 m or greater.

How is image processing used in an autonomous car?
The images captured by the autonomous car's cameras are processed by the proposed system, which is used to control the vehicle. Canny edge detection is applied to the captured image to detect edges, and the Hough transform is then used to detect and mark the lanes immediately to the left and right of the car (see the sketch following this section).

Developers of self-driving cars use vast amounts of data from image recognition systems, along with machine learning and neural networks, to build systems that can drive autonomously. The neural networks identify patterns in the data, which are then fed to the machine learning algorithms.

Autonomous vehicles sense their surroundings using devices mounted on the car, such as lidar, radar, GPS and cameras. Advanced control systems (artificial intelligence) interpret this sensory information to identify appropriate navigation paths, as well as obstacles and relevant signs. Autonomous vehicles are capable of updating their maps based on sensory input, allowing them to keep track of their position even when conditions change or when they enter uncharted environments.

LIDAR
Lidar builds a 3D map and allows the car to "see" potential hazards by bouncing a laser beam off the surfaces surrounding the car to accurately determine the distance and the profile of each object. It is mounted on top of the car on a rotating motor.

RADAR
Radar can accurately monitor the speed of surrounding vehicles in real time. Mounted on the bumpers, with two sensors in the front bumper and two in the rear, the radar units allow the car to avoid impact by sending a signal to the on-board processor. Radar works in conjunction with other features on the car, such as the inertial measurement units.

SONAR AND HIGH-POWERED CAMERAS
Sonar has a narrow field of view and a relatively short effective range (about 6 meters), but it allows the car to cross-reference data from the other systems in real time. Cameras are mounted to the exterior with a slight separation so that they give an overlapping view of the car's surroundings, much like the human eye provides overlapping images to the brain before determining things like depth of field, peripheral movement, and the dimensionality of objects. Each camera has a 50-degree field of view and is accurate to about 30 meters.

POSITIONING AND SOFTWARE
The positioning system works alongside the on-board cameras to process real-world information, as well as GPS data and driving speed, to determine the precise position of each vehicle down to a few centimetres, all while making smart corrections for things like traffic, road construction, and accidents. The software processes all of this data in real time and models the behavioural dynamics of other drivers, pedestrians, and surrounding objects. While some behaviour is hard-coded into the car, such as stopping at red lights, other responses are learned from previous driving experiences. Every mile driven by each car is logged, and this data is processed in an attempt to find solutions to every applicable situation. The learning algorithm processes data not just from the car you are riding in, but from other cars as well, in order to find an appropriate response to each possible problem.
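As a concrete illustration of the Canny-plus-Hough lane-detection step described above, here is a minimal OpenCV sketch in Python. The threshold values, region-of-interest geometry and file names are illustrative assumptions, not values taken from the original text.

import cv2
import numpy as np

def detect_lane_lines(bgr_frame):
    """Mark lane lines using Canny edge detection + a probabilistic Hough transform."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)  # illustrative thresholds

    # Keep only a triangular region in front of the car (illustrative geometry).
    h, w = edges.shape
    mask = np.zeros_like(edges)
    roi = np.array([[(0, h), (w // 2, h // 2), (w, h)]], dtype=np.int32)
    cv2.fillPoly(mask, roi, 255)
    edges = cv2.bitwise_and(edges, mask)

    # Probabilistic Hough transform: find straight line segments among edge pixels.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(bgr_frame, (x1, y1), (x2, y2), (0, 255, 0), 3)
    return bgr_frame

# Hypothetical usage with a single dashcam frame:
# frame = cv2.imread("road.jpg")
# cv2.imwrite("road_lanes.jpg", detect_lane_lines(frame))

The region-of-interest mask reflects a common design choice: lane markings of interest lie in the lower half of the frame in front of the car, so discarding the rest reduces false Hough detections from scenery and signage.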