Middle East Technical University Northern Cyprus Campus
Electrical and Electronics Engineering Program
EEE 493 Engineering Design I
B-BIT Electronics
Critical Design Review Report
Vision Based Person Detection and Following Robot
2243756 Basri ACIR
2244044 Berk ÖKTEN
2316024 Ilgar Şahin KOÇAK
2243889 Tuğberk Özden ERGONCA
Supervised by
Asst. Prof. Dr. Cem DİREKOĞLU
Table of Contents
A. Executive Summary
B. Introduction
C. Overall System Description
C.1-Overall Block Diagram of Project Design
C.2-Maneuvering Algorithm
C.3-Wi-Fi Communication and Data Transfer
C.4-3D Drawings of the Project
D. Requirements and Compliance
E. Design Modifications
E.1-Hardware Modifications
E.2-Software Modifications
F. Compatibility Analysis of Subblocks
F.1-Decision Maker Unit
G. Test Procedures and Assessment of Test Results
G.1-Maneuvering Algorithm Tests
G.2-HC-SR04 Ultrasonic Sensor
G.3-ESP32-CAM Test
G.4-Convolutional Neural Network Training and Validation Results
H. Resource Management
H.1-Cost Analysis and Expected Weight and Size
H.2-Power Distribution
H.2.1-Motors
H.2.2-ESP32-CAM
H.2.3-Arduino Uno (ATmega328P Microcontroller)
H.2.4-HC-SR04 Ultrasonic Sensor
H.2.5-Gantt Chart
I. Conclusion
J. References
Figure 1 System block diagram
Figure 2 Flowchart of maneuvering algorithm
Figure 3 Communication flowchart
Figure 4 Component Connections
Figure 5 L293D Based Arduino Motor Shield [3]
Figure 6 Schematic of Arduino and motor shield
Figure 7 Block diagram of the decision making unit
Figure 8 Test code for maneuvering
Figure 9 Schematic of Arduino and Ultrasonic Sensor
Figure 10 Specifications of HC-SR04 Ultrasonic Sensor [4]
Figure 11 Code segment of libraries
Figure 12 Code segment of SSID and password
Figure 13 ESP32-CAM wireless communication and data transfer test
Figure 14 First try: poor accuracy and high loss
Figure 15 Intermediate phase of the training
Figure 16 Final result of the training
Figure 17 Total weight and size of the robot
Figure 18 List of equipment and corresponding prices
Figure 19 Electrical Specifications of Motors [5]
Table 1 Communication comparison [1] [2]
Table 2 Wi-Fi comparison [1] [2]
Table 3 MCU Comparison [2] [1]
Table 4 Arduino Uno Power Consumption
Table 5 Total Power Consumption
A. Executive Summary
In the current era, technological improvements in medicine have increased the average human lifespan. These developments, however, bring additional caregiving responsibilities. Whether old or not, people living alone can be exposed to accidents that might result in severe injury or even death. To lighten the burden on those monitoring elderly people or patients, B-BIT Electronics introduces the Vision Based Person Detection and Following Robot project. The project aims to follow the person of interest and respond to unusual events in the environment. The robot will communicate with the hospital when necessary and produce a warning sound to attract people nearby. One of the main targets in this project's development is to recognize a predetermined person's body contour. A camera mounted on the body, combined with the motion system, allows the robot to track the person of interest. To produce a robust robot, one must be proficient in the following fields: robot body prototype design, power system and battery charger design, microcontroller and ultrasonic sensor programming, computer vision algorithm development, and wireless network communication. All B-BIT Electronics team members have been educated in these areas at the Middle East Technical University and have applied this theoretical knowledge during numerous internships. Besides theoretical and applied knowledge, every project entails a cost related to its scope. The total projected cost is approximately 305₺ (2020) for all expenses and is detailed in Figure 18, List of equipment and corresponding prices. The allocated time period is set at around one month to finish the project successfully, and the schedule details are shown in the Gantt chart (Section H.2.5).
B. Introduction
Today, the average human lifespan has increased to 75 years due to advancing medical technology. The social behavior of society has also changed in large parts of the world compared to the past. Most elderly people live separately from their families, and they need monitoring because of age-related health problems such as Alzheimer's disease and heart disease. Caregivers cannot monitor patients or elders 24/7. Fatal accidents may occur when the caregiver is not available, for example if an elderly person falls down or has a heart attack, possibly resulting in the death of the elderly person or patient. With the project our company is developing, when a dangerous situation is detected, an ambulance will be requested to the scene located by GPS, and the patient or elderly person will be transported. What this project brings is that this detection and communication with hospitals is done autonomously. In this way, there will be no delay in calling an ambulance even if no one else is near the elderly person at that time.
Another feature of our company's project is that, in case immediate first aid or medication is required, the robot draws nearby people's attention with a buzzer. The sound will spread through the environment to increase the chance of the elderly person being found, so that the drugs or first aid needed can be applied by the people around. The purpose stated in this project document is for the robot to distinguish the selected person from other people, follow them continuously, and perform the determined routines. First, the overall system is described. Next, the requirements, design modifications, and subsystems of the project are explained. Then the test procedures and assessment of test results are presented. Finally, the report concludes with resource management.
C. Overall System Description
C.1-Overall Block Diagram of Project Design
Figure 1 System block diagram
The system proposed in this project consists of several subsystems: battery modules, the image processing unit, sensor algorithms, wireless network communication, data transfer, and the Arduino connections. The general working principle of this system starts with the ESP32-CAM. The captured video is transmitted from the ESP32 to the laptop, where it goes through an image processing procedure written in OpenCV. The resulting data, the coordinates of the target, is then sent back to the ESP32 and forwarded serially to the Arduino UNO, which adjusts the direction of the robot so that it turns whenever the desired target turns. The ultrasonic sensor measures the distance between the target and the robot. To keep a certain distance, the distance data is sent to the Arduino UNO, and the speed of the motors is adjusted according to this data.
C.2-Maneuvering Algorithm
Figure 2 Flowchart of maneuvering algorithm
The processor of the robot decides the maneuvering behavior according to the data coming from the image processing unit and the sensors. Figure 2 above represents the straightforward decision making steps. Details of communication with the other subsystems are given under the Compatibility Analysis of Subblocks.
C.3-Wi-Fi communication and data transfer
Figure 3 Communication flowchart
The figure above shows a flowchart of the communication system. The ESP32-CAM obtains an IP address after connecting to the Wi-Fi network. This IP address is entered into the browser at the ground station, where it shows the GUI created for this project in PyQt5. A connection request is then sent to the ESP32-CAM. After the ESP32-CAM responds, a web socket is established between the ground station and the ESP32-CAM for communication. The ESP32-CAM sends the images it receives from the OV2640 camera to the computer via the web socket. The images go through image processing at the ground station, and the coordinate information that determines the motion directions of the DC motors is transmitted serially to the Arduino UNO by the ESP32-CAM. If manual operation of the robot is desired, the GUI in the browser offers buttons for its directions of movement.
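The connection sequence above (IP announcement, connection request, then frame transfer over the web socket) can be illustrated with a minimal state sketch; the class, IP address, and message contents are hypothetical and only model the state flow, not the real ESP32-CAM firmware.

```python
# Illustrative state sketch of the Figure 3 handshake. The names and the
# placeholder frame bytes are assumptions, not the actual protocol.

class Esp32CamLink:
    def __init__(self, ip):
        self.ip = ip            # IP address announced by the module
        self.state = "IDLE"

    def request_connection(self):
        # Ground station sends a connection request; module responds,
        # after which the web socket is considered established.
        if self.state == "IDLE":
            self.state = "CONNECTED"
        return self.state

    def receive_frame(self):
        # Frames flow only once the web socket exists.
        if self.state != "CONNECTED":
            raise RuntimeError("web socket not established")
        return b"JPEG_FRAME"    # placeholder for an OV2640 JPEG frame

link = Esp32CamLink("192.168.1.42")   # example IP, for illustration only
link.request_connection()
frame = link.receive_frame()
```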
C.4-3D Drawings of The Project
D. Requirements and Compliance
This project is an autonomous person recognition and tracking robot based on image processing and machine vision algorithms. The robot closely follows the relevant person introduced to it, always keeping its distance constant. Image processing is used for face and upper body recognition so that the relevant person is not lost in a crowd. Sensor and camera feedback ensure that the system does not lose the person concerned and always maintains the distance between them. Python and OpenCV were preferred for the machine vision algorithms and the person recognition subsystem. Ultrasonic sensors were deemed appropriate for the person tracking and movement subsystem. The data received from the sensor is continuously transmitted to the Arduino, processed by the code running on it, and used to control the speed of the DC motors so that the distance between the relevant person and the robot stays constant. The ESP32-CAM module was chosen for the wireless communication and data transfer subsystem. This module comes with a camera that captures the images to be processed. The ground station and the ESP32-CAM communicate with each other over Wi-Fi and transfer data through the web socket. Using image processing, the movement direction of the relevant person is determined; then, so as not to lose the relevant person, the target's direction information is sent to the Arduino via a serial connection. This transmitted information determines the motion directions and speeds of the DC motors according to the code in the Arduino. Finally, for the robot prototype design and power subsystem, a four-wheeled car model was chosen to prevent the robot body from being blocked while following the relevant person. For the ultrasonic sensor and camera to work with maximum efficiency, they are positioned at the front of the robot body and kept high with the help of a holder. For the battery charger system, the battery beds hold 18650 Li-Ion batteries, which are planned to be charged in an external charger.
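The distance-keeping behavior described above can be sketched as a simple proportional controller: the further the person is beyond the desired distance, the faster the motors run. The gain, desired distance, and PWM limits below are illustrative assumptions, not the values used on the robot.

```python
# Hedged sketch of distance-keeping speed control. Gain and limits are
# assumptions for illustration; the real Arduino code may differ.

DESIRED_CM = 50     # assumed follow distance
KP = 5              # assumed gain: PWM counts per cm of error
PWM_MAX = 255       # 8-bit PWM range of the Arduino

def speed_from_distance(measured_cm):
    """Proportional speed command, clamped to [0, PWM_MAX]."""
    error = measured_cm - DESIRED_CM
    pwm = KP * error
    return max(0, min(PWM_MAX, pwm))   # never reverse, never exceed PWM_MAX
```

At the desired distance the command is zero, so the robot stops rather than pushing into the person.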
For the communication module, two choices were evaluated: Bluetooth and Wi-Fi. To make the final decision on the wireless communication module to be used in the project, the two options are compared in the table below.
Table 1 Communication comparison [1] [2]
After Wi-Fi was chosen, the protocol used by the module was selected as 802.11n, considering the information in the table below.
Table 2 Wifi comparison [1] [2]
As can be seen in the table above, although Wi-Fi is at a disadvantage in cost and power consumption compared to Bluetooth, it has a clear advantage in the other areas. In addition, the features and effective ranges of the IEEE Wi-Fi protocols are given in the table. After these comparisons, the wireless communication module was determined as Wi-Fi. A search for a suitable Wi-Fi module yielded two options: the ESP32-CAM and the ESP8266. Since the ESP8266 is only a Wi-Fi module, an extra camera module is required. Together with the Raspberry Pi V1.3 camera module, the most suitable camera option, it becomes more expensive than the ESP32-CAM. Low cost, the biggest advantage of the ESP8266 over the ESP32-CAM, is therefore lost. The ESP32-CAM gains a clear advantage with its ESP32-S Wi-Fi module, which is of better quality than the ESP8266, and the OV2640 camera module that comes with it. The table below compares the ESP8266 and the ESP32-S Wi-Fi module.
Table 3 MCU Comparison [2] [1]
The selected Wi-Fi module will operate in station mode, connect to the Wi-Fi network set up by an access point, and use one of the 802.11 b/g/n local area network protocols, depending on the situation. In this project, 802.11n will be used. Almost all computers today use the 802.11ac protocol. 802.11ac is backward compatible with 802.11b, g, and n, as the n, g, and b protocols are in turn backward compatible. Therefore, there will be no communication failure between the Wi-Fi module and the targeted ground station.
For the robot body kit there were two options: a 2WD robot car kit and a 4WD robot car kit. Because the purpose of the robot is to follow the relevant person at all times, the body must be stable enough to capture optimal images for image processing and to not lose track of the relevant person; it must not bump around. As a result, the 4WD robot car kit was selected.
For machine vision, MATLAB provides more robust design tools such as the Deep Learning Network Designer toolbox. In this toolbox, the user can create a CNN and arrange the internal layer options, including the convolutional filters and the output classification. Creating the data set is also performed in MATLAB: RGB frames are read from a file and converted to grayscale, then resized to 227x227 and recorded into a class folder. These built-in GUIs are a better option for CNN design than writing complicated Python code with OpenCV.
To produce a robust project, an organization must make use of specific components and simulation software.
Software:
● Python
● Micro C
● Proteus
● MATLAB
● Multisim
● OpenCV
● Arduino IDE
● PyQt5
Hardware: Some essential pieces are needed to make the project work flawlessly with the implemented components.
● Arduino UNO R3
● L293D Based Arduino Motor Driver Shield
● DC Motor
● HC-SR04 Ultrasonic Sensor
● ESP32-CAM
● Lithium-Ion Battery
E. Design Modifications
E.1-Hardware Modifications
While the parts were being tested in the hardware assembly phase, the battery selection was initially a 9V alkaline battery. As the 9V alkaline battery could not provide a stable current for the Arduino and the ESP32-CAM, random disconnections were experienced after the connections were made. As a solution, changing the battery was deemed appropriate, and Lithium-Ion batteries were tried instead. Initially, a 3S Li-Ion battery was built, supplying 11.1V to the Arduino and distributing the system's power requirement through it. Later, to prevent Arduino malfunctions that could be caused by the instantaneous overcurrent of the four DC motors connected to the Arduino shield, it was decided to feed the shield from a separate battery through its external power input. Following this, two 2S 7.4V Li-Ion packs were built; the first was connected to the jack input of the Arduino and the other to the external power input of the Arduino shield. A switch was placed at the input of the shield's battery so that its power can be cut.
Figure 4 Component Connections
Motors and drivers are inseparable parts of many robotics and electronics projects and come in different types depending on the application. We employ drivers to control the motors, since motors cannot be connected directly to microcontrollers or boards like the Arduino: they may need more current than a microcontroller can drive. In this project, we use the L293D Arduino motor shield, a driver board based on the L293 IC that can drive 4 DC motors and 2 stepper or servo motors at the same time. We employ 4 DC motors for now, but servo motors can optionally be added to the circuit later.
Figure 5 L293D Based Arduino Motor Shield [3]
After gaining enough knowledge about the Arduino motor shield, it was hooked up to our Arduino in the Proteus environment. As the motor shield carries two L293D ICs, two basic L293D ICs were connected to the Arduino as a reference for the shield. As a power supply, the 7.4V Li-Ion battery is connected to VCC (the EXT_PWR terminal) on the motor shield. The DC motors are connected to the M1, M2, M3, and M4 terminals, shown as OUT1 and OUT2 in the schematic below. Pins 6 and 7 are connected to motors 1 and 2 (the left motors), and pins 4 and 5 are connected to motors 3 and 4 (the right motors). The EN1 and EN2 pins are always kept in the high state (5V); keeping these pins high is essential for speed control. VS and VSS are connected to the 7.4V and 5V supplies respectively, and the GND pins are grounded.
Figure 6 Schematic of Arduino and motor shield
To turn the rover right, the left motors are driven forward while the right motors are driven backward; to turn it left, the right motors are driven forward while the left motors are driven backward.
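This differential turning scheme can be summarized in a small sketch; the command names and the +1/0/-1 direction convention are assumptions for illustration, with each value standing for the drive direction of one side's motor pair.

```python
# Sketch of the turning scheme described above: to turn right, the left pair
# runs forward and the right pair backward, and vice versa. Command names and
# the +1/-1 convention are illustrative assumptions.

def wheel_directions(command):
    """Return (left_pair, right_pair) drive directions for a command."""
    table = {
        "FORWARD":    (+1, +1),
        "BACKWARD":   (-1, -1),
        "TURN_RIGHT": (+1, -1),   # left forward, right backward
        "TURN_LEFT":  (-1, +1),   # right forward, left backward
        "STOP":       (0, 0),
    }
    return table[command]
```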
E.2-Software Modifications
In the first phase of the project, there was no neural network for machine vision. After the test results of the initial face and body detection algorithm, it was decided to create a two-layer convolutional neural network (CNN) for person detection. The figure on the right shows the CNN design. This design takes 227x227-pixel grayscale pictures as input, then applies the two internal layers to classify the input image into 4 classes: the right side view, left side view, front view, and back view of the person followed by the robot. To train this CNN, 2000 pictures of one person were used as the data set, classified as front, back, left, and right, with each folder holding 500 pictures for its side view.
The CNN implementation includes two phases: training and validation. To train the CNN, the design was created with initial net weights, and then the 2000 pictures in 4 classes were used to train the layers. The training process and the accuracy of the CNN are covered in part G.
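As a sanity check on the 227x227 input size, the standard convolution output-size formula can be applied to each layer. The kernel, stride, and padding values below are illustrative assumptions only, since the report does not state the trained network's actual layer parameters.

```python
# Output size of a convolutional layer: out = (in - kernel + 2*pad) // stride + 1.
# The layer parameters below are assumptions for illustration, not the
# trained network's actual values.

def conv_out(size, kernel, stride=1, pad=0):
    """Spatial output size of a square convolution."""
    return (size - kernel + 2 * pad) // stride + 1

size = 227
size = conv_out(size, kernel=11, stride=4)        # assumed conv 1: 11x11, stride 4
size = conv_out(size, kernel=5, stride=1, pad=2)  # assumed conv 2: 5x5, pad 2
```

Checking sizes this way before training avoids shape mismatches when the final classification layer with 4 outputs is attached.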
F. Compatibility Analysis of Subblocks
F.1-Decision maker unit
The decision maker unit, the core of the robot, is responsible for linking the subsystems together and processing the data gathered from them. Final decisions about maneuvering, responding to events in the environment, and power management are performed in this unit autonomously. The vision of the robot is fed continuously to a ground station (in our case, a laptop for the prototype) and processed there. According to the input, the events in the environment, and the distance, the image processing unit produces specific signals to inform the local decision making unit. This unit makes use of these signals to maneuver the vehicle or produce the signals necessary to warn people nearby or the hospital directly. Thus, the microprocessor of the robot links the subsystems and makes the final decisions, and these subsystems come together to shape the system itself.
Figure 7 Block diagram of the decision making unit.
G. Test Procedures and Assessment of Test Results
G.1-Maneuvering Algorithm Tests
The functionality of the sensors and motors was tested separately before combining them. Once the functionality of each piece was proven, all of them were combined to set up the system. A small test code, given below, was implemented to test the behavior of the vehicle without the image processing unit.
Figure 8 Test code for maneuvering
According to the observations after the test, the vehicle's movements are not agile enough to follow a relatively fast target. Optimizations will be made so that it can catch up with faster targets. Adding a last-direction flag will also be beneficial: currently, when the robot loses its target, it circles around itself until the target appears again. If it instead records the last direction the target went, this will save time and power.
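The proposed last-direction flag can be sketched as follows: when the target is lost, the robot searches in the direction it was last seen instead of circling blindly. The command names are illustrative assumptions.

```python
# Sketch of the proposed last-direction flag. Command names are assumptions
# for illustration; the real firmware may use different signals.

class Tracker:
    def __init__(self):
        self.last_direction = "TURN_RIGHT"   # arbitrary default search direction

    def update(self, command):
        """Record the most recent turn so a lost target can be searched for."""
        if command in ("TURN_LEFT", "TURN_RIGHT"):
            self.last_direction = command
        return command

    def search(self):
        """Direction to rotate when the target is lost."""
        return self.last_direction
```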
G.2-HC-SR04 Ultrasonic Sensor
The HC-SR04 ultrasonic sensor measures distances with sound waves, as its name suggests. Using sensors compatible with the Arduino UNO is very straightforward: one only needs to make certain connections, and after that the sensors are ready to be coded.
Figure 9 Schematic of Arduino and Ultrasonic Sensor
The vision based person detection and following robot is equipped with a rangefinder ultrasonic sensor for its tracking system. The sensor gives distance information so that the robot can adjust its motion speed according to the chosen distance limit of the tracked person and keep a certain distance. The distance data is sent to the Arduino UNO.
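For reference, the conversion from the sensor's echo pulse to a distance can be sketched as below. The sensor reports the sound wave's round-trip time, so the distance is half the time multiplied by the speed of sound; a speed of 343 m/s at room temperature is assumed, mirroring the usual Arduino computation.

```python
# HC-SR04 distance computation: the echo pulse width is the round trip of the
# sound wave, so divide by two. 343 m/s at room temperature is assumed.

SPEED_OF_SOUND_CM_PER_US = 0.0343   # 343 m/s expressed in cm per microsecond

def echo_to_cm(echo_us):
    """Convert the echo pulse width (microseconds) to distance in cm."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2
```

On the Arduino side the same arithmetic is typically applied to the value returned by `pulseIn()` on the echo pin.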
The working principle of the ultrasonic sensor with the Arduino was tested. According to the data sent by the test pin, the output results were recorded to check whether the connections and communication were proper.
The specifications of the HC-SR04 ultrasonic sensor are also given below for customer information.
Figure 10 Specifications of HC-SR04 Ultrasonic Sensor [4]
G.3-ESP32-CAM test
The ESP32-CAM's functionality was simulated. The video streaming code and camera pins are given below.
Figure 11 Code segment of libraries
In the code segment in the figure above, the necessary libraries for the camera and the Wi-Fi connection are selected.
Figure 12 Code segment of SSID and password
In the code segment in the figure above, the SSID and password are entered and the server is established. After the code initiates, the module gives an IP address for video streaming, as the figure below shows. By entering that IP address into the laptop's internet browser, the image processing input data starts to be transferred to the laptop.
IP address generation
After the IP address is entered into the browser, communication between the ground station and the ESP32-CAM starts, and data transfer over the web socket initiates. The figures below show a test session after the main components were joined together. As the test continued, it was seen that the ESP32-CAM works fine after the battery change, and the random disconnections have ended.
Various resolution settings were tried, and the optimum resolution at which image processing can work well was found to be 400x296. Then the machine vision code, built using ready-made libraries, was used to test the capability of the camera in recognizing people. It was concluded that the camera is sufficient to recognize the face of the person concerned.
Figure 13 ESP32-CAM wireless communication and data transfer test
G.4-Convolutional Neural Network Training and Validation Results
This part shows the results of the CNN training process and the validation of the trained CNN. There are 4 epochs per training, and three trainings have been executed so far. These trainings are shown in the following figures. As can be seen from the pictures, the accuracy improves with every improvement to the CNN design.
Figure 14 First try: poor accuracy and high loss
Figure 15 Intermediate phase of the training
Figure 16 Final result of the training
H. Resource Management
H.1-Cost Analysis and Expected Weight and Size
Figure 17 Total weight and size of the robot
The actual cost of the project is very close to the estimated price, which is 303 TL.
Figure 18 List of equipment and corresponding prices
H.2-Power Distribution
H.2.1-Motors
The motor controller is run by the 7.4V battery, with 4 DC motors connected. Each motor draws approximately 140 mA at high speed, so the motor driver will draw around 560 mA from the battery.
Figure 19 Electrical Specifications of Motors [5]
H.2.2-ESP32-CAM
Turn off the flash lamp:180mA@5V;
Turn on the flash lamp and turn on the brightness to the maximum:310mA@5V
Deep-sleep: Minimum power consumption can be achieved 6mA@5V
Modern-sleep: Minimum up to 20mA@5V
Light-sleep: Minimum up to 6.7mA@5V
According to the rating of ESP32-CAM, the maximum current consumption is around
310 mA at 5V.
H.2.3-Arduino Uno (ATmega328P microcontroller)
The ATmega328P microcontroller draws about 5.2 mA, depending on the clock frequency and the supply voltage. The following table shows the typical and maximum current for different clock frequencies and supply voltages.
Table 4 Arduino Uno Power Consumption
According to the ratings of the ATmega328P, the maximum current consumption is around 14 mA at 5V.
H.2.4-HC-SR04 Ultrasonic Sensor
Power supply: +5V DC
Quiescent current: <2 mA
Working current: 15 mA
According to the rating of the HC-SR04 ultrasonic sensor, the working current is 15 mA.
Component                                            Maximum Current Rating
L293D Based Arduino Motor Driver Shield (Motors)     560 mA
HC-SR04 Ultrasonic Sensor                            15 mA
ESP32-CAM                                            310 mA
Arduino Uno R3 (ATmega328P Microprocessor)           14 mA
Total                                                ≈ 900 mA
Table 5 Total Power Consumption
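As a quick check of the budget, the component maxima listed in Table 5 can be summed in a couple of lines:

```python
# Summing the maximum current ratings from Table 5; the result rounds to the
# ~900 mA figure stated in the report.

ratings_ma = {
    "motor_shield": 560,   # 4 motors x ~140 mA each
    "hc_sr04": 15,
    "esp32_cam": 310,
    "arduino_uno": 14,
}

total_ma = sum(ratings_ma.values())   # 899 mA
```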
H.2.5-Gantt Chart
I. Conclusion
This project aims to save sick or elderly people living alone from dangerous situations with minimum damage, by providing the necessary precautions and calling an ambulance when their health is in danger. Thanks to this project, customers will not have to be observed by caregivers 24/7; providing 24/7 observation normally requires a minimum of three caregivers. With our company's project, the patient will be observed 24/7 for roughly a single caregiver's fee and will receive the necessary assistance when needed. The vision based person detection and following robot is equipped with a rangefinder ultrasonic sensor and a camera for its tracking system. According to the tracked person's chosen path, the robot will adjust its motion speed and direction using the sensor's information. Information processing is shared between the robot and the computer, with a Wi-Fi module providing the connection between them. The information gathered from the sensors is analyzed on the computer, and the appropriate instructions are transmitted to the robot and processed by the MCU. Necessary modifications to the robot design, such as the battery change and the image processing toolbox change, were made according to the requirements. Also, the software and hardware parts were updated according to the test and simulation results.
J. References
[1] "IEEE 802.11," [Online]. Available: https://en.wikipedia.org/wiki/IEEE_802.11#802.11b.
[2] "Bluetooth vs Wi-Fi," [Online]. Available: https://www.diffen.com/difference/Bluetooth_vs_Wifi.
[3] "L293D Motor Driver Shield for Arduino," [Online]. Available: https://mytectutor.com/l293d-motor-driver-shield-for-arduino/. [Accessed 5 April 2021].
[4] J. Elijah, "HC-SR04 Ultrasonic Sensor," 16 Nov 2014. [Online]. Available: https://datasheetspdf.com/pdf/1380136/ETC/HC-SR04/1.
[5] "Yellow Smart Car," [Online]. Available: https://shop.techmakers.com.my/yellowsmart-car-robot-3v-6v-dc-motor?limit=50. [Accessed 4 May 2021].
[6] "L293D," [Online]. Available: https://5.imimg.com/data5/PX/UK/MY1833510/l293d-based-arduino-motor-shield.pdf. [Accessed 6 April 2021].