Design Report for Saluki Engineering Company

Magnetic Particle Flux Tracking in River
Simulators
Prepared for Little River Research & Design
Prepared by Saluki Engineering Team #47
SEC F10-47-MAGNPART
April 26, 2011
Ali Albayat
Kyle Barringer
James Butler
Geoffrey Daniels
Joe Sobeski
Team 47 - MAGNPART
Saluki Engineering Company
Southern Illinois University
Mail Code 6603
Carbondale, IL 62901
April 26, 2011
Mr. Steve Gough
Little River Research & Design
514 East Main
Carbondale, IL 62901
Dear Mr. Gough,
On January 18, 2011, our team began work on the proposed system for tracking particle flux in river simulators. We would like to thank you for the opportunity to work on this project. Included is a design report that details our design and deliverables.
Team 47 has designed and created a software package for the MATLAB computing environment
that tracks particles in videos or a sequence of images. A graphical user interface was also developed to
simplify the use of the software.
The cost to implement this system is $1,850, which covers the MATLAB environment and an image acquisition device; the system requires no down time of a simulator.
Once again, thank you for the opportunity to work on this project. If you have any questions or
need any additional information, please contact me via the contact information below.
Sincerely,
Kyle Barringer
Southern Illinois University
Team 47 – MAGNPART Project Manager
kbarring@siu.edu
(618) 201-1127
Acknowledgements (JS)
SEC Team 47 would like to thank the following people and groups for their contributions to this project. First and foremost, we thank the employees of Little River Research & Design for the opportunity to work on this project and for all the assistance they provided throughout. The samples of the media used in their system, as well as the Hall Effect sensor they had on hand, were essential to truly understanding the nature of this project. We also thank them for making their facilities available so we could gather the information we needed. Team 47 thanks our Faculty Technical Advisor, Dr. Alan Weston, for all his insight into the project; his ideas and expertise offered us new ways to pursue it. We also thank Dr. Gregory Wilkerson, whose expertise in sediment transport gave us good ideas on how to track the media in the riverbed, and Dr. Suri Rajan for sharing his expertise in motors and his understanding of the sensors used in the project. Finally, Team 47 thanks the Department of Electrical and Computer Engineering at Southern Illinois University for providing the funding for materials needed for the project.
Table of Contents
Acknowledgements (JS)
Table of Contents
List of Tables and Figures
Executive Summary (KB)
Project Description
  Introduction (AA)
  Overview (KB)
  Sensors (JS & AA)
  Arduino (GD)
  Fluid (JB)
Subsystem: Analysis Software (KB)
  Introduction
  Image Analysis
  Video Analysis
  Fault Analysis
  Environmental Issues
  Societal Issues
  Conclusions & Recommendations
  References
  Recommended Vendors
Subsystem: Graphical User Interface Software (GD)
Cost & Implementation (KB)
Conclusions & Recommendations (JS)
Appendices
  Drawings
  User's Guide (KB)
  Technical Manual (KB)
List of Tables and Figures
Table 1: Information contained in "TrackedParticles.txt"
Table 2: Recommended webcam vendors
Table 3: Recommended digital SLR camera vendors
Table 4: Recommended high-speed camera vendors
Table 5: Cost of MATLAB
Figure 1: Subsystem Relationship Chart
Executive Summary (KB)
The ability to track particles as they move through the Emriver river simulator would provide
researchers and educators with information that is not currently available. This information includes the
location of particles as they move throughout the river and the velocity with which they move. The
designed system contains these capabilities.
The system allows the user to analyze either pre-acquired images or videos. Choosing to analyze
images lets the user identify individual particles and track them over subsequent images. The video
option permits the user to identify a group of particles moving together and determine their
instantaneous velocity.
The system comprises analysis software and an accompanying graphical user interface that can run on any computer with the Windows operating system and the MATLAB computing environment installed. The system accepts images from any type of imaging device, ranging from cheap webcams to expensive high-speed cameras; the imaging device is the only cost associated with the system. This system also has the advantage of requiring no down time to implement. The graphical user interface guides the user through the steps required to analyze images, and all resulting information is displayed in an easy-to-read format on the screen.
The remainder of this report contains the following:
- An overall project description
- A look at costs associated with the image capturing device
- Technical details associated with each part of the system
- Non-technical issues affiliated with the system
- Recommendations for implementation
The system has a cost of $1,850.
Project Description
Introduction (AA)
The purpose of this project was to create a system that could be used to measure particle velocity, flux, and particle transport rate in a simulated river system. The project was requested by Little River Research & Design with the purpose of improving their river simulators to help students in the field of fluvial geomorphology. Their intention was to be able to count the number of particles that flow through the simulator and to measure the velocity of the particles that move. They currently offer two models of the simulator, the EM2 and the EM4. The plan to meet the company's request involved three primary methods: the first incorporated Hall Effect sensors, the second was a camera-based optical method, and the last was magnetic separation utilizing electromagnets and microcontrollers. Many experiments were done on each method, and the most successful was the camera-based optical method.
Overview (KB)
The designed system has different capabilities depending on whether a sequence of images or a video file is used. If an image sequence is used, the system takes two consecutive images and finds the particles that moved in the time interval between the images. It highlights these particles with red circles and produces a text file that contains the coordinates of the particles and a unique identification number. If a video is used, the system analyzes each frame, finds groups of particles that are moving at roughly the same velocity, highlights them with a bounding box, and displays their velocity. The system comprises two software subsystems: analysis and user interface.
The analysis software executes out of view of the user and is responsible for performing the work described above. The user interface software acts as a link between the user and the analysis software. It allows the user to change parameters and start or stop analyses through the use of figure windows, buttons, and sliders. Figure 1 shows the relationship between the subsystems.
[Figure 1. Subsystem Relationship Chart: the Graphical User Interface communicates with the Analysis Software]
In addition to the subsystems for the designed system, other considerations were evaluated.
The first was the use of Hall Effect sensors embedded into the river simulator which would utilize the
ferrous property of some of the particles. The second was the use of an electromagnet controlled by an
Arduino microcontroller to collect the particles at a regular time interval for measuring. The last was a
look at the effects of fluid dynamics on the sediment transport process.
Sensors (JS & AA)
For the sensor approach, many different methods were considered. The initial idea was to use a Hall Effect sensor to detect the presence of the particles in the river simulator. The Hall Effect sensor is a type of sensor that, by using a concept known as the Hall Effect, produces an output whenever it detects the presence of a magnetic field. In order to fully understand the Hall Effect sensor, it is necessary to explain how the Hall Effect works.
The Hall Effect is the voltage induced across a conductor, transverse to both the electric current flowing through it and an applied magnetic field. The particles showed some ferromagnetic characteristics when it was demonstrated that they were attracted to a magnet. It was necessary to determine whether the particles were themselves magnetic or simply ferrous. This was done by applying the particles to the Hall Effect sensor and checking whether any output was produced. Initially the sensor was tested with
a simple magnet to ensure proper functionality. After determining the sensor was operating correctly,
testing of the particles began. No response was produced by the sensor after applying the particles to it.
This led to the conclusion that the particles were simply ferrous and not magnetic. It then became
necessary to research means of detecting ferrous materials in order to apply this to the system.
After researching means of detecting ferrous particles, the concept of bias magnet Hall Effect sensors seemed plausible for these types of particles. The bias magnet Hall Effect concept is the same as that of a regular Hall Effect sensor, except for the inclusion of a permanent magnet. The sensor detects the presence of the magnetic field created by the permanent magnet and is thereby able to detect when that field is altered by the presence of a metal object, which seemed applicable to the particles in the river system.
After researching this concept, a sensor of this variety was found to be sold by a company named Cherry Corp. They sold a Speed and Direction Sensor that combined a Hall Effect sensor and a bias magnet in one package. After searching for a retailer, the sensor was ordered and arrived shortly thereafter. Once the sensor arrived, it was clear that a special connector was necessary to connect its pins to any sort of device. After being unable to connect the sensor to the oscilloscope with alligator clips, a Dremel tool was used to cut away some of the plastic encasing the pins, making it possible to attach wires with alligator clips to each of the three pins. Following the specification sheets from the product's web page, the sensor was powered and connected to the oscilloscope for testing. Upon connection it was noticed that there was some noise imposed on the sensor's output. The voltage per division was decreased so as to "zoom in" on the waveform, which made the noise much more apparent; the intention was that, with the oscilloscope "zoomed in", the response to the particles could be more closely observed. When the particles were applied to the sensor, no particular change was noted in the waveform. The sensor was then applied to various metal objects to attempt to induce a response. On occasion, when the magnet within the sensor would attach to other metallic
objects, a slight change in the waveform was noted. However, the response was not consistent and did not always occur upon contact with another metal object. It was believed that this response was another result of noise picked up through the exposed pins as the sensor was moved around. Since the sensor was manufactured for use in automobiles, a group member took the part to a professor who specialized in that field. It was the professor's opinion that the sensor was made for the specific application of detecting gear tooth speed and that its effective range was far too small for detecting our particles.
After the attempt with the Speed and Direction sensor proved unsuccessful, other methods were sought. The concept of the bias magnet Hall Effect sensor still seemed applicable, so the group continued to pursue other means of utilizing it. The idea arose of using the original Hall Effect sensor with a separate permanent magnet. After purchasing some powerful rare earth magnets, one was fixed in a position above the Hall Effect sensor until an output was generated by the sensor. Random metal objects were then placed between the magnet and the sensor to verify that an intervening metal object would disrupt the magnetic field. Those preliminary tests proved successful. The particles were then placed in between to see if a change in the output occurred. No such change was observed. After multiple trials it was concluded that the particles were simply too small and not ferrous enough to have an impact on the sensor and the magnetic field.
Arduino (GD)
The first idea for the design of our project was expected to require an Arduino board, so our team researched how the board was created, its functions, and the types of applications it was used for. The team also created simple programs to become familiar with its interface and coding style. However, later in the semester the group decided to switch gears and go another route using MATLAB.
Fluid (JB)
The optical method is useful in that it gives the velocity of the urea formaldehyde thermoset plastic particles used in lieu of rock-based sand in the river simulator. The task after the velocity is found is to turn this into a sediment particle transport rate (ppm). MATLAB estimates the direction and speed of object motion from one image to another, or from one video frame to another, using either the Horn-Schunck or the Lucas-Kanade method.
When the flow conditions exceed the criteria for incipient motion, sediment particles on the streambed start to move. The transport of bed particles in a stream is a function of the fluid force per unit area, the tractive force or shear stress $\tau_0$, acting on the streambed. Under steady, uniform flow conditions, the shear stress is

$\tau_0 = \gamma D S$ (eq. II-2)

where:
$\gamma$ is the specific weight of the fluid,
$D$ is the mean depth, and
$S$ is the water surface slope.

The gravitational force resisting particle entrainment, $F_g$, is proportional to:

$(\gamma_s - \gamma)\,d$ (eq. II-3)

where:
$\gamma_s$ is the specific weight of sediment, and
$d$ is the particle diameter.

The Shields relation (Shields 1936), with modifications by Graf (1971), describes the initiation of particle movement using the ratio of the fluid force to the gravitational force, proportional to the dimensionless quantity called the critical dimensionless shear stress, $\tau^*$:

$\tau^* = \dfrac{\tau_0}{(\gamma_s - \gamma)\,d}$ (eq. II-4)

The dimensionless bed-material transport rate per unit width of streambed, $Q^*_B$, is:

$Q^*_B = \dfrac{Q_s}{\sqrt{\left((\gamma_s - \gamma)/\gamma\right)\, g\, d^3}}$ (eq. II-5)

where:
$g$ is gravity, and
$Q_s$ is the volumetric transport rate per unit width of streambed determined from bedload samples.

The empirical function developed by Parker (1979) is

$Q^*_B = 11.2\,(\tau^*)^{3/2}\left(1 - \tau^*_c/\tau^*\right)^{4.5}$ (eq. II-6)

where:
$\tau^*_c$ is the threshold value of $\tau^*$ required to initiate particle motion.
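To make the quantities above concrete, the following is a purely illustrative MATLAB calculation; the specific weights, depth, slope, grain size, and threshold value are assumed numbers, not measurements from the Emriver simulator.

% Illustrative only: assumed values, not measurements from the simulator.
gamma   = 9810;     % specific weight of water, N/m^3
gamma_s = 26000;    % specific weight of quartz-density sediment, N/m^3
D = 0.05;           % mean depth, m (assumed)
S = 0.01;           % water surface slope (assumed)
d = 0.001;          % particle diameter, m (assumed)
g = 9.81;           % gravity, m/s^2

tau0     = gamma * D * S;                      % eq. II-2: bed shear stress, Pa
tau_star = tau0 / ((gamma_s - gamma) * d);     % eq. II-4: Shields parameter
tau_c    = 0.03;                               % assumed threshold for motion
QB_star  = 11.2 * tau_star^1.5 * ...
           (1 - tau_c / tau_star)^4.5;         % eq. II-6 (Parker 1979)

With these assumed numbers, tau_star comes out to roughly 0.30, an order of magnitude above the assumed threshold, so particle motion would be expected.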
No tracking algorithm from fluid dynamics, machine vision, or cell tracking will thus solve the problem outright: particles can switch between a "direct" mode of motion (particles carried along by the flow of water) and an "indirect" mode (particles transferring force to one another), and the two modes of motion have greatly different speeds. Since most existing tracking algorithms are specifically tailored to a particular type of motion, they cannot be readily used. In no way have we solved all of the above problems, but the tracking software described in this report tries to at least approximately consider them and points the way for future improvements.
Since the method of using Hall Effect sensors did not produce favorable results, the decision was made to abandon it. The current method of using an imaging device with accompanying software was determined early in the design process to have the best prospects, and it was known that a user interface would be needed. For this reason, the electromagnet method was also abandoned in order to allow the user interface to be completed.
Since the system is only used for analysis, the consequences of a fault are not very threatening. The possibility for a fault exists if certain parameters are incorrectly set, but the only effect is that an error message is displayed and the analysis is aborted. Resetting the parameter value and running the analysis again will correct the error.
Subsystem: Analysis Software (KB)
Introduction
The optical method comprises an image capturing device and software that analyzes the captured images and videos. The original motivation for this idea was that the black particles are sharply different in color from all other particles and, when represented in a digital image, their pixel values would lie beyond a certain threshold. This threshold would permit software to identify the black particles in an image, and comparing the x- and y-coordinates of detected particles between two consecutive images would allow the particles to be tracked. An overview of the subsystem from the software point of view can be seen in Drawing KB-1.
Image Analysis
The MATLAB programming language and computing environment includes functions that perform image acquisition and processing. For this reason it was chosen over other programming languages such as C and C++, for which the necessary functions would either need to be written specifically for the project or found from other sources. Even if the needed functions were found from other sources, the possibility remained that they would need to be adapted to fit the project.
Upon researching image processing capabilities within the MATLAB language, it was discovered
that images needed to be represented in grayscale color space. The benefit of this requirement was that
all particles could be tracked instead of only ones that fell within a certain pixel range. Further searching
also revealed that a group of researchers from Georgetown University had previously researched
particle tracking and developed several MATLAB files to locate particle positions and link these positions
to form particle trajectories [1].
After reading the code and experimenting, it was determined that these files could be utilized in
the system. The experiment consisted of taking an image during a simulation of the river simulator and
using the aforementioned files to locate particles. The decision to use these files was based on the fact
that they were already proven to work, could be customized to fit any special needs of the system and
their use would allow time for additional functionality to be added to the system.
The next step was to determine the best method for the user to acquire images. The requirements were that the images needed to be clear and spaced at short but equal time intervals, to ensure the particles stay in the image area for more than one time interval. To guarantee the timing requirements were met, it was decided that the computer on which the program was running could be used to acquire the images. The included Image Acquisition Toolbox within MATLAB appeared to contain the necessary functionality. However, upon trying to utilize it, it was discovered that it required all images to be stored in the memory of the computer; only after the acquisition of all images was complete were they saved to the hard disk. The two problems with this were that large sequences of images could not be taken, due to the limitation of the memory size, and that images would not be available immediately for processing. To solve this problem, a "for" loop was used to acquire images and save them to the hard disk. This resulted in a delay between images of 0.8-1 second. This delay was longer than that of the built-in function, but since this solution was the only one that allowed the images to be saved to the hard disk immediately, the delay was considered acceptable.
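A minimal sketch of that acquire-and-save loop is shown below, using the Image Acquisition Toolbox; the adaptor name, device ID, frame count, and file names are illustrative assumptions, not the delivered code.

% Sketch only: adaptor name, device ID, and file names are assumptions.
vid = videoinput('winvideo', 1);              % open the default capture device
triggerconfig(vid, 'manual');                 % grab frames on demand
start(vid);
nImages = 50;                                 % number of frames to acquire
for k = 1:nImages
    frame = getsnapshot(vid);                 % grab one frame from the camera
    imwrite(frame, sprintf('image_%03d.jpg', k));  % save to disk immediately
end
stop(vid);
delete(vid);                                  % release the device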
The next step in the design process was to write code to pre-process the images so they could be used by the files from the Georgetown University researchers. The pre-processing, sketched below, consisted of:
- reading the images into the program via the "imread" command
- adjusting the contrast of the images to account for improper lighting conditions by using the "imadjust" command
- converting the images to a numerical data type and inverting the colors, by subtracting the images from 255 (the value representing white), so that the particles appear bright compared to the background
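A minimal sketch of this pre-processing, assuming two consecutive grayscale frames with hypothetical file names; imadjust requires the Image Processing Toolbox.

% Sketch only: file names are placeholders; frames assumed grayscale.
imgA = imread('frame001.jpg');
imgB = imread('frame002.jpg');
imgA = imadjust(imgA);            % contrast adjustment for poor lighting
imgB = imadjust(imgB);
imgA = 255 - double(imgA);        % numeric type + inversion: particles bright
imgB = 255 - double(imgB);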
Once this was completed, two of the files could be used to perform particle locating: "bpass.m" and "pkfnd.m". Each file operates by being given parameters that include the image itself and customizable values that allow for more precise results.

bpass(image_array, lnoise, lobject)
- image_array: the image, which is now expressed as a two-dimensional array
- lnoise: length of the noise in pixels
- lobject: (editable) a length in pixels slightly larger than a typical object

pkfnd(im, th, sz)
- im: the image, still expressed as a two-dimensional array
- th: (editable) the minimum brightness of a pixel that could be a local maximum; a local maximum signifies that the pixel is a particle
- sz: (editable) a value slightly larger than the diameter, in pixels, of a particle; it should only be used if the image is blurry, otherwise it should be set to zero
The “pkfnd” function then returns an N x 2 array where N is the number of particles found. The
values contained in this array are the x- and y- coordinates of the found particles in the format [x,y]. The
code is designed to process two images at once so there are two N x 2 arrays of x- and y-coordinates.
Following this, the two arrays are compared and particles that have the same coordinates in
both lists are removed. This is to concentrate only on the particles that moved during the last time
frame since stationary particles are not of interest. Once this step is complete, both images are
displayed with the found particles highlighted by red circles. This allows the user to determine if the
results are satisfactory.
Next, each set of coordinates has a third column appended containing either a one or a two, identifying it as belonging to the first image or the second, and the two arrays are concatenated vertically. Once this step has been completed, the array can be passed to the final function, "track.m", which is responsible for identifying the particles; a sketch of the complete pipeline follows the parameter list below.
track(xyzs, maxdisp, param)
- xyzs: the array to process, which was created in the last step
- maxdisp: (editable) an estimate of the maximum distance a particle could move during one time interval
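Putting the three external functions together, a hedged sketch of the locate-and-track pipeline might look like the following; all parameter values are illustrative, not the tuned values used in the delivered program.

% Sketch only: lnoise, lobject, th, sz, and maxdisp values are illustrative.
fA  = bpass(imgA, 1, 7);                  % suppress noise, keep ~7 px objects
fB  = bpass(imgB, 1, 7);
pkA = pkfnd(fA, 60, 7);                   % [x y] peaks brighter than 60
pkB = pkfnd(fB, 60, 7);

% Drop particles with identical coordinates in both images (stationary);
% compute both masks before filtering so the comparisons stay symmetric.
movedA = ~ismember(pkA, pkB, 'rows');
movedB = ~ismember(pkB, pkA, 'rows');
pkA = pkA(movedA, :);
pkB = pkB(movedB, :);

% Tag each row with its image number (1 or 2) and concatenate vertically.
xyzs = [pkA, ones(size(pkA,1), 1); pkB, 2*ones(size(pkB,1), 1)];
res  = track(xyzs, 10);                   % maxdisp = 10 px; res is [x y t ID]
% The program then writes res out as "TrackedParticles.txt" (see Table 1).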
The result of this function is a text file with the name “TrackedParticles.txt” that is saved for the
user to view. The text file contains an N x 4 array, where N is the number of particles found in both
images. The four columns contain the information shown in Table 1.
(x): x-coordinate of particle
(y): y-coordinate of particle
(t): image number that the particle was found in
(ID): unique ID number for each identified particle trajectory
Table 1. Information contained in "TrackedParticles.txt"
Upon testing the code execution on a simulation of the river simulator, it was found that the image processing lagged behind real time by roughly 15 seconds. This was due to the time required by the computer to process the images and display the results. Since this delay was unavoidable, it was determined that the image analysis would be best performed on images already acquired. This has the advantage of allowing the user to acquire images with any type of image acquisition device, regardless of its ability to be tethered to a computer with a USB cable.
To summarize the image analysis: the image analysis function takes two successive images and performs the pre-processing described above. The pre-processed images are displayed to the user for verification, and the processing of the images then takes place using the three external functions. The two images are again displayed in separate figure windows with the found particles highlighted by red circles, allowing the user to determine whether the results are acceptable or the processing needs to be executed again with different parameters. A text file of the .txt file type is created and saved in the same directory in which the program resides. The text file contains the x- and y-coordinates, image number, and unique identification information described above.
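As a usage note, the saved results can be read back into MATLAB for further analysis; a minimal sketch, with the column layout from Table 1:

res   = dlmread('TrackedParticles.txt');  % N x 4 array: [x y t ID]
traj1 = res(res(:,4) == 1, :);            % all positions along trajectory ID 1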
Video Analysis
Due to the possibility of long time intervals between images because of code execution or
manual triggering, another solution was sought to complement the image analysis. This solution uses
recorded video rather than images of the river simulator. Since many video acquisition devices allow for
frame rates of at least 15 frames per second, this analysis mode would not be dependent on code
execution, manual triggering, or limitations in camera lens speeds. It was found that the Simulink
modeling tool developed by MathWorks, the same company that distributes MATLAB, could be used.
Simulink has built-in support for processing video files. An example was found that appeared to
be similar to the application of the system. The example takes a video clip of moving traffic and
determines which objects are vehicles and counts them. This idea can also be applied to tracking
particles in a river simulator. Using the example as a guide, a system was designed and created that
tracks the particles and draws a bounding box around what it determines is a group of particles on the
video. It also overlays a motion vector over the video that allows the user to see where motion is taking
place [2].
To test the system, video samples of the river simulator were acquired with a digital SLR camera
that had a frame rate of 15 frames per second. The only problem with this video was that the resolution
was not high enough to be able to discern individual particles. The decision was made to look for groups
of particles that had roughly the same velocity instead of individual particles. The sizes of the groups,
called blobs, are editable so that the user can determine the values that provide optimum results for
individual video files. This decision was made to allow greater functionality of the video analysis. Had
the decision to group particles together not been made, the user would have had to use high-speed photography equipment, which can be quite expensive, to obtain video clear enough for individual particles to be discernible. This solution allows less expensive equipment to be used while still providing acceptable results.
Once the ability to identify blobs was complete, the next step was to design a method to
determine the velocity of the blobs. It was discovered that Simulink had a block built in to its video
processing block-set, the Optical Flow block, which estimates the direction and speed of object motion
from one frame to the next. It computes the optical flow by solving the equation

$I_x u + I_y v + I_t = 0$

where $I_x$, $I_y$, and $I_t$ are the spatiotemporal brightness derivatives, $u$ is the horizontal optical flow, and $v$ is the vertical optical flow. The system uses the Lucas-Kanade method, which divides the original image into smaller sections and assumes a constant velocity in each section. This is in contrast to the Horn-Schunck method, which assumes the optical flow is smooth over the entire image. Due to the possibility of different velocities within one frame, the Lucas-Kanade method was considered the better solution. Once the original image has been divided into smaller sections, the block performs a weighted least-squares fit of the optical flow constraint equation to a constant model for $[u\ v]^T$ in each section, $\Omega$, by minimizing

$\sum_{x \in \Omega} W^2 \left[I_x u + I_y v + I_t\right]^2$

where $W$ is a window function that emphasizes the constraints at the center of each section. The solution to the minimization problem is given by [3]:

$\begin{bmatrix} \sum W^2 I_x^2 & \sum W^2 I_x I_y \\ \sum W^2 I_y I_x & \sum W^2 I_y^2 \end{bmatrix} \begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} \sum W^2 I_x I_t \\ \sum W^2 I_y I_t \end{bmatrix}$
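For illustration, that least-squares solve can be reproduced in a few lines of base MATLAB; this is a hedged sketch that treats the whole frame as one section with uniform weights, rather than the windowed sections the Simulink block uses.

% Sketch only: A and B are two consecutive grayscale frames (double arrays).
Ix = conv2(A, [-1 1; -1 1] / 4, 'same');   % spatial brightness derivative in x
Iy = conv2(A, [-1 -1; 1 1] / 4, 'same');   % spatial brightness derivative in y
It = B - A;                                % temporal brightness derivative
W2 = ones(size(A));                        % uniform window weights (W^2)

M = [sum(W2(:) .* Ix(:).^2),       sum(W2(:) .* Ix(:) .* Iy(:));
     sum(W2(:) .* Ix(:) .* Iy(:)), sum(W2(:) .* Iy(:).^2)];
b = -[sum(W2(:) .* Ix(:) .* It(:)); sum(W2(:) .* Iy(:) .* It(:))];
uv = M \ b;                                % [u; v] for this section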
The Optical Flow block outputs velocity values for all locations, so a method needed to be designed to take only the values at the locations of the found blobs. This was accomplished by comparing the matrix of velocities with the matrix of blob positions and displaying only the matching values, which are printed at the upper left-hand corner of each bounding box.
The Simulink model that implements this solution consists of many blocks that perform the necessary processing. A flowchart in Drawing KB-2 illustrates the flow of data in the Simulink model. Drawing KB-3 shows the top level of the model, with subsequent drawings showing the various levels within this main drawing.
The first block is the “From Multimedia File” block which is used to select the .avi file to be
analyzed. This block inherits the sample time from the file to accommodate for differing frame rates and
upon beginning simulation, plays the file one time. The output of this block goes to the “R’G’B’ to
intensity” block as well as the “Display Results” block to be used for the “Original” video display.
The “R’G’B’ to intensity” block converts the color space from the typical RGB color scheme used
in color video to intensity. It accomplishes this by using the following equation [4]:
$\mathrm{intensity} = \begin{bmatrix} 0.299 & 0.587 & 0.114 \end{bmatrix} \begin{bmatrix} R' \\ G' \\ B' \end{bmatrix}$
The output of this block goes to the “Optical Flow” block.
The “Optical Flow” block computes the optical flow of each frame as described above. The
output of this block, which is the magnitude of the velocity, goes to the “Thresholding and Region
Filtering” block as well as the “Display Results” block.
The “Thresholding and Region Filtering” block contains other smaller blocks which can be seen
in Drawing KB-4. The input of this block first goes through the “Velocity Threshold” block (Drawing KB-5)
which takes the mean velocity per frame as well as the mean velocity per frame across time. It then goes
through median filtering to help reduce noise in the frame. Then, it goes to the “Region Filtering” block
which itself is comprised of smaller blocks and can be seen in Drawing KB-6.
The “Region Filtering” block first performs blob analysis on its input. This is where the bounding
box that will surround each blob is determined. This block also contains the user-editable values that
determine the blob properties such as the maximum number of blobs, the minimum area of a blob and
the maximum area of a blob. The output of this block is used for the coordinates for which the velocities
will be printed in the “Velocities” display. The output is also used to determine the number of blobs and
to create the coordinates of the bounding boxes.
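As a rough MATLAB-side analogue of the blob analysis just described, the same size-filtering and bounding-box step could be sketched with the Image Processing Toolbox; this is not the actual Simulink Blob Analysis block, and the mask variable and area limits are assumptions.

% Sketch only: 'bw' is an assumed thresholded motion mask (logical image).
bw    = medfilt2(bw);                            % reduce speckle noise
stats = regionprops(bwlabel(bw), 'Area', 'BoundingBox');
minArea = 50;  maxArea = 5000;                   % user-editable blob size limits
keep   = [stats.Area] >= minArea & [stats.Area] <= maxArea;
boxes  = vertcat(stats(keep).BoundingBox);       % one [x y w h] row per blob
nBlobs = nnz(keep);                              % blob count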
The final block is the “Display Results” block which also contains additional blocks as seen in
Drawing KB-7.
The first group is the “Original” block which takes the original video stream and displays it in a
video box for the user.
The second group is for the motion vector display. The velocity values from the "Optical Flow" block are inputted to the "Optical Flow Lines" block, which generates the coordinates of the optical flow lines; these are then inputted into the "Draw Lines" block, which draws the lines over the original video stream. The result is then inputted to the "Motion Vector" video display.
The third group is for the bounding boxes and the number of particles. This group again comprises additional blocks and is seen in Drawing KB-8. The "Draw Rectangles" block uses the bounding box coordinates to draw rectangles over the original video stream. The output of this block is then inputted to a block that creates a small white rectangle as the background for the particle count. This output, along with the number of particles, is inputted into the "Insert Text" block, which prints the
number of particles on the white background. This output is then displayed in the “Bounding Boxes”
video window.
The final group is for the velocities. The velocity values from the Optical Flow block are run through a gain of 100. The raw values out of the Optical Flow block are on the order of hundredths, which required too much screen space to display; with a gain of 100, the values contain a whole-number component with a smaller decimal and are easier to read. These values are then converted from a 2-D matrix to a 1-D array, as required by the "Insert Text" block that comes next. This block takes this input, the velocity coordinates from the Blob Analysis block, and the original video stream. It prints the velocities at the upper left-hand corner of each bounding box over the original video stream, and the result is displayed in a window via the "Velocity" block.
To conclude the video analysis: this solution analyzes an .avi video clip using parameters that the user sets. Once the simulation begins, processing starts and the output is four figure windows. One figure window displays the original video clip. The second displays the video clip with a vector field overlaid that allows the user to more easily see where motion is taking place. The third shows the video clip with the detected groups of particles enclosed by yellow bounding boxes. The last displays the velocity of each group of particles; the text for each group is printed at the upper left-hand corner of the corresponding bounding box. This keeps the found blobs from being covered by both bounding boxes and text, while also allowing easy comparison of which velocity value corresponds to which bounding box.
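For reference, a Simulink model can also be opened and run programmatically from the MATLAB prompt; a minimal sketch, assuming a model file named 'ParticleTracker' (the actual file name may differ):

open_system('ParticleTracker');                             % open the model
set_param('ParticleTracker', 'SimulationCommand', 'start')  % begin simulation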
The image and video analysis methods that comprise the optical subsystem work closely with
the Graphical User Interface subsystem. The GUI is used to set and change parameters in this subsystem
and allows the user to see the results that are produced by this subsystem.
Fault Analysis
Because the user is able to change the parameters and the system is not static, there is a possibility for the system to fail.
One possibility is in the image analysis: if the minimum threshold is set too low, many particles will be found, and the high number will cause the trajectory prediction algorithm to try to evaluate too many combinations, which results in an error. If this happens, the analysis is stopped and a warning message is displayed. This error causes no damage to the system and has no other consequences; increasing the threshold and running the analysis again is all that is required to fix it. The only modification that could be made to prevent this error would be to limit the minimum value the threshold could be set to, but this would severely limit flexibility for the user, and since there are no consequences, this option was not taken.
In the video analysis portion, a warning can occur if the system finds more particles than the
value the user set for the maximum number of particles. This does not result in an error and the
simulation still continues, but a warning message is displayed to alert the user that the level was
reached and should be changed on future simulations.
Environmental Issues
The only environmental issue associated with the system is the disposal of the image acquisition device. Documentation from the manufacturer of the device should be consulted on how best to handle disposal. The software portion of the system poses no environmental risks or issues.
Societal Issues
The end user will be required to capture images or videos, either manually or via the image acquisition portion of the program, and save them onto the computer. To accomplish this manually, the user will need experience using a digital camera to take images or video and transferring the resulting files to a computer. The user will then need to navigate the graphical user interface to change the parameters to meet the needed specifications and run the analysis, or acquire images from the user interface if desired. The included user interface makes these tasks easy for any user with common computer skills.
Conclusions and Recommendations
In conclusion, this system analyzes images and videos for particle tracking purposes. It is capable of acquiring a sequence of images or using images and videos provided by the user. It allows the user to alter certain parameters to meet unique specifications before running the analysis, and the analysis results are then displayed for the user. The system can be implemented in its current state, but additional functionality could be provided through further work, as outlined below.
The system currently requires a licensed version of MATLAB and Simulink in order to work. Standalone functionality could be provided through additional work.
The image analysis portion could also benefit from additional work on the ability to determine particle distances, on comparing more than two images at a time, and on displaying the unique ID number of a particle next to that particle in the displayed image.
The video analysis portion could benefit from additional work on converting the pixel velocities to other units and on exporting the velocity values to a text file.
References
[1] D. Blair and E. Dufresne. (n.d.). Particle Location and Tracking Tutorial [Online]. Available: http://physics.georgetown.edu/matlab/tutorial.html
[2] MathWorks R2008b Documentation: Tracking Cars Using Optical Flow, MathWorks, Inc.
[3] MathWorks R2011a Documentation: Computer Vision System Toolbox: Optical Flow, MathWorks, Inc. [Online]. Available: http://www.mathworks.com/help/toolbox/vision/ref/opticalflow.html
[4] MathWorks R2011a Documentation: Computer Vision System Toolbox: Color Space Conversion, MathWorks, Inc. [Online]. Available: http://www.mathworks.com/help/toolbox/vision/ref/colorspaceconversion.html
Recommended Vendors
Webcams:

Vendor | Model | Price | Source | Information | Minimum Requirements
Hewlett-Packard | Webcam HD-3110 | $49.99 | HP website¹ (4/20/11) | AutoFocus, 30 fps, 5.7 Megapixels | 2.4 GHz processor, 1 GB RAM, Windows XP/Vista/7, 320 MB hard drive space
Logitech | HD Webcam C310 | $49.99 | Logitech website² (4/20/11) | 5 Megapixels | 1 GHz processor, 512 MB RAM, 200 MB hard drive space, Windows XP/Vista/7
Microsoft | LifeCam HD-5000 | $49.95 | Microsoft website³ (4/20/11) | AutoFocus, 30 fps | Dual-Core 1.6 GHz processor, 1 GB RAM, Windows XP/Vista/7
Table 2. Recommended webcam vendors
Digital SLR cameras:

Vendor | Model | Price | Source | Information
Canon | EOS 60D | $999.99 | Canon website⁴ (4/20/11) | 18 Megapixels
Nikon | D7000 | $1,109 | Nikon website⁵ (4/20/11) | 16.9 Megapixels
Sony | A580L | $899.99 | Sony website⁶ (4/20/11) | 16.2 Megapixels
Table 3. Recommended digital SLR camera vendors
High-speed cameras:

Vendor | Model | Price | Source
Photron USA, Inc. | FASTCAM SA1.1 | $70,000-$90,000 | Phone conversation (4/20/11)
Vision Research, Inc. | Phantom v7.3 | $39,000 | Phone conversation (4/20/11)
Olympus | i-SPEED TR | $35,000-$40,000 | Phone conversation (4/20/11)
Table 4. Recommended high-speed camera vendors
¹ http://www.shopping.hp.com/product/computer/categories/webcams/1/accessories/BK357AA%2523ABA
² http://www.logitech.com/en-us/webcam-communications/webcams/devices/7076
³ http://www.microsoft.com/hardware/en-us/p/lifecam-hd-5000/7ND-00001
⁴ http://www.usa.canon.com/cusa/consumer/products/cameras/slr_cameras/eos_60d#Overview
⁵ http://imaging.nikon.com/lineup/dslr/d7000/spec.htm
⁶ http://www.sonystyle.com/webapp/wcs/stores/servlet/ProductDisplay?catalogId=10551&storeId=10151&langId=1&productId=8198552921666266164
MATLAB:

Component | Price
MATLAB | $500
Image Processing Toolbox | $200
Image Acquisition Toolbox | $200
Simulink | $500
Computer Vision System Toolbox | $200
DSP System Toolbox | $200
Total | $1,800
Table 5. Cost of MATLAB
Subsystem Description: Graphical User Interface Software (GD)
In designing the GUI, there were several ways it could have looked, using different options and objects: for example, a displayed view of the files in a folder, sliders or scrolling lists of files and settings, user-input text boxes that controlled various settings, or radio and toggle buttons. The final decision was to keep the user end simple and use predefined sliders, labeled text boxes, and buttons. Underlying code allows the data to be processed and analyzed through functions in MATLAB, and an interface is needed for the user to interact with that code. A further explanation of the GUI follows.
The team's objective in this part was to create the Graphical User Interface (GUI) for the program that analyzes and tracks the particles. A few basic components make up the GUI: windows, buttons, text boxes, and sliders. When the program runs, the first thing that appears is a screen with two buttons; each button is tied to a function that links it to a different next screen (a sketch of this opening screen follows below). After a button is chosen, the current screen closes, depending on the user's input or decision, and the program moves on to the next screen. If the user chooses the 'Capture new images' option, the next screen has two buttons: 'Start' and 'Stop'. If the user chooses 'Use existing video or image files', the next window gives the user three options: 'Back', 'Images', and 'Video'. The 'Back' button allows the user to return to the original window in case they want to capture new data. If the 'Images' button is chosen, the current screen closes and the new window presents new options, including selecting the files the user wants to use. For the best results, the user should choose two consecutive images. When this option is selected, a file menu appears and the user can select the files they would like to use. The next objects in the window are text boxes on the left, which tell the user what settings they are able to adjust; to adjust these settings, they use sliders located to the right of each text box. After adjusting the settings to their liking, the user selects the 'RUN' button, which begins the analysis of the selected files.
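A minimal sketch of that opening screen, using only base MATLAB figure and uicontrol objects; the function and callback names are hypothetical, and this is not the delivered code.

% Sketch only: names are hypothetical; the delivered GUI has more screens.
function ParticleTrackerDemo
    f = figure('Name', 'Particle Tracker', 'MenuBar', 'none', ...
               'NumberTitle', 'off', 'Position', [300 300 320 200]);
    uicontrol(f, 'Style', 'pushbutton', 'String', 'Capture new images', ...
              'Position', [60 110 200 40], 'Callback', @captureCallback);
    uicontrol(f, 'Style', 'pushbutton', 'String', 'Use existing files', ...
              'Position', [60 50 200 40], 'Callback', @fileCallback);
end

function captureCallback(src, ~)
    close(ancestor(src, 'figure'));   % close this screen ...
    % ... and build the Start/Stop capture screen here
end

function fileCallback(src, ~)
    close(ancestor(src, 'figure'));   % close this screen ...
    % ... and build the Back/Images/Video screen here
end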
Going the other route of using previously recorded video files, the user opens a new window, which also closes the previous one. The window is made of text boxes, sliders, and buttons. After the file is loaded, the user can change the settings governing how the video file will be analyzed, using the sliders. At the bottom of the screen there are several buttons: 'Simulate', 'Pause', 'Continue', 'Stop', and 'Close'. These buttons, from left to right, allow the user to analyze and track particles, pause the simulation to record the values, resume the simulation, stop the simulation, and exit the simulation. For a flowchart, see Drawing GD-1; the flowchart allows the user to understand the thought process of the program. The only software that will be needed is MATLAB, which will be provided to the company.
The Graphical User Interface was successful in helping the user simulate and analyze both new video and existing images and video. The interface is user friendly and simple, and it provides a way for the user to interact with the code. The program doesn't require the user to do much other than have files available to use and a free hand to click. The coding of the user interface was linked successfully to the underlying code and contains the functions used to simulate and analyze the data.
Cost & Implementation (KB)
As mentioned earlier, the cost of implementing this system is $1,850. This cost assumes that the MATLAB computing environment and an image acquisition device need to be purchased; if either item has already been purchased, the cost could drop, potentially to $0. Because of the academic nature of research into particle transport processes, the given MATLAB price is for an academic and research license. If a license were needed for commercial use, the price would rise to $9,950. The cost was also calculated using the less expensive webcams for the image acquisition device; if a digital SLR or high-speed camera were purchased, the price would rise according to the prices listed in the Recommended Vendors section. Since the system works independently of the river simulators, there is no required down time for implementation.
Conclusions & Recommendations (JS)
In conclusion, after studying several different designs it was determined that the method of
tracking particles optically was the most viable solution. The system consists of analytical software with
a user interface that inspects image and video files.
Although this system can be implemented in its current state, the possibility exists for increased
functionality to be added through additional work. This includes the ability for the system to run as a
standalone program without the need for the MATLAB computing environment as well as the additional
functionality for the image and video analysis portions as mentioned earlier.
As for the sensor-based method, different concepts and principles have yet to be researched and implemented. As recommended by a geology professor during a presentation, introducing magnetite into the system in conjunction with a Hall Effect sensor would be an applicable way of measuring particle movement.
This design process has allowed the MAGNPART team members to improve their engineering skills as well as their research and learning methods. In addition, many soft skills that are important in any workplace or team activity were enhanced. Working with Little River Research & Design has also provided experience in managing client relationships and a unique insight into the field of fluvial geomorphology.
Appendices
A. Drawings
[Drawing KB-1: Software Function Flowchart. Flow: the main function (ParticleTracker) sets up the user interface and branches to Existing Files (File_Callback) or Camera Interface (Capture_Callback); the camera path creates a directory, acquires images, saves them, and compares the last two images taken, while the file path leads to Image Analysis (image_Callback) or Video Analysis (video_Callback). The image path reads two images, adjusts contrast, converts to a numerical type, inverts colors, runs "bpass" and "pkfnd", determines which particles moved, displays them highlighted, concatenates the matrices, runs "track", and writes results to a text file; the video path loads the Simulink model and its user interface controls. Drawn by Kyle Barringer, 4/16/11, Rev. 0.]
[Drawing KB-2: Video Analysis, Simulink Model Flowchart. Flow: Select File; Convert from RGB to Intensity; Optical Flow; Thresholding and Region Filtering (velocity threshold, then Region Filtering with Blob Analysis, bounding box coordinates, and blob count); Display Results (original video, bounding boxes with particle count, motion vector lines, and velocity values printed at bounding box locations). Drawn by Kyle Barringer, 4/16/11, Rev. 0.]
[Drawing KB-3: Simulink model, Particle Tracker (top level): From Multimedia File (the video file to be analyzed, 240x320 at 15.0 fps in the example), R'G'B' to intensity Color Space Conversion, Optical Flow (Lucas-Kanade), Thresholding and Region Filtering, and Display Results. Drawn by Kyle Barringer, 2/18/11, Rev. 0.]
[Drawing KB-4: Simulink model, Thresholding and Region Filtering: Velocity Threshold, Median Filter, Close, and Region Filtering, outputting Count, BBox, and Vel_Coords. Drawn by Kyle Barringer, 2/18/11, Rev. 0.]
[Drawing KB-5: Simulink model, Thresholding and Region Filtering / Velocity Threshold: mean velocity per frame and mean velocity per frame across time. Drawn by Kyle Barringer, 2/18/11, Rev. 0.]
[Drawing KB-6: Simulink model, Thresholding and Region Filtering / Region Filtering: Blob Analysis selecting particles only of a specified size, outputting bounding boxes, blob count, and velocity coordinates. Drawn by Kyle Barringer, 3/21/11, Rev. 0.]
[Drawing KB-7: Simulink model, Display Results: the Original video display; the bounding boxes and particle count display; Optical Flow Lines with Draw Lines for the Motion Vector display; and gain, 2-D to 1-D conversion, and Insert Text for the Velocity display. Drawn by Kyle Barringer, 3/21/11, Rev. 0.]
[Drawing KB-8: Simulink model, Display Results / Display bounding boxes and number of particles: Draw Shapes for the rectangles, a white background rectangle for the count, and Insert Text for the particle count. Drawn by Kyle Barringer, 3/21/11, Rev. 0.]
[Drawing GD-1: Graphical User Interface Flowchart: the Main Screen branches to capturing new video (Start/Stop) or using previous files (Image/Video); each path loads files, lets the user adjust settings, and ends in the Run button or the Simulate/Pause/Continue/Stop controls. Drawn by Geoffrey Daniel, 4/23/11, Rev. 0.]
B. User’s Guide
C. Technical Manual