Presentation - Computer Vision and Robotics Laboratory

Machine Vision Analysis of the Energy Efficiency of
Intermodal Freight Trains: Sibley Site Update
Chris Barkan and Narendra Ahuja* - Co-Principal Investigators
John M. Hart* - Senior Research Engineer, Riley Edwards - Lecturer
Graduate Students: Tristan Rickett, Avinash Kumar*
Undergraduate Students: Ruben Zhao*, Sujay Bhobe*, Chun Yang, Suchithra Gopalakrishnan*, Phil Hyma, Mike Wnek
*Beckman Institute for Advanced Science and Technology
Railroad Engineering Program, University of Illinois
Wayside Machine Vision System
[System overview: IM Train → Image Acquisition System → Machine Vision Algorithms → Data Analysis System → To BNSF]
Sibley Site Equipment Layout
Machine Vision Site Equipment
• Camera Enclosure on Tower to House CCD Camera
• Bungalow Containing Computers and Train Detection Circuitry
• 30 ft Tower for Set of Halogen Lights and Wireless Antennas
[Photo labels: CCD Video Camera, Halogen Lighting, Machine Vision Processing Equipment]
Train Detection System
[Block diagram: Presence Detectors, Wheel Detectors, Loop Detectors, Daylight Sensor, Programmable Logic Controller, Train Status Monitor, Artificial Lighting, Machine Vision Computer]
Machine Vision Site Equipment
[Photos: CCD Video Camera, Halogen Lighting, Machine Vision Processing Equipment, 19" Equipment Rack in Bungalow, Machine Vision Computer, Train Detection Sensors, Train Status Monitor]
Site Activation Scenario
• System in Wait State and Continuously Monitoring Train Detectors
• Train Approaches Machine Vision Site (West/East Bound)
• Locomotive Triggers (East/West) Presence Detector
• Lights are Turned On If Photocell Does Not Detect Daylight
• Camera Turns On and Focuses Region of Interest on Target
• Exposure Adjustment is Made Based on Target Only
• Camera Region of Interest is Returned to Entire Scene
• Locomotive Triggers (East/West) Wheel Detector
• Video Recording is Started to Capture Background Appearance
• Train Passes by Camera and is Recorded Against Background
• Video is Taken at 30 f/s and Buffered to Memory
• Video Recording is Ended and Video Frames are Stored
• If Lighting was Used, Lights are Turned Off
• System Returns to Wait State for Next Train Arrival
Details of Activation Scenario
• System in Wait State Continuously Monitoring Train Detectors
– The Train Status Monitor (TSM) checks the state of the detectors
– Uses a data acquisition board inside the PC connected to all detectors through the Programmable Logic Controller (PLC)
• West Presence Detector
• West Wheel Detector
• West Loop Detector
• East Loop Detector
• East Wheel Detector
• East Presence Detector
– Controlled by a custom program - DIControl
• Monitoring rate is adjustable (currently 1/10 sec); see the polling sketch below
• When not looking at detectors, control of processor is returned to OS
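DIControl itself is custom in-house software; the following is only a minimal Python sketch of the polling pattern described above. The read_inputs() stub is hypothetical and stands in for the read through the DAQ board and PLC.

    import random
    import time

    POLL_INTERVAL_S = 0.1  # adjustable monitoring rate (currently 1/10 sec)

    DETECTORS = [
        "west_presence", "west_wheel", "west_loop",
        "east_loop", "east_wheel", "east_presence",
    ]

    def read_inputs():
        """Hypothetical stand-in for reading the six detector lines through
        the PLC and the data acquisition board; here it randomly reports a trigger."""
        return {name: random.random() < 0.01 for name in DETECTORS}

    def wait_for_train():
        """Poll the detectors until any one of them goes active."""
        while True:
            states = read_inputs()
            if any(states.values()):
                return states            # hand off to the activation sequence
            time.sleep(POLL_INTERVAL_S)  # return control to the OS between polls

    if __name__ == "__main__":
        triggered = wait_for_train()
        print("Triggered:", [name for name, state in triggered.items() if state])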
Details of Activation Scenario
• Train Approaches Machine Vision Site (West/East Bound)
• Locomotive Triggers (East/West) Presence Detector
– Presence detector pulse is received by the PLC
– Presence detector pulse is also captured by the Train Status Monitor
Details of Activation Scenario
• Lights are Turned On If Photocell Does Not Detect Daylight
– PLC enables the power to the lights
– If the photocell is detecting light, it inhibits the power signal to the lights
– Lights are now staggered to distribute lighting more evenly (initial config)
Details of Activation Scenario
• Camera Turns On and Focuses Region of Interest on Target
– Camera is started by custom software – pgrAperture
– Because video frames of the background are needed prior to the train, the camera must adjust exposure without the presence of the train
– The target is designed to reflect light similarly to the side of the train
– The camera view is then restricted to the region of the target only
Details of Activation Scenario
• Exposure Adjustment is Made Based on Target Only
– The camera parameters are allowed to adjust to the lighting on the target, with the exception of the shutter speed
– The shutter speed is set to a value (2 ms) determined experimentally
– Prevents image motion blur due to the moving train (normal camera below)
• Camera Region of Interest is Returned to Entire Scene
– Before train reaches wheel detector
Details of Activation Scenario
• Locomotive Triggers (East/West) Wheel Detector
– Wheel detector pulse is captured by the Train Status Monitor
– Wheel detector is placed 75 ft from the camera so recording starts prior to the train
• Video Recording is Started to Capture Background Appearance
– The TMS uses these frames to create a model of the background
Details of Activation Scenario
• Train Passes by Camera and Is Recorded Against Background
– With exposure set by target, train should not appear dark even if
background is bright
Details of Activation Scenario
• Video is Taken at 30 f/s and Buffered to Memory
– To continuously capture at 30 f/s, frames are buffered in memory before being converted to video (see the sketch after this list)
• Video Recording is Ended and Video Frames are Stored
– Videos are stored in multiple 1 GB segments because of operating system requirements
• If Lighting was Used, Lights are Turned Off
• System Returns to Wait State for Next Train Arrival
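A minimal sketch of the buffer-then-store pattern described above. The grab_frame, train_present, and write_segment callables are hypothetical stand-ins, not the site software.

    from collections import deque

    SEGMENT_BYTES = 1 * 1024**3   # ~1 GB per stored video segment

    def record_train(grab_frame, train_present, write_segment):
        """Buffer 30 f/s frames in memory while the train passes, then store them.

        grab_frame()    -> bytes of one 640x480 frame (hypothetical camera call)
        train_present() -> True while the train detectors report a passing train
        write_segment(index, frames) -> persists one list of frames to disk
        """
        buffer = deque()
        while train_present():
            buffer.append(grab_frame())   # defer disk I/O so capture keeps up with 30 f/s

        # Recording has ended: flush the buffered frames in ~1 GB segments.
        segment, size, index = [], 0, 0
        for frame in buffer:
            segment.append(frame)
            size += len(frame)
            if size >= SEGMENT_BYTES:
                write_segment(index, segment)
                segment, size, index = [], 0, index + 1
        if segment:
            write_segment(index, segment)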
Now Testing System Automation
[Automation flow diagram: Video Acquisition, Video Storage, Train Monitoring System, Train Panorama, Gap Measurements, Train Score, Load Identification, AEI Reader Data, Train Scoring System, Mini-UMLER Database, Loading Evaluation]
Demo in Computer Vision and Robotics Lab of Duplicate Image Acquisition Computer
Adjusting to Ambient Lighting Conditions and Recording Video
Camera Line of Sight
[Diagram: camera, viewing volume, intermodal train]
Train Monitoring System
• Input: A video of an intermodal freight train
[Our Machine Vision System]
• Output: Lengths of the gaps between the loads
– Improve aerodynamic efficiency of the train
– Large savings on fuel costs
Challenge #1
• Varying outdoor imaging conditions
Challenge #2
• Different Types of Containers
Challenge #3
• Computations involved need to be fast to handle railroad traffic (see the sketch below)
– 1 day has 20-30 trains across both sites
– 1 train is completely captured in approx. 5000 frames
– 1 frame is 640x480 pixels
– Need to process all frames
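To make the throughput requirement concrete, here is a quick back-of-the-envelope calculation from the figures above (the 8-bit grayscale assumption is ours):

    FRAMES_PER_TRAIN = 5000
    FRAME_PIXELS = 640 * 480
    TRAINS_PER_DAY = 30                    # upper end of the 20-30 trains/day figure

    pixels_per_train = FRAMES_PER_TRAIN * FRAME_PIXELS   # ~1.5 billion pixels
    bytes_per_train = pixels_per_train                   # assuming 8-bit grayscale
    pixels_per_day = pixels_per_train * TRAINS_PER_DAY

    print(f"pixels per train: {pixels_per_train:,}")                # 1,536,000,000
    print(f"raw grayscale video per train: ~{bytes_per_train / 1e9:.1f} GB")
    print(f"pixels per day across both sites: {pixels_per_day:,}")  # 46,080,000,000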
Method: Step 1
• Estimate initial train velocity in pixel shifts/frame
[Figure: the image shifts by v pixels between frames, from x to x+v]
Method: Step 1
1. Select a square window and calculate normalized cross
correlation with the static background : C_background
Method: Step 1
2. Select another window at location x + v in the previous
frame
Method: Step 1
3. Calculate Normalized Cross Correlation between these two
windows as C_previous
Method: Step 1
4. Similarly calculate normalized cross correlation between
current frame and next frame as C_next
Method: Step 1
Calculate Foreground Cost = (C_previous + C_next –
C_background)/4
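A minimal NumPy sketch of the foreground cost defined above. The window size, the function and variable names, and the sign of the offsets (x+v into the previous frame, x-v into the next, following the slide figures) are illustrative and depend on the direction of travel; only the cost formula itself is taken from the slide.

    import numpy as np

    def ncc(a, b):
        """Normalized cross correlation of two equally sized image windows (range [-1, 1])."""
        a = a.astype(np.float64) - a.mean()
        b = b.astype(np.float64) - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def foreground_cost(prev_frame, cur_frame, next_frame, background, x, y, v, w=32):
        """Foreground cost of the w x w window at column x, row y of the current frame.

        C_previous   : NCC with the window at x+v in the previous frame
        C_next       : NCC with the window at x-v in the next frame
        C_background : NCC with the static background at x
        Cost = (C_previous + C_next - C_background) / 4, as on the slide.
        """
        win = cur_frame[y:y + w, x:x + w]
        c_prev = ncc(win, prev_frame[y:y + w, x + v:x + v + w])
        c_next = ncc(win, next_frame[y:y + w, x - v:x - v + w])
        c_back = ncc(win, background[y:y + w, x:x + w])
        return (c_prev + c_next - c_back) / 4.0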
Method: Step 1
• Extract foreground region from a stripe at the
center of each train frame
[Figure: center stripe of a frame labeled into Background and Foreground regions]
Method: Step 1
• Repeat for consecutive frames
Method: Step 2
• Juxtapose stripes from consecutive frames to
generate panorama
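One way Step 2 can be realized in NumPy, assuming the per-frame velocity estimates from Step 1; the stripe placement and function name are illustrative, not the system's actual code.

    import numpy as np

    def build_panorama(frames, velocities, stripe_center=320):
        """Juxtapose a v-pixel-wide stripe from the center of each frame.

        frames     : list of 2-D grayscale arrays (e.g. 480x640)
        velocities : per-frame train velocity in pixels/frame (from Step 1)
        The stripe taken from each frame is as wide as the shift between frames,
        so concatenating the stripes reconstructs the moving train as a panorama.
        """
        stripes = []
        for frame, v in zip(frames, velocities):
            v = max(int(round(v)), 1)
            half = v // 2
            stripes.append(frame[:, stripe_center - half:stripe_center - half + v])
        return np.hstack(stripes)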
Method: Step 2
• Post process panorama to remove background
near edges
Method: Step 3
• Classify each container into 3 different types
[Examples: double stacks of two different kinds, a single stack, and a trailer]
Method: Step 4
• Obtain gap lengths and histogram for analysis
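A small sketch of the histogram step, using made-up gap measurements; the bin width and the values are placeholders only.

    import numpy as np

    def gap_histogram(gap_lengths_ft, bin_width_ft=5):
        """Histogram of measured gap lengths (in feet) for loading analysis."""
        gaps = np.asarray(gap_lengths_ft, dtype=float)
        bins = np.arange(0, gaps.max() + bin_width_ft, bin_width_ft)
        counts, edges = np.histogram(gaps, bins=bins)
        return counts, edges

    # Example with made-up gap measurements (feet):
    counts, edges = gap_histogram([3.2, 7.8, 12.5, 4.1, 9.9, 45.0])
    for c, lo, hi in zip(counts, edges[:-1], edges[1:]):
        print(f"{lo:4.0f}-{hi:4.0f} ft: {c}")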
Results
• Tested on 110 train videos with 3 different types of
containers
– 573 Type 1 (Double Stack) containers
– 515 Type 2 (Trailers) containers
– 10 Type 3 (Single Stack) containers
• Gap detection is accurate to within approximately 1 ft
• Confusion matrix for load type detection
             Type 1   Type 2   Type 3
    Type 1      573        5        0
    Type 2        0      515        0
    Type 3        0        0       10
Data Analysis System
Tristan Rickett
Outline
• Train Resistance
• Train Scoring System
– Description
– Inputs: AEI, TMS, UMLER
• AEI
– Current Setup at LPC and Sibley
– Using Available AEI Data for Sibley Videos
• Data Transfer
– Matching TMS file with AEI data
• Future Work
Train Resistance
• Train Resistance considers the effects of
inertia that tend to keep a body at rest and
the effects of friction that cause it to lose
momentum once moving
• The general equation for train resistance is the following: R = AW + BV + CV² (see the sketch below)
– A = Journal Resistance
– B = Flange Resistance
– C = Aerodynamic Resistance
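A minimal sketch of evaluating the resistance equation above. The coefficient values, the units, and the exact meaning of W (weight) and V (speed) depend on the specific Davis-type formulation in use, so the numbers below are placeholders only.

    def train_resistance(A, B, C, W, V):
        """Total train resistance R = A*W + B*V + C*V**2.

        A: journal (bearing) resistance coefficient
        B: flange resistance coefficient
        C: aerodynamic resistance coefficient
        W: train weight, V: train speed (units must match the coefficients)
        """
        return A * W + B * V + C * V ** 2

    # Placeholder coefficient values purely for illustration:
    print(train_resistance(A=1.3, B=0.045, C=0.0005, W=4000, V=50))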
Sources of Aerodynamic Drag
• Gap lengths
• Varying heights
• Rough surface
• Drag area of the lead locomotive
• Lack of streamlining
Current practice in intermodal freight
train loading
• Slot Utilization is a metric used to measure the percentage of the spaces (a.k.a. slots) on intermodal cars that are used for loads
– However, this metric does not account for the size of
the slot and the size of the load
8 loads / 10 slots = 80% Slot Utilization
10 loads / 10 slots = 100% Slot Utilization
Y.C. Lai 2008
Slot Efficiency Methodology
• Slot Efficiency: a comparison of the actual and the ideal loading configuration
– This metric is similar to slot utilization except
that it also considers the energy efficiency of
the load-slot combination
Slot Efficiency = (Length of Actual Load / Length of Ideal Load) × 100%

Slot Efficiency = (48' / 53') × 100% = 90.6%
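A small sketch of the same calculation, reproducing the 48 ft load / 53 ft ideal-load example above; the function name is illustrative.

    def slot_efficiency(actual_load_ft, ideal_load_ft):
        """Slot efficiency = actual load length / ideal load length, as a percentage."""
        return actual_load_ft / ideal_load_ft * 100.0

    # Example from the slide: a 48 ft load where the ideal load is 53 ft.
    print(f"{slot_efficiency(48, 53):.1f}%")   # -> 90.6%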
Train Scoring System (TSS)
• The purpose of the train
scoring system is to
evaluate an intermodal
train’s loading efficiency
and provide an
aerodynamic coefficient
to estimate fuel
consumption
• The results from the TSS
can aid terminal
managers in creating
more fuel-efficient trains
Flow of TSS
TSS Inputs
• The Mini-UMLER Database contains all the railcars and their equipment
• Gap-length files contain the train’s
loadings and the gap lengths
• AEI (Automatic Equipment
Identification) data provides a list of the
train’s equipment and axle timestamps
Mini-UMLER Database
• The information contained in the database
includes the following:
– Car Initial (e.g. DTTX)
– Car Number (e.g. 749452)
– Car’s Outside length in feet (e.g. 270 ft)
– Car Type (e.g. S)
– Car Attribute 1 (e.g. 1)
– Car Attribute 2 (e.g. 6)
– Car Attribute 3 (e.g. 2)
Progress Made
• Improved how the code produces the
output
– It is now embeddable so that it can run from
inside another program
• Formatted a newer UMLER database
• Integrated TSS with the proposed system
automation
AEI Data Collection at LPC
• The TSS was originally programmed to
use AEI that had axle timestamp values
like the PRT AEI reader at LPC
• At Sibley, we have been collecting videos since last December, but the problem is that the AEI data does not have timestamps
Addressing Present AEI Data
Acquisition
• If the hot-box data is available, it would be
worth calculating our own timestamps
using this available data
• It is recommended that the new AEI reader for the Sibley site provide axle timestamps
Determination of Axle
Timestamps
• Using kinematics equations and some
assumptions, we can determine the
timestamps.
• Using d_i = v_i t_i + 0.5 a_i t_i²
– Distance, d_i, is provided in the AEI data
– Assume velocity is around 20 to 25 mph (or
29.3 to 36.7 ft/sec)
– No acceleration
Determination of Axle
Timestamps
• Having an acceleration of zero cancels out half of the equation, giving t_i = d_i / v_i
• Because axle timestamp values are cumulative, the final equation will be
– t_i = t_(i-1) + d_i / v_i (see the sketch below)
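A minimal sketch of the cumulative-timestamp calculation above, assuming a constant speed and hypothetical axle spacings; the values are placeholders only.

    def axle_timestamps(axle_spacings_ft, speed_ft_per_s=33.0):
        """Cumulative axle timestamps from AEI axle spacings, assuming constant speed.

        axle_spacings_ft : distance d_i from the previous axle to axle i (feet)
        speed_ft_per_s   : assumed train speed; 20-25 mph is roughly 29.3-36.7 ft/s
        Returns t_i = t_(i-1) + d_i / v for each axle, with t = 0 at the first axle.
        """
        timestamps = []
        t = 0.0
        for d in axle_spacings_ft:
            t += d / speed_ft_per_s     # zero acceleration, so t_i = t_(i-1) + d_i / v_i
            timestamps.append(t)
        return timestamps

    # Hypothetical spacings (ft) for a locomotive's first few axles:
    print(axle_timestamps([0.0, 6.8, 6.8, 40.0, 6.8, 6.8]))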
Using a Wheel Detector for
Timestamps
• Use one wheel detector already installed
at the site to measure axle timestamp
values.
• The system would be triggered by one of the presence detectors
• The difficulty is finding a place in the
automation where the AEI data can be
combined
Matching the Scoring Data with CAD
Data
• All videos and AEI data are named according to the date and time at which they were captured (see the sketch below)
• With the date of the scored train, it can also
be attached to computer-aided dispatching
data so terminal managers can review the
efficiency of their trains loaded at their yard
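A sketch of matching records by capture time, assuming (purely for illustration) that both file names encode a "YYYYMMDD_HHMMSS" timestamp; the real naming convention and the matching tolerance may differ.

    from datetime import datetime

    # Assumed filename convention for illustration only, e.g. "20100312_143055".
    FMT = "%Y%m%d_%H%M%S"

    def match_by_timestamp(video_names, aei_names, tolerance_s=600):
        """Pair each video with the AEI record captured closest in time."""
        aei_times = [(name, datetime.strptime(name, FMT)) for name in aei_names]
        pairs = {}
        for v in video_names:
            vt = datetime.strptime(v, FMT)
            best, best_dt = min(
                ((a, abs((at - vt).total_seconds())) for a, at in aei_times),
                key=lambda item: item[1],
            )
            if best_dt <= tolerance_s:
                pairs[v] = best
        return pairs

    print(match_by_timestamp(["20100312_143055"], ["20100312_143212", "20100312_170001"]))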
Matching Data
Acknowledgements
• Special Thanks to:
• BNSF
– Paul Gabler, Hank Lees, Josh McBain, Larry
Milhon, Cory Pasta, and Mark Stehly
• LJN and Associates
– Leonard Nettles and Kevin Clarke
Interdisciplinary Team Members
• From previous presentation….