Intelligent Feature-guided Multi-object
Tracking Using Kalman Filter
MOHANRAJ .S
82006106019
RENGANATHAN.P
82006106030
PRASANNA.V
82006106309
Mrs. S.SUGANTHI M.E., (Ph.D).,
Professor & Head Of the Department
INTELLIGENT
FEATURE - GUIDED
MULTI - OBJECT
TRACKING
USING
KALMAN FILTER
What do you mean by tracking ?
What is Kalman Filter ?
What are the different types of filters available ?
Why do we prefer Kalman Filter in our project ?
What are the conditions observed when objects
are being tracked in real time ?
What is the existing system ?
What is our contribution in this project ?
Scope of the Project
In this project, we propose intelligent feature-guided tracking using Kalman filtering.
• A new method named Correlation-Weighted Histogram Intersection (CWHI) is developed, in which correlation weights are applied to the Histogram Intersection (HI) method (an illustrative sketch follows this slide).
• We focus on multi-object tracking in traffic sequences; our aim is efficient tracking of multiple moving objects under confusing situations.
• The proposed algorithm achieves robust tracking with 97.3% accuracy and 0.07% covariance error in different real-time scenarios.
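Purely to illustrate the idea of weighting histogram intersection by a correlation term, a minimal MATLAB sketch is given below; the function name, the bin count, and the use of corr2 as the weight are illustrative assumptions, not the project's CWHI definition.
% Illustrative correlation-weighted histogram intersection score between two
% grayscale patches of the same size (not the project's exact CWHI formula).
function score = cwhi_score(candidate, reference, nbins)
if nargin < 3, nbins = 32; end
hc = imhist(uint8(candidate), nbins);             % candidate histogram
hr = imhist(uint8(reference), nbins);             % reference histogram
hi = sum(min(hc, hr)) / sum(hr);                  % classic histogram intersection
w  = corr2(double(candidate), double(reference)); % correlation weight in [-1, 1]
score = max(w, 0) * hi;                           % weight HI by the non-negative correlation
end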
Load Video
Frame Conversion
Video Segmentation
Kalman Filter Algorithm
% --- Executes on button press in getvideo.
function getvideo_Callback(hObject, eventdata, handles)
% hObject handle to getvideo (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
clc
global vfilename;
global m;
global sfrom;
[ vfilename, vpathname ] = uigetfile( '*.avi', 'Select an image to segment' );
m=aviread(strcat( vpathname, vfilename ));
sfrom=strcat( vpathname, vfilename );
fileinfo = aviinfo(sfrom);
copyfile(sfrom,'./frame');
vfile=strcat('./frame/',vfilename);
vfile=strcat(vfile, ' ... Frames : ',num2str(fileinfo.NumFrames),' ... ImageType : ',fileinfo.ImageType);
set(handles.axes1,'HandleVisibility','on','Visible','on','Units','pixels');
movie(m);
set(handles.axes1,'HandleVisibility','callback');
We used the function aviread to import the video and the function movie to display it.
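Note that aviread and aviinfo are deprecated and removed in recent MATLAB releases. Assuming a recent release, roughly the same load-and-preview step could be written with VideoReader (variable names are illustrative):
% Sketch of the same step using VideoReader, which replaces aviread/aviinfo.
[vfilename, vpathname] = uigetfile('*.avi', 'Select a video to segment');
vr = VideoReader(fullfile(vpathname, vfilename));
fprintf('Frames (approx.): %d, Size: %dx%d\n', floor(vr.Duration*vr.FrameRate), vr.Width, vr.Height);
while hasFrame(vr)
    imshow(readFrame(vr));   % display each frame in the current figure
    drawnow;
end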
% --- Executes on button press in splitframes.
function splitframes_Callback(hObject, eventdata, handles)
% hObject handle to splitframes (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global sfrom;
global test;
global z;
global mframes;
fileinfo = aviinfo(sfrom);
mframes=fileinfo.NumFrames;
%fileinfo.FramesPerSecond;
mwidth=fileinfo.Width;
mheight=fileinfo.Height;
mframes;
mwidth;
mheight;
%declare array
M_Array=[];
% reading one frame at a time and storing it into an array
for i=1:fileinfo.NumFrames;
mov=aviread(sfrom,i);
M_Array=[M_Array mov];
end
O_Array=[];
%M_Array=[];
test=fileinfo.NumFrames;
y=sqrt(test);
z=round(y)+1;
h=figure;
for j=1:fileinfo.NumFrames
%montage(image,map);%displays the images in new figure
[image,map]=frame2im(M_Array(j));
outfile=strcat(num2str(j),'.jpg');
figure(h),subplot(z,z,j),imshow(image,map);
imwrite(image,outfile);
movefile(outfile,'./frame');
end
% --- Executes on button press in exit.
function exit_Callback(hObject, eventdata, handles)
% hObject handle to exit (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
close all
% --- Executes on button press in next.
function next_Callback(hObject, eventdata, handles)
% hObject handle to next (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
segf;
% --- Executes on button press in segfr.
function segfr_Callback(hObject, eventdata, handles)
% hObject handle to segfr (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global test;
global z;
k=5;
map=3;
for i=1:test
ima=strcat(num2str(i),'.jpg');
ima=strcat('./frame/',ima);
imr=imread(ima);
imr=rgb2gray(imr);
[mu,mask]=fcmeans(imr,k);
axes(handles.axes1);
set(handles.axes1,'HandleVisibility','on','Visible','on','Units','pixels');
imagesc(mask);
set(handles.axes1,'HandleVisibility','callback');
%figure(h),subplot(z,z,i),imagesc(mask),colormap(gray);
end
% --- Executes on button press in features.
function features_Callback(hObject, eventdata, handles)
% hObject handle to features (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global mframes;
global vfs;
wb=waitbar(0,'Please wait...');
vfs=cell(1,mframes);
for i=1:mframes
fna=strcat(num2str(i),'.jpg');
fna=strcat('./frame/',fna);
im=imread(fna);
im=rgb2gray(im);
nper=(i/mframes)*100;
nper1=num2str(round(nper));
nper1=strcat(nper1, '%');
tit=strcat('Find Feature .... ',nper1);
tit=strcat(tit, ' >> File Processing... ');
tit=strcat(tit,fna);
waitbar(nper/100,wb,tit);
F=getfeatures(im);
vfs{i}=F;   % store the feature vector for frame i
end
for i=1:mframes
%struct2cell(vfs(i))
end
close(wb);
% --- Executes on button press in kalman.
function kalman_Callback(hObject, eventdata, handles)
% hObject handle to kalman (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
kalmanf
% --- Executes on button press in kalmanfilter.
function kalmanfilter_Callback(hObject, eventdata, handles)
% hObject handle to kalmanfilter (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
avi = aviread('samplevideo.avi');
video = {avi.cdata};
for a = 1:length(video)
imagesc(video{a});
axis image off
drawnow;
end;
disp('output video');
Kalmantracking(video);
% --- Executes on button press in originalvideo.
function originalvideo_Callback(hObject, eventdata, handles)
% hObject handle to originalvideo (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% FCMEANS clusters the gray levels of an image into k classes using a
% histogram-based k-means iteration; it returns the class centroids (mu)
% and a per-pixel label mask.
function [mu,mask]=fcmeans(ima,k)
ima=double(ima);      % check image
copy=ima;             % make a copy
ima=ima(:);           % vectorize ima
mi=min(ima);          % deal with negative
ima=ima-mi+1;         % and zero values
s=length(ima);
% create image histogram
m=max(ima)+1;
h=zeros(1,m);
hc=zeros(1,m);
for i=1:s
if(ima(i)>0) h(ima(i))=h(ima(i))+1;end;
end
ind=find(h);
hl=length(ind);
% initiate centroids
mu=(1:k)*m/(k+1);
% start process
while(true)
oldmu=mu;
% current classification
for i=1:hl
c=abs(ind(i)-mu);
cc=find(c==min(c));
hc(ind(i))=cc(1);
end
%recalculation of means
for i=1:k,
a=find(hc==i);
mu(i)=sum(a.*h(a))/sum(h(a));
end
if(mu==oldmu) break;end;
end
% calculate mask
s=size(copy);
mask=zeros(s);
for i=1:s(1),
for j=1:s(2),
c=abs(copy(i,j)-mu);
a=find(c==min(c));
mask(i,j)=a(1);
end
end
mu=mu+mi-1; % recover real range
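For reference, a minimal usage sketch of this routine on one of the extracted frames; the file name mirrors the './frame/<n>.jpg' convention and k = 5 mirrors segfr_Callback above.
im = rgb2gray(imread('./frame/1.jpg'));   % one of the frames written by splitframes
[mu, mask] = fcmeans(im, 5);              % mu: class centroids, mask: per-pixel labels
figure, imagesc(mask), colormap(gray), axis image off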
% function GETFEATURES extracts features from the pre-processed image
% input:  Xp - pre-processed image obtained by calling function process_image
% output: F  - a five-dimensional feature vector
function F = getfeatures(X)
global minpix
global rangepix
global sum_edge
global sum_locvar
global numObjects
% pre-processing
%Xp = process_image(X);
Xp = X;
% extracting features
minpix = min_pixval(Xp);
rangepix = range_pixval(Xp);
sum_edge = sumof_edge(Xp);
sum_locvar = sumof_localvar(Xp);
numObjects = numberofobjects(Xp);
%correl=correlation(Xp);
%entro=entropy('tumor.jpg');
%contra=contrast(Xp);
%idm1=idm(Xp);
%homog=homogeneity('tumor.jpg');
%histo=sum(sum(histogram(Xp)));
F = [minpix;rangepix;sum_edge;sum_locvar;numObjects];
features.v=minpix;
features.v1=rangepix;
features.v2=sum_edge;
features.v3=sum_locvar;
features.v4=numObjects;
save(['features.mat'], 'features')
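As a simple illustration (not the project's CWHI matching step), the feature vectors of two consecutive frames could be compared like this; the file names follow the './frame/<n>.jpg' convention used above and the Euclidean distance is only an example metric.
F1 = getfeatures(rgb2gray(imread('./frame/1.jpg')));
F2 = getfeatures(rgb2gray(imread('./frame/2.jpg')));
d  = norm(F1 - F2);   % example distance between the two 5-D feature vectors
fprintf('Feature distance between frames 1 and 2: %.2f\n', d);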
% this function calculates the minimum pixel intensity value
% for each image after pre-processing to remove shadow
function mp = min_pixval(X3)
global minpix
% waitbar(10);
X = double(X3);
[m, n]=size(X);
% histogram of pixel intensity of the original image
[NX,bX] = hist(reshape(X,m*n,1));
% num_thres = 5000;
% mp = min(bX(find(NX >= num_thres)));
mp = min(bX(find(NX)));
% this function calculates the minimum pixel intensity value
% for each image after pre-processing to remove shadow
function mp = min_pixval(X3)
global mp
global rangepix
waitbar(10);
X = double(X3);
% histogram of pixel intensity of original and shadow-removed images
[NX,bX] = hist(reshape(X,size(X,1)*size(X,2),1),100);
num_thres = 5000;
mp = min(bX(find(NX >= num_thres)));
mp = min(bX(find(NX)));
function numObjects = numberofobjects(X3)
global numObjects
% estimate number of objects in image ...
%waitbar(50);
bw = bwareaopen(im2bw(X3,0.25),30);
bw = imfill(bw,'holes');
[labeled,numObjects] = bwlabel(bw,8);
% figure; imagesc(labeled);
% numObjects
% this function detects edges in the pre-processed image
% using Canny's method and calculates the global sum of the edge pixels
function SoE = sumof_edge(X3)
global SoE
global sumof_edge
%waitbar(30);
Ed = edge(X3,'canny',0.1); % detect edges
SoE = sum(sum(Ed));
% this function calculates the global sum of local variance
% for each image after pre-processing to remove shadow
function SoLV = sumof_localvar(X3)
global sum_locvar
global numObjects
%waitbar(40);
X = double(X3);
% local variance of original and shadow-removed images
M = size(X,1)-1;
N = size(X,2)-1;
for i = 2:M
for j = 2:N
SX(i,j) = std(reshape(X(i-1:i+1,(j-1):(j+1)),9,1));
end
end
SoLV = sum(sum(SX.^2));
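Where the Image Processing Toolbox is available, roughly the same quantity can be computed without the explicit loop; border handling differs, so values near the image edge may not match the loop above exactly.
% Vectorized alternative using stdfilt, which returns the local standard
% deviation over a 3x3 neighbourhood; squaring and summing gives the global
% sum of local variance.
S = stdfilt(double(X3), true(3));
SoLV_fast = sum(S(:).^2);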
% function [video, audio] = mmread(filename, frames, time, disableVideo, disableAudio, matlabCommand, trySeeking, useFFGRAB)
function video = mmread(filename)
% [video, audio] = mmread(filename, frames, time, disableVideo,
%                         disableAudio, matlabCommand, trySeeking, useFFGRAB)
% mmread reads virtually any media file. It now uses AVbin and FFmpeg to
% capture the data, this includes URLs. The code supports all major OSs
% and architectures that Matlab runs on.
function processFrame(data,width,height,frameNr,time)
function d = Kalmantracking(video)
if ischar(video)
% Load the video from an avi file.
avi = aviread(video);
pixels = double(cat(4,avi(1:2:end).cdata))/255;   % keep every second frame
clear avi
else
% Compile the pixel data into a single array
pixels = double(cat(4,video{1:2:end}))/255;       % keep every second frame
clear video
end
% Convert the RGB frames to grayscale.
nFrames = size(pixels,4);
for f = 1:nFrames
pixel(:,:,f) = (rgb2gray(pixels(:,:,:,f)));
end
rows=240;    % frame height assumed by the scanning loops below
cols=320;    % frame width assumed by the scanning loops below
nrames=f;    % number of retained frames
for l = 2:nrames
d(:,:,l)=(abs(pixel(:,:,l)-pixel(:,:,l-1)));   % difference between consecutive frames
k=d(:,:,l);
bw(:,:,l) = im2bw(k, .2);                      % threshold the difference image
bw1=bwlabel(bw(:,:,l));
imshow(bw(:,:,l))
hold on
cou=1;
% scan rows from the top to find the topmost (tpln) and bottommost (toplen) foreground rows
for h=1:rows
for w=1:cols
if(bw(h,w,l)>0.5)
toplen = h;
if (cou == 1)
tpln=toplen;
end
cou=cou+1;
break
end
end
end
disp(toplen);
coun=1;
% scan columns from the left to find the leftmost (lftln) and rightmost (leftsi) foreground columns
for w=1:cols
for h=1:rows
if(bw(h,w,l)>0.5)
leftsi = w;
if (coun == 1)
lftln=leftsi;
coun=coun+1;
end
break
end
end
end
disp(leftsi);
disp(lftln);
widh=leftsi-lftln;            % bounding-box width
heig=toplen-tpln;             % bounding-box height
widt=widh/2;
disp(widt);
heit=heig/2;
with=lftln+widt;              % centroid x (column)
heth=tpln+heit;               % centroid y (row)
wth(l)=with;
hth(l)=heth;
disp(heit);
disp(widh);
disp(heig);
rectangle('Position',[lftln tpln widh heig],'EdgeColor','r');
disp(with);
disp(heth);
plot(with,heth, 'r*');        % mark the measured centroid
drawnow;
hold off
end;
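The loop above only measures the centroid of the moving region in each frame. A minimal sketch of how those measurements (wth, hth) could drive a constant-velocity Kalman filter, run after the loop, is shown below; the matrices and noise values are illustrative assumptions, not the project's tuned parameters.
% Constant-velocity Kalman filter over centroid measurements z = [x; y].
% State s = [x; y; vx; vy]. Q, R and the initial covariance are illustrative.
dt = 1;                                        % one step per retained frame
A  = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1];   % state transition
H  = [1 0 0 0; 0 1 0 0];                       % we observe position only
Q  = 0.01*eye(4);  R = eye(2);  P = eye(4);
s  = [wth(2); hth(2); 0; 0];                   % initialise from the first measurement
for l = 3:nrames
    % predict
    s = A*s;            P = A*P*A' + Q;
    % update with the measured centroid of frame l
    z = [wth(l); hth(l)];
    K = P*H'/(H*P*H' + R);                     % Kalman gain
    s = s + K*(z - H*s);
    P = (eye(4) - K*H)*P;
    % s(1:2) is the filtered object position for frame l
end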
function video = dis(avi)
clear data
disp('input video');
avi = aviread('samplevideo.avi');
video = {avi.cdata};
for a = 1:length(video)
imagesc(video{a});
axis image off
drawnow;
end;
disp('output video');
tracking(video);
Load Video
Frame Conversion
Video Segmentation
Kalman Filter
Video Segmentation :
• Segmentation refers to the process of partitioning a video frame into
multiple segments (sets of pixels), also known as superpixels.
• Segmentation divides an image into its constituent regions or objects.
• Segmentation of non-trivial images is one of the most difficult tasks in
image processing and is still an active research area.
• Segmentation accuracy determines the eventual success or failure of the
computerized analysis procedure.
Kalman Filter Algorithm :
• The Kalman filter is a recursive estimator: only the estimated state from
the previous time step and the current measurement are needed to compute
the estimate for the current state (a minimal numeric sketch follows).
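A minimal scalar sketch of that recursion is given below; the noise variances and the measurements are illustrative values only, not taken from the project.
% Scalar Kalman filter: each step uses only the previous estimate (x, p)
% and the current measurement z(k); no earlier measurements are revisited.
q = 0.01; r = 0.5;          % illustrative process / measurement noise variances
x = 0;    p = 1;            % initial state estimate and its variance
z = [1.1 0.9 1.2 1.0 0.8];  % example measurements of a constant quantity
for k = 1:numel(z)
    p = p + q;              % predict (state is assumed constant)
    K = p / (p + r);        % Kalman gain
    x = x + K*(z(k) - x);   % update with the current measurement only
    p = (1 - K)*p;
    fprintf('step %d: estimate = %.3f\n', k, x);
end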
Good tracking of moving objects under confusing conditions.
Makes it easy to detect unauthorized persons.
Saves processing time.
Good performance.
Distinctive techniques.
• Kalman filters and related estimators are used to track users' heads and
limbs in virtual environments.
• The Kalman filter has been used extensively for data fusion in navigation.
• Discovery of the Kalman filter as a practical tool for aerospace and industry.
• The Kalman filter: its recognition and development for aerospace applications.
• In this work, a new approach is proposed for tracking multiple moving
objects in confusing states (i.e., inter-object occlusion and separation)
using a Kalman filter and a CWHI-based algorithm.
• We exploited the Kalman filter together with the proposed CWHI-based
algorithm.
• Each moving object is assigned an individual Kalman tracker which is
assisted by a "manager" during the entire tracking process (a minimal
sketch of such a tracker manager follows).
The proposed approach has shown good performance when applied to several
videos under confusing situations.
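A minimal sketch of how such a per-object tracker pool might be organised is given below; the struct fields, the nearest-neighbour assignment rule, and the function name assign_detections are illustrative assumptions, not the project's CWHI-assisted implementation.
% Hypothetical "manager": one Kalman tracker per object; each detected
% centroid is assigned to the tracker with the closest predicted position.
function trackers = assign_detections(trackers, detections, A, H, Q, R)
% trackers  : struct array with fields s (4x1 state) and P (4x4 covariance)
% detections: 2xN matrix of measured centroids [x; y] for the current frame
for t = 1:numel(trackers)
    trackers(t).s = A*trackers(t).s;                 % predict each tracker
    trackers(t).P = A*trackers(t).P*A' + Q;
end
for n = 1:size(detections,2)
    z = detections(:,n);
    % pick the tracker whose predicted position is closest to this detection
    dist = arrayfun(@(t) norm(z - H*t.s), trackers);
    [~, t] = min(dist);
    K = trackers(t).P*H'/(H*trackers(t).P*H' + R);   % update the chosen tracker
    trackers(t).s = trackers(t).s + K*(z - H*trackers(t).s);
    trackers(t).P = (eye(4) - K*H)*trackers(t).P;
end
end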
The future work will be focused on two directions.
First, histogram methods will be investigated further.
Second, more advanced tracking techniques such as the Extended Kalman filter
and the Particle filter will be exploited to handle more complex real-time
scenarios.
TRANSMIT your QUERIES ???
AREA to prove our TALENTS !!!
We need your moral support & blessings till
we reach our destiny……….