EE630: CAMERA MODEL
Hassebrook 2009
The goal of this visualization is to create a camera chip and lens model in MATLAB. Use a
different image for your experiment. Show the results for each step, similar to this write-up. Don't
forget to show your point spread function, and go ahead and crop out regions for more detail.
Turn in your code and a write-up with figures. You may want to blur your image more heavily so
the blurring is more readily apparent.
The model is a simple one and is not detailed enough to be specific to a particular chip
technology. However, it does provide dead regions between pixel sensors. The lens model is
limited to blurring based on a Gaussian point spread function. The model should be useful in
modeling typical blurring and image capture problems and can be enhanced for more specific
technologies.
STEP 1: CREATE CAMERA FOOTPRINT
Define a pixel footprint to be Mysample x Nxsample {5x5} samples. The active area is Myactive x
Nxactive {3x3}, as shown in Fig. 1. You can use irect.m and fftshift, if you wish, or write your own
rectangular function that will give you the pixel footprint shown in Fig. 1; a sketch of one possible
construction is given after the figure.
Figure 1: Pixel Footprint.
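If you do not have irect.m handy, the following is a minimal sketch of one way to build the pixel
footprint. The variable names Pix2, Mysample, and Nxsample match the code later in this write-up,
and Myactive and Nxactive follow the text above; the centering arithmetic is an assumption, and your
own rectangular function or irect.m with fftshift will work just as well.
% Sketch: 5x5 pixel footprint with a centered 3x3 active area (assumed construction)
Mysample=5; Nxsample=5; % full pixel footprint in samples
Myactive=3; Nxactive=3; % active (light-sensitive) area in samples
Pix2=zeros(Mysample,Nxsample);
my0=floor((Mysample-Myactive)/2)+1; % top row of the centered active area
nx0=floor((Nxsample-Nxactive)/2)+1; % left column of the centered active area
Pix2(my0:my0+Myactive-1,nx0:nx0+Nxactive-1)=1;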
Create the camera or chip footprint from the pixel footprint by using the kron (i.e., Kronecker
product) function.
B=ones(Mypixel,Nxpixel); % one entry per pixel on the chip
chip=kron(B,Pix2); % tile the pixel footprint across the chip
where {Mypixel, Nxpixel} is {48,64}. Use {480, 640} values for the visualization.
Figure 2: Camera chip footprint.
The size of the image in Fig. 2 is
Nx=Nxsample*Nxpixel; % size of entire field
My=Mysample*Mypixel;
STEP 2: INPUT IMAGE
A_jpg=double(imread('honduras14july09_1177.jpg','jpg')); % load image
Ar=A_jpg(:,:,1); % red channel
Ag=A_jpg(:,:,2); % green channel
Ab=A_jpg(:,:,3); % blue channel
A=(Ar+Ag+Ab)/3; % average the channels to form a grayscale image
Figure 3: Chris and Eli in Honduras looking at features on a boulder.
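The later steps index the grayscale image as A(1:My,1:Nx), so the image must be at least My x Nx.
If you use your own image, a quick check such as the sketch below can catch a size mismatch; the
imresize call is only a suggested fix and requires the Image Processing Toolbox.
% Sketch: make sure the input image covers the chip field
[Mimg,Nimg]=size(A);
if Mimg<My || Nimg<Nx
  A=imresize(A,[My Nx]); % stretch the image to the chip field size
end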
STEP 3: APPLY LENS BLURRING
rx=3;ry=rx; % point spread diameter
x=0:Nx-1;x=x-Nx/2; % centered x coordinates
y=0:My-1;y=y-My/2; % centered y coordinates
h=gaperture(rx,ry,x,y); % creates a Gaussian point spread function
h=real(fftshift(h)); % shift the centered point spread function to the array corners for FFT-based convolution
% BLUR FIELD
B(1:My,1:Nx)=A(1:My,1:Nx); % copy (crop) the grayscale image to the chip field size
B=real(ifft2(fft2(B).*conj(fft2(h)))); % blur by circular convolution with the point spread function
Figure 4: Blurred image.
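The helper gaperture.m is a course function and is not listed in this write-up. A minimal stand-in,
saved as gaperture.m, might look like the sketch below; the Gaussian form and the role of rx and ry
as spread parameters are assumptions, and the actual course function may normalize or scale
differently (which is why real() is applied after the FFTs).
function h=gaperture(rx,ry,x,y)
% Sketch of a Gaussian point spread function (assumed interface)
% x and y are centered coordinate vectors; rx and ry set the spread along each axis.
[X,Y]=meshgrid(x,y);
h=exp(-((X/rx).^2+(Y/ry).^2));
h=h/sum(h(:)); % normalize so the blur preserves total energy
end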
STEP 4: APPLY THE IMAGE FIELD
The blurred image represents the electromagnetic field intensity that is imaged onto the chip. The
pixel footprint is designed to mask out the field in between pixels but allow the field amplitude
to vary across the active area of the pixel.
C=B; % copy the blurred field
C=C.*chip; % mask with the chip footprint so the dead regions between pixels go to zero
Figure 5: Field amplitude across the camera chip.
The field distribution across the camera chip surface is shown in Fig. 5. The electronics
integrate the field in each active pixel region. For this model we are using intensities, which are
proportional to energy. To place the total energy of each pixel at the pixel center, we convolve
the masked field with a rectangular window function the size of each pixel region.
energy=irect(Mysample,Nxsample,My,Nx); % rectangular window the size of one pixel footprint
D=real(ifft2(fft2(C).*fft2(energy))); % convolve to sum the field over each pixel region
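Like gaperture.m, irect.m is not reproduced here. The sketch below is one possible stand-in, under
the assumption that irect(M1,N1,M,N) returns an M x N array containing an M1 x N1 block of ones
placed at the array origin (wrapped around the corners), so that fft2 can be applied directly and
fftshift moves the window to the center of the field, as mentioned in Step 1.
function w=irect(M1,N1,M,N)
% Sketch of a rectangular window generator (assumed interface)
w=zeros(M,N);
w(1:M1,1:N1)=1; % M1 x N1 block of ones
w=circshift(w,[-floor(M1/2) -floor(N1/2)]); % wrap the block so it is centered on the (1,1) origin
end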
Since we are only interested in the total energy at each pixel, we down sample by just selecting
the pixel center values.
% DOWNSAMPLE ENERGY
E=zeros(Mypixel,Nxpixel);
for m=1:Mypixel
  for n=1:Nxpixel
    m1=floor(m*Mysample-Mysample/2); % row index of the pixel center
    n1=floor(n*Nxsample-Nxsample/2); % column index of the pixel center
    E(m,n)=D(m1,n1); % keep only the accumulated energy at the pixel center
  end
end
The final image that would be captured in discrete space is shown in Fig. 6.
Figure 6: Discrete space image captured by camera chip.
APPENDIX
How to display and output a figure:
figure(5);
imagesc(B);
axis image;
colormap gray;
print -dtiff 'ChipField.tif';