2011 IEEE International Conference on Consumer Electronics (ICCE)
Design of a DVP Compatible Bridge to Randomly Access Pixels
of High Speed Image Sensors
Tareq Hasan Khan and Khan Wahid, Member, IEEE
Abstract-- An efficient design of a digital-video-port (DVP)
compatible bridge architecture to randomly access image pixels
of a high speed image sensor is presented. FPGA synthesis results
show that the bridge can support a data rate of up to 215 MHz. The
design is also synthesized in a 0.18 um CMOS technology.
I. INTRODUCTION
Most commercially available image sensors send image
data at a high speed and the pixel values can only be accessed
sequentially as they are sent row-by-row. Commercial
microcontrollers, on the other hand, run at a slower speed and
do not have sufficient on-board memory to store images of
large size. There have been a few works on the VLSI design of
randomly accessible image sensors [1]-[3], but none of them is
available on the market as a commercial product.
In this paper, we present the design of a DVP compatible
bridge architecture that may be interfaced with most
commercial high-speed image sensors. The objective of the
proposed bridge is to overcome the speed gap between the
available image sensors and microcontrollers. In addition, it
provides buffering for image storage, I2C protocol, and
random access of image pixels through the parallel interface.
The maximum speed achieved is 215 MHz using a Xilinx FPGA
and 242 MHz in a 0.18 um CMOS process.
II. THE IBRIDGE

Fig. 1. Block diagram of the iBRIDGE

The proposed bridge (referred to as the iBRIDGE) is placed
between the image sensor and the image processor (or the
microcontroller). Fig. 1 shows the block diagram of the
iBRIDGE and its internal structure. The pins on the left-hand
side are connected to an image sensor, while those on the
right-hand side are connected to the image processor or the
microcontroller.

Most leading commercial image sensors send image data
using a common standard interface known as the digital video
port (DVP) parallel output interface. In the DVP interface, the
VD and HD pins indicate the 'end of frame' and the 'end of row',
respectively. Pixel data bytes are available for sampling on the
DOUT(0:7) bus at the positive edge of the DCLK signal.
EXTCLK is the clock input to the image sensor. The
initialization and configuration of the image sensor are set by
the 2-wire (SCL and SDA) I2C protocol.

In order to capture a frame, the image processor first asserts
the RST pin high and then makes it low. The required image
size and color format must then be selected with the
FrameSize(1:0) and RGB/Gray' pins. Table I shows the
possible values of the FrameSize(1:0) pins and the
corresponding image sizes with their memory requirements.

TABLE I
FRAME SIZE SELECTION AND MEMORY REQUIREMENTS

FrameSize(1:0)   Captured Image Size    Memory Requirement (bytes)
                 (W x H) pixels         8 BPP      16 BPP
00               128 x 96 (subQCIF)     12,288     24,576
01               160 x 120 (QQVGA)      19,200     38,400
10               320 x 240 (QVGA)       76,800     153,600
11               640 x 480 (VGA)        307,200    614,400

The image capturing process can be started by asserting the
Init pin high. The image sensor then begins to capture and send
the frames, which are stored in the iBRIDGE's internal memory.
During the capturing process, Data(7:0) is kept in high-impedance
mode. As soon as the image capturing process is completed, the
FrameReceived pin goes from low to high. At this time, the image
sensor goes into sleep mode to save power. The Col(9:0), Row(8:0),
and NextByte pins are then used by the image processor to access
any pixel from the RAM at the desired speed and in a
random-access fashion.
After placing the column and row values on the Col(9:0) and
Row(8:0) buses, the data (pixel value) appears on the Data(7:0)
bus after the memory access time, which is typically less than
35 ns. For an RGB(5-6-5) image, two consecutive bytes need
to be read for one pixel: a low on the NextByte pin gives the
first byte and a high on the NextByte pin gives the second byte
on the Data(7:0) bus. After the desired pixel values are read,
the next frame can be captured with the same configuration by
asserting the ReqFrame pin from low to high. The time needed
to capture a frame, T_Req-Received, can be calculated as follows:

T_Req-Received = T_Wakeup + T_FrameStore    (1)

where
978-1-4244-8712-7/11/$26.00©2011 IEEE
T_Wakeup = I2C_WriteCommandBits × (1 / f_SCL)    (2)

T_FrameStore = [InitBlankBytes + (PixelBytes/Row + BlankBytes/Row) × TotalRow] × (n / f_DCLK)    (3)
Table II presents the values of the variables in (3) for a
commercial CMOS image sensor; values for other commercial
image sensors can be extracted from their datasheets.
TABLE II
DATA AND BLANK BYTES SENT BY TOSHIBA IMAGE SENSOR

Image Size   Init Blank   Pixel Bytes/Row   Blank       Total   n
             Bytes        (RGB)             Bytes/Row   Row
SubQCIF      157          256               1304        254     2
QQVGA        157          320               1240        254     2
QVGA         157          640               920         254     2
VGA          157          1280              280         507     1
The I2C interface block takes the device address, register
addresses, and register data as input from the Sensor Control
block and generates the I2C protocol bits at its SCL and SDA
output pins. The device address and register mapping may vary
among image sensors; however, the iBRIDGE can be initialized
with the particular addresses at start-up through the Col(9:0)
and Row(8:0) pins, which makes the proposed iBRIDGE design
universal. The clock divider block takes an external clock and
generates the appropriate signals for internal clocking. Fig. 2
shows the operational timing diagram of the iBRIDGE [here,
t1 = 1.25 us to 2.5 us (at least 1 I2C_CLK)].

Fig. 2. Operational timing diagram of iBRIDGE

III. EXPERIMENTAL RESULTS

The proposed iBRIDGE is modeled in VHDL and simulated
for functional verification. The design is then synthesized in
various FPGA technologies and the results are presented in
Table III. In order to verify the design in real-world hardware,
the iBRIDGE has been synthesized in an FPGA board, where it
occupies 255 logic elements (LE), 125 registers, and 4 embedded
9-bit multiplier elements; the core thermal power dissipation is
80 mW. The random access memory block of the iBRIDGE is
connected to the 512 KB SRAM block of the FPGA board. The
image sensor interface and the image processor interface are
assigned to different GPIO ports. The microcontroller is
connected to a personal computer (PC) through a COM port; a
level converter generates the appropriate logic levels to
communicate with the PC. Software has been written for the
Windows environment to display the captured images. The
overall experimental setup is shown in Fig. 3.

Fig. 3. Experimental setup

TABLE III
SYNTHESIS RESULTS ON FPGA

FPGA Device                 Registers   LUT   Max Freq. of DCLK (MHz)
Virtex2p, XC2VP2FG256       155         294   210.5
Virtex4, XC4VLX15SF363      150         285   170
Spartan3, XC3S50TQ144       157         295   128.3
Virtex E, XCV50ECS144       143         339   129.7
Virtex5, XC5VLX330          157         235   215.7

Fig. 4 shows a full image (left) and randomly accessed image
pixels (right) captured by the microcontroller using the
iBRIDGE. The proposed bridge was also synthesized in a
0.18 um CMOS technology; the design consumes 4.7K gates and
0.794 sq. mm of silicon area, and dissipates 27.6 mW of power
while running at 20 MHz with a 3 V supply.

Fig. 4. Full (left) and randomly accessed pixel image (right) using iBRIDGE

IV. CONCLUSION

The design of a DVP compatible bridge is proposed to
randomly access image pixels of a high-speed image sensor.
The prototype is suitable for many high-data-rate embedded
system applications, such as pattern recognition, robotic
vision, and bio-medical imaging.

REFERENCES
[1] O. Y. Pecht, R. Ginosar, and Y. S. Diamand, "A random access
photodiode array for intelligent image capture," IEEE Trans. on
Electron Devices, vol. 38, no. 8, pp. 301-304, 1991.
[2] D. Scheffer, B. Dierickx, and G. Meynants, "Random addressable
2048x2048 active pixel image sensor," IEEE Trans. on Electron
Devices, vol. 44, no. 10, pp. 1716-1720, 1997.
[3] S. Decker, R. McGrath, K. Brehmer, and C. Sodini, "A 256x256 CMOS
imaging array with wide dynamic range pixels and column-parallel
digital output," IEEE Journal of Solid-State Circuits, vol. 33, no. 12,
pp. 2081-2091, 1998.