Fig. 9: Block diagram of the decoder and the schematic of each cell.

Citations

... Active pixel sensors (APS) are used in digital cameras to convert the photo-generated signal to an analog voltage, which is then converted to digital data through an analog-to-digital converter. CMOS APS are often used because of their lower noise compared to passive pixel sensors and lower power compared to charge-coupled device (CCD) sensors [7][8][9]. APS are constructed with three transistors, often shortened to 3T APS. ...
... APS are constructed with three transistors, often shortened to 3T APS. Other architectures use four or five transistors to reduce noise and enable global shuttering; however, these have a reduced fill factor, resulting in smaller effective pixel sizes [9]. Smaller pixels have lower dynamic range, lower fill factor, worse low-light sensitivity, higher dark signal, and higher non-uniformity [10]. ...
... When reset is off, the photodiode accumulates charge, and the resulting voltage can be read through the source-follower transistor in the 3T APS, which acts as a non-inverting buffer. This output voltage is read when the corresponding row and column signals are selected [9]. Pixels are read sequentially in the 3T architecture, resulting in a rolling shutter. ...
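As an illustration of the readout sequence described in these excerpts, the following Python sketch models a 3T APS array read out row by row, i.e. a rolling shutter. The integration model, component values, and function names are hypothetical, chosen only to make the example self-contained; they are not taken from the cited works.

import numpy as np

# Behavioral sketch of a 3T APS rolling-shutter readout (illustrative only;
# all names and component values are assumptions, not taken from [9]).

V_RESET = 3.3      # reset voltage applied to the photodiode (V), assumed
C_PD = 10e-15      # photodiode capacitance (F), assumed

def integrate_pixel(photocurrent, t_int):
    # While reset is off, the photodiode discharges from V_RESET as charge accumulates.
    return V_RESET - photocurrent * t_int / C_PD

def rolling_shutter_readout(photocurrents, t_int):
    # Rows are selected one at a time and each column is read through the
    # source follower, which buffers the photodiode voltage (non-inverting).
    rows, cols = photocurrents.shape
    frame = np.zeros((rows, cols))
    for r in range(rows):              # row-select signal
        for c in range(cols):          # column-select signal
            frame[r, c] = integrate_pixel(photocurrents[r, c], t_int)
    return frame

# Example: 4x4 array, uniform 1 pA photocurrent, 10 ms integration time
print(rolling_shutter_readout(np.full((4, 4), 1e-12), t_int=10e-3))

Because each row is integrated and read at a slightly different time, fast-moving scenes appear skewed, which is the rolling-shutter artifact the sequential 3T readout implies.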
Article
This paper proposes a design methodology for the semi-automatic derivation of hardware dedicated to a generic class of image-analysis reconstruction problems. The study focuses on the hardware implications of the associated estimation algorithm and of the built-in underlying minimization. A specific minimization strategy has been designed with a view to improving the efficiency of the specialized hardware (in terms of clock cycles and silicon area). Convergence proofs are given in the Appendix. This new nonlinear minimization has proved to be almost four times faster than the classical method used in such a context. The VLSI derivation tool presented here is based on a high-level specification of the updating rules that define the problem at hand. The complete derivation is illustrated on an edge-preserving optical-flow estimator and on image restoration.
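The abstract refers to updating rules and a built-in nonlinear minimization for edge-preserving estimation. The paper's specific minimization strategy is not reproduced here; the following Python sketch only illustrates the general form of such an iterative updating rule, using plain gradient descent on a generic edge-preserving restoration energy. The potential function, step size, and parameter values are assumptions for illustration.

import numpy as np

# Generic sketch of an iterative, edge-preserving updating rule: gradient descent
# on E(x) = ||x - y||^2 + lam * sum(phi(grad x)). The potential phi, step size,
# and parameters are assumptions for illustration, not the paper's algorithm.

def phi_prime(t, delta=0.1):
    # Derivative of a smooth, edge-preserving potential (hypothetical choice):
    # roughly quadratic for small gradients, saturating across edges.
    return t / np.sqrt(1.0 + (t / delta) ** 2)

def restore(y, lam=0.5, step=0.2, n_iter=100):
    x = y.astype(float).copy()
    for _ in range(n_iter):
        # forward differences with replicated (Neumann) boundaries
        gx = np.diff(x, axis=1, append=x[:, -1:])
        gy = np.diff(x, axis=0, append=x[-1:, :])
        wx, wy = phi_prime(gx), phi_prime(gy)
        # adjoint of the difference operator applied to the weighted gradients
        div = (np.diff(wx, axis=1, prepend=wx[:, :1])
               + np.diff(wy, axis=0, prepend=wy[:1, :]))
        # one updating step: data-fidelity pull toward y plus smoothing term
        x -= step * (2.0 * (x - y) - lam * div)
    return x

noisy = np.random.rand(32, 32)
restored = restore(noisy)

Each iteration applies the same local updating rule at every pixel, which is the kind of regular, data-parallel structure that a derivation tool of the sort described above could map onto dedicated hardware.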