Digital imaging technology has transformed everything from the way we communicate to the way scientific research is conducted. From humble beginnings, camera sensor technology is still evolving and has the potential to keep changing our lives.
Research that led to commercial digital camera technology began in the 1960s, but the technology did not become widely adopted until the late 1980s and early 1990s. So far, commercial camera sensor technology has been based on two different devices: the Charge-Coupled Device (CCD) and the Complementary Metal Oxide Semiconductor (CMOS) device. Many are predicting future image sensor technology will be quantum-based.
In 1969, Willard Boyle and George Smith invented the CCD while working at Bell Laboratories in New Jersey. The CCD turned out to be the first commercially viable approach to capturing an image and digitizing it using a light-sensitive silicon chip.
The CCD sensor comprises pixels made from metal-oxide-semiconductor (MOS) capacitors. When light strikes a pixel, each photon can free an electron as a result of the photoelectric effect: photons popping electrons out of silicon. On a CCD, these freed electrons are held in each pixel's capacitor, also referred to as the pixel's bucket.
At this point, the future image is still analog, with the number of electrons in the bucket directly related to the quantity of light that has struck the silicon detector.
To read the data stored in these buckets, the charge from each row of buckets must be moved stepwise from one bucket to another in the same way a "bucket brigade" might pass buckets of water from one person to the next. When these buckets get to the end of the row, they are 'poured out', measured and converted into digital information. The resulting image is essentially this digital grid.
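The bucket-brigade readout described above can be sketched in a few lines. This is an illustrative model only, not any vendor's firmware: per-pixel charges are shifted stepwise toward the end of each row, where each one is "poured out" and digitized.

```python
def ccd_readout(sensor):
    """Read a 2D grid of per-pixel charges the way a CCD does:
    shift each row one bucket at a time toward the output node."""
    digital_image = []
    for row in sensor:
        buckets = list(row)          # analog charges waiting in this row
        readings = []
        while buckets:
            charge = buckets.pop()   # charge "poured out" at the row's end
            readings.append(round(charge))  # ADC step: analog -> digital
            # the remaining charges have effectively shifted one bucket over
        readings.reverse()           # restore original pixel order
        digital_image.append(readings)
    return digital_image

charges = [[10.2, 55.7, 30.1],
           [ 0.4, 99.9, 12.5]]
print(ccd_readout(charges))  # [[10, 56, 30], [0, 100, 12]]
```

The key point the sketch captures is that a CCD pixel cannot be read in place: every charge must physically travel through its neighbors' buckets to reach the single measurement point.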
At its most basic, a CCD produces a black-and-white image. Color imagery can be produced by putting a red, green or blue colored filter over each pixel. Color data can then be captured by each pixel, but just for one primary color per pixel. Software then estimates each pixel's two missing color values from the brightness of its neighboring pixels, a process known as demosaicing. The result is each pixel having a full set of red, green and blue values.
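A minimal sketch of the demosaicing idea follows. It simply averages same-color neighbors to fill in each pixel's two missing primaries; real cameras use far more sophisticated interpolation, so treat this as an illustration of the principle only.

```python
def neighbors_mean(raw, bayer, y, x, color):
    """Average the values of same-color pixels adjacent to (y, x)."""
    h, w = len(raw), len(raw[0])
    vals = [raw[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))
            if bayer[j][i] == color]
    return sum(vals) / len(vals)

def demosaic(raw, bayer):
    """Return an (R, G, B) triple for every pixel of a one-channel RAW."""
    h, w = len(raw), len(raw[0])
    return [[tuple(raw[y][x] if bayer[y][x] == c
                   else neighbors_mean(raw, bayer, y, x, c)
                   for c in "RGB")
             for x in range(w)]
            for y in range(h)]

bayer = [["R", "G"],     # one 2x2 tile of the color filter array
         ["G", "B"]]
raw   = [[200, 120],     # per-pixel brightness behind each filter
         [100,  40]]
print(demosaic(raw, bayer))
```

Each pixel keeps its own measured primary and borrows the other two from its neighbors, which is why fine color detail can never quite match fine brightness detail on a filtered sensor.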
A digital camera processes the original sensor data, known as a RAW file, into a JPEG for viewing.
While the CCD sensor had a long successful run, the CMOS sensor has become more popular in recent years, primarily because it can read information from each pixel, rather than row-by-row.
CMOS sensors consist of an array of photo-detecting pixels that produce charges when exposed to light. Each individual pixel's charge is sent to a location just outside the array, where the signal is amplified and measured.
CMOS devices are founded on metal-oxide-semiconductor technology, which had been experimented with since the 1960s. However, there were significant challenges in making the technology more viable than CCD technology. In particular, CMOS sensors of the era suffered from significant signal noise.
Since CMOS pixels are able to amplify their own signal, each one can produce its own useful output. The result is a lower voltage demand and no charge-transfer efficiency problems. CMOS also allows other camera electronics to be placed onto the same chip using standard production processes, which results in more compact, dependable and inexpensive devices.
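The architectural consequence of per-pixel output can be sketched simply: because every CMOS pixel is individually addressable via row and column selects, the sensor can read out any region of interest directly, with no need to shift whole rows through neighboring pixels. This is an illustrative model, not real sensor firmware.

```python
def read_window(sensor, y0, y1, x0, x1):
    """Read only a region of interest: each pixel already carries its
    own amplified, directly addressable output, so no charge needs to
    be shuttled through neighboring pixels as on a CCD."""
    return [[sensor[y][x] for x in range(x0, x1)]
            for y in range(y0, y1)]

pixels = [[ 1,  2,  3,  4],
          [ 5,  6,  7,  8],
          [ 9, 10, 11, 12]]
print(read_window(pixels, 1, 3, 1, 3))  # [[6, 7], [10, 11]]
```

Windowed readout like this is one reason CMOS sensors lend themselves to features such as fast cropped-sensor video modes.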
Because CMOS technology uses less power, it is a better fit for multi-megapixel chips, which are found in modern cameras.
The New Frontier – Quantum Photography
While the future of camera sensor technology is hard to predict, many experts are saying that the quantum image sensor will drive imaging technology into the future.
Quantum image sensors (QIS) are presently the subject of extensive research and development. Businesses dedicated to quantum imaging technology are currently garnering attention from the biggest tech companies. According to reports, only one or two technical hurdles remain before quantum imaging can become standard.
Silicon is the light-sensing material in today's digital cameras, but silicon sensors detect only around 25% of the light that strikes them. Quantum sensors, on the other hand, appear capable of detecting 95% of that light.
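The arithmetic behind those figures is simple: at roughly 25% detection efficiency, only one in four incoming photons contributes to the signal, while a 95%-efficient sensor captures nearly all of them. A quick sketch using the article's numbers:

```python
def detected(photons, efficiency):
    """Expected number of incoming photons converted into signal."""
    return photons * efficiency

incoming = 1000  # photons striking the sensor during an exposure
print(detected(incoming, 0.25))  # 250.0 detected by a silicon sensor
print(detected(incoming, 0.95))  # 950.0 detected by a quantum sensor
```

In dim light, that near-fourfold gain in captured signal is the difference between a usable image and one dominated by noise.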
This degree of efficiency comes from layers of quantum dots placed on a conductive material such as silicon. Such sensors can be reduced drastically in size without sacrificing image quality. To put it another way, a quantum sensor in a smartphone might be able to produce images of higher quality than those currently seen in feature films.