Both accomplish the same end result, especially with today's technology. CMOS typically draws less power (which equates to better battery life), while CCD in the past produced better image quality but used a lot more power. Today they've probably reached roughly equal standing in performance. Here's an article I've pasted in from How Stuff Works:
CCD sensors, as mentioned above, create high-quality, low-noise images. CMOS sensors, traditionally, are more susceptible to noise.
Because each pixel on a CMOS sensor has several transistors located next to it, the light sensitivity of a CMOS chip tends to be lower. Many of the photons hitting the chip hit the transistors instead of the photodiode.
CMOS traditionally consumes little power. Implementing a sensor in CMOS yields a low-power sensor.
CCDs use a process that consumes lots of power. CCDs consume as much as 100 times more power than an equivalent CMOS sensor.
CMOS chips can be fabricated on just about any standard silicon production line, so they tend to be extremely inexpensive compared to CCD sensors.
CCD sensors have been mass produced for a longer period of time, so they are more mature. They tend to have higher quality and more pixels.
Based on these differences, you can see that CCDs tend to be used in cameras that focus on high-quality images with lots of pixels and excellent light sensitivity. CMOS sensors traditionally have lower quality, lower resolution and lower sensitivity. CMOS sensors are just now improving to the point where they reach near parity with CCD devices in some applications. CMOS cameras are usually less expensive and have great battery life.
Not sure I buy the noise issue. Older Nikons used CCD and had much more noise than the newer ones that use CMOS sensors.
I kind of agree. I always thought CMOS had better noise qualities than CCD. Canon has always used CMOS and made its own chips... maybe their technology dealt better with the noise their CMOS sensors produced? In the past, Sony made and OEM'd most of the CCD chips used by the other camera companies. Looks like Nikon is now embracing CMOS on its newer models... true of my D300.
Vince "...the law of unintended consequences, sometimes, you get a truly memorable photograph" Gear: Canon G2, Canon 20D, Nikon D300...bunch of lenses My Flickr www.montalbanophotography.com
I have just cut and pasted the tech info on these CCD and CMOS technologies. Hope this helps.
CCD (charge coupled device) and CMOS (complementary metal oxide semiconductor) image sensors are two different technologies for capturing images digitally. Each has unique strengths and weaknesses giving advantages in different applications. Neither is categorically superior to the other, although vendors selling only one technology have usually claimed otherwise. In the last five years much has changed with both technologies, and many projections regarding the demise or ascendancy of either have been proved false. The current situation and outlook for both technologies is vibrant, but a new framework exists for considering the relative strengths and opportunities of CCD and CMOS imagers.
Both types of imagers convert light into electric charge and process it into electronic signals. In a CCD sensor, every pixel's charge is transferred through a very limited number of output nodes (often just one) to be converted to voltage, buffered, and sent off-chip as an analog signal. All of the pixel can be devoted to light capture, and the output's uniformity (a key factor in image quality) is high. In a CMOS sensor, each pixel has its own charge-to-voltage conversion, and the sensor often also includes amplifiers, noise-correction, and digitization circuits, so that the chip outputs digital bits. These other functions increase the design complexity and reduce the area available for light capture. With each pixel doing its own conversion, uniformity is lower. But the chip can be built to require less off-chip circuitry for basic operation.
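The uniformity point in that excerpt can be illustrated with a toy simulation (a rough sketch, not real sensor physics; all numbers are invented for illustration). A CCD-style readout applies one shared conversion gain to every pixel, while a CMOS-style readout gives each pixel its own amplifier, whose gain varies slightly from pixel to pixel (fixed-pattern noise):

```python
import numpy as np

rng = np.random.default_rng(0)

# A perfectly flat scene: every photosite collects the same mean signal,
# plus unavoidable photon (shot) noise.
scene = np.full((64, 64), 1000.0)
shot_noise = rng.normal(0.0, np.sqrt(scene))

# CCD-style readout: every pixel's charge passes through ONE shared output
# amplifier, so the conversion gain is identical for all pixels.
ccd_gain = 1.0
ccd_frame = ccd_gain * (scene + shot_noise)

# CMOS-style readout: each pixel has its own amplifier; small manufacturing
# variations give each one a slightly different gain (2% spread, assumed).
cmos_gains = rng.normal(1.0, 0.02, size=scene.shape)
cmos_frame = cmos_gains * (scene + shot_noise)

# The per-pixel gain spread adds spatial non-uniformity on top of shot noise,
# so the CMOS-style frame of the same flat scene shows more variation.
print(f"CCD frame std:  {ccd_frame.std():.1f}")
print(f"CMOS frame std: {cmos_frame.std():.1f}")
```

This is only the architectural effect the excerpt describes; real CMOS sensors calibrate much of this fixed-pattern noise away, which is part of why the gap has closed.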
My conclusion is that the two cannot be compared directly, as each has its own advantages and disadvantages. The nature of the application, the expected end result and, last but not least, the budget will determine which technology to look at.
As I mentioned in my earlier reply, the choice between CCD and CMOS depends on the application. I am listing here a few preferred applications associated with each technology for your reference. Hope this helps.
CMOS imagers offer superior integration, power dissipation and system size at the expense of image quality (particularly in low light) and flexibility. They are the technology of choice for high-volume, space-constrained applications where image quality requirements are low. This makes them a natural fit for security cameras, PC videoconferencing, wireless handheld device videoconferencing, bar-code scanners, fax machines, consumer scanners, toys, biometrics and some automotive in-vehicle applications.
CCDs offer superior image quality and flexibility at the expense of system size. They remain the most suitable technology for high-end imaging applications, such as digital photography, broadcast television, high-performance industrial imaging, and most scientific and medical applications. Furthermore, flexibility means users can achieve greater system differentiation with CCDs than with CMOS imagers.
Sustainable cost between the two technologies is approximately equal. This directly contradicts the traditional marketing pitch of virtually all of the CMOS-only imager companies.
FWIW, the new hot ticket for high-end compact cameras is a backlit CMOS sensor, which is supposed to improve image quality and enable very high-speed shooting. I'm not sure if the technology used to create these sensors is applicable to CCDs, but if any backlit CCDs exist, they haven't made it to market yet.
There was a time when CCD was the better choice for image quality. The tech sector has put so much R&D into CMOS that a lot of its weak points are not so weak any more. I'd say that if the tech sector had put equal research and development into both, the choice would be a more important factor now. When choosing between two DSLRs of the same generation, tend towards CCD for quality in early generations and CMOS in late generations. When choosing between two different generations, it's pretty clear that the newer generation's sensors are better. (I say "better" loosely.)
The proof of the pudding is in the eating. If you really want to compare two camera models with CCD and CMOS sensors, go to an outlet, take a snap with both cameras and blow the results up full screen.
This is exactly what I did at a Canon outlet with these two models: the PowerShot SX150 IS and the SX230 HS. There can be arguments about the technical differences between the two models, but my point is that, in the end, what you use a camera for is to take brilliant shots you can be proud of and share the memories. So, putting everything aside, take a picture and see what you get; that's it.
Now, I took a single photo of the same object at the same ISO setting on each camera and then transferred them to a computer. The result: a very noticeable and significant difference in sharpness, clarity and noise. The CCD (SX150 IS) won hands down. Sorry, I could not get the images to show you here because I didn't have a USB drive with me.
Okay, that's my experience. There can be arguments about how CMOS is better in certain situations etc.; honestly, I'd have nothing to say about that because I am not a professional photographer.
So, for me... it's CCD.