The only thing that comes immediately to my mind is how many video cameras still use CCD instead of CMOS because, until the last year or so, global shutter only worked on CCD, since CMOS electronics couldn't offload fast enough. I could be wrong. (A global shutter prevents 'wobbly' footage, skewing during a pan, etc. Theoretically, with a global shutter a camera could have a flash sync as fast as you want.)
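To make the skew concrete, here's a toy simulation (not real camera code, all parameters made up): a vertical bar moving right while a rolling shutter reads one row at a time. Each row sees the bar at a slightly later instant, so the bar comes out slanted; a global shutter samples every row at the same moment and keeps it straight.

```python
import numpy as np

HEIGHT, WIDTH = 8, 16
SPEED = 1        # columns the bar moves per row-readout interval
BAR_START = 2    # bar's column at time t = 0

def rolling_shutter():
    """Each row is sampled one time-step later than the row above it."""
    img = np.zeros((HEIGHT, WIDTH), dtype=int)
    for row in range(HEIGHT):
        t = row                              # row is read out at time t
        img[row, BAR_START + SPEED * t] = 1  # bar has moved by then
    return img

def global_shutter():
    """All rows sampled simultaneously at t = 0: the bar stays vertical."""
    img = np.zeros((HEIGHT, WIDTH), dtype=int)
    img[:, BAR_START] = 1
    return img

if __name__ == "__main__":
    for name, img in [("rolling", rolling_shutter()), ("global", global_shutter())]:
        print(name)
        for row in img:
            print("".join("#" if v else "." for v in row))
```

The rolling-shutter printout shows the bar as a diagonal, which is exactly the skew you see panning with a rolling-shutter camera.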
The technologies are actually about the same age, but CMOS was far too expensive to produce in the '60s and '70s to be viable (due to fabrication constraints), so CCD dominated the market until the early '00s.
Now, after a little research: because a CCD's signal is offloaded line by line in analogue fashion before being digitized, a whole column of pixels can get corrupted as an oversaturated pixel leaks charge into every packet shifted past it (imagine your blown-out highlights drawing bright streaks straight through the image). This is called smear, and CMOS sensors are immune to it, since each pixel has its own transistors reading it out, so every pixel is treated independently.
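A rough sketch of the smear mechanism, with made-up numbers (`FULL_WELL`, `LEAK`) just for illustration: during CCD readout, each charge packet is shifted along its column toward the readout register, and any packet passing through a saturated site picks up leaked charge, so the streak runs along the column.

```python
import numpy as np

FULL_WELL = 100  # saturation level, arbitrary units (assumed for the toy model)
LEAK = 5         # charge a saturated site leaks into each packet shifted past it

def ccd_readout_with_smear(scene):
    """Toy CCD column readout: a packet at row r is shifted through every
    site below it on its way out, collecting leaked charge from any
    saturated site it passes. The result is a bright streak up the column."""
    h, w = scene.shape
    out = np.zeros_like(scene)
    for col in range(w):
        saturated = scene[:, col] >= FULL_WELL
        for row in range(h):
            charge = scene[row, col]
            # packet passes rows row+1 .. h-1 during transfer
            charge += LEAK * saturated[row + 1:].sum()
            out[row, col] = min(charge, FULL_WELL)
    return out
```

Feeding in a dark scene with one saturated pixel, every pixel above it in the same column comes out brighter than it should, while neighbouring columns are untouched.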
CMOS sensors are made on standard silicon logic processes and are thus (now) dramatically less expensive to produce. CCDs, though, are more sensitive and less noisy.
That's all for now. It sounds like an interesting topic to read up on. I'll let you know if I learn anything else.