Looking to Upgrade Your Cameras from an iPhone to Stream?
A lower-resolution three-chip camera can actually render more accurate images than a higher-resolution single-chip camera when you compare the two pixel by pixel.

Let’s face it – camera technology has come a long way over the last two decades. Does anyone remember the first-generation digital cameras from the 1990s that used floppy disks?
The standalone portable camera has, for the most part, been replaced by the latest generation of smartphone. Current smartphone image sensors can record HD, and even 4K, with surprisingly good results.
To really get into the what, how, and why of camera technology, and what makes one camera better than another, I need to cover a few technical details. That way you can understand what sets a $5,000 broadcast camera (or one even more expensive) apart from a $500 smartphone or DSLR camera.
For those of you who have followed my laser projector discussions, you know I compare single-chip DLP projectors with three-chip projectors, and how the latter offer much better color brightness and accuracy. The reason is that a one-chip projector shows only one color at a time: to display the full range of color, it flashes a red image, then green, then blue in sequence. This happens so fast that most people perceive it as a full-color image. A three-chip projector displays all three colors continuously, so you always see a full-color image.
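To make that timing difference concrete, here is a toy model in Python. It is purely illustrative (real projectors modulate duty cycles far more cleverly than this), but it shows how the eye averages the sequential red, green, and blue fields of a one-chip projector, compared with the continuous output of a three-chip design:

```python
# A toy model (not any vendor's implementation) of why field-sequential
# color on a one-chip DLP still looks full-color: the eye integrates the
# red, green, and blue fields flashed in quick succession.

# Target pixel color the projector wants to show (R, G, B), 0.0-1.0.
target = (0.8, 0.5, 0.2)

# One-chip projector: three sequential single-color fields within one frame.
fields = [
    (target[0], 0.0, 0.0),  # red field
    (0.0, target[1], 0.0),  # green field
    (0.0, 0.0, target[2]),  # blue field
]

# The eye averages light over time, so the perceived color is the mean of
# the fields -- in this model, each channel is "on" for a third of the frame.
perceived = tuple(sum(f[i] for f in fields) / len(fields) for i in range(3))
print("one-chip, perceived:", perceived)

# Three-chip projector: all three channels are on for the whole frame,
# so the full target color (and brightness) reaches the eye continuously.
print("three-chip, perceived:", target)
```

The takeaway: with sequential color, each channel reaches the eye for only a fraction of the frame, which is exactly the color-brightness penalty the three-chip design avoids.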
To an extent, the same distinction holds on the camera side: there are single-chip cameras and three-chip cameras.
An imaging sensor works primarily by light hitting it and the photons energizing each photodiode. Unfortunately, I don’t have the room in this article to get into what the P-N junction is (in short, a boundary or interface between different types of semiconductor material), or how phosphorus or boron doping affects it, so let’s look instead at how these sensors behave in practice.
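That said, the core physics can be sketched in a few lines. The numbers below are standard constants, not the specs of any particular sensor; this is just a back-of-the-envelope look at the photon energies doing the work:

```python
# Back-of-the-envelope sketch: each photon carries E = h*c/wavelength of
# energy, which is what "energizes" a photodiode. Constants are standard
# physics values, not camera specs.

h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s

for name, wavelength_nm in [("blue", 450), ("green", 550), ("red", 650)]:
    wavelength_m = wavelength_nm * 1e-9
    energy_j = h * c / wavelength_m
    energy_ev = energy_j / 1.602e-19  # convert joules to electron-volts
    print(f"{name:5s} {wavelength_nm} nm photon: {energy_ev:.2f} eV")

# Silicon's band gap is about 1.1 eV, so every visible photon carries enough
# energy to free an electron at the P-N junction -- that freed charge is the
# signal the sensor reads out.
```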
The Basics Behind Image Sensors
All image sensors are “dumb” to color: they generally sense light from about 190 to 1,100 nanometers. In the middle of that range is visible light, which for most people is around 380-700 nm. For those of you who geek out about frequencies, that is roughly 430-770 THz. For anyone unfamiliar with terahertz: kilohertz is 10³ Hz, megahertz is 10⁶ Hz, gigahertz is 10⁹ Hz, and terahertz is 10¹² Hz.
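If you want to check those numbers yourself, the conversion is simply frequency = speed of light / wavelength. A quick sketch (the exact endpoints depend on where you draw the edges of “visible”):

```python
# Verify the wavelength-to-frequency conversion above using f = c / wavelength.
# The endpoints land near the quoted 430-770 THz range; the precise values
# depend on where the edges of "visible" are drawn.

c = 2.998e8  # speed of light, m/s

for wavelength_nm in (700, 380):  # red edge, violet edge
    f_hz = c / (wavelength_nm * 1e-9)
    print(f"{wavelength_nm} nm -> {f_hz / 1e12:.0f} THz")
# 700 nm -> ~428 THz, 380 nm -> ~789 THz
```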
With a single-chip camera, a color filter array in front of the sensor passes a certain band of visible light to each pixel; the pattern of which colors sit over which pixels is called the Bayer filter. (A three-chip camera instead uses a prism to split incoming light into red, green, and blue bands, sending each to its own sensor.)
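To picture what that filter pattern looks like, here is a minimal sketch of the standard RGGB Bayer tile; the 4x4 grid is illustrative, not any real sensor’s geometry:

```python
# Minimal sketch of a Bayer (RGGB) color filter array on a single-chip
# sensor: each photosite records only one channel. The 4x4 grid here is
# illustrative only.

def bayer_channel(row, col):
    """Which color filter sits over a given pixel in an RGGB tiling:
    even rows alternate R,G; odd rows alternate G,B."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

for row in range(4):
    print(" ".join(bayer_channel(row, col) for col in range(4)))
# R G R G
# G B G B
# R G R G
# G B G B
# Note twice as many green sites as red or blue -- the eye is most
# sensitive to green, so the pattern spends resolution where it matters.
```

A demosaic step later interpolates the two missing channels at every pixel, which is where single-chip color accuracy can suffer in the pixel-by-pixel comparison mentioned at the top of this article.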
