Sarge wrote: Most of the earlier systems on that list basically have much more limited color palettes, and map to a smaller range of values than a 24-bit RGB colorspace. However, many of the systems have encoder chips that take that data and convert it into an RGB signal, which is an analog signal. I don't know that bits even come into play, since it's a digital-to-analog conversion, and I know black levels aren't even a thing there, being a purely digital construct.
As far as I know, it actually has everything to do with analog signals. There's only a limited range of luma amplitude allowed (I believe), so when working digitally with 8-bit video, the "broadcast safe" range is 16-235. TVs are calibrated to treat 16 as pure black and 235 as pure white, instead of using the full range of 8-bit values. Going above or below that range, once converted to analog, bleeds into signal levels reserved for other things: white values over 235 will create an audible buzz, for example, not a brighter white.
Computers and video games, and the monitors intended for use with them, use the full 0-255 range of values.
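If it helps to see the arithmetic, here's a quick sketch of the scaling between the two ranges (this is the standard "studio swing" math from BT.601/BT.709; the function names are just illustrative):

```python
# Sketch: converting 8-bit values between "full range" (0-255) and the
# "limited"/video range (16-235). The scale factor 219/255 comes from the
# width of the video range: 235 - 16 = 219.

def full_to_limited(v: int) -> int:
    """Map a full-range value (0-255) into the 16-235 video range."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Map a 16-235 value back to full range, clamping the out-of-range
    'blacker than black' / 'whiter than white' codes."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

# Full-range black/white land exactly on video black/white and back:
assert full_to_limited(0) == 16 and full_to_limited(255) == 235
assert limited_to_full(16) == 0 and limited_to_full(235) == 255
```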
That said, the point of having the toggle in modern consoles isn't to show that they can do more color. It's basically a calibration setting. With HDMI (particularly), far more consoles could be plugged into computer monitors, more computers could be plugged into TVs, and so on (a nice explanation is here). More or less, it's picking the right black and white levels (effectively, the right gamma curve) for your device/display combo.
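As a toy illustration of what a mismatch looks like numerically (made-up pipeline, not any real console's), both failure modes fall out of the same scaling:

```python
# Toy numbers showing a Full/Limited mismatch; the display's decoder is
# modeled by one function, nothing here is a real console's pipeline.

def as_limited_display(v: int) -> int:
    """Display assumes 16-235 input: expand to 0-255, clipping outside it."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

# Case 1: source sends full range, display assumes limited range.
# Every shadow value at or below 16 decodes to 0, i.e. crushed blacks.
print([as_limited_display(v) for v in (0, 4, 8, 12, 16, 20)])
# -> [0, 0, 0, 0, 0, 5]: five distinct shades collapse into one

# Case 2: source sends limited range, display assumes full range.
# Video black arrives as code 16 and is shown literally as dark grey,
# and video white (235) never hits the display's peak: washed-out contrast.
print(16, 235)  # the levels the display shows instead of 0 and 255
```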
Regarding older consoles, I would assume they usually used the 16-235 range, since they had to conform to NTSC or PAL signal levels, or at least assumed they'd be plugged into a TV. Probably not internally, but as part of the video encoder. Native RGB output may be an exception. I'm really not sure on those, but that'd be my guess.
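For what the encoder side of that might look like, here's a rough sketch of just the digital half: the standard BT.601 full-to-studio matrix, which scales luma into 16-235. The real chips then modulate this into an analog composite/S-Video waveform; none of this is pulled from any actual encoder's datasheet.

```python
# Rough sketch of the digital front half of a video encoder: full-range
# RGB -> studio-swing Y'CbCr using the standard BT.601 matrix, which puts
# luma in 16-235 and chroma in 16-240. Purely illustrative, not taken
# from any real console encoder.

def rgb_to_studio_ycbcr(r: int, g: int, b: int) -> tuple[int, int, int]:
    """Full-range 8-bit R'G'B' -> BT.601 studio-swing Y' (16-235), CbCr (16-240)."""
    y  = 16  + ( 65.481 * r + 128.553 * g +  24.966 * b) / 255
    cb = 128 + (-37.797 * r -  74.203 * g + 112.000 * b) / 255
    cr = 128 + (112.000 * r -  93.786 * g -  18.214 * b) / 255
    return round(y), round(cb), round(cr)

# Full-range black and white land on video black and white:
print(rgb_to_studio_ycbcr(0, 0, 0))        # (16, 128, 128)
print(rgb_to_studio_ycbcr(255, 255, 255))  # (235, 128, 128)
```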