What is quantization error anyway?
When image colors are mapped into an RGB space, that space functions like a giant 3D grid of finite, discrete locations where a color can sit. An 8-bit-per-channel image lives in a grid of roughly 16.7 million locations: 256 planes along each of the cube's three dimensions. Colors can occupy the precise points of the grid, but nothing in between.

If the RGB space is defined to cover a wider range of colors, the spacing between those steps grows proportionally. Since most images occupy only a few percent of the volume of an RGB space, an image may span just a couple of dozen or fewer vertically oriented planes. If such a space is enlarged to twice the radius, the image's information is spread across half as many planes in each lateral dimension.

The number of vertical steps (horizontal planes) doesn't change as RGB spaces get wider or narrower; rather, the preservation of vertical steps is best optimized by keeping the tone curves of spaces being used as the
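The effect of gamut width on step size can be sketched numerically. The snippet below is a minimal illustration (the `quantize` helper and the 0.0–2.0 "wide gamut" range are hypothetical, chosen only to show the proportionality): snapping the same channel value onto a 256-step grid that spans twice the range doubles the grid spacing, and with it the worst-case quantization error.

```python
def quantize(value, lo, hi, levels=256):
    """Snap a continuous channel value onto a finite grid of `levels`
    steps spanning [lo, hi], as an 8-bit-per-channel encoding does."""
    step = (hi - lo) / (levels - 1)
    index = round((value - lo) / step)
    return lo + index * step

# The same channel value, encoded against a narrow and a wide range:
color = 0.40321
narrow = quantize(color, 0.0, 1.0)  # grid step = 1/255
wide = quantize(color, 0.0, 2.0)    # grid step = 2/255, twice as coarse

# The worst-case error is half a grid step, so doubling the range
# doubles the error bound.
assert abs(color - narrow) <= 0.5 * (1.0 / 255)
assert abs(color - wide) <= 0.5 * (2.0 / 255)
```

The error for any single value depends on where it falls between grid points, but the ceiling on that error scales directly with the width the space must cover.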