I think this is the original, photographed and contributed by Adrian Pingstone: https://commons.wikimedia.org/wiki/File:Parrot.red.macaw.1.a...
But this particular derivative is the one that appears most often in the Wikipedia articles: https://commons.wikimedia.org/wiki/File:RGB_24bits_palette_s...
This parrot has appeared in several articles on the web. For example, here's one from a decade or so ago: https://retroshowcase.gr/index.php?p=palette
Parrots are often used in articles and research papers about computer graphics and I think I know almost all the parrots that have ever appeared in computing literature. This particular one must be the oldest computing literature parrot I know!
By the way, I've always been fascinated by dithering, ever since I first noticed it in newspapers as a child. Here was a clever human invention that could produce rich images with so little. I could see it every day and instinctively understand how it created the optical illusion of smooth gradients, long before I knew what it was called.
For anyone interested in seeing how far dithering can be pushed, play 'Return of the Obra Dinn'. After that, dithering will always remind you of this game.
- https://visualrambling.space/dithering-part-1
- https://store.steampowered.com/app/653530/Return_of_the_Obra...
Recent discussions:
Making Software - https://news.ycombinator.com/item?id=43678144
How does a screen work? - https://news.ycombinator.com/item?id=44550572
What is a color space? - https://news.ycombinator.com/item?id=45013154
Adding random noise before quantization makes the harsh transitions between bands of color imperceptible, and the dithering itself isn't perceptible either.
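A minimal sketch of that idea in Python (a smooth 0-1 ramp quantized down to four levels, with and without noise; the function names and level count are just for illustration):

```python
import random

random.seed(0)

def quantize(v, levels):
    """Quantize a value in [0, 1] to a fixed number of levels."""
    return round(v * (levels - 1)) / (levels - 1)

def quantize_dithered(v, levels):
    """Add roughly one quantization step of random noise before
    rounding, so hard band edges dissolve into noise that averages
    out to the original gradient."""
    step = 1.0 / (levels - 1)
    noisy = v + (random.random() - 0.5) * step
    return quantize(min(1.0, max(0.0, noisy)), levels)

# A smooth ramp quantized to 4 levels shows hard bands...
ramp = [i / 63 for i in range(64)]
banded = [quantize(v, 4) for v in ramp]
# ...while the dithered version flickers between adjacent levels
# and reads as a smooth gradient from a distance.
dithered = [quantize_dithered(v, 4) for v in ramp]
```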
I'm sure there are better approaches nowadays but in some of my game projects I've used the screen space dither approach used in Portal 2 that was detailed in this talk: https://media.steampowered.com/apps/valve/2015/Alex_Vlachos_...
It's only a three-line function, but the jump in visual quality in dark scenes was dramatic. It always makes me sad when I see streamed content or games with bad banding, because the fix is so simple and cheap!
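The general shape of a screen-space dither like that is a cheap hash of the pixel coordinate, remapped to roughly half a quantization step of noise. Here's a sketch of the idea in Python (the real thing lives in a pixel shader); the hash constants below are from Jimenez's "interleaved gradient noise", a similar well-known trick, not necessarily the exact ones in the Valve talk:

```python
import math

def ign(x, y):
    """Interleaved gradient noise: a cheap per-pixel hash in [0, 1).
    (Jimenez's constants; the Valve dither uses a similar trick.)"""
    f = 0.06711056 * x + 0.00583715 * y
    return math.modf(52.9829189 * math.modf(f)[0])[0]

def shade(value, x, y, levels=32):
    """Quantize a channel value in [0, 1] to `levels` steps, adding
    +/- half a step of screen-space noise to hide the banding."""
    noise = ign(x, y) - 0.5                # in [-0.5, 0.5)
    step = 1.0 / (levels - 1)
    v = min(1.0, max(0.0, value + noise * step))
    return round(v * (levels - 1)) / (levels - 1)
```

Because the noise depends only on the screen coordinate, neighboring pixels with the same input value land on different quantization levels, which is exactly what breaks up the bands.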
One thing that's important to note: dithering on/off comparisons are a bit tricky, because resizing a screenshot of a dithered scene breaks the dithering unless one pixel in the image ends up corresponding exactly to one pixel on your screen.
Unlike the examples in this post, this dithering is basically invisible at high resolutions, but it’s still very much in use.
Is this a reply to something?
Back in the late 90s, maybe. GIFs and other paletted image formats were popular.
I even experimented with them. I designed various formats for The Palace. The most popular was 20-bit (6,6,6,2 RGBA; there was also a 5,5,5,5 variant, but the lack of color was noticeable: 15 bits versus 18 is quite a difference). This allowed fairly high color with anti-aliasing, i.e. edges that were semi-transparent.
The article points out that, historically, RAM limitations were a major incentive for dithering on computer hardware. (It's the reason Heckbert discussed in his dissertation, too.) Palettizing your framebuffer is clearly one solution to this problem, but I wonder whether chroma-subsampling hardware might have been a better idea.
The ZX Spectrum did something vaguely like this: the screen was 256×192 pixels, and each pixel was a single bit selecting between a foreground and a background color, with those colors supplied by "attribute bytes", each of which provided the color pair for an 8×8 region http://www.breakintoprogram.co.uk/hardware/computers/zx-spec.... This gave you a pretty decent simulation of a 16-color gaming experience while using only 1.125 bits per pixel instead of the 4 you would need on an EGA. So you got a near-EGA-color experience on half the RAM budget of a CGA, and you could move things around the screen much faster than on even the CGA. (The EGA, however, had a customizable palette, so ZX Spectrum game colors tend to be a lot more garish. The EGA also had 4.6× as many pixels.)
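The arithmetic behind that 1.125 bits per pixel, as a quick sanity check (layout details simplified; the Spectrum's actual bitmap addressing is famously non-linear):

```python
# ZX Spectrum screen memory budget
WIDTH, HEIGHT = 256, 192

bitmap_bytes = WIDTH * HEIGHT // 8          # 1 bit per pixel: 6144 bytes
attr_bytes = (WIDTH // 8) * (HEIGHT // 8)   # one byte per 8x8 cell: 768 bytes
total_bytes = bitmap_bytes + attr_bytes     # 6912 bytes

bits_per_pixel = total_bytes * 8 / (WIDTH * HEIGHT)   # 1.125

# Versus a 4-bit-per-pixel EGA-style framebuffer at the same resolution
ega_style_bytes = WIDTH * HEIGHT * 4 // 8   # 24576 bytes
```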
Occasionally in ZX Spectrum game videos like https://www.youtube.com/watch?v=Nx_RJLpWu98 you will see color-bleeding artifacts where two sprites overlap or a sprite crosses a boundary between two background colors. For applications like CAD the problem would have been significantly worse, and for reproducing photos it would have been awful.
The Nintendo did something similar, but I think it had four colors per tile instead of two.
So, suppose it was 01987 and your hardware budget permitted 8 bits per pixel. The common approach at the time was to set a palette and dither to it. But suppose that, instead, you statically allocated five of those bits to brightness (a Y channel providing 32 levels of grayscale before dithering) and the other three to a 4:2:0 subsampled chroma (https://www.rtings.com/tv/learn/chroma-subsampling has nice illustrations). Each 2×2 4-pixel block on the display would have one sample of chroma, which could be a 12-bit sample: 6 bits of U and 6 bits of V. Moreover, you can interpolate the U and V values from one 2×2 block to the next. As long as you're careful to avoid drawing text on backgrounds that differ only in chroma (as in the examples in that web page) you'd get full resolution for antialiased text and near-photo-quality images.
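The memory accounting for that hypothetical mode works out exactly: each 2×2 block carries 4×5 = 20 bits of Y plus one 12-bit UV sample, i.e. 32 bits per 4 pixels, or 8 bits per pixel. A sketch in Python, with a made-up bit layout just to show it fits (nothing here is real hardware):

```python
# Hypothetical framebuffer: 5-bit Y per pixel plus a 12-bit UV sample
# (6 bits U, 6 bits V) shared by each 2x2 block of pixels.
Y_BITS = 5
UV_BITS = 6 + 6
PIXELS_PER_BLOCK = 4   # a 2x2 block

bits_per_block = Y_BITS * PIXELS_PER_BLOCK + UV_BITS   # 20 + 12 = 32
bits_per_pixel = bits_per_block / PIXELS_PER_BLOCK     # exactly 8

def pack_block(ys, u, v):
    """Pack four 5-bit Y samples and one 6+6-bit UV sample into one
    32-bit word. The layout is invented, purely for illustration."""
    assert len(ys) == 4 and all(0 <= y < 32 for y in ys)
    assert 0 <= u < 64 and 0 <= v < 64
    word = 0
    for y in ys:
        word = (word << 5) | y
    return (word << 12) | (u << 6) | v

def unpack_block(word):
    """Recover the four Y samples and the U/V pair from a packed word."""
    v = word & 0x3F
    u = (word >> 6) & 0x3F
    ys = [(word >> (12 + 5 * i)) & 0x1F for i in range(3, -1, -1)]
    return ys, u, v
```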
That wouldn't liberate you completely from the need for dithering, but I think you could have produced much higher quality images that way than we in fact did with MCGA and VGA GIFs.