I run into this same failure mode often. We introduce purposeful scaffolding in the workflow that isn’t meant to stand alone, but exists solely to ensure the final output behaves as intended. Months later, someone is pitching how we should “lean into the bold saturated greens,” not realising the topic only exists because we specifically wanted neutral greens in the final output. The scaffold becomes the building.
In our work this kind of nuance isn’t optional, it is the project. If we lose track of which decisions are compensations and which are targets, outcomes drift badly and quietly, and everything built after is optimised for the wrong goal.
I’d genuinely value advice on preventing this. Is there a good name or framework for this pattern? Something concise that distinguishes a process artefact from product intent, and helps teams course-correct early without sounding like a semantics debate?
When you scan in a film you need to dust-bust it and generally clean it up (there are physical scars on the film from going through the projector, plus a shit ton of dust that needs to be physically or digitally removed, i.e. "busted").
Ideally you'd use a non-real-time scanner like this: https://www.filmlight.ltd.uk/products/northlight/overview_nl... which will capture both colour and infrared. The infrared channel can help automate dust and scratch removal.
If you're unlucky you'll use a telecine machine, https://www.ebay.co.uk/itm/283479247780 which runs much faster but has less time to dust-bust and properly register the film (so it'll warp more).
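A rough sketch of how that infrared channel can drive automated dust-busting (just an illustration, assuming OpenCV, with made-up filenames and threshold, not any particular scanner's pipeline): dust and scratches block infrared, so anything dark in the IR channel becomes an inpainting mask.

    import cv2
    import numpy as np

    # Hypothetical filenames; a real scanner would hand you RGB + IR per frame.
    rgb = cv2.imread("frame_0001_rgb.tif", cv2.IMREAD_COLOR)
    ir = cv2.imread("frame_0001_ir.tif", cv2.IMREAD_GRAYSCALE)

    # Dust and scratches block infrared, so they show up dark in the IR channel.
    # The threshold is a guess and would need tuning per scanner and film stock.
    _, mask = cv2.threshold(ir, 100, 255, cv2.THRESH_BINARY_INV)

    # Grow the mask slightly so the fill covers defect edges too.
    mask = cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=1)

    # Fill the masked defects from the surrounding picture.
    cleaned = cv2.inpaint(rgb, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
    cv2.imwrite("frame_0001_clean.tif", cleaned)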
However! That doesn't affect the colour. Those colour changes are deliberate and are a result of grading, i.e. a colourist has gone through and made changes to make each scene feel more effective. Ideally they'd alter the colour for emotion, but that depends on who's making the decision.
The mechanics are written out here: https://www.secretbatcave.co.uk/film/digital-intermediary/
Some examples:
https://www.reddit.com/r/Gameboy/comments/bvqaec/why_and_how...
The challenge is that everybody's memory is different, and sometimes those memories are "I wish the graphics were rock sharp without the artifacts of the CRT". Other times our memories are of the crappy TV we were given as kids that was on its last legs, went black & white, and flickered a lot.
The reality is that no matter what the intentions of the original animation teams were, the pipeline of artwork through film transfer to projection to reflection to the viewer's own eyeballs and brain has enough variety to it that it's simply too variable -- and too personal -- to really say what is correct.
Anecdote: one of the local theaters I grew up with was extremely poorly maintained, had a patched rip on one of the several dirty screens, and had projectors that would barely get through an hour of film without needing a "bump" from the projectionist (allowing the audience to go out and get more refreshments halfway through most films). No amount of intentionality by the production companies of the many films I saw there could have accounted for any of that. But I saw many of my favorite movies there.
I've come around to the opinion that these things are like wine. A good wine is the one you enjoy. I have preferences for these things, but they sometimes change, and other people are allowed to enjoy things in their own way.
It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.
Noodle made a charming video about going mad researching this: https://www.youtube.com/watch?v=lPU-kXEhSgk
Being the man responsible for the creation of Pixar, Industrial Light & Magic, Skywalker Sound, Lucasfilm Games, THX, and Kerner Optical is a very impressive accomplishment, and that's all secondary to the main accomplishment he's known for: Star Wars.
If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.
Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.
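For a sense of what those shaders actually do, here is a deliberately crude sketch (nothing like a real CRT shader, just two of the most recognisable ingredients, with invented numbers): darken alternating scanlines and soften the image horizontally.

    import numpy as np

    def crude_crt_look(frame, scanline_strength=0.35):
        """Very rough CRT approximation for an HxWx3 float image in [0, 1].
        Real CRT shaders also model the phosphor mask, bloom and gamma."""
        out = frame.astype(np.float32).copy()
        # Darken odd rows to fake the dark gaps between scanlines.
        out[1::2, :, :] *= (1.0 - scanline_strength)
        # Cheap horizontal softening: blend each pixel with its neighbours.
        out[:, 1:-1, :] = (out[:, :-2, :] + 2 * out[:, 1:-1, :] + out[:, 2:, :]) / 4.0
        return np.clip(out, 0.0, 1.0)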
Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:
Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."
(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")
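If you did want to fake that weave digitally, it is essentially a small, slowly drifting vertical offset per frame; a minimal sketch (the amplitude and drift rate here are guesses, not measurements):

    import numpy as np

    def apply_film_weave(frames, max_offset_px=3, drift_hz=0.7, fps=24, seed=0):
        """Shift each frame (HxWx3 array) vertically by a slowly drifting,
        slightly random amount to imitate film weave."""
        rng = np.random.default_rng(seed)
        for i, frame in enumerate(frames):
            t = i / fps
            offset = max_offset_px * np.sin(2 * np.pi * drift_hz * t)
            offset += rng.normal(scale=0.5)  # frame-to-frame jitter
            yield np.roll(frame, int(round(offset)), axis=0)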
Side note - I wonder if it's a millennial thing that our memories are worse due to modern technology, or perhaps we are more aware of false memories due to the sheer availability of information like this blog post.
Toy Story is the only Pixar movie ever released on Laserdisc (along with all their shorts, in the same box set). Disney also released a lot of their 90s animation on Laserdisc.
So if you're a true cinephile, seek out the Laserdisc versions.
I suspect having shader plugins for TV and movie watching will become a thing.
"The input is supposed to be 24 FPS, so please find those frames from the input signal. Use AI to try to remove compression artifacts. Regrade digital for Kodak 35mm film. Then, flash each frame twice, with blackness in-between to emulate how movie theaters would project each frame twice. Moderate denoise. Add film grain."
I don't actually know what kind of filters I'd want, but I expect some people will have very strong opinions about the best way to watch given movies. I imagine browsing settings, like browsing user-contributed Controller settings on Steam Deck...
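The skeleton of such a preset isn't far-fetched; here is a hand-wavy sketch of just the double-flash and grain steps (the de-artifacting and Kodak regrade would need real models or LUTs, and every number here is invented):

    import numpy as np

    def double_flash(frames):
        """Show each 24 fps frame twice with black in between,
        like a two-blade projector shutter."""
        for frame in frames:
            black = np.zeros_like(frame)
            yield frame
            yield black
            yield frame
            yield black

    def add_grain(frame, strength=0.04, rng=None):
        """Add simple monochrome noise as stand-in grain (real film grain is
        spatially correlated and per-channel; this is only a placeholder)."""
        if rng is None:
            rng = np.random.default_rng()
        noise = rng.normal(scale=strength, size=frame.shape[:2])[..., None]
        return np.clip(frame + noise, 0.0, 1.0)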
> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.
While I have no doubt that it hadn't been done at this scale and resolution before, it struck me that I'd heard about this concept in a podcast episode [1] discussing very early (1964) computer animation alongside the SC4020 microfilm printer, which used a Charactron CRT to expose text or plotted lines onto film.
[1] https://adventofcomputing.libsyn.com/episode-88-beflix-early...
And I don't think I'm even being bitten by the nostalgia bug per se, because it was already a nostalgic fad long gone from any cinema near me when I grew up.
I’m sure many young people feel the exact opposite.
Load it up in DaVinci Resolve, knock the saturation and green curve down a bit, and boom, it looks like the film print.
Or you could slap a film-look LUT on, but you don't need to go that far.
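If you don't have Resolve handy, that kind of rough adjustment is only a few lines anyway; here is a sketch with made-up numbers that only gestures at the direction of the change, not a match for any actual print:

    import numpy as np

    def knock_down_sat_and_green(frame, sat=0.85, green_gamma=1.1):
        """Crude stand-in for 'lower the saturation, pull the green curve down'
        on an HxWx3 RGB frame in [0, 1]. The numbers are illustrative guesses."""
        # Desaturate by mixing each pixel toward its luma.
        luma = frame @ np.array([0.2126, 0.7152, 0.0722])
        out = luma[..., None] + sat * (frame - luma[..., None])
        # A gamma above 1 on the green channel pulls its mid-tones down.
        out[..., 1] = np.clip(out[..., 1], 0.0, 1.0) ** green_gamma
        return np.clip(out, 0.0, 1.0)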
As the Aladdin still with its wildly altered colors shows, other aspects clearly matter / are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on dynamic range (DR). It's just so interesting to me.
Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.
https://www.vulture.com/2019/07/motion-smoothing-is-ruining-... https://www.filmindependent.org/blog/hacking-film-24-frames-...
If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/
It might be a fun experiment to make custom rips of these movies that look more like their theatrical releases. I'm curious how close one can get without needing to source an actual 35mm print.
Now, while I liked the first three, the first one always has a special place, because at the time it was really quite new-ish. Fully computer-animated movies were quite rare. Pixar did several shorts before Toy Story, and I think there were some other movies too, give or take, but Toy Story kind of changed everything past that. Unfortunately many other computer-generated movies are absolute garbage nowadays. The big movie makers want money and don't care about anything else, so they lose the interest of people who are not super-young anymore, because let's face it: older people are less likely to watch the latest Marvel 3D animated zero-story movie that is a clone of prior clones.
It would be nice if AI, despite it also sucking to no end, could allow us to produce 3D movies with little effort. I have a fantasy game world my local pen-and-paper RPG group built. It would be interesting to feed it a ton of data (we have generated all of that over decades already) and come up with an interesting movie that tells the story of part of it. This is just one example of many. Quality-wise I still like Toy Story: I find it historically important, and it was also good for its time (all first three actually, although the storylines got progressively weaker; I did like Ken and Barbie, but too much of the story seemed to go into milking more money out of selling toys rather than telling a story. Tim Allen as Buzz Lightyear was always great though.)
I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.
It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.
This is not true at all. Being compatible with outdated, film-based projectors was much more important for being able to show it in as many theaters as possible. If they had wanted to do a digital screening, it would have been technologically possible.
And here I was thinking of re-watching some old Disney/Pixar movies soon :(
I wonder if artificial grain would actually make it look better.
Like when the game Splinter Cell was released, there were two additional 'views' simulating infrared and thermal cameras. Those had heavy noise added to them and felt so real compared to the main view.
---
> see the 35 mm trailer for reference
The article makes heavy use of scans of trailers to show what the colours, grain, sharpness, etc. looked like. This is quite problematic, because you are relying on a scan done by someone on the Internet to accurately depict what something looked like in a commercial cinema. Now, I am not a colour scientist (far from it!), but I am a motion picture film hobbyist and so can speak a bit about some of the potential issues.
When projected in a movie theatre, light is generated by a short-arc xenon lamp. This has a very particular output light spectrum, and the entire movie process is calibrated and designed to work with this. The reflectors (mirrors) in the lamphouse are tuned to it, the films are colour graded for it, and then the film recorders (cameras) are calibrated knowing that this will be how it is shown.
When a film is scanned, it is not lit by a xenon short-arc lamp; various other illumination methods are used depending on the scanner, with CRTs and LEDs being common. Commercial scanners are, on the whole, designed to scan negative film. It's where the money is - and so they are set up to work with that, which is very different to positive movie release film stock. Scanners therefore have different profiles to try and capture the different film stocks, but in general, today's workflow involves scanning something in and then colour correcting post-scan to meet an artist's expectations/desires.
Scanning and accurately capturing what is on a piece of film is something that is really quite challenging, and not something that any commercial scanner today does, or claims to do.
The YouTube channels referenced are FT Depot and 35mm Movie Trailers Scans. FT Depot uses a Lasergraphics 6.5K HDR scanner, which is quite a high-end one today. It does have profiles for individual film stocks, so you can set that and then get a good scan, but even its sales brochure says:
> Many common negative film types are carefully characterized at Lasergraphics to allow our scanning software to compensate for variation. The result is more accurate color reproduction and less time spent color grading.
Note that it says less time is spent colour grading - it is still not expected to accurately capture exactly what was on the film. It also specifies negative; I don't know whether it has positive stock profiles, as I am not lucky enough to have worked with one - for this, I will assume it does.
The "scanner" used by 35mm Movie Trailers Scans is a DIY, homemade film scanner that (I think, at least the last time I spoke to them) uses an IMX-183 sensor. They have both a colour sensor and a monochrome sensor, I am not sure what was used to capture the scans linked in the video. Regardless of what was used, in such a scanner that doesn't have the benefit of film stock profiles, etc. there is no way to create a scan that accurately captures what was on the film, without some serious calibration and processing which isn't being done here. At best, you can make a scan, and then manually adjust it by eye afterwards to what you think looks good, or what you think the film looks like, but without doing this on a colour calibrated display with the original projected side-by-side for reference, this is not going to be that close to what it actually looked like.
Now, I don't want to come off as bashing a DIY scanner - I have made one too, and they are great! I love seeing the scans from them, especially old adverts, logos, snipes, etc. that aren't available anywhere else. But, it is not controversial at all to say that this is not colour calibrated in any way, and in no way reflects what one actually saw in a cinema when that trailer was projected.
All this is to say that statements like the following in the article are pretty misleading, as the differences may not be attributable to the direct-digital-release process at all, and could just be that a camera's white balance was set wrong, or that post-processing towards what "looked good" came out different from the original:
> At times, especially in the colors, they’re almost unrecognizable
> Compared to the theatrical release, the look had changed. It was sharp and grainless, and the colors were kind of different
I don't disagree with the premise of the article - recording an image to film, and then scanning it in for a release _will_ result in a different look to doing a direct-digital workflow. That's why major Hollywood films spend money recording and scanning film to get the "film look" (although that's another can of worms!). It's just not an accurate comparison to put two images side by side, when one is of a trailer scan of unknown accuracy.
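To illustrate how far a white balance alone can move things (a toy example, unrelated to the actual scanners discussed above): applying per-channel gains, which is roughly what a white-balance setting does, shifts every colour in the frame even though nothing on the film changed.

    import numpy as np

    def apply_white_balance(frame, gains=(1.15, 1.0, 0.85)):
        """Multiply R, G, B of an HxWx3 float frame by per-channel gains.
        The gains are arbitrary; even this small an error visibly warms
        the whole image, so two scans can disagree everywhere."""
        return np.clip(frame * np.asarray(gains, dtype=frame.dtype), 0.0, 1.0)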
LOL, what? Anyone with a Blu-Ray rip file and FFmpeg can decide how it looks to them.