Besides its game library, the PS2 is the most successful video game console of all time in terms of units shipped. It stayed on the market for over ten years, featured a DVD drive, and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer, complete with an official PS2 Linux distribution.
In a more perfect world, this would have:
(a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two: a smidge better than the former, but not quite as exotic as the latter (with its Cell CPU or the weird form factor; whereas the PS2's physical profile in comparison was perfect, whether in the original form or the Slim version), which could have:
(b) resulted in a sort of industry standardization like what happened to the IBM PC and its market of clones, with other vendors continuing to manufacture semi-compatible units even if/when Sony discontinued it themselves, periodically revving the platform (doubling the amount of memory here, providing a way to tap into higher clock speeds there), all while maintaining backwards compatibility. You would then be able to go out today and buy a brand-new, $30 bargain-bin, commodity "PS2 clone" that can do basic computing tasks (in other words, not including the ability to run a modern Web browser or Electron apps), can play physical media, and supports all the original games plus any other new games that explicitly target(ed) the same platform. Or you could pay Steam Machine 2026 prices for the latest-gen "PS2" that retains native support for the original titles of the very first platform revision and also unlocks the ability to play the titles of every intermediate rev.
Since they were able to port the interpreter[2] over, they have been able to rapidly start porting these titles even with a small volunteer team.
2. https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp
I would think that emulation of the original game as closely as possible would be the gold standard of preservation, and native ports would be a cool alternative. As described in the article, native ports are typically not faithful reproductions but enhanced to use the latest hardware.
As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!" or "I don't want to see anything from the 90s, yuck" - and that's pretty common.
Of course, "Nosferatu, eine Symphonie des Grauens" is not for everyone, but I firmly believe that you can watch the new Dune and Lawrence of Arabia back to back and have similarly enjoyable time.
Fallout 1 and 2 are miles ahead of Fallout 3 (mostly due to the uncanny valley phenomenon). Sure, the medium has changed a lot and modern consumers are used to a more streamlined experience - my favorite example is the endless stream of Baldur's Gate "modern reimplementations" or rehashes, like Pillars of Eternity, that were too close to the original source, and then, suddenly, someone came up with Divinity, basically a Baldur's Gate clone but with a modern UI and QoL improvements.
But consoles are different.
This can truly be a window for the next generation to look back into the past.
However, that approach will probably only suit the least ambitious PC-to-PS2 ports (by studios that didn't appreciate the difference) - much as an ST emulator was a shortcut for running the simplest Amiga games.
The latter means that even in the absence of a JIT, you would need to achieve 100% code coverage (akin to unit testing or fuzzing) to perform static recompilation; otherwise you need to compile code at runtime, at which point you're back to state-of-the-art emulation with a JIT. The only real downside of JITs is the added latency, similar to the lag induced by shader compilation, but this could be addressed by a smart code cache instead. That code cache realistically only needs to store a trace of potential starting locations; the JIT can then compile the code before starting the game.
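To make that "smart code cache" idea concrete, here is a minimal C++ sketch; the names (BlockStartCache, compile_block) are invented for illustration and are not taken from any real emulator:

    // Persist the guest addresses where the JIT started compiling blocks, so a
    // later run can warm the code cache before the game boots.
    #include <cstdint>
    #include <fstream>
    #include <unordered_set>

    class BlockStartCache {
    public:
        // Called by the JIT every time it compiles a block at a new guest PC.
        void record(uint32_t guest_pc) { starts_.insert(guest_pc); }

        // Dump the discovered entry points when the emulator shuts down.
        void save(const char* path) const {
            std::ofstream out(path, std::ios::binary);
            for (uint32_t pc : starts_)
                out.write(reinterpret_cast<const char*>(&pc), sizeof(pc));
        }

        // On the next launch, replay the trace and hand every entry point to the
        // JIT up front, so compilation latency is paid before gameplay starts.
        template <typename CompileFn>
        void warm(const char* path, CompileFn compile_block) {
            std::ifstream in(path, std::ios::binary);
            uint32_t pc = 0;
            while (in.read(reinterpret_cast<char*>(&pc), sizeof(pc)))
                compile_block(pc);  // e.g. jit.compile(pc) in a real emulator
        }

    private:
        std::unordered_set<uint32_t> starts_;
    };

A real emulator would also want to key the cache by game (and invalidate on self-modifying code), but the core idea is simply persisting the discovered block entry points between runs.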
Note that this "recompilation" and the "decompilation" projects like the famous Super Mario 64 one are almost orthogonal approaches in a way that the article failed to understand; this approach turns the assembly into C++ macros and then compiles the C++ (so basically using the C++ compiler as a macro re-assembler / emulation recompiler in a very weird way). The famous Super Mario 64 decompilation (and openrct and so on) use the output from an actual decompiler which attempts to reconstruct C from assembly, and then modify that code accordingly (basically, converting the game's object code back into some semblance of its source code, which this approach does NOT do).
Won't it be very difficult for the recompilation process or the dev to recognise when these are being relied on and to match the key behaviour?
Or is the idea to pull out the basics of the game structure in a form that runs on modern hardware, and then have the dev flesh out the missing parts?
Because Nint€ndo or $ony (and other game companies) have a big problem: their old games are awesome, and if people can play those games, they will be happy and won't need new games or new sagas.
Because the problem is not people playing old games; the real problem is that those people will not pay for new games.
And we know that these companies have an army of lawyers (and "envelopes" to distribute among politicians) to change the laws and make illegal something that currently is not.
Fortunately, a Debug build of this game was found on a dev unit (somehow), and that build does _not_ have the crazy optimizations (like link-time optimization) that would otherwise make this feat nearly impossible.
I am not somebody who is deep into low-level assembly, but I love this game (and Rock Band 3, which uses the same engine), and I was curious to see how far I could get by building AI tools to help with this. A project of this magnitude is ... a gargantuan task. Maybe 50k hours of human effort? Could be 100k? Hard to say.
Anyway, I've been able to make significant progress by building tools for Claude Code to use and just letting Haiku rip. Honestly, it blows me away. Here is an example that is 100% decompiled now (the functions compile to the exact same code as in the binary the devs shipped):
https://github.com/freeqaz/dc3-decomp/blob/test-objdiff-work...
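For anyone wondering what "compiles to the exact same code" means in practice, here is a heavily simplified C++ sketch of the check; real tooling such as objdiff compares per-function disassembly with proper symbol handling, and the file names, offsets, and sizes below are placeholders:

    // Compile the candidate C/C++ with the original compiler and flags, then
    // verify the emitted machine code is byte-identical to the same function
    // in the shipped binary.
    #include <cstddef>
    #include <cstdio>
    #include <fstream>
    #include <vector>

    static std::vector<char> read_slice(const char* path, std::size_t offset, std::size_t size) {
        std::ifstream in(path, std::ios::binary);
        in.seekg(static_cast<std::streamoff>(offset));
        std::vector<char> buf(size);
        in.read(buf.data(), static_cast<std::streamsize>(size));
        return buf;
    }

    int main() {
        // Placeholder locations of one function's code in each file.
        auto shipped = read_slice("shipped_binary.bin", 0x1000, 0x80);
        auto rebuilt = read_slice("rebuilt_object.bin", 0x0, 0x80);
        bool match = (shipped == rebuilt);
        std::printf("%s\n", match ? "MATCH" : "MISMATCH");
        return match ? 0 : 1;
    }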
I've now added and worked on over 1k functions in my branch[0]. Some of it is slop, but I wrote a skill that's been able to get the code quite decent with another pass. I even implemented vmx128 (custom 360-specific CPU instructions) in Ghidra and m2c to allow them to decompile more code. Blows my mind that this is possible with just hours of effort now!
Anybody else played with this?
0: https://github.com/freeqaz/dc3-decomp/tree/test-objdiff-work...
2 out of 4 links in the article are messed up; that's mind-boggling... On a tech blog!
Is that how deep we've sunk to assert it wasn't written by AI?