It's the closest thing to a Unix successor we ever got, taking the "everything is a file" philosophy to another level and letting you easily share those files over the network to build distributed systems. Accessing remote resources is easy and robust on Plan9, while on other systems we need to install specialized software with bad interoperability for each individual use case.
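To make the file metaphor concrete, here's the classic Plan 9 idiom of dialing a TCP connection through /net, rendered in Python. Treat it as an illustrative sketch only: real Plan 9 code does this in C via dial(), and the /net paths follow Plan 9's convention.

```python
# Dialing TCP on Plan 9 is just reading and writing files under /net.
# Illustrative only: the /net file tree is provided by the kernel
# (or imported from another machine over 9P).
ctl = open("/net/tcp/clone", "r+")
conn = ctl.readline().strip()            # kernel allocates a connection dir, e.g. "4"
ctl.write("connect 192.168.1.10!80\n")   # a control message dials the host
ctl.flush()

with open(f"/net/tcp/{conn}/data", "r+b") as data:  # the connection is a file
    data.write(b"GET / HTTP/1.0\r\n\r\n")
    print(data.read())
```

Import another machine's /net over 9P and the identical code dials out from that machine instead; that's the whole distributed-systems trick.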
Plan9 also had some innovative UI features, such as mouse chording to edit text, nested window managers, the Plumber to run user-configurable commands on known text patterns system-wide, etc.
Its distributed nature should have meant it's perfect for today's world with mobile, desktop, cloud, and IoT devices all connected to each other. Instead, we're stuck with operating systems that were never designed for that.
There are still active forks of Plan9 such as 9front, but the original from Bell Labs is dead. The reasons it died are likely:
- Legal challenges (Plan9 license, pointless lawsuits, etc.) meant it wasn't adopted by major players in the industry.
- Plan9 was a distributed OS during a time when having a local computer became popular and affordable, while using a terminal to access a centrally managed computer fell out of fashion (though the latter sort of came back, in a worse form, with cloud computing).
- Bad marketing and positioning itself as merely a research OS meant they couldn't capitalize on the .com boom.
- AT&T lost its near-endless source of telephone revenue. Bell Labs was sold multiple times over the following years, and a lot of the Unix/Plan9 people went to other companies like Google.
- MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. It probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be covered up to justify Apple's bailout of NeXT to get Steve Jobs.
- Transaction processing operating systems. The first one was IBM's Customer Information Control System (CICS). A transaction processor is a kind of OS where everything is like a CGI program: load program, do something, exit program. Unix and Linux are, underneath, terminal-oriented time-sharing systems.
- IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU. Channels could run simple channel programs, and managed device access to memory. IBM tried to introduce that with the PS/2, but they made it proprietary and it failed in the marketplace. Today, everything has something like channels, but they're not a unified interface concept that simplifies the OS.
- CPUs that really hypervise properly. That is, virtual execution environments look just like real ones. IBM did that in VM, and it worked well because channels are a good abstraction for both a real machine and a VM. Storing into device registers to make things happen is not. x86 has added several layers below the "real machine" layer, and they're all hacks.
- The Motorola 680x0 series. Should have been the foundation of the microcomputer era, but it took way too long to get the MMU out the door. The original 68000 came out in 1979, but then Motorola fell behind.
- Modula. Modula 2 and 3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.
- XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly? (See the sketch just after this list.)
- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.
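Apropos the XHTML item above: the contrast is easy to demonstrate with Python's standard library, where the XML parser rejects malformed markup outright while the HTML parser recovers and carries on. A minimal sketch; exact error messages will vary.

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

bad = "<p><b>unclosed bold</p>"

# XHTML-style strictness: the first well-formedness error is fatal
try:
    ET.fromstring(bad)
except ET.ParseError as err:
    print("rejected:", err)             # mismatched tag

# HTML5-style forgiveness: the parser soldiers on, and the spec's
# recovery rules decide what tree the author "meant"
class Dump(HTMLParser):
    def handle_starttag(self, tag, attrs): print("open ", tag)
    def handle_endtag(self, tag): print("close", tag)

Dump().feed(bad)                        # no error raised
```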
Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks.
Google Picasa: Everything local, so fast, so good. I'm never going to give my photos to G Photos.
Google Hangouts: Can't keep track of all the Google chat apps. I use Signal now.
Google G Suite Legacy: It was supposed to be free forever. They killed it, tried to make me pay. I migrated out of Google.
Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
Google Finance: Tracked my stocks and funds there. Then they killed it. Won't trust them with my data again.
Google NFC Wallet: They killed it. Then Apple launched the same thing, and took over.
Google Chromecast Audio: It did one thing, which is all I needed. Sold mine as soon as they announced they were killing it.
Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this...
VMs persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.
[1] https://en.wikipedia.org/wiki/Midori_%28operating_system%29
But now, with the new Meta Ray-Bans featuring a light field display and with new media like Gaussian splats, we're on the verge of being able to make full use of all the data those cameras were able to capture, beyond the "what if you could fix your focus after shooting" demos of back then.
Beyond high tech, there's a big market for novelty kinda-bad cameras like Polaroids or Instax. The first Lytro has the perfect form factor for that and was already bulky enough that slapping a printer on it wouldn't have hurt.
They burned through $5B of 1999 dollars, building out a network in 23 cities, and had effectively zero customers. Finally shut down in 2001.
All their marketing was focused on "mobile professionals", whoever those were, while ignoring home users who were clamoring for faster internet where other ISPs dragged their feet.
Today, 5G femtocells have replicated some of the concept (radically small cell radius to increase geographic frequency reuse), but without the redundancy -- a femtocell that loses its uplink is dead in the water, not serving as a relay node. A Ricochet E-radio that lost its uplink (but still had power) would simply adjust its routing table and continue operating.
I often wonder, if AI had come 15 years earlier, would it have been a ton better because there weren't a billion different ways to do things? Would we have ever bothered to come up with all the different tech, if AI was just chugging through features efficiently, with consistent training data etc.?
It has been in existence in some form or another for nearly 30 years, but it never gained the traction it needed, and as of writing it's still not in a usable state on real hardware. It's not abandoned, but progress is moving so slowly that I doubt we'll ever see it released in a state that's useful for real users.
It's too bad, because a drop-in Windows replacement would be nice for all the people losing Windows 10 support right now.
On the other hand, I think people underestimate the difficulty involved in the project and compare it unfavorably to Linux, BSD, etc. Unix and its source code were publicly documented and well understood for decades before those projects started; nothing like that ever really existed for Windows.
10+ years ago I'd regularly build all sorts of little utilities with it. It was surprisingly easy to use it to tap into things that are otherwise a lot more work. For instance I used it to monitor the data coming from a USB device. Like 3 nodes and 3 patches to make all of that work. Working little GUI app in seconds.
Apple hasn't touched it since 2016, I kind of hope it makes a comeback given Blender and more so Unreal Engine giving people a taste of the node based visual programming life.
You can still download it from Apple, and it still technically works, but a lot of the most powerful nodes are broken in the newer OSes. I'd love to see the whole thing revitalized.
The internet before advertising, artificial intelligence, social media and bots. When folks created startups in their bedrooms or garages. The days when Google's slogan was “don’t be evil”.
Crazy fast compiler, so it doesn't frustrate students learning by trial and error; a decent type system without the wildness of, say, Rust; and all the basic programming building blocks you want students to grasp, without language-specific funkiness.
Full C# instead of god-forbidden JS.
Full vector, DPI-aware UI, with grid, complex animation, and all the other stuff that HTML5/CSS didn't have in 2018 but Silverlight had even in 2010 (probably even earlier).
MVVM pattern, two-way bindings. Expression Blend (basically Figma) let designers create UI that was XAML, had sample data, and could be used by devs as-is with maybe some cleanup.
Excellent tooling, static analysis, debugging, what have you.
Rendered and worked exactly the same in any browser (Safari, IE, Chrome, Opera, Firefox) on Mac and Windows.
If that thing still worked, boy would we be in a better place regarding web apps.
Unfortunately, the iPhone killed Adobe Flash, and Silverlight as an aftermath: too slow a processor, too much energy consumption.
- Client software that ran a VM which received "objects" from a central server (complete with versioning so it would intelligently download new objects when necessary). Versions were available for IBM (DOS), Windows, and Mac. Think of it as an early browser.
- Multiple access points and large internal network for storing and delivering content nationwide. This was their proprietary CDN.
- Robust programming language (TBOL/PAL) for developing client-side apps which could also interact with the servers. Just like Javascript.
- Vector (NAPLPS) graphics for fast downloading (remember, Prodigy started in the days when modems maxed out at 1200 baud); later they added JPG support.
- Vast array of online services: shopping, banking, nationwide news, BBSes, mail (before Internet email was popular), even airline reservations.
All this was run by a partnership between IBM, Sears, and CBS (the latter dropped out early). They were the Google of the time.
The creator, kentonv (on HN), commented about it recently here https://news.ycombinator.com/item?id=44848099
It looked a bit goofy in the promo videos, but under the hood it was doing real-time chord detection and accompaniment generation. Basically a prototype of what AI music tools like Suno, Udio, or Mubert are doing today, fifteen years too early.
If Microsoft had kept iterating on it with modern ML models, it could’ve become the "GarageBand for ideas that start as a hum."
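For anyone curious what "real-time chord detection" boils down to, here's a minimal sketch of chroma template matching, the textbook approach (assumed here for illustration; not necessarily Songsmith's actual algorithm):

```python
import numpy as np

# Match a frame's pitch-class energy against major/minor triad templates.
NOTES = ["C","C#","D","D#","E","F","F#","G","G#","A","A#","B"]

def template(root, minor=False):
    t = np.zeros(12)
    for iv in ([0, 3, 7] if minor else [0, 4, 7]):  # triad intervals
        t[(root + iv) % 12] = 1.0
    return t

def detect(chroma):
    best, name = -1.0, None
    for root in range(12):
        for minor in (False, True):
            score = chroma @ template(root, minor)   # correlation with template
            if score > best:
                best, name = score, NOTES[root] + ("m" if minor else "")
    return name

# A frame of hummed audio reduced to pitch-class energy: strong C-E-G
frame = np.zeros(12)
frame[[0, 4, 7]] = [1.0, 0.8, 0.9]
print(detect(frame))  # -> C
```

Run per audio frame, with some smoothing across frames, and you have the skeleton of an auto-accompaniment system.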
This would have changed so much. Desktop apps powered by the engine of Firefox not Chrome.
Why? Not enough company buy-in, not enough devs worked on it. Maybe developed before a major Firefox re-write?
https://wikipedia.org/wiki/Kuro5hin
I was a hold out on smartphones for a while and I used to print out k5 articles to read while afk... Just such an amazing collection of people sharing ideas and communal moderation, editing and up voting.
I learned about so many weird and wonderful things from that site.
Instead it went chasing markets, abandoning existing users as it did so, in favour of potential larger pools of users elsewhere. In the end it failed to find a niche going forward while leaving a trail of abandoned niches behind it.
It was a series of experiments with new approaches to programming. Kind of reminded me of the research that gave us Smalltalk. It would have been interesting to see where they went with it, but they wound down the project.
They had built a solid streaming platform for low latency cloud gaming but failed hard on actually having interesting games to play on it. You just can't launch a gaming platform with a handful of games that have been available everywhere and expect it to succeed.
1. competing visions for how the entire system should work
2. dependence on early/experimental npm libraries
3. devs breaking existing features due to "innovation"
4. a lot of interpersonal drama because it was not just open source but also a social network
the ideas are really good, someone should make the project again and run with it
In 2011, before TypeScript, Next.js or even React, they had seamless server-client code, in a strongly typed functional language with support for features like JSX-like inline HTML, async/await, string interpolation, built-in MongoDB ORM, CSS-in-JS, and many syntax features that were added to ECMAScript since then.
I find it wild how this project was 90%+ correct on how we will build web apps 14 years later.
[1] https://austral-lang.org/ [2] https://austral-lang.org/spec/spec.html
- Based on BitTorrent ideas
- Completely decentralized websites' code and data
- Either completely decentralized or controllable-decentralized authentication
- Could be integrated into existing websites (!)
It's not exactly dead (there's a supported fork), but it still feels like a revolution that did not happen. It works really well.
All the buzz in the 2020s about WASM giving websites the ability to run compiled code at native speed, letting pages network with your server via WebRTC?
Yeah, you could do that with Java Applets in 1999.
If Sun (and later Oracle) had been less bumbling and more visionary -- if they hadn't forced you to use canvas instead of integrating Java's display API with the DOM, if they had a properly designed sandbox that wasn't full of security vulnerabilities?
Java and the JVM could have co-evolved with JavaScript as a second language of the Web. Now Java applets are well and truly dead; the plugin's been removed from browsers, and even the plugin APIs that allowed it to function have been deprecated and removed (I think; I'm not 100% sure about that).
First Class had a broader userbase, such as schools and organizations in the groupware/collaborative segment (but also Mac user groups and so on).
First Class was a commercial product (the server). It had filesharing (UL/DL), its own desktop, mail, chat, IM, voice mail and more. It started out on Mac, but later became cross-platform. You can still find presentations and setup guides on old forgotten university/school websites.
Hotline, on the other hand, was very easy to set up and also pretty lightweight. It had a server tracker. In the beginning it was Mac only. Lots of warez servers, but also different (non-warez) communities. It had filesharing (UL/DL from the file area), chat and a newsboard. The decline came after its developers released the Windows versions. Most servers became clickbait pron/warez with malware etc. People started to move away to the web, and Hotline basically died out.
Now, there were some open source/clone projects that kept the spirit alive. But after a few years, web forums, torrents and other p2p apps took over. Still, there are some servers running in 2025, and open source server/client software is still being developed.
Compared to First Class, Hotline was the Wild West. It only took 15 minutes to set up your own server and announce it on a server tracker (or keep it private).
When I use Discord and other apps/services, it's not hard to think of FC/HL. But then, they were solutions of their time.
More about: https://en.wikipedia.org/wiki/FirstClass
https://en.wikipedia.org/wiki/Hotline_Communications
https://www.macintoshrepository.org/6691-hotline-connect-cli...
They replaced StumbleUpon with "Mix", whatever it is. Probably because they didn't know how to earn money from it. Sad.
What was the bookmarks social tool called from the '00s? I loved it and it fell off the earth. You could save your bookmarks, “publish” them to the community, share, etc.
Whatever happened to those build-your-own-homepage apps like startpage (I think)? I always thought those would take off.
I just wanna make a mostly static site with links in and out of my domain. Maybe a light bit of interactivity for things like search that autocompletes.
I felt like the OS hit a stride around 8.1, with some markets sporting somewhat impressive marketshare, and the corporate politics of the whole situation and Nokia merger screwed it up badly.
I really think if Microsoft had doubled down and focused on getting flagship devices to all 4 flagship carriers it would have gone somewhere.
But I remember at the time having a dead end of hardware where competitors were putting out new phones on all 4 carriers every year. With Windows Phone you were hopping between carrier exclusives or getting nothing because all the new Nokia/Microsoft phones were low end or mid-range at best.
Ozzie, who had previously worked at IBM, was particularly interested in the challenge of remote collaboration. His vision culminated in the creation of Groove, which was released in 2001. The software distinguished itself from other collaboration tools of the time by allowing users to share files and work on documents in real-time—even without a continuous internet connection.
Groove’s architecture was innovative in that it utilized a peer-to-peer networking model, enabling users to interact directly with each other and share information seamlessly. This approach allowed for a level of flexibility and responsiveness that was often missing in traditional client-server models. Asynchronous collaboration was a key feature, where team members could work on projects without needing to be online simultaneously.
https://umatechnology.org/what-happened-to-microsoft-groove/
We built some things on it, was like CRDT for all the things.
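If "CRDT for all the things" sounds abstract: the simplest CRDT shows why Groove-style peer-to-peer sync doesn't need a server to arbitrate. A grow-only counter, as a toy sketch (nothing to do with Groove's actual wire format):

```python
# Two offline replicas increment independently, then merge in any order
# and still converge: merge is commutative, associative, and idempotent.
class GCounter:
    def __init__(self, node):
        self.node, self.counts = node, {}

    def incr(self):
        self.counts[self.node] = self.counts.get(self.node, 0) + 1

    def merge(self, other):
        for node, count in other.counts.items():
            self.counts[node] = max(self.counts.get(node, 0), count)

    def value(self):
        return sum(self.counts.values())

a, b = GCounter("alice"), GCounter("bob")   # two disconnected peers
a.incr(); a.incr(); b.incr()
a.merge(b); b.merge(a)                      # sync when they reconnect
assert a.value() == b.value() == 3
```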
Sorta related, iPod car interface was a reliable way to play and control music, now replaced with CarPlay which has problems and also messes up your nav.
Connect your phone to a display, mouse, keyboard and get a full desktop experience.
At the time smartphones were not powerful enough, cables were fiddly (adapters, HDMI, USB-A instead of a single USB-C cable), and virtualization and containers were not quite there.
Today, going via pKVM seems like a promising approach. Seamless sharing of data, apps, etc. will take some work, though.
Just on principle, I'd have liked to see it on the market for more than 49 days! It pains me as an engineer to think of the effort to bring a hardware device to market for such a minuscule run.
Instead it went into a slow death spiral due to Windows 95.
That wasn't the case pre-internet or pre-cellphone, when I remember pining for something resembling those technologies.
[1] http://web.archive.org/web/20201014024057/https://www.youtub...
[2] https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
Also, I did not experience them personally, but I love watching computing history videos on YouTube, and a lot of the computers and operating systems from the 1980s and early 1990s got buried too soon, mostly because their owners were short-sighted idiots who didn't realize the full potential of what computers and video games could become; they had wildly successful hits with legions of faithful fans, but didn't know how to build on that success or what the fans actually wanted to see in updated hardware.
It's 2025 and we still haven't solved secure online identification, and we are still not using end-to-end encryption for e-mail; most e-mail is not even signed.
Interaction with state agencies is still mostly via paper-based mail. The only successfully deployed online offering of the German state administration seems to be the online portal for tax filings, “elster.de”.
The use of a private key on the national ID card would have been able to provide all this and more using standard protocols.
At least for identification, there is an expensive effort to re-design something similar in a smartphone-centric way, with less security and not based on standard approaches, called “EUDI wallets”.
For encrypted communication the agreed-on standard seems to be “log in to our portal with HTTPS and use our proprietary interfaces to send and receive messages”...
Why did it die: too expensive (~30€/year for the certificate, >100€ one time for the reader) and too complicated to use. Not enough positive PR. Acceptance at state-provided sites was added too late. In modern times everything must be done with the smartphone, and handling physical cards is considered backwards, so this is probably not going to come back...
Edit: Another similarly advanced technology that also seems to have been replaced by an inferior smartphone substitute: HBCI banking (a standard...) using your actual bank card + reader device to authenticate transactions... replaced by a proprietary app on a proprietary smartphone OS...
If something like that existed today, powered by modern APIs and AI, it could become the ultimate no-code creativity playground.
And the similarly named but completely separate OLE Automation, which let you script programs across process boundaries. This is what let you write in VB(A): Set w = New Word.Application: Set e = New Excel.Application: Set doc = w.Documents.Open("foo.doc"): etc... - this was to Office (mostly) what shell scripting is to Linux, and enabled a lot of ad-hoc business process automation.
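The same cross-process automation is still reachable from Python via pywin32, which speaks OLE Automation under the hood. A sketch, assuming Windows with Word installed; the document path is invented for the example:

```python
import win32com.client

word = win32com.client.Dispatch("Word.Application")  # attach to / launch Word
word.Visible = True
doc = word.Documents.Open(r"C:\reports\foo.doc")     # hypothetical file
# Appending via Content.Text loses formatting; fine for a sketch.
doc.Content.Text = doc.Content.Text + "\nAppended from a script."
doc.Save()
word.Quit()
```

No Word-specific SDK needed: any automation-aware app exposes its object model the same way, which is what made the ad-hoc glue scripting possible.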
Dual screen iPad killer, productivity optimised. IIRC Microsoft OneNote is its only legacy.
Killed because both the Windows team and the Office team thought it was stepping on their toes.
https://www.youtube.com/watch?v=e5wAn-4e5hQ
https://www.youtube.com/watch?v=QWsNFVvblLw
Summary:
>This presentation introduces Via, a virtual file system designed to address the challenges of large game downloads and storage. Unlike cloud gaming, which suffers from poor image quality, input latency, and high hosting costs, Via allows games to run locally while only downloading game data on demand. The setup process is demonstrated with Halo Infinite, showing a simple installation that involves signing into Steam and allocating storage space for Via's cache.
>Via creates a virtual Steam library, presenting all owned games as installed, even though their data is not fully downloaded. When a game is launched, Via's virtual file system intercepts requests and downloads only the necessary game content as it's needed. This on-demand downloading is integrated with the game's existing streaming capabilities, leveraging features like level-of-detail and asset streaming. Performance metrics are displayed, showing download rates, server ping, and disk commit rates, illustrating how Via fetches data in real-time.
>The system prioritizes caching frequently accessed data. After an initial download, subsequent play sessions benefit from the on-disk cache, significantly reducing or eliminating the need for network downloads. This means the actual size of a game becomes less relevant, as only a portion of it needs to be stored locally. While server locations are currently limited, the goal is to establish a global network to ensure low ping. The presentation concludes by highlighting Via's frictionless user experience, aiming for a setup so seamless that users are unaware of its presence. Via is currently in early access and free to use, with hopes of future distribution partnerships.
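The core trick is easy to caricature in a few lines: present every block of the file as available, but fetch blocks over the network only on first read. A toy model only; Via implements this as a real virtual file system driver.

```python
CHUNK = 1 << 20  # 1 MiB blocks

class OnDemandFile:
    """Serve reads from a local cache; fetch missing blocks on demand."""
    def __init__(self, remote_fetch):
        self.fetch = remote_fetch          # callable(offset, size) -> bytes
        self.cache = {}                    # chunk index -> bytes (disk in reality)

    def read(self, offset, size):
        out = b""
        while size > 0:
            idx = offset // CHUNK
            if idx not in self.cache:      # cache miss: download this block
                self.cache[idx] = self.fetch(idx * CHUNK, CHUNK)
            base = offset - idx * CHUNK
            take = min(size, CHUNK - base)
            out += self.cache[idx][base:base + take]
            offset += take
            size -= take
        return out                         # repeat reads never touch the network

demo = OnDemandFile(lambda off, n: b"\0" * n)   # stand-in network fetcher
demo.read(0, 4096); demo.read(1024, 4096)       # second call is mostly cache hits
```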
I'm amazed the video still has under 4,000 views. Sadly, Flaherty got hired by xAI and gave up promoting the project.
https://x.com/rflaherty71/status/1818668595779412141
But I could see the technology behind it working wonders for Steam, Game Pass, etc.
- Multimodality: Text/audio/images input and output. Integrated OCR.
- Connection with an Asterisk server: it could send and receive voice phone calls! I used it to order pizza from a local place via WhatsApp. This was prior to Google's famous demo calling a hairdresser to book a haircut.
- It understood humor and message sentiment, told jokes and sometimes even chimed in with a "haha" if somebody said something funny in a group chat or sent an appropriate gif reaction
- Memory (facts database)
- Useful features such as scheduling, polling, translations, image search, etc.
Regarding the tech, I used external models (Watson was pretty good at the time), plus classical NLP processing and symbolic reasoning that I learned in college.
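For flavor, the symbolic part of such a bot can be surprisingly small: pattern rules plus a facts store cover the "memory" feature in miniature. A toy sketch; the real bot layered external models like Watson on top of this kind of core.

```python
import re

facts = {}  # the "memory (facts database)" feature, reduced to a dict

RULES = [
    (re.compile(r"remember that my (\w+) is (\w+)", re.I),
     lambda m: facts.update({m[1]: m[2]}) or f"OK, your {m[1]} is {m[2]}."),
    (re.compile(r"what is my (\w+)", re.I),
     lambda m: f"Your {m[1]} is {facts[m[1]]}." if m[1] in facts
               else f"You never told me your {m[1]}."),
]

def reply(msg):
    for pattern, action in RULES:
        m = pattern.search(msg)
        if m:
            return action(m)
    return "Sorry, I didn't get that."   # real bots fall back to an NLU model here

print(reply("Remember that my birthday is June"))  # OK, your birthday is June.
print(reply("What is my birthday?"))               # Your birthday is June.
```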
Nobody understood the point of it (where's the GUI? how do I know what to ask it? customers asked) and I didn't make a single dime out of the project. I closed it a couple years later. Sometimes I wonder what could've been of it.
BT had this grand vision for basically providing rich multimedia through the phone line, but in ~1998. Think a mix of on-demand cable and "teleconferencing" with TV-based internet (Ceefax/red button on steroids).
It would have been revolutionary and kick started the UK's jump into online rich media.
However, it wouldn't have got past the regulators, as both Sky and NTL (now Virgin) would have protested loudly.
I think the market was still skeptical about nodejs on the server at the time but other than that I don’t really know why it didn’t take off
https://en.wikipedia.org/wiki/IGoogle
https://en.wikipedia.org/wiki/Google_Desktop
and why? = UI/UX
Died due to legal wranglings about patents, iirc.
* Rethinkdb: I made some small projects with it in the past and it was easy to use
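RethinkDB's headline feature was changefeeds: queries that push changes to you as they happen. A minimal sketch with the official Python driver, assuming a local server on the default port; the table name is invented.

```python
from rethinkdb import RethinkDB

r = RethinkDB()
conn = r.connect("localhost", 28015)
r.db("test").table_create("scores").run(conn)   # run once; errors if it exists

# Any number of clients can sit on this cursor and receive live updates
# as other clients write. Note: this loop blocks forever by design.
for change in r.table("scores").changes().run(conn):
    print(change["old_val"], "->", change["new_val"])
```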
Still, even in the early days they had great black levels and zero motion lag; they'd advertise it as “600 fps”. They seriously improved on power draw and heat, and were definitely superior if you wanted an ideal movie or sports watching experience.
Buuut they were also competing with LED TVs, which could be really REALLY thin (rule of cool) and would just sip power. They died out.
The idea that you could read and write data at RAM speeds was really exciting to me. At work it's very common to see microscope image sets anywhere from 20 to 200 GB and file transfer rates can be a big bottleneck.
Archive capture circa 2023: https://web.archive.org/web/20230329173623/https://ddramdisk...
HN post from 2023: https://news.ycombinator.com/item?id=35195029
At its best, having IM, email, browser, games, keywords, chats, etc. was a beautiful idea IMO. That they were an ISP seemed secondary or even unrelated to the idea. But they chose to charge for access even in the age of broadband, and adopt gym level subscription tactics to boot, and people decided they'd rather not pay it which is to be expected. I often wonder if they'd have survived as a software company otherwise.
They were basically a better thought out Facebook before Facebook, in my opinion.
All of the upside and none of the downside of react
No JSX and no compiler, all native js
The main dev is paid by microsoft to do oss rust nowadays
I use choo for my personal projects and have used it twice professionally
https://github.com/choojs/choo#example
The example is like 25 lines and introduces all the concepts
Less moving parts than svelte
Also this: https://news.ycombinator.com/item?id=6676494
Redmart (Singapore): Best web based online store to this date (obviously personal view). No one even tries now that mobile apps have won.
https://techcrunch.com/2016/11/01/alibaba-lazada-redmart-con...
Nothing ever came close for easily finding conferences to attend, and finding the slides and conversation around them.
I'd love to have an SGI laptop.
Or an SGI cell phone or VR headset.
Javascript/HTML based smartphone / app interface.
People talk so much about how you need to write code that fits well within the rest of the codebase, but what tools do we have to explore codebases and see what is connected to what? Clicking through files feels kind of stupid because if you have to work with changes that involve 40 files, good luck keeping any of that in your working memory. In my experience, the JetBrains dependency graphs also aren't good enough.
Sourcetrail was a code visualization tool that allowed you to visualize those dependencies and click around the codebase that way, see what methods are connected to what and so on, thanks to a lovely UI. I don't think it was enough alone, but I absolutely think we need something like this: https://www.dbvis.com/features/database-management/#explore-... but for your code, especially for codebases with hundreds of thousands or like above a million SLoC.
Example: https://github.com/CoatiSoftware/Sourcetrail/blob/master/doc...
Another example: https://github.com/CoatiSoftware/Sourcetrail/blob/master/doc...
I yearn to some day view entire codebases as graphs with similarly approachable visualization, where all the dependencies are highlighted when I click an element. This could also go so, so much further - you could have a debugger breakpoint set and see the variables at each place, alongside being able to visually see how code is called throughout the codebase, or hell, maybe even visualize every possible route that could be taken.
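A crude taste of what such a tool computes, using nothing but Python's standard-library ast module (Sourcetrail used a real multi-language indexer; "example.py" is whatever file you point it at):

```python
import ast

src = open("example.py").read()     # hypothetical file to map
tree = ast.parse(src)

edges = []
for func in ast.walk(tree):
    if isinstance(func, ast.FunctionDef):
        for node in ast.walk(func):
            # Record direct calls to plain names. Attributes, methods, and
            # cross-file references need a real indexer, which is exactly
            # what Sourcetrail provided.
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                edges.append((func.name, node.func.id))

for caller, callee in sorted(set(edges)):
    print(f"{caller} -> {callee}")  # pipe into graphviz to get the picture
```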
- Gnome2 dropped from Ubuntu in favor of Unity
- Ford Crown Victoria
> replaces visual monitoring with a sonic `ecology' of natural sounds, where each kind of sound represents a specific kind of network event.
https://www.usenix.org/conference/lisa-2000/peep-network-aur...
The concept is that we are wired to notice sounds that are out of the ordinary, but “ordinary” sounds are not distracting.
I had forgotten about this project for a number of years until I read Peter Watts’s Blindsight.
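The idea reduces to a very small program: tail a log, map event kinds to natural sounds, and let anomalies register as a change in the soundscape. A toy sketch; the sound files and keyword mapping are invented, and aplay is the stock ALSA player on Linux.

```python
import subprocess, time

SOUNDS = {
    "sshd":  "birdsong.wav",    # routine logins blend into the background
    "cron":  "crickets.wav",
    "error": "thunder.wav",     # rare sounds stand out precisely because
}                               # the ordinary ones never do

def play(wav):
    subprocess.Popen(["aplay", wav])

with open("/var/log/syslog") as log:
    log.seek(0, 2)              # start at the end, like `tail -f`
    while True:                 # runs until interrupted
        line = log.readline()
        if not line:
            time.sleep(0.5)
            continue
        for key, wav in SOUNDS.items():
            if key in line:
                play(wav)
```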
I’m not arguing the solutions it outlined are good, but I think some more discussion around how we interact with touch screens would be needed. Instead, we are still typing on a layout that was invented for mechanical typewriters - in 2025, on our touch screens.
1. When Windows Vista was being developed, there were plans to replace the file system with a database, allowing users to organize and search for files using database queries. This was known as WinFS (https://en.wikipedia.org/wiki/WinFS). I was looking forward to this in the mid-2000s. Unfortunately Vista was famously delayed, and in an attempt to get Vista released, Microsoft pared back features, one of which was WinFS. Instead of WinFS, we ended up getting improved file search capabilities. It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. (A minimal sketch of the idea follows at the end of this list.)
2. OpenDoc (https://en.wikipedia.org/wiki/OpenDoc) was an Apple technology from the mid-1990s that promoted component-based software. Instead of large, monolithic applications such as Microsoft Excel and Adobe Photoshop, functionality would be offered in the form of components, and users and developers can combine these components to form larger solutions. For example, as an alternative to Adobe Photoshop, there would be a component for the drawing canvas, and there would be separate components for each editing feature. Components can be bought and sold on an open marketplace. It reminds me of Unix pipes, but for GUIs. There's a nice promotional video at https://www.youtube.com/watch?v=oFJdjk2rq4E.
OpenDoc was a radically different paradigm for software development and distribution, and I think this could have been an interesting contender against the dominance that Microsoft and Adobe enjoy in their markets. OpenDoc actually did ship, and there were some products made using OpenDoc, most notably Apple's Cyberdog browser (https://en.wikipedia.org/wiki/Cyberdog).
Unfortunately, Apple was in dire straits in the mid-1990s. Windows 95 was a formidable challenger to Mac OS, and cheaper x86 PCs were viable alternatives to Macintosh hardware. Apple was an acquisition target; IBM and Apple almost merged, and there was also an attempt to merge Apple with Sun. Additionally, the Macintosh platform depended on the availability of software products like Microsoft Office and Adobe Photoshop, the very types of products that OpenDoc directly challenged. When Apple purchased NeXT in December 1996, Steve Jobs returned to Apple, and all work on OpenDoc ended not too long afterward, leading to this now-famous exchange during WWDC 1997 between Steve Jobs and an upset developer (https://www.youtube.com/watch?v=oeqPrUmVz-o).
I don't believe that OpenDoc fits in with Apple's business strategy, even today, and while Microsoft offers component-based technologies that are similar to OpenDoc (OLE, COM, DCOM, ActiveX, .NET), the Windows ecosystem is still dominated by monolithic applications.
I think it would have been cool had the FOSS community pursued component-based software. It would have been really cool to apt-get components from remote repositories and link them together, either using GUI tools, command-line tools, or programmatically to build custom solutions. Instead, we ended up with large, monolithic applications like LibreOffice, Firefox, GIMP, Inkscape, Scribus, etc.
3. I am particularly intrigued by Symbolics Genera (https://en.wikipedia.org/wiki/Genera_(operating_system)), an operating system designed for Symbolics Lisp machines (https://en.wikipedia.org/wiki/Symbolics). In Genera, everything is a Lisp object. The interface is an interesting hybrid of early GUIs and the command line. To me, Genera could have been a very interesting substrate for building component-based software; in fact, it would have been far easier building OpenDoc on top of Common Lisp than on top of C or C++. Sadly, Symbolics' fortunes soured after the AI winter of the late 1980s/early 1990s, and while Genera was ported to other platforms such as the DEC Alpha and later the x86-64 via the creation of a Lisp machine emulator, it's extremely difficult for people to obtain a legal copy, and it was never made open source. The closest things to Genera we have are Xerox Interlisp, a competing operating system that was recently made open source, and open-source descendants of Smalltalk-80: Squeak, Pharo, and Cuis-Smalltalk.
4. Apple's "interregnum" years between 1985 and 1996 were filled with many intriguing projects that were either never commercialized, were cancelled before release, or did not make a splash in the marketplace. One of the most interesting projects during the era was Bauhaus, a Lisp operating system developed for the Newton platform. Mikel Evins, a regular poster here, describes it here (https://mikelevins.github.io/posts/2021-07-12-reimagining-ba...). It would have been really cool to have a mass-market Lisp operating system, especially if it had the same support for ubiquitous dynamic objects like Symbolic Genera.
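As promised under item 1: the WinFS pitch in miniature is "file metadata in a database you can query." Here sqlite3 stands in for WinFS's relational store; the schema and query are invented for illustration.

```python
import os, sqlite3, time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT, ext TEXT, size INT, mtime REAL)")

# Index a directory's metadata into the "file system database".
for root, _, names in os.walk(os.path.expanduser("~/Documents")):
    for name in names:
        p = os.path.join(root, name)
        st = os.stat(p)
        db.execute("INSERT INTO files VALUES (?, ?, ?, ?)",
                   (p, os.path.splitext(name)[1].lower(),
                    st.st_size, st.st_mtime))

# "Organize and search for files using database queries":
# large PDFs touched in the last week, biggest first.
week_ago = time.time() - 7 * 86400
for (path,) in db.execute(
        "SELECT path FROM files WHERE ext = '.pdf' AND size > 1000000 "
        "AND mtime > ? ORDER BY size DESC", (week_ago,)):
    print(path)
```

WinFS would have made this kind of query a first-class citizen of the shell instead of an afterthought bolted onto search indexing.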
I first learned about it when I was working in a university group and had the task of porting a windowing algorithm already working in MATLAB to Python. It felt like a modern linter and LSP with additional support through machine learning. I don't quite know why it got comparatively little recognition, but perhaps enough to remain an avant-garde pioneer of both Python and machine-learning support for further generations and wider applications.
One always must define a one sentence goal or purpose, before teams think about how to build something.
Cell processors, because most coders can't do parallelism well
Altera consumer FPGA, as they chose behavioral rather than declarative best practices... then the Intel merger... metastability in complex systems is hard, and most engineers can't do parallelism well...
World Wide Web, because social-media and Marketers
Dozens of personal projects, because sometimes things stop being fun. =3
Think flowcharts crossed with pseudocode but following Structured Programming principles.
Very useful for mocking up, designing and testing code logic before you write it.
> TUNES started in 1992-95 as an operating system project, but was never clearly defined, and it succumbed to design-by-committee syndrome and gradually failed. Compared to typical OS projects it had very ambitious goals, which you may find interesting.
It had its own cross platform UI and other frameworks too, so you could "write once in Java, and ship on all the things" .. well theoretically.
It got abandoned too soon. But it was quite fun to build apps with it for a while, almost Delphi- like. I always wonder if it went open source, if things would have been different vis a vis Flash, etc.
Hitting "ctrl + spacebar to search for anything, with simple typed parameters for search" was a killer product in 2005, and Microsoft only finally got wise and copied it in 2025.
Why? Obviously close-to-zero market. It was unbelievable how those people thought those projects would even succeed.
Unfortunately, it died because it's very niche and they couldn't keep up with developing drivers for desktops. This is even worse today...
A place where artists and consumers could freely communicate and socialize without hassle.
Died because of: Stupidity, commercialisation and walled-gardening.
People always fail to see something that is an inevitability. Humans lack foresight because they don't like change.
I used this when it was brand new for a bit, and it was so incredibly smooth and worked so well. It solved the problem of controlling systemd units remotely so well. I'm pretty sure the only reason it never actually took off was Kubernetes and CoreOS's acquisition; however, it actively solves the 'other half' of the k8s problem, which is managing the state of the host itself.
ISO/OSI had a session layer, i.e. much of what QUIC does regarding multiple underlying transports.
Speaking of X.509, the s-expression certificate format was more interesting in many ways.
I know a lot of it got folded into PHP, but the best parts of it, like native support for XHP and function whitelists, never got in AFAIK.
(Not the Linux distribution with the same name)
I have used it for years.
A two-pane manager, it makes defining file associations, applications invoked by extension, and shortcut buttons easy and convenient.
Sadly it is abandonware now.
Slowly migrating to Double Commander now...
Looked cool during demos. Got killed when Flash died.
Their execution was of course bad, but I think current LLMs are better and faster, and there are many more OSS models to reduce costs. The hardware looked nice, though, and the pico projector was an interesting concept, even if not the best executed.
make hardware expensive again!
Nope.
Brief (CC0): https://doi.org/10.5281/zenodo.17305774 Curious: would this structure have saved any of the projects mentioned here?
It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I’m not convinced it’s a great idea, but it was a substantial improvement over what is in widespread use today and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn’t care to learn the first thing about it.