One thing I lament is the decline of long-term, unfettered research across the industry. I’ve witnessed more companies switching to research management models where management exerts more control over the research directions of their employees, where research directions can abruptly change due to management decisions, and where there is an increased focus on profitability. I feel this short-term approach will cost society in the long term, since current funding models promote evolutionary work rather than riskier, potentially revolutionary work.
As someone who wanted to become a researcher out of curiosity and exploration, I feel alienated in this world where industry researchers are harangued about “delivering value” and academic researchers are pressured to raise grant money and publish. I quit and switched to a full-time teaching career at a community college. I enjoy teaching, and while I miss the day-to-day lifestyle of research, I still plan to do research during my summer and winter breaks, out of curiosity rather than for career advancement.
It would be great if there were more opportunities for researchers to pursue their interests. Sadly, though, barring a cultural change, the only avenues I see for curiosity-driven researchers are becoming independently wealthy, living like a monk, or finding a job with ample free time. I'm fortunate to be in the last situation, with 16 weeks per year that I can devote to research outside my job.
We need this. Like, really, we need someone to have created the Xerox PARC of the 21st century, somewhere about 20 years ago.
I honestly thought Google would be that - but apparently it's easier to fund R&D by "selling copying machines" than by "selling ads". Maybe "selling ads" earns _too much_ money? I don't know.
I know, I know, DeepMind and OpenAI and xAI are supposed to fix climate change any day now, and cure cancer while they invent cold fusion, etc., etc... and it's only because I'm a myopic pessimist that I can only see them writing fake essays and generating spam. Bad me.
Still. Assuming I'm really grumpy and want to talk about people doing research that affects the physical world in a positive way - who's doing that on the scale of PARC or Bell Labs?
And it was all done, apparently, at least in the beginning, because they hired smart people and they let them do what they wanted.
> RCA Laboratories/the Sarnoff Research Center is surely one of the most important of the American corporate labs with similarities to Bell Labs. (It features prominently in Bob Johnstone's We Were Burning https://www.hachettebookgroup.com/titles/bob-johnstone/we-we... : it has a big role in the history of the Japanese semiconductor industry, in large part because of its roles in the development of the transistor and the LCD and its thirst for patent-licensing money.)
>> In Dealers of Lightning, Michael Hiltzik argues that by the 1990s PARC was no longer engaged in such unrestricted research decoupled from product development.
> According to Hiltzik and most other sources, the PARC Computer Science Lab's salad days were over as early as 1983, when Bob Taylor was forced to leave, while the work of the other PARC labs, focused on physics and materials science, wasn't as notable in the period up to then.
Seriously: if this kind of thing interests you at all, go and read We Were Burning.
However, it should be seen as a starting point! Alternative hypothetical pasts and futures abound. One issue is that the stuff from the past always looks more legendary seen through the lens of nostalgia; it's much harder to look at the stuff around you and to go through the effort of really imagining the thing existing.
So that's my hypothesis - there isn't a smaller volume of interesting stuff going on, but viewing it with hope and curiosity might be a tad harder now, when everyone is so "worldly" (i.e., jaded and pessimistic).
Proof:
https://worrydream.com/ (Bret Victor)
and the other people doing Dynamicland and Realtalk, both discussed straightforwardly here:
https://dynamicland.org/2024/FAQ/
https://solidproject.org/about -- Solid, from Tim Berners-Lee and co., also.
https://malleable.systems/catalog/ -- a great many of the projects here are in the same spirit, to me, as well!
https://spritely.institute/ -- Spritely, too
https://duskos.org/ -- Dusk OS, from Virgil Dupras
https://100r.co/site/uxn.html -- Hundred Rabbits, uxn, vibrating with new ideas and aesthetics
https://qutech.nl/ -- quantum research institute in the Netherlands; they recently established a quantum network link for the first time, I believe
etc etc. These are off the top of my head, and I'm fairly new to the whole space!
Kind of a strange statement. Fairchild took the "traitorous eight" from Shockley Semiconductor, which was founded by William Shockley, who famously co-invented the transistor at Bell Labs (and who named the "traitorous eight" as such).
So while Fairchild "didn’t operate anything like a basic research lab", its co-invention of the IC was not unrelated to having a large amount of DNA from Bell Labs.
For most of its history, AT&T provided the best-quality telephone service in the world, at a price comparable to anyone else's, anywhere.
There were structural issues with the AT&T monopoly, however - cross-subsidization, for example: the true cost of services was often hidden because optional services (like toll calling) were used to subsidize basic access, and business lines cross-subsidized residential service.
The ferocity with which AT&T fought foreign attachments (i.e., bring your own phone) probably hastened its demise. In the end, the very technologies AT&T introduced would turn long distance from a high-margin business into a low-margin one. The brass at AT&T had to know that, yet they still pinned the future of their manufacturing business on it - a manufacturing business that had never had to operate in a competitive environment but was now expected to. Because of this and other factors, divestiture was doomed to failure.
I'm a believer in utilities being natural monopolies, but AT&T was an example of effective regulatory capture. It did not, and does not, have to be this way; however, it was.
I can think of: AT&T, DuPont, Kodak, Xerox PARC, Westinghouse, IBM, GE, the original Edison labs (best as I can tell acquired by Western Union), Microsoft, Rockefeller University, Google Research.
Of notable industries and sectors, there's little I can think of in automobiles, shipping, aircraft and aviation (though much is conducted through NASA and the military), railroads, steel (or other metals/mining), petroleum, textiles, or finance. There's also the Manhattan Project and the energy labs (which conduct both general energy research and, of course, much weapons development).
(I've asked similar questions before, see e.g., <https://news.ycombinator.com/item?id=41004023>.)
I'd like to poke at this question in a number of areas: what developments did occur, what limitations existed, where private-sector versus public / government / academic research was more successful, and what conditions led to both the rise and fall of such institutions.
1947 was a magical year. The announcement of the transistor had profound implications: Bell Labs had effectively invented something that would replace the huge existing base of vacuum-tube components with miniaturized transistors. This miniaturization significantly influenced von Neumann's recommendation for the ballistic missile program. Many of the discrete-component systems manufactured during this period remained in service into the 1980s.
This is a photo of a D-17B guidance computer, deployed on the Minuteman missile in 1962, 15 years after the invention of the transistor; it was typical of military printed circuitry of the era for general-purpose computers, disk/drum storage drives, and printers.
https://upload.wikimedia.org/wikipedia/commons/3/38/Autoneti...
"The D-17B weighed approximately 62 pounds (28 kg), contained 1,521 transistors, 6,282 diodes, 1,116 capacitors, and 5094 resistors. These components were mounted on double copper-clad, engraved, gold-plated, glass fiber laminate circuit boards. There were 75 of these circuit boards and each one was coated with a flexible polyurethane compound for moisture and vibration protection."
Rather than p(r)aying for the smartest people who have ever been born, design a corporation where the average high school dropout can work in R&D, and you will print money, innovation, and goodwill [1].
[1] The Art of Doing Science and Engineering:
https://press.stripe.com/the-art-of-doing-science-and-engine...
One thing to consider is that Bell Labs didn't innovate for altruistic reasons like furthering the human race or scientific understanding. It innovated to further AT&T's monopoly and to increase shareholder value. This doesn't seem that different from what Meta, Google, NVIDIA, etc. are doing. Maybe in 10, 20, or 30 years we will view the research that modern tech companies are doing through the same lens.
That said, I do admit the freedom with which these scientists and engineers were able to conduct research was something special. Maybe that's the real difference here.
These organizations employed too many people of relatively mediocre ability relative to output, leading to waste and eventual disbandment. Today's private-sector FAMNG+ companies are making bigger breakthroughs in AI, apps, self-driving cars, etc. with fewer people relative to population and more profits. This is due to more selective hiring and performance metrics. Yeah, those people from the '60s were smart, but today's STEM whiz kids are probably lapping them.