> He warned against mistaking command of words for possession of the solid things those words are meant to disclose. He joined language to substance, sequence to maturation, and study to direct contact with reality — principles that four centuries have not made less urgent.
There are maps that accurately represent a territory, and purely fictitious maps with no relation to any territory whatsoever. This is the spectrum of representation, and LLMs are pushing us towards creating maps that overwhelmingly occupy the latter extremity.
> More writing done in class. More oral defense of arguments. More seminars organized around live questions rather than passive downloads of information.
It's one thing to memorize arguments in favour of a position. It's another to actively defend your positions against those aggressively invested in proving you wrong. John Stuart Mill argued that only the latter activity produces the real understanding that allows an argument, or a tradition, to be renewed and kept alive across generations against constant attempts at refutation. If you are regurgitating a stance instead of actively fighting to defend one, do you really believe in what you are saying?
I think the belief that words accurately represent reality is going to become increasingly important in the years to come. There are now many pantheons to worship at in the 2026 ecosystem of ~digital gods~ AI models, and the question becomes whose version of reasoning you choose to accept as authoritative. Unfortunately, no single model can itself answer this question for you, for obvious reasons.
He's talking about scholasticism[1], but that has issues of its own[2].
Over the last few years I’ve transitioned from coding database backends to physical labor. Part of this has to do with an addiction problem involving Adderall and other uppers and my choice to live clean, live in the world, and live in community with other people. But it also just feels right. I like to think that I can work anywhere, because I know how to pave a driveway. I know how to lay a foundation. I know how to frame a house. I’m learning how to build septic systems. One day I’d like to build a house as a gift to my family. Instead of removing my physical self from my job so I can do it anywhere, I’ve taught myself skills that will be useful to my neighbors wherever I go.
My partner has chosen to work a very important but very “deep” job in the local government bureaucracy. The only way his job works at all is that so many people know his face. He’s been a pillar of his community for 10 years and has proven over and over again to be trustworthy and likable around town. In pretty much every way he espouses the exact opposite philosophy of the digital nomad. His roots are so deep that if we moved it might kill him entirely (hyperbole).
I don’t especially know where I’m going with this, other than to say that there are ways forward that are not total alienation. There are ways to live where you are not competing with the machine. There is still a physical meatspace world full of people with hopes and dreams that cannot be captured digitally and cannot be replaced robotically. A world built on trust and care and mutual respect for one another. If you have a job in which you feel you are just “producing text”, I feel for you deeply. They’re coming for us all eventually, and they started with the writers/programmers. What a strange time to be alive.
There's a meme, "Why develop one's own expertise? It's a poor investment. When you need it, you can hire it." Does AI make us all trust-fund kids?
An intro-physics educator at a first-tier university observes that their entering students, having attended such well-funded schools with such highly skilled teachers presenting material so clearly, widely lack both the skills and the inclination to wrestle with a body of knowledge to extract their own understanding, to the detriment of their early-college education. The sci-ed snark version: raising all students to this level would be both an unimaginably immense triumph and an ongoing profound failure to teach well. Will AI give everyone material presented so clearly?
More rough-quantitative reasoning? Fermi questions. Especially if done by collaborative iterative bounding "Who can suggest another soft/hard upper/lower bound? ... What do you think of that argument?"
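The collaborative bounding loop described above can be sketched as a tiny script. Everything here is illustrative: the helper name, the piano-tuner framing, and the specific numbers are my assumptions, not from the thread. The idea is just that the group intersects proposed soft bounds and takes a geometric mean as the point estimate.

```python
import math

def fermi_estimate(bounds):
    """Intersect proposed (lower, upper) bounds, then take the
    geometric mean of the surviving interval as a point estimate."""
    lo = max(b[0] for b in bounds)   # tightest lower bound wins
    hi = min(b[1] for b in bounds)   # tightest upper bound wins
    if lo > hi:
        # Inconsistent bounds: someone's argument needs challenging.
        raise ValueError("bounds are inconsistent; argue about them")
    return lo, hi, math.sqrt(lo * hi)

# Classic Fermi question: piano tuners in a big city.
# Each tuple is one participant's proposed (soft lower, soft upper) bound.
bounds = [
    (1, 10_000),   # "more than one, fewer than ten thousand"
    (10, 5_000),   # "at least ten; it's a big city"
    (20, 1_000),   # "demand argument: pianos per household, tunings per year"
]
lo, hi, estimate = fermi_estimate(bounds)
```

Each new bound either tightens the interval or triggers a disagreement worth discussing, which is exactly the "What do you think of that argument?" step.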
In contrast to the plug-and-chug theme, illustrated by an ideal gas law problem in a popular textbook which, despite years of use and QC passes for multiple editions, has numbers for solid argon. Reality checking, a feel for reasonable values, an "Is this approximation plausible here?", are pervasively "not on the exam".
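The kind of reality check missing from that textbook problem can be made mechanical: before plugging into PV = nRT, confirm the substance is actually a gas at the stated temperature. The helper below is a hypothetical sketch; the phase numbers for argon (melts near 83.8 K, boils near 87.3 K at 1 atm) are standard reference values.

```python
R = 8.314  # gas constant, J / (mol K)

def ideal_gas_pressure(n_mol, T_kelvin, V_m3, boiling_point_K):
    """Return pressure in Pa, but refuse to answer if T is at or below
    the boiling point: PV = nRT does not describe a liquid or a solid."""
    if T_kelvin <= boiling_point_K:
        raise ValueError(
            f"T = {T_kelvin} K is at or below the boiling point "
            f"({boiling_point_K} K): not a gas, ideal gas law does not apply")
    return n_mol * R * T_kelvin / V_m3

# One mole of argon at room temperature in ~24.8 L: roughly 1 atm, plausible.
p = ideal_gas_pressure(n_mol=1.0, T_kelvin=300.0, V_m3=0.0248,
                       boiling_point_K=87.3)

# A problem that quietly used T = 50 K for argon (solid below ~83.8 K)
# would fail this check instead of producing a confident wrong answer.
```

The point isn't the helper itself but the habit it encodes: a feel for reasonable values, applied before the arithmetic rather than never.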
Better to tie education in words and numbers to their use. What happened to shop class?
For the last century, a lot of jobs have shifted from making stuff (food, goods, etc.) to providing services. So education has shifted to that, and soft skills are now important. You can use a calculator if you need some numbers. It's fine if you don't do that in your head. You study something comparatively niche and useless and then you become a manager, consultant, marketing expert, or whatever that has very little connection to what you studied (history, anthropology, whatever). The important skills that were taught are critical thinking, communicating, etc. Ironically, a lot of people with backgrounds like that are reverting to doing things with their hands in the end. Our cities are full of coffee shops, bakeries, jewellery makers, restaurants, etc. run by people with college degrees.
Modern AI driven technology is undoing the industrial revolution and creating a new one. The industrial revolution was all about uniformity and centralization to drive economies of scale. That meant people had to have the same baseline of skills so they could do the simple jobs that they were assigned to do. The smarter ones got promoted up. And you could build a lot with many people doing simple things like that. The bigger the company, the more money it made.
With modern technology, you can 3D print whatever you need, generate software, and run advanced manufacturing all in a small workshop just by yourself. You don't need a big company around you. That actually slows you down. The old services industry ran on soft skills. This new way of manufacturing runs on hard skills. And because it's AI-assisted, you can do more at a small scale, provided you understand what needs doing. Companies can be small, hyper-specialized, and derive value from that. Their customers are other companies. Together they resemble what a pre-industrial-revolution town would look like: lots of specialist trades and shops all working together to produce wealth for the town. Instead of doing everything inside one big company, you now have complex clusters of companies, individuals, contractors, etc. working together.
Education has to focus on teaching people how to function in a world like that. It has to teach them not just one skill or trade but how to be able to adapt and combine different skills.
There are lots of HN commenters who dislike formal education. Many of them seem to admire people who became wealthy without earning a degree, like Zuckerberg, Gates, and so on.
Greg Brockman of OpenAI is another college dropout
Also, Pangram says 100% AI generated (some sections with high confidence): https://www.pangram.com/history/af8d47c1-dcbd-48ed-83a8-eda6...