Fun fact: the most costly incident I ever "caused" came from fixing a bug in an API to make it match the spec. The affected client escalated and forced me to restore the defect, even though the defect also affects many other clients.
Working in almost any mature space requires you to be extremely conservative in ways that have always proven difficult to specify.
Code isn't assembly; code is what it takes to express an English requirement precisely enough to be unambiguous.
You can't translate plain English into unambiguous code. Period. Not even engineers can. The only way that translation happens is that humans are good enough at communicating to reach a point where code can be produced and iterated on, with enough context that the thing doesn't fall over after a week.
The only thing stopping AI from doing this is the right communication models and enough context.
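A throwaway sketch of that gap (the one-line discount "spec" and both functions below are invented for illustration): even a single plain-English sentence admits more than one faithful reading, and only a round trip with whoever wrote it settles which one was meant.

    # A made-up one-line "spec": "Give a 10% discount on orders over $100."
    # Two people (or two models) can read that sentence and produce different,
    # internally consistent code; both are faithful to the English, they just
    # resolved the ambiguity differently.

    def discount_whole_order(total: float) -> float:
        # Reads it as: discount the entire order total once it passes $100.
        return total * 0.9 if total > 100 else total

    def discount_excess_only(total: float) -> float:
        # Reads it as: discount only the portion of the order above $100.
        if total <= 100:
            return total
        return 100 + (total - 100) * 0.9

    print(discount_whole_order(150))  # 135.0
    print(discount_excess_only(150))  # 145.0

Scale that up to a real requirements doc and you get exactly the iterate-until-it-stops-falling-over loop described above.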
As an engineer by trade, I'm quickly realizing how fucked we are right now.
I give us maybe 2-3 years until the last engineering roles are posted. Maybe longer depending on how many businesses survive the crash while refusing to automate their workflows.
Most of us are going to start transferring skills to more of a prompt programming and coordinator role.
Until those roles go too.
This is a very common myth; there was never a period where everyone programmed in assembly and then high-level languages were introduced.
Pretty much since the first CPUs were released, there were already programming languages for them.
Alan Turing's generation went from hooking up physical wires to writing Autocode for the Ferranti Mark 1 in less than a decade.
And it's not just that the period of assembly programming was brief, spanning almost a decade after WW2: the pioneers were writing in symbolic mathematical notation before they had even set down the physical schematics of their machines.
So no, there wasn't a period where we all programmed in assembly and then discovered programming languages and saw that it was good.