OOP can be just about structuring code, like Java's OOP fundamentalism, where even a function must be a Runnable object (unless that's changed since Oracle took over). If there's anything that is not an object, it's a function!
Some things are not well-suited to OOP, like linear processing of information in a server. I suspect this is where the FP excitement came from. In transforming information and passing it around, no state is needed or wanted, and immutability is helpful. FP in a UI or a game is not so fun (witness all the hooks in React, which become difficult to follow in anything complicated), since both of those require considerable internal state.
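To make the server case concrete, a minimal sketch (TypeScript; the record shapes and names are made up for illustration): each step is a pure function over immutable data, so there is no state to get wrong, and the whole "server" is just composition.

    // Hypothetical record shapes; the point is that every step is a
    // pure function from input to output, with no shared state.
    interface RawEvent { userId: string; payload: string }
    interface ParsedEvent { userId: string; fields: Record<string, string> }

    const parse = (e: RawEvent): ParsedEvent => ({
      userId: e.userId,
      fields: Object.fromEntries(
        e.payload.split("&").map((kv) => kv.split("=") as [string, string])
      ),
    });

    const redact = (e: ParsedEvent): ParsedEvent => ({
      ...e,
      fields: { ...e.fields, password: "[redacted]" },
    });

    // The pipeline is plain function composition.
    const handle = (e: RawEvent) => redact(parse(e));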
Algorithms are a sort of middle ground. Some algorithms require keeping track of a bunch of things, others more or less just transform the inputs. OOP (internal to the algorithm) can make the former much clearer, while it is unhelpful for the latter.
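As a sketch of the contrast (my example, not the commenter's): union-find carries a lot of bookkeeping, which a small class internal to the algorithm keeps tidy, while a plain transform gains nothing from the same treatment.

    // Bookkeeping-heavy: a tiny class keeps the mutable state
    // (parent pointers) private to the algorithm.
    class UnionFind {
      private parent: number[];
      constructor(n: number) {
        this.parent = Array.from({ length: n }, (_, i) => i);
      }
      find(x: number): number {
        if (this.parent[x] !== x) this.parent[x] = this.find(this.parent[x]);
        return this.parent[x];
      }
      union(a: number, b: number): void {
        this.parent[this.find(a)] = this.find(b);
      }
    }

    // Transform-the-inputs: a class would add nothing here.
    const normalize = (xs: number[]) => {
      const max = Math.max(...xs);
      return xs.map((x) => x / max);
    };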
Are we talking about using classes at all? Are we arguing about Monoliths vs [Micro]services?
I don't really think about "OOP" very often. I also don't think about microservices. What some people seem to be talking about when they say they use "OOP" seems strange and foreign to me, and I agree we shouldn't do it like that. But what _other_ people mean by "OOP" when they say they don't use it seems entirely reasonable and sane to me.
Anecdotally, I've replaced OOP with plain data structures and functions.
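For anyone unsure what that looks like, a minimal sketch (TypeScript, hypothetical names): the data is a plain shape with no methods, and the behavior lives in free functions over it.

    // Plain data: no methods, no hidden state.
    interface Account { id: string; balance: number }

    // Behavior: plain functions that take and return data.
    const deposit = (a: Account, amount: number): Account =>
      ({ ...a, balance: a.balance + amount });

    const canWithdraw = (a: Account, amount: number): boolean =>
      a.balance >= amount;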
This has nothing to do with OOP, and can be made out of structured-programming components, or even purely functional components. In fact, stateless services are a staple of horizontal scaling, and could be a poster child of FP taking over the real world (along with React).
What made OOP problematic was mostly shared and concealed mutable state, and the ill-conceived idea of inheritance. Both of these traits are being actively eschewed in most modern software: mutable state is largely separated into databases, inheritance is often rejected in favor of composition. These are all practical, non-ideological choices, ways to relieve well-known pains. In this regard, OOP is on its way out, even in strongholds where it's ingrained into the very fabric, like JVM languages.
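As a sketch of the composition-over-inheritance point (made-up names, not anyone's production code): rather than a service inheriting retry behavior from a base HTTP client class, the retrying behavior is a separate component that wraps any client.

    // Instead of `class Service extends RetryingHttpClient`,
    // retry is a piece the caller composes in.
    interface HttpClient { get(url: string): Promise<string> }

    class RetryingClient implements HttpClient {
      constructor(private inner: HttpClient, private retries: number) {}
      async get(url: string): Promise<string> {
        for (let i = 0; ; i++) {
          try { return await this.inner.get(url); }
          catch (e) { if (i >= this.retries) throw e; }
        }
      }
    }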
The complaints of the author are mostly about component-based architecture, with limited compile-time trust (know thy vendor) and large run-time distrust (every network request should be seen as a potentially malicious request). This is the price that we pay for having so many ready-made building blocks to choose from, and for the ability to make our services available (or access someone else's services) anywhere on the planet, 24/7.
Highly scalable architectures with their complexity were invented by companies who needed them, like, well, Google or Amazon. There is no simple way to serve billions of requests daily. If you serve mere millions, you may not even need all that, and can make do with a few beefier boxes. It has nothing to do with OOP, again.
Also, even those recent languages that boast about not being OOP have type systems that, from the point of view of OOP type theory in computer science, support OOP concepts (OOP is not class-based inheritance and nothing else).
Naturally, since not everyone studies type theory in a CS degree, or attends a CS degree to start with, we get all these discussions about what is or isn't OOP.
As a total beginner to the functional programming world, something I've never seen mentioned at length is that OOP actually makes a ton of sense for CRUD and database operations.
I get not wanting crazy multi-tier class inheritance; that seems like a disaster.
In my case, I wanted to do CRUD endpoints which were programmatically generated based on database schema. Turns out - it's super hard without an ORM or at least some kind of object layer. I got halfway through it before I realized what I was making was actually an ORM.
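For what it's worth, here is roughly where that road leads, as a deliberately tiny sketch (TypeScript; `TableSchema` and `crudRoutesFor` are made-up names): the moment you map schemas to objects and generate CRUD from them, you have the seed of an ORM.

    // A hypothetical schema-to-CRUD mapping, for illustration only.
    interface TableSchema {
      name: string;
      columns: Record<string, "text" | "integer">;
    }

    function crudRoutesFor(table: TableSchema) {
      const cols = Object.keys(table.columns).join(", ");
      return {
        list:   `SELECT ${cols} FROM ${table.name}`,
        create: `INSERT INTO ${table.name} (${cols}) VALUES (?)`,
        remove: `DELETE FROM ${table.name} WHERE id = ?`,
      };
    }

    // crudRoutesFor({ name: "users", columns: { id: "integer", email: "text" } })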
Please feel free to let me know why this is all an awful idea, or why I'm doing it wrong, I genuinely am just winging it.
OK, I'm out.
I have been programming since 1967. Early in my college days, when I was programming in FORTRAN and ALGOL-W, I came across structured programming. The core idea was that a language should provide direct support for frequently used patterns. Implementing what we now call while loops using IFs and GOTOs? How about adding a while loop to the language itself? And while we're at it, GOTO is never a good idea, don't use it even if your language provides it.
Then there were Abstract Datatypes, which provided my first encounter with the idea that the interface to an ADT was what you should program with, and that the implementation behind that interface was a separate (and maybe even inaccessible) thing. The canonical example of the day was a stack. You have PUSH and POP at the interface, and the implementation could be a linked list, or an array, or a circular array, or something else.
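In today's terms the canonical example might be rendered like this (a TypeScript sketch of the same idea): the interface is all the caller ever sees, and the array behind it could be swapped for a linked list without touching a single caller.

    // The interface is the ADT; callers program against this alone.
    interface Stack<T> {
      push(x: T): void;
      pop(): T | undefined;
    }

    // One possible implementation, hidden behind the interface.
    // It could just as well be a linked list or a circular array.
    function makeStack<T>(): Stack<T> {
      const items: T[] = [];
      return {
        push: (x) => { items.push(x); },
        pop: () => items.pop(),
      };
    }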
And then the next step in that evolution, a few years later, was OOP. The idea was not that big a step from ADTs and structured programming. Here are some common patterns (modularization, encapsulation, inheritance), and some programming language ideas to provide them directly. (As originally conceived, OOP also had a way of objects interacting, through messages. That is certainly not present in all OO languages.)
And that's all folks.
All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.
The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.
These pro- and anti-OOP discussions, which can get pretty voluminous and heated, seem a lot like religious wars. Look, we can all agree that the Golden Rule is a pretty good idea, regardless of the layers of terrible ideas that get piled onto different religions incorporating that rule.
Fake history; the term "software crisis" was coined in 1968:
https://en.wikipedia.org/wiki/Software_crisis
I get that the writing is tongue-in-cheek, but telling just-so stories about how things were so much better in the old days doesn't help anyone.
I also resent our modern problems, but I don't kid myself that I'd enjoy vintage problems any better.
Sounds like C.
When was the last time you did OO against a .h file without even needing access to the .c file?
> And so, the process/network boundary naturally became that highest and thickest wall
I've had it both ways. Probably everyone here has. It's difficult to make changes with microservices. You gotta open new routes, and wait for people to start using those routes before you close the old ones. But it's impossible to make changes to a monolith: other teams aren't using your routes, they're using your services and database tables.
Cloud: Separating resources from what gets deployed is a classic separation of concerns.
I don’t miss the days where I had to negotiate with the IT team on hardware, what gets run, and so on.
Personally, I believe the next evolution is a rebalkanization into private clouds. Mid-to-large companies have zero reason to tie their entire computing to third-party hosts and expose their information to them.
OpenAPI: The industry went through a number of false starts on formal remoting calls (CORBA, DCOM, SOAP). Those days sucked.
RESTful APIs caught on, and of course at some point the need for a formal contract was recognized.
But note how decoupled it is from the underlying stack: It forces the engineers to think about the contract as a separate concern.
The problem here is how fragile web protocols and their security actually are, but the past alternatives offered no solution here either.
Dependency injection has to be the most successful one, but there are at least another dozen good ideas that came from the OO world and have been found to be solid.
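The core of DI is small enough to show without any framework (a sketch with made-up names): the dependency arrives through the constructor instead of being constructed internally, so production and tests differ only in the wiring.

    interface Mailer { send(to: string, body: string): void }

    class SignupService {
      // The dependency is injected, not constructed internally,
      // so a test can hand in a fake Mailer.
      constructor(private mailer: Mailer) {}
      signUp(email: string) {
        this.mailer.send(email, "Welcome!");
      }
    }

    // Production vs. test wiring differ only at construction:
    // new SignupService(realSmtpMailer) vs. new SignupService(fakeMailer)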
What has rarely proven to be a good idea instead is inheritance at behavior level. It's fine for interfaces, but that's it. Same for stateful classes, beyond simple data containers like refs.
You can even have classes in the functional programming world; it's irrelevant, an implementation detail. What matters is that your computations are pure, and side effects are expressed in an encoded form that can be combined in a pure way (an IO or Effect data type works, but so does a simple lazy function encoding).
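The lazy-function encoding really can be that small; a sketch (TypeScript, illustrative names): an effect is a thunk, combining effects just builds a bigger thunk, and nothing runs until the edge of the program.

    // An effect is just a deferred computation.
    type IO<A> = () => A;

    const log = (msg: string): IO<void> => () => console.log(msg);

    // Combining two effects yields another effect; still pure so far.
    const andThen = <A, B>(fa: IO<A>, f: (a: A) => IO<B>): IO<B> =>
      () => f(fa())();

    const program = andThen(log("hello"), () => log("world"));
    program(); // only here do the side effects actually happen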
First, OpenAPI schemas aren't a replacement for type checkers; they allow tooling to be built around the schema, and even client code to be generated instead of bespoke implementations for every language. You still should do input (and even response) validation.
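A sketch of why both halves matter (TypeScript; `User`, `isUser`, and `fetchUser` are hypothetical): the generated type is erased at compile time and proves nothing about what the network actually sends, so the runtime check still has to happen.

    // Hypothetical: a type generated from an OpenAPI schema. It is
    // erased at compile time, hence the runtime check below.
    interface User { id: number; email: string }

    function isUser(x: unknown): x is User {
      const u = x as Record<string, unknown>;
      return typeof u === "object" && u !== null &&
        typeof u.id === "number" && typeof u.email === "string";
    }

    // Assumes a runtime with global fetch (browser or recent Node).
    async function fetchUser(url: string): Promise<User> {
      const body: unknown = await (await fetch(url)).json();
      if (!isUser(body)) throw new Error("response failed validation");
      return body;
    }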
> docker compose replaces service factories,
Compose is a self-documenting set of services to run together in predictable ways in containerized environments.
> Kubernetes replaces the event loop.
K8s expands this to more complex orchestration across multiple systems.
> Every call across components acrues failure modes,
You can containerize fairly monolithic code pretty easily... especially with related services that run as a set... nothing requires microservices or complex orchestration in practice. I think of it as a set of predictable extensions to 12-factor apps.
> requires a slow march through (de)serialisation libraries, a long trek through the kernel’s scheduler. A TLB cache invalidation here, a socket poll there. Perhaps a sneaky HTTP request to localhost for desert.
This is how all client-server systems work... there's some level of (de)serialization involved and a system of coordination between clients and servers... nothing special here. HTTP just has the benefit of being human-readable and standardized, with tooling for security, authentication, proxies, caching, etc. as wrappers and add-ons.
These have nothing to do with OOP one way or the other. For that matter, OOP itself isn't a panacea... I've always favored utility/worker processes (even in a class-based language) separated from state or storage objects that don't internalize other objects... with exceptions maybe for attribute markup for type validation, if the language supports it.
The rant as a whole doesn't make a lot of sense... if you want to rail against the complexities of micro-services and orchestration, which trades in-application complexity for deployment/orchestration complexity, that's a fair argument to make... especially when the same people are responsible for both. But that isn't an OOP thing at all.
I think the author is correctly picking up on how messy changes in best common practice can be. Also, different communities / verticals convert to the true religion on different schedules. The custom enterprise app guys are WAAAAY different than games programmers. I'm not sure you'll ever get those communities to speak the same language.
OOP is dead. Long live OOP.
Protocols have their issues, though[0]. Not exactly the same results.
[0] https://littlegreenviper.com/the-curious-case-of-the-protoco...
But they come at great cost. If you don't actually HAVE the problems they solve, do everything in your power to avoid them. If you can just throw money at larger servers, you should not use microservices.
And this has tangible costs, too. I saved more than $10k a month in hosting costs for a small startup by combining a few microservices (hosted on separate VMs) into a single service. The savings in development time from eliminating all of the serialization layers are also appreciable.
I'd say that right now the edge is with non-OOP.
But dare I say too, LLMs will make this battle mean less than it did 5 years ago. Look, I'm not looking to battle LLMs vs. not, but the bottom line is that nearly all code in 10 years' time will be written and/or managed 99% by LLMs.
Silly link, though. I highly suggest going back and clicking on the link.
> To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem.
I read it twice; twice I got nothing.
It is incoherent. Vaguely attempting to take a swing at ... modularity???
Typo: dessert
When he starts talking about the “hygiene of the programmer”, he is referring to the concept of a “code smell” rather than making literal statements about the literal cleanliness of programmers.
From there, he is saying that the industry has distanced itself from object-oriented programming because it often causes problems and added “smells” to the architecture of code bases. This is regardless of what your specific definition of OO is.
Finally, he ends by pointing out that even if people claim not to use much OO in their codebases, when you look at the total architected solution, the various services like Docker and so on are themselves instances of Gang of Four style OO patterns. Because we talk about OO in code, we are not watching the OO that happens around the code.
I will say that service-oriented architecture does have some advantages, and thus sometimes it's the right choice. Parallelism is pretty free and natural, you can run services on different machines, and that can also give you scalability if you need it. However, in my experience that architecture tends to be used in myriad situations where it clearly isn't needed and is a net negative. I have seen it happen.