Modern OOP (not the original OOP by Alan Kay) is a human anti-pattern.
It commits the cardinal sin against easy-to-understand systems: it hides state and breaks data lineage.
In other words:
1. You cannot just walk back up the stack to see whether anyone has changed the data you depend on; you also have to follow every parent and sibling branch.
2. And in the case of inheritance, you cannot reason about Child A without understanding Parents 1..N.
As a result, OOP systems quickly hit the limit of context one developer can hold in their head while developing and debugging.
FP, on the other hand, encourages (and in some cases enforces) confining the inputs and outputs of your system to the arguments and return values of a function, making the system easy to reason about at any level.
Powerful composability and more thorough, easier testing are just beautiful by-products (a minimal sketch of the contrast follows below).
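As a rough illustration of that contrast, here is a minimal Java sketch; the `Basket` and `Totals` names are invented for this example, not taken from the thread:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical OOP-style class: mutable state is hidden behind methods, so a
// caller reading total() cannot tell who changed the list, or when.
class Basket {
    private final List<Integer> prices = new ArrayList<>();

    void add(int price) {
        prices.add(price);   // mutation reachable from anywhere that holds the Basket
    }

    int total() {
        return prices.stream().mapToInt(Integer::intValue).sum();
    }
}

// FP-leaning alternative: everything the computation depends on arrives as an
// argument and everything it produces is the return value, so the data lineage
// is visible at every call site.
final class Totals {
    static int total(List<Integer> prices) {
        return prices.stream().mapToInt(Integer::intValue).sum();
    }
}
```

In the second form a test is just `Totals.total(List.of(1, 2, 3))`, with no setup and no hidden collaborators.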
Next on the list of worst things to happen to programming is Python's popularity as a CSC101 language, and its toehold in mathematics with the rise of ML.
> Next on the list of worst things to happen to programming is Python's popularity as a CSC101 language
My school kept track of computer science graduates, and the numbers dropped sharply after it copied MIT's example for its intro course; predictably, the drop showed up four years after the change.
Some might call that "gatekeeping" (though that's a more recent word in the vernacular), but I think it's more that 90% of the jobs were C/C++/Java back then, and a BS degree was meant to get a graduate a job in the real world.
Also, students dropping out of the computer science program wasn't a great look when requesting funds for servers and stuff.
Worst thing that happened to programming, eh? Have you tried running a two-year-old JavaScript/Node project that transpiles and gulps its three billion dependencies into something alien? If it works, that is. Which it won't, because it hasn't been updated for two years.
I don't think this is a great article, but if you hit Google Scholar and look for papers concerning OOP you'll be hard-pressed to find any recent ones. Almost every programming language research paper is about functional programming. Recent practical crypto papers seem to use Go a lot, but that isn't OOP.
OOP was a dead end and academia has moved on, if it was ever interested in the first place. It is strange that industry is 180 degrees out of phase here, even as it stresses the importance of "computer science fundamentals" like data structures and algorithms.
No-one uses that original OOP at all, no-one sane anyway.
The way it's used now is for dependency injection. All your logic is in services that are injectable and unit tested. All your data is in simple immutable DTOs.
All the OOP tricks, classes, instances, interfaces, polymorphism, it's all good for wiring up your logic and replacing bits at runtime. No-one actually models their domain with pure OOP. Urgh, that would be awful.
But also to echo other commenters, this isn't interesting insight...
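As a rough sketch of the "injectable services plus immutable DTOs" shape described above, in Java; all names here (`Order`, `PricingService`, `CheckoutService`) are invented for illustration:

```java
import java.util.List;

// Immutable DTO: a record carries data and nothing else.
record Order(String id, List<String> items) {}

// Logic lives in a service behind an interface, so the implementation
// can be swapped at wiring time or replaced with a fake in unit tests.
interface PricingService {
    int priceOf(Order order);
}

class FlatRatePricing implements PricingService {
    @Override
    public int priceOf(Order order) {
        return order.items().size() * 10;   // trivial placeholder logic
    }
}

// The consumer receives its dependency through the constructor;
// it never constructs its collaborators or mutates shared state itself.
class CheckoutService {
    private final PricingService pricing;

    CheckoutService(PricingService pricing) {
        this.pricing = pricing;
    }

    int checkout(Order order) {
        return pricing.priceOf(order);
    }
}
```

Wiring is then just `new CheckoutService(new FlatRatePricing())`, and a unit test can pass a stub `PricingService` instead.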
Being extremely enthusiastic or extremely angry about OOP is so 1990s. Tell us, is Java the New COBOL? Is Visual C++ COM/OLE inherently bloated Microsoft Bob Windows Longhorn software?
Absolutely is.
The interpretation of Alan Kay's view on OOP is that it's not objects that are important, it's messaging.
https://wiki.c2.com/?AlanKayOnMessaging
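A minimal, hypothetical Java sketch of what "messaging first" can look like: the sender couples only to a message, and the receiver decides late what to do with it (all names invented for illustration):

```java
// A message is just data: a name and a payload.
record Message(String name, Object payload) {}

// The only thing a sender knows about a receiver is that it can receive messages.
interface Receiver {
    void receive(Message message);
}

class ConsoleLog implements Receiver {
    @Override
    public void receive(Message message) {
        if ("log".equals(message.name())) {
            System.out.println(message.payload());
        }
        // Unknown messages are simply ignored; the receiver decides its own behaviour.
    }
}

class Sender {
    void send(Receiver target) {
        // The sender never names the receiver's class; it only sends a message.
        target.receive(new Message("log", "hello"));
    }
}
```

The point is the shape of the coupling: the sender depends on the message, not on the receiver's type.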
This is 10/10 ragebait
https://www.youtube.com/watch?v=wo84LFzx5nI
I thought that this was going to be a discussion about this old HN classic:
https://news.ycombinator.com/item?id=8420060
PS - don’t click the smashcompany link!!! The essay appears to have been replicated here:
https://medium.com/@jacobfriedman/object-oriented-programmin...
Is this a new way for Russia to undermine the West?
> why experienced Java (C#, C++, etc.) programmers can’t really be considered great engineers, and why code in Java cannot be considered good
...how was this written in 2025? This is like mid-2000s edgelord stuff.
Preaching to the choir with me :)
But I would add "so far"; AI and vibe coding could very well overtake OOP in a year or two.