It's possible that mainframe Oracle DBAs working on glass-house HP 9000s looked at PC databases as toys, but within the PC industry these tools were considered powerful. Any growth of PC usage was a rising tide that lifted all the boats in the PC industry. If you had some experience programming in C, it was pretty easy to get a job. There weren't hundreds (or thousands) of job applicants for each position. In a niche segment, you could even know all the other candidates for an opening, as in "No Other Choice". (I was in a niche like that, developing architectural and facilities management PC CAD software, then later cartography, for MicroStation.)
I think the production shift from agriculture to manufacturing, or manufacturing to services, is probably a better way to understand what's happening now.
A friend of mine who graduated in 1983 with degrees in computer science and accounting interviewed for an accounting job. Accounting was his actual passion, but he also enjoyed programming. He was told at one interview:
'Your computer science degree is useless. With this spreadsheet program I can do everything. There is no longer a need for programmers'
I think he was wrong, just like Thomas J. Watson, IBM's CEO, who in 1943 supposedly said: "I think there is a world market for maybe five computers."
So when I hear AI will replace ... I think it will enhance and redefine.
To be fair, it's very rare for an accountant today to know a proper programming language. I wish that were part of their training, since from my perspective accountants could really benefit from git, SQL, and a little Python.
I agree, and would push it further: almost all professionals need a basic understanding of logic and programming. At least enough to write a sort routine without copying one.
I believe schools should require a Logic (not just philosophy) class.
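For what it's worth, the kind of sort routine I mean is something like this (a plain insertion sort in Python, purely as an illustration of the baseline skill):

```python
def insertion_sort(items):
    """Sort a list in place using insertion sort.

    Simple enough to write from memory; O(n^2), which is fine
    for the small lists a non-programmer would ever sort by hand.
    """
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift larger elements one slot to the right
        # until current's position opens up.
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items
```

Nothing clever, but being able to produce it unaided shows you understand loops, comparisons, and invariants, which is the point.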
They addressed completely different use cases. Most software development was either huge custom business systems or packaged desktop software complex enough to warrant fifty-dollar-or-more price tags, at a time when $50 was worth a lot more than it is today.
Programming was in some ways far more and in other ways far less accessible to get into, and there weren’t software developers around every corner. Someone who could _use_ a computer with a modicum of fluency was a rarity, let alone someone who could program one.
It would be like asking whether accountants should fear for their jobs after the invention of the calculator.
Even MS Access (introduced something like a decade later) or the roughly equivalent FileMaker Pro were ways to build custom software for companies that would never have been able to afford the SWE team to build something from scratch. (And Access in particular tended to be wielded by people unfamiliar with relational database normalization in ways that made it a bit of a time-delayed footgun.)
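The normalization footgun is easy to demonstrate. Here's a toy sketch using Python's built-in sqlite3 (table and column names are made up for illustration): in the flat, Access-style layout, a customer's phone number is copied onto every order, so fixing it means hunting down every copy; in the normalized layout, one UPDATE fixes it everywhere.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized, Access-style: contact info repeated on every order row.
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, customer TEXT, phone TEXT, item TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?, ?)", [
    (1, "Acme", "555-0100", "widget"),
    (2, "Acme", "555-0100", "gadget"),
])

# Normalized: the phone number lives in exactly one row.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, phone TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER REFERENCES customers(id), item TEXT)")
cur.execute("INSERT INTO customers VALUES (1, 'Acme', '555-0100')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", [(1, 1, "widget"), (2, 1, "gadget")])

# One UPDATE, and every order sees the new number.
cur.execute("UPDATE customers SET phone = '555-0199' WHERE id = 1")
rows = cur.execute(
    "SELECT o.order_id, c.phone FROM orders o "
    "JOIN customers c ON o.customer_id = c.id ORDER BY o.order_id"
).fetchall()
print(rows)  # both orders now report the updated phone number
```

In the flat table the same fix requires updating every duplicated row, and the copies drift out of sync the moment someone misses one, which is exactly the delayed footgun.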
Access is probably the best analogy for vibe coding in that sense — really good for building something small and custom for someone who couldn’t code it from scratch themselves. Yet not really suited for building large, complex production-grade (let alone “enterprise-grade”) systems — you could try, but you would come to regret it soon enough.
The main difference being that somehow a lot of software companies have managed to fall for the idea that it doesn’t have such limitations.
Spreadsheets became popular in the 80s, before PCs were ubiquitous and cheap. Computers cost on the order of $5,000 to $10,000, and there were lots of incompatible systems. Spreadsheets were a killer app that drove adoption and grew the market, so I'd say on balance they were positive for programmers and software generally.
AI comes at quite a different time, when tech adoption was already pretty mature / saturated to start with.
Pure-software people didn’t exist; it was always someone who was a statistician, a geologist, a chemist, who was also ‘good at computers’ because it was a skill they cultivated along the way. As a result, most software was highly specialized and arcane. Way too expensive to be a toy.
Early tools like Lotus 1-2-3 and dBASE were mind-blowing because they were so generalized and available on consumer hardware. Schools managed milk money, farmers planned crops; the perceived value was instant. There wasn't an activity that couldn't benefit.
Back then one computer with a spreadsheet and database was considered more than enough to grow ‘any’ entrepreneurial enterprise from zero to 200 employees. Even in the late 90s I was in a ~300 person multi-national that mostly ran on one Novell server and the entire company lived in Lotus Notes.
I don’t remember much fear that spreadsheets were going to replace programmers. What I remember more was a kind of simmering contempt for them, though time may be sanding the edges off that memory.
From the IT side, spreadsheets and later spreadmarts were a completely understandable headache.
The business needed solutions, and IT often couldn’t deliver fast enough. A lot of that was structural: IT was usually treated as a cost center, underfunded, and forced through layers of process and overengineering. So even small things could take too long, cost too much, and come wrapped in too much ceremony.
So the business used spreadsheets because they were powerful, flexible, and already there.
The real problem came later. Business process, business data, and domain knowledge ended up trapped inside undocumented spreadsheets living on someone’s PC. Out of sight, out of mind, and effectively unmaintained.
Then Joe retired, quit, or got laid off, and suddenly some weird but critical business function stopped working because Joe always did it and it lived on Joe’s machine.
That was the nightmare. The same IT group that didn’t have the time or resources to meet the need in the first place now had to reverse engineer a giant kludgy spreadsheet and somehow turn it into something supportable.
Which is part of why the current AI moment feels familiar to me. Not because AI is the same thing as a spreadsheet, but because the adoption pressure comes from the same places: businesses want to save money, people need to get their work done, and they will reach for whatever is powerful, available, and fast enough.
And just like with spreadsheets, some of what gets built this way will be genuinely useful, some of it will become invisible infrastructure, and some of it will turn into a future headache for whoever has to untangle it later.
I don’t think the lesson from spreadsheets was that end-user tools replaced professional software. It was that when the official path is too slow, too expensive, or too disconnected from the real need, people route around it. AI looks to me like the same pattern on a much larger scale. The hype says panacea. Reality will probably be messier.
My bride worked for a banking/credit-card company. Someone took the "Northwind" Microsoft Access sample app, changed the labels, and modified their business process to mimic a few of the existing queries. At some point there was a company mandate to port all of these kinds of apps to Oracle. Oh, the huge manatee... The process was a non-trivial bit of cash flow.
There has always been a bit of back and forth. Given long deadlines and crazy costs, the business will always kluge something together. Sometimes it works, sometimes folks get burned, sometimes you get a nice handoff.
Spreadsheets did not replace programmers, they mostly changed who could build small solutions. A lot of quick calculations and internal tools moved to spreadsheets, while developers focused more on building systems around them. In many teams it actually increased the need for proper software later on.
I was alive when the internet was not a thing, and I remember my neighbor getting a beefy hot-rod 486 that was significantly faster than the 286 I'd scraped together a lot of money for as a kid.
I liken AI more to sewing machines.
Let's say our factory output was 10 t-shirts a day by hand sewing. When sewing machines came along, we didn't make 10 t-shirts by 8:30 am, send everyone home, and pat ourselves on the back. Instead we increased output per worker and kept investing. That drove the cost of goods way down, to the point where the market reached an equilibrium: to make t-shirts any cheaper, you'd have to give them away (which, 100 years ago, was an absolutely absurd proposal, yet today is so common it's sort of laughable). This led to people moving out of the industry, but new people moved in: industrial and chemical engineers, materials scientists, technicians, mechanics, and so on.
In my mind, CEOs that are firing their staff are:
1. Making up for past over-hiring mistakes during Covid
2. Cashing in future investment for today's temporary gains.
We're in an economic pinch right now, so I'm not surprised to see layoffs. AI firings are just an excuse; if this happened in 2016, you'd be rolling out AI to every employee in your company to outproduce your competitors. You can't cut your way to success.