I’m a Luddite when it comes to this stuff as well. I use it, but mostly just in Ask mode. The agentic stuff I do not like at all. It’s not perfect, and that means I have to troubleshoot and fix it anyway, so I would just as soon do it myself so I properly understand what’s going on and can speak to it and be accountable for it. Luckily for me, my management talks about AI a lot, but they’re not over my shoulder hitting my knuckles with a ruler if I write actual code.
I’m not ready to throw in the towel yet. I want to see what things look like on the other side of the crash, when people get more realistic about using the things as tools instead of replacements, and get more realistic about their limitations. I’ve seen articles where a reporter with no stated experience created a dashboard for a few different things. While she admitted she wanted to throw her laptop in the ocean, she seemed to gloss over much of the hardship and didn’t mention how long it took to get something working. She also didn’t post a link, so there was no way to gauge how functional it was, beyond a couple screenshots.
My suspicion is that once the hype wears off, maybe anyone can code, but most people won’t want to. Then we’ll have the question of how professional developers best work. WYSIWYG web page editors used to be all the rage… anyone can make a website… but look at what we have now: professionals are back to code, and people not looking to write code are using very structured web-based platforms.
I’m with you on saving up money to get out, even if it’s just as an insurance policy. That said, I don’t think the collapse of the profession is inevitable just yet.
Smart move to wait for the other side of the crash.
I'll coin a word for it: slopression.
Anyway, I'm a hobby coder and, unlike you, I've really enjoyed AI-assisted development. I was never a strong developer, so coding always took me a long time, and my interest in projects faded quickly, which forced me to relearn them from scratch after long breaks. With AI, I can actually finish projects, and my code quality has improved. GPT is a better developer than I am. Example: the first time I had it analyze a personal project, it found over 50 vulnerabilities.
I enjoy learning and understanding how code works, but since AI has largely automated typing code, I've shifted my focus to higher-level topics like software architecture and systems engineering. I am reading the book "Designing Data-Intensive Applications" right now.
Yeah, that's cool to hear. In my case it's more of a use-it-or-get-fired kind of thing.
>> If I want to keep my job I need to use it.
So use it. I've been programming for 45 years, and I've found it to be a really useful tool.
I'm still writing code, still doing all the fun stuff, but I'm moving along MUCH faster than before. Mostly because when I get stuck I ask the AI questions. About the code, about the API I'm talking to, and so on. In the past I remember spending days finding really obscure bugs, or reading soooo much material to try and figure out that "in this case call A before B, but in that case call A before C."
To me, it's made the programming (the creative part) more fun, while removing the unfun stuff (like bug fixing).
I'm using "chat" more than agents though - the AI doesn't edit my code directly.
My company doesn't really care how we use it, just as long as we use it to make ourselves faster. "Ignoring" it out of some nostalgia for the past is not helpful from an employer perspective.
I certainly don't miss the pre-internet days (when you sought out programming books and coded with a reference manual in one hand), or even the Google days, where trying to craft the right search led you to some answer you could kinda interpret.
The terms are Zoomer, Bloomer, Gloomer or Doomer.
Personally I'm a bit of an AI Gloomer because I do think it's effectively inevitable, and putting people out of work is not a good thing. People out of work eventually tend to do desperate things. Not a doomer because I don't think it's going to literally end the world.
From my perspective it's not about losing the job, it's that I don't use my brain anymore, I just write words... sure, you need some architecture, but I don't feel as engaged anymore. It's like being the one shaping the wood vs. a 3D printer making it. I was thinking of analogies like driving a track vs. just teleporting to the end of it... it seems inevitable, though; companies are about shipping features. That's why, yeah, from my side I think I'll be getting out of it / doing it for fun.
The flip side... if I don't care about something I'll just vibe code it
The app doesn't do this... (AI makes changes) Run again
It's good for POCs in unknown tech territory
I just don't feel good about it
It is funny when you run out of tokens
MelancholAI
DisenfranchAIsed
AIpathy
How would you feel for a secretary who refused to use a typewriter or an accountant who didn’t want to use a spreadsheet?
Times change. You’re just sad the times changed for you in a way you didn’t like.
Seems to me the issue isn't so much the "AI adoption", it's the "forced". Forced to use a tool even when you don't think it's the best option, even when you think it's going to produce sub-optimal outcomes.
We're being devalued. Our engineering judgment is being devalued. We're being driven toward a cliff by those who know less than we do but think they know more.
If I had to give it a name, I might say: marginalized.
Ennui