We've had a similar long running joke about architects.
Step 1. Design system that gets push back because a bunch of things appear to be buzzwords
Step 2. Force it through politically
Step 3. Quit with shiny buzzword on CV
Step 4. Narrowly avoid being around when shit hits the fan.
I find developers are usually much more concerned about it working well, because it ends up being their baby. Not always, of course, but more often than architects who don't actually have to do the work.
I don’t understand the premise: you were hired to do the job, you do the job, tooling improves so you do more work with fewer resources. It’s a win-win for everyone.
Lower-skilled people will be able to “do the job” with the new tooling (at a level management believes is good enough), and doing more work with fewer resources also means fewer human resources. There is no win-win, similar to how there is no win-win for artists who can now, in principle, produce more “art” with fewer resources.
That's not new though, the bar for software development has been continuously dropping - at least on paper - since the profession started. I don't know assembly or manual memory management in C, but I do know languages and tools that allow me to do the job. Do I steal jobs from assembly / C developers?
Don't get me wrong, I don't like AI either and it's only a matter of time before my day job goes from "rewrite this (web) app from 5-10 years ago" to "rewrite this AI assisted or generated (web) app from 5-10 years ago". But I don't think it's going to cost that many jobs in the long run.
> it's only a matter of time before my day job goes from "rewrite this (web) app from 5-10 years ago" to "rewrite this AI assisted or generated (web) app from 5-10 years ago".
That seems optimistic. If it all goes according to plan, you won't get to write or rewrite anything anymore. You'll just be QA reviewing AI output. If you find something wrong, you don't get to correct it; nobody will pay for coders. You'll just be telling the AI about the problem so it can try to correct it until the code passes and the hallucinations are gone. That way they can pay you much less while you help train the AI to get better and better at doing your old job.
I don’t think this is true at all; it's very evident how quickly AI assistants break down when they meet established, complex codebases that do a little more than your average todo list app.
Some people who used to have a high-skill job will see that skill devalued as AI takes over - this is a consequence of technological improvement. It is not a right that society maintains the value of an acquired skill. They will have to adapt.
As for fewer farmers, that is exactly it - those who would have been farmers would be required to acquire new skills or pursue something other than farming. Bringing this back to AI - artists, writers and programmers who get displaced will need to adapt. In the long term, the massive decrease in costs of production of various "creative" endeavours will produce new industries, new demand and increase overall wealth - even if it is not shared evenly (in the same sense that past technological leaps are also not shared evenly).
>There isn’t a rule of economics that says better technology makes more, better jobs for horses. It sounds shockingly dumb to even say that out loud, but swap horses for humans and suddenly people think it sounds about right.
Tooling has made us vastly more efficient since the days of FORTRAN and punch cards, which has allowed millions more of us to be employed in the field. Few companies could afford websites built with ancient tech, but everyone can afford websites built with recent tech and, as a result, everyone has one.
NB: Firing 20% of employees requires a 25% increase in efficiency by the simple math.
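To spell out that simple math, here's a quick sanity check (mine, not from the thread):

```python
# If 20% of employees are cut but total output must stay constant,
# the remaining 80% must each produce 1 / 0.8 = 1.25x their old output,
# i.e. a 25% increase in per-person efficiency.
cut = 0.20
remaining = 1.0 - cut                      # fraction of the workforce left
required = 1.0 / remaining                 # output multiplier per remaining person
print(f"Required efficiency gain: {(required - 1.0) * 100:.0f}%")
# prints "Required efficiency gain: 25%"
```

The general shape: cutting a fraction f of headcount at constant output demands a 1/(1-f) - 1 efficiency gain, which grows faster than f itself.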
My stance has always been to lean on the available tools to free up time to work on the more interesting problems that deliver value to the organisation / company. Has been a good strategy to date.
Sadly, the current environment does not reflect that in my experience. There is a vicious focus on keeping profit margins steady at all costs while slashing spend on tooling, which forces re-work on already-solved problems. :/
At some point the music is going to stop and it's not going to be pretty I suspect. :(
Organizations that don't trust their engineers to work towards delivering value (by using better tooling, efficiency-increasing automation, etc.) don't improve; they're accepting the current status quo.
That's why you need to keep an eye out and smell whether management understands it or not. Plan to leave, as your contributions will not be rewarded the way they deserve in that type of organization.
That's the very superficial theory, but in practice it means your company can do 25% more work with the same headcount. I've worked in B2C companies for years and there's always more work, never a point where they have to downsize.
There's a lot of make-work out there. And AI does help - just today it got me out of a self-made jam with SQLAlchemy. But it's not the panacea that the breathless financial reports make it out to be.
That’s not new in the age of AI. I’ve refused code reviews with blatant race conditions, initialisation order problems, even stuff that literally triggers compiler warnings, and been told “it’s not a big deal” in response.