To me that leaves an opening for a software architect who can put these symbols together quickly and efficiently.
A core skill of such a person would be deep technical domain knowledge. Other skills, like negotiation, persuasion, communication, and management, are also worth developing.
Of course, many peripheral roles would still be needed: cybersecurity, databases, networking, hardware, and so on.
This kind of AI technology is coming soon. The good news is that most software people are really good at adapting and learning new things.
Why will programming still be our pivot? There is no perfect software. Every program has resource constraints and unique features. High-level constraints include choosing app submodules to please users; low-level constraints exist too. Who decides to turn off the garbage collector for performance? Yes, in some respects the process may become more declarative. But only you know your constraints and trade-offs, and you need to specify them for a hypothetical perfect coding AI. The deeper you start to specify, the more you start programming. Maybe you'll even use "while" constructions in natural language programming.
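To make the garbage-collector point concrete, here is a minimal sketch in Python (using only the standard `gc` and `time` modules; the workload and numbers are made up for illustration). Only the programmer knows whether a given batch job can safely skip cycle collection; an AI would have to be told that constraint explicitly.

```python
import gc
import time

def build_many_objects(n):
    # Allocate lots of small, cycle-free objects.
    return [{"id": i} for i in range(n)]

# Timing with the cyclic garbage collector enabled (the default).
start = time.perf_counter()
build_many_objects(500_000)
with_gc = time.perf_counter() - start

# Timing with cycle collection disabled — a trade-off only the
# programmer can decide is safe for this particular workload.
gc.disable()
try:
    start = time.perf_counter()
    build_many_objects(500_000)
    without_gc = time.perf_counter() - start
finally:
    gc.enable()  # restore the default for the rest of the program

print(f"with GC: {with_gc:.3f}s, without GC: {without_gc:.3f}s")
```

Whether disabling collection actually helps depends entirely on the program's allocation pattern and lifetime, which is exactly the kind of constraint you'd have to spell out to any coding AI.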
If we imagine AI knowing all constraints and human needs, we enter sci-fi territory. I think we could collectively write a pretty decent sci-fi novel about this in the comments.
But in the end, the world is very complicated and this is only an opinion.
P.S. Right now I'm going to write code in a completely non-declarative style, and have fun doing it!
The code itself is an artifact of the process of theory building while solving problems using computers. As time goes on, and new wrinkles of the problem space are found, the programmer has to manage the process of revising and extending the theory to accommodate all new observations.
Knowing the keywords and APIs and typing valid code are a hard set of skills to teach an LLM; it has taken billions of dollars of research to get close. The rest is still a hill to climb.
Software developers are a variant of engineers.
Coding, AI, or whatever else is just a tool at hand, one of many possible ways to reach a goal.
Whether any algorithm can take over the "engineering" core is a good question. But we're not even close to understanding that question fully, let alone asking and answering it.
I absolutely detest these large frameworks and thus refuse to go back to fullstack development. I might, however, pivot back to fullstack only after AI replaces all the unoriginal framework nonsense.
In the meantime I have already pivoted to proxy and API management. I also have a part time job as a senior technology principal in government with associate director experience, which can become a full time job if I want to relocate.
Let's put working at a different level of abstraction aside (like reviewing outputs and guiding AI): it's already happening to some extent, and developer jobs are always changing, with or without AI.
If AI gets good enough to largely remove the need for software development jobs, I don't think there's a pivot, at least not at scale.
1. If AI gets good enough at SWE, it will very likely also be extremely good at any kind of knowledge work, so you can't go and be an accountant or a lawyer because they are losing their jobs at scale too.
2. An individual can go and be a plumber or an electrician, but that can't be done at scale. If 40% of workers are doing knowledge work, they can't all switch to manual labor; there isn't enough demand in those fields to absorb 40% of people. (And that's even if we ignore other problems, like the fact that not everyone has the aptitude or ability to retrain.)
3. Even if you individually are okay because you're still employed, or because you're independently wealthy, you still have a huge problem - at 30%+ levels of unemployment the economy and society as a whole begin to collapse. If you have a job, there wouldn't be enough people to pay for your services; if you have assets (stocks, property, currency) their value won't be preserved in an unstable society with high unemployment. It wouldn't be just a problem for the people who are directly affected, it would be everyone's problem.
That being said, in this scenario I think there will be lots of work that looks a bit more like SRE/DevOps does today, probably involving a lot of security, monitoring, code review, and iteration on patches.
If AI progress continues and does not get misused, we end up with a utopia where we don't have to work. Otherwise, you have to be prepared for the doomer scenario. How you prepare is up to you.
There won't be any in-between situation unless we hit a major roadblock somewhere: an unsolvable problem, perhaps?
Doomer Situation: https://news.ycombinator.com/item?id=35364833
This is a tech enthusiast community, so I don't expect people to take a doomer stance unless it hits them personally. A mess-up could be devastating in the future.
My degree was in EE, and someone has to do the low-level programming and wiring. Assuming AI automates all that too, I'll just get a job connecting the AI brains to the nuclear fusion plants.
Or I'll just become a chess player. No matter how well AI plays chess, people still pay to watch humans play chess on YouTube.
That being said, I don't think AI will become "smarter" than humans at knowing what we don't know, which is especially important when it comes to executing ideas. So there is always going to be "high-skilled" labor that can't be easily automated when it comes to creativity and innovation.
As someone working in FAANG and seeing how incredibly low-labor the SWE work is, though, it's hard not to see this as a golden era that will be looked back on in decades.
A role like “tech lead” today is often about negotiating with other stakeholders, writing and vetting a design, and chunking it up for junior devs. That work stays relevant even if the junior devs are replaced with AI.
By that analogy, software developers will use AI to solve technical problems.
Coding one file at a time with AI is pretty decent for short, small tasks, but it doesn't scale.
If a company can now move faster because code gets deployed faster, what are companies going to do with the gained time? Well, develop more features, right? And they will probably find that they now need to deliver really complex features to beat their competitors. Perhaps we will soon discover that while AI can generate code just fine, it cannot yet generate truly complex systems (i.e., AI cannot generate code that doesn't somewhat resemble its training material; basically, you don't know what you don't know).
I think if software developer jobs are taken over by AI, software developers will still be employable doing something related to software development. Perhaps not writing "raw code" anymore, but definitely using AI to meet customer needs. If anything, I think AI will leave companies in need of more developers (we'll need to find another name for ourselves). Fifty years ago, code wasn't what companies needed; they wanted to solve business problems, but it turns out you need code for that. AI doesn't change that, I think: we may not write code anymore and instead use AI to solve business needs, but we still need people to operate that AI.
AGI is a different thing, though. But I don't think we are close to that.