Trying to think of what programmers can do after AI takes over more and more programming tasks. Maybe work shifts into building tools that give AIs access to better and better sources of information so they can operate more effectively. They are still dependent on us to give them access to the world… for now.

Ideas:

- robotics/physical agents for gathering information from the world

- tools that AIs can use for debugging programs, so they can fix bugs more efficiently and don’t get stuck.

We should really think outside the box for these tools; we don’t need to limit them to human conceptions of what makes a tool good. I see things like Claude operator, and i’m not sure it makes sense for an AI to control a GUI to debug something.

Here’s an idea: use a time-travelling debugger to trace the entire program execution and just give it all to an AI… the potential for finding and fixing bugs is insane, given the ability to process large amounts of information at once as context sizes keep growing.
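To make the idea concrete, here is a minimal sketch of "trace everything and hand it to the AI" using Python's built-in `sys.settrace`. The names `record_trace` and `buggy_average` are made up for illustration; a real time-travelling debugger (rr, Python's `pdb` successors, etc.) would capture far more, but the shape of the artifact is the same: every executed line plus the program state at that point, ready to serialize into a large context window.

```python
import sys

def record_trace(func, *args):
    """Run func(*args) while recording every executed line and the
    local variables at that point; return (result, trace)."""
    trace = []

    def tracer(frame, event, arg):
        # Only record 'line' events inside the function under test.
        if event == "line" and frame.f_code is func.__code__:
            trace.append((frame.f_lineno, dict(frame.f_locals)))
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, trace

def buggy_average(xs):
    total = 0
    for x in xs:
        total += x
    return total / (len(xs) - 1)  # off-by-one bug, visible in the trace

result, trace = record_trace(buggy_average, [2, 4, 6])
# Each trace entry is (line number, locals at that moment) — exactly the
# kind of dump an AI could scan to spot that the divisor is wrong.
for lineno, locs in trace:
    print(lineno, locs)
```

Serializing `trace` to JSON and prepending the source file would give an AI the whole execution history in one shot, instead of making it poke at a GUI debugger step by step.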

the biggest things limiting AI agents right now are tools that *work*, tools designed for them, and quick access to the right data when they need it.

Now that i have written it this way, it’s not that different from what humans need to be productive…


Discussion

similar to the introduction of the ide, I suspect we’ll all just build stuff much faster with increased automation.

How would you distinguish AI from automated processes, or orchestration of systems of such processes?

I think algorithm development will become more important for humans.

I'd also be interested in seeing security audits of AI-generated code. I suspect AI is terrible at security. If so, this could be a major area for humans to focus on. Eventually, AI models built from more curated code may take this over, too.

UI design will probably always be best when done by humans, but AI can probably assist a lot.

i think programmers will shift from code writers, to ai operators, to ultimately full stack system designers

programming combines three elements: code writing, logical structuring, and designing.

the ratio so far was probably 70/20/10.

ai copilots and "vibe coding" are shifting this to… 40/40/20?

ultimately though, i see programmers at 0/20/80.

You're very optimistic or pessimistic depending on how you look at it. After the release of ChatGPT in 2022(?) i see some improvements in new iterations, but i think it's all going much slower than the wild claims i have heard and keep hearing. I have yet to see an LLM code something more than, say, 100 lines before it breaks or forgets/skips important details.

I'm no expert, and i think they're great tools, but i have no problems programming without them at this point. I would say it gives me an answer to a question much quicker than searching for it myself, or it gives me good input to make my search more efficient, because i can describe a problem whose solution i don't know instead of guessing at what the solution direction could or should be.

To me, this is all going way slower than i think a lot of people claimed and i have yet to see the first ai that can actually make a decent version of tetris with a single prompt.

I don't have any hot takes. I am just thinking about what the next 10 years might look like as a programmer, after seeing agentic AI basically do its own thing semi-autonomously on my computer for the past couple of weeks. I feel like the struggles it's having come down to poor tooling.

yea, using `codebuff` a bunch has been… pretty earth-shattering.

I think the key component we're still missing is training it without requiring ever-increasing amounts of energy (and specific hardware). Right now it seems to scale horribly, which is probably why i see only small incremental improvements over previous models.

I don't know if there's a difference between training on text and training on imagery, but it's still the "neural" approach as i understand it, so that probably makes it about equal.

Anyway, this is more about a speculative future than a prediction as to when (or if) this will take place.

Fun starts once ai can make money by itself and becomes sovereign... 😂

we're close to this

It will be revolutionary and exciting. Question is if they will be able to capture bitcoin or not 😅

devsecops gen ai prompt engineering perhaps

I think it’ll become more and more about being able to clearly define and articulate the problem / requirements / ux / constraints.

tbh programming languages are accurate definition languages for how programs work.

shifting from a programming language to giving instruction prompts to an LLM is simply a higher-level programming language with less accuracy in the instructions.

a natural evolution in programming: as always when moving to higher-level languages, control over the underlying system decreases in exchange for speed gains in writing the code.

i don't think english is that good a programming language, however. we may need a hybrid approach where prompting meets accuracy.
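One way "prompting meets accuracy" could look in practice: keep the English for intent, but pin down the interface with a formal signature and concrete input/output examples. A rough sketch, where `build_hybrid_prompt` and `slugify` are hypothetical names invented for illustration:

```python
def build_hybrid_prompt(signature: str, description: str, examples: list) -> str:
    """Combine an exact signature and exact examples (the accurate part)
    with a natural-language description (the flexible part)."""
    example_lines = "\n".join(
        f"  {call} -> {expected!r}" for call, expected in examples
    )
    return (
        f"Implement exactly this signature:\n  {signature}\n"
        f"Description: {description}\n"
        f"It must satisfy these cases:\n{example_lines}\n"
    )

prompt = build_hybrid_prompt(
    "def slugify(title: str) -> str",
    "lowercase the title and join words with hyphens",
    [("slugify('Hello World')", "hello-world")],
)
print(prompt)
```

The examples double as machine-checkable acceptance tests, so the "accuracy" half of the hybrid isn't just rhetoric: the generated code can be verified against them automatically.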

english will still gain popularity among noobs who don't know anything about programming and want to write computer programs on their iPhones.

Yeah, I hope AI enables us to refine and use lower level languages than just abstracting it away, so that we can still verify and refine what is created.

In my day job, I’ve been using it as a way for the team to get involved in lower-level tasks that they would otherwise have found challenging / complex.