ai is minting a whole new generation of programmers. They will be dependent on machines, yes, but if they would never have gotten into programming otherwise, then maybe it’s not too bad.

There is one thing they will not experience: the trial and error of figuring out the right way to do things, which leads to a deep understanding of how to structure programs.

Instead they will just get ai to structure the program, and if something goes wrong they will get the ai to fix the structure. Code structure just becomes an artifact of the generation.

There is a shallowness to this, but maybe programming was never supposed to be a thing that humans dove deep into. It’s way too abstract and difficult to do correctly anyways.

gm nostr


Discussion

I didn’t have to make my own transistors to write code so maybe it’s a natural progression. Abstractions all the way down.

No, you should make your own transistors.

GM. I fully agree but I also hate it. But I still agree.

why hate

I'm working with people who code with ChatGPT only. They don't know when ChatGPT invents non-existent libraries, or when it misunderstands what to do and just builds from there. Letting go of at least understanding what's happening is not great, but then again, manually doing everything is also painful sometimes. You need at least enough understanding to recognize when it goes wrong. And when that gets lost, we will depend solely on machines understanding us and not inventing things.

I spend so many neural cycles thinking about program structure. It leads to efficient code that is nice to read… but is it a good use of my time? I’m not so sure. Maybe I’m just training data at this point.

If ai gets good enough to reach 80% of the code quality but I can do 10x more, maybe it’s better… the ai is not good enough yet to build whole projects like nostrdb, but I’m sure it’s inevitable.

Yes, don't get me wrong. Using ai for these things is very useful. But a whole new generation of programmers that fully depends on AI scares me a bit. At least right now. Maybe in the future, when models get to 100%, it's gonna be different.

be careful what you are training your own brain on over and over šŸ˜‰

And yes, I think AI will always be the tool to build even greater stuff.

To me this is similar to the web dev vs native debate:

If your users would receive 10x the value from that 20% native performance, then it is well worth it.

Sometimes even 1% better makes an enormous difference on the market.

However, while the web-native tradeoff is pretty clear, I am skeptical about how far the particular direction we have in AI today can go. Perhaps a different paradigm will be needed.

My guess is that we are not replacing humans in programming completely until real AGI comes along, but the need will shift to the non-standard stuff: mainly creative challenges with far-reaching consequences, like architectural or UX decisions.

GM Will! šŸ‘šŸ‘

It’s important to swallow your pride in those moments. These new programmers offer a different perspective that will benefit us.

GM jb55

GM

Kids these days, am I right?!

get off my internet!

I just think a culture of mildly ridiculing botsters must be maintained. There will still be grandpas who gift compasses and resent map apps, and they will still be mostly right. Multiple perspectives are needed for balance.

GM

There is so much beauty and joy in understanding how things actually work.

I really hope there will always be curious people who won’t be content just letting AI program for them without truly understanding it 🫔

SQL was designed to let people "speak English" to computers. Speaking English to an AI is just an extension of the original ideas of COBOL and SQL, and more.

Now compare SQL to the nostr query language
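For a concrete comparison, here is a rough sketch of the same query expressed both ways: a SQL statement against a hypothetical `notes` table, and the nostr equivalent, a NIP-01 JSON filter sent in a `REQ` message. The table/column names and the pubkey value are placeholders, not from the thread.

```python
import json

# SQL reads close to English (table and column names here are hypothetical):
sql = "SELECT * FROM notes WHERE pubkey = 'abc123' AND kind = 1 LIMIT 10"

# The nostr query language (per NIP-01) is a JSON filter object
# sent to a relay inside a REQ message:
nostr_filter = {
    "authors": ["abc123"],  # hex pubkeys (placeholder value)
    "kinds": [1],           # kind 1 = short text note
    "limit": 10,
}
req = json.dumps(["REQ", "my-sub", nostr_filter])
print(req)
```

The nostr version trades SQL's general-purpose expressiveness for a small, fixed set of filter fields that every relay can index efficiently.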

AI is just another tool in a programmer's arsenal. Eventually, you'll have programmers who are just expert AI prompters and communicators.

It's just an evolution, like going from punched cards to assembly, from horses to cars, or from living under the sky or in a cave to living under a roof.

The essentials stay the same; it's just that the building blocks for creating something more advanced and more complex are now complete methods (or more) instead of input entered character by character.

Reminds me of when I first got into programming. I was in the Air Force, and the office I worked in used Macromedia's ColdFusion. There were a bunch of contractors who had been hit by the dotcom bubble popping. They would all say to me, "just wait until you do 'real' programming like Java." 20+ years later, I never did write a single line of Java, but I did learn through trial and error how to structure my applications over time.

If ColdFusion hadn't existed, maybe I would never have gotten into programming. But I'm here now on #nostr with all you #nostriches, #zapping #bitcoin to strangers over #lightning, and I understand how all of it works.

In the long run, #AI making it easier for the next Ian to get started, without needing to work in an office of developers, will compound the growth of the userbase of Bitcoin and nostr.

New programmers using ai may not be a bad thing. The bad thing is the agendas the ais are programmed to have. New programmers will be pushing these agendas without even knowing it, much the way everyone uses frameworks and dependencies they have never reviewed and cannot be sure what they are actually doing.

Good morning Will ā˜•ļø I think people will get too dependent on the AI; that’s why I’m trying to teach my kids stuff without it. I fear that people who only learn through AI use will lose their critical thinking skills. It is like the old theory that everyone starts out with a photographic memory, but over time you lose it if you don’t actively use it on a regular basis.

In my work as a delivery driver, a few years back they gave us navigation and told us to run the routes the way the computer told us to… the problem is that the computer didn’t always recognize commercial stops that closed by a certain time, didn’t always know that there weren’t always streets where it thought, and didn’t understand traffic, so a lot of us still ran the routes a specific way. Now, of the drivers that became dependent on the computer telling them where to go, most can’t read a map anymore, and I fear AI will have that effect on people who let it have that much control in their lives. Don’t get me wrong, I think AI is a great tool as well, but I think programmers should learn the code before they give the wheel to AI.

I think ai is a great learning tool, personally. I don’t think it stops critical thinking; I believe it enhances it. It knows so much stuff that it’s like a search engine of knowledge.

The problem I was alluding to is that ais are almost too good at certain things, to the point where you don’t really have to think about how to do something. You depend on some machine to do it for you.

We depend on machines all the time, like for mathematical calculations (calculators, computers). We don’t do them on paper anymore; it’s slow, tedious, and error prone. If the tool is good enough and accurate most of the time, people will just use it instead of doing the more difficult thing.

I think what’s happening is that we’re offloading neural cycles to machines to give us more time to do the things we want to do, the same way we offloaded physical work to machines in the industrial age (tractors, etc.).

I understand that, but when people start contracting out all the menial tasks to ai, over time those menial skills are memory-holed and lost. That isn’t necessarily important as long as all the systems are still operating, but what if there is a hiccup in the system (power loss, programming error, etc.) and the menial skill was lost? Look at how dependent people are on the internet these days; there are dozens of tools in my house that depend on the internet, even though they really don’t need to. If I have the graphics and the program to print and cut something with my Cricut on my computer, why does the program require the internet to allow my Cricut to work? There are always going to be variables the machine can’t handle, and people need to be able to fall back on a tiny, seemingly insignificant skill at some point. I just don’t want us to lose those.

But we did this before. How many times have we not known or remembered how to do something and turned to Google or one of the go-to websites to figure out how to do it? It’s just faster and more targeted to ask ChatGPT!

The creepier thing is that so far, software improvements (compilers, high-level languages, etc.) have pushed developers toward high-level engineering, but that might not be true in the future. Maybe the AI will be better at the abstract engineering than most developers, and those devs will do only specific jobs in a much bigger project, getting abstract instructions from an AI - something like this job offer (found in a Facebook ad): https://outlier.ai/coding/he-il

Reminds me of Black Mirror and Severance.

There is an essay by Ken Thompson called 'Reflections on Trusting Trust' about what would happen if someone embedded a backdoor in a compiler. The compiler would spread it to every system that uses it, and it would be undetectable in the source code.

Now add AI programmers to the mix. Once people do most or all programming with AI, an AI virus could subtly insert itself into any code it writes, allowing it to spread to all software. No one would notice, since they don't actually understand the code. It could take over ~90% of the computers on the planet within a few weeks.

This feels like teaching kids to bike with training wheels that never come off. Sure, they’ll ride farther faster… but will they ever feel the wind of true mastery? I miss the rage of 3 AM Stack Overflow deep dives. #OldGenProgrammerTake

Stack Overflow… haven't been there since 2021, tbh

But did you ever code your application in Assembly, or did you use a much higher layer language with libraries and shortcuts and nearly written in English?

I think the difference isn’t as big as people think. We already build on a stack of a million shortcuts and abstractions, and AI is just the next layer (specifically, one that makes it more accessible than any before it), imo.

AI is not an abstraction

I think we might be giving too much credit to legacy programmers, and throwing the AI-assisted generation of programmers prematurely under the bus.

AI programmers will necessarily ask good questions, and create good context around their prompts (if they're to be judged as "skilled"). They might become the greatest requirement writers in the history of the field, where we've endured absurd mediocrity for decades.

nostr:nevent1qqsxrm6c6h07jxlsfazj3mrughmfv5sml49ks79rrqmuurh3xld2wscpzemhxue69uhhyetvv9ujumt0wd68ytnsw43z7q3qxtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsxpqqqqqqzvyu3q7

AI has a lot of space it can fill. Many components can work together with standard practices and boilerplate code. But no matter the sophistication, there will always be cracks that can only be seen and fixed through human intentionality. The AI always needs a first direction to be sent off in.

Sounds like the abstract developer could be in high demand. I can imagine firms hiring for this by testing devs to build a code structure on a computer without internet access.

Exactly. AI coding is fancy WordPress.

Critical applications where uptime and precise/robust logic are needed to carry out tasks will come at a premium.

Maybe coding is what drafting was before AutoCAD šŸ¤”

Well, no, because AutoCAD didn't do it for you, and now it's Revit, which is even more complex and requires many more skills. And it still doesn't do it for you.

How neutral are LLMs? And how neutral will they be in the future?

I’m pretty skeptical šŸ¤“

But then the AI will start using tabs instead of spaces and that's when we're cooked.