Agreed. I do have to wonder whether we're going to have to evolve just as quickly. One day's vibe-coded app is tomorrow's scrap. I already want to be building tools that just let my LLM servers run fully loaded in the background, cranking things out for me somehow XD

I think we'll get to a point where vibe-coding an app will happen as quickly as the next entrepreneur can come up with an idea. Tech founders won't be needed.

It just continues to compound the more I think about it.


Discussion

You'll just focus on something else because that's what humans do. You'll be better at whatever that is than someone who is less educated. You'll find a way to be better or improve something. I guess what I don't buy into is the utopia where all humans are exactly equal and do absolutely nothing because AI will do everything. Some people will still be born smarter than others and will always find some way to shine brighter, all other things being equal. And if that isn't true, what the fuck is the point of humanity?

> And if that isn't true, what the fuck is the point of humanity?

I mean, I don't think we're ever going to know that.

But yes, only a subset of people will be, at least for a little while, really leveraging and building things with AI.

We are not all equal, utopia doesn't exist in any natural form, and I don't think we should be manufacturing it. I think anyone who believes they treat everyone with equality is lying or has a dangerous mental condition. You don't treat strangers the way you treat your family members; if you do, you're lying or insane. If you claim you don't value your family or your own life more than a stranger's, you're lying or you're insane. It's just a reality of the human condition. Point is, in my opinion, if you attempt to manufacture utopia, you will have to, at some point, break your own rules about how you value life. Lying about it is denying reality, and likely to hurt more people than it helps.

I agree.

These are things nostr:npub1w4jkwspqn9svwnlrw0nfg0u2yx4cj6yfmp53ya4xp7r24k7gly4qaq30zp and I used to talk about here and there.

It's interesting stuff. But I am also aware that what humanity is right now may not be what it will always be. It's easy to forget that we are at a point in evolution. It isn't necessarily over. I'm sure most humans at some point won't be entirely organic, for example. That is relevant. We should have people thinking about this stuff and discussing it, in my opinion.

We did? Baby brain rot is real 🤤 I used to remember every conversation I ever had. I may not recall your name past blinking, but I would remember every idea mentioned till the end of.. well apparently to the end of bachelorhood...

Lol Fair enough.

As for the tech founder thing, you're probably right. But even imagining a solution for a problem requires some ability. You'll have some people who are better at doing that than others. This also compounds, which is my point. Right now, you're more likely to create an LLM server like that than someone who has no understanding of the topic, or of how to effectively gain the understanding.

Correct, and agreed. Skill sets will be modified. For example, I think people who leverage LLMs to build things for them will fall behind those who leverage LLMs to augment and build their own skills. I think LLMs, as they exist now, are far better teachers than builders.

Effectively using AI in the most efficient way IS the skill. I think we are misaligned on meaning a bit. I don't mean skill in the sense that a human will try to do a thing that AI can do better. That may be fun and useful (like playing chess with another human), but the more educated human would realize that and focus on improving something else if it can be improved. They will simply use AI better than someone who is less capable, even if the gap comes from cognition, laziness, or disinterest. We already see this with the technology we have. I think this only breaks down at the point where nothing can possibly be improved by humans. I don't know how you'd even prove that limit, and maybe it becomes irrelevant broadly. But I don't think LLMs or AI will ever be used at an equal level by all humans. Will it make a practical difference in daily human life? Perhaps not at some point. I don't think that will be the case for any human currently living.

I also think there's a valid case for encouraging human skills even where AI can do things better, in case AI goes sideways for some reason. I don't think every human should know the intricacies of farming and be capable of doing it with primitive tools, but I think it's worth having some humans who can if ever required. I think there will always be a case for humans who CAN do things, even if they don't have to. That's sort of a separate topic, but one I'm thinking about. Guardians of critical information and skills for human survival, in a sense. Offline information on critical topics is a similar thing on my mind. Again, separate topic from what we're discussing.