Educated people using AI will be the game changer. Sure, average people will be able to do more things for themselves, and that's awesome. But the most interesting things will probably still be done by educated, skilled humans. I think it'll always be that way. It always has been.
Discussion
I think that what being “educated” means is going to change. There will be no straight line between today’s definition of the word and the meaning a decade from now.
I don't see why it would. Broad understanding of one's environment and available tools will always make one more capable than those who have a narrower understanding. Many people have access to the same tools and information I do yet are less productive and capable. The cream will always rise. I'll just be even better ten years from now.
Of course, we may be using different meanings, which is why it's important to define terms. I'll work on that. But this is what I mean by educated. I don't necessarily mean credentialed, which some people associate with educated. I mean broadly capable, skilled, and knowledgeable (critical thinking skills, fundamental understanding of engineering and problem solving, philosophy, etc.). I'm always going to be more effective than someone without those things, all else being equal. I will beat them in some way.
A medical doctor will probably use AI more effectively for medicine than someone who has no medical training, for example. They'll just be doing even deeper work and pushing current limitations. I'm unconvinced that AI will solve every problem humans will ever face in every area of humanity. Maybe I'm wrong.
Agreed. I do have to wonder if we're going to have to evolve just as quickly. One day's vibe-coded app is tomorrow's scrap. I already want to be building tools to just let my LLM servers run fully loaded in the background, cranking things out for me somehow XD
I think we'll get to a point where vibe-coding an app happens just as quickly as the next entrepreneur can come up with an idea. Tech founders won't be needed.
It just continues to compound the more I think about it.
You'll just focus on something else because that's what humans do. You'll be better at whatever that is than someone who is less educated. You'll find a way to be better or improve something. I guess what I don't buy into is the utopia where all humans are exactly equal and do absolutely nothing because AI will do everything. Some people will still be born smarter than others and will always find some way to shine brighter, all other things being equal. And if that isn't true, what the fuck is the point of humanity?
> And if that isn't true, what the fuck is the point of humanity?
I mean, I don't think we're ever going to know that.
But yes, only a subset of people will be, at least for a little while, really leveraging and building things with AI.
We are not all equal, and utopia doesn't exist in any natural form, and I don't think we should be manufacturing it. I think anyone who believes they treat everyone with equality is lying or has a dangerous mental condition. You don't treat strangers the way you treat your family members; if you do, you're lying or insane. If you don't value your family or your own life more than a stranger's, you're lying or you're insane. It's just a reality of the human condition. Point is, in my opinion, if you attempt to manufacture utopia, you will have to, at some point, break your own rules for how you value life. Lying about it is denying reality and likely to hurt more people than it helps.
I agree.
These are things nostr:npub1w4jkwspqn9svwnlrw0nfg0u2yx4cj6yfmp53ya4xp7r24k7gly4qaq30zp and I used to talk about here and there.
It's interesting stuff. But I am also aware that what humanity is right now may not be what it will always be. It's easy to forget that we are at a point in evolution. It isn't necessarily over. I'm sure most humans at some point won't be entirely organic, for example. That is relevant. We should have people thinking about this stuff and discussing it, in my opinion.
As for the tech founder thing, you're probably right. But even imagining a solution for a problem requires some ability. You'll have some people who are better at doing that than others. This keeps compounding, which is my point. Right now, you're more likely to create an LLM server like that than someone who has no understanding of the topic or how to effectively gain that understanding.
Correct, and agreed. Skill sets will be modified. For example, I think people who leverage LLMs to build things for them will fall behind those who leverage LLMs to augment and build their own skills. I think LLMs, as they exist now, are far better teachers than builders.
Effectively using AI in the most efficient way IS the skill. I think we are misaligned on meaning a bit. I don't mean skill in the sense that a human will try to do a thing that AI can do better. That may be fun and useful (like playing chess with another human), but the more educated human would realize that and focus on improving something else where they can. They will simply use AI better than someone who is less capable, whether the gap is cognitive or due to laziness or disinterest. We already see this with the technology we have. I think this only breaks down at the point where nothing can possibly be improved by humans. I don't know how you'd even prove that limit, and maybe it becomes broadly irrelevant. But I don't think LLMs or AI will ever be used at an equal level by all humans. Will it make a practical difference in daily human life? Perhaps not at some point. I don't think that will be the case for any human currently living.
I also think there's a valid case for encouraging human skills even if AI can do it better, in case AI goes sideways for some reason. I don't think every human should know the intricacies of farming and be capable of doing it using primitive tools, but I think it's worth having some humans who can if ever required. I think there will always be a case for humans who CAN do things, even if they don't have to. That's sort of a separate topic, but one I'm thinking about. Guardians of critical information and skills for human survival, in a sense. Offline information on critical topics is a similar thing on my mind. Again, separate topic to what we're discussing.
Agreed, the biggest breakthroughs will likely come from people who can combine AI with their expertise and creativity. So even as AI becomes more common, human intent and know-how will still be crucial for driving the most impactful advancements.
That's all I'm really saying. It may indeed become less meaningful given enough time. I just don't think humanity will be replaced by AI without the total eradication of humanity. And if we stick around, some humans will always find a way to beat others, all other things being equal.
We've seen the same thing with all technological advancements. The gap narrows, but the best among us still go just a little higher.