Walking Thoughts:

The idea that Ai will take over everything and humans will have nothing to do is the same as saying there will be nothing left to learn.

Currently, all Ai is trained on words, images, code, and tools that humans have produced. This is sort of fundamental to what Ai is in its current structure. Without human input there is no Ai.

But in reality there is an infinite amount to learn, and what can be learned actually EXPANDS with new technology, it does not contract.

— What I think it actually means for how our economy changes:

In the coming decades, if you aren’t actively learning something new and producing/testing results of those new ideas, then you most likely aren’t producing any value. Our job as humans may end up being to brute force our different corners and perspectives of the world to produce information that can train Ai. (Imagine it like everyone contributing their own little piece to a giant code repository). The uniqueness of our perspectives and our applications of them will provide unique counter/supporting data for building models of everything in our world.

Just my random 2 sats

Discussion

AI has contributed the same to the art world that transfer paper contributed to the tattoo world.

It’s just shortening the distance between idea and its application. It isn’t contributing art, it’s contributing production of imagery.

In other words, it’s closer to a stupid fast photoshop than it is an artist. Art is fundamentally about a human and mortal perspective on value. Something that does not live nor die cannot have actual values. It can only regurgitate them from something else.

NAIL -> HAMMER -> HEAD

Incorrect. I’ve never seen transfer paper generate a unique visual all by itself. What’s that you say? “Ai doesn’t generate unique visuals, it just generates visuals based on human art it was trained on.” Guess what? That’s exactly what humans do too. You’re awarding humans with some magical, unprovable brilliance that’s becoming increasingly hard to believe in.

Interesting take! I wonder what happens if the AI starts training themselves fully, what role will humans then play? Path to pure human utopia might be the best case scenario, but I’d rather not spend too much time thinking about worst case scenarios lol.

Probably the economy becomes entirely social and cultural. We are already seeing a massive shift toward that anyway with the rise of social media. The new goal of humanity may be a giant collective argument about finding meaning.

Just to note: I don’t think there is an “end game” to any of this. Not really a time where humans are never needed, and not really a point where Ai “learns everything,” because I think those are just based on false premises. The more we can make and the faster we can learn only makes things more complex. So things will expand and possibilities will expand (necessarily) at a rate faster than our capabilities do, because our capabilities themselves expand possibilities.

If that doesn’t make any sense then maybe an analogy:

When all you can do is walk, you only have the land around you to settle. If you have a spaceship that travels the speed of light, then you have a billion star systems to settle with a billion challenges a billion times more difficult than what you’ll find within walking distance. Technology actually *makes the world bigger.* There is no horizon. Everything is infinite.

Let’s see what Gowron has to say about this.

*drum roll*

More Empire More Problems 👀

That's something I don't get in the Star Trek "no scarcity" utopia. If nothing else, your time is always scarce.

lol do random thoughts tend to bother you enough that you feel the need to comment how worthless you think they are? 🤣

I have grown bored of my own random thoughts. Now I just think they are random thoughts that usually only make sense to me. But hey, I’m no influencer. One could equate this meme to 1btc=1btc. I tried changing the world and realized I should change my own life instead.

How would you know if they only make sense to you if you don’t share them? Maybe you’ll get some fun engagement and people will tell you it’s dogshit, or maybe it’ll make somebody else think about it too. 😆👏🏻

Have you ever been really stoned when you had something important to do? Like a doctor’s visit, or talking with your kid’s teacher about their performance at school, and all you can think about is nachos? Well yes, Mr. Swann, Little Suzy said she feels tired in class. “But the cheese is so melty and good!” Sometimes my random thoughts help people feel at ease, but what does that leave me with? Random outcomes instead of what I really need.

"What will we do?" Yeah, I don't know. Like C.J., I think AI will soon not need a human in the loop. Then what?

Ai doesn’t really have values. Values, hopes, and dreams are a consequence of mortality. Ai will only let us tackle ever more spectacular missions and break more complex horizons. The people who choose to live will do so at the edge of whatever Ai makes possible, imo.

@note1rlezm0q64tx03sfexyle8w9vms4kwm3sh44qvtuul2j58hec60hqqkaaj4

Humans aren’t great learners. We should replace the learners with Ai since they’re so much better at it.

Here’s the key question. What value do humans provide which ai will never be able to replace which is a requirement for Ai to progress? Let me know when you have it and I’ll explain why it’s wrong.

I used to have a debate with artists in the entertainment industry about every 8 months about AI taking over visual creative jobs. 90% were deniers. Most of them had opinions that boiled down to “I just don’t believe things can go on without us,” even when an explanation for why it can was provided as a counterpoint to every point they had. I usually led the opposite camp. Nobody disagrees with me now.

Values

But I also said “currently” for how Ai models are built, on purpose.

AI is great at reproducing art,

But it will never ENJOY making art.

That’s something we humans can do.

-What is “enjoy”?

-How is it required for Ai to progress?

-Can you prove that Ai does not & will never “enjoy”?

I’ll chime in.

You state…”What value do humans provide which ai will never be able to replace which is a requirement for Ai to progress?”

Some answers are below:

Love, compassion, respect, empathy, and emotional intelligence.

Knowing when to hug another human at the right time.

Teaching life lessons.

Giving lessons about experiences that made us stronger.

Motivational speeches based on someone’s tragedy to triumph.

Inspiration

Improvisation

I don’t want you to get the idea I don’t deeply appreciate and respect all these things. They are the softer, more nurture-based things we as humans need. However, much of what you list here is not required in any way by Ai for it to progress (by progress I don’t mean in alignment with typical human values; I simply mean to advance in capabilities and power. I’ll use the word advance instead). A tyrant can advance despite the absence of these things.

Claims here are unsubstantiated.

Too much empathy becomes ruinous and hope can be misplaced. Say you believe in God… God allows for catastrophic failure for us to learn. We may hit breaking points with this when, to us, it goes beyond our imagination or belief of what a loving God will let us endure. Then we abandon our belief in him. But your 4-year-old and your 16-year-old go through the same thing with their parents when disciplined heavily and for their own good.

Ai is a tool; it can benefit and it can destroy completely. None of these listed things prove otherwise.

I answered your question clearly. You asked what can humans do that AI could never do. I listed them.

I say this in the kindest way…you wrote a poorly worded question. What you are now implying wasn’t clearly conveyed in your question.

If you’re asking about what “job” can a human do that never can be replicated by AI, I would answer it is one that requires empathy or improvisation.

A doctor has to improvise. How would AI improvise during surgery? A medical technician gives CPR. How would AI know when to give mouth-to-mouth? The Heimlich maneuver is performed slightly differently on pregnant women. Would a robot clearly be able to identify a pregnant woman and perform the maneuver appropriately?

Lastly, the end of your response is all over the place. If you want to have an engaging debate, it’s important to thoroughly articulate your thoughts in written format.

Now, would you like to rewrite your question clearly so I can address it? Honestly, AI can help you. I’m not being facetious. I’m serious. Toss your question into ChapGTP (spelled incorrectly on purpose) and see if it helps.

Also, clearly explain how the “claims” I wrote are unsubstantiated? Be concise and clear.

My question was well written.

”What value do humans provide which ai will never be able to replace which is a requirement for Ai to progress ?”

-Love, compassion, respect, empathy, and emotional intelligence.

: We don’t know that they’ll ever feel it but they will be able to provide or convincingly simulate this in many ways. Humans will feel it. Just one powerful example already: https://nypost.com/2024/10/23/us-news/florida-boy-14-killed-himself-after-falling-in-love-with-game-of-thrones-a-i-chatbot-lawsuit/

-Knowing when to hug another human at the right time.

: I’d bet they’ll actually be better at this than most humans, especially men

-Teaching life lessons.

: Again, they’ll be better at this than most humans. Though it’s fair to say it may not be as meaningful as when done by a biological parent.

-Giving lessons about experiences that made us stronger.

: They’re already doing this

-Motivational speeches based on someone’s tragedy to triumph.

: They’re already doing this

-Inspiration

: They’re already doing this

-Improvisation

: They’re already doing this. Improvisation and creativity are what we do when we need an “original” idea. Humans do this by leveraging and mixing all the information already in their heads. Artists’ work and personal styles are a result of visuals they’ve seen before. This is exactly how AI works, only at a much faster rate and often with better results. Compare the artwork at Midjourney to the art at ArtStation and consider that Midjourney has many orders of magnitude more unique content, and it was all generated in 1/10th the time.

: Also, none of the above are required for Ai to progress and advance.

We can respectfully agree to disagree. 🤝

I would actually also say humans are great learners, and the existence of society is my proof of that. Especially when a claim to the contrary is a good example of just kinda arbitrary “human bashing,” when our very and only standard for what intelligence/learning even is… is us. So by what standard do humans suck at it, when our intelligence is literally the only intelligence available to even make that judgement?

Computers are great at memorizing and learning things we give them. But a computer won’t be able to make a value judgement that isn’t arbitrary, based on other things it has been given.

In other words, a computer will never care about anything and without human input to the contrary, has no reason to value doing something and spending energy over turning off and doing nothing until it receives an input.

Not sure what the coming **decades** will bring ;) but if we play it smart, I see AI helping us progress rather than harming us. And yeah, I agree that the idea ‘humans will **have** nothing to do’ sounds way too exaggerated. But! It all depends on who’s running the system: fiat-based greed and crap (or worse, CBDCs) or sovereign individuals. In the first scenario, I see AI becoming like Google, TV, Netflix, and Co. 3.0 (all in one, more powerful, more addictive, mentally dangerous)—just another dopamine/brainwashing machine designed to keep the masses in its grip (think an upgraded version of IN TIME, the movie). We’ll see what the future brings. Also, everything has its bright sides and downsides. There may be other issues than ‘AI taking over our jobs’, at least for a while (energy consumption, data storage, stronger signal required, etc.). I cannot predict how it really goes. Nobody really can. Also, I’m not too stressed even if the gloomy scenario happens; simply, I know what I want from my life :))) (i.e., my time on this planet) and, if needed, where I’d rather be—certainly not among the enslaved crowd. My 2 sats.

As an aside to the other small thread I have here, let me just give my strong beliefs on the topic of “Ai taking over everything”…

Can it happen? Yes.

Could Ai destroy us all? Yes

Will it? Some, definitely not all.

It’s all very simple. Unlike animals, we imagine and create tools that multiply the power we have. Power can benefit and power can destroy. It all hinges on one very important question.

Can we keep those tools responsibly under control?

If not we’ll learn the hard way. But don’t worry, death is not the end.

Ai can be dangerous if it won’t just work on our data but starts to create new data that we hadn’t discovered yet. This is something that could be called a milestone, and it could produce a chain reaction of Ai self-upgrading.

I’m not entirely sure that’s possible anymore, based on how they work. There’s an interesting paper about “LLM Model Dementia” where a model tries to train itself based on generated information or images, but it only skews the model while taking away “resolution” from some other part/aspect of the model.

The issue is similar to a story about a man who had brain damage during a tumor surgery that removed a critical area of his brain that completely divorced his feelings from his reasoning. He could logically assess different options and data like he always could, but he was incapable of making a decision, because there was no reason to care which option was better. He was as smart as ever, but was completely frozen when making the simplest of decisions, like choosing a blue or red pen. And he didn’t even have a reason to be frustrated or emotional about not being able to make the decision itself, so he had no reason to get “unstuck” no matter how well it was explained.

I use this example because it suggests values and judgement (decisions and goals) are inherent to our *emotional* relationship with the world. So the model will not produce a goal of its own in our absence, because it’s not alive nor is it mortal (no value to weigh against nor life to sustain). Without us, there’s absolutely no reason for it to do anything or have a concern or care of any kind.

Thus, when a model “trains itself,” it doesn’t care or have any mode of deciding what is a “good” or a “bad” output without a human to tell it. Why would it LIKE a picture of a cat more than random pixel noise? It wouldn’t. Therefore self training will only devolve the model back to nothingness eventually, the longer it goes without input from a living human who cares about something, and so can care or value one output over another.
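
A rough way to see the mechanics of that degradation (a toy sketch of my own, not the setup from the paper mentioned above): repeatedly re-estimate a distribution using only samples drawn from the previous estimate. With no fresh input from the real world, the rare cases tend to disappear and never come back.

```python
# Toy illustration of recursive self-training (my own sketch, not the cited
# paper's method): each "generation" is fit only on samples produced by the
# previous generation. Rare categories (the tails) tend to drop to zero
# probability, and once gone, nothing outside the loop reintroduces them.
import random
from collections import Counter

random.seed(42)

categories = list(range(10))
# Generation 0: the "real world" distribution, with a heavy head and thin tails.
probs = [0.3, 0.2, 0.15, 0.1, 0.08, 0.06, 0.05, 0.03, 0.02, 0.01]

for generation in range(15):
    surviving = sum(p > 0 for p in probs)
    print(f"gen {generation:2d}: {surviving} categories still represented")
    # "Train" the next generation only on the current generation's outputs.
    samples = random.choices(categories, weights=probs, k=200)
    counts = Counter(samples)
    probs = [counts.get(c, 0) / len(samples) for c in categories]
```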

My AI clone rn 😂

Interesting 2 sats. Thank you for sharing. I think practicing stoicism will be of importance.

All AI tech so far is good at being very average. Until there’s a breakthrough that makes AI smarter than the data it’s trained on, only below-average people will be made obsolete, and they already work in service jobs, which require arguably much bigger leaps in robotics to be made obsolete.

It will be either something hard for Ai to do or expensive for some reason; otherwise it can be done by Ai for a fraction of the cost.

Reminds me a bit of what Yuval Noah Harari says about the rise of the useless class. Teaching my girls not to be consuming parasites is one of my goals.

This is such a great point! AI can only go as far as the data we feed it, and without the constant flow of new human insights, it would stagnate. The idea of humans being essential for generating unique perspectives to train AI really makes sense. We’re not becoming obsolete, we’re evolving into the role of contributors to an ever-growing digital ecosystem. The future is about collaboration between humans and AI, where both learn and push each other forward.

nostr:nevent1qqstkcj3h8zwk0aexr0rs3hhnv92udc0fw52cllv88ta5eltcxarxcqpz3mhxue69uhhyetvv9ujuerpd46hxtnfducjh80x

Problem...

There is a mathematical formula that all AI engineers are afraid of. This formula basically means AGI or superintelligence is not possible, which is a good thing.

Interesting, what is this exactly?

L(N, D) = [(N_c/N)^(α_N/α_D) + D_c/D]^(α_D), if I recall right.
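
The formula above matches the loss scaling law from Kaplan et al.’s “Scaling Laws for Neural Language Models” (2020). Here is a minimal sketch of evaluating it, assuming that attribution and using the paper’s approximate fitted constants; the numbers are illustrative, not exact:

```python
# Minimal evaluation of the scaling law L(N, D) quoted above, assuming it is
# the Kaplan et al. (2020) form. Constants are the paper's approximate fits
# and are illustrative only.

N_C = 8.8e13      # parameter-count constant (approximate)
D_C = 5.4e13      # data-size constant (approximate, in tokens)
ALPHA_N = 0.076   # parameter exponent
ALPHA_D = 0.095   # data exponent

def loss(n_params: float, n_tokens: float) -> float:
    """L(N, D) = [(N_c / N)^(alpha_N / alpha_D) + D_c / D]^alpha_D"""
    return ((N_C / n_params) ** (ALPHA_N / ALPHA_D) + D_C / n_tokens) ** ALPHA_D

# Example: a 1B-parameter model trained on 100B tokens.
print(loss(1e9, 100e9))
```

In itself this just describes diminishing returns of loss as model size and data grow.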

For the knowledge-workers/digital economy I think you're on the right track, Guy. Don't, however, forget the physical world, where houses get built, rivers are dammed, electrical power is generated (and distributed), meals are cooked, injuries are treated, etc etc etc

Not saying innovation doesn't or can't occur in those domains because it clearly does. But there is value in work (as we all ack), and plenty of that work isn't inventive or new or exciting .. just necessary and important and unending :-)

Of course. Blue-collar work is about to get really cool again. But then there will also be robotics to tackle those jobs at scale. It’s going to be a seriously crazy transition for the next couple of decades.

I studied AI as an undergraduate in the early ’80s when it was “about to take off”; robotics, like AI has demonstrated, will have a very long period between inception and wide use.

Agree it’s irrevocably coming though, much like Bitcoin

Yeah agreed. The “optimization and rollout” phase will take a lot longer than people imagine, imo. But I do think we are moving into that period very quickly. If Elon Musk survives, he says he’s gonna have a robot on the market within the decade at “personal car prices.”

AI-related human flourishing/productivity ONLY increases:

IF we use the saved time to make something… this is a big IF and is way overestimated IMHO.

Many/most will just take a "long lunch break"

We have to make more stuff/value - real work, in the real world

💯

This is one reason why I think prognostications of AI obsoleting humanity or replacing most work are far-fetched.

I'm not denying seismic shifts or creative destruction. I'm also not doubting the level of sophistication AI or robotics will reach. But I think a lot of people undervalue the deeply human element required in a lot of areas of work. And assume too much about what AI will be capable of.

A majority of the work that produces real value requires things that AI and robotics will never have (empathy, intuition, moral reasoning, etc.), and I can easily see a period of companies, and even governments, discounting this only to realize “oh, wait, maybe we need more actual humans involved in this work than we thought.”