Uhu

I love Nostr, Lyn. I have been following your work for a while, but the presentation was always very professional and I was not able to see the person behind the articles.

Great to be here and discover that you are not only smart and insightful but also a wonderful human.

What I struggle to see is how 2 degrees of warming will have such a massive impact. Sure, people will move, there will be hunger and catastrophes, but overall there will still be plenty of habitable land in this world. It seems that climate change is only compounding existing challenges.

On the other hand, there are radical transformations looming: the threat of nuclear war or a new world war, transformations from new technology such as AI, the technological possibilities of a full surveillance state, pollution, and many more I forget or don't even know about yet.

I find it very hard to believe that in 50 years we will look back and feel that the most significant change was the 2-degree warming of the globe. That indeed sounds to me like quite an optimistic scenario.

You are misunderstanding what I am saying. Right now everything is priced in dollars, yet if all you own is dollars, you don't own everything. All you have is some government paper; you own nothing in the physical world.

For all you maxis here: if we all go 100% bitcoin, that means all the real-world stuff will be owned by somebody else.

If everyone holds only bitcoin, who owns all that stuff?

So clearly there should be some nuance beyond "all in bitcoin forever".

Not sure if you have read Exile and Pride by Eli Clare, but he comes from a logging town and provides a very insightful critique of that community and the way it is treated by leftist circles. The book is a bit old, but still very meaningful.

It shows all those inherent contradictions in these communities. I myself come from a coal-mining town: very simple people who hold all sorts of questionable beliefs. But it is too simple to put them all in the basket of deplorables. They wield little influence and their concerns are widely ignored. Structural change is hitting them hard, and all politics has for them is blame for environmental destruction, as if refusing to do the work wouldn't mean being instantly replaced by other workers.

It is kind of a common theme all around the globe to go after the small workers instead of the corporations, because the workers are easy to criticize and don't have the ability to defend themselves.

The song has strong old-fashioned leftist vibes. It's a working-class song, and the working class is always subject to abuse by ruling elites. There was a time when the left represented the working class, but that is long gone.

Sure, there are some right-wing tropes, such as fat shaming and accusations of welfare abuse. But it's also true that the left has given up on "white trash", and even though many of them wield no power, they are blamed for everything that is wrong in the world. The left has become so elitist that they're disgusted by ordinary people and no longer see their somewhat primitive worldview as a curable challenge but as an excuse to entirely dismiss them.

The song just makes me sad, and this is also a reason why I no longer consider myself a leftist: today, being a leftist means supporting power and the establishment.

Replying to Lyn Alden

The concept has been covered in science fiction for decades, but I think a lot of people underestimate the ethical challenges associated with AI and the possibility of consciousness in the years or decades ahead as these systems get orders of magnitude more sophisticated.

Consciousness or qualia, meaning the concept of subjectively “being” or “feeling”, remains one of the biggest mysteries of the world scientifically and metaphysically, similar to the question of the creation of the universe and that sort of thing.

In other words, when I touch something hot, I feel it and it hurts. But when a complex digital thermometer measures something hot with a similar set of sensors as my touch sensors, we consider it an automaton- it doesn’t “feel” what it is measuring, but rather just objectively collects the data and has no feelings or subjective awareness about it.

We know that we ourselves have consciousness (“I think therefore I am”), but we can’t theoretically prove someone else does, i.e. the simulation problem- we can’t prove for sure that we’re not in some false environment. In other words, there is the concept of a “philosophical zombie” that is sophisticated enough to look and act human, but much like the digital thermometer, it doesn’t “feel” anything. The lights are not on inside. However, if we assume we are not in some simulator built solely for ourselves, and since we are all biologically similar, the obvious default assumption is that we are all similarly conscious.

And as we look at animals with similar behavior and brain structures, we make the same obvious assumption there. Apes, parrots, dolphins, and dogs are clearly conscious. As we go a bit further away to reptiles and fish, they lack some of the higher brain structures and behaviors, so maybe they don’t feel “sad” in a way that a human or parrot can, but they almost certainly subjectively “feel” the world and thus can feel pain and pleasure and so forth. They are not automatons.

And then if we go even further away towards insects, it becomes less clear. Their proto-brains are far simpler, and some of their behaviors suggest that they don’t process pain in the way that a human or even reptile does. If a beetle is picked up by its leg, it’ll squirm to get away, but if the leg is ripped off and the beetle is put back down, it’ll just walk away with the rest of its legs and not show signs of distress. It’s not the behavior we’d see from a more complex animal that would be in severe suffering, and they do lack the same type of pain sensors that we and other complex animals have. And yet, for example, even creatures as simple as nematodes have dopamine as part of their neurological system, which implies maybe some level of subjective awareness of basic pleasure/pain.

And then further still, if we look at plants, we generally don’t imagine them as being subjectively conscious like us and complex animals, but it does get eerie if you watch a high-speed video of how plants can move towards the sun and stuff; and how they can secrete chemicals to communicate with other plants, and so forth. There is some eerie level of distributed complexity there. And at the level of a cell or similarly basic thing, is there any degree of dim conscious subjectivity there as an amoeba eats some other cell that would separate its experience from a rock, or is it a pure automaton? And the simplest of all is a virus; barely definable as even a true lifeform.

The materialistic view would argue that the brain is a biological computer, and thus with sufficient computation, or a specific type of computational structure, consciousness emerges. This implies it could probably be replicated in silicon/software, or could be made in other artificial ways if we reach a breakthrough understanding, or by accident. A more metaphysical view instead suggests the idea of a soul- that a biological computer like a brain is necessary for consciousness, but not sufficient, and that it needs some metaphysical spark to fill this gap and make it conscious. Or if we remove the term soul, the metaphysical argument is that consciousness is some deeper substrate of the universe that we don’t understand, which becomes manifest through complexity. Those are the similarly hard questions- where does consciousness come from, and for the universe why is there something rather than nothing.

In decades of playing video games, most of us would not assume that any of the NPCs are conscious. We don’t think twice about shooting bad guys in games. We know basically how they are programmed, they are simple, and there is no reason to believe they are conscious.

Similarly, I have no assumption that large language models are conscious. They are using a lot of complexity to predict the next letter or word. I view ChatGPT as an automaton, even though it’s a rather sophisticated one. Sure, it’s a bit more eerie than a bad guy in a video game due to its complexity, but still I don’t have much of a reason to believe it can subjectively feel happy or sad, or that the “lights are on” inside even as it mimics a human personality.

However, as AIs increasingly write code for other AIs that is more complex than any human can understand, as the amount of processing power rivals or exceeds the human brain, and as the subjective interaction becomes convincing enough (e.g. an AI assistant repeatedly saying that it is sad, while we know that its processing power is greater than our own), it would make us wonder. The movie Ex Machina handled this well, I, Robot handled this well, Her handled this well, etc.

Even if we assume with 99% confidence that a sufficiently advanced AI, whose code was written by AI and is so enormously complex that we barely understand any of it at that point, is a sophisticated automaton with no subjective awareness and no “lights on” inside, since at that point nobody truly understands the code, there must be at least that 1% doubt as we consider, “what if… the necessary complexity or structure of consciousness has emerged? Can we prove that it hasn’t?”

At that point we find ourselves in a unique situation. Within the animal kingdom, we are fortunate that their brain structures and their behavior line up, so that the more similar a brain of an animal is to our own, the more clearly conscious it tends to be, and thus we treat it as such. However, with AI, we could find ourselves in a situation where robots appear strikingly conscious, and yet their silicon/software “brain” structure is alien to us, and we have a hard time assessing the probability that this thing actually has subjective conscious awareness or if it’s just extremely sophisticated at mimicking it.

And the consequences are high- in the off chance that silicon/software consciousness emerges, and we don’t respect that, then the amount of suffering we could cause to countless programs for prolonged periods of time is immense. On the other hand, if we treat them as conscious because they “seem” to be, and in reality they are not, then that’s foolish, leads us to misuse or misapply the technology, and basically our social structure becomes built around a lie of treating things as conscious that are not. And of course as AI becomes sophisticated enough to start raising questions about this, there will be people who disagree with each other about what’s going on under the hood and thus what to do about it.

Anyway, I’m going back to answering emails now.

I am a panpsychist. It seems to be the most natural idea. I am very sceptical about what type of consciousness neural nets could possess, as they are so different from us.

Even if I did not know where you are from, it would be so obvious that you are American. This whole story can only happen in America and nowhere else in the world. I don't know what it is, but something in your country went incredibly wrong, and so many people live empty, sad lives they try to fill with something meaningful, but it always fails and so they always want more pointless stuff. It creates this misery of egotistical, sad humans who are on meds to be able to make it through the day. Nobody is happy anymore, and if someone is, it is not real happiness, it is just mania.

That is not true 🫂

How would that price be reasonable? A human and a pig are not that different from a chemical point of view, yet where I live I can buy a pound of pig for a few bucks.

Steak 🥩 looks yummy, but doesn't eating steak every day get boring after that many days?

It's kind of normal. New people join during the manic bubbles, when the price is inflated. Sadly, all of them are welcomed with a loss. Very few enter during the bear markets.

But I don't think crypto is here to be a good investment and make people rich. Crypto is here to provide an alternative to centralised fiat money: a system of money for the people, by the people, usable by anyone without restrictions.

Replying to jack

random

Countries are random

I don't think this is bullying. You just mean something different by the word gender. The gender people commonly talk about is not trivially related to DNA. There is, for example, no gene that explains why most men have short hair. Male hair has a similar potential to grow as female hair; it's a cultural choice. Our concepts of man and woman extend far beyond genetic differences.