Replying to Ricemoon

Caitlin Johnstone, for once with a more personal but nevertheless insightful post on X:

"Had a medical incident in my family the other day. It's funny what a reminder of human mortality can do to dispel all the little resentments and dramas that can build up between loved ones over the years and cause all the old grievances to be seen for the insignificant mind fluff that they are.

And right now I feel sorrowful that it so often takes a major health scare or accident to remind us of this. We all know we're all going to die, but we let the small stuff come between us anyway. We let the little quibbles in our heads stop us from touching hands and experiencing intimacy with each other during our fleeting time on this beautiful planet.

In the play Waiting for Godot, Beckett writes that our mothers "give birth astride of a grave," and it's just so true.

"They give birth astride of a grave, the light gleams an instant, then it's night once more," the character Pozzo laments.

The line resonates because that really is what the human experience feels like. We get a short time here, and then we're gone.

How bizarre is it, then, that we still find time to hate each other? That we still have time for grudges and resentment? That our mothers give birth astride of a grave, and we punch and kick each other on the way down?

Bukowski said, "We're all going to die, all of us, what a circus! That alone should make us love each other but it doesn't. We are terrorized and flattened by trivialities, we are eaten up by nothing."

It's about the weirdest thing you could possibly imagine."

1. What a playwright's character says is not necessarily what the playwright believes (there is some conflation in Caitlin's piece). The Pozzo quote is very nihilistic and I personally reject it (and find it a very demeaning image).

2. I always find the 'life is short' line very misleading, along with the 'YOLO' exhortation. Life is very long; the spirit is eternal.


Discussion

Interesting view, and obviously you have your Beckett at hand. I read it in school (I think we even performed it), which is by now almost 40 years ago, so it's a bit dusty.

To your second point, I see it the same way. I believe what is written in the Bhagavad Gita:

"Never was there a time when you did not exist, and there never will be a time when you cease to exist."

Yes, the Bhagavad Gita quote is spot on and is intuitively true to me - that consciousness is eternal. It makes as little sense that our consciousness 'appears' when we are born as it does that it disappears when we die.

(Also why I reject the idea that consciousness can 'emerge' from a machine, as the 'AI' cultists believe, but that is another matter.)

If I understand correctly, the typical AI crowd only likes to deal with weak emergence, whereas strong emergence would add something significant to the concept.

I reject completely the concepts of 'artificial intelligence' and 'emergence'. So I'm not particularly interested in distinctions between weak or strong 'emergence' or the distinction one sometimes sees between 'AI' and 'AGI'. For me it is all magical thinking.

That's not to say I deny that machine learning is possible; it obviously is, and it is a very powerful technology that will have a tremendous, society-changing impact. And LLMs are also powerful tools, along with other machine analysis & generation tools.

IOW, I think the language here is important (eg Machine Learning vs 'AI') as it helps to avoid succumbing to what I call 'Science Fiction brain'.

p.s. 'Artificial intelligence' and 'emergence' are fine when they appear as tropes in science fiction, where they serve the same role as magic has for centuries in folk and fairy tales. The mistake with 'Science Fiction brain' is in thinking these fictional constructs are possibilities in 'the Future', perhaps in the same way people once mistook the magic-making of fairy tales for a real possibility.

Hah

Is this from a client that summarises replies? Looks useful.

Btw, for some reason I haven't been getting notifications on yr replies recently. I'll need to look into this as I value my exchanges with you here.

Summarize Notifications on iOS generates these, for nostr:npub18m76awca3y37hkvuneavuw6pjj4525fw90necxmadrvjg0sdy6qsngq955 in this case. I mostly find its clumsiness amusing, though I do appreciate that the routines supposedly run fully locally.

Regarding notifications, I have occasionally wondered whether my relay list somehow limits or slows my post distribution. The list seems in line with other users' here, though, and the relays are pretty consistently reachable.

I agree, and have thought similarly regarding machine learning, admittedly influenced by Apple consistently employing the term until its recent marketing pivot.

Regarding emergence, can we replace it with the term self organization in order to ground it?

Offhand I can discern nothing new or useful in your characterizations; the scale and direction of self organizing structures seem like a valid and practical theory to me.


> Regarding emergence, can we replace it with the term self organization in order to ground it?

Thanks for this suggestion, I'll look into it. Frankly my understanding of how 'emergence' is used may be uninformed. I had assumed it referred simply to the emergence of a conscious intelligence (the possibility of which I reject). However if it refers to the emergence of self organisation then I am much more open to that. Perhaps that is the distinction you are referring to between weak and strong emergence? Anyway, it appears I need to do some reading here.

Yeah, I would have to read up again for more precise and dependable explanations. The emergence theory I encountered, a decade or longer ago and unrelated to any mass-product marketing, concerned itself with self organization and made no presumption to immediately model and explain consciousness. It did play with the notion of cellular automata, which I find uncontroversial, if of limited marketability.
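For what it's worth, the kind of weak emergence usually illustrated with cellular automata can be sketched in a few lines of Python. The specific rule (Rule 30) and grid size are my own illustrative choices; the thread doesn't name any:

```python
# Minimal sketch of an elementary 1-D cellular automaton.
# Rule 30 is an arbitrary illustrative choice: each cell's next state is
# looked up from the bits of the rule number, indexed by its neighborhood.

def step(cells, rule=30):
    """Advance one generation (cells beyond the edges are treated as 0)."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        pattern = (left << 2) | (cells[i] << 1) | right  # 3-bit neighborhood
        out.append((rule >> pattern) & 1)                # bit of the rule number
    return out

# Start from a single live cell; complex structure appears from a trivial
# local rule, which is all that "weak emergence" claims.
row = [0] * 31
row[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Nothing in this, of course, says anything about consciousness; it only shows patterned structure arising from purely local rules, which is the uncontroversial, self-organization sense of the term.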

I would happily hear your thoughts on the weak and strong emergence distinction.

Isn't it possible, given that over (as far as we know) 13+ billion years the universe has become a vastly complex thing far beyond human comprehension, and that the human brain is a vastly complex system, that consciousness, emotion, our concept of 'soul' etc. are all just output of that system, which we can only interpret as such given our limited ability to understand it? And that when the system dies, so does our consciousness? I haven't read the Bhagavad Gita, but this idea doesn't contradict the passage quoted (which doesn't specify consciousness), since our atoms were and will be transformed into other things, and the 1st Law of Thermodynamics says our energy can't be destroyed.

You're describing a materialist philosophy: that the material world we perceive is reality, and all that there is.

If you want to interpret the Gita quote through a materialist 'lens' that's fine, although it would be a heterodox reading.