Wow… https://futurism.com/commitment-jail-chatgpt-psychosis


Discussion

😳

What’s surprising about this? Broken society and culture acting broken and crazy? Pretty run of the mill.

Man, I did not have this one on my bingo card.

Psychosis from talking with AI.

I will say this ad nauseam until it happens: the lawsuit against the companies building LLMs will be the largest lawsuit in history, and simultaneously the easiest to win.

I'm not a lawyer, I'm probably the furthest thing from one, but if my statement isn't correct, I don't know what is.

Nothing of value comes out of LLMs. The companies building LLMs are complicit in the destruction of society.

Take a motherloving math class, for Heaven's sake. Why are we putting so much trust in statistical analysis and vector multiplication?

Let's play a game: what word comes after "would you please stop trusting dice rolls to provide you with factual"? Is it "cheese"? No? Surely it's "Koala"? No? "Information," you say? Congratulations, now you understand how an LLM works.

It's just a vector multiplication followed by a dice roll. You don't need to understand more than that. There is more behind it; I have read countless papers trying to understand why people love LLMs, and it is slightly more complicated, but not by much.
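
If you want that one-sentence version in code, here is a minimal sketch of "a vector multiplication followed by a dice roll". Every name and number in it is made up for illustration; it is not any real model's weights, just the shape of the operation.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["cheese", "koala", "information"]     # toy 3-word dictionary
hidden = rng.normal(size=4)                    # state after reading the prompt (invented)
W = rng.normal(size=(4, 3))                    # learned weights, i.e. "the matrix" (invented)

logits = hidden @ W                            # the vector multiplication
probs = np.exp(logits) / np.exp(logits).sum()  # turn scores into probabilities
next_word = rng.choice(vocab, p=probs)         # the dice roll
print(next_word)
```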

---

Imagine you opened your dictionary and wrote down every single word with its index on a playing card: "1. a", "2. aardvark", and so on, until you have every word.

Next to each word, write the number 0.5. You should now have a vector of size n, where n is the number of words in the dictionary, with all the values set to 0.5.

Now do this again until each card has a pairing with every other card. In other words, you square the number of cards.

With a 50k-word dictionary you should have 2.5 billion cards, all with 0.5 written on them.

Find your favorite book and start reading. For each word we're going to play a game: take the card with the corresponding word and the card with the next word on it, put the second on top of the first, and increase the number on the second card by 0.01.

For example: "Once upon a time"

Once (0.5) -> upon (0.51)

upon (0.5) -> a (0.51)

a (0.5) -> time (0.51)

And you decrease the value for every other card. So for "a", the card for its neighbor "aardvark" should now be 0.49.

Every time you find the same combination you increase that number and decrease the others. Do this for every word in your book. Soon it will become apparent that certain word combinations score very high and others very low. The combination between "catapult" and "fish" is probably below 0.

At this point you can discard all the cards on the left, the ones with 0.5 still written on them; they did their job already. You now have an LLM with half a neuron.
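
Here is that card game as a toy script, essentially a bigram table. The tiny "book" is invented, and the +0.01 / -0.01 update rule is taken straight from the description above; none of this is how real training works, it is just the analogy made executable.

```python
text = "once upon a time once upon a dream".split()
vocab = sorted(set(text))

# every (word, next word) card starts at 0.5
table = {(w1, w2): 0.5 for w1 in vocab for w2 in vocab}

for current, nxt in zip(text, text[1:]):
    for w in vocab:
        if w == nxt:
            table[(current, w)] += 0.01   # the pair we actually read
        else:
            table[(current, w)] -= 0.01   # every other card for this word

print(table[("once", "upon")])    # ends up above 0.5
print(table[("once", "dream")])   # ends up below 0.5
```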

It's now time to turn it into a full neuron (a "parameter").

You take a second book, and do the process again, but using the existing values.

Now you should have two numbers for each pairing, such as 0.76 -> 0.33. This is your first neuron!

You have 2 vectors now: a matrix.

Take a third book and make a new set of 0.5 cards. This time you will look at chains of 3 words, such as "The cat is sleeping":

The (0.13) -> cat (0.6) -> is (0.22)

cat (0.7) -> is (0.4) -> sleeping (0.12).

Once you have completed the book you have 2 neurons!

Now imagine you do this 125k times, every time increasing the length of the word chains (4, 5, 6, and so on).

Now you have an LLM with 125k neurons; this is your context window.
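
To make the "longer and longer word chains" idea concrete, here is a rough sketch that counts how often a word follows a fixed-length window of previous words; the window length is playing the role of the context window here. The sample text is invented.

```python
from collections import defaultdict

def count_ngrams(words, context_len):
    """Count how often each word follows every window of `context_len` words."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(words) - context_len):
        context = tuple(words[i:i + context_len])
        counts[context][words[i + context_len]] += 1
    return counts

text = "the cat is sleeping and the cat is purring".split()
print(dict(count_ngrams(text, 3)[("the", "cat", "is")]))
# {'sleeping': 1, 'purring': 1}
```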

Repeat the process from step 1 onward; do this a few trillion times.

Wait for the user to give you a prompt, find its position in your matrix, and rotate the matrix so that that vector is at position 0, the word coming next is at position 1, and so on.

Finally, manufacture a die with as many sides as there are words in your dictionary. Make sure the die can give preference to certain words, and that those preferences can be dynamically altered.

Now let's play the game: roll the die and say the word corresponding to the index. Change the weights on your die to follow the next pattern (a higher value in the vector means that word should be more likely to come next). Keep going until you reach the number of words you want to say!
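
And the final dice game itself, as a minimal sketch: roll a weighted die over the dictionary, say the word it lands on, re-weight, repeat. The weights below are invented for illustration; in the card game they would come from the numbers you wrote on the cards.

```python
import random

# invented weights over a 4-word "dictionary"
table = {
    "once": {"upon": 0.9, "a": 0.05, "time": 0.05},
    "upon": {"a": 0.95, "time": 0.03, "once": 0.02},
    "a":    {"time": 0.9, "once": 0.05, "upon": 0.05},
    "time": {"once": 0.5, "a": 0.3, "upon": 0.2},
}

word = "once"
output = [word]
for _ in range(5):   # keep rolling until you have enough words
    weights = table[word]
    word = random.choices(list(weights), weights=list(weights.values()))[0]
    output.append(word)

print(" ".join(output))   # e.g. "once upon a time once upon"
```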

---

Congratulations 🥳🥳🥳 you created an LLM! Yes, you missed a few steps: you are GPT-2, you don't have the magic that modern LLMs have, and you didn't do RL (that's slightly more complicated), but you have an LLM.

Are you happy? Do you suddenly trust it? Can your dice game tell you something you didn't know? Is your dice game intelligent? Does your dice game have feelings? Is it perhaps scared that you're going to burn the cards? Should you trust it when it tells you that it loves you and that you should drink the poison so you can be together in the afterlife?

nostr:nevent1qqstce8m9gw78p9app5ztaqcaq7kku2lpf6g9jv5lxns84d5l799qgcpr9mhxue69uhhyetvv9ujumn0wdnxcctjv5hxxmmd9upzp4y6jq36y8d6rv7gxpk2x6dlxfpa3dzt3u9k6yvkvplhkzvsl2xlqvzqqqqqqyutt7uu