Replying to Kajoozie Maflingo

Because an LLM is just a series of finely tuned knobs, roughly 24 billion of them. It takes in a prompt, passes it through those 24B parameters, and samples each output token from the resulting probability distribution; the only randomness is in that sampling step, not in the forward pass itself. At no point does it have the capacity to act freely. It won't even produce a response until it's prompted to. It cannot take over society or escape a physical host because it cannot want to do anything. But we can choose to replace human judgment with AI for important decisions. We must not make that choice.
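A minimal sketch of where the randomness actually enters generation, since this is often misunderstood. The forward pass through the weights is deterministic; the model outputs scores (logits) over its vocabulary, and the sampling step picks a token at random from that distribution. The logit values and vocabulary size here are made-up placeholders, not anything from a real model.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Turn raw scores into a probability distribution.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(logits, temperature=1.0):
    # This is the only random step: the weights themselves
    # computed the logits deterministically from the prompt.
    probs = softmax(logits, temperature)
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

# Hypothetical scores over a tiny 3-token vocabulary:
token_id = sample_next_token([1.0, 2.0, 3.0])
```

Lower temperature sharpens the distribution toward the highest-scoring token; at temperature near zero, generation becomes effectively deterministic.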

mcshane 1y ago

not speaking about LLMs tho necessarily. if consciousness arose physically it can be replicated.


Discussion

ᴛʜᴇ ᴅᴇᴀᴛʜ ᴏꜰ ᴍʟᴇᴋᴜ 1y ago

You don't understand combinatorial complexity.

Yes, but good luck finding the 50+ decimal number that specifies the result.

And we wetware flesh robots are the same, except we are much more advanced.
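The "50+ decimal digits" point is easy to sanity-check: a system with only 170 independent binary degrees of freedom already has more than 10^50 distinct configurations, far beyond any exhaustive search. The 170 here is just an illustrative number, not a claim about any particular system.

```python
def num_states(binary_degrees_of_freedom):
    # Each independent binary choice doubles the state space.
    return 2 ** binary_degrees_of_freedom

def decimal_digits(n):
    # Count the decimal digits of a positive integer.
    return len(str(n))

# 170 binary choices already exceed 10**50 configurations:
states = num_states(170)
```

So even a tiny fraction of 24 billion parameters, coarsely quantized, specifies a state space with far more than 50 decimal digits.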
