https://void.cat/d/YLsfYGmsATd4bwm9erBAkT.webp

https://www.youtube.com/watch?v=KCSsKV5F4xc

A truly fascinating discussion, something that has forced me to fundamentally rethink my approach. Thank you @liv_boeree. My company has been leveraging LLMs to make absolutely stunning breakthroughs in our game AIs. As has been leaked, our upcoming flagship has taken the sandbox concept to entirely new levels. Beta testers have been building lasting relationships with the NPCs in the game! Our playtest scores are through the roof, and this may be the most difficult business decision I have yet faced.

As described, I know that if we do not gain a first-mover advantage in this genre, we will be giving up so much to our competitors that we may be unable to recover. I am beginning to reach out to the leadership of other companies to explore the idea of a pause for evaluation. But, also as described, Moloch will work to undermine our trust in each other, even if we can come to an agreement.

I am not worried about the NPCs in our game taking over the real world somehow. I understand their limits. I have a separate fear.

What I worry about is the effect this will have on the people who enjoy our games. If all of your social imperatives can be fulfilled in the safety of a virtual world, is that good for us? As humans? To virtually fulfill your sense of achievement, of competition, of friendship, of love? With no real risk or stakes?

Even setting aside the addictive or destructive behaviors that can arise from taking games too far, do we run the risk of making real-world interactions too daunting for people to venture into? Are we creating a civilization of anxious humans who flee to comfort and safety at the slightest conflict?

It is time for fundamental reflection.

#singularity #moloch #ai #videogame #addiction #mentalhealth #LLM #NPC

image generated with hotpot.ai
