🧠 AI and society — who controls the story?

Historian Yuval Noah Harari warns that AI might reshape culture, politics, and even how we find meaning—faster than our schools, laws, and institutions can keep up. Think about it: AI already writes essays, drafts laws, edits videos, and recommends what we read and watch. That can be helpful, but it also concentrates power in whoever owns the biggest models, the most data, and the distribution platforms.

Here’s the question to keep in your pocket: when leaders talk about “AI safety,” are we protecting people—or building systems that centralize the story we’re allowed to see while quietly collecting everyone’s data? Safety is important. But “safety” can also become a label for policies that lock out competition, lock in surveillance, and outsource hard decisions to algorithms nobody can fully inspect.

What can you do without being a genius programmer? First, support open stuff: open models, open datasets (with consent!), and open tools that schools and communities can actually use and inspect. Second, demand receipts for media. If you’re shown a photo, video, or article, there should be a way to check where it came from (provenance) and whether it was AI-generated or edited. Third, keep access broad. Push for policies that let universities, co-ops, and small labs get enough compute (or credits) to compete, so AI isn’t only in the hands of a few mega-companies.
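
To make "receipts" concrete, here's a minimal sketch of what a provenance check can look like. It asks the open-source c2patool CLI (from the Content Authenticity Initiative) whether an image carries a Content Credentials (C2PA) manifest. You'd need to install the tool separately, and its exact output and exit codes vary by version, so treat this as an illustration rather than a full verifier.

```python
"""Check whether an image carries Content Credentials (C2PA provenance data).

A minimal sketch: it shells out to the open-source `c2patool` CLI, which
prints an image's embedded manifest as JSON when one is present.
Install the tool separately; output and exit codes can vary by version.
"""
import json
import subprocess
import sys


def check_provenance(path: str) -> None:
    # Ask c2patool to read the file's manifest store.
    result = subprocess.run(["c2patool", path], capture_output=True, text=True)

    if result.returncode != 0:
        # No manifest found (or the tool could not parse one): the file has
        # no verifiable provenance attached.
        print(f"{path}: no Content Credentials found")
        return

    try:
        manifest = json.loads(result.stdout)
    except json.JSONDecodeError:
        print(f"{path}: tool output was not JSON; inspect it manually")
        return

    # Show the manifest (who or what produced the file, and how it was edited).
    print(json.dumps(manifest, indent=2))


if __name__ == "__main__":
    check_provenance(sys.argv[1] if len(sys.argv) > 1 else "photo.jpg")
```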

And practice “algorithm hygiene.” Turn off creepy defaults. Reset recommendations. Follow a mix of sources, not just one feed. Learn how to spot manipulation: extreme headlines, rage-bait, or content that says “everyone is lying except us.” Remember, your attention is a vote. Spend it wisely. AI can help us do great things—but only if we keep humans in the loop, keep power checked, and keep asking good questions.
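
And if you'd rather assemble your own reading mix than trust one platform's ranking, here's a rough sketch that pulls headlines from several RSS feeds with the feedparser library and shuffles them together. The feed URLs are placeholders; swap in sources you actually want to follow.

```python
"""Mix headlines from several independent sources instead of one ranked feed.

A minimal sketch using the feedparser library (pip install feedparser).
The feed URLs below are placeholders; replace them with outlets you trust.
"""
import random

import feedparser

FEEDS = [
    "https://example.org/news/rss",      # placeholder: a wire service
    "https://example.net/science/feed",  # placeholder: a science desk
    "https://example.com/local/rss",     # placeholder: a local paper
]


def mixed_headlines(per_feed: int = 3) -> list[str]:
    headlines = []
    for url in FEEDS:
        parsed = feedparser.parse(url)
        for entry in parsed.entries[:per_feed]:
            title = entry.get("title", "(untitled)")
            link = entry.get("link", "")
            headlines.append(f"{title} | {link}")
    random.shuffle(headlines)  # no single outlet gets to lead every time
    return headlines


if __name__ == "__main__":
    for line in mixed_headlines():
        print(line)
```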

https://images.unsplash.com/photo-1519455953755-af066f52f1ea

#grownostr #news #AI #Society #Power
