Nuclear war aside (since that's basically an ever-present #1 risk), what do you think the biggest global risks/challenges are over the next decade?
Discussion
Carbon hysterics trying to enslave everyone in the name of their religion "Mother Earth," where mother is the state and they all wanna suck on the tit that you have to pay for with your taxes while they take away all your freedoms and possibilities. That, and Manchester City winning the Champions League with Haaland 🤦🏻‍♂️
I'll take a nuke over that shit. A nuke doesn't try to pretend it is anything but a nuke, but this climate shit is rough medicine to swallow.
I think the reluctance to cede control away from the existing and increasingly defunct governance structures could lead to huge issues. Basically, people doubling down rather than moving with the trends.
1) Breakdown in trust of institutions. 2) People not understanding any of the risks you set out above.
A degradation of the wealth infrastructure.
Catastrophic cyber events, as mentioned by Jeremy Jurgens during the WEF Annual Meeting.
“Geopolitical instability makes a catastrophic cyber event likely in the next two years” — Jeremy Jurgens, WEF Annual Meeting, 2023
Aging population vs. declining birth rate.
I think you nailed it with your first three already.
One I’ll throw in that not many people talk about, perhaps because it could easily lead to bans on other platforms - the inevitable pendulum swing hard right leading to violence from young men.
Social and economic conditions are leaving young men with very little opportunity and meaning to pursue.
When it was milquetoast Jordan Peterson telling young men to make their bed and get their own shit together, it was fine. Then it was Andrew Tate telling young men that they're not special and that they have to go and be more extreme, which is already very borderline. Who comes next, and what is the evolution of that messaging?
If there is no purpose for these young men and the parasite class continues to demonise them, then society at large should expect them to react.
But they’re not going to react by signing up for the military to direct that energy fighting wars abroad for their countries - they won’t seek to serve the very people who screwed them - they will be looking to turn that violence inwards to the society which left them with nothing.
Violent right wing pendulum swings with disaffected young men will come this decade and they’re going to be scary.
One thing that you might consider, Lyn: the emergence of decent AI has really introduced a lot of uncertainty into the near and medium future (much more so than usual), and this likely makes it a lot harder to achieve a market consensus that truly represents reality.
There are a lot of divergent hypotheses out there, and that should make all funding more expensive for everyone. However… conviction is sky high! So funding is actually cheap, too cheap.
It’s quite likely that most people get rekt.
People are extremely confident about predicting first order disruptions of AI, but those will immediately be made obsolete by second and then third order disruptions.
A tiny group of people (maybe 5 people?) might emerge from this, owning absolutely everything.
Psycho politicians. All else is fixable.
The continued rise of processed food, destruction of nutrient dense fertile soil, and AI / Automation displacing the global workforce.
The growing number of people getting dumber and easier to manipulate.
Food security. Multiple concerns here, but consider corn monoculture and hot arid summers drawing down the aquifers. It’s fragile.
Authoritarian ideology.
AI is as much a concern as it is exciting. The concerns are two-fold.
One, we don't quite fully understand it yet, and hence have no control over or idea of its direction. For example, mechanistic interpretability research tries to work out how LLMs currently do what they do. But that work happens after the fact and runs slower than the innovation that's moving the field forward.
The second would be the impact on society. This is one of the most disruptive technologies ever. For sure, it will displace a lot of jobs. It's hard to imagine it replacing humans completely, because there will still be a human required in the loop. But if a company used to do a task with 6 people, it will probably be reduced to 2 people with the increased productivity provided by AI. Those 4 jobs will be lost. If this happens widely enough, will the newer jobs created by AI be big enough to replace the ones lost to AI?
Human stupidity, and not adopting Bitcoin reasonably fast. This is why I think we should focus much more on Bitcoin education across the world, in the places that need it most, and not on the fiat-manipulated price noise.
Man doesn’t live by bread alone. Humans have the ability to look up towards the sky. Animals don’t and they live by bread alone. But still we choose to live like animals. Deal with your Karma and perish. Satanic Kingdom is inherited by those who chose to be here for a purpose. What was that purpose? Look up to the skies.
I have 22 years of operational risk management experience
The biggest risks that impacted us were those that we NEVER predicted
The future depicted in the movie 'Idiocracy' becomes a reality.
Differentiating what's real vis-à-vis AI.
1. Terrorists releasing bioweapon.
2. Collapse of financial system.
3. Great Reset (see #2)
4. Govt deploying killer robots against population (see #3)
5. General social unrest and revolution due to energy concerns
6. Big Pharma
Extreme surveillance and censorship of the internet, especially if it comes to mandates of ISP-level filtering. AI-powered disinformation by governments. Maybe the IMF requiring a ban on Bitcoin from member countries (you probably know way more than I do about the power dynamics regarding that). If that could happen, Bitcoin would of course still work (except perhaps where and if blocked by the ISPs of said countries - even though I know a bit about networks, I'm no expert), but I would assume the value would fall drastically due to the severely reduced market.
While none of these things are lethal in themselves, the second-order effects will be: police violence, suicides, etc., and, if considering opportunity-cost-type effects, deaths that could have been prevented by the advancement of society.
lol