Darwinian Demons: Climate Change and the AI Arms Race (with Kristian Rönn)

Sep 10, 2025 · 1h 17m · 17 insights
Kristian Rönn, entrepreneur and author of The Darwinian Trap, discusses global governance challenges in climate change and AI. He argues that AI is a meta-technology posing existential risks through arms races and gradual human disempowerment, and he advocates global chip governance and a shift in how we frame problem-solving.
Actionable Insights

1. Shift Problem-Solving Narrative

Shift the narrative from blaming individuals (corrupt politicians, greedy CEOs) to understanding and addressing the systemic “game” — the root incentives that reward harmful behavior. Targeting incentives rather than actors is more effective for solving the underlying problem.

2. Adopt Utilitarian Long-Termism

Care about all beings, including those in the future, and focus on global coordination to reduce catastrophic risks such as climate change, AI, and nuclear war, as this perspective is crucial for tackling complex, interconnected issues.

3. Prioritize AI for Climate Solutions

Recognize that aligned artificial intelligence could significantly simplify solving climate change by enabling breakthroughs in areas like tree planting, fusion energy, and carbon capture, making it a highly impactful area to focus on.

4. Govern AI Through Chip Controls

Advocate for and implement governance at the compute level (AI chips/GPUs) — for example, hardware-level checks that block the training of dangerous AI models unless safety evaluations have been passed — as compute offers a tractable lever for controlling powerful AI development.

5. Develop Specialized AI Tools

Shift AI development from creating general intelligences to building specialized tools (e.g., chess engines, protein folding algorithms) that excel in specific domains, making them easier to control and predict for safety.

6. Embrace Slow Technological Evolution

For long-term survival, civilizations should adopt a strategy of slow and careful technological innovation, investing heavily in predicting and simulating the implications of new technologies before deployment, to avoid “landmines” in the fitness landscape.

7. Increase Supply Chain Transparency for Safety

Foster greater transparency in technology supply chains (e.g., AI chips) to empower every actor to “vote” against dangerous uses, leveraging interconnectedness to enforce safety standards and prevent unilateral risky actions.

8. Utilize Export Controls for AI Safety

Governments, especially those with control over advanced AI chip production, should use export controls to enforce tracking and safety evaluations for AI models trained with these chips, incentivizing safer development globally.

9. Monitor AI Chip Locations

Implement systems to track the location and access of advanced AI chips (GPUs) to prevent their use by hostile nations or terrorist groups, as chip tracking is among the more tractable near-term AI governance problems.

10. Prioritize Safety Over Profit in AI

Recognize that financial incentives can override safety concerns in AI development, as seen with OpenAI’s board drama, and advocate for structures that prioritize safety and mission over profit to mitigate existential risks.

11. Beware of Hacking Metrics

Be aware of Goodhart’s Law: when a metric (e.g., academic citations, paper count) becomes a target, it ceases to be a good measure, as people will find ways to hack it rather than genuinely improve quality.

12. Pressure Supply Chains for Net Zero

If you are a company or consumer in an economy committed to net zero, pressure suppliers across your value chain to adopt net-zero practices as well; on average, roughly 90% of a company’s emissions lie in its supply chain.

13. Invest in Green Technologies

Support and invest in green technologies: international agreements signal where capital will flow, driving innovation and cost reductions that make these technologies more competitive than fossil fuels.

14. Answer Life-Changing Questions

Visit clearerthinking.org to answer a set of scientifically validated “life-changing questions” that 83% of people found valuable for gaining new insights and improving their self-understanding.

15. Subscribe to “One Helpful Idea” Newsletter

Sign up for the “One Helpful Idea” email newsletter at podcast.clearerthinking.org to receive a weekly valuable idea, new podcast episodes, essays, and event announcements.

16. Rate and Review Podcasts

Rate and review podcasts you enjoy on your listening platform, as this helps other potential listeners discover the show and supports its growth.

17. Provide Podcast Feedback

Give feedback, ask questions, and leave comments for podcasts you listen to, as this helps the creators improve the show over time.