TL;DR
At FOSDEM 2026, Michiel Leenaars examined how geopolitical conflict, resource scarcity and adversarial AI are reshaping risks for free and open source software. He argued the community faces new incentives for manipulation and supply-chain attacks, and urged a mix of technical safeguards and human oversight to preserve trust.
What happened
At the FOSDEM 2026 main track session in room Janson (video-only), speaker Michiel Leenaars framed Free and Open Source Software (FOSS) as a vast, collaborative digital commons now exposed to new pressures. Leenaars traced FOSS’s growth from a relatively stable post–Cold War period to a present where geopolitical conflict leverages low-cost technologies and where social platforms built on FOSS can exacerbate polarization. He warned that generative language models and other AI systems — presented as tools to accelerate coding — introduce a novel, hard-to-audit attack vector: opaque model outputs can subtly alter code and the training process itself can be targeted to produce malicious artifacts. The talk considered how these trends intersect with resource scarcity and climate stresses, and explored defensive strategies such as compartmentalisation, formal and symbolic verification, and combined human–AI workflows as ways to try to maintain a defensible FOSS ecosystem.
Why it matters
- FOSS underpins large parts of global infrastructure; new threats could erode long‑standing trust assumptions.
- Opaque AI-generated code increases supply-chain risk because errors or manipulation are harder to detect.
- Geopolitical use of widely available FOSS can amplify disinformation, surveillance and authoritarian control.
- Resource scarcity and climate pressures interact with technological change, shaping priorities and attack surfaces.
Key facts
- Event: FOSDEM 2026, Main Track, room Janson (video-only), Saturday 10:00–10:50.
- Speaker: Michiel Leenaars presented the session titled “FOSS in times of war, scarcity and (adversarial) AI.”
- Talk premise: geopolitical conflicts and bot-generated code create new risks for the global FOSS community.
- The speaker framed FOSS as a large collaborative achievement involving developers, translators, writers and civil society.
- The talk referenced historical assumptions of post–Cold War stability and said that reality has since shifted.
- Leenaars cited the Snowden and Shadow Brokers revelations as examples of earlier, hidden agendas that undermined trust in technology.
- Concern: generative models (LLMs) can rewrite code unpredictably and are described as black boxes lacking inherent ethics or truth.
- Adversarial training and subtle manipulation of training data were highlighted as realistic ways to compromise code produced or influenced by AI.
- Proposed mitigations mentioned include compartmentalisation, formal and symbolic proofs, and blended human–AI review.
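The training-data concern in the bullets above can be made concrete with a deliberately tiny sketch (not from the talk): even a single poisoned training example can flip what a simple model says about a targeted input. The 1-nearest-neighbour "model", the labels and the numbers below are all invented for illustration.

```python
# Toy illustration (hypothetical, not from the talk): a small amount of
# poisoned training data flips a model's output for a targeted input.
# Model: 1-nearest-neighbour on a single feature, labels "safe"/"malicious".

def nearest_label(train, x):
    """Return the label of the training point whose feature is closest to x."""
    return min(train, key=lambda pair: abs(pair[0] - x))[1]

clean = [(0.0, "safe"), (1.0, "safe"), (9.0, "malicious")]
target = 2.0  # the input the attacker cares about

before = nearest_label(clean, target)      # closest clean point is (1.0, "safe")
poisoned = clean + [(2.1, "malicious")]    # one injected training example
after = nearest_label(poisoned, target)    # injected point is now closest

print(before, "->", after)  # prints: safe -> malicious
```

Real models are vastly larger, which is the talk's point: the same effect becomes far harder to detect or audit, motivating the human review and verification defenses listed above.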
What to watch next
- The extent to which communities adopt AI-assisted coding in core FOSS projects and how review workflows change.
- Evidence of adversarially poisoned code making it into widely used repositories (not confirmed in the source).
- Emergence of community or technical standards for verifying AI-generated contributions, including formal verification and compartmentalisation.
- Regulatory or policy moves addressing 'any use' licensing and dual-use concerns (not confirmed in the source).
Quick glossary
- FOSS: Free and Open Source Software — software distributed with source code that users can inspect, modify and share under permissive or copyleft licenses.
- Adversarial AI: Techniques or attacks that intentionally manipulate machine learning models, training data or outputs to produce incorrect or malicious behavior.
- Black box: A system whose internal workings are not transparent or easily interpretable, making its outputs difficult to audit or explain.
- Supply chain (software): The collection of dependencies, libraries, build tools and contributors whose combined outputs form deployed software.
- Compartmentalisation: A defensive approach that isolates components or processes to limit the impact of a compromised part on the broader system.
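One common supply-chain safeguard related to the glossary entries above is pinning a dependency to a cryptographic digest recorded at review time, so a later substitution of the artifact is rejected. This is a minimal sketch, not a mitigation named in the talk; the artifact bytes and digest are made up for illustration.

```python
# Minimal sketch of a supply-chain safeguard: pin an artifact to a known
# SHA-256 digest and refuse it if the bytes have changed since review.
# The "artifact" here is a hypothetical stand-in for a real dependency.

import hashlib

def sha256_hex(data: bytes) -> str:
    """SHA-256 digest of the artifact bytes, as a hex string."""
    return hashlib.sha256(data).hexdigest()

def verify_artifact(data: bytes, pinned_digest: str) -> bool:
    """Accept the artifact only if it matches the digest recorded at review time."""
    return sha256_hex(data) == pinned_digest

artifact = b"example library contents"
pinned = sha256_hex(artifact)  # recorded when a human reviewed this version

print(verify_artifact(artifact, pinned))         # True: unmodified artifact
print(verify_artifact(artifact + b"!", pinned))  # False: tampered artifact
```

Digest pinning only proves the bytes are unchanged; it says nothing about whether the reviewed code was itself trustworthy, which is why the talk pairs such technical checks with human oversight.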
Reader FAQ
Who gave the talk and where was it presented?
Michiel Leenaars spoke at FOSDEM 2026 on the Main Track in the Janson room (video-only).
Is there evidence that AI has already poisoned FOSS repositories?
The talk warned that adversarial training and model outputs could enable subtle manipulation, but specific incidents are not confirmed in the source.
What mitigations were suggested?
Leenaars discussed combining human review with AI, compartmentalisation, and formal or symbolic verification techniques as possible defenses.
Does the talk claim AI will definitely replace FOSS development?
The speaker raised the risk that AI-driven coding could displace aspects of FOSS work, but the talk makes no definite prediction, and no such outcome is confirmed in the source.
Sources
- FOSS in times of war, scarcity and (adversarial) AI
- FOSDEM 2026
- The Fog of AI
- Artificial Intelligence and International Conflict in Cyberspace