Germany’s domestic intelligence service, the Bundesamt für Verfassungsschutz (BfV), has decided not to adopt software from US firm Palantir, according to media reports. Instead, the agency is said to have settled on a product from French company ChapsVision, though neither the BfV nor the vendor has officially confirmed the decision. The Interior Ministry declined to comment on operational matters, saying public statements could reveal working methods and pose security risks.
Officials say the choice of software rests on technical capabilities rather than the origin of the manufacturer. The BfV and other security agencies — including the foreign intelligence service (BND) and the federal criminal police office (BKA) — are seeking powerful AI-based analysis tools to support counterintelligence, counterterrorism and monitoring of political and religious extremism. BfV president Sinan Selen has described plans to broaden the agency’s “toolbox.”
Any expansion of technical powers for security agencies is tied to planned legal reforms. Berlin has been preparing legislation to clarify what tools intelligence and police bodies may use; ultimately, the Bundestag must approve changes. Proposed measures involving artificial intelligence and facial recognition have sparked sharp debate. The Left party opposes broadening surveillance powers and argues that replacing Palantir with another supplier is not the real issue: the key problem, its spokespeople say, is the logic of automated mass data aggregation and profiling. They call for clear legal limits and strict oversight to protect fundamental rights.
Civil liberties groups have already challenged the use of such software in court. The German Society for Civil Rights (GFF) successfully contested provisions in Hesse after the Federal Constitutional Court found indiscriminate automated data evaluation unconstitutional. The Hesse laws were amended, but the GFF filed further constitutional complaints in 2024 and 2025 — including a case against Bavaria — that are still pending.
Legal advocates warn about the opaque nature of many analysis platforms. Franziska Görlitz, a lawyer and case coordinator at the GFF, welcomed a reported move away from Palantir only insofar as it might reflect attention to “digital sovereignty.” Her larger concern is that many tools operate as “black boxes”: it is often unclear how they reach conclusions, what errors or biases they contain, and how far they can intrude on rights. She and other critics emphasize risks such as wrongful targeting, discriminatory outcomes, and a chilling effect on civic activity — people might avoid protests or distance themselves from others for fear of being recorded in state databases.
Palantir’s leadership has pushed back against the critical debate in Germany. CEO Alex Karp told German media he found the mix of caution and rejection surprising and argued that Germany could benefit from his company’s expertise; he criticized what he described as alarmist rhetoric around AI tools. Meanwhile, academics and commentators such as Dutch political scientist Cas Mudde have been even more strident, accusing Palantir of promoting an authoritarian model in writings such as Karp’s book The Technological Republic, and urging European governments to cut ties with the company.
For now, the reported BfV decision — and the wider push by German security agencies to acquire advanced data-analysis systems — sits amid continuing legal and political contestation. Legislators, courts and advocacy groups will play central roles in defining what tools agencies may use and under what safeguards, as Germany seeks to balance security needs with protections for privacy and civil liberties.
This article was translated from German.