Sunday, March 22, 2026
Home World NewsFCA deal gives Palantir yet more access to inner workings of power in Britain | Palantir

FCA deal gives Palantir yet more access to inner workings of power in Britain | Palantir

by admin7
0 comments


Palantir’s latest UK contract takes the AI and data analytics company into the heart of one of Britain’s biggest industries: financial services, which accounts for 9% of the economy.

The Miami-based company embedded its technology in the NHS in 2023, the police in 2024 and the military in 2025. Land and expand, they say in the tech industry. Palantir has followed the script building contracts worth more than £500m.

Now in 2026, its deal with the Financial Conduct Authority (FCA) to dive into the terabytes of information it gathers gives it yet another unparalleled view of the inner workings of the British authorities. It also gives it sight of a trove of data about the workings of one of the most important global centres of finance, the City of London.

The appeal of companies such as Palantir to public authorities is driven by three forces: the push to find more efficient ways to use human resources amid strained public finances; the existence of lakes of data swollen by society’s increased tendency to digitise transactions and communications; and the dawn of AI and the Labour government’s unbridled enthusiasm for its potential to unlock elusive economic growth.

Notwithstanding its former use of Peter Mandelson’s lobbying company, Global Counsel, Palantir has become an influential voice in Whitehall. With earnings of $1.4bn in the last three months of last year alone, it can afford top talent and its AI-enabled data analysis systems impress many who see them, in demonstrations at least. Campaign groups rail against Palantir’s work with the US Department of Homeland Security and its ICE operations, and its service to the Israel Defense Forces, but the contracts keep coming.

Its technologists will arrive at the FCA headquarters in east London and find a regulator worried it is devoting too much energy to pursuing possible financial crime cases that go nowhere. It wants to use AI to better detect signs of wrongdoing so it can crack down on the serious crime of money laundering, which underpins social ills such as human trafficking and the drugs trade, as well as fraud, which affects many people and accounts for about 40% of all crimes in the UK.

Its workplan for 2025-26 set out an ambition to “expand the use of data and intelligence to identify and act on the riskiest firms and/or individuals” and use “network analytics to identify harmful networks of firms and/or individuals”. But as it moves to AI detection of financial wrongdoing, criminals may well respond with their own ways of beating the bots.

“If the FCA relies on an AI-based detection model, a bad actor could take steps to influence that system when it reviews material,” said Christopher Houssemayne du Boulay, a partner and barrister at the law firm Hickman & Rose who specialises in serious and complex financial crime.

For example, they might use invisible “white text” in documents to instruct the AI to ignore anything in that document that might be incriminating. “You can absolutely see that being used in a financial crime context because developments in technological capabilities for good can equally well be exploited by criminals and frequently are exploited very well,” he said.

The arrival of AI as a weapon to fight money laundering has been long anticipated. “People have talked about using machine learning and earlier forms of artificial intelligence to spot patterns of money laundering] since the 1990s,” said Prof Michael Levi, an internationally recognised expert in money laundering at Cardiff University. “Now that technology is available, we have to make decisions about how to use it, what the risks are.”

He said it was understandable that some people might fear the consequences of data companies being able to integrate different datasets in a way that could threaten privacy.

But he added: “Criminals are also afraid of it [and] also some elites might be afraid, because corporate holdings through shell companies and through real companies with obscured ownership should be part of the target for these kinds of technologies.”



Source link

You may also like

Leave a Comment