Rise of the machines must be monitored, say global finance regulators

The FSB says in its first report on artificial intelligence (AI) and machine learning that the risks they pose need monitoring.

Update: 2017-11-02 03:54 GMT

Replacing bank and insurance workers with machines risks creating a dependency on outside technology companies beyond the reach of regulators, the global Financial Stability Board (FSB) said on Wednesday. The FSB, which coordinates financial regulation across the Group of 20 (G20) economies, said in its first report on artificial intelligence (AI) and machine learning that the risks they pose need monitoring.

AI and machine learning refer to technology that is replacing traditional methods to assess the creditworthiness of customers, to crunch data, price insurance contracts and spot profitable trades across markets. There are no international regulatory standards for AI and machine learning, but the FSB left open whether new rules are needed. Data on rapidly growing usage of AI is largely unavailable, leaving regulators unsure about the impact of potential new and unexpected links between markets and banks, the report said.

AI could, for example, lead to ‘non-sustainable’ increases in credit by automating credit scoring. While AI shows substantial promise if risks are properly managed, it could create too much dependency among banks and insurers on the few specialist businesses that provide AI technology. Expected rapid growth in AI also raises the prospect of outside technology players expanding their influence over the finance sector.

“This could, in turn, lead to the emergence of new systemically important players that could fall outside the regulatory perimeter,” the FSB said.

If a major AI provider went bust, it could lead to operational disruptions at a large number of financial firms at the same time, especially if used in “mission critical” applications, the report said. Regulators could also find it difficult to identify who has made key financial decisions that go wrong. “If AI and machine learning-based decisions cause losses to financial intermediaries across the financial system, there may be a lack of clarity around responsibility,” the report said.

The pace of technological change will also make it harder to craft durable rules for AI, a technology that some academics expect to revolutionize the financial sector.

The report said that RegTech investment, or use of machines to comply with a welter of new regulations introduced to tackle money laundering and make banks safer, could reach $6.45 billion by 2020.

Consultancy Accenture said in May that three-quarters of bankers surveyed believed AI will become the primary way banks interact with customers within the next three years. European insurers have invested $400 million in “InsurTech”, or real-time technology, to help reduce payouts.

Nordea, the Nordic region’s biggest bank, said last month that automation would help it shed at least 4,000 staff. It has already introduced an AI chatbot to answer common customer questions. Dutch bank ING wants to increase the number of traders using AI. Fund managers are also turning to outside specialists for machine-learning tools that sift through news and research for insight into market trends.

So-called ‘quant’ funds use AI to manage $1 trillion in assets. Though that is only a fraction of the $40 trillion in mutual funds globally, the FSB said industry estimates suggest that could grow rapidly. The regulators themselves are also using AI to make it easier to detect fraud and money laundering, while central banks expect to use AI for real-time predictions using big data to help to determine monetary policy, the report said.

The FSB acknowledged that AI is helping the financial sector to cut costs, improve profitability and widen choice for customers, but added that it also raises concerns over privacy of data used in AI applications.
