Bank of England warns of racist and sexist AI bots

The warning coincides with escalating concerns surrounding AI technology, notably from prime minister Rishi Sunak

BANK OF ENGLAND officials have warned that discriminatory artificial intelligence (AI) bots could endanger the financial system.

The report from the bank's fintech hub highlighted the risks associated with AI bots perpetuating racist and sexist biases, raising concerns about their capacity to discriminate against both customers and employees, reported The Telegraph.


The report underscores the susceptibility of self-teaching algorithms to absorbing biases from the datasets they are trained on and the broader societal context.

Authored by analyst Kathleen Blake, the report emphasises the disruptive issues these biases may create for financial institutions, insurers, and the overall financial system.

Blake pointed out that potential AI-driven discrimination has the capacity to "exacerbate" risks related to financial stability by eroding trust within the system.

The use of "biased or unfair AI" not only poses reputational hazards but also legal risks for companies, Blake added, which could subsequently attract scrutiny from regulatory authorities.

Several noteworthy AI-related incidents were cited in the report, including an algorithm employed by Apple and Goldman Sachs for assessing credit card applications, which reportedly offered lower credit limits to women than to men.

The issue was investigated by the New York State Department of Financial Services in 2021, which found that while the disparity was not deliberate, it exposed significant deficiencies in customer service and transparency.

Another instance discussed was Amazon's experience with a recruitment algorithm, which, as Blake highlighted, unfairly penalised female applicants.

This discriminatory outcome was attributed to the algorithm's training on resumes submitted over a ten-year period, reflecting the prevailing male dominance in the industry, the report said.

The algorithm was consequently discontinued in 2018 over concerns of sexism, after it was found to penalise CVs containing the word "women's", as in "women's chess club captain".

In recent months, the government has raised alarms about the potential misuse of AI in creating bio-weapons and the loss of control over such software.

The Department for Science, Innovation and Technology said in a statement that humanity stands at a pivotal juncture in history, emphasising the importance of addressing AI's challenges rather than turning a blind eye to them.
