
Bank of England warns of racist and sexist AI bots

The warning coincides with escalating concerns surrounding AI technology, notably from Prime Minister Rishi Sunak


BANK OF ENGLAND officials have warned that discriminatory artificial intelligence (AI) bots could endanger the financial system.

The report from the bank's fintech hub highlighted the risks associated with AI bots perpetuating racist and sexist biases, raising concerns about their capacity to discriminate against both customers and employees, reported The Telegraph.


The report underscores the susceptibility of self-teaching algorithms to absorbing biases from the datasets they are trained on and the broader societal context.

Authored by analyst Kathleen Blake, the report emphasises the disruptive issues these biases may create for financial institutions, insurers, and the overall financial system.

Blake pointed out that potential AI-driven discrimination has the capacity to "exacerbate" risks related to financial stability by eroding trust within the system.

The use of "biased or unfair AI" not only poses reputational hazards but also legal risks for companies, Blake added, which could subsequently attract scrutiny from regulatory authorities.

Several noteworthy AI-related incidents were cited in the report, including an algorithm employed by Apple and Goldman Sachs for assessing credit card applications, which reportedly offered lower credit limits to women compared to men.

The New York State Department of Financial Services investigated the issue in 2021, concluding that while the discrimination was not deliberate, the episode exposed significant deficiencies in customer service and transparency.

Another instance discussed was Amazon's experience with a recruitment algorithm, which, as Blake highlighted, unfairly penalised female applicants.

This discriminatory outcome was attributed to the algorithm's training on resumes submitted over a ten-year period, reflecting the prevailing male dominance in the industry, the report said.

Consequently, the algorithm was discontinued in 2018 over concerns of sexism, after it was found to penalise CVs containing the word "women's", as in "women's chess club captain".

In recent months, the government has raised alarms about the potential misuse of AI in creating bio-weapons and the loss of control over such software.

The Department for Science, Innovation, and Technology conveyed in a statement that humanity stands at a pivotal juncture in history, emphasising the significance of addressing AI challenges rather than turning a blind eye to the issues at hand.


Rachel Reeves announces annual tax on homes worth over £2 million

Highlights

  • New annual surcharge on homes worth over £2m comes into force in April 2028, rising with inflation.
  • Tax starts at £2,500 for properties valued £2m-£2.5m, reaching £7,500 for homes worth £5m or more.
  • London and South East disproportionately affected, with 82 per cent of recent £2m-plus sales in these regions.
Britain has announced a new annual tax on homes worth more than £2 million, expected to raise £400 million by 2029-30, according to estimates from the Office for Budget Responsibility.

Chancellor Rachel Reeves said the measure would address "a long-standing source of wealth inequality in our country" by targeting "less than the top 1 per cent of properties". The surcharge will come into force in April 2028.

Under the policy, property owners will face a recurring annual charge additional to existing council tax liability. The rate starts at £2,500 for homes valued between £2m and £2.5m, rising to £3,500 for properties worth £2.5m to £3.5m, £5,000 for £3.5m to £5m, and £7,500 for those valued at £5m or more.
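The banded rates above can be sketched as a simple lookup. This is an illustrative sketch of the announced figures only: it ignores the inflation uplift applied from 2028, and the treatment of properties valued exactly at a band boundary is an assumption, since the announcement does not specify it.

```python
def annual_surcharge(value_gbp: float) -> int:
    """Return the announced annual surcharge for a given property value.

    Illustrative only: assumes boundary values fall into the higher band,
    and ignores the inflation uplift applied after April 2028.
    """
    if value_gbp <= 2_000_000:
        return 0  # only homes worth more than £2m are liable
    # Bands checked from the top down: (lower threshold, annual charge)
    bands = [
        (5_000_000, 7_500),
        (3_500_000, 5_000),
        (2_500_000, 3_500),
        (2_000_000, 2_500),
    ]
    for threshold, charge in bands:
        if value_gbp >= threshold:
            return charge
    return 0
```

For example, a £3m home would attract the £3,500 charge, while a £6m home would pay the top rate of £7,500.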
