
British family of three among missing in Miami building collapse

A pregnant British woman, her husband and their one-year-old daughter are reportedly missing among nearly 100 people still unaccounted for after an oceanfront apartment block in Miami collapsed late on Thursday (24) night.

According to a family member, Bhavna and Vishal Patel, along with their daughter Aishani, were staying at the 12-storey Champlain Towers South building when it was reduced to rubble.
A spokesperson for Miami-Dade Mayor’s office said so far four people have died from the building collapse.

As questions are raised about the building's structural failings, pleas are being made to help locate the family.

On Thursday (24) night, Nicolette Brent, the UK’s Consul General in Miami visited the family reunification centre in Surfside and said her team was “ready to help any British nationals who may have been involved in this tragic incident.”

Miami-Dade mayor Daniella Levine Cava spoke with US president Joe Biden by telephone after the collapse.

Biden said: "I say to the people of Florida, whatever help you want, that the federal government can provide, just ask us, we'll be there."

Meanwhile, a spokesman for the UK in America told The Telegraph: “We are working with the local authorities in Miami-Dade County to establish if any British nationals have been involved in the residential building collapse that took place earlier today."

What caused ChatGPT to start using goblin references in its responses?

Highlights

  • Goblin mentions in ChatGPT rose 175 per cent after GPT-5.1 launched in November.
  • The "Nerdy" personality drove 66.7 per cent of all goblin references.
  • OpenAI has retired the personality and removed creature words from training data.

OpenAI has explained how an attempt to make ChatGPT more fun and personable accidentally gave it a fixation on goblins, gremlins, and other creatures.

The company published a detailed blog post laying out exactly how the problem started, how it spread, and what was done to fix it.

The issue traces back to a feature called the Nerdy personality, one of several personality options OpenAI built to let ChatGPT communicate in different styles.

This particular one was designed to make the AI sound enthusiastic, witty, and playful, like a knowledgeable friend who enjoys making complex ideas accessible.
