SOUTH Asian women are the least likely to check their breasts for signs of cancer, new research revealed this week.
Around 40 per cent never check their breasts at all, compared to 27 per cent of black women and 13 per cent of women from other ethnicities, according to the Estée Lauder Companies (ELC) UK & Ireland’s 2021 Breast Cancer Campaign.
The analysis also found that a third of south Asian women said they do not know what to look for, or forget to carry out the checks. More than one in 20 (7 per cent) don’t feel comfortable checking themselves for cultural reasons.
Kreena Dhiman, who was diagnosed with breast cancer in 2013, said she was unsurprised by the statistics. “If I consider my own circumstance, I never self-checked,” Dhiman told Eastern Eye. “Conversations around our bodies simply don’t happen within our community and, as a result, knowledge on subjects such as breast health is almost non-existent.”
Dhiman said there are several factors at play, including cultural pressures that make it difficult to discuss one’s body. “As a Hindu, from a young age we are told that when on your period you are ‘unclean’ or ‘impure’ and unable to go close to any places of worship or religious events,” explained Dhiman. “That conditions us to believe that there is something wrong with the way our bodies function; it creates barriers and builds walls when it comes to the female body.”
She believes the community’s immigrant history may create difficulties too. Many south Asians arrived in the UK as immigrants and endured great struggles, Dhiman explained.
“Incredibly, the south Asian community came through that struggle, and many have gone on to raise a generation of highly driven, highly successful and well-educated children,” she said. “But those children have seen the adversity their elders have faced; they carry an element of responsibility, a desire to protect and repay their elders.
“It’s those children, my generation, who find it difficult to display vulnerability, because we have been raised to be strong, to be successful, to achieve. Ill health stands to get in the way of that, so it’s far easier to ignore the warning signs.”
Research also showed that south Asian women believe there is a stigma in their community around speaking about breast cancer as it is not openly discussed.
Dhiman agreed with the findings, commenting that stigma is “rife” among some members of the Asian community. When women are introduced to prospective suitors for marriage, Dhiman said, a cancer diagnosis could ruin a potential match. “It can be a real block in the road where those suffering are perceived to be damaged goods,” she said. “It’s those stereotypes that need to be broken.”
Dr Zoe Williams, a GP and broadcaster, said breast health should be part of our self-care routines, just like brushing our teeth. “There’s no shame, breasts are part of our bodies,” she explained. “It’s our responsibility to take care of them. Regular checking is vital, ideally once a month, but remember checking your breasts is a skill and, like any other skill, it takes practice to get good at it.”
She advised women to look out for several different signs – not just lumps. Symptoms can include irritation or dimpling of the skin on the breast or flaky skin in the nipple area, for instance. Dhiman’s first symptom was an inverted nipple.
“If you notice any unusual changes, it’s important to contact your GP as soon as possible,” Williams said.
Dhiman said her message to Eastern Eye readers would be that the problem will not go away if they simply ignore it. “This isn’t something that we can hide away from,” she said. “Breast cancer is real, and the lack of awareness in our community means that it’s often detected later than in our western peers.
“That ultimately means that we will lose proportionately more lives to the disease, and that needs to change.”
Additional key findings revealed that 82 per cent of black, south Asian and LGBTQIA+ women believe there needs to be better access to tools and resources that feature a more diverse range of people to highlight that breast cancer can affect every body.
A child is hoisted into a small boat as migrants wait in the water for a 'taxi boat' to take them across the channel to the UK at dawn on September 19, 2025 in Gravelines, France. (Photo by Dan Kitwood/Getty Images)
BRITAIN's plan to use artificial intelligence (AI) to assess the ages of asylum seekers has sparked concern among human rights groups, who warn the technology could misclassify children as adults and deny them vital protections.
The government intends to introduce facial age-estimation technology in 2026 to verify the ages of migrants claiming to be under 18, particularly those arriving on small boats from France. Officials say the move will help prevent adults from posing as children to exploit the asylum system.
Prime Minister Keir Starmer is under growing pressure to control migration, as Nigel Farage’s anti-immigration Reform UK party gains support in opinion polls. More than 35,000 people have crossed the English Channel in small boats this year, a 33 per cent rise on the same period in 2024.
Meanwhile, rights campaigners and social workers argue that assessing the age of migrants is a complex and sensitive process that cannot be replaced by technology.
“Assessing the ages of migrants is a complex process which should not be open to shortcuts,” said Luke Geoghegan, head of policy and research at the British Association of Social Workers. “This should never be compromised for perceived quicker results through artificial intelligence.”
Children who arrive in the UK without parents or guardians are entitled to legal aid, education, and social worker support under the care of local authorities. Charities fear that using facial recognition systems could result in minors being wrongly placed in adult asylum hotels, without proper safeguarding or support.
The Home Office said the technology would not be used in isolation. “Robust age assessments for migrants are vital to maintaining border security,” a spokesperson said. “This technology will not be used alone, but as part of a broad set of methods used by trained assessors.”
Governments worldwide are increasingly turning to artificial intelligence to manage migration. Britain announced in April that it would deploy AI tools to speed up asylum decisions, helping caseworkers summarise interviews and analyse country-specific data. In July, it signed a partnership with OpenAI to explore how to use AI in education, justice, defence, and security.
But rights groups have warned that asylum seekers should not be used as test subjects for unproven technologies. “The asylum system must not be the testing ground for deeply flawed AI tools operating with minimal transparency,” said Sile Reynolds, head of asylum advocacy at Freedom from Torture.
Anna Bacciarelli, senior AI researcher at Human Rights Watch, said facial age estimation could “undermine privacy and other human rights”, adding: “We don’t actually know if it works.”
Facial recognition technologies have previously faced criticism for extracting sensitive biometric data and reinforcing racial or gender biases. They have also been used by London’s police at protests and public events, including the Notting Hill Carnival.
“There are always going to be worries about sensitive biometric data being taken from vulnerable people and used against them,” said Tim Squirrell, head of strategy at tech rights group Foxglove. “The machine tells you that you’re 19 – how do you question that? It’s completely unaccountable.”
Experts say AI models trained on biased or incomplete data can reproduce historic prejudices. The Greater Manchester Immigration Aid Unit (GMIAU) said some young asylum seekers had been told they were too tall or too hairy to be under 18.
“Children are being treated as subjects of immigration control, not as children,” said Rivka Shaw, a GMIAU policy officer, describing the practice as “linked to racism and adultification.”
The Helen Bamber Foundation found that nearly half of migrants reassessed in 2024 – about 680 people – were actually children wrongly sent to adult accommodation.
“A child in adult housing is put in a shared room with strangers and no safeguarding checks,” said Kamena Dorling, the foundation’s policy director.
A July report by the Independent Chief Inspector of Borders and Immigration urged the Home Office to involve trained child-protection professionals in age decisions.
“Decisions on age should be made by child-protection professionals,” said Dorling. “All the concerns we have about human decision-making also apply to AI decision-making.”