A NEW immersive play performed in a Bradford car park will explore the significance of car culture within British Asian communities and how it acts as an escape from racism.
Performed in Oastler Market car park overlooking the city centre, Peaceophobia examines rising Islamophobia from the perspective of three young British Pakistani men from a modified car club.
Co-written by acclaimed playwright Zia Ahmed, the story shows how the men use their love of cars and their faith as a sanctuary from the world around them. Before writing the script, Ahmed spent time with the performers to learn about their passion for car culture, and he quickly saw the depth of the love they have for their vehicles.
“I think a lot of that (love) is to do with the agency, it is their space, and it is something they own,” he told Eastern Eye. “They can put their personality onto it and it’s not something that they need to suppress – they can go as loud or as big as they want.”
Modifying their vehicles is a creative outlet and can act as a source of escapism, Ahmed added. “There is a lot of joy and pride that comes into maintaining the cars,” he said.
Although the play centres upon the prominent car culture within Bradford’s Asian community, London-based writer Ahmed does not have a huge amount of experience with motors. “I can’t actually drive,” he laughed.
The production was co-created by members of Speakers’ Corner, a political and creative collective of women and girls. Although Speakers’ Corner is made up entirely of women, the show focuses primarily on the lives of Muslim men.
While speaking to journalists to promote the show, Ahmed admitted the creators had been questioned several times on why the play centred on Muslim men.
“We’ve been asked that question a lot and it’s like, who else has been talking about (the men)?” he said. “The narrative for a young Muslim man in this country has always been around criminality and poverty and while those things will be touched upon, the focus of the show is the cars and faith in a positive way, not just solely about Islamophobia.
"As much as Islamophobia impacts in a negative way, there’s still the love for the faith that keeps you going.”
The show is offering a platform to the men, Ahmed added, a place to share their stories and experiences.
Speakers’ Corner member Iram Rehman said the project began as a campaign to promote the message that Islam is rooted in peace. “We are part of a movement of young people using their voices to make a positive change and promote peace instead of being silenced,” Rehman explained.
The show’s title is a spin on Islamophobia, Ahmed noted. He said: “It’s about three guys trying to find peace through their religion – and how can you have a phobia of peace?”
The play is performed by three men – Mohammad Ali Yunis, Casper Ahmed, and Sohail Hussain. The trio all keep their real names for the play, a conscious decision by the creators.
“They are playing versions of themselves and (the play) has come from workshopping with them and listening to their stories,” Ahmed explained. “Of course, there’s a performance element but (Peaceophobia) is unashamedly about these three Pakistani-Muslim men from Bradford.”
The project was conceived before the outbreak of the Covid-19 pandemic last year and Ahmed considered whether it would still be timely. However, he noted Muslims are still being profiled and discriminated against because of their religion.
“Islamophobia isn’t gone, it didn’t stop during the pandemic,” he said. “But the show is not just about that – it’s also about the happier, hopeful side of faith and the love of the cars.
"Whatever preconceptions that you might have about the cars, seeing the detail and the care and the love that they have (for their vehicles), I think anyone can relate to that.”
Peaceophobia plays at Oastler Market car park, Bradford, from Friday (10) to next Saturday (18); then at Contact Manchester from September 29 to October 2.
A child is hoisted into a small boat as migrants wait in the water for a 'taxi boat' to take them across the channel to the UK at dawn on September 19, 2025 in Gravelines, France. (Photo by Dan Kitwood/Getty Images)
BRITAIN's plan to use artificial intelligence (AI) to assess the ages of asylum seekers has sparked concern among human rights groups, who warn the technology could misclassify children as adults and deny them vital protections.
The government intends to introduce facial age-estimation technology in 2026 to verify the ages of migrants claiming to be under 18, particularly those arriving on small boats from France. Officials say the move will help prevent adults from posing as children to exploit the asylum system.
Prime Minister Keir Starmer is under growing pressure to control migration, as Nigel Farage’s anti-immigration Reform UK party gains support in opinion polls. More than 35,000 people have crossed the English Channel in small boats this year, a 33 per cent rise on the same period in 2024.
Meanwhile, rights campaigners and social workers argue that assessing the age of migrants is a complex and sensitive process that cannot be replaced by technology.
“Assessing the ages of migrants is a complex process which should not be open to shortcuts,” said Luke Geoghegan, head of policy and research at the British Association of Social Workers. “This should never be compromised for perceived quicker results through artificial intelligence.”
Children who arrive in the UK without parents or guardians are entitled to legal aid, education, and social worker support under the care of local authorities. Charities fear that using facial recognition systems could result in minors being wrongly placed in adult asylum hotels, without proper safeguarding or support.
The Home Office said the technology would not be used in isolation. “Robust age assessments for migrants are vital to maintaining border security,” a spokesperson said. “This technology will not be used alone, but as part of a broad set of methods used by trained assessors.”
Governments worldwide are increasingly turning to artificial intelligence to manage migration. Britain announced in April that it would deploy AI tools to speed up asylum decisions, helping caseworkers summarise interviews and analyse country-specific data. In July, it signed a partnership with OpenAI to explore how to use AI in education, justice, defence, and security.
But rights groups have warned that asylum seekers should not be used as test subjects for unproven technologies. “The asylum system must not be the testing ground for deeply flawed AI tools operating with minimal transparency,” said Sile Reynolds, head of asylum advocacy at Freedom from Torture.
Anna Bacciarelli, senior AI researcher at Human Rights Watch, said facial age estimation could “undermine privacy and other human rights”, adding: “We don’t actually know if it works.”
Facial recognition technologies have previously faced criticism for extracting sensitive biometric data and reinforcing racial or gender biases. They have also been used by London’s police at protests and public events, including the Notting Hill Carnival.
“There are always going to be worries about sensitive biometric data being taken from vulnerable people and used against them,” said Tim Squirrell, head of strategy at tech rights group Foxglove. “The machine tells you that you’re 19 – how do you question that? It’s completely unaccountable.”
Experts say AI models trained on biased or incomplete data can reproduce historic prejudices. The Greater Manchester Immigration Aid Unit (GMIAU) said some young asylum seekers had been told they were too tall or too hairy to be under 18.
“Children are being treated as subjects of immigration control, not as children,” said Rivka Shaw, a GMIAU policy officer, describing the practice as “linked to racism and adultification.”
The Helen Bamber Foundation found that nearly half of migrants reassessed in 2024 – about 680 people – were actually children wrongly sent to adult accommodation.
“A child in adult housing is put in a shared room with strangers and no safeguarding checks,” said Kamena Dorling, the foundation’s policy director.
A July report by the Independent Chief Inspector of Borders and Immigration urged the Home Office to involve trained child-protection professionals in age decisions.
“Decisions on age should be made by child-protection professionals,” said Dorling. “All the concerns we have about human decision-making also apply to AI decision-making.”