US jury indicts six, including Indian national, in Amazon bribery conspiracy

A US jury has indicted six people, including an Indian national, on criminal charges for bribing Amazon workers to restock blocked goods or gain unfair competitive advantage in the online marketplace.

Those charged in the case served as consultants for third-party sellers, doling out over $100,000 to employees and contractors at the e-commerce giant for favours or intelligence in a scheme dating back to at least 2017.


Fraudulently reinstated products and merchants reportedly went on to generate more than $100 million in total sales.

Bribes were paid to at least 10 people, including Amazon contractor Nishad Kunju of Hyderabad, India, prosecutors said.

Kunju went on to become an outside consultant himself, and bribed former colleagues still working for Seattle-based Amazon, according to the indictment.

An Indian-American, Rohit Kadimisetty of southern California, was also named in the indictment.

The others identified were Hadis Nuhanovic of Georgia, and Joseph Nilsen, Ephraim Rosenburg and Kristen Leccese of New York.

The defendants will appear at a federal court in Seattle on October 15 to face conspiracy charges.

"Realising they could not compete on a level playing field, the subjects turned to bribery and fraud in order to gain the upper hand," said FBI special agent Raymond Duda.

"What's equally concerning, not only did they attempt to increase sales of their own products, but sought to damage and discredit their competitors."

Illicit favours gained through bribery included extra shelf space in distribution centres; inside data sellers could use against rivals; and reinstatement of accounts blocked or suspended for rule-breaking.

Reinstated products included dietary supplements suspended because of safety complaints; household electronics that had been flagged as flammable; and consumer goods removed for intellectual-property violations, prosecutors noted.

"As the world moves increasingly to online commerce, we must ensure that the marketplace is not corrupted with unfair advantages obtained by bribes and kick-backs," said US attorney Brian Moran.

"The ultimate victim from this criminal conduct is the buying public who get inferior or even dangerous goods that should have been removed from the marketplace."

Amazon said it had worked hard "to build a great experience for customers and sellers, and bad actors like those in this case detract from the flourishing community of honest entrepreneurs that make up the vast majority of its sellers".

The e-commerce giant added that it "has systems in place to detect suspicious behaviour by sellers or employees, and teams in place to investigate and stop prohibited activity".

"We are especially disappointed by the actions of this limited group of now former employees, and appreciate the collaboration and support from law enforcement to bring them and the bad actors they were entwined with to justice," the company said.

UK moves to ban DeepNude-style AI ‘nudification’ apps in online abuse crackdown

Highlights

  • Government plans to ban AI tools that digitally remove clothing from images
  • New offences target the creation and supply of nudification apps
  • Measures form part of a wider strategy to cut violence against women and girls

Ban targets AI-powered image abuse

The UK government says it will ban so-called “nudification” apps, describing them as tools that fuel misogyny and online abuse. The announcement was made on Thursday as part of a broader plan to halve violence against women and girls.

Under the proposed laws, it will become illegal to create or supply artificial intelligence tools that allow users to edit images to make it appear as though a person’s clothing has been removed. The government says the offences will strengthen existing rules on sexually explicit deepfakes and intimate image abuse.
