Instagram removes DM encryption from today: What users should do to stay safe

Users who enabled encryption feature must act as changes take effect today


Meta can’t read WhatsApp messages, but it can see who you talk to, when, and how often, and it can use that data for ads and recommendations


Highlights

  • Instagram switches off end-to-end encryption just before enforcement of a new federal deepfake law begins.
  • Meta can now read private messages it previously could not access.
  • Privacy experts warn against storing downloaded chats in Google Drive or iCloud.
Instagram is removing, on May 8, a privacy feature that previously stopped the company from accessing the content of users’ direct messages.

The change comes just days before a new US federal law requires platforms to scan for and remove harmful content, and it affects users who turned on Instagram's optional end-to-end encryption for direct messages.

Most Instagram users never switched on this feature, according to digital privacy expert Harry Maugans. For the small number who did, the protection ends on May 8.

End-to-end encryption works like a sealed envelope. The platform can see who sent a message and who received it, but cannot open it to read what is inside.

Once Instagram removes this feature, the privacy layer that kept messages hidden disappears, and Meta will be able to access the content of those messages.


Users who had enabled the feature would have noticed a lock icon in their direct message chats. However, what will happen to those already encrypted conversations after the deadline is still unclear, Maugans told Fox News.

He said two possible outcomes are being discussed: the messages may be moved into regular chat history, or they could be deleted entirely.

This uncertainty is why users are being advised to download their encrypted messages before the change takes effect.

The timing of this move is being linked to the Take It Down Act, a US federal law passed in May 2025. The law requires platforms to remove non-consensual deepfake images within 48 hours of a report.

Platforms are expected to have a working system in place by May 19. Maugans said the timing is unlikely to be a coincidence.

He explained that platforms are required to act quickly against harmful content, but encrypted messaging prevents them from seeing what is being shared, which creates a conflict between privacy and content moderation requirements.

What users should do

If you have been using encrypted Instagram direct messages, it is important to download a copy of your chats. However, how you store that backup is just as important as saving it.

Maugans warned that uploading the downloaded file to services like Google Drive, iCloud, or any cloud storage removes its protection, because it becomes an unencrypted version of your conversations.

He added that if the goal is to prevent access by data brokers, users need to be very careful about where the file is kept.

Saving the backup in cloud platforms reduces its privacy benefit, so it is safer to keep it stored locally on your device.
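For readers comfortable with a command line, one way to keep a local backup protected is to re-encrypt the downloaded file yourself with a passphrase before archiving it. The sketch below uses the widely available `openssl` tool; the filename `instagram_chats.json` is an assumption for illustration, since Instagram's actual export name and format may differ.

```shell
# Hypothetical example: "instagram_chats.json" stands in for whatever file
# Instagram's download tool actually produces.
echo '{"example": "chat export"}' > instagram_chats.json

# Encrypt the file locally (AES-256, passphrase-derived key via PBKDF2):
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in instagram_chats.json -out instagram_chats.json.enc \
  -pass pass:'choose-a-strong-passphrase'

# Confirm the encrypted copy decrypts byte-for-byte, then delete the plaintext
# so only the protected version remains on disk:
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in instagram_chats.json.enc \
  -pass pass:'choose-a-strong-passphrase' | cmp - instagram_chats.json \
  && rm instagram_chats.json
```

A file protected this way can even be copied to cloud storage without exposing its contents, provided the passphrase is never stored alongside it.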

Meta has suggested that users who want ongoing encrypted messaging switch to WhatsApp, which it also owns. However, WhatsApp only protects message content, not the surrounding metadata.

Maugans explained that details such as who people talk to, when they communicate, and how frequently messages are exchanged are still visible. This helps build a communication pattern that can be used in Meta’s systems.

Although WhatsApp cannot read message content, it can still see who is talking to whom and when, and this information continues to support advertising and recommendation tools.

For users who want stronger privacy, Maugans recommends moving to platforms outside Meta’s ecosystem.

He pointed to Signal as a widely trusted option, noting that it is a nonprofit service focused on transparency and strong encryption, and is designed specifically for secure communication.

Signal is free and available at signal.org and in the App Store and Google Play. Other options include Apple iMessage for conversations between Apple devices, though messages to non-Apple phones fall back to standard SMS without the same protection.

For victims of non-consensual intimate images, the Take It Down Act provides a legal path to force removal once platforms have their systems in place after May 19.

Reports can be filed directly with the platform, and the Federal Trade Commission handles enforcement against companies that do not comply.
