Privacy

Created
Thu, 24/04/2025 - 20:49

I have recently been asked by the Panoptykon Foundation whether it was possible to create an online age verification system that would not be a privacy nightmare. I highly recommend reading their piece, which dives into several issues around age verification.

I replied that yes, under certain assumptions, this is possible. And provided a rough sketch of such a system.

But before we dive into it, I have to be clear: I am not a fan of introducing online age verification systems. Privacy is just one of the many issues related to them. I dive into some of those later in this post.

However, the broader context is that age verification is increasingly required by law for a lot of services. And the systems used for it are horrendously bad – like, say, Discord’s idea to verify people’s age either by requiring a scan of an ID, or by having them enable the camera so it can analyze their face.
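One common building block for less invasive age verification – offered here as a hedged illustration, not necessarily the sketch from the post – is a signed attestation that carries only the claim “over 18” and no identity. In the toy Python below, a hypothetical issuer checks the user’s age out of band and signs a minimal claim; the service verifies the signature without ever learning who the user is. An HMAC with a key shared between issuer and verifier stands in for the real machinery (production systems would use blind signatures or zero-knowledge proofs so the issuer cannot link tokens back to users either):

```python
import hmac
import hashlib
import json
import secrets

# Hypothetical setup: a key held by the age-verification issuer and
# shared with verifying services. This is a simplification; real
# designs use blind signatures or ZK proofs for unlinkability.
ISSUER_KEY = secrets.token_bytes(32)

def issue_token() -> dict:
    # The issuer verifies the user's age out of band, then signs a
    # claim containing no identifying data - only the attribute and
    # a random nonce.
    claim = {"over_18": True, "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_token(token: dict) -> bool:
    # The service learns only "over 18" - never a name, ID scan,
    # or face.
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["tag"])
            and token["claim"].get("over_18") is True)

token = issue_token()
print(verify_token(token))  # True
```

The point of the exercise: the verifying service receives proof of a single attribute, nothing more – which is exactly the opposite of handing over an ID scan or a face.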

Created
Tue, 25/06/2024 - 20:01

Telegram is a popular – especially in the East – internet messenger. It bills itself as “encrypted”, “private”, and “secure”. One of its creators (and the CEO of the company that operates the service), Pavel Durov, has for years been suggesting, in a more or less direct manner, that other internet messenger services expose our conversations and endanger our privacy.

It’s a pretty crude, yet surprisingly effective strategy for distracting attention away from Telegram’s very real problems with security and privacy. And there are quite a few of those.

Created
Tue, 02/07/2024 - 11:21

Automatically tagging or filtering child sexual exploitation materials (CSAM) cannot be effective and preserve privacy at the same time, regardless of what kind of tech one throws at it. Because what is and what is not CSAM is highly dependent on context.

Literally the same photo, bit-by-bit identical, can be innocent memorabilia when sent between family members, and a case of CSAM if shared on a child porn group.

The information necessary to tell whether or not an image is CSAM is simply not present in the file being shared. No technical means can make that distinction from the file alone. The current debate about filtering CSAM on end-to-end encrypted messaging services, like all the previous such debates (of which there were many), mostly ignores this basic point.
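The “bit-by-bit identical” point can be made concrete in a few lines. Any scanner that looks only at the file – whether a cryptographic hash, a perceptual hash, or a classifier – receives exactly the same input in both contexts, so it must return the same answer for both (the photo bytes below are a stand-in, not real image data):

```python
import hashlib

# Stand-in bytes for a photo; in reality this would be actual image data.
photo = b"\x89PNG\r\n...pretend these are the bytes of a family photo"

# The same file shared in two different contexts. The bytes - and
# therefore any hash, fingerprint, or classifier input derived from
# them - are identical. The context lives outside the file.
sent_to_grandmother = photo
shared_in_abuse_group = photo

h1 = hashlib.sha256(sent_to_grandmother).hexdigest()
h2 = hashlib.sha256(shared_in_abuse_group).hexdigest()

print(h1 == h2)  # True: a file-only scanner sees no difference at all
```

Whatever technology sits between those two hashes, it cannot recover the one thing that actually distinguishes the cases: who is sharing the file, with whom, and why.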

Created
Wed, 01/11/2023 - 03:59

Jessica Buxbaum investigates the partnership between X and an Israeli tech company run by former intelligence officials, highlighting its potential impact on surveillance, censorship, and digital rights.

The post Identity Verification or Data Exposure? Twitter Using Israeli Tech Firm Headed by Ex-Military Officials to Verify Users appeared first on MintPress News.

Created
Sun, 29/10/2023 - 06:46
Former Labour MP Paul Farrelly explains the circumstances surrounding a new legal investigation into whether members of Parliament’s Digital, Culture, Media and Sport (DCMS) Committee, which looked into phone-hacking and press criminality, were themselves systematically hacked by the Murdoch empire.