Discussion
TikTok won't protect DMs with controversial privacy tech, saying it would put users at risk
Traster: I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app" - as long as there are relatively good options among apps that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it you can move to Snapchat or Signal or whatever platform of your choice.

Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.

In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy', I do think it's on you to also put additional resources into tackling the downsides of that.
khalic: No, saying that E2E encryption makes users _less_ safe is completely dishonest; nothing is fine about this.

The logic of "anything is better than before" is also fallacious.
miki123211: It makes certain users less safe in certain situations.

E2E makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political, not technical, decision, but if we claim that there are any absolutes here, we just make sure that we'll never be taken seriously by anybody who matters.
khalic: Claiming E2E makes children less safe is flat-out dishonest. And the irony of you criticising “absolutes” after trying to pass one off is just delicious.
gzread: What are children at risk of when E2EE is used?

What are children at risk of when E2EE is not used?
roughly: > What are children at risk of, when E2EE is used?

Potential exposure to abusive adults.

> What are children at risk of, when E2EE is not used?

State-sanctioned violence.
roncesvalles: Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation). If it's E2EE, no one except the sender and receiver knows about this conversation. You want an MITM in this case to detect/block such things, or at least keep a record of what's going on for a subpoena.

I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.
philipallstar: Imagine Hamas are your government and want to figure out who's gay. You don't want a MITM in case they can do this.Pick your definition of safe.
trashb: In that case, don't use TikTok DMs to discuss your sexuality. I find it strange that people feel they have to be able to talk about sensitive topics over every interface they can get their hands on.

Similarly, in "traditional" media you would not want to discuss such a private conversation on a radio broadcast. Perhaps you would rather discuss it on the phone or over snail mail, as there is more of an expectation of privacy on those media.
roughly: Right, but it currently isn't a sensitive topic - homosexuality is, as of 2026, broadly legal in the United States. That's a relatively new state of affairs, historically speaking, and one which Afghanistan shared as recently as 2021.
computerex: TikTok is a front for government surveillance, so it's not really surprising that this is their position.
trashb: The government is able to access your conversations, data, and connections with E2EE in place already. I don't see how not having E2EE would affect that ability in any way.
Cider9986: Please provide proof for these claims.
MetaWhirledPeas: > makes users less safe

They don't believe that. It makes it more difficult to deal with governments, is all. Big Brother needs your messages from time to time, and TikTok is not willing to risk getting shut down to argue against that. We can't have pesky principles getting in the way of money.
londons_explore: TikTok has private messaging, and it is used by hundreds of millions of people.

IMO no consumer service should have private 1:1 messaging without E2E. Either do only public messaging (i.e., like a forum), or implement E2E.
RobotToaster: TikTok has direct messages; they don't even call them private.

It's better that they're honest about this; nobody should believe for a second that WhatsApp or FB messages are truly E2EE.

DMs on social media shouldn't be used for anything remotely private. They're a convenience feature, nothing more.
throwaway290: > nobody should believe for a second that WhatsApp or FB messages are truly E2EE

That's interesting. You think all the firms that audited WhatsApp and the Signal protocol it uses, and all the programmers who worked there for decades and could spot a lie and leak it if there were one, are all crooks? A valid opinion, I guess, but I wouldn't call it "nobody should believe for a second".

(Curious that you didn't mention Telegram; it is actually marketed as secure and E2E, and it has completely gimped "secret chats" that are off by default and used by almost nobody.)
max-privatevoid: I'll believe it when it's FOSS
quotemstr: It's one thing to make a policy decision I disagree with. It's another to lie, blatantly, to my face about it. But what do you expect from people who bought TikTok specifically so they could add censorship and lied about it being some kind of national security issue?
swiftcoder: > the controversial privacy feature used by nearly all its rivals

"Controversial" according to whom? The NSA / GCHQ?
a13o: Listed in the article are the National Society for the Prevention of Cruelty to Children and the Internet Watch Foundation, which monitors and removes child sexual abuse material from the internet.

The recent Meta lawsuits also mention opposition from the National Center for Missing and Exploited Children and Meta's own executives: Monika Bickert (head of content policy) and Antigone Davis (global head of safety). Both executives mention the danger end-to-end encryption poses to children when attached to a social media graph.

https://www.reuters.com/legal/government/meta-executive-warn...
swiftcoder: > Both executives mention the danger end-to-end encryption poses to children when attached to a social media graph

So the fact that we welded a messaging platform onto a global child-discovery service is bad? Sure. But not encrypting that messaging platform is sort of closing the barn door after the horse has gone walkabout.
oscaracso: It is a considerably larger threat for anonymous strangers to be able to establish private lines of communication with children than for them to know that Lisa Simpson (8) lives in Springfield and attends Springfield Elementary. In terms of discovery, most people are already aware that children can be found in school.
swiftcoder: I don't see how you arrive at that conclusion. The risk in being able to connect to a random victim somewhere in the world appears to be strictly less than being able to target a specific victim in your local geographical area to whom you could gain physical access.

Hence why nobody is up in arms (in either direction) about E2E encryption for Chatroulette.
danlitt: > there is more of an expectation of privacy on those media

What does the "p" in "pm" stand for?
trashb: Excuse me, I confused "private messages" (PMs) with "direct messages" (DMs). I will update the comment above.
danlitt: I don't think you confused anything, except for the terminology the platform uses. There is an obvious expectation of privacy when sending direct messages!
sleepybrett: Hasn't been true ANYTIME IN HISTORY. Hell, it was well understood even by children that no conversation you had on the telephone was truly private. That's why cyphers were invented.
giancarlostoro: I forget if it's WhatsApp that technically lets you sync chats in unencrypted form to iCloud, which is the "loophole" around this. You can lock down your iCloud even tighter, though; I'm not sure Apple can do much if you fully lock down your iCloud, and I'm not sure this has been legally tested. It's not a very advertised feature, it's just a setting.
ianburrell: iCloud backups are encrypted, and can be end-to-end encrypted.

Also, backups have nothing to do with the messages being end-to-end encrypted. Even if you don't use a passcode on the phone, the messages are still encrypted.
eloisant: Honestly, I'm tired of every app trying to become the everything app.

Now TikTok wants to be a messaging app. Snapchat has a short-video feed just like TikTok. WhatsApp only has a text feed; how long until they also add a video feed?
emulatedmedia: All social media should be considered a front for government surveillance
dlev_pika: Particularly true as oligarchs co-opt gov
dlev_pika: L337 Hax0rs, of course
zzo38computer: In my opinion, separate software should be used for the end-to-end encryption than for the communication itself, although there is more to security than programming the computer correctly (such as securely agreeing on the keys and ciphers in person).
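The separation described above can be sketched in a few lines: a standalone tool encrypts the message, and only the resulting ciphertext is pasted into the (untrusted) messaging app. This is a hypothetical illustration using a one-time pad, which is secure only under the assumptions noted in the comments:

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    # One-time pad: information-theoretically secure only if the key is
    # truly random, at least as long as the message, agreed in person
    # beforehand (as the comment above suggests), and never reused.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))   # exchanged in person, kept offline
ciphertext = xor_pad(message, key)

# The ciphertext can now be pasted into any untrusted channel (a TikTok DM,
# an email, a forum post); the channel operator never sees the key.
assert xor_pad(ciphertext, key) == message  # XOR is its own inverse
```

The messaging platform only ever relays opaque bytes, so its own (lack of) encryption becomes irrelevant to confidentiality, though metadata about who talked to whom is still exposed.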
throw0101c: > TikTok has direct messages; they don't even call them private.

It may not be called that, but what are users expecting? Some folks may later be surprised when a warrant gets issued (e.g., by a divorce judge).
giancarlostoro: If you are a grown adult and don't do research on "messaging apps" (which TikTok is not), then that's really on you.
red-iron-pine: 80% of the population does not and will never do that level of deep dive on apps.

Same discussion for any form of technology, be it TVs or changing their car's oil.

The deliberate app-store-ification of all things computer is also designed to keep people from asking those questions -- just download it and install, pleb.

It's why the Zoomers can't email attachments or change file types: all of the computers they grew up with were designed so they never had to understand what happens under the hood.
johnisgood: And I think because of all the handholding we are left worse off.
xeckr: Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.
debazel: Children are just too effective a tool when building a surveillance state. We should have banned children from owning open computers a long time ago, just as we do with alcohol, driving licenses, etc.

Instead, children would own special devices that are locked down and tagged with an "underage" flag when interacting with online services, while adults could continue as normal. We already heavily restrict the freedom of children, so there is plenty of precedent for this. Optionally, we could provide service points to unlock devices when they turn 18, to avoid e-waste as well.

This way it's the point of sale where you provide your ID, instead of attaching it to the hardware itself and sending it out to every single SaaS on the planet to do what they wish.
azinman2: It would be a nightmare to implement and achieve the goal, but I have to say I think it's more right than wrong. All of the data is very clear about the harms.

China has restrictions on social media and screen time for kids — how do they implement this?
iamnothere: Why on earth would we be looking to China as a template on how we should run free societies? Are you mad?
azinman2: Good ideas can come from anywhere. Shutting yourself off only does a disservice. You don't need to replicate 100% of another society to recognize individual strengths.

https://www.technologyreview.com/2023/08/09/1077567/china-ch...

That describes something very similar to what the OP suggested.
maxdo: People seriously discuss privacy in a Chinese app. With all respect, their government will not allow you even a hint of privacy.
smugglerFlynn: > as long as there are relatively good options of apps that do have privacy (and I think there are)

Once you have an enormous network effect like TikTok has, you don't really have any free selection of alternative apps. You are free to use one, but you will be the only sad user over there.

Regulations are needed that would force large platforms like TikTok and Instagram to enable federation, opening them up to actual competition. That way platforms would compete on monetisation and usability, instead of competing on locking in their precious users more tightly.
acheron: “Will we ever end the MySpace monopoly?”

> MySpace is well on the way to becoming what economists call a "natural monopoly". Users have invested so much social capital in putting up data about themselves it is not worth their changing sites, especially since every new user that MySpace attracts adds to its value as a network of interacting people.

> "In social networking, there is a huge advantage to have scale. You can find almost anyone on MySpace and the more time that has been invested in the site, the more locked in people are".

https://www.theguardian.com/technology/2007/feb/08/business....
2OEH8eoCRo0: What safety issue are users most likely to encounter?
tuwtuwtuwtuw: The email protocols would like to have a chat with you.
kgwxd: You can bring your own encryption to that, and bring your own client to automate it.
em-bee: You can encrypt the content but not the metadata, not even the subject, unless you use a customized client that encodes it (like Delta Chat, which doesn't use a subject at all), and even then you still have your email address exposed.

For all intents and purposes, email is not E2EE.
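The point is easy to demonstrate with Python's standard `email` library: even when the body is replaced by ciphertext from some external tool (a stand-in placeholder below), the routing headers still travel in the clear for every relay on the path to read.

```python
from email.message import EmailMessage

# Stand-in ciphertext; in practice this would come from a separate tool
# such as OpenPGP or S/MIME. The placeholder string is hypothetical.
ciphertext = "<opaque encrypted blob goes here>"

msg = EmailMessage()
msg["From"] = "alice@example.com"         # metadata: travels in the clear
msg["To"] = "bob@example.com"             # metadata: travels in the clear
msg["Subject"] = "Why did you not put the trash out?"  # also in the clear
msg.set_content(ciphertext)               # only the body is protected

wire = msg.as_string()  # what every mail relay on the path can read
# Sender, recipient, and subject are all visible despite the encrypted body:
for exposed in ("alice@example.com", "bob@example.com", "trash"):
    assert exposed in wire
```

So even with a perfectly encrypted body, an observer learns who wrote to whom, when, and roughly about what, which is exactly the metadata leak described above.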
Bender: Email encryption is sufficient for most people even if the metadata is exposed. One can simply state in their email "Bing Bing Bong" or "Why did you not put the trash out?", which might mean to the recipient "check the second SFTP server" or "let the cat outside" or "jump on my private Mumble chat server" or "get on my private self-hosted IRC server". The email message need not even be encrypted, for that matter.

The intended payload can be in a header-less encrypted file on a throw-away SFTP server in a tmpfs RAM disk.
tuwtuwtuwtuw: So it's end to end encrypted except that third parties can see who you communicated with and when? Sure.
tuwtuwtuwtuw: I can bring my own encryption to TikTok as well. It has roughly the same usability and usage.
SuperSandro2000: hahaha, good one
gorgoiler: I don’t really understand how we are supposed to believe in E2EE in closed proprietary apps. Even if some trusted auditor confirms they have plumbed in libsignal correctly, we have no way of knowing that their rendering code is free of content-scanning hooks.

We know the technology exists. Apple had it all polished and ready to go for image scanning. I suppose the only thing in which we can place our faith is that it would be such an enormous scandal to be caught in the act that WhatsApp et al. daren’t even try it.

(There is something to be said for E2EE: it protects you against an attack on Meta’s servers. Anyone who gets a shell will have nothing more than random data. Anyone who finds a hard drive in the data centre dumpster will have nothing more than a paperweight.)
AlienRobot: We don't even know if the passwords aren't stored in plain text.
ranyume: This might be off-topic, but on the topic of child safety: I'm surprised people are being myopic about age verification. Age verification should be banned, but people ignore that most widely used online services already ask for your age and act accordingly: Twitter, YouTube, Google in general, any online marketplace. They already have so much data on their users, and they optimize their algorithms for those groups in an opaque way.

So yes, age verification should be taken down, along with the datamining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but are not concerned about what these companies serve them and what they do with their data.
Nursie: > Age verification should be banned

Why?

> They already got so much data on their users

There are a variety of ways (see "Verifiable Credentials") that ages can be verified without handing over any data other than "is old enough" to social media services.
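The "Verifiable Credentials" flow mentioned above can be sketched as follows. This is a deliberately simplified, hypothetical illustration: real Verifiable Credentials use public-key signatures and selective disclosure, not the shared HMAC key used here for brevity. The point is the shape of the flow, in which the service receives only an attested boolean.

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for the issuer's signing key. In a real VC system
# the issuer signs with a private key and anyone can verify with the
# public key; a shared HMAC key is used here only to keep the sketch short.
ISSUER_KEY = b"demo-issuer-signing-key"

def issue_credential() -> dict:
    """The ID checker attests a single boolean: no name, no birthdate."""
    claim = json.dumps({"over_18": True}).encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def service_verifies(cred: dict) -> bool:
    """The social media service learns only 'is old enough'."""
    expected = hmac.new(ISSUER_KEY, cred["claim"], hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["tag"]) \
        and json.loads(cred["claim"])["over_18"]

cred = issue_credential()
assert service_verifies(cred)
assert b"birthdate" not in cred["claim"]  # nothing extra is handed over
```

The ID check happens once, with the issuer; the service never sees the underlying identity document, which is the privacy property Nursie is pointing at.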
shakna: Age verification obliterates anonymity on the internet. If everything you do _can_ be tracked by the government, it _will_ be.

That allows for more effective propaganda and electoral control, and sets fire to the concept of a government _representing_ anyone.
Almondsetat: Ok, and? Presenting your ID at a number of IRL establishments also heavily reduces anonymity.
gschizas: The difference is that IRL establishments don't sell off that data to anyone else, nor do they have the ability to collate that data with data from other establishments to make a profile of you.(at least not yet)
fragmede: You think the nightclub that scans your driver's license magstripe isn't selling your data, when it could be making money off of it? Between PatronScan, Intellicheck, Scantek, and TokenWorks: yeah, the dingy bar where a dude visually checks IDs isn't doing it, but the nightclub with the quick swipe totally is.
shakna: The receiver has a proven and signed bundle that they can attach to an abuse report, so the evidence carries even stronger weight. They can already decrypt the message; they can still report it.
michaelmior: Yes, but that leaves reporting by the minor as the only way to identify this behavior. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted.

I'm not saying no E2E messaging apps should exist, but maybe they don't need to exist for minors on social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent, so that someone has the ability to monitor messages.
danlitt: > I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted

Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People are wary of this sort of thing not because they think law enforcement is more effective when it is constrained, but because how easily crimes can be prosecuted is only one dimension of safety.

> However, an alternative could be allowing the sharing of the encryption key with a parent

Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?
InsomniacL: > Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?

Police can access your home with a warrant. Police cannot access your E2EE DMs with a warrant.
danlitt: Not answering my question!

> Police cannot access your E2EE DMs with a warrant.

They can and do, regularly. What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught. But refusing to give up encryption keys, and suspiciously empty chat histories under a valid warrant, is very good evidence of a crime in itself.

They also can't prevent you from flushing drugs down the toilet, but somehow people are still convicted of drug-related crimes all the time. So yes, obviously the police could prosecute more crimes if we gave up this protection. That's how limitations on police power work.
Tadpole9181: > But refusing to give up encryption keys, and suspiciously empty chat histories under a valid warrant, is very good evidence of a crime in itself.

Uh, it absolutely isn't? WTF dystopian idea is this?
foobarchu: This viewpoint isn't a slippery slope; it's a runaway train.

"You moved into a neighborhood with lead pipes? That's on you, should have done more research."

"Your vitamins contained undisclosed allergens? You're an adult, and it didn't say it DIDN'T contain those."

"Passwords stolen because your provider stored them in plaintext? They never claimed to store them securely, so it's really on you."
AlexandrB: Legislating that everyone must always be safe regardless of what app they use is a one-way ticket to walled gardens for everything. This kind of safety is the rationale behind things like secure boot, Apple's App Store, and remote attestation.

Also consider what this means for open source. No hobbyist can ship an IM app unless they go all the way and E2E-encrypt (and security-audit) the damn thing. The barriers to entry this creates are huge and very beneficial to the already powerful, since they can afford to deal with this stuff from day one.
dvngnt_: > nobody should believe for a second that WhatsApp or FB messages are truly E2EE.

Meta still tracks analytics, which isn't good for privacy, but I'm not aware of any news of them or third parties reading messages without the consent of one of the first parties. Signal is probably much better, though.
chimeracoder: > Meta still tracks analytics which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though

Correct. WhatsApp uses the Signal protocol, and there is zero evidence of them reading message contents except with the consent of one of the users involved (such as a user reporting a message for moderation purposes).

(And before anyone takes issue with that last qualifier: consent from at least one party is the bar for secure communications on any platform, Signal included. If you don't trust the person you are communicating with, no amount of encryption will protect you.)

Discovering a backdoor in WhatsApp for Facebook/Meta to read messages would be a career-defining finding for a security researcher, so it's not like this is some topic nobody has ever thought to investigate.