Discussion
TikTok won't protect DMs with controversial privacy tech, saying it would put users at risk
Traster: I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app", as long as there are relatively good alternative apps that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it you can move to Snapchat or Signal or whatever platform of your choice. Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them. In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy', I do think it's on you to also put additional resources into tackling the downsides of that.
khalic: No, saying that E2E encryption makes users _less_ safe is completely dishonest; nothing is fine about this. The logic of "anything is better than before" is also fallacious.
miki123211: It makes certain users less safe in certain situations. E2E makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political, not technical, decision, but if we claim that there are any absolutes here, we just make sure that we'll never be taken seriously by anybody who matters.
khalic: Claiming e2e makes children less safe is flat out dishonest. And the irony of you criticising “absolutes” after trying to pass one is just delicious.
gzread: What are children at risk of when E2EE is used? What are children at risk of when E2EE is not used?
roughly: > What are children at risk of, when E2EE is used?
Potential exposure to abusive adults.
> What are children at risk of, when E2EE is not used?
State-sanctioned violence.
roncesvalles: Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation). If it's E2EE, no one except the sender and receiver knows about this conversation. You want an MITM in this case to detect/block such things, or at least keep a record of what's going on for a subpoena. I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.
philipallstar: Imagine Hamas is your government and wants to figure out who's gay. You don't want an MITM in that case. Pick your definition of safe.
trashb: In that case don't use TikTok DMs to discuss your sexuality. I think it is strange that people feel they have to be able to talk about sensitive topics over every interface they can get their hands on. Similarly, in "traditional" media you may not want to hold such a private conversation on a radio broadcast. Perhaps you would rather discuss it on the phone or over snail mail, as there is more of an expectation of privacy on those media.
roughly: Right, but it currently isn't a sensitive topic - homosexuality is, as of 2026, broadly legal in the United States. That's a relatively new state of affairs, historically speaking, and one which Afghanistan shared as recently as 2021.
computerex: TikTok is a front for government surveillance, so it's not really surprising that this is their position.
trashb: The government is able to access your conversations, data, and connections with E2EE in place already. I don't see how not having E2EE would affect that ability in any way.
Cider9986: Please provide proof for these claims.
MetaWhirledPeas: "makes users less safe"
They don't believe that. It makes it more difficult to deal with governments, is all. Big Brother needs your messages from time to time, and TikTok is not willing to risk getting shut down to argue against that. We can't have pesky principles getting in the way of money.
londons_explore: TikTok has private messaging, and it is used by hundreds of millions of people. IMO no consumer service should have private 1:1 messaging without E2E. Either do only public messaging (i.e. like a forum), or implement E2E.
RobotToaster: TikTok has direct messages; they don't even call them private. It's better that they're honest about this; nobody should believe for a second that WhatsApp or FB messages are truly E2EE. DMs on social media shouldn't be used for anything remotely private. They're a convenience feature, nothing more.
throwaway290: > nobody should believe for a second that WhatsApp or FB messages are truly E2EE
That's interesting. You think all the firms that audited WhatsApp and the Signal protocol it uses, and all the programmers who worked there for decades and could see a lie and leak it if it were true, are all crooks? A valid opinion I guess, but I wouldn't call it "nobody should believe it for a second". (Curious you didn't mention Telegram, which is actually marketed as secure and E2E and has completely gimped "secret chats" that are off by default and used by almost nobody.)
max-privatevoid: I'll believe it when it's FOSS
quotemstr: It's one thing to make a policy decision I disagree with. It's another to lie, blatantly, to my face about it. But what do you expect from people who bought TikTok specifically so they could add censorship and lied about it being some kind of national security issue?
swiftcoder: > the controversial privacy feature used by nearly all its rivals"controversial" according to who? The NSA / GCHQ?
a13o: Listed in the article are the National Society for the Prevention of Cruelty to Children and the Internet Watch Foundation, which monitors and removes child sexual abuse material from the internet. The recent Meta lawsuits also mention opposition from the National Center for Missing and Exploited Children and Meta's own executives: Monika Bickert (head of content policy) and Antigone Davis (global head of safety). Both executives mention the danger end-to-end encryption poses to children when attached to a social media graph.
https://www.reuters.com/legal/government/meta-executive-warn...
swiftcoder: > Both executives mention the danger end-to-end encryption poses to children when attached to a social media graph
So the fact that we welded a messaging platform onto a global-child-discovery-service is bad? Sure. Not encrypting that messaging platform is sort of closing the barn door after the horse has gone walkabout.
oscaracso: It is a considerably larger threat for anonymous strangers to be able to establish private lines of communication with children than for them to know that Lisa Simpson (8) lives in Springfield and attends Springfield Elementary. In terms of discovery, most people are already aware that children can be found in school.
swiftcoder: I don't see how you arrive at that conclusion? The risk in being able to connect to a random victim somewhere in the world appears to be strictly less than being able to target a specific victim in your local geographical area to whom you could gain physical access. Hence why nobody is up in arms (in either direction) about E2E encryption for Chatroulette.
danlitt: > there is more of an expectation of privacy on those media
What does the "p" in "pm" stand for?
trashb: Excuse me, I confused "private messages" (PM) with "direct messages" (DM). I will update above.
danlitt: I don't think you confused anything, except for the terminology the platform uses. There is an obvious expectation of privacy when sending direct messages!
sleepybrett: Hasn't been true ANYTIME IN HISTORY. Hell it was well understood even by children that no conversation you had on the telephone was truly private. That's why cyphers were invented.
giancarlostoro: I forget if it's WhatsApp that technically lets you sync chats in unencrypted form to iCloud, which is the "loophole" around this. You can lock down your iCloud even tighter, though; I'm not sure Apple can do much if you fully lock it down, and I'm not sure this has been legally tested. It's not a well-advertised feature, just a setting.
ianburrell: iCloud backups are encrypted, and can be end-to-end encrypted. Also, backups have nothing to do with the messages being end-to-end encrypted. Even if you don't use a passcode on the phone, the messages are still encrypted.
eloisant: Honestly, I'm tired of every app trying to become the everything app. Now TikTok wants to be a messaging app. Snapchat has a short-video feed just like TikTok. WhatsApp only has a text feed; how long until they also add a video feed?
emulatedmedia: All social media should be considered a front for government surveillance
dlev_pika: Particularly true as oligarchs co-opt gov
dlev_pika: L337 Hax0rs, of course
zzo38computer: In my opinion, the end-to-end encryption should be handled by software separate from the communication software, although there are other things to do for security besides programming the computer correctly (such as securely agreeing on keys and ciphers in person).
throw0101c: > Tiktok has direct messages, they don't even call them private.
It may not be called that, but what are users expecting? Some folks may later be surprised when a warrant gets issued (e.g., from a divorce judge).
giancarlostoro: If you are a grown adult and don't do research on "messaging apps" (which TikTok is not), then that's really on you.
red-iron-pine: 80% of the population does not and will never do that level of deep dive on apps. Same discussion for any form of technology, be it TVs or changing their car's oil. The deliberate app-store-ification of all things computer is also designed to keep people from asking those questions: just download and install, pleb. It's why the Zoomers can't email attachments or change file types: all of the computers they grew up with were designed so they never had to understand what happens under the hood.
johnisgood: And I think because of all the handholding we are left worse off.
xeckr: Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.
debazel: Children are just too effective a tool when building a surveillance state. We should have banned children from owning open computers a long time ago, just like we do with alcohol, driving licenses, etc. Instead, children would own special devices that are locked down and tagged with an "underage" flag when interacting with online services, while adults could continue as normal. We already heavily restrict the freedom of children, so there is plenty of precedent for this. Optionally, we could provide service points to unlock devices when they turn 18, to avoid e-waste as well. This way it's the point of sale where you provide your ID, instead of attaching it to the hardware itself and sending it out to every single SaaS on the planet to do with as they wish.
azinman2: It would be a nightmare to implement and to achieve the goal, but I have to say I think it's more right than wrong. All of the data is very clear about the harms. China has restrictions on social media and screen time for kids — how do they implement this?
iamnothere: Why on earth would we be looking to China as a template on how we should run free societies? Are you mad?
azinman2: Good ideas can come from anywhere. Shutting yourself off only does a disservice. You don't need to replicate 100% of another society to recognize individual strengths.
https://www.technologyreview.com/2023/08/09/1077567/china-ch...
That describes something very similar to what the OP suggested.
maxdo: People seriously discuss privacy in a Chinese app. With all respect, their government will not allow you even a hint of privacy.
smugglerFlynn: > as long as there are relatively good options of apps that do have privacy (and I think there are)
Once you have an enormous network effect like TikTok has, you don't really have any free selection of alternative apps. You are free to use one, but you will be the only sad user over there. Regulations are needed that would force large platforms like TikTok and Instagram to enable federation, opening them up to actual competition. That way platforms would compete on monetisation and usability, instead of competing on locking in their precious users more strictly.
acheron: “Will we ever end the MySpace monopoly?”
> MySpace is well on the way to becoming what economists call a "natural monopoly". Users have invested so much social capital in putting up data about themselves it is not worth their changing sites, especially since every new user that MySpace attracts adds to its value as a network of interacting people.
> "In social networking, there is a huge advantage to have scale. You can find almost anyone on MySpace and the more time that has been invested in the site, the more locked in people are".
https://www.theguardian.com/technology/2007/feb/08/business....
2OEH8eoCRo0: What safety issue are users most likely to encounter?
tuwtuwtuwtuw: The email protocols would like to have a chat with you.
kgwxd: You can bring your own encryption to that, and bring your own client to automate it.
em-bee: You can encrypt the content but not the metadata, not even the subject, unless you use a customized client that encodes it (like Delta Chat, which doesn't use a subject at all), but even then your email address is still exposed. For all intents and purposes, email is not E2EE.
Bender: Email encryption is sufficient for most people even if the metadata is exposed. One can simply write in an encrypted email "Bing Bing Bong" or "Why did you not put the trash out?", which might mean to the recipient "check the second SFTP server" or "let the cat outside" or "jump on my private Mumble chat server" or "get on my private self-hosted IRC server". The email message need not even be encrypted, for that matter. The intended payload can be in a header-less encrypted file on a throw-away SFTP server in a tmpfs RAM disk.
tuwtuwtuwtuw: So it's end to end encrypted except that third parties can see who you communicated with and when? Sure.
tuwtuwtuwtuw: I can bring my own encryption to tiktok as well. Has roughly the same usability and usage.
SuperSandro2000: hahaha, good one
gorgoiler: I don't really understand how we are supposed to believe in E2EE in closed proprietary apps. Even if some trusted auditor confirms they have plumbed in libsignal correctly, we have no way of knowing that their rendering code is free of content-scanning hooks. We know the technology exists: Apple had it all polished and ready to go for image scanning. I suppose the only thing in which we can place our faith is that it would be such an enormous scandal to be caught in the act that WhatsApp et al. daren't even try it. (There is something to be said for E2EE: it protects you against an attack on Meta's servers. Anyone who gets a shell will have nothing more than random data. Anyone who finds a hard drive in the data centre dumpster will have nothing more than a paperweight.)
AlienRobot: We don't even know if the passwords aren't stored in plain text.
ranyume: This might be off-topic, but on-topic about child safety... I'm surprised people are being myopic about age verification. Age verification should be banned, but people ignore that nowadays most widely used online services already ask for your age and act accordingly: Twitter, YouTube, Google in general, any online marketplace. They already got so much data on their users and optimize their algorithms for those groups in an opaque way. So yeah, age verification should be taken down, as well as the datamining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but not about what companies serve them and what they do with their data.
Nursie: > Age verification should be bannedWhy?> They already got so much data on their usersThere are a variety of ways (see "Verifiable Credentials") that ages can be verified without handing over any data other than "Is old enough" to social media services.
shakna: Age verification obliterates anonymity on the internet. If everything you do _can_ be tracked by the government, it _will_ be. That allows for more effective propaganda and electoral control, and sets fire to the concept of a government _representing_ anyone.
Almondsetat: OK, and? Presenting your ID at a number of IRL establishments also heavily reduces anonymity.
gschizas: The difference is that IRL establishments don't sell off that data to anyone else, nor do they have the ability to collate that data with data from other establishments to make a profile of you.(at least not yet)
fragmede: If you think the nightclub that scans your driver's license magstripe isn't selling your data when it could be making money off it, think again: between PatronScan, Intellicheck, Scantek, and TokenWorks, there are plenty of buyers. Yeah, a dingy bar where a dude visually checks IDs isn't selling it, but a nightclub doing a quick swipe totally is.
shakna: The receiver has a proven and signed bundle, that they can upload to the abuse report. So the evidence has even stronger weight. They can already decrypt the message, they can still report it.
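A rough sketch of how such a proven, signed bundle can work. This mirrors the "message franking" idea used by some E2EE messengers; all names here are hypothetical, and a bare HMAC stands in for the full committing-AEAD construction a real deployment would use.

```python
import hashlib
import hmac
import secrets

# Sender derives a one-off franking key, tags the plaintext with it, and
# sends the tag alongside the ciphertext. The platform stores only the tag
# at delivery time, never seeing the plaintext or the key.
def frank(plaintext: bytes) -> tuple[bytes, bytes]:
    franking_key = secrets.token_bytes(32)
    tag = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
    return franking_key, tag  # the key travels inside the E2EE envelope

# On report, the receiver reveals the plaintext plus the franking key;
# the platform recomputes the tag and checks it against the one it stored,
# proving the reported message is what was actually delivered.
def verify_report(plaintext: bytes, franking_key: bytes, stored_tag: bytes) -> bool:
    expected = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
    return hmac.compare_digest(expected, stored_tag)
```

The platform never gains the ability to read messages; it only gains the ability to check, after a voluntary report, that the report wasn't forged.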
michaelmior: Yes, but this leaves reporting by a minor as the only way to identify this behavior. I'm not saying I trust TikTok to only do good things with access to DMs, but I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted. I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent, so that someone has the ability to monitor messages.
danlitt: > I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted
Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People are paranoid about this sort of thing not because they doubt law enforcement would be more effective without the constraint. But how easily crimes can be prosecuted is only one dimension of safety.
> However, an alternative could be allowing the sharing of the encryption key with a parent
Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?
InsomniacL: > Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?
Police can access your home with a warrant. Police cannot access your E2EE DMs with a warrant.
danlitt: Not answering my question!
> Police cannot access your E2EE DMs with a warrant.
They can and do, regularly. What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught. But refusing to give up encryption keys and suspiciously empty chat histories with a valid warrant is very good evidence of a crime in itself. They also can't prevent you from flushing drugs down the toilet, but somehow people are still convicted of drug-related crimes all the time. So yes, obviously, the police could prosecute more crimes if we gave up this protection. That's how limitations on police power work.
Tadpole9181: > But refusing to give up encryption keys and suspiciously empty chat histories with a valid warrant is very good evidence of a crime in itself.
Uh, it absolutely isn't? WTF dystopian idea is this?
foobarchu: This viewpoint isn't a slippery slope, it's a runaway train.
"You moved into a neighborhood with lead pipes? That's on you, should have done more research."
"Your vitamins contained undisclosed allergens? You're an adult, and it didn't say it DIDN'T contain those."
"Passwords stolen because your provider stored them in plaintext? They never claimed to store them securely, so it's really on you."
AlexandrB: Legislating that everyone must always be safe regardless of what app they use is a one-way ticket to walled gardens for everything. This kind of safety is the rationale behind things like secure boot, Apple's App Store, and remote attestation.Also consider what this means for open source. No hobbyist can ship an IM app if they don't go all the way and E2E encrypt (and security audit) the damn thing. The barriers of entry this creates are huge and very beneficial for the already powerful since they can afford to deal with this stuff from day one.
dvngnt_: > nobody should believe for a second that WhatsApp or FB messages are truly E2EE.Meta still tracks analytics which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though
chimeracoder: > Meta still tracks analytics which isn't good for privacy, but I'm not aware of any news of them or 3rd parties reading messages without consent of one of the 1st parties? Signal is probably much better though
Correct. WhatsApp uses the Signal protocol, and there is zero evidence of them reading message contents except with the consent of one of the users involved (such as a user reporting a message for moderation purposes). (And before anyone takes issue with that last qualifier, consent from at least one party is the bar for secure communications on any platform, Signal included. If you don't trust the person you are communicating with, no amount of encryption will protect you.) Discovering a backdoor in WhatsApp for Facebook/Meta to read messages would be a career-defining finding for a security researcher, so it's not like this is some topic nobody has ever thought to investigate.
tayo42: I can't tell if this is sarcasm or not
keybored: It’s about 15 years ahead of its time. Too enlightened for most.
keybored: The most important principle in the modern age is the freedom to prey on wallets. You can't give parents tools to conveniently restrict what their children do. Impressionable minds ought to live in a Lord of the Flies state where they are bombarded with stuff to nag their parents about, giving them FOMO about what their friends have that they don't. That's why children must be free.
hogwasher: Most people couldn't tell you how their car works, at least not enough to fix it. Is that handholding, too? People can't be knowledgeable about everything. There's just too much information in the world, too many different skills that could be learned, and not enough time. A carpenter can rely on power tools without fully understanding how the tools work, and it's fine, as long as the tools are made to safe standards and the user understands basic safety instructions (e.g. wear protective eyewear). To me, making sure that apps don't screw with people, even if they don't understand how the apps work, is roughly the equivalent of making sure power drills are made safely so they don't explode in people's hands.
pixl97: > Regulations are needed
Lolololol. No, not regulations. Regulators. With the people we currently have voted into office in the US, the only regulations we are going to get are ones saying Sam and Peter must look at everything you do all the time. Until we stop voting for more authoritarianism, expect ever-increasing amounts of authoritarianism.
hogwasher: I think it was clear what they meant.
hogwasher: Sure, but then everyone moved to Facebook. The monopolist changed, but not the monopolistic market and the lack of consumer choice. And nobody gained privacy in the process (I rather think everyone lost even more of it). The situation currently permits only a tiny number of winning companies at a time, and the userbase is locked in even as the site becomes wildly unpopular, until some threshold of discontent is reached, and then everyone moves, and then that new site also enshittifies and the cycle repeats. Federation is a mechanism whereby people would be able to actually choose providers as individuals and at any time, instead of having to wait years for a critical mass of upset people to build up and leave [current most popular social media site], and instead of being forced to go to [new most popular social media site].
hogwasher: Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case they depict a nude minor? Because no matter how you do that, it would produce false positives, and either unfair auto-bans and erroneous reports to law enforcement (if no human views the images), or human employees viewing other adults' consensual nudes that were meant to be private. Or it would result in adult employees viewing nudes sent from one minor to another, which would also be a major breach of those minors' privacy. There is a program whereby police can generate hashes of known CSAM images, and those hashes can be automatically compared against the hashes of photos uploaded to websites, so as to identify known CSAM images without any investigator having to actually view them and further infringe on the victim's privacy. But that only works against already-known images, and it can be done automatically whenever an image is uploaded, prior to encryption. The encryption doesn't prevent it. Point being, disallowing encryption sacrifices a lot, while potentially not even being that useful for catching child abusers in practice. I'm sure some offenders could be caught this way, but it would also cause so many problems itself.
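The hash-comparison program described above can be sketched roughly like this. It's an illustrative toy: real deployments use perceptual hashes (e.g. PhotoDNA) that survive resizing and re-encoding, whereas plain SHA-256 only matches byte-identical files, and the "known" entry here is just the hash of a stand-in byte string.

```python
import hashlib

# Hypothetical set of hex digests distributed by an authority.
# Real systems use perceptual hashes robust to re-encoding;
# plain SHA-256 is used here purely for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"test").hexdigest(),  # stand-in for a known image
}

def matches_known_material(upload: bytes) -> bool:
    """Compare an upload's hash against the known list, client-side,
    before the content is encrypted for transport."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES
```

Because the comparison can run on the sender's device before encryption, this kind of matching is compatible with E2EE, which is exactly the commenter's point.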
eru: Parents are already allowed to restrict their children access to 'dangerous' things like open computers or knives.
thaumasiotes: I don't think debazel was saying that children should have been banned from owning computers for the benefit of the children. He was saying that children should have been banned from owning computers so that the government would have no excuse to regulate what's allowed on computers.
eru: Well, it didn't work for alcohol and tobacco: in addition to being banned for children in many jurisdictions they are still heavily regulated.
LoganDark: Monitoring children's DMs is the responsibility of the parents, not megacorps. If a parent wants to install a keylogger or screen recorder on their child's PC, that's their decision. But Google should not be able to. Neither should... literally anyone else except maybe an employer on a work-provided device.
ranyume: > Monitoring children's DMs is the responsibility of the parents, not megacorps
Absolutely. But what responsibilities do megacorps have? Right now, everyone seems to avoid this question and make do with megacorps not being responsible. This means: "we'll allow megacorps to be as they are and not take any responsibility for the effects they cause on society". Instead of them taking responsibility, we're collecting everyone's data and calling it a day by banning children from social networks... and this is because there are many interests involved (not related to child development and safety).
acuozzo: > But what responsibilities do megacorps have? Right now, everyone seems to avoid this question
Clear, simple, direct: whatever was required of The Bell Telephone Company and nothing more.
da_chicken: So there should be a human operator manually gatekeeping every individual request to connect with another endpoint? It's a good thing those human operators couldn't listen in to whichever conversation they wanted.
acuozzo: Human operators were not required of The Bell Telephone Company by law. Bell switched to mechanical switching stations as soon as doing so was economically advantageous. (Reconsider my post. I'm arguing for no regulation.)
da_chicken: You're arguing for no regulation and your example is one of the most oppressive and stifling monopolies in American history?
NoahZuniga: > What they can't do is prevent you from deleting your DMs if you know you're under investigation and likely to be caught
If you are pretty confident you're under investigation, then this might be obstruction of justice, and that's pretty illegal.
londons_explore: Doesn't have to be a law. Can just be standard engineering practice. WebSockets, for example, are always encrypted (though not E2E). That means anyone who implements a chess game over WebSockets gets encryption at no extra effort. We just need E2E to be just as easy. For example, imagine a new kind of Unicode string that is encrypted. Your application just deals with 'Unicode' strings and the OS handles encryption and decryption for you, including when you send those strings over the network to others.
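A toy sketch of that "encrypted string" idea, assuming the OS (here, a small library) holds the cipher: the application passes around an opaque SealedText value, and only a key holder can read it. The keystream below is a throwaway SHA-256 construction for illustration only; a real implementation would use an audited AEAD from libsodium or the Signal protocol.

```python
import hashlib
import secrets
from dataclasses import dataclass

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream built from SHA-256.
    # For illustration only; real code would use an audited cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

@dataclass(frozen=True)
class SealedText:
    """An opaque 'encrypted string': apps pass it around freely,
    but only holders of the key can read the contents."""
    nonce: bytes
    ciphertext: bytes

    @classmethod
    def seal(cls, plaintext: str, key: bytes) -> "SealedText":
        data = plaintext.encode("utf-8")
        nonce = secrets.token_bytes(16)
        ks = _keystream(key, nonce, len(data))
        return cls(nonce, bytes(a ^ b for a, b in zip(data, ks)))

    def reveal(self, key: bytes) -> str:
        ks = _keystream(key, self.nonce, len(self.ciphertext))
        return bytes(a ^ b for a, b in zip(self.ciphertext, ks)).decode("utf-8")

key = secrets.token_bytes(32)
msg = SealedText.seal("hello over the wire", key)
assert msg.reveal(key) == "hello over the wire"
```

The point of the sketch is the API shape: if the platform's string type were sealed by default, application code would get E2E "for free" the way WebSocket apps get transport encryption for free.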
throw0101c: > Most people couldn't tell you how their car works […]
Most people couldn't tell you how their furnace or water heater works, or a flush toilet (the siphonic effect).
beeflet: Federation would never work. How would it work here? Either you force TikTok to give pageviews to federations of spam, or you let TikTok decide which federations to work with, which essentially results in no federation.
smugglerFlynn: Nobody stops spammers from creating websites, but we still have search engines and the web. Nobody stops spammers from sending emails, but we still use SMTP. It is just a matter of the tools we build to rank and filter content. With open protocols, platforms can actually compete on antispam tools, among other features.
smugglerFlynn: I would argue the only thing that stops the current situation from snowballing into something much worse is pre-existing institutions and regulations. That's also why dismantling and challenging these is often the very first priority for authoritarian actors.
pixl97: > I'm not aware of any news of them
Yet. Until they say "We delete these messages after X time and they are gone gone, and we're not reading them", assume they are reading them, or will read them and the information just hasn't gotten out yet. I mean, we keep finding more and more cases where companies like FB and Google were reading messages years ago, and it wasn't until now that we found out.
nindalf: > We delete these messages after X time
They never had the plaintext of the messages in the first place, so they don't need to delete them. That's what end-to-end encrypted means.
pixl97: They don't need the plaintext if they have your key. Since they wrote the application you have zero clue if they do or not.
greatgib: They said "less safe" but they didn't say less safe for whom. It is obviously less safe for dictatorial governments, and so they can't tolerate that...
michaelmior: > Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant?This is a false equivalency. I don't have to use TikTok DMs if I want E2EE. I don't have a choice about laws that allow the police to violate my rights. I'm not claiming that all E2EE apps should be banned.> Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?Exactly why I suggested that as a possible alternative.
michaelmior: > Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case it depicts a nude minor?No, I was not suggesting that.
dijit: I've been making this argument for a long time, and it's never popular. People want to believe in E2EE; it's almost like a religion at this point. Protecting people is treated as synonymous with E2EE, even if you can't verify it and it could potentially be broken. I was even more controversial and singled out Signal as an example: https://blog.dijit.sh/i-don-t-trust-signal/
autoexec: There are good reasons not to trust Signal. The very first line of their privacy & terms page says "Signal is designed to never collect or store any sensitive information", but then they started collecting and permanently storing sensitive user data in the cloud and never updated that page. Much more recently they started collecting and storing message content in the cloud for some users, but they still refuse to update that page. I'm pretty sure it's a big fat dead canary warning users away from Signal. Any service that markets itself to whistleblowers and activists and then outright lies to them about the risks they take when using it can't be trusted for anything.
EmbarrassedHelp: Encryption is "controversial" according to the BBC's "reporters"
EmbarrassedHelp: > I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.
The problem with that idea is that it implies E2EE should require age verification. Everyone should have access to secure end-to-end encryption.
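Mechanically, the parent-access idea being debated here is ordinary multi-recipient encryption: the message body is encrypted once under a random content key, and that content key is wrapped separately for each authorized reader (the recipient, and optionally a parent). A toy Python sketch of that access structure, with XOR standing in for real primitives such as AES-GCM and AES-KW; every name is illustrative and this is deliberately not secure crypto:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # Toy stand-in for a real cipher / key-wrap (e.g. AES-KW).
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_for_readers(plaintext: bytes, reader_keys: dict) -> dict:
    content_key = secrets.token_bytes(32)
    # Toy "encryption" of the body under the content key (repeated-key XOR).
    keystream = (content_key * (len(plaintext) // 32 + 1))[:len(plaintext)]
    ciphertext = xor_bytes(plaintext, keystream)
    # Wrap the content key once per authorized reader (recipient, parent, ...).
    wrapped = {name: xor_bytes(content_key, key) for name, key in reader_keys.items()}
    return {"ciphertext": ciphertext, "wrapped_keys": wrapped}

def decrypt_as(reader: str, reader_key: bytes, envelope: dict) -> bytes:
    # Unwrap the content key with this reader's key, then decrypt the body.
    content_key = xor_bytes(envelope["wrapped_keys"][reader], reader_key)
    ct = envelope["ciphertext"]
    keystream = (content_key * (len(ct) // 32 + 1))[:len(ct)]
    return xor_bytes(ct, keystream)

keys = {"recipient": secrets.token_bytes(32), "parent": secrets.token_bytes(32)}
env = encrypt_for_readers(b"meet at 5", keys)
assert decrypt_as("recipient", keys["recipient"], env) == b"meet at 5"
assert decrypt_as("parent", keys["parent"], env) == b"meet at 5"
```

The point of the sketch is only that adding a parent as a reader is an explicit, per-conversation choice in the envelope, which is a very different design from escrowing keys with the platform itself.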
derbOac: It's a kind of Trojan horse propaganda, in my opinion. Users get used to the argument with TikTok and then apply it to other platforms. Put it this way: why wouldn't those same arguments apply to any platform (if you believed them)?
Nursie: > Age verification obliviates anonymity on the internet.
How so? Please explain in detail, because there are already schemes such as "verifiable credentials" which allow people to prove they are of age without handing over ID to online services.
afiori: because most implementations are not going to be like that.
Nursie: In the context of "Age verification should be banned", though, we're already talking about legislative intervention. If there's no particular problem with schemes that are like that then we don't necessarily need a blanket ban on age verification.
Perhaps what we're really saying is "Ban age verification that collects lots of personal information".
Or perhaps we could distil it down further to "Ban unnecessary collection and storage of PII". In which case, congrats! You've arrived back at the GDPR :)
Which I think is a good thing, and should be strengthened further.
(Also, the other response to "because most implementations are not going to be like that" is "why not?". People are already building such ecosystems.)
AnthonyMouse: > If there's no particular problem with schemes that are like that then we don't necessarily need a blanket ban on age verification.
There is a problem with schemes like that.
The way computer security works is, attacks always get better, they never get worse. A scheme that nobody has found any privacy holes in when it's enacted will have one found a week after.
The way governments work is, the compromise bill passes if the people who care about privacy support it, because then it has the votes of the people who care about privacy and the people who want to ID everyone. But then, when the vulnerability is found, the people who care about privacy can't get it fixed, because they can't pass a new bill without also having the votes of the people who want to ID everyone, and those people already have what they want. More specifically, many of them then have what they really want, which is to invade everyone's privacy, as they were hoping to do once the vulnerability was found.
Which means you need it to be perfect the first time or it's already ossified and can't be fixed. But the chances of that happening in practice are zero, which means it needs to not happen at all.
Nursie: > There is a problem with schemes like that.
/goes on to discuss how government legislation of specific schemes is the issue, not the schemes themselves
Then we don't legislate specific schemes? The GDPR doesn't do that, for instance; it spells out responsibilities and penalties but doesn't say "Thou shalt use this specific algorithm".
Remember, this discussion started with a call to ban all age checks, which is itself a government action and a restriction on the agency of private business.
There are ways that private entities can implement age checks both securely and without leaking much other information, so it seems very heavy-handed to ban them. Private entities are already building such systems between themselves, without government mandates on the specifics.
AnthonyMouse: > Then we don't legislate specific schemes?
Except that you have to in this case, because IDs are issued by the government, and then it's the government having to provide some privacy-protecting means of using them, which is the thing they're incapable of in practice.
> There are ways that private entities can implement age checks both securely and without leaking much other information
I have yet to see a single one implemented in real life. Moreover, private entities have the perverse incentive to do the opposite: they find it profitable to track people, or find it unprofitable to spend the resources necessary to prevent themselves from being infiltrated by foreign governments, when their business is the sort that's useful to those governments, as these are.
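The "verifiable credentials" approach mentioned earlier in the thread can be sketched minimally: an issuer attests to a single boolean claim, and any verifier checks the attestation without ever seeing identity data. In this stdlib-only Python toy, HMAC stands in for the asymmetric signature (e.g. Ed25519) a real issuer would use so that verifiers hold only a public key; all names here are hypothetical:

```python
import hmac
import hashlib
import json
import secrets

# Hypothetical issuer (e.g. an ID authority). Real schemes use asymmetric
# signatures; a shared-secret HMAC stands in to keep the sketch dependency-free.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_credential(over_18: bool) -> dict:
    # The credential carries ONLY the boolean claim and a nonce -- no name,
    # no birthdate, no ID number -- so presenting it reveals nothing else.
    claim = {"over_18": over_18, "nonce": secrets.token_hex(8)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_age_credential(cred: dict) -> bool:
    # Check the issuer's attestation, then the single claim it covers.
    payload = json.dumps(cred["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["tag"]) and cred["claim"]["over_18"]
```

The sketch illustrates the data-minimization property both sides are arguing about: the verifier learns one bit plus "the issuer vouches for it", and nothing that identifies the holder. Whether real deployments actually stay this minimal is exactly the open question in the thread.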
techpression: I don’t remember reading about ads in phone calls, nor the complete mapping of customers behaviors to use in contexts not being the phone call.The apples to oranges in this comparison is probably top five on HN ever.
acuozzo: > nor the complete mapping of customers behaviors to use in contexts not being the phone call
This is because the telephone system was regulated with wiretapping laws, among others.
> I don’t remember reading about ads in phone calls
See above, but also: junk faxes and telemarketing/robocalls.
> The apples to oranges in this comparison is probably top five on HN ever.
It all comes down to whether you view social media as a communications platform or a publishing platform.
iso1631: Whatever was required of the New York Times, and nothing more. If the NYT publishes an advert or editorial, it's held accountable for the contents.
acuozzo: Touché!The question is: Are social media services more similar to communications platforms or publishing platforms?My reply obviously treats them like the former and yours like the latter.
ranyume: I'd say that at minimum social networks need to be required to show how their algorithms work and allow users control over their data. Users must be able to know why a piece of content was served to them. Nowadays social networks are so pervasive in society, affecting it and molding it to unknown interests, that this is the bare minimum for a free society. Ideally, users should be able to modify the algorithm so they can get just what they want, while simultaneously maximizing free speech. If something isn't illegal, it shouldn't be hidden or removed.
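The transparency-and-control idea in that comment could look something like this minimal, purely hypothetical sketch: the ranking weights live in user-editable settings rather than platform internals, and every served item carries a human-readable reason:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    id: str
    topic: str
    likes: int
    is_ad: bool = False

# Hypothetical user-controlled settings: the user, not the platform, decides
# how much each signal counts, and can zero anything out (e.g. ads).
@dataclass
class FeedSettings:
    topic_boost: dict = field(default_factory=dict)  # topic -> extra score
    likes_weight: float = 1.0
    ad_weight: float = 0.0  # 0.0 hides ads entirely

def rank(posts, settings):
    ranked = []
    for p in posts:
        score = settings.likes_weight * p.likes
        score += settings.topic_boost.get(p.topic, 0)
        if p.is_ad:
            score *= settings.ad_weight
        # "Why was this served?" -- every item carries its own explanation.
        why = (f"likes={p.likes} x weight={settings.likes_weight}, "
               f"topic_boost={settings.topic_boost.get(p.topic, 0)}")
        ranked.append((score, p, why))
    ranked.sort(key=lambda t: t[0], reverse=True)
    return [(p, why) for score, p, why in ranked if score > 0]
```

For example, a user who boosts "cats" and zeroes out ads gets a feed where every item's score is reproducible from their own settings. Real ranking systems are vastly more complex, but the structural point (inspectable inputs, user-owned weights, per-item explanations) is what the comment is asking for.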
drnick1: > Nowadays social networks are so pervasive in society, affecting it and molding it to unknown interestsI think this is the real issue. We should free ourselves from "social networks" such as Tiktok, Facebook, Instagram and others. Even with direct messages truly E2EE, they create countless other privacy problems. They enable surveillance of people at scale and should be completely shunned for that reason alone.
acuozzo: So does (and did!) the telephone system.
LZ_Khan: It's fine except for their argument that it makes people less safe. If they want to disallow encryption they don't need to lie to people while they're at it.
nradov: Lots of other consumer services such as Strava have direct messaging without e2e encryption. No privacy is guaranteed. This is fine, they're not deceiving anyone about how it works.
iso1631: Posts made are like the letters page. But even if you don't believe that should be controlled, when it comes to the adverts they publish, surely it has to be the latter.
gzread: Yes, that is a fair argument and most countries allow the use of surveillance cameras in public for this reason.
danlitt: in public is the operative word (and surveillance cameras in public are extremely recent and very controversial, so not as strong an argument as you might be thinking)
danlitt: > This is a false equivalency.
I'm not making an equivalency. I'm just trying to get you to think about how something that is true at surface level is not necessarily a "fair argument".
> I don't have to use TikTok DMs if I want E2EE.
I don't know why you think this is a convincing argument. It is currently illegal to tap people's phone lines, but when phones were invented it obviously was not illegal. It became illegal in part because people had a reasonable expectation of privacy when using the phone. They also have a reasonable expectation of privacy when using TikTok DMs - that's why people call them "private messages" so often!
> Exactly why I suggested that as a possible alternative.
My point is that you are offering these as alternatives when they are profoundly different proposals. It is like me saying I am pro forced sterilization and then offering as an alternative "we could just only allow it when people ask for it". That's a completely different thing! Having autonomy over your online life as a family rather than necessarily as an individual is totally OK. Surrendering that autonomy is not.
ThoAppelsin: DMs are akin to private conversations in real life. Thus, every DM feature should entail E2EE. It’s OK for a platform to not feature private conversations. They should just have no DM feature at all, then; make all messages publicly visible. Private conversations are indeed not for all ages. Parents should be able to grant access to them on an individual basis.
kreco: > They should just have no DM feature at all, then; make all messages publicly visible.
This makes no sense. I can discuss something in a bar which is not a very private conversation; I wouldn't care if someone else heard what I'm saying. But I also don't want someone to record it and post it on the internet to be seen by the whole world. Privacy is not just a boolean you toggle somewhere.
93po: In a bar you're not speaking directly into a microphone that is permanently saving everything you say for later instant access by every government and advertising agency that wants to prosecute you or invade your privacy to sell you something
kreco: Exactly. You didn't mention the fact that my mom cannot access the recording of my microphone. That's what ThoAppelsin is proposing. It should be fairly implicit that if you are using a free product from a private company, you are the product. However, it's definitely not implicit that everything I do on the platform will become publicly known by everyone else.
technofiend: I once publicly stated it's understandable that someone would post an ad that says "No YouTubers" because people don't want to be content for others. The reply I got was "but you're being recorded all the time anyway", as if those are remotely related.
johnisgood: And I never said that people should be knowledgeable about everything... but knowing that your direct messages are not private goes as far back as ~1999 or earlier.
pino83: Does it matter? It's just some arbitrary company. They do have the freedom to decide those things however they want, right? The customer can then decide whether to switch or not.
hk__2: Switch to what exactly?
pino83: If there is nothing else, then you as a customer have screwed up before, right? And then the entire strategy/philosophy maybe needs to be reviewed?! Or, in other words: if there is no alternative, this is due to your own faults. Either deal with it, or find ways to undo your mistakes.
hk__2: How is it the customer’s fault that no good alternative to TikTok exists?
trashb: Myth: End-to-end encryption (E2EE) is the only way to ensure robust cybersecurity.
Reality: E2EE carries its own risks and vulnerabilities. No single, standalone method achieves bulletproof cybersecurity. Robust cybersecurity requires layering multiple, diligently managed security measures and best practices. Malevolent actors can exploit E2E encryption to avoid critical data security scanning, to allow malware inside a network or onto a device, and to evade law enforcement.
https://www.fbi.gov/how-we-investigate/lawful-access/lawful-...
lunias: Do whatever you want with your app, but you should be punished if you lie. Saying E2E encryption makes users less safe is a lie.
gzread: Alternatives to private messaging on TikTok? Uh, Signal. SimpleX. Session. XMPP/OTR. PGP. Discussing things on TikTok that the government must not know about seems a bad idea.
pino83: Exactly. And once those are established: why not have all discussions there, not just the ones where you explicitly want to hide something (for whatever dubious or legit reason)?
gzread: Because you know someone's TikTok handle and you don't know their email address and PGP key
pino83: Yes, indeed. As far as I understand younger people, they'd definitely do a lot of "very smart things" (tm) before they get the idea that one can indeed exchange email addresses via TikTok. Guess what: it's not solely for emojis! :-p Theoretically - although no one ever did - you _could_ exchange useful information there as well.
bougainvilley: So we agree that governments are only using the safety of children as a pretext to extend their control over people's lives; otherwise there are better solutions to protect children from the harms of the internet.
eru: I would put the blame on voters more than governments.