Discussion
Intel Demos Chip to Compute With Encrypted Data
zvqcMMV6Zcr: > Heracles, which sped up FHE computing tasks as much as 5,000-fold compared to a top-of-the-line Intel server CPU.

That is a nice speed-up compared to generic hardware, but everyone probably wants to know how much slower it is than performing the same operations on plaintext data. I am sure a 50% penalty is acceptable; 95% probably is not.
freedomben: Perhaps it's a cynical way to look at it, but in the days of the war on general-purpose computing and locked-down devices, I have to consider the news in terms of how it could be used against users and device owners. I don't know enough to provide useful analysis, so I won't try, but instead pose questions to the much smarter people who might have some interesting thoughts to share.

There are two non-exclusive paths I'm thinking of at the moment:

1. DRM: Might this enable a next level of DRM?

2. Hardware attestation: Might this enable a deeper level of hardware attestation?
egorfine: > how it could be used against the users and device owners

Same here. Can't wait to KYC myself in order to use a CPU.
gruez: See: https://news.ycombinator.com/item?id=47323743

It's not related to DRM or trusted computing.
esseph: Everything about this in my head screams "bad idea".

If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/threat model.
u1hcw9nx: In FHE, the hardware running the computation doesn't know the secrets. That's the point.

First you encrypt the data. Then you send it to the hardware to compute, get the result back, and decrypt it.
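That encrypt → compute → decrypt flow can be sketched with a toy additively homomorphic scheme (a one-time pad mod N, which is NOT a real FHE scheme -- real schemes are lattice-based -- but illustrates how a server can add ciphertexts without seeing the plaintexts):

```python
import secrets

N = 2**32

def encrypt(m, k):
    # Client side: mask the message with a random key.
    return (m + k) % N

def decrypt(c, k):
    # Client side: remove the mask.
    return (c - k) % N

k1, k2 = secrets.randbelow(N), secrets.randbelow(N)
c1, c2 = encrypt(10, k1), encrypt(32, k2)    # client encrypts two values
c_sum = (c1 + c2) % N                        # server adds ciphertexts blindly
assert decrypt(c_sum, (k1 + k2) % N) == 42   # client decrypts the correct sum
```

The server only ever touches `c1`, `c2`, and `c_sum`, which look like random numbers to it.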
inetknght: Not yet.
gruez: What does that even mean?

A: "Intel/AMD is adding instructions to accelerate AES"

B: "Might this enable a next level of DRM? Might this enable a deeper level of hardware attestation?"

A: "wtf are you talking about? It's just instructions to make certain types of computations faster, it has nothing to do with DRM or hardware attestation."

B: "Not yet."

I'm sure in some way it probably helps DRM or hardware attestation to some extent, but not any more than, say, a 3nm process node helps DRM or hardware attestation by making it faster.
JanoMartinez: One thing I'm curious about is whether this could change how cloud providers handle sensitive workloads.

If computation can happen directly on encrypted data, does that reduce the need for trusted environments like SGX/TEE, or does it mostly complement them?
youknownothing: I don't think it's applicable to DRM because you eventually need the decrypted content: DRM is typically used for books, music, video, etc., and you can't enjoy an encrypted video.

I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.
freedomben: Yes it must be decrypted eventually, but I've read about systems (I think HDMI does this) where the keys are stored in the end device (like the TV or monitor) that the user can't access. Given that we already have that, I think I agree that this news doesn't change anything, but I wonder if there are clever uses I haven't thought of
corysama: There are applications that are currently doing this without hardware support and accepting much worse than 95% performance loss to do so.This hardware won’t make the technique attractive for ALL computation. But, it could dramatically increase the range of applications.
bobbiechen: Agreed. When I was working on TEEs/confidential computing, just about everyone agreed that FHE was conceptually attractive (trust the math instead of trusting a hardware vendor) but the overhead of FHE was so insanely high. Think 1000x slowdowns turning your hour-long batch job into something that takes over a month to run instead.
NegativeLatency: Rent out your spare compute, like seti@home or folding@home, but it’s something someone could repackage and sell as a service.
Chance-Device: FHE is the future of AI. I predict local models with encrypted weights will become the norm. Both privacy preserving (insofar as anything on our devices can be) and locked down to prevent misuse. It may not be pretty but I think this is where we will end up.
Foobar8568: FHE is impractical by any measure. Either it's trivially broken and insecure, or the space requirements go beyond anything usable.

There is basically no business demand aside from vendors and scholars.
Foobar8568: Now we know why Intel more or less abandoned SEAL and rejected GPU requests.
patchnull: Current FHE on general CPUs is typically 10,000x to 100,000x slower than plaintext, depending on the scheme and operation. So even with a 5,000x ASIC speedup you are still looking at roughly 20-100x overhead vs unencrypted compute.That rules out anything latency-sensitive, but for batch workloads like aggregating encrypted medical records or running simple ML inference on private data it starts to become practical. The real unlock is not raw speed parity but getting FHE fast enough that you can justify the privacy tradeoff for specific regulated workloads.
tromp: 10,000x to 100,000x / 5,000x = 2 to 20x, not 20 to 100x.
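The back-of-the-envelope arithmetic, using the figures quoted upthread:

```python
# Effective FHE overhead after a hardware speedup, using the thread's figures.
cpu_overhead = (10_000, 100_000)  # reported FHE slowdown vs. plaintext on a CPU
asic_speedup = 5_000              # claimed Heracles speedup

effective = tuple(o / asic_speedup for o in cpu_overhead)
print(effective)  # (2.0, 20.0): roughly 2x-20x slower than plaintext compute
```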
boramalper: If you're interested in "private AI", see Confer [0] by Moxie Marlinspike, the founder of the Signal private messaging app. They go into more detail in their blog. [1]

[0] https://confer.to/

[1] https://confer.to/blog/2025/12/confessions-to-a-data-lake/
CamperBob2: I don't get how this can work, and Moxie (or rather his LLM) never bothers to explain. How can an LLM possibly exchange encrypted text with the user without decrypting it?The correct solution isn't yet another cloud service, but rather local models.
cwmma: In theory you only need to trust the hardware to be correct, since it doesn't have the decryption key the worst it can do is give you a wrong answer. In theory.
esseph: But can you trust the hardware encryption to not be backdoored, by design?That's my point, this sounds like a way to create a backdoor for at-rest data.
bilekas: > That's my point, this sounds like a way to create a backdoor for at-rest data.

Honestly, I get the feeling it would be more expensive and more effort to backdoor it this way than it's worth.
Foobar8568: But you leak all types of information, and the retrieval either leaks even more data, or you end up transferring God knows how much data, or your encryption is trivially broken, or you spend days/months/years decrypting.
bilekas: I don't know how you got these ideas but when you crack it, do make sure to write a post about it. Can't wait for that writeup.
boramalper: They explain it in Private inference [0] if you want to read about it.

[0] https://confer.to/blog/2026/01/private-inference/
bilekas: This is incredible work, and it makes the technology absolutely viable.

However... in a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer-grade hardware. My cynical side could envision a worldwide technology export ban in the vein of RSA [0]. Why would any company offer customers real out-of-the-box E2E encryption built into their devices?

DRM was mentioned by another user. This will not be used to enable privacy for the masses.

[0] https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...
vasco: Regarding DRM I don't see how it'll survive "Camera in front of the screen" + "AI video upscaling" once the second part is good enough. Can't DRM between the screen and your eyes. Until they put DRM in Neuralink.
FrasiertheLion: Arguably this is less useful for consumer hardware in the first place. This is mostly useful when I don’t trust the service provider with my data but still need to use their services (casting my vote, encrypted inference, and so forth)
autoexec: > In a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer grade hardware.

Why not, when the government can just force companies to backdoor their hardware for them? That way users are secure most of the time except from the government (until the backdoor in Intel's chips gets discovered, anyway), users have a false sense of security/privacy so they're more likely to share their secrets with corporations, and the government gets to spy on people communicating more openly with each other.
KoolKat23: This is quite the opposite: better than what we have. It raises the hurdle for those looking to surveil.

If a tree falls in the forest and no one is around to hear it, does it make a sound?
Foobar8568: LWE estimator isn't a proxy for this?
Chance-Device: If you know how to reverse engineer weights or even hidden states through simple text output without logprobs I’d be interested in hearing about it. I imagine a lot of other people would be too.
RiverCrochet: > Can't DRM between the screen and your eyes.

No, but media can be watermarked in imperceptible ways, and then if all players are required to check and act on such watermarks, the gap becomes narrow enough to probably be effective.

See Cinavia.
benlivengood: 1. The private key is required to see anything computed under FHE, so DRM is pretty unlikely.

2. No, anyone can run the FHE computations anywhere on any hardware if they have the evaluation key (which would also have to be present in any FHE hardware).
Frieren: > how it could be used against the users

We are no longer their clients; we are just another product to sell. So they do not design chips for us, but for the benefit of other corporations.

3. Unskippable ads with data gathering at the CPU level.
dimitrios1: I distinctly remember, in one of my more senior university classes, designing logic gates, chaining together ANDs, NANDs, ORs, NORs, XORs, and then working our way up to numerical processors, ALUs, and eventually latches, RAM, and CPUs. The capstone was creating an assembly language to control it all.

I remember thinking how fun it was! I could see unfolded before me how there would be endless ways to configure, reconfigure, optimize, etc.

I know there are a few open source chip efforts, but I wonder if now is the time to pull the community together and organize more intentionally around that. Maybe open source chipsets won't be as fast as their corporate counterparts, but I think we are definitely at an inflection point in society where we need this to maintain freedom.

If anyone is working in that area, I am very interested. I am very green, but I still have the old textbooks I could dust off (I just don't have the old college-provided Mentor Graphics -- or I guess Siemens now -- design tool anymore).
linguae: I was just thinking about this a few days ago, but not just for the CPU (for which we have RISC-V and OpenPOWER): for an entire system, including the GPU, audio, disk controllers, networking, etc. I think a great target would be mid-2000s graphics and networking; I could go back to a 2006 Mac Pro without too much hardship. Having a fully open equivalent to mid-2000s hardware would be a boon for open computing.
observationist: Regarding DRM, you could use stream ciphers and other well-understood cryptography schemes with an FHE chip like this to create an effectively tamper-proof and interception-proof OS, with the FHE chip supplementing normal processors. You'd basically be setting up E2EE between the streaming server and the display, audio output, or other stream target, and there'd be no way to intercept or inspect unencrypted data without breaking the device. Add modern tamper detection and you get a very secure setup with modern performance, with the FHE chip just handling keys and encapsulation operations (fairly low compute and bandwidth needs). DRM and attestation both, as well as fairly dystopian manufacturer and corporate controls over devices users should own.
observationist: KYC = Kill Your Conscience

It's truly amazing how modern people just blithely sacrifice their privacy and integrity for no good reason, just to let big tech corporations siphon money out of the market more efficiently. And then they fight you passionately when you call out those companies for being unnecessarily invasive and intrusive.

The four horsemen of the infocalypse are such profoundly reliable boogeymen, we really need a huge psychological study across all modern cultures to see why they're so effective at dismantling rational thought in the general public, and how we can inoculate society against them without damaging other important social behaviors.
xeonmc: https://www.smbc-comics.com/comic/2014-05-27
ddtaylor: HDCP does some of that already in many of your devices.
amelius: I'm also thinking of what happens when quantum computing becomes available.

But when homomorphic encryption becomes efficient, perhaps governments can force companies to apply it (though they would lose their opportunity for backdooring; but E2EE is a thing too, so I wouldn't worry too much).
anon291: Well yeah... You do the initial encryption yourself by whatever means you trust
anon291: Math literacy needs to become standard for computer scientists. These takes are so bad
anon291: If it were as fast as a normal chip, it would obviate the need
F7F7F7: When we're at the point where society feels that privacy requires encryption at compute time... a product like this (or anything else in the supply chain) is not going to save them.
gigatexal: If they can get this shrunk down and efficient enough, in a future scenario I think Apple could move back to Intel for this, given that their stance on encryption is a pillar of their image.
Joel_Mckay: Not going to happen anytime soon, as the modern M4/ARM unified memory with on-chip GPU is years ahead of Intel. The software ecosystem is slowly growing to leverage this chip architecture, and due to the annoying PC RAM, SSD, and RTX GPU shenanigans it is no longer the lower-value option.

The PC market was made shitty enough this year that mid/high-class Mac Pros/laptops are actually often a better value deal now (if and only if your use case is covered software-wise).

Intel does plan an RTX + amd64 SoC soon, but still pooched the memory interface with a 30-year-old mailbox kludge. Intel probably won't survive this choice without bailouts. =3
bigyabai: > (if and only if your use-case is covered software wise.)

Judging by Nvidia's current valuation, that's a parenthetical worth ~4 trillion dollars. Apple isn't muscling AMD or Nvidia out of the datacenter anytime soon, and they're basically feeding Intel Foundry customers by dominating TSMC fab capacity. Apple's contribution to the chip shortage is so bad that even they have considered using Intel Foundry Services: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...

It's been 7 years of Apple Silicon and the macOS market share really hasn't shifted much. The Year Of Apple Silicon For People Whose Use-Case Is Covered Software Wise was 2019; the majority of remaining customers aren't showing any interest.
Joel_Mckay: > It's been 7 years of Apple Silicon and the macOS market share really hasn't shifted much

Indeed, but a local LLM finishing in 3 days instead of 1 on a $40k GPU changes the economic decision priority for some.

Apple sales grew "21.3% year-over-year as of the second quarter of 2025", but sales also flattened as supply-chain pricing shocks from "AI"/tariffs hit late last year.

"Judging by Nvidia's current valuation" is a bad bet under current circular investment conditions.

We shall see, but as EOL drivers and OS rot hit legacy NVIDIA hardware... people are going to have to find some compromise in the next 2 years. Even an AMD 9850X3D currently costs less than 64G of low-end PC DDR5 memory.

Odd times for sure =3
bawolff: > homomorphic encryption chip speeds operations 5,000-fold

5,000 * 0 is still 0.

I joke, but I think relative numbers like this are very misleading, as FHE is starting from such an absurdly slow place.

Still, this is pretty cool, and there are probably niche applications that become possible with this, but I think this is a small enough speed-up that it remains very niche.
bigbuppo: They probably meant "know your customer", you know, where you have to submit to an anal probe to think about getting a bank account and withdrawing more than $8 of cash at a time will trigger a suspicious activity report for money laundering/tax evasion while the Epstein class are getting away with the most heinous crimes possible.
cmeacham98: KYC is generally a force for good because it prevents fraud. While it is not reasonable for Discord to collect your identity, that is a fair requirement for a bank account, because money laundering is a serious problem worth preventing.

The reason the 'Epstein class' are able to get away with crimes is that in recent US elections, voters elected politicians who are intentionally not investigating those crimes and have even pardoned some criminals convicted of them.
mc32: Don’t pardoned people by definition need to have been convicted of a crime whether real or in some select instances otherwise? Can you pardon someone not convicted of a (federal) crime?
heavyset_go: Yes, the last president pardoned himself and his family on his way out.
monocasa: I mean, this would be perfect for the key provisioning portions of widevine or bluray.
FrasiertheLion: The model is running in a secure enclave that spans the GPU using NVIDIA Confidential Computing: https://www.nvidia.com/en-us/data-center/solutions/confident.... The connection is encrypted with a key that is only accessible inside the enclave.

Within the enclave itself, DRAM and PCIe connections between the CPU and GPU are encrypted, but the CPU registers and the GPU onboard memory are plaintext. So the computation is happening on plaintext data; it's just extremely difficult to access it even from the machine running the enclave.
olejorgenb: How is that much different from trusting the policies of Anthropic etc.? To be fair, you need some enterprise deal to get a truly zero-retention policy.
FrasiertheLion: Enclaves have a property that allows the hardware to compute a measurement (a cryptographic hash) of everything running inside them: the firmware, system software such as the operating system and drivers, the application code, and the security configuration. This is signed by the hardware manufacturer (Intel/AMD + NVIDIA).

Then, verification involves a three-part approach. Disclaimer: I'm the cofounder of Tinfoil: https://tinfoil.sh/, and we also run inference inside secure enclaves, so I'll explain this as we do it.

First, you open source the code that's running in the enclave, and pin a commitment to it in a transparency log (in our case, Sigstore).

Then, when a client connects to the server (running in the enclave), the enclave computes the measurement of its current state and returns it to the client. This process is called remote attestation.

The client then fetches the pinned measurements from Sigstore and compares them against the measurements fetched from the enclave. This guarantees that the code running in the enclave is the same as the code that was committed to publicly.

So if someone claimed they were only analyzing aggregated metrics, they could not suddenly start analyzing individual request metrics, because the code would change -> hash changes -> verification fails.
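The measurement-comparison step of that flow can be sketched in a few lines. This is a toy illustration only: real attestation also involves hardware-signed quotes and certificate chains, and the function names here are made up, not any vendor's API.

```python
import hashlib

def measure(code: bytes) -> str:
    # Enclave side: hash everything loaded into the enclave
    # (firmware, OS, drivers, application code, config).
    return hashlib.sha256(code).hexdigest()

def verify(reported: str, pinned: str) -> bool:
    # Client side: compare the enclave's reported measurement against
    # the measurement pinned to the transparency log at release time.
    return reported == pinned

release = b"open-source inference server v1.0"
pinned = measure(release)  # published to the transparency log at release

assert verify(measure(release), pinned)                    # unmodified code passes
assert not verify(measure(release + b" patched"), pinned)  # any change fails
```

Any change to the running code changes the hash, so the client's verification fails, which is the "code changes -> hash changes -> verification fails" property described above.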
mc32: I’m not sure if that has precedent. It’s unusual to grant a pardon before a case is brought to court.In any event, my point was all presidents who grant pardons grant them to people convicted of a crime; it’s not a recent development. But that was framed as being upsetting precedent.
jasomill: I'm still waiting for the first password manager to incorporate biometrics and security questions, as predicted decades ago by Douglas Adams:There were so many different ways in which you were required to provide absolute proof of your identity these days that life could easily become extremely tiresome just from that factor alone, never mind the deeper existential problems of trying to function as a coherent consciousness in an epistemologically ambiguous physical universe. Just look at cash point machines, for instance. Queues of people standing around waiting to have their fingerprints read, their retinas scanned, bits of skin scraped from the nape of the neck and undergoing instant (or nearly instant-a good six or seven seconds in tedious reality) genetic analysis, then having to answer trick questions about members of their family they didn't even remember they had, and about their recorded preferences for tablecloth colours. And that was just to get a bit of spare cash for the weekend. If you were trying to raise a loan for a jetcar, sign a missile treaty or pay an entire restaurant bill things could get really trying.Hence the Ident-i-Eeze. This encoded every single piece of information about you, your body and your life into one all-purpose machine-readable card that you could then carry around in your wallet, and therefore represented technology's greatest triumph to date over both itself and plain common sense.
gpapilion: Just to level-set here: I think it's important to realize this is really focused on allowing things like search to operate on encrypted data. The technique allows you to perform an operation on data without decrypting it. Think of a row in a database with email, first name, last name, and mailing address. You want to search by email to retrieve the other data, but don't want that data unencrypted, since it is PII.

In general, this solution would be expensive and targeted at data lakes, or areas where you want to run computation but not necessarily expose the data.

With regard to DRM, one key thing to remember is that DRM has to be cheap and widely deployable. Part of the reason DVDs were easily broken is that the chosen algorithm was computationally inexpensive, so it could be installed on as many clients as possible.
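A much simpler (non-FHE) technique gives a feel for "search by email without exposing the email": a blind index, where the client stores a deterministic keyed hash of the searchable field alongside the encrypted row. This sketch is illustrative only; FHE-based search is far more general and far more expensive.

```python
import hashlib
import hmac

SECRET = b"client-held key"  # never leaves the client

def blind_index(value: str) -> str:
    # Deterministic keyed hash: equal plaintexts produce equal tokens,
    # but the server cannot recover the plaintext from a token.
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()

# Server stores encrypted rows keyed by the blind index of the email.
db = {blind_index("alice@example.com"): "<ciphertext of row>"}

# To search, the client sends only the token for the query value;
# the server matches it without ever seeing the email itself.
assert db[blind_index("alice@example.com")] == "<ciphertext of row>"
```

Note the tradeoff the thread is circling: even this weak form of searchability leaks equality patterns to the server, which is exactly the "some information must be exposed" concern raised below.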
jasomill: This is an exceptionally good point. For example, I suspect two major reasons DRM has been more successful on game consoles than video players are the much smaller ecosystems and much larger BOMs, not necessarily in that order.
jasomill: Sure, but we already have good enough players, open source even, that don't support this technology, and recent codecs have, if anything, become more open, so this only seems problematic for playback on non-general purpose computing devices like smart TVs, set top boxes, and maybe smartphones, tablets, and battery-powered PCs if the tech is incorporated into hardware decoders for all acceptable codecs.
eulgro: In science fiction, maybe. We're hitting real limits on compute while AI is still far from a level where it would be harmful, and FHE is orders of magnitude less efficient than direct computation.
fc417fc802: If this were similar to SGX (which is what I initially assumed), then "not yet" is a perfectly reasonable position to take. However, it's actually homomorphic encryption implemented in hardware, and thus not relevant to DRM (AFAIK).

That said, the unfortunate reality is that the same constructs that underpin DRM are also required to build a secure system. The only difference is who controls the root of trust. As such, the problems with DRM (and hardware ownership more generally) are political rather than technical in nature.
fc417fc802: > if all players are required

Massive if. Why would I voluntarily purchase gimped hardware?

Cinavia depended on being implemented by the player itself. It's difficult to see how (for example) a smart TV could implement it for streams coming in via HDMI from a computer the user has full control of.
monocasa: Not according to Ex parte Garland (1866).

> 9. The power of pardon conferred by the Constitution upon the President is unlimited except in cases of impeachment. It extends to every offence known to the law, and may be exercised at any time after its commission, either before legal proceedings are taken or during their pendency, or after conviction and judgment. The power is not subject to legislative control.

https://tile.loc.gov/storage-services/service/ll/usrep/usrep...

Basically, you can't pardon acts that haven't happened yet, but you can pardon before any legal action has been taken on prior acts.
gruez: > If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/threat model.

Are we reading the same article? It's talking about homomorphic encryption, i.e. doing mathematical operations on already-encrypted data without being aware of its cleartext contents. It's not related to SGX or other trusted computing technologies.
direwolf20: I see the same thing every time there's a new medical thing.

> We discovered a substance that boosts your innate immune system and non-specifically clears out throat infections.

> This will be good for people prone to throat infections.

> Not when it's mandated.

Someone else told me they're going to spy on your windows with drones to make sure you're verifying your age to your OS. Like, what??? I thought we were waking up to oppression, but we're just inventing fake oppression to be mad at instead of responding to real oppression.
matheusmoreira: There's no point. The big chip makers control all the billion-dollar fabs. Governments and corporations can easily dictate terms. We'll lose this battle unless we develop a way to cheaply fabricate chips in a garage.

The future is bleak.
direwolf20: Make one out of relays and use it to run PGP
direwolf20: The saying is "every accusation is a confession". If the political class claims to be preventing us from doing something that we obviously are not doing, we should assume they're doing that thing until proven otherwise.
15155: DVD players also didn't have great key revocation or forced field updates of keys and software; Blu-ray did, and was somewhat more effective. I also imagine console manufacturers have far more control over the supply chain at large.

Consoles after the original Xbox (which had an epic piracy ecosystem) all had online integration. The Xbox 360 had a massive piracy scene, but it was 100% offline only. The Xbox One has had no such breaches that I am aware of.

RE: BOM -- famously, with many of these examples, certain specific disc drives or mainboards were far more compromised than others.
direwolf20: You would purchase a Blu-ray player in order to play Blu-rays, pretty simple. They have this watermarking.
brookst: You're right, it's a cynical take. I don't get cynicism for the sake of it, detached from technical reality.

No, this does nothing for DRM or HW attestation. The interesting thought is: not everything is a conspiracy. Yes, that's just what a conspirator would say. But it's also true.
gpapilion: Home networks have made this much easier. DVD players didn’t expect network access for software updates etc…
jackyinger: How is searching encrypted data not going to be used for exfiltration? What a terrible idea.

I'm sure you can name benign, useful things you could use it for. But it seems to me you're blatantly overlooking the obvious flaw.

There is no getting around the fact that search on encrypted data reduces the level of secrecy. To have an even minutely useful search result, some information within the searched corpus must be exposed.
coliveira: Not everything is a conspiracy, yes. But when we have a class of conspirators in power, and we do have, everything can be used by the conspiracy.
gigatexal: Acktually, ahem, not to be that guy, but to be that guy, haha (insert me here)...

Apple's Mac market share of the PC market went from 6.6% to 9% (https://www.cultofmac.com/news/mac-shipments-2025-apple), so that's nothing to balk at. The MacBook Neo might grow that even more if it converts low-end buyers into locked-in users in the ecosystem who then move on to more Macs.
fc417fc802: Right. To play legally purchased blu-rays. Who pirates movies and then burns them on a disk? And if someone did do that why would they be using a gimped blu-ray player instead of a media PC?The only thing this scheme was ever going to catch was full blown counterfeit disks sold on a street corner to your average joe. I think that was only ever much of a thing in the developing world. Or was it just before my time?
zvr: For anyone interested, the corresponding software has been released under an Open Source license: https://github.com/IntelLabs/encrypted-computing-sdk
Foobar8568: It's Microsoft who did the library, damn, I can't understand how I misremembered that after working on it for a few months last year.
kittikitti: I find it petty for Intel to describe more software-based approaches to fully homomorphic encryption (FHE) as "software cheats", especially since their competitor, Duality Technologies, specializes more on the software side and is certainly much smaller in size.

When giant corporations like Intel are able to label their smaller competition's technology as "software cheats", it becomes an incredibly toxic environment. If anyone were to do the same to Intel, they would be sued for libel and slander and accused of anti-competitive tactics.

However, I shouldn't be surprised. The industry normalizes this type of discourse. At the same time, the same giant corporations preach about AI safety and claim you can only trust them with it.

That being said, this is a great innovation by Intel. I was impressed by the technology and the thorough discussion of how this type of computing relates to GPUs and CPUs. It's especially interesting given the emergence of computational memory applications.
Joel_Mckay: Depending on the specific reporting period, that data also seems consistent. But thanks for citing your source, as some otherwise great folks don't bother to check past Google Gemini nonsense when it starts lying.

Apple's only issue is that its walled-garden ecosystem eliminates most small/medium software studio content. In a way, FOSS projects have greatly increased the MacOS software options available, and the recent Steam port is very promising.

Win11 has caused a massive shift of users to POSIX-like systems. This will only improve most of the ecosystem. =3
direwolf20: The idea would be that when you see a recording of a Blu-ray, you can track down who bought the Blu-ray. That part was never implemented, however. It WAS implemented by Netflix, which is why pirates don't like using Netflix as a source.
RiverCrochet: Macrovision is a crude DRM scheme that was required by law to exist in all VCRs towards the end of their time in the 90s. Requiring TVs to check for and only display video for streams that present a certificate through such an embedded data stream could simply be called "Macrovision Next Generation Content Assurance."
fc417fc802: Sure, setting aside notions of common sense and accountability to the public a western government could hypothetically impose the equivalent (inverse?) of the EURion constellation on all digital displays. Of course you'd also need to patch the hole of authorized devices (ex laptops) running FOSS video players playing back pirated streams. Which is to say that it doesn't actually solve the "problem" unless it turns into a full blown war on general purpose computing.But wait! Even that's not good enough because my (now illicit) pirate box can present the stream embedded in a webpage for the locked down device that I don't control to play back on the DRM'd TV. So I guess now we're also going to want a scheme to prevent government approved devices from establishing network connections with unapproved ones?
fc417fc802: Okay, thinking about it some more, it could be made to work provided that the watermarks could be either muxed or rewritten by devices in the middle of the chain. However, the entire thing would be a bit on the complex side, would require re-encoding a given video stream in real time on the fly at each relevant point (e.g. at the streaming service CDN edge node and then again at the client device), and will most likely be rendered pointless unless all display devices are legally mandated to require the presence of the watermark in order to function.

Meanwhile, AI appears poised to give us unlimited, approximately free (plus a few kilowatt-hours) entertainment, at least assuming it doesn't end up somehow killing us all.
sota_pop: Is this whole concept essentially a fundamental misunderstanding of the difference between "encryption" and "encoding"? I don't mean to be pedantic, and I don't want to make assumptions, given my respect for the source, but I don't understand how you can meaningfully manipulate data that has been _actually_ encrypted. Doesn't the ability to accurately manipulate it imply that you have some understanding of its underlying meaning? The article is light on algorithmic details:

> "...a mathematical transformation, sort of like the Fourier transform. It encrypts data using a quantum-computer-proof algorithm..."

I am assuming there is some deep learning at play here, i.e. it is manipulating the data within a latent space. If this is true, would the embedding process really be considered "encryption"? You could argue it is security through obscurity (in the sense that the latent-space basis is arbitrary/learned), but it feels like two different things to me.
subset: (Disclaimer: I am not a cryptographer and this is a heavily simplified explanation.) Homomorphic encryption is built on the foundation of 'hard problems' (e.g. the Learning with Errors problem) -- loosely, computational problems that are thought to be impossible to reverse without possession of a secret key.

The crux of HE is that it provides a _homomorphism_: you map from the space of plaintexts to the space of ciphertexts, but the mapping preserves arithmetic operations such as addition and multiplication. To be clear, this means that the server can add and multiply the ciphertexts, but the plaintext result of that operation is still irreversible without the private key. To the server, it looks like random noise.

I don't think it's helpful to think about this as connected to deep learning or embedding spaces. An excellent resource I'd recommend is Jeremy Kun's guide: https://www.jeremykun.com/2024/05/04/fhe-overview/
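The "mapping preserves arithmetic" idea shows up in a much older, simpler setting: unpadded textbook RSA is multiplicatively homomorphic. A toy sketch with tiny numbers (utterly insecure, and not an FHE scheme -- it only preserves multiplication -- but it shows ciphertext arithmetic decrypting to plaintext arithmetic):

```python
# Textbook RSA with toy parameters; illustration only.
p, q = 61, 53
n = p * q                           # public modulus, 3233
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

c1, c2 = enc(6), enc(7)
c_prod = (c1 * c2) % n   # server multiplies ciphertexts only
assert dec(c_prod) == 42  # = 6 * 7, recovered with the private key
```

The server computing `c_prod` never learns 6, 7, or 42; only the private-key holder does. FHE schemes extend this to both addition and multiplication (and hence arbitrary circuits), which is what makes them so much more expensive.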