Discussion
Claude Code's Source Didn't Leak. It Was Already Public for Years.
durzo22: write your blog yourself if people are supposed to read it, not this LLM slop
ryandrake: I successfully did this the other day. There was a web app I used quite a bit with an annoying performance issue (in some cases its graphics code would spin my CPU at 100% constantly, fans full-blast). I asked Claude to fetch the code and fed it a few performance traces I took through Firefox, and it cut through all those obfuscated variables like they weren't even there, easily re-interpreting what each function actually did, finding a plausible root cause and workaround (which worked). Can you generally trust it to de-obfuscate reliably? No idea. My sample size is 1.
Retr0id: Minification is not obfuscation and obfuscation is not security, but no amount of deobfuscation will recover the comments in the source, which are often more insightful than the source itself.
notepad0x90: isn't it fair for an article about AI deobfuscating code to be written by AI?
Retr0id: Not really, no.
maxwg: JS was never really obfuscated - that wasn't the goal of minification. Minifiers especially struggle with ES6 classes etc., outputting code that is almost human-readable. Proper obfuscation libraries exist, typically at the cost of a pretty notable amount of performance that I'd wager most are not willing to sacrifice. And like even the best client-side DRM, everything can be reverse engineered: all the code has been downloaded to the user's machine. It's one of the (IMO terrible) excuses for the SaaSification of all software.
Gigachad: I expect it these days but it’s still disrespectful slop pushing out real work.
socalgal2: If it’s too hard to read ask your ai to deobfuscate it :D
tw04: Huh? Their justification for "obfuscation isn't security" is pointing out that the Claude source wasn't obfuscated, it was minified. And it could be "deobfuscated by claude itself" - even though, again, they said the code wasn't obfuscated. So I guess, ask Claude to deobfuscate some code that's ACTUALLY OBFUSCATED if you want to claim obfuscation provides ZERO additional security.

> We analyzed this file at AfterPack as part of a deobfuscation case study. What we found: it's minified, not obfuscated.

> Here's the difference. Minification — what every bundler (esbuild, Webpack, Rollup) does by default — shortens variable names and removes whitespace. It makes code smaller for shipping. It was never designed to hide anything.

> Here's where it gets interesting. We didn't need source maps to extract Claude Code's internals. We asked Claude — Anthropic's own model — to analyze and deobfuscate the minified cli.js file.
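The minification-vs-obfuscation distinction several comments draw can be shown with a toy example. The code below is invented for illustration (the obfuscated variant imitates the string-array style of real obfuscation tools, not the output of any particular one):

```javascript
// Original source: readable names, obvious intent.
function computeTotal(items) {
  let total = 0;
  for (const item of items) total += item.price;
  return total;
}

// Minified (what esbuild/terser-style bundlers emit): names shortened,
// whitespace stripped, but structure and property names fully intact.
function a(b){let c=0;for(const d of b)c+=d.price;return c}

// Obfuscated (string-array indirection, one of many real techniques):
// even the property name "price" no longer appears inline.
const _0x1f = ["price"];
function _0x2a(arr) { return arr.reduce((s, o) => s + o[_0x1f[0]], 0); }

const cart = [{ price: 2 }, { price: 3 }];
console.log(computeTotal(cart), a(cart), _0x2a(cart)); // 5 5 5
```

The minified version is trivially readable once reformatted; the obfuscated one at least forces a reader to chase indirections, which is exactly the layer an LLM cuts through cheaply.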
sublinear: > No one talks about this. There's no VentureBeat headline about GitHub shipping email addresses in their JS bundles. No Hacker News thread about internal URLs exposed in Anthropic's CDN scripts

That's a huge sign none of that information is truly sensitive. What is being implied here?

> AI Makes This Urgent

No it doesn't. This is blogspam and media hype nobody is interested in. Unless the demographics have really shifted that much in the last few years, HN is one of the worst places to attempt this marketing style.
layer8: > AfterPack approaches this differently. Instead of layering reversible transforms on top of each other, AfterPack uses non-linear, irreversible transforms — closer to how a hash function works than how a traditional obfuscator works. The output is functionally equivalent to the input, but the transformation destroys semantic meaning in a way that cannot be reversed — even by AfterPack itself. There's no inverse function. No secret key that unlocks the original.That’s probably fun when trying to analyze bugs occurring in production. :)
Retr0id: What they describe is snake oil. Even if you assume it is mathematically possible in the general case (which is debatable!), it'll likely have a huge performance overhead. See https://en.wikipedia.org/wiki/Indistinguishability_obfuscati...
TurdF3rguson: If the comments were in the original source that the model trained on... Then sure, those are recoverable too.
josephg: I did something similar yesterday. I'm playing a little idle game, and wanted to optimise my playthrough. I pointed claude at the game's data files, and in a few short minutes it reverse engineered the game data and extracted it to CSV / JSON files for analysis. In this case, it turned out the data - and source code for the game - was in a big minified javascript file. Claude extracted all the data I wanted in about 2 minutes.
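A sketch of the kind of extraction described above. The bundle contents and the `GAME_DATA` variable name are invented for illustration; real bundles embed their data in all sorts of shapes, which is where an LLM's pattern-spotting earns its keep:

```javascript
// Hypothetical minified bundle with an embedded data table.
const bundle =
  'var GAME_DATA={"items":[{"name":"pick","cost":10},{"name":"drill","cost":250}]};doStuff();';

// Pull the JSON literal out of the bundle and parse it.
const match = bundle.match(/GAME_DATA=(\{.*?\});/);
const data = JSON.parse(match[1]);

// Flatten to CSV rows for analysis in a spreadsheet.
const csv = ["name,cost", ...data.items.map(i => `${i.name},${i.cost}`)].join("\n");
console.log(csv);
```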
gertop: Fair? No. Par for the course? Unfortunately yes.
throwup238: What they’re describing is a polymorphic virus. A great analogy for SV startups.It works great in assembly, not so much for higher level languages.
motohagiography: slight historical note: it might be interesting to see how the brief period of "white box cryptography" stands up to AI today. At the time there were a few companies with products that had trouble finding fit (for straightforward security reasons), but they were essentially commercial obfuscators that made heavy use of lookup tables, miniature virtual machines, and esolang concepts that worked mainly against human reverse engineers. An example was this early AES proposal: https://link.springer.com/chapter/10.1007/3-540-36492-7_17
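A toy sketch of the white-box idea the comment describes (nothing like the real white-box AES constructions, just the principle): the key is consumed while building a lookup table, so the shipped code contains table lookups but no key material.

```javascript
// Secret key, present only at table-build time (e.g. during the build step).
const KEY = 0x5a;

// Fold the key into a 256-entry table: T[x] = x XOR KEY.
const T = new Uint8Array(256);
for (let x = 0; x < 256; x++) T[x] = x ^ KEY;

// The shipped function carries only the table, never the key.
function whiteBoxEncryptByte(b) { return T[b & 0xff]; }

console.log(whiteBoxEncryptByte(0x00)); // 90 (= 0x5a)
```

For XOR the key is of course trivially recoverable from the table itself; real white-box designs compose many randomized, interleaved tables precisely to frustrate that kind of extraction, which is what made them interesting targets for human reverse engineers.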
Retr0id: Whitebox cryptography is widely deployed, in browser plugins for DRM.
brookst: Is all polymorphic code virii?
throwup238: Not at all, but in practice no one has any use for the technique except to obfuscate viruses, with the exception of academic research.
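The polymorphism this subthread mentions can be sketched benignly: the same payload stored under different keys yields different bytes each "generation", but identical behavior once decoded and run. Toy example, written for this thread:

```javascript
// Encode a source string under a one-byte XOR key.
function encode(src, key) {
  return [...src].map(ch => ch.charCodeAt(0) ^ key);
}

// Decode and evaluate. (Dynamic evaluation is what makes the technique
// work -- and also why it's a malware staple rather than an engineering tool.)
function runEncoded(bytes, key) {
  const src = bytes.map(b => String.fromCharCode(b ^ key)).join("");
  return Function(`return (${src});`)();
}

const payload = "1 + 2";
const genA = encode(payload, 0x21); // one "generation"
const genB = encode(payload, 0x7f); // different stored bytes...
console.log(runEncoded(genA, 0x21), runEncoded(genB, 0x7f)); // 3 3
```

Each generation's bytes look unrelated to a signature scanner, yet both decode to the same program, which is why the technique is mostly useful for evading byte-level detection.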
notepad0x90: It's a cat and mouse game, and it provides the desired level of security for the people who use it. It isn't used to prevent people from finding vulnerabilities (not mostly, at least). It's used to deter competition, prevent clones of the application, etc. - it's makeshift "DRM". There are ways to defeat even AI-assisted analysis running in a proper browser, but I think it's not a good idea to give anyone ideas on this subject; proper DRM is hellish enough. Was there ever obfuscated JS code a human couldn't reverse given enough time? It's like most people's doors: it won't stop someone with a battering ram, but it will ideally slow them down enough for you to hide or get your guns. In this case, it won't even slow them down, until it does (hence: cat and mouse game).
integralid: > Was there ever obfuscated JS code a human couldn't reverse given enough time?

I reverse malware for a living, and no, there wasn't. With some experience even the best obfuscation is actually pretty easy to defeat. But the goal of malware analysis is to extract some knowledge (what this code does, IPs, URLs, tokens). Getting a runnable, clean version would often be long, tedious work.
mediumsmart: JavaScript code is the essence of minified security.
integralid: minification was originally about sending fewer bytes over the wire and saving a bit of performance. Somewhere along the road people started misusing it for security, because JS evolved from "a few snippets of code to make my site more interactive" to SPAs.
postalrat: Often like 1 in 100 js files?
throwaway9980: Nicholson entered the mantrap and the double doors closed behind him. He emptied his pockets and disrobed before donning the clean suit that had been provided to him by the orderlies. The camera watching him appeared satisfied that he was properly prepared and, more to the point, that the vendor was properly protected. The doors to the inner chamber opened and he proceeded into the hallway. He passed several doors until he reached the one that was labeled with the name of the vendor. He pressed the button on the doorframe. A satisfying tactile click, a spinning light illuminating around the button, a click, and then the door opened soundlessly. A single desk with a small chair and a computer terminal awaited him. He sat down and the screen turned on automatically. Finally, he was able to set about classifying his expenses from a recent trip to Tokyo. It was inconvenient, but a small price to pay to ensure that the vendor’s unique interfaces, their intellectual property, couldn’t be copied by the replication machines. Their eyes and their ears were everywhere in the outside world. Simply by seeing your software, these machines could copy its essence. The risks of operating software in the wild required that proprietary software be protected. Hidden away from eavesdroppers. Such was the world in 2037.
what: It seems pretty clearly AI-written.
beej71: About 8 months ago I deliberately obfuscated some merge sort code to give to my software engineering students to make them upset. :) I munged it up pretty good, changing the structure, making the variable names completely misleading, destroying the symmetry, etc. Out of curiosity, I fed it into ChatGPT and it had it figured out in zero seconds.