Discussion
We Might All Be AI Engineers Now
amelius: > Building systems that supervise AI agents, training models, wiring up pipelines where the AI does the heavy lifting and I do the thinking. Honestly? I’m having more fun than ever.

I'm sure some people are having fun that way. But I'm also sure some people don't like to play with systems that produce fuzzy outputs and break at unexpected moments, even though overall they are a net win. It's almost as if you're dealing with humans. Some people just prefer to sit in a room and think, and they now feel this is taken away from them.
noemit: Not a day goes by that a fellow engineer doesn't text me a screenshot of something stupid an AI did in their codebase. But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

The catch about the "guided" piece is that it requires an already-good engineer. I work with engineers around the world and the skill level varies a lot; AI has not been able to bridge the gap. I am generalizing, but I can see how AI can 10x the work of the typical engineer working at startups in California. Even your comment about curiosity highlights this. It's the beginning of an even more K-shaped engineering workforce.

Even people who were previously not great engineers, if they are curious and always enjoyed the learning part, are now supercharged to learn new ways of building, and they are able to try things out and learn from their mistakes at an accelerated pace. Unfortunately, this group, the curious ones, is IMHO a minority.
bitwize: The phrase "shape up or ship out" is an apt one I've heard. Agentic AI is a core part of software engineering. Either you are learning and using these tools, or you're not a professional and don't belong in the field.
input_sh: Quite frankly, if AI can write better code than most of your engineers "hundreds of times", then your hiring team is doing something terribly wrong.
nbvkappowqpeop: I'm just an old school programmer who loves writing code, and the recent AI developments have just taken the most fun part away from me.
Bukhmanizer: This essay somehow sounds worse than AI slop, like ChatGPT did a line of coke before writing it.

I use AI every day for coding. But if someone so obviously puts this little effort into the work they put out into the world, I don’t think I can trust them to do it properly when they’re writing code.
roli64: Lost me at "I’m building something right now. I won’t get into the details. You don’t give away the idea."
codemog: It’s kind of funny seeing all the AI hype guys talking about their 10 OpenClaw instances all running doing work, and when you ask what it is, you can never get a straight answer.

For the record though, I love agentic coding. It deals with the accumulated cruft of software for me.
Cthulhu_: Maybe. The reality of software engineering is that there are a lot of mediocre developers on the market and a lot of mediocre code being written; that's part of the industry, and the job of engineers working with other engineers and/or LLMs is that of quality control, through e.g. static analysis, code reviews, teaching, and studying.
CrzyLngPwd: It sounds a bit no-true-Scotsman to me.
theshrike79: The claim was "most engineers", not "most engineers we've hired".

But also, "most engineers" aren't very good. AIs know tricks that the average "I write code for my day job" person doesn't know or, frankly, won't bother to learn.
rl3: Perhaps execution is cheap now and ideas aren't?

Personally I'm quite pleased with this inversion.
kdheiwns: Engineers will go back in and fix it when they notice a problem. Or find someone who can. AI will send happy little emoji while it continues to trash your codebase and brings it to a state of total unmaintainability.
pydry: > But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

Are you serious? I've been hearing this constantly since mid-2025. The gaslighting over AI is really something else.

I've also never before seen jobs advertised whose purpose was to lobby skeptical engineers about how to engage in technical work. This is entirely new. There is a priesthood developing over this.
kolinko: You've been hearing that since mid-2025 because that's when it became true.
duggan: Very much on the same page as the author; I think AI is a phenomenal accelerant.

If you're going in the right direction, acceleration is very useful. It rewards those who know what they're doing, certainly. What's maybe being left out is that, over a large enough distribution, it's going to accelerate people who are accidentally going in the right direction, too.

There's a baseline value in going fast.
baxtr: Isn’t that a bit like saying "we are all computer engineers now" in the 90s?

Anyone not using Excel after 1990 (?) would have lost their job. Same with AI. It’s a tool you need to use to stay competitive.

Is it perfect? No. The same way Excel, PowerPoint, etc. were never perfect. So many hours wasted calling the helpdesk. Same with AI. But there's no way you can avoid using it.
tern: I am solidly in this "curious" camp. I've read HN for the past 15(?) years. I dropped out of CS and got an art degree instead. My career is elsewhere, but along the way, understanding systems was a hobby.

I always kind of wanted to stop everything else and learn "real engineering," but I didn't. Instead, I just read hundreds (thousands?) of arcane articles about enterprise software architecture, programming language design, compiler optimization, and open source politics in my free time.

There are many bits of tacit knowledge I don't have. I know I don't have it, because I have that knowledge in other domains. I know that I don't know what I don't know about being a "real engineer." But I also know what taste is. I know what questions to ask. I know the magic words, and where to look for answers.

For people like me, this feels like an insane golden age. I have no shortage of ideas, and now the only thing I have a shortage of is hands, eyes, and, on a good week, tokens.
jruz: I find it really sad how stubbornly people dismiss AI as a slop generator. I completely agree with the author: once you spend the time building a good enough harness, oh boy, you start getting those sweet gains. It takes a lot of time and effort, but it is absolutely worth it.
javadhu: I agree on the curiosity part. I have a non-CS background, but I learned to program just out of curiosity. This led me to build production applications which companies actually use, and that was before the AI era.

Now, with AI, I feel like I have an assistant engineer with me who can help me build exciting things.
noemit: I'm currently teaching a group of very curious non-technical content creators at one of the firms I consult at. I set up Codex for them and created the repo to have lots of hand-holding built in, and they took off. It's been 4 weeks and we already have 3 internal tools deployed, one of which eliminated so much of another team's busy work that they now have twice the capacity. These are all things 'real' engineers and product managers could have done, but just empowering people to solve their own problems is way faster.

Today, several of them came to me and asked me to explain what APIs are (they want to use the Google Workspace APIs for something). I wrote out a list of topics/keywords to ask AI about and teach themselves. I've already set up the integration in an example app I will give them, and I literally have no idea what they are going to build next, but I'm... thrilled.

Today was the first moment I realized: maybe these are the junior engineers of the future. The fact that they have non-technical backgrounds is a huge bonus. One has a PhD in biology, one a master's in writing; they bring so much to the process that a typical engineering team lacks. I'm thinking of writing up this case study/experience because it's been a highlight of my career.
input_sh: Even speaking from a purely statistical perspective, it is quite literally impossible for "AI" that outputs the world's-most-average-answer to be better than "most engineers".

In fact, it's pretty easy to estimate what percentage of engineers it's better than: all it does is consume as much data as possible and return the statistically most probable answer, therefore it's gonna be better than roughly 50% of engineers. Maybe you can claim that it's better than 60% of engineers because bottom-of-the-barrel engineers tend not to publish their work online for it to be used as training data, but for every one of those you have a bunch of non-engineers who don't do this for a living putting their shitty attempts at getting stuff done with code online, so I'm actually gonna correct myself immediately and say that it's about 40%.

The same goes for every other output: it's gonna make the world's most average article, the most average song in a genre, and so on. You can nudge it to be slightly better than the average with great effort, but no, you absolutely cannot make it better than most.
q3k: The work is mysterious and important.
rimmontrieu: > But guided? The models can write better code than most developers. That’s the part people don’t want to sit with. When guided.

Where do you draw the line between just enough guidance and too much hand-holding of an agent? At some point, wouldn't it be better to just do it yourself and be done with the project (while also building your muscle memory, experience, and the mental model for future projects, just like tons of regular devs have done in the past)?
kirito1337: fr, like I started learning programming in C/C++ in 2020 at age 9, and in 2023 when the AI bubble took off, it felt like I did it all for nothing
bambax: I agree wholeheartedly with everything said in this article. When guided, AI amplifies the productivity of experts immensely. There are two problems left, though.

One is that laypersons don't understand the difference between "guided" and "vibe coded". This shouldn't matter, but it does, because in most organizations managers are laypersons who don't know anything about coding whatsoever, aren't interested in the topic at all, and think developers are interchangeable.

The other problem is: how do you develop those instincts when you're starting out, now that AI is a better junior coder than most junior coders? This is something we need to think about hard as a society. We old farts are going to be fine, but we're eventually going to die (retire first, if we're lucky; then die). What comes after? How do we produce experts in the age of AI?
jstanley: I think the problem is overstated. People always learn the things they need to learn.

Were people clutching their pearls about how programmers were going to lack the fundamentals of assembly language after compilers came along? Probably, but it turned out fine. People who need to program in assembly language still do. People who need to touch low-level things probably understand some of it, but not as deeply. Most of us never need to worry about it.
holyra: What about the environmental impact of AI, especially agentic AI? I keep reading praise for AI on the orange site, but its environmental impact is rarely discussed. It seems that everyone has already adopted this technology, which is destroying our world a little more.
holyra: Personally, I dismiss AI, mainly agentic AI, because of its environmental impact. I hope that one day everyone will be held accountable for it.
yanis_t: They will never admit it, but many are scared of losing their jobs. This threat, while not yet realized, is very real from a strictly economic perspective. AI or not, any tool that improves productivity can lead to workforce reduction.

Consider this oversimplified example: You own a bakery. You have 10 people making 1,000 loaves of bread per month. Now, you have new semi-automatic ovens that allow you to make the same amount of bread with only 5 people. You have a choice: fire 5 people, or produce 2,000 loaves per month. But does the city really need that many loaves?

To make matters worse, all your competitors also have the same semi-automatic ovens...
input_sh: And those mediocre engineers put their work online, as do top-tier developers. In fact, I would say that the scale is likely tilted towards mediocre engineers putting more stuff online than really good ones.

So statistically speaking, when the "AI" consumes all of that as its training data and returns the most likely answer when prompted, what percentage of developers will it be better than?
wartywhoa23: These people also prefer plastic, averaged-out images of AI girls to real ones. The Average is their top tier.
rel_ic: This is kind of like saying a kid can never become a better programmer than the average of his teachers.

IMHO, the reasons not to use AI are social, not logical.
input_sh: The kid can learn and become better over time, while "AI" can only be retrained on better training data.

I'm not against using AI by any means, but I know what to use it for: stuff where I can only do worse than half the population because I can't be bothered to learn it properly. I don't want to toot my own horn, but I'd say I'm definitely better at my niche than 50% of people. There are plenty of other niches where I'm not.
bojan: On another note: if you have 100 engineers and you lay almost all of them off, keeping 5 super-AI-accelerated engineers, while your competitor keeps 50 such engineers, your competitor is still able to iterate 10x as fast. So laying people off still comes at the risk of falling behind.
thefounder: The issue is that you become lazy after a while and stop “leading the design”. And I think that’s OK, because most of the code is just throwaway code. You will rewrite your project/app several times before it’s worth it to pay attention to “proper” architecture. I wish I had had these AIs 10 years ago, so that I could focus on everything I wanted to build instead of becoming a framework developer/engineer.
hansmayer: > But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

Because the instances of this happening are a) random and b) rare?
turblety: Maybe the bakery expands to make more than just loaves of bread, maybe different cakes, sandwiches, maybe expand delivery to nearby towns.
0x3f: A bit simplistic. The bakery can just expand its product range or do various other things to add work. In fact that's exactly what I would expect to happen at a tech company, ceteris paribus.
hansmayer: > Consider this oversimplified example: You own a bakery. You have 10 people making 1,000 loaves of bread per month. Now, you have new semi-automatic ovens that allow you to make the same amount of bread with only 5 people.

That is actually the case with a lot of bakeries these days. But there is one major difference: the baker can rely, with almost 100% certainty, on the form, shape, and ingredients being exact to within a rounding error. Each time. No matter how many times they use the oven. And they don't have to invent strategies for how to "best use the ovens"; they don't claim to "vibe-bake" 10x more than what they used to bake before, etc. The semi-automated ovens just effing work!

Now show me an LLM that even remotely provides this kind of experience.
v3xro: The only way I see out of this crisis (yes I'm not on the token-using side of this) is strict liability for companies making software products (just like in the physical world). Then it doesn't matter if the token-generator spits out code or a software engineer spits out code - the company's incentives are aligned such that if something breaks it's on them to fix it and sort out any externalities caused. This will probably mean no vibe-coded side hustles but I personally am OK with that.
sd9: I agree. I've gotten lazier over time too. But the cost of creating code is so cheap... it's now less important to be perfect the first time the code hits prod (application dependent). It can be rewritten from scratch in no time. The bar for 'maintainability' is a lot lower now, because the AI has more capacity and persistence to maintain terrible code.

I'm sure plenty of people disagree with me. But I'm a good hand programmer, and I just don't feel the need to do that any more. I got into this to build things for other people, and AI is letting me do that more efficiently. Yes, I've had to give up a puritan approach to code quality.
wartywhoa23: Yet another wannabe systems engineer cheers the robbery and loss of job of real systems engineers.
tern: I know it's not anyone's fault exactly, but the current state of systems in general is an absolute shit show. If you care about what you do, I'd expect you to be cheering that we just might have an opportunity for a renaissance.

Moreover, this kind of thinking is incredibly backward. If you were better than me then, you can easily become much better than I'll ever be in the future.
jwr: Finally a take that I can agree with.
sd9: Calling somebody a wannabe systems engineer is unnecessarily antagonistic.
dist-epoch: The environmental impact of AI replacing a human programmer is orders of magnitude lower than the environmental impact of that programmer. Look up average US water consumption and CO2 emissions per capita. And then add on top the environmental impact of all the money that programmer gets from programming: travels around the world, buying large houses, ...

If you care about the environment, you should want AIs replacing humans at most jobs, so that they can no longer afford traveling around the world and buying extravagant stuff.
wartywhoa23: So you mean that the human programmers who were replaced by AI are dead by now? "You'll be fine digging trenches, programmer," they said.

Seriously, though:

> ...so that they can no longer afford traveling around the world...

This is either sarcasm I failed to parse, or pure technofascism.
wartywhoa23: All environmental impacts are equal, but some of them are more equal than the others!
JR1427: This is what I find interesting: the response from most companies is "we will need fewer engineers because of AI", not "we can build more things because of AI".

What is driving companies to want to get rid of people, rather than do more? Is it just short-term, investor-driven thinking?
jasomill: In other words, there's probably a market for a model trained on a curated collection of high-quality code.
codebolt: One issue is that developers have been trained for the past few decades to look for solutions to problems online by just dumping a few relevant keywords into Google. But to get the most out of AI, you should really be prompting as if you were writing a formal letter to the British throne explaining the background of your request. Basic English writing skills, and the ability to formulate your thoughts clearly, have become essential skills for engineering (and something many developers simply lack).
holyra: This comes from a dystopian book (Animal Farm). What is your point?
salawat: The optimization function of capitalism and its instrumental convergence. The AI alignment problem is already here, and it is us.
wartywhoa23: If you read the book, my point should be crystal clear.
kelipso: I doubt it's sustainable. These big models keep improving at a fast pace, and any progress like this made in a niche would likely get caught up to very quickly.
kif: But that's the problem. Something that can be so reliable at times can also fail miserably at others. I've seen this in myself and colleagues of mine, where LLM use leads to faster burnout and higher cognitive load. You're not just coding anymore; you're thinking about what needs to be done, and then reviewing it as if someone else wrote the code.

LLMs are great for rapid prototyping, boilerplate, that kind of thing. I myself use them daily. But the number of mistakes Claude makes is not negligible in my experience.
salawat: > There's a baseline value in going fast.

Maybe to the people writing the invoices for the infra you're renting, sure. Or to the people who get paid to dig you out of the consequences you inevitably bring about. Remember: the faster the timescale, the worse we are wired to handle it as human beings. We're playing with a fire that catches and spreads so fast that, by the time anyone realizes the forest is catching and starts to react, the entire forest is already well on the way to joining the blaze.
bravetraveler: You know, about ten years ago, I felt the same way about linting. Alas, The Product is trending towards one big fuzz ball.
coldtea: And "taking the fun out" is one thing. Making 50% or more of coders redundant is a whole other can of worms.
coldtea: > The environmental impact of AI replacing a human programmer is orders of magnitude lower than the environmental impact of that programmer. Look up average US water consumption and CO2 emissions per capita.

The programmer will continue to exist as a consumer of those things even if they get replaced by AI in their job.
dist-epoch: But he will no longer have that much money to spend on environment damaging products.
sn0wflak3s: I get this. I don't think either of you is wrong. There's a real loss in not writing something from scratch and feeling it come together under your hands. I'm not dismissing that.

I have immense respect for the senior engineers who came before me. They built the systems and the thinking that everything I do now sits on top of. I learned from people, not from AI: the engineers who reviewed my terrible pull requests, the ones who sat with me and explained why my approach was wrong. That's irreplaceable. The article is about where I think things are going, not about what everyone should enjoy.
sn0wflak3s: I wrote it myself. But the irony isn't lost on me. "Who did what" is kind of the whole point of the article. Appreciate the feedback.
ChrisMarshallNY: > The problem is: you can’t justify this throughput to someone who doesn’t understand real software engineering. They see the output and think “well the AI did it.” No. The AI executed it. I designed it. I knew what to ask for, how to decompose the problem, what patterns to use, when the model was going off track, and how to correct it. That’s not prompting. That’s engineering.

That’s the “money quote,” for me. Often, I’m the one that causes the problem, because of errors in prompting. Sometimes the AI catches it; sometimes it goes into the ditch and I need to call for a tow.

The big deal is that I can considerably “up my game” and get a lot done, alone. The velocity is kind of jaw-dropping. I’m not [yet] at the level of the author, and tend to follow a more “synchronous” path, but I’m seeing similar results (and enjoying myself).
noemit: There are two types of engineers who use AI:

- Ones who see it generated something bad, and blame the AI.
- Ones who see it generated something bad, and revert it and try to prompt better, with more clarity and guidance.
ChrisMarshallNY: Three types:

- Ones that use it as a “pair partner,” as opposed to an employee.

Thanks for the implicit insult. That was helpful.
miningape: - Ones who see it generated something bad, and realise it'd be faster to just hand fix the issues than babysit an LLM