Discussion
Why I am leaving the AI party after one drink
mstank: While I applaud her and wish her well — writing like this reminds me of a couple of things.

First, my aging father insisting on navigating using his unfortunately fading memory instead of Google Maps. Some people just won’t pick up technology, out of habit or spite, even if it hinders them.

Second, a quote I read here that I’ll paraphrase: “you can be the best marathon runner in the world and still lose a race to a guy on a bike.” Know the race you’re racing. It often changes.

I think it’s valid and commendable to keep the old ways alive, but also potentially dangerous to not realize they’re old ways.
haolez: I don't have a stake in AI, but more and more I see the following patterns:

- people who give in to AI do so because the technical merits suddenly became too big to ignore (even for seasoned developers who were previously against it)

- people who avoid AI center their arguments on principles and personal discomfort

Just from that, you can kind of see where this is going.
legitster: > I don’t want to feel this kind of “addiction.”

> I don’t want to depend on something doing the work I earn money with.

> I don’t want to give up my brain and become lazy and not think for myself anymore.

There are a lot of good reasons we should be skeptical of AI and not give up on essential skills. But sometimes I want to shake these people by the shoulders. Do you drive an automatic car? Do you use a microwave? Do you buy food from a grocery store? Do you own power tools?

The entire point of civilization and society is that we are all "addicted" to technology and progress. But the invention of the plow did not, in fact, make us lazier or stop us from using our brains. We just moved on to the next problems. Maybe the Amish have it right and we should just be happy with a certain level of technology. But none of us have "lost" the ability to go backwards if we really wanted to.

You can finally ask a computer to think and solve problems, and it will! People act like this is a brave new world, but this is literally what computers were supposed to be doing for us 50 years ago! If somebody finally came out with a fusion reactor tomorrow, I would half expect people to suddenly come out and say, "Oh, I don't think I can support this. What about the soul of solar panels? I think cheap electricity is going to make things too easy."
ddellacosta: > There are a lot of good reasons we should be skeptical of AI and not give up on essential skills. But sometimes I want to shake these people by the shoulders. Do you drive an automatic car? Do you use a microwave? Do you buy food from a grocery store? Do you own power tools?

Are you seriously suggesting that LLMs are exactly the same as these technologies in terms of impact, especially in terms of the effect on our minds and behavior? If so, I think you're the one who needs to be shaken.

https://publichealthpolicyjournal.com/mit-study-finds-artifi...

https://www.theguardian.com/lifeandstyle/2026/mar/26/ai-chat...
deadbabe: > Do you drive an automatic car? Do you use a microwave? Do you buy food from a grocery store? Do you own power tools?

The answer to these questions could easily be no, and life is way better for it.
coxmi: This is an insane claim:

> The entire point of civilization and society is that we are all "addicted" to technology and progress.

Technology is like much of material reality, in that we can think whatever the hell we like about its various forms, especially so if we’re surrounded by it.
JPKab: Excellent point on the automatic car.

I love driving a manual transmission. But I also understood why it was so hard for me to find a new Jeep Wrangler with a manual transmission a few years ago.
yomismoaqui: I would pay money to read rants on a forum like Hacker News, but in 190X, when cars started sharing the roads with horse carriages.

I bet they would look something like the posts we are seeing today from developers about agentic AI.
basket_horse: Why can’t people just acknowledge AI is good at some things and bad at others? Why does every post say AI is either groundbreaking or terrible? Get a grip, people. It’s a tool.
api: Nuanced opinions don't get clicks and don't fit in tweets. Algorithms hate context.

We've created a communications system bottlenecked by virality and short-form text and video, in which all nuance and context is stripped from everything.

This, far, far more than anything AI is doing, is what's making us dumber.
ognav: I must caution people against this; it reads like class warfare!

My wife's second cousin was a mediocre programmer with no job and no girlfriend. Once he used Claude Code, he vibe-coded 10 note-taking apps and 20 route planners in a week. He now has a YouTube channel and speaks at AI conferences. He has two girlfriends and a Lamborghini.

Please do not take the route of the Luddite. Be like my wife's second cousin!

[This reads like a regular Claude ad these days, so to be clear: this is sarcasm.]
rspoerri: Human thinks eating food from fire is bad because it looks charred and you might burn your fingers doing so. /s

I remember the time when people insisted that they would never use a mobile phone. I remember the time when people didn't understand my presentation about the magical "internet" (8th-grade school in '94).
tines: The problem is that you're likening fundamentally unlike things. AI isn't like a microwave or an automatic car or a power tool. It does not augment you. As I said elsewhere: AI is not a bicycle for the mind, it's an easy chair. You will lose more than you ever gain.
mattbis: I love using it to help me learn faster; it's making me more intelligent exponentially.

I really want to keep my brain cell count, though, and I still do most of the programming myself. Some projects / tasks, sure, it can do; but I still check every damn line it generated. There is almost always some mistake in there.

And AI code will need to be legally audited, so...
JPKab: "I'm not going to use this technology that obviously enhances my productivity because <insert emotional, subjective reasoning that no customer would ever care about here>."

I think a lot of people have forgotten why we actually get paid to write code. The person who wants an automated billing system doesn't care whether you hand-typed it, or whether the CSS that would have taken 2 hours to write took 8 seconds via an AI plus 60 seconds of you tweaking a border you didn't like. They just want their billing system. And if you are the person who takes 20x longer to build it, you're going to get outcompeted quickly. Sorry.
jr3592: I love posts like this. AI is easily the most disruptive thing to hit our industry in over a decade, and it feels like one of those "this changes everything" moments. Reading how it's impacting others is cathartic and helps shape my own understanding. Here are some thoughts I have from reading this article:

> The AI can’t “see” the output, so some responsive refinements were just not correct. Within one CSS rule block there were redundant declarations.

This, 1,000%. Vibe coding has its issues, and for me personally, frontend polish, responsiveness, and overall quality is the #1 most glaring one, and one that simply re-prompting often can't solve. Even the ability to screenshot your UI hasn't solved things like glitchy animations. If you want to do anything even remotely above a junior level, like scroll animations, page transitions, etc., good luck. AI will certainly try to do it for you, but inevitably it will not work perfectly, and you will need to manually refine or even rewrite code. When the code base isn't yours, that makes these rewrites a lot less fun.

> The guilty conscience at the same time, like I was cheating. I realized that when I move on like this, my project will never truly feel like my own.

I've wrestled with this over the last year, and still do to some extent. I'm trying to shift my perspective and envision myself as a brand-new developer, maybe 16 or 17 years of age. Would I think this isn't my work? I doubt it. I'd probably just (correctly) assume that this is the state of the art, this is how you do it.

Unfortunately, this doesn't fix a bigger problem... I just don't enjoy vibe coding as a craft. There's something special about sitting down in the morning with your coffee and taking on a difficult programming problem. You start writing some code, the solutions start to formalize in your mind, there's a strong back-and-forth effect where, as you code, the concepts crystallize further... small wins fuel a wonderful dopamine-hit experience... intellisense completions, compilation completions, page refreshes, etc. are now all replaced with dull moments, often spent waiting for the agent to return its response, which you now read.

> I’m curious (and a little bit scared) to see where we will go from here. I hope that in the end I can be part of a community that values craftsmanship, individuality and honest, high-quality work.

I really hope so too... But speaking honestly, I think this ship is sailing away quite quickly. Time is money, and it always has been this way. Very few organizations can afford the luxury of time when building, designing, etc. I see no chance for this genie to go back in the bottle, and I believe it has (and will continue) to fundamentally change the nature of our work. Over time, as these models improve, there's a chance they could dramatically reduce the overall need for developers... It will start with low-level teams, as we're seeing already, but it could expand.

I have been saying this to everyone: what's your exit strategy? I'm not saying you need to panic, but you need a plan for what happens if / when salaries tank dramatically. I hate to be "that guy", but in life I've found that expecting the worst isn't always a bad thing. Keep your mood up, prepare for the worst possible outcome, and be pleasantly surprised if that's not what happens.
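The redundant-declaration complaint quoted above is easy to picture with a small, made-up example (the selector and property values here are invented for illustration, not taken from the article's actual output):

```css
/* Hypothetical generated rule block of the kind described: the same
   property is declared twice in one block, so the later declaration
   silently wins and the earlier one is dead weight. */
.dashboard-card {
  padding: 16px;
  max-width: 480px;
  display: flex;
  flex-direction: column;
  padding: 1rem;    /* redundant: overrides the padding above */
  max-width: 100%;  /* redundant: overrides the max-width above */
}
```

A human reviewer spots this instantly, but a model that never renders the page gets no feedback signal telling it that its earlier declarations became dead code.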
TheGRS: I know I read several posts like these every day, but I've been thinking more and more that I should stick to the chat window and have AI guide me instead of doing the work for me. It's a great tool for showing me what to do when I don't know, for offering guidance, and for rubber-ducking, of course, but I definitely lose the context and understanding when I just let it rip and write everything for me.
aeternum: A lot of this is just wrong. AI can now see the output.

It's interesting that highly flawed opinion pieces like this are so popular.
rickdeaconx: The idea of getting information from other engineers relies on the assumption that the other engineers aren't already complacent with AI.
firmretention: I've found two personal use cases for LLM-generated code:

(1) I have an idea for some app, but either I feel it won't be useful enough / save me enough time to justify developing it, or I simply don't find the problem interesting enough to be motivated by it. In that case, a vibe-coded tool is perfect. It generally does one simple thing, and I don't care about long-term maintenance, because it just needs to keep doing that thing.

(2) Adding a feature to an open source project. Again, it's a case of "I want this feature, but am not willing to spend the time needed to implement it." Even a relatively simple open source project can take a day or two just to get a basic understanding of the code and where I need to make the changes. Now I can often get a functioning vibe-coded implementation within a few hours.

(2) leaves me with some unsettling feelings about how this will affect the future of open source software. Some of the features I've implemented this way may very well be useful to other users, but I can't in good conscience just dump a vibe-coded pull request on a project and expect them to do the work of vetting it. And if I didn't have the energy to implement the change myself, I'm definitely not going to bother going through all the LLM-generated code, cleaning it up to the standards of the project, etc. Whereas before, I didn't have a choice, and the idea of getting the change ready for a PR was much less daunting, since I understood the problem space and solution well.

So at least for myself, I can see a future where many of the apps I use are bespoke forks of popular applications. Extrapolate that to many, many people, and an interesting landscape emerges.
slumpt_: To be clear, the work you’re doing is only relevant to the extent it produces product.

Nobody cares if you use your brain or not. They do care if you’re efficiently delivering reliable product.
legitster: I think of myself as very practical - I drive a manual, I fix my own cars, I do my own house projects, I cook my own meals.

Which is part of the reason these anti-AI screeds fall on deaf ears for me. My generation has willingly abandoned all of these legitimately useful hard skills. But there's also nothing preventing you from picking and choosing what you care about.
cush: > I want to shake these people by the shoulders... Do you use a microwave?

Microwaves aren't doing active problem solving, though. It seems what the author is trying to say is that they enjoy problem solving and find coding a rewarding and creative experience. Sure, at-home cooks might enjoy zapping a frozen dinner, but the author is a chef who enjoys writing their own recipes and cooking from scratch. AI isn't just the microwave; it's also the chef.

> None of us have "lost" the ability to go backwards if we really wanted

This absolutely isn't true. Using Google Maps quickly makes people poorer at navigation - skills need to be practiced. The author thinks letting AI into their kitchen to cook for them will change them cognitively, make them lazy, and cost them their skills. And that would be true.

What it sounds like you're getting at, but never said, is that there might be newer skills on the other side that are even more rewarding, which may be true. But if history is any indication, there will be no shortage of folks who like things the old way and want to use their meat brains to provide bespoke goods and services that AI can't.
TechSquidTV: I'm hearing a lot of opinion, but nothing convincing.
bicepjai: All knowledge started as someone's opinion. The goal isn't to avoid opinions; it's to stress-test them. That's exactly what HN is for.
tjchear: Here’s my experience: just yesterday I had to tackle a task that would have taken a backend engineer and a frontend engineer several days, so I tasked several Claude Code agents to work on it autonomously. With the time freed up, I didn’t just twiddle my thumbs. I used it to read up on a topic that was making the rounds yesterday and gained a better understanding of it - something hard to do when you juggle both a job and raising a family. I could then reinvest what I learned in some other projects.

Just my two cents. Whether you use AI or not, I’m sure you’ll gain something.
Nevermark: Some of us have been waiting our whole lives for a comprehensive DWIM command.

> DWIM is an embodiment of the idea that the user is interacting with an agent who attempts to interpret the user's request from contextual information. Since we want the user to feel that he is conversing with the system, he should not be stopped and forced to correct himself or give additional information in situations where the correction or information is obvious. [0]

— Teitelman and his Xerox PARC colleague Larry Masinter, 1981

[0] https://en.wikipedia.org/wiki/DWIM
deepfriedrice: The automatic transmission gives us more dexterity for... what, exactly? Fiddling with the dash, reaching for something in the back seat, texting? The best-case human has much more control, but the average case seems worse off.
tines: Yeah, things get more and more terrible over time. Your point?
PeterStuer: I am the opposite. I do not understand how people do not get at least 5x more productive with GenAI.

Maybe it is because I do not do much front-end design. Maybe it is because I'm a bit more diligent than your average "viber", or maybe because for me it is easier to spot a suboptimal solution, or challenge it with edge cases from experience, etc.

But these people turning their backs, not on principle (because that I fully understand), but because of underperformance? Maybe their expectations are way out there? Maybe (most likely) it is the application domain? Maybe it is plainly a skill issue?

But seeing how GenAI is plowing through tough fields, I would not turn my back on it even if it wasn't there (yet) in my domain.
legitster: This is purely a matter of perception. Cooking a meal is a deeply intellectual process. If I buy a meal from a restaurant, yes, I am losing a skill. But if making a hollandaise is not a skill I ever need in my life, it's not really a practical loss.

AI is taking problems and putting them in a drawer so we never have to think about them again. Matches de-intellectualized making a fire. A washing machine de-intellectualized doing laundry. These are now solved problems. Our brainpower spent on them is effectively worth nothing. The only reason we need to learn to make a fire from scratch is for the intellectual satisfaction or for emergency situations. The same reasons we would choose to work on the problems that AI can now solve.

It's only a loss if you think the skill and ability you are losing is intrinsically valuable, and the only thing you are going to replace it with is leisure.
cush: > making a hollandaise is not a skill I ever need in my life

I know you just wanted to poke at the analogy, but if you like hollandaise, it's one of the easiest and most rewarding sauces to make at home! Restaurant hollandaise is usually terrible.
legitster: Agreed.

(Though it's not as easy as a béchamel, and yet I still see people buy jarred alfredo sauces. You can literally make an amazing alfredo sauce with pantry ingredients in less time than it takes to boil the noodles! Why would anyone buy an alfredo sauce!?)

Although this more or less is my point. If people are willing to give up these incredibly high-reward, low-effort skills, how much more uphill is the battle to make people code and process data?
rustystump: Most arguments against it are built on some moral principle, not on the objective reality of usefulness.

Crypto used to be the thing to hate, but that made sense, as the objective usefulness of crypto was meager. AI models were always crazy useful but prohibitively expensive. You'd need an entire team to build your models. Now you don't.
hirvi74: > I've been thinking more and more that I should stick to the chat window and having AI guide me instead of doing the work for me.

That's how I have been using AI for years. I feel like my productivity has skyrocketed over the past year or two, and all my code is still written by hand. It's like having StackOverflow on demand. I also never really have to worry about tokens or usage limits. I don't think I have ever hit the limit on the $20 Claude plan, and I use Claude every day.
lowlander: Patient: "Doctor, it hurts when I do this." Doctor: "Then don't do that!"

I'm finding that how you choose to use it makes all the difference in whether it's useful or not. I understand the reticence to jump on the hype train, and it's taken some reps to find the parts of building with AI that I don't like, how to navigate them, and how to keep it from making choices I wouldn't make or that are low quality.

> asking for a recommended tech stack

This is up to you. You can just tell it what tech stack to use. Better yet, bootstrap the project yourself and give it to the AI as the starting point. Nobody is saying AI has to make these choices for you and you're not allowed to anymore.

> I wasn’t happy with some of them because of my own experiences in the past... Even when deciding against something for a reason, Claude Code tried to push me back on the suggested track.

This kind of sounds like many human teammates at work... You don't always like their suggestions, or they aren't convinced by your arguments? The difference being, with AI you can just tell it what to do; no persuasion required.

Nothing about AI prevents you from thinking about design choices, architecture, data modeling, or even the minutiae if you want to. The only thing telling AI to do those things for you is you!
politelemon: > When you know CSS well (and I would consider myself as someone who does), you quickly find weird and broken things in the generated code.

You have encountered https://en.wiktionary.org/wiki/Gell-Mann_Amnesia_effect
deadbabe: A customer doesn’t give a fuck how long a billing system took to make; they only care that it works correctly.

A billing system only truly gets built once, then possibly maintained in perpetuity. This makes the advantage of building it 20x faster pointless. AI builds it in a day; will it matter 5 years from now if that billing system was instead built by hand in 20 days, a long time ago? No. The speed advantage of AI only comes into play when you have a lot of code to crank out continuously.

Do you have a need to constantly build bespoke billing systems at a rate of 1 per day? Probably not. So who cares. Take your little AI grift charging $1000/month somewhere else. It’s not needed.
bluGill: Every billing system in use is constantly maintained with new features, bug fixes, and the like. The system of 20 years ago would apply the wrong tax laws today. The people asking for a new feature today care about how easy those are to add.
prewett: I think adding new features is exactly the sort of place where AI is terrible, at least after you do it for a while. It's going to have a tendency to regenerate the whole function(s), but it's not deterministic. Plus, as others have said, the code isn't clean. So you're going to get accretions of messy code, the actual implementation of which will change around each time it gets regenerated. Anything not clearly specified is apt to get changed, which will probably cause regressions. I had AI write some graphs in D3.js recently, and as I asked for different things, the colors would change, how (or if) the font sizes were specified would change, all kinds of minor things. I didn't care, because I modified the output by hand, and it was short. But this is not the sort of behavior I want my code output to have.

I think after a while the accretions are going to get slow, and probably unmaintainable even for AI. And by that time, the code will be completely unreadable. It will probably make the code I have had to clean up, written by people who probably should not be developers, look fairly straightforward in comparison.
bluGill: That is why I understand everything before I commit. AI can write a lot of bad code, but an expert can guide it to good code.
sunir: It’s not insane. They are correct: that is the point of civilization, which carries information from generation to generation, outside the oral tradition, in a systematic, organized, reliable way.
coxmi: The point of civilisation, however loose that idea may be, is, if it’s anything at all, determined by people.

Technology today can feel like it is defining its own path, in a sense, but much like oral tradition, neither is a large enough concept to describe civilisation.
saltcured: Or, for the memetics fans out there, the point of people, if it's anything at all, is determined by civilization.
tines: Now you're getting it! The modern way of life, which prioritizes convenience and production, destroys human connection. Making sauce is now pointless. Let's go one step further and make every other thing you might do equally pointless. Welcome to the hellscape!
mhluongo: If I'm not mistaken, this was Socrates' exact perspective on writing.
jplusequalt: > Socrates' exact perspective on writing

Again, writing replacing memorization is not a good 1:1 comparison to AI replacing technical understanding. Someone still needs to understand what is written and act upon that knowledge. That requires skill and experience in the domain they're working within. However, a person using an AI does not need to understand the underlying problem to get results. A person can ask Claude Code to write them a web app dashboard without ever having learned JS/CSS/HTML. It does not require them to have skills within a domain.

Also, we need to be honest with ourselves. Human brains did not evolve for the instant gratification of modern technology. We've already seen what technology has done to our attention spans. I am concerned about what further reliance on technology, particularly AI, will do to our brains.
legitster: > However, a person using an AI does not need to understand the underlying problem to get results. A person can ask Claude Code to write them a web app dashboard without having ever learned JS/CSS/HTML. It does not require them to have skills within a domain.

This perspective is funny to me because of how much the modern web is already built around web developers refusing to use CSS and PHP. The giving up of the skills happened before the automation.
jplusequalt: > It only a loss if you think the skill and ability you are losing is intrinsically valuable

What about the skill of learning itself? I would suggest that's one of the most important skills humans have evolved. The more integrated AI becomes in our societies, the more it will automate away potential opportunities for learning. I can foresee a world tightly integrated with AI where people are not only physically sedentary, but mentally as well.

As we progress further into the future, we need more educated people than ever to tackle the exponentially increasing complexities of our society. But AI presents an obstacle that many will never cross, due to how convenient it is to skip the messy work of understanding.

Also, this problem is not unique to AI. It existed before the GPTs and Claudes of the world. But it's a problem of scale, and every company on Earth right now is trying to scale AI up as fast as possible.
legitster: Here's a practical example: I am using AI to help me with my garden. It's been amazing - it helps me identify plants, identify soil issues, figure out what fertilizer to use and what days to apply it, etc.

What exactly did AI take from me? Spending hours of research on Google and YouTube to glean little incomplete bits and pieces? Calling a yard service?

It's also clearly obvious when AI gives bad or incorrect advice - I am still trying different things and watching for the results.

Coding is an outlier example where AI can just do the work semi-competently without anyone checking it. But I think that speaks more to the nature of coding itself - coding is a means to an end, and for most people not an actual pursuit in itself.
dogleash: > Do you use a microwave?

No, not really. I recently bought one after 18 months without. I got it for house guests to use, instead of them judging me for not having one.

Reheating leftovers got slightly easier. And I guess I could put away my kettle.
sunir: Technodeterminism is a common feeling in paradigm-shifting moments. Don’t forget who is at the helm of the change.
jplusequalt: > What exactly did AI take from me? Spending hours of research on Google and Youtube to glean little incomplete bits and pieces? Calling a yard service?

An opportunity for a deeper understanding of gardening? If you spend hours researching gardening and come away with an incomplete understanding of what you were attempting to do, I'm not sure that's immediately the fault of the research available. It could be that you just didn't do a good job of searching for the necessary information. In this way, AI can be a boon. It helps you figure out what you actually want to know in the moment. But I think it would be a step too far to say that a smattering of specific questions can replace the sturdy foundation provided by a typical education - e.g. through apprenticeship, books, etc.

> It's also clearly obvious when AI gives bad or incorrect advice

Is it? Isn't this a __core__ problem that researchers around the world are trying to solve? Also, __how__ could you make such a statement unless you already possessed the knowledge ahead of time to make such a judgment? I think it's hard to know if something is bad advice by looking at just cause and effect. It could be that you just lack the understanding to put the advice into practice.
allenrb: For what it’s worth (i.e., absolutely nothing), I agree with her 100%. I didn’t get into this field in order to prompt an AI to take care of the details. I got into it because I love the details.

I’m a strong performer on a good team at a company many people would want to work at… and I know the clock is ticking. Sooner or later, I will be too slow.

I’m not going to claim that this is the wrong way to go. It’s obviously the future, and the future doesn’t care what allenrb does or does not want. I’m somewhat hopeful that power and cooling requirements will come down by multiple factors of 10x over time, reducing the environmental damage.

The fact is, I love what I’ve been able to do “the old way” and just don’t feel the urge to move on. So it goes.
dd8601fn: Someone the other day was talking about there being two kinds of builders: one likes the details of doing, while the other likes the things they produce. The idea was that one likes AI and the other naturally hates it.

I thought about that for a bit and decided that, like most things, if you’re any good at something the “hard way”, you probably have some of both. Or at least I’m sure that’s true for me.

I LOVE that I can produce the things I want to create without spending months crafting lines of text. The “I know how to architect this, I know what a decent data model looks like, I have a good idea of where someone is likely to introduce security or scaling problems. I can pilot this plane and produce something GOOD.”

But I really also HATE looking at the final product and forever measuring, in my head, how much of it is even mine. Which parts haven’t I thoroughly reviewed, or would have spent a week learning and didn’t, or maybe wouldn’t have accomplished correctly at all? Am I a fraud now? I wasn’t before…

It’s a really painful trade for me.
mr_mitm: Douglas Adams really put it best:

> “I’ve come up with a set of rules that describe our reactions to technologies,” writes Douglas Adams in The Salmon of Doubt.

> 1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

> 2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

> 3. Anything invented after you’re thirty-five is against the natural order of things.
bachmeier: That's probably true to some extent, but I'm not completely on board.

> 1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

Television and calculators were in the world when I was born, but I never viewed them as "natural". TV always seemed to be a way to distract yourself from the world.

> 2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

I was happy to get on board with the WWW, the web browser, and widespread email usage. Those were revolutionary technologies with immense value. On the other hand, I'm still not on board with text messaging, phone scrolling, or social media. If I could, I'd eliminate social media from society.

> 3. Anything invented after you’re thirty-five is against the natural order of things.

I'm over 50 and a strong believer in the value of the LLM. It's a work tool that I can use at work and put away when I'm at home (or not, depending on my mood). It's new and exciting and revolutionary and a move in the right direction for humanity.