Discussion
Martynas M.
robotswantdata: The current deployments of chatbots are not the bar to compare with. There’s an incoming wave of extremely capable agents and process reimagining that is going to be highly disruptive. Been in this space over a decade, and this time really is different. It’s hard for humans to perceive the exponential; it will be slow, then sudden.
rando77: It's worth taking actions that take large-scale future job losses due to AI into consideration, even if now is not the time.
Madmallard: Are you sure about that one? What exactly will these agents be able to do with enough consistency, accuracy, and reliability that people will want to hire them over humans? In my experience with even the most basic implementation of agents, i.e. customer service chat bots, I literally cannot stand interacting with them even once. They are extremely unhelpful and I will hang up or immediately ask to speak to a human.
edu: Obviously your support chatbot will talk to your flavor of clawd that will call Claude Code that will code a solution that will be reviewed by Codex that will merge and release it and then will ping clawd that will send an email to the user announcing that their issue has been fixed. /s just in case
badgersnake: The next AI I’m working on is going to be amazing and change the world. Please back my Series A; you won’t regret it. (Let’s not talk about my blockchain startup, my VR startup, and my NFT startup.) My house is nice though.
faangguyindia: Wait, what? I am in India and jobs are being lost by the thousands every day in the IT field. In fact, I go and implement dumb AI models in many companies, and executives immediately show "how many people they can fire with this advancement".
kensai: Just watch this for some depressive vibes: https://x.com/TechLayoffLover
Ygg2: No posts found.
monegator: I love how the comments are always full of doomposting. "Get prepared. Something is coming *soon*." And how any even slightly skeptical comment gets downvoted to hell. One may start thinking there are bots promoting the narrative.
intellectronica: Every morning the turkey rejoiced and said to himself, "Oh joy, I'm such a lucky turkey. I don't have to do anything, the food is plentiful, I just eat and shoot the breeze the whole day long. What an awesome life!" Until one morning, the day before Thanksgiving, the turkey rejoiced about the awesome day he was about to have ... only to be picked up 5 minutes later and dragged to the slaughterhouse.
qsera: Where is the news that says this?
faangguyindia: why would these private consultancy firms leak news?
qsera: I mean, not them, but the people who are (supposedly) losing these jobs might...
askl: It's always different this time. It's always going to take just a couple more months or years. And then people move on to the next hype topic.
aurareturn: Let's suppose you are a medium-sized business. You've always wanted to provide top quality customer service but couldn't do it before because you'd need to hire 5 people to do it. Instead, you strategically decided to not provide quality customer service and sell the product at a lower cost. So you have no customer service person in the company. Service is bad. It limits growth. But it was strategic to not provide good service in order to gain an advantage somewhere else in the business. But now, you can hire 1 customer service person, who could then use AI agents to provide the top quality customer service. Previously, you needed to hire 5 people, which wasn't worth it. So you went from no customer service employee to 1. I suspect that this is what will happen. Many companies will hire their first customer service person or more. Many big companies will lay off most of their customer service people. The net effect might actually increase total customer service employment.
truetraveller: "Top customer service" and AI do not mix. People hate an AI response more than a late, real response.
ares623: Well ain't this a chronological oddity. Always 6 months away! I don't want Codex dammit! I'm a Claude Code man.
chrz: Let's go back to waterfall even harder and write the super correct and detailed design doc.
odyssey7: It’s a compelling story, but what you’re describing is, to the turkey, a black swan event rather than an obvious inevitability that all the other turkeys keep telling the turkey is going to happen.
fabian2k: I think the argument here is a bit of a strawman, though there is a good point in there as well. AI will not automate all customer support, but it has the potential to automate a large fraction of it. The anecdote in there is about complex B2B enterprise software. That's not the majority of customer support, and is very heavy on escalating to actual experts. You don't have to remove 100% of the jobs to have huge effects. Automating large parts of a few sectors would already create significant disruptions.
skywhopper: The article literally addresses this point. The easily automated stuff doesn’t save that much money. The big costs of support are the hard things you can’t automate.
jonathanstrange: My biggest worry currently isn't even job-related, it's that corporations and authorities will use AI for customer/client relations but that this AI will not be allowed to make any significant changes and is therefore an utter waste of time. In many places, this could turn an already dire situation into an absolute nightmare. What might make it even worse is that authorities - and probably also corporations - will likely ban or block user AI agents, so you cannot even use your own AI to negotiate with their AI. That's something that needs to be addressed by lawmakers ASAP. There needs to be a right to speak to a human, or (the perhaps overly tech optimistic route) a prohibition of AI that doesn't have adequate decision-making power.
motbus3: Not putting names but a company I know closely had 20 engineers and now has only 4. And I feel like they plan for less
cess11: https://xcancel.com/TechLayoffLover works for me.
jdalsgaard: > Because the remaining 10% is what required most of the CS team’s time. They built an FAQ you can talk to.
These days it's hard to get people to read an email longer than 5 lines - yet people are super excited about abundant masses of text generated by LLMs. It does not compute...
fabian2k: For this specific business. I don't think that is true for every business or field.
lelanthran: To be perfectly honest, the majority of work is going to see a restructure soon anyway. "Triaging by LLM before sending the task to any human" can work for almost anything, not just support calls. On another story I saw someone mention that they'd like something like an ad-blocker, but for content: a "content-blocker". It's not too hard running even a local model that, via a browser extension, scans the current page and places it into one of several bins: read verbatim, summarise with ChatAI, ignore completely, read and mark for re-reading. Software dev? Bin a ticket into "complex", "simple", "talk to lead dev". Software proposal? Bin the proposal into "COTS available", "FOSS available", "quick dev", "too costly to proceed". Bookkeeping? Accounting? They all have tasks that can be binned. What does this all mean, I hear you ask? Well, you no longer need as many employees if some of the bins are "ChatAI and/or agent can complete this" with human review. So, yeah, a lot of people are going to be out of work if this works like they say it does.
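The "bin first, then route" pattern described above can be sketched in a few lines. This is a hypothetical illustration, not any real product: `classify` here is a trivial keyword stub standing in for whatever actually does the triage (an LLM call, a local model, hand-written rules), and the bin names follow the software-dev example in the comment.

```python
# Hypothetical sketch of the "bin by model, then route" triage pattern.
# classify() is a keyword stub standing in for a real model; the bins and
# routes are invented for illustration.

ROUTES = {
    "complex": "talk to lead dev",
    "simple": "agent completes, human reviews",
    "unknown": "straight to a human",
}

def classify(ticket: str) -> str:
    """Stub classifier: a real system would call a model here."""
    text = ticket.lower()
    if any(w in text for w in ("crash", "data loss", "security")):
        return "complex"
    if any(w in text for w in ("typo", "label", "wording")):
        return "simple"
    return "unknown"

def triage(ticket: str) -> tuple[str, str]:
    """Return the (bin, route) pair for a ticket."""
    bin_name = classify(ticket)
    return bin_name, ROUTES[bin_name]
```

The headcount argument in the comment maps directly onto the routes: only the "unknown" and "complex" bins consume full human attention, so the share of tickets landing in "simple" is the share of work that gets automated-with-review.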
mjd: Or it might mean they produce more valuable product and more of it and therefore need more devs to do it. If a dev produces value for the company, and then the company can automate away the least valuable part of the dev's job, the dev is now more valuable. Why would the company get rid of them just at that moment? Well, some will, because some companies are badly run. Others will take advantage of the opportunity.
mikkupikku: Years ago, when you went into computers, didn't you have normies warning you that one day computers would program themselves? 20 years ago nobody could tell you whether this would happen in 20 years or 200, but I do believe there has been a general sense of this sort of thing happening eventually.
flanked-evergl: Managed decline policies of western governments are much more threatening to white-collar workers and everyone else than AI will ever be. AI will enable significantly faster economic growth, which is something the EU has been making impossible with legislation designed to destroy Europe's economic advantage.
skywhopper: There is such a mind-bogglingly huge amount of waste in IT services worldwide, particularly in the consulting and offshoring areas, that big swings, up and down, in that area don’t actually have anything to do with what works well or doesn’t. Decisions are made to offshore work or drop offshore contracts based on the latest hype cycle, not whether it is effective or worthwhile. So while there may be lots of consultants losing their jobs, that’s not because AI tools do the work better. It’s because management thinks investors will accept the story that AI tools will do the work better and save money. Management, and investors, don’t know, can’t judge, and honestly don’t actually care if it’s better or worse. And they run things so poorly it would be impossible to tell anyway.
robotswantdata: The voice agents in development right now feel 100x better than the current chatbots deployed by companies. I had the same opinion until a few months ago; now I would prefer the [redacted company so as to not give free marketing] AI agent. You’ll start seeing this wave in around 3-6 months, as most are in trial.
Stromgren: I’ve been involved in building a system that reads structured data from a special form of contract used in a specific industry. Prices, clauses, pickup, delivery, etc. A couple hundred data points per contract. We had many discussions around how to present and sell an imperfect system. The thing is, the potential customers are today transcribing the contracts manually, and we quickly realized that people make a ton of mistakes doing that. It became obvious when we were working on assertion datasets ourselves. It’s not a perfect system and you have to consider how you use the data (aggregating for price indexing, for instance), but we’re actually doing better than what people achieve when they have to transcribe data for hours a day.
windward: So I get to continue the hedonistic life without having to plan for a long slaughter? That's not such a bad way to go.
keiferski: Bifurcation is the right model and it’s already happening. For things where the end customer doesn’t care if they’re interacting with an AI, reading content by an AI, etc. – or if the company doesn’t care what the customer thinks (see: automated phone customer support lines for the last twenty years) – the work will be replaced by AI work. Examples are any kind of rote documentation, generic digital asset creation like blog images, low level customer support, and most things where the company doesn’t really care about the customer, because the company is getting paid regardless. If it does matter what the end customer thinks, the role will become increasingly humanistic in nature. Examples are high-end enterprise sales, personality and expertise-driven media and content, and anything where being “revealed” as an AI is perceived negatively.
badgersnake: AI bros flagged it to death
mikkupikku: I implemented HN article triage years ago using nothing more than naive Bayesian classification on the text of the headlines. Worked surprisingly well, you might try that.
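The naive Bayes approach mikkupikku describes needs nothing beyond the standard library. A minimal sketch, assuming headline-level labels you assign yourself (the training data below is invented for illustration): a multinomial naive Bayes over headline tokens with Laplace smoothing.

```python
# Sketch of headline triage via naive Bayes, as in the comment above.
# Stdlib only; labels and training headlines are made up for illustration.
import math
import re
from collections import Counter, defaultdict

def tokenize(headline):
    # Lowercase and split on runs of letters/digits; crude but fine for titles.
    return re.findall(r"[a-z0-9]+", headline.lower())

class NaiveBayes:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> token frequencies
        self.label_counts = Counter()            # label -> number of headlines
        self.vocab = set()

    def train(self, headline, label):
        self.label_counts[label] += 1
        for tok in tokenize(headline):
            self.word_counts[label][tok] += 1
            self.vocab.add(tok)

    def classify(self, headline):
        total = sum(self.label_counts.values())
        best_label, best_lp = None, float("-inf")
        for label, n in self.label_counts.items():
            # log P(label) + sum of log P(token | label), Laplace-smoothed.
            lp = math.log(n / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in tokenize(headline):
                lp += math.log((self.word_counts[label][tok] + 1) / denom)
            if lp > best_lp:
                best_label, best_lp = label, lp
        return best_label
```

Feed it labeled titles with `train` and `classify` returns the most probable label for a new one; with a few hundred labeled headlines this kind of classifier gets surprisingly usable, which matches the comment's experience.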
windward: > And I feel like they plan for less
I think this mentality must have its own imminent apocalypse. Gifted with an enormous increase in potential productivity, the decision is to do the same but cheaper? Who allocates capital to such spiritless commodification? It all feels like using a printing press to make one bible a month. There must be a role that can be more productive. It might not necessarily be our skillsets that fit those roles, and the roles might be more stratified, but someone is going to be able to do more, be paid more.
gloxkiqcza: In some ways AI sounds almost utopian. In theory it could redistribute manpower more evenly between small and large businesses, allowing them to compete more fairly and improving the efficiency of capitalism (the idealistic model, not the real-world state). However, then you remember that AI tech is currently almost fully controlled by big tech (and its next generation), and you have to ask whether they'll be able to sabotage that improvement, because they will surely do their worst, since liberating the market is not beneficial to them. Let’s hope that despite all odds and current trends we actually reach a state where AI can be run on-prem/locally and there are still SOTA models at least as open as they are today.
mikkupikku: "Nobody thinks this way, so all the posts saying it must be bots. Wow, look at all the posts saying the thing, there must be a lot of bots." Or maybe you're choosing to perceive bots when actually a lot of people disagree with you?
CodeCompost: You jest but this is precisely what we have done. Our customers have downright rejected SCRUM. It is considered a waste of money.
n4r9: At a recent AI workshop management made clear that they see AI as rendering sprints and scrums obsolete, that Kanban makes a lot more sense, and that estimating effort/story-points is also becoming meaningless. Which is a strong silver lining if you ask me.
odyssey7: I want to understand how AI leads to this outcome.
rob74: Right, so it's time to dismantle environment/climate protection, worker safety/rights, employee protections etc. etc. like the Trump Administration is currently doing over in the US and make Europe great again?! (Actually, MEGA would be a great acronym, but Trump's friends in the EU are more focused on dismantling the EU rather than making it great.)
rob74: That's probably for the best, really, prevents you from wasting your time reading lots of AI slop...
rob74: Is this based on anything real or just AI-generated slop meant to trigger angry reactions? It doesn't quote any sources for any of the stories, so as far as I can see, they're probably 100% made up...
odyssey7: The fact that a work of satire that stimulated interesting discussion has been flagged is telling.
n4r9: I think it's to do with the bottleneck shifting away from code generation and towards specifying, reviewing, and integrating code. The process of working with AI agents to produce specs, tech specs, code, and reviews lends itself more to a flow-based structure (like kanban). Bear in mind this is a B2B enterprise company with a mix of legacy and greenfield. Might be different elsewhere.
lelanthran: > Or it might mean they produce more valuable product and more of it and therefore need more devs to do it.
You're assuming unbounded demand for whatever product the company is producing. If demand for their product is bounded, having 1 dev produce the output of 5 devs means that the company is going to have devs simply sitting around doing nothing for most of the day.
> If a dev produces value for the company, and then the company can automate away the least valuable part of the dev's job, the dev is now more valuable.
I don't follow this argument; there is a practical limit to how much development a company requires. In the past they may have had a team of 10 to satisfy that limit. If the limit is satisfied by a team of 2, the company... does what, exactly? After all, a limit is a limit.
tripledry: > It’s hard for humans to perceive the exponential, it will be slow then sudden.
True, but there are also perception biases that lead us to believe progress is exponential, even though it might as well be an S-curve. I'm having a hard time finding the right terms, but I'm sure there is some bias toward thinking that "the line goes up".
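The exponential-vs-S-curve point is easy to check numerically: in its early phase a logistic (S-curve) is nearly indistinguishable from an exponential, which is exactly why observers in the middle of one can't tell which they're on. A small sketch (the growth rate and carrying capacity are arbitrary illustration values):

```python
import math

def exponential(t, r=1.0):
    # Plain exponential growth, normalized so exponential(0) == 1.
    return math.exp(r * t)

def logistic(t, r=1.0, k=1000.0):
    # Logistic growth with carrying capacity k, scaled so logistic(0) == 1.
    return k / (1 + (k - 1) * math.exp(-r * t))

# Early on the two curves are nearly identical; later the logistic saturates.
for t in [0, 1, 2, 5, 10]:
    print(t, round(exponential(t), 1), round(logistic(t), 1))
```

At t=1 the two differ by well under 1%, while by t=10 the exponential has passed 20,000 and the logistic has flattened out below its ceiling of 1,000.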
monegator: I mean, looking around on social media, I would describe most LLM preachers and worshippers as either conmen or non-tech guys who still have no clue about the technology behind it; the kind that worship Elon and his kind, accepting without question every new absurd sales pitch from that bunch as the future, no matter how little it is based in reality. On the other side, the doom posters tend to be awfully mediocre professionals (or, again, conmen leveraging FOMO). Skeptics like the one in the article tend to be dismissed. I'm also a skeptic, and someone you would define as a 10x, I think, except a few years ago I would have just been, you know, good at my job? Please let me know when I'm going to be automated so I can start becoming good at something else. The future may not be bright for a number of reasons, but I still have not submitted to doom.
mikkupikku: In between the brainless masses mindlessly regurgitating press releases on one side (most of whom are human, not bots) and the reddit doomerism on the other, there are tons of people discussing real successes and failures they've had with the tech. If you can't see the middle, it's probably because you're in one of the extremes.
aurareturn: Whoever has compute will have the power. This is why big tech is plowing $1t into data center capex in 2026. Disclaimer: I'm an AI compute investor.
InfinityByTen: Spot on! I hate being sucked into an "accountability sink", where delay/bad treatment/ tangential answers are ok (somehow acceptable) and justified because it's not personal, "it's just the process".
mjd: Every company I have ever worked for has wanted to produce more better stuff to sell for more money. Some couldn't because they were resource constrained. Where are these businesses that only ever want to sell the same amount of the same stuff forever?
lelanthran: > Every company I have ever worked for has wanted to produce more better stuff to sell for more money.
So has every company I've ever been in, but at the same time, their problem was never production, it was always sales. No company I have ever been in had the problem of "demand is so large that even if we double output we still cannot satisfy it". Both things are true at the same time: companies want to produce more, but their rate of production is not the limiting factor, the rate of sales is.
> Where are these businesses that only ever want to sell the same amount of the same stuff forever?
Where did I make that claim? What companies want is to sell more stuff, but production is not what is preventing them from selling more stuff. Doubling production in a company does not lead to doubling sales; an increase in one does not by itself cause the other.
ForHackernews: Strongly dispute this. Compute depreciates very rapidly. Inference is cheaper than training. DeepSeek was the warning shot across their bow, but the big AI firms can't afford to change course without jeopardizing their "Wile E. Coyote off the cliff" economics. LLM performance is already plateauing; models will get more efficient. Good-enough models will be deployed on chips, the same way H.264 is a good-enough video codec but used ubiquitously.
badgersnake: We can’t have dissent around here.