llm/5888b8dc-b96e-4444-9c3c-465dde409e92/topic-2-146074a1-ca3c-4b40-91c4-4e1b15015810-input.json
You are a comment summarizer. Given a topic and a list of comments tagged with that topic, write a single paragraph summarizing the key points and perspectives expressed in the comments. TOPIC: Skill atrophy concerns with AI COMMENTS: 1. The difference is that after you’ve googled it for ½ hour, you’ve learned something. If you ask an LLM to do it for you, you’re none the wiser. 2. Learning means friction; it's not going to happen any other way. 3. Some of it is friction, some of it is play. With AI you can get to the play part faster, where you do learn a fair bit. But in a sense I agree that less is retained. I think that is not because of a lack of friction; instead it's the fast pace of getting what you want now. You no longer need to make a conscious effort to remember any of it because it's effortless to get it again with AI if you ever need it. If that's what you mean by friction then I agree. 4. And I don't want to use tools I don't understand, at least to some degree. I always get nervous when I do something but don't know why I do that something. 5. I didn't build my car either. But I understand a bit of most of the main mechanics, like how the ABS works, how powered steering does, how an ICE works and so on. 6. I've seen this argument a few times before and I'm never quite convinced by it because, well, all those arguments are correct. It was an existential threat to the scribes and destroyed their jobs, the majority of printed books are considered less aesthetically pleasing than a properly illuminated manuscript, and hand copying is considered a spiritual act by many traditions. I'm not sure I'd say it's a correct argument, but considering everyone in this thread is a lot closer to being a scribe than a printing press owner, I'm surprised there isn't more sympathy. 7. Exactly. What makes it even more odd for me is they are mostly describing doing nothing when using their agents. 
I see the "providing important context, setting guardrails, orchestration" bits appended, and it seems like the most shallow, narrowest moat one can imagine. Why do people believe this part is any less tractable for future LLMs? Is it because they spent years gaining that experience? Some imagined fuzziness or other hand-waving while muttering something about the nature of "problem spaces"? That is the case for everything the LLMs are toppling at the moment. What is to say some new pre-training magic, post-training trick, or ingenious harness won't come along and drive some precious block of your engineering identity into obsolescence? The bits about 'the future is the product' are even stranger (the present is already the product?). To paraphrase theophite on Bluesky, people seem to believe that if there is a well free for all to draw from, there will still exist a substantial market willing to pay them to draw from this well. 8. Well, the lesson is that for all of us who invested a lot of time and effort to become good software developers, the value of our skill set is now near zero. 9. It's a little shameful, but I still struggle when centering divs on a page. Yes, I've known about flexbox for more than a decade, but I always have to search to remember how it is done. So instead of refreshing that less-used knowledge I just ask the AI to do it for me. The implications of this vs. searching MDN Docs is another conversation to have. 10. A little hypothesis: a lot of .NET and Java stuff is mainlined from a giant megacorp straight to developers through a curated certification, MVP, blogging, and conference circuit apparatus designed to create unquestioned, corporate-friendly, highly profitable dogma. You say ‘website’ and from the letter ‘b’ they’re having a Pavlovian response (“Azure hosted SharePoint, data lake, MSSQL, user directory, analytics, PowerBI, and… ”). 
Microsoft’s dedication to infusing OpenAI tech into everything seems like a play to cut even those tepid brains out of the loop and capture the vehicles of planning and production. Training your workforce to be dependent on third-party thinking, planning, and advice is an interesting strategy. 11. Calling it now: AI withdrawal will become a documented disorder. 12. I can absolutely see that happening. It's already kind of happened to me a couple of times when I found myself offline and was still trying to work on my local app. Like any addiction, I expect it to cost me some money in the future. 13. Just you wait until the powers that be take cars away from us! What absolute FOOLS you all are to shape your lives around something that could be taken away from us at any time! How are you going to get to work when gas stations magically disappear off the face of the planet? I ride a horse to work, and y'all are idiots for developing a dependency on cars. Next thing you're gonna tell me is we're going to go to war for oil to protect your way of life. Come on! 14. Aren't you afraid it's gonna be a race to the bottom? The software industry is now whoever pays Gemini to deploy something prompted in a few days. Everybody can, so the market will be inundated by a lot of people, and usually this makes for a bad market (a few shiny ones get 90% of the share while the rest fight for breadcrumbs). I'm personally more afraid that stupid sales-oriented types will take my job than of losing it to solid teams of dedicated experts that invested a lot of skill in making something on their own. It seems like value inversion. 15. Possibly, which means devs will have to pivot... I don't know where, though, since it would mean most jobs are over and a new economy must be invented. 16. I think everyone worries about this. No one knows how it's going to turn out, none of us have any control over it, and there doesn't seem to be anything you can do to prepare ahead of time. 17. Definitely. 
I'm not disparaging the process of assembling IKEA furniture, nor the process of producing software using LLMs. I've done both, and they have their time and place. What I'm pushing back on is the idea that these are equivalent to carpentry and programming. I think we need new terminology to describe this new process. "Vibe coding" is at the extreme end of it, and "LLM-assisted software development" is a mouthful. Although, the IKEA analogy could be more accurate: the assembly instructions can be wrong; some screws may be missing; you ordered an office chair and got a dining chair; a desk may have five legs; etc. Also, the thing you built is made out of hollow MDF, and will collapse under moderate levels of stress. And if you don't have prior experience building furniture, you end up with no usable skills to modify the end result beyond the manufacturer's original specifications. So, sure, the seemingly quick and easy process might be convenient when it works. Though I've found that it often requires more time and effort to produce what I want, and I end up with a lackluster product and no learned skills to show for it. Thus learning the difficult process is a more rewarding long-term investment if you plan to continue building software or furniture in the future. :) 18. Yes, people who were at best average engineers and those whose skills atrophied through lack of practice seem to be the biggest AI fanboys on my social media. It's telling, isn't it? 19. One concern is that less experienced engineers might never become experienced if they’re using AI from the start. Not that everyone needs to be good at coding. But I wonder what new grads are like these days. I suspect few people can fight the temptation to make their lives a little easier and skip learning some lessons. 20. No, but I don't use it to generate code usually. I gave agents a solid go and I didn't feel more productive, just became more stupid. 21. Perhaps it is a skill issue. 
But I don't really see the point of trying when it seems like the gains are marginal. If agent workflows really do start offering 2x+ level improvements then perhaps I'll switch over; in the meantime I won't have to suffer mental degradation from constant LLM usage. 22. > I feel like I can manage the entire stack again - with confidence. By not managing anything? Ignorance is bliss, I guess. I understand it. I've found myself looking at new stacks and tech, not knowing what I didn't know, and wondering where to start. But if you skip these fundamentals of the modern dev cycle, what happens when the LLM fails? 23. Then it fails and the world doesn't end. You fix it or delegate it and move on. Most people aren't working on code for power grids and fighter jets. There's room for failure. This same argument was used by the old-timers when younger programmers couldn't code assembly or C on bare-metal systems. 24. This isn't supposed to be a slam on LLMs. They're genuinely useful for automating a lot of menial things... It's just that there's a point where we end up automating ourselves out of the equation, where we lose the opportunity to learn and earn personal fulfilment. Web dev is a soft target. It is very complex in parts, and feels like a lot of menial boilerplate worth abstracting, but not understanding messy topics like CSS fundamentals, browser differences, form handling, and accessibility means you don't know to ask your LLM for them. You have to know what you don't know before you can consciously tell an LLM to do it for you. LLMs will get better, but does that improve things or just relegate the human experience further and further away from accomplishment? 25. > Over the past two decades, I’ve worked with a lot of talented people: backend developers, frontend developers, marketers, leaders, and more. I can lean on those experiences, fall back on how they did things, and implement their methods with AI. Will that really work? 
You interacted with the end product, but you don't have the experience and learned lessons that those people had. Are you sure this isn't the LLM reinforcing false confidence? Is the AI providing you with the real thing or a cheap imitation, and how can you tell? 26. Maybe it's just me, but I enjoy learning how all these systems work. Vibe coding and LLMs basically take that away from me, so I don't think I'll ever be as hyped for AI as other coders. 27. It sounds like an April Fools' entry. Things such as: "They’re far from perfect, but claude and codex gave me the leverage I desperately needed." Yikes. I most definitely don't want AI to take away abilities. I do kind of approach web development differently. Rather than static HTML and CSS for the most part (which I, of course, also use), Ruby acts as the primary wrapper and I treat HTML tags like objects, as well as everything else. So I kind of describe a web page on a (one level higher) layer. It is not 100% perfect, as some things are messy (also due to legacy; some of the code I started writing 20 years ago, I've updated in parts, but other parts need to be updated too, which is only possible when time permits); but even with this in mind, I simply could never go back to using the web with HTML and CSS as a primary means to describe web-related content. It would just be a very inefficient use of my time. > When AI generates code, I know when it’s good and when it’s not. OK - now I know this is an April Fools' entry indeed. > There’s mental space for creativity in building software again. Which, of course, would not make any sense. Now the article reads as an April Fools' entry, but if we were to assume he wrote this for real, why would AI have taken away creativity? People can still think on their own. In theory they could have the great ideas - and AI autogenerates all necessary code. So this use case would not be that terrible IF it were to work perfectly well. I don't see it working that way right now. 
AI often is just a mega-spammer everywhere. It spams out crap, some of which is useful, but the default is crap. > AI really has made web development fun again. Not really. But I also think that the whole web stack should be simplified and streamlined. Instead, what I see is the opposite happening. Complexity rises. And JavaScript sucks so much it is really unbearable. You can do many useful things in JavaScript, but as a language it is a true clown language. I used to think I disliked PHP the most, but I no longer use PHP, yet I have to use JavaScript. Every second line of code I ask myself how this joke could have ever become popular. Even Java evolved and got better. JavaScript appears to have gotten more stupid over the years. 28. To extend the metaphor, which provides better exercise for your body? A bicycle or a powered exoskeleton with turret cannons? 29. Laziness, or job search, or parenting, or health issues, or caregiving, or something else. It's not a binary stay-current-or-you're-lazy situation; it's that the entire industry is moving to shorter timelines, smaller teams, and more technical complexity for web projects simultaneously. LLMs are a huge dopamine hit for short-term gains when you're spinning plates day after day. The question is what the ecosystem will look like when everybody's been using LLMs as a stopgap for an extended period of time. 30. Web development may be fun again, but you aren't developing. You order and become a customer. Maybe you can distinguish good code from bad code, but how long will you keep checking it? Auditing was never the fun part. And I bet at some point you will notice a missing feeling of accomplishment, because you didn't figure out the how, you just ordered the what. We wouldn't call someone a painter who let AI do the painting. 31. When stuff was getting too complicated, I looked for ways to make things simpler. 
Developers have spent decades trying to figure out ways to make things simpler, less code the better, only to throw it all out the window because chatbot go brrrrrr. 32. honestly, with LLMs, everything is fun again. embedded dev with a billion toolchains, GPU development with each vendors bespoke API, ffmpeg with its billion parameters - if anything, you could say LLMs bailed us out of the impending ultra-specialization. without LLMs, we might be facing a world where 30% of the workforce is in software dev. i am keeping my eyes peeled on vibe-coding PCB layouts and schematics. a lot of eyes in that direction already but its still early. 33. > LLMs bailed us out of the impending ultra-specialization. This is fundamentally what makes them so DAMAGING to humanity. They didn't bail us out, they robbed us of it. 34. Specialization is for insects, as Heinlein said. We are going back to the Renaissance Man ideal and I'm all for it. 35. isn't it exactly the opposite? LLMs have killed the generalist, only specialists with very targeted skills have anything marketable 36. 100% the opposite. LLMs lack high level creativity, wisdom and taste. Being a generalist is how you build these. For example, there's a common core to music, art, food, writing, etc that you don't see until you've gotten good at 3+ aesthetic fields. There are common patterns in different academic disciplines and activities that can supercharge your priors and help you make better decisions. LLMs can "see" these these connections if explicitly prompted with domains and details, but they don't seem to reason with them in mind or lean on them by default. On the other hand, LLMs are being aggressively RL'd by the top 10% of various fields, so single field expertise by some of the best in the world is 100% baked in and the default. 37. “LLMs bailed us out of the impending ultra-specialization” - well said! 
Write a concise, engaging paragraph (3-5 sentences) that captures the main ideas, notable perspectives, and overall sentiment of these comments regarding the topic. Focus on the most interesting and representative viewpoints. Do not use bullet points or lists - write flowing prose.
Skill atrophy concerns with AI
37