Summarizer

LLM Input

llm/fa6df919-50f4-440a-804d-6a9d3e9721d8/topic-8-1282c119-37c9-4651-9ab3-8f97c5306f73-input.json

prompt

The following is content for you to summarize. Do not respond to the comments—summarize them.

<topic>
Skill Atrophy Fears # Worries that relying on AI will cause developers to lose skills, never develop expertise, and become unable to debug or understand their own systems
</topic>

<comments_about_topic>
1. Some of it is friction, some of it is play. With AI you can get to the play part faster, where you do learn a fair bit. But in a sense I agree that less is retained. I think that is not because of a lack of friction; instead, it is the fast pace of getting what you want now. You no longer need to make a conscious effort to remember any of it, because it's effortless to get it again with AI if you ever need it. If that's what you mean by friction, then I agree.

2. And I don't want to use tools I don't understand, at least to some degree. I always get nervous when I do something without knowing why I'm doing it.

3. >> The difference is that after you’ve googled it for ½ hour, you’ve learned something.

I've been programming for 15+ years, and I think I've forgotten the overwhelming majority of the things I've googled. Hell, I can barely remember the things I googled yesterday.

4. Yes and no.

Yes, I reckon coding is dead.

No, that doesn't mean there's nothing to learn.

People like to make comparisons to calculators rendering mental arithmetic obsolete, so here's an anecdote: First year of university, I went to a local store and picked up three items each costing less than £1, the cashier rang up a total of more than £3 (I'd calculated the exact total and pre-prepared the change before reaching the head of the queue, but the exact price of 3 items isn't important enough to remember 20+ years later). The till itself was undoubtedly perfectly executing whatever maths it had been given, I assume the cashier mistyped or double-scanned. As I said, I had the exact total, the fact that I had to explain "three items costing less than £1 each cannot add up to more than £3" to the cashier shows that even this trivial level of mental arithmetic is not universal.

I now code with LLMs. They are so much faster than doing it by hand. But if I didn't already have experience of code review, I'd be limited to vibe-coding (by the original definition, not even checking). I've experimented with that to see what the result is, and the result is technical debt building up. I know what to do about that because of my experience with it in the past, and I can guide the LLM through that process, but if I didn't have that experience, the LLM would pile up more and more technical debt and grind the metaphorical motorbike's metaphorical wheels into the metaphorical mud.

5. It's a little shameful, but I still struggle with centering divs on a page. Yes, I've known about flexbox for more than a decade, but I always have to search to remember how it's done.

So instead of refreshing that rarely used knowledge, I just ask the AI to do it for me. The implications of this vs. searching MDN Docs are another conversation to have.

6. A little hypothesis: a lot of .NET and Java stuff is mainlined from a giant megacorp straight to developers through a curated certification, MVP, blogging, and conference circuit apparatus designed to create unquestioned, corporate-friendly, highly profitable dogma. You say ‘website’ and by the letter ‘b’ they’re having a Pavlovian response (“Azure hosted SharePoint, data lake, MSSQL, user directory, analytics, PowerBI, and…”).

Microsoft’s dedication to infusing OpenAI tech into everything seems like a play to cut even those tepid brains out of the loop and capture the vehicles of planning and production. Training your workforce to be dependent on third-party thinking, planning, and advice is an interesting strategy.

7. Calling it now: AI withdrawal will become a documented disorder.

8. I can absolutely see that happening. It's already kind of happened to me a couple of times, when I found myself offline and was still trying to work on my local app. Like any addiction, I expect it to cost me some money in the future.

9. Just you wait until the powers that be take cars away from us! What absolute FOOLS you all are to shape your lives around something that could be taken away from us at any time! How are you going to get to work when gas stations magically disappear off the face of the planet? I ride a horse to work, and y'all are idiots for developing a dependency on cars. Next thing you're gonna tell me is we're going to go to war for oil to protect your way of life.

Come on!

10. Yes, people who were at best average engineers, and those whose skills atrophied through lack of practice, seem to be the biggest AI fanboys on my social media.

It's telling, isn't it?

11. > you need fancy tooling to ensure everyone can work at a reasonable level of productivity.

If you have a thousand people working on a single product, yes, but you also have the resources to have dedicated tool support teams at that level. In my experience, if you’re under multiple dozens of developers or not everyone works on all of your projects, the tools fragment because people aren’t combining or configuring them the same way and there’s enough churn in the front-end tool space that you’ll hit various compatibility issues which lower the effectiveness of sharing across projects. This is especially true if you’ve hired people who self-identify as, say, Next or Tailwind developers rather than web developers and lack the understanding of the underlying technology to fix complex problems.

12. One concern is those less experienced engineers might never become experienced if they’re using AI from the start. Not that everyone needs to be good at coding. But I wonder what new grads are like these days. I suspect few people can fight the temptation to make their lives a little easier and skip learning some lessons.

13. No but I don't use it to generate code usually.

I gave agents a solid go and I didn't feel more productive, just became more stupid.

14. Perhaps it is a skill issue. But I don't really see the point of trying when it seems like the gains are marginal. If agent workflows really do start offering 2x+ level improvements then perhaps I'll switch over, in the meantime I won't have to suffer mental degradation from constant LLM usage.

15. > I feel like I can manage the entire stack again - with confidence.

By not managing anything? Ignorance is bliss, I guess.

I understand it. I've found myself looking at new stacks and tech, not knowing what I didn't know, and wondering where to start. But if you skip these fundamentals of the modern dev cycle, what happens when the LLM fails?

16. Then it fails and the world doesn't end. You fix it or delegate it and move on. Most people aren't working on code for power grids and fighter jets. There's room for failure.

This same argument was used by the old-timers when younger programmers couldn't code assembly or C on bare-metal systems.

17. This isn't supposed to be a slam on LLMs. They're genuinely useful for automating a lot of menial things... It's just that there's a point where we end up automating ourselves out of the equation, where we lose the opportunity to learn and to earn personal fulfilment.

Web dev is a soft target. It is very complex in parts, with what feels like a lot of menial boilerplate worth abstracting, but not understanding messy topics like CSS fundamentals, browser differences, form handling, and accessibility means you don't know to ask your LLM about them.

You have to know what you don't know before you can consciously tell an LLM to do it for you.

LLMs will get better, but does that improve things, or just relegate the human experience further and further away from accomplishment?

18. AI makes finishing projects easier. But I would steer away from starting them.

In order for me to be comfortable with a code base and consider it mine, I need to have written the foundation, not merely reviewed it. Once the pillars are there, LLMs do make further development faster, and I can concentrate on fun details (like tinkering with CSS or thinking about some very specific details).

19. > Over the past two decades, I’ve worked with a lot of talented people: backend developers, frontend developers, marketers, leaders, and more. I can lean on those experiences, fall back on how they did things, and implement their methods with AI.

Will that really work? You interacted with the end product, but you don't have the experience and learned lessons that those people had. Are you sure this isn't the LLM reinforcing false confidence? Is the AI providing you with the real thing or a cheap imitation and how can you tell?

It sounds like an April Fools' entry.

Things such as:

"They’re far from perfect, but claude and codex gave me the leverage I desperately needed."

Yikes. I most definitely don't want AI to take away abilities.

I do approach web development somewhat differently. Rather than static HTML and CSS for the most part (which I, of course, also use), Ruby acts as the primary wrapper, and I treat HTML tags like objects, as well as everything else. So I describe a web page on a layer one level higher. It is not 100% perfect, as some things are messy (also due to legacy: I started writing some of the code 20 years ago, and while I have updated parts of it, other parts still need updating, which is only possible when time permits). But even with this in mind, I simply could never go back to using HTML and CSS as the primary means to describe web-related content. It would be a very inefficient use of my time.

> When AI generates code, I know when it’s good and when it’s not.

Ok - now I know this is an April Fools' entry indeed.

> There’s mental space for creativity in building software again.

Which, of course, would not make any sense. Again, the article reads like an April Fools' entry, but if we were to assume he wrote this for real: why would AI have taken away creativity in the first place? People can still think on their own. In theory they could have the great ideas, and AI autogenerates all the necessary code. So this use case would not be that terrible IF it were to work perfectly well. I don't see it working that way right now. AI is often just a mega-spammer everywhere. It spams out crap, some of which is useful, but the default is crap.

> AI really has made web development fun again.

Not really. But I also think that the whole web stack should be simplified and streamlined. Instead, what I see is the opposite happening: complexity rises. And JavaScript sucks so much it is really unbearable. You can do many useful things in JavaScript, but as a language it is a true clown language. I used to think I disliked PHP the most, but I no longer use PHP, yet I have to use JavaScript. Every second line of code I ask myself how this joke could ever have become popular. Even Java evolved and got better. JavaScript appears to have gotten more stupid over the years.

21. To extend the metaphor, which provides better exercise for your body? A bicycle or a powered exoskeleton with turret cannons?

22. Ok but you didn't ask any questions in the transcript you provided. Maybe that one was an outlier?

In order to learn you generally need to actually do the thing, and usually multiple times. My point is that it's easy to use an AI to shortcut that part, with a healthy dose of sycophancy to make you feel like you learned so well.

23. >>Starting a new project once felt insurmountable. Now, it feels realistic again.

Honestly, this does not give me confidence in anything else you said. If you can't spin up a new project on your own in a few minutes, you may not be equipped to deal with or debug whatever AI spins up for you.

>>When AI generates code, I know when it’s good and when it’s not. I’ve seen the good and the bad, and I can iterate from there. Even with refinement and back-and-forth prompting, I’m easily 10x more productive

Minus a baseline, it's hard to tell what this means. 10x nothing is nothing. How am I supposed to know what 1x is for you, is there a 1x site I can look at to understand what 10x would mean? My overall feeling prior to reading this was "I should hire this guy", and after reading it my overwhelming thought was "eat a dick, you sociopathic self-aggrandizing tool." Moreover, if you have skill which you feel is augmented by these tools, then you may want to lean more heavily on that skill now if you think that the tool itself makes everyone capable of writing the same amazing code you do. Because it sounds like you will be unemployed soon if not already, as a casualty of the nonsense engine you're blogging about and touting.

24. > LLMs bailed us out of the impending ultra-specialization.

This is fundamentally what makes them so DAMAGING to humanity. They didn't bail us out, they robbed us of it.

25. Isn't it exactly the opposite? LLMs have killed the generalist; only specialists with very targeted skills have anything marketable.
</comments_about_topic>

Write a concise, engaging paragraph (3-5 sentences) summarizing the key points and perspectives in these comments about the topic. Focus on the most interesting viewpoints. Do not use bullet points—write flowing prose.


commentCount

25
