Not So Fast: AI Coding Tools Can Actually Reduce Productivity
secondthoughts.ai/p/ai-coding-slowdown

Note that the takeaway isn't "AI sucks" but rather that developers felt the AI made them faster even though the numbers showed the exact opposite. That may be due to the output quality, but also due to inexperience with using these tools.
@volpeon the takeaway is devs suck
@volpeon I can believe it, but as I understand it, it's also for a specific case of development where the developer is highly familiar with the codebase
@sun Yeah, from what I've read in the comments, AI tools help people get started with things they aren't familiar with, but as you gain experience (provided you're willing to learn from what the AI produced) you may be better off writing things yourself. Makes sense to me
The coding applications built on those models, like Cursor, are going to keep improving to make better use of the models
This part is funny, though. Just as Cursor is forced to enshittify because Anthropic raised its prices for enterprise customers (most likely because Anthropic is in trouble itself).
The problem tools like Cursor have is that, unlike classic software, AI is horrible to run at scale. With something like a social network, the cost per user goes down as the number of users increases. With AI, you can't get that kind of amortization, so costs grow linearly with users: computations on the GPU are specific to one model invocation, and a model invocation can't handle multiple requests at once.
When you run an LLM for one user and then another instance for a different user, you need twice the VRAM and twice the compute to get the same performance as the original single run.
Compare that to a database server used by one application: if you add a second application, how much do the resource requirements increase? Not by another 100%, that's for sure.
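The two scaling regimes in that argument can be sketched as a toy cost model. All the numbers below are made up purely for illustration; the point is only the shape of the curves, not the actual costs:

```python
# Toy cost model contrasting the two scaling regimes described above.
# All cost figures are hypothetical, chosen only to show the shapes.

def shared_server_cost(num_apps: int) -> float:
    """Classic shared service (e.g. a database server): one large fixed
    cost amortized across all applications, plus a small marginal cost
    per additional application."""
    fixed = 100.0    # hypothetical baseline cost of running the server
    marginal = 10.0  # hypothetical extra cost per additional application
    return fixed + marginal * num_apps

def llm_serving_cost(num_users: int) -> float:
    """LLM serving as described in the thread: each concurrent user
    needs its own slice of VRAM and compute, so total cost grows
    linearly and cost per user never falls."""
    per_user = 50.0  # hypothetical GPU cost per concurrent user
    return per_user * num_users

# Cost per user drops for the shared service but stays flat for the LLM.
for n in (1, 2, 10):
    print(f"{n:>2} users: shared={shared_server_cost(n) / n:6.1f}/user, "
          f"llm={llm_serving_cost(n) / n:6.1f}/user")
```

Running it shows the shared server's per-user cost falling toward its marginal cost as users are added, while the LLM's per-user cost is a flat line — which is the "linear growth" point above. (In practice, inference servers do batch concurrent requests to share the model weights, but the marginal GPU cost per request is still far higher than for classic software.)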