AI Coding Tools Influence Productivity Inconsistently

Not So Fast: AI Coding Tools Can Actually Reduce Productivity by Steve Newman is a detailed response to METR’s Measuring the Impact of Early-2025 AI on Experienced Open-Source Developer Productivity study. The implied conclusion is that AI tools decrease productivity by roughly 20%, but this isn’t the only possible conclusion, and more study is absolutely required.

[This study] applies to a difficult scenario for AI tools (experienced developers working in complex codebases with high quality standards), and may be partially explained by developers choosing a more relaxed pace to conserve energy, or leveraging AI to do a more thorough job.
– Steve Newman

Under the section Some Kind of Help is the Kind of Help We All Can Do Without is exactly what I’d expect: the slowdown is largely attributable to time spent dealing with substandard AI output. I believe this effect can be reduced by giving up on AI assistance sooner. In my experience, AI tooling is best used for simple tasks where you verify the suggested code or tool usage against manuals and guides, or as a first pass to see which tools and libraries you should look up to better understand your options.

To me, it seems that many programmers are too focused on repeatedly re-prompting AI tools when that usually isn’t very effective. If AI can’t be coerced into correct output within a few tries, continuing usually takes more effort than writing it yourself.

I wrote the following from the perspective of wanting this study to be false:

There are several potential reasons the study’s results could be wrong. These pitfalls were accounted for, but I feel some of the arguments were not well-supported.

  • Overuse of AI: I think the reasoning for why this effect wasn’t present is shaky, because excluding those cases significantly reduced the sample size.
  • Lack of experience with AI tools: This was treated as a non-issue, but that determination relied on self-reporting, which is generally unreliable (as was pointed out elsewhere). (Though there was no observable change over the course of the study, suggesting that growing experience is unlikely to affect the result.)
  • Difference in thoroughness: This effect may have influenced the result, but there was no significant effect shown either way. This means more study is required.
  • More time might not mean more effort: This was presented with nothing to argue for or against it, because it needs further study.

(The most important thing to acknowledge is that it’s complex, and we don’t have all the answers.)

Conclusions belong at the top of articles.

Studies are traditionally formatted in a way that leaves their conclusions to the end. We’ve all been taught this for essay writing in school. This should not be carried over to blog posts and articles published online. I also think this is bad practice in general, but at least online, where attention spans are at their shortest, put your key takeaways at the top, or at least provide a link to them from the top.

Hank on vlogbrothers explains how the overload of information online is analogous to how nutrition information is overwhelming and not helpful. (This hopefully explains one of the biggest reasons why the important stuff needs to be clear and accessible.)

Writers have a strong impulse to save their best for last. We care about what we write and want it to be fully appreciated, but that’s just not going to happen. When you bury the lede, you are spreading misinformation, even if you’ve said nothing wrong.

Putting conclusions at the end assumes everyone reads the whole thing. Almost no one does. Most people read only the headline. Of those who continue, most read only the beginning, and most of the rest don’t finish either. Only a minority finishes everything they start – and that’s actually a bad habit, because many things aren’t worth reading ALL of. Like this: why are you still reading? I’ve made the point already. This text is fluff at the end, existing to emphasize a point you should already have understood from the rest.

Fuck Windows ..and Ubuntu (A Rant)

My OS History

I started with Windows 95, and it was okay. Upgrading to Windows 98 helped a lot and I still love that OS. My next experience was with Windows XP, and it was good. When Windows Vista first came out, I tried it and had several problems with it. (No, I don’t remember what they were.)

Somewhere around this time I was introduced to Linux and tried a few distributions. (My favorites were Ubuntu, slax, & Antergos. Later, my absolute favorite would be CrunchBang. I still miss all of these, including old versions of Ubuntu.)

After Windows 7 came out, my experience with Windows started to go downhill. Nonsensical errors (why does an administrator not have full disk access?), rebooting my computer without consent (no matter how many times I disabled this “feature”), running slowly despite good hardware.. the list is long. This is when I first thought about using Linux for things besides programming.

I again skipped Windows versions until Windows 10, mainly due to free upgrades being offered and hating the UI changes in Windows 8 that were only partially walked back. At this point, my hatred started. Default applications I can’t uninstall or even hide, advertisements built in, a virtual assistant always running in the background that I could not disable or remove.. and they even removed the pretense of controlling updates.

Oh, and the default malware included to spy on your usage, again, without consent.

But the problem was that I was running a mildly successful gaming YouTube channel at the time: I needed Windows because no video editor on Linux was good enough, and games only ran on Windows.

Then Steam announced Proton, and reviews were good. Over time, Linux seemingly became viable. There was even a hot new video editor called DaVinci Resolve, and it runs on Windows, macOS, and Linux!

What Happened to Ubuntu?

I ran various versions of Ubuntu as a secondary OS or from a USB drive, from version 8 through 18, without problems. Common issues like networking, video card support, and audio were never difficult to resolve – and often did not even occur. Ubuntu was a good choice because its popularity meant it was widely supported, and it was usually stable.

Not this time. I spent weeks trying to get it working, and while I was eventually successful, it was only through stubbornness and a lot of reading.

I started with a new SSD, as my old one only had 128 GB of space, and I was going to need a lot more for video editing and running games that require better disk streaming. First, the install failed because a disk I wasn’t even using was corrupted. Then, it failed to install the bootloader. Then it failed because of a partially completed Ubuntu install. I moved on to trying elementary OS (a derivative of Ubuntu) because it has several improvements and is still widely supported, but this also failed.

Turns out, since version 14.04, there’s been a bug where Ubuntu won’t install a bootloader if you select any disk besides the first. There is no warning of this anywhere, and I had to find a bug report from half a decade ago to even learn this. So, I removed all disks except my new SSD, moved it to the first SATA port on the motherboard, and Ubuntu .. still didn’t install.

Time to try again, except I accidentally booted into Ubuntu from the SSD.. you know, the OS that failed to install? So, it turns out that not only does it fail to install a bootloader under most conditions, but when the install does succeed, the installer crashes anyway and reports a failure. Oh well, at least I now have a working system, time to update!

Ubuntu Prominently Publishes Broken Versions

I run updates, and find out there’s a new OS version. I’d started with version 20.10 because it was what was current when I began this, and version 21.04 had been released since then. I run the upgrade.. and now I can’t boot anymore. This has never happened to me before, and this is a brand new system.

Turns out, version 21.04 shipped with a bug that breaks the bootloader on any system, whether a fresh install or an upgrade. Here’s the fucking problem: they only disabled update notifications, instead of pulling the faulty release or OFFERING ANY WARNING WHATSOEVER.

There is no reason I couldn’t have been notified not to update. There is no reason to keep a broken release public. There is no reason for any of this to have happened the way it did.

This is unacceptable, and even since fixing the problem on my system, Ubuntu has just become a completely different system from what it was. They added ads/spyware to the base OS and pushed updates that break configurations and uninstall apps. It’s just not good anymore, and it makes me sad.