When Open Source Maintainers Don’t Understand That Community Is Important

This is just to vent frustration at a thoroughly stupid experience I had recently. A portion of that stupidity is me failing to read something correctly, but what I’m really stuck on is the stupidity of the response I got when I asked for help:

I asked for clarification, and was told to go away.

My reaction clearly indicates that I am not understanding something, and I even tried to give context about where I’m coming from so that it would be easier to spot what I misunderstood, but instead I was told to go ask a bot.

And then they blocked anyone from ever asking for help again.

The public is not allowed to open issues now.

What’s most frustrating to me about this is that it coincides perfectly with another issue I ran into today where I couldn’t add an important detail to an old issue. Past conversations are useful to people looking for assistance, especially when one solves their problem and explains it. When I am blocked from replying to something with a solution, anyone in the future experiencing the same issue is likewise blocked from finding the answer.

I now know what I messed up, but I’m not allowed to pass that knowledge to the future, because I was confused and made a mistake in how I asked for help.

There’s another layer to this that is often ignored: When this is the response the average newbie gets when they first try to contribute, they are encouraged to never ask again, or in the case of submitting pull requests, encouraged to never try to help again.

When open source maintainers discourage newbies, they cannibalize the future of their software.


Okay, that’s my entire point, but I also encountered some funny things as part of this.

What is a contribution? GitHub doesn’t know!

I think it’s interesting that GitHub says the repo limited opening issues / commenting on issues to past contributors, but I am a past contributor. GitHub clearly considers issues to be contributions, as every profile has a graph showing issues as part of their contributions:

My contributions: 89% commits, 11% issues.

AI tools can be very powerful, but they can also be very stupid

Earlier today, I tested Perplexity AI’s capability to answer a few basic questions easily answered through traditional search engines, such as which insect has the largest brain and which country is the current leader in development of thorium-based reactors. The results? It doesn’t know ants are insects, thinks fruit flies have large brains just because they have been the subject of a large number of studies, and ignores India in favor of China because western media reports on China a lot more.

But you know what, I wanted to test this asshole’s suggestion to ask ChatGPT about my problem, and surprisingly, it gave a very clear and accurate response!

Note (2024-10-02): OpenAI has since removed ChatGPT’s ability to access websites, and dumbed it down significantly. It is no longer a viable tool for most use cases.

ChatGPT points out what I misread: I have to clone the repo AND run NPM, not just run NPM.
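
For anyone who lands here with the same confusion, here is the gist of what I had missed, sketched as commands (the repo URL and the npm script name are placeholders, not the actual project; check the project’s ReadMe for the real ones):

    git clone https://github.com/<owner>/<project>.git
    cd <project>
    npm install        # installs dependencies inside the cloned repo
    npm run <script>   # whatever command the ReadMe tells you to run

The point being: the npm commands only work from inside a clone of the repository, not on their own.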

When you offer binaries for a project, they have to actually exist.

To be fair, this is a fairly recent change to the ReadMe, but maybe you should publish binaries before advertising that you publish binaries?

Getting Started: Download a release binary and run it. Simple.
The advertised binaries don't exist.

Installation and usage aren’t the same thing

It’s understandable to be confused about whether someone has correctly installed something, but after I confirmed that the installation worked, ignoring the question I actually asked is unhelpful, to say the least.

After confirming that I've installed it, my question is ignored.