this isn’t like, a particularly principled stance, because ultimately i think that the bulk of the ethical responsibility for computers lies at the feet of the people providing the computers, and not at the feet of some randos who wrote a software library that someone else decided to run on the computers. but also, no, i do not want to make those software platforms i strongly disagree with on a philosophical level Better
there’s plenty of tech thinkpiecing right now which is like “if a web scraper uses libxml2 to process web pages, does that mean that contributing to libxml2 is being complicit in web scraping?” and i don’t buy that. i think (1) the people who are complicit in the web scraping are the people providing the physical infrastructure (machines, wires, electricity) required to carry it out, and (2) libxml2 isn’t even that good, and the fact that the web scrapers can Just Use It instead of writing their own XML parser might save them a little bit of time and intellectual work in the short term, but in the long term they would just write their own XML parser. it’s not like nobody knows how to do that; secrecy regarding the methods and computer code is not a meaningful impediment
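(for concreteness, since “uses libxml2 to process web pages” can sound more mysterious than it is: here’s a minimal made-up sketch in C, not anybody’s actual scraper, of the kind of thing a scraper would lean on libxml2 for, namely parsing an already-fetched page and pulling out its link targets. the page contents, file name, and build line are all invented for the example.)

```c
/* a minimal, hypothetical sketch of "processing a web page with libxml2":
 * parse some already-fetched HTML and print every <a href>.
 * build (typically): cc links.c $(xml2-config --cflags --libs) */
#include <stdio.h>
#include <string.h>
#include <libxml/HTMLparser.h>
#include <libxml/tree.h>
#include <libxml/xmlstring.h>

/* walk the parsed tree and print the href attribute of every <a> element */
static void print_links(xmlNode *node) {
    for (xmlNode *cur = node; cur != NULL; cur = cur->next) {
        if (cur->type == XML_ELEMENT_NODE &&
            xmlStrcasecmp(cur->name, BAD_CAST "a") == 0) {
            xmlChar *href = xmlGetProp(cur, BAD_CAST "href");
            if (href != NULL) {
                printf("%s\n", (const char *) href);
                xmlFree(href);
            }
        }
        print_links(cur->children);
    }
}

int main(void) {
    /* stand-in for a page the scraper already downloaded */
    const char *page =
        "<html><body>"
        "<a href=\"https://example.com/one\">one</a>"
        "<a href=\"https://example.com/two\">two</a>"
        "</body></html>";

    /* libxml2's HTML parser is forgiving about real-world tag soup */
    htmlDocPtr doc = htmlReadMemory(page, (int) strlen(page), "page.html", NULL,
                                    HTML_PARSE_NOERROR | HTML_PARSE_NOWARNING);
    if (doc == NULL)
        return 1;

    print_links(xmlDocGetRootElement(doc));

    xmlFreeDoc(doc);
    xmlCleanupParser();
    return 0;
}
```

and that’s more or less the whole surface area of the dependency: tedious to reimplement, sure, but not secret knowledge, which is why leaning on the library is a convenience rather than the thing that makes the scraping possible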
programmers seem very committed in the present moment to thinking of code as concrete and material but infrastructure as abstract and ephemeral, and i would like to suggest that maybe that is exactly backwards
@Lady genuinely curious. What's wrong with Rails?
@wallhackio speak of the devil and he shall appear: https://hachyderm.io/@pat/112216636273524747
@Lady ah. That is disappointing