Interesting quantitative look at web performance: sites designed around high-end devices can be practically unusable on low-end ones, which disproportionately affects poorer people and people in developing countries. It also discusses how sites game Google's performance metrics; maybe not news to the web devs among ye, but it was new to me. The arrogance of the Discourse founder was astounding.
RETVRN to static web pages.[1]
Also, from one of the appendices:
> In principle, HN should be the slowest social media site or link aggregator because it’s written in a custom Lisp that isn’t highly optimized and the code was originally written with brevity and cleverness in mind, which generally gives you fairly poor performance. However, that’s only poor relative to what you’d get if you were writing high-performance code, which is not a relevant point of comparison here.
[1] Although even static web pages can be fraught; see his other post on speeding up his site 50x by tearing out a bunch of unnecessary crap.
This one? https://endtimes.dev/why-your-website-should-be-under-14kb-in-size/
yes
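For anyone curious where that article's 14 kB figure comes from: it falls out of TCP slow start. A back-of-the-envelope sketch in Python (the constants are common defaults I'm assuming, not numbers quoted anywhere in this thread):

```python
# Rough arithmetic behind the "keep your page under 14 kB" rule of thumb.
# Assumes common defaults: an initial congestion window of 10 segments
# (Linux's default since kernel 2.6.39) and a standard 1500-byte Ethernet MTU.

INITCWND_SEGMENTS = 10   # segments a server may send before the first ACK
MTU = 1500               # bytes on a typical Ethernet link
IP_TCP_HEADERS = 40      # 20 bytes IPv4 + 20 bytes TCP
TCP_TIMESTAMPS = 12      # TCP timestamps option, padded (commonly enabled)

payload_per_segment = MTU - IP_TCP_HEADERS - TCP_TIMESTAMPS  # 1448 bytes
first_flight = INITCWND_SEGMENTS * payload_per_segment       # 14480 bytes

print(f"first round trip carries ~{first_flight / 1024:.1f} KiB")
# ~14.1 KiB: anything larger waits on at least one extra round trip.
```

Past that budget the window grows per round trip, so a bigger page pays in whole RTTs rather than bytes, which is why the threshold is so sharp on high-latency connections.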