Critical CSS

(critical-css-extractor.kigo.studio)
234 points | stevenpotts | 4 comments
todotask2 ◴[] No.43902273[source]
When I tested mine, I got the following:

Built on Astro web framework

HTML: 27.52KB uncompressed (6.10KB compressed)

JS: <10KB (compressed)

Critical CSS: 57KB uncompressed (7KB compressed) — tested using this site for performance analysis.

In comparison, many similar sites range from 100KB (uncompressed) to as much as 1MB.

The thing is, I can build clean HTML with no inline CSS or JavaScript. I also added resource hints (not Early Hints, since my Nginx setup doesn't support that out of the box), which slightly improve load times when combined with HTTP/2 and short-interval caching via Nginx. This setup allows me to hit a 100/100 performance score without relying on Critical CSS or inline JavaScript.
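The resource-hint-plus-caching setup described above can be sketched as an Nginx fragment. This is a hypothetical configuration, not the commenter's actual one: it sends a `Link: rel=preload` response header as a resource hint (not a 103 Early Hints response) and applies short-interval caching to static assets. Note that `http2 on;` requires Nginx 1.25.1+; older versions use `listen 443 ssl http2;` instead.

```nginx
server {
    listen 443 ssl;
    http2 on;  # Nginx >= 1.25.1; older: "listen 443 ssl http2;"

    location / {
        # Resource hint: tell the browser to fetch the stylesheet early.
        add_header Link "</assets/main.css>; rel=preload; as=style";
    }

    location /assets/ {
        # Short-interval caching: assets are revalidated after 5 minutes.
        add_header Cache-Control "public, max-age=300";
    }
}
```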

If every page adds 7KB, isn't it wasteful, especially when all you need is a lightweight SPA or, better yet, more edge caching to reduce the carbon footprint? We don't need to keep transmitting unnecessary data around the world in bloated HTML, the way Elementor does for WordPress.
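The waste argument comes down to simple arithmetic: inlined critical CSS is re-sent with every page, while an external stylesheet is fetched once and then served from cache. A back-of-envelope comparison, using the 7KB compressed figure from the comment above and a hypothetical ten-page visit:

```python
# Hypothetical session: a visitor browses several pages on one site.
INLINE_KB = 7     # compressed critical CSS repeated in every HTML response
PAGE_VIEWS = 10   # pages viewed in the session

# Inlined: the same 7KB rides along with every page.
inline_total = INLINE_KB * PAGE_VIEWS

# External stylesheet: transferred once, then served from browser cache.
external_total = INLINE_KB

print(inline_total, external_total)  # 70 KB vs 7 KB transferred
```

The gap grows linearly with pages viewed, which is why inlining only pays off for visitors who land once and leave.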

Why serve users unnecessary bloat? Mobile devices have limited battery life. It's not impossible to achieve lighting fast experience once you move away from shared hosting territory.

replies(2): >>43902335 #>>43904057 #
1. kijin ◴[] No.43902335[source]
Yeah, it's a neat trick but kinda pointless. In a world with CDNs and HTTP/2, all this does is waste bandwidth in order to look slightly better in artificial benchmarks.

It might improve time to first paint by 10-20ms, but this is a webpage, not a first-person shooter. Besides, subsequent page loads will be slower.

replies(3): >>43902971 #>>43903285 #>>43904848 #
2. aitchnyu ◴[] No.43902971[source]
Yup, wherever we've deviated from straightforward asset downloads to optimize something, we've always ended up slower or buggier. Like manually downloading display images, or using WebSockets to upload files. It turns out servers and browsers have had far more person-years of optimization poured into them than I could ever match.
3. todotask2 ◴[] No.43903285[source]
And Critical CSS requires loosening the CSP (Content Security Policy) to allow inline styles, which I have already hardened almost entirely, along with the Permissions Policy.
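The CSP conflict isn't all-or-nothing: instead of adding `'unsafe-inline'` to `style-src`, CSP allows whitelisting a specific inline `<style>` block by its SHA-256 hash. A sketch of computing such a hash token (the CSS string here is a made-up example):

```python
import base64
import hashlib

# Hypothetical extracted critical CSS block.
critical_css = "body{margin:0;font:16px/1.5 system-ui}"

# CSP hash-source: base64-encoded SHA-256 of the exact <style> contents.
digest = hashlib.sha256(critical_css.encode("utf-8")).digest()
token = "sha256-" + base64.b64encode(digest).decode("ascii")

header = f"Content-Security-Policy: style-src 'self' '{token}'"
print(header)
```

The catch is that the hash must be regenerated whenever the critical CSS changes, so per-page critical CSS means per-page CSP headers.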
4. nashashmi ◴[] No.43904848[source]
Imagine this: before serving the page, a filter seeks out the critical CSS, inserts it inline, and removes all CSS links, greatly improving page load times and reducing CDN load.

Edit: on second reading, it seems you are saying that when another page from the same site with the same styles loads, the inlined CSS has to be transferred again, and this increases bandwidth whenever a visitor loads multiple pages. So yes, it is optimal only when the referrer is external to the site.
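The "filter" idea above can be illustrated with a minimal sketch. Real critical-CSS extractors parse the DOM and compute which rules apply above the fold; this hypothetical helper only does the last step, the string surgery of dropping the stylesheet link and injecting an inline `<style>` block:

```python
import re

def inline_critical(html: str, critical_css: str) -> str:
    """Drop external stylesheet links and inline the critical CSS.

    Illustrative only: a production filter would use a real HTML parser,
    not a regex over the markup.
    """
    # Remove <link rel="stylesheet" ...> tags.
    html = re.sub(r'<link[^>]+rel="stylesheet"[^>]*>', "", html)
    # Inject the critical CSS just before </head>.
    return html.replace("</head>", f"<style>{critical_css}</style></head>", 1)

page = '<html><head><link rel="stylesheet" href="/main.css"></head><body></body></html>'
print(inline_critical(page, "body{margin:0}"))
```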