
752 points dceddia | 1 comment
yomlica8 ◴[] No.36447314[source]
It blows my mind how unresponsive modern tech is, and it frustrates me constantly. What makes it even worse is how unpredictable the lags are, so you can't even train yourself around them.

I was watching Halt and Catch Fire and in the first season the engineering team makes a great effort to meet something called the "Doherty Threshold" to keep the responsiveness of the machine so the user doesn't get frustrated and lose interest. I guess that is lost to time!

replies(18): >>36447344 #>>36447520 #>>36447558 #>>36447932 #>>36447949 #>>36449090 #>>36449889 #>>36450472 #>>36450591 #>>36451868 #>>36452042 #>>36453741 #>>36454246 #>>36454271 #>>36454404 #>>36454473 #>>36462340 #>>36469396 #
kitsunesoba ◴[] No.36447949[source]
This is what happens when you have a leaning tower of abstractions, with each layer developed under a philosophy of "it's good enough". Some performance loss is unavoidable when you're adding layers, but that attitude of indifference has a multiplicative effect: by the time you get to the endpoint, the losses snowball into something rather ridiculous.
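The "multiplicative effect" can be made concrete with a toy calculation. This is just an illustrative sketch with made-up numbers (`base_ms`, `per_layer_overhead`, and the function name are hypothetical, not from the comment): if each layer inflates the cost of the layer below by some fraction, the total grows geometrically with depth rather than linearly.

```python
# Hypothetical sketch: overheads across stacked layers multiply, not add.
def end_to_end_cost(base_ms: float, per_layer_overhead: float, layers: int) -> float:
    """Cost of an operation after passing through `layers` wrappers,
    each inflating the cost below it by `per_layer_overhead`
    (e.g. 0.3 means each layer is 30% slower than the one beneath)."""
    return base_ms * (1 + per_layer_overhead) ** layers

# A 1 ms operation behind 10 layers that each feel "good enough" at +30%:
print(round(end_to_end_cost(1.0, 0.3, 10), 1))  # ~13.8 ms
```

Ten individually tolerable 30% hits turn a 1 ms operation into nearly 14 ms; with additive losses it would only have been 4 ms.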
replies(3): >>36448999 #>>36449608 #>>36451720 #
Arwill ◴[] No.36451720[source]
Each level of abstraction has its own caching and buffering routines because the underlying layers are slow, and without the ability to make those layers better, all you can do is put your own cache on top. This helps initially, but in the end, time is wasted managing all those caches and buffers at every layer.
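The stacking described above can be sketched as follows. This is a minimal illustration, not anyone's real code: `Layer`, `disk`, and the `mgmt_cost` figure are all hypothetical. Each layer wraps a slower backend with its own cache, and even on a warm hit the request still pays each layer's bookkeeping cost on the way down.

```python
import time

class Layer:
    """Hypothetical abstraction layer: wraps a slower backend
    and maintains its own cache on top of it."""
    def __init__(self, backend, mgmt_cost=0.0001):
        self.backend = backend      # callable from the layer below
        self.cache = {}
        self.mgmt_cost = mgmt_cost  # per-lookup cache bookkeeping time

    def get(self, key):
        time.sleep(self.mgmt_cost)  # management overhead, paid at EVERY layer
        if key not in self.cache:
            self.cache[key] = self.backend(key)
        return self.cache[key]

def disk(key):
    """Pretend slow bottom layer."""
    time.sleep(0.01)
    return key.upper()

# Three layers, each caching the one below:
stack = Layer(Layer(Layer(disk).get).get)
stack.get("a")  # cold: pays the disk plus three layers of bookkeeping
stack.get("a")  # warm: the top cache answers, but bookkeeping is still paid
```

The warm path is fast only because the topmost cache short-circuits the rest; the lower caches sit idle, holding duplicate data, while every lookup still funds their management.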