
752 points dceddia | 5 comments
yomlica8 No.36447314
It blows my mind how unresponsive modern tech is, and it frustrates me constantly. What makes it even worse is that the lags are so unpredictable you can't even train yourself around them.

I was watching Halt and Catch Fire and in the first season the engineering team makes a great effort to meet something called the "Doherty Threshold" to keep the responsiveness of the machine so the user doesn't get frustrated and lose interest. I guess that is lost to time!

replies(18): >>36447344 #>>36447520 #>>36447558 #>>36447932 #>>36447949 #>>36449090 #>>36449889 #>>36450472 #>>36450591 #>>36451868 #>>36452042 #>>36453741 #>>36454246 #>>36454271 #>>36454404 #>>36454473 #>>36462340 #>>36469396 #
sidewndr46 No.36447344
Even worse is the new trend of web pages optimizing for page load time. You wind up with a page that loads "instantly" but has almost none of the data you need displayed. Instead there are 2 or 3 AJAX requests to load the data & populate the DOM. Each one results in a repaint, wasting CPU and causing the page content to move around.
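To make it concrete, here's a sketch of the two shapes (hypothetical endpoints, Flask-style handlers purely for illustration):

    from flask import Flask

    app = Flask(__name__)

    def load_items():
        # Stand-in for the real data access.
        return [{"name": "Margherita", "price": 9}]

    # The "optimized" shell: paints instantly, shows nothing, and forces
    # the browser into 2-3 follow-up requests, each causing a repaint.
    @app.route("/menu-shell")
    def shell():
        return "<div id='app'></div><script src='/app.js'></script>"

    # The boring alternative: gather the data server-side and send one
    # complete document, so the first paint already has the content.
    @app.route("/menu")
    def complete():
        rows = "".join(f"<li>{i['name']}</li>" for i in load_items())
        return f"<ul>{rows}</ul>"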
replies(13): >>36447430 #>>36448035 #>>36448135 #>>36448336 #>>36448834 #>>36449278 #>>36449850 #>>36450266 #>>36454683 #>>36455856 #>>36456553 #>>36457699 #>>36458429 #
danieldk No.36448336
This drives me crazy, especially because it breaks finding within a page. E.g. if you're ordering food and you already know what you want.

Old days: Cmd + f, type what you want.

New days: first scroll to the end of the page so that all the contents are actually loaded. Cmd + f, type what you want.

It's just a list of dishes, some with small thumbnails, some without any images at all. If you can't load a page with 30 dishes fast enough, you have a serious problem (you could always lazily load the thumbnails if you want to cheat).
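And if you render the list server-side, the cheat is nearly free. A sketch with made-up dish data; loading="lazy" defers only the images, so the full text is in the DOM and Cmd+F works immediately:

    # Render the whole dish list up front; let the browser defer thumbnails.
    dishes = [
        {"name": "Pad Thai", "thumb": "/img/pad-thai.jpg"},
        {"name": "Green Curry", "thumb": None},
    ]

    def render_menu(dishes):
        items = []
        for d in dishes:
            img = (f'<img src="{d["thumb"]}" loading="lazy" alt="">'
                   if d["thumb"] else "")
            items.append(f'<li>{img}{d["name"]}</li>')
        return "<ul>" + "".join(items) + "</ul>"

    print(render_menu(dishes))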

replies(6): >>36448673 #>>36448968 #>>36449626 #>>36449636 #>>36449814 #>>36454049 #
nzach No.36449814
>If you can't load a page with 30 dishes fast enough, you have a serious problem

That depends on your scale. If your product is "large enough" it is relatively easy to get into the range of several seconds of response time.

Here are some of the steps you may need to execute before responding to a request from your user:

- Get all the dishes that match the filters the user selected

- Remove all dishes from restaurants that don't deliver to the user's location

- Remove all dishes from restaurants that aren't open right now

- Get all discount campaigns for the user and apply their effects to every dish

- Reorder the dish list based on the user's interaction history

Now imagine that every step in this list is owned by at least one team of developers. Add some legacy requirements and a little bit of tech debt... and that's it, you have the perfect stage for a request that takes 5-10 seconds.
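As a rough sketch of how those latencies stack up (service names are hypothetical, sleeps stand in for network calls), and of how much fanning out the independent lookups can claw back:

    import asyncio

    # Hypothetical per-team service calls; the sleeps stand in for the
    # network latency you get once every step is its own service.
    async def fetch_dishes(filters):     await asyncio.sleep(0.8); return ["dish"]
    async def fetch_delivery_zone(user): await asyncio.sleep(0.5); return {}
    async def fetch_open_hours():        await asyncio.sleep(0.4); return {}
    async def fetch_discounts(user):     await asyncio.sleep(0.9); return []
    async def fetch_history(user):       await asyncio.sleep(0.7); return []

    async def handle_serial(user, filters):
        # Steps called one after another: latencies add up (~3.3s here)
        # before any filtering or ranking even starts.
        dishes = await fetch_dishes(filters)
        zone = await fetch_delivery_zone(user)
        hours = await fetch_open_hours()
        deals = await fetch_discounts(user)
        history = await fetch_history(user)
        return dishes

    async def handle_fanned_out(user, filters):
        # The lookups are independent, so the cost is the slowest call
        # (~0.9s), not the sum. Still not free, but not 5-10 seconds.
        dishes, zone, hours, deals, history = await asyncio.gather(
            fetch_dishes(filters), fetch_delivery_zone(user),
            fetch_open_hours(), fetch_discounts(user), fetch_history(user))
        return dishes

    asyncio.run(handle_fanned_out("u1", {}))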

replies(3): >>36450058 #>>36450270 #>>36455639 #
1. jerf No.36450270
None of the things you mentioned should be hard. We did complicated things like that and more in the 1990s.

But it was different...

Yeah. It was. That's exactly my point.

A major problem is the number of places in our code stacks where developers think it's perfectly normal for things to take 50ms or 500ms when it isn't. I am not a performance maniac, but I always keep a mental budget for how long things should take, and if something that should take 50us takes 50ms, I generally at some point dig in and figure out why. If you don't even realize that something should be snappy, you'll never dig into why your accidentally quadratic code is as slow as it is.
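You can even make that mental budget mechanical. A minimal sketch (the threshold and the workload are just illustrative):

    import time
    from contextlib import contextmanager

    @contextmanager
    def budget(label, limit_ms):
        # Complain loudly whenever a block blows its latency budget, so
        # "should be 50us, takes 50ms" gets noticed instead of shipped.
        start = time.perf_counter()
        yield
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > limit_ms:
            print(f"{label}: {elapsed_ms:.1f}ms (budget {limit_ms}ms)")

    with budget("render menu", limit_ms=5):
        sorted(range(1_000_000))  # stand-in for the real work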

Another culprit, one I think is increasingly to blame, is the much-celebrated PHP-esque "fully isolated page", where everything is built up to generate a given request and then thrown away. It was always a performance disaster, but when you go from one request per page to dozens for the simplest render, it becomes extra catastrophic. A lot of my web sites are a lot faster than my fellow developers expect simply because I reject that model of page generation. Things are much faster when you serve only what was actually requested instead of starting everything up from scratch.
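Roughly the difference in shape (illustrative stubs, with sleeps standing in for connection setup and config parsing):

    import time

    def connect_db():
        time.sleep(0.05)   # stand-in for connection setup cost
        return object()

    def load_config():
        time.sleep(0.02)   # stand-in for parsing config, loading code, etc.
        return {}

    # "Fully isolated page": every request rebuilds the world, uses it
    # once, and throws it away.
    def handle_isolated(path):
        conn, config = connect_db(), load_config()
        return f"page for {path}"

    # Long-lived process: pay the setup once at startup; each request
    # then does only the work that was actually requested.
    CONN = connect_db()
    CONFIG = load_config()

    def handle_persistent(path):
        return f"page for {path}"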

Relatedly, developers really underestimate precomputation, which is very relevant to your point. Your hypothetical page is slow because you waited until the user actually clicked "menu" to start generating all of that. Why? You could have computed it all at login time and had it right at your fingertips, because for the sort of page you're describing it's a reasonable assumption that a user who logs in is there to place an order, not to look at their credit card settings. Even if the work is expensive for reasons out of your control (a location API, for instance), if you've already done it you can serve the user instantly.

Having precomputed all this data, you might as well shove it all down to the client and let them manipulate it there with zero further network requests. A menu is a trivial amount of information.
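The shape of it, as a sketch (build_menu_payload and the dict cache are stand-ins for whatever pipeline and storage you actually have):

    # Do the expensive assembly once at login, when nobody is staring
    # at a spinner waiting for a page to paint.
    menu_cache = {}  # user_id -> fully assembled menu payload

    def build_menu_payload(user_id):
        # Stand-in for all the filtering, discounting and ranking from
        # upthread. (Hypothetical; plug in the real pipeline here.)
        return {"dishes": [{"name": "Pad Thai", "price": 11}]}

    def on_login(user_id):
        menu_cache[user_id] = build_menu_payload(user_id)

    def menu_endpoint(user_id):
        # The click on "menu" becomes a cache read; fall back to
        # computing on demand if the cache was never primed.
        return menu_cache.get(user_id) or build_menu_payload(user_id)

    on_login("u1")
    print(menu_endpoint("u1"))  # served from the precomputed payload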

It isn't even like precomputation is hard. It's the same code, just running at a different time.

"But what about when that doesn't work?" Well, you do something else. You've got a huge list of options. I haven't even scratched the surface. This isn't a treatise on how to speed up every conceivable website, this is a cri de coeur to stop making excuses for not even trying, and just try a little.

And it is SO MUCH FUN. Those of you who don't try have no idea what you're missing out on. It is completely normal, on a code base no one has ever profiled before, to find a 50ms process and improve it to 50us with just a handful of lines tweaked. It is completely normal to examine a DB query taking 3 seconds and find that a single ALTER TABLE ADD INDEX cuts it down to 2us. This is the most fun I have at work. Give it a try. It's addictive!

replies(2): >>36450558 #>>36450979 #
2. ericd No.36450558
Yeah, your last point is totally spot on. It is so gratifying to make something feel obviously faster, in a way that few other things in programming are, and there's usually a lot of low-hanging fruit.

Also, if you work on a website, the Google crawler seems to allocate a certain amount of wall time (not just CPU time) to crawling your site. If you can get your pages to respond extremely quickly, more of your pages will be indexed, and you're going to match for more keywords. So if for some reason people aren't convinced that speed is an important feature for the users of your site, maybe the SEO benefits will help make the case.

3. Sohcahtoa82 No.36450979
ALTER TABLE ADD INDEX can, in some cases, yield speedups of multiple orders of magnitude.

I was using a SAST product backed by MS SQL Server that generated reports, and the reports often took HOURS to generate, even when the report was only ~50 pages. One specific project took over a DAY to generate a report. That seemed ludicrous, so I logged onto the SQL server to investigate and found that a single query was taking 99% of the time. It was searching a table with tens of millions of rows that wasn't indexed on the columns it was filtering against, and many variations of that query were used to generate the report. I added the index (building it only took about an hour, IIRC), and what had taken hours now took a couple of minutes.
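The effect is easy to reproduce. A self-contained sketch using SQLite in place of SQL Server (where the fix is likewise a CREATE INDEX); the table and sizes are made up:

    import random
    import sqlite3
    import time

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE findings (id INTEGER PRIMARY KEY,"
               " project_id INTEGER, detail TEXT)")
    db.executemany(
        "INSERT INTO findings (project_id, detail) VALUES (?, 'x')",
        ((random.randrange(5000),) for _ in range(1_000_000)))

    query = "SELECT COUNT(*) FROM findings WHERE project_id = 42"

    def timed(label):
        start = time.perf_counter()
        db.execute(query).fetchall()
        print(f"{label}: {(time.perf_counter() - start) * 1000:.1f}ms")

    timed("full scan")   # no index: scans every row
    db.execute("CREATE INDEX idx_project ON findings (project_id)")
    timed("index seek")  # typically orders of magnitude faster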

I was always surprised the software didn't create that index to begin with.

replies(1): >>36455993 #
4. simooooo No.36455993
Because it can hog a lot of disk space and slow down inserts.
replies(1): >>36494555 #
5. Sohcahtoa82 No.36494555
Somehow missed this reply for 3 days...

In my use case, I didn't notice any impact on inserts. I did notice higher disk usage, but spending $200 on a larger disk was absolutely worth saving literally days of report generation.