
324 points by onnnon | 8 comments
irskep ◴[] No.42729983[source]
I agree with most of the other comments here, and it sounds like Shopify made sound tradeoffs for their business. I'm sure the people who use Shopify's apps are able to accomplish the tasks they need to.

But as a user of computers and occasional native mobile app developer, hearing "<500ms screen load times" stated as a win is very disappointing. Having your app burn battery for half a second doing absolutely nothing is bad UX. That kind of latency does have a meaningful effect on productivity for a heavy user.

Besides that, having done a serious evaluation of whether to migrate a pair of native apps supported by multi-person engineering teams to RN, I think this is a very level-headed take on how to make such a migration work in practice. If you're going to take this path, this is the way to do it. I just hope that people choose targets closer to 100ms.

replies(11): >>42730123 #>>42730268 #>>42730440 #>>42730580 #>>42730668 #>>42730720 #>>42732024 #>>42732603 #>>42734492 #>>42735167 #>>42737372 #
fxtentacle ◴[] No.42730123[source]
I would read the <500ms screen loads as follows:

When the user clicks a button, we start a server round-trip to fetch the data, do client-side parsing, layout, formatting and rendering, and then less than 500ms later the user can see the result on his/her screen.

With a worst-case ping of 200ms for a round-trip, that leaves about 200ms for DB queries and then 100ms for the GUI rendering, which is roughly what you'd expect.
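Read literally, that budget is just arithmetic. A tiny TypeScript sketch of how the pieces add up, using the illustrative numbers from the breakdown above (not Shopify's actual figures):

    // Hypothetical back-of-the-envelope budget for a 500ms screen load.
    const budgetMs = {
      networkRoundTrip: 200, // worst-case ping, client -> server -> client
      serverAndDb: 200,      // request handling + DB queries
      clientRender: 100,     // parse, layout, format, render
    };

    const total = Object.values(budgetMs).reduce((a, b) => a + b, 0);
    console.log(`total: ${total}ms`); // total: 500ms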

replies(7): >>42730497 #>>42730551 #>>42730748 #>>42731484 #>>42732820 #>>42733328 #>>42733722 #
joaohaas ◴[] No.42730748[source]
Since the post is about the benefits of React, I’m sure that if requests were involved they would have mentioned it.

Also, even if it was involved, 200ms for round-trip and DB queries is completely bonkers. Most round-trips don’t take more than 100ms, and if you’re taking 200ms for a DB query on an app with millions of users, you’re screwed. Most queries should take 20-30ms max, with some outliers in places where optimization is hard taking up to 80ms.
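If one wanted to hold a codebase to those numbers, here is a minimal sketch of flagging queries that blow the budget, assuming Node and a hypothetical runQuery standing in for the real DB client:

    import { performance } from "node:perf_hooks";

    // Stand-in for whatever DB client is actually in use (pg, mysql2, etc.);
    // the name and shape of `runQuery` are hypothetical.
    async function runQuery(sql: string): Promise<unknown[]> {
      return []; // placeholder result
    }

    // Wrap each query with a timer and warn on anything over the ~30ms budget
    // argued for above.
    async function timedQuery(sql: string): Promise<unknown[]> {
      const start = performance.now();
      const rows = await runQuery(sql);
      const elapsed = performance.now() - start;
      if (elapsed > 30) {
        console.warn(`slow query (${elapsed.toFixed(1)}ms): ${sql}`);
      }
      return rows;
    }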

replies(4): >>42732645 #>>42733310 #>>42734929 #>>42737646 #
andy_ppp ◴[] No.42732645[source]
I do not understand this thinking at all: parsing a response into whatever rendering engine, even an extremely fast one, is going to be a large percentage of this 500ms page load. Downplaying it with magical thinking about pure database queries under load, with no understanding of the complexity of Shopify, is quite frankly ridiculous; next up you’ll be telling everyone to roll their own file sharing with rsync or something…
replies(1): >>42735052 #
flohofwoe ◴[] No.42735052[source]
I know - old man yells at cloud and stuff - but some 8-bit home computers from the 80s completed their entire boot sequence in about half a second. What does a 'UI rendering engine' need to do that takes half a second on a device that's tens of thousands of times faster? Everything on modern computers should be 'instant' (some of that time may include internet latency of course, but I assume that the Shopify devs don't live on the moon).
replies(3): >>42735209 #>>42736003 #>>42736410 #
1. netdevphoenix ◴[] No.42736003[source]
Not sure why people keep bringing up the old "my machine x years ago was faster" argument. Machines nowadays do way more than machines from the 80s. Whether the tasks they do are useful or not is a separate discussion.
replies(1): >>42738041 #
2. sgarland ◴[] No.42738041[source]
Casey Muratori has a clip [0] discussing the performance difference between Visual Studio in 2004 and today.

Anecdotally, I’ve been playing AoE2: DE a lot recently, and have noticed it briefly stuttering / freezing during battles. My PC isn’t state of the art by any means (Ryzen 7 3700X, 32GB PC4-24000, RX580 8GB), but this is an isometric RTS we’re talking about. In 2004, I was playing AoE2 (the original) on an AMD XP2000+ with maybe 1GB of RAM at most. I do not ever remember it stuttering, freezing, or in any way struggling. Prior to that, I was playing it on a Pentium III 550 MHz, and a Celeron 333 MHz. Same thing.

A great anti-example of this pattern is Factorio. It’s also an isometric top-down game, with RTS elements, but the devs are serious about performance. It’s tracking god knows how many tens or hundreds of thousands of objects (they’re simulating fluid flow in pipes FFS), with a goal of 60 FPS/UPS.

Yes, computers today are doing more than computers from the 80s or 90s, but the hardware is so many orders of magnitude faster that it shouldn’t matter. Software is by and large slower, and it’s a deliberate choice, because it doesn’t have to be that way.

[0]: https://www.youtube.com/watch?v=MR4i3Ho9zZY

replies(2): >>42738352 #>>42748763 #
3. netdevphoenix ◴[] No.42738352[source]
> I’ve been playing AoE2

If you buy poor software instead of good software (yes, branding, IP and whatever, but that’s just even more reason for companies not to make it good), complaining doesn’t help, does it? Commercial software is made to be sold, and if it sells enough, that’s all company executives care about. As long as enough people buy it, it will continue to be made.

Company devs trying to get more time/resources to improve performance will be told no unless they can make a realistic business case that explains how the expense of increased focus on performance will be financially worthwhile in terms of revenue. If enough people buy poor software, improving it is not smart business. Companies exist to make money, not necessarily to make good products or provide a good service.

I understand your point but you need to understand that business execs don't care about that unless it significantly impacts revenue or costs in the present or very near future.

replies(1): >>42738815 #
4. sgarland ◴[] No.42738815{3}[source]
Nah, it’s not just that. IME, most devs are completely unaware of how this stuff works. They don’t need to, because there are so many abstractions, and because the industry expectation has shifted such that it isn’t a requirement. I’ve also met some who are aware, but don’t care at all, because no one above them cares.

Tech interviews are wildly stupid: they’ll hammer you on being able to optimally code some algorithm under time pressure, but there’s zero mention of physical attributes like cache line access, let alone a realistic problem involving data structures. Just once, I’d love to see "code a simple B+tree, and then discuss how its use in an RDBMS impacts query times depending on the selected key."
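For what it’s worth, that exercise doesn’t take much code. A minimal, search-only sketch (fanout and names are illustrative; a real index page holds hundreds of keys per node):

    // Search-only B+tree with a tiny fanout; the shape of the lookup is the
    // same as in a real RDBMS index, just with far fewer keys per page.
    const FANOUT = 4;

    type Leaf = { kind: "leaf"; keys: number[]; values: string[] };
    type Internal = { kind: "internal"; keys: number[]; children: BTreeNode[] };
    type BTreeNode = Leaf | Internal;

    function smallestKey(n: BTreeNode): number {
      return n.kind === "leaf" ? n.keys[0] : smallestKey(n.children[0]);
    }

    // Bulk-load from already-sorted (key, value) pairs, as an index rebuild would.
    function bulkLoad(pairs: [number, string][]): BTreeNode {
      let level: BTreeNode[] = [];
      for (let i = 0; i < pairs.length; i += FANOUT) {
        const chunk = pairs.slice(i, i + FANOUT);
        level.push({
          kind: "leaf",
          keys: chunk.map(p => p[0]),
          values: chunk.map(p => p[1]),
        });
      }
      while (level.length > 1) {
        const next: BTreeNode[] = [];
        for (let i = 0; i < level.length; i += FANOUT) {
          const children = level.slice(i, i + FANOUT);
          // Separator keys: smallest key reachable under each child after the first.
          next.push({ kind: "internal", keys: children.slice(1).map(smallestKey), children });
        }
        level = next;
      }
      return level[0];
    }

    // Root-to-leaf descent: one node ("page") touched per level, i.e. O(log_F n).
    function search(root: BTreeNode, key: number): string | undefined {
      let node: BTreeNode = root;
      let pagesTouched = 1;
      while (node.kind === "internal") {
        let i = 0;
        while (i < node.keys.length && key >= node.keys[i]) i++;
        node = node.children[i];
        pagesTouched++;
      }
      console.log(`pages touched: ${pagesTouched}`);
      const idx = node.keys.indexOf(key);
      return idx === -1 ? undefined : node.values[idx];
    }

    // A sequential key clusters nearby rows on the same leaves (good page/cache
    // locality); a random key like a UUIDv4 scatters them across unrelated pages.
    const rows: [number, string][] = [];
    for (let i = 0; i < 64; i++) rows.push([i, `row-${i}`]);
    console.log(search(bulkLoad(rows), 42)); // "row-42", 3 pages touched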

5. jgalt212 ◴[] No.42748763[source]
My 16 GB box was crashing due to VS Code. When I went to 32 GB, it stopped crashing. And I’m not running any resource-hungry plugins. It blows my mind that you can be this junky and still have #1 market share.
replies(3): >>42749152 #>>42767653 #>>42809028 #
6. sgarland ◴[] No.42749152{3}[source]
Yeah, I went to Neovim a couple of years ago, and haven’t looked back. There are enough plugins to make it equivalent in useful features, IMO.
7. netdevphoenix ◴[] No.42767653{3}[source]
Devs often think that you need polished engineering to make it into the market. But the reality often is that half-baked products built on lush dinners with prospective clients, dreamy promises, strong sales skills, and effective marketing win the game over and over. Of course, if you also have a well-engineered product, even better. But it is clearly not necessary.
8. markus_zhang ◴[] No.42809028{3}[source]
The new idea is to push out shit while it’s still baking and incrementally improve it as users test it for free.

Or maybe it’s not new. I remember Power BI was barely usable back in 2018, as the editor lacked a lot of things.