446 points talboren | 37 comments
1. muglug ◴[] No.45039093[source]
Improvements merged within the last two days by the WebKit team: https://github.com/orgs/community/discussions/170922#discuss...

For my sins I occasionally create large PRs (> 1,000 files) in GitHub, and teammates (who mostly all use Chrome) will sometimes say "I'll approve once it loads for me..."

replies(6): >>45039371 #>>45039546 #>>45039877 #>>45040882 #>>45043276 #>>45050622 #
2. patrickmay ◴[] No.45039371[source]
That seems essentially unreviewable. If you can share without violating an NDA, what kind of PR would involve that many files?
replies(6): >>45039414 #>>45039433 #>>45039455 #>>45039478 #>>45040667 #>>45041353 #
3. codezero ◴[] No.45039414[source]
Can’t speak for the person above, but we keep a lot of configuration files in git and could easily write a thousand new configs in a single PR, or add a new key to all of the configs, for example.
4. trenchpilgrim ◴[] No.45039433[source]
Ones where you have a lot of generated files you commit into Git, and you change the output of the generator tool.
5. scsh ◴[] No.45039455[source]
If the project you're working on vendors dependencies, it's pretty easy to end up with that many files being changed when adding or updating them, even when trying to make the updates as narrow as possible in one PR.
6. bob1029 ◴[] No.45039478[source]
"Upgrade solution from .NET Framework 4.8 => .NET 8"

"Rename 'CustomerEmailAddress' to 'CustomerEmail'"

"Upgrade 3rd party API from v3 to v4"

I genuinely don't get this notion of a "max # of files in a PR". It all comes off to me as post hoc justification of really shitty technology decisions at GitHub.

replies(2): >>45039963 #>>45041641 #
7. celsoazevedo ◴[] No.45039546[source]
How long until those improvements reach users? I assume it requires an OS update, or does Safari use something similar to what Firefox and Chrome do for faster updates?
replies(1): >>45039624 #
8. rootnod3 ◴[] No.45039624[source]
There is a developer version you can install. There is a beta, but it overrides your existing Safari, and rolling back can sometimes be tricky.

But there is also the Safari Technology Preview, which installs as a separate app but is also a bit more unstable, similar to Chrome Canary.

replies(2): >>45040189 #>>45040801 #
9. blinkingled ◴[] No.45039877[source]
Thanks, that is definitely a good sign. Given the rendering-engine monopoly of Chrome and its derivatives, and the lack of momentum behind Firefox adoption, we need Apple to actively keep Safari not just viable but great, even if only on macOS/iOS.
10. ambicapter ◴[] No.45039963{3}[source]
It's not GitHub-specific advice; it's about the reviewability of the PR versus human working memory/maximum attention span.
replies(2): >>45040291 #>>45041083 #
11. philistine ◴[] No.45040189{3}[source]
STP is a great thing if you wish you had two different Safaris. Profiles just don't work as well as a completely different app.
12. eviks ◴[] No.45040291{4}[source]
How much working memory/attention span is required to look through 1000 identical lines "-CustomerEmailAddress +CustomerEmail"?
replies(1): >>45040910 #
13. cesarb ◴[] No.45040667[source]
> what kind of PR would involve that many files?

A very simple example: migrating from JavaEE to JakartaEE. Every single Java source file has to have the imports changed from "javax." to "jakarta.", which can easily be thousands of files. It's also easy to review (and any file which missed that change will fail when compiling on the CI).
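
For illustration only, a minimal Python sketch of that kind of mechanical rewrite; the package list and source path below are assumptions, not a complete JakartaEE mapping, and real migrations usually lean on dedicated tooling:

```python
# Hypothetical sketch: rewrite "javax.*" imports to "jakarta.*" across a tree.
# The MIGRATED tuple is an illustrative subset, not the full JakartaEE mapping.
import re
from pathlib import Path

MIGRATED = ("servlet", "persistence", "ws", "inject", "validation", "faces")
PATTERN = re.compile(r"^(import\s+)javax\.(" + "|".join(MIGRATED) + r")\.",
                     re.MULTILINE)

def migrate(root: str) -> int:
    """Rewrite matching imports under `root`; return the number of files changed."""
    changed = 0
    for path in Path(root).rglob("*.java"):
        text = path.read_text(encoding="utf-8")
        new_text = PATTERN.sub(r"\1jakarta.\2.", text)
        if new_text != text:
            path.write_text(new_text, encoding="utf-8")
            changed += 1
    return changed

if __name__ == "__main__":
    print(f"rewrote imports in {migrate('src/main/java')} files")
```

As noted above, the compiler (and CI) then acts as the real reviewer for any file the rewrite missed.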

14. dylan604 ◴[] No.45040801{3}[source]
I had to download STP for a specific case I don't even remember. Ever since, I get frequent OS update notifications with new STP versions. It updates without a full system update, which means no rebooting is necessary. It's about as easy as any other software's updates, except that it goes through the OS update mechanism, so it takes those extra steps instead of a single update-and-relaunch click.
15. Sesse__ ◴[] No.45040882[source]
Interesting how _everyone_ here blames JS and React, yet the fixes you linked are about CSS performance.
replies(4): >>45041372 #>>45041777 #>>45042832 #>>45049426 #
16. shadowgovt ◴[] No.45040910{5}[source]
Ideally, you automate a check like that, because the answer turns out to be that humans are profoundly bad at that kind of pattern recognition.

A computer will be able to tell, with 100% reliability, that the 497th change has a misspelled `CusomerEmail` or that change 829 is a regexp failure that trimmed the boolean "CustomerEmailAddressed" to "CustomerEmailed"; humans, not so much.
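
As a hedged sketch of what such an automated check could look like (the identifier names come from the example upthread; the diff-pairing logic is a simplification), something along these lines would flag both failure modes:

```python
# Sketch: verify a unified diff is nothing but the OLD -> NEW rename, so a typo
# like "CusomerEmail" or an over-eager "CustomerEmailed" fails loudly.
import re
import sys

OLD, NEW = "CustomerEmailAddress", "CustomerEmail"

def check_rename(diff_text: str) -> list[str]:
    problems = []
    removed = []  # pending '-' lines awaiting their '+' counterpart
    for lineno, line in enumerate(diff_text.splitlines(), 1):
        if line.startswith(("--- ", "+++ ")):
            continue  # file headers, not content lines
        if line.startswith("-"):
            removed.append(line[1:])
        elif line.startswith("+"):
            if not removed:
                problems.append(f"line {lineno}: addition with no matching removal")
                continue
            old_line = removed.pop(0)
            # Substitute whole identifiers only, so e.g. CustomerEmailAddressed
            # must stay untouched; any other change gets flagged.
            expected = re.sub(rf"\b{OLD}\b", NEW, old_line)
            if line[1:] != expected:
                problems.append(f"line {lineno}: not a pure {OLD} -> {NEW} rename")
    if removed:
        problems.append(f"{len(removed)} removal(s) with no matching addition")
    return problems

if __name__ == "__main__":
    issues = check_rename(sys.stdin.read())
    print("\n".join(issues) or "diff looks like a clean mechanical rename")
    sys.exit(1 if issues else 0)
```

You could feed it something like `git diff -U0 main...` so only the changed lines are checked.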

replies(3): >>45041140 #>>45041171 #>>45047557 #
17. sidewndr46 ◴[] No.45041083{4}[source]
I pretty frequently have conversations with other engineers where I point out that a piece of code makes an assumption that mostly holds true, but doesn't always. Hence, a user-visible bug.

The usual response is something like "if you're correct, wouldn't that mean there are hundreds of cases where this needs to be fixed to resolve this bug?". The answer, obviously, is yes. Incoming 100+ file PR to resolve the issue. I have no other ideas for how someone is supposed to resolve an issue in this scenario.

18. makeitdouble ◴[] No.45041140{6}[source]
You're not just reviewing the individual lines, but also the context and which files are impacted. And automating that part would still mean reviewing the automation plus the 1000+ changes to move to it.

Sure, 1000+ changes kill the soul, and we're not good at that, but sometimes there's just no other decent choice.

19. eviks ◴[] No.45041171{6}[source]
Oh, certainly, I didn't mean that you had to avoid using your IDE to auto-rename the variable yourself (to avoid the boolean issue) and diff its results against those of the PR.

Or that you had to avoid Ctrl+F-ing "CustomerEmail" to see whether you get 1000 matches, matching the number of changed files, or only 999 due to some typo.

Or using the web interface to filter by file type to batch your reviews.

Or...

Just that none of those cases comes anywhere close to exhausting our memory/attention capacity.

replies(2): >>45041585 #>>45042397 #
20. moffkalast ◴[] No.45041353[source]
Convert space indents to tabs, as god intended.
replies(1): >>45050078 #
21. jchw ◴[] No.45041372[source]
You certainly can build slow apps with React; it doesn't make building slow things that hard. But honestly, React primitives (component mounting/unmounting, rendering, virtual DOM diffing, etc.) just aren't that slow/inefficient, and using React in a fairly naive way isn't half-bad for data-heavy apps.

I've actually been trying to figure out how to get my (unreleased) React application to be less laggy in Safari, where it performs worse than in Firefox/Chrome, and it seems to be related to all the damn DOM elements. This sucks. Virtualizing viewports adds loads of complexity and breaks some built-in browser features, so I generally prefer not to do it. But, at least in my case, Safari seems to struggle more than Chrome and Firefox with certain layout operations over a shitload of elements.

replies(1): >>45041527 #
22. Sesse__ ◴[] No.45041527{3}[source]
> You certainly can build slow apps with React; it doesn't make building slow things that hard.

By all means. It sometimes feels like React is more the symptom than the actual issue, though.

Personally, I just like having less code; it generally makes for fewer footguns. But that's an incredibly hard sell in general (and of course not the entire story).

23. ◴[] No.45041585{7}[source]
24. bityard ◴[] No.45041641{3}[source]
I've always thought those kinds of large-scale search-and-replace diffs should not generally be expected to be reviewed. If a review is thousands of lines of identical changes (or newly vendored code), literally nobody is actually reading it, even if they somehow convince themselves that they are.

I would rather just see the steps you ran to generate the diff and review that instead.
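
For what it's worth, a rough CI-style sketch of that idea (the base ref, the codemod command, and all paths below are placeholders, and it ignores untracked files and other edge cases): replay the documented command on the base commit and fail if it doesn't reproduce the PR exactly.

```python
# Sketch: rerun the PR's documented codemod on the base commit in a scratch
# worktree and verify it reproduces the PR head bit-for-bit.
import subprocess

BASE_REF = "origin/main"          # assumed base branch
CODEMOD = "./scripts/rename.sh"   # placeholder for the documented command

def sh(cmd: str, cwd: str = ".") -> str:
    out = subprocess.run(cmd, shell=True, cwd=cwd, check=True,
                         capture_output=True, text=True)
    return out.stdout.strip()

def main() -> None:
    pr_head = sh("git rev-parse HEAD")          # the PR's actual result
    replay = "../codemod-replay"
    sh(f"git worktree add --detach {replay} {BASE_REF}")
    try:
        sh(CODEMOD, cwd=replay)                 # replay the documented step
        # Base + replayed codemod must match the PR head exactly; otherwise the
        # diff contains hand edits that need real review.
        sh(f"git diff --exit-code {pr_head} -- .", cwd=replay)
        print("codemod reproduces the PR diff")
    finally:
        sh(f"git worktree remove --force {replay}")

if __name__ == "__main__":
    main()
```

Then the thing under review is the codemod script itself, not thousands of identical hunks.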

25. ajross ◴[] No.45041777[source]
Confirmation of priors is a powerful drug. And performance engineering is really hard and often lives at a different layer of the stack than the one you know.

It's just easier to blame the tools (or companies!) you already hate.

26. shadowgovt ◴[] No.45042397{7}[source]
I envy your IDE being able to do a rename of that scale.

I work in a large C++ codebase, and a rename like that will actually just crash my VSCode instance straight-up.

(There are good automated tools that make it straightforward to script up a repository-wide mutation like this, however. But they still generate PRs that require human review; in the case of the one I used, it'd break the PR up into tranches of 50-ish files, then hunt down individuals with authority to review the root directory of each tranche and assign it to them. Quite useful!)
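
Roughly, the tranching part of such a tool could look like this (a loose Python sketch; the tranche size, owner mapping, and names are all made up):

```python
# Sketch: group changed files by top-level directory, split each group into
# tranches of ~50 files, and pair every tranche with an owner of that directory.
from itertools import islice

TRANCHE_SIZE = 50

def tranches(changed_files: list[str], owners: dict[str, str]):
    by_root: dict[str, list[str]] = {}
    for path in changed_files:
        root = path.split("/", 1)[0]
        by_root.setdefault(root, []).append(path)
    for root, files in by_root.items():
        it = iter(files)
        while chunk := list(islice(it, TRANCHE_SIZE)):
            # Fall back to a default reviewer if no owner is registered for the root.
            yield owners.get(root, "default-reviewer"), chunk

# Example usage with hypothetical paths and owners.
files = [f"src/file{i}.cc" for i in range(120)] + [f"docs/page{i}.md" for i in range(10)]
for reviewer, chunk in tranches(files, {"src": "alice", "docs": "bob"}):
    print(reviewer, len(chunk))
```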

replies(1): >>45043353 #
27. psygn89 ◴[] No.45042832[source]
JS is the logical place to start with all the virtualization and fanciness.

But CSS has bitten me with heavy pages (causing a few seconds of lag that even devtools debugging/logging didn't point towards). We know wildcard selectors can impact performance, but in my case there were many open-ended selectors like `:not(.what) .ever`, where the `:not()` not being attached to anything made it act like a wildcard with conditions. Using `:has()` will do the same with additional overhead. Safari was the worst at handling large pages and these types of selectors; I noticed more sluggishness there 2-3 years ago.

replies(1): >>45043040 #
28. Sesse__ ◴[] No.45043040{3}[source]
`:not(.what) .ever` should be fairly fast, unless you have lots of `class="ever"` elements. Not ideal, but not as bad as e.g. `.ever :not(.what)` would be. `:has()` is just inherently slow if there's a significant number of elements to search, even though browsers have some caching and tricks.

Normally, you should be able to debug selector matching performance (and, in general, see how much style computation costs you), so it's a bit weird if you have phantom multi-second delays.

29. ◴[] No.45043276[source]
30. trenchpilgrim ◴[] No.45043353{8}[source]
Yeah, VSCode is pretty terrible at refactorings that JetBrains or Zed will do basically instantly.
31. const_cast ◴[] No.45047557{6}[source]
We already automate that; it's called a compiler. The human review is just for kicks for this type of thing.

Of course, some languages... PHP... aren't so lucky. $customer->cusomerEmail? Good luck dealing with that critical bug in production, fuckheads!

replies(1): >>45051422 #
32. MobiusHorizons ◴[] No.45049426[source]
These performance problems are new since a rewrite that also added React. It could just be a coincidence, but that is why people blame React.
33. itsgabriel ◴[] No.45050078{3}[source]
as god indented, you mean?
replies(1): >>45061273 #
34. darkwater ◴[] No.45050622[source]
So, it was actually a bug in Safari's WebKit, exposed by GitHub (which might be abusing it a bit anyway). But hey, since we are all web developers or web users, it's easier to shoot the pianist (GitHub in this case).
35. muglug ◴[] No.45051422{7}[source]
What if I told you there were tools for that
replies(1): >>45068742 #
36. moffkalast ◴[] No.45061273{4}[source]
I'm stealing that one, lmao
37. const_cast ◴[] No.45068742{8}[source]
I'm aware of the tools; I write PHP daily.

The point is more that PHP won't stop you from doing that. It will run, and it will continue running, and then it will throw an error at some point. Maybe.

That is, if the code is actually executed. If it's in a branch that's only executed like 1/1000 times... not good.