706 points janpio | 96 comments
1. arccy ◴[] No.45676475[source]
If you're going to host user content on subdomains, then you should probably have your site on the Public Suffix List https://publicsuffix.org/list/ . That should eventually make its way into various services so they know that a tainted subdomain doesn't taint the entire site....
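
For reference, the list's private-domains section is just plain suffixes under a comment header. A hypothetical entry for Immich (not actually in the list today) might look roughly like this:

  // ===BEGIN PRIVATE DOMAINS===
  // Immich (hypothetical entry, for illustration only)
  // Submitted by ...
  immich.cloud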
replies(15): >>45676781 #>>45676818 #>>45677023 #>>45677080 #>>45677130 #>>45677226 #>>45677274 #>>45677297 #>>45677341 #>>45677379 #>>45677725 #>>45677758 #>>45678975 #>>45679154 #>>45679258 #
2. o11c ◴[] No.45676781[source]
Is that actually relevant when only images are user content?

Normally I see the PSL in the context of e.g. cookies or user-supplied forms.

replies(1): >>45677246 #
3. andrewstuart2 ◴[] No.45676818[source]
Aw. I saw Jothan Frakes and briefly thought my favorite Starfleet first officer's actor had gotten into writing software later in life.
4. r_lee ◴[] No.45677023[source]
Does Google use this for Safe Browsing though?
replies(1): >>45677036 #
5. akerl_ ◴[] No.45677036[source]
Looks like it? https://developers.google.com/safe-browsing/reference/URLs.a...
6. CaptainOfCoit ◴[] No.45677080[source]
I think it's somewhat tribal webdev knowledge that if you host user-generated content you need to be on the PSL, otherwise you'll eventually end up where Immich is now.

I'm not sure how people who haven't already hit this very issue are supposed to know about it beforehand, though; it's one of those things you don't really come across until you're hit by it.

replies(3): >>45677097 #>>45677221 #>>45677257 #
7. hu3 ◴[] No.45677097[source]
This is the first time I've heard about https://publicsuffix.org
replies(1): >>45677199 #
8. ggm ◴[] No.45677130[source]
I think this is only true if you host independent entities. If you simply construct deep names about yourself, with a demonstrable chain of authority back, I don't think the PSL wants to know. Otherwise there is no hierarchy: the dots are just convenience strings and it's a flat namespace the size of the PSL.
9. btown ◴[] No.45677199{3}[source]
You're in good company! From 12 days ago: https://news.ycombinator.com/item?id=45538760
10. no_wizard ◴[] No.45677221[source]
I've been doing this for at least 15 years and it's the first I've heard of this.

It's fun learning new things so often, but I had never once heard of the public suffix list.

That said, I do know the other best practices mentioned elsewhere.

replies(1): >>45677554 #
11. aftbit ◴[] No.45677226[source]
I thought this story would be about some malicious PR that convinced their CI to build a page featuring phishing, malware, porn, etc. It looks like Google is simply flagging their legit, self-created Preview builds as being phishing, and banning the entire domain. Getting immich.cloud on the PSL is probably the right thing to do for other reasons, and may decrease the blast radius here.
12. dspillett ◴[] No.45677246[source]
> Is that actually relevant when only images are user content?

Yes. For instance in circumstances exactly as described in the thread you are commenting in now and the article it refers to.

Services like Google's bad-site warning system may use it to avoid considering a whole domain harmful when they only consider a small number of its subdomains to be so, where otherwise they would. It is no guarantee, of course.
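
As a rough sketch of how such a service could use the list (assuming the third-party Python package tldextract; the subdomain names are made up, and immich.cloud is not actually in the PSL today):

  import tldextract

  # Default view: immich.cloud is not a public suffix, so every subdomain
  # rolls up to the same registrable domain ("site"): immich.cloud.
  plain = tldextract.extract("pr-123.preview.immich.cloud")
  print(plain.registered_domain)  # immich.cloud

  # If immich.cloud were listed in the PSL's private section, an extractor
  # honouring private entries would treat each subdomain as its own site.
  private_aware = tldextract.TLDExtract(include_psl_private_domains=True)
  hypothetical = private_aware("pr-123.preview.immich.cloud")
  print(hypothetical.registered_domain)  # "preview.immich.cloud" if immich.cloud were listed

A reputation system that keys on the registrable domain would then only taint the one subdomain rather than the whole apex.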

replies(1): >>45677880 #
13. tonyhart7 ◴[] No.45677257[source]
So is it a skill issue, or just Google being bad?
replies(1): >>45677435 #
14. LennyHenrysNuts ◴[] No.45677274[source]
The root cause is bad behaviour by Google. This is merely a workaround.
replies(1): >>45677284 #
15. bitpush ◴[] No.45677284[source]
Remember, this is a free service that Google is offering for even their competitors to use.

And it is an incredibly valuable thing. You might not think it is, but the internet is filled with utterly dangerous, scammy, phishy, malware-ridden websites, and every day Safe Browsing (via Chrome, Firefox and Safari - yes, Safari uses Safe Browsing) keeps users safe.

If Immich didn't follow best practice, is that Google's fault? You're showing your naivety and bias here.

replies(6): >>45677317 #>>45677323 #>>45677395 #>>45678677 #>>45678682 #>>45679318 #
16. fukka42 ◴[] No.45677297[source]
This is not about user content, but about their own preview environments! Google decided their preview environments were impersonating... something? And blocked the entire domain.
17. liquid_thyme ◴[] No.45677317{3}[source]
>You might not think it is, but internet is filled utterly dangerous, scammy, phisy, malwary websites

Google is happy to take their money and show scammy ads. Google ads are the most common vector for fake software support scams. Most people google something like "microsoft support" and end up there. Has Google ever banned their own ad domains?

Google is the last entity I would trust to be neutral here.

18. NetMageSCW ◴[] No.45677323{3}[source]
Please point me to where GoDaddy or any other hosting site mentions public suffixes, or where Apple, Google, or Mozilla have a list of hosting best practices that includes avoiding false positives by Safe Browsing…
replies(1): >>45677462 #
19. 827a ◴[] No.45677341[source]
They aren't hosting user content; it was their pull request preview domains that were triggering it.

This is very clearly just bad code from Google.

20. 0xbadcafebee ◴[] No.45677379[source]

  In the past, browsers used an algorithm which only denied setting wide-ranging cookies for top-level domains with no dots (e.g. com or org). However, this did not work for top-level domains where only third-level registrations are allowed (e.g. co.uk). In these cases, websites could set a cookie for .co.uk which would be passed onto every website registered under co.uk.

  Since there was and remains no algorithmic method of finding the highest level at which a domain may be registered for a particular top-level domain (the policies differ with each registry), the only method is to create a list. This is the aim of the Public Suffix List.
  
  (https://publicsuffix.org/learn/)
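
Concretely, the failure mode the quote describes looked something like this (illustrative header, made-up values): a page on any site under co.uk could respond with

  Set-Cookie: session=abc123; Domain=co.uk

and, without a list, the browser had no way to know that co.uk is a registry suffix rather than an ordinary registrable domain, so that cookie would then be sent to every other *.co.uk site. Modern browsers consult the PSL and reject Domain values that are public suffixes.
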
So, once they realized web browsers are all inherently flawed, their solution was to maintain a static list of websites.

God I hate the web. The engineering equivalent of a car made of duct tape.

replies(6): >>45677442 #>>45678161 #>>45678382 #>>45678520 #>>45678922 #>>45679006 #
21. delis-thumbs-7e ◴[] No.45677395{3}[source]
Oh c’mon. Google does not offer free services. Everyone should know that by now.
replies(1): >>45678700 #
22. yndoendo ◴[] No.45677435{3}[source]
I will go with Google being bad / evil for 500.

Google of the 90s to 2010 is nothing like Google in 2025. There is a reason they removed "Don't be evil" ... being evil and authoritarian makes more money.

Looking at you, Manifest V2 ... pour one out for your homies.

replies(3): >>45677808 #>>45677823 #>>45678538 #
23. lukan ◴[] No.45677442[source]
"The engineering equivalent of a car made of duct tape"

Kind of. But do you have a better proposition?

replies(2): >>45677503 #>>45678251 #
24. gruez ◴[] No.45677462{4}[source]
>GoDaddy or any other hosting site mentions public suffix

They don't need to mention it because they handle it on behalf of the client. Them recommending best practices like using separate domains makes as much sense as them recommending what TLS configs to use.

>or where Apple or Google or Mozilla have a listing hosting best practices that include avoiding false positives by Safe Browsing…

Since when were those sites the go-to place to learn how to host a site? Apple doesn't offer anything related to web hosting besides "a computer that can run nginx". Google might be the place to ask if you were your aunt, for whom "google" means "the internet". Mozilla is the most plausible one because they host MDN, but hosting documentation on HTML/CSS/JS doesn't necessarily mean they offer hosting advice, any more than you'd expect docs.djangoproject.com to contain hosting advice.

replies(1): >>45677700 #
25. gmueckl ◴[] No.45677503{3}[source]
A part of the issue is IMO that browsers have become ridiculously bloated everything-programs. You could take about 90% of that out and into dedicated tools and end up with something vastly saner and safer and not a lot less capable for all practical purposes. Instead, we collectively are OK with frosting this atrocious layer cake that is today's web with multiple flavors of security measures of sometimes questionable utility.

End of random rant.

replies(4): >>45677688 #>>45677734 #>>45677747 #>>45678076 #
26. foobarian ◴[] No.45677554{3}[source]
First rule of the public suffix list...
27. sefrost ◴[] No.45677688{4}[source]
You are right from a technical point of view, I think, but in reality - how would one begin to make that change?
28. Zak ◴[] No.45677700{5}[source]
The underlying question is how are people supposed to know about this before they have a big problem?
replies(1): >>45677757 #
29. david_van_loon ◴[] No.45677725[source]
The issue isn't the user-hosted content - I'm running a release build of Immich on my own server and Google flagged my entire domain.
replies(2): >>45677919 #>>45678012 #
30. lukan ◴[] No.45677734{4}[source]
"You could take about 90% of that out and into dedicated tools "

But then you would loose plattform independency, the main selling point of this atrocity.

Having all those APIs in a sandbox that mostly just work on billion devices is pretty powerful and a potential succesor to HTML would have to beat that, to be adopted.

The best thing to happen, that I can see, is that a sane subset crystalizes, that people start to use dominantly, with the rest becoming legacy, only maintained to have it still working.

But I do dream of a fresh rewrite of the web since university (and the web was way slimmer back then), but I got a bit more pragmatic and I think I understood now the massive problem of solving trusted human communication better. It ain't easy in the real world.

replies(3): >>45677833 #>>45677843 #>>45678003 #
31. nemothekid ◴[] No.45677747{4}[source]
>A part of the issue is IMO that browsers have become ridiculously bloated everything-programs.

I don't see how that solves the issue the PSL tries to fix. I was a script kiddie hosting Neopets phishing pages on free cPanel servers at <random>.ripway.com back in 2007. Browsers were way less capable then.

replies(1): >>45677763 #
32. nemothekid ◴[] No.45677757{6}[source]
If you have a service where anyone can sign up and host content on your subdomain, it really is your responsibility to know. Calling this "unfair" because you didn't know is naive.

If Amazon shut down your AWS account because those same scammers used those domains to host CP rather than phishing pages, would you accept the excuse of "how was I supposed to know?"

replies(1): >>45678880 #
33. thayne ◴[] No.45677758[source]
Looking through some of the links in this post, I think there are actually two separate issues here:

1. Immich hosts user content on their domain, and should thus be on the public suffix list.

2. When users host an open-source, self-hosted project like Immich, Jellyfin, etc. on their own domain, it gets flagged as phishing because it looks an awful lot like the publicly hosted version, but it's on a different domain - possibly one that looks suspicious to someone unfamiliar with the project, because it includes the name of the software, something like immich.example.com.

The first one is fairly straightforward to deal with, if you know about the public suffix list. I don't know of a good solution for the second though.

replies(4): >>45677810 #>>45677812 #>>45678057 #>>45678836 #
34. lukan ◴[] No.45677763{5}[source]
The PSL and the way cookies work are just part of the mess. A new approach could solve that in a different way, taking into account all the experience we have had with script kiddies and professional scammers and phishers since then. But I also don't really have an idea where and how to start.
replies(1): >>45677820 #
35. shadowgovt ◴[] No.45677808{4}[source]
Sympathy for the devil: people keep using Google's browser because the Safe Browsing guards catch more bad actors than they false-positive good actors.
replies(1): >>45678312 #
36. smaudet ◴[] No.45677810[source]
I don't think the Internet should be run by being on special lists (other than like, a globally run registry of domain names)...

I get that spam, etc. is an issue, but, like, f* google-chrome - I want to browse the web, not some carefully curated list of sites some giant tech company has chosen.

A) You shouldn't be using google-chrome at all.
B) Firefox should definitely not be using that list either.
C) If you are going to have a "safe sites" list, that should definitely be run by a non-profit, not an automated robot working for a large probably-evil company...

replies(6): >>45677835 #>>45677892 #>>45677899 #>>45677928 #>>45678115 #>>45678656 #
37. VTimofeenko ◴[] No.45677812[source]
> When users host an open source self hosted project like immich, jellyfin, etc. on their own domain...

I was just deploying your_spotify and gave it your-spotify.<my services domain>, and there was a warning in the logs about this, linking the issue:

https://github.com/Yooooomi/your_spotify/issues/271

38. shadowgovt ◴[] No.45677820{6}[source]
And of course, if the new solution completely invalidates old sites, it just won't get picked up. People prefer slightly broken but accessible to better designed but inaccessible.
replies(2): >>45678253 #>>45679014 #
39. tonyhart7 ◴[] No.45677823{4}[source]
Downvoted for saying the truth.

Many Google employees are in here, so I don't expect them to agree with you.

40. gmueckl ◴[] No.45677833{5}[source]
But do we need e.g serial port or raw USB access straight from a random website? Even WebRTC is a bit of a stretch. There is a lot of cruft in modern browsers that does little except increase attack surface.

This all just drives a need to come up with ever more tacked-on protection schemes because browsers have big targets painted on them.

replies(5): >>45677839 #>>45677890 #>>45678065 #>>45678383 #>>45679283 #
41. shadowgovt ◴[] No.45677835{3}[source]
There are other browsers if you want to browse the web with the blinders off.

It's browser beware when you do, but you can do it.

42. shadowgovt ◴[] No.45677839{6}[source]
How else am I going to make a game in the browser that can be controlled with a controller?
replies(1): >>45678826 #
43. smaudet ◴[] No.45677843{5}[source]
> Having all those APIs in a sandbox that mostly just work on billion devices is pretty powerful and a potential succesor to HTML would have to beat that, to be adopted.

I think the giant downside is that they've written a rootkit that runs on everything, and to try to make up for that they want to make it so only sites they allow can run.

It's not really very powerful at all if nobody can use it; at that point you are better off just not bothering with it at all.

The Internet may remain, but the Web may really be dead.

replies(2): >>45677951 #>>45679303 #
44. thayne ◴[] No.45677880{3}[source]
Well, using the public suffix list _also_ isolates cookies and treats the subdomains as different sites, which may or may not be desirable.

For example, if users are supposed to log in on the base domain in order to access content on the subdomains, then using the public suffix list would be problematic.
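
Concretely (hypothetical names): if immich.cloud were on the list, a response like

  Set-Cookie: auth=abc123; Domain=immich.cloud

would be rejected by browsers, so a session established on app.immich.cloud couldn't be shared with something like pr-123.immich.cloud; each subdomain would have to handle auth on its own.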

replies(1): >>45679231 #
45. lukan ◴[] No.45677890{6}[source]
I have used WebRTC for many years and would miss it a lot. P2P is awesome.

WebUSB I don't use and wouldn't miss right now, but... the main potential use case is security, and it sounds somewhat reasonable:

"Use in multi-factor authentication

WebUSB in combination with special purpose devices and public identification registries can be used as key piece in an infrastructure scale solution to digital identity on the internet."

https://en.wikipedia.org/wiki/WebUSB

46. knowriju ◴[] No.45677892{3}[source]
If you have such strong feelings, you could always use vanilla chromium.
47. jonas21 ◴[] No.45677899{3}[source]
You can turn it off in Chrome settings if you want.
48. mixologic ◴[] No.45677919[source]
Is it on your own domain?
replies(1): >>45677930 #
49. thayne ◴[] No.45677928{3}[source]
Firefox and Safari also use the list, at least by default; I think you can turn it off in Firefox. And on the whole, I think it is valuable to have _a_ list of known-unsafe sites. And note that Safe Browsing is a blocklist, not an allowlist.

The problem is that at least some of the people maintaining this list seem to be a little trigger-happy. And I definitely think Google probably isn't the best custodian of such a list, as they have obvious conflicts of interest.

replies(1): >>45678231 #
50. david_van_loon ◴[] No.45677930{3}[source]
Yes, my own domain.
51. lukan ◴[] No.45677951{6}[source]
"It's not really very powerful at all if nobody can use it"

But people do use it, like the both of us right now?

People also use maps, do online banking, play games, start complex interactive learning environments, collaborate in real time on documents etc.

All of that works right now.

52. ngold ◴[] No.45678003{5}[source]
Not sure if it counts, but I've been enjoying LibreWolf. I believe it's just a stripped-down Firefox.
53. liqilin1567 ◴[] No.45678057[source]
That means the Safe Browsing abuse could be weaponized against self-hosted services, oh my...
replies(1): >>45678150 #
54. com2kid ◴[] No.45678065{6}[source]
Itch.io games and controller support.

You have sites now that let you debug microcontrollers in your browser - super cool.

Same thing but with firmware updates in the browser. Cross-platform, and it replaced a mess of ugly, broken vendor tools.

55. Kim_Bruning ◴[] No.45678076{4}[source]
Are you saying we should make a <Unix Equivalent Of A Browser>? A large set of really simple tools that each do one thing really, really, really pedantically well?

This might be what's needed to break out of the current local optimum.

replies(1): >>45678831 #
56. awesome_dude ◴[] No.45678115{3}[source]
Oh god, you reminded me of the horrors of hosting my own mail server and all of the whitelist/blacklist BS you have to worry about as a small operator (it's SUPER easy to end up on the blacklists, and SUPER hard to get onto whitelists).
57. sschueller ◴[] No.45678150{3}[source]
New directive from the White House: block all non-approved sites. If you don't do it, we will block your merger, etc...
58. modeless ◴[] No.45678161[source]
Show me a platform not made out of duct tape and I'll show you a platform nobody uses.
replies(2): >>45678422 #>>45678958 #
59. zenmac ◴[] No.45678231{4}[source]
>I think it is valuable to have _a_ list of known-unsafe sites

And how, and by whom, should unsafe sites be defined?

replies(1): >>45678271 #
60. jadengeller ◴[] No.45678251{3}[source]
I'd probably say we ought to use DNS.
replies(1): >>45678843 #
61. motorest ◴[] No.45678253{7}[source]
> People prefer slightly broken but accessible to better designed but inaccessible.

It's not even broken, as the edge cases are addressed by ad-hoc solutions.

OP is complaining about global infrastructure not having a pristine design. At best it's a complaint that a desirable trait is missing. It's hardly a reason to pull the junior-developer card and mindlessly advocate throwing everything out and starting over.

62. MostlyStable ◴[] No.45678271{5}[source]
Ideally there should be several/many and the user should be able to direct their browser as to which they would like to use (or none at all)
63. hulitu ◴[] No.45678312{5}[source]
> people keep using Google's browser because the safe search guards catch more bad actors than they false positive good actors.

This is the first thing I disable in Chrome, Firefox and Edge. The only safe thing they do is safely sending all my browsing history to Google or Microsoft.

64. starfallg ◴[] No.45678382[source]
That's the nature of decentralised control. It's not just DNS, phone numbers work in the same way.
65. hulitu ◴[] No.45678383{6}[source]
> But do we need e.g serial port or raw USB access straight from a random website?

Yes. Regards, CIA, Mossad, FSB etc.

66. vincnetas ◴[] No.45678422{3}[source]
regular cars?
replies(1): >>45678466 #
67. MonaroVXR ◴[] No.45678466{4}[source]
The Honda issue where setting a certain radio station would brick the infotainment? Is that good enough?
replies(1): >>45678915 #
68. lucideer ◴[] No.45678520[source]
> God I hate the web

This is mostly a browser security mistake but also partly a product of ICANN policy & the design of the domain system, so it's not just the web.

Also, the list isn't really that long, compared to, say, certificate transparency logs; now that's a truly mad solution.

69. lucideer ◴[] No.45678538{4}[source]
Don't get me wrong, Google is bad/evil in many ways, but the public suffix list exists to solve a real risk to users. Google is flagging this for a legit reason in this particular case.
70. lucideer ◴[] No.45678656{3}[source]
> I don't think the Internet should be run by being on special lists

People are reacting as if this list is some kind of overbearing way of tracking what people do on the web - it's almost the opposite of that. It's worth clarifying this is just a suffix list for user-hosted content. It's neither a list of user-hosted domains nor a list of safe websites generally - it's just suffixes for a very small specific use-case: a company providing subdomains. You can think of this as a registry of domain sub-letters.

For instance:

- GitHub.io is on the list but GitHub.com is not - GitHub.com is still considered safe

- I self-host an immich instance on my own domain name - my immich instance isn't flagged & I don't need to add anything to the list because I fully own the domain.

The specific instance is just for Immich themselves who fully own "immich.cloud" but sublet subdomains under it to users.

> if you are going to have a "safe sites" list

This is not a safe sites list! This is not even a sites list at all - suffixes are not sites. This also isn't even a "safe" list - in fact it's really a "dangerous" list for browsers & various tooling to effectively segregate security & privacy contexts.

Google is flagging the Immich domain not because it's missing from the safe list but because it has legitimate dangers & it's missing from the dangerous list that informs web clients of said dangers so they can handle them appropriately.

71. udev4096 ◴[] No.45678677{3}[source]
The irony is fucking palpable. You are showing off your naivety and bias here. Imagine defending the most evil, trillion dollar corp. How many ignorant sell outs do we have on HN?
replies(1): >>45678696 #
72. realusername ◴[] No.45678682{3}[source]
The argument would work better if Google wasn't the #1 distributor of scams and malware in the world with AdSense. (Which, strangely, isn't flagged by Safe Browsing - maybe a coincidence.)
73. bitpush ◴[] No.45678696{4}[source]
> Imagine defending the most evil, trillion dollar corp

Hyperbole much?

74. bitpush ◴[] No.45678700{4}[source]
What is Safari getting by using Safe Browsing?
75. gmueckl ◴[] No.45678826{7}[source]
Every decent host OS already has a dedicated driver stack to provide game controller input to applications in a useful manner. Why the heck would you ship a reimplementation of that in JS in a website?
76. gmueckl ◴[] No.45678831{5}[source]
I haven't thought of it that way, but that might be a solution.
replies(1): >>45679324 #
77. lucideer ◴[] No.45678836[source]
> I don't know of a good solution for the second though.

I know the second issue can be a legitimate problem but I feel like the first issue is the primary problem here & the "solution" to the second issue is a remedy that's worse than the disease.

The public suffix list is a great system (despite getting serious backlash here in HN comments, mainly from people who have jumped to wildly exaggerated conclusions about what it is). Beyond that though, flagging domains for phishing for having duplicate content smells like an anti-self-host policy: sure, there are phishers making clone sites, but the vast majority of sites flagged are going to be legit unless you employ a more targeted heuristic, and doing so isn't incentivised by Google's (or most companies') business model.

78. asplake ◴[] No.45678843{4}[source]
And while we're at it, let DNS 1) mark domains as https-only, and 2) indicate when root domains map to a subdomain (e.g. www).
79. asmor ◴[] No.45678880{7}[source]
Nothing in this article indicates UGC is the problem. It's that Google thinks there's an "official" central immich and these instances are impersonating it.

What malicious UGC would you even deliver over this domain? An image with scam instructions? CSAM isn't even in scope for Safe Browsing, just phishing and malware.

80. dgoldstein0 ◴[] No.45678915{5}[source]
Never heard of this. Link please?
81. KronisLV ◴[] No.45678922[source]
> Since there was and remains no algorithmic method of finding the highest level at which a domain may be registered for a particular top-level domain

A centralized list like this, not just for domains as a whole (e.g. co.uk) but also for specific sites (e.g. s3-object-lambda.eu-west-1.amazonaws.com), is both kind of crazy, in that the list will bloat a lot over the years, and a security risk for any platform that needs this functionality but would prefer not to leak any details publicly.

We already have the concept of a .well-known directory that you can use, when talking to a specific site. Similarly, we know how you can nest subdomains, like c.b.a.x, and it's more or less certain that you can't create a subdomain b without the involvement of a, so it should be possible to walk the chain.

Example:

  c --> https://b.a.x/.well-known/public-suffix
  b --> https://a.x/.well-known/public-suffix
  a --> https://x/.well-known/public-suffix
Maybe ship the domains with the browsers and such and leave generic sites like AWS or whatever to describe things themselves. Hell, maybe that could also have been a TXT record in DNS as well.
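
A rough sketch of that walk, purely hypothetical (the .well-known path, the endpoint behaviour and the domain names are all made up for illustration):

  import requests

  def declares_public_suffix(domain: str) -> bool:
      # Hypothetical endpoint: a domain that sublets subdomains answers 200 here.
      try:
          r = requests.get(f"https://{domain}/.well-known/public-suffix", timeout=5)
          return r.ok
      except requests.RequestException:
          return False

  def registrable_domain(host: str) -> str:
      labels = host.split(".")
      suffix_len = 1  # assume the bare TLD is always a public suffix
      # Check ancestors from the TLD down to the host's immediate parent.
      for i in range(len(labels) - 1, 0, -1):
          ancestor = ".".join(labels[i:])
          if declares_public_suffix(ancestor):
              suffix_len = len(labels) - i
      # Registrable domain = deepest claimed public suffix plus one more label.
      return ".".join(labels[-(suffix_len + 1):])

  print(registrable_domain("pr-123.immich.cloud"))

The last line would print "pr-123.immich.cloud" only if immich.cloud answered the hypothetical endpoint; otherwise it falls back to "immich.cloud". The obvious catch with any self-declaration scheme is trust, which is presumably part of why the list is curated instead.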
replies(1): >>45679097 #
82. bell-cot ◴[] No.45678958{3}[source]
Admitting I'm old, but my HP-11C still gets pretty-regular use.

And judging by eBay prices, or the SwissMicros product line, I suspect I have plenty of company.

83. BartjeD ◴[] No.45678975[source]
There is no law appointing that organization as a worldwide authority on tainted/non-tainted sites.

The fact it's used by one or more browsers in that way is a lawsuit waiting to happen.

Because they, the browsers, are pointing a finger at someone else and accusing them of criminal behavior. That is how a normal user understands this warning.

Turns out they are wrong. And in being wrong they may well have harmed the party they pointed at, in reputation and / or sales.

It's remarkable how short-sighted this is, given that the web is so international. It's not a defense to say some third party has a list and, because you're not on it, you're dangerous.

Incredible

replies(1): >>45679010 #
84. formerly_proven ◴[] No.45679006[source]
Why is it a centrally maintained list of domains, when there is a whole extensible system for attaching metadata to domain names?
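
Hypothetically, self-declaration could be as small as a TXT record on the suffix itself (no such convention actually exists; the record below is made up):

  immich.cloud.  3600  IN  TXT  "public-suffix=1"

The usual counterargument is trust: clients would have to resolve and believe these records for arbitrary, possibly hostile domains, whereas a curated list can be reviewed before browsers ship it.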
85. jtwaleson ◴[] No.45679010[source]
As far as I know there is currently no international alternative authority for this. So definitely not ideal, but better than not having the warnings.
replies(1): >>45679045 #
86. friendzis ◴[] No.45679014{7}[source]
> People prefer slightly broken but accessible to better designed but inaccessible.

We live in a world where whatever FAANG adopts is de facto a standard. Accessible these days means Google/Gmail/Facebook/Instagram/TikTok works. Everything else is usually forced to follow along.

People will adopt whatever gives them access to their daily dose of doomscrolling and then complain about rather crucial parts of their lives, like online banking, not working.

> And of course, if the new solution completely invalidates old sites, it just won't get picked up.

Old sites don't matter, only high-traffic sites riddled with dark patterns matter. That's the reality, even if it is harsh.

87. BartjeD ◴[] No.45679045{3}[source]
Yes, but that's not a legal argument.

"Your honor, we hurt the plaintiff because it's better than nothing!"

replies(1): >>45679178 #
88. IshKebab ◴[] No.45679097{3}[source]
I presume it has to be a curated list otherwise spammers would use it to evade blocks. Otherwise why not just use DNS?
89. ZeWaka ◴[] No.45679154[source]
Oh - of course this is where I find the answer to why there's a giant domain list bloating my web bundles (tough-cookie/tldts).
90. jtwaleson ◴[] No.45679178{4}[source]
True, and agreed that lawsuits are likely. Disagree that it's short-sighted. The legal system hasn't caught up with internet technology and global platforms. Until it does, I think browsers are right to implement this despite legal issues they might face.
91. dspillett ◴[] No.45679231{4}[source]
Cross domain identity management is a little extra work, but it's far from a difficult problem. I understand the objection to needing to do it when a shared cookie is so easy, but if you want subdomains to be protected from each other because they do not have shared responsibility for each other then it makes sense in terms of privacy & security that they don't automatically share identity tokens and other client-side data.
92. fc417fc802 ◴[] No.45679258[source]
How does the PSL make any sense? What stops an attacker from offering free static hosting and then making use of their own service?

I appreciate the issue it tries to solve but it doesn't seem like a sane solution to me.

93. sofixa ◴[] No.45679283{6}[source]
> Even WebRTC is a bit of a stretch

You remove that, and videoconferencing (for business or person to person) has to rely on downloading an app, meaning whoever is behind the website has to release for 10-15 OSes now. Some already do, but not everyone has that budget so now there's a massive moat around it.

> But do we need e.g serial port or raw USB access straight from a random website

Being able to flash an IoT (e.g. ESP32) device from the browser is useful for a lot of people. For the "normies", there was also Stadia allowing you to flash their controller to be a generic Bluetooth/USB one via a website, using WebUSB. Without it, Google would have had to release an app for multiple OSes or, more likely, would have just left the devices as paperweights. Also, you can use FIDO/U2F keys directly now, which is pretty good.

Browsers are the modern Excel: people complain that they do too much and that you only need 20%. But it's a different 20% for everyone.

94. sofixa ◴[] No.45679303{6}[source]
> to try to make up for that they want to make it so only sites they allow can run

What do you mean, you can run whatever you want on localhost, and it's quite easy to host whatever you want for whoever you want too. Maybe the biggest modern added barrier to entry is that having TLS is strongly encouraged/even needed for some things, but this is an easily solved problem.

95. 63stack ◴[] No.45679318{3}[source]
Holy shit, look in the mirror.

One of the internet's biggest sources of scams, phishing, malware and everything else you are complaining about is Google AdSense.

Google is using the list to bully out competitors, while telling you it's for keeping you safe.

_You_ are showing naivety and bias.

96. magackame ◴[] No.45679324{6}[source]
There was an attempt in that direction.

https://www.uzbl.org/