160 points fueloil | 43 comments
1. ◴[] No.42479920[source]
2. jaronilan ◴[] No.42480357[source]
OT: earlier this year I wrote a short fictional story about SEO….

https://github.com/jaronilan/stories/blob/main/Duplicitous.p...

replies(2): >>42480627 #>>42480778 #
3. nothercastle ◴[] No.42480493[source]
Just goes to show you how far behind the SEO scammers Google is. Despite being bigger and wealthier, Google simply can't catch the more nimble SEO scammers.
replies(3): >>42480527 #>>42480782 #>>42482272 #
4. wkat4242 ◴[] No.42480527[source]
I'm pretty sure Google is milking them now by just selling ad words. They no longer care about the search results, just how much money they make.
replies(5): >>42480709 #>>42480799 #>>42480867 #>>42481182 #>>42481804 #
5. arjvik ◴[] No.42480627[source]
Wow, love the story, still processing it
6. greenchair ◴[] No.42480632[source]
The geolocation trick is absolutely fascinating!
replies(1): >>42480871 #
7. throwaway743 ◴[] No.42480697[source]
So sick of dealing with Google, between this, the Play Store, AdMob, etc. They'll punish you for the slightest things, have zero meaningful customer service or means of recourse, and in many cases don't even have a way to reach customer service (AdMob's email option is visible but throws an error every time you click it). Not to mention, they try to steer you towards their useless "community forums", which are filled with "diamond users" who just spam copy/paste responses.

If your business depends on their services, you're fucked if you slip up in the smallest way. Have fun trying to get ahold of anyone who can help, unless you're lucky and have a friend who works there.

Side note, they're going to further penalize apps based on performance/ANRs, yet they haven't fixed the issue of admob's banner ads causing performance issues.

It just feels awful. We need more options that can actually help small businesses, not hurt or threaten them.

replies(5): >>42480786 #>>42480863 #>>42480896 #>>42482118 #>>42482328 #
8. ◴[] No.42480709{3}[source]
9. ◴[] No.42480778[source]
10. ◴[] No.42480782[source]
11. ◴[] No.42480786[source]
12. bn-l ◴[] No.42480799{3}[source]
Also SEO provides at least something for the long tail. My belief is that the algorithm is able to predict what looks like “content” to a broad consumer base. Then the fatter the tail gets the more they manually tweak the results (like editors doing publishing).
13. mavamaarten ◴[] No.42480863[source]
Yup. Dealing with Google often gives me strong "I'm being held hostage" vibes.

Our apps have been rejected from the Play Store for bogus reasons multiple times. Sometimes it's an easy fix (just release the same update with a higher version number, to get another bot or human to take a look), sometimes it takes a week, and sometimes we've had to pull strings and escalate our issue through a contact at Google. But if you're a small fish, they will absolutely let you rot.

14. Drakim ◴[] No.42480867{3}[source]
What you are saying sounds like hyperbole, but all it takes is for management within Google to blindly chase metrics, and they end up doing exactly that.
replies(2): >>42480893 #>>42480964 #
15. Nextgrid ◴[] No.42480871[source]
And also absolutely trivial to detect automatically. The fact that Google doesn't do it proves they don't give a shit about improving search quality beyond superficial efforts whose main objective is just to make it look like they do. Maybe the fact that most spam sites have Google Analytics & Ads has something to do with it?
replies(3): >>42481610 #>>42482173 #>>42482681 #
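The automated check Nextgrid calls trivial could be sketched like this: fetch the same URL from two vantage points (say, a crawler-like proxy and a consumer-like one) and flag pages whose responses differ wildly. The proxies and threshold here are hypothetical placeholders, not anything Google is known to use.

```python
# Sketch: detect geo-cloaking by fetching one URL from two vantage points
# and comparing the responses. Proxy endpoints are hypothetical.
import urllib.request
from difflib import SequenceMatcher


def fetch(url, proxy=None):
    """Fetch a URL as text, optionally through an HTTP(S) proxy."""
    handlers = []
    if proxy:
        handlers.append(urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
    opener = urllib.request.build_opener(*handlers)
    with opener.open(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def similarity(a, b):
    """Rough textual similarity between two responses, in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()


def looks_cloaked(url, crawler_proxy, user_proxy, threshold=0.6):
    """Flag the page if the two vantage points see very different content."""
    crawler_view = fetch(url, crawler_proxy)
    user_view = fetch(url, user_proxy)
    return similarity(crawler_view, user_view) < threshold
```

A real pipeline would compare rendered DOMs rather than raw bytes (to ignore ads and timestamps), but the principle is just "same URL, two viewpoints, diff the result".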
16. Nextgrid ◴[] No.42480893{4}[source]
That's the point - openly saying that they promote spam sites to milk ad revenue could land them in antitrust trouble. But all they have to do is merely not do anything that would downrank said spam sites; the outcome is exactly the same, yet they are now legally in the clear and can just blame incompetence for the (totally predictable) outcome.
17. pyr0hu ◴[] No.42480896[source]
Can agree. We are awaiting a response on our DUNS number. They said we have to provide a DUNS number, which we have, and that it has to be exactly 9 characters long. Ours is 10. Apple accepts it, Google does not. Even the DUNS lookup site only finds our company using the 10-character number.

Google gives no response, just extends the deadline until they remove our account for not providing the DUNS number.

Funny thing is that our number starts with a zero, so theoretically it could be 9 characters long, but the official lookup requires the 0 prefix.
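The mismatch described above (a 9-digit standard vs. a 10-character identifier with a leading zero) is exactly the kind of thing a small normalization shim handles. A purely illustrative sketch, not how Google or D&B actually validate:

```python
# Sketch: normalize a DUNS-style identifier to the 9-digit form some
# validators expect. Illustrative only; real DUNS handling may differ.
import re


def normalize_duns(raw):
    """Strip formatting and reduce a 10-digit, zero-prefixed value to 9 digits."""
    digits = re.sub(r"\D", "", raw)          # drop dashes, spaces, etc.
    if len(digits) == 10 and digits[0] == "0":
        digits = digits[1:]                   # the leading-zero case above
    if len(digits) != 9:
        raise ValueError("cannot normalize to a 9-digit DUNS: %r" % raw)
    return digits
```

The irony in the parent comment is that the submitter can't apply this trick themselves, because the official lookup only recognizes the zero-prefixed form.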

18. bootloop ◴[] No.42480964{4}[source]
I mean, that is exactly what each and every Google team does. There is no decision made without data "supporting" it.

But every graph can be made to look nice, and things that are common sense shouldn't need a metric to prove them.

19. equestria ◴[] No.42481182{3}[source]
I doubt it works this way. People at Google use search too, and they don't like what they see.

Part of the problem is that they're fighting against financial incentives that they themselves created. There's plenty of upside and little downside to abusing it, so it's just endless whack-a-mole.

Another issue is just how bureaucratic the process has become. They want it to look good to the regulators and the courts, so they put up with a pattern of abuse for five years, then announce some well-reasoned but narrow policy change (e.g. "product reviews now need to be actual hands-on reviews"), and... a month later, spammers are just adding an extra lie on all the fake review websites.

20. daft_pink ◴[] No.42481334[source]
I don’t quite understand why they don’t just let trusted users manually rate search results and feed that through AI to determine ranking penalties, similar to (though not exactly like) Kagi.
replies(4): >>42481356 #>>42481500 #>>42481655 #>>42481705 #
21. tensor ◴[] No.42481356[source]
Probably because they prefer a ranking order that promotes sites with more of their ads or some such, and/or causes users to spend more time going through results looking at more ads.
replies(1): >>42481447 #
22. schmidtleonard ◴[] No.42481447{3}[source]
> The goals of the advertising business model do not always correspond to providing quality search to users.

- Sergey Brin and Lawrence Page, The Anatomy of a Large-Scale Hypertextual Web Search Engine

23. Tomte ◴[] No.42481455[source]
> Techopedia started life as a solid tech site. Now, it’s a front for gambling and crypto. When it was penalized, it experienced a massive crash in traffic and rankings.

Google actually did something to improve search!

replies(2): >>42481753 #>>42483101 #
24. webdoodle ◴[] No.42481500[source]
For the same reason Twitter, Facebook and other anti-social media companies fired their Trust and Safety teams. The parasitic elite can't trust hoomans not to turn on them and promote sites and ideas that they consider dangerous to their rule. This is the reason A.I. has risen in prominence, and is clearly biased when talking about the class war.
replies(1): >>42482591 #
25. Macha ◴[] No.42481610{3}[source]
I mean, spammers can (and do) get fancier, e.g. detecting the source AS and its owner for cloaking, but I guess the spammers here are using geolocation for a veneer of plausible deniability that they're not specifically targeting Google.
26. Nasrudith ◴[] No.42481655[source]
For one because it would be quite abusable. Get a user account to trusted status and then sell it. This is in addition to any scalability issues.
27. summerlight ◴[] No.42481705[source]
https://support.google.com/websearch/answer/9281931?hl=en

They're already doing that with paid quality raters. I suppose your question is about opening this up more widely, but that's basically giving those spammers a tool to directly influence the ranking, which is going to be even worse.

28. ◴[] No.42481753[source]
29. summerlight ◴[] No.42481804{3}[source]
This is a prevalent misconception that assumes advertisers don't care about how their money is spent! Advertisers and Google are actually concerned about SEO garbage. Nowadays, most advertisers tend to pay based on the number of conversions, their value, and ROI. Those spammy sites usually yield a garbage CVR even though their CTR looks great. Advertisers don't like this.

It seems people don't acknowledge that the number of clicks is no longer the important metric. It seemed important back when it was the only meaningful performance metric, but the ultimate metric that matters to money is advertiser budget allocation. If advertisers see Google Search performing worse in terms of conversions, they will cut their budget there. And this is the real problem Google has.

replies(2): >>42481831 #>>42483742 #
30. ryoshu ◴[] No.42481831{4}[source]
Where else is that money being spent?
31. echelon ◴[] No.42482118[source]
Call your representative and tell them you believe Google is engaging in anti-competitive, trust-like behavior.
32. sneak ◴[] No.42482173{3}[source]
You can’t prove a negative.
33. hinkley ◴[] No.42482272[source]
It’s getting harder and harder for me to find real answers on Google. For a little while DDG was doing okay but they seem to be just as bad if not worse now.
34. xnx ◴[] No.42482288[source]
The images are too small for me to make out the search terms. It seems like they're casino- and crypto-related? The whole category is a cesspit. I'm not sure there's any such thing as good rankings in that area. My concern for the quality of Google search results in this category is close to zero.
replies(1): >>42483518 #
35. hinkley ◴[] No.42482328[source]
A web hosting site I worked for got way too much of its traffic from Google spidering. Vanity URLs make them hit you like a ton of bricks. If you throttle them, it reduces your score; if you 429 them, that seems to be even worse. Canonical URLs don't really save you, because they have to load the response to see they've already visited it (that saves your score, not your traffic). Caching can help somewhat, but if your static pages meant for bots differ too much from the real page, that's the worst offense of all. So you need internal caching, and at the end of the day you just have to scale up to deal with their bullshit. Given that Google sells cloud services, this now looks like a protection racket. Would be a shame if something happened to your website…
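Before deciding whether to throttle a crawler claiming to be Googlebot, a site can apply the reverse-then-forward DNS check that Google itself documents: resolve the IP to a hostname, require a `googlebot.com`/`google.com` suffix, then confirm the hostname resolves back to the same IP. A minimal sketch:

```python
# Sketch: verify a claimed Googlebot IP with the reverse/forward DNS
# check Google documents, before throttling or trusting the request.
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")  # per Google's docs


def is_real_googlebot(ip):
    """Return True only if reverse DNS points to Google and forward-confirms."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)         # reverse lookup
    except OSError:
        return False
    if not host.endswith(GOOGLE_SUFFIXES):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward-confirm
    except OSError:
        return False
    return ip in forward_ips
```

This distinguishes the real crawler from scrapers spoofing the Googlebot user agent; it doesn't, of course, solve the parent's actual problem of the genuine crawler hitting too hard.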
36. kevinventullo ◴[] No.42482591{3}[source]
Are there explicit examples of bias in AI when talking about the class war?
37. ryukoposting ◴[] No.42482681{3}[source]
Off the top of my head, the most trivial approach is also trivial to defeat. Google sets up a VPN? Just give its IP address special treatment, exactly like you're already doing with the geofencing trick.

The only thing I can think of that'd be tough to deal with is giving that VPN server a second route to the internet and hiding it behind the biggest consumer CG-NAT you can find. They'd have to use a bogus UA string too.

To be fair, it's well within Google's capabilities. I'm surprised they don't already do something like this, honestly. Trivial? I wouldn't say so.

On a related note: between SEO abuse like this, and the ongoing systematic IP theft by OpenAI, Google, and friends, I worry we're entering an irreversible dark age for the web. Crazy abuses of NAT and UA strings will run rampant, and the only solution will be to serve nothing of value to anyone at all without a paywall.

replies(1): >>42483454 #
38. julianeon ◴[] No.42483101[source]
But with that kind of money stream behind it, the mouse can go back to his investors and purchase a tiny mouse jetpack (or whatever the SEO equivalent is) to gain an advantage. If the cat learns to swat that, he'll come back with a toy car. Google is fairly active but eventually they give up: the mouse is a lot more motivated to get his cheese than the cat is to get the mouse. Google can penalize them 10 times in a row, but the 11th time, they win - and they essentially have the bankroll for infinite attempts.
39. nisa ◴[] No.42483454{4}[source]
Google has caching proxies in every country for almost every ISP; my rural ISP in Germany, with only 65k users, has one. I'm sure adding this to the contract is not impossible. I have no idea how much data Google moves around for crawling, but for some statistical sampling it might be enough to use the ASN of these boxes. On the other hand, this probably means more abuse mail for the ISP, and lots of bandwidth that is more expensive for the ISP than for Google, which works directly against the point of having these boxes there in the first place: reducing traffic leaving the ISP.
40. joecool1029 ◴[] No.42483518[source]
> My concern for quality of search results from Google in this category is close to zero.

This thinking is the problem.

I'm not sure the best answer is to manually penalize some of the regular players. More will fill their place and it only incentivizes churn and burn behavior.

In short, what I mean to say is: if you take out big established institutions you consider shady but that people are searching for, those people will just seek out even shadier new players to fill the need. It makes them even less safe.

41. Inviz ◴[] No.42483535[source]
My newish e-commerce website was removed from Google. Crawling is fine, no penalties. Then Google removed the website link from my business profile, because it's not in search results. Any tips on what to do next?
42. nothercastle ◴[] No.42483742{4}[source]
Different groups often look at different metrics. It's possible that the sales group cares about one value while the dev group optimizes for another.