https://gehrcke.de/2023/09/google-changes-recently-i-see-mor...
Getting the RSS thing wrong may have just tipped the scales toward Google not caring.
No more Google. No more websites. A distributed swarm of ephemeral signed posts. Shared, rebroadcasted.
When you find someone like James and you like them, you follow them. Your local algorithm then prioritizes finding new content from them. You bookmark their author signature.
Like RSS but better. Fully distributed.
Your own local interest graph, but also the power of your peers' interest graphs.
Content is ephemeral but can also live forever if any nodes keep rebroadcasting it. Every post has a unique ID, so you can search for it later in the swarm or some persistent index utility.
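A rough sketch of what one of those ephemeral signed posts could look like, assuming ed25519 keys (via Python's cryptography package) and a content hash as the unique ID; the schema and field names are purely illustrative, not a spec:

```python
# Hypothetical sketch: a self-contained, signed, content-addressed post.
# Assumes the 'cryptography' package; the schema is illustrative only.
import hashlib
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

def make_post(key: ed25519.Ed25519PrivateKey, text: str) -> dict:
    body = text.encode("utf-8")
    author = key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    )
    return {
        "id": hashlib.sha256(body).hexdigest(),  # unique ID, searchable in the swarm later
        "author": author.hex(),                  # the "author signature" you bookmark
        "text": text,
        "sig": key.sign(body).hex(),             # anyone can verify it; no server involved
    }

def verify_post(post: dict) -> bool:
    body = post["text"].encode("utf-8")
    pub = ed25519.Ed25519PublicKey.from_public_bytes(bytes.fromhex(post["author"]))
    try:
        pub.verify(bytes.fromhex(post["sig"]), body)
    except InvalidSignature:
        return False
    return post["id"] == hashlib.sha256(body).hexdigest()

post = make_post(ed25519.Ed25519PrivateKey.generate(), "hello, swarm")
print(json.dumps(post, indent=2), verify_post(post))
```

Any node can rebroadcast the dict as-is; receivers verify the signature against the bookmarked author key, no central server required.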
The Internet should have become fully p2p. That would have been magical. But platforms stole the limelight just as the majority of the rest of the world got online.
If we nerds had but a few more years...
The amount of spam has increased enormously, and I have no doubt there are a number of such anti-spam flags, with a number of false-positive casualties along the way.
However, if they do it for the statutory term, they can then successfully apply for existing-use rights.
Yet I've seen expert witnesses bring up Google Maps pins during tribunal hearings over planning permits, and the tribunal sort of acts as if it's all legit.
I've even seen tribunal reports publish screenshots from Google Maps as part of their judgments.
You know what else we need? We need food to be free. We need medicine to be free, especially medicines which end epidemics and transmissible disease. We need education to be free. We need to end homelessness. We need to end pollution. We need to end nationalism, racism, xenophobia, sexism. We need freedom of speech, religion, print, association. We need to end war.
There are a lot of things we as a society need. But we can't even make "p2p internet" work, and we already have it. (And please just forget the word 'distributed', because it's misleading you into thinking it's a transformative idea, when it's not)
Only the opening is similar; the intent is totally different, and so is the focus keyword.
I'm not facing this issue in Bing or other search engines.
On the other side of the same coin, there are already governments that will make you legally responsible for what your page's visitors write in comments. This renders any p2p internet legally untenable (i.e. someone goes to your page, posts some bad word, and you get jailed). So far they say "it's only for big companies", but that's a lie; it's just boiling the frog.
Some popular models on Hugging Face never appear in the results, but the sub-pages (discussion, files, quants, etc.) do.
Some Reddit pages show up only in their auto-translated form, and in a language Google has no reason to think I speak. (Maybe there's some deduplication to keep machine translations out of the results, but it's misfiring and discarding the original instead?)
YouTube should get split out and then broken up. Google Search should get split out and broken up. etc.
This is not a problem you solve with code. This is a problem you solve with law.
That's not to say I don't have gripes with how Google Maps works, but I just don't know why the other factors were not considered.
The primary domain cannot be found via search. Bing knows about the brand, the LinkedIn page, and the YouTube channel, but refuses to show search results for the primary domain.
Bing's search console gives no clue, and forcing reindexing does not help. Google search works fine.
I just checked a few local restaurants near me in London that opened in the last few years, and the ratio of reviews is about 16:1 in favor of Google Maps. It looks like stuff that's been around longer has a much better ratio toward Tripadvisor, though.
Almost certainly Instagram/TikTok are, though. I know a few places which have been ruined by becoming TikTok tourist hotspots.
1. AI Overviews: my page impressions were high and my ranking was high, but click-through took a dive. People read the generated text and move along without ever clicking.
2. You are now a spammer. Around August, traffic took a second plunge. In my logs, I noticed weird queries on my search page: people were searching for crypto and scammy websites on my blog. Odd, but it's not like they were finding anything. It turns out their search query was displayed as an h1 on the page and crawled by Google. I was basically displaying spam.
I don't have much control over AI Overviews, because disabling them means I don't appear in search at all. But for the spam, I could do something: I added a robots noindex on the search page. A week later, both impressions and clicks recovered.
Edit: Adding write up I did a couple weeks ago https://idiallo.com/blog/how-i-became-a-spammer
"cannot do anything" is relative. Google did something about it (at least for the first 10-15 years) but I am sure that was not their primary intention nor they were sure it will work. So "we have no clue what will work to reduce it" is more appropriate.
Now I think everybody has tools to build stuff more easily (you could not make a television broadcast or a newspaper 50 years ago). That is just an observation of possibility, not a guarantee of success.
I would settle for simpler, attainable things. Equal opportunity for next generation. Quality education for everybody. Focus on merit not other characteristics. Personal freedom if it does not infringe on the freedom of people around you (ex: there can't be such thing as a "freedom to pollute").
In my view, the Internet as p2p worked pretty well to improve on the previous status quo in many areas (not all). But there will never be a "stable solution"; life and humans are dynamic. We do have some good and free stuff on the Internet today because of the groundwork laid 30 years ago by the open source movement. Any plan started today will only show a noticeable effect many years from now. So "we can't even make it work" sounds more like an excuse not to start than an honest take.
You can avoid this by not caching search pages and applying noindex via the X-Robots-Tag header: https://developers.google.com/search/docs/crawling-indexing/...
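In a Flask-style app that could look something like this (a minimal sketch; the framework, route, and markup are my assumptions, only the header itself comes from the linked doc):

```python
# Sketch: mark the search results page noindex via the X-Robots-Tag header
# so reflected queries never enter the index, and keep it out of caches.
from flask import Flask, request, make_response
from markupsafe import escape

app = Flask(__name__)

@app.route("/search")
def search():
    q = request.args.get("q", "")
    resp = make_response(f"<h1>{escape(q)} - search results</h1>")
    resp.headers["X-Robots-Tag"] = "noindex"    # crawlers: do not index this page
    resp.headers["Cache-Control"] = "no-store"  # and do not cache it
    return resp
```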
example.com/search?q=text+scam.com+text
On my website, I'd display "text scam.com text - search result", and now Google would see that link in my h1 tag and page title and conclude I am probably promoting scams. Also, the reason this appeared suddenly is that I added support for Unicode in search. Before that, the page would fail if you added Unicode. So the moment I fixed it, I allowed spammers to have their links displayed on my page.
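For anyone puzzled by the mechanics, the pattern is roughly this (a hypothetical reconstruction, not the actual site's code):

```python
# The reflected-query pattern described above: whatever the visitor searches
# for ends up verbatim in the <title> and <h1>, so a crawler sees the
# spammer's link as page content. The noindex fix above cuts this off.
def render_search_page(query: str) -> str:
    return (
        f"<html><head><title>{query} - search result</title></head>"
        f"<body><h1>{query} - search result</h1></body></html>"
    )

# A spammer requests /search?q=text+scam.com+text and the crawler indexes:
print(render_search_page("text scam.com text"))
```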
Google got smart, found out about such exploits, and penalized sites that do this.
Most efficient = cheaper. A lot of times cheaper sacrifices quality, and sometimes safety.
But basically what happened: in August 2025 we finished the first working version of our shop. After a few weeks, I wanted to accelerate indexing, because only ~50 of our pages were indexed, so I submitted the sitemap, and everything got de-indexed within days. For the longest time I thought it was content quality, because we sell niche trading cards and the descriptions are all one-liners I made in Excel ("This is $cardname from $set for your collection or deck!"). And because it's single trading cards, we have 7000+ products that are very similar. (We did do all product images ourselves; I thought Google would like this, but alas.)
But later we added binders and whole sets, and took a lot of care with their product data. The frontpage also got a massive overhaul. No shot: not one page in the index. We still get traffic from marketplaces and our older non-shop site. The shop itself lives on a subdomain (shop.myoldsite.com). The normal site also has a sitemap, but that one was submitted in 2022. I later rewrote how my sitemaps were generated and deleted the old ones in Search Console, hoping this would help. It did not. (The old sitemap was generated by the shop system and was very large. Some forums mentioned that it's better to create a chunked sitemap, so I made a script that creates lists of 1000 products at a time, as well as an index for them.)
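For reference, the chunked-sitemap script could look roughly like this (a sketch under assumptions; the domain and file names are placeholders, and the XML follows the sitemaps.org schema):

```python
# Sketch: split product URLs into sitemap files of 1000 entries each,
# plus a sitemap index that points at them. Domain/paths are placeholders.
from xml.sax.saxutils import escape

CHUNK = 1000
BASE = "https://shop.myoldsite.com"

def write_sitemaps(urls: list[str]) -> None:
    chunks = [urls[i:i + CHUNK] for i in range(0, len(urls), CHUNK)]
    for n, chunk in enumerate(chunks, start=1):
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                    f"{entries}\n</urlset>\n")
    refs = "\n".join(
        f"  <sitemap><loc>{BASE}/sitemap-{n}.xml</loc></sitemap>"
        for n in range(1, len(chunks) + 1))
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{refs}\n</sitemapindex>\n")

write_sitemaps([f"{BASE}/product/{i}" for i in range(7200)])  # e.g. ~7200 product URLs
```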
Later observations are:
- Both sitemaps I deleted in GSC are still getting crawled and are STILL THERE. You can't see them in the overview, but if you have the old links they still appear as normal.
- We eventually started submitting product data to Google Merchant Center as well. It works 100% fine, and our products are getting found and bought. The clicks even still show up in Search Console! So I have a shop with 0 indexed pages in GSC that gets clicks every day. What the heck?
So, like... I don't even know anymore. Maybe we also have to restart like the person in the blog did: move the shop to a new domain and NEVER give Google a sitemap. If I really go that route, I will probably delete the cronjob that creates the sitemap, in case Google finds it by itself. But also, what the heck? I worked in a web agency for 5 years and created a new webpage about every 2-8 weeks, so I launched roughly 50-70 webpages and shops, and I NEVER saw this happen. Is it an AI hallucinating? Is it anti-spam gone too far? Is it a straight-up bug they don't see? Who knows. I don't.
(Good article though, and I hope some other people chime in and Googlers browsing HN see this stuff.)
Every family should be provided with a UBI that covers food and rent (not in the city). That is a more attainable goal and would solve the same problems (better, in fact).
(Not saying that UBI is a panacea, but I've lived in countries that have experimented with such and it seems the best of the alternatives)
Since these are very low-quality results, surely one of Google's 10,000 engineers can tweak this away.
We have a consultant for the topic, but I am not sure how much of that conversation I can share publicly, so I will refrain from doing so.
But I think I can say that it is not only about data structure or quality. The changes in methodology Google applied in September might be playing a stronger role than people initially thought.
From what you've described, you've just re-invented webrings.
They say the data from before and after is no longer comparable, as they are no longer counting certain events below a threshold. You might need your own analytics to understand your traffic from now on.
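If you do end up rolling your own, even a crude server-side pageview log gives you unsampled, unthresholded numbers (a minimal sketch; Flask and the log path are my assumptions):

```python
# Minimal "own analytics": append one line per request, so counts are
# unsampled and never hidden behind a threshold. Names are illustrative.
from datetime import datetime, timezone
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def log_pageview(resp):
    with open("pageviews.log", "a", encoding="utf-8") as f:
        f.write(f"{datetime.now(timezone.utc).isoformat()}\t"
                f"{request.path}\t{request.referrer or '-'}\n")
    return resp
```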