Many years ago I used to run the Firefox NoScript extension exclusively. For sites that I trusted and visited frequently I would add their domains to an exceptions list. For sites I wasn't sure about, I would load them with all scripts disabled and then selectively allow scripts until the site was functional, starting with the scripts hosted on the same domain as the site I wanted to see/use.
Eventually I got too lazy to keep doing that, but outside of the painstaking overhead it was by far the best web experience I ever had. I got pretty good at recognizing which scripts I needed to enable to get a site to load/work. Plus, uBlock Origin and annoyances filters got so good that I didn't stress about the web so much anymore.
But all this got me thinking: why not have the browser block all scripts by default, then have an AI agent selectively enable scripts until I get the functionality I need? I could even give feedback to the agent so it can improve over time. This would essentially be automating what I was doing myself years ago. Why wouldn't this work? Do I not understand AI? Or web technology? Or are people already doing this?
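For what it's worth, the "block by default, allow per domain" half already maps pretty directly onto the extension APIs. Here's a minimal sketch assuming a Manifest V3 extension with the declarativeNetRequest permission; the agent part is the hypothetical bit, and allowScriptsFrom() is just the made-up hook where it would plug in:

    // background.js (MV3 service worker) -- a sketch, not a full extension.
    // manifest.json would need: "permissions": ["declarativeNetRequest"]
    // and "background": { "service_worker": "background.js" }.

    // Rule 1: block every script request on every site by default.
    const BLOCK_ALL_SCRIPTS = {
      id: 1,
      priority: 1,
      action: { type: "block" },
      condition: { resourceTypes: ["script"] },
    };

    // Allow rules get higher priority, so they override the blanket block.
    function allowRule(id, domain) {
      return {
        id,
        priority: 2,
        action: { type: "allow" },
        condition: { resourceTypes: ["script"], requestDomains: [domain] },
      };
    }

    chrome.runtime.onInstalled.addListener(() => {
      chrome.declarativeNetRequest.updateDynamicRules({
        removeRuleIds: [1],              // no-op if the rule doesn't exist yet
        addRules: [BLOCK_ALL_SCRIPTS],
      });
    });

    // Hypothetical hook: whatever decides which domains to unblock (an agent,
    // a heuristic, or the user) calls this, e.g. starting with the page's own
    // domain and retrying until the page works.
    let nextId = 100;
    async function allowScriptsFrom(domains) {
      const rules = domains.map((d) => allowRule(nextId++, d));
      await chrome.declarativeNetRequest.updateDynamicRules({ addRules: rules });
    }

The hard part isn't the blocking, it's the feedback loop: something has to decide whether the page is "functional" after each change, which is exactly the judgment I was making by hand.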
I find that hard to believe. But even if you find something using an API not implemented by Firefox, chances are you definitely don't want that feature anyway; Firefox gave in to really awful stuff and only drew the line at the obviously egregious privacy-violating ones.
I definitely run into pages broken in Firefox desktop, and especially Firefox mobile. Extra especially on proofs of concept advertised here on HN.
Many see 'it works on Chrome and mobile Safari' as 'it works': they can get project signoff, ship, get paid, whatever, and they don't care about other users.
The company that owns the application may not know until a few users complain (if they complain at all), and by that point it may be too late because of the contract; or they may not understand what a different browser is, or care either.