Today, we're excited to introduce Gosub, a new open-source browser engine that we are building from the ground up in Rust!
Gosub aims to be a modern, modular, and highly flexible browser engine. While still in the early development and experimentation phase, Gosub is shaping up nicely, and we’re looking to onboard more contributors to help us bring this project to life.
Some of the key highlights:
* Written in Rust: We're leveraging Rust's safety and performance features to create a robust and efficient engine.
* Modular Design: The project is organized around modules, giving clean separation of concerns and easier collaboration. It also lets us swap components as needed, and gives engine implementers more freedom in the future.
* Collaborative and open source: We’re building Gosub with the intention of making it approachable and open to contributions, aiming to create a project that's easier to understand and collaborate on compared to existing browsers.
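To make the "swap components" idea concrete, here is a minimal sketch of how a modular engine can be made generic over its components using Rust traits. All names here (`HtmlParser`, `Engine`, `SimpleParser`) are hypothetical illustrations, not Gosub's actual API:

```rust
/// A parsing component: turns raw HTML into some document representation.
/// Any type implementing this trait can be plugged into the engine.
trait HtmlParser {
    fn parse(&self, input: &str) -> usize; // toy: returns a node count
}

/// A toy parser implementation standing in for a real one.
struct SimpleParser;

impl HtmlParser for SimpleParser {
    fn parse(&self, input: &str) -> usize {
        // Stand-in logic: count opening angle brackets.
        input.matches('<').count()
    }
}

/// The engine is generic over its parser, so an implementer can swap
/// in their own component without touching the rest of the engine.
struct Engine<P: HtmlParser> {
    parser: P,
}

impl<P: HtmlParser> Engine<P> {
    fn load(&self, html: &str) -> usize {
        self.parser.parse(html)
    }
}

fn main() {
    let engine = Engine { parser: SimpleParser };
    println!("{}", engine.load("<html><body></body></html>"));
}
```

The same pattern extends to layouters, renderers, and networking stacks: each is a trait boundary, and each implementation is a replaceable module.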
Instead of writing another shell around Chromium or WebKit, we decided to write a browser engine from scratch. We believe that a diverse landscape of engines is the only way to counter the monoculture that currently threatens browsers and, by extension, the internet itself. We cannot and should not let a very small number of large companies dictate the future of the web and its usage. With Gosub, we're aiming to build something more approachable that can evolve with the latest web technologies, all while being open to contributors from day one.
We’re looking for developers with or without experience in Rust. You just need to be interested in browser technologies. There are plenty of opportunities to work on core modules, document our progress, and help shape the project's direction.
We can already render simple pages, including the Hacker News front page. However, there is still a long journey ahead before we can render most sites correctly, so come and join us!
Some examples: https://www.reddit.com/r/browsers/comments/124kphe/what_do_y...
The Firefox engine is great if you don't want something controlled by Google. And it's going to be much easier to develop plugins or extensions for that versus trying to write your own browser from scratch, which, to be honest, will probably cost hundreds of millions of dollars, if not more.
I do imagine tools like this being useful for something like web scraping, but it's never going to be an end user product.
There are plenty of browser engines that have fallen by the wayside, often for reasons unrelated to the founders, but that particular list actually suggests a surprising amount of the opposite to me. For the first time since Chrome came out (which at least introduced V8, even if it started with WebKit), I have the feeling there are browser engines in development that can actually attempt to load the modern web, and that will have products for general use cases (instead of specialized ones) in the coming years.
I am reminded of this:
“I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones.”
– Linus Torvalds, announcing the software that became what we now know as Linux
Do we need more "fully compatible" engines? I could imagine there are use cases for browser engines that work with just parts of the specification, particularly the most common ones used in the wild.
Like an application platform (forget documents) built entirely on wasm, and with capability based security. That would let you launch apps made within the platform just as easily as you currently open a website.
The platform would need some primitives for rendering, UI, accessibility and input handling. But hopefully a lot of those APIs could be much lower level than the web provides today. Move all the high level abstractions into library code that developers link into their wasm bundles. (For example, I’m thinking about most of what css does today.)
That would allow much faster innovation in layout engines and other things the web does today, and a smaller api surface area should lead to better security.
It’s quite possible to build something like this today. It’s just a lot of work.
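The core of the capability-based model described above is that an app holds no ambient authority: it can only use resources it was explicitly handed at launch. A minimal sketch of the idea, with all names (`FileReadCap`, `run_app`) purely hypothetical:

```rust
/// A capability to read one specific (virtual) file.
/// Holding the value IS the permission; there is no global filesystem API.
struct FileReadCap {
    path: String,
}

impl FileReadCap {
    fn read(&self) -> String {
        // Toy stand-in for the host actually fetching the bytes.
        format!("contents of {}", self.path)
    }
}

/// An "app" receives only the capabilities the platform grants it.
/// It cannot touch the network, other files, or the screen unless
/// handed a capability for each of those, too.
fn run_app(file: FileReadCap) -> String {
    file.read()
}

fn main() {
    // The platform decides what to hand out at launch time.
    let cap = FileReadCap { path: "/docs/readme.txt".into() };
    println!("{}", run_app(cap));
}
```

In a wasm setting the same idea shows up as host functions scoped per instance: the sandbox only exposes the imports the platform chose to link in, so the security boundary is the set of capabilities granted, not a permission check scattered through the code.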
Maybe when chatgpt is a bit smarter, it might be able to do the lion’s share of the work to make this happen.
In the end you will have a platform / OS that loses to competitors on performance and lack of features, and do not expect it to be secure: developers will manage to leave some holes, and hackers will find their way in.
By tighter integration with the final product, the browser can provide specialized elements or APIs to simplify the actual application code.
I think it's used a few other places as well.
Mozilla had one a while ago: https://github.com/browserhtml/browserhtml . I'm sure it could be updated.
I switched back to Safari and it worked normally immediately.
Change any of those variables and you may have a winning proposition.
Even the old pre-tvOS Apple TV apps were kinda web apps - XML and JavaScript delivered over HTTP
The web's sandboxed security model makes it better for users. And that in turn drives popularity.
I think the same could be true for a good application platform. The trick is using the sandboxing + capability based security model to enable "new" usability features that traditional applications can never deliver.
It comes from things built for children to play in that have an edge with contents inside it. You play "in a sandbox" without having to deal with anything outside of the sandbox.
The sand in a children's sandbox spills over and gets everywhere. The children playing inside it don't have to care: they are playing inside the sandbox, and for now, the world outside it is not relevant.
My girlfriend has strong opinions about how we use the word "library" to describe a software package.
Surely the metaphor should be that one package is a "book" and the entire package repository is the library, right? Left pad is an entire library? Huh?
And then we have no collective noun for collections of libraries!
I've heard this a number of times; but circa 1991 I'm not convinced. The critical GNU components to scaffold Linux adoption were:
- GCC + C runtime
- A shell (plus a bunch of small utilities)
You could make a reasonable argument that Linux was and remains a much bigger achievement, not least because GNU has never actually managed to ship a usable kernel (Hurd still has no USB support, for example; I don't know whether that is for technical or political reasons). On the flip side, there were plenty of compilers and shells kicking around that Linux could have bundled.
RMS makes all sorts of wild claims about how much of a distro is "GNU software" (a criterion that is never clearly defined - is any non-kernel GPL software part of GNU?) by comparing lines of code; but at the time of these claims distros tended to bundle anything and everything that you might possibly want.
Contemporaneously to all this, 386BSD had created a full freely-distributable Unix with no GNU code; and its descendants might have captured the mindshare were it not for an extended copyright battle from AT&T.