For example, if I wanted to LoadFromFile() + Render() the building renderlet into a deferred rendering pipeline, would I be able to do that?
The graphics stack continues to be one of the biggest bottlenecks in portability. One day I realized that WebAssembly (Wasm) actually held the solution to the madness. It’s runnable anywhere, embeddable into anything, and performant enough for real-time graphics. So I quit my job and dove into the adventure of creating a portable, embeddable Wasm-based graphics framework from the ground up: high-level enough for app developers to easily make whatever graphics they want, and low-level enough to take full advantage of the GPU and everything else needed for a high-performance application.
I call it Renderlet to emphasize the embeddable aspect — you can make self-contained graphics modules that do just what you want, connect them together, and make them run on anything or in anything with trivial interop.
If you think of how Unity made it easy for devs to build cross-platform games, the idea is to do the same thing for all visual applications.
Somewhere along the way I got into YC as a solo founder (!) but mostly I’ve been heads-down building this thing for the last 6 months. It’s not quite ready for an open alpha release, but it’s close—close enough that I’m ready to write about it, show it off, and start getting feedback. This is the thing I dreamed of as an application developer, and I want to know what you think!
When Rive open-sourced their 2D vector engine and made a splash on HN a couple of weeks ago (https://news.ycombinator.com/item?id=39766893), I was intrigued. Rive’s renderer is built as a higher-level 2D API similar to SVG, whereas the Wander renderer (the open-source runtime part of Renderlet) exposes a lower-level 3D API over the GPU. Could Renderlet use its GPU backend to run the Rive Renderer library, enabling any 3D app to have a 2D vector backend? Yes, it can: I implemented it!
You can see it working here: https://vimeo.com/929416955 and there’s a deep technical dive here: https://github.com/renderlet/wander/wiki/Using-renderlet-wit.... The code for my runtime Wasm Renderer (a.k.a. Wander) is here: https://github.com/renderlet/wander.
I’ll come back and do a proper Show HN or Launch HN when the compiler is ready for anyone to use and I have the integration working on all platforms, but I hope this is interesting enough to take a look at now. I want to hear what you think of this!
The renderlet is a bundle of WebAssembly code that handles data flow for graphics objects. Input is just function parameters; output is serialized data written to a specific place in Wasm linear memory. With the Wasm Component Model, renderlets will be able to take and return much more complex types in the future.
LoadFromFile() - instantiates the Wasm module
Render() - runs the code in the module; wander uploads the output data to the GPU
Functions on the render tree - operate on the uploaded GPU data: bind a texture to a slot or call ID3D11DeviceContext::Draw, for example (rough sketch below)
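To make that concrete, here's a rough host-side sketch. Only LoadFromFile() and Render() are names from above; the types and everything else are illustrative stand-ins, not wander's actual C++ API (that lives in the repo):

    #include <cstdio>
    #include <string>

    // Illustrative stand-in for a node of the render tree.
    struct RenderNode {
        // In the real runtime these wrap GPU work, e.g. binding a texture
        // to a slot or forwarding to ID3D11DeviceContext::Draw.
        void BindTexture(int slot) { std::printf("bind texture -> slot %d\n", slot); }
        void Draw()                { std::printf("draw\n"); }
    };

    // Illustrative stand-in for a loaded renderlet.
    struct Renderlet {
        // Instantiates the Wasm module from disk.
        static Renderlet LoadFromFile(const std::string& path) {
            std::printf("instantiating %s\n", path.c_str());
            return {};
        }
        // Runs the module; the runtime reads the serialized output from
        // Wasm linear memory and uploads it to GPU buffers.
        RenderNode Render() { return {}; }
    };

    int main() {
        auto building = Renderlet::LoadFromFile("building.wasm"); // hypothetical file
        auto tree = building.Render();
        tree.BindTexture(0); // render-tree functions use the uploaded GPU data
        tree.Draw();
    }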
There's some nuance about shading. In the current version, the host app is still responsible for attaching a shader, so there should be no issue using the data in a deferred shading pipeline. In the future, the renderlet needs to be able to attach its own shaders, in which case it would have to be configured to use a host app's deferred shading pipeline. I think it is possible, but complicated, to build an API for this, where the host and then the renderlet are both involved in a lighting pass.
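Here's a sketch of that current-version flow, D3D11-flavored, reusing the same stand-in RenderNode type as above. Only the idea that the host owns the shaders comes from how wander works today; the pass structure is just standard deferred rendering:

    #include <d3d11.h>

    // Same illustrative stand-in as the sketch above.
    struct RenderNode { void BindTexture(int) {} void Draw() {} };

    // The host drives the G-buffer pass with its own shaders; the
    // renderlet's geometry is just another mesh in that pass.
    void GeometryPass(ID3D11DeviceContext* ctx,
                      ID3D11VertexShader* gbufferVS,
                      ID3D11PixelShader* gbufferPS,
                      ID3D11RenderTargetView* const* gbufferRTVs, UINT rtvCount,
                      ID3D11DepthStencilView* depth,
                      RenderNode& tree)
    {
        ctx->VSSetShader(gbufferVS, nullptr, 0); // host-owned G-buffer shaders
        ctx->PSSetShader(gbufferPS, nullptr, 0);
        ctx->OMSetRenderTargets(rtvCount, gbufferRTVs, depth);

        tree.BindTexture(0); // renderlet-uploaded texture into slot 0
        tree.Draw();         // renderlet geometry lands in the G-buffer

        // The later lighting pass reads the G-buffer as usual; the renderlet
        // never needs to know deferred shading is in play.
    }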
Of course, if all shading is handled within the renderlet, that entirely sidesteps the concept of deferred shading, and this becomes an easier problem to solve.