
579 points by seanisom | 2 comments

I used to work at Adobe on the infrastructure powering big applications like Photoshop and Acrobat. One of our worst headaches was making these really powerful codebases work on desktop, web, mobile, and the cloud without completely rewriting them. For example, to get Lightroom and Photoshop working on the web we took a winding path through JavaScript, Google’s PNaCl, asm.js, and finally WebAssembly, all while rethinking our GPU architecture for each platform. We even had to get single-threaded builds working and rebuild the UI around Web Components. Today the web builds work great, but it was a decade-long journey to get there!

The graphics stack continues to be one of the biggest bottlenecks in portability. One day I realized that WebAssembly (Wasm) actually held the solution to the madness. It’s runnable anywhere, embeddable into anything, and performant enough for real-time graphics. So I quit my job and dove into the adventure of creating a portable, embeddable Wasm-based graphics framework from the ground up: high-level enough for app developers to easily make whatever graphics they want, and low-level enough to take full advantage of the GPU and everything else needed for a high-performance application.

I call it Renderlet to emphasize the embeddable aspect — you can make self-contained graphics modules that do just what you want, connect them together, and make them run on anything or in anything with trivial interop.
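To make “embeddable” concrete, here’s a rough sketch of what the guest side of a renderlet could look like. This is a sketch only: the `render` export and the raw-vertex-buffer convention below are assumptions for illustration, not wander’s actual ABI.

```cpp
// Guest side: a self-contained graphics module compiled to Wasm.
// Everything below is illustrative; the real renderlet ABI may differ.
#include <cmath>
#include <cstdint>

static float vertices[9];  // one triangle: 3 vertices * (x, y, z)

// Hypothetical per-frame entry point. The host passes in state (a timer)
// and uploads the returned geometry to the GPU however it likes.
extern "C" float* render(float time_s, uint32_t* out_vertex_count) {
    for (int i = 0; i < 3; ++i) {
        float a = time_s + i * 2.094395f;  // vertices 120 degrees apart
        vertices[i * 3 + 0] = std::cos(a) * 0.5f;  // x
        vertices[i * 3 + 1] = std::sin(a) * 0.5f;  // y
        vertices[i * 3 + 2] = 0.0f;                // z
    }
    *out_vertex_count = 3;
    return vertices;  // host decides how to draw this
}
```

Because a module like this only emits data and never owns a window or an event loop, the same binary can be embedded in a native app, a browser, or anything else that hosts the runtime.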

If you think of how Unity made it easy for devs to build cross-platform games, the idea is to do the same thing for all visual applications.

Somewhere along the way I got into YC as a solo founder (!) but mostly I’ve been heads-down building this thing for the last 6 months. It’s not quite ready for an open alpha release, but it’s close—close enough that I’m ready to write about it, show it off, and start getting feedback. This is the thing I dreamed of as an application developer, and I want to know what you think!

When Rive open-sourced their 2D vector engine and made a splash on HN a couple of weeks ago (https://news.ycombinator.com/item?id=39766893), I was intrigued. Rive’s renderer is built as a higher-level 2D API similar to SVG, whereas the Wander renderer (the open-source runtime part of Renderlet) exposes a lower-level 3D API over the GPU. Could Renderlet use its GPU backend to run the Rive Renderer library, enabling any 3D app to have a 2D vector backend? Yes, it can: I implemented it!

You can see it working here: https://vimeo.com/929416955 and there’s a deep technical dive here: https://github.com/renderlet/wander/wiki/Using-renderlet-wit.... The code for my runtime Wasm Renderer (a.k.a. Wander) is here: https://github.com/renderlet/wander.

I’ll come back and do a proper Show HN or Launch HN when the compiler is ready for anyone to use and I have the integration working on all platforms, but I hope this is interesting enough to take a look at now. I want to hear what you think of this!

zengid:
This is super neat and I am very interested!

I'm in a rush so I can't look too closely now, but I have a few questions (and please forgive any stupid questions, I'm not a graphics dev, just a hobbyist):

What's the runtime like? Is there an event loop driving the rendering? (Who calls `render` on each frame? Are there hooks into that?) What's the FFI story? Who owns the window pointer?

I'm interested in audio plugins, and VSTs (etc.) have a lot of constraints on what can be done around event loops and window management. JUCE is pretty much the de facto solution there, but it's pretty old and feels crufty.

seanisom:
Great questions!

The host app owns the event loop. I don't foresee that changing even once we re-architect around WebGPU (allowing the Wasm guest to control shaders), as the host app is responsible for "driving" the render tree, including passing in state (like a timer used for animations). The host app owns the window pointer, as renderlets are always designed to be hosted in an environment (either an app or a browser). Open to feedback on this, though.
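Roughly, the division of labor looks like this. A hedged sketch only; the names (`wander::Runtime`, `render_frame`) are placeholders rather than the real wander API:

```cpp
// Host side: owns the window and the event loop, ticks the runtime.
// All names here are placeholders, not the actual wander interface.
#include <chrono>
#include <cstdio>

namespace wander {
struct Runtime {
    void* window_handle;  // host-owned native window (e.g. HWND, NSWindow*)
    int load(const char* wasm_path) {
        std::printf("loaded %s\n", wasm_path);
        return 0;  // module handle
    }
    void render_frame(int module_id, float time_s) {
        // A real runtime would call the module's exported render(),
        // take the returned vertices, and issue GPU draw calls here.
        std::printf("module %d frame at t=%.3f\n", module_id, time_s);
    }
};
}  // namespace wander

int main() {
    void* native_window = nullptr;  // created and owned by the host app
    wander::Runtime rt{native_window};
    int tri = rt.load("spinning_triangle.wasm");

    auto start = std::chrono::steady_clock::now();
    for (int frame = 0; frame < 3; ++frame) {  // stand-in for the app's event loop
        std::chrono::duration<float> t = std::chrono::steady_clock::now() - start;
        rt.render_frame(tri, t.count());  // host passes the animation timer in
    }
}
```

The point is that the runtime never spins its own loop: the host ticks it at whatever cadence and with whatever state it wants, which is exactly what makes constrained environments like plugin hosts feasible.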

FFI is coming with the C API soon!

I don't know much about audio, but I see a ton of parallels: well-defined data flow across a set of components running arbitrary code in real time. Simulations also come to mind.

zengid:
Thank you for the reply! I'm excited to watch as this project progresses, and I wish you the best of luck!