
202 points Towaway69 | 5 comments

Hi There,

Erlang-RED has been my project for the last couple of months and I would love to get some feedback from the HN community.

The idea is to take advantage of Erlang's message passing and low-overhead processes to get true concurrency in Node-RED flows, and also to bring low-code, visual, flow-based programming to Erlang.
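To illustrate the idea, here is a minimal sketch (my own, not taken from the Erlang-RED codebase) of how a flow node could map onto an Erlang process: each hypothetical node is spawned as its own lightweight process and forwards messages downstream via send/receive, so every node runs concurrently.

```erlang
%% Minimal sketch: each "flow node" is a process that receives a
%% message, transforms the payload, and forwards it downstream.
-module(flow_sketch).
-export([run/0, node_proc/1]).

%% A hypothetical node: doubles the payload and forwards it.
node_proc(Downstream) ->
    receive
        {msg, Payload} ->
            Downstream ! {msg, Payload * 2},
            node_proc(Downstream)
    end.

run() ->
    Self = self(),
    %% Wire two nodes in series; the sink sends the result back to us.
    Sink  = spawn(fun() -> receive M -> Self ! M end end),
    NodeB = spawn(?MODULE, node_proc, [Sink]),
    NodeA = spawn(?MODULE, node_proc, [NodeB]),
    NodeA ! {msg, 5},
    receive
        {msg, Result} -> Result   %% 5 doubled twice -> 20
    end.
```

Because each node blocks only in its own `receive`, many messages can be in flight through different parts of a flow at once, which is the concurrency property the project description refers to.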

1. mystraline ◴[] No.44007580[source]
I was an early adopter of NodeRed. Earlier, it worked exceptionally well.

Now? Not so much.

Well, that's not exactly true. Base NodeRed works as well as before. But the libraries of modules to interface with all sorts of websites, APIs, hardware, and other stuff are rotten to the core.

Most plugins/JS modules either just don't work or "successfully fail". The simpler failure case is where the module can't even be installed because it depends on ancient (six months or older, sigh) JS modules.

I abandoned NR because it's basically hyper-speed bitrot due to terrible library module versioning. And I didn't want to reinvent the wheel on every system I wanted to touch.

replies(4): >>44007710 #>>44007764 #>>44008976 #>>44009882 #
2. wslh ◴[] No.44007710[source]
I don't think this is easy to solve in general. Similar orchestrators (e.g. n8n) have the same issues, because there are a lot of component dependencies that change over time and there is no real cohesion between the core and all kinds of plugins. Perhaps a future "contracts/manifests" scheme linking orchestrators with components could help.
3. Towaway69 ◴[] No.44007764[source]
I wrote an article about this[1] and you are definitely right: a lot of packages are rotting away. However, because NodeRED does a lot to stay backward compatible, older packages still work. The oldest ones I'm actively using are five or more years old, i.e., they haven't been touched nor updated, and they still work.

What I tried to say in the article is the same as what you're saying: base NodeRED, the core, works really well and has great features, no question. And even if packages die, the core will remain usable, and that makes a difference.

It's a bit like saying Linux is a pile of broken bits because the ls command hasn't been updated in ten years: Linux will keep working, and commands that are based on the core will continue to work, because the Linux kernel largely remains backward compatible. Packages fail when they have external dependencies that change, but code that relies solely on the core will continue to work.

[1] https://blog.openmindmap.org/blog/crunchy-numbers

4. 01100011 ◴[] No.44008976[source]
Not surprising. Whenever you have a project like NR or HA, you end up with a ton of barely working glue code written by people who just want to get it working and move on, without any sort of commitment to maintenance. It allows these projects to rapidly expand their support, but then that support quickly rots unless the core team assumes responsibility for maintenance. I really want to mess with home automation again, but this sort of low software quality, and the resulting instability and maintenance hassles, make it not worth the effort.
5. ◴[] No.44009882[source]