But mine is still working locally now. If it stops working locally, what even is the point anymore?
It used to be considered vile that drug dealers tried to hook their users and force dependence... turns out they were just ahead of the curve.
I hate it. I don't use it myself, but when I have to share API stuff I end up using it because that's what other people understand.
Good for postman business, bad for everyone.
It brought to mind this quote:
“It’s only software developers and drug dealers who call people users,”
From a recent article that came through the feed:
https://www.theguardian.com/technology/2025/oct/18/are-we-li...
I haven't used postman or insomnia in a while since they went to the cloud, so I could just be missing it, but that's also a non-starter for me.
[0]: https://www.jetbrains.com/help/idea/http-client-in-product-c...
[1]: https://learn.microsoft.com/en-us/aspnet/core/test/http-file...
[2]: https://marketplace.visualstudio.com/items?itemName=humao.re...
Edit: Ah, so here it is: https://posting.sh
Sincere question, been studying lots of OSS commercial licensing and always wonder what works in which context
How would someone use this in a project that operates within VS Code Remote, where the source sits on a remote server and isn't physically on the local file system?
https://web.archive.org/web/20140604204111/http://www.getpos...
It's great. You can even paste a curl command into it and it will automatically convert and format it. You can then use the Copy button to convert your changes back to curl.
Yes, it's a good-faith license. The license doesn't even apply to the OSS version (only prebuilt binaries).
The bet is that super fans will pay for it in the early days and, as it gets adopted by larger companies, they will pay in order to comply with the legalities of commercial use. So far, it's working! The largest company so far is 34 seats, with a couple more in the pipe!
I'm not quite sure why Yaak wouldn't work in this case. Is it because your running server wouldn't be accessible to Yaak, which is running on your system?
Lately I've just been using a Phoenix LiveBook notebook with the Req package loaded into it. I can make requests, do arbitrary transforms on the data, and generally stay right at home in a language I like and understand.
If you don't know Elixir, I'm sure Jupyter or some other notebook system would do just as nice a job.
It makes good sense because companies actually have an absurd amount of liability to you if they violate your agreement.
https://github.com/jamierpond/yapi
Run this:
yapi -c ./users.yapi.yaml
With this file:
# users.yapi.yaml
# yaml-language-server: $schema=https://pond.audio/yapi/schema
url: http://localhost:3000
method: GET
path: /api/users
query:
  select[name]: true
  select[tag]: true
  limit: 10
Or just `yapi` to use fzf to find configs.
When you use a remote, the code is on the remote and all your editing functions (search, version control, terminal, extensions) happen in the remote via a worker process.
So in a remote session, everything is “local” to the remote. You may have no file “mount” of the thing at all on your host desktop machine. If you do a git commit, it’s running inside/on the remote. If you do a file search the files are searched on the remote, rather than downloading them over some network filesystem and searching locally.
The GP’s point is, I think: if you implemented Yaak as a VSCode extension, it could be made to function either in a local session or inside a remote (on a server accessed via SSH, a docker container, on the linux side of WSL etc.) and therefore have fast rather than slow access to the code, git repo etc.
I do essentially all my dev work (apart from compiling the odd mac app) inside remotes of various kinds to create reproducible environments, avoid cluttering the host, sandbox the tools, give me freedom to work from more than one machine etc., and I run into this sort of thing quite a bit.
There are at least two clients like this for VSCode, Thunder Client and EchoAPI, and I believe both function in a remote session.
P.S. I loved Insomnia before the bad happened; it really helped with learning APIs. Thanks.
You can be an Oracle and audit your customers and develop that adversarial relationship. The idea is that that sort of thing makes you rot in the long run.
Posting.sh -> Postman imports are experimental which makes it a non-starter for people like myself with large Postman collections. TUI only also makes it harder to switch.
Insomnia -> Owned by another large tech company.
Yaak -> Made by the same guy who created AND SOLD Insomnia above. Not exactly comforting to switch over for. How long till this one also gets sold?
Any other great local tools out there? I would like to be done with Postman.
« Offline only - We take security and privacy seriously. Bruno is an offline tool and there is no syncing of your data to any cloud »
And the solo dev has a better product already and might actually win haha.
Underdog story.
Rooting for you!
Wow, in a world dominated by gigabytes of Electron applications, people think 10 MB is the optimal size for a simple utility TUI app.
As a reference (from the Arch Linux repos): vim's install package is 2.3 MB, curl is 1.2 MB, and lua (the complete language interpreter) is 362 KB.
Let's see how long it takes for one of these programs to break the cycle.
#!/bin/bash -x
# Tiny "API client": one case branch per saved request.
TOK="my-jwt-tok"
case "$1" in
  get-foo)
    curl -H "Authorization: Bearer $TOK" "http://www.example.com/foo" | jq .
    ;;
  post-bar)
    curl \
      -H "Authorization: Bearer $TOK" "http://www.example.com/bar" \
      -H "content-type: application/json" \
      --data-raw '{"baz":"bap"}'
    ;;
  *)
    echo "usage: $0 {get-foo|post-bar}" >&2
    ;;
esac
used like: ./example.bash get-foo
I know it doesn't have the functionality of Postman, but this is how I build up interactions with a new API.
Sign away means getting rid of vcs.
https://github.com/postmanlabs/postman-app-support/issues/69...
So, I said fuck it and switched to a real, open source alternative, Insomnia, instead:
These API clients aren't rocket science; the barrier to entry is very low.
I loved using their free desktop app for building API documentation, which could be used for scaffolding/generating APIs, but for some reason I don't remember right now, I had to stop using it.
- requests chaining,
- capturing and passing data from a response to another request,
- response tests (JSONPath, XPath, SSL certs, redirects etc...)
There is nice syntax sugar for requesting REST/SOAP/GraphQL APIs but, at the core, it's just libcurl! You can export your files to curl commands, for instance. (I'm one of the maintainers.)
[1]: https://hurl.dev
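For anyone curious what the chaining and capture features actually save you: roughly this manual curl + jq dance (made-up endpoints, purely illustrative, not Hurl syntax):
# "Capture" a value from one response...
token=$(curl -sS -X POST https://example.org/api/login \
  -H 'content-type: application/json' \
  --data '{"user":"bob","password":"secret"}' | jq -r '.token')
# ...pass it to the next request, then assert on the response body:
curl -sS https://example.org/api/users -H "Authorization: Bearer $token" \
  | jq -e '.count == 10' >/dev/null && echo "assert ok"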
There are SO many alternatives. It’s curl UI wrapper with secrets* management! Why do we all need enterprise licenses??
*and the secrets were all exposed in logs!!
It will never disappear, enshittify, or let you down. It's already modern, and has a great UI. It's available everywhere. It supports every protocol and feature under the sun. Those fancy features you think you need: you don't. Whatever you're missing can be easily added via simple shell scripts or aliases.
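For example, a throwaway bash wrapper (hypothetical base URL and token variable) covers most of what I'd otherwise click around in a GUI for:
# Wrap curl with a saved base URL and auth header; extra args pass straight through.
api() {
  curl -sS -H "Authorization: Bearer $API_TOKEN" "https://api.example.com$1" "${@:2}"
}
# usage: api /users | jq .
# usage: api /users -X POST -H 'content-type: application/json' --data '{"name":"Ada"}'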
Then a VC fund gives these developers a dumptruck full of money and expects returns immediately afterward. Something like Postman likely doesn't make a ton of money unless they're doing something anti-consumer like selling data.
Handy Dandy Notebook as well, but that requires some reformulation to get everything in terms of standard curl/node/python/etc commands. (whether that’s better or worse than a custom http dsl is a matter of some debate)
Is it basically "an IDE for playing with APIs"?
Is it only for HTTP-based APIs?
Does it come with canned functionality for popular services out there?
I read this but still don't feel like I fully understand what it does https://www.postman.com/api-platform/api-client/
EDIT: This blog post https://schier.co/blog/call-for-beta-testers made it click a little better: "...app that makes testing REST APIs super easy"
Postman founder here. I did not time this with an AWS outage of this magnitude but I posted about filesystem, git, and offline support coming to Postman last week: https://x.com/a85/status/1978979495836356819?s=46
Postman has a lot of capabilities now that require the cloud but there is still an offline client built in just for requests.
Building sign-in and cloud features were not due to a VC-led conspiracy. A large number of companies depend on APIs (like AWS) and have thousands of services and APIs. Customers need to manage them and wanted us to build it.
Take the Micro editor. It's written in Go, and packs a fair bit of functionality into a single 12 meg static binary (of which a few megs is probably the runtime.)
Kreya has been privacy-first since its first commit five years ago; we built it because we were fed up with Postman and Insomnia. Happy to answer any questions!
One idea: since you are doing good-faith licenses anyway, maybe you could add in the possibility to pay for some kind of one-time license? I don't particularly need or want updates from my API tool, I just want it to work and not break. I would be fine with paying a one time commercial license that gives non expiring right to use a particular version.
https://anonymousdata.medium.com/postman-is-logging-all-your...
https://blog.postman.com/engineering/postman-free-is-secure-...
Postman allows for turning off history, keeping variables local, setting up a local vault all in the free product and in more advanced plans, there are secret scanning capabilities for IT and security teams.
https://blog.postman.com/choose-the-right-postman-plan-for-y...
These issues are not unique to Postman and apply to all cloud products, GitHub for instance. Products that are “offline” just shift the burden to the user.
Other than that, it's the same old curl.
There was a release in September, issues have been resolved within the last month, and multiple pull requests have been handled (merged and rejected) recently as well.
Maybe you're referring to issues specific to a platform? Thanks in advance.
I guess a substitution would be a git repo with curl scripts and environment variables?
We also have some non-tech people who use postman to run tests.
Edit: oh my, you also made Insomnia, which I used when Postman was on the enshittification path...
Now I just have a Makefile with a bunch of curl invocations, or Python tests with requests to match against expected responses.
I get the whining, but teams need ways to share their complex workflows, and teams are where the money is for all dev focused software.
That's who pays for all your tools to have free versions.
People who use make and curl to jury rig some unshareable solution together that no-one else in their company would even bother trying to use aren't worth any money to companies.
Disclaimer: I maintain it.
???
Mash 'em, boil 'em, put 'em in git, next to your code?
I was thinking back to running X sessions on remote machines, sending for example a text editor view back across the network to my desktop.
VSCode remote feels to my fiftysomething brain to be logically quite like that, only you are sending the display back from the remote worker using web techniques, and rather than to a display manager, you are sending it back into the shell of an editor, so it appears to be largely indistinguishable from a session running on your local machine.
Teams that are knowledgeable jury rig their own custom solutions without all the enterprise cruft. They make solutions that fix their problem and they do it faster than the teams who use bloated enterprise solutions.
I am tired of seeing over-engineered enterprise solutions that are implemented and never used because they can't be integrated into the dev workflow easily. A simple bash script that does the task it was designed to do beats any enterprise crap.
But to your question - I have saved authenticated requests to our company's useful APIs - GitHub/Jira/Artifactory - so when I want to string together some micro tool to do something in batch, I don't have to remember where to create an API key or how they accept it.
Each one has its strengths and weaknesses. Insomnia can export the saved requests as `curl` commands, so it's a nice visualisation to iterate over a complex call until it's working, which can then be exported if it needs to be automated. `curl` is quite ubiquitous but clunky when it comes to remembering the exact arguments I might need. HTTPie has a nice argument syntax, so it's readable enough to be quickly edited, but it isn't present without installing Python and pip and pulling it in.
Those of us who can survive without desperate monetization plays are worth quite a lot, actually. They say 'jury rig', we say 'engineer'.
Also, I am not counting on Insomnia not following in Postman's footsteps.
Personal feelings aside, snap is working fine. It's maybe a bit slow on startup, but that's it.
There are several FOSS command line tools that can do this easier, e.g. https://httpie.io/cli
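For example, a couple of rough HTTPie invocations (placeholder host and token):
# Query params use ==, JSON body fields use = (string) or := (raw JSON), headers use Name:value
http GET https://api.example.com/users limit==10 Authorization:"Bearer $TOK"
http POST https://api.example.com/users name=Ada admin:=true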
Git is pretty good at sharing, you know.
Complacent corporate teams. Agile teams need to simplify their workflows, and know that a Makefile can be better than some closed down, Cloud-first tool.
>That's who pays for all your tools to have free versions
Nah, we have free versions for stuff without enterprise editions too.
>People who use make and curl to jury rig some unshareable solution together that no-one else in their company would even bother trying
It's that "no-one else" that doesn't bring value.
Anyways, the folks have spoken, no need to double down. There are more than a dozen alternatives to it, and new ones are coming up.
I'm helping build a new one.
- Completely offline.
- Gives the ability to build reusable blocks (headers, query params, etc)
- Lets you document everything in Markdown.
- Imports your collections and cURLs.
And:
The world does not need more than 4 computers.
-Ken Olsen, or someone (in the mainframe days).
(Both are alleged / apocryphal quotes. :)
Start up a dev server on the remote machine and forward the port to localhost. It should now be accessible via http://localhost:[port] on your local machine, in the browser or any application, as if it were running locally.
I find it's also very useful for interacting with DBs/Redis. Just forward the port your DB communicates on and use whatever client on your local machine to interact with it.
As far as I know, this works with any service that communicates via TCP.
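A minimal sketch, assuming the dev server listens on port 3000 on the remote and it's reachable over SSH (placeholder host):
# Forward local port 3000 to port 3000 on the remote; -N skips running a remote shell.
ssh -N -L 3000:localhost:3000 user@remote-host
# Same idea for a database, e.g. Postgres on 5432:
ssh -N -L 5432:localhost:5432 user@remote-host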
It would be so much faster and easier for the postman reps to just shut down the conversation. And yet, for some reason, they keep it going for very long while still being extremely evasive when it comes to any concern raised about data sovereignty.
On top of that, Canonical is pushing snap very hard. Try to uninstall the Firefox snap - e.g. because it doesn’t integrate well with password managers - and install it using apt from the Mozilla repo. Ubuntu will later just silently replace it back with the snap version.
I’m about to switch away from Ubuntu as a result of this.
The CEO committed to open-sourcing it, as well as to not monetize on anything that doesn't introduce operational costs to the team.
Apps like Postman are the wrong tool for this purpose.
If you want to share workflows, let alone complex workflows, any automated test suite is far better suited for this purpose.
We are in the age of LLMs and coding agents, which make BDD-style test frameworks even more relevant, as they allow developers to implement the workflows, verify they work, and leave behind an enforceable and verifiable human-readable description of those workflows.
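As a minimal sketch of the idea (a bash stand-in for a proper test framework, with hypothetical endpoints), a workflow captured this way is something anyone, human or agent, can run and verify:
#!/bin/bash
# Scenario: creating a user and fetching it back returns the same name.
set -euo pipefail
BASE="http://localhost:3000"
id=$(curl -sS -X POST "$BASE/api/users" \
  -H 'content-type: application/json' \
  --data '{"name":"Ada"}' | jq -r '.id')
name=$(curl -sS "$BASE/api/users/$id" | jq -r '.name')
[ "$name" = "Ada" ] && echo "PASS: create/fetch user" || { echo "FAIL"; exit 1; }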
Bash and Perl scripts run, truly, everywhere - so you get real collaboration. I can share it with anyone on my team and they can use it.
Finally converted a bunch of stuff that was just lying in my shell history into actual, actionable files :D
Won't immediately help with giving that tool access to the file system or Git.
For a local VM, you can have file system mounts, fast enough for Git.
SSHFS could help in some genuinely distant remotes with access to the file system (though SSHFS in the context of multiple file writers is fraught with risk of file corruption; been there, bought that T-shirt).
With properly network-remote VMs, nothing helps that much with giving the tool access to performing Git operations on changes inside your remote: Git is slow over network file systems even when they are quite local.
This is the real power of the VS Code remote after all; everything happens there.
https://freakonomics.com/2008/04/our-daily-bleg-did-ibm-real...
Gates denies ever saying anything like the 640K quote, but it was possibly someone at Microsoft being salty about the 640K limit that IBM had imposed on the PC through its design.
What you're saying doesn’t sound familiar whatsoever, but I'd really like to look more into it.
Connecting to a license server is pretty much standard.
For Postman it is annoying because it was never explicitly stated, and it seemed like they were the cool kids making a nice, helpful app. But really they are in it for the money. Which is not bad, I have to say, but the way they did it is bad.
For recreational internet use, I use yy025 + TCP client + TLS proxy. No fees, no telemetry, no BS. I can select from a long list of TCP clients in this setup, e.g., original netcat, tcploop, tcpclient, socat, etc., as well as a variety of TLS proxies, e.g., tinyproxy+stunnel, haproxy, etc. I can modify the source of all the programs and can do more than is possible if using an "HTTP client", e.g., curl
Of course I am not testing "web apps" for a commercial enterprise. Nor do I use a so-called "modern", graphical browser. I retrieve data without a browser. Since I use HTTP every day in textmode it is "interesting"^1 to see software that somehow commercialises similar HTTP use, e.g., Postman, Burp, etc.
1. For example, https://pitchbook.com/news/articles/postman-valuation-2-bill...
Thanks!
They have a lot of inertia, but that's it. If you're in greenfield development, there is a close to 0% chance you will choose Oracle as your RDBMS.
Um, oops.
* The text is tiny for my old eyes. I figured there's probably a setting for it and hit Cmd-, and found there's no settings UI whatsoever. No keyboard shortcut either it seems, and no help menu either, so no searching for "keyboard" with Shift-Cmd-/
* .void files may be markdown, but no markdown editor will recognize it as such. Maybe support .void.md as well. I also couldn't find any way to edit the markdown source of a .void file from voiden, which is a bit ironic for a tool that loudly advertises the markdown format as a central feature.
* Could there be a block that expands into the full URL of the request and parameters above it (or perhaps as args)? How about another that renders as a cURL command, which would cover POST/PUT/PATCH requests nicely too. My API documentation always has cURL request examples and I detest writing them by hand.
* While I'm suggesting blocks, one that renders the response headers/body to the preceding request would also be handy. It should support a placeholder response that gets replaced when the request is actually run (and perhaps a "save" button to persist it in the markdown). Responses get long, so maybe have a max-height for the block and make it scrollable
I'm actually really excited about Voiden and hope these can be addressed. It has a similar feel to Jetbrains' .http format, but an evolutionary leap beyond it. It also feels really raw and unfinished.
But the UX is just terrible (for me) or at least has been every time I've tried to start using it more.
I mean, come on, the most basic use of creating a request goes like this:
1. Sorry, can't let you create a request before you create a collection.
2. Sorry, can't let you create a collection before you make a decision on in which path you will be storing all your collections.
3. Sorry, can't let you create a request before you think of a good name for it.
etc.
Like what the heck? This should be just one click to create a new untitled request then fill in the URL and send.
At first I thought this might be growing pains since it was new but every year I try it and it hasn't improved.
That being said, it would be awesome to have something inside Yaak where I could test API endpoints, like integration tests for APIs.
It might have changed now, but it did not support gRPC endpoints, which was a dealbreaker for me. But I definitely appreciate the project and hope it reaches core feature parity soon.
But that's A) me personally and B) me in cloud/startup-type companies, so of course we don't go with Oracle.
But like you mentioned, inertia. So my previous gigs that were large multi-national of course were all Oracle. And they were all huge and had zero reason to not just buy the Oracle tax. Which is why Oracle is going strong.
Despite all the rage, Oracle can still survive quite some time on running boring things like, I don't know, many large banks and other boring old businesses. Which of those is really gonna go "AWS Aurora MySQL" when they have had an in-house "Oracle Exadata" running their entire business operation "just fine" for longer than those cloud providers have even been around?
Fully local and no hidden proxy. https://github.com/chapar-rest/chapar
I didn't know you created Yaak!
I just downloaded Yaak and it's been awesome, thank you!
I downloaded this through AUR on Arch and one bit of feedback is that I wish you'd make the sig verification a whole bunch easier, thanks!