
4 points liulanggoukk | 3 comments

I just learned an expensive lesson and wanted to share it here so others don’t make the same mistake.

I recently lost $300 because of an API key leak. It started with a surprise $200 charge from Google Cloud, and when I looked into it, I found another $100 charge from the day before. Both were for Gemini API usage that I never intentionally set up.

After digging, I discovered the issue: I had hard-coded an API key in a script that was part of a feature I ended up deprecating. The file was only in the codebase for two days, but that was enough for the key to leak. Google actually sent me alerts about unusual activity, but I missed them because they went to a less-frequently-checked email account.

Here’s what I learned:

Never hardcode API keys - Use environment variables or a .env file, even for temporary code.

Set up billing alerts - Google Cloud (and other providers) let you set up alerts for unexpected charges.

Check all linked emails - Don’t ignore notifications, even if they’re sent to secondary accounts.

Don’t rely solely on GitHub’s secret scanning - It’s useful, but renaming variables can bypass it.
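
The first tip can be sketched as follows, assuming a hypothetical `GEMINI_API_KEY` variable name (the variable name and error message are illustrative, not from the original post):

```python
import os

def get_api_key() -> str:
    """Read the API key from the environment instead of hard-coding it."""
    key = os.environ.get("GEMINI_API_KEY")  # hypothetical variable name
    if not key:
        # Fail loudly rather than silently running without a key.
        raise RuntimeError("GEMINI_API_KEY is not set; refusing to run.")
    return key
```

If you load the variable from a `.env` file during local development, make sure the `.env` file itself is listed in `.gitignore` so it never reaches the repository.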

This happened while I was experimenting with "vibe coding" (letting AI generate code quickly), but I realized too late that human oversight is still crucial, especially for security.

Hope this helps someone avoid the same costly mistake!

TL;DR: Hard-coded an API key in a deprecated script, key leaked, and I got charged $300. Always use environment variables and set up billing alerts!

giveita ◴[] No.45248495[source]
I hate API keys. We need to get rid of them. Everyone who can influence this ... please do.

The alternative? JWTs or the like. Authenticate each session with zero trust.

At big corp work everything is Okta / JWT / Yubikey etc. Very very occasionally an API key.

replies(1): >>45254524 #
1. scarface_74 ◴[] No.45254524[source]
So exactly how would you suggest using a YubiKey in a script that runs automatically and is meant to run unsupervised?

Wouldn’t it be logical to assume that Google already knows about zero trust? The problem wasn’t the API key; the problem was that the poster didn’t use best practices - see my other comment.

Even if there weren’t built-in facilities like the three or four ways to authenticate with GCP or AWS programmatically, and you did have to use long-lived API keys, you could still piggyback on the cloud provider access I mentioned and read from a secure cloud-hosted vault using temporary keys from your script.

In the case of AWS, read your third-party API key from Secrets Manager, and authenticate to Secrets Manager using the keys in your home directory - or better yet, the short-lived keys in your environment variables, not a local environment file that you will probably forget to add to .gitignore.
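
A minimal sketch of that pattern with boto3, assuming the secret is stored as JSON with a hypothetical `api_key` field (the `client` parameter is injectable so the function can be exercised without real AWS credentials):

```python
import json

def load_third_party_key(secret_id: str, client=None) -> str:
    """Fetch a third-party API key from AWS Secrets Manager.

    The SDK resolves AWS credentials via its usual chain (environment
    variables, ~/.aws, or an attached role), so no long-lived key ever
    appears in the code or the repository.
    """
    if client is None:
        import boto3  # only needed when no client is supplied
        client = boto3.client("secretsmanager")
    resp = client.get_secret_value(SecretId=secret_id)
    return json.loads(resp["SecretString"])["api_key"]  # hypothetical field
```

The secret id (e.g. `"prod/gemini"`) lives in configuration, and rotating the key becomes a Secrets Manager operation rather than a code change.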

replies(1): >>45255530 #
2. giveita ◴[] No.45255530[source]
Ideally an unattended script (e.g. CI/CD) is authenticated via a session initially. Yes, under the hood a secret is stored, and you could argue it’s morally an API key. However, the UX wouldn’t be: developer logs in, copies a key to their clipboard, then pastes it (hopefully) into the CI secrets section - but also, likely, into the code.
replies(1): >>45255686 #
3. scarface_74 ◴[] No.45255686[source]
I know AWS better, but from what I’ve read GCP is similar: best practice is to authenticate to a web page via SSO and get temporary access keys that you assign to environment variables. The SDK automatically knows how to read them from the environment locally.
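
The environment variables in question are the standard ones the AWS SDKs read automatically; a small pre-flight check might look like this (the check itself is an illustrative sketch, not an AWS API):

```python
import os

# Standard variable names that AWS SDKs read automatically.
AWS_ENV_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN")

def have_temporary_credentials() -> bool:
    """True when all three short-lived credential variables are set."""
    return all(os.environ.get(name) for name in AWS_ENV_VARS)
```

The presence of `AWS_SESSION_TOKEN` is the tell that the credentials are temporary ones from SSO or STS rather than long-lived keys.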

When you run your code on the cloud platform, you attach privileges to the run time environment (VM, Lambda, docker runtime, etc) that are properly scoped for least privilege. The SDK also knows how to get your permissions from it automatically. You never need to worry about your code getting the proper access keys.
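
For example, a least-privilege policy attached to the runtime’s role might look like this (the bucket name is hypothetical; the point is that the role grants only the one action the code needs):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-app-bucket/*"
    }
  ]
}
```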

I’ve done most of my CI/CD using AWS native services, where you also attach the role to the runtime. For instance, CodeBuild is really just a Linux or Windows Docker runtime that you can run anything in, and you attach permissions to your CodeBuild project. Of course, your AWS access itself is ideally controlled via SSO or 2FA.

I have done some work with Azure DevOps - which doesn’t actually have anything to do with Azure itself. You can use it to deploy to AWS as well: you store your access keys in an Azure-controlled vault, and the pipeline grants your scripts AWS permissions. I think the same thing works with GitHub Actions.