
45 points | scolby33 | 1 comment
theamk No.46195792
Deprecations via warnings don't reliably work anywhere, in general.

If you are a good developer, you'll have extensive unit test coverage and CI. You never see the unit test output (unless the tests fail), so warnings go unnoticed.

If you are a bad developer, you have no idea what you are doing and you ignore all warnings unless the program crashes.
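
To make the first case concrete: in a typical pytest-based CI setup, deprecation warnings only become impossible to miss if you explicitly opt in, e.g. via the filterwarnings config. A minimal sketch (pytest assumed; the somevendor module is made up, shown only to illustrate a per-module exception):

    # pytest.ini (or [tool.pytest.ini_options] in pyproject.toml)
    # Escalate DeprecationWarning to a test failure so CI can't silently swallow it;
    # the second filter tolerates a hypothetical noisy third-party module.
    [pytest]
    filterwarnings =
        error::DeprecationWarning
        ignore::DeprecationWarning:somevendor.*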

1. ploxiln No.46221176
When I update the Python version, Python packages, container image, etc. for a service, I take a quick look at the CI output, in addition to all the other checks I do (like a couple of basic real-world end-to-end tests), to "smoke test" whether something not caught by an outright CI failure caused some subtle problem.

So, I do often see deprecation warnings in CI output, and fix them. Am I a bad developer?

I think the mistake here is making some warnings default-hidden. The developer who cares about the user running the app in a terminal can add a line of code to suppress them for users, and will be more aware of this whole topic as a result (and the suppression will be more evident near the entrypoint of the program, for later devs to see too).
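
That "line of code" could look roughly like this; a minimal sketch, assuming a CLI-style app with a main() entrypoint (the names are illustrative, not from any particular project):

    # hypothetical entrypoint module of the app
    import sys
    import warnings

    def main() -> None:
        # Hide DeprecationWarning from end users running the program in a terminal,
        # but respect explicit -W flags / PYTHONWARNINGS so devs and CI can still opt in.
        if not sys.warnoptions:
            warnings.simplefilter("ignore", DeprecationWarning)
        ...  # actual program logic goes here

    if __name__ == "__main__":
        main()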

I think that making warnings into errors, or hiding them entirely, removes warnings as a useful tool.
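
The stock -W / PYTHONWARNINGS interface already covers the useful middle ground of visible-but-not-fatal; roughly (myapp.py is a placeholder):

    # Show DeprecationWarning (once per unique location) instead of relying on the
    # default filters, which hide most DeprecationWarnings raised outside __main__:
    python -W default::DeprecationWarning myapp.py

    # Or escalate them to hard failures in a strict CI job:
    python -W error::DeprecationWarning -m pytest

    # The same filters can be set via the environment instead of the command line:
    PYTHONWARNINGS=default::DeprecationWarning python myapp.py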

But this is an old argument: "Who should see Python warnings?" (2017) https://lwn.net/Articles/740804/