
258 points | rbanffy | 3 comments
AlexanderDhoore No.44003888
Am I the only one who sort of fears the day when Python loses the GIL? I don't think Python developers know what they’re asking for. I don't really trust complex multithreaded code in any language. Python, with its dynamic nature, I trust least of all.
jillesvangurp No.44004251
You are not the only one who is afraid of change and a bit change resistant. I think the issue here is that the reasons for this fear are not very rational, and it is in the wider community's interest to deal with technical debt. The GIL is pure technical debt: defensible 30 years ago, a bit awkward 20 years ago, and downright annoying and embarrassing now that the world + dog has been doing its AI data processing with Python at scale for the last 10. It had to go in the interest of future-proofing the platform.

What changes for you? Nothing, unless you start using threads. You probably weren't using threads anyway, because there is little to no point to using them in Python. Most Python code bases completely ignore the threading module and instead use non-blocking IO, async, or similar things. The GIL thing only kicks in if you actually use threads.

If you don't use threads, removing the GIL changes nothing. There's no code that will break. All those C libraries that aren't thread safe are still single threaded, etc. Only if you now start using threads do you need to pay attention.

There is of course some threaded Python code that people wrote somewhat naively in the hope that it would make things faster, but that is constantly hitting the GIL and is effectively single threaded. That code might now run a little faster, and probably with more bugs, because naive threaded code tends to have those.
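
To make that concrete, here's a minimal sketch of the pattern (the function name and the numbers are invented for illustration): pure-Python CPU-bound work fanned out over threads, which under the GIL still executes one bytecode stream at a time and so gains roughly nothing over doing the work sequentially:

    import threading

    def count_down(n):
        # Pure-Python CPU-bound work. Under the GIL only one thread runs
        # bytecode at a time, so four threads take about as long as one.
        while n > 0:
            n -= 1

    threads = [threading.Thread(target=count_down, args=(10_000_000,))
               for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()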

But a simple solution to address your fears: simply don't use threads. You'll be fine.

Or learn how to use threads. Because now you finally can and it isn't that hard if you have the right abstractions. I'm sure those will follow in future releases. Structured concurrency is probably high on the agenda of some people in the community.
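
For what it's worth, the standard library already ships one such abstraction in concurrent.futures. Here is a minimal sketch (the worker function and chunk sizes are made up) of handing CPU-bound work to a thread pool, which only really pays off once the GIL no longer serializes it:

    from concurrent.futures import ThreadPoolExecutor

    def crunch(chunk):
        # Stand-in for real CPU-bound work on one slice of the data.
        return sum(x * x for x in chunk)

    chunks = [range(i * 1_000_000, (i + 1) * 1_000_000) for i in range(8)]
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(crunch, chunks))
    print(sum(results))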

HDThoreaun No.44004545
> But a simple solution to address your fears: simply don't use threads. You'll be fine.

I'm not worried about new code. I'm worried about stuff written 15 years ago by a monkey who had no idea how threads work and just read something on Stack Overflow that said to use threading. That code will likely break when run post-GIL, and I suspect there is actually quite a bit of it.
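
For example (names invented, a sketch of the genre rather than any real code base): a check-then-act on shared state with no lock. It was never actually safe under the GIL, but the coarse interleaving the GIL imposes made the bad schedule rare; a free-threaded build gives it far more chances to fire:

    import threading, time

    cache = {}

    def expensive_build(key):
        time.sleep(0.01)  # simulate slow work
        return key.upper()

    def get_or_create(key):
        # Classic check-then-act race: two threads can both see the key
        # missing and both build (and overwrite) the value.
        if key not in cache:
            cache[key] = expensive_build(key)
        return cache[key]

    threads = [threading.Thread(target=get_or_create, args=("spam",))
               for _ in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()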

bayindirh No.44004665
Software rots; software tools evolve. When Intel released performance primitives libraries that required recompilation to analyze multi-threaded code, we were amazed. Now these tools are built into processors as performance counters, and we have far more advanced tools for analyzing how systems behave.

Older code will break, but code breaks all the time. A language changes how something behaves in a new revision, and suddenly 20-year-old bedrock tools are getting massively patched to accommodate both the new and the old behavior.

Is it painful, ugly, unpleasant? Yes, yes, and yes. However, change is inevitable, because some of the old behavior was rooted in the inability to do certain things with the technology of the time, and as those hurdles are cleared, we change how things work.

My father's friend told me that the length of a variable's name used to affect compile/link times. Now we can test whether we have memory leaks in Rust; that was impossible 15 years ago given the performance of the processors.

delusional No.44005661
> Software rots

No it does not. I hate that analogy so much, because it leads to such bad behavior. Software is a digital artifact that does not degrade. With the right attitude, you'd be able to execute the same binary on new machines for as long as you desired. That is not true of organic matter that actually rots.

The only reason we need to change software is that we trade that off against something else. Instructions are reworked, because chasing the universal Turing machine takes a few sacrifices. If all software has to run on the same hardware, those two artifacts have to have a dialogue about what they need from each other.

If we didn't want the universal machine to do anything new, and if we had a valuable product, we could just keep making the machine that executes that product. It never rots.

eblume No.44005775
Software is written with a context, and the context degrades. It must be renewed. It rots, sorry.
igouy No.44006626
You said it's the context that rots.
bayindirh No.44006736
It's a matter of perspective, I guess...

When you look from the program's perspective, the context changes and becomes unrecognizable, IOW, it rots.

When you look from the context's perspective, the program changes by not evolving and keeping up with the context, IOW, it rots.

Maybe we anthropomorphize both and say "they grow apart". :)

igouy No.44008270
We say the context has breaking changes.

We say the context is not backwards compatible.