289 points kristoff_it | 39 comments
1. threatofrain ◴[] No.44609107[source]
IMO the author is mixed up on his definitions for concurrency.

https://lamport.azurewebsites.net/pubs/time-clocks.pdf

replies(4): >>44609321 #>>44609563 #>>44610163 #>>44611854 #
2. tines ◴[] No.44609321[source]
Can you explain more instead of linking a paper? I felt like the definitions were alright.

> Asynchrony: the possibility for tasks to run out of order and still be correct.

> Concurrency: the ability of a system to progress multiple tasks at a time, be it via parallelism or task switching.

> Parallelism: the ability of a system to execute more than one task simultaneously at the physical level.
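
For concreteness, here is a minimal TypeScript sketch reading those three definitions together (the `saveFile` task and its delay are made up for illustration):

```ts
// Hypothetical tasks: two saves that don't depend on each other.
async function saveFile(name: string): Promise<void> {
  // Stand-in for real I/O: resolve after a random delay.
  await new Promise<void>((resolve) => setTimeout(resolve, Math.random() * 10));
  console.log(`saved ${name}`);
}

async function main(): Promise<void> {
  // Asynchrony: the saves may complete in either order, and the
  // program is correct either way, since neither depends on the other.
  // Concurrency: both tasks are in progress at once. On a single JS
  // thread this is task switching; parallelism would mean the tasks
  // literally run at the same instant on separate cores.
  await Promise.all([saveFile("a.txt"), saveFile("b.txt")]);
}

main();
```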

replies(9): >>44609334 #>>44609420 #>>44609491 #>>44609531 #>>44609532 #>>44609822 #>>44609915 #>>44610273 #>>44611918 #
3. ◴[] No.44609334[source]
4. threatofrain ◴[] No.44609420[source]
Concurrency is the property of a program to be divided into partially ordered or completely unordered units of execution. It does not describe how you actually end up executing the program, such as whether you exploit these properties for parallel execution or task switching. Or maybe you're running on a single thread and not doing any task switching or parallelism.

For more I'd look up Rob Pike's discussions for Go concurrency.
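
As a TypeScript sketch of that idea (task names invented): the decomposition below leaves step1a and step1b unordered relative to each other, and either execution strategy is a valid way to run it.

```ts
// The concurrent structure: step1a and step1b are unordered relative
// to each other; step2 must come after both. (Hypothetical tasks.)
const step1a = async (): Promise<void> => { console.log("step1a"); };
const step1b = async (): Promise<void> => { console.log("step1b"); };
const step2 = async (): Promise<void> => { console.log("step2"); };

// One valid execution: plain sequential, no task switching at all.
async function runSequential(): Promise<void> {
  await step1a();
  await step1b();
  await step2();
}

// Another valid execution: exploit the missing order between 1a and 1b.
async function runOverlapped(): Promise<void> {
  await Promise.all([step1a(), step1b()]);
  await step2();
}

runSequential().then(runOverlapped);
```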

replies(1): >>44609486 #
5. tines ◴[] No.44609486{3}[source]
The article understands this.
replies(1): >>44609541 #
6. andsoitis ◴[] No.44609491[source]
> Asynchrony: the possibility for tasks to run out of order and still be correct.

Asynchrony is when things don't happen at the same time or in the same phase, i.e., it is the opposite of synchronous. It can describe a lack of coordination or concurrence in time, often with one event or process occurring independently of another.

The correctness statement is not helpful. When things happen asynchronously, you do not have guarantees about order, which may be relevant to the "correctness of your program".

replies(1): >>44609689 #
7. Lichtso ◴[] No.44609531[source]
Concurrency is parallelism and/or asynchrony, simply the superset of the other two.

Asynchrony means things happen out of order, interleaved, interrupted, preempted, etc. but could still be just one thing at a time sequentially.

Parallelism means the physical time spent is less than the sum of the total time spent because things happen simultaneously.
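
A minimal TypeScript sketch of that sequential-but-interleaved case (made-up tasks):

```ts
// Two tasks interleave on one thread: asynchrony without parallelism.
async function task(name: string): Promise<void> {
  for (let step = 0; step < 2; step++) {
    console.log(`${name} step ${step}`);
    await Promise.resolve(); // yield back to the event loop
  }
}

task("A");
task("B");
// Prints: A step 0, B step 0, A step 1, B step 1. Interleaved,
// yet strictly one step executes at a time.
```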

replies(3): >>44609697 #>>44610257 #>>44613039 #
8. jkcxn ◴[] No.44609532[source]
Not the OP, but in formal definitions like Communicating Sequential Processes, concurrency means the possibility for tasks to run out of order and still be correct, as long as the required synchronisation events happen.
replies(1): >>44609678 #
9. threatofrain ◴[] No.44609541{4}[source]
> Concurrency: the ability of a system to progress multiple tasks at a time, be it via parallelism or task switching.

Okay, but don't go with this definition.

replies(1): >>44610194 #
10. sapiogram ◴[] No.44609563[source]
This is why I've completely stopped using the term, literally everyone I talk to seems to have a different understanding. It no longer serves any purpose for communication.
replies(2): >>44609704 #>>44611907 #
11. gowld ◴[] No.44609678{3}[source]
Concurrency implies asynchrony (two systems potentially doing work at the same time without waiting for each other), but the converse is not true.

A single process can do work in an unordered (asynchronous) way.

replies(1): >>44610100 #
12. w10-1 ◴[] No.44609689{3}[source]
> The correctness statement is not helpful

But... that's everything, and why it's included.

Undefined behavior from asynchronous computing is not worth study or investment, except to avoid it.

Virtually all of the effort for the last few decades (from super-scalar processors through map/reduce algorithms and Nvidia fabrics) involves enabling non-SSE operations that are correct.

So yes, as an abstract term outside the context of computing today, asynchrony does not guarantee correctness - that's the difficulty. But the only asynchronous computing we care about offers correctness guarantees of some sort (often a new type, e.g., "eventually consistent").

13. jrvieira ◴[] No.44609697{3}[source]
Careful: in many programming contexts parallelism and concurrency are exclusive concepts, and both sometimes fall under the umbrella of async, which is a term that applies to a different domain.

In other contexts these words don't describe disjoint sets of things, so it's important to clearly define your terms when talking about software.

replies(2): >>44609817 #>>44609977 #
14. ◴[] No.44609704[source]
15. merb ◴[] No.44609817{4}[source]
Yeah concurrency is not parallelism. https://go.dev/blog/waza-talk
replies(1): >>44610041 #
16. ryandv ◴[] No.44609822[source]
They're just different from what Lamport originally proposed. Asynchrony as given is roughly equivalent to Lamport's characterization of distributed systems as partially ordered, where some pairs of events can't be said to have occurred before or after one another.

One issue with the definition for concurrency given in the article would seem to be that no concurrent systems can deadlock, since as defined all concurrent systems can progress tasks. Lamport uses the word concurrency for something else: "Two events are concurrent if neither can causally affect the other."

Probably the notion of (a)causality is what the author was alluding to in the "Two files" example: saving two files where order does not matter. If the code had instead been "save file A; read contents of file A;" then, similarly to the client connect/server accept example, the "save" statement and the "read" statement would not be concurrent under Lamport's terminology, as the "save" causally affects the "read."

It's just that the causal relationship between two tasks is a different concept from how those tasks are composed together in a software model, which is a different concept from how those tasks are physically orchestrated on bare metal, and also different from the ordering of events.
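
A TypeScript sketch of that causal distinction (the in-memory fileStore is a stand-in, not from the article):

```ts
// In-memory stand-in for a file store.
const fileStore = new Map<string, string>();
const save = async (name: string, data: string): Promise<void> => {
  fileStore.set(name, data);
};
const read = async (name: string): Promise<string | undefined> =>
  fileStore.get(name);

async function demo(): Promise<void> {
  // Concurrent in Lamport's sense: neither save can causally affect
  // the other, so either completion order is correct.
  await Promise.all([save("A", "1"), save("B", "2")]);

  // Not concurrent: the read is causally downstream of the save,
  // so the program must order it after the save.
  await save("A", "3");
  console.log(await read("A")); // "3"
}

demo();
```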

replies(1): >>44610351 #
17. amelius ◴[] No.44609915[source]
> Asynchrony: the possibility for tasks to run out of order and still be correct.

Can't we just call that "independent"?

replies(1): >>44610603 #
18. Lichtso ◴[] No.44609977{4}[source]
Yes, some people swap the meanings of concurrency and asynchrony. But almost all implementations of async use main event loops, global interpreter locks, coroutines, etc., and thus at the end of the day only do one thing at a time.

Therefore I think this definition makes the most sense in practical terms. Defining concurrency as the superset is a useful construct because you have to deal with the same issues in both cases. And differentiating asynchrony and parallelism makes sense because it changes the trade-off of latency and energy consumption (if the bandwidth is fixed).

19. Lichtso ◴[] No.44610041{5}[source]
That is compatible with my statement: Not all concurrency is parallelism, but all parallelism is concurrency.
replies(2): >>44610270 #>>44614508 #
20. Zambyte ◴[] No.44610100{4}[source]
Parallelism implies concurrency but does not imply asynchrony.
21. carodgers ◴[] No.44610163[source]
The author is aware that definitions exist for the terms he uses in his blog post. He is proposing revised definitions. As long as he is precise with his new definitions, this is fine. It is left to the reader to decide whether to adopt them.
replies(1): >>44610233 #
22. michaelsbradley ◴[] No.44610194{5}[source]
Is the hang-up on "at a time"? What if that were changed to something like "in a given amount of time"?

For single-threaded programs, whether it's JS's event loop, Racket's cooperative threads, or something similar, if Δt is small enough then only one task will be seen to progress.

23. WhitneyLand ◴[] No.44610233[source]
He’s repurposing "asynchrony" in a way that differs from how most of the literature and many developers use it, and that shift is doing rhetorical work to justify a particular Zig API split.

No thanks.

24. michaelsbradley ◴[] No.44610257{3}[source]
Asynchrony also practically connotes nondeterminism, but a single-threaded concurrent program doesn't have to exhibit nondeterministic behavior.
25. OkayPhysicist ◴[] No.44610270{6}[source]
What people mean by "concurrency is not parallelism" is that they are different problems. The concurrency problem is defining an application such that it has parts that are not causally linked with some other parts of the program. The parallelism problem is the logistics of actually running multiple parts of your program at the same time. If I write a well-formed concurrent system, I shouldn't have to know or care if two specific parts of my system are actually being executed in parallel.

In ecosystems with good distributed system stories, what this looks like in practice is that concurrency is your (the application developers') problem, and parallelism is the scheduler designer's problem.

26. ninetyninenine ◴[] No.44610273[source]
Doesn't progressing multiple tasks "at a time" make it simultaneous?

I think there needs to be a stricter definition here.

Concurrency is the ability of a system to chop a task into many tiny tasks. A side effect of this is that if the system chops all tasks into tiny tasks and runs them all in a sort of shuffled way, it looks like parallelism.

27. kazinator ◴[] No.44610351{3}[source]
The definition of asynchrony is bad. It's possible for asynchronous requests to guarantee ordering, such that if a thread makes two requests A and B in that order, asynchronously, they will happen in that order.

Asynchrony means that the requesting agent is not blocked while submitting a request in order to wait for the result of that request.

Asynchronous abstractions may provide a synchronous way to wait for the asynchronously submitted result.
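
As a sketch of that kind of API in TypeScript (OrderedQueue is hypothetical, not a real library):

```ts
// Hypothetical API: submission never blocks the caller, yet requests
// are guaranteed to execute in FIFO order relative to one another.
class OrderedQueue {
  private tail: Promise<void> = Promise.resolve();

  submit(work: () => Promise<void>): Promise<void> {
    this.tail = this.tail.then(work);
    return this.tail; // an optional way to wait for the result
  }
}

const q = new OrderedQueue();
q.submit(async () => console.log("request A")); // caller does not wait,
q.submit(async () => console.log("request B")); // but A still runs before B
```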

replies(1): >>44610510 #
28. ryandv ◴[] No.44610510{4}[source]
> The definition of asynchrony is bad. It's possible for asynchronous requests to guarantee ordering, such that if a thread makes two requests A and B in that order, asynchronously, they will happen in that order.

It's true that it's possible - two async tasks can be bound together in sequence, just as with `Promise.then()` et al.

... but it's not necessarily the case, hence the partial order, and the "possibility for tasks to run out of order".

For example - `a.then(b)` might bind tasks `a` and `b` together asynchronously, such that `a` takes place, and then `b` takes place - but after `a` has taken place, and before `b` has taken place, there may or may not be other asynchronous tasks interleaved between `a` and `b`.

The ordering between `a`, `b`, and these interleaved events is not defined at all, and thus we have a partial order, in which we can bind `a` and `b` together in sequence, but have no idea how these two events are ordered in relation to all the other asynchronous tasks being managed by the runtime.
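
In TypeScript terms (task names made up):

```ts
// Hypothetical tasks; only the a-before-b ordering is guaranteed.
const a = async (): Promise<void> => { console.log("a"); };
const b = async (): Promise<void> => { console.log("b"); };
const other = async (): Promise<void> => { console.log("other"); };

a().then(b); // guarantees a runs before b...
other();     // ...while "other" has no guaranteed order relative to
             // either: the partial order only relates a and b.
```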

replies(1): >>44610689 #
29. skydhash ◴[] No.44610603{3}[source]
Not really. There may be some causal relations.
replies(1): >>44610613 #
30. amelius ◴[] No.44610613{4}[source]
Can you give an example?
replies(1): >>44611648 #
31. kazinator ◴[] No.44610689{5}[source]
I mean that it's possible in the sense of being designed in as a guarantee; that the async operations issued against some API object will be performed in the order in which they are submitted, like a FIFO queue.

I don't mean "promise.then", whereby the issuance of the next request is gated on the completion of the first.

An example might be async writes to a file. If we write "abc" at the start of the file in one request and "123" starting at the second byte in the second request, there can be a guarantee that the result will be "a123", and not "abc3", without gating on the first request completing before starting the other.

async doesn't mean out of order; it means the request initiator doesn't synchronize on the completion as a single operation.
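
The byte arithmetic, as a TypeScript sketch (an in-memory buffer stands in for a file whose async writes are guaranteed to apply in submission order):

```ts
// Write a string into the buffer at a byte offset.
function writeAt(file: Uint8Array, pos: number, s: string): void {
  for (let i = 0; i < s.length; i++) file[pos + i] = s.charCodeAt(i);
}

const ordered = new Uint8Array(4);
writeAt(ordered, 0, "abc"); // first request: bytes 0..2
writeAt(ordered, 1, "123"); // second request: bytes 1..3
console.log(new TextDecoder().decode(ordered)); // "a123"

const unordered = new Uint8Array(4);
writeAt(unordered, 1, "123"); // without the FIFO guarantee the writes
writeAt(unordered, 0, "abc"); // could land in the opposite order...
console.log(new TextDecoder().decode(unordered)); // ...giving "abc3"
```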

32. skydhash ◴[] No.44611648{5}[source]
Locks, scheduling, ... These introduce some synchronicity, and so some kind of order. But that order is enforced on the system; it is not a required mechanism.
replies(1): >>44613615 #
33. laserbeam ◴[] No.44611854[source]
Keep in mind that you can’t really express half the concepts in Lamport’s papers in most languages. You don’t really talk about total and partial clock ordering when starting a thread. You only really do it in TLA+ when designing a protocol.

That being said, I agree we don’t need a new term to express “Zig has a function in the async API that throws a compilation error when you run in a non-concurrent execution. Zig lets you say that.” It’s fine to do that without proposing new theory.

replies(1): >>44616474 #
34. ◴[] No.44611907[source]
35. sriram_malhar ◴[] No.44611918[source]
The phrase "multiple tasks at a time" is ill-defined, according to Lamport, because whose clock are you trusting?

For Lamport, concurrent does not mean what it means to us colloquially or informally (like "meanwhile"). Concurrency in Lamport's formal definition is only about order. If one task depends on or is affected by another, then the first is ordered after the second one. Otherwise, they are deemed to be "concurrent", even if one happens years later or earlier.

36. ◴[] No.44613039{3}[source]
37. amelius ◴[] No.44613615{6}[source]
But if I run "ls" on a machine, and another user runs "ls" on the same machine, wouldn't you consider them independent, even though the OS uses all kinds of locks and what not under the hood?
38. jlouis ◴[] No.44614508{6}[source]
SIMD is parallel execution with no concurrency.
39. ◴[] No.44616474[source]