289 points kristoff_it | 38 comments
1. dvt ◴[] No.44610616[source]
"Asynchrony" is a very bad word for this and we already have a very well-defined mathematical one: commutativity. Some operations are commutative (order does not matter: addition, multiplication, etc.), while others are non-commutative (order does matter: subtraction, division, etc.).

    try io.asyncConcurrent(Server.accept, .{server, io});
    io.async(Client.connect, .{client, io});
Usually, ordering of operations in code is indicated by the line number (first line happens before the second line, and so on), but I understand that this might fly out the window in async code. So, my gut tells me this would be better achieved with the (shudder) `.then(...)` paradigm. It sucks, but better the devil you know than the devil you don't.

As written, `asyncConcurrent(...)` is confusing as shit, and unless you memorize this blog post, you'll have no idea what this code means. I get that Zig (like Rust, which I really like fwiw) is trying all kinds of new hipster things, but half the time they just end up being unintuitive and confusing. Either implement (async-based) commutativity/operation ordering somehow (like Rust's lifetimes maybe?) or just use what people are already used to.
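For contrast, a minimal JavaScript sketch of the distinction at issue, using hypothetical `accept`/`connect` stand-ins (not Zig's actual API): sequencing forces one to finish before the other starts, while concurrent start lets them overlap.

```javascript
// Hypothetical stand-ins for accept/connect; each records when it starts
// and when it finishes, which is exactly what sequencing vs. concurrency changes.
const log = [];
const accept = () => {
  log.push("accept:start");
  return Promise.resolve().then(() => log.push("accept:done"));
};
const connect = () => {
  log.push("connect:start");
  return Promise.resolve().then(() => log.push("connect:done"));
};

// Sequenced: connect does not even start until accept has finished.
async function sequenced() {
  await accept();
  await connect();
}

// Concurrent: both start immediately; their completions may interleave.
async function concurrent() {
  await Promise.all([accept(), connect()]);
}
```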

replies(10): >>44610771 #>>44610939 #>>44612125 #>>44612190 #>>44612605 #>>44612656 #>>44612932 #>>44613047 #>>44613470 #>>44615786 #
2. ryandv ◴[] No.44610771[source]
Strictly speaking commutativity is defined over (binary) operations - so if one were to say that two async statements (e.g. connect/accept) are commutative, I would have to ask, "under what operation?"

Currently my best answer for this is the bind (>>=) operator (including, incidentally, one of its instances, `.then(...)`), but this is just fuzzy intuition if anything at all.

replies(6): >>44610835 #>>44610926 #>>44610969 #>>44611560 #>>44612207 #>>44616773 #
3. dvt ◴[] No.44610835[source]
Commutative operations (all of them I think?) are trivially generalized to n-ary operations (in fact, we do this via ∑ and ∏, in the case of addition and multiplication, respectively). You're right that the question of what "operation" we're dealing with here is a bit hazy; but I'd wager that it's probably in the family of the increment operation (N++ === N + 1 = 1 + N) since we're constantly evaluating the next line of code, like the head of a Turing machine.

Edit: maybe it's actually implication? Since the previous line(s) logically imply the next: L_0 → L_1 → L_2 → ... → L_n? Though this is non-commutative. Not sure, it's been a few years since my last metalogic class :P

replies(2): >>44611346 #>>44613955 #
4. benreesman ◴[] No.44610926[source]
It's a good intuition. This has been studied extensively, the composition rule that is lax enough to permit arbitrary effects but strict enough to guarantee this class of outcomes is (>>=). We can keep trying to cheat this as long as we want, but it's bind.
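In Promise terms, `.then` already behaves like a bind for the async effect; a tiny sketch of that reading:

```javascript
// Reading Promises monadically: `pure` injects a plain value, and `.then`
// plays the role of bind (>>=), sequencing dependent async steps and
// flattening nested promises.
const pure = (x) => Promise.resolve(x);
const bind = (p, f) => p.then(f); // f: a -> Promise<b>

// Two dependent async steps sequenced through bind:
const result = bind(pure(2), (x) => bind(pure(3), (y) => pure(x + y)));
```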
5. dooglius ◴[] No.44610939[source]
Commutativity is a much weaker claim, because one operand still sits entirely before or after the other. E.g. AB may commute with C, so ABC=CAB, but it is not necessarily the case that this equals ACB. With asynchrony you are guaranteed ABC=ACB=CAB. (There may be an existing mathematical term for this but I don't know it)
replies(2): >>44611038 #>>44616889 #
6. xscott ◴[] No.44610969[source]
> "under what operation?"

You could treat the semicolon as an operator, and just like multiplication over matrices, it's only commutative for a subset of the general type.

replies(2): >>44610980 #>>44612012 #
7. ryandv ◴[] No.44610980{3}[source]
Right, exactly. It's been said[0] that (>>=) is a programmable semicolon.

[0] https://news.ycombinator.com/item?id=21715426

8. dvt ◴[] No.44611038[source]
You can prove three-term commutativity from two-term (I did it years ago, I think it looked something like this[1]), so the ordering doesn't matter.

[1] https://math.stackexchange.com/questions/785576/prove-the-co...

replies(2): >>44611317 #>>44612439 #
9. Ar-Curunir ◴[] No.44611317{3}[source]
Strictly speaking this also requires associativity.
10. senderista ◴[] No.44611346{3}[source]
Generalizing an associative binary op to an n-ary op just requires an identity element Id (which isn't always obvious, e.g. Id_AND=true but Id_OR=false).
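A sketch of that in JavaScript: fold the binary op over the arguments, seeded with its identity, so even the zero-argument case is well-defined.

```javascript
// Generalize an associative binary op to n-ary by folding, seeded with the
// op's identity element: AND over zero arguments is true, OR is false.
const foldOp = (op, id) => (...xs) => xs.reduce(op, id);

const allOf = foldOp((a, b) => a && b, true);  // identity of AND
const anyOf = foldOp((a, b) => a || b, false); // identity of OR
```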
replies(2): >>44612004 #>>44612390 #
11. noduerme ◴[] No.44611560[source]
`.then()` is ugly and `await` is pretty, but wouldn't the critical part for guaranteeing commutativity, rather than a guaranteed order (in JS), be the `Promise.all([])` part?
12. singularity2001 ◴[] No.44612004{4}[source]
Identity is nop / pass
13. singularity2001 ◴[] No.44612012{3}[source]
or carriage returns/newlines, for that matter
14. brailsafe ◴[] No.44612125[source]
> "Asynchrony" is a very bad word for this and we already have a very well-defined mathematical one: commutativity.

I don't think the mere fact that another term defines this concept makes it a better or worse word. "Commutativity" feels, sounds, and reads like a mess imo. "Asynchrony" is way easier on the palate.

replies(1): >>44612312 #
15. ◴[] No.44612190[source]
16. Nevermark ◴[] No.44612207[source]
The "operator" in this case would be the CPU executing 2 or N procedures (or functions).

Commutativity is a very lightweight pattern, and so is correctly applicable to many things, and at any level of operation, as long as the context is clear.

17. throwawaymaths ◴[] No.44612312[source]
commutativity is also not correct, because 1) it means way more than just temporal ordering and 2) there are kooky temporal ordering schemes you can come up with (interleaving multiple async/awaits in weird time-dependent ways) which aren't really describable with the simple mathematical notion of commutativity.
18. JadeNB ◴[] No.44612390{4}[source]
> Generalizing an associative binary op to an n-ary op just requires an identity element Id (which isn't always obvious, e.g. Id_AND=true but Id_OR=false).

Only for n = 0, I think. Otherwise, generalizing associative binary f_2 to f_n for all positive integers n is easily done inductively by f_1(x) = x and f_{n + 1}(x_1, ..., x_n, x_{n + 1}) = f_2(f_n(x_1, ..., x_n), x_{n + 1}), with no need to refer to an identity. (In fact, the definition makes sense even if f_2 isn't associative, but is probably less useful because of the arbitrary choice to "bracket to the left.")

19. dooglius ◴[] No.44612439{3}[source]
I'm not talking about a universe where all elements commute; I'm talking about a situation in which A, B, and C do not necessarily commute but (AB) and C do. For a rigorous definition: given X and Y from some semigroup G, say X and Y are asynchronous if for any finite decompositions X=Z_{a_1}Z_{a_2}...Z_{a_n} and Y=Z_{b_1}Z_{b_2}...Z_{b_m} (with Z's in G), every permutation c_1,...,c_{n+m} of a_1,...,a_n,b_1,...,b_m that preserves the ordering of the a's and the ordering of the b's gives XY=Z_{c_1}Z_{c_2}...Z_{c_{n+m}}. I make the following claim: if G is commutative then all elements are asynchronous, but in a noncommutative G there can exist elements X and Y that commute (i.e. XY=YX) while X and Y are not asynchronous.
replies(1): >>44613418 #
20. hinkley ◴[] No.44612605[source]
Asynchrony also allows for partial ordering. Two operations may still need to be retired in a particular order without having to execute in that order.

Subtraction for instance is not commutative. But you could calculate the balance and the deduction as two separate queries and then apply the results in the appropriate order.
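A sketch of that split in JavaScript, with hypothetical `fetchBalance`/`fetchDeduction` queries: the two queries execute concurrently, but the non-commutative subtraction is applied in a fixed retirement order.

```javascript
// Hypothetical async queries; in practice these would hit a database.
const fetchBalance = async () => 100;
const fetchDeduction = async () => 30;

// Both queries start concurrently (execution order unconstrained), but the
// non-commutative subtraction happens in a fixed order once both settle.
async function settle() {
  const [balance, deduction] = await Promise.all([fetchBalance(), fetchDeduction()]);
  return balance - deduction; // retirement order is fixed here
}
```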

21. ordu ◴[] No.44612656[source]
> As written, `asyncConcurrent(...)` is confusing as shit, and unless you memorize this blog post, you'll have no idea what this code means. I get that Zig (like Rust, which I really like fwiw) is trying all kinds of new hipster things, but half the time they just end up being unintuitive and confusing. Either implement (async-based) commutativity/operation ordering somehow (like Rust's lifetimes maybe?) or just use what people are already used to.

I can't agree. It is confusing because you need to remember the blog post; it wouldn't be confusing in the slightest if you had internalized the core idea. The question remains: is it worth internalizing? I don't know, but what I do know is that some people will internalize it and try to do a lot of shit with it in mind, and after a while we will be able to see where this path leads. At that point we will be able to decide whether it was a good idea or not.

> "Asynchrony" is a very bad word for this and we already have a very well-defined mathematical one: commutativity.

It is risky to use "commutativity" for this. Zig has operators, and some of them are commutative, and it will be confusing. Like if I wrote `f() + g()`: addition is commutative, so would Zig then be free to run f() and g() in parallel? The order of execution and commutativity are different things. Probably one could tie them together with commutative/non-commutative operators, but I'm not sure it is a good idea, and I'm sure that this is a completely different issue from experimenting with asynchrony.
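For what it's worth, a JavaScript sketch of that distinction (making no claim about Zig's rules): even for commutative `+`, JS pins down left-to-right operand evaluation, so the sum and the order of effects are separate questions.

```javascript
// `+` is commutative on the *values*, but JS still guarantees the operands
// are evaluated left-to-right, which is observable when they have effects.
const order = [];
const f = () => { order.push("f"); return 1; };
const g = () => { order.push("g"); return 2; };

const sum = f() + g(); // sum is 3 either way, but effects run f then g
```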

replies(1): >>44613805 #
22. delusional ◴[] No.44612932[source]
> Usually, ordering of operations in code is indicated by the line number

Except for loops which allow going backwards, and procedures which allow temporarily jumping to some other locally linear operation.

We have plenty of syntax for doing non-forwards things.

23. tbrownaw ◴[] No.44613047[source]
> Some operations are commutative (order does not matter: addition, multiplication, etc.)

Fun fact: order does matter for addition. (When adding many floating-point numbers with widely varying exponents.)
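A quick JavaScript illustration: doubles carry ~15-16 significant decimal digits, so at magnitude 1e16 an added 1 is lost to rounding, and reordering the same sum changes the result.

```javascript
// Same three terms, different association: at 1e16 the representable doubles
// are 2 apart, so adding 1 first is rounded away, while cancelling first keeps it.
const absorbed = (1e16 + 1) - 1e16;  // 0: the 1 is rounded away
const cancelled = (1e16 - 1e16) + 1; // 1: the big terms cancel first
```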

24. JW_00000 ◴[] No.44613418{4}[source]
To give a concrete example, matrix multiplication is not commutative in general (AB ≠ BA), but e.g. multiplication with the identity matrix is (AI = IA). So AIB = ABI ≠ BAI.

Or applied to the programming example, the statements:

    1. Server.accept
    2. Client.connect
    3. File.write  # write to completely unrelated file
123 = 312 ≠ 321.
25. tsimionescu ◴[] No.44613470[source]
> So, my gut tells me this would be better achieved with the (shudder) `.then(...)` paradigm. It sucks, but better the devil you know than the devil you don't.

The whole idea behind `await` is to make the old intuition work without the ugliness of `.then()`. `f(); await g(); h()` has exactly the expected execution ordering.

replies(1): >>44613746 #
26. Yoric ◴[] No.44613746[source]
Can confirm.

In JS, we designed `await` specifically to hide `.then()`, just as we had designed `.then()` because callbacks made tracking control flow (in particular errors) too complex.

replies(1): >>44613861 #
27. psychoslave ◴[] No.44613805[source]
I'm not sure they are that different: you could just as well store each function call's result in a constant, each on its own line, then add the results on a third. That's only a syntactic difference, not a conceptual one. At a practical level, the difference is that the operator can be directly matched to some machine instruction, with operands being native data types such as integers.

Still, you might then prefer a word such as permutability, or swappability.

28. psychoslave ◴[] No.44613861{3}[source]
How is it any better to have await? Any resources I might consult on this?
replies(2): >>44613995 #>>44614028 #
29. dwattttt ◴[] No.44613955{3}[source]
Implication sounds right. With no further analysis, running each line in order is correct (for whatever "order" is defined by a language, let's assume imperative).

A compiler could recognise that e.g. L_2 doesn't depend on L_1, and would be free to reorder them. And compilers do recognise this in terms of data dependence of operations.

30. Yoric ◴[] No.44613995{4}[source]
Well, one of the ways we "sold" async/await to Google was by showing how we could improve Promise-based tests.

I recall that one of our test suites was tens of thousands of lines of code using `then()`. The code was complicated enough that these lines were by and large considered write-only, partly because async loops were really annoying to write, partly because error-handling was non-trivial.

I rewrote that test suite using `Task.spawn` (our prototype for async/await). I don't have the exact numbers in mind, but this decreased the number of LoC by a factor of 2-3 and suddenly people could see the familiar uses of loops and `try`/`catch`.

31. tsimionescu ◴[] No.44614028{4}[source]
Well, consider the difference between

  a().then(() =>
         b())
     .then(() =>
         c())
Compared to

  await a()
  await b() 
  await c() 
Even for this simple case I think it's much clearer. Then look at a more complex case:

  for( i=0; i<n; i++) {
    await a(i) ;
  } 
Now try re-writing this with then() and see the difference.
replies(1): >>44614462 #
32. psychoslave ◴[] No.44614462{5}[source]
For the first one, it really feels like only a matter of how you break lines. Sure there is also the matter of anonymous function syntax, but

The latter is more fair, here is a possible solution:

  const gen = (function* () {
    for (let i = 0; i < n; i++) yield a(i);
  })();

  const run = next => !next.done && next.value.then(() => run(gen.next()));
  run(gen.next());
Or something similar using reduce. But in both cases, it illustrates the point, I guess.
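The reduce variant alluded to above might look like this (same assumption as before: `a` returns a promise):

```javascript
// Fold over the indices, threading a single promise chain through, so each
// a(i) starts only after the previous one has settled.
const runAll = (a, n) =>
  [...Array(n).keys()].reduce(
    (prev, i) => prev.then(() => a(i)),
    Promise.resolve()
  );
```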

But if we are at point we can introduce new keywords/syntax in the language, it would just as well possible to come with something like

  a.chain(b, c)
In case you need to pass parameters

  a.chain([b, p1, p2], c)
And for the latter case

  const indexes = (function* () {
    for (let i = 0; i < n; i++) yield i;
  })
  a.through(indexes)
replies(1): >>44614609 #
33. Yoric ◴[] No.44614609{6}[source]
Well, sure, if you have `yield`, you pretty much have `await` already, as `await` is thin syntactic sugar on top of `yield` in all languages other than OCaml.
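A minimal sketch of that desugaring: a driver that resumes a generator with each resolved value, so `yield` plays the role of `await`.

```javascript
// A tiny "await from yield" driver: the generator yields promises, and the
// runner feeds each resolved value back in, which is roughly the desugaring.
function run(genFn) {
  const gen = genFn();
  const step = (value) => {
    const next = gen.next(value);
    return next.done
      ? Promise.resolve(next.value)
      : Promise.resolve(next.value).then(step);
  };
  return step(undefined);
}

// Usage: reads like async/await, with `yield` in place of `await`.
const total = run(function* () {
  const x = yield Promise.resolve(2);
  const y = yield Promise.resolve(3);
  return x + y;
});
```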
34. alfiedotwtf ◴[] No.44615786[source]
> Usually, ordering of operations in code is indicated by the line number (first line happens before the second line, and so on), but I understand that this might fly out the window in async code

This isn’t always true at the language level, and almost certainly not at the CPU pipeline and microcode level.

Logic languages like Prolog will execute statements out of order, by design. Other languages, like Mercury, use the IO monad to signify serial operations.

replies(1): >>44617617 #
35. jhanschoo ◴[] No.44616773[source]
Under function composition `;`, where both the LHS and RHS are viewed as functions operating on the whole environment state.
replies(1): >>44616923 #
36. jhanschoo ◴[] No.44616889[source]
I agree. T and U being async with respect to each other means at least that T and U can be broken down into tasks t1, t2, t3, ..., tn and u1, u2, ..., um, such that they can be interleaved in any order, though typically we still require that the t tasks execute in sequential order. The divisions between the tasks are where they give up control, e.g. as they wait for data to be loaded into memory, or on a network call.

This is still a special case of what we mean by async wrt each other, because depending on the interleaving at each step and e.g. the data loaded into memory, the number of tasks may change, but the idea is that they still eventually terminate in a correct state.

37. ryandv ◴[] No.44616923{3}[source]
Right; though it's a special kind of function composition (Kleisli composition) and often presented in a different form (bind, >>=).
38. sdbrady ◴[] No.44617617[source]
I'm not sure what you mean by "statements" in Prolog as it's not a term the language defines. If you're referring to clauses, it's not true that execution is unordered: the Prolog interpreter attempts to unify a goal with clauses from the knowledge base in the order they appear. This ordering is semantically significant for control flow.

If instead you're referring to goals within the body of a clause, this is also incorrect. Goals are evaluated strictly left-to-right, and each must succeed before the next is attempted. This evaluation order is likewise required and observable, especially in the presence of side effects.