Honestly, I know it's minor, but when I tried Zig, the fact that I couldn't write the classic "for (int i = 0; i < n; ++i)" loop was quite annoying. The while-loop workaround pollutes the enclosing scope, so you need to add scope braces every time, and I find it's much easier to forget the increment at the end of a while loop.
I understand where it's coming from, but having something like "for (i32 i in [0,n[)" would be so much nicer in my opinion.
Given that Zig does not allow variable shadowing (a choice that makes sense on its own), I find the scope leak very annoying in practice.
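For reference, the workaround I mean looks roughly like this (just a sketch, with a made-up `sum` function):

  // C equivalent: for (size_t i = 0; i < n; ++i) total += items[i];
  fn sum(items: []const u32) u64 {
      var total: u64 = 0;
      {
          // extra braces so `i` doesn't leak into the rest of the function
          var i: usize = 0;
          while (i < items.len) : (i += 1) {
              total += items[i];
          }
      }
      return total;
  }

Forget the extra braces and `i` stays visible (and un-shadowable) for the rest of the scope.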
This is true. There have been proposals to alter the `while` syntax to fix the scope leak, as well as proposals to extend `for` to support ranges, but none have been accepted (and they aren't very compelling in my opinion).
Honestly, I'm not sure why people think it's such a big deal. I've written a lot of Zig, mostly gamedev stuff, and I rarely use that construct. I find that the vast majority of loops are over arrays or slices.
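For example (a made-up sketch, but representative of what I write), most loops end up looking like this, with no counter in sight:

  const Vec2 = struct { x: f32, y: f32 };

  fn sumX(positions: []const Vec2) f32 {
      var total: f32 = 0;
      // capture each element directly; no index variable to manage or leak
      for (positions) |p| {
          total += p.x;
      }
      return total;
  }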
While I don't have much experience with Zig, I agree -- from my little bit of playing around, it also hasn't been a big deal.
(The biggest thing I've hit is wanting a way to return information along with an error. If that were fixed, I think Zig's error handling would be perfect.)
Loris Cro did a great talk on error handling in Zig, which also deals with how to return information with an error if you need to, and why the common case optimizes for simpler error values: https://www.youtube.com/watch?v=TOIYyTacInM
But the answer there is basically, "if you care about the payload, don't use Zig's built-in error handling." Which, obviously, you can do, but I'm not convinced it's a great answer!
Even for something as simple as parseInt, I'd argue it would be nice if it returned, along with InvalidCharacter, which character was invalid. That would enable, e.g., very precise messages to the end user about what was wrong with the input.
In general, because it's currently super-ergonomic to return bare error codes but more of a pain to return errors with payloads, what you'll get in practice is no error payload even when one would be useful. That ends up producing less-helpful-than-ideal errors for end users, which is contrary to Zig's goal of enabling the creation of perfect software. :-)
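To make that concrete, here's roughly what you're limited to today (a sketch; `readCount` and the errors I map to are made up, but parseInt's error set really is just Overflow and InvalidCharacter):

  const std = @import("std");

  fn readCount(input: []const u8) !i64 {
      return std.fmt.parseInt(i64, input, 10) catch |err| switch (err) {
          // we learn *that* some character was invalid, but not *which* one,
          // so any message we show the end user is necessarily vague
          error.InvalidCharacter => return error.InvalidCountField,
          error.Overflow => return error.CountTooLarge,
      };
  }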
Just thinking out loud: given that Zig's errors primarily let the compiler enforce that errors are handled, and handled exhaustively, why can't error payloads just be passed through an out-pointer argument to the function call? Maybe this is a dumb idea, but it's what C programs already do, except that C programs don't have the safety of the compiler-checked errors Zig offers. Does the Zig compiler need to check error payloads beyond what the type system already provides? I'm not sure.
I agree with you that such a super-ergonomic error-handling system will steer most code toward a certain style, but I don't think that's contrary to Zig's goals of perfect software or simplicity. It's already such a huge win over C's error handling.
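Roughly the shape I was thinking of for the out-pointer idea, as a sketch (all names here are made up, and overflow handling is omitted for brevity):

  const std = @import("std");

  const ParseDiagnostics = struct {
      invalid_index: usize = 0,
  };

  // The error code stays a plain Zig error, so the compiler still enforces
  // handling; the extra detail travels through an optional out-pointer that
  // callers are free to ignore.
  fn parseDecimal(s: []const u8, diag: ?*ParseDiagnostics) error{ Empty, InvalidCharacter }!u64 {
      if (s.len == 0) return error.Empty;
      var result: u64 = 0;
      var i: usize = 0;
      while (i < s.len) : (i += 1) {
          const c = s[i];
          if (c < '0' or c > '9') {
              // record *which* character was bad before returning the code
              if (diag) |d| d.invalid_index = i;
              return error.InvalidCharacter;
          }
          result = result * 10 + (c - '0');
      }
      return result;
  }

  test "caller can recover the offending index" {
      var diag = ParseDiagnostics{};
      try std.testing.expectError(error.InvalidCharacter, parseDecimal("12x4", &diag));
      try std.testing.expectEqual(@as(usize, 2), diag.invalid_index);
  }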
> Just thinking out loud: given that Zig's errors primarily let the compiler enforce that errors are handled, and handled exhaustively, why can't error payloads just be passed through an out-pointer argument to the function call?
Yeah, that's definitely a reasonable way to do it, and actually Andrew gives an example of this:
(Andrew's example is slightly different in that rather than an explicit argument, it's an optional field in one of the arguments.)
So it's definitely not crazy bad or anything, but it's also definitely less ergonomic than being able to return a payload directly with the error, and that's enough friction that it feels like you'll end up not having error information when you want it (both as an API consumer and, eventually, as an end user looking at an error message).
But, I'm still very new to Zig, so perhaps my instincts on that are wrong. :-)
> But, I'm still very new to Zig, so perhaps my instincts on that are wrong. :-)
Maybe you're right too! Just speaking for myself here, but as a programmer and "man with a hammer", I sometimes like to think of instincts as our mental machine-learning model: it can be really well trained and give the right black-box answer, without the explanatory backstory or rationale, yet still be worth trusting.
I'm also new to Zig, and there are ways that a pointer input arg as error payload could go wrong. I'm following the proposal issue, and it will be interesting to see which way it goes.
I think you're spot on with your description of instincts. Which of course means they can sometimes lead us astray, especially when we apply them to domains very different from the ones they were trained on.
Yeah, it's not that common in general, I agree. I think my impression was coloured by the fact that I tried Zig to implement a binary protocol deserialiser, where reading a size first and then iterating that many times happened a lot.
Both forms feature "implicit flow of control", contrary to Zig's stated goals. In "while (i < 10) : (i += 1)", it's hard to understand what the heck "(i += 1)" is doing there. The best explanation one might come up with is that it's purposely made to be different from C, just to be different and confuse people.
In the second case, it's "defer", which comes from Go, the language that chickened out of adding normal exceptions because they're "implicit transfer of control", and then LOLishly added "defer", as if it weren't exactly that.
> Both forms feature "implicit flow of control", contrary to Zig's stated goals.
They absolutely don't. All control paths are explicitly represented by syntax (no different from the hidden goto in a while loop -- it's explicitly signalled by the while syntax); proof: you can draw all of them just by examining the syntax of the current subroutine, while knowing nothing about others. Exceptions, however, are implicit: any call, foo(), might or might not cause some control flow change in the caller without any explicit acknowledgement of that by the caller; you cannot draw all the flow paths just by examining the syntax of the current subroutine.
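To illustrate (a sketch; `doWork` is a made-up placeholder), every point where control can leave the scope is spelled out in the scope itself:

  const std = @import("std");

  fn doWork(f: std.fs.File) !void {
      _ = f; // placeholder
  }

  fn example() !void {
      const f = try std.fs.cwd().openFile("data.txt", .{});
      // runs on every exit from this scope; both the defer and each early
      // exit (the `try`s) are visible in this function's own syntax
      defer f.close();

      try doWork(f);
  }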
Sorry, but throwing an exception in a function call is equivalent to:
res, exc = fun();
if (exc) goto exception_handler;
That's the underlying model of how exceptions behave, and how they're implemented "manually" in languages with no exception handling (C, Go). You absolutely can draw that by examining the syntax of a subroutine, and it's no more implicit than "defer".
Right, both the C style and Zig's defer are explicit, as opposed to exceptions, which are implicit, only Zig's error handling is less error-prone (it forces you to handle errors) and makes the code more readable, IMO, than the C style.
But no, there's a kind of continuum, and Go/Zig "defer" is already pretty far toward the "implicit flow of control" end. (I agree that exceptions are a notch further.)
It's only C's syntax that is truly explicit. It's literally a structured machine-independent assembler. That's why it's a gold-standard language that is very hard to displace (it's already perfect for what it is). But, like everyone else, I'm watching with popcorn as all the contenders pop up. (My humble opinion about Zig's issues on that path I, together with other people, have expressed here.)
> is already pretty far toward the "implicit flow of control" end
Its implicitness is zero -- there is zero information not available in the syntax of the current block, exactly as in C -- so I don't see how it can be high compared to anything. You just don't have it in C, so you're not used to it. This is exactly like an Assembly programmer who says that C's `while` is implicit because there is no explicit jmp. In fact, the third clause in C's `for` header works almost exactly like defer: you write a piece of code that isn't executed immediately after the preceding one, but is injected at the end of the block.
A language with `while` isn't any more implicit than a language with just goto; it just has another explicit control-flow construct. Same goes for defer.
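To make the analogy concrete (a sketch; `cleanup` is a made-up placeholder):

  const std = @import("std");

  fn cleanup() void {
      std.debug.print("cleanup\n", .{});
  }

  pub fn main() void {
      // C's `for (i = 0; i < 10; i++)`: the `i++` in the header runs at the
      // end of each pass through the body. Zig's continue expression is the
      // same shape, just attached to `while`:
      var i: usize = 0;
      while (i < 10) : (i += 1) {
          // loop body
      }

      // `defer` is the same idea at block scope: written here, it runs when
      // the enclosing block exits, by whichever path it exits.
      {
          defer cleanup();
          std.debug.print("block body\n", .{});
      }
  }

In each case the reader sees, inside the block itself, exactly what will run and when.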
> It's literally a structured machine-independent assembler.
Not so much once you take the preprocessor into account.
> it's already perfect for what it is
I think that was true in 1970, not today. First, we know more. Second, we write much bigger programs. Third, "structured assembly" isn't as valuable as it once was, now that even machine code is "high level." This is not to say that C could be replaced everywhere, but I think that, inertia aside, it could be replaced almost everywhere.
> Its implicitness is zero -- there is zero information not available in the syntax of the current block
Ah, so your mind's window is a single block; that's what you keep banging on about. You know, blocks can be long, too. By the end of reading a 1000-line block, you think you know how it ends. But oops, you completely forgot about some "defer" which implicitly executes at the end of the block. That's my definition of "true explicitness": all code which executes at some point is available locally.
And if you forgo that definition of explicitness, why stop at the block? Exceptions work across blocks, i.e. at the level of the entire subroutine, and they're not much harder to reason about than "defer".
> In fact, the third clause in C's `for` header works almost exactly like defer
Good point. It's a clause in "for", so best practice is to use it for the loop iteration expression. It's also familiar syntax to entire generations of programmers, unlike Zig's "second clause in while", which is the kind of "original design" people will "thank" you for.
> it could be replaced almost everywhere.
The question is: with what? IMHO, Zig isn't good enough to replace C at all. Too much NIH, again. It does too much differently. The naive thinking is of course "differently, so that it's better", but actually it just repeats C's mistakes (ugliness in the language) and makes its own, "very original" ones.
> That's my definition of "true explicitness": all code which executes at some point is available locally.
But no control flow construct satisfies that. Jumps are the whole point, and the unit for structured control flow is the block (what's the next instruction after `break`, `continue` or `}`?). defer is just as structured, just as local, and just as explicit as C's control flow constructs.
> It's also familiar syntax to entire generations of programmers.
True, but learning Zig takes a day; two tops. Where it differs in syntax from C, it does so for good reason, and it's not like it's a complex language where different syntax is an additional burden. C's `for` syntax just doesn't make sense (`;`, which everywhere else in the language denotes sequential composition, means something different in the `for` header). Is an additional 2 minutes of learning, in a language that has very little syntax, not worth fixing something that's unpleasant in C?