Python 3 incorporated types into the language itself, in a similar way (though non-reified) to PHP. This seems much easier to deal with than requiring two files (.rb and .rbs) to describe a single data structure.
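To make the contrast concrete, here is a minimal sketch of what "types in the language itself" looks like in Python: the annotations live in the same file as the data structure, and they are recorded but not enforced at runtime (the class and field names here are just illustrative).

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

# The annotations are inspectable at runtime...
print(User.__annotations__)  # {'name': <class 'str'>, 'age': <class 'int'>}

# ...but not enforced: this does not raise, a checker like mypy
# would flag it before the program runs.
u = User(name="Ada", age="not a number")
```

Compare Ruby + RBS, where the same information would be split across a `.rb` file and a separate `.rbs` signature file.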
I'm having a really hard time understanding this "I need types forced down my throat" and "I like typing 3x as much as I would otherwise need to" and "yes, I want half my screen obscured by the types of everything I'm doing, not the actual code" and "adding types now means bugs are impossible" mass cult hysteria that's running so rampant. Typing very occasionally prevents bugs that are generally easy to catch and fix, or that show up straight away when running an app. It's mostly a documentation system. And it slows development down.
Especially in Ruby which is such an elegant "programmer's language" I think it would just be silly.
3x? Even in languages that don't support type inference I'd put it at 1.1x at most. And besides, type inference exists.
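As a sketch of how little extra typing inference leaves you with: in annotated Python, a checker such as mypy infers the types of local variables from context, so only the function signature carries explicit annotations (the function here is hypothetical).

```python
def average(xs: list[float]) -> float:
    # Only the signature is annotated; a checker infers the rest:
    total = sum(xs)   # inferred: float
    n = len(xs)       # inferred: int
    return total / n

print(average([1.0, 2.0, 3.0]))  # 2.0
```

Languages like Haskell or OCaml go further and can infer the signature itself.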
> adding types now means bugs are impossible
I usually see that as a misrepresentation of what type advocates say. The actual claim is just that types reduce the number of bugs.
> or show up straight away when running an app
Or bugs that show up only after said app has been running for a while, when a run-time type error surfaces after some particular sequence of actions. This is the main reason I avoid languages like Lua and Python.
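A minimal sketch of that failure mode (the function is hypothetical): in untyped code the bug hides in a rarely taken branch and only blows up at runtime, whereas with annotations a checker flags it before the program ever runs.

```python
def describe(count):
    # Bug lurks in a branch that almost never executes:
    if count > 100:
        return "many: " + count   # TypeError (str + int), but only at runtime
    return "few"

print(describe(5))    # works fine for a long time...
# describe(200) raises TypeError only once that action finally occurs.
# Annotating it as `def describe(count: int) -> str:` lets a static
# checker report the bad concatenation without running anything.
```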
(In addition, languages with more advanced type systems let you catch bugs such as buffer overflows or division by zero at compile time.)