On Sun, May 24, 2015 at 3:04 PM, Florin Mateoc <
[hidden email]> wrote:
> Hi all,
>
> Does anybody know why the arithmetic protocol was added to strings?
>
> Evaluating '2' > 1 raises a DNU
> Evaluating 1 < '2' answers true. Even worse, 1 < '2bla' answers true as well.
>
> This (the successful part) happens because of the implementation mentioned in the subject (plus the numeric
> parser's behavior). That implementation was not part of the original protocol as written by Dan Ingalls in 1998;
> it seems to have been added by Yoshiki in 2004. Was this done to address something in particular? To me it seems
> inconsistent, as the above examples show. There are other inconsistencies as well: the arithmetic operators mostly
> work - they were implemented in String by Yoshiki in 2004 as well - but e.g. 1s + '2' fails ('2' + 1s succeeds).
> And there is further funny behavior:
>
> '2' + 1 evaluates to 3
> '2' + '1' evaluates to '3'
> '2foo' + '1bar' evaluates to '3'
>
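(For context, the successful cases presumably go through the hook named
in the subject. A minimal sketch, assuming the standard Squeak
double-dispatch code - the body in your image may differ in detail:

    String >> adaptToNumber: rcvr andSend: selector
        "A Number's arithmetic failed on a String argument; convert
        myself to a number and retry the selector."
        ^ rcvr perform: selector with: self asNumber

    '2bla' asNumber.    "==> 2; the parser stops at the first
                         non-numeric character"

So 1 < '2bla' silently becomes 1 < 2 and answers true, while '2' > 1
still raises a DNU because String's own #> expects another string to
collate against.)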
>
> Presumably the failures could be made to work as well, but is this desirable? To me this seems as bad and arbitrary
> as JavaScript's automatic conversions (there, both comparisons mentioned above succeed, but 1 + '2' evaluates to
> '12' and '2' + 1 evaluates to '21', whereas 1 - '2' evaluates to -1 and '2' - 1 evaluates to 1. Just beautiful!).
+1. Isn't it crazy? We should push to fix it in 5.1.
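
If we do, the fix presumably amounts to deleting what the 2004 change
added. A sketch - the selector list is my guess at the affected
methods, so check the image before running it:

    "Remove the coercion hook and the arithmetic selectors from String
    (the list of selectors is an assumption, not verified). Afterwards
    1 < '2' and '2' + 1 raise errors instead of silently coercing."
    #(adaptToNumber:andSend: + - * /)
        do: [:sel | String removeSelector: sel]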