Yes, but this should be made *very* clear in the documentation ...
Marten
I think one could make a valid argument that the code points should be the source of the bits, e.g. ASCII characters in a Unicode16 string would only have their ASCII code points used, not the internal representation bits (a rough sketch of this follows below).

Richard Sargent via Glass <[hidden email]> wrote on 23 July 2018 at 21:02:
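To make that concrete, here is a rough sketch, in Python rather than Smalltalk purely for illustration; code_point_hash is a made-up name, not anything from GemStone/Glass. It hashes the code points themselves, so the same text hashes identically whether it is stored one byte or two bytes per character:

def code_point_hash(s):
    # 64-bit FNV-1a computed over code points, never over encoded bytes
    h = 0xCBF29CE484222325                            # FNV offset basis
    for ch in s:
        h ^= ord(ch)                                  # the code point, e.g. U+00FC for "ü"
        h = (h * 0x100000001B3) & 0xFFFFFFFFFFFFFFFF  # FNV prime, wrapped to 64 bits
    return h

# "Grüße" stored as Latin-1 (1 byte per character) and as UTF-16
# (2 bytes per character): the representation bits differ, the code
# points do not, so the hashes agree.
latin1 = "Grüße".encode("latin-1").decode("latin-1")
utf16 = "Grüße".encode("utf-16-le").decode("utf-16-le")
assert code_point_hash(latin1) == code_point_hash(utf16)

A hash taken over the raw representation bits, by contrast, would disagree for these two even though the strings are equal.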