I have a test in GBS that compares the #printString of a two-byte string on the server with that of an equivalent two-byte string on the client. It fails because the DBString implementation of #printString answers a single-byte String.
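For reference, the check amounts to roughly the sketch below. It is only a sketch: GBSM currentSession and #evaluate: are my assumptions about the GBS session API, the DBString construction is illustrative, and the codepoints are placeholders rather than the real test data.

    testDoubleByteStringPrintString
        "Hedged sketch of the failing check, not the actual test."
        | session serverPrint clientString |
        session := GBSM currentSession.
        "printString of a two-byte string, computed on the server"
        serverPrint := session evaluate:
            '(DoubleByteString new add: (Character withValue: 16rF1A7);
                add: (Character withValue: 16rF1D4); yourself) printString'.
        "printString of the equivalent two-byte string, built on the client"
        clientString := DBString
            with: (Character value: 16rF1A7)
            with: (Character value: 16rF1D4).
        "Fails: DBString>>printString answers a single-byte String."
        self assert: serverPrint = clientString printString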
(I doubt many people are using DBString, and almost certainly none are relying on its #printString, so this is primarily a warning note.)
DBString's #printString yields a String with 16(!) codepoints: 16r27 16rF1 16rA7 16rF1 16rD4 16rF1 16rE5 16rF1 16r27 16rF1 16r54 16rF1 16r65 16rFB 16r00 16r27. It turns out that double-byte codepoints that fit in 8 bits require only a single byte in the client representation.
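To make that last observation concrete, here is a minimal sketch in plain Smalltalk of how mixing one- and two-byte encodings produces an unexpected codepoint count. The high-byte-first ordering and the sample codepoints are assumptions for illustration, not the real data.

    "Codepoints at or below 16rFF take one byte; anything larger takes two."
    | codePoints bytes |
    codePoints := #(16r27 16rF1A7 16rF1D4 16rF1E5 16rF127 16rF154 16r27).
    bytes := OrderedCollection new.
    codePoints do: [:cp |
        cp <= 16rFF
            ifTrue: [bytes add: cp]
            ifFalse: [bytes add: (cp bitShift: -8); add: (cp bitAnd: 16rFF)]].
    bytes "a mixed one- and two-byte sequence of the kind dumped above"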