On Wed, Sep 16, 2009 at 12:17 PM, Ken Treis
<[hidden email]> wrote:
On Sep 16, 2009, at 12:02 PM, Douglas Brebner wrote:
One minor point is that whether integers and longs are different depends
on the platform's data model. Under MS Win64, ints and long ints are
both 32-bit, while on most Unixes ints are 32-bit and long ints are
64-bit (the LLP64 and LP64 models, respectively).
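For instance, a quick check like this (plain C, nothing specific to any VM) makes the difference visible:

#include <stdio.h>

int main(void) {
    /* LP64 (Linux, Mac OS X x86-64) prints: int=4 long=8 long long=8 ptr=8 */
    /* LLP64 (MS Win64) prints:              int=4 long=4 long long=8 ptr=8 */
    printf("int=%zu long=%zu long long=%zu ptr=%zu\n",
           sizeof(int), sizeof(long), sizeof(long long), sizeof(void *));
    return 0;
}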
Really? So sizeof(int) == sizeof(long) and sizeof(long) ~= sizeof(__int64)? A strange choice (see below) ;)
Every other 64-bit platform I've seen has used
sizeof(int) == 4 and sizeof(long) == sizeof(long long) == 8.
But one should insulate oneself using appropriate defines/typedefs, and include asserts to confirm basic types are of the size one assumes.
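Something along these lines, say (just a sketch of the usual C idioms, not code from any particular VM):

#include <stdint.h>  /* fixed-width typedefs: int32_t, int64_t, ... */
#include <assert.h>

/* Compile-time check: the array type is ill-formed (negative size)
   if the assumption is wrong, so compilation fails. */
typedef char assert_long_long_is_8_bytes[sizeof(long long) == 8 ? 1 : -1];

/* Run-time sanity checks at startup for the rest. */
static void checkBasicTypeSizes(void) {
    assert(sizeof(int32_t) == 4);
    assert(sizeof(int64_t) == 8);
    assert(sizeof(int) == 4);
}

int main(void) {
    checkBasicTypeSizes();
    return 0;
}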
I don't know if this is important, though.
Good point. All of my platforms are LP64 (Mac OS X and Linux x86-64), and my main objective at present is to get this working for an application I'm committed to building, so I was ignoring those sorts of cross-platform details.
Perhaps there would need to be new primitives reporting the size of each relevant C type? I'm anxious to hear what Eliot might have to say about this, since he's got about 2000x more experience with this than I do.
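Purely as a sketch of what the C side of such a primitive might compute (the name and type indices are invented here, not an existing VM interface):

#include <stddef.h>

/* Hypothetical helper: report sizeof() for each relevant C type so
   the image can choose the right accessors at startup. */
size_t platformSizeOfCType(int typeIndex) {
    switch (typeIndex) {
        case 0: return sizeof(char);
        case 1: return sizeof(short);
        case 2: return sizeof(int);
        case 3: return sizeof(long);
        case 4: return sizeof(long long);
        case 5: return sizeof(void *);
        default: return 0;  /* unknown type index */
    }
}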
Are you using the 64-bit VM? If you're using a 32-bit VM on a 64-bit platform then you're using 32 bits, and you'll be linking against 32-bit libraries, and hence nothing will have changed.
Alien has support for 64-bit values. Of course, it uses the Windows choice: short = 2, int unused as a name, long = 4, longLong = 8. E.g.
anAlien unsignedByteAt: oneRelativeIndex
anAlien unsignedShortAt: oneRelativeIndex
anAlien unsignedLongAt: oneRelativeIndex
anAlien unsignedLongLongAt: oneRelativeIndex
are the basic primitives for fetching unsigned integers.
But this needs to be wrapped with something that selects the right sizes based on the current platform, and that's yet to be addressed.