Help mapping large database objects to Smalltalk objects

Mariano Martinez Peck
Hi folks: I started to code the large object support for SqueakDBX and I want to ask you for help. As a first step, I will start with the basic large objects: CLOB, NCLOB, XML, BLOB.
Now, I need to map each of them to a Smalltalk object. I thought about this:

CLOB -> String
NCLOB -> String 
XML -> String, and the user does whatever they want with the string representing the XML.
BLOB ->  ByteArray

What do you think? Are there suitable objects in a core image? (I don't want to depend on another package.)
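
Just to make the idea concrete, the mapping could look roughly like this (the class and method names are made up, nothing of this exists in SqueakDBX yet):

DBXLargeObjectMapper >> convertValue: aString type: aTypeSymbol
    "Map the raw field value (already copied into a String) to the
    Smalltalk class proposed above: BLOBs become ByteArrays, the
    character-based types stay as plain Strings."
    aTypeSymbol = #BLOB ifTrue: [ ^ aString asByteArray ].
    ^ aString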

Now, do you know which is the fastest way to create a ByteArray from a String? And vice versa? Just asString?
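
(In a stock image there is at least asByteArray / asString, though I don't know if they are the fastest:)

'some clob data' asByteArray.        "String -> ByteArray"
#[104 101 108 108 111] asString.     "ByteArray -> String"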

Another question: to get the value of a field in a row, I call an OpenDBX function through FFI like this:

apiQueryFieldValue: handle index: index
    "const char* odbx_field_value( odbx_result_t* result, unsigned long pos )"
    <cdecl: char* 'odbx_field_value' (ulong ulong) module: 'opendbx'>
    ^self externalCallFailed

As you can see, that OpenDBX function answers the value of the field as a char*. The problem is: suppose the field is a large object (a CLOB, for example). A CLOB can be 1 GB or even more, and of course that would be bad for my image (freeze? crash? I don't know...). I would like to check the size of the value before bringing it into the image, so that it doesn't crash or whatever; I want to put a limit on it. The problem is that I don't know how to do that from the Pharo side. Is that possible, or is the only way to change the OpenDBX side?
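
If odbx_field_length is available in the OpenDBX version we bind to (I think it is, but please correct me if not), I imagine the guard could look roughly like the sketch below. apiQueryFieldLength:index:, fieldValueAt: and the 10 MB limit are just made up for the example, and it assumes the result handle is kept in an instance variable:

apiQueryFieldLength: handle index: index
    "unsigned long odbx_field_length( odbx_result_t* result, unsigned long pos )"
    <cdecl: ulong 'odbx_field_length' (ulong ulong) module: 'opendbx'>
    ^self externalCallFailed

fieldValueAt: index
    "Hypothetical guard: refuse to copy values bigger than 10 MB into the image."
    (self apiQueryFieldLength: handle index: index) > (10 * 1024 * 1024)
        ifTrue: [ ^ self error: 'field too large to fetch into the image' ].
    ^self apiQueryFieldValue: handle index: index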

Thanks a lot in advance,

Mariano
