Hi
What is the usual practice for reading rows from a database table? Is a loop usually written to read the next n rows / until the end? Does the arriving data have to be processed like a dictionary into a collection, where each object corresponds to a row and each key/value maps onto the object's accessors corresponding to columns?

E.g. in the NBSQLite3 interface tests I find the pattern below used, which looks like I need to process row by row:

    rs := db execute: 'select * from x'.
    row := rs next.
    ... := row at: <key> ...
    rs close

(I will be using Sqlite, PostgresV2 and ODBC drivers ... 3 separate projects.)

regards
Sanjay
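P.S. To make the "object per row" idea concrete, spelling that test pattern out in full, I imagine something like this (an untested sketch assuming db is an open connection; the persons table, the Person class and its #name: / #age: setters are made-up names, not part of any driver):

    "Collect every row into domain objects, one object per row (sketch)."
    | rs row people |
    people := OrderedCollection new.
    rs := db execute: 'select * from persons'.
    [ (row := rs next) notNil ] whileTrue: [
        "Each column value becomes state in a new object, keyed by column name."
        people add:
            (Person new
                name: (row at: 'name');
                age: (row at: 'age');
                yourself) ].
    rs close.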
|
Hi,

Let me clarify ... with a Postgres connection using Pharo4 + PostgresV2 this works:

    result := pgConnection execute: 'select * from mytable'.
    result rows    "printing this gives an OrderedCollection of the rows"

But how do I get this in Sqlite (Pharo4 + NBSqlite3)?

    result := nbsqliteConnection execute: 'select * from mytable'.
    result rows    "printing this gives a walkback"

Since I will be extending to other databases (definitely Oracle & MySQL), how can I figure out the best way to do Create / Retrieve / Update / Delete operations, each with multiple rows?

regards
Sanjay
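Ideally I would hide the differences behind one message of my own, something like this (a sketch of intent only; the wrapper class, its connection variable and the #isPostgres test are hypothetical, not driver API):

    rowsFromQuery: aString
        "Answer all rows as one collection, hiding the driver differences (sketch)."
        self isPostgres
            ifTrue: [ ^ (connection execute: aString) rows ]
            ifFalse: [ | rs row rows |
                "NBSQLite3-style cursor: walk the result set to the end."
                rows := OrderedCollection new.
                rs := connection execute: aString.
                [ (row := rs next) notNil ] whileTrue: [ rows add: row ].
                rs close.
                ^ rows ]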
|
I don't know if this is your case, but I remember our team having a problem with SQL selects.
The rows retrieved were very volatile: after the first read they disappeared. So the first step was to save/copy them into a Smalltalk object and then process them. (This was because the memory for each record was in a temporary buffer.)
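For illustration, the copy step was essentially this (a sketch; resultSet stands for the query result, the column names are examples from our schema, and #valueNamed: is the row protocol of the driver we used — adjust to yours):

    "Copy each volatile row into a plain Dictionary before the buffer is reused."
    | copies |
    copies := OrderedCollection new.
    resultSet rows do: [ :row |
        | copy |
        copy := Dictionary new.
        #('COD_CLIENT' 'NOM_COM') do: [ :col |
            copy at: col put: (row valueNamed: col) ].
        copies add: copy ]. |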
Sergio - which database driver are you using, and what is the expression to read in multiple rows?

Problem 1: in PostgresV2 I used

    res := postgresConnection execute: 'select * from df_co;'.

and then res rows seems to return an array with the rows. I have not tried it with a large result set, say 10000 rows, and want to know how that is to be handled.

Problem 2: in Sqlite I am unable to get the result set. @Pierce (NG): the SmalltalkHub page shows an example,

    db execute: 'SELECT * FROM BLOGPOSTS;'.

but how are the rows supposed to be accessed? In the tests I see they have to be walked through one by one, like a 3GL loop.

regards
Sanjay
|
I haven't worked with this part of the code myself, but I can show you how we did it.

> which database driver are you using and what is the expression to read in multiple rows

FreeTDS.

This is how we performed a query:

clientes
    | rows lista |
    rows := (self executeSQL: 'SELECT * from GVA14 order by NOM_COM') rows.
    lista := Dictionary new.
    rows do: [ :row |
        lista at: (row valueNamed: 'COD_CLIENT') put: (Cliente fromRow: row) ].
    ^ lista

executeSQL: aSQLStatement
    self ensureDBConnection.
    self dbType = 'MYSQL'
        ifTrue: [ ^ (self conn prepare: aSQLStatement) execute: #() ].
    self dbType = 'ODBC'
        ifTrue: [ ^ (self conn query: aSQLStatement) execute asTable ]
        ifFalse: [ ^ self conn execute: aSQLStatement ]

But the driver had several quirks, so we had to do this in ensureDBConnection:

ensureDBConnection
    [ self dbType isNil
        ifTrue: [
            self conn isConnected ifFalse: [ self conn connect ].
            self conn isOpen ifFalse: [
                [ self conn open ]
                    on: DBXError
                    do: [ self error: 'Database connection error.' ].
                self conn execute: 'SET ANSI_NULLS ON'.
                self conn execute: 'SET ANSI_WARNINGS ON' ].
            ^ (self conn execute: 'select 1') rows first rawValues first = '1' ]
        ifFalse: [ ^ self ensureDBConnectionFor: self dbType ] ]
        on: Error
        do: [ self error: 'Database connection error' ]
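For what it's worth, using it then looks like this ('000123' is an example customer code; Cliente and GVA14 come from our application, not from the driver):

    "Build the code -> Cliente map once, then look things up by code."
    lista := self clientes.
    (lista at: '000123') inspect. |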
In reply to this post by Sanjay Minni
On Thu, Apr 30, 2015 at 06:29:53AM -0700, Sanjay Minni wrote:
> the SmalltalkHub page shows an example,
>     db execute: 'SELECT * FROM BLOGPOSTS;'.
> but how are the rows supposed to be accessed? In the tests I see they
> have to be walked through one by one, like a 3GL loop.

Sanjay,

Here's how you can collect all rows:

    | db resultSet row coll |
    coll := OrderedCollection new.
    db := NBSQLite3Connection openOn: '/tmp/so.db'.
    [ resultSet := db execute: 'select * from posts limit 10'.
      [ (row := resultSet next) notNil ] whileTrue: [ coll add: row ] ]
        ensure: [ db close ].
    coll inspect.

Each row is an NBSQLite3Row instance. Use #at: with the column name as key to get at the data:

    coll fifth at: 'Title'
    ==> 'How do I calculate someone''s age in C#?'

With the collection you still have to loop through them, no? I am using NBSQLite3 to play with existing data, and I haven't worked with enough different types of data to attempt to generalize a looping construct, so I've always done by-row processing. Another reason is that, while my data sets aren't "Big Data", I am aesthetically inclined against reading everything into RAM. :-)

The above example uses the Sep 2011 StackOverflow data dump. The SQLite datafile created from the dump, with full text indexing, is 16GB, and the table posts has ~6.5 million rows.
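By-row processing is the same loop minus the collection, along these lines (a sketch; Transcript output stands in for whatever per-row work you need):

    | db resultSet row |
    db := NBSQLite3Connection openOn: '/tmp/so.db'.
    [ resultSet := db execute: 'select * from posts limit 10'.
      [ (row := resultSet next) notNil ] whileTrue: [
          "Handle one row at a time; nothing accumulates in image memory."
          Transcript show: (row at: 'Title'); cr ].
      resultSet close ]
        ensure: [ db close ].

HTH.

Pierce |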
In reply to this post by Sergio Fedi
Sergio -
Are you using DBX? There seem to be negative comments on DBX here:
http://forum.world.st/Oracle-on-Linux-td4816854.html#a4816876

For Windows, ODBC seems to be used successfully across many RDBMS engines here:
http://forum.world.st/Question-regarding-crashes-using-OpenDBXDriver-td4796332.html#a4797122

I am not sure about a stable approach on Linux ... which I am also looking for. Unfortunately RDBMS is central to my world.

regards
Sanjay
|
In reply to this post by Pierce Ng-3
Thanks Pierce - I got the drift of the loop, though I have actually tried it out with the ODBC<->Sqlite driver. However, in your notes below you mentioned that you used a dump of large size. My question is: when faced with such large row sets, how do you limit the size of the data read in the select statement? Select * from table would probably read in the entire table, whereas the limit clause would stop and not go beyond the number of rows specified.

Another doubt I had was about the example code below from NBSQLite3. Is rs close in the right place - would it be closing the connection?

    res := db beginTransaction.
    rs := db execute: '...'.
    rs close.
    res := db commitTransaction.

regards
Sanjay
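i.e. is the idea to page through with limit/offset, something like this (just a sketch of what I mean)?

    "Read one page at a time instead of the whole table (sketch)."
    rs := db execute: 'select * from posts limit 100 offset 0'.
    "... process these 100 rows, close rs, then repeat with offset 100, 200, ..."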
|
On Fri, May 01, 2015 at 04:23:26AM -0700, Sanjay Minni wrote:
> however in your notes below you mentioned that you used a dump of large
> size. My question is: when faced with such large row sets, how do you
> limit the size of the data read in the select statement?

This web page suggests how to deal with this:
https://www.sqlite.org/cvstrac/wiki?p=ScrollingCursor

> Another doubt I had was about the example code below from NBSQLite3.
> Is rs close in the right place - would it be closing the connection?
>
>     res := db beginTransaction.
>     rs := db execute: '...'.
>     rs close.
>     res := db commitTransaction.

rs holds an NBSQLite3ResultSet instance. Closing it does not close the database connection object, which is an NBSQLite3Connection instance. In my example I had failed to close the result set object. :-P

Resource finalization is still tricky at times. I have a few broken images lying around, mostly due to that.
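Spelling out the scrolling-cursor idea from that page in NBSQLite3 terms: seek past the last key seen instead of using OFFSET. A sketch (the Id column is from the posts table above; real code should use the driver's parameter binding rather than string concatenation):

    "Keyset paging: remember the last Id seen, seek past it each batch."
    | db rs row lastId count |
    lastId := 0.
    db := NBSQLite3Connection openOn: '/tmp/so.db'.
    [ [ count := 0.
        rs := db execute: 'select Id, Title from posts where Id > ',
                lastId printString, ' order by Id limit 100'.
        [ (row := rs next) notNil ] whileTrue: [
            count := count + 1.
            lastId := row at: 'Id'.
            "process the row here" ].
        rs close.
        count > 0 ] whileTrue ]
        ensure: [ db close ].

Pierce |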