> BTW: How big (old) can a single repository grow before it gets
> fragmented or corrupted too much to be practically useful anymore? I
> tend to publish even minor changes twice or three times daily and am
> wondering if this will really last for several years ....
>
> Andre

Fragmentation, risk of corruption and scalability should depend on the underlying database system and the server it runs on, less on Store itself. Postgres, Oracle, and DB2 should be able to handle large repositories. MS Access might not scale as well. I'm not sure about Firebird.

BTW, the Store schema for Firebird uses shorter string sizes for pundle comments, which makes replication from non-Firebird repositories difficult. I've switched to Postgres because of this, and I wouldn't recommend Firebird as the database system for Store for commercial use.

It can be difficult to scroll through large version lists when a bundle or package has many versions. That's especially bad if you publish a new version of a containing bundle whenever one of its packages changes, and if more than one developer works on parts of the same bundle. Such a policy leads to bundles having several times more versions than any of the packages they contain. Using "open bundles", as in the lineups described by Reinout Heeck (there was a second tool based on "open bundles" as well, if I remember a thread on this mailing list from several months ago correctly), keeps the number of bundle versions down to a manageable size.

Somebody has published index creation scripts for Store databases on either the Cincom Smalltalk wiki or the UIUC VisualWorks wiki - they should be easy to find. Installing those indexes can speed up some Store operations quite a bit. (BTW, does anybody know if those indexes have been included in the Store schema creation scripts in the meantime?)

If your repository grows too large (e.g. if you carry it around on a laptop), you can always run a garbage collection to remove unneeded versions, such as packages published as "work in progress" which are older than a year, or whatever selection suits your needs. Be sure to have a backup, though - but that's a good idea anyway. ;-)

HTH,
Joachim Geidel
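(For illustration only: the wiki scripts mentioned above are essentially plain SQL CREATE INDEX statements run against the Store tables. A minimal sketch of the idea - the table tw_method is mentioned later in this thread, but the indexed column name here is a hypothetical placeholder, not the actual Store schema:

    -- Sketch only: the column name is a placeholder; use the published
    -- wiki scripts for the real Store column definitions.
    CREATE INDEX tw_method_name_idx ON tw_method (name);
    -- The real scripts repeat this for the other Store tables they cover.
    -- On PostgreSQL, refresh planner statistics so new indexes get used:
    ANALYZE;
)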
[hidden email] wrote:
> Fragmentation, risk of corruption and scalability should depend on the underlying database system and the server it runs on, less on Store itself. Postgres, Oracle, and DB2 should be able to handle large repositories.

I'm using Postgres, which scales perfectly well. Anyway, it's not data fragmentation but rather logical corruption that I am worried about. I remember having problems with overrides that were included in two packages and could not be removed. The only solution was to create a new repository and publish all the code from a master image from scratch.

> [...]
> Somebody has published index creation scripts for Store databases on either the Cincom Smalltalk wiki or the UIUC VisualWorks wiki - they should be easy to find. Installing those indexes can speed up some Store operations quite a bit. (BTW, does anybody know if those indexes have been included in the Store schema creation scripts in the meantime?)

I scanned all the tables and found only a single index (tt_methodretrievalindex) on tw_method, that's all. It would be nice to have more indexes if they could speed up Store.

> If your repository grows too large (e.g. if you carry it around on a laptop), you can always run a garbage collection to remove unneeded versions, such as packages published as "work in progress" which are older than a year, or whatever selection suits your needs. Be sure to have a backup, though - but that's a good idea anyway. ;-)

Sounds great. Is there a standard way to purge old packages/bundles from Store?

Andre
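(For reference: on PostgreSQL, the check described above - listing which indexes already exist on the Store tables - can be repeated by querying the pg_indexes system view. The "tw" prefix in the pattern is assumed from the table name mentioned above:

    -- Lists every index on tables whose names start with "tw",
    -- i.e. the Store tables such as tw_method in this thread.
    SELECT tablename, indexname, indexdef
    FROM pg_indexes
    WHERE tablename LIKE 'tw%'
    ORDER BY tablename, indexname;
)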