Changing memory in VW for loading large case studies


Changing memory in VW for loading large case studies

Stéphane Ducasse
Hi all

I'm running some experiments with fingerprint, and I want to load the Squeak38 MSE file into Moose on VW7.6.
However, I run into memory problems (insufficient memory).
Does one of you know how I can change the initial memory of VW?

Stef
_______________________________________________
Moose-dev mailing list
[hidden email]
https://www.iam.unibe.ch/mailman/listinfo/moose-dev

Re: Changing memory in VW for loading large case studies

Richard Wettel
Hi Steph,

This mail from Joachim Geidel might help with tweaking the memory in VW.

Cheers,
Ricky

Begin forwarded message:

From: Joachim Geidel <[hidden email]>
Date: September 26, 2009 5:10:40 PM GMT+02:00
To: Jim Guo <[hidden email]>, vwnc <[hidden email]>
Subject: Re: [vwnc] any way to automatically set memory policy?

On 26.09.09 14:40, Jim Guo wrote:
Memory sizes are larger nowadays, and each time I build a new image I have
to set the Memory Policy manually. We also don't know in advance the exact
memory size of the computer that will run the program, so it seems better
to decide it automatically.
Is there a way? Thanks.

Yes, and no. :-)

You can save the configuration of the Runtime Packager, including settings
for MemoryPolicy parameters, for later use, such that you don't have to
enter the values every time you build an image (see page 21-21 in the
Application Developers Guide). But that addresses only part of the problem.

MemoryPolicy does not implement the ability to adapt its parameters to its
environment. But you can write a subclass of MemoryPolicy such that
- it computes memoryUpperBound dynamically instead of using a constant value
- it computes growthRegimeUpperBound as, e.g., 90% of memoryUpperBound, or
whatever seems reasonable
- it sets defaults for other parameters to values more appropriate to larger
memory sizes and the needs of your application (e.g., if your application
needs large amounts of memory, it is a good idea to set
preferredGrowthIncrement to a larger value than the default of 1 MB; I tend
to set it to 10 MB without thinking much about it).
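A minimal sketch of such a subclass, assuming what is described above: the
selectors memoryUpperBound, growthRegimeUpperBound and
preferredGrowthIncrement come from MemoryPolicy, while the class name
AdaptiveMemoryPolicy and the physicalMemorySize helper are invented for
illustration.

```smalltalk
"Hypothetical subclass; physicalMemorySize stands in for whatever
 OS call you use to query the installed RAM."
MemoryPolicy subclass: #AdaptiveMemoryPolicy
	instanceVariableNames: ''
	classVariableNames: ''
	poolDictionaries: ''
	category: 'MyApp-Support'

AdaptiveMemoryPolicy >> memoryUpperBound
	"Derive the bound from physical RAM, leaving headroom so the
	 image does not get swapped to disk."
	^self physicalMemorySize * 3 // 4

AdaptiveMemoryPolicy >> growthRegimeUpperBound
	"90% of memoryUpperBound, as suggested above."
	^self memoryUpperBound * 9 // 10

AdaptiveMemoryPolicy >> preferredGrowthIncrement
	"Grow in 10 MB steps instead of the 1 MB default."
	^10 * 1024 * 1024
```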

The MemoryPolicy has several places where it can be adapted. For example,
you can tweak the permitMemoryGrowth: method, the memoryMeasurementBlock or
the way memoryUpperBound is determined to achieve what you want.

It is relatively easy to detect how much physical memory your computer has
using the appropriate operating system calls, and it's a good idea to set
memoryUpperBound lower than that. VisualWorks applications can be extremely
slow when parts of their memory are swapped to disk, especially when running
global garbage collections.

You could also implement a SubSystem which reads parameters for the
MemoryPolicy from a configuration file and installs or configures the
currentMemoryPolicy accordingly in its startUp method.
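As a sketch of that idea, assuming a hypothetical subsystem class and a
one-number configuration file 'memory.cfg' holding the desired upper bound
in bytes; the exact receiver of currentMemoryPolicy and the availability of
a memoryUpperBound: setter should be checked in your image.

```smalltalk
"Illustrative only: class name, file name and file format are invented."
SubSystem subclass: #MemoryPolicyConfiguration
	instanceVariableNames: ''
	classVariableNames: ''
	poolDictionaries: ''
	category: 'MyApp-Support'

MemoryPolicyConfiguration >> startUp
	| file stream bound |
	file := 'memory.cfg' asFilename.
	file exists ifFalse: [^self].
	stream := file readStream.
	[bound := Number readFrom: stream] ensure: [stream close].
	ObjectMemory currentMemoryPolicy memoryUpperBound: bound
```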

You should also set the ObjectMemory's sizesAtStartup to larger values,
especially the sizes of eden and survivor space. My rule of thumb is to set
eden and survivor space to 10 times the default size, and then run the
TimeProfiler on parts of the application which are likely to produce lots of
temporary objects. If it shows that there are too many scavenges (garbage
collection runs in NewSpace), I increase them further until scavenges don't
decrease any more or time to execute the code starts increasing. You can
increase the other size parameters too, in particular stack space and
compiled code cache. I usually double them. If you don't use fixed space,
leave its size at the default value. Increase OldSpace headroom such that it
is large enough to hold the objects created after starting and initializing
your application, and a bit more. That way, your application will start a
bit faster, and it doesn't have to allocate more OldSpace segments during
initialization.
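The tuning described above can be condensed into a snippet like the
following; the slot order of the sizesAtStartup array assumed here (eden,
survivor space, large space, fixed space, stack space, compiled code cache)
is an assumption and should be verified against the ObjectMemory class
comment in your VW version.

```smalltalk
"sizesAtStartup holds multipliers applied to the built-in default
 sizes; changes take effect at the next image start."
| sizes |
sizes := ObjectMemory sizesAtStartup copy.
sizes at: 1 put: 10.0.   "eden: 10x the default"
sizes at: 2 put: 10.0.   "survivor space: 10x the default"
sizes at: 5 put: 2.0.    "stack space: doubled"
sizes at: 6 put: 2.0.    "compiled code cache: doubled"
ObjectMemory sizesAtStartup: sizes
```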

HTH,
Joachim Geidel


_______________________________________________
vwnc mailing list
[hidden email]
http://lists.cs.uiuc.edu/mailman/listinfo/vwnc



Re: Changing memory in VW for loading large case studies

jannik laval
In reply to this post by Stéphane Ducasse
Hi Stef,

You should run the VM from a terminal with the parameter "-memory 1500m" for 1500 MB of memory.

For example: 
./Squeak\ 4.2.2beta1U.app/Contents/MacOS/Squeak\ VM\ Opt -memory 1500m

This is what I use for loading all pharo packages.
---
Jannik Laval





Re: Changing memory in VW for loading large case studies

Stéphane Ducasse
In reply to this post by Richard Wettel
Thanks Ricky

Stef

On Apr 11, 2010, at 7:27 PM, Richard Wettel wrote:

> Hi Steph,
>
> This mail from Joachim Geidel might help with tweaking the memory in VW.
>
> Cheers,
> Ricky

