[ANN]: Next OpenSmalltalkVM Release Planned For February 1

[ANN]: Next OpenSmalltalkVM Release Planned For February 1

fniephaus
 
Hi all,
The last release is almost a year old, and now that the Pharo-VM is mostly merged back into the code base, it is a good time to work on the next stable release.
I had a chat with Eliot and Clement, and it looks like the Sista bytecode set/full block support and Ephemerons will be ready for release. We have decided to set the release date to February 1, which gives us more than a month to revise our release bundles and test the VMs.
Please find the corresponding GitHub milestone at [1]; the GitHub project at [2] should give a good overview of what needs to be done. There is still plenty to do, so any contributions are greatly appreciated!

Best,
Fabio

Re: [ANN]: Next OpenSmalltalkVM Release Planned For February 1

fniephaus
 
Dear all,
We've just reached an important point for the next release: our CI builds finally pass again [1]. Thanks to everyone who worked on this!
If you break the build, please fix it. If you want to try something out or work on a feature, please use a branch first. :)

Happy holidays!

Fabio

