Hey again!
(resending) Ok, sorry for all these repetitive posts about Packages, but I want everyone to feel that they *did* have a chance to affect the outcome before it jumped in their face :)

So, we just managed to merge my last low-level messing-abouts with Package, so now packages can have properties, and these properties get persisted in the MyPackage.js file and in the MyPackage.st file too. Let's rewind a bit so that everyone can follow:

Packages
========
Amber has packages. A package in Amber is a collection of classes + class extensions. A class extension is a "loose method" for a class in another package. This mechanism works just like in Monticello. This means that in Amber we do NOT have "class categories" - what you see in the leftmost pane in the browser IS the Package.

On disk a package is represented as three different files, say for the Kernel package:

js/Kernel.deploy.js
js/Kernel.js
st/Kernel.st

The first one is a javascript file with the compiled Amber classes in it. It does not contain the original Amber Smalltalk source code. The second one is the "superset" of all three - it has the compiled Amber classes + the Amber Smalltalk source code and other meta information. You can delete the other two files and still be a happy camper. The last file is ONLY the Amber Smalltalk source code, in chunk format. It can be compiled by the command line compiler amberc to produce the other two js files. Same goes here: you can delete the other two and still be a happy camper.

Package properties
==================
The first step in creating some loading mechanisms around packages would be to reify (cool word) packages, so that we can work with them in Smalltalk, and then add the ability to attach "meta information" to them. These two steps have been done. This means that there is a class called "Package" in the system, and you can find instances of Package in various ways, ask them stuff, etc.

For example, which package does class Number belong to?

Number package ====> Kernel

How many classes are in the Kernel package?

(Smalltalk current packageAt: 'Kernel') classes size ====> 35

What packages do we have?

Smalltalk current packages ====> a Array (Kernel Canvas Compiler IDE SUnit Examples Benchfib Kernel-Tests)

Now, we also would like to attach meta information and have that meta information persisted in the three files described above. That works too, so now you can:

(Smalltalk current packageAt: 'Kernel') propertyAt: 'version' put: '1.2'

...and after this we can verify what properties we have:

Number package properties ====> a Dictionary('version' -> '1.2')

...if we now commit the Kernel package and take a look at Kernel.st, we find the first line saying:

Smalltalk current createPackage: 'Kernel' properties: #{'version'->'1.2'}!

Thus, every package file will have such a first line declaring the package and its properties using the Amber style literal Dictionary. And in Kernel.js it looks like this:

smalltalk.addPackage('Kernel', {"version":"1.2"});

Now, if you have been following my steps you can reload the IDE and verify that this did not disappear on us. :)

Next step
=========
So, now we can proceed to the next step, and this is where our code ends. But I want to describe what I want to build. The idea is that a given Package can list its dependencies *by name*. We are not going into versions of packages at this point, and perhaps never will - not explicitly at least.
So if MyPackage depends on Foo and Bar I can do:

(Smalltalk current packageAt: 'MyPackage') dependencies: #('Foo' 'Bar')

And checking properties it says:

(Smalltalk current packageAt: 'MyPackage') properties

Hehe, ok, so obviously there is a bug here - we get the dependencies but also some other stuff. Let's ignore that.

Now, this is the model I have in mind:

- Build a package scanner mechanism that can read only the first line of a package file, so that we can get the properties without actually loading the package.
- Build a Resolver class that can fetch package files either from direct URIs or simply by package name, in which case it will "look" in an ordered list of places (yeah, kinda like CLASSPATH, but for packages).
- Then we simply recurse, scan, and load the packages in reverse order. :)

These mechanisms should give us enough "rope" to start publishing packages (just a URL, and we can include it as a property so that we know where "upstream" is) and to publish repositories of packages (just a URL where we can find an Index.html listing them all). And then, using the Resolver, we should be able to find and load packages using simple dependencies.

regards, Göran
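A rough sketch of how that recursion could hang together - PackageResolver, #scanHeaderOf: and #fetchAndEvaluate: are all made-up names, and the 'dependencies' property key is an assumption, so treat this as an illustration of the model rather than existing Amber code:

PackageResolver >> load: aPackageName
    "Read only the first createPackage: line to get the properties,
     load the dependencies first, then the package itself.
     (No cycle detection in this sketch.)"
    | properties |
    properties := self scanHeaderOf: aPackageName.
    (properties at: 'dependencies' ifAbsent: [#()])
        do: [:each | self load: each].
    self fetchAndEvaluate: aPackageName

Loading each dependency before the package itself is what gives the "reverse order" above; a real version would also need the ordered list of places to look in.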
Beautiful! I like the fact that properties are just entries in a dictionary - this gives infinite freedom, like having a property be a block or nastier things :)
As it's been pointed out a couple of times, it'd be great to have a common default repo for all projects, some kind of little-scale CPAN for Amber.
This was attempted in Pharo, but I think it was implemented too late and everybody already had their own repositories scattered all over the web. Now, to load a package in Pharo, you first have to check the MetacelloConfigurations repo; if it's not there you move to PharoNonCorePackages; if it's still not there you move to the squeaksource web and search for ConfigurationOfTheProjectImLookingFor; and if you still fail, you search for TheProjectImLookingFor and install it straight away from Monticello... Let us start fresh and get a public shared repo that is the default, hard-coded repo for everyone. In the same way that we can now commit to the server and to the browser, we could add a "Public Commit" feature, which would commit to the public shared repo by default.
What do you think?

Cheers!
Bernat Romagosa
On 10/21/2011 09:55 AM, Bernat Romagosa wrote:
> Beautiful! I like the fact that properties are just entries in a
> dictionary, this gives infinite freedom, like having a property be a
> block or nastier things :)

Yeah, it's powerful and simple. BUT... a block will not work, sorry. In fact, it is currently quite constrained to the intersection of:

- Objects that can be represented in javascript object syntax.
- Objects that implement storeOn: (please add more)

So nah, hardly infinite freedom. But freedom at least. :)

> As it's been pointed out a couple of times, it'd be great to have a
> common default repo for all projects, some kind of little-scale CPAN for
> Amber.
>
> This was attempted in Pharo but I think it was implemented too late and
> everybody had their own repositories scattered all over the web already.
> Now to load a package in Pharo you first have to check the
> MetacelloConfigurations repo, if it's not there you move to
> PharoNonCorePackages, and if it's still not there you move to the
> squeaksource web and search for ConfigurationOfTheProjectImLookingFor,
> if you still fail, you search for TheProjectImLookingFor and install it
> straight away from Monticello...
>
> Let us start fresh and get a public shared repo that is the default,
> hard-coded repo for everyone. In the same way we now can commit to the
> server and to the browser, we could add a "Public Commit" feature, which
> would commit to the public shared repo by default.
>
> What do you think?

I agree 110%. And I have had this vision that when you fire up Amber from amber-lang.net you should *immediately* be able to code away, and:

All changes are automatically logged using local storage. So if you accidentally hit F5 or whatever, it is all there. Nothing lost.

Whenever you feel like it you can:

- Save packages locally. We could just add a "download as zip".
- Commit using the current webDAV mechanism, of course.
- Commit to "the Amber public cloud".

Now, this last mechanism would require a login and then at least a selection of "public" or "private". Setting up this cloud using a NoSQL db on a server is really not hard work.

I just recently bought a server at dediserve.com and gladly use it to host this - there is plenty of Gb disk space to spare.

So, if anyone tackles this I am gladly helping and hosting! And if no one steps up I will eventually get there. ;)

regards, Göran
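To make that storeOn:/JSON constraint concrete - the property keys and values here are just made-up examples, only #propertyAt:put: itself comes from the mail above:

| pkg |
pkg := Smalltalk current packageAt: 'Kernel'.
pkg propertyAt: 'tags' put: #('core' 'kernel').    "an Array of Strings has both a JSON form and a storeOn: form"
pkg propertyAt: 'check' put: [:x | x]    "a block does not - it will not survive the round-trip into Kernel.js / Kernel.st"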
Hi Göran,
On Friday, 21 October 2011 10:07:54 UTC+2, Göran Krampe wrote:
Not sure if this complements or fits with your work - I'm currently working on some kind of "hosted environment" for Amber. I'm using a free plan on https://www.dotcloud.com , configured with node.js and mongoDB. My original goal was to work on server-side Amber - I have a minimal server.js which loads FileServer.st on startup of node via the "Importer new import:" mechanism. I modified FileServer.st so that:

1. <script>smalltalk.runsOnNode = true;</script> is added to html files (so I can check on the frontend whether Amber is running on node - this leads to a new button in the IDE which makes it possible to...)
2. ...load/reload a selected package on the server.
3. I implemented a "handlePOSTRequest:" method that executes static methods on the server. The request URL specifies what to execute ("/MyClass?myMethod"); the data to submit is in the POST request payload.

My next steps would be to implement an example using MongoDB (I thought of yet another blogging software). Further things for the future would be server-side user sessions and transactions.

What do you think?
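Roughly what that dispatch might look like on the Smalltalk side - the #url and #payload accessors on the request, the lookup via Smalltalk, and the way the payload is handed over are all assumptions for illustration, not Stefan's actual code:

FileServer >> handlePOSTRequest: aRequest
    "Map '/MyClass?myMethod' to a class-side send carrying the POST payload
     (the appended colon assumes a one-argument selector)."
    | parts className |
    parts := aRequest url tokenize: '?'.    "e.g. #('/MyClass' 'myMethod')"
    className := parts first copyFrom: 2 to: parts first size.    "drop the leading '/'"
    ^ (Smalltalk current at: className)
        perform: (parts last, ':') asSymbol
        withArguments: (Array with: aRequest payload)

So a POST to '/MyClass?myMethod' with a body would end up as something like "MyClass myMethod: payload" on the server.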
Hey!
On 10/21/2011 10:45 AM, Stefan Krecher wrote:
> Hi Göran,
>
> On Friday, 21 October 2011 10:07:54 UTC+2, Göran Krampe wrote:
>
> Whenever you feel like it you can:
>
> - Save packages locally. We could just add a "download as zip".
> - Commit using the current webDAV mechanism, of course.
> - Commit to "the Amber public cloud".
>
> Now, this last mechanism would require a login and then at least a
> selection of "public" or "private". Setting up this cloud using a NoSQL
> db on a server is really not hard work.
>
> I just recently bought a server at dediserve.com and gladly use it to
> host this - there is plenty of Gb disk space to spare.
>
> So, if anyone tackles this I am gladly helping and hosting! And if
> no one steps up I will eventually get there. ;)
>
> Not sure if this complements or fits with your work - I'm currently
> working on some kind of "hosted environment" for Amber.
> I'm using a free plan on https://www.dotcloud.com , configured with
> node.js and mongoDB. My original goal was to work on server-side Amber -
> I have a minimal server.js which loads FileServer.st on startup of
> node via the "Importer new import:" mechanism. I modified FileServer.st so that
> 1. <script>smalltalk.runsOnNode = true;</script> is added to html files
> (so I can check on the frontend whether Amber is running on node - this
> leads to a new button in the IDE which makes it possible to
> 2. load/reload a selected package on the server.
> 3. I implemented a "handlePOSTRequest:" method that executes static
> methods on the server. The request URL specifies what to execute
> ("/MyClass?myMethod"); the data to submit is in the POST request payload.
>
> My next steps would be to implement an example using MongoDB (I thought
> of yet another blogging software).
> Further things for the future would be server-side user sessions and
> transactions.
>
> What do you think?

Well, it definitely overlaps, of course. But I was only aiming at "package hosting" to begin with, not actual "running apps" hosting. Thus I basically just want:

- A register/login procedure in Amber. As simple as possible.
- Commit packages to the cloud, public or private.
- Packages then available as, say, http://amber-lang.net/packages/gokr/MyPublicPackage.js
- ...and since we typically store them in a NoSQL db, we can later do searching etc.

The architecture should be simple and the code should be open sourced. So basically anyone can host a "cloud", but of course we want people to just use amber-lang.net to keep things "simple".

regards, Göran
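Tying that back to the "upstream as a property" idea from the first mail, recording where a package was published needs nothing beyond the existing property mechanism - the 'upstream' key and the exact URL here are only illustrative:

(Smalltalk current packageAt: 'MyPublicPackage')
    propertyAt: 'upstream'
    put: 'http://amber-lang.net/packages/gokr/MyPublicPackage.js'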
Hi,
so maybe this would be a good showcase - seems to be more useful than implementing yet another blogging software ...
In the first step I would do account creation and commit packages (name, version, creator, date, pub/priv) via HTTP POST to Amber on node, into a MongoDB.
Am I missing something?

regards, Stefan

On Friday, 21 October 2011 13:30:05 UTC+2, Göran Krampe wrote:
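As a sketch of what such a package document could hold, written with the Amber literal dictionary syntax from Göran's first mail - every field name here is a guess, not a decided schema:

#{
    'name' -> 'MyPublicPackage'.
    'version' -> '1.2'.
    'creator' -> 'user@example.com'.
    'date' -> '2011-10-22'.
    'public' -> true.
    'st' -> 'the chunk-format source itself, if the files end up in the db too'
}

Whether the .js and .deploy.js files should go into the db as well is exactly the question that comes up further down the thread.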
On 10/21/2011 04:16 PM, Stefan Krecher wrote:
> Hi,
> so maybe this would be a good showcase - seems to be more useful than
> implementing yet another blogging software ...
> In the first step I would do account creation

yeah, and let's make it very simple. Say email address as username perhaps (thus unique by definition), and let the user set a password somehow, etc.

> and commit packages
> (name, version, creator, date, pub/priv) via HTTP POST to Amber on node
> into a MongoDB.
> Am I missing something?

Nope. :) Except perhaps I would have picked Riak or CouchBase or something, but hey, it doesn't matter! Mongo is cool too.

Also, addressing package files by URL and getting a *list* of packages in the directory etc. will be nice. I am not sure - does Mongo have a good HTTP api?

regards, Göran
On Friday, 21 October 2011 16:44:50 UTC+2, Göran Krampe wrote:
There is a simple built-in REST API and several extensions. But I would tend to serve everything via node.

regards, Stefan
True, gives control.
regards, Göran

-- Sent from my Palm Pre 2, wohoo!

On Oct 21, 2011 17:16, Stefan Krecher <[hidden email]> wrote:
On Friday, 21 October 2011 16:44:50 UTC+2, Göran Krampe wrote:
There is a simple built-in REST API and several extensions. But I would tend to serve everything via node.
regards, Stefan
On 10/21/11, Göran Krampe <[hidden email]> wrote:
> True, gives control.
>
> regards, Göran
>
> -- Sent from my Palm Pre 2, wohoo!
>
> On Oct 21, 2011 17:16, Stefan Krecher <[hidden email]> wrote:
>
> On Friday, 21 October 2011 16:44:50 UTC+2, Göran Krampe wrote:
>>
>> Also, addressing package files by URL and getting a *list* of packages in
>> the directory etc. will be nice. I am not sure - does Mongo have a good
>> HTTP api?
>
> There is a simple built-in REST API and several extensions. But I would tend
> to serve everything via node.
> regards,
> Stefan

Then it is no problem at all to have Mongo as the database!

--Hannes
No, Mongo is perfectly fine. I am not even sure if I would store the actual package in a db or only store meta info. Either way, as long as someone hacks this up and makes packages and lists of packages available on URIs, then I am all happy.
regards, Goran

-- Sent from my HP TouchPad

On Oct 21, 2011 7:06 PM, H. Hirzel <[hidden email]> wrote:
Then it is no problem at all to have Mongo as the database!
--Hannes
Hi,
currently hacking on this - I already got a simple account-creation and login mechanism.
Now it comes to persisting source code to the db. Should all three files (js, deploy.js and .st) get persisted? Or wouldn't it be enough to just persist the .st file?

regards, Stefan

On Saturday, 22 October 2011 15:12:02 UTC+2, Göran Krampe wrote:
No, Mongo is perfectly fine. I am not even sure if I would store the actual package in a db or only store meta info. Either way, as long as someone hacks this up and makes packages and lists of packages available on URIs, then I am all happy.
On 10/26/2011 04:43 PM, Stefan Krecher wrote:
> Hi,
> currently hacking on this - I already got a simple account-creation and
> login mechanism.
> Now it comes to persisting source code to the db. Should all three files
> (js, deploy.js and .st) get persisted?
> Or wouldn't it be enough to just persist the .st file?

I would persist all three, I think. If Amber changes (and thus the Compiler, perhaps), then we cannot reproduce the original compiled files. I also would like to load packages "fast" (the js version), without having to run them through the Compiler.

regards, Göran
On Wednesday, 26 October 2011 16:54:41 UTC+2, Göran Krampe wrote:
Ok. Btw, there seems to be a bug (feature): packages with no properties (nil) can't get exported with the ChunkExporter. Only setting an (empty) Dictionary solves the problem. But maybe that's only a problem with "old" packages that were just copied from an old (before the new Packages) src-tree ...

regards, Stefan
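A one-liner workaround along those lines, assuming Package has a #properties: setter (the package name is made up; if there is no such setter, putting any dummy value via #propertyAt:put: should have the same effect):

(Smalltalk current packageAt: 'MyOldPackage') properties: Dictionary new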