Hi,

Last month Amazon extended their serverless platform AWS Lambda with support for custom runtimes. I created a Pharo Lambda runtime, so now we can implement Lambda functions in Smalltalk and easily deploy them on the Lambda platform. Lambda has quite a large free tier, more than enough to do some experiments and to host small applications for free.

See the GitHub project for more details: https://github.com/jvdsandt/pharo-aws-toolbox

Cheers,
Jan.
Cool! thanks! On Thu, Dec 27, 2018 at 10:32 AM Jan van de Sandt <[hidden email]> wrote:
In reply to this post by Jan van de Sandt
This sounds really great. Something I'd like to experiment with. Thanks. cheers -ben On Thu, 27 Dec 2018 at 17:32, Jan van de Sandt <[hidden email]> wrote:
In reply to this post by Jan van de Sandt
Nice job. Well documented. Thank you.
In reply to this post by Jan van de Sandt
Cool - I was using a JS shim and had asked AWS many times why they couldn’t open it up wider...
Now I’m back from my travels, I’ll reincarnate my previous work and see how it works with this. I was looking forward to doing more with Lambda, so this is great timing.

Tim
Sent from my iPhone
In reply to this post by Jan van de Sandt
On Thu, Dec 27, 2018 at 10:32:03AM +0100, Jan van de Sandt wrote:
Very nice! Thanks for making this.

Pierce
In reply to this post by Tim Mackinnon
Hi Tim,

Yes, I read that you got Pharo working via the JavaScript runtime. It should now be much easier and faster. I still have to figure out the best way to create a deployment image. With the new bootstrap/modular setup of Pharo 7 it should be possible to create a lean-and-mean runtime image that can run in the cheapest 128 MB RAM configuration.

Jan.

On Thu, Dec 27, 2018 at 2:18 PM Tim Mackinnon <[hidden email]> wrote:
Hi Jan - reading through your docs, this looks very promising, as I hadn’t got as far as using the API Gateway - I was just connecting to the internal Alexa service.
One thing I didn’t quite understand - you mention specifying a HANDLER as an env variable, but your example description doesn’t seem to show where that is set. Is it in the bootstrap file (also, is that bootstrap.sh or just simply bootstrap? And are there any file attributes to set on that file?). It strikes me that rather than using an env variable, why don’t you specify the handler class when you invoke the image, as the last command-line option? (That is what I did in my example - or is it faster to query an env variable? Personally I find it easier to see it more explicitly on the command line.)
I also notice you include NeoJSON - did you need that? I found that the default STON reader was more than adequate for reading the JSON that was sent over (and so it’s one less prerequisite).
I haven’t yet fully understood the runtime layer - is it simply a zip file with the VM + files and without the image? Previously I had to add all of that in the zip I uploaded for each function, but this sounds like it simplifies things a lot. Do you have a script you used to create it? I ask because I found that trimming down its size made a difference to load times for Alexa (e.g. there are lots of sound and graphics DLLs you can remove - which I have in my script, and possibly I could add to what you have done).
Equally, I also figured out the command-line fu for uploading and registering your Lambda so it works in a CI - this might also be worthy of inclusion.
Anyway, I’ll give it a go and see how the results compare - it was surprisingly fast using the JS shim - but this seems like a much better solution.
Thanks for sharing - it’s an exciting world.
Tim
On Fri, 28 Dec 2018, at 11:35 AM, Jan van de Sandt wrote:
Hi Tim, On Fri, Dec 28, 2018 at 4:49 PM Tim Mackinnon <tim@testit.works> wrote:
Up till now I have configured my Lambda functions mostly via the AWS Console. When you create your function you must upload your code, specify the type of runtime and give the name of the handler. For a Java runtime this is the name of the main class, so for the Smalltalk implementation I chose something similar. If you want, you can leave this item empty or set it to "Provided" and just hardcode the startup class in the image or in the bootstrap file.
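To answer the bootstrap questions concretely: AWS requires the entry point of a custom runtime to be an executable file named exactly `bootstrap` (no extension) with the execute bit set, and any environment variables configured on the function (such as a HANDLER variable) are already present when Lambda starts that process. A minimal sketch of such a file, assuming the VM lives under `/opt/pharo` via a runtime layer and that the image understands a hypothetical `lambda` command-line handler:

```shell
# Write a candidate "bootstrap" file. The /opt/pharo paths and the
# "lambda" command-line handler are assumptions for illustration;
# adjust them to match the actual toolbox layout.
cat > bootstrap <<'EOF'
#!/bin/sh
# Lambda invokes this executable; it must be named exactly "bootstrap"
# and be chmod +x. The function's env config (e.g. HANDLER) is
# already in the environment here, so the image can read it directly.
cd "$LAMBDA_TASK_ROOT"
exec /opt/pharo/pharo --headless "$LAMBDA_TASK_ROOT/Pharo.image" lambda
EOF
chmod +x bootstrap
```

The `exec` matters: it replaces the shell with the VM process, so Lambda talks to the image directly instead of through an intermediate shell.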
With NeoJSON you can create mappings to serialize/deserialize custom Smalltalk classes to JSON. I use this functionality in the CloudWatch Logs interface to handle the request and response objects. I don't think this is possible with STON.
Yes, the layers are a great new feature. My layer just includes a standard 64-bit Pharo VM without any additions or removals. I made this one by hand; a script would be a better idea! You can remove shared libraries that you don't need. I think the most important thing is that the image does not load/initialize any unnecessary libraries.
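A staging step along the lines Tim describes could look like this. The directory names and plugin name patterns are guesses for illustration; check which plugins your image actually loads before deleting anything:

```shell
# Sketch: copy a VM directory into a staging area for a runtime layer
# and drop plugins a headless Lambda image never touches (sound, 3D).
stage_layer() {
  src="$1"; stage="$2"
  rm -rf "$stage"
  mkdir -p "$stage"
  cp -R "$src"/. "$stage"/
  # Pattern list is an assumption; extend it after checking your image.
  find "$stage" -type f \( -name '*Sound*' -o -name '*B3D*' \) -delete
}

# Example (assumes a local pharo-vm directory with the 64-bit VM):
# stage_layer pharo-vm layer/pharo
```

Lambda extracts layer zips under `/opt`, so zipping the staged tree from inside `layer/` (e.g. `cd layer && zip -r9 ../runtime-layer.zip .`) makes the VM land at `/opt/pharo` in the function's environment.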
Yes, a CI job that can build a 'deployment' image and that can upload this image to AWS Lambda would be a great feature!
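The AWS CLI side of such a CI job is straightforward; a hypothetical deploy script might contain something like the following (the function name, layer name, and zip file names are placeholders, and actually running it requires AWS credentials to be configured):

```shell
# Write a candidate CI deploy script; all resource names are
# illustrative placeholders, not from the toolbox itself.
cat > deploy.sh <<'EOF'
#!/bin/sh
set -e
# Publish a new version of the VM layer (extracted under /opt at runtime).
aws lambda publish-layer-version \
    --layer-name pharo-vm \
    --zip-file fileb://runtime-layer.zip
# Upload the function package containing the image and the bootstrap file.
aws lambda update-function-code \
    --function-name my-pharo-function \
    --zip-file fileb://function.zip
EOF
chmod +x deploy.sh
```

Custom runtimes use the `provided` runtime identifier when creating the function in the first place (`aws lambda create-function --runtime provided ...`).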
Thanks for your comments! Jan.
Ah - so is the HANDLER variable set via the AWS console? (If so, then I’d agree that using the env var makes more sense).
It also sounds like NeoJSON makes sense for the mapping - I hadn’t needed it before in my usage, and I don’t think it’s very heavyweight anyway. For debugging, the serialisation of the stack to a bucket is so cool (kudos to Fuel for that), but CloudWatch is a useful basic tool and it does let you hook up to things like Datadog etc., so we fit easily into the cloud ecosystem. I need to get things up and running again and will report back. Thanks again.

Tim
In reply to this post by Jan van de Sandt
Great work!
Doru

--
www.feenk.com

"You can inspect and adapt only what is explicit."