You can use an Objective-C bridge proxy object (ObjectiveCSqueakProxy) to trigger Smalltalk code to run on UI interaction.
It's explained in a talk I gave at ESUG 09 last year.
http://vst.ensm-douai.fr/ESUG2009Media/uploads/1/esug09TalkIphoneFinal2.pdf

A more current example is the Scratch.app, which we just added remote-sensor and iPad support to.

http://itunes.apple.com/us/app/scratch/id358266270?mt=8

In Scratch.app the presentation view controller exposes the Scratch desktop, the Scratch image updates the project text comments,
and a proxy object handles the green GO and red STOP icons to run/stop the project. However, the zoomability, web, and keyboard icons are all under
the control of the presentation-space view controller. The login screen and webview are separate view controllers controlled by the Objective-C program.
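To give a feel for the pattern, here is a minimal Objective-C sketch of how such a bridge proxy might queue up Smalltalk messages when UIKit controls are tapped. All of the names here (the queue mechanism, the selectors, the polling method) are hypothetical illustrations of the idea, not the actual ObjectiveCSqueakProxy API shipped in Scratch.app:

```
#import <Foundation/Foundation.h>

// Hypothetical sketch: UIKit actions push message names onto a queue;
// the Squeak image polls the queue (e.g. from a primitive) and performs
// the corresponding Smalltalk code on its side.
@interface ObjectiveCSqueakProxy : NSObject {
    NSMutableArray *pendingMessages;  // message names awaiting the image
}
- (void)goTapped:(id)sender;          // wired to the green GO icon
- (void)stopTapped:(id)sender;        // wired to the red STOP icon
- (NSString *)nextPendingMessage;     // polled from the Squeak side
@end

@implementation ObjectiveCSqueakProxy
- (id)init {
    if ((self = [super init]))
        pendingMessages = [[NSMutableArray alloc] init];
    return self;
}
- (void)goTapped:(id)sender {
    @synchronized (self) { [pendingMessages addObject:@"startProject"]; }
}
- (void)stopTapped:(id)sender {
    @synchronized (self) { [pendingMessages addObject:@"stopProject"]; }
}
- (NSString *)nextPendingMessage {
    @synchronized (self) {
        if ([pendingMessages count] == 0) return nil;
        NSString *m = [[pendingMessages objectAtIndex:0] retain];
        [pendingMessages removeObjectAtIndex:0];
        return [m autorelease];
    }
}
@end
```

The point is only the shape of the bridge: the view controller owns the buttons and the target-action wiring, while the image decides what "start" and "stop" actually mean in Smalltalk.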
On 2010-04-06, at 2:09 PM, Lawson English wrote:
> John McIntosh wrote:
>> Actually it's six apps and two expose the morphic desktop
>>
>>
>
> Does that include the ability to script buttons and the like using squeak?
>
>
> Lawson
--
===========================================================================
John M. McIntosh <
[hidden email]> Twitter: squeaker68882
Corporate Smalltalk Consulting Ltd.
http://www.smalltalkconsulting.com
===========================================================================