A Pipe to the Future

A Pipe to the Future

pwl
Hi,

I am wondering if the following would be possible and if so how?

aProcess := [ "a block that pumps out continuous values"] fork.
bProcess := [:in | in someOperation ] fork.

"pipe the results of process a into process b like in unix shell"
aProcess asPipeInto: bProcess.

or

aProcess | bProcess

Would a shared queue be needed? Or some other concurrency control
mechanism for sharing the output results.

Unfortunately blocks don't pump out multiple values... or fortunately
they don't.

Just a thought.

Cheers,

Peter




Re: A Pipe to the Future

Mathieu SUEN
Hi,

On Sep 6, 2007, at 2:02 AM, Peter William Lount wrote:

> Hi,
>
> I am wondering if the following would be possible and if so how?
>
> aProcess := [ "a block that pumps out continuous values"] fork.
> bProcess := [:in | in someOperation ] fork.

#fork resumes the process immediately; you could use #newProcess instead, which creates the process in a suspended state.
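
For example (an untested sketch; the block body and priority are placeholders, not from the thread):

| queue producer |
queue := SharedQueue new.
producer := [ 1 to: 10 do: [:i | queue nextPut: i * i] ] newProcess.
producer priority: Processor userBackgroundPriority.
"nothing has run yet; the producer only starts now"
producer resume.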

I would rather think about something like this:

[ "a block that pumps out continuous values"]  | [:in | in  
someOperation ]

and you could add something like:

[ "a block that pumps out continuous values" ] > anOutputStream
[:in | in someOperation ] < anInputStream


        Mth

>
> "pipe the results of process a into process b like in unix shell"
> aProcess asPipeInto: bProcess.
>
> or
>
> aProcess | bProcess
>
> Would a shared queue be needed? Or some other concurrency control  
> mechanism for sharing the output results.
>
> Unfortunately blocks don't pump out multiple values... or  
> fortunately they don't.

You could call #value several times, unless the block has a return  
statement (depending on the special case...).

>
> Just a thought.
>
> Cheers,
>
> Peter
>
>
>



Re: A Pipe to the Future

Michael van der Gulik-2
In reply to this post by pwl


On 9/6/07, Peter William Lount <[hidden email]> wrote:
> Hi,
>
> I am wondering if the following would be possible and if so how?
>
> aProcess := [ "a block that pumps out continuous values"] fork.
> bProcess := [:in | in someOperation ] fork.
>
> "pipe the results of process a into process b like in unix shell"
> aProcess asPipeInto: bProcess.
>
> or
>
> aProcess | bProcess


It's easier to do it directly.

keepGoing := true.
s := SharedQueue new.
[ 1 to: 100 do: [ :each | s nextPut: each ] ] fork.
[ [keepGoing] whileTrue: [Transcript show: s next asString, ' ']. ] fork.

And then:
keepGoing := false. "To stop the second process."

The SharedQueue acts like the pipe. You could make your own class to add a layer of abstraction, but this would only be worthwhile if you are having difficulty with the complexity.
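
One possible variation (an untested sketch of my own, not part of the example above): use a sentinel value instead of the keepGoing flag, so the consumer stops on its own when the producer is done rather than blocking on an empty queue.

s := SharedQueue new.
[ 1 to: 100 do: [:each | s nextPut: each]. s nextPut: nil ] fork.  "nil marks the end"
[ | item |
  [ (item := s next) notNil ]
      whileTrue: [ Transcript show: item asString, ' ' ] ] fork.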

Michael.






Re: A Pipe to the Future

pwl
Hi,

Thanks for the excellent example.

The idea of having a "wrapper" that generates whatever code is needed
underneath the piping syntax may have benefits, so that piping works for
single Smalltalk processes as well as for multiple processes in one image
or even across images.
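
Something along these lines, perhaps (an untested sketch for the single-image case only; the SharedQueue plumbing and the convention that both blocks take the queue as their argument are just one possible choice, and the method would go on BlockContext in older images):

BlockClosure>>| aConsumerBlock
    "Pipe: run the receiver as a producer and aConsumerBlock as a consumer,
    connected by a SharedQueue that is passed to both blocks."
    | pipe |
    pipe := SharedQueue new.
    [ self value: pipe ] fork.
    [ aConsumerBlock value: pipe ] fork.
    ^ pipe

"usage:"
[:out | 1 to: 5 do: [:i | out nextPut: i]]
    | [:in | 5 timesRepeat: [Transcript show: in next printString, ' ']]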

Cheers,

Peter




Michael van der Gulik wrote:

>
>
> On 9/6/07, *Peter William Lount* <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     Hi,
>
>     I am wondering if the following would be possible and if so how?
>
>     aProcess := [ "a block that pumps out continuous values"] fork.
>     bProcess := [:in | in someOperation ] fork.
>
>     "pipe the results of process a into process b like in unix shell"
>     aProcess asPipeInto: bProcess.
>
>     or
>
>     aProcess | bProcess
>
>
> It's easier to do it directly.
>
> keepGoing := true.
> s := SharedQueue new.
> [ 1 to: 100 do: [ :each | s nextPut: each ] ] fork.
> [ [keepGoing] whileTrue: [Transcript show: s next asString, ' ']. ] fork.
>
> And then:
> keepGoing := false. "To stop the second process."
>
> The SharedQueue acts like the pipe. You could make your own class to
> add a layer of abstraction, but this would only be worthwhile if you
> are having difficulty with the complexity.
>
> Michael.
>
>
>



Re: A Pipe to the Future

Jason Johnson-5
In reply to this post by pwl
On 9/6/07, Peter William Lount <[hidden email]> wrote:

> Hi,
>
> I am wondering if the following would be possible and if so how?
>
> aProcess := [ "a block that pumps out continuous values"] fork.
> bProcess := [:in | in someOperation ] fork.
>
> "pipe the results of process a into process b like in unix shell"
> aProcess asPipeInto: bProcess.
>
> or
>
> aProcess | bProcess
>
> Would a shared queue be needed? Or some other concurrency control
> mechanism for sharing the output results.
>
> Unfortunately blocks don't pump out multiple values... or fortunately
> they don't.
>
> Just a thought.
>
> Cheers,
>
> Peter

It depends on what you are asking.  If you are trying to make a
"worker thread" type pattern in Smalltalk, then Michael's message will
be what you want.  If you are talking about the functional programming
aspect [1], then you might want to look at my LazyList package on
SqueakSource.

It allows easily creating generators, such as:

LazyList enummerateFrom: 0 with: [:e| e + 2]    "infinite list of even numbers, starting at 0"

and defines many functional maps, transforms and reductions with a
Smalltalk naming convention, e.g.:

"LazyList equivalent of the example unix expression"

(((LazyList fromList: (file contents))
  select: [:e| e = value])
    reject: [:e| e = otherValue])
      inject: 0 into: [:l :r| l + 1]   "NOTE: list values ignored, we're only counting them"

"Don't actually need LazyList here.  But you could use it in the case
that the file has no end (e.g. /dev/random), but then you would have
to remove the reduce operation"
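
For instance, with made-up sample data (toy values of my own, assuming #fromList: accepts a literal Array):

lines := LazyList fromList: #('value' 'other' 'value' 'otherValue' 'value').
((lines select: [:e| e = 'value'])
  reject: [:e| e = 'otherValue'])
    inject: 0 into: [:l :r| l + 1]    "=> 3"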

"Sieve Of Eratosthenes"

SomeClass>>sieve: aLazyList
  ^ LazyList
      cons: aLazyList first
      with: (LazyList delay:
         [ SomeClass sieve: aLazyList allButFirst
             select [:e|  (e \\ aLazyList first = 0) not ] ])

allPrimes := SomeClass sieve: (LazyList enummerateFrom: 2 with: [:e| e + 1])
"NOTE: This returns immediately.  But just don't call: allPrimes at:
10000000 on an x486"

Hrm, I see some refactoring opportunities here.  with: should always
take a block and delay it.  I would have used that here for clarity
but I'm sure if I did someone would try this and then ask why it
doesn't work. :)  I also should change "cons" to "with" (i.e.
#with:with:) to be consistent with Smalltalk usage.

[1] The statement:

cat file | grep value | grep -v otherValue | wc -l

is basically functional programming.  That statement has a generator,
two filters and a reduction.


Sieve of Eratosthenes [was Re: A Pipe to the Future]

Simon Guest-2
At Thu, 6 Sep 2007 06:52:59 +0200,
Jason Johnson wrote:

>
> "Sieve Of Eratosthenes"
>
> SomeClass>>sieve: aLazyList
>   ^ LazyList
>       cons: aLazyList first
>       with: (LazyList delay:
>          [ SomeClass sieve: aLazyList allButFirst
>              select [:e|  (e \\ aLazyList first = 0) not ] ])
>
> allPrimes := SomeClass sieve: (LazyList enummerateFrom: 2 with: [:e| e + 1])
> "NOTE: This returns immediately.  But just don't call: allPrimes at:
> 10000000 on an x486"

I had a play with this; it's pretty cool.  Note that LazyList delay:
should be LazyValue delay: (oh, and enumerateFrom with one m).

Being able to type
(allPrimes take: 100) asOrderedCollection
is rather good.

cheers,
Simon


Re: Sieve of Eratosthenes [was Re: A Pipe to the Future]

Jason Johnson-5
On 9/6/07, Simon Guest <[hidden email]> wrote:
>
> I had a play with this; it's pretty cool.

Awesome, thanks!

>  Note that LazyList delay:
> should be LazyValue delay: (oh, and enumerateFrom with one m).

Ack, I obviously didn't test it at all.  I'm glad that was all that
was wrong.  :)  But I should refactor that since it's now clear that
an already delayed object never seems to get passed in practice as the
second argument.

> Being able to type
> (allPrimes take: 100) asOrderedCollection
> is rather good.
>
> cheers,
> Simon

Yeah, I hope you find it useful.  I built it for my recurrence rule
implementation.  If you add entries to the list using #insertUnique:
(I know, I know, subclass and specialize) then you can lazily merge it
with another list, or even filter it against another infinite lazy
list.  You can download the ICal package from SqueakSource (look for
the versions with jbj or jj or something like that at the end) to see
what I use it for.


Re: A Pipe to the Future

Paolo Bonzini-2
In reply to this post by Jason Johnson-5
> LazyList enummerateFrom: 0 with: [:e| e + 2]    "infinite list of even
> numbers, starting at 0"
>
> and defines many functional maps, transforms and reductions with a
> Smalltalk naming convention, e.g.:

Smalltalk already has a powerful abstraction for lazy lists, i.e.
streams; you can add #select:/#reject:/#collect:/#inject:into: to
streams.  They would be lazy operations, unlike the ones on collections,
since Streams are possibly infinite.
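
A minimal sketch of what one of those lazy operations could look like (the class name and wiring are hypothetical, not an existing API; it is aimed at possibly infinite streams, so #atEnd is not handled):

Object subclass: #LazySelectStream
    instanceVariableNames: 'source block'
    classVariableNames: ''
    poolDictionaries: ''
    category: 'Streams-Lazy'

LazySelectStream class>>on: aStream select: aBlock
    ^ self new setSource: aStream block: aBlock

LazySelectStream>>setSource: aStream block: aBlock
    source := aStream.
    block := aBlock

LazySelectStream>>next
    "Pull from the underlying stream until an element satisfies the block."
    | candidate |
    [ candidate := source next.
      block value: candidate ] whileFalse.
    ^ candidate

Stream>>select: aBlock
    ^ LazySelectStream on: self select: aBlock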

You can also have generators as in Python (it is relatively easy to
implement them on top of continuations) to create a "pluggable" stream.

> SomeClass>>sieve: aLazyList
>   ^ LazyList
>       cons: aLazyList first
>       with: (LazyList delay:
>          [ SomeClass sieve: aLazyList allButFirst
>              select [:e| e \\ aLazyList first ~= 0 ] ])

Cool stuff.

With stream-like generators, this would look like:

odds := Generator inject: 3 into: [ :prev | prev + 2 ].
primes := Generator inject: 2 into: [ :prev "unused" |
    | prime |
    prime := odds next.
    odds := odds select: [ :each | each \\ prime ~= 0 ].
    prime ].
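
Assuming a Generator with that protocol and the lazy Stream>>select: described above (neither ships with a stock image), pulling values out should give the first primes:

(1 to: 5) collect: [:i | primes next]    "first five primes: 2 3 5 7 11"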

After staring at the two for a while, they look pretty similar actually
(aLazyList => odds, cons:with: => inject:into:, first => next, which also
removes the first element, so my version does not need allButFirst).

Paolo