Is Smalltalk right for my autonomous car? (was Re: Audio and Video Object Analysis)

Is Smalltalk right for my autonomous car? (was Re: Audio and Video Object Analysis)

Kirk Fraser
Thanks for the discussion. I didn't know C++ was so bad.  Are there any negatives to Python?

On government, I think Ada is about as dead as COBOL.  They aren't requiring it in the DARPA Robotics Challenge.  But I did see on comp.lang.Smalltalk that one huge government project in Smalltalk failed because they couldn't stop an unwanted ship from appearing in a simulation or something like that.  So they went to C.

On a vehicle, the worst times I can imagine for a GC of even one second are just before a decision is needed to round a corner or stop.  Is there a way to fan out GCs so they are more frequent but less time-consuming?  Perhaps a millisecond every second - let's see: at 60 mph that would be 316,800 feet per hour, 88 feet per second, 0.088 feet per millisecond, or 1.056 inches.  The Google car shoots for 1 cm accuracy - 1 in. is close enough.  So a garbage collection of one second means 88 feet off course, but a millisecond would work.
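
To sanity-check that arithmetic, here is a small Workspace sketch (print-it on the last line); the 60 mph speed and 1 ms pause are just the assumptions from the paragraph above:

    "Distance travelled during a GC pause, using the figures assumed above."
    | feetPerSecond pauseSeconds |
    feetPerSecond := 60 * 5280 / 3600.    "60 mph = 88 feet per second"
    pauseSeconds := 1 / 1000.             "a 1 millisecond pause"
    feetPerSecond * pauseSeconds * 12     "=> 132/125 inches, i.e. 1.056 (a 1 s pause gives 88 feet)"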

--
Kirk W. Fraser
http://freetom.info/TrueChurch - Replace the fraud churches with the true church.
http://freetom.info - Example of False Justice common in America


Re: Is Smalltalk right for my autonomous car? (was Re: Audio and Video Object Analysis)

Frank Shearar-3
On 18 December 2013 14:37, Kirk Fraser <[hidden email]> wrote:
> Thanks for the discussion. I didn't know C++ was so bad.  Are there any
> negatives to Python?

Just the same as with Ruby and Smalltalk: it's relatively slow, and so on.

> On government, I think Ada is about as dead as COBOL.  They aren't requiring
> it in the DARPA Robotics Challenge.  But I did see on comp.lang.Smalltalk
> that one huge government project in Smalltalk failed because they couldn't
> stop an unwanted ship from appearing in a simulation or something like that.
> So they went to C.

Was that JWARS or something? I know the DoD or similar had a huge
battlespace simulator written in Smalltalk.

> On a vehicle, the worst times I can imagine for a GC of even one second are
> just before a decision is needed to round a corner or stop.  Is there a way
> to fan out GCs so they are more frequent but less time-consuming?  Perhaps
> a millisecond every second - let's see: at 60 mph that would be 316,800 feet
> per hour, 88 feet per second, 0.088 feet per millisecond, or 1.056 inches.
> The Google car shoots for 1 cm accuracy - 1 in. is close enough.  So a
> garbage collection of one second means 88 feet off course, but a millisecond
> would work.

More frequent GCs certainly help. Or you could take the Erlang
approach, where each process has its own heap and its own GC. That
means that when a process does need to GC, it has only a small amount
of memory to walk, so GC pauses are very short.

But really, I'd take the approach of not worrying about such a thing
now. I've never seen a GC of a second on any Smalltalk image. The
worst I've seen has been ~100 milliseconds.
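
If you want numbers from your own image, something like this Workspace sketch will show it (assuming the usual Squeak/Cuis-style messages Smalltalk garbageCollect, Smalltalk garbageCollectMost and Time millisecondsToRun:; timings vary with image size and hardware):

    "Rough measurement of a full vs. an incremental collection (print-it)."
    | fullMs incrementalMs |
    fullMs := Time millisecondsToRun: [Smalltalk garbageCollect].
    incrementalMs := Time millisecondsToRun: [Smalltalk garbageCollectMost].
    'full GC: ', fullMs printString, ' ms, incremental GC: ', incrementalMs printString, ' ms'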

frank


Re: Is Smalltalk right for my autonomous car? (was Re: Audio and Video Object Analysis)

Casey Ransberger-2
In reply to this post by Kirk Fraser
Read the Ungar paper I linked. It does exactly what you're describing, and also gives a good overview of different garbage collection strategies. The idea is in part to do small GCs very often, and avoid full GC until the user is idle (imagine sitting at a stop light). And I said a *fraction* of a second, not a whole second. This paper was written in the 90s and referred to a system that also used virtual memory. Nix the virtual memory, go with flash for storage, use a generation scavenging garbage collector, and try profiling that setup *today.* I think -- of course I don't know, because I haven't done the profiling -- that you'll have a latency much lower than the average nimrod texting on the road has.
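
At the image level, that policy reads roughly like the sketch below. It is illustrative only: the real generation scavenger runs inside the VM, the messages are the usual Squeak/Cuis-style ones, and vehicleIsIdle is a made-up stand-in for whatever "stopped at a light" test the car would actually have.

    "Policy sketch only: frequent cheap collections, a full GC only while idle."
    | vehicleIsIdle |
    vehicleIsIdle := [false].                         "hypothetical 'stopped at a light' check"
    [[true] whileTrue: [
        Smalltalk garbageCollectMost.                 "small, frequent new-space collection"
        vehicleIsIdle value
            ifTrue: [Smalltalk garbageCollect].       "full collection only while idle"
        (Delay forMilliseconds: 100) wait]]
            forkAt: Processor userBackgroundPriority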

Eliot is working on a new object memory for Cog called Spur, which I understand will dramatically reduce the amount of time the VM spends in GC; since it isn't causing huge pauses *now,* and it will get *much* better in the near future, I really don't think that GC is something to worry about.

And yes, C and C++ are *very* dangerous languages because they're semantically the same as macro'd machine code. This means (for example) that one can accidentally write words of data past the end of an array: this is called a buffer overrun. If whatever sits just past the end of the array happens to be something that will be used later -- a return address or a function pointer, say -- and the data being written is supplied by an external party (like a user or bot), a bad actor can overwrite it and redirect execution into machine code they injected into your application. If your application is running as a privileged user, it's trivial to take complete control of the machine that it runs on, and if not, there are still plenty of opportunities for privilege escalation.

Smalltalk doesn't have this problem because it doesn't expose raw pointers to the programmer. Arrays are bounds checked. All memory access -- in fact, all hardware access -- is managed by a well-maintained virtual machine. And a million and one other things that eliminate some serious programmer errors. C and C++, OTOH, are rather like riding a horse naked without a saddle. You might get where you're going just a hair faster than if you had gotten dressed and put the saddle on first, sure, but it's liable to hurt a lot afterwards.
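
As a tiny concrete contrast, the same off-the-end mistake in Smalltalk just signals an error instead of silently scribbling over memory (Workspace sketch):

    "Writing past the end of an Array is caught by the bounds check."
    | buffer |
    buffer := Array new: 4.
    buffer at: 5 put: 42    "signals a subscript-out-of-bounds error; nothing is overwritten"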



Re: Is Smalltalk right for my autonomous car? (was Re: Audio and Video Object Analysis)

Juan Vuletich-4
On 12/19/2013 2:17 AM, Casey Ransberger wrote:
> ...
>
>  C and C++, OTOH, are rather like riding a horse naked without a
> saddle. You might get where you're going just a hair faster than if
> you had gotten dressed and put the saddle on first, sure, but it's
> liable to hurt a lot afterwards.
>

:D
