Hi, All.
Downloading a file (a jpg file, for instance) by a Socket seems to be nasty
work.

socket := Socket port: 80 host: 'www.somewhere.com'.
socket connect.
socket writeStream nextPutAll: ... ... ...   "GET method to receive the file"
[socket hasInput]
    whileTrue: [dataStream nextPut: socket readStream next].
dataStream contents.

I get a "Remote Socket closed" error before getting all the bytes, and I
receive 16xx bytes each trial.

The only way I can get it right is to request the exact number of bytes of
the file, which I know by selecting the "Properties" menu item of Internet
Explorer:

socket readStream next: 111988.

Have a good one

Hwa Jong Oh
Hwa Jong Oh,
> Downloading a file (a jpg file, for instance) by a Socket seems to be nasty
> work.
>
> socket := Socket port: 80 host: 'www.somewhere.com'.
> socket connect.
> socket writeStream nextPutAll: ... ... ...   "GET method to receive the file"
> [socket hasInput]
>     whileTrue: [dataStream nextPut: socket readStream next].
> dataStream contents.
>
> I get a "Remote Socket closed" error before getting all the bytes, and I
> receive 16xx bytes each trial.
>
> The only way I can get it right is to request the exact number of bytes of
> the file, which I know by selecting the "Properties" menu item of Internet
> Explorer.
>
> socket readStream next: 111988.

Take a look at how Live Update downloads the patch files. The best place to
start is LiveUpdate class>>urlContents:. This eventually makes use of some
Windows APIs provided by the URLMonLibrary class, which is another good
place to look for interesting methods that you can use.

Best Regards,

Andy Bower
Dolphin Support
http://www.object-arts.com
---
Are you trying too hard?
http://www.object-arts.com/Relax.htm
---
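For example, fetching the image becomes a one-liner along these lines. A
minimal sketch only: the host comes from the original question and the file
name picture.jpg is made up; what LiveUpdate urlContents: answers can then be
inspected or written out as needed.

"Sketch of Andy's suggestion. The URL is a placeholder based on the
 original question; urlContents: answers the raw contents of the resource."
| contents |
contents := LiveUpdate urlContents: 'http://www.somewhere.com/picture.jpg'.
contents size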
In reply to this post by Howard Oh
Hwa Jong Oh wrote:
> Downloading a file (a jpg file, for instance) by a Socket seems to be nasty
> work.

Andy has mentioned how to use the Microsoft software to do this kind of
thing. I don't know what you are trying to do, but it may be that you want
to use raw TCP/IP to connect to the server; in that case inspecting the
following "do-it" does work.

=====================
socket := Socket port: 80 host: 'www.object-arts.com'.
socket connect.
socket writeStream nextPutAll: 'GET /Home.htm HTTP/1.0

' asByteArray.
socket writeStream flush.
in := socket readStream.
out := ByteArray writeStream.
[in do: [:each | out nextPut: each]]
    on: SocketClosed do: [:e | ]
    on: Error do: [:e | e signal].
out contents asString.
=====================

There's a bug in the socket stuff which doesn't correctly handle the case
where the server closes the socket (I've reported this before, twice), and
that's why I'm forced to use the error handler for SocketClosed. Otherwise
I've been using essentially the above code for about a year now without any
problems.

    -- chris
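For reuse, the same pattern could be packaged as a method. A sketch only:
httpGetPath:host: is a made-up selector, and the #close at the end assumes
Socket responds to it.

httpGetPath: pathString host: hostString
    "Answer the raw HTTP reply (headers included) for an HTTP/1.0 GET.
     Sketch built from the do-it above; the selector and #close are assumptions."
    | socket in out |
    socket := Socket port: 80 host: hostString.
    socket connect.
    socket writeStream
        nextPutAll: 'GET ' , pathString , ' HTTP/1.0';
        nextPutAll: #[13 10 13 10];
        flush.
    in := socket readStream.
    out := ByteArray writeStream.
    [in do: [:each | out nextPut: each]] on: SocketClosed do: [:e | ].
    socket close.
    ^out contents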
In reply to this post by Andy Bower
LiveUpdate seems to cache the contents from web pages. The following
should update every minute:

LiveUpdate urlContents: 'http://www.sandia.gov/pv/pvweather/webout.txt'

but will return the same contents for a while. I had been using:

IXMLHttpRequest retrieveUrl: 'http://www.sandia.gov/pv/pvweather/webout.txt'

but it also keeps a cache. Is there a way to force the download of pages?

Thanks, Brian Murphy-Dye

Andy Bower wrote:
[snip]
> Take a look at how Live Update downloads the patch files. The best place to
> start is LiveUpdate class>>urlContents:. This eventually makes use of some
> Windows APIs provided by the URLMonLibrary class, which is another good
> place to look for interesting methods that you can use.
[snip]
One trick you may want to try is sticking the millisecond time on the end of
the URL in the form:

http://www.sandia.gov/pv/pvweather/webout.txt?x=12345

where 12345 is replaced with the time (or a counter, or any changing
sequence).

The ?x=12345 will be ignored, but this will force the cache to refresh. I
don't develop in Smalltalk, but I do a lot of Intranet development, and this
almost always thwarts unwanted caching schemes.

Cheers,

Sean

"Brian Murphy-Dye" <[hidden email]> wrote in message
news:[hidden email]...
> LiveUpdate seems to cache the contents from web pages. The following
> should update every minute:
>
> LiveUpdate urlContents: 'http://www.sandia.gov/pv/pvweather/webout.txt'
>
> but will return the same contents for a while. I had been using:
>
> IXMLHttpRequest retrieveUrl: 'http://www.sandia.gov/pv/pvweather/webout.txt'
>
> but it also keeps a cache. Is there a way to force the download of pages?
>
> Thanks, Brian Murphy-Dye
[snip]
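From a Dolphin workspace the changing suffix could be generated from the
millisecond clock. A minimal sketch, assuming Time class>>millisecondClockValue
is available (any changing counter would do just as well):

"Sketch of the cache-busting trick; millisecondClockValue is assumed
 to be present and simply supplies a number that changes per request."
| url |
url := 'http://www.sandia.gov/pv/pvweather/webout.txt?x=' ,
    Time millisecondClockValue printString.
LiveUpdate urlContents: url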
Yeah, I had tried your suggestion (which does work), but assumed there
had to be a better solution ;)

By the way, the IXMLHttpRequest should be more like:

request := IXMLHttpRequest new
    open: 'GET'
    bstrUrl: 'http://www.sandia.gov/pv/pvweather/webout.txt'
    varAsync: false
    bstrUser: nil
    bstrPassword: nil.
request send: nil.
request responseText

Brian Murphy-Dye.

Sean Inglis wrote:
> One trick you may want to try is sticking the millisecond time on the end of
> the URL in the form:
>
> http://www.sandia.gov/pv/pvweather/webout.txt?x=12345
>
> where 12345 is replaced with the time (or a counter, or any changing
> sequence).
>
> The ?x=12345 will be ignored, but this will force the cache to refresh. I
> don't develop in Smalltalk, but I do a lot of Intranet development, and this
> almost always thwarts unwanted caching schemes.
[snip]
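Another option might be to send a no-cache request header. MSXML's
IXMLHTTPRequest does have a setRequestHeader call, but the exact Dolphin
wrapper selector used below (setRequestHeader:bstrValue:) is an assumption
based on how the other generated selectors in this thread are formed, so
treat this as an untested sketch.

"Untested sketch: ask the server/proxy not to serve a cached copy.
 #setRequestHeader:bstrValue: is an assumed wrapper selector for
 MSXML's setRequestHeader()."
| request |
request := IXMLHttpRequest new.
request
    open: 'GET'
    bstrUrl: 'http://www.sandia.gov/pv/pvweather/webout.txt'
    varAsync: false
    bstrUser: nil
    bstrPassword: nil.
request setRequestHeader: 'Cache-Control' bstrValue: 'no-cache'.
request send: nil.
request responseText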
In reply to this post by Howard Oh
[This is my second attempt at posting a reply -- the original version seems
to have got stuck on the Totally-Objects server]

Hwa Jong Oh,

> Downloading a file (a jpg file, for instance) by a Socket seems to be nasty
> work.

If you really want to use raw TCP/IP to talk HTTP (rather than using the
M$-supplied stuff that Andy described), then you need to change the form of
the loop slightly. There's a bug in the sockets package which doesn't handle
socket closure properly (and which I've asked about at least twice before).
If you evaluate:

===============
socket := (Socket port: 80 host: 'www.object-arts.com')
    connect;
    yourself.
socket writeStream
    nextPutAll: 'GET /Home.htm HTTP/1.0';
    nextPutAll: #[13 10];
    nextPutAll: #[13 10];
    flush.
in := socket readStream.
out := ByteArray writeStream.
[in do: [:each | out nextPut: each]] on: SocketClosed do: [:e | ].
out contents asString.
===============

then it should work (I've just tested it, and it works OK for me).

    -- chris
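The bytes answered by this do-it still include the HTTP response headers;
for a binary file such as a JPEG you only want what follows the blank line.
A minimal sketch, assuming the reply from the do-it above is still in the
workspace variable out (it scans for CR LF CR LF by hand, so only standard
collection protocol is needed):

"Strip the HTTP headers: the body starts after the first CR LF CR LF.
 bytes is assumed to hold the full reply, i.e. the 'out contents' above."
| bytes i bodyStart |
bytes := out contents.
i := 1.
bodyStart := nil.
[bodyStart isNil and: [i + 3 <= bytes size]] whileTrue: [
    ((bytes at: i) = 13 and: [(bytes at: i + 1) = 10
        and: [(bytes at: i + 2) = 13 and: [(bytes at: i + 3) = 10]]])
            ifTrue: [bodyStart := i + 4]
            ifFalse: [i := i + 1]].
bodyStart isNil
    ifTrue: [nil]
    ifFalse: [bytes copyFrom: bodyStart to: bytes size]   "the raw image bytes"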
In reply to this post by Brian Murphy-Dye-4
Brian,
> Yeah, I had tried your suggestion (which does work), but assumed there
> had to be a better solution ;)

If you look in the URLMonLibrary there are a number of file download
methods. You should be able to use the URLMonLibrary>>urlDownload:toFile:
method to download a file and bypass the cache.

Best Regards,

Andy Bower
Dolphin Support
http://www.object-arts.com
---
Are you trying too hard?
http://www.object-arts.com/Relax.htm
---
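A minimal sketch of the call: the selector is the one Andy names, but the
#default accessor and the target file name are assumptions.

"Sketch only: #default is assumed to be the usual way to reach the
 external library instance, and the local file name is made up."
URLMonLibrary default
    urlDownload: 'http://www.sandia.gov/pv/pvweather/webout.txt'
    toFile: 'webout.txt'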
In reply to this post by Chris Uppal-3
Chris,
> =====================
> socket := Socket port: 80 host: 'www.object-arts.com'.
> socket connect.
> socket writeStream nextPutAll: 'GET /Home.htm HTTP/1.0
>
> ' asByteArray.
> socket writeStream flush.
> in := socket readStream.
> out := ByteArray writeStream.
> [in do: [:each | out nextPut: each]]
>     on: SocketClosed do: [:e | ]
>     on: Error do: [:e | e signal].
> out contents asString.
> =====================
>
> There's a bug in the socket stuff which doesn't correctly handle the case
> where the server closes the socket (I've reported this before, twice), and
> that's why I'm forced to use the error handler for SocketClosed. Otherwise
> I've been using essentially the above code for about a year now without any
> problems.

I can't find anything in the archive on this. Can you give me a pointer or a
quick recap? There's a (very) slim chance that this could be related to a
problem I see every few weeks or so on one of my servers. It runs along fine
for a while and then users will report one of a couple of problems that
always seem to trace back to WSAE_NOBUFS. One suggestion is that I might
need to "hang around a little longer" to let the sockets close - the guy who
made the suggestion _really_ knows sockets, so I take it seriously, even if
I don't know how to do it :(

Have a good one,

Bill

--
Wilhelm K. Schwab, Ph.D.
[hidden email]
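One way to make sure each socket is let go promptly is to wrap its use in an
ensure: block so it gets closed even when the request fails. A sketch only,
assuming Socket responds to #close:

"Sketch: guarantee the socket is released whether or not the GET succeeds."
| socket out |
socket := Socket port: 80 host: 'www.object-arts.com'.
[socket connect.
 socket writeStream
    nextPutAll: 'GET /Home.htm HTTP/1.0';
    nextPutAll: #[13 10 13 10];
    flush.
 out := ByteArray writeStream.
 [socket readStream do: [:each | out nextPut: each]]
    on: SocketClosed do: [:e | ]]
        ensure: [socket close].
out contents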