"cachey" file system on Windows

"cachey" file system on Windows

Wayne Johnston

I have some SUnit tests that exercise some of my code that does things such as copying files (via CfsFileDescriptor>>#copyFile:new:), asking for the last-modified timestamp on files (CfsStat>>#stMtime), and removing files (CfsFileDescriptor class>>#remove:).  Once in a while my tests do not work as expected.  The scenarios are:

 

1. I delete a file "B" and then try to copy a file "A" to "B"; the copy fails with EEXIST, meaning it thinks "B" already exists.

2. I delete a file and then ask for the last-modified timestamp on the file; it unexpectedly succeeds with the 'correct' timestamp.

3. I #openEmpty: and #close a file "B", then try to copy "A" to "B".  The copy sometimes fails with EBUSY, as if the file I had closed were still locked.

 

In all cases it's as if the OS or hard drive has a little cache effect going on.  It seems unreasonable that every application working with files has to do what I did: put a little delay in my tests and my lower-level code to make things consistently succeed (a sketch of that workaround follows below).  I have seen places where even a 100ms delay sometimes isn't enough.
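Roughly, the failing pattern from scenario 1 plus my delay workaround looks like this (file names made up; I'm glossing over whether #copyFile:new: is class- or instance-side, and assuming the usual convention that failed Cfs calls answer an object responding to #isCfsError):

    | rc |
    "Delete the target, then immediately copy a source file over it."
    CfsFileDescriptor remove: 'B.txt'.
    "Workaround: without this pause the copy below intermittently
     fails with EEXIST, as if 'B.txt' were still present."
    (Delay forMilliseconds: 100) wait.
    rc := CfsFileDescriptor copyFile: 'A.txt' new: 'B.txt'.
    rc isCfsError
        ifTrue: [Transcript showCr: 'copy failed: ', rc printString]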

 

Is there some simpler way to deterministically solve this problem?  Any explanation for that apparent cache behavior?


Re: "cachey" file system on Windows

Andres Valloud

In the MSDN documentation for the Windows DeleteFile function, you will find text like:

The DeleteFile function marks a file for deletion on close. Therefore, the file deletion does not occur until the last handle to the file is closed. Subsequent calls to CreateFile to open the file fail with ERROR_ACCESS_DENIED.


Reading that in a strict sense, all the text says is "file deletion occurs after the last handle to a file is closed".  It does *not* say things like "file deletion occurs immediately after...", or "file deletion occurs in the process of closing the last handle...", and it does not contain terms such as "... atomic...".
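Given that strict reading, one alternative to a fixed delay would be to poll until the name actually stops resolving before reusing it.  Something like the following sketch; note that the CfsStat>>#stat: probe and the #isCfsError convention here are guesses at the Cfs API spelling, not confirmed signatures:

    "Wait (up to ~2 seconds here) for a just-removed file to really disappear."
    | gone tries |
    gone := false.
    tries := 0.
    [gone or: [tries >= 20]] whileFalse: [
        (CfsStat new stat: 'B.txt') isCfsError
            ifTrue: [gone := true]  "the name no longer resolves"
            ifFalse: [
                tries := tries + 1.
                (Delay forMilliseconds: 100) wait]].
    gone ifFalse: [Transcript showCr: 'B.txt still visible; giving up']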

I do not immediately recognize 'EBUSY' as a Windows error code.  I'd figure out what the actual error code returned was, and then look it up in the MSDN system error codes list.  That list begins here:

http://msdn.microsoft.com/en-us/library/windows/desktop/ms681382%28v=vs.85%29.aspx


I did not read all possible documentation on the subject on MSDN.  I specifically do not guarantee that reading any amount of MSDN docs will result in a consistent, complete picture of what could be going on.

Andres.


Re: "cachey" file system on Windows

Wayne Johnston
Thanks Andres.  I agree it sounds ambiguous.

The Cfs* classes use error codes like EBUSY, ENOENT, EEXIST, etc., which seem to be the standard POSIX errno values (such as those listed at http://www.virtsync.com/c-error-codes-include-errno); they are defined for us in _PRAGMA_CfsConstants.  A sketch of checking them follows.
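So a failed call could be mapped back to those constants along these lines (a sketch only; I'm guessing that a CfsError exposes its code via an #errno accessor, and the ENOENT/EBUSY names come from that CfsConstants pool):

    | rc |
    rc := CfsFileDescriptor remove: 'no-such-file.txt'.
    rc isCfsError ifTrue: [
        "ENOENT and EBUSY are pool variables from _PRAGMA_CfsConstants."
        rc errno = ENOENT ifTrue: [Transcript showCr: 'no such file'].
        rc errno = EBUSY ifTrue: [Transcript showCr: 'still in use']]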


Re: "cachey" file system on Windows

Steve Cline
Not directly related, but I have observed that when my corporate masters start running some kind of scan on my machine (virus, malware, no-no checker?), my tests start failing with similar laggy file-cleanup problems.  I had started down the road of making repeated checks to see whether the file is really gone (sketched below), but decided it was not worth the effort, as the problem seems to go away before I get the solution coded each time.
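For what it's worth, that repeated-checks approach might look something like this (a sketch only; the retry count, the delay, and the #isCfsError convention are illustrative assumptions rather than tested code):

    "Retry a Cfs operation until it stops answering a CfsError,
     giving up after maxTries attempts."
    | retry |
    retry := [:operation :maxTries |
        | result tries |
        tries := 0.
        [result := operation value.
         result isCfsError and: [(tries := tries + 1) < maxTries]]
            whileTrue: [(Delay forMilliseconds: 100) wait].
        result].
    "Example: keep trying the copy until the stale 'B.txt' clears."
    retry value: [CfsFileDescriptor copyFile: 'A.txt' new: 'B.txt'] value: 10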