yes.
Christian Graus - Microsoft MVP - C++
Metal Musings - Rex and my new metal blog
"I am working on a project that will convert a FORTRAN code to corresponding C++ code.I am not aware of FORTRAN syntax" ( spotted in the C++/CLI forum )
|
Thanks. I am not quite ready to delve into WPF just yet.
|
3.0 just adds to 2.0, in the same way that you can still use untyped collections in 2.0, but the new stuff is there, and better.
Yeah, I wish I had time for WPF, but I still don't, much.
|
Hey, I'm creating a XAML browser application and I want to bind a database to a ListBox in the interface. Does anyone have a reference or an article that explains that?
I found an article, but it's for Windows applications.
|
It shouldn't really be much different from Windows apps. Have you resolved this?
"Any sort of work in VB6 is bound to provide several WTF moments." - Christian Graus
|
Hello,
This relates to my question yesterday, but this question is far more specific so I figured it best to start a new thread regarding it.
I am currently trying to design a method that will detect whether or not a folder is being actively copied to once it is detected by my application. For my program to function correctly, any detected folders in the target directory MUST be finished being copied to before my program can pick them up for processing. However, no matter what I try, my program tries to grab the folders and move them for processing before the copy is actually finished.
So far I've tried checking the LastWriteTime and LastAccessTime of the directory via the DirectoryInfo class. For some reason, these properties would state that the last write and access times were x seconds ago, even when the files were not completely done copying.
I thought this might be caused by files being written to subfolders inside the folder I was checking (which I'm assuming are not checked in reference to that particular top folder), so on a suggestion from Christian Graus, I tried a recursive method that would get the size of all files and folders inside the target folder and return that value. My program would use this method to first get the size of the folder, sleep the thread for a second, then recheck the value. If the values matched, the folder would be processed. If they didn't match, the folder was passed over and checked on the next cycle. However, even though the files were still not done copying (the file copy progress box was still showing), the cumulative directory size would be reported as the same on both checks and it would pick up the folder early again. I also checked the reported size value against the actual size of the folder (through the properties window in Windows) and the reported size matched the actual directory size, even though my program got that value while the files were still actively copying (once again, the file copy progress dialog was still visible and progressing).
So I'm really at a loss here. Does anyone know of a method to check a folder's activity and report whether files are still actually being copied to it? I've posted some of the relevant snippets, in case I just boffed my code and didn't realize it.
Thanks in advance for any assistance.
Code for getting folder size:
public static class Utility
{
    // Recursively total the size, in bytes, of all files under 'dir'.
    // (Note: the int cast will overflow for folders larger than ~2 GB;
    // a long would be safer.)
    public static int GetSize(DirectoryInfo dir)
    {
        int size = 0;
        foreach (FileInfo f in dir.GetFiles())
        {
            size += (int)f.Length;
        }
        foreach (DirectoryInfo d in dir.GetDirectories())
        {
            size += GetSize(d);
        }
        return size;
    }
}
Code for scraping the target directory and checking whether a folder is done being copied to:
void RunAsyncScrape(object sender, DoWorkEventArgs e)
{
    DirectoryInfo[] studies;
    try
    {
        studies = new DirectoryInfo(targetdir).GetDirectories();
    }
    catch (Exception ex)
    {
        MessageBox.Show("Could not scrape the target directory\nReason: " + ex.Message, "File Scraping Error");
        return; // 'studies' is unassigned here, so bail out
    }

    if (studies.Length > 0)
    {
        ArrayList orders = new ArrayList();
        foreach (DirectoryInfo dir in studies)
        {
            // Read the total size, wait a second, then read it again.
            int size = Utility.GetSize(dir);
            Thread.Sleep(1000);
            int sizeb = Utility.GetSize(dir);
            if (size == sizeb)
            {
                orders.Add(dir); // sizes match, so assume the copy is done
            }
        }
        if (orders.Count > 0)
            e.Result = orders;
    }
}
|
Christoff915 wrote: So far I've tried checking the LastWriteTime and LastAccessTime of the directory via the DirectoryInfo class. For some reason, these properties would state that the last write and access times were x seconds ago, even when the files were not completely done copying.
This is true. The reason is that NTFS only updates those properties every once in a while. It does NOT do it on-the-fly as the file is being written to. So the file WAS written to at that time; it doesn't mean the writing has stopped yet.
Christoff915 wrote: ...even though the files were still not done copying (the file copy progress box was still showing), the cumulative directory size would be reported as the same on both checks and it would pick up the folder early again.
Again, you're checking this information faster than NTFS updates it! NTFS is LAZY. You cannot depend on the properties to tell you when the folder is finished.
You can't check this every second to see if it's done. You're simply not going to know unless the application that is writing the data tells you. Somehow, I doubt that it has any such capability.
Since the app can't tell you when it's done, your only semi-reliable method is to check for the existence of the folder and get its total size, about once every 15 seconds, and KEEP DOING THIS! If the size doesn't change for, say, the last 3 passes, then you can assume the app is done writing to the folder.
Note, I said "assume". Because of system failures, load, or whatnot, there is no absolutely 100% reliable method of knowing when the other app is done. The only way that's going to happen is if the other app tells you it's done.
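That multi-pass idea can be sketched roughly like this (a sketch only; the method names, pass count, and interval are made up, and as noted above, "stable" is still only an assumption):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Threading;

public static class FolderStability
{
    // Recursively total the size, in bytes, of all files under 'dir'.
    static long GetSize(DirectoryInfo dir)
    {
        long size = dir.GetFiles().Sum(f => f.Length);
        foreach (DirectoryInfo sub in dir.GetDirectories())
            size += GetSize(sub);
        return size;
    }

    // Returns true once the folder's total size has been identical for
    // 'stablePasses' consecutive re-checks, spaced 'interval' apart.
    public static bool WaitUntilStable(string path, int stablePasses, TimeSpan interval)
    {
        long lastSize = -1;
        int matches = 0;
        while (matches < stablePasses)
        {
            if (!Directory.Exists(path))
                return false;                       // folder vanished; give up
            long size = GetSize(new DirectoryInfo(path));
            matches = (size == lastSize) ? matches + 1 : 0;
            lastSize = size;
            if (matches < stablePasses)
                Thread.Sleep(interval);
        }
        return true;
    }
}
```

With an interval of 15 seconds and 3 stable passes, as suggested, a folder is only picked up after its size has held still for roughly 45 seconds.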
|
Dave, thanks for the quick reply.
That is a disappointment, although the information you provided cleared up a lot of the confusion I have over this issue.
Back to the drawing board, I guess...
|
Since the application is writing files into that folder, you can check whether you can get exclusive access to a file. If you can, then that file must be done. Now, as has been said before, the process might have failed, etc., but if you can get exclusive rights to a file, then nobody else can touch it. If you're in control of the other app, you would want to do the same thing there. Look at the FileInfo class:
FileInfo fi = new FileInfo("temp.txt");
FileStream fs = fi.Open( FileMode.Open, FileAccess.ReadWrite, FileShare.None );
fs.Close();
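Extending that try-to-open-exclusively check over a whole folder tree might look like this (a sketch; real code would probably want to distinguish "file locked" from "access denied", which this lumps together):

```csharp
using System.IO;

public static class FolderChecks
{
    // Returns true only if every file under 'dir' (including subfolders)
    // can be opened with exclusive access, i.e. nothing else has it open.
    public static bool AllFilesUnlocked(DirectoryInfo dir)
    {
        foreach (FileInfo f in dir.GetFiles())
        {
            try
            {
                // If another process still has the file open for writing,
                // FileShare.None makes this throw.
                using (FileStream fs = f.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None))
                {
                }
            }
            catch (IOException)
            {
                return false;   // still in use (or otherwise inaccessible)
            }
        }
        foreach (DirectoryInfo sub in dir.GetDirectories())
        {
            if (!AllFilesUnlocked(sub))
                return false;
        }
        return true;
    }
}
```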
Hope that helps.
Ben
|
That will work for that one particular file, but the app in question writes a bunch of files and folders inside the folder he's talking about. Managing watching all those files would be pretty chaotic.
|
Well, I figured he is already looping through all the files with his FileInfo loop. I figured it isn't much different than waiting 45 seconds and checking again to see if the size has changed. Anyway, it is the only way I know of that ensures a file is not still being written. That is usually the bigger issue: you don't want to try to process a file that is still being written to the folder you are looking at.
Ben
|
I'll look into your suggestion, Ben...thanks for the advice.
|
But it doesn't work if the application is writing to several files simultaneously, or writing to the files intermittently: open the file, write, close, ..., rinse and repeat. Logging libraries do this frequently.
I know the technique you're talking about. I've figured out, the hard way, that it doesn't work all the time.
|
Would FileSystemWatcher help?
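It can at least tell you when writes are still happening. One possible pattern (a sketch; the class name and quiet-period idea are mine, not from the thread) is to reset a timestamp on every event and treat the folder as quiet once no events have arrived for a while:

```csharp
using System;
using System.IO;

public class QuietFolderMonitor
{
    DateTime lastEvent = DateTime.Now;

    public void Watch(string path)
    {
        var watcher = new FileSystemWatcher(path)
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite | NotifyFilters.Size
        };
        // Any activity in the tree pushes the timestamp forward.
        watcher.Created += (s, e) => lastEvent = DateTime.Now;
        watcher.Changed += (s, e) => lastEvent = DateTime.Now;
        watcher.Renamed += (s, e) => lastEvent = DateTime.Now;
        watcher.EnableRaisingEvents = true;
    }

    // True if no file-system events were seen in the last 'quietPeriod'.
    public bool IsQuiet(TimeSpan quietPeriod)
    {
        return DateTime.Now - lastEvent > quietPeriod;
    }
}
```

The usual caveat applies: FileSystemWatcher can drop events under heavy load if its internal buffer overflows, so this is a heuristic, not a guarantee.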
|
Hi,
I am stuck with one of the basic question on .Net Framework.
Regarding serialization, I know that it is the process of converting an object into a stream of bytes in order to store its state in a storage medium, and that de-serialization is just the opposite. But my question is: in real applications, where exactly do we require serialization? At what point do we choose to serialize?
Can anybody who knows please reply. Any help would be great.
Thanks
Meeram395
|
Most of the time when I use serialization, I am serializing objects to XML. Then this XML is saved in a SQL database. Later, when the order is looked at, the state of the order can be retrieved by taking the saved XML and de-serializing it back into the original object.
Hope that helps.
Ben
|
Serialization is used for such things as view state in ASP.NET, or for transport in .NET Remoting, Web Services, or any other interprocess communication.
only two letters away from being an asset
|
Serialization is simply the act of taking an object in memory and converting it to some format that allows its internal state to be saved (persisted) and recreated later (either at a later point in time, on a different computer, or both).
Serialization is used in a lot of situations. Serializing to XML is a way to persist the object to a database or transfer it through remoting to another system. If you don't have a need to persist or transfer the data, then you don't need serialization.
-----------------------------
In just two days, tomorrow will be yesterday.
|
Hi,
The real use of serialization is in persisting objects. For instance, say I have an object named Person, and I keep information like name, address, and salary as members of that object. If I need to persist it, and later need it back in the same form, I can use serialization to keep that object for future use.
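As a concrete sketch of that Person example, using XmlSerializer (the class members and the file name are invented for illustration):

```csharp
using System.IO;
using System.Xml.Serialization;

public class Person
{
    public string Name { get; set; }
    public string Address { get; set; }
    public decimal Salary { get; set; }
}

class Demo
{
    static void Main()
    {
        var p = new Person { Name = "Alice", Address = "Somewhere", Salary = 1000m };
        var serializer = new XmlSerializer(typeof(Person));

        // Serialize: persist the object's state as XML.
        using (var writer = new StreamWriter("person.xml"))
            serializer.Serialize(writer, p);

        // De-serialize: rebuild an equivalent object later.
        using (var reader = new StreamReader("person.xml"))
        {
            var restored = (Person)serializer.Deserialize(reader);
            System.Console.WriteLine(restored.Name); // prints "Alice"
        }
    }
}
```

Note that XmlSerializer requires a public type with a parameterless constructor and only serializes public members.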
Gg
|
Hi,
Can anybody tell me where I can get sample code / a demo for the
Application Blocks in the Enterprise Library?
If you have it, can you please send it to me?
Thanks in Advance,
with regards,
Bommannan.Ramalingam
Nothing can be Done by Changing the Face. But, Anything can be Done by Facing the Change.
|
here[^]
|
I'm trying to print in WPF without showing the WPF PrintDialog control. It seems crazy, but I can't figure out how to enumerate all available printers. In other words I want a PrintQueueCollection of all accessible printers on the machine (or a way of enumerating all usable PrintQueues). All the examples and code I see only allow you to enumerate local printers, or the printers on a specific remote server. I just want to get the printers set up on the machine, as shown for example in "Printers" in the Control Panel. Seems easy, anyone have a code snippet?
Thanks,
Jeff
|
Have you resolved this? Or looked around at the WPF articles here. Josh Smith might be the guy to ask.
"Any sort of work in VB6 is bound to provide several WTF moments." - Christian Graus
|
Yes, I resolved it, thanks. It was just a matter of passing "Connections" and "Local" queue types to GetPrintQueues.
Jeff
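For anyone landing here later, the call Jeff describes presumably looks something like this (System.Printing; a sketch, Windows-only):

```csharp
using System;
using System.Printing;

class ListPrinters
{
    static void Main()
    {
        var server = new PrintServer();
        // Ask for both locally installed queues and per-user connections
        // to remote printers -- roughly what Control Panel's "Printers" shows.
        PrintQueueCollection queues = server.GetPrintQueues(new[]
        {
            EnumeratedPrintQueueTypes.Local,
            EnumeratedPrintQueueTypes.Connections
        });

        foreach (PrintQueue q in queues)
            Console.WriteLine(q.FullName);
    }
}
```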
|
Hello,
I am currently developing an application that will serve as a client interface for a Rimage cd-producing server. This application is intended to run on the same Windows box that the Rimage is connected to and monitor a directory on the file structure or network for folders (patient studies to be burned to cd) that are sent there to be processed. Once a folder is detected in the target directory, that folder is moved to a temp directory, where an order is created and the processing begins.
The process for the order is managed entirely within a worker thread, and each order gets its own thread to run in. The thread method contains a WHILE loop that cycles until the order's state property is set to indicate completion. Inside this loop are sets of conditionals that move the order along as the order's state is advanced by the server. Once the order is done the loop exits, and a cleanup routine is executed that removes the files that were processed and other completion-related tasks (logging, UI updating, etc).
I have a couple questions related to my design that I'm hoping a more seasoned developer (read: anyone) might be able to answer and possibly point me in a better direction (read: something that actually works).
1) In the order detection process, when a folder is found in the target directory, I load that folder into a DirectoryInfo object and check the LastWriteTime property. If LastWriteTime.AddSeconds(10) < DateTime.Now, I move the folder for processing. If not, the folder is left alone until the next pass, during which it will be checked again. The objective of that check is to make sure that all contents to be burned have been copied over before the order process begins.
This has usually seemed to work fine, but lately, when I test using larger folders of data (100+ MB), the folder is sent for processing before the files are done copying.
My question is: does the LastWriteTime (or even LastAccessTime, for that matter) account for any subfolders inside the folder being written to or does it only check for files being written to in that particular folder, ignoring any subfolder activity?
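For what it's worth, a directory's LastWriteTime generally reflects only changes to its direct contents (files created, deleted, or renamed in that folder), not writes happening deeper inside subfolders. A recursive check over every file, as a sketch (the class and method names are invented):

```csharp
using System;
using System.IO;

public static class Latest
{
    // Most recent LastWriteTime of the directory itself or of any file
    // anywhere beneath it.
    public static DateTime LastWriteRecursive(DirectoryInfo dir)
    {
        DateTime latest = dir.LastWriteTime;
        foreach (FileInfo f in dir.GetFiles())
            if (f.LastWriteTime > latest) latest = f.LastWriteTime;
        foreach (DirectoryInfo sub in dir.GetDirectories())
        {
            DateTime subLatest = LastWriteRecursive(sub);
            if (subLatest > latest) latest = subLatest;
        }
        return latest;
    }
}
```

Even so, the NTFS laziness caveat raised in the earlier thread still applies: timestamps may lag the actual writes.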
2) In the method that the worker thread runs, the constantly cycling WHILE loop doesn't seem like the best way to do it. However, the thread needs to stay alive through two different processes; the first process that tells the Rimage to create an ISO image of the files and the second process that sends an order to the Rimage to burn that ISO image to a cd. So having the method loop while it worked its way through those processes seemed the best way to do it at the time. Upon further reflection, I'm sure there's a better method than that to manage multiple orders independently while not tying up my UI thread and making my form non-responsive...I just don't have a clue what it is and a lot of the tutorials that I've looked at just don't seem to fit with what I'm trying to accomplish here.
In case you couldn't tell by now, I'm a very inexperienced programmer (read: HTML monkey/Layout guy) that has been assigned with this project. Kind of like learning to swim by being tossed in the middle of a lake. With an anvil around your ankle. And sharks. I would really appreciate any guidance or links to resources that some kind-hearted guru can provide that would help with the above issues.
Thanks in advance.
|