|
Yep, that is what I would try; intuitively I would use odd line widths, though.
Luc Pattyn [Forum Guidelines] [Why QA sucks] [My Articles]
I only read code that is properly formatted, adding PRE tags is the easiest way to obtain that. [The QA section does it automatically now, I hope we soon get it on regular forums as well]
|
Rob Philpott wrote: That's one approach...
Which is nice, apart from the fact that you are not calling .Dispose() on the GraphicsPath and Pen instances you created.
All the best,
Martin
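For illustration, a minimal sketch of disposing both objects via using blocks; the shape, the pen width, and the RingDrawer/DrawRing names are invented here, not the original poster's code:

```csharp
// Sketch: ensure GraphicsPath and Pen are disposed even if drawing throws.
using System.Drawing;
using System.Drawing.Drawing2D;

class RingDrawer
{
    public static void DrawRing(Graphics g)
    {
        using (GraphicsPath path = new GraphicsPath())
        using (Pen pen = new Pen(Color.Black, 3f)) // an odd width, as suggested above
        {
            path.AddEllipse(10f, 10f, 100f, 100f);
            g.DrawPath(pen, path);
        } // both path and pen are disposed here, even if DrawPath throws
    }
}
```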
|
Thanks, I'll try your solutions...
Alex
|
Hello,
I have another problem. Is there a way to avoid spoiling the circuit drawn underneath? To draw the current point as it moves, I use the DrawEllipse function (at the current position). How can I avoid cluttering the circuit with all the subsequent DrawEllipse calls?
Thanks.
Alex
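One common way out of this (a sketch only; CircuitPanel, DrawCircuit, and the point size are invented for illustration) is to stop drawing on top of the old frame and instead repaint the whole scene from the Paint handler each time the point moves:

```csharp
// Sketch: repaint the circuit and the single current point on every frame,
// instead of accumulating DrawEllipse calls on top of each other.
using System.Drawing;
using System.Windows.Forms;

class CircuitPanel : Panel
{
    public Point CurrentPos { get; private set; }

    public CircuitPanel()
    {
        DoubleBuffered = true; // reduces flicker while repainting
    }

    public void MovePoint(Point p)
    {
        CurrentPos = p;
        Invalidate(); // triggers OnPaint, which redraws circuit + point
    }

    protected override void OnPaint(PaintEventArgs e)
    {
        base.OnPaint(e);
        DrawCircuit(e.Graphics); // the static circuit, redrawn each time
        e.Graphics.FillEllipse(Brushes.Red,
            CurrentPos.X - 3, CurrentPos.Y - 3, 6, 6); // only one point survives
    }

    private void DrawCircuit(Graphics g)
    {
        // placeholder for whatever draws the circuit itself
        g.DrawEllipse(Pens.Black, 20, 20, 150, 150);
    }
}
```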
|
Haven't you posted this at least two times before...? This is not the forum for this kind of post..
|
How many times are you going to do this? You have already annoyed a lot of people. I would say stop it before someone reports you and you get banned.
People have already done that. Now what are you pleading for? If you think this is how your article will get published, then let me tell you: no, it will not be.
50-50-90 rule: Anytime I have a 50-50 chance of getting something right, there's a 90% probability I'll get it wrong...!!
|
Not again.
CCC solved so far: 2 (including a Hard One!)
37!?!! - Randall, Clerks
|
Here is the deal: for each "Rank my article please" you publish on a forum, I will find a way to vote your "article" another 1, thus reducing the probability it will ever become public.
|
Just report it as a substandard article. That way it will never see the light of day.
"WPF has many lovers. It's a veritable porn star!" - Josh Smith As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.
My blog | My articles | MoXAML PowerToys | Onyx
|
Your article has been ranked - several times. You choose to ignore the quite serious criticisms however so I, for one, will not approve your article.
|
Hey guys,
I'm developing an application (a Windows service, in fact) which occasionally creates a large number of objects. It's reasonable that the service then consumes a lot of memory (up to 500 MB). However, when the service is done processing all the objects (which are disposed after processing), the service still claims the full 500 MB. I have spent an entire day looking for a memory leak in the code, but that does not seem to be the problem.
Can anybody explain to me (or point me to a good, reliable document that explains) when memory is 'released' after a process has claimed it? I don't want to mess with the garbage collector, because the GC should figure out when to clean things up.
Thanks!
|
Are you doing anything through unmanaged code? I think ANTS provides a graphical view of memory consumption. You might give that a try to find which part of your code is taking up that much memory.
|
I was also thinking about P/Invoke stuff not being taken care of..
|
Nope, it's only managed code. I'm not surprised by the amount of memory consumed, because the number of instantiated objects is huge; once the objects have done their job, they are disposed. I'm surprised that the service still uses a lot of memory even when all objects are disposed and the service is idle.
|
Well, in that case, check whether you are holding references to any of the huge objects you created. Run a memory profiler and see the results. That should tell you what consumes that much memory when the service is idle. Also check whether you use static objects. AFAIK, they exist throughout the lifetime of your service.
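The static-object point can be illustrated with a short sketch (Cache and Loader are invented names, not the poster's code):

```csharp
// Sketch: a static collection is a GC root, so everything it references
// stays alive for the lifetime of the process (or Windows service).
using System.Collections.Generic;

static class Cache
{
    // Anything added here remains reachable forever unless it is
    // explicitly removed, even while the service is idle.
    public static readonly List<byte[]> Chunks = new List<byte[]>();
}

class Loader
{
    public void LoadChunk()
    {
        Cache.Chunks.Add(new byte[1024]); // held until Chunks is cleared
    }
}
```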
|
Oops, I guess I found the problem...
I'm dividing files into chunks of 1024 bytes. The code does something with each chunk and then fires an event which signals that the chunk is complete. The event handler then calls a method to start processing the next chunk. This creates a nasty piece of recursive code which consumes a lot of memory. I solved this and memory usage is more stable now. I will, however, run a profiler later to check for (and find) additional issues. For now everything seems fine!
Thanks for the help, appreciate that!!
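The fix described above, flattening event-driven recursion into a loop, might look roughly like this (ChunkProcessor and the method names are invented):

```csharp
// Sketch: iterate over the chunks instead of having the chunk-complete
// event handler call back into the processing method recursively,
// which keeps every stack frame and its locals alive at once.
using System.Collections.Generic;

class ChunkProcessor
{
    public int ChunksProcessed { get; private set; }

    public void ProcessAll(IEnumerable<byte[]> chunks)
    {
        foreach (byte[] chunk in chunks)
        {
            ProcessChunk(chunk); // each chunk becomes collectible right after
            ChunksProcessed++;
        }
    }

    private void ProcessChunk(byte[] chunk)
    {
        // placeholder for the real per-chunk work
    }
}
```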
|
Hello,
I followed the discussion and have a wild guess now!
Eduard Keilholz wrote: The code does something with each chunk and then fires an event which tells the chunk is complete. The event handler then calls a method to start processing the next chunk.
If you are referencing this chunk from an object which also exposes the event ...
And if you register (subscribe to) this event for every chunk from your "managing" class and do not unregister (unsubscribe) after completion ...
Your "managing" class would hold all the references to the objects which hold the references to the chunks, and therefore the GC would not be able to free your objects.
Just a wild guess!
Hope you find it.
P.S.: I'm sure with this[^] memory profiler you will find it quickly!
All the best,
Martin
modified on Thursday, January 21, 2010 5:46 AM
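The subscribe/unsubscribe pattern Martin describes could be sketched like this (FileWorker, Manager, and the event names are invented for illustration):

```csharp
// Sketch: unsubscribing after completion, so the event's delegate list
// no longer keeps the subscriber and publisher tied together for the GC.
using System;

class FileWorker
{
    public event EventHandler Completed;
    public void Finish() => Completed?.Invoke(this, EventArgs.Empty);
}

class Manager
{
    public int CompletedCount { get; private set; }

    public void Attach(FileWorker worker)
    {
        worker.Completed += OnCompleted; // the event now references this Manager
    }

    private void OnCompleted(object sender, EventArgs e)
    {
        CompletedCount++;
        // Unregister, so neither object roots the other any longer.
        ((FileWorker)sender).Completed -= OnCompleted;
    }
}
```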
|
Hey Martin,
Thanks for your reply.. This is what happens:
A file becomes 'active'. As soon as a file becomes active, the 'managing' class registers for three events (Completed, ProcessChanged and ErrorOccured). As soon as ProcessChanged is raised, the 'active' file should start handling the next file part. As soon as Completed or ErrorOccured is fired, the 'managing' class unregisters from the active file's events and a new file becomes 'active'. This process starts over and over again until all files are completely processed.
Now, ProcessChanged immediately called the 'next file part' method, which resulted in a recursive bunch of code. This caused the extremely large memory consumption, since some files are fairly large. I changed this so the recursion doesn't take place anymore. My Windows service now claims about 27 MB tops, instead of the previous 500 MB.
I'll be running the memory profiler soon. Thanks for the interest!
|
Hello,
If you really unregister all the events (and I'm sure you do), I think you do not have a memory leak at all.
It's just that, as you found out, you were holding on to a lot of your chunk objects because of the recursive calls.
This forces the GC to move the objects into the next generation.
Once your action is done and no more new objects have to be created, the GC will not see the need to clean up the generations.
I would assume that if you started that action again (in the first implemented version), the 500 MB would not be topped, unless you had forgotten to unregister the events or were holding the objects in a collection, for example!
Here[^] is an article which explains how the GC passes objects through the generation levels!
All the best,
Martin
|
The Windows service now starts out claiming about 26 MB of RAM. If I make the service work (really work) it grows to about 28 MB and then stabilizes (did I write that correctly?). It seems that my problem WAS the recursion and not a memory leak. Pretty happy after a day of stress testing, thanks for the help!
|
Also, keep in mind that if you're looking at Task Manager to tell you how much memory your app is using, it's lying to you. You're seeing how much memory is RESERVED by the .NET CLR for your app, NOT how much your app is actually using. The .NET CLR keeps a managed memory pool that your objects are allocated from. If you free an object, the memory goes back into the pool for future use. It is NOT returned to Windows! The .NET CLR will return memory to Windows if Windows wants it back. Otherwise, it'll keep the memory in the managed pool.
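To watch the managed heap rather than the reserved working set, GC.GetTotalMemory can be sampled; a minimal sketch (MemoryDemo and the 100 x 1 KB allocation are just illustrative, not the poster's service):

```csharp
// Sketch: sampling managed heap usage before and after allocating,
// instead of trusting Task Manager's working-set number.
using System;

static class MemoryDemo
{
    public static (long before, long during, long after) Run()
    {
        long before = GC.GetTotalMemory(forceFullCollection: true);

        byte[][] chunks = new byte[100][];
        for (int i = 0; i < chunks.Length; i++)
            chunks[i] = new byte[1024]; // 1 KB chunks, as in the thread

        long during = GC.GetTotalMemory(false);
        chunks = null; // drop the only reference

        long after = GC.GetTotalMemory(forceFullCollection: true);
        Console.WriteLine($"before={before} during={during} after={after}");
        return (before, during, after);
    }
}
```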
|
Erhm, actually... that WAS the tool I was using to monitor the memory consumption. I'll try some performance counters; maybe I can find a more accurate result there.
Thanks for sharing your opinion...
|
I would probably agree here. The objects you have are probably still referenced within the main application, either through events (the little devils) or member variables. Setting these variables to null, rather than relying on the references going out of scope, may allow the garbage collector to work its magic, or you can try calling it directly to see if it's just being lazy.
Event disposal
How to: Subscribe to and Unsubscribe from Events
Just noticed that Covean has posted about the GC; that should be useful.
modified on Thursday, January 21, 2010 5:50 AM
|
Disposing an object does not mean that the GC collects it immediately, or even frees its memory. The GC tries to keep memory allocation fast and flexible, and if your app constantly allocates large objects with medium/long lifetimes and then disposes of them, the GC may decide not to free their memory because it expects new object allocations soon.
I would try forcing the GC to collect all generations while in the idle state:
GC.Collect();
GC.WaitForPendingFinalizers();
If your memory consumption now drops in the idle state, you know that the GC causes this behaviour, and you should decide whether it's better to let the GC do it on its own or to force it.
If your memory consumption doesn't drop, then you really should check whether you are using some unmanaged resources.
Greetings
Covean
|