I have A LOT of calculations to perform so I wrote a quick program to do them for me.
For some reason I can't seem to use all of my CPU; it seems like the system is throttling it. I wrote a simple application to demonstrate this behavior. See the code below:
Parallel.For(0, 10000, x =>
{
    int i = 0;
    while (true)
    {
        i++;
    }
});
Running the above code in a console app, it maxes out all of my processors and my CPU utilization is at 100% (as I would expect).
In my real calculation I create several Lists and arrays, so I added an allocation to the code above, and that is when it started to behave just like my actual program (using less than 100% of the CPU). See the example below:
Parallel.For(0, 10000, x =>
{
    int i = 0;
    while (true)
    {
        i++;
        List<int> vs = new List<int>();
    }
});
On my computer, which has 12 logical processors, CPU utilization only reaches 70-80%. On a higher-performance PC I have, which has 32 logical processors, the effect is even more dramatic: CPU utilization peaks at about 50%. And my real application that performs the calculations uses only 20% of the total CPU.
As a follow-up observation: in Task Manager I see that all logical cores are being used more or less evenly - it isn't maxing out some cores while others sit idle.
I am curious why this is happening, and more importantly, how can I make my program use all of the available CPU?
What I have tried:
I tried monitoring garbage collection, but that didn't offer any clues (and GC being the sole culprit wouldn't make sense to me anyway, because I would expect garbage collection work to show up as CPU usage).
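For reference, the monitoring I did was roughly along these lines: comparing GC.CollectionCount before and after the workload (the timing and workload here are just placeholders, not my actual calculation):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class GcCheck
{
    static void Main()
    {
        // Snapshot per-generation collection counts before the workload
        int gen0 = GC.CollectionCount(0);
        int gen1 = GC.CollectionCount(1);
        int gen2 = GC.CollectionCount(2);

        // Simplified stand-in for the allocating loop (finite so it terminates)
        Parallel.For(0, 10000, x =>
        {
            for (int i = 0; i < 1000; i++)
            {
                List<int> vs = new List<int>();
            }
        });

        // Report how many collections the workload triggered
        Console.WriteLine($"Gen0: {GC.CollectionCount(0) - gen0}");
        Console.WriteLine($"Gen1: {GC.CollectionCount(1) - gen1}");
        Console.WriteLine($"Gen2: {GC.CollectionCount(2) - gen2}");
    }
}
```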
I tried increasing the process priority from Normal to High and even Real-Time, but that didn't change anything.
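In case it matters, this is how I raised the priority from code (rather than via Task Manager), using System.Diagnostics:

```csharp
using System.Diagnostics;

class PriorityDemo
{
    static void Main()
    {
        // Raise the current process's scheduling priority.
        // RealTime generally requires elevated (administrator) rights.
        using (Process current = Process.GetCurrentProcess())
        {
            current.PriorityClass = ProcessPriorityClass.High;
            // current.PriorityClass = ProcessPriorityClass.RealTime;
        }

        // ... run the calculation here ...
    }
}
```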