Okay, long story short: I wanted to test the raw computing speed of C# on my CPU. I created a simple class with an integer property (Infos.tries) and wrote this in Main():
while (true) { Infos.tries++; }
I then added another thread that periodically reads that int and subtracts the previous reading, giving the exact number of increments the machine had done in that interval. With this setup I was getting up to about 80 million increments, with my CPU running at about 9%. But here comes the problem:
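To make the setup concrete, here is roughly what the whole thing looks like. This is a reconstruction rather than my exact code: Infos.tries is shown as a plain static field instead of a property, and the measurement simply runs once per second in Main():

    using System;
    using System.Threading;

    static class Infos
    {
        // Shared counter that the worker thread increments as fast as it can.
        public static int tries;
    }

    class Program
    {
        static void Main()
        {
            // Worker thread hammering the shared counter in a tight loop.
            var worker = new Thread(() =>
            {
                while (true) { Infos.tries++; }
            });
            worker.IsBackground = true;
            worker.Start();

            // Measurement loop: once per second, read the counter and report
            // how far it advanced since the previous reading.
            int previous = 0;
            while (true)
            {
                Thread.Sleep(1000);
                int current = Infos.tries;
                Console.WriteLine("Increments in the last second: " + (current - previous));
                previous = current;
            }
        }
    }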
Since it was only using about 9% of my CPU, I added a second incrementing thread, expecting to double the CPU usage and double the throughput. The process's CPU usage did roughly double (to about 18-20% total load), but the number of increments dropped to about 38-50 million...
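The only change for the second test was the worker setup in the sketch above, roughly:

    // Second test: two threads now increment the same shared int,
    // with no locking and no Interlocked, exactly as in the first version.
    var worker1 = new Thread(() => { while (true) { Infos.tries++; } });
    var worker2 = new Thread(() => { while (true) { Infos.tries++; } });
    worker1.IsBackground = true;
    worker2.IsBackground = true;
    worker1.Start();
    worker2.Start();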
Why would it use twice the CPU and yet do roughly 50% less work?