How much does it improve performance?
I didn't get much of a baseline in terms of credits (pending, impatience, etc). But I'm watching the GPU load. Running one at a time, the typical load was about 50% - now it's >95%. Completion times have not noticeably increased, either. I'll post my credit total at the end of the day.
I only have a handful of tasks validated so far, but this is what I found:
GTX 580:
o Normal: Run time averaged 70 seconds and pay averaged 51. That works out to about 63,330/day.
o With app_info: Run time averaged 133 seconds and pay averaged 61. That works out to about 78,865/day.
GTX 295:
o Normal: Run time averaged 352 seconds and pay averaged 1231. That works out to about 30,234/day.
o With app_info: Run time averaged 524 seconds and pay averaged 61. That works out to about 40.057/day.
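For anyone wanting to check their own cards, the daily totals above come from a simple tasks-per-day-times-pay calculation. A quick sketch (the factor of two concurrent tasks per GPU under app_info is my assumption, inferred from the throughput numbers; the results differ slightly from the posted figures because the posted averages are rounded):

```python
SECONDS_PER_DAY = 86_400

def credits_per_day(run_time_s, pay_per_task, concurrent_tasks=1):
    """Estimate daily credit: tasks/day per slot, times pay, times slots."""
    return SECONDS_PER_DAY / run_time_s * pay_per_task * concurrent_tasks

# GTX 580, normal (one task at a time): roughly 63k/day
print(round(credits_per_day(70, 51)))      # 62949
# GTX 580 with app_info (assuming two concurrent tasks): roughly 79k/day
print(round(credits_per_day(133, 61, 2)))  # 79254
```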
The odd bit is the credits awarded. Normally, they vary between 48 and 54 per task on the 580, but with the app_info they are all awarded exactly the same 60.74 credits. Similar situation with the 295: all exactly 60.74.
That's really interesting - I am not seeing the same behavior. I'm seeing huge variation in the number of credits awarded, from ~40 to ~142, with an average of 60.79 (damn close to your fixed value).
Since you did it, I went and calculated my numbers, and I'm seeing a similar gain. My sample of normal WUs is fairly small, but I was getting 48k/day on a 570 running normally, and 59k/day with the app_info file.
Just curious, why did you pick 0.75 CPU per task? With my GTX 295, that takes up 3 of my 4 CPUs, not leaving much for CPU crunching. Can it be lowered without impacting performance? Anyone feel like running some experiments?
I used 0.75 because the default was 0.85 and 0.75 seemed rounder - completely arbitrary, but I wanted to stay close to the default.
For the record: At exactly 06:00 UTC 27 March 2011, I changed it over to 0.25. Anything reported after that time has the new app_info system...
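For anyone wondering where that number lives, the CPU fraction is the avg_ncpus element of each app_version in app_info.xml. A trimmed sketch only - the app name, executable name, and version number below are placeholders, and the exact set of tags varies by project and client version:

```xml
<app_info>
  <app>
    <name>example_app</name>            <!-- placeholder: use the project's app name -->
  </app>
  <file_info>
    <name>example_app_cuda.exe</name>   <!-- placeholder executable name -->
    <executable/>
  </file_info>
  <app_version>
    <app_name>example_app</app_name>
    <version_num>100</version_num>      <!-- placeholder version -->
    <avg_ncpus>0.25</avg_ncpus>         <!-- the CPU fraction discussed above -->
    <coproc>
      <type>CUDA</type>
      <count>0.5</count>                <!-- 0.5 = two tasks per GPU -->
    </coproc>
    <file_ref>
      <file_name>example_app_cuda.exe</file_name>
      <main_program/>
    </file_ref>
  </app_version>
</app_info>
```

After editing, the client has to be restarted (or told to re-read config files) before the new values take effect.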