Thread: GTX 1060 3GB errors out on Collatz

  1. #1
    DrPop (Past Administrator)
    Join Date: October 13th, 2010 | Location: SoCal, USA | Posts: 7,635

    GTX 1060 3GB errors out on Collatz

    I've got a 1060 3GB card that for some reason errors out on every Collatz WU it gets. Any idea what I can do to troubleshoot something like this? I already tried detaching and reattaching. I'm getting good results with a GTX 980, 1060 6GB and 750 Ti on the same account with the same settings, so that's what's got me puzzled.

    Also, the GTX 980 is busting out Collatz WUs in just over 16 minutes, while the GTX 1060 6GB is taking over 25 mins. From everything I could read online when I bought the 1060 6GB, they should be really neck and neck with each other. Any idea why the 1060 is so slow and/or the 980 is so fast in comparison?
    Thanks for any ideas on either of these questions!

  2. #2
    Bryan (Administrator)
    Join Date: October 27th, 2010 | Location: CO summer, TX winter | Posts: 6,457

    Re: GTX 1060 3GB errors out on Collatz

    On the one that errors out all WUs, make sure you have the MS Visual C++ runtime installed: for Windows, the most recent Visual C++ runtime, 2012 or later.

    In your project folder there are 2 files that end with .config. Put this in both of them:

    verbose=1
    kernels_per_reduction=48
    threads=8
    lut_size=17
    sleep=1
    reduce_cpu=0
    sieve_size=28

    Read THIS thread to find the optimized settings for your other cards.
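
    If you'd rather script this than hand-edit the two files, below is a minimal sketch of one way to do it in Python. It's only a sketch under assumptions: the BOINC data path and project folder name are placeholders, the script simply overwrites each .config file with the settings above, and it's probably safest to suspend Collatz in BOINC Manager before touching the files.

# Minimal sketch (not from the thread): write the same settings into every
# *.config file in the Collatz project folder.
from pathlib import Path

SETTINGS = """\
verbose=1
kernels_per_reduction=48
threads=8
lut_size=17
sleep=1
reduce_cpu=0
sieve_size=28
"""

# ASSUMED location -- typical Windows BOINC data dir; adjust for your install.
project_dir = Path(r"C:\ProgramData\BOINC\projects\boinc.thesonntags.com_collatz")

config_files = sorted(project_dir.glob("*.config"))
if not config_files:
    raise SystemExit(f"no .config files found in {project_dir}")

for cfg in config_files:
    cfg.write_text(SETTINGS)   # replaces the file contents with the settings above
    print(f"wrote settings to {cfg.name}")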


  3. #3
    John P. Myers (Platinum Member)
    Join Date: January 13th, 2011 | Location: Jackson, TN | Posts: 4,502

    Re: GTX 1060 3GB errors out on Collatz

    Quote Originally Posted by Bryan View Post
    On the one that errors out all WUs, make sure you have the MS Visual C++ runtime installed: for Windows, the most recent Visual C++ runtime, 2012 or later.
    http://www.setiusa.us/showthread.php...time-Installer


  4. #4
    DrPop (Past Administrator)
    Join Date: October 13th, 2010 | Location: SoCal, USA | Posts: 7,635

    Re: GTX 1060 3GB errors out on Collatz

    Thanks to both of you, that was definitely the problem with the 1060 3GB erroring out. Looks like it's running well now. Bryan - I'll have to play with those files and settings to see if I can get anything more out of these 1060s, but man - if that's all they can do, the 980 still smokes them despite everything I read claiming they were nearly an equivalent card for less watt burn. Guess that'll teach me...

  5. #5
    Bryan (Administrator)
    Join Date: October 27th, 2010 | Location: CO summer, TX winter | Posts: 6,457

    Re: GTX 1060 3GB errors out on Collatz

    Jed, play with the lut_size 1st. I had a 1080 that was really producing poorly. I had used my config file for a 1080 Ti, which will do a lut_size of 18. On the 1080 I dropped it to 17 and it took off and started producing almost double.

    This is what Slicker says:
    lut_size - this is the size (as a power of 2) of the lookup table. Valid options are 2 to 31. Chances are that any value over 20 will cause the GPU driver to crash and processing to hang. The default is 10, which results in 2^10 or 1024 items. Each item uses 8 bytes, so 10 would result in 2^10 * 8 bytes or 8192 bytes. Larger is better so long as it will fit in the GPU's L1/L2 cache. Once it exceeds the cache size, it will actually take longer to complete a WU since it has to read from slower global memory rather than high-speed cached memory.

    Mikey reported that he is getting 3.1M/day on a 1060 using:

    verbose=1
    kernels_per_reduction=48
    threads=8
    lut_size=15
    sleep=0
    cache_sieve=1
    reduce_cpu=0
    sieve_size=30

    I don't know what your 980 does, but this will give you an idea of what a 1060 should do.
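
    To put rough numbers on Slicker's description above: the table takes 2^lut_size * 8 bytes, so each step of lut_size doubles it. The quick sketch below just prints those sizes; the L2 cache figures in the comments are commonly quoted specs for these cards, not something stated in this thread.

# Worked example of Slicker's formula: lookup table size = 2**lut_size * 8 bytes.
# Cache figures are commonly quoted specs (assumed, not from this thread):
# GTX 1060 ~1.5 MiB L2, GTX 980 / GTX 1080 ~2 MiB L2, GTX 1080 Ti ~2.75 MiB L2.
for lut_size in (15, 16, 17, 18):
    size_kib = (2 ** lut_size) * 8 / 1024
    print(f"lut_size={lut_size}: {size_kib:.0f} KiB")

# Prints:
# lut_size=15: 256 KiB
# lut_size=16: 512 KiB
# lut_size=17: 1024 KiB
# lut_size=18: 2048 KiB

    That at least roughly lines up with the behavior described above: lut_size=18 is a 2 MiB table, a tight fit on a 1080 and far too big for a 1060's cache, while Mikey's lut_size=15 (256 KiB) fits comfortably.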


  6. #6
    DrPop (Past Administrator)
    Join Date: October 13th, 2010 | Location: SoCal, USA | Posts: 7,635

    Re: GTX 1060 3GB errors out on Collatz

    Wow. I'm not getting anything close to that on either 1060. I think the 980 is technically a slightly faster card, but I must have everything at default and it's burning through them in 16 mins+ while the 1060 cards (I have one 6GB and one 3GB) are taking 25 mins+! So that must be the problem. Since the 980 was a higher end card of its generation compared to the 1060, I wonder if it has more cache or something like that? I'll definitely have to play with those numbers!

  7. #7
    Bryan (Administrator)
    Join Date: October 27th, 2010 | Location: CO summer, TX winter | Posts: 6,457

    Re: GTX 1060 3GB errors out on Collatz

    Also, be sure you have that setup in both the 32-bit and 64-bit .config files. I found one of my 64-bit machines was sometimes being given 32-bit WUs. I had only put the optimization in the 64-bit file, so it was really slow on the 32-bit app. When I added the config to the 32-bit file it woke up and performed equal to the 64-bit app.
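
    A rough sketch of one way to catch that kind of mismatch: diff whatever .config files are in the project folder and print any setting that differs or is missing. The folder path is a placeholder again, and the simple key=value parsing is an assumption based on the config snippets in this thread.

# Rough sketch: flag settings that differ between the .config files
# (e.g. a tuned 64-bit file vs. a still-default 32-bit one).
from pathlib import Path

# ASSUMED path -- point this at your actual Collatz project folder.
project_dir = Path(r"C:\ProgramData\BOINC\projects\boinc.thesonntags.com_collatz")

def read_settings(path):
    """Parse simple key=value lines, skipping blanks."""
    settings = {}
    for line in path.read_text().splitlines():
        line = line.strip()
        if line and "=" in line:
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip()
    return settings

configs = {p.name: read_settings(p) for p in sorted(project_dir.glob("*.config"))}
all_keys = sorted({key for s in configs.values() for key in s})

for key in all_keys:
    values = {name: s.get(key, "<missing>") for name, s in configs.items()}
    if len(set(values.values())) > 1:   # print only the mismatches
        print(key, values)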


  8. #8
    DrPop (Past Administrator)
    Join Date: October 13th, 2010 | Location: SoCal, USA | Posts: 7,635

    Re: GTX 1060 3GB errors out on Collatz

    Holy moley! What a difference! I don't know what the final time will be, but with the settings from Mikey that you shared in the .config file, the "expected run time" is dropping like a rock. Yeah baby!

    Interesting thing: I only have an x64 config file - there aren't any 32-bit files in the project folder. With results like that on the 1060, do you think I should mess with the 980 config file at all, or just be happy with where it's at now? Thanks so much for this info, huge increase in credits coming.

    EDIT: I just checked, and here's what I had on my 980 rig:

    verbose=1
    kernels_per_reduction=64
    threads=8
    lut_size=18
    sleep=1
    reduce_cpu=0
    sieve_size=30

    I must have gotten that from someone on here at some point last year on a Collatz challenge, maybe? Do you think I should try upping the Kernels and lut_size more like this on the 1060 rigs as well, or is Mikey's "dialed in" for the 1060?
    Last edited by DrPop; 01-09-18 at 06:21 PM.

  9. #9
    FourOh (Gold Member)
    Join Date: January 16th, 2013 | Location: Memphis, TN | Posts: 1,036

    Re: GTX 1060 3GB errors out on Collatz

    Quote Originally Posted by DrPop View Post
    [...] EDIT: I just checked, and here's what I had on my 980 rig:

    verbose=1
    kernels_per_reduction=64
    threads=8
    lut_size=18
    sleep=1
    reduce_cpu=0
    sieve_size=30

    I must have gotten that from someone on here at some point last year on a Collatz challenge, maybe? Do you think I should try upping the Kernels and lut_size more like this on the 1060 rigs as well, or is Mikey's "dialed in" for the 1060?
    It looks like you're close to the high end of the average for a GTX 980... try changing threads to 9 and see what happens. I wouldn't take the lut_size any higher... if anything bring it down to 17.



  10. #10
    FourOh (Gold Member)
    Join Date: January 16th, 2013 | Location: Memphis, TN | Posts: 1,036

    Re: GTX 1060 3GB errors out on Collatz

    I'm getting very good results (3.3M/day) on my GTX 980 with these settings:

    verbose=1
    kernels_per_reduction=48
    threads=8
    lut_size=16
    sleep=1
    reduce_cpu=0
    sieve_size=29


