Quote Originally Posted by MindCrime:
The way I see it, it's one of those things that look really cool when graphed but seem mundane on paper.
G G G Eb F F F D

Boring, right? Mundane. Seemingly pointless. Random. In this context, yes. When graphed in another context, not so much. (Those notes, for the record, open Beethoven's Fifth.)

Why do I crunch Collatz? Because it's a chance to study an interesting pattern! It's not my project, my time, or my hardware, but I dislike the idea of saying we're trying to "disprove" the conjecture. It's not a football game; there need not be a winner and a loser. While our computers are fantastic at churning through enormous volumes of computation, and in recent decades have even become passable at pattern recognition, they are still (thankfully for our survival) pitiful at discovering a USE for the patterns and knowledge they unearth. Our computers complete this simple arithmetic task over and over again because Slicker's program tells them to. They feel no tug of their governing idée fixe.
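
For anyone who hasn't watched it run, that arithmetic is just the hailstone rule: halve the figure if it's even, triple it and add one if it's odd. A minimal sketch in Python (purely illustrative; I haven't seen Slicker's actual code, which is far more optimized than this):

[code]
# Toy illustration of the Collatz "hailstone" rule (not the project's app).
def collatz_step(n: int) -> int:
    """One application of the rule: halve if even, 3n+1 if odd."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def trajectory(n: int) -> list[int]:
    """Iterate until we hit 1, assuming (as the conjecture claims) we always do."""
    seq = [n]
    while n != 1:
        n = collatz_step(n)
        seq.append(n)
    return seq

print(trajectory(6))  # [6, 3, 10, 5, 16, 8, 4, 2, 1]
[/code]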

We may yet disprove the conjecture. Though we think we see a pattern that extends to infinity, there may be a pattern over the pattern, the first member of which is an enormous figure. Or it could just be one odd alignment that throws a wrench into the whole thing. Then what do we do? Where's your integer god now, eh? Even if we never disprove the conjecture, the study of it may prove useful to just the right human with just the right pattern of neurons. We're building a huge record of figures we know eventually reduce to 1. Some do it very quickly. Others take a little longer. Still others take a MUCH longer time to reduce. The study of this record could someday underpin a queueing architecture for financial transactions, a memory-access algorithm, or a 15-Grammy-winning dubstep album. I don't know. I don't see anything in it. The right person might.
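
To see what "some quickly, others MUCH longer" means, here's a toy version of that record, again plain illustrative Python rather than anything the project ships: count the steps each figure takes to reach 1.

[code]
# Toy version of the "record": steps for each figure to reduce to 1.
def stopping_time(n: int) -> int:
    """Count Collatz steps until n reaches 1."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Neighbors can behave wildly differently: 26 needs 10 steps, 27 needs 111.
for n in (26, 27, 28):
    print(n, stopping_time(n))
[/code]

Plot those counts over a few million figures and you get exactly the kind of thing that "looks really cool when graphed."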

Suppose that person then uses our little dataset as the underpinnings of a new digital architecture. That person goes on to win the Nobel Prize, and all future computers manage their enormous power based on this new algorithm, including those at the Department of Defense. Years go by and humanity marches on fighting, loving, crunching... until the first computer hits a figure around 10^25839427864. It tries to queue something, but the call never returns, or lands so many steps away that it "practically" disappears to the rest of the system. The information, no matter how trivial or small, falls into an endless computation sequence. Other computers begin to hit the same figure. An automated NORAD monitor registers this as a sign that the stations have been eliminated. The computers still functioning complete their retaliatory launch sequences before becoming useless bricks (or CreditNew servers, same difference).

THE WORLD ENDS BECAUSE YOU DIDN'T SEE WHY IT WAS IMPORTANT TO "ENDLESSLY" COMPUTE COLLATZ!

Do you get it now?! Does not being consumed in a nuclear holocaust make "sense" to you?! Good.

Oh, yeah, and also Slicker's the man. The project is well-run, has an active community, and blazes a trail for new hardware/driver/software uses and implementations. And it keeps my Intel iGPUs from killing themselves out of boredom.

In other news: I'm bored at work.