Alright, so I just did the math - running the highest-end AMD rig from BikerMatt for the Hydra here - and if I were to swap it out for a Sandy Bridge setup, here's what I find:
The difference in TDP is 30 W (125 W for the AMD X6 vs. 95 W for the i7).
That 30 W works out to roughly 21.6 kWh over a 30-day month at 24/7 operation (30 W × 24 h × 30 days).
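For anyone who wants to sanity-check it, here's the back-of-envelope calc. This assumes both chips actually pull their full TDP around the clock, which is a simplification - real draw under load will differ:

```python
# Monthly energy delta from the TDP difference (simplified: full TDP, 24/7)
tdp_amd_w = 125        # AMD X6 TDP (watts)
tdp_intel_w = 95       # Sandy Bridge i7 TDP (watts)
hours_per_month = 24 * 30

delta_w = tdp_amd_w - tdp_intel_w              # 30 W
delta_kwh = delta_w * hours_per_month / 1000   # ~21.6 kWh/month

print(f"TDP delta: {delta_w} W")
print(f"Monthly energy delta at 24/7: {delta_kwh:.1f} kWh")
```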

Now, taking a good hard look at my electric bill, I notice that most of the "fees" I pay for my office are built-in transmission, delivery, blah, blah charges - about half of them don't change no matter how much juice I'm sucking down; they're just going to be high because it's a business. OK, big deal, right? So I did the math on the part of my bill that would actually change, and ...
I get a whopping $4 or so! That's it. Let's give it the benefit of the doubt and say I screwed up somewhere and it would be $5 per month in savings with the i7 CPU.
That's still only $60 per year in power savings at 24/7 usage. Verdict: it would take a looooong time for the savings to cover the cost of one of those chips. Not really sure it's worth it, unless the points are just that much better than on the AMD X6 CPUs?
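If you want to plug in your own numbers, here's a rough sketch of the payback math. The $/kWh rate and the upgrade cost are placeholder assumptions for illustration, not the actual figures from my bill:

```python
# Rough payback estimate - rate and upgrade cost are assumed placeholders
delta_kwh_per_month = 21.6     # from the calc above
variable_rate = 0.185          # $/kWh, usage-based portion of the bill (assumed)
upgrade_cost = 500.0           # hypothetical cost of an i7 + board swap

monthly_savings = delta_kwh_per_month * variable_rate   # ~$4/month
yearly_savings = monthly_savings * 12
payback_years = upgrade_cost / yearly_savings

print(f"Monthly savings: ${monthly_savings:.2f}")
print(f"Yearly savings:  ${yearly_savings:.2f}")
print(f"Payback period:  {payback_years:.1f} years")
```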

Clearly I was expecting a bigger difference in power bill savings, and maybe it would be bigger for someone on a residential rate, I don't know. Anyway, just thought I'd share... back to debating about GPUs now.