Anybody ever done a calculation for credits per watt on these little Android devices? I've got POGS running now on an old Motorola Atrix 4G (MB860) that Zombie convinced me to crunch with. I'm just curious how efficient the ARM chips inside these things are compared to our Intel/AMD hogs. BOINC reports it as an ARMv7 rev 0 (v7l), which I've found out is the instruction set.
Spec sheets on the 'Net tell me the Atrix 4G has an Nvidia Tegra 2 (dual Cortex-A9) inside. I can find all kinds of info except the wattage. Best guess?

Looks like I'm getting 0.0066 credits per second on this thing. I wonder how many watts it's sucking down?
My i7 gets 0.026 credits per second per WU. That's roughly 4x the ARM's credits per second per WU, and the i7 crunches 8 of those WUs at a time, of course. So (for the sake of easy academics) let's say the i7 burns 96 W to do 32x the work of the ARM CPU (4 x 8).

This means the Intel i7 Sandy Bridge would burn 3 W for every ARM-equivalent crunched WU. Hmmm... so it's definitely a possibility that if the Tegra 2 is using less than 3 W, it's actually more efficient to crunch with than an i7! And I read somewhere the Tegra 4 is considerably more efficient than the old Tegra 2 in this example. Imagine that efficiency... but it's certainly not as effective as the i7 for building up credits quickly.
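
For anybody who wants to poke at the numbers, here's a quick Python sketch of the same back-of-the-envelope math. The 96 W draw, the 8 concurrent WUs, and the credit rates are just my figures from above (the i7 wattage is a guess, not a measurement), so swap in your own.

# Back-of-the-envelope efficiency comparison using the numbers from this post.
# All figures are rough: the 96 W i7 draw is assumed, and the Tegra 2 wattage
# is exactly what we're trying to pin down.

arm_credits_per_sec   = 0.0066   # Atrix 4G (Tegra 2), one WU at a time
i7_credits_per_sec_wu = 0.026    # i7 Sandy Bridge, per WU
i7_concurrent_wus     = 8        # 4 cores + HyperThreading
i7_watts              = 96       # assumed package draw under load

# Per-WU speed ratio and total throughput ratio
per_wu_ratio = i7_credits_per_sec_wu / arm_credits_per_sec   # ~3.9x
total_ratio  = per_wu_ratio * i7_concurrent_wus               # ~31.5x

# Watts the i7 spends to match the ARM's output
i7_watts_per_arm_equiv = i7_watts / total_ratio               # ~3 W

# Credits per joule for the i7, and the break-even power for the Tegra 2
i7_credits_per_joule   = (i7_credits_per_sec_wu * i7_concurrent_wus) / i7_watts
tegra2_breakeven_watts = arm_credits_per_sec / i7_credits_per_joule

print(f"i7 per-WU speedup:        {per_wu_ratio:.1f}x")
print(f"i7 total throughput:      {total_ratio:.1f}x")
print(f"i7 watts per ARM-equiv:   {i7_watts_per_arm_equiv:.2f} W")
print(f"Tegra 2 break-even power: {tegra2_breakeven_watts:.2f} W")

Running that spits out roughly a 3.9x per-WU speedup, ~31.5x total throughput, and a break-even point of about 3 W for the Tegra 2, which matches the hand math above.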