Cray's First GPU Supercomputer



zombie67
05-25-11, 11:56 PM
http://hardware.slashdot.org/story/11/05/24/2137256/Cray-Unveils-Its-First-GPU-Supercomputer

http://www.hpcwire.com/hpcwire/2011-05-24/cray_unveils_its_first_gpu_supercomputer.html


Hardware-wise though, the XK6 is not that different from its CPU-based brethren. The blade is basically a variant of the XE6, replacing four of the eight AMD Opteron sockets with NVIDIA Tesla GPU modules. Each four-node blade consists of two Gemini interconnect chips, four Opteron CPUs, and four NVIDIA Tesla 20-series GPUs. The Tesla in this case is the X2090, a compact form factor of the M2090 module that was introduced last week. Like the M2090, the X2090 sports a 665 gigaflop (double precision) GPU, 6 GB of GDDR5, and 178 GB/second of memory bandwidth. An XK6 cabinet can house up to 24 blades (96 nodes), which will deliver something in the neighborhood of 70 teraflops.

http://www.theregister.co.uk/2011/05/24/cray_xk6_gpu_supercomputer/
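For a rough sanity check on those cabinet numbers, here's a quick back-of-envelope calc (Python). It only uses the per-GPU and per-cabinet figures quoted above; the split between GPU and CPU flops is my own guess, since the article doesn't give the Opteron SKU.

# Back-of-envelope check of the cabinet figures quoted above.
# From the quote: 665 GF double precision per Tesla X2090,
# 4 GPU nodes per blade, up to 24 blades per XK6 cabinet.
GF_PER_GPU = 665
NODES_PER_BLADE = 4
BLADES_PER_CABINET = 24

gpus_per_cabinet = NODES_PER_BLADE * BLADES_PER_CABINET   # 96
gpu_teraflops = gpus_per_cabinet * GF_PER_GPU / 1000.0    # ~63.8

print(f"{gpus_per_cabinet} GPUs -> {gpu_teraflops:.1f} TF from the GPUs alone")
# The rest of the quoted ~70 TF per cabinet would come from the 96
# Opteron CPUs; the exact CPU contribution depends on the SKU, which
# the article doesn't state.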

YoDude9999
05-26-11, 12:47 AM
.....Soon to be available in stores near you!

Yo-

DrPop
05-26-11, 01:20 AM
Any idea on the cost of running something like that? :confused:;)

Dandasarge
05-26-11, 01:00 PM
Any idea on the cost of running something like that? :confused:;)


"running" is pretty open? Like if you owned one what the power use would be or, cost of a project like that?

DrPop
05-26-11, 02:56 PM
I was thinking more along the lines of the utility bill every month. Obviously buying one would be astronomical in price anyway ...I was more curious about the efficiency aspect. Would it be better to have a giant blade server like this, or a bunch of little rigs, when it came to paying Edison Electric at the end of the month? :confused::p

Dandasarge
05-26-11, 04:56 PM
I was thinking more along the lines of the utility bill every month. Obviously buying one would be astronomical in price anyway ...I was more curious about the efficiency aspect. Would it be better to have a giant blade server like this, or a bunch of little rigs, when it came to paying Edison Electric at the end of the month? :confused::p

Nope, it's about the same. A CPU uses the same energy either way, and the same goes for a GPU. The only thing you lose is the redundant hard drive in each rig, but you can do that on little rigs too with net booting.
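If you want to put rough numbers on the bill, it's just watts x hours x your rate. Quick Python sketch; the 600 W per rig and $0.15/kWh below are made-up example figures, not measurements of any of this hardware.

# Rough monthly electricity cost: kW x hours x $/kWh.
# The 600 W per rig and $0.15/kWh are placeholder assumptions.
HOURS_PER_MONTH = 24 * 30

def monthly_cost(watts, dollars_per_kwh=0.15):
    kwh = (watts / 1000.0) * HOURS_PER_MONTH
    return kwh * dollars_per_kwh

print(f"one 600 W rig   : ${monthly_cost(600):.2f}/month")
print(f"eight 600 W rigs: ${monthly_cost(8 * 600):.2f}/month")
# Either way the bill tracks total wattage, which is the point above;
# dropping per-rig hard drives (or net booting) only shaves a few
# watts per box.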