Why not GPUs!?

Message boards : Number crunching : Why not GPUs!?


Joined: 25 Apr 07
Posts: 8
Credit: 1,900,808
RAC: 1
Message 41691 - Posted: 4 Mar 2011, 21:36:57 UTC

I wondered if they are going to use GPUs for this project, and how difficult that kind of programming would be for them.

ID: 41691
geophi
Volunteer moderator

Joined: 7 Aug 04
Posts: 1919
Credit: 40,679,736
RAC: 8,636
Message 41693 - Posted: 4 Mar 2011, 21:46:04 UTC - in response to Message 41691.  

There are no short-term plans for that. Some climate models are being programmed to use GPUs for parts of the computation at a few universities and national centres. However, the IT resources on this project are understaffed at the best of times, and we are not in the best situation now. My guess is that the UK Met Office, where the had*** models come from, would need to do the programming itself. The result would then have to be adapted and tested by CPDN to run in the BOINC environment. Given the history of IT staffing here, that seems extremely optimistic for any sort of "near" term timeframe.
ID: 41693

Joined: 28 Mar 09
Posts: 125
Credit: 9,825,980
RAC: 0
Message 41703 - Posted: 5 Mar 2011, 14:48:11 UTC
Last modified: 5 Mar 2011, 14:49:41 UTC

There are a number of CUDA Fortran options available now. According to the Nvidia Fortran web page there are four options:

1. The Portland Group CUDA Fortran compiler
2. A Fortran-to-C CUDA translator
3. A Fortran wrapper for CUDA C
4. CUDA libraries for Fortran 95

Hopefully there is something there that could be used to get a GPU app going sooner.
ID: 41703

Joined: 19 Apr 08
Posts: 179
Credit: 4,306,992
RAC: 0
Message 41707 - Posted: 5 Mar 2011, 19:31:24 UTC - in response to Message 41703.  
Last modified: 5 Mar 2011, 19:31:51 UTC

Running CPDN's source through any of the above--if they managed to produce any binaries at all--would result in a multitude of dismembered thermal routines.

A couple of older posts:

LochDhu10yr said in 2005:
"On page 9 of the Stanford link KeeperC posted it says, 'Ideal apps to target GPGPU have minimal dependencies between data elements.'

We know in CPDN each cell influences the ones around it (temp, wind speed, humidity, etc.). So CPDN isn't a good project for a GPU, but SETI and Einstein would do well."

IMHO this will still be the case twenty years from now.
ID: 41707
