climateprediction.net home page
Using GPUs for number crunching

Questions and Answers : Wish list : Using GPUs for number crunching

Previous · 1 · 2 · 3 · 4 · 5 · Next

Profile mo.v
Volunteer moderator
Joined: 29 Sep 04
Posts: 2363
Credit: 13,363,707
RAC: 28
Message 36862 - Posted: 5 May 2009, 15:02:32 UTC

I can understand how you feel, Tullio, and I would not be happy if anyone ever felt excluded on CPDN. I know that some people have much faster computers than others, but running models on CPUs we're all more or less comparable (though not equal) and most of us can run all or most model types.

I do think that CPU and GPU credits should go into separate league tables. Just as there are separate marathons for different classes of participants.

It looks to me as if the top 100 BOINC crunchers are now almost all processing on multiple graphics cards, probably several per computer. Hardly any of them crunch much CPDN.

On CPDN what we hope members will accumulate is expertise which is just as valuable for completing models as high-end hardware.
Cpdn news
ID: 36862
BarryAZ

Joined: 13 Jul 05
Posts: 125
Credit: 11,616,200
RAC: 0
Message 36863 - Posted: 5 May 2009, 16:58:45 UTC - in response to Message 36862.  

One interesting thing is that the only cards really capable of GPU processing for BOINC are pretty much gamer cards (which are also good for photo processing and CAD/CAM work). The large majority of computer users these days don't really need those cards and do well enough with embedded GPU support (particularly with a number of newer AMD processor boards using ATI 3200 or ATI 3300 GPUs).

I picked up a couple of 9400GT cards at a nice price for systems that lacked embedded video, and then the other day for a system rebuild I picked up a 9600GT. I run GPU Grid on these CUDA cards and the 9400GT is truly 'entry level' for this task, the 9600GT is a better fit -- but that's a $100+ video card which is something I'd not deploy for regular business workstations.

One nice thing about the GPUGrid project is, as its name suggests, that it is only for GPU processing (and until the millennial day that the native BOINC client supports ATI cards, only for nVidia cards). On GPUGrid there isn't a CPU-versus-GPU debate.

Over on MilkyWay there are indeed some 'GPU is the only way folks' and they get a fair amount of pushback from the 'I'm not a gamer and won't buy AMD 48xx cards strictly to generate MilkyWay credits' folks. That debate is an interesting experiment for sociologists to look at (smile).


ID: 36863
Profile tullio

Joined: 6 Aug 04
Posts: 264
Credit: 844,488
RAC: 0
Message 36864 - Posted: 5 May 2009, 19:25:38 UTC
Last modified: 5 May 2009, 19:25:59 UTC

When SETI started, the idea was that of exploiting unused cycles. Now people are building homemade supercomputers just to run BOINC. Yes, times have changed.
Tullio
ID: 36864
old_user502006

Joined: 14 Feb 08
Posts: 1
Credit: 198
RAC: 0
Message 44414 - Posted: 15 Jun 2012, 22:53:37 UTC - in response to Message 33134.  
Last modified: 15 Jun 2012, 22:55:34 UTC

Why say CUDA math isn't good enough for science when SETI@home uses CUDA?
ID: 44414
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 6952
Credit: 20,843,205
RAC: 0
Message 44415 - Posted: 15 Jun 2012, 23:08:06 UTC - in response to Message 44414.  

SETI doesn't need or use the very wide range of values that climate models do.
A lot of the code here, if not most, is 80-bit floating point.

Anyway, the programs are from the UK Met Office, who run the originals on supercomputers.
You'll have to talk to them about creating GPU versions.
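Les's precision point can be illustrated with a quick sketch, in Python for brevity (the models themselves are Fortran, and this is not CPDN code): increments that survive in double precision can vanish entirely in a narrower format, and a similar gap exists again between ordinary double precision and the 80-bit extended precision mentioned above.

```python
import struct

def f32(x: float) -> float:
    """Round a Python float to the nearest IEEE-754 single-precision value."""
    return struct.unpack('f', struct.pack('f', x))[0]

INC, N = 1e-8, 100_000

# Single precision: each increment is below half an ulp of 1.0,
# so every addition rounds straight back to 1.0.
s32 = 1.0
for _ in range(N):
    s32 = f32(s32 + INC)

# Double precision keeps the increments: 1.0 + 100000 * 1e-8 = 1.001
s64 = 1.0
for _ in range(N):
    s64 += INC

print(s32)            # 1.0 -- the additions vanished
print(round(s64, 6))  # 1.001
```

The same accumulation effect, one precision level up, is why results computed in 80-bit extended arithmetic are hard to reproduce exactly in GPU double precision.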



Backups: Here
ID: 44415
Professor Desty Nova
Joined: 19 Sep 04
Posts: 92
Credit: 1,698,513
RAC: 1
Message 44790 - Posted: 1 Sep 2012, 10:17:54 UTC

Since Climateprediction doesn't use GPUs, they could tweak/update the server code so newer BOINC clients aren't always asking for GPU work like this:

01/09/2012 10:24:48 | climateprediction.net | Sending scheduler request: To fetch work.
01/09/2012 10:24:48 | climateprediction.net | Requesting new tasks for ATI
01/09/2012 10:24:50 | climateprediction.net | Scheduler request completed: got 0 new tasks
01/09/2012 10:24:50 | climateprediction.net | No work sent


Some of the projects I run show, under the project's Properties in the BOINC client, "Project has no apps for ATI GPU", and those don't ask for GPU work.


Professor Desty Nova
Researching Karma the Hard Way
ID: 44790
Profile [AF>Amis des Lapins] Phil1966

Joined: 10 Jul 13
Posts: 5
Credit: 39,900
RAC: 0
Message 47376 - Posted: 21 Oct 2013, 18:43:39 UTC

Very interesting topic; I've read almost all the posts.
It might be worthwhile for CPDN to "lose" 3 to 6 months recoding the project for GPUs,
as it would then run 20 times faster.
No answer needed.
It's just that 323 hours or more for one work unit looks very unattractive these days
for people investing in hardware to run BOINC only.
Sorry for my bad English.
Kind regards
Philippe
ID: 47376
Profile Dave Jackson
Volunteer moderator

Joined: 15 May 09
Posts: 2463
Credit: 3,124,201
RAC: 383
Message 47381 - Posted: 22 Oct 2013, 8:11:21 UTC - in response to Message 47376.  

I know you said, "no reply needed," but I suspect from what I have read on this and other threads it would be a lot more than 3-6 months.
ID: 47381
Profile MikeMarsUK
Volunteer moderator
Joined: 13 Jan 06
Posts: 1498
Credit: 15,613,038
RAC: 14
Message 47388 - Posted: 22 Oct 2013, 11:49:18 UTC - in response to Message 47376.  
Last modified: 22 Oct 2013, 11:52:58 UTC

...Might be worthwhile for CPDN to "lose" 3 to 6 months "recoding" the project for GPU's ....


Like Dave says, it would probably take man-decades rather than a few months. The simple models we run are about a million lines of code, and the next generation is 10 million lines of code. If they start today, perhaps they might finish on the 19th Jan 2038, but probably they won't :-)

In any case, GPUs are only suited to relatively simple, massively parallel tasks. Complex ad-hoc code is out of their scope.
I'm a volunteer and my views are my own.
News and Announcements and FAQ
ID: 47388
old_user715391

Joined: 3 Apr 14
Posts: 4
Credit: 7,511
RAC: 0
Message 48687 - Posted: 3 Apr 2014, 7:50:45 UTC

https://developer.nvidia.com/cuda-fortran
ID: 48687
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 6952
Credit: 20,843,205
RAC: 0
Message 48688 - Posted: 3 Apr 2014, 8:03:25 UTC - in response to Message 48687.  

The climate programs used were created, improved, and are owned by the UK Met Office.
They normally run on their supercomputers, are mostly floating point FORTRAN, and the source code is said to be close to a million lines long.

None of this is going to show up on fiddly little GPUs any time soon.

ID: 48688
old_user715391

Joined: 3 Apr 14
Posts: 4
Credit: 7,511
RAC: 0
Message 48689 - Posted: 3 Apr 2014, 8:18:55 UTC - in response to Message 48688.  

Did you look at my link?
ID: 48689
old_user715391

Joined: 3 Apr 14
Posts: 4
Credit: 7,511
RAC: 0
Message 48690 - Posted: 3 Apr 2014, 8:19:57 UTC - in response to Message 48688.  

Also, why does it say "GPU computing suspended"? That would imply it could be enabled. What does this mean?
ID: 48690
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 6952
Credit: 20,843,205
RAC: 0
Message 48691 - Posted: 3 Apr 2014, 8:31:34 UTC

No, I didn't look at your link. It's a waste of time bothering.

And you get messages like that because some projects do use GPUs, and others may eventually start using them, so BOINC needs to check. If it doesn't get the right reply, it asks less and less frequently, until it's down to once per day.
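The back-off behaviour described here can be sketched roughly as follows (the constants are illustrative, not the real BOINC client's): each fruitless GPU work request roughly doubles the wait, capped at about a day.

```python
def next_backoff(current_s: float, min_s: float = 60.0,
                 max_s: float = 86_400.0) -> float:
    """After an empty reply, roughly double the wait, capped at one day."""
    return min(max(current_s * 2.0, min_s), max_s)

wait = 0.0
waits = []
for _ in range(12):           # twelve consecutive "No work sent" replies
    wait = next_backoff(wait)
    waits.append(wait)

print(waits[:4])   # [60.0, 120.0, 240.0, 480.0]
print(waits[-1])   # 86400.0 -- settles at one request per day
```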

There is a project-side option that can stop this permanently, but we still haven't upgraded the server software to a version that supports it.
I think it may be possible to use an option in a cc_config file that does something similar.
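For what it's worth, later BOINC clients document an `<exclude_gpu>` option in `cc_config.xml` that excludes a project's GPU use on the client side; a sketch (the URL and element values are illustrative, and support depends on your client version):

```xml
<cc_config>
  <options>
    <!-- Illustrative only: stop this client using the ATI GPU for one project -->
    <exclude_gpu>
      <url>http://climateprediction.net/</url>
      <type>ATI</type>
    </exclude_gpu>
  </options>
</cc_config>
```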

ID: 48691
old_user715391

Joined: 3 Apr 14
Posts: 4
Credit: 7,511
RAC: 0
Message 48696 - Posted: 3 Apr 2014, 10:03:06 UTC - in response to Message 48691.  

The climate prediction model uses GPUs. And your claim that Fortran cannot use GPUs is incorrect.
ID: 48696
Profile Iain Inglis
Volunteer moderator

Joined: 16 Jan 10
Posts: 979
Credit: 3,103,609
RAC: 95
Message 48698 - Posted: 3 Apr 2014, 10:43:53 UTC - in response to Message 48696.  

That's not right. The climate prediction models from CPDN do not use the GPU. If some other project uses a GPU for climate prediction then that doesn't change the argument that Les described. The issue is not that GPUs can't support FORTRAN, it's that getting a million lines of FORTRAN to run on a GPU and produce the same numerical results is a big job.

The question of GPU processing and multi-processor implementations keeps coming up as it seems to offer a way to get results quicker, but the answers are always the same. CPDN is about ensembles of models - evangelists for new technologies really need to explain how CPDN's objectives can be better met with these technologies given CPDN's evidently limited resources.
ID: 48698
old_user714979

Joined: 28 Mar 14
Posts: 7
Credit: 47,798
RAC: 0
Message 48737 - Posted: 8 Apr 2014, 6:51:56 UTC - in response to Message 48698.  

Some people leave their PCs on just to complete work units and this may have consequences.

The Wikipedia entry for our local brown-coal-burning power station, Hazelwood, says it was listed as the least carbon-efficient power station in the OECD in a 2005 report by WWF Australia. I guess you can argue whether that is valid or not, but the point is that a work unit completed on brown-coal power probably does more damage to the environment than one completed on a green energy source. Using inefficient code means work units take longer to complete, which increases the quantity of carbon released into the atmosphere.

In real terms one PC is a drop in the ocean. It may become a perception issue when considering the carbon released by the PC base over the lifetime of a climate prediction project.

Based on the current task info for SETI@Home's BOINC project, my video card's NVIDIA GPU completes a work unit in about 17 minutes, while the Intel CPU takes just under 3 hours on a similar task.
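The footprint argument above can be made concrete with a back-of-envelope sketch (all numbers hypothetical, chosen only to match the rough shapes in this post):

```python
def task_co2_kg(power_watts: float, hours: float,
                kg_co2_per_kwh: float) -> float:
    """CO2 for one task: power draw x runtime x grid carbon intensity."""
    return power_watts / 1000.0 * hours * kg_co2_per_kwh

# Hypothetical figures: a ~300-hour CPDN-style CPU task on a machine
# drawing 100 W, vs an ~18-minute GPU task drawing 200 W, both on a
# brown-coal grid at roughly 1.4 kg CO2 per kWh.
long_task = task_co2_kg(100.0, 300.0, 1.4)
short_task = task_co2_kg(200.0, 0.3, 1.4)

print(round(long_task, 1))    # 42.0 kg
print(round(short_task, 3))   # 0.084 kg
```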

As a FORTRAN compiler with GPU extensions already exists, the obvious first step is to compile the existing FORTRAN code for the BOINC client without using any GPU extensions and see if anything breaks. Maybe the compiler will have a meltdown, maybe it will work flawlessly. No point getting excited about GPU processing if the tools are not up to the job.

Assuming no major problems with the compiler, the next step is professional development for your programming staff, with training on GPUs. At that point you might be in a position to know whether GPU processing is a reasonable option. The cost is a compiler and some days of professional development; then decide. Floating-point rounding, guard digits, etc. are significant issues that can't be assessed without knowledge of the GPUs and software.

I'll do my bit to be more efficient by not getting more climateprediction.net tasks and I'll concentrate on efficient BOINC projects that will use GPUs.
ID: 48737
Profile mo.v
Volunteer moderator
Joined: 29 Sep 04
Posts: 2363
Credit: 13,363,707
RAC: 28
Message 48738 - Posted: 8 Apr 2014, 9:10:28 UTC
Last modified: 8 Apr 2014, 9:12:02 UTC

Hi Volvo

The CPDN models come from the UK Met Office where they consist of a version of the Unified Model. CPDN then adapts these models for its own use, for example deciding on the precise parameter values for particular experiments and compiling the models for the three platforms: Windows, Linux and Mac. Further CPDN adaptations can consist of time-slicing long models so that different computers take on different sections, and they all have to be stitched together.

But they all still consist of the Unified Model which the Met Office has adapted and developed continuously for years. The Met Office has a team of developers working on this, just as at the small number of other institutions that have developed models. I've seen a list of the names of one of these teams; it filled a computer screen. I also know that these organisations employ ace programmers.

To my knowledge these organisations all run their models on CPUs, in some cases on supercomputers. For example, a supercomputer in Tokyo is used for this purpose. If using GPUs were possible for the type of calculations required for climate models I'm pretty sure that all these model programmers in several institutions would already have harnessed this possibility. They have every motivation to complete model runs as quickly as possible because similar models based on the UM are used for weather prediction, for which they also run ensembles, albeit much smaller than ours at CPDN.

CPDN has two programmers, and they do not design the UM itself, which runs on CPUs.

We are all aware that running research tasks on computers uses electricity and that we need to ensure that our computers run as efficiently as possible. One way we can reduce the carbon footprint is by ensuring that as few models as possible crash.
Cpdn news
ID: 48738
Ingleside

Joined: 5 Aug 04
Posts: 96
Credit: 11,299,871
RAC: 21
Message 48739 - Posted: 8 Apr 2014, 13:18:36 UTC - in response to Message 48737.  

Assuming no major problems with the compiler the next step is professional development of your programming staff with training on GPUs. At this point you might be in a position to know if GPU processing is a reasonable option.

Well, AFAIK all currently active climate models use SSE2 optimizations, and my guess is this means they're using double precision. The Fortran compiler linked a few posts back is CUDA, and Nvidia cards have abysmally poor double-precision speed, only 1/24 of single-precision performance unless you pay $$$$ for the professional cards, so even a top-end Nvidia GTX 780 Ti only manages 210 GFLOPS at most. A quad-core (8 threads with HT) CPU, on the other hand, is around 100 GFLOPS. Meaning even in the best case the Nvidia GPU will only be 2x faster than the CPU. In reality, even 50% efficiency on the GPU can be optimistic, meaning your "slow" CPU ends up outperforming your "fast" GPU.

So, unless most of the calculations can use single precision, a CUDA version of CPDN is a waste of development time.

Instead of CUDA, an OpenCL compiler would be more interesting, since OpenCL also works with the much faster AMD GPUs. But even with this additional speed, it's still unlikely a climate model would run faster on GPU than CPU.
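The arithmetic in this post can be checked with a quick sketch (the 1/24 ratio and GFLOPS figures are the ones quoted above, i.e. peak numbers, not measurements):

```python
def effective_dp_gflops(sp_gflops: float, dp_sp_ratio: float) -> float:
    """Peak double-precision throughput implied by a single-precision peak."""
    return sp_gflops * dp_sp_ratio

# Figures from the post: a GTX 780 Ti peaks around 5040 SP GFLOPS,
# with consumer-card double precision throttled to 1/24 of that.
gpu_dp = effective_dp_gflops(5040.0, 1.0 / 24.0)   # ~210 GFLOPS
cpu_dp = 100.0                                     # quad-core CPU estimate
speedup = gpu_dp / cpu_dp

print(round(gpu_dp))       # 210
print(round(speedup, 2))   # 2.1 -- best case, before any real-world losses
```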
ID: 48739
Profile geophi
Volunteer moderator

Joined: 7 Aug 04
Posts: 1821
Credit: 36,393,036
RAC: 10
Message 48740 - Posted: 8 Apr 2014, 15:19:03 UTC

Climate research groups are testing GPUs for climate prediction, but from what I've seen, only for part of the calculation: the radiation component of the model computations. Climate models are HUGE and very non-linear.

Given that the model is owned and changed by the UK Met Office group, and they have not been modifying it to run on GPUs, I'm not sure how two already-overworked CPDN computer people are going to do it.

As for energy efficiency, I do my part by expanding the number of models I run during the cool season, and greatly decreasing it during the warm season. During one relatively mild winter with lots of CPUs running climate models, I only had the heat come on twice. On the other hand, I have no wish to spend more money on electricity for air conditioning during our hot summers.
ID: 48740


©2019 climateprediction.net