Gavin Schmidt, supermodeller: the emergent patterns of climate change

by Gareth on May 2, 2014

In this new TED talk, Gavin Schmidt, NASA climate modeller and juggler extraordinaire, talks about the climate system, how we use models, how they’re put together, and how the great swirls of Earth’s atmosphere emerge from a million lines of Fortran code. It’s a great exposition, and the graphics he calls up in support are magnificent.


Bob Bingham May 2, 2014 at 4:44 pm

Very very good. So why is the blogosphere full of deniers?

andyS May 2, 2014 at 5:32 pm

Good question. I would have thought one million lines of Fortran would convince anyone.

SimonP May 3, 2014 at 9:06 pm

That was actually quite funny. Legacy Fortran is often really poorly coded. Regardless of how well it is written, debugged, and documented, there are bound to be coding errors in a million lines of Fortran. The question is whether they give unintended results that no-one picks up on. It would be a nightmare to maintain; Fortran does (did?) not have concepts like encapsulation and abstraction to reduce unintended side-effects.

andyS May 4, 2014 at 9:40 am

Fortran had the most evil of programming constructs, the computed GOTO.

However, it was and is very good at array processing, hence its use in GCM simulations and seismic data processing.

Back in the day, we used Fortran with a dedicated array-processing hardware module.

The reason NASA and others still use it is probably because there isn’t a good business case to rewrite a million lines of code in another language. Today’s new thing is tomorrow’s legacy app.

Macro May 4, 2014 at 2:01 pm

Yes, “GOTO” as you say – evil!
Actually the reason it is still used is that, being pretty close to “assembler” (now that was evil), it is still the fastest floating-point language around.
“Predicting weather and predicting climate are actually quite separate beasts. Sure, they both use stonking big computers and telephone-book-thick reams of Fortran code (yes, Fortran, for those sniggering up the back – it’s still the fastest floating point arithmetic language). But the divergence is in how the forecasts are made.”
http://theconversation.com/a-chaotic-beast-probably-wacky-weather-and-climate-forecasting-5182

Thomas May 4, 2014 at 4:04 pm

On Assembler: When I started our company in ’84, writing for the first-generation Apple Macs, I wrote a whole text-processor drop-in for our Desktop Publishing app in Assembler. Anything written in a high-level language was simply agonizingly slow….
I still remember dreaming in Hex code back then. Oh the good old times… ;-)

Macro May 4, 2014 at 4:19 pm

You evil person you! :)
Wasn’t MacWrite good enough? lol I think I still have a 3.5in floppy with that on someplace…

Thomas May 4, 2014 at 9:44 pm

Yep! We squeezed our “RagTime”, a Desktop Publishing application with built-in spreadsheets, graphics, a calculating text processor (spreadsheet expressions with their results floating in the text flow – something MSFT has not done to date), and even an SQL DB interface for catalog publishing onto a single 3.5″ floppy back in the ’80s and early ’90s… :-)

Thomas May 3, 2014 at 8:40 am

Good question!

My take on it is this: There are people who early in the “game” declared their position (one of denial) publicly on blogs and elsewhere, at a time when they had not actually considered the science carefully. But people find it incredibly hard to admit to being wrong. Instead they double down again and again, and dig themselves deeper and deeper into the doo-doo, as this is the only way they see or know how to proceed. We have prime exemplars spamming this blog all the time…

Then there are people who realize quite clearly that admitting to the severity and the cause of AGW would immediately lay blame at the door of an out-of-control exponential growth system called “libertarian Capitalism”. It is the cancer that is eating our future. However, these people are totally married to the paradigms of this system, and a global threat that outs the cancer for what it is must be denied, because the cancer diagnosis is inconceivable to them. They equate alternative suggestions of how society should govern itself to navigate round the abyss with some form of communism, which must be defeated at all cost.

In the end, both these dispositions frequently come together in the same individuals. They will deny the science and weasel around, just like the young-Earth bible sort will weasel around any discussion of Evolution or the Paleo-Sciences with the most outrageous never-ending nonsense. And it is also quite often the case, of course, that people trained in science denial by their religious fantasies are well placed to deny climate science, and often do so too.

Many Humans are so silly!

nigelj May 3, 2014 at 12:11 pm

Reasons for climate denial? I agree with Thomas some people dig themselves into positions.

You also have the fossil fuel lobby who are quite powerful behind the scenes.

You also have people who are very capitalist and see government regulation as a threat. I feel this tendency can go too far.

Another reason may be more interesting. Apparently conservatives are much more likely to be climate sceptics. Conservatism and liberalism are believed to be innate traits people are born with, and both clearly have value. However if you are conservative, you may be uncomfortable about change, so climate change and changes to lifestyles or fossil fuel use would be seen as a major threat best denied.

John Mashey May 3, 2014 at 3:07 pm

“Apparently conservatives are much more likely to be climate sceptics”

1) The IPCC started when Reagan / George H.W. Bush were Presidents, and about that time, climate science in US wasn’t particularly politicized. That split was manufactured over the last 25 years … and I know folks affiliated with the Hoover Institution (not exactly a left-leaning place) who are quite concerned about climate change.

2) “skeptics”: See Pseudoskeptics Are Not Skeptics. Jo Nova claims to own the word “skeptic,” but I don’t think so.

3) But Gavin was off on one thing (well, really an in-joke that I suspect the audience missed):
Fortran, not “newfangled languages like C.” C is 40+ years old. :-)

Macro May 3, 2014 at 3:23 pm

Yes it is! But then Fortran is even older! I started my programming on Fortran IV in the 1960s, on an IBM 360 with punch cards :) The task was to see which milking shed was the more efficient, herringbone or circular.

John Mashey May 3, 2014 at 5:45 pm

Yes, Fortran dates to the 1950s, so nearly 20 years older than C.
If you’re ever in San Francisco area, at the Computer History Museum in Mountain View, we still have working IBM 026 keypunches, in a room with working IBM 1401s and all their gear.

Computers have come a long way since then, as Gavin’s models would not get far on those, or even bigger S/360s. We had a fairly large one, 1MB of fast memory, 8MB of Large Core Storage.

Although I learned C in 1973, Fortran was still a good thing when I was designing supercomputers and helping sell them to NASA, NCAR, etc … where they spent most of their cycles running Fortran.

Macro May 3, 2014 at 7:44 pm

Ours was big by NZ standards – the NZ Dairy Board as it was then (now Fonterra) held every cow and bull in NZ on tape: each cow’s milk production – butterfat, carotene, etc etc. Selective breeding from the top producers and bulls led to increasing milk production. We in the research branch only had the odd hour or two to run our small programmes two or three times a week.

andyS May 3, 2014 at 8:07 pm

The computed GOTO in Fortran was evil though

John C May 4, 2014 at 10:32 pm

Quick question if someone doesn’t mind. I was reading something by a certain Judith Curry and she was talking about sea level rise slowing in recent years (from 3.5mm to 2.5mm/yr by memory). It has been well publicised that increased heat is being trapped in the deep ocean. I would think the resulting thermal expansion would lead to sea level rise increasing at a greater rate rather than slowing down?

I can’t find an explanation online and it doesn’t make sense to me. Any thoughts most welcome.

Bob Bingham May 4, 2014 at 10:40 pm

If she said it she got it the wrong way round. It was 1.7 mm a year until 1990 and is now 3.16 mm a year according to NASA. http://climate.nasa.gov/
http://www.climateoutcome.kiwi.nz/sea-level.html

Rob Painting May 5, 2014 at 12:05 am

Hopefully Gareth will not tolerate this off-topic trolling by John C.

Gareth May 5, 2014 at 9:38 am

I will not. No more, John C.

bill May 5, 2014 at 11:55 am

Oh, I don’t know, I was sort-of wondering where Mr. ‘I’m Not Denying the Science, Me!’ was going to go with this. I understand that ‘the seas aren’t rising’ is a currently active meme in the ‘Quantum Climate Denial’ community…

Thomas May 5, 2014 at 6:52 am

A space to watch:

Climate change is clear and present danger, says landmark US report
National Climate Assessment, to be launched at White House on Tuesday, says effects of climate change are now being felt.

Climate change has moved from distant threat to present-day danger and no American will be left unscathed, according to a landmark report due to be unveiled on Tuesday.

http://www.theguardian.com/environment/2014/may/04/climate-change-present-us-national-assessment

John Mashey May 5, 2014 at 7:00 am

Sigh. As often happens when climate and computing intersect, people say all sorts of things.

IMPORTANT
Of course models are large, but physics models need to follow conservation laws that tend to keep bugs from propagating too much.
See Gavin’s excellent FAQs on models.

If people reject the possibility that large Fortran codes can be “good enough”, they should never ride in cars designed from the 1990s onward, fly in new airplanes, cross newer bridges, go up in newer skyscrapers, use drugs designed with computers, or in fact use anything designed with software from these folks. Also, they can reject the idea that the design of Black Magic had anything to do with computers.

I took a quick look at Gavin’s GISS code a few years back; it’s Fortran 90, not Fortran II or IV or even Fortran 77, and it looked decent to me.

For people who would like a reasonably general introduction, at the Scientific American level, I’d strongly recommend a 1993 (still relevant, and beautiful) book, Supercomputing and the Transformation of Science, which can be gotten for not much more than the cost of shipping. It’s well done, and one of the authors, Larry Smarr, is a distinguished supercomputing expert … and, even better, bought a lot of gear from us at SGI. :-)

MINOR, BUT SILLY, AND ONLY FOR COMPUTER FOLKS:
There was nothing particularly evil about Fortran’s computed GOTO, a useful construct whose near-equivalents are found in most algorithmic languages (example: “switch” in C and its derivatives); it has been deprecated/deleted in more recent Fortran standards, which include SELECT CASE.
The evil one was the ASSIGNED GOTO, an absolute nightmare for optimizing compilers, which cannot be sure of the branch target at the GOTO itself. That was obsolescent in Fortran 90 and deleted from Fortran 95.

andyS May 5, 2014 at 9:28 am

I was thinking that the computed GOTO was something along the lines of the old Basic construct GOTO i*100, where i is a variable. I checked on this last night and you are correct, John, that the Fortran computed GOTO is more akin to the C switch statement. Nevertheless, the potential for evilness is still there,
http://en.wikipedia.org/wiki/Considered_harmful
as per Dijkstra’s famous letter on the general GOTO statement, and in this cartoon
http://xkcd.com/292/

I was not implying that Gavin’s code was bad, or wrong, as a result, by the way. Developers like to knock their own products more than most, as exemplified by sites such as “The Daily WTF” and “Coding Horror”

Bob Bingham May 5, 2014 at 8:06 am

It’s a repeating theme of deniers that the models are not right, and they are only models, and they were predicting we should all be dead by now, or underwater, or anything. Like most denier themes, they are not interested in the truth, but if you keep repeating the same story it becomes an accepted urban myth.

andyS May 5, 2014 at 12:27 pm

So you think the models are right?

Rob Painting May 5, 2014 at 7:48 pm

Watch the video.

bill May 5, 2014 at 7:55 pm

Remember how andy ‘missed the biblical references’ in a post entitled ‘Postcards from la la land: David Archibald and the four horsemen of the cooling apocalypse’ which contained as its first quote from the said Mr. Archibald:

The Four Horsemen of the Apocalypse come from the Book of Revelation, the last chapter of the Bible. The Book of Revelation also warns of another beast with these words:

…and rather went on from there?

I don’t think andy pays all that much attention to the OP, frankly.

We think the models have skill, andy.

andyS May 5, 2014 at 8:15 pm

In this case, I have done my homework

Gavin says at 6:35 in the presentation:

“Models are not right or wrong. They are always wrong, they are approximations..”

and then goes on to discuss model “skill”

In the RealClimate FAQ that John Mashey linked to upthread, Gavin asks the following questions:

What is being done to address the considerable uncertainty associated with cloud and aerosol forcings?

and

Are clouds included in models? How are they parameterised?
Models do indeed include clouds, and do allow changes in clouds as a response to forcings. There are certainly questions about how realistic those clouds are and whether they have the right sensitivity

From my limited understanding of the shortcomings of climate models, aerosols and clouds are some of the biggest uncertainties.

This is not to say that models are wrong or a waste of time

However, the uncertainties expressed in the FAQ weren’t really expressed in the TED talk, perhaps because TED is aimed at a non-technical audience

bill May 6, 2014 at 2:45 pm

This is not to say that models are wrong or a waste of time

…but I’m going to carry on as if that were the case.

Only a fool would assume that models that have performed well in hindcasting can be dismissed on the basis of their political implications – and the gotcha! revelation that the future hasn’t happened yet (well, duh!) – when forecasting.

Which is where you come in…

andyS May 6, 2014 at 2:52 pm

Does hindcasting validate the models?

Rob Painting May 6, 2014 at 5:51 pm

Yes. See my post at Skeptical Science:
Climate Models Show Remarkable Agreement with Recent Surface Warming. Note who the author of that 2014 paper is.

Rob Painting May 6, 2014 at 5:54 pm

I notice that you, based on your comments at Slater’s blog, seem to have no qualms about the modelling involved in the satellite-based temperature time series.

andyS May 6, 2014 at 5:57 pm

Rob, I am unaware of the modelling issues in RSS, as I pointed out in my reply to you at WO

My only real comment about models was the “it’s models all the way” which was a reference to the Potsdam paper.

The models may or may not be correct, but they are, nonetheless, just models, and should be treated with the same caution as any other model that we use to approximate a real-world entity.

bill May 6, 2014 at 5:59 pm

Oh, this silly game. Why don’t you tell us, andy?

andyS May 7, 2014 at 11:50 am

I would have thought that historical data is used to calibrate the models, so hindcasting seems to be just validating that they fit the historical data.

However, it seems that hindcasting doesn’t work that well across the board.

e.g

The Atlantic Multidecadal Oscillation (AMO) index is predicted in most of the models with significant skill, while the Pacific Decadal Oscillation (PDO) index shows relatively low predictive skill. The multi-model ensemble has in general better-forecast quality than the single-model systems for global mean surface temperature, AMO and PDO.

http://judithcurry.com/2012/05/17/cmip5-decadal-hindcasts/

Thomas May 7, 2014 at 7:31 pm

Yes, models, while pretty reliable generally, clearly have some limitations, as reality can sometimes track at the limit of the envelope sketched by the models, not at the center….

http://www.wunderground.com/climate/facts/models_are_reliable.asp

Surprised?

But why on Earth do Deniers always believe that the limitations of models will work in their favor, i.e. over-predicting the effects?

John C May 5, 2014 at 1:34 pm

My fault, I realise it is not relevant to this thread. I was not having much luck myself so just thought I would ask here. Will keep it on topic in future, no problem.

[Snipped. You were warned. GR]

Bob Bingham May 5, 2014 at 1:53 pm

If you look at the NASA sea level graph there was a big dip in 2010/11, when it rained like hell in Pakistan and Australia and it took at least a year for the water to get to the sea. Like the temperature, it is not an even progression.

bill May 5, 2014 at 2:44 pm

I’m curious – would I be correct in wagering I’d have been able to claim a QED on the snipped material? ;-)

Rob Painting May 5, 2014 at 7:49 pm

Yes, you were right. If only predicting the lottery were so easy.

bill May 5, 2014 at 8:13 pm

Woohoo! Psychic or what!? Reckon I may yet be able to claim James Randi’s $million… ;-)

Rob Painting May 5, 2014 at 2:16 pm

My only quibble with the talk is Gavin’s repeated use of ‘orders of magnitude’. The general public doesn’t know what that means. Other than that it’s excellent.

John Mashey May 5, 2014 at 2:45 pm

For fun, try Powers of Ten.

I think a TED audience would get it, given the visuals.

bill May 5, 2014 at 2:49 pm

A quick insert of ‘meaning that each jump in the scale gets larger (or longer) by a factor of ten’ wouldn’t go astray, I agree. Though then people often don’t grasp what a vast transition just a handful of such jumps can represent, but that is, at least, explained anecdotally in the presentation.

bill May 5, 2014 at 2:51 pm

Ah, snap with John M!

nigelj May 6, 2014 at 9:07 am

On this sea level rise issue: Judith Curry claims sea level rise has decreased from 3.5mm to 2.5mm per year. Probably true if you look at just the last 5 years, as there was a big drop in sea level some years ago due to heavy rains, but since then rates have jumped back up, and an average for the period could be about 2.5mm.

So what does all this mean? Typical sceptic picking a short period that is coloured by natural variability and too short to conclude anything. They do it with temperatures and arctic ice, so why not try the same things with sea level rise.

Rob Painting May 6, 2014 at 2:44 pm

Still off-topic. Try to not let the anti-science drongos ruin every comment thread.

Bob Bingham May 7, 2014 at 4:19 pm

The big issue with sea level rise is that the models are not good at predicting sudden collapse. They can work out the melt rate of a glacier, just as anyone can with a big block of ice, but Greenland is riddled with holes like a rotten cheese and when the holes join up it collapses. The scientists who work there believe that the situation is dire and much worse than the IPCC can prove but predicting it is rather difficult. A lot hangs on the answer as millions of people live within one metre of sea level. We can offer a new home to the people of Tokyo, Shanghai, Bangladesh, Florida and of course Kiribati.

Thomas May 7, 2014 at 7:47 pm

A sobering consideration:

a mere 1% loss of ice from these three sources (Greenland, East Antarctic, West Antarctic ice sheets) would produce a likely increase in sea levels of around 83cm – from these ice formations alone….

The best fitting trend finds that Greenland ice loss is accelerating at a rate of 30 Gigatonnes/yr². Greenland’s mass loss doubled over the period 2002–2009.

http://www.skepticalscience.com/stable-greenland-ice-sheet-basic.htm

and

http://www.skepticalscience.com/stable-greenland-ice-sheet-intermediate.htm
