Over the last couple of days, my posts have been embarrassingly light, in both volume and content. It’s been a nice break for me, and a necessary one at times, but it leaves me feeling a bit guilty. This isn’t what most readers have come to expect from Suyts Space. Sadly, today is a day of playing catch-up, so I may be a little late with the content. But, late for Suyts Space is timely for most others. That stated, I must include this caveat: when the Marcott paper was released, I paid little attention to it. Hank is the one who brought it to the forefront of my focus. So, it could be that others out there have done work similar to Hank’s that I’m not aware of.
Even though I’m behind in my reading, posts, and other time-consuming things, I think it appropriate to take a moment and recognize…….
Right now, in terms of Marcott, I think it’s time to reach for the popcorn and watch. The hubbub of the latest hockey stick has gained the attention of many and they’re weighing in now, not the least of which is Steve McIntyre of Climate Audit.
Steve Mac has a very interesting post up. If you haven’t read it yet, you should. It’s enlightening and humorous. One of his graphics caught my eye…….
Now, where, oh where have we seen this before? Oh, yeah…….
The graph directly above is Hank’s work reconstructing the proxy data used by team Marcott. Reproduced, verified, and confirmed! I should also mention that Hank was right not to include it in his offering, More Fishing for Hockey Sticks in Marcott et al., 2013, in that even the authors state the hockey stick end of the graph (going either way) is not robust. Of course, that didn’t stop them from releasing their hockey stick graph to the public.
As it turns out, I underestimated team Marcott’s creativity. Instead of doing that tiresome act of deception by splicing unlike data to the end of a graph, why not just manipulate the data one presently has to make the distortion? Apparently, team Marcott believes it appropriate to simply move data about the timeline at their leisure. Their conscience allows this because in their minds the message is more important than the truth.
One of the many others looking into the Marcott debacle is Nick Stokes. Climate warriors will easily recognize his name. While I disagree with Nick’s views on the various climate issues, I don’t question his abilities. He’s a sharp guy, well versed in the climate issues. Here’s his attempt at reconstructing team Marcott’s graph….. minus the time travel.
Hank overlaid his work onto this……. (heavy black line)
And, lastly, there are some notes of interest in the comments of Nick’s post…….
NikFromNYC, March 16, 2013 at 10:01 PM
“Over the last century we have good thermometer records and don’t need proxy reinforcement.”
Make that the last *three* centuries:
http://s7.postimage.org/trx62nkob/2agnous.gif

Nick Stokes, March 16, 2013 at 10:11 PM
Nik,
Yes, certainly more than 150 yrs, anyway. But there’s a lot of pointless argument about 20C proxy aberrations, when no-one seriously believes they should be preferred to the thermometer record.
Continuing with another exchange……..
Carrick, March 17, 2013 at 4:21 PM
Nick thanks for updating your code.
On a related topic, it probably would have been better if Shaun Marcott had avoided making this statement:
“In 100 years, we’ve gone from the cold end of the spectrum to the warm end of the spectrum,” Marcott said. “We’ve never seen something this rapid. Even in the ice age the global temperature never changed this quickly.”
The statement may or may not be right, but you certainly can’t use Marcott’s reconstruction to reach that conclusion.

Carrick, March 17, 2013 at 4:24 PM
I should have added for people who haven’t been following that particular thread across the blogosphere, that the frequency resolution of the reconstruction prevents you from making a statement of this sort. I find it mildly ironic that Marcott used the word “spectrum” in an assertion that is shown untenable by looking at the spectrum of his reconstruction.
Nick Stokes, March 17, 2013 at 4:55 PM
Carrick,
Indeed, his recon can’t show that.
I was interested in this quote from your link:
“The same fossil-based data suggest a similar level of warming occurring in just one generation: from the 1920s to the 1940s. Actual thermometer records don’t show the rise from the 1920s to the 1940s was quite that big and Marcott said for such recent time periods it is better to use actual thermometer readings than his proxies.”
That’s a point that I’ve been making for a while. People get excited about 20C aberrations in proxies, but they shouldn’t. It doesn’t tell us anything about real temps; thermometers do that. It may tell us about inadequacies in the proxies, but in this case it just says there were too few of them.
Carrick, March 17, 2013 at 5:34 PM
I don’t really understand what you’re referring to as to which people are getting excited about which 20C aberrations. Perhaps you could link me so I would be sure I understood your comment better?
With the update to your post, it is looking more and more likely that the 20th century spike represents a glitch in their method (it’s too high of frequency to be signal related).
I highlighted these exchanges to note and reinforce a couple of points I made in one of my posts. Real thermometer readings are preferred over proxy data. Of course, this is true. Commenter Carrick, though, makes a point that is relevant when discussing the use of the real thermometer data: the frequency resolution of such data.
So, now it’s my turn to take a bow. As I’ve often stated, one shouldn’t splice unlike data onto the same graph. But, if one were to do so, one must make the data sets as alike as possible, and matching the frequency resolution is probably the most important step. The argument that we should use thermometer data when we can isn’t unreasonable. (Sort of, but that’s for another post.) Here’s my contribution toward this notion. From my dagger post, the temp data attached to the proxy data, represented with the proper frequency resolution…… (small red tick mark on the end of the graph).
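To see why matching the frequency resolution matters, here’s a minimal Python sketch. The numbers are purely illustrative assumptions on my part: annual instrumental-style data, a ~300-year proxy-scale resolution, and a simple moving average standing in for whatever low-pass filtering the proxies actually apply. The point is only that a sharp, short spike largely disappears once the high-resolution data is degraded to proxy resolution.

```python
import numpy as np

def match_resolution(temps, window_yrs):
    """Smooth an annual series with a centered moving average of
    width window_yrs, roughly what a record with that time
    resolution would show (a toy stand-in for a proper low-pass)."""
    kernel = np.ones(window_yrs) / window_yrs
    return np.convolve(temps, kernel, mode="same")

# Synthetic example: a flat series with a sharp 1-degree, 50-year spike
years = np.arange(1000, 2001)
temps = np.zeros(years.size)
temps[(years >= 1900) & (years < 1950)] = 1.0

# Smoothing to ~300-year resolution flattens the spike dramatically
smoothed = match_resolution(temps, 300)
print(smoothed.max())  # the 1-degree spike survives only as ~0.17 degrees
```

A 50-year excursion averaged over a 300-year window can contribute at most 50/300 of its amplitude, which is why splicing unsmoothed thermometer data onto the end of a smoothed reconstruction manufactures an apples-to-oranges spike.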
Maybe, this little break I’ve taken has given the rest of the world time to catch up with the people here at Suyts Space.
Carrick also makes a point about the utterly unredeemable dishonesty of Marcott. Marcott himself may be redeemed, through Christ, but his dishonesty cannot be. That, again, is for another post, but it should be noted.
Well done!
“Carrick, March 17, 2013 at 5:34 PM
I don’t really understand what you’re referring to as to which people are getting excited about which 20C aberrations. Perhaps you could link me so I would be sure I understood your comment better?”
Which people get excited? Well, co-author Shakun certainly. Video interview with Revkin.
http://dotearth.blogs.nytimes.com/2013/03/07/scientists-find-an-abrupt-warm-jog-after-a-very-long-cooling/#more-48664
(That Shakun guy is a SCIENTIST? ROTFLMAO.)
Holy crap!!! “Boom! Outside the elevator”? His elevator doesn’t go all the way up!!! Like you said…. scientist? Bwahahahaha!!!!
Dirk posted this earlier……we need to pay attention to the pea
DirkH says:
March 17, 2013 at 6:25 am
It turns out that Shakun and Marcott are NOT the innocent nincompoops I thought. Rather, they are specialists in rewriting history. Remember that CO2 tracks temperature? Well, but what if you can “re-date” proxies? Then we could make it look like CO2 comes before temperature and ALL IS WELL IN THE LAND OF THE WARMIST.
http://climateaudit.org/2013/03/16/the-marcott-shakun-dating-service/#comment-405373
==========
This also explains why “re-dating” must be as large as 1,000 years; it must be big enough to encompass the 800 years lag between temperature and CO2 that non-redating studies find.
Right, but what can one say to a person who thinks time and events are subjective matters? Truth isn’t relevant to these people.
There is inherent uncertainty in proxy dating. So it is a matter of degree.
Well, yes, I’d say about 180 degrees from reality. 😀
Yes, there are a number of implications that come from their re-dating the proxies. Not the least of which is to prove that time travel is possible for proxies. Just put them in the Marcott and Shakun time machine and voilà! They reappear 1,000 years later.
This whole business of perturbation needs to be looked at. All it does is play with the error of the means. The mean of a sample is the value most likely to represent reality, and the larger the number of samples, the more representative that mean becomes. When you shift (perturb) any measurement toward either end of its error region, you’re pushing the data into a less likely scenario. You’re increasing the overall error of your results and producing an outcome that is less likely to represent reality. In this case, Marcott et al. did just that.
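To illustrate what date perturbation does to a record, here’s a toy Python sketch. This is emphatically not the authors’ actual code; the dating uncertainty (300 years), the grid spacing, and the signal shape are all assumptions of mine, chosen only to show the mechanism: randomly shifting a record within its dating error and averaging many realizations smears any sharp feature flat.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy proxy: zero everywhere except a sharp warm excursion near year 0
grid = np.arange(-1000, 1001, 20)          # 20-year time grid
signal = np.where(np.abs(grid) <= 50, 1.0, 0.0)

def perturbed_stack(n_realizations, date_sigma):
    """Shift the whole record by a random dating error each realization,
    then average the stack - the Monte Carlo date-perturbation idea."""
    stack = np.zeros_like(signal)
    for _ in range(n_realizations):
        shift = int(round(rng.normal(0, date_sigma) / 20))  # in grid steps
        # np.roll wraps at the edges; harmless here, far from the feature
        stack += np.roll(signal, shift)
    return stack / n_realizations

mean = perturbed_stack(2000, date_sigma=300)
print(signal.max(), mean.max())  # the averaged spike is far below 1.0
```

The total “heat” in the record is conserved (each shifted copy has the same sum), but the peak amplitude collapses, which is one reason a perturbed-and-stacked reconstruction cannot resolve a century-scale spike either way.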
Why anyone would do that baffles me. But I understand why they do it: in multiple noisy datasets, it allows you to find just about any pattern you want.
..and that’s the bottom line
From the laws of chemistry: carbon dioxide is highly soluble in water, and cold water holds more of it than hot water (Henry’s law of solvation). The oceans are not just water; they’re lots of cold, salty water (excepting the surface that gets solar heating), so the oceans have an essentially infinite buffering capacity, particularly when calcium, magnesium, and other divalent-cation carbonates can precipitate out. (Buffering alters Henry’s law, by the way; strictly, the law covers solvation only, but it can be modified to take buffering and other chemical reactions into account.)
Atmospheric carbon dioxide partial pressure must therefore follow, first, ocean water temperatures and then, to a lesser extent, air temperatures. (Assuming biological and other inorganic sources and sinks net to zero.)
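The temperature dependence of Henry’s law can be sketched numerically. This is a simplified illustration for pure water only (no salinity or buffering corrections), using typical literature values: a Henry’s law solubility for CO2 of about 0.034 mol/(L·atm) at 25 °C and a van ’t Hoff temperature coefficient of about 2400 K.

```python
import math

def co2_solubility(temp_c, kh_ref=0.034, c_vanthoff=2400.0):
    """Henry's-law solubility of CO2 in pure water, in mol/(L*atm),
    using the van 't Hoff temperature correction:
        kH(T) = kH(298.15 K) * exp(C * (1/T - 1/298.15))
    Reference values are typical literature numbers, not exact."""
    t_kelvin = temp_c + 273.15
    return kh_ref * math.exp(c_vanthoff * (1.0 / t_kelvin - 1.0 / 298.15))

# Cold water dissolves markedly more CO2 than warm water:
print(co2_solubility(2.0))   # near deep-ocean temperature
print(co2_solubility(25.0))  # warm surface water
```

At 2 °C the computed solubility comes out roughly double the 25 °C value, which is the basic reason ocean temperature leads atmospheric CO2 partial pressure in this picture.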
I’ll be out of pocket for a few hours. BBL