
Dan’s constant


Following on from the Dan Pangburn post: Dan commented on it, helpfully providing links to several PDFs showing how he derived his constant.

So I took a look at them.

Dan uses the First Law of Thermodynamics.

That’s a start: Energy(in) – Energy(out) = Energy(retained).

Let’s take a look at Energy(out). A single term: X·T⁴, where X is Dan’s constant.

Dan helpfully provides a link to Wikipedia’s “A very simple model”. This shows

(1−a)S = 4εσT⁴

where

  • S is the solar constant – the incoming solar radiation per unit area – about 1367 W·m⁻²
  • a is the Earth’s average albedo, measured to be 0.3
  • σ is the Stefan–Boltzmann constant – approximately 5.67×10⁻⁸ J·K⁻⁴·m⁻²·s⁻¹
  • ε is the effective emissivity of the Earth, about 0.612

Dividing both sides by (1−a)S, we get 1 = Y·T⁴, where Y = 4εσ/((1−a)S).

Plugging the numbers into Y, we get
Y = (4 × 0.612 × 5.67×10⁻⁸)/(0.7 × 1367) K⁻⁴
= 1.45×10⁻¹⁰ K⁻⁴
(or 1×10⁻¹⁰ K⁻⁴ to 1 s.f. – we can’t justify more than one significant figure)
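A quick numerical check of that value (a minimal sketch; the numbers are just the simple-model values quoted above):

```python
# Compute Y = 4*epsilon*sigma / ((1 - a)*S) from the simple-model values
S = 1367.0        # solar constant, W m^-2
a = 0.3           # Earth's average albedo
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
epsilon = 0.612   # effective emissivity of the Earth

Y = 4 * epsilon * sigma / ((1 - a) * S)
print(Y)  # ~1.45e-10 (units of K^-4)
```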

Y can’t really be a constant, though, since 1 = Y·T⁴. If T increases then Y must decrease (and vice versa). Perhaps we should rewrite it as Yᵢ·Tᵢ⁴ = 1. But for small ΔT, Y will not change by much.

So far, so good.

Dan derives his constant in the same way, but then multiplies in an additional term (the average sunspot count).

Quoting from his pdf on page 6:

The average sunspot number since 1700 is about 50, the energy radiated from the planet is about 342*0.7 = 239.4 (for the units used) and the earth’s effective emissivity is about 0.61 (http://en.wikipedia.org/wiki/Global_climate_model). Thus, as a place to start, X should be about 50/239.4 times the Stephan-Boltzmann constant times 0.61.

50/239.4*5.67E-8 *0.61 = 7.2E-9

He then “refines” this, continuing:

With this plugged into the equation, a plausible graph is produced with a dramatic change observed to take place in about 1940. In EXCEL, 7.2E-9 was placed in a cell and the cell (value for X) called by the equation which produced a graph. The graph was observed as the value for X was varied. X was adjusted until the net energy from 1700 to about 1940 exhibited a fairly level trend. This occurs when X is 6.519E-9 (unbeknownst to me at the time, cell formatting rounded it to 6.52E-9). If an average sunspot number of 6.52/7.2*50 = 45.28 had been used, no adjustment would have been needed.

This is, of course, nonsense.

But we will follow this for now to see where it goes.

If we now multiply Y by Dan’s average sunspot count, we get
45.28Y = 45.28 × 1.45×10⁻¹⁰ K⁻⁴
≈ 6.57×10⁻⁹ K⁻⁴.

This is pretty close to Dan’s value (the small difference is probably due to the slightly different values of S and ε that I took from the simple model).

To all intents and purposes X = 45.28Y.
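Again, a quick sketch of the arithmetic (reusing the Y computed above):

```python
# Multiply Y by Dan's "average sunspot number" and compare with his constant
Y = 1.45e-10          # K^-4, from the simple model above
dans_X = 6.52e-9      # Dan's refined constant
print(45.28 * Y)      # ~6.57e-9, close to Dan's 6.52e-9
```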

Now go back to Dan’s term X·T⁴, the output energy.

Replacing X with 45.28Y, and remembering that Y·T⁴ = 1 (so that Y = T⁻⁴), we get

X·T⁴ = 45.28·T⁻⁴·T⁴.

Gosh! The temperature terms cancel, the Stefan–Boltzmann equation vanishes, and the energy we are left with is …
 
 
45.28 Sunspots ….
 
 


 
 
 
Image Credit:

Solar & Heliospheric Observatory


Dan Pangburn

Dan Pangburn has commented here several times, so I think that he deserves a post of his own. 🙂

anom(Y) = calculated temperature anomaly in year Y
N(i) = average daily Brussels International sunspot number in year i
Y = number of years that have passed since 1700 (or any other year where the net summation is approximately zero such as 1856, 1902, 1910, 1938, or 1943)
T(i) = agt (average global temperature) of year i in °K,
ESST(c,Y) = ESST (Effective Sea Surface Temperature) in year Y calculated using an ESST range (magnitude) of c
CO2(Y) = ppmv CO2 in year Y
CO2start = ppmv CO2 in 1880

Dang, his equation is just too big to fit in the image.
However, we can simplify his equation and tidy it up a bit.

In the first summation, N(i) will always be non-negative, since N(i) ≥ 0 (you cannot have negative sunspots).

It is also dimensionless. For the dimensional analysis to work, Dan has to convert this to degrees, but he doesn’t say how he does so.

In the second term, 6.52×10⁻⁹·T⁴ is quoted.
Dan does not state how this was derived, nor cite a source. Is it based on science, or has Dan just made it up?

It is vaguely reminiscent of the Stefan–Boltzmann law, but he is certainly not using the Stefan–Boltzmann constant (which is almost ten times larger than Dan’s). Besides which, the Stefan–Boltzmann equation gives a result in watts per square metre, whereas we need to derive the anomaly in kelvin, so somehow Dan needs to define the units of his constant.

Using Dan’s value at 288 K, term 2 comes to ~45. So (according to Dan) whenever the “average daily Brussels International sunspot number” is above 45 we should see warming (e.g. 2000), and a lower number should mean cooling (e.g. 2007).
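A quick check of that figure (a sketch; 288 K is just the usual round value for the mean surface temperature):

```python
# Term 2 of Dan's equation evaluated at a nominal average global temperature
X = 6.52e-9        # Dan's constant
T = 288.0          # K, nominal average global temperature
print(X * T**4)    # ~45
```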

Check any of your favourite datasets and see how many years agree with Dan’s figures. I don’t know how many you might find in agreement, since I gave up looking after seeing 2000 and 2007.

But there are more terms – perhaps we need to look at them to see if this makes more sense further along.

(Or maybe not. :))

ESST(c,Y) = ESST (Effective Sea Surface Temperature) in year Y calculated using an ESST range (magnitude) of c

I haven’t got a clue what this means, since Dan does not say how he derives it. But I don’t think it means the Sea Surface Temperature. Perhaps he meant the anomaly?

The last term does make sense – that CO2 will warm logarithmically.
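Presumably that term looks something like d·ln(CO2(Y)/CO2start). For illustration only, here is the standard simplified logarithmic forcing expression – the 5.35 W·m⁻² coefficient is the usual textbook value, and the sensitivity figure below is just a placeholder, not anything taken from Dan’s pdf:

```python
import math

def co2_warming(co2_ppmv, co2_start=290.0, sensitivity_per_wm2=0.8):
    """Rough logarithmic CO2 warming: dT = lambda * 5.35 * ln(C/C0).

    5.35 W m^-2 per ln(C/C0) is the standard simplified forcing expression;
    sensitivity_per_wm2 (K per W m^-2) is a placeholder, not Dan's 'd'.
    """
    forcing = 5.35 * math.log(co2_ppmv / co2_start)  # W m^-2
    return sensitivity_per_wm2 * forcing             # K

print(co2_warming(390.0))  # roughly 1.3 K of warming relative to ~290 ppmv
```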

Then we have Dan’s four “coefficients”: a, b, c and d.

Usually coefficients are dimensionless constants (e.g. π). I know that some engineering terms use coefficients with units, but when they do, those units are quoted. Since Dan doesn’t explain the units of the terms in his equation, there is a real problem with a, b and d when it comes to dimensional analysis (I can’t comment on c, since I don’t understand that term). If he meant that a and d were in units of K and b in K⁻¹, it might make more sense – but he didn’t say this.

But wait – Dan earlier stated that the coefficients are “to be determined” (i.e. not known).

They are not coefficients, or even constants – he selects their values according to the year (and even offers different versions for the same year).
His “coefficients” are variables! He even states that the “coefficients” are adjusted to get the best fit by R².
If I’m reading this correctly, there is no supporting science behind his coefficients. His “coefficients” are nothing more than pattern matching.
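To be clear about what that means: give yourself enough adjustable parameters and you can fit almost any wiggly series with an impressive-looking R², with no physical content at all. A toy illustration (made-up data and an arbitrary polynomial – nothing to do with Dan’s actual terms):

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up "temperature" series: a random walk, purely for illustration
years = np.arange(1900, 2011)
series = np.cumsum(rng.normal(0.0, 0.1, size=years.size))

# Fit a polynomial with plenty of free coefficients to ALL of the data
x = (years - years.mean()) / years.std()   # normalised to keep the fit well-conditioned
coeffs = np.polyfit(x, series, deg=8)
fitted = np.polyval(coeffs, x)

# R^2 will typically come out high, but the "model" has no physical content
# and no reason to have any predictive skill outside the fitted period
ss_res = np.sum((series - fitted) ** 2)
ss_tot = np.sum((series - series.mean()) ** 2)
print(1 - ss_res / ss_tot)
```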

Since the coefficients were determined using all available data, some reviewers asserted that the equation may have no predictive ability in spite of it being formulated from relevant physical
phenomena and a known law of thermodynamics.

(My emphasis)

Of course I would expect Dan, as an engineer, to understand “a known law of thermodynamics”. In fact I would expect him to know at least three of them.
Which one has he selected? It would help to know.

Dan has, however, predicted the temperature for the next 25 years or so (and, surprisingly enough, we see that it will be cooling).
He assumes that the sunspot variability over the coming years will follow the same pattern as between 1915 and 1941 – which is fair enough, since he knows that it is a guess.
If sunspots do follow that pattern, then Dan predicts a cooling of between about 0.2 K and 0.4 K (depending on his variable “coefficients”, despite the fact that almost all of his terms are unknown).

The beauty of it is that his own graph shows substantial warming between 1915 and 1941. 🙂

Shot in the foot? I think so.

Finally, Dan “shows” that the temperature has been declining between 2005 and 2011 (despite the fact that 2011 isn’t over yet).
He draws a straight line between 2005 and 2011 (using UAH).

This is just sloppy. If Dan knows how to calculate R², then he is perfectly capable of working out an OLS trend.
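For illustration, here is the difference between the two approaches (a sketch with made-up anomaly values – not the actual UAH record):

```python
import numpy as np

years = np.arange(2005, 2012)
# Hypothetical anomaly values, purely for illustration - NOT the UAH data
anoms = np.array([0.33, 0.26, 0.28, 0.05, 0.26, 0.40, 0.13])

# Dan's approach: a straight line between the two endpoints
endpoint_slope = (anoms[-1] - anoms[0]) / (years[-1] - years[0])

# An ordinary least squares trend uses all of the points
ols_slope = np.polyfit(years, anoms, 1)[0]

print(endpoint_slope, ols_slope)  # the two "trends" can differ substantially
```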

Over to you, Dan. 🙂

Image Credit:

climaterealists.com


Berkeley Earth Surface Temperature (BEST) is a team of independent scientists who have released their temperature record.
The team was led by Richard Muller, a physicist at the University of California. There were 10 contributors in total, only one of whom is a traditional climate scientist.
Saul Perlmutter, one of the team, recently won a Nobel Prize for “the discovery of the accelerating expansion of the Universe”.

The team have drafted four papers:

  1. Statistical Methods
  2. Urban Heat Island
  3. Station Quality
  4. Decadal Variations

(If you don’t want to read the full papers, they have a two-page summary.)

Having looked at considerably more than the usual climate data, they conclude that

  • The temperature records by GISS, NOAA and CRU are pretty much right (BEST shows slightly more warming than CRU & NOAA)
  • The “Urban Heat Island” is a myth, since urban areas are less than 0.5% of the land surface
  • Poor station quality is a real problem, but it does not significantly change the trends

Not exactly shattering news, then, but learning why the team decided to undertake the study is interesting.

The Economist, who broke the story, tells us

Marshalled by an astrophysicist, Richard Muller, this group, which calls itself the Berkeley Earth Surface Temperature, is notable in several ways. When embarking on the project 18 months ago, its members (including Saul Perlmutter, who won the Nobel prize for physics this month for his work on dark energy) were mostly new to climate science. And Dr Muller, for one, was mildly sceptical of its findings. This was partly, he says, because of “climategate”: the 2009 revelation of e-mails from scientists at CRU which suggested they had sometimes taken steps to disguise their adjustments of inconvenient palaeo-data. With this reputation, the Berkeley Earth team found it unusually easy to attract sponsors, including a donation of $150,000 from the Koch Foundation.

So Muller was sceptical. This is good and natural, of course. And they decided to check the existing results for themselves.

Rather than just using the datasets already available, they also included all the records that they could find (some of them covering only a short duration). In total they accumulated 1.6 billion records, about 5 times the data used by GISS, NOAA and CRU. And they had to develop a new analytical approach to incorporate fragments of records.

One caveat – the papers have been submitted (to the Journal of Geophysical Research) but have not yet been accepted. The CRU has declined to comment on the story, because the papers have not yet been through peer review. Possibly this is why RealClimate have not covered the story yet. It could still turn out to be a damp squib, but that would leave us exactly where we were before.

Certainly Watts is critical (I counted eight blog posts about BEST since the story broke), but I do not recall him showing too much concern about peer review in the past.

And Delingpole’s “Global Warming is real” is a gem:

“The planet has been warming,” says a new study of temperature records, conducted by Berkeley professor Richard Muller. I wonder what he’ll be telling us next: that night follows day? That water is wet? That great white sharks have nasty pointy teeth? That sheep go “baaaa”?

Some more sensible blogs include

And probably many more …

S2

Update: BEST was covered here before, back in April. I was busy with Eigenvectors and didn’t pay attention.

Image Credit:

BEST


Very brief comment

I have just taken my last exams, so I have got some time for the next few months.

It will take me a while to catch up with all the dross in the inbox, but once I’ve caught up I’ll put up another post.

Thanks to everyone who has continued to contribute to the blog over the last few months; hopefully that will continue. 🙂

S2

Disempowering ourselves again

“It’s unlikely that the U.S. is going to take serious action on climate change until there are observable, dramatic events, almost catastrophic in nature, that drive public opinion and drive the political process in that direction,” Stavins, director of Harvard’s Environmental Economics Program in Cambridge, Massachusetts, said today in an interview in Bloomberg’s Boston office.

Disaster Needed for U.S. to Act on Climate Change, Harvard’s Stavins Says

The argument that people will not do anything until it starts to affect them has probably been around for all of history. Certainly it is an old one with respect to climate change. The most recent iteration comes from Harvard economist Robert Stavins.

I was not able to find much response to Stavins in the climate science blogosphere, perhaps because we have been here repeatedly before. However, there were two responses which illustrate several of the false assumptions that tend to get associated with this argument:

  1. What do we mean by “affect”?

  2. “Act” or react?

  3. Why catastrophe? Why Wait?


Let’s start by noting that what is being referred to is what are known as “trigger events” in discussions of political activism. Trigger events are things that spike public awareness of a particular issue, for good or ill.



Hat tip to Magnus and Gareth

The F-bomb again, sigh. Maybe Tobis really has fundamentally altered the tone of climate science discussion? OK, they are climate scientists, there are actual facts and some legitimate political commentary in there, enjoy.



Nothing New Under the Sun

Science in the days of John Tyndall, the man who in the mid-19th century identified the greenhouse gases (the greenhouse effect itself was discovered by Joseph Fourier in 1824), certainly had to deal with Deniers.

After all, it was a period of great scientific discovery, including Darwin’s Evolution by Natural Selection. Scientific discoveries that threatened orthodoxy and ignorance.

Tyndall knew the consequences of Denial and the measure of the people who wallow in it:

“It is as fatal as it is cowardly to blink facts because they are not to our taste.” ~ John Tyndall

He also knew how much point there was to presenting them with facts and reason in the hope that they would assess the facts fairly and objectively:

“Religious feeling is as much a verity as any other part of human consciousness; and against it, on the subjective side, the waves of science beat in vain.”

So it’s no surprise that Tyndall took the time to try and help educate a broader public about science and scientific matters (“Fragments of Science for Unscientific People“). Those were simpler times, when gentlemen wrote books and gave public talks for other gentlemen. Now, with dozens of different types of media and instant global communication that can potentially reach almost any inhabitant of the planet, the art of communication has become mind-boggling.

Actually, it’s not particularly more complicated or difficult than it ever was; it’s just more incoherent and bewildering. What could and needed to be done was easier to discern then; now it is not so obvious, but the fundamentals remain the same.

In an earlier post I spoke of the need for a coherent, proactive media strategy. It is not my intent to lay one out, but rather to talk about what a media strategy is and what some of the options might be for implementation.

Further, as I stated in another earlier post: “Granted the climate science community is a loose network of a broad spectrum of individuals and groups, with occasional nodes that might be described as coalitions and the like, so I am not suggesting a unified strategy. It’s not only impractical, it’s probably impossible.”

Even so, it is possible for us to have a loose strategy that is constantly discussed and reviewed, and which many in the network implement in ways that are suited to their strengths and abilities.
