wayneL
Tony Abbott's climate claims debunked: researcher dissects 2013 statement
Sophie Lewis was so annoyed about the way science was ignored in the political debate about climate change she went to work to disprove the myths
Saturday 23 January 2016 08.25 AEDT
Climate scientists are regularly infuriated by the things politicians say. But it’s not often they publish a scientific paper tearing a politician’s comments to shreds.
Sophie Lewis, from the Australian Research Council's Centre of Excellence for Climate System Science, has done exactly that, dissecting statements about climate records made by the former prime minister Tony Abbott in 2013.
Last week, temperature figures showed 2015 was officially the hottest year on record. Before that, 2014 was the hottest year on record. And scientists are expecting 2016 to once again win the dubious honour.
Heat records are being broken with wild abandon. Last year, 10 months broke temperature records.
Climate scientists say a rise in the average temperature caused by greenhouse gas emissions makes extreme heat records more likely.
In 2013, the UN’s top climate official, Christiana Figueres, linked bushfires in Australia to climate change. Abbott called such claims “complete hogwash” and said drawing links between broken records and climate change was a sign of desperation.
He went on: “The thing is that at some point in the future, every record will be broken, but that doesn’t prove anything about climate change. It just proves that the longer the period of time, the more possibility of extreme events.”
Superficially it seems to make sense: if you wait long enough, you’re bound to see records fall. Lewis suspected many people shared Abbott’s interpretation, and set out to show it was wrong.
Lewis says she was frustrated by the gap she saw between what the science showed and what some politicians said was happening.
basilio, Jesus! Every graph, every data series has been tortured into submission by vested interests.
Did you not realize that was the game? Remember the embarrassing 97% consensus which, like, wasn't?
Though slightly more subtle, you've done the exact same thing as the blog writer.
Poisonous.
No Wayne. It's not a game. When almost every scientist studying climate science uses similar figures to show we are in deep trouble we need to listen.
When rogues and charlatans cherry-pick information to mislead, they are dishonest. The reason for what is called "peer reviewed" science is for fellow scientists to closely scrutinise each other's work and attempt to keep the scientific process honest and accountable. When people attempt to pass off doctored or dishonest data as representations of reality, it's just wrong.
You know; like doctored accounts, fake witness statements, dishonest CVs.
https://tamino.wordpress.com/2011/07/06/aligning-station-records/
Aligning Station Records - July 6, 2011 - tamino.wordpress.com
As some of you know, I devised a method for aligning temperature data records which I believe is better than the “reference station method” used by NASA GISS.
However, the difference is small and it doesn’t change the overall global result when small regions are averaged, then those regional results are area-weight-averaged to produce a global estimate. It’s an interesting, and possibly useful, refinement which doesn’t change the overall final answer...
...The temperature estimate for a specific location is the weighted average of nearby stations, with closer stations given greater weight by the weighting function. I ignored the weighting function altogether so that all stations in a given region can be equally weighted to compute a regional average.
My modification is therefore specifically tailored to produce a local estimate, whereas the Berkeley method is designed to include everything in one fell swoop and produce a global estimate...
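The alignment-then-equal-weighting idea described above can be sketched in a few lines. This is a toy illustration, not the blog author's actual code, and the station readings are made up: each station's own mean is subtracted so that fixed offsets (elevation, siting) cancel, and the aligned series are then averaged with equal weight.

```python
import numpy as np

# Hypothetical toy data: rows = months, columns = stations, NaN = missing.
temps = np.array([
    [10.0, 12.1, np.nan],
    [10.4, 12.5, 14.9],
    [10.9, 13.0, 15.5],
    [11.2, 13.4, 15.8],
])

# Align: remove each station's own mean so absolute offsets
# (elevation, siting, etc.) drop out of the comparison.
offsets = np.nanmean(temps, axis=0)
aligned = temps - offsets

# Equal-weight regional average: mean across stations, ignoring gaps.
regional = np.nanmean(aligned, axis=1)
print(regional)
```

Note that the cooler third station raises no problem once aligned: only its shape contributes to the regional average, not its absolute level.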
New York: A winter storm has dumped nearly 58cm of snow on Washington, DC, before moving on to Philadelphia and New York, paralysing road, rail and airline travel along the US east coast.
At least 10 states declared weather emergencies on Saturday, aiming to get a handle on highways made impassable by the drifting snow and to shore up coastal areas where the blizzard conditions raised the danger of flooding.
The worst appeared to be over for Washington, although moderate snow was expected to keep falling until late Saturday, with the deepest accumulation of 58 centimetres recorded in Poolesville, Maryland, north of the nation's capital.
"Records are getting close - we're getting into the top five storms," Gallina said.
The combined average temperature over global land and ocean surfaces for July 2015 was the highest for July in the 136-year period of record, at 0.81 °C (1.46 °F) above the 20th century average of 15.8 °C (60.4 °F), surpassing the previous record set in 1998 by 0.08 °C (0.14 °F). As July is climatologically the warmest month of the year globally, this monthly global temperature of 16.61 °C (61.86 °F) was also the highest among all 1627 months in the record that began in January 1880. The July temperature is currently increasing at an average rate of 0.65 °C (1.17 °F) per century.
Using reference values computed on smaller [more local] scales over the same time period establishes a baseline from which anomalies are calculated. This effectively normalizes the data so they can be compared and combined to more accurately represent temperature patterns with respect to what is normal for different places within a region.
For these reasons, large-area summaries incorporate anomalies, not the temperature itself. Anomalies more accurately describe climate variability over larger areas than absolute temperatures do, and they give a frame of reference that allows more meaningful comparisons between locations and more accurate calculations of temperature trends.
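The baseline-and-anomaly step described above amounts to a one-line subtraction. A minimal sketch, with invented station figures and a short stand-in for a 30-year reference period:

```python
import numpy as np

# Hypothetical July temperatures (°C) at one station, ten years.
july = np.array([21.4, 21.9, 21.2, 22.0, 22.3, 21.8, 22.6, 22.9, 22.1, 23.0])

# Baseline: the station's own mean over a reference period (here, the
# first five years stand in for a 30-year normal).
baseline = july[:5].mean()

# Anomalies: departures from what is "normal" for this place, so they
# can be compared or averaged with anomalies from very different places.
anomalies = july - baseline
print(anomalies.round(2))
```

By construction the anomalies average to zero over the reference period, so what survives in a large-area average is the departure from normal, not the local climate itself.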
http://arstechnica.com/science/2016...ated-the-truth-about-global-temperature-data/
Thorough, not thoroughly fabricated: The truth about global temperature data
How thermometer and satellite data is adjusted and why it must be done.
by Scott K. Johnson - Jan 22, 2016 2:30am AEDT
“In June, NOAA employees altered temperature data to get politically correct results.”
At least, that's what Congressman Lamar Smith (R-Tex.) alleged in a Washington Post letter to the editor last November. The op-ed was part of Smith's months-long campaign against NOAA climate scientists. Specifically, Smith was unhappy after an update to NOAA’s global surface temperature dataset slightly increased the short-term warming trend since 1998. And being a man of action, Smith proceeded to give an anti-climate change stump speech at the Heartland Institute conference, request access to NOAA's data (which was already publicly available), and subpoena NOAA scientists for their e-mails.
Smith isn't the only politician who questions NOAA's results and integrity. During a recent hearing of the Senate Subcommittee on Space, Science, and Competitiveness, Senator Ted Cruz (R-Tex.) leveled similar accusations against the entire scientific endeavor of tracking Earth’s temperature.
“I would note if you systematically add, adjust the numbers upwards for more recent temperatures, wouldn’t that, by definition, produce a dataset that proves your global warming theory is correct? And the more you add, the more warming you can find, and you don’t have to actually bother looking at what the thermometer says, you just add whatever number you want.”
There are entire blogs dedicated to uncovering the conspiracy to alter the globe's temperature. The premise is as follows—through supposed “adjustments,” nefarious scientists manipulate raw temperature measurements to create (or at least inflate) the warming trend. People who subscribe to such theories argue that the raw data is the true measurement; they treat the term “adjusted” like a synonym for “fudged.”
Peter Thorne, a scientist at Maynooth University in Ireland who has worked with all sorts of global temperature datasets over his career, disagrees. “Find me a scientist who’s involved in making measurements who says the original measurements are perfect, as are. It doesn’t exist,” he told Ars. “It’s beyond a doubt that we have to—have to—do some analysis. We can’t just take the data as a given.”
Speaking of data, the latest datasets are in and 2015 is (as expected) officially the hottest year on record. It's the first year to hit 1 °C above levels of the late 1800s. And to head off the inevitable backlash that news will receive (*spoiler alert*), using all the raw data without performing any analysis would actually produce the appearance of more warming since the start of records in the late 1800s.
With all the heat around the adjusting of weather information, I thought the following analysis might explain the how and why of the situation more clearly. It offers excellent, detailed examples of many adjustments to data, why they were done, and the impact they have on final results.
http://arstechnica.com/science/2016...ated-the-truth-about-global-temperature-data/
This problem is a terrifically tough nut to crack, and updates to techniques have routinely produced significantly different looking datasets—particularly for the UAH group. Its initial version actually showed a cooling trend through the mid-1990s. After a 1998 paper found that the gradual lowering of the satellites’ orbits was introducing a false cooling influence, UAH’s revisions accidentally broke its time-of-day correction, creating a new source of false cooling. That wasn’t figured out until Mears and a colleague published a 2005 paper. And just last year, another paper laid out evidence that insufficient corrections are still having a cooling influence on the data.
With that said, both satellite records do show slightly smaller warming trends for the troposphere than our surface records show, which is unexpected. “If you include the uncertainty analysis,” Mears explained, “I think that the data aren’t really good enough to say that it either is or isn’t following what you expect.”
“Some of the interannual wiggles are bigger in RSS, and since 1998 or something like that, we’re showing less [warming] than the surface datasets. I suspect that’s at least partly due to a problem in our dataset, probably having to do with the [time-of-day] correction. It could be an error in the surface datasets, but the evidence suggests that they’re more reliable than the satellite datasets,” Mears said.
NOAA, NASA, the UK Met Office, and the Japanese Meteorological Agency all produce surface temperature datasets using independent methods, but there is also some overlap in data sources or techniques. So motivated by skepticism of human-caused global warming, University of California, Berkeley physicist Richard Muller organized a well-publicized project to create his own dataset, built from the ground up. This “Berkeley Earth” team chose to handle homogenization a little differently.
Rather than make adjustments, they simply split records containing sudden jumps into multiple records. Records that gradually drifted away from neighbors were just given low weights in the final averaging. But despite a number of methodological differences and a larger database of stations, their results looked just like everybody else’s.
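The "split instead of adjust" idea can be sketched simply. This is an illustrative toy, not Berkeley Earth's actual algorithm: a series with an artificial station-move jump is cut at the largest step, and each piece is then treated as its own record.

```python
import numpy as np

# Hypothetical station series with an artificial +1.5 °C jump at index 6
# (e.g. a station move), on top of a gentle warming trend.
t = np.arange(12)
series = 0.02 * t + np.where(t >= 6, 1.5, 0.0)

# "Scalpel" approach: find the largest step between consecutive
# readings and cut the record there instead of adjusting it.
steps = np.abs(np.diff(series))
cut = int(np.argmax(steps)) + 1
segments = [series[:cut], series[cut:]]

# Each segment becomes an independent record; the jump never enters
# the trend estimate because no single segment contains it.
print(len(segments), cut)
```

The real method also down-weights stations that drift gradually away from their neighbours, which a simple step detector like this would miss.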
As with the surface temperature datasets, there’s more to revealing the climate trend than just printing out the satellite measurements. There are two key corrections that have to be applied.
The first involves calibrating the instrument to keep the measurements from drifting over time. The satellite frequently takes a peek at deep space, getting a reading of the (well-known) Cosmic Microwave Background Radiation left over from the Big Bang, an almost absolutely frigid temperature of 2.7 kelvins. Then the instrument takes a microwave reading of a special block on the satellite itself that has a bunch of actual thermometers tracking its temperature, which varies as the satellite passes into and out of sunlight.
If the instrument had a simple linear sensitivity to temperature, you could just draw a straight line through those two known points and precisely work out the temperature of any other object. Unfortunately, there’s a slight curve to the sensitivity, and there are a number of slightly curved lines that can pass through two points. Use the wrong curved line, and the changing temperature of the calibration target on the satellite will influence your measurements. To figure out which curved line to use for each satellite, you have to carefully compare its measurements with those made by other satellites operating at the same time.
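The "many curves through two points" problem is easy to see numerically. A minimal sketch, with entirely hypothetical counts and a made-up curvature term: every value of the curvature parameter passes exactly through both calibration points, yet gives a different answer in between.

```python
# Two known calibration points: deep space (2.7 K) and the on-board
# warm target (assumed ~290 K here); the counts are hypothetical.
T_cold, T_warm = 2.7, 290.0
c_cold, c_warm = 100.0, 9000.0

def calibrate(counts, curvature):
    """Map instrument counts to temperature with a slightly curved
    response. Any value of `curvature` still passes exactly through
    both calibration points, which is why two points aren't enough."""
    x = (counts - c_cold) / (c_warm - c_cold)   # 0 at cold, 1 at warm
    x_curved = x + curvature * x * (1.0 - x)    # vanishes at both ends
    return T_cold + x_curved * (T_warm - T_cold)

mid = (c_cold + c_warm) / 2
for nl in (0.0, 0.02, -0.02):
    print(nl, calibrate(mid, nl))  # same endpoints, different midpoints
```

Pinning down the curvature therefore needs outside information, which is why overlapping satellites are compared against each other.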
The second correction is the most important, and it has also caused significant confusion over the years. The satellites orbit the Earth from pole to pole, passing over each location at the same time of day each time. But many of the satellites don’t quite nail this rhythm and are progressively falling a little more behind schedule each day. Since temperature changes over the course of the day, your measurements for that location would slowly change over time even if every day were the same temperature. It’s as if you started checking the temperature at your house at 5:00pm each day but after a few years ended up checking at 7:00pm instead.
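The drifting-observation-time effect above can be demonstrated with a toy simulation (the diurnal cycle and drift rate are invented): hold the climate perfectly constant, let the sampling hour slip, and a spurious trend appears anyway.

```python
import math

# Hypothetical diurnal cycle: identical every day, warmest mid-afternoon.
def temp_at(hour):
    return 15.0 + 5.0 * math.sin(2 * math.pi * (hour - 9.0) / 24.0)

# A satellite meant to observe at 14:00 local time drifts later by
# ~0.001 h per day; the climate itself is held perfectly constant.
readings = []
for day in range(3650):               # ten years of daily passes
    hour = 14.0 + 0.001 * day         # drifts from 14:00 to ~17:39
    readings.append(temp_at(hour))

# Despite zero real change, the record shows a spurious "trend":
# sampling ever later in the afternoon reads ever cooler.
drift = readings[-1] - readings[0]
print(round(drift, 2))
```

In this toy the drift produces false cooling of roughly a degree over the decade, which is the same flavour of error the UAH corrections discussed below were chasing.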
"Back in 2010 or so, I put together some software that would download all the global temperature data, turn it into a global temperature estimate, and would let people play around with that," he continued. "A number of the skeptical folks created their own temperature reconstruction. And lo and behold, it turned out to pretty much be the same as [the Met Office Hadley Centre's dataset]. In fact, actually slightly warmer than Hadley. I think when people who are acknowledged as skeptical of these things do the work themselves and see that, lo and behold, the results aren't different, that ends up being a very powerful thing."
Watching Neanderthal Apocalypse and the role of climate change in their extinction.
Bastids should have stopped driving SUVs eh?
Well TS I'm surprised you didn't see the graph highlighting how the raw data of weather around the world in fact indicates a higher level of warming.
Interesting isn't it ? For all the huffing and puffing and outrage at adjusted global temperatures it turns out the adjustments seem to reduce the amount of warming.
Surely it would serve them better to present the raw data, thereby creating more hysteria for the media and the masses to chow down on?
So we cannot use the raw data; it must be analysed first and anomalies used to create a trend pattern rather than declaring the actual temperature. Or are my eyes painted on, and has my brain shrunk to the size of a pea and just as mushy?
And as for the rest of the paper? You overlooked all the interesting ways that scientists had to account for the different ways water temperatures were recorded: canvas buckets versus metal buckets, and the use of water from engine room intakes, which in itself was responsible for raising the indicated temperature by 0.6 °C.
During World War II, a huge change-over took place as naval vessels swarmed the seas. Water temperature measurements were now made by thermometers in the engine cooling water intake pipe. That intake obviously led to a hot engine, raising the measured temperatures a bit. What’s more, ships of different sizes drew water from slightly different depths beneath the surface.
So volcanoes can significantly affect climate! No matter how much the Neanderthals limited their carbon-based fuel consumption, or imposed a firewood tax on themselves.
Meanwhile, if Basilio can reproduce un-attributed work, so can I.
Cyclical, post-ice age warming should not be conflated with the theory of AGW caused by man-made CO2. It is interesting that the alarmists "global warming", quickly morphed into "climate change".
[Attached graph: global temperature change (blue) against CO2 levels (black), 1880 to current, left and right axes]
Publisher: C3headlines@gmail.com. On the above graph, from 1880 to current, check the temp change in blue against CO2 levels in black (left and right axes).
Global Warming Science Facts: Climate Reality Intrudes, CO2 Has Little Impact On Long-Term Global Temperatures
...silly me for thinking I should question the "data".
Unless you're able to give some sort of reference with regard to your capacity to question the "data", one might legitimately ask what, if anything, your assessment of that "data" would be worth. Your critique of those who do have a recognised ability in that regard may have some weight... Care to give us all here a percentage of the global scientific community that might entail? And then, if you care to name them and cite their published work in support of their theories, people here will be able to make a better case for recognising you as silly, or something else.
Cold water has a higher density than warm water. Water gets colder with depth because cold, salty ocean water sinks to the bottom of the ocean basins below the less dense warmer water near the surface. The sinking and transport of cold, salty water at depth combined with the wind-driven flow of warm water at the surface creates a complex pattern of ocean circulation called the 'global conveyor belt.'
We should also consider the accuracy of the typical mercury and alcohol thermometers that have been in use for the last 120 years. Glass thermometers are calibrated by immersing them in ice/water at 0 °C and a steam bath at 100 °C. The scale is then divided equally into 100 divisions between zero and 100. However, a glass thermometer at 100 °C is longer than a thermometer at 0 °C. This means that the scale on the thermometer gives a false high reading at low temperatures (between 0 and 25 °C) and a false low reading at high temperatures (between 70 and 100 °C). This process is also followed with weather thermometers with a range of -20 to +50 °C.
In general, bucket temperatures have been found to average a few tenths of a °C cooler than simultaneous engine intake temperatures. Field and lab experiments demonstrate that cooling of bucket samples prior to measurement provides a plausible explanation for negative average bucket-intake differences. These can also be credibly attributed to systematic errors in intake temperatures, which have been found to average overly-warm by >0.5 °C on some vessels. However, the precise origin of non-zero average bucket-intake differences reported in field studies is often unclear, given that additional temperatures to those from the buckets and intakes have rarely been obtained. Supplementary accurate in situ temperatures are required to reveal individual errors in bucket and intake temperatures, and the role of near-surface temperature gradients. There is a need for further field experiments of the type reported in Part 2 to address this and other limitations of previous studies.
Brooks conducted an additional shipboard comparison aboard the ocean liner SS Finland on a cruise between San Francisco and New York in May 1928 (B28). Temperatures from the main engine intake were found to average 0.8 °C warmer than those obtained by fast measurement with a rubber-covered tin bucket of small volume (1.7 L). Those from the refrigerator intake in the refrigerator room averaged 0.2 °C warmer. Respectively, the engine intake and refrigerator intake readings were found to average 0.7 and 0.3 °C warmer than those from a specially-fitted intake thermograph. While details of the engine intake thermometer were not reported, the refrigeration intake thermometer was graduated in intervals of 2 °F (~1.1 °C). Temperature change of the tin bucket sample pre-measurement was assumed small, although cooling of 0.1 °C was noted in one minute following collection under a wind speed of 9 m/s and an SST-wet bulb temperature contrast of 6 °C.
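The kind of average bucket-intake offset these shipboard comparisons report is just a mean of paired differences. A minimal sketch with invented readings (real studies must also model sampling depth, weather, and instrument error):

```python
import statistics

# Hypothetical paired SST readings (°C) from the same transect:
# (bucket, engine intake). Intakes tend to read warm; bucket samples
# cool slightly in the air before being read.
pairs = [
    (18.2, 18.9), (18.5, 19.3), (19.0, 19.6),
    (19.4, 20.2), (18.8, 19.5), (19.1, 19.9),
]

# Mean bucket-minus-intake difference: the negative few-tenths-of-a-
# degree average offset the field studies describe.
diffs = [b - i for b, i in pairs]
bias = statistics.mean(diffs)
print(round(bias, 2))
```

An offset estimated this way is then what gets applied (or argued about) when pre- and post-war sea surface records are stitched together.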
The storm, dubbed Winter Storm Jonas and "Snowzilla", walloped a dozen states from Friday into early Sunday, affecting an estimated 85 million residents who were told to stay indoors and off the roads for their own safety.
The 68 centimetres of snow that fell in New York's Central Park was the second-highest accumulation since records began in 1869, and more than 56 centimetres paralysed the capital Washington.
The greatest single-day snowfall at Baltimore was 23.3 inches on Jan. 28, 1922.
Snowfall records date back to 1884 at Washington, D.C., and 1892 at Baltimore.
Record-breaking or not, the weekend blizzard will prove to be very disruptive over a broad area of the mid-Atlantic and perhaps part of southern New England as well.