>
> On Mon, 11 Jun 2001, Doug Henwood wrote:
>
> > Wall Street Journal [of course] - June 11, 2001
> >
> > Scientists' Report Doesn't
> > Support the Kyoto Treaty
> >
> > By Richard S. Lindzen. Mr. Lindzen, a professor of meteorology at
> > MIT, was a member of the National Academy of Sciences panel on
> > climate change.
>
> An interesting context for the article that follows is provided by Doug's
> interview last Thursday, the day after the report came out, with Ross
> Gelbspan, author of _The Heat Is On_:
>
=========
Here's the link for the NAS report:
< http://lab.nap.edu/catalog/10139.html >
Apropos: the article below highlights a big part of the problem too [not the leadership part, which is a stupid way of looking at the issue]....Ghost of Derek de Solla Price.....
[NYT] June 11, 2001

U.S. Losing Status as a World Leader in Climate

By ANDREW C. REVKIN

In little more than a decade, the United States has fallen significantly behind other countries in its ability to simulate and predict long-term shifts in climate, according to a wide range of scientists and recent federal studies.
This slide in status has occurred amid a growing scientific consensus that rising levels of heat-trapping emissions from smokestacks and tailpipes are warming the climate and could become the biggest environmental problem of the next 100 years.
President Bush plans to use a Rose Garden speech on global warming policy today to propose several ways to improve the situation, government officials say, including an increase in money for basic climate research and an effort to coordinate American climate-modeling efforts with those abroad.
But many climate experts say that the problems are deep-rooted, and that a clearer picture of the local and global impact of coming climate shifts will emerge only if there is a substantial shuffling of the scientific bureaucracy and permanent support for basic monitoring of climate-influencing factors like the ebb and flow of greenhouse gases.
American researchers have repeatedly had to go to Europe or Japan to find computers capable of handling their most ambitious climate analyses. The most recent international effort to assess links between global warming and human activities, completed by the Intergovernmental Panel on Climate Change this year, relied mainly on European models.
Over all, many experts conclude, advanced climate research in the United States is fragmented among an alphabet soup of agencies, strained by inadequate computing power and starved for the basic measurements of real-world conditions that are needed to improve simulations.
While Britain and Japan have poured tens of millions of dollars into computing centers focused on long-term climate research, budgets for similar efforts in the United States have been flat at best, and the work is done at dispersed research centers run by a variety of federal agencies.
"We have groups doing numerical weather prediction, hurricanes, climate, oceans, but in the international arena, countries have whole institutions doing the functions of these individual groups," said Dr. Ronald J. Stouffer, who designs and runs climate models at the Geophysical Fluid Dynamics Laboratory in Princeton, N.J., a top Commerce Department center for weather and climate work.
Improving each aspect of climate analysis is essential, many experts say, if the country is to move from pondering what to do about a general warming trend to considering consequences for particular regions and the likely impact on agriculture, ecosystems and water supplies.
"What really matters to people is, does the wheat belt move north, how much does sea level rise, does California lose its water supply?" said Dr. Larry Smarr, a computer scientist at the University of California, San Diego, who helped establish the American network of academic supercomputing centers in the 1980's.
But, he said, the limits on detail and power in computer models for American researchers are like those facing a "nearsighted person who's lost his glasses."
"Twelve or 13 years ago," Dr. Smarr said, "we took it for granted that the U.S. was in the lead on everything from weather prediction to climate modeling." Europe is leading in long-term and day-to-day forecasting.
"I've watched over the last decade in horror," he said. "It's almost like benign neglect."
The problems in climate science have been identified in a lengthening string of reports by the National Academy of Sciences, including the climate report completed for the White House last week. They were also highlighted for several senators and Treasury Secretary Paul H. O'Neill on Friday by a separate panel of scientists from the academy.
"Here in the United States, with a gross national product perhaps 10 times that of England, we're spending less than they do on this sort of problem," said Dr. Edward S. Sarachik, a professor of atmospheric sciences at the University of Washington who was an author of the report written for the Bush administration. He also led a separate science academy panel that issued a report in April on the weakness of America's most sophisticated computer climate models.
During the Clinton administration, the lack of American modeling leadership did not have a discernible impact on climate policy, various experts said. But it did prevent the United States from playing a more central role in writing critical sections of the Intergovernmental Panel's report - particularly the part assessing the extent of human influence on the warming trend of recent decades.
In computing power, Dr. Sarachik said, "our top two centers together don't amount to one-fifth of the European effort."
American scientists still tend to dominate basic research on the physics of the atmosphere, many climate experts say. But they lag in the ability to plug that knowledge into computer models that provide society with the only meaningful lens on future climate.
Given the growing importance of the problem, the science academy has recommended the formation of a National Climate Service that would be similar to the National Weather Service but would focus on long-range trends instead of the evanescent day-to-day flickers of weather.
The lack of computer power has hurt the most at the pinnacle of climate science: the use of supercomputers to create detailed models simulating the interrelationships of the earth's atmosphere, oceans, ice caps, plants and other features that together set the global thermostat.
These models are composed of several hundred thousand lines of computer code that divide the air, land and oceans into a grid of hundreds of interacting boxes. The best American models still lack sufficient resolution to capture critical features like the Rocky Mountains, which funnel humid Gulf of Mexico air over the heartland, or the Gulf Stream, which pumps tropical warmth north along the East Coast.
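To make the resolution point concrete, here is a minimal back-of-envelope sketch in Python. The grid sizes and feature widths are assumed, order-of-magnitude numbers chosen for illustration; they are not the specifications of any model mentioned in the article.

# Rough illustration only: why coarse grid boxes blur out features like the
# Rockies or the Gulf Stream. All numbers are order-of-magnitude assumptions.

EARTH_CIRCUMFERENCE_KM = 40_000  # approximate equatorial circumference

def cell_width_km(points_around_equator):
    """East-west width of one grid box at the equator."""
    return EARTH_CIRCUMFERENCE_KM / points_around_equator

FEATURES_KM = {"Gulf Stream": 100, "Rocky Mountains": 500}  # rough widths

for points in (128, 1_600):  # hypothetical coarse vs. much finer grids
    width = cell_width_km(points)
    print(f"grid of {points} points: ~{width:.0f} km boxes")
    for name, feature_km in FEATURES_KM.items():
        # a feature much narrower than a couple of boxes is effectively invisible
        status = "resolved" if feature_km >= 2 * width else "blurred out"
        print(f"  {name} (~{feature_km} km): {status}")

The point is simply that a feature narrower than a couple of grid boxes cannot show up in the simulation, no matter how good the underlying physics is.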
Researchers abroad, most notably at Britain's Hadley Center for Climate Prediction and Research, can run simultaneous sets of climate simulations that compress 1,000 years of climate change into a day of computer crunching.
"That is roughly 5 to 10 times faster than we can," said Dr. Jay S. Fein, the program director for climate dynamics at the National Science Foundation. "Some of our best scientists have been and will be attracted to work in Europe."
But the gap does not just pose the threat of a brain drain, Dr. Fein and other experts say. They say there are potential economic and security problems if there are delays answering climate questions that are a priority for the United States - like whether global warming will eliminate the winter mountain snows that supply California with three-quarters of its water in summer.
That problem was highlighted last year when the United States Global Change Research Program, a 10-year-old government office coordinating most climate work, published an assessment of the expected impact of global warming on areas around the country. The heart of the effort, which had been requested by Congress, was a series of modeling studies that had to be run on computers in Britain, Canada and Japan because American climate centers lacked the capacity to perform the calculations.
Those computers might not always be available to serve American needs, the experts say.
And the gap may soon widen.
Japan is poised to leapfrog past everyone in the quest for model power. Over the last three years, it has spent more than $400 million as it builds what it is calling an Earth Simulator in Yokohama, which will have a linked array of supercomputers calculating at speeds many times faster than the best existing modeling systems.
A typical computer array used for climate modeling in the United States can process about 20 gigaflops, or 20 billion floating-point operations per second, recent studies say. European centers are routinely running beyond 100 gigaflops. The Japanese machine's performance will be measured in teraflops, or thousands of gigaflops.
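A rough sketch of how those figures line up with the throughput gap described above. The flops numbers are the article's; the "operations per simulated year" constant is a hypothetical value calibrated so that ~100 gigaflops works out to roughly the 1,000 simulated years per day reported for the Hadley Center, so only the ratios are meaningful.

# Back-of-envelope sketch. The flops figures come from the article; the
# operations-per-simulated-year constant is a hypothetical calibration.

GIGA = 1e9
SECONDS_PER_DAY = 86_400

centers = {
    "typical U.S. array (~20 gigaflops)": 20 * GIGA,
    "European center (~100 gigaflops)": 100 * GIGA,
    "Earth Simulator (order of a teraflop)": 1_000 * GIGA,
}

OPS_PER_SIMULATED_YEAR = 8.64e12  # hypothetical; set so ~100 GFLOPS -> ~1,000 years/day

for label, flops in centers.items():
    years_per_day = flops * SECONDS_PER_DAY / OPS_PER_SIMULATED_YEAR
    print(f"{label}: ~{years_per_day:,.0f} simulated years per day")

On those assumptions, the typical U.S. array manages about 200 simulated years per day, a factor of five below the European pace and a factor of fifty below the planned Japanese machine, consistent with the "roughly 5 to 10 times faster" gap Dr. Fein describes.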
But better computers and organization are only part of the picture, the experts say. Better monitoring of conditions that influence the current climate is crucial, Dr. Sarachik's April study concluded.
For example, while policy makers have been debating the value of forests and farmland as a sponge for some human-generated carbon dioxide, the budget for a federal program monitoring atmospheric carbon dioxide with instruments on aircraft and tall radio towers has remained at $1.4 million a year over the last nine years. But that budget has been so eroded by inflation that the operation could be cut in half without emergency financing from Congress, government scientists say.
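For a sense of what a flat nominal budget means in real terms, a small illustrative calculation; the inflation rate is an assumed average, not a figure from the article.

# Illustrative only: purchasing power of a budget held flat at $1.4 million
# for nine years, at an assumed average inflation rate of 2.5% per year.

BUDGET_NOMINAL = 1.4e6   # dollars per year, unchanged
INFLATION_RATE = 0.025   # assumed average annual inflation
YEARS = 9

real_value = BUDGET_NOMINAL / (1 + INFLATION_RATE) ** YEARS
print(f"real value after {YEARS} years: ${real_value:,.0f} "
      f"({real_value / BUDGET_NOMINAL:.0%} of the original)")

Even at that modest assumed rate, the program loses roughly a fifth of its purchasing power over the nine years, before any growth in operating costs is taken into account.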
"If something doesn't happen, we're done," said Dr. Pieter P. Tans, chief scientist at the climate monitoring and diagnostics laboratory of the National Oceanic and Atmospheric Administration in Boulder, Colo., which takes the carbon dioxide measurements.
With budgets for science curtailed everywhere, Dr. Tans said, he knows it is a tough sell to seek support for work that is important but plodding.
But, he added, "To really understand climate, we have to establish a high-quality record that can be trusted."