Brain-storms and litmus tests for acidity with hot air pollution

18 Mar 2010
Some of the proposed/agreed areas before the undersea land grab. OSPAR (Oslo Paris Convention) began to emerge in 1972 to stop waste dumping in the Northeast Atlantic. Credit: TW : EEC Photos

Peter O’Neill reports that doubts are also coming in across all weather fronts about the quality of climate data affecting regulation of the fishing industry.

Natural England, a UK government-funded body with a budget of £300 million, recently co-hosted an international conference entitled “Sea change: Securing a future for Europe’s seas”. The main focus was marine protected areas (MPAs), and the speakers were generally from the academic and NGO sector, with a sprinkling of politicians and officials. The few fishermen there said the MPAs needed far more definition and a longer timescale so that fishing stakeholders could be properly consulted.

The failure of the Common Fisheries Policy (CFP) is now accepted by the European Commission (EC) in Brussels. Poul Degnbol (EC DG Maritime Affairs and Fisheries) said the aims set out were a “poor” guide to reaching the final destination. The “CFP objectives are not focused and prioritized [offering] no guidance for decision making and no accountability…[and] demonstrated that compromising the ecological sustainability in order to cushion short term economic and social effects of reductions have undermined the economic and social sustainability of European fisheries” (http://tinyurl.com/yf85n9x).

Moves by Russia to extend its continental shelf, and the reactions of other governments, mean the boundaries of many proposed MPAs may have to be renegotiated [e.g. see www.waddensea-secretariat.org]. This may give the fishing industry more time to adjust. But there is a deeper concern. The detailed data used as the bedrock for ideas such as MPAs seems to rest on shifting sands. The same data sources are often used to decide closed seasons. But these climate change data sources are coming under more and more serious questioning.

The industry has known for decades that some seas are warming and fish are moving because of that. The political wild card in the equation is how much is caused by man, and how much by unavoidable natural warming and cooling cycles. The Natural England conference was shown a graph of sea warming. However, World Fishing recalls how scientists in the UK scoffed at its reports that a phenomenon linked to water temperature changes was causing squid to disappear off the coast of Santa Cruz in California in 1983. They said it was an old myth made up by the fishermen, who called it El Niño. Of course, every scientist today will tell you about the importance of El Niño in their calculations.

Dodgy data

The row continues over the hacked emails, whose content, it was alleged, seemed to indicate that climate scientists at the University of East Anglia were not releasing full data into the public domain for rigorous checking by others. The row has now reached the spotlight of an investigative select committee of the UK parliament, where questions have been asked about whether the data had been manipulated.

The fishing industry needs to know if the data is flawed, and if it was processed to promote one side or the other. Dodgy data means bad decisions imposed on the sector by government.

Data from a range of sources are fed into the scientific model mixer, including Arctic and Antarctic glacier melting. But we have also discovered that Himalayan glaciers are not melting as fast as was claimed. Downstream water provision from these glaciers is key when planning for India’s large inland river and village pond fish industries, and for shrimp farming around the Bay of Bengal.

In February, WF attended a briefing by three environmental professors from the US, UK and Germany, all heading international atmospheric and pollution monitoring networks. They were floating an idea to tighten up the quality of ground data collected at monitoring sites on air quality and harmful climate gases (and, in some cases, temperature). These sites include sea locations.

They stressed this more controlled network was still an idea in development, and they could not put a hard figure on the total cost. But the figures thrown up suggested between US$2.5 million and US$10 million would be needed to set up a site (and up to US$1 million per annum to run one). Multiply those figures by, say, 190 countries and you get an enormous cost.

And between these three academics there were voluble differences of opinion about how good or bad the current data is, with them contradicting each other at points. For example, some operating sites are ‘contaminated’ because they have been put too near factories or towns, whose heat skews the data. You can add to that another row about temperature and rain stations in China, for example, having been moved around and often wrongly sited. Yet that Chinese data has been used as an important base for setting global climate change targets.
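
To put a rough figure on “enormous”: a minimal back-of-the-envelope sketch, assuming (purely for illustration, since the professors gave no such breakdown) one site per country at the per-site figures quoted above, comes out at roughly US$475 million to US$1.9 billion in set-up costs, plus up to US$190 million a year to run.

```python
# Back-of-the-envelope sketch using the per-site figures quoted at the briefing.
# The one-site-per-country assumption is purely illustrative, not from the briefing.
SETUP_COST_LOW = 2.5e6    # US$ to set up one monitoring site (low end quoted)
SETUP_COST_HIGH = 10e6    # US$ to set up one monitoring site (high end quoted)
ANNUAL_RUN_COST = 1e6     # US$ per site per year (upper figure quoted)
COUNTRIES = 190           # illustrative multiplier used in the article

setup_low_total = SETUP_COST_LOW * COUNTRIES
setup_high_total = SETUP_COST_HIGH * COUNTRIES
annual_total = ANNUAL_RUN_COST * COUNTRIES

print(f"Set-up: US${setup_low_total/1e6:,.0f}m to US${setup_high_total/1e6:,.0f}m")
print(f"Running: up to US${annual_total/1e6:,.0f}m per year")
```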

The academics also said the ground data coming out could be 30 per cent inaccurate. To quote one of them: “You can imagine those numbers would be way off”.

And they all agreed that while it is possible to see the global quantity of CO2 emissions, CO2, as one academic put it, “is the hardest one to do” of all the potential climate-impacting emissions in terms of nailing down the source which caused it. We have to remember, then, that cargo vessels, trawlers and most large industrial fishing vessels are seen as part of the CO2 production process. More proposals are sweeping in to tackle ocean acidification. The same goes for long-term coral security and keeping fishermen away from corals and seamounts, yet the simple solution of better seabed maps is ignored.

A limited repetition of the winter snows of the 1960s, and gales in the Bay of Biscay (what’s new?) causing floods in coastal France and southern England, have now led the UK Meteorological Office to announce a review of its 150 years of data to see why its forecasts are going wrong. It does add that it believes the data is not flawed.

Skippers have recorded weather cycles and water temperatures hour by hour in their logbooks over hundreds of years. That is the hard, cold, and scientifically neutral data which is needed, not ‘brain-storming’. What is on offer seems to have some worrying short circuits (and perhaps dangerous shortcuts) in the electrical action in the neural synapses of the scientific community.
