Saturday, May 31, 2014

How many deaths could result from failure to act on climate change?

A recent OECD study concludes that outdoor air pollution is killing more than 3.5 million people a year globally. The OECD estimates that people in its 34 Member countries would be willing to pay USD 1.7 trillion to avoid deaths caused by air pollution. Road transport is likely responsible for about half of this cost.

A 2012 report by DARA calculated that 5 million people were dying each year from climate change and carbon economies, mostly from indoor smoke and (outdoor) air pollution.

Back in 2012, a Reuters report calculated that this could add up to a total of 100 million deaths over the coming two decades. That total, however, only covers deaths from smoke and air pollution; it leaves out the additional deaths that failure to act on climate change could cause in other ways.

Indeed, failure to act on climate change could result in many more deaths due to other causes, in particular food shortages. As temperatures rise, ever more extreme weather events can be expected, such as flooding, heatwaves, wildfires, droughts, and subsequent crop loss, famine, disease, heat-stroke, etc.

So, while currently most deaths are caused by indoor smoke and outdoor air pollution, in case of a failure to act on climate change the number of deaths can be expected to rise most rapidly among people hit by famine, fresh water shortages, as well as wars over food, water, etc.

How high could figures rise? Below is an update of an image from the earlier post Arctic Methane Impact with a scale in both Celsius and Fahrenheit added on the right, illustrating the danger that temperature will rise to intolerable levels if little or no action is taken on climate change. The inset shows projected global number of annual climate-related deaths for these two scenarios, i.e. no action and little action, and also shows a third scenario of comprehensive and effective action that would instead bring temperature rise under control.

For further details on a comprehensive and effective climate plan, see the ClimatePlan blog.





Monday, May 26, 2014

Data sharing: Exciting but scary




© www.CartoonStock.com



Yesterday I did something I've never done before in many years of publishing. When I submitted a revised manuscript of a research report to a journal, I also posted the dataset on the web, together with the script I'd used to extract the summary results. It was exciting. It felt as if I was part of a scientific revolution that has been gathering pace over the past two or three years, and which culminated in the adoption of a data policy by PLOS journals last February. This specified that authors would be required to make the data underlying their scientific findings publicly available immediately upon publication of the article. As it happens, my paper was not submitted to a PLOS journal, so I'm not obliged to do this, but I wanted to, having considered the pros and cons. My decision was also influenced by the Wellcome Trust, who fund my work and encourage data sharing.


The benefits are potentially huge. People usually think about the value to other researchers, who may be able to extract useful information from your data, and there's no doubt this is a factor.  Particularly with large datasets, it's often the case that researchers only use a subset of the data, and so valuable information is squandered and may be lost forever.  More than once I've had someone ask me for an old dataset, only to find it is inaccessible, because it was stored on a floppy disk or an ancient, non-networked computer and so is no longer readable.  Even if you think that you've extracted all you can from a dataset, it may still be worth preserving for potential inclusion in future meta-analyses.


Another value of open data is less often emphasised: when you share data you are forced to ensure it is accurate and properly documented. I enjoy data analysis, but I'm not naturally well-disciplined about keeping everything tidy and well-organised. I've been alarmed on occasion to return to a dataset and find I have no idea what some of the variables are, because I failed to document them properly.  If I know the world at large will see my dataset then I won't want to be embarrassed by it, and so I will take more care to keep it neat and tidy with everything clearly labelled. This can only be good.


But here's the scary thing: data sharing exposes researchers to the risk of being found out to be sloppy or inaccurate. To my horror, shortly before I posted my dataset on the internet yesterday, I found I'd made a mistake in the calculation of one of my variables. It was a silly error, caused by basing a computation on the wrong column of data. Fortunately, it did not have a serious effect on my paper, though I did have to redo all the tables and make some changes to the text. But it seemed like pure chance that I picked up on this error – I could very easily have posted the dataset on the internet with the error still there. And it was an error that would have been detected by anyone eagle-eyed enough to look at the numbers carefully. Needless to say, I'm nervous that there may well be other errors in there that I did not pick up. But at least it's not as bad as an apocryphal case of a distinguished research group whose dramatic (and published) results arose because someone forgot to designate 9 as a missing-value code. When I heard about that, I shuddered, as I could see how easily it could happen.
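The kinds of slips described above (an undeclared missing-value code, a derived variable computed from the wrong column) lend themselves to simple automated checks before a dataset is shared. The sketch below is purely illustrative: the variable names, the 9-as-missing convention, and the 40-point score scale are all invented for the example.

```python
# Hypothetical pre-sharing sanity checks; names and conventions invented.

def check_missing_codes(values, codes=(9, 99, 999, -99)):
    """Flag values that look like undeclared missing-value codes."""
    return [v for v in values if v in codes]

def check_derived_column(raw, derived, recompute):
    """Recompute a derived variable from its raw column and return the
    indices of any rows where the stored value disagrees."""
    return [i for i, (r, d) in enumerate(zip(raw, derived))
            if recompute(r) != d]

# Example: score_pct should be score / 40 * 100, but row 2 was
# (hypothetically) computed from the wrong column.
scores = [32, 28, 36]
score_pct = [80.0, 70.0, 75.0]   # 36/40*100 should be 90.0, not 75.0
print(check_derived_column(scores, score_pct, lambda s: s / 40 * 100))
# → [2]

ages = [34, 9, 41]               # is that 9 a real age or a missing code?
print(check_missing_codes(ages))
# → [9]
```

Re-deriving every computed column from its raw inputs in a script, rather than trusting stored values, is the cheap insurance here; anything the check flags is either a genuine error or a value worth documenting.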


This is why Open Data is both important for science but difficult for scientists. In the past, I've found mistakes in my datasets, but this has been a private experience.  To date, as far as I am aware, no serious errors have got into my published papers – though I did have another close shave last year when I found a wrongly-reported set of means at the proofs stage, and there have been a couple of instances where minor errata have had to be published. But the one thing I've learned as I wiped the egg off my face is that error is inevitable and unavoidable, however careful you try to be. The best way to flush out these errors is to make the data public. This will inevitably lead to some embarrassment when mistakes are found, but at the end of the day, our goal must be to find out what is the case, rather than to save face.


I'm aware that not everyone agrees with me on this. There are concerns that open data sharing could lead to scientists getting scooped, will take up too much time, and could be used to impose ever more draconian regulation on beleaguered scientists: as DrugMonkey memorably put it:  "Data depository obsession gets us a little closer to home because the psychotics are the Open Access Eleventy waccaloons who, presumably, started out as nice, normal, reasonable scientists." But I think this misses the point. Drug Monkey seems to think this is all about imposing regulations to prevent fraud and other dubious practices.  I don't think this is so. The counter-arguments were well articulated in a blogpost by Tal Yarkoni. In brief, it's about moving to a point where it is accepted practice to make data publicly available, to improve scientific transparency, accuracy and collaboration. 

Sunday, May 25, 2014

Large Falls in Arctic Sea Ice Thickness over May 2014

Comparing ice thickness (in meters) on May 2, 2014 (left) and May 30, 2014 (right, forecast run May 25, 2014)
Arctic sea ice has shown large falls in thickness in many areas over the course of May 2014, as shown in the image above. The animation below also compares the situation on May 2, 2014, with that on May 30, 2014 (as forecast by the Naval Research Laboratory on May 23, 2014). Ice thickness is in meters.


Thickness is an important indicator of the vulnerability of the ice. Looking at sea ice extent alone, one might (wrongly) conclude that sea ice retreat was only minor and that everything looked fine. By contrast, when looking at thickness, it becomes evident that large falls have occurred over the course of May 2014.

Falls at the edges of the sea ice can be expected at this time of the year, but the large fall closer to the center is frightening. On the one hand, it appears to reflect cyclonic weather and subsequent drift of the ice. On the other hand, it also indicates how vulnerable the sea ice has become. Last year, a large area showed up at the center of the sea ice where the ice became very thin, as discussed in July 2013 in the post Open Water at North Pole and again in the September 2013 post North Hole.

The appearance of huge weak areas at the center of the sea ice adds to its vulnerability and increases the prospect of total sea ice collapse, in case of one or more large cyclones hitting the Arctic Ocean later this year. To highlight the dangerous situation, the main image from a post earlier this month is again added below.


Adding to the concerns are huge sea surface temperature anomalies, as illustrated by the image below, showing anomalies on May 23, 2014, created by Harold Hensel with ClimateReanalyzer and Google Earth.

The image below shows sea surface anomalies on May 26, 2014, with an overlay of land temperatures, as created by Harold Hensel and edited by Sam Carana.


The image shows sea surface temperatures in the Northern Hemisphere that are 1.44 degrees Celsius warmer than the baseline temperature, despite large areas of cold water partly resulting from the huge amounts of meltwater flowing down along the edges of Greenland into the North Atlantic Ocean. The graph below shows Northern Hemisphere and global sea surface temperature anomalies over May 2014.

By comparison, current (May 27, 2014) surface temperature anomalies are 0.64°C globally and 0.84°C for the Northern Hemisphere. The image below shows annual temperature anomalies (land and ocean data).



Meanwhile, the development of this year's 'north hole' at the center of the sea ice appears to persist, as illustrated by the image below.






Thursday, May 22, 2014

The real budgetary emergency and the myth of "burnable carbon"

by David Spratt


How fast and how profoundly we act to stop climate change caused by human actions, and work to return to a safe climate, is perhaps the greatest challenge our species has ever faced, but are we facing up to what really needs to be done?

We have to come to terms with two key facts: practically speaking, there is no longer a "carbon budget" for burning fossil fuels while still achieving a two-degree Celsius (2°C) future; and the 2°C cap is now known to be dangerously too high.

No Carbon Budget Left - "We have no carbon budget left for burning of fossil fuels - emergency action is now the only viable path" - David Spratt, from Breakthrough

For the last two decades, climate policy-making has focused on 2°C of global warming impacts as being manageable, and a target achievable by binding international treaties and incremental, non-disruptive, adjustments to economic incentives and regulations (1).

But former UK government advisor Professor Sir Robert Watson says the idea of a 2°C target "is largely out of the window", International Energy Agency chief economist Fatih Birol calls it "a nice Utopia", and international negotiations chief Christiana Figueres says we need "a miracle". This is because, in their opinions, emissions will not be reduced sufficiently to keep to the necessary "carbon budget" (2).

The carbon budget has come to public prominence in recent years, including in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report in 2013, as being the difference between the total allowable greenhouse gas emissions for 2°C of warming, and the amount already emitted or spent. The budget varies according to the likelihood of overshooting the target: the higher the risk, the bigger the budget. In the IPCC report, no carbon budget is given for less than a one-in-three chance of failure.

At that one-in-three risk of failure, the IPCC says the total budget is 790 GtC (gigatons, or billions of tons, of carbon), less emissions to 2011 of 515 GtC, leaving a budget of 275 GtC in 2011, or ~245 GtC in 2014 (3).
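The budget arithmetic quoted here is easy to verify. In the sketch below, the 790 and 515 GtC figures come from the IPCC numbers above; the ~10 GtC per year used to roll the 2011 budget forward to 2014 is my own assumption, roughly matching fossil fuel plus land-use emissions at the time.

```python
# Carbon budget arithmetic from the IPCC figures quoted in the text.
total_budget_gtc = 790      # total allowable emissions for 2°C at 1-in-3 risk
emitted_to_2011_gtc = 515   # cumulative emissions through 2011

remaining_2011 = total_budget_gtc - emitted_to_2011_gtc
print(remaining_2011)       # → 275 GtC

# Rolling forward to 2014 assumes ~10 GtC/yr of ongoing emissions
# (my assumption, not an IPCC figure).
annual_emissions_gtc = 10
remaining_2014 = remaining_2011 - 3 * annual_emissions_gtc
print(remaining_2014)       # → 245 GtC, the "~245 GtC in 2014" in the text
```

At that assumed emissions rate, the remaining budget shrinks by roughly 4% of its 2011 value every year, which is why the "when" of the budget matters as much as the "how much".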

What is less well understood is that if the risk is low, there is no carbon budget left (4).

Breakthrough National Climate Restoration Forum, 21-22 June, Melbourne
Climate change, with its non-linear processes, tipping points and irreversible events – such as mass extinctions, destruction of ecosystems, the loss of large ice sheets and the triggering of large-scale releases of greenhouse gases from carbon stores such as permafrost and methane clathrates – contains many possibilities for catastrophic failure.

Ian Dunlop, a former senior risk manager and oil and coal industry executive, says the management of catastrophic risk has to be very different from current processes. As serious, irreversible outcomes are likely, this demands very low probabilities of failure: management of catastrophic risk "must centre around contingency planning for high-impact and what were regarded as low-probability events, which unfortunately are now becoming more probable… Major, high-risk industrial operations, such as offshore oil exploration, provide a model, with detailed contingency planning and sequential barriers being put in place to prevent worst-case outcomes" (5).

If a risk-averse (pro-safety) approach is applied – say, of less than 10% probability of exceeding the 2°C target – to carbon budgeting, there is simply no budget available, because it has already been used up. A study from The Centre for Australian Weather and Climate Research shows that "the combination of a 2°C warming target with high probability of success is now unreachable" using the current suite of policy measures, because the budget has expired (6).

This is illustrated in Figure 1 where, as we move to the right (greater probability of meeting target) along the blue line which is the 2°C carbon budget, we reach a point around 90% probability (blue circle) where the total budget intersects with what we have already emitted.



As well, on-going greenhouse emissions associated with food production and deforestation are often conveniently pushed to one side in discussing carbon budgets. UK scientists have shown that if some reasonably optimistic assumptions are made about deforestation and food-related emissions for the rest of the century, then most emission reduction scenarios are incompatible with holding warming to 2°C, even with a high 50% probability of exceeding the target. In other words, food and deforestation have taken up the remaining budget, leaving no space for fossil fuel emissions (7).

In addition, the carbon budget analysis makes optimistic and risky assumptions about the stability of the Arctic, and of polar and other carbon stores such as permafrost. As one example, the modelling discussed in the IPCC report projects an area of summer Arctic sea-ice cover in the year 2100 higher than actually exists at the moment, yet there is a great deal more warming and sea-ice loss to come this century! In fact, many Arctic specialists think the Arctic will be sea-ice free in summer within the next decade, with consequences for global warming that the carbon budget calculations have significantly underestimated (8).

Australian Climate Council member Prof. Will Steffen says the IPCC carbon budget may "be rather generous". The IPCC report says the modelling used does not include explicit representation of permafrost soil carbon decomposition in response to future warming, and does not consider slow feedbacks associated with vegetation changes and ice sheets. Recent research suggests these events could happen well below 2°C of warming, so they should be taken into account, but they are not.

Accounting for the possible release of methane from melting permafrost and ocean sediment implies a substantially lower budget (9). This reinforces the need to take a pro-safety, risk-averse approach to the carbon budget, especially since some research suggests that Arctic permafrost may be vulnerable at less than 2°C of warming (10).

For all these reasons – that is, prudent catastrophic risk management, accounting for food production and deforestation emissions, and for Arctic sea ice and carbon store instability – the idea of "burnable carbon" – that is, how much more coal, gas and oil we can burn and still keep under 2°C – is a dangerous illusion, based on unrealistic, high-risk, assumptions.

A second consideration is that 2°C of warming is not a safe target. Instead, it is the boundary between dangerous and very dangerous (11), and 1°C higher than anything experienced during the whole period of human civilisation (12), as illustrated in Figure 2. The last time greenhouse gas levels were as high as they are today, modern humans did not exist (13), so we are conducting an experiment for which we have no direct observable evidence from our own history, and whose full result we do not know.



However, we do understand that many major ecosystems will be lost, that the sea-level rise associated with 2°C of warming will eventually be measured in the tens of metres (14), and that much of human civilisation and the large, productive river delta systems will be swamped. There is now evidence to suggest that the current conditions affecting the West Antarctic ice sheet are sufficient to drive between 1.2 and 4 metres of sea-level rise (15), and evidence that Greenland will contribute more quickly than expected (16), and they are just two contributors to rising sea levels.

It is now clear that the incremental-adjustment 2°C strategy has run out of time, if for no other reason than the "budget" for burning more fossil fuels is now zero, yet the global economy is still deeply committed to their continuing widespread use.



We all wish the incremental-adjustment 2°C strategy had worked, but it hasn't. It has now expired as a practical plan.

We now have a choice to make: accept much higher levels of warming of 3–5°C that will destroy most species, most people and most of the world's ecosystems, a set of impacts that some of the more forthright scientists say is incompatible with the maintenance of human civilisation.

Or we can conceive of a safe-climate emergency-action approach which would aim to reduce global warming back to the range of conditions experienced during the last 10,000 years, the period of human civilisation and fixed settlement. This would involve fast and large emissions reduction through radical energy demand reductions, whilst a vast scaling-up of clean energy production was organised, together with the remaking of many of our essential systems such as transport and food production, with the target being zero net emissions. In addition, there would need to be a major commitment to atmospheric carbon dioxide drawdown measures. This would need to be done at a speed and scale more akin to the "war economy", where social and economic priority is given to what is perceived to be an overwhelming existential threat.

After 30 years of climate policy and action failure, we are in deep trouble and now have to throw everything we can muster at the climate challenge. This will be demanding and disruptive, because there are no longer any non-radical, incremental paths available.

Prof. Kevin Anderson and Dr Alice Bows, writing in the journal Nature, say that "any contextual interpretation of the science demonstrates that the threshold of 2°C is no longer viable, at least within orthodox political and economic constraints" and that "catastrophic and ongoing failure of market economics and the laissez-faire rhetoric accompanying it (unfettered choice, deregulation and so on) could provide an opportunity to think differently about climate change" (17).

Anderson says there is no longer a non-radical option, and that for developed economies to play an equitable role in holding warming to 2°C (with 66% probability), emissions compared to 1990 levels would need to fall by at least 40% by 2018, 70% by 2024, and 90% by 2030. This would require "in effect a Marshall plan for energy supply". As well, low-carbon supply technologies cannot deliver the necessary rate of emission reductions on their own; they need to be complemented with rapid, deep and early reductions in energy consumption, what he calls a radical emission reduction strategy (18). All this suggests that even holding warming to a too-high 2°C limit now requires an emergency approach.

Emergency action has proven fair and necessary for great social and economic challenges we have faced before. Call it the great disruption, the war economy, emergency mode, or what you like; the story is still the same, and it is now the only remaining viable path.


Keynote speaker David Spratt explains why there is no carbon budget left to burn.

Sources:
This article was originally published at ClimateCodeRed.org
Above video, NO CARBON BUDGET LEFT TO BURN, was uploaded by Breakthrough.



Notes
  1. Jaeger, C.C. and J. Jaeger (2011), "Three views of two degrees", Reg. Environ. Change, 11: S15-S26; Anderson, K. and A. Bows (2012) “A new paradigm for climate change”, Nature Climate Change 2: 639-70
  2. http://www.bbc.co.uk/news/science-environment-19348194; http://www.guardian.co.uk/environment/2011/may/29/carbon-emissions-nuclearpow; http://www.smh.com.au/environment/weather/climate-pioneers-see-little-chance-of-avoiding-dangerous-global-warming-20131105-2wyon.html
  3. IPCC (2013) "Working Group I Contribution to the IPCC Fifth Assessment Report Climate Change 2013; The Physical Science Basis: Summary for Policymakers"
  4. "For a 90% probability of not exceeding 2C of warming the carbon budget had reduced to zero by 2012, using a multi-agent (that is, the well-mixed greenhouse gases, including CO2 and CH4)", Raupach (2013, unpublished), based on Raupach, M. R., I.N. Harman and J.G. Canadell (2011) "Global climate goals for temperature, concentrations, emissions and cumulative emissions", Report for the Department of Climate Change and Energy Efficiency. CAWCR Technical Report no. 42. Centre for Australian Weather and Climate Research, Melbourne; Rogelj, J., W. Hare et al. (2011) "Emission pathways consistent with a 2°C global temperature limit", Nature Climate Change 1: 413-418 show at Table 1 no feasible pathways for limiting warming to 2°C during the twenty-first century with a "very likely" (>90%) chance of staying below the target, without carbon drawdown.
  5. Dunlop, I. (2011), "Managing catastrophic risk", Centre for Policy Development, http://cpd.org.au/2011/07/ian-dunlop-managing-catastrophic-risk/
  6. Raupach, M. R., I.N. Harman and J.G. Canadell (2011) "Global climate goals for temperature, concentrations, emissions and cumulative emissions", Report for the Department of Climate Change and Energy Efficiency. CAWCR Technical Report no. 42. Centre for Australian Weather and Climate Research, Melbourne.
  7. Anderson, K. and A. Bows (2008) "Reframing the climate change challenge in light of post-2000 emission trends", Phil. Trans. R. Soc. A 366: 3863-3882; Anderson, K. and A. Bows (2011) "Beyond 'dangerous' climate change: emission scenarios for a new world", Phil. Trans. R. Soc. A 369: 20-44
  8. Wadhams, P. (2012) "Arctic ice cover, ice thickness and tipping points", AMBIO 41: 23-33; Maslowski, W., C.J. Kinney et al. (2012) "The Future of Arctic Sea Ice", The Annual Review of Earth and Planetary Sciences, 40: 625-654
  9. IPCC (2013) "Working Group I Contribution to the IPCC Fifth Assessment Report Climate Change 2013; The Physical Science Basis"
  10. Vaks, A., O.S. Gutareva et al. (2013) "Speleothems Reveal 500,000-Year History of Siberian Permafrost", Science 340: 183-186; Schaefer, K., T. Zhang et al. (2011) "Amount and timing of permafrost carbon release in response to climate warming", Tellus 63: 165-180
  11. Anderson, K. and A. Bows (2011) "Beyond 'dangerous' climate change: emission scenarios for a new world", Phil. Trans. R. Soc. A 369: 20-44
  12. Marcott, S.A., J.D. Shakun et al. (2013) "A Reconstruction of Regional and Global Temperature for the Past 11,300 Years", Science 339: 1198-1201; Hansen, J., P. Kharecha et al. (2013) "Assessing 'dangerous climate change': Required reduction of carbon emissions to protect young people, future generations and nature", PLOS ONE 8: 1-26
  13. Tripati, A.K., C.D. Roberts et al. (2009), "Coupling of CO2 and Ice Sheet Stability Over Major Climate Transitions of the Last 20 Million Years", Science 326: 1394-1397
  14. Rohling, E.J., K. Grant et al. (2009) "Antarctic temperature and global sea level closely coupled over the past five glacial cycles", Nature Geoscience, 21 June 2009
  15. NASA (2014), "NASA-UCI Study Indicates Loss of West Antarctic Glaciers Appears Unstoppable", Media release, 12 May 2014, http://www.nasa.gov/press/2014/may/nasa-uci-study-indicates-loss-of-west-antarctic-glaciers-appears-unstoppable, accessed 19 May 2014; Rignot, E., J. Mouginot et al. (2014) "Widespread, rapid grounding line retreat of Pine Island, Thwaites, Smith and Kohler glaciers, West Antarctica from 1992 to 2011", Geophysical Research Letters, doi: 10.1002/2014GL060140; Joughin, I., B.E. Smith et al. (2014), "Marine Ice Sheet Collapse Potentially Under Way for the Thwaites Glacier Basin, West Antarctica", Science 344: 735-738
  16. NASA (2014), "Hidden Greenland Canyons Mean More Sea Level Rise", Media release, 19 May 2014, http://www.nasa.gov/press/2014/may/hidden-greenland-canyons-mean-more-sea-level-rise, accessed 19 May 2014; Morlighem, M., E. Rignot et al. (2014), "Deeply incised submarine glacial valleys beneath the Greenland ice sheet", Nature Geoscience, doi: 10.1038/ngeo2167
  17. Anderson, K. and A. Bows (2012) "A new paradigm for climate change", Nature Climate Change 2: 639-70
  18. Anderson, K. (2014) "Why carbon prices can't deliver the 2°C target", 13 August 2013, http://kevinanderson.info/blog/why-carbon-prices-cant-deliver-the-2c-target, accessed 19 May 2014; Anderson, K. (2012) "Climate change going beyond dangerous – Brutal numbers and tenuous hope", Development Dialogue, September 2012; Anderson, K. (2011) "Climate change going beyond dangerous – Brutal numbers and tenuous hope or cognitive dissonance", presentation 5 July 2011, slides available at http://www.slideshare.net/DFID/professor-kevin-anderson-climate-change-going-beyond-dangerous; plus (6) above.

Wednesday, May 21, 2014

The Art of Climate Change

by Dorsi Lynn Diaz

Help be a part of the solution! The Art of Climate Change on Kickstarter - an interactive social media & art show/exhibit this summer.

Climate change is HERE and climate change is happening NOW. It is not a figment of your imagination and the weather outside indeed is "frightening."

As I write this, the UK is getting battered by unprecedented storms, and in California, where I live, we are facing the possibility of a MEGA drought. As a long-time artist, writer and educator, I have been sounding the alarm bell for years. The question loomed large for me: How are we, as a collective society, going to tackle such a huge problem?

That was when I had a light-bulb moment.

The idea came to me last year when I realized we need a multi-modal approach to addressing climate change: a hands-on, interactive dialogue with great visuals. In order to tackle the problem, we needed to look at all the different aspects of climate change. And thus "The Art of Climate Change" was born - and the idea for a project: an art show and exhibit. But not your typical art show!

This show would be interactive and get people thinking about SOLUTIONS to climate change, challenge them to think out of the box, and most importantly, educate them about the hows and wheres of climate change, plus why places like the Arctic matter. This show could travel to cities and communities all over, serving as a blueprint for teaching people about climate change while engaging their own local artists, inventors and community in learning about "the problem."



But first I needed a venue for the first show, and that would be one of the biggest hurdles. I connected with a local gallery, pitched the idea to them, and they were impressed. In fact, they really liked my idea because it was "different": it talked about solutions, not just doom and gloom. Now that I've been approved by their Board of Directors, I've got my venue and the show has taken on a life of its own.

The Art of Climate Change has its venue! Whoooppee!! The show is on the datebook - it will run from June 19 - July 27, 2014 at The Sun Gallery in Hayward, Ca. (located in the Silicon Valley area).

However, I need help and support to pull this off. This is a huge endeavor and the show has many different facets to it. There are many costs involved: marketing, advertising, sign production, printing for the science graphics, some travel, equipment rentals (laptops and TV screens), art supplies, website hosting and building, and other production costs... and this is why I am asking for your help. Not only will there be "art" on the walls, but there will also be a series of artwork on endangered species by the children I have been teaching for the last several months.

The sections of the exhibit have been broken down into the following areas:

1. A section where we talk about "The Problem". This is where we talk turkey and explain the problem and take a good look at it.

2. There will be a section of the exhibit dedicated to extreme weather photos and art. Like they say, a picture can tell a thousand words, right?

3. Next we need to talk about "The Arctic and why it matters". Those record cold snaps happening in the US? They are among the strongest symptoms of our melting Arctic, due to the now meandering jet stream.

4. The Methane Monster. Yes, there are monsters, and this is probably one of the biggest ones we need to be worried about. Remember the dinosaur extinction? Well, some scientists say that methane was their undoing. And we certainly don't want to go the way of the dinosaurs, right? So yes, we need to talk about the elephant in the room - that pesky methane monster. Which, by the way, is being released in some pretty scary amounts right now from underneath that warming Arctic water. No, it's not good. Not good at all.

5. A section just for THE CHILDREN and EDUCATION. This is the biggest reason I am doing this project. I want to be part of the solution to securing their future. One of the big parts of this project is teaching the kids. Right now I am doing a series of projects with them on endangered animal species. The way I look at it, if we can "teach the children, we can touch the world." Their artwork will be prominently displayed in the art show/exhibit. So far they have done art of endangered Polar Bear cubs, the Monarch Butterfly, Bees, Barn Owls and the Maui Dolphin.

6. A section with a "CALL TO ACTION"....this is where attendees are encouraged to engage with the problem so they can BE PART of the solution...which btw is the next big part of the art show/exhibit....

7. SOLUTIONS. This is where I have things planned that are definitely out of the box. Like inventions to slow down climate change by friends of mine that happen to be very creative too.

So that is my Kickstarter project in a very big nutshell. The really exciting thing though is how this blooming project has just sort of "vacuumed people" up...all kinds of people...from all around the world! Here are some of them that are going to be part of my project:
  • Climate Change Professor Paul Beckwith from the University of Ottawa, who will do a live Skype Q & A session with us. Attendees can sit down face to face with a leading climate change educator and ask questions about climate change from inside the show.
  • A life size mural of a Polar Bear with an Arctic scene, painted in the show/exhibit hall by muralist Lisa Hamblett-Montagnese.
  • Photographer Rose Gold will make the day even more special for kids by taking photos of them with the Polar bear.
  • A display of children's and families' climate change (endangered species) artwork from students of the Sun Gallery, A Joyful Noise Learning Center, Green Forest Art Studio, The Community Church of Hayward and Young Rembrandt's of the East Bay
  • A live viewing of Andy Lee Robinson's video on a flat-screen TV, available all during the exhibit's 5 1/2 weeks. Andy's video shows the decline of the Arctic ice, accompanied by a musical composition by Andy called "Ice Dreams"
  • A graphic of "The Arctic Death Spiral" by Andy Lee Robinson, to be displayed in the Arctic section of the show.
  • A full size poster by Sam Carana (who set up the Arctic Methane Emergency Group on FB and edits the Arctic-News blog) on the effects of runaway climate change, designed by Sam and displayed in the Arctic section.
  • Original cartoons by Sam Carana, also an adviser on this project, displayed in the Methane section of the show.
  • Quotes with ideas by Harold Hensel, contributor to the Arctic-news blog and advisor on this project.
  • A full size poster of a tunnel invention as a possible solution to our warming waters by Patrick McNulty. Posted in Solutions.
  • A display of alternative fuel named "Bio-Fuel" with information by inventor Jay Toups. Posted in Solutions.
  • A live aquaponics display by Michael and Natalie Elola of Lucky Garden Hydroponics on how to grow vegetables and fruit indoors without using soil. Posted in Solutions.
  • A full size Polar bear costume mascot to be used for outreach. Designed and sewn by Nancy Martinez
  • A call for art by The Sun Gallery for extreme weather photos, climate change art and recycled and re-purposed art
  • A display of children's books about climate change. Joe Santiago's books will also be featured. Displayed under Education.
  • The original video for the project will be displayed on a flat-screen video at the show for 5 1/2 weeks. Video editing and production by Mead Rose at Web Design by Mead.
  • Artistically designed climate change confections by pastry chef Cori Diaz for the Artists' reception
  • A local rock/punk band that sings songs about climate change. They will sing at the Artists' reception.
  • Educational tables set up by the City of Hayward with information about the city's climate change plan, along with other entities like the EPA, Water Conservation Board, EBMUD and Waste Management.
  • Deagon B. Williams, friend and adviser on this project.
  • Advertising help with the project by Trish McDermott of Avatar Tech Pubs.
Endangered Animal Art series taught by Dorsi Diaz


The Arctic "Death" Spiral in The Art of Climate Change
Costs for The Art of Climate Change
The many people contributing their talents to The Art of Climate Change
The main reason I am doing the show is for them
Sam Carana's contribution to the show - a very telling graphic
The types of climate change disasters we need to talk about

This is what we need to be talking about.
Risks and challenges

I have been working on this project for over 4 months now, successfully pulling people together to either create art for the exhibit or contribute educational material. The biggest obstacle for the show was of course the venue, but I now have the venue for the art show/exhibit set in stone from June 19 - July 27 of this year. A solid foundation has been laid; the main thing I need help with now is the financial cost of the show - like the rental of equipment for the live Skype session with Professor Paul Beckwith, and the special art projects I plan to do within our community. I also plan to do more community outreach to reach more local public agencies, and to hold more events centered around the show (how much I can do will be determined by how much funding I get).

How will I deal with any surprises or costs that I might not have factored in? I will do what I have always done in business: work with the issue and either adjust or downsize that particular part, possibly even bartering for services or asking for donations to help with a particular cost.

What unique challenges might I have after the project is funded? Well, I don't foresee any emergencies, but if there are any, I have a network of people who will help and advise me through any major problems. The only issue I can see is that I may not be able to accommodate all the art that comes in, but that's a good problem to have! Better to have more than not enough - and director Liesa Lietzke and Jacqueline Cooper at The Sun Gallery, where the show is held, will be able to walk me through any major hiccups if there are any.

Questions?

Have a question? If the info above doesn't help, you can ask questions at Kickstarter to the project creator.

Donate

To donate to this project, go to Kickstarter.

Friday, May 16, 2014

More extreme weather can be expected



The heaviest rains and floods in 120 years have hit Serbia and Bosnia this week, Reuters and Deutsche Welle report.

The animation below shows the Jet Stream's impact on the weather. Cold temperatures have descended from the Arctic to Serbia and Bosnia in Europe and all the way down to the Gulf of Mexico in North America, while Alaska, California, and America's East Coast are hit by warm temperatures. In California, 'unprecedented' wildfires and fierce winds led to 'firenadoes', reports CNN.



The image below shows that on May 15, 2014, the wind approaching Serbia and Bosnia at 700 hPa reached speeds of up to 120 km per hour (75 mph), as indicated by the green circle on the main image and inset.


The image below, from Skeptical Science, shows the cyclonic spin that can be expected in a trough such as the one that hit Serbia and Bosnia recently.


As the Jet Stream changes, more extreme weather events can be expected. What makes the Jet Stream change? As the Arctic is warming up faster than the rest of the world, the temperature difference between the Arctic and the equator decreases, in turn decreasing the speed at which the Jet Stream circumnavigates the globe. This can cause 'blocking patterns', with extreme weather events hitting an area longer than before.

As the Jet Stream becomes wavier, cold air can more easily descend from the Arctic down to lower latitudes in a downward trough of the Jet Stream, while warm air can more easily reach higher latitudes in an upward ridge of the Jet Stream.

This spells bad news for many areas across the world that can be expected to be hit by more extreme weather events such as heatwaves, wildfires fuelled by stronger winds and more intense drought, storms and floods.

Heatwaves are a huge threat in the Arctic, especially when followed by storms that can cause warm surface water to mix down to the bottom of the sea and warm up sediments under the seafloor that can contain huge amounts of methane in the form of hydrates and free gas. The situation is dire and calls for comprehensive and effective action, as discussed at the Climate Plan blog.


Related

- Extreme weather strikes around the globe - update
http://arctic-news.blogspot.com/2014/02/extreme-weather-strikes-around-the-globe-update.html

- Escalating extreme weather events to hammer humanity (by Paul Beckwith)
http://arctic-news.blogspot.com/2014/04/escalating-extreme-weather-events-to-hammer-humanity.html

- Our New Climate and Weather (by Paul Beckwith)
http://arctic-news.blogspot.com/2014/01/our-new-climate-and-weather.html

- Our New Climate and Weather - part 2 (by Paul Beckwith)
http://arctic-news.blogspot.com/2014/01/our-new-climate-and-weather-part-2.html

- Changes to Polar Vortex affect mile-deep ocean circulation patterns
http://arctic-news.blogspot.com/2012/09/changes-to-polar-vortex-affect-mile-deep-ocean-circulation-patterns.html

- Polar jet stream appears hugely deformed
http://arctic-news.blogspot.com/2012/12/polar-jet-stream-appears-hugely-deformed.html



Sunday, May 11, 2014

Changing the landscape of psychiatric research:


What will the RDoC initiative by NIMH achieve?










©CartoonStock.com




There's a lot wrong with current psychiatric classification. Every few years, the American Psychiatric Association comes up with a new set of labels and diagnostic criteria, but whereas the Diagnostic and Statistical Manual used to be seen as some kind of Bible for psychiatrists, the latest version, DSM5, has been greeted with hostility and derision. The number of diagnostic categories keeps multiplying without any commensurate increase in the evidence base to validate the categories. It has been argued that vested interests from pharmaceutical companies create pressures to medicalise normality so that everyone will sooner or later have a diagnosis (Frances, 2013). And even excluding such conflict of interest, there are concerns that such well-known categories as schizophrenia and depression lack reliability and validity (Kendell & Jablensky, 2003).



In 2013, Tom Insel, Director of the US funding agency, National Institute of Mental Health (NIMH), created a stir with a blogpost in which he criticised the DSM5 and laid out the vision of a new Research Domain Criteria (RDoC) project. This aimed "to transform diagnosis by incorporating genetics, imaging, cognitive science, and other levels of information to lay the foundation for a new classification system."



He drew parallels with physical medicine, where diagnosis is not made purely on the basis of symptoms, but also uses measures of underlying physiological function that help distinguish between conditions and indicate the most appropriate treatment. This, he argued, should be the goal of psychiatry, to go beyond presenting symptoms to underlying causes, reconceptualising disorders in terms of neural systems.



This has, of course, been a goal for many researchers for several years, but Insel expressed frustration at the lack of progress, noting that at present: "We cannot design a system based on biomarkers or cognitive performance because we lack the data". That being the case, he argued, a priority for NIMH should be to create a framework for collecting relevant data. This would entail casting aside conventional psychiatric diagnoses, working with dimensions rather than categories, and establishing links between genetic, neural and behavioural levels of description.



This represents a massive shift in research funding strategy, and some are uneasy about it. Nobody, as far as I am aware, is keen to defend the status quo, as represented by DSM.  As Insel remarked in his blogpost: "Patients with mental disorders deserve better". The issue is whether RDoC is going to make things any better. I see five big problems.



1. McLaren (2011) is among those querying the assumption that mental illnesses are 'disorders of brain circuits'. The goal of the RDoC program is to fill in a huge matrix with new research findings. The rows of the matrix are not the traditional diagnostic categories: instead they are five research domains: Negative Valence Systems, Positive Valence Systems, Cognitive Systems, Systems for Social Processes, Arousal/Regulatory Systems, each of which has subdivisions: e.g. Cognitive Systems is broken down into Attention, Perception, Working memory, Declarative memory, Language behavior and Cognitive (effortful) control. The columns of the matrix are Genes, Molecules, Cells, Circuits, Physiology, Behavior, Self-Reports, and Paradigms. Strikingly absent is anything about experience or environment.



This seems symptomatic of our age. I remember sitting through a conference presentation about a study investigating whether brain measures could predict response to cognitive behaviour therapy in depression.  OK, it's possible that they might, but what surprised me was that no measures of past life events or current social circumstances were included in the study. My intuitions may be wrong, but it would seem that these factors are likely to play a role. My impression is that some of the more successful interventions developed in recent years are based not on neurobiology or genetics, but on a detailed analysis of the phenomenology of mental illness, as illustrated, for example, by the work of my colleagues David Clark and Anke Ehlers. Consideration of such factors is strikingly absent from RDoC.



2. The goal of the RDoC is ultimately to help patients, but the link with intervention is unclear. Suppose I become increasingly obsessed with checking electrical switches, such that I am unable to function in my job. Thanks to the RDoC program, I'm found to have a dysfunctional neural circuit. Presumably the benefit of this is that I could be given a new pharmacological intervention targeting that circuit, which will make me less obsessive. But how long will I stay on the drug? It's not given me any way to cope with the checking tendency or the unwanted thoughts that obtrude into my consciousness, and they are likely to recur when I come off it. I'm not opposed to pharmacological interventions in principle, but they tend not to have a 'stop rule'.



There are psychological interventions that tackle the symptoms and the cognitive processes that underlie them more directly.  Could better knowledge of neurobiological correlates help develop more of these?  I guess it is possible, but my overall sense is that this translational potential is exaggerated – just as with the current hype around 'educational neuroscience'. The RDoC program embodies a mistaken belief that neuroscientific research is inherently better than psychological research because it deals with primary causes, when in fact it cannot capture key clinical phenomena. For instance, the distinction between a compulsive hand-washer and a compulsive checker is unlikely to have a clear brain correlate, yet we need to know about the specific symptoms of the individual to help them overcome them.



3. Those proposing RDoC appear to have a naive view of the potential of genetics to inform psychiatry.  It's worth quoting in detail from their vision of the kinds of study that would be encouraged by NIMH, as stated here:



Recent studies have shown that a number of genes reported to confer risk for schizophrenia, such as DISC1 (“Disrupted in schizophrenia”) and neuregulin, actually appear to be similar in risk for unipolar and bipolar mood disorders. ... Thus, in one potential design, inclusion criteria might simply consist of all patients seen for evaluation at a psychotic disorders treatment unit. The independent variable might comprise two groups of patients: One group would be positive and the other negative for one or more risk gene configurations (SNP or CNV), with the groups matched on demographics such as age, sex, and education. Dependent variables could be responses to a set of cognitive paradigms, and clinical status on a variety of symptom measures. Analyses would be conducted to compare the pattern of differences in responses to the cognitive or emotional tasks in patients who are positive and negative for the risk configurations.



This sounds to me like a recipe for wasting a huge amount of research funding. The effect sizes of most behavioural/cognitive genetic associations are tiny, so one would need an enormous sample to see differences related to genotype. Coupled with an open-ended search for differences between genotypes on a battery of cognitive measures, this would undoubtedly generate some 'significant' results which could mislead the field for some time before a failure to replicate emerged (cf. Munafò & Gage, 2013).
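The sample-size problem is easy to make concrete. Under the usual normal approximation for comparing two group means, the required n per group scales with 1/d², where d is the standardized effect size. A back-of-the-envelope sketch (the effect sizes and thresholds here are illustrative, not taken from the NIMH materials):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-group comparison of means,
    detecting standardized effect size d (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) / d) ** 2)

print(n_per_group(0.5))  # a 'medium' effect: 63 per group
print(n_per_group(0.1))  # a typical genotype effect: 1570 per group
```

Halving the effect size quadruples the required sample, which is why genotype-comparison designs of the kind sketched above so easily end up underpowered.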



The NIMH website notes that "the current diagnostic system is not informed by recent breakthroughs in genetics". There is good reason for that: to date, the genetic findings have been disappointing. Such associations as are found either indicate extremely rare and heterogeneous mutations of large effect and/or involve common genetic variants whose small effects are not of clinical significance. We cannot know what the future holds, but to date talk of 'breakthroughs' is misleading.



4. Some of the entries in the RDoC matrix also suggest a lack of appreciation of the difference between studying individual differences versus group effects.  The RDoC program is focused on understanding individual differences. That requires particularly stringent criteria for measures, which need to be adequately reliable, valid and sensitive to pick up differences between people.  I appreciate that the published RDoC matrices are seen as a starting-point and not as definitive, but I would recommend that more thought goes into establishing the psychometric credibility of measures before embarking on expensive studies looking for correlations between genes, brains and behaviour. If the rank ordering of a group of people on a measure is not the same from one occasion to another, or if there are substantial floor or ceiling effects, that measure is not going to be much use as an indicator of an underlying construct. Furthermore, if different versions of a task that are supposed to tap into a single construct give different patterns of results, then we need a rethink – see e.g. Foti et al, 2013; Shilling et al, 2002, for examples.  Such considerations are often ignored by those attempting to move experimental work into a translational phase. If we are really to achieve 'precision medicine' we need precise measures.
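Spearman's classic attenuation formula makes the point concrete: the correlation you can observe between two measures is capped by their reliabilities. A minimal sketch (the reliability values here are purely illustrative):

```python
def observed_r(true_r, rel_x, rel_y):
    """Expected observed correlation between two measures, given the
    true correlation between the underlying constructs and each
    measure's reliability (Spearman's attenuation formula)."""
    return true_r * (rel_x * rel_y) ** 0.5

# A true correlation of 0.5 between constructs, measured with tasks
# whose test-retest reliabilities are 0.6 and 0.7:
print(round(observed_r(0.5, 0.6, 0.7), 2))  # 0.32
```

Even a genuinely strong brain-behaviour link can look weak, or 'fail to replicate', if the tasks at either end are noisy.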



5. The matrix as it stands does not give much confidence that the RDoC approach will give clearer gene-brain-behaviour links than traditional psychiatric categories.



For instance, BDNF appears in the Gene column of the matrix for the constructs of acute threat, auditory perception, declarative memory, goal selection, and response selection. COMT appears with threat, loss, frustrative nonreward, reward learning, goal selection, response selection and reception of facial communication. Of course, it's early days. The whole purpose of the enterprise is to flesh out the matrix with more detailed and accurate information. Nevertheless, the attempts at summarising what is known to date do not inspire confidence that this goal will be achieved.



After such a list of objections to RDoC, I do have one good thing to say about it, which is that it appears to be encouraging and embracing data-sharing and open science. This will be an important advance that may help us find out more quickly which avenues are worth exploring and which are cul-de-sacs. I suspect we will find out some useful things from the RDoC project: I just have reservations as to whether they will be of any benefit to psychiatry, or more importantly, to psychiatric patients.



References

Foti, D., Kotov, R., & Hajcak, G. (2013). Psychometric considerations in using error-related brain activity as a biomarker in psychotic disorders. Journal of Abnormal Psychology, 122(2), 520-531. doi: 10.1037/a0032618



Frances, A. (2013). Saving normal: An insider's revolt against out-of-control psychiatric diagnosis, DSM-5, big pharma, and the medicalization of ordinary life. New York: HarperCollins.



Kendell, R., & Jablensky, A. (2003). Distinguishing between the validity and utility of psychiatric diagnoses. American Journal of Psychiatry, 160, 4-12.



McLaren, N. (2011). Cells, Circuits, and Syndromes: A critical commentary on the NIMH Research Domain Criteria project. Ethical Human Psychology and Psychiatry, 13(3), 229-236. doi: 10.1891/1559-4343.13.3.229



Munafò, M. R., & Gage, S. H. (2013). Improving the reliability and reporting of genetic association studies. Drug and Alcohol Dependence. doi: 10.1016/j.drugalcdep.2013.03.023



Shilling, V. M., Chetwynd, A., & Rabbitt, P. M. A. (2002). Individual inconsistency across measures of inhibition: an investigation of the construct validity of inhibition in older adults. Neuropsychologia, 40, 605-619.





This article (Figshare version) can be cited as:

Bishop, Dorothy V M (2014): Changing the landscape of psychiatric research: What will the RDoC initiative by NIMH achieve? figshare. http://dx.doi.org/10.6084/m9.figshare.1030210