I have been an 80,000 Hours Podcast listener and active in EA for about eighteen months, and I have shifted from a focus on climate change, to animal welfare, and now to x-risk and s-risk, which seem highly promising from an EA perspective. Along the way, I have wondered whether some parts of EA might have underplayed climate change, and whether more engagement and content on the topic could be valuable.
While I was thinking about sustainability and ethics, I was frustrated by how limited the coverage of the topic was in the 80,000 Hours Podcast episodes, so I emailed the team. Rob Wiblin responded and suggested that I write up an EA forum post.
Thanks to Alexrjl, John Bachelor, and David Nash for their suggestions and edits.
Edited 29/10/2019 to remove a misquotation of 80,000 Hours and to rectify a few other over-simplifications.
Summary
While it is true that EA and 80,000 Hours are effective in drawing attention to highly neglected areas, my view is that they have unduly neglected coverage of climate change. There are several reasons why I believe climate change deserves more attention within EA. Firstly, some key opinion-shapers in EA appear to have recently updated towards weighting the severity of climate change more highly. Secondly, though climate change is probably not an existential risk itself, it can be treated as an existential risk factor or multiplier. Thirdly, there are limitations to a crude application of the ITN framework and to a short-termist approach to altruism. Fourthly, climate change mitigation and resilience may be more tractable than previously argued. Finally, by failing to show sufficient appreciation of the severity of climate change, EA may risk losing credibility and alienating potential effective altruists.
Changing perceptions of climate change among key individuals in EA
1. Assessment of climate change in Doing Good Better, 2015
The view taken in this book, which is foundational to EA, mostly equates the cost of climate change to a year of lost economic growth, and assigns a 'small but significant risk' to temperature rises above 4C.
Will MacAskill: Economists tended to assess climate change as not all that bad. Most estimate that climate change will cost only around 2% of global GDP… The thought that climate change would do the equivalent of putting us back one year economically isn't all that scary - 2013 didn't seem that much worse than 2014… So the social cost of one American's greenhouse gas emissions is about $670 every year. Again, that's not an insignificant cost, but it's also not the end of the world.
However, this standard economic analysis fails to faithfully use expected value reasoning. The standard analysis looks only at the effects from the most likely scenario: a 2-4C rise in temperature… there is a small but significant risk of a temperature increase that’s much greater than 2-4C.
The IPCC gives more than 5% probability to temperature rises greater than 6C and even acknowledges a small risk of catastrophic climate change of 10C or more. To be clear, I’m not saying that this is at all likely, in fact it’s very unlikely. But it is possible, and if it were to happen, the consequences would be disastrous, potentially resulting in a civilisational collapse. It’s difficult to give a meaningful answer of how bad that would be, but if we think it’s potentially catastrophic, then we need to revise our evaluation of the importance of mitigating climate change. In that case, the true expected social cost of carbon could be much higher than $32 per metric ton, justifying much more extensive efforts to reduce emissions than the estimates the economists first suggested.
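To connect the two figures in these quotes (my own arithmetic rather than MacAskill's): a $670 annual social cost at $32 per tonne implies per-person emissions of roughly

$\$670 \div \$32\ \text{per tonne} \approx 21\ \text{tonnes of CO}_2\text{e per American per year}$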
The main text, and the later table of cause prioritisation, use the economic cost model and assume 2-4C of warming, without appreciating the follow-on risks. It seems presumptuous to assume that, without action, warming would stay at 2C, as DGB does. Pledges are just things written on paper - history has taught us that.
This source suggests we are on course for 4.1-4.8C of warming by 2100, so it seems erroneous to take 2-4C as the baseline assumption. A reader would walk away from this book thinking that climate change was generally not worth worrying about too much, because 2-4C is equivalent to a year of lost growth and the chance of >4C of warming is 'small but significant'.
2. Toby Ord updated his weighting toward climate change in 2018
The text below is from the Fireside Chat with Toby Ord at EAG 2018, in which Toby notes that the tail risks are larger than many people think. It seems that one of the key researchers in EA (Toby) has updated his views on the severity of climate change.
Will MacAskill: Between climate change and nuclear winter, do you think climate change is too neglected by EA?
Toby Ord: Yeah, actually, I think it probably is....
I think that there is some existential risk remaining from nuclear war and from climate change… I think that the amount of warming that could happen from climate change is really under-appreciated. The tail risk, the chance that the warming is a lot worse than we expect, is really big. Even if you set aside the serious risks of runaway climate change, of big feedbacks from the methane clathrates or the permafrost, even if you set all of those things aside, scientists say that the estimate for if you doubled CO2 in the atmosphere is three degrees of warming. And that's what would happen if you doubled it.
But if you look at the fine print, they say it's actually from 1.5 degrees to 4.5 degrees. That's a huge range. There's a factor of three between those estimates, and that's just a 66% confidence interval... They actually think there's a one in six chance it's more than 4.5 degrees... I'm actually a lot more worried about it than I was before I started looking into this.
Even if we assume that we stay within 2C of first-order warming, there is roughly a 17% (one in six) probability of more than 4.5C of eventual warming. Given that we are not even on track for 2C of first-order warming, and that the knock-on risk of exceeding 4C is so high, I think Will's language of 'a small but significant risk' of >4C warming does not represent the issue accurately.
3. The 80,000 Hours Podcast Episode 50 has a brief discussion of the impact of climate change on food production.
David Denkenberger: In the coincident extreme weather or multiple breadbasket failure, you have droughts or floods on multiple continents at once. There was a UK government study on this that estimated right now, it might be around 1% chance per year, but with the slow climate change... They were getting more like 80% chance of this century that something like that would happen.
Robert Wiblin: Wow. Okay.
At the time of writing, the only interview I'm aware of in which climate change mitigation is discussed extensively is the one with Professor Yew-Kwang Ng, where they argue that interventions to reduce emissions are beneficial from an economic and welfare perspective, because the short-term gains from emissions come with long-term harms from warming.
4. Climate change impacts welfare more than just delaying economic growth.
In the Doing Good Better cause review, the impacts are mostly equated to just lower growth, plus a tail-risk of civilisational collapse. However, it seems that this doesn't capture the full brunt of the impacts under the main emissions trajectories. From the Stern Review, source here:
The Stern Review: By 2100, in South Asia and sub-Saharan Africa, up to 145 - 220 million additional people could fall below the $2-a-day poverty line, and every year an additional 165,000 - 250,000 children could die compared with a world without climate change.
Climate change will hit the poorest people in society the hardest, probably increasing inequality, and will damage many of the global supply chains that people rely on, making basic goods harder to access. A recent paper argues that mosquito-borne diseases could reach another billion people as the climate warms.
5. Giving What We Can acknowledges a lack of EA-aligned research in this area
GWWC: Specifically, there haven’t been any studies in the past 16 years which quantify the impact of climate change on global health in DALYs and in a per-tonne figure. Producing quantitative estimates of the exact mortality and morbidity impacts of climate change (and of present emissions) is still a relatively neglected area.
6. Implications of climate change are absent from many 80,000 Hours Podcasts discussing future economic growth prospects
In this podcast, Tyler Cowen talks about maximising 'sustainable economic growth' without defining what sustainability means, despite the Stern Review highlighting the trade-off between short-term and long-term growth. His implication is that we should all grow the economy, rather than reduce GHGs or improve resilience and adaptation.
The Stern Review: Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.
In the most recent 80,000 Hours Podcast interview at the time of writing, on improving the world through charter cities, there was no mention of climate change, even though adaptation and climate resilience are far behind where they need to be.
How do these cities with huge populations plan to deal with stresses on global water supplies, floods, and rising temperatures which are impacting cities around the world today?
For more information on how rising sea levels and temperatures will affect societies and cities, I’d recommend this LRB review, and the book ‘Extreme Cities’.
Climate change could multiply very bad outcomes
7. Climate change is better understood as an existential risk factor.
Will MacAskill: Between climate change and nuclear winter, do you think climate change is too neglected by EA?
Toby Ord: Yeah, actually, I think it probably is...
I think the way to think about this is not that war itself, or great power war, is an existential risk, but rather it's something else, which I call an existential risk factor…
I've been thinking about all these things in terms of whether they could be existential risks, rather than whether they could lead to terrible situations, which could then lead to other bad outcomes.
In this report, John Halstead views the causality between climate change and other existential risks as difficult to nail down. He argues that more emissions and warming might create destabilisation and nuclear war, but it's hard to see exactly how.
It seems to me that this 'existential risk factor' or 'existential risk multiplier' should be far from zero. I would expect that global governments dealing with >4C of warming in 100 years would face many domestic pressures, and that might make it harder to form agreements on disarmament treaties, or nanotechnology protocols, for example.
Another example could be biotechnology - if humanity is adapting to high levels of warming, then this could increase the risk that a hostile group (perhaps displaced by climate change) uses advanced weaponry in a way that could be an existential risk.
In this review article, reference is made to senior US security officials calling climate change a ‘threat multiplier’, with the following quote from Todd Miller in his book, Storming the Wall: Climate Change, Migration and Homeland Security.
More dangerous than climate disruption was the climate migrant. More dangerous than the drought were the people who can’t farm because of the drought. More dangerous than the hurricane were the people displaced by the storm.
Climate change could limit humanity’s potential
Alternatively, if we take existential risks to include those that limit humanity's potential, then conflicts over resources in a world with 5-10C of warming could lead to bad institutional lock-in lasting a few hundred years, if not longer.
For example, the capital city of Mongolia, which has warmed by 2.2C, is not looking like a place with great MCE (moral circle expansion) prospects, or like somewhere that can dedicate lots of resources to x-risk governance.
Climate change could increase S-risk
It also seems possible to me that climate change raises the chance of s-risks, e.g. by shaping future global dynamics around the fact that we failed to cooperate satisfactorily in resolving environmental degradation.
8. And some academics working on climate change are predicting near-term social collapses.
The quote below is from a draft paper by Dr Jem Bendell (Professor of Sustainability Leadership and founder of the Institute for Leadership and Sustainability (IFLAS) at the University of Cumbria). Social collapse from climate change in the next few decades might have a significantly non-zero probability. The VICE write-up references other sustainability professionals who broadly agree.
That synthesis leads to a conclusion there will be a near-term collapse in society with serious ramifications for the lives of readers. The paper reviews some of the reasons why collapse-denial may exist, in particular, in the professions of sustainability research and practice, therefore leading to these arguments having been absent from these fields until now.
I think that even if we disagree with the severity of these conclusions, we need to assign a significantly non-zero probability to some social collapse scenarios in the next few decades, and then act on that basis (e.g. by improving resilience).
Weaknesses of a crude application of the ITN framework
I think that climate change might have also been neglected because of issues with the ITN framework of cause prioritisation and some intellectual oversight within early EA.
9. The Importance - Tractability - Neglectedness framework is flawed when it assumes the world is static.
As EA has moved from high-impact individual philanthropy to thinking about broader social changes, I think it needs to appreciate that the world is dynamic. In project management, you prioritise tasks based partly on how urgent they are. A better framework would include urgency, because if climate change destabilises civilisation, then it is going to be a lot harder to make progress on moral circle expansion, for example. (Thanks to John Bachelor for some thoughts on this, and for the idea of ITNU.)
And even Will MacAskill's original analysis doesn't rule out the possibility that climate change causes social collapse. When we bring in the aspect of time, climate change is a more urgent problem, because there are only around 10 years in which to halve emissions in order to avoid dangerous levels of climate change. By contrast, I think the urgency point might change our approach to neglected tropical diseases, as these could still be cured in the future, and advanced technology may make this even easier.
But on current warming trajectories, it looks like civilisation could be very different in 100 years, and possibly much worse, with an increased risk of lock-in. So the sooner we tackle climate change the better.
10. Creative effective altruists could have a big impact in this challenging field.
I think this was well expressed by Christine Peterson.
Christine Peterson: My initial exposure to effective altruism was to some of the earliest documents and the earliest visions, and there was in some of those, there was a very high emphasis on measurement... However, somehow I got the impression that it was over-emphasized, and that effective altruists were perhaps overly focused on measurement, overly focused on near-term goals, and I … My gut reaction was, no, no. You guys are the most intelligent, most ambitious, most energetic. You’re at a time of your life where you don’t have a lot of burdens on you. You don’t, you’re not raising kids yet. Now is not the time to focus on near-term, easy to measure goals. Now is the time to take on the biggest, hardest, most revolutionary things you possibly can, and throw yourselves at them, because some of you will succeed.
To take a practical example, what is the 200-year effect of AMF, in a future with water shortages and heatwaves, compared to the 200-year effect of a GHG reduction initiative, a carbon capture programme, or a water desalination plant? EA's intellectual heritage includes a focus on short-term numbers and results, influenced by the short-term world of hedge funds, and this approach risks missing out on broader changes.
For this reason, we should be thinking about infrastructure, and long-term changes to society, such as advocating for MCE across all sentient life. Parfit drew a lot of attention to long-term welfare threats from climate change in his work, such as in this talk at Harvard EA.
11. EA has a bias towards intellectually stimulating abstract problems, and climate change is more emotionally draining, and perhaps more boring, than other causes.
It's grim to consider how your own neighbourhood may have to start rationing water within the next two decades, if not sooner, or to estimate the probability that climate change causes social collapse where you live within your lifetime.
If you work on risks from nuclear war, the odds of a nuclear war happening next year are quite low, so you can probably go about your life unchanged. But in 2019, and the next year, and the year after that, the world will keep getting hotter.
Empirical ideas about tackling climate change
12. Climate change reduction could be tractable for many EA readers.
Some recent posts on this forum have shown how competitive applications for top EA orgs can be, and 80,000 Hours now has a readership of 3 million. While a small share of EA Forum and 80,000 Hours readers might go into biological security and AI research, I think a lot can do things to reduce climate change, e.g. through petitions, community work, research, divestment, and many other routes - so there are lots of points of attack on the problem.
Listeners to 80,000 Hours and people involved in EA could do many things:
- Work at major energy companies on reducing emissions
- Develop low-carbon technology
- Work on policy advocacy in neglected areas
- Climate modelling
- Research on reducing the risk of social collapse
- Risk assessments on flood, heat, fire, and impacts to biodiversity
- Adaptation and resilience work, e.g. adapting to reduced water availability and making agriculture more efficient
13. Improved technology could resolve the collective action problems of climate change.
The 80,000 Hours climate change page argues that climate change could be less tractable because of the free-rider problem. Improved monitoring of emissions from space may help tackle the free-rider problem, and breakthroughs in decarbonising industry could improve efficiency. If we take an Elinor Ostrom view, improved technology might give us the components to resolve these collective action problems.
14. There is increasing public interest in climate change.
It is true that climate change is not very neglected, and this is a good argument against working on it relative to areas like biorisk. On the upside, the popularity of climate change means that society already wants to dedicate a huge and increasing amount of resources to tackling this problem.
With global record temperatures in 2018 (which seem likely to be yet higher this year), and the BBC changing its guidelines over reporting on climate change (with a David Attenborough-presented documentary coming out on Thursday 18th April), it seems likely to me that there will be huge public interest and resources which can be allocated to this problem.
Even if the greatest value of EA lies in allocating the ~0.01% of global resources controlled by people who can be persuaded to think about x-risk, MCE, etc., capturing some of the resources already flowing towards climate change and directing them with tools like the ITN framework could create a huge amount of value. Alternatively, increasing the share of humanity's resources dedicated to climate change (rather than to luxury goods) also seems highly valuable.
This could also make the social contagion of climate change reduction initiatives very high. With rising temperatures, a 10% GWWC pledge to AMF might fail to motivate people in your social circle who are apathetic about climate change, but a 10% pledge to CATF, CFRN, or ClimateWorks might inspire many of them. Anecdotally, I found that when I Googled "giving what we can", the most popular autocomplete was for "climate change", and the friends I have bought Doing Good Better for set up donations to Cool Earth.
15. Even the wealthiest countries are woefully behind on adaptation, and this could perhaps be tractable.
Perhaps as a result of failing to weight the future correctly, and confusion over the extent of climate change, even some of the richest countries in the world are behind where they need to be to maintain current levels of welfare.
Water shortages are expected in the UK in the next 25 years. However, more resilient global infrastructure (e.g. water, energy, food) plus innovation in this space (e.g. through water desalination) could be highly tractable.
Lack of engagement with climate change could damage EA
16. Individuals coming to EA might be put off by the lack of engagement with climate change.
In looking for EA material on climate change, I found that the GWWC page hadn't been updated since 2013, and that 80,000 Hours has very little material on 2-4C of warming, instead focusing its analysis of climate change on the tail risks. I came to EA trying to work out whether I should focus on climate change or animal welfare, and I have been surprised by the lack of detail on climate change, given the huge public interest in it. This has been frustrating.
Having listened to the 80,000 Hours Podcast and to Parfit's extinction argument, I agree that it is much more important to work on x-risk (and s-risk), but I wonder whether we are alienating potential EAs by not grappling with this issue.
17. Highlighting that climate change is probably not an extinction risk could reduce fatalism
As before, I agree with John Halstead's view that it is not an existential risk, and that broadly we can adapt to climate change, and thrive in the long reflection were there no other existential risks. But many of my friends do not agree with this - they would ask what the point is of preserving the future if climate change will make it much less positive than our current lives. A BBC article describes a rise in eco-anxiety and feelings of despondency.
But if we can help craft a narrative about how civilisation can get through its present challenges, then we could have a long and flourishing future. This positivity is one of the many things I love about Parfit and about EA in general. This might also help reduce fatalism and increase the number of people motivated to work on improving the long-term future.
Things I'm uncertain about
- Whether the scale/ neglectedness/ tractability framework should be revisited or replaced with a more dynamic model - ITNU with U for urgency
- Whether climate change can actually be tackled meaningfully
- What the level of public support is for climate change mitigation
- How much traction there has been with donors for climate change
- Pros/cons of soft broad EA on climate change, vs narrow EA on x-risk
- Whether biodiversity loss is a significant long-term problem
- How well we can adapt to climate change over coming decades
- How likely is civilisational collapse, or worsening of morals
- How climate change could impact existential risk
- Whether climate adaptation could also be potentially high value for EAs
Conclusions
To summarise: I think the importance/scale of climate change is undervalued within EA; its low neglectedness is a good argument against working on it, but does mean there are a lot of resources available; and the tractability of climate change could be higher than thought. When the ITN(U) framework includes urgency in sequencing effective action, it prioritises issues which can shape the ethical trajectories of civilisations.
I agree climate change is not an x-risk, and though EA shouldn't focus on it, we should probably discuss it a bit more and bring our critical thinking to help solve the problem more efficiently. Not discussing it seems like at best an oversight, and at worst, harmful.
But I'm really unsure about all of this, and would like to learn more.
Thanks for reading - I would appreciate your thoughts and any recommended reading! Like I said earlier, I'm thinking about where to allocate my time to do the most good. I do see existential risk reduction and the regulation of new technology as extremely high impact, but I wonder whether there's some more value to be unlocked here in tackling climate change.
Further reading
Climate change
- A quick summary by the BBC
- The IPCC reports
- UK Climate Change impacts and adaptation assessment
- A review of The Uninhabitable Earth by David Wallace-Wells
- There is no planet B by Mike Berners-Lee
- An individual’s account of wildfires, within a review of other climate change impact books
- OECD page on adaptation
- An interactive tool on how different temperatures will impact different areas
- BBC’s Climate Change: The Facts with David Attenborough
As an x-risk
- Future of Life perspective here
- John Halstead’s Founders Pledge paper on climate change, this paper on existential risk (including a short section on climate change impacting other risks - which I discuss above), and this one on whether climate change is an existential risk
- Fireside chat with Toby Ord in 2018
- Long-Term Trajectories of Human Civilization paper (2019) in Foresight here
- Catastrophe, Social Collapse, and Human Extinction by Robin Hanson
Taking an EA approach to climate change
- As before, the Founders Pledge paper on climate change
- The 80,000 Hours page here (which I discuss above)
- Elinor Ostrom’s profile, 1990 book Governing the Commons, and Nobel Prize speech
Economics and climate change
- An exploration of trade-offs between long-term and short-term economic growth
- The Stern Review summary
Some organisations working on this
- 350.org
- Drawdown.org
- Climate Outreach - e.g. how to mainstream low-carbon research
- Grantham Research Institute at LSE
- Carbon Brief
- Some others are mentioned in the 80,000 Hours cause profile here
I also wrote a short article comparing different ways for individuals to reduce their own carbon footprint.
I'll just respond to point 3 as it refers to my opinions directly. I don't think one should read my saying "Wow, okay" - an off-the-cuff response to something someone said - as much evidence that I've changed my mind, let alone that other people should. At that point I hadn't had a chance to scrutinise the research, or reflect on its importance.
I suspect I said 'wow okay' because I was surprised by the claimed volatility of food supply in general, not the incremental effect of climate change.
Taking the figures Dave offers at face value, the increase is from 56% to 80% in the remainder of the century, which isn't surprisingly large to me. Not having looked at it, I'd also take such modelling with a pinch of salt, and worry that it exaggerates things in order to support advocacy on the topic.
N.B. $1 - 0.99^{81} \approx 56\%$
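Going the other way (my own back-of-the-envelope arithmetic, assuming the same 81-year window): an 80% chance over the rest of the century corresponds to a per-year probability of roughly

$p = 1 - 0.20^{1/81} \approx 2\%$

i.e. the claimed increase is from about 1% per year now to about 2% per year.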
I think the UK government study I referred to was saying that even the ~1% chance per year of a 10% global food production shortfall now is increased by the climate change we have had so far, so the delta due to slow climate change could be larger. In general, I think it is a good idea for EA to have something to say to people who are interested in climate change. But conventional emissions reductions, which would cost ~$10 trillion in present value, are generally not very effective. However, I think there could be neglected opportunities that are effective. I talked about one - targeting neglected energy efficiency policy opportunities that reduce CO2 emissions and save money - in that same 80,000 Hours interview (near the end). Carl Shulman notes that OPP has found some leveraged opportunities for climate change. Also, FLI and CSER do some work on climate change. The food system resilience work ALLFED does can also be valuable for abrupt regional climate change, where a region can lose 10°C in a decade, which has happened several times in the ice core record.
Thanks for this. It's useful for the community to think about this kind of thing and this is well-argued.
Overall
1. It's a good point that since the top AI fields seem oversubscribed, it might be worth some people moving into the next best causes. Another possibility is that they should wait until the number of organisations catches up with the number of people. It might even be that the most valuable option is having a reserve of a large number of people who could, with some probability, be a good fit for the highest-impact orgs, even though most of these people never end up working for high-impact orgs. This puts a new slant on the demandingness of EA: rather than making sacrifices by donating, EAs make sacrifices by being prepared to accept the substantial probability of themselves never having impact. This would be hard to take psychologically, but might be the right thing to do in a crowded talent space.
2. On indirect risks, another point I make in the FP report is that while climate change is an indirect stressor of other risks, working on those terminal risks directly seems to me a better bet than working on climate change, since climate change is such an indirect stressor, is very crowded, and seems difficult to make progress on. What do you think of that argument?
ITN
3. I don't think it is right that problems with high tractability should be de-prioritised. I think what you mean is that we should focus on things that shift the long-term trajectory of humanity. But these could be highly tractable: e.g. the problem of not starting nuclear war was tractable for Vasili Arkhipov, but plausibly had large long-term effects. Having looked at it in some depth, climate change does look like an intractable problem overall, and this is indeed a reason not to work on it.
4. Another good point on how there could be increasing returns to scale in climate change, as we could affect the huge pool of funds going to the space through engagement.
5. Really, the ITN framework perhaps shouldn't be used when we have cost-effectiveness estimates. On the 80k rendering, ITN is literally a cost-effectiveness estimate (see the rough breakdown after these points). But we now have cost-effectiveness estimates of climate charities. If we can make plausible estimates of the impact of bio, AI and nuclear, then we should use those, rather than appealing to the ITN. Similarly for use of time as well as money.
5. It is premature to say that work on climate change could be tractable. I think careful analysis is needed to figure out whether the things you list are indeed a good bet compared to other things that EAs could do.
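To spell out the point about the 80k rendering (a rough paraphrase of their framework rather than a quote), the three factors are defined so that they multiply into a cost-effectiveness estimate:

$\frac{\text{good done}}{\text{extra dollar}} = \underbrace{\frac{\text{good done}}{\text{\% of problem solved}}}_{\text{importance}} \times \underbrace{\frac{\text{\% of problem solved}}{\text{\% increase in resources}}}_{\text{tractability}} \times \underbrace{\frac{\text{\% increase in resources}}{\text{extra dollar}}}_{\text{neglectedness}}$

So once a direct cost-effectiveness estimate is available, the decomposition adds little.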
Details
6. Climate Action Tracker suggests that on current policy, we are in for 3.1 to 3.5C, which is different to the 'baseline' trajectory estimate that you give. I think the current policy trajectory is most relevant for that part of your argument. (But note that this is only by 2100)
7. The impact of climate change on food production is in fact predicted to be fairly modest, as I discuss here. Yields might fall by 10-20% but this will be in the context of rising productivity and improvement in the other factors that determine the supply of food.
8. The emphasis on water shortage throughout is a bit overblown. We don't need to ration water, we just need to price it properly (which is efficient rationing). If we did that, there would be no water problems today or in the future, anywhere (provided people had enough money).
"EAs make sacrifices by being prepared to accept the substantial probability of themselves never having impact. This would be hard to take psychologically, but might be the right thing to do in a crowded talent space."
My impression has always been that even the most qualified person who goes into the most promising field (say, for example, AI risk reduction) has a low absolute chance of being the person to make a breakthrough in that field, but rather, that part of the point of EA was to get more talented people into those fields (e.g. by increasing the number of jobs) to increase the chance that a breakthrough will be made by someone.
Thanks for your comments - really great to learn more about this topic.
1. Agree, and optimisation in some soft-EA fields could be beneficial.
2. Agree - all other things equal, it would be better to work directly on an x-risk. But it would benefit EA, for the reasons given later, to acknowledge climate change as a potential stressor of x-risk. The assumption I'm using is that climate change might be more tractable for a bigger pool of people. If someone is concerned about x-risk but is an expert on renewable energy working on the breakthrough of some new technology, then they could understand their work as reducing the overall portfolio of x-risk. And if direct x-risks are heavily oversubscribed, or depend on only a small number of agents (e.g. nuclear), then perhaps there is more leverage for some people in working on climate change.
3. Agree that we should be focusing on things which affect the overall trajectory of civilisation. But is climate change really an intractable problem? If so, why do so many smart people at universities, the UN, and the IPCC have reducing emissions as a goal? Maybe it is intractable to get all the way to net zero, but isn't it a worthwhile goal to push to lower the rate of warming and give us more time to adapt?
I don't profess to have the answer, but I'd be interested in the debate. I worry that this discussion doesn't have enough input from real experts in this space.
If climate change is intractable, then what's the next step? Should we be looking at geoengineering, adaptation, resilience? If so, here are some other rough ideas of things that could help global welfare:
4, 5 - Interesting to read; I don't profess to be an expert, so I would appreciate learning from other perspectives.
6, 7, 8 - I wonder whether some modelling is overconfident about how resilient societies will be to climate change, as we are densely networked and there might be lots of unanticipated secondary effects, such as mosquito-borne diseases reaching another billion people. The latest UK adaptation report acknowledges biodiversity as one of the several areas urgently needing further research.
I'll focus on point 2 because I think it is the most important. I don't see the argument for it being true that for the vast majority of people, working on climate change promises more leverage on the problem of nuclear war, than does working directly on nuclear war. Nuclear war is easier to make progress on, more neglected and more important than climate change.
While climate change doesn't immediately appear to be neglected, it seems possible that many people/orgs "working on climate change" aren't doing so particularly effectively.
Historically, it seems like the environmental movement has an extremely poor track record at applying an "optimizing mindset" to problems and has tended to advocate solutions based on mood affiliation rather than reasoning about efficiency. A recent example would be the reactions to the California drought which blame almost anyone except the actual biggest problem (agriculture).
Of course, I have no idea how much this consideration increases the "effective neglectedness" of climate change. I expect that there are still enough people applying an optimizing mindset to make it reasonably non-neglected, but maybe only on par with global health, rather than massively less neglected as you might guess from news coverage?
I agree that the environmental movement is extremely poor at optimisation. This being said, there are a number of very large philanthropists and charities who do take a sensible approach to climate change, so I don't think this is a case in which EAs could march in and totally change everything. Much of Climateworks' giving takes a broadly EA approach, and they oversee the giving of numerous multi-billion dollar foundations. Gates also does some sensible work on the energy innovation side. Nevertheless, most money in the space does seem to be spent very badly, e.g. on opposing nuclear power. This consideration might even make the environmental movement net negative wrt climate, though I haven't crunched any numbers on that.
I would also add that sensible EA answers in this space face substantial opposition from the environmental movement. I think a rational analysis argues in favour of advocating for nuclear and carbon capture, for energy innovation in general, and for financial incentives to prevent deforestation. All of these things are opposed quite strongly by different constituencies in the environmental movement. Maybe the one thing most people can agree on is carbon pricing, but that is hard to get through for other reasons.
I think this is spot on - the fact that climate change as a cause area is saturated in aggregate terms masks the fact that there are a number of both important and neglected (if not necessarily tractable) initiatives that could benefit from an increase in funding and talent. Nuclear and carbon capture advocacy are both good examples. Clean and plant-based meat has nice synergy with animal welfare. I think there's also probably a role for "better" journalism to counteract some of the less helpful, Malthusian narratives that seem to dominate the media right now, at least in the UK and US.
It's hard to tell where this site is getting its numbers from, but my understanding is that such claims are usually based on misrepresenting the RCP 8.5 emissions scenario as representative of business as usual, even though it makes a number of pessimistic assumptions about other uncertainties and is widely considered more of a worst-case scenario than a median-case scenario.
As far as I can tell, claims that extremely high climate sensitivities are plausible tend to be based on conceptual misunderstandings of Bayesian probability.
The Stern Review isn't representative of climate economics as a whole and has various problems that you can find using your favorite web search engine.
On Bayesianism - this is an important point. The very heavy-tailed estimates all use a "zero information" prior with an arbitrary cut-off at e.g. 10 degrees or 20 degrees (I discuss this in my write-up). This is flawed, and more plausible priors are available which thin out the tails a lot.
However, I don't think you need this to get to there being substantial tail risk. Eyeballing the ECS estimates that use plausible priors, there's still something like a 1-5% chance of ECS being >5 degrees, which means that from 1.5 doublings of GHG concentrations, which seems plausible, there's a 1-5% chance of ~7 degrees of eventual warming.
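To spell out the arithmetic behind that last step (a rough approximation, using the standard assumption that eventual warming scales with the number of doublings of CO2-equivalent concentrations):

$\Delta T_{\text{eq}} \approx \text{ECS} \times \log_2(C/C_0) \approx 5^{\circ}\text{C} \times 1.5 \approx 7.5^{\circ}\text{C}$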
My understanding is there are two somewhat separate issues, one being the improper use of uniform priors and the other being a failure to give estimates that take all evidence (GCMs, recent temperatures, paleoclimate, etc) into account, with probability distributions from mostly-independent evidence sometimes having wrongly been taken as confirmation of the same uncertainty range instead of being combined into a narrower one. Do the estimates that you're eyeballing update on every line of evidence? Annan and Hargreaves under some assumptions find numbers like a 95% upper probability bound of 3.6 degrees, which would imply virtually no risk of ECS>5. (Structural uncertainty may, of course, weaken such conclusions.)
As you explain in your writeup, the 7 degrees here represents an eventual temperature increase that will only be attained in centuries, and the increase over the 21st century would be significantly less (though still major), which makes the scenario less extreme than it sounds.
Your writeup uses Wagner and Weitzman's interpretation of the IPCC's uncertainty range for sensitivity. If I remember correctly, AR5 does discuss the issues of priors and combined evidence, but bases the 1.5-4.5 degree range on subjective judgment, so it's hard for me to be sure that bad Bayesianism is what's causing this interval to be as wide as it is, but Annan seems to think people aren't taking his points into account enough.
I've found it hard to find good information on the questions most directly relevant to estimating the pdf of the size of the effects of climate change (with your writeup as one of the main exceptions) and may be getting things wrong.
Yes I think you are in fact right that plausible priors do seem to exclude ECS above 5 degrees.
You pick out a major problem in drawing conclusions about ECS: the IPCC does not explain how it arrives at its pdf of ECS, and the estimate seems to be produced somewhat subjectively from various current estimates from instrumental and paleoclimatic data, and from expert judgement as to what weight to give to different studies. I think this means that they give some weight to pdfs with a very fat tail, which seems to be wrong, given their use of uniform priors. This might mean that their tail estimate is too high.
Intuitively I'm inclined to agree that the probability of very high or low climate sensitivities is overestimated due to the existence of a few separate lines of evidence that give us similar estimates, and because some studies have used inappropriate priors.
But I've heard climate science experts say it's harder to "nail down" the upper end of the ECS range, IIUC because of the multiplicative nature of positive feedbacks. A simple blackbody model of Earth with no feedbacks says that doubling CO2 would give us about 1.1°C of warming (IIRC) but there are several feedbacks in which a temperature increase causes a larger temperature increase: water vapour, ice albedo, permafrost melt (not technically included in ECS, but worth considering along with the effect of destabilizing shallow clathrates, if any), cloud feedback (thought to be small), a potential increase in drought leading to higher albedo, changes to the oceanic depth-temperature gradient / changes to ocean currents (which reminds me, global warming could ultimately cause cooling in Europe which implies that if the ECS is X, the typical warming would be above X outside Europe, and as I have noted elsewhere in this discussion, the land will warm more than the ocean surface at equilibrium).
When you stack the PDFs of all the feedbacks together, the tail of the distribution gets uncomfortably long. (I didn't read much of Annan & Hargreaves, so if their analysis specifically addresses the "stacking" of feedbacks, let me know.) [Overall, there have been new papers suggesting we can constrain ECS below 4° and others saying we can't, so I think we need to give the dust some time to settle - while still looking for tractable things we can do in this area.]
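A simplified sketch of why stacking feedbacks fattens the tail: if the no-feedback warming from a doubling is $\Delta T_0 \approx 1.1^{\circ}\text{C}$ and the feedbacks sum to a total feedback fraction $f$, then

$\Delta T \approx \frac{\Delta T_0}{1 - f}$

so a roughly symmetric uncertainty in $f$ maps to a right-skewed, fat-tailed distribution for $\Delta T$: moving $f$ from 0.6 to 0.7 adds about a degree of warming, while moving it from 0.7 to 0.8 adds nearly two.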
Note that the historical data on the ECS doesn't help much to constrain the upper end of the temperature range because ECS is likely not independent from initial temperature; we'll be reaching temperature zones Earth hasn't had for many millions of years, and we don't have very solid data going back beyond 800,000 years.
FYI, the CMIP6 models, to be used for the IPCC's AR6 reporting in 2021, are already producing preliminary results.
Quote from the linked article: "Early results suggest ECS values from some of the new CMIP6 climate models are higher than previous estimates, with early numbers being reported between 2.8C (pdf) and 5.8C. This compares with the previous coupled model intercomparison project (CMIP5), which reported values between 2.1C to 4.7C. The IPCC’s fifth assessment report (AR5) assessed ECS to be “likely” in the range 1.5C to 4.5C and “very unlikely” greater than 6C. (These terms are defined using the IPCC methodology.)"
The IPCC experts actually toned down the projected temperature range from the Coupled Model Intercomparison Project phase 5 (CMIP5) models. If they did so in a similar fashion in 2021, we'd get an IPCC AR6 ECS range of roughly 2.3 to 5.2 degrees Celsius, with a tail up to 7 degrees.
I'd like to make a few points based on my knowledge as someone who studies climate science issues as a hobby. I'm a member of the volunteer team at SkepticalScience (an anti-climate-misinformation site).
[edits/additions in square brackets; original version contained mistakes]
First, humanity needs to reduce carbon emissions all the way to zero or below, because natural removal of CO2 from the climate system is extraordinarily slow. 50% is too much; 25% is too much. Zero. Popularly-considered strategies for mitigating global warming won't achieve that. Optimistic IPCC scenarios like RCP 2.6 assume technology will also be widely deployed to remove CO2 from the air. Things like tree planting that increase biomass can slow down the increase of CO2 but can't stop it even briefly; other ideas for carbon sequestration are not economical [AFAIK] and it's irresponsible to simply assume an economical technology for this will be invented. Therefore, we need to switch to 100% clean energy, and do so as soon as possible.
In my opinion the best thing the EA community can do (under the importance-tractability-neglectedness framework) is to study and support nuclear energy in some affordable way. In the past, the push for climate change mitigation has come from traditional environmentalists, who have fought against nuclear power since at least the early 1980s [probably more like 1960s] and mostly haven't reconsidered. This is evident from the many campaigns for "renewable energy" rather than "clean" or even "sustainable" energy. EA can fill a gap here. My favorite thing is to ask people to support new, inexpensive Molten Salt Reactor (MSR) designs.

But probably the cheapest thing we can do is a web site. I think there is a real need for a web site about nuclear facts (or clean energy facts generally), something that debunks myths like SkepticalScience does for climate science, and also provides information that isn't adequately available on the web right now, about such topics as the risks of radiation, the types and quantities of nuclear waste, and the ways nuclear regulations have improved safety (albeit increasing costs). And, of course, it would go into some detail about MSRs and other affordable next-generation reactor designs. As EAs are not funded by the nuclear industry, they could be a credible independent voice.
Solar power makes great sense in a lot of tropical places, but in northern climates like Canada it doesn't, as peak energy use happens in the wintertime when the sun is very weak. AFAICT this makes solar in Canada into a nuisance, a potential roadblock as we get close to 100% clean energy (why would we deploy more clean energy if existing solar power makes it redundant for half of each year?). Without nuclear power, our main source of energy [in such climates] would probably have to be wind, and I'm very concerned that the cost of relying mostly on wind power would be prohibitive, especially in a free-market system. I don't know the exact numbers, but once we exceed something like 25%-30% average wind power, instantaneous wind power will often exceed 100% of demand, after which wind turbines are likely to become less and less economical. (Epistemic status: educated guess [but after I posted this someone pointed me to a presentation by an expert, which says solar value starts to decline well before 25-30% penetration])
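A rough illustration of where that threshold comes from (my own back-of-the-envelope numbers, assuming a wind capacity factor of around 30% and output that swings between near zero and near nameplate): if wind supplies 30% of average demand $\bar{D}$ at a 30% capacity factor, the installed nameplate capacity is roughly

$\frac{0.30 \times \bar{D}}{0.30} = \bar{D}$

so on very windy days instantaneous wind output approaches or exceeds instantaneous demand, and additional wind capacity increasingly has to be curtailed or stored.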
[Meanwhile, right now nuclear advocates often rely on bare assertions, some of which are wrong. Without credible-but-readable sources - plain language explanations that cite textbooks, scientific literature and government reports - it's hard to convince intellectuals and reasonable skeptics to change their mind. Anti-nukes can simply assert that claims that make nuclear power look not-scary are nuclear-industry propaganda. Note that nuclear power relies on public opinion much more than renewables currently do - companies are free to design and build new wind turbines, but nuclear power is, of necessity, highly regulated and its continued existence relies on political will, which in turn flows from popular opinion. Witness the blanket shutdown of all nuclear power in Germany based on the effects of a tsunami on Fukushima, Japan. Hence the motivation for an educational site.]
By contrast, it seems clear to me that mass-produced MSRs can [theoretically] be cheaper than today's CCGTs (natural gas plants). I've been following MSRs with great interest and I've published an article about them on Medium, although it remains unlisted because I'm still uncertain about a couple of points in the article and I'd love to get a nuclear expert to review it.
Second, it is a common misconception that we could have 4°C of global warming by 2100; climate scientists generally don't think so [except in the RCP 8.5 (business as usual) scenario which by now is more of a "look at the train wreck we're avoiding!" than a plausible outcome]. Often this misconception arises because there are two measurements of the warming effect of CO2, and the most commonly reported measure is ECS (equilibrium climate sensitivity) which predicts the amount of warming caused by doubling CO2 levels and then waiting for the climate system to adjust. The best estimate of ECS is 3°C (2.0-4.5°C, 90% confidence interval according to the IPCC) and it will take at least 200 years after CO2 doubles to even approach that amount of warming. If the ECS is higher than 3°C I would expect it to take even longer to approach equilibrium, but I'm rather uncertain about that.
To estimate the warming we expect by 2100, look at the TCR (Transient Climate Response) instead. The TCR is highly likely to be in the range 1.0-2.5°C. Keep in mind, however, that only 2/3 of greenhouse warming comes from CO2 according to the AGGI; 1/6 comes from methane and the final 1/6 from all other human-added greenhouse gases combined. The most common estimate of TCR is 1.7°C or 1.8°C and a first-order estimate based on observed warming so far is about 1.5°C. So if CO2 doubles (to 560 ppm), I'd expect about 2.5[±1.1]°C of global warming based on a TCR of 1.75, assuming CO2's fraction of all GHGs increases slightly by then. [side note: I would be surprised if CO2 more than doubles - I think we'll get almost 100% clean energy by 2100; OTOH predicting the future isn't really my forte.]
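Spelling out that last estimate (a rough sketch under the stated assumptions: a TCR of about 1.75°C per doubling of CO2, with CO2 providing roughly 70% of total greenhouse forcing by then):

$\Delta T_{2100} \approx \frac{\text{TCR}}{f_{\mathrm{CO_2}}} \approx \frac{1.75^{\circ}\text{C}}{0.7} \approx 2.5^{\circ}\text{C}$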
Third, having said that, the land will warm a lot faster than the oceans. Climate models on average predict 55% more warming on land than sea [related paper]. [Observations so far suggest that the transient difference could be] greater. Therefore, although 4°C of "global" warming by 2100 is highly unlikely, 4°C of land warming by 2100 is a distinct possibility (though I estimate the probability at below 50%).
I guessed on Metaculus that global warming by 2100 would be [1.7 to 2.6°C] (despite the Paris agreement), but on land [it's likely to reach 3°C (and as climate change is non-uniform, some populated locations could exceed 3°C even if the land average is less than 3. I should add that the land-sea ratio is thought to be lower in the tropics, albeit higher in the subtropics. And my prediction was somewhat optimistic—I assumed that eventually society would build nuclear plants at scale; or that at least some cheap CCS tech would be discovered.)]
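As a sanity check on that land figure (my own rough arithmetic, assuming land covers about 29% of the Earth's surface and warms 55% more than the ocean, as above):

$\frac{\Delta T_{\text{land}}}{\Delta T_{\text{global}}} = \frac{1.55}{0.29 \times 1.55 + 0.71} \approx 1.34, \qquad 1.34 \times 2.2^{\circ}\text{C} \approx 2.9^{\circ}\text{C}$

taking 2.2°C as the midpoint of the 1.7-2.6°C global range.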
Fourth, having lived in the northern Philippines, I think the impact of the warming itself is underappreciated. I lived in a very humid town (more humid and hotter than Hawaii) where the temperature exceeded 30°C in the shade most days. The hottest day of the year was about 37°C in the shade at noon, coldest would have been around 18°C at 6AM.
Maybe it's just that I lived in Canada too long, but humans are humans - we are naturally uncomfortable if our core temperature exceeds 37°C and I became uncomfortable whenever I went outside or left the sanctuary of the Air Conditioner. So for the sake of Filipinos and the other 3+ billion people living in tropical latitudes, I think we should be very concerned about the effect of just the warming itself on humanity's quality of life.
If we get 4°C of [land] warming vs preindustrial, that implies average daily highs of about 33-34°C in my town, which I would describe as virtually unbearable at 75% humidity. Consider also that if the Philippines becomes more prosperous, they will respond to the high temperatures by extensive use of air conditioning, which is energy intensive. If we don't stop using fossil fuels soon, air conditioning itself can become a significant contributor to further global warming.
Energy for Humanity is a great underfunded pro-nuclear NGO working in the EU. Clean Air Task Force and Third Way are also great.
I also think the current emphasis on solar and wind in some places could be a barrier to sensible low-carbon policies in the long term, especially as they don't go very well with nuclear. Combining intermittent renewables with nuclear, as France bizarrely considered doing recently, just makes nuclear run below capacity when the sun is shining, which doesn't make economic sense.
Although you're right, it appears the renewables juggernaut is unstoppable, and mass production for affordable reactors will require about 15 years to spin up, during which time renewables will be the only game in town. For that reason, MSR vendors want to use huge silos of solar salt to store energy when renewables are going strong, which they can discharge when the renewables start losing power. In this way the nuclear reactor can usually go at full power, albeit at the cost of extra turbines and solar salt (so named because it was pioneered by concentrated solar power technology).
There's also the Breakthrough Institute and Environmental Progress in the US. Plus the broader "Ecomodernist" movement: http://www.ecomodernism.org/
...
I'd like to second this point, as someone who comes from a prosperous rainforest nation (Singapore) with an average of 85% humidity and ~30°C weather. Not only does quality of life go down, but carbon footprints will increase - with AC bills, the need for cold refrigeration (especially in transportation), and a preference for private cars/taxis over public transport and walking. Singapore has made large infrastructural investments in thousands of kilometers of covered walkways and air-conditioned public transit to combat this, but most cities in these regions don't have the governance capacity or capital for such investments.
I agree with you and bdixon that emission reduction should be a serious priority for EA, and also that we shouldn't minimize its direct effects on human beings. The WHO estimates that between 2030 and 2050, deaths from climate change will reach 250,000 per year. Right now, it's likely over 200,000 per year. These deaths don't come simply from heat stress, but also from diseases moving into higher latitudes, droughts, water stress, etc. My understanding is that this estimate does not include the impacts of war and conflict, which are also increasing as a result of climate change.
I disagree, however, that nuclear power presents a viable solution. I am in favor of nuclear power as a technocratic policy prescription, and I would be happy to see more of the world's power become nuclear. But it's not politically viable, and that's what matters.
Nuclear energy is unpopular. A 2011 Ipsos poll (admittedly conducted in the wake of Fukushima) found that only 38% of the population in 24 countries supported getting some power from nuclear. In the US, support for nuclear power is declining, and it no longer has majority support with the public, according to Gallup polls. These numbers can and do shift over time, but getting the public on board with nuclear is a long-term, challenging task. If you agree that massive emissions reductions in the next decade will save many lives and reduce our risk of triggering nasty feedback loops, like a collapse in land-ice, nuclear is not the way to go.
This dovetails with the larger question of tractability. In technological terms, climate change is a solvable problem. There are two reasons to think we might not solve it before triggering mass migration, economic collapse, world war, and nuclear/biological war: 1) the political system has consistently failed to take even modest action, and 2) we are running out of time, and the measures we will need to take only get more drastic the longer we wait.
I would say the question at least warrants thorough research from the community (I'm unaware if this has already been done) - both on whether public opinion can change through education, and on promising countries/regions with higher rates of public support (e.g. no history of nuclear disaster) that are equipped to implement nuclear safely. This may not be a globally scalable solution, but if even a few players adopt nuclear it could draw more investment, improve the technology, and potentially make it more feasible for others.
For example, in Pennsylvania 40% of all energy and 93% of carbon-free energy comes from nuclear, but only 1 in 10 people know with certainty that nuclear energy is carbon-free. It seems to me that public education could potentially be effective, especially because there appears to be conservative support for nuclear.
If research on that front yielded results, that would certainly be valuable.
But compare that task to the work that climate advocates have been doing for decades: trying to educate people out of their political convictions has had very limited success when it comes to convincing them that radical action on climate is needed. A similar effort on nuclear power might take decades more (which we don't have; as we know, there's a ticking clock).
The conservative-support argument is interesting, but IMO also flawed. Andrew Sullivan, influential conservative writer/intellectual, called for something like this when he proposed a 'nuclear Green New Deal.' In the United States, it's a non-starter. The politicians and voters who are interested in big, sweeping transformations of the economy are disproportionately concentrated on the political left. So is the most die-hard anti-nuclear opposition.
And this presents a coalition-building challenge. The American GOP is unwilling to take action on climate and is heavily influenced by money from coal, oil, and gas interests. GOP politicians have, so far, refused to take even modest action, and appear comfortable making decisions on issues like climate or healthcare policy that are out of line with public opinion polling, even among their own voter base.
In the current American political landscape, bipartisan action, especially on a Green New Deal or a 'nuclear new deal,' is nonviable. The last ten years of GOP opposition to the ACA, which was a small-c conservative proposal originally floated by the GOP (and tested by Mitt Romney), speaks to the lack of bipartisan options. So American action must come through the Democratic party, and leaning heavily on new nuclear power currently reduces the chance of that happening.
The international situation isn't much better. The conservative CDU/CSU in Germany has vowed to transition off nuclear power entirely. There's also the added problem that many countries are heavily dissuaded by the international community from acquiring and enriching nuclear material.
I would point out that this has largely been liberals trying to convince conservatives about climate science; cross-tribe communication is pretty difficult. Indeed, I wonder if support for nuclear among conservatives stems as much from opposing the "liberal media"'s scaremongering as from anything else. There's been some success, at least on the left, from efforts to get the word out about "the" 97% consensus among climate scientists. Educating people on the left seems like an easier problem - there are die-hard anti-nukes who can't be convinced, but they're a small minority.
AFAIK no one has seriously attempted the educational resource I propose, so before saying it can't work I think it's worth trying. We do have some material, like Gordon McDowell's videos, that basically targets maven personalities like myself, but I found that it still doesn't provide all the information I need to build a complete mental model of nuclear power. An educational site is not enough by itself to change public opinion, but it could at least be valuable to maven-type people who want to change minds about nuclear power but don't have good sources of information that they can link to and learn from.
Public opinion is a very hard nut to crack, but what about the media? I would guess that influencers like John Oliver probably got some of their information from SkepticalScience, so I think public education may be able to percolate to the people by first percolating up to the media.
I am very much aware. That's what I think we should take steps to address. Providing educational resources isn't enough by itself, but it's a necessary step.
Someone pointed me to this video by Jesse Jenkins at MIT who models the cost of electricity systems in the context of a goal to reach zero carbon emissions. The video shows how nuclear would play an important role even if a nuclear plant costs 6 times as much to build as a natural gas plant. When I saw this video I thought "wait, if new renewables eventually lose so much value that expensive nuclear plants start looking attractive, just how the heck could we convince every country in the world to replace all their fossil fuels?" Since we know how to make nuclear cheaper, the obvious answer is, let's do that.
At a few points in this post, you argue for climate change possibly being more important to tackle than the major short-termist causes within EA. Do you also see it as something which should redirect resources currently being spent on long-termist causes? It's odd to see a claim that EA doesn't focus enough on "things that could alter the trajectory of our civilization" when most popular critiques of EA say the opposite (too much focus on long-term risks, not enough on helping people in clear/direct ways).
On the other hand, one benefit of comparing climate change to disease/animal welfare is that the donor demographics for climate change may be more similar to the demographics for disease/animal donors than for donors to other X-risks (e.g. people with an environmentalist bent, people interested in the Global South).
This is a small nitpick, but I don't think I've ever seen the claim substantiated that EA's focus has been unduly influenced by "the short-term world of hedge funds", even though people make it all the time. Yes, GiveWell was founded by hedge-fund veterans, but the tools they borrowed from Bridgewater were (as far as I know) related to EV calculations, not "having a short time horizon". EA has, almost since the beginning, had a stronger focus on the long-term future than nearly any other social movement.
Thanks for your comments.
No, I think that diverting funding from AI alignment to climate change would be a mistake. But optimising money currently spent on climate change could be useful.
Yes, it felt a little harsh of me to have written that. I agree - it's a bit of a strawman argument. I think what I was getting at there is perhaps better expressed in the quote from Christine Peterson.
As a long-time lurker of the effective altruism movement, working to "put the brains in climate change action" and address system-level obstacles to serious climate action, I'm so happy to see this kind of post on the EA Forum.
Climate change has unfortunately been "neglected" by the EA movement, which is a problem because one of the big issues with "mainstream" climate action is that it doesn't have an "EA mindset".
With all due respect to some of the organizations, like Cool Earth, who address the problem rationally, climate change action in general suffers greatly from the influence of organizations like Greenpeace, who approach the problem dogmatically: pushing to close nuclear power plants in France as a higher priority than coal power plants in Germany (when the combination of well-managed nuclear power plants, energy-saving measures, and cross-country power lines would make it feasible to provide carbon-free power even to countries whose local populations don't want nuclear in their back yards), or promoting solar panels on roofs in countries like England, where there is so little sunshine that the energy return on investment makes it a ridiculous investment. A unit of measurement exists, dollars per ton of CO2 averted, but this indicator is very seldom used to guide policy development: while Cool Earth can avert a ton of CO2 for about a dollar, and a ton of CO2 emitted anywhere has about the same impact on the whole world's climate, the French government's policies to "fight climate change" have cost about €650 per ton. When there is that much discrepancy, there is a big need to bring the kind of rationality promoted by EA and LessWrong into the discussion.
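As a rough illustration of how large that gap is, here is a minimal sketch using only the per-ton figures quoted above; the budget is an arbitrary illustrative number, and I've treated euros and dollars as roughly interchangeable for simplicity:

```python
# Rough comparison of CO2 averted per unit of spending, using the
# per-ton cost figures quoted in the comment above (illustrative only).
cost_per_ton = {
    "Cool Earth (quoted estimate)": 1.0,                  # ~$1 per ton of CO2 averted
    "French climate policies (quoted estimate)": 650.0,   # ~EUR 650 per ton averted
}

budget = 1_000_000  # hypothetical budget

for name, cost in cost_per_ton.items():
    tons_averted = budget / cost
    print(f"{name}: ~{tons_averted:,.0f} tons of CO2 averted per {budget:,} spent")
```

On these (very rough) numbers the difference in cost-effectiveness is over two orders of magnitude, which is exactly the point about needing a dollars-per-ton indicator in policy discussions.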
Furthermore, climate change policy suffers from a "first mover" problem which is a big obstacle to radically effective measures. As stated in the executive summary of the organization that I work for, Simpol, "The economy, finance and markets now operate globally, whereas the systems of governance supposed to regulate them still operate only nationally. The ability of capital and corporations to move freely across borders necessarily prevents governments from implementing regulations to solve global problems. That’s because any nation implementing tighter regulations would increase its business’s costs, so risking jobs and investment moving elsewhere. The result? International inaction, while our problems only get worse. It’s a vicious circle all governments are caught in; a circle which, if not broken, will break us all. "
This vicious circle has neither been taken into account by promoters of a "Green New Deal", nor even by the UN's mechanism of the Conference of Parties. For an alternative proposal to regulate climate change effectively, please check out simpol dot org.
Thank you for writing this post; you bring up several excellent points, especially with regard to climate change permanently reducing the potential of humanity. I'd like to expand on a few of the points you raised:
Point 12: EAs can skills-build working in climate change
The EA movement could learn about movement-building, public policy work and professionalization from the climate change movement. The breadth and depth of the professionalization of climate change is impressive, covering environmental engineering, environmental sciences, geography & geology (which have had environmental agendas for decades) and environmental studies (the society/culture/policy side).
There are many opportunities to engage with climate change fairly easily, which could add to the collective intelligence of the community and help individual EAs skills-build while doing direct work (preventing moral drift). It would be good for the EA community to identify the most promising opportunities within the climate change movement, because there is a vast range and not all are equally effective.
Points 14/15: Climate change is not talent- or (comparatively) funding-constrained
The climate change sector is professionalized and is a priority (even if only nominally) for corporations, governments and research institutions. This means that, potentially, a really great EA-aligned climate change charity could attract significant funding from outside the community (counterfactually, these resources probably couldn't easily be moved to a different cause area).
This is good because a) the community does not have to split its monetary resources across yet another cause area, and b) it raises the profile of EA and spreads EA-like thinking into a movement which already values evidence-backed reasoning, long-termism (or mid-termism) and grappling with complex problems.
Point 16: Why we may lose out on potential EAs
Many potential EAs probably know a good deal about climate change, much more so than about wild animal welfare or AI safety. By not seriously considering climate change, these people might stick to their priors, since they have more confidence in them and have developed them over a long period of time. Engaging with people at their level of knowledge, and showing evidence that the community has committed serious time and consideration to climate change, could (eventually) convince these people it's not a priority (if that's the conclusion we come to).
The level of depth of the 80K article is "Exploratory". Further, Founders Pledge suggests two charities where it's harder to measure an individual's marginal impact (unless they have significant money to donate).
The biggest issue I have with EA is its lack of attention to climate change. I am a supporter and member of the EA community, but add me to the category of people who are turned off by its weak stance on climate change.
I will admit that I am not as educated, accomplished, skilled, or simply as smart as most members of the EA community. I have a college degree, a small brain, a modicum of self-confidence, and am respected by my peers. I concede my weaknesses and faults, but still feel that I have an ability to speak.
It baffles me that EA does not address climate change on a larger scale. To me it clearly represents an extinction threat. Nuclear war, biosecurity, land security, etc. do not pose a threat if the very stability and security of humans' ability to feed and house themselves is taken away. EA talks about big ideas but runs away from the impacts that are happening in front of us. Climate change is happening and the world is doing very little to address it. It is immediate and constant. Almost every world issue, from poverty, migration, refugees, water security (a massive issue that needs to be addressed), economic instability and government instability to the spread of illness, can be traced back to climate change.
EA has taught me to ask 'what is the most important issue to human survival?' Without a doubt my answer is 'climate change'. With that in mind I have to follow a path that in some way addresses this issue. I truly can't tell you whether more than 100 million people will be alive in 2300, or whether the planet will be habitable in 3000, and that is 100% because of climate change.
I'm not nearly as smart or involved as many people in this community, but quickly after discovering EA (about 4 years ago) I got the impression that climate change and threats to biodiversity were underestimated, and was surprised at how little research and discussion there seems to be.
One point I would like to raise is that this community seems to assume that everyone starts (or should start) from the assumption that human life and our species's continued existence are valued above all else. It's very anthropocentric, which is another thing that may alienate potential effective altruists (like myself).
Personally, I value biodiversity for its own sake, as well as its benefit to humans, since I come from a long-term ecologists' perspective (put simply, I value the continuance of life rather than focusing on one species, and a more diverse biosphere/ecosystem is a healthier and more resilient one). I believe there should be a balance between ecologist and humanist efforts.
I wonder if part of this bias toward humanist efforts is because it's much cleaner and easier to measure costs and benefits in terms of human lives/DALY etc., since there are more data and research done.
Anyway, I would be curious to hear EA's perspective. And if there are convincing reasons I should reprioritize my values, I am open to changing my mind.
I think EAs focus on the survival of the human species above that of nonhumans because nonhumans can't prevent astronomical waste the way a flourishing, advanced human (or posthuman) civilization can. (That, or they don't think nonhumans less cognitively sophisticated than chimpanzees are sentient, though I think that is a minority view.) I agree that biodiversity is good, but only in terms of its impact on the welfare of humans and animals. Although not many EAs seem to value biodiversity for its own sake in the way deep ecologists do, many EAs are concerned about wild animal welfare. There is a lot of suffering experienced by wild animals, whether from nature or from human activity: starvation, predation, disease, and the high infant mortality that comes with r-selection. It's very difficult to design interventions with an expected positive impact on much of this; for instance, if you eradicate a parasite or disease from a species, might that contribute to overpopulation, and consequently to starvation and infant mortality instead? As such, WAW organizations like to focus on things like humane insecticides.
Links (floating formatting bar doesn’t show up on iPad sorry):
This is an interesting post by Ramez Naam. He argues that too much attention is given to transportation & energy emissions and not enough to agriculture & industry emissions. Naam thinks that renewable tech will continue to drop in cost, and he's optimistic that part of the equation will solve itself. He says the highest-leverage action is the development of new tech to address agriculture & industry emissions.
Not only is agriculture higher leverage for reducing emissions, it also holds a high potential for drawdown. This potential to store more CO2 in well-managed soils is unfortunately neglected, even though organizations like 4p1000 could potentially promote the storage of gigatons of CO2 at a very low cost per ton.
Moreover, the impact of methane emissions (which are mainly linked to agriculture) is probably undervalued by non-specialists, as we tend to convert methane emissions to CO2-equivalents using a 100-year global warming potential of about 28x, which is not relevant for the short term. If we only have 15 years to act before getting into dangerous feedback loops, then focusing more on methane would be a good idea: methane is less stable in the atmosphere in the long run, but relative to CO2 it is a much more potent greenhouse gas over 20 years than over 100.
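To make that concrete, here is a minimal sketch of how the choice of time horizon changes the CO2-equivalent figure for a given amount of methane. The GWP values (roughly 28 over 100 years and roughly 84 over 20 years) are approximate IPCC AR5 figures, and the tonnage is arbitrary:

```python
# CO2-equivalent of methane emissions under two different time horizons.
# GWP values are approximate IPCC AR5 figures; exact numbers vary by report.
GWP_100 = 28   # global warming potential of CH4 over 100 years
GWP_20 = 84    # global warming potential of CH4 over 20 years

methane_tons = 1_000  # hypothetical emissions of CH4

co2e_100yr = methane_tons * GWP_100
co2e_20yr = methane_tons * GWP_20

print(f"{methane_tons} t CH4 = {co2e_100yr:,} t CO2e on a 100-year horizon")
print(f"{methane_tons} t CH4 = {co2e_20yr:,} t CO2e on a 20-year horizon")
# The 20-year figure is roughly 3x larger, which is the point about
# undervaluing methane if the window for action is only a decade or two.
```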
Thanks for writing this. I think that it would be good if there were at least some EA investment in climate change, so that a) we gain a better understanding of the issue, b) we are in a better position to shift resources in this direction if we receive evidence that it is likely to be worse than we expect, and c) we gain the opportunity to spread EA ideas into the climate change movement.
I'm not proposing a huge amount of investment, but I'd love to see at least some.
I would like to offer a simple personal note that my focus and energy has turned away from EA to climate change only. I now spend all of my time and energy on climate related matters. Though I still value EA's approach to charity giving, it has begun to feel like a voice from the past, from a world that no longer exists. This is how it registers with me now.
I agree that I fall closer into this camp, where the action tends to be towards climate change and that immediate threat, while I play intellectual exercises with EA. The focus on animal welfare and eating a vegan diet helps the planet, and fighting malaria is related to climate change. Other issues such as AI and nuclear war seem far-fetched; it's hard for me to see the impacts of those threats without preying on my fears, while climate change has an impact on my daily life. As I said in my post above, EA taught me to ask 'what is the most pressing problem that humans face?' To me it is clearly climate change, and I moved on from there. I may be disappointed that EA has not drawn the same conclusion.
Thanks for your hard work in writing this. I was impressed by the depth of thought and how well it was linked to other articles.
1.
This seems like a wise comment. We should all be open to criticism, particularly if we are disagreeing or taking a nuanced view on something the majority of people think is really important or that they expect us to care about.
2. Overall, it seems reasonable to me that EA resources might be ill spent in working on climate change, since there is already so much money and so many people going to work there. Several of my non-EA friends are deciding to fight climate change of their own accord. I think it would be better if they used an EA methodology and it would be better if the people at the top of anti-climate change orgs used this methodology but it seems much of what you speak of is answered by the fact that additional EA members in that community may not (or may) do much additional good compared to their intervention elsewhere. I don't know how to judge that.
Perhaps, in regard to this, it would be better if, rather than getting jobs in these spheres, EA people were encouraged to vote wisely, donate wisely, engage with their communities, and be informed about climate change. Maybe this is what you had in mind anyway.
I think if EA folks got involved in trying to change the climate change methodology it could go a long way towards minimizing the amount of wasted effort that other people are putting into the climate problem.
Hey thanks for replying,
Sure, it's a question of maximising effect. I don't know what is best. 80k say it's not the most effective. I suppose you'd have to ask them how that explanation works.
Certainly it's a better thing to do than working on building bombs, but as to whether it's as good as AI policy, 80k says no.
What do you think?
I would recommend two books on climate change: Our Final Warning: Six Degrees of Climate Emergency by Mark Lynas and The Uninhabitable Earth: Life After Warming by David Wallace-Wells. My view after reading these two books is that in the worst-case scenario humans will still be able to hide in Antarctica for a very long time, but anything resembling current human civilization may not exist anymore. Imagining that the world will unite and take action when things start to get really bad is a bit of wishful thinking. Just look at what is happening in this pandemic.
Another consideration comes to mind: climate change is currently taking up a large amount of attention from competent altruistic people. If the issue were to be solved or its urgency reduced, some of those resources might flow into EA causes.
"[AI safety] is currently taking up a large amount of attention from competent altruistic people. If the issue were to be solved or its urgency reduced, some of those resources might flow into [climate change mitigation]"
So hurry up, Toon ;)
If you take this model a step further, it suggests working on whatever the most tractable problem is that others are spending resources on, regardless of its impact, because that will maximally free up energy for other causes.
Sounds like something someone should simulate to see if this effect is strong enough to take into account.
It's an interesting idea. Often the resource fungibility won't be huge so it may not make much difference, but in some cases it might.
It also seems to assume that it will use fewer total resources than working on both problems less intensively for a longer period. I would guess that it would usually be more efficient to divide resources and work on problems simultaneously, in part due to diminishing returns to investment. E.g. shifting all AI researchers to climate change would greatly hinder AI research but perhaps not contribute much to climate change mitigation, even assuming good personal fit of researchers, since there are already lots of talented people working on the issue.
But I've thought about this for less than 5 minutes so it might deserve a deeper dive. I'm not likely to do it, though.
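Since a simulation was suggested above, here is a minimal toy sketch of the diminishing-returns point. The square-root utility functions and the budget are entirely made up; it only illustrates why splitting resources can beat concentrating them, not anything about the actual cause areas:

```python
import math

# Toy model: two problems, each with diminishing returns to investment
# (square-root utility, chosen arbitrarily for illustration).
def total_utility(spend_a: float, spend_b: float) -> float:
    return math.sqrt(spend_a) + math.sqrt(spend_b)

budget = 100.0

# Strategy 1: concentrate the whole budget on one problem.
concentrated = total_utility(budget, 0.0)

# Strategy 2: split the budget evenly between the two problems.
split = total_utility(budget / 2, budget / 2)

print(f"All on one problem: total utility = {concentrated:.2f}")  # 10.00
print(f"Split across both:  total utility = {split:.2f}")         # ~14.14
# With concave returns, the even split wins, which is the intuition
# behind not shifting all researchers onto a single cause.
```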
Thanks all for your comments. A few friends have emailed me and made a couple of points about this post.
1. On the first-order effects of warming, the Stern Review figures are now 10 years out of date, and the IPCC SR1.5 expects worse impacts on welfare than previously stated, under all trajectories. See Chapter 5 of the report, and Byers et al. 2018.
2. A good source on the impact on GDP and societies through sea level rise is Pretis et al. (2018, Philosophical Transactions of the Royal Society)
3. The goal for climate change mitigation should be getting to net-zero emissions as fast as possible, as anything short of that still causes warming, and this goal is absent from much EA discussion and from the 80,000 Hours write-up.
4. Absent from these discussions are climate economists, who would be able to help us grapple with this more concretely. Some suggested economists to research (and for the 80,000 Hours Podcast) are Adair Turner, Simon Dietz, Cameron Hepburn, and Joeri Rogelj.
If there's already the goal of reducing emissions in general, with more reduction being better, is there any reason to add a goal about the zero level specifically? EA generally (and I think rightly) just cares about the expected amount of problem reduction, with exceptions where zero matters being things like diseases that can bounce back from a small number of cases.
I think the zero-goal matters because (1) if you plan for, say, 50% reduction, or even 66%, you might end up with a very different course of action than if you plan for 100% reduction. Specifically, I'm concerned that a renewable-heavy plan may be able to reduce emissions 50% straightforwardly but that the final 25-45% will be very difficult, and that a course correction later may be harder than it is now; (2) most people and groups are focused on marginal emissions reductions rather than reaching zero, so they are planning incorrectly. I trust the EA/rationalist ethos more than any other, to help this community analyze this issue holistically, mindful of the zero-goal, and to properly consider S-risks and X-risks.
Great read, although I think there are a couple of points, some already mentioned, that are worth highlighting further:
With regard to the details:
I would agree that the Stern report is now pretty old hat: it's over a decade old, and there has been plenty of work since then, both on the climate science side (e.g. the discussions below on TCR vs. ECS) and on the economics side (e.g. what discount rate and damage function to use to work out impacts on global GDP).
I'd also say that it is remiss to leave out the IPCC's latest report on 1.5 degrees, given that it includes all of the most recent literature (and also given that it took on board a fair bit of previous criticisms of IPCC reports, regarding presentation of data, discussion of probabilities etc.).
Beyond this, I would go further and say that, to be honest, discussions about tail-risks of 6 degrees of warming, based off of uncertainties in ECS or TCR values, are of second-order importance. I think you can make a very convincing case for acting on climate change simply based off of the evidence presented in the IPCC SR1.5:
I checked some numbers in order to illustrate the point about judging the impacts of climate change. (The specific paper cited in the SR1.5 that is relevant for this is Byers et al., 2018.)
Chapter 5 of the report cites a range of 62-457m fewer people who are 'exposed and vulnerable' at 1.5 versus 2 degrees of warming, based on three contrasting pathways out to 2100 (i.e. exposed to multi-sector climate risks and vulnerable to poverty, defined as income below $10/day).
Given that the IPCC only backs these findings with 'medium confidence', let's take the lower end of the range: ~62m fewer people exposed and vulnerable at 1.5 degrees. If we spread that out over the 80 years to 2100, it's about 800,000 people each year who are either vulnerable to poverty or susceptible to various risks. That's roughly twice as many people as die from malaria each year, one of the highest-priority causes for GWWC. Obviously, the people affected by climate change aren't dying each year, but they are more susceptible to poverty, in worse living conditions, at higher risk of extreme weather events, etc. If we work on the basis of reducing suffering, then surely the fact that comparable numbers of people are being affected right now should be discussed and taken into account far more.
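For what it's worth, the back-of-the-envelope arithmetic behind that comparison looks like this; the malaria figure of roughly 400,000 deaths per year is my own approximate input, not taken from the report:

```python
# Back-of-the-envelope version of the comparison above.
exposed_and_vulnerable_low = 62_000_000   # lower bound from SR1.5 Ch. 5 (1.5C vs 2C, to 2100)
years_to_2100 = 80                        # roughly 2020 to 2100

people_per_year = exposed_and_vulnerable_low / years_to_2100   # ~775,000
malaria_deaths_per_year = 400_000         # approximate recent annual figure (my assumption)

print(f"~{people_per_year:,.0f} additional people exposed and vulnerable per year")
print(f"~{people_per_year / malaria_deaths_per_year:.1f}x annual malaria deaths")  # ~1.9x
```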
This is just one example, from one paper cited in the report - there are obviously far more, addressing a very broad variety of impacts. Another great reference is a paper by Pretis et al. (2018, Philosophical Transactions of the Royal Society) that highlights both the aggregate impacts on global GDP and the significant geographical disparities in the impacts on economic growth: the largest relative impact of 2C vs. 1.5C falls on people on lower incomes in South-East Asia and Sub-Saharan Africa, predominantly living in coastal areas. This has been discussed in other comments as well, but I think that this specific paper is excellent.
In my opinion, there is little need to go into discussions about climate sensitivity, or tail-risks of 9 degrees, to make a strong case for weighting climate change more highly.
There is also the point about the adequacy of the ITN framework and the inclusion of 'urgency', which I think is a great one:
Beyond just the urgency of the problem with regard to timescales of action, there is also the totality of the issue, as discussed in John's summary report and mentioned below - we have to get to net-zero; you can't just reduce GHG emissions by 50%, 90% or even 99%.
This is a point that doesn't get discussed enough (beyond the fact that, for example, the report page on 80,000 Hours is rather old and out of date). Contrast this with, say, improving international discussions about nuclear disarmament, which could reduce the associated overall x-risk by 50%. A 50% reduction in GHG emissions will never be enough, because we'll still be warming things up. So, even though the marginal impact of a few keen EA people working on nuclear war, biohazards or AI might be very high, this neglects the fact that when it comes to climate change, the reductions have to be total, and we can't settle for less.
I think that this should also be taken into account in terms of 'urgency', or maybe it could somehow be factored into another part of the ITN framework. As you discuss in the post, I highly doubt that climate change is an x-risk, but I feel like this isn't really even the point: it doesn't need to be, nor does it even need to be a 'multiplier', in order to be taken seriously.
Your point about the bias towards intellectually stimulating problems is also fair, I think - there are always people who love to be 'bearish' and predict doom, but choosing to focus on other issues is a bit of a form of escapism. I think there might be something in the fact that climate work is, frankly, fairly tedious: continuous international discussions, the minutiae of national-level policies, or working out how to influence financial flows for infrastructure. Still, I think this point was a really good one, and unfortunately one that a lot of people probably need to hear.
These last two points together, I think, tie in with what you said about the tractability of climate change: it is reasonably tractable, which probably makes it less sexy to work on, which means that EA people avoid it, and also means that we will never achieve the goal of totally reducing GHG emissions to net-zero.
Finally, I really do think that featuring interviews with a few prominent climate scientists or economists should be a big priority, as you could easily thrash out any misconceptions, worries or misunderstandings on either front - there are loads based in Oxford or London, where lots of UK EA people are. For the science, people like: Myles Allen, Friederike Otto, Sir Brian Hoskins, Joeri Rogelj, Tim Palmer (mentioned in John's report). For the economics: Cameron Hepburn, Simon Dietz, Joeri Rogelj again. Just get them on Rob's podcast, and clear this stuff up!
Thanks for the article, and for the great discussion it has already prompted, as seen in the comments.
(See Ben's comment below for the brief...)
Some of this reasoning about social impacts, nonzero probability of severe collapse, dynamic effects, etc, applies equally well to many other issues. Your comment on S-risks - you could tell a similar story for just about any cause area. And everyone has their own opinion on what kind of biases EAs have. So a basic GDP-loss estimate is not a very bad way to approach things for comparative purposes. You are right though that the expected costs are a lot more than 2% or something tiny like that.
In Candidate Scoring System I gave rough weights to political issues on the basis of the long-run impact of ideal US federal policy. I estimated the global GDP cost of future GHG emissions at 26% by 2090, and used that to give climate change a weight of 2.9. Compare this to animal farming (15.6), existential risks from emerging technologies (15), immigration (9), zoning policy (1.5), and nuclear security (1.2).
For the same game theoretic reasons that make climate change a problem in the first place, I would expect polities to put too much emphasis on adaptation as opposed to prevention.
If the Burke et al. article that you're largely basing the 26% number on is accurate (which I strongly doubt), it seems like trying to cause economic activity to move to more moderate climates might be an extremely effective intervention.
What is wrong with it?
Economic activity already goes to wherever it will be the most profitable. I don't see why we would expect companies to predictably err.
And, even if so, I don't share the intuition that it might be extremely effective.
If the claims made here from p.13 on are true, it seems like the model can't be reliable. This also disagrees. In general, it seems intuitively like it would be extremely hard to do this kind of statistics and extrapolate to the future with any serious confidence or rely on it for an estimate without a lot more thought. (I haven't tried to look for critiques of the critiques and don't claim to have a rigorous argument.)
I was thinking if climate has effects on growth rate, companies may not be capturing the full costs/benefits from that. My intuition that it could be extremely effective was something like "if an extremely blunt tool like global average temperature can have big effects on growth through improving local temperature in more places than it worsens local temperature, you can probably get much bigger effects by optimizing local temperature in a fine grained way through changing the locations of things." Maybe that's wrong, I don't know.
OK, CSS5 will address this by looking more broadly at the literature and the articles you cite, or maybe I will just focus more on the economist survey.
Thanks for this post. Appreciate the "empirical ideas about tackling climate change" but also found the concepts of climate change multiplying very bad outcomes useful.
I wanted to pick up on the "urgency" idea. Doesn't urgency just mean that there are more ways in which it is important, because it has an interactive effect with other issues? I.e. considering urgency means that the importance/scale is high now, even if it might not be as high in the future?
Happy to be challenged on this; I use the ITN framework a lot (I'm sure we all do), so substantial criticism of that model seems worth delving into.
I'd agree that "urgency" is subsumed by "importance", but it's also worth pointing out explicitly, as something that might be overlooked if it is not mentioned.
Yes, the urgency point could indeed fall within the importance lens as you suggest. My concern was that some crude measures of importance didn't consider this interactive effect in a dynamic world.
In Owen C-B's 'Prospecting for Gold' talk, he briefly talks about urgency as part of tractability (something tractable now could be less tractable in the future).
I argued in my 80,000 Hours podcast episode that there might be something to a separate component of urgency. We generally say cost-effectiveness is something like the total increase in utility per dollar, not time-discounted. This can be worked out for AI and alternate foods, which we have done here. Let's say they have the same cost-effectiveness, so we should be putting money into both of them. However, because there is a higher probability that agricultural catastrophes will happen in the next 10 years than AI catastrophes, the optimal course of action is to spend more of the optimal amount of money on alternate foods over the next 10 years than on AI. A way of thinking about this is that the return on investment of alternate foods is significantly higher. We might even be able to monetize that return on investment by making a deal with a government, and then have more money to spend on AI.
This logic applies for climate disasters that could happen soon, like coincident extreme weather causing floods or droughts on multiple continents. However, I don't think it applies to tail risk of climate change (greater than 5°C global warming) because that could not happen soon. Of course one could argue that we should act now to reduce climate tail risk. However, if there are many other things we can do to increase welfare with a higher return on investment, we should do those things first. And then we will have more money to deal with the problem, such as paying for expensive air removal of CO2.
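To illustrate the urgency point with invented numbers (these are not the figures from the linked analysis): suppose two interventions are equally cost-effective over the whole century, but a much larger share of one catastrophe's risk falls in the next decade.

```python
# Toy illustration of the 'urgency' argument above. Both interventions are
# assumed equally cost-effective over the whole century; the only difference
# is how much of the risk is concentrated in the next decade. All numbers
# are invented for illustration.
BENEFIT_PER_DOLLAR_TOTAL = 10.0   # total expected benefit per dollar, same for both

share_of_risk_in_next_decade = {
    "alternate foods": 0.5,   # assume half the century's agricultural-catastrophe risk is near-term
    "AI safety":       0.1,   # assume most of the AI risk comes later
}

for name, share in share_of_risk_in_next_decade.items():
    near_term_benefit = BENEFIT_PER_DOLLAR_TOTAL * share
    print(f"{name}: expected benefit per dollar realised within 10 years = {near_term_benefit:.1f}")
# Equal total cost-effectiveness, but a much higher near-term 'return on
# investment' for the cause whose risk arrives sooner, which is the argument
# for front-loading spending on it and redeploying resources later.
```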
In your piece you said "John Halstead <...> argues that more emissions and warming might create destabilisation and nuclear war, but it's hard to see exactly how."
In case it helps, this Economist article may add some more colour on this: https://www.economist.com/international/2019/05/25/how-climate-change-can-fuel-wars
Thank you for opening this topic and sharing your thinking on this. I am quite new to EA and have been working on initiatives to try and help avert the worst outcomes of climate change on-and-off for several years, so very much welcome the discussion about increasing the focus on that in EA.
I wanted to address two main points:
1. If we expand our definitions of x-risk and s-risk to include the extinction of any species of life, or suffering of any life forms, then that dramatically increases the weight you might place on addressing climate change (or more broadly, biodiversity loss).
The recent UN IPBES assessment (basically the IPCC for biodiversity loss) finds that up to 1 million species are at risk of extinction due to humanity's impact on biodiversity (including the contribution of climate change).
Once extinct, each of those species is gone, forever, along with any form of value it had been creating and could potentially create in the future, for humans, other life, or living ecosystems in general.
Together with the excellent arguments you present for increasing our focus on climate change due to x-risks and s-risks to humans, this makes climate change a compelling problem to tackle right now, especially if you consider urgency. We have a time window of only 10–30 years in which to dramatically transform how we generate and use energy globally (which basically means transforming most of the underlying infrastructure and systems for how we live, stay warm, eat, move around and make things).
2. I also wanted to address your question of whether climate change can actually be tackled meaningfully (specifically mitigation, i.e. cutting emissions), as that is something I have done some personal research on since the latest Extinction Rebellion global actions.
A lot of good work has been done in this area, with organisations like Drawdown compiling (and quantifying) solutions for the global context.
Zooming in on the UK perspective, the Committee on Climate Change, who independently advise the Government on the topic, have done rigorous modelling to assess how fast the UK could move to net-zero carbon emissions. They concluded that we can "feasibly" achieve this by 2050 with an annual investment of 1–2% of GDP.
The question this raises for me is: could we therefore invest at a faster rate to set a more ambitious goal e.g. 2025 (as Extinction Rebellion demands), 2030 (as Norway has set) or 2035?
UK GDP is about £2,000bn.
Naively, instead of investing 2% of GDP (£40bn/year) x 30 years = £1,200bn in total, could we instead choose to invest 6% of GDP (£120bn/year) x 10 years = £1,200bn in total?
And by doing that, could we set an example to the world that increases global ambition and causes many other countries to set more aggressive targets, dramatically increasing our chances of avoiding the most catastrophic climate change outcomes, alleviating much human and animal suffering and saving many species from extinction?
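For what it's worth, the naive arithmetic in the question above checks out and is easy to vary; this sketch just uses the ~£2,000bn GDP figure already quoted and assumes a flat share of GDP is spent each year:

```python
# Naive comparison of the two investment schedules mentioned above,
# assuming a flat share of GDP is spent each year (a big simplification).
UK_GDP_BN = 2_000  # approximate UK GDP in GBP billions, as quoted above

def total_investment(gdp_share: float, years: int) -> float:
    """Total spend in GBP billions for a flat share of GDP over a number of years."""
    return UK_GDP_BN * gdp_share * years

slow = total_investment(0.02, 30)   # 2% of GDP for 30 years (net zero by ~2050)
fast = total_investment(0.06, 10)   # 6% of GDP for 10 years (a much earlier target)

print(f"2% for 30 years: £{UK_GDP_BN * 0.02:,.0f}bn/year, £{slow:,.0f}bn total")
print(f"6% for 10 years: £{UK_GDP_BN * 0.06:,.0f}bn/year, £{fast:,.0f}bn total")
# Same headline total; the open question is whether spending can be scaled
# up that fast without running into steep diminishing returns or bottlenecks.
```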
For those interested in more detail, I've written more about specifically what I believe needs to happen in the UK here: UK declared a Climate Emergency — what needs to happen now.
I believe that coming together as a global society to solve climate change (as we did in the 1980s with ozone depletion, which was significantly less complex) will give us a huge amount of confidence and experience as a species in tackling large-scale risks.
I would love everyone's thoughts on my arguments above, and on the viability and cost-effectiveness of a 10–15 year burst of investment in climate change mitigation now to avoid cross-species x-risks and s-risks.
A recent study using machine-learning-assisted simulations could change the risk assessment for extreme scenarios of climate change. Fortunately, the possible tipping point is estimated at the very high level of 1,200 ppm CO2.
https://phys.org/news/2019-02-high-co2-destabilize-marine-layer.html
https://climatenewsnetwork.net/carbon-rise-could-cause-cloud-tipping-point/?fbclid=IwAR3IF-GemzXGXzsLyq62FI2HUmvFfzpsaUyq1ET3PvU-gu-8Atr01iSCRVM
Yes, absolutely... the 80K podcast occasionally pays lip service to it by saying "we agree with the scientific consensus", but it doesn't seem to go much further than that.
Note that 80K has written a profile of the topic which links to many recommended resources.