  • This is a persistent misunderstanding, and I wish more people understood it.

    Manufacturing things creates emissions: it costs energy and materials. Something could have absolutely no emissions in use and still be problematic at growing scales, because manufacturing it costs energy, emissions and resources. Hard drives wear out, die and need replacing. Researchers know how to account for this; it's called a life cycle assessment. These calculations aren't perfect, but this is robust work.
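
    To make the life-cycle point concrete, here's a minimal back-of-the-envelope sketch in Python. All the numbers are invented placeholders (not figures from any real LCA or from the paper linked below); the point is only the shape of the calculation: embodied manufacturing emissions get amortised over the device's lifetime, so even a device with zero use-phase emissions has an ongoing footprint.

    ```python
    # Toy life-cycle-style amortisation: every number is an illustrative placeholder,
    # not real LCA data. The structure of the calculation is the point, not the values.

    def annual_footprint_kg(embodied_kg, lifetime_years, use_phase_kg_per_year):
        """Embodied (manufacturing) emissions spread over the device lifetime,
        plus whatever the device emits per year while in use."""
        return embodied_kg / lifetime_years + use_phase_kg_per_year

    # A hypothetical storage drive: zero use-phase emissions, but it still
    # contributes every year because it wears out and must be remanufactured.
    drive = annual_footprint_kg(embodied_kg=30.0, lifetime_years=5, use_phase_kg_per_year=0.0)
    print(f"Hypothetical drive: {drive:.1f} kgCO2e/year despite zero use-phase emissions")

    # Scale matters: a fleet of one million such drives.
    print(f"Fleet of 1e6 drives: {drive * 1_000_000 / 1000:.0f} tCO2e/year")
    ```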

    IT accounts for up to 4% of global emissions and the sector is growing. People consistently act as if digital media has no footprint, but it does. https://www.sciencedirect.com/science/article/pii/S2666389921001884

    Yes, the headline is a little silly, but we actually do need to think strategically about the sector, and that starts by realising it has an impact and asking ourselves what the priorities are that we want to save whilst we decarbonise the industry that supports them.

    There's no wiggle room left: no sector or set of behaviours can afford to be given slack. We are in the biggest race of our lives and the stakes are incomprehensibly huge.


  • The answers to your questions are: yes, it's a different baseline to the one chosen by the Paris Agreement; different baselines are relevant to different elements of the issue. The baseline chosen in your link likely comes down to what reliable data they have, so they choose a baseline from a region of data they hold rather than going to other sources. This website provides the latest year's official record in Paris terms, and I would expect the next one (2024) to be much closer to 1.5°C. On (2), I agree that current measurements suggest an instantaneous/yearly temperature of around 1.5°C against the relevant baseline. On (3), you are right that the trend is unlikely to change, because it comes from radiative forcing (emissions) that has already occurred, so even with sudden zero human emissions we would see an increase or, best case, a levelling (before, perhaps, a long-term decline as CO2 is naturally removed from the atmosphere, or faster if humans find a way of doing so at scale).

    A trend, however, is already an average of several time points, and you can see in the link you shared that year-on-year variation in that number can be as high as ~0.3°C. This comes from non-GHG elements of the system (such as El Niño) that add natural variation. You can already see the drop of ~0.2°C from 2019 even though the trend is up. So you could expect us to potentially drop back down to, say, 1.2°C for a few years before it goes up again. The link above suggests that, on the best data we have, we would likely breach 1.5°C by 2031, so not long at all.
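
    As a rough illustration of the trend-versus-variability point (synthetic numbers only, not observational data), you can simulate a steady upward trend with ENSO-like noise on top and see individual years coming in below earlier ones even while the underlying trend rises:

    ```python
    import random

    random.seed(0)

    # Synthetic illustration only: a steady warming trend plus ENSO-like
    # year-to-year variability. None of these numbers are observational data.
    TREND_PER_YEAR = 0.02   # degC/year underlying trend
    NATURAL_SWING = 0.15    # degC amplitude of year-to-year variability
    START_ANOMALY = 1.0     # degC above the chosen baseline

    anomalies = [
        START_ANOMALY + TREND_PER_YEAR * year + random.uniform(-NATURAL_SWING, NATURAL_SWING)
        for year in range(20)
    ]

    for year, a in enumerate(anomalies, start=1):
        print(f"year {year:2d}: {a:+.2f} degC")

    # Even with the trend firmly upward, consecutive years can still come in
    # a couple of tenths of a degree cooler than the year before.
    dips = [round(b - a, 2) for a, b in zip(anomalies, anomalies[1:]) if b < a]
    print("Negative year-on-year changes despite the rising trend:", dips)
    ```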

    This sounds like a pedantic point, but it's actually quite important for the climate, and the confusion stems back to how the problem and the climate science were chosen to be communicated. Temperature was chosen in part because it's a proxy for other parts of the system, the ones that actually control the impacts, and it was felt that temperature would be "naturally understandable" by the general population (and politicians…). This backfired a bit, because 1.5°C is not a lot of difference when considered in, say, a room, and that highlights why this variable is different and why it matters that it's a decadal average rather than a yearly one.

    So if temperature is only a proxy, what are the variables that control the outcomes? One key one is the total heat energy stored in the different Earth systems, and there the size of the storage medium matters: the reason 1.5°C across the whole world is a lot, but in a room isn't, is that because of the sheer volume of the Earth you need a huge amount more energy. The other place where surface temperature adds confusion and complexity is the oceans: they have been absorbing some of the heat, and that hasn't always been visible to us (we don't live in the ocean), so if we stopped emitting today the ocean might then deposit some of that heat energy back into the atmosphere. It's a complex interaction. What we really need to know is the additional level of radiative forcing and how much additional heat energy is swimming about in Earth's systems; that is what will control the experience we have of the climate. Greenhouse gases act to stop Earth cooling back down by radiating out to space, which is why the effect is cumulative, so the difference between a sustained year-on-year 1.5°C and something that averages less but has a few years at 1.5°C is quite high, because they result in different amounts of total energy in the system.
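
    Here is a very crude way to see the "cumulative energy" point in code. The anomaly paths are invented, and treating each year's anomaly as a proxy for that year's extra heat uptake glosses over the real forcing/response physics, but it shows why a sustained 1.5°C and a path that merely touches 1.5°C are not the same thing:

    ```python
    # Crude illustration of why a sustained 1.5 degC differs from a path that only
    # touches 1.5 degC in some years. Anomaly values are invented; using the yearly
    # anomaly as a stand-in for that year's extra energy uptake is a gross
    # simplification of the real forcing/response physics.

    sustained = [1.5] * 10                                            # ten years pinned at 1.5
    spiky     = [1.2, 1.3, 1.5, 1.2, 1.1, 1.5, 1.2, 1.3, 1.5, 1.2]   # touches 1.5 occasionally

    print("Mean anomaly  - sustained:", sum(sustained) / len(sustained))
    print("Mean anomaly  - spiky:    ", sum(spiky) / len(spiky))

    # The crude proxy for accumulated extra energy is the running sum of anomalies:
    print("'Accumulated' - sustained:", sum(sustained))
    print("'Accumulated' - spiky:    ", sum(spiky))
    ```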

    So, the short answer is that the Paris Agreement targets are set on the basis of what a decadal rise of 1.5°C by 2100 (i.e. the average over 2090-2100) means in terms of the excess heat energy and radiative forcing in the system. The limit itself is somewhat arbitrary, driven in part by the fact that we were at ~1°C when it was agreed and 2°C seemed like a reasonable estimate of something we might be able to limit it to. The origin of 1.5°C rather than 2°C is actually quite interesting and highlights a lot about how climate change policy gets decided, but this post is long enough.

    This is a good point. The sheer apocalyptic magnitude of the problem means that every tiny amount of change matters. Billions will die; there probably isn't a way to prevent that completely anymore. But if we can tick things down by a fraction and save a few hundred thousand people, or preserve a species of food crop that would otherwise have gone extinct... IDK what the exact outcomes are, but the point is that tiny changes will have a massive impact, and they're important even if the situation is dire.

    Agreed, I think this is the right way of thinking about it. The risk of having communicated it to the world as a binary 1.5°C/2°C target is that people completely switch off if/when we finally confirm we've breached it, when the reality is that it should embolden us further, not demoralise us. This is my number one concern at the moment. I would also add that what we are doing is "pushing" a system away from its natural equilibrium, and if we push hard enough we might cause changes in the system itself which are very hard or impossible to undo. So it's more than just "more increase, more damage"; it's also about the risk of fundamentally and permanently changing the system.

    A potential energy surface with local and global minima to demonstrate how forcing can shift the fundamental equilibrium the system operates in

    As an analogy, think of a ball sitting in the well of a local minimum while we push it back and forth. If we hit it hard enough, rather than coming back it goes and finds another minimum, which is just a whole different system from the one we are used to. These are sometimes called tipping points, and the frustrating thing about the complexity of the systems is that we don't and can't know for sure where those points are (although we do know the risks increase heavily as you move above 1.5°C). They are, by definition, hard to model, because models are built up from prior experience (data) and these are in part unprecedented changes in the atmospheric record.
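
    A minimal sketch of the ball-in-a-well analogy (pure toy dynamics, nothing calibrated to the real climate): an overdamped "ball" in a double-well potential drifts back to its original well under a small push, but hops into the other well, and stays there, once the push exceeds a threshold.

    ```python
    # Toy tipping-point model: overdamped motion in a double-well potential
    # V(x) = x**4/4 - x**2/2, so dV/dx = x**3 - x. The wells sit at x = -1 and x = +1.
    # Nothing here is calibrated to the climate; it only illustrates the analogy.

    def final_state(push, steps=20000, dt=0.01):
        """Start in the left well (x = -1), apply a constant push for a while,
        then release it and report where the system settles."""
        x = -1.0
        for i in range(steps):
            forcing = push if i < steps // 2 else 0.0   # push, then release
            x += (-(x**3 - x) + forcing) * dt           # overdamped descent plus forcing
        return x

    for push in (0.1, 0.3, 0.5):
        x_end = final_state(push)
        well = "left (original) well" if x_end < 0 else "RIGHT well - tipped to a new state"
        print(f"push = {push:.1f} -> settles near x = {x_end:+.2f}: {well}")
    ```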

    A slide about tipping points showing how it's like a game of Minesweeper: the further we "dig" down (more temperature increase), the more "mines" (tipping points) we risk hitting.

    I haven't mentioned "negative emissions" technologies, but it is worth saying that in principle we could end up doing significant negative emissions, which might mean ending 2100 at 1.5°C whilst having spent a period of time above it; negative emissions technologies could be a whole other rant, though. Worth noting that lots of the pathways that show we could just about keep to 1.5°C do rely on negative emissions to different degrees (though the pathways are also limited in how far they think we might be able to push our economic systems).


  • I see this misconception a lot and it's really unfortunate. We aren't at what climate scientists call 1.5°C. Being at 1.5°C means the average anomaly being over 1.5°C for a period of decades (a toy example at the end of this comment shows the difference). It isn't just a case of scientists being cautious; it's a completely different impact on the climate, implying different levels of damage and different amounts of heat energy in the whole system.

    Yes, we have hit 1.5°C over the last 12 months, partly down to El Niño, which is expected to subside shortly. There is some discussion about whether this year was an expected random anomaly or whether it suggests some feedback loop has been underestimated, but we can't know until enough time has passed (maybe a year).

    All that means both that the impacts we are already seeing are milder than what you'd expect at a long-term 1.5°C (and therefore we should be extremely worried), and that this has been factored into our estimates of what outcomes are possible (though the 1.5°C window is increasingly narrow because, as you say, we still have our foot on the gas). So there is still time to make an impact, and every fraction of a degree and every kg of CO2 matters.
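
    To make the decadal-average distinction concrete, here's a toy calculation with invented anomaly values (not real observations): the most recent year can sit above 1.5°C while the multi-decade average, which is what "being at 1.5°C" actually refers to, is still well below it.

    ```python
    # Toy numbers only: twenty years of yearly anomalies (degC) where the most
    # recent year pokes above 1.5 but the long-term average remains well below it.
    anomalies = [1.05, 1.08, 1.02, 1.12, 1.10, 1.15, 1.09, 1.18, 1.14, 1.22,
                 1.17, 1.25, 1.20, 1.28, 1.24, 1.31, 1.27, 1.35, 1.30, 1.52]

    latest_year = anomalies[-1]
    twenty_year = sum(anomalies) / len(anomalies)

    print(f"Most recent year: {latest_year:.2f} degC (above 1.5)")
    print(f"20-year average:  {twenty_year:.2f} degC (what 'being at 1.5' refers to)")
    ```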



  • Just to be clear, I wasn't being facetious; I'm genuinely curious as to the specifics, as I'm not as familiar with haulage.

    I suspect there is an argument that we've made cargo transport too cheap and it's skewed the economics of local vs outsourced production.

    My preference would be pantograph systems on the motorways and main routes, which we could roll out quite quickly and which would remove the majority of emissions, coupled with a systemic look at our material needs and local production capacities with a view to lowering volumes.

    The Silvertown Tunnel (and Lower Thames Crossing) in London would be a good example of where we are rebuilding our infrastructure, at great public expense, around sustained and increased haulage along certain routes, so I guess this could be considered an indirect subsidy.