[00:00:00:01] This is our last talk: Embracing Uncertainty-- how society deals with not knowing in engineering, policy, and climate science. This talk is less about the climate science or economics and more a cross-disciplinary way of interconnecting some of the topics we've heard about this week. So I'll touch on a couple of things you might have heard before, try to tie them together, and then finally open up a discussion with all of you to see what we've learned and what we can maybe implement in our communities.
[00:00:40:26] A few words about myself. My name's Christoph Tries. I'm a first-year master's student in the Technology and Policy Program, just like Emile. My background is in engineering management; I did a bachelor's in Germany. I do not have five years of experience working in the field, as Emile has-- and that was really a wonderful presentation. I was amazed by all the insights and knowledge you shared with us.
[00:01:08:21] I do, however, study decision-making under uncertainty as a research assistant at the Joint Program, where I've been since last September. Over the last semester, I've taken a couple of courses on decision-making under uncertainty and on incorporating uncertainty into power systems projects. I'd like to share some of those experiences and insights, connect them with other pieces I've picked up at talks here and there, and then maybe engage in discussion afterwards.
[00:01:45:12] The agenda for my talk-- first, an introduction to uncertainty, with a few definitions, so we actually know what we are, and are not, talking about. Then a recap of uncertainty in climate science. Then "the forecast is always wrong"-- I'll get to what exactly that means later. And decision-making under uncertainty in engineering and policy.
[00:02:10:28] These first four sections will be under the assumption that we know climate change is happening-- we don't doubt the facts that are and have been presented. But in the fifth one, communicating climate science to the public, I'd like to take a step back and actually face one of the challenges that we do, in fact, face.
[00:02:36:09] Emile touched on how tough it really is to get legislation passed. This is not only because the economics is tricky and incentives are not always rightly aligned, which is definitely a big part. A big part is also public perception, and so we're going to touch on that in the end.
[00:02:57:26] The objectives-- what you should take away from this talk. First, recap the big fields of uncertainty in climate science. Second, recognize uncertainty as something we face every day and everywhere, because it actually is all around us-- not only in climate change but in all the other fields we face in our daily lives and in our professions.
[00:03:23:26] Third, learn about examples of decision making under uncertainty. So accepting that we do face uncertainty everywhere and every day, how can we actually make smart decisions and not just throw our hands up in the air and say, well, we don't know, so we can basically do anything.
[00:03:42:02] And finally, think about and discuss how uncertainty in climate science can be effectively communicated to the general public. I think that could be a very interesting topic. I'd like to share some of my thoughts and learning [INAUDIBLE] and also hear your impressions.
[00:04:08:18] So, introduction to uncertainty-- a Wordle. There are a lot of things you could connect with uncertainty: deep uncertainty, planning, adaptation, pathways, climate up there on the left. But first, a couple of definitions of what we actually talk about. We'll see that there are a lot of definitions of uncertainty out there, which actually causes a lot of problems, especially in climate science.
[00:04:43:02] First of all, uncertainty and risk always present as a pair. In the public, risk is usually negatively connoted-- it's something you want to avoid, something that will hurt you-- while uncertainty is something we don't like, but it's not necessarily positively or negatively connoted.
[00:05:00:27] A very old definition by Frank Knight from 1921-- he sees risk as the measurable aspect of uncertainty, so basically a known probability distribution of an event. We're not going to know what will actually happen-- will it rain tomorrow or will it not? But we might be able to say there's a 40% chance it rains and a 60% chance it does not. So we face a risk. And uncertainty, in his understanding, is not measurable. In other contexts, that's referred to as deep uncertainty-- we actually cannot put numbers on it.
[00:05:39:02] In the financial industry, risk is often defined relative to your objective: whatever you're trying to achieve, the risk is the uncertainty that actually has an effect on your objective function or on your goal. So yet another area, yet another definition of risk and uncertainty.
[00:06:07:05] A distinction not connected to any particular area is the one between aleatoric and epistemic uncertainty. Aleatoric uncertainty is statistical uncertainty-- basically, minor variations in the study conditions. As an example-- well, the Super Bowl is on Sunday-- in football practice they have these machines that throw the football out straight away, and you could probably set that machine up so that it would put the football on the same trajectory every single time. However, there are little variations-- maybe some turbulence in the air, or the material of the football is unnoticeably thinner on one side-- that lead to the football not following the exact same trajectory but a slightly different one.
[00:07:01:05] When we categorize this as aleatoric uncertainty, it means we cannot measure it. The material might be thinner, but by so little that it's unmeasurable for us. Obviously, we'd like to get better at our measurements, and then we can sort of move the boundary between epistemic and aleatoric uncertainty, but within these concepts, aleatoric uncertainty cannot be measured.
[00:07:28:08] Epistemic uncertainty, on the other hand, is a systematic uncertainty. Maybe we neglect an effect-- for example, the friction of the air on the ball. If we build a model to predict where the football lands and we don't account for the friction, then we will be wrong every single time. But taking that into account and including it in our model would give us better results.
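To make the distinction concrete, here is a small Python sketch with my own illustrative numbers (not from the talk): a drag-free model of the football's flight is biased in the same direction every time (epistemic error from a neglected effect), while small random launch variations scatter individual throws around the true trajectory (aleatoric noise).

```python
import random
import math

G = 9.81  # gravity, m/s^2

def range_no_drag(v0, angle_deg):
    """Predicted range of a projectile, neglecting air resistance
    (the epistemic simplification in our model)."""
    a = math.radians(angle_deg)
    return v0**2 * math.sin(2 * a) / G

def range_with_drag(v0, angle_deg, drag=0.05, dt=0.001):
    """'True' range, simulated step by step with simple linear air drag."""
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    while y >= 0.0:
        vx -= drag * vx * dt          # drag opposes horizontal motion
        vy -= (G + drag * vy) * dt    # gravity plus drag on vertical motion
        x += vx * dt
        y += vy * dt
    return x

random.seed(0)
# Aleatoric uncertainty: tiny, effectively unmeasurable variations in launch speed.
throws = [range_with_drag(25 + random.gauss(0, 0.1), 40) for _ in range(100)]
model_prediction = range_no_drag(25, 40)

# The drag-free model overshoots every single throw (systematic, epistemic),
# while the throws themselves scatter only slightly (random, aleatoric).
print(model_prediction, sum(throws) / len(throws))
```

Accounting for drag in the model removes the systematic error; the small scatter between throws remains, which is exactly the boundary the talk describes.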
[00:08:02:12] A final distinction is between the levels of uncertainty-- also just one of many that are out there-- ranging from determinism, where we know perfectly what's going to happen, to total ignorance, where we have no idea. On level one, we have a point forecast: we know where it's going, it's one forecast, and we're pretty sure that's what's going to happen.
[00:08:27:29] On level two, there are a couple of scenarios, each maybe with a probability, and we trust that one of those scenarios will take place. Level three uncertainty means we have a range, but we can't pinpoint which exact element of that range will take effect. And on the fourth level, deep uncertainty, we know something will happen, but in which direction the ball is going to move, we have no idea. Obviously, total ignorance is the furthest step.
[00:09:08:06] So we see uncertainty has a lot of different connotations, and once we move into uncertainty in climate science, we'll see that this causes practical difficulties when we try to make climate policy.
[00:09:31:15] An important distinction-- this was the disclaimer from the beginning-- we don't question whether climate change is real or not, but we are uncertain about the magnitudes of specific effects of climate change.
[00:09:49:15] This is a recap-- [? Justin ?] [? Maduro ?] presented this graph yesterday. It's a classic model; you've probably seen this or similar graphs before, where the global mean temperature rises due to climate change over the years. The different shaded areas are the uncertainty ranges that we know, or assume, the global mean temperature rise is going to fall within.
[00:10:23:03] We have this orange band here-- that's the internal variability. Justin explained it with phenomena like El Nino and La Nina. Just as we cannot predict the weather for tomorrow, we cannot predict the exact state of the climate for the next couple of decades. We see this band stays pretty constant-- it does not really spread out over the years. That's internal variability.
[00:10:53:20] We have model spread. If we look at different models, they are calibrated a little differently. Justin talked about the cubes we divide the earth into-- they might have different sizes, or slightly different assumptions about how effects transfer from one cube to another.
[00:11:22:03] So between our models, we have a certain spread and don't know exactly where we're going to land. And we actually see the most variability over time in our representative concentration pathways, RCPs, of CO2 emissions, or of greenhouse gases in general. Basically, they capture how much climate policy our countries are going to implement. If there are a lot of policies, we'll likely be on one of those lower paths. If there is no policy at all and we just keep adding more greenhouse gases to the atmosphere, we're going to end up up here.
[00:11:58:17] But we really can't say at this point which of these actual actions humans will take. So all of these factors together give us a pretty big spread of uncertainty. So we're actually not able to predict what the climate will be like at the end of the century.
[00:12:20:20] Here, also as a recap-- a definitely incomplete list of causes of uncertainty. We've gone over the climate cycles-- El Nino, La Nina. One of the big hot topics on the science side is aerosol-radiation interaction: how do clouds form? The physical processes are just not really understood yet. Understanding them better will enable us to get rid of some of these epistemic uncertainties-- we can see that there is some effect, but we can't really quantify it yet. So that's one area of research to reduce that uncertainty spread in the future.
[00:13:01:06] And we've also talked about climate negotiations. There are actually studies that try to predict the outcome of the Paris Agreement. Maybe not surprisingly, they didn't find very good results with the model they used, but it's one of the most complex issues to predict regarding climate science and climate change.
[00:13:22:03] Other reasons why we have uncertainty: too-coarse time steps-- basically, many models just go in entire years. Too-coarse spatial resolution-- we talked about being down to 120 kilometers, I think, for the sides of the grid cells in our climate models, but obviously, that could and should be much finer to get better results.
[00:13:47:19] Representations of feedbacks are not complete. Some climate variables are difficult to observe with our climate monitors, et cetera. And unknown unknowns-- uncertainty we have not even discovered yet, that we cannot attribute to specific things.
[00:14:11:09] In the IPCC reports, uncertainty obviously plays a big role. And there has been a lot of progress in trying to bring together these definitions of uncertainty-- to actually agree on one definition of uncertainty regarding climate change, at least within this expert group that is at the forefront and considered the leading panel of communications on climate change.
[00:14:46:25] Just as an example, in Assessment Report number 4, we had likelihood as one measure of uncertainty, which was the chance of an event happening, judged quantitatively by experts who just gave values of 1-10 to those events. It's not that bad a measurement-- those experts actually do understand quite a lot about what they are judging. Nonetheless, putting a quantitative number on it, as we'll see later, does not really make a lot of sense.
[00:15:19:18] The second measure was confidence-- the degree of consensus on the experts' statements. So how confident are we that the experts' statements are right? It's like a second-order measurement, also quantitative, assigned as a probability. And they had a third, rather qualitative approach.
[00:15:41:28] The different working groups on Assessment Report number 4 used different combinations of those uncertainty definitions. One working group only used the likelihood; one used a combination of both; and I'm not really sure what the third working group was working with. But even inside the IPCC, there was no agreement. They had formulated those definitions of uncertainty, but the different groups had not really coordinated to speak with one voice.
[00:16:13:08] What are the years of the AR4 and the AR5?
[00:16:16:13] AR4 was 2007, I believe-- please correct me. And AR5 was more recent, 2015. OK. So from AR4 to AR5, there was a lot of thought. They saw this as a problem they had to fix, because people couldn't really work with the uncertainty values that were attached to, let's say, how much sea level is going to rise, or other effects that came up as results of the Assessment Report.
[00:16:58:15] So for AR5, the old concept of confidence was thrown out, and the new concept of confidence was now the validity of a finding on a qualitative scale. They said, well, this quantitative approach didn't really help us-- so the validity of a finding is basically: does the finding in our report actually match what's happening?
[00:17:25:05] So it's sort of a truth statement about the findings, and it was appraised on a five-step qualitative scale, from very low to very high. The idea behind deliberately not putting numbers on it was that you don't give people the impression it's on a numerical scale where a two is double the value of a one. By putting it in this qualitative way, people get a better grip on the uncertainty and can make better inferences.
[00:17:58:02] And the likelihood was then the probabilistic measure of uncertainty in the findings. So here you did have a quantitative scale-- I think most of you have seen the likelihood scales on the next slide. The probabilities of certain outcomes are translated from these very nice probabilistic percentages back into the terms you might have read in the reports-- first it was "likely," then in the AR5 report it was "very likely."
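As a rough illustration, that calibrated language can be thought of as a lookup from probability to term. The thresholds below follow the commonly cited IPCC AR5 likelihood scale, simplified to a subset of its terms; treat the exact cutoffs as an approximation of the guidance note, not an authoritative encoding.

```python
def ipcc_likelihood(p):
    """Map a probability (0-1) to an AR5-style likelihood term.

    Simplified subset of the IPCC AR5 likelihood scale; thresholds
    approximate the published guidance."""
    if p > 0.99:
        return "virtually certain"
    if p > 0.90:
        return "very likely"
    if p > 0.66:
        return "likely"
    if p >= 0.33:
        return "about as likely as not"
    if p >= 0.10:
        return "unlikely"
    if p >= 0.01:
        return "very unlikely"
    return "exceptionally unlikely"

print(ipcc_likelihood(0.95))  # "very likely"
print(ipcc_likelihood(0.70))  # "likely"
```

The translation loses resolution on purpose: a reader sees "very likely" whether the experts assessed 91% or 99%, which is exactly the trade-off between precision and communicability the talk describes.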
[00:18:37:23] So that was the move from AR4 to AR5. As a side note, even AR5 is not conceptually perfect-- there has been some criticism of the definitions of uncertainty in the AR5 report. One study I found on that was by Aven and Renn, who looked at the report, were not satisfied, and proposed a different conception, a different framework, to cover uncertainty.
[00:19:14:03] As an example, they suggested that the strength of knowledge should be assessed, not only the probability. You could say this event has a 20% chance of happening and this event has a 40% chance of happening. But what also matters is how much data you have to back that up. You might be really sure, with a lot of data, about the 20% chance of sea level rise above 20 centimeters, but you might not have a lot of data to support the probability you assign to hurricane occurrence.
[00:19:49:27] So what they suggested-- and what, surprisingly, was missing so far-- was some measurement of how valid those probability attributions actually are. There's a lot of work remaining, and AR6 will hopefully have a better, more coherent approach to quantifying and qualifying uncertainty. But this disagreement at the expert level then trickles down, as we'll see later.
[00:20:29:21] So, a partial summary of the first two sections. We have different conceptions and classifications of uncertainty. Even among the experts, there's no agreement-- basically what I've just told you. And there is really a push to have a coherent methodology for quantifying uncertainty.
[00:21:01:05] The forecast is always wrong. This is the title of a lecture by Professor de Neufville, who's in IDSS, the Institute for Data, Systems, and Society. I took his class last semester, and the following slides are based on a lot of the examples he brings in his class. What I want to get across here is that we face uncertainty every day and everywhere.
[00:21:27:12] For example, look at airport runways. You might think paving a new airport runway-- taking the old asphalt off, putting new asphalt on-- is a pretty simple task; you should be able to project pretty well how much it's going to cost. This study actually shows us that it varies quite a lot, from almost half the expected cost to more than 2 and 1/2 times what they first projected. So even in these basically simple, manual tasks, there's a lot of uncertainty. We project something, but in the end, it turns out much, much differently.
[00:22:14:03] Actual versus projected power use in the US. This is a so-called porcupine graph: these are projections made in different years, and here's the actual power use in the US. Even though they maybe should have figured out by the fourth or fifth projection that it was not going to keep growing the way they were assuming, nonetheless these very smart people-- this is actually from the International Energy Agency, if I'm not mistaken-- even at these very high-level organizations, they're not able to make correct predictions.
[00:22:57:05] NASA project cost growth. These are different NASA projects; here's the standard deviation around the average of the ratio of actual to estimated costs. A ratio of one would be a good project. This one was not as bad, but obviously, there are a lot of projects that just shoot way over budget. There might be incentives for that-- you have a constrained budget, and you try to get the project through by first pushing in a lower budget. But still, it's quite remarkable how much the actual results differ from what we expect.
[00:23:35:17] World copper prices. This is one of my favorites. They really try-- here, they shoot too high; now they go too low; didn't make it-- oh, we have to [INAUDIBLE] high. Oh, we were too low. So there's really not a very good track record. This is actually a Chilean copper mine that predicted those copper prices. They should know their business, you would think.
[00:24:03:27] So there's a lot of uncertainty in the world around us that we cannot predict-- coming back to "the forecast is always wrong." Projections of US wind energy installations have consistently been too low. These are the projections, and the darker bar on the right is what actually was installed. Same idea.
[00:24:26:21] And even in our politics-- election polls. We all know that they were wrong. But if you think about it, it should actually not surprise us that they were wrong, because what we actually-- [INAUDIBLE] for a second.
[00:24:48:24] These are all point forecasts. We forecast one value for 2006, one value for 2008. Here, we have point forecasts-- these are the blue and red points-- but we also have a range. The point forecast basically said, all right, Clinton's going to win all these states, and Trump's only going to win a couple, and we'll have a new president, the first female president of the United States.
[00:25:14:16] It turned out the point forecasts were terrible; they were wrong. But if we look at the margins-- the whiskers, the more common statistical representation we saw in the lectures Justin gave before. This is from 538, Nate Silver's site. He's a little more trained, I want to say, at communicating to a broader public. These are the 80% confidence intervals. 80% is not that high-- usually we use 95% or even higher intervals in other contexts. But in every single state, he gives an 80% confidence interval. And, well, that was right.
[00:25:49:16] So if we take into account not the point forecast but actually a distributional forecast, we are pretty good at predicting what's going to happen. Rephrasing the title of this section: it's not the forecast that's always wrong, but the point forecast that's always wrong. And we should really refrain from putting too much weight on point forecasts.
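A quick Monte Carlo sketch, with made-up polling numbers (not data from the talk), shows why a set of per-state point forecasts can all look safe while the combined point-forecast outcome is unlikely:

```python
import random

random.seed(1)

# Hypothetical setup: candidate A leads the polls in 10 swing states by
# 2 points, but each poll has an error with standard deviation 3 points,
# treated as independent across states for simplicity.
lead, err, n_states, n_sims = 2.0, 3.0, 10, 10_000

# The point forecast says A wins every single state, since A leads in each.
# The distributional forecast simulates the polling error instead.
sweeps = 0
for _ in range(n_sims):
    if all(random.gauss(lead, err) > 0 for _ in range(n_states)):
        sweeps += 1

# Per state, A wins roughly 75% of the time -- but the chance of the
# point-forecast scenario, winning all ten states, is only a few percent.
print(sweeps / n_sims)
```

In reality polling errors are correlated across states, which changes the numbers but not the lesson: the point forecast hides how fragile a sweep really is.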
[00:26:25:12] So, the fourth section: decision-making under uncertainty in engineering and policy. We've seen a couple of examples of how forecasts-- at least point forecasts-- do not work out. There are ways, though, in engineering, in our economy, and in policy-making to deal with these uncertain futures. One example: recognizing the uncertainty as uncertainty.
[00:26:55:20] So basically, you move on from the point forecast to a distributional forecast. The concept of expected value is very important here; it's used in many managerial profitability studies. You don't forecast one perfect profitability for your firm or your new product-- you use an expected value over a range of different demand forecasts.
[00:27:25:29] And very important here is the flaw of averages. You can't just take the average of all the demand forecasts you're given and calculate your profitability once with this average, because the profit based on the average forecast-- averaging a set of forecasts down to one number, a point forecast-- is not equal to what you get if you average the profits over the entire distribution.
[00:27:53:25] So you take the distribution of what you assume to be the correct projection and do the calculations for each scenario-- if the forecast has 100 different scenarios, you have to do the calculation 100 times-- but you'll get a much more correct answer than when you only use the average.
[00:28:17:09] So this is the flaw of averages-- the phenomenon that you actually get different values using one or the other technique-- and it's an important concept to recognize. It is being recognized and implemented in managerial decision-making, but not as widely as one would think. There's still really a push to get this idea out there and actually implement it in the field. It takes a little more computational power, and more effort from the individuals working in firms, to work with a distributional forecast, but it does give better results.
[00:29:01:00] And maybe as a side note: a point forecast is basically also a distribution-- just a distribution that has only one value and assigns it 100% probability. So whenever we get a range, we just get a much better picture of our distribution, much better resolution.
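The flaw of averages is easy to demonstrate in a few lines. In this sketch the capacity, margin, and demand scenarios are hypothetical numbers of my own: profit is capped by capacity, so the profit computed at the average demand overstates the average of the profits over the scenarios.

```python
# Flaw of averages: profit(average demand) != average of profit(demand).
# Hypothetical numbers: capacity of 100 units, margin of 10 per unit sold;
# demand beyond capacity is simply lost.

def profit(demand, capacity=100, margin=10):
    """Profit is capped once demand exceeds capacity."""
    return margin * min(demand, capacity)

demand_scenarios = [60, 80, 100, 120, 140]  # equally likely forecasts

avg_demand = sum(demand_scenarios) / len(demand_scenarios)  # 100.0

# Plugging the average (point forecast) into the model:
profit_of_average = profit(avg_demand)                      # 1000.0

# Evaluating every scenario and averaging the results:
average_of_profits = (sum(profit(d) for d in demand_scenarios)
                      / len(demand_scenarios))              # 880.0

print(profit_of_average, average_of_profits)  # 1000.0 880.0
```

The point forecast overstates expected profit by 120 here because the upside scenarios are capped while the downside scenarios are not; this asymmetry (Jensen's inequality for a concave payoff) is exactly why each scenario has to be evaluated separately.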
[00:29:20:25] The second technique of decision-making under uncertainty is to incorporate flexibility. If you know you have a certain range of uncertain outcomes, you can actually use that to your advantage. This is an example of a skyscraper in Chicago, built by a medical company. I don't have a picture from before, but they first built only up to this floor and stopped.
[00:29:54:10] And this is a picture when it was under construction. For a couple of years, it was just this lower part here, because they didn't know: were they going to grow? Was the economy going to be good or bad? But they did build the foundations of the building strong enough to support another 10 or 15 stories on top.
[00:30:13:21] A couple of years later-- I'm not sure of the exact situation; they decided to stay in Chicago, or the economy was good and they were growing-- they decided to make their skyscraper bigger and actually capitalize on this uncertainty. At the beginning, they weren't sure which path they were going to be on. Then over time, when they pretty much knew what path it was, they decided to upgrade.
[00:30:46:11] What we can take from this lesson for climate change is really about climate change adaptation. We will have some climate change at least-- there is no doubt. So the idea is to build projects that have the possibility to capitalize on this uncertainty once the outcome becomes known.
[00:31:18:13] For example, the project I'm working on is a hydropower project in Africa. The idea is to first build a dam and a lot of caverns where we can put generators, but not to put in all the generators right away-- they're really expensive, and we don't know how much water we're going to have.
[00:31:38:04] Once the climate change outcome becomes known-- whether we're on low climate change, meaning a wetter climate, or high climate change, meaning a dryer climate-- the company, or the government in this case, can decide to put more generators into the already-built caverns. That's the analogy to the foundation of the skyscraper, which has to be strong in the beginning to later support a larger structure.
[00:32:11:15] So basically, in Zambia and Zimbabwe, the governments are thinking about this hydropower plant. The idea is that in 20 years, for example, they will decide whether they want to put more generators into their hydropower plant-- if they have enough water at that time. And if they don't, they will not have built too big. So recognizing uncertainty and introducing flexibility are measures that are fairly common, or starting to become more common, in engineering and managerial settings, and there's hope that they could also be applied to climate adaptation.
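The value of the flexible design can be sketched as a simple expected-value comparison. All numbers below are hypothetical and chosen only to illustrate the mechanics; the talk gives no figures for the actual project.

```python
# Sketch of valuing flexibility, loosely inspired by the hydropower example.
# All costs, revenues, and probabilities are made-up illustrative numbers.

p_wet = 0.5              # chance the climate turns out wet (plenty of water)
cavern_cost = 50         # oversized caverns, built up front in either design
half_gen_cost = 100      # generators for half capacity
extra_gen_cost = 100     # the remaining generators for full capacity
revenue_full_wet = 400   # revenue at full capacity in a wet climate
revenue_half = 150       # revenue at half capacity, wet or dry

# Rigid design: install all generators now. If the climate turns out dry,
# the extra generators sit idle and their cost is wasted.
rigid = (-cavern_cost - half_gen_cost - extra_gen_cost
         + p_wet * revenue_full_wet + (1 - p_wet) * revenue_half)

# Flexible design: install half now, expand only if the climate is wet,
# so the extra generator cost is paid only in the wet branch.
flexible = (-cavern_cost - half_gen_cost
            + p_wet * (revenue_full_wet - extra_gen_cost)
            + (1 - p_wet) * revenue_half)

print(rigid, flexible)  # 25.0 75.0 -- the option to wait is worth 50 here
```

The oversized caverns play the role of the skyscraper's strong foundation: a small up-front cost that keeps the expansion option open until the uncertainty resolves.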
[00:32:57:04] And finally, decision-making under uncertainty in policy. Similar idea: we have an uncertain future, and we don't know what path we're going to be on. But we can do so-called adaptive policy-making. That means we design policies and later re-evaluate them depending on which path the world turns out to be on.
[00:33:22:02] Necessary for that are knowledge assessments. With the climate, it's rather easy, because we have thermometers that tell us how much the global mean temperature has risen. But there are a lot of other issues where, if you dig down and discover what state of the world you're actually in, you can then make decisions and decide what the re-evaluation of the policy should look like.
[00:33:55:07] One example of that is the aviation industry in the US, where the Federal Aviation Administration actually deployed a group that investigates near-misses of planes. Once they find out what might cause airplane accidents-- the knowledge assessment approach-- they can then modify the safety standards for those airplanes.
[00:34:31:20] So the safety standards are not set very stringently up front. Instead, you keep a very close eye on what goes wrong, or almost goes wrong-- those near-misses-- to better inform how to re-evaluate the policy, and make the regulation as stringent as necessary but as lenient as possible.
[00:34:55:17] Another great point, from the lecture [? Mihai ?] gave on economics and policy: sometimes the co-benefits of climate action might actually be greater than the direct benefit we're aiming for. We're trying to combat climate change by putting in a policy, but we're not really convincing anybody that it's going to be in the social interest.
[00:35:23:11] But, for example, Germany invested heavily in the solar industry, and a co-benefit was that for at least about a decade, they were leading the solar industry, and a lot of companies profited from that. These might also be considerations when we talk about climate policy. We're often too focused on what a policy does for climate change, but there are lots and lots of co-benefits, and those might actually be enough to push a really good policy through.
[00:35:54:18] A partial summary for this part before we go into the last section. Uncertainty is every day and everywhere. And there are actually tools being developed, and further developed, that allow us to make use of that uncertainty and not throw our hands up in the air. Obviously, if you're running a business or planning a climate adaptation strategy and you take the point forecast, it might look great-- because you don't account for all the negative turns the climate could take-- but you're actually just wrong.
[00:36:36:13] So recognizing the uncertainty and then introducing flexibility is hoped to be of great use for climate adaptation policies.
[00:36:51:19] Now, communicating uncertainty in climate science to the public. We leave our nice world where we all know that climate change is happening, and we enter the age of denial. It's a term I'm not exactly sure where it originates from; it's used by a variety of people and authors. The idea is that we live in an age where-- and you've heard most of these words before-- we have a post-factual world. "Alternative facts" is the new hot term there.
[00:37:23:12] Doubt is cast on the results of climate science research in order to block climate change policies. The age of denial is something that, if we want to combat climate change, we have to accept and tackle and see how we can actually overcome.
[00:37:50:14] I was listening to a great talk by Deborah Blum, who's in the Knight Science Journalism group here at MIT. She was giving a talk in one of her classes, and she talked about her role as a science journalist, sort of bridging the gap from climate scientists to the general public. It was great to hear from her. She's very experienced in her field; she's won a Pulitzer Prize.
[00:38:23:04] What she was saying is that she really takes on that role. She researches the science papers-- both, for example, those that find climate change to be a fact and the few research findings out there that deny climate change-- and she tries to really uncover what they say. If there's really complicated wording on uncertainty in the IPCC report, for example, she sees it as her role to communicate that to the general public in terms and words they can understand.
[00:39:01:07] And if we're honest, all of us are part of the general public, at least in some areas. Maybe in climate change, one or the other of us is rather in the expert group. But on a lot of topics, we also belong to the general public-- we just do not have the knowledge to understand what the experts are talking about in detail.
[00:39:23:19] So the big question that Deborah Blum is asking is how she can actually bridge, or help to bridge, that gap between scientists and the public. And one final piece-- maybe where your thoughts could begin-- is this model, which is from a working group on communicating uncertainty to the end user.
[00:39:57:18] So it is about climate uncertainty, and the idea is that there are services that can be offered around climate change and climate adaptation, but there's large uncertainty, and you have an end user-- basically, the general public. You can also see here the public and the media.
[00:40:16:00] And you have the providers, who do the measurements and simulations. This is maybe where some of the folks at the Joint Program sit: the experts who have constructed the models, who take the observations from weather balloons, run them through the models, and try to inform everyone else in this chain about what the climate is going to look like.
[00:40:42:28] There's some processing in between, and then the climate information has to be formed into a product by a firm that offers it to the end user. You could also take this figure and apply it to how we communicate about climate science in general. Because the end users, whom we try to convince that climate change is happening and that we need to develop climate policy, and the providers-- the research experts at universities, at MIT, for example-- have a long way between them.
So what we actually need are all these different intermediaries-- journalists could be one of them. In our world, we've gotten so specialized that topics have become so complex that we as end users, most of the time, cannot actually understand all the tiny little variations which the experts key in on. Because a policy, depending on whether you structure it a little bit differently, might have largely different effects.
[00:42:15:18] So we need, I think, these different intermediaries, where we have, from the experts to the public, an abstraction of the information, but at the same time, make it more accessible to people. But consider what happens if we take out these intermediaries-- as, for example, I would argue the war on the media does, since the media are one of the major intermediaries we have nowadays.
[00:42:50:07] So if we take out the media-- who are able to be a link to the experts, to understand what the experts are talking about and then rephrase it for the end user-- we're going to have a really hard time making smart decisions, because there's just no good communication going on between the general public and the experts.
[00:43:15:00] So that was what I wanted to share with you today. Thank you for listening so far. I'm happy to take questions or hear some of your views on how we can communicate climate science to the public, for example. Yes, please, Kurt.
[00:43:50:01] I think about this in the context of an often-heard statement that people make decisions in a lot of ways, primarily more from emotion than from fact. And I'm thinking about ways that I've heard uncertainty communicated through, for instance, stories or other kinds of emotional means.
[00:44:13:04] Kerry Emanuel wrote an amazing article a few years ago about tail risk: if your child had a 1% chance of being hit walking across the street, would you let them? Because that's the sort of risk we're taking with climate. I was wondering if you have any perspective, coming from the academic uncertainty world, about emotional or storytelling ways to get risk across.
[00:44:43:15] Yeah, yeah. I think storytelling is extremely valuable if you want to get people around an idea. And especially if we're talking about policy-- policy is nothing else than getting people around ideas-- maybe not the general public, but at least the members of Congress, who then have to sell it to their constituents.
[00:45:03:01] So I would agree that storytelling can go a long way. However, storytelling is-- how do you call it?-- a double-edged sword, because you can also make a lot of great stories that do not actually hold the truth, or what we consider the truth, or that would actually encourage the opposite action. So yeah. I don't know. Nathaniel?
[00:45:34:09] I just wanted to make a quick comment along those lines, too-- something I think you touched on throughout your presentation-- which is the psychological impact of uncertainty and the psychology of having to communicate it to the public or to other scientists. Storytelling is good, and it's a good motivator from the emotional side.
[00:45:55:13] But we also have to be very careful about how we frame the language and frame our stories. Off the top of my head: the IPCC phrasing for the various uncertainty levels in that chart you had at the beginning-- I remember seeing a study a couple of years back which demonstrated that people's perceptions of the numbers underlying those phrases were actually wildly different from what the IPCC intended, and that's not even getting into storytelling. That's a very cut-and-dried factual thing that we have to take [INAUDIBLE]. So it's a slippery slope.
[00:46:31:25] That's good. Yeah. Yes, please?
[00:46:33:20] Thinking about a different way to motivate people: I've heard that insurance actuaries are already factoring in climate change, or at least flooding, in certain housing markets. And people pay attention to what they spend. I wonder if there are other ways that, before things actually happen, they'll be priced in. And people will-- [INAUDIBLE]
[00:46:57:15] Well, speaking about the housing market, I can think of one example where it really didn't work: when the housing market here in the US crashed and brought forth the financial crisis. So I think we cannot always trust that things will be priced in.
[00:47:18:10] However, there's definitely a good argument to be made that more things are priced in, because with our digital communication means we have a much broader perception-- we can be aware of many more things occurring, and then actually price them in or adjust our actions to them. So as a general point, yes, I guess that will happen. I'm just not aware of examples off the top of my head. Yes?
[00:47:55:17] Just a comment: I fall on the end-user side of that spectrum, and what I'd like to do in my community is move toward being an nth user, where I'm able to translate some of that information. And what I find all the time is that people will turn off when they hear negative information that feels hopeless. I think in communicating these things, it has to come across, at least by the time you get to the end user, that there are things that can be done, and there are ways that we can work with the problem. Because as soon as people hear it's overwhelming, it's too late, they give up.
[00:48:36:13] Absolutely. Yeah.
[00:48:37:24] So I think you should look to encourage people all along the way-- some who are close to the science and some closer to the end user-- to make that communication and to be informed. And I just want to thank you for having this series of talks. It's been helpful to me.
[00:48:59:03] That's great to hear. Thank you. I'd like to add a little bit onto that. I believe giving people something to actually do makes them feel that they can make a difference. So even if it might not be the best solution right now-- as you heard from Emile, second resolution-- if it's anything that people can actually do and act on right away, that will help them get around it and rally and maybe go to the next phase, become the nth user or maybe the (n minus 1)th. So I agree with that.
[00:49:34:03] And the second thought: it's important to keep people from shutting down and saying, this is just too much for me. One of my intentions with this talk was, around the word uncertainty, to open some of those blinds where people just shut down and say, well, it's uncertain-- we don't know what's going to happen. We face uncertainty everywhere in our world, and just realizing that can, I believe, take away a lot of the strain of "oh my god, we can't do anything about it." So yeah. There were other questions-- please.
[00:50:11:02] This is not so much about uncertainty, but it is about communication. One thing that baffles me is that we're in America-- not everybody in America goes to MIT-- and yet almost every time I see an article or hear a talk about climate change, the reference is to two degrees centigrade, which many people in America don't understand. And often you go through an entire newspaper article, and there's no mention of what that's equivalent to in Fahrenheit. Do you know why?
[00:50:44:19] Great example. I mean, it's not just the scientists. It's the reporters who--
[00:50:49:07] Yeah. This is one of the points I didn't put up here, but I might now. We just have the barrier of vocabulary between those two worlds. Climate scientists just work in Celsius-- and maybe Kelvin-- because that's how the scientific community decided to work, while not realizing that the rest of the country cannot follow that. Another great example, which we had yesterday in our talk, is positive and negative feedbacks: a positive climate feedback actually means more warming; a negative climate feedback means cooling.
[00:51:28:28] From a scientific point of view, it makes absolute sense: positive is reinforcing; negative is in the opposite direction. But for the general public, it's just confusing. So I agree with your point. That's just something we have to work around, I think. It would be great if journalists realized that and put in a little asterisk or explanation. Or the US changed to Celsius altogether.
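[Editor's note: the unit conversion raised in this exchange has a subtlety worth spelling out. A warming target like "two degrees centigrade" is a temperature *difference*, which converts to Fahrenheit by the 9/5 factor alone; the familiar +32 offset applies only to absolute readings. A minimal sketch (the function names are illustrative, not from the talk):]

```python
def celsius_to_fahrenheit(temp_c):
    """Convert an absolute temperature reading from Celsius to Fahrenheit."""
    return temp_c * 9.0 / 5.0 + 32.0

def celsius_delta_to_fahrenheit(delta_c):
    """Convert a temperature *difference*; the +32 offset does not apply."""
    return delta_c * 9.0 / 5.0

# The "two degrees centigrade" target is a difference, not a reading:
warming_f = celsius_delta_to_fahrenheit(2.0)  # 3.6 degrees F of warming
reading_f = celsius_to_fahrenheit(2.0)        # 35.6 degrees F -- a chilly day, not a warming amount
print(warming_f, reading_f)
```

So "two degrees centigrade of warming" is about 3.6 degrees Fahrenheit, not 35.6, which is exactly the kind of asterisk a journalist could add.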
[00:52:02:28] That's going to be harder than passing the climate test.
[00:52:04:25] Yes. Ain't going to happen. Eric?
[00:52:08:25] I was wondering if you accounted for the difference between public perception and people who have some sort of scientific background, albeit maybe not in climate. I think there was a study that showed that in the general public, you have a certain population that believes it's a problem and a population that doesn't, and if you give them more data, their beliefs become a little more confirmed in the direction they already leaned.
[00:52:33:26] But if you take academically, scientifically rigorous people who believe in climate change and those who do not, when you give them more data, their beliefs strengthen in the direction they already leaned by a greater degree. So you have these educated people who may not believe in climate change; you give them more data, and then they hold their original beliefs even more strongly. So how do we reach those educated people?
[00:53:11:15] This is just personal speculation, but I think we can't. My conception is that people presented with the same set of facts will just come to different conclusions. There's a convergence of similar conclusions that people can come to, but I wouldn't believe they all come to the same conclusion if you give them exactly the same data. Just as, if you look at this chart, everybody notices a couple of different words-- some of them you might actually look at, some you might not.
And those people that we have not yet exposed-- that do not have the scientific background-- are still not very far along those paths toward a certain result. So once we present them with the data, they'll get on the path that they were already inclined toward, while those who are inclined to interpret the data a different way will just keep going further in that direction. I don't know. Maybe that conception strikes a chord here or there.