Description: The concept of hormesis--that a little bit of a bad thing can be good--is introduced and vigorously debated by the students. Tools to find peer-reviewed, primary sources of scientific knowledge are briefly introduced so the students can find high-quality sources of information for their arguments for or against hormesis. The debate, somewhat unpredictably, does not end in a definitive conclusion, simply because there are enough high-quality studies on each side that one cannot disprove the other.
Instructor: Michael Short
The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.
MICHAEL SHORT: All right, guys. So today I'm not going to be doing most of the talking. You actually are, because, like I've said, we've been teaching you all sorts of crazy physics and radiation biology. We've taught you how to smell bullshit, taught you a little bit about how to read papers and what to look for.
And we're going to spend the second half of today's class actually doing that. We're going to have a mini debate on whether or not hormesis is real, and you guys are going to spend some time finding evidence for or against it, instead of just me telling you what hormesis is or isn't.
So just to finish up the multicellular effects from last time, we started talking about what's called the bystander effect, which says, if a cell is irradiated, and it dies or something happens to it, the other cells nearby notice. And they speed up their metabolism, their oxidative metabolism, which can generate some of the same chemical byproducts as radiolysis does, causing additional cell damage and mutation.
And there was an interesting-- yeah, I think I left-- we left off here at this study, where they actually talked about most of the types of mutations found in the bystander cells were of different types. But there were mutations found, in this case, as a result of what's called oxidative-based damage.
This is oxidative cell metabolism ramping up and producing more of those metabolic byproducts that can damage DNA as well. What we didn't get into is the statistics. What do the statistics look like for large sample sizes of people who have been exposed to small amounts of radiation?
I'm going to show you a couple of them. One of them is the folks within 3 kilometers of the Hiroshima epicenter. So I want you to notice a couple of things. Here is the dose in gray, maxing out at about two gray. And in this case, this ERR is what's called Excess Relative Risk. It's a little different from an odds ratio: here an excess relative risk of 0 means it's like nothing happened.
So anything above 0 means extra excess relative risk. So what are some of the features you notice about this data? What's rather striking about it in your opinion? Yeah? Charlie?
AUDIENCE: [INAUDIBLE] so in the [INAUDIBLE] timeline from [INAUDIBLE] timeline here.
MICHAEL SHORT: This one?
MICHAEL SHORT: Oh, yeah, these are the errors. Yep. What does it say here? Is it-- more than one standard error? Yeah.
AUDIENCE: There's a lot of variability?
MICHAEL SHORT: Yeah, I mean, look at the confidence in this data at high doses. And then while you may say, OK, the amount of relative risk per amount of radiation increases with decreasing dose, which is the opposite of what you might think, our confidence in that number goes out the window.
Now what do you think of the total number of people that led to each of these data points? How many folks do you think were exposed to doses measured in gray versus milligray?
AUDIENCE: A lot less for gray than [INAUDIBLE].
MICHAEL SHORT: That's right, the sample size. I thought it was cold and loud in here. The sample size for the folks exposed at the gray level is much smaller, and yet the error bars are much smaller too. That's not usually the way it goes, is it? Usually, you think larger sample size, smaller error bars--unless the effects themselves and confounding variables are hard to tease out from each other.
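That interplay between sample size and confounding can be sketched with a toy simulation. This is purely illustrative: the effect size, noise level, and "confounder" spread below are made-up numbers, not dosimetry data.

```python
import numpy as np

rng = np.random.default_rng(0)

def excess_risk_se(n, effect=0.5, noise=0.2, confounder=0.0):
    """Standard error of a mean excess-risk estimate from n subjects.

    The 'confounder' term adds extra subject-to-subject scatter
    (lifestyle, diet, location) on top of the measurement noise.
    All numbers here are made up for illustration.
    """
    samples = effect + rng.normal(0.0, noise, n) + rng.normal(0.0, confounder, n)
    return samples.std(ddof=1) / np.sqrt(n)

# Few people, but a clean high-dose signal: small standard error anyway.
se_high = excess_risk_se(200)
# Many people, but a low-dose signal buried in confounder scatter:
# the error bar can still come out wider despite 100x the sample size.
se_low = excess_risk_se(20_000, confounder=3.0)
print(se_high, se_low)
```

The standard error shrinks like 1/sqrt(n), but if the confounder variance dwarfs the effect, even a huge cohort can carry the wider error bar, which is roughly what the Hiroshima low-dose data shows.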
If you then look at another set of people, all of the survivor-- oh. yeah, Charlie?
AUDIENCE: How did they determine the-- the doses [INAUDIBLE]?
MICHAEL SHORT: This would have to be from some estimate. This would be from models. It's not like folks had dosimeters everywhere in Japan in the 1940s. But these would be estimates depending on where you lived, let's say in an urban, suburban, or rural area; things like milk intake right after the bomb, or anything that would have given you an unusually high amount of radiation; distance; where the winds were going. This is the best you could do with that data.
And now look at all of the bomb survivors, including the ones outside the 3 kilometer region who still got some dose. What's changed?
AUDIENCE: It seems like they're less likely to get more risk for less dose.
MICHAEL SHORT: Yeah, the conclusion is almost flipped for the low dose cases. If you put them side by side--the folks living within 3 kilometers of the epicenter of Hiroshima versus anyone exposed, all the bomb survivors--you get an almost opposite conclusion for low doses, despite the numbers being almost, you know, within each other's confidence intervals for high doses.
So what this tells us is that the effects of high dose are relatively easy to understand and quite obvious even with low sample sizes. What is different between these two data sets? Well, it's the only difference that's actually listed here. Distance from the epicenter, right?
So before I tell you what's different, I want you guys to try to think about what could be different about the folks living within 3 kilometers of the epicenter of Hiroshima versus anyone else in the city or the countryside? Yeah?
AUDIENCE: Would it be like [INAUDIBLE]? It seems like a the closer, like, it would be a lot more instances where you get a higher dose. So they're underestimating [INAUDIBLE].
MICHAEL SHORT: Could be, yeah. It might be harder to figure out exactly how much dose folks had without necessarily measuring it, right? But what other major factors or confounding variables are confusing the data here? Yeah?
AUDIENCE: Wouldn't a lot of people who lived closer, like, not inside the radiation, like, the actual shockwave and heat from the bomb [INAUDIBLE]?
MICHAEL SHORT: So in this case, these are for bomb survivors. So, yes, that's true. If you're closer, you get the gamma blast. You get the pressure wave.
AUDIENCE: But like, even if you survive that, it still like would affect them in addition to radiation. Is it counting for people who got injured from that too?
MICHAEL SHORT: It should just count all survivors, yeah.
AUDIENCE: So if they were injured, that could change how they reacted to the radiation exposure.
MICHAEL SHORT: Sure. Absolutely. And then the other big one is, actually, someone's kind of mentioned it, but in passing, urban or rural. The environment that you live in depends on how quickly, let's say, the ecosystem replenishes or not if you live in a city or what sort of other toxins or concentrated sources of radiation you may be exposed to by living in a city that's endured a nuclear attack or something else.
It could also depend on the amount of health care that you're able to receive. If you show some symptoms of something, if you live way out in the countryside, and there weren't a lot of roads, then maybe you can't get to the best hospital, or you go to a clinic that we don't know as much. The point is, there's a lot of confounding variables.
There's a lot more people. But anything from like lifestyle, to diet, to relative exposure, think about the differences in how folks in the city and out in the countryside may have been exposed to the same dose, because, again, dose is given in gray, not in sieverts. That's the best we can estimate.
But would it matter if you were exposed to, let's say, alpha-particle-containing fallout that you would then ingest, versus exposed to a lot of gamma rays or delayed betas? It absolutely would. So the type of radiation, the route of exposure, and the organs that were affected are not accounted for in the study because, again, the data is in gray.
It's just an estimated joules per kilogram of radiation exposure, not taking into account the quality factors for tissue, the quality factors for type of radiation, the relative exposure, the dose rate, which we've already talked about. How much you got as a function of time actually does matter.
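The gray-versus-sievert distinction can be made concrete with the ICRP-style weighting scheme: multiply absorbed dose by a radiation weighting factor to get equivalent dose, then by a tissue weighting factor for the effective-dose contribution. The factors below are the standard ICRP 103 values, but the scenario (0.01 Gy to the lung) is just an illustrative example.

```python
# ICRP-style weighting: absorbed dose (gray) -> equivalent dose (sievert).
# Radiation weighting factors w_R from ICRP Publication 103.
W_R = {"gamma": 1, "beta": 1, "alpha": 20}
W_T_LUNG = 0.12  # ICRP 103 tissue weighting factor for the lung

def equivalent_dose_sv(absorbed_gy, radiation):
    """Equivalent dose H = w_R * D for a single radiation type."""
    return W_R[radiation] * absorbed_gy

def effective_dose_contribution_sv(absorbed_gy, radiation, w_t):
    """One organ's contribution to effective dose, E_T = w_T * H_T."""
    return w_t * equivalent_dose_sv(absorbed_gy, radiation)

# The same 0.01 Gy absorbed in the lung, two routes of exposure:
external_gamma = equivalent_dose_sv(0.01, "gamma")   # 0.01 Sv
inhaled_alpha = equivalent_dose_sv(0.01, "alpha")    # 20x worse in sieverts
lung_alpha = effective_dose_contribution_sv(0.01, "alpha", W_T_LUNG)
print(external_gamma, inhaled_alpha, lung_alpha)
```

The point the lecture makes falls straight out: identical numbers in gray can differ by a factor of 20 in sieverts, and a study reported only in gray cannot distinguish them.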
So all these things are quite important. And for all these sorts of studies, you have to consider the statistics. So let's now look at a cellphone-like study, where one might draw a conclusion if the error bars weren't drawn. So based on this, can you say that very low doses of radiation in this area actually give you some increased risk of, what do they say, female breast cancer?
No. You can't be bold enough to draw a conclusion from the very low dose region--let's say the ones to tens of milligray, that whole region right there that people are afraid of getting. We don't actually know if it hurts, does nothing, or helps. That's a kind of weird thing to think about.
So the question is, what do we do next? These are the actual recommendations from the ICRP. And I've highlighted the parts that are important, in my opinion, for everyone to read. And the most important one is probably that we'll have to come to terms with some uncertainty in the amount of damage that little amounts of dose do.
So this is the ICRP saying to the general public: you guys should chill out. There's not much we can do about tiny amounts of exposure. They happen all the time. You can either worry about it, get your heart rate up, elevate your own blood pressure, and have a higher chance of dying on your own, or you can just chill out, because there is not enough evidence to say whether a tiny little amount of radiation--and we're talking in the milligray or below--helps, hurts, or does nothing. Which leads me into the last set of slides for this entire course--they're not that long, because I want you guys to actually do a lot of the work here--radiation hormesis, real or not?
There are plenty of studies pointing one way or the other. And I want to show you a few of them with some other examples. The whole idea here is that a little bit of a bad thing can be a good thing, much like vitamins, or, let's say, vitamin A in seal livers, a little bit of it you need. It's a vital micronutrient. A whole lot of it can do a whole lot of damage.
You don't usually think of that being the case for radiation. But some studies may have you believe otherwise, with surprisingly high sample sizes. So the idea here is that for anything--not just elements in your diet, but anything that happens to you--there's going to be some optimum level, where you could die or have some ill effects if exposed to too much or too little.
We all know that this happens with high amounts of radiation. The question is, does it actually happen at low amounts? So let's look at some of the data. In this case, I mentioned selenium, and we actually have a fair bit of data that shows some, let's say, contradictory results, where whole lots of different people were exposed to a certain amount of selenium accidentally.
I don't think these were any intentional studies. But some folks received massive doses of selenium, and folks tried to figure out what happened. Oh, yeah, if you want to see how much they got--remember that you want about 5 micrograms per day on average. That's a pretty crazy amount of selenium that ended up killing this person in four hours.
But let's look at a sort of medium dose, something way higher than you would normally get. Two different studies published in peer-reviewed journals: this one says taking "mega doses" of selenium, so enormous doses, "may have acute toxic effects," and showed no decreased incidence of prostate cancer and increased prostate cancer rates, in 35,000 people. The same supplements greatly reduced secondary prostate cancer evolution in another study.
Kind of hard to wrap your head around that, right? Both these studies were done with, I'd say, enough people, and came to absolutely opposite conclusions, showing that there are definitely other confounding variables at work here. So there are kind of two solutions to this problem: increase your sample size to try to get a more representative set of the population, or control for other confounding variables.
And then the question is, how do you model how much is a good thing? Let's go over what these models mean. The one that's used right now for the public is called the linear no-threshold model. This means that if this axis right here is harm and this axis right here is amount, then any amount of radiation is bad for you.
What I think might be a little bit more accurate is called the linear threshold model. If you remember from two classes ago, the ICRP recommends that, I think, 0.01 microsieverts is considered nothing officially. That would mean there is a threshold below which we absolutely don't care. And if there are any ill effects, they're statistically inseparable from anything else that would happen.
And that would suggest here this linear threshold model, where this control line right here would be the incidence of whatever bad happens in the control population not exposed to the radiation, the selenium, the whatever. There's also a couple of other ones like the hormesis model, which says that if you get no radiation, you get the same amount of ill effects as the control group.
If you get a little radiation, you actually get less ill effects. In this case, this would be like saying getting a little bit of radiation to the lungs could decrease your incidence of lung cancer. Does anyone believe that idea? Getting a little bit dose to your lungs could decrease lung cancer? OK.
And then you reach some crossover point where, yeah, a lot of this thing becomes bad. And the question is, is radiation hormetic? Does this region where things get better actually extend all the way to x equals 0 as a function of dose?
And I want to skip ahead a little bit to some of the studies. No, I don't want to skip ahead. There are some non-hormetic models that have been proposed in the literature. It's easy to wrap your head around a linear model, right? It's just a line. More is worse. But the question is, how much?
So folks have proposed things like linear quadratic, where a little bit of dose is bad, and then a lot more dose is disproportionately worse. That's actually kind of what we saw in the Hiroshima data. And I'll show you again in a sec.
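The competing models just listed can be written down as tiny toy functions of dose. All coefficients and thresholds here are made-up illustrative values, not fitted to any of the studies discussed; the only point is the qualitative shape of each curve at low dose.

```python
# Toy excess-risk models vs. dose (arbitrary units, illustrative only).

def lnt(d, a=1.0):
    """Linear no-threshold: any dose adds risk."""
    return a * d

def linear_threshold(d, a=1.0, t=0.1):
    """No excess risk below some threshold dose t."""
    return a * (d - t) if d > t else 0.0

def linear_quadratic(d, a=1.0, b=2.0):
    """A little dose is bad; a lot is disproportionately worse."""
    return a * d + b * d * d

def hormesis(d, a=1.0, t=0.3):
    """Negative excess risk (a benefit) below a crossover dose t."""
    return a * (d - t)

# Compare the models in the contested low-dose region and at high dose.
for d in (0.0, 0.05, 0.5, 2.0):
    print(d, lnt(d), linear_threshold(d), linear_quadratic(d), hormesis(d))
```

At a low dose like 0.05, LNT predicts a small harm, the threshold model predicts exactly none, and the hormesis model predicts a benefit; at high dose all of them agree that risk climbs, which is why only low-dose data can discriminate between them.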
So the history of this LNT, or Linear No-Threshold, model rests on the following four claims: radiation exposure is harmful--well, does anyone disagree with that statement? I think we all know that at least large amounts of radiation exposure are bad. It's harmful at all exposure levels--that's the one you have to wonder about.
Each increment of exposure adds to the overall risk, saying that it's an always-increasing function. And the rate of accumulation of exposure has no bearing on risk. The first one's easy. We know this is true because if you expose people to a lot of radiation, bad things tend to happen, deterministically.
The second one, we already know, is suspect. If you look at large sample sets of data, like the data we showed before, there's definitely a non-linear sort of relationship going on, rather than each incremental amount of exposure adding the same amount of incremental risk. We know from a lot of studies that's not typically true.
Then the question is, what about these two? So now we're going to find and show you some fairly interesting studies. In this case, leukemia as a function of radiation dose--what do you guys think about this data set before I seed any ideas into your heads?
So here is dose in sieverts, not gray. And here is odds ratio, the relative risk of contracting leukemia. If you were to look at the data points alone, what would you say?
AUDIENCE: A little bit of dose is good for you.
MICHAEL SHORT: Yeah, you might think that. But look at all the different types of models you can draw through the error bars. You could draw anything going, let's say, down and then up. You could draw a linear no-threshold model, as long as it got through this line right here, or a linear quadratic model.
So a study like this doesn't quite give you any sort of measurable conclusion. A study like this might, especially considering the number of people involved. In this case, this is the activity of radon in air as related to the incidence of lung cancer per 10,000 people. Notice the sample size here, 200,000 people from 1,600 counties that comprise 90% of the population.
Chances are you've then spanned the urban-rural divide. You've then spanned every region of the country. So by including such a gigantic sample size, you do mostly eliminate the confounding variables. Location, house construction, urban versus rural, age, anything else--they're pretty much smeared out in the enormous sample size. And what do you see here?
AUDIENCE: Looks pretty good for low dose.
MICHAEL SHORT: Yeah, you see a fairly statistically-significant hormesis effect, where, you know, the route of exposure is very well-known. Everything else seems to be controlled for by-- I mean, we've included something like almost 0.1% of the US population. That's not bad.
Other ones are for people that get a more specific, targeted dose--in this case, women who received multiple x-rays to monitor lung collapse during tuberculosis treatment, a group of people that can be tightly controlled and followed very well. These are numbers with one standard deviation. And that axis right there, so you can see, is in centigray. So this dose right here is one gray worth of dose. That's a pretty toasty amount of radiation.
But below that, again, statistically significant-looking data. I don't know how many people were in the study because I didn't extract that information. But it's something you might be doing in the next half an hour.
MICHAEL SHORT: Oh, it does. It says deaths per 10,000 women. But how many people were in the study? The question is, what is your sample size? In the last study, there were 200,000 people in the sample. That gives you some pretty good confidence that you've eliminated confounding variables.
So I don't know how many folks get tuberculosis these days in the US, or whether this was even a US study, but chances are the sample size is smaller. So then, even if the data support your idea of hormesis, you have to call into question: is this a large enough, and representative enough, sample size to draw any real conclusion?
So then let's keep going--more data needed. Evidence for a threshold model. This is probably the most boring-looking graph that actually gives you some idea of whether there should be a threshold for how much radiation is a bad thing. In this case, it's a very carefully-controlled data set: lung cancer deaths from radon in miners.
And folks that are going down underground probably have a higher incidence of lung cancer overall from all the horrible stuff they're exposed to, whether it's coal or, you know, if you're mining gypsum. Oh, there's lots of nasty stuff down there. But there is an additional amount of deaths responsible from radon.
Here's your relative risk level of 1, and up to 10 picocuries per liter, which was around the maximum of the last study. It's as boring as it gets, which helps refute the idea of a linear no-threshold model, because if the linear no-threshold model held, this dose versus risk would be reliably and significantly going up. So there's data out there to support this.
And even-- even better ones: lung cancer deaths from radon in homes. The study was careful about what it looked at. If you look at the legend here, these are different cities ranging from Shenyang in China, to Winnipeg in Canada, to New Jersey, which is apparently a city, to places in Finland, Sweden, and Stockholm, which are somehow different places. Yeah.
So when you see a study like this, where they actually control and check to make sure they're not getting any single locality as an unrepresentative measurement, and the data just looks like a cloud along relative risk equals 1, this either refutes the idea that there is no threshold, or supports the idea that there's got to be some threshold lying beyond 10 picocuries per liter.
So, again, to me, it supports the ICRP's recommendation of chill out. You're going to have a little bit of radon in your basement. But pretty big studies, and quite a lot of them, show that a little bit isn't going to add any risk to you. So if you're worried about risk, there statistically is none, based on quite a few of these studies.
And in order to enable you to find these studies on your own, I wanted to go through five minutes of where to look. And the answer is not Google, because Google is not very good at finding every study. It also picks up a whole lot of garbage that's not peer reviewed, because it just crawls the internet, you know? That's what it does really well.
Instead, I want us to take the next half hour, split into teams for and against hormesis, and try and find studies that confirm or refute the idea that radiation hormesis is an actual effect. So how many of you have some sort of computer device with you here? Good. Enough so that there's an equal number in each group.
I'd like to switch now to my own browser. And I want to show you guys the Web of Science. Web of-- yeah, [INAUDIBLE] I use Pine on my phone. It's much better science.
So if you just Google search Web of Science, and you're at MIT, it will recognize your certificates and send you into the actual best scientific paper indexing thing out there.
AUDIENCE: Better than Google Scholar?
MICHAEL SHORT: Oh, my god, it's better than Google Scholar. Yeah. If you think you've found everything by looking at Google Scholar, you're only fooling yourself. You're not fooling anybody else. It's getting better.
But it doesn't find everything. And Google Scholar is really good at finding things that aren't peer reviewed: self-published stuff, things on the arXiv, things that you can't trust because they haven't passed muster with the scientific community.
So instead, let's say you would just do a simple search for radiation hormesis. You can all do this. Don't worry. I'm not showing you how to search. I'm showing you some of the other features of Web of Science.
And you end up with 534 papers. You can, let's say, sort by number of times cited, which may or may not be a factor in how trustworthy the data is. It might just correlate with the age of the paper. It might also be controversial. So if people cite it as an example of what to do wrong, it might be highly cited.
You know, people have made tenure cases and, like, careers on papers that ended up being wrong. And all you see is 10,000 citations saying this person is an idiot. If the committee, you know, judging you for a promotion doesn't read that far into it, they're like, oh, my god, 10,000 citations, right? Boom! Tenure. That's all you have to do.
I think I have it a little tougher. The real fun part, though, is you can see who has cited this paper. So if you want to go see why this paper has been cited 260 times, you can instantly see all the titles, and years, and number of additional citations of the papers that have cited it.
So this is how you get started with real research. Starting from a paper and a tool like Web of Science, you can go forward and backward in citation time: backward in time to see what evidence this paper used to make its claims, forward in time to see what other people thought about it.
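The forward/backward idea is just a lookup in a citation graph, which a toy sketch makes concrete. Every paper name below is hypothetical; a service like Web of Science maintains this graph for real papers and exposes both directions.

```python
# A toy citation graph: paper -> list of papers it cites
# ("backward in citation time"). All names are hypothetical.
CITES = {
    "hormesis_review_2004": ["radon_homes_1995", "miners_1987"],
    "critique_2006": ["hormesis_review_2004"],
    "meta_analysis_2010": ["hormesis_review_2004", "critique_2006"],
}

def references(paper):
    """Backward in time: what evidence did this paper build on?"""
    return CITES.get(paper, [])

def cited_by(paper):
    """Forward in time: who cited this paper, i.e. what did others
    make of it? Found by inverting the reference lists."""
    return sorted(p for p, refs in CITES.items() if paper in refs)

print(references("hormesis_review_2004"))
print(cited_by("hormesis_review_2004"))
```

Walking `cited_by` repeatedly from one good paper is exactly the "why has this been cited 260 times" exercise above, and `references` is how you audit the evidence behind a claim.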
So who wants to be for hormesis? All right, everyone, all you guys on one side of the room, and the other guys on the other side of the room. And I'd like you guys to try to find the most convincing studies that you can to prove the other side wrong. I suggest using Web of Science, not Google Scholar.
It's pretty easy to figure out how to use. And let's see what conclusion we come to.
MICHAEL SHORT: Yep, hormesis by the wall-- yeah, anti-hormesis by the window. There we go. And I'm going to hide this because I don't want to give anyone an unfair advantage.
AUDIENCE: So [INAUDIBLE].
SARAH: So this is a graph showing the immune response in the cells of mice after they were given doses from 0 to 2 gray, or 0 to 7 on the right. So at the lower doses, below like 0.5 gray, which is in the range that we're looking at, the immune system in the mice had a stronger response, and then the response very quickly tapered off, supporting the claim that low doses are good for mice.
MICHAEL SHORT: [INAUDIBLE]
SARAH: I have another graph too.
MICHAEL SHORT: So this is percentage change in response--I'm assuming 100 here is no dose. OK.
SARAH: Yes. So at higher doses, the response of the immune system was suppressed, which follows with what all the other studies show about giving doses in excess of like 1 gray to cells.
MICHAEL SHORT: Cool. So anti-hormesis group.
SARAH: Oh, I have another graph, but--
MICHAEL SHORT: Oh, you do?
MICHAEL SHORT: Oh, I wasn't going to call them out. I was going to have them criticize what's up here.
SARAH: Oh, no. I have another graph.
MICHAEL SHORT: [INAUDIBLE] next.
SARAH: I have two of the same ones. No, I have another one somewhere. I'll find it in a sec. This one. All right, so this one is incidences of lung cancer based on mean radon level and corrected for smoking. So you can't say that it was just from people smoking.
So for radon levels up to 7 picocuries per liter, the incidence of fatal lung cancer actually decreased as you had more radon.
MICHAEL SHORT: Oh.
MICHAEL SHORT: Anything else you guys want to show before we let the anti-hormesis folks poke at it?
SARAH: That's what I got.
MICHAEL SHORT: OK.
MICHAEL SHORT: What are your thoughts?
AUDIENCE: OK, could you go back to the last one.
SARAH: I will try, yes.
AUDIENCE: Do you have any other [INAUDIBLE].
AUDIENCE: [INAUDIBLE] response.
AUDIENCE: So-- so a mouse is twice-- almost twice as effective at fending off disease? OK, I-- I am not a mouse biologist, but the smell test makes me think that-- that perplexed me. And I guess you didn't do studies [INAUDIBLE].
SARAH: I am not personally offended by this. So you're good.
AUDIENCE: Enormous-- enormous change. And if radiation hormesis has such a strong effect on these mice, then why isn't it something everywhere as a thing now. Like, if radiation-- if hormesis is responsible for 80% [? movement ?] in mice, [INAUDIBLE] like where--
SARAH: I don't know that it was improvement. I think it was just in the amount of response they saw. I don't know if that means it's-- well, that doesn't always mean it was effective at doing something. Right.
MICHAEL SHORT: [INAUDIBLE] you guys have comments too?
AUDIENCE: Additionally, that's like an extremely small dose for such a massive response, in a field that is so based on probability. Like, how can a dose range that small have that much of an impact on mice?
SARAH: Well, from 0 to half a gray is pretty significant.
AUDIENCE: But [INAUDIBLE]
AUDIENCE: --before you get to the 0.6 gray.
AUDIENCE: You're also only looking at the cells from [INAUDIBLE] it seems like. And it like looked varied depending on the kind of tissue. So you can't do it for overall.
MICHAEL SHORT: OK, I want to hear from the pro-hormesis team. What makes your-- what makes your legs a little shaky trying to stand and hold this up?
MICHAEL SHORT: Aha.
SARAH: Didn't read the study.
MICHAEL SHORT: I like this-- I like this idea that, yeah, you're only looking at one type of cell, which may or may not respond differently to different types of radiation. There are no error bars.
SARAH: No, not even a whole mouse either.
AUDIENCE: [INAUDIBLE] in the mouse.
MICHAEL SHORT: Oh, oh to trigger an immune response.
MICHAEL SHORT: It's like-- there are-- there's other cells nearby. But they're like, oh, you're not my cell. I'm going to [INAUDIBLE].
AUDIENCE: [INAUDIBLE] mice.
MICHAEL SHORT: Yeah. So that's-- that's a valid point. But, yeah, did it say in the study how many?
SARAH: Again, did not read the study. Read the conclusion.
MICHAEL SHORT: The data alone, just taken at face value, make it look like hormesis is a definite thing. Yeah, Kristin?
AUDIENCE: I'm saying if there is [INAUDIBLE].
MICHAEL SHORT: Yeah.
SARAH: True. Nine mice cell samples.
MICHAEL SHORT: Let's go to the other study.
SARAH: All right, the-- the lung one?
MICHAEL SHORT: Yeah, it seems to be more controlled and more legit.
SARAH: Yeah. This one has error bars.
MICHAEL SHORT: Yeah, one, it has error bars; two, it's corrected for smoking. So let's see what the caption says. Lung cancer fatality rates compared with mean radon levels in the US.
SARAH: And for multiple counties because it talks about counties plural. So--
MICHAEL SHORT: So multiple counties helped control for single localities, or--
AUDIENCE: So the 0 level there is theoretical. So the data that you have down here, like, we don't know what actually happens [INAUDIBLE].
SARAH: Past what?
AUDIENCE: Like-- like below 1, the mean radon levels because everyone is exposed to radon.
SARAH: Well, it says average residential level of 1.7. So I think that means maybe some people have less, maybe some people have more. I don't know what the minimum radon level is.
MICHAEL SHORT: It's not going to be 0.
SARAH: It's not 0.
MICHAEL SHORT: Yeah, no one gets 0 unless you live in a vacuum chamber.
SARAH: I don't know what kind of scale that's on.
AUDIENCE: Me too.
MICHAEL SHORT: Yeah. Cool, yeah. So this is fairly convincing. The point here was to say: here is the theory of linear no-threshold, and here is what the actual data with error bars shows. It does a pretty good job of saying the theory is not right, in this case. Can you say that in all cases? It's hard to tell.
The first study you found was on the cellular level--maybe the multicellular level, certainly not the organism level. Like we said, how many mice? This is just parts of mice. Just--
SARAH: It could be the same mouse.
MICHAEL SHORT: Some cells-- yeah. This one is definitely at the organism level. For gross amounts of exposure, how many of them resulted in increased incidence of lung cancer? The answer is pretty much none. They all showed a statistically-significant decrease, which is pretty interesting. So thanks a lot, Sarah, and the whole team. Now one of you guys come up and find [INAUDIBLE].
SARAH: Carrying the team.
MICHAEL SHORT: So who wants to come up? Or does no one [INAUDIBLE]?
SARAH: Let's throw down, right? Fixing to scrap.
MICHAEL SHORT: OK, you can just pull it out.
SARAH: OK, Are you sure?
MICHAEL SHORT: Yeah.
SARAH: OK. I don't want to break things.
MICHAEL SHORT: No, pulling it out's fine. If you jam it in, you can bend the pins. And that's happened here before.
MICHAEL SHORT: Yeah, if you want to take a minute to send each other the links, go ahead. No, I like this, though: you can find a graph that supports something, and you can cite it in a paper, and you can get that paper published. But looking more carefully at the data does sometimes call things into question.
AUDIENCE: Just like [INAUDIBLE].
MICHAEL SHORT: Like, I think you guys found a good example of that with the mouse cell study, which looks like it supports hormesis, but you can't say so for sure. Make sure no one's waiting for the room. No one's kicking us out.
AUDIENCE: We've got a paper that I found here, but we can't open it up on there.
MICHAEL SHORT: Interesting. Can you send me the link?
AUDIENCE: Wait, that wasn't an option.
MICHAEL SHORT: Yeah. I mean, we can continue this. Since we're not going to the reactor-- that valve was broken-- let's keep it up.
AUDIENCE: Hey, [INAUDIBLE] workbook and [INAUDIBLE] put it in the log book.
AUDIENCE: That's your fault.
AUDIENCE: I wasn't even [INAUDIBLE].
Email us by name.
AUDIENCE: It's not over yet.
MICHAEL SHORT: Yeah, actually, I like this. This will be a good-- quite a good use of recitation. I'll keep my email open in case folks want to send things to present.
AUDIENCE: That's the whole title.
GUEST SPEAKER: So one-- one of the main problems that we had with the hormesis effect was that all of the studies that we've seen seem to cover a large scope of like tissues, different effects, and all sorts of things, like, yeah, there's a lot of studies.
There's a lot of trends. But, like, the things in particular that they're studying are all over the place. And a lot of the-- a lot of the research done, like these studies here, are not actually meant to study hormesis. It's kind of like recycled data that's used from some other study.
And they're kind of pulling from multiple sources, which increases the uncertainty. Then, additionally, we have conflicting epidemiological evidence at low dosages. So, in one instance, you may see a reduction in breast cancer mortality; in another, you'll see excess thyroid cancer in children, which is--
MICHAEL SHORT: That's the same study that was just shown, the Cohen 1995 residential radon study.
GUEST SPEAKER: Yeah.
MICHAEL SHORT: [INAUDIBLE]
GUEST SPEAKER: And so I think-- we're not-- I don't think we're trying to disqualify hormesis as, like, completely wrong. I think one of the biggest issues that we're taking with it is that it's a small effect, if anything. It's something that we really don't know about. It's hard to quantify.
And it's, at the end of the day, really just not worth it, not worth looking into because of all of the variable-- variables that go into it. And the effects that, like, we just don't know about. We don't understand it. So, yeah, fire away.
MICHAEL SHORT: That's a great viewpoint, actually. Yeah, Monica?
AUDIENCE: OK, so it says support for radiation hormesis [INAUDIBLE] cell and animal studies, OK? And then it cites an example. Can you tell me how that, like, you know, supports what you're saying?
AUDIENCE: Can you just highlight the part?
MICHAEL SHORT: Oh, right-- right up here.
GUEST SPEAKER: We haven't seen it in humans.
AUDIENCE: Well, often, biological studies are done on rats because they have similar effects to humans, but their lifespan is, like, 1/10 of a human's lifespan. So, biologically, that's accepted.
GUEST SPEAKER: Medicine also is not accepted until it works on humans, not on animals.
GUEST SPEAKER: So we can cure cancer in rats all day. But, like, if it doesn't work in like the human body, then it just-- we still don't use it, like, it needs to clear the hurdle of human usefulness before we actually use it.
MICHAEL SHORT: Let's actually look at this paragraph. They relate to carcinogenesis in different tissues and the dose-response relationships [INAUDIBLE].
AUDIENCE: So there's a line that says the evidence for hormesis in these studies is not compelling, since the data may also be reasonably interpreted to support no radiogenic effect in the low dose range.
MICHAEL SHORT: Oh, that's interesting. Now, how would one interpret-- because you showed the Cohen data. So how would one interpret that to mean no effect? I'm trying now to determine-- are the claims of this paper that you've been [INAUDIBLE]?
And this brings up, actually, another point. They do agree that there's been hundreds of cell and animal studies. They cite three human studies. So since we have the time, you guys may want to look for more than three human studies, done at the time of this writing. It's not fair to take ones that were done afterwards.
GUEST SPEAKER: What? Let's find out.
AUDIENCE: After 2000.
MICHAEL SHORT: It might say at the bottom of the first page.
AUDIENCE: Oh, wait, in the-- in the [INAUDIBLE].
MICHAEL SHORT: 2000, yep. Yeah. So if you want to refute that point, you may want to find more human studies pre 2000. It wouldn't be fair to do otherwise. But, actually, I liked what you said.
So what you're proposing-- if there's a mostly blank board-- is that most people should adopt a model that looks something like this. This axis is how much bad, with 0 here. And this is dose in gray. And whether your model does this, or this, or this, it sounds to me like you're defining a kill zone.
[INAUDIBLE] maybe the--
GUEST SPEAKER: Yes.
MICHAEL SHORT: The point isn't whether or not hormesis exists. The effect may be so small that who cares. But the bigger discussion is how much is that, not is a little bit good. Is that what you're getting at?
GUEST SPEAKER: Yeah, the like, maybe it does look like this. But the dip is small, really not that different from the linear threshold model, we noticed.
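The curve shapes being sketched and debated here-- linear no threshold, threshold, and the J-shaped hormetic dip-- can be written down concretely. Here is a minimal Python sketch; the function names and every parameter value are hypothetical, chosen only to reproduce the qualitative shapes on the board, not fitted to any study:

```python
import math

# Three candidate dose-response models: "ill effect" as a function of dose (Gy).
# All parameter values are made up for illustration, not fitted to real data.

def linear_no_threshold(dose, slope=0.05):
    """LNT: risk rises linearly from zero dose, with no safe region."""
    return slope * dose

def threshold_model(dose, d0=0.1, slope=0.05):
    """Threshold: no effect at all below dose d0, linear harm above it."""
    return 0.0 if dose <= d0 else slope * (dose - d0)

def hormetic(dose, slope=0.05, benefit=0.2, d0=0.1):
    """J-shaped hormesis: a small net benefit (negative effect) at low
    dose, with harm dominating once the dose gets large."""
    return slope * dose - benefit * dose * math.exp(-dose / d0)
```

Plotting these three on the same axes reproduces the "this, or this, or this" board sketch: the curves are nearly indistinguishable at high dose and differ only in the low-dose region the debate keeps circling back to.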
MICHAEL SHORT: Oh, so in addition to being a basic science question, could the issue of hormesis almost be a sidetrack in getting proper radiation policy through? That's a point I hadn't heard made before, but I quite like it. Because it's not like you're going to recommend everyone smoke three cigarettes a day or, you know, everyone get blasted by a little bit of radiation once a year as part of a treatment. I don't think anyone would buy that. Even if it did help, I don't think anybody would emotionally buy that.
But by focusing on-- you know, there's a nice expression: the most important thing is to make the most important thing the most important thing. It means don't lose sight of the overall goal. If you're making policy on how much radiation exposure is allowed, do you focus on saying a little bit is actually good, or do you focus on saying here's the amount that's bad?
And anything below that, we shouldn't be regulating or overregulating because there's no evidence to say whether it's good or bad outside the kill zone. I quite like that point, actually. It means that the supporters of radiation should chill out as well. Cool, all right, so any other studies you want to point out?
GUEST SPEAKER: We had a couple of abstracts.
MICHAEL SHORT: Yeah, let's see.
GUEST SPEAKER: But I don't-- I'm not sure.
GUEST SPEAKER: OK.
AUDIENCE: Some of the other ones don't compare hormetic models. But they look at-- they say [INAUDIBLE]. It's like--
GUEST SPEAKER: Do you want to come up?
AUDIENCE: Yeah, this one says [INAUDIBLE].
GUEST SPEAKER: All right.
AUDIENCE: It basically compares threshold models with no-threshold models in [INAUDIBLE].
So perhaps hormetic is still better for you, but they-- the [INAUDIBLE] was good enough with [INAUDIBLE].
MICHAEL SHORT: So what they're saying is the choice of model really doesn't matter, as long as it fits through the data that we've got. And it seems to be, again, that what happens in the low-dose regime is less important, right?
AUDIENCE: And it will-- they were satisfied when it fell from the [INAUDIBLE].
MICHAEL SHORT: So they're saying the best estimate of this-- interesting.
AUDIENCE: They prefer no threshold [INAUDIBLE].
MICHAEL SHORT: That's funny. "If a risk model with a threshold is assumed, the best estimate is below 0 sieverts." But then how is their confidence interval from-- oh, less than 0 to 0.13. They don't quantify how much lower it goes because a negative dose doesn't make sense. No.
So, yeah, it's a strong conclusion. But it looks fairly well supported to say that, with the confidence intervals they give, we can't say if there is or isn't a threshold. Interesting. What do you guys think of this? So how would you delve into the study to try to agree with or refute this claim?
AUDIENCE: They use a linear quadratic model only, it looks like. So they're not considering any of the other proposed models, which is a little-- maybe not sketchy, but it just seems like it'd be very easy to consider other models, so why didn't they do that?
MICHAEL SHORT: Sure. You know, what no study has gotten into yet is, what's the mechanism of, let's say, ill effect acceleration? This is something that, at least at the grad school level, we try to hammer into everyone constantly: not just what's the data, but what's the mechanism. What's the reason for an acceleration of ill effects?
So if you guys had to think with increasing radiation exposure, let's say we wanted this linear quadratic model idea, what could be some reasons or mechanisms for an increased amount of risk per unit dose as the dose gets higher? Yeah?
AUDIENCE: Well, your body [INAUDIBLE]. But then-- so at some-- you get more dose-- you get more dosing [INAUDIBLE]. It just keeps fixing itself. And once you get past a certain point, then it can't [? fix itself ?] [? fast enough. ?] The additional damage keeps snowballing, and that's giving it more damage to curb more radiation, because you would run out of-- of various [INAUDIBLE].
MICHAEL SHORT: Sure. Works for me. Yeah, I like that-- the idea there was that you've got some capacity to deal with damage from radiation. And then once you exceed that capacity, you don't also-- with a higher dose, you don't also ramp up your capacity to deal with that dose.
So in the linear region, let's say, you're somewhat absorbing the additional ill effects of dose by capacity to repair DNA or repair cells. Then once you exceed that threshold, you're beyond that point. So that could be a plausible mechanism for why there could be a linear quadratic model that could be tested, certainly with single cell or multi cell studies, like these-- these radiation microbeams or, you know, injecting something that would be absorbed by one cell [INAUDIBLE] irradiated, and seeing what the ones nearby do.
So you could count that as number of mutations, number of cell deaths, anything, something that could be quantitatively tested. So that's pretty cool. I actually quite like this study. It's awfully hard to poke a hole in-- in the logic used here. The claims aren't outrageous.
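The repair-capacity argument above can be turned into a toy numerical model. This is only a sketch of the proposed mechanism, with made-up numbers; the point is just that a fixed repair capacity makes the harm per unit dose grow with dose, which is the qualitative behavior a linear-quadratic fit captures:

```python
def unrepaired_damage(dose_gy, damage_per_gray=100.0, repair_capacity=50.0):
    """Toy mechanism sketch: each exposure creates damage in proportion to
    dose, and the cell can repair a fixed amount of it. Below capacity
    nothing accumulates; above capacity the excess is left unrepaired.
    All numbers are illustrative, not biological measurements."""
    total_damage = damage_per_gray * dose_gy
    return max(0.0, total_damage - repair_capacity)
```

In this sketch the harm per unit dose, `unrepaired_damage(d) / d`, is zero at low dose and climbs toward the raw damage rate once capacity is exceeded, so the response is flat at low dose and steeper than linear just past the knee-- loosely the shape a linear-quadratic model would be fit to.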
They're saying, this is what the data is saying. If you change the model, you can have a threshold or not and still get an acceptable fit. Can we actually look in the study itself? One thing I want to know is, did they do a meta-analysis, or did they-- yeah, so this was on the Japanese atomic bomb survivors.
So did they analyze previous data, or did they get their own? And if so, what was the sample size? Somewhere it'll be, like, yeah, [INAUDIBLE]. So where [INAUDIBLE].
GUEST SPEAKER: Where am I-- where should I be looking for this--
MICHAEL SHORT: Probably further down in any sort of methodology section-- materials and methods, here we go. OK, here it is, 86,500 something survivors. Oh, yes, with lots of follow up.
AUDIENCE: But how are you able to determine the dose? Like--
MICHAEL SHORT: That is a good question.
AUDIENCE: Because especially for-- if we're looking at, like, low dose, and you're estimating, it's very easy to, like, estimate wrong, or, like-- because then it calls into question the [INAUDIBLE] modeling they're using.
MICHAEL SHORT: Mhm. So that's a great question: how do they know what dose those people got? So how would we go about trying to trace that? This is when you dig back in time. They reference this-- the data appears in Pierce et al, whatever, whatever. So if you can go to Web of Science, pull up this Pierce et al paper. Look at cited references. Yeah, right there. And look for that 1996 Pierce study. Let's see if it has it. You can just, like, Control-F for Pierce, and we'll find it. Pierce and [INAUDIBLE]. Yeah, 1996, that's the one.
GUEST SPEAKER: Where? Which one? This one?
MICHAEL SHORT: [INAUDIBLE]. This is the 1996 one. Yep. So let's see if we can trace this back and find out how they estimated the dose of these folks.
GUEST SPEAKER: So I just go to full text?
MICHAEL SHORT: Yeah.
AUDIENCE: How [INAUDIBLE].
MICHAEL SHORT: OK. So interesting, this LLS cohort. So there was some life span study, which was also referred to actually in the lecture notes as one of the original studies, says, who met certain conditions concerning adequate follow up. Although estimates of the-- OK, I want to see the next page. Although we estimate-- that might be what we're looking for. Number of survivors, let's see.
AUDIENCE: It's 92%.
MICHAEL SHORT: OK, here we go, materials and methods. The portion of the LSS cohort used here includes the same number of survivors for whom dose estimates are currently available, et cetera, with estimated doses greater than 5 millisieverts is [INAUDIBLE]. Table 1 summarizes the exposure distribution. So let's go find table 1 and see where the data came from.
MICHAEL SHORT: So it turns out that this is specifically-- DS-86 weighted colon dose in sieverts. Interesting.
AUDIENCE: It [INAUDIBLE]. So how did they get that?
MICHAEL SHORT: I don't know. But it sounds like we need to find this LSS-- just "LSS." So let's look at the things that this paper cites and find this LSS.
So what I'm doing here is walking you through how to do your own research. And if someone comes to you with some emotional internet argument that this and that about radiation is wrong, instead of yelling back louder-- which means you lost the argument-- you hit the books. And this is how you do the research.
AUDIENCE: LSS-85, does that mean it was [INAUDIBLE].
MICHAEL SHORT: Probably. Version of-- title not available. I hope it's not that one. Can you search for LSS? Nothing? So let's go back to the paper and find what citation that was. If you go up a little bit, I think there was like a sup-- a superscript up to the last page, I'm sorry.
There was a superscript on LSS stuff.
AUDIENCE: So general documentation of the selection of LSS cohorts [INAUDIBLE].
MICHAEL SHORT: Thank you. All right, let's find references 9 and 10 in the-- yeah, [INAUDIBLE].
AUDIENCE: Can you click one of the References tab?
MICHAEL SHORT: Oh, yeah, up there, References. Awesome! 9 and 10, OK. Let's find them.
MICHAEL SHORT: So let me show you quickly how to use Web of Science to get what you're looking for if I could jump on?
GUEST SPEAKER: [INAUDIBLE] up here?
MICHAEL SHORT: You don't have to, yeah. But thank you for being up here for so long and running this. So we're looking for-- where was-- the article was here. Went into references. I guess that was like the last-- I don't want to close all your tabs. Here we go.
So GW, is that Beebe and Usagawa. So we'll go to Web of Science, look for authors, any paper with those authors. So you can do a more advanced search. This is where things get really interesting and specific. So ditch the topic. Search by Beebe and add a field, Usagawa.
And then anything with these two folks in the author field that is indexed by Web of Science will pop up. Nothing. Did I spell anything wrong? Usagawa, of course.
That's unfortunate. Last thing to try is adding wildcards. Interesting. This is actually one place where I would use Google-- to find a specific report. Because you're not looking to survey a field that's out there; you're looking for a document that you can confirm is that document. Let's head there.
Oh, it looks like Stanford's got it. That's something that references it. So at this point, we've hit the maximum that we can do on the computer. But if you finally want to trace back to see how the Hiroshima data were acquired, take these citations and bring them to one of the MIT librarians-- Chris Sherratt is our nuclear librarian.
AUDIENCE: He's a nuclear librarian?
MICHAEL SHORT: And we have a nuclear librarian, yeah. MIT libraries is pretty awesome. So when you're looking for anything here in terms of research or whatever, there's actually someone whose job it is to help you find nuclear documents. And chances are, this is a pretty big one. So I wouldn't be surprised if we have a physical or electronic copy.
So we're now, like, one degree of separation away from finding the original Hiroshima data, where we can find out how they estimated that dose. So I think this is fairly-- hopefully, this is fairly instructive to show you how to go about getting the facts to prove or disprove something, using not just the physics that you know, but knowing how to go out and find that stuff.
Now, I did see a bunch of sources from the pro hormesis team. You still want me to show them?
MICHAEL SHORT: OK. Thanks. All right, you just want to hold this up while your-- let's go to your sources. OK, here we go.
AUDIENCE: All right.
MICHAEL SHORT: So walk us through what you found.
GUEST SPEAKER: I just need to open them up.
AUDIENCE: Go through them all, or--
MICHAEL SHORT: Yeah, let's do them all.
GUEST SPEAKER: There's not too much. Kind of-- OK, so, I unfortunately was not able to find like too many pretty graphs, or data, or anything of the sort. But if you look up, what did I search for this? I think I just looked up radiation hormesis.
And this is one of the articles that turned up. And it seems to be pretty well cited. You can see it's been cited 184 times. And kind of the quick look through the citations, from what I saw, seemed to be in support of it. And if you actually look at the abstract itself, where is it?
GUEST SPEAKER: Yeah, well-- the last sentence is pretty excellent. "This is consistent with data both from animal studies and human epidemiological observations on low-dose induced cancer. The linear no-threshold hypothesis should be abandoned and should-- and be replaced by a hypothesis that is scientifically justified and causes less unreasonable fear and unnecessary expenditure."
MICHAEL SHORT: You know what? I want to see what are the human epidemiological observations that they cite.
GUEST SPEAKER: Yeah, so unfortunately, the MIT libraries does not have an electronic copy of this article. And I wasn't able to find one. But going through some of the citations for it--
MICHAEL SHORT: Before you do, could you go back to the article?
GUEST SPEAKER: Sure.
MICHAEL SHORT: I want to point something out.
GUEST SPEAKER: Yes.
MICHAEL SHORT: Can you tell if this was peer reviewed?
GUEST SPEAKER: I do not know how to do that.
MICHAEL SHORT: It appears to be a conference.
GUEST SPEAKER: OK.
MICHAEL SHORT: Not all conferences require peer review in order to present the papers. So while conference proceedings will typically be published as a record of what happened at the conference, we don't know if this one was peer reviewed and checked for facts by an independent party. Could you go up a little bit, and maybe there'll be some information on that?
Oh, it did go in the British Journal of Radiology. OK, that's a good sign. So with conference proceedings, you don't know. But in order to publish something in a journal, you do, because in order to get in the journal, things have to be peer reviewed to meet the journal's standards, regardless of whether they came from a conference or just a regular submission. So, OK, that's good to see. So, now, what else have you got?
GUEST SPEAKER: And then one of the key sentences that I found is right here: "adaptive protection causes DNA damage prevention and repair, and immune stimulation. It develops with a delay of hours, may last for days to months, decreases steadily at doses above about 100 milligray to 200 milligray, and is not observed anymore after acute exposures of more than about 500 milligray."
That's all pretty interesting. Like I said, unfortunately, I couldn't find the actual paper. So you can't really delve into some of those claims. But I tried to look at some of the citations that delved into them. And this is where my presentation gets a little bit shakier because I'm not particularly good at parsing some of this complex stuff very quickly.
MICHAEL SHORT: Let's do it together.
GUEST SPEAKER: All right. [INAUDIBLE].
MICHAEL SHORT: If you could click Download Full Text in PDF, it'll just be bigger.
GUEST SPEAKER: OK.
MICHAEL SHORT: There we go.
GUEST SPEAKER: So it seemed to me this one was more looking through the statistics of various studies. I'm not entirely sure. But I think the conclusion-- [INAUDIBLE]. There we go.
So the very last paragraph, "the present practice assumes linearity in assessing risk from even the lowest dose exposure of complex tissue to ionizing radiation. By applying this type of risk assessment to radiation protection of exposed workers and the public alike, society may gain a questionable benefit at unavoidably substantial cost. Research on the p values given above may eventually reveal the true risk, which appears to be inaccessible by epidemiological studies alone."
MICHAEL SHORT: So what are they going on claiming [INAUDIBLE] versus not being willing to claim it?
GUEST SPEAKER: So it seems like they're saying that, currently, there's not really a statistically valid assertion of the linear no-threshold model, and that the benefits to society gained from that assumption are not worth its cost to society.
MICHAEL SHORT: So what sort of costs do you think society incurs by adopting a linear no-threshold dose risk model?
GUEST SPEAKER: I mean, it could pose unnecessary regulations on like nuclear power, which could be arguably better for society.
MICHAEL SHORT: Sure. Nuclear power plants emit radiation, fact, to use the old cell phone methodology. There's always going to be some very small amount of tritium released. The question is, does it matter?
And if legislation is made to say absolutely no tritium release is allowed, well, you're not going to be allowed to run a nuclear plant. That's not the question we should be asking. The question we should be asking is, how much is harmful?
So I think that's what this study is really getting at-- I'm glad to see someone say, you may have a benefit, but the cost is not worth the benefit. I've had multiple of the same arguments with different people when they were complaining, well, how dare you expose me to any amount of radiation at any risk that I can't control-- I've protested outside Draper Labs for 30 years against nuclear power.
I was like, OK, how did you get there? They were like, oh, I drove. What? In a car? Do you even know the risks per mile of getting on the road, let alone in Cambridge specifically? No? Well, I was like, you should really consider where you put your effort.
It's-- again, it's emotions versus numbers. I'm going to go with numbers because I tend to make bad decisions when I follow my emotions, as do most people because most decisions are more complex than fight or flight nowadays. Yeah?
AUDIENCE: So a lot of the discussion just seems to be around like expanding [INAUDIBLE]. But a lot of the arguments don't seem to like really [INAUDIBLE]. But, yeah, like there's a certain extent, like, oh, you will see [INAUDIBLE].
MICHAEL SHORT: Yeah.
AUDIENCE: [INAUDIBLE] are doing the same.
MICHAEL SHORT: You make a great point. That's why I like your chosen idea so much-- well, you didn't say chosen. That's what I-- yeah. Yeah, the question we should be asking ourselves is not what is the dose-risk relationship, but when should we actually care? It's like both sets of studies have kind of come to the conclusion that, nah, right?
AUDIENCE: [INAUDIBLE] dose doesn't really matter.
GUEST SPEAKER: Yeah, and then I found this last one is a little bit more assertive. It's kind of just hitting the same nail, on the elimination of the linear no-threshold model. But then it does go on to make a more powerful claim right here.
"These data are examined within the context of low-dose radiation induction of cellular signaling that may stimulate cellular protection systems over hours to weeks against accumulation of DNA damage."
MICHAEL SHORT: Was this the paper cited in the other one that actually said hours to weeks?
GUEST SPEAKER: I believe so, yeah.
MICHAEL SHORT: OK, cool.
GUEST SPEAKER: And then we can actually--
MICHAEL SHORT: [INAUDIBLE] this one?
GUEST SPEAKER: Yes. We can look up the full text on Google Scholar.
MICHAEL SHORT: That's OK. When you know what you're looking for, you can verify it. That's-- that's a useful thing for Google is like to find known content. But if you're trying to survey a field in Google, no.
GUEST SPEAKER: That's not what I wanted.
MICHAEL SHORT: Not yet. I'm sure-- I'm sure they're working on it. But they're not Web of Science yet.
GUEST SPEAKER: All right.
GUEST SPEAKER: Does anybody see a Get The Full Paper button? Oh, wait, right here, right?
MICHAEL SHORT: Yep. That's it.
GUEST SPEAKER: OK. Sign in?
MICHAEL SHORT: Sounds like we don't subscribe to this.
GUEST SPEAKER: Oh, I was able to get to it somehow. Well, yeah.
AUDIENCE: I have another article supporting this claim, though.
MICHAEL SHORT: OK.
GUEST SPEAKER: But this one--
AUDIENCE: Submit it, or bring yours up, or whatever.
GUEST SPEAKER: And then this one-- this one just had some nice data. If I'm going to summarize, it was looking at the number of DNA damage instances, comparing normal background dose to, like, very, very low dose. And the very, very low dose was significantly less than the normal background dose. So that just kind of shows that, like, very low levels of radiation are no worse for you than just background dose, which is interesting.
MICHAEL SHORT: Cool.
GUEST SPEAKER: Yeah.
MICHAEL SHORT: I also want to make sure, do you guys have more articles you want to show?
MICHAEL SHORT: If you want to send it to me, I'll put it up here.
GUEST SPEAKER: All right, I minimized because I didn't just want to leave your email.
MICHAEL SHORT: Oh, I don't care. There's nothing--
GUEST SPEAKER: OK.
MICHAEL SHORT: I'll bring it back up. So that's all the ones you sent? Cool. Actually, this one-- this debate is turning out a whole lot more interesting than previously because, well, because you're thinking. It's actually really nice to see this. And this is the--
MICHAEL SHORT: I'm not surprised. Don't worry. It's just pleasant to have a debate about something controversial with a whole group of people who are thinking and researching rather than shouting and like throwing plates.
MICHAEL SHORT: Oh, no, you can throw a chair if you want, but I might throw one back.
MICHAEL SHORT: I wonder if anyone's gone out recently and has come up with all of the pro and anti hormesis studies and actually written a paper that says, that's not the point, because, really, what we're getting-- huh?
AUDIENCE: You could write that.
MICHAEL SHORT: No, I think you could write that paper now.
AUDIENCE: Well, oh.
MICHAEL SHORT: It would make for a pretty cool undergrad thesis, actually. Yeah? Maybe I can tell you a little bit about what an undergrad thesis actually entails because the seniors are all asking. But it's good for you to know ahead of time.
So the main requirement for an undergrad thesis is it's got to be your work. That doesn't mean you have to have collected the data yourself, like done an experiment. But it has to be some original thought, or idea, or accumulation of yours. So trying to settle this debate and trying to figure out what would be a proposed chill region to say, forget the linear threshold or no threshold. That's for the basic scientists.
If you are a government and want to legislate something that actually captures should people be afraid or not, defining that region would be a pretty cool study to do in the meta-analysis of lots of other studies, tracing back how worthy-- I mean, a lot of people refer to the Hiroshima data set because that's about the biggest one we have.
In addition to folks with radon or folks that smoke, they were all exposed to the same thing in the relatively same area. So it's a good control group of people. But how was-- how were those doses estimated? You have to dig that up. And the act of digging that up and then recasting all of these new studies in the basis of everything we've learned since would make for a pretty cool undergrad thesis topic. So as undergrad chair, I wouldn't say no to that.
Threshold and other departures from linear quadratic curvature in the same data set appears to-- is it the LSS data set? Let's try to get the full text. Awesome! I think it's looking good. Great! Now I've seen that name before. Interesting.
MICHAEL SHORT: Interesting. They propose another model, called a power of dose-- a power law. And they say, depending on this, there's little evidence that it's statistically different from one, which is-- what do they call it-- linear threshold, quadratic threshold, or linear quadratic threshold, OK?
So, again, it seems to be yet another paper saying, I don't think it matters. Statistics says it doesn't matter. You could fit any model to this data. Let's get to the methods.
MICHAEL SHORT: Interesting. So dose response for all non-cancer mortality in the atomic bomb survivors. So, also, in this case, it's mortalities not caused by cancer.
AUDIENCE: Like, caused by radiation disease? Or is that caused by [INAUDIBLE]?
MICHAEL SHORT: So this would be-- I think what they're getting at is is there a response, or is there a change in the amount of mortality not due to cancer and the-- the--
AUDIENCE: Health benefits other than decreasing risk of cancer.
MICHAEL SHORT: Or in this case, health detriments, right? Because in this-- you know, it never goes negative. You can't really tell in some cases. Let's see.
Yeah, quite hard to tell, especially considering. And so at the low doses, what would you guys say for the low dose data?
AUDIENCE: That doesn't matter.
MICHAEL SHORT: I see a pretty well-defined chill zone right there, right?
AUDIENCE: Chill zone?
MICHAEL SHORT: We're definitely still in the chill zone at 0.4 sieverts of colon dose. And that's a pretty hefty amount of dose. You know, we're talking eight or nine times the allowed amount that you're able to get in a year from occupational safety limits.
Once the doses get higher, things seem to get a little more deterministic, or statistically significant. But, yeah, look at all the different models. The linear threshold, quadratic threshold, linear quadratic threshold, and power of dose all go straight through-- not just, like, within the error bars, but almost straight through most of the data points, except for the really far away ones.
So this is a pretty neat study, showing, like, hey, the relationship does not appear to matter for doses of consequence. I would call 2 sieverts a dose of consequence based on our earlier discussion of biological effects. Luckily, it doesn't go much farther than that. You don't want a lot of people to have received doses beyond 10 gray.
But this is pretty compelling to me to say, like, we can argue about what the real model is and what the underlying mechanism is, but is this a question we really should be asking ourselves when the total risk-- let's say, when the total risk to an organism reaches about 100%? Once you reach a dose where it doesn't even matter, then is this a question that we should really be debating in the public sphere?
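The claim that several very different model shapes go straight through the same data can be checked with a weighted goodness-of-fit statistic. The numbers below are invented for illustration-- they are not the LSS data-- and the point is only that, when error bars are generous, quite different low-dose shapes give equally acceptable chi-squared values:

```python
# Invented dose (Sv) / excess-risk points with generous error bars,
# loosely shaped like the survivor plots discussed; NOT real LSS numbers.
doses  = [0.1, 0.5, 1.0, 1.5, 2.0]
risks  = [0.01, 0.04, 0.11, 0.17, 0.26]
sigmas = [0.03, 0.03, 0.04, 0.05, 0.06]

def chi_squared(model):
    """Sum of squared, error-weighted residuals of a model against the data."""
    return sum(((r - model(d)) / s) ** 2
               for d, r, s in zip(doses, risks, sigmas))

linear   = lambda d: 0.12 * d                  # linear no-threshold shape
lin_quad = lambda d: 0.06 * d + 0.035 * d * d  # linear-quadratic shape
```

Both of these hand-picked fits land well under a chi-squared of about 5 for 5 data points, so the data alone cannot distinguish the two shapes-- which is exactly the "choice of model doesn't really matter" conclusion these papers keep reaching.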
I love the outcome of this particular debate. Lots of statistics, don't have time to parse. Is there anything else, Chris, that you wanted to highlight in this study?
AUDIENCE: This appears to [INAUDIBLE] comments on Professor Donald Pierce on [INAUDIBLE].
MICHAEL SHORT: Oh, OK, well--
AUDIENCE: Do you think it could be the same Pierce?
MICHAEL SHORT: Maybe. It was a UK Pierce, I think. That's pretty cool. So anyone else have any other papers they want to show for or against or for our sort of collective new conclusion? Which is that we should just relax.
Cool. Well, that went-- yeah? Charlie?
AUDIENCE: I just had a question, like, what would be, like, a proposed use of radiation hormesis [INAUDIBLE]? [INAUDIBLE]
MICHAEL SHORT: So let's say you could prove beyond a shadow of a doubt that a little bit of radiation exposure was a good thing. You might then prescribe radiation treatments in order to reap the benefits. I don't think there's been a single study that shows that there's like deterministic benefits from irradiating people.
Some of the studies show that folks that have gotten exposed via various routes do show a lower incidence of cancer. So you could almost think of it like a vitamin-- not an injectable vitamin. But-- so back-- there are lots of pictures online, and stories, from way up north in Russia and other northern countries, of exposing people to ultraviolet radiation to stimulate the production of vitamin D in their skin cells, because in the absence of an ingestible source of vitamin D, you make it naturally-- but not when there's eternal darkness.
So they'd actually have kids stand in front of a UV lamp, which does have ill effects. That can also cause skin cancers, but the benefits to the organism of generating the vitamin D you need for health are greater. So that might be an example. These sorts of ideas are not that far-fetched.
If you put little kids in front of UV lamps, which you know can do bad things but also do more good things, then who's to say it shouldn't happen for radiation? Well, no one's to say yet, because we have no real conclusive proof that it is helpful. But that was the-- yeah?
AUDIENCE: Have there been any mechanisms that [INAUDIBLE]?
MICHAEL SHORT: You mean in-- for radiation or for something else?
AUDIENCE: For radiation.
MICHAEL SHORT: The mechanisms of-- so that one study that Chris showed that-- what was the idea? That-- [INAUDIBLE]. The first one that you showed, the mouse one, and then the one that Chris mentioned where a little bit of radiation dose stimulated the immune system. That might be a potential good thing, where the damage or death of a few cells may stimulate the nearby ones to ramp up an immune response, thus snuffing out any other infection or problem that's coming up. That could be a use.
But it would have to be proved with much more confidence than anything I've seen today. So that's a good question. Yeah, like how would you use it? Use it like a vitamin, like a UV lamp, like a SAD lamp. Although I don't think SAD lamps do anything bad-- SAD, Seasonal Affective Disorder, the most unfortunate acronym in the world. Yeah.
MICHAEL SHORT: Yes. I don't know if that would be easy to swallow. Yeah. Cool. All right, any other thoughts from this exercise? I think I'll do more interactive classes like this. It's good to hear you guys talk for a change. Cool. OK.