aged people more than any other age group. The high mortality rate in an otherwise healthy demographic clearly exemplifies the virus's dangerously deadly quality.

Although it is clear that the virus has the potential to be incredibly deadly, the scientific community and general public alike are not in total agreement about whether it poses an actual threat to society. There are two main arguments given to justify the lack of preparedness measures taken. The first, taken by many members of the scientific community, is that if H5N1 could adapt to transmit from human to human, it would have done so already. The other point of view, taken primarily by members of the general public, is that modern medicine would be able to control the virus in the case of an outbreak. Both views neglect key factors that will contribute to the spread of the virus. The complacency produced by both of these responses will inevitably lead to a pandemic worse than it should have been.

The idea that, if it were ever going to happen, it would have happened already is simply a way of brushing the issue aside without having to think about it any further. The fact is that most viruses fester for years before morphing into a form capable of devastating the population. The H1N1 virus is estimated to have been evolving for around eleven years before the pandemic broke out. H5N1 was discovered only around fifteen years ago, so it is only a few years behind the pace of its highly virulent relative.

An additional argument that proponents of the "it would have already happened" theory like to use is that the conditions leading up to the 1918 outbreak were unique, and that no modern scenario could possibly yield the same outcome. Again, this is simply uninformed conjecture. The horrible conditions that led to the 1918 pandemic were those experienced in the trenches of World War I. Soldiers were forced to live in tightly packed and unsanitary quarters while
How can we best address the threat posed by dual-use research? By concentrating on the benefits of the knowledge and publishing it anyway? By keeping some of the research secret from the general public, making it available in full only to responsible scientists? Or by not publishing it—or, in some cases, not even conducting it at all?
Scientists and policy-makers have long understood that the products of research can often be used for good or evil. Nuclear fission research can be used to generate electricity or create a powerful bomb. Studies on the genetics of human populations can be used to understand relationships between different groups or to perpetuate racist ideologies.1 While the notion that scientific research often has beneficial and harmful uses has been discussed before, the threat of bioterrorism—a concern that has only grown since 2001—has led to increased awareness about the need to prevent the misuse of biomedical research, particularly when it involves dangerous pathogens or toxins.2 In 2012, the concern was ratcheted up another notch by two papers reporting the results of research on genetically engineered strains of the H5N1 avian influenza virus.3 Although the H5N1 papers have potential scientific and social value, some scientists and policy-makers opposed their publication because they feared that terrorists (or others with nefarious motives) could use the research to create a bioweapon that could trigger a global pandemic.4
The H5N1 papers raise difficult questions concerning the ethics of knowledge. Should scientific research with dangerous applications be published? Should some types of research be kept secret or not be conducted at all? What type of government oversight of dangerous research is appropriate? In this essay, I will develop a framework for thinking about the ethics of knowledge and apply it to the H5N1 controversy, focusing on issues related to publication. I will argue that redacted publication would have been a reasonable response to the dilemmas posed by the H5N1 papers if not for practical and legal problems with this option. Given these problems, full publication seems appropriate.
The H5N1 Controversy
In December 2011, the National Science Advisory Board for Biosecurity reviewed the disputed papers at the request of the National Institutes of Health, which had funded the research. The NSABB, formed in 2004 upon a recommendation that the National Research Council included in its report Biotechnology in the Age of Terrorism,5 provides advice to government agencies, scientists, institutions, and journals concerning biosecurity issues related to scientific research. It focuses on dual-use research of concern, known as DURC and defined as “research that, based on current understanding, can be reasonably anticipated to provide knowledge, products, or technologies that could be directly misapplied by others to pose a threat to public health and safety, agricultural crops and other plants, animals, the environment, material or national security.”6 DURC may include any type of scientific research, but most of the NSABB's attention has centered on experiments of concern involving dangerous biological pathogens and toxins—so-called select agents—that were identified in the NRC report. Some experiments of concern include studies that would increase the transmissibility of a pathogen, expand the host range of a pathogen, render a vaccine ineffective, or enable the weaponization of a pathogen or toxin.7
The two papers reviewed by the NSABB reported the results of experiments that used genetic engineering techniques to demonstrate how H5N1 can acquire mutations that enable it to be transmitted by air between ferrets via respiratory water droplets. The researchers are confident that their findings in ferrets apply to other mammals, including humans, due to similarities among mammals in pulmonary structure and function. The wild type of H5N1 can be transmitted to humans only by direct contact with birds infected with the virus. Public health officials have been concerned that H5N1 could cause a global pandemic if it evolves the ability to transmit between mammals by air because human beings lack immunity to it and similar viruses. H5N1 has infected six hundred people since 1997, with an estimated case fatality rate of 30 to 60 percent.8 In short, the researchers were attempting to study a process in the laboratory that could cause a public health catastrophe if it occurs in nature.
The two research teams that authored the papers were led by Yoshihiro Kawaoka, who conducted his research at the University of Tokyo and the University of Wisconsin-Madison, and Ron Fouchier, who conducted his research at Erasmus Medical Center in the Netherlands. Kawaoka's group submitted its paper to Nature, and Fouchier's group submitted its paper to Science. Kawaoka's group combined the hemagglutinin (HA) gene of an H5N1 virus found in birds with a strain of the H1N1 virus, which caused a flu pandemic in 2009. The hybrid virus acquired a mutation that allows it to form a type of hemagglutinin protein that facilitates binding to upper airway receptors. After it infected ferrets, the virus acquired additional mutations that enhance transmissibility by air. The virus did not kill any of the ferrets and was vulnerable to an H5N1 vaccine and the antiviral drug Tamiflu.9
Fouchier's team used an H5N1 strain found in an infected human and inserted genes into it that can allow it to be transmissible by air. Fouchier's team also “passaged” the virus between ferret hosts ten times. (Passaging is a technically difficult process that involves inoculating uninfected ferrets with nasal samples from infected ferrets.) As a result of passaging, the virus acquired additional mutations that increased its transmissibility by air. In one experiment, the virus was transmitted between ferrets in adjoining cages 75 percent of the time. The virus became less lethal as its transmissibility increased, however.10 Fouchier's research is considered by many experts to be more dangerous than Kawaoka's because it provides a clearer demonstration of how to alter H5N1 to make it transmissible by air.11
By unanimous vote, the NSABB recommended initially that neither paper should be published in full: only redacted versions of the papers, with key details removed, should be published, with the omitted details made available only to responsible scientists and public health officials.12 The NSABB assessed the risks and benefits of publication and decided that the risks far outweighed the benefits. The primary concern, and the justification for the decision, was that some person, organization, or government might use the results of the papers to make a bioweapon that, if deployed, could cause a global pandemic. A secondary concern was that attempts by scientists to replicate these results could lead to accidental release of the pathogen. The NSABB recognized that the papers also had potential value for biomedicine and public health: they could be useful for monitoring bird populations for outbreaks of mutated forms of H5N1 with the potential to be transmissible between humans, for example, and they could also be useful for developing treatments or vaccines. The potential benefits of publication were not enough, however, to sway the NSABB at that time.13
Shortly after the NSABB announced its recommendations, a group of scientists working on H5N1 agreed to a voluntary moratorium on experiments that genetically modify the virus and called for an international forum to sort through the issues.14 In February 2012, a committee convened by the World Health Organization to discuss the issues, composed of scientists involved in avian flu research, the editors of Science and Nature, and public health officials from eleven countries, recommended full publication of the articles, citing the scientific and public health benefits of full publication and the practical difficulties associated with redacted publication.15
The researchers then rewrote their papers and resubmitted them to the NSABB, which met again on March 29 and 30, 2012. The revised versions were longer and provided important details about biosecurity related to the research. The NSABB also reportedly received additional information about the public health value of research, the practical and legal problems with redacted publication, and the likelihood that the research could be used to develop a bioweapon. The NSABB recommended full publication of both papers on grounds that the papers did not contain information that would immediately enable someone to make a dangerous bioweapon, since they did not demonstrate how to make a form of H5N1 that is both highly pathogenic and easily transmissible by air. The NSABB also recommended against publication of additional details that were not contained in the papers and would enable one to make a highly pathogenic and transmissible form of the virus, and it urged the international community to develop policies for overseeing dual-use research and called upon the U.S. government to develop a mechanism for providing controlled access to sensitive scientific information. The decision was not unanimous, however. While all board members agreed the paper from Kawaoka's group should be published in full, six out of eighteen committee members said that the paper from Fouchier's group should be published only in redacted form.16
The NSABB does not have the legal authority to censor or classify research, but its recommendations carry some weight among government officials, researchers, institutions, and journal editors. Shortly after the NSABB recommended publication of both papers, the NIH accepted its recommendations and the journals went ahead with publication. Nature published Kawaoka's paper on May 2, 2012, and Science published Fouchier's paper on June 22, 2012. (Fouchier's paper was delayed while he sought export permission from the Dutch government.) Both journals had conducted their own internal review of the dual-use issues raised by the research.
On April 12, 2012, NSABB member Michael Osterholm, an infectious disease specialist at the University of Minnesota, wrote an open letter to Amy Patterson, head of the NIH's Office of Biotechnology Activities, which oversees the NSABB. Osterholm disagreed with the NSABB's recommendations concerning the Fouchier paper because he believed the risks of publication outweighed the benefits. He also claimed that the agenda for the March meeting was imbalanced because it did not include enough testimony concerning the risks of full publication. He stated that the papers would indeed enable a researcher trained in the appropriate methods to create a dangerous form of H5N1 from the information provided, and he doubted whether the papers would promote public health in the absence of a major commitment of government resources to monitor avian populations for H5N1 mutations.17
On the same day that the NSABB recommended publication of the papers, the U.S. government released a policy for the oversight of life sciences DURC, as defined by NSABB.18 The policy recognizes that federal and institutional oversight of DURC is necessary to protect public health, agriculture, national security, the economy, and the environment, and it requires additional oversight of research funded by the U.S. government. The policy requires federal departments and agencies to review all current or proposed, classified or unclassified, extramural or intramural life sciences research involving select agents and toxins that poses the greatest potential for deliberate misuse with potential for mass casualties and other devastating societal effects. Departments and agencies must now assess the risks and benefits of such research and develop a risk mitigation plan in collaboration with researchers and institutions. Mitigation plans may include modifying the research, applying enhanced biosecurity or biosafety measures to the research, evaluating medical countermeasures (such as treatments or vaccines) to reduce harms related to the research, reviewing emerging findings concerning the research, and determining how to communicate research results responsibly with respect to venue, mode, distribution, and timing. If the risks of the research cannot be mitigated adequately, then departments and agencies must determine how best to deal with the research. The options include asking researchers to agree to publish or communicate results in redacted form, classifying the research, not providing funding (if the research has not been conducted), and terminating funding (if the research has been funded but is not complete). If the research is classified, then it may be referred to another department or agency for funding.19
The Ethics of Knowledge
Addressing the difficult questions raised by the H5N1 controversy requires exploring the ethics of knowledge. Different ethical questions arise in different stages of research.20 First, before a project is initiated, one must decide whether it is worthy of support with public or private funds. The NIH decided to fund Kawaoka's and Fouchier's H5N1 proposals based on its determination that the studies were scientifically well-designed projects that would probably benefit public health. The NIH did not explicitly consider biosecurity issues when making this decision because prior to the adoption of the new federal policy, there was no formal step in the review process for addressing biosecurity issues when making funding decisions.
During the research, ethical questions arise concerning study design and methodology. For example, many of the ethical issues related to research with human participants involve questions of study design or methodology because different designs and methodologies have different impacts on the rights and welfare of individuals. Using placebo control groups in clinical trials raises ethical issues when participants are denied an effective therapy.21 Once data collection is complete, ethical issues arise in how one analyzes and interprets the data. The decision to exclude data outliers from a statistical analysis raises issues concerning honesty and objectivity, for example. Finally, ethical issues also arise during the peer-review process and publication, as the controversy over the H5N1 papers makes clear.
A thorough discussion of the ethical issues raised by the H5N1 papers would consider all stages of research, including whether the NIH should have funded the studies in the first place and whether the biosafety measures were adequate.22 But while these are important concerns, I will focus on questions related to publication.
We should begin by noting that scientific knowledge has both instrumental and intrinsic value. Science's instrumental value derives from its beneficial applications and uses. Biomedical research can lead to new methods of diagnosing, treating, and preventing diseases. Research in physics and chemistry can contribute to innovations in engineering, industry, building construction, transportation, and communication. Social science research can yield results that are useful in public policy formation, business, education, and economic development. The intrinsic value of scientific knowledge stems from its ability to help us explain natural phenomena and satisfy our curiosity about the world. Many people would like to know how the universe formed, for example, even if this knowledge has no immediate practical value.23
Publication plays a key role in the growth of scientific knowledge because it allows scientists to share data, methods, and results; to learn from and criticize each other's work; and to debate about ideas and theories. Although it is possible to make scientific advancements under a cloak of secrecy, as occurs in classified military research or industrial research, openness is highly valued because of its role in advancing scientific research. Openness obligates researchers to share the results of investigation as broadly as possible with the scientific community. The NIH and the National Science Foundation require funded investigators to share their completed research openly and include dissemination plans in their research proposals. Professional associations have adopted ethics codes and policies expressing a strong commitment to openness. Scientific journals usually require published authors to make supporting data freely available to other scientists.24
Restrictions on publication not only interfere with scientific openness, they also undermine freedom of inquiry. Freedom of inquiry includes the freedom to develop new ideas and question old ones; to formulate and test hypotheses and theories; to collaborate with colleagues and discuss data, assumptions, and methods; and to disseminate and publish the results of investigation.25 Like openness, freedom of inquiry is crucial for the advancement of human knowledge because it promotes critical thought and discussion.26 While it is possible to produce scientific knowledge even when freedom of inquiry and openness are tightly restricted, history demonstrates that science flourishes when researchers are free to openly communicate the results of inquiry and discuss scientific ideas. For example, from the 1930s to the 1960s, scientists in the former Soviet Union were forbidden from conducting research on Mendelian genetics or teaching Gregor Mendel's ideas to students, and scientists who violated this ban were often sent to prison. The Soviet Union banned Mendelian genetics during this era because it conflicted with the Communist Party's ideological commitment to the idea that human nature is malleable. The party endorsed the views of Trofim Lysenko, who held that the environment strongly influences traits and that heredity plays only a minor role. As a result of the repression of Mendelianism, Soviet genetics, zoology, agronomy, and evolutionary biology were decades behind their Western counterparts when the shackles of Lysenkoism were lifted.27
Publication can be disrupted by a variety of political, social, and economic factors ranging from biases in the peer-review process to the desire to protect intellectual property, but government restrictions on publication are especially problematic because they not only impede the progress of science in affected areas but also have a broad impact on the scientific community. Scientists who learn about censorship of research may shy away from conducting research on topics likely to be politically controversial. The Soviet Union's repression of Mendelianism, for example, also affected research in sociology.28 In the United States, scientists whose NIH grants on sexual behavior were challenged at a hearing of the House Energy and Commerce Committee in July 2003 engaged in self-censorship in response to this incident. Although the committee eventually decided to continue funding the grants, half of the researchers surveyed said they had started eliminating controversial words from their grant proposals as a result of the hearing, and one quarter said they had decided to avoid controversial topics.29 Plainly, actual government censorship could have an even more chilling effect on scientists. Censorship of a publication also establishes a precedent for future censorship.30
Freedom of inquiry is important not only for utilitarian reasons related to knowledge advancement, but also because it is connected to respect for autonomy, which many political scientists have argued is the basis for protecting freedom of speech.31 Since a person is autonomous insofar as he or she is capable of making rational decisions and fulfilling moral responsibilities,32 respect for autonomy calls for allowing autonomous individuals to make decisions concerning their own lives.33 Government restrictions on scientific publication can be viewed as morally problematic, in part, because by preventing scientists from making their own decisions about what to publish, they limit the scientists' autonomy.34
Restrictions on Publication
Freedom of inquiry and openness sometimes conflict with other values, however, and in particular with the values of preventing harm to individuals or society and protecting intellectual property or proprietary information. When conflicts arise, scientists and policy-makers must balance these competing values and decide whether freedom of inquiry or openness can be restricted.35 Ethical and policy debates about freedom of inquiry and openness tend to focus on deciding why, when, and how these norms can be restricted, and by whom.
The strongest argument for restricting freedom of inquiry or openness in liberal societies is to prevent harm to individuals or society. For example, many would argue that concern for the rights and welfare of research participants justifies regulations that prevent researchers from widely disseminating some types of personal information.36 Unauthorized disclosure of personal information collected as part of a research study can have adverse consequences for participants, such as stigma, embarrassment, and discrimination in employment or insurance. Confidentiality protections usually do not significantly impede the progress of science, however, because researchers can remove personal identifiers from the data they share or can require recipients to sign data-use agreements that maintain the confidentiality of the data.37
More difficult questions arise when considering restrictions on publication. Because these restrictions can significantly limit freedom of inquiry and scientific openness, they require a compelling justification. For example, research on nuclear weapons has always been conducted in a classified setting in the United States, but this restriction is justified as necessary to protect national security and prevent the proliferation of nuclear weapons.38 Private companies that sponsor research often maintain a cloak of secrecy around their work and refrain from publishing data or results, but at least some of these practices can be justified by companies' legitimate interest in protecting their intellectual property and achieving a competitive advantage in the marketplace. Trade secrecy laws affirm society's commitment to protecting proprietary information.39
A common way of thinking about restrictions on scientific publication is to assess the potential benefits and risks of different choices and then to select the option that has the greatest balance of benefits over risks. Much of the debate about the H5N1 papers appears to reflect this viewpoint.40 The NSABB eventually recommended full publication of the H5N1 papers because it judged that this option offered the greatest balance of benefits over risks.
How we go about identifying, estimating, and making decisions about risks and benefits is sometimes a complex process, however. Risk is usually defined as the product of the probability and degree (or severity) of harm,41 so that a choice that has a low probability of producing a very harmful outcome (such as death) may be considered more risky than a choice that has a high probability of producing a minimally harmful outcome (such as headache). Sometimes, however, we are not able to estimate the risks of different choices because we lack enough evidence to assign objective probabilities to the outcomes. An “objective probability” is a probability based on statistical frequencies, mathematical modeling, or scientific analyses, such as when the Food and Drug Administration uses evidence from clinical trials and other scientific studies to estimate the benefits and risks of putting a new drug on the market. In the case of the controversial H5N1 research, statistical estimates of the risks of the publication were out of the question; obtaining this data would require actually publishing the research and observing the effects. Some have argued that the probabilities used in risk-benefit reasoning need not be founded on objective evidence, but can be based on subjective, educated guesses. However, there are problems with using the subjective approach. First, we often lack sufficient evidence even to make an educated guess. Second, educated guesses are susceptible to biases resulting from philosophical assumptions, economic interests, ideology, and so on.42 Risk-benefit assessment pertaining to science and technology policy should therefore be based on objective evidence.43
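The definition of risk as the product of the probability and the degree of harm can be made concrete with a toy calculation. The numbers below are invented purely for illustration and are not estimates from the text:

```python
# Risk as probability x severity of harm (toy numbers, purely illustrative).
# Severity is scored on an arbitrary 0-100 scale (100 = death, 0.1 = mild headache).

def risk(probability, severity):
    """Expected harm: probability of the outcome times its severity."""
    return probability * severity

# A very unlikely but catastrophic outcome...
rare_catastrophe = risk(probability=0.001, severity=100.0)  # ~0.1
# ...can carry more risk than a likely but trivial one.
frequent_headache = risk(probability=0.5, severity=0.1)     # ~0.05

assert rare_catastrophe > frequent_headache
```

The calculation also shows why the framework breaks down for the H5N1 papers: it is only usable when the probability inputs can be estimated objectively.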
The Precautionary Principle
How should we make decisions when we cannot obtain an objective estimate of the risks of different options? Decision theorists have developed strategies for making these types of choices, known as decisions under ignorance. The most conservative strategy, known as maximin, holds that we should maximize our minimum outcomes: we should consider the worst possible outcomes of different choices and make the choice that avoids them.44 For example, suppose you are asked to play Russian roulette and you don't know whether the gun is loaded. In this case, the risks are unknown. If you choose not to play in order to avoid the worst possible outcome—shooting yourself in the head—then you are applying a maximin strategy.
A standard criticism of the maximin approach is that it may prevent us from obtaining important benefits.45 In some situations, it may be worth taking some risks in order to pursue worthwhile opportunities. Suppose that you are trying to decide whether to drive to work and you don't know the probability that you will have an accident if you drive. Let's assume, too, that going to work is a significant benefit and that walking or biking to work or using mass transportation is out of the question. If the worst possible outcome would be that you have an automobile accident and die, then a maximin strategy would lead you not to drive. Yet most people would nonetheless consider this to be an imprudent choice.
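The maximin rule is easy to state algorithmically: take the option whose worst payoff is least bad. The utilities below are hypothetical numbers of my own, chosen only to mirror the Russian roulette example:

```python
# Maximin for decisions under ignorance: with no probabilities available,
# choose the option whose WORST possible outcome is least bad.
# Payoffs are hypothetical utilities (higher is better); values are illustrative.

def maximin(options):
    """Return the option that maximizes the minimum payoff."""
    return max(options, key=lambda option: min(options[option]))

outcomes = {
    "play_russian_roulette": [10, -1000],  # small thrill vs. catastrophic loss
    "decline_to_play": [0, 0],             # nothing gained, nothing lost
}

print(maximin(outcomes))  # -> decline_to_play
```

Applied to the driving example, the same rule would keep you home whenever any outcome of driving is worse than every outcome of staying home, which is exactly the conservatism criticized above.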
Decision theorists have developed a number of different strategies for making decisions under ignorance that are not as conservative as maximin. I will not review all of these approaches here but instead will offer the precautionary principle as a useful alternative to maximin for making decisions under ignorance.46 The basic insight of the precautionary principle is that we should take reasonable measures to prevent serious harms. German environmental law scholars articulated the precautionary principle (Vorsorgeprinzip) in the 1980s. Since then, it has been used by the United Nations, the European Commission, and other organizations for making decisions related to environmental protection, public health, and technology development.47 Many object to it on the grounds that it thwarts the development of science and technology,48 but a number of theorists have sought to answer the charge of conservatism by showing that the principle can take the costs of prevention into account.49 Understood properly, the precautionary principle is not antiscience or antitechnology.
One of the most important conditions in an acceptable articulation of the precautionary principle is that the harms must be plausible. We must have some credible evidence that the harms could occur as a result of our actions or policies even though we do not have enough evidence to assign objective probabilities to different outcomes. The precautionary principle advises us to address genuine threats, not Chicken Little fantasies. Another important condition is that we should employ reasonable measures to prevent harm. A measure is reasonable if it balances the different values at stake fairly, is proportional to the nature of the threat, and is effective. A third condition is that we should consider different strategies for preventing harms. In some cases, we should do everything possible to avoid the harms; in other cases, the most reasonable course of action may be to minimize or mitigate harms.50 Whether we decide to avoid, minimize, or mitigate harms depends on how we weigh and consider the different values at stake.51 Putting these points together, the basic idea of the precautionary principle is that we should take reasonable measures to avoid, minimize, or mitigate harms that are plausible and serious.
The examples of Russian roulette and driving to work illustrate this thought. The precautionary principle would advise us to avoid playing Russian roulette because the harms are serious and plausible and there are no significant benefits to playing the game other than its entertainment value. The precautionary principle would advise us to drive our car to work, rather than stay at home, as long as we take steps to minimize or mitigate the harms. We should drive carefully, use a safe automobile, wear a seat belt, and so on. The precautionary principle would advise us to drive instead of staying at home because we would deny ourselves a significant benefit by staying at home.
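The logic of these two examples can be sketched schematically. The boolean inputs and coarse outputs below are placeholders of my own devising, not part of any formal statement of the principle:

```python
# A schematic sketch of the precautionary principle as characterized above.
# The inputs and return values are illustrative placeholders only.

def precautionary_response(harm_plausible, harm_serious, benefit_significant):
    """Return a coarse course of action under the precautionary principle."""
    if not (harm_plausible and harm_serious):
        return "proceed"            # no genuine threat to address
    if benefit_significant:
        return "minimize/mitigate"  # pursue the benefit, but reduce the harm
    return "avoid"                  # no benefit worth the plausible harm

# Russian roulette: serious, plausible harm; no significant benefit.
print(precautionary_response(True, True, False))  # -> avoid
# Driving to work: serious, plausible harm; significant benefit.
print(precautionary_response(True, True, True))   # -> minimize/mitigate
```

The sketch omits the weighing of values that the text says governs the choice among avoiding, minimizing, and mitigating; a real application would involve judgment, not a lookup.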
The Ethics of Knowledge and the H5N1 Controversy
Much of the NSABB's debate about the H5N1 papers focused on assessing the benefits and risks of publication. At the March 2012 meeting, a significant majority of the NSABB agreed that the benefits of publishing these papers outweighed the risks. Their argument for this conclusion rests on two assumptions: (1) publishing the research could significantly benefit public health by providing scientists with information that is useful for monitoring avian populations for dangerous mutations of H5N1 or for developing vaccines or treatments, and (2) publishing the research poses an insignificant risk of harm to society because the information contained in the papers cannot be readily applied to make a dangerous human pathogen, and accidental contamination can be avoided with appropriate biosafety measures. A substantial minority of the NSABB did not accept these assumptions and disagreed with the majority's conclusion.
Was risk-benefit reasoning the appropriate framework for making decisions pertaining to publication of the H5N1 papers? As argued above, to use risk-benefit reasoning for making decisions related to science and technology we need to be able to assign objective probabilities to different outcomes. If we cannot assign objective probabilities to different outcomes, then we should consider strategies for making decisions under ignorance, such as the precautionary principle. The evidence to date strongly suggests that the NSABB's risk-benefit assessment related to the decision did not appeal to objective probabilities. Those who favored publication believed the risk of harm to society was insignificant, while those who opposed publication believed it was significant. What did “significant” mean in this debate? Was the probability of full publication leading to substantial harm 0.001, 0.01, or 0.10? One can argue that there was no way to use statistics or other scientific methods to estimate this probability objectively, due to lack of evidence. The best one could hope to do would be to make an estimate based on an educated guess. But the fact that different experts in the relevant disciplines disagreed about the risks of publication suggests that there may not be enough evidence to even make an educated guess concerning the probability of harm.
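The contrast between risk-benefit reasoning and decision-making under ignorance can be made concrete with a small sketch. Maximin, which ranks options by their worst-case outcomes, is one standard rule for choosing when probabilities are unavailable; it is offered here as an illustration of the genre, not as the rule the precautionary principle prescribes, and the utility numbers below are invented for the example:

```python
# Maximin: a standard rule for decision-making under ignorance.
# Expected utility cannot be computed here because no probabilities
# are attached to the scenarios -- which is precisely the situation
# the NSABB faced. All utility values are hypothetical.

def maximin(options):
    """Pick the option whose worst-case outcome is least bad."""
    return max(options, key=lambda name: min(options[name]))

# Hypothetical utilities for each publication option under three
# scenarios: no misuse, accidental release, deliberate misuse.
options = {
    "full publication":     [10, -50, -100],
    "redacted publication": [6,  -20,  -40],
    "classification":       [-5,  -5,   -5],
}

print(maximin(options))  # -> classification (worst case -5 beats -40 and -100)
```

Note that maximin needs only an ordering of outcomes, not probabilities; different rules for decision under ignorance (minimax regret, the precautionary principle itself) weigh the same outcome table differently.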
One might challenge this conclusion. Since the turn of the century, several papers have been published that raised biosecurity issues.52 These include research on enhancing the virulence of a mousepox virus that could be used to enhance the virulence of the human smallpox virus,53 a study on how to make a polio virus from available sequence data and mail-order supplies,54 a study on the genetics of human smallpox virus that could be used to develop a strain of the pathogen that overcomes the immune system's defenses,55 a paper describing how to infect the U.S. milk supply with botulinum toxin,56 and a paper demonstrating how to reconstruct the extinct 1918 Spanish influenza virus, which caused over fifty million deaths worldwide, from published sequence data.57 So far, none of these papers has contributed to known acts of bioterrorism or any other harmful activities. There has been one major incident of bioterrorism in the twenty-first century: the anthrax mailings in the fall of 2001, which killed five people and sickened seventeen others. However, no evidence suggests that the person responsible for these attacks made use of papers published in the scientific literature to develop his weapon.58 One might reasonably conclude from this evidence that the probability that the publication of the H5N1 papers will contribute to bioterrorism is very small, perhaps 1 percent or less.
There are two problems with this response. The first is that the sample size is too small to support any statistically significant inferences. One might try to overcome this problem by gathering more data, but that leads to the second problem: the sample might be biased. The published studies might not be as dangerous as the H5N1 research, and the sample would exclude research related to bioweapons and biosecurity that has been classified. It is conceivable that there would have been many more incidents of bioterrorism if that classified research had been published.
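A back-of-the-envelope calculation shows how little can be inferred from zero observed incidents. By the "rule of three," observing no events in n independent trials yields an approximate 95 percent upper confidence bound of 3/n on the event probability. Treating the handful of published dual-use papers as n = 5 independent trials is an illustrative assumption, not a claim from the source, but even so the resulting bound sits far above "1 percent or less":

```python
# Exact 95% upper confidence bound when zero events are observed
# in n trials: solve (1 - p)^n = 0.05 for p. The "rule of three"
# approximates this bound as 3/n. Here n = 5 is an illustrative
# stand-in for the handful of dual-use papers discussed above.
n = 5
exact_upper = 1 - 0.05 ** (1 / n)   # about 0.45
rule_of_three = 3 / n               # 0.60

print(f"exact 95% upper bound: {exact_upper:.2f}")
print(f"rule-of-three bound:   {rule_of_three:.2f}")
```

Even counting every published dual-use paper and observing no misuse, the data cannot rule out per-paper misuse probabilities dozens of times larger than 1 percent; only a much larger, unbiased sample could support such an estimate.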
The foregoing discussion supports the view that policy-makers and scientists should have used the precautionary principle, rather than traditional risk-benefit reasoning, for making decisions about publishing the H5N1 papers. Although we cannot quantify the risks, the threats are plausible and serious. The precautionary principle would instruct us to take reasonable measures to avoid, minimize, or mitigate those risks.59 To decide what counts as a reasonable measure, we must balance the values at stake in light of the different options. The values that clearly favor publication include advancing scientific knowledge and respecting freedom of inquiry and scientific openness. The values opposed to publication include preventing harm to individuals, society, and national security. The goal of promoting public health might or might not favor publication, since publication could enhance the public health response to the threat of H5N1 but could also provide terrorists (or others) with information that could be used to make bioweapons. Publication could also lead to accidental release of the pathogen.
The government's possible options regarding publication included classification, censorship, asking the researchers to publish the paper in redacted form, and recommending full publication. Given the values at stake, what would have been a reasonable response to the threat of H5N1? A reasonable response would balance conflicting values fairly, be proportional to the nature of the threat, and be effective.
Consider classification first. One could argue that classification would not balance the competing values fairly because it would completely sacrifice scientific progress, openness, and freedom of inquiry for harm prevention. Classification would also be a disproportionate response to the threat posed by publication of the papers because it would be an extreme reaction to the threat that would prevent scientists and public health officials from learning more about potential mutations of H5N1. Classification would be effective at reducing the threat, however, as it presents no significant legal or practical issues. Since the research was funded by the NIH, the government had legal authority to seek classification. As noted above, classification is one of the options mentioned in the new government policy for managing DURC risks that cannot be adequately mitigated. If the research had not been funded by the government, then classification would not have been an option, but in that case, the government could have offered to fund it (assuming the government knew about it at an earlier stage) so that it could be classified.60
Censorship also would not be a reasonable response to the threat of bioterrorism because, like classification, it would not balance the competing values fairly, and it would be a disproportionate reaction to the threat. Like classification, censorship would give too much weight to harm prevention and not enough weight to sharing important information with scientists and public health officials. There are also practical and legal problems with censorship that undermine its effectiveness. First, censorship would be a direct violation of National Security Decision Directive 189, a federal policy implemented under the Reagan Administration and affirmed by every administration since then.61 NSDD 189 requires that access to the products of federally funded fundamental research (defined as basic or applied research in science or engineering that is ordinarily published and shared broadly with the scientific community) shall remain unrestricted to the maximum extent possible.62 NSDD 189 also states that when the results of fundamental research pose a threat to national security, the mechanism for controlling the dissemination of information is classification, and that no restrictions may be placed on conducting or reporting federally funded fundamental research that has not been classified, provided that the research does not violate U.S. statutes.63 Since NSDD 189 is only a government policy and not a statute or regulation, government actions that violate the policy would not be illegal, but they would raise troubling questions about the consistency of U.S. policy.
Second, censorship might violate the First Amendment to the U.S. Constitution, which protects freedom of speech. U.S. courts have held that there is a strong presumption against government restrictions on speech that constitute prior restraint, such as threatening to prosecute someone for publication or seeking an injunction to block publication.64 To censor or block publication, the government needs to articulate a clear and compelling state interest, such as national security. In an important free speech case in 1971, New York Times v. United States, the U.S. Supreme Court held that the government could not block publication of the Pentagon Papers, even though the information had been classified, because it had not articulated a clear and compelling state interest.65 The Pentagon Papers contained information about the U.S. government's decision-making related to the Vietnam War. Daniel Ellsberg obtained the papers and released them to the Times. The Court held that the government may punish someone who divulges classified information but that in this case at least, it did not have the authority to stop publication.66 Generalizing from this case, one could argue that it would be difficult for the U.S. government to demonstrate that the H5N1 papers were so dangerous that publication could be blocked.67
Asking the scientists to publish the research in redacted form would appear to be a reasonable response that balances the competing values fairly and is proportional to the nature of the threat. Key details that would enable someone to make a bioweapon would be removed from the publication, but the overall results of the research would be made available to responsible scientists and public health officials. Redacted publication represents a compromise between promoting scientific research and preventing harm. It allows scientists and public health officials to learn more about H5N1, but it does not provide terrorists with a recipe for making a bioweapon.
While redacted publication satisfies the conditions of fairness and proportionality, it might not be effective at reducing the threat. It confronts both practical and legal problems. First, asking the researchers to publish the research in redacted form would appear to violate the spirit if not the letter of NSDD 189.68 Asking the researchers to publish in redacted form would not technically be a restriction imposed by the government because the researchers, not the government, would be imposing the restriction. However, it would be difficult for researchers to refuse the government's request to publish a paper in redacted form, especially if they want to obtain government funding in the future.
Second, redacted publication requires scientists, journal editors, and government officials to develop a system for making full papers with sensitive information available only to responsible scientists and public health officials. There is currently no such system in place, and developing one may be difficult.69 To determine who is a responsible scientist or public health official, it may be necessary to require those who want access to the full details of redacted papers to submit to a thorough background check to verify their trustworthiness. This could be time-consuming, costly, and intimidating. To provide access to papers, a safe method for distributing the papers, such as a secure electronic database, would need to be established and overseen by a responsible organization. (The NSABB has recommended that the U.S. government should establish an organization to perform this function.70) Enforcing the rules pertaining to access to sensitive information could be difficult. To enforce the rules, the organization overseeing the sensitive research would need to have the authority to investigate scientists and impose penalties. If the organization is an agency of the U.S. government, then it would have the authority to investigate and sanction U.S. scientists, such as government employees or individuals funded by federal grants. However, since the agency might not have legal authority over foreign scientists, international cooperation would be needed to enforce rules concerning control of sensitive information. Enforcement mechanisms are necessary to ensure that scientists who receive sensitive information do not disclose it without permission. An honor system, in which scientists agree to abide by the rules but are not subject to penalties if they violate them, is not sufficient.
Third, while redacted publication may still benefit a small group of researchers who have access to the full paper, it may have limited value to the broader scientific community because the paper may omit key details needed to evaluate the research or replicate experiments. In the case of the disputed H5N1 papers, it might not be very helpful to inform the scientific community that one has conducted experiments to make the virus transmissible by air between mammals without telling them how the experiments were done. One way of handling this problem is to ensure that the community with access to the full version of the redacted paper has enough size, diversity, and resources to advance the science, even if the larger community lacks this information.
Fourth, research funded by the U.S. federal government may still be available through a Freedom of Information Act (FOIA) request. FOIA allows individuals to obtain access to federal agency records through a written request to the relevant agency that describes the information sought. The agency must comply with the request unless the records are protected by one of nine exemptions to FOIA or three law enforcement exclusions. The FOIA exemption most relevant to dual-use research would be the exemption that permits agencies to refuse to disclose classified information.71
An important legal question is whether FOIA allows individuals to access scientific research data funded by federal agencies, such as the NIH, NSF, or the Centers for Disease Control and Prevention. The Shelby Amendment—named after its sponsor, Alabama Senator Richard Shelby—requires the Office of Management and Budget (OMB) to ensure that all data produced under a government grant or contract are available to the public through a FOIA request,72 but at the time the amendment was passed, in 1999, there was considerable debate about how the OMB should implement the requirement. Scientists were concerned that FOIA requests could be used by competitors or private companies to gain access to preliminary data or to interfere with research.73 The OMB issued two Notices of Proposed Rulemaking in the Federal Register and received over twelve thousand comments. The final version of the rule held that only published research data and materials necessary to validate data (such as methods) would be available under FOIA. Published data are limited to data published in peer-reviewed journals or cited by federal agencies to support regulations. The data accessed under FOIA do not include preliminary analyses, trade secrets, information protected by copyrights or patents, or drafts of scientific papers.74
The upshot is that the data and methods supporting scientific papers published in peer-reviewed journals may be available to the public through a FOIA request. Publishing a redacted version of a paper may therefore not be an effective way of keeping federally funded research from terrorists or others who might misuse it. The request need not even be made by terrorists or by others on their behalf: once information is disclosed via a FOIA request and disseminated publicly, it becomes available to terrorists as well.
The practical and legal problems with redacted publication indicate that this may not have been a reasonable measure for addressing the threats posed by the H5N1 papers. Redacted publication might be an effective option for dual-use publication dilemmas in the future, however. To make redacted publication more effective, it may be necessary for the U.S. government to develop a system for making redacted papers available only to responsible scientists and public health officials. International cooperation would also be needed to ensure compliance with the rules of the system. Additionally, the OMB may need to revise its implementation of the Shelby Amendment to limit access to data and methods from research that poses a threat to national security but has not been classified.
Full Publication: A Halfway Solution
This leaves the remaining option, full publication. While there were no practical or legal problems with full publication, one could argue that this option does not balance competing values fairly and is not a proportionate response because it does nothing to address potential harms resulting from the research. It may be preferable to the other three options, but it is still a less-than-optimal response to publication dilemmas.
Similar considerations apply to the other two agents in this dilemma. Editors of the journals would have three options: redacted publication, no publication, and full publication. As noted above, there would be practical and legal problems with redacted publication, so this would not have been a reasonable option, and it will not be one for journals dealing with DURC until these wrinkles are ironed out. Refusing to publish the research would not balance the competing values fairly because it would place too much emphasis on harm prevention. It would also not be a very effective response to the threat; the investigators would still be able to publish their work in another journal or share it by other means. Refusing to publish the research might delay public dissemination but would not stop it. Again, we are left with the option of full publication, the problems with that option notwithstanding.
Journals could encounter additional dilemmas if they come into conflict with the government. Suppose the government recommends redacted publication, but the scientists do not agree and seek full publication. Or even worse, suppose the government plans to block publication, but the scientists still seek publication. How should journals respond to these predicaments? One could argue that journals should make their own decisions, irrespective of government policies, in order to maintain their editorial independence. Succumbing to the government's requests would make the journal an instrument of government policy, which would set a troubling precedent. However, refusing to abide by government policies could place a journal at risk of legal liability, which most journals would also like to avoid. So there would be no easy solutions if journals disagree with government policies.
The investigators, too, would have the options of redacted publication, no publication, and full publication. Redacted publication would probably not be reasonable because of the practical and legal problems canvassed above. Refusing to publish the research would not be reasonable because it would not alert scientists and public health officials to the threats posed by possible mutations of H5N1. Additionally, refusing to publish the research might put the investigators in conflict with their sponsors, if the sponsors favor full or redacted publication. As noted above, the NIH requires funded investigators to share data, results, and methods. Refusing to publish NIH-funded research could place investigators at odds with the agency and could prevent them from obtaining funding in the future.
This leaves the option of seeking full publication—less than ideal because it would do nothing to prevent harm, but probably the prudent choice for the researchers, given their circumstances. One way they could address the threats posed by their research would be to write a note to the editors expressing their concerns about the security issues raised by their work. The editors could then decide how to handle their submission. The investigators could also express their concerns to the research sponsors before submitting their papers for publication, giving the sponsors an opportunity to review the research and make recommendations.
Bioscience research that may be used for terrorist or other harmful purposes presents difficult dilemmas for scientists, journal editors, government agencies, and sponsors. Redacted publication would have been a reasonable response to the threats posed by the controversial H5N1 papers if not for the practical and legal problems with this alternative. Full publication, the option ultimately recommended by the NSABB, may have been preferable to the other options given the circumstances, but it would not be the best option if those circumstances could be changed. Government agencies, journals, and scientists from different nations should work together to make redacted publication a viable option for dealing with papers that raise DURC issues.
I am grateful to Bruce Androphy, Kenneth De Ville, and William Schrader for helpful comments.
Publisher's Disclaimer: This article is the work product of an employee or group of employees of the National Institute of Environmental Health Sciences, National Institutes of Health (NIH). However, the statements, opinions, or conclusions contained therein do not necessarily represent the statements, opinions or conclusions of NIEHS, NIH, or the United States government.
1. Kitcher P. Science, Truth, and Democracy. Oxford University Press; New York: 2001. Resnik DB. Playing Politics with Science: Balancing Scientific Independence and Government Oversight. Oxford University Press; New York: 2009.
2. Atlas RM, Dando D. The Dual-Use Dilemma for the Life Sciences: Perspectives, Conundrums, and Global Solutions. Biosecurity and Bioterrorism. 2006;4:276–86. Selgelid M. A Tale of Two Studies: Ethics, Bioterrorism, and the Censorship of Science. Hastings Center Report. 2007;37(no. 3):35–43. Douglas T, Savulescu J. Synthetic Biology and the Ethics of Knowledge. Journal of Medical Ethics. 2010;36:687–93. Kuhlau F, et al. A Precautionary Principle for Dual Use Research in the Life Sciences. Bioethics. 2011;25:1–8.
3. Cohen J, Malakoff D. On Second Thought, Flu Papers Get Go-Ahead. Science. 2012;336:19–20. Wolinetz CD. Implementing the New U.S. Dual-Use Policy. Science. 2012;336:1525–27. Imai M, et al. Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets. Nature. 2012;486:420–28. Russell CA, et al. The Potential for Respiratory Droplet-Transmissible A/H5N1 Influenza Virus to Evolve in a Mammalian Host. Science. 2012;336:1541–47.
4. Cohen, Malakoff. On Second Thought, Flu Papers Get Go-Ahead.
5. National Research Council . Biotechnology in the Age of Terrorism. National Academies Press; Washington, D.C: 2004.
6. Office of Biotechnology Activities Frequently Asked Questions. http://oba.od.nih.gov/biosecurity/nsabb_faq.html#NSABB_FAQ001.
7. National Research Council . Biotechnology in the Age of Terrorism.
8. Imai, et al. Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets.
10. Russell, et al. The Potential for Respiratory Droplet-Transmissible A/H5N1 Influenza Virus to Evolve in a Mammalian Host.
11. National Science Advisory Board for Biosecurity. Findings and Recommendations, March 29–30, 2012. http://oba.od.nih.gov/biosecurity/biosecurity_documents.html.
12. Berns KI, et al. Adaptations of Avian Flu Virus Are a Cause for Concern. Science. 2012;335:660–61.
14. Malakoff D. H5N1 Flu Controversy Spurs Research Moratorium. Science. 2012;335:387–89.
15. Cohen J. WHO Group: H5N1 Papers Should be Published in Full. Science. 2012;335:899–900.
16. National Science Advisory Board for Biosecurity. Findings and Recommendations, March 29–30, 2012.
17. Osterholm MT. Letter to Amy Patterson. 2012 Apr 12. http://news.sciencemag.org/scienceinsider/NSABB%20letter%20final%2041212_3.pdf.
18. Wolinetz. Implementing the New U.S. Dual-Use Policy.
19. U.S. Government Policy for Overseeing Life Sciences Dual Use Research of Concern. http://oba.od.nih.gov/oba/biosecurity/PDF/United_States_Government_Policy_for_Oversight_of_DURC_FINAL_version_032812.pdf.
20. Shamoo AE, Resnik DB. Responsible Conduct of Research. 2nd ed. Oxford University Press; New York: 2009.
22. Lipsitch M, et al. Evolution, Safety, and Highly Pathogenic Influenza Viruses. Science. 2012;336:1529–31.
23. Resnik DB. The Price of Truth: How Money Affects the Norms of Science. Oxford University Press; New York: 2007.
24. Shamoo, Resnik . Responsible Conduct of Research.
25. Resnik. Playing Politics with Science.
26. Mill JS. On Liberty. Liberal Arts Press; New York: 1956 (first published 1859). Resnik. Playing Politics with Science.
27. Joravsky D. The Lysenko Affair. University of Chicago Press; Chicago, Ill: 1986.
29. Kempner J. The Chilling Effect: How Do Researchers React to Controversy? PLoS Medicine. 2008;5(no. 11):e222.
30. Resnik . Playing Politics with Science.
31. Scanlon T. A Theory of Freedom of Expression. Philosophy and Public Affairs. 1972;1:204–226.Dworkin R. A Matter of Principle. Harvard University Press; Cambridge, Mass: 1985. Baker CE. Human Liberty and Freedom of Speech. Oxford University Press; New York: 1989. Brison SJ. The Autonomy Defense of Free Speech. Ethics. 1998;108:312–39.
32. Dworkin G. The Theory and Practice of Autonomy. Cambridge University Press; Cambridge, U.K: 1988.
34. Resnik . Playing Politics with Science.
35. Brown MB, Guston DH. Science, Democracy, and the Right to Research. Science and Engineering Ethics. 2009;15:351–66. Shamoo, Resnik. Responsible Conduct of Research.
36. Shamoo, Resnik. Responsible Conduct of Research.
37. Resnik DB. Genomic Research Data: Open vs. Restricted Access. IRB: Ethics and Human Research. 2010;32(1):1–6.
38. Resnik . Playing Politics with Science.
39. Resnik . The Price of Truth.
40. Fouchier RA, Herfst S, Osterhaus AD. Restricted Data on Influenza H5N1 Virus Transmission. Science. 2012;335:662–63. Perez DR. H5N1 Debates: Hung up on the Wrong Questions. Science. 2012;335:799–801. Osterholm MT, Henderson DA. Life Sciences at a Crossroads: Respiratory Transmissible H5N1. Science. 2012;335:801–2. Berns, et al. Adaptations of Avian Flu Virus Are a Cause for Concern.
41. Resnik MD. Choices: An Introduction to Decision Theory. University of Minnesota Press; Minneapolis: 1987.
42. Earman J. Bayes of Bust? A Critical Examination of Bayesian Confirmation Theory. MIT Press; Cambridge, Mass: 1992.
43. Resnik DB. Is the Precautionary Principle Unscientific? Studies in the History and Philosophy of Biology and the Biomedical Sciences. 2003;34:329–44.
44. Resnik . Choices.
46. For a review of strategies for making decisions under ignorance, see Resnik. Choices.
47. United Nations. Rio Declaration on Environment and Development. http://www.un-documents.net/rio-dec.htm. European Commission. Communication from the Commission on the Precautionary Principle. http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2000:0001:FIN:EN:PDF.
48. Sunstein CR. Laws of Fear: Beyond the Precautionary Principle. Cambridge University Press; Cambridge, U.K: 2005.
49. Munthe C. The Price of Precaution and the Ethics of Risk. Springer; Dordrecht, the Netherlands: 2011.
51. Resnik DB. Environmental Health Ethics. Cambridge University Press; Cambridge, U.K: 2012.
52. Selgelid. A Tale of Two Studies.
53. Jackson R, et al. Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox. Journal of Virology. 2001;75:1205–1210.
54. Cello J, Paul A, Wimmer E. Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template. Science. 2002;297:1016–18.
55. Rosengard A, et al. Variola Virus Immune Evasion Design: Expression of a Highly Efficient Inhibitor of Human Complement. Proceedings of the National Academy of Sciences. 2002;99:8808–8813.
56. Wein L, Liu Y. Analyzing a Bioterror Attack on the Food Supply: The Case of Botulinum Toxin in Milk. Proceedings of the National Academy of Sciences. 2005;102:9984–89.
57. Tumpey TM, et al. Characterization of the Reconstructed 1918 Spanish Influenza Pandemic Virus. Science. 2005;310:77–80.
58. Selgelid. A Tale of Two Studies.
59. Kuhlau, et al. A Precautionary Principle for Dual Use Research in the Life Sciences.
60. Resnik . Playing Politics with Science.
61. Wolinetz. Implementing the New U.S. Dual-Use Policy.
62. U.S. Government National Security Decision Directive 189. http://www.fas.org/irp/offdocs/nsdd/nsdd-189.htm.
64. Gostin LO. The Influenza Controversy: Should Limits be Placed on Science? Hastings Center Report. 2012;42(no. 3):12–13.
65. N.Y. Times v. United States, 403 U.S. 713 (1971).
67. Gostin. The Influenza Controversy.