A statement released on March 13 by a collection of environmental and other organizations is the latest salvo in a global debate about how far the regulation of synthetic biology should go in endorsing the “precautionary principle.” The statement, titled “Principles for the Oversight of Synthetic Biology,” calls for a broad moratorium on the release and commercial use of organisms developed through synthetic biology.
“Principles” is a helpful contribution in many ways. It highlights a set of issues that need deeper consideration, including the need to think more about synthetic biology’s potential impact on the environment, public health, and economic and social injustice. Its call for a more cautious approach to synthetic biology is a useful corrective, given that the debate about synthetic biology has so far largely rejected the precautionary principle. Unfortunately, when the statement tries to say just what the precautionary approach looks like, its message gets muddled by another set of concerns.
Synthetic biology is probably best thought of as a variant of genetic engineering. It differs from earlier approaches to genetic engineering in that it focuses on developing simplified genomes, made up of simplified, standardized, and modularized genetic sequences that can be inserted into organisms to cause them to behave in useful ways. In the grandest definitions offered of the field, synthetic biology is just the application to biology of the principles of engineering, leading to the design and construction of new biological systems and the redesign of existing ones.
In practice, the focus is on engineering genes, on the theory that genes are the governing software for biological systems, and on micro-organisms. Complex though microbes can be, they can be genetically tweaked more easily than most multicelled organisms, and they also look particularly useful. Some of the leading lines of work in synthetic biology would use microorganisms to produce valuable hydrocarbons, including fuels and various industrial materials.
Proponents of synthetic biology have rejected the precautionary principle on the grounds that it would bring the field to a grinding halt. The precautionary principle is an approach to evaluating risks and benefits that was developed by environmentalists as a strategy for heading off unexpected environmental damage. The basic idea is to shift the burden of proof from those who raise concerns about some new industrial project to those promoting it. Instead of waiting for proof that the project causes harm, society should ask those pushing the project to prove that it won’t cause harm. The problem, say the proponents of synthetic biology, is that we can never prove there won’t be harm. Thus the precautionary principle could lead to paralysis: interminable testing and waiting while we look for unknown risks.
Some commentators have looked for compromise positions. A 2009 report written by Erik Parens and coauthors at The Hastings Center argued for striking a balance between a precautionary stance and a “proactionary” stance that would try to fast-track the science. A 2010 report from the Presidential Commission for the Study of Bioethical Issues struck a somewhat similar stance by endorsing “prudent vigilance,” which the PCSBI said was “a middle ground” between halting the field entirely and “letting the science rip,” regardless of the likely risks.
But compromises are hard to find. Parens et al. did not try to say just what the compromise should be, and the PCSBI did try; it arguably tilted heavily toward the proactionary stance. Critics of the report, led by Friends of the Earth, the International Center for Technology Assessment, and the ETC Group, argued that it had “ignored” the precautionary principle: it called for caution and oversight and more public discussion but did not actually raise any bars. And some lines of work, such as the deliberate release of synthetic microbes into the environment, seem like good candidates for at least temporary bars, as ecologist Allison Snow argued before the PCSBI. There are plausible concerns that, for example, E. coli modified to process cellulose into fuel precursors, or E. coli modified to alter the human digestive system, might be bad for the environment or for public health.
“Principles for the Oversight of Synthetic Biology” was prepared jointly by the organizations that led criticism of PCSBI and cosigned by over 100 other nongovernmental organizations, and it comes down squarely in favor of the precautionary principle, but with a modification meant to avoid paralysis. Borrowing from the “Wingspread Consensus Statement on the Precautionary Principle,” the statement proposes that, “When an activity raises threats of harm to human health or the environment, precautionary measures should be undertaken even if some cause and effect relationships are not fully established scientifically.” The idea here is to spread the burden of proof around more evenly: opponents of a project need not “fully establish” how the project might be harmful, but they do have to roughly identify the cause and effect relationships that they have in mind. They have to point to threats of harm. The project’s proponents then bear the burden of showing that the harms do not pose an obstacle to the project.
This requirement is still too vague to pacify the proponents of synthetic biology, who naturally want to know exactly what they must establish in order to get on with their work, and who do not want the burden of proof to be unduly heavy. But it’s arguably a clearer requirement than “prudent vigilance,” which sounds meaty but does not substantively slow the field. Synthetic biology is sometimes described as the enabling technology for a new industrial revolution, and moreover as one that we ought as a society to promote because it can help correct some of the mistakes made in the first industrial revolution. But if we are to correct those mistakes, then we ought to avoid making them all over again, and that seems to mean we should be more cautious about lines of research that appear to raise threats of harm.
“Principles for the Oversight of Synthetic Biology” nonetheless goes somewhat astray. One problem is that the statement focuses only on the commercialization and release of organisms developed through synthetic biology, even though pure research also raises serious questions. Researchers have recently created a variant of bird flu (H5N1) that would probably have an extremely high fatality rate in humans yet, unlike all existing variants, could also be transmitted easily from person to person. It is not clear whether this research employed synthetic biology techniques, but it is certainly a line of research that could be pursued using those techniques.
Perhaps, in focusing on commercialization, “Principles” reflects an underlying antipathy to corporations. Other recommendations reveal an underlying discomfort with the technology itself, no matter how it is developed. “Principles” argues, for example, that the precautionary principle requires “that alternative approaches to synthetic biology applications have been fully considered,” but it’s not clear why all the alternatives must be exhausted first. Without specifying what alternatives must be considered and what it means to “fully consider” them, it is hard to imagine how the standard could ever be satisfied. The effect is a call for prohibition rather than a call for precaution.
The document also says that the precautionary approach “requires synthetic biology-specific oversight mechanisms that account for the ‘unique characteristics’ of synthetic organisms and their products.” But there are no such unique characteristics: there are no special physical characteristics that organisms have just by virtue of being genetically modified. What is really needed is that we pay attention to the characteristics of particular modified organisms and the plausible if not yet fully established threats of harm associated with those characteristics. This may change the way risks and benefits are tallied up, but it does not obviously require new oversight mechanisms. We might well be able to tweak the existing oversight mechanisms in the EPA, FDA, and USDA, for example. (“Principles” asserts that the field has proceeded “with little oversight or regulation,” but this is false. There has simply been little synthetic biology-specific oversight or regulation.)
These assertions imply that there is something inherently undesirable about organisms developed using synthetic biology techniques. In this view, such organisms are fundamentally worse than the alternatives, and their modifications give them unknown “unique characteristics.” They are not merely organisms developed using synthetic biology techniques, but “synthetic organisms,” a new kind of Frankensteinian bogeyman.
Whether having been developed using synthetic biology techniques makes organisms inherently bad is an interesting and important question. Whether there really are yet such things as “synthetic organisms” is also an interesting question. But these are questions that “Principles for the Oversight of Synthetic Biology” does not explicitly address. Worse, implicitly inserting them undermines the effort to think about the implications of the precautionary principle: it gives the impression that the goal is really just to stop the field entirely, and for reasons that have nothing to do with how the technology might be used.
What’s attractive about “Principles” is that it calls for serious attention to questions about environmental impact, public health, and social justice, and that it tries to articulate a more precautionary approach to the technology. Although that effort is muddled by extraneous concerns, the document is nonetheless a useful contribution to the debate about synthetic biology.
Gregory Kaebnick is a principal investigator of The Hastings Center’s research project on synthetic biology and editor of the Hastings Center Report.