The Proactionary Principle emerged out of a critical discussion of the precautionary principle during Extropy Institute’s Vital Progress Summit in 2004. We saw that the precautionary principle is riddled with fatal weaknesses. Not least among these is its strong bias toward the status quo and against the technological progress so vital to the continued survival and well-being of humanity.
Participants in the VP Summit understood that we need to develop and deploy new technologies to feed billions more people over the coming decades, to counter natural threats—from pathogens to environmental changes—and to alleviate human suffering from disease, damage, and the ravages of aging. We recognized the need to formulate an alternative, more sophisticated principle incorporating more extensive and accurate assessment of options while protecting our fundamental responsibility and liberty to experiment and innovate.
With input from some of those at the Summit, I developed the Proactionary Principle to embody the wisdom of structure. The Principle urges all parties to actively take into account all the consequences of an activity—good as well as bad—while apportioning precautionary measures to the real threats we face, and to do all this while appreciating the crucial role played by technological innovation and humanity’s evolving ability to adapt to and remedy any undesirable side-effects.
The exact wording of the Principle matters less than the ideas it embodies. The Principle is an inclusive, structured process for maximizing technological progress for human benefit while heightening awareness of potential side-effects and risks. In its briefest form, it says:
Progress should not bow to fear, but should proceed with eyes wide open.
More flatly stated:
Protect the freedom to innovate and progress while thinking and planning intelligently for collateral effects.
Expanded to make room for some specifics:
Encourage innovation that is bold and proactive; manage innovation for maximum human benefit; think about innovation comprehensively, objectively, and with balance.
We can call this “the” Proactionary Principle so long as we realize that the underlying Principle is less like a sound bite than a set of nested Chinese boxes or Russian matryoshka dolls. If we pry open the lid of this introductory-level version of the Principle, we will discover ten component principles lying within:
1. Guard the Freedom to Innovate: Our freedom to innovate technologically is valuable to humanity. The burden of proof therefore belongs to those who propose measures to restrict new technologies. All proposed measures should be closely scrutinized.
2. Use Best Objective Methods: Use a decision process that is objective, structured, and explicit. Evaluate risks and generate alternatives and forecasts according to available science, not emotionally shaped perceptions:
· Use the best-validated and most effective methods available;
· Use explicit forecasting processes with rigorously structured inputs, and fully disclose the forecasting procedure;
· Reduce biases by selecting disinterested experts, by using the devil’s advocate procedure with judgmental methods, and by using auditing procedures such as review panels.
3. Be Comprehensive: Consider all reasonable alternative actions, including no action. Estimate the opportunities lost by abandoning a technology, and take into account the costs and risks of substituting other credible options. When making these estimates, use systems thinking to carefully consider not only concentrated and immediate effects, but also widely distributed and follow-on effects, as well as the interaction of the factor under consideration with other factors.
4. Embrace Input: Take into account the interests of all potentially affected parties, and keep the process open to input from those parties or their legitimate representatives.
5. Simplify: Use methods that are no more complex than necessary, taking into account the other principles.
6. Prioritize and Triage: When choosing among measures to ameliorate unwanted side effects, prioritize decision criteria as follows:
· Give priority to reducing non-lethal threats to human health over threats limited to the environment (within reasonable limits);
· Give priority to reducing immediate threats over remote threats;
· Give priority to addressing known and proven threats to human health and environmental quality over hypothetical risks;
· Prefer the measure with the highest expectation value by giving priority to more certain over less certain threats, to irreversible or persistent impacts over transient impacts, and to proposals that are more likely to be accomplished with the available resources.
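The expectation-value criterion above can be given a rough numeric illustration. The following sketch is not part of the Principle itself: the scoring function, the doubling factor for irreversible impacts, and the sample threat data are all hypothetical assumptions, offered only to show how the qualitative priorities (certainty, persistence, achievability) might combine into a single ranking.

```python
# Illustrative sketch only: the Principle states qualitative priorities;
# these numeric weights and example threats are hypothetical assumptions.

def expectation_value(probability, severity, irreversible, feasibility):
    """Score a candidate measure; higher scores get priority."""
    score = probability * severity   # expected harm averted (more certain threats rank higher)
    if irreversible:
        score *= 2.0                 # weight irreversible/persistent impacts over transient ones
    return score * feasibility       # favor measures achievable with available resources

measures = [
    {"name": "known pathogen control", "probability": 0.9, "severity": 8,
     "irreversible": False, "feasibility": 0.8},
    {"name": "hypothetical remote risk", "probability": 0.1, "severity": 9,
     "irreversible": True, "feasibility": 0.3},
]

ranked = sorted(
    measures,
    key=lambda m: expectation_value(m["probability"], m["severity"],
                                    m["irreversible"], m["feasibility"]),
    reverse=True,
)
print([m["name"] for m in ranked])
```

On these assumed numbers, the known and proven threat outranks the hypothetical one, consistent with the third criterion above; any real application would of course contest and revisit such weightings.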
7. Apply Measures Proportionally: Consider restrictive measures only if the potential negative impact of an activity has both significant probability and severity. In such cases, if the activity also generates benefits, discount the impacts according to the feasibility of adapting to the adverse effects. If measures to limit technological advance do appear justified, ensure that the extent of those measures is proportionate to the extent of the probable effects, and that the measures are applied as narrowly as possible consistent with being effective.
8. Respect Diversity in Values: Recognize and respect the diversity of values among people, as well as the different weights they place on shared values. Whenever feasible, enable people to make reasonable, informed tradeoffs according to their own values.
9. Treat Symmetrically: Treat technological risks on the same basis as natural risks; avoid underweighting natural risks and overweighting human-technological risks. Fully account for the benefits of technological advances.
10. Revisit and Refresh: Create a trigger to prompt decision makers to revisit the decision, far enough in the future that conditions may have changed significantly, but soon enough to take effective and affordable corrective action.