Abstract
When engineers consider risk management, they have traditionally used a largely consequentialist framework in which the key consideration is the expectation value of total harm. However, as discussed by Hansson and by Cranor (in Lewens, 2007), non-consequentialist ethical theories enjoy substantial support among philosophers, and it is worth exploring the implications of those theories for engineering practice.
Though we can hardly survey all possible ethical theories, Kantian (deontological) and Rawlsian (justice-centered) theories are especially prominent. One feature these theories share is that they frown on, if they do not outright forbid, the imposition of risks on people who have not consented to them. This view has been further elaborated in much of the recent work on environmental justice (Coolsaet, 2020).
One implication is that risks should often be categorically reduced to a de minimis level. It is widely held that justice, unlike probability, does not come in degrees. If that is the case, then it is arguable that justice requires not merely a reduction in the expectation value of harm, but that some kinds of harm be avoided altogether wherever possible.
The obvious objection to that claim is that risk can never be avoided entirely. One response to that objection is that what really matters is the risk as perceived by those who are required to submit to it, or who have few alternatives to being subjected to it. Relatedly, we must remember that for events lacking a large sample, exact probabilities are not known and must be estimated. Even if we believe we have a good estimate of the probability of some adverse outcome, we should have the humility to respect other estimates. Quantitatively, this can be addressed with mathematical tools for imprecise probability, such as upper and lower probabilities (Walley, 1991).
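To make the idea concrete, one standard formulation in the imprecise-probability literature (sketched here with illustrative notation that is our own, not the article's) starts from a set \(\mathcal{P}\) of probability measures judged plausible for an adverse event \(A\) and takes the lower and upper probabilities to be the envelopes of that set:

\[ \underline{P}(A) \;=\; \inf_{P \in \mathcal{P}} P(A), \qquad \overline{P}(A) \;=\; \sup_{P \in \mathcal{P}} P(A). \]

A decision can then be required to be acceptable across the whole interval \([\underline{P}(A), \overline{P}(A)]\) rather than under a single point estimate, which is one way of giving formal weight to divergent estimates of the risk.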
A further implication, associated especially with Rawlsian theories, is that we must be concerned not only with the total burden of risk but with its distribution. In particular, the interests of underprivileged groups require consideration. A maximin principle may be applicable, under which we are required to maximize the interests of those who are currently least well off.
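One illustrative formalization of such a maximin rule (the notation is ours and assumes a finite set of design options and identifiable affected groups): writing \(u_g(a)\) for the prospects of group \(g\) under option \(a\), the rule selects

\[ a^{*} \;\in\; \arg\max_{a \in A} \; \min_{g \in G} u_g(a), \]

that is, the option under which the worst-off group fares best, rather than the option that minimizes the group-weighted expectation value of harm.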
In this article, the limitations of the engineering information that can be supplied to political decision-makers are explored against well-established ethical theory, especially theories associated with consequentialism and deontology.