Texas Tech University.

Legal Issues and Bioterrorism Conference

Keynote Address

Legal Issues in Bioterrorism: Second Annual Law and Science Research Symposium, February 12, 2003.

Presented by William M. Marcy, PhD, PE
Texas Tech University

Looking Where the Light is Better

A man leaving a bar encounters another man circling a light pole, staring intently at the sidewalk. The first man asks the second what he is doing. He replies that he has lost his contact lens and is trying to find it. The first man asks where he thinks he lost it. The second man replies that it was down the alley. The first man asks why, then, he is looking for his contact lens under the light pole. The second man answers, "Because the light is better."

Anyone who has gone through an airport since 9-11 has seen the best example I have found of "looking where the light is better": screening passengers' baggage in the hope of finding terrorists. Airport security is a problem of complexity and robust design. The air transport system is an extremely complex system, and its number of failure modes is extremely large.

Perspectives on Complex Systems and the Design of Robust Systems

It is axiomatic that all complex systems will eventually fail given enough time in service. This applies to every category of system: an organization, an economic system, a machine, a legal system, a stock exchange, a system of policies, and so on. The more complex the system, the more likely it is to fail in a given time frame. Robust design is a set of principles that, when applied to the design of any system, will mitigate or limit the consequences of the failures that are bound to occur. The idea underlying robust design is to build "fail-safe" systems, which are characterized by intentional redundancy and which limit or prevent the propagation of failures.
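The two principles named above can be made concrete in a few lines of code. The following is a minimal sketch, not any particular real system: the sensor names and the voting rule are invented for illustration. It shows intentional redundancy (three sensors instead of one) and failure containment (a faulty sensor raises an exception that is caught at the boundary and cannot propagate), with a fail-safe default when no trustworthy quorum exists.

```python
# Illustrative sketch of redundancy + failure containment (all names invented).

def read_sensors(sensors):
    """Poll redundant sensors; a faulty sensor's failure is contained here."""
    readings = []
    for sensor in sensors:
        try:
            readings.append(sensor())
        except Exception:
            continue  # contain the failure; do not let it propagate upward
    return readings

def voted_value(readings, tolerance=0.5):
    """Fail-safe vote: return None (the safe state) unless a majority agrees."""
    if len(readings) < 2:
        return None  # fail safe: refuse to act on a single unconfirmed source
    readings = sorted(readings)
    median = readings[len(readings) // 2]
    agreeing = [r for r in readings if abs(r - median) <= tolerance]
    if len(agreeing) * 2 <= len(readings):
        return None  # no majority: fall back to the safe state
    return sum(agreeing) / len(agreeing)
```

Even when one sensor fails outright, the voter still produces an answer from the survivors; when too few sources agree, it refuses to act rather than acting on bad data.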

Answering Questions that haven't been asked

Engineers are taught how to design robust systems and how to understand failure modes and failure analysis. For many years I served as a security officer with the Central Intelligence Agency. For much of my career there, my assignment was to challenge the designs of the security infrastructure in order to characterize failure modes and to mitigate the consequences of failures in personnel, physical, technical, computer, and telecommunications security. This included intentional attacks on one or more security domains.

Security systems are complex systems. What makes the design problem fundamentally intractable is that security consists of a set of overlapping domains, each with its own peculiar terminology and subject matter. To make matters worse, each domain is further broken down into sub-domains, ad infinitum. In mathematical terms, the resulting information structure is an n-dimensional manifold.

Exploring or searching such n-dimensional structures leads to a class of problems known as NP-complete. In practical terms, this means that no known algorithm can guarantee an exact answer in a useful amount of time: the search space grows explosively with problem size, so even enormously fast computers cannot search it exhaustively. Chess-playing computers face the same problem. What we must find or invent are heuristics that give us an approximation to an optimal solution in a time short enough to be useful.
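The trade-off described above can be illustrated with a toy example; the problem and the code are my own illustration, not anything from the talk. Exhaustively searching all orderings of n points costs on the order of n! steps, while a greedy "nearest neighbour first" heuristic costs roughly n² steps and usually returns a tour that is good enough to act on.

```python
# Toy illustration: heuristic search instead of exhaustive search.
import math

def tour_length(points, order):
    """Total length of a closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def greedy_tour(points):
    """Heuristic: always visit the nearest unvisited point next (O(n^2))."""
    unvisited = set(range(1, len(points)))
    order = [0]
    while unvisited:
        last = order[-1]
        nearest = min(unvisited, key=lambda j: math.dist(points[last], points[j]))
        order.append(nearest)
        unvisited.remove(nearest)
    return order
```

The greedy answer is not guaranteed to be optimal, but it arrives while the answer is still useful, which is exactly the bargain the text describes.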

The Design of Robust Systems

No small part of the problem lies in the fact that complex systems fail catastrophically even when no attack is made to disrupt them. The Challenger and Columbia accidents are representative failures of complex systems. Due to weight limitations, the space shuttle cannot employ the depth of robust design that air transport aircraft use. Some elements of the shuttle system, such as the heat-protective tile system, are not fail-safe: if the tiles are damaged in orbit, there is no capability to repair them, and if they are sufficiently damaged, catastrophic failure on reentry is certain.

We should adopt "systems thinking" as we approach our research. There are a number of computer-based tools we could bring to bear, including causal modeling and Petri nets, to model non-stationary, probabilistic, event-driven processes.

We can test the robustness of our designs through unit testing and simulation. Object-oriented programming, for example, is a robust design approach to software: by encapsulating components, it helps ensure that one failure does not result in a cascade of failures.
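As a hypothetical sketch of that point, the small class below (invented here, not from the talk) encapsulates an unreliable dependency and is written to fail safe: any internal fault leaves the valve closed rather than propagating upward. The unit test then asserts that safety property directly, which is exactly the kind of robustness testing the paragraph describes.

```python
# Hypothetical example: an encapsulated, fail-safe component plus a unit
# test that probes its failure mode (all names invented for illustration).
import unittest

class Valve:
    def __init__(self, actuator):
        self._actuator = actuator   # injected dependency; may be faulty
        self.open = False

    def request_open(self):
        try:
            self._actuator()
            self.open = True
        except Exception:
            self.open = False       # fail safe: any fault leaves the valve closed

class FailSafeTest(unittest.TestCase):
    def test_actuator_fault_leaves_valve_closed(self):
        def faulty_actuator():
            raise RuntimeError("actuator stuck")
        valve = Valve(faulty_actuator)
        valve.request_open()        # must not raise, must not open
        self.assertFalse(valve.open)
```

Because the fault is contained inside the object, a test can inject a failure and verify the consequence stays bounded, instead of discovering the cascade in service.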

The Contribution of the Center for BioDefense, Law and Public Policy

This center has the potential to make a significant contribution to our country by developing and extending the principles of robust system design and applying them to proposed changes in technology, law, and public policy. These principles are rarely taught outside engineering schools, yet they do not require knowledge of advanced mathematics or computer simulation. Nevertheless, there may be opportunities to use computer simulation to perform cause-and-effect modeling of complex systems and achieve approximations to optimal solutions.


Outcomes are what we expect; consequences are what we get. Our obligation is to make sure that the consequences of the changes being made and proposed in technology, law, and public policy do not destroy our intended outcomes.