Monday, October 12, 2009

The Fog of Human Nature

Upon reviewing Robert McNamara's Eleven Lessons, it seemed to me as if they could easily be summed up in one unifying rule: be prepared to be wrong. As he states himself towards the end of the film, war is an entity far too complex for humans to fully comprehend. The variables that go into it are astronomical in quantity, and therefore failure in one way or another is simply unavoidable. McNamara's rules help to show us the different ways in which we can be wrong, and these of course extend beyond the realm of warfare. His first lesson, empathize with your enemy, demonstrates how we can err when we fail to acknowledge perspectives other than our own, while his second lesson, rationality will not save us, shows us that even logical, cool-headed thinking-- the kind of thinking considered to be the most fool-proof-- is not without its faults. Lessons 3 (there's something beyond one's self), 5 (proportionality should be a guideline in war), and 8 (be prepared to reexamine your reasoning) are all similar in some way to the first: we come closest to success only when we look beyond ourselves, compare ourselves to other things, and question our own reasoning. Lessons 4 (maximize efficiency) and 6 (get the data) at first appear to contradict the second lesson. After all, being efficient and having data are ways of being rational, but if rationality will not save us, why bother? To answer that, one must remember that McNamara said rationality cannot save us; he did not say it cannot help us. Failure may be inevitable, but it doesn't have to be common, and efficiency and data can help to minimize it, even if they can't rid humans of it. Perhaps the reason we fail is attributable not to the methods we use, but to those who use the methods, i.e. humans. As lesson 11 clearly states, you cannot change human nature, and as Alexander Pope once said, "to err is human". 
One reason we may fail so often is explained by McNamara's seventh rule (belief and seeing are both often wrong). This again goes back to the problem of limited perspective: we can only see the whole picture through a narrowed scope, and under these circumstances, what we see may be only a part of the truth. The rest of the truth often does not factor into what we believe, so even if we are efficient and have a lot of data on a certain subject, that is insufficient when our beliefs are inaccurate. In his ninth rule (in order to do good, you may have to engage in evil), McNamara speaks of moral and ethical failure. If the goal of a society is to be peaceful, then to engage in evil would be considered a failure for that society. However, like failure, war is also inevitable, and engaging in evil now will hopefully help to prevent its existence later on. Finally, there is McNamara's more affirming tenth rule: never say never. Considering this instruction, I should revise the unifying rule I presented earlier; instead of "be prepared to be wrong", McNamara would have said, "be prepared to be wrong, but don't let that stop you". Even though we humans are fallible, this is not an excuse to give up on any endeavor, for failure is a very instructive tool if we learn from it.

1 comment:

  1. I really like your detailed analysis of the 11 'lessons', Eric -- particularly your suggestion that 'Lessons 4 (maximize efficiency) and 6 (get the data) at first appear to contradict the second lesson'. I confess that I really don't know what to make of the 'lessons', particularly since many of them don't seem particularly enlightening, and some *do* seem to conflict with others. Numbers 4 & 6, in fact, really seem to belong to a whole different set of 'lessons'.