Whenever someone says the word "analysis" I can't help but think, however briefly, of Richard Feynman. I first discovered Feynman by accident, when I was staying in a hotel in Paris and someone had left "Tuva or Bust!" on a bookshelf as a present for a future guest. Feynman doesn't feature directly in the Tuva book, but the way Ralph Leighton, its author, wrote about him made me want to learn more. Since then, I've read pretty much everything published by or about him, because he's just one hell of a fascinating character. To me, Feynman can be summed up in two phrases: "intellectually fearless" and "fun." Hidden under the delightful, wry wit of a true raconteur is a tremendously powerful analytic mind. If you're interested in learning more about Feynman, read "Surely You're Joking, Mr. Feynman!" (adventures of a curious character) and James Gleick's biography, "Genius."
This article isn't about Feynman. This article is about great articles. I think it's extremely rare that you can read a bunch of words written by another person and actually end up smarter than you were when you began. These articles will make you smarter. By "smart" I don't necessarily mean more intelligent - but it's my experience that exposing people to brilliant pieces of thinking and analysis can help give them an idea how it's done. Analysis, of course, is a creative process - you're looking at a bunch of stuff and trying to pare it down to its essentials; then you're explaining those essentials to someone else. So the analyst must simultaneously understand "the big picture" and "the little picture," as well as assessing the right level of detail in which to couch their explanation. Great analysis, and great explanations, manage to be clear and comprehensible to a person with no experience of the matter at hand, while remaining relevant and perhaps even captivating to the expert. When I've had the fortune to discuss Feynman's article or Spinney's testimony with experts in the respective fields, they say things like, "that is the most succinct and accurate description of the situation I've ever seen," or "he hit the nail right on the head." When you consider that someone can write an eleven-page paper that dissects the failure trajectory of a space programme and have aerospace experts say, "that's exactly right!" you know you're dealing with a document that contains a tremendous amount of information that has been processed and sorted and understood.
One thing I've noticed over the years is that sometimes you run into statements of the truth so fundamental that they sound utterly obvious. As a child, interested in wargaming and military history, I read Sun Tzu on warfare and thought, "this is just a bunch of obvious platitudes." And, to the degree to which I understood what I was reading, I was right. "If you don't give the enemy a way out, they may choose to fight to the death, and there's nothing on earth more dangerous than someone who's decided to take you to hell with them" (my re-interpretation of Sun Tzu's observation about death ground) seems pretty obvious. At least it does, at first. But, as I grew older and somewhat more experienced, I realized that Sun Tzu had not merely uttered an obvious platitude; he had discovered one of the "laws of physics" of warfare, and explained it. Newton, after all, didn't invent the laws of motion - he "merely" explained them. Simple things, explained by great analysts, are profound. Profound things, explained by great analysts, are simple. On deeper reflection, you realize that Sun Tzu is very, very profound.
I think that anyone who's interested in great thinking should read these two articles:
Personal observations on the reliability of the Shuttle, by R.P. Feynman (cached version) - When the space shuttle Challenger blew apart on launch, a government investigation was launched to determine what had happened, and (if you're a cynic) to whitewash the disaster or (if you're not a cynic) to try to prevent it from happening again. An expert committee was empaneled to investigate the disaster and to write recommendations. The story of the inner workings of the expert committee was not told until recently, in a book of Feynman material that came out early in 2006; but, at the time, the American people were treated to a series of boring hearings in which NASA administrators hemmed and hawed and said a lot of "maybe" and "possibly," and Richard Feynman made the whole thing real for millions of people by demonstrating, with a piece of O-ring, a clamp, and a glass of ice-water, that there was no "maybe" about the causes of the disaster. It was typical of Feynman that he made his demonstration appear spontaneous (it wasn't), and it was equally typical of Feynman that he published his own report as a completely separate document from the committee's findings. Feynman's "Observations on the reliability of the shuttle" should be required reading for anyone who works in an engineering field, because it is the very juice squeezed from pearls of wisdom. Clearly and apparently effortlessly, Feynman weaves together an explanation of how testing, safety, process control, and engineering discipline fit together. Think about that for a second. Feynman writes a reference masterpiece in eleven pages.(1)
When I read Feynman's "Observations" I cannot help but think about computer security and how virtually everything Feynman has to say about how not to do engineering, and how to achieve safety, is right there in black and white. In fact, every time I re-read "Observations" I lose my temper, because I am confronted with a clear, conclusive argument that, in computing, we continue to do things that are really, really stupid. It is ironic that in "Observations" the only kudos Feynman has to give are to the software development practices NASA used for the shuttle. Compare his description of that process with how 100% of commercial code is written today - you may find it instructive.
When the Columbia exploded on re-entry (things moving 12,000mph don't "break up", they explode) I could hear the ghost of Richard Feynman yelling "dang it!" from wherever physicists go when they die. In "Observations" Feynman is careful to point out, in his discussion of "safety margin," that if hot gases were never supposed to jet past and damage an O-ring, then there is no such thing as a "safe" flight in which that happens to any degree. You could take "Observations" and staple a post-it note to the front of it reading, "...and if the design doesn't say that heat resistant tiles are supposed to fall off, then there is no 'safety margin' that justifies flying the shuttle if they do."
I have unashamedly paraphrased Feynman over and over again in the last few years, with respect to computer security: "If the design of your operating system doesn't say that it's supposed to be hackable, then it shouldn't be, and it's not safe for use in an environment where there are hackers." I wish I could boil that down to a Feynman-esque quip, or a Sun Tzu-style phrase - but it is no embarrassment to admit that I am not in that league of thinkers.
Statement by Franklin C. Spinney (cached version) - I first ran across Franklin "Chuck" Spinney when he played a walk-on part in Robert Coram's "Boyd," and somehow - thank goodness - he stuck in my memory as a possible point of view to investigate when I started researching my book on homeland security.(2) Spinney was a Pentagon analyst who became the focal point of the military reform movement during the 1980s, and torched a lucrative career by telling truths with which big government was not prepared to cope. By doing so, he traded his chance at being a successful cog in the machine for greatness.
The military reform movement is pretty much dead, and the US military these days is resting on the laurels of some of the most successful and lopsided military victories in human history. There is no doubt about it: the stuff works. Yet Spinney believes that the engine that powers it all - technology procurement, research, and development - is completely off the rails and well into a "death spiral." In his statement, from his 2002 congressional testimony, he offers a brilliant and incisive analysis of the factors that are causing the US to constantly spend more and constantly get less.
(A death spiral: "spend more get less" is invisible if you are only looking at the cost)
Spinney's analysis is brilliant. Not only does he explain how the procurement system is broken, he quantifies it, in a series of charts that are complex but expressive. He concludes with the observation that the US spends as much on defense as the top 20 other nations on earth, and:
"According to the IISS, the combined defense budgets of the three nations cited most often as threats - the so-called 'axis of evil' made up of Iran, Iraq, and N. Korea - are less than $12 billion, or only about three per cent of the US proposed defense budget for 2003."
Suddenly, the ease with which we crushed Iraq is a bit more understandable. You can have an incredibly bloated and inefficient war machine and still be the biggest dog on the planet! I must emphasize one thing: I am NOT criticizing our men and women in uniform; I believe that, in fact, they are being betrayed by venal cowards who'd rather sit safely at home and scrabble after a quick buck than help build their country's defenses. Remember, Spinney wrote that in 2002 - do you think our military is in condition to take on Iran and North Korea together, right now? Why not? Their combined defense budget is literally a tiny percentage of ours! Could it be that our spending is not scaling correctly? Spinney explains it.
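Spinney's "about three per cent" figure is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch - the $12 billion figure comes from the testimony quoted above, while the roughly $396 billion figure for the proposed FY2003 US defense budget is my assumption, chosen only to be consistent with the quote:

```python
# Sanity-checking the "about three per cent" claim from Spinney's testimony.
# $12B is the combined Iran/Iraq/N. Korea figure quoted above; ~$396B for the
# proposed FY2003 US defense budget is an assumption, not a number from the
# testimony itself.
axis_budgets = 12e9            # combined "axis of evil" defense budgets
us_proposed_fy2003 = 396e9     # assumed proposed US defense budget, FY2003

ratio_pct = axis_budgets / us_proposed_fy2003 * 100
print(f"combined 'axis of evil' budgets: {ratio_pct:.1f}% of the US budget")
```

Under those assumptions the ratio works out to roughly three per cent, matching the quote.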
The "death spiral" slide above was from a talk I gave on security in internet firewalls, about how a focus on performance over security produces negative synergy. Spinney's testimony should be required reading for IT managers. The only negative thing I can say about it is that Spinney's writing is a little bit more "DOD-ese," and lacks Feynman's crystal-clear, jargon-free delivery or the poetic concision of Sun Tzu.
Please read these articles! I believe that they will make you a better person, or - at least - a better thinker.
4:10am, Goethe-Bar, Frankfurt airport, hung over and wreathed in cigarette smoke, May 11, 2006
(1) A recently published book of Feynman stories includes a chapter-long description of his participation in the Challenger commission and the Washington mickey-mouse that he endured. It offers a much-needed explanation of how Feynman's "Observations" came to be published separately from the committee report, and gives a lot of background on the way Washington committees work. I highly recommend it; included with the book is an audio CD of Feynman telling his "Los Alamos From Below" stories, about safe-cracking and opening the most valuable safe on the planet by guessing the combination. It's interesting because Feynman is, basically, showing us that we persistently refuse to think clearly about security. If you do read this book, when you get to the section about the committee and how it worked, ask yourself, "What was the 9/11 committee's process probably like?" Feynman's eleven pages were obviously not as easy to write as he makes them appear; you're looking at the culmination of several months of focused effort by one of the 20th century's most scintillating intellects.
(2) Spinney's experience was not directly relevant, but it was crucial in forming my understanding of the dysfunction between the FBI, CIA, and other incredibly well-funded federal agencies in the intelligence community. When I started researching the topic, I kept in mind the horrible possibility that the "DOD experience" closely paralleled that of the CIA and FBI. Without an inside view that would allow me to gather defensible data, I am left with the impression that the intelligence community's problems nearly exactly mirror the DOD's: bloat, stupidity, foolish use of expensive toys, poor programme management, failure of leadership, and truly gargantuan budgetary excesses. When you read Spinney, think about the CIA, and the Iraq WMD fiasco, and suddenly you can explain how an organization with a $30 billion budget can't find its own ass.