http://www.infoworld.com/article/07/04/27/18OPsecadvise_1.html

By Roger A. Grimes
April 27, 2007

I recently listened to a wonderful science program on National Public Radio discussing the book Better: A Surgeon's Notes on Performance [1] with its author, Dr. Atul Gawande. The book examines why some practitioners excel while others merely meet the standard or perform poorly. Its hypothesis and conclusions can be applied universally in business and even in life. It was easy for me to draw connections to my own experiences and relate the lessons to computer security. Here are some of the excerpts and the corollaries I drew (I apologize to the author in advance for any inaccuracies or misinterpretations):

The number one indicator of above-average medical care was often simply consistency. In the story related on NPR, the author described how one doctor achieved significantly longer survival for his cystic fibrosis patients (47 years) than the national average (33 years). The secret? Consistency. The doctor determined that many patients simply were not taking the recommended medicines consistently and on time. Once he realized this, he focused on making his patients more consistent, especially stressing that they should keep taking the medicine during the majority of the time when they felt well. The outcome was patients who lived significantly longer.

How many of us work in computer security environments where basic security recommendations are not applied consistently? I think it is nearly impossible to find a company that consistently and universally applies basic security tenets. So we have inconsistencies, cracks in the system, and bad things are allowed to occur. The very human habit of tolerating inconsistency as the norm leads to below-average outcomes. Taking a personal and institutional interest in applying basic security principles consistently will mitigate more risk and lead to a more secure environment.

Another conclusion was that improving the existing system often provides better outcomes than simply adopting new technology. The book describes how the U.S. Army worked to improve the survival rate of soldiers wounded in Iraq. In earlier conflicts (World War II and Vietnam, for example), wounded soldiers died about 25 percent of the time. The Army spent half a billion dollars developing new medical aids, technologies, and treatments, but found that improving the basics -- and applying them consistently -- provided better outcomes. Ensuring that soldiers always wore their body armor, instead of removing it when it was hot, kept more of them alive. Moving medical tents closer to the battlefield saved more lives. Focusing on better meeting the "golden hour" rule saved even more. The Army even experimented with going against standard medical practice in some instances -- for example, giving field personnel more leeway to make medical decisions and to apply treatment without waiting for absolute test confirmation -- and in doing so saved still more lives. The result is that today only about 10 percent of soldiers die from their battlefield wounds, even in a conflict where the average injury is much more severe. This is not to say that new medical inventions and techniques don't help decrease the death rate; I'm sure they do. The key takeaway is that much of the success came from better and more consistent application of existing systems.
If you're a security manager, focus more on the basics (e.g., patch management, password policy, malware blocking) and less on the latest and greatest artificial-intelligence anti-malware product of the day. Truly secure environments are consistently secure and have the basics well covered.

Pick good metrics. "Metrics" is a word often bandied about by managers seeking ways to report meaningful, measurable statistics to upper management. Metrics are a good thing, but many times the metrics chosen take more time to collect than the value they provide. Security becomes more about collecting the right metrics and moving them in the perceived right direction than about actually improving security. The book discusses APGAR scores [2] and how they have significantly improved the survival of newborn babies. The APGAR score measures five signs in a newborn (skin color, breathing effort, and so on) and assigns each a score of 0 to 2 based on the observed result. Babies with low APGAR scores are treated as critical cases, and additional treatment is brought to bear quickly. Having spent five years as an EMT paramedic, I can tell you that an APGAR score takes only seconds to perform and becomes second nature. It has been credited with saving the lives of millions of babies.

Do you have good metrics? Evaluate the current list of metrics and reports that you collect on a daily, weekly, and monthly basis. Does anyone read them? If you want to find out who does, put a few big, bogus outliers in a report and see how long it takes anyone to notice. If you can, analyze the metrics you do collect and decide which ones deliver the best bang for the buck (a rough, APGAR-style scoring sketch in code appears after the references below).

Becoming a better computer security worker or manager means taking a step back and analyzing the overall system. Improved processes and more consistent application of current rules will often pay higher dividends than any new technology or product.

Roger A. Grimes is a contributing editor of the InfoWorld Test Center.

[1] http://www.amazon.com/exec/obidos/ASIN/0805082115/infoworldcom-20
[2] http://en.wikipedia.org/wiki/Apgar_score
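To make the APGAR analogy concrete in security terms, here is a minimal Python sketch of how a handful of cheap, consistently collected checks might be rolled up into a single 0-10 "security vitals" score per host. Everything in it -- the five checks, the field names, the thresholds, and the "critical" cutoff -- is an illustrative assumption, not anything prescribed in the article; substitute whatever basics your own environment already tracks.

    # Illustrative only: an APGAR-style roll-up of five basic security checks.
    # The checks, field names, thresholds, and the "critical" cutoff are all
    # assumptions made for this sketch, not anything specified in the article.

    def score_patch_age(days_since_last_patch):
        # 2 = patched recently, 1 = getting stale, 0 = dangerously out of date
        if days_since_last_patch <= 30:
            return 2
        if days_since_last_patch <= 90:
            return 1
        return 0

    def score_pass_fail(ok):
        # Simple pass/fail controls earn 2 points if in place, 0 if not.
        return 2 if ok else 0

    def security_vitals(host):
        # Score one host record (a plain dict); return (total, per-check detail).
        checks = {
            "patching":        score_patch_age(host["days_since_last_patch"]),
            "antimalware":     score_pass_fail(host["av_signatures_current"]),
            "password_policy": score_pass_fail(host["password_policy_enforced"]),
            "admin_accounts":  2 if host["local_admin_count"] <= 2
                               else 1 if host["local_admin_count"] <= 5 else 0,
            "backups":         score_pass_fail(host["backup_verified_this_week"]),
        }
        return sum(checks.values()), checks

    if __name__ == "__main__":
        # Hypothetical inventory records; in practice these fields would come
        # from the patch-management and anti-malware consoles you already run.
        fleet = [
            {"name": "web01", "days_since_last_patch": 12,
             "av_signatures_current": True, "password_policy_enforced": True,
             "local_admin_count": 2, "backup_verified_this_week": True},
            {"name": "legacy-db", "days_since_last_patch": 210,
             "av_signatures_current": False, "password_policy_enforced": False,
             "local_admin_count": 9, "backup_verified_this_week": False},
        ]
        for host in fleet:
            total, detail = security_vitals(host)
            flag = "CRITICAL" if total <= 4 else "watch" if total <= 7 else "ok"
            print("%-10s score=%2d/10 [%s] %s" % (host["name"], total, flag, detail))

As with the APGAR score itself, the value is not in the sophistication of the formula but in the fact that it is cheap to collect and applied the same way every time; a low total simply tells you which machine to look at first.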