Building a better mousetrap is all well and good, but first you'd better make sure there are some mice to catch.
Over the past decade we’ve built up quite an anti-terrorism infrastructure: new governmental agencies, capabilities and personnel at the federal, state and local levels, with big budgets and new opportunities to make careers. But what happens if the threat isn’t sufficient to justify all that expense and effort?
Nassim Taleb writes about that in an excerpt from his latest book, ‘Antifragile’. He discusses the difference between ‘noise’ and ‘signal’. He doesn’t talk specifically about ‘homeland security’ (or any form of focused intelligence) but the relevance is pretty easy to spot.
He uses, as an example, personal physicians who have to justify their salaries. After all, how long can a physician expect to keep drawing a paycheck if the recommended course of action is ‘Just keep doing what you’re doing. You’re perfectly healthy.’
In business and economic decision-making, data causes severe side-effects – data is now plentiful thanks to connectivity; and the share of spuriousness in the data increases as one gets more immersed into it. A not well discussed property of data: it is toxic in large quantities – even in moderate quantities.
In the intelligence arena, the way we try to process data (if Taleb is correct) does nothing in particular to solve the problem. And frequently in the intelligence field, there’s little desire to look at anything but the immediate noise.
The more frequently you look at data, the more noise you are disproportionally likely to get (rather than the valuable part called the signal); hence the higher the noise to signal ratio… Say you look at information on a yearly basis. Assume further that for what you are observing, at the yearly frequency the ratio of signal to noise is about one to one… it means that about half of changes are real improvements or degradations, the other half come from randomness… But if you look at the very same data on a daily basis, the composition would change to 95% noise, 5% signal.
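Taleb’s yearly-versus-daily arithmetic checks out. A minimal sketch, assuming the standard model behind his numbers (the signal, or drift, accumulates linearly with the observation window, while the noise, or volatility, grows only with its square root); the function name here is illustrative, not from the book:

```python
import math

def signal_share(samples_per_year, yearly_signal_to_noise=1.0):
    """Fraction of each observed change attributable to signal.

    Assumes drift scales with time and volatility with sqrt(time),
    so shorter observation windows carry proportionally less signal.
    """
    signal = yearly_signal_to_noise / samples_per_year  # drift per observation
    noise = 1.0 / math.sqrt(samples_per_year)           # volatility per observation
    return signal / (signal + noise)

print(f"yearly: {signal_share(1):.0%} signal")    # 50% signal, 50% noise
print(f"daily:  {signal_share(365):.0%} signal")  # ~5% signal, ~95% noise
```

Sampling the same process 365 times more often doesn’t give you 365 times the insight; it gives you observations that are about 95% randomness.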
So, when Taleb, a bit later on, makes the case that ‘anyone who listens to news (except when very, very significant events take place) is one step below sucker’, he’s talking to virtually the entire homeland security community (and much of the law enforcement one as well). That’s why after-work humor is laced with terms like ‘media-led policing’ (or grant-led policing). As a headline scrolls across that flat-screen TV, every agency with CIA-envy decides to scramble personnel to ‘run it to ground’. The result is, more often than not, a regurgitation of what was just reported on the news and, even worse, since everyone has been given the same command, numerous vapid reports ping around the ether for days.
Noise begets noise.
In the intelligence field I remain convinced this is due to the fact that agencies continue to do a very poor job of clearly defining how they intend to convert their mission into practical action. Saying you’re going to ‘detect, deter and defeat’ terrorists is great, but if it’s not followed up with a (detailed, clear and realistic) ‘how’, it’s worthless.
Taleb argues that the answer is the rationing of information. That’s a strange recommendation in this day and age, and one I’m still trying to get my head around, but it does make sense. In Afghanistan (yes, another war story) I encountered this frequently, finding myself one of a multitude of competing actors (‘Huh, the Supply Officer has an interesting theory on tribal relations? Oh, he just read an article in National Geographic? Great!’) trying to get my analysis to the appropriate customer.
And yes, I know how elitist that reads. But in the intelligence field not every voice is equally valuable. Not every analytical opinion is equally worthy of consideration. And intelligence consumers can’t be expected to read and evaluate every piece of analysis.