Recently I was talking with some managers of a poorly run intelligence shop, and they were bemoaning the lack of ‘a sense of urgency’ among their analysts. Why weren’t more ‘products’ being disseminated? To bolster their argument they pointed to another group of analysts who were producing a much higher volume of documents.
Key to the discussion (but completely missed by my interlocutors) was that the two groups were charged with doing very different types of work. The latter were charged with compiling and collating data. A lot of data, to be sure, but data. Little to no analysis or domain knowledge was needed. Provided you’re taught what information is required and what databases are available, just about anyone should be able to complete that task.
The other group was charged with understanding a large number of subjects, identifying how events might impact those subjects, and understanding the context of changes within the operating environment.
I suggested that with a mission like that, perhaps an emphasis should be placed on developing subject matter expertise rather than counting the number of products with an agency seal on them.
Then I began thinking about how such expertise is developed. In law enforcement (which is either where state and local analysts work or heavily influences the agencies they work in), knowledge is usually transmitted through an apprentice-like system. You work with the seasoned veteran for years and learn the ropes until, one day, you are teaching the new guy.
That certainly has value, but is it appropriate when we’re talking about intelligence analysis? How does a junior analyst become a subject matter expert in the workings of La Cosa Nostra or al-Qaeda? Is it a wise use of time and resources to hope they pick it up around the office?
This is, I suspect, a fairly widespread problem: agencies that rushed to create intelligence shops lo these many years since 9/11 never really thought through how to run them or how to measure their success or failure. So, more often than not, we’re left with wholly unsatisfying, yet easily measurable, metrics like how many products were disseminated this week or ‘Did we publish before our rivals?’**
I haven’t heard of any agencies (again, at the sub-federal level) that have instituted a program for analysts to develop subject matter expertise, but while it might require some up-front work, it really shouldn’t be difficult to do. There are plenty of training programs, open source publications, and accessible experts available, so a decent analyst should be able to draft (and a good supervisor/mentor should be able to flesh out) a plan to build expertise while still addressing the needs of the shop to which s/he is assigned.
I, however, won’t hold my breath for that to happen.
*I would lump a host of others in with politicians here but you get the idea
**The notion of ‘scooping’ other agencies in the production of (usually highly banal) ‘intelligence products’ is another symptom of a system that is adrift, especially given that the 24-hour news stations and the interwebs almost always ‘scoop’ even the quickest products out of these shops.