Analytical training

I just spent a week teaching analytical techniques to a group of analysts (the same group that I wrote about here). It was an energizing experience, and I was impressed by the group’s engagement and motivation. I’ve often written about how much I enjoy teaching soldiers, and this group gave the troops a good run for their money. In fact, I might even go so far as to say they edged them out, since in the military I outrank most of my students and have more tools at my disposal to compel participation. I know I write cynically about the field more often than not, but meeting people like this refills my well of optimism.

So, some thoughts based upon discussions, observations and repressed memories:

  • Teaching structured analytical techniques is one thing; getting analysts to actually use them is another thing entirely.  That’s because:
    • they’re unfamiliar (remember, there’s little to no standardized training required in the field) and when under stress of some sort, everyone will fall back on the familiar.
    • they aren’t hard-wired into the analytical process of most agencies.  As long as they’re perceived as something extra that’s added on to ‘real’ analysis, we’re going to have an uphill battle.  I think this can be addressed through a combination of formal entry-level training, training for personnel who are going to supervise analysts and (maybe most importantly) requiring these processes be done at the agency level.  In short, products should not be considered complete without the use of some structured techniques.
  • There continues to be a tension between what analysts can do and what they’re frequently called upon to do.  So, do you make products that customers (many of whom aren’t familiar with intelligence and its uses) think they want, or do you try to force-feed them products you think they need?  (And yes, I understand this is a false choice and there are various options that straddle these two extremes, but this is my blog and if I want to present you with an either/or choice I’m going to do it.)  I probably need to get into a bit more detail about the second option, since it sounds a bit pretentious (Oh, you know what’s best for your customer?  Don’t you think they might know what they need?).  I can’t emphasize enough that the field of intelligence analysis within law enforcement (even nearly a decade after 9/11) is still new, and we haven’t done a particularly good job of explaining what success looks like in terms of its implementation.  Ultimately, I fall in the second camp.  After all, if you’ve got a kid and they ask for candy for dinner, do you give it to them?  It might be best to explain the essentials of nutrition to them and get them to understand why they should eat healthy food, but if they don’t buy it and threaten a tantrum you still don’t give in.  You can offer them some dessert if they finish their veggies, but they still have to get through the veggies.  Unfortunately, in my analogy, your kid is also your boss at work, and if you don’t give him candy you’ll be fired…I’m still working on resolving this.  Give me a couple of days…
  • Many times, what passes for product is (as they say) ‘all sizzle and no steak’.  Pretty pictures, fancy graphs, tables of numbers, or narratives that aren’t particularly rigorous or don’t describe what they claim to can be interpreted as ‘impressive’.  I’ve termed one manifestation of this the ‘heft test’, where the size of the product becomes the metric used to assess its quality (hint: it helps to have color graphics to catch the eye while the customer flips through it).
  • Scenario-based training is the way to go.  In fact, for analysts, it’d be nice to see periodic (annual?) scenario ‘war-games’ to refine and reinforce processes and skills, as well as to identify training needs.  Does anyone do this?
  • I’ve been thinking about riffing off the idea of ‘Google time’, where engineers get to spend 20% of their time on self-generated projects.  After seeing about 20 analysts produce a wide-ranging and fascinating set of such products as part of this course, I think this idea has some merit and is worthy of some testing.  One function of intelligence analysis should be (in military parlance) to help the commander shape future operations.  In the law enforcement realm, that means analysts should attempt to describe what the operating environment will look like in the future.  What will the threats be?  What will the challenges and opportunities be?  We simply don’t have a good process for that.  While the idea of 20% time isn’t a silver bullet for the problem, I suspect it could allow analysts to devote time to subjects they see as interesting but which might not yet be on the radar of those who allocate resources or set priorities.