Why the intelligence community needs to smarten up on system integration

If you’ve watched The Bourne Identity or its sequels, you’ll know that the U.S. Central Intelligence Agency is portrayed as being as dysfunctional as any other enterprise, but you may not have blamed its IT.

According to an article called “Langley Won’t Tell Us” in a recent issue of Foreign Policy magazine, however, a lack of effective IT integration and standardization is one of the key factors cited to explain why critical intelligence-gathering often falls short of expectations. It’s not necessarily that the CIA and associated agencies don’t get the information. They just don’t manage it properly. Consider this excerpt:

Intelligence-system designers create unique hardware platforms and software applications for each agency, and sometimes for separate elements within agencies. Because each of these platforms and applications requires hard firewalls, gaps can occur, and agencies or sections of agencies can get shut out of intelligence-sharing.

Take the case of the newest military command, Africom. To build its intelligence database, analysts and managers had to collect data from the three major commands that were previously responsible for watching the African continent. Each command had used different and incompatible data storage software, making it nearly impossible for the data to be collated. The lines are drawn even more impenetrably between the foreign intelligence services (think CIA and the Defense Intelligence Agency) and the domestic law-enforcement communities (think FBI and the Department of Homeland Security). Now, imagine being an analyst at the National Counterterrorism Center, intended to pool and analyze all that data. Senior managers might have four or five different computer hard drives at their desks in order to access upward of 20 different intranets.

Sound familiar? That these paragraphs would appear in a journal like Foreign Policy is a sign of two things: how sophisticated end-user understanding of IT has become, and how universal integration headaches now are.
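The Africom example is, at bottom, a schema-mapping problem: each command recorded the same kind of data in an incompatible shape, so nothing could be collated until every source was normalized into a shared format. Here is a minimal sketch in Python, with invented field names and date formats standing in for the incompatible stores:

```python
# Hypothetical sketch: collating records from two commands that stored
# the same intelligence data in incompatible formats. All field names
# and formats here are invented for illustration.

def normalize_command_a(record):
    # Command A stored names as "LAST, FIRST" and dates as MM/DD/YYYY.
    last, first = [p.strip() for p in record["subject"].split(",")]
    month, day, year = record["date"].split("/")
    return {"name": f"{first} {last}",
            "date": f"{year}-{month}-{day}",
            "source": "A"}

def normalize_command_b(record):
    # Command B stored names as "First Last" and dates as DD-MM-YYYY.
    day, month, year = record["when"].split("-")
    return {"name": record["who"],
            "date": f"{year}-{month}-{day}",
            "source": "B"}

def collate(records_a, records_b):
    # Once every source maps into the shared schema, cross-referencing
    # sightings of the same subject on the same date is a simple
    # dictionary grouping.
    merged = {}
    normalized = ([normalize_command_a(r) for r in records_a] +
                  [normalize_command_b(r) for r in records_b])
    for rec in normalized:
        key = (rec["name"], rec["date"])
        merged.setdefault(key, []).append(rec["source"])
    return merged
```

For example, `collate([{"subject": "DOE, JOHN", "date": "01/15/2010"}], [{"who": "JOHN DOE", "when": "15-01-2010"}])` groups both records under the key `("JOHN DOE", "2010-01-15")`, revealing that two commands held the same sighting. The hard part in practice is not the mapping code itself but getting every agency to agree on the shared schema in the middle.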

Consider too that the intelligence being gathered here is not merely intended to decide where to put merchandise on a shelf, or whether a marketing campaign was successful, or if a company will need to hire more staff. The decisions made by these agencies determine whether a terrorist plot like the Christmas Day bomber’s is thwarted, whether more troops are deployed in Afghanistan, or whether more resources are sent to deal with a natural disaster such as the one in Haiti this week.

A lot of things have changed over the last decade, and many areas of enterprise IT have improved. It is sad to say that in many cases, including high-profile ones such as this, nothing has changed at all. Incompatible systems, data silos, and poor data storage and sharing are as problematic in 2010 as they were in 2000, if not much earlier. Overcoming these issues has to be the primary goal of every IT professional in the decade to come.
