The CIA is undergoing a major transformation, and IT is playing a leading role. In Part 2 of our inside look at the agency, CIA employees describe the environment pre- and post-9/11, and the massive changes that resulted from that day’s tragic events. Like other government agencies, the CIA and its IT department were unprepared for the intense change that was to come. (See “Inside the CIA’s Extreme Technology Makeover, Part 1” to read the first part in our series.)
“We were not on the right path”
To understand just how unlikely it once seemed that CIA employees would know what a wiki is, let alone rely on the technology to support the agency’s global mission, it’s useful to know where the CIA and its IT department have been.
Since its establishment in 1947, the CIA’s mission has been to conduct clandestine operations in foreign nations: collecting critical national security information, analyzing and synthesizing the data points, and delivering intelligence to the president, military leaders and other policy makers. For most of its existence, the CIA was largely focused on spying on the Soviet Union and combating communism, with varying degrees of success.
“I came in the mid-’80s, during a period of time when the agency was very focused on big, covert actions,” recalls CIA CIO Al Tarasiuk, who spent time overseas in Africa. “That was one of my first jobs in the agency, supporting that stuff from an IT perspective. I kind of had that in my blood.”
IT operations at the time were principally housed in the Office of Communications, or “Commo.” The main method of communication was through cable messaging, which had been used since World War II and offered “command and control” security, Tarasiuk says. IT “was mainly focused on getting those HF [high frequency] circuits tuned up right so that you could pass enough of those messages,” he recalls. “They were just text-based, very simple stuff. But very important for operations.”
The fall of the Soviet Union and the tearing down of the Berlin Wall were cataclysmic events for the CIA: The enemy suddenly was not there anymore. “There was kind of a downtime when some of us sensed, where are we going as an organization?” Tarasiuk recalls. The inevitable downsizing and budget cuts soon followed. “Being in the IT world that was a part of the larger support element here, we got hit really, really hard,” he says, “down to the point where our global infrastructure was very fragile.”
Ken Westbrook, chief of business information strategy in the CIA’s intelligence directorate (the analysts), recalls a tough period that was emblematic of much of the 1990s and early 2000s. From 1996 to 2000, Westbrook was deployed to the Balkan Task Force, which was established in 1992 as an interagency group that worked in concert with Allied military forces and collected intelligence on terrorist threats, terrain and infrastructure in Bosnia.
“We were working 24 hours a day during the war in Kosovo,” Westbrook says, “and I just watched analysts struggle trying to do simple things, like trying to get access to information that they needed or trying to communicate with people.”
The CIA’s main information-handling system at the time, called CIRAS, lacked basic features, such as the ability to distinguish which documents had been read and which hadn’t, and assumed that analysts read everything sequentially and chronologically, Westbrook says. And, according to a CIA researcher, the search and networking capabilities of CIRAS were “primitive.” (CIRAS was finally replaced in 2007 by a more modern system, called Trident.)
The consequences of bad intel can be deadly. News reports from 2000 show that the U.S. bombing of the Chinese embassy in Belgrade during the Kosovo war in 1999, which killed three and injured 20, occurred, in part, because CIA officers targeted what they thought was a Yugoslav army warehouse. The data was based on outdated maps, and others failed to catch the mistake before the proposal was passed to the military.
So, for Westbrook, it was the overwhelming difficulties in accessing information, the pervasive stovepipes of data and the gulf between the abilities of the CIA’s systems and those of the private sector that got him involved with IT. “I became convinced that IT was a critical element to support analysis, and I was convinced that we were not on the right path,” he says. “We can’t get the work of analysis done without good IT.”
“Do all we can, with whatever we have on the shelves”
The CIA wasn’t alone in data-sharing and technology woes in the ’90s, even as new threats began emerging. “The intelligence community struggled throughout the 1990s and up to 9/11 to collect intelligence on and analyze the phenomenon of transnational terrorism,” notes the 9/11 Commission report. “The combination of an overwhelming number of priorities, flat budgets, an outmoded structure and bureaucratic rivalries resulted in an insufficient response to this new challenge.”
But all that changed at 8:46 a.m. on Sept. 11, 2001.
The terrorist attacks on the United States and the resultant global war on terror changed everything at the CIA, especially IT, which is called Global Communications Services. “It renewed focus in a mission,” says Tarasiuk, who was a senior manager in the IT infrastructure organization at the time of the attacks. “The global war on terror, all of the sudden, became the agenda for the agency. The sense of mission came back, and the idea of being part of the tip of the sword in the fight against all this.”
For IT, the pressure was intense. “Immediately it was: Do all we can, with whatever we have on the shelves, get our systems together, extend the infrastructure to the best we can, and find creative ways of partnering with others just to make the mission happen until we could get enough money in here to start rebuilding,” Tarasiuk says.
Infrastructure, storage, bandwidth, server, application and staffing requirements skyrocketed: Instantly, demands in those areas doubled, tripled and quadrupled. Tarasiuk contends that, due to the underfunding and downsizing, “we didn’t really have a well-organized plan” to deal with the new demand. For example, he says there simply wasn’t time to determine the best enterprise architecture strategy for the CIA’s new systems. “It kind of all happened,” Tarasiuk says, “so we just had to, not a negative term, slap things together and get them going.”
During 2001 and into 2002, a former CIA officer named Bruce Berkowitz studied how the analysts in the CIA’s Directorate of Intelligence (DI) used information technology and how they might use IT more effectively. What he found was troubling: The analysts lacked awareness of and access to new IT services that could be of critical value to their work; the CIA did not put a high priority on analysts using IT easily or creatively; and, worst of all, there was a presumption, Berkowitz wrote, “that data outside the CIA’s own network are secondary to the intelligence mission.”
Due to information-sharing security threats and a pervasive message that “technology is potentially dangerous,” technology became a “bogey-man rather than an ally” to the analysts, Berkowitz noted. The end result: “DI analysts know far less about new information technology and services than do their counterparts in the private sector and other government organizations. On average, they seem about five years or more behind.”
Not surprisingly, in 2002 the CIA asked a focus group of employees what they needed to get their jobs done. Out of everything that they could have possibly needed to be successful, Westbrook says, IT came in last. CIA spies, analysts and other staffers had worked for so long without good IT that they didn’t even know what they were missing.
Again, the CIA was not the only one in the intelligence community caught off guard by the 9/11 attacks and lacking in top-notch IT systems. Across the government, the 9/11 report declared, “there were failures of imagination, policy, capabilities and management.”
Coming in Part 3 (8/6/08): The CIA’s CIO navigates a tense line between making data visible and keeping secrets
Coming in Part 4 (8/7/08): The CIA’s efforts to use new applications and Web 2.0 technologies