It sounds like a simple concept – so simple that many government organizations believe they’ve implemented strong security systems.
But the fact is that hackers are defeating them regularly and with alarming frequency.
Information Security Officers, or ISOs, face several kinds of challenges in delivering protective solutions for government data and systems. In the past year, security thinking has evolved to a new level of insight; the drivers for IT security are numerous and diverse:
• The relentless pace of technology development;
• Budget reductions due to economic climate;
• Increased competitive espionage;
• A growing cadre of Internet-trained hackers and script kiddies;
• An explosion of high-speed Internet connectivity;
• The software industry doctrine of full disclosure of vulnerabilities;
• A continued shortage of skilled security practitioners;
• More complex business cases implemented in Web space;
• The establishment of privacy legislation, and
• An increased emphasis on corporate governance.
The list goes on, but these are the key influences driving secure systems and data.
At the turn of the millennium, the challenge was to implement firewalls to protect internal networks. The battle to incorporate firewall technologies is now being fought in the domain of home and SOHO computers. Their high-speed connections appeal to hackers, who plant Trojans on these systems to use them as mirrors or as launching pads for Distributed Denial of Service (DDoS) attacks. These compromised systems represent a significant risk to government.
As IT people discovered, firewalls were not a cure-all, and neither were Intrusion Detection Systems (IDS). More was needed – and that has turned out to be layered security.
Layered security is not a new development. At the Las Vegas BlackHat conference in 1999, Bill Cheswick, AT&T Labs Chief Scientist, likened a well-developed, layered IT security framework to real world examples, including military perimeters and the biological defences of living organisms. Layered security was a reality within well-funded critical infrastructure organizations like phone companies. Now, the IT industry is generally embracing layered security, but issues remain.
Implementing layered defences effectively requires a strict definition of the threat environment the ISO faces, and that environment is different for virtually every ISO. ISOs in the public sector in different portfolios – health, environment, defence and others – each face divergent threats, risks, costs of risk and sensitivities with respect to their core information processes and to data availability, integrity and confidentiality. However, we can begin to describe the workings of layered security, with the proviso that the relative weight of each component must be tempered by the individual risk scenario and the funding available to support it. The mission is to provide a skeletal framework for developing a winning layered-security strategy. For perspective, it is useful to review the history of IT security and develop a definition of the threat to computer security.
The threat-defining risk
The original computer operations centre of the 70s (glass house) housed mainframe computers. Security was tight, with a well-defined physical security perimeter, limiting access to the inner workings.
Computers are no longer sequestered within the walls of data centres. The walls are gone and the only real boundary is bandwidth. The Internet Protocol (IP) was originally designed for an intimate, trusted military/university network, and was subsequently hijacked by convenience and accessibility. The explosive growth of the Internet is built on a protocol that has no inherent security features and can only be classed as “not ready for prime time.” The counterculture exists today, as it always has, and its members exploit this playground anonymously, with little fear of reprisal or consequence.
Accountability simply does not exist. Instead, anonymity prevails. According to a survey conducted by McAfee, during the single week of March 3, 2003 there were 78 newly discovered viruses, bringing the cumulative total to 66,669 viruses and worms roaming “in the wild” on the Internet. If every virus had to be stamped with the name, date of birth and SIN of its author, the introduction of new viruses would halt immediately.
The weakest link
Like most physical systems, computer security suffers from the “weakest link” problem. The integrity of an information system is only as robust as its weakest component. Conceptually, this maps to the “weakest link in a chain” analogy.
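The weakest-link principle can be sketched in a few lines of Python. The layer names and scores below are hypothetical illustrations, not a real assessment methodology: the point is simply that the effective assurance of a layered system is its minimum, not its average.

```python
def overall_assurance(layer_scores):
    """Effective assurance of a layered system: the minimum score, not the average."""
    return min(layer_scores.values())

# Hypothetical assessment scores (0 = open door, 10 = hardened).
layers = {
    "firewall": 9,
    "intrusion_detection": 8,
    "partner_connection": 3,  # an insecure business partner
    "physical_security": 7,
}

print(overall_assurance(layers))  # prints 3: the partner link sets the ceiling
```

Note that strengthening any layer other than the weakest one leaves the result unchanged – which is why the partner-connectivity problem discussed below can neutralize an otherwise strong investment.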
The complexity of information architectures is unrivalled in today’s world. In the world of bits and bytes, an application will only work if every bit is set just right in the context of a billion or more bytes – and that is only in a perfect world. In the real world, the onus of secure and private data processing places additional demands on the system. Not only does it have to work, but it must be secure.
Layered security: People, process and technology
Layer 1 – the technology
Bill Cheswick’s layered security model provides a starting point in developing a security strategy. It is a flat topology that resembles a bullseye target, where the data resides at the centre, surrounded by rings of protective perimeters. The external perimeter is usually the Internet-facing firewall. Moving towards the centre, Intrusion Detection Systems monitor network traffic and the integrity of key systems files on servers. Additional Access Control Lists on internal routers and possibly more internal firewalls protect the central data and processes from the mean streets of the Internet.
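The bullseye model can be illustrated as a chain of independent checks that traffic must pass in sequence, from the outer firewall to the inner ACLs. The request fields and rules below are simplified, hypothetical stand-ins – not real firewall or IDS configuration – sketched to show that reaching the centre requires clearing every ring.

```python
def firewall(req):
    # Outer ring: admit traffic only to published service ports.
    return req["port"] in {80, 443}

def ids(req):
    # Middle ring: reject payloads matching a known attack signature.
    return b"DROP TABLE" not in req["payload"]

def internal_acl(req):
    # Inner ring: only internal subnets may reach the database tier.
    return req["source"].startswith("10.")

LAYERS = [firewall, ids, internal_acl]

def admit(req):
    """A request reaches the central data only if every layer admits it."""
    return all(layer(req) for layer in LAYERS)

req = {"port": 443, "payload": b"GET /", "source": "10.0.0.5"}
print(admit(req))  # prints True: the request clears all three rings
```

A request blocked by any single ring – a closed port, a signature match, an external source address – never reaches the centre, regardless of how the other rings would have ruled.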
A new addition to the technology layer is the honeypot, an exposed system that sits outside the firewall perimeter and serves as flypaper for attempted penetration by hackers. Honeypots resemble real operational networks, but contain contrived data and are not operationally significant. The theory is that attackers are enticed into revealing their exploit tools before invading the real internal network, providing advance warning of attack signatures which, if they appear on the internal network, signal an attack. The goal is not to stop intruders, but to stall them while preparing a defence. Honeypots can deliver real value in providing advance warning of network attacks, but only if skilled security practitioners are available at a moment’s notice to analyse the logs and properly interpret the internal network events that need to be detected.
Weakness in the layered security model occurs when an external business partner, connected to the government infrastructure, is not up to standard with respect to security. Such partners may include payment clearing houses, agents, retail kiosks, fulfillment houses, marketing partners or even district offices. If an insecure partner requires connectivity to a core database to perform transactions, the entire layered security investment could be neutralized by a breach at that partner. This is the “weakest link” syndrome in action. ISOs must be able to set security standards and implement reviews, audits and certification processes to ensure that security budgets do not go for naught. This assurance function is a key due diligence activity that is becoming commonplace as a best practice.
Finally, the current economic climate has drained the water from the swamp, exposing pockets of security tools that were implemented as point solutions. Today, the ISO must evaluate the effectiveness of these solutions within the layered environment. Licence and maintenance cost consolidation is a priority, as the resulting savings may liberate budgets for additional security measures. From a technical standpoint, organizations are seeking enterprise-level consoles that integrate reporting from point solutions into a consolidated, holistic view of security operations. This will be a hotbed of activity this year as ISOs seek maximum benefit in an environment of reduced budgets and restrained spending.
Really layered security
Thus far, a flat, two-dimensional layered security approach has been described – much like tree rings – involving only technology. The technical controls are essential, but they are certainly not a panacea. They must be complemented with policy, process and people. Truly layered security results when we add these new layers.
Policy and process
The second layer is the collection of policies and processes that are crucial in managing the technical control layer. Change management is a good example of a quality process that prevents changed technical settings from creating an unsafe condition or inadvertently breaking another, unrelated process. Security technologies are relatively new, and it has taken time for their management to evolve and become proven. ISO 17799 is an example of the evolution of old standards to meet new challenges. The results have been informed processes for policy implementation, quality audits, business continuity plans and incident response plans. Government and the private sector are both witnessing the maturity and acceptance of all these components of this vital organizational layer.
The third layer involves human factors. People (users) invoke computation to perform business functions, and people (technocrats) design the network and its security. There has been a significant increase in dedication to security awareness in the user workforce. Heralded by demand for security awareness training and witnessed by the diversity of offerings available, user training is gaining much-needed momentum. End-user security awareness training is officially on the radar screen. The driving forces are lessons learned from the unfortunate experiences of others, pending privacy legislation, a clear trend toward responsibility and governance, and the threat of jail time for the negligent.
The experience and availability of security training have also assisted the ranks of network architects, administrators and security professionals in performing their tasks. These practitioners must meet world-class standards of operational knowledge. The observed trend is toward investment in education and training – a positive trend, since employees are the face of government and help define its culture. Smart organizations that invest in their people will enjoy a clear competitive advantage, profitability and growth.
The fourth layer is strong physical security wrapping the IT infrastructure, data, processes and people. 9/11 was a wake-up call for the reassessment of physical security measures. It was an unwelcome call, but the results are just short of dramatic. The external interface of many organizations now resembles barbed wire where the red carpet once lay. Improvement is still necessary in internal interactions, where the general posture remains too casual. Hard drives go missing, compromising personal data and privacy rights, but this is an education issue, and demonstrated improvement far outweighs the remaining problems. Small and medium-sized businesses need help here; large corporations and government have historically solved this issue.
The fifth layer is the creation of security knowledge and wisdom. The classic hierarchy of information technology includes data, information, knowledge and wisdom. Data has no structure or context. Information is data in context, rendering it useful. Knowledge is a collection of organized information, allowing appropriate response. Wisdom is the peak, tempering knowledge with experience. Wisdom facilitates proactive capabilities.
Wisdom is the domain of the sage tribal leader or battle commander who knows what will happen. IT security is approaching wisdom, but wisdom is elusive because it is a moving target. The laws of physics do not change, so physicists can aspire to build knowledge into wisdom. IT, though, is a moving train, with new technologies, software and applications being inexorably introduced by business and the profit motive. New vulnerabilities, exploits and attack methodologies are introduced along with them. IT security is currently in a state of consolidation and stocktaking, at a milestone on the journey to maturity.
In the last year, security training, education, standards and venues for the peer exchange of security knowledge have become established. Security practitioners are partaking in the exchange of knowledge and experience. Vexing problems continue: the basic insecure protocol, the anonymity afforded to the perpetrator, bug-laden software, the patching problem, open disclosure and greed. Nevertheless, we are better prepared to meet the challenges today. Organizations that understand layered security and recognize the importance of addressing security in all its facets will be best positioned to reduce risk in the face of prevailing threats. ISOs who are capable of assessing the threat environment and deploying their resources to counter threats will be most successful in securing the integrity of their data and systems.
Tom Slodichak is Chief Security Officer of the information technology security provider WhiteHat Inc.
Of honeypots and hackers
The newest concept in the scheme of layered security is the honeypot. Honeypots are hacker “flypaper,” universally reviled by the hacker community. Honeypots imitate real systems, easily allowing an intruder to break in – but with no risk, as the honeypot is outside the real network and contains no real data.
An Internet-attached server acts as a bait or decoy, luring potential hackers in order to study their activities and monitor how they attempt to break into a system. The goal is to have attackers reveal their tools prior to the invasion of the actual IT infrastructure and to learn where the system may have weaknesses that need attention.
If a honeypot is successful, the intruder will have no idea that he or she is being tricked and monitored. The information collected can be used both to validate internal attack signatures and to give security personnel time to devise strategies to repel an attack on the real infrastructure.
The honeypot’s principal advantage is that it has the potential to detect new and unknown exploits for which an attack signature has not yet been published. This helps network security administrators keep their systems secure.