Last fall, the Office of the Corporate CIO for the Government of Alberta received the first annual HP Privacy Innovation Award at the PrivacyCon 2003 conference in Columbus, Ohio. The award was presented for Alberta’s work in the development of a privacy architecture. Although it is the first award of its kind, it will not be the last.
If CIOs are to deal successfully with the privacy issues facing IT today, they will have to embrace a phrase popular in privacy circles: “Privacy by design.” Privacy by design refers to the need to make privacy protection an integral feature of IT systems and applications.
And privacy architecture is the road to privacy by design.

Privacy is perhaps the leading public policy issue in IT. Hardly a day goes by without a new privacy concern or issue in the news. Many issues are driven by developments in the technology used to manage information, although they are not necessarily the fault of the technology itself.
Governments are particularly vulnerable to privacy problems, for several reasons. They manage large volumes of personal information. They require the trust of the citizenry if they wish to introduce new delivery methods for government programs and services.
Perhaps most important, public sector privacy legislation is more mature, and arguably more stringent, than that in the private sector. Increasingly, though, the private sector is faced with much the same privacy requirements as government. The Personal Information Protection and Electronic Documents Act, also known by the awkward acronym PIPEDA, came into full effect on Jan. 1, 2004. Similar legislation is in effect in several provinces, with more likely to come.
The main privacy issues for the IT community are clear: How do we secure personal information against identity theft and other kinds of unauthorized release? How do we ensure that the data subject’s privacy preferences are recorded and respected in transactions involving personal information? How do we comply with privacy legislation, which often requires case-by-case decisions, in automated customer transactions? How can personalization and customer relationship management be reconciled with privacy requirements?
All these issues and more are raised by PIPEDA and its public-sector equivalents. None are easy to deal with in today’s large-scale, high-volume, heavily automated information management systems. And most fall squarely in the laps of CIOs.

Alberta’s response to these and other IT privacy issues has been to focus on privacy as a risk management issue. Risk management involves risk identification and risk mitigation. You have to know what risks you face before you can deal with them. Having identified a given risk, you have to minimize the chance of its occurrence and, if it does occur, the impact of that occurrence.
For privacy risk identification, we rely mainly on the privacy impact assessment (PIA), as do most governments in Canada and an increasing number of private sector organizations. PIA is crucial, but it does not itself reduce risks; it identifies risks requiring attention and often reports the measures taken to reduce them. Risk mitigation requires more than assessment. It requires the development of rules, both business and technical, and the consistent application of those rules. It is here that privacy architecture comes into play, by defining technical standards, or design rules, for IT applications and infrastructure.
When I joined the Government of Alberta’s Office of the CIO in 2001 after a stint at the Office of the Information and Privacy Commissioner, I was tasked with ensuring that privacy impact assessments were conducted on cross-government IT projects. At that time, though, the Office of the CIO was engaged in the planning and development of an overall enterprise architecture for IT operations. I got involved in the development of privacy principles for the project known as GAEA, for Government of Alberta Enterprise Architecture. By September 2002, we had determined that we had to provide more detailed direction to ensure that the government’s privacy obligations were adequately reflected in IT design.
GAEA had five component architectures, dealing with business, data, applications, technology and security. We needed a sixth: privacy. Between October 2002 and June 2003, with the assistance of IBM Global Services and the IBM Privacy Research Lab in Zurich, we developed what is, as far as we know, the first-ever fully elaborated privacy architecture.
The privacy architecture has five major components today. More will be added, but the five already developed comprise what we believe to be the essential foundation for privacy by design.
a. A consistent privacy terminology, to provide well-defined common terms to guide privacy discussions in an IT context. This was essential in the development of the privacy architecture itself, which was why it was the first component completed. It will also be important in the implementation phase and for future expansions of the architecture.
b. A privacy taxonomy for personal information metadata, to provide the syntax and vocabulary for future rule-based privacy functions and to assist with data sharing decisions within government. The structure of the taxonomy is complete, but the detailed classification remains to be filled in. Once it is, the taxonomy will provide the basis of a language for describing privacy rules. These rules will govern the automation of routine privacy transactions involved in e-government, electronic service delivery and other personal information management functions.
c. An identity key system that prevents the unauthorized disclosure of personal information and controls the use of personal information within government. This system provides for the storage of personal attributes apart from personal identifiers, reducing the privacy risks associated with the unauthorized disclosure of either. Through a system of internal identifiers that are meaningless except to a highly secure identity administration utility, the identity key scheme ensures that identities associated with personal attributes are only available to authorized users and applications. It also prevents the unauthorized sharing of personal information between government ministries, because internal identifiers are unique to a specific ministry, or to a program within a ministry. A special kind of internal identifier is used to enable data sharing across ministry and program boundaries, but only when such sharing is allowed by legislation.
d. A data placement process to govern decisions related to data sharing and re-use in the context of the data architecture. This is a decision process, the one part of the architecture that extends into the policy realm, and it requires an accountability structure. It also requires that a privacy impact assessment form part of the decision process. The term “data placement” refers to the GAEA data architecture, which uses “bands” to define the extent to which data assets are shared across government.
e. Privacy transformation standards for use in data dissemination decisions and applications. Although potentially capable of full automation, these standards will be applied manually at first. They comprise a decision process and data transformation techniques, which are intended to ensure that personal information is made available in as anonymous a form as possible for the purpose at hand.
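The idea behind the identity key system, storing attributes apart from identifiers and giving each ministry internal identifiers that are meaningless outside a central identity utility, can be sketched in a few lines. This is purely illustrative: the class and field names are hypothetical, and the keyed-hash derivation is one assumed way an identity utility might mint per-ministry identifiers, not a description of Alberta's actual implementation.

```python
import hashlib
import hmac


class IdentityAuthority:
    """Sketch of a highly secure identity administration utility.

    It derives ministry-specific internal identifiers from a person's
    real identifier, so records held by different ministries carry
    different keys and cannot be linked without the authority.
    """

    def __init__(self, secret: bytes):
        self._secret = secret  # known only to the authority

    def internal_id(self, ministry: str, person_id: str) -> str:
        # The identifier is meaningless outside this utility: it can
        # only be reproduced or resolved with the secret key.
        msg = f"{ministry}:{person_id}".encode()
        return hmac.new(self._secret, msg, hashlib.sha256).hexdigest()[:16]


authority = IdentityAuthority(secret=b"authority-only-key")

# Personal attributes are stored apart from personal identifiers,
# keyed only by the internal identifier for one ministry.
health_records = {
    authority.internal_id("Health", "P-1001"): {"allergy": "penicillin"},
}
education_records = {
    authority.internal_id("Education", "P-1001"): {"grade": 11},
}

# The same person carries a different internal identifier in each
# ministry, so the two stores cannot be joined on a common key.
assert (authority.internal_id("Health", "P-1001")
        != authority.internal_id("Education", "P-1001"))
```

A cross-boundary sharing identifier, as described above, would be a further identifier minted by the same authority but only on presentation of the legislative authority for the sharing.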
An overview of the privacy architecture is available on the Web at sharp.gov.ab.ca/ppa.
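The privacy transformation standards in item e, suppressing direct identifiers and generalizing the rest so data is released in as anonymous a form as the purpose allows, can be illustrated with a minimal sketch. The field names and the specific techniques shown (suppression, plus generalizing a postal code to its prefix and a birth date to its year) are assumptions for illustration, not the actual Alberta standards.

```python
def transform(record: dict) -> dict:
    """Apply simple privacy transformations before dissemination.

    Illustrative techniques only: suppression of direct identifiers
    and generalization of quasi-identifiers to coarser values.
    """
    out = dict(record)
    # Suppress direct identifiers outright.
    for field in ("name", "health_number"):
        out.pop(field, None)
    # Generalize quasi-identifiers.
    if "postal_code" in out:
        out["postal_code"] = out["postal_code"][:3]  # prefix only
    if "birth_date" in out:
        out["birth_date"] = out["birth_date"][:4]    # year only
    return out


released = transform({
    "name": "A. Citizen",
    "health_number": "123456",
    "postal_code": "T5K 2B6",
    "birth_date": "1961-04-17",
    "diagnosis": "asthma",
})
# 'released' keeps the substantive attribute ('diagnosis') but no
# direct identifiers, and only generalized quasi-identifiers.
```

In a full implementation, the decision process would choose which transformations to apply based on the sensitivity of the data and the purpose of the release.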
Because it represents a substantial shift in design philosophy related to the management of personal information, the privacy architecture will not be retrofitted to existing applications. It will be implemented gradually, as new applications are developed and existing ones are replaced.
No architecture can ever be static. A static architecture is an obsolete architecture. Future development of the privacy architecture will address such topics as access by data subjects to their own data, user interface issues and design features to support consent and choice. Further in the future, we hope to buy or build features that will enable real-time automated privacy decisions on a transactional basis.