
The real force behind the consumerization of IT

I have yet to meet an IT exec or CIO for whom the “consumerization of IT” — employees asserting control over the technology they use for work — isn’t now a major area of contemplation … and sometimes consternation. But there’s more to the trend than Apple-blinded employees bringing Macs, iPhones, and iPads into the office, even if they are its most identifiable champions. Let me take you through the key issues behind the consumerization trend; there’s much more to it than mobile devices.

Two years ago, iPhones started appearing in the office, often connecting to corporate email and Wi-Fi networks. For many, that marked the beginning of the phenomenon known as the consumerization of IT, but it started years before the iPhone. People have been using their home PCs and Macs — systems not typically under strict IT management — for work for years, and half a decade ago Salesforce.com created a booming business selling cloud-based sales force automation software directly to business execs, explicitly and proudly bypassing IT.

As is usually the case with anything new, the IT reaction was to say no, and fears about security breaches quickly became the justification for that policy. But just as with the home computer, public hotspot, and Salesforce.com phenomena that came before, the cost savings, the absence of significant security problems, and executive joy at the new technology forced IT to move from “no” to “how.”

The roots stretch back to the 1960s
But anyone who believes the consumerization phenomenon is driven by technology alone is missing the point. The real change — and why it’s ultimately not an IT decision — is in business itself. The 1950s were the pinnacle of the hierarchical, military-style “company man” business — a consequence of the mass of military-trained World War II soldiers returning to the workforce. Then came the 1960s and 1970s, when people asserted their rights as individuals and as members of minority and other groups. The 1980s saw a deconstruction of the corporation into a flatter model, with fewer middle managers and more employee empowerment.

In manufacturing, this became highly codified through techniques from management gurus such as W. Edwards Deming, including Lean and Six Sigma coupled with employee co-ownership in the form of quality circles and Toyota’s “anyone can stop the assembly line” philosophy. The 1990s and 2000s saw a continuing hollowing out of middle management, the introduction of part-time and contract labor forces, and the replacement of routine work with robots, software, and offshore workers (in societies that largely had no individual-empowerment culture).

That left many companies with a smaller core of knowledge workers, retained because they could think for themselves and apply their intuition, personal skills, and so on, whether to sales, customer service, product design, or operations.

The new workforce favors those who rely on themselves
The result is a workforce of nomads who come together as needed, using a wide range of resources in a variety of locations. Inevitably, that nomadism accentuates the importance of the tools these employees use to do the work they’re valued for. As each person’s individual strengths vary, so do the tools they prefer to use — and begin to insist on using.

This phenomenon is by no means unique to knowledge workers. Many tradespeople — contractors and chefs, for example — have long used their own equipment because of the perceived better fit, quality, or feel. Software, computing devices, and the like are the knowledge worker’s equivalent.

Given these fundamental shifts in both business structures and the type of value desired from individual employees, a rift has developed between those new realities and the structures that live on from the “company man” era. For example, employees are told to manage all or most of their retirement savings and to keep up their skills on their own dime and time. The company may help a bit, but it no longer takes care of employees in these ways. The notion of a job marriage, where doing your job meant lifelong employment and a secure retirement, is gone.

Thus, the relationship between the employee and employer has changed to one of ad hoc participation. As long as it makes sense for both the employee and employer, the relationship stands. When either decides the relationship is no longer desired, it’s over.

The clash between old and new creates the clash between users and IT
Yet the old “company man” approach lives on in IT and other operational systems. One example is the notion of a standard technology environment, where PCs and their software are reliably stamped out in identical units like cookies in a Mother’s factory. Another is the notion that employees need to be protected from risk by having it removed via technology wherever possible.

In other words, whereas employees are told to act like adults when it comes to their retirement and skills, they’re treated like babies when it comes to technology usage.

As workers are told to be more independent and self-supporting, they’re fenced in at work. Abbie Lundberg, the former editor in chief of CIO magazine and now a technology management consultant, has a great analogy for this situation: IT, the CSO, the legal department, and often HR treat business employees as babies whom they lock in the house so that they don’t crawl out into the street and get killed.

The better metaphor, she says, is to think of business staff as teenagers who are going to drive the car whether you want them to or not. It’s better to teach them how to drive and to set limits and expectations, such as a curfew and no-consequences permission to call for help if they do get into trouble.

The truth is that if you fence them in, they will find a way out. And that is what will get them — and the company — in trouble. Remember, today’s knowledge workers are valued for their creativity and drive, and their familiarity with technology lets them act on it in a realm that was once the sanctum of IT.

Many in IT are living in a fool’s paradise
The uncomfortable reality for IT and business executives is that most are operating in a fool’s paradise when it comes to the consumerization trend. A recent IDC study shows that although 40 percent of IT decision makers say they let employees access corporate information from employee-owned devices, 70 percent of employees say they access corporate data that way. That means in many organizations IT has no real handle on what is actually happening in the systems it is managing. IDC’s research also shows that the use of personally owned devices is only growing.

Other IDC research shows how disconnected IT is from the already-consumerized technology reality of its own company. Note the mismatch between IT’s and users’ views of policies on who pays for mobile services: IT thinks the business determines and directly pays for business-related access, a view shared only by BlackBerry users (those whom IT provisions). Users of other mobile platforms say they bear the costs or charge them to the company as an expense — and thus make the decisions. In other words, these IT organizations see only the BlackBerrys that represent the pre-consumerization state of their organizations.

Forrester Research says that the consumerization trend will only intensify as the Millennials become a greater proportion of the workforce. In 2010, a quarter of employees were Millennials, a proportion projected to rise to 40 percent by 2020. Think about it: The Boomers who grew up in the individual-empowerment era of the 1960s and 1970s are largely the ones who have the political clout and financial ability to use their own technologies, but the generations that follow see such technology as simply normal.

I’ve heard several CIOs at large, conservative enterprises say they had to allow iOS and Android devices because “kids” wouldn’t work for a firm that forced them to use a BlackBerry and Windows XP PC. The U.S. Army is a great example; it’s proactively looking to deploy Android devices and iPads, and it’s training troops on appropriate use of iPhones and other such devices because its 20-something workforce uses them anyhow.

One more study, this time from Aberdeen Research: I covered it in detail earlier this year in my blog because it’s such a shocker. The more you try to control employee-oriented technology, the more it costs you and the less safe you are. Remember that analogy of trying to fence in teenagers? That’s why: When you rigidly control the technology and processes of knowledge workers, they actively work around you — and against you. Your “secured” email ends up getting forwarded to Gmail and Hotmail accounts where you have zero control over or visibility into it. Documents find their way onto CD-Rs, thumb drives, and cloud storage for transfer to home computers and from there to mobile devices. Cloud apps get used more and more as IT comes to be viewed as the obstacle to getting work done.

The real shocker to me was that a free-for-all environment is safer and cheaper than a rigidly controlled one. But it made sense after Aberdeen researcher Andrew Borg explained it: If employees aren’t actively fighting IT, they’re less likely to cause issues. And of course the safest, cheapest approach is the “wise parent” approach: Use a mix of policies, incentives, and education to help your teen become a self-sufficient adult. The incentive is the right to use a device of their own choosing; the policies channel that use in safe directions; and education both reduces resistance to burdensome but truly necessary policies and increases employees’ self-vigilance. The overwhelming majority of employees, after all, want to do the right thing for themselves and their company.

How IT can adjust to the new reality, without endangering the business
So how does IT function in this new world? PwC came up with a framework that I think is right (both because I contributed to it and because it’s gotten a good reaction when I’ve presented it to various IT audiences). The full PwC report laying out the framework is available as a free download.

It’s a different way for many in IT to think, as it starts with “soft” values and requires IT to share ownership of risk management and technology decision making with employees and their business departments. (It requires the same of the legal, executive, and HR teams.) But as the consumerization trend is fueled by “soft” human issues, it only makes sense that the management response to it be grounded in human approaches.

On the technology side, the framework favors policies, not rigid barriers, to steer employees toward the right outcomes while allowing appropriate freedom and creativity. It says the IT monoculture at the endpoint level is a dead end, so IT should instead think of technology as an onion with multiple layers. The outer, employee-oriented layers should be flexible and individualizable, while core systems should be standardized and safeguarded as much as possible. A simple illustration: Allow any mobile device that conforms to your routine information access policies, but add layers of authentication and security measures such as encryption for those information resources that are truly sensitive within the network. Letting an employee access the workgroup share drive from an iPad doesn’t mean that same employee can open your HR database.
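To make that onion image concrete, here is a minimal sketch in Python of how such a layered access decision might look. Everything in it (the Device and Resource types, their compliance flags, and the example values) is a hypothetical illustration of the idea, not any particular product's API or policy engine:

    # A minimal sketch of the layered "onion" policy described above.
    # The Device and Resource types and their fields are hypothetical,
    # not drawn from any real MDM or access-control product.
    from dataclasses import dataclass

    @dataclass
    class Device:
        passcode_set: bool   # baseline hygiene: a lock-screen passcode is enforced
        encrypted: bool      # device storage is encrypted
        mfa_verified: bool   # user has completed multifactor authentication

    @dataclass
    class Resource:
        name: str
        sensitivity: str     # "routine" (e.g., workgroup share) or "sensitive" (e.g., HR database)

    def may_access(device: Device, resource: Resource) -> bool:
        """Outer layer: any device meeting baseline policy reaches routine data.
        Inner layer: sensitive data also requires encryption and stronger authentication."""
        baseline_ok = device.passcode_set
        if resource.sensitivity == "routine":
            return baseline_ok
        return baseline_ok and device.encrypted and device.mfa_verified

    # An employee's iPad can open the workgroup share but not the HR database.
    ipad = Device(passcode_set=True, encrypted=False, mfa_verified=False)
    print(may_access(ipad, Resource("workgroup share", "routine")))   # True
    print(may_access(ipad, Resource("HR database", "sensitive")))     # False

The shape of the decision is what matters: baseline device hygiene is enough for the outer layers, while the sensitive core adds authentication and encryption requirements on top.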

The bad news is that not all the technology needed to manage this onion skin is available — the notion of information rights management is rarely implemented in typical enterprise data objects or systems, and just as rarely in user apps and devices. The good news is that by shifting risk from an IT- or CSO-only job to a shared one, you give the business an incentive to reduce that risk through other means.

The other good news is that consumerization is not new. The first IBM PC or Apple IIe owned by an employee or department started this journey. The Internet pushed it to a whole new level, as information became unbounded, not just computing capability. Yet organizations have not only survived, they’ve thrived with that new power. Think back to the notion that Internet access had to be strictly controlled; it once seemed necessary and scary, but ended up not being so bad. Then you adapted as it became clear you had to, finding many positives to exploit along the way. Now apply that thinking to this newest set of waves: mobile, cloud, and social media.
