Muddling with middleware

Across the public sector, e-government initiatives are calling for an unprecedented level of cohesion between diverse IT systems. Integration middleware makes the task easier, but the kind of interoperability that would make government truly transparent has been elusive.

Public sector organizations are finding that the real battle is in finding common ground between stakeholders and establishing standards for shared entities. Using software to mediate between two other software platforms that can’t talk to each other sounds a bit like a Band-Aid solution, and in some respects it is. In an ideal world, globally accepted interoperability standards – not middleware – would ensure that applications could operate with each other. One of the ironies is that middleware, like the diverse systems it purports to unify, is largely proprietary, and is segmented into many sub-types.

The upshot is that, instead of using middleware to create a common “middle zone” that would effectively solve the interoperability problem, large organizations wind up accumulating and supporting varied – and often largely redundant – middleware portfolios that leave the issue unresolved.

Government strategists seeking to simplify this “hodge-podge of different environments,” as Dave Wallace, Ontario’s Chief Technology Officer, calls it, face a number of barriers. Interoperability has traditionally been a tough sell at ministerial and departmental levels. Under pressure to optimize their own economies, application owners tend to have their own middleware strategies.

As Peter Watkins, Director of the Technology, Planning, and Standards Branch in British Columbia says, middleware is “quite often an embedded component of the business solution;” vendors like SAP, to cite just one example, have begun to include a middleware component with their software. Investments in trained staff and vendor capabilities also tend to align middleware solutions with the applications they support, according to Watkins. Finally, there are architectural alignments; an application that is built with, for example, a messaging architecture, needs a message queuing product when it comes to middleware. “It’s not as if there’s an enterprise solution out there,” says Watkins.

The problem organizations are trying to solve is not integration per se, but the simplification of integration environments. Point-to-point integration has always been possible using standard APIs and conventional tools, and for ad hoc requirements, it is still by far the most economical approach.

The problem is that creating and supporting large numbers of point-to-point interfaces is cumbersome, expensive and error-prone.

Ideally, middleware solves this problem by providing a single connectivity hub for all applications. However, as lack of standardization continues to perpetuate complex and diverse middleware environments, the cure, while not worse than the disease, does not achieve the intended result.
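The arithmetic behind this tradeoff is simple to sketch: fully meshed point-to-point integration needs an interface for every pair of systems, which grows quadratically, while a hub needs only one adapter per system. The following is an illustrative calculation, not a model of any particular product:

```python
# Illustrative only: compare the number of integration links needed
# for n systems under point-to-point versus hub-based middleware.

def point_to_point_links(n: int) -> int:
    """Every pair of systems gets its own interface: n * (n - 1) / 2."""
    return n * (n - 1) // 2

def hub_adapters(n: int) -> int:
    """Each system connects once to a central middleware hub."""
    return n

for n in (5, 20, 50):
    print(f"{n} systems: {point_to_point_links(n)} point-to-point links "
          f"vs {hub_adapters(n)} hub adapters")
```

At five systems the difference is modest (10 links vs 5 adapters); at fifty it is 1,225 vs 50, which is why large portfolios feel the pain first.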

There are some encouraging signs that middleware vendors are working to alleviate this problem by moving towards standardization. The agreement between Microsoft Corp. and Sun Microsystems Inc. to develop interoperability between Java and .Net is a recent example. However, progress has generally been slow, and uncertainty remains a barrier.

According to Ontario’s Wallace, “the uncertain future of UDDI and Web services has left many middleware vendors in ‘wait-and-see’ mode, as they fear their products could become redundant.” The hope is that eventually, industry-wide adoption of Web services technologies and Service-Oriented Architectures (SOAs) will bring in a whole new era of interoperability.

These standardized approaches have become a cornerstone of large vendor strategies, such as IBM Corp.’s computing-on-demand approach. While the standards are not finalized, the need for commonality at the business protocol level is a given, and this is where British Columbia and Ontario are focusing their integration efforts. Building this kind of common ground calls for a new level of co-operation among application owners, and achieving this is a top priority for government strategists like Wallace and Watkins. High profile e-government initiatives are forcing the issue.

“We’re trying to put new glasses on as we look at the integration/interoperability topic,” says Watkins. “In the past, you used to sit down, look at the problem, run an RFP, make a decision as to what middleware product you wanted, and you were done. Given the types of projects we are running these days, and the number of different parties that have to participate in them, that’s really not the most valid way to do it anymore. So we’re trying to sort through the essential things that have to be shared or are in common.” From a practical standpoint, this amounts to building competencies as well as consensus. “The real key,” says B.C.’s Watkins, “is being able to pull the different stakeholders together who have an interest in interoperability, and can successfully design the protocols, interfaces, and standards that they need to make it work. And that, I think, is going to be the key to being able to capitalize on Web services.

“If you’re not good at modeling your business processes and developing interfaces and building XML schemas and these types of things, you’re going to really struggle to make this work.” In bringing application owners together, Watkins finds there is a tradeoff between standardization and the need to preserve local benefits. Diversity, he notes, is the result of a competitive process that often delivers unique, creative solutions to particular business problems at the local level. “We like that competition part of it,” says Watkins, “the innovation and the competition that can come from letting people find a new way to do it.

“What we don’t like is when it breaks the interoperability and creates islands and locks up information. We need a way to get the interoperability, but we don’t want it to cost the innovation and competition that is available in the market.”

Ontario is following a more structured approach under the umbrella of its Standard Application Environment (SAE) initiative, which began in the late ‘90s. The province has developed a number of formal documents and policies, including a Vendor of Record (VOR) for standardized middleware, and a number of mandatory standards for defining specific types of data. As in British Columbia, much of the practical work has occurred through co-operative ventures between application owners, vendors and Ontario’s I&IT Group. Wallace sees an evolutionary process in which organizations move towards a utility-based common infrastructure.

Ontario’s tactical approach is to go for short-term wins while moving in the direction of a standardized environment. “Given that elements of middleware are still proprietary,” Wallace explains, “Ontario’s strategy is to architect component-based, re-usable solutions that are independent of the application platform. This allows for ‘plug in and plug out’ flexibility and should ease migration to more open standard-based platforms as they become available and more predominant in the industry.”

Getting components to work in a variety of application scenarios requires significant technical resources, but the biggest challenges will be on the business side, according to David Senf, manager of IT/Business Enablement Solutions Group for IDC Canada Ltd. “A lot of organizations think they can move to a service-oriented architecture and to re-use a component of an application across their organization — say it’s an accounting module that can be used in multiple applications. It’s not easy at all to define how that component is going to be re-used, and to ensure that everything works smoothly when that application is invoked and that functionality is re-used.

“I think that that’s further down the road than most organizations think.”

Establishing a common infrastructure for managing user identities is a target for both Ontario and B.C. The first step is a common data model for describing users; both provinces have published standards towards this end. Short-term gains will come in reduced maintenance and fewer errors. The eventual goal in Ontario’s case is a three-layer chain-of-trust architecture that will not only provide single sign-on to multiple applications, but will also manage permissions for specific functions within those applications. Wallace envisions a centralized computing utility that hosts a number of application-independent services, including user authentication, that can be shared by any systems requiring them. Middleware is seen as the “highway” that provides efficient access to these application-independent services.
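The chain-of-trust idea, function-level permissions layered on top of a shared identity record, can be sketched in a few lines. All field, application, and function names below are invented for illustration; they are not drawn from either province's published standards:

```python
# Hypothetical sketch of a common identity data model with
# application- and function-level permissions, in the spirit of a
# chain-of-trust design layered on single sign-on.
from dataclasses import dataclass, field

@dataclass
class UserIdentity:
    user_id: str          # single authoritative identifier shared by all systems
    display_name: str
    # permissions keyed by application, then by function within that application
    permissions: dict = field(default_factory=dict)

def can_invoke(user: UserIdentity, application: str, function: str) -> bool:
    """Check whether a user may call a specific function of a specific app."""
    return function in user.permissions.get(application, set())

alice = UserIdentity("u-1001", "Alice",
                     {"tax-portal": {"view-return", "file-return"}})
print(can_invoke(alice, "tax-portal", "file-return"))   # True
print(can_invoke(alice, "licensing", "renew-permit"))   # False
```

The point of the shared model is that the `tax-portal` and `licensing` systems consult the same record instead of each maintaining its own user table.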

Towards this end, Ontario’s I&IT Group is testing a middleware-based messaging backbone that will eventually become the basis of a “computing on demand” utility environment. IBM Canada Ltd., a major vendor for Ontario, believes that a middleware-based utility platform can help governments weather the many changes that invariably occur in the public sector without getting bogged down in overly complicated IT change management. Corey Glynn, a consultant with Global Services, explains IBM’s view: “The key to success is coming up with service-oriented architectures with middleware-based designs that are flexible, so that if services need to be joined up in ways we hadn’t anticipated, or new legislation or program delivery objectives take them in different directions, then the government can be nimble enough to do that without having to go back for major structural re-work.”

Not all proponents of SOA are believers in middleware, however. Some pundits, notably Sun’s president and chief operating officer Jonathan Schwartz, predict that SOA will eventually spell the demise of middleware as a means of integrating business functionality. The rationale is that in a properly configured SOA environment, there will not be multiple copies of data on different databases, therefore there will not be a need to exchange data. For example, the name and address of a citizen would occur only once in one database, and this would be posted as a service for all applications that required that information. Wallace, on the other hand, believes that regardless of the future of standardization, middleware will continue to be a requirement in high performance environments, and could become even more critical.
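The single-source-of-data argument can be made concrete with a toy sketch: one authoritative service owns the citizen record, and every application queries it rather than keeping a synchronized copy. Names here are hypothetical:

```python
# Toy illustration of the "post it as a service" argument: one
# authoritative store for citizen addresses, queried by all applications,
# so there are no copies to reconcile via data exchange.

class CitizenRecordService:
    """Single source of truth for citizen addresses."""
    def __init__(self):
        self._records = {}

    def update_address(self, citizen_id: str, address: str) -> None:
        self._records[citizen_id] = address

    def get_address(self, citizen_id: str) -> str:
        return self._records[citizen_id]

registry = CitizenRecordService()
registry.update_address("C-42", "123 Bay St, Toronto")

# Two different applications read the same record at query time;
# neither holds a local copy that could drift out of date.
tax_app_view = registry.get_address("C-42")
health_app_view = registry.get_address("C-42")
print(tax_app_view == health_app_view)  # True
```

In this model an address change is visible to every consumer immediately, which is the scenario in which middleware-driven data exchange would have nothing left to do.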

As middleware becomes richer in functionality, and as shared services become more and more available in internal IT networks and over the Web, middleware could wind up assuming much of the logical functionality currently owned by applications. Says Wallace, “As open standards get there, and as these integrated platforms get more in the place of where we’re going to be, it will be interesting to see how middleware itself evolves from being the key connector to being the application platform itself.”

Regardless of what happens with middleware and standardization, governments have their work cut out for them in terms of interoperability. As pressures increase for transparency and accountability, stakeholders will be increasingly called on to take part in the larger picture. This will be a requirement even in the best-case scenario, where vendor adoption of standards accelerates and Web services becomes a reality in the near term.

Jacob Stoller is an independent writer and researcher based in Toronto.
