Forgetting DCE

Some of you might remember the Distributed Computing Environment (DCE), but it’s not clear that many industry pundits or venture capitalists do. Or at least they haven’t internalized a principal reason that DCE is, to put it politely, not prevalent today.

DCE is a set of technologies, developed by the Open Software Foundation (now called The Open Group), that lets a computer user employ network-based resources to augment his or her local computer. DCE, quoting from an IBM Corp. Web page, “is a comprehensive suite of integrated, yet modular, products which support transparent file access and secure resource sharing in heterogeneous, networked computing environments.”

I’m sure the Open Group will call this simplistic, but in my mind a major reason DCE was developed was to share resources, such as disk and processor cycles, over a network, because giving each individual enough dedicated resources was too expensive. With DCE, a user can access databases without needing a local copy and can get heavy-duty processing done without having an equally powerful computer on his or her desk.

But DCE’s proponents did not take into account the continued development of technology. Before the DCE specifications could be fully completed, disk and processor technology advanced enough to negate many of the assumed advantages of using DCE. DCE was based on the assumption that the cost of managing distributed resources would remain lower than the cost of replicating them. That assumption did not prove to be long-lived.

There just may be a lesson in the history of DCE for those who are considering investing in peer-to-peer networking, storage-as-a-service offerings or maybe even VPNs.

I am leaving out a number of other arguments that were made in the case for DCE – single sign-on, centralized backup, centralized authorization management and more. Some of these arguments are now being made for the newer technologies – they may prove to be as non-decisive as they were for DCE. I am also leaving out the ego factor that leads network managers to think they should control everything that connects to their networks. That factor is harder to analyze – some of the egos are rather strong.

An undercurrent of Clayton Christensen’s book, The Innovator’s Dilemma, is that it is quite hard for people to account for the fact that technology does not stand still when evaluating their options. It is much too easy to look at what you can buy today and assume that it represents what will be available in the future. An example of this may be the pundits who dismiss using the best-effort Internet for telephony – all they can see is that it would not work well enough for them today. They forget that using today as a guide is what led to DCE’s development.

Bradner is a consultant with Harvard University’s University Information Systems. He can be reached at [email protected].
