If you asked a handful of strangers what’s meant by the word “computing,” they would probably refer to using a combination of hardware and software to manage information. I doubt many of them would talk about math, but that’s the original connotation. This is part of the complicated journey of human thought through language.
Most people in the IT industry hate jargon, even those who use it all the time. It’s become pretty accepted that we should not be creating more three-letter abbreviations and should figure out a way to get rid of the ones in our collective vocabulary. But there’s another side to this story, one that suggests we might do well to not only preserve jargon, but encourage an ever-expanding glossary of technology-related terms.
In a recent article on Miller-McCune called “Rescuing Endangered Languages Means Saving Ideas,” Emily Badger writes about a joint program by the U.S. National Science Foundation and National Endowment for the Humanities where researchers are trying to preserve dialects that are dying out in regions all over the world. They believe that as half of the world’s 7,000 languages are projected to disappear by the end of the century, what we are losing are not just ways to say things but things to say.
“As the famous example says, Eskimo have numerous words to describe what Americans would just call ‘snow’ and ‘ice,’” Badger writes. “This suggests language systems don’t merely translate universal ideas into different spellings; they encode different concepts. And when we lose a language, we risk losing those concepts.”
Of course, technology jargon is far from representing an entire language, but it does tend to focus on ways of encapsulating concepts, notions, representations of ideal states of information management. As major trends in the industry take shape, experts often note that they lack a common definition. This was true of business intelligence, service-oriented architecture, electronic health records, customer relationship management, cloud computing and many others.
Over time, through countless articles, panel discussions, online forums and social media, we tend to work toward some kind of consensus on these terms. But the results can be limiting. I’ve been hearing more and more people, for example, reducing cloud computing to “the use of the Internet to perform tasks you have historically done on site,” or something to that effect. This leaves out a lot about private versus public clouds, software as a service versus hosted infrastructure and so on.
Jargon doesn’t tend to get codified with the same rigour as, say, Italian or Farsi, but it is probably more “living” than many of the languages currently on the verge of extinction. If IT managers took more time to pay attention to the various usages from different sources, they would come as close as possible to capturing the “tribal knowledge” that also seems to be constantly in danger of being lost. Just because so much of what’s said about technology remains open to interpretation doesn’t mean we have to be in such a hurry to settle on an interpretation.