Do you think that technology is an over-used word?
The IT industry is replete with terms that are either poorly defined or have been re-defined in non-intuitive ways. Containers, Simple Object Access Protocol (SOAP), Representational State Transfer (REST), cloud, fog, and more come to mind immediately — but I think the word “technology” is one of the worst culprits.
It seems as if almost everything is a technology – we say wireless technology, information technology, chip technology, storage technology, etc.
I’m not surprised there’s some confusion, given that the Oxford dictionary defines technology as, “the application of scientific knowledge for practical purposes, especially in industry.” This definition covers almost anything and sounds to me more like a definition for engineering!
According to Wikipedia, technology is, “the collection of techniques, skills, methods and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. Technology can be the knowledge of techniques, processes, etc. or it can be embedded in machines, computers, devices and factories, which can be operated by individuals without detailed knowledge of the workings of such things.”
Technology, then, can be either the underlying science or the resulting products and services.
Let’s look at one particular phrase in more detail: emerging digital technologies.
The meaning of emerging
We can ask the obvious question: when is a technology considered to be emerging?
There is a difference between the invention/discovery of a technology and its inclusion in products. For example, encryption can be viewed as a technology, while implementing an encryption algorithm in software is a product.
To me, a technology is just a theory until products cross their chasm (see Geoffrey Moore’s book Crossing the Chasm) and are being adopted by the early majority of customers. A technology could also be called emerging when it has reached the slope of enlightenment in the Gartner Hype Cycle.
So, emerging means more than mere existence; it means fit for use.
The emphasis on digital
The word “digital” is also over-used. What do we mean by a digital technology or a digital business? Isn’t virtually every computer system digital, including our “legacy” systems of record?
To be fair, there have been analogue (i.e., non-digital) computers in the past.
Technically, digital means that data or measurements are encoded as bits that can be processed, stored and communicated. Digital is the opposite of analogue, which is a continuous stream of unencoded information, such as sound or vision.
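The idea that digitising means encoding continuous measurements as bits can be made concrete with a small sketch. The function below is a hypothetical illustration (not from any particular product): it samples a continuous signal at discrete times and quantises each sample into a fixed number of bits.

```python
import math

def digitise(signal, sample_rate=8, bits=4, duration=1.0):
    """Sample a continuous function of time and encode each sample as bits.

    The signal is assumed to return values in the range [-1, 1].
    """
    levels = 2 ** bits
    samples = []
    for n in range(int(sample_rate * duration)):
        t = n / sample_rate
        value = signal(t)                               # continuous (analogue) value
        level = round((value + 1) / 2 * (levels - 1))   # quantise to 0..levels-1
        samples.append(format(level, f"0{bits}b"))      # encode the level as bits
    return samples

# Example: a 1 Hz sine wave sampled 8 times per second at 4 bits per sample.
encoded = digitise(lambda t: math.sin(2 * math.pi * t))
print(encoded)
```

The key point the sketch makes is that information is lost twice on the way from analogue to digital: once by sampling at discrete instants, and once by rounding each sample to one of a finite set of levels.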
Two reasons for the current emphasis on all things digital may be:
- To distinguish technologies for data processing from other sciences (for example, materials such as graphene); or
- To highlight the importance of data to almost everything we do, from cars and houses to telephones and radios.
The digital economy is an economy in which money is digital and business has moved from manual processes to computerized, or at least computer-assisted, processes. Marketing campaigns and online stores are two areas where digital technologies are replacing more traditional processes.
The SMACI technologies
Here are a few of the popular “technologies” that are said to be both emerging and digital:
- Social networks – Although social networks such as Facebook, LinkedIn and Twitter can no longer be called emerging, they are still evolving and growing. Social networks are not a single technology; instead, they combine several technologies (applications, smartphones, Internet, and large-scale data centres) in ways that create new value. The key innovation is the application.
- Mobility – Mobility, at its most basic, is wireless communications for which several different technologies are useful (the most current being 4G LTE). A more complete picture, however, is that mobility is an ecosystem that includes wireless radios, smart devices, application stores, management systems, and various standards for formats, interfaces and transmission.
- Big data/Analytics – Big data is a general reference to the storage and processing of large data sets including the use of analytics to identify patterns and trends. There are various underlying technologies involved, not one single innovation.
- Cloud computing – Cloud computing is large-scale data centres offering various IT services to multiple customers on a shared basis. Cloud computing is highly automated, widely accessible, and is scalable on demand. Component technologies include both the traditional (e.g., Internet), and advanced automation and sharing software.
- Internet of Things – As with the other areas, IoT provides M2M (machine to machine) services using a combination of component technologies. These include cloud computing, mobility, and big data as well as specific applications. Some important IoT innovations will be the sheer scale of deployment (billions of devices), the distribution of processing, and the integration of various systems.
As can be seen from the examples, not everything can be considered a unique technology unto itself.
Do you agree or disagree?