A Case of Smoke & Mirrors

Not long ago the software industry announced new initiatives to sell in the business intelligence market, forecasting sales in excess of US$100 billion over the next few years. At first glance this development looks exciting: software that can deliver intelligence. On closer inspection, however, the sales projection doesn’t quite add up. For one, this large number is more a measure of existing sales in the data mining and data warehousing arena; the sales are not necessarily new, just shifted to a newly defined category. That alone would not be too problematic or disturbing. The real misfire occurs when you examine how the business intelligence cycle actually works.

What the software industry fails to grasp is that business intelligence is traditionally defined as information that has been analyzed to the point where a person can make a decision. Software generally does not analyze. When it does perform some rudimentary analysis, it is typically based on internal data, and the analysis is purely quantitative. Business reality, however, dictates that much analysis be based on the qualitative – an argument, a phrase, a visual assessment – not just on numbers and stats.

The business intelligence cycle has four steps. If you fail to complete any single step, the cycle breaks down and you do not receive the critical intelligence you need. Step 1 demands that managers clearly express the problem or business issue and its related decisions. Step 2 requires you to gather the raw data from published sources, Internet discussion groups and face-to-face interviews, among other channels. At this step, you can already see where electronic or Internet-based data may not supply all the answers. Step 3 asks that you digest the entire pile of data you have gathered, make sense of it and make it actionable. This step is perhaps the most valuable in the cycle. By analyzing the mounds of data and considering the implications, you take the world as every other rival sees it and give it a new spin for your management. This remains a very human job. Step 4a, report and inform, may take the form of a bound set of graphs and charts or nothing more than a sit-down discussion; during this phase, you influence managers and their decisions. Step 4b, evaluate and refine, is critical: you measure the effectiveness of the intelligence and feed what you learn back into the cycle. Measuring effectiveness can be automated, but only in a very limited way.
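
To make the all-or-nothing nature of the cycle concrete, here is a minimal sketch in Python. It is a hypothetical illustration rather than anything a vendor ships; the step names and the completeness check simply encode the rule that a single skipped step leaves you without usable intelligence.

```python
# Hypothetical illustration of the intelligence cycle described above.
# Nothing here corresponds to a real product; it only encodes the rule that
# skipping any single step breaks the cycle.

CYCLE_STEPS = [
    "define the problem",      # Step 1: state the business issue and its decisions
    "collect and organize",    # Step 2: gather raw data from people and publications
    "analyze",                 # Step 3: turn the pile of data into actionable insight
    "report and inform",       # Step 4a: deliver and argue the findings
    "evaluate and refine",     # Step 4b: measure effectiveness and adjust
]

def intelligence_delivered(completed_steps: set[str]) -> bool:
    """The cycle yields usable intelligence only if every step was completed."""
    missing = [step for step in CYCLE_STEPS if step not in completed_steps]
    if missing:
        print("Cycle breaks down; missing:", ", ".join(missing))
        return False
    return True

# A project that automates collection and reporting but skips analysis still fails.
intelligence_delivered({"define the problem", "collect and organize", "report and inform"})
```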

We recently examined nearly 60 software packages that claimed to offer business intelligence solutions and discovered that most significantly exaggerated their true capabilities. For instance, 66 per cent claimed to fulfill Step 2, but we found only 49 per cent did a passable job in this area. The greatest discrepancy fell under Step 3. While 59 per cent of vendors claimed their packages analyzed the information, only 10 per cent of the packages truly did any sort of analysis based on the intelligence model. Even the ‘report and inform’ requirement found the vendors exaggerating: 80 per cent stated that their products provided a usable reporting function, but we felt that only 53 per cent did so. Most packages failed to address the issues in Steps 1 and 4b.

The chasm between true business intelligence and software reality is deep and wide. While the software industry has made great progress in the collection and reporting phases, it fails in other areas of the cycle. The truth is, software can help deliver intelligence, but not alone; human interaction is still required, and often so is a combination of tools.

A Real-life Example

A couple of years ago, we had a financial services client that saw a portion of its customer base migrate to the discount brokerage business and the Internet. To chart its own course, the client wanted to explore the following issues: What key investments were its competitors making in e-commerce? What kinds of companies were they partnering with? What underlying technology support could they deliver to customers? And finally, what was their thinking and strategy related to security, thin client vs. fat client, and other trends in the industry?

The analysis surprised our client. It turned out that competitors were taking quite varied approaches in areas previously thought to offer little flexibility or few options. In addition to learning of new paths it might consider pursuing, the client gained the much-needed sense of urgency required to secure funding for future R&D and marketing.

Let’s roll back the clock and illustrate how we addressed this client’s questions by taking this business issue fully through the intelligence cycle – with and without the aid of business intelligence software.

STEP 1: DEFINE THE PROBLEM

“The client wanted to explore several issues: what key investments its competitors were making in e-commerce; what kinds of companies they were partnering with….”

Human contribution: These questions arose from countless discussions within the company, and were the result of customer interactions, sales-force complaints and so on.

Technology contribution: A software package can scan queries and learn from them, refining with each pass its ability to examine an existing pool of data on “partnering” or “security.” Ultimately, it could search the data pool and accept only data matching its refined search string. What most packages today profoundly cannot do, however, is anticipate the questions that management has not yet asked, or the questions asked outside the network, in impromptu meetings, for example.
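
To show what “refining a search string” can amount to in practice, here is a minimal sketch. The QueryRefiner class, its promotion threshold and the sample queries are our own inventions for illustration, not a description of any shipping package.

```python
from collections import Counter

# Hypothetical sketch of query refinement: terms a manager searches repeatedly
# are promoted into a standing filter, and the data pool is narrowed to
# documents that match. Illustrative only.

class QueryRefiner:
    def __init__(self, promote_after: int = 2):
        self.term_counts = Counter()
        self.promote_after = promote_after  # repeats needed before a term sticks

    def observe_query(self, query: str) -> None:
        """Record the terms of each query the manager runs."""
        self.term_counts.update(query.lower().split())

    def refined_terms(self) -> set[str]:
        """Terms asked about often enough become part of the standing filter."""
        return {t for t, n in self.term_counts.items() if n >= self.promote_after}

    def filter_pool(self, documents: list[str]) -> list[str]:
        """Accept only documents matching the refined search terms."""
        terms = self.refined_terms()
        return [d for d in documents if any(t in d.lower() for t in terms)]

refiner = QueryRefiner()
for q in ["partnering in e-commerce", "security for thin clients", "partnering deals"]:
    refiner.observe_query(q)

pool = ["Rival announces partnering agreement", "Quarterly earnings summary"]
print(refiner.filter_pool(pool))  # only the partnering item survives
```

No amount of this kind of refinement, of course, anticipates the question that was never typed in.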

STEP 2: COLLECT AND ORGANIZE

Human contribution: Articles, references on the Internet and referrals from other sources all lead to experts who in turn lead to other experts. Only a small percentage of business data appears in published or electronic form, so most data must be collected from people by other people. Among the most critical expert knowledge to capture is that of the internal experts – salespeople, procurement officers, customer service representatives, scientists, loan officers and so on. Ironically, those running intelligence programs often find it easier to collect information from experts outside the company than from those within.

Technology contribution: Certain software packages do an excellent job of scanning HTML pages and extracting news articles. Some even use the latest presentation applications and graph their findings based on the number of hits on a particular topic, such as e-commerce and brokerage. Certain organizations also use groupware or intranet-based technologies to organize and categorize the internal expertise of a company. We have not seen a technology that will identify and catalogue an expert based on tracking an individual’s keystrokes, but a great deal of the available software does an excellent job of cataloguing such expertise – once it is located and keyed in by another manager. These packages do a poor job of relating one source to another or of providing additional leads to an analyst who needs to locate another expert. Most packages also fail to organize the data they do find, simply generating long lists ranked by relevancy.
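
A rough sketch of the hit-counting approach, using made-up article text rather than live pages, shows how little machinery the “scan and rank by relevancy” step really involves. The topics, file names and snippets below are fabricated for illustration.

```python
import re

# Illustrative sketch only: strip tags from already-downloaded HTML pages and
# rank the articles by how many "hits" each registers on the tracked topics.

TOPICS = ["e-commerce", "brokerage"]

def strip_tags(html: str) -> str:
    return re.sub(r"<[^>]+>", " ", html).lower()

def topic_hits(html: str) -> dict[str, int]:
    text = strip_tags(html)
    return {topic: text.count(topic) for topic in TOPICS}

articles = {
    "bank_launches_site.html": "<h1>Bank launches e-commerce site</h1>"
                               "<p>The e-commerce push targets brokerage clients.</p>",
    "quarterly_results.html": "<h1>Quarterly results</h1><p>Earnings rose four per cent.</p>",
}

# The familiar output: a long list ranked by relevancy.
ranked = sorted(articles, key=lambda name: sum(topic_hits(articles[name]).values()), reverse=True)
for name in ranked:
    print(name, topic_hits(articles[name]))
```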

STEP 3: ANALYZE

“The analysis surprised the client in that competitors were taking quite varied approaches in areas previously thought to offer little flexibility or few options.”

Human contribution: If you could measure the distance it takes to reach an intelligence goal, the analysis step would take you most of the way. It is also a very human task. Analysis involves discerning innuendo and even subtle shifts in voice or body language. In this project, for example, each interview transcript offered nuances that answered specific questions unlikely to be found in a printed or electronic source.

Technology contribution: Data-mining systems can draw numerical inferences and comparisons in areas such as customer buying patterns or inventory management. Yet the quantitative arena is a fairly narrow one, offering limited benefits to managers seeking competitive insights. In the qualitative arena, which is what traditional business or competitive intelligence is all about, nearly all of these packages fall short. We have seen little evidence of software that provides the kind of spatial analysis, timeline analysis and relationship analysis that trained professionals perform. Analysis of soft data means seeing just around the corner: appreciating, for example, why a rival expanded its booth space at the trade show this year, or chose to display only certain services on its Web site. The current technology offerings have a hard time looking around the corner – especially when that corner is on a different block.
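
The quantitative comparisons the packages do deliver can be reduced to something like the sketch below, built on invented order data. Nothing in it can explain why a segment is drifting to the discount channel, which is the qualitative question that matters.

```python
from collections import defaultdict
from statistics import mean

# Invented sample data: (customer segment, order value). A data-mining tool can
# surface this numerical comparison; it cannot say why a segment is shifting.
orders = [
    ("full-service", 1200.0), ("full-service", 950.0),
    ("discount-online", 300.0), ("discount-online", 275.0), ("discount-online", 340.0),
]

by_segment = defaultdict(list)
for segment, value in orders:
    by_segment[segment].append(value)

for segment, values in by_segment.items():
    print(f"{segment}: {len(values)} orders, average {mean(values):.2f}")
```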

STEP 4A: REPORT AND INFORM

Human contribution: This is where the rubber meets the road. Underlying ‘report and inform’ is the argument: you need to argue intelligence. If intelligence is truly vital, it will shake up management. Opinions that create friction need strong arguments to support them, and only people can argue well. In this case, the consultants spent hours explaining the findings, pulling out transcripts and pointing at charts. At the end of the day, the client made the changes – not based on paper but on people.

Technology contribution: The best of the packages have found ways to consolidate complex pieces of information. They have created graphs, charts and summary tables containing text. Data-mining tools produce neat little statistical comparisons that can reveal new customer buying patterns, for example. But these packages cannot argue a point, as we’ve mentioned. Because they can report only in printed form – not in a discussion or a finger-wagging argument – they limit the power of their message.
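
A consolidated summary table of the kind these packages print might be produced by a few lines such as the following. The findings shown are hypothetical, and the point is that the table reports but cannot argue.

```python
# Hypothetical findings, purely illustrative: a package can lay this out as a
# tidy printed table, but the table cannot defend its own conclusions.
findings = [
    ("Competitor A", "e-commerce investment", "high"),
    ("Competitor B", "thin-client strategy", "early pilot"),
    ("Competitor C", "security partnerships", "two announced"),
]

header = ("Competitor", "Topic", "Finding")
widths = [max(len(str(row[i])) for row in findings + [header]) for i in range(3)]

for row in [header] + findings:
    print("  ".join(str(cell).ljust(w) for cell, w in zip(row, widths)))
```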

STEP 4B: EVALUATE AND REFINE

Human contribution: At this stage, you judge the quality of the intelligence you produce and how managers use the findings.

Technology contribution: Nearly every system has the ability to track usage. On a very basic level, you can determine which Web page, for instance, your audience uses most, or what types of searches your management requests most often. Software itself cannot grow or change without human intervention. In the financial services example, if the market changed radically, an automated system could not make the radical shifts or craft out-of-the-box questions without prompting from its intelligence handlers.
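
Usage tracking at this stage is essentially counting, as the short sketch below illustrates with a fabricated log; what the counters cannot do is decide that the questions themselves should change.

```python
from collections import Counter

# Fabricated usage log for illustration: (user, page viewed, search type requested).
usage_log = [
    ("analyst1", "/competitors/pricing", "partnering"),
    ("manager2", "/competitors/pricing", "security"),
    ("manager2", "/news/e-commerce", "partnering"),
    ("analyst1", "/competitors/pricing", "partnering"),
]

page_views = Counter(page for _, page, _ in usage_log)
search_types = Counter(search for _, _, search in usage_log)

print("Most-viewed page:", page_views.most_common(1))
print("Most-requested search:", search_types.most_common(1))
# No counter can decide that the questions must change when the market shifts;
# that still takes the intelligence handlers.
```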

A warning to the software industry: Nothing kills a bad product faster than good advertising. Just saying it’s so, as the saying goes, does not make it so.

Just repackaging data-mining tools and placing them under the banner of business intelligence does not make them intelligence tools. The vendors do in fact offer some excellent products; they just need to be sold as what they truly are – raw tools that do not take the place of a human being, the true intelligence analyzer.

Leonard M. Fuld (lfuld@fuld.com) is president and founder of Fuld & Co., a Cambridge, Mass.-based intelligence research and consulting company. Ken Sawka (ksawka@fuld.com) is vice-president of the Intelligence Systems practice at Fuld & Co. and is an expert and speaker in the field of competitive and business intelligence.
