Smart thinking on Business Intelligence

Grappling with issues around Business Intelligence? CIO Canada’s spring roundtable brought together six Canadian IT executives to discuss their BI pain points and implementation strategies, and offer their insight on such challenges as effective data gathering, gaining executive buy-in, and securing long-term funding. Each panellist also offered his or her best advice for implementing BI successfully. This article presents the highlights of that discussion.

Roundtable Participants

Tom Atkins (moderator), President, the Tramore Group
Janet Babcock, V.P. and CIO, The Dominion of Canada General Insurance Company
Mario Kovacevic, V.P. Information Technology and E-Business, J.J. Barnicke Ltd.
Sarah Kramer, Provincial V.P. and CIO, Cancer Care Ontario
Chris Moore, CIO, City of Brampton
Peter Pereira, CIO & Head of CIBC Mortgages, Lending & Insurance Technology
Dr. George Semeczko, Chief Technology Officer, Information Services & Change Management, Royal & SunAlliance

Atkins: How do you create the discipline at source to collect, manage and ensure the completeness and accuracy of the information required to make BI effective?

Semeczko: You have to have an ample supply of very good business analysts because the BA is the glue that holds these systems together, from the development of your transactional system through to your back-end BI data sources. At Royal & SunAlliance, we rely heavily on our BAs having an extraordinarily good understanding of the business, and what the information in the business means from the front end and how that feeds into the back end.

Babcock: One of the things that helped us improve our discipline in this area was demonstrating the pain of poor data quality – showing members of the executive team the transformation costs when we went to build our first warehouse, showing them the amount of time that went into cleansing the data. The other important factor was ease of use. In the insurance industry the entry point of most of the information that you are interested in comes in through the operational systems. So you want to design and build applications that are easy to use and make it difficult to corrupt the data.

Kramer: This is a huge challenge for those of us in health informatics because we don’t have a history or tradition of point-of-service, point-of-sale information. We have reporting requirements from people out in the field, and the information we generally can give them has historically been retrospective and ad hoc. We’re constantly trying to improve this set of data. What we do at Cancer Care Ontario is specify data sets, and we have something called a data book that we give to all of our data providers – there are hundreds of organizations and thousands of individuals who supply this data. The issue is very difficult and we have to push people hard sometimes. There are many people in healthcare who want the information but who balk when asked to collect the data that supports it. We are changing things a little with some new systems, like the Ontario Wait Time Information System, which collects data as a byproduct of business processes, but that’s an anomaly.

Atkins: How do you determine what your core data set is and the relationship between that core data set and your business processes?

Moore: The City of Brampton needs data around the planning and management of land, the management of infrastructure, around people and financials, and we also need citizen-centred data. So for us it’s about understanding what the big core data sets are and how everything maps together. A lot of our work involves consolidating data that already exists. Right now we’re going through a large enterprise architecture and business systems mapping process to get to that end-state map, but in the meantime people still need things now. So we’ve got one team that’s going out and discovering the new world, confirming that the earth is not flat, and another team that’s just churning away, getting things done. It’s my responsibility to make sure that all of those planets stay in alignment and nothing collides, so that at some time in the near future, we won’t be looking back and saying “Why didn’t we do that?” So you have to know what those large planets of data are and how they need to interact, and that can only come through understanding what the business needs are.

Kovacevic: J.J. Barnicke Limited is a full-service commercial real estate brokerage company. When thinking of core data within the context of our business, what we look for is data that drives our core business practice. During this discovery, we collaborate extensively beyond our IT practice group, including our business stakeholders in defining core data sets. Many of our business processes are built around core data, and we find that our most important business processes are dependent on core data. Over time and geography these core data sets may change, but a continued collaborative approach to their definition allows us to position business processes appropriately.

Pereira: Traditionally everyone in our industry looked at everything from an account perspective. The change we are seeing today, which is rapidly gaining acceptance, is to look at everything from a customer perspective. So most of my challenge is to bring about that cross-over. And the challenges I face are not just around the business process; they are also around the application design itself. Many of our legacy applications were designed in a product-centric fashion rather than a customer-centric one. But as we start moving to CIFs and other customer-centric views, a lot of my discussions are on the mapping from a product view to a customer view, and then on the data ownership, the data process and the combination of various segments within a product view and account view into a customer view. It’s a great process that we’re going through right now.

Atkins: Many organizations have improved the accuracy and completeness of their electronic information sets over time. This makes for a challenge in presenting trend over time, when the data in each time period is at different levels of accuracy and completeness. How has your organization dealt with that problem?

Babcock: Historically, we have taken all of our data and translated it down to the lowest common denominator – for argument’s sake, we’ve taken a thousand data elements and translated them down to five hundred. Certainly our actuaries need multiple years of data. We have now implemented a very rich solution set in terms of quality, completeness and the amount of information that’s available to the business. However, to assess trends, the actuaries need current and historical data in the same format, and I can’t take five hundred data elements and turn them into a thousand. So as we move forward, that is one of our biggest stress points. We are investing heavily in new operational systems with a key focus on richness of information, and yet we need to present five or ten years of data all in the same format. We are still looking for a way to solve the problem.

Semeczko: There are lots of theoretical approaches to this issue but the reality is there’s no silver bullet. It does highlight the need for a very good data architecture. It’s a wonderful example to present to your business colleagues to help them understand why you need to invest in that data architecture. It’s that old question of weighing the value of the aged data, the volume of that data, and how much it costs to transform it into your new data architecture. But the reality is that the business evolves and your products evolve, and there is no quick easy fix.

Atkins: How do you sell the CEO or key members of the senior team on things such as IT architecture that will provide long-term operational advantage?

Kramer: I think I’ve been remarkably successful in selling a long-term vision of an architecture that will support the opportunities of the future. But I have to do that every year, as I think everyone at the table has to. And now, three years in, it’s a little tougher to continue to draw so much from our capital pool every year. One thing that I am constantly pushing my own team to do is to be both strategic and structured in terms of approach, and opportunistic in terms of delivering some net new piece of service to the client group at all times – not simply translating the users’ tools from one set to a new, better set. The other dynamic at my table is that there is a certain amount of tolerance at the C-level for one-time dollars every year to do net new things. But there are also ongoing support costs for these things, and that tends to get forgotten the next year. So you have to constantly do a selling job on these rising costs, because managing these things over time can gobble up a fixed IT operating budget in the public sector. That’s a big challenge for me.

Kovacevic: We are fortunate enough to have a CEO and senior team that clearly understand the impact of macroeconomic forces as they relate to our business. This understanding underlines our need for both long-term operational and strategic planning and for short-term tactical provisioning. When presenting concepts such as IT architecture, I must clearly demonstrate them in the context of this time continuum. Identifying key short-term milestones with the requisite agility makes it easy for me to present concepts such as IT architecture in a manner that is consistent with how our business operates. A timeline that highlights effective short-term tactical deliverables together with long-term operational goals matches the way our senior team understands the business. Presenting IT architecture only in long-range terms is not something I would do.

Atkins: One of the roles of the CIO is sharing a vision of the different ways IT can help the business. Have you assumed that role in your dealings with your executive groups?

Moore: I spend a lot of time talking about, selling and promoting the art of the possible, and the challenge is how to do that with the executive team, individually and together, in a tactful manner. My challenge is that I see the pain and the frustration of the thousands of people in the line staff who are just trying to do their job, and they’re trying to do it sometimes without the right tools and without the information. And it’s very frustrating to see that and not be able to provide relief to everyone, all at the same time.

Atkins: Many enterprises are migrating from years-old legacy systems to systems based on new technologies. How important is it to include the enterprise’s business intelligence needs in the development of these new systems?

Pereira: In every program we run, BI is a co-component of the project team right from the inception. Be it a legacy-system replacement or an enhancement, BI is at the table. A large part of that is because of the high reliance we have on existing BI, and a need to ensure that as we modify our system, we carry it through. That’s why BI is not taken as a separate initiative by itself but as something that is operationalized. That way we will not only get the funding, but we do not have to write a separate unique business case for BI.

Atkins: Have you considered outsourcing or co-sourcing business intelligence capabilities?

Kovacevic: We have been very successful in co-sourcing in the area of BI. One key reason for success has been the correct alignment with the right third-party organizations that understand our vertical marketplace, our core data and our principal business processes. Another reason we had success with this type of environment is that we purposely kept our staff involved through all facets of third-party engagement. Outsourcing is not out of the question, but at present we have seen a strong dependency on leveraging the analytical strength of our own professionals together with the right amount of third-party involvement to yield success in the area of BI.

Semeczko: We’ve just gone through some outsourcing for our infrastructure and our data communications and networking. We often consider the whole spectrum of outsourcing, in-sourcing and co-sourcing. So as we look at how to get more efficient and raise the bar on our business intelligence, it’s certainly one of the things in the mix that could enable us to do that, while getting to market quickly and retaining knowledge within the organization.

Atkins: What are the key cost components of implementing and running business intelligence?

Kramer: For us, the biggest cost is in business analysts and in specification – trying to get business users to truly specify what their business intelligence questions are. I’ve got users across the province in government and healthcare institutions who are not used to defining what information they need to make what kind of decisions at what point – and then specifying that. So for us, it’s bringing that translation effort and consulting effort to the table. And there’s a huge cost to that. There’s a benefit realized by the cost, but it’s the toughest one for us to resource and to get support for at the executive table, because there isn’t an understanding of the importance of that work.

Moore: The one cost I think about a lot is the triage cost. We’ve got four to five large data sets we are developing, moving them to a point where they will be very mature and robust. In the meantime the business goes on. So for me it’s about what I am going to spend in the meantime in triage to get the results the business needs. If we don’t produce results, then the IT function has limited value.

Pereira: Things like the cost of the technology are manageable for us. One of our biggest drivers is the cost of change. In many of these business intelligence programs the lead time is pretty long. And as the market moves forward, and standards and directions are set, you find two things happening. One is change from the external world, be it regulatory compliance or mandatory requirements; and the other is internal, such as a change in personnel. So with these factors in play, I definitely look at anything that has greater than a one-year horizon with a jaundiced view. Because in one year even the players have changed, and so the needs have changed.

Atkins: Have you had any difficulty getting executive buy-in to implement new BI systems, and if yes, how did you deal with it?

Babcock: I would say there’s very strong executive support across the organization. What we are struggling through now is that we have a major multi-year initiative to replace our legacy applications, and what I’m not getting buy-in on is strong investment in information architecture, and I think that’s because the benefit is too far out. They are seeing today that they can answer the questions they want to answer, and so it’s very difficult to convince them that we need to spend potentially millions more – over and above the huge investment in legacy replacement – to obtain a really robust information architecture. I have a real challenge in articulating the long-term benefit of that investment, when that benefit is eight or ten years out, not two or three.

Atkins: What was your biggest challenge in implementing business intelligence systems within your organization? And how did you overcome that challenge?

Pereira: When I joined this group the biggest challenge I faced was that I had too much BI. There was BI in each portfolio and none of it was standard or consistent. What I did first of all was consolidate everything so that I could bring about a level of consistency. After I did that, I actually disowned it – for the very fact that I did not need BI as a separate group. BI was part of my application development lifecycle and end-user support. For the last two years that’s exactly what I’ve done and I’ve had no further issues.

Kramer: The biggest challenge was to define and manage expectations from my colleagues and people in the organization around what a true business intelligence infrastructure would bring. And the way that I continue to manage that is to talk about, and then demonstrate, how bringing a structured approach to answering these questions can unleash much more value from the separate data stores we had previously in unstructured format. It’s a constant sell job around how we can bring added business value by providing a structured linkable approach to answering those business intelligence questions. But as we are now able to demonstrate that value, it becomes easier and easier over time.

Atkins: What advice would you offer to other IT executives on implementing BI systems successfully?

Babcock: Have a very realistic view of where you are starting from with respect to your data. For us, data quality and data cleansing became very costly and certainly elongated the first few initiatives. So do not go in without a very strong appreciation of where you are starting from in terms of the data you are working with. Our experience was that the business had a difficult time articulating what they wanted and how they were going to use the capability once they got it, and I think too much time was spent stressing about that up front. In our first few cases we probably should have moved faster to deliver something with the expectation that, yes, we are not going to get it right, but we will re-work it. That was a learning curve for us. Now that we have reached a different level of maturity, we feel differently about it. We feel we should push harder to get it right the first time, but now we are working with a business group that has a better appreciation of what the capabilities are.

Semeczko: First, invest in a good data architect. Second, ensure that your BI architecture is integrated to the rest of your architecture and not just a bolt-on to the rest of your systems. And the third thing is to understand the breadth and cost of your existing BI systems, including the hidden systems in end-user computing, and understand how they integrate with your solution.

Kovacevic: Make certain that you have executive alignment and real BI demand from the business. I believe strongly that you must demonstrate short-term deliverables along the path to achieving long-term strategic goals. Short-term tactical goal setting and delivery needs to be part of your long-term strategy. You must be able to deliver BI components in a reasonable time frame that the business can acknowledge, validate and absorb within its operation. And make certain that BI is a principal part of your ongoing operating and capital cost budgeting.

Kramer: Like any work that a good CIO does, make sure that BI is desired and supported by your executive colleagues. Your job is both to understand that this is really what they’re saying they need, even if it isn’t how they’re saying it, and to sell them on what can be done should they invest, because there’s a sense of not knowing precisely what the value is. Also make sure it’s clear that there is both a long game and a short game, not only in terms of investment but also in terms of management. So it won’t go away in year three, like other projects might; this is an ongoing effort. And thirdly, you have to constantly regroup and look inward to see that you are actually getting the value you thought you would when you started. This type of project needs continuous audit and review of every investment and every output to make sure that you’re taking the most advantage of what you have. It’s a constant rebalancing act.

Pereira: One of the keys for me is ensuring that there is complete executive alignment and sponsorship for the program. Another thing I’ve always done is phasing – what I call a conference room pilot – so I’ll phase it in so there is an incremental view of exactly what they want. The third one, which I’ve always kept close to my heart, is that I always build in at least 25 percent rework, because no matter how much time I spend up front on the definitions and requirements, users are seeing and working with the data for the first time, and there’s always rework. I build that right into the business case.

Moore: Keep it simple, deliver results, and continually communicate the value.


David Carey is a veteran journalist specializing in information technology and IT management. Based in Toronto, he is editor of CIO Canada. Moderator Tom Atkins, CMC, PMP, is President of the Tramore Group, a Toronto-based professional services firm specializing in large-scale program and project management. CIO Canada wishes to thank Oracle Canada for sponsoring the Spring Roundtable.
