Click here to listen to interview with Andy Canham, president of Sun Canada
File type: mp3. Length: 12.22 minutes. Size: 4.95 MB
Hi and welcome to another edition of Voices. I’m Joaquim Menezes, Web editor of IT World Canada. Today, we have a very special guest: Andy Canham, president of Sun Canada. Andy, who was appointed to head Sun Canada’s operations last November, brings more than 20 years of experience in the global computer industry to his job. Currently, Andy directs all sales and managerial functions within Sun’s Canadian operations, as well as growth strategies for the company’s hardware, software and services portfolio.
Welcome to the program, Andy.
Hi Joe, nice to meet you.
Andy, can you tell me what aspect of your job you like best, and what you see as your greatest challenge?
Love to. I think one of the things I love best is working with the team in Canada; to be able to help customers address some of the new challenges in the world of IT is probably one of the most exciting parts of this. I think our greatest challenge right now is re-engaging with our customers over our ability to be relevant, to help them solve problems, making sure that we can help customers understand what the new capabilities of Sun really are – and they do range from a broad new capability in the server line, a new value proposition in the world of information lifecycle management, as well as many of the new capabilities that we have throughout particularly our software portfolio.
Coming to the first part of what you mentioned: your initiatives in the server market. In the overall server market in Canada, I believe, Sun Canada has a 20 per cent market share by revenue (these are numbers IDC has put out). Do you see this percentage going up in 2006 relative to your competitors such as HP and IBM?
Yes we do, and I’ll tell you why. As you’re probably aware, we’ve had a major refresh of the entire Unix-based and SPARC-based technologies that we offer to customers. And we think in the space of Unix particularly, that’s going to be a position of strength, and open some new opportunities for us, especially with the multi-thread capabilities that we now have to offer very high-end performance for customers. In addition to that, our new x64 offerings…we really see huge growth opportunities in that segment of the market, which is a very large segment in the Canadian marketplace.
Would I be right in assuming that a big part of your sales focus this year will be on the new Sun Fire T1000 and T2000 systems, based on the UltraSPARC T1 processor? (By the way, are the T1000 systems already shipping in Canada?)
Just started shipping, yes. And it’s a little difficult to know how we’re going to fare in this. We do have a backlog of some orders around the world, which is causing a little bit of a supply constraint in Canada, but nothing too…I hope it’s not going to impact us too much, as we look at Q3. And to answer your question, yes – a lot of the value proposition that we can offer customers is around this new technology. For some of the new very compute-intensive platforms that customers are looking at, we think this is a very good fit for our new technology that we can now proceed with.
And that’s where I’m a bit confused. Because prior to the launch, when the processor was code-named Niagara, the impression was it would mainly enable lower-end Web-oriented tasks such as delivering Web pages. But at the actual launch [in New York], a lot was made of the “world records” snagged by the T1000 and T2000 systems when they were in test mode for things like Java performance, running Lotus R6, SAP software and more intensive tasks. Do you expect a big take-up for the T1-based servers in these areas?
It’s interesting because I read some of that information from the launch too. And, I think we don’t have the technology pre-segmented to certain markets and not other markets. We’re discovering – as we test this technology in many environments – where it can really perform well. In terms of serving up Web pages it may be an excellent engine for doing that. At the same time though, for applications that are able to take advantage of threading, I think we’re still discovering it. It’s exciting; it’s a very new technology approach. It’s not just another iteration of the SPARC technology over the last few years. And I think, as a result we’re still learning ourselves.
Coming to this whole thrust towards hyperthreading and multicore architectures, some vendors have expressed misgivings that traditional models of software licenses might restrict sales of their multicore offerings. For example, an Intel manager talked about that just a couple of days ago. Because Intel is apparently planning for future multicore versions of Pentiums, Xeons and Itaniums that will have up to 32 cores each. The problem, as they express it, is that as more logic cores are squeezed into each chip, the software licenses will increase in cost accordingly – but maybe out of all proportion to the cost of the hardware. Is this something that Sun is concerned about as well?
I think it is, certainly. And we are working with – for example – Oracle. Recently we have been working with [Oracle] to adjust, to the degree we can, multi-core software pricing, and to help work through it with them. So I think that the software industry, over time, will need to respond to customer needs and customer interest in these technologies so customers can deploy them effectively. And if we can still deliver (on that overall platform) price-performance advantages – even with that licensing challenge – I think this is going to be attractive for customers to look at.
It’s interesting you mentioned your talks with Oracle – because Oracle was cited both by an Intel spokesperson and an HP spokesperson as probably being the most misaligned in this area. The argument was that the processor is not a fixed entity any more, and as the world changes, companies like Oracle should change with it. But Oracle issued a statement recently that they don’t have a position with respect to…say, dual-core processors; their stand is that a core is equal to a CPU and all cores are required to be licensed. You announced a collaboration with Oracle recently, but is this something you’re really struggling with?
Well, we announced, in fact, a promotional offering with Oracle…and the talks about how Oracle will count each core as .25 CPUs will go a long way to addressing the problem in the short term. I do think, for Oracle, that this is a very different model for customers to be looking at. And Oracle needs to ensure that whatever decisions they make on a broad basis don’t disadvantage some customers. As you know, Joaquim, we’re moving more towards an open source software model in our own software business, and driving to really have customers look at support as the key payment mechanism if they are going to use our software. And that will allow us also to have