64-bit systems: like driving a Ferrari on city streets

The next generation of computing is coming to smaller businesses. But should they care?

Since the late 1980s, most desktop PCs and smaller servers have run on chips and software that process information in 32-bit chunks. Systems that crunch twice as much data – 64 bits – have been available too, but cost limited the technology mainly to top-of-the-line servers and workstations.

This is about to change.

A wave of affordable 64-bit technology – something called “industry standard 64-bit” – will wash over the IT landscape in the next couple of years, bringing with it the supercomputing strength needed to drive more comprehensive and complex data analysis, and deliver huge performance increases. This means smaller businesses will be able to afford to start using computers to do things formerly reserved for well-heeled corporations.

The first ripples of the wave can be seen in the 64-bit technology already found on some lower-cost machines that use AMD’s Opteron, or Intel’s Itanium and Xeon chips. Apple’s high-end PowerMacs are 64-bit, as is AMD’s Athlon 64 PC chip, and Intel recently said it will ship 64-bit desktop processors later this year.

But while it’s full of promise, the technology is nothing for the average IT department to start clamouring for – at least not yet.

The problem is you can’t do much with it right now, because there are few standard 64-bit programs. Sure, there’s backward compatibility, so businesses can run their existing 32-bit applications – all that Microsoft Windows-based software most of us currently use on our old-school 32-bit hardware – but those programs won’t take advantage of the power of a 64-bit system. In fact, an old 32-bit application running on a 64-bit platform it wasn’t designed for might actually run slower.

Until Microsoft delivers its promised 64-bit version of Windows XP, there probably won’t be much happening on the application-development front, either. So owning 64-bit gear in the short term will be akin to driving a Ferrari on city streets – the high-powered performance is pretty much wasted. And pundits suggest that kind of horsepower might be overkill in many cases anyway – does the average person really need supercomputing power for a word processor or spreadsheet? Probably not.

Still, the new processing architecture is coming because trend-setters like Intel and Microsoft are backing it, and it will arrive in the mainstream sooner rather than later. The 64-bit wave will likely strike the IT-department back rooms first, washing from there to the desktop. Alan Freedman, a researcher with IDC Canada, predicts that in the x86 Intel-style microprocessor market, 64-bit chips will dominate servers within the next couple of years.

In fact, attendees at the recent Future of 64-Bit Computing in the Enterprise conference in Toronto were told that 32-bit technology would be “obsolete” by the end of the decade. Guess those 64-bit applications had better start cropping up soon.

Still, while it’s overkill for some users, industry standard 64-bit will be a boon to others, especially when it comes to complex application processing and calculations. Sixty-four-bit chips deliver superior computing performance when working with massive amounts of data, storing and manipulating it in a vastly larger memory space (32-bit chips can address up to 4 GB of memory, whereas the theoretical top end for a 64-bit system is 16 billion gigabytes). This king-sized memory capacity eliminates much of the continual swapping of data to and from slow hard drives, a bottleneck typical of 32-bit computing. Far more data can be held in memory at once, making searches and retrieval a lot quicker – important when you’re working with, say, an extremely large database or other massive files.
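For the curious, the arithmetic behind those figures is easy to check – an n-bit address can name 2^n distinct bytes of memory. Here’s a back-of-the-envelope sketch in Python (purely an illustration, not software tied to any of these systems):

```python
# Back-of-the-envelope check: how much memory an n-bit chip can address.
# An n-bit address can name 2**n distinct bytes.
for bits in (32, 64):
    total_bytes = 2 ** bits
    gigabytes = total_bytes // 2 ** 30   # bytes -> gigabytes (GiB)
    print(f"{bits}-bit: {total_bytes:,} bytes, about {gigabytes:,} GB")

# Output:
# 32-bit: 4,294,967,296 bytes, about 4 GB
# 64-bit: 18,446,744,073,709,551,616 bytes, about 17,179,869,184 GB
# (roughly the "16 billion gigabytes" cited above)
```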

That’s good news, if you believe that smaller businesses will become increasingly dependent on complex data analysis for their standard business operations. And the processing power of 64-bit may allow businesses to replace groups of 32-bit networked servers supporting multiple users and their computing tasks with fewer 64-bit machines, reducing hardware and maintenance costs.

But making these next-generation systems useful to businesses still depends on the availability of 64-bit programs. The software that’s initially expected to take advantage of industry standard 64-bit platforms will likely be aimed at graphics-intensive processing, multimedia delivery and rendering, and security applications involving encryption/decryption, among other things. And when it comes to the software running the core operations of a business, the advent of 64-bit will certainly provide the necessary computing muscle for corporate-type solutions like customer relationship management (CRM) and enterprise resource planning (ERP) – applications that are already appearing in forms designed for smaller businesses.

In fact, Microsoft will likely lead the charge in this area, since besides developing operating systems, the company’s Business Solutions organization is focused squarely on CRM and other enterprise applications for small business.

And once 64-bit computing becomes the mainstream standard, where do things go from there? It likely won’t be 128-bit technology – at least not in our lifetime. Harlan McGhan, a senior staff engineer with Sun Microsystems, told the Toronto conference attendees that needing more RAM than a 64-bit system can address – effectively four billion bytes multiplied by another four billion – will be a problem for our grandchildren. Instead, microprocessor technology will evolve toward a model called multi-core, in which a single chip is built with conceivably dozens of separate processor cores so it can handle many tasks simultaneously. Both AMD and Intel are expected to ship dual-core chips later this year, but that’s an issue for another column.
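To picture what multi-core buys you, here’s a toy Python sketch of the idea – one big job carved into pieces that run on separate cores at the same time. The four-core count and the number-crunching workload are made up for illustration:

```python
# Toy illustration of the multi-core idea: split one big job into
# pieces and run them simultaneously, one worker per processor core.
from multiprocessing import Pool

def crunch(chunk):
    # Stand-in for real work, e.g. scanning part of a large database.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    n = 10_000_000
    # Four interleaved slices of the data, one per (hypothetical) core.
    chunks = [range(i, n, 4) for i in range(4)]
    with Pool(processes=4) as pool:
        partial_sums = pool.map(crunch, chunks)  # pieces run in parallel
    print(sum(partial_sums))  # same answer, less wall-clock time
```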

So, like it or not, a major upgrade to microprocessing for the masses is coming. It’s going to be driven by hardware makers, who will be joined by the software developers. And for the time being, from the perspective of businesses that will be pushed to upgrade their systems, it will be a wasted effort – at least until a decent selection of 64-bit applications comes along.

— This article appeared in The Globe and Mail on March 10, 2005.
