Databases make move to main memory

In the rapidly growing Internet-centred application world, hard drive-based relational databases aren’t always fast enough, according to proponents of a new approach called main memory databases (MMDBs).

MMDBs, also referred to as in-memory databases, offer functions similar to those of relational or multidimensional databases, but with significantly higher performance because MMDBs live in a computer's RAM.

“If it’s in memory, your search and retrieve operations are conducted in memory,” explained Sanjeev Varma, research director at Gartner Group Inc. in San Jose, Calif. “[With disk-based databases], the data’s on disk at different locations on the disk, and so the I/O operations take more time to execute than in-memory operations.”
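For readers who want a concrete picture of what "in memory" means in practice, the sketch below uses SQLite's in-memory mode purely as a stand-in for the general idea; it is an illustration, not one of the commercial MMDB products discussed in this article.

```python
# Illustrative sketch only: SQLite's ":memory:" mode stands in for the general
# idea of a database held entirely in RAM. It is not TimesTen's or Microsoft's
# product; real MMDBs add durability and concurrency features on top.
import sqlite3

# ":memory:" keeps the entire database in RAM rather than in a file on disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 250.0)")

# Queries never wait on disk seeks, which is where the speed advantage comes from.
row = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()
print(row)  # (250.0,)
```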

Varma said the technology has been around for a while, but now that Microsoft is getting involved with MMDBs, it is going to get additional attention.

What Microsoft is calling the In-Memory Database will be shipped as part of Windows 2000. “It gives you fully transactional database systems designed to provide extremely fast access to data without having to go to a hard drive,” said Erik Moll, marketing manager for Windows NT Server at Microsoft Canada Co. in Mississauga, Ont.

Moll said the technology will be especially useful for e-commerce applications.

“Anything that’s querying against an existing database to validate data or have data go against a particular account, and you want a very quick response — that’s where it fits in,” he said.

TimesTen Performance Software, a Mountain View, Calif.-based Hewlett-Packard Co. spin-off company, was the first to ship a commercial MMDB back in February 1998, according to Tim Shetler, vice-president of marketing for TimesTen. He said that although the approach is fairly new in commercial terms, the research that went into MMDBs occurred back in the mid-’80s.

“Places like IBM, AT&T, and Bell Labs all did a lot of research,” Shetler said. “In fact, telecommunications companies and big industrial companies that had the wherewithal actually built, for internal use, some earlier versions of MMDBs that were used in telephone switching equipment and military applications and things like that.”

The technology took so long to reach mainstream commercial markets, he said, simply because of economics.

“There’s been a relentless decline in the price of memory,” he said. “You can now buy a lot of memory and hold a pretty big database and not spend a lot of money on that memory.”

He said that buying a gigabyte of memory 20 years ago, even if it had been possible to put a gigabyte in a computer back then, would have cost about $5 million. Today, a gigabyte of physical memory costs less than $3,000.

The other factor that’s helping to bring MMDBs into the commercial world is the advent of 64-bit computing, he said.

“Thirty-two-bit computers, which are still the most prevalent, will limit the amount of memory that an application can address to something like two to three gigabytes — depending on which system you’re talking about — which means that’s the biggest database you could put in physical memory,” Shetler said. “However, the new 64-bit systems — which you can get from Sun, HP, Compaq, Silicon Graphics and IBM — will allow in-memory databases that are tens of gigabytes in size.”
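As a rough back-of-the-envelope check (the arithmetic below is ours, not the article's), the limits Shetler describes fall out of pointer width: a 32-bit address can name only about four billion bytes, part of which the operating system reserves for itself, while a 64-bit address space is effectively unlimited for today's database sizes.

```python
# Address-space arithmetic behind the 32-bit vs. 64-bit point (illustrative,
# not taken from the article). A pointer of n bits can address at most 2**n bytes.
GIB = 2**30

print(2**32 / GIB)  # 4.0 GiB theoretical ceiling for a 32-bit process; the OS
                    # typically claims 1-2 GiB of that range, leaving roughly
                    # the 2-3 GB Shetler mentions for the application itself.
print(2**64 / GIB)  # ~1.7e10 GiB -> 64-bit addressing removes the ceiling,
                    # allowing in-memory databases tens of gigabytes in size.
```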

He said the main target group currently includes voice and data network suppliers.

Kanata, Ont.-based Bridgewater Systems is now testing TimesTen’s product. Bridgewater develops systems and software for ISPs and plans to incorporate the MMDB into its products.

“The reason for [using MMDBs] is that we are monitoring a large number of resources — we basically support up to five million subscribers in our particular product,” said Kirby Kostner, a product manager at Bridgewater.

“MMDBs allow us to actually track, in real-time, the number of simultaneous sessions that are taking place from a particular subscriber.”

He agreed that speed and performance are the major issues.

“Based on the number of concurrent operations that are occurring and authentications we have to perform, we need to be able to provide hundreds of authentications per second, and to be able to track which particular subscribers are available at that time.”
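To make that concrete, here is a minimal sketch of what per-subscriber session tracking against an in-memory table could look like; the schema, limits and logic are illustrative assumptions, not Bridgewater's actual design.

```python
# Hypothetical illustration of per-subscriber session tracking in an in-memory
# database; the table layout and logic are assumptions, not Bridgewater's code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (subscriber_id INTEGER, started_at TEXT)")

def authenticate(subscriber_id: int, max_sessions: int = 2) -> bool:
    """Allow a new session only if the subscriber is under the concurrency limit."""
    (active,) = conn.execute(
        "SELECT COUNT(*) FROM sessions WHERE subscriber_id = ?", (subscriber_id,)
    ).fetchone()
    if active >= max_sessions:
        return False
    conn.execute(
        "INSERT INTO sessions (subscriber_id, started_at) VALUES (?, datetime('now'))",
        (subscriber_id,),
    )
    return True

print(authenticate(42))  # True  - first session for subscriber 42
print(authenticate(42))  # True  - second simultaneous session
print(authenticate(42))  # False - over the assumed limit of 2 sessions
```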
