SAS preps in-memory analytic grids

SAS Institute Inc. is joining the newly hot area of in-memory processing, developing a series of high-performance analytics systems tuned for specialized tasks.

The first product will focus on risk management and become available late this year. Products for retailers and other verticals will follow. 

SAS is working with Hewlett-Packard on the hardware end, which will involve grids made of HP BladeSystem technology along with Insight Control management and automation software. The HP partnership is not exclusive, but it is the only one SAS is pursuing for now.

In-memory processing, which pushes data into RAM, delivers a performance boost over reading and writing data on disk. SAS’s new technology can cut processing jobs that would normally take hours down to minutes, according to David Wallace, global marketing manager for solutions and industries.
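The speed difference the article describes can be seen even in a toy aggregation. The sketch below is illustrative only and has nothing to do with SAS's actual technology: it times the same sum computed by re-reading a file from disk versus operating on data already held in RAM.

```python
import os
import tempfile
import time

# A modest illustrative dataset: one million floating-point readings.
values = [float(i % 1000) for i in range(1_000_000)]

# Disk-based pass: write the data out, then read and parse it back
# before aggregating -- every pass over the data pays the I/O cost again.
path = os.path.join(tempfile.gettempdir(), "readings.txt")
with open(path, "w") as f:
    f.writelines(f"{v}\n" for v in values)

start = time.perf_counter()
with open(path) as f:
    disk_total = sum(float(line) for line in f)
disk_secs = time.perf_counter() - start

# In-memory pass: the same aggregation over data already resident in RAM.
start = time.perf_counter()
ram_total = sum(values)
ram_secs = time.perf_counter() - start

print(f"disk pass: {disk_secs:.4f}s, in-memory pass: {ram_secs:.4f}s")
os.remove(path)
```

Both passes produce the same total; the in-memory pass skips the file I/O and parsing entirely, which is the effect that grows dramatic at the multi-hour job sizes Wallace describes.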

SAS decided against releasing the in-memory technology as a horizontal tool, Wallace said. “Instead, what we’re doing is focusing our efforts on solving complex problems in specific industries.”

The risk management product will help banks better determine how to allocate capital and find out how turmoil in the financial markets will affect their positions, Wallace said. 

One area of focus for retail is “markdown optimization,” the process of making sure revenues and profits on season-specific items are maximized as the products age.
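The core trade-off in markdown optimization is that a deeper discount moves more units but earns less per unit. The toy model below is a hypothetical sketch with made-up numbers, not SAS's model: it grid-searches candidate markdown depths for the one that maximizes weekly revenue under a simple linear demand-lift assumption.

```python
# Hypothetical single-item model; all constants are illustrative assumptions.
FULL_PRICE = 80.0    # list price of the seasonal item
BASE_DEMAND = 100    # units sold per week at full price
ELASTICITY = 4.0     # assumed demand lift per unit of markdown depth

def weekly_revenue(markdown: float) -> float:
    """Revenue at a given markdown fraction (0.0 = full price)."""
    units = BASE_DEMAND * (1 + ELASTICITY * markdown)
    return units * FULL_PRICE * (1 - markdown)

# Grid-search the markdown depths a retailer might allow.
candidates = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
best = max(candidates, key=weekly_revenue)
print(f"best markdown: {best:.0%}, revenue: {weekly_revenue(best):.2f}")
# -> best markdown: 40%, revenue: 12480.00
```

Real systems optimize over many items, stores, and weeks at once, which is what makes the problem computationally heavy enough to benefit from an in-memory grid.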

In addition, SAS is working with early customers, including Macy’s and United Overseas Bank, to determine optimal sizes for the private grid clusters, Wallace said. 

SAS’s products are emerging as in-memory processing gains a higher profile in part due to recent announcements from SAP. The applications vendor is developing appliances that employ its new in-memory database, and will gain more in-memory technology from the pending acquisition of Sybase.

While in-memory processing is far from new, in the past the cost of RAM and hardware limited the amount of data that could be processed at one time. The advent of low-priced multicore servers with large amounts of RAM has mitigated this problem.

Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris’s e-mail address is [email protected]
