IBM Corp. made available Wednesday stream computing software called IBM System S to enable organizations to perform perpetual analytics like continually detecting subtle physiological changes in premature infants at the University of Ontario Institute of Technology (UOIT).

System S should help UOIT medical practitioners detect life-threatening conditions and make better decisions by continually observing and collecting physiological data streams like respiration, heart rate and body temperature.
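The kind of continuous monitoring described above can be pictured as a simple stream operator that watches readings arrive and flags sharp deviations from a recent baseline. The sketch below is purely illustrative — the window size and threshold are made-up values, not clinical parameters, and it is not IBM's actual detection logic:

```python
from collections import deque

def detect_anomalies(stream, window=5, threshold=10.0):
    """Flag readings that deviate sharply from the recent moving average.

    A toy stand-in for continuous physiological monitoring; the
    window and threshold here are illustrative, not clinical values.
    """
    recent = deque(maxlen=window)
    alerts = []
    for t, value in enumerate(stream):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) > threshold:
                alerts.append((t, value))
        recent.append(value)
    return alerts

# A steady heart-rate series with one abrupt drop at index 7.
readings = [120, 121, 119, 120, 122, 121, 120, 95, 120, 121]
print(detect_anomalies(readings))  # -> [(7, 95)]
```

The point of the streaming approach is that the check runs as each reading arrives, rather than over a batch of records collected after the fact.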

The goal isn’t to replace the traditional manual paper-based approach employed by many medical practitioners. Rather, the technology will supplement those methods because “that extra layer really represents a change in the fundamental way that critical health care is (administered),” said Carolyn McGregor, associate professor with UOIT and Canada Research Chair in Health Informatics.

It’s a new model for problem solving that ingests data in bite-sized pieces at different stages as it becomes available, continually refining the outcome, instead of the traditional monolithic tack, said Nagui Halim, chief scientist for stream computing with the Armonk, New York-based company.

“The idea is to allow the application developers to think about processing information in this progressive manner,” said Halim.

Using the example of analyzing a phone conversation, Halim explained that a progressive approach to analytics would entail first performing data decryption on the voice, then voice-to-text conversion, followed by keyword extraction. “We allow programmers to specify how data is processed in this pipeline organization,” he said.
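Halim's pipeline can be mimicked in ordinary Python generators, where each stage consumes the previous stage's output as it arrives. This is only a conceptual sketch — the stage implementations (a trivial character shift standing in for decryption, lowercasing standing in for speech-to-text) are placeholders, not anything resembling System S code:

```python
def decrypt(packets):
    """Stage 1: stand-in decryption (here, a trivial character shift)."""
    for p in packets:
        yield "".join(chr(ord(c) - 1) for c in p)

def to_text(frames):
    """Stage 2: stand-in voice-to-text (frames are already text here)."""
    for f in frames:
        yield f.lower()

def keywords(texts, vocabulary):
    """Stage 3: emit only the words of interest as they appear."""
    for t in texts:
        for word in t.split():
            if word in vocabulary:
                yield word

# Each stage consumes the previous one's output incrementally, so
# results emerge as data arrives rather than after a batch completes.
packets = ["Cvz!TUPDL", "Ipme!CPOET"]   # "Buy STOCK", "Hold BONDS", shifted by one
pipeline = keywords(to_text(decrypt(packets)), {"buy", "stock"})
print(list(pipeline))  # -> ['buy', 'stock']
```

The design point is the lazy chaining: nothing waits for the full conversation before the downstream stages start producing results.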

In a flow model, there could potentially be thousands of concurrent streams where results are joined at the end or some other point, explained Halim, creating what can be a “fairly elaborate graph.” For this reason, Halim said IBM created a means of expressing this flow model, a new development environment, “and the system takes care of all the difficult issues of scheduling, failover, congestion control, any basic things that a runtime has to do.”
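The flow model Halim describes — many concurrent streams merging at a join point — can be sketched with threads and a shared queue. This is an assumption-laden miniature, not the System S programming model: two branches stand in for the potentially thousands of streams, and the runtime concerns Halim lists (scheduling, failover, congestion control) are simply absent here:

```python
import queue
import threading

def source(name, values, out):
    """One branch of the flow graph: push tagged readings downstream."""
    for v in values:
        out.put((name, v))
    out.put((name, None))  # end-of-stream marker

def join(out, n_sources):
    """Join point: merge results from every upstream branch."""
    merged = []
    done = 0
    while done < n_sources:
        name, v = out.get()
        if v is None:
            done += 1
        else:
            merged.append((name, v))
    return merged

# Two concurrent branches; a real deployment could have thousands,
# with the runtime handling scheduling, failover and congestion control.
out = queue.Queue()
branches = [
    threading.Thread(target=source, args=("temps", [20, 21], out)),
    threading.Thread(target=source, args=("rates", [120, 118], out)),
]
for t in branches:
    t.start()
results = join(out, len(branches))
for t in branches:
    t.join()
print(sorted(results))  # -> [('rates', 118), ('rates', 120), ('temps', 20), ('temps', 21)]
```

Even in this toy form, the "fairly elaborate graph" shape is visible: independent sources, a merge point, and results that arrive in no fixed order.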

IBM also made available at no charge System S trial code to help get customers acquainted with the novel idea of stream computing. The flow model solves problems that are not necessarily those that enterprises have been able to solve in a general systematic way, said Halim, and “we feel it’s important to introduce customers gradually to this.”

Toronto-based TD Securities is also using System S with the goal of making faster financial trading decisions by ingesting trading data at the rate of more than five million bits per microsecond. The equivalent, IBM explained, would be a trader reading the entire works of Shakespeare 10 times in less than one second and then identifying and executing a stock trade faster than a hummingbird flaps its wings.

The company also announced the opening of the IBM European Stream Computing Center in Dublin, Ireland, that will serve as a hub of research, customer support, and advanced testing for stream computing applications for European businesses. Other applications of the technology at the lab will focus on water management, specifically looking at the health of local water ecosystems that intersect with human populations. “The System can be connected to sensors that would detect water quality, temperature, salinity, fish that are in the water, etc., to get a better sense of what’s going on,” said Halim.

Also in the stream computing business, chip-maker AMD Inc. has its own ATI stream technology, originally developed by ATI Technologies Inc., that enables graphics processor units to work like an “application accelerator” for a variety of use cases, said Patricia Harrell, director of stream computing for the Sunnyvale, Calif.-based company. “What stream computing does that’s different is it gives us the opportunity to use that compute capacity for things other than traditional graphics,” said Harrell.

In the past, there have been a number of different architectural approaches taken to accelerate applications, said Harrell, “but some of them in the past have been very specialized, complicated technologies, some of them have been hard to program.”

Harrell thinks that stream computing is definitely gaining traction, especially considering the number of “game-changing” things happening, like OpenCL, an industry standard programming model for stream computing. “So, now mainstream developers can develop code that takes advantage of those capabilities, and it will run on any vendor platform where the vendor also supports that standard,” said Harrell. “That needed to happen to make this go mainstream.”

AMD’s stream computing technology is being used, for instance, to accelerate algorithms to reconstruct CAT scan data, said Harrell. “When you take a CT scan of a person or part of a person, there is lots of post-processing you need to do to turn that data into an image that a radiologist can look at,” she said.

According to Halim, until now, only point solutions with limited capability have existed. He believes that stream computing will “break us out of that pattern” to enable more sophisticated applications in a variety of areas like location-based services, security and transportation.

There is a “very broad” interest in the idea of having technology that observes the real world to help with speedier decision-making or warn of impending problems, said Halim. “It’s a brand new and very important field that will be emerging.”
