LAS VEGAS — Hewlett Packard Enterprise says it’s extending the data centre out from the back office right to the edge with its new converged systems for the Internet of Things (IoT) unveiled on Wednesday.
With the new systems and services on offer from HPE, the worlds of operational technology and IT are being linked, said Mark Potter, chief technology officer of the enterprise group at HPE. This is the start of an effort by HPE and its partners to automate operations in a new area of industry, delivering new business models along with it.
There could be as much as US$430 billion in savings generated from productivity gains related to automation alone, he says, and IoT data is expected to make up 10 per cent of all data generated by the end of the decade. Therein lies the opportunity.
“It’s all about how do you take that intelligent device information and connect it?” he says. “Whether it’s by WiFi, Bluetooth, 3G or LTE cellular, the important thing is that HPE has solutions to connect and secure those devices.”
HPE announced two new converged appliances, the Edgeline EL1000 and Edgeline EL4000, at its Discover conference. Described as the first converged systems for the Internet of Things (IoT), the units integrate data capture, control, compute, and storage to provide analytics. They are built for industrial applications, ready to handle shock, vibration, and extreme temperatures, as HPE sees the oil and gas sector as one of its likely customers.
“The edge can be harsh, so we have to create technologies that are rugged,” says Tom Bradicich, vice-president and general manager of servers at HPE. “We’re taking a chunk of the data centre and shifting it out to the edge to do deep compute.”
The Edgeline EL1000 is about the size of a case of poker chips, so it can be mounted just about anywhere: in a vehicle, on the floor of a manufacturing line, or on an industrial pump, as in HPE’s partnership with Irving, Texas-based Flowserve Corp. On the Discover show floor, a Flowserve industrial pump was integrated with an Edgeline EL1000 server, resulting in automated error detection, a dashboard displaying metrics in real time, and an augmented reality application that uses a tablet to assess the pump’s performance.
HPE’s Vertica Analytics Platform will run on the Edgeline EL4000 model, providing real-time analytics from in-database machine learning algorithms. Data sent back to the data centre is secured with Aruba Networks’ Virtual Intranet Access client, a security solution appropriate for high-security government and commercial applications. That’s not the only way HPE is capitalizing on its May 2015 acquisition of Aruba Networks here.
The Aruba line is providing IoT security in other avenues as well. HPE says it has enhanced Aruba ClearPass with version 6.6 to help the IT department profile new IoT devices on the network and enforce security policies. Policy status can be exchanged with other systems, such as a mobile device management system.
“It trusts no device and no user until proven otherwise,” says Michael Tennefoss, vice-president of strategic partnerships at Aruba. “Device profiling will identify devices, fingerprint them and determine if they are trustworthy or not. It can give access to the network and impose terms of how a device uses the network.”
HPE is aware that marrying IT with operational technology will be new to its customers, and it’s introducing an IoT Transformation Workshop to address that through education. The workshop is the first in a full suite of services designed around implementing IoT solutions, HPE says.
Four IoT Innovation Labs will also be opened – at HPE locations in Houston; Grenoble, France; Bangalore; and Singapore.
The vendor is also partnering with GE Digital, National Instruments, and PTC to deliver joint solutions to customers.
The Edgeline EL1000 supports between four and 16 CPU cores, and the EL4000 model will support up to 64 cores. The EL4000 takes a standard 1U form factor, so it can be installed in a server rack.
Hewlett Packard Enterprise and Nokia will work together to sell Internet of Things solutions to industrial and manufacturing customers, and to municipal governments looking for smart city applications, starting next year, the firms announced on Wednesday.
The technology tag team points to research firm MarketsandMarkets, which predicts the IoT market for smart cities and manufacturing will reach US$161 billion by 2020. The thinking: manufacturers will want to improve productivity, and cities with growing populations will need to find ways to operate more efficiently, maintain infrastructure, and be sustainable.
Smart city services refer to helping a government develop a platform that allows for digital delivery of services, says Antonio Neri, executive vice-president and general manager of the enterprise group at HPE. It would also create opportunities to monetize that platform for private-sector applications.
To make that happen, HPE can take the relevant data and feed it into forecasting models with its CMS portfolio, Neri says, but it needs Nokia’s help to collect the data. Not only is Nokia developing 5G wireless networking equipment, it also has good relationships with telecommunications companies in many regions.
“You need connectivity,” he says. “You need routing capabilities.”
Smart city solutions could include applications like smart lighting and smart buildings, according to HPE. Solutions for the manufacturing and industrial vertical will include asset management and predictive maintenance.
There are no details on the specific solutions yet, but they will combine connectivity, core networking, data aggregation, and compute technologies. A proof of concept in the works will combine Nokia’s routing with HPE’s hybrid IT for joint project delivery models.
Collaboration isn’t new to Nokia and HPE, which have already done joint deals with 25 enterprise and service provider customers.
LONDON – For enterprises considering a move away from legacy on-premises storage solutions to cloud storage, Hewlett Packard Enterprise (HPE) says it has a new offering that can offer the best of both worlds.
A new HPE 3PAR Flash Now initiative offers pricing on HPE’s all-flash storage platform starting at $0.03 per usable gigabyte (GB) per month. That’s “a fraction of the cost of public cloud solutions,” HPE says in a press release issued Monday. It goes on to pitch the on-premises infrastructure as “less than half the cost of public cloud” and as providing “the best of on-premises performance, application availability, and control with the convenience and agility of public cloud consumption models.”
Storage has been one of the easier workloads to move to the cloud for firms still hesitant to abandon on-premises models, because the lower pricing and scalable flexibility of cloud services have been hard to beat. Hosting storage at a similar scale in an owned data centre typically requires a stiff upfront capital investment, which is exactly what IT departments want to avoid. With HPE’s Flash Now, firms can have the on-premises storage and pay for it from the operational budget, amortizing the costs of installation.
Trevor Jackson, senior director of IT at SOCAN, which already uses 3PAR All-Flash storage arrays, says the Flash Now model is interesting and that he’d look into it as an option to add new storage as he needs it.
“Eight months from now we might need to double our capacity,” he says. “What I like about 3PAR is that it can scale.”
SOCAN became an All-Flash array user more than two years ago. The organization is responsible for tracking the broadcast of copyrighted musical works across Canada, and over the past six years Jackson’s department has been receiving reports from a growing number of digital services, including Spotify, YouTube, and Google Play.
“Because the performance of the All-Flash array is so great, as we add more services to what we do on a daily basis, it’s able to keep up with that demand,” Jackson says. “You can imagine, some of the top YouTube artists get millions of plays for just one video. When you add multiple artists to that, you’re talking about billions of plays. Sifting through all that data is the challenge.”
While the overriding narrative in the IT industry for the last two years has been about designing a hybrid approach that inevitably makes use of the public cloud for some services, HPE may have given its customers another option, says Ray Wang, principal analyst at Constellation Research.
“Price competitiveness has been the main driver to public cloud,” he says. “If HP matches public cloud pricing and provides financing as stated, they have a way for customers to have another option in their overall IT strategy.”
At a few pennies per GB, HPE matches the rate of public cloud storage options, Wang says, but will offer lower long-term costs because public cloud services also charge inbound and outbound data transfer fees.
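As a back-of-the-envelope illustration of Wang’s point, the sketch below compares monthly costs under the quoted $0.03 per usable GB rate against a hypothetical public cloud tariff. Only the Flash Now rate comes from HPE’s announcement; the cloud storage and egress rates are illustrative placeholders, not quotes from any provider.

```python
# Rough monthly storage cost comparison.
# FLASH_NOW_RATE is HPE's quoted price; the cloud rates are hypothetical.
FLASH_NOW_RATE = 0.03   # USD per usable GB per month (from HPE's announcement)
CLOUD_RATE = 0.023      # hypothetical public cloud storage rate, USD/GB/month
CLOUD_EGRESS = 0.09     # hypothetical outbound transfer fee, USD per GB moved

def monthly_cost_on_prem(capacity_gb):
    """On-premises OpEx model: a flat per-GB monthly rate, no transfer fees."""
    return capacity_gb * FLASH_NOW_RATE

def monthly_cost_cloud(capacity_gb, egress_gb):
    """Public cloud model: per-GB storage plus fees on data read back out."""
    return capacity_gb * CLOUD_RATE + egress_gb * CLOUD_EGRESS

capacity = 100_000  # 100 TB expressed in GB

# With no data leaving the cloud, the raw per-GB rate favours the cloud...
print(round(monthly_cost_on_prem(capacity), 2))       # 3000.0
print(round(monthly_cost_cloud(capacity, 0), 2))      # 2300.0
# ...but reading back even 20% of the data each month flips the comparison.
print(round(monthly_cost_cloud(capacity, 20_000), 2)) # 4100.0
```

The crossover point depends entirely on how much data flows back out each month, which is why long-term costs, not the headline per-GB rate, decide the comparison.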
Beyond costs, HPE will be able to target companies with strict compliance requirements to keep data stored within their own facilities. In many cases, businesses are unwilling to voluntarily let data leave their national jurisdiction, which cloud storage can entail.
“It comes as no surprise HP is announcing this in Europe, where data privacy and residency concerns are most pronounced,” says Holger Mueller, analyst with Constellation Research. “So for a number of next-gen apps that can stay on-premises, this will be attractive.”
HPE 3PAR Flash Now comes with some other frills beyond an OpEx pricing model, in the form of HPE Flexible Capacity and Pre-Provisioning. To help customers support the flash arrays with appropriate networking infrastructure, HPE has updated its StoreFabric 32Gb Fibre Channel portfolio to include Smart SAN technology that automates orchestration from the 3PAR StoreServ arrays.
Jackson says he’d recommend 3PAR’s All-Flash arrays to organizations struggling with storage as a bottleneck.
“If you’re in an environment with a lot of transactions or data processing, let’s face it, disk has been the bottleneck in the compute chain,” the IT director says. “Then it makes sense to move to flash.”
HPE says its Flash Now service is available worldwide as of today.
If we were to invent the computer today, instead of reusing the architecture designed 60 years ago, what would we make?
That’s the question that HP Labs asked itself when it started down the path of creating The Machine in 2014. The moonshot R&D initiative seeks to answer the problems posed by the impending influx of big data brought by the Internet of Things and growing connectivity around the globe.
“This is the ‘Hello World’ moment for the next generation of computing,” says Mark Potter, chief technology officer of the enterprise group at HPE. “Some of the fundamentals that have helped grow and revolutionize what we have as a world today are starting to slow down. This will continue to take us forward.”
The basic problem that HPE is trying to solve with today’s computer architecture is the necessary exchange between memory and storage. Today, a computer’s memory (RAM) holds the data that can be exposed to the CPU. When new data requires processing, it must be retrieved from storage (the hard drive) and brought to the RAM. So a CPU’s full processing power can only be exposed to segments of data limited by the amount of RAM in a computer.
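The bottleneck described above can be illustrated with a toy cost model. The latency figures below are rough orders of magnitude for DRAM versus flash storage, not measurements of The Machine or any specific system:

```python
# Toy model of the memory/storage exchange described above.
# Latencies are rough orders of magnitude, not measurements of a real system.
RAM_ACCESS_NS = 100           # ~100 ns to reach DRAM
STORAGE_ACCESS_NS = 100_000   # ~100 microseconds to reach flash storage

def time_to_touch(dataset_gb, ram_gb, access_count):
    """Estimated seconds to perform `access_count` random accesses,
    assuming accesses hit RAM in proportion to how much data fits there."""
    hit_rate = min(1.0, ram_gb / dataset_gb)  # fraction served from RAM
    ns = access_count * (hit_rate * RAM_ACCESS_NS +
                         (1 - hit_rate) * STORAGE_ACCESS_NS)
    return ns / 1e9

# A dataset that fits in memory is touched quickly...
print(round(time_to_touch(dataset_gb=64, ram_gb=64, access_count=10**7), 3))   # 1.0
# ...but one that spills to storage pays the miss penalty on most accesses.
print(round(time_to_touch(dataset_gb=640, ram_gb=64, access_count=10**7), 3))  # 900.1
```

Even in this crude model, a dataset ten times larger than RAM runs roughly 900 times slower, which is why putting the whole working set in one large memory pool changes the picture so dramatically.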
This model of a CPU-defined approach (hence the name “central” processing unit) and limited memory has defined the way computers are programmed.
“We treat memory as a precious resource, we say that if an algorithm takes a little less memory, it’s better than another algorithm,” says Keith McAuliffe, chief technologist at HPE. “This architecture turns this upside down. We have to unlearn a lot of computer science that we’ve learned in school.”
While The Machine that HPE is working towards will make these vast new memory pools available with a new type of memory, the prototype uses DRAM. The long, custom boards HPE has built for the prototype contain 4 TB of DRAM each, wired with a design that gives a powerful system-on-a-chip (SoC) immediate access to all of the memory at once. Stacked together in a rack, the prototype boards combine to provide access to many terabytes of memory at once, with the multiple board enclosures linked by a fibre photonics connection rather than electrical wiring, explains Andrew Wheeler, deputy director of HP Labs.
“Right now computer chips ‘speak’ in electronic signaling, so over short distances you can do very high bit rates,” he says. “Photonics gives you the same high-speed signaling at greater distances, and there’s less power used.”
Fibre cabling is also more manageable than copper cabling as it’s more flexible and compact, he adds.
In short, the prototype still uses conventional hardware, but it blends in the new memory-centric architecture that The Machine will offer. That’s needed so HPE can start talking about the concept and get developers interested in writing programs that take advantage of it.
A couple of tweaks to take advantage of more memory at the middleware and OS levels could result in performance boosts of 200-800 per cent pretty quickly, Wheeler says. But that’s just the “low-lift, get started” performance gains that can be achieved. Some applications could stand to see as much as an 8,000x gain, such as a financial model that HPE created to predict individualized financial portfolio risk curves based on real-time market data.
Then there are the applications that you just couldn’t have otherwise. Like discovering new laws of nature.
That’s Sean Carroll’s job. He’s a research professor of theoretical physics at the California Institute of Technology (Caltech), and he isn’t exaggerating when he says, “We are constrained by our ability to store and manipulate vast amounts of information.”
To confirm the physical laws that govern our universe, Carroll says his team sets up experiments and matches data to a template of what they’d expect to see if their hypotheses are correct. That’s how the gravitational waves measured by ground-based observatories earlier this year confirmed a prediction of Einstein’s theory of general relativity. But it’s a limiting approach.
“What we’d like to have is the ability to look for things that we weren’t expecting,” he says. “You could look for something like this and discover it so every telescope in the world could point at the same thing.”
The Machine will go through many prototype phases beyond this one before HPE’s memory-centric vision is achieved in full. Wheeler says the immediate focus is to scale up from what they have today and achieve some additional use cases and proof points.
There’s no hard timeline for when the next build of The Machine prototype will be completed. You never know what roadblocks you’ll hit when you’re working with entirely new components, he says.
Like HP Inc.’s, HPE’s first year could not be called a disappointment by any stretch of the imagination. In a year of disruption that saw every corporate giant in the enterprise software space struggle at least a little, HPE made moves to shine through. Third-quarter reports had HPE net revenue down six per cent, but cash flow from operations up 10 per cent from the prior-year period.
We spoke with Paul Edwards, IDC analyst and director of infrastructure channels research, to learn more.
“Whether you are Dell or HPE, you saw disruption due to the changing industry this past year, so no matter how well HPE reacted to that disruption, it was unavoidable,” said Edwards. “Their satisfaction scores have risen, and I would say they are more nimble than before. Overall, this year was relatively positive.”
Considering HPE is a leader in the majority of the market segments it serves (first in servers, second in networking, second in total storage, and second in IT services), and that 80 per cent of Fortune 500 companies are HPE Enterprise Services customers, “relatively positive” is the right phrase.
HPE recently beefed up its enterprise security portfolio and added upgrades to its Partner Ready program. Not everything has been peachy at HPE, though, with the company splitting off its services division and non-core software assets to Computer Sciences Corp. (CSC) and Micro Focus International, respectively.
One-on-one with Charlie Atkinson
We spoke with Charlie Atkinson, the managing director and vice president of HPE Canada’s enterprise group about year one.
“When we were embracing the advent of our separation a year ago we were looking forward to being able to move faster and to have a deeper focus on the driving forces that are really driving change and opportunity in the market,” said Atkinson. “Here we are a year later and that vision is one that we have really realized. The agility, the empowerment, and the accountability that we have has just been phenomenal.”
That vision continues to be important as the industry reshapes the way the enterprise operates with the whole notion of multi-cloud and the shift of workloads to different cloud models. This shift requires HPE to become much more adaptive and responsive, something the company hopes to continue to do moving past this first year as well.
“We put a tremendous amount of focus on helping our customers move into the world of agile DevOps and hybrid IT,” said Atkinson. “When you look at our converged systems and hyper-converged offerings, we’ve done extremely well, especially in the large enterprise space, helping our customers really operate in the mobile world we live in, and how to make hybrid IT simple.”
On shedding more weight
On the surface, further spinoffs from HPE less than a year after the big HP split may seem concerning. But HPE is adamant in its promise to make itself a faster, more nimble company, and these moves have been made with that ideal at the forefront.
“[The CSC and Micro Focus deal] will allow us again to move faster and have that deeper focus on the enterprise group, while also helping those divisions being spun off succeed in their own space. The combination of enterprise services and CSC will make that business, as a separate independent company, the number one end-to-end IT services firm in the world,” said Atkinson.
We won’t know how those moves turn out for a little while longer. The deal with CSC comes into effect on April 1, with the Micro Focus deal coming into effect in the second half of HPE’s upcoming fiscal year.
Looking forward to year two
As the industry continues to change, HPE is confident in its ability to continue to adapt and succeed in this space.
“One of the reasons our channel community is so excited is because when you look at our hyper-converged offerings, when you look at where Synergy is going, it really does create a great opportunity for our channel partners to contribute in the migration of workloads from the traditional IT space into tomorrow’s hybrid IT space,” said Atkinson.
“The real proof in the pudding is how you are doing in customer satisfaction ratings. We’ve got industry-leading satisfaction ratings, and we are going to continue to innovate and continue to deliver advanced services in our portfolio.”
If you missed CDN’s interview with Mary Ann Yule, president of HP Canada, from earlier this week, be sure to check that out for the other half of the story.
Hewlett Packard Enterprise (HPE) this week unveiled several new high-performance computing (HPC) solutions, including a comprehensive software-defined platform, enhancements to its Apollo servers, and a new ANSYS computer-aided engineering (CAE) software-based solution designed to help manufacturing organizations optimize design simulation deployments.
The parallel processing model enables applications to run faster through the use of more CPUs. According to the Palo Alto-based vendor, its new software-defined platform, using the refreshed HPE Core HPC Software Stack with HPE Insight Cluster Management Utility v8.0, is designed to simplify the implementation and management of HPC solutions.
The HPE Core HPC Software Stack — which combines app dev tools with cluster management capabilities — is designed to help organizations test and deploy clusters within HPC environments more quickly and effectively.
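How much faster "more CPUs" can make an application is classically bounded by Amdahl’s law, a textbook result rather than anything specific to HPE’s stack. A minimal sketch:

```python
# Amdahl's law: the classic upper bound on speedup from parallelism.
# (A textbook illustration of the parallel processing model, not an HPE benchmark.)
def amdahl_speedup(parallel_fraction, n_cpus):
    """Speedup when `parallel_fraction` of the work scales with CPU count
    and the remainder must run serially."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cpus)

# A 95%-parallel job speeds up nicely on 16 CPUs...
print(round(amdahl_speedup(0.95, 16), 2))    # 9.14
# ...but even a huge cluster is capped by the serial 5% (limit: 20x).
print(round(amdahl_speedup(0.95, 1024), 2))  # 19.64
```

This is why HPC vendors pair raw core counts with cluster management and low-latency interconnects: shrinking the serial and communication overheads matters as much as adding processors.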
The company also announced improvements to its software-defined Apollo HPC platform, introducing “systems design innovations” for the Apollo 6000 system with new HPE ProLiant XL260a server trays. The offerings are based on Intel Corp.’s Xeon Phi processor family and Omni-Path Architecture to help boost bandwidth and reduce latency.
According to research firm IDC, the HPC model is becoming a popular enterprise option due to the migration of HPC to the cloud and growing recognition of HPC’s strategic value. Dell Inc., which this week announced that it is selling off its software division, also recently bolstered its HPC product portfolio with new “HPC as a Service” capabilities, including on-premises, hybrid, and off-premises deployment options.
Small and medium-sized business (SMB) owners looking for the peace of mind and easy access offered by on-site data storage now have an affordable new option, thanks to Hewlett Packard Enterprise (HPE).
The Palo Alto, Calif.-based IT giant, which spun off from the original Hewlett-Packard Company last year, released two new storage solutions today: the flexible hybrid cloud-based StoreVirtual 3200 and the solid-state drive (SSD)-based MSA 2042, specifically aimed at helping SMBs modernize their on-site infrastructure without breaking the bank.
“I think the importance of storage and storage technology to both SMB and enterprise customers has increased significantly over the last number of years as technologies like virtualization have become more and more prevalent,” according to Brad Parks, HPE Storage’s director of GTM strategy and enablement.
He noted that while enterprise and SMB customers alike have been demanding – and, in the former’s case, receiving – large-scale hybrid cloud and SSD storage for years, HPE had been unable to accommodate its smaller-scale customer base because the technology simply hadn’t been affordable enough to do so, until now.
“With this release, we’re giving smaller customers access to technology that their larger brethren have been enjoying now for several years,” Parks said.
The StoreVirtual 3200 provides SMB customers with an entry-level, dual-controller storage array based on 64-bit ARM technology, a new RAID (redundant array of independent disks) stack, and advanced storage data services, including support for more than 2,000 snapshots, thin provisioning, and optional data recovery software, for under $10,000. It’s also flexible enough to incorporate additional flash and hybrid cloud storage when needed.
“If you think about the ‘S’ in ‘SMB,’ when starting a new company you need e-mail, you need a CRM system… and in time, you may need to develop your own application or customize your platform for a specific industry, so with this release we’re giving those customers room to grow,” Parks said.
It can also save them money: According to HPE, the StoreVirtual 3200 can match the same performance and capacity as the company’s existing 2-node StoreVirtual 4000 configurations, at a 58 per cent lower cost.
Meanwhile, the MSA 2042 adds 800 GB of built-in SSD capacity to HPE’s flagship entry-level array, along with an enterprise-level software suite that includes flash acceleration, automated tiering, and data protection features. According to HPE, it delivers up to 60 per cent more database transactions per second and an application response time 80 per cent faster than its predecessor, at 46 per cent of the cost.
“Customers who are using shared storage are often storing data for four or five different critical applications all in the same place, and if you put your eggs in one basket, it better be a good basket,” Parks said, likening the difference to using an HDD-based laptop versus an iPad or iPhone.
“After having that ability to instantly turn on your device and have access to everything you need, none of us would go back to the days of a laptop with a spinning disk drive,” he said. “Even small or midsize businesses are going to want to use flash to make sure that users who access their applications get that instant gratification.”
And like the StoreVirtual 3200, the MSA 2042 allows customers to keep their existing data while expanding capacity with flash and traditional disk drives in the future. It also supports up to 512 array snapshots and remote replication to any MSA system.
Both devices are available worldwide starting today, with the MSA 2042 priced around $12,820, and the StoreVirtual 3200 around $7,860.
LAS VEGAS – Hewlett Packard Enterprise (HPE) kicked off its first major conference as a separate entity by focusing on what HP’s enterprise division has long been known for: providing solutions to manage complex IT environments and opportunities for deep integrations across a portfolio of software, hardware, and services.
At its HPE Discover customer event, the company announced the release of OneView 3.0, an infrastructure management application that applies software-defined intelligence to various HPE on-premises systems, including ProLiant, BladeSystem, Hyper-Converged Solutions, and Synergy. OneView is built to deliver applications quickly, provide workload template automation, and offer an API for integration with other systems.
It’s the next step for HPE in pushing the composable model of IT infrastructure management it introduced with Synergy. OneView 3.0 gives organizations not running Synergy servers elements of composable-style infrastructure management, which Ric Lewis, senior vice-president and general manager of converged data centre infrastructure at HPE, describes as “infrastructure that can flex to the needs of any given workload.”
To be flexible in adapting to the way an IT shop chooses to manage its infrastructure, a new dashboard in this version of OneView will provide a single view of infrastructure of many different flavours.
“It’s a little bit like converged infrastructure, except that was about bringing storage and fabric together,” Lewis explains. “This is built from the ground up to self-discover resources to help the user deploy workloads.”
The unified dashboard can provide a single view into multiple data centres across different form factors, displaying health and monitoring details for legacy storage arrays and all-flash storage alike.
“We’re talking about the whole system put together and designed to do this capability,” Lewis says. “You can’t do all of what Synergy can do, because that was designed for this. But it can do a lot of it.”
Synergy, HPE’s new converged platform, was unveiled last December and hit general availability in April. It marked the introduction of composable infrastructure from HPE, and now OneView is being designed to extend some of its advantages to shops running HPE systems that aren’t converged.
OneView also promises to integrate with Intelligent Management Centre to provide end-to-end management of heterogeneous network switches.
While OneView isn’t designed to reach into cloud environments, HPE has addressed that with a new software offering in its Helion line: CloudSystem 10. It allows for easier provisioning of physical servers or virtualized clusters from CloudSystem’s console, which HPE says should cut down on the number of organizational baton passes when provisioning physical infrastructure.
“Previously you’d have to provision all your infrastructure and then go back into OneView and take additional steps. Now that’s all automated in the scripting,” says Brad Parks, director of marketing strategy and enablement at HPE.
From a mile-high view, HPE’s Helion Cloud Portfolio combines its hardware and software offerings into a solution that can vary to fit just about any imaginable IT infrastructure requirement. The different elements of the suite can be added on to create an infrastructure package that includes storage, networking, software, and services. With an eye on the trend of enterprises moving to operate in some form of hybrid cloud model, HPE is offering its current customers a way to integrate more deeply into their environments, and prospective customers the prospect of a turnkey solution.
As HPE describes them, the four main pieces of the Helion stack include:
HPE says that the new Helion products will be available in the second half of 2016. OneView 3.0 is slated for a third-quarter release.