Free Reports

HPE RMC 6.0: Extending Beyond Copy Data Management

If you’ve worked in IT, you know that a large percentage of your company’s data has been copied at least once, and often multiple times, to meet the needs of various use cases. Whether it’s backup copies for data protection, archival copies for compliance, or clones for test/dev or analytics, any particular set of data is likely to have spawned one or more copies. While these copies are nearly always made for a good reason, in many organizations they have spiraled out of control, creating a copy data sprawl that is tough for IT to get its arms around, let alone manage. As copies of data have proliferated, so have the pain points of greater storage complexity, footprint and cost. The performance of production databases also suffers as copies are made for secondary applications.

It is these very issues that copy data management (CDM) is designed to address. CDM solutions focus on eliminating unnecessary duplication of production data to reduce storage consumption, generally through the use of data virtualization and data reduction technologies. The results can be compelling. Nearly one-third of the companies that Taneja Group recently surveyed have either adopted CDM solutions or are actively evaluating them, looking to achieve benefits such as reduced storage costs, faster data access, and better data visibility and compliance.
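To make the data virtualization idea concrete, here is a minimal Python sketch of a copy-on-write virtual clone, the general technique CDM platforms rely on; the names and structures are invented for illustration and do not depict any particular product:

```python
# Illustrative, hypothetical sketch: why a virtual (copy-on-write) clone
# consumes far less storage than a full physical copy. Not a product API.

PRODUCTION_DATA = {f"block{i}": f"data{i}" for i in range(1000)}

def full_copy(source: dict) -> dict:
    """A traditional copy: every block is duplicated up front."""
    return dict(source)   # consumes as much storage as the source itself

class VirtualClone:
    """Shares unchanged blocks with the source and stores only the
    blocks the consumer (e.g., a test/dev team) actually modifies."""

    def __init__(self, source: dict):
        self._source = source   # shared, read-only view of production
        self._delta = {}        # private storage for modified blocks

    def read(self, block: str) -> str:
        return self._delta.get(block, self._source[block])

    def write(self, block: str, value: str) -> None:
        self._delta[block] = value   # new storage is consumed only here

    @property
    def blocks_consumed(self) -> int:
        return len(self._delta)

# A test/dev clone that modifies 1% of the data set:
clone = VirtualClone(PRODUCTION_DATA)
for i in range(10):
    clone.write(f"block{i}", "modified")

print(len(full_copy(PRODUCTION_DATA)))   # 1000 blocks for a full copy
print(clone.blocks_consumed)             # 10 blocks for the virtual clone
```

Each additional full copy repeats the entire footprint, while each additional virtual clone costs only its changed blocks, which is where the storage savings come from.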

But while first-generation CDM offerings have proven helpful, they are not keeping up with the demands of new technologies and user requirements. In particular, Flash and Cloud bring new data management opportunities and challenges that cannot be addressed by traditional CDM solutions. User needs and expectations for CDM are also expanding, moving beyond just policy-based snapshot management among homogeneous arrays.

As we’ve learned in our research, next-gen CDM must meet a new set of user needs driven by Flash and Cloud innovations, including support for heterogeneous arrays, greater but less hands-on control of copies based on intelligent policy-based automation, and coverage of new use cases across the data lifecycle, such as test/dev, reporting and analytics. Customers are also looking for integrated solutions that combine CDM with data protection and other secondary storage functions.

As we’ll see, HPE Recovery Manager Central (RMC) 6.0 provides all these capabilities and more. In fact, we’ll argue that the updated RMC 6.0 offering has helped to make HPE a leader in the data management space, streamlining costs and enriching the experience of HPE customers while still delivering on the backup and recovery features that RMC is well known for.

Publish date: 10/16/18
Free Reports

Enterprises Realize Significant Value with Dell EMC Integrated DP for Converged Infrastructure

We talked to four organizations that have implemented Dell EMC Integrated Data Protection for Converged Infrastructure. The term Converged Infrastructure (CI) refers to an IT solution that groups storage, memory, compute, and networking components into a single, optimized package. When organizations evaluated Dell EMC Integrated Data Protection for CI, a key priority was improving data backup and recovery. To lower their operational costs, they also recognized the importance of having CI that would enable turnkey data protection and reduce their implementation, maintenance, and support costs, as well as improve reliability.

Prior to implementing Dell EMC Integrated Data Protection for CI, the organizations we interviewed said they had faced significant cost, risk, and delays with implementations and upgrades. Several factors contributed to these problems, including extensive time spent on testing and research, compatibility and integration issues, and having to deal with multiple support organizations.

All the companies we talked to also wanted better data protection. Many were struggling with slow backup speeds, poor storage efficiency, recovery times that fell short of requirements, and high operational overhead. To address these problems, companies needed better backup performance, instant data recovery, simplified management, and policy-driven automation for both backup and recovery.

Companies also wanted better vendor support, and they shared a common goal of placing “one call” for all questions and issues they might have with any aspect of the solution. Success criteria included fast, no-hassle resolution and an end to finger-pointing between software and hardware vendors. Interviewees also expressed a desire for a vendor that would proactively monitor their systems and for a team that would partner with them to address their ongoing needs.

Dell EMC customers found that the key to addressing their needs was Dell EMC’s CI offering, Vblock and VxBlock Systems (now consolidated into one turnkey product called VxBlock System 1000). Dell EMC is a CI pioneer, and with Integrated Data Protection for CI it has gone a step further, delivering enterprise-class data protection that is tightly integrated with VxBlock Systems. The result is all-in-one, high-performance, highly reliable data protection that is integrated with VxBlock Systems during manufacturing, pre-tested, pre-validated, and fully supported by a single vendor.

It’s a compelling value proposition. With no point products, there is less complexity and there are no compatibility issues. Every company we talked to stated that Dell EMC had exceeded their expectations and helped them at every phase of their journey, from implementation to support.

Publish date: 09/28/18
Free Reports

HPE Brings Multi-Cloud Storage to the Enterprise

Companies in every industry and from every corner of the world are increasingly adopting cloud storage, addressing use cases such as backup, archiving and disaster recovery. More than 96% of organizations we recently surveyed are housing at least some of their data in the cloud, up from just 65% five years before. Firms deploying storage in the cloud are looking to increase IT agility and workload scalability, while taking advantage of a more flexible, pay-as-you-go consumption model.

But for enterprises and mid-sized organizations alike, the cloud journey nearly always starts on premises. A large majority of organizations still run the core of their business-critical workloads in the data center, backed by significant, proven investments in on-premises hardware, workflows, and business processes that support key business apps and ensure maximum value to users and other stakeholders. Not surprisingly, IT decision makers tread carefully when it comes to considering public cloud deployments for their critical apps or data.

To get the best of the cloud without compromising current IT investments, a growing majority of decision makers are now focusing on solutions with hybrid and multicloud capabilities. Hybrid cloud enables them to gain value from the cloud from day one, while fully leveraging their on-prem infrastructure. Under a hybrid model, companies can deploy selected apps that make sense to run in the public cloud, while still running a majority of their core business workloads on premises. They can also employ a DevOps approach to begin developing and running cloud-native apps.

Multicloud takes those benefits one step further, enabling portability of workloads between two or more clouds. Organizations we surveyed are now working with at least two major public cloud providers, on average, enabling them to avoid lock-in to a single provider and to choose the provider that best meets the needs of each app and use case. Together, hybrid and multicloud offer an attractive and measured approach for companies looking to deploy some of their workloads in the cloud.
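The portability argument rests on abstraction: if an application reaches storage through a provider-neutral interface, the backing cloud can change without rewriting the application. The Python sketch below illustrates that pattern only; the classes and methods are invented stand-ins, not calls to any real cloud SDK:

```python
from abc import ABC, abstractmethod

# Hypothetical provider-neutral storage interface. The two providers are
# in-memory stand-ins; a real deployment would call each cloud's SDK.

class ObjectStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class CloudAStore(ObjectStore):
    def __init__(self):
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data        # imagine a Cloud A API call here

    def get(self, key: str) -> bytes:
        return self._objects[key]

class CloudBStore(ObjectStore):
    """A second stand-in provider with a slightly different backend."""
    def __init__(self):
        self._bucket: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._bucket["cloud-b/" + key] = data   # imagine a Cloud B API call

    def get(self, key: str) -> bytes:
        return self._bucket["cloud-b/" + key]

def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    """Application code: identical regardless of which cloud backs it."""
    store.put(f"archive/{name}", body)

# Switching providers is a one-line change, which is the anti-lock-in claim:
archive_report(CloudAStore(), "q3-report.pdf", b"...")
archive_report(CloudBStore(), "q3-report.pdf", b"...")
```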

In this piece we’ll examine the customer journey to cloud storage, including some important considerations companies should keep in mind as they decide what approach will work best for them. We’ll then describe HPE’s storage platforms, which are built for cloud and provide a powerful and unique approach to multicloud storage. Finally, we’ll look at the advantages that HPE storage delivers over other cloud storage deployment models, and show how these HPE platforms are helping enterprises to maximize the potential of their cloud storage initiatives.

Publish date: 09/21/18
Free Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real time, all the time. Whether you are buying tickets or hunting for the best deal from an online retailer, you expect up-to-date information at your fingertips. Businesses are expected to meet this requirement, whether they sell products or services, and having real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments, because the world's 24x7 real-time demands cannot wait for legacy ERP and CRM applications to be rewritten. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that can operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is an increasingly popular example of an application environment that uses in-memory database technology to process massive amounts of real-time data quickly. The in-memory computing engine allows HANA to work on data held in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage to enable greater scalability, disaster tolerance, and data protection.
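To see why persistence matters for an in-memory engine, consider the minimal Python sketch below. It is purely illustrative (all names are invented, and this is not SAP's design): a store that serves reads from RAM but appends every write to a durable log, so its contents can be rebuilt after a crash or power loss.

```python
import json
import os

# Illustrative sketch of the in-memory-plus-persistence pattern that
# motivates HANA's persistent-storage requirement. Hypothetical code,
# not SAP's implementation.

class InMemoryStore:
    """Keeps all data in RAM for speed, but appends every write to a
    durable log so the contents survive a crash or power loss."""

    def __init__(self, log_path: str):
        self._log_path = log_path
        self._data = {}        # the fast but volatile working set
        self._replay_log()     # rebuild the RAM state after a restart

    def put(self, key: str, value: str) -> None:
        # Persist first, then update memory, so a crash between the two
        # steps never loses a write that was acknowledged to the caller.
        with open(self._log_path, "a") as log:
            log.write(json.dumps({"k": key, "v": value}) + "\n")
            log.flush()
            os.fsync(log.fileno())
        self._data[key] = value

    def get(self, key: str):
        return self._data.get(key)   # served from RAM, no disk read

    def _replay_log(self) -> None:
        if not os.path.exists(self._log_path):
            return
        with open(self._log_path) as log:
            for line in log:
                entry = json.loads(line)
                self._data[entry["k"]] = entry["v"]

store = InMemoryStore("store.log")
store.put("order-42", "shipped")
print(store.get("order-42"))   # read at memory speed: "shipped"
```

Real in-memory databases refine this pattern with periodic savepoints plus redo logs on shared storage, which is exactly the persistence layer the HANA architecture calls for.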

SAP HANA is available on-premises as a dedicated appliance or as best-in-class components assembled through the SAP Tailored Datacenter Integration (TDI) program. The TDI approach has become the more popular HANA option, as it provides the flexibility to leverage legacy resources such as data protection infrastructure and enables a level of scalability that the appliance approach lacked. Hewlett Packard Enterprise (HPE) has long been the leader in providing mission-critical infrastructure for SAP environments, and SUSE has long led in providing the mission-critical Linux operating system for them. Micro Focus has long been a strategic partner of HPE, and together they have leveraged unique hardware and software integrations that enable a complete, robust, end-to-end data protection environment for SAP. Since one of SAP HANA's key value propositions is its ability to integrate with legacy databases, it makes the most sense to leverage a flexible data protection solution from Micro Focus and HPE that covers both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18
Free Reports / Profile

HPE InfoSight: Cross-stack Analytics

Accurate and action-oriented predictive analytics have long been the Holy Grail of IT management. Predictive analytics that bring together large amounts of real-time data with powerful analytical capabilities have the potential to provide IT managers with real-time, data-driven insights into the health and performance of their overall environment, enabling them to anticipate and remediate looming issues and optimize resource utilization. While these potential benefits have long been understood, it has only been recently that major innovations in cloud, Internet of Things (IoT), data science, and AI/machine learning have paved the way for predictive analytics to become a reality in the data center.

The IoT now enables companies to collect and monitor real-time sensor or operational data at the edge—whether in online financial systems, retail locations, or on the factory floor. This raw data is typically streamed to the cloud, where it can be aggregated and analyzed. Powerful advances in edge-to-cloud networks and global learning capabilities make the cloud an optimal location for the analytics to take place. Informed by data science and increasingly driven by AI and machine learning technologies, these analytics can help IT managers to monitor key system metrics and understand how well specific infrastructure elements—such as servers or storage—are performing.

But analytics that focus on a single infrastructure element at a time can only go so far. Sure, it is helpful to monitor the health and performance of specific IT resources, such as CPU heartbeat or storage latency, but infrastructure resources do not operate in isolation. Analytics must go beyond one dimension and take into account how resources such as servers and storage interact with and depend on one another. This is especially critical in virtualized infrastructures, in which the interaction of virtual machines with hosts, networks and storage makes IT management even more challenging. Ideally, using the power of AI, analytics can cross these various layers of the IT stack to reveal the impact of resource interactions and interdependencies among all the layers. This would take analytics to a whole new level, transcending the limits of human intelligence to enable dynamic, multi-dimensional analysis of complex, virtualized IT environments.
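As a toy illustration of that multi-layer idea (and emphatically not InfoSight's actual algorithms), the hypothetical Python sketch below raises an alert only when anomalies in two layers coincide: the kind of cross-layer signal that monitoring a single element would miss.

```python
from statistics import mean, stdev

# Hypothetical sketch of cross-layer correlation. Real products such as
# HPE InfoSight use far richer telemetry and machine learning; this only
# shows why combining layers beats watching one layer in isolation.

def zscores(samples: list[float]) -> list[float]:
    """Express each sample as its distance from the mean in std-devs."""
    mu, sigma = mean(samples), stdev(samples)
    return [(s - mu) / sigma for s in samples]

def cross_layer_alerts(vm_latency_ms: list[float],
                       storage_latency_ms: list[float],
                       threshold: float = 2.0) -> list[int]:
    """Flag time steps where BOTH the VM and the storage array show an
    anomalous latency spike -- evidence of a cross-layer interaction,
    such as a noisy neighbor saturating a shared volume."""
    vm_z = zscores(vm_latency_ms)
    st_z = zscores(storage_latency_ms)
    return [t for t, (v, s) in enumerate(zip(vm_z, st_z))
            if v > threshold and s > threshold]

# Ten per-minute latency samples; both layers spike at minute 7.
vm = [4, 5, 4, 5, 4, 5, 4, 40, 5, 4]
st = [1, 1, 2, 1, 1, 2, 1, 25, 1, 2]
print(cross_layer_alerts(vm, st))   # [7]
```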

Think about the implications of AI-driven, cross-stack analytics for IT management. For example, such a capability has the potential to transform technical support from a reactive, always-playing-catch-up function into a proactive, forward-looking one. In this scenario, built-in analytics connect the dots between infrastructure layers to automatically anticipate, diagnose, and fix technical issues before they become major problems. Cross-layer analytics might also help to improve system performance by predicting looming configuration issues and recommending changes to address them.

One product—HPE InfoSight—is already embracing these possibilities, fast-forwarding to bring AI-driven, cross-layer analytics to virtualized environments today. HPE InfoSight has proven its value in delivering predictive storage analytics to customers for many years, while extending its capabilities across the infrastructure stack. In this piece we’ll explore the key characteristics that customers should look for in an analytics solution for virtual infrastructure, then look at the HPE InfoSight architecture and capabilities and how they are helping customers transform IT management in virtualized environments. Specifically, we will demonstrate how one customer uses cross-stack analytics delivered by HPE InfoSight to save tremendous time and money in their HPE 3PAR Storage environment.

Publish date: 06/28/18
Free Reports / Report

HPE and Micro Focus Data Protection for Azure Stack

Hybrid cloud is increasingly gaining popularity among enterprise IT buyers, as companies recognize and begin to validate its benefits. With a hybrid cloud, organizations can take advantage of the elasticity and agility of the public cloud, especially for new cloud-native apps, while continuing to run their businesses in the near term on their existing apps on premises. Users gain the choice of deploying new and existing workloads in the public cloud or the data center, wherever it makes the most sense, and the flexibility to migrate them as needed. A hybrid cloud significantly eases the transition to the cloud, enabling organizations to compete in the new cloud-driven world while preserving current IT investments. With these benefits in mind, well over 80% of organizations we recently surveyed are in the process of moving or planning a move to a hybrid cloud infrastructure.

In this brave new world, Microsoft Azure and Azure Stack are increasingly being adopted as the foundation for companies’ hybrid cloud infrastructure. Microsoft Azure is a leading public cloud offering that, based on Taneja Group research, consistently ranks neck and neck with Amazon Web Services in enterprise adoption, with more than 50% of companies using or planning to use Azure within the next two years. Azure Stack enables organizations to deliver Azure services from their own data center. Delivered as an integrated solution on HPE ProLiant servers, Azure Stack allows customers to run Azure-compatible apps on premises, including use cases that benefit from a hybrid deployment. Together, Azure and Azure Stack provide a natural and relatively frictionless path for Microsoft Windows customers to move to the cloud, along with support for new cloud-native tools and services that allow customers to take full advantage of cloud agility and scalability.

As organizations move critical apps and data to the cloud, data protection quickly becomes a key requirement. But as buyers evaluate solutions, they often find that cloud providers’ built-in backup tools lack the flexibility, breadth of coverage, app awareness and enterprise capabilities they have become accustomed to on premises. As a result, companies look to other vendors—often their on-premises providers—to meet their data protection needs. As we’ll see, Micro Focus Data Protector offers a fully integrated, robust and comprehensive solution for backup and recovery on HPE Azure Stack.

In this piece we’ll further explore the need for data protection in a hybrid cloud environment, and examine the specific backup and recovery approaches that buyers are looking for, as revealed in our recent research. Then we’ll briefly examine what makes Micro Focus Data Protector an ideal solution for protecting an organization’s key information assets in an HPE Azure Stack hybrid cloud setting.

Publish date: 06/18/18