Trusted Business Advisors, Expert Technology Analysts

Research Areas

Data Protection/Management

Includes Backup, Recovery, Replication, Archiving, Copy Data Management and Information Governance.

The goal of modern data protection/management is to help companies protect, understand and use their data. Functionality comes in many forms, ranging from array-based snapshots and standalone backup, replication and deduplication products to comprehensive data management platforms. Since the amount of data is increasing much faster than IT budgets, companies are focused on decreasing storage costs and simplifying data protection processes. As a result, vendors are moving to scale-out platforms that use commodity hardware and offer better storage efficiency and policy-based automation of administrative functions, such as system upgrades, data recovery and data migration for long-term data retention. Depending on operational needs, objectives outside of data protection may include orchestrating the lifecycle of on-demand test/dev environments, using secondary storage for file services, and applying search/analytics to ensure data compliance.
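
To make the idea of policy-based automation a bit more concrete, here is a minimal sketch (ours, not any vendor's) of how a retention and tiering policy might be evaluated against a catalog of backup copies; the class names, policy fields and thresholds are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class ProtectionPolicy:
    # Hypothetical policy fields, not any vendor's schema
    keep_on_primary_days: int = 30    # newer copies stay on fast primary storage
    retention_days: int = 365         # copies older than this are expired entirely

@dataclass
class BackupCopy:
    name: str
    created: datetime
    tier: str = "primary"             # "primary" or "archive"

def apply_policy(copies: List[BackupCopy], policy: ProtectionPolicy,
                 now: Optional[datetime] = None) -> List[BackupCopy]:
    """Retier or expire each copy according to the policy; return the survivors."""
    now = now or datetime.utcnow()
    survivors = []
    for copy in copies:
        age = now - copy.created
        if age > timedelta(days=policy.retention_days):
            continue                                   # past retention: expire the copy
        if age > timedelta(days=policy.keep_on_primary_days):
            copy.tier = "archive"                      # age out to cheaper storage
        survivors.append(copy)
    return survivors

if __name__ == "__main__":
    copies = [BackupCopy("daily-2018-10-01", datetime(2018, 10, 1)),
              BackupCopy("daily-2017-06-01", datetime(2017, 6, 1))]
    for c in apply_policy(copies, ProtectionPolicy(), now=datetime(2018, 10, 16)):
        print(c.name, "->", c.tier)
```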

Free Reports

HPE RMC 6.0: Extending Beyond Copy Data Management

If you’ve worked in IT, you know that a large percentage of your company’s data has been copied at least once, and often multiple times, to meet the needs of various use cases. Whether it’s backup copies for data protection, archival copies for compliance, or clones for test/dev or analytics, any particular set of data is likely to have spawned one or more copies. While these copies are nearly always made for a good reason, in many organizations they have spiraled out of control, creating a copy data sprawl that is tough for IT to get its arms around, let alone manage. As copies of data have proliferated, so have the pain points of greater storage complexity, footprint and cost. The performance of production databases also suffers as copies are made for secondary applications.

It is these very issues that copy data management (CDM) is designed to address. CDM solutions focus on eliminating unnecessary duplication of production data to reduce storage consumption, generally through the use of data virtualization and data reduction technologies. The results can be compelling. Nearly one-third of the companies that Taneja Group recently surveyed have either adopted CDM solutions or are actively evaluating them, looking to achieve benefits such as reduced storage costs, faster data access, and better data visibility and compliance.
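
One way to picture the data virtualization behind CDM: rather than physically duplicating a dataset for every test/dev or analytics consumer, a virtual copy records only references to shared, content-addressed blocks, and a clone stores new data only for the blocks it changes (copy-on-write). The sketch below is purely illustrative, with class names of our own invention, and is not any particular vendor's implementation.

```python
import hashlib

class BlockStore:
    """Content-addressed pool of blocks shared by all virtual copies."""
    def __init__(self):
        self.blocks = {}                       # digest -> bytes

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)   # stored once, however many copies refer to it
        return digest

class VirtualCopy:
    """A 'copy' is just an ordered list of block references, not a second dataset."""
    def __init__(self, store: BlockStore, block_refs):
        self.store = store
        self.block_refs = list(block_refs)

    def clone(self) -> "VirtualCopy":
        return VirtualCopy(self.store, self.block_refs)    # metadata only, no data moved

    def write_block(self, index: int, data: bytes) -> None:
        self.block_refs[index] = self.store.put(data)      # copy-on-write: only the changed block is new

store = BlockStore()
production = VirtualCopy(store, [store.put(b"block-%d" % i) for i in range(4)])
test_dev = production.clone()                 # instant "copy" for test/dev
test_dev.write_block(0, b"modified")          # only this block adds to storage
print(len(store.blocks))                      # 5 blocks stored, not 8
```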

But while first-generation CDM offerings have proven helpful, they are not keeping up with the demands of new technologies and user requirements. In particular, Flash and Cloud bring new data management opportunities and challenges that cannot be addressed by traditional CDM solutions. User needs and expectations for CDM are also expanding, moving beyond just policy-based snapshot management among homogeneous arrays.

As we’ve learned in our research, next-gen CDM must meet a new set of user needs driven by Flash and Cloud innovations, including support for heterogeneous arrays, greater but less hands-on control of copies based on intelligent policy-based automation, and coverage of new use cases across the data lifecycle, such as test/dev, reporting and analytics. Customers are also looking for integrated solutions that combine CDM with data protection and other secondary storage functions.

As we’ll see, HPE Recovery Manager Central (RMC) 6.0 provides all these capabilities and more. In fact, we’ll argue that the updated RMC 6.0 offering has helped to make HPE a leader in the data management space, streamlining costs and enriching the experience of HPE customers while still delivering on the backup and recovery features that RMC is well known for.

Publish date: 10/16/18
Free Reports

Enterprises Realize Significant Value with Dell EMC Integrated DP for Converged Infrastructure

We talked to four organizations that have implemented Dell EMC Integrated Data Protection for Converged Infrastructure. The term Converged Infrastructure (CI) refers to an IT solution that groups storage, memory, compute, and networking components into a single, optimized package. When organizations evaluated Dell EMC Integrated Data Protection for CI, a key priority was improving data backup and recovery. To lower their operational costs, they also recognized the importance of having CI that would enable turnkey data protection and reduce their implementation, maintenance, and support costs, as well as improve reliability.

Prior to implementing Dell EMC Integrated Data Protection for CI, the organizations we interviewed said they faced significant cost, risk, and delays with implementations and upgrades. Several factors contributed to these issues, including extensive time spent on testing and research, compatibility and integration problems, and dealing with multiple support organizations.

All the companies we talked to also wanted better data protection. Many were struggling with slow backup speeds, poor storage efficiency, lengthy recovery times, and high operational overhead. To address these problems, companies needed better backup performance, instant data recovery, simplified management, and policy-driven automation for both backup and data recovery.

Companies also wanted better vendor support, and they shared a common goal of placing “one call” for all questions and issues they might have with any aspect of the solution. Success criteria included fast, no-hassle resolution and an end to finger-pointing between software and hardware vendors. Interviewees also expressed the desire for a vendor that would proactively monitor their systems and a team that would partner with them to address their ongoing needs.

Dell EMC customers found that the key to addressing their needs was Dell EMC’s CI portfolio, the Vblock and VxBlock Systems (now consolidated into a single turnkey product called VxBlock System 1000). Dell EMC is a CI pioneer, and with Integrated Data Protection for CI it has gone a step further, delivering enterprise-class data protection that is tightly integrated with VxBlock Systems. The result is all-in-one, high-performance, highly reliable data protection that is integrated with VxBlock Systems during manufacturing, pre-tested, pre-validated, and fully supported by a single vendor.

It’s an amazing value proposition: with no point products, there is less complexity and there are no compatibility issues. Every company we talked to stated that Dell EMC has exceeded their expectations and helped them at every phase of their journey, from implementation to support.

Publish date: 09/28/18
Free Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real time, all the time. Whether you are buying tickets or hunting for the best deal from an online retailer, you expect data to be up to date, with the best information at your fingertips. Businesses are expected to meet this requirement, whether they sell products or services, and having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments because the world's 24x7 real-time demands cannot wait for legacy ERP and CRM application rewrites. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that could operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is a popular and fast-growing example of an application environment that uses in-memory database technology to process massive amounts of real-time data quickly. Its in-memory computing engine allows HANA to work on data held in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage to enable greater scalability, disaster tolerance, and data protection.
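
Why an in-memory engine still needs persistent storage is easiest to see with a toy example: since DRAM is volatile, every change must be captured in a durable log (and periodically checkpointed) so the database can be rebuilt after a failure. The sketch below illustrates the generic write-ahead-log pattern only, not SAP HANA's actual persistence layer; the file path and class name are placeholders.

```python
import json
import os

class InMemoryTable:
    """Toy key-value 'table' held in RAM, made durable with a write-ahead log."""
    def __init__(self, log_path: str):
        self.log_path = log_path
        self.rows = {}              # volatile: lost on power failure
        self._replay()              # rebuild in-memory state from the durable log

    def put(self, key: str, value) -> None:
        with open(self.log_path, "a") as log:          # durable media (shared storage in practice)
            log.write(json.dumps({"key": key, "value": value}) + "\n")
            log.flush()
            os.fsync(log.fileno())                     # persist the log record before acknowledging
        self.rows[key] = value                         # then apply the change in memory

    def _replay(self) -> None:
        if not os.path.exists(self.log_path):
            return
        with open(self.log_path) as log:
            for line in log:
                record = json.loads(line)
                self.rows[record["key"]] = record["value"]

table = InMemoryTable("/tmp/toy_wal.log")
table.put("order-42", {"amount": 99.0})
# After a crash, a new InMemoryTable("/tmp/toy_wal.log") replays the log and recovers order-42.
```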

SAP HANA is available on-premises as a dedicated appliance and/or via best-in-class components through the SAP Tailored Datacenter Integration (TDI) program. The TDI environment has become the more popular HANA option as it provides the flexibility to leverage legacy resources such as data protection infrastructure and enables a greater level of scalability that was lacking in the appliance approach. Hewlett Packard Enterprise (HPE) has long been the leader in providing mission-critical infrastructure for SAP environments. SUSE has long been the leader in providing the mission-critical Linux operating system for SAP environments. Micro Focus has long been a strategic partner of HPE, and together they have leveraged unique hardware and software integrations that enable a complete end-to-end, robust data protection environment for SAP. One of the key value propositions of SAP HANA is its ability to integrate with legacy databases. Therefore, it makes the most sense to leverage a flexible data protection solution from Micro Focus and HPE to cover both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18
Free Reports / Report

HPE and Micro Focus Data Protection for Azure Stack

Hybrid cloud is increasingly gaining popularity among enterprise IT buyers, as companies recognize and begin to validate its benefits. With a hybrid cloud, organizations can take advantage of the elasticity and agility of the public cloud, especially for new cloud-native apps, while continuing to run their businesses in the near term on their existing apps on premises. Users gain the choice of deploying new and existing workloads in the public cloud or the data center, wherever it makes the most sense, and the flexibility to migrate them as needed. A hybrid cloud significantly eases the transition to the cloud, enabling organizations to compete in the new cloud-driven world while preserving current IT investments. With these benefits in mind, well over 80% of organizations we recently surveyed are in the process of moving or planning a move to a hybrid cloud infrastructure.


In this brave new world, Microsoft Azure and Azure Stack are increasingly being adopted as the foundation for companies’ hybrid cloud infrastructure. Microsoft Azure is a leading public cloud offering that, based on Taneja Group research, consistently ranks neck and neck with Amazon Web Services in enterprise adoption, with more than 50% of companies using or planning to use Azure within the next two years. Azure Stack enables organizations to deliver Azure services from their own data center. Delivered as an integrated solution on HPE ProLiant servers, Azure Stack allows customers to run Azure-compatible apps on premises, as well as use cases that benefit from a hybrid deployment. Together, Azure and Azure Stack provide a natural and relatively frictionless path for Microsoft Windows customers to move to the cloud, along with support for new cloud-native tools and services that allow customers to take full advantage of cloud agility and scalability.


As organizations move critical apps and data to the cloud, data protection quickly becomes a key requirement. But as buyers evaluate solutions, they often find that cloud providers’ built-in backup tools lack the flexibility, breadth of coverage, app awareness and enterprise capabilities they have become accustomed to on premises. As a result, companies look to other vendors, often their on-premises providers, to meet their data protection needs. As we’ll see, Micro Focus Data Protector offers a fully integrated, robust and comprehensive solution for backup and recovery on HPE Azure Stack.


In this piece we’ll further explore the need for data protection in a hybrid cloud environment, and examine the specific backup and recovery approaches that buyers are looking for, as revealed in our recent research. Then we’ll briefly examine what makes Micro Focus Data Protector an ideal solution for protecting an organization’s key information assets in an HPE Azure Stack hybrid cloud setting.
 

Publish date: 06/18/18
Report

The Easy Cloud For Complete File Data Protection: Igneous Systems Backs Up AND Archives All Your NAS

When we look at all the dollars being spent on endless NAS capacity growth, and at the increasingly complex (and mostly unrealistic) file protection schemes that go along with it, we find most enterprises aren’t happy with the status quo. And while cloud storage seems attractive in theory, making big changes to a stable NAS storage architecture can be costly and risky enough to keep many enterprises stuck endlessly growing their primary filers. Cloud gateways can help offload capacity, yet they add operational complexity and are ultimately only halfway solutions.

What file-dependent enterprises really need is a true hybrid solution that backs on-premises primary NAS with hybrid cloud secondary storage that is secure, reliable, elastic, scalable and, above all, seamless. Ideally it would be a drop-in solution that is remotely managed, paid for by subscription and highly performant, all while automatically backing up (and/or archiving) all existing primary NAS storage. In other words, enterprises don’t want to rip and replace working primary NAS solutions, but they do want to easily offload and extend them with superior cloud-shaped secondary storage.

When it comes to an enterprise’s cloud adoption journey, we recommend adopting “hybrid cloud” storage services first to address longstanding challenges with NAS file data protection. While many enterprises have reasonable backup solutions for block storage (and sometimes VM images/disks), reliable data protection for fast-growing file data is much harder because of trends toward bigger data repositories, faster streaming data, global sharing requirements and increasingly tight SLAs (not to mention shrinking backup windows). Igneous Systems promises to help filer-dependent enterprises keep up with all of their file protection challenges with its Igneous Hybrid Storage Cloud, which features integrated enterprise file backup and archive.

Igneous Systems’ storage layer integrates several key technologies: highly efficient, scalable, remotely managed object storage; built-in tiering and file movement, not just on the back end to public clouds but also on the front end from existing primary NAS arrays; remote management as a service to offload IT staff; and all the necessary file archive and backup automation.
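
The front-end file movement described above boils down to policy-driven selection: scan the primary filer, pick files that match an age (or other) rule, and copy them to the secondary tier. The sketch below is a generic illustration of that pattern; `upload_to_secondary` and the 180-day threshold are placeholders of ours, not Igneous APIs or defaults.

```python
import time
from pathlib import Path

ARCHIVE_AFTER_DAYS = 180      # hypothetical policy: archive files untouched for roughly six months

def upload_to_secondary(path: Path) -> None:
    """Placeholder for a put to object/cloud secondary storage; not a real Igneous call."""
    print(f"would archive {path}")

def archive_cold_files(primary_root: str) -> int:
    """Scan a primary NAS export and select files older than the policy threshold."""
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
    archived = 0
    for path in Path(primary_root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            upload_to_secondary(path)     # copy to the secondary tier first
            # a real tiering engine would then stub out or delete the primary copy
            archived += 1
    return archived

if __name__ == "__main__":
    print(archive_cold_files("/mnt/primary_nas"), "files selected for archive")
```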

And Igneous Hybrid Storage Cloud is also a first-class object store with valuable features like built-in global metadata search. Here, however, we’ll focus on how the Igneous Backup and Igneous Archive services are used to solve gaping holes in traditional approaches to NAS backup and archive.

Download the Solution Profile today!

Publish date: 06/20/17
Report

Companies Improve Data Protection and More with Cohesity

We talked to six companies that have implemented Cohesity DataProtect and/or the Cohesity DataPlatform. When these companies evaluated Cohesity, their highest priority was reducing storage costs and improving data protection. To truly modernize their secondary storage infrastructure, they also recognized the importance of having a scalable, all-in-one solution that could both consolidate and better manage their entire secondary data environment.

Prior to implementing Cohesity, many of the companies we interviewed had significant challenges with the high cost of their secondary storage. Several factors contributed to the high costs, including the need to license multiple products, inadequate storage reduction, the need for professional services and extensive training, difficulty scaling and maintaining systems, and the practice of adding capacity to expensive primary storage for lower-performance services such as group file shares.

In addition to lower storage costs, all the companies we talked to wanted a better data protection solution. Many were struggling with slow backup speeds, lengthy recovery times and cumbersome data archival methods, and solution complexity and high operational overhead were also major issues. To address these problems, companies wanted a unified data protection solution that offered better backup performance, instant data recovery, simplified management, and seamless cloud integration for long-term data retention.

Companies also wanted to improve overall secondary storage management, and they shared a common goal of consolidating secondary storage workloads under one roof. Depending on their environment and operational needs, objectives outside of data protection included providing self-service access to copies of production data for on-demand environments (such as test/dev), using secondary storage for file services, and leveraging indexing plus advanced search and analytics to find out-of-place confidential data and ensure data compliance.

Cohesity customers found that the key to addressing these challenges and needs is Cohesity’s Hyperconverged Secondary Storage. Cohesity is a pioneer of Hyperconverged Secondary Storage, a new category of secondary storage built on a web-scale, distributed file system that scales linearly and provides global data deduplication, automatic indexing, advanced search and analytics, and policy-based management of all secondary storage workloads. These capabilities combine to provide a single system that efficiently stores, manages, and understands all data copies and workflows residing in a secondary storage environment, whether the data is on-premises or in the cloud. With no point products, there is less complexity and licensing costs are lower.
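
To give a feel for what global deduplication and automatic indexing mean mechanically, the sketch below splits incoming backup data into fixed-size chunks, stores each unique chunk once under its content hash, and keeps a per-object index of chunk references for restore. Real systems use variable-length chunking and distributed indexes; this is a simplified illustration of the general technique, not Cohesity's implementation.

```python
import hashlib

CHUNK_SIZE = 4096                       # real systems typically use variable-length chunking

class DedupStore:
    """Global chunk pool plus a per-object index of chunk references."""
    def __init__(self):
        self.chunks = {}                # digest -> chunk bytes (each unique chunk stored once)
        self.index = {}                 # object name -> ordered list of chunk digests

    def ingest(self, name: str, data: bytes) -> None:
        refs = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)   # duplicate chunks add no extra storage
            refs.append(digest)
        self.index[name] = refs

    def restore(self, name: str) -> bytes:
        return b"".join(self.chunks[d] for d in self.index[name])

store = DedupStore()
payload = b"A" * 8192
store.ingest("vm-backup-mon", payload)
store.ingest("vm-backup-tue", payload)              # second backup of identical data
print(len(store.chunks))                            # 1 unique chunk stored, not 4
assert store.restore("vm-backup-tue") == payload
```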

It’s a compelling value proposition, and importantly, every company we talked to stated that Cohesity has met and exceeded their expectations and has helped them rapidly evolve their data protection and overall secondary data management. To learn about each customer’s journey, we examined their business needs, their data center environment, their key challenges, the reasons they chose Cohesity, and the value they have derived. Read on to learn more about their experience.

Publish date: 04/28/17