Taneja Group | Recovery Manager Central
Trusted Business Advisors, Expert Technology Analysts

Items Tagged: Recovery Manager Central

News

Flat backup grows into viable tool for data protection

Flat backups reduce license fees and improve recovery point and recovery time objectives, making them a useful tool for data protection.

  • Premiered: 07/08/15
  • Author: Arun Taneja
  • Published: TechTarget: Search Data Backup
Topic(s): Backup, Data protection, DP, Recovery, Snapshots, flat backup, NetApp, HP, EMC, 3PAR, StoreServ, StoreOnce, VMAX, Data Domain, RPO, RTO, COW, ROW, Disaster Recovery, DR, WAN, Microsoft, Oracle, SAP, SnapProtect, RMC, Recovery Manager Central, ProtectPoint, Virtualization, VMware
Profiles/Reports

DP Designed for Flash - Better Together: HPE 3PAR StoreServ Storage and StoreOnce System

Flash technology has burst onto the IT scene in the past few years with a vengeance. Initially seen simply as a replacement for HDDs, flash is now prompting IT and the business to rethink practices that have been well established for decades. One of those is data protection. Do you protect data the same way when it sits on flash as you did when HDDs ruled the day? How do you account for the fact that, at raw cost per capacity, flash is still more expensive than HDDs? Do data deduplication and compression technologies change how you work with flash? And given that flash is most often deployed to alleviate severe application performance issues, do you need to rethink how you protect, manage, and move this data?

These questions apply across the board when flash is injected into storage arrays, but even more so with all-flash arrays (AFAs), which are often associated with the most mission-critical applications an enterprise possesses. Expectations for application service levels and for data protection recovery time objectives (RTOs) and recovery point objectives (RPOs) are vastly different in these environments. Given this, are existing data protection tools adequate, or is there a better way to utilize these expensive assets and achieve far superior results? The short answer is yes: there is a better way, and it delivers far superior results.

In this Opinion piece we will focus on answering these questions broadly through the data protection lens. We will then look at a specific case of how data protection can be designed with flash in mind, considering the combination of flash-optimized HPE 3PAR StoreServ Storage, HPE StoreOnce System backup appliances, and HPE Recovery Manager Central (RMC) software. These elements combine to produce an exceptional solution that meets the stringent application service requirements and data protection RTOs and RPOs of flash storage environments while keeping costs in check.

Publish date: 06/06/16
Profiles/Reports

Free Report - Better Together: HP 3PAR StoreServ Storage and StoreOnce System (Opinion)

Flash technology has burst onto the IT scene in the past few years with a vengeance. Initially seen simply as a replacement for HDDs, flash is now prompting IT and the business to rethink practices that have been well established for decades. One of those is data protection. Do you protect data the same way when it sits on flash as you did when HDDs ruled the day? How do you account for the fact that, at raw cost per capacity, flash is still more expensive than HDDs? Do data deduplication and compression technologies change how you work with flash? And given that flash is most often deployed to alleviate severe application performance issues, do you need to rethink how you protect, manage, and move this data?

These questions apply across the board when flash is injected into storage arrays, but even more so with all-flash arrays (AFAs), which are often associated with the most mission-critical applications an enterprise possesses. Expectations for application service levels and for data protection recovery time objectives (RTOs) and recovery point objectives (RPOs) are vastly different in these environments. Given this, are existing data protection tools adequate, or is there a better way to utilize these expensive assets and achieve far superior results? The short answer is yes: there is a better way, and it delivers far superior results.

In this Opinion piece we will focus on answering these questions broadly through the data protection lens. We will then look at a specific case of how data protection can be designed with flash in mind, considering the combination of flash-optimized HP 3PAR StoreServ Storage, HP StoreOnce System backup appliances, and HP StoreOnce Recovery Manager Central (RMC) software. These elements combine to produce an exceptional solution that meets the stringent application service requirements and data protection RTOs and RPOs of flash storage environments while keeping costs in check.

Publish date: 09/25/15
Profiles/Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real time, all the time. Whether you are buying tickets or hunting for the best deal from an online retailer, data is expected to be up to date, with the best information at your fingertips. Businesses are expected to meet this requirement whether they sell products or services, and having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments. The world's 24x7 real-time demands cannot wait for legacy ERP and CRM application rewrites, so companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that can operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is a growing and very popular example of an application environment that uses in-memory database technology to process massive amounts of real-time data in a short time. Its in-memory computing engine allows HANA to process data held in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage for scalability, disaster tolerance, and data protection.
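
To see why persistent storage still matters for an in-memory database, consider a minimal sketch, written in plain Python and purely illustrative; it has nothing to do with SAP's actual implementation. Reads and writes are served entirely from RAM, while a write-ahead log on durable storage lets the store rebuild its state after the contents of DRAM are lost:

```python
import json
import os
import tempfile

class InMemoryStore:
    """Toy key-value store: reads and writes hit RAM, while a write-ahead
    log on persistent storage allows recovery after a crash wipes memory.
    (Illustrative only; not SAP HANA code.)"""

    def __init__(self, log_path):
        self.log_path = log_path
        self.data = {}  # the "in-memory" working set
        self._recover()

    def _recover(self):
        # Replay the log to rebuild RAM state after a restart.
        if os.path.exists(self.log_path):
            with open(self.log_path) as log:
                for line in log:
                    key, value = json.loads(line)
                    self.data[key] = value

    def put(self, key, value):
        # Log first (durability), then update memory (speed).
        with open(self.log_path, "a") as log:
            log.write(json.dumps([key, value]) + "\n")
            log.flush()
            os.fsync(log.fileno())
        self.data[key] = value

    def get(self, key):
        # All reads are served from RAM, never from disk.
        return self.data.get(key)

if __name__ == "__main__":
    path = os.path.join(tempfile.gettempdir(), "hana_demo.log")
    store = InMemoryStore(path)
    store.put("order-42", {"qty": 3, "status": "shipped"})
    print(store.get("order-42"))

    # Simulate a server failure: memory is lost, the log is not.
    del store
    restarted = InMemoryStore(path)
    print(restarted.get("order-42"))  # state recovered from persistent storage
```

The same division of labor holds at HANA scale: DRAM delivers the speed, and the persistence layer, which in TDI deployments lives on shared enterprise storage, delivers the durability.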

SAP HANA is available on-premises as a dedicated appliance or via best-in-class components through the SAP Tailored Datacenter Integration (TDI) program. TDI has become the more popular HANA option because it provides the flexibility to leverage existing resources, such as data protection infrastructure, and enables a level of scalability that the appliance approach lacked. Hewlett Packard Enterprise (HPE) has long been a leader in mission-critical infrastructure for SAP environments, and SUSE has long provided the mission-critical Linux operating system for them. Micro Focus, a long-time strategic partner of HPE, has worked with HPE on unique hardware and software integrations that enable a complete, end-to-end, robust data protection environment for SAP. Because one of the key value propositions of SAP HANA is its ability to integrate with legacy databases, it makes the most sense to leverage a flexible data protection solution from Micro Focus and HPE that covers both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18
Profiles/Reports

HPE Brings Multi-Cloud Storage to the Enterprise

Companies in every industry and from every corner of the world are increasingly adopting cloud storage, addressing use cases such as backup, archiving and disaster recovery. More than 96% of organizations we recently surveyed house at least some of their data in the cloud, up from just 65% five years earlier. Firms deploying storage in the cloud are looking to increase IT agility and workload scalability while taking advantage of a more flexible, pay-as-you-go consumption model.

But for enterprises and mid-sized organizations alike, the cloud journey nearly always starts on premises. A large majority of organizations still run the core of their business-critical workloads in the data center, supported by significant and proven investments in on-premises hardware, workflows and business processes that support key business apps and ensure maximum value to users and other stakeholders. Not surprisingly, IT decision makers tread carefully when it comes to considering public cloud deployments for their critical apps or data.

To get the best of the cloud without compromising current IT investments, a growing majority of decision makers are now focusing on solutions with hybrid and multicloud capabilities. Hybrid cloud enables them to gain value from the cloud from day one, while fully leveraging their on-premises infrastructure. Under a hybrid model, companies can deploy selected apps that make sense to run in the public cloud while still running the majority of their core business workloads on-premises. They can also adopt a DevOps approach to developing and running cloud-native apps.

Multicloud takes those benefits one step further, enabling portability of workloads between two or more clouds. Organizations we surveyed are now working with at least two major public cloud providers, on average, enabling them to avoid lock-in to a single provider and to choose the provider that best meets the needs of each app and use case. Together, hybrid and multicloud offer an attractive and measured approach for companies looking to deploy some of their workloads in the cloud.

In this piece we’ll examine the customer journey to cloud storage, including some important considerations companies should keep in mind as they decide what approach will work best for them. We’ll then describe HPE’s storage platforms, which are built for cloud and provide a powerful and unique approach to multicloud storage. Finally, we’ll look at the advantages that HPE storage delivers over other cloud storage deployment models, and show how these HPE platforms are helping enterprises to maximize the potential of their cloud storage initiatives.

Publish date: 09/21/18
Profiles/Reports

HPE RMC 6.0: Extending Beyond Copy Data Management

If you’ve worked in IT, you know that a large percentage of your company’s data has been copied at least once, and often multiple times, to meet the needs of various use cases. Whether it’s backup copies for data protection, archival copies for compliance, or clones for test/dev or analytics, any particular set of data is likely to have spawned one or more copies. While these copies are nearly always made for a good reason, in many organizations they have spiraled out of control, creating a copy data sprawl that is tough for IT to get its arms around, let alone manage. As copies of data have proliferated, so have the pain points of greater storage complexity, footprint and cost. The performance of production databases also suffers as copies are made for secondary applications.

It is these very issues that copy data management (CDM) is designed to address. CDM solutions focus on eliminating unnecessary duplication of production data to reduce storage consumption, generally through the use of data virtualization and data reduction technologies. The results can be compelling. Nearly one-third of the companies that Taneja Group recently surveyed have either adopted CDM solutions or are actively evaluating them, looking to achieve benefits such as reduced storage costs, faster data access, and better data visibility and compliance.
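
The core idea behind the data virtualization that CDM relies on can be shown in a small sketch. The Python below is hypothetical and is not HPE's implementation: a virtual copy holds pointers back to the production volume's blocks and materializes only the blocks a secondary workload actually changes, so a test/dev "copy" of a large volume can start at near-zero physical size:

```python
from dataclasses import dataclass, field

@dataclass
class Volume:
    """Production volume: a simple list of data blocks."""
    blocks: list

@dataclass
class VirtualCopy:
    """A virtual (pointer-based) copy: it stores only the blocks that
    have diverged from the base volume, so an unmodified copy consumes
    almost no additional capacity. (Illustrative sketch only.)"""
    base: Volume
    overrides: dict = field(default_factory=dict)  # block index -> new data

    def read(self, i):
        # Unchanged blocks are read straight from the base volume.
        return self.overrides.get(i, self.base.blocks[i])

    def write(self, i, data):
        # Copy-on-write: only modified blocks are materialized.
        self.overrides[i] = data

    def physical_size(self):
        return len(self.overrides)

prod = Volume(blocks=["b0", "b1", "b2", "b3"])
testdev = VirtualCopy(base=prod)   # "copy" for test/dev: zero new blocks
testdev.write(2, "b2-patched")     # only this block is duplicated

print(testdev.read(0), testdev.read(2))  # b0 b2-patched
print(testdev.physical_size(), "of", len(prod.blocks), "blocks stored")
```

Combined with deduplication and compression on whatever blocks do get materialized, this is how CDM turns a sprawl of full physical copies into a handful of lightweight virtual ones.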

But while first-generation CDM offerings have proven helpful, they are not keeping up with the demands of new technologies and user requirements. In particular, flash and cloud bring new data management opportunities and challenges that traditional CDM solutions cannot address. User needs and expectations for CDM are also expanding, moving beyond just policy-based snapshot management among homogeneous arrays.

As we’ve learned in our research, next-gen CDM must meet a new set of user needs driven by flash and cloud innovations: support for heterogeneous arrays, greater but less hands-on control of copies through intelligent policy-based automation, and coverage of new use cases across the data lifecycle, such as test/dev, reporting and analytics. Customers are also looking for integrated solutions that combine CDM with data protection and other secondary storage functions.
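
To make "intelligent policy-based automation" concrete, here is a minimal, hypothetical sketch in Python; the tier names, intervals, and functions are invented for illustration and do not reflect RMC's actual policy engine. A policy states how often each application tier should be snapped and how long copies are retained, and the automation simply evaluates those rules rather than relying on hand-managed copies:

```python
from datetime import datetime, timedelta

# Hypothetical policies: snapshot frequency and retention per application
# tier. (Illustrative only; not RMC's actual policy schema.)
POLICIES = {
    "mission-critical": {"interval": timedelta(minutes=15), "retain": timedelta(days=7)},
    "test-dev":         {"interval": timedelta(hours=24),   "retain": timedelta(days=2)},
}

def snapshot_due(last_snap, tier, now):
    """Return True if the tier's policy calls for a new snapshot."""
    return now - last_snap >= POLICIES[tier]["interval"]

def expired(snap_time, tier, now):
    """Return True if a copy has outlived its retention window."""
    return now - snap_time > POLICIES[tier]["retain"]

now = datetime(2018, 10, 1, 12, 0)
print(snapshot_due(datetime(2018, 10, 1, 11, 30), "mission-critical", now))  # True
print(expired(datetime(2018, 9, 20), "test-dev", now))                        # True
```

The point is not the code but the operating model: administrators declare intent per application tier, and the CDM layer creates and expires copies across heterogeneous arrays to match.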

As we’ll see, HPE Recovery Manager Central (RMC) 6.0 provides all these capabilities and more. In fact, we’ll argue that the updated RMC 6.0 offering has helped to make HPE a leader in the data management space, streamlining costs and enriching the experience of HPE customers while still delivering on the backup and recovery features that RMC is well known for.

Publish date: 10/16/18