Trusted Business Advisors, Expert Technology Analysts

Research Areas

Data Protection/Management

Includes Backup, Recovery, Replication, Archiving, Copy Data Management and Information Governance.

The goal of modern data protection/management is to help companies protect, understand and use their data. Functionality comes in many forms, ranging from array-based snapshots and standalone backup, replication and deduplication products to comprehensive data management platforms. Since the amount of data is increasing much faster than IT budgets, companies are focused on decreasing storage costs and simplifying data protection processes. As a result, vendors are moving to scale-out platforms that use commodity hardware and offer better storage efficiency and policy-based automation of administrative functions, such as system upgrades, data recovery and data migration for long-term data retention. Depending on operational needs, objectives outside of data protection may include orchestrating the lifecycle of on-demand test/dev environments, using secondary storage for file services and search/analytics to ensure data compliance.
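Policy-based automation of administrative functions such as data retention usually reduces to applying a schedule of rules to each backup copy. A minimal illustrative sketch of that idea in Python — the rule names and retention windows here are hypothetical examples, not tied to any vendor's product:

```python
from datetime import datetime, timedelta

# Hypothetical retention policy: keep daily backups for 7 days, weeklies for 28.
RULES = {
    "daily": timedelta(days=7),
    "weekly": timedelta(days=28),
}

def expired(backup_type, taken_at, now):
    """Return True if a backup copy has aged past its retention window."""
    window = RULES.get(backup_type)
    if window is None:
        return False  # unknown copy types are retained by default
    return now - taken_at > window

now = datetime(2018, 6, 18)
print(expired("daily", now - timedelta(days=10), now))   # → True (aged out)
print(expired("weekly", now - timedelta(days=10), now))  # → False (still retained)
```

A real platform would evaluate such rules continuously across all copies and trigger deletion or tiering jobs automatically; the selection logic itself is this simple.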

Free Report

HPE and Micro Focus Data Protection for Azure Stack

Hybrid cloud is increasingly gaining popularity among enterprise IT buyers, as companies recognize and begin to validate its benefits. With a hybrid cloud, organizations can take advantage of the elasticity and agility of the public cloud, especially for new cloud-native apps, while continuing to run their businesses in the near term on their existing apps on premises. Users gain the choice of deploying new and existing workloads in the public cloud or the data center, wherever it makes the most sense, and the flexibility to migrate them as needed. A hybrid cloud significantly eases the transition to the cloud, enabling organizations to compete in the new cloud-driven world while preserving current IT investments. With these benefits in mind, well over 80% of organizations we recently surveyed are in the process of moving or planning a move to a hybrid cloud infrastructure.


In this brave new world, Microsoft Azure and Azure Stack are increasingly being adopted as the foundation for companies’ hybrid cloud infrastructure. Microsoft Azure is a leading public cloud offering that, based on Taneja Group research, consistently ranks neck and neck with Amazon Web Services in enterprise adoption, with more than 50% of companies using or planning to use Azure within the next two years. Azure Stack enables organizations to deliver Azure services from their own data center. Delivered as an integrated solution on HPE ProLiant servers, Azure Stack allows customers to run Azure-compatible apps on premises, as well as workloads that benefit from a hybrid deployment. Together, Azure and Azure Stack provide a natural and relatively frictionless path for Microsoft Windows customers to move to the cloud, along with support for new cloud-native tools and services that allow customers to take full advantage of cloud agility and scalability.


As organizations move critical apps and data to the cloud, data protection quickly becomes a key requirement. But as buyers evaluate solutions, they often find that cloud providers’ built-in backup tools lack the flexibility, breadth of coverage, app awareness and enterprise capabilities they have become accustomed to on premises. As a result, companies look to other vendors—often their on-premises providers—to meet their data protection needs. As we’ll see, Micro Focus Data Protector offers a fully integrated, robust and comprehensive solution for backup and recovery on HPE Azure Stack.


In this piece we’ll further explore the need for data protection in a hybrid cloud environment, and examine the specific backup and recovery approaches that buyers are looking for, as revealed in our recent research. Then we’ll briefly examine what makes Micro Focus Data Protector an ideal solution for protecting an organization’s key information assets in an HPE Azure Stack hybrid cloud setting.
 

Publish date: 06/18/18
Report

The Easy Cloud For Complete File Data Protection: Igneous Systems Backs Up AND Archives All Your NAS

When we look at all the dollars being spent on endless NAS capacity growth, and the increasingly complex (and often unrealistic) file protection schemes that go along with it, we find most enterprises aren’t happy with the status quo. And while cloud storage seems attractive in theory, making big changes to a stable NAS storage architecture can be costly and risky enough to keep many enterprises stuck endlessly growing their primary filers. Cloud gateways can help offload capacity, yet they add operational complexity and are ultimately only halfway solutions.

What file-dependent enterprises really need is a true hybrid solution that backs on-premises primary NAS with hybrid cloud secondary storage to provide a secure, reliable, elastic, scalable and, above all, seamless solution. Ideally, we would want a drop-in solution that is remotely managed, paid for by subscription and highly performant, all while automatically backing up (and/or archiving) all existing primary NAS storage. In other words, enterprises don’t want to rip and replace working primary NAS solutions, but they do want to easily offload and extend them with superior cloud-based secondary storage.

When it comes to an enterprise’s cloud adoption journey, we recommend that “hybrid cloud” storage services be adopted first to address longstanding challenges with NAS file data protection. While many enterprises have reasonable backup solutions for block storage (and sometimes VM images/disks), reliable data protection for fast-growing file data is much harder due to trends toward bigger data repositories, faster streaming data, global sharing requirements and increasingly tight SLAs (not to mention shrinking backup windows). Igneous Systems promises to help filer-dependent enterprises keep up with all their file protection challenges with its Igneous Hybrid Storage Cloud, which features integrated enterprise file backup and archive.

Igneous Systems’ storage layer integrates several key technologies – highly efficient, scalable, remotely managed object storage; built-in tiering and file movement not just on the back-end to public clouds, but also on the front-end from existing primary NAS arrays; remote management as-a-service to offload IT staff; and all necessary file archive and backup automation.
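Front-end tiering of the kind described above is conceptually simple: scan the primary filer for cold files and move them to the secondary tier. A minimal sketch of the age-based selection step — the paths and the 90-day threshold are our illustrative assumptions, not Igneous internals:

```python
import os
import shutil
import time

COLD_AFTER = 90 * 24 * 3600  # illustrative: files untouched for 90 days are "cold"

def find_cold_files(primary_root, now=None):
    """Yield paths on the primary filer whose last access is past the threshold."""
    now = now or time.time()
    for dirpath, _dirs, files in os.walk(primary_root):
        for name in files:
            path = os.path.join(dirpath, name)
            if now - os.path.getatime(path) > COLD_AFTER:
                yield path

def archive(path, secondary_root):
    """Copy a cold file to the secondary tier; the primary copy could then
    be replaced with a stub to reclaim filer capacity."""
    dest = os.path.join(secondary_root, os.path.basename(path))
    shutil.copy2(path, dest)
    return dest
```

A production service would of course handle huge namespaces incrementally, preserve directory structure and verify copies before stubbing, but the policy core is just this scan-and-move loop.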

And Igneous Hybrid Storage Cloud is also a first-class object store with valuable features like built-in global metadata search. However, here we’ll focus on how the Igneous Backup and Igneous Archive services are used to solve gaping holes in traditional approaches to NAS backup and archive.

Download the Solution Profile today!

Publish date: 06/20/17
Report

Companies Improve Data Protection and More with Cohesity

We talked to six companies that have implemented Cohesity DataProtect and/or the Cohesity DataPlatform. When these companies evaluated Cohesity, their highest priority was reducing storage costs and improving data protection. To truly modernize their secondary storage infrastructure, they also recognized the importance of having a scalable, all-in-one solution that could both consolidate and better manage their entire secondary data environment.

Prior to implementing Cohesity, many of the companies we interviewed had significant challenges with the high cost of their secondary storage. Several factors contributed to the high costs, including the need to license multiple products, inadequate storage reduction, the need for professional services and extensive training, difficulty scaling and maintaining systems, and the practice of adding capacity to expensive primary storage for lower-performance services, such as group file shares.

In addition to lower storage costs, all the companies we talked to also wanted a better data protection solution. Many companies were struggling with slow backup speeds, insufficient recovery times and cumbersome data archival methods. Solution complexity and high operational overhead were also major issues. To address these problems, companies wanted a unified data protection solution that offered better backup performance, instant data recovery, simplified management and seamless cloud integration for long-term data retention.

Companies also wanted to improve overall secondary storage management and they shared a common goal of combining secondary storage workloads under one roof. Depending on their environment and their operational needs, their objectives outside of data protection included providing self-service access to copies of production data for on-demand environments (such as test/dev), using secondary storage for file services and leveraging indexing and advanced search and analytics to find out-of-place confidential data and ensure data compliance.

Cohesity customers found that the key to addressing these challenges and needs is Cohesity’s Hyperconverged Secondary Storage. Cohesity is a pioneer of Hyperconverged Secondary Storage, a new category of secondary storage based on a web-scale, distributed file system that scales linearly and provides global data deduplication and automatic indexing, as well as advanced search and analytics and policy-based management of all secondary storage workloads. These capabilities combine to provide a single system that efficiently stores, manages and understands all data copies and workflows residing in a secondary storage environment—whether the data is on premises or in the cloud. There are no point products, and therefore less complexity and lower licensing costs.
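Global deduplication of the kind described here rests on a familiar idea: split incoming data into chunks, fingerprint each chunk, and store only fingerprints the system hasn’t seen before. A minimal single-node sketch — fixed-size chunking and SHA-256 are our illustrative choices, not Cohesity’s actual implementation:

```python
import hashlib

CHUNK_SIZE = 4096  # illustrative fixed-size chunks; real systems often chunk variably

class DedupStore:
    def __init__(self):
        self.chunks = {}  # fingerprint -> chunk bytes, stored exactly once

    def write(self, data):
        """Store data, returning the list of fingerprints that reference it."""
        refs = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(fp, chunk)  # a duplicate chunk costs nothing extra
            refs.append(fp)
        return refs

    def read(self, refs):
        """Reassemble the original data from its chunk fingerprints."""
        return b"".join(self.chunks[fp] for fp in refs)

store = DedupStore()
refs = store.write(b"x" * 8192)  # two identical 4 KB chunks...
print(len(store.chunks))         # → 1 (...stored as one)
```

In a distributed file system the fingerprint index is shared cluster-wide, which is what makes the deduplication “global”: identical chunks from any workload on any node are stored once.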

It’s a compelling value proposition, and importantly, every company we talked to stated that Cohesity has met and exceeded their expectations and has helped them rapidly evolve their data protection and overall secondary data management. To learn about each customer’s journey, we examined their business needs, their data center environment, their key challenges, the reasons they chose Cohesity, and the value they have derived. Read on to learn more about their experience.

Publish date: 04/28/17
Report

Cloud Object Storage for the Healthcare Data Blues

The healthcare industry continues to face tremendous cost challenges. The U.S. government estimates national health expenditures in the United States accounted for $3.2 trillion last year—nearly 18% of the country’s total GDP. There are many factors that drive up the cost of healthcare, such as the cost of new drug development and hospital readmissions. In addition, there are compelling studies showing that medical organizations will need to evolve their IT environments to curb healthcare costs and improve patient care in new ways, such as cloud-based healthcare models aimed at research community collaboration, coordinated care and remote healthcare delivery.

For example, Goldman Sachs recently predicted that the digital revolution could save $300 billion in spending in the healthcare sector by powering new patient options, such as home-based patient monitoring and patient self-management. Moreover, the most significant progress may come from medical organizations transforming their healthcare data infrastructure. Here’s why:

  • Advancements in digital medical imaging have resulted in an explosion of data that sits in picture archiving and communication systems (PACS) and vendor-neutral archives (VNAs).
  • Patient care initiatives such as personalized medicine and genomics require storing, sharing and analyzing massive amounts of unstructured data.
  • Regulations such as the Health Insurance Portability and Accountability Act (HIPAA) require organizations to have policies for long-term image retention and business continuity.

Unfortunately, traditional file storage approaches aren’t well-suited to manage vast amounts of unstructured data and present several barriers to modernizing healthcare infrastructure. A recent Taneja Group survey found the top three challenges to be:

  • Lack of flexibility: Traditional file storage appliances require dedicated hardware and don’t offer tight integration with collaborative cloud storage environments.
  • Poor utilization: Traditional file storage requires too much storage capacity for system fault tolerance, which reduces usable storage.
  • Inability to scale: Traditional storage solutions such as RAID-based arrays are gated by controllers and simply aren’t designed to easily expand to petabyte storage levels.

As a result, healthcare organizations are moving to object storage solutions that offer an architecture inherently designed for web scale storage environments. Specifically, object storage offers healthcare organizations the following advantages:

  • Simplified management, hardware independence and a choice of deployment options – private, public or hybrid cloud – lowers operational and hardware storage costs
  • Web-scale storage platform provides scale as needed and enables a pay as you go model
  • Efficient fault tolerance protects against site failures, node failures and multiple disk failures
  • Built-in security protects against digital and physical breaches
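The “efficient fault tolerance” advantage usually comes down to erasure coding rather than full replication: data is split into shards plus parity, so a shard (or the node holding it) can be lost without losing data and without storing multiple complete copies. A toy single-parity XOR sketch of the idea — production object stores use Reed-Solomon codes spread across many nodes and sites:

```python
def encode(shards):
    """Given equal-length data shards, return a parity shard (XOR of all of them)."""
    parity = bytes(len(shards[0]))
    for s in shards:
        parity = bytes(a ^ b for a, b in zip(parity, s))
    return parity

def recover(surviving, parity):
    """Rebuild the single missing data shard from the survivors plus parity."""
    missing = parity
    for s in surviving:
        missing = bytes(a ^ b for a, b in zip(missing, s))
    return missing

data = [b"node", b"fail", b"safe"]   # three 4-byte shards on three "nodes"
parity = encode(data)                # one extra shard of overhead, not 2x copies
print(recover([data[0], data[2]], parity))  # → b'fail' (lost shard rebuilt)
```

With three data shards and one parity shard, usable capacity is 75% of raw; protecting the same data with three full replicas would yield only 33%, which is the “poor utilization” problem cited above.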
Publish date: 03/22/17
Report

IBM Cloud Object Storage Provides the Scale and Integration Needed for Modern Genomics Infrastructure

For hospitals and medical research institutes, the ability to interpret genomics data and identify relevant therapies is key to provide better patient care through personalized medicine. Many such organizations are racing forward, analyzing patients’ genomic profiles to match more clinically actionable treatments using artificial intelligence (AI).

These rapid advancements in genomic research and personalized medicine are very exciting, but they are creating enormous data challenges for healthcare and life sciences organizations. High-throughput DNA sequencing machines can now process a human genome in a matter of hours at a cost approaching one thousand dollars. This is a huge drop from a cost of ten million dollars ten years ago and means the decline in genome sequencing cost has outpaced Moore’s Law (see chart). The result is an explosion in genomic data – driving the need for solutions that can affordably and securely store, access, share, analyze and archive enormous amounts of data in a timely manner.

Challenges include moving large volumes of genomic data from cost-effective archival storage to low-latency storage for analysis, in order to reduce the time needed to analyze genetic data. Currently, a comprehensive DNA sequence analysis takes days.

Sharing and interpreting vast amounts of unstructured data to find relationships between a patient’s genetic characteristics and potential therapies adds another layer of complexity. Determining connections requires evaluating data across numerous unstructured data sources, such as genomic sequencing data, medical articles, drug information and clinical trial data from multiple sources.

Unfortunately, the traditional file storage within most medical organizations doesn’t meet the needs of modern genomics. These systems can’t accommodate massive amounts of unstructured data and they don’t support both data archival and high-performance compute. They also don’t facilitate broad collaboration. Today, organizations require a new approach to genomics storage, one that enables:

  • Scalable and convenient cloud storage to accommodate rapid unstructured data growth
  • Seamless integration between affordable unstructured data storage, low latency storage, high performance compute, big data analytics and a cognitive healthcare platform to quickly analyze and find relationships among complex life science data types
  • A multi-tenant hybrid cloud to share and collaborate on sensitive patient data and findings
  • Privacy and protection to support regulatory compliance
Publish date: 03/22/17
Free Reports

Is Object Storage Right For Your Organization?

Is object storage right for your organization? Many companies are asking this question as they seek out storage solutions that support vast unstructured data growth throughout their organizations. Object storage is ideal for large-scale unstructured data storage because it easily scales to several petabytes and beyond by simply adding storage nodes. Object storage also provides high fault tolerance, simplified storage management and hardware independence—core capabilities that are essential to cost-effectively manage large-scale storage environments. Add to this built-in support for geographically distributed environments and it’s easy to see why object storage solutions are the preferred storage approach for multiple use cases such as cloud-native applications, highly scalable file backup, secure enterprise collaboration, active archival, content repositories and, increasingly, cognitive computing workloads such as big data analytics.

To help you decide if object storage is right for your company and to help you understand how to apply various storage technologies, we have created a table below that positions object storage relative to block storage and file storage.

As the table shows, there are several factors that differentiate block, file and object storage. An easy way to think about the differences is the following: block storage is necessary for critical applications where storage performance is the key consideration; file storage is well-suited for highly scalable shared file systems; and object storage is ideal when cloud-scale capacity and convenience, as well as reliability and geographically distributed access, are the major storage requirements.
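That rule of thumb can be written down directly. A small illustrative helper — the requirement labels are our own shorthand for the criteria above, not an industry taxonomy:

```python
def recommend_storage(requirements):
    """Map a set of workload requirements to block, file or object storage,
    following the rough guidance above: performance-critical apps -> block,
    shared file systems -> file, cloud-scale capacity / geo access -> object."""
    if "low_latency" in requirements:
        return "block"   # critical apps where performance is the key consideration
    if "shared_file_system" in requirements:
        return "file"    # highly scalable shared file systems
    if requirements & {"petabyte_scale", "geo_distributed_access"}:
        return "object"  # cloud-scale capacity and distributed access
    return "file"        # a sensible general-purpose default

print(recommend_storage({"petabyte_scale"}))                 # → object
print(recommend_storage({"low_latency", "petabyte_scale"}))  # → block (critical apps win)
```

Real selection involves more dimensions (cost per GB, protocol support, consistency needs), but the precedence shown here mirrors the positioning in the table.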

Publish date: 12/30/16