Taneja Group | backup+storage
Trusted Business Advisors, Expert Technology Analysts

Items Tagged: backup+storage

Profiles/Reports

IBM ProtecTIER: Optimized for Tivoli Storage Manager

Deduplication took the market by storm several years ago, and backup hasn’t been the same since. By eradicating duplicate data in duplication-prone backups, deduplication made it practical to store large amounts of backup data on disk instead of tape. In short order, a number of vendors marched into the market spotlight offering products with tremendous efficiency claims, great throughput rates, and greater tolerance for the often erratic throughput of backup jobs that had long been a thorn in the side of traditional tape. Today, deduplicating backup storage appliances are a common sight in data centers of all types and sizes.

But deduplicating data is a tricky science. It is often not as simple as finding matching runs of similar data. Backup applications and ongoing modifications to data can sprinkle data streams with mismatched bits and pieces, making deduplication much more challenging. The problem is worst for Virtual Tape Libraries (VTLs) that emulate traditional tape. Because a VTL presents itself as tape, backup applications apply all of their traditional tape formatting to the data stream. That formatting is designed to compensate for tape’s shortcomings and to give applications faster, better access to data on tape, but it creates noise for deduplication.

The best products on the market recognize this challenge and have built “parsers” for every backup application – technology that recognizes the metadata within the backup stream and enables the backup storage appliance to read around it.
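To make the idea concrete, here is a minimal sketch, in Python, of why interleaved backup-stream metadata defeats chunk-matching deduplication and how a format-aware parser restores it. The header layout, chunk size, and fixed-size chunking are all invented for illustration; this is neither TSM’s real tape format nor ProtecTIER’s actual deduplication algorithm.

import hashlib
import random

CHUNK = 4096                    # fixed-size chunks keep the example simple
HDR = b"TAPEHDR:"               # hypothetical 8-byte metadata marker

def chunk_hashes(stream):
    """Hash each fixed-size chunk; chunks with matching hashes deduplicate."""
    return {hashlib.sha256(stream[i:i + CHUNK]).hexdigest()
            for i in range(0, len(stream), CHUNK)}

def strip_metadata(stream):
    """The 'parser': recognize the metadata records and read around them."""
    out, i = bytearray(), 0
    while i < len(stream):
        if stream[i:i + len(HDR)] == HDR:
            i += 16             # skip the 8-byte marker plus 8-byte session id
        else:
            out.append(stream[i])
            i += 1
    return bytes(out)

random.seed(0)
data = random.randbytes(16 * 1024)        # 16 KB of "user data"

def backup_stream(session):
    """The same user data, but with a session-unique header every 1000 bytes."""
    out = bytearray()
    for i in range(0, len(data), 1000):
        out += HDR + session.to_bytes(8, "big") + data[i:i + 1000]
    return bytes(out)

b1, b2 = backup_stream(1), backup_stream(2)
naive = chunk_hashes(b1) & chunk_hashes(b2)
parsed = chunk_hashes(strip_metadata(b1)) & chunk_hashes(strip_metadata(b2))
print("shared chunks without parser:", len(naive))    # 0 -- headers spoil every chunk
print("shared chunks with parser:   ", len(parsed))   # 4 -- the streams match fully

With the headers left in place, every chunk of two otherwise identical backups differs, so nothing deduplicates; once the parser reads around the session-unique headers, the streams match chunk for chunk.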

In 2012, IBM introduced a parser for its leading backup application, Tivoli Storage Manager (TSM), into the ProtecTIER line of backup storage solutions. TSM has long had a reputation for a noisy tape format. That format enables richer data interaction than many competitors’ products, but it creates enormous challenges for deduplication.

At IBM’s invitation, in November of 2012, Taneja Group Labs put ProtecTIER through its paces to evaluate whether this parser makes a difference for the ProtecTIER family. Our finding: clearly it does. In our highly structured lab exercise, ProtecTIER looked fully poised to deliver advertised deduplication for TSM environments. In our case, we observed a reasonable 10X to 20X deduplication range for real-world Microsoft Exchange data; at those ratios, ten to twenty full backups consume roughly the disk capacity of one.

Publish date: 03/29/13
news

The evolution of data deduplication technology continues

New technologies often come to market as value-add features alongside existing mainstream products, and are later merged into those products. It’s often the only way a new vendor can bring a product to market, as happened with data deduplication technology.

  • Premiered: 04/11/13
  • Author: Arun Taneja
  • Published: Tech Target: Search Data Backup
Topic(s): Data Domain, Diligent Technologies, Exagrid, Quantum, Sepaton, Data Deduplication, backup storage, Avamar, EMC, Data protection, Virtual Server Environment, Primary Storage, Storwize, IBM, NetApp, Dell, Ocarina Networks
Profiles/Reports

EMC Avamar 7 - Protecting Data at Scale in the Virtual Data Center (TVS)

Storing digital data has long been a perilous task. Not only are stored digital bits subject to the catastrophic failure of the devices they rest upon, but the shared nature of digital bits exposes them to error and even intentional destruction. In the virtual infrastructure, the dangers and challenges subtly shift. Data is more highly consolidated, and more systems depend wholly on shared data repositories; this increases data risks. With many virtual machines connecting to a single shared storage pool, IO and storage performance become incredibly precious resources; this complicates backup, and backup IO can cripple a busy infrastructure. Backup is a more important operation than ever before, but it is also fundamentally more challenging than ever before.

Fortunately, the industry learned this lesson quickly in the early days of virtualization, and has aggressively innovated to bring tools and technologies to bear on the challenge of backup and recovery for virtualized environments. APIs have unlocked more direct access to data, and products have finally come to market that make protection easier to use and more compatible with the dynamic, mobile workloads of the virtual data center. Nonetheless, differences abound between product offerings, often rooted in the subtleties of architecture – architectures that ultimately determine whether a backup product is best suited for SMB-sized needs, or whether a solution can scale to support the large enterprise.
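As an aside on what "more direct access to data" means in practice, here is a toy Python sketch of the changed-block-tracking idea that such virtualization APIs commonly expose. The class, method names, and sizes are invented for illustration; this is not any hypervisor’s or backup vendor’s actual API.

BLOCK = 4096

class VirtualDisk:
    def __init__(self, num_blocks):
        self.blocks = [bytes(BLOCK)] * num_blocks   # zero-filled virtual disk
        self.changed = set()                        # indexes dirtied since last backup

    def write(self, index, data):
        self.blocks[index] = data
        self.changed.add(index)                     # the platform tracks the change

    def changed_blocks(self):
        """Hand the backup product only the blocks changed since last time."""
        delta = [(i, self.blocks[i]) for i in sorted(self.changed)]
        self.changed.clear()
        return delta

disk = VirtualDisk(num_blocks=25_000)               # ~100 MB virtual disk
backup = dict(enumerate(disk.blocks))               # initial full backup

disk.write(7, b"\x01" * BLOCK)                      # the guest dirties two blocks
disk.write(19_000, b"\x02" * BLOCK)

delta = disk.changed_blocks()
backup.update(delta)                                # incremental pass copies only the delta
print(f"blocks copied: {len(delta)} of {len(disk.blocks)}")   # 2 of 25000

For a consolidated virtual infrastructure where storage IO is precious, reading two blocks instead of twenty-five thousand is exactly the kind of relief these APIs provide.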

Moreover, within the virtual data center, TCO centers on resource efficiency, and a backup strategy can be one of the most significant determinants of that efficiency. On one hand, traditional backup simply does not work well here and can cripple efficiency: there is too much IO contention and application complexity in trying to carry a legacy physical-infrastructure backup approach over to the virtual infrastructure. On the other hand, there are a number of specialized point solutions designed to tackle some of the challenges of virtual infrastructure backup. But too often these products do not scale sufficiently, lack consolidated management, and stand to impose tremendous operational overhead as the customer’s environment and data grow. When taking a strategic look at the options, it often appears that backup approaches fly directly in the face of resource efficiency.

Publish date: 10/31/13
Profiles/Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real-time, all the time. Whether making ticket purchases or getting the best deal from online retailers, people expect data to be up to date, with the best information at their fingertips. Businesses are expected to meet this requirement, whether they sell products or services; having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments. The world’s 24x7 real-time demands cannot wait for legacy ERP and CRM application rewrites. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that could operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is a growing and very popular example of an application environment that uses in-memory database technology to process massive amounts of real-time data in a short time. The in-memory computing engine allows HANA to process data stored in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage to enable greater scalability, disaster tolerance, and data protection.
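The general pattern behind that last point: serve all reads and writes from RAM, but log every change to persistent storage and periodically write a full savepoint so the in-memory state can be rebuilt after a crash. Below is a minimal Python sketch of that log-plus-savepoint pattern; the class, file names, and formats are invented for illustration and are not HANA’s actual persistence implementation.

import json, os

class InMemoryStore:
    def __init__(self, log_path="redo.log", savepoint_path="savepoint.json"):
        self.log_path, self.savepoint_path = log_path, savepoint_path
        self.table = {}                      # all data lives in RAM
        self._recover()

    def put(self, key, value):
        # Durability first: append the change to the redo log on persistent
        # storage, then apply it in memory.
        with open(self.log_path, "a") as log:
            log.write(json.dumps({"k": key, "v": value}) + "\n")
            log.flush(); os.fsync(log.fileno())
        self.table[key] = value

    def get(self, key):
        return self.table.get(key)           # reads never touch disk

    def savepoint(self):
        # Persist a full snapshot, then truncate the log: recovery now needs
        # only the snapshot plus any log entries written after it.
        with open(self.savepoint_path, "w") as f:
            json.dump(self.table, f)
        open(self.log_path, "w").close()

    def _recover(self):
        # Rebuild RAM state: load the last savepoint, then replay the log.
        if os.path.exists(self.savepoint_path):
            with open(self.savepoint_path) as f:
                self.table = json.load(f)
        if os.path.exists(self.log_path):
            with open(self.log_path) as f:
                for line in f:
                    rec = json.loads(line)
                    self.table[rec["k"]] = rec["v"]

store = InMemoryStore()
store.put("order-42", {"qty": 3})            # survives a crash via the redo log
store.savepoint()                            # periodic snapshot bounds recovery time

In a production in-memory database the log and savepoint volumes live on exactly the kind of persistent shared storage the paragraph above describes, which is why that storage tier is central to HANA scalability and data protection.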

SAP HANA is available on-premises as a dedicated appliance or via best-in-class components through the SAP Tailored Datacenter Integration (TDI) program. The TDI environment has become the more popular HANA option, as it provides the flexibility to leverage legacy resources such as data protection infrastructure and enables a greater level of scalability that was lacking in the appliance approach. Hewlett Packard Enterprise (HPE) has long been the leader in providing mission-critical infrastructure for SAP environments, and SUSE has long been the leader in providing the mission-critical Linux operating system for them. Micro Focus has long been a strategic partner of HPE, and together they have delivered unique hardware and software integrations that enable a complete end-to-end, robust data protection environment for SAP. One of the key value propositions of SAP HANA is its ability to integrate with legacy databases, so it makes the most sense to leverage a flexible data protection solution from Micro Focus and HPE that covers both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18