Includes Backup/Recovery, Archiving, DPM, VTL, CDP, Data De-duplication, DRM.
Data is the lifeblood of an enterprise. And yet data has been protected in essentially the same fashion for the past two decades: by backing it up to tape and sending the tapes offsite. This method alone is no longer adequate, and a spate of new technologies has become available in the last five years. These new technologies are already transforming the way data is protected, how long it is kept online, and how it is archived. Recovery Management has emerged as a new discipline focused on recovering data rather than merely copying it. New compliance requirements effectively require companies of all sizes to upgrade their data protection infrastructures or face huge fines. The pace of innovation in this space is torrid. Taneja Group covers this space from end to end and has defined many of the new categories that are now considered the norm. The analysts who cover this space have deep industry backgrounds in developing and marketing these technologies.
We define online backup as using the cloud to provide users with a highly scalable and elastic repository for their backup data. This is true across all online backup users, but the enterprise has specific requirements, and faces some risks, that consumer and SMB customers do not share. Consumer and SMB customers – including education and small government agencies – primarily require acceptable backup and restore performance, plus security and compliance reporting. The enterprise needs these things too, but it also faces the added pressure of backing up larger data sets across multiple remote sites, storage systems, and applications. Here is what to know when you consider cloud backup vendors for your enterprise backup system.
Taneja Group and InfoStor jointly ran a survey asking IT managers about their experience with corporate file sharing. Taneja Group defines corporate file sharing as the ability to share large numbers of files between business users across networks and mobile devices.
File sharing heavily intersects with Bring Your Own Device (BYOD) and the cloud. BYOD is the phenomenon of employees using personal mobile devices for personal and business applications and data access. File sharing as a business usage is closely associated with BYOD as end users seek to easily share files between their own and others’ multiple computing devices.
File sharing is also bound up with cloud usage. File sharing on mobile devices does not strictly require file sharing services using the cloud; basic secure sharing can be done via VPN just as one would email a file or share its pathname over the LAN. However, this solution is less than ideal for file sharing because it is poorly scalable and lacks any file sharing application functionality.
In contrast, most file sharing products use the cloud because the environment is highly scalable and delivers application functionality such as file versioning and locking. Many file sharing products also use the cloud to host a shared file repository, and most integrate with Active Directory and other SAML-based access management applications. Given a huge growth in data files and in mobile access needs, this approach is far superior to simply sending files using VPN connections.
This is no surprise to end users, who happily use file sharing applications like Dropbox to easily share files. Yet not all file sharing applications are created equal and consumer file sharing applications can threaten corporate data security. Vendors are quickly developing business- and enterprise-level file sharing applications in response to valid concerns about file sharing security, scalability, management, usability and compliance.
These are serious issues and should be serious concerns for IT in businesses of any size. However, our survey found that although some respondents have file sharing solutions and policies already in place, many do not. Some respondents have solid short-term plans to put them in place, but others have no plans at all. Why? Taneja Group has observed that when IT denies a need for secure file sharing in a BYOD environment, it usually lacks the time, sense of urgency, executive support and/or budget to deal effectively with the problem.
For more on file collaboration/BYOD issues and vendors, download Taneja Group’s File Collaboration Landscape Market Report.
A technology publication that shall go unnamed posted the news last year that Amazon Glacier was a “tape-killing cloud” and that it would “devastate the [tape] industry.”
Not exactly. We do believe that smaller tape implementations are going the way of the dodo bird. Cloud backup is quickly replacing small standalone tape drives and autoloaders for daily backup, and as low-end tape equipment ages IT replaces it with cloud for long-term backup retention.
However, tape housed in mid-sized and enterprise-scale libraries is growing strongly in several high-value computing and industry segments. Thus the question for IT becomes not “Should I use tape?” but “When should I invest in tape libraries?”
The Dropboxes of the world are fueling the BYOD (Bring Your Own Device) phenomenon. IT needs to replace consumer-level file collaboration applications with an enterprise file collaboration (EFC) application and its robust management console. However, while IT may be anxious about BYOD and insecure file sharing, it is not usually the most pressing need on a full agenda. IT needs to understand how an EFC solution is an opportunity to solve a very large problem, and why it should take advantage of the solution now.
Traditional data protection is three decades old and is definitely showing its age. Poor management oversight, data growth, virtualization, data silos, and stricter SLAs all conspire to strain traditional backup to the breaking point.
Traditional backup usually follows a set pattern: a full baseline backup, daily incremental backups, and a weekly full backup. When backup volumes were smaller and fewer, this process worked well enough. But a once-daily operation creates backup data that is missing up to 20 hours or more of current data input, making it impossible to restore to a meaningful recovery point objective (RPO). The obvious solution is continuous backup with frequent snapshot recovery points, but this type of backup product can be expensive and resource-intensive, and IT often reserves it for a few Tier 1 transactional applications. What happens, then, to large and popular business applications such as email, back-office files, and content management systems? Failed backup and recovery can still devastate a business.
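The exposure from a once-daily schedule is easy to quantify. The sketch below is a minimal illustration of the arithmetic, assuming a hypothetical nightly backup at a fixed time; the times used are illustrative only.

```python
from datetime import datetime, time, timedelta

def data_loss_window(failure: datetime, backup_time: time) -> timedelta:
    """Hours of new data lost if a failure occurs at `failure`,
    assuming the most recent nightly backup ran at `backup_time`."""
    last_backup = datetime.combine(failure.date(), backup_time)
    if last_backup > failure:          # last night's run, not tonight's
        last_backup -= timedelta(days=1)
    return failure - last_backup

# A crash at 6 p.m., after a 10 p.m. nightly backup, loses 20 hours of data.
loss = data_loss_window(datetime(2014, 3, 5, 18, 0), time(22, 0))
print(loss)  # 20:00:00
```

Shrinking that window means shrinking the backup interval, which is exactly what snapshot-based continuous protection does: with 15-minute snapshots the same function returns at most 15 minutes of exposure.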
This article will look at why traditional backup is so difficult to do well these days, and why the risk and expense are so high.
Backup applications with large user bases have been vendor cash cows because their customers are reluctant to change such deeply embedded products. As long as the backup worked, it was out of sight and out of mind.
But the field is rapidly changing.
The push to virtualize applications left traditional backup foundering. Traditional backup in the virtual arena suffered from heavy operational overhead at the server, application host, network, and storage levels. The growing number of VMs and the volume of virtualized data had a serious impact on storage resources. For example, each VMDK file represented an entire VM file system image, typically at least 2GB in size. These file sizes led to issues with bandwidth, monitoring, and storage resources.
In response, some vendors developed innovative virtual backup products. They made virtual backup much more resource-efficient and easily manageable. Increased performance shrank backup window requirements, provided effective RPO and RTO, simplified the backup process and improved recovery integrity. These tools changed the virtual data protection landscape for the better.
However, many of these startups offered limited solutions that only supported a single type of hypervisor and several physical machines. This left virtual and physical networks essentially siloed – not to mention the problem of multiple point products creating even more silos within both environments. Managing cross-domain data protection using a variety of point products became inefficient and costly for IT.
Traditional backup makers also scrambled to add virtualization backup support and succeeded to a point, but only to a point. Their backup code base was written well before the mass appearance of the cloud and virtualization, and retrofitting existing applications only went so far toward providing scalability and integration. Nor could they solve a problem that has plagued IT since the early days of backup tape: restore assurance. It has always been risky to find out after the fact that the backup you depended on is not usable for recovery. With data sets doubling every 18 months, the risk of data loss has risen significantly.
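The "doubling every 18 months" figure compounds quickly, which is worth making concrete. This is a minimal sketch of the growth arithmetic; the 100 TB starting point is an illustrative assumption, not a figure from the article.

```python
def projected_data(size_tb: float, months: int, doubling_months: int = 18) -> float:
    """Project a data set's size given a fixed doubling period."""
    return size_tb * 2 ** (months / doubling_months)

# At an 18-month doubling rate, 100 TB today is ~400 TB in three years,
# so any fixed-size backup window or tape budget falls behind fast.
print(round(projected_data(100, 36)))  # 400
```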
More modern backup solves some of these problems but causes new ones. Modern backup offers automated scheduling, manual operations, policy setting, multiple types of backup targets, replication schemes, application optimization, and more. These are useful features but they are also costly and resource-hungry: roughly 30% of storage costs go to IT operations alone. Another problem with these new features is their complexity. It is difficult to optimize and monitor the data protection environment, leading to a conservative estimate of about 20% failure in backup or recovery jobs.
In addition, most data protection products offer average-to-poor awareness and integration into their backup tape and disk targets. This results in difficulty in setting and testing Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO) for business applications. The last thing that IT wants is to cripple application recovery, but it is challenging to set meaningful RTO and RPO settings across multiple environments and applications, and extremely difficult to test them.
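Testing objectives across environments is where the real difficulty lies, because each application has its own targets and each restore test produces its own measurements. The sketch below shows one simple way to compare the two; the application names, targets, and measured values are all hypothetical, chosen only to illustrate the comparison.

```python
from datetime import timedelta

# Hypothetical per-application objectives and measured results from a
# restore test; names and numbers are illustrative only.
objectives = {
    "email":     {"rto": timedelta(hours=4), "rpo": timedelta(hours=1)},
    "erp":       {"rto": timedelta(hours=2), "rpo": timedelta(minutes=15)},
    "fileshare": {"rto": timedelta(hours=8), "rpo": timedelta(hours=24)},
}

measured = {
    "email":     {"rto": timedelta(hours=3), "rpo": timedelta(hours=2)},
    "erp":       {"rto": timedelta(hours=1), "rpo": timedelta(minutes=10)},
    "fileshare": {"rto": timedelta(hours=6), "rpo": timedelta(hours=12)},
}

def failed_objectives(objectives, measured):
    """Return (app, metric) pairs where a restore test missed its target."""
    return [
        (app, metric)
        for app, targets in objectives.items()
        for metric, limit in targets.items()
        if measured[app][metric] > limit
    ]

print(failed_objectives(objectives, measured))  # [('email', 'rpo')]
```

Even this toy comparison surfaces the point in the text: a backup can succeed operationally (email restored within its RTO) while still missing its RPO, and without regular restore testing that gap stays invisible.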
Even newer VM backup products are inadequate for modern enterprise data centers with physical and virtual layers running critical applications. Combine this with complex and mixed IT environments and it presents a very serious challenge for IT professionals charged with protecting data and application productivity.
What we are seeing now is next generation data protection that protects both virtual and physical environments in one flexible platform. Dell AppAssure is a leading pioneer in this promising field. AppAssure is rewriting the data protection book from limited point products to a highly agile data center protection platform with continual backup, instantaneous restore, backup assurance and a host of additional benefits.