Items Tagged: Big Data
SearchStorage.com spoke with EMC World attendees to better understand the challenges associated with storing and analyzing massive data sets.
With all the talk about cloud and big data, it’s hard to tell which comes first; but it just might be a cloud foundation that enables big data applications.
Quantum Corp., a provider of data protection and big data management products, this week announced general availability of three new products targeted at enterprise customers' backup, archive and management challenges. Enterprise environments are characterized by large, fast-growing data volumes that demand high-capacity storage, extensive scalability, and high performance from backup and archive solutions. They also need an integrated approach to monitoring and managing processes across sites and tiers of storage.
So many people publish end-of-the-year "trends" articles that I sometimes skip them. However, the eDiscovery industry is morphing so rapidly that if I didn't share my 2012 trends I'd be a slacker. We can't have that. The following nine points are some of the trends you need to know about in the complex field of eDiscovery heading into 2012.
Transforming Data Protection with vDPAS: A Paradigm Shift
The traditional approach to data protection is being challenged by a new breed of vendors with a more effective approach to data protection and availability. This new approach protects multiple systems, environments (physical, virtual and cloud) and applications in a single, highly efficient, virtualized storage pool – the fountain of life conceptualized by Taneja Group over five years ago.
This new approach to data protection and data availability is so radically different from the choices available in the marketplace today that it warrants its own category. Taneja Group calls the new category Virtualized Data Protection and Availability Storage (vDPAS), which enables SLA-driven data protection and data management across the entire production environment – regardless of application, system or deployment model.
DDN’s latest SFA offerings make big data capabilities more accessible and affordable than ever before.
DataDirect Networks (DDN) launched two storage systems for people who want to start small in their approach to “big data.”
NetApp's DDP goes a long way toward protecting active big data environments.
According to Amazon, size doesn't really matter in its definition of big data. Instead, it's about the threshold at which distributed processing solutions like Elastic MapReduce start providing cost-effective development and operations.
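That threshold framing can be made concrete with a back-of-the-envelope check: a job suggests a distributed solution once it no longer fits a single machine's batch window. The sketch below is purely illustrative – the throughput and window figures are assumptions, not Amazon's numbers.

```python
def breakeven_hours(data_gb, single_node_gb_per_hour=50, batch_window_hours=8):
    """Estimate single-node processing time for a data set and flag
    whether it exceeds the batch window, suggesting a distributed
    approach (e.g., Elastic MapReduce) may be more cost-effective.

    All default figures are hypothetical assumptions for illustration.
    """
    hours = data_gb / single_node_gb_per_hour
    return hours, hours > batch_window_hours

# A 2 TB data set at an assumed 50 GB/hour per node:
hours, go_distributed = breakeven_hours(2000)
print(f"{hours:.1f} hours on one node; distribute: {go_distributed}")
# → 40.0 hours on one node; distribute: True
```

The point of the sketch is that "big data" here is an economic boundary, not a byte count: the same 2 TB set would fall below the threshold on a faster node or with a wider batch window.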
Recent innovators like GridIron are tackling extreme performance with shared appliances that sit in front of shared storage environments.
The four-part cloud tips series by Arun Taneja, founder and president of Taneja Group, concludes with an explanation of which applications are best suited for the cloud. When moving applications to the cloud, what are the basic ground rules for determining which applications won't work and which ones might from a performance standpoint?
One of the issues created by holding voluminous data sets (also called "big data") in your storage environment is how to protect that data.
Cleversafe is announcing a platform integration with Hadoop as part of its upcoming Cleversafe 3.0 release, scheduled for later this year.
BEAVERTON, Ore. – July 18, 2012 – The InfiniBand® Trade Association (IBTA), a global organization dedicated to maintaining and furthering the InfiniBand™ specification, today released an analyst report from Taneja Group demonstrating the continued market growth of InfiniBand products in HPC and InfiniBand's emergence as an attractive choice for the core of the enterprise data center.
With “big data” storage performance requirements that dwarf the needs of most IT shops, the National Center for Supercomputing Applications (NCSA) knew that scale-out file storage was the only realistic option for its Industrial Forge (iForge) high-performance computing (HPC) service.
HP IBRIX X9730 Storage – Unleashing High Value Archives
Future-proofed archives on scale-out architectures
IT organizations have long been challenged to come up with effective strategies for storing important data and content. This is fast becoming much more critical as data gains value through ongoing analysis, future reference, and monetizable reuse, while simultaneously becoming subject to more compliance and regulatory requirements. Yet each aspect of data value brings an even bigger challenge for IT: delivering effective strategies for sufficient and lasting preservation and access, over a longer period of time than ever before.
Unfortunately, it’s more common to see a revolving door of temporary solutions that must be replaced every few years. These “solutions” are in direct opposition to crafting long-term strategies, and they are also responsible for a web of complexity that creates one of the greatest management and operational costs in today’s enterprise data center. Worse yet, the data management practices that result from this revolving-door storage approach also limit the accessibility and use of growing and aging data. In an age of increasingly critical and value-laden data sets, broken storage strategies stand to break the business.
In this technology profile, we’ll look at the consequences of this broken approach in the face of a monumental shift in the value of data and content. We will consider the challenges facing an organization trying to position itself for better long-term storage and meaningful reuse of large amounts of data. We’ll examine one solution that clearly stands out in the face of these challenges – the HP IBRIX X9730 Storage, which has big data storage and high-value archiving directly in its crosshairs.
DDN was one of the first vendors to realize that big data comes in two parts: high performance/scalability and a complex value chain.
Dell is going after the big data market in a big way with its DX Object Storage Platform and partner RainStor’s big data repository.
Symantec Corp. today announced an Apache Hadoop add-on capability for its Veritas Cluster File System to help run "big data" analytics on storage area networks instead of scale-out, commodity servers using local storage.
Swedish startup Compuverde AB entered the object storage market today with software that runs on any commodity hardware and a software gateway that delivers file services.