Items Tagged: Enterprise
Taneja Group Report Finds SEPATON's Purpose-Built Data Protection Platform a Winner for Large Database Backup and Recovery in the Enterprise
Review confirms SEPATON uniquely optimized to lower the complexity, cost, and risk of protecting massive databases.
- Premiered: 11/30/11
- Author: Taneja Group
- Published: MarketWatch.com
Flash tech powers cloud storage and gives enterprises a boost
Flash technology can be attractive for cloud service providers -- and enterprises -- looking for a highly available, scalable and efficient data storage solution.
- Premiered: 05/21/12
- Author: Jeff Byrne
- Published: SearchStorage
Mainstreaming High IO Performance with Flash Cache
Servers are growing more and more powerful, but decades-old storage controller technology has not kept pace. This puts storage at an extreme disadvantage just as companies are growing huge infrastructures of virtual and physical systems, applications of all types and very large volumes of primary data.
- Premiered: 12/20/12
- Author: Taneja Group
- Published: Enterprise Storage Forum
Microsoft Buys StorSimple - Deepening Integration Between Microsoft Azure and the Enterprise Data Center
Microsoft officially acquired StorSimple on November 15, 2012. StorSimple was a relatively young startup that had been shipping products for about 18 months. Why did Microsoft buy StorSimple? What is the strategy behind the purchase? Where will Microsoft take this newly acquired technology? These are some of the questions we are being asked at present. Here is our view....
Enterprise Storage Options
Local, Shared, Cloud-Based, & Beyond.
- Premiered: 03/04/13
- Author: Taneja Group
- Published: PCToday.com
Enterprise Cloud Collaboration: What It Is, What It's Not
Confused about collaboration? Don't be. Enterprise Cloud Collaboration (ECC) is a highly available and scalable cloud file sharing solution with centralized IT control.
- Premiered: 04/05/13
- Author: Taneja Group
Enterprise File Collaboration
There is a lot of confusion in the enterprise file collaboration space right now, and frankly, vendors can't afford it.
Taneja Group Webcast on Enterprise Cloud Backup
The big question in consumer online backup is whether to pick JustCloud or Mozy. Enterprise online backup has a lot more riding on it.
Accelerate the Edge: The New Distributed Enterprise
Here at Taneja Group, we think enterprises ought to be agile and flexible, quickly leveraging their forward-deployed remote and branch offices as a competitive weapon to gain market opportunity with the presence that only a physical office brings. Unfortunately, we’ve seen that as a business expands its footprint into new regions it can also suffer major growing pains, with slow, unwieldy deployments and an increasingly costly IT burden.
- Premiered: 09/26/13
- Author: Mike Matchett
- Published: InfoStor
Enterprise Online Backup: What Administrators Need to Know
We define online backup as using the cloud to provide users with a highly scalable and elastic repository for their backup data. This is true for all online backup users, but the enterprise has specific requirements and risks that consumer and SMB customers do not share. Consumer and SMB customers – including education and small government agencies – primarily require acceptable backup and restore performance, plus security and compliance reporting, in their online backup. The enterprise needs these things too, but it also faces additional pressure from backing up larger data sets across multiple remote sites and/or storage systems and applications. Here is what to know when you consider cloud backup vendors for your enterprise backup system.
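As a back-of-the-envelope illustration of that scale pressure, the sketch below estimates how long a backup transfer takes over a WAN link. The data set sizes, change rate, dedupe ratio, and link speeds are our own illustrative assumptions, not figures from the report or any vendor.

```python
# Rough backup-window estimate: how long does it take to move a data set to the cloud?
# All figures below are hypothetical assumptions for illustration only.

def backup_hours(data_tb: float, wan_mbps: float, change_rate: float = 1.0,
                 dedupe_ratio: float = 1.0) -> float:
    """Hours needed to send (data_tb * change_rate / dedupe_ratio) over a WAN link."""
    bytes_to_send = data_tb * 1e12 * change_rate / dedupe_ratio
    link_bytes_per_sec = wan_mbps * 1e6 / 8
    return bytes_to_send / link_bytes_per_sec / 3600

# A 2 TB SMB data set over 100 Mbps: a full backup fits in a weekend.
print(f"SMB full, 100 Mbps:           {backup_hours(2, 100):7.1f} h")
# A 200 TB enterprise data set over 1 Gbps: a full backup takes weeks...
print(f"Enterprise full, 1 Gbps:      {backup_hours(200, 1000):7.1f} h")
# ...which is why enterprises lean on incremental-forever backup plus deduplication.
print(f"Enterprise incr. with dedupe: {backup_hours(200, 1000, 0.02, 10):7.1f} h")
```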
Seagate Kinetic: A Disruptive Force in the Storage Market
In the twelve months since announcing Kinetic Open Storage, Seagate has taken some major steps toward making the Kinetic vision a reality.
- Premiered: 11/03/14
- Author: Jeff Byrne
INFINIDAT Expands InfiniBox Enterprise Storage Family, Introduces Multi-Petabyte Scale Unified
INFINIDAT, a leader in high performance, highly available enterprise storage solutions, announces the expansion of its revolutionary InfiniBox family of storage arrays with the addition of two new capabilities and a new midrange model.
- Premiered: 09/21/15
- Author: Taneja Group
- Published: Business Wire
Enterprise Storage that Simply Runs and Runs: Infinidat Infinibox Delivers Incredible New Standard
Storage should be the most reliable thing in the data center, not the least. What data centers today need is enterprise storage that affordably delivers at least 7-9's of reliability, at scale. That's a goal of roughly three seconds of anticipated unavailability per year - better availability than most data centers themselves deliver.
Data availability is the key attribute enterprises need most to maximize their enterprise storage value, especially as data volumes grow to ever-larger scale. Yet traditional enterprise storage solutions aren’t keeping pace with the growing need for greater than the oft-touted 5-9’s of storage reliability, instead deferring to layered-on methods like additional replication copies, which can drive up latency and cost, or settling for cold tiering, which saps performance and reduces accessibility.
Within the array, as stored data volumes ramp up and disk capacities increase, RAID and related volume/LUN schemes begin to break down: longer and longer disk rebuild times create large windows of vulnerability to unrecoverable data loss. Other vulnerabilities can arise from poor (or at best, default) array designs, software issues, and well-intentioned but sometimes fatal human management and administration errors. Any new storage solution has to address all of these potential vulnerabilities.
In this report we will look at what we mean by 7-9’s exactly, and what’s really needed to provide 7-9’s of availability for storage. We’ll then examine how Infinidat in particular is delivering on that demanding requirement for those enterprises that require cost-effective enterprise storage at scale.
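For reference, the quick sketch below works out the availability arithmetic behind the "nines" shorthand used above; it assumes nothing beyond a 365.25-day year.

```python
# Annual downtime implied by "N nines" of availability.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def downtime_seconds_per_year(nines: int) -> float:
    availability = 1 - 10 ** (-nines)
    return (1 - availability) * SECONDS_PER_YEAR

for n in (5, 7):
    print(f"{n} nines ({1 - 10 ** -n:.7%} available): "
          f"{downtime_seconds_per_year(n):8.2f} s/year of downtime")
# 5 nines allows roughly 316 seconds (about 5 minutes) of downtime per year;
# 7 nines allows roughly 3.2 seconds, the "roughly three seconds" cited above.
```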
Considerations for cloud data migration and security
Cloud data migration is one of the biggest hurdles facing enterprises that want to take a stake in the cloud.
- Premiered: 09/28/15
- Author: Taneja Group
- Published: TechTarget: Search Cloud Storage
Flash storage market remains a tsunami
The flash storage market is poised for rapid growth into enterprise data centers as costs drop and solid-state drive density and capacity expand.
- Premiered: 10/02/15
- Author: Mike Matchett
- Published: TechTarget: Search Solid State Storage
Now Big Data Works for Every Enterprise: Pepperdata Adds Missing Performance QoS to Hadoop
While a few well-publicized web 2.0 companies are taking great advantage of foundational big data solutions that they themselves created (e.g. Hadoop), most traditional enterprise IT shops are still thinking about how to practically deploy their first business-impacting big data applications – or have dived in and are now struggling mightily to effectively manage a large Hadoop cluster in the middle of their production data center. This has led to the common perception that realistic big data business value may yet be just out of reach for most organizations – especially those that need to run lean and mean on both staffing and resources.
This new big data ecosystem consists of scale-out platforms, cutting-edge open source solutions, and massive storage, all of which are inherently difficult for traditional IT shops to optimally manage in production – especially with still-evolving ecosystem management capabilities. In addition, most organizations need to run large clusters supporting multiple users and applications to control both capital and operational costs. Yet there are no native ways to guarantee, control, or even gain visibility into workload-level performance within Hadoop. Even if most organizations didn’t face a real gap in high-end skills and deep expertise, there still isn’t any practical way for additional experts to tweak and tune mixed Hadoop workload environments to meet production performance SLAs.
At the same time, the competitive game of mining value from big data has moved from day-long batch ELT/ETL jobs feeding downstream BI systems to more interactive user queries and “real time” business process applications. Live performance matters as much now in big data as it does in any other data center solution. Ensuring multi-tenant workload performance within Hadoop is why Pepperdata, a cluster performance optimization solution, is critical to the success of enterprise big data initiatives.
In this report we’ll look deeper into today’s Hadoop deployment challenges and learn how performance optimization capabilities are not only necessary for big data success in enterprise production environments, but can open up new opportunities to mine additional business value. We’ll look at Pepperdata’s unique performance solution that enables successful Hadoop adoption for the common enterprise. We’ll also examine how it inherently provides deep visibility and reporting into who is doing what/when for troubleshooting, chargeback and other management needs. Because Pepperdata’s function is essential and unique, not to mention its compelling net value, it should be a checklist item in any data center Hadoop implementation.
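To make the visibility and chargeback idea concrete, the sketch below aggregates per-user resource consumption from job-level records. It is a generic illustration of the kind of accounting a multi-tenant Hadoop cluster needs, not Pepperdata's product or API; the field names, sample values, and unit rates are our own assumptions.

```python
# Hypothetical illustration only: roll up per-user resource consumption
# (memory-seconds and vcore-seconds) from job-level records for chargeback.
from collections import defaultdict

jobs = [
    {"user": "etl_batch", "mb_seconds": 9.2e8, "vcore_seconds": 5.1e5},
    {"user": "bi_adhoc",  "mb_seconds": 1.4e8, "vcore_seconds": 9.0e4},
    {"user": "etl_batch", "mb_seconds": 3.3e8, "vcore_seconds": 2.2e5},
]

usage = defaultdict(lambda: {"mb_seconds": 0.0, "vcore_seconds": 0.0})
for job in jobs:
    usage[job["user"]]["mb_seconds"] += job["mb_seconds"]
    usage[job["user"]]["vcore_seconds"] += job["vcore_seconds"]

# Convert to a simple chargeback figure using made-up unit rates.
RATE_PER_GB_HOUR, RATE_PER_VCORE_HOUR = 0.002, 0.01
for user, u in usage.items():
    gb_hours = u["mb_seconds"] / 1024 / 3600
    vcore_hours = u["vcore_seconds"] / 3600
    cost = gb_hours * RATE_PER_GB_HOUR + vcore_hours * RATE_PER_VCORE_HOUR
    print(f"{user:10s}  {gb_hours:10.1f} GB-h  {vcore_hours:8.1f} vcore-h  ${cost:8.2f}")
```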
Hyperconverged Infrastructure for Demanding Enterprise Workloads
Date: January 26, 2016 at 8:00 am PT / 11:00 am ET
Presenters: Jeff Kato, Taneja Group; Sachin Chheda, Nutanix
In 2015, IT organizations large and small found hyperconverged infrastructure built with the right technology foundation to be a capable replacement for traditional servers and standalone storage in the datacenter for demanding enterprise applications such as critical databases.
Join the experts at Taneja Group and Nutanix in this technical webinar covering the infrastructure requirements for today’s demanding enterprise applications and how a virtualization-first methodology, along with a software-based approach to storage, can change the way enterprise applications are hosted and served.
The session will also cover the different architectural options for hyperconverged infrastructure and walk through real-world implementations of the Nutanix Xtreme Computing Platform by IT organizations for demanding enterprise applications such as Oracle and SAP.
- Premiered: 01/26/16
- Location: OnDemand
- Speaker(s): Jeff Kato, Taneja Group; Sachin Chheda, Nutanix
HP Converges to Mine Big Value from Big Data
The promise of Big Data is engaging the imagination of corporations everywhere, even before they look to big data solutions to help handle the accelerating pressure of proliferating new data sources and tremendously increasing amounts of raw and unstructured data. Corporations have long been highly competitive about analytically extracting value from their structured transactional data streams, but are now trying to competitively differentiate with new big data applications that span multiple kinds of data types, run in business-interactive timeframes, and deliver more operationally focused (even transactional) value based on multiple types of processing.
This has led to some major re-thinking about the best approach, or journey, to success with Big Data. As mainstream enterprises are learning how and where their inevitable Big Data opportunities lie (and they all have them – ignoring them is simply not a viable strategy), they are also finding that wholesale adoption of a completely open source approach can lead to many unexpected pitfalls, like data islands, batch-analytical timeframes, multiplying scope, and constrained application value. Most of all, IT simply cannot completely halt existing processes and overnight transition to a different core business model or data platform.
But big data is already here. Companies must figure out how to process different kinds of data, stay on top of their big data “deluge”, remain agile, mine value, and yet hopefully leverage existing staff, resources and analytical investments. Some of the important questions include:
1. How to build the really exciting and valuable applications that combine multiple kinds of analytical and machine learning processing across multiple big data types?
2. How to avoid setting up two, three, or more parallel environments that require many copies of big data, complex dataflows, and far too many new highly skilled experts?
We find that HP Haven presents an intriguing, proven, and enterprise-ready approach by converging structured, unstructured, machine-generated, and other kinds of analytical solutions (many of them already proven, world-class solutions in their own right) into a single big data processing platform. This enables leveraging existing data, applications, and experts while offering opportunities to analyze data sets in multiple ways. With this solution it’s possible to build applications that take advantage of multiple data sources and multiple proven solutions, and easily “mash-up” whatever might be envisioned. However, the HP Haven approach doesn’t force a monolithic adoption; rather, it can be deployed and built up as a customer’s big data journey progresses.
To help understand the IT challenges of big data and explore this new kind of enterprise data center platform opportunity, we’ve created this special vendor spotlight report. We start with a significant extract from the premium Taneja Group Enterprise Hadoop Infrastructure Market Landscape report to help understand the larger Hadoop market perspective. Then within that context we will review the HP Haven solution for Big Data and look at how it addresses key challenges while presenting a platform on which enterprises can develop their new big data opportunities.
Nutanix XCP For Demanding Enterprise Workloads: Making Infrastructure Invisible for Tier-1 Ent. Apps
Virtualization has matured and become widely adopted in the enterprise market. Approximately three in every five physical servers are deployed in a virtualized environment. After two waves of virtualization, it is safe to assume that a high percentage of business applications are running in virtualized environments. The last applications to be deployed into the virtualized environment were the tier-1 apps. Examples of these include CRM and ERP environments running SAP NetWeaver, Oracle database and applications, and Microsoft SQL Server. In many 24x7 service industries, Microsoft Exchange and SharePoint are also considered tier-1 applications.
The initial approach to building virtualized environments that could handle these tier-1 applications was to build highly tuned infrastructure using best-of-breed three-tier architectures, in which compute, storage, and networking were selected and customized for each type of workload. Newer shared storage systems have increasingly adopted virtualized all-flash and hybrid architectures, which has allowed organizations to mix a few tier-1 workloads within the same traditional infrastructure and still meet stringent SLA requirements.
Enter now the new product category of enterprise-capable HyperConverged Infrastructure (HCI). With HCI, the traditional three-tier architecture has been collapsed into a single software-based system that is purpose-built for virtualization. In these solutions, the hypervisor, compute, storage, and advanced data services are integrated into an x86 industry-standard building block. These modern scale-out hyperconverged systems combine a flash-first software-defined storage architecture with VM-centric ease-of-use that far exceeds any three-tier approach on the market today. These attributes have made HCI very popular and one of the fastest growing product segments in the market today.
HCI products have been very popular with medium-sized companies and for specific workloads such as VDI or test and development. After a few years of hardening and maturing, are these products ready to tackle enterprise tier-1 applications? In this paper we will take a closer look at the Nutanix Xtreme Computing Platform (XCP) and explore how its capabilities stack up to tier-1 application workload requirements.
Nutanix was a pioneer in HCI and is widely considered the market and visionary leader of this rapidly growing segment. Nutanix has recently announced the next step - a vision of the product beyond HCI. With this concept they plan to make the entire virtualized infrastructure invisible to IT consumers. This will encompass all three of the popular hypervisors: VMware, Hyper-V and their own Acropolis Hypervisor. Nutanix has enabled app mobility between different hypervisors, a unique capability across converged systems and HCI alike. This Solution Profile will focus on the Nutanix XCP platform and the key capabilities that make it suitable for tier-1 enterprise applications. With the most recent release, we have found compelling features appropriate for most tier-1 application workloads. Combined with the value proposition of a web-scale modular architecture, this provides an easy pathway to data-center transformation that businesses of all sizes should take advantage of.
Google enterprise cloud challenge unlikely to be solved soon
The Internet giant predicts a tipping point for adoption of its public cloud offering, despite lingering questions about the size of its enterprise customer base and maturity of the platform.
- Premiered: 02/11/16
- Author: Taneja Group
- Published: TechTarget: Search Cloud Computing