MCS-Behind The Firewall: A Study Of The Corporate eDiscovery Buyer (Abstract Only)
Taneja Group is pleased to announce the general availability of the 2011 Taneja Group report on corporate buying practices for eDiscovery. This extensive 90-page report is an invaluable resource in your marketing strategy, product planning, and competitive efforts in the challenging world of eDiscovery and Compliance.
Managing Information Explosion in Healthcare
Presented by Ashish Nadkarni, Senior Analyst and Consultant, Taneja Group
Recent social, regulatory and policy changes mean that the Healthcare Industry will face unprecedented growth in the use of electronic medical records (EMRs). This, coupled with the mandate for providers to become all electronic, means that Healthcare IT will be under tremendous pressure at least for the next decade in storing and managing the information explosion. Learn how IT can manage tiered retention of data while benefiting from the mega trends occurring in the industry today, including cloud, virtualization and mobility.
Corporate eDiscovery purchasers and influencers run the gamut of departments and disciplines. This can make selling eDiscovery into the corporation a tough row to hoe unless you know your buyer extremely well. This is particularly true of the IT eDiscovery buyer and influencer.
EMC will make changes based on this year’s limited VMAX SP release, but we think they are going in the right direction.
Hilarious videos of the personal pursuits Quest customers take up with all the time they save.
Taneja Group and Infostor: Big Data Survey 2012
Taneja Group and InfoStor jointly ran a survey asking IT managers about their big data experiences and roadmaps. We concluded that there is a great deal of uncertainty around big data: what it is, how to manage it, and whether it even belongs in the IT domain or with specialized application administrators.
Storing and managing large volumes of data certainly involves IT. However, “big data” is its own class: large data sets that are subjected to ongoing analytics and/or massive re-use. Some big data is structured into databases; most of it is unstructured. Big data operations continuously act upon large and growing volumes of data, which generates fast and frequent data movement between servers, networks and storage. Big data analytics in particular need fast and large feedback loops for decision-making as the specialized software tools analyze and reform data into a variety of views, reports and reformed data sets.
IT is rarely involved at the analytics administration level, but it is very involved at the storage level. Big data needs both high capacity and high performance, which requires storage with high-capacity disk and the ability to process storage IO very quickly. It must also be highly available, since big data is by definition active and important data. And it should be cost-effective as well, though it will not be inexpensive.
[Taneja Group discusses scale-out storage as a best practice solution to big data analytics in our report: “Big Data, Big Storage: Scale-Out NAS for Big Data Environments.” (http://bit.ly/UGCVjm)]
Optimizing Performance Across Systems and Storage: Best Practices with TeamQuest
In this paper, we’ll briefly review the challenges of assuring good performance in today’s competitive IT environment, and discuss what it takes to overcome these challenges to deploy appropriate end-to-end infrastructure and operationally deliver high-performance service levels. We’ll then introduce TeamQuest, a long-time leading vendor in IT Service Optimization that has recently expanded its world-class performance and capacity management capabilities with deep storage domain coverage. This new solution is unique in both its non-linear predictive modeling, leveraged to produce application-specific performance KPIs, and its comprehensive span of visibility and management that extends from applications all the way down into SAN storage systems. Ultimately, we’ll see how TeamQuest empowers IT to take full advantage of agility and efficiency solutions like infrastructure virtualization, even for the most performance-sensitive and storage-intensive applications.