Free Reports

For Lowest Cost and Greatest Agility, Choose Software-Defined Data Center Architectures

The era of the software-defined data center is upon us. The promise of a software-defined strategy is a virtualized data center created from compute, network and storage building blocks. A Software-Defined Data Center (SDDC) moves the provisioning, management, and other advanced features into the software layer so that the entire system delivers improved agility and greater cost savings. This tectonic shift in the data center is as great as the shift to virtualized servers during the last decade and may prove to be greater in the long run.

This approach to IT infrastructure started over a decade ago, when compute virtualization, through the use of hypervisors, turned compute and server platforms into software objects. The same approach to virtualizing resources is now gaining acceptance in networking and storage architectures. When these virtualized resources are combined with overarching automation software, a business can virtualize and manage an entire data center. Abstracting, pooling, and running compute, storage, and networking functions virtually on shared hardware brings unprecedented agility and flexibility to the data center while driving costs down.

In this paper, Taneja Group takes an in-depth look at the capital expenditure (CapEx) savings that can be achieved by creating a state-of-the-art SDDC based on currently available technology. We performed a comparative cost study of two different environments: one using the latest software solutions from VMware running on industry-standard and white-label hardware components, and the other running a more typical VMware virtualization environment on mostly traditional, feature-rich hardware components, which we describe as the Hardware-Dependent Data Center (HDDC). The CapEx savings we calculated were based on creating brand-new (Greenfield) data centers for each scenario (an additional comparison for upgrading an existing data center is included at the end of this white paper).

Our analysis indicates that dramatic cost savings, up to 49%, can be realized by using today’s SDDC capabilities combined with low-cost white-label hardware, compared to a best-in-class HDDC. In addition, just by adopting VMware Virtual SAN and NSX in their current virtualized environment, users can lower CapEx by 32%. By investing in SDDC technology, businesses can be assured that their data center solution can be more easily upgraded and enhanced over the life of the hardware, providing considerable investment protection. Rapidly improving SDDC software capabilities, combined with declining hardware prices, promise to reduce total costs even further as complex embedded hardware features move into a more agile and flexible software environment.
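
As a rough illustration of the arithmetic behind these figures, the short Python sketch below applies the 49% and 32% savings rates to a hypothetical HDDC baseline. The dollar amount is a placeholder chosen for illustration, not a figure from the study.

    # Hypothetical illustration of the CapEx comparison described above.
    # The baseline dollar figure is a placeholder, not a number from the study.
    hddc_greenfield_capex = 1_000_000  # assumed best-in-class HDDC build-out cost

    sddc_savings_rate = 0.49      # up to 49% savings for a full SDDC on white-label hardware
    vsan_nsx_savings_rate = 0.32  # 32% savings from adopting Virtual SAN and NSX alone

    sddc_capex = hddc_greenfield_capex * (1 - sddc_savings_rate)
    vsan_nsx_capex = hddc_greenfield_capex * (1 - vsan_nsx_savings_rate)

    print(f"HDDC baseline:           ${hddc_greenfield_capex:,.0f}")
    print(f"Full SDDC (white-label): ${sddc_capex:,.0f}")
    print(f"Virtual SAN + NSX only:  ${vsan_nsx_capex:,.0f}")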

Depending on customers’ needs and the choice of deployment model, an SDDC architecture offers a full spectrum of savings. VMware Virtual SAN is software-defined storage that pools inexpensive hard drives and common solid-state drives installed in the virtualization hosts, lowering capital expenses and simplifying the overall storage architecture. VMware NSX aims to make the same advances for network virtualization by moving security and network functions into a software layer that can run on top of any physical network equipment. The SDDC approach is to “virtualize everything” and add data center automation, enabling a private cloud with connectors to the public cloud if needed.
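
To make the storage-pooling idea concrete, here is a minimal Python sketch of how host-local drives aggregate into one shared pool, and how a mirrored failures-to-tolerate policy consumes roughly twice the raw capacity. The host count, drive sizes, and simple 2x mirroring overhead are assumptions for illustration, not Virtual SAN sizing guidance.

    # Minimal sketch: pooled capacity of a hypothetical Virtual SAN cluster.
    # All sizes and the 2x mirroring overhead (failures-to-tolerate = 1) are
    # illustrative assumptions, not sizing guidance.
    hosts = 4
    hdd_per_host_tb = 4 * 2.0   # four 2 TB hard drives per host contribute capacity
    ssd_per_host_tb = 0.4       # one 400 GB SSD per host serves as cache, not capacity

    raw_capacity_tb = hosts * hdd_per_host_tb
    cache_tb = hosts * ssd_per_host_tb
    mirror_copies = 2           # a mirrored policy keeps two copies of each object
    usable_capacity_tb = raw_capacity_tb / mirror_copies

    print(f"Raw pooled capacity:      {raw_capacity_tb:.1f} TB")
    print(f"Cache tier (not pooled):  {cache_tb:.1f} TB")
    print(f"Approx. usable (mirrored): {usable_capacity_tb:.1f} TB")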

Publish date: 08/19/14

Redefining the Economics of Enterprise Storage

Enterprise storage has long delivered superb levels of performance, availability, scalability, and data management. But enterprise storage has always come at an exceptional price, and this has made it unobtainable for many use cases and customers.

Most recently, Dell introduced a new, small-footprint storage array, the Dell Storage SC Series powered by Compellent technology, which continues to leverage proven Dell Compellent capabilities on Intel technology in an all-new form factor. The SC4020 is also the densest Compellent product ever: an all-in-one storage array that packs 24 drive bays and dual controllers into only 2 rack units of space. While the Intel-powered SC4020 has more modest scalability than current Compellent products, this array marks a radical shift in the pricing of Dell’s enterprise technology and aims to open up Dell Compellent storage to an entire market of smaller customers, as well as large-customer use cases where enterprise storage was previously too expensive.

Publish date: 05/05/14

Data Defined Storage: Building on the Benefits of Software Defined Storage

At its core, Software Defined Storage decouples storage management from the physical storage system. In practice, Software Defined Storage vendors implement the solution using a variety of technologies: orchestration layers, virtual appliances, and server-side products are all in the market now. These solutions are valuable for storage administrators who struggle to manage multiple storage systems in the data center as well as remote data repositories.

What Software Defined Storage does not do is yield more value for the data under its control, or address global information governance requirements. To that end, Data Defined Storage yields the benefits of Software Defined Storage while also reducing data risk and increasing data value throughout the distributed data infrastructure. In this report we will explore how Tarmin’s GridBank Data Management Platform provides Software Defined Storage benefits and also drives reduced risk and added business value for distributed unstructured data with Data Defined Storage. 

Publish date: 03/17/14

Fibre Channel: The Proven and Reliable Workhorse for Enterprise Storage Networks

Mission-critical assets such as virtualized and database applications demand a proven enterprise storage protocol to meet their performance and reliability needs. Fibre Channel has long filled that need for most customers, and for good reason. Unlike competing protocols, Fibre Channel was specifically designed for storage networking, and engineered to deliver high levels of reliability and availability as well as consistent and predictable performance for enterprise applications. As a result, Fibre Channel has been the most widely used enterprise protocol for many years.

But with the widespread deployment of 10GbE technology, some customers have explored the use of other block protocols, such as iSCSI and Fibre Channel over Ethernet (FCoE), or file protocols such as NAS. Others have looked to InfiniBand, which is now being touted as a storage networking solution. In marketing the strengths of these protocols, vendors often promote feeds and speeds, such as raw line rates, as a key advantage for storage networking. However, as we’ll see, there is much more to storage networking than raw speed.

It turns out that on an enterprise buyer’s scorecard, raw speed doesn’t even make the cut as an evaluation criterion. Instead, decision makers focus on factors such as a solution’s demonstrated reliability, latency, and track record in supporting Tier 1 applications. When it comes to these requirements, no other protocol can measure up to the inherent strengths of Fibre Channel in enterprise storage environments.

Despite its long, successful track record, Fibre Channel does not always get the attention and visibility that other protocols receive. While it may not be winning the media wars, Fibre Channel offers customers a clear and compelling value proposition as a storage networking solution. Looking ahead, Fibre Channel also presents an enticing technology roadmap, even as it continues to meet the storage needs of today’s most critical business applications.

In this paper, we’ll begin by looking at the key requirements customers should look for in a commercial storage protocol. We’ll then examine the technology capabilities and advantages of Fibre Channel relative to other protocols, and discuss how those translate to business benefits. Since not all vendor implementations are created equal, we’ll call out the solution set of one vendor – QLogic – as we discuss each of the requirements, highlighting it as an example of a Fibre Channel offering that goes well beyond the norm.

Publish date: 02/28/14

Storage That Turns Big Data Into a Bigger Asset: Data-Defined Storage With Tarmin GridBank

UPDATED FOR 2014: Today’s storage industry is as stubbornly media-centric as it has always been: SAN, NAS, DAS; disk, cloud, tape. This centricity forces IT to deal with storage infrastructure on media-centric terms. But the storage infrastructure should really serve data to customers, not media; it’s the data that yields business value, while the media should be an internal IT architectural choice.

Storage-media-focused solutions support the business only indirectly, by providing optimized storage infrastructure for data. Intelligent data services, on the other hand, provide direct business value by optimizing data utility, availability, and management. The shift from traditional thinking is to provide logically ideal data storage for the people who own and use the data first, while freeing up the underlying storage infrastructure design to be optimized for efficiency as desired. Ideal data storage would be global in access and scalability, secure and resilient, and would inherently support data-driven management and applications.

Done well, this data-centric approach would yield significant competitive advantage by leveraging an enterprise’s valuable intellectual property: its vast and growing amounts of unstructured data. If this can be done by building on the company’s existing data storage and best practices, the business can quickly increase profitability, achieve faster time-to-market, and gain tremendous agility for innovation and competitiveness.

Tarmin, with its GridBank Data Management Platform, is a leading proponent of the data-centric approach. It is firmly focused on managing data for global accessibility, protection, and strategic value. In this product profile, we’ll explore how a data-centric approach drives business value. We’ll then examine how GridBank was architected expressly around the concept that data storage should be a means for extracting business value from data, rather than a dead-end data dump.

Publish date: 02/17/14

Glassbeam SCALAR: Making Sense of the Internet of Things

In this new era of big data, sensors can be included in almost everything made. This “Internet of Things” generates mountains of new data with exciting potential to be turned into invaluable information. As a vendor, if you make a product or solution that, when deployed by your customers, produces data about its ongoing status, condition, activity, usage, location, or practically any other useful information, you can now potentially derive deep intelligence that can be used to improve your products and services, better satisfy your customers, improve your margins, and grow market share.

For example, such information about a given customer’s usage of your product and its current operating condition, combined with knowledge gleaned from all of your customers’ experiences, enables you to be predictive about possible issues and proactive about addressing them. Not only can you come to know more about a customer’s implementation of your solution than the customer does, but you can also make decisions about new features and capabilities based on hard data.

The key to gaining value from this “Internet of Things” is the ability to make sense of the kind of big data it generates. One set of current solutions addresses data about internal IT operations, including “logfile” analysis tools like Splunk and VMware Log Insight. These are designed for technical users focused on recent time-series and event data to improve tactical problem “time-to-resolution”. However, the big data derived from customer implementations is generally multi-structured, spread across streams of whole “bundles” of complexly related files that can easily grow to PBs over time. Business users and analysts are not necessarily IT-skilled (e.g., marketing, support, sales), and to be useful the resulting analysis must be both more sophisticated and capable of handling dynamic changes to incoming data formats.

Click "Available Now" to read the full analyst opinion.

Publish date: 10/21/13