
Profiles/Reports

Recently Added

Free Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real time, all the time. Whether buying tickets or hunting for the best deal from an online retailer, people expect data to be current and the best information to be at their fingertips. Businesses, whether they sell products or services, are expected to keep up, and having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments because the world's 24x7 real-time demands cannot wait for legacy ERP and CRM applications to be rewritten. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that operates with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is a fast-growing and very popular example of an application environment that uses in-memory database technology to process massive amounts of real-time data quickly. Its in-memory computing engine allows HANA to process data held in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage, which also enables greater scalability, disaster tolerance, and data protection.
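
To make the persistence point concrete, here is a minimal Python sketch, our own illustration rather than anything from SAP, of the general pattern in-memory databases use: serve every read from RAM while write-ahead logging each change to durable storage, so the in-memory state can be rebuilt after a restart.

    # Conceptual sketch of in-memory storage with a durable redo log.
    # Illustrates the general write-ahead-logging pattern, not SAP HANA
    # internals or APIs.
    import json
    import os

    class InMemoryStore:
        def __init__(self, log_path="redo.log"):
            self.data = {}              # all reads are served from RAM
            self.log_path = log_path
            self._replay()              # rebuild state after a restart

        def _replay(self):
            if os.path.exists(self.log_path):
                with open(self.log_path) as f:
                    for line in f:
                        key, value = json.loads(line)
                        self.data[key] = value

        def put(self, key, value):
            # Write-ahead: make the change durable before applying it in
            # memory, so acknowledged writes survive a power loss.
            with open(self.log_path, "a") as f:
                f.write(json.dumps([key, value]) + "\n")
                f.flush()
                os.fsync(f.fileno())
            self.data[key] = value

        def get(self, key):
            return self.data.get(key)   # RAM-speed lookup, no disk read

    store = InMemoryStore()
    store.put("order-1001", {"qty": 3, "status": "shipped"})
    print(store.get("order-1001"))      # {'qty': 3, 'status': 'shipped'}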

SAP HANA is available on-premises as a dedicated appliance or as best-in-class components through the SAP Tailored Datacenter Integration (TDI) program. TDI has become the more popular HANA option because it offers the flexibility to leverage existing resources, such as data protection infrastructure, and a level of scalability the appliance approach lacked. Hewlett Packard Enterprise (HPE) has long been the leader in mission-critical infrastructure for SAP environments, and SUSE has long been the leader in mission-critical Linux operating systems for them. Micro Focus, a long-standing strategic partner of HPE, has worked with HPE on unique hardware and software integrations that enable a complete, robust, end-to-end data protection environment for SAP. Because one of SAP HANA's key value propositions is its ability to integrate with legacy databases, it makes the most sense to use a flexible data protection solution from Micro Focus and HPE that covers both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18
Free Reports / Profile

HPE InfoSight: Cross-stack Analytics

Accurate and action-oriented predictive analytics have long been the Holy Grail of IT management. Predictive analytics that bring together large amounts of real-time data with powerful analytical capabilities have the potential to provide IT managers with real-time, data-driven insights into the health and performance of their overall environment, enabling them to anticipate and remediate looming issues and optimize resource utilization. While these potential benefits have long been understood, it has only been recently that major innovations in cloud, Internet of Things (IoT), data science, and AI/machine learning have paved the way for predictive analytics to become a reality in the data center.

The IoT now enables companies to collect and monitor real-time sensor and operational data at the edge, whether from online financial systems, retail locations, or the factory floor. This raw data is typically streamed to the cloud, where it can be tabulated and analyzed. Powerful advances in edge-to-cloud networks and global learning capabilities make the cloud an optimal location for the analytics to take place. Informed by data science and increasingly driven by AI and machine learning technologies, these analytics can help IT managers monitor key system metrics and understand how well specific infrastructure elements, such as servers or storage, are performing.
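
As a concrete picture of that edge-to-cloud flow, the short Python sketch below shows an edge agent sampling a metric and shipping a batch to a cloud collector for central analysis. The collector URL and the payload schema are placeholders of our own, not any particular vendor's API.

    # Illustrative edge-to-cloud telemetry agent. The collector endpoint
    # and payload shape are hypothetical placeholders.
    import json
    import random
    import time
    from urllib.request import Request, urlopen

    COLLECTOR_URL = "https://collector.example.com/ingest"   # placeholder

    def read_sensor():
        # Stand-in for a real sensor read, e.g. storage latency in ms.
        return {"metric": "storage_latency_ms",
                "value": round(random.uniform(0.5, 5.0), 2),
                "ts": time.time()}

    def ship(batch):
        # Stream one batch of raw samples to the cloud for analysis.
        req = Request(COLLECTOR_URL, data=json.dumps(batch).encode(),
                      headers={"Content-Type": "application/json"})
        urlopen(req, timeout=5)

    batch = [read_sensor() for _ in range(60)]   # one sample per second
    ship(batch)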


But analytics that are focused on a single infrastructure element at a time can only go so far. Sure, it is helpful to monitor the health and performance of specific IT resources, such as CPU heartbeat or storage latency, but infrastructure resources do not operate independently or in isolation. Analytics must go beyond one dimension, and take into account how resources such as servers and storage interact with and depend on one another. This is especially critical in virtualized infrastructures, in which the interaction of virtual machines with hosts, networks and storage makes IT management even more challenging. Ideally, using the power of AI, analytics can cross these various layers of the IT stack to reveal the impact of resource interactions and interdependencies among all the layers. This would take analytics to a whole new level, transcending the limits of human intelligence to enable dynamic, multi-dimensional analysis of complex, virtualized IT environments.
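
A toy example makes the idea tangible. The Python sketch below (requires 3.10+) is a deliberately simplified stand-in for the AI-driven analysis described above, not anything HPE InfoSight actually runs: it correlates per-layer metrics against observed VM latency to suggest which layer of the stack is implicated.

    # Toy cross-stack analysis: find which layer's metric best tracks
    # the VM latency a user experiences. A simplified stand-in for the
    # multi-layer analytics described above.
    from statistics import correlation   # Python 3.10+

    vm_latency = [4, 5, 9, 12, 6, 5, 11, 13]            # ms, per interval
    layers = {
        "host_cpu_ready_pct":  [1, 1, 2, 2, 1, 1, 2, 2],
        "network_drops":       [0, 0, 0, 1, 0, 0, 0, 1],
        "storage_queue_depth": [2, 3, 8, 11, 4, 3, 10, 12],
    }

    scores = {name: correlation(series, vm_latency)
              for name, series in layers.items()}
    culprit = max(scores, key=scores.get)
    print(f"Strongest correlate of VM latency: {culprit} "
          f"(r = {scores[culprit]:.2f})")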

Think about the implications of AI-driven, cross-stack analytics for IT management. For example, such a capability has the potential to transform technical support from a reactive, always-playing-catch-up function to a proactive and forward-looking capability. In this scenario, built-in analytics are capable of connecting the dots between infrastructure layers to automatically anticipate, diagnose, and fix technical issues before they become major problems. Cross-layer analytics might also help to improve system performance by predicting looming configuration issues and recommending changes to address them.


One product, HPE InfoSight, is already embracing these possibilities, bringing AI-driven, cross-layer analytics to virtualized environments today. HPE InfoSight has proven its value in delivering predictive storage analytics to customers for many years, and it is now extending those capabilities across the infrastructure stack. In this piece we’ll explore the key characteristics customers should look for in an analytics solution for virtual infrastructure, then look at the HPE InfoSight architecture and its capabilities and how they are helping customers transform IT management in virtualized environments. Specifically, we will show how one customer uses cross-stack analytics delivered by HPE InfoSight to save tremendous time and money in their HPE 3PAR Storage environment.

Publish date: 06/28/18
Free Reports / Report

HPE and Micro Focus Data Protection for Azure Stack

Hybrid cloud is increasingly gaining popularity among enterprise IT buyers, as companies recognize and begin to validate its benefits. With a hybrid cloud, organizations can take advantage of the elasticity and agility of the public cloud, especially for new cloud-native apps, while continuing to run their businesses in the near term on their existing apps on premises. Users gain the choice of deploying new and existing workloads in the public cloud or the data center, wherever it makes the most sense, and the flexibility to migrate them as needed. A hybrid cloud significantly eases the transition to the cloud, enabling organizations to compete in the new cloud-driven world while preserving current IT investments. With these benefits in mind, well over 80% of organizations we recently surveyed are in the process of moving or planning a move to a hybrid cloud infrastructure.


In this brave new world, Microsoft Azure and Azure Stack are increasingly being adopted as the foundation for companies’ hybrid cloud infrastructure. Microsoft Azure is a leading public cloud offering that, based on Taneja Group research, consistently ranks neck-and-neck with Amazon Web Services in enterprise adoption, with more than 50% of companies using or planning to use Azure within the next two years. Azure Stack enables organizations to deliver Azure services from their own data center. Delivered as an integrated solution on HPE ProLiant servers, Azure Stack allows customers to run Azure-compatible apps on premises and to address use cases that benefit from a hybrid deployment. Together, Azure and Azure Stack provide a natural and relatively frictionless path for Microsoft Windows customers to move to the cloud, along with support for new cloud-native tools and services that allow customers to take full advantage of cloud agility and scalability.
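
That consistency comes from Azure and Azure Stack exposing the same Azure Resource Manager (ARM) REST interface, so the same deployment can target either cloud simply by switching endpoints. The Python sketch below illustrates the idea; the Azure Stack endpoint shown is a typical default, the api-version and token handling are simplified, and a real deployment would normally go through the Azure SDK or CLI.

    # Sketch of "write once, deploy to either cloud": the same ARM
    # template is PUT to whichever Resource Manager endpoint is in scope.
    # Endpoints are typical defaults; auth is stubbed for brevity.
    import json
    from urllib.request import Request, urlopen

    ENDPOINTS = {
        "azure":       "https://management.azure.com",
        "azure_stack": "https://management.local.azurestack.external",
    }

    def deploy(cloud, subscription, group, name, template, token):
        url = (f"{ENDPOINTS[cloud]}/subscriptions/{subscription}"
               f"/resourcegroups/{group}/providers/Microsoft.Resources"
               f"/deployments/{name}?api-version=2019-10-01")
        body = json.dumps({"properties": {"template": template,
                                          "mode": "Incremental"}})
        req = Request(url, data=body.encode(), method="PUT",
                      headers={"Authorization": f"Bearer {token}",
                               "Content-Type": "application/json"})
        return urlopen(req, timeout=30)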


As organizations move critical apps and data to the cloud, data protection quickly becomes a key requirement. But as buyers evaluate solutions, they often find that cloud providers’ built-in backup tools lack the flexibility, breadth of coverage, app awareness and enterprise capabilities they have become accustomed to on premises. As a result, companies look to other vendors, often their on-premises providers, to meet their data protection needs. As we’ll see, Micro Focus Data Protector offers a fully integrated, robust and comprehensive solution for backup and recovery on HPE Azure Stack.


In this piece we’ll further explore the need for data protection in a hybrid cloud environment, and examine the specific backup and recovery approaches that buyers are looking for, as revealed in our recent research. Then we’ll briefly examine what makes Micro Focus Data Protector an ideal solution for protecting an organization’s key information assets in an HPE Azure Stack hybrid cloud setting.

Publish date: 06/18/18
Report

Hedvig Takes Your Storage to Hybrid and Multi-Cloud

With data growth exploding and on-premises IT costs creeping ever higher, an increasing number of organizations are taking a serious look at adopting cloud infrastructure for their business data and applications. Among other things, they are attracted to benefits like near-infinite scalability, greater agility and a pay-as-you-go model for consuming IT resources. These advantages are already driving new infrastructure spending on public and private clouds, which is growing at double-digit rates even as spending on traditional, non-cloud IT infrastructure continues to decline.


While most companies we speak with are already developing cloud-native apps in Amazon Web Services (AWS) or Microsoft Azure, a much smaller number have actually deployed typical backend business apps in the public cloud. What’s preventing them from taking this next step? As it turns out, one of the biggest hurdles is productively deploying existing data storage in the cloud. Public clouds cannot fully support the range of storage protocols, data services and use cases that companies’ key business apps rely on, making it difficult and less useful to move these workloads to the cloud. Some organizations consider reengineering their applications for cloud-native storage, but this is both costly and time consuming, and may not even deliver the results they are looking for. Based on recent Taneja Group research, IT buyers want a simple path for lifting and transferring their app data to the cloud, where it can be supported for both primary and secondary use cases. They are also looking to run many workloads flexibly in a hybrid cloud deployment while maintaining the level of data security and governance they enjoy on premises.


In addition to these technical requirements, companies must also weigh potential business costs, such as the risk of getting locked into a single provider. Our research reveals that customers are increasingly concerned about this risk, which is exacerbated by a lack of data mobility among various on-premises and public cloud infrastructures.


Fortunately, the founding team at Hedvig understands these customer needs and set out more than five years ago to address them. The result is the Hedvig Distributed Storage Platform (DSP), a unified programmable data fabric that allows customers to simply and securely deploy any type of workload and application data in a hybrid or multi-cloud environment. Built on software-defined technology, Hedvig DSP enables your existing workloads, whether based on block, file or object storage, to take advantage of cloud scalability and agility today, without the expense and delays of a major reengineering effort. With Hedvig, IT teams can automatically and dynamically provision storage assets using just software on standard x86 servers, whether in your own private cloud or a public cloud IaaS environment. Hedvig lets your workloads move freely between different public and private cloud environments, avoiding lock-in and allowing you to choose the cloud best suited to each application and use case. Hedvig can support your primary storage needs, and it also supports tier-2 storage so that you can back up your data on the same platform.
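
To show what a programmable data fabric looks like from an operator’s point of view, here is a purely hypothetical Python sketch of provisioning a block virtual disk through a REST call instead of zoning hardware. The endpoint, path, and field names are our own invention for illustration, not Hedvig’s actual API.

    # Hypothetical software-defined provisioning call: the API path and
    # request fields are invented for illustration, not Hedvig's API.
    import json
    from urllib.request import Request, urlopen

    def provision_virtual_disk(api_base, name, size_gb, protocol,
                               replicas, token):
        spec = {"name": name,
                "sizeGb": size_gb,
                "protocol": protocol,            # "block", "nfs", "s3", ...
                "replicationFactor": replicas}   # copies across nodes/clouds
        req = Request(f"{api_base}/v1/vdisks",
                      data=json.dumps(spec).encode(),
                      headers={"Authorization": f"Bearer {token}",
                               "Content-Type": "application/json"})
        return urlopen(req, timeout=10)

    # The same call works whether the cluster's nodes run on x86 servers
    # in a private data center or as IaaS instances in a public cloud.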


In this piece, we’ll learn more about what IT professionals are looking for in cloud storage solutions, based on our research findings. We’ll then focus specifically on Hedvig storage for hybrid and multi-cloud environments to help you decide whether and how their solutions can meet your primary and secondary storage needs.

Publish date: 03/26/18