Taneja Group | OLTP

Items Tagged: OLTP

Profiles/Reports

Maximizing Database Performance With Dell EqualLogic Hybrid Arrays

Today’s combination of rapidly accelerating demand for data and rapidly consolidating datacenter infrastructure makes choosing the right storage for each of your business applications more important, and more difficult, than ever. In our view, it’s time more of this burden was taken on by the SAN itself. In other words, it’s time for more SAN intelligence. The intelligent SAN should optimize all available storage resources automatically. In this profile we explore how dynamic, multi-tiered OLTP workloads test the limits of traditional manual storage tiering strategies and further strengthen the case for automated tiering on the SAN itself. Then we review Dell’s internal benchmark test results and speak with Carnival Cruise Lines, an EqualLogic customer, to evaluate how Dell’s hybrid SSD/SAS arrays deliver higher performance and lower overhead both in the lab and in the field.

Publish date: 05/23/11
news

Data lakes swim with golden information for analytics

First we had data. Then we had big data. Now we have data lakes. Will the murky depths prove bountiful?

  • Premiered: 04/14/15
  • Author: Mike Matchett
  • Published: TechTarget: Search Data Center
Topic(s): data lake, analytics, Big Data, Mike Matchett, TechTarget, Hadoop, Business Intelligence, BI, OLAP, OLTP, NoSQL, SQL, Optimization, ETL, IoT, Internet of Things, MapR, Project Myriad, YARN, Virtualization, Business Continuity, Disaster Recovery, DR, BC, data swamp, BlueData, Dataguise, HDFS, Hadoop Distributed File System, IBM
news

Kinetica Unveils GPU-accelerated Database for Analyzing Streaming Data with Enhanced Performance

Kinetica today announced the newest release of its distributed, in-memory database accelerated by GPUs that simultaneously ingests, explores, and visualizes streaming data.

  • Premiered: 09/21/16
  • Author: Taneja Group
  • Published: Business Wire
Topic(s): high availability, Mike Matchett, Kinetica, In-Memory, Security, IoT, Internet of Things, Data Management, OLTP, CPU, GPU, NVIDIA, Data Center, scalability, Apache, Hadoop, Apache Hadoop, Apache Kafka, Apache Spark, Apache NiFi, High Performance, cluster, Big Data, scale-out
Profiles/Reports

The Best All-Flash Array for SAP HANA

These days the world operates in real-time all the time. Whether making airline reservations or getting the best deal from an online retailer, data is expected to be up to date with the best information at your fingertips. Businesses are expected to meet this requirement, whether they sell products or services. Having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments. The world's 24X7 real-time demands cannot wait for legacy ERP and CRM application rewrites. Companies such as SAP devised ways to integrate disparate databases by building a single super-fast uber-database that could operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish. These capabilities enable businesses to succeed in the modern age, giving forward-thinking companies a real edge in innovation.

SAP HANA is an example of an application environment that uses in-memory database technology to process massive amounts of real-time data in a short time. The in-memory computing engine allows HANA to process data stored in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. SAP HANA can be deployed on-premises or in the cloud. Originally, on-premises HANA was available only as a dedicated appliance. Recently SAP has expanded support to best-in-class components through its SAP Tailored Datacenter Integration (TDI) program. In this solution profile, Taneja Group examined the storage requirements for HANA TDI environments and evaluated storage alternatives, including the HPE 3PAR StoreServ All Flash. We make a strong case for why all-flash arrays like the HPE 3PAR are a great fit for SAP HANA solutions.
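The idea of one in-memory database serving both transactional and analytic workloads can be illustrated with a minimal sketch. HANA itself is proprietary, so this example uses Python's built-in SQLite driver in `:memory:` mode as a stand-in for an in-memory engine; the table, columns, and figures are hypothetical, not drawn from the report.

```python
import sqlite3

# Open an in-memory database: all data lives in RAM
# (SQLite here is only a stand-in for an engine like HANA).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# OLTP side: small, frequent transactional writes.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("EMEA", 120.0), ("APAC", 75.5), ("EMEA", 310.25), ("AMER", 42.0)],
)
conn.commit()

# OLAP side: an analytic aggregate over the same live rows,
# with no separate warehouse or ETL step in between.
cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region")
print(cur.fetchall())
```

The point of the sketch is only that the analytic query reads the very rows the transactional inserts just committed, which is the dual OLAP/OLTP behavior the abstract describes.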

Why discuss storage for an in-memory database? The reason is simple: RAM loses its contents when the power goes off. This volatility means that persistent shared storage is at the heart of the HANA architecture for scalability, disaster tolerance, and data protection. The performance attributes of your shared storage dictate how many nodes you can cluster into a SAP HANA environment, which in turn affects your business outcomes: greater scalability means more real-time information can be processed. SAP HANA's shared storage workload is write-intensive, demanding low latency for small files and high sequential throughput for large files. However, the overall storage capacity required is not extreme, which makes this workload an ideal fit for all-flash arrays that can meet the performance requirements with the smallest number of SSDs. Typically you would need roughly 10X as many spinning-media drives just to meet the performance requirements, leaving you with a massive amount of capacity that cannot be used for other purposes.
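The drive-count reasoning above can be made concrete with a short sizing sketch. All the per-drive figures below are illustrative assumptions, not measured values or vendor specs from the report; the point is only that a performance-bound spinning-disk build ends up massively over-provisioned on capacity.

```python
import math

def drives_needed(target_iops: int, capacity_needed_tb: float,
                  iops_per_drive: int, tb_per_drive: float) -> int:
    """Return how many drives satisfy BOTH the performance (IOPS)
    and the capacity requirement: the larger of the two counts."""
    for_perf = math.ceil(target_iops / iops_per_drive)
    for_cap = math.ceil(capacity_needed_tb / tb_per_drive)
    return max(for_perf, for_cap)

# Hypothetical workload: 200,000 IOPS, 20 TB usable capacity.
# Assumed per-drive figures (illustrative only): ~100,000 IOPS
# per SSD vs ~200 IOPS per 15K-RPM spinning drive.
ssds = drives_needed(200_000, 20, 100_000, 2.0)  # capacity-bound
hdds = drives_needed(200_000, 20, 200, 1.2)      # performance-bound

print(ssds, hdds)
```

With these assumed numbers the flash build is sized by capacity (10 drives), while the spinning-disk build needs 1,000 drives just to hit the IOPS target, leaving over a petabyte of raw capacity stranded; the exact ratio depends entirely on the assumptions.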

In this study, we examined five leading all-flash arrays including the HPE 3PAR StoreServ 8450 All Flash. We found that the unique architecture of the 3PAR array could meet HANA workload requirements with up to 73% fewer SSDs, 76% less power, and 60% less rack space than the alternative AFAs we evaluated. 

Publish date: 06/07/17
Profiles/Reports

Enterprise Cloud Platform Ideal for Database Apps: Nutanix Hosting Oracle Penetrates Tier 1

Creating an Enterprise Cloud with HyperConverged Infrastructure (HCI) makes terrific sense (and “cents”) for a wide range of corporations tired of integrating and managing complex stacks of IT infrastructure. Replacing siloed infrastructure and going far beyond simple pre-converged racks of traditional hardware, HCI greatly simplifies IT, frees valuable staff from integrating and babysitting heterogeneous solutions so they can focus on adding value to the business, and can vastly improve quality of service in all directions. Today, we find HCI solutions being deployed as an Enterprise Cloud platform in corporate data centers, even for mission-critical tier-1 database workloads.

However, like public clouds and server virtualization before it, HCI has had to grow and mature. Initially, HCI solutions had to prove themselves in small and medium-size organizations, and on rank-and-file applications. Now, five-plus years of evolution by vendors like Nutanix have matured HCI into a full tier-1 enterprise application platform presenting the best features of public clouds, including ease of management, modular scalability, and agile user provisioning. Perhaps the best examples of enterprise mission-critical workloads are business applications layered on Oracle Database, and as we'll see in this report, Nutanix now makes an ideal platform for enterprise-grade databases and database-powered applications.

In fact, we find that Nutanix’s mature platform not only can, by its natural mixed-workload design, host a complete tier-1 application stack (including the database), but also offers significant advantages because the whole application stack is “convergently” hosted. The resulting opportunity for both IT and the business user is striking. Those feeling tied down to legacy architectures, and those previously interested in the benefits of plain Converged Infrastructure, will want to evaluate how mature HCI can now take them farther, faster.

In the full report, we explore in detail how Nutanix supports and accelerates serious Oracle database-driven applications (e.g., ERP, CRM) at the heart of most businesses and production data centers. In this summary, we review how the Nutanix Enterprise Cloud Platform is also an ideal enterprise data center platform for the whole application stack, consolidating many if not most workloads in the data center.

Publish date: 06/30/17
Profiles/Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real-time all the time. Whether making ticket purchases or getting the best deal from online retailers, data is expected to be up to date with the best information at your fingertips. Businesses are expected to meet this requirement, whether they sell products or services. Having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments. The world's 24X7 real-time demands cannot wait for legacy ERP and CRM application rewrites. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that could operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is a growing and very popular example of an application environment that uses in-memory database technology and allows the processing of massive amounts of real-time data in a short time. The in-memory computing engine allows HANA to process data stored in RAM as opposed to reading it from a disk. At the heart of SAP HANA is a database that operates on both OLAP and OLTP database workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage to enable greater scalability, disaster tolerance, and data protection.

SAP HANA is available on-premises as a dedicated appliance and/or via best-in-class components through the SAP Tailored Datacenter Integration (TDI) program. The TDI environment has become the more popular HANA option as it provides the flexibility to leverage legacy resources such as data protection infrastructure and enables a greater level of scalability that was lacking in the appliance approach. Hewlett Packard Enterprise (HPE) has long been the leader in providing mission-critical infrastructure for SAP environments. SUSE has long been the leader in providing the mission-critical Linux operating system for SAP environments. Micro Focus has long been a strategic partner of HPE, and together they have leveraged unique hardware and software integrations that enable a complete end-to-end, robust data protection environment for SAP. One of the key value propositions of SAP HANA is its ability to integrate with legacy databases. Therefore, it makes the most sense to leverage a flexible data protection solution from Micro Focus and HPE to cover both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18