Taneja Group | OLAP

Items Tagged: OLAP

News / Blog

Should Big Data Become An IT Service? Consider Pentaho v5 Business Analytics

All signs point to the next 12 months as the period when the bulk of the market will finally tackle big data opportunities rather than just talk about them. One enabler, we think, has been better platforms and friendlier toolsets (e.g. virtual Hadoop, SQL-like queries). Another is that analytics offerings keep getting better. Pentaho has just released its commercial 5.0 version, and what has been a cool (i.e. open source-based) BI solution for ETL, OLAP, and reporting/dashboard delivery now has a slick, user-friendly interface and a ton of big data integration built in, not just bolted on...

  • Premiered: 09/13/13
  • Author: Mike Matchett
Topic(s): Pentaho, Big Data, ETL, OLAP
News

Data lakes swim with golden information for analytics

First we had data. Then we had big data. Now we have data lakes. Will the murky depths prove bountiful?

  • Premiered: 04/14/15
  • Author: Mike Matchett
  • Published: TechTarget: Search Data Center
Topic(s): data lake, analytics, Big Data, Mike Matchett, TechTarget, Hadoop, Business Intelligence, BI, OLAP, OLTP, NoSQL, SQL, Optimization, ETL, IoT, Internet of Things, MapR, Project Myriad, YARN, Virtualization, Business Continuity, Disaster Recovery, DR, BC, data swamp, BlueData, Dataguise, HDFS, Hadoop Distributed File System, IBM
News

Navigate data lakes to manage big data

While the data lake concept appeals to businesses today, IT administrators must exercise caution prior to a full-scale implementation.

  • Premiered: 06/05/15
  • Author: Mike Matchett
  • Published: TechTarget: Search Storage
Topic(s): data lake, Storage, Big Data, storage infrastructure, Data protection, big data lake, analysis, HDFS, Hadoop, Hadoop virtualization, Virtualization, Hadoop Distributed File System, software-defined, software-defined storage, BI, Business Intelligence, Disaster Recovery, Business Continuity, BC, DR, analytics, Spark, HP, Vertica, HP Haven, Haven, OLAP, data-aware
Profiles/Reports

The Best All-Flash Array for SAP HANA

These days the world operates in real time, all the time. Whether you are making airline reservations or getting the best deal from an online retailer, you expect data to be up to date, with the best information at your fingertips. Businesses are expected to meet this requirement, whether they sell products or services. Having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments. The world's 24x7 real-time demands cannot wait for legacy ERP and CRM application rewrites. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that could operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish. These capabilities enable businesses to succeed in the modern age, giving forward-thinking companies a real edge in innovation.

SAP HANA is an example of an application environment that uses in-memory database technology to process massive amounts of real-time data in a short time. The in-memory computing engine allows HANA to process data stored in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. SAP HANA can be deployed on-premises or in the cloud. Originally, on-premises HANA was available only as a dedicated appliance; more recently, SAP has expanded support to best-in-class components through its SAP Tailored Datacenter Integration (TDI) program. In this solution profile, Taneja Group examined the storage requirements for HANA TDI environments and evaluated storage alternatives, including the HPE 3PAR StoreServ All Flash. We make a strong case for why all-flash arrays like the HPE 3PAR StoreServ are a great fit for SAP HANA solutions.
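
To make the combined OLTP-plus-OLAP point concrete, here is a minimal, hypothetical sketch of both workload styles hitting the same in-memory table. The hdbcli package is SAP's standard Python client for HANA; the host, port, credentials, and the SALES_ORDERS table are illustrative assumptions only, not details from this report.

```python
# Illustrative sketch only: one OLTP-style insert and one OLAP-style aggregate
# against the same in-memory HANA table, using SAP's hdbcli Python driver.
# The host, port, credentials, and SALES_ORDERS table are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="DEMO_USER", password="********")
cur = conn.cursor()

# OLTP-style work: a small, latency-sensitive transactional write.
cur.execute(
    "INSERT INTO SALES_ORDERS (ORDER_ID, REGION, AMOUNT) VALUES (?, ?, ?)",
    (1001, "EMEA", 250.00),
)
conn.commit()

# OLAP-style work: an analytic aggregate over the same table, with no separate
# data warehouse or ETL hop in between.
cur.execute("SELECT REGION, SUM(AMOUNT) FROM SALES_ORDERS GROUP BY REGION")
for region, total in cur.fetchall():
    print(region, total)

cur.close()
conn.close()
```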

Why discuss storage for an in-memory database? The reason is simple: RAM loses its mind when the power goes off. This volatility means that persistent shared storage is at the heart of the HANA architecture for scalability, disaster tolerance, and data protection. The performance attributes of your shared storage dictate how many nodes you can cluster into a SAP HANA environment, which in turn affects your business outcomes. Greater scalability means more real-time information can be processed. The SAP HANA shared storage workload is write intensive, demanding low latency for small files and high sequential throughput for large files. However, the overall capacity requirement is not extreme, which makes this workload an ideal fit for all-flash arrays that can meet the performance requirements with the smallest quantity of SSDs. Typically, you would need roughly 10X as many spinning-media drives just to meet the performance requirements, leaving you with a massive amount of capacity that cannot be used for other purposes.
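
As a rough, back-of-the-envelope illustration of that sizing argument, the sketch below uses purely hypothetical performance and capacity figures (not measurements from this study) to show why sizing spinning media for a write-intensive, latency-sensitive workload strands capacity, while a small number of SSDs can satisfy both the performance and the capacity requirements.

```python
# Back-of-the-envelope drive-count sketch. All figures below are hypothetical
# and chosen only to illustrate performance-driven vs. capacity-driven sizing.

required_write_iops = 20_000      # assumed HANA log/data write requirement
required_capacity_tb = 20         # assumed usable capacity requirement

hdd_iops, hdd_capacity_tb = 200, 2.0       # rough figures for a 10K RPM HDD
ssd_iops, ssd_capacity_tb = 50_000, 2.0    # rough figures for an enterprise SSD

# The drive count is whichever is larger: the count needed for performance
# or the count needed for capacity.
hdds = max(required_write_iops / hdd_iops, required_capacity_tb / hdd_capacity_tb)
ssds = max(required_write_iops / ssd_iops, required_capacity_tb / ssd_capacity_tb)

print(f"HDDs needed: {hdds:.0f}, delivering {hdds * hdd_capacity_tb:.0f} TB "
      f"against a {required_capacity_tb} TB requirement")
print(f"SSDs needed: {ssds:.0f}, delivering {ssds * ssd_capacity_tb:.0f} TB")
# With these assumed numbers, the HDD configuration is sized entirely by IOPS
# (~100 drives, ~180 TB of stranded capacity), while ~10 SSDs meet both needs.
```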

In this study, we examined five leading all-flash arrays, including the HPE 3PAR StoreServ 8450 All Flash. We found that the unique architecture of the 3PAR array could meet HANA workload requirements with up to 73% fewer SSDs, 76% less power, and 60% less rack space than the alternative AFAs we evaluated.

Publish date: 06/07/17
Profiles/Reports

HPE and Micro Focus Data Protection for SAP HANA

These days the world operates in real time, all the time. Whether you are purchasing tickets or getting the best deal from an online retailer, you expect data to be up to date, with the best information at your fingertips. Businesses are expected to meet this requirement, whether they sell products or services. Having this real-time, actionable information can dictate whether a business survives or dies. In-memory databases have become popular in these environments. The world's 24x7 real-time demands cannot wait for legacy ERP and CRM application rewrites. Companies such as SAP devised ways to integrate disparate databases by building a single, super-fast uber-database that could operate with legacy infrastructure while simultaneously creating a new environment where real-time analytics and applications can flourish.

SAP HANA is a popular and fast-growing example of an application environment that uses in-memory database technology to process massive amounts of real-time data in a short time. The in-memory computing engine allows HANA to process data stored in RAM rather than reading it from disk. At the heart of SAP HANA is a database that handles both OLAP and OLTP workloads simultaneously. To overcome the volatility of server DRAM, the HANA architecture requires persistent shared storage to enable greater scalability, disaster tolerance, and data protection.

SAP HANA is available on-premises as a dedicated appliance or via best-in-class components through the SAP Tailored Datacenter Integration (TDI) program. The TDI environment has become the more popular HANA option because it provides the flexibility to leverage legacy resources such as data protection infrastructure and enables a greater level of scalability that was lacking in the appliance approach. Hewlett Packard Enterprise (HPE) has long been the leader in providing mission-critical infrastructure for SAP environments, and SUSE has long provided the mission-critical Linux operating system for those environments. Micro Focus has long been a strategic partner of HPE, and together they have leveraged unique hardware and software integrations that enable a complete, end-to-end, robust data protection environment for SAP. One of the key value propositions of SAP HANA is its ability to integrate with legacy databases, so it makes the most sense to leverage a flexible data protection solution from Micro Focus and HPE that covers both legacy database environments and modern in-memory HANA environments.

In this solution brief, Taneja Group will explore the data protection requirements for HANA TDI environments. Then we’ll briefly examine what makes Micro Focus Data Protector combined with HPE storage an ideal solution for protecting mission-critical SAP environments.

Publish date: 08/31/18