Taneja Group | High Performance Computing
Trusted Business Advisors, Expert Technology Analysts

Items Tagged: High Performance Computing

news

What does the next big thing in technology mean for the data center?

There are plenty of technologies touted as the next big thing. Big data, flash, high-performance computing, in-memory processing, NoSQL, virtualization, convergence, and software-defined everything all represent wild new forces that could bring real disruption, but also big opportunities, to your local data center.

  • Premiered: 03/19/14
  • Author: Mike Matchett
  • Published: Tech Target: Search Data Center
Topic(s): data, Data Center, Big Data, Storage, Flash, SSD, HPC, High Performance Computing, NoSQL, Virtualization, convergence, software-defined, Hadoop, scale-out, Apache, analytics, scalability, Converged Infrastructure, hyperconvergence, Platform as a Service, PaaS, Hypervisor, Hybrid
Profiles/Reports

Fibre Channel: The Proven and Reliable Workhorse for Enterprise Storage Networks

Mission-critical assets such as virtualized and database applications demand a proven enterprise storage protocol to meet their performance and reliability needs. Fibre Channel has long filled that need for most customers, and for good reason. Unlike competing protocols, Fibre Channel was specifically designed for storage networking, and engineered to deliver high levels of reliability and availability as well as consistent and predictable performance for enterprise applications. As a result, Fibre Channel has been the most widely used enterprise protocol for many years.

But with the widespread deployment of 10GbE technology, some customers have explored the use of other block protocols, such as iSCSI and Fibre Channel over Ethernet (FCoE), or file protocols such as NAS. Others have looked to InfiniBand, which is now being touted as a storage networking solution. In marketing the strengths of these protocols, vendors often promote feeds and speeds, such as raw line rates, as a key advantage for storage networking. However, as we’ll see, there is much more to storage networking than raw speed.

It turns out that on an enterprise buyer’s scorecard, raw speed doesn’t even make the cut as an evaluation criterion. Instead, decision makers focus on factors such as a solution’s demonstrated reliability, latency, and track record in supporting Tier 1 applications. When it comes to these requirements, no other protocol can measure up to the inherent strengths of Fibre Channel in enterprise storage environments.

Despite its long, successful track record, Fibre Channel does not always get the attention and visibility that other protocols receive. While it may not be winning the media wars, Fibre Channel offers customers a clear and compelling value proposition as a storage networking solution. Looking ahead, Fibre Channel also presents an enticing technology roadmap, even as it continues to meet the storage needs of today’s most critical business applications.

In this paper, we’ll begin by looking at the key requirements customers should look for in a commercial storage protocol. We’ll then examine the technology capabilities and advantages of Fibre Channel relative to other protocols, and discuss how those translate to business benefits. Since not all vendor implementations are created equal, we’ll call out the solution set of one vendor – QLogic – as we discuss each of the requirements, highlighting it as an example of a Fibre Channel offering that goes well beyond the norm.

Publish date: 02/28/14
news

What’s In A Number?

Well, when that number is 10 billion, I’d say it can mean quite a lot.

  • Premiered: 05/20/15
  • Author: Taneja Group
  • Published: Qumulo
Topic(s): Qumulo, Qumulo Core, High Performance Computing, High Performance, Tom Fenton, Storage, Chris Hoffman, David Bailey
news

IT pros get a handle on machine learning and big data

Despite its benefits, machine learning can also go very wrong. Beginners need to understand their input data, project scope and purpose, and the machine learning algorithms at work.

  • Premiered: 07/15/15
  • Author: Mike Matchett
  • Published: TechTarget: Search Data Center
Topic(s): IT, Mike Matchett, Big Data, Machine Learning, predictive modeling, Optimization, scale-out, Apache, Apache Mahout, Apache MADlib, High Performance, High Performance Computing, HPC, storage architecture, Storage, In-Memory, Microsoft, Microsoft Azure, AI, Artificial Intelligence
news

Big data analytics applications impact storage systems

Analytics applications for big data have placed extensive demands on storage systems, demands that Mike Matchett says often require new or modified storage structures.

  • Premiered: 09/03/15
  • Author: Mike Matchett
  • Published: TechTarget: Search Storage
Topic(s): Mike Matchett, Big Data, analytics, Storage, Primary Storage, scalability, Business Intelligence, BI, AWS, Amazon AWS, S3, HPC, High Performance Computing, High Performance, ETL, HP Haven, HP, Hadoop, Vertica, convergence, converged, IOPS, Capacity, latency, scale-out, software-defined, software-defined storage, SDS, YARN, Spark
news

The Need for Modern Day Scale-Out NAS Products

Today, storage architectures are undergoing yet another makeover, and the once dominant legacy products are being overtaken.

  • Premiered: 11/12/15
  • Author: Jeff Kato
  • Published: InfoStor
Topic(s): InfoStor, Network Attached Storage, NAS, Storage, File Storage, SAN, scale-out, Scale-out NAS, Cloud, EMC, NetApp, ONTAP, Cleversafe, Caringo, object storage, HPC, Big Data, High Performance, High Performance Computing, Performance, Flash, SSD, IOPS, Metadata, scalability, software-defined, SaaS, Software-as-a-Service, open API, API
news

Can your cluster management tools pass muster?

The right designs and cluster management tools ensure your clusters don't become a cluster, er, failure.

  • Premiered: 11/17/15
  • Author: Mike Matchett
  • Published: TechTarget: Search Data Center
Topic(s): cluster, Cluster Management, Cluster Server, Storage, Cloud, Public Cloud, Private Cloud, Virtual Infrastructure, Virtualization, hyperconvergence, hyper-convergence, software-defined, software-defined storage, SDS, Big Data, scale-up, CAPEX, IT infrastructure, OPEX, Hypervisor, Migration, QoS, Virtual Machine, VM, VMware, VMware VVOLs, VVOLs, Virtual Volumes, cloud infrastructure, OpenStack
news

Scale-out NAS design now rivals object storage

Jeff Kato takes a closer look at ideal scale-out NAS design principles and vendors that are emerging with modern scale-out NAS designs.

  • Premiered: 02/12/16
  • Author: Jeff Kato
  • Published: TechTarget: Search Storage
Topic(s): scale-out, NAS, scale-out NAS, Block Storage, File Storage, Storage, Oracle, Fibre Channel, FC, scalability, web-scale, web-scale storage, object storage, High Performance, HPC, High Performance Computing, Flash, SSD, data-aware, Metadata, IOPS, Performance, NetApp, EMC, software-defined, Virtual Machine, VM, Public Cloud, Cloud, hyperscale
news

Delving into neural networks and deep learning

Deep learning and neural networks will play a big role in the future of everything from data center management to application development. But are these two technologies actually new?

  • Premiered: 06/16/16
  • Author: Mike Matchett
  • Published: TechTarget: Search IT Operations
Topic(s): deep learning, data center management, Data Center, Datacenter, Big Data, Machine Learning, big data analytics, Artificial Intelligence, Compute, neural networks, Cloud, scale-out, scale-out architecture, High Performance Computing, High Performance, HPC, NVIDIA, Mellanox, DataDirect Networks, 4U appliance, Google, Google AlphaGo, network, Mike Matchett
Profiles/Reports

IBM Cloud Object Storage Provides the Scale and Integration Needed for Modern Genomics Infrastructure

For hospitals and medical research institutes, the ability to interpret genomics data and identify relevant therapies is key to providing better patient care through personalized medicine. Many such organizations are racing forward, analyzing patients’ genomic profiles to match them with clinically actionable treatments using artificial intelligence (AI).

These rapid advancements in genomic research and personalized medicine are very exciting, but they are creating enormous data challenges for healthcare and life sciences organizations. High-throughput DNA sequencing machines can now process a human genome in a matter of hours at a cost approaching one thousand dollars. This is a huge drop from a cost of ten million dollars ten years ago and means the decline in genome sequencing cost has outpaced Moore’s Law (see chart). The result is an explosion in genomic data – driving the need for solutions that can affordably and securely store, access, share, analyze and archive enormous amounts of data in a timely manner.

Challenges include moving large volumes of genomic data from cost-effective archival storage to low-latency storage in order to reduce the time needed for analysis. Currently, a comprehensive DNA sequence analysis takes days.

Sharing and interpreting vast amounts of unstructured data to find relationships between a patient’s genetic characteristics and potential therapies adds another layer of complexity. Determining connections requires evaluating data across numerous unstructured data sources, such as genomic sequencing data, medical articles, drug information and clinical trial data from multiple sources.

Unfortunately, the traditional file storage within most medical organizations doesn’t meet the needs of modern genomics. These systems can’t accommodate massive amounts of unstructured data and they don’t support both data archival and high-performance compute. They also don’t facilitate broad collaboration. Today, organizations require a new approach to genomics storage, one that enables:

  • Scalable and convenient cloud storage to accommodate rapid unstructured data growth
  • Seamless integration between affordable unstructured data storage, low latency storage, high performance compute, big data analytics and a cognitive healthcare platform to quickly analyze and find relationships among complex life science data types
  • A multi-tenant hybrid cloud to share and collaborate on sensitive patient data and findings
  • Privacy and protection to support regulatory compliance

Publish date: 03/22/17
news

Larger AWS EC2 instance types seek to conquer enterprise demands

AWS has churned out a host of new EC2 instance types over the past year, as it responds to a shift toward more production workloads on the platform.

  • Premiered: 06/08/17
  • Author: Ryan Newfell
  • Published: TechTarget: Search AWS
Topic(s): Amazon Web Services, Amazon AWS, AWS, Amazon EC2, Jeff Kato, Virtual Machine, VM, Compute, Storage, Networking, Cloud, SAP HANA, SAP, HANA, Public Cloud, enterprise cloud, Private Cloud, Google, Microsoft, High Performance, HPC, High Performance Computing, VMware, vMotion, Data Center