Supercomputers Need Super Storage

LTO Program Presents LTO Gen-7 at SC15 in Austin

The LTO program technology providers were sponsors at the recent High Performance Computing (HPC) conference in Austin, Texas, where the buzz was not only about the speed and processing capabilities of these super-beasts but also about how to reliably store the huge amounts of data that supercomputers process.

What is a Supercomputer?

According to Wikipedia, “A supercomputer is a computer with a high-level computational capacity compared to a general-purpose computer. Performance of a supercomputer is measured in floating-point operations per second (FLOPS) instead of million instructions per second (MIPS). As of 2015, there are supercomputers which can perform up to quadrillions of FLOPS.” That is almost too much to comprehend. Supercomputers play an important role in a variety of industry segments, including scientific research, academic institutions, quantum mechanics, weather forecasting, climate research, oil and gas exploration, and molecular modeling. Some government agencies, such as the military, also utilize HPC for multifaceted applications.

As noted at Wikipedia, “Systems with massive numbers of processors generally take one of two paths: in one approach (e.g., in distributed computing), a large number of discrete computers distributed across a network devote some or all of their time to solving a common problem; in another approach, a large number of dedicated processors are placed in close proximity to each other (e.g., in a computer cluster); this saves considerable time moving data around and makes it possible for the processors to work together (rather than on separate tasks).” These complex implementations crunch a lot of ‘ones and zeroes’ that need to be reliably stored.

Big Computers Produce Big Data

The United States is still home to the largest number of supercomputers, with an estimated 200 systems, while China has increased its count of these big beasts to just over 100. As previously stated, these mammoth machines can process upwards of 30 quadrillion calculations… per second! That kind of performance can process big data, and the resulting information can be used in analytics to make important decisions and discoveries. For example, the South Korean Meteorological Administration increased its national weather information system storage capacity by 1000% to over 9 petabytes. This big data crunching helps make it possible to forecast weather changes more accurately and in greater detail, potentially saving thousands of lives and safeguarding property.1 Big data means not only huge files but lots and lots of files: high volumes that need a fast, secure and economical place to call home.

LTO-7 Tape – Welcome Home Big Data

Attendees at the conference learned that LTO-7 technology offers the best place for big data to call home. One LTO-7 cartridge can store 15 terabytes compressed, and it can do it quickly: an LTO-7 tape drive can back up these large files at up to 750 megabytes per second compressed. That’s more than 2.7 terabytes of data an hour per drive! Organizations that already have LTO generation 5 or 6 technology are in an opportune position to implement LTO gen-7, because LTO-7 tape drives can read and write a gen-6 cartridge and can read a gen-5 cartridge, helping to preserve investments and ease the implementation. An LTO-7 cartridge can store 2.4 times more data than an LTO-6 cartridge and 4 times more data than an LTO-5 cartridge. LTO-7 technology supports tape drive data encryption and WORM (write once, read many) cartridges to help address security policies, and you can use LTFS with LTO-7 technology to help make tape as easy to use as disk or a USB drive.
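For the numerically curious, here is a short illustrative Python sketch that reproduces the arithmetic behind those claims. The native capacities used below are the published per-generation specs (the capacity ratios quoted above are based on native capacity), and the compressed figures assume the 2.5:1 compression ratio used in LTO-7 specifications.

```python
# Back-of-the-envelope check of the LTO-7 figures quoted above.
# Assumes the published LTO-7 specs: 750 MB/s compressed transfer rate,
# and native capacities of 1.5 / 2.5 / 6.0 TB for LTO-5 / LTO-6 / LTO-7.

MB_PER_TB = 1_000_000        # tape specs use decimal (SI) units

rate_mb_s = 750              # LTO-7 compressed transfer rate, MB/s
tb_per_hour = rate_mb_s * 3600 / MB_PER_TB
print(f"Per-drive throughput: {tb_per_hour} TB/hour")   # -> 2.7 TB/hour

# Capacity ratios quoted in the post, computed from native capacity (TB)
native_tb = {"LTO-5": 1.5, "LTO-6": 2.5, "LTO-7": 6.0}
for gen, cap in native_tb.items():
    ratio = native_tb["LTO-7"] / cap
    print(f"LTO-7 holds {ratio:.1f}x an {gen} cartridge")
```

Running this prints the 2.7 TB/hour per-drive figure, the 2.4x advantage over LTO-6, and the 4x advantage over LTO-5 cited above.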

Whether your information is processed in a supercomputer or not, do yourself a favor: store it securely, economically, and for the long term on LTO technology.

1 Wired, “How Big Data Can Boost Weather Forecasting”