Monday, March 16, 2020

Why Tape Still Has a Role in Business Continuity

I can forgive some people for thinking tape is obsolete – those who last touched a consumer VHS tape or audiocassette in the late 90s or early 2000s. I’ve come to really enjoy expanding their perspective, though, when I tell them that tape is a major workhorse in the cloud and that most of the household-name technology and Internet companies are tape users. BC, including several data protection applications, is a big part of the reason why, along with tape’s low TCO and low energy consumption. I think we can all agree that economics and preserving the environment are key to continuity in their own right.

Information is now in the zettabyte age
The WW datasphere is currently around 35ZB and is expected to reach 175ZB by 2025 – an estimated compound annual growth rate of roughly 30%.
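That growth figure is easy to sanity-check. Here is a minimal back-of-envelope sketch; the six-year 2019-2025 horizon is my assumption, chosen because it reproduces the stated rate:

```python
# Rough check of the ~30% growth claim: 35ZB growing to 175ZB by 2025.
# The six-year horizon (2019-2025) is an assumption, not from the source.
start_zb, end_zb, years = 35, 175, 6
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # ~30.8%
```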


The odds are good you’re seeing a similar rate of data explosion in your own business. Everything today is born digital – not just structured data like databases but unstructured data such as spreadsheets, documents, presentations, video, audio and photographs. Add to that the appliances and devices in the IoT – smart vehicles, smart planes, smart phones, smart homes, factories and cities. Then add to the mix AI, ML, ecommerce, email, social media, gaming, surveillance, VR, mobile and more – you can see the path we’re on.

We keep all this data around for years and sometimes decades because it is potentially valuable enough to justify archiving or keeping online in an active archive. Whether your business relies on archival video footage or photos, harvests data for sale to outside parties or uses information for internal streamlining, strategy or planning, it’s become impossible to even imagine a modern business without data that is increasing in value.

Today, nearly all data is managed throughout its lifecycle using 3 core storage technologies: flash, HDD, and tape. I know some of you are thinking – “but what about the cloud?” The truth is that the cloud is more of a service delivery model than a storage technology, leveraging global connectivity to get those core technologies of flash, HDD and tape working in unison, seamlessly and invisibly behind the scenes.

These core technologies don’t compete per se, because each has its role in the storage workflow and lifecycle.

Flash provides the fastest access times, making it the choice for data that needs frequent, instant access. The tradeoff is its higher price/TB.

HDD is less expensive with the tradeoff being slower speeds than flash and intensive energy use.
Tape is by far the least expensive of the 3, requiring essentially zero power when idle in a tape library or in an archival vault, making it ideal for long-term storage. Throw in the best reliability of any storage medium and the longest archival life and you have a strong case for BC.

The continuity case for tape
So. Back to the value of that data. For many businesses, and maybe yours, information may very well be the most valuable asset. By its nature, accumulated data is accumulated value. And where there’s value, there’s always somebody looking to exploit or steal it.


Cybercrime is on the rise and the impacts are sobering. In the first three quarters of 2019, 7.2 billion malware attacks were launched, as well as 151.9 million ransomware attacks, according to SonicWall. The healthcare industry alone reported an 80% increase in cybercrime between 2017 and 2019, with hundreds of incidents and over 400,000 supposedly HIPAA-protected patient records stolen. Cryptojacking is the latest threat, with 52.7 million incidents in 1CQ19 of hackers hijacking users’ CPUs to mine cryptocurrency.

Here’s where tape’s use case gets especially strong. While spinning disks and flash drives are networked and accessible all of the time, tape can support an active archive or can be easily stored offline and accessed only when needed. This creates an ‘air gap’ between valuable data, the local network and the web, a practice encouraged by the National Institute of Standards and Technology as “the only true risk avoidance in today’s IT environment.”

Beyond cybercrime, the recent COVID-19 outbreak has reminded us of the vulnerability of global supply chains, especially in the technology space. Businesses can take some comfort in knowing that unlike other storage technologies, tape has a diverse manufacturing footprint, including production in the USA.

No discussion of continuity is complete without a look toward the future. The LTO format, an industry standard, has a committed long-term roadmap for future cartridge capacities, from today’s gen 8 at 30TB compressed capacity up to the currently planned gen 12 at 480TB compressed, ensuring tape will be a relevant technology for years to come.

Low TCO and energy consumption, security, reliability and scalability are the main reasons that nearly all of the big cloud service providers also rely on tape in their massive hyperscale data centers. And as even mid-size businesses grapple with the challenges of massive and growing volumes of data, tape’s value proposition is becoming compelling for a wider audience once again.

Your business may not be a hyperscale data center today, but it might be time to start thinking like one. Back to the future, indeed.

Monday, February 17, 2020

Backup Software’s Expanding Efforts to Help Defeat Ransomware in the Data Center

Ask anyone how to defeat ransomware and software from cyber security providers such as Avast, Bitdefender, Malwarebytes and Sophos may first come to mind.

Mention using backup software to defeat ransomware and people may look at you like you have lost your mind. Crazy or not, backup software now incorporates features that serve as a secondary perimeter to defend against ransomware attacks.

Prevention is Best
You will get no argument from me on this point. Every organization should deploy cyber security software to stop a ransomware attack before it ever starts. Once ransomware detonates, an organization may pay a heavy price. In the worst case, it pays a ransom in hopes of getting its data back. Even if it never pays a ransom, it still pays a heavy toll in lost productivity and business disruption as it recovers data.

However, here’s the catch. An organization cannot assume that cyber security software will suffice to protect it against ransomware. Cyber security software cannot detect and prevent every strain of ransomware. Ransomware changes too rapidly and enters organizations in too many ways for any cyber security software to succeed in every instance. This puts the onus on every organization to have a means to recover in the likely event that ransomware detonates in its environment.

Backup Software’s Expanding Efforts
To help cyber security software deal with ransomware attacks, an organization may now turn to backup software. Many backup software solutions go beyond core backup and restore capabilities. They now offer their own means to detect, prevent, and recover from ransomware attacks. Backup software solutions vary in the type and number of techniques they use. Here are 4 methods already found and used in a few products:


Honey Pot
Using this technique, the backup software provider places its own files, or honey pots, on production application and file servers. These files serve no purpose other than to detect ransomware in the production environment. Should ransomware detonate and change or encrypt any of these files, the backup software will detect the change during the backup and, in turn, alert the organization that such a file change occurred.
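To make the idea concrete, here is a minimal sketch of how a canary check might work; the decoy path, payload and hashing scheme are hypothetical, not any vendor’s actual implementation:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def deploy_canary(path: Path, payload: bytes = b"decoy document") -> str:
    # Plant the decoy and record its baseline hash at deployment time.
    path.write_bytes(payload)
    return sha256_of(path)

def canary_tripped(path: Path, baseline: str) -> bool:
    # A missing, renamed, or encrypted canary all count as alerts.
    return not path.exists() or sha256_of(path) != baseline

canary = Path("/srv/share/.quarterly_report.docx")  # hypothetical decoy path
baseline = deploy_canary(canary)
# ...later, during the backup run:
if canary_tripped(canary, baseline):
    print(f"ALERT: honey pot {canary} changed - possible ransomware")
```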


Backup Software Integrated with Anti-Malware Software
Integrating anti-malware and backup software brings together the best of both backup and cyber security software. The backup software continues to focus on what it does best – backing up and recovering data. The anti-malware software comes into play by scanning data during backups or recoveries. If it detects ransomware in the backup, it alerts the organization to its presence.
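Here is a sketch of what that integration might look like, with `scan` standing in for whatever anti-malware engine a given backup product embeds; the interface below is my assumption, not a real product API:

```python
from pathlib import Path
from typing import Callable, Iterable, List

def backup_with_scan(files: Iterable[Path],
                     scan: Callable[[bytes], bool],
                     copy: Callable[[Path], None]) -> List[Path]:
    flagged = []
    for f in files:
        data = f.read_bytes()
        if scan(data):         # engine flags a known ransomware signature
            flagged.append(f)  # raise an alert rather than fail silently
        copy(f)                # data is still captured for forensics
    return flagged
```

Whether to quarantine flagged files or still back them up for forensics is a product design choice; this sketch keeps the copy so nothing is lost.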


Monitoring and Alerting on Changes to Backup Data Files
Many backup software products now store backup files on network file shares. While efficient, this practice potentially exposes the files to any ransomware that can access those shares. Once ransomware reaches them, it can encrypt or delete them, making them unusable for recovery. To help prevent this, some backup software monitors the locations of backup files for any unusual or suspicious activity.
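A minimal polling sketch of that monitoring idea follows; real products typically hook filesystem events rather than polling, and the repository path is hypothetical:

```python
import time
from pathlib import Path

def snapshot(repo: Path) -> dict:
    # Record size and mtime for every backup file in the repository.
    return {p: (p.stat().st_size, p.stat().st_mtime)
            for p in repo.rglob("*") if p.is_file()}

def watch(repo: Path, interval: int = 60) -> None:
    baseline = snapshot(repo)
    while True:
        time.sleep(interval)
        current = snapshot(repo)
        deleted = baseline.keys() - current.keys()
        altered = {p for p in baseline.keys() & current.keys()
                   if current[p] != baseline[p]}
        if deleted or altered:
            # Backup files should be immutable once written, so any
            # rewrite or deletion is suspicious.
            print(f"ALERT: {len(deleted)} deleted, {len(altered)} altered")
        baseline = current

# watch(Path("/mnt/backup_share"))  # hypothetical repository path
```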


Predictive Analytics
Some backup providers now incorporate AI and ML into their solutions. This software examines and compares the data contained in backup files and looks for unusual changes in data between backups. If it detects anomalies between backups, it generates alerts to prompt organizations to examine that data.
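Vendors ship real ML models for this; as a toy stand-in, even a simple statistical test on the changed-file count per backup illustrates the principle (the history and threshold below are made up):

```python
import statistics

def anomalous(history: list, latest: int, z_threshold: float = 3.0) -> bool:
    # Flag a backup whose change count deviates sharply from recent runs;
    # mass encryption touches far more files than a normal business day.
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0
    return (latest - mean) / stdev > z_threshold

nightly_changes = [120, 95, 140, 110, 130]        # changed files per backup
print(anomalous(nightly_changes, latest=48_000))  # True: investigate
```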


Powerful Antidote to Ransomware Attacks
No one solution – backup or cyber security software – yet possesses all the answers to prevent ransomware from ever detonating. However, used together, these 2 software products provide organizations with a powerful antidote to ransomware attacks. Used together, they equip almost any organization to detect, prevent, and recover from a ransomware attack should one occur.

Tuesday, December 10, 2019

Tony Evans, VP of Business Development and North America Sales, Overland-Tandberg

Coming from Juniper Networks, Cisco, IBM and HPE

Overland-Tandberg, acquired by Silicon Valley Technology Partners for $45 million last year, announced that Silicon Valley global technology veteran Tony Evans has been appointed VP of business development and North America sales.

“I’m excited to have Tony join the executive leadership team of Overland-Tandberg. He brings tremendous industry experience with more than 20 years successfully leading high-performance teams and accelerating companies’ strategic growth objectives. Tony will be reporting to me and will be responsible for leading the development of strategic business initiatives and the North America sales team,” said Eric Kelly, chairman and CEO.

“I am thrilled to join the Overland-Tandberg team with its focus on helping clients implement Hybrid Cloud with industry-leading products and partnerships through an innovative business model. Overland-Tandberg is revolutionizing data protection and BC. I’m excited to join a company leveraging global intellect and innovation around the world,” said Evans.

Prior to joining Overland-Tandberg, he leveraged his expertise to help early- to late-stage privately funded, high-velocity portfolio companies position, sell and deliver business value and outcomes derived from their technology. He served in a variety of global sales, go-to-market and business development leadership positions.

He also brings experience from major technology infrastructure companies, most recently operating as MD and VP of global financial services for Juniper Networks, and previously at Cisco, IBM and HPE.

 

Quantum CEO Predictions for 2020

Autonomous vehicle development increasingly human-centric, increased adoption of HCI for video surveillance, video and images the biggest data generator for enterprises, NVMe to erode SAS SSD arrays faster, tape market to grow, reversing decade-long decline

• Autonomous vehicle development will become increasingly human-centric: OEMs and their technology partners are striving to closely align assisted and autonomous driving technology with human behavior. In order to develop systems that adapt to the characteristics of individual drivers or riders, immense amounts of behavioral data must be captured and analyzed, including biometric data, in addition to external sensor and vehicle control system data. This means the need for cost-effective storage performance and scalability will continue to skyrocket.

• HCI will see increased adoption in video surveillance: IHS predicts worldwide surveillance storage revenue will grow from $3.4 billion in 2019 to $4.2 billion in 2020. Growth is driven by increased demand for better security, smart city and public safety initiatives (public sector), and the desire for business intelligence. Legacy systems require individual components to address compute, storage and networking, while HCI for video surveillance integrates all 3 into a single platform, delivering appliances that are easier to install and manage and that do not require security professionals to rely on specialized IT assistance. Moreover, HCI appliances enable storage to scale so that when environments grow, the platform can grow too – a challenge amplified by increases in camera counts, camera resolutions, and video retention times. They provide a solid underlying platform to deploy new capabilities, both today and tomorrow.

• Video and images represent the biggest data generator for most enterprises: Between surveillance footage, video for marketing and training purposes across all industries, and the use of high-res image and video content generated by machines in use cases as diverse as movie and TV production, autonomous vehicle design, manufacturing and healthcare, we believe video and high-res image content will represent the biggest ‘class’ of data for most enterprises.

• NVMe will erode the traditional SAS SSD array market faster than originally predicted: With the performance advantages of NVMe, and by leveraging new networking technologies like RDMA, we believe NVMe will erode the market for traditional SSD storage much faster than predicted. In markets such as M&E, where higher-resolution content, higher frame rates, more bits per pixel and more cameras per project are putting pressure on storage architectures, NVMe should prove particularly appealing.

• Tape storage market will grow, reversing a decade-long declining trend: Tape has emerged as a key technology for massive-scale cold storage infrastructure – both in the cloud and on-premises. And we believe the architectures used in the cloud will eventually make their way back into the enterprise. So we believe the tape market will grow, and continue to grow over the next 5-10 years, based on a new use case for tape as cold storage for (primarily) video and high-res image data.

Tuesday, October 08, 2019

All ATTO products are tested compatible with the newly released macOS Catalina


ATTO Technology, Inc., a global leader in network and storage connectivity and infrastructure solutions, announced today tested compatibility with and support for the newly released macOS® Catalina, and that it is ready to support the impending Apple Mac Pro® 2019.

Apple Mac Pros have been the platform of choice for creative professionals since 2006, and ATTO was first with network and storage connectivity upgrades and support to meet the robust requirements of digital production workflows. Anticipating macOS Catalina and the soon-to-be-released Mac Pro 2019, ATTO is ready now to support high-performance storage connectivity for this next generation of professional-level Apple products.

“It makes the Mac feel more ‘pro’ again. Catalina is a big leap in making the Mac Pro a professional workstation geared toward high end media and audio workflows,” says Tim Klein, CEO of ATTO Technology, Inc. “With the soon to be released Mac Pro, we expect to see a resurgence of Apple in all facets of creative environments. ATTO and Apple remain a perfect pair for creative professionals.”

ATTO products are the highest-performing storage connectivity solutions available for Mac® environments and provide future-proof compatibility with Catalina and Mac Pro 2019. The broad portfolio of ATTO network and storage connectivity products tested for Macs include ATTO Celerity™ Fibre Channel HBAs, ATTO FastFrame™ Ethernet adapters, ATTO ExpressSAS® HBAs, and ATTO ThunderLink® Thunderbolt™ adapters.

ATTO has made new software tools available specifically for macOS Catalina that will enhance data-intensive and media production workflows. ATTO Xtend SAN iSCSI Initiator version 5.5 enables high availability with load balancing and failover for iSCSI storage on Macs over Ethernet. ATTO Disk Benchmark for macOS brings the company’s industry-leading Windows® benchmark tool to Macs.


Thursday, September 19, 2019

The Wait is Over, LTO-8 Tape is Back!

You heard that right, the wait is over, LTO-8 tape is back! If ever there was anything sexy about tape, this has got to be one of those moments… LTO-8 media is back, and boy are we glad!

We will be seeing an influx of LTO-8 media flood the market very soon. This news couldn’t have come at a better time, as we continue to see increased demand for low-cost, high-capacity, long-term archive storage in many enterprises. LTO tape continues to meet these requirements, delivering high storage capacity, blazing-fast transfer rates, easy-to-use functionality, and steadfast reliability. LTO is still the ideal solution for long-term data retention and archiving.

The announcement by the LTO Consortium came on August 5, 2019, stating that Fujifilm and Sony are now both licensees of Generation 8 technology. The relevance of tape continues to increase in archive and offline long-term storage particularly with large cloud providers but also those SMB customers with tight budgets. With up to 30 TB of compressed capacity and up to 750 MB/s data transfer rates, LTO Generation 8 continues to push innovation.
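Those rated maximums imply a practical back-of-envelope number: how long it takes to fill one cartridge. The sketch below assumes sustained compressed-rate throughput, which real workloads rarely hold:

```python
# Time to fill one LTO-8 cartridge at its rated maximums:
# 30 TB compressed capacity at 750 MB/s compressed transfer rate.
capacity_bytes = 30e12
rate_bytes_per_s = 750e6
hours = capacity_bytes / rate_bytes_per_s / 3600
print(f"~{hours:.1f} hours per full cartridge")  # ~11.1 hours
```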

It is projected that data will grow to 44 zettabytes by 2020 – a 50-fold increase from the beginning of 2010. The largest culprits of this growth include machine and sensor data, hi-res video, and digital documents and images, to name just a few.

The outlook for LTO tape remains bright and strong, with current and future generations delivering tremendous scalability and growth. As data grows by the second in gigantic steps, LTO can provide longevity, security, and assurance at a lower cost than other storage solutions.


Friday, August 23, 2019

78.6 Million HDDs Shipped in 2CQ19

While 2CQ19 HDD shipment totals of 78.56 million essentially matched the company’s forecast published in May, several factors, both positive and negative, have influenced the near-term HDD forecast published in this report.

On the positive side, nearline demand by cloud customers continued to accelerate from the early recovery witnessed in 1CQ19. For 2CQ19, total nearline shipments jumped 13% Q/Q to 12.40 million units and, more importantly, total capacity shipped of 105.72EB advanced 19% over the same period due to a surge in 14TB shipments at two tier-1 hyperscale companies. 12TB drives, while declining slightly Q/Q, shipped in higher volumes than all other nearline capacities as the 14TB transition at other cloud companies commences in 3CQ19.
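Those two figures also imply an average capacity per nearline drive shipped, a quick derivation worth making explicit:

```python
# Implied average capacity per nearline drive shipped in 2CQ19,
# from the unit and exabyte figures quoted above.
units = 12.40e6
exabytes = 105.72
avg_tb = exabytes * 1e6 / units   # 1 EB = 1,000,000 TB (decimal units)
print(f"~{avg_tb:.1f} TB per nearline drive")  # ~8.5 TB
```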

Client HDD demand, while seasonally weak in 2CQ19, came under additional pressure from accelerating SSD attach rates due to falling SSD pricing caused by NAND oversupply. In addition, seasonally stronger game console HDD sales were muted in 2CQ19 as the current console generations enter their seventh year of existence. While a console refresh from both Microsoft and Sony is expected as early as late 2020, industry chatter indicates that the new platforms will sport SSDs in place of HDDs.

Trade tensions are also high on the list of concerns among both HDD customers and suppliers. While threatened tariff increases in June did not materialize, PC manufacturers and ODMs accelerated system builds in 2CQ19 in order to avoid potentially higher export tariffs on products shipping out of China. Due to the system build-aheads, PC OEMs are cutting builds in early 3CQ19 to manage inventory, an action that will further temper seasonal HDD demand.

The uncertainties caused by continued tariff threats weigh on overall spending across numerous economies, spreading fears that a broader global economic slowing is beginning to take hold. As a result, the 2019 HDD forecast has been reduced slightly to account for a weaker PC forecast projected for the year.

At the same time, nearline demand remains essentially unchanged from prior forecasts in terms of annual exabyte growth. A slightly faster recovery of nearline demand by cloud customers marginally reduces the nearline unit forecast in 1H19, but the impact is fairly small relative to the total HDD TAM.

Over the revised long-term forecast, total HDD demand has been reduced primarily due to a change in the game console HDD outlook. With the next generation of Sony PlayStation and Microsoft Xbox converting from HDD storage over to SSDs based on current market information, the long-term game console HDD shipments will decline at a faster rate than in the previous forecast. In all likelihood, both companies will continue to sell current HDD-enabled platforms as a lower priced option until such time that SSD pricing falls to a point when HDDs can be displaced across the entire console line-up.

Higher-capacity client HDDs will continue to serve markets that value low-cost, high-capacity solutions, such as surveillance and external retail HDDs. The external HDD market will remain steady while demand for surveillance HDDs will continue to grow over the long term; however, other applications such as PCs and most distribution HDDs will have limited demand for high-capacity HDDs greater than 2TB. While the long-term HDD unit forecast remains relatively unchanged in total, the capacity growth for total client HDDs has been reduced slightly from a 25% five-year CAGR to 24%.

Wednesday, July 31, 2019

SAS or NVMe? Decisions, Decisions

Storage architects need to respond to today’s business challenges by ensuring the storage solutions they choose provide the security, stability, scalability and management features necessary to support their ecosystem. As they look toward adopting new storage technologies, there are essential considerations they should weigh and review before moving to a new technology. New storage protocols are continually entering the market, and this paper explores how SAS technology remains the foundation for storage networks and will continue to deliver on that promise tomorrow and beyond.
  
The trend toward SSDs
Solid-state drive (SSD) storage, enabled by NAND flash technology, has risen dramatically in adoption for primary storage, especially since SSDs have the potential to provide much higher performance (IOPS and MB/s). As NAND flash becomes a commodity, prices continue to drop to the point that smaller-capacity SSDs (less than 500GB) are competing with HDDs for market share. Both the SAS and NVMe protocols support SSDs. So this raises the question: how should IT architects evaluate and integrate these technologies into their data center architecture?

NVMe
NVMe is designed specifically for the unique characteristics of solid-state technology. However, limitations in hardware and application infrastructure make it difficult to take full advantage of its performance benefits. This is especially important because NVMe SSDs are as much as 25x more expensive per GB than traditional HDDs. In addition, management tools such as the NVMe-MI specification are still in development and are not yet widely deployed. It will take at least two to three years before robust solutions exist that are sufficient to support enterprise storage systems. While low-overhead NVMe technology shows future promise for wider-scale implementation, IT architects need technologies that are proven and that they can depend on today.

SAS 
The SAS interface is a general-purpose communication protocol for all types of storage media – hard drives (HDDs), SSDs, tape and more. Extremely fast, a single SAS port can achieve top performance of 2.4GB/s and millions of I/O operations per second (IOPS). More importantly, the SAS roadmap continues to evolve to support the even higher performance expectations of tomorrow’s data center. For instance, 24Gb/s SAS with ‘fairness’ capabilities (expected in mid-2020) will allow users to build out high-performance 24Gb/s storage networks without significant changes to infrastructure, using standard 12Gb/s storage on the back end. The roadmap doesn’t stop there: 48Gb/s SAS is expected as soon as 2025.
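The 2.4GB/s per-port figure can be sanity-checked from the 24G SAS signaling parameters; the sketch below uses the SAS-4 line rate and encoding and ignores further protocol overhead:

```python
# 24G SAS (SAS-4) signals at 22.5 Gbaud with 128b/150b encoding.
line_rate_gbaud = 22.5
encoding_efficiency = 128 / 150
gbytes_per_s = line_rate_gbaud * encoding_efficiency / 8  # bits -> bytes
print(f"~{gbytes_per_s:.1f} GB/s per port")  # ~2.4 GB/s
```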

SAS architectures deliver cost-effective performance with a better cost per GB than NVMe, especially with higher capacity and density SSDs and HDDs. And, today SAS has a much larger installed base in global datacenters and cloud service providers than NVMe.

According to IDC, SAS comprises more than 70% of enterprise storage drives and is expected to reach over 85% of enterprise storage capacity through 2022. This means that at present, SAS has a larger SCSI developer base with a supported roadmap to continue to develop low-cost high-performance solutions with a technology that has been around for over 30 years. 

Most data centers today depend on SCSI-based technology, which is the underlying command set behind SAS. Moving to a completely different I/O protocol such as NVMe requires a major forklift upgrade, including changes to management interfaces and tools, as well as to the architecture, controllers, NVMe-capable servers, and data protection, to name a few. Despite the performance promises of NVMe, widespread adoption is not a trivial economic or operational task.

Another consideration is capacity requirements. SSD storage is only able to support a small percentage of the overall capacity needs of the typical data center. While SSDs are great, HDDs have a 2-10x capacity advantage, which translates into significant cost savings. SAS is, by far, the interface of choice for HDDs and mixed storage environments.
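To see why that capacity advantage matters economically, consider a toy comparison at petabyte scale; both $/GB figures below are hypothetical placeholders chosen only to reflect the roughly 25x per-GB gap cited earlier, not market quotes:

```python
# Illustrative only - both $/GB figures are assumed, not quoted prices.
hdd_cost_per_gb = 0.02
nvme_ssd_cost_per_gb = 0.50   # ~25x the HDD figure, per the gap cited above
gb_per_pb = 1e6
print(f"1 PB on HDD:  ${hdd_cost_per_gb * gb_per_pb:,.0f}")       # $20,000
print(f"1 PB on NVMe: ${nvme_ssd_cost_per_gb * gb_per_pb:,.0f}")  # $500,000
```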

In addition, SAS expanders allow an economical and straightforward way to scale. This provides a cost-effective implementation for most workloads compared to NVMe, which uses switch-based scaling (often requiring retimers) that adds costs and complexity to achieve higher performance. 

Software and OS vendors are still developing the applications and features that truly take advantage of NVMe’s performance benefits, unlike SCSI-based SAS. Until these vendors fully support NVMe, the benefits realized will be limited.

Summing it up 
There is continued adoption of SSDs in the data center. Because of this, the number of vendors has grown over the years, especially those developing new architectures that optimize how data is stored and retrieved specifically for solid-state storage technologies. This has spilled over into new storage companies that support only solid-state technology. In the enterprise market, the strongest new growth in consumer-grade SSDs has come from CSPs, MSPs and the media and entertainment markets, but adoption of data center-class SSDs has been slower in these markets due to cost. This is an indicator that SAS-based SSDs and HDDs will be consumed at a higher rate than NVMe for their cost, capacity and lower power characteristics. Most IT decision makers don’t fully understand the TCO of SAS SSDs vs. NVMe SSDs. This has been a limiting factor, especially with finance departments (who are looking for the lowest prices) determining how money is spent on hardware. There is still much education that needs to occur in this area.

While NVMe may win in raw performance, SAS wins everywhere else – scalability, power efficiency, manageability, reliability, and support. SAS will continue to be the foundation for data center computing for years to come.