Tuesday, October 08, 2019

All ATTO products are tested compatible with the newly released macOS Catalina


ATTO Technology, Inc., a global leader in network and storage connectivity and infrastructure solutions, today announced tested compatibility and support for the newly released macOS® Catalina, and that it is ready to support the upcoming Apple Mac Pro® 2019.

Apple Mac Pros have been the platform of choice for creative professionals since 2006, and ATTO was the first to offer the network and storage connectivity upgrades and support needed to meet the robust requirements of digital production workflows. Anticipating macOS Catalina and the soon-to-be-released Mac Pro 2019, ATTO is ready now to support high-performance storage connectivity for this next generation of professional-level Apple products.

“It makes the Mac feel more ‘pro’ again. Catalina is a big leap in making the Mac Pro a professional workstation geared toward high-end media and audio workflows,” says Tim Klein, CEO of ATTO Technology, Inc. “With the soon-to-be-released Mac Pro, we expect to see a resurgence of Apple in all facets of creative environments. ATTO and Apple remain a perfect pair for creative professionals.”

ATTO products are the highest-performing storage connectivity solutions available for Mac® environments and provide future-proof compatibility with Catalina and Mac Pro 2019. The broad portfolio of ATTO network and storage connectivity products tested for Macs includes ATTO Celerity™ Fibre Channel HBAs, ATTO FastFrame™ Ethernet adapters, ATTO ExpressSAS® HBAs, and ATTO ThunderLink® Thunderbolt™ adapters.

ATTO has made new software tools available specifically for macOS Catalina that will enhance data-intensive and media production workflows. ATTO Xtend SAN iSCSI Initiator version 5.5 provides high availability with load balancing and failover for iSCSI storage on Macs over Ethernet. ATTO Disk Benchmark for macOS brings the same industry-leading benchmark tool known from Windows® to the Mac, in a version developed exclusively for macOS.


Thursday, September 19, 2019

The Wait is Over, LTO-8 Tape is Back!

You heard that right, the wait is over, LTO-8 tape is back! If ever there was anything sexy about tape, this has got to be one of those moments… LTO-8 media is back, and boy are we glad!

An influx of LTO-8 media will flood the market very soon. This news couldn’t have come at a better time, as we continue to see growing demand for low-cost, higher-capacity, long-term archiving in many enterprises. LTO tape continues to meet these requirements, delivering high storage capacity, blazing-fast transfer rates, easy-to-use functionality, and steadfast reliability. LTO is still the ideal solution for long-term data retention and archiving.

The LTO Consortium’s announcement came on August 5, 2019, stating that Fujifilm and Sony are now both licensees of Generation 8 technology. The relevance of tape continues to increase for archive and offline long-term storage, particularly with large cloud providers but also with SMB customers on tight budgets. With up to 30TB of compressed capacity and up to 750MB/s data transfer rates, LTO Generation 8 continues to push innovation.
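For a sense of scale, here is a quick back-of-the-envelope sketch in Python using the figures above; it is only a rough estimate that ignores real-world factors such as the actual compression ratio and drive streaming behavior.

```python
# Back-of-the-envelope: time to fill one LTO-8 cartridge at the figures
# quoted above (30TB compressed capacity, 750MB/s compressed transfer rate).
# Real jobs vary with the actual compression ratio and drive streaming.

capacity_tb = 30          # compressed capacity quoted for LTO-8
rate_mb_per_s = 750       # compressed transfer rate quoted for LTO-8

seconds = (capacity_tb * 1_000_000) / rate_mb_per_s   # TB -> MB, decimal units
print(f"~{seconds / 3600:.1f} hours to fill one cartridge")   # ~11.1 hours
```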

It is projected that data will grow to 44 zettabytes by 2020, a 50-fold increase from the beginning of 2010. The largest contributors to this growth include machine and sensor data, high-resolution video, and digital documents and images, to name just a few.

The outlook for LTO tape remains bright and strong, with current and future generations delivering tremendous scalability and growth. As data volumes grow by the second, LTO can provide longevity, security, and assurance at a lower cost than other storage solutions.

Want to order LTO media? Simply click here!

Friday, August 23, 2019

78.6 Million HDDs Shipped in 2CQ19

While 2CQ19 HDD shipment totals of 78.56 million essentially matched the company’s forecast published in May, several factors, both positive and negative, have influenced the near-term HDD forecast published in this report.

On the positive side, nearline demand from cloud customers continued to accelerate from the early recovery witnessed in 1CQ19. For 2CQ19, total nearline shipments jumped 13% Q/Q to 12.40 million units and, more importantly, total capacity shipped of 105.72EB advanced 19% over the same period due to a surge in 14TB shipments at two tier-1 hyperscale companies. 12TB drives, while declining slightly Q/Q, shipped in higher volumes than all other nearline capacities as the 14TB transition at other cloud companies commences in 3CQ19.
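As a quick sanity check on those figures, the implied average capacity per nearline drive shipped works out to roughly 8.5TB; a minimal Python sketch:

```python
# Quick sanity check on the 2CQ19 nearline figures quoted above:
# average capacity shipped per nearline drive.

units = 12.40e6          # nearline HDD units shipped in 2CQ19
capacity_eb = 105.72     # total nearline capacity shipped, in exabytes

avg_tb = capacity_eb * 1e6 / units   # EB -> TB, divided by unit count
print(f"~{avg_tb:.1f} TB average per nearline drive shipped")   # ~8.5 TB
```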

Client HDD demand, while seasonally weak in 2CQ19, came under additional pressure from accelerating SSD attach rates driven by falling SSD pricing caused by NAND oversupply. In addition, seasonally stronger game console HDD sales were muted in 2CQ19 as the current console generations enter their seventh year. While a console refresh from both Microsoft and Sony is expected as early as late 2020, industry chatter indicates that the new platforms will sport SSDs in place of HDDs.

Trade tensions are also high on the list of concerns among both HDD customers and suppliers. While threatened tariff increases in June did not materialize, PC manufacturers and ODMs accelerated system builds in 2CQ19 in order to avoid potentially higher export tariffs on products shipping out of China. Due to these build-aheads, PC OEMs are cutting builds in early 3CQ19 to manage inventory, an action that will further temper seasonal HDD demand.

The uncertainties caused by continued tariff threats weigh on overall spending across numerous economies, spreading fears that a broader global economic slowing is beginning to take hold. As a result, the 2019 HDD forecast has been reduced slightly to account for a weaker PC forecast projected for the year.

At the same time, nearline demand remains essentially unchanged from prior forecasts in terms of annual exabyte growth. A slightly faster recovery of nearline demand by cloud customers marginally reduces the nearline unit forecast for 1H19, but the impact is small relative to the total HDD TAM.

Over the revised long-term forecast, total HDD demand has been reduced, primarily due to a change in the game console HDD outlook. With the next generation of Sony PlayStation and Microsoft Xbox converting from HDD storage to SSDs, based on current market information, long-term game console HDD shipments will decline at a faster rate than in the previous forecast. In all likelihood, both companies will continue to sell current HDD-enabled platforms as a lower-priced option until SSD pricing falls to the point where HDDs can be displaced across the entire console line-up.

Higher-capacity client HDDs will continue to serve markets that value low-cost, high-capacity solutions, such as surveillance and external retail HDDs. The external HDD market will remain steady while demand for surveillance HDDs continues to grow over the long term; however, other applications such as PCs and most distribution HDDs will have limited demand for capacities greater than 2TB. While the long-term HDD unit forecast remains relatively unchanged in total, the capacity growth for total client HDDs has been reduced slightly, from a 25% five-year CAGR to 24%.

Wednesday, July 31, 2019

SAS or NVMe? Decisions, Decisions

Storage architects need to respond to today’s business challenges by ensuring the storage solutions they choose provide the security, stability, scalability and management features necessary to support their ecosystem. As they look toward adopting new storage technologies, there are essential considerations they should weigh and review before moving to a new technology. New storage protocols are continually entering the market, and this paper explores how SAS technology remains the foundation for storage networks and will continue to deliver on that promise tomorrow and beyond.
  
The trend toward SSDs
Solid-state drive (SSD) storage, enabled by NAND flash technology, has seen a dramatic rise in adoption for primary storage, especially since SSDs have the potential to provide much higher performance (IOPS and MB/s). As NAND flash becomes a commodity, prices continue to drop to the point that smaller-capacity SSDs (less than 500GB) are competing with HDDs for market share. Both the SAS and NVMe protocols support SSDs. So the question becomes: how should IT architects evaluate and integrate these technologies into their data center architecture?

NVMe
NVMe is designed specifically for the unique characteristics of solid-state technology. However, limitations in hardware and application infrastructure make it difficult to take full advantage of its performance benefits. This is especially important because NVMe SSDs are as much as 25x more expensive per GB than traditional HDDs. In addition, management standards such as NVMe-MI are still in development and not yet widely deployed. It will take at least two to three years before robust solutions exist that are sufficient to support enterprise storage systems. While low-overhead NVMe technology shows promise for wider-scale implementation, IT architects need technologies that are proven and that they can depend on today.

SAS 
The SAS interface is a general-purpose communication protocol for all types of storage media – hard drives (HDDs), SSDs, tape and more. Extremely fast, a single SAS port can achieve top performance of 2.4GB/s and millions of I/O operations per second (IOPS). More importantly, the SAS roadmap continues to evolve to support the even higher performance expectations of tomorrow’s data center. For instance, 24Gb/s SAS with ‘fairness’ capabilities (expected in mid-2020) will allow users to build out high-performance 24Gb/s storage networks without significant infrastructure changes, using standard 12Gb/s storage on the back end. The roadmap doesn’t stop there: 48Gb/s SAS is expected as soon as 2025.
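As a rough illustration of what that per-port figure means in practice, here is a minimal Python sketch of the aggregate bandwidth of a wide port; the x4 width is an assumption for illustration, and real throughput depends on encoding, protocol overhead, and workload.

```python
# Rough illustration of SAS wide-port bandwidth, using the per-port figure
# quoted above. The x4 width is an assumption for illustration; actual
# throughput depends on encoding, protocol overhead, and workload.

per_port_gb_s = 2.4   # quoted top performance of a single SAS port
lanes = 4             # typical wide-port width (illustrative assumption)

print(f"x{lanes} wide port: ~{per_port_gb_s * lanes:.1f} GB/s aggregate")   # ~9.6 GB/s
```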

SAS architectures deliver cost-effective performance with a better cost per GB than NVMe, especially with higher capacity and density SSDs and HDDs. And, today SAS has a much larger installed base in global datacenters and cloud service providers than NVMe.

According to IDC, SAS comprises more than 70% of enterprise storage drives and is expected to account for over 85% of enterprise storage capacity through 2022. This means that, at present, SAS has a larger SCSI developer base and a supported roadmap for continuing to develop low-cost, high-performance solutions with a technology that has been around for over 30 years.

Most data centers today depend on SCSI-based technology, which is the underlying command set behind SAS. Moving to a completely different I/O protocol such as NVMe requires a major forklift upgrade, including changes to management interfaces and tools as well as to the architecture, controllers, NVMe-supported servers, and data protection, to name a few. Despite the performance promise of NVMe, widespread adoption is not a trivial economic or operational undertaking.

Another consideration is capacity requirements. SSD storage is only able to support a small percentage of the overall capacity needs of the typical data center. While SSDs are great, HDDs have a 2-10x capacity advantage, which translates into significant cost savings. SAS is, by far, the interface of choice for HDDs and mixed storage environments.
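To make the cost argument concrete, here is an illustrative Python sketch comparing the outlay for a fixed capacity target; the capacity target and per-GB prices are hypothetical, with the NVMe figure derived from the "as much as 25x" premium mentioned earlier.

```python
# Illustrative only: outlay for a fixed capacity target using hypothetical
# per-GB prices. The NVMe price is derived from the "as much as 25x" per-GB
# premium over HDDs cited earlier; substitute real quotes for a true TCO.

required_tb = 500                          # hypothetical capacity target
hdd_cost_per_gb = 0.03                     # hypothetical HDD price, $/GB
nvme_cost_per_gb = hdd_cost_per_gb * 25    # "as much as 25x" per the text

hdd_total = required_tb * 1_000 * hdd_cost_per_gb
nvme_total = required_tb * 1_000 * nvme_cost_per_gb
print(f"HDD:  ${hdd_total:,.0f}")    # $15,000
print(f"NVMe: ${nvme_total:,.0f}")   # $375,000
```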

In addition, SAS expanders allow an economical and straightforward way to scale. This provides a cost-effective implementation for most workloads compared to NVMe, which uses switch-based scaling (often requiring retimers) that adds costs and complexity to achieve higher performance. 

Software and OS vendors are still developing the applications and features that truly take advantage of NVMe’s performance benefits, unlike SCSI-based SAS, which is fully supported today. Until these vendors offer full support for NVMe, the benefits that can be realized are limited.

Summing it up 
There is continued adoption of SSDs in the data center. As a result, the number of vendors has grown over the years, especially those developing new architectures that optimize how data is stored and retrieved specifically for solid-state storage technologies, and this has spilled over into new storage companies that support only solid-state technology. In the enterprise market, the strongest new growth in the use of consumer-grade SSDs has come from CSPs, MSPs and the media and entertainment markets, but adoption of data center-class SSDs in these markets has been slower due to cost. This is an indicator that SAS-based SSDs and HDDs will be consumed at a higher rate than NVMe because of their cost, capacity and lower power characteristics.

Most IT decision makers do not fully understand the TCO of SAS SSDs versus NVMe SSDs. This has been a limiting factor, especially with finance departments (who are looking for the lowest prices) determining how money is spent on hardware. There is still much education that needs to occur in this area.

While NVMe may win in raw performance, SAS wins everywhere else – scalability, power efficiency, manageability, reliability, and support. SAS will continue to be the foundation for data center computing for years to come. 

Tuesday, April 23, 2019

ISC West: Quantum VS-Series for Video Surveillance and Industrial IoT

Systems to record and store surveillance footage, running security infrastructure on a single cloud-like platform

Quantum Corp. announced the VS-Series, a flexible storage platform designed for surveillance and industrial IoT applications.

The series is available in a range of server choices, suitable for deployments ranging from fewer than ten cameras up to the largest environments with thousands of cameras. Using the VS-Series, security professionals can record and store surveillance footage and run an entire security infrastructure on a single platform.

Smarter, safer buildings and cities drive need for new approach to data

Surveillance and security operations continue to get more complex, with more cameras, higher-resolution imagery, and increasing data retention requirements. As buildings become smarter, everything from access control systems to lighting and HVAC is now connected – part of the IoT. There is an opportunity to simplify operations by consolidating and converging many applications onto a single platform.

Software platform for video surveillance

VS-Series architecture is based on the Quantum Cloud Storage Platform (CSP), a software-defined storage platform designed for storing machine- and sensor-generated data. Like storage technologies used in the cloud, the CSP is software-defined and can be deployed on bare metal, as a VM, or as part of a hyperconverged infrastructure. Unlike other SDS technologies, it was designed for video and other forms of high-resolution content – engineered for low latency and for maximizing the streaming performance of large files to storage.
The company’s Cloud Storage Platform enables high-speed video recording at high camera density and can host and run certified VMS management applications, recording servers, and other building control servers on a single platform.

Flexible software-defined architecture for range of deployments
The VS-Series product line is being offered in a variety of deployment options, including software-only, mini-tower, 1U, 2U, and 4U hyperconverged servers.
Key VS-Series attributes:
  • Efficiency: Supports high camera density, with a software architecture that enables users to run their entire security infrastructure on a single hyperconverged platform.
  • Flexibility: The series is a software-defined platform which offers a range of deployment options. Many appliances can scale out for more cameras or scale up for increased retention.
  • Easy to deploy, operate and maintain: The VS-Series comes pre-installed with certified VMS applications, and can be installed and configured in minutes. It is backed by service and support of the company and integrator partners.
  • Resilience: The series software offers a fault-tolerant design that minimizes the impact of hardware and software issues and is intended to virtually eliminate downtime.

The company unveiled the VS-Series at ISC West. The first products to launch are the VST10x mini-tower appliances and the VS2112 and VS2124 2U servers, available this quarter. Additional offerings will be available later this year.

Josh Woodhouse, principal analyst, IHS Markit, said: “There are huge increases in ubiquitous video surveillance data required to be processed, analyzed and stored. The total market capacity shipped for enterprise storage systems in video surveillance is forecast to grow at a CAGR of 40% from 2017 to 2022. Storage demand is strongest in safe or smart cities installations, and in retail and transportation hubs around the globe, where end-users are looking to both protect against security threats and maximize the efficient use of collected data. Scalable, flexible storage systems are important in achieving this goal.”

Jeremy Scott, manager, strategic alliance partner, Americas, Milestone Systems A/S, said: “Security systems today include more high-resolution cameras to capture relevant situational information. Our partnership with Quantum lets us deliver scalable video surveillance solutions with the storage capacity and access suited to the increased data volumes these systems generate.”

Keith Bishop, strategic growth manager, data enabled business, Johnson Controls, said: “Buildings and cities have Internet connected devices everywhere – cameras, badge readers, lighting, HVAC systems, and more – which creates a need to consolidate and manage this digital data in one place. Being able to converge surveillance recording, VMS systems, and other building control applications onto a single platform means our customers can simplify their infrastructure as we help them create smarter and safer buildings and cities.”

Jamie Lerner, president and CEO, Quantum, said: “As we worked with partners to design a platform for surveillance, we knew that streaming performance and camera density would be critical, as well as hyperconvergence both for efficiency and to consolidate the security infrastructure. I’m pleased that we were able to bring this VS-Series to market so quickly – it demonstrates both the software-defined nature of the Quantum Cloud Storage Platform and how we can take our years of expertise working in movie and TV production and construct a platform with leading performance for the surveillance industry.”

Thursday, March 21, 2019

Five Best Practices to Securely Preserve Video, Photo and Other Data

Whether you’re working with video, photo, audio, or other data, preserving the security of your data has to be at the top of your priority list. Data security might sound like a challenging proposition, but by following just a handful of guidelines it becomes a straightforward and easily accomplished task.

We’d like to share what we consider best practices for maintaining the safety of your data. For both seasoned pros and those just getting started with digital media, these best practices are important to implement and revisit regularly. We believe that by following these practices – independently of which specific storage software, service, or device you use – you will ensure that all your media and other data are kept secure to the greatest extent possible.

1- Keep Multiple Copies of Your Media Files
Everyone by now is likely familiar with the 3-2-1 strategy for maintaining multiple copies of your data (video, photos, digital asset management catalogs, etc.). Following a 3-2-1 strategy simply means that you should always have at least three copies of your active data: two that are local and at least one that is stored in another location.


Mind you, this is for active data, that is, files and other data that you are currently working on and want to have backed up in case of accident, theft, or hardware failure. Once you’re finished working with your data, you should consider archiving your data, which we’ve also written about on our blog.
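For those who like to script things, here is a minimal Python sketch of the local half of a 3-2-1 routine; the paths are hypothetical, and the off-site step is only a placeholder for whatever cloud or removable-media workflow you use.

```python
# A minimal sketch of the local half of a 3-2-1 routine: the working copy,
# a second copy on another local device, and a reminder that the third copy
# belongs off-site. Paths are hypothetical; dirs_exist_ok needs Python 3.8+.

import shutil
from pathlib import Path

source = Path("~/Projects/current_edit").expanduser()      # copy 1: working data
local_backup = Path("/Volumes/BackupDrive/current_edit")   # copy 2: second local device

shutil.copytree(source, local_backup, dirs_exist_ok=True)  # refresh the local backup

# Copy 3 lives off-site: a cloud bucket or a drive stored at another location,
# synced with whatever tool your provider supplies.
print("Local backup refreshed; remember to sync", source, "off-site as well.")
```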
 
2- Use Trustworthy Vendors
There are times when you can legitimately cut corners to save money, and there are times when you shouldn’t. When it comes to your digital media and services, you want to go with the best. That means using top-notch memory sticks, HDDs and SSDs, software, and cloud services.

For hardware devices and software, it’s always helpful to read reviews or talk with others using the devices to find out how well they work. For HDD reliability, our Drive Stats blog posts can be informative and are a unique source of information in the storage industry.
For cloud storage, you want a vendor with a strong track record of reliability and cost stability. You don’t want to use a cloud service or other SaaS vendor that has a history of making it difficult or expensive to access or download your data from their service. A top-notch service vendor will be transparent in its business practices, inform you of any service outages or maintenance windows, and try as hard as possible to make things right if problems occur.

3- Always Use Encryption (The Strongest Available)
Encrypting your data provides a number of benefits. It protects your data no matter where it is stored, and also when it is being moved – potentially the most vulnerable exposure your data will have.

Encrypted data can’t be altered or corrupted without the changes being detected, which provides another advantage. Encryption also enables you to meet requirements for privacy and security compliance and to keep up with changing rules and regulations.
Encryption comes in different flavors. You should always select the strongest encryption available, and make sure that any passwords or multi-factor authentication you use are strong and unique for each application.
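As an illustration (not a recommendation of any particular tool), here is a minimal Python sketch that encrypts a file before it leaves your machine, using the third-party cryptography package; the file name is hypothetical, and because Fernet is authenticated encryption, any tampering is detected on decryption, in line with the point above.

```python
# A minimal sketch of encrypting a media file before it leaves your machine,
# using the third-party "cryptography" package (pip install cryptography).
# The file name is hypothetical, and key handling is deliberately simplified:
# in practice, store the key securely and never next to the encrypted data.

from cryptography.fernet import Fernet

key = Fernet.generate_key()     # keep this somewhere safe (password manager, KMS)
cipher = Fernet(key)

with open("interview_take3.mov", "rb") as f:
    token = cipher.encrypt(f.read())    # authenticated encryption: tampering is detected

with open("interview_take3.mov.enc", "wb") as f:
    f.write(token)

# Later, cipher.decrypt(token) returns the original bytes (or raises if altered).
```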

4- Automate Whenever Possible
Don’t rely on memory or personal discipline alone to remember to regularly back up your data. While we always start with the best of intentions, we are busy and often let things slide (much like resolving to exercise regularly). It’s better to have a regular schedule that you commit to, and best if the backups happen automatically. Many backup and archive apps let you specify when backups, incremental backups, or snapshots occur. You can usually set how many copies of your data to keep, and whether backups are triggered by date and time or by changes to the data.
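If your tools don’t offer scheduling, a small script plus your operating system’s scheduler gets you most of the way there. The sketch below is a minimal example with hypothetical paths and a simple retention count; point cron, launchd, or Task Scheduler at it and forget about it.

```python
# A minimal sketch of an automated, timestamped backup with a simple retention
# count. Paths are hypothetical; run it on a schedule with cron, launchd, or
# Task Scheduler so backups happen without relying on memory.

import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path("~/Projects/current_edit").expanduser()
DEST_ROOT = Path("/Volumes/BackupDrive/auto_backups")
KEEP = 7   # number of dated copies to retain

def run_backup():
    DEST_ROOT.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M")
    shutil.copytree(SOURCE, DEST_ROOT / stamp)        # new dated copy
    for old in sorted(DEST_ROOT.iterdir())[:-KEEP]:   # prune the oldest copies
        shutil.rmtree(old)

if __name__ == "__main__":
    run_backup()
```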

Automating your backups and archives means you won’t forget to back up, and it makes it far more likely that your data will be not only recoverable after an accident or hardware failure but also up to date. You’ll be glad for the reduced stress and worry in your life, as well.

5- Be Mindful of Security in Your Workflow
Nobody wants to worry about security all the time, but if it’s ignored, sooner or later that inattention will catch up with you. The best way to both increase the security of your data and reduce stress in your life is to have a plan and implement it.

At its simplest, the concept of security mindfulness means that you should be conscious of how you handle your data during all stages of your workflow. Being mindful shouldn’t require you to overthink, stress or worry, but just to be aware of the possible outcomes of your decisions about how you’re handling your data.
If you follow the first four practices in this list, then this fifth concept should flow naturally from them. You’ve taken the right steps toward a long-term plan for maintaining your data securely.

Data Security Can Be Both Simple and Effective
The best security practices are the ones that are easy to follow consistently. If you pay attention to the five best practices we’ve outlined here, then you’re well on your way to secure data and peace of mind.

Thursday, February 28, 2019

EMEA Storage From 9.5ZB in 2018 to 48.3ZB in 2025, 27% CAGR

Driven by video surveillance, signals from IoT devices, metadata, and entertainment

The Global Datasphere, a measure of how much new data is created and replicated each year, will grow by more than five times over the next seven years. The total amount of new data created in 2025 is forecast to increase to 175ZB from 33ZB in 2018.

The major drivers of this growth are largely consistent across the world’s various regions but occur at different rates. Entertainment data and video surveillance footage have long been (and continue to be) significant drivers of the Global Datasphere. However, signals from IoT devices, metadata (vital for analytics, contextualization, and AI), and productivity data are showing even faster growth in today’s increasingly digitized world.

Nevertheless, amid the similarities across various regions, there are subtle differences. These differences are based on technology adoption and digital transformation across a region’s population of consumers and enterprises.

Storage in Europe, the Middle East, and Africa (EMEA) is growing slightly more slowly than the overall Global Datasphere (a 2018-2025 CAGR of 26.1% versus 27.2%, respectively). EMEA storage will increase from 9.5ZB in 2018 to 48.3ZB in 2025, slipping from 28.8% to 27.6% of WW storage over the same period. Nearly one third of this growth will be driven by video surveillance, signals from IoT devices, metadata, and entertainment. For example, user-created and user-consumed online video such as YouTube is one of the top five fastest-growing segments of data creation.
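As a quick check, the CAGRs implied by those zettabyte figures can be reproduced in a few lines of Python; small differences from the quoted percentages come from rounding of the inputs.

```python
# Reproducing the growth rates quoted above from the 2018 and 2025 figures.
# Small differences from the quoted percentages come from rounding of the
# zettabyte inputs.

def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

print(f"Global: {cagr(33, 175, 7):.1%}")    # ~26.9% (quoted as 27.2%)
print(f"EMEA:   {cagr(9.5, 48.3, 7):.1%}")  # ~26.2% (quoted as 26.1%)
```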
Also contributing to the growth of EMEA storage is the sheer number of users getting online for the first time to begin their own digital journeys, which result in the consumption, creation, and sharing of data. In MEA countries, only 31% of the population was using the Internet in 2018, compared with 86% in Western European countries and 53% globally. This dynamic places pressure on enterprises and governments to upgrade infrastructures to accommodate the growing base of users.

In the more technologically advanced EMEA countries, the edge is an important intermediary between the core and the endpoints, helping to facilitate the creation and consumption of online video as well as real-time, on-the-go decisions. Hence the percentage of data in the EMEA Datasphere emanating from or replicated at the edge will nearly double – from 11% to 21% of the region’s total Datasphere – as IoT devices increasingly drive processing and analytics closer to the point where the data originates.

Data is at the heart of this digital world, and we are increasingly becoming an information economy. Value is moving to data, enabling a new world of smarter products, better customer experiences, and self-learning, ever-improving digital services. In fact, 43% of EMEA organizations executing digital transformation initiatives have made data capitalization their top priority for progress. Data is also the heartbeat of modern user experiences and services built using next-generation technologies such as cognitive computing, IoT, AI, and machine learning.

This unprecedented data growth, combined with the pressure to derive value from data for digital transformation, will create imperatives for IT and business organizations across all regions over the next decade: develop a fitting storage, management, and capitalization strategy, and drive a new level of engagement with consumers through data-informed services and products. Whether it is surveillance growth in the UK and France, manufacturing in Germany, or growth in Russia’s mining industry, data is playing a much more critical role.