Wednesday, November 05, 2008

NAS - Best Practices

About 15 years ago, in the days when NetApp was known as Network Appliance, the company set out to make a network-attached storage (NAS) system that would be easy to use and easy to manage.

Today's NAS products remain true to that guiding principle.


"The reality of the situation is that just about every NAS box you can buy today, from startups or from established players, is up and running in 20 minutes," says Arun Taneja, founder and president of the Taneja Group, a consulting firm that focuses on storage technologies. "They're all very easy to use."


Storage administrators who want to tailor their NAS arrays to work with certain applications or under specific situations can easily find countless white papers and guides on NAS vendor websites. For instance, NetApp's online library lists more than 30 technical reports with the words "best practices" in the title. The vendor offers recommendations on how to run NetApp products with databases from Oracle, Microsoft and MySQL, virtualization software from VMware and Microsoft, and ERP software from SAP, just to name a few.


And sometimes it does pay to read the fine print, as the Sacramento Superior Court learned the hard way in July. No one in the IT department had noticed NetApp's recommendation to use a dedicated connection for each port on its Cisco switch until it was too late. Because the switch carries traffic for 90% of the court's servers, there was substantial downtime while the IT staff diagnosed the problem. Now that the Ethernet module has been upgraded, performance has improved dramatically, according to Lewis Walker, a senior IT analyst at the state court.


But beyond consulting the white papers, there are five universal best practices that apply to NAS arrays.


Best Practice #1. Plan ahead and make future needs your top priority


Walker advises his storage peers to take into account their anticipated product replacement cycles as they assess their long-term storage needs. If it's four years, they need to size out the environment as best they can with that endpoint as the goal. But he also recommends they build in some headroom for unanticipated requests.


"Once you get the technology, someone's going to want to adapt it to do something that no one thought of," Walker says. "Everybody wants to use it."


The Sacramento Superior Court reached the limit of 12 disk shelves on its NetApp FAS 3020 within three years and recently put in a purchase order to upgrade to the FAS 3140, which can take 30 disk shelves, Walker notes.
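
Sizing against a replacement cycle is mostly compound-growth arithmetic. As a back-of-the-envelope illustration, the short Python sketch below projects capacity over a four-year horizon and pads the result with headroom; the starting capacity, growth rate and buffer are invented placeholders to be swapped for figures from real monitoring.

    # size_plan.py - back-of-the-envelope sizing for a replacement cycle.
    # All of these inputs are invented placeholders, not recommendations.
    START_TB = 10.0      # capacity in use today
    GROWTH = 0.40        # assumed 40% annual growth
    CYCLE_YEARS = 4      # the replacement cycle in Walker's example
    HEADROOM = 0.25      # buffer for the requests no one anticipated

    needed = START_TB * (1 + GROWTH) ** CYCLE_YEARS
    print(f"projected use at year {CYCLE_YEARS}: {needed:.1f} TB")
    print(f"size the purchase for: {needed * (1 + HEADROOM):.1f} TB")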

According to Taneja, users tend to like their first NAS boxes so much that, if they're not careful, they can fill them up within a few months. Then they buy a second box, and changes must be made on the client side so computers know where to find the files, which, he notes, is "not trivial if you've got 2,000 users." When the second box reaches capacity, the company adds a third and a fourth and so on, until it finds itself with a management nightmare.


"NAS has a habit of growing like rabbits, and the more rabbits you have, the bigger your headache," Taneja says. He advises clients to do a systematic analysis before installing NAS, but "usually they come to me when they've bashed their heads, and they're already at a hundred."


When weighing expansion plans, administrators also need to think about what sorts of policies they might put in place, such as limits on MP3 files or personal data. They should use management tools to study how the NAS array is being used, how many files are in the system and how many are active, to gain insight into how they can manage more effectively.


"The administrators of the NAS system, at some point, have to ask themselves whether they can just keep buying storage," says Robert Passmore, an analyst at Gartner Inc., noting that "some people establish quotas, which forces the users occasionally to go back and delete files." But the quota system, he says, is "disruptive to the organization." Automatic archiving of files that are no longer in use onto lower cost media is a better option.


Best Practice #2. Tier the data and automate the process


All disks are not created equal. A Fibre Channel drive might spin at 15,000 rpm, but it's also far too costly and power-hogging to use for low-priority data. Less important files might make more sense on a cheaper, power-efficient, high-density SATA drive spinning at 5,400 rpm.


But putting the data on the right type of disk at the right time, or tiering, isn't enough, according to Brad Bunce, EMC's director of product marketing for NAS platforms. "Tiering the data allows you to control your costs from a disk perspective," he says. "The key is to have automation in the tiering. The last thing administrators want to be doing is running reports to find stale data and then manually moving it."

Bunce describes how the process works in an EMC storage system that permits different drive types in the same array. A Celerra file-mover API, used in conjunction with EMC's Rainfinity file management appliance (FMA), allows users to automate the movement of data based on policies.


If a policy calls for relocating files that haven't been accessed for 30 days, the FMA moves the file from the high-performance disk to the low-performance disk. A stub pointing to the file's new location is left behind, so a user or application doesn't know the file has moved. When the file is recalled, the client can directly access the file at its new location.
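
For readers who want to picture the policy rather than EMC's product, the Python sketch below is a conceptual stand-in, not the Rainfinity FMA: it sweeps an assumed fast-tier mount, relocates files whose last access is older than 30 days to a cheaper volume and leaves a symlink behind as a crude substitute for a real stub.

    # tier_mover.py - conceptual sketch of policy-based tiering; mount
    # points are invented and a symlink stands in for a vendor stub.
    import os
    import shutil
    import time

    FAST_TIER = "/mnt/tier1"    # assumed high-performance volume
    SLOW_TIER = "/mnt/tier2"    # assumed high-density SATA volume
    MAX_IDLE = 30 * 24 * 3600   # 30 days, matching the example policy

    now = time.time()
    for root, dirs, files in os.walk(FAST_TIER):
        for name in files:
            src = os.path.join(root, name)
            if os.path.islink(src):
                continue    # already a stub from an earlier run
            if now - os.stat(src).st_atime < MAX_IDLE:
                continue    # accessed recently; keep it on the fast tier
            dst = os.path.join(SLOW_TIER, os.path.relpath(src, FAST_TIER))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)   # relocate the cold file
            os.symlink(dst, src)    # leave a pointer at the old path
            print(f"tiered {src} -> {dst}")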

Another option is selecting products from different storage vendors. The Sacramento Superior Court uses NetApp's FAS midrange systems for its most mission-critical data but opted for a lower-cost Hewlett-Packard NAS box running Microsoft's Windows Storage Server for less important, second-tier data, such as employees' private drives. With the HP system, which it must manage separately, the court sacrifices useful features such as thin provisioning and data deduplication, but the cost savings have been substantial, Walker says.


"As good as NetApp is, they are expensive," he says. "You get what you pay for."


Best Practice #3. Thin provision whenever possible


With thin provisioning, an administrator sets the high-water mark for the storage's ultimate limit, then provisions only a small amount to start, allowing the NAS array to automatically grow the file systems or iSCSI LUNs as users request more storage. Alerts warn administrators when they're due to run out of physical storage so they can add capacity on the fly, nondisruptively, to keep operations humming.
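
The accounting behind that description fits in a few lines. The toy Python class below models the concept rather than any vendor's implementation: volumes are promised more logical capacity than the pool physically holds, and an alert fires when actual writes cross a high-water mark (80% here, an arbitrary choice).

    # thin_pool.py - toy model of thin-provisioning accounting, not any
    # vendor's implementation.
    class ThinPool:
        def __init__(self, physical_bytes, alert_pct=80):
            self.physical = physical_bytes   # real disk behind the pool
            self.used = 0                    # bytes actually written
            self.promised = 0                # sum of logical volume sizes
            self.alert_pct = alert_pct       # high-water mark for alerts

        def provision(self, logical_bytes):
            # thin provisioning: promise capacity without reserving it
            self.promised += logical_bytes

        def write(self, nbytes):
            if self.used + nbytes > self.physical:
                raise RuntimeError("pool exhausted -- add shelves now")
            self.used += nbytes
            pct = 100 * self.used / self.physical
            if pct >= self.alert_pct:
                print(f"ALERT: pool {pct:.0f}% full; add physical capacity")

    pool = ThinPool(physical_bytes=10 * 1024**4)   # 10 TB of real disk
    for _ in range(3):
        pool.provision(8 * 1024**4)    # promise 3 x 8 TB = 24 TB logical
    pool.write(int(8.5 * 1024**4))     # 85% of physical -- trips the alert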


The leaders in the NAS space, NetApp and EMC, offer thin provisioning to maximize storage efficiency, as do many other vendors. "Set it and forget it," is how EMC's Bunce describes the process. Ticking a single checkbox in the company's Celerra storage system enables the feature, which spares administrators from over-allocating physical capacity up front and allows them to put unused capacity to other purposes, if necessary.


Best Practice #4. Take snapshots of the data


Creating logical point-in-time copies of the data will pay off the first time a file is lost, corrupted or accidentally deleted. Restoring data from a snapshot is quick compared with the time-consuming ordeal of going to tape or some other subsystem. Plus, snapshots consume little space, have minimal performance impact and, if enabled, even permit users to recover data on their own.


"Make sure that snapshot gets backed up somewhere," warns Greg Schulz, founder of StorageIO Group. The backup can be on another NAS array, a disk or a tape, he says. "The array might be set up to do snapshots on its own, but if something happens to that array, your backup just went away. In other words, be in total protection."


Best Practice #5. Secure the array and the data


With NAS, a storage administrator is attaching storage to a network, and that is NAS's value proposition: the ability to share over a network. "That's the good news," Schulz says. "The bad news is you are going to share over a network, which means you have the potential that unauthorized people could access that data."


So the file systems need to be secured. Authentication mechanisms need to be put in place. Authorizations need to be set so only the appropriate users are permitted to access certain files. Administrative privileges need to be established to limit the number of people who can change the settings. Management tools and interfaces also need to be protected.


"There's both administrative and user security around files," Passmore said.


The Sacramento Superior Court uses Microsoft's Active Directory in conjunction with its NAS array to avoid the "administrative nightmare" of having to update user rights in two places, Walker says.
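
On a Linux server fronting NFS exports, even a simple audit catches the worst offenders. The Python sketch below scans a stock /etc/exports file and flags two classic mistakes: shares exported to any host, and shares that disable root squashing. It is NFS-only by design; CIFS permissions and the array's own management interface need their own review.

    # export_audit.py - flags risky rules in /etc/exports: wildcard
    # clients and no_root_squash both widen access. Paths and checks
    # follow stock Linux NFS conventions.
    EXPORTS = "/etc/exports"

    with open(EXPORTS) as f:
        for lineno, line in enumerate(f, 1):
            entry = line.split("#")[0].strip()   # drop comments and blanks
            if not entry:
                continue
            path, *clients = entry.split()
            for client in clients:
                host = client.split("(")[0]
                if "(" in client:
                    opts = client[client.find("(") + 1:client.rfind(")")].split(",")
                else:
                    opts = []
                if host in ("*", ""):
                    print(f"{EXPORTS}:{lineno}: {path} is exported to ANY host")
                if "no_root_squash" in opts:
                    print(f"{EXPORTS}:{lineno}: {path} lets remote root in ({host})")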


Also important is implementing best practices for virus protection, since a file stored on a NAS system could be infected. Vendors typically provide instructions for doing this.
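
As a rough out-of-band example, the Python sketch below sweeps a NAS mount with the open-source ClamAV scanner. Production setups would follow the vendor's on-access integration instead; the mount point and the presence of clamscan on the PATH are assumptions.

    # nas_av_scan.py - out-of-band sweep of a NAS mount with ClamAV.
    # clamscan exit codes: 0 = clean, 1 = infected found, 2 = error.
    import subprocess

    result = subprocess.run(
        ["clamscan", "--recursive", "--infected", "/mnt/nas"],
        capture_output=True, text=True)
    if result.returncode == 1:
        print("Infected files found:")
        print(result.stdout)
    elif result.returncode == 2:
        raise SystemExit("scan error: " + result.stderr)
    else:
        print("share is clean")

Scheduling a sweep like this outside business hours limits the I/O load on the array.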

