Virtualizing your server infrastructure can lower your costs and eliminate server sprawl. It can also free up rack space, reduce cooling requirements and energy use, increase flexibility and improve disaster recovery. However, without the right planning, virtualization can lead to systemwide disaster. Some applications may not work well in a virtualized environment; you should be prepared to move them onto dedicated hardware. Here are additional points to consider:
Run away from integrators who pitch hardware replication and then tell you that you don’t need an offsite solution. NetApp’s S family has an excellent backup system built into its architecture: its snapshot technology creates point-in-time replicas of your data in unallocated disk space. Daily backups kept on disk for several months are a dream come true for IT managers accustomed to slow, unreliable tape. However, a fire, a hardware failure or a configuration problem could still wipe out everything. Redundant backup is the key.
Microsoft recently created a huge buzz with its new Hyper-V product, and Citrix’s XenServer has had much success. However, the most respected and widely used product for virtualizing servers is VMware. More than 120,000 customers use VMware today, and the company is majority-owned by the highly respected EMC. More important than the customer count are VMware’s numerous partnerships, which mean you will find a local VMware guru far more easily than a XenServer or Hyper-V expert. Of course, the aggressive pricing from Microsoft and Citrix may encourage you to give them a shot.
Whenever you add complexity to your network, you should make sure you have good reasons for doing so. Virtualizing your entire infrastructure may have its benefits, but it must fit your overall IT plan; otherwise, it becomes a pain point and a distraction. Virtualization can be part of a bigger plan to move your IT infrastructure to a data center. However, if you have multiple systems that do not work in a virtual environment — such as a fax system that relies on a fax board, an Equitrac server with a hardware dongle, or a security system with its own wiring scheme — you may find that virtualization adds complexity and cost to your network without achieving the goal of server consolidation.
Hewlett-Packard and IBM have traditionally provided excellent server equipment and support. Your server hardware is the most critical part of the virtual environment: you need top-quality equipment, the fastest processors, redundant fans, redundant power supplies and strong support contracts. If your data lives on a separate storage subsystem, the virtual-server hosts themselves can get away with just a mirrored pair of drives rather than extensive RAID 5 arrays. Plan to include pricing for a second virtual server for redundancy and load balancing. That will lower the total return on investment, but it protects you against a single point of failure.
Your virtual environment will grow to fill whatever processor and memory capacity you make available. Processing power is almost always underutilized; it is rare to see any server use more than 25 percent of its processor, except at boot-up. Memory is different: most server installations need at least 2 gigabytes of dedicated memory. So plan ahead when buying your virtual-server hosts, and get considerably more memory than you currently need. If one host fails, spare memory on a second host lets you run the failed machine’s virtual servers while it is being repaired.
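As a rough sizing sketch of that failover advice (the server count and per-server figure below are hypothetical, built on the 2-gigabyte guideline above; adjust them to your own environment):

```shell
#!/bin/sh
# Hypothetical sizing sketch: six virtual servers at 2 GB each,
# split across two hosts. For failover, size EACH host so it can
# carry the entire fleet alone, not just its normal half.
VMS=6
GB_PER_VM=2
FLEET_GB=$((VMS * GB_PER_VM))   # memory the whole fleet needs: 12 GB
PER_HOST_GB=$FLEET_GB           # each host sized for the full load
echo "Buy at least ${PER_HOST_GB} GB of RAM per host"
```

In practice you would also add headroom for the hypervisor itself and for future growth, which is exactly why buying more memory than you currently need pays off.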
No matter which storage manufacturer you decide to go with (EMC, HP, IBM or NetApp), the basic idea is to select a system that offers extensive support and high quality and that integrates well with a virtual environment. Check its compatibility with your tape drive if you intend to hang one directly off the storage system. Make sure you have enough disk space to allocate for snapshots, new server installations and hot-spare drives that can take over if a drive fails.
Which of the following best characterizes your company’s status with server virtualization?
37% We have already deployed.
28% We are currently evaluating virtualization.
25% We have no plans to deploy.
9% We are in the process of deploying.
1% Don’t know.
Source: CDW Poll of 576 BizTech readers
I am aware that Windows Server 2008 is out and that people are installing it with fine results. However, it seldom pays to be on the bleeding edge. Windows Server 2003 is rock-solid, and it works. Although it is true that a VMware system can run a variety of guest operating systems, from Linux to Novell NetWare in addition to many flavors of Windows, the vast majority of your installations will likely be Windows Server 2003.
You may need special tools to copy the data you want to migrate to a virtual machine, depending on how much there is. The Xcopy command in Windows can copy extensive amounts of data over a period of days; run Xcopy /? to see its options, including switches that copy only changed files. For instance, you can use Xcopy to move the bulk of the data over a few days and then refresh the changed files immediately before the cutover. In some cases you may need a more advanced solution. Double-Take is replication software that lets you copy live data from your production system to a virtual system without downtime. Simply make your copy, and when you’ve finished, turn Double-Take off, change the IP address and the computer name, and you’re essentially good to go.
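A seed-then-refresh Xcopy migration might look like the following sketch. The source path and destination share here are made up for illustration; confirm the switches against Xcopy /? on your own system before relying on them:

```bat
REM Seed copy: /E takes subdirectories (even empty ones), /H includes
REM hidden and system files, /C keeps going past read errors, and /Y
REM suppresses overwrite prompts.
xcopy D:\Data \\NEWVM\D$\Data /E /H /C /Y

REM Delta refresh just before cutover: /D with no date copies only
REM files whose source copy is newer than the destination copy.
xcopy D:\Data \\NEWVM\D$\Data /E /H /C /Y /D
```

The second command is what makes the multi-day approach practical: the seed copy can run while users keep working, and the short /D pass picks up only what changed since.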
It is very easy to get caught up in the hype and try to virtualize everything at once, but not every application reacts well inside a virtual machine. Test applications before deploying them in a virtual environment, and check whether their manufacturers support running them in a virtual machine. I recently found one application manufacturer that said it was fine to put the main SQL server in a virtual environment as long as the indexer remained on a dedicated physical server. Microsoft Exchange is a tricky application that has had mixed success in virtualized environments. For these reasons, it is best to roll out your virtual environment gradually. Give yourself time to settle in and troubleshoot problems before you roll out more virtual sessions.
You may think I’m nuts to consider an additional backup solution outside of hardware-based replication. But your company and reputation hinge on having a reliable backup, no matter what. If your exotic hardware-replication solution fails, where can you turn for major data recovery? LTO tape technology, combined with local operating system agent software, offers an external backup solution for peace of mind and redundancy.