Retrieved from the ISED-L list-serv, 2/09, CC 3.0 attribution, share-alike, non-commercial (BY-NC-SA) license

  • We use Retrospect and I'm pretty happy with it. Features definitely lag behind the competition, but when using it for disk-based backup I haven't found a better, more intuitive approach with software that I can afford. We use it to back up Windows Servers (2003 and 2008), Exchange 2007, SQL 2005, and a few Linux servers (CentOS). I've used MS DPM before with various clients and it works very well for backing up Microsoft servers. There are some caveats, though. It can only back up using an agent, and agents are only available for Windows, so this doesn't meet your Linux or ESX requirements. I haven't run across an SMB backup package that can do the traditional stuff (Windows, Linux, Exchange, SQL, etc.) and VMware Consolidated Backup, and do them both well. There are enterprise options that do all of those items (Asigra, NetBackup, Commvault) but they're all very expensive. If I were you I'd take a mixed approach: use a dedicated app to do host-level backups (vRanger, esXpress, Veeam) and then use a more traditional app to do guest-level backups.
  • I've become a big fan of Amanda - it's incredibly reliable, and if you're comfortable with the command line, it's great. If not, you can get Zmanda, which has a nice web interface. Amanda is free and open source, and Zmanda is very reasonably priced.
  • In my previous network jobs, the standard seemed to be Backup Exec, and I agree it was always problematic. When I landed here, the school used Retrospect (http://www.retrospect.com/), which is now owned by EMC. I have found it very reliable and easy to use. We've been on it over 10 years now, and the price is fairly reasonable. There's a bit of a learning curve with the interface, but once you get it down it runs by itself.
  • I too was never jazzed about BE. I was a Retrospect user for years, but with all the turmoil there as they worked toward feature parity between Windows and Mac servers, plus their problematic handling of Linux clients, I pulled it (circa 2004ish). I am sure it is better now. I did like the functionality of their backup server and client, specifically for floating laptops: we had many people with laptops, and when they popped on the network the client would register as present with Retrospect and ask to start a backup for the user. Nice stuff, and nothing I have seen since has been that good. I eventually pulled Retrospect and went with Arkeia (http://www.arkeia.com/), which was extremely solid for servers and my Oracle databases at the time. I am now running Symantec Backup Exec and it is pretty good. It is just the 800-pound gorilla now, and workable. Arcserve is another option, but I would steer clear of anything CA (insert personal history). We are backing up Linux, Windows, and OS X servers with BE 12 pretty reliably. BE 9 and 10 were garbage, but it has actually gotten better.



It is important to work along several parallel tracks to get things under control:

  • User education: Encouraging users to review their files and e-mail regularly and delete unnecessary items.
  • Document retention policy: Having the institution as a whole define how long to keep electronic copies of files.
  • Quotas and file type analysis: Set quotas and monitor for large or prohibited types of files (see the sketch after this list)
    • Symantec's StorageExec
    • Windows Storage Server 2003 R2 has good quota management
  • Improve size and quality of primary storage
    • I would wholly recommend a disk system for primary backup. The backup window is much smaller, and the system can then stream the backup to tape as well; that slower operation can take place during normal hours since it isn't accessing the user data on the server.
    • Buy more and faster disks
  • Improve size and quality of the backup system: Move to a disk-based backup system supplemented by off-site backup for critical files.
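For the quota and file-type monitoring item above, a minimal sketch of a scan script is below. The share path, size threshold, and extension list are illustrative assumptions, not values from the discussion:

    # Minimal sketch: flag oversized or prohibited files on a user share.
    # SHARE, MAX_BYTES, and PROHIBITED are illustrative assumptions.
    import os

    SHARE = r"\\fileserver\users"          # hypothetical UNC path to the share
    MAX_BYTES = 500 * 1024 * 1024          # flag anything over 500 MB
    PROHIBITED = {".mp3", ".avi", ".iso"}  # example "prohibited" extensions

    def scan(root):
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    size = os.path.getsize(path)
                except OSError:
                    continue  # unreadable or vanished file; skip it
                ext = os.path.splitext(name)[1].lower()
                if size > MAX_BYTES or ext in PROHIBITED:
                    yield path, size

    if __name__ == "__main__":
        for path, size in scan(SHARE):
            print("%12d  %s" % (size, path))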

The last two categories are expensive. The time it takes to back up and restore is an additional problem. A full backup of our data is about 3/4 of a terabyte, and that takes a long time to run (and to restore in the event of an emergency).
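As a rough sanity check on that backup window, here is back-of-the-envelope arithmetic for 3/4 of a terabyte at two assumed throughputs; the speeds are illustrative round numbers, not measurements from this setup:

    # Rough backup-window arithmetic for ~750 GB; throughputs are assumptions.
    data_mb = 750 * 1024
    for label, mb_per_s in [("tape at ~25 MB/s", 25),
                            ("disk-to-disk at ~80 MB/s", 80)]:
        print("%s: %.1f hours" % (label, data_mb / mb_per_s / 3600.0))
    # tape at ~25 MB/s: 8.5 hours
    # disk-to-disk at ~80 MB/s: 2.7 hours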


Discussion

This topic was discussed on the Wizards list-serv on 1/22/07, and on AIMS from 1/29 to 2/5/07

  • I've had HP and then EXA changers, and they were just more maintenance than I thought multi-thousand-dollar appliances should be; and like most people, we had some failures too. What I ended up doing was building a NAS and putting one somewhere secure on campus other than the main server closet of each location. So I do disk-to-disk of ALL of my data (i.e., a PowerEdge 2900 with 4x 500GB SATA RAID-5 drives and a hot spare); with 2 power supplies and 2 gigabit NICs, uptime should be good. With Veritas CPS and shadow volume copies I keep 10-20 versions of files for staff, plus full backups for my sanity. We then use Amerivault for additional true off-site copies of our most critical data (fundraising, grades, finance, DBs, IT info, etc.). Some may have the budget to send all their stuff off-site; go for it, but there's a way to get the best of both worlds.
  • We currently use traditional backup software (BackupExec) backing up to a drive array located in a separate building from the one in which our server room is located. My BackupExec server is also a domain controller and global catalog server, so I don't need to worry about rebuilding the domain itself if a fire guts the server room. All data is backed up to the drive array (differentials 4 days a week, full 1 day). We also just started using AmeriVault for our "critical" data. We've defined that as our Blackbaud and grading databases and our e-mail system. AmeriVault has been good to work with, and we've done successful test restores of the data.
  • One area I am currently looking at is the use of Amazon.com's S3 storage web service. A new application has come out that makes using this resource much easier. The name is Jungledisk (www.jungledisk.com). It allows you to copy files to the S3 storage service via Windows, Linux or Mac and Amazon's pricing is incredibly cheap. They charge $0.20 per GB transferred and $0.15 per GB stored per month. I would only use this for additional safety of critical data, say Blackbaud database backups, library catalog backups, etc.

But I've calculated that backing up this data to S3 twice a month would cost us less than $20 a year. I'm sure more backup software will be able to use this storage resource in the near future, but even with something like Jungledisk I can script it fairly quickly.
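For anyone scripting such an upload today, a minimal sketch using the modern boto3 library is below. The bucket name and dump path are hypothetical, and this is not how Jungledisk itself works:

    # Minimal sketch: push a critical database dump to Amazon S3 with boto3.
    # BUCKET and DUMP are hypothetical; credentials come from the usual
    # boto3 configuration (environment variables, ~/.aws/credentials, etc.).
    import datetime
    import boto3

    BUCKET = "school-offsite-backups"
    DUMP = "/backups/blackbaud.bak"

    def upload():
        s3 = boto3.client("s3")
        # Date-stamped key so twice-monthly uploads never overwrite each other.
        key = "blackbaud/%s.bak" % datetime.date.today().isoformat()
        s3.upload_file(DUMP, BUCKET, key)

    if __name__ == "__main__":
        upload()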


ISED 3/07: Several schools have established reciprocal relationships to back up their data off-site at the other school...

  • I use Second Copy to back up mission-critical data to our off-site public web server. It's a simple FTP process, but it's automatically scheduled with various "cron-like" jobs depending on the applications being backed up. It happens at night when our T1 is lightly used (we have a setting on our SonicWall that prevents 'Net access after 11pm) and usually doesn't take more than an hour or so each night. It's amazing how simple the whole concept is given how effective it's been. Second Copy runs locally on the file server as a service. I've actually been considering contacting other schools to see if they might be interested in something like this to help me defray my costs.
  • Last summer, we sent a Linux-based Dell server to St. John's in Texas, and we back up our mission-critical databases there nightly using sftp and a few other open source apps (a minimal sketch of this kind of nightly push follows this list). I have documentation if you are interested.
  • We are in the process of reciprocating with Collegiate in NY using their documentation. Will be refining what they created. Will be sending them our server before the end of school and will begin backing up our data to it in addition to our normal backup routine. If you have any questions, you can contact me directly.
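A minimal sketch of this kind of nightly sftp push, using the paramiko library, is below. The host, user, key path, and file list are all hypothetical stand-ins, not details from either school's setup:

    # Minimal sketch: nightly sftp push of database dumps to a partner school.
    # HOST, USER, KEY, and DUMPS are illustrative assumptions.
    import os
    import paramiko

    HOST = "backup.partner-school.org"
    USER = "offsite"
    KEY = "/root/.ssh/offsite_rsa"   # key-based auth; no password in the script
    DUMPS = ["/backups/grades.sql.gz", "/backups/finance.sql.gz"]

    def push():
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(HOST, username=USER, key_filename=KEY)
        sftp = client.open_sftp()
        try:
            for dump in DUMPS:
                sftp.put(dump, "incoming/" + os.path.basename(dump))
        finally:
            sftp.close()
            client.close()

    if __name__ == "__main__":
        push()  # run from cron so the transfer happens off-hours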

  • A few options to consider:
    • Service Provider: I used a vendor called Amerivault. All mission-critical data was backed up nightly over the internet with a one-week retention. I was very happy with their service, but they are not the only game in town and pricing is competitive (iBackup [has Mac options], eVault, etc.). I was the only IT resource, so I could not depend on tapes being swapped in my absence; this setup ran without incident for 3 years and saved me from two major hardware failures and a flood. I also never had to worry about testing tapes or about physical loss or theft. Finally, the software's ability to capture delta changes within Exchange and other live systems meant that the amount of data I sent down the wire each night was very small (just the changes). Pros: software is usually "free" or inexpensive with solid features; single-file restore was fast and easy, with a variety of options immediately available (download) or through the mail (CD, SAN, etc.); specialized backup functions are included (e.g. Exchange mailbox-level restore, Oracle, etc.); 24/7 support and contractual guarantees of data availability and safety; add-on disaster recovery services are better/faster with an existing relationship (e.g. they overnight you a server with your data already on it, or pull up a fully equipped trailer; make sure insurance will pay the exorbitant cost). Cons: cost and pricing are usually per GB, and exceeding the agreed-upon amounts can be costly (a la going over your cell phone minutes); data is compressed, so it's good to trial the service and see how big your data will be before committing to a contract.
    • Systems Providers: I looked into this with various companies and found it hard to justify the cost. Typically these are heartbeat systems that keep a hot failover of your key systems in a remote location (another building on campus) or possibly in a data center elsewhere. These systems can provide great failover, but they can also be high in maintenance and cost. They also tend to have problems over WAN links, and the security setup needed for seamless failover can be extremely challenging. Amerivault provides this service, as do many others.
    • DIY (do it yourself): FTP (or similar) files with a nightly script or program to a remote location. Typically cheap, especially if you get another school or business to house the remote server. However (even if the data is encrypted), it can make business managers and board members on both sides nervous to think about a school's most valuable data (and some equipment) sitting in another location unless there are contracts/written agreements involved. Oddly, while my school was mostly OK with this, a few other schools I talked with became concerned about their liability. I abandoned this approach when I found that I would have to back up 10GB of Exchange data every night (because the "cheap" software I had could not do deltas of that data). If you're going to save money on the rack space, spend a little extra on good backup software that does fast encryption, compression, and good reporting. You could put it together using rsync and scripts or the Windows scheduler (a minimal sketch follows this list), but for all the effort and maintenance you are probably better off with something commercial like BRU from Tolis. If cost is a big factor, DIY can be very tempting. A half step is to consider server colocation with a data center (buying rack space for a monthly fee + bandwidth) or space on a SAN (space used + bandwidth). Another option is using VMware to host a hot backup in another building on campus; this could be a great solution if properly implemented, but I never explored it in detail. It is not truly off-site, but it is in a separate building.
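Here is a minimal sketch of the rsync-and-scripts approach mentioned in the DIY item. rsync sends only changed blocks, which addresses the 10GB-per-night delta problem, and it rides over ssh so the transfer is encrypted; the paths and remote target are hypothetical:

    # Minimal sketch: nightly rsync push to a remote host over ssh.
    # SOURCE and DEST are illustrative assumptions; schedule with cron.
    import subprocess

    SOURCE = "/data/exchange-dumps/"
    DEST = "backup@colo.example.org:offsite/"

    def nightly_sync():
        # -a preserves attributes, -z compresses in transit,
        # --delete mirrors removals; -e ssh encrypts the transfer.
        subprocess.check_call(
            ["rsync", "-az", "--delete", "-e", "ssh", SOURCE, DEST])

    if __name__ == "__main__":
        nightly_sync()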

Recommendations

Source: ISED-L list-serv, 5/08, CC 3.0

Knowing the number of servers and the amount of storage you back up nightly would help with a recommendation. For example, we have 24 servers and 4TB of data that we back up nightly. I have three solutions in place:

  • For critical production servers, I use Acronis to create incremental images to a 6TB NAS appliance.

  • Overland REO (iSCSI appliance) that creates nightly snapshots of our Xythos digital lockers
  • Overland NEO (tape library) and BrightStor ARCserve to back up our web servers.

The best solution is using Acronis to create incremental snapshots of the servers. It provides the quickest recovery time, and Acronis has a universal recovery option that allows you to restore to virtual machines or to other servers.


We currently have an 8-tape LTO2 autoloader, a 2TB disk-to-disk array, and we use Amazon S3. We use Backup Exec as our software. Amazon S3 is for off-site online storage; disk-to-disk is for daily incrementals and daily full backups of the DBs and email server; the LTO2 autoloader is for weekly fulls. I rotate the tapes off-site to a bank vault on the other side of town.
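A small sketch of that rotation, assuming the weekly full runs Friday night (the post does not say which day), might look like this:

    # Minimal sketch of the rotation above: daily incrementals to disk,
    # a weekly full to the tape autoloader. The Friday choice is an assumption.
    import datetime

    def job_for(day):
        if day.weekday() == 4:  # Friday: full to tape, then rotate to the vault
            return "full backup to LTO2 autoloader (rotate tapes off-site)"
        return "incremental to the disk-to-disk array"

    if __name__ == "__main__":
        print(job_for(datetime.date.today()))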


What I am working on now is buying a 3 or 4TB NAS unit, like the Buffalo Terastation line (http://www.google.com/search?hl=en&q=buffalo+terastation+pro). This will be located on the network, in an area physically as far away from the servers as possible, and will be used as the main backup location for daily backups. (Weekly?) tape backups will still be run for the critical stuff and will still go off-site. If we did not already have a recent-vintage tape drive, I might consider an autoloader (tape library) type drive to get extra capacity, but that would mean more tapes to carry off-site.

For backup software I am currently using Commvault's Galaxy Express ($1000 from Dell). For one price it covers 4 Windows servers and includes an Exchange backup agent. The downside is that it is buggy in places, has poor documentation, and is not user-friendly. I don't recommend it, but it does work and covers a lot of hardware for the price. Be sure to get a quote for the yearly maintenance before you buy the software.

I've used the old Backup Exec, and the later versions after it was bought and became Veritas were easier to use (though I think more expensive to cover the same hardware).
