Friday, October 23, 2009
Wednesday, October 21, 2009
What about NAS Backups?
At one time, I felt that NAS had a rather distinct disadvantage. While it did significantly reduce system administration requirements, it created a challenge in one particular area - backup and recovery. Since most NAS filers run a stripped-down (or significantly customized) version of the operating system, normal backup and recovery client software often isn't applicable. With a few exceptions, you can't simply buy client software from your backup vendor for your filer. Although this has gotten better, there was a time when the only way to back up your NAS appliance was to use rdump or to back it up via an NFS mount.
Even the advent of the network data management protocol (NDMP) didn't seem to help things at first. It usually meant locally attaching a tape drive to a filer and backing up that server's data to that tape drive. This often meant a significant reduction in automation. It didn't help that software vendors were slow to support NDMP, because they saw it as competition to their own client software.
However, a lot has changed in recent years. All major backup-software vendors support NDMP, and you can even use SAN technology to share a tape library between your filers and your other backup servers. Even if you're backing up your filers across the network, gigabit NICs that offload the TCP/IP processing from the host CPU make data transfer over the network much easier and faster. Jumbo frames have also helped some vendors.
Another reason that backup and recovery of filers is now less of a problem is that some NAS vendors have introduced data-protection options equivalent to (and sometimes easier to use than) the options available on many UNIX or NT systems - including built-in snapshots, mirroring and replication. Therefore, for what it's worth, my respect for NAS has grown significantly in recent years.
Tuesday, October 20, 2009
Friday, October 16, 2009
Preparing for the worst
One of the simplest rules of systems administration is that disks and systems fail. If you haven't already lost a system or at least a disk drive, consider yourself extremely lucky. You also might consider the statistical possibility that your time is coming really soon. Backup & recovery should be part of any disaster recovery plan.
My Dad Was Right
My father used to tell me, "There are two types of motorcycle owners. Those who have fallen, and those who will fall." The same rule applies to system administrators. There are those who have lost a disk drive and those who will lose a disk drive. (I'm sure my dad was just trying to keep me from buying a motorcycle, but the logic still applies. That's not bad for a guy who got his first computer last year, don't you think?)
Whenever I speak about my favorite subject at conferences, I always ask questions like, "Who has ever lost a disk drive?" or "Who has lost an entire system?". Once, when I asked those questions, someone raised his hand and said, "My computer room just got struck by lightning." That sure made for an interesting discussion! If you haven't lost a system, look around you... one of your friends has.
Speaking of old adages, the one that says "It'll never happen to me" applies here as well. Ask anyone who's been mugged if they thought it would happen to them. Ask anyone who's been in a car accident if they ever thought it would happen to them. Ask the guy whose computer room was struck by lightning if he thought it would ever happen to him. The answer is always 'No.'
The whole reason for writing this post is to help you recover from some level of disaster. Whether it's a user who has accidentally or maliciously damaged something or a tornado that has taken out your entire server room, the only way you are going to recover is by having a good, complete disaster recovery plan that is based on a solid backup and recovery system.
Neither can exist completely without the other. If you have a great backup system but aren't storing your media off-site, you'll be sorry when that tornado hits. You may have the most well organized, well protected set of backup volumes, but they won't be of any help if your backup and recovery system hasn't properly stored the data on those volumes. Getting good backups may be an early step in your disaster recovery plan, but the rest of that plan - organizing and protecting those backups against a disaster - should follow soon after. Although the task may seem daunting, it's not impossible.
Wednesday, October 14, 2009
Backup cartoon #6
Tuesday, October 13, 2009
The One That Got Away: True story of not taking backups
"You mean to tell me that we have absolutely no backups of paris whatsoever?" I will never forget those words. I had been in charge of backups for only two months, and I just knew my career was over. We had moved an Oracle application from one server to another about six weeks earlier, and there was one crucial part of the move that I missed. I knew very little about database backups in those days, and I didn't realize that I needed to shut down an Oracle database before backing it up. This was accomplished on the old server by a cron job that I never knew existed. I discovered all of this after a disk on the new server went south.
"Just give us the last full backup," they said. I started looking through my logs. That's when I started seeing the errors. "No problem," I thought, "I'll just use an older backup." The older logs didn't look any better. Frantic, I looked at log after log until I came to one that looked as if it were OK. It was just over six weeks old. When I went to grab that volume, I realized that we had a six-week rotation cycle, and we had over-written that volume two days before.
That was it! At that moment, I knew that I'd be looking for another job. This was our purchasing database, and this data loss would amount to approximately two months of lost purchase orders for a multibillion-dollar company.
So I told my boss the news. That's when I heard, "You mean to tell me that we have absolutely no backups of paris whatsoever?" Isn't it amazing that I haven't forgotten its name? I don't remember any other system names from that place, but I remember this one. I felt so small that I could have fit inside a 4mm tape box. Fortunately, a system administrator worked what, at the time, I could only describe as magic. The dead disk was resurrected, and the data was recovered straight from the disk itself. We lost only a few days' worth of data. Our department had to send a memo to the entire company saying that any purchase order entered in the last two days had to be reentered. I should have framed a copy of that memo to remind me what can happen if you don't take this job seriously enough. I didn't need to though; its image is permanently etched in my brain.
Monday, October 12, 2009
Friday, October 9, 2009
Hard Disks - cheap way to store backups
Disks have become a very attractive backup target these days. Here is a quick summary of some of the reasons why this is so:
- Cost: The biggest reason that disk has become such an attractive backup target is that the cost of disk has dropped dramatically in the last few years. A reasonably priced disk array now costs approximately the same as a similarly sized tape library filled with media. When you consider some of the things you can do with disk, such as eliminating full backups and redundant files, disk becomes even less expensive.
- Reliability: Unlike tapes, disks are closed systems that aren't susceptible to outside contaminants. In addition, the actual media of a hard drive is, well, hard when compared to a piece of tape media. The result is that an individual disk drive is inherently more reliable than a tape drive. Disk drives become even more reliable when you put them in a RAID array.
- Flexibility: Generally speaking, tape drives can only go two speeds: stop & very fast. Yes, some tape drives support variable speeds. However, they can usually only slow down to about 40% of the rated speed of the drive. Disk drives, on the other hand, work at whatever speed you need them to go. If you need to go a few hundred megabytes per second, put a few drives in a RAID group, and blast away. Then if you need that same RAID group to write at 10KB/s, go ahead. Unlike tape drives, disk drives have no problem writing slowly, then quickly, then slowly, then.... You get the picture. This makes disk a perfect match for unpredictable backup streams. Once all that random data has been written in a serial fashion on your disk device, the disks can easily stream backup data to tape - if that's what you want to do. Some people are foregoing that step altogether and replacing it with replication. Try doing that with a tape drive.
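Disk's ability to absorb variable-rate streams also makes techniques like hard-link snapshots practical, which is one way of "eliminating full backups" mentioned above. A minimal sketch using rsync's --link-dest option (directory names here are invented): unchanged files in the new snapshot are hard links to the previous one, so each day's "full" costs only the space of what changed.

```shell
#!/bin/sh
# Sketch only: "data" and "backups" are hypothetical local directories.
mkdir -p data backups
echo "report" > data/report.txt

# Day 1: a real full copy.
rsync -a data/ backups/day1/

# Day 2: unchanged files are hard-linked against day1, so this second
# "full" consumes almost no extra disk space.
rsync -a --link-dest="$PWD/backups/day1" data/ backups/day2/
```

Each dated directory then looks like a complete full backup, yet deleting an old snapshot frees only the blocks no newer snapshot still links to.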
Thursday, October 8, 2009
Wednesday, October 7, 2009
cpio archive for UNIX
cpio is a binary file archiver & file format, primarily for UNIX-based systems, whose archives end with the .cpio file extension. Available under the GNU license, the cpio utility was originally meant to be a tape archiver as part of PWB/UNIX, and was also part of UNIX System III & UNIX System V; an archive is a stream of files & directories. Though alternatives such as tar are more popular & considered better solutions than cpio itself, its usage in the RPM Package Manager, the Linux kernel 2.6 series' initramfs, Oracle's distribution of its software in the cpio format and Apple's 'pax' installer archives continues to make cpio an important archive format.
The header of a cpio archive contains information such as file names, time stamps, owners & permissions; the format was designed to store backups onto a tape device in a contiguous manner. Like tar archives, cpio archives are often compressed with gzip and distributed as .cpgz or .cpio.gz files. The cpio utility supports the binary, old & new ASCII, crc, HPUX binary, HPUX old ASCII, old tar & POSIX.1 tar archive formats.
The cpio utility was standardized in POSIX.1-1988 & dropped from later revisions, starting with POSIX.1-2001, due to its 8 GB file-size limit. The POSIX-standardized pax utility can be used to read and write cpio archives instead. The latest release of cpio is version 2.10, released on 20 June 2009 with minor bugfixes.
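A minimal session with GNU cpio, assuming it is installed, might look like this (file names are made up). Unlike tar, cpio reads the list of files to archive from standard input, which is why it is usually paired with find:

```shell
# Build a small tree to archive.
mkdir -p demo/sub
echo "hello" > demo/sub/file.txt

# -o: copy-out (create); -H newc selects the portable SVR4 "new ASCII"
# format, the same one the Linux initramfs uses.
find demo -depth -print | cpio -o -H newc > demo.cpio

# -t lists the archive; -i extracts; -d creates leading directories.
cpio -t < demo.cpio
mkdir restore && cd restore
cpio -id < ../demo.cpio
```

The find | cpio pipeline is also what makes criteria-based backups easy: any find predicate (-newer, -size, -name) controls exactly which files land in the archive.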
Tuesday, October 6, 2009
Saturday, October 3, 2009
Implementing Backup and Recovery by David B. Little & David A. Chapa
Implementing Backup and Recovery: The Readiness Guide for the Enterprise is a book by David B. Little & David A. Chapa that arms you with all the information you need to architect a backup and recovery system. System backup is essential in any enterprise - protecting data is equivalent to protecting the company or agency. Whether you have the task of putting together a backup and recovery system for your organization or you are thinking about how backup & recovery fits within the scope of total data availability, this book is an invaluable resource.
Implementing Backup and Recovery takes you through the necessary steps of deploying services by showing you how to address the architecture, limitations and capabilities of the existing network infrastructure. After an introduction to backup and recovery in the enterprise, Little and Chapa give a tutorial on the components of backup. Then, using VERITAS NetBackup as an example, they show you how to install and configure a backup application and how to customize services to meet customer needs. Throughout the book, the authors use real-life client situations to explain specific concepts. In addition, you'll also learn:
- The business & legal requirements of backup systems
- VERITAS NetBackup's tiered architecture and configuration elements
- How to determine your need for additional backup services
- What the future holds for backup and recovery
Friday, October 2, 2009
Discus backup
Thursday, October 1, 2009
Cobian Backup 8 backup software for Windows
Cobian Backup 8 (Blackmoon) is one of the few open source backup programs for Windows that comes with a GUI. Written in Delphi by Luis Cobian of Umeå University, Cobian Backup 8 is 100% free, donation-supported backup software for the Microsoft Windows platform.
The latest version of Cobian Backup is 9.5.1.212, which supports Unicode, FTP, compression (Zip, SQX, 7z), encryption (including Blowfish, Rijndael, DES and RSA-Rijndael), and incremental and differential backup. It also supports long file names (up to 32,000 characters) for all backup types except ZIP (which supports only 256 characters). The software may be installed as an application or as a service running in the background. Multilingual support is implemented via user-submitted language files.
Blackmoon is a multi-threaded program that can be used to schedule & back up your files and directories from their original location to other directories/drives on the same computer or any other computer on the network. FTP backup is also supported in both directions (download and upload). Blackmoon comes in two flavors - application and service. The program uses very few resources and can run in the background on your system, checking your backup schedule and executing your backups when necessary.
Cobian Backup 8 is not a usual backup application: it only copies your files and folders, in original or compressed form, to another destination, creating a security copy as a result. So Cobian Backup can be better described as a "scheduler for security copies". Cobian Backup supports several methods of compression and strong encryption.
The source files of Cobian Backup 8 can be downloaded from SourceForge, while Cobian Backup 9 (Amanita) is not going to be open source software.
Wednesday, September 30, 2009
Backup cartoon #2
Tuesday, September 29, 2009
Bonkey (The Backup Monkey)
Bonkey (The Backup Monkey) is open source backup software available for the Windows & Mac platforms which can back up only those files that match certain criteria to most storage devices. Bonkey has specifically been designed to be used in conjunction with Amazon S3, which provides cheap online storage. You can download the code and change it to your custom needs if the necessity arises.
Some of the features of the open source backup software are:
- backup to multiple locations, including Amazon S3, SFTP, FTP, Windows shares & even local disks
- backup MS-SQL Server databases
- backup automatically at the scheduled time
- email any errors during backup to your email address
- backup only files that have changed, or a snapshot of the files you select
- compress & encrypt files during backup
- transfer files between different locations using drag & drop
- show built-in help
Bonkey also provides:
- there's a file transfer mode for easier drag & drop transfers
- you can use synchronisation to remove out of date files from backup targets
- it has a built-in restore wizard to restore to a folder or original file locations
Friday, September 25, 2009
Backup cartoon #1
Thursday, September 24, 2009
Bacula - Open Source Backup
Bacula is an open source network backup solution that is written in C++ and is relatively easy to use while offering many advanced storage management features that make it easy to find and recover lost or damaged files. Bacula is enterprise-ready backup software that permits you (or the system administrator) to manage backup, recovery and verification of computer data across a network of computers of different kinds.
The latest stable release of Bacula is 3.0.2 which supports Solaris, FreeBSD, NetBSD, MacOSX, OpenBSD, HP-UX, Tru64, IRIX, Linux, UNIX and Windows backup clients, and a range of professional backup devices including tape libraries. Administrators and operators can configure the system via a command line console, GUI or web interface; its back-end is a catalog of information stored by MySQL, PostgreSQL or SQLite.
Bacula supports technologies & networks such as:
- CRAM-MD5
- Cyclic redundancy check
- GZIP
- Large file support
- Logical volume management
- MD5/SHA
- POSIX ACL
- Public key infrastructure
- TCP/IP
- Transport layer security
- Unicode/UTF-8
- Volume snapshot service
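Several entries in the list above (cyclic redundancy check, MD5/SHA) serve the same purpose: record a digest of each file at backup time and compare it at verify time, so corruption or tampering is caught before you need a restore. A rough stand-alone sketch of that idea using the common sha256sum tool, not Bacula's actual verify job (file names are invented):

```shell
# "Backup time": record one digest per file in a manifest.
mkdir -p staging
printf 'payroll data\n' > staging/payroll.dat
sha256sum staging/payroll.dat > manifest.sha256

# "Verify time": any bit rot or tampering fails the check.
sha256sum -c manifest.sha256 && echo "verify OK"
```

Bacula stores equivalent signatures in its catalog database, which is what allows its verify jobs to detect silently changed files.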
Wednesday, September 23, 2009
Backup software
Backup software is a computer program (which generally sits on the server) used to perform a complete backup of a file, data, database, system, server or clients. Backup software enables a user to make an exact duplicate of everything contained on the original source, most of the time with compression & encryption. It is also used to perform a recovery of the data, files or system in the event of a disaster. There are several features of backup software that make it more effective in backing up data:
- Volumes
- Data compression
- Remote backup
- Access to open files
- Incremental backups
- Schedules
- Encryption
- Transaction mechanism
Tuesday, September 22, 2009
BackupPC open source backup system
BackupPC is an open-source, high-performance, enterprise-grade free backup program that can be used to back up data from Linux, UNIX, Solaris, Windows XX and Mac OSX PCs & laptops to a server's disk. BackupPC is highly configurable and easy to install & maintain, as it comes with a web-based frontend. The latest release, BackupPC 3.2.0 beta 0, was released on April 5, 2009, and is published under the GNU General Public License.
The most important thing with BackupPC is that no client is necessary to run it, as the server is itself a client for several protocols that are handled by other services native to the client OS. For instance, BackupPC incorporates a Server Message Block (SMB) client that can be used to back up network shares of computers running Windows. Paradoxically, under such a setup the BackupPC server can be located behind a NAT'd firewall while the Windows machine operates over a public IP address. While this may not be advisable for SMB traffic, it is more useful for web servers running SSH with GNU tar and rsync available, as it allows the BackupPC server to be located in a subnet separate from the web server's DMZ.
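The clientless rsync-over-SSH approach can be sketched as a single pull command run on the backup server (the host name and paths below are hypothetical and this is not BackupPC's actual invocation):

```shell
# On the backup server, pull the remote tree over SSH:
#   rsync -az -e ssh webhost:/var/www/ /srv/backups/webhost/www/

# The same invocation works between local directories, which is a
# handy way to test the flags before pointing it at a real host.
mkdir -p www backups
echo "index" > www/index.html
rsync -az www/ backups/www/
```

Because rsync transfers only the deltas of changed files, repeat runs over SSH are cheap even for large trees, which is what makes the pull-from-the-server model practical.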
Monday, September 21, 2009
Backup & Recovery: Inexpensive Backup Solutions for Open Systems by W. Curtis Preston
Backup & Recovery: Inexpensive Backup Solutions for Open Systems by W. Curtis Preston is an answer to expensive & proprietary data backup solutions, as it features data backup with open systems. Within IT industry circles the book is regarded as helping you take a champagne backup on a beer budget, and it helps you decide why to back up, what to back up, when to back up, how to back up, and how to store, monitor, test & retrieve your backups.
The book features free & open source backup systems such as Amanda, BackupPC, Bacula & open source near-CDP, and backing up of databases & files. Not only this; if you are really looking for a commercial backup utility, the book helps you shortlist some and find the best one for your business, with tips on what to look for in them.
Thursday, September 17, 2009
Selection, extraction & backup of metadata from a PC
As we all know, not all of a personal computer's information is stored in data files - some information is stored in the CMOS, for example. So, the complete recovery of a system from scratch requires the selection, extraction & backup of these as well:
- System description
A system description, or specifications, is needed to procure an exact replacement after a disaster.
- Boot sector
The boot sector can sometimes be recreated more easily than saving it. Still, it usually isn't a normal file and the system won't boot without it.
- Partition layout
The layout of the original disk, as well as partition labels, tables & filesystem settings, is needed to properly recreate the original system.
- File metadata
Each file's permissions, owner, group, ACLs and any other metadata need to be backed up for a restore to properly recreate the original environment.
- System metadata
Different operating systems have different ways of storing configuration information. Windows keeps a registry of system information that is more difficult to restore than a typical file, so a backup of the complete registry should also be kept so that it can be restored in case of a disaster.
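On a GNU/Linux system, a couple of these items can be captured with standard tools. The sketch below uses a hypothetical directory and an example device name: sfdisk dumps the partition table, and GNU find records per-file mode, ownership and mtime:

```shell
# Partition layout (requires root; /dev/sda is only an example):
#   sfdisk -d /dev/sda > partition-table.txt

# File metadata: mode, owner, group and mtime for every file, so a
# restore can recreate permissions even if the backup tool loses them.
mkdir -p project && echo "data" > project/a.txt
find project -printf '%m %u %g %T@ %p\n' > file-metadata.txt
cat file-metadata.txt
```

The sfdisk dump can later be fed back with `sfdisk /dev/sda < partition-table.txt` to recreate the same layout on a replacement disk.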
Wednesday, September 16, 2009
Areca Backup
Areca Backup is an open source backup solution designed for personal use on Linux & Windows machines. Developed in the Java programming language, Areca Backup is available under the General Public License (GPL) v2 and allows its users to select a set of files & folders to be backed up. You can specify the backup location and configure post-backup actions (like sending backup reports by email or launching custom shell scripts), and backups can be taken with ZIP/ZIP64 compression or AES 128 & AES 256 encryption, readable by WinZip or other archivers.
Areca supports full, incremental & differential backups and uses the file's size & last modification time to detect modified files. If one of these attributes is modified (whatever its value is), the file is flagged as modified. This allows a fast detection of modified files.
The most important features of Areca Backup:
- Simple to set up
The configuration of Areca Backup is stored in an XML file which can be edited with Areca's graphical user interface.
- Versatile
Areca Backup can use advanced backup modes (like delta backup) or simply produce a basic copy of your source files as a standard directory or zip archive.
- Interaction
Track different versions of a specific file, browse your archives, recover or view specific files, merge a set of archives... and much more.
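The size-and-mtime change detection Areca uses can be approximated in shell with GNU find: snapshot both attributes once, then compare snapshots on the next run (a sketch with invented file names, not Areca's actual mechanism):

```shell
mkdir -p src && echo "one" > src/a.txt

# Snapshot: size and mtime for every file.
find src -type f -printf '%s %T@ %p\n' | sort > before.txt

# ...time passes and a file changes...
echo "longer content" > src/a.txt
find src -type f -printf '%s %T@ %p\n' | sort > after.txt

# Any file whose size or mtime differs shows up in the diff.
diff before.txt after.txt || echo "modified files detected"
```

Comparing two attributes rather than hashing file contents is what makes this kind of detection fast: no file data is read unless the file is actually flagged for backup.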
Tuesday, September 15, 2009
Back up terminologies #4
- Remote store: Backing up data to an offsite permanent backup facility, either directly from the live data source or from an intermediate near store device.
- Restore time: The amount of time required to bring a desired data set back from the backup media.
- Retention time: The amount of time a given set of data remains available for restore. Some backup products rely on daily copies of data and measure retention in days; others retain a number of copies of data changes regardless of the amount of time.
- Site-to-site backup: Backup, over the internet, to an offsite location under the user's control. Similar to remote backup, except that the owner of the data maintains control of the storage location.
- Synthetic backup: A term used by NetBackup for a restorable backup image that is synthesized on the backup server from a previous full backup and all the incremental backups since then. It is equivalent to the full backup that would have been taken at the time of the last incremental backup.
- Tape library: A storage device containing tape drives, slots to hold tape cartridges, a barcode reader to identify cartridges, and an automated method for physically moving tapes within the device. These devices can store immense amounts of data.
- True image restore: A term used by NetBackup and Backup Exec for the collection of file deletion and file movement records so that an accurate restore can be performed. For instance, consider a system with a directory containing 5 documents on Friday. On Saturday, the system gets a full backup that includes those 5 documents. On Monday, the owner deletes 2 of them and updates 1 of the 3 remaining; the updated document is backed up as part of Monday night's incremental backup. On Tuesday afternoon the system crashes. A normal restore of Saturday's full backup plus Monday's incremental would bring back the 2 documents that were intentionally deleted. True image restore tracks the deletions with each incremental backup and prevents the deleted files from being inappropriately restored.
- Trusted paper key: A machine-readable print of a cryptographic key.
- Virtual Tape Library (VTL): A storage device that appears to backup software as a tape library but actually stores data by some other means. A VTL can be configured as a temporary staging area before data is sent to real tapes, or it can be the final storage location itself.
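The synthetic backup idea can be sketched in a few lines of Python. This is a hypothetical illustration of the merge logic, not NetBackup's implementation; each backup image is modeled as a simple dict from path to content:

```python
def synthesize_full(full, incrementals):
    """Merge a full backup image with a chronological list of
    incremental images into one restorable synthetic full image.
    Each image maps file path -> content; None marks a deletion
    recorded by the incremental."""
    image = dict(full)
    for inc in incrementals:
        for path, content in inc.items():
            if content is None:
                image.pop(path, None)   # file was deleted
            else:
                image[path] = content   # file added or updated
    return image
```

The result is what a full backup taken at the time of the last incremental would have contained, without ever touching the client again.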
Monday, September 14, 2009
Amanda
AMANDA, the Advanced Maryland Automatic Network Disk Archiver, is an open source archiving tool and data backup system that lets administrators set up a single master backup server to back up data from multiple hosts over the network to backup media. Its client-server model includes at least:
- the backup server and client itself
- a tape server
- an index server
AMANDA started as a university project at the University of Maryland, was later released under a BSD-style license, and is now available both as an open source community edition and a fully supported enterprise edition. It runs on almost any Unix or Linux system and can be combined with a native Win32 client that supports open files.
Amanda supports both tape- and disk-based backups and provides functionality not found in many other backup products, such as tape spanning: if a backup set does not fit on one tape, it is split across multiple tapes. Among its key features is an intelligent scheduler which optimizes the use of computing resources across backup runs.
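Tape spanning boils down to packing files onto media of fixed capacity. Here is a simple greedy sketch of the idea, an illustration rather than Amanda's actual algorithm:

```python
def span_media(files, capacity):
    """Split a backup set across multiple media: greedily fill the
    current medium and start a new one when the next file no longer
    fits. `files` is a list of (name, size) pairs."""
    media, current, used = [], [], 0
    for name, size in files:
        if current and used + size > capacity:
            media.append(current)      # seal the full medium
            current, used = [], 0      # start a fresh one
        current.append(name)
        used += size
    if current:
        media.append(current)
    return media
```

A file larger than a single medium would itself need fragmenting, which real tape-spanning implementations handle and this sketch does not.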
Friday, September 11, 2009
Backup terminology #3
- Flash Backup: A term for the raw partition backup used by NetBackup Advanced Client. In NBAC, support is limited to the VxFS (Veritas), ufs (Solaris), Online JFS (HP-UX) and NTFS (Windows) filesystem types. Similar to the UNIX utility dump.
- Full backup: A backup of all (selected) files on the system. In contrast to a drive image, this does not include the file allocation tables, partition structure and boot sectors.
- Hot backup: A backup of a database that is still running, so changes may be made to the data while it is being backed up. Some database engines keep a record of all entries changed, including the complete new value, which can be used to resolve changes made during the backup.
- Incremental backup: A backup containing only the files that have changed since the most recent backup (either full or incremental). The advantage is quicker backup times, as only changed files need to be saved; the disadvantage is longer recovery times, as the latest full backup and all incremental backups up to the point of data loss must be restored.
- Media spanning: Sometimes a backup job is larger than a single destination storage medium. In this case, the job must be broken into fragments distributed across multiple storage media.
- Multiplexing: The practice of combining multiple backup data streams into a single stream written to a single storage device; for example, backing up 4 PCs to a single tape drive at once.
- Multistreaming: The practice of creating multiple backup data streams from a single system to multiple storage devices; for example, backing up a single database to 4 tape drives at once.
- Normal backup: The full backup used by Windows Server 2003.
- Near store: Provisionally backing up data to a local staging backup device, possibly for later archival backup to a remote store device.
- Open file backup: The ability to back up a file while it is in use by another application. See file locking.
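The incremental versus differential distinction comes down to which reference point you compare modification times against: the most recent backup of any kind, or the last full backup. A rough sketch:

```python
def files_to_back_up(mtimes, last_full, last_backup, mode):
    """Select files for the next backup run.
    `mtimes` maps path -> last-modification time (any comparable unit).
    Incremental: changed since the most recent backup of any kind.
    Differential: changed since the last full backup."""
    reference = last_backup if mode == "incremental" else last_full
    return sorted(p for p, m in mtimes.items() if m > reference)
```

Because the differential reference never advances between full backups, each differential run grows, while incremental runs stay small but must all be replayed at restore time.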
Thursday, September 10, 2009
Findings of a global backup survey
A global backup survey about the backup habits, risk factors, worries and data loss of home PC users, organized by Kabooza, produced the following findings. The survey covered 4,257 respondents from around 129 countries.
- 82% of home PC users don’t do regular backup.
- 66% have lost pictures & files on their home PC. 42% within 2008.
- 71% are most worried about losing their digital pictures on their home PC.
- 54% of the respondents do not have any backup whatsoever for their PC.
- 54% of the respondents see viruses as the primary risk factor for their personal data.
Wednesday, September 9, 2009
Proprietary backup software
Below is a list of proprietary backup software available today:
- .Mac Backup
- ARCserve Backup from CA Inc
- Acronis True Image
- Aggregate Backup And Recovery System
- Altexa online Backup Altexa by 77Backup
- Atempo TIMEnavigator
- Backup4all
- BackupAssist - Cortex IT Labs
- Backup Dwarf - by KRKsoft
- BakBone NetVault
- Carbonite
- CDP Server - Near Continuous Backup software for Windows & Linux
- CommVault Systems Simpana
- CrashPlan (automatic backup) Code42
- Disco
- Dmailer - Dmailer Backup
- EMC Networker
- EMC Corporation Retrospect
- Geek Squad Yearly Subscription 128-bit encryption
- Genie Backup Manager
- Get Backup - For Mac OS X
- GRBackPro Professional backup software - For Windows Server
- Handy Backup
- HP Data Protector
- Hyperbac
- i-drive
- Infinit
- IBM Tivoli Storage Manager (TSM)
- IBM Tivoli Storage Manager FastBack (TSM FastBack)
- IBM Aggregate Backup And Recovery System
- InMage DR-Scout
- Image for Windows - Drive imaging
- Keepit
- Langmeier Backup
- Macrium Reflect
- Microsoft Data Protection Manager
- Nero BackItUp
- NTI Backup Now
- Paragon Drive Backup
- PowerFolder
- Roxio Toast
- SonicWALL SonicWALL CDP
- SOS Backup Integrated Online & Local Backup
- Steek
- Symantec Backup Exec
- Symantec NetBackup
- Symantec Norton 360
- Symantec Norton Ghost
- Syncsort Backup Express
- Time Machine - Included with Mac OS X v10.5 "Leopard"
- UltraBac Software - UltraBac and UBDR Gold (UltraBac Disaster Recovery)
- Unitrends - Data Protection Unit
- Uranium Backup - Windows backup software with Tape support
- Ventis BackupSuite 2008 - Windows backup software.
- Windows Home Server
- Windows Live OneCare
- Windows Recovery Environment - Microsoft's tool that is part of Windows Vista
- Yosemite Server Backup - Barracuda Networks Inc
Tuesday, September 8, 2009
Notable incidents of not taking backups
There are a few instances where panic ensued because timely backups had not been taken. Three such incidents are noted below:
- In 1996, during a fire at the headquarters of Credit Lyonnais, a major bank in Paris, system administrators ran into the burning building to rescue backup tapes because they didn't have offsite copies. Crucial bank archives and computer data were lost.
- Privacy Rights Clearinghouse has documented 16 instances of stolen or lost backup tapes (among major organizations) in 2005 and 2006. Affected organizations included Bank of America, Ameritrade, Citigroup, and Time Warner.
- On 3 January 2008, an email server crashed at TeliaSonera, a major Nordic telecom company and internet service provider. It was subsequently discovered that the last serviceable backup set was from 15 December 2007. Three hundred thousand customer email accounts were affected.
Monday, September 7, 2009
Backup terminology #2
- Copy backup: Backs up the selected files but does not mark them as backed up (i.e., does not reset the archive bit). Found in the backup tool shipped with Windows 2003.
- Cumulative incremental backup: The term NetBackup uses for a differential backup.
- Daily backup: The incremental backup used by Windows Server 2003.
- Data salvage: The process of recovering data from storage devices when normal operational methods are impossible, typically performed by specialists in controlled environments with special tools. For example, a crashed hard disk may still hold data even though it no longer works properly; a data salvage specialist might recover much of it by opening the drive in a clean room and working on its internal parts.
- Differential backup: A cumulative backup of all changes made since the last full backup. The advantage is quicker recovery time, requiring only the full backup and the latest differential backup to restore the system. The disadvantage is that for each day elapsed since the last full backup, more data needs to be backed up, especially if a majority of the data has changed.
- Differential incremental backup: The term NetBackup uses for an incremental backup.
- Disaster recovery: The process of recovering after a business disaster and restoring or recreating data. One of the main purposes of creating backups is to facilitate a successful disaster recovery; for maximum effectiveness, the process should be planned in advance and audited.
- Disk image: A method of backing up a whole disk or filesystem in a single image. Since the underlying data structures are what is actually backed up, this method does not allow file-level control over what is selected for backup or restore.
Friday, September 4, 2009
Why choose BackupDataOffsite.com?
The systems installed at BackupDataOffsite.com are automatic, secure, and provide offsite protection for your invaluable data. Their easy-to-use, reliable online data backup solution helps you prevent costly data loss and downtime. You can get started with the free 30-day trial, which takes less than 15 minutes to set up.
The advantages of offsite data backups:
- Cost effective, secure and automated offsite backups
- Unlimited backup storage capacity at affordable, scalable rates
- Back up multiple computers to one account
- Schedule automated backups to run daily or even hourly
- Top-level security, performance and monitoring
- Easy to use software
- Install a small agent program on your computer or server
- Select files to back up, and a backup schedule
- Your files are encrypted and compressed, then sent to our data center
- Subsequent backups send only incremental changes, reducing bandwidth & backup times
- Receive email notification after every backup, or only when warnings or errors occur
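The "subsequent backups send only incremental changes" step from the list above can be sketched with standard-library tools. This is a hypothetical illustration, not the vendor's actual client: files whose content digest matches the previous backup are skipped, and the rest are compressed before upload.

```python
import hashlib
import zlib

def prepare_upload(files, previous_digests):
    """Select and compress only the changed files for an incremental
    online backup. `files` maps path -> bytes content;
    `previous_digests` maps path -> SHA-256 hex digest from the
    last backup run."""
    payload, digests = {}, {}
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        digests[path] = digest
        if previous_digests.get(path) != digest:
            payload[path] = zlib.compress(data)  # only changed files travel
    return payload, digests
```

Returning the new digest table lets the next run reuse it as its `previous_digests`, so unchanged files never consume bandwidth twice.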
Tuesday, September 1, 2009
Open source backup software
Paid software isn't your only option for backing up your files. You can rely on the following open source backup software:
- AMANDA
- Areca Backup
- BackupPC
- Bacula
- Cobian Backup 8
- cpio
- DirSync Pro
- DAR
- dump
- duplicity
- FlyBack
- Mondo
- rsync
- tar
- TimeVault
- Venti
- Zmanda Recovery Manager
Friday, August 28, 2009
A few pieces of advice on taking backups
We understand that taking timely backups is necessary for your organization. Here is some advice on taking backups:
- The more important the data that is stored on the computer the greater the need is for backing up this data.
- A backup is only as useful as its associated restore strategy.
- Storing the copy near the original is unwise, since many disasters such as fire, flood and electrical surges are likely to cause damage to the backup at the same time.
- Automated backup and scheduling should be considered, as manual backups can be affected by human error.
- Backups will fail for a wide variety of reasons. A verification or monitoring strategy is an important part of a successful backup plan.
- It is good to store backed up archives in open/standard formats. This helps with recovery in the future when the software used to make the backup is obsolete. It also allows different software to be used.
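The point about verification deserves emphasis: a backup that is never checked may silently fail. One simple way to implement it, sketched here under the assumption of a plain mirrored directory tree, is to compare a SHA-256 digest of every source file against its copy:

```python
import hashlib
import os

def verify_backup(source_dir, backup_dir):
    """Compare every file under source_dir against its copy under
    backup_dir; return the relative paths that differ or are missing."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    failures = []
    for root, _dirs, names in os.walk(source_dir):
        for name in names:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            dst = os.path.join(backup_dir, rel)
            if not os.path.exists(dst) or digest(src) != digest(dst):
                failures.append(rel)
    return failures
```

Run on a schedule and alerted on a non-empty result, even a check this simple catches truncated copies and silent corruption that a "backup completed" status line would never reveal.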
Tuesday, August 18, 2009
Backup terminology #1
- Backup policy: The rules and procedures of an organization providing backup services, ensuring that adequate amounts and types of backups are made, including timed testing of the backup processes for restoring the original production system from the backup copies.
- Backup rotation scheme: A method for effectively backing up live data in which multiple media are systematically moved from storage to use in the backup process and then back to storage. Several different schemes exist; each takes a different approach to balancing the need for a long retention period against frequently backing up changes. Some schemes are more complicated than others, and the choice depends on the type of data and the infrastructure available.
- Backup site: The place where business can continue after a data loss event. Such a site may have ready access to the backups or possibly even a continuously updated mirror. The backup site is usually away from the business premises.
- Backup software: The application used to perform the backing up of data, i.e., the systematic generation of backup copies.
- Backup window: The period of time during which a system is available to perform a backup procedure. Backup procedures can have detrimental effects on system and network performance, sometimes requiring the primary use of the system to be suspended. These effects can be mitigated by arranging a backup window with the users or owners of the system(s).
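A rotation scheme can be as simple as a function from run number to media slot. Here is a sketch of a simplified grandfather-father-son rotation (an illustration with made-up slot names, not any vendor's standard): six daily "son" tapes are reused each week, every seventh run goes to one of three weekly "father" tapes, and every twenty-eighth to a monthly "grandfather" tape.

```python
def gfs_slot(day_index):
    """Pick which media slot a given daily backup run overwrites
    in a simplified grandfather-father-son rotation (day_index
    counts runs from 0)."""
    if day_index % 28 == 27:
        return "monthly"                              # grandfather
    if day_index % 7 == 6:
        return "weekly-%d" % ((day_index // 7) % 3 + 1)  # father
    return "daily-%d" % (day_index % 7 + 1)              # son
```

Daily tapes give short retention at fine granularity, while the weekly and monthly tapes stretch retention out with progressively coarser restore points.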
Tuesday, August 11, 2009
Offsite data backup websites
Many websites on the internet provide a mechanism to back up your data offsite at multiple locations worldwide and keep it safe and secure. Some of the websites providing such a service are:
- Backup Data Offsite
- Offsite Backup Solutions
- Offsite Backups
- Crash Plan
- Backup Platinum
- Offsite Backup
- Ahsay
- Amic Tools
- HP
- Mozy
- XS Backup
- Keep It
- Drive HQ
- Diino
Monday, August 10, 2009
Why offsite backup?
Companies understand that backing up their crucial data is a necessity, as they simply cannot risk losing data in any manner. Even today, many companies rely on local backups of their data for protection, but the truth is that local backups are just not enough. A local backup won't protect your data from fire, flood, theft or any other natural calamity.
Portable media can be part of an offsite backup strategy, but it has a limitation: we have to rely on people to transport the data to a safe location. Portable storage is generally less reliable, easily stolen, often unencrypted, and most often goes untested.
Given the above, the option that clearly stands out is offsite backup of the data. Generally, people use the internet along with a document management system to save their data at multiple locations, so that it remains safe, secure and encrypted at various places across the globe.
Saturday, August 1, 2009
Methods of taking backup
There are many methods for backing up your data. Most data is saved in encrypted or compressed form. Some of the media on which data is stored are:
- CDs (obsolete now)
- Compact Flash
- DVDs
- Flash Memory
- Floppy drives (obsolete now)
- Hard Disks
- Magnetic Tapes
- Memory Sticks
- Pen Drives
- RAID
- Secure Digital Cards
- SmartMedia
- Thumb Drives
Why is backup important?
In any office the most important thing is INFORMATION. All our data, files, printouts, etc. contain information that can be used to make decisions in any given condition or scenario. Large organisations such as banks, insurance companies and mutual funds interact with a large number of people and store intricate information about them, such as financial details that can be crucial to their lives. Another example is large portals, which keep people's personal information stored in their databases.
Think of a scenario in which, due to a natural calamity or fire, the data suddenly gets deleted, burnt, destroyed or stolen. People would blame the banks, insurance companies, mutual funds and portals for negligence and irresponsible behaviour.
To escape such circumstances, large organisations don't store data in just one place. They store crucial information not just at multiple locations but in multiple cities or countries, which keeps the data safe and secure from such calamities and disasters.
The (crucial) data must always be stored safely & securely at multiple locations.