Symantec Promotes Backup Exec 2010, Data Deduplication

Matthew Weinberger

January 25, 2010

Many software companies continue to evangelize data deduplication strategies. The latest example: Symantec says that new versions of the popular Backup Exec and NetBackup, both with a new focus on data deduplication, will arrive February 1. Here’s the scoop.

Backup Exec, Symantec’s network backup solution for the SMB space, has a laundry list of new features, but chief among them is what Symantec calls “deduplication everywhere.” The idea, according to Simon Jelly, senior director of product management for Symantec, is that as businesses’ data needs grow, they are spending more to store increasingly redundant data, and disaster recovery times are suffering because of it.

In response, Jelly says, Backup Exec 2010 deduplicates at every possible layer: the local server, the media server, and any appliances on your network compatible with the Symantec OpenStorage protocol. Because only the data that has changed since the last time Backup Exec ran gets backed up, VARs can offer a solution that stores data with maximum efficiency.
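
Symantec doesn’t spell out the internals of its deduplication engine, but the general technique is straightforward: split incoming data into chunks, fingerprint each chunk with a hash, and store only the chunks you haven’t seen before. Here’s a minimal, hypothetical sketch in Python; the fixed chunk size, the deduplicate/restore helpers, and the in-memory store are illustrative assumptions, not Backup Exec’s actual implementation.

import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity; real products often use variable-size chunking

def deduplicate(stream, store):
    """Keep one copy of each unique chunk; return the list of chunk hashes
    needed to reassemble the original stream later."""
    recipe = []
    for offset in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # new data: store it once
            store[digest] = chunk
        recipe.append(digest)        # duplicate data: just reference the stored copy
    return recipe

def restore(recipe, store):
    """Rebuild the original stream from the chunk store."""
    return b"".join(store[digest] for digest in recipe)

# Two "backups" that share most of their content
store = {}
backup1 = deduplicate(b"A" * 8192 + b"unchanged data", store)
backup2 = deduplicate(b"A" * 8192 + b"changed data!!!", store)
print(len(store))  # only 3 unique chunks stored, not two full copies
assert restore(backup1, store) == b"A" * 8192 + b"unchanged data"

Run across two mostly identical backups, the store holds far less than two full copies, which is the whole pitch behind deduplication everywhere.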

The other new features of Backup Exec 2010 include: compatibility with the latest releases of Windows, Hyper-V, and VMware; enhanced archiving capabilities to better protect the stored data; and granular recovery for Windows applications running inside VMware virtual machines, which means individual e-mails or other files can be recovered from a guest machine using a single backup. And yes, the contents of those virtual machines can be deduplicated.

NetBackup 7, aimed squarely at larger, data-center-level backups, shares several new features with its little sibling Backup Exec 2010, according to Symantec: the same focus on deduplication everywhere and granular virtual machine recovery. It also gets a new unified control and monitoring console all to itself, called “OpsCenter,” and boasts faster disaster recovery from backup.

I’m calling it now: 2010 is going to be the year of data deduplication. By only sending along new data, it minimizes bandwidth costs. By eliminating full system backups, it saves on storage. And by leveraging the appropriate appliances and software, partners have a solution ready to go.

We’re hearing from numerous software companies pitching deduplication strategies and we’ll be covering many of those efforts in the weeks ahead.
