BDR Solutions: Overcoming Bandwidth Limitations to the Cloud

March 20, 2012

By Datto Guest Blog

As the evolving Backup and Disaster Recovery (BDR) market shifts towards virtualization technologies and integrated backup solutions, BDR vendors must overcome the limitations of bandwidth in order to efficiently transmit and sync large amounts of data between local devices and off-site storage facilities. The ability to sync on-site data with off-site locations in a timely manner is critical to maintaining a comprehensive business continuity plan.

A 2010 study by the Communication Workers of America calculated the average Internet upload speed across every zip code in the United States at 595 Kbps. Although some businesses may have higher speeds, it is clear that the data pipeline is a huge limiting factor in transmitting large amounts of data quickly and efficiently. At 595 Kbps, transferring 6 GB at 100% capacity, 100% of the time, takes approximately one whole day. Prospective BDR customers frequently need that bandwidth during the day to carry out day-to-day business functions, so in reality only a fraction of it can be devoted to uploading data off-site for remote backup storage, resulting in multi-day transfers.
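For readers who want to check that estimate, here is a quick back-of-the-envelope calculation in Python. It assumes decimal units (1 GB = 8 billion bits) and ignores protocol overhead, so treat it as a rough sketch rather than a precise model:

# Rough transfer-time estimate for the figures above.
# Assumes decimal units (1 GB = 8e9 bits) and ignores protocol overhead.
UPLOAD_KBPS = 595            # average U.S. upload speed (2010 CWA study)
BACKUP_SIZE_GB = 6           # example backup size

bits_to_send = BACKUP_SIZE_GB * 8e9
seconds = bits_to_send / (UPLOAD_KBPS * 1e3)
print(f"~{seconds / 3600:.1f} hours")   # ~22.4 hours, roughly one whole day

At full capacity the transfer takes roughly 22 hours; with only a fraction of the pipe available during business hours, the multi-day transfers described above follow directly.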

Furthermore, as the entire BDR industry pushes deeper into block-level backups and virtualization technologies, reliably providing off-site backups becomes even more challenging as data sizes grow. The lack of available upload bandwidth leaves MSPs with few options for delivering dependable solutions to businesses with slower connections.

Over the last several months, Datto has been listening to MSPs and compiling feedback to develop an alternate approach to managing and transmitting data for off-site storage. Datto found that the off-site backups of certain local servers are often more crucial than others to the successful planning and execution of a business continuity plan. As a result, the ability to prioritize servers for off-site synchronization was developed. Datto also took this one step further by offering granularity of recovery points; in other words, not every recovery point has to sync off-site. Users can now pick and choose how often each recovery point is stored off-site. A third development by Datto engineers was a more advanced compression algorithm that greatly reduces the size of the data files being transmitted while ensuring complete data integrity in the process.
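As a rough illustration of how the first two ideas might fit together, here is a minimal sketch in Python. The names, data structures, and the "offsite_interval" setting are all hypothetical, invented for this example; they do not reflect Datto's actual implementation:

# Hypothetical sketch: prioritize servers for off-site sync and send only
# a subset of recovery points. Illustrative only; not Datto's implementation.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class SyncJob:
    priority: int                          # lower number syncs first
    server: str = field(compare=False)
    recovery_point: str = field(compare=False)

def queue_offsite_sync(servers):
    """Build a priority queue of off-site sync jobs, keeping every
    recovery point locally but pushing only every Nth point off-site."""
    queue = []
    for s in servers:
        for i, point in enumerate(s["recovery_points"]):
            if i % s["offsite_interval"] == 0:    # granularity knob
                heapq.heappush(queue, SyncJob(s["priority"], s["name"], point))
    return queue

servers = [
    {"name": "exchange01",   "priority": 0, "offsite_interval": 1,   # critical
     "recovery_points": ["08:00", "12:00", "16:00"]},
    {"name": "fileserver02", "priority": 1, "offsite_interval": 2,   # less so
     "recovery_points": ["08:00", "12:00", "16:00"]},
]

queue = queue_offsite_sync(servers)
while queue:
    job = heapq.heappop(queue)
    print(f"sync {job.server} @ {job.recovery_point}")

In a setup like this, the critical server's recovery points reach off-site storage first, while the less critical server consumes upload bandwidth for only every second snapshot.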

These technological advances have opened the door for MSPs to provide a solution to customers that, in the past, may not have had enough bandwidth to efficiently maintain regular off-site backups.

Datto CEO Austin McChord recently explained the benefits of the new sync method during a recording of Datto's video blog series, "One Take." One Take episodes are available on Datto's blog, "The Datto Node." Click here to view the One Take episode and download an informational PDF about Speed Sync. Remember to send any questions or comments to [email protected]. We love hearing from the channel and will answer your questions on an upcoming One Take episode!

Ed Tella is documentation manager at Datto. Monthly guest blogs such as this one are part of MSPmentor’s annual platinum sponsorship. Read all Datto guest contributions here.

 
