Continuity's RecoveryGuard Reveals DR Flaws

Today's large IT environments are dynamic places; applications, volumes, and file systems are added, deleted, and reallocated on SANs on a daily basis. The disaster recovery plan, on the other hand, is updated and tested on an annual basis. As a result, most organizations think their data is better protected than it really is.

Howard Marks, Network Computing Blogger

January 30, 2008

1 Min Read

If the Oracle DBAs say they need another 1TB LUN (logical disk), a storage administrator may skip step 19 in the official provisioning process, which says to add the new LUN, and its doppelganger at the DR site, to the replication regime. When each management group looks in its management console, everything looks hunky-dory.

Continuity Software's RecoveryGuard discovers the Oracle database, which LUNs it's using, and how they're being replicated, and raises a red flag when the database isn't as protected as we thought it was. RecoveryGuard can investigate Oracle, Sybase, and MS SQL Server databases, along with EMC SRDF and Network Appliance SnapMirror replication.
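The kind of cross-check RecoveryGuard automates can be illustrated with a toy sketch (hypothetical data and function names; the actual product discovers this information through the storage and database management interfaces): compare the set of LUNs a database actually sits on against the set covered by the replication regime, and flag any difference.

```python
# Toy illustration only -- not RecoveryGuard's actual logic.
# Flag LUNs a database depends on that are missing from the
# replication configuration.

def find_unreplicated_luns(db_luns, replicated_luns):
    """Return the database's LUNs that are not being replicated."""
    return sorted(set(db_luns) - set(replicated_luns))

# Hypothetical inventory: the DBAs' new 1TB LUN was provisioned,
# but step 19 was skipped, so it never joined the replication group.
oracle_luns = ["LUN-0017", "LUN-0018", "LUN-0042"]
replicated = ["LUN-0017", "LUN-0018"]

gaps = find_unreplicated_luns(oracle_luns, replicated)
print(gaps)  # -> ['LUN-0042']
```

Each management console shows its own half of the picture as healthy; the gap only appears when the two inventories are joined like this.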

Continuity Software offers a free 48-hour evaluation, and a spokesman told me they have found at least one gap in the data protection of every company that has taken advantage of the offer. Since almost 60% of the respondents in our recent survey indicated that they failed to recover at least one of their important applications in their last DR test, I can believe it.

Right now, RecoveryGuard is for the big boys with EMC and NetApp infrastructures who don't mind paying $2,000 per protected server per year to avoid multimillion-dollar data losses. For them, it could be a bargain.


About the Author

Howard Marks

Network Computing Blogger

Howard Marks is founder and chief scientist at Deepstorage LLC, a storage consultancy and independent test lab based in Santa Fe, N.M. and concentrating on storage and data center networking. In more than 25 years of consulting, Marks has designed and implemented storage systems, networks, management systems and Internet strategies at organizations including American Express, J.P. Morgan, Borden Foods, U.S. Tobacco, BBDO Worldwide, Foxwoods Resort Casino and the State University of New York at Purchase. The testing at DeepStorage Labs is informed by that real world experience.

He has been a frequent contributor to Network Computing and information since 1999 and a speaker at industry conferences including Comnet, PC Expo, Interop and Microsoft's TechEd since 1990. He is the author of Networking Windows and co-author of Windows NT Unleashed (Sams).

He is co-host, with Ray Lucchesi of the monthly Greybeards on Storage podcast where the voices of experience discuss the latest issues in the storage world with industry leaders.  You can find the podcast at: http://www.deepstorage.net/NEW/GBoS
