
Data Deduplication Comparative Review

September 15th, 2010 09:10 admin

snydeq writes “InfoWorld’s Keith Schultz provides an in-depth comparative review of four data deduplication appliances to vet how well the technology stacks up against the rising glut of information in today’s datacenters. ‘Data deduplication is the process of analyzing blocks or segments of data on a storage medium and finding duplicate patterns. By removing the duplicate patterns and replacing them with much smaller placeholders, overall storage needs can be greatly reduced. This becomes very important when IT has to plan for backup and disaster recovery needs or when simply determining online storage requirements for the coming year,’ Schultz writes. ‘If admins can increase storage usage 20, 40, or 60 percent by removing duplicate data, that allows current storage investments to go that much further.’ Under review are dedupe boxes from FalconStor, NetApp, and SpectraLogic.”
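The block-and-placeholder scheme Schultz describes can be illustrated with a minimal sketch. This is not any vendor's implementation — just fixed-size block deduplication using SHA-256 fingerprints as the "placeholders," with the block size chosen arbitrarily for the example:

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size; real appliances vary

def dedupe(data: bytes):
    """Split data into fixed-size blocks, keep one copy of each unique
    block, and represent the stream as a list of block fingerprints."""
    store = {}   # fingerprint -> unique block bytes
    refs = []    # placeholders: one fingerprint per original block
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # store block only if unseen
        refs.append(digest)
    return store, refs

def rehydrate(store, refs):
    """Reassemble the original stream from placeholders."""
    return b"".join(store[h] for h in refs)

if __name__ == "__main__":
    # Five blocks, only two distinct patterns -> 60% storage reduction,
    # the kind of gain the article cites at the high end.
    data = b"A" * BLOCK_SIZE * 3 + b"B" * BLOCK_SIZE * 2
    store, refs = dedupe(data)
    stored = sum(len(b) for b in store.values())
    print(f"blocks: {len(refs)}, unique: {len(store)}, "
          f"saved: {1 - stored / len(data):.0%}")
    assert rehydrate(store, refs) == data
```

Real appliances add refinements this sketch omits — variable-length chunking, inline vs. post-process operation, and collision handling — but the core bookkeeping is the same: store each unique block once and reference it by fingerprint.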

Source: Data Deduplication Comparative Review

Related Articles:

  1. Open Source Deduplication For Linux With Opendedup
  2. Ask Slashdot: Free/Open Deduplication Software?
  3. A Primer on Data Backup for Small- to Medium-Sized Companies (Video)
  4. Ask Slashdot: Simple Way To Backup 24TB of Data Onto USB HDDs?
  5. How Do You Backup 20TB of Data?