I think the most important features to look for are whether dedupe runs inline or post-process (at rest) and whether the block size is fixed or variable.
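Since the fixed-vs-variable distinction is the big one, here is a minimal Python sketch of why it matters. Everything in it is illustrative: the window, mask, and chunk-size limits are toy values, and it recomputes a short SHA-1 over a sliding window at each byte instead of using a proper rolling hash (e.g., Rabin fingerprints or Buzhash) the way real appliances do. The point it shows is that a one-byte insert shifts every fixed-size block, while content-defined (variable) boundaries re-synchronize and keep deduplicating.

```python
import hashlib
import random

def fixed_chunks(data: bytes, size: int = 4096):
    """Fixed-size blocks: a one-byte insert shifts every later block."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def variable_chunks(data: bytes, mask: int = 0x0FFF, window: int = 16,
                    min_size: int = 1024, max_size: int = 16384):
    """Content-defined chunking: cut where a fingerprint of the trailing
    `window` bytes matches `mask`, so boundaries survive inserts/deletes.
    (Toy values; real systems use an O(1) rolling hash per byte.)"""
    chunks, start = [], 0
    for i in range(len(data)):
        if i - start < min_size:
            continue
        fp = int.from_bytes(hashlib.sha1(data[i - window:i]).digest()[:4], "big")
        if (fp & mask) == 0 or i - start >= max_size:
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])
    return chunks

def dedupe_ratio(chunks):
    """Logical bytes divided by unique physical bytes, keyed by chunk hash."""
    unique = {hashlib.sha256(c).digest(): len(c) for c in chunks}
    total = sum(len(c) for c in chunks)
    return total / max(1, sum(unique.values()))

# Two "backups" of the same data, the second with one byte inserted up front:
random.seed(0)
gen0 = bytes(random.getrandbits(8) for _ in range(128 * 1024))
gen1 = b"X" + gen0  # the insert shifts all downstream content by one byte

for name, chunker in [("fixed", fixed_chunks), ("variable", variable_chunks)]:
    ratio = dedupe_ratio(chunker(gen0) + chunker(gen1))
    print(f"{name:8s} dedupe ratio: {ratio:.2f}x")
```

Running it, the fixed-block ratio stays near 1.0x (almost nothing matches after the shift) while the variable-block ratio approaches 2.0x (only the first chunk of the second generation differs), which is the practical reason variable/content-defined chunking usually wins on real backup streams.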
You should look at these features and parameters:
1. Read speed
2. Write speed
3. Throughput
4. Data protection/integrity
5. Dedupe topology: at target, at source, or both; for the last two, check the impact on the source host
6. The need for agent/plugin installation; check the firewall requirements (ports to open, etc.)
7. Space reclamation (garbage collection, filesystem cleaning, etc.); check whether the system can finish GC before the next run (see the sketch after this list)
8. Ability to scale the system's performance and capacity
9. Data transfer protocols (TCP/IP, FC, iSCSI, etc.)
10. Application/backup software interoperability, both for source-based dedupe and for additional services like virtual synthetic full backups
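For item 7, a quick back-of-the-envelope check is enough to catch an undersized box. This sketch is hypothetical: the backup window, cycle length, pool size, and GC throughput below are made-up placeholders, so substitute measurements from your own environment and the cleaning rate your vendor quotes.

```python
def gc_fits(backup_window_h: float, cycle_h: float,
            physical_tb: float, gc_tb_per_h: float) -> bool:
    """True if a full GC/cleaning pass fits in the idle time each cycle leaves."""
    idle_h = cycle_h - backup_window_h        # hours with no ingest load
    gc_hours = physical_tb / gc_tb_per_h      # time to scan and clean the pool
    return gc_hours <= idle_h

# Hypothetical example: nightly backups take 10 h of a 24 h cycle,
# 60 TB physical pool, and GC cleans ~5 TB/h on this class of hardware.
print(gc_fits(backup_window_h=10, cycle_h=24,
              physical_tb=60, gc_tb_per_h=5))  # True: 12 h of GC vs 14 h idle
```

If that returns False at your projected pool size, GC will start overlapping ingest and both will slow down, which is exactly the failure mode item 7 is warning about.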
Recovery performance, data availability, and accessibility.
The performance penalty on data read operations.
Impact on performance: is deduplication going to disrupt the workflow?
Data availability, accessibility, and performance.
I'm planning to renew my Data Domain solution and I want to compare it with other products.
I use it as secondary storage for my backups and also for file sharing (SMB/CIFS).
What alternative products would you recommend?