HP StorageWorks D2D4000 review

Data deduplication is the hottest ticket in town, and with its new low-bandwidth replication (LBR) technology HP is hoping to make off-site backup cost-effective for SMBs over low-cost WAN links. We find out if it works.

The D2D4000 carries out deduplication on the appliance itself, enabling it to handle any backup software and data format. Like many solutions, it breaks the backup data stream into 4KB blocks, or chunks, computes a hash for each one and stores these in an index on the appliance.
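
As a rough illustration of this chunk-and-hash approach, here is a minimal Python sketch. It is not HP's implementation: the 4KB chunk size comes from the description above, but the SHA-1 digest, the in-memory dictionary index and the function name dedupe_stream are assumptions made for the example.

```python
import hashlib
import io

CHUNK_SIZE = 4 * 1024  # 4KB chunks, as in the D2D4000's scheme

def dedupe_stream(stream, index):
    """Split a backup stream into fixed 4KB chunks, hash each one and
    store only chunks whose hash is not already in the index.
    Returns the number of new (unique) chunks stored."""
    new_chunks = 0
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        digest = hashlib.sha1(chunk).hexdigest()  # hash algorithm is assumed
        if digest not in index:
            index[digest] = chunk  # a real appliance would store a disk pointer
            new_chunks += 1
    return new_chunks

# Three identical chunks are stored only once
index = {}
print(dedupe_stream(io.BytesIO(b"A" * CHUNK_SIZE * 3), index))  # -> 1
```

Because a repeated full backup mostly produces chunks whose hashes are already in the index, only the genuinely new chunks need to be stored, and it is this much smaller set that makes low-bandwidth replication over a slow WAN link feasible.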

To test deduplication and LBR we used a pair of D2D4000 appliances, with one designated as the source and the second acting as the remote target. We placed a Network Nightmare WAN simulator between them, configured for a 2Mbps link speed.

[Image: When creating VTLs you can choose from a range of LTO tape devices and library emulations.]

At our local site we used a Dell PowerEdge 2900 quad-core Xeon domain controller running Windows Server 2003 and Exchange Server 2003. Symantec Backup Exec 12.5 managed local backup operations to the source appliance; we installed it on a Dell PowerEdge 1950 quad-core Xeon system running Windows Server 2003, which also provided file services and hosted SQL Server 2005.

Installation is a swift affair: a wizard run on each host system installs the LTO tape drivers, configures the iSCSI initiator and discovers the D2D4000 appliance. You don't even need to create a VTL manually, as the appliance can do this automatically when it detects an initiator logging on to it.
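
The auto-creation step can be pictured with a small, purely hypothetical sketch: on the first login from an unknown iSCSI initiator, a default VTL is provisioned for it. The event handler, data structure and VTL defaults below are our own illustration, not HP's firmware.

```python
# Purely hypothetical sketch of auto-VTL provisioning (not HP code)
known_initiators: dict[str, dict] = {}

def create_default_vtl(iqn: str) -> dict:
    # Emulation type and slot count are assumed defaults for the example
    return {"owner": iqn, "emulation": "LTO tape library", "slots": 24}

def on_initiator_login(iqn: str) -> None:
    """Called when an iSCSI initiator logs on to the appliance."""
    if iqn not in known_initiators:
        known_initiators[iqn] = create_default_vtl(iqn)
```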

[Image: Replication setup only needs to be carried out from the source appliance and can be completed in as few as four steps.]

Replication configuration is a four-step process carried out only at the source appliance: you choose a source VTL, enter the address of the remote appliance and select a target VTL for replication. Blackout windows control when replication is allowed to run, and you can decide how much of the WAN link it may use.
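
The scheduling policy can be summarised in a few lines of Python. This is an illustrative sketch only: the window times, the 50 per cent link share and the function names are assumptions rather than anything the appliance exposes.

```python
from datetime import datetime, time

LINK_BPS = 2_000_000                  # the 2Mbps WAN link in our test rig
WAN_SHARE = 0.5                       # assumed fraction of the link replication may use
BLACKOUT = (time(8, 0), time(18, 0))  # example window: no replication 08:00-18:00

def replication_allowed(now: datetime) -> bool:
    """Replication may only run outside the blackout window."""
    start, end = BLACKOUT
    return not (start <= now.time() < end)

def replication_rate_bps() -> float:
    """Cap replication traffic at the configured share of the WAN link."""
    return LINK_BPS * WAN_SHARE
```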

We used our standard test suite, which looks at deduplication performance for file server, SQL Server and Exchange services, with a 50GB data set for each application. Our simulated one-month backup period used a standard cycle of weekly full backups plus daily incrementals or differentials, as dictated by best practice for each application.
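
For clarity, the cycle we simulated can be generated programmatically, as in the Python sketch below. The Monday full backup and the 28-day (four-week) month are illustrative choices, since the actual rotation depends on each application's best-practice schedule.

```python
from datetime import date, timedelta

def backup_schedule(start: date, days: int = 28):
    """Yield (date, backup_type) pairs for a weekly-full cycle:
    a full backup once a week, with incrementals or differentials
    (depending on the application) on the remaining days."""
    for offset in range(days):
        day = start + timedelta(days=offset)
        kind = "full" if day.weekday() == 0 else "incremental/differential"
        yield day, kind

# First week of a simulated month starting Monday 2 March 2009
for day, kind in list(backup_schedule(date(2009, 3, 2)))[:7]:
    print(day.isoformat(), kind)
```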
