As organizations generate and store ever-increasing amounts of data, the cost and complexity of moving that data around continue to grow. And data is constantly on the move: from production systems to test systems, from one test system to another, and even from user to user. All of that data serves multiple purposes, such as supporting transactional systems, mirroring production, reporting, analytics, testing, and training. And data is being moved more frequently than ever before to support business requirements and changing technology needs such as agile development and DevOps.

How can your company support this myriad of data movement requirements without overworking your DBAs? How can you satisfy the legitimate need to move, copy, and refresh data without incurring significant costs, not to mention the long runtimes required to unload and load large tables using Db2 utilities? And all of this movement has to happen while production systems remain online, because nothing gets put on hold just so you can copy data or move it around.
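To make the runtime concern concrete, the traditional approach is to run Db2's UNLOAD and LOAD utilities against each table being copied. A minimal sketch of the utility control statements, with all database, table space, and table names hypothetical, might look like this:

```
-- Unload the source table to the SYSREC data set
-- (MYDB.MYTS and MYSCHEMA.ORDERS are illustrative names)
UNLOAD TABLESPACE MYDB.MYTS
  FROM TABLE MYSCHEMA.ORDERS

-- Load the unloaded rows into the target table,
-- replacing its current contents
LOAD DATA INDDN SYSREC LOG NO REPLACE
  INTO TABLE MYSCHEMA.ORDERS
```

For a large table, this pair of jobs can run for hours, and LOAD with REPLACE restricts access to the target object while it runs, which is exactly the cost and availability problem described above.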