As software continues to revolutionize businesses around the globe, and privacy legislation becomes increasingly stringent, there’s a critical demand for a structured, centralized approach to managing test data at an enterprise level.
Without comprehensive test data management, businesses can wait weeks for the data they need, sometimes only to discover that it doesn’t exist at all.
“Without the right data, achieving the speed and quality required for today’s DevOps and agile needs of the organization is simply not possible,” said ITWC CIO Jim Love, as he hosted the recent webinar, The Real Secret to Quality Application Development.
Jointly sponsored by CA Technologies and ITWC, the webinar examined the critical importance of test data management. Joining Jim Love for the one-hour session was Huw Price, VP, Continuous Delivery BU, CA Technologies.
Quick and agile wins the race
According to Price, the speed and quality of data affect everything, from how fast a company gets to market to whether it retains customers. As he sees it, the roadblocks to speedy, high-quality data include project delays due to manual testing, difficulty finding the right data, testing inefficiencies that lead to higher costs, defects stemming from ambiguous requirements, and time and resources devoted to test data compliance.
“It should come as no surprise that the top concern is delays related to manual testing,” he told webinar participants. “Anything with the word ‘manual’ in it does tend to bring continuous to its knees.”
Miles to go before we sleep
Despite a recognized need for reliable, compliant, ultra-fast test data, many of the systems supporting it remain mired in the past. In most companies, test data management is painfully slow and prone to errors. As well, it is adversely affected by data that is not available in parallel or on demand. Other inadequacies include a low percentage of coverage, time wasted looking for data, high infrastructure costs, and the lack of negative data and future test scenarios.
Copying production data is also a common concern. “It may sound simple,” said Price, “but it actually becomes quite problematic. Each system must be copied at the same time on the same day for testing to work.”
All the right stuff
Early in the webinar, Price admitted that it would take days to describe a comprehensive TDM strategy, yet he did what he could in the one-hour session, advocating a strategy with the volume and variety to run all required tests. As well, he argued for user-friendly, self-service access, the ability to capture and store data, and the means to protect it from others.
In summary, a complete TDM strategy should have all the data needed to perform every possible test and provide constant access throughout the sprint, in parallel and on demand. Using CA’s Test Data on Demand web portal, for example, data can be requested and received in minutes. A complete strategy should also include a team of highly trained test data management professionals. “This is the essential core of your test data management,” said Price. “It’s the only way to go.”
On-demand and agile
On-demand data generation and the agile needs of the organization are driving a fundamental shift in the expectations of test data management. “If a management team can get involved earlier, they can fill in the gaps,” said Price. “If they do that, it means that when the developers and testers start working, there’s a rich set of data already there for them. People are using the idea of finding, exploring and visualizing data, and filling in gaps before the developers and testers get involved.”
There are many compelling reasons to develop a dynamic strategy for test data management, including upcoming legislation around privacy issues. In the EU, for instance, “opt out” consent is no longer possible and data can only be used to fulfill the reasons for which it was given. With fines of up to 20 million euros for failing to comply, companies need to know exactly where sensitive data resides at all times and across complex environments.
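One common way teams keep sensitive production values out of development and test environments (a general technique, not a method described in the webinar) is deterministic pseudonymization: replacing each sensitive field with a stable, irreversible token so that test data stays realistic in shape while the original values never leave production. A minimal sketch using only the Python standard library, with an illustrative schema and a placeholder key:

```python
import hashlib
import hmac

# Illustrative only: in practice the key would come from a secrets
# manager, never from source code.
SECRET_KEY = b"rotate-me-outside-source-control"

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:12]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of the record with sensitive fields pseudonymized."""
    return {
        key: mask_value(val) if key in sensitive_fields else val
        for key, val in record.items()
    }

# Hypothetical customer record; field names are assumptions.
customer = {"id": 42, "name": "Alice Example",
            "email": "alice@example.com", "plan": "gold"}
masked = mask_record(customer, {"name", "email"})
```

Because the masking is deterministic, the same input always yields the same token, so joins across masked tables still line up, which matters when several systems must be copied together for testing.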
The answer, according to Price, is a hybrid approach that increases the amount of synthetic data being pushed into development and testing. “You want to increase automation anyway,” he said. “A hybrid approach is really the only way. And once you get your test data in order, it becomes a virtuous circle. Rather than having to scramble at the last minute to make best guesses, developers have what they need from the very beginning.”
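To make the synthetic half of that hybrid concrete, here is one illustrative sketch (the schema, names, and value ranges are assumptions, not drawn from the webinar or any CA tool): seeded generation gives reproducible, on-demand data that can also include the negative values production copies typically lack.

```python
import random

# Hypothetical value pools for an illustrative customer schema.
FIRST_NAMES = ["Ana", "Bo", "Chen", "Dee"]
PLANS = ["free", "basic", "gold"]

def generate_customer(rng: random.Random) -> dict:
    """Produce one synthetic customer record, including occasional
    deliberately invalid values to exercise negative-test paths."""
    name = rng.choice(FIRST_NAMES)
    return {
        "name": name,
        "email": f"{name.lower()}{rng.randint(1, 999)}@example.com",
        "plan": rng.choice(PLANS),
        # Roughly 1 in 10 records gets an invalid age on purpose,
        # so validation and error handling get tested too.
        "age": -1 if rng.random() < 0.1 else rng.randint(18, 90),
    }

def generate_dataset(n: int, seed: int = 0) -> list:
    """Seeding the generator makes each dataset reproducible,
    so parallel test runs can request identical data on demand."""
    rng = random.Random(seed)
    return [generate_customer(rng) for _ in range(n)]

customers = generate_dataset(100)
```

Because the data is generated rather than copied, there is nothing sensitive to mask, and a fresh, consistent dataset can be produced in seconds instead of waiting for synchronized production copies.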