Continuous testing at warp speed

Enhance software quality by incorporating realistic test data into testing processes

Category: Storage


Company: IBM

Contents
1 Introduction: Agile testing for a data-driven world
2 Garbage in, garbage out: Data quality matters
3 What's wrong with standard test data generation strategies?
4 Five essentials of proper test data management
5 IBM InfoSphere Optim and Rational: Enhancing quality
6 Resources

Introduction: Agile testing for a data-driven world

Welcome to the new era of computing, where enterprise applications and data dominate the landscape. The power of applications has never been greater than it is today—and their reach continues to expand. We have entered a new realm of computing, where both clients and employees demand 24x7 access to multiple applications through a growing number of devices, from tablets and laptops to smart TVs and smartphones. And more than ever, organizations are under pressure to deliver these applications quickly.

But the power and reach of applications aren't the only things skyrocketing. The volume, variety and velocity of the data that powers mission-critical applications—both traditional transactional applications and emerging analytics platforms built on data warehouses or big data platforms such as Apache Hadoop or IBM® InfoSphere® BigInsights™—are growing at a breakaway pace. In fact, digital data is on track to grow to 44 times its 2009 volume by 2020.¹ Exponential growth in the number of data sources and data types also means data must be managed across a complex architecture rather than from a single data source.

As the complexity of data infrastructures grows, scaling test environments becomes a significant problem (see Figure 1).
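The cited projection implies a steep annual rate. As a quick back-of-the-envelope check (this arithmetic is ours, not from the source), growing 44-fold between 2009 and 2020 spans 11 years:

```python
# Implied compound annual growth rate (CAGR) of the cited projection:
# 44x growth over the 11 years from 2009 to 2020.
growth_factor = 44 ** (1 / 11)          # annual multiplier
cagr_percent = (growth_factor - 1) * 100

print(f"Implied annual growth: {cagr_percent:.0f}%")  # → Implied annual growth: 41%
```

In other words, the projection assumes digital data volume increasing by roughly 41 percent every year.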
It isn't unusual for Fortune 500 companies to spend up to USD30 million building a single test lab—and many of these organizations have dozens of labs. Add in rising wages, and testing costs begin to spiral out of control.

Figure 1: Complex IT landscapes make setting up test labs extremely costly. (The original diagram depicts public and private clouds, messaging services, business partners, shared services, a data warehouse, directory and identity services, a mainframe, an enterprise service bus, file systems and other heterogeneous environments.)

Are your testing practices agile enough to support the high-quality, continuous development your customers and employees expect?

Unfortunately, business users don't have a lot of patience for cumbersome application testing procedures. If they don't get high-quality applications, data marts and reports from the IT department, they'll work around enterprise rules to get them—creating more complexity.

Agile testing practices play a critical role in delivering quality applications throughout a fast-turnaround, high-flexibility environment. The goal is to iterate quickly and deliver the most critical functionality fast, as well as continuously test to deliver high quality. By taking a strategic, disciplined approach to test data management, enterprises can deliver applications faster and help ensure that they work properly with real data.

Garbage in, garbage out: Why poor-quality test data leads to poor-quality applications

In a rush to meet growing client needs and deliver applications faster, quality testing can quickly fall down the priority list. Teams may defer testing until too late in the process and try to make up for lost time by bringing in an army of testers.
They may even de-scope the test entirely and simply choose to make fixes in production if something is discovered. However, inadequate testing or testing in production can lead to disaster if software is released with a bug that causes downtime, allows privacy or security breaches, causes the company public embarrassment, or puts the organization in violation of compliance regulations.

Test databases can be time-consuming and difficult to create and maintain, especially since data requirements are often unclear or unknown—and highly skilled IT personnel should not be focused on this operational task. As a result, testing deadlines often slip and delays are rampant because testers cannot complete their most important function early in the development process.

Poor testing practices are also risky. When companies use actual customer data to test applications, they risk exposing thousands of sensitive records during testing. These data privacy breaches can cost millions to remediate, and they expose the organization to bad press and devastating customer backlash. But bad test data isn't an inescapable evil. To reduce costs, improve time-to-market and reduce risk, innovative organizations are turning to test data management.

What's wrong with standard test data generation strategies?

The sheer effort required to build and maintain test data manually can be daunting. Test data management requires a deep understanding of the underlying data model. Data may be stored in hierarchical or non-relational formats, such as IBM Virtual Storage Access Method (VSAM) files and IBM IMS™ databases. All database management systems have different methods for handling data, which further complicates test data preparation.
The application data model may contain dozens, hundreds or even thousands of tables—and just as many interrelationships. Even a database of less than a dozen tables may contain relationships that make navigating the data model difficult. Unlike DBAs, testers typically do not focus on understanding the data architecture. Instead, they depend on DBA availability to build up the necessary test databases—which can create delays as testers need more test environments.

Testers often need data from multiple related databases, including both relational and non-relational data sources. In addition, each phase of the testing process, from unit testing through system integration and acceptance testing, has unique requirements and varying levels of complexity. Any problems that are discovered must be resolved, and the test data must be refreshed before testing can continue. And after a test is executed, testers need a way to verify the result of any data updates.

Due to these complexities, many organizations simply copy or clone data from production. However, cloning comes with its own problems. Cloning an entire production database slows testing because it generates a larger volume of data than is necessary. In addition, cloned data may not support the specific error and boundary conditions required for effective testing (see table, "The pros and cons of cloning").

Another common workaround is to make up test data—but this approach is even worse than cloning production databases. Making up test data is laborious and error-prone, with no guarantee of accuracy. Testers generally find made-up test data challenging to work with because it does not always reflect the integrity of the original data set or retain the proper context.
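To make the integrity problem concrete, here is a minimal, hypothetical Python sketch (the tables and values are invented for illustration) showing how hand-invented rows can silently break foreign-key relationships that production data guarantees:

```python
# Tiny stand-in schema: five known customers...
customers = [{"customer_id": i, "name": f"Customer {i}"} for i in range(1, 6)]

# ...and ten hand-invented orders whose customer_id values were simply made up
# without consulting the customer table.
made_up_customer_ids = [1, 3, 17, 4, 20, 2, 9, 5, 11, 3]
orders = [
    {"order_id": n, "customer_id": cid}
    for n, cid in enumerate(made_up_customer_ids, start=1)
]

# Basic foreign-key check: every order must reference an existing customer.
valid_ids = {c["customer_id"] for c in customers}
orphans = [o for o in orders if o["customer_id"] not in valid_ids]

print(f"{len(orphans)} of {len(orders)} made-up orders are orphaned")
# → 4 of 10 made-up orders are orphaned
```

Any application logic that joins orders to customers would fail or misbehave on the orphaned rows, which is exactly the "does not reflect the integrity of the original data set" problem described above.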
Both cloning and making up test data are risky and could expose personally identifiable information (PII), human resources (HR), business or other types of sensitive data. Creating in-house data de-identification or data masking routines might address the challenge, but they add extra maintenance points, are difficult to scale and are often inconsistent. For example, "Sally Smith" should be masked to the same value every time—not "Aaaaa Bbbbb" in one place and "John Doe" in another.

The pros and cons of cloning

Positives:
• Simple: Requires little knowledge of technology
• Realistic: Creates an exact copy of production data

Negatives:
• Costly: Significant storage demands
• Imprecise: Not targeted to specific use cases or teams
• Risky: Sensitive data used in testing
• Time-consuming: Must copy all production data
• Hard to use: No way to analyze before/after test
• Not scalable: Does not scale across sources or applications
• Inefficient: Waiting for test data results in downtime

Five essentials of proper test data management

The days of taking a waterfall approach to application development and delivery are gone. Most organizations are embracing agile methodologies—and as a result, acknowledging the need for continuous testing. The shift to a more flexible and dynamic development process requires speedy access to the right test data. Test data management enhances quality testing efforts in three key ways:

• Functional testing: Extract a subset of production data to act as input values for data-driven testing. Providing the appropriate level of test databases means less time spent on operational activities and more time spent on actual testing.
• Performance testing: Stability, load, benchmark and other types of sustained tests require data for hundreds or even thousands of records to execute performance tests over hours.
Automated test data management makes the test data available at your fingertips.
• Service virtualization: Virtual components require realistic test data to simulate the behavior of the live service or software they are emulating. Leveraging a test data management strategy to subset production data while masking sensitive information meets these requirements.

When implementing a test data management approach, five best practices help streamline both test data preparation and usage (see Figure 2):

1. Start by discovering and understanding test data. Data is scattered across systems and resides in different formats. In addition, different rules may be applied to data depending on its type and location. Organizations should identify their test data requirements based on the test cases, which means they must capture the end-to-end business process and the associated data for testing. This could involve a single application or multiple applications. For example, a business may have a CRM system, an inventory management application and a financial application that are all related and require test data.

Figure 2: Test data management essentials. The steps in the cycle:
1. Discover enterprise data sources and decide which data fields are required for the test case
2. Subset test data from production, taking only the data required for the test case
3. Mask the data as it moves from production to test
4. Run test cases and compare the test data to the original test data
5. If there are changes in the test data, they may indicate an error or problem
6. Allow testers and developers to refresh the test data to ensure agile development processes; retest
7. Production
2. Subset production data from multiple data sources. Subsetting is designed to ensure realistic, referentially intact test data from across a distributed data landscape without added cost or administrative burden. In addition, the best subsetting approaches include metadata in the subset to accommodate data model changes quickly and accurately. In this way, subsetting creates realistic test databases small enough to support rapid test runs but large enough to accurately reflect the variety of production data. Part of an automated subsetting process involves creating test data to force error and boundary conditions, which includes inserting rows and editing database tables along with multilevel undo capabilities.

3. Mask or de-identify sensitive test data. Masking helps secure sensitive corporate, client and employee information and also helps ensure compliance with government and industry regulations. Capabilities for de-identifying confidential data must provide a realistic look and feel, and should consistently mask complete business objects such as customer orders across test systems.

4. Refresh test data. During the testing process, test data often diverges from the baseline, resulting in a less-than-optimal test environment. Refreshing test data can improve testing efficiencies and help streamline the testing process while maintaining a consistent, manageable test environment, which improves the predictability and repeatability of testing efforts.

5. Automate expected and actual result comparisons. The ability to identify data anomalies and inconsistencies during testing is essential in measuring the overall quality of the application. The most efficient way to achieve this goal is by employing an automated capability for comparing the baseline test data against results from successive test runs; speed and accuracy are essential.
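The practices above can be sketched end to end. The following minimal Python example is an illustration only (the tables, field names and masking scheme are invented; a real implementation would use a dedicated test data management tool). It subsets parent and child rows together so the subset stays referentially intact, masks names deterministically so "Sally Smith" always maps to the same value, and compares a baseline snapshot against post-test data:

```python
import hashlib

# Invented stand-in for production data (illustration only).
customers = [
    {"customer_id": 1, "name": "Sally Smith"},
    {"customer_id": 2, "name": "Ravi Patel"},
    {"customer_id": 3, "name": "Ana Lima"},
]
orders = [
    {"order_id": 10, "customer_id": 1, "total": 250.0},
    {"order_id": 11, "customer_id": 3, "total": 80.0},
    {"order_id": 12, "customer_id": 1, "total": 40.0},
]

def subset(customer_ids):
    """Subset: take only the rows the test case needs, keeping parents
    and children together so the subset stays referentially intact."""
    sub_customers = [dict(c) for c in customers if c["customer_id"] in customer_ids]
    sub_orders = [dict(o) for o in orders if o["customer_id"] in customer_ids]
    return sub_customers, sub_orders

def mask_name(name):
    """Mask: deterministic masking, so the same input always yields the
    same masked value across every test system."""
    digest = hashlib.sha256(name.encode()).hexdigest()[:8]
    return f"Customer-{digest}"

test_customers, test_orders = subset({1, 3})
for c in test_customers:
    c["name"] = mask_name(c["name"])

# Compare: snapshot the baseline, then diff it against post-test data.
baseline = {o["order_id"]: o["total"] for o in test_orders}
after_test = dict(baseline)
after_test[10] = 275.0  # a test run updated this order
changed = {k for k in baseline if baseline[k] != after_test[k]}
print(sorted(changed))  # → [10]
```

The deterministic hash is one simple way to meet the consistency requirement noted earlier ("Sally Smith" masks to the same value every time); production-grade masking tools additionally preserve format and realism, which a bare hash does not.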
Automating these comparisons saves time and helps identify problems that might otherwise go undetected.

IBM InfoSphere Optim and Rational: Enhancing quality, accelerating continuous testing

Figure 3: Integrating test data management into quality testing. (The original diagram shows designing and managing the test campaign, comparing before and after data with InfoSphere Optim, and going to production.)

IBM delivers a complete quality testing solution through the IBM Rational® Quality Management and Testing and the IBM InfoSphere Optim™ Test Data Management product lines. Together, these offerings are designed to support continuous testing through rapid, high-quality test data generation and management (see Figure 3). The IBM Rational Solution for Collaborative Lifecycle Management (CLM) delivers quality management capabilities in Rational Quality Manager to manage all aspects of the quality lifecycle. Combined with IBM Rational Requirements Composer and IBM Rational Team Concert™, the complete CLM solution addresses the five imperatives of application lifecycle management:²

1. In-context collaboration: Collaboration can improve a team's ability to connect with each other, to respond to changing events and to improve project predictability.
2. Traceability across the lifecycle: By linking related artifacts, teams are better equipped to answer questions such as "Which requirements are affected by defects?" and "Which work items are ready for test?"
3. Real-time planning: The only way to know when the work is complete is to ensure the plans are fully integrated with project execution and always up to date.
4.
Development intelligence: Identifying areas for improvement, setting goals and tracking progress toward those goals cultivates development intelligence.
5. Continuous process improvement: Process is more than a documented set of procedures. A well-designed application lifecycle management solution allows you to change process incrementally, improve the team dynamic and continue to refine toward greater efficiencies.

InfoSphere Optim Test Data Management also supports all five test data management best practices. It creates rightsized, fictionalized test databases that accurately reflect end-to-end business processes. InfoSphere Optim scales to meet testing requirements across multiple applications, databases, operating systems and hardware platforms. It also facilitates modern software delivery models—including agile development—by making test data continuously accessible to testers and developers so they can quickly meet test requirements.

InfoSphere Optim Test Data Management provides a web-based platform that enables organizations to define roles, responsibilities and workflows for test data management services and operations. With InfoSphere Optim Services on Demand for Test Data Management capabilities, users can make direct requests for test data and refresh automatically—processes traditionally handled by a multitiered infrastructure that required human intervention (see Figure 4). InfoSphere Optim Test Data Management provides the execution framework and workflow required to ensure test data management initiatives are governed properly, that the various roles and responsibilities are well-coordinated and that any associated service levels and metrics are achieved.

The Rational Solution for Test Automation supports testing on a number of technologies and platforms, including both legacy deployments and web and mobile applications.
By combining Rational Test Workbench with service virtualization capabilities and sharing virtual components across the enterprise with Rational Test Virtualization Server, teams can perform testing earlier in the development cycle and make continuous integration testing part of the standard build process.

InfoSphere Optim Test Data Management integrates with Rational Quality Management and Testing solutions to enhance application quality and testing efficiency in the new era of computing. Together they help you:
• Reduce costs by intelligently creating and subsetting realistic test data from complete business objects
• Manage risk by masking sensitive data
• Speed delivery of test data through refresh capabilities

Figure 4: Integrating test data management into quality testing. (The original diagram depicts the test data management request and fulfillment process.)

Case study: The value of test data management at a US insurance company

The director of software quality was fed up. Almost daily, lead project managers and quality assurance (QA) staff were complaining about the amount of time spent acquiring, validating, organizing and protecting test data. Complicated front-end and back-end systems consistently caused budget overruns. Contingency plans were being built into project schedules because the team expected test data failures and rework. Project teams added 15 percent to all test estimates to account for the effort to collect data from back-end systems, and 10 percent of all test scenarios were not executed due to missing or incomplete test data. The result: costly production defects.
With 42 back-end systems needed to generate a full end-to-end system test, the business could not confidently launch new features. Testing in production was becoming the norm. In fact, claims could not be processed in certain states because of application defects that the teams skipped over during the testing process. IT was consuming an increasing number of resources—yet application quality was declining rapidly. The insurance company clearly lacked a test data management strategy aligned to business results. Something had to change.

The IT director assembled a cross-functional team and asked some tough questions:
• What is required to create test data?
• How much does test data creation cost?
• How far does the problem extend?
• How is the high application defect rate affecting the business?

Finding the answers to these questions was an involved process. No one had a complete understanding of the full story. Through the analysis process, the team discovered that requests for test data came too late, with too many redundancies. There were no efficient processes to provide test data for all of them. Teams would use old test data because of the effort involved in getting new test data, but using old test data resulted in a high number of defects. In addition, the security risks of exposing sensitive data during testing were rampant. After fully analyzing the problems, the team concluded that with every new USD14 million delivery, a hidden USD3 million was spent on test data management.
Sources of the hidden costs included:
• The labor required to move data to and from back-end systems and to identify the right data required for tests
• The time spent manipulating data so it would work for various testing scenarios
• The storage space for test data
• The cost of production defects in scenarios that were not tested because test data was not available
• The cost to mask sensitive data to protect privacy
• The costs of skipped test scenarios

After implementing a process to govern test data management, the insurance company reduced the costs of testing by USD400,000 per year. The company also implemented IBM solutions to deliver comprehensive test data management capabilities for creating fictionalized test databases that accurately reflect end-to-end business processes.

Today, the insurance company can easily refresh 42 test systems from across the organization in record time while finding defects in advance. The organization now has a better ability to process claims across all 50 states for less money. Testing in production is no longer the norm. The business value from implementing test data management included:
• Cost savings of approximately USD500,000 per year
• 44 percent fewer untested scenarios
• 41 percent less labor required over 12 months

The insurance company now has an enterprise test data process that helps it lower costs, improve predictability and enhance testing (including enabling automation, cloud testing, mobile testing and more). People, processes and technologies came together to make a real change.
Resources

InfoSphere Optim Test Data Management
To learn more about the InfoSphere Optim Test Data Management solution and how it can help you accelerate continuous testing, contact your IBM representative or IBM Business Partner, or explore these resources:
• ibm.com/software/data/optim/streamline-test-data-management
• Watch InfoSphere Optim Test Data Management in action in an online demonstration
• Download the white paper: "Enterprise Strategies to Improve Application Testing"
• Check out the e-book: "Back to basics: Fundamentals of test data management"
• Watch the "Moving application quality from good to great" webcast

IBM Rational
To learn more about IBM Rational solutions and capabilities, check out these resources:
• IBM Rational Quality Management and Testing solutions
• IBM Service Virtualization: Removing today's testing bottlenecks

© Copyright IBM Corporation 2013
IBM Corporation, Software Group, Route 100, Somers, NY 10589
Produced in the United States of America, June 2013

IBM, the IBM logo, ibm.com, BigInsights, IMS, InfoSphere, Optim, Rational, and Rational Team Concert are registered trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the web at "Copyright and trademark information" at ibm.com/legal/copytrade.shtml

This document is current as of the initial date of publication and may be changed by IBM at any time. Not all offerings are available in every country in which IBM operates. The performance data and client examples cited are presented for illustrative purposes only.
Actual performance results may vary depending on specific configurations and operating conditions. THE INFORMATION IN THIS DOCUMENT IS PROVIDED "AS IS" WITHOUT ANY WARRANTY, EXPRESS OR IMPLIED, INCLUDING WITHOUT ANY WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT. IBM products are warranted according to the terms and conditions of the agreements under which they are provided.

The client is responsible for ensuring compliance with laws and regulations applicable to it. IBM does not provide legal advice or represent or warrant that its services or products will ensure that the client is in compliance with any law or regulation.

¹ CSC, "Big Data Universe Beginning to Explode," http://www.csc.com/insights/flxwd/78931-big_data_universe_beginning_to_explode
² "Five Imperatives for Application Lifecycle Management," https://jazz.net/library/article/637/#incontextcollaboration

Please Recycle

IMM14122-USEN-00
