Mainstream software projects are increasingly embracing Agile these days. It is important to understand how this paradigm shift impacts testing teams: what challenges do they encounter, and how do they overcome them?

This blog post attempts to answer some of these questions by drawing inferences from a recent customer engagement at Aspire, in which the client's team and Aspire's offshore team collaboratively planned the move to Agile.

Driving the transformation to Agile Testing

Our customer is a leader in enterprise fraud management solutions in the US and has been engaged with us for more than six years. Aspire is involved in manual testing and test automation for their suite of products. Driven by a heightened need for shorter testing cycles and higher visibility, the customer decided to move ahead with Agile.

Primary Differences in Working Style: Waterfall Model vs. Agile

In the waterfall model, requirements along with product documentation were entrusted to the team.  There was enough time to understand the requirements, perform exploratory testing, write test cases, and automate QA.  On the test automation front, the builds were stable, with the UI and functionality intact, making it easier to prepare good test scripts.

These practices, which had worked well earlier, were not applicable to the Agile approach. So the team decided to tackle each problem one by one, analyze the strategy for Agile testing, and define new methodologies.

The typical activities that were carried out include:

Requirements Understanding and Scheduling:

Less Documentation/Regular Communication Meetings: The team met with the Product Owners, Scrum Masters, and developers to understand the requirements one by one, and validated its understanding by questioning the defined use cases.  The requirements were approached from the customer's perspective.

Refining the requirements from the customer perspective: The requirements were refined down to the smallest possible unit.  This enabled the team to commit each unit requirement to a specific sprint, and to begin writing test scenarios with test cases as soon as the sprint started.

Estimating the effort required to complete the work: Based on the refined requirements, the team was able to precisely estimate the effort required to complete the design, development, QA, and documentation.  This let everyone on the team know when a preceding task (say, development) would be completed and its work product would be available for succeeding tasks (say, QA), which helped track the schedule and avoid any slippage.

Test Automation:

The transition was not as easy for test automation as it was for manual testing.  Earlier, the team had a complete set of test cases and the cushion of stable builds for categories like smoke and regression, which made it possible to create module-based scripts with good reusability.

Development & Test Automation in Different Sprints: Initially, test automation trailed development and manual testing by one sprint: the team automated the previous sprint's test cases so as to minimize test script maintenance.  Since this was not in tune with Agile, we followed the approach below.

  • Developing the feature, writing test cases, and creating test scripts proceeded in parallel for a given requirement.
  • The developer/Product Owner reviewed the test cases against the scoped requirements.
  • The test automation engineer started writing stub test scripts and common functions based on the reviewed test cases.
  • Transparent & Collaborative Approach: Each member of the team was aware of the others' work, which helped communicate changes sooner and avoid rework within the sprints. This integrated way of working played a crucial role here.
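The "stub script first" idea above can be sketched in code. The post does not name the team's framework or any helper functions, so the names below (`build_test_user`, the fraud-alert test) are purely illustrative; the point is that the script's skeleton and shared helpers are committed while the feature is still being developed in the same sprint.

```python
import unittest

# --- common function, written as soon as the test cases are reviewed ---
def build_test_user(role="analyst"):
    """Reusable helper shared across scripts (illustrative name/shape)."""
    return {"name": "qa-user", "role": role}

# --- stub test script: structure is in place before the build arrives ---
class FraudAlertTests(unittest.TestCase):
    @unittest.skip("stub: feature still in development this sprint")
    def test_duplicate_transaction_raises_alert(self):
        user = build_test_user()
        # TODO: drive the application once the sprint build is available
        self.assertIsNotNone(user)

if __name__ == "__main__":
    unittest.main()
```

Once development and manual testing finish within the sprint, the skip marker is removed and the TODO body is filled in, so automation lands in the same sprint as the feature rather than one sprint behind.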

Regression testing:

Delivering working products faster requires shortening end-of-release activities, specifically regression testing.  To accommodate this, we strengthened our QA infrastructure so that regression could be run quickly, as and when needed, without being pushed to the end of the release.

  • At the end of each sprint, the test cases (mostly automated) were classified into categories such as sanity, smoke, and regression.
  • The smoke test cases in particular were executed as nightly tests, which helped uncover any major issues immediately.
  • Importantly, these test suites had to be properly maintained so the team could have confidence in the scripts. The testing team therefore worked with the developers, who reviewed the scripts and also ran the tests whenever they felt something had changed or been affected.
  • This substantially brought down the regression testing effort required at the end of each release.
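One way to implement the classification above is to tag each test with the suites it belongs to, so the nightly smoke run and the end-of-release regression run select from the same code base. The post does not say how the team tagged tests, so this is a minimal sketch with made-up names (`suite`, `TransactionTests`):

```python
import unittest

def suite(*names):
    """Decorator attaching suite tags (e.g. 'smoke', 'regression') to a test."""
    def mark(fn):
        fn.suites = set(names)
        return fn
    return mark

class TransactionTests(unittest.TestCase):
    @suite("smoke", "regression")
    def test_service_starts(self):
        self.assertTrue(True)  # placeholder check

    @suite("regression")
    def test_edge_case_rounding(self):
        self.assertTrue(True)  # placeholder check

def select(testcase_cls, wanted):
    """Collect the test names tagged for one suite, e.g. the nightly smoke run."""
    return [name for name in dir(testcase_cls)
            if name.startswith("test")
            and wanted in getattr(getattr(testcase_cls, name), "suites", set())]
```

A nightly job would then run `select(TransactionTests, "smoke")`, while the end-of-release job runs the `"regression"` selection; many frameworks offer this natively (for example, marker-based selection), but the principle is the same.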

Certifying alpha / beta kits

  • In the earlier waterfall model, resources spent dedicated time certifying alpha and beta kits.
  • With the current approach to regression testing and test automation, kit qualification was integrated into, and became an implicit part of, the sprint deliverables.
  • A fully working product could thus be delivered in each sprint.

Team Work

All of this requires consensus within the team that "quality of the product is everyone's responsibility."  The emphasis has been on reviewing test cases and test scripts, involving ourselves in requirement grooming, giving feedback on usability, and, most importantly, approaching testing with quality in mind.

Conclusion:

Agile helped us improve and scale up in new areas. Team members who had previously performed only manual testing started developing automation scripts and reviewing scripts developed by others.

The team interacted intensively and worked closely with the developers and product owners.

As a result, they were able to deliver the product quickly (in roughly 50% less time than with the waterfall approach) and with good quality. Most importantly, they are learning each day in Agile and "Transforming Product Development".