Support Statement: Automated Testing

Testing must be an integral part of every upgrade effort. For larger systems, reliable test automation may be a requirement. Testing processes require highly skilled manual work to define, set up, and use.

Here are some of the things GM offers to facilitate testing:
  • Map technical upgrade risks to UI, platform, and code elements as an input to test planning.  gmStudio-generated reports help with this.

  • Detect and report all code changes with every translation tuning cycle to help identify changes that might impact functionality and to help plan regression testing. This is a gmStudio feature (a change-report sketch follows this list).

  • Produce detailed UI reports that can help with UI-driven test planning. gmStudio-generated reports help with this.

  • Implement automated unit testing harnesses for the runtime frameworks supporting the application. Manual task.

  • Implement small Migration Unit Tests to allow verifying custom upgrade features independently of the full application.

  • Automatically instrument the source code so that we can log what is going on behind the scenes. When the instrumented source is translated, we produce a like-instrumented .NET application. Running the two applications side by side produces detailed logs that can be compared to verify hidden functionality (a log-comparison sketch follows this list). gmStudio and gmSL scripts can do the instrumentation.

  • Produce build-complete .NET code early in a project. This allows teams to generate an automated unit test framework for application code (using NUnit or Visual Studio). There is still a lot of effort needed to flesh out inputs and expected results in the AUT framework. We would like to develop a side-by-side automated unit testing methodology that takes advantage of instrumentation and can use COM-interop legacy components to produce expected results for comparison (a side-by-side unit test sketch follows this list); but this still takes technical skill and time to define, set up, and use.

  • Activate automated Code Review as part of the MSBuild process. We also have reports for tabulating the build logs for further analysis (a build-log tabulation sketch follows this list). This is a feature of MSBuild and gmStudio.

  • And last but definitely not least, we are extending the Coded UI testing (CUIT) framework distributed with VS2013 Premium Edition. We are using our framework to implement a custom CUIT language and scripting toolset that will simplify the creation and automation of side-by-side UI tests. This is still in beta, but it is already saving us a lot of time testing the big desktop applications that we are rewriting for project work. Some test scripts that take up to 15 tedious minutes to run manually run automatically in only a couple of minutes with our tool. Since our solution is built on top of Microsoft technology, we have a strong community behind the work and can take advantage of the testing features of Visual Studio, including code coverage reporting. We have also verified that the test scripts can be hooked into the TeamCity Continuous Integration server as an MSTest task. It's all very cool, and highly complementary to our automated upgrade methodology. For the gmTL specification, see /wiki/spaces/GMI/pages/1741958.
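To make the change-report idea concrete, here is a minimal sketch (in Python, not gmStudio itself) of flagging translated files that differ between two translation tuning cycles. The output directory names and the *.cs pattern are assumptions for the example:

```
# Illustrative sketch: flag translated files that changed between two
# translation tuning cycles so regression testing can focus on them.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return a SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files(prev_dir: str, curr_dir: str, pattern: str = "*.cs"):
    """Yield (relative_path, status) for files that differ between cycles."""
    prev = {p.relative_to(prev_dir): file_digest(p)
            for p in Path(prev_dir).rglob(pattern)}
    for p in Path(curr_dir).rglob(pattern):
        rel = p.relative_to(curr_dir)
        old = prev.get(rel)
        if old is None:
            yield rel, "added"
        elif old != file_digest(p):
            yield rel, "modified"

if __name__ == "__main__":
    # Hypothetical output directories from two tuning cycles.
    for rel, status in changed_files("out/cycle_041", "out/cycle_042"):
        print(f"{status:8}  {rel}")
```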
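The side-by-side instrumentation approach boils down to comparing two trace logs after running the same scenario in both applications. Below is a minimal sketch of that comparison; the timestamp format and log file names are assumptions, and a real harness would normalize more volatile fields (handles, temp paths, etc.):

```
# Illustrative sketch: compare the trace logs produced by the instrumented
# legacy app and the instrumented .NET app, reporting the first divergence.
import re
from itertools import zip_longest

TIMESTAMP = re.compile(r"^\d{2}:\d{2}:\d{2}\.\d+\s+")  # e.g. "12:30:01.123 "

def normalize(line: str) -> str:
    """Drop volatile fields (here, a leading timestamp) before comparing."""
    return TIMESTAMP.sub("", line.rstrip())

def first_divergence(legacy_log: str, dotnet_log: str):
    """Return (line_no, legacy_line, dotnet_line) at the first mismatch."""
    with open(legacy_log) as a, open(dotnet_log) as b:
        for n, (la, lb) in enumerate(zip_longest(a, b, fillvalue=""), 1):
            if normalize(la) != normalize(lb):
                return n, la.rstrip(), lb.rstrip()
    return None

if __name__ == "__main__":
    hit = first_divergence("trace_vb6.log", "trace_net.log")  # assumed names
    if hit is None:
        print("logs match")
    else:
        print(f"diverge at line {hit[0]}:\n  VB6:  {hit[1]}\n  .NET: {hit[2]}")
```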
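The side-by-side unit testing idea can be sketched as follows: drive the legacy component and the new routine with the same inputs and compare results. This example uses Python with pywin32 for COM automation purely for illustration; the ProgID and method names are hypothetical, and in practice this would be an NUnit or MSTest fixture calling the legacy component through COM interop:

```
# Illustrative sketch: side-by-side unit test that uses the legacy COM
# component as the oracle for the rewritten routine. Requires Windows,
# pywin32, and a registered legacy COM server.
import win32com.client

def new_tax(amount: float, rate: float) -> float:
    """Stand-in for the rewritten .NET routine under test."""
    return round(amount * rate, 2)

def test_tax_side_by_side():
    legacy = win32com.client.Dispatch("Legacy.TaxCalc")  # hypothetical ProgID
    cases = [(100.0, 0.07), (19.99, 0.0625), (0.0, 0.07)]
    for amount, rate in cases:
        expected = legacy.ComputeTax(amount, rate)  # hypothetical method
        actual = new_tax(amount, rate)
        assert abs(actual - expected) < 0.005, (amount, rate, expected, actual)

if __name__ == "__main__":
    test_tax_side_by_side()
    print("all side-by-side cases agree")
```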
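Tabulating a build log is straightforward; a minimal sketch follows. The log file name is an assumption, but the "warning CSnnnn:" line format is standard MSBuild/C# compiler output:

```
# Illustrative sketch: count MSBuild warnings by diagnostic code so
# translation tuning can target the noisiest issues first.
import re
from collections import Counter

WARNING = re.compile(r"\bwarning\s+(?P<code>[A-Z]{2,3}\d{4}):")

def tabulate_warnings(log_path: str) -> Counter:
    counts = Counter()
    with open(log_path, errors="replace") as log:
        for line in log:
            m = WARNING.search(line)
            if m:
                counts[m.group("code")] += 1
    return counts

if __name__ == "__main__":
    for code, n in tabulate_warnings("msbuild.log").most_common():
        print(f"{code}: {n}")
```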

Tools will only take you so far. Even with automation, you will need experienced people collaborating effectively:
 
1) QA leaders/Users creating detailed test scripts -- this takes a deep understanding of the data, dynamics, and use of the system.
2) DBAs helping with automated test data management -- making sure the data in the side-by-side environments is ready and valid for the expectations of the test scripts (see the data-check sketch after this list)
3) SCM and deployment teams maintaining the code in each environment -- ideally using continuous integration
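As a simple illustration of the test data management point above, here is a minimal sketch that compares per-table row counts across the side-by-side environments before a test run. sqlite3 keeps the example self-contained; the database and table names are hypothetical, and a real pair of environments would more likely be SQL Server behind the same kind of query:

```
# Illustrative sketch: sanity-check that side-by-side test databases hold
# equivalent data before running the test scripts against both apps.
import sqlite3

def row_counts(db_path: str, tables: list[str]) -> dict[str, int]:
    """Return a per-table row count for one environment's database."""
    with sqlite3.connect(db_path) as conn:
        return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
                for t in tables}

def check_side_by_side(legacy_db: str, new_db: str, tables: list[str]):
    old, new = row_counts(legacy_db, tables), row_counts(new_db, tables)
    for t in tables:
        status = "OK" if old[t] == new[t] else "MISMATCH"
        print(f"{status:8} {t}: legacy={old[t]} new={new[t]}")

if __name__ == "__main__":
    check_side_by_side("legacy.db", "new.db", ["Customers", "Orders"])
```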

You will also need a reliable development/debugging environment to get up close and personal with the old and new code (i.e., stepping through both in a debugger with valid test data).

<soapbox>
For best results, the users, QA team, and support groups should be able to routinely complete a fully documented, successful, automated regression test of the legacy app before they try testing the new app. That will keep them busy while the upgrade team focuses on delivering a runnable, testable new app. And it will help maintain momentum once testing starts. Beware: a customer might decide to postpone a project before it starts if they realize they cannot rigorously test what they already have.
</soapbox>

One more comment: one way to mitigate testing pressure is to break the codebase into smaller parts that can be upgraded, tested, and deployed independently. Some systems facilitate this, and some don't. An incremental approach can make for a longer, more complex transition in which the customer exists in a hybrid state, using and maintaining a mix of old and new until the last component is upgraded. Some customers are already in this state, so they may be more accustomed to it. The gradual transition approach can also make re-engineering more complex because the new code might have to integrate with old components in production.