<soapbox>
Testing must be an integral part of every upgrade effort. For larger systems, reliable test automation may be a requirement. Even with automation, testing processes require a lot of highly skilled manual work to define, set up, and use effectively and reliably. A strong QA manager and an experienced team will be absolutely critical to this effort.
</soapbox>
Here are some of the things GM offers to facilitate testing:
- Map technical upgrade risks to UI, platform, and code elements as an input to test planning. gmStudio-generated reports help with this.
- Detect and report all code changes with every translation tuning cycle to help identify changes that might impact functionality and help plan regression testing. This is a gmStudio feature.
- Produce detailed UI reports that can help with UI-driven test planning. gmStudio-generated reports help with this.
- Implement automated unit testing harnesses for our runtime frameworks supporting the application. This is a manual task.
- Implement small Migration Unit Tests to verify custom upgrade features independently of the full application.
- Automatically instrument the source code so that we can log what is going on behind the scenes. When the instrumented source is translated, we produce a like-instrumented .NET application. Running the applications side by side produces detailed logs that can be compared to facilitate verification of hidden functionality. gmStudio and gmSL scripts can do the instrumentation (see the trace-logging sketch after this list).
- Produce build-complete .NET code early in a project. This allows teams to generate an automated unit test framework for application code (using NUnit or Visual Studio). There is still a lot of effort needed to flesh out inputs and expected results in the AUT framework. We would like to develop a side-by-side automated unit testing methodology that takes advantage of instrumentation and can call legacy components through COM interop to produce expected results for comparison (see the side-by-side test sketch after this list); but this still takes technical skill and time to define, set up, and use.
- Activate automated Code Review as part of the MSBuild process. We also have reports for tabulating the build logs for further analysis. This is a feature of MSBuild and gmStudio.
- And last but definitely not least, we are extending the Coded UI testing (CUIT) framework distributed with VS2013 Premium Edition. We are using our framework to implement a custom CUIT language and scripting toolset that will simplify the creation and automation of side-by-side UI tests (see the scripting sketch after this list). This is still in beta, but it is already saving us a lot of time with testing big desktop applications that we are rewriting for project work. Some of the test scripts would take up to 15 tedious minutes to run manually, but our tool runs them automatically in only a couple of minutes. Since our solution is built on top of Microsoft technology, we have a strong community behind the work, and we can take advantage of the testing features of Visual Studio, including code coverage reporting. We have also verified that the test scripts can be hooked into the TeamCity continuous integration server as an MSTest task. It's all very cool and highly complementary to our automated upgrade methodology. For the gmTL specification, see /wiki/spaces/GMI/pages/1741958
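To make the instrumentation idea concrete, here is a minimal trace-logging sketch of what the .NET side of the logging might look like. The `GmTrace` class, the log record format, and the `GMTRACE_LOG` environment variable are illustrative assumptions, not the actual gmStudio/gmSL instrumentation output.

```csharp
using System;
using System.IO;

// Minimal side-by-side trace logger (illustrative sketch).
// The instrumented legacy app would write the same record format,
// so the two logs can be diffed after a side-by-side run.
public static class GmTrace
{
    private static readonly object Gate = new object();
    private static readonly string LogPath =
        Environment.GetEnvironmentVariable("GMTRACE_LOG") ?? "gmtrace.net.log";

    // Called at instrumented points: procedure entry, branches, exit.
    public static void Log(string procName, string pointId, string values)
    {
        // Timestamps are deliberately omitted so the legacy and .NET
        // logs remain directly comparable line by line.
        lock (Gate)
        {
            File.AppendAllText(LogPath,
                $"{procName}|{pointId}|{values}{Environment.NewLine}");
        }
    }
}

// An instrumented translated method might then look like:
//   GmTrace.Log("ComputeTotal", "enter", $"qty={qty};price={price}");
//   ...
//   GmTrace.Log("ComputeTotal", "exit", $"total={total}");
```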
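Next, a rough side-by-side test sketch of the COM-interop unit testing idea: an NUnit test drives the registered legacy component and the translated .NET class with the same inputs and treats the legacy result as the expected result. The `MyApp.OrderCalc` ProgID, the `ComputeTotal` member, and the stub class are hypothetical placeholders for whatever the application actually exposes.

```csharp
using NUnit.Framework;

[TestFixture]
public class OrderCalcSideBySideTests
{
    // Stand-in for the translated .NET class; in a real project this
    // would be the generated code, not a hand-written stub.
    public class OrderCalcUpgraded
    {
        public double ComputeTotal(int qty, double price) => qty * price;
    }

    [TestCase(1, 19.99)]
    [TestCase(10, 5.00)]
    public void ComputeTotal_MatchesLegacy(int qty, double price)
    {
        // Legacy side: late-bound COM call into the registered legacy
        // component. "MyApp.OrderCalc" is a hypothetical ProgID.
        var comType = System.Type.GetTypeFromProgID("MyApp.OrderCalc");
        dynamic legacy = System.Activator.CreateInstance(comType);
        double expected = legacy.ComputeTotal(qty, price);

        // New side: the translated class, driven with the same inputs.
        var upgraded = new OrderCalcUpgraded();
        double actual = upgraded.ComputeTotal(qty, price);

        // The legacy behavior is the oracle: results must match.
        Assert.That(actual, Is.EqualTo(expected).Within(0.0001));
    }
}
```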
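Finally, an illustrative scripting sketch of a thin interpreter layered over the Coded UI API, to give a feel for how a simplified UI-test scripting layer can work. The pipe-delimited verbs shown here are invented for this sketch; they are not the gmTL language (see the specification linked above).

```csharp
using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UITesting.WinControls;

// Illustrative interpreter for a tiny UI-test script language.
// Each script line is verb|argument; the verbs map onto CUIT calls.
public static class MiniUiScript
{
    public static void Run(string[] scriptLines)
    {
        ApplicationUnderTest app = null;
        foreach (var line in scriptLines)
        {
            var parts = line.Split('|');
            switch (parts[0])
            {
                case "launch": // launch|C:\path\to\app.exe
                    app = ApplicationUnderTest.Launch(parts[1]);
                    break;
                case "click":  // click|ButtonName
                    var button = new WinButton(app);
                    button.SearchProperties[WinButton.PropertyNames.Name] = parts[1];
                    Mouse.Click(button);
                    break;
                case "type":   // type|text to send
                    Keyboard.SendKeys(parts[1]);
                    break;
            }
        }
    }
}
```

Running the same script against the legacy and upgraded builds yields two comparable UI sessions, which is what makes this style of scripting useful for side-by-side testing.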
Tools will only take you so far. Even with automation, you will need experienced people collaborating effectively to close four huge gaps:
1) Strong QA leadership and support for users creating detailed test scripts -- this takes a deep understanding of the data, dynamics, and use of the system.
2) Strong DBA support, including help with automated test data management -- making sure the data in the side-by-side environments is ready and valid for the expectations of the test scripts.
3) Strong SCM, deployment, and environment support teams maintaining the code in each environment -- ideally using continuous integration.
4) A development testing area -- you will also need a reliable development/debugging environment to get up close and personal with the old and new code (i.e., in a debugger with valid test data).
<soapbox>
For best results, the users, the QA team, and the support groups should be able to routinely complete a fully documented, successful, automated regression test of the legacy app before they try testing the new app. That will keep them very busy for quite a while, while the upgrade team focuses on getting them a solid runnable, testable new app. It will also help maintain momentum once testing starts. Beware: a customer might decide to postpone a project before it starts if they realize they cannot rigorously test what they already have.
One more comment: one way to mitigate testing pressure is to break the codebase into smaller parts that can be upgraded, tested, and deployed independently. Some systems facilitate this, and some don't. An incremental approach can make for a longer, more complex transition where the customer exists in a hybrid state, using and maintaining a mix of old and new until the last component is upgraded. Some customers are already in this state, so they may be more accustomed to it. The gradual transition approach can also make re-engineering more complex because the new code might have to integrate with old components in production.