COTS versus Tool-Assisted Rewrite


Upgrading a legacy system can be difficult and expensive, and it is important to consider your options carefully. One option that often comes up is to replace the legacy system with a commercial off-the-shelf (COTS) package of similar, but seldom identical, functionality. In many cases no suitable COTS replacement exists, so the decision is easy. In other cases, however, a reasonably good package replacement is available and the COTS option is at least conceivable. The COTS option also holds a particular appeal: the package may include some attractive features, and the implementation cost seems easy to estimate: licenses and implementation support, simple enough. Beware, however, that a COTS implementation carries many hidden challenges, costs, and risks. Compared to other options, these costs cannot be ignored.


Great Migrations (GM) has a unique methodology for upgrading legacy applications to contemporary platforms, the tool-assisted rewrite, which offers a number of benefits compared to the COTS option. This article presents our view of the COTS versus GM-Upgrade question.


Table 1 compares the COTS Package Replacement approach with GM's Tool-Assisted Rewrite approach, using the upgrade of legacy VB6 assets to .NET as an example. The tool mentioned in this example is gmStudio, our VB6/ASP/COM-to-.NET migration development platform and upgrade solution.


Table 1: Comparing System Modernization Methodologies

Process
Tool-Assisted Rewrite: Agile, continuous improvement of the conversion process to ensure smooth evolution of the system. Fine-grained control of scope and schedule.
COTS Replacement: Waterfall approach with a "big bang" cut-over of systems, data, people, and processes. Less control over scope and schedule.

Functional Requirements
Tool-Assisted Rewrite: Production code is used as the accurate, complete, and detailed specification of all legacy functionality; the upgrade can be done with no business impact. Effort is focused on specific opportunities that have ROI; choose best-of-breed, right-sized solutions.
COTS Replacement: Gap analysis to identify customization; negotiate closing gaps with the vendor. Expert resources must revisit functionality and re-fit it within vendor constraints. You will buy the vendor's decisions.

Data and Integrations
Tool-Assisted Rewrite: Repeatable, automated, documented process. The data model can be kept out of scope; data integrations and reporting remain as-is.
COTS Replacement: Proprietary customizations made by the vendor. Major data conversion effort to a new data model; data integrations and reporting will have to change.

People
Tool-Assisted Rewrite: Developers work with familiar code that is ready for maintenance and refactoring, and learn by comparison. Users continue working with familiar processes, forms, data, and reports.
COTS Replacement: Users and developers must be retrained on the new application, processes, and reports. Possible disruption of service.

Testing
Tool-Assisted Rewrite: Systematic transformation of working code and an automatic "fix-and-test" process. Fewer and faster test cycles; side-by-side, approval-based testing.
COTS Replacement: Massive functional change, unstable requirements, and coordination with the vendor mean more and longer test cycles. New testing standards and processes.

Transition
Tool-Assisted Rewrite: Quick transition: reach the cut-over point, put it behind you, and get back to adding value. Ability to manage scope to ensure smooth evolution and contain costs. Often, IT organizations have already had success with .NET; the upgrade can leverage and improve those processes.
COTS Replacement: Very long transition. Developers, support people, and users are focused on closing gaps, customization, and retraining, then on adapting to many operational changes.

Long Term
Tool-Assisted Rewrite: The company retains more freedom and control over features, schedule, and investments. It continues to grow a dedicated IT staff that shares company priorities, values, and culture. Significantly smaller internal cost.
COTS Replacement: The company gives up some freedom and control over features, schedule, and investments. It becomes dependent on the vendor's IT staff, agenda, and priorities. Major disruption and internal costs.
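The side-by-side, approval-based testing noted in Table 1 can be sketched in general terms. The sketch below is a minimal illustration, not gmStudio functionality; it assumes both the legacy build and the migrated build can be driven from the command line and produce comparable text output.

```python
import subprocess
from pathlib import Path

def capture(cmd, outfile):
    """Run one build of the system and save its textual output."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    Path(outfile).write_text(result.stdout)
    return result.stdout

def side_by_side(legacy_cmd, migrated_cmd, case_name):
    """Approval-style check: the migrated output must match the
    approved legacy output exactly; otherwise leave both files
    on disk for a visual diff."""
    legacy = capture(legacy_cmd, f"{case_name}.approved.txt")
    migrated = capture(migrated_cmd, f"{case_name}.received.txt")
    if legacy == migrated:
        print(f"{case_name}: PASS")
        return True
    print(f"{case_name}: DIFF - compare .approved.txt and .received.txt")
    return False
```

Because the legacy system itself generates the approved output, no requirements document needs to be written or re-verified for each test cycle; this is what makes the cycles fewer and faster.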


Leveraging Your Investments

Most legacy applications are large, deeply integrated systems produced through significant initial investment followed by ongoing maintenance and enhancement.  Companies that use legacy systems invest in them heavily: carefully vetting and implementing the highest-priority enhancements and fixes, release after release, year after year.  Users build business processes around the legacy system's forms, reports, documents, and workflows.  They often build business intelligence repositories, data feeds, and other integrations to fit the legacy system's data model.  In addition, the line-of-business teams, both in business and in IT, are trained on the legacy application and know it well.  For example, in the insurance claims area, business process innovation and IT development are strongly influenced by the legacy application.  We estimate that the total business and IT effort already invested in a large, mature system will easily total at least 100 person-years, which often translates into millions of dollars over time.  Your investment in legacy systems produces three assets: the legacy source code, the legacy data, and the legacy development organization.

The Legacy Source Code

The legacy source code, although written in VB6 notation, is an extremely valuable asset.  It contains all the instructions needed to systematically and completely describe the appearance and behavior of the legacy system.  Our methodology is designed to fully leverage this asset, resulting in major savings in cost and risk.

The Legacy Data

The legacy data contains precise details about the transactions and information processed by the legacy system.  That data is likely used by many reports and other processes that drive critical business operations and decisions.  The data model and data infrastructure are completely preserved by an upgrade, but they will typically be drastically changed by a package replacement.

Legacy Development Organization

The development organization has a strong working knowledge of the business processes supported by the application.  This knowledge can be fully leveraged, preserved, and in fact enhanced by upgrading the existing system to .NET.  The development team will learn new skills and techniques, and they will also develop a deeper understanding of how the application is put together and how it works.  Existing logical design and testing processes will also be upgraded through the modernization effort.

Going Beyond Conversion

Our methodology is somewhat different from what most people probably think of when they hear "software translation".  Our tools are designed to help automate and manage all phases of the migration.  We proceed by an agile process we call the tool-assisted rewrite methodology.  The target design and conversion process are developed by a small team of developers who come to understand the legacy application and refine your .NET design standards.  Based on this detailed understanding, the team configures gmStudio to produce code that satisfies your requirements and standards.  Significant progress can be made without requiring a code freeze and without disrupting ongoing development of the legacy system.
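The iterative loop behind this process can be summarized as: translate the current legacy source with the current rule set, audit the result against the target standards, refine the rules, and retranslate. The sketch below is a generic illustration of that loop; the rule format and the simple text-substitution translator are hypothetical stand-ins, not the actual gmStudio interface.

```python
def apply_rules(text, rules):
    """Apply each (old, new) rewrite rule to one source file's text."""
    for old, new in rules:
        text = text.replace(old, new)
    return text

def run_translator(source_files, rules):
    """Hypothetical stand-in for an automated translation tool:
    re-translates every legacy file from scratch on each run,
    so the legacy code never needs to be frozen."""
    return {name: apply_rules(text, rules) for name, text in source_files.items()}

def audit(translated, banned_patterns):
    """Report findings where the generated code still violates
    the team's target-code standards."""
    findings = []
    for name, text in translated.items():
        for pattern in banned_patterns:
            if pattern in text:
                findings.append((name, pattern))
    return findings

# One pass of the refine-and-retranslate cycle: the audit findings
# drive the next refinement of the rule set, not hand edits to output.
legacy = {"Main.bas": "Dim x As Variant"}
rules = [("As Variant", "As Object")]
print(audit(run_translator(legacy, rules), ["Variant"]))
```

The key point the sketch makes is that corrections go into the rule set, not into the generated code, so every run reproduces the full system and hand-patching is never lost to a retranslation.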

The methodology produces an automated, high-fidelity conversion process that can be used to produce a .NET solution for the legacy system: a solution that can be verified to be correct, is reengineered to conform to your architectural standards, and takes advantage of what the .NET platform has to offer.  Having such a process minimizes cost, risk, and disruption without sacrificing quality, control, or time to market.  It produces an end product that flows seamlessly to the user community, with negligible disruption to operations and minimal retraining.  This is not the case with the COTS option.