Most enterprises carry significant baggage in the form of legacy software running in production — applications developed 10-15 years ago. Some of them no longer have vendor support or patch upgrades and are insecure. Yet these applications often play a critical role in revenue generation and in keeping the day-to-day operations of the company running. The CIO's conundrum has always been: how do I keep the legacy and still modernize my IT? New technologies can help either keep the legacy running while a parallel application is developed to feature parity, or support a complete rewrite of the code until feature parity is attained.
DevOps, APIs, microservices, and containers are being perceived as the silver bullet for resolving some of the key and common legacy modernization challenges. Greenfield projects can adopt them without much difficulty, but with existing legacy and monolithic applications the benefits are realized over a longer period.
Let’s look at the common challenges and objectives from a CIO perspective:
- Maintaining system integrity: legacy systems are not equipped to support modern business needs
- COTS software with no access to the code base
- Slow and costly development: the technology complexity of legacy systems, combined with skill gaps, increases cost
- Data security: opening legacy systems to modern applications requires security functionality they were never designed to provide for external access
- How can IT be optimized for speed vs. cost?
- Refactoring is time-consuming
- Open source or lock-in?
- Public or private cloud?
- Integrate CI/CD to reduce cycle times and make development teams more productive
- Achieve auto-scaling without having to worry about provisioning once deployed
- Reduce wait times through self-service
- Move to an API, microservices, and containerization model
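To make the last objective concrete, the target model breaks a monolith's capabilities into small services, each owning one function behind an HTTP API and deployable in its own container. A minimal sketch of such a service using only the Python standard library (the "orders" service name and endpoint are hypothetical):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json


class OrderServiceHandler(BaseHTTPRequestHandler):
    """Hypothetical 'orders' microservice: one business capability
    exposed behind a small HTTP API, suitable for containerization."""

    def do_GET(self):
        if self.path == "/orders/health":
            body = json.dumps({"status": "ok", "service": "orders"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        # Keep the sketch quiet; real services would emit structured logs.
        pass


def make_server(port=0):
    """Bind the service; port=0 lets the OS pick a free port."""
    return HTTPServer(("127.0.0.1", port), OrderServiceHandler)
```

In practice each such service would get its own container image and be scaled independently by the platform, which is what enables the auto-scaling and self-service objectives above.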
Enterprises started with monolithic architectures, waterfall processes, and physical servers. They then moved to Agile processes, multi-tier architectures, and virtual servers. The current roadmap is to deliver at speed by moving to DevOps, microservices architectures, and containers for deployment. IT now has days to deliver and weeks to deploy, so speed becomes key.
How can a CIO create a structured way to modernize legacy systems?
There needs to be a method to the madness. For starters, it’s important to assess the current state of the company’s IT landscape. Once there is clarity on this, there are a few ways to start. Let’s look at some of them.
| Approach | Scenario 1 | Scenario 2 | Scenario 3 |
| --- | --- | --- | --- |
| Legacy system | Well architected, COTS | Monolithic | Monolithic |
| Solution | Microservices, APIs, containers | Prioritize apps based on needs; use containers/APIs | Complete rewrite; containers/APIs, PaaS |
| Pros/Cons | Suited for smaller applications | Large applications, faster modernization deadlines; modernize only key apps | Large applications; expensive, difficult to maintain timelines |
If the legacy system is well architected, define a starting point and an end state. If the goal is to move toward a microservices and API approach, start by containerizing selected applications that are critical and essential for the business; components can then be deployed independently. Run them on a PaaS. Once the midpoint is reached, look at the possibility of refactoring the rest toward the end state. This approach suits COTS or other closed-source legacy software and smaller applications. The challenge with COTS is that there is no access to the code base, which means the pieces of the architecture must be visualized, layers defined from the ground up, and the result then containerized.
The second approach is refactoring the legacy. In large legacy applications, any modification to the system may have repercussions elsewhere; changes are not isolated. The rule of thumb in these scenarios is to treat the legacy as a backend and reinvent from the outside. If the business wants, for example, partner integration or mobility capability, all that needs to be done is identify two integration points: one for functionality and one for data. Build a new platform on top using a container platform. This avoids shutting down the legacy system, and the new capability can be delivered in weeks — the legacy keeps running while innovation happens outside it. This approach is ideal for monolithic systems with an environment lockdown in both development and QA. Typically such a system was developed under a waterfall approach on proprietary technology, leading to heavy dependencies, little automation or deployment in development cycles, and large releases with long timelines. The key is to isolate the pieces into separate containers without a massive rewrite, while recognizing that refactoring will be time-consuming.
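The "treat the legacy as backend and reinvent from outside" idea is often described as the strangler-fig pattern: a facade sits in front of the monolith and routes each capability either to the untouched legacy system or to a newly built service, one capability at a time. A minimal sketch, with all class and method names hypothetical:

```python
class LegacyBackend:
    """Stands in for the untouched monolith, reached through its
    existing integration point (e.g. a database view or internal API)."""

    def get_customer(self, customer_id):
        return {"id": customer_id, "served_by": "legacy"}


class CustomerMicroservice:
    """New functionality built outside the monolith, deployable
    independently in its own container."""

    def get_customer(self, customer_id):
        return {"id": customer_id, "served_by": "microservice"}


class Facade:
    """Single entry point for callers; routes each capability to
    legacy or new code, so the legacy never has to be shut down."""

    def __init__(self):
        self._legacy = LegacyBackend()
        self._modern = CustomerMicroservice()
        self._migrated = set()  # capabilities already rebuilt outside

    def migrate(self, capability):
        """Flip one capability from legacy to the new platform."""
        self._migrated.add(capability)

    def get_customer(self, customer_id):
        target = self._modern if "get_customer" in self._migrated else self._legacy
        return target.get_customer(customer_id)
```

Because callers only ever see the facade, each capability can be moved and rolled back independently — which is exactly what makes the refactoring incremental rather than a big-bang rewrite.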
The third, and probably most tempting, approach is a complete rewrite, in which the legacy is totally replaced and the new system runs on a PaaS. Some legacy components can be re-wrapped, but most are shut down. The key is to identify the stage at which feature parity with the legacy system is reached. The main challenge to solve is how to rewrite the code to attain that parity: prioritize business use cases and features, start writing new code with microservices and APIs, shut down the backend, and stop when parity is reached. This approach is easy to start but hard to finish, given the complexity. If the IT portfolio is too large and complex, start with a POC and then scale.
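One way to make "stop when parity is reached" measurable is to shadow-run the rewrite against the legacy system on the same inputs and track divergences. A minimal sketch of such a parity check (the function names and report shape are illustrative, not a specific tool):

```python
def parity_report(legacy_fn, new_fn, sample_inputs):
    """Run the legacy and rewritten implementations side by side on the
    same inputs and collect every input where their results diverge.
    Feature parity for this capability is reached when no input
    produces a different result."""
    mismatches = [x for x in sample_inputs if legacy_fn(x) != new_fn(x)]
    return {
        "checked": len(sample_inputs),
        "mismatches": mismatches,
        "parity": not mismatches,
    }
```

Running such checks per business use case turns "parity" from a judgment call into a metric that can gate when each legacy component is safe to shut down.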
As the saying goes, today’s software is tomorrow’s legacy. It’s important to modernize, but with the future in mind.
Learn how Aspire has been modernizing legacy applications to contemporary and digital architectures in this webcast: https://www.brighttalk.com/webcast/12529/231075