AWS Public Sector Blog

Karsun Solutions builds modernization platform using Amazon Bedrock


Summary

Highly regulated enterprises and government agencies still maintain critical applications operating on legacy mainframe systems. The Government Accountability Office (GAO) published a 2023 report identifying critical federal IT legacy systems in need of modernization that were written in older languages, such as COBOL. According to GAO, reliance on older languages carries risks “such as a rise in procurement and operating costs, and a decrease in the availability of individuals with the proper skill sets.”

Modernizing critical applications includes challenges of undocumented and complex business rules, technical obstacles, regulatory compliance, enterprise-grade security, and privacy. Amazon Web Services (AWS) Partner Karsun Solutions modernizes legacy enterprise systems to elevate the capabilities of its government agency customers. Karsun has created a modernization platform called ReDuX, which uses generative artificial intelligence (AI) powered by Amazon Bedrock.

AWS gives Karsun developers access to a choice of leading foundation models (FMs) and low-cost infrastructure, while Karsun’s methodology allows business analysts to uncover hidden knowledge within mainframe application code written in languages such as COBOL. ReDuX also enables architects and developers to efficiently convert legacy functions into cloud-native microservices. Additionally, ReDuX reduces operational risk by automatically generating test scripts and data synchronization pipelines. This blog post provides an overview of Karsun’s approach, which enables enterprises to successfully transition their business functions from a mainframe to microservices on AWS.

Problem

Rearchitecting mainframe applications as modern, cloud-native services presents several challenges, which have resulted in mainframe lock-in. As the aforementioned GAO report observes, finding people who still know the application logic is difficult. The business rules embedded in legacy code are often complex and poorly documented. Legacy applications are tightly coupled, yet they cannot be migrated in one fell swoop because of business operational risks; migration must instead proceed incrementally. This adds the challenges of defining service boundaries and managing underlying coupling and data synchronization requirements. Additionally, there is a need to maintain system availability during migration and to continually retest the system in its new environment. These complexities highlight the need for a careful and considered approach to migrating from mainframe to cloud-native services. Finally, as generative AI technology opens the door to content summarization and code generation that boost developer productivity, code and data must be handled securely and privately.

Karsun’s ReDuX

ReDuX is a toolkit of playbooks, mechanisms, and frameworks designed to expedite the modernization of mainframe applications. It incrementally refactors legacy systems and digitally transforms user experiences, merging the valuable aspects of business-critical mainframe applications with new automation and design needs. Key tools in the ReDuX toolkit include Blueprint, which provides deep insights into existing systems and maps them to new services, and AppPilot, a generative AI tool that uses Blueprint to generate code and facilitate incremental migration. AppPilot also brings enhanced security and privacy, offers project-specific usability by interfacing with various systems, mitigates the hallucinations seen in large language models (LLMs), and provides proven prompt templates that save developers time and effort.

About Amazon Bedrock

Amazon Bedrock is a fully managed service that offers a choice of high-performing FMs from leading AI companies via a single API, along with a broad set of capabilities needed to build generative AI applications with security, privacy, and responsible AI. With the comprehensive capabilities of Amazon Bedrock, you can easily experiment with a variety of top FMs, privately customize them with your data using techniques such as fine-tuning and retrieval-augmented generation (RAG), and create managed agents that execute complex business tasks without writing any code.
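
For readers new to the service, the minimal sketch below (not part of ReDuX; the Region, model ID, and prompt are placeholders) shows how a single call to the Amazon Bedrock Converse API reaches a foundation model using the AWS SDK for Python (Boto3).

```python
import boto3

# Amazon Bedrock Runtime client; the Region and model ID are examples only.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": [{"text": "Explain what a COBOL copybook is in two sentences."}]}],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the model reply in a uniform structure across FMs.
print(response["output"]["message"]["content"][0]["text"])
```

Because the request and response shapes are the same across models, swapping the modelId is often all that is needed to experiment with a different FM.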

Solution

Rearchitecting a large mainframe application portfolio entails an incremental approach. This demands a roadmap that spans several years and a mindful application of the strangler pattern so that ongoing business operations are not hindered. Executing this roadmap warrants multiple agile teams with cross-functional roles, including business analysts, architects, and software engineers, working concurrently on different aspects of the migration. To support those roles, ReDuX leverages years of application modernization experience and code repositories, along with access to leading FMs provided by Amazon Bedrock. Figure 1 shows a high-level architecture of ReDuX and how it accesses leading FMs via Amazon Bedrock APIs.

Figure 1. Architectural diagram of ReDuX. ReDuX’s primary components, Blueprint and AppPilot, are hosted in Amazon Elastic Kubernetes Service (Amazon EKS) clusters. Amazon Aurora PostgreSQL, Amazon Neptune, and Amazon Simple Storage Service (Amazon S3) serve as the data stores. Foundation models are accessed through AWS PrivateLink via Amazon Bedrock APIs.
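
As a hedged illustration of the private connectivity shown in Figure 1, the sketch below uses Boto3 to create an interface VPC endpoint for the Bedrock runtime service; the VPC, subnet, and security group IDs are placeholders, and the service name assumes the US East (N. Virginia) Region.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Interface endpoint so Amazon Bedrock API calls stay on the AWS network
# rather than traversing the public internet. All resource IDs are placeholders.
endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.bedrock-runtime",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,
)

print(endpoint["VpcEndpoint"]["VpcEndpointId"])
```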

ReDuX Blueprint helps business analysts gain insights by analyzing source COBOL code to understand the complex domain and generate accurate, complete requirements. Analysis of the COBOL code enables the construction of a domain knowledge graph encompassing user flows, APIs, data structures, and jobs, offering a comprehensive blueprint of the business workload. A generative AI chatbot built on Amazon Bedrock, together with an AI-enhanced knowledge graph, aids in the exploration of the system’s business rules at various levels, including user interfaces, backend APIs, and data structures. The analyst can dive deeper with the chatbot to probe the underlying details of the functions. By tuning inference hyperparameters and supplying the right context through RAG, hallucinations are controlled when translating code into plain descriptions or answering questions. Figure 2 shows a screenshot of Blueprint displaying an application’s backend services, data structures, and business rules to an analyst, who can interact with a chatbot to drill down into further details.

Figure 2. Screenshot of Blueprint’s API view showing the backend services, data structures, and business rules. On the right-hand side, an analyst drills down using a chat interface.
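
Blueprint’s internals are Karsun’s own, but the general pattern described above, RAG-grounded translation of COBOL into plain-language business rules, can be sketched roughly as follows; the model ID, inference settings, and helper function are assumptions for illustration.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def explain_cobol(paragraph_source: str, graph_context: list[str]) -> str:
    """Translate a COBOL paragraph into a plain-language business rule,
    grounding the model in retrieved knowledge-graph context to limit
    hallucinations. Retrieval itself is out of scope for this sketch."""
    context = "\n".join(graph_context)
    prompt = (
        "Using only the context and code below, describe the business rule "
        "implemented by this COBOL paragraph in plain language.\n\n"
        f"Context from the domain knowledge graph:\n{context}\n\n"
        f"COBOL source:\n{paragraph_source}"
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"temperature": 0.1, "maxTokens": 1024},  # low temperature curbs drift
    )
    return response["output"]["message"]["content"][0]["text"]
```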

While analysts probe business requirements, architects examine a C4 visualization model generated by Blueprint to analyze abstract representations of the system based on domain-driven design principles, produced through Amazon Bedrock. This insight empowers architects to construct a transformation map from legacy mainframe functions to modernized cloud-native applications and microservices. Architects then make decisions informed by the best practices recommended in the AWS Architecture Center.

Software engineers also leverage the same C4 model to get a complete understanding of the existing code, including all undocumented logic, along with impact analysis of internal and external dependencies. Software engineers can then use ReDuX AppPilot to generate production-grade code for the new environment. AppPilot flexibly switches between FMs to generate code at optimal cost depending on code complexity. AppPilot contextualizes the generative AI request and response based on the chosen architectural patterns, frameworks, agile execution state, and business domain. AppPilot greatly improves the developer experience and boosts productivity by generating purpose-fit code. Hallucinations are mitigated by injecting reference implementations as context using RAG, along with Amazon Bedrock fine-tuning. Furthermore, with security and privacy built in from day one, none of the customer’s data is used to train the original base models.
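
AppPilot itself is proprietary, but the cost-aware model switching and context injection described above can be sketched along these lines; the routing table, model IDs, target language parameter, and prompt are assumptions for illustration.

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical routing table: a cheaper model for simple conversions,
# a more capable (and more expensive) model for complex ones.
MODEL_BY_COMPLEXITY = {
    "low": "anthropic.claude-3-haiku-20240307-v1:0",
    "high": "anthropic.claude-3-sonnet-20240229-v1:0",
}

def generate_service_code(legacy_snippet: str, reference_impl: str,
                          target_language: str, complexity: str) -> str:
    """Generate cloud-native code from a legacy snippet, injecting a retrieved
    reference implementation as context (RAG) and picking the model by complexity."""
    prompt = (
        f"Convert the following legacy logic into a {target_language} microservice "
        "method, following the style of the reference implementation.\n\n"
        f"Reference implementation:\n{reference_impl}\n\n"
        f"Legacy logic:\n{legacy_snippet}"
    )
    response = bedrock.converse(
        modelId=MODEL_BY_COMPLEXITY[complexity],
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"temperature": 0.2, "maxTokens": 2048},
    )
    return response["output"]["message"]["content"][0]["text"]
```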

The following video shows a screen capture of a software engineer using AppPilot to query Blueprint to understand existing application logic and then to generate converted code in the target language.

Mainframe application transformation is not just about code modernization; it is also about ensuring the integrity, accuracy, and consistency of the data during these incremental modernizations. Data engineers and data analysts can leverage Blueprint and AppPilot to bridge the data flow, streamline complex analysis, and make strategic decisions on syncing data elements. Leveraging the knowledge graph of the legacy system along with the new requirements, AppPilot assists data engineers in generating new data definition language (DDL) statements to define the target data store. AppPilot then assists in generating the extract, transform, and load (ETL) code needed to implement the data pipelines that keep the data in sync.
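
As a rough sketch of the DDL generation step (not Karsun’s implementation; the model ID and prompt are assumptions), a data engineer might prompt a Bedrock model with a legacy record layout and review the returned statements before applying them:

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def draft_target_ddl(record_layout: str, target_engine: str = "PostgreSQL") -> str:
    """Draft DDL for the target data store from a legacy record layout.
    The output is a starting point for engineer review, not a final schema."""
    prompt = (
        f"Generate {target_engine} CREATE TABLE statements for the legacy record "
        "layout below. Preserve field names, lengths, and nullability.\n\n"
        f"{record_layout}"
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"temperature": 0.1, "maxTokens": 2048},
    )
    return response["output"]["message"]["content"][0]["text"]
```

A similar prompt, seeded with the source and target schemas, can draft the ETL pipeline code that keeps the two data stores in sync during the incremental migration.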

Conclusion

AWS helped Karsun capitalize on generative AI and harness momentum to turn ideas into real productivity gains. As a lean cloud modernization organization working with federal agencies, Karsun needs security, privacy, scale, and price-performance. Most importantly, Karsun can trust AWS to deliver solutions that are relevant to their business.

The transformation journey from a mainframe application to modern microservices using Karsun ReDuX and Amazon Bedrock is not just a technical shift. It is a fundamental transformation, a fine blend of technology and innovation that significantly improves the operational efficiency and effectiveness of mission-critical operations. This approach unravels a legacy business system’s intricate complexities and enables a smooth transition to a modernized environment.

Vu Le

Vu is a senior solutions architect at Amazon Web Services (AWS) with more than 20 years of experience. He works closely with AWS Partners to expand their cloud business and increase adoption of AWS services. Vu has deep expertise in storage, data modernization, and building resilient architectures on AWS, and has helped numerous organizations migrate mission-critical systems to the cloud. Vu enjoys photography, time with his family, and his beloved corgi.

Anuraadhaa Kandadai

Anu is a full stack developer at Karsun Solutions. She actively contributes to the Karsun Innovation Center, handling a variety of projects that span from full-stack development to artificial intelligence (AI). Apart from her professional pursuits, Anu enjoys dancing and sewing.

Badri Sriraman

Badri is a senior information technology (IT) architect at Karsun Solutions and leads the Karsun Innovation Center. He has more than 30 years of experience in modernizing enterprise systems, with expertise in leveraging artificial intelligence (AI), machine learning (ML), and other cutting-edge technologies to advance the objectives of federal clients. He holds a master’s degree in software engineering from Johns Hopkins University.

Judewin Gabriel

Judewin is a cloud architect at Karsun Solutions, and has more than 15 years of experience in the software industry. With a keen interest in areas ranging from DevSecOps automation to artificial intelligence (AI) and machine learning (ML), Judewin is dedicated to crafting innovative solutions for complex technological challenges. He enjoys simplifying and demystifying technology, making it more accessible and valuable for his customers.

Pranav Khadilkar

Pranav is a solutions architect at Amazon Web Services (AWS) with more than 15 years of experience in software development, design, and architecture. He works closely with partners to assist public sector customers in developing Well-Architected workloads on the AWS Cloud. Pranav is passionate about emerging technologies, particularly generative artificial intelligence (AI) and machine learning (ML). He is also a generative AI ambassador at AWS.

Shanmugasundaram Palanivelu

Shanmuga is a software architect at Karsun Solutions with nearly two decades of experience in building complex software systems. His expertise lies in crafting high-quality, scalable solutions. Recently, Shanmuga has been leveraging artificial intelligence (AI) and machine learning (ML) to enhance the developer experience, and he is excited to explore the intersection of software engineering and AI/ML.