AWS Database Blog

Augment DMS SC with Amazon Q Developer for code conversion and test case generation

You can use the AWS Database Migration Service Schema Conversion (AWS DMS SC) with generative AI feature to accelerate your database migration to AWS. This feature automatically handles the conversion of many database objects during migration by using traditional rule-based techniques and deterministic AI techniques. For the items that AWS DMS SC can’t convert, Amazon Q Developer can generate PostgreSQL-compatible code alternatives, workarounds, and comprehensive test cases to help you complete your database modernization journey.

In this post, we demonstrate how Amazon Q Developer delivers generic solutions for complex AWS DMS SC issues, intelligently converts database stored procedure code from source to target database-compatible code, and automatically generates comprehensive test cases to validate your migrated database objects.

Solution overview

AWS DMS SC provides valuable conversion capabilities and detailed assessment reports that identify the schema elements that require manual attention. Amazon Q Developer enhances this workflow and generates solutions for complex SQL Server to Amazon Aurora PostgreSQL-Compatible Edition migration challenges that weren’t handled by AWS DMS SC. We explore three ways that Amazon Q Developer can accelerate your database modernization journey beyond AWS DMS SC capabilities:

  1. Providing generic solutions for complex AWS DMS SC issues, reducing manual workaround development time.
  2. Automating commercial to open source database code conversions, reducing manual development effort.
  3. Generating comprehensive test cases automatically, reducing testing cycles while helping maintain code quality and consistency across the migration.

Prerequisites

Complete the following prerequisites before getting started.

  1. Install the Amazon Q Developer extension for your preferred IDE. In this post, we use Visual Studio Code (VS Code).
  2. Sign in with either a Builder ID or IAM Identity Center.
  3. Complete database assessment for SQL Server to Amazon Aurora PostgreSQL-Compatible Edition using AWS DMS SC with generative AI enabled.
  4. Security is the top priority at AWS. We recommend reviewing the documentation on applying the shared responsibility model in the context of Amazon Q Developer before proceeding with the following solution.

I. Amazon Q Developer generic solutions for AWS DMS SC issues

This solution helps resolve migration issues without exposing your data model or proprietary code, helping you maintain data privacy. This is recommended for organizations with strict security requirements.

Amazon Q Developer generates generic, reusable code samples for AWS DMS SC issues without accessing your sensitive database schemas or stored procedures. You provide AWS DMS SC with action item numbers through prompts to discover workarounds for complex migration scenarios while keeping your intellectual property secure.

The following screenshot shows the results of the sample assessment. The assessment report provides a high-level summary of how much of the schema can be converted automatically from SQL Server to PostgreSQL, and the work that will be left after conversion.

Let’s consider the following example, where you can use Amazon Q Developer to provide a solution for an AWS DMS SC action item.

We used the following prompt to ask Amazon Q Developer for a workaround:

“Provide workaround for AWS DMS SC issue 7657: PostgreSQL doesn’t support the hierarchyid data type – SQL server to PostgreSQL migration”

Amazon Q Developer provided several workarounds for this action item. The following are two examples:

1. Use PostgreSQL’s ltree Extension (Amazon Q Recommended)

The ltree extension provides hierarchical tree-like structures similar to hierarchyid:

-- Enable the ltree extension
CREATE EXTENSION IF NOT EXISTS ltree;
-- Convert the hierarchyid column to ltree
ALTER TABLE your_table 
ADD COLUMN hierarchy_path ltree;
-- Example conversion from the hierarchyid string representation
-- (trim the leading and trailing slashes so no ltree label is empty)
UPDATE your_table 
SET hierarchy_path = text2ltree(replace(trim(both '/' from hierarchyid_column::text), '/', '.'));

2. Use the materialized path pattern

-- Add path column
ALTER TABLE your_table 
ADD COLUMN hierarchy_path VARCHAR(500);
-- Example: /1/2/3/ represents node 3 under node 2 under root node 1
UPDATE your_table 
SET hierarchy_path = '/' || replace(hierarchyid_column::text, '.', '/') || '/';
-- Query descendants
SELECT * FROM your_table 
WHERE hierarchy_path LIKE '/1/2/%';
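The two workarounds boil down to simple string transformations. The following Python sketch (illustrative only, not part of the generated workarounds) mirrors them: a hierarchyid path such as '/1/2/3/' becomes the ltree label path '1.2.3', and a descendant query becomes a prefix match.

```python
# Illustrative sketch (not AWS sample code): the string transformations
# behind the two hierarchyid workarounds.

def hierarchyid_to_ltree(hid: str) -> str:
    # Mirrors text2ltree(replace(trim(both '/' from col), '/', '.'))
    return hid.strip("/").replace("/", ".")

def is_descendant(path: str, ancestor_prefix: str) -> bool:
    # Mirrors: WHERE hierarchy_path LIKE '/1/2/%'
    return path.startswith(ancestor_prefix)

print(hierarchyid_to_ltree("/1/2/3/"))    # 1.2.3
print(is_descendant("/1/2/3/", "/1/2/"))  # True
print(is_descendant("/1/4/", "/1/2/"))    # False
```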

In the next example, we ask Amazon Q Developer to provide a solution for AWS DMS SC issue 7658: PostgreSQL doesn’t support the sql_variant data type. We used the following prompt to ask Amazon Q Developer for a workaround:

“Provide workaround for AWS DMS SC issue 7658: PostgreSQL doesn’t support the sql_variant data type”

Amazon Q Developer provided several workarounds for this action item. The following are two examples:

1. Use JSONB

-- Replace sql_variant with JSONB
ALTER TABLE your_table ADD COLUMN variant_data JSONB;
-- Migration
UPDATE your_table 
SET variant_data = jsonb_build_object(
    'type', pg_typeof(sql_variant_column)::text,
    'value', sql_variant_column::text
);
-- Query
SELECT * FROM your_table 
WHERE variant_data->>'type' = 'integer' 
AND (variant_data->>'value')::int > 100;
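The JSONB workaround stores each value as a small {'type', 'value'} document and filters on the recorded type. The following Python sketch mirrors that encode-and-filter logic; the function names are ours for illustration, not from the generated SQL.

```python
import json

# Illustrative sketch: the {'type', 'value'} document the JSONB workaround
# stores, and the typed filter the sample query applies.

def to_variant_json(value) -> str:
    # Rough analogue of jsonb_build_object('type', pg_typeof(col)::text,
    # 'value', col::text) for a few Python types
    type_name = {int: "integer", float: "numeric", str: "text"}[type(value)]
    return json.dumps({"type": type_name, "value": str(value)})

def is_int_above(variant_json: str, minimum: int) -> bool:
    # Mirrors: WHERE variant_data->>'type' = 'integer'
    #            AND (variant_data->>'value')::int > minimum
    doc = json.loads(variant_json)
    return doc["type"] == "integer" and int(doc["value"]) > minimum

print(is_int_above(to_variant_json(250), 100))    # True
print(is_int_above(to_variant_json("abc"), 100))  # False
```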

2. Use TEXT with type prefix

-- Add text column
ALTER TABLE your_table ADD COLUMN variant_text TEXT;
-- Migration
UPDATE your_table 
SET variant_text = pg_typeof(sql_variant_column)::text || ':' || sql_variant_column::text;
-- Parse function
CREATE FUNCTION get_variant_value(variant_text TEXT, target_type TEXT)
RETURNS TEXT AS $$
    SELECT CASE WHEN split_part($1, ':', 1) = $2 
           THEN split_part($1, ':', 2) END;
$$ LANGUAGE SQL;
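The TEXT workaround concatenates a type name and the value with a colon. This Python sketch (ours, for illustration) mirrors the encoding and the get_variant_value parse function from the generated SQL. Note that values containing ':' are ambiguous in this encoding, a caveat that applies to the SQL version as well.

```python
# Illustrative sketch of the 'type:value' text encoding and its parser,
# mirroring get_variant_value from the generated SQL.

def encode_variant(value) -> str:
    type_name = {int: "integer", float: "numeric", str: "text"}[type(value)]
    return f"{type_name}:{value}"

def get_variant_value(variant_text: str, target_type: str):
    # Split on the first ':' only; returns None when the type doesn't match,
    # like the SQL CASE expression
    type_name, _, raw = variant_text.partition(":")
    return raw if type_name == target_type else None

print(encode_variant(100))                          # integer:100
print(get_variant_value("integer:100", "integer"))  # 100
print(get_variant_value("integer:100", "text"))     # None
```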

Generative AI inference is non-deterministic. Code conversion results can vary between executions and might not be identical to the previous examples shown. The workarounds provided are based on observed patterns but might require adjustment for your specific use case. Always review and test generated code thoroughly in your development environment before production use.

II. Augment AWS DMS SC with Amazon Q Developer for unconverted database code

Amazon Q Developer can help you further convert SQL Server stored procedures to Aurora PostgreSQL-Compatible by translating T-SQL syntax, control flow structures, and proprietary functions into PostgreSQL-compatible PL/pgSQL code.

Let’s review an example where we have a SQL Server stored procedure that performs a MERGE statement based on business criteria. This stored procedure synchronizes the Products table with incoming data by updating existing products, inserting new ones, and deleting products that no longer exist in the source. It returns a log of all changes made (inserts, updates, and deletes) for tracking purposes.

SQL Server sample code

-- Create a SQL Server table
CREATE TABLE [dbo].[Products](
    [ProductID] [int] IDENTITY(1,1) NOT NULL,
    [ProductName] [varchar](100) NOT NULL,
    [Price] [decimal](10, 2) NOT NULL,
    [StockQuantity] [int] NULL,
    PRIMARY KEY CLUSTERED
    (
        [ProductID] ASC
    )
)
GO
-- Create a user-defined table type to pass source data
CREATE TYPE dbo.ProductType AS TABLE
(
    ProductID INT,
    ProductName NVARCHAR(100),
    Price MONEY
)
GO
-- Stored procedure performing Merge
CREATE PROCEDURE dbo.usp_MergeProducts
    @ProductSource dbo.ProductType READONLY
AS
BEGIN
    SET NOCOUNT ON;
    -- Declare a table variable to hold the output from the MERGE statement
    DECLARE @MergeOutput TABLE (
        Action NVARCHAR(10),
        ProductID INT,
        ProductName NVARCHAR(100),
        Price MONEY
    );
    -- Temporarily enable IDENTITY_INSERT for the Products table
    SET IDENTITY_INSERT dbo.Products ON;
    MERGE dbo.Products AS T -- T is the target table
    USING @ProductSource AS S -- S is the source data from the TVP
    ON (T.ProductID = S.ProductID) -- This is the join condition
    -- WHEN MATCHED performs an UPDATE on existing rows
    WHEN MATCHED THEN
        UPDATE SET
            T.ProductName = S.ProductName,
            T.Price = S.Price
    -- WHEN NOT MATCHED BY TARGET performs an INSERT for new rows
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ProductID, ProductName, Price)
        VALUES (S.ProductID, S.ProductName, S.Price)
    -- WHEN NOT MATCHED BY SOURCE performs a DELETE on rows in the target not in the source
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE
    -- Capture the OUTPUT from the MERGE into the @MergeOutput table variable
    OUTPUT
        $action AS Action,
        ISNULL(Inserted.ProductID, Deleted.ProductID) AS ProductID,
        Inserted.ProductName,
        Inserted.Price
    INTO @MergeOutput;
    -- Turn IDENTITY_INSERT back OFF for the Products table
    SET IDENTITY_INSERT dbo.Products OFF;
    -- Return the results to the caller from the table variable
    SELECT * FROM @MergeOutput;
END
GO

AWS DMS SC can’t convert this procedure and comments out all the code, as shown in the following image:

Because AWS DMS SC can’t convert the code, provide the source code and target folder details to Amazon Q Developer in VS Code using the following steps.

  1. Extract the code for this specific procedure from either the AWS DMS SC console shown in the previous image or directly from the SQL Server database.
  2. Save the code in your preferred location. In this post, we stored the code on a local computer drive.
  3. Provide Amazon Q Developer with the following prompt:

“Convert SQL Server procedure code into Aurora PostgreSQL-Compatible code.
Source code is present in a designated folder <Path>
Convert it to a procedure and keep lower case for object names, keep the names of objects same as the source code. Qualify table names with the schema dbo and print each record using RAISE NOTICE
Save the converted code output in folder <>”

The output from Amazon Q Developer can vary between runs, and you might see different results from what’s shown here. AI models are improving rapidly; the examples and outputs presented in this post reflect the model’s performance at the time of writing and testing, and you can expect more refined capabilities in future iterations. The following is the Amazon Q Developer generated code for this specific procedure.

CREATE TABLE dbo.products (
    productid SERIAL PRIMARY KEY,
    productname VARCHAR(100) NOT NULL,
    price DECIMAL(10, 2) NOT NULL,
    stockquantity INTEGER
);
CREATE OR REPLACE PROCEDURE dbo.usp_mergeproducts()
LANGUAGE plpgsql
AS $$
DECLARE
    rec RECORD;
BEGIN
    CREATE TEMP TABLE IF NOT EXISTS productsource (
        productid INTEGER,
        productname VARCHAR(100),
        price DECIMAL(10, 2)
    );
    -- Upsert records
    INSERT INTO dbo.products (productid, productname, price)
    SELECT s.productid, s.productname, s.price
    FROM productsource s
    ON CONFLICT (productid)
    DO UPDATE SET
        productname = EXCLUDED.productname,
        price = EXCLUDED.price;
    -- Print upserted records
    FOR rec IN 
        SELECT p.productid, p.productname, p.price
        FROM dbo.products p
        WHERE p.productid IN (SELECT productid FROM productsource)
    LOOP
        RAISE NOTICE 'UPSERT - ProductID: %, ProductName: %, Price: %', 
            rec.productid, rec.productname, rec.price;
    END LOOP;
    -- Delete records not in source
    FOR rec IN
        SELECT p.productid, p.productname, p.price
        FROM dbo.products p
        WHERE p.productid NOT IN (SELECT productid FROM productsource WHERE productid IS NOT NULL)
    LOOP
        RAISE NOTICE 'DELETE - ProductID: %, ProductName: %, Price: %', 
            rec.productid, rec.productname, rec.price;
    END LOOP;
    DELETE FROM dbo.products 
    WHERE productid NOT IN (SELECT productid FROM productsource WHERE productid IS NOT NULL);
END;
$$;

Generative AI inference is non-deterministic. Code conversion may not be the same as shown in the previous example.
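Although the T-SQL MERGE and the PL/pgSQL upsert-plus-delete look different, they implement the same set semantics: update matching rows, insert new rows, and delete target rows absent from the source. The following Python sketch (ours, for illustration; not generated code) captures that logic.

```python
# Illustrative sketch: the set semantics that both the T-SQL MERGE and the
# PL/pgSQL upsert-plus-delete rewrite implement.

def merge_products(target: dict, source: dict) -> dict:
    # Keys are productid; values are (productname, price) tuples
    merged = dict(target)
    merged.update(source)        # WHEN MATCHED -> UPDATE; NOT MATCHED BY TARGET -> INSERT
    for pid in list(merged):
        if pid not in source:    # WHEN NOT MATCHED BY SOURCE -> DELETE
            del merged[pid]
    return merged

target = {1: ("Widget", 9.99), 2: ("Gadget", 19.99)}
source = {2: ("Gadget v2", 24.99), 3: ("Gizmo", 4.99)}
print(merge_products(target, source))
# {2: ('Gadget v2', 24.99), 3: ('Gizmo', 4.99)}
```

Because the delete clause removes everything not matched by the source, the final target state always equals the source set, which is the invariant the test cases in the next section check.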

III. Generate test cases using Amazon Q Developer

Amazon Q Developer can automatically generate test cases, reducing testing cycles while maintaining code quality and consistency throughout the migration.

Now we use Amazon Q Developer to create comparable test cases for the SQL Server and PostgreSQL procedures with identical scenarios: INSERT new records, UPDATE existing records, DELETE through partial source, mixed operations, and empty source deletion. The tests use the same input data and verify that both platforms produce identical final table states, validating that the migration preserves the original MERGE logic despite implementation differences (TVP vs temp table, OUTPUT vs RAISE NOTICE).

The following prompt generates test cases for the SQL Server and PostgreSQL procedures:

“Look at the source SQL Server code at <Path> and Amazon Q developer converted file at <Path>. Create comparable test cases for each one of them to validate the migration from SQL Server to PostgreSQL. There is already data in both source and target, do not change or delete existing data”

The number and scope of test cases generated can vary between different runs due to AI model variations and context interpretation differences. Always review and validate all generated test cases before execution in any environment (development, staging, or production).

For this run, Amazon Q Developer generated five test cases for both SQL Server and PostgreSQL; three of them are shown here:

SQL SERVER TEST CASES (Preserving Existing Data)

-- Test 1: INSERT only new records (high ProductIDs to avoid conflicts)
DECLARE @Test1 dbo.ProductType;
INSERT INTO @Test1 VALUES (9001, 'Test Product A', 100.00), (9002, 'Test Product B', 200.00);
SELECT 'Before Test 1:' as Status, COUNT(*) as RecordCount FROM dbo.Products;
EXEC dbo.usp_MergeProducts @Test1;
SELECT 'After Test 1:' as Status, COUNT(*) as RecordCount FROM dbo.Products;

-- Test 2: UPDATE existing test records only
DECLARE @Test2 dbo.ProductType;
INSERT INTO @Test2 VALUES (9001, 'Updated Test A', 150.00), (9002, 'Updated Test B', 250.00);
EXEC dbo.usp_MergeProducts @Test2;

-- Test 3: Mixed - keep existing + add new test record
DECLARE @Test3 dbo.ProductType;
INSERT INTO @Test3 
SELECT ProductID, ProductName, Price FROM dbo.Products WHERE ProductID < 9000; -- Keep all original data
INSERT INTO @Test3 VALUES (9003, 'New Test C', 300.00); -- Add new test record
EXEC dbo.usp_MergeProducts @Test3;

-- Cleanup test records only
DELETE FROM dbo.Products WHERE ProductID >= 9000;

POSTGRESQL TEST CASES (Preserving Existing Data)

-- Test 1: INSERT only new records (high productids to avoid conflicts)
CREATE TEMP TABLE productsource (productid INTEGER, productname VARCHAR(100), price DECIMAL(10, 2));
INSERT INTO productsource VALUES (9001, 'Test Product A', 100.00), (9002, 'Test Product B', 200.00);
SELECT 'Before Test 1:' as status, COUNT(*) as record_count FROM dbo.products;
CALL dbo.usp_mergeproducts();
SELECT 'After Test 1:' as status, COUNT(*) as record_count FROM dbo.products;
DROP TABLE productsource;

-- Test 2: UPDATE existing test records only
CREATE TEMP TABLE productsource (productid INTEGER, productname VARCHAR(100), price DECIMAL(10, 2));
INSERT INTO productsource VALUES (9001, 'Updated Test A', 150.00), (9002, 'Updated Test B', 250.00);
CALL dbo.usp_mergeproducts();
DROP TABLE productsource;

-- Test 3: Mixed - keep existing + add new test record
CREATE TEMP TABLE productsource (productid INTEGER, productname VARCHAR(100), price DECIMAL(10, 2));
INSERT INTO productsource SELECT productid, productname, price FROM dbo.products WHERE productid < 9000; -- Keep all original data
INSERT INTO productsource VALUES (9003, 'New Test C', 300.00); -- Add new test record
CALL dbo.usp_mergeproducts();
DROP TABLE productsource;

-- Cleanup test records only
DELETE FROM dbo.products WHERE productid >= 9000;
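One way to confirm that both platforms reach the same final table state is to export the rows after each test and compare them order-insensitively. The following Python helper is hypothetical (the row format is our assumption), but it shows the kind of normalization needed since the two engines format numerics differently.

```python
# Hypothetical validation helper (ours): compare final table states exported
# from SQL Server and PostgreSQL, ignoring row order and numeric formatting.

def states_match(sqlserver_rows, postgres_rows) -> bool:
    def normalize(rows):
        # Rows are (productid, productname, price) tuples
        return sorted((int(i), n, round(float(p), 2)) for i, n, p in rows)
    return normalize(sqlserver_rows) == normalize(postgres_rows)

mssql = [(9001, "Updated Test A", 150.00), (9002, "Updated Test B", 250.00)]
pg = [(9002, "Updated Test B", 250.0), (9001, "Updated Test A", 150.0)]
print(states_match(mssql, pg))  # True
```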

Note that sample code, software libraries, command line tools, proofs of concept, templates, or other related technology are provided as AWS Content or Third-Party Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content or Third-Party Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content or Third-Party Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content or Third-Party Content might incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.

Summary

In this post, we showed how you can use Amazon Q Developer with AWS DMS SC to accelerate the database migration process for SQL Server to Amazon Aurora PostgreSQL, leading to increased efficiency and reduced time-to-completion.

Take the first step in your database modernization journey by exploring Amazon Q Developer’s capabilities at the Amazon Developer Portal. You can also use AWS Transform for SQL Server Modernization, an AI-powered service that automates the full-stack modernization of Microsoft SQL Server databases and their associated .NET applications to Amazon Aurora PostgreSQL.

About the authors

Amit Arora

Amit is a Specialist Solutions Architect with a focus on database and analytics at AWS. He works with financial technology and global energy customers and AWS certified partners to provide technical assistance and design customer solutions on cloud migration projects, helping customers migrate and modernize their existing databases to the AWS Cloud.

Vanshika Nigam

Vanshika is a Database Specialist Solutions Architect at AWS. Her work spans database migration and solution architecture, guiding organizations through complex modernization initiatives and designing highly available, scalable, and secure database architectures on AWS. She helps customers accelerate their journey from on-premises and commercial databases to AWS Cloud database solutions. With over 7 years of experience in Amazon RDS, Amazon Aurora, and AWS DMS, she specializes in designing and implementing efficient database cloud solutions.