Microsoft Workloads on AWS

How to build an automated C# code documentation generator using AWS DevOps

In this blog post, we will show you how to create a documentation solution on the Amazon Web Services (AWS) Cloud that automatically generates and publishes a technical documentation website for a .NET project, based on source code comments, API definitions, and Markdown documents included in the project.

Having a technical documentation website improves developer productivity, and generating the documentation automatically ensures that the site is always up-to-date. Without this automation, developers need to continually update the documentation, creating operational overhead and increasing the risk of documentation errors.

This solution also implements additional best practices for technical documentation, such as cost-effective hosting, a consistent structure, and version history. The documentation is stored in a centralized location with access control and is updated with every code check-in. It follows a consistent structure, including a table of contents, class descriptions with links to base classes (if any), method descriptions with signatures and return types, and much more. This post describes the AWS services used to build the solution, including AWS CodeBuild, Amazon Simple Storage Service (Amazon S3), AWS Lambda, and Amazon CloudFront.

Solution overview

This post uses DocFx and AWS to automatically generate code documentation for a C# project. DocFx is an open-source documentation generator for .NET projects that builds a static HTML website from XML code comments, REST API definition (Swagger) files, and Markdown documents in the project. The solution automatically generates updated documentation during every build, triggered by an event on the AWS CodeCommit repository. The documentation website is then hosted on Amazon S3 and distributed through Amazon CloudFront, as shown in Figure 1 below:

Figure 1: Architecture diagram for the proposed solution

A developer committing code to the AWS CodeCommit repository initiates a build in AWS CodeBuild. During the build, AWS CodeBuild sets up DocFx in the build environment. After the build completes, AWS CodeBuild runs a Lambda function that checks if the project version already has documentation in Amazon S3.
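The version check boils down to listing the top-level "folders" (S3 common prefixes) at the root of the documentation bucket and looking for one that matches the project version. The decision logic can be sketched as follows, separated from the boto3 call so it is easy to follow; the helper name `is_update_required` is ours for illustration, not part of the solution:

```python
def is_update_required(common_prefixes, version):
    """Return True when no top-level folder for `version` exists yet.

    `common_prefixes` mirrors the `CommonPrefixes` entries returned by
    S3 `list_objects(Bucket=..., Delimiter='/')`, e.g. [{'Prefix': 'v1.0.0/'}].
    """
    existing = {p.get('Prefix', '').strip('/') for p in common_prefixes}
    return version not in existing
```

In the Lambda, the same check runs against the live `list_objects` response; the build only proceeds to generate documentation when the check returns True.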

If the documentation doesn’t exist, AWS CodeBuild runs DocFx to generate the documentation website and uploads the HTML pages to Amazon S3. AWS CodeBuild then runs another Lambda function that updates the Amazon S3 website's index page to include a link to the new documentation. Finally, the Amazon CloudFront distribution serves the content from the Amazon S3 bucket as the public endpoint for the documentation.
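When rebuilding the index page, the second Lambda collects every version folder (prefixes starting with "v") and lists them newest-first. The template's Lambda uses `pkg_resources.parse_version` for the ordering; the sketch below is a simplified stand-in that assumes dotted numeric versions such as `v1.10.2`:

```python
def sort_versions_newest_first(prefixes):
    """Keep only version folders (e.g. 'v1.2.0/') and sort them descending.

    Uses a numeric tuple key so 'v1.10.0' correctly sorts above 'v1.9.0';
    a plain string sort would get this wrong.
    """
    versions = [p.get('Prefix', '').strip('/') for p in prefixes
                if p.get('Prefix', '').startswith('v')]
    return sorted(versions,
                  key=lambda v: tuple(int(x) for x in v.lstrip('v').split('.')),
                  reverse=True)
```

The first entry of the sorted list becomes the "Latest Version" link on the generated index page.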

Walkthrough

To deploy the solution, follow these high-level steps:

  • Use an AWS CloudFormation template to deploy the AWS infrastructure.
  • Configure the AWS CodeBuild target.
  • Configure the C# project to support DocFx.

Deploy AWS infrastructure

Launching the AWS CloudFormation stack implements this solution in your AWS account, creating all the necessary AWS resources, including AWS Identity and Access Management (IAM) roles, Amazon S3 buckets, an AWS CodePipeline pipeline, and an Amazon CloudFront distribution.
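If you prefer to launch the stack programmatically rather than through the console, a minimal boto3 sketch follows. It assumes the template has been saved locally as `template.yml`; the parameter keys match the template's `Parameters` section, while the bucket, repository, and project names shown are placeholders you should replace with your own. `CAPABILITY_NAMED_IAM` is required because the template creates IAM roles with explicit names.

```python
def stack_parameters(doc_bucket, log_bucket, artifact_bucket, repo, project):
    """Build the Parameters list expected by CloudFormation create_stack,
    using the parameter keys declared in the template."""
    names = {
        'DocumentationBucketName': doc_bucket,
        'LoggingBucketName': log_bucket,
        'ArtifactStoreBucketName': artifact_bucket,
        'CodeCommitRepositoryName': repo,
        'CodeBuildProjectName': project,
    }
    return [{'ParameterKey': k, 'ParameterValue': v} for k, v in names.items()]

def deploy_stack(stack_name, template_path='template.yml'):
    """Launch the stack with the parameters above (placeholder values)."""
    import boto3  # imported here so stack_parameters works without boto3 installed
    with open(template_path) as f:
        body = f.read()
    cfn = boto3.client('cloudformation')
    cfn.create_stack(
        StackName=stack_name,
        TemplateBody=body,
        Capabilities=['CAPABILITY_NAMED_IAM'],
        Parameters=stack_parameters(
            'my-docs-bucket', 'my-logs-bucket', 'my-artifacts-bucket',
            'MyDocsRepo', 'MyDocsBuild'),
    )
```

After `create_stack` returns, you can track progress in the CloudFormation console or with `describe_stacks` until the stack reaches `CREATE_COMPLETE`.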

Follow these steps to deploy the solution using AWS CloudFormation:

  • Step 1: Copy one of the following AWS CloudFormation templates, depending on your build image:
    • AWS CodeBuild with a Windows image:
AWSTemplateFormatVersion: 2010-09-09

Parameters:
  DocumentationBucketName:
    Type: String
    Description: "Name of the S3 bucket where the auto-generated code documentation will be stored.
      Refer to https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html#bucketnamingrules"
  
  LoggingBucketName:
    Type: String
    Description: "Name of the S3 bucket where logs will be stored.
      Refer to https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html#bucketnamingrules"
  
  ArtifactStoreBucketName:
    Type: String
    Description: "Name of the S3 bucket where AWS CodePipeline artifacts will be stored.
      Refer to https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html#bucketnamingrules"
  
  CodeCommitRepositoryName:
    Type: String
    Description: "Name of the AWS CodeCommit repository."
  
  CodeBuildProjectName:
    Type: String
    Description: "Name of the AWS CodeBuild project."

Resources:

# ===== S3 Buckets ======
  LoggingBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref LoggingBucketName
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: 'AES256'
  
  LoggingBucketAccessPolicy:
    Type: 'AWS::S3::BucketPolicy'
    Properties:
      Bucket: !Ref LoggingBucket
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Sid: S3ServerAccessLogsPolicy
            Effect: Allow
            Principal:
              Service: logging.s3.amazonaws.com
            Action:
              - 's3:PutObject'
            Resource: !Sub arn:aws:s3:::${LoggingBucket}/Logs*

  S3DocumentationBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref DocumentationBucketName
      WebsiteConfiguration:
        IndexDocument: index.html
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: 'AES256'
      LoggingConfiguration:
        DestinationBucketName: !Ref LoggingBucket
        
  S3DocumentationBucketAccessPolicy:
    Type: 'AWS::S3::BucketPolicy'
    Properties:
      Bucket: !Ref S3DocumentationBucket
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Sid: 1
            Effect: Allow
            Principal:
              AWS: !Sub arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity ${CloudFrontOriginAccessIdentity}
            Action:
              - 's3:GetObject'
            Resource: !Sub arn:aws:s3:::${DocumentationBucketName}/*
     
  S3ArtifactBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref ArtifactStoreBucketName
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: 'AES256'
      LoggingConfiguration:
        DestinationBucketName: !Ref LoggingBucket

  S3ArtifactBucketAccessPolicy:
    Type: 'AWS::S3::BucketPolicy'
    Properties:
      Bucket: !Ref S3ArtifactBucket
      PolicyDocument:
        Version: 2012-10-17
        Id: SSEAndSSLPolicy
        Statement:
          - Sid: DenyUnEncryptedObjectUploads
            Effect: Deny
            Principal: '*'
            Action: 's3:PutObject'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}/*'
            Condition:
              StringNotEquals:
                's3:x-amz-server-side-encryption': 'aws:kms'
          - Sid: DenyInsecureConnections
            Effect: Deny
            Principal: '*'
            Action: 's3:*'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}/*'
            Condition:
              Bool:
                'aws:SecureTransport': 'false'
          - Sid: ''
            Effect: Allow
            Principal:
              AWS: 
                - !GetAtt
                  - CodeBuildRole
                  - Arn
                - !GetAtt
                  - CodePipelineRole
                  - Arn
            Action:
              - 's3:Get*'
              - 's3:Put*'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}/*'
          - Sid: ''
            Effect: Allow
            Principal:
              AWS: 
                - !GetAtt
                  - CodeBuildRole
                  - Arn
                - !GetAtt
                  - CodePipelineRole
                  - Arn
            Action: 's3:ListBucket'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}'

# ===== Cloudfront ======
  CloudFrontDistribution:
    Type: 'AWS::CloudFront::Distribution'
    Properties:
      DistributionConfig:
        Origins:
          - ConnectionAttempts: 3
            ConnectionTimeout: 10
            DomainName: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
            Id: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
            OriginPath: ''
            S3OriginConfig:
              OriginAccessIdentity: !Sub >-
                origin-access-identity/cloudfront/${CloudFrontOriginAccessIdentity}
        OriginGroups:
          Quantity: 0
        # cache optimized
        DefaultCacheBehavior:
          AllowedMethods:
            - HEAD
            - GET
          CachedMethods:
            - HEAD
            - GET
          Compress: true
          CachePolicyId: 658327ea-f89d-4fab-a63d-7e88639e58f6
          SmoothStreaming: false
          TargetOriginId: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
          ViewerProtocolPolicy: redirect-to-https
        # no cache
        CacheBehaviors:
          - AllowedMethods:
              - HEAD
              - GET
            Compress: true
            CachePolicyId: 4135ea2d-6df8-44a3-9df3-4b5a84be39ad
            PathPattern: /index.html
            SmoothStreaming: false
            TargetOriginId: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
            ViewerProtocolPolicy: redirect-to-https
        CustomErrorResponses:
          - ErrorCode: 403
            ResponsePagePath: /error.html
            ResponseCode: '403'
            ErrorCachingMinTTL: 86400
        Comment: "CloudFront distribution for automated code documentation."
        PriceClass: PriceClass_All
        Enabled: true
        ViewerCertificate:
          CloudFrontDefaultCertificate: true
        Restrictions:
          GeoRestriction:
            RestrictionType: none
        HttpVersion: http2
        DefaultRootObject: index.html
        IPV6Enabled: true
        Logging:
          Bucket: !GetAtt
           - LoggingBucket
           - DomainName
    DependsOn:
      - CloudFrontOriginAccessIdentity
  
  CloudFrontOriginAccessIdentity:
    Type: 'AWS::CloudFront::CloudFrontOriginAccessIdentity'
    Properties:
      CloudFrontOriginAccessIdentityConfig:
        Comment: !Sub >-
          access-identity-${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com
  
# ===== Lambdas ======
  CheckUpdateRequiredOrNotLambda:
    Type: 'AWS::Lambda::Function'
    Properties:
      FunctionName: !Sub '${CodeBuildProjectName}CheckUpdateRequiredOrNot'
      Timeout: 300
      Handler: index.CheckUpdateRequiredOrNot
      ReservedConcurrentExecutions: 1
      Role: !GetAtt 
        - CheckUpdateRequiredOrNotLambdaRole
        - Arn
      Code:
        ZipFile: |-
          import boto3

          def CheckUpdateRequiredOrNot(event, context):
            CheckAndAddErrorHtml(event['Bucket'])
            print("Checking " + event['Version'] + " folder present or not in root level in S3 Bucket " + event['Bucket'])
            s3 = boto3.client('s3')
            listObjectOutput = s3.list_objects(Bucket=event['Bucket'], Delimiter='/')
            print(listObjectOutput)
            content = listObjectOutput.get('CommonPrefixes', [])
            for prefix in content:
                if (prefix.get('Prefix').strip('/') == event['Version']):
                    return False
            return True
              
          def CheckAndAddErrorHtml(bucket):
            s3 = boto3.client('s3')
            errorFilePresent = s3.list_objects(Bucket=bucket, Prefix='error.html').get('Contents', [])
            if errorFilePresent:
              return
            print("Adding error.html file.")
            text = "Error Page for 403"
            file = open('/tmp/error.html','w')
            file.write(text)
            file.close()
            with open('/tmp/error.html', 'rb') as data:
              s3.upload_fileobj(data, bucket, 'error.html', {'ContentType': "text/html"})
      Runtime: python3.8
  
  DocumentationUpdateIndexLambda:
    Type: 'AWS::Lambda::Function'
    Properties:
      FunctionName: !Sub '${CodeBuildProjectName}DocumentationUpdateIndex'
      Timeout: 300
      Handler: index.DocumentationUpdateIndex
      ReservedConcurrentExecutions: 1
      Role: !GetAtt 
        - DocumentationUpdateIndexLambdaRole
        - Arn
      Code:
        ZipFile: 
          Fn::Sub:
          - |-
            import boto3
            from pkg_resources import parse_version
                      
            def DocumentationUpdateIndex(event, context):
                s3 = boto3.client('s3')
                print("Deleting current index.html file.")
                s3.delete_object(Bucket=event['Bucket'], Key='index.html')
                print("Adding index.html file.")
                envVersion = event['Version']
                
                domain = '${CloudfrontDomain}'
                latestVersionHtml = "      <span>Latest Version: <a href='https://{}/{}/api/index.html'>{}</a></span>".format(domain, envVersion, envVersion)
                listObjectOutput = s3.list_objects(Bucket=event['Bucket'], Delimiter='/').get('CommonPrefixes', [])
                versions = []
                for prefix in listObjectOutput:
                    if (prefix.get('Prefix').startswith("v")):
                        versions.append(prefix.get('Prefix').strip('/'))
                
                versions = sorted(versions, key=parse_version, reverse=True)
                allVersionHtml = ""
                for versionNum in versions:
                    allVersionHtml += "\n    <div class='version'>\n      <span>Version: <a href='https://{}/{}/api/index.html'>{}</a></span>\n    </div>".format(domain, versionNum, versionNum)
                        
                indexHtml = """
                <!DOCTYPE html>
                <html>
                  <head>
                    <style>
                      div.latestVersion {{margin:1em; margin-left: 2em; font-size: 1.5em;}}
                      div.versionList {{margin-left: 3em; overflow: scroll; height:80vh;}}
                      div.version {{margin:1em;}}
                    </style>
                  </head>
                  <body>
                    <header>
                      <span style="font-size: 2em; margin-left: 1em;">Tool Name</span>
                    </header>
                    <div class="latestVersion">
                    {}
                    </div>
                    <div class="versionList">
                    {}
                    </div>
                    <footer style="position: absolute; bottom: 0; left: 1em; font-size: 2em;">
                      Powered by <strong>Team Name</strong>
                    </footer>
                  </body>
                </html>
                """.format(latestVersionHtml, allVersionHtml)
                
                print(indexHtml)
                file = open('/tmp/index.html','w')
                file.write(indexHtml)
                file.close()
                with open('/tmp/index.html', 'rb') as data:
                    s3.upload_fileobj(data, event['Bucket'], 'index.html', {'ContentType': "text/html"})

                return True
          - {
              CloudfrontDomain: { Fn::GetAtt: [CloudFrontDistribution, DomainName]}
            }
      Runtime: python3.8
  
# ===== Dev Tools ======
  CodeCommitRepository:
    Type: 'AWS::CodeCommit::Repository'
    Properties:
      RepositoryName: !Ref CodeCommitRepositoryName
      RepositoryDescription: !Sub 'Code commit repository for ${CodeBuildProjectName} project.'
    
  CodeBuildProject:
    Type: 'AWS::CodeBuild::Project'
    Properties:
      Name: !Ref CodeBuildProjectName
      Description: !Sub 'Code build for ${CodeBuildProjectName} project.'
      Source:
        BuildSpec: buildspec.yml
        GitCloneDepth: 1
        GitSubmodulesConfig:
          FetchSubmodules: false
        InsecureSsl: false
        Location: !Sub >-
          https://git-codecommit.${AWS::Region}.amazonaws.com/v1/repos/${CodeCommitRepositoryName}
        Type: CODECOMMIT
      Artifacts:
        Type: NO_ARTIFACTS
      Cache:
        Type: NO_CACHE
      Environment:
        ComputeType: BUILD_GENERAL1_MEDIUM
        Image: 'aws/codebuild/windows-base:2019-1.0'
        ImagePullCredentialsType: CODEBUILD
        PrivilegedMode: false
        Type: WINDOWS_SERVER_2019_CONTAINER
      ServiceRole: !GetAtt 
        - CodeBuildRole
        - Arn
      TimeoutInMinutes: 60
      QueuedTimeoutInMinutes: 480
      EncryptionKey: !Sub 'arn:aws:kms:${AWS::Region}:${AWS::AccountId}:alias/aws/s3'
      BadgeEnabled: false
      LogsConfig:
        CloudWatchLogs:
          Status: ENABLED
        S3Logs:
          Status: DISABLED
          EncryptionDisabled: false
      Visibility: PRIVATE
    
  CodePipelinePipeline:
    Type: 'AWS::CodePipeline::Pipeline'
    Properties:
      Name: !Sub '${CodeBuildProjectName}Pipeline'
      RoleArn: !GetAtt
        - CodePipelineRole
        - Arn
      ArtifactStore:
        Location: !Ref ArtifactStoreBucketName
        Type: S3
      Stages:
        - Name: Source
          Actions:
            - Name: Source
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: '1'
              Configuration:
                BranchName: mainline
                OutputArtifactFormat: CODE_ZIP
                PollForSourceChanges: 'false'
                RepositoryName: !Ref CodeCommitRepositoryName
              OutputArtifacts:
                - Name: SourceArtifact
              Region: !Ref 'AWS::Region'
              Namespace: SourceVariables
              RunOrder: 1
        - Name: Build
          Actions:
            - Name: Build
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: '1'
              Configuration:
                ProjectName: !Ref CodeBuildProjectName
              InputArtifacts:
                - Name: SourceArtifact
              OutputArtifacts:
                - Name: BuildArtifact
              Region: !Ref 'AWS::Region'
              Namespace: BuildVariables
              RunOrder: 1
  
# ===== IAM Roles ======
  CheckUpdateRequiredOrNotLambdaRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub '${CodeBuildProjectName}CheckUpdateRequiredOrNotLambdaRole'
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
      ManagedPolicyArns:
        - 'arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
      Policies:
        - PolicyName: !Sub '${CodeBuildProjectName}CheckUpdateRequiredOrNotLambdaPolicy'
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Action:
                  - 's3:ListBucket'
                  - 's3:PutObject'
                Effect: Allow
                Resource:
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}'
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}/*'
  
  DocumentationUpdateIndexLambdaRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub '${CodeBuildProjectName}DocumentationUpdateIndexLambdaRole'
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
      ManagedPolicyArns:
        - 'arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
      Policies:
        - PolicyName: !Sub '${CodeBuildProjectName}DocumentationUpdateIndexLambdaPolicy'
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Action:
                  - 's3:ListBucket'
                  - 's3:DeleteObject'
                  - 's3:PutObject'
                Effect: Allow
                Resource:
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}'
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}/*'
  
  CodePipelineRole:
    Type: 'AWS::IAM::Role'
    Properties:
      Path: /service-role/
      RoleName: !Sub 'AWSCodePipelineServiceRole-${AWS::Region}-${CodeBuildProjectName}Pipeline'
      AssumeRolePolicyDocument: 
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: codepipeline.amazonaws.com
      MaxSessionDuration: 3600
      ManagedPolicyArns:
        - !Ref CodePipelineManagedPolicy
        - !Ref CodeBuildManagedPolicy

  CodeBuildRole:
    Type: 'AWS::IAM::Role'
    Properties:
      Path: /service-role/
      RoleName: !Sub 'AWSCodeBuildServiceRole-${AWS::Region}-${CodeBuildProjectName}Build'
      AssumeRolePolicyDocument: 
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: codebuild.amazonaws.com
      MaxSessionDuration: 3600
      ManagedPolicyArns:
        - !Ref CodeBuildManagedPolicy
        - 'arn:aws:iam::aws:policy/AWSCodeArtifactReadOnlyAccess'
  
  CodePipelineManagedPolicy:
    Type: 'AWS::IAM::ManagedPolicy'
    Properties:
      Path: /service-role/
      ManagedPolicyName: !Sub 'AWSCodePipelineManagedPolicy-${AWS::Region}-${CodeBuildProjectName}Pipeline'
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Resource: !GetAtt
              - CodeCommitRepository
              - Arn
            Action:
              - 'codecommit:CancelUploadArchive'
              - 'codecommit:GetBranch'
              - 'codecommit:GetCommit'
              - 'codecommit:GetUploadArchiveStatus'
              - 'codecommit:UploadArchive'
          - Effect: Allow
            Resource: !GetAtt
              - CodeBuildProject
              - Arn
            Action:
              - 'codebuild:BatchGetBuilds'
              - 'codebuild:StartBuild'
  
  CodeBuildManagedPolicy:
    Type: 'AWS::IAM::ManagedPolicy'
    Properties:
      ManagedPolicyName: !Sub 'AWSCodeBuildManagedPolicy-${AWS::Region}-${CodeBuildProjectName}Build'
      Path: /service-role/
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Resource:
              - !Sub >-
                arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:function:${CheckUpdateRequiredOrNotLambda}
              - !Sub >-
                arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:function:${DocumentationUpdateIndexLambda}
            Action:
              - 'lambda:InvokeFunction'
              - 'lambda:ListFunctions'
          - Effect: Allow
            Resource: !Sub 'arn:aws:s3:::${DocumentationBucketName}/*'
            Action: 's3:PutObject'
          - Effect: Allow
            Resource:
              - !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}'
            Action:
              - 's3:PutObject'
              - 's3:GetObject'
              - 's3:GetObjectVersion'
              - 's3:GetBucketAcl'
              - 's3:GetBucketLocation'
          - Effect: Allow
            Resource:
              - !Sub >-
                arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/codebuild/${CodeBuildProjectName}:*
              - !Sub >-
                arn:aws:logs:${AWS::Region}:${AWS::AccountId}:/aws/codebuild/${CodeBuildProjectName}:*
            Action:
              - 'logs:CreateLogGroup'
              - 'logs:CreateLogStream'
              - 'logs:PutLogEvents' 
    • AWS CodeBuild with a Linux image:
AWSTemplateFormatVersion: 2010-09-09

Parameters:
  DocumentationBucketName:
    Type: String
    Description: "Name of the S3 bucket where the auto-generated code documentation will be stored.
      Refer to https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html#bucketnamingrules"
  
  LoggingBucketName:
    Type: String
    Description: "Name of the S3 bucket where logs will be stored.
      Refer to https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html#bucketnamingrules"
  
  ArtifactStoreBucketName:
    Type: String
    Description: "Name of the S3 bucket where AWS CodePipeline artifacts will be stored.
      Refer to https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html#bucketnamingrules"
  
  CodeCommitRepositoryName:
    Type: String
    Description: "Name of the AWS CodeCommit repository."
  
  CodeBuildProjectName:
    Type: String
    Description: "Name of the AWS CodeBuild project."

Resources:

# ===== S3 Buckets ======
  LoggingBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref LoggingBucketName
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: 'AES256'
  
  LoggingBucketAccessPolicy:
    Type: 'AWS::S3::BucketPolicy'
    Properties:
      Bucket: !Ref LoggingBucket
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Sid: S3ServerAccessLogsPolicy
            Effect: Allow
            Principal:
              Service: logging.s3.amazonaws.com
            Action:
              - 's3:PutObject'
            Resource: !Sub arn:aws:s3:::${LoggingBucket}/Logs*

  S3DocumentationBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref DocumentationBucketName
      WebsiteConfiguration:
        IndexDocument: index.html
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: 'AES256'
      LoggingConfiguration:
        DestinationBucketName: !Ref LoggingBucket
        
  S3DocumentationBucketAccessPolicy:
    Type: 'AWS::S3::BucketPolicy'
    Properties:
      Bucket: !Ref S3DocumentationBucket
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Sid: 1
            Effect: Allow
            Principal:
              AWS: !Sub arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity ${CloudFrontOriginAccessIdentity}
            Action:
              - 's3:GetObject'
            Resource: !Sub arn:aws:s3:::${DocumentationBucketName}/*
     
  S3ArtifactBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      BucketName: !Ref ArtifactStoreBucketName
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: 'AES256'
      LoggingConfiguration:
        DestinationBucketName: !Ref LoggingBucket

  S3ArtifactBucketAccessPolicy:
    Type: 'AWS::S3::BucketPolicy'
    Properties:
      Bucket: !Ref S3ArtifactBucket
      PolicyDocument:
        Version: 2012-10-17
        Id: SSEAndSSLPolicy
        Statement:
          - Sid: DenyUnEncryptedObjectUploads
            Effect: Deny
            Principal: '*'
            Action: 's3:PutObject'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}/*'
            Condition:
              StringNotEquals:
                's3:x-amz-server-side-encryption': 'aws:kms'
          - Sid: DenyInsecureConnections
            Effect: Deny
            Principal: '*'
            Action: 's3:*'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}/*'
            Condition:
              Bool:
                'aws:SecureTransport': 'false'
          - Sid: ''
            Effect: Allow
            Principal:
              AWS: 
                - !GetAtt
                  - CodeBuildRole
                  - Arn
                - !GetAtt
                  - CodePipelineRole
                  - Arn
            Action:
              - 's3:Get*'
              - 's3:Put*'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}/*'
          - Sid: ''
            Effect: Allow
            Principal:
              AWS: 
                - !GetAtt
                  - CodeBuildRole
                  - Arn
                - !GetAtt
                  - CodePipelineRole
                  - Arn
            Action: 's3:ListBucket'
            Resource: !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}'

# ===== Cloudfront ======
  CloudFrontDistribution:
    Type: 'AWS::CloudFront::Distribution'
    Properties:
      DistributionConfig:
        Origins:
          - ConnectionAttempts: 3
            ConnectionTimeout: 10
            DomainName: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
            Id: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
            OriginPath: ''
            S3OriginConfig:
              OriginAccessIdentity: !Sub >-
                origin-access-identity/cloudfront/${CloudFrontOriginAccessIdentity}
        OriginGroups:
          Quantity: 0
        # cache optimized
        DefaultCacheBehavior:
          AllowedMethods:
            - HEAD
            - GET
          CachedMethods:
            - HEAD
            - GET
          Compress: true
          CachePolicyId: 658327ea-f89d-4fab-a63d-7e88639e58f6
          SmoothStreaming: false
          TargetOriginId: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
          ViewerProtocolPolicy: redirect-to-https
        # no cache
        CacheBehaviors:
          - AllowedMethods:
              - HEAD
              - GET
            Compress: true
            CachePolicyId: 4135ea2d-6df8-44a3-9df3-4b5a84be39ad
            PathPattern: /index.html
            SmoothStreaming: false
            TargetOriginId: !Sub '${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com'
            ViewerProtocolPolicy: redirect-to-https
        CustomErrorResponses:
          - ErrorCode: 403
            ResponsePagePath: /error.html
            ResponseCode: '403'
            ErrorCachingMinTTL: 86400
        Comment: "CloudFront distribution for automated code documentation."
        PriceClass: PriceClass_All
        Enabled: true
        ViewerCertificate:
          CloudFrontDefaultCertificate: true
        Restrictions:
          GeoRestriction:
            RestrictionType: none
        HttpVersion: http2
        DefaultRootObject: index.html
        IPV6Enabled: true
        Logging:
          Bucket: !GetAtt
           - LoggingBucket
           - DomainName
    DependsOn:
      - CloudFrontOriginAccessIdentity
  
  CloudFrontOriginAccessIdentity:
    Type: 'AWS::CloudFront::CloudFrontOriginAccessIdentity'
    Properties:
      CloudFrontOriginAccessIdentityConfig:
        Comment: !Sub >-
          access-identity-${DocumentationBucketName}.s3.${AWS::Region}.amazonaws.com
  
# ===== Lambdas ======
  CheckUpdateRequiredOrNotLambda:
    Type: 'AWS::Lambda::Function'
    Properties:
      FunctionName: !Sub '${CodeBuildProjectName}CheckUpdateRequiredOrNot'
      Timeout: 300
      Handler: index.CheckUpdateRequiredOrNot
      ReservedConcurrentExecutions: 1
      Role: !GetAtt 
        - CheckUpdateRequiredOrNotLambdaRole
        - Arn
      Code:
        ZipFile: |-
          import boto3

          def CheckUpdateRequiredOrNot(event, context):
            CheckAndAddErrorHtml(event['Bucket'])
            print("Checking " + event['Version'] + " folder present or not in root level in S3 Bucket " + event['Bucket'])
            s3 = boto3.client('s3')
            listObjectOutput = s3.list_objects(Bucket=event['Bucket'], Delimiter='/')
            print(listObjectOutput)
            content = listObjectOutput.get('CommonPrefixes', [])
            for prefix in content:
                if (prefix.get('Prefix').strip('/') == event['Version']):
                    return False
            return True
              
          def CheckAndAddErrorHtml(bucket):
            s3 = boto3.client('s3')
            errorFilePresent = s3.list_objects(Bucket=bucket, Prefix='error.html').get('Contents', [])
            if errorFilePresent:
              return
            print("Adding error.html file.")
            text = "Error Page for 403"
            file = open('/tmp/error.html','w')
            file.write(text)
            file.close()
            with open('/tmp/error.html', 'rb') as data:
              s3.upload_fileobj(data, bucket, 'error.html', {'ContentType': "text/html"})
      Runtime: python3.8
  
  DocumentationUpdateIndexLambda:
    Type: 'AWS::Lambda::Function'
    Properties:
      FunctionName: !Sub '${CodeBuildProjectName}DocumentationUpdateIndex'
      Timeout: 300
      Handler: index.DocumentationUpdateIndex
      ReservedConcurrentExecutions: 1
      Role: !GetAtt 
        - DocumentationUpdateIndexLambdaRole
        - Arn
      Code:
        ZipFile: 
          Fn::Sub:
          - |-
            import boto3
            from pkg_resources import parse_version
                      
            def DocumentationUpdateIndex(event, context):
                s3 = boto3.client('s3')
                print("Adding index.html file.")
                envVersion = event['Version']
                
                domain = '${CloudfrontDomain}'
                latestVersionHtml = "      <span>Latest Version: <a href='https://{}/{}/index.html'>{}</a></span>".format(domain, envVersion, envVersion)
                listObjectOutput = s3.list_objects(Bucket=event['Bucket'], Delimiter='/').get('CommonPrefixes', [])
                versions = []
                for prefix in listObjectOutput:
                    if (prefix.get('Prefix').startswith("v")):
                        versions.append(prefix.get('Prefix').strip('/'))
                
                versions = sorted(versions, key=parse_version, reverse=True)
                allVersionHtml = ""
                for versionNum in versions:
                    allVersionHtml += "\n    <div class='version'>\n      <span>Version: <a href='https://{}/{}/index.html'>{}</a></span>\n    </div>".format(domain, versionNum, versionNum)
                        
                indexHtml = """
                <!DOCTYPE html>
                <html>
                  <head>
                    <style>
                      div.latestVersion {{margin:1em; margin-left: 2em; font-size: 1.5em;}}
                      div.versionList {{margin-left: 3em; overflow: scroll; height:80vh;}}
                      div.version {{margin:1em;}}
                    </style>
                  </head>
                  <body>
                    <header>
                      <span style="font-size: 2em; margin-left: 1em;">Tool Name</span>
                    </header>
                    <div class="latestVersion">
                    {}
                    </div>
                    <div class="versionList">
                    {}
                    </div>
                    <footer style="position: absolute; bottom: 0; left: 1em; font-size: 2em;">
                      Powered by <strong>Team Name</strong>
                    </footer>
                  </body>
                </html>
                """.format(latestVersionHtml, allVersionHtml)
                
                print(indexHtml)
                file = open('/tmp/index.html','w')
                file.write(indexHtml)
                file.close()
                with open('/tmp/index.html', 'rb') as data:
                    s3.upload_fileobj(data, event['Bucket'], 'index.html', {'ContentType': "text/html"})

                return True
          - {
              CloudfrontDomain: { Fn::GetAtt: [CloudFrontDistribution, DomainName]}
            }
      Runtime: python3.8
  
# ===== Dev Tools ======
  CodeCommitRepository:
    Type: 'AWS::CodeCommit::Repository'
    Properties:
      RepositoryName: !Ref CodeCommitRepositoryName
      RepositoryDescription: !Sub 'Code commit repository for ${CodeBuildProjectName} project.'
    
  CodeBuildProject:
    Type: 'AWS::CodeBuild::Project'
    Properties:
      Name: !Ref CodeBuildProjectName
      Description: !Sub 'Code build for ${CodeBuildProjectName} project.'
      Source:
        BuildSpec: buildspec.yml
        GitCloneDepth: 1
        GitSubmodulesConfig:
          FetchSubmodules: false
        InsecureSsl: false
        Location: !Sub >-
          https://git-codecommit.${AWS::Region}.amazonaws.com/v1/repos/${CodeCommitRepositoryName}
        Type: CODECOMMIT
      Artifacts:
        Type: NO_ARTIFACTS
      Cache:
        Type: NO_CACHE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:6.0
        Type: LINUX_CONTAINER
      ServiceRole: !GetAtt 
        - CodeBuildRole
        - Arn
      TimeoutInMinutes: 60
      QueuedTimeoutInMinutes: 480
      EncryptionKey: !Sub 'arn:aws:kms:${AWS::Region}:${AWS::AccountId}:alias/aws/s3'
      BadgeEnabled: false
      LogsConfig:
        CloudWatchLogs:
          Status: ENABLED
        S3Logs:
          Status: DISABLED
          EncryptionDisabled: false
      Visibility: PRIVATE
    
  CodePipelinePipeline:
    Type: 'AWS::CodePipeline::Pipeline'
    Properties:
      Name: !Sub '${CodeBuildProjectName}Pipeline'
      RoleArn: !GetAtt
        - CodePipelineRole
        - Arn
      ArtifactStore:
        Location: !Ref ArtifactStoreBucketName
        Type: S3
      Stages:
        - Name: Source
          Actions:
            - Name: Source
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: '1'
              Configuration:
                BranchName: mainline
                OutputArtifactFormat: CODE_ZIP
                PollForSourceChanges: 'false'
                RepositoryName: !Ref CodeCommitRepositoryName
              OutputArtifacts:
                - Name: SourceArtifact
              Region: !Ref 'AWS::Region'
              Namespace: SourceVariables
              RunOrder: 1
        - Name: Build
          Actions:
            - Name: Build
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: '1'
              Configuration:
                ProjectName: !Ref CodeBuildProjectName
              InputArtifacts:
                - Name: SourceArtifact
              OutputArtifacts:
                - Name: BuildArtifact
              Region: !Ref 'AWS::Region'
              Namespace: BuildVariables
              RunOrder: 1
  
# ===== IAM Roles ======
  CheckUpdateRequiredOrNotLambdaRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub '${CodeBuildProjectName}CheckUpdateRequiredOrNotLambdaRole'
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
      ManagedPolicyArns:
        - 'arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
      Policies:
        - PolicyName: !Sub '${CodeBuildProjectName}CheckUpdateRequiredOrNotLambdaPolicy'
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Action:
                  - 's3:ListBucket'
                  - 's3:PutObject'
                Effect: Allow
                Resource:
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}'
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}/*'
  
  DocumentationUpdateIndexLambdaRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub '${CodeBuildProjectName}DocumentationUpdateIndexLambdaRole'
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
      ManagedPolicyArns:
        - 'arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole'
      Policies:
        - PolicyName: !Sub '${CodeBuildProjectName}DocumentationUpdateIndexLambdaPolicy'
          PolicyDocument:
            Version: 2012-10-17
            Statement:
              - Action:
                  - 's3:ListBucket'
                  - 's3:DeleteObject'
                  - 's3:PutObject'
                Effect: Allow
                Resource:
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}'
                  - !Sub 'arn:aws:s3:::${S3DocumentationBucket}/*'
  
  CodePipelineRole:
    Type: 'AWS::IAM::Role'
    Properties:
      Path: /service-role/
      RoleName: !Sub 'AWSCodePipelineServiceRole-${AWS::Region}-${CodeBuildProjectName}Pipeline'
      AssumeRolePolicyDocument: 
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: codepipeline.amazonaws.com
      MaxSessionDuration: 3600
      ManagedPolicyArns:
        - !Ref CodePipelineManagedPolicy
        - !Ref CodeBuildManagedPolicy

  CodeBuildRole:
    Type: 'AWS::IAM::Role'
    Properties:
      Path: /service-role/
      RoleName: !Sub 'AWSCodeBuildServiceRole-${AWS::Region}-${CodeBuildProjectName}Build'
      AssumeRolePolicyDocument: 
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service: codebuild.amazonaws.com
      MaxSessionDuration: 3600
      ManagedPolicyArns:
        - !Ref CodeBuildManagedPolicy
        - 'arn:aws:iam::aws:policy/AWSCodeArtifactReadOnlyAccess'
  
  CodePipelineManagedPolicy:
    Type: 'AWS::IAM::ManagedPolicy'
    Properties:
      Path: /service-role/
      ManagedPolicyName: !Sub 'AWSCodePipelineManagedPolicy-${AWS::Region}-${CodeBuildProjectName}Pipeline'
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Resource: !GetAtt
              - CodeCommitRepository
              - Arn
            Action:
              - 'codecommit:CancelUploadArchive'
              - 'codecommit:GetBranch'
              - 'codecommit:GetCommit'
              - 'codecommit:GetUploadArchiveStatus'
              - 'codecommit:UploadArchive'
          - Effect: Allow
            Resource: !GetAtt
              - CodeBuildProject
              - Arn
            Action:
              - 'codebuild:BatchGetBuilds'
              - 'codebuild:StartBuild'
  
  CodeBuildManagedPolicy:
    Type: 'AWS::IAM::ManagedPolicy'
    Properties:
      ManagedPolicyName: !Sub 'AWSCodeBuildManagedPolicy-${AWS::Region}-${CodeBuildProjectName}Build'
      Path: /service-role/
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Resource:
              - !Sub >-
                arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:function:${CheckUpdateRequiredOrNotLambda}
              - !Sub >-
                arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:function:${DocumentationUpdateIndexLambda}
            Action:
              - 'lambda:InvokeFunction'
              - 'lambda:ListFunctions'
          - Effect: Allow
            Resource: !Sub 'arn:aws:s3:::${DocumentationBucketName}/*'
            Action: 's3:PutObject'
          - Effect: Allow
            Resource:
              - !Sub 'arn:aws:s3:::${ArtifactStoreBucketName}'
            Action:
              - 's3:PutObject'
              - 's3:GetObject'
              - 's3:GetObjectVersion'
              - 's3:GetBucketAcl'
              - 's3:GetBucketLocation'
          - Effect: Allow
            Resource:
              - !Sub >-
                arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/codebuild/${CodeBuildProjectName}:*
              - !Sub >-
                arn:aws:logs:${AWS::Region}:${AWS::AccountId}:/aws/codebuild/${CodeBuildProjectName}:*
            Action:
              - 'logs:CreateLogGroup'
              - 'logs:CreateLogStream'
              - 'logs:PutLogEvents' 
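The DocumentationUpdateIndex Lambda embedded in the template orders the version folders numerically with pkg_resources.parse_version. As a sanity check, here is a dependency-free Python sketch of the same ordering, assuming folder names follow the v&lt;major&gt;.&lt;minor&gt;.&lt;patch&gt; convention the buildspec produces:

```python
def version_key(name):
    """Turn a folder name like 'v1.10.2' into a tuple of ints for numeric sorting."""
    return tuple(int(part) for part in name.lstrip("v").split("."))

def sort_versions(prefixes):
    """Mirror the index Lambda: keep the 'v*' S3 prefixes, newest version first."""
    versions = [p.strip("/") for p in prefixes if p.startswith("v")]
    return sorted(versions, key=version_key, reverse=True)

print(sort_versions(["v1.2.0/", "v1.10.0/", "images/", "v0.9.1/"]))
# → ['v1.10.0', 'v1.2.0', 'v0.9.1']
```

Numeric ordering matters here: a plain string sort would incorrectly rank v1.2.0 as newer than v1.10.0.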
  • Step 2: Sign in to your AWS account and navigate to the AWS CloudFormation console.
  • Step 3: Choose Create stack.
  • Step 4: On the Create stack page, for Prepare template, choose Template is ready.
  • Step 5: For Template source, choose Upload a template file. Select Choose file and upload the AWS CloudFormation template file you created in Step 1.
  • Step 6: Choose Next. (Alternatively, you can navigate to this link and the stack will use an AWS CloudFormation template stored in Amazon S3.)

Figure 2: Create stack on AWS CloudFormation.

  • Step 7: For Stack name, enter a name for the stack (such as DocumentGeneratorStack). Then enter values for the stack parameters:
      ◦ ArtifactStoreBucketName: a name for the Amazon S3 bucket where the AWS CodePipeline artifacts will be stored.
      ◦ CodeBuildProjectName: a name for the AWS CodeBuild project that the solution will create.
      ◦ CodeCommitRepositoryName: a name for the AWS CodeCommit repository that the solution will create to hold your C# project.
      ◦ DocumentationBucketName: a name for the Amazon S3 bucket where the generated code documentation will be stored.
      ◦ LoggingBucketName: a name for the Amazon S3 bucket where the solution logs will be stored.

Figure 3: Specify stack details on AWS CloudFormation.

  • Step 8: Select I acknowledge that AWS CloudFormation might create IAM resources with custom names, then select Submit.

Figure 4: Acknowledgment checkbox for IAM resources creation.
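If you prefer to script the deployment instead of clicking through the console, the steps above map to a single create_stack API call. A minimal sketch follows; the stack name and bucket names are placeholders, and the boto3 call itself is shown as a comment because it requires AWS credentials:

```python
def stack_parameters(**values):
    """Convert the Step 7 inputs into the ParameterKey/ParameterValue list
    that the CloudFormation create_stack API expects."""
    return [{"ParameterKey": k, "ParameterValue": v} for k, v in values.items()]

# Equivalent to the console walk-through (requires boto3 and AWS credentials):
#   import boto3
#   boto3.client("cloudformation").create_stack(
#       StackName="DocumentGeneratorStack",
#       TemplateBody=open("template.yml").read(),          # the template from Step 1
#       Capabilities=["CAPABILITY_NAMED_IAM"],             # the Step 8 acknowledgment
#       Parameters=stack_parameters(
#           ArtifactStoreBucketName="my-artifact-bucket",  # placeholder names
#           CodeBuildProjectName="DocumentGenerator",
#           CodeCommitRepositoryName="my-csharp-project",
#           DocumentationBucketName="my-documentation-bucket",
#           LoggingBucketName="my-logging-bucket",
#       ),
#   )
```

CAPABILITY_NAMED_IAM is required because the template creates IAM roles and policies with custom names, which is what the Step 8 checkbox acknowledges in the console.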

Configuring the AWS CodeBuild target

To configure AWS CodeBuild, add a buildspec file to the C# project.

You can use one of these example buildspec files:

For a C# project building on Windows, use the following buildspec.yml:

version: 0.2 
 
env:
  variables:
    # Setting the variables for version, lambdas and bucket where build documentation will be stored.
    VERSION : <Version>
    DOCUMENTATION_S3_BUCKET : <Documentation S3 Bucket Name>
    CHECKUPDATEREQUIREDORNOT_LAMBDA: <CheckUpdateRequiredOrNot lambda>
    DOCUMENTATIONUPDATEINDEX_LAMBDA: <DocumentationUpdateIndex lambda>

phases:
  install:
    commands:
      # Download and install the latest .NET version; the preinstalled CodeBuild image does not include .NET versions later than 5.0
      - Invoke-WebRequest 'https://dot.net/v1/dotnet-install.ps1' -OutFile 'dotnet-install.ps1'
      - ./dotnet-install.ps1 -Channel 'STS' -InstallDir 'C:\Program Files\dotnet'
      # Install docfx using chocolatey
      - choco install docfx -y
     
  build:
    commands:
      # Build project
      - dotnet build

  post_build:
    commands:
      # Set version and bucket in json for lambda input
      - echo $Env:VERSION
      - $jObj = @{Version="v"+ $Env:VERSION; Bucket=$Env:DOCUMENTATION_S3_BUCKET}
      - $jObj | ConvertTo-Json | Set-Content jsonPayload.json
      # Invoke lambda to check if project version document is present in s3
      - aws lambda invoke --function-name $Env:CHECKUPDATEREQUIREDORNOT_LAMBDA --cli-binary-format raw-in-base64-out --payload file://jsonPayload.json lambdaresponse.json
      - $outputArtifactNeedToUploadToS3 = cat lambdaresponse.json
      - echo $outputArtifactNeedToUploadToS3
      - $folderName = "v" + $Env:VERSION
      # Based on result of check, build documentation using docfx
      - if ($outputArtifactNeedToUploadToS3 -eq "true") { docfx -t statictoc }
      - if ($outputArtifactNeedToUploadToS3 -eq "true") { ls _site }
      # Copy all created artifacts/files for documentation to s3 bucket provided
      - if ($outputArtifactNeedToUploadToS3 -eq "true") { aws s3 cp _site\ s3://$Env:DOCUMENTATION_S3_BUCKET/$folderName/ --sse --recursive }
      # Invoke lambda to update index.html file with new project version
      - if ($outputArtifactNeedToUploadToS3 -eq "true") { aws lambda invoke --function-name $Env:DOCUMENTATIONUPDATEINDEX_LAMBDA --cli-binary-format raw-in-base64-out --payload file://jsonPayload.json updateIndexlambdaresponse.json }

For a C# project building on Linux, use the following buildspec.yml:

version: 0.2 
 
env:
  variables:
    # Setting the variables for version, lambdas and bucket where build documentation will be stored.
    VERSION : <Version>
    DOCUMENTATION_S3_BUCKET : <Documentation S3 Bucket Name>
    CHECKUPDATEREQUIREDORNOT_LAMBDA: <CheckUpdateRequiredOrNot lambda>
    DOCUMENTATIONUPDATEINDEX_LAMBDA: <DocumentationUpdateIndex lambda>

phases:
  install:
    commands:
      # Download and install the latest .NET version; the preinstalled CodeBuild image does not include .NET versions later than 5.0
      - export PATH="$PATH:/root/.dotnet/tools"
      - curl -sSL https://dot.net/v1/dotnet-install.sh | bash /dev/stdin --channel STS
      # Install docfx
      - dotnet tool install -g docfx

  build:
    commands:
      # Build project
      - dotnet build

  post_build:
    commands:
      # Set version and bucket in json for lambda input
      - echo $VERSION
      - jObj={\"Version\":\"v$VERSION\"\,\"Bucket\":\"$DOCUMENTATION_S3_BUCKET\"}
      - echo $jObj > jsonPayload.json
      # Invoke lambda to check if project version document is present in s3
      - aws lambda invoke --function-name $CHECKUPDATEREQUIREDORNOT_LAMBDA --cli-binary-format raw-in-base64-out --payload file://jsonPayload.json lambdaresponse.json
      - outputArtifactNeedToUploadToS3=`cat lambdaresponse.json`
      - echo $outputArtifactNeedToUploadToS3
      - folderName="v"$VERSION
      # Based on result of check, build documentation using docfx
      - if [ $outputArtifactNeedToUploadToS3 = "true" ]; then docfx -t statictoc; fi
      - if [ $outputArtifactNeedToUploadToS3 = "true" ]; then ls _site; fi
      # Copy all created artifacts/files for documentation to s3 bucket provided
      - if [ $outputArtifactNeedToUploadToS3 = "true" ]; then aws s3 cp _site s3://$DOCUMENTATION_S3_BUCKET/$folderName/ --sse --recursive; fi
      # Invoke lambda to update index.html file with new project version
      - if [ $outputArtifactNeedToUploadToS3 = "true" ]; then aws lambda invoke --function-name $DOCUMENTATIONUPDATEINDEX_LAMBDA --cli-binary-format raw-in-base64-out --payload file://jsonPayload.json updateIndexlambdaresponse.json; fi
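Both buildspecs call the two Lambdas through the AWS CLI with the same JSON payload. A Python sketch of that payload, with the equivalent boto3 invocation shown as a comment (the function name and bucket name are the placeholder values from the buildspec's env section, and the call itself requires AWS credentials):

```python
import json

def build_payload(version, bucket):
    """The JSON document both Lambdas expect (matches the buildspec's jsonPayload.json)."""
    return json.dumps({"Version": "v" + version, "Bucket": bucket})

# Equivalent of `aws lambda invoke ...` (requires boto3 and AWS credentials):
#   import boto3
#   resp = boto3.client("lambda").invoke(
#       FunctionName="<CheckUpdateRequiredOrNot lambda>",  # placeholder from env
#       Payload=build_payload("1.0.0", "<Documentation S3 Bucket Name>"),
#   )
#   update_required = json.loads(resp["Payload"].read())   # True means publish
```

The CheckUpdateRequiredOrNot Lambda returns true when no folder for this version exists yet, which is what the buildspec's subsequent `if` conditions test.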

Configuring the C# project

Update the C# project with the following steps to enable automated documentation generation with DocFx:

  1. Add the DocFx configuration file.
  2. Add Markdown index files that will be the section home pages for the documentation website.
  3. Add table of contents (TOC) files so DocFx will understand the project’s folder structure.

For the DocFx configuration file, use the DocFx.json below. (See the DocFx documentation for an explanation of each setting.)

{
  "metadata": [
    {
      "src": [
        {
          "files": [
            "**/*.cs"
          ],
          "src": ".."
        }
      ],
      "dest": "api",
      "disableGitFeatures": false,
      "disableDefaultFilter": false
    }
  ],
  "build": {
    "content": [
      {
        "files": [
          "api/**.yml",
          "api/index.md"
        ]
      },
      {
        "files": [
          "articles/**.md",
          "articles/**/toc.yml",
          "toc.yml",
          "*.md"
        ]
      }
    ],
    "resource": [
      {
        "files": [
          "images/**"
        ]
      }
    ],
    "overwrite": [
      {
        "files": [
          "apidoc/**.md"
        ],
        "exclude": [
          "obj/**",
          "_site/**"
        ]
      }
    ],
    "dest": "_site",
    "globalMetadataFiles": [],
    "fileMetadataFiles": [],
    "template": [
      "statictoc"
    ],
    "postProcessors": [],
    "markdownEngineName": "markdig",
    "noLangKeyword": false,
    "keepFileLink": false,
    "cleanupCacheHistory": false,
    "disableGitFeatures": false
  }
}

Add this file to the base folder of the C# project, with the name DocFx.json.

Now, create an api folder in the C# project if it doesn’t exist. You will need to create index and TOC files for the base folder of the project and the api folder.

In the base folder of the project, create a file named index.md and populate it with the following content:

## Can add version related details like release logs here in markdown format

Next, create a file named toc.yml and populate it with the following content:

- name: Home
  href: index.md
- name: Articles
  href: articles/
- name: Documentation
  href: api/
  homepage: api/index.md

In the api folder, create a file named index.md and populate it with the following content:

## Can add documentation related details here in markdown format

Next, create a file named toc.yml in the api folder and populate it with the following content:

- name: Home
  href: index.md
- name: Articles
  href: articles/
- name: Documentation
  href: api/
  homepage: api/index.md
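The four files above can also be scaffolded with a short script. A sketch, assuming it runs from the base folder of the C# project and using exactly the file contents listed above:

```python
from pathlib import Path

# toc.yml content from the steps above (used for both the base and api folders).
TOC = """\
- name: Home
  href: index.md
- name: Articles
  href: articles/
- name: Documentation
  href: api/
  homepage: api/index.md
"""

def scaffold(project_root="."):
    """Create the index.md and toc.yml files DocFx expects in the base and api folders."""
    root = Path(project_root)
    (root / "api").mkdir(parents=True, exist_ok=True)
    (root / "index.md").write_text(
        "## Can add version related details like release logs here in markdown format\n")
    (root / "toc.yml").write_text(TOC)
    (root / "api" / "index.md").write_text(
        "## Can add documentation related details here in markdown format\n")
    (root / "api" / "toc.yml").write_text(TOC)
```

Run `scaffold()` once before the first commit; DocFx generates the api/**.yml metadata files itself during the build.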

Now, push your C# project to the AWS CodeCommit repository on the mainline branch (see the AWS CodeCommit user guide for details).

Note: AWS restricts access to AWS CodeCommit repositories to authorized users. You will need to attach a policy to your IAM user that allows read/write access to the created repository.

The solution configures the AWS CodePipeline pipeline to monitor an AWS CodeCommit branch named mainline, as set by the BranchName property in the template.

Every code push to the AWS CodeCommit repository triggers the AWS CodePipeline pipeline, which rebuilds and updates the static website. You can find the link to the documentation website on the Amazon CloudFront page in the AWS Management Console.
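Instead of looking the link up in the console, you can also find it programmatically by matching the distribution's Comment, which the template sets to a fixed string. A sketch; the boto3 call is left as a comment because it requires AWS credentials:

```python
def doc_site_url(distributions,
                 comment="CloudFront distribution for automated code documentation."):
    """Return the documentation site URL by matching the Comment set in the template."""
    for dist in distributions:
        if dist.get("Comment") == comment:
            return "https://" + dist["DomainName"]
    return None

# Fetch the distribution summaries with boto3 (requires AWS credentials):
#   import boto3
#   items = boto3.client("cloudfront").list_distributions()["DistributionList"]["Items"]
#   print(doc_site_url(items))
```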

Cleaning up

To avoid incurring future charges on your AWS account, delete the resources created by this solution. You can delete all of them by deleting the AWS CloudFormation stack. Any resources modified after deployment (for example, Amazon S3 buckets that received documentation, artifact, or log objects) must be emptied before the stack can be deleted.
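Emptying the buckets can be done from the console or scripted. A boto3 sketch; if a bucket has versioning enabled, every object version and delete marker must be removed too, and the example bucket names are the placeholders you chose in Step 7:

```python
def empty_bucket(bucket):
    """Delete all contents so CloudFormation can delete the bucket."""
    bucket.object_versions.delete()  # every version and delete marker (versioned buckets)
    bucket.objects.delete()          # any remaining unversioned objects

# Usage (requires boto3 and AWS credentials; names are the ones from Step 7):
#   import boto3
#   s3 = boto3.resource("s3")
#   for name in ["my-documentation-bucket", "my-artifact-bucket", "my-logging-bucket"]:
#       empty_bucket(s3.Bucket(name))
```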

Conclusion

In this blog post, we have discussed how to create an automated C# code documentation generation system. You can use this system to generate and host real-time, encrypted documentation, with access restricted to a relevant audience.


AWS can help you assess how your company can get the most out of the cloud. Join the millions of AWS customers that trust us to migrate and modernize their most important applications in the cloud. To learn more about modernizing Windows Server or SQL Server, visit Windows on AWS. Contact us to start your migration journey today.

Rohan Kachewar

Rohan Kachewar is a Software Development Manager leading Amazon’s Taskless Workplace tech team. His professional interests are in building secure, scalable distributed systems that provide value to customers. Outside of work, he spends time watching movies and traveling.

Andy Hopper

Andy Hopper is a Principal Specialist Solutions Architect at AWS specializing in how to migrate and modernize Microsoft workloads using .NET and Windows technologies.

Rohit Kumar

Rohit Kumar is a Senior Software Engineer at Amazon, focused on scalable app development, workflow optimization, and cross-functional collaboration. His expertise includes designing robust architectures, leveraging cutting-edge technologies, and delivering innovative solutions for enhanced productivity and efficiency.

Siddharth

Siddharth is a Software Development Engineer - Test on the Taskless Workplace team. He is passionate about building optimal software solutions and increasing developer productivity. Outside of work, he likes to swim and go trekking.