
    4DAlert - Data Reconciliation, Data Quality and DataOps(CI/CD)

    Sold by: 4DAlert
    4DAlert leverages AI/ML to deliver automated Data Reconciliation, Data Quality, and DataOps (CI/CD) for modern analytics platforms such as Snowflake, Azure, SQL Server, Athena, Redshift, Google BigQuery, Postgres, and others. The Data Reconciliation module comes with several APIs that connect to both source and target databases, reconcile data at predefined intervals, and generate alerts when there are data discrepancies. The Data Quality module scans the analytics platform against data quality rules such as anomaly detection, outlier detection, freshness, invalid characters, schema drift, and late arrival of data using AI/ML technologies, and alerts when it finds a data anomaly. The DataOps (CI/CD) module automates the CI/CD process by integrating the database with GitHub, Azure DevOps, or equivalent source control tools, comparing schemas, and deploying changes automatically.

    Overview

    4DAlert leverages AI/ML to deliver automated Data Reconciliation, Data Quality, and DataOps (CI/CD) for modern analytics platforms such as Snowflake, Azure, SQL Server, Athena, Redshift, Google BigQuery, Postgres, and others. The solution comes with three modules:

    a) The Data Reconciliation module uses APIs that connect to both source and target databases, reconcile data at predefined intervals, and generate alerts when there are data discrepancies. This module provides:
    a. Data Reconciliation & Data Compare
    b. Alert Maintenance & Alert History
    c. Alert Scheduling & Monitor Chains
    d. Master Data Reconciliation

    b) The Data Quality & Observability module scans the analytics platform against data quality rules such as anomaly detection, outlier detection, freshness, invalid characters, schema drift, and late arrival of data using AI/ML technologies, and alerts when it finds a data anomaly. This module provides:
    a. Data Quality Rules
    b. Data Quality Framework
    c. Custom Rules
    d. DAMA Dashboard
    e. AI/ML-Based Alerting
    f. Schema Drift Detection
    g. Data Late Arrival Alerts

    c) The DataOps (CI/CD) module automates the CI/CD process by integrating the database with GitHub, Azure DevOps, or equivalent source control tools, comparing schemas, and deploying changes automatically. This module provides:
    a. Intuitive UI for Schema Compare
    b. Object Dependency Checks
    c. Integration with source control such as Git and Azure DevOps
    d. Automatic Deployment Script Generation (ALTER vs. CREATE)
    e. Automatic notification of CI/CD operations with all passed and failed DDLs

    Other Features

    • The solution runs on a small EC2 instance; 8 vCPUs and 100 GB of storage are sufficient to support a medium-size implementation.
    • The solution runs 100% within the customer's cloud subscription (i.e., inside the firewall), supports single sign-on (SSO), and is secured by Active Directory or Okta.
    • Integration with Jira and ServiceNow
    • The solution is compatible with Snowflake, Azure, Oracle, SQL Server, Athena, Redshift, Google BigQuery, and others

    Highlights

    • The Data Reconciliation module uses APIs that connect to both source and target databases, reconcile data at predefined intervals, and generate alerts when there are data discrepancies. It provides Data Reconciliation & Data Compare, Alert Maintenance & Alert History, Alert Scheduling & Monitor Chains, and Master Data Reconciliation.
    • The Data Quality & Observability module scans the analytics platform against data quality rules such as anomaly detection, outlier detection, freshness, invalid characters, schema drift, and late arrival of data using AI/ML technologies, and alerts when it finds a data anomaly. It provides Data Quality Rules, a Data Quality Framework, Custom Rules, a DAMA Dashboard, AI/ML-Based Alerting, Schema Drift Detection, and Data Late Arrival Alerts.
    • The DataOps (CI/CD) module automates the CI/CD process by integrating the database with GitHub, Azure DevOps, or equivalent source control tools, comparing schemas, and deploying changes automatically. It provides an intuitive UI for Schema Compare, Object Dependency Checks, integration with source control such as Git and Azure DevOps, Automatic Deployment Script Generation (ALTER vs. CREATE), and automatic notification of CI/CD operations with all passed and failed DDLs.

    Details

    Sold by

    Delivery method

    Delivery option
    Docker Compose Method

    Latest version

    Operating system
    Linux

    Features and programs

    Financing for AWS Marketplace purchases

    AWS Marketplace now accepts line of credit payments through the PNC Vendor Finance program. This program is available to select AWS customers in the US, excluding NV, NC, ND, TN, & VT.

    Pricing

    4DAlert - Data Reconciliation, Data Quality and DataOps(CI/CD)

    Pricing is based on a fixed subscription cost. You pay the same amount each billing period for unlimited usage of the product. Pricing is prorated, so you're only charged for the number of days you've been subscribed. Subscriptions have no end date and may be canceled any time.
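    As an illustration of how proration works, the charge for a partial billing period can be estimated as the monthly price times the fraction of the period you were subscribed. This is a sketch only: the 12-day and 30-day figures below are made-up example values, and AWS computes the actual prorated amount based on your real billing cycle.

    ```shell
    # Estimate a prorated charge: monthly price x days subscribed / days in period.
    # The day counts are illustrative placeholders, not values from this listing.
    awk -v monthly=4999.00 -v days=12 -v period=30 \
        'BEGIN { printf "Prorated charge: $%.2f\n", monthly * days / period }'
    ```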

    Fixed subscription cost

    $4,999.00/month

    Vendor refund policy

    Customers can cancel the subscription within the first 2 weeks without any additional fees.

    How can we make this page better?

    We'd like to hear your feedback and ideas on how to improve this page.

    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.


    Delivery details

    Docker Compose Method

    Supported services:
    • Amazon ECS
    • Amazon EKS
    • Amazon EKS Anywhere
    • Amazon ECS Anywhere
    Container image

    Containers are lightweight, portable execution environments that wrap server application software in a filesystem that includes everything it needs to run. Container applications run on supported container runtimes and orchestration services, such as Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS). Both eliminate the need for you to install and operate your own container orchestration software by managing and scheduling containers on a scalable cluster of virtual machines.

    Version release notes

    General availability of all three modules: Data Reconciliation, Data Quality, and DataOps (CI/CD). The solution is compatible with Snowflake, Azure, Oracle, SQL Server, Athena, Redshift, Google BigQuery, and others.

    Additional details

    Usage instructions

    Deployment Description for Docker Compose Configuration To deploy the provided Docker Compose configuration, follow these steps:

    1. Install Docker and Docker Compose Ensure Docker and Docker Compose are installed on your system; see the official Docker documentation for installation instructions.

    2. Add Port Rules Add inbound rules for ports 80 and 3000 so the hosted application can be reached.
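    If the instance runs on EC2, these inbound rules can be added to its security group with the AWS CLI. The sketch below assumes placeholder values: the security group ID and source CIDR are not from this listing and must be replaced with your own.

    # Open ports 80 and 3000 on the instance's security group (placeholder values).
    aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 \
        --protocol tcp --port 80 --cidr 203.0.113.0/24
    aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 \
        --protocol tcp --port 3000 --cidr 203.0.113.0/24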

    3. Create a Docker Compose File Save the provided Docker Compose configuration into a file named docker-compose.yml:

    version: "3.8"

    services:
      postgres:
        image: postgres:latest
        container_name: postgres
        environment:
          POSTGRES_USER: fdalert_user
          POSTGRES_PASSWORD: password
          POSTGRES_DB: FDALERT
        ports:
          - "5432:5432"
        volumes:
          - postgres_data:/var/lib/postgresql/data

      fdalert:
        image: 709825985650.dkr.ecr.us-east-1.amazonaws.com/4dalert/4dalert-marketplace:1.10
        container_name: fdalert
        ports:
          - "3000:3000"
        environment:
          NODE_ENV: default
          APP_DB_HOST: postgres
          APP_DB_PORT: 5432
          APP_DB_NAME: FDALERT
          APP_DB_USER: fdalert_user
          APP_DB_PASSWORD: password
        depends_on:
          - postgres

    volumes:
      postgres_data:
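    The credentials above are hard-coded placeholders; in practice you would likely move them into an .env file next to docker-compose.yml, which Docker Compose reads automatically to substitute ${VAR} references. A sketch (the variable name and value are illustrative, not from the listing):

    # .env (do not commit this file to source control)
    POSTGRES_PASSWORD=choose-a-strong-password

    You would then reference it in the compose file, e.g. POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}, for both the postgres and fdalert services.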

    4. Configure AWS ECR Access Make sure you have the necessary permissions and credentials to access the AWS ECR repository, then authenticate Docker to the Amazon ECR registry by running the following command:

    aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 709825985650.dkr.ecr.us-east-1.amazonaws.com

    5. Deploy the Docker Compose Stack Navigate to the directory containing the docker-compose.yml file and run the following command to start the services: docker-compose up -d

    6. Verify Deployment Ensure the services are running correctly by checking the status of the containers: docker-compose ps

    7. Access the Application Your application should now be accessible at http://localhost:3000 (assuming you are running Docker on your local machine).

    8. Manage the Deployment To stop the services, use: docker-compose down To view logs for a specific service, use: docker-compose logs <service_name> For example, to follow logs for the fdalert service: docker logs fdalert -f

    For a Fresh Setup

    9. Create the organization by entering the correct and specified details, then create the admin user for the application setup:

    Step 1: Enter the Organization name (e.g., xyz).
    Step 2: Enter the Admin Email Address (e.g., john@xyz.com).
    Step 3: Enter the Admin First Name (e.g., John).
    Step 4: Enter the Admin Last Name (e.g., Doe).
    Step 5: Enter your Password.
    Step 6: Confirm your Password.
    Step 7: Click the Create Admin button.

    Support

    Vendor support

    A 4DAlert implementation includes:

    • Support for initial setup
    • Level 1 support for the first 30 days
    • Level 3 support after the first 30 days of implementation

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

    Customer reviews

    Ratings and reviews

    0 ratings
    0 AWS reviews
    No customer reviews yet. Be the first to write a review for this product.