Overview
4DAlert leverages AI/ML to deliver automated Data Reconciliation, Data Quality, and DataOps (CI/CD) for modern analytics platforms such as Snowflake, Azure, SQL Server, Athena, Redshift, Google BigQuery, Postgres, and others. The solution comes with three modules:
- Data Reconciliation: connects to both source and target databases via APIs, reconciles data at predefined intervals, and generates alerts when data discrepancies are found. Capabilities: Data Reconciliation & Data Compare; Maintain Alert & Alert History; Schedule Alert & Monitor Chain; Master Data Reconciliation.
- Data Quality & Observability: uses AI/ML to scan the analytics platform against data quality rules such as anomaly detection, outlier detection, freshness, invalid characters, schema drift, late-arriving data, and many others, and alerts when a data anomaly is found. Capabilities: Data Quality Rules; Data Quality Framework; Custom Rules; DAMA Dashboard; AI/ML-Based Alerting; Schema Drift; Data Late Arrival Alerts.
- DataOps (CI/CD): automates the CI/CD process by integrating the database with GitHub, Azure DevOps, or equivalent source control tools, with schema compare and automatic change deployment. Capabilities: Intuitive UI for Schema Compare; Object Dependency Checks; Integration with Source Control such as Git and Azure DevOps; Automatic Deployment Script Generation (ALTER vs. CREATE); Automatic Notification of CI/CD Operations with all Passed and Failed DDLs.
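To make the reconciliation idea concrete, here is a minimal sketch of an interval-style check. This is not 4DAlert's actual implementation: the connection URLs, the orders table, and the simple row-count comparison are illustrative assumptions.

# Illustrative sketch only: compare row counts for one table between a
# source and a target database and emit an alert line on discrepancy.
# SOURCE_DB_URL and TARGET_DB_URL are assumed Postgres connection strings.
SRC=$(psql "$SOURCE_DB_URL" -tAc "SELECT COUNT(*) FROM orders")
TGT=$(psql "$TARGET_DB_URL" -tAc "SELECT COUNT(*) FROM orders")
if [ "$SRC" != "$TGT" ]; then
  echo "ALERT: row count mismatch on orders (source=$SRC, target=$TGT)"
fi

A production tool such as the one described above would run checks like this on a schedule, across many tables and platforms, and route discrepancies to its alerting subsystem.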
Other Features
- Runs on a small EC2 instance (8 vCPUs and 100 GB of storage), which is sufficient to support the application for a medium-size implementation.
- Runs 100% within the customer's cloud subscription (i.e., inside the firewall), supports single sign-on (SSO), and is secured by Active Directory or Okta.
- Integrates with Jira and ServiceNow.
- Compatible with Snowflake, Azure, Oracle, SQL Server, Athena, Redshift, Google BigQuery, and others.
Highlights
- Data Reconciliation: API-based connections to both source and target databases, reconciliation at predefined intervals, alerts on discrepancies, alert history, scheduled alerts with monitor chains, and master data reconciliation.
- Data Quality & Observability: AI/ML-based scanning for anomalies, outliers, freshness, invalid characters, schema drift, and late-arriving data, with custom rules, a data quality framework, and a DAMA dashboard.
- DataOps (CI/CD): automated CI/CD through integration with GitHub, Azure DevOps, or equivalent source control, an intuitive schema compare UI, object dependency checks, automatic deployment script generation (ALTER vs. CREATE), and notifications listing all passed and failed DDLs.
Details
Pricing
- $4,999.00/month
Vendor refund policy
Customers can cancel their subscription within the first 2 weeks without any additional fees.
Delivery details
Docker Compose Method
- Amazon ECS
- Amazon EKS
- Amazon EKS Anywhere
- Amazon ECS Anywhere
Container image
Containers are lightweight, portable execution environments that wrap server application software in a filesystem that includes everything it needs to run. Container applications run on supported container runtimes and orchestration services, such as Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS). Both eliminate the need for you to install and operate your own container orchestration software by managing and scheduling containers on a scalable cluster of virtual machines.
Version release notes
General availability of all three modules: Data Reconciliation, Data Quality, and DataOps (CI/CD). The solution is compatible with Snowflake, Azure, Oracle, SQL Server, Athena, Redshift, Google BigQuery, and others.
Additional details
Usage instructions
Deployment Description for Docker Compose Configuration
To deploy the provided Docker Compose configuration, follow these steps:
- Install Docker and Docker Compose: Ensure Docker and Docker Compose are installed on your system, following the official Docker installation documentation.
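You can confirm both tools are available before proceeding:

# Check the installed versions
docker --version
docker-compose --version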
- Add Port Rules: Add inbound rules for ports 80 and 3000 so the application can be reached.
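If the application is hosted on an EC2 instance, one way to add these rules is with the AWS CLI. In this sketch, the security group ID is a placeholder, and the open-to-the-world CIDR should be tightened to your network:

# Open ports 80 and 3000 on the instance's security group (placeholder ID)
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 80 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 3000 --cidr 0.0.0.0/0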
- Create a Docker Compose File: Save the provided Docker Compose configuration into a file named docker-compose.yml:
version: "3.8"
services: postgres: image: postgres:latest container_name: postgres environment: POSTGRES_USER: fdalert_user POSTGRES_PASSWORD: password POSTGRES_DB: FDALERT ports: - "5432:5432" volumes: - postgres_data:/var/lib/postgresql/data
fdalert: image: 709825985650.dkr.ecr.us-east-1.amazonaws.com/4dalert/4dalert-marketplace:1.10 container_name: fdalert ports: - "3000:3000" environment: NODE_ENV: default APP_DB_HOST: postgres APP_DB_PORT: 5432 APP_DB_NAME: FDALERT APP_DB_USER: fdalert_user APP_DB_PASSWORD: password
depends_on: - postgresvolumes: postgres_data:
- Configure AWS ECR Access: Make sure you have the necessary permissions and credentials to access the AWS ECR repository, then authenticate Docker to the Amazon ECR registry by running the following command (the region and registry account match the image reference in the compose file):
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 709825985650.dkr.ecr.us-east-1.amazonaws.com
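After logging in, you can optionally confirm registry access by pulling the application image directly:

# Pull the image referenced in the compose file
docker pull 709825985650.dkr.ecr.us-east-1.amazonaws.com/4dalert/4dalert-marketplace:1.10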
- Deploy the Docker Compose Stack: Navigate to the directory containing the docker-compose.yml file and run the following command to start the services:
docker-compose up -d
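If you use Docker Compose V2 (the docker CLI plugin rather than the standalone docker-compose binary), the equivalent command is:

# Compose V2 syntax
docker compose up -d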
- Verify Deployment: Ensure the services are running correctly by checking the status of the containers:
docker-compose ps
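You can also confirm that the postgres_data named volume defined in the compose file was created; Compose prefixes volume names with the project name, which is typically the directory name:

# List volumes and filter for the Postgres data volume
docker volume ls | grep postgres_data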
- Access the Application: Your application should now be accessible at http://localhost:3000 (assuming you are running Docker on your local machine).
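A quick way to check that the port is answering (the exact response depends on the application):

# Request only the response headers from the app
curl -I http://localhost:3000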
- Manage the Deployment: To stop the services, run:
docker-compose down
To view logs for a specific service, run docker-compose logs <service_name>. For example, to follow logs for the fdalert service:
docker logs fdalert -f
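As a hypothetical maintenance example, not part of the vendor's documented procedure, the application database can be backed up from the running postgres container using the names and credentials defined in the compose file above:

# Dump the FDALERT database to a local SQL file
docker exec postgres pg_dump -U fdalert_user FDALERT > fdalert_backup.sql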
For a Fresh Setup
- The user should first create the organization by entering the required details, and then create the admin user for the application setup:
- Step 1: Enter the Organization Name (e.g., xyz).
- Step 2: Enter the Admin Email Address (e.g., john@xyz.com).
- Step 3: Enter the Admin First Name (e.g., John).
- Step 4: Enter the Admin Last Name (e.g., Doe).
- Step 5: Enter your Password.
- Step 6: Confirm your Password.
- Step 7: Click the Create Admin button.
Support
Vendor support
4DAlert implementation includes:
- Support for initial setup
- Level 1 support for the first 30 days
- Level 3 support after the first 30 days of implementation
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.