
Overview
This solution provides compositional analysis and predicts the number of incidents in each ticket group. Insight into the incident distribution helps with capacity planning, resulting in efficient resource utilization.
Highlights
- The machine learning-based algorithm predicts incident volume at the group/category level. Volume analysis and prediction are based on historical incident data. Incidents can be grouped by nature, criticality, and/or severity. The solution uses both contextual and temporal features of incidents for prediction (a minimal feature-engineering sketch follows these highlights).
- Predicting incident volume along with its distribution/composition supports capacity planning, efficient resource utilization, and management.
- InfraGraf is a patented cognitive infrastructure automation platform that optimizes enterprise technology infrastructure investments. It diagnoses and predicts infrastructure failures. Need customized Machine Learning and Deep Learning solutions? Get in touch!
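
The vendor does not publish its feature engineering, so the following is only a minimal pandas sketch of the kind of group-level aggregation and temporal features described in the highlights above. Column names follow the Train.csv schema listed under Inputs; the derived features (day of week, month, 7-day lag) are illustrative assumptions, not the vendor's actual feature set.

```python
# Minimal sketch: aggregate historical incidents per group per day and derive
# simple temporal features. Column names follow the Train.csv schema below;
# the specific features are assumptions for illustration only.
import pandas as pd

train = pd.read_csv("Train.csv", parse_dates=["Date"], dayfirst=True)

# Daily incident volume per (Group_Name, Application_Name)
daily = (
    train.groupby(["Group_Name", "Application_Name", "Date"])
         .size()
         .rename("incident_count")
         .reset_index()
)

# Example temporal features
daily["day_of_week"] = daily["Date"].dt.dayofweek
daily["month"] = daily["Date"].dt.month
daily["lag_7"] = (
    daily.groupby(["Group_Name", "Application_Name"])["incident_count"].shift(7)
)
print(daily.head())
```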
Details
Pricing
| Dimension | Description | Cost/host/hour |
|---|---|---|
| ml.m5.large Inference (Batch), Recommended | Model inference on the ml.m5.large instance type, batch mode | $10.00 |
| ml.m5.large Inference (Real-Time), Recommended | Model inference on the ml.m5.large instance type, real-time mode | $5.00 |
| ml.m4.4xlarge Inference (Batch) | Model inference on the ml.m4.4xlarge instance type, batch mode | $10.00 |
| ml.m5.4xlarge Inference (Batch) | Model inference on the ml.m5.4xlarge instance type, batch mode | $10.00 |
| ml.m5.12xlarge Inference (Batch) | Model inference on the ml.m5.12xlarge instance type, batch mode | $10.00 |
| ml.m4.16xlarge Inference (Batch) | Model inference on the ml.m4.16xlarge instance type, batch mode | $10.00 |
| ml.m5.2xlarge Inference (Batch) | Model inference on the ml.m5.2xlarge instance type, batch mode | $10.00 |
| ml.c4.4xlarge Inference (Batch) | Model inference on the ml.c4.4xlarge instance type, batch mode | $10.00 |
| ml.m5.xlarge Inference (Batch) | Model inference on the ml.m5.xlarge instance type, batch mode | $10.00 |
| ml.c5.9xlarge Inference (Batch) | Model inference on the ml.c5.9xlarge instance type, batch mode | $10.00 |
Vendor refund policy
Currently we do not support refunds, but you can cancel your subscription to the service at any time.
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Amazon SageMaker model
An Amazon SageMaker model package is a pre-trained machine learning model ready to use without additional training. Use the model package to create a model on Amazon SageMaker for real-time inference or batch processing. Amazon SageMaker is a fully managed platform for building, training, and deploying machine learning models at scale.
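
The listing does not include deployment code; below is a minimal sketch of the standard SageMaker Python SDK workflow for running batch inference with a subscribed model package. The model package ARN, IAM role, and S3 URIs are placeholders you would replace with your own values.

```python
# Sketch using the SageMaker Python SDK; the ARN, role, and S3 URIs are placeholders.
import sagemaker
from sagemaker import ModelPackage

session = sagemaker.Session()
model = ModelPackage(
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",   # placeholder role
    model_package_arn="arn:aws:sagemaker:...:model-package/...",    # from your subscription
    sagemaker_session=session,
)

# Batch transform on the recommended ml.m5.large instance type
transformer = model.transformer(
    instance_count=1,
    instance_type="ml.m5.large",
    output_path="s3://your-bucket/incident-forecast/output/",       # placeholder
)
transformer.transform(
    data="s3://your-bucket/incident-forecast/input/input.zip",      # placeholder
    content_type="application/zip",
)
transformer.wait()
```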
Version release notes
Bug fixes and performance improvements
Additional details
Inputs
- Summary
- Train.csv:
- Date: Date (DD/MM/YYYY) on which the incident occurred.
- Group_Name: Group to which the application for which the incident was reported belongs.
- Application_Name: Application in which the error was reported.
- Predict.csv:
- Date: Any future date (DD/MM/YYYY) for which the forecast is needed.
- Group_Name: Group to which the application belongs and for which the number of incidents is to be forecast.
- Application_Name: Application for which the number of incidents is to be forecast.
- Limitations for input type
- The ZIP file should contain the two CSV files (UTF-8 encoded) with the fields listed above. Note: every (Group_Name, Application_Name) combination in Predict.csv must also appear in Train.csv. A packaging sketch follows this list.
- Input MIME type
- application/zip
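
As referenced above, here is a minimal sketch of assembling and validating the input archive. The file names, column names, and the (Group_Name, Application_Name) check follow the schema described under Inputs; the input.zip output name is an assumption.

```python
# Sketch: validate the two CSVs and package them as the application/zip input
# the model expects. File and column names follow the schema above.
import zipfile
import pandas as pd

train = pd.read_csv("Train.csv")
predict = pd.read_csv("Predict.csv")

# Every (Group_Name, Application_Name) pair in Predict.csv must also exist in Train.csv.
train_pairs = set(zip(train["Group_Name"], train["Application_Name"]))
missing = [p for p in zip(predict["Group_Name"], predict["Application_Name"])
           if p not in train_pairs]
assert not missing, f"Pairs missing from Train.csv: {sorted(set(missing))}"

with zipfile.ZipFile("input.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.write("Train.csv")
    zf.write("Predict.csv")
```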
Resources
Vendor resources
Support
Vendor support
For any assistance, please reach out at:
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.