Cost to complete the project: The estimated cost to complete this project is $0.57. This estimate assumes that you stay within the AWS Free Tier limits, that you follow the recommended configurations, and that you terminate all resources within 1 hour of completing the project. Your use case may require different configurations that can impact your bill. Use the Pricing Calculator to estimate costs tailored for your needs.

Monthly Billing Estimate: The total cost of building and maintaining your log analytics solution varies with your usage and configuration settings. Using the default configuration recommended in this guide, it typically costs $382.45 per month. Running the lab for one hour costs around $0.57.

AWS pricing is based on your usage of each individual service; your combined usage across services makes up your monthly bill. Explore the sections below to learn what each service does and how it affects your bill.

  • Amazon Kinesis Data Firehose

    Product Description: Amazon Kinesis Data Firehose is a fully managed service for delivering real-time streaming data to destinations such as Amazon S3, Amazon Redshift, or Amazon Elasticsearch Service (Amazon ES). With Kinesis Data Firehose, you do not need to write any applications or manage any resources. You configure your data producers to send data to Kinesis Data Firehose, and it automatically delivers the data to the destination that you specified.

    How Pricing Works: Amazon Kinesis Data Firehose pricing is based on the volume of data ingested into Amazon Kinesis Data Firehose, which is calculated as the number of data records you send to the service, times the size of each record, rounded up to the nearest 5 KB. For example, if your data records are 42 KB each, Amazon Kinesis Data Firehose will count each record as 45 KB of data ingested. In the US East region, the price for Amazon Kinesis Data Firehose is $0.029 per GB of data ingested. For detailed pricing information, see Amazon Kinesis Data Firehose Pricing.

    Cost Example: In this tutorial, you will create two separate Amazon Kinesis Data Firehose delivery streams. One delivery stream will receive the data from your Apache access log producer, and the other stream will receive the output from an Amazon Kinesis Data Analytics application.

    For the first Kinesis Data Firehose delivery stream, assume the producer sends 500 records per second, and that each record is less than 5 KB in size (typical for an Apache access log record). The monthly estimate for data ingestion into the Firehose delivery stream consists of the following:

    • The price in the US East region is $0.029 per GB of data ingested.
    • Record size, rounded up to the nearest 5 KB = 5 KB
    • Data ingested (GB per sec) = (500 records/sec * 5 KB/record) / 1,048,576 KB/GB = 0.002384 GB/sec
    • Data ingested (GB per month) = 30 days/month * 86,400 sec/day * 0.002384 GB/sec = 6,179.81 GB/month
    • Monthly charge: 6,179.81 GB * $0.029/GB = $179.21
    • Hourly charge: In this tutorial, assume that the system ingests data for only 1 hour. At roughly 8.58 GB ingested in that hour, the cost specifically for this tutorial is approximately $0.25.

    The second Firehose delivery stream receives records far less frequently. Because the Amazon Kinesis Data Analytics application outputs only a few rows of data every minute, the cost for that delivery stream is correspondingly smaller. Assuming only five records per minute are ingested, and each record is less than 5 KB, the cost for that delivery stream is roughly $0.00005 for the 1-hour duration assumed in this tutorial. Both calculations are reproduced in the sketch below.
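
    As a rough cross-check of the numbers above, here is a minimal, hypothetical Python sketch of the Firehose ingestion math. It assumes only the figures quoted in this guide ($0.029 per GB ingested in US East, records rounded up to the nearest 5 KB, 1,048,576 KB per GB, a 30-day month); it illustrates the arithmetic and is not an official pricing tool.

    ```python
    import math

    PRICE_PER_GB = 0.029      # USD per GB ingested, US East (rate quoted in this guide)
    KB_PER_GB = 1_048_576     # binary KB-to-GB conversion used in the calculation above

    def billed_kb(record_kb: float) -> int:
        """Round a record up to the nearest 5 KB, the way Firehose bills ingestion."""
        return math.ceil(record_kb / 5) * 5

    def firehose_cost(records_per_sec: float, record_kb: float, hours: float) -> float:
        """Estimated ingestion charge for a delivery stream over the given duration."""
        gb = records_per_sec * billed_kb(record_kb) * 3600 * hours / KB_PER_GB
        return gb * PRICE_PER_GB

    # First delivery stream: 500 records/sec, each billed at 5 KB.
    print(f"monthly:  ${firehose_cost(500, 5, 30 * 24):.2f}")   # ~$179.21
    print(f"one hour: ${firehose_cost(500, 5, 1):.2f}")         # ~$0.25

    # Second delivery stream: ~5 records/minute from the Analytics application.
    print(f"one hour: ${firehose_cost(5 / 60, 5, 1):.5f}")      # a few hundred-thousandths of a dollar
    ```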

  • Amazon Kinesis Data Analytics

    Product Description: Amazon Kinesis Data Analytics is the easiest way to process and analyze streaming data in real-time with ANSI standard SQL. It enables you to read data from Amazon Kinesis Data Streams and Amazon Kinesis Data Firehose, and build stream processing queries that filter, transform, and aggregate the data as it arrives. Amazon Kinesis Data Analytics automatically recognizes standard data formats, parses the data, and suggests a schema, which you can edit using the interactive schema editor. It provides an interactive SQL editor and stream processing templates so you can write sophisticated stream processing queries in just minutes. Amazon Kinesis Data Analytics runs your queries continuously, and writes the processed results to output destinations such as Amazon Kinesis Data Streams and Amazon Kinesis Data Firehose, which can deliver the data to Amazon S3, Amazon Redshift, and Amazon ES. Amazon Kinesis Data Analytics automatically provisions, deploys, and scales the resources required to run your queries.

    How Pricing Works: With Amazon Kinesis Data Analytics, you pay only for what you use. You are charged an hourly rate based on the average number of Kinesis Processing Units (KPUs) used to run your stream processing application.

    A single KPU is a unit of stream processing capacity that comprises 4 GB of memory, 1 vCPU of compute, and corresponding networking capabilities. As the complexity of your queries varies, and the demands on memory and compute vary in response, Amazon Kinesis Data Analytics automatically and elastically scales the number of KPUs required to complete your analysis. There are no resources to provision and no upfront costs or minimum fees associated with Amazon Kinesis Data Analytics.

    Cost Example: This example assumes that the system runs for 1 hour in the US East region. The SQL query in this tutorial is very basic and will not consume more than one KPU. Given that the price for Amazon Kinesis Data Analytics in US East is $0.11 per KPU-hour, and the tutorial runs for 1 hour, the total cost for Amazon Kinesis Data Analytics usage is $0.11.
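
    To make the KPU-hour model concrete, here is a tiny, hypothetical sketch that uses the $0.11 per KPU-hour US East rate quoted above; the charge is simply the average number of KPUs in use multiplied by the hours the application runs.

    ```python
    KPU_HOUR_RATE = 0.11   # USD per KPU-hour, US East (rate quoted in this guide)

    def analytics_cost(avg_kpus: float, hours: float) -> float:
        """Charge = average KPUs in use * hours the application runs * rate."""
        return avg_kpus * hours * KPU_HOUR_RATE

    print(f"${analytics_cost(1, 1):.2f}")    # 1 KPU for 1 hour -> $0.11
    print(f"${analytics_cost(1, 720):.2f}")  # 1 KPU all month  -> $79.20
    ```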

  • Amazon Elasticsearch Service

    Product Description: Elasticsearch is a popular open-source search and analytics engine for big data use cases such as log and clickstream analysis. Amazon ES manages the capacity, scaling, patching, and administration of Elasticsearch clusters for you while giving you direct access to the Elasticsearch API.

    How Pricing Works: With Amazon ES, you pay only for what you use. There are no minimum fees or upfront commitments. You are charged for Amazon Elasticsearch instance hours, an Amazon Elastic Block Store (Amazon EBS) volume (if you choose this option), and standard data transfer fees. For more information, see Amazon Elasticsearch Service Pricing.

    Cost Example: For this tutorial, the total Amazon ES cost can be calculated as follows: a single m3.medium.elasticsearch instance in the US East region costs $0.094 per hour, so 1 hour * $0.094/hour = $0.094.

  • Amazon S3

    Product Description: Amazon S3 provides secure, durable, and highly scalable cloud storage for the objects that make up your application. Examples of objects you can store include source code, logs, images, videos, and other artifacts that are created when you deploy your application. Amazon S3 makes it easy to use object storage, with a simple web interface to store and retrieve your files from anywhere on the web, meaning that your data is reliably available whenever you need it.

    How Pricing Works: Amazon S3 pricing is based on five components: the type of S3 storage you use, the region where you store your content (e.g., US East vs. Asia Pacific - Sydney), the amount of data you store, the number of requests you or your users make to store new content or retrieve the content, and the amount of data that is transferred from Amazon S3 to you or your users. In this tutorial, your S3 costs are driven primarily by the amount of data you store. For more information, see Amazon S3 pricing.

    Cost Example: Using Standard Storage in the US East Region, if you store 5 GB of content, you would pay $0.115 per month. If you created your account in the past 12 months, and you are eligible for the AWS Free Tier, you would pay $0.00 per month. For this tutorial, assume that the producer creates 5 GB of data. Over a 1-hour period, the total cost for storing the records in Amazon S3 is $0.000171.
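
    For reference, here is a minimal sketch of that proration. It assumes the $0.023 per GB-month Standard Storage rate implied above and a 30-day (720-hour) month; request and data-transfer charges are ignored, and the exact hourly figure depends on how many hours per month you assume.

    ```python
    STORAGE_RATE = 0.023     # USD per GB-month, S3 Standard in US East (implied by $0.115 for 5 GB)
    HOURS_PER_MONTH = 720    # 30-day month assumed

    def s3_storage_cost(gb: float, hours: float) -> float:
        """Storage charge prorated to the given number of hours."""
        return gb * STORAGE_RATE * hours / HOURS_PER_MONTH

    print(f"${s3_storage_cost(5, 720):.3f}")   # full month  -> $0.115
    print(f"${s3_storage_cost(5, 1):.6f}")     # single hour -> ~$0.00016
    ```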

  • Amazon EC2

    Product Description: Amazon EC2 provides the virtual application servers, known as instances, to run your web application on the platform you choose. EC2 allows you to configure and scale your compute capacity easily to meet changing requirements and demand. It is integrated with Amazon’s proven computing environment, allowing you to leverage the AWS suite of services.

    How Pricing Works: Amazon EC2 pricing is based on four components: the instance type you choose (EC2 comes in 40+ types of instances with options optimized for compute, memory, storage, and more), the region your instances are based in, the software you run, and the pricing model you select (on-demand instances, reserved capacity, spot, etc.). For more information, see Amazon EC2 pricing.

    Cost Example: Assume your log files reside on a single Linux t2.micro EC2 instance in the US East Region. With an on-demand pricing model, your hourly charge for your virtual machine will be $0.0116. For this implementation guide, assuming that the log generating instance runs for 1 hour, your EC2 cost is estimated to be $0.0116.