The best way to get started with Amazon Elasticsearch Service is to work through the Getting Started Guide, part of our technical documentation. Within a few minutes, you will be able to deploy and use an Amazon Elasticsearch Service domain. To get hands-on experience with Amazon Elasticsearch Service, build a log analytics solution by following this step-by-step guide.


The AWS Free Tier includes 750 hours per month of a t2.micro.elasticsearch or t2.small.elasticsearch instance and 10 GB per month of optional Amazon EBS storage (Magnetic or General Purpose).




Expedia uses Amazon Elasticsearch Service (Amazon ES) for a variety of mission-critical use cases, ranging from log aggregation to application monitoring and pricing optimization. In this session, the Expedia team reviews how they use Amazon ES and Kibana to analyze and visualize Docker startup logs, AWS CloudTrail data, and application metrics. They share best practices for architecting a scalable, secure log analytics solution using Amazon ES, so you can add new data sources almost effortlessly and get insights quickly.

Watch session recording | Download presentation

In this session, we use Apache web logs as an example and show you how to build an end-to-end analytics solution. First, we cover how to configure an Amazon ES cluster and ingest data into it using Amazon Kinesis Firehose. We look at best practices for choosing instance types, storage options, shard counts, and index rotations based on the throughput of incoming data. Then we demonstrate how to set up a Kibana dashboard and build custom dashboard widgets. Finally, we review approaches for generating custom ad-hoc reports.
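The sizing guidance mentioned in this session (shard counts and index rotation driven by incoming throughput) can be sketched as a quick calculation. The daily-rotation naming scheme and the ~30 GB per-shard target below are common rules of thumb assumed for illustration, not figures from the session itself:

```python
import math
from datetime import datetime

def daily_index_name(prefix: str, when: datetime) -> str:
    """Daily index rotation: one index per day, e.g. apache-logs-2017-11-28."""
    return f"{prefix}-{when.strftime('%Y-%m-%d')}"

def primary_shard_count(daily_index_gb: float, target_shard_gb: float = 30.0) -> int:
    """Primary shards for one daily index, keeping each shard near a target
    size (the ~30 GB per-shard target is an assumed rule of thumb)."""
    return max(1, math.ceil(daily_index_gb / target_shard_gb))
```

With daily rotation, retention becomes a matter of deleting whole indexes, which is far cheaper than deleting individual documents.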

Watch session recording | Download presentation

MirrorWeb offers automated website and social media archiving services with full-text search capability for all content. The UK government hired MirrorWeb to provide search services across 20 years of archived data from over 4,800 websites. In this session, MirrorWeb discusses the technology stack they built using Amazon Elasticsearch Service (Amazon ES) to search across the 333 million unique documents (over 120 TB) that they indexed within a 10-hour period. They discuss how they moved data from on-premises storage to Amazon S3 using AWS Snowball and then processed that data using Amazon EC2 Spot Instances, reducing costs by over 90%. They also talk about how they used AWS Lambda to ingest data into Amazon ES. Finally, they share best practices for building a large-scale document search architecture.
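A Lambda function ingesting documents into Amazon ES at this scale typically batches them through the Elasticsearch `_bulk` API. A minimal sketch of building such a request body, assuming a hypothetical index name and document shape (this is not MirrorWeb's actual code):

```python
import json

def bulk_index_body(index: str, docs: list) -> str:
    """Build an NDJSON body for the Elasticsearch _bulk API: an action line
    followed by the document source, one pair per document."""
    lines = []
    for doc in docs:
        # The type name ("doc") is required on older Elasticsearch versions.
        lines.append(json.dumps({"index": {"_index": index, "_type": "doc"}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # _bulk bodies must end with a newline
```

The resulting string would be POSTed to the domain's `/_bulk` endpoint with a signed HTTP request; batching many documents per call is what keeps ingestion throughput high.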

Watch session recording | Download presentation

Applications generate logs. Infrastructure generates logs. Even humans generate logs (though we usually call that "medical data"). By ingesting and analyzing logs, you can gain an understanding of how complex systems operate and quickly discover and diagnose problems when they don't work as they should. In this workshop, we ingest and analyze log streams using Amazon Kinesis Firehose and Amazon Elasticsearch Service. You should come with an understanding of AWS fundamentals (Amazon EC2, Amazon S3, and security groups). You need a laptop with a Chrome or Firefox browser.
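The kind of record this workshop ingests can be illustrated with a parser for the Apache Common Log Format: once parsed into JSON documents like this, the records are what Kinesis Firehose would deliver into Amazon ES. This is a sketch for illustration; field names are not prescribed by the workshop:

```python
import re

# Apache Common Log Format, e.g.:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
CLF = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_clf(line: str) -> dict:
    """Turn one access-log line into a JSON-serializable record."""
    m = CLF.match(line)
    if not m:
        raise ValueError("not a Common Log Format line")
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["bytes"] = 0 if rec["bytes"] == "-" else int(rec["bytes"])
    return rec
```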

Download presentation »


Log analytics is a common big data use case that allows you to analyze log data from websites, mobile devices, servers, sensors, and more for a wide variety of applications, including digital marketing, application monitoring, fraud detection, ad tech, gaming, and IoT. In this tech talk, we will walk you step by step through the process of building an end-to-end analytics solution that ingests, transforms, and loads streaming data using Amazon Kinesis Firehose, Amazon Kinesis Analytics, and AWS Lambda. The processed data will be saved to an Amazon Elasticsearch Service cluster, and we will use Kibana to visualize the data in near real time.

Learning Objectives:

  1. Reference architecture for building a complete log analytics solution
  2. Overview of the services used and how they fit together
  3. Best practices for log analytics implementation
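The transformation step in this pipeline (an AWS Lambda function attached to the Kinesis Firehose delivery stream) follows a fixed contract: records arrive base64-encoded, and each must be returned with its `recordId`, a `result`, and re-encoded data. A minimal sketch, where the enrichment field added is hypothetical:

```python
import base64
import json

def handler(event, context):
    """Kinesis Firehose data-transformation Lambda: decode each record,
    enrich it, and hand it back re-encoded with result 'Ok'."""
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        payload["source"] = "firehose"  # hypothetical enrichment field
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
        })
    return {"records": output}
```

Records that fail transformation can instead be returned with result `"ProcessingFailed"`, which Firehose delivers to an error output rather than dropping silently.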

Watch recording | Download presentation

Data lakes can contain massive amounts of unstructured data that are difficult to search and explore. You can use Amazon Elasticsearch Service to easily index and search both the metadata and the content of the documents in your data lake. In this tech talk, you will learn how to use Amazon Elasticsearch Service to build a metadata repository for your data lake and index the contents of your documents so you can easily locate files by the text they contain.

Learning Objectives:

  • Understand the search and visualization capabilities of Amazon Elasticsearch Service
  • Learn how to set up Amazon Kinesis Firehose to ingest, transform, and load document metadata into Amazon Elasticsearch Service
  • Learn best practices for building a metadata repository for your data lakes using Amazon Elasticsearch Service
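The metadata repository described above boils down to indexing one small document per object in the data lake. A minimal sketch of such a document, with illustrative field names (the real schema would depend on your lake and your extraction pipeline):

```python
def metadata_doc(bucket: str, key: str, size_bytes: int,
                 content_type: str, text_excerpt: str) -> dict:
    """Metadata document for one data-lake object (field names illustrative)."""
    return {
        "s3_location": f"s3://{bucket}/{key}",
        "file_name": key.rsplit("/", 1)[-1],
        "extension": key.rsplit(".", 1)[-1].lower() if "." in key else "",
        "size_bytes": size_bytes,
        "content_type": content_type,
        "content": text_excerpt,  # full-text field that search queries hit
    }
```

Indexing documents shaped like this lets a single full-text query on `content` return the `s3_location` of every matching file in the lake.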

Watch recording | Download presentation

Amazon Elasticsearch Service (Amazon ES) makes it easy to set up and operate Elasticsearch clusters on AWS. Amazon Virtual Private Cloud (Amazon VPC) lets you launch AWS resources in a secure virtual network that you define. Amazon ES now supports VPC endpoints, so you can create a private connection between your VPC and your Amazon ES domains. In this webinar, we walk you through how to set up and configure an Amazon ES domain and send log data to that domain from a VPC. We'll cover security, scale, and access control for Amazon ES with VPC.

Learning Objectives:

  • Understand how Amazon ES security works
  • Learn how to use Amazon VPC endpoints with Amazon ES
  • Take away best practices for security and access control for your Amazon ES domains
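The VPC setup walked through in this webinar maps to the `VPCOptions` parameter when creating the domain, for example via boto3's `create_elasticsearch_domain`. A sketch of the parameter shape; instance type, volume size, and the subnet/security-group IDs are placeholders, not recommendations from the webinar:

```python
def vpc_domain_params(domain_name: str, subnet_ids: list,
                      security_group_ids: list) -> dict:
    """Parameters for an Amazon ES domain inside a VPC (shape follows
    boto3 create_elasticsearch_domain; sizes and IDs are placeholders)."""
    return {
        "DomainName": domain_name,
        "ElasticsearchVersion": "5.5",
        "ElasticsearchClusterConfig": {
            "InstanceType": "m4.large.elasticsearch",
            "InstanceCount": 2,
        },
        "EBSOptions": {"EBSEnabled": True, "VolumeType": "gp2", "VolumeSize": 20},
        "VPCOptions": {
            # Endpoints are created in these subnets; the security groups
            # control which hosts in the VPC can reach the domain.
            "SubnetIds": subnet_ids,
            "SecurityGroupIds": security_group_ids,
        },
    }
```

With `VPCOptions` set, the domain gets a private endpoint inside the VPC instead of a public one, and access is governed by the security groups in addition to the domain's access policy.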

Watch recording | Download presentation