Category: Expert (400)
When you combine AWS CodeBuild with Snyk to deploy your infrastructure as code (IaC) project, you gain a repeatable, easy-to-track process that includes security results at every deployment. AWS provides the automation while Snyk provides enhanced security. Learn how to manage an IaC project written with Terraform by HashiCorp and deployed with AWS CodeBuild, and review results before and after a scan, when hidden issues are often revealed.
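A pipeline like this is typically driven by a CodeBuild buildspec. The fragment below is a minimal sketch, not the post's exact configuration; the `terraform/` directory and the `SNYK_TOKEN` environment variable are assumptions.

```yaml
version: 0.2

phases:
  install:
    commands:
      # Install the Snyk CLI (assumes Node.js is available in the build image)
      - npm install -g snyk
  pre_build:
    commands:
      # SNYK_TOKEN is assumed to be injected, e.g. via Secrets Manager
      - snyk auth $SNYK_TOKEN
  build:
    commands:
      # Scan the Terraform files before deploying them
      - snyk iac test terraform/
      - terraform -chdir=terraform init
      - terraform -chdir=terraform apply -auto-approve
```

Because `snyk iac test` exits non-zero when issues are found, the build fails before `terraform apply` runs, which is what makes the security results part of the deployment process rather than an afterthought.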
As companies grow, they often find themselves needing to migrate applications to new architectures that fit their needs. HashiCorp Consul is a service networking solution to automate network configurations, discover services, and enable secure connectivity across any cloud or runtime. With Consul, you can control traffic to and from services across different platforms from a single interface. In this post, explore a solution for routing service requests from Amazon ECS to AWS Lambda using Consul.
Databricks SQL is a dedicated workspace for data analysts that comprises a native SQL editor, drag-and-drop dashboards, and built-in connectors for all major business intelligence tools, all backed by the Photon query engine. In this post, Volker Tjaden, an APN Ambassador from Databricks, shares the technical capabilities of Databricks SQL and walks through two examples: ingesting, querying, and visualizing AWS CloudTrail log data, and building near real-time dashboards on data coming from Amazon Kinesis.
Element 84, in collaboration with AWS and Geoscience Australia, has released the Sentinel-2 Cloud-Optimized GeoTIFF (COG) dataset on AWS Open Data. Sentinel-2 is an important platform for Earth observation, and its imagery contributes to ongoing research in climate change, land use, and emergency management. By making the Sentinel-2 archive more cloud-native, we are making the data more user-friendly and (hopefully) making the lives of emergency managers, climate scientists, and policy makers that much easier.
One of the offerings to help joint AWS and Red Hat customers build for an open hybrid cloud future is Red Hat OpenShift Service on AWS (ROSA), a fully managed OpenShift service jointly supported by both Red Hat and AWS. In this post, we focus on a common use case observed in the field, where a joint enterprise customer has made investments in the AWS Cloud and is now considering expanding and adopting the ROSA service as part of its AWS portfolio.
Global business leaders recognize the value of advanced and augmented big data analytics over various internal and external data sources. However, technical leaders also face challenges capturing insights from data silos without unified master data. Learn how migrating Tamr’s data mastering solutions from on-premises to AWS allowed a customer to process billions of records five times faster with fully managed Amazon EMR clusters.
Within AWS IoT fleet deployments, each connected device needs to have a unique, trusted, and verifiable identity. While the Linux OS provides strong protection from unauthorized user access, it's still possible to compromise the system if the attacker has full physical access to the device. NXP EdgeLock SE050 prevents attackers from extracting the private key even if the physical integrity of the device is compromised and the attacker has managed to gain access to the device filesystem.
Achieving “speed of thought” or instant analytics on large data sets is a key challenge for business intelligence platforms. Traditionally, data engineers would design and deliver an optimized, aggregated subset of the data to a data warehouse to drive the visualization. This can often take weeks of development and testing or incur significant infrastructure costs. Learn how Indexima uses machine learning and hyper indexes to automate this process and accelerate analytics by up to 1000x across a full data set on Amazon S3.
Amazon Managed Streaming for Apache Kafka (Amazon MSK) is a fully managed, highly available, and secure Apache Kafka service that makes it easy to build and run applications that use Kafka to process streaming data. Learn how to use the new open source Kafka Connect connector (StreamReactor) from Lenses.io to query, transform, optimize, and archive data from Amazon MSK to Amazon S3. We'll also demonstrate how to use Amazon Athena to query the partitioned Parquet data directly from S3.
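Once the connector has landed partitioned Parquet files in S3, Athena can query them in place. The statements below are a hedged sketch, not the post's exact DDL; the bucket name, table name, schema, and `dt` partition key are illustrative assumptions.

```sql
-- Hypothetical names; assumes the connector writes Hive-style
-- partitioned Parquet under s3://my-archive-bucket/topics/
CREATE EXTERNAL TABLE msk_archive (
  payload string
)
PARTITIONED BY (dt string)
STORED AS PARQUET
LOCATION 's3://my-archive-bucket/topics/';

-- Register the partitions written so far, then query them directly
MSCK REPAIR TABLE msk_archive;

SELECT payload
FROM msk_archive
WHERE dt = '2021-10-01'
LIMIT 10;
```

Because the `WHERE` clause filters on the partition column, Athena only scans the objects under that partition's prefix, which keeps both query time and per-query cost down.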
Data science is driving significant value for many organizations, including fueling new revenue streams, improving longstanding processes, and optimizing customer experience. Domino Data Lab empowers code-first data science teams to overcome the challenges of building and deploying data science at scale. Learn how to build and export a model from the Domino platform for deployment in Amazon SageMaker. Deploying models within Domino provides insight into the full model lineage.
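Whatever platform a model is exported from, SageMaker expects the artifact to arrive as a gzipped tarball (`model.tar.gz`) in S3. The sketch below shows only that packaging step, using the standard library; the file names (`model.pkl`, `model.tar.gz`) are illustrative assumptions, not Domino's export format.

```python
# Hypothetical sketch: bundle an exported model file into the
# model.tar.gz layout that Amazon SageMaker loads artifacts from.
import os
import tarfile
import tempfile


def package_model(artifact_path: str, output_dir: str) -> str:
    """Bundle a single exported model file into model.tar.gz."""
    tarball = os.path.join(output_dir, "model.tar.gz")
    with tarfile.open(tarball, "w:gz") as tar:
        # arcname keeps the file at the root of the archive,
        # which is where SageMaker serving containers look for it.
        tar.add(artifact_path, arcname=os.path.basename(artifact_path))
    return tarball


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        model_file = os.path.join(tmp, "model.pkl")
        with open(model_file, "wb") as f:
            f.write(b"serialized-model-bytes")  # placeholder artifact
        print(package_model(model_file, tmp))
```

From here, the tarball would be uploaded to S3 and referenced when creating the SageMaker model; those AWS calls are omitted since they depend on account-specific roles and buckets.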