AWS Partner Network (APN) Blog
Network Access Patterns of AWS Lambda for Confluent Cloud
With its event-driven nature, AWS Lambda provides seamless integration with modern-day platforms like Confluent Cloud. Explore best practices to set up network access paths for Lambda when integrating with Confluent Cloud, and review details about various resources like connectors, Kafka endpoints, and private links, as well as ways to establish connectivity between them and Lambda. Founded by the creators of Apache Kafka, Confluent enables organizations to harness business value from streaming data.
Best Practices from Provectus for Migrating and Optimizing Amazon EMR Workloads
Provectus, an AWS Premier Tier Services Partner with the Data and Analytics Competency, helps clients resolve the issues of their legacy, on-premises data platforms by implementing best practices for the migration and optimization of Amazon EMR workloads. This post examines the challenges organizations face along the path to a successful migration, and explores best practices for re-architecting and migrating on-premises data platforms to AWS.
Get a Blockchain App into Production Fast with Hyperledger Fabric and Kaleido
Hyperledger Fabric is a top blockchain protocol choice for enterprise use cases that require a permissioned blockchain network. Fabric supports the full spectrum of levels of decentralization, so it’s no surprise that multiple blockchain-as-a-service platforms exist that support Fabric. However, most of these platforms target the fully decentralized governance model, with all members being equal. Learn how Kaleido makes provisioning a Hyperledger Fabric blockchain network dramatically simpler.
Low Latency Real-Time Cache Updates with Amazon ElastiCache for Redis and Confluent Cloud Kafka
Founded by the creators of Apache Kafka, Confluent offers a platform for data in motion that enables processing data as real-time streams across on-premises environments and AWS. Learn how to power a logistics and inventory system with microsecond read performance from Amazon ElastiCache for Redis and durable streaming from Kafka on Confluent Cloud. You can use this pattern for building streaming applications and a broad set of other use cases with asynchronous requirements for low-latency reads.
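The pattern the post describes boils down to a consumer that reads inventory events from a durable Kafka topic and applies them to a low-latency cache that serves all reads. As a rough, dependency-free sketch of that update logic (an in-memory dict stands in for ElastiCache for Redis, a plain list for the Confluent Cloud topic, and all names are illustrative):

```python
# Minimal sketch of the streaming cache-update pattern.
# A dict stands in for Redis (ElastiCache); a list stands in for a Kafka topic.
# In a real deployment you would use a Kafka consumer client and a Redis
# client instead, but the apply/read split is the same.

cache = {}  # key -> inventory count; the "Redis" side serving fast reads

def apply_event(event):
    """Apply one inventory-change event from the stream to the cache."""
    sku, delta = event["sku"], event["delta"]
    cache[sku] = cache.get(sku, 0) + delta

def read_inventory(sku):
    """Low-latency read path: hit the cache, never replay the stream."""
    return cache.get(sku, 0)

# Simulated Kafka topic: a durable, ordered log of inventory changes.
topic = [
    {"sku": "widget-1", "delta": +10},
    {"sku": "widget-1", "delta": -3},
    {"sku": "widget-2", "delta": +5},
]

for event in topic:  # the consumer loop
    apply_event(event)

print(read_inventory("widget-1"))  # 7
print(read_inventory("widget-2"))  # 5
```

Because the topic is the durable source of truth, the cache can be rebuilt at any time by replaying events from the beginning, which is what makes the asynchronous, eventually consistent read path safe.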
Archiving Amazon MSK Data to Amazon S3 with the Lenses.io S3 Kafka Connect Connector
Amazon Managed Streaming for Apache Kafka (Amazon MSK) is a fully managed, highly available, and secure Apache Kafka service that makes it easy to build and run applications that use Kafka to process streaming data. Learn how to use the new open-source Kafka Connect connector (StreamReactor) from Lenses.io to query, transform, optimize, and archive data from Amazon MSK to Amazon S3. We’ll also demonstrate how to use Amazon Athena to query the partitioned Parquet data directly from S3.
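The archive step is driven by a Kafka Connect sink configuration. The sketch below shows the general shape of such a configuration as a Python dict ready to serialize to JSON; the connector class and `connect.s3.*` property names follow the Lenses.io StreamReactor conventions but should be verified against your connector release, and the bucket, topic, and flush settings are placeholders:

```python
import json

# Illustrative Kafka Connect configuration for the Lenses.io (StreamReactor)
# S3 sink. Property names are assumptions based on the Lenses.io docs --
# check them against the version of the connector you deploy.
s3_sink = {
    "name": "msk-to-s3-archive",
    "config": {
        "connector.class": "io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector",
        "tasks.max": "1",
        "topics": "orders",  # MSK topic(s) to archive
        # KCQL tells the sink where to write, the storage format, and when to flush:
        "connect.s3.kcql": (
            "INSERT INTO my-archive-bucket:orders "
            "SELECT * FROM orders "
            "STOREAS `PARQUET` "
            "WITH_FLUSH_COUNT = 5000"
        ),
    },
}

# This JSON document is what you would POST to the Kafka Connect REST API
# (POST http://<connect-host>:8083/connectors) to create the connector.
print(json.dumps(s3_sink, indent=2))
```

Once the sink has flushed Parquet objects to the bucket, Athena can query them in place with standard SQL over an external table pointed at the partitioned S3 prefix.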
Maintaining Control of PII Hosted on AWS with Hold Your Own Key (HYOK) Security
One of the biggest challenges in moving to the cloud for organizations that collect and process personally identifiable information (PII) is the fundamental change to the trust model. SecuPi minimizes changes to the trust model and reduces the risk associated with digital transformations. Learn how SecuPi can help you collect and process sensitive or regulated PII and reduce barriers to cloud adoption while satisfying the trust model requirements of even the most conservative and risk-averse companies.
Accelerate Data Warehousing by Streaming Data with Confluent Cloud into Amazon Redshift
Built as a cloud-native service, Confluent Cloud offers developers a serverless experience with elastic scaling and pricing that charges only for what they stream. Confluent’s Kafka Connect Amazon Redshift Sink Connector exports Avro, JSON Schema, or Protobuf data from Apache Kafka topics to Amazon Redshift. The connector polls its subscribed Kafka topics and writes the data to an Amazon Redshift database.
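To make the topic-to-Redshift flow concrete, here is the general shape of a Redshift sink configuration as a Python dict ready to serialize to JSON. The connector class and `aws.redshift.*` property names follow Confluent’s documentation for this connector but should be checked against your connector version; the topic, endpoint, and credentials are placeholders:

```python
import json

# Illustrative configuration for the Confluent Redshift Sink Connector.
# Property names are assumptions drawn from Confluent's docs -- verify them
# against the connector release you run. Endpoint and credentials are fake.
redshift_sink = {
    "name": "redshift-sink",
    "config": {
        "connector.class": "io.confluent.connect.aws.redshift.RedshiftSinkConnector",
        "tasks.max": "1",
        "topics": "orders",  # subscribed Kafka topic(s) to export
        "aws.redshift.domain": "my-cluster.abc123.us-east-1.redshift.amazonaws.com",
        "aws.redshift.port": "5439",
        "aws.redshift.database": "dev",
        "aws.redshift.user": "awsuser",
        "aws.redshift.password": "********",
        "auto.create": "true",  # create the target table if it does not exist
        # Records must carry a schema (Avro, JSON Schema, or Protobuf):
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081",
    },
}

print(json.dumps(redshift_sink, indent=2))
```

Because the sink relies on the record schema to derive the Redshift table layout, the converter settings are as important as the connection settings.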
Analyze Streaming Data from Amazon Managed Streaming for Apache Kafka Using Snowflake
When streaming data comes in from a variety of sources, organizations should have the capability to ingest this data quickly and join it with other relevant business data to derive insights and provide positive experiences to customers. Learn how you can run a fully managed, Apache Kafka-compatible Amazon MSK cluster to ingest streaming data, and explore how to use a Kafka Connect application to persist this data to Snowflake. This enables businesses to derive near real-time insights into end users’ experiences and feedback.
Building a Business Case for SAP on AWS to Unlock New Value for the Enterprise
Many organizations will find the migration to S/4HANA in the cloud to be as transformative as their initial adoption of enterprise resource planning (ERP) software. Choosing trusted providers that understand both your business needs and technology environment is the shortest path to value. Deloitte’s collaboration with AWS helps clients accelerate their ability to achieve digital transformation and powerful insights-based outcomes with SAP S/4HANA and the SAP HANA platform.
How Behalf Met its Streaming Data Scaling Demands with Amazon Managed Streaming for Apache Kafka
To be successful, fintech startups have to build solutions fast so the business can achieve its goals. However, they can’t compromise on security, reliability, or support. As an AWS Financial Services Competency Partner, Behalf is committed to delivering reliable, secure, low-cost payment processing and credit options to business customers. Learn how Behalf chose Amazon MSK to meet its increasing streaming data needs in a reliable and cost-efficient manner.