AWS Web3 Blog

Category: Blockchain

Processing digital asset payments on AWS

In this post, we explain how blockchain-based digital asset payment systems can reduce costs and delays. We demonstrate how to build a serverless payment system on AWS, using stablecoins such as USDC, PYUSD, or USDG as an example. This solution creates a low-cost, scalable, and decentralized alternative to legacy payment methods. The implementation is available in our GitHub repository.
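One detail any stablecoin payment system must get right is amount handling: on-chain ERC-20 balances are integers in the token's smallest unit, and USDC and PYUSD use 6 decimal places. The following is a minimal sketch of that conversion (function names are illustrative; a real system should read `decimals()` from the token contract rather than hard-coding it):

```python
from decimal import Decimal

def to_base_units(amount: str, decimals: int = 6) -> int:
    """Convert a human-readable amount (e.g. "12.50") to integer base units."""
    quantized = Decimal(amount).scaleb(decimals)
    if quantized != quantized.to_integral_value():
        raise ValueError("amount has more precision than the token supports")
    return int(quantized)

def from_base_units(units: int, decimals: int = 6) -> Decimal:
    """Convert integer base units back to a human-readable Decimal."""
    return Decimal(units).scaleb(-decimals)
```

Using `Decimal` instead of floating point avoids rounding errors that would otherwise accumulate in payment totals.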

Improve Solana node performance and reduce costs on AWS

Solana Agave v2.0.14 was released on October 18, 2024. Since then, Solana node operators have reported that their nodes occasionally struggle to stay in sync with the latest slots on mainnet-beta; searching for “catch up” on Solana’s StackExchange reveals numerous discussions of this challenge. In an earlier post, we explained how to run Solana […]

Accelerate Ethereum synchronization time with storage-optimized Amazon EC2 instances

Syncing an Ethereum node can be a time-consuming and costly process if not well optimized, requiring the right balance between speed and security. Compute requirements differ between the initial synchronization phase and the steady-state phase, in which the node only needs to process new blocks (for additional details, refer to Synchronization modes in the Ethereum documentation). This challenge can be addressed by using the Amazon EC2 instance type that matches each phase's requirements. In this post, we demonstrate how to use the latest generation of storage-optimized EC2 instances during the synchronization process, then switch back to right-sized memory-optimized instances for the steady-state phase to minimize cost.
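The phase-dependent switch described above can be sketched with boto3. The instance types and instance ID here are illustrative, and the sketch assumes the node's chain data lives on an EBS volume, which survives an instance type change (local instance-store NVMe does not, so data on it would need to be copied or re-synced):

```python
# Illustrative mapping of sync phase to EC2 instance type.
PHASE_TO_TYPE = {
    "sync": "i4i.2xlarge",    # storage-optimized: fast initial sync
    "steady": "r6i.2xlarge",  # memory-optimized: cheaper steady state
}

def target_type(phase: str) -> str:
    return PHASE_TO_TYPE[phase]

def resize_node(instance_id: str, phase: str) -> None:
    """Stop the node, change its instance type, and start it again."""
    import boto3  # assumed available where this runs
    ec2 = boto3.client("ec2")
    ec2.stop_instances(InstanceIds=[instance_id])
    ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])
    ec2.modify_instance_attribute(
        InstanceId=instance_id,
        InstanceType={"Value": target_type(phase)},
    )
    ec2.start_instances(InstanceIds=[instance_id])
```

For example, `resize_node("i-0123456789abcdef0", "steady")` would downshift a fully synced node to the cheaper memory-optimized type.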

Implement a USDC bridge on AWS

Stablecoins offer significant advantages in the crypto space. They provide price stability and can serve as a reliable medium of exchange, store of value, or bridge between the fiat and crypto ecosystems. The ability to transfer stablecoins across multiple blockchains further enhances their utility by improving cross-chain interoperability and letting users take advantage of the […]

How Derive scaled their low-latency, decentralized trading platform using AWS Graviton, Amazon EKS, and Amazon Aurora

In this post, we share how Derive successfully scaled their hybrid decentralized trading platform to achieve billions of dollars in trading volume and low-latency execution on a robust compute and database infrastructure built with AWS Graviton on Amazon Elastic Kubernetes Service (Amazon EKS) and Amazon Aurora. We explore Derive’s hybrid exchange model and how AWS played a crucial role in their growth and scalability.

Build a real-world asset tokenization solution on AWS with Fireblocks

Explore a reference architecture for real-world asset tokenization that integrates AWS services and Fireblocks’ tokenization SaaS platform with existing financial services infrastructure. By combining distributed ledger technology with well-architected, secure, and resilient cloud architecture patterns, this approach not only demonstrates the feasibility of asset tokenization but also highlights its potential to enhance efficiency, transparency, and accessibility in today’s digital economy. As businesses continue to explore the possibilities of blockchain, this architecture serves as a solid foundation for more complex and scalable tokenization solutions in the future.

Build crypto AI agents on Amazon Bedrock

As Web3 and generative AI technologies continue to rapidly evolve, a new category of applications known as crypto AI agents has emerged. These agents use large language models (LLMs) to accomplish a variety of blockchain-related tasks through a supervisor-collaborator architecture: a supervisor agent orchestrates specialized collaborator agents to analyze blockchain data, identify […]
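The supervisor-collaborator pattern can be sketched in plain Python. The agent names and keyword routing below are illustrative; in the post itself, this orchestration is handled by agents built on Amazon Bedrock rather than hand-rolled dispatch:

```python
from typing import Callable, Dict

# Hypothetical collaborator agents, each specialized for one task domain.
def onchain_data_agent(task: str) -> str:
    return f"[onchain-data] analyzed: {task}"

def market_agent(task: str) -> str:
    return f"[market] analyzed: {task}"

COLLABORATORS: Dict[str, Callable[[str], str]] = {
    "blockchain": onchain_data_agent,
    "market": market_agent,
}

def supervisor(task: str) -> str:
    """Route a task to a specialized collaborator by simple keyword match."""
    for keyword, agent in COLLABORATORS.items():
        if keyword in task.lower():
            return agent(task)
    return f"[supervisor] no collaborator for: {task}"
```

In a real agent framework, the routing decision itself would be made by an LLM rather than a keyword lookup, but the control flow is the same: one supervisor, many delegates.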

Use a DAO to govern LLM training data, Part 4: MetaMask authentication

In Part 1 of this series, we introduced the concept of using a decentralized autonomous organization (DAO) to govern the lifecycle of an AI model, focusing on the ingestion of training data. In Part 2, we created and deployed a minimalistic smart contract on the Ethereum Sepolia testnet using Remix and MetaMask, establishing a mechanism to govern which training data can be uploaded to the knowledge base and by whom. In Part 3, we set up Amazon API Gateway and deployed AWS Lambda functions to copy data from InterPlanetary File System (IPFS) to Amazon Simple Storage Service (Amazon S3) and start a knowledge base ingestion job, creating a seamless data flow from IPFS to the knowledge base. In this post, we demonstrate how to configure MetaMask authentication, create a frontend interface, and test the solution.
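MetaMask sign-in typically follows a nonce flow: the backend issues a one-time nonce, the wallet signs it with `personal_sign`, and the backend recovers the signer's address from the signature. The sketch below shows only the nonce bookkeeping; the message format and function names are assumptions, and the actual signature recovery step would use a library such as eth-account (not shown):

```python
import secrets

NONCES: dict[str, str] = {}  # wallet address -> outstanding nonce

def issue_nonce(address: str) -> str:
    """Generate a one-time nonce for the wallet to sign via personal_sign."""
    nonce = secrets.token_hex(16)
    NONCES[address.lower()] = nonce
    return f"Sign in to the DAO frontend.\nNonce: {nonce}"

def consume_nonce(address: str, nonce: str) -> bool:
    """After signature recovery yields `address`, check and burn its nonce."""
    return NONCES.pop(address.lower(), None) == nonce
```

Burning the nonce on first use (via `pop`) is what prevents a captured signature from being replayed.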

Use a DAO to govern LLM training data, Part 3: From IPFS to the knowledge base

In Part 1 of this series, we introduced the concept of using a decentralized autonomous organization (DAO) to govern the lifecycle of an AI model, focusing on the ingestion of training data. In Part 2, we created and deployed a minimalistic smart contract on the Ethereum Sepolia testnet using Remix and MetaMask, establishing a mechanism to govern which training data can be uploaded to the knowledge base and by whom. In this post, we set up Amazon API Gateway and deploy AWS Lambda functions to copy data from InterPlanetary File System (IPFS) to Amazon Simple Storage Service (Amazon S3) and start a knowledge base ingestion job.
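The IPFS-to-S3 copy step can be sketched as a Lambda handler that fetches a content identifier (CID) through an HTTP gateway and writes the bytes to S3. The gateway host, bucket name, and event shape here are assumptions for illustration; in the post this function sits behind Amazon API Gateway:

```python
import urllib.request

IPFS_GATEWAY = "https://ipfs.io"  # illustrative public gateway

def gateway_url(cid: str) -> str:
    """Build an HTTP gateway URL for a given IPFS content identifier."""
    return f"{IPFS_GATEWAY}/ipfs/{cid}"

def handler(event, context):
    import boto3  # available in the Lambda Python runtime
    cid = event["cid"]
    with urllib.request.urlopen(gateway_url(cid)) as resp:
        body = resp.read()
    boto3.client("s3").put_object(
        Bucket="my-knowledge-base-bucket",  # hypothetical bucket name
        Key=f"ipfs/{cid}",
        Body=body,
    )
    return {"statusCode": 200, "copied": cid}
```

A production version would also validate the CID, stream large objects instead of buffering them in memory, and read the bucket name from an environment variable.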

Use a DAO to govern LLM training data, Part 2: The smart contract

In Part 1 of this series, we introduced the concept of using a decentralized autonomous organization (DAO) to govern the lifecycle of an AI model, specifically focusing on the ingestion of training data. In this post, we focus on writing and deploying the Ethereum smart contract that records the outcomes of the DAO's decisions.
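To make the contract's role concrete, here is a Python model of the kind of allowlist logic such a governance contract enforces on-chain: which addresses may upload training data, and which data CIDs have been approved. The class and field names are illustrative, not the actual contract's interface:

```python
class TrainingDataRegistry:
    """Off-chain model of an on-chain allowlist governance contract."""

    def __init__(self, owner: str):
        self.owner = owner.lower()
        self.uploaders: set[str] = set()
        self.approved_cids: set[str] = set()

    def add_uploader(self, caller: str, uploader: str) -> None:
        # Mirrors an onlyOwner-style modifier in Solidity.
        if caller.lower() != self.owner:
            raise PermissionError("only the DAO owner may add uploaders")
        self.uploaders.add(uploader.lower())

    def approve_cid(self, caller: str, cid: str) -> None:
        if caller.lower() not in self.uploaders:
            raise PermissionError("caller is not an approved uploader")
        self.approved_cids.add(cid)

    def is_approved(self, cid: str) -> bool:
        return cid in self.approved_cids
```

On-chain, the same checks would be written in Solidity with `require` statements and the caller taken from `msg.sender`; the point of the model is just the access-control shape.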