AWS Developer Tools Blog

Introducing Transaction Support in aws-record

The aws-record Ruby Gem is a data mapper abstraction layer over Amazon DynamoDB, a key-value and document database that delivers single-digit millisecond performance at any scale.

Recently, support for transactions was added to DynamoDB. DynamoDB transactions simplify the developer experience of making coordinated, all-or-nothing changes to multiple items both within and across tables. Transactions provide atomicity, consistency, isolation, and durability (ACID) in DynamoDB, enabling you to maintain data correctness in your applications easily.

Today, we’re launching support for DynamoDB transactional find and write operations directly from aws-record. This support lets you enjoy the benefits of transactional operations while still using the object abstractions that aws-record provides.

How to use DynamoDB transactions by example

Let’s assume we have two tables for a game where players can buy items with virtual coins (much like Danilo’s example from the announcement post). In aws-record, you might model the two tables this way:

class GameItem
  include Aws::Record
  string_attr  :id, hash_key: true
  string_attr  :name
  integer_attr :price
  integer_attr :update_serial, default_value: 1
  boolean_attr :available, default_value: true
  string_attr  :owned_by
end

class GamePlayer
  include Aws::Record
  string_attr  :id, hash_key: true
  integer_attr :coins
  list_attr    :items, default_value: []
  boolean_attr :active
  integer_attr :update_serial, default_value: 1
end

Although we could potentially design this as a single table, using two tables here also helps demonstrate that a single transaction can span multiple DynamoDB tables.
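If you want to follow along, here is one way you might create these tables with aws-record’s migration helper. (This is just a sketch; the throughput values are arbitrary for a demo, and the table names default to the model class names.)

require 'aws-record'

# Create both demo tables and wait for them to become ACTIVE.
[GameItem, GamePlayer].each do |model|
  migration = Aws::Record::TableMigration.new(model)
  migration.create!(
    provisioned_throughput: { read_capacity_units: 5, write_capacity_units: 5 }
  )
  migration.wait_until_available
end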

Transactional finds

The aws-record gem provides two approaches to perform transactional finds:

  • The Aws::Record::Transactions.transact_find operation, which allows you to create transactional finds that span multiple tables and automatically marshals the responses into an array of aws-record item objects of the appropriate model classes.
  • Calling the .transact_find operation directly on an aws-record model class. This is useful when your find transaction does not span multiple tables, and allows your code to be more concise.

Let’s take a look at two side-by-side examples using the tables we defined above. First, let’s get a set of items using transactional find operations called directly on each model class.

items = GameItem.transact_find(
  transact_items: [
    { key: { id: 'item-1-id' }},
    { key: { id: 'item-id-not-found' }},
    { key: { id: 'item-2-id' }}
  ]
).responses
items.map { |i| i.class } # => [GameItem, NilClass, GameItem]

players = GamePlayer.transact_find(
  transact_items: [
    { key: { id: 'player-1-id' }},
    { key: { id: 'player-2-id' }}
  ]
).responses
players.map { |i| i.class } # => [GamePlayer, GamePlayer]

The model-level transactional find API expects, for each item, a hash containing a :key sub-hash with the key attributes of the item to retrieve. You can also optionally pass extra parameters for the request, such as a projection expression.
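For example, here’s a sketch that projects only a few attributes from each item, reusing the hypothetical item IDs from above. (The attribute names come from the GameItem model; name is a DynamoDB reserved word, so it needs an expression attribute name.)

items = GameItem.transact_find(
  transact_items: [
    {
      key: { id: 'item-1-id' },
      projection_expression: 'id, #N, price',
      expression_attribute_names: { '#N' => 'name' }
    },
    {
      key: { id: 'item-2-id' },
      projection_expression: 'id, #N, price',
      expression_attribute_names: { '#N' => 'name' }
    }
  ]
).responses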

You can see that the responses array contains aws-record items, but can also contain nil when an item in your transaction isn’t found.
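Because the responses come back in the same order as the requests, you can pair them with the keys you asked for to see which items were missing. For example, with the items array from the earlier call:

requested_ids = ['item-1-id', 'item-id-not-found', 'item-2-id']
missing_ids = requested_ids.zip(items).select { |_, item| item.nil? }.map(&:first)
# => ["item-id-not-found"]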

How would we do the same thing in a single transactional find call? That’s where the top-level variant comes in handy.

objects = Aws::Record::Transactions.transact_find(
  transact_items: [
    GameItem.tfind_opts(key: { id: 'item-1-id' }),
    GameItem.tfind_opts(key: { id: 'item-id-not-found' }),
    GameItem.tfind_opts(key: { id: 'item-2-id' }),
    GamePlayer.tfind_opts(key: { id: 'player-1-id' }),
    GamePlayer.tfind_opts(key: { id: 'player-2-id' })
  ]
).responses
objects.map { |i| i.class } # => [GameItem, NilClass, GameItem, GamePlayer, GamePlayer]

The top-level transactional find API involves writing a bit more code, but enables you to mix and match different tables and models in a single batch of finds.

Transactional writes

The transactional write API provides a way to pass aws-record items into transactional writes. It also lets you run “save” operations in a transaction: aws-record ports its #save logic (a conditional put or an update, depending on how the item was created and what has changed) to the DynamoDB transactional operations, and determines whether a :put or :update is most appropriate.
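As a quick sketch (with hypothetical IDs), a :save entry performs a conditional put for an item that has never been persisted and an update for an item loaded from DynamoDB:

new_item = GameItem.new(id: 'item-3-id', name: 'Shield', price: 5)
player = GamePlayer.find(id: 'player-1-id')
player.coins += 10

Aws::Record::Transactions.transact_write(
  transact_items: [
    { save: new_item }, # no persisted state, so aws-record uses a conditional put
    { save: player }    # loaded from DynamoDB, so aws-record uses an update
  ]
)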

Let’s look at how we would perform transactional writes to implement our game example. Suppose we want to take a player ID and an item ID from our game, and attempt to “buy” that item as that player.

def buy_item(player_id, item_id)
  # Read both records in a single transactional find.
  player, item = Aws::Record::Transactions.transact_find(
    transact_items: [
      GamePlayer.tfind_opts(key: { id: player_id }),
      GameItem.tfind_opts(key: { id: item_id })
    ]
  ).responses
  raise ArgumentError if player.nil? || item.nil?
  # Apply the purchase locally; the condition expressions below ensure the
  # write only succeeds if nothing changed underneath us.
  player.coins -= item.price
  item.available = false
  item.owned_by = player.id
  player.items << item.id
  # update_serial acts as an optimistic lock: each write asserts the serial
  # value it read, then bumps it by one.
  player.update_serial += 1
  item.update_serial += 1
  Aws::Record::Transactions.transact_write(
    transact_items: [
      {
        update: player,
        condition_expression: "#COINS >= :p AND #SERIAL = :s",
        expression_attribute_names: {
          '#COINS' => 'coins',
          '#SERIAL' => 'update_serial'
        },
        expression_attribute_values: {
          ':p' => (item.price),
          ':s' => (player.update_serial - 1)
        }
      },
      {
        update: item,
        condition_expression: "#AVAIL = :true AND #SERIAL = :s",
        expression_attribute_names: {
          '#AVAIL' => 'available',
          '#SERIAL' => 'update_serial'
        },
        expression_attribute_values: {
          ':true' => true,
          ':s' => (item.update_serial - 1)
        }
      }
    ]
  )
end

Because transactional writes either all fail or all succeed, you can avoid race conditions where, for example, multiple players are recorded as owning the same item, or coin/item inventories don’t line up. This remains true even if you’re using multiple tables in your application.
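If another writer wins the race, the conditional checks fail and the DynamoDB client raises a transaction cancellation error; nothing is partially written. A minimal sketch of handling that with the buy_item method above:

begin
  buy_item('player-1-id', 'item-1-id')
rescue Aws::DynamoDB::Errors::TransactionCanceledException => e
  # None of the writes were applied, so it is safe to re-read and retry.
  puts "Purchase failed: #{e.message}"
end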

You can also add checks to your transactions. Imagine we had a global GameState table, which we used, in part, to determine whether any transactions were allowed. We could add a check expression, where an item is checked as part of the transaction but not changed, like so:

Aws::Record::Transactions.transact_write(
  transact_items: [
    {
      update: player,
      condition_expression: "#COINS >= :p AND #SERIAL = :s",
      expression_attribute_names: {
        '#COINS' => 'coins',
        '#SERIAL' => 'update_serial'
      },
      expression_attribute_values: {
        ':p' => (item.price),
        ':s' => (player.update_serial - 1)
      }
    },
    {
      update: item,
      condition_expression: "#AVAIL = :true AND #SERIAL = :s",
      expression_attribute_names: {
        '#AVAIL' => 'available',
        '#SERIAL' => 'update_serial'
      },
      expression_attribute_values: {
        ':true' => true,
        ':s' => (item.update_serial - 1)
      }
    },
    {
      check: GameState.transact_check_expression(
        key: { config_set_id: "GLOBAL" },
        condition_expression: "#A = :true",
        expression_attribute_names: {
          "#A" => "purchases_active"
        },
        expression_attribute_values: {
          ":true" => true
        }
      )
    }
  ]
)

This variant would also fail to complete the item purchase transaction if purchases were globally disabled for any reason, even if the player and item otherwise meet the transaction conditions.
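For completeness, the GameState model used in that check might look something like this (a sketch, mirroring the key and attribute names in the check expression):

class GameState
  include Aws::Record
  string_attr  :config_set_id, hash_key: true
  boolean_attr :purchases_active
end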

Conclusion

With this new feature in aws-record, you can take advantage of DynamoDB’s support for transactions while continuing to represent your DynamoDB items as objects and leveraging features like intelligent save behavior.

What can we deliver next to make your experience of using DynamoDB in Ruby even easier? Let me know!