How can I load data to Amazon Redshift from my Salesforce account using Amazon AppFlow?

Last updated: 2022-12-02

I want to extract data from my Salesforce account and load it to Amazon Redshift.

Short description

To extract data from your Salesforce account and load it to Amazon Redshift, you must:

  1. Confirm that your Salesforce edition provides API access to Amazon AppFlow.
  2. Check that your Salesforce account meets the requirements to load data to Amazon Redshift.
  3. Run the flow on demand in Amazon AppFlow to test the connectivity and the flow end to end.

Note: To load your Salesforce data to Amazon Redshift, your Amazon Redshift cluster must be publicly accessible.

Resolution

  1. Log in to your Salesforce account.
  2. Choose Setup, and then, in the search window, enter Company Information.
  3. Under Organization edition, note the edition of Salesforce that you're using.

    Note: Salesforce Enterprise, Unlimited, Developer, and Performance editions provide API access, but Professional and Essentials editions don't. Check the latest information on which editions provide API access before continuing.

  4. In the navigation pane, choose Users, and then choose Profiles.
  5. In the list of profiles, choose the System Administrator link.
  6. On the next page, under Administrative Permissions, select the check box for API enabled.
  7. Log in to the Amazon AppFlow console, and choose Create flow.
  8. Under Flow details, enter a name and description for your flow.
  9. For Source name, choose Salesforce, and then, under Choose Salesforce connection, choose Create new connection.
  10. Under Salesforce environment, choose Production, and then enter a name for your connection.
  11. Choose Continue. This opens the Salesforce login page. Enter your user ID and password to continue. After you log in and allow access, the connection from Amazon AppFlow to Salesforce is established.
  12. On the Configure flow page, choose the Salesforce object that you want to migrate.
  13. In the Destination details section, for Destination name, search for and then choose Amazon Redshift.
  14. Under New connection, choose Create new connection.
  15. On the Connect to Amazon Redshift page, enter the details of your Amazon Redshift connection. Be sure that Amazon Redshift has a public security group.
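If you prefer to script this step, the same connection details can be supplied through the Amazon AppFlow CreateConnectorProfile API. The payload below is a minimal sketch built with the AWS SDK for Python (Boto3) request shape; the profile name, JDBC URL, bucket, role ARN, and credentials are placeholder assumptions that you must replace with your own values.

```python
# Sketch of an Amazon AppFlow CreateConnectorProfile request for Amazon Redshift.
# Every concrete value (profile name, URL, bucket, role ARN, credentials) is a
# placeholder assumption -- replace with your own before calling the API.

redshift_profile_request = {
    "connectorProfileName": "my-redshift-connection",  # hypothetical name
    "connectorType": "Redshift",
    "connectionMode": "Public",  # the cluster must be publicly accessible
    "connectorProfileConfig": {
        "connectorProfileProperties": {
            "Redshift": {
                "databaseUrl": "jdbc:redshift://example-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev",
                "bucketName": "my-appflow-intermediate-bucket",  # S3 staging bucket
                "roleArn": "arn:aws:iam::111122223333:role/my-redshift-role",
            }
        },
        "connectorProfileCredentials": {
            "Redshift": {"username": "awsuser", "password": "example-password"}
        },
    },
}

# With AWS credentials configured, the request would be sent as:
#   import boto3
#   boto3.client("appflow").create_connector_profile(**redshift_profile_request)

# Local sanity checks on the payload shape:
props = redshift_profile_request["connectorProfileConfig"][
    "connectorProfileProperties"]["Redshift"]
assert props["databaseUrl"].startswith("jdbc:redshift://")
assert redshift_profile_request["connectionMode"] == "Public"
```

Because the cluster must be publicly accessible, connectionMode is set to Public in this sketch.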
  16. Attach this AWS Key Management Service (AWS KMS) decrypt policy to the AWS Identity and Access Management (IAM) role for Amazon Redshift:
    {
      "Effect": "Allow",
      "Action": "kms:Decrypt",
      "Resource": "*"
    }
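The statement above must sit inside a complete IAM policy document before you attach it to the role. The following is a minimal sketch; the role name and inline policy name in the comment are hypothetical examples, not values from this article.

```python
import json

# Complete IAM policy document wrapping the kms:Decrypt statement from the
# step above.
kms_decrypt_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "kms:Decrypt",
            "Resource": "*",
        }
    ],
}

policy_json = json.dumps(kms_decrypt_policy, indent=2)
print(policy_json)

# The document could then be attached as an inline policy, for example:
#   import boto3
#   boto3.client("iam").put_role_policy(
#       RoleName="my-redshift-role",       # hypothetical role name
#       PolicyName="appflow-kms-decrypt",  # hypothetical policy name
#       PolicyDocument=policy_json,
#   )
```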
  17. Choose the Amazon Simple Storage Service (Amazon S3) bucket that you want Amazon AppFlow to use when migrating data to Amazon Redshift.
  18. Under Choose Amazon Redshift object, enter the Amazon Redshift schema name. Then, under Choose Amazon Redshift table, enter the Amazon Redshift table name.
  19. Under Error handling, choose the Amazon S3 bucket that you want Amazon AppFlow to write records to if an issue prevents records from being written to the destination.
  20. On the Map data fields page, choose the object fields that you want to migrate from the source to the destination. Then, choose Map fields.
  21. Review all your details, and then choose Run flow.
  22. After the flow runs, check the records in the destination table.
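The console steps above can also be sketched programmatically with the Amazon AppFlow CreateFlow API. The payload below outlines an on-demand Salesforce-to-Redshift flow; the flow name, connection names, Salesforce object, schema.table target, buckets, and field mapping are all placeholder assumptions, and the task list is simplified to a single mapped field.

```python
# Sketch of an Amazon AppFlow CreateFlow request mirroring the console steps.
# Every concrete value (names, object, table, buckets, fields) is a placeholder.

create_flow_request = {
    "flowName": "salesforce-to-redshift",          # hypothetical flow name
    "triggerConfig": {"triggerType": "OnDemand"},  # run on demand to test end to end
    "sourceFlowConfig": {
        "connectorType": "Salesforce",
        "connectorProfileName": "my-salesforce-connection",
        "sourceConnectorProperties": {
            "Salesforce": {"object": "Account"}    # the Salesforce object to migrate
        },
    },
    "destinationFlowConfigList": [
        {
            "connectorType": "Redshift",
            "connectorProfileName": "my-redshift-connection",
            "destinationConnectorProperties": {
                "Redshift": {
                    "object": "public.account",    # schema.table in Amazon Redshift
                    "intermediateBucketName": "my-appflow-intermediate-bucket",
                    "errorHandlingConfig": {
                        "bucketName": "my-appflow-error-bucket"  # error-record bucket
                    },
                }
            },
        }
    ],
    "tasks": [
        {   # one Map task per migrated field (the console's Map fields step);
            # a real flow usually maps several fields
            "taskType": "Map",
            "sourceFields": ["Id"],
            "destinationField": "Id",
            "taskProperties": {},
        }
    ],
}

# With AWS credentials configured, the flow would be created and then run with:
#   import boto3
#   appflow = boto3.client("appflow")
#   appflow.create_flow(**create_flow_request)
#   appflow.start_flow(flowName=create_flow_request["flowName"])

# Local sanity checks on the payload shape:
assert create_flow_request["triggerConfig"]["triggerType"] == "OnDemand"
assert create_flow_request["destinationFlowConfigList"][0]["connectorType"] == "Redshift"
```

After the run, you can confirm the load in Amazon Redshift with a simple count, for example: SELECT COUNT(*) FROM public.account;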
