Simplifying SaaS integration using AWS Native services

In a previous post, I tested the new EventBridge API Destinations functionality and successfully integrated MailChimp and MS Teams. In this post, I’ll detail how to integrate with Salesforce using API Destinations.

In our scenario, Magento publishes events to EventBridge when a new customer registers or when an existing customer is updated. Upon this event, we require a Contact to be created (or updated) in Salesforce. Previously, a Lambda function (ideally preceded by an SQS queue) did the actual integration. Using API Destinations, we now can (in some use cases) drop this queue and Lambda function.

This blog consists of the following sections:

  • Set up API Destinations for the Salesforce REST API
  • Appendix A: Create the Salesforce Connected App
  • Appendix B: Salesforce OAuth2 Username-Password flow using Postman
  • Appendix C: Testing the EventBridge DLQ configuration

Set up API Destinations for the Salesforce REST API

We’ll use the UPSERT functionality of the Salesforce REST API. Using a custom external ID field on the Salesforce Contact object, we can identify the resource by the unique ID of the source system (Magento in our case). The API call looks like this:
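The exact request depends on your API version and external ID field. A sketch, assuming a hypothetical external ID field named Magento_Id__c on the Contact object, where 10042 is the Magento customer ID:

```
PATCH /services/data/v51.0/sobjects/Contact/Magento_Id__c/10042
Host: <your-instance>.my.salesforce.com
Authorization: Bearer <access_token>
Content-Type: application/json

{
  "FirstName": "Jane",
  "LastName": "Doe",
  "Email": "jane.doe@example.com"
}
```

Salesforce creates the Contact if no record with that external ID exists, and updates it otherwise.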

Now let’s create the CloudFormation template for this. (Note that the web console also allows us to create the full setup.)

Creating the “AWS::Events::Connection” resources

To connect to Salesforce, the only way (as of May 2021) is to use the “OAuth 2.0 Username-Password Flow for Special Scenarios”. For this, you’ll need to configure a Connected App (see Appendix A of this blog). The Connected App provides the client ID and the client secret.

You also need to create an integration user in Salesforce and provide its username and password in the Connection configuration. Make sure that this user has the absolute minimum required permissions! Note that Salesforce does not recommend this OAuth flow:

Use it only if there’s a high degree of trust between the resource owner and the client, the client is a first-party app, Salesforce is hosting the data, and other grant types aren’t available. In these cases, set user permissions to minimize access and protect stored credentials from unauthorized access.
Source: Salesforce Documentation
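A minimal sketch of the Connection resource, assuming the client ID, client secret, username, and password are passed in as template parameters (the parameter names are mine):

```yaml
SalesforceConnection:
  Type: AWS::Events::Connection
  Properties:
    AuthorizationType: OAUTH_CLIENT_CREDENTIALS
    AuthParameters:
      OAuthParameters:
        AuthorizationEndpoint: https://login.salesforce.com/services/oauth2/token
        HttpMethod: POST
        ClientParameters:
          ClientID: !Ref SalesforceClientId
          ClientSecret: !Ref SalesforceClientSecret
        OAuthHttpParameters:
          BodyParameters:
            # Extra body parameters for the username-password flow
            - Key: grant_type
              Value: password
            - Key: username
              Value: !Ref SalesforceUsername
            - Key: password
              Value: !Ref SalesforcePassword
              IsValueSecret: true
```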

The CloudFormation snippet above shows the required configuration for EventBridge to connect to Salesforce. To better understand this OAuth flow, I’ve added the HTTP call EventBridge makes as a Postman example (see Appendix B).

Creating the “AWS::Events::ApiDestination” resources

Using the API Destination resource, you configure the actual REST API endpoint of the downstream service. As explained above, we need an endpoint that includes the unique ID of the customer. For this, we add the wildcard “*” in the provided URL.
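A sketch of the resource, with a placeholder instance URL and a hypothetical external ID field Magento_Id__c:

```yaml
SalesforceApiDestination:
  Type: AWS::Events::ApiDestination
  Properties:
    ConnectionArn: !GetAtt SalesforceConnection.Arn
    HttpMethod: PATCH
    # The "*" wildcard is filled in per event via the Rule's PathParameterValues
    InvocationEndpoint: https://<your-instance>.my.salesforce.com/services/data/v51.0/sobjects/Contact/Magento_Id__c/*
    InvocationRateLimitPerSecond: 10
```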

Also note the rate-limiting configuration to protect our downstream Salesforce service. This is important, as Salesforce throttles incoming calls based on the inbound load.

Creating the “AWS::Events::Rule” resource

In the actual EventBridge Rule, you map the inbound event to the Salesforce REST API data model.

Using the HttpParameters > PathParameterValues setting, you populate the wildcard in the configured ApiDestination URL. It should map to the unique ID of the Magento customer in the event payload.
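A sketch of the Rule, where the event pattern and the Magento field names (customer_id, firstname, …) are assumptions, and the target role (which needs events:InvokeApiDestination) is assumed to exist:

```yaml
CustomerEventRule:
  Type: AWS::Events::Rule
  Properties:
    EventBusName: !Ref MagentoEventBus
    EventPattern:
      detail-type:
        - customer_created
        - customer_updated
    Targets:
      - Id: salesforce-api-destination
        Arn: !GetAtt SalesforceApiDestination.Arn
        RoleArn: !GetAtt EventBridgeTargetRole.Arn
        HttpParameters:
          # Fills the "*" wildcard in the ApiDestination URL
          PathParameterValues:
            - $.detail.customer_id
        InputTransformer:
          # Map the inbound event to the Salesforce Contact data model
          InputPathsMap:
            firstname: $.detail.firstname
            lastname: $.detail.lastname
            email: $.detail.email
          InputTemplate: '{"FirstName": <firstname>, "LastName": <lastname>, "Email": <email>}'
```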

On this Rule, I’ve also configured a Dead Letter Queue to make sure no events get lost, e.g. in case of unexpected payloads, when Salesforce is not available, or when your calls get throttled and the configured Retry Policy is exhausted. Let’s test this out in Appendix C.
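On the target, this comes down to adding a DeadLetterConfig (and optionally tuning the RetryPolicy). A fragment, with the queue name a placeholder; note the SQS queue needs a resource policy allowing events.amazonaws.com to send messages:

```yaml
        DeadLetterConfig:
          Arn: !GetAtt EventsDeadLetterQueue.Arn
        RetryPolicy:
          MaximumEventAgeInSeconds: 3600  # give up after one hour
          MaximumRetryAttempts: 10
```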

That’s about it!

I’ve integrated Salesforce as a downstream service of the Customer events, using only the Amazon EventBridge service. For more thoughts on EventBridge and API Destinations, check out my previous post.

Appendix A: Create the Salesforce Connected App

Besides the AWS side of things, you’ll also need to configure Salesforce. For this, you need to create a Connected App. You’ll find the full documentation here. When configuring for production use, take some time to research these configurations (which OAuth scopes to use, IP relaxation requirements, permitted users, …). The steps depicted below will get you started.

Step 1: Go to App Manager and click “New Connected App”.

Creating Connected App 1/4

Step 2: Configure the app and enable OAuth

Creating Connected App 2/4
Creating Connected App 3/4
Creating Connected App 4/4

Appendix B: Salesforce OAuth2 Username-Password flow using Postman

Postman example on retrieving access token using the Salesforce Oauth2 username-password flow

Note that you’ll need to set the ‘Content-Type’ header to ‘application/x-www-form-urlencoded’ in the above example.

For copy/paste purposes, I’ve also added the cURL variant:

curl --location --request POST \
  '' \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  --data-raw 'grant_type=password&client_id=<client_id>&client_secret=<client_secret>&username=<username>&password=<password>'

Appendix C: Testing the EventBridge DLQ configuration

In the described example, we configured a Dead Letter Queue on the Rule definition. If a message can’t be successfully delivered to the downstream service, EventBridge puts it on the SQS queue. This can happen when the downstream service is down, encounters an error while processing the message, throttles the inbound calls, or is unreachable due to a network hiccup. When building a reliable integration, it’s essential to take these scenarios into account, to guarantee that no message gets lost. The screenshots below illustrate how to test this using the AWS Console. Do note that parking the message in a DLQ is not enough: you’ll also need an operational procedure to investigate and reprocess the poisoned messages.

Screenshots of testing the EventBridge DLQ config using the AWS Web Console

Looking forward to your thoughts and feedback! Greetings, Samuel.

Author: Samuel Vandecasteele