Salesforce to Azure Data Factory ETL Pipeline


In this blog, we’ll walk through how to build an ETL pipeline that connects Salesforce with Azure Data Factory. You’ll learn how to extract, transform, and load Salesforce data into Azure for faster analytics and better decision-making.

Step-by-step process to create the ETL pipeline

Prerequisite – Create a Salesforce Developer account

https://developer.salesforce.com/signup

Once you sign up, you will receive an email containing your username and the login URL.

Now let's do the Salesforce configuration. Once you log in:

Step 1- Create an integration (service) user

Steps to Create Integration User

  1. Go to Setup → Users → New User
    • On the Users page, click the New User button.
  2. Fill in User Information
    • First Name: ADF
    • Last Name: Integration
    • Alias: adfint (Salesforce auto-fills but you can set it)
    • Email: abc@gmail.com (this will receive the activation email)
    • Username: Must be unique across all Salesforce orgs. Convention is something@yourdomain.
      → Use: adf_integration@dataspoof.info
    • Nickname: auto-generated, can leave default.
  3. License & Profile
    • User License: Pick one with API access. Usually:
      • Salesforce OR
      • Salesforce Platform (cheaper if you only need API, check your available licenses).
    • Profile: Select one with API Enabled (e.g., System Administrator for testing, but later create a custom “Integration User” profile with minimal access).
      ⚠️ Best practice: Don’t use System Administrator for integrations, but for testing you can.
  4. Role
    • Optional; you can leave blank or assign a role depending on your org’s sharing model.
  5. Check the following boxes
    • Active: ✔️
    • Generate new password and notify user immediately: ✔️ (this sends an activation link to the email address entered above).
  6. Save

Step 2- Create Connected App for Azure Data Factory

1. Navigate to Connected Apps

  1. Log in as System Administrator (your abc@dataspoof.info account).
  2. Go to Setup (⚙️ icon, then click Setup).
  3. In the Quick Find box, type App Manager.
  4. Click App Manager (under Apps → App Manager).

2. In App Manager, click New External Client App

Fill in the basic app details, enable OAuth settings, and set the Callback URL to:

https://login.microsoftonline.com/common/oauth2/nativeclient

Under Selected OAuth Scopes, move these scopes:

  • Manage user data via APIs (api)
  • (Optional: Perform requests on your behalf at any time (refresh_token, offline_access) – not needed for the client credentials flow, but harmless)
  • Or Full access (full) – works, but grants more access than the integration needs

Then click Create.

Step 3- Go to External Client App Manager and click the ADF Integration app

Click Edit and go to the app's OAuth settings.

Scroll further down and tick ✅ Enable Client Credentials Flow.

A new section will appear → choose the Run As user:

  • Select your new integration user (adf_integration@dataspoof.info).

Now go to the Settings tab of the same app, click OAuth Settings, and reveal the Consumer Key and Consumer Secret. Save both somewhere secure – Azure Data Factory will need them.
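With the Consumer Key and Secret in hand, you can sanity-check the client credentials flow before touching Azure Data Factory. Below is a minimal stdlib-only Python sketch; the My Domain URL is a placeholder you must replace with your own org's login domain.

```python
import json
import urllib.parse
import urllib.request

# Placeholder: replace with your org's My Domain URL.
TOKEN_URL = "https://yourdomain.my.salesforce.com/services/oauth2/token"


def build_token_request(client_id: str, client_secret: str) -> bytes:
    """Encode the form body for the OAuth 2.0 client credentials grant."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode("utf-8")


def fetch_access_token(client_id: str, client_secret: str) -> str:
    """POST to the token endpoint and return the access token."""
    req = urllib.request.Request(
        TOKEN_URL,
        data=build_token_request(client_id, client_secret),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]


if __name__ == "__main__":
    # Substitute the Consumer Key and Secret saved above.
    token = fetch_access_token("<consumer key>", "<consumer secret>")
    print("token starts with:", token[:8])
```

If the call succeeds, the connected app and Run As user are wired up correctly; a 400 response usually means the Client Credentials Flow toggle or the Run As user is missing.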

Step 4- Azure Data Factory

  1. Go to Azure Data Factory
  2. Click Create pipeline
  3. Add a Copy activity
  4. Set the source and search for the Salesforce V2 connector
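Behind the Salesforce V2 source sits a linked service that holds the connected app credentials. A rough sketch of its JSON definition is below; treat the property names as assumptions to verify against the current ADF Salesforce V2 connector documentation, and prefer an Azure Key Vault reference over an inline secret.

```json
{
  "name": "SalesforceV2LinkedService",
  "properties": {
    "type": "SalesforceV2",
    "typeProperties": {
      "environmentUrl": "https://yourdomain.my.salesforce.com",
      "clientId": "<consumer key from the connected app>",
      "clientSecret": {
        "type": "SecureString",
        "value": "<consumer secret>"
      },
      "apiVersion": "60.0"
    }
  }
}
```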

  5. Create a source dataset and pick the Salesforce object you want to copy
  6. Click Preview data to confirm the dataset returns rows
  7. Configure the sink (for example, Azure Blob Storage or Azure SQL Database)
  8. Publish the pipeline and trigger a run
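Before or after the pipeline run, you can verify that the integration user can actually read the object by querying it directly through Salesforce's REST API with a token from the client credentials flow. This is a stdlib-only sketch; the instance URL, token, object, and API version are placeholders.

```python
import json
import urllib.parse
import urllib.request


def build_query_url(instance_url: str, soql: str, api_version: str = "v60.0") -> str:
    """Build a Salesforce REST API query URL for a SOQL statement."""
    return (
        f"{instance_url}/services/data/{api_version}/query"
        f"?q={urllib.parse.quote(soql)}"
    )


def run_query(instance_url: str, access_token: str, soql: str) -> dict:
    """Execute a SOQL query and return the decoded JSON response."""
    req = urllib.request.Request(
        build_query_url(instance_url, soql),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Placeholders: substitute your instance URL and an access token
    # obtained from the token endpoint.
    result = run_query(
        "https://yourdomain.my.salesforce.com",
        "<access token>",
        "SELECT Id, Name FROM Account LIMIT 5",
    )
    print("rows returned:", result["totalSize"])
```

If this query returns rows but the ADF preview does not, the problem is on the Data Factory side (linked service or dataset), not in Salesforce.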

Conclusion

By integrating Salesforce with Azure Data Factory, you can automate the ETL process and keep your data centralized in Azure for analysis. This pipeline reduces manual effort, improves data accuracy, and helps your team turn Salesforce insights into smarter, faster business decisions.

