Salesforce to Database Contact Migration
This template moves a large set of contacts from Salesforce to a database. Trigger with an HTTP call either manually or programmatically.
Contacts are upserted so that the migration can be run multiple times without worrying about creating duplicates. This template uses our batch module to make the act of moving a large set of data easier and more transparent.
Parent accounts of the contacts are created if they don't already exist in the destination system, or all migrated contacts can be assigned to one specific account. A database table schema is included to make testing this template easier.
License Agreement
This template is subject to the conditions of the MuleSoft License Agreement. Review the terms of the license before downloading and using this template. You can use this template for free with the Mule Enterprise Edition, CloudHub, or as a trial in Anypoint Studio.
Use Case
As a Salesforce admin I want to migrate Contacts from Salesforce to a database.
This template serves as a foundation for setting up an online sync of Contacts from a Salesforce instance to a database. Each request to the HTTP endpoint triggers a one-time integration that checks the Salesforce source instance for changes and updates the Contacts in the target database table.
The requirements are intentionally simple: they serve both as examples and as a starting point for adapting the integration to your own needs.
This template leverages the Mule batch module. The batch job is divided into Process and On Complete stages. The integration is triggered by an HTTP Listener defined in the flow, which queries Salesforce for creations and updates matching a filtering criteria and executes the batch job. During the Process stage, each Salesforce Contact is filtered depending on whether it has an existing matching Contact in the database. The last step of the Process stage groups the Contacts and inserts or updates them in the database.
Finally during the On Complete stage the template logs statistics into the console.
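At a high level, the flow described above can be sketched as the following Mule 4 configuration. This is an illustrative sketch only, not the template's exact XML; element names, step names, and the SOQL fields are assumptions, and the real flow lives in businessLogic.xml:

```xml
<!-- Illustrative sketch of the structure described above; see businessLogic.xml
     for the real implementation. Names and fields here are assumptions. -->
<flow name="mainFlow">
    <!-- Query Salesforce for Contacts created or updated since the watermark -->
    <salesforce:query config-ref="Salesforce_Config">
        <salesforce:salesforce-query>
            SELECT Id, Email, FirstName, LastName FROM Contact
            WHERE LastModifiedDate &gt; ${migration.date}
        </salesforce:salesforce-query>
    </salesforce:query>
    <batch:job jobName="migrateContactsBatch">
        <batch:process-records>
            <batch:step name="upsertContactsStep">
                <!-- Group the Contacts and insert or update them in the database -->
                <batch:aggregator size="${page.size}">
                    <db:bulk-insert config-ref="Database_Config">
                        <db:sql>INSERT INTO contact ...</db:sql>
                    </db:bulk-insert>
                </batch:aggregator>
            </batch:step>
        </batch:process-records>
        <batch:on-complete>
            <!-- Log statistics to the console -->
            <logger level="INFO" message="#[payload]"/>
        </batch:on-complete>
    </batch:job>
</flow>
```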
Considerations
To make this template run, there are certain preconditions that must be considered. All of them deal with the preparations in both source (Salesforce) and destination (Database) systems, that must be made for the template to run smoothly. Failing to do so can lead to unexpected behavior of the template.
This template illustrates the migration use case between Salesforce and a Database, thus it requires a database instance to work. The template comes packaged with a SQL script to create the Database table that it uses. It is the user's responsibility to use that script to create the table in an available schema and change the configuration accordingly.
The SQL script file can be found in contact.sql in /src/main/resources.
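As a rough idea of what the destination table looks like, here is a minimal MySQL sketch. The column names and types below are illustrative assumptions; the authoritative definition is the packaged contact.sql script:

```sql
-- Illustrative sketch only; see /src/main/resources/contact.sql for the real schema.
CREATE TABLE contact (
    id INT NOT NULL AUTO_INCREMENT,
    email VARCHAR(255),
    first_name VARCHAR(255),
    last_name VARCHAR(255),
    account_id VARCHAR(255),
    last_modified_date TIMESTAMP,
    PRIMARY KEY (id)
);
```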
This template is customized for MySQL. To use it with a different SQL implementation, make these changes:
- Update the SQL script dialect to the desired one.
- Replace the MySQL driver library dependency with the desired one in the pom.xml file.
- Update the Database Config to a suitable connection instead of db:my-sql-connection in the global elements in /src/main/mule/config.xml.
- Update the database properties in the mule.*.properties file.
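For example, swapping the JDBC driver in pom.xml might look like the following. This is an illustrative sketch assuming a switch to PostgreSQL; check Maven Central for the artifact and version that match your database:

```xml
<!-- Illustrative only: replacing the MySQL driver dependency with PostgreSQL.
     The version shown is an assumption; use the one appropriate for your setup. -->
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <version>42.2.5</version>
</dependency>
```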
Database Considerations
This template uses date time or timestamp fields from the database to do comparisons and take further actions. While the template handles the time zone by sending all such fields in a neutral time zone, it cannot handle time offsets. (Time offsets are time differences that may surface between date time and timestamp fields from different systems due to a differences in each system's internal clock.)
Take this into consideration and, where necessary, take action to compensate for the time offset.
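One common mitigation, sketched here for MySQL with an assumed column name, is to pin the database session to UTC so that datetime comparisons are neutral:

```sql
-- Illustrative only: force the session to UTC before comparing datetime fields,
-- so values from both systems are interpreted in the same neutral time zone.
SET time_zone = '+00:00';
SELECT id, last_modified_date
FROM contact
WHERE last_modified_date > '2018-09-08 10:17:21';
```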
As a Data Destination
There are no considerations with using a database as a data destination.
Salesforce Considerations
Here's what you need to know about Salesforce to get this template to work:
- Where can I check that the field configuration for my Salesforce instance is the right one? See: Salesforce: Checking Field Accessibility for a Particular Field.
- How can I modify the field access settings? See: Salesforce: Modifying Field Access Settings.
As a Data Source
If the user who configures the template for the source system does not have at least read-only permissions for the fields that are fetched, an InvalidFieldFault API fault is returned:
java.lang.RuntimeException: [InvalidFieldFault [ApiQueryFault
[ApiFault exceptionCode='INVALID_FIELD'
exceptionMessage='Account.Phone, Account.Rating, Account.RecordTypeId,
Account.ShippingCity
^
ERROR at Row:1:Column:486
No such column 'RecordTypeId' on entity 'Account'. If you are
attempting to use a custom field, be sure to append the '__c' after the
custom field name. Reference your WSDL or the describe call for the
appropriate names.'
]
row='1'
column='486'
]
]
Run it!
Simple steps to get this template running.
Running On Premises
In this section we help you run this template on your computer.
Where to Download Anypoint Studio and the Mule Runtime
If you are new to Mule, download this software:
Note: Anypoint Studio requires JDK 8.
Importing a Template into Studio
In Studio, click the Exchange X icon in the upper left of the taskbar, log in with your Anypoint Platform credentials, search for the template, and click Open.
Running on Studio
After you import your template into Anypoint Studio, follow these steps to run it:
- Locate the properties file mule.dev.properties in src/main/resources.
- Complete all the properties required as per the examples in the "Properties to Configure" section.
- Right-click the template project folder.
- Hover your mouse over Run as.
- Click Mule Application (configure).
- Inside the dialog, select Environment and set the variable mule.env to the value dev.
- Click Run.
Running on Mule Standalone
Update the properties in one of the property files, for example in mule.prod.properties, and run your app with the corresponding environment variable. In this example, use mule.env=prod.
After this, to trigger the use case, browse to the local HTTP endpoint with the port you configured in your properties file. If the port is, for instance, 9090, browse to http://localhost:9090/migratecontacts. This outputs a summary report and sends it by email.
Running on CloudHub
When creating your application in CloudHub, go to Runtime Manager > Manage Application > Properties to set the environment variables listed in "Properties to Configure" as well as the mule.env value.
Follow the other steps defined here; once your app is set up and started, there is no need to do anything else. Supposing you chose contactsmigration as the domain name, trigger the use case by browsing to http://contactsmigration.cloudhub.io/migratecontacts; the report is sent to the configured email address.
Deploying a Template in CloudHub
In Studio, right click your project name in Package Explorer and select Anypoint Platform > Deploy on CloudHub.
Properties to Configure
To use this template, configure properties (such as credentials and other configurations) in the properties file or in CloudHub from Runtime Manager > Manage Application > Properties. The sections that follow list example values.
Application Configuration
HTTP Connector Configuration
- http.port=9090
Batch Aggregator Configuration
- page.size=1000
Last Migration Date
- migration.date=2018-09-08T10:17:21.000Z
Database Connector Configuration
- db.host=localhost
- db.port=3306
- db.user=joan.baez
- db.password=JoanBaez456
- db.databasename=template-sfdc2db-contact-migration
Salesforce Connector Configuration
- sfdc.username=bob.dylan@org
- sfdc.password=DylanPassword123
- sfdc.securityToken=avsfwCUl7apQs56Xq2AKi3X
SMTP Services Configuration
- smtp.host=smtp.gmail.com
- smtp.port=587
- smtp.user=email%40example.com
- smtp.password=password
Email Details
- mail.from=batch.contacts.migration%40mulesoft.com
- mail.to=your.email@gmail.com
- mail.subject=Batch Job Finished Report
API Calls
Salesforce imposes limits on the number of API calls that can be made, so calculating this amount can be an important factor to consider.
In this template, only one Salesforce query is made, so API limits are not a concern.
Customize It!
This brief guide provides a high level understanding of how this template is built and how you can change it according to your needs. As Mule applications are based on XML files, this page describes the XML files used with this template. More files are available such as test classes and Mule application files, but to keep it simple, we focus on these XML files:
- config.xml
- businessLogic.xml
- endpoints.xml
- errorHandling.xml
config.xml
This file provides the configuration for connectors and configuration properties. Only change this file to make core changes to the connector processing logic. Otherwise, all parameters that can be modified should instead be in a properties file, which is the recommended place to make changes.
businessLogic.xml
The functional aspects of the template are implemented in this XML file, directed by one flow responsible for executing the logic.
For the purposes of this template, the mainFlow delegates all of its processing to a batch job.
endpoints.xml
This file provides the inbound and outbound sides of your integration app.
This template has only an HTTP Listener as the way to trigger the use case.
Inbound Flow
HTTP Listener Connector - Start Report Generation
- ${http.port} is set as a property to be defined either in a properties file or in CloudHub environment variables.
- The path configured by default is migratecontacts; you are free to change it to the one you prefer.
- The host name for all endpoints in your CloudHub configuration should be defined as localhost. CloudHub routes requests from your application domain URL to the endpoint.
- The endpoint is request-response: calling it returns the total number of Contacts synced and filtered by the specified criteria.
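A sketch of what such a listener configuration might look like is shown below. The exact XML lives in endpoints.xml; the element names and flow names here are assumptions:

```xml
<!-- Illustrative sketch; see endpoints.xml for the actual configuration. -->
<http:listener-config name="HTTP_Listener_config">
    <http:listener-connection host="localhost" port="${http.port}"/>
</http:listener-config>

<flow name="triggerFlow">
    <http:listener config-ref="HTTP_Listener_config" path="/migratecontacts"/>
    <!-- Kicks off the main flow defined in businessLogic.xml -->
    <flow-ref name="mainFlow"/>
</flow>
```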
errorHandling.xml
This file handles how your integration reacts depending on the different exceptions. This file provides error handling that is referenced by the main flow in the business logic.