Based on what you've said, all options seem viable, and it's nearly impossible to tell which one is best.
Also, your scenario as you describe it looks very simple. Maybe too simple. In my experience the transfer itself is the easy part; the tough parts are references, relationships, and Salesforce specials.
So I would recommend asking yourself the following questions:
- Do you really have only one entity to consider?
- If you say es1 is the master, this doesn't look like a simple transfer but more like a sync.
- If that's true, how do you distinguish between inserts and upserts?
- Do you have a usable external unique ID?
- Can these records be created only in es1, or also in Salesforce?
- Can these records also be modified in Salesforce?
- If either of the last two points is true, you could end up with a two-way sync. Does "es1 is master" mean that you overwrite all modifications made in Salesforce, or do you need conflict management? If you simply overwrite, are you aware that the records are de facto read-only in Salesforce?
- Do you have any relationships to other Salesforce objects? That's the important part. How will you link them?
- Does es1 know the Salesforce IDs?
- Does Salesforce know the es1 IDs?
- Do you have any self-references (like the parent lookup on Account)? That's tricky!
- Do you need to worry about polymorphic relationships (like WhoId or WhatId on activities)?
- Are there any triggers, workflow rules, or validation rules - either in Salesforce or in es1 - that complicate things? Remember that limits and errors might break bulk transactions.
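If es1 records carry a stable unique key, the insert-vs-upsert question above usually resolves itself with a custom field marked as "External ID". A minimal Apex sketch - the field name `Es1_Id__c` and the sample values are assumptions, not from your org:

```apex
// Assumes a custom field Account.Es1_Id__c flagged as "External ID" (hypothetical name).
List<Account> accounts = new List<Account>{
    new Account(Es1_Id__c = 'E-1001', Name = 'Acme Ltd.'),
    new Account(Es1_Id__c = 'E-1002', Name = 'Globex Corp.')
};
// One DML statement: rows whose Es1_Id__c already exists are updated,
// all others are inserted - no need to know up front which is which.
upsert accounts Account.Fields.Es1_Id__c;
```

The same matching behavior is available from the Data Loader and the SOAP/REST APIs via their upsert operations.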
a) data-loader / ETL / AppExchange
You might consider seeing this pattern from a wider angle. Data Loader is only one of many external tools. ETL tools like Talend can be a great help in dealing with many of the questions I raised above, whereas the standard Data Loader helps with close to none of them. There are lots of tools providing templates. Did you check AppExchange for an existing solution?
b) implement sync and transformation logic in salesforce
You end up doing all of the above in Apex. Do you feel comfortable with that, and do you have enough know-how and experience in Apex? Do you know the relevant limits very well? Do you have inbound limits at es1? The hourly job can be done with Scheduled Apex - an example (for the scheduling part ONLY) looks like this:
global class TestScheduledApexFromTestMethod implements Schedulable {
    // Schedules a single run at midnight on Sept. 3rd, 2022
    public static String CRON_EXP = '0 0 0 3 9 ? 2022';

    global void execute(SchedulableContext ctx) {
        // Inspect the job's own CronTrigger record
        CronTrigger ct = [SELECT Id, CronExpression, TimesTriggered, NextFireTime
                          FROM CronTrigger
                          WHERE Id = :ctx.getTriggerId()];
        System.assertEquals(CRON_EXP, ct.CronExpression);
        System.assertEquals(0, ct.TimesTriggered);
        System.assertEquals('2022-09-03 00:00:00', String.valueOf(ct.NextFireTime));

        Account a = [SELECT Id, Name FROM Account
                     WHERE Name = 'testScheduledApexFromTestMethod'];
        a.Name = 'testScheduledApexFromTestMethodUpdated';
        update a;
    }
}
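For your hourly cadence, the class is registered via `System.schedule` with a cron expression, e.g. from Anonymous Apex (the job name is arbitrary):

```apex
// Cron fields: Seconds Minutes Hours Day_of_month Month Day_of_week (optional Year)
// '0 0 * * * ?' fires at the top of every hour.
System.schedule('es1 hourly sync', '0 0 * * * ?', new TestScheduledApexFromTestMethod());
```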
c) implement sync and logic in es1
Calculate your Salesforce API limits and your usage needs. Does es1 provide a sufficient development framework for the logic? Do you have the skills to do it there?
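A back-of-envelope calculation is enough to see whether the API budget works out. All numbers below are assumptions - plug in your own volumes:

```apex
Integer changedPerHour = 10000;   // records es1 modifies per hour (assumption)
Integer recordsPerCall = 200;     // max records per sObject Collections REST call
Integer callsPerDay = changedPerHour / recordsPerCall * 24;  // 50 * 24 = 1200
// Compare callsPerDay against the org's rolling 24-hour API request allocation.
System.debug(callsPerDay);
```

Remember that retries after errors, lookups for relationship resolution, and login calls all count against the same allocation.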
d) use Salesforce Connect
Salesforce Connect is an App Cloud integration service that lets Salesforce users access and manage data in external applications, whether on-premise or in the cloud, directly from Salesforce.
https://trailhead.salesforce.com/de/lightning_connect/lightning_connect_setup
Salesforce Connect makes it easy to integrate two Salesforce orgs or other systems (like SAP). However, one thing prevented us from using it in most cases: the pricing. In our experience it runs to a couple of thousand dollars per month - often out of budget.
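With Salesforce Connect, the es1 data is not copied into Salesforce at all; it surfaces as external objects (API name suffix `__x`) that you query like regular sObjects. A hedged sketch, assuming an external object `Es1_Record__x` with a custom `Name__c` field (both names are hypothetical):

```apex
// ExternalId and DisplayUrl are standard fields on every external object.
List<Es1_Record__x> recs = [SELECT ExternalId, DisplayUrl, Name__c
                            FROM Es1_Record__x
                            LIMIT 10];
for (Es1_Record__x r : recs) {
    System.debug(r.Name__c + ' -> ' + r.DisplayUrl);
}
```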
Conclusion
Very generally speaking, in my experience option a) will be best in most cases - but not if you limit yourself to the Data Loader.
Test speed and bandwidth
In all cases, consider speed as a possible bottleneck. You can't take for granted that the sync completes within your 1-hour interval.
This type of implementation is more common than one might think. It is a good idea to map out your requirements and communicate them clearly to the owner of the external system. Salesforce will redirect users to the external system using hyperlinks, so it is crucial that the URL structure of that system stays stable. If any changes are needed in the future, they should be communicated to the team responsible for the Salesforce instance, who can then update formulas, links, code, and other affected components.
In this scenario, it is not merely a matter of being a good idea; it is a requirement and the appropriate way to implement it. If the URLs are static or stable enough, you don't need Lightning Web Components (LWCs). You can utilize the HYPERLINK function in formula fields, for instance, and place these fields on layouts as needed.
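If es1 exposes stable record URLs keyed by an ID you store in Salesforce, the formula-field route looks roughly like this - the URL pattern and the `Es1_Id__c` field are assumptions for illustration:

```
HYPERLINK("https://es1.example.com/records/" & Es1_Id__c, "Open in es1", "_blank")
```

The third argument is the optional link target; place the resulting formula field on your page layouts as needed.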
However, this situation differs slightly for custom screens built with LWC, Aura, or Visualforce. While the same principles I mentioned earlier still apply, it is likely beyond the scope of an admin user to take action. In such cases, you will most likely require the assistance of a developer to update the code if the URL format changes.
Best Answer
The External ID concept is primarily meant for systems that cannot store custom external identifiers of their own to reference Salesforce records, and that generate data bound for Salesforce.
If you can use a Salesforce ID in the external system, you can certainly use the Salesforce ID from the external system to update Salesforce records. Note that you can only have a limited number of External ID fields per object, so using Salesforce ID values is preferred when possible.
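Concretely, if the external system stores the Salesforce record ID, the integration can address records directly and no External ID field is needed. A sketch with hard-coded placeholder values standing in for what the external system would send:

```apex
// The Id would normally arrive in the external system's payload (placeholder here).
Id sfId = '001000000000001AAA';   // Salesforce record Id stored externally
Account acc = new Account(Id = sfId, Name = 'Renamed from external system');
update acc;                        // plain update keyed on the Salesforce Id
```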