[SalesForce] Exporting salesforce record dynamically using data loader

I am using Data Loader from the command line interface (CLI) to export Salesforce data
to my local system, and Windows Task Scheduler to run this task on a daily basis.

<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="Account" class="com.salesforce.dataloader.process.ProcessRunner" singleton="false">
        <description>Import Account Data.</description>
        <property name="name" value="Account" />
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="true"/>
                <entry key="sfdc.debugMessagesFile" value="C:\Users\Ravikant Maurya\Desktop\Key1.og"/>
                <entry key="sfdc.endpoint" value="https://login.salesforce.com"/>
                <entry key="sfdc.Server host" value="https://www.salesforce.com"/>

                <entry key="sfdc.username" value="*****************"/>
                <!-- password below has been encrypted using key file, therefore it will not work without the key setting: process.encryptionKeyFile
                the password is not a valid encrypted value, please generate the real value using encrypt.bat utility -->
                <entry key="sfdc.password" value="******************"/>
                <entry key="process.encryptionKeyFile" value="C:\User\Desktop\Key.txt"/>
                <entry key="sfdc.timeoutSecs" value="600"/>
                <entry key="sfdc.useBulkApi" value="true"/>
                <entry key="sfdc.loadBatchSize" value="500"/>
                <entry key="sfdc.externalIdField" value="id"/>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="sfdc.extractionRequestSize" value="10000"/>
                <entry key="sfdc.extractionSOQL" value="Select Id, Name, Phone, AccountNumber FROM Account"/>
                <entry key="process.operation" value="extract"/>
                <entry key="process.mappingFile" value="C:\Program Files (x86)\salesforce.com\Data Loader\samples\conf\accountMasterMap.sdl"/>
                <entry key="dataAccess.name" value="C:\Users\Desktop\CSV\CLI\Account.csv"/>
                <entry key="dataAccess.type" value="csvWrite"/>
                <entry key="process.initialLastRunDate" value="2005-12-01T00:00:00.000-0800"/>
            </map>
        </property>
    </bean>
</beans>

It's working fine and the Account records are exported to the Account.csv file specified by the dataAccess.name entry.

My problem is that when the task runs the next day, it overwrites the previous
Account.csv file, but I want a new file to be created each day.

I think I have to make dataAccess.name dynamic for each day, but I don't know how to do this. Any solution to achieve this would be appreciated.

Best Answer

According to the Setting Up Data Loader 9.0 for Auto Processes cheat sheet, there is an entry key that will help here:

Parameter

<entry key="dataAccess.name" value="C:\Program Files\salesforce.com\Apex Data Loader 9.0\test\ extract.csv" />

Description:

Specifies the location and file name used by the dataAccess type of the process.
For a csvRead operation, the process will look for the file in this location.
For a csvWrite operation, the process will replace the file in this location.

There is an example of this in the dataloader source code.

<bean id="extractAccountCsvProcess"
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false">
  <description>extractAccountCsv job gets account info from salesforce and saves info into a CSV file.</description>
    <property name="name" value="extractAccountCsvProcess"/>
    <property name="configOverrideMap">
        <map>
            <entry key="sfdc.extractionSOQL" value="Select Id, Name, Type, Phone, AccountNumber__c, Website, AnnualRevenue, 
                LastModifiedDate, Oracle_Id__c, CustomDateTime__c FROM Account where AccountNumber__c like 'ACCT%'" />
            <entry key="process.outputSuccess" value="@TEST_FILE_DIR@/status/extractAccountCsvSuccess.csv"/>
            <entry key="process.outputError" value="@TEST_FILE_DIR@/status/extractAccountCsvError.csv"/>
            <entry key="sfdc.debugMessagesFile" value="@TEST_FILE_DIR@/status/extractAccountCsvSoapTrace.log"/>
            <entry key="process.operation" value="extract"/>
            <!-- <entry key="process.mappingFile" value="@TEST_FILE_DIR@/data/extractAccountCsvMap.sdl"/> -->
            <entry key="dataAccess.type" value="csvWrite"/>
            <entry key="dataAccess.name" value="@TEST_FILE_DIR@/data/extractAccountCsv.csv"/>
        </map>
    </property>
</bean>

You then need a mechanism to automate replacing the dataAccess.name entry. Unfortunately, it doesn't appear to be possible to simply provide a value as a parameter at the command line - Apex Data Loader PropertyPlaceholderConfigurer/Dynamic Property Replacement

Instead you will need to use an additional process to modify the static process-conf.xml file with the required file name.

There are examples of this approach in other answers; in both cases, ANT is used to replace tokens in a template with the desired file name to create the static file that the batch job is then run against.

You could probably also achieve something similar without using ANT. See How can you find and replace text in a file using the Windows command-line environment?
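One way to do the token replacement without ANT is a small script run by the scheduled task before the Data Loader step. This is a sketch, not the Data Loader's own mechanism: the template file name, the @CSV_FILE@ placeholder, and the Account_ file prefix are all illustrative choices you would adapt to your own process-conf.xml.

```python
# Sketch: regenerate process-conf.xml from a template before each run.
# Assumes you keep a copy of process-conf.xml in which the dataAccess.name
# value has been replaced by a unique placeholder token (here @CSV_FILE@).
from datetime import date
from pathlib import Path

def render_conf(template_path, output_path, csv_dir):
    """Write output_path from template_path, substituting a dated CSV path
    for the @CSV_FILE@ token, and return the generated CSV path."""
    csv_file = str(Path(csv_dir) / f"Account_{date.today():%Y%m%d}.csv")
    text = Path(template_path).read_text()
    Path(output_path).write_text(text.replace("@CSV_FILE@", csv_file))
    return csv_file
```

The scheduled task would then call this script first (for example via `python render_conf.py`) and run the Data Loader batch file against the freshly generated process-conf.xml.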


Another option would be to make the output file name unique after the Data Loader CLI operation has completed. That is, after the export completes, if a file named Account.csv exists, rename it to something unique using the current date.

E.g.

if exist Account.csv ren Account.csv Account_%time:~0,2%%time:~3,2%-%DATE:/=%.csv
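Note that the %DATE% and %TIME% formats are locale-dependent (and %time:~0,2% contains a leading space before 10 AM), so the batch one-liner may need adjusting per machine. A cross-platform alternative, assuming Python is available on the machine running the scheduled task, is a small post-export rename script (the function name is illustrative):

```python
# Sketch: rename the exported CSV to a timestamped name after each run.
# Uses an ISO-style timestamp to avoid the locale-dependent %DATE% format.
from datetime import datetime
from pathlib import Path

def archive_export(csv_path):
    """Rename e.g. Account.csv to Account_YYYYMMDD_HHMMSS.csv.
    Returns the new path, or None if the export file does not exist."""
    src = Path(csv_path)
    if not src.exists():
        return None
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    dest = src.with_name(f"{src.stem}_{stamp}{src.suffix}")
    src.rename(dest)
    return dest
```

The scheduled task would call this right after the Data Loader batch file, so the next day's export starts with a fresh Account.csv.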