I am trying to submit multiple batch (CSV) files into one Salesforce Job. I am using the tutorial http://www.cloveretl.com/blog/salesforce-bulk-api-in-cloveretl/ for most of the basics.
I will have a folder of files that I need to add to a single Salesforce Job. I am familiar with using ListFiles in a jobflow, but I am unsure how to construct the logic between the jobflow and any necessary graphs to make sure all the batches are added to the job before the job is closed (step 4 in the tutorial above).
Could anyone provide suggestions or guidance on this?
Hi Jesse,
I am not sure whether Salesforce allows users to send more than one request using the same session ID, but if it does, here is what you can do. I think the easiest way is to transform the whole graph into a jobflow. In the jobflow, you replace the HTTPConnector in step 3 with ExecuteJobflow. In this child jobflow, you can use ListFiles and HTTPConnector, sending the files one by one and taking the session ID passed from the parent jobflow as a parameter. When the child jobflow is done, a token is sent from ExecuteJobflow to the HTTPConnector in step 4 and the job is closed.
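Outside CloverETL, the request order this jobflow would produce can be sketched in plain Python. This is only an illustration, not the tutorial's actual implementation: `submit_batches` is a hypothetical helper, and `post` is an injected HTTP callable (so the flow can be shown without a live org); the URL paths follow the Salesforce Bulk API job/batch endpoints.

```python
# Sketch of the one-job / many-batches sequence (hypothetical helper names;
# `post(url, headers, body)` is an injected HTTP callable assumed to return
# the relevant value from the response, e.g. the new job ID).

def submit_batches(post, instance_url, session_id, api_version,
                   object_name, csv_files):
    """Create one Bulk API job, add every CSV file as a batch, then close it."""
    base = f"{instance_url}/services/async/{api_version}"
    xml_headers = {"X-SFDC-Session": session_id,
                   "Content-Type": "application/xml"}

    # Steps 1-2: create the job (it opens in state Open).
    job_xml = (
        '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
        f"<operation>insert</operation><object>{object_name}</object>"
        "<contentType>CSV</contentType></jobInfo>"
    )
    job_id = post(f"{base}/job", xml_headers, job_xml)

    # Step 3: add EVERY file as a batch before the job is closed --
    # this is the loop the child jobflow (ListFiles + HTTPConnector) performs.
    csv_headers = {"X-SFDC-Session": session_id, "Content-Type": "text/csv"}
    for csv_body in csv_files:
        post(f"{base}/job/{job_id}/batch", csv_headers, csv_body)

    # Step 4: close the job only after all batches are in.
    close_xml = (
        '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
        "<state>Closed</state></jobInfo>"
    )
    post(f"{base}/job/{job_id}", xml_headers, close_xml)
    return job_id
```

The point of the sketch is the ordering: one job-creation request, then one batch request per file, then a single close request, all carrying the same session ID.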
If Salesforce does not allow sending multiple requests with one session ID, you can create a parent jobflow with ListFiles and ExecuteGraph. The child graph would contain the whole SalesForceBulkInsert graph with a single change: the File URL property used by the HTTPConnector in step 3 would take its value from a parameter passed from the parent jobflow, containing the URLs provided by ListFiles.
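This fallback amounts to running the whole create/add/close cycle once per file, each run producing its own job. A minimal sketch of that per-file loop, under the same assumptions as before (`run_bulk_insert` stands in for the child graph, `post` is an injected HTTP callable, and the object name is hard-coded for illustration):

```python
# Sketch of the fallback: one complete Bulk API job per file, mirroring
# ListFiles feeding ExecuteGraph. Helper names are hypothetical.

_XMLNS = 'xmlns="http://www.force.com/2009/06/asyncapi/dataload"'

def run_bulk_insert(post, base, session_id, csv_body):
    """One full create-job / add-batch / close-job cycle for a single file
    (the role of the child SalesForceBulkInsert graph)."""
    xml_headers = {"X-SFDC-Session": session_id,
                   "Content-Type": "application/xml"}
    job_id = post(f"{base}/job", xml_headers,
                  f"<jobInfo {_XMLNS}><operation>insert</operation>"
                  "<object>Account</object>"
                  "<contentType>CSV</contentType></jobInfo>")
    post(f"{base}/job/{job_id}/batch",
         {"X-SFDC-Session": session_id, "Content-Type": "text/csv"},
         csv_body)
    post(f"{base}/job/{job_id}", xml_headers,
         f"<jobInfo {_XMLNS}><state>Closed</state></jobInfo>")
    return job_id

def run_per_file(post, base, session_id, csv_files):
    """Parent loop: like ListFiles driving ExecuteGraph, one job per file."""
    return [run_bulk_insert(post, base, session_id, f) for f in csv_files]
```

The trade-off is visible in the sketch: each file gets its own job (and its own job-creation and close requests), so batches are no longer grouped under a single Salesforce Job.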
I hope this helps.