How to do a dynamic insert into a table?

I’ve got a subgraph set up that takes a table name parameter. My goal is to select the data from that table in my Oracle system and insert it into an existing table with the same structure in Azure. I want this subgraph to work with many different tables.

Currently I have this working, except that instead of inserting into the Azure table I’m just writing out a CSV. I haven’t been able to figure out how to use DBOutputTable dynamically. Is there a way to do this with DBOutputTable and avoid writing the CSV, or do I have to write out the CSV first and then load that file with a different component?

Hi,

This answer is also a reaction to forum ticket How to set a parameter on subgraph.

Just to give you some background: a subgraph does not actually work as a loop. It behaves like any other component, so, as Lukas noted in the topic mentioned above, you cannot use a different parameter value for each subgraph run (the subgraph is initialized just once at the beginning).

What you are trying to do is, however, possible with so-called Jobflows. Let me explain it step by step.

1. In your current (non-dynamic) graph, define a parameter, e.g. PARAM, that holds the table name. In the DBOutputTable component, set the DB connection property, and set the DB table property to ${PARAM}.
With this setup, the data is loaded into the database identified by the connection, into the table whose name the parameter holds.
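For illustration, the component definition in the graph’s XML source might then look roughly like this (the attribute names and the connection id JDBC0 are assumptions from memory; check the Source tab of your own graph for the exact form your version produces):

<Node id="DB_OUTPUT_TABLE0" type="DB_OUTPUT_TABLE" dbConnection="JDBC0" dbTable="${PARAM}"/>

The key point is simply that the table name is the ${PARAM} placeholder rather than a hard-coded value, so it is resolved at graph start.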

2. Then you can create a Jobflow. The CloverETL Jobflow module lets you combine graphs and also create the desired loop. Please note that executing a jobflow requires the CloverETL Server Corporate environment. For more information, see: http://doc.cloveretl.com/documentation/UserGuide/4-4-1/index.jsp?topic=/com.cloveretl.gui.docs/docs/jobflow.html

In the jobflow you can use a component called ExecuteGraph. Connect a list of table names to input port 0 of ExecuteGraph and map that field to the PARAM parameter of your original graph. Please note that PARAM is available in the Input mapping editor only after you have selected the original graph in the Graph URL property of the ExecuteGraph component.
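The input mapping itself is a small piece of CTL2. As a sketch (the field name tableName and the exact output record that exposes the parameter are assumptions; they depend on your metadata and CloverETL version), it could look like:

//#CTL2
function integer transform() {
	// pass the incoming table name to the graph parameter PARAM
	$out.0.PARAM = $in.0.tableName;
	return ALL;
}

Each record arriving on port 0 then triggers one run of the graph with PARAM set to that record’s table name, which gives you the loop over tables.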

3. I am not sure from your question whether the tables you are migrating all share the same metadata. If they do not, you should create dynamic metadata as well. You can do that as follows:
Create a connection to your database in the original graph.
Go to the Source tab (an extra tab below the Graph pane) and add the following line to the <Global> section of the code:

<Metadata connection="JDBC0" id="Metadata-dyn" name="dynmet" sqlQuery="SELECT * FROM ${PARAM} LIMIT 1"/>

Note that the LIMIT clause is database-specific; since your source is Oracle, you would use e.g. SELECT * FROM ${PARAM} WHERE ROWNUM <= 1 instead.

If you then click back to the Graph pane, you will see newly created metadata called “dynmet” in the Outline. You can use this metadata as usual (e.g. drag and drop it onto the right edge). As you’ve probably already found out, this means that each time the graph starts (each time the jobflow executes it), it runs the SELECT query with the current PARAM value (the table name) and derives the appropriate metadata.

I hope my explanation is clear; if you have any questions, please let us know.

Eva

I actually had a call yesterday with Clover and determined that part of my issue is that I need the Server version of the software to use jobflows and pass parameters to graphs. I was basically trying to do the same thing with graphs and subgraphs. I’m expecting to get a trial of Server, which will hopefully resolve many of my issues.

Great read. Please feel free to shoot me and/or Kevin an email if you get stuck, so we can get you up to speed faster. Would you mind sending over that graph and jobflow we sketched out before?