Troubleshooting a PostgreSQL JNDI connection

Hello, I am trying to set up a new JNDI connection on my CloverETL Server. I am running into some issues and I am not sure how to troubleshoot from here.

I modified my tomcat/conf/context.xml file as follows:

<Resource name="redshift" auth="Container" type="javax.sql.DataSource"
        driverClassName="org.postgresql.Driver"
        url="jdbc:postgresql://myserver.fdasfafgssfgfdh.us-west-2.redshift.amazonaws.com:5439/mydatabase"
        username="myusername" password="mypassword"
        initialSize="5" maxActive="5" minIdle="5" maxIdle="5"
        maxWait="1300000" removeAbandoned="true" removeAbandonedTimeout="1500"/>
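For reference, my understanding is that the server resolves this resource through a plain JNDI lookup under java:comp/env, roughly like the sketch below. This is only my mental model of what happens, not CloverETL's actual code; it would have to run inside the web app (a servlet or JSP), since the container is what binds java:comp/env, and checkRedshiftJndi is just an illustrative name:

import javax.naming.InitialContext;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class JndiSmokeTest {
    // Must run inside the Tomcat web app; java:comp/env is not bound in a standalone JVM.
    public static void checkRedshiftJndi() throws Exception {
        DataSource ds = (DataSource) new InitialContext()
                .lookup("java:comp/env/redshift");
        try (Connection conn = ds.getConnection();
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select 1")) {
            rs.next();
            System.out.println("JNDI lookup OK, select 1 returned " + rs.getInt(1));
        }
    }
}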

I restarted Tomcat and created a new DB connection via the Designer, specified my JNDI resource as the one above (java:comp/env/redshift), and fired up my graphs.

The graphs don't fail right away, but on my Redshift server I never see the SQL calls they are supposed to make. If I run the same calls via a non-JNDI connection, the queries succeed, so there has got to be some misconfiguration somewhere in my JNDI setup.

Here’s the output of my logs:

2015-09-10 17:28:56,756 INFO 1388117 [WatchDog_1388117] DBConnection driver[null]:jndi[java:comp/env/redshift]:url[jdbc:postgresql://myserver.fdasfafgssfgfdh.us-west-2.redshift.amazonaws.com:5439/mydatabase]:user[myusername] … OK

So as far as CloverETL is concerned, it seems to be connecting (OK), but I don't see the SQL hitting my Redshift server; the graph just sits there until the timeout is reached, and then it fails.

My questions are:

  1. Notice the “driver[null]” part in the logs above; should I be concerned about that? I am fairly new to the Tomcat environment, so I am not sure whether a driver is missing. Then again, if I use a PostgreSQL connection that is not JNDI, things work.

  2. What would be a good way to troubleshoot what is happening while the graph is sitting idle? I am looking at the logs provided by CloverETL itself, but they also just sit idle. How can I tell what happened, or where the app failed to send the SQL to the Redshift server?

Any help will be appreciated, thank you!

Update on this issue: the JNDI setup is actually working; the problem is that the DBExecute graphs don't work with JNDI connections, they just time out (CloverETL Server 3.5.2.12/12).

Does anybody know why?

Hi Ramiro,

What kind of statements are you running? Can you share the graph and the whole log with us (ideally at debug log level)?
In general, for DML statements you should use DBInputTable / DBOutputTable. However, this does not apply to Redshift, since a different process is recommended; see our blog for more information.

Thanks for the RedshiftWriter subgraph recommendation, Jan; it will be worth exploring. Unfortunately we are stuck on version 3.5 of CloverETL for the moment, so hopefully we can get the DBExecute graphs to work with JNDI instead.

This is the particular SQL statement I am running (but in fact, it can be any SQL):

insert into my_schema.transform_times (description, site_id, timestamp, batch_id, identifier) values ('staging_fact_tr_user_has_state_by_day_active', 777, getdate(), 149, 'start');

This is the error message I am getting:

Component [truncate staging tables:TRUNCATE_STAGING_TABLES] finished with status ERROR.
Component pre-execute initialization failed.
Cannot establish DB connection to JNDI:java:comp/env/redshift Cannot get a connection, pool error Timeout waiting for idle object

If I switch my connection from JNDI to a regular PostgreSQL driver connection, the same graphs and SQL work perfectly.

Attached are the log file (at debug level) and the source code of the graph I am running. Let me know if I can send you any other information that may help troubleshoot. Thank you!

I have already sent you the answer via email, but I will post it here as well for other users.

First of all, I would definitely use the Redshift JDBC driver instead of the PostgreSQL one, although the latter should work in most cases. You can find the Redshift drivers and further information about their usage in the Redshift documentation here.
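As a quick sanity check that the Redshift driver jar is actually on the classpath, something along these lines can be run. This is only a sketch: the class name shown is the JDBC 4.1 build of Amazon's driver and the credentials are placeholders from the post above, so adjust both to match your driver version and cluster.

import java.sql.Connection;
import java.sql.DriverManager;

public class RedshiftDriverCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if the Redshift driver jar is missing from the classpath.
        Class.forName("com.amazon.redshift.jdbc41.Driver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:redshift://myserver.fdasfafgssfgfdh.us-west-2.redshift.amazonaws.com:5439/mydatabase",
                "myusername", "mypassword")) {
            System.out.println("Connected via Redshift driver version "
                    + conn.getMetaData().getDriverVersion());
        }
    }
}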

Secondly, for writing data (INSERT statements) you should always use DBOutputTable. The same applies to reading: use DBInputTable for SELECT operations. Please refer to the DBExecute, DBInputTable and [url=http://doc.cloveretl.com/documentation/UserGuide/index.jsp?topic=/com.cloveretl.gui.docs/docs/dboutputtable.html]DBOutputTable[/url] articles in our documentation for more information.

However, I would strongly recommend the other approach (see the blog article): using the COPY command. This approach is also recommended by Amazon: “We strongly recommend using the COPY command to load large amounts of data. Using individual INSERT statements to populate a table might be prohibitively slow.” [Redshift documentation - Using a COPY Command to Load Data]
You might also find the Redshift documentation article that deals with the COPY command useful.
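For illustration, a COPY load issued through JDBC looks roughly like the sketch below. It is only an example: the S3 path and AWS credentials are made-up placeholders, the target table is borrowed from the INSERT statement earlier in this thread, and the exact COPY options depend on your file format, so please check the Redshift COPY documentation for the full syntax.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CopyLoadExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:redshift://myserver.fdasfafgssfgfdh.us-west-2.redshift.amazonaws.com:5439/mydatabase",
                "myusername", "mypassword");
             Statement stmt = conn.createStatement()) {
            // Bulk-load a staged file from S3 instead of running row-by-row INSERTs.
            stmt.execute(
                "COPY my_schema.transform_times " +
                "FROM 's3://my-bucket/staging/transform_times.csv' " +
                "CREDENTIALS 'aws_access_key_id=<key>;aws_secret_access_key=<secret>' " +
                "CSV");
        }
    }
}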