Respected sir,
I am using the CloverETL tool with approximately 50 nodes. While processing a file of 100,000 records with a MAX_RECORD_SIZE buffer of 1 MB, the tool gets trapped in a deadlock-like situation: the number of records in the output file stops increasing.
Options tried:
1. I tried to adjust the deadlock mechanism provided by the tool, but that did not work.
2. Increasing the MAX_RECORD_SIZE buffer to 2 MB gives the error: "Direct buffer memory".
3. Reducing the number of records to 40,000, with a 1 MB MAX_RECORD_SIZE buffer, works.
What could be a possible solution for this?
Thanks a lot for the quick response.
I am not using any database here. The records are read from a flat file, and the output also goes to a flat file.
Your assumption was right that this problem occurs due to merging.
I have updated the values of MAX_RECORD_SIZE and DEFAULT_INTERNAL_DATA_BUFFER_SIZE in the program.
The reason for this deadlock may be the following:
I have a component that reads data from all of its ports one by one. There are approximately 7-8 input ports on this node. The node sends each data record to the output port as soon as it receives it from an input port.
The output looks something like this:
In:0 0
In:1 728
In:2 72
In:3 0
In:4 9165
In:5 212
In:6 0
In:7 0
In:8 0
Out:0 0
As we can see, input port 0 has no records yet, while input port 4 is packed full. More importantly, no record has been written to the output port.
My assumption is that the InPorts.readRecord() method waits until a record is available, and this blocking may cause the problem.
Is there any way that:
1. I can tell whether a record is available on a given port?
2. If no record is available on the port yet, I can move on to the next port?
Thanks in advance.
Sorry, please read point 2 above as follows:
2. If there is no record on the input port and the port is still in the running state (data may arrive after a few seconds/minutes), is there any way I can detect that no record is available on that port, so that I do not need to read from it this time?
No, Akilesh, I am not using JDBC here.
Hi !
As was already mentioned, the problem can be related to the JDBC driver for MySQL.
In general, MAX_RECORD_SIZE relates to one record only. If you have 1 million records, you only need enough space for the biggest one; records are stored internally with variable length. There should be no need to make this number bigger than, say, 64K.
There is a different buffer whose size can have an impact: DEFAULT_INTERNAL_DATA_BUFFER_SIZE, which can be found in org.jetel.graph.BufferedEdge. This buffer is used to accommodate "extra" records when you have parallel edges going into one component and the component reads from one port faster, or reads more records, than from the other, where the two edges/dataflows somehow come in pairs. Unless you use some kind of joining or merging of data, no problem with this buffer can ever occur.
You may try setting this buffer to some high number, like 256K.
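For illustration, the overrides might look like the fragment below. The key names are the ones used in this thread; the exact properties file location and the full set of supported keys depend on your CloverETL version, so please check your engine's default properties before relying on this:

```properties
# One record must fit here; 64K is usually plenty even for wide records.
MAX_RECORD_SIZE = 65536

# Buffer used by org.jetel.graph.BufferedEdge to park "extra" records
# when parallel edges feed one joining/merging component (256K).
DEFAULT_INTERNAL_DATA_BUFFER_SIZE = 262144
```

One more note on the "Direct buffer memory" error you saw when raising the buffer to 2 MB: if these buffers are allocated as direct NIO buffers, the JVM's direct-memory limit can be raised with the standard HotSpot option -XX:MaxDirectMemorySize (e.g. -XX:MaxDirectMemorySize=256m), though shrinking the buffers as described above is the better fix.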
As I can see, you created your own component - great!
You may use the hasData() method on an input port - it returns true if a read operation won't block. See the JavaDoc for more information.
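The polling pattern then looks roughly like the sketch below. Note this is only an illustration of the round-robin idea: the Port class here is a hypothetical stand-in backed by a queue, not the real CloverETL input-port API, so check the JavaDoc of your CloverETL version for the actual hasData()/readRecord() signatures:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class RoundRobinReader {

    /** Hypothetical stand-in for an input port; not the CloverETL API. */
    static class Port {
        private final Queue<String> buffer = new ArrayDeque<>();
        private boolean open = true;

        void push(String record) { buffer.add(record); }
        void close() { open = false; }           // upstream edge finished

        /** True if readRecord() would return a record without blocking. */
        boolean hasData() { return !buffer.isEmpty(); }

        /** True while this port may still deliver records. */
        boolean isOpen() { return open || !buffer.isEmpty(); }

        String readRecord() { return buffer.poll(); }
    }

    /**
     * Visits ports in round-robin order and reads only from ports that
     * currently have data, so one empty port never stalls the others.
     */
    static List<String> drain(List<Port> ports) {
        List<String> out = new ArrayList<>();
        boolean anyOpen = true;
        while (anyOpen) {
            anyOpen = false;
            for (Port p : ports) {
                if (p.hasData()) {
                    out.add(p.readRecord());     // guaranteed not to block
                }
                if (p.isOpen()) {
                    anyOpen = true;              // revisit on the next pass
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Port a = new Port();
        Port b = new Port();
        a.push("a1"); a.push("a2");              // port 0 has two records
        b.push("b1");                            // port 1 has one record
        a.close(); b.close();
        System.out.println(drain(List.of(a, b)));  // [a1, b1, a2]
    }
}
```

The key point is that the loop never calls readRecord() on a port unless hasData() says a record is ready, so a slow or empty port (like your In:0) can no longer block the whole component while other ports (like In:4) fill up.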
Are you using JDBC to insert into a MySQL database? If yes, then there's already a thread discussing this issue…