Increasing the Total Number of Records/Second

INFO [WatchDog] - Node       ID       Port     #Records       #KB    Rec/s    KB/s
INFO [WatchDog] - ---------------------------------------------------------------------------------
INFO [WatchDog] - INPUT               RUNNING
INFO [WatchDog] -  %cpu:…             Out:0      160950     56645      289     104
INFO [WatchDog] - REF                 RUNNING
INFO [WatchDog] -  %cpu:…             In:0       160950     56645      289     104
INFO [WatchDog] -                     Out:0      160474     61490      281     110
INFO [WatchDog] - OUTPUT              RUNNING
INFO [WatchDog] -  %cpu:…             In:0       160474     61490      281     110

Currently I am transferring around 800,000 records from a database directly to a file, and it takes around 20 minutes. I have an 8 GB server; is there a way to increase the Rec/s and make it even faster?

Thanks,
Naveen

Hello Naveen,
You can usually speed up reading data from a database by optimizing the database itself. On the CloverETL side, you can try changing the value of the fetchSize attribute of DBInputTable. Increasing the value of the DEFAULT_INTERNAL_IO_BUFFER_SIZE property (see Program and VM Arguments) can speed up writing to a file at the cost of higher memory usage. For general advice on how to design the ETL process, please see Performance Tuning.
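
To illustrate what fetchSize controls, below is a minimal plain-JDBC sketch. The connection URL, credentials, table name and the value 1000 are hypothetical placeholders, not values from this thread. The idea is the same trade-off that DBInputTable's fetchSize attribute exposes: a larger fetch size lets the driver pull more rows per network round trip, so there are fewer round trips but more rows buffered in memory at once.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details -- replace with your own.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {

            // Some drivers (e.g. PostgreSQL) only honor the fetch size
            // when autocommit is disabled.
            conn.setAutoCommit(false);

            try (Statement stmt = conn.createStatement()) {
                // Ask the driver to transfer 1000 rows per round trip.
                // This is the knob that a component-level fetch size setting
                // ultimately passes down to the JDBC driver.
                stmt.setFetchSize(1000);

                try (ResultSet rs = stmt.executeQuery("SELECT * FROM my_table")) {
                    while (rs.next()) {
                        // process each row here
                    }
                }
            }
        }
    }
}

The optimal value depends on row width and available heap; values in the low thousands are usually a reasonable starting point, but it is worth measuring the Rec/s reported by WatchDog after each change rather than guessing.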