This error message is quite common and is typically caused by one of the following two conditions:
1. An error in the input file
For instance, a record delimiter or field delimiter may be missing from the input file, or there may be one delimiter more than there should be. This corrupts the parsing of the subsequent fields and records and can result in a huge chunk of data being loaded into a single field or record, which causes the buffer overflow error.
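To illustrate the first condition, here is a minimal sketch (a toy parser, not CloverDX code) showing how a single missing record delimiter makes the next row bleed into the current one, corrupting every field from that point on:

```python
# Toy example: a missing record delimiter ("\n") after "alice" merges two
# records, so the next row's data spills into the current record.
good = "id|name\n1|alice\n2|bob\n"
bad = "id|name\n1|alice2|bob\n"   # newline after "alice" is missing

def parse(text, record_delim="\n", field_delim="|"):
    """Split delimited text into records of fields (simplistic, for illustration)."""
    return [rec.split(field_delim)
            for rec in text.strip().split(record_delim)]

print(parse(good))  # 3 well-formed records of 2 fields each
print(parse(bad))   # only 2 records; "alice2" and "bob" got fused into one row
```

In a large file, one such defect can merge thousands of subsequent rows into a single oversized record, which is exactly how the buffer limit gets hit.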
2. A valid record or field larger than 32 MB
In this case, the actual size of the data being loaded into a single field or a single record exceeds one of the two limits: Record.RECORD_LIMIT_SIZE or Record.FIELD_LIMIT_SIZE. It is worth noting that this problem has nothing to do with the size of the input file of the UniversalDataReader. In other words, it is not the number of records (rows, if you will) that makes the difference here, so breaking the files up into smaller ones would not remedy the root cause.
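If you want to check which of the two limits your data would trip before loading it, a quick pre-flight scan can help. The sketch below is an assumption of how one might do this outside CloverDX, using the 32 MB default mentioned above for both limits:

```python
# Pre-flight check (illustrative, not a CloverDX utility): report any record
# whose single field, or whose total size, exceeds the given byte limit.
LIMIT = 32 * 1024 * 1024  # 32 MB, the default for both limits

def oversized(records, limit=LIMIT):
    """Yield (record_index, kind) for each record that breaks a limit."""
    for i, fields in enumerate(records):
        sizes = [len(f.encode("utf-8")) for f in fields]
        if any(s > limit for s in sizes):
            yield i, "field"    # would exceed Record.FIELD_LIMIT_SIZE
        elif sum(sizes) > limit:
            yield i, "record"   # would exceed Record.RECORD_LIMIT_SIZE

# Demo with a tiny limit so the effect is visible:
rows = [["ok", "ok"], ["x" * 100, "ok"]]
print(list(oversized(rows, limit=50)))  # record 1 holds an oversized field
```

A scan like this also distinguishes a genuinely large value (condition 2) from a delimiter defect that merged many rows (condition 1): in the latter case the oversized record appears where the file first went wrong.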
Overall, increasing the limits is, in fact, an adequate solution. Since the limit has been exceeded only by a relatively small amount, doubling both limits should be sufficient. In other words, add the following lines to your engine properties and restart CloverDX. More information can be found in our documentation.
Record.RECORD_LIMIT_SIZE = 67108864
Record.FIELD_LIMIT_SIZE = 67108864