I configured a task with DelimitedDataReader → DelimitedDataWriter. Running it manually completes successfully, but when it is scheduled with Quartz to run in the background once every three minutes, it reports the following error:
ERROR Can’t create object of type DELIMITED_DATA_WRITER with reason: null.
I modified the following parameters in the defaultProperties configuration file:
Record.MAX_RECORD_SIZE = 1024000
DataParser.FIELD_BUFFER_LENGTH = 8192000
DataFormatter.FIELD_BUFFER_LENGTH = 8192000
DEFAULT_INTERNAL_IO_BUFFER_SIZE = 4194304
The JVM is given 2 GB of memory, and the task only reads and writes 500 records, so it should not be a memory problem. Could you please help analyze this?
Hello,
does the problem persist with the default properties? Have you tried the Universal Data Reader instead of the Delimited one? Can you show your node configuration?
I think the problem is mainly in the defaultProperties configuration. With Record.MAX_RECORD_SIZE = 1024000, a task that keeps running in the background reserves about 1 MB of memory for each record; if several such tasks run in parallel over large volumes of data, the memory will overflow. Is there any way to optimize this?
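As a rough back-of-the-envelope check of that concern (my own estimate, not a description of CloverETL internals), a sketch of the worst-case buffer footprint under the assumption that each in-flight record can occupy up to Record.MAX_RECORD_SIZE bytes and that parallel tasks do not share buffers:

```java
public class BufferFootprint {
    // Hypothetical worst case: every in-flight record holds a buffer of
    // maxRecordSize bytes, and parallel tasks allocate independently.
    static long worstCaseBytes(long maxRecordSize, int recordsInFlight, int parallelTasks) {
        return maxRecordSize * recordsInFlight * parallelTasks;
    }

    public static void main(String[] args) {
        long maxRecordSize = 1_024_000L; // Record.MAX_RECORD_SIZE from the post

        long oneTask   = worstCaseBytes(maxRecordSize, 500, 1);
        long fourTasks = worstCaseBytes(maxRecordSize, 500, 4);

        System.out.println(oneTask / (1024 * 1024) + " MB for one task");    // ~488 MB
        System.out.println(fourTasks / (1024 * 1024) + " MB for four tasks"); // ~1953 MB, close to the 2 GB heap
    }
}
```

Under these assumptions, four parallel tasks would already approach the 2 GB heap, which is why lowering Record.MAX_RECORD_SIZE (or the number of records held in memory at once) matters when tasks run concurrently.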
Could you please send the whole stack trace? I don't have enough information to do any investigation.