Objective: parse a large set of XML files and populate MySQL
Problem: XMLExtract throws an exception when a bad XML file is encountered and the execution of the graph stops
Requirement: I want XMLExtract to tell me that an XML file is bad and have the graph continue parsing the remaining XML files
Description:
I have a large text file where each line is a complete XML document. I open this large text file using UniversalDataReader and pass each line to XMLExtract. XMLExtract parses correctly formed XML, passes the data down to DBOutputTable, and MySQL gets populated. In rare cases, a badly formed XML document is passed to the XMLExtract component; the SAX parser throws an exception, the graph execution stops prematurely with an error, and the remaining XML documents in the large text file remain unparsed.
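For context, the failure is per document: outside of CloverETL, a plain Java SAX parse of one such line raises a SAXException that can be caught for that line alone. A minimal sketch of that idea (the XmlLineCheck class and isWellFormed method are just illustrative names, not part of CloverETL):

import java.io.StringReader;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

public class XmlLineCheck {
    private static final SAXParserFactory FACTORY = SAXParserFactory.newInstance();

    /** Returns true if the given string is well-formed XML, false otherwise. */
    static boolean isWellFormed(String xmlLine) {
        try {
            // A fresh parser per call keeps the example simple.
            SAXParser parser = FACTORY.newSAXParser();
            parser.parse(new InputSource(new StringReader(xmlLine)), new DefaultHandler());
            return true;
        } catch (SAXException e) {
            // Malformed document: report it and let the caller skip this line.
            System.err.println("Bad XML line: " + e.getMessage());
            return false;
        } catch (Exception e) {
            // Parser configuration or I/O problems.
            return false;
        }
    }
}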
Question: Is it possible to handle this exception and let the processing of the remaining XML files continue?
Many thanks for your help and I look forward to your reply.
I would recommend processing each line's XML separately in another graph - the result (ok/failed) of that graph can then be handled. How easily this can be done depends on the edition of CloverETL you use:
If you have a commercial edition of CloverETL Server Corporate (or higher), you can use JobFlows - see http://doc.cloveretl.com/documentation/ … graph.html and pass each line as a parameter. Then, inside the child graph, try to parse the delivered XML. If the XML reader fails, you can catch the error in the top graph.
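Outside CloverETL, the same pattern (isolate each line, catch its failure in the parent job, continue with the next line) looks roughly like the sketch below; the file name big_input.txt and the parseAndLoad helper standing in for the child graph are hypothetical:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class PerLineJobRunner {
    public static void main(String[] args) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader("big_input.txt"))) {
            String line;
            int lineNo = 0;
            while ((line = in.readLine()) != null) {
                lineNo++;
                try {
                    // Stand-in for the child graph: parse one XML document and load it.
                    parseAndLoad(line);
                } catch (Exception e) {
                    // One document's failure is caught here in the "top" job,
                    // logged, and the loop moves on to the next line.
                    System.err.println("Line " + lineNo + " failed: " + e.getMessage());
                }
            }
        }
    }

    /** Hypothetical stand-in for the child graph (XMLExtract + DBOutputTable). */
    static void parseAndLoad(String xmlLine) throws Exception {
        // ... parse xmlLine and write to MySQL; throws on malformed XML ...
    }
}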
You’re right - RunGraph is not part of the Community version. You may want to try the trial of the commercial version - http://www.cloveretl.com/download