Do you need the verification every time the process runs?
Are the inputs likely to be so variable, compared with the sample inputs used when the model was created, that verification is required for every run? If so, it may be worth revisiting the model to evolve it into a more generic form, or looking into why the inputs are so variable to see whether they should be, and perhaps fixing the problem at, or with reference to, the data source.
As for your specific question - sorry, that's not an area I have experience with, but I agree that what seems to be happening does not seem logical UNLESS the process is generating something that it then re-uses, or something like that.
Thank you for your quick reply, Grant.

It does require the verification because it is getting the data from large PDF files. There are times when a field moves a little bit in the PDF files we receive from customers.
And you're right too: it does not make sense that it would create or stop a POST EXPORT when the EXPORT process itself relies on the verification process first. Also, even if the verification does not fail, the mere fact that it is checked makes the POST EXPORT script 'Log.AddDistribution_Move' generate an error and fail. And no, the verification does not generate anything at all that the POST EXPORT script re-uses or has to use.
Hope the folks at Datawatch have a patch for this.
Thanks again Grant!
I wondered if this was a PDF related requirement ...
Trickier, but I think I would still be seeking a way to engineer the model so that the need for verification is eliminated. However, I can imagine that with PDF file input this may not feel entirely comfortable, even if it is possible.
I certainly don't have enough knowledge of practical Data Pump usage to add much to this, I'm afraid. Hopefully one of the DP regulars will be along shortly to help out.