3 Replies Latest reply: May 15, 2014 10:00 AM by joey

    Process to use one of many possible projects


      I have a reporting setup with 15 possible reports.  There are 15 models and 15 possible input files.  At any time, one (and only one) of the input files will need to be processed.


      The input file names are all similar.  For example, we have Rep1.txt, Rep2.txt, etc.


      I have one process with 15 projects.  The Multiple Projects Per Job option is selected at the process level (to keep my job log shorter), and the process monitors for Rep*.txt.


      Each individual project takes the one input and has the Grouping of Multiple Files option set to "Each in its own job" and File Existence Criteria set to "Not Required".  The input distribution is set to delete the input files.


      This works perfectly, so I'm going to assume this is the proper way to set it up.  If anyone has any tweaks or suggestions, feel free to share.  Every time an input file is found, one project runs and the rest are skipped.


      However, I would like to modify the input distributions to move the file to a PROCESSED folder instead of simply deleting it, so I have a history to backtrack through as needed.


      When I change the input distribution from Delete to Move, Data Pump chokes on the projects that are skipped.  It had no problems when the input distribution was set to Delete.  Is this possibly a bug in Data Pump, or do I need to tweak something?


        <event time="2006-06-30T09:48:18">Input file 'Central1sharedatmfinfoMonarchAgentInforce-RO-A.txt' does not exist and has been ignored.</event>

        <event time="2006-06-30T09:48:18" alert="SystemAlertNoInputToExporter">No inputs exist for project ID 1. Exporting for this project has been skipped.</event>



        <event time="2006-06-30T09:50:13" source="distributor">Distribution started</event>

        <event time="2006-06-30T09:50:23" alert="SystemAlertDistributionError" source="distributor">Could not find file "Central1sharedatmfinfoMonarchAgentInforce-RO-A.txt".</event>

        <event time="2006-06-30T09:50:25" source="distributor" value="complete">Distribution completed with errors</event>

        • Process to use one of many possible projects

          Originally posted by Ken Indorato:


          I'm not sure what you mean by "data pump chokes on the projects that are skipped".


          I mean the Process completes with errors.  All of the necessary exports were completed, but a SystemAlertDistributionError is raised, which means I receive an email.



          If an input is not present then the log should report "No inputs exist for project ID x. Exporting for this project has been skipped."

          I agree, but the log messages I included in the post indicate that:

          -AgentInforce-RO-A.txt does not exist and has been ignored

          -No inputs exist for project ID 1. Exporting for this project has been skipped

          -Could not find file AgentInforce-RO-A.txt.

          -Distribution completed with errors


          I guess when you say "Exporting for this project has been skipped," that does not mean the input distributions will be skipped.  Still, it is odd that deleting a file that does not exist causes no problems, while moving a file that does not exist does.
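          That asymmetry is plausible if the Delete distribution treats a missing source file as already satisfied (the end state, file gone, holds), while the Move distribution has nothing to place in the destination and so reports an error. A minimal Python sketch of that logic (hypothetical; this is an analogy, not Data Pump's actual code, and the function names are invented):

```python
import os
import shutil


def delete_input(path):
    """Delete-style distribution: a missing file is a no-op,
    because the desired end state (file gone) already holds."""
    if os.path.exists(path):
        os.remove(path)
    # Missing file: nothing to do, and no error is raised.


def move_input(path, processed_dir):
    """Move-style distribution: a missing file cannot be moved,
    so the distribution has to fail with an error."""
    if not os.path.exists(path):
        raise FileNotFoundError(f'Could not find file "{path}"')
    os.makedirs(processed_dir, exist_ok=True)
    shutil.move(path, processed_dir)
```

          If Data Pump's Move distribution behaves like `move_input` above, a skipped project's distribution would still fail on its missing input file, which would match the "Could not find file" and "Distribution completed with errors" entries in the log.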



          When you changed the input distribution from Delete to Move, did you enable one of the options (overwrite, rename, change extension) to handle files with the same name? If not, then this would result in a "Distribution completed with errors" message.

          Yes, I renamed the file and had overwrite checked.


          Did the project that had the input file complete successfully?



          I set up a Process with multiple projects and input criteria similar to what you describe and it runs without a problem.

          If you have a process with multiple projects, an input distribution that moves the file to a different folder, and a setup where one (but not all) of the projects can execute at once, I'd be interested to know the settings you used.  Is there a recommended way to share the settings of processes or projects?


          Thanks for all your help!

          • Process to use one of many possible projects

            Security really shouldn't be an issue, since the Data Pump service runs with administrator privileges.


            I created fresh models, projects, and processes using the C: drive instead of a networked drive.  There were only two projects.  In each case, when one input file was present, the other project would still try to perform its input distribution and the process would fail.  If you like, I could send the test files, projects, models, and the process so you could see if you get the same results.

            • Process to use one of many possible projects

              One setting I'm not sure I specified: at the process level I have Multiple Projects Per Job set instead of One Project Per Job.