
    Extracting many files with many models using command-line

    KEVIN KENNEDY

      I created a batch job to extract 42 files with 42 models and export their tables to an Access database. When I execute the job, some of the exports do not work; I can see on-screen that Monarch (v. 10.5) appears to simply skip over them. Every line follows this format:

          "C:\Program Files\Monarch\Program\Monarch.exe" /rpt:"[path]\[reportname.txt]" /mod:"[path]\[modelname.xmod]" /exp:"[path]\[database.mdb]" /expfileopt:"Add" /exptable:"[tablename]" /exptableopt:"Overwrite"

      Any suggestions? Right now I am checking the properties of each exported table and rerunning the individual extracts that fail, but I run the same kind of job against production files where it is not practical to confirm that Monarch did what it was supposed to do, so I need some kind of fix.

       

      To clarify, the batch job contains the above-quoted command 42 times on 42 separate lines. All that differs are the reportname, modelname and tablename.
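
      For illustration, two consecutive lines of such a batch file might look like this (the server, paths, report, model and table names below are all made up):

          REM Hypothetical excerpt - every name here is a placeholder
          "C:\Program Files\Monarch\Program\Monarch.exe" /rpt:"\\server\reports\sales_jan.txt" /mod:"\\server\models\sales.xmod" /exp:"\\server\db\exports.mdb" /expfileopt:"Add" /exptable:"SalesJan" /exptableopt:"Overwrite"
          "C:\Program Files\Monarch\Program\Monarch.exe" /rpt:"\\server\reports\sales_feb.txt" /mod:"\\server\models\sales.xmod" /exp:"\\server\db\exports.mdb" /expfileopt:"Add" /exptable:"SalesFeb" /exptableopt:"Overwrite"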

        • Extracting many files with many models using command-line
          Grant Perkins

          Is it correct to assume that the failures appear to be random when run in a batch - i.e. not always the same reports?

           

          Are the paths across a network?

            • Extracting many files with many models using command-line
              KEVIN KENNEDY

              Yes, the failures are random, no pattern to them. Yes, the paths are across a network.

                • Extracting many files with many models using command-line
                  Grant Perkins

                   It sounds like network connections may be timing out, so that either the incoming connections are never made or are dropped before completing, or the exports hit a similar problem - perhaps because the target database is not available at that moment?

                   

                   The cause? Well, network setup is one possibility if users are, quite reasonably, restricted in the amount of network resource they can use at any one time. If so, the limit has probably been set to a value that is generous for interactive work but not so generous for the way that batch files can crunch data. If the batch is processing the lines in parallel (rather than serially), some other effect may be in play, but the result is the same.

                   

                   Alternatively it could be a no-data-to-report situation giving no update for the export - but I'm guessing you have probably already covered that.

                   

                  What to do ....?

                   

                  You could consider putting some control statements into the batch file to slow it down a little and see if that improves things. And of course check with your network people to see if they can predict a reason.
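
                   As a minimal sketch (assuming the batch runs serially on Windows Vista or later, where TIMEOUT is available), each export could be forced to finish before the next one starts, with a short, arbitrary pause in between:

                       REM Sketch: wait for each export to complete, then pause briefly.
                       REM The [path]/[...] placeholders are from the original command line.
                       start /wait "" "C:\Program Files\Monarch\Program\Monarch.exe" /rpt:"[path]\[reportname.txt]" /mod:"[path]\[modelname.xmod]" /exp:"[path]\[database.mdb]" /expfileopt:"Add" /exptable:"[tablename]" /exptableopt:"Overwrite"
                       timeout /t 5 /nobreak >nul
                       REM ...repeat for the remaining report/model/table combinations

                   On older Windows versions a "ping -n 6 127.0.0.1 >nul" line is a common stand-in for the pause.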

                   

                  Or you could consider creating a more sophisticated batch program using, for example, VB and building in error checking and reporting.
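
                   Even plain batch can manage a little of that, assuming Monarch sets a non-zero exit code when an export fails - an assumption that would need to be verified before relying on it (the log file name below is also made up):

                       REM Sketch: log and retry a failed export once.
                       REM ASSUMPTION: Monarch returns a non-zero ERRORLEVEL on failure.
                       start /wait "" "C:\Program Files\Monarch\Program\Monarch.exe" /rpt:"[path]\[reportname.txt]" /mod:"[path]\[modelname.xmod]" /exp:"[path]\[database.mdb]" /expfileopt:"Add" /exptable:"[tablename]" /exptableopt:"Overwrite"
                       if errorlevel 1 (
                           echo %date% %time% export failed for [tablename] >> export_errors.log
                           start /wait "" "C:\Program Files\Monarch\Program\Monarch.exe" /rpt:"[path]\[reportname.txt]" /mod:"[path]\[modelname.xmod]" /exp:"[path]\[database.mdb]" /expfileopt:"Add" /exptable:"[tablename]" /exptableopt:"Overwrite"
                       )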

                   

                  And I suppose ultimately there is always the Data Pump option where such needs are catered for on an 'industrial' scale.

                   

                   I don't consider myself at all expert in those areas, so I have little to offer by way of practical guidance, but hopefully others will be along shortly to help.

                   

                   

                  Grant