
    Stop post process on job failure

      Hi,

      I am using Monarch Data Pump 10.5 Pro.  How can I stop a post process from running if any job in the Data Pump process fails?  Currently I just have one job in the process.  For example, is there something that I can check prior to running my post process?  Thanks.

        • Stop post process on job failure
          Bill Watson

          I am sure there are a couple of ways to do it:

           

            a. Parse the log events so far and ensure all the project outputs have been successful.

            b. Check that your output file(s) are where you expect them to be.

           

            In either case you can use the "Return False" statement to halt a script. Obviously, if you were outputting to a database, you would have to use the first method.

           

          I haven't tried the first method, but I am sure I saw something along those lines on the forum somewhere (maybe Gareth can provide some guidance). For the file method, see the example below:

           

          For this script to work you need to add System.IO to the Imports section on the References and Imports tab of the Script Editor.

           

          Add the following code to your post-process script:

           

          Dim strPath As String = "C:\Data\"
          Dim strFile As String = "MYFILE.XLS"
          Dim din As New IO.DirectoryInfo(strPath)
          Dim aryFin As IO.FileInfo() = din.GetFiles(strFile)

          ' GetFiles returns an empty array when the file is missing, so check the
          ' array length (a For Each loop body would never run for a missing file).
          If aryFin.Length > 0 Then
             Log.AddEvent(strFile & " Found. Proceeding: " & Now.ToString())
             ' add code to do whatever you want to do
          Else
             Log.AddEvent(strFile & " Not Found. Halting Process: " & Now.ToString())
             Return False
          End If

            • Stop post process on job failure

              Bill,

              Thanks for the response.  I had given up on hearing back from anyone and just ran across my old post today.  Checking for the output file isn’t an option since the output is going directly to a database (thanks for that idea and code, though; I’ve filed it away for future use).

               

              I attempted to parse the log events (e.g., … Dim loginfo As String = logevent.Alert … If loginfo = "SystemAlertJobFailed" Then …) in a post-process script and take action accordingly.  I have not been able to get this working; it appears not to see the event alert.  I know the script sees the log, as I can add events to the log from the script.  I originally developed the script in Visual Studio against an existing log file and the code works fine (i.e., it finds the event alert and takes action).  Do you have any idea if it is possible to read the log file from a post-process script, or am I trying to do something that is not possible?  Thanks.

                • Stop post process on job failure
                  Bill Watson

                  John

                   

                  With VB.NET you can interrogate files to see when they were last modified, so you could check that an Access database file has been modified (something to look into later perhaps; a rough sketch is shown below). If you are populating an Oracle or SQL Server database table, it may be possible to interrogate the database to check whether the table was updated or not. Again, more investigation is needed, as I am still just getting started in .NET.
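                  As an illustration of that file-modification idea, here is a minimal sketch (not from the original post); the database path and the ten-minute window are assumptions for the example only, and as with the earlier script it expects System.IO on the References and Imports tab:

                  ' Hedged sketch: halt the post process unless a (hypothetical) Access database
                  ' file has been modified recently. Path and time window are assumptions.
                  Dim dbPath As String = "C:\Data\MyDatabase.mdb"
                  Dim dbInfo As New IO.FileInfo(dbPath)

                  If dbInfo.Exists AndAlso dbInfo.LastWriteTime >= Now.AddMinutes(-10) Then
                     Log.AddEvent(dbPath & " was updated recently. Proceeding: " & Now.ToString())
                  Else
                     Log.AddEvent(dbPath & " does not appear to have been updated. Halting: " & Now.ToString())
                     Return False
                  End If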

                   

                   

                  Getting back to Events, I have had a quick look and I think you may be using the wrong property of logevent to check for errors.

                   

                  This example from the Data Pump scripting help file uses the JobLogEvent.Message property. Have you tried using that instead?

                   

                  Example: Writing an event in pre-export and reading it in post-export

                   

                  Here we are downloading a file, checking for an exception and writing an event to the Job Log.  This is done in the pre-export script.

                   

                  Dim wc As New System.Net.WebClient()
                  Dim Download As Boolean

                  Try
                      wc.DownloadFile("http://webserver/server_out/classic.prn", "c:\inbox\datapump\classic.prn")
                      Download = True
                  Catch e As System.Net.WebException
                      Log.AddEvent("Download failed")
                      Download = False
                  End Try

                  If Download = True Then
                      Log.AddEvent("Downloaded file successfully")
                  End If

                   

                  The preceding code writes the message as a Job Log event. Now we can go through and check for the event in the post-export script; if the download failed, we can write a message to the Windows Event Log.

                   

                  For Each ev As JobLogEvent In Log.GetEvents()
                      ' Operate on JobLogEvent 'ev' as desired.
                      If ev.Message = "Download failed" Then
                          Log.AddDistribution_Log("", "Error", "Download of file failed")
                      End If
                  Next

                    • Stop post process on job failure

                      Bill

                      I tried using the JobLogEvent.Message property on a log entry that I added and it worked!  My post-process script found the message associated with the log entry that I created and sent off an e-mail.  That’s a big step forward for me, as now I know that my code is working and that I am reading the log.  Thanks very much.

                       

                      The reason I was attempting to go after the Alert property is to be able to take action at a higher level for any job failure.  For example, the message (“Failed; Dwch…”) could change based on the situation, but the alert would not.

                       

                      Any ideas on how to get at the actual alert text?

                       

                      Thanks again for your patience and assistance.

                       

                      John

                        • Stop post process on job failure
                          Bill Watson

                          John

                           

                          According to the scripting help (which you should be able to find in your Start menu or in your Data Pump installation folder), the JobLogEvent class has the following properties:

                          .Alert : Gets the alert name, if any, associated with this event.  (I think you have to have set up an alert in the process for this to be populated.)

                          .Message: Gets the message text for this event. 

                          .TimeStamp: Gets the timestamp for this event. 

                           

                          And the following methods:

                          Equals : Determines whether the specified Object is equal to the current Object.

                          GetHashCode : Serves as a hash function for a particular type, suitable for use in hashing algorithms and data structures like a hash table.

                          GetType : Gets the Type of the current instance.

                          ToString : Returns a String that represents the current Object.

                           

                          You could try JobLogEvent.ToString and see if that gives you the full contents of the event (a rough sketch is below). Other than that, the only other way I could see you doing it is to email a copy of the log file, either as an attachment or as inline text (it is an XML file, so it should be readable easily enough). I think I have seen a suggestion in the forum that you could copy the log file to an alternate location and have a separate scheduled process which parses the log file using Monarch, the output of which could be emailed, I suppose.
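                          As an illustration of the ToString idea, here is a minimal sketch (not from the original post) that dumps each event's full text back into the job log for inspection:

                          ' Hedged sketch: write the full ToString text of every job log event back
                          ' into the log so its contents can be inspected after the run.
                          For Each ev As JobLogEvent In Log.GetEvents()
                              Log.AddEvent("EVENT DUMP: " & ev.ToString())
                          Next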

                           

                          Search under the Data Pump forum for "event"; I am sure that was the search I was using when I came across that thread.

                            • Stop post process on job failure
                              Gareth Horton

                              Hi John,

                               

                              Bill's right.

                               

                              You can get this in the PostProcess script - here's a code snippet you can use to ensure you are getting what you want:

                               

                              For Each ev As JobLogEvent In Log.GetEvents()
                                  Log.AddEvent(ev.Message & "_test")
                              Next


                               

                               

                              This will simply write out the entire event text with _test appended so you can see what is returned by the event message.

                               

                              Note: This easy approach will only work if you have a single project in a process, or if you are using the multiple projects per job setting.

                               

                              The problem is that you will only get back the job events for the first (primary) job that the process executes.  In a multiple job-spawning process, this is the JobLog that contains the text: "Process launched  job(s)."

                               

                              If you need to iterate through all the job logs that a particular process instance spawns, it will be quite complicated.  It's actually easier if the Process is started programmatically via the API, since you have a method to return an array of JobIDs for a particular process (GetStatusEx), in order to get at all the job logs.  Even then, you'd have to parse out the data from the XML yourself.

                               

                              If you really needed to do this within a Process which is not started programmatically, you would likely have to query the PumpJobs view in the database to get the file references for the job logs, then parse them.
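                              Purely as a sketch of that idea (none of this is from the original post): the connection string, the TrackingId filter, and the LogFilePath column name are assumptions that would need checking against the actual PumpJobs view, and the script would need references to System.Data and System.Xml.

                              ' Hedged sketch only: column names (LogFilePath, TrackingId) and the
                              ' connection string are assumptions, not the documented PumpJobs schema.
                              Dim logPaths As New System.Collections.Generic.List(Of String)

                              Using conn As New System.Data.SqlClient.SqlConnection("Server=myServer;Database=DataPump;Integrated Security=true")
                                  conn.Open()
                                  Using cmd As New System.Data.SqlClient.SqlCommand("SELECT LogFilePath FROM PumpJobs WHERE TrackingId = @tid", conn)
                                      cmd.Parameters.AddWithValue("@tid", "my-tracking-id")   ' hypothetical value
                                      Using rdr As System.Data.SqlClient.SqlDataReader = cmd.ExecuteReader()
                                          While rdr.Read()
                                              logPaths.Add(rdr.GetString(0))
                                          End While
                                      End Using
                                  End Using
                              End Using

                              ' Each job log is an XML file, so it can then be loaded and inspected.
                              For Each logPath As String In logPaths
                                  Dim doc As New System.Xml.XmlDocument()
                                  doc.Load(logPath)
                                  ' Inspect doc for failure events here (element names would need checking).
                              Next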

                               

                              Note that in a Process which spawns multiple jobs, the TrackingID for all jobs spawned by that Process instance will be identical (and unique to that instance) and each related job is assigned an ordinal, starting at 1.

                               

                              I know this might be too much information, and I hope you only have a single job being spawned in your Process, as it will be much easier.  Even if that's not the case, the benefits of the multiple projects per job setting in the Process Properties may make it worth switching to that.

                               

                               

                              Best regards

                               

                              Gareth

                               


                                • Stop post process on job failure

                                  Bill,

                                  It finally worked (If ev.Alert = "SystemAlertJobFailed" Then … send out email)!!!  Evidently my original coding approach (which I put together with the aid of the UserScriptingHelp) of iterating through the JobLogEventList, looking at each logeventlist.Item(i) for logevent.Alert, or my use of it, was the problem.  Thanks for sticking with me through this one.  Solving this problem was crucial to my moving forward on some production solutions that have been sitting for quite some time.  And thanks for the additional suggestion that I’ll file away for future use.
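                                  For reference, a minimal sketch of that kind of check (not taken verbatim from the post; the notification step is only a placeholder):

                                  ' Hedged sketch: flag and halt the post process if any job in the process
                                  ' raised the "SystemAlertJobFailed" alert. The e-mail step is a placeholder.
                                  For Each ev As JobLogEvent In Log.GetEvents()
                                      If ev.Alert = "SystemAlertJobFailed" Then
                                          Log.AddEvent("Job failure detected. Halting post process: " & Now.ToString())
                                          ' Send the notification e-mail here, then stop the script.
                                          Return False
                                      End If
                                  Next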

                                    • Stop post process on job failure
                                      Bill Watson

                                      Glad to be of help. For me it was a chance to stretch my brain, and it also gave me some ideas for processes we have here.

                                       

                                      Gareth - thanks for the input - more food for thought

                                        • Stop post process on job failure

                                          Gareth,

                                          Thanks for responding and providing the value-added information about the structure of the log.  I use the “multiple projects per job” option for my Data Pump processes when I have more than one project, so, if I’m following you correctly, I should be OK since only one log file is created.  Also, thanks for making me aware that a process can be started programmatically.  I may need that if I have multiple projects and one is dependent on another completing without errors.

                                           

                                          John