7 Replies Latest reply: May 15, 2014 10:00 AM by joey

    Maximum number of Monitored Files

    joey

      Is there any possibility of increasing the limit of monitored files for Data Pump above the current 61 limit?  Currently, our company has 97 processes, and I can see the number doubling over the next five years. (Yes, we like Data Pump!)

       

      Since many of our reports are based on batch cycles, the dates on which input files become available vary considerably.  Our approach has been to schedule each process to run daily at 6:00 AM and 6:00 PM, but this has led to the Job Log becoming considerably long, since we like to maintain a year's worth of history.  This is especially true for reports that run quarterly: we end up with 4 logs of valid data and 726 where the input file wasn't found.

       

      One other aspect I'm curious about is the fact that the condition "Process Prerequisites failed, but no retry information is available"... is considered completed for the purpose of the Job Log Distribution.  I agree it is completed, but if we want to keep a year's worth of history of completed logs, it would be nice to purge out the cases in the last year where input files weren't available.  Perhaps these would need a separate distribution?

       

      Has anyone else encountered similar problems of going over the limit of monitored files?  Are there any suggestions or possible workarounds?  Thanks!

        • Maximum number of Monitored Files
          Gareth Horton

          Joey,

           

          I'll look into this for the next version, although the limit is really governed in some ways by the limitation of the .NET Framework.

           

          If you have a technical background, the limit of 64 wait handles in .NET is what is causing this.  We also use a couple externally, so that leaves 60 for processes.

           

          [url="http://msdn.microsoft.com/msdnmag/issues/04/10/NETMatters/"]http://msdn.microsoft.com/msdnmag/issues/04/10/NETMatters/[/url]

           

          (search for WaitHandle.WaitAll on this page, which mentions the limit - the rest of the code is not really relevant)

           

          We will try and look at alternative approaches for the next version. 

           

          I will contact you by PM in the future to discuss the categorization of failure events.

           

          Gareth

           


          • Maximum number of Monitored Files
            Joel F

            I work with Joey and help administer Data Pump with him.

             

            Have you considered using the FileSystemWatcher as an alternative?  You would want to test what happens if the connection to the server hosting a monitored file goes down.  I would imagine you could open multiple instances of the object dynamically, so that technically there would be no limit.

             

            I've used it in a .NET Windows service and it works pretty well.
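            For what it's worth, the per-directory idea can be sketched outside .NET as well.  Below is a minimal polling-based sketch in Python (a hypothetical illustration, not how Data Pump or FileSystemWatcher actually work): because it polls rather than registering OS wait handles, any number of instances can run side by side.

```python
import os

class PollingDirectoryWatcher:
    """Reports files that have appeared in a directory since the last poll.

    A polling alternative to event-based watchers: no OS wait handles
    are consumed, so there is no fixed cap on how many can coexist.
    """

    def __init__(self, path):
        self.path = path
        # Snapshot the directory contents at construction time.
        self._seen = set(os.listdir(path))

    def poll(self):
        """Return the set of file names added since the previous poll."""
        current = set(os.listdir(self.path))
        new_files = current - self._seen
        self._seen = current
        return new_files
```

            A real monitor would also need to handle deleted files, modification times, and unreachable network paths (the server-connection concern above), but the point is simply that polling sidesteps any wait-handle ceiling.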

            • Maximum number of Monitored Files
              Gareth Horton

              Joel,

               

              I am not 100% sure, as I have not seen the code for that personally, but I believe that is the way it is done.  Note that the FileSystemWatcher uses waithandles as well. Try instantiating 65 instances of the FileSystemWatcher.

               

              Note that the limit is not the number of files, but the number of pathspecs.

               

              Gareth

               


              • Maximum number of Monitored Files
                joey

                Thanks for your help, Gareth.  One more question: by pathspecs, do you mean directories as opposed to files?  So I could have an unlimited number of files monitored in 1 directory?

                 

                Most of our files go to one of 2 directories.  I think we had trouble with the number of monitored files (in 2 directories) when the number was close to 60.  I think we received an answer that the limit of 64 applied to files instead of directories.

                 

                If you think it should be directories, I can do a little testing and see what error message we might get.

                 

                Again, thanks for your clarification and assistance on this issue.  File monitoring is such a great feature, it's a shame to have to limit it to a few reports.

                • Maximum number of Monitored Files
                  joey

                  I did try enabling 75 files for 2 directories, and received the following error, so to answer my previous post, I believe files is the limitation.

                   

                  DwchServer.ReloadMonitoringTablesException: Error occurred while loading the monitoring tables. ---> DwchServer.InvalidFindFirstChangeNotificationException: Error setting up initial change notification for C:DataWatchMonitor - error 56

                     at DwchServer.ChangeNotification..ctor(String strPath, Int32 iWatchSubtree, FileNotifyChange notificationType)

                     at DwchServer.l.a()

                     at DwchServer.u.c(ServerDB A_0)

                  • Maximum number of Monitored Files
                    Gareth Horton

                    Hi Joey

                     

                    Yes, this will cause the problem.

                     

                    The only way to help with the limit is to use wildcards; I am assuming you are using full paths to the files you are monitoring.

                     

                    So you can use the * and ? wildcards to help cover more files than if you specify them fully.

                     

                    e.g.

                     

                    *_gl.prn (matches all files ending in _gl with the .prn extension)

                     

                    gl_*.prn (matches all files beginning with gl_ with the .prn extension)

                     

                    ac*100?.prn (matches all files beginning with ac and ending with 100 plus one more single character, e.g. account_code_type_1006.prn will be processed but not account_code_type_10000.prn)
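                    These * and ? semantics match Python's fnmatch module, so the patterns above can be sanity-checked there (a side illustration, not part of Data Pump itself; the file names are the hypothetical examples from above):

```python
from fnmatch import fnmatchcase

# * matches any run of characters, ? matches exactly one character.
assert fnmatchcase("region_gl.prn", "*_gl.prn")
assert fnmatchcase("gl_march.prn", "gl_*.prn")

# ? must consume exactly one character before the .prn extension,
# so 1006 matches but 10000 does not.
assert fnmatchcase("account_code_type_1006.prn", "ac*100?.prn")
assert not fnmatchcase("account_code_type_10000.prn", "ac*100?.prn")
```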

                     

                    Gareth 

                     


                    • Maximum number of Monitored Files
                      joey

                      That is certainly an innovative approach!  I would be willing to try that out, except for a separate issue I've had with Data Pump.  See the topic [url="http://mails.datawatch.com/cgi-bin/ultimatebb.cgi?ubb=get_topic;f=10;t=000109"]Process to use one of many possible projects[/url]

                       

                      Ken Indorato confirmed that when setting up a process with multiple projects, when the input file to each project is not required, certain parts of the project process anyway, such as the input distribution and the scripts for pre/post export.

                       

                      I could reduce the number of processes by grouping together similar projects with similar file names and monitoring those.  Ideally, one project would kick off at a time.  However, since certain pieces of each project will process regardless of whether there was input or not, I'm not going to try this approach yet.

                       

                      Thanks again for your help with this, and great thinking on your alternate approach!