
    Future Enhancement Requests

    joey

The Monarch forum has a great thread for Monarch enhancement ideas. I'm starting this one for Data Pump, in the hope that we can use it to make suggestions.

        • Future Enhancement Requests
          ZJSoule _

Implement a way to test a job and see if it runs correctly without having to actually run it. I have jobs with huge distributions and scripting, and I have to remove it all to test a job and then put it all back in. Very annoying.

          • Future Enhancement Requests
            joey

First enhancement: I would like better error messages in the job log when dealing with file locking. Jobs bomb from time to time when a user has left a file open at the end of the day, and unfortunately the job log is not as clear as I would like.

             

For example:

[job log excerpt lost in the forum migration]

             

Now, what file was locked? Was it the input file, the project, or an export file? If an export file, which one? There are 10 exports from this particular project. It would be nice if the file name were included in the error message.

             

To further the enhancement idea: since Data Pump runs as an administrator, would it be possible for the log to list WHO had the file locked, or any other details?

             

            As I said, this happens from time to time and our current troubleshooting tools are very limited.

              • Future Enhancement Requests
                Olly Bond

                Hello everyone

                 

                Great idea for a thread!

                 

                I've got a few processes that crash when our SMTP server doesn't respond in time. I'd like DataPump to be able to try to get an SMTP connection for n seconds, or n times, and then fail gracefully, rather than hanging.

                 

                Also in v10, if I try to cancel a running job, DataPump doesn't accept the instruction and the job stays. I have to delete it in order to run another instance of the process.

                 

                Best wishes,

                 

                Olly

                  • Future Enhancement Requests
                    Bill Watson

As I don't have version 10, perhaps someone can tell me if this would still be a good addition:

                     

Ability to duplicate a process from the management console. At the moment I have to run a stored procedure I created in the Datapump SQL Server database to duplicate a process.

                     

The reason this is needed is that sometimes you want to make minor changes to a live process, or even perform a complete overhaul. Setting up a new process to replicate one with countless projects attached can be long-winded, and is also prone to error if you happen to miss a particular project.

                     

                    Stored Procedure I currently use:

                     

USE [YourDataPumpDatabase]   -- database name lost in the forum migration; substitute your Data Pump database
GO
/****** Object:  StoredProcedure [dbo].[SP_DUPLICATEPROCESS]    Script Date: 11/08/2010 12:21:09 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER OFF
GO

CREATE PROCEDURE [dbo].[SP_DUPLICATEPROCESS]
    @PROCESSIDENTIFIER INT,
    @PROCESSCOPIES INT
AS
BEGIN
    DECLARE @counter INT

    -- Loop @PROCESSCOPIES times, appending @counter to the process name
    SET @counter = 0
    WHILE @counter < @PROCESSCOPIES
    BEGIN
        INSERT INTO [dbo].[PumpProcesses] (ProcessName, Description, ProcessDocument, Engine, Priority, AllowParallelJobs)
        SELECT PumpProcesses_1.ProcessName + ' Duplicate ' + CAST(@counter AS varchar(3)) AS ProcessName,
               Description, ProcessDocument, Engine, Priority, AllowParallelJobs
        FROM [dbo].[PumpProcesses] AS PumpProcesses_1
        WHERE ProcessID = @PROCESSIDENTIFIER

        SET @counter = @counter + 1
    END
END

                     

Basically, when you execute the stored procedure, you pass it the ID number of the process (obtained from the PumpProcesses table) and the number of duplicate copies you want.
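
For example, to make two copies of the process with ID 42 (the ID and count here are just illustrative):

EXEC [dbo].[SP_DUPLICATEPROCESS] @PROCESSIDENTIFIER = 42, @PROCESSCOPIES = 2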

                     

This will then create duplicates which can be edited from the management console, each named "Original Process Name Duplicate n".

                     

A new option in the management console to run this procedure (or a similar one) by selecting the process and clicking a link would be great.

                      • Future Enhancement Requests
                        thompssc _

                        Nice ideas, I like it.

                         

Bill - in v10 you can export a process as XML and re-import it; DataPump will automatically rename it so it doesn't overwrite the existing process.

                         

I would love to have Datapump work reliably with Active Directory. As it stands, I run into occasional AD entries which cause a crash when the process goes to run the distribution. To work around the issue we have to import any user we want to add to a distribution list into the local address book - this can be problematic in an organization with thousands of employees.

                          • Future Enhancement Requests
                            mdyoung _

Here are a few ideas I have in mind:

1. Better process organization management - The ability to create sub-folders (or something of the sort) under the Processes folder in order to better organize processes. For example: we have many departments that have processes run for their particular needs. It would be nice to create a sub-folder for each department and put their processes in there, as opposed to searching through a long list of processes when you have to make an edit.

2. Easier alert management for each individual process - Need an easier way to create an alert (without having to write a script) that emails a certain person or group when a particular process fails. Supply an additional tab on the process properties dialog that lets you set up an email alert to a person(s) when the process fails. As an additional note, have it email ONLY after the maximum number of retries has been reached; unless retries are set to indefinite, then email on each retry (or something like that). Better yet, give us the option to choose whether to email on every retry or only after max retries has been exceeded. Currently, we have global alerts set up to email people when ANY process fails, but there are several processes that I don't want certain people knowing failed, mostly because it doesn't apply to them and they freak out sometimes. I only want them to know when a process that relates to them fails. I realize you can create scripts to do this, but not everyone knows how to write these scripts, much less edit and manage them.

3. File monitoring delay - Provide a way to specify a delay between a file being detected by the file monitoring system and process execution. To give you an idea of what I'm experiencing: we have a system from which we download reports so DataPump can process them and place them into a SQL database. Some of these are scheduled and some rely on manual downloads. Some of these reports are huge, and the way our system downloads is to create the file first, then open it and start writing to it. We need DataPump's file monitoring capabilities to detect when the reports are available; however, because of the way our system downloads, especially with large reports, DataPump detects the file and processes it before all the data is in the file. A way to delay processing for x number of minutes after file detection would resolve our issues.

4. Distribution authentication credentials option - Ability to specify credentials to use when a distribution copies or moves an exported file to a network location. As it currently is in v10, the "Set Authentication Credentials" option is displayed when specifying an FTP location. We need a similar thing when specifying a network location using a UNC path (i.e. \\SomeServer\SomeFolder).

That's all I can think of right now.

                             

                            Thanks,

                             

                            Micheal

                              • Future Enhancement Requests
                                thompssc _

How about pagination / filtering / searching for the "Jobs" screen? We only keep a few months of history and it can take a while to display the entire list, but I usually only want to see logs from today or for a specific process. (In the interim I just wrote a report to save me time.)
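
For anyone wanting to do something similar in the interim, a report like that can be driven by a query against the Data Pump database. This is only a sketch - PumpProcesses is real (see the duplication procedure above), but the PumpJobs table and its columns are guesses, not the actual job log schema:

-- Sketch: PumpJobs and its columns are hypothetical stand-ins for the job log table.
SELECT j.JobID, p.ProcessName, j.StartTime, j.Status
FROM [dbo].[PumpJobs] AS j
JOIN [dbo].[PumpProcesses] AS p ON p.ProcessID = j.ProcessID
WHERE j.StartTime >= CAST(GETDATE() AS date)   -- today's jobs only
ORDER BY j.StartTime DESC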

                          • Future Enhancement Requests
                            joey

                            Olly:

Old response to your support post, but we are rebuilding our server and I realized we have the same problem as you with respect to SMTP server timeouts. In fact, our Exchange server was down for some time a few years ago and wreaked havoc on our Data Pump environment.

                             

We run an SMTP service on the Data Pump server that continually tries to push emails to our Exchange server. That way Data Pump is happy and never times out. Would you like more information?

                              • Future Enhancement Requests
                                Olly Bond


                                Hello Joey,

                                 

                                Yes please! I'm waiting to hear from Datawatch with confirmation of a release date for DataPump v11, and then I'll start planning to roll out the upgrade. If we can iron out the email timing issue beforehand that would be a great help for my colleagues.

                                 

                                Best wishes,

                                 

                                Olly

                          • Future Enhancement Requests
                            Ken Hartland

As in Microsoft Excel, it would be very useful if Monarch could sum the cells in a highlighted area. This would allow a quick check (for control purposes, etc.) of the sum of the data being used.

                              • Future Enhancement Requests
                                Grant Perkins

                                Ken,

                                 

Is this something specifically related to Monarch use within Data Pump? If not, it may be better to place it in the Monarch forum's Future Enhancement Requests thread so it does not become lost in the development process.

                                 

                                Grant Perkins

                                  • Future Enhancement Requests
                                    mdyoung _

Based on a recent post by Mr. John Battalio, another good suggestion - and one I'd think would be easy to implement - is a simple check box to determine whether or not to execute a post-process script when the process fails.

                                     

                                    Thanks,

                                     

                                    Micheal

                                      • Future Enhancement Requests
                                        mdyoung _

I've thought of another enhancement that really needs to be included in the next release. This has bugged the heck out of me for years: allow the ability to specify separate input distributions for each input file used in the same process. Let me explain...

                                         

The hospital I work for has three different facilities, each located in a different city. For the purpose of this explanation, I'll refer to them as FacA, FacB, and FacC. They all use the same information system, but each has its own database within the system. A report is generated automatically from each of the three facilities' databases. All three reports are downloaded to three separate network folders (one per facility), but the reports are all named the same. Because of the way the IS system is designed, there is no way for me to change the file naming convention. For example, the report will be named SYSTEM_STATUS for all three facilities.

                                         

Now... in the DataPump process on the Input tab, I have all three reports being processed. After processing, I need these reports stored in archive directories separate from each other. For instance, FacA's input file needs to be stored at "\\Server\ArchiveDir\FacA\SYSTEM_STATUS_CurDate", FacB's at "\\Server\ArchiveDir\FacB\SYSTEM_STATUS_CurDate", and FacC's at "\\Server\ArchiveDir\FacC\SYSTEM_STATUS_CurDate".

                                         

Currently, I'm unable to separate the distributions because there's no way of telling the input distribution system which input report to apply a distribution to. With the reports all named the same, telling it to throw all three in the same directory will not work because of "File already exists" errors... and I sure don't want to overwrite what's in there. With the ability to keep distributions separate for each input file, I could tell it to put input A into FacA's folder, B into FacB's, and C into FacC's.

                                         

One thing that might cure my problem is to make the naming macros able to read into the file with the applied model and grab the First(ColumnName) value, similar to the exporting macros. That way, I could include a Facility column in the model and use it to determine the input path.

                                         

                                        Thanks,

                                         

                                        Micheal

                                          • Future Enhancement Requests
                                            Olly Bond

                                            Hello everyone,

                                             

Another enhancement request would be to dynamically email output based on key values. If we define a key value for a summary as the email address we'd like that summary sent to, then we'd like a distribution option to email to the key value. A catch-all server setting - an email address for the DataPump admin, to which data is routed when the email address is null or invalid - would be nice too.

                                             

                                            Best wishes,

                                             

                                            Olly

                                        • Future Enhancement Requests
                                          joey

                                          Yes, this is a PDF we download from the TX Department of Insurance. Unfortunately they keep changing the format.

                                            • Future Enhancement Requests

We have a monitored folder, and the file it was processing was still being written, so it failed. Then it sent an alert email and shut off monitoring. We have "Retries" turned on, so our thought was that even if it errored out it would retry 20 times (what we specified). Instead it never retried, and no other files in the folder were processed because monitoring was shut off after that one error.

                                               

It seems like the Retries tab should retry no matter what, because a lot of the errors we see happen because a file is still being written to, or a database couldn't get a connection for 1 second out of the day...

                                        • Future Enhancement Requests
                                          joey

Another enhancement request: Data Pump uses the Windows address book to help with addresses. Windows 7 does not include the address book. What does Data Pump use for addresses on a Windows 7 machine?

                                          • Future Enhancement Requests
                                            joey

                                            These are the instructions to set up the service:

                                             

                                             

Install SMTP Service

1. Open Server Manager.
2. Select Features, then click Add Features.
3. In the Add Features Wizard, check SMTP Server.
4. In the Add Features Wizard, click Add Required Role Services.
5. In the Add Features Wizard, click Next.
6. On the Web Server (IIS) page, click Next.
7. On the Select Role Services page, click Next.
8. On the Confirm Installation Selections page, click Install.
9. On the Installation Results page, click Close.
10. In Services, set the Startup type of Simple Mail Transfer Protocol (SMTP) to Automatic (Delayed Start).
11. Open Internet Information Services (IIS) 6.0 Manager (Start → Administrative Tools → Internet Information Services (IIS) 6.0).
12. Expand the server node, right-click the SMTP virtual server, and select Properties.
13. On the Access tab, click the Connection button.
14. Click Relay, and add 127.0.0.1 as the only allowed address.
15. Open the Monarch Data Pump Pro Administrator.
16. Click E-Mail Settings in the Server Settings section, and set the SMTP Server to localhost.

                                            • Future Enhancement Requests
                                              joey

                                              Another request: 

                                               

                                              I would like Data Pump and the xprj editor to be able to access the Microsoft Exchange GAL to get email addresses.

                                              • Future Enhancement Requests
                                                joey

If you look at the job log, under an email there is a from and a resolved_from. Any time the number of emails doesn't match up, I think we should have the ability to send an error email to the Data Pump admins.

                                                 

We have had cases where a person's name was typed wrong and the contact couldn't be resolved to an email address. In these cases the job continues, but the notification was wrong. We don't find out unless a recipient notifies us that not everyone received the email.

                                                  • Future Enhancement Requests
                                                    jmckune _

A request I have is to allow support for executing stored procedures in SQL Server. Currently we can only connect to a data source that contains tables or views. Because we are a hospital, HIPAA requires that all accesses to patient data be audited, and our IT department prefers to build stored procedures that yield data sets and automatically perform the logging/auditing functions for us. We like handling data that way, as IT tunes the SQL queries for us and we just treat them like a black box.

                                                     

                                                    For example, I would like to be able to execute a stored procedure for pulling back outpatient visits between two dates.  The SQL Server command is:

                                                     

                                                    exec sp_AdminVisitsRAW '01/01/2012', '03/31/2012'

                                                     

This returns rows and columns of data just like a SELECT statement from any table would. However, Data Pump does not currently let you enter a SQL Server command: nothing but tables and views. This would be a HUGE win for us if we could do this.
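
For what it's worth, one workaround sometimes used in SQL Server while waiting for this kind of support is to wrap the stored procedure call in a view via a loopback linked server, so the tool only ever sees a view. A rough sketch, not something we have in production - the LOOPBACK linked server name, the [YourDatabase] name, and the view name are all made up, and OPENQUERY requires the procedure to return a single, fixed-shape result set:

-- Sketch only: LOOPBACK is a hypothetical linked server pointing back at this instance,
-- and [YourDatabase] stands in for the database holding the procedure.
DECLARE @srv sysname = @@SERVERNAME;
EXEC sp_addlinkedserver
    @server = N'LOOPBACK',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = @srv;
GO
-- Expose the procedure's result set as an ordinary view that Data Pump can select from.
CREATE VIEW dbo.vw_AdminVisits_Q1_2012 AS
SELECT *
FROM OPENQUERY(LOOPBACK, 'EXEC [YourDatabase].dbo.sp_AdminVisitsRAW ''01/01/2012'', ''03/31/2012''');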

                                                     

                                                    Best regards,

                                                     

                                                    Jeff McKune

                                                    Administrative Director of Planning and Decision Support

                                                    Phelps County Regional Medical Center

                                                  • Future Enhancement Requests
                                                    joey

I noticed the following job log today. It has a status of "job completed without errors", but if you look at the log, there were errors. I believe that when there are errors (in this case importing the PDF), Datawatch should report them and set the correct status code. Ideally, execution of the project would stop when the PDF cannot be imported.

[job log excerpt lost in the forum migration]

                                                      • Future Enhancement Requests
                                                        Olly Bond

                                                        Hello Joey,

                                                         

I'd suggest a "fail on verify" event when the unfiltered table produces no rows - this would give you protection against scaling import errors. But in this case, what triggered it? Has the PDF changed?

                                                         

                                                        Best wishes,

                                                         

                                                        Olly