
    Question regarding DUPLICATES

    kcnickel _

      (I'm currently using Monarch Pro V9)

      Every day I load 60 days' worth of banking reports (from the current day back to 60 days ago) and need to look for duplicate items.  If my duplicates filter caught 190 items the previous day and 194 today, I know I have 2 new duplicates, since each new duplicate pair adds two rows.  The problem is I don't know which 2 are the new ones unless I compare against the previous day's list, which can be time consuming.  I recently added a template that grabs the date of the reports.  However, the two rows of a duplicate pair may or may not have the same date, so I am now wondering:

      Is there a way to adjust my filter to say 'show me the duplicates where at least one of the two items has today's date'?  I'm currently sorting the items by date, which shows me whether I have new duplicates, but then I have to re-sort to match them up (when the report dates don't match).  Hope that makes sense, and thanks for any advice!

        • Question regarding DUPLICATES
          Data Kruncher

          Hi KC,

          Welcome to the forum!

          I think you're looking at a two-pass solution to your challenge.

          In the first pass, you'll create a list of the items which appear more than once across your reports. To do this, create a filter in the table window named Duplicates. On the Advanced tab, select "Duplicated rows" and "all duplicated rows", then check the field you want to monitor in the "Specify keys" list box.

          Click OK to close the filter, then export the table to an Excel file (or to an Access table if you have too many rows for Excel). Add this export to the project exports. Save the model and the project.

          Close everything and connect to your exported table as a database (Open Database...). After the records have been imported, create a filter which only shows records where the date is today.

          Those will be the new duplicates you're after from the previous days' activity.
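          If you'd like to prototype this two-pass logic outside of Monarch first, here's a minimal pandas sketch. The file and column names (banking_reports.xlsx, item_id, report_date) are placeholders for your own, not actual Monarch fields:

              import pandas as pd

              # Pass 1: keep every row whose key value appears more than once
              # (the rough equivalent of the "all duplicated rows" filter).
              df = pd.read_excel("banking_reports.xlsx")
              dups = df[df.duplicated(subset="item_id", keep=False)]
              dups.to_excel("duplicates.xlsx", index=False)  # the exported table

              # Pass 2: reopen the export and keep only the rows dated today.
              dups = pd.read_excel("duplicates.xlsx")
              today = pd.Timestamp.today().normalize()
              print(dups[pd.to_datetime(dups["report_date"]) == today])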

           

          HTH,

          Kruncher

          Edit: It was suggested to me by a certain Most Honorable Guru that a compound filter might eliminate the need for a two-pass process. So, with that in mind, create your Duplicates filter as above. Then create a Today filter with the expression MyDate = Today(). Both of these are formula-based filters.

          Now create a compound filter joining Duplicates AND Today.
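          In pandas terms, the compound filter collapses the two passes into a single boolean expression (again just a sketch with the same placeholder names):

              import pandas as pd

              df = pd.read_excel("banking_reports.xlsx")
              is_dup = df.duplicated(subset="item_id", keep=False)  # Duplicates
              is_today = pd.to_datetime(df["report_date"]) == pd.Timestamp.today().normalize()  # Today

              # Compound filter: Duplicates AND Today.
              print(df[is_dup & is_today])

              # Variant for the original question ("at least one of the pair has
              # today's date"): keep every duplicated row whose key also shows up
              # in today's set, so you see both halves of each new pair.
              keys_today = set(df.loc[is_dup & is_today, "item_id"])
              print(df[is_dup & df["item_id"].isin(keys_today)])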

           

          Let us know if this works out for you.

          • Question regarding DUPLICATES
            imindallastx _

            I'm having a similar problem, but I'm working with one flat file that is 334MB in size.

            I have accounts with the account holder's name and the amount owed, but a customer can have sub-accounts, and I'm trying to filter out just the duplicates.

            I've been running the "All Duplicated Rows" function for the last 5 hours with no end in sight.

            I have run other filters requesting similar information, and they have taken less time than the duplicate filter does.

            Is there something I can do to speed up the process?  Am I even using the Duplicated Rows feature correctly?

            Any help you can give is appreciated.

            • Question regarding DUPLICATES
              Grant Perkins

              Hi Angela,

              Sounds like a large file - does that also mean it will produce a large number of records?

              As with all such situations, the more records the report produces, the longer it will take; beyond a certain number of rows, the processing time is likely to grow much faster than linearly, as the sketch below illustrates.
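              I don't know exactly how Monarch implements its duplicate check, but here is a rough Python sketch of why the approach matters: comparing every row against every other row grows with the square of the row count, while a single counting pass makes only one trip through the data.

                  from collections import Counter

                  rows = ["A", "B", "A", "C", "B"]  # stand-ins for the key field values

                  # Pairwise check: roughly n*n comparisons, which balloons as rows grow.
                  slow = {r for i, r in enumerate(rows)
                          for j, s in enumerate(rows) if i != j and r == s}

                  # Single counting pass: one trip through the data.
                  counts = Counter(rows)
                  fast = {r for r in rows if counts[r] > 1}

                  assert slow == fast  # both find {"A", "B"}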

               

              How many duplicates are you likely to be trying to process?  How precise will the matching be?

              Could you try filtering the extraction first, so you can test the principle on a smaller working set of data?

              Just some thoughts that might suggest ideas.

              HTH

              Grant

              • Question regarding DUPLICATES
                Data Kruncher

                Hi Angela,

                Do you have v7 Standard, or Pro, or have you (hopefully) upgraded since you updated your forum profile?

                I may have another approach if you have Pro.

                Kruncher

                • Question regarding DUPLICATES
                  imindallastx _

                  I could filter my file down further than it already is, and I think the result will be about 20 to 30K records.

                  I have run other filters in the past that have not taken this long, and I'm thinking that asking for all the duplicates is what is bogging down the process.

                  In response to DC... I have upgraded to v9.