
    Extracting 1,000,000 Records

      I have a 12 million record file. I need to output 12 one-million-record files. For example:

      File1: 1 million records.
      File2: 1 million records.
      ...
      File12: 1 million records.

      I have an internal sequence number (not Rowno(), not Recno(), but a sequential number native to the data) that I am using to select the output data.

      This process works for the first six files, using filters such as:

      Seq#>=1.And.Seq#<=1000000
      Seq#>=1000001.And.Seq#<=2000000
      ...
      Seq#>=11000001.And.Seq#<=12000000
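
      (My generalization of the pattern above, for anyone following along: the boundaries for file N run from (N-1)*1000000+1 through N*1000000, so the seventh file, where the trouble starts, would be Seq#>=6000001.And.Seq#<=7000000.)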

       

      When the filters attempt to extract the seventh file and beyond, no records are extracted.

      I tried to use Rowno() and Recno(), but Monarch wouldn't let me use those functions in a filter.

        • Extracting 1,000,000 Records
          Grant Perkins

          I suspect you may have a problem with the total number of records, irrespective of the reason for the apparent current limit.

           

          Have a look at the Technical Specifications section of the Help file for some potentially useful information.

           

          There is an absolute limit of 9,999,999 records that can be held in the table.

           

          There is also a limit on the size of the file (approx. 1 GB) that contains the work table, and if you are creating large records (large numbers of fields and/or large fields) you can hit that limit well before you reach the maximum record count.
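
          To put rough numbers on it (my arithmetic, not from the specs): a 1 GB work table spread across 12,000,000 records leaves only about 89 bytes per record, or about 107 bytes per record at the 9,999,999 record ceiling, so records of any real width will hit the file-size limit first.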

           

          This file can be found in the folder identified by the Workpath variable used for temporary file creation. The location will likely vary according to which operating system version you are running; have a look in the Settings section of the Help file for more information. If you find the file and check its size, you will have an indication of whether it is at its maximum when you extract your data. My guess is that it could be.

           

          HTH.

          Grant

            • Extracting 1,000,000 Records
              Data Kruncher

              Now I don't normally work with files this size, so I haven't run into these kinds of issues before. But after running some tests, the symptoms described by Doc are consistent with mine (more below), and at this point I suspect that he's found a genuine issue.

               

              Anyway, without having a 12 million record file lying around, I wrote a little program to create one.

               

              Sub writefile()
                  ' Generate a test file of 12,000,000 records:
                  ' a sequence number paired with a random value.
                  Dim i As Long
                  Randomize
                  Open "C:\test\doctest.prn" For Output As #1
                  For i = 1 To 12000000
                      Write #1, i, Int(Rnd * 10000000)
                  Next i
                  Close #1
              End Sub
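
              If memory serves, Write # delimits the values with a comma, so each line comes out looking something like 1,4501278 (the second value is illustrative), which is easy to pick up as a two-field detail template.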

               

              This creates a 194 MB file with precisely 12,000,000 records.

               

              After creating a single wide field in the detail template, I (rather oddly, considering the tech specs as correctly quoted by Grant) wound up with in excess of 10 million records in the table window: 10,007,099, to be precise.

               

              Then I created a filter to limit the selection to the first million records. Monarch gave me 17 records. :confused:

               

              Attempting to filter for records 6MM to 7MM returns no records. Same for records 1MM to 2MM. :confused:

               

              I'm at a loss to explain this behaviour.

                • Extracting 1,000,000 Records

                  Thanks for helping me keep my sanity. I've since decided to externally parse the files down to a more manageable size before attempting to use Monarch (a sketch of that split is below). I must have misread the Monarch specifications: I had thought the size of the files used was limited only by system resources, and I failed to consider that the subset of the file held in the Table does have a limit, 9,999,999 records, as pointed out by Grant.

                  So, once again... Thank you for your analysis.
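
                  For anyone reading this later, a minimal sketch of that external split, written in the same VB style as Kruncher's generator. The file paths and the one-record-per-line assumption are mine, not from the thread:

                  Sub splitfile()
                      ' Split the big file into 12 parts of 1,000,000
                      ' lines each, assuming one record per line.
                      Dim rec As String
                      Dim i As Long, part As Integer
                      Open "C:\test\doctest.prn" For Input As #1
                      For part = 1 To 12
                          Open "C:\test\part" & part & ".prn" For Output As #2
                          For i = 1 To 1000000
                              If EOF(1) Then Exit For
                              Line Input #1, rec   ' read one record
                              Print #2, rec        ' write it to the current part
                          Next i
                          Close #2
                      Next part
                      Close #1
                  End Sub

                  Each part then stays comfortably under the 9,999,999 record limit that Grant quoted.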