I suspect your problem is with the total number of records, irrespective of the reason for the apparent current limit. Have a look in the Technical Specifications section of the Help file for what may be useful information.
There is an absolute limit of 9,999,999 records that can be held in the table.
There is also a limit on the size of the file (approximately 1 GB) that contains the work table. If you are creating large records (a large number of fields and/or large fields), you can hit that limit well before you hit the maximum-number-of-records limit.
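As a rough illustration of how those two limits interact (sketched in Python for convenience, and assuming a flat 1 GB cap; the exact figure may differ), the record width determines which limit you hit first:

```python
# Hypothetical illustration of the two limits described above: the hard
# record cap (9,999,999) and the approx. 1 GB work-file cap. Which one
# binds first depends on how wide each record is. These constants are
# taken from the thread; the real internal values may differ slightly.

RECORD_CAP = 9_999_999
FILE_CAP_BYTES = 1_000_000_000  # approx 1 GB, as stated above

def binding_limit(bytes_per_record: int) -> str:
    """Return which limit is reached first for a given record width."""
    records_before_file_cap = FILE_CAP_BYTES // bytes_per_record
    if records_before_file_cap < RECORD_CAP:
        return f"file size cap after ~{records_before_file_cap:,} records"
    return f"record cap of {RECORD_CAP:,} records"

# Narrow records run into the record count; wide records run into file size.
print(binding_limit(50))   # a narrow record
print(binding_limit(500))  # a wide record
```

So a 500-byte record hits the file-size ceiling at around 2 million records, long before the record cap.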
This file can be found in the folder identified by the Workpath variable used for temporary file creation; the location will likely vary according to which operating system version you are running. Have a look in the Settings section of the Help file for more information. If you find the file and check its size while you extract your data, you will have an indication of whether it is at its maximum. My guess is that it could be.
Now, I don't normally work with files this size, so I haven't run into these kinds of issues before. But after running some tests, the symptoms Doc describes are rather consistent with my results (more on that later), and at this point I suspect he's found a genuine issue.
Anyway, without having a 12 million record file lying around, I wrote a little program to create one.
Open "C:\test\doctest.prn" For Output As #1
For i = 1 To 12000000
    Write #1, i, Int(Rnd * 10000000)
Next i
Close #1
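For anyone without VB handy, here's a rough Python equivalent of that little generator (my own sketch, not Doc's code; the path and the record layout mirror his example):

```python
import random

# Sketch of an equivalent test-file generator in Python (my rendering,
# not Doc's original VB). Writes n records, one per line, each being a
# sequence number and a random integer separated by a comma, which is
# roughly what VB's Write # statement produces for two numeric values.
def make_test_file(path: str, n: int) -> None:
    with open(path, "w") as f:
        for i in range(1, n + 1):
            f.write(f"{i},{int(random.random() * 10_000_000)}\n")

# Full-size run (roughly 200 MB); start with a much smaller n to try it out.
# make_test_file(r"C:\test\doctest.prn", 12_000_000)
```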
This creates a 194MB file with precisely 12,000,000 records.
After creating a single wide field in the detail template, I (rather oddly, given the tech specs Grant correctly quoted) wound up with more than 10 million records in the table window: 10,007,099, to be precise.
Then I created a filter to limit the selection to the first million records. Monarch gave me 17 records. :confused:
Attempting to filter for records 6MM to 7MM returns no records. Same for records 1MM to 2MM. :confused:
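As a sanity check outside Monarch, you can count the records in a given range directly. A quick sketch of my own, assuming the two-field comma-separated layout produced by the generator above (the function name is my invention):

```python
# Count lines whose first field (the sequential record number) falls in
# [lo, hi]. Against the generated file, filtering records 1 to 1,000,000
# should of course return exactly 1,000,000 matches, not 17.
def count_in_range(path: str, lo: int, hi: int) -> int:
    n = 0
    with open(path) as f:
        for line in f:
            rec = int(line.split(",", 1)[0])
            if lo <= rec <= hi:
                n += 1
    return n
```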
I'm at a loss to explain this behaviour.
Thanks for helping me keep my sanity. I've since decided to externally parse the files down to a more manageable size before attempting to use Monarch. I must have misread the Monarch specifications; I had thought the size of the files used was limited only by system resources. I had failed to consider that the subset of the file loaded into the Table does have a limit: 9,999,999 records, as Grant pointed out.
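For the record, the "externally parse it down" step can be as simple as splitting the file into chunks that stay under the 9,999,999-record table limit. A sketch of my own (file naming and default chunk size are arbitrary choices):

```python
# Split a large line-oriented file into numbered chunk files, each with
# at most records_per_chunk lines, so every chunk stays safely under
# Monarch's 9,999,999-record table limit. Returns the chunk paths.
def split_file(src: str, records_per_chunk: int = 9_000_000) -> list:
    paths = []
    out = None
    count = 0
    with open(src) as f:
        for line in f:
            if out is None or count >= records_per_chunk:
                if out:
                    out.close()
                paths.append(f"{src}.part{len(paths) + 1}")
                out = open(paths[-1], "w")
                count = 0
            out.write(line)
            count += 1
    if out:
        out.close()
    return paths
```

Each resulting .partN file can then be opened against the same model, one at a time.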
So, once again... Thank you for your analysis.