I used to see activity of a similar nature when I first began using Monarch with v5 Standard.
I suspect, only suspect mind you, that it had something to do with insufficient RAM in the computer at the time. That, and my rushing things along didn't help either.
For me the problem would be opening a report (or reports), applying a model, and as quickly as possible copying the table and pasting the derived records into Excel. Thinking that everything was fine, I'd forge on with my work, only to discover that there were records missing in Excel. The solution was to slow down a bit, perhaps waiting 30 seconds after opening the model before copying.
But exporting was fine, so my past experience isn't exactly the same as what you're describing. If I had to speculate, I'd look at the RAM available, then the size of the disk swap file and available drive space, things like that. Maybe cleaning out your Temp folder might help.
With these sorts of odd sometimes-it-works-sometimes-it-doesn't problems, it's probably a good idea to contact Datawatch tech support directly. They may have encountered this situation and have some recommendations or even a likely solution. Let us know how it goes.
I've worked on some audit reports with 1m+ lines, where footer information from the report was used to audit as you described, and only encountered problems with a ledger where it was just too large for Monarch to handle.
We solved this by breaking the data into months, but if your report doesn't lend itself to that approach, there is another. Assuming your detail trap has a wildcard numeric (N) trap character somewhere, clone the model ten times and change that trap character to 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, then run each clone and combine the exports.
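To make the idea concrete, here is a minimal sketch of the "split by trap digit" approach in Python. Monarch itself does this through its trap definitions in the GUI, so everything below (the function name, the column position, the toy report lines) is purely illustrative, not Monarch's API: each of the ten cloned models effectively claims the subset of detail lines whose trap-position character is one particular digit, so the full extract is processed in ten smaller passes.

```python
# Illustration only: partition detail lines into ten buckets keyed by the
# digit at a fixed column (standing in for the wildcard numeric trap
# position), the way ten cloned models with traps 0-9 would divide the work.

def partition_by_digit(lines, digit_pos):
    """Group report lines by the digit found at column digit_pos.
    Lines with no digit there (headers, footers, short lines) are
    collected under the key None so nothing is silently dropped."""
    buckets = {str(d): [] for d in range(10)}
    buckets[None] = []
    for line in lines:
        ch = line[digit_pos] if len(line) > digit_pos else ""
        buckets[ch if ch.isdigit() else None].append(line)
    return buckets

# Toy report: account numbers start at column 0, so column 0 is the
# hypothetical trap position.
report = [
    "1001 Cash        500.00",
    "2040 Payables    125.50",
    "1377 Inventory    75.25",
    "---- footer line ------",
]

buckets = partition_by_digit(report, 0)
# Lines starting with '1' land together, '2' separately, and the footer
# falls into the None bucket instead of being lost.
```

The useful property is that every input line lands in exactly one bucket, so the combined output of the ten passes equals one full pass, just in chunks small enough for Monarch's work file.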
Just to clarify what you described - are the missing rows at the end of the table entries or do you seem to be losing rows from other places in the table when exporting?
Is it safe to assume you have no filters in play? Anywhere.
Is the table sorted into and displayed in a different order than it was extracted?
Like Olly, I have had situations where the input file's size and model definition resulted in a total extract that exceeded the maximum size of Monarch's internal work file. Monarch appears to finish but misses the records it cannot take in toward the end of the report. We are looking at a 1 GB+ work file here, so it does require a lot of data and/or many calculated fields to get to that point. (You can find the relevant limits in the 'Specifications' section of the Help file.)
However, I have not, as far as I recall, seen a problem exporting what it could handle, and that seems to be your issue here, rather than my problem of not being able to extract everything to start with.
I'm not filtering or sorting. I'll investigate the next instance to see if it's the final few records. I've also learned not to hurry, so I think I'm giving the file time to load, and my PC has 3 gig of RAM. It could be the swap file size, especially since I'm working on a network, but I get the error even when I'm working on a local-drive database.
Good feedback all!