This can happen when you combine lots of joins (lookups) with filters. You're asking Monarch to do a lot of work at once, and if the filter is applied to a field that only exists in a lookup table, you're telling Monarch to:
a. open the original database, take all records
b. make the first lookup, bringing in all data to match all records
c. make the second lookup,
d. the third,
e. the fourth,
f. the fifth,
g. the sixth, and finally
h. evaluate the filter expression against the whole set.
Generally, for summary outputs, Monarch is very good at working through the data set in the right order of operations to get you what you need. However, especially if the fields on which you join one of the later lookups are themselves imported by a previous lookup, you can see occasional behaviour where Monarch trips over its own shoelaces and doesn't return all your data.
The fix is to break the model down into six separate models. The first opens the original file, applies any filter that depends only on the fields in that file, and exports the result as a local .mdb Access table. The second model opens that table, makes the first external lookup, applies any filter that uses only the fields available so far, and again exports the filtered table to a different local Access table. Repeat, adding one lookup and applying any possible filter at each stage. Now you can run the six projects one after the other, using DataPump if you want logging, error reporting and all those nice things, or on the desktop with a batch file or a script, and you should have an error-free routine to join your data.
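The "run the six projects one after the other" step can be sketched as a small desktop runner. Note this is a hypothetical sketch: the Monarch.exe install path and the /rpt, /mod and /exp command-line switches shown here are assumptions based on older Monarch versions, so check the command-line reference for your release before using it.

```python
# Hypothetical runner for the staged pipeline described above: each stage
# opens the previous stage's exported .mdb, adds one lookup, applies any
# filter that is now possible, and exports the next intermediate table.
import subprocess

# Assumed install path; adjust for your machine.
MONARCH = r"C:\Program Files\Monarch\Monarch.exe"


def build_command(source: str, model: str, export: str) -> list[str]:
    """Build one stage's command line.

    The /rpt (input), /mod (model) and /exp (export target) switches are
    assumptions; your Monarch version's documentation is authoritative.
    """
    return [MONARCH, f"/rpt:{source}", f"/mod:{model}", f"/exp:{export}"]


def run_pipeline(stages: list[tuple[str, str, str]]) -> None:
    """Run each (source, model, export) stage in order.

    check=True stops the chain on the first failure, so later stages
    never run against a stale intermediate .mdb.
    """
    for source, model, export in stages:
        subprocess.run(build_command(source, model, export), check=True)
```

A DataPump job would replace this script where you want logging and error reporting, but for a plain desktop run a loop like this (or the equivalent batch file) is enough.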
Are the 6 files all versions of the same thing?
If so, if you open the files one at a time, does the filter work?
Something to eliminate first: at a very basic level, I have seen what appeared to be random, "other than expected" results which turned out to be related to the size of the internal work table being created. Monarch's internal database engine is limited (per MS Access) to 1Gb, and past versions did not advise that loading had "finished" because the working file was full rather than because Monarch had reached the end of the records it could extract from the source.
This sometimes gave unpredictable results, especially if multiple files were open, since they were not always read in the same order. Thus records that you would expect to see, based on the filter, may never actually reach the workspace database to be filtered.
6 x 8Mb files is quite a lot, although the resulting data size also depends on the number and size of the fields you are extracting and creating as part of the process.
Rather than speculate further about a way forward for such a problem, I'll wait to see whether this may in fact apply to your case. It might not; you may have something completely different going on.
Is this still V9 as per your profile?
I have had this problem with Monarch Pro since version 7 or 8. Support has only been able to tell me that they have recorded the problem and they are working on it.
Try this method which usually works for me.
First ensure that no filter is being applied.
Then go to >> Data >> External Lookups >>
Click the "Refresh" button then apply your filter.
For some reason, this method doesn't seem to work if you use "Refresh Database" under the >> File tab.