I had that same issue. What I did was save the spreadsheets to .txt format and create a project. I then created a button to run code to process each one. It goes as follows:
Private Sub cmdImportDailyRecords_Click()
    Dim MonarchObj As Object
    Set MonarchObj = CreateObject("Monarch32")
    ' Open each Monarch project and export its table to a text file
    MonarchObj.SetProjectFile "G:\GROUPS\EVERYONE\Dormant Accounts\DDADormant.prj"
    MonarchObj.ExportTable "G:\GROUPS\EVERYONE\Dormant Accounts\DDADormants.asc"
    MonarchObj.SetProjectFile "G:\GROUPS\EVERYONE\Dormant Accounts\SAVDormant.prj"
    MonarchObj.ExportTable "G:\GROUPS\EVERYONE\Dormant Accounts\SAVDormants.asc"
    Set MonarchObj = Nothing
End Sub
I then linked all of the .asc files as tables in an Access database, created queries to append the data, then ran macros to run all of the queries. Sort of cumbersome at first, but it works rather well. This may help.
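In case it helps, the Access side of that (link the exported text file, then append it to a master table) can be sketched roughly like this. The table names, the import specification "DormantSpec", and the master table "AllDormants" are just illustrative, not anything from the original setup:

Public Sub AppendDormantExports()
    ' Link one exported .asc file as a table. "DormantSpec" stands in for
    ' an import specification you would save once via the Import Text wizard.
    DoCmd.TransferText acLinkDelim, "DormantSpec", "DDADormants", _
        "G:\GROUPS\EVERYONE\Dormant Accounts\DDADormants.asc", True

    ' Append the newly linked data into a combined master table
    DoCmd.RunSQL "INSERT INTO AllDormants SELECT * FROM DDADormants;"
End Sub

Repeat the link/append pair for each exported file, or loop over the file names; once everything lands in one table, the queries and macros become trivial.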
Ok, it's a concatenation question then.
I think the Ontiveros approach seems appropriate, but I am no Excel expert!
If all of the spreadsheets can be combined into one worksheet without exceeding the max row count, that might be the way to go. If the size was going to be exceeded, then I guess the incoming xls files could be imported to Access as appended data or separate tables, and then combined into a view that Monarch could read.
As a Monarch solution, process each incoming file and export the result to the same table in Excel or Access using append to existing data. Projects, Export Projects and scripted automation would come into play. The final stage would be to run a new Monarch session to make use of the newly created file.
A slightly less sophisticated method might be to process each spreadsheet and then save it as a fixed-width text 'report'. Then open each report in Monarch and run the analysis from all of them to a single table in Monarch.
Data Pump would allow easier control over the pre-processing and provide for automation of the entire thing.
It all depends on what resources you have available and which solution you feel most comfortable with.
Does this help?