7 Replies · Latest reply: Aug 10, 2016 10:20 AM by Olly Bond

    Maximum numbers of rows for a model

    kajbw _

      Is there a maximum number of rows that can be imported from a .txt or similar ASCII file?


      I'm running v13 and have some 4-million and 6-million-row files coming my way.



        • Re: Maximum numbers of rows for a model
          Austin Perkins

          I spoke with someone at Monarch last week about this topic, and apparently there is no limit; your only limitation is your computer's resources. The person I spoke with said they had worked with over 45M rows in Data Prep Studio, and I assume there would be no difference for Monarch Classic.

          • Re: Maximum numbers of rows for a model
            Grant Perkins

            What are you exporting to when your extraction is complete? Will all those millions of rows be passed on?


            If so, bear in mind that there may be restrictions in the target output file options. See the technical specs in the Help file.




            • Re: Maximum numbers of rows for a model
              Olly Bond

              Hello kajbw, hello Austin,


              There are, sadly, limits. Monarch assembles data in the table, and the application manages this whole process using Microsoft's LocalDB SQL engine. Before v12 it was the ADE (Access Database Engine), and before that it was JET, managing everything in an Access database. LocalDB offers us three advantages: we can have more than 9 external lookups, we can have 512 fields per row instead of 256, and we can store 10 GB in the table instead of 2 GB.


              Let's assume that we're given five million rows of data - can we work with it? If each row is a single field of one character, like "A", it takes up one byte, so we'd be using 5 MB of data. If each row holds 2,000 bytes of data, we'd have 10 GB and be at the limit. Arriving at 2,000 bytes per row isn't hard - 100-character-wide fields crop up frequently in address data, and remember that all numbers, even 0.01 or 1, take up eighteen bytes to store. If the average field takes 40 bytes, it's only 50 fields before your data size hits the limit. SQL Server also imposes a hard limit of 8,060 bytes per row, regardless of how many rows you have.
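              The sizing arithmetic above can be sketched as a quick back-of-the-envelope calculation. This is only an estimate based on the figures in this post (18 bytes per number, a 10 GB table cap, decimal gigabytes); actual storage overhead in LocalDB will differ.

```python
# Rough row-size budgeting against the 10 GB LocalDB table cap,
# using the assumptions from the post above (not measured figures).
NUMBER_BYTES = 18                  # assumed storage per numeric field
TABLE_LIMIT_BYTES = 10 * 10**9     # 10 GB cap, decimal units

def row_bytes(text_field_widths, numeric_field_count):
    """Estimate the stored size of one row in bytes."""
    return sum(text_field_widths) + numeric_field_count * NUMBER_BYTES

def max_rows(bytes_per_row):
    """How many rows of this size fit under the table limit."""
    return TABLE_LIMIT_BYTES // bytes_per_row

# Fifty fields averaging 40 bytes each is 2,000 bytes per row...
wide_row = row_bytes([40] * 50, 0)
print(wide_row)            # 2000
# ...which caps the table at the five million rows mentioned above.
print(max_rows(wide_row))  # 5000000
```

So a file of five million rows is workable only if the extracted rows stay comfortably under about 2,000 bytes each.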


              When someone claims to be working with 45 million rows of data, perhaps each row was under 200 characters, or perhaps they were making an outer join to those 45M rows from a smaller table, and so not adding all that data to the main table in LocalDB. I should add that Data Prep supports other joins, not just left outer external lookups, and may have a different database architecture under the hood compared to Classic Monarch.


              If you are faced with data that's just too big for Monarch to handle in one go, there are techniques to work around it.


              Hope this helps,



                • Re: Maximum numbers of rows for a model
                  Joe Meree

                  I believe you can develop your model on the desktop with Monarch and then process much bigger input sets with Automator (since it uses the full version of SQL Server and is most likely x64).


                    • Re: Maximum numbers of rows for a model
                      Olly Bond

                      Hello Joe,


                      Microsoft offer a range of SQL Server options; most support up to 10 GB as a maximum database size, but if you give Microsoft some extra money, they let you make databases of up to 524 PB. Just to give a sense of scale, that's roughly 50 million times bigger than 10 GB. If each row of our table were 500 bytes, we could make a table with about 1,000,000,000,000,000 rows. Doing anything with this table might take some time, just storing it might cost money, and where we'd want Monarch to export it, I can only imagine...
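                      To check the scale comparison above, here is the arithmetic spelled out (decimal units assumed, i.e. 1 PB = 10^15 bytes):

```python
# Scale comparison: LocalDB's 10 GB cap versus the 524 PB ceiling
# of a paid SQL Server edition, using decimal (SI) units.
GB = 10**9
PB = 10**15

localdb_cap = 10 * GB
sql_server_cap = 524 * PB

# How many times bigger is 524 PB than 10 GB?
print(sql_server_cap // localdb_cap)   # 52400000 -- roughly 50 million

# At 500 bytes per row, the row count is on the order of a quadrillion.
print(sql_server_cap // 500)           # 1048000000000000
```

So the jump from LocalDB to a full SQL Server edition is a factor of tens of millions, not tens of thousands.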


                      But practically, if a user needs to get past the 10 GB limit, the desktop edition of Automator wouldn't help, as that also runs on LocalDB, so they'd need the server edition. That starts at about USD 25,000, I think - but is there an extra charge if they need to connect it to a version of SQL Server that supports bigger database size limits?


                      Best wishes,