1 Reply Latest reply: May 15, 2014 10:12 AM by Olly Bond

    Uploading large data/system stall

    Maurice _

      For some reason MDP stalls when processing a large text file of some 500,000 rows. Any help is greatly appreciated.

        • Uploading large data/system stall
          Olly Bond

          Hello Maurice,

           

          The number of rows is only one factor. Monarch handles data in memory in Access (.mdb) format, which limits it to 2 GB of data, 10 million rows, 254 fields, or 4,000 characters of data per record.
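
          As a rough sanity check outside Monarch, a short script like the sketch below can count the rows and the longest record in the export and compare them against those limits. The file name, the encoding, and the assumption that one line equals one record are placeholders to adjust for your report; the 254-field limit can only be judged against your model, not the raw text.

            # Sanity-check a report text file against the limits mentioned above:
            # ~2 GB of data, 10 million rows, 4,000 characters per record.
            # "report.txt", the encoding, and "one line = one record" are assumptions.
            MAX_BYTES = 2 * 1024**3
            MAX_ROWS = 10_000_000
            MAX_RECORD_LEN = 4_000

            rows = 0
            longest = 0
            total_chars = 0

            with open("report.txt", "r", encoding="latin-1", errors="replace") as report:
                for line in report:
                    record = line.rstrip("\r\n")
                    rows += 1
                    longest = max(longest, len(record))
                    total_chars += len(record)

            print(f"rows={rows:,}  longest record={longest}  approx. size={total_chars:,} chars")
            print("within 10 million row limit:", rows <= MAX_ROWS)
            print("within 4,000 char limit:    ", longest <= MAX_RECORD_LEN)
            print("within 2 GB limit:          ", total_chars <= MAX_BYTES)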

           

          However, if you can process the data by hand in Monarch on the same machine, then DataPump ought to be able to process it automatically for you.

           

          If the data is too large, there are two techniques to handle it. The "non-programming" way would be to trap the detail line with a literal numeric trap and run the file through ten models, one for each digit 0-9, so that each pass extracts only a tenth of the records. The slightly more advanced solution is to ask Datawatch for help configuring the "table-less" operation mode of DataPump, where the records are written straight into SQL Server.
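
          Just to illustrate the first technique: the sketch below applies the same divide-by-digit idea as a pre-processing step outside Monarch, writing each line into one of ten smaller files keyed by the digit at an assumed trap position. The trap column, file names, and encoding are placeholders; in Monarch itself you would simply put a literal digit 0-9 in the numeric trap of ten copies of the model.

            # Split a large report into ten smaller files by the digit at an
            # assumed trap position (TRAP_COL). This mirrors the ten-model,
            # literal-numeric-trap approach described above, done outside Monarch.
            TRAP_COL = 0   # assumed character position of the trap digit

            outputs = {str(d): open(f"part_{d}.txt", "w", encoding="latin-1")
                       for d in range(10)}
            skipped = 0    # lines with no digit at the trap position (headers etc.)

            with open("report.txt", "r", encoding="latin-1", errors="replace") as report:
                for line in report:
                    key = line[TRAP_COL] if len(line) > TRAP_COL else ""
                    if key.isdigit():
                        outputs[key].write(line)
                    else:
                        skipped += 1

            for handle in outputs.values():
                handle.close()
            print(f"skipped {skipped} non-matching lines")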

           

          Best wishes,

           

          Olly