7 Replies Latest reply: May 15, 2014 9:55 AM by Grant Perkins

    Consistent Data Prep Output (.txt to .prn)

    Richard Corby

      I am converting a very large .txt file to .prn on a daily basis to be used by Monarch (v7).  I'm finding that occasionally, the .prn result file has rows of a different width, with fields pushed over one or two places - I assume this is based on the source data.  How can I ensure that the output file is the same each day so the template works consistently?

        • Consistent Data Prep Output (.txt to .prn)
          Grant Perkins

          Richard,

           

          Is there something about the .txt file that makes it unusable directly by V7?

           

          Grant

           

          Originally posted by Richard Corby:

          I am converting a very large .txt file to .prn on a daily basis to be used by Monarch (v7).  I'm finding that occasionally, the .prn result file has rows of a different width, with fields pushed over one or two places - I assume this is based on the source data.  How can I ensure that the output file is the same each day so the template works consistently?

          • Consistent Data Prep Output (.txt to .prn)
            Richard Corby

            I guess not...I have just not been able to parse it properly.  Perhaps that is where my question should be centered?

             

            Can you take a look?

            "DataDate","ClientNum","clientName","AdmZone","AreaCode","PostalArea","Archive","AgtTeam","AgtCode","AgtName","InvoiceNb","Ref","FileNb","DueDate","TransCode","STATUS","CreditNote","DtPromise","AmtPromise","ExchRate","DtLastCheque","AvgPayment","Age","Age 0-30","Age 0-30 U.S$","Age 31-60","Age 31-60 U.S$","Age 61-90","Age 61-90 U.S$","Age 91-120","Age 91-120 U.S$","Age 121-180","Age 121-180 U.S$","Age 181","Age 181 U.S$","Age 91","Age 91 U.S$","Cur","Total","Total U.S$","Age 0-90","Age 0-90 U.S$"

            "2003/11/18","AAAA00007","*CHARGE DE COMPTE / ACCOUNT MANAGER","Q","","","40178","ESR","DI","JEAN HAYES","AT7087","3-470489","","2003/07/14","5","","","","","1.45","2003/07/04","9","127","0","0","0","0","0","0","0","0","216.60","0","0","0","216.60","0","","216.6","0","0","0"

            "2003/11/18","AAAA00007","*CHARGE DE COMPTE / ACCOUNT MANAGER","Q","","","40178","ESR","DI","JEAN HAYES","BJ2087","3-655101","","2003/07/14","5","","","","","1.45","2003/07/04","9","127","0","0","0","0","0","0","0","0","65.14","0","0","0","65.14","0","","65.14","0","0","0"

            "2003/11/18","AAAA00007","*CHARGE DE COMPTE / ACCOUNT MANAGER","Q","","","40178","ESR","DI","JEAN HAYES","BL2564","3-660713","","2003/07/14","5","","","","","1.45","2003/07/04","9","127","0","0","0","0","0","0","0","0","20.24","0","0","0","20.24","0","","20.24","0","0","0"

            "2003/11/18","AAAA00007","*CHARGE DE COMPTE / ACCOUNT MANAGER","Q","","","40178","ESR","DI","JEAN HAYES","BH1776","3-571382","","2003/07/14","5","","","","","1.45","2003/07/04","9","127","0","0","0","0","0","0","0","0","25.08","0","0","0","25.08","0","","25.08","0","0","0"

            "2003/11/18","AAAA00039","*CHARGE DE COMPTE / ACCOUNT MANAGER","Q","","","40314","CER","PC","PATRIZIA TROPIANO (PETITS COMPTES A)","DL0200","3-862221","36272","2003/06/25","1","","","","","1.45","","0","146","0","0","0","0","0","0","0","0","975.00","0","0","0","975.00","0","","975","0","0","0"

            "2003/11/18","AAAA00039","*CHARGE DE COMPTE / ACCOUNT MANAGER","Q","","","40314","CER","PC","PATRIZIA TROPIANO (PETITS COMPTES A)","DL0201","3-862231","36276","2003/06/25","1","","","","","1.45","","0","146","0","0","0","0","0","0","0","0","856.00","0","0","0","856.00","0","","856","0","0","0"

            Thank you.

            • Consistent Data Prep Output (.txt to .prn)
              Grant Perkins

              Richard,

               

              This is a nice one!

               

              Do you have V7 Pro? If so, I would simply read the .csv file as a database and create the table that way.

               

              BUT ...

               

              The records from the report then exceed the internal record length by 41 characters. This is not a problem if you can exclude some of the fields (41 characters' worth!) that you import, although there may then be some issues about how much free space you have within the record for calculated fields and so on. Not something I can claim much experience with, as I have rarely hit the limit as far as I know.

               

              If you don't have the Pro version we need a different approach, and your conversion to .prn may well be a good choice.

               

              I will look into that depending on your answer to the above (long task list today so it's difficult to find time to just play!)

               

              What is your process to convert to .prn?

               

              Grant

               

              Originally posted by Richard Corby:

              I guess not...I have just not been able to parse it properly.  Perhaps that is where my question should be centered?

               

              Can you take a look?

              Thank you.

              • Consistent Data Prep Output (.txt to .prn)
                Richard Corby

                Grant,

                 

                No, I do not have Pro!  And I cannot drop any fields...this list was created just for me with the fields I needed.

                 

                I have been playing with floating trap, but just can't seem to get it to do what I want.

                 

                I use Prep to convert to PRN via a nightly batch file.  I thought this worked fine until I noticed my template was off a few places and assumed that the data input into Prep was affecting the output position...I have not verified this yet.  But one day, a customer may have a "0" balance and the next they have "100.00".

                 

                Are there better conversion tools that will allow me to fix the output width (what is MSRP)?

                 

                Thanks for your assistance...I thought I had this project wrapped up until my reports came out wrong!

                • Consistent Data Prep Output (.txt to .prn)
                  Grant Perkins

                  Hi Richard,

                   

                  I would go for the upgrade, it's the only solution ... 

                   

                  Actually NOT in this case it seems since your record length exceeds the 256 limit.

                   

                  In this case the Prep approach will provide a better result.

                   

                  I'm not sure where you think you have a problem with the data in the fields. It all looks reasonably OK except for the numeric fields with decimals, which are LEFT JUSTIFIED in line with the first letter of the HEADER. They would require some thought in the template and some formatting in the table.

                   

                  Also, I noticed that the LAST of the values (the "TOTAL" column) seems to be a field limited to 5 chars in the original file. That looks odd and suggests a probable error in the original file requiring reference back to the programmer.

                   

                  I think the .csv file functionality will, by default, make a field width as wide as the widest 'field' it finds. Sometimes the header column title, sometimes the data.

                   

                  Given that variability from report to report it is a useful idea to make sure that the column headers (where there are any) define the maximum column width that may ever be required for the field in the column. If you don't do that, especially when modelling for an automated process, later versions of the report may have values that exceed the field sizes in the model - giving unanticipated results.
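                  The sizing rule described above can be sketched in a few lines. This is not Monarch or Prep itself, just a rough Python illustration (the function name and sample values are made up) of a converter that sizes each fixed-width column to the widest value found in either the header or the data - which is exactly why a day with "0" instead of "100.00" shifts the columns:

```python
import csv
import io

def csv_to_fixed_width(csv_text, pad=1):
    """Convert CSV text to fixed-width lines, sizing each column
    to the widest cell seen in that column (header included)."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    widths = [max(len(row[i]) for row in rows) for i in range(len(rows[0]))]
    return [
        "".join(cell.ljust(widths[i] + pad) for i, cell in enumerate(row)).rstrip()
        for row in rows
    ]

# A short made-up sample: column widths change with the data,
# unless the header titles are already the widest cells.
sample = '"Name","Total"\n"JEAN HAYES","216.60"\n"PATRIZIA TROPIANO","0"'
for line in csv_to_fixed_width(sample):
    print(line)
```

                  With this rule, padding every header title out to the maximum width the data could ever need is what makes the output positions stable from day to day.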

                   

                  Or maybe my cut and paste sample is deficient?

                   

                  Whatever, let me know if I have spotted your particular problem. If not we may have some other anomalies to uncover, since the rest of the fields seem to align very well on screen. I have not yet tried a template but it doesn't look like it would be a problem.

                   

                  Grant

                   

                  Originally posted by Richard Corby:

                  Grant,

                   

                  No, I do not have Pro!  And I cannot drop any fields...this list was created just for me with the fields I needed.

                   

                  I have been playing with floating trap, but just can't seem to get it to do what I want.

                   

                  I use Prep to convert to PRN via a nightly batch file.  I thought this worked fine until I noticed my template was off a few places and assumed that the data input into Prep was affecting the output position...I have not verified this yet.  But one day, a customer may have a "0" balance and the next they have "100.00".

                   

                  Are there better conversion tools that will allow me to fix the output width (what is MSRP)?

                   

                  Thanks for your assistance...I thought I had this project wrapped up until my reports came out wrong!

                  • Consistent Data Prep Output (.txt to .prn)
                    Richard Corby

                    I verified that the conversion process uses the larger of the first row (header) or the data when sizing fields.

                     

                    What I decided to do is create a header row that is more than wide enough for my data.  I originally used spaces to space it out, but the Prep program ignored those and I switched to "_".

                     

                    I will then run a nightly batch to append my daily data file to this header file and convert.

                     

                    Thanks.
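                    The nightly step described above could look something like this. A hedged Python sketch only - the widths, file paths, and column names here are invented for illustration, not taken from the real 42-field extract - showing a header row padded with underscores to the maximum width each field may ever need, prepended to the day's data before conversion:

```python
import csv
import io

# Hypothetical maximum widths for a few of the columns; the real
# header file would cover all fields in the extract.
MAX_WIDTHS = {"ClientNum": 12, "AgtName": 40, "Total": 12}

def padded_header(widths):
    """Build a CSV header row whose titles are padded with underscores
    (spaces were ignored by Prep) to at least the stated widths."""
    buf = io.StringIO()
    csv.writer(buf, quoting=csv.QUOTE_ALL, lineterminator="\n").writerow(
        [name.ljust(width, "_") for name, width in widths.items()]
    )
    return buf.getvalue()

def build_daily_file(daily_csv_path, out_path):
    """Prepend the padded header to the day's extract (assumed to have
    no header of its own) so field widths stay fixed day to day."""
    with open(out_path, "w", newline="") as out, open(daily_csv_path) as day:
        out.write(padded_header(MAX_WIDTHS))
        out.write(day.read())
```

                    Because the converter sizes each column to the wider of header and data, an over-wide header pins every field's position regardless of that day's values.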

                    • Consistent Data Prep Output (.txt to .prn)
                      Grant Perkins

                      Richard,

                       

                      That sounds sensible BUT I THINK your original input file carries the initial problem.

                       

                      If you have control over that pre-PREP, then you should succeed. But if not (i.e. the original extract INCLUDES the error) then there is nothing you can do EXCEPT have the bespoke extract modified.

                       

                      I would be delighted to be wrong on this, but if the sample you posted is your starting point then you have a problem that you cannot easily solve well.

                       

                      Grant

                       

                      Originally posted by Richard Corby:

                      I verified that the conversion process uses the larger of the first row (header) or the data when sizing fields.

                       

                      What I decided to do is create a header row that is more than wide enough for my data.  I originally used spaces to space it out, but the Prep program ignored those and I switched to "_".

                       

                      I will then run a nightly batch to append my daily data file to this header file and convert.

                       

                      Thanks.