4 Replies Latest reply: May 15, 2014 9:59 AM by todd _

    how to parse this sucker  :)

    todd _

      Greetings, I am a lowly non-power user, running Pro, version 6.  I bow to all in the room, believe me.

       

      Have a fixed-length field that reads as "account number", then "/" (minus the quotes), then "client number", then "/" (minus the quotes), then "client name".  I'm trying to pull the client number out.  The rest is garbage.  Three sample lines below.

       

      Felt this was a no-brainer (I'd focus on the slashes as delimiters), then found that there are cases of account numbers (see line 2) that contain slashes themselves.  Also (see line 3), some have no account number (null).  Rats.

       

      127321/ABC100/ABC COMPANY     

      147910/151261/ABC101/DEF COMPAN

      /ABC102/GHI COMPANY HOLDINGS IN

       

      These are all individual items, so it's just a single line trap that's being done.

       

      Anyone have any suggestions?   - Todd

        • how to parse this sucker  :)
          Data Kruncher

          Todd,

           

          I've developed a model which extracts the client number. While it could probably be simplified, it currently involves six calculated fields which use some downright ugly string manipulation functions, like [font="courier"]if(instr('/',[Client Name])>0, left([Client Name],instr('/',[Client Name])-1), '')[/font]. Might be too much to post here.
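
          To give a flavour of the approach, the intermediate fields look something like this - simplified, with placeholder field names, and the real model needs more steps than this to cope with slashes inside the account number:

           

              [Account Number] = if(instr('/',[Full Field])>0, left([Full Field],instr('/',[Full Field])-1), '')

              [Remainder] = right([Full Field], len([Full Field])-instr('/',[Full Field]))

           

          Each later calculated field peels the next segment off [Remainder] in the same way.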

           

          Send me a private message and I'll email you the model, if you'd like.

           

          Sandy

          • how to parse this sucker  :)
            Grant Perkins

            Todd,

             

            Just to be sure we are not missing something - you say 'fixed length' fields but that is not quite how the examples look. Are we really talking delimited fields? (/ as the notional delimiter?)

             

            Are there always the same number of fields on each row? Is the account number always the first 'column'?

             

            If yes to the above, have you tried RSplit?

             

            I would think that lines 1 and 3 (taken as they are shown) would work anyway using LSplit, but 2 would really mess up!
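
            If RSplit behaves the way I remember - worth checking the help file for the exact argument order, and whether segments are counted from the right - then something along these lines in a single calculated field ought to do it, with [Report Field] standing in for whatever your field is actually called:

             

                rsplit(trim([Report Field]), 3, '/', 2)

             

            Capping the split at 3 segments from the right keeps any extra slashes inside the leftmost (account) segment, so line 2 would give ABC101 and line 3 would still give ABC102 with an empty account. The trim() should also take care of any trailing padding, if these really are fixed-length fields.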

             

            Grant

             


            • how to parse this sucker  :)
              todd _

              It's actually fixed length.  The first line (for example) is padded with spaces.  Didn't know quite how to post that initially.

               

              There aren't always the same number of slash-separated pieces on each row.  Basically, there are 3 fields, but the way they are printed, the 1st field gets to use as much of the available space as it "wants" to, then a '/' separates it from the next field.
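
              So, trimmed of the padding, the three samples break down as:

               

                  127321 / ABC100 / ABC COMPANY               (account / client / name)

                  147910/151261 / ABC101 / DEF COMPAN         (account containing a slash)

                  (empty) / ABC102 / GHI COMPANY HOLDINGS IN  (no account number)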

               

              Yep, you're right about 2 really "messing up".  :)

              • how to parse this sucker  :)
                todd _

                Sandy's example WORKED!  She really is the Data Kruncher!  All hail Sandy!

                 

                Thanks again!