Given that each of your record rows will create two 'columns' in your required output, and assuming that when all the resulting lines are displayed in a table the columns would have some sort of common header: do you have any feeling for the maximum number of 'columns' you might need across all versions of the reports you might be tasked to extract data from?
I suspect that you don't yet know (that would be normal in my experience) but you need to gain some sort of idea before you can really consider a development of this sort.
There are a few approaches that one could adopt and try but some, though they may work for samples, may not be even half useful for the real thing.
Also it may be that the multi-line field is not optimal for your purpose here and that it could be better to treat each line as a detail line and then put the results into a summary - or maybe a two-stage process... Is there any chance of posting a sanitized example of a page or two of the report? (If so, please paste between CODE tags as described in the post at the top of the forum pages.)
Looking forward to your answers.
There appear to be fewer than 20 pairs of columns. I've built the multiline model to this point but find "ending" the parsing of each line less than optimal (somewhat manual, in that I guess at the maximum number of lines, which should be 20) and have created large nested IFs to put the data into the columns.
It would be nice if the Calculated Lookup feature allowed you to put a variable in the "output" field ...this might help.
Thanks for the interest..... I'm moving ahead, but I feel there should be better features to help with this overall problem of multilines, parsing, and mapping the parsed fields into new calculated fields.
Firstly: is your sample
[CODE]288 Label 1 10.5 11.6
222 Label 2 5.7 8.6
222 Label 1 15.6 9.8
222 Label 1 17.5 4.3[/CODE]
a representation of the data in the report, or the result after you have extracted it using TEXTLINE?
Secondly: you show the required output as
[CODE]288 10.5 11.6 0.0(no data) 0.0(no data)
222 15.6 9.8 5.7 8.6[/CODE]
but what happens to the duplicated line 222 Label 1 17.5 4.3?
Finally - are the labels character or numeric? Do they lend themselves to sorting of a form that would define the order in which their related value fields should appear across the row?
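If the labels do sort that way, the required pivot can be sketched outside Monarch along these lines (a Python sketch only; it assumes the labels sort naturally and that the first occurrence of a duplicated label wins - which are exactly the points in question):

```python
# Sketch: sortable labels define the across-column order; 0.0 fills "no data".
# Assumption: first occurrence of a duplicated (key, label) wins.
rows = [
    ("288", "Label 1", 10.5, 11.6),
    ("222", "Label 2", 5.7, 8.6),
    ("222", "Label 1", 15.6, 9.8),
    ("222", "Label 1", 17.5, 4.3),  # duplicate label for key 222
]

# The sorted set of labels defines the pairs of value columns.
labels = sorted({label for _, label, _, _ in rows})

# Collect first-seen values per key and label.
per_key = {}
for key, label, v1, v2 in rows:
    per_key.setdefault(key, {}).setdefault(label, (v1, v2))

# Emit one wide row per key, padding missing labels with 0.0 pairs.
for key, values in per_key.items():
    flat = []
    for label in labels:
        flat += list(values.get(label, (0.0, 0.0)))
    print(key, *flat)
```

This reproduces the required output above (288 gets 0.0 padding for the missing Label 2), but only under those assumptions - hence the questions.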
How automated does the solution need to be? If it has to run completely unattended, then any approach which involves guessing an arbitrary upper limit to the number of Labels & Values for each key is going to be risky.
And as Grant asked, are the Labels unique for each Key, or do you have examples like Key A, Label 1, Key A, Label 1(again), Key A, Label 2...? Also, do you ever have empty values?
I think there's a two-pass approach which could handle this - possibly three passes if you need to prep the data a little. Suppose the data reads:
[CODE]288 L1 10.5 11.6
288 L2 5.7 8.6
222 L1 15.6 9.8
222 L2 17.5 4.3[/CODE]
First pass is to create a table of Keys with a record of the first line on which each appears. Trap the detail as one line, create a calculated field "KeyLine" from Recno(), filter to show only one record from duplicated Keys, and export a table with Key and KeyLine showing to an Access table.
Second pass: trap the detail as before (note that with a one-line trap we're not using multi-line templates here, and there's no need for memo fields or the TextLine() function), and make an external lookup to the table created in pass 1. Then create a calculated field "Order" as Recno()+1-KeyLine - this will give you a counter that is equivalent to the TextLine function you had before.
Then create a summary with Order as an across key. If you're using v10, you should be able to show Label, Value1 and Value2 as separate measures. If you're using any earlier version, you'll have some more work to do here.
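The two passes can be sketched in Python (a sketch only, not Monarch itself; it assumes the rows for a given Key are contiguous, as in the sample, and models Recno() with a 1-based row counter):

```python
rows = [  # (Key, Label, Value1, Value2), as in the sample above
    ("288", "L1", 10.5, 11.6),
    ("288", "L2", 5.7, 8.6),
    ("222", "L1", 15.6, 9.8),
    ("222", "L2", 17.5, 4.3),
]

# Pass 1: record the first line on which each Key appears (1-based, like Recno()).
key_line = {}
for recno, (key, *_rest) in enumerate(rows, start=1):
    key_line.setdefault(key, recno)

# Pass 2: Order = Recno() + 1 - KeyLine counts 1, 2, 3, ... within each Key,
# which is then usable as an across key for the summary.
wide = {}
for recno, (key, label, v1, v2) in enumerate(rows, start=1):
    order = recno + 1 - key_line[key]
    wide.setdefault(key, {})[order] = (label, v1, v2)

for key, cols in wide.items():
    print(key, cols)
```

Note that the Recno()+1-KeyLine counter breaks down if one Key's detail lines are interleaved with another's, so the optional prep pass would be needed to sort or tidy the data first.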
I have been looking at pretty much the same route as Olly.
If your data is sufficiently well presented by the original report that the value columns align well, you could use the MCR functionality (assuming you have a Multi Column Region capable version of Monarch) to extract all values into a single field name, then apply the sort order for 'across' display in a summary, and it should work as you need it to.
If you have V10 the Column() function should make this easy.
Earlier versions require a slightly different approach but nothing very complicated.
Even if your report does not align the columns well, I would be tempted (for the benefit of dynamic output column creation, if appropriate; there is a case for saying that a fixed number of columns might seem more consistent) to run a first Monarch model to both preselect the data you require and make a tidier export of the results, and then use the MCR technique in a second model. Then use a [URL="http://********************/FreeExcelTools/ExcelWithMonarch_Monarch_Batch_File_Generator.xls"]batch file[/URL] to make the two steps into a single process.
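As a sketch of that two-step batch (the file names, and the /rpt, /mod and /exp switches, are assumptions - check the command-line options documented for your own Monarch version):

```shell
REM Illustrative only: names and switches are assumptions for your Monarch version.
REM Step 1: first model preselects and tidies the report into an intermediate export.
monarch.exe /rpt:original_report.prn /mod:pass1_tidy.xmod /exp:tidy.txt
REM Step 2: second model applies the MCR trap to the tidied file.
monarch.exe /rpt:tidy.txt /mod:pass2_mcr.xmod /exp:final_summary.xls
```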
My thoughts, for what they are worth.
Thanks for all the great ideas here..... I finally did the MCR route. I checked with the group using the report and they indicated that the maximum number of lines was in fact 20, so I proceeded with MCR. I ran into a little problem with the number of variables, reaching the maximum window size of 4000 characters, but was able to do some restructuring of variables and sizes. If the number of lines were much larger, I probably would not have been able to take the MCR route given this maximum window size limit.
Your ideas were very helpful thank you.
Glad you got the result you needed. Good work.
However I am surprised you hit the 4000 limit.
If you have a number of intermediate calculated fields adding to the problem, you may find it possible to reduce their number, once the calculations are individually proven, by combining individual calculations into a single field. I'm guessing you already know that the simple way to do that is to cut & paste the formula of a particular field, pasting the copy in place of the field name wherever it is used in another formula. Add parentheses as required to keep the calculation as a 'unit' in the new field.
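As an illustration of that combining step (the field names here are hypothetical, and LSplit/Val are shown only as plausible Monarch-style functions - substitute your own formulas), two proven fields such as
[CODE]Part   = LSplit(Detail, 4, " ", 3)
Value1 = Val(Part)[/CODE]
can be collapsed into the single field
[CODE]Value1 = Val(LSplit(Detail, 4, " ", 3))[/CODE]
freeing the intermediate field's slot.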
There are also some circumstances where Monarch, by default, sets the size of character fields to the maximum when something much smaller would be equally suitable; if you are hitting limits, trimming those sizes may help. To some extent the same can be said for numeric fields, but the potential for sizes in excess of requirements is much smaller there.
Just a few ideas that may or may not add to what you have already discovered. My apologies if it reads like I am stating the obvious but the observations may be helpful to others finding this thread in the future.