If I'm interpreting this correctly, you've got a Character type field that contains the value "17,14".
To filter all instances where a given value exists in MyField, use a filter expression like the one below. For example, this formula will find the instances that contain a standalone 3:
InStr(" 3,"," "+trim(MyField))>0 .Or. Right(MyField,2)=" 3" .Or. MyField="3"
The formula adds a space to MyField just in case the value is something like "3, 30, 33". This ensures that it'll still work with "1, 2, 3, 4" or "1, 2, 3" or "3".
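The padding trick the formula relies on can be sketched outside Monarch as well. Here is a rough Python equivalent (hypothetical helper name), showing why the leading space stops a search for 3 from matching 30 or 33:

```python
# Sketch of the same delimiter trick the Monarch expression uses:
# prepend a space so a search for " 3," cannot match "13," or "33,".
def contains_value(field: str, value: str) -> bool:
    padded = " " + field.strip()              # mirrors " " + trim(MyField)
    return (f" {value}," in padded            # value followed by a comma
            or padded.endswith(f" {value}")   # value is the last entry
            or field.strip() == value)        # field holds only the value

print(contains_value("3, 30, 33", "3"))   # True  (first entry)
print(contains_value("1, 2, 3", "3"))     # True  (last entry)
print(contains_value("30, 33", "3"))      # False (no standalone 3)
```

The three branches mirror the three clauses of the Monarch formula: value-plus-comma anywhere in the padded string, value at the very end, or value as the whole field.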
They are not character fields; they are numeric.
There aren't any spaces after the comma.
The file looks like this
mary smit 3
john doe 33
mary jane 32
Does the filter that deals with the 17,14 column also have to deal with the examples above? I would guess not if the field is numeric, unless you are using a comma as a decimal point in the first example.
The example data above would, I think, be best dealt with using a filter based on a numeric field. So if you want to filter for only 3 (to use an example), make it a Value and filter with numberfield=3.
You can also create a filter for a selection (or a range) of numbers. If you have a lot of non-contiguous numbers to select for, though, I would be tempted to go the flexible route: define the commonly used filters individually and then use the Compound Filter facility to bring them together as you need them.
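The idea behind combining individually defined filters can be sketched outside Monarch too. This is not Monarch syntax, just an illustration in Python of named single-value filters being OR-ed together on demand; all names here are hypothetical:

```python
# Individually defined, commonly used filters (one per value of interest).
filters = {
    "is_3":  lambda row: row["num"] == 3,
    "is_32": lambda row: row["num"] == 32,
    "is_33": lambda row: row["num"] == 33,
}

def compound_or(rows, names):
    """Keep rows matching ANY of the named filters (a compound OR)."""
    return [r for r in rows if any(filters[n](r) for n in names)]

rows = [{"name": "mary smit", "num": 3},
        {"name": "john doe",  "num": 33},
        {"name": "mary jane", "num": 32}]

# Bring two of the individually defined filters together as needed:
print(compound_or(rows, ["is_3", "is_33"]))
```

The point is the separation: each simple filter is defined once, and different compound combinations reuse them without rewriting the formulas.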
All in all, there are a number of different usable ways to write the filter formulas. Which you use may be dictated by the requirement, or may just be a matter of personal choice if there are options available to you.
As I recall, there are some rather useful suggestions in the Help file. I found them to be excellent guides before I became familiar with Monarch, and I still use them from time to time if something out of the ordinary is required.
A different approach to the problem might be to have one row per entry. Rather than the 17,14 example resulting in two rows in your table, and having to do some fiddly work to split out the GROUP values and apply logical tests to them, you could, provided you know an upper bound on the maximum number of group values for each record (10? 100?), easily create a table with one row per group value.
One technique for achieving this in one pass would be to use the multi-column region option, but with a twist. You can then manipulate the data on a line-by-line basis, filtering or adding calculated fields as required, and if you want the output back in the original layout, you can easily achieve this with a summary using GROUP as an across key.
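The one-row-per-entry reshaping described above can be sketched in a few lines of Python (field names and sample records are assumptions based on the thread's examples): a record whose GROUP field holds "17,14" becomes two rows, one per group value.

```python
# Sample records: (name, comma-separated GROUP values). Hypothetical data
# modelled on the thread's examples.
records = [("mary smit", "17,14"),
           ("john doe",  "33")]

# One row per (name, group value) pair.
exploded = [(name, g.strip())
            for name, groups in records
            for g in groups.split(",")]

print(exploded)
```

Once the data is in this shape, each row can be filtered with a simple equality test (group == "17") instead of the substring logic needed for the compound field.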