It's most likely that the different behaviour is caused by the 2 GB total data limit in v12 and earlier, which in v13 is raised to 10 GB. Other factors might be the versions and bitness of your Office components, but I would suggest focussing on the data size first.
Assuming your Excel file is looking up 10 numeric fields, 10 character fields with an average data length of fifty, and five date or time stamps, then each record brings almost 1 KB of data into each row of your table. 170,000 rows is about 170 MB. And if your table has 1 million rows, the external lookup will be adding those extra fields to every one of them: 1m x 1 KB, or roughly 1 GB, regardless of how many rows are in the Excel file.
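To make the arithmetic concrete, here is a quick back-of-envelope sketch; the field counts and sizes are the illustrative assumptions from above, not measurements of any real model:

```python
# Rough estimate of how much data an external lookup adds.
# Field counts and per-field sizes are assumptions for illustration.

numeric_fields = 10     # assume ~8 bytes each
char_fields = 10        # average data length of 50 characters
date_fields = 5         # assume ~8 bytes each

bytes_per_row = numeric_fields * 8 + char_fields * 50 + date_fields * 8
print(bytes_per_row)    # 620 bytes per row -- "almost 1 KB"

# The cost scales with the SOURCE table's row count, not the Excel file's.
rows = 1_000_000
added_gb = rows * bytes_per_row / 1e9
print(round(added_gb, 2))   # ~0.62 GB added for a 1m-row table
```

The point of the sketch is that the lookup's cost is multiplied by the row count of the table you are joining into, which is why a small Excel file can still push a big model past the 2 GB ceiling.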
If you need to work in v12 or below, and so stay under 2 GB, a practical technique is to break the model into steps, minimising the data size at each stage and exporting just what you need for the next stage as an Access table. So hide the fields you don't need, reduce the data length of everything you can, filter the table, and export it. Then open that export in a new project and make the lookup from there.
Diego, I definitely think you should look into Monarch Complete, now on version 14. You can load your v12 models AND the entire Excel workbook, then join those datasets. Joins are a more powerful and more dynamic option than the external lookups of earlier Monarch versions, and with Data Prep Studio in v14 you won't need to worry about limiting the size of your data!
It might be helpful to point out to your customers that the "join" in Data Prep is subtly different from the "external lookup" in Monarch Classic. Formally, DPS performs a proper SQL left outer join, which can add rows to the source table when a key matches more than one row in the lookup. Monarch Classic does something different: its external lookup brings in at most one match per row, so it never adds rows to the source table.
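That distinction is easy to demonstrate. The sketch below mimics the two behaviours with plain Python over made-up records (the field names `id`, `amt`, and `region` are invented for illustration; this is not Monarch's API):

```python
# Illustrating left outer join (Data Prep Studio style) vs a
# first-match external lookup (Monarch Classic style).
# Data and field names are hypothetical.

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
lookup = [{"id": 1, "region": "EU"},
          {"id": 1, "region": "US"}]   # two lookup rows match id 1

# Left outer join: every matching pair is emitted, so id 1 is
# duplicated; unmatched source rows are kept with a null region.
joined = [{**s, "region": l["region"]}
          for s in source for l in lookup if l["id"] == s["id"]]
joined += [{**s, "region": None}
           for s in source
           if not any(l["id"] == s["id"] for l in lookup)]

# External lookup: take at most the FIRST match per key, so the
# source table's row count never changes.
first_match = {}
for l in lookup:
    first_match.setdefault(l["id"], l["region"])
looked_up = [{**s, "region": first_match.get(s["id"])} for s in source]

print(len(joined))     # 3 rows -- the join duplicated id 1
print(len(looked_up))  # 2 rows -- same count as the source
```

In other words, a customer whose lookup file contains duplicate keys will see extra rows appear in DPS where Classic would silently have taken the first match.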