I can say that the best older version I used was definitely Monarch 10.5 Pro, which was the de facto choice for many years.
However, I don't believe Datawatch still supports anything older than version 12.
Having said that, there have been significant improvements in their software.
They do offer a trial of their newer software, Monarch v13, as well as their Desktop Designer for visual analytics, and I would encourage you to give those a whirl to see the differences, as they are quite vast.
I am sure others here can speak to the advantages of the newer versions vs 8 and 9.
I believe the main difference in the older Monarch versions between Standard and Pro was the ability to save projects vs having to open models and reports each time.
I'm struggling with hotel wifi on my phone here so this will be too short and full of typos, but as Joe suggests, 10.5 Pro was almost perfect software and if you find a copy, buy it.
In brief, Standard won't support projects, external lookups, non-text sources (PDF, Excel, ODBC) or automation.
Thank you for your reply. I had one other person who told me the differences, but he indicated it would support PDF, so I bought it. That is something that is critical to me, so I may have wasted my money on this purchase. I'll keep my eye out for a 10.5 or newer.
I agree with the others that 10.5 is the best older version. I have both 10.5 and the newest version and use the 10.5 as often as possible. The main drawback I have found with the older versions is that they do not play well with newer versions of Excel or Adobe files. These are the two file types I use most often. If you get a really old version, you may have problems opening or exporting to files in Excel 2011-2013 and newer pdf writers.
I believe another difference worth noting is that if you are using a newer 64-bit version of Microsoft Office, you will encounter an issue with the older 32-bit version of Monarch, as it will not support cross-compatibility. Newer versions of Monarch come in both 32-bit and 64-bit flavors, so you have the option to suit your setup better.
Joe's absolutely right about the Office co-existence and the bitness - under the hood, Monarch v11 and below read and write data in Access formats using the JET engine (up to v10) and the Access Database Engine (in v11). As most Monarch work involves reading from or writing to Excel, it's intimately tied to Microsoft's drivers, so think about which Office flavour is on the desktop environment.
Monarch v12 and above use a local SQL engine - which is why you can have 512 fields instead of 256, and 16 external lookups instead of 9. It's still local SQL, though, so it's capped at 2GB of total data size - if you are faced with bigger data, do get in touch, as there are sometimes tricks that can help.
Just to add one word of caution about v10... Up to v11, models and projects were XML files. Monarch v10 uses a component from Microsoft - MSXML 4.0 - to handle the parsing of these files. This component is the subject of a security advisory from Microsoft, so you might not want to install it, or a responsible IT security department might not allow you to.
The old Standard and Pro versions of Monarch have gone - Monarch Classic and Data Prep Studio both read PDFs and let you join data sets. The best approach really depends on what your customer needs. For self service data discovery, the new Data Prep Studio is helpful, but it won't help you automate a regular job, and the auto-definition of templates, while easier for end users who don't want to learn how to build a model, doesn't give you the full power of Monarch to deal with tricky reports.
Designer lets you build interactive dashboards - similar to Qlik and Tableau - with an engine to handle real time data. The old charts in Monarch v10 are long gone - probably a blessing - and Data Prep can export natively to Designer, Tableau and Qlik, I think. The old automation using batch files is still available, but at a licence cost that is similar to the Data Pump server (now called Automator).
Hope this helps,
Luckily for me, I've got access to the best of both worlds with 10.5 (concurrent network 4-user version) and the latest v13.4.
I have to say that I do love the simplicity of the earlier 10.5 version (although it really only plays ball with older versions of Excel and Access), as it also included the option to import elements from other models, such as calculated fields and summaries, which isn't possible with the current version. The pricing was also quite affordable. 13.4, although brilliant at what it does and far more extensible than the older version, is not something our company could afford to extend to others.
One of the problems I'm facing now, however, is that our company will be moving to a "hosted" or "virtual" desktop (i.e. connecting via RDC to a Windows Server 2012 machine running TS and 32-bit Office 2016), which will definitely affect access to 10.5.
I can't see how this can be done and have - quite reluctantly, I should add - started to look at alternatives such as Microsoft's Power BI or Astera's Report Miner for a simple no- or low-cost self-serve option.
It'd be useful to know whether others are facing the same issues and whether they've found a way through!
We are currently using 10.5 and have recently upgraded a test population of my company to Office 2016. Now exports to Access are generally no longer working, so you are not alone! However, we do have one person who seems to be able to export directly to Access. We are trying to figure out the difference between his setup and mine so that others can continue to use the processes we have set up previously. It appears that upgrading to the most current version is pretty cost-prohibitive, as we also use Windows batch files to automate processes. If it now requires two separate licenses (Monarch v13 and Automator), we may also be forced to find an alternative solution with another piece of software. I would appreciate any insight into this subject as well.
Based on your information that a co-worker can successfully use Access 2016, there would seem to be a solution to your problem somewhere.
Is he able to export using the same model/project or is he using different export definitions?
If the latter, what sort of Access file type is being used for the export - MDB (older Access versions) or ACCDB? Are you using the same?
Are your respective Windows/Office installations both up to the same levels of updates?
Do you get an error message when the export fails? At what point does it fail?
Let's start there and see where it leads.
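Incidentally, if anyone is unsure which flavour a given Access file actually is, the file header gives it away: Jet-era .mdb files carry the string "Standard Jet DB" near the start, while .accdb files carry "Standard ACE DB". A quick sketch for sniffing this (the header strings are as I recall them - verify against your own files before relying on it):

```python
def access_flavour(path):
    """Peek at an Access file's header to guess Jet (.mdb) vs ACE (.accdb)."""
    with open(path, "rb") as f:
        header = f.read(128)  # the magic string sits in the first few bytes
    if b"Standard Jet DB" in header:
        return "mdb (Jet)"
    if b"Standard ACE DB" in header:
        return "accdb (ACE)"
    return "unknown"
```

That can help confirm whether two machines are really exporting to the same file format, regardless of what the file extension claims.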
Thanks so much for reaching out to me. We did find a fix for this issue. Hopefully I’m saying this right, we downloaded and installed the Microsoft Access Database Engine 2010 Redistributable and I am now able to export directly from Monarch 10.5 into Access 2016.
I really wish Datawatch had not separated out the ability to run the program from a batch file, as that is a simple but elegant way to automate processes for people who aren't particularly technical. It would be nice if they took the perspective of the user and gave back that functionality.
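For anyone who hasn't seen the old approach: classic Monarch could be driven unattended via command-line switches such as /rpt (report), /mod (model) and /exp (export), which is what made plain batch files enough for automation. A minimal sketch of wrapping that from a script - the switch names are from the classic documentation as I remember it, and the install path and file names below are hypothetical, so check against your own version's help:

```python
import subprocess

# Hypothetical install location - adjust for your machine.
MONARCH_EXE = r"C:\Program Files\Monarch\Monarch.exe"

def build_monarch_cmd(report, model, export):
    """Assemble a classic Monarch command line for an unattended export.

    Switches (per the classic docs - verify for your version):
      /rpt: input report file   /mod: model file   /exp: export target
    """
    return [MONARCH_EXE,
            f"/rpt:{report}",
            f"/mod:{model}",
            f"/exp:{export}"]

def run_export(report, model, export):
    # Waits for Monarch to finish; raises if the export fails.
    subprocess.run(build_monarch_cmd(report, model, export), check=True)
```

Called as, say, run_export(r"C:\data\sales.prn", r"C:\models\sales.xmod", r"C:\out\sales.xls"), this is the sort of one-liner a scheduled task or batch file used to handle, which is exactly the simplicity that got moved behind the Automator licence.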
Thanks again for reaching out. Please let me know if you have any additional questions.
technology advancement group
I'm glad you found a solution to your problem.
I had seen some references to the Access 2010 Redistributable approach but have not needed to investigate it personally, so I was cautious about mentioning it. I figured that comparing your system with your co-worker's would be a better option from an IT configuration and management perspective.
As for automation ... I am slightly torn between two views of this - three if you include the financial aspect.
On the functional side, whilst basic batch runs and even VB programming approaches offer a great way for individuals to run things on their own machines, questions can be raised about maintenance of the processes, shareability and support throughout an organisation, and data and process security and authorisation.
In heavily regulated industries - anything to do with financial services, for example - where more and more controls are being implemented in recent times, the need for traceability and discovery reporting tends to suggest that a co-ordinated and managed approach would be sensible. Auditing potential is important.
One might argue that simply being seen to have a corporate policy for such an approach might be enough by itself to deflect any threat of disruptive investigation, and corporate boards or business owners would probably take comfort from that possibility.
The idea of having a software tool available that provides the required functionality for consistent use throughout an organisation should be attractive to those responsible for running the business. At the same time, a well-developed tool ought to make users' lives easier and allow continuity for the business during absences of process originators.
Of course for commercial reasons there will be costs involved. Licence costs tend to be more visible in financial accounting than employee time costs but in the end the real question is about cost effectiveness of whichever is the preferred approach.
From time to time I have become aware of people who have, almost single handed, used Monarch and desktop "automation" developments to provide remarkable reporting power for their employers and with very few licences required. What their employers did when they were not available for extended periods I am not sure. I also wonder what happened within the business when those people were no longer available to the employers. There seemed to be no backup coverage available in the business plans I was aware of.
By acknowledging the needs for continuity and auditability, and acquiring an application to perform those functions, a business can obtain measurable benefits as well as providing a focal point for the importance of such considerations more broadly within the business.
At the same time, the application should make employees' work less onerous and allow their skills to be deployed more widely in the organisation.
In the end it all comes down to perceived benefits versus perceived costs.
Would appropriate software facilitate more useful information retrieval and analysis activities (or whatever it is used for) AND better control of sources and traceability?
Can the costs be controlled to the point that they are acceptable in the current usage scenario or in an augmented use scenario?
I think you could make representations to Datawatch about what you would like to see by way of functionality for your purposes and how much that functionality is worth to you. Remember to look a little beyond the day-to-day "must have" features that just make stuff happen, and consider the potential for wider benefits that would come with a dedicated package.
As a first step, I would suggest talking to Datawatch to see what they might be able to offer.
As a second step, it might well be worth opening a thread here in the community to discuss the needs, the pros and cons, and the financial considerations of various automation levels, from the personal desktop to a full enterprise-wide, high-volume automation model.
The combination of the rise of "Big Data" and ever-encroaching regulatory requirements has been rapidly shifting the focus of people working in this area in recent years. Maybe now is a good time to find out what effects that may be having on people at the sharp end of automation usage.
Just my thoughts on this historically somewhat contentious matter.