3 Replies Latest reply: Nov 12, 2014 1:55 PM by Olly Bond

    Monarch Training

    Peyton Sweat

      Hi all,

       

      Does anyone know if there are any companies that provide Monarch training? I am looking for a few names we can contact about in-house training.

       

      Thanks

        • Monarch Training
          Gareth Horton

          Peyton,

           

          Don't know if I am pointing out the obvious here, but Datawatch do Monarch training.

           

          Check [url="http://www.datawatch.com/vortexmlsupport/training.asp"]this[/url] URL.

           

          Gareth

           

          Originally posted by Peyton Sweat:

          Hi all,

           

          Does anyone know if there are any companies that provide Monarch training? I am looking for a few names we can contact about in-house training.

           

          Thanks

          • Re: Monarch Training
            rlandeo _

            Datawatch does online training with a live instructor; we have done that several times and it works out pretty well. They also offer in-house training, but that can get pricey. One thing they mentioned is that sometimes several companies in the same geographic area will work together to get in-house training from Datawatch. If you or anyone else is interested, we would like to work together and set that up. We are in the NYC area.

            • Re: Monarch Training
              Olly Bond

              Hello Peyton, hello Raul,

               

              Datawatch have some very good instructors - both in the US and in Europe. If you're lucky enough to get into a class with George Hoffman or Steve Caiels, you'll learn plenty.

               

              I've sold Monarch training classes (I'm a lousy salesman), I've delivered Monarch training classes (I'm a worse trainer), I've been on training courses for a bunch of products (generally from LearningTree, who are great), and I've hired training from Datawatch (as well as training on other products from a few other vendors).

               

              What I've always missed with enterprise software training is a measurable ROI. The classic line is the one where the CFO asks the CEO, "What happens if we invest in developing our people and then they leave us?", and the CEO replies, "What happens if we don't, and they stay?" That kind of bluffing is a lousy argument - training is an investment, and like any investment, it should be measured.

               

              Say you've got 5-10 people involved in delivering reporting to a few hundred users. You don't want everyone to just be able to build a top ten summary from Betty's Music Store data (as in Greenbar Analytics - A worked example). You'll need some people to be able to listen to user requirements for new reports, you'll want some to understand how to automate the scheduled delivery of reports, some to debug problems with existing reports, and some to be able to build a new report from any data that's thrown at them.

               

              Some of your users will be Excel whizzes, always writing macros to get out of trouble (that's great, until you have to reliably automate, or worse, audit them). Some will be happier on servers and databases. Some will just about get the idea of a function in Excel, or a sort, or a filter, or a pivot (summary). Some will "get" visualisation, and want to help present data in a dashboard that helps support decisions.

               

              So to make training deliver the best ROI, we have to reconcile two challenges - the different expectations management have of different users, and the users' differing skills - with a third: the training syllabus. Let's get detailed (from http://www.datawatch.com/support/training/training-course-descriptions/):

               

              The "basic" Modeler training promises to teach people with "Average experience with Microsoft Windows" to: "Assess a report and use Modeler's extraction process (templates and traps) to extract the data from the source report and generate a valid data table", and to "Recognize diverse layouts of report data in more complex reports and see the structure inherent in the data. Evaluate the structure and make appropriate decisions about constructing templates" (and more...). It covers Floating traps, Multi-Column Regions, Advanced field options... The "intermediate" Modeler training deals with Cleared By templates, and then looks into the Table and Summary windows.

              The syllabus isn't bad, but that's four days, and $2000 per student, to get them through the basics. If you want to get tricky, it's extra: "Know what to try when the source report contains append fields that are needed in the Modeler table even though some of the append data may have no detail to append to..." That comes up in the first Advanced Option - another half day of staff not working for you, and another $250 online, or more onsite.
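
              To put rough numbers on that, here's a back-of-envelope sketch (Python, just because it's handy). The course fees are the ones quoted above; the $500-a-day loaded staff cost is purely my assumption - plug in your own rates.

              [code]
              # Rough cost of putting one student through basic + intermediate
              # Modeler plus one Advanced Option. Fees are from the syllabus
              # quoted above; the staff day rate is an assumption - use your own.
              COURSE_FEE_BASIC_AND_INTERMEDIATE = 2000  # four days, per student
              COURSE_FEE_ADVANCED_OPTION = 250          # half a day, online
              STAFF_COST_PER_DAY = 500                  # assumed loaded day rate

              days_off_desk = 4 + 0.5
              fees = COURSE_FEE_BASIC_AND_INTERMEDIATE + COURSE_FEE_ADVANCED_OPTION
              lost_time = days_off_desk * STAFF_COST_PER_DAY

              print(f"Cost per student: ${fees + lost_time:,.0f} "
                    f"(${fees:,} in fees + ${lost_time:,.0f} off the desk)")
              # -> Cost per student: $4,500 ($2,250 in fees + $2,250 off the desk)
              [/code]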

               

              Based on my experience, there's too much Report window material in the basic course, and not enough on the basics of the Table and Summary windows, or on the architecture (this is a model, this is a project, this is an input, this is an external lookup, this is an export) - and a lot of the tricky Report stuff belongs in Intermediate. But that's just a gut feeling, based on one person's experience. How could we measure this to get an objective idea?

               

              There are three ways we could measure the success of training. Firstly, did the customer pay their invoice? That at least lets us see who was actively unhappy - unless sales threw in training for "free" as part of a deal to sell licences. Secondly, we could ask the students to fill in questionnaires at the end of the course and hand them in to the trainer. The problem with those is that users might tick boxes to let the training company know how the trainer performed, and scribble "Great!" or "Too slow!" if you're lucky, but the results are only used by the training provider, not by you, the customer - and it is, after all, your money that you are investing in your people. Thirdly, we could test what the students can actually do, before and after the class - and that's the one I'd recommend.

               

              So my advice would be to throw the syllabus out of the window. Start with working out what your colleagues know already, then think about what you want them to know, and then hire someone to help them get there. And measure the results - if Amy can't use an Across key in a summary, or Thomas can't handle Time Intervals, after you've paid for them to get these skills, then there's a problem.
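
              Measuring it can be as simple as scoring each audited skill before and after the class. A minimal sketch - the names, skills and scores here are hypothetical, and the real list would come from your own audit:

              [code]
              # Score each audited skill before and after training (say, 0-5),
              # and flag anything the course didn't actually improve.
              # All names, skills and scores below are made up for illustration.
              before = {
                  "Amy":    {"Across key in a summary": 1, "Time Intervals": 3},
                  "Thomas": {"Across key in a summary": 4, "Time Intervals": 1},
              }
              after = {
                  "Amy":    {"Across key in a summary": 4, "Time Intervals": 3},
                  "Thomas": {"Across key in a summary": 5, "Time Intervals": 1},
              }

              for person, skills in before.items():
                  for skill, old in skills.items():
                      new = after[person][skill]
                      verdict = "OK" if new > old else "PROBLEM - no improvement"
                      print(f"{person}, {skill}: {old} -> {new} ({verdict})")
              [/code]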

               

              Monarch Experts offer a skills audit - it will take an hour of each colleague's time, and half a day of yours - so that you can get an overview of the skills (and gaps) that need addressing, and specify what you want to see improved. You don't have to hire us to train you (although we'd be happy to try to help) - you could take the results of the audit to Datawatch's trainers, or to David Gross at www.MrMonarch.com, or to your in-house training department. Ask your colleagues to take the audit again at the end of the class, and you'll be able to measure the return on your investment.

               

              Best wishes,

               

              Olly