
Using the MOA interface

Download MOA and run it. It’s a Java program, like Weka. If you can run Weka, you can run MOA!

Here’s an example of how to use the MOA interface. You will need to go through these steps before attempting the Quiz that follows.

Click Configure to set up a task. Change the task type in the dropdown menu at the top to LearnModel. As you can see, the default learner is Naive Bayes. You could change it by clicking the Edit button and then selecting from the dropdown menu at the top – classifiers are organized the same way as they are in Weka. However, leave it as Naive Bayes for now.

The default data stream is a Random Tree Generator. Use the corresponding Edit button to change it to the Waveform Generator, which generates instances from a combination of waveforms. Change the number of instances to generate from the default 10,000,000 to 1,000,000.

Finally, specify a taskResultFile, say “modelNB.moa”, where MOA will output the model.

Now click OK, and then Run to launch this task. Textual output appears in the center panel, in this case every 10,000 steps. Various evaluation measures appear in the lower panel and are continuously updated until the task completes. MOA can run several tasks concurrently, as you will see if you click Run twice in quick succession. Clicking a job in the top panel displays its information in the lower two panels.

The task you have just run is

LearnModel -l bayes.NaiveBayes -s generators.WaveformGenerator 
    -m 1000000 -O modelNB.moa

– you can see this in the line beside the Configure button – and the Naive Bayes model has been stored in the file modelNB.moa. (Note that parameters that have their default value are not shown in the configuration text.)
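Incidentally, MOA tasks are not tied to the GUI: the same configuration text can, in principle, be passed to MOA's moa.DoTask class on the command line. A sketch, assuming moa.jar is in the current directory (the jar name and classpath details may differ for your MOA download):

java -cp moa.jar moa.DoTask \
    "LearnModel -l bayes.NaiveBayes -s generators.WaveformGenerator 
     -m 1000000 -O modelNB.moa"

This runs exactly the task shown above, writing the model to modelNB.moa in the current directory.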

Click Configure and change the learner to a Hoeffding Tree with output file modelHT.moa:

LearnModel -l trees.HoeffdingTree -s generators.WaveformGenerator 
    -m 1000000 -O modelHT.moa

and run it. Now we have two models stored on disk, modelNB.moa and modelHT.moa.

We will evaluate the Naive Bayes model using 1,000,000 new instances generated by the Waveform Generator, which is accomplished by the task

EvaluateModel -m file:modelNB.moa 
    -s (generators.WaveformGenerator -i 2) -i 1000000 

The “-i 2” option sets a different random seed for the Waveform Generator. You can set up most of this in the Configure panel. At the top, set the task to EvaluateModel, and configure the stream (which has reverted to the Random Tree Generator) to the Waveform Generator with an instanceRandomSeed of 2. Frustratingly, though, you can’t specify that the model should be read from a file.

It’s useful to learn how to get around such problems. Click OK to return to the main MOA interface and select “Copy configuration to clipboard” from the right-click menu (on a Mac trackpad, use Alt/Shift/tap). Then select “Enter configuration”, paste the clipboard contents into the new configuration, and add -m file:modelNB.moa to the command line, where you can edit the text freely. This gives the EvaluateModel task the parameters it needs to load the Naive Bayes model produced in the previous step, generate a new waveform stream with a random seed of 2, and test on 1,000,000 examples. (Recall that parameters that have their default value are not shown in the configuration text.)
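The same edited task could also, as a sketch, be launched directly from the command line with moa.DoTask, sidestepping the clipboard workaround entirely (again assuming moa.jar is in the current directory; adjust the classpath for your installation):

java -cp moa.jar moa.DoTask \
    "EvaluateModel -m file:modelNB.moa 
     -s (generators.WaveformGenerator -i 2) -i 1000000"

The quoting matters: the whole task specification, parentheses included, is passed as a single argument.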


This article is from the free online course:

Advanced Data Mining with Weka

The University of Waikato
