# 2.7 Using the MOA interface

Download the latest version of MOA and run it. It’s a Java program, like Weka. If you can run Weka, you can run MOA!

Note: The interface differs very slightly from what is shown in the videos. You are running in “LITE” mode, which removes rarely-used items from the menus. You can select “STANDARD” mode (used in the videos) at the top right of the screen. But LITE has everything you need for the course.

Here’s an example of how to use the MOA interface. You will need to go through these steps before attempting the Quiz that follows.

Click Configure to set up a task. Change the task type in the dropdown menu at the top to LearnModel. As you can see, the default learner is Naive Bayes. You could change it by clicking the Edit button and then selecting from the dropdown menu at the top – classifiers are organized the same way as they are in Weka. However, leave it as Naive Bayes for now.

The default data stream is a Random Tree Generator. Use the corresponding Edit button to change it to the Waveform Generator, which generates instances from a combination of waveforms. Change the number of instances to generate from 10,000,000 to 1,000,000.

Finally, specify a taskResultFile, say “modelNB.moa”, where MOA will output the model.

Now click OK, and then Run to launch the task. Textual output appears in the center panel; in this case a progress report appears every 10,000 instances. Various evaluation measures appear in the lower panel and are continuously updated until the task completes. MOA can run several tasks concurrently, as you will see if you click Run twice in quick succession. Clicking a job in the top panel displays its information in the two lower panels.
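Incidentally, MOA tasks can also be run from the command line, bypassing the interface entirely, using MOA's task runner class `moa.DoTask`. Here is a rough sketch; the jar filename and path (`moa.jar` below) are assumptions that depend on your MOA download, so adjust them to match your installation:

```
# Run the LearnModel task from the command line (jar name is an assumption)
java -cp moa.jar moa.DoTask \
  "LearnModel -l bayes.NaiveBayes -s generators.WaveformGenerator \
   -m 1000000 -O modelNB.moa"
```

The quoted string is exactly the configuration text that the interface shows beside the Configure button.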

The task you have just run is

LearnModel -l bayes.NaiveBayes -s generators.WaveformGenerator
-m 1000000 -O modelNB.moa


– you can see this in the line beside the Configure button – and the Naive Bayes model has been stored in the file modelNB.moa. (Note that parameters that have their default value are not shown in the configuration text.)
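To see what those hidden defaults are, you can write them out yourself. The following sketch shows the same task with the Waveform Generator's random seed made explicit; I believe its instanceRandomSeed defaults to 1, but check the Edit dialog in your version of MOA to confirm:

```
LearnModel -l bayes.NaiveBayes -s (generators.WaveformGenerator -i 1)
-m 1000000 -O modelNB.moa
```

Written this way, the task behaves identically; MOA simply omits the `-i 1` from the configuration text because it matches the default.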

Click Configure and change the learner to a Hoeffding Tree with output file modelHT.moa:

LearnModel -l trees.HoeffdingTree -s generators.WaveformGenerator
-m 1000000 -O modelHT.moa


and run it. Now we have two models stored on disk, modelNB.moa and modelHT.moa.

We will evaluate the Naive Bayes model using 1,000,000 new instances generated by the Waveform Generator, which is accomplished by the task

EvaluateModel -m file:modelNB.moa
-s (generators.WaveformGenerator -i 2) -i 1000000


The "-i 2" sets a different random seed for the Waveform Generator. You can set up most of this in the Configure panel: at the top, set the task to EvaluateModel, and configure the stream (which has reverted to the Random Tree Generator) to the Waveform Generator with an instanceRandomSeed of 2. Frustratingly, though, you cannot specify that the model should be read from a file.

It’s useful to learn how to get around such problems. Click OK to return to the main MOA interface. Right-click the configuration line (on a Mac trackpad, Alt/Shift/tap) and select “Copy configuration to clipboard”; then select “Enter configuration” and paste the clipboard into the new configuration, where you can edit it by typing -m file:modelNB.moa into the command line. (Check the filenames in the two earlier LearnModel tasks to see whether you need to specify a path with this filename.) This gives the EvaluateModel task the parameters needed to load the Naive Bayes model produced in the previous step, generate a new waveform stream with a random seed of 2, and test the model on 1,000,000 examples. (Recall that parameters that have their default values are not shown in the configuration text.)
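The same recipe evaluates the Hoeffding Tree model stored earlier; only the model filename changes (again, check whether you need to specify a path):

```
EvaluateModel -m file:modelHT.moa
-s (generators.WaveformGenerator -i 2) -i 1000000
```

Running both evaluations on the same stream (same seed of 2) lets you compare the two models on identical test instances.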

Phew!