An Introduction to WEKA Explorer
(In part from: Yizhou Sun, 2008. Dated 4/13/2015.)

What is WEKA?
- Waikato Environment for Knowledge Analysis.
- A data mining / machine learning tool developed by the Department of Computer Science, University of Waikato, New Zealand.
- The weka is also a bird found only in New Zealand.

Download and Install WEKA
- Website: http://www.cs.waikato.ac.nz/~ml/weka/index.html
- Written in Java, so it supports multiple platforms: Windows, Mac OS X and Linux.

Main Features
- 49 data preprocessing tools
- 76 classification/regression algorithms
- 8 clustering algorithms
- 3 algorithms for finding association rules
- 15 attribute/subset evaluators + 10 search algorithms for feature selection

Main GUI
Three graphical user interfaces:
- "The Explorer" (exploratory data analysis)
- "The Experimenter" (experimental environment)
- "The KnowledgeFlow" (new process-model-inspired interface)
In addition, the Simple CLI gives users without a graphical environment the ability to execute commands from a terminal window.

Explorer
The Explorer covers:
- Preprocessing data
- Classification
- Clustering
- Association rules
- Attribute selection
- Data visualization
- References and resources

Explorer: pre-processing the data
- Data can be imported from a file in various formats: ARFF, CSV, C4.5, binary.
- Data can also be read from a URL or from an SQL database (using JDBC).
- Pre-processing tools in WEKA are called "filters".
- WEKA contains filters for discretization, normalization, resampling, attribute selection, transforming and combining attributes, and more.

WEKA only deals with "flat" files

@relation heart-disease-simplified
@attribute age numeric
@attribute sex { female, male }
@attribute chest_pain_type { typ_angina, asympt, non_anginal, atyp_angina }
@attribute cholesterol numeric
@attribute exercise_induced_angina { no, yes }
@attribute class { present, not_present }
@data
63,male,typ_angina,233,no,not_present
67,male,asympt,286,yes,present
67,male,asympt,229,yes,present
38,female,non_anginal,?,no,not_present
...

(In ARFF data, "?" marks a missing value.)

IRIS dataset
- 5 attributes, one of which is the class.
- 3 classes: setosa, versicolor, virginica.

Attribute data
- Min, max and average value of each attribute.
- Distribution of values: the number of items for which a_i = v_j, where a_i ∈ A and v_j ∈ V.
- Class: the distribution of attribute values across the classes.

Filtering attributes
Once the initial data has been loaded, the user can refine it in the Preprocess window: apply optional filters, and select or remove attributes of the data set to isolate the information of interest. Deselecting attributes changes which relationships among attributes are available to later analysis. Many filtering options are available in the Preprocess window, and the user chooses among them based on the task and the type of data present.
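The ARFF layout above is simple enough to parse by hand. The following is a minimal illustrative sketch (plain Python, not WEKA's own ARFF reader, which handles many more cases) that reads the header and data section of an ARFF string, decoding "?" as a missing value:

```python
def parse_arff(text):
    """Minimal ARFF reader: returns (attribute names, data rows).
    A '?' in the data section is decoded as None (missing value)."""
    attributes, rows, in_data = [], [], False
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('%'):   # skip blank lines and comments
            continue
        low = line.lower()
        if low.startswith('@attribute'):
            # the attribute name is the second whitespace-separated token
            attributes.append(line.split()[1])
        elif low.startswith('@data'):
            in_data = True
        elif in_data:
            rows.append([None if v.strip() == '?' else v.strip()
                         for v in line.split(',')])
    return attributes, rows

# A two-attribute fragment of the heart-disease example above:
arff = """@relation heart-disease-simplified
@attribute age numeric
@attribute sex { female, male }
@data
63,male
38,?
"""
names, data = parse_arff(arff)
print(names)   # attribute names from the header
print(data)    # rows, with the missing value as None
```

In practice WEKA loads ARFF (and CSV, C4.5, databases) through its own converters; this sketch is only meant to show how little structure a "flat" ARFF file has.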
[Screenshots: applying filters in the Preprocess window. The Discretize filter shown discretizes a numeric attribute into 10 bins of equal frequency.]

Explorer: building "classifiers"
- "Classifiers" in WEKA are machine learning algorithms for predicting nominal or numeric quantities.
- Implemented learning algorithms include: conjunctive rules, decision trees and lists, instance-based classifiers, support vector machines, multi-layer perceptrons, logistic regression, Bayes nets, and more.

Exploring the ConjunctiveRule learner
- This needs a simple dataset with few attributes, so select the weather dataset.
- Select a classifier, then select a training method.
- Right-click the classifier to set its parameters: numAntds is the number of antecedents (-1 gives an empty rule); select numAntds = 10.
- Results, including performance data, are shown in the right window (which can be scrolled).
- The right-hand-side (class) variable can be changed.

Decision Trees with WEKA
[Screenshots: building and evaluating a decision tree in the Explorer.]
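"Equal frequency" discretization, as used by the filter in the screenshots, puts (roughly) the same number of instances into each bin, rather than making the bins equally wide. A small illustrative sketch of the idea (not WEKA's Discretize filter):

```python
def equal_frequency_bins(values, n_bins):
    """Assign each value a bin index 0..n_bins-1 so that each bin
    holds (roughly) the same number of instances."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for rank, i in enumerate(order):
        # the k-th smallest value goes into bin k * n_bins // n
        bins[i] = rank * n_bins // len(values)
    return bins

# 10 values into 5 bins -> each bin receives exactly 2 values
print(equal_frequency_bins([7, 1, 5, 3, 9, 2, 8, 4, 6, 0], 5))
```

Equal-width binning, by contrast, would split the value range into intervals of equal length, so skewed data can leave some bins nearly empty; equal-frequency binning avoids that.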
Explorer: clustering
[Screenshots: clustering in the Explorer; right-click a result in the list to visualize the cluster assignments.]

Explorer: finding associations
- WEKA contains an implementation of the Apriori algorithm for learning association rules.
- It works only with discrete data.
- It can identify statistical dependencies between groups of attributes, e.g. milk, butter → bread, eggs (with confidence 0.9 and support 2000).
- Apriori can compute all rules that have a given minimum support and exceed a given confidence.

Basic Concepts: Frequent Patterns

Tid  Items bought
10   Beer, Nuts, Diaper
20   Beer, Coffee, Diaper
30   Beer, Diaper, Eggs
40   Nuts, Eggs, Milk
50   Nuts, Coffee, Diaper, Eggs, Milk

- Itemset: a set of one or more items.
- k-itemset: X = {x1, ..., xk}.
- (Absolute) support, or support count, of X: the frequency (number of occurrences) of itemset X.
- (Relative) support, s: the fraction of transactions that contain X (i.e., the probability that a transaction contains X).
- An itemset X is frequent if X's support is no less than a minsup threshold.

Basic Concepts: Association Rules
Using the same transactions as above:
- Find all rules X → Y with minimum support and confidence.
- Support, s: the probability that a transaction contains X ∪ Y.
- Confidence, c: the conditional probability that a transaction containing X also contains Y.
- Let minsup = 50%, minconf = 50%. Frequent patterns: Beer:3, Nuts:3, Diaper:4, Eggs:3, {Beer, Diaper}:3.
- Association rules (among many more):
  Beer → Diaper (support 60%, confidence 100%)
  Diaper → Beer (support 60%, confidence 75%)

[Screenshots: running Apriori in the Associate panel.]
Example rules produced by Apriori:

1. adoption-of-the-budget-resolution=y physician-fee-freeze=n 219 ==> Class=democrat 219 conf:(1)
2. adoption-of-the-budget-resolution=y physician-fee-freeze=n aid-to-nicaraguan-contras=y 198 ==> Class=democrat 198 conf:(1)
3. physician-fee-freeze=n aid-to-nicaraguan-contras=y 211 ==> Class=democrat 210 conf:(1)
etc.

Explorer: attribute selection
- A panel that can be used to investigate which (subsets of) attributes are the most predictive ones.
- Attribute selection methods consist of two parts:
  - a search method: best-first, forward selection, random, exhaustive, genetic algorithm, ranking;
  - an evaluation method: correlation-based, wrapper, information gain, chi-squared, and more.
- Very flexible: WEKA allows (almost) arbitrary combinations of these two.

[Screenshots: the Select attributes panel.]

Explorer: data visualization
- Visualization is very useful in practice: e.g.
it helps to determine the difficulty of the learning problem.
- WEKA can visualize single attributes (1-d) and pairs of attributes (2-d).
- To do: rotating 3-d visualizations (Xgobi-style).
- Class values are color-coded.
- A "jitter" option deals with nominal attributes (and helps detect "hidden", overlapping data points).
- A "zoom-in" function.

[Screenshots: the Visualize panel; click on a cell of the scatter-plot matrix to enlarge it.]

References and Resources
- WEKA website: http://www.cs.waikato.ac.nz/~ml/weka/index.html
- WEKA tutorial, Machine Learning with WEKA: a presentation demonstrating all of Weka's graphical user interfaces (GUIs), and a presentation explaining how to use Weka for exploratory data mining.
- WEKA data mining book: Ian H. Witten and Eibe Frank, Data Mining: Practical Machine Learning Tools and Techniques (Second Edition).
- WEKA wiki: http://weka.sourceforge.net/wiki/index.php/Main_Page
- Others: Jiawei Han and Micheline Kamber, Data Mining: Concepts and Techniques, 2nd ed.