Review (CSE651, B. Ramamurthy, 7/11/2014)

Classifiers
Generative vs. discriminative classifiers; both are probabilistic models.
Generative: based on (affirmative) prior knowledge, like the prior in Naïve Bayes.
Discriminative: based on discriminating knowledge, "this and not this": logistic regression.
We covered Bayes for test 2; we will cover logistic regression for the final exam.

Odds ratio and logistic regression
From probability to odds: say the probability of success of an event is 0.8; then the probability of failure is 1 - 0.8 = 0.2.
The odds of success are 0.8/0.2 = 4, that is, the odds of success are 4:1.
The transformation from probability to odds is monotonic: as the probability increases, so do the odds.
Examples:
  p = 0.9     odds 9:1          log odds 2.19
  p = 0.999   odds 999:1        log odds 6.9
  p = 0.9999  odds 9999:1       log odds 9.2
  p = 0.5     odds 1:1          log odds 0
  p = 0.2     odds 0.25:1 (1:4) log odds -1.38
The log of the odds is also a monotonic function; it is negative for p < 0.5.
Very often this is what is used when the plausibility of an event needs to be established.
Logistic regression is discriminative since it uses both positive and negative occurrence probabilities in its calculation.
(A short code sketch illustrating this transformation follows the Final exam slide below.)

Final exam
Simple logistic regression problem: 10 points.
Simple Processing problem: 10 points.
Android: no coding, but questions about the components: 30 points.
JavaScript: coding, simple examples using HTML, CSS, and JS: 30 points.
Cloud computing: Google App Engine: 10 points; AWS (Amazon cloud): 10 points; general concepts such as PaaS, SaaS, and IaaS are also included.
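
To make the probability -> odds -> log-odds transformation concrete, here is a minimal Python sketch (not from the lecture slides) that reproduces the table on the odds-ratio slide and shows how logistic regression models the log-odds as a linear function of an input. The coefficients b0, b1 and the sample x values are made up purely for illustration.

```python
import math

def odds(p):
    """Odds of success for a probability p of success."""
    return p / (1.0 - p)

def log_odds(p):
    """Natural log of the odds (the 'logit' of p)."""
    return math.log(odds(p))

def sigmoid(z):
    """Inverse of the logit: maps a log-odds value back to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    # Reproduce the table from the odds-ratio slide.
    for p in (0.9, 0.999, 0.9999, 0.5, 0.2):
        print(f"p = {p:<7} odds = {odds(p):>9.4f}  log odds = {log_odds(p):+.3f}")

    # Logistic regression assumes the log-odds of success is linear in the input x:
    #     log_odds(P(y=1 | x)) = b0 + b1 * x
    # so the predicted probability is sigmoid(b0 + b1 * x).
    # These coefficient values are arbitrary, chosen only to illustrate the idea.
    b0, b1 = -4.0, 0.05
    for x in (20, 80, 140):
        p_hat = sigmoid(b0 + b1 * x)
        print(f"x = {x:>3}  predicted P(y=1) = {p_hat:.3f}")
```

The printed log-odds values (2.197, 6.907, 9.210, 0.000, -1.386) match the slide's truncated figures, and the second loop shows the discriminative view: the model directly maps an input to the probability of "this vs. not this" rather than modeling how the data was generated.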