Bayesian Belief Network
Instructor: Elham Gholami

Today's Topics
• Introduction
• Bayesian Network Examples
• Bayesian Network Evaluation and Learning

Introduction
• Graphs are an intuitive way of representing and visualizing the relationships among many variables.
• Probabilistic graphical models provide a tool to deal with two problems: uncertainty and complexity.
• Hence, they provide a compact representation of joint probability distributions using a combination of graph theory and probability theory.
• The graph structure specifies statistical dependencies among the variables, and the local probabilistic models specify how these variables are combined.

[Diagram: probabilistic graphical models sit at the intersection of probability theory and graph theory]

Introduction
• There are two main kinds of graphical models: directed and undirected. Nodes correspond to random variables; edges represent the statistical dependencies between the variables.

Bayesian Networks
• Bayesian networks (BNs) are probabilistic graphical models that are based on directed acyclic graphs.
• A BN model has two components: M = {G, Θ}.
  – Each node in the graph G represents a random variable, and edges represent conditional independence relationships.
  – The set Θ of parameters specifies the probability distribution associated with each variable, given as a CPT (conditional probability table).

Bayesian Networks
• Edges represent "causation", so no directed cycles are allowed.
• Markov property: each node is conditionally independent of its ancestors given its parents.

[Figure: a portion of a belief network, consisting of a node X with variable values (x1, x2, ...), its parents (A and B), and its children (C and D)]

Assumptions
• Chain (x → y → z): when y is given, x and z are conditionally independent.
Think of x as the past, y as the present, and z as the future.
• Common cause (x ← y → z): when y is given, x and z are conditionally independent. Think of y as the common cause of the two independent effects x and z.
• Common effect (x → y ← z): x and z are marginally independent, but when y is given, they are conditionally dependent. This is called explaining away.

Assumptions
• The independence assumptions are of particular importance to Bayesian inference.
• All of the root nodes are independent of each other. For example, for the two root nodes X1 and X2, P(X1, X2) = P(X1) P(X2).
• If two nodes have a common immediate parent and there is no direct arc between them, they are conditionally independent of each other given the state of that parent. For instance, X1 is the parent of X3 and X4; given X1, X3 and X4 are conditionally independent, i.e. P(X3 | X1, X4) = P(X3 | X1).
• Any non-root node is conditionally independent of its non-immediate ancestors given the states of all of its immediate parents. For instance, when the immediate parent X4 is given, X5 is independent of X1, X2 and X3: P(X5 | X1, X2, X3, X4) = P(X5 | X4).

Bayes Rule
Product rule: P(A ∧ B) = P(A | B) P(B) = P(B | A) P(A)
Equating the two sides:
P(A | B) = P(B | A) P(A) / P(B)
With classes and evidence:
P(Class | evidence) = P(evidence | Class) P(Class) / P(evidence)
All classification methods can be seen as estimates of Bayes' rule, with different techniques to estimate P(evidence | Class).

Bayesian Networks
• Once we know the joint probability distribution encoded in the network, we can answer all possible inference questions about the variables using marginalization.
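As a minimal sketch of the Bayes-rule view of classification described above: the code turns a prior P(Class) and likelihoods P(evidence | Class) into a posterior. The class names ("spam"/"ham") and all probability values are invented for illustration; they are not from the slides.

```python
# A minimal sketch of Bayes' rule for classification.
# Priors and likelihoods below are made-up illustrative numbers.

def posterior(priors, likelihoods, evidence):
    """P(class | evidence) for every class, by Bayes' rule.

    priors:      {class: P(class)}
    likelihoods: {class: {evidence: P(evidence | class)}}
    """
    # Unnormalized posterior: P(evidence | class) * P(class)
    joint = {c: likelihoods[c][evidence] * priors[c] for c in priors}
    # P(evidence) is obtained by marginalizing over the classes
    p_evidence = sum(joint.values())
    return {c: joint[c] / p_evidence for c in joint}

priors = {"spam": 0.4, "ham": 0.6}
likelihoods = {"spam": {"word": 0.8}, "ham": {"word": 0.1}}
post = posterior(priors, likelihoods, "word")
print(post)  # posterior probabilities, summing to 1
```

As the slide notes, classifiers built this way differ only in how P(evidence | Class) is estimated.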
Bayesian Networks
P(a, b, c, d, e) = P(a) P(b) P(c | b) P(d | a, c) P(e | d)

Bayesian Networks
P(a, b, c, d) = P(a) P(b | a) P(c | b) P(d | c)

Bayesian Networks
P(e, f, g, h) = P(e) P(f | e) P(g | e) P(h | f, g)

Today's Topics
• Introduction
• Bayesian Network Examples
• Bayesian Network Evaluation and Learning

Example 1
• You have a new burglar alarm installed at home.
• It is fairly reliable at detecting burglary, but it also sometimes responds to minor earthquakes.
• You have two neighbors, Ali and Veli, who promised to call you at work when they hear the alarm.
• Ali always calls when he hears the alarm, but he sometimes confuses the telephone ringing with the alarm and calls then, too.
• Veli likes loud music and sometimes misses the alarm.
• Given the evidence of who has or has not called, we would like to estimate the probability of a burglary.

Example 1
The Bayesian network for the burglar alarm example. Burglary (B) and earthquake (E) directly affect the probability of the alarm (A) going off, but whether or not Ali calls (AC) or Veli calls (VC) depends only on the alarm. (Russell and Norvig, Artificial Intelligence: A Modern Approach, 1995)

Example 1
• What is the probability that the alarm has sounded but neither a burglary nor an earthquake has occurred, and both Ali and Veli call?
• Capital letters represent variables having the value true, and ¬ represents negation. By the chain-rule factorization of the network:
  P(A, ¬B, ¬E, AC, VC) = P(¬B) P(¬E) P(A | ¬B, ¬E) P(AC | A) P(VC | A)

Example 1
• What is the probability that there is a burglary given that Ali calls? What about if Veli also calls right after Ali hangs up?

Example 2
Another Bayesian network example. The event of the grass being wet (W = true) has two possible causes: either the water sprinkler was on (S = true) or it rained (R = true).
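Returning to Example 1: the query P(A, ¬B, ¬E, AC, VC) can be computed directly from the chain-rule factorization P(B, E, A, AC, VC) = P(B) P(E) P(A | B, E) P(AC | A) P(VC | A). The sketch below uses the CPT values commonly quoted for this example (Russell and Norvig); treat them as illustrative assumptions, not slide content.

```python
# Joint probability for the burglar-alarm network via its factorization.
# CPT numbers are the commonly quoted textbook values (illustrative only).

p_b = 0.001                       # P(Burglary)
p_e = 0.002                       # P(Earthquake)
p_a = {(True, True): 0.95,        # P(Alarm | B, E)
       (True, False): 0.94,
       (False, True): 0.29,
       (False, False): 0.001}
p_ac = {True: 0.90, False: 0.05}  # P(AliCalls | Alarm)
p_vc = {True: 0.70, False: 0.01}  # P(VeliCalls | Alarm)

def joint(b, e, a, ac, vc):
    """P(B=b, E=e, A=a, AC=ac, VC=vc) from the chain-rule factorization."""
    pb = p_b if b else 1 - p_b
    pe = p_e if e else 1 - p_e
    pa = p_a[(b, e)] if a else 1 - p_a[(b, e)]
    pac = p_ac[a] if ac else 1 - p_ac[a]
    pvc = p_vc[a] if vc else 1 - p_vc[a]
    return pb * pe * pa * pac * pvc

# P(alarm, no burglary, no earthquake, Ali calls, Veli calls)
print(joint(False, False, True, True, True))  # ~0.00063
```

With these numbers the answer is about 0.00063: the event is rare, mostly because the alarm almost never fires without a cause.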
(Figure from Russell and Norvig, Artificial Intelligence: A Modern Approach, 1995)

Example 2
• Suppose we observe the fact that the grass is wet. There are two possible causes for this: either it rained, or the sprinkler was on. Which one is more likely? Comparing the posteriors P(S | W) and P(R | W), we see that it is more likely that the grass is wet because it rained.

Applications of Bayesian Networks
• Example applications include:
  – Machine learning
  – Statistics
  – Computer vision
  – Natural language processing
  – Speech recognition
  – Error-control codes
  – Bioinformatics
  – Medical diagnosis
  – Weather forecasting

Applications of Bayesian Networks
• Example systems include:
  – PATHFINDER medical diagnosis system at Stanford
  – Microsoft Office assistant and troubleshooters
  – Space shuttle monitoring system at NASA Mission Control Center in Houston

A good community resource is KDnuggets, the data mining community's top site for analytics, data mining, and data science software, companies, data, jobs, education, and news: http://www.kdnuggets.com/software/bayesian.html
(Slide credit: Dr. Shuang Liang, SSE, Tongji)

Bayesian Network Tools
Free:
• BAYDA 1.0
• Bayesian belief network software (Windows 2000/NT/95/98) from J. Cheng, including BN PowerConstructor, an efficient system for learning BN structures and parameters from data, and BN PowerPredictor, a data mining program for data modeling, classification, and prediction.
• Bayesian Network tools in Java (BNJ): an open-source suite of Java tools for probabilistic learning and reasoning (Kansas State University KDD Lab).
• Chordalysis, a log-linear analysis method for big data, which exploits recent discoveries in graph theory by representing complex models as compositions of triangular structures (aka chordal graphs).
• FDEP, induces functional dependencies from a given input relation (GNU C).
• GeNIe, a decision modeling environment implementing influence diagrams and Bayesian networks (Windows). Has over 2000 users.
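The "which cause is more likely?" comparison in Example 2 can be made concrete with inference by enumeration. The sketch below uses the four-node Cloudy → {Sprinkler, Rain} → WetGrass version of this example; the CPT values are the ones commonly used with it (e.g. in Kevin Murphy's Bayes-net tutorial) and are assumptions, not slide content.

```python
from itertools import product

# Posterior inference by enumeration for the sprinkler network
# Cloudy -> {Sprinkler, Rain} -> WetGrass. CPT values are the usual
# illustrative numbers for this example, not taken from the slides.

p_c = 0.5                                 # P(Cloudy)
p_s = {True: 0.1, False: 0.5}             # P(Sprinkler | Cloudy)
p_r = {True: 0.8, False: 0.2}             # P(Rain | Cloudy)
p_w = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}  # P(Wet | S, R)

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) from the chain-rule factorization."""
    pc = p_c if c else 1 - p_c
    ps = p_s[c] if s else 1 - p_s[c]
    pr = p_r[c] if r else 1 - p_r[c]
    pw = p_w[(s, r)] if w else 1 - p_w[(s, r)]
    return pc * ps * pr * pw

def query(var_index, evidence_w=True):
    """P(var | W = true), summing the joint over the other variables.
    var_index indexes the tuple (c, s, r): 1 = Sprinkler, 2 = Rain."""
    num = sum(joint(c, s, r, evidence_w)
              for c, s, r in product([True, False], repeat=3)
              if (c, s, r)[var_index])
    den = sum(joint(c, s, r, evidence_w)
              for c, s, r in product([True, False], repeat=3))
    return num / den

print(query(1))  # P(Sprinkler | wet)  ~0.43
print(query(2))  # P(Rain | wet)       ~0.71
```

With these numbers P(R | W) ≈ 0.71 exceeds P(S | W) ≈ 0.43, matching the slide's conclusion that rain is the more likely cause.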
• JavaBayes
• jBNC, a Java toolkit for training, testing, and applying Bayesian network classifiers.
• JNCC2, Naive Credal Classifier 2 (in Java), an extension of Naive Bayes towards imprecise probabilities; it is designed to return robust classifications, even on small and/or incomplete data sets.
• MSBN: Microsoft Belief Network Tools, tools for creation, assessment and evaluation of Bayesian belief networks. Free for non-commercial research users.
• PNL: Open Source Probabilistic Networks Library, a tool for working with graphical models, supporting directed and undirected models, discrete and continuous variables, and various inference and learning algorithms.
• Pulcinella, a tool for Propagating Uncertainty through Local Computations based on the Shenoy and Shafer framework (Common Lisp).

Bayesian Network Tools
Commercial:
• AgenaRisk, a visual tool combining Bayesian networks and statistical simulation (free one-month evaluation).
• Analytica, an influence-diagram-based visual environment for creating and analyzing probabilistic models (Win/Mac).
• AT-Sigma Data Chopper, for analysis of databases and finding causal relationships.
• BayesiaLab, a complete set of Bayesian network tools, including supervised and unsupervised learning, and an analysis toolbox.
• Bayes Server, an advanced Bayesian network library and user interface. Supports classification, regression, segmentation, time series prediction, anomaly detection and more. Free trial and walkthroughs available.
• Bayesware Discoverer 1.0, an automated modeling tool able to extract a Bayesian network from data by searching for the most probable model.
• BNet, includes BNet.Builder for rapidly creating belief networks, entering information, and getting results, and BNet.EngineKit for incorporating belief network technology in your applications.
• DXpress, a Windows-based tool for building and compiling Bayes networks.
• Flint, combines Bayesian networks, certainty factors and fuzzy logic within a logic-programming rules-based environment.
• HUGIN, a full suite of Bayesian network reasoning tools.
• Netica, Bayesian network tools (Win 95/NT); demo available.
• PrecisionTree, an add-in for Microsoft Excel for building decision trees and influence diagrams directly in the spreadsheet.

Today's Topics
• Introduction
• Bayesian Network Examples
• Bayesian Network Evaluation and Learning

Two Fundamental Problems for BNs
• Evaluation (inference) problem: given the model and the values of the observed variables, estimate the values of the hidden nodes.
• Learning problem: given training data and prior information (e.g., expert knowledge, causal relationships), estimate the network structure, or the parameters of the probability distributions, or both.

Bayesian Network Evaluation Problem
• If we observe the "leaves" and try to infer the values of the hidden causes, this is called diagnosis, or bottom-up reasoning.
• If we observe the "roots" and try to predict the effects, this is called prediction, or top-down reasoning.
• Exact inference is an NP-hard problem because the number of terms in the summations (integrals) for discrete (continuous) variables grows exponentially with an increasing number of variables.

Bayesian Network Evaluation Problem
• Some restricted classes of networks, namely the singly connected networks, where there is no more than one path between any two nodes, can be solved in time linear in the number of nodes.
• There are also clustering algorithms that convert multiply connected networks to singly connected ones.
• However, for most cases approximate inference methods have to be used, such as:
  – sampling (Monte Carlo) methods
  – variational methods
  – loopy belief propagation
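Of the approximate methods listed, sampling is the easiest to sketch. Below is a minimal rejection-sampling estimate of a posterior on a tiny two-node network X → Y: sample from the prior, keep only samples consistent with the evidence, and average. All numbers are invented for illustration.

```python
import random

# Minimal sketch of Monte Carlo (rejection sampling) inference on a
# two-node network X -> Y. Probabilities are made-up illustrative values.

p_x = 0.3                        # P(X)
p_y = {True: 0.9, False: 0.2}    # P(Y | X)

def estimate_p_x_given_y(n=200_000, seed=0):
    """Estimate P(X | Y = true) by rejection sampling."""
    rng = random.Random(seed)
    kept = hits = 0
    for _ in range(n):
        x = rng.random() < p_x       # sample X from its prior
        y = rng.random() < p_y[x]    # sample Y from P(Y | X)
        if y:                        # keep only samples matching the evidence
            kept += 1
            hits += x
    return hits / kept

# Exact answer by Bayes' rule: 0.27 / (0.27 + 0.14) ≈ 0.659
print(estimate_p_x_given_y())
```

On large multiply connected networks the same idea scales (with smarter variants such as likelihood weighting), which is why sampling is listed among the practical fallbacks when exact inference is intractable.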
Bayesian Network Learning Problem
• The simplest situation is the one where the network structure is completely known (either specified by an expert or designed using causal relationships between the variables).
• Other situations, with increasing complexity, are:
  – known structure but unobserved variables,
  – unknown structure with observed variables,
  – unknown structure with unobserved variables.

Naive Bayesian Network
• When the dependencies among the features are unknown, we generally proceed with the simplest assumption: that the features are conditionally independent given the class.
• This corresponds to the naive Bayesian network, which gives the class-conditional probabilities
  P(x1, ..., xd | C) = P(x1 | C) · ... · P(xd | C)
• The naive Bayesian network structure looks like a very simple model, but it often works quite well in practice.

Pattern Recognition, Fall 2012

Assignment
1. Investigate the use of Bayesian networks in social networks, and summarize a related paper.
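As a closing illustration of the naive Bayesian network slide above, the factorization P(x1, ..., xd | C) = P(x1 | C) · ... · P(xd | C) can be turned into a tiny classifier. The classes, features, and probabilities below are invented for illustration only.

```python
import math

# Minimal naive Bayes classifier: score each class by
# log P(C) + sum_i log P(x_i | C), then pick the argmax.
# All probability values below are made-up illustrative numbers.

def nb_log_score(priors, cond, features):
    """Log-posterior score (up to a constant) for each class."""
    return {c: math.log(priors[c]) +
               sum(math.log(cond[c][f]) for f in features)
            for c in priors}

priors = {"spam": 0.4, "ham": 0.6}
cond = {"spam": {"offer": 0.7, "meeting": 0.1},
        "ham":  {"offer": 0.05, "meeting": 0.6}}

scores = nb_log_score(priors, cond, ["offer"])
print(max(scores, key=scores.get))  # → spam
```

Working in log space avoids numeric underflow when many features are multiplied together, which is the standard practical trick for this model.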