Decision Tree Algorithm
● Decision Tree is a supervised learning technique.
● It is mostly used for solving classification problems.
● It is a tree-structured classifier, where internal nodes represent the features of a dataset, branches represent the decision rules, and each leaf node represents the outcome.
● In a decision tree, there are two types of nodes: the Decision Node and the Leaf Node. Decision nodes are used to make decisions and have multiple branches, whereas leaf nodes are the outputs of those decisions and do not contain any further branches.
● It is a graphical representation for getting all the possible solutions to a problem/decision based on given conditions.
● It is called a decision tree because, similar to a tree, it starts with the root node, which expands into further branches to construct a tree-like structure.

How to draw a decision-making tree
1. Start a decision tree with a decision that you need to make.
2. Draw a small square to represent this decision.
3. From this box, draw lines out towards the right for each possible solution, and write that solution along the line.
4. At the end of each line, consider the results. If the result of taking that decision is uncertain, draw a small circle. If the result is another decision that you need to make, draw another square. Write the decision or factor above the square or circle. If the solution at the end of the line is complete, just leave it blank.
5. Keep doing this until you have drawn out as many of the possible outcomes and decisions as you can see leading on from the original decision.

Why use Decision Trees?
● Decision trees mimic human thinking ability while making a decision, so they are easy to understand.
● The logic behind a decision tree can be easily understood because it shows a tree-like structure.

Advantages
● Simple to understand and interpret
● Have value even with little hard data
● Can be combined with other decision techniques
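
The node structure described above (decision nodes that branch on a feature, leaf nodes that hold an outcome) can be sketched in a few lines of Python. This is a minimal illustration, not a full learning algorithm; the feature names (`outlook`, `humidity`) and outcomes are a hypothetical toy example, not taken from the source:

```python
from dataclasses import dataclass

@dataclass
class Leaf:
    """Leaf node: holds a final outcome, no further branches."""
    outcome: str

@dataclass
class Decision:
    """Decision node: tests one feature and has one branch per answer."""
    feature: str   # feature name tested at this node
    branches: dict # maps each answer to a Leaf or another Decision

def classify(node, sample: dict) -> str:
    """Walk the tree from the root node down to a leaf."""
    while isinstance(node, Decision):
        node = node.branches[sample[node.feature]]
    return node.outcome

# A hand-built toy tree: the root tests "outlook"; one branch leads
# to another decision node, the others lead straight to leaves.
tree = Decision("outlook", {
    "sunny": Decision("humidity", {
        "high": Leaf("no"),
        "normal": Leaf("yes"),
    }),
    "overcast": Leaf("yes"),
    "rainy": Leaf("no"),
})

print(classify(tree, {"outlook": "sunny", "humidity": "normal"}))  # yes
print(classify(tree, {"outlook": "rainy"}))                        # no
```

In practice, the tree structure is learned from data rather than hand-built (for example with a library classifier), but the traversal from root to leaf works exactly as shown.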