
Lecture outline
• Support vector machines
Support Vector Machines
• Find a linear hyperplane (decision boundary) that will separate the data
Support Vector Machines
[Figure: one candidate separating hyperplane, labeled B1]
• One possible solution
Support Vector Machines
[Figure: a different separating hyperplane, labeled B2]
• Another possible solution
Support Vector Machines
[Figure: several more candidate separating hyperplanes]
• Other possible solutions
Support Vector Machines
[Figure: the two candidate hyperplanes B1 and B2]
• Which one is better: B1 or B2?
• How do you define better?
Support Vector Machines
[Figure: hyperplanes B1 and B2 with their margins, bounded by b11/b12 and b21/b22]
• Find the hyperplane that maximizes the margin => B1 is better than B2
Support Vector Machines
[Figure: hyperplane B1 with margin boundaries b11 and b12]
• Decision boundary: $\vec{w} \cdot \vec{x} + b = 0$
• Margin boundaries: $\vec{w} \cdot \vec{x} + b = +1$ and $\vec{w} \cdot \vec{x} + b = -1$
• Decision rule (see the sketch after this slide):
  $f(\vec{x}) = \begin{cases} 1 & \text{if } \vec{w} \cdot \vec{x} + b \ge 1 \\ -1 & \text{if } \vec{w} \cdot \vec{x} + b \le -1 \end{cases}$
• $\text{Margin} = \dfrac{2}{\|\vec{w}\|}$
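To make the decision rule and the margin formula concrete, here is a minimal sketch in plain NumPy. The weight vector `w` and bias `b` below are arbitrary illustrative values, not taken from the lecture:

```python
import numpy as np

# Hypothetical parameters of a learned hyperplane w.x + b = 0 (illustrative only)
w = np.array([2.0, 1.0])
b = -3.0

def f(x):
    """Decision rule from the slide: +1 beyond the +1 boundary, -1 beyond the -1 boundary."""
    score = np.dot(w, x) + b
    if score >= 1:
        return 1
    if score <= -1:
        return -1
    return 0  # inside the margin; the slide's rule does not assign a label here

print(f(np.array([3.0, 2.0])))               # 2*3 + 1*2 - 3 = 5  -> +1
print(f(np.array([0.0, 0.0])))               # -3 <= -1           -> -1
print("margin =", 2.0 / np.linalg.norm(w))   # Margin = 2 / ||w||
```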
Support Vector Machines
• We want to maximize: $\text{Margin} = \dfrac{2}{\|\vec{w}\|}$
  – Which is equivalent to minimizing: $L(\vec{w}) = \dfrac{\|\vec{w}\|^2}{2}$
  – But subject to the following constraints:
    $f(\vec{x}_i) = \begin{cases} 1 & \text{if } \vec{w} \cdot \vec{x}_i + b \ge 1 \\ -1 & \text{if } \vec{w} \cdot \vec{x}_i + b \le -1 \end{cases}$
• This is a constrained optimization problem
  – Numerical approaches to solve it (e.g., quadratic programming); see the sketch below
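In practice the quadratic program is rarely coded by hand. A minimal sketch assuming scikit-learn's `SVC` with a linear kernel; the tiny separable toy data and the very large `C` (used to approximate the hard-margin problem above) are illustrative choices, not from the lecture:

```python
import numpy as np
from sklearn.svm import SVC

# Tiny linearly separable toy set (illustrative values only)
X = np.array([[1, 1], [2, 2], [2, 0],        # class +1
              [-1, -1], [-2, -2], [-2, 0]])  # class -1
y = np.array([1, 1, 1, -1, -1, -1])

# A very large C approximates the hard-margin formulation above
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

w = clf.coef_[0]        # learned weight vector
b = clf.intercept_[0]   # learned bias
print("w =", w, "b =", b)
print("margin =", 2.0 / np.linalg.norm(w))
print("support vectors:\n", clf.support_vectors_)
```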
Support Vector Machines
• What if the problem is not linearly separable?
Support Vector Machines
• What if the problem is not linearly separable?
  – Introduce slack variables $\xi_i$
• Need to minimize (a soft-margin sketch follows this slide):
  $L(\vec{w}) = \dfrac{\|\vec{w}\|^2}{2} + C \left( \sum_{i=1}^{N} \xi_i^{\,k} \right)$
• Subject to:
  $f(\vec{x}_i) = \begin{cases} 1 & \text{if } \vec{w} \cdot \vec{x}_i + b \ge 1 - \xi_i \\ -1 & \text{if } \vec{w} \cdot \vec{x}_i + b \le -1 + \xi_i \end{cases}$
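To connect the slack-variable formulation to practice, here is a minimal sketch assuming scikit-learn. The overlapping Gaussian toy data and the two `C` values are illustrative assumptions; it shows that a small `C` tolerates more margin violations (larger slacks) than a large `C`:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two overlapping Gaussian blobs: not perfectly linearly separable
X = np.vstack([rng.normal(loc=1.0, scale=1.2, size=(50, 2)),
               rng.normal(loc=-1.0, scale=1.2, size=(50, 2))])
y = np.array([1] * 50 + [-1] * 50)

for C in (0.01, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    w, b = clf.coef_[0], clf.intercept_[0]
    # xi_i > 0 whenever y_i (w.x_i + b) < 1, i.e., the point violates the margin
    slack = np.maximum(0, 1 - y * (X @ w + b))
    print(f"C={C}: margin={2 / np.linalg.norm(w):.3f}, "
          f"margin violations={(slack > 0).sum()}")
```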
Nonlinear Support Vector Machines
• What if the decision boundary is not linear?
Nonlinear Support Vector Machines
• Transform data into a higher-dimensional space (see the sketch below)
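One way to make this concrete: map 2-D points into a higher-dimensional feature space and fit a linear SVM there, or equivalently let a kernel do the mapping implicitly. A minimal sketch assuming scikit-learn and NumPy; the ring-shaped toy data and the quadratic feature map are illustrative assumptions, not from the lecture:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Toy data with a circular decision boundary: inner disk vs. outer ring
r = np.concatenate([rng.uniform(0, 1, 100), rng.uniform(2, 3, 100)])
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
y = np.array([1] * 100 + [-1] * 100)

# Explicit transform to a higher-dimensional space: (x1, x2) -> (x1^2, sqrt(2)*x1*x2, x2^2)
Phi = np.column_stack([X[:, 0] ** 2,
                       np.sqrt(2) * X[:, 0] * X[:, 1],
                       X[:, 1] ** 2])
linear_in_phi = SVC(kernel="linear", C=1.0).fit(Phi, y)
print("accuracy with explicit transform:", linear_in_phi.score(Phi, y))

# Equivalent shortcut: a kernel performs the transform implicitly
rbf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
print("accuracy with RBF kernel:", rbf.score(X, y))
```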