Lecture outline
• Support vector machines
Support Vector Machines
• Find a linear hyperplane (decision boundary) that will separate the data
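A linear decision boundary classifies a point by which side of the hyperplane w · x + b = 0 it falls on. A minimal sketch in NumPy (the values of w and b here are illustrative, not learned):

```python
import numpy as np

# Hypothetical hyperplane parameters for w . x + b = 0
w = np.array([1.0, -1.0])
b = 0.0

def predict(x):
    """Classify x as +1 or -1 by the sign of w . x + b."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(predict(np.array([2.0, 0.5])))   # x1 > x2, so w . x + b > 0 -> 1
print(predict(np.array([0.5, 2.0])))   # x1 < x2, so w . x + b < 0 -> -1
```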
• One possible solution
• Another possible solution
• Other possible solutions
• Which one is better? B1 or B2?
• How do you define better?
• Find the hyperplane that maximizes the margin => B1 is better than B2
Decision boundary: w · x + b = 0
Margin hyperplanes: w · x + b = +1 and w · x + b = −1

f(x) = +1 if w · x + b ≥ 1
f(x) = −1 if w · x + b ≤ −1

Margin = 2 / ||w||
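The margin, i.e. the geometric distance between the two hyperplanes w · x + b = +1 and w · x + b = −1, is 2/||w||; a quick numeric check with an illustrative weight vector:

```python
import numpy as np

w = np.array([3.0, 4.0])          # illustrative weight vector, ||w|| = 5
margin = 2.0 / np.linalg.norm(w)  # distance between w.x+b = +1 and w.x+b = -1
print(margin)                     # -> 0.4
```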
• We want to maximize: Margin = 2 / ||w||
– Which is equivalent to minimizing: L(w) = ||w||² / 2
– But subject to the following constraints:
f(x_i) = +1 if w · x_i + b ≥ 1
f(x_i) = −1 if w · x_i + b ≤ −1
• This is a constrained optimization problem
– Numerical approaches to solve it (e.g., quadratic programming)
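In practice this is handed to a QP solver; as a self-contained stand-in, a simple subgradient descent on the equivalent penalized primal ||w||²/2 + C·Σ max(0, 1 − y_i(w·x_i + b)) with a large C approximates the hard-margin solution. The dataset and hyperparameters below are assumptions for the sketch:

```python
import numpy as np

# Tiny linearly separable dataset (illustrative)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])

# Subgradient descent on ||w||^2/2 + C * sum(hinge losses), a simple
# stand-in for an off-the-shelf QP solver (large C ~ hard margin).
w, b, C, lr = np.zeros(2), 0.0, 10.0, 0.01
for _ in range(2000):
    margins = y * (X @ w + b)
    viol = margins < 1                      # points violating the margin
    grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
    grad_b = -C * y[viol].sum()
    w -= lr * grad_w
    b -= lr * grad_b

print(np.sign(X @ w + b))  # predictions should match y
```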
• What if the problem is not linearly separable?
– Introduce slack variables
• Need to minimize:
L(w) = ||w||² / 2 + C ( Σ_{i=1}^{N} ξ_i^k )
• Subject to:
f(x_i) = +1 if w · x_i + b ≥ 1 − ξ_i
f(x_i) = −1 if w · x_i + b ≤ −1 + ξ_i
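At the optimum each slack takes the value ξ_i = max(0, 1 − y_i(w · x_i + b)), so for a candidate (w, b) the slacks and the penalized objective can be evaluated directly. A sketch with illustrative numbers and k = 1 (all values below are assumptions for the example):

```python
import numpy as np

X = np.array([[2.0, 0.0], [0.5, 0.0], [-2.0, 0.0]])  # illustrative points
y = np.array([1, 1, -1])
w, b, C = np.array([1.0, 0.0]), 0.0, 1.0             # assumed candidate solution

xi = np.maximum(0.0, 1.0 - y * (X @ w + b))  # slack: 0 when the margin constraint holds
objective = 0.5 * (w @ w) + C * xi.sum()     # ||w||^2/2 + C * sum(xi), k = 1
print(xi)         # only the point inside the margin gets nonzero slack
print(objective)  # -> 1.0
```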
Nonlinear Support Vector Machines
• What if decision boundary is not linear?
• Transform data into higher dimensional space
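The transformation need not be computed explicitly: the standard continuation of this idea is the kernel trick, where a kernel function returns the dot product in the higher-dimensional space directly. A sketch with the degree-2 polynomial kernel K(x, z) = (x · z)², whose feature map for 2-D inputs is φ(x) = (x₁², √2·x₁x₂, x₂²):

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for a 2-D input."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def K(x, z):
    """Polynomial kernel: dot product in the mapped space without computing phi."""
    return np.dot(x, z) ** 2

x, z = np.array([1.0, 2.0]), np.array([3.0, 0.5])
print(np.dot(phi(x), phi(z)))  # equals K(x, z)
print(K(x, z))                 # -> 16.0
```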