
AI: A Level Computer Science 9618

Back Propagation &
Regression methods
Sowndharayaa
ks.sowndharayaa@gmail.com
TABLE OF CONTENTS
01 REVISION
02 BACK PROPAGATION
03 REGRESSION
04 FURTHERMORE
Revision

Machine learning
Trains algorithms to find patterns in data. Automates tasks by improving its performance over time with more data.

Deep learning
Mimics the brain with complex networks for advanced tasks. Uncovers hidden patterns in massive datasets, enabling breakthroughs in areas like image recognition and natural language processing.
01 Multi-layered Network
Deep learning uses stacked layers for complex data analysis.

02 Big Data Expertise
Learns best from massive amounts of data.

03 Automatic Feature Discovery
No need for manual feature selection.

04 Advanced Applications
Powers image recognition, language processing, and more.
BACK
PROPAGATION
Backpropagation is a training method for
neural networks. It adjusts internal
connections to minimize errors by
analyzing how mistakes flow backward
through the network.
Backpropagation works in four steps:

01 Guess & Check
The network makes a prediction, then backprop compares it to the correct answer and calculates the error.

02 Blame Backflow
Backprop analyzes how much each neuron contributed to the error, moving backward through the network.

03 Weight Adjustment
Based on the blame, connections between neurons are fine-tuned, strengthening good ones and weakening bad ones.

04 Learning Loop
These adjustments help the network learn from mistakes and improve its predictions over time.
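The four steps above can be sketched in a few lines of Python. This is a minimal illustration only, not a real training library: the tiny one-input, one-hidden-neuron network, the training example, and the learning rate are all invented for demonstration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, target = 1.0, 0.5          # one made-up training example
w1, w2 = 0.8, -0.4            # initial connection weights
lr = 0.5                      # learning rate (chosen for illustration)

for _ in range(200):
    # 1. Guess & Check: forward pass, then measure the error
    h = sigmoid(w1 * x)
    prediction = w2 * h
    error = prediction - target

    # 2. Blame Backflow: how much did each weight contribute?
    grad_w2 = error * h                      # output weight's share
    grad_w1 = error * w2 * h * (1 - h) * x   # hidden weight's share

    # 3. Weight Adjustment: nudge each weight against its blame
    w2 -= lr * grad_w2
    w1 -= lr * grad_w1
    # 4. Learning Loop: repeat until the error is small

print(round(w2 * sigmoid(w1 * x), 3))  # prediction is now close to the target
```

After the loop, the network's prediction has been pulled toward the 0.5 target purely by repeating the guess, blame, and adjust cycle.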
Types

Static
Single pass over a fixed-size input, like image recognition. Backpropagation helps these networks learn from errors in those static calculations. Ex. Calculator SNN.

Recurrent
Sequential data, like sentences. Backpropagation analyzes errors not just in the final output, but across the entire sequence, considering how errors in earlier parts of the sequence affected the final outcome.
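To make the contrast concrete, here is a toy sketch (all weights and inputs below are made up): a static network produces its output in one forward pass, while a recurrent one threads a hidden state through the whole sequence, so earlier inputs influence the final result.

```python
import math

def static_pass(x, w):
    # one forward pass over a single fixed-size input
    return math.tanh(w * x)

def recurrent_pass(sequence, w_in, w_rec):
    # the hidden state h carries information across the sequence,
    # so an early input can affect the final output
    h = 0.0
    for x in sequence:
        h = math.tanh(w_in * x + w_rec * h)
    return h

print(round(static_pass(0.5, 1.2), 3))
print(round(recurrent_pass([0.5, -0.3, 0.8], 1.2, 0.7), 3))
```

Feeding the same values in a different order changes the recurrent output but not the static one, which is exactly why errors must be traced back across the whole sequence when training recurrent networks.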
Regression
A statistical process for estimating the
relationship between a dependent
variable (usually continuous) and one
or more independent variables
(predictors). It's used to predict
continuous outcomes based on the
trends found in the data.
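As a concrete sketch of the idea, simple linear regression estimates a straight-line relationship y ≈ slope * x + intercept using the classic least-squares formulas. The data points below are invented for illustration.

```python
# Made-up data roughly following y = 2x
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Least squares: slope = covariance(x, y) / variance(x)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# Predict a continuous outcome for a new input
print(round(slope, 2), round(intercept, 2))
print(round(slope * 6 + intercept, 1))  # prediction at x = 6
```

The fitted line captures the trend in the data, and new predictions are just points on that line.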
Factors

01 Linear Relationship
While it can handle some non-linearity, regression works best when there is a generally linear relationship between the features and the outcome variable.

02 Continuous Outcome
Regression excels when predicting continuous values like house prices, stock prices, or customer wait times.

03 Abundant Data
Regression algorithms typically thrive with large datasets. The more data points your model can analyze, the better it can capture the underlying trends and make accurate predictions.

04 Clean Data
Regression models perform better with clean data that has minimal outliers and missing values.
Furthermore

Regression Example: Predicting House Prices
• Features: Square footage, number of bedrooms, location (zip code)
• Outcome: Selling price of the house
• Insights: Regression models can help realtors estimate a fair market price for a house based on these factors. They can also identify which features have the strongest influence on price, allowing sellers to potentially highlight those features (e.g., extra bedrooms) during marketing.
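The house-price example above can be sketched with ordinary least squares via the normal equations. The listings, prices, and weights below are entirely invented; the square-footage and bedroom weights that come out show which feature carries more influence.

```python
# Invented listings: (square footage, bedrooms, selling price in $1000s)
listings = [
    (1000, 2, 200),
    (1500, 3, 290),
    (2000, 3, 370),
    (2500, 4, 460),
]

# Build the normal equations (X^T X) w = X^T y, with a bias column
rows = [[sqft, beds, 1.0] for sqft, beds, _ in listings]
y = [price for *_, price in listings]
k = 3
A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
bvec = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]

# Solve the 3x3 system by Gauss-Jordan elimination
for i in range(k):
    pivot = A[i][i]
    A[i] = [v / pivot for v in A[i]]
    bvec[i] /= pivot
    for j in range(k):
        if j != i:
            factor = A[j][i]
            A[j] = [vj - factor * vi for vj, vi in zip(A[j], A[i])]
            bvec[j] -= factor * bvec[i]

w_sqft, w_beds, bias = bvec
# Estimated price for an 1800 sq ft, 3-bedroom house, in $1000s
print(round(w_sqft * 1800 + w_beds * 3 + bias))
```

Here w_sqft and w_beds play the role of the "strongest influence" insight: comparing each weight times a typical feature change shows how much each factor moves the predicted price.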
Thank you