Government College of Engineering, Amravati
(An Autonomous Institute of Government of Maharashtra)
Department of Computer Science and Engineering
Seminar II
on
Detection of DDoS Attack in cloud computing using voting
extreme learning machine algorithm
Presented By: Jayshree Ade
M.Tech. 1st Year
(ID-)
Guided By:-
Contents
• Introduction
• Problem Statement
• Research Objective
• Literature Survey
• Proposed System
• Conclusion
• References
Introduction
• Cloud computing means that all the computing hardware and software resources you need to process your tasks are provided for you "as a service" over the internet by a vendor, instead of you owning and maintaining them.
• The services of cloud computing can be categorized into three models: Platform-as-a-Service (PaaS), Software-as-a-Service (SaaS), and Infrastructure-as-a-Service (IaaS).
• Since the pandemic, many organizations have been leveraging the benefits of cloud computing.
• However, cloud computing faces a variety of attacks and threats from malicious users, and this has become the primary obstacle to the progress of cloud computing services.
DDoS attack in cloud environment
• A DoS attack is a malicious attempt by an adversary, using a single attacking host, to prevent the targeted victim (either a user of required services or a node providing a service to its consumers) from operating normally.
• A DDoS attack, on the other hand, involves multiple attacking hosts flooding the victim's network or host with attack packets, resulting in a distributed, multi-point attack.
DDoS attack in cloud environment
• DDoS attacks are notoriously difficult to defend against due to their distributed nature. The DDoS attack mechanism is depicted in the figure.
• The attacker sends commands to zombie hosts, which flood the target with bogus traffic.
• The objective of DDoS is to make servers unavailable to legitimate
users. This can be extremely damaging to any online activity, causing
long-term harm.
• The primary purpose of this form of attack is to damage networks, drain
network resources, and prevent genuine users from using them.
• In a DDoS attack, many compromised hosts, coordinated by one or more attackers, carry out a denial-of-service attack against the target system simultaneously.
Extreme Learning Machine
• ELM is a single hidden layer feedforward neural network (SLFN): it uses only one hidden layer along with the input and output layers.
• It uses random values both for the weights that connect the input and hidden layers and for the hidden-layer biases.
• A conventional SLFN consists of three layers: an input layer, a hidden layer, and an output layer, as shown in Fig. 1.
• The notation is given in Table 1.
• x and o denote the input and output vectors.
• w and b represent the input-to-hidden weights and the hidden-layer biases.
• β denotes the output weights.
• Training the network means determining these parameters so that the network reaches an optimal solution (a minimal sketch of the forward pass follows this list).
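To make the notation concrete, here is a minimal NumPy sketch of the SLFN forward pass, o = g(x·w + b)·β. The variable names follow the slide's notation; the dimensions and the sigmoid activation are illustrative assumptions, not taken from the source.

```python
import numpy as np

rng = np.random.default_rng(42)
n, L, m = 4, 8, 2                 # input dim, hidden neurons, output dim (illustrative)

w = rng.uniform(-1, 1, (n, L))    # input-to-hidden weights (random, never trained)
b = rng.uniform(-1, 1, L)         # hidden-layer biases (random, never trained)
beta = rng.normal(size=(L, m))    # output weights (the only trained parameters)

def g(z):                         # sigmoid activation in the hidden layer
    return 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=n)            # one input vector
o = g(x @ w + b) @ beta           # network output
print(o)
```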
Extreme Learning Machine
[Fig. 1 (the structure of an SLFN) and Table 1 (the notation) appeared here; the figures were not recovered in extraction.]
SLFN Training
• In this section, we briefly introduce the training problem for an SLFN. Given a training set
  $S = \{(x_i, t_i) \mid x_i = (x_{i1}, x_{i2}, \dots, x_{in})^T \in \mathbb{R}^n,\; t_i = (t_{i1}, t_{i2}, \dots, t_{im})^T \in \mathbb{R}^m\}$,
  where $x_i$ denotes the input value and $t_i$ represents the target, the output $o$ of an ELM with $\tilde{N}$ hidden neurons can be expressed as:

  $$\sum_{i=1}^{\tilde{N}} \beta_i \, g(w_i \cdot x_j + b_i) = o_j, \qquad j = 1, \dots, N \qquad (1)$$
SLFN Training
• Here g(x) denotes the activation function in the hidden layer. In ELM, the activation functions are nonlinear, providing a nonlinear mapping for the system. Table 2 lists several widely used activation functions.
• The goal of training is to minimize the error between the target and the output of the ELM. The most commonly used objective function is the mean squared error (MSE):

  $$\mathrm{MSE} = \sum_{i=1}^{N} \left( t_{ij} - o_{ij} \right)^2, \qquad j = 1, \dots, m \qquad (2)$$
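Putting equations (1) and (2) together, here is a minimal NumPy sketch of ELM training under this formulation: the input weights and biases are drawn at random, and only β is solved for, via the Moore-Penrose pseudoinverse of the hidden-layer output matrix H. The function names, toy data, and sigmoid activation are illustrative assumptions.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    """Fit a basic ELM: random w and b, beta solved analytically (minimizing eq. (2))."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(-1, 1, (X.shape[1], n_hidden))  # random input weights w_i
    b = rng.uniform(-1, 1, n_hidden)                # random hidden biases b_i
    H = 1.0 / (1.0 + np.exp(-(X @ w + b)))          # hidden outputs g(w_i . x_j + b_i)
    beta = np.linalg.pinv(H) @ T                    # least-squares solution of H beta = T
    return w, b, beta

def elm_predict(X, w, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    return H @ beta                                 # o_j from equation (1)

# Toy usage: learn y = x1 XOR x2 from four samples.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
w, b, beta = elm_train(X, T, n_hidden=10)
O = elm_predict(X, w, b, beta)
print("MSE:", np.sum((T - O) ** 2))                 # the objective from equation (2)
```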
Word Embedding
• Word embeddings are numerical vector representations of the text in a corpus that map each word in the corpus vocabulary to a set of real-valued vectors in a pre-defined N-dimensional space.
• They are capable of capturing the context of a word in a document, semantic and syntactic similarity, its relation to other words, etc.
Vector Representation
Example (the numbers in parentheses appear to be vocabulary indices; the full vectors have 300 dimensions, of which only a few named features are shown):

Feature   BOY(2000)   GIRL(5000)   KING(6000)   QUEEN(9000)   APPLE(1000)   MANGO(7000)
GENDER    -1          1            -0.92        0.93          0.0           0.1
ROYAL     0.01        0.02         0.95         0.96          -0.02         0.01
AGE       0.03        0.02         0.7          0.6           0.95          0.92
FOOD      (values not recovered)
...       (remaining features omitted, up to 300 dimensions)
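Using just the three recovered feature rows (GENDER, ROYAL, AGE) as toy 3-dimensional word vectors, here is a short sketch of how similarity between such vectors is typically measured (cosine similarity). The values come from the table above; everything else is illustrative.

```python
import numpy as np

# Toy 3-dimensional slices (GENDER, ROYAL, AGE) of the vectors in the table above.
vectors = {
    "BOY":   np.array([-1.00,  0.01, 0.03]),
    "GIRL":  np.array([ 1.00,  0.02, 0.02]),
    "KING":  np.array([-0.92,  0.95, 0.70]),
    "QUEEN": np.array([ 0.93,  0.96, 0.60]),
    "APPLE": np.array([ 0.00, -0.02, 0.95]),
    "MANGO": np.array([ 0.10,  0.01, 0.92]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(vectors["APPLE"], vectors["MANGO"]))  # ~0.99: the two fruits align closely
print(cosine(vectors["KING"], vectors["APPLE"]))   # lower: little feature overlap
```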
What is Word2vec?
• Word2vec is a combination of two techniques:
1. CBOW (Continuous Bag of Words)
2. Skip-Gram Model
• Both learn weights that act as the word vector representations (a brief library sketch follows below).
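As a concrete illustration, in the gensim library (assuming gensim 4.x) the two techniques are selected with the sg flag of the Word2Vec class. The one-sentence corpus below is made up for this example.

```python
from gensim.models import Word2Vec

sentences = [["hope", "can", "set", "you", "free"]]   # toy one-sentence corpus

cbow = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=0)      # sg=0: CBOW
skipgram = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)  # sg=1: skip-gram
```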
CBOW
• Predicts the target word from the context.
• Given the surrounding context words, it tries to predict the target word that fits them.
Fig 1. CBOW (window = 5) [1]
CBOW-Working
Sample string: "Hope can set you free"
[Diagram: the one-hot vectors of the context words "hope" (1,0,0,0,0) and "set" (0,0,1,0,0) each pass through the shared input weight matrix W (3x5) into a hidden layer of 3 nodes, and the output weight matrix W′ (5x3) produces the predicted one-hot vector of the actual target "can".]
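Here is a minimal NumPy sketch of this forward pass, using the slide's 5-word vocabulary and a 3-node hidden layer. The weights are stored as 5x3 / 3x5 arrays, i.e. transposed relative to the diagram's labels, and all values are illustrative and untrained.

```python
import numpy as np

vocab = ["hope", "can", "set", "you", "free"]
V, H = len(vocab), 3                        # vocabulary size 5, hidden layer of 3 nodes
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(V, H))      # input weights  (diagram: W, 3x5)
W_out = rng.normal(scale=0.1, size=(H, V))  # output weights (diagram: W', 5x3)

def one_hot(word):
    v = np.zeros(V)
    v[vocab.index(word)] = 1.0
    return v

def cbow_forward(context):
    h = np.mean([one_hot(w) @ W for w in context], axis=0)  # average context projections
    scores = h @ W_out
    return np.exp(scores) / np.exp(scores).sum()            # softmax over the vocabulary

probs = cbow_forward(["hope", "set"])       # context around the target "can"
print(vocab[int(np.argmax(probs))])         # untrained weights, so the guess is arbitrary
```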
Skip-gram
• Predicts the context words from the target.
• Given a target word, it tries to predict the source context words around it.
Fig 2. Skip-gram (w=5) [1]
Skip-gram -Working
Sample string: "Hope can set you free"
[Diagram: the one-hot vector of the target "can" (0,1,0,0,0) passes through the input weight matrix W (3x5) into a hidden layer of 3 nodes; the output weight matrix W′ (5x3), applied twice, produces the predicted one-hot vectors of the context words "hope" and "set".]
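The training data for this setup are simply (target, context) pairs taken from a sliding window over the text. Below is a small sketch generating them for the slide's sample string; the helper name is made up for illustration.

```python
def skipgram_pairs(tokens, window=1):
    """Yield (target, context) training pairs within the given window."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

print(skipgram_pairs("hope can set you free".split()))
# includes ('can', 'hope') and ('can', 'set'), the two predictions in the diagram
```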
Sample Learned Word Vector
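The original slide showed a figure here. As a hedged stand-in, this is how a learned vector would be inspected in gensim, continuing the toy cbow model built in the earlier sketch:

```python
vec = cbow.wv["hope"]        # the learned 100-dimensional vector for "hope"
print(vec.shape, vec[:5])    # (100,) and its first five components
```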
Finding similarity
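Again as a stand-in for the original figure, similarity between learned vectors is typically queried like this in gensim (continuing the toy model; on such a tiny corpus the numbers are not meaningful):

```python
print(cbow.wv.similarity("hope", "free"))    # cosine similarity between two words
print(cbow.wv.most_similar("hope", topn=3))  # nearest neighbours in the vector space
```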
Some Interesting Findings of Word2Vec
• (King - Man) + Woman = Queen
Similar examples:
• (Water - Wet) + Fire = Flames
• (Paris - France) + Italy = Rome
• (Winter - Cold) + Summer = Warm
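This vector arithmetic can be reproduced with pretrained vectors. Here is a sketch using gensim's downloader and its pretrained Google News word2vec model; note the download is large (roughly 1.6 GB).

```python
import gensim.downloader as api

wv = api.load("word2vec-google-news-300")   # pretrained word2vec vectors (large download)
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
# expected top hit on this model: 'queen'
```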
Some Interesting Findings of Word2Vec
Fig 3. 2-D representations of some sample word pairs [5]
Results and Discussion
● Notebook
Results and Discussion
● More Interesting Results
Conclusion
• Word embedding is a popular method for representing words as vectors. It is a good fit for capturing the context of a given word in a document, semantic and syntactic similarity, and its connections with other words.
• With the help of word vectors obtained using word embedding techniques, we can perform arithmetic operations on words and get new, meaningful results.
• We can derive new facts with the help of the word vectors.
• Applications: recommendation systems, NLP, information retrieval, etc.
References
[1] Mikolov, Tomas, et al. "Efficient estimation of word representations in vector space." arXiv preprint arXiv:1301.3781 (2013).
[2] Pennington, Jeffrey, Richard Socher, and Christopher D. Manning. "GloVe: Global vectors for word representation." Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014.
[3] Ali, Wazir, et al. "Word embedding based new corpus for low-resourced language: Sindhi." arXiv preprint arXiv:1911.12579 (2021).
[4] Ruben Winastwan, "Visualizing Word Embedding with PCA and t-SNE", https://towardsdatascience.com/visualizing-word-embedding-with-pca-and-t-sne-961a692509f5, accessed April 2022.
[5] Pennington, Jeffrey, Richard Socher, and Christopher D. Manning, "GloVe: Global Vectors for Word Representation", https://nlp.stanford.edu/projects/glove/, accessed April 2022.
[6] Jian Yang, D. Zhang, A. F. Frangi, and Jing-yu Yang. "Two-dimensional PCA: a new approach to appearance-based face representation and recognition." IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 1, Jan. 2004, pp. 131-137.
[7] Van der Maaten, Laurens, and Geoffrey Hinton. "Visualizing data using t-SNE." Journal of Machine Learning Research 9.11 (2008).
THANK YOU