Volume 2, Issue 7, July 2012  ISSN: 2277 128X
International Journal of Advanced Research in Computer Science and Software Engineering
Research Paper  Available online at: www.ijarcsse.com

Protection of Transmission Lines Using Artificial Neural Network

Isha Awasthi (1), Aziz Ahmed (2)
1 M.Tech Scholar, Department of EEE, 2 HOD, Department of EEE, AFSET, Dhauj, Faridabad, Affiliated to M.D.U., Rohtak, Haryana

Abstract- High-voltage transmission lines carry electrical energy from generating sources to substations. If a fault or disturbance on a transmission line is not detected, located and cleared quickly, it can make the system unstable. The exact location of a fault on a transmission line can be calculated by conventional methods, but those methods can locate only a single fault between two substations. This paper presents a new approach to fault detection and classification in a power transmission system using an Artificial Neural Network (ANN). The network is trained with Rosenblatt's algorithm, the results are computed in MATLAB, and several graphs are shown. In contrast to the conventional methods, the system studied here consists of several nodes: the distance between each pair of nodes is calculated, and faults are generated between these nodes.

Keywords- ANN (Artificial Neural Network), Rosenblatt's algorithm, high-voltage transmission lines, fault, MATLAB

Introduction
Work on artificial neural networks, commonly referred to simply as "neural networks", has been motivated from its inception by the recognition that the human brain computes in an entirely different way from a conventional digital computer. The brain is a highly complex, nonlinear and parallel computer. It can organize its structural constituents, known as neurons, so as to perform certain computations (e.g.
pattern recognition, perception and motor control) many times faster than the fastest digital computer in existence today [1]. A neural network is a massively parallel distributed processor made up of simple processing units that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects: knowledge is acquired by the network from its environment through a learning process, and inter-neuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.

(i) Artificial intelligence: the design of intelligent computer systems that exhibit the characteristics associated with intelligence in human behaviour [2].
Examples: 1. Neural networks  2. Fuzzy logic  3. Expert systems  4. Probabilistic reasoning.
Types: 1. Hard computing  2. Soft computing.
Characteristics: 1. Cognition  2. Logical inference  3. Pattern recognition.

© 2012, IJARCSSE All Rights Reserved

The human brain has two complementary properties. It adapts itself to its surrounding environment, and as a result its information-processing capability grows; when this happens the brain is said to be plastic.
1. Plasticity: the capability to process and add new information while preserving the information learned previously.
2. Stability: the capability to remain stable when presented with irrelevant or useless information.

* Synapses with a large contact area are excitatory (positive weights); synapses with a small area are inhibitory (negative weights).
* The synapses of a neuron are modelled as weights (the strength of the connection).
* A biological neuron receives all its inputs through the dendrites and sums them; if the sum is greater than the threshold value, the signal is passed on through the cell body.
* A neural network can map input patterns onto output patterns.
* Neural networks are robust systems: they can recall complete patterns from incomplete patterns or noisy channels.
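The sum-and-threshold behaviour described above can be sketched as a minimal artificial neuron. The weight and threshold values below are hypothetical, chosen only to illustrate excitatory (positive) and inhibitory (negative) synapses:

```python
def neuron_output(inputs, weights, threshold):
    """Sum-and-threshold model of a neuron: positive weights act as
    excitatory synapses, negative weights as inhibitory synapses."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

# Two excitatory inputs and one inhibitory input (hypothetical values).
print(neuron_output([1, 1, 1], [0.6, 0.5, -0.4], 0.5))  # sum 0.7 > 0.5: fires (1)
print(neuron_output([1, 1, 1], [0.6, 0.5, -0.8], 0.5))  # sum 0.3 <= 0.5: silent (0)
```

Strengthening the inhibitory weight is enough to keep the neuron below its threshold, which is exactly the role the inhibitory synapses play in the description above.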
The McCulloch-Pitts neuron is centred on the idea that a neuron fires an impulse only if its threshold value is exceeded. The total input I received by the soma is

I = \sum_{i=1}^{n} w_i x_i

To generate the final output Y, the sum is passed through a nonlinear filter called the activation function.

(ii) Activation function: a function that limits the amplitude of the output of a neuron. It is also referred to as a squashing function, because it squashes the permissible amplitude range of the output signal to some finite value.

The training algorithm of the perceptron is supervised learning. The perceptron accepts a number of inputs x_i (i = 1, 2, ..., n) and computes a weighted sum of these inputs; the sum is then compared with a threshold \theta to produce an output y, which is either 0 or 1:

y = 1 if \sum_{i=1}^{n} w_i x_i \ge \theta
y = 0 if \sum_{i=1}^{n} w_i x_i < \theta

The perceptron is a single-layer network consisting of a sensory unit, an association unit and a response unit. The sensory unit produces a binary output (0 or 1), the association unit behaves like a basic building block, and the response unit performs the pattern recognition: if the weighted sum of the inputs is less than or equal to the threshold, the output is 0; otherwise it is 1. An externally applied bias b_k has the effect of increasing or lowering the net input of the activation function, depending on whether it is positive or negative respectively:

u_k = \sum_{j=1}^{m} w_{kj} x_j  and  y_k = \varphi(u_k + b_k)

where x_1, x_2, ..., x_m are the input signals; w_{k1}, w_{k2}, ..., w_{km} are the respective synaptic weights of neuron k; v_k = u_k + b_k is the linear-combiner output including the bias; \varphi(\cdot) is the activation function; and y_k is the output signal of the neuron [1].

(v) DONALD HEBB'S LEARNING: "When the axon of cell A is near enough to excite cell B and repeatedly takes part in firing cell B,
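The threshold decision rule and its supervised training can be sketched in a few lines. The paper does not list its training code, so this is a minimal sketch of the classic perceptron update w <- w + lr*(t - y)*x with the threshold adjusted in the opposite direction; the AND-gate training set, learning rate and epoch count are illustrative choices, not taken from the paper:

```python
def train_perceptron(samples, lr=0.1, epochs=20):
    """Perceptron learning rule: adjust weights and threshold
    whenever the predicted output y differs from the target t."""
    n = len(samples[0][0])
    w = [0.0] * n
    theta = 0.0
    for _ in range(epochs):
        for x, t in samples:
            s = sum(wi * xi for wi, xi in zip(w, x))
            y = 1 if s >= theta else 0
            err = t - y
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            theta -= lr * err
    return w, theta

# Illustrative training set: the logical AND function.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, theta = train_perceptron(AND)

def predict(x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0

print([predict(x) for x, _ in AND])  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the rule converges to a weight vector and threshold that reproduce all four targets; this is the same decision rule as the y = 1 / y = 0 comparison above.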
some growth process takes place in one or both cells, and cell A's efficiency in firing cell B is increased." More briefly: when neuron A repeatedly takes part in firing neuron B, the synaptic connection between them is strengthened. Hebb's learning rule can be used to form an association between two sets of patterns.

Example (Pavlov's experiment). Suppose
F: sight of food (unconditioned stimulus)
B: sound of a bell (conditioned stimulus)
S: salivation (response)
Then F → S; after repeated pairing, (F ∩ B) → S; and finally B → S on its own.

(iii) Recurrent networks: networks that may contain neurons with self-feedback links.
Pattern associator: training is supervised. A set of pattern pairs (X_i, Y_i), i = 1, 2, ..., n, is available, and the network learns to establish a relation between the two sets of patterns; this relation (association) is stored in the network.
Auto-associator: training is unsupervised. One set of patterns (X_i, i = 1, 2, ..., n) is repeatedly presented to the network; the network learns and remembers these patterns, which are then stored in the network.

(iv) Rosenblatt's perceptron: "perceptron" is the generic name given by Frank Rosenblatt to his model of the eye. The perceptron was an attempt to understand human memory, learning and cognitive processes.

MATLAB TEST RESULTS
(i) Identity function [figure: plot of the identity activation function]
(ii) Hyperbolic tangent function [figure: plot of the hyperbolic tangent activation function]
(iii) Logistic function [figure: plot of the logistic activation function]
[figure: network diagram showing the distances between the nodes]

Implementation of ANN: in this paper the nodes are taken to be substations, the substations are connected by transmission lines, and the faults occur on those lines. Table 1 gives the calculated distance between each pair of connected nodes.
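The three activation functions whose plots appear in the MATLAB test results can be written down directly. This is a Python sketch in place of the paper's (unlisted) MATLAB code; it only restates the standard definitions:

```python
import math

def identity(v):
    """Identity activation: passes the net input through unchanged."""
    return v

def hyperbolic_tangent(v):
    """Hyperbolic tangent activation: squashes the output to (-1, 1)."""
    return math.tanh(v)

def logistic(v):
    """Logistic (sigmoid) activation: squashes the output to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

for v in (-2.0, 0.0, 2.0):
    print(v, identity(v),
          round(hyperbolic_tangent(v), 4),
          round(logistic(v), 4))
```

The tanh and logistic functions are the "squashing" functions described in section (ii): whatever the net input, the output stays within a finite amplitude range.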
Distance between two nodes (sample run):

  Learning rate cannot be 0 or negative
  Enter the value of first node: 1
  Enter the value of second node: 2

Table 1. Weights (distances between connected nodes)

  Nodes   Weight (distance between the two nodes)
  (4,1)   3.3621
  (6,2)   3.5501
  (2,3)   3.2068
  (5,3)   2.4185
  (6,3)   2.8447
  (3,4)   3.4913
  (5,4)   3.2214
  (1,5)   3.0154
  (2,5)   3.0860
  (1,6)   2.8311
  (4,6)   2.8565

A fault is now generated between two nodes, and its exact location is calculated in MATLAB.

Location of fault between nodes 2 and 3:

  Learning rate cannot be 0 or negative
  Enter the value of first node: 2
  Enter the value of second node: 3
  weight =
    Columns 1 through 9
      3.4147  3.5058  2.7270  3.5134  3.2324  2.6975  2.6785  2.9469  3.3575
    Columns 10 through 11
      3.3649  2.5576
  Value of shortest distance between specified nodes = 2.73
  Best path chosen is through node = 2
  Best path chosen is through node = 3
  Fault is generated at location = 3.48
  Fault is located at 3.479, distance from node 2
  Fault is located at -0.752, distance from node 3
  Time taken by the routing: elapsed time is 8.688825 seconds.

Location of fault between nodes 4 and 5:

  Learning rate cannot be 0 or negative
  Enter the value of first node: 4
  Enter the value of second node: 5
  weight =
    Columns 1 through 9
      3.4147  3.5058  2.7270  3.5134  3.2324  2.6975  2.6785  2.9469  3.3575
    Columns 10 through 11
      3.3649  2.5576
  Value of shortest distance between specified nodes = 5.26
  Best path chosen is through node = 4
  Best path chosen is through node = 1
  Fault is generated at location = 5.48
  Fault is located at 5.479, distance from node 4
  Fault is located at -0.223, distance from node 5
  Time taken by the routing: elapsed time is 24.250947 seconds.
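The routing step behind these runs (finding the shortest path between the two specified nodes before placing the fault along it) can be sketched over the Table 1 graph. The paper does not list its routing code, so the use of Dijkstra's algorithm here is an assumption for illustration; the edge weights are taken directly from Table 1 (the per-run `weight` vectors printed above differ because they are regenerated on each run):

```python
import heapq

# Undirected graph built from the Table 1 edge weights.
EDGES = {
    (4, 1): 3.3621, (6, 2): 3.5501, (2, 3): 3.2068, (5, 3): 2.4185,
    (6, 3): 2.8447, (3, 4): 3.4913, (5, 4): 3.2214, (1, 5): 3.0154,
    (2, 5): 3.0860, (1, 6): 2.8311, (4, 6): 2.8565,
}

def build_adjacency(edges):
    adj = {}
    for (a, b), w in edges.items():
        adj.setdefault(a, []).append((b, w))
        adj.setdefault(b, []).append((a, w))
    return adj

def shortest_path(edges, src, dst):
    """Dijkstra's algorithm: returns (total distance, node path)."""
    adj = build_adjacency(edges)
    heap = [(0.0, src, [src])]
    visited = set()
    while heap:
        dist, node, path = heapq.heappop(heap)
        if node == dst:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, w in adj[node]:
            if nxt not in visited:
                heapq.heappush(heap, (dist + w, nxt, path + [nxt]))
    return float("inf"), []

dist, path = shortest_path(EDGES, 2, 3)
print(round(dist, 4), path)  # the direct edge (2,3) is the shortest route
```

Once the best path and its total length are known, a fault distance reported from one end node immediately gives the complementary distance from the other end, which is how the paired "distance from node" lines in the runs above are related.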
Location of fault between nodes 5 and 6:

  Learning rate cannot be 0 or negative
  Enter the value of first node: 5
  Enter the value of second node: 6
  weight =
    Columns 1 through 9
      3.3431  2.9922  3.2555  2.7712  3.3060  2.6318  2.6769  2.4462  2.4971
    Columns 10 through 11
      3.2235  3.0948
  Value of shortest distance between specified nodes = 5.17
  Best path chosen is through node = 5
  Best path chosen is through node = 4
  Best path chosen is through node = 6
  Fault is generated at location = 1.48
  Fault is located at 1.475, distance from node 5
  Fault is located at 3.699, distance from node 6
  Time taken by the routing: elapsed time is 62.345801 seconds.

Location of fault between nodes 1 and 6:

  Learning rate cannot be 0 or negative
  Enter the value of first node: 1
  Enter the value of second node: 6
  weight =
    Columns 1 through 9
      2.6344  3.0387  2.9816  3.3655  3.3952  2.7869  2.8898  2.8456  3.0463
    Columns 10 through 11
      3.1094  3.1547
  Value of shortest distance between specified nodes = 3.04
  Best path chosen is through node = 1
  Best path chosen is through node = 6
  Fault is generated at location = 1.34
  Fault is located at 1.340, distance from node 1
  Fault is located at 1.699, distance from node 6
  Time taken by the routing: elapsed time is 4.108720 seconds.

From these calculations the exact location of a fault on the transmission lines can be found.

Conclusion and Recommendation
This paper shows that ANN-based protection of transmission lines can detect faults and calculate their location. When faults are detected and isolated faster, damage to equipment and personnel is minimized. This saves the cost of replacing damaged parts or equipment, and it saves the time and labour spent by the maintenance crew in tracing the fault location. The approach therefore offers an economic saving to power utility operators.

REFERENCES
[1] "Neural Networks", Tata McGraw-Hill.
[2] "A Novel Fuzzy Neural Network Based Distance Relaying Scheme", IEEE Transactions on Power Delivery.
[3] M. Sanaye-Pasand and O. P. Malik, "High speed transmission system directional protection using an Elman network," IEEE Trans. on Power Delivery, vol. 13, no. 4, pp. 1040–1045, 1998.
[4] C. T. Lin and C. S. G. Lee, "Neural-network-based fuzzy logic control and decision system," IEEE Trans. on Computers, vol. 40, no. 12, pp. 1321–1336, 1991.
[5] S. Horikawa, T. Furuhashi, and Y. Uchikawa, "On fuzzy modeling using fuzzy neural networks with the back-propagation algorithm," IEEE Trans. on Neural Networks, vol. 3, no. 5, pp. 801–806, 1992.
[6] Y. Lin and G. A. Cunningham, "A new approach to fuzzy-neural system modeling," IEEE Trans. on Fuzzy Systems, vol. 3, no. 2, pp. 190–197, 1995.
[7] T. S. Sidhu, H. Singh, and M. S. Sachdev, "Design, implementation and testing of an artificial neural network based fault direction discriminator for protecting transmission lines," IEEE Trans. on Power Delivery, vol. 10, no. 2, pp. 697–706, 1995.
[8] T. Dalstein and B. Kulicke, "Neural network approach to fault classification for high speed protective relaying," IEEE Trans. on Power Delivery, vol. 10, no. 2, pp. 1002–1011, 1995.
[9] "High speed protection of EHV transmission lines using digital traveling waves", International Journal of Academic Research.