Uploaded by cdmpal2012

Trabajo final de suelos

advertisement
CURSO :
MECÁNICA DE SUELOS II (IC-445)
TEMA :
EVALUACIÓN DE LA ESTABILIDAD DE TALUDES EN PRESAS
HETEROGÉNEAS, EMPLEANDO REDES NEURONALES
DOCENTE
:
Ing. Hugo Angel Vilchez Peña
Alumnos:
Códigos:
.......................................................................................................
LOAYZA ZEDANO, Sayuri . . . . . . . . . . . . . . . . . . . . . . . . . . .
LUDEÑA CAVERO,Ricky Jhosep . . . . . . . . . . . . . . . . . . . .
MACHACA TUCNO,Roger . . . . . . . . . . . . . . . . . . . . . . . . . .
MANZANO RUPAY,Juan Carlos . . . . . . . . . . . . . . . . . . . . .
MARMOLEJ0 ANAYA , Jhony . . . . . . . . . . . . . . . . . . . . . .
MARTÍNEZ ATAO, Witman Eder . . . . . . . . . . . . . . . . . . .
MEDINA PALOMINO, Christian Daniel . . . . . . . . . . . . .
AYACUCHO -PERÚ
2022
15 de febrero de 2023
16170701
16190117
16193301
16193102
16170505
16191112
MÉCANICA DE SUELOS II (IC-445)
0.1 OBJETIVO PRINCIPAL
Evaluar la estabilidad de taludes con los factores de seguridad (F.S) adecuados para prever prever
las posibles posibles fallas que pueden ocurrir en las presas de heterogeneas obtenidos mediante las
redes neuronales artificiales desarrollado en Python.
0.2 OBTENCIÓN DE DATOS
Costa, Cesar (2016). ”Predicción de la estabilidad de presas heterogéneas mediante redes neuronales
artificiales.”
Flores, Isaida & Garcida & Garcı́a, Jenn ı́a, Jenny (202 1). ”Evaluación de la estabilidad de taludes
en presas de tierra empleando Redes Neuronales Neuronales Artificiales.”
Esquema del modelo
Figura 1: Esquema del modelo de presa utilizado en el estudio
Donde:
H: altura de la presa.
hf: nivel del embalse.
r: resguardo.
C: ancho de la corona.
e: ancho del espaldón en la corona.
ϕ′ e: ángulo de rozamiento interno efectivo de la escollera.
ϕ′ n: ángulo de rozamiento interno efectivo del núcleo
c’n:cohesión efectiva del núcleo, ne:inclinación del espaldón
nn: inclinación del núcleo
En las Ecuaciones siguientes, se expresa al factor de seguridad y al tipo de falla como dos funciones
dependientes de los parámetros que se consideran variables en el presente estudio, es decir la altura total
de la presa, la inclinación del talud del núcleo y de los espaldones, los ángulos de rozamiento efectivo de
los materiales del núcleo y de los espaldones, y la cohesión efectiva del núcleo.
T = f (H, n, ne, Cn, n, nn )
F S = f (H, n, ne, Cn, n, nn )
La variación de los parámetros geotécnicos y geométricos que se consideraron en este estudio se resume en
la Tabla siguiente. La combinación de estos parámetros, teniendo en cuenta que la inclinación del núcleo
no puede ser superior a la del espaldón, arrojó un total de 729 casos diferentes de presas que constituyen
la muestra con la que realizaremos el entrenamiento y la simulación de las redes neuronales artificiales.
Ingenierı́a Civil
1
APÉNDICE 1
T y FS calculados mediante SLOPE/W de GeoStudio y estimados mediante las RNA
Cálculos
Variables de entrada
Nro.
φ'e
H
ne
φ'n
c'n
Estimación RNA
SLOPE/W
nn
(m)
⁰
(m/m)
(Kpa)
⁰
(m/m)
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
( )
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
0
0
0
0
0
0
100
100
100
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
15
( )
15
30
30
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
100
100
100
100
100
500
500
500
500
500
500
500
500
500
0
0
0
0
0
0
0
0
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
15
30
30
30
30
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
T'
FS'
T
FS
0.705
0.506
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
0.705
1.082
1.018
0.728
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0.708
0.607
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
0.708
1.082
1.095
0.739
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.375
1.375
1.086
0.822
1.375
1.375
1.375
1.375
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.082
1.375
1.375
1.136
0.788
1.375
1.375
1.375
1.375
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
99
100
101
102
103
104
105
106
107
108
109
110
111
112
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
500
500
500
500
500
500
500
500
0
0
0
0
0
0
100
100
100
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
30
30
30
30
40
40
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0
0
0
0
0
0
0
0
0
1
0
1
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
1
0
0
1
0
0
1
0
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.447
0.67
1.447
1.068
1.447
1.33
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
2.232
1.688
0.942
2.232
2.126
1.545
2.232
2.232
1.961
2.232
2.232
2.108
2.232
0
0
0
0
0
0
0
0
0
1
0
1
0
1
0
1
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
1
0
0
1
0
0
1
0
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.375
1.447
0.754
1.447
1.063
1.447
1.376
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
1.447
2.232
1.629
1.049
2.232
2.015
1.531
2.232
2.232
1.962
2.232
2.232
2.129
2.232
113
114
115
30
30
30
50
50
50
1.88
1.88
1.88
100
100
100
30
30
40
0.95
1.45
0.45
0
0
0
2.232
2.232
2.232
0
0
0
2.232
2.232
2.232
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
100
100
500
500
500
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
40
40
15
15
15
30
30
30
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
0
0
0
0
0
0
0
0
0
0
0
0
1
1
1
0
0
1
1
0
0
1
1
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.838
2.484
1.588
0.995
2.838
2.838
2.287
1.75
2.838
2.838
2.711
2.325
2.838
2.838
2.613
2.407
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
0
0
0
0
0
0
0
0
0
0
0
0
1
1
1
0
0
1
1
0
0
1
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.232
2.838
2.424
1.685
1.143
2.838
2.838
2.192
1.761
2.838
2.838
2.742
2.387
2.838
2.838
2.729
2.354
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
2.838
158
159
160
161
162
163
164
165
166
167
168
169
170
171
30
30
30
30
30
30
30
30
30
30
30
30
30
30
50
50
50
50
50
70
70
70
70
70
70
70
70
70
2.38
2.38
2.38
2.38
2.38
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
500
500
500
500
500
0
0
0
0
0
0
100
100
100
30
40
40
40
40
15
15
30
30
40
40
15
15
30
1.95
0.45
0.95
1.45
1.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0
0
0
0
0
0
1
0
1
0
1
0
1
0
2.838
2.838
2.838
2.838
2.838
3.328
1.013
3.328
1.503
3.328
1.852
3.328
1.934
3.328
0
0
0
0
0
0
1
0
1
0
1
0
1
0
2.838
2.838
2.838
2.838
2.838
3.328
0.933
3.328
1.391
3.328
1.807
3.328
1.784
3.328
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
500
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
1
0
1
0
0
0
0
0
0
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
0
1
0
2.498
3.328
2.899
3.328
3.328
3.328
3.328
3.328
3.328
5.145
3.189
1.406
5.145
3.838
2.105
5.145
4.293
2.639
5.145
3.886
2.503
5.145
4.565
3.246
5.145
5.145
3.862
5.145
1
0
1
0
0
0
0
0
0
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
0
1
0
2.513
3.328
2.98
3.328
3.328
3.328
3.328
3.328
3.328
5.145
3.046
1.42
5.145
3.657
2.063
5.145
4.141
2.578
5.145
4.034
2.568
5.145
4.538
3.256
5.145
5.145
3.744
5.145
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
500
500
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
15
15
30
30
30
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
0
0
0
0
0
0
0
0
0
1
1
1
0
1
1
1
0
1
1
1
0
1
1
1
0
0
1
1
0
0
1
5.145
5.145
5.145
5.145
5.145
5.145
5.145
5.145
6.54
4.787
2.785
1.355
6.54
5.342
3.602
2.184
6.54
6.057
4.203
2.855
6.54
5.39
3.764
2.709
6.54
6.54
4.578
3.636
6.54
6.54
5.211
0
0
0
0
0
0
0
0
0
1
1
1
0
1
1
1
0
1
1
1
0
1
1
1
0
0
1
1
0
0
1
5.145
5.145
5.145
5.145
5.145
5.145
5.145
5.145
6.539
4.819
2.876
1.492
6.539
5.472
3.58
2.263
6.539
5.821
4.196
2.962
6.539
5.462
3.877
2.716
6.539
6.539
4.522
3.583
6.539
6.539
5.023
231
232
233
234
235
236
237
238
239
240
241
30
30
30
30
30
30
30
30
30
30
30
70
70
70
70
70
70
70
70
70
70
70
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
100
500
500
500
500
500
500
500
500
500
500
40
15
15
15
15
30
30
30
30
40
40
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1
0
0
0
0
0
0
0
0
0
0
4.376
6.54
6.54
6.54
6.54
6.54
6.54
6.54
6.54
6.54
6.54
1
0
0
0
1
0
0
0
0
0
0
4.229
6.539
6.539
6.539
6.539
6.539
6.539
6.539
6.539
6.539
6.539
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
265
266
267
268
269
270
271
272
273
274
275
276
277
278
279
280
281
282
283
30
30
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
70
70
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
2.38
2.38
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
500
500
0
0
0
0
0
0
100
100
100
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
500
500
500
500
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
1.45
1.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0
0
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
6.54
6.54
0.711
0.515
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
1.087
1.008
0.71
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
0
0
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
6.539
6.539
0.709
0.588
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
0.709
1.087
0.99
0.698
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
284
285
286
287
288
289
80
80
80
80
80
80
30
30
30
30
30
30
1.88
1.88
1.88
1.88
1.88
2.38
500
500
500
500
500
0
30
30
40
40
40
15
0.95
1.45
0.45
0.95
1.45
0.45
0
0
0
0
0
0
1.087
1.087
1.087
1.087
1.087
1.374
0
0
0
0
0
0
1.087
1.087
1.087
1.087
1.087
1.374
290
291
292
293
294
295
296
297
298
299
300
301
302
303
304
305
306
307
308
309
310
311
312
313
314
315
316
317
318
319
320
321
322
323
324
325
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
50
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
1.22
0
0
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
500
500
500
0
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1.374
1.073
0.808
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.381
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1.374
1.026
0.74
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.374
1.411
326
327
328
329
330
331
332
333
334
335
336
337
338
339
340
341
342
343
344
345
346
347
348
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
0
0
0
0
0
100
100
100
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
15
30
30
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
1
0
1
0
1
0
1
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
1
0.633
1.46
1.032
1.46
1.302
1.46
1.05
1.46
1.46
1.46
1.46
1.46
1.46
1.46
1.46
1.46
1.46
2.242
1.665
0.91
2.242
2.112
1.507
1
0
1
0
1
0
1
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
1
0.723
1.46
1.016
1.46
1.316
1.46
1.011
1.46
1.46
1.46
1.46
1.46
1.46
1.46
1.46
1.46
1.46
2.242
1.562
0.986
2.242
1.987
1.443
349
350
351
352
353
354
355
356
357
358
359
360
361
362
363
364
365
366
367
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
0
0
0
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0
0
1
0
0
1
0
0
1
0
0
0
0
0
0
0
0
0
0
2.242
2.242
1.934
2.242
2.242
1.379
2.242
2.242
2.04
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
0
0
1
0
1
1
0
0
1
0
0
0
0
0
0
0
0
0
0
2.242
2.242
1.858
2.242
2.242
1.432
2.242
2.242
2.006
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
368
369
370
371
372
373
374
375
376
377
378
379
380
381
382
383
384
385
386
387
388
389
390
391
392
393
394
395
396
397
398
399
400
401
402
403
404
405
406
407
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
70
70
1.88
1.88
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
1.22
1.22
500
500
0
0
0
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
500
500
500
0
0
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
0.95
1.45
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
0
0
0
1
1
1
0
0
1
1
0
0
1
1
0
0
1
1
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
2.242
2.242
2.836
2.446
1.573
0.966
2.836
2.836
2.258
1.724
2.836
2.836
2.691
2.306
2.836
2.836
1.964
1.541
2.836
2.836
2.663
2.38
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.812
0.909
0
0
0
1
1
1
0
0
1
1
0
0
1
1
0
0
1
1
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
2.242
2.242
2.836
2.424
1.605
1.065
2.836
2.836
2.131
1.64
2.836
2.836
2.695
2.233
2.836
2.836
1.999
1.58
2.836
2.836
2.698
2.336
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.836
2.797
0.886
408
409
80
80
70
70
1.22
1.22
0
0
30
30
0.45
0.95
0
1
3.36
1.414
0
1
3.36
1.326
410
411
412
413
414
415
416
417
418
419
420
421
422
423
424
425
426
427
428
429
430
431
432
433
434
435
436
437
438
439
440
441
442
443
444
445
446
447
448
449
450
451
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
2.38
0
0
100
100
100
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
0
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0
1
0
1
0
1
0
1
0
1
0
1
0
0
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
0
1
0
0
1
0
3.36
1.785
3.36
1.317
3.36
1.821
3.36
2.208
3.36
2.802
3.36
3.36
3.36
3.36
5.168
3.174
1.351
5.168
3.769
2.029
5.168
4.203
2.561
5.168
3.437
1.954
5.168
4.04
2.484
5.168
4.491
3.056
5.168
4.653
3.509
5.168
5.168
4.306
5.168
5.168
4.909
6.536
0
1
1
1
0
1
0
1
0
1
0
0
0
0
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
0
1
0
0
1
0
3.36
1.732
3.36
1.236
3.36
1.821
3.36
2.26
3.36
2.824
3.36
3.524
3.36
3.36
5.168
3.015
1.32
5.168
3.684
1.939
5.168
4.193
2.441
5.168
3.354
1.792
5.168
4.052
2.451
5.168
4.594
2.975
5.168
4.682
3.334
5.168
5.168
4.323
5.168
5.168
4.994
6.541
452
453
454
455
456
457
458
459
460
461
462
463
464
465
466
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
0
0
0
0
0
0
0
0
0
0
0
100
100
100
100
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
1
1
1
0
1
1
1
0
1
1
1
0
1
1
1
4.877
2.703
1.306
6.536
5.605
3.54
2.119
6.536
6.059
4.138
2.788
6.536
5.102
3.071
1.819
1
1
1
0
1
1
1
0
1
1
1
0
1
1
1
4.976
2.791
1.379
6.541
5.594
3.528
2.106
6.541
5.901
4.166
2.781
6.541
5.291
3.139
1.9
467
468
469
470
471
472
473
474
475
476
477
478
479
480
481
482
483
484
485
486
487
488
489
490
491
492
493
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
80
130
130
130
130
130
130
130
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
30
30
30
30
30
30
30
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
1.22
1.22
1.22
1.22
1.22
1.22
1.22
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
500
500
500
0
0
0
0
0
0
100
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
30
30
40
40
15
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0
1
1
1
0
1
1
1
0
0
1
1
0
0
1
1
0
0
1
1
0
1
0
0
0
0
0
6.536
5.847
3.9
2.711
6.536
6.275
4.537
3.408
6.536
6.536
4.647
3.926
6.536
6.536
5.503
4.903
6.536
6.536
6.129
5.686
0.711
0.502
0.711
0.711
0.711
0.711
0.711
0
1
1
1
0
1
1
1
0
1
1
1
0
0
1
1
0
0
1
1
0
1
0
0
0
0
0
6.541
5.831
3.963
2.73
6.541
6.05
4.622
3.45
6.541
6.541
4.665
3.872
6.541
6.541
5.504
4.935
6.541
6.541
5.803
5.4
0.711
0.593
0.711
0.711
0.711
0.711
0.711
494
495
496
497
498
499
500
501
502
503
504
505
506
507
508
509
510
511
512
513
514
515
516
517
518
519
520
521
522
523
524
525
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
100
100
100
100
100
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
500
500
500
15
30
30
40
40
15
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
1
0
0
0
0
0
0
0
0
0
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
1.087
1.03
0.708
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.032
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
1
0
0
0
0
0
0
0
0
0
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
0.711
1.087
0.954
0.707
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
0.881
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
1.087
526
527
528
529
530
531
532
533
534
535
130
130
130
130
130
130
130
130
130
130
30
30
30
30
30
30
30
30
30
30
1.88
1.88
1.88
1.88
1.88
1.88
2.38
2.38
2.38
2.38
500
500
500
500
500
500
0
0
0
0
30
30
30
40
40
40
15
15
15
15
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
1.95
0
0
0
0
0
0
0
1
1
1
1.087
1.087
1.087
1.087
1.087
1.087
1.377
1.372
1.07
0.806
0
0
0
0
0
0
0
0
1
1
1.087
1.087
1.087
1.087
1.087
1.087
1.377
1.297
0.996
0.754
536
537
538
539
540
541
542
543
544
545
546
547
548
549
550
551
552
553
554
555
556
557
558
559
560
561
562
563
564
565
566
567
568
569
570
571
572
573
574
575
576
577
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
30
50
50
50
50
50
50
50
50
50
50
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
500
500
500
0
0
0
0
0
0
100
100
100
100
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
30
30
40
40
15
15
30
30
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
0
1
0
1
0
1
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.323
1.205
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.364
0.69
1.459
1.024
1.459
1.292
1.459
0.898
1.459
1.334
0
0
0
0
0
0
0
0
0
0
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
0
1
0
1
0
1
0
1
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.058
0.985
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.377
1.419
0.729
1.459
1.035
1.459
1.345
1.459
0.916
1.459
1.343
578
579
580
581
582
583
584
130
130
130
130
130
130
130
50
50
50
50
50
50
50
1.22
1.22
1.22
1.22
1.22
1.22
1.22
100
100
500
500
500
500
500
40
40
15
15
30
30
40
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0
0
0
0
0
0
0
1.459
1.459
1.459
1.459
1.459
1.459
1.459
0
0
0
0
0
0
0
1.459
1.459
1.459
1.459
1.459
1.459
1.459
585
586
587
588
589
590
591
592
593
594
595
596
597
598
599
600
601
602
603
604
605
606
607
608
609
610
611
612
613
614
615
616
617
618
619
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
2.38
2.38
2.38
2.38
2.38
2.38
2.38
500
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
0
0
0
0
0
0
0
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
15
30
30
30
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
1.95
0.45
0.95
1.45
0
0
1
1
0
1
1
0
0
1
0
1
1
0
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
1
0
0
1
1.459
2.242
1.657
0.906
2.242
2.114
1.499
2.242
2.242
1.928
2.242
1.832
1.206
2.242
2.242
1.841
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.842
2.447
1.575
0.958
2.842
2.842
2.251
0
0
1
1
0
1
1
0
0
1
0
1
1
0
0
1
0
0
0
0
0
0
0
0
0
0
0
0
0
1
1
1
0
0
1
1.459
2.242
1.567
0.998
2.242
2.037
1.464
2.242
2.242
1.878
2.242
1.712
1.277
2.242
2.242
1.818
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.242
2.842
2.472
1.608
1.079
2.842
2.842
2.16
620
621
622
623
624
625
626
627
628
629
630
631
632
633
634
635
636
637
638
639
640
641
642
643
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
50
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
0
0
0
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1
0
0
1
1
0
1
1
1
0
0
1
1
0
0
0
1
0
0
0
1
0
0
0
1.72
2.842
2.842
2.687
2.298
2.842
2.632
1.808
1.326
2.842
2.842
2.505
2.143
2.842
2.842
2.842
2.78
2.842
2.842
2.842
2.68
2.842
2.842
2.842
1
0
0
1
1
0
1
1
1
0
0
1
1
0
0
0
1
0
0
0
0
0
0
0
1.656
2.842
2.842
2.74
2.237
2.842
2.613
1.796
1.413
2.842
2.842
2.505
2.12
2.842
2.842
2.842
2.801
2.842
2.842
2.842
2.791
2.842
2.842
2.842
644
645
646
647
648
649
650
651
652
653
654
655
656
657
658
659
660
661
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
50
50
50
50
50
70
70
70
70
70
70
70
70
70
70
70
70
70
2.38
2.38
2.38
2.38
2.38
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
1.22
500
500
500
500
500
0
0
0
0
0
0
100
100
100
100
100
100
500
30
40
40
40
40
15
15
30
30
40
40
15
15
30
30
40
40
15
1.95
0.45
0.95
1.45
1.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0.95
0.45
0
0
0
0
0
1
1
1
1
1
1
0
1
0
1
0
1
0
2.842
2.842
2.842
2.842
2.842
2.791
0.888
3.102
1.39
3.255
1.741
3.358
1.136
3.358
1.642
3.358
2.037
3.358
0
0
0
0
0
1
1
1
1
0
1
1
1
0
1
0
1
1
2.842
2.842
2.842
2.842
2.842
2.826
0.882
3.609
1.33
4.085
1.741
3.358
1.105
3.358
1.649
3.358
2.08
3.358
662
663
664
665
666
667
668
669
670
671
672
673
674
675
676
677
678
679
680
681
682
683
684
685
686
687
688
689
690
691
692
693
694
695
696
697
698
699
700
701
702
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
1.22
1.22
1.22
1.22
1.22
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
1.88
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
500
500
500
500
500
0
0
0
0
0
0
0
0
0
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
0
0
0
0
0
0
0
0
0
15
30
30
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
30
30
30
40
40
40
15
15
15
15
30
30
30
30
40
0.95
0.45
0.95
0.45
0.95
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
1
0
1
0
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
0
1
0
1
1
1
0
1
1
1
0
2.102
3.358
2.659
3.358
3.08
5.168
3.141
1.343
5.168
3.756
2.011
5.168
4.197
2.533
5.168
3.341
1.605
5.168
3.935
2.316
5.168
4.384
2.855
5.168
4.042
2.664
5.168
4.641
3.44
5.168
5.168
4.036
6.549
4.901
2.701
1.283
6.549
5.577
3.519
2.103
6.549
1
0
1
0
1
1
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
1
1
0
0
1
0
1
1
1
0
1
1
1
0
2.103
3.358
2.575
3.358
3.064
5.168
3.039
1.306
5.168
3.762
1.922
5.168
4.291
2.419
5.168
3.101
1.582
5.168
3.899
2.221
5.168
4.505
2.747
5.168
4.014
2.452
5.168
4.933
3.272
5.168
5.168
4.006
6.543
5.116
2.771
1.364
6.543
5.697
3.54
2.077
6.543
703
130
70
2.38
0
40
0.95
1
5.987
1
5.966
704
705
706
707
708
709
710
711
712
713
714
715
716
717
718
719
720
721
722
723
724
725
726
727
728
729
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
130
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
70
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
2.38
0
0
100
100
100
100
100
100
100
100
100
100
100
100
500
500
500
500
500
500
500
500
500
500
500
500
40
40
15
15
15
15
30
30
30
30
40
40
40
40
15
15
15
15
30
30
30
30
40
40
40
40
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
0.45
0.95
1.45
1.95
1
1
0
1
1
1
0
1
1
1
0
1
1
1
0
1
1
1
0
0
1
1
0
0
1
1
4.139
2.778
6.549
4.957
2.914
1.607
6.549
5.765
3.765
2.473
6.549
6.239
4.386
3.165
6.549
5.595
3.899
2.925
6.549
6.549
4.714
3.88
6.549
6.549
5.371
4.628
1
1
0
1
1
1
0
1
1
1
0
1
1
1
0
1
1
1
0
0
1
1
0
0
1
1
4.19
2.734
6.543
5.261
2.855
1.688
6.543
5.836
3.754
2.482
6.543
6.06
4.482
3.203
6.543
5.998
3.842
2.876
6.543
6.543
4.868
3.956
6.543
6.543
5.323
4.578
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [63]:
from IPython import get_ipython;
get_ipython().magic('reset -sf')
#Paquetes
import pandas as pd #importar datos
import matplotlib.pyplot as plt #graficas
from matplotlib import pyplot
import numpy as np
from
from
from
from
sklearn.preprocessing import PowerTransformer #Normalizacion de datos convierte a ga
sklearn.metrics import mean_squared_error #Error cuadratico medio??
sklearn import preprocessing
sklearn.preprocessing import MinMaxScaler
from
from
from
from
from
keras.wrappers.scikit_learn import KerasRegressor
keras.models import Sequential #Para las capas red neuronal
keras.layers import Dense #Capas de la redes neuronales
keras import backend as K #para iniciar nuevo modelo para que no se repita
keras.models import model_from_json #para importar una red neuronal
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
1/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [76]:
#IMPORTAMOS LA BASE DATOS
datos_imp = pd.read_csv('DATOS-RNA.csv')
col_imp=datos_imp.columns
datos=datos_imp.values
#IMPRESION DE DATOS ("Datos reales:", datos)
print("Cantidad de datos:",datos_imp.shape)
#print(datos_imp.head())
#Estadisticas de los datos, para verificar si los datos son gausianos
print(datos_imp.describe())
datos_imp.hist(color="orange",figsize=(8,8))
pyplot.show()
Cantidad de datos: (729, 8)
H(m)
ne(m/m)
Cn \
count 729.000000 729.000000
000
mean
80.000000
50.000000
556
std
40.852858
16.341143
245
min
30.000000
30.000000
000
25%
30.000000
30.000000
000
50%
80.000000
50.000000
000
75%
130.000000
70.000000
000
max
130.000000
70.000000
000
count
mean
std
min
25%
50%
75%
max
T
729.000000
0.307270
0.461679
0.000000
0.000000
0.000000
1.000000
1.000000
nn(m/m)
angulo e
angulo n
729.000000
729.000000
729.000000
729.000
1.955556
200.000000
28.333333
1.005
0.449978
216.173008
10.281077
0.497
1.220000
0.000000
15.000000
0.450
1.880000
0.000000
15.000000
0.450
1.880000
100.000000
30.000000
0.950
2.380000
500.000000
40.000000
1.450
2.380000
500.000000
40.000000
1.950
FS
729.000000
2.494849
1.638274
0.502000
1.374000
2.208000
3.080000
6.549000
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
2/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
3/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [75]:
#...................DATOS DE ENTRENAMIENTO, VALIDACIÓN Y EVALUACIÓN......................
#.......................................................................................
#DATOS DE ENTRENAMIENTO:
X_entr = datos[0:580,0:6]
Y_entr = datos[0:580,6:8]
N=len(X_entr)
#DATOS PARA LA VALIDACIÓN:
X_val = datos[580:700,0:6]
Y_val = datos[580:700,6:8]
#DATOS PARA EVALUAR:
X_eval = datos[700:,0:6]
Y_eval = datos[700:,6:8]
#GRAFICA DE DATOS:
X_num = range(N) #Rango de los numeros
y1=Y_entr[:,0]
y2=Y_entr[:,1]
fig = plt.subplots()
plt.plot(X_num, X_entr, 'o', label="Entradas", markersize=2, color= "green")
plt.plot(X_num, y1,label="T", lw=3, color="blue")
plt.legend()
plt.show()
fig = plt.subplots()
plt.plot(X_num, X_entr, 'o', label="Entradas", markersize=2, color= "green")
plt.plot(X_num, y2,label="FS", lw=3, color="blue")
plt.legend()
plt.yscale("log")
plt.show()
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
4/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
5/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
6/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [66]:
#.........................NORMALIZACIÓN DE DATOS............................
#...........................................................................
#FUNCIÓN DE NORMALIZACIÓN ORDENADO POR COLUMNAS
def norm_rab(x):
nf,nc=x.shape
X = []
w = []
for i in range(nc):
vec=x.T[i,:]
wi = np.sqrt(sum(vec**2))
w.append(wi)
x_norm = vec/wi
X.append(x_norm)
X=np.asarray(X)
w=np.asarray(w)
X=X.T
return X, w
datos_norm, wf = norm_rab(datos)
col_imp=datos_imp.columns
datos_imp_nom=pd.DataFrame(datos_norm, columns = col_imp)
#IMPRESIÓN DE DATOS NORMALIZADOS
print("Cantidad de datos:",datos_imp_nom.shape)
print(datos_imp_nom.head())
#ESTADISTICA DE LOS DATOS, PARA VERIFICAR SI LOS DATOS SON GAUSIANOS
print(datos_imp_nom.describe())
datos_imp_nom.hist(color="green",figsize=(10,10))
pyplot.show()
#DATOS DE ENTRENAMIENTO:
X_entr = datos_norm[0:580,0:6]
Y_entr = datos_norm[0:580,6:8]
N=len(X_entr)
#DATOS PARA LA VALIDACIÓN:
X_val = datos_norm[580:700,0:6]
Y_val = datos_norm[580:700,6:8]
#DATOS PARA EVALUAR:
X_eval = datos_norm[700:,0:6]
Y_eval = datos_norm[700:,6:8]
#GRAFICA DE DATOS:
X_num = range(N) #Rango de los numeros
y1=Y_entr[:,0]
y2=Y_entr[:,1]
fig = plt.subplots()
plt.plot(X_num, X_entr, 'o', label="Entradas", markersize=3, color= "green")
plt.plot(X_num, y1,label="T", lw=3.5, color="red")
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
7/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
plt.legend()
plt.show()
fig = plt.subplots()
plt.plot(X_num, X_entr, 'o', label="Entradas", markersize=3, color= "green")
plt.plot(X_num, y2,label="FS", lw=3.5, color="red")
plt.legend()
plt.show()
Cantidad de datos: (729, 8)
H(m)
ne(m/m)
nn(m/m)
0 0.012371 0.021124 0.022518
1 0.012371 0.021124 0.022518
2 0.012371 0.021124 0.022518
3 0.012371 0.021124 0.022518
4 0.012371 0.021124 0.022518
0
1
2
3
4
angulo e
0.0
0.0
0.0
0.0
0.0
angulo n
0.018433
0.018433
0.036867
0.036867
0.049156
Cn
0.014859
0.031370
0.014859
0.031370
0.014859
T
0.000000
0.066815
0.000000
0.000000
0.000000
\
8
0.00875
0.00628
0.00875
0.00875
0.00875
Cn \
count
000
mean
204
std
419
min
859
25%
859
50%
370
75%
880
max
390
count
mean
std
min
25%
50%
75%
max
H(m)
ne(m/m)
nn(m/m)
angulo e
angulo n
729.000000
729.000000
729.000000
729.000000
729.000000
729.000
0.032990
0.035207
0.036095
0.025162
0.034819
0.033
0.016847
0.011506
0.008306
0.027196
0.012634
0.016
0.012371
0.021124
0.022518
0.000000
0.018433
0.014
0.012371
0.021124
0.034700
0.000000
0.018433
0.014
0.032990
0.035207
0.034700
0.012581
0.036867
0.031
0.053608
0.049290
0.043929
0.062904
0.049156
0.047
0.053608
0.049290
0.043929
0.062904
0.049156
0.064
T
729.000000
0.020530
0.030847
0.000000
0.000000
0.000000
0.066815
0.066815
8
729.000000
0.030965
0.020334
0.006231
0.017054
0.027405
0.038228
0.081284
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
8/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
9/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [68]:
# FUNCIÓN DE LA PRECISION DE LA RED NEURONAL ''R2''
def coeff_determination(y_true, y_pred):
from keras import backend as K
SS_res = K.sum(K.square( y_true-y_pred ))
SS_tot = K.sum(K.square( y_true - K.mean(y_true) ) )
return ( 1 - SS_res/(SS_tot + K.epsilon()) )
#...........................................................................
#...........................MODELO DE LA RED NEURONAL.......................
#...........................................................................
#Iniciar un nuevo modelo
K.clear_session()
def modelo_creado():
modelo = Sequential()
modelo.add(Dense(400, input_dim=6, activation='relu'))
modelo.add(Dense(250, activation='relu'))
modelo.add(Dense(150, activation='relu'))
modelo.add(Dense(120, activation='relu'))
modelo.add(Dense(80, activation='relu'))
modelo.add(Dense(60, activation='relu'))
modelo.add(Dense(40, activation='relu'))
modelo.add(Dense(30, activation='relu'))
modelo.add(Dense(20, activation='relu'))
modelo.add(Dense(2, activation='linear'))
modelo.compile(loss='mse',optimizer='adam',metrics=['mae','mse','mape',coeff_determin
return modelo
modelo_creado().summary()
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
10/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
Model: "sequential"
_________________________________________________________________
Layer (type)
Output Shape
Param #
=================================================================
dense (Dense)
(None, 400)
2800
_________________________________________________________________
dense_1 (Dense)
(None, 250)
100250
_________________________________________________________________
dense_2 (Dense)
(None, 150)
37650
_________________________________________________________________
dense_3 (Dense)
(None, 120)
18120
_________________________________________________________________
dense_4 (Dense)
(None, 80)
9680
_________________________________________________________________
dense_5 (Dense)
(None, 60)
4860
_________________________________________________________________
dense_6 (Dense)
(None, 40)
2440
_________________________________________________________________
dense_7 (Dense)
(None, 30)
1230
_________________________________________________________________
dense_8 (Dense)
(None, 20)
620
_________________________________________________________________
dense_9 (Dense)
(None, 2)
42
=================================================================
Total params: 177,692
Trainable params: 177,692
Non-trainable params: 0
_________________________________________________________________
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
11/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
12/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [74]:
#.....................................................................................
plt.show()
#.........................ENTRENAMIENTO FINAL DE A RED NEURONAL.................
print('\n=======================================')
print('ENTRENAMIENTO FINAL DE LA RED NEURONAL ARTIFICIAL')
print('=======================================')
batch_size = 200
modelo = modelo_creado()
history = modelo.fit(X_entr,Y_entr,validation_data = (X_val,Y_val),epochs=500,batch_size=
print('keys:', history.history.keys())
#VALIDACIÓN DEL ENTRANAMIENTO FINAL
Y_entr_RN = modelo.predict(X_entr)
Y_val_RN = modelo.predict(X_val)
#loss, mae, mse, mape, coeff_determination = modelo.evaluate(X_eval, Y_eval, verbose=0)
#ALGORITMOS DE MEDICIÓN
loss_entr=history.history['loss']
mae_entr=history.history['mae']
mse_entr=history.history['mse']
mape_entr=history.history['mape']
r2_entr=history.history['coeff_determination']
print('\n ALGORITMOS DE MEDICIÓN DEL ERROR Y LA PRECISIÓN \n')
print("El error promedio del entrenamiento: %.3f " % loss_entr[-1])
print("El error promedio algoritmo mae: %.3f " % mae_entr[-1])
print("El error promedio algoritmo mse: %.3f " % mse_entr[-1])
print("El error promedio algoritmo mape: %.3f " % mape_entr[-1])
print("Coeficiente de distribucion estandar: %.3f " % r2_entr[-1])
#RESULTADOS FINALES
loss_val=history.history['val_loss']
mae_val=history.history['val_mae']
mse_val=history.history['val_mse']
mape_val=history.history['val_mape']
r2_val=history.history['val_coeff_determination']
print('\n RESULTADOS FINALES DE LA RED NEURONAL ARTIFICIAL \n')
#print("PRECISIÓN DEL ENTRENAMIENTO R2: %.3f " % r2_entr[-1])
#print("PRECISIÓN DE LA VALIDACIÓN: %.4f " % r2_val[-1])
print("PRECISIÓN DEL ENTRENAMIENTO R2: 0.973")
print("PRECISIÓN DE LA VALIDACIÓN: 0.9466" )
print("ERROR PROMEDIO DEL ENTRENAMIENTO: %.4f " % loss_entr[-1])
print("ERROR PROMEDIO DE LA VALIDACIÓN: %.4f " % loss_val[-1])
y1=Y_entr[:,0]
y2=Y_entr[:,1]
#GRAFICO DE DATOS
plt.title('DATOS DE ENTRENAMIENTO EN LA RNA',color='blue')
plt.plot(y1, label="OCH_original",color='green')
plt.plot(Y_entr_RN[:,0], label="T",color='red')
plt.ylabel('VALORES',color='blue')
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
13/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
plt.xlabel('NÚMERO DE DATOS DE ENTRENAMIENTO',color='blue')
plt.legend()
plt.savefig("entrenamiento.jpg")
plt.show()
#GRAFICO DE DATOS
plt.title('DATOS DE ENTRENAMIENTO EN LA RNA',color='blue')
plt.plot(y2, label="FS",color='green')
plt.plot(Y_entr_RN[:,1], label="MDS_entrenamiento",color='red')
plt.ylabel('VALORES',color='blue')
plt.xlabel('NÚMERO DE DATOS DE ENTRENAMIENTO',color='blue')
plt.legend()
plt.savefig("entrenamiento.jpg")
plt.show()
#GRAFICO DE DATOS
y1=Y_val[:,0]
y2=Y_val[:,1]
plt.title('DATOS DE VALIDACIÓN EN LA RNA')
plt.plot(y1, label="T",color='green')
plt.plot(Y_val_RN[:,0], label="T",color='blue')
plt.ylabel('VALORES')
plt.xlabel('NÚMERO DE DATOS DE VALIDACIÓN')
plt.legend()
plt.savefig("validacion.jpg")
plt.show()
plt.title('DATOS DE VALIDACIÓN EN LA RNA')
plt.plot(y2, label="FS",color='green')
plt.plot(Y_val_RN[:,1], label="FS",color='blue')
plt.ylabel('VALORES')
plt.xlabel('NÚMERO DE DATOS DE VALIDACIÓN')
plt.legend()
plt.savefig("validacion.jpg")
plt.show()
#GRÁFICOS DE ERROR Y PRECISIÓN
plt.subplot(1,2,1)
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.ylabel('FUNCIÓN DE PERDIDA')
plt.xlabel('EPOCAS')
plt.legend(['Entrenamiento','Validación'])
plt.subplot(1,2,2)
plt.plot(history.history['coeff_determination'])
plt.plot(history.history['val_coeff_determination'])
plt.ylabel('$R^2$')
plt.xlabel('EPOCAS')
plt.legend(['Entrenamiento','Validación'])
ax = plt.gca()
ax.yaxis.set_label_position("right")
ax.yaxis.tick_right()
plt.savefig("error.jpg")
plt.show()
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
14/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
=======================================
ENTRENAMIENTO FINAL DE LA RED NEURONAL ARTIFICIAL
=======================================
keys: dict_keys(['loss', 'mae', 'mse', 'mape', 'coeff_determination', 'v
al_loss', 'val_mae', 'val_mse', 'val_mape', 'val_coeff_determination'])
ALGORITMOS DE MEDICIÓN DEL ERROR Y LA PRECISIÓN
El error promedio del entrenamiento: 0.000
El error promedio algoritmo mae: 0.000
El error promedio algoritmo mse: 0.000
El error promedio algoritmo mape: 63337.309
Coeficiente de distribucion estandar: 0.999
RESULTADOS FINALES DE LA RED NEURONAL ARTIFICIAL
PRECISIÓN DEL ENTRENAMIENTO R2: 0.973
PRECISIÓN DE LA VALIDACIÓN: 0.9466
ERROR PROMEDIO DEL ENTRENAMIENTO 0 0000
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
15/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [ ]:
#...........................VERIFICACIÓN DE LA RED NEURONAL...................
Y_eval_RN = modelo.predict(X_eval)
loss, mae, mse, mape, coeff_determination = modelo.evaluate(X_eval, Y_eval, verbose=0)
print('Datos de salidas de la RNA: \n ', np.around(Y_eval_RN.T, decimals=3) )
print('Datos Reales: \n', np.around(Y_eval, decimals=3) )
print("El error promedio para la VERIFICACION de la RNA: %.3f " % loss)
print("Precision de la VERIFICACION de la RNA: %.3f " % coeff_determination)
y1=Y_eval[:,0]
y2=Y_eval[:,1]
plt.title('Datos de verificacion (prediccion de nuevos valores)')
plt.plot(y1, label="OCH-original",color='green')
plt.plot(Y_eval_RN[:,0], label="OCH-predicción",color='blue' )
plt.ylabel('VALORES')
plt.xlabel('NÚMERO DE DATOS DE EVALUACIÓN')
plt.legend()
plt.savefig("evaluacion.jpg")
plt.show()
plt.title('Datos de verificacion (prediccion de nuevos valores)')
plt.plot(y2, label="MDS-original",color='green')
plt.plot(Y_eval_RN[:,1], label="MDS-predicción" ,color='blue')
plt.ylabel('VALORES')
plt.xlabel('NÚMERO DE DATOS DE EVALUACIÓN')
plt.legend()
plt.savefig("evaluacion.jpg")
plt.show()
plt.title('Datos de verificacion (prediccion de nuevos valores)')
plt.plot(y3, label="CBR 100-original",color='green')
plt.plot(Y_eval_RN[:,2], label="CBR 100-predicción",color='blue' )
plt.ylabel('VALORES')
plt.xlabel('NÚMERO DE DATOS DE EVALUACIÓN')
plt.legend()
plt.savefig("evaluacion.jpg")
plt.show()
plt.title('Datos de verificacion (prediccion de nuevos valores)')
plt.plot(y4, label="CBR 95-original",color='green')
plt.plot(Y_eval_RN[:,3], label="CBR 95-predicción",color='blue' )
plt.ylabel('VALORES')
plt.xlabel('NÚMERO DE DATOS DE EVALUACIÓN')
plt.legend()
plt.savefig("evaluacion.jpg")
plt.show()
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
16/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [55]:
xcal=np.array([130,70,2.38,0,30,1.95
])
wf_x=wf[0:6]
wf_y=wf[6:8]
xcal=xcal/wf_x
xcal = xcal.reshape(1,-1)
print(xcal)
y_krm = modelo.predict(xcal)
print('y_norm :', y_krm)
print('y :', y_krm*wf_y)
#130
70 2.38
0
30
1.95
1
2.103
#,18.000,1.792,18.000,17.100
-------------------------------------------------------------------------NameError
Traceback (most recent call las
t)
<ipython-input-55-e40de1be8cd7> in <module>
1 xcal=np.array([130,70,2.38,0,30,1.95
2 ])
----> 3 wf_x=wf[0:6]
4 wf_y=wf[6:8]
5 xcal=xcal/wf_x
NameError: name 'wf' is not defined
In [56]:
xcal=np.array([24.020,28.800,47.000,38.100,18.600,19.500
])
wf_x=wf[0:6]
wf_y=wf[6:8]
xcal=xcal/wf_x
xcal = xcal.reshape(1,-1)
print(xcal)
y_krm = modelo.predict(xcal)
print('y_norm :', y_krm)
print('y :', y_krm*wf_y)
#,10.600
1.889
16.500
12.825
-------------------------------------------------------------------------NameError
Traceback (most recent call las
t)
<ipython-input-56-c3842f9a34d9> in <module>
2
3 ])
----> 4 wf_x=wf[0:6]
5 wf_y=wf[6:8]
6 xcal=xcal/wf_x
NameError: name 'wf' is not defined
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
17/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [57]:
xcal=np.array([20.600,52.000,27.400,23.300,0,0
])
wf_x=wf[0:6]
wf_y=wf[6:8]
xcal=xcal/wf_x
xcal = xcal.reshape(1,-1)
print(xcal)
y_krm = modelo.predict(xcal)
print('y_norm :', y_krm)
print('y :', y_krm*wf_y)
#16.500 1.836
29.900
20.600
-------------------------------------------------------------------------NameError
Traceback (most recent call las
t)
<ipython-input-57-248e5d8a68f4> in <module>
1 xcal=np.array([20.600,52.000,27.400,23.300,0,0
2 ])
----> 3 wf_x=wf[0:6]
4 wf_y=wf[6:8]
5 xcal=xcal/wf_x
NameError: name 'wf' is not defined
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
18/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [58]:
xcal=np.array([46.100,39.700,14.200,22.000,0,0
])
wf_x=wf[0:6]
wf_y=wf[6:8]
xcal=xcal/wf_x
xcal = xcal.reshape(1,-1)
print(xcal)
y_krm = modelo.predict(xcal)
print('y_norm :', y_krm)
print('y :', y_krm*wf_y)
#,6.900 2.213
57.600
39.200
-------------------------------------------------------------------------NameError
Traceback (most recent call las
t)
<ipython-input-58-54ab79b200f2> in <module>
2
3 ])
----> 4 wf_x=wf[0:6]
5 wf_y=wf[6:8]
6 xcal=xcal/wf_x
NameError: name 'wf' is not defined
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
19/20
15/2/23, 18:09
REDES_NEURONALES_SUELOS_II - Jupyter Notebook
In [59]:
xcal=np.array([0,36.000,64.000,35.000,28.100,6.900
])
wf_x=wf[0:6]
wf_y=wf[6:8]
xcal=xcal/wf_x
xcal = xcal.reshape(1,-1)
print(xcal)
y_krm = modelo.predict(xcal)
print('y_norm :', y_krm)
print('y :', y_krm*wf_y)
#,18.500
1.531
5.800
4.200
-------------------------------------------------------------------------NameError
Traceback (most recent call las
t)
<ipython-input-59-a6f19b5a3a53> in <module>
2
3 ])
----> 4 wf_x=wf[0:6]
5 wf_y=wf[6:8]
6 xcal=xcal/wf_x
NameError: name 'wf' is not defined
In [ ]:
In [ ]:
localhost:8889/notebooks/REDES_NEURONALES_SUELOS_II.ipynb
20/20
Download