The Internet of Medical Things (IoMT)
Scrivener Publishing
100 Cummings Center, Suite 541J
Beverly, MA 01915-6106
Advances in Learning Analytics for Intelligent Cloud-IoT Systems
Series Editors: Dr. Souvik Pal and Dr. Dac-Nhuong Le
Scope: The roles of adaptation, learning analytics, computational intelligence, and data analytics in the field of cloud-IoT systems are becoming increasingly essential and intertwined. The capability of an intelligent system depends on the various self-decision-making algorithms in IoT devices. IoT-based smart systems generate a large amount of data (big data) that cannot be processed by traditional data-processing algorithms and applications. Hence, this book series covers the computational methods incorporated within such systems with the help of analytics reasoning and sense-making on big data, centered on cloud- and IoT-enabled environments.
The series seeks volumes comprising empirical studies, theoretical and numerical analysis, and novel research findings. The series encourages cross-fertilization of research and knowledge in data analytics, machine learning, data science, and sustainable IoT development.
Please send proposals to:
Dr. Souvik Pal
Department of Computer Science and Engineering
Global Institute of Management and Technology
Krishna Nagar
West Bengal, India
souvikpal22@gmail.com
Dr. Dac-Nhuong Le
Faculty of Information Technology, Haiphong University, Haiphong, Vietnam
huongld@hus.edu.vn
Publishers at Scrivener
Martin Scrivener (martin@scrivenerpublishing.com)
Phillip Carmical (pcarmical@scrivenerpublishing.com)
The Internet of Medical Things (IoMT)
Healthcare Transformation
Edited by
R.J. Hemalatha
D. Akila
D. Balaganesh
and
Anand Paul
This edition first published 2022 by John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
and Scrivener Publishing LLC, 100 Cummings Center, Suite 541J, Beverly, MA 01915, USA
© 2022 Scrivener Publishing LLC
For more information about Scrivener publications please visit www.scrivenerpublishing.com.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or
transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title
is available at http://www.wiley.com/go/permissions.
Wiley Global Headquarters
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials, or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read.
Library of Congress Cataloging-in-Publication Data
ISBN 978-1-119-76883-8
Cover image: Pixabay.com
Cover design by Russell Richardson
Set in 11pt Minion Pro by Manila Typesetting Company, Makati, Philippines
Printed in the USA
10 9 8 7 6 5 4 3 2 1
Contents

Preface

1 In Silico Molecular Modeling and Docking Analysis in Lung Cancer Cell Proteins
Manisha Sritharan and Asita Elengoe
1.1 Introduction
1.2 Methodology
1.2.1 Sequence of Protein
1.2.2 Homology Modeling
1.2.3 Physiochemical Characterization
1.2.4 Determination of Secondary Models
1.2.5 Determination of Stability of Protein Structures
1.2.6 Identification of Active Site
1.2.7 Preparation of Ligand Model
1.2.8 Docking of Target Protein and Phytocompound
1.3 Results and Discussion
1.3.1 Determination of Physiochemical Characters
1.3.2 Prediction of Secondary Structures
1.3.3 Verification of Stability of Protein Structures
1.3.4 Identification of Active Sites
1.3.5 Target Protein-Ligand Docking
1.4 Conclusion
References

2 Medical Data Classification in Cloud Computing Using Soft Computing With Voting Classifier: A Review
Saurabh Sharma, Harish K. Shakya and Ashish Mishra
2.1 Introduction
2.1.1 Security in Medical Big Data Analytics
2.1.1.1 Capture
2.1.1.2 Cleaning
2.1.1.3 Storage
2.1.1.4 Security
2.1.1.5 Stewardship
2.2 Access Control–Based Security
2.2.1 Authentication
2.2.1.1 User Password Authentication
2.2.1.2 Windows-Based User Authentication
2.2.1.3 Directory-Based Authentication
2.2.1.4 Certificate-Based Authentication
2.2.1.5 Smart Card–Based Authentication
2.2.1.6 Biometrics
2.2.1.7 Grid-Based Authentication
2.2.1.8 Knowledge-Based Authentication
2.2.1.9 Machine Authentication
2.2.1.10 One-Time Password (OTP)
2.2.1.11 Authority
2.2.1.12 Global Authorization
2.3 System Model
2.3.1 Role and Purpose of Design
2.3.1.1 Patients
2.3.1.2 Cloud Server
2.3.1.3 Doctor
2.4 Data Classification
2.4.1 Access Control
2.4.2 Content
2.4.3 Storage
2.4.4 Soft Computing Techniques for Data Classification
2.5 Related Work
2.6 Conclusion
References
3 Research Challenges in Pre-Copy Virtual Machine Migration in Cloud Environment
Nirmala Devi N. and Vengatesh Kumar S.
3.1 Introduction
3.1.1 Cloud Computing
3.1.1.1 Cloud Service Provider
3.1.1.2 Data Storage and Security
3.1.2 Virtualization
3.1.2.1 Virtualization Terminology
3.1.3 Approach to Virtualization
3.1.4 Processor Issues
3.1.5 Memory Management
3.1.6 Benefits of Virtualization
3.1.7 Virtual Machine Migration
3.1.7.1 Pre-Copy
3.1.7.2 Post-Copy
3.1.7.3 Stop and Copy
3.2 Existing Technology and Its Review
3.3 Research Design
3.3.1 Basic Overview of VM Pre-Copy Live Migration
3.3.2 Improved Pre-Copy Approach
3.3.3 Time Series–Based Pre-Copy Approach
3.3.4 Memory-Bound Pre-Copy Live Migration
3.3.5 Three-Phase Optimization Method (TPO)
3.3.6 Multiphase Pre-Copy Strategy
3.4 Results
3.4.1 Finding
3.5 Discussion
3.5.1 Limitation
3.5.2 Future Scope
3.6 Conclusion
References

4 Estimation and Analysis of Prediction Rate of Pre-Trained Deep Learning Network in Classification of Brain Tumor MRI Images
Krishnamoorthy Raghavan Narasu, Anima Nanda, Marshiana D., Bestley Joe and Vinoth Kumar
4.1 Introduction
4.2 Classes of Brain Tumors
4.3 Literature Survey
4.4 Methodology
4.5 Conclusion
References

5 An Intelligent Healthcare Monitoring System for Coma Patients
Bethanney Janney J., T. Sudhakar, Sindu Divakaran, Chandana H. and Caroline Chriselda L.
5.1 Introduction
5.2 Related Works
5.3 Materials and Methods
5.3.1 Existing System
5.3.2 Proposed System
5.3.3 Working
5.3.4 Module Description
5.3.4.1 Pulse Sensor
5.3.4.2 Temperature Sensor
5.3.4.3 Spirometer
5.3.4.4 OpenCV (Open Source Computer Vision)
5.3.4.5 Raspberry Pi
5.3.4.6 USB Camera
5.3.4.7 AVR Module
5.3.4.8 Power Supply
5.3.4.9 USB to TTL Converter
5.3.4.10 EEG of Comatose Patients
5.4 Results and Discussion
5.5 Conclusion
References
6 Deep Learning Interpretation of Biomedical Data
T.R. Thamizhvani, R. Chandrasekaran and T.R. Ineyathendral
6.1 Introduction
6.2 Deep Learning Models
6.2.1 Recurrent Neural Networks
6.2.2 LSTM/GRU Networks
6.2.3 Convolutional Neural Networks
6.2.4 Deep Belief Networks
6.2.5 Deep Stacking Networks
6.3 Interpretation of Deep Learning With Biomedical Data
6.4 Conclusion
References
7 Evolution of Electronic Health Records
G. Umashankar, Abinaya P., J. Premkumar, T. Sudhakar and S. Krishnakumar
7.1 Introduction
7.2 Traditional Paper Method
7.3 IoMT
7.4 Telemedicine and IoMT
7.4.1 Advantages of Telemedicine
7.4.2 Drawbacks
7.4.3 IoMT Advantages with Telemedicine
7.4.4 Limitations of IoMT With Telemedicine
7.5 Cyber Security
7.6 Materials and Methods
7.6.1 General Method
7.6.2 Data Security
7.7 Literature Review
7.8 Applications of Electronic Health Records
7.8.1 Clinical Research
7.8.1.1 Introduction
7.8.1.2 Data Significance and Evaluation
7.8.1.3 Conclusion
7.8.2 Diagnosis and Monitoring
7.8.2.1 Introduction
7.8.2.2 Contributions
7.8.2.3 Applications
7.8.3 Track Medical Progression
7.8.3.1 Introduction
7.8.3.2 Method Used
7.8.3.3 Conclusion
7.8.4 Wearable Devices
7.8.4.1 Introduction
7.8.4.2 Proposed Method
7.8.4.3 Conclusion
7.9 Results and Discussion
7.10 Challenges Ahead
7.11 Conclusion
References
8 Architecture of IoMT in Healthcare
A. Josephin Arockia Dhiyya
8.1 Introduction
8.1.1 On-Body Segment
8.1.2 In-Home Segment
8.1.3 Network Segment Layer
8.1.4 In-Clinic Segment
8.1.5 In-Hospital Segment
8.1.6 Future of IoMT?
8.2 Preferences of the Internet of Things
8.2.1 Cost Decrease
8.2.2 Proficiency and Efficiency
8.2.3 Business Openings
8.2.4 Client Experience
8.2.5 Portability and Nimbleness
8.3 IoMT Progress in COVID-19 Situations: Presentation
8.3.1 The IoMT Environment
8.3.2 IoMT Pandemic Alleviation Design
8.3.3 Man-Made Consciousness and Large Information
Innovation in IoMT
8.4 Major Applications of IoMT
References
9 Performance Assessment of IoMT Services and Protocols
A. Keerthana and Karthiga
9.1 Introduction
9.2 IoMT Architecture and Platform
9.2.1 Architecture
9.2.2 Devices Integration Layer
9.3 Types of Protocols
9.3.1 Internet Protocol for Medical IoT Smart Devices
9.3.1.1 HTTP
9.3.1.2 Message Queue Telemetry Transport (MQTT)
9.3.1.3 Constrained Application Protocol (CoAP)
9.3.1.4 Advanced Message Queuing Protocol (AMQP)
9.3.1.5 Extensible Messaging and Presence Protocol (XMPP)
9.3.1.6 DDS
9.4 Testing Process in IoMT
9.5 Issues and Challenges
9.6 Conclusion
References
10 Performance Evaluation of Wearable IoT-Enabled Mesh Network for Rural Health Monitoring
G. Merlin Sheeba and Y. Bevish Jinila
10.1 Introduction
10.2 Proposed System Framework
10.2.1 System Description
10.2.2 Health Monitoring Center
10.2.2.1 Body Sensor
10.2.2.2 Wireless Sensor Coordinator/Transceiver
10.2.2.3 Ontology Information Center
10.2.2.4 Mesh Backbone-Placement and Routing
10.3 Experimental Evaluation
10.4 Performance Evaluation
10.4.1 Energy Consumption
10.4.2 Survival Rate
10.4.3 End-to-End Delay
10.5 Conclusion
References
11 Management of Diabetes Mellitus (DM) for Children and Adults Based on Internet of Things (IoT)
Krishnakumar S., Umashankar G., Lumen Christy V., Vikas and Hemalatha R.J.
11.1 Introduction
11.1.1 Prevalence
11.1.2 Management of Diabetes
11.1.3 Blood Glucose Monitoring
11.1.4 Continuous Glucose Monitors
11.1.5 Minimally Invasive Glucose Monitors
11.1.6 Non-Invasive Glucose Monitors
11.1.7 Existing System
11.2 Materials and Methods
11.2.1 Artificial Neural Network
11.2.2 Data Acquisition
11.2.3 Histogram Calculation
11.2.4 IoT Cloud Computing
11.2.5 Proposed System
11.2.6 Advantages
11.2.7 Disadvantages
11.2.8 Applications
11.2.9 Arduino Pro Mini
11.2.10 LM78XX
11.2.11 MAX30100
11.2.12 LM35 Temperature Sensors
11.3 Results and Discussion
11.4 Summary
11.5 Conclusion
References
12 Wearable Health Monitoring Systems Using IoMT
Jaya Rubi and A. Josephin Arockia Dhivya
12.1 Introduction
12.2 IoMT in Developing Wearable Health Surveillance System
12.2.1 A Wearable Health Monitoring System with Multi-Parameters
12.2.2 Wearable Input Device for Smart Glasses Based on a Wristband-Type Motion-Aware Touch Panel
12.2.3 Smart Belt: A Wearable Device for Managing Abdominal Obesity
12.2.4 Smart Bracelets: Automating the Personal Safety Using Wearable Smart Jewelry
12.3 Vital Parameters That Can Be Monitored Using Wearable Devices
12.3.1 Electrocardiogram
12.3.2 Heart Rate
12.3.3 Blood Pressure
12.3.4 Respiration Rate
12.3.5 Blood Oxygen Saturation
12.3.6 Blood Glucose
12.3.7 Skin Perspiration
12.3.8 Capnography
12.3.9 Body Temperature
12.4 Challenges Faced in Customizing Wearable Devices
12.4.1 Data Privacy
12.4.2 Data Exchange
12.4.3 Availability of Resources
12.4.4 Storage Capacity
12.4.5 Modeling the Relationship Between Acquired Measurement and Diseases
12.4.6 Real-Time Processing
12.4.7 Intelligence in Medical Care
12.5 Conclusion
References
13 Future of Healthcare: Biomedical Big Data Analysis and IoMT
Tamiziniyan G. and Keerthana A.
13.1 Introduction
13.2 Big Data and IoT in Healthcare Industry
13.3 Biomedical Big Data Types
13.3.1 Electronic Health Records
13.3.2 Administrative and Claims Data
13.3.3 International Patient Disease Registries
13.3.4 National Health Surveys
13.3.5 Clinical Research and Trials Data
13.4 Biomedical Data Acquisition Using IoT
13.4.1 Wearable Sensor Suit
13.4.2 Smartphones
13.4.3 Smart Watches
13.5 Biomedical Data Management Using IoT
13.5.1 Apache Spark Framework
13.5.2 MapReduce
13.5.3 Apache Hadoop
13.5.4 Clustering Algorithms
13.5.5 K-Means Clustering
13.5.6 Fuzzy C-Means Clustering
13.5.7 DBSCAN
13.6 Impact of Big Data and IoMT in Healthcare
13.7 Discussions and Conclusions
References
14 Medical Data Security Using Blockchain With Soft Computing Techniques: A Review
Saurabh Sharma, Harish K. Shakya and Ashish Mishra
14.1 Introduction
14.2 Blockchain
14.2.1 Blockchain Architecture
14.2.2 Types of Blockchain Architecture
14.2.3 Blockchain Applications
14.2.4 General Applications of the Blockchain
14.3 Blockchain as a Decentralized Security Framework
14.3.1 Characteristics of Blockchain
14.3.2 Limitations of Blockchain Technology
14.4 Existing Healthcare Data Predictive Analytics Using Soft Computing Techniques in Data Science
14.4.1 Data Science in Healthcare
14.5 Literature Review: Medical Data Security in Cloud Storage
14.6 Conclusion
References
15 Electronic Health Records: A Transitional View
Srividhya G.
15.1 Introduction
15.2 Ancient Medical Record, 1600 BC
15.3 Greek Medical Record
15.4 Islamic Medical Record
15.5 European Civilization
15.6 Swedish Health Record System
15.7 French and German Contributions
15.8 American Descriptions
15.9 Beginning of Electronic Health Recording
15.10 Conclusion
References
Index
Preface
It is a pleasure for us to put forth this book, The Internet of Medical Things (IoMT): Healthcare Transformation. Digital technologies have entered various sectors of our daily lives and have been successful in influencing and shaping our day-to-day activities. The Internet of Medical Things is one such discipline, attracting great interest because it combines various medical devices and allows them to communicate with one another over a network, forming a web of advanced smart devices. This book introduces IoMT in the healthcare sector, covering the latest technological implementations at both the diagnostic and therapeutic levels. The security and privacy of health records are a major concern, and several solutions to this problem are discussed in this book. IoMT provides significant advantages for people's wellbeing by increasing quality of life and reducing medical expenses, and it plays a major role in maintaining a smart healthcare system, since securing the privacy of health records helps make the healthcare sector more reliable. Artificial intelligence is another enabling technology that helps IoMT build smart defensive mechanisms for a variety of applications, such as assisting doctors in almost every area of their proficiency, including clinical decision-making. Through machine learning and deep learning techniques, a system can learn normal and abnormal decisions from the data generated by health workers and professionals and from patient feedback. This book demonstrates how the connectivity between medical devices and sensors is streamlining clinical workflow management and leading to an overall improvement in patient care, both inside care facility walls and in remote locations. The book is a collection of state-of-the-art approaches for applications of IoMT in various healthcare sectors, and it will be very beneficial for new researchers and practitioners working in the field who want to quickly learn the best methods for IoMT.
• Chapter 1 concentrates on the study of three-dimensional (3D) models of lung cancer cell line proteins (epidermal growth factor receptor (EGFR), K-Ras oncogene protein, and tumor suppressor TP53). The models were generated and their binding affinities with curcumin, ellagic acid, and quercetin were assessed through local docking.
• Chapter 2 focuses on cloud computing and electronic health record (EHR) services, where patients' sensitive information must be encrypted before being outsourced in order to protect its confidentiality. The chapter addresses the effective use of cloud data, such as keyword search and data sharing, and the challenging problems associated with soft computing concepts.
• Chapter 3 elucidates cloud computing concepts, security concerns in clouds and data centers, live migration and its importance for cloud computing, and the role of virtual machine (VM) migration in cloud computing. It provides a holistic view of the pre-copy migration technique, thereby exploring ways of reducing downtime and migration time. The chapter compares different pre-copy algorithms and evaluates their parameters to provide a better solution.
• Chapter 4 concentrates on deep learning, which has gained increasing interest in fields such as image classification, self-driving cars, natural language processing, and healthcare applications. The chapter focuses on solving complex problems more effectively and efficiently, elaborating for the reader how deep learning techniques are useful for predicting and classifying brain tumor cells. Datasets are trained using pre-trained neural networks such as AlexNet, GoogLeNet, and ResNet-101, and the performance of these networks is analyzed in detail; ResNet-101 achieved the highest accuracy.
• Chapter 5 illustrates an intelligent healthcare monitoring system for coma patients that examines a patient's vital signs continuously, detects patient movement, and forwards the information to the doctor and a central station through IoMT. Consistent tracking and observation of these health indicators improves medical assurance and allows coma events to be tracked.
• Chapter 6 details the deep learning process, which resembles human processing and pattern definition in decision-making. Deep learning algorithms are mainly designed and developed using neural networks operating on unsupervised, unstructured data. Biomedical data possess time- and frequency-domain features for analysis and classification; thus, deep learning algorithms are used for the interpretation and classification of biomedical big data.
• Chapter 7 discusses how electronic health records automate and streamline the clinician's workflow and make the process easy. An electronic health record can assemble the complete history of the patient and assist in further treatment, helping the patient recover more effectively. Electronic health records are designed for convenience according to the sector in which they are implemented. Their main aims are to make records available to the people concerned wherever they are, to reduce the workload of maintaining clinical book records, and to make the details usable for research purposes with the acknowledgement of the persons concerned.
• Chapter 8 elaborates the technical architecture of IoMT in relation to biomedical applications. These ideas are widely used to educate people about medical applications of IoMT. It also gives a detailed study of the future scope of IoMT in healthcare.
• Chapter 9 provides knowledge of different performance assessment techniques and of the types of protocols best suited to data transfer and increased safety. The chapter identifies protocols that help save energy and benefit the customer, and it will help researchers select the best IoT protocol for healthcare applications. Testing tools and frameworks provide the knowledge needed to assess the protocols.
• Chapter 10 addresses the role of a Health Monitoring Center (HMC) in rural areas. The HMC continuously monitors and records the physiological parameters of patients in care using wearable biosensors; elderly patients suffering from chronic diseases are monitored periodically or continuously under the care of a physician. To enhance the performance of the system, a smart and intelligent mesh backbone is integrated for fast transmission of critical medical data to a remote health IoT cloud server.
• Chapter 11 concentrates on diabetes mellitus (DM), one of the most widely recognized perilous illnesses for all age groups in the world. Patients need to make the best individualized choices about the day-by-day management of their diabetes. A non-invasive glucose sensor is used to measure the patient's glucose value at the fingertip, and other sensors are also connected to the patient to gather relevant data. A fully functional IoT-based eHealth platform incorporating humanoid-robot assistance for diabetes was designed successfully. The platform fosters a constantly coupled network between patients and their caretakers across physical separation, improving patients' engagement with their caretakers while limiting the cost, time, and effort of conventional periodic clinic visits.
• Chapter 12 explores the concepts of wearable health monitoring systems using IoMT technology. It also provides a brief review of the challenges and applications of the customized wearable healthcare systems that are trending these days. The basic idea is to study in detail the recent developments in IoMT technologies, their drawbacks, and the future advancements related to them. Recent innovations, implications, and key issues are discussed in the context of the framework.
• Chapter 13 provides knowledge of biomedical big data analysis, which has a huge impact on personalized medicine. Challenges in big data analysis, such as data acquisition, data accuracy, and data security, are discussed. The huge volume of data in healthcare can be managed by integrating biomedical data management, and this chapter gives brief information on the different software used to manage data in the healthcare domain. The impact of big data and IoMT in healthcare will enhance data analytics research.
• Chapter 14 concentrates on blockchain, a highly secure and decentralized networking platform of multiple computers called nodes. Predictive analysis, soft computing (SC), optimization, and data science are becoming increasingly important. In this chapter, the authors investigate privacy issues around large medical datasets held in the remote cloud. Their proposed framework ensures data privacy, integrity, and access control over the shared data with better efficiency. It reduces the turnaround time for data sharing, improves the decision-making process, and reduces the overall cost while providing better security for electronic medical records.
• Chapter 15 discusses the evolution of the electronic health record, starting with the history of the health record system in the Egyptian era, when the first health record was written, all the way to the modern computerized health record system. The chapter also covers the various documentation procedures for health records followed from ancient times by other civilizations around the world.
We thank the chapter authors most profusely for their contributions, written during the pandemic.
R. J. Hemalatha
D. Akila
D. Balaganesh
Anand Paul
January 2022
1
In Silico Molecular Modeling and Docking Analysis in Lung Cancer Cell Proteins
Manisha Sritharan¹ and Asita Elengoe²*
¹Department of Science and Biotechnology, Faculty of Engineering and Life Sciences, University of Selangor, Bestari Jaya, Selangor, Malaysia
²Department of Biotechnology, Faculty of Science, Lincoln University College, Petaling Jaya, Selangor, Malaysia
Abstract
In this study, three-dimensional (3D) models of lung cancer cell line proteins [epidermal growth factor receptor (EGFR), K-ras oncogene protein, and tumor suppressor TP53] were generated and their binding affinities with curcumin, ellagic acid, and quercetin were assessed through local docking. First, the Swiss-Model server was used to build the lung cancer cell line protein models, which were then visualized with PyMOL. Next, the ExPASy ProtParam server was used to evaluate the physical and chemical parameters of the protein structures. The protein models were then validated using the PROCHECK, ProQ, ERRAT, and Verify3D programs. Lastly, the protein models were docked with curcumin, ellagic acid, and quercetin using the BSP-SLIM server. All three protein models were adequate and of good quality. Curcumin showed binding energies with EGFR, K-ras oncogene protein, and TP53 of 5.320, 2.730, and 1.633 kcal/mol, respectively; ellagic acid showed −2.892, 0.921, and 0.054 kcal/mol; and quercetin showed −1.249, −1.154, and −0.809 kcal/mol. EGFR formed the strongest bond with ellagic acid, while K-ras oncogene protein and TP53 interacted most strongly with quercetin. To identify their appropriate function, all of these potential drug candidates can be assessed further through laboratory experiments.
Keywords: EGFR, K-ras, TP53, curcumin, ellagic acid, quercetin, docking
*Corresponding author: asitaelengoe@yahoo.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (1–22) © 2022 Scrivener Publishing LLC
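The ranking of ligands reported above follows directly from the listed docking energies: for each protein, the lowest (most negative) score is read as the strongest interaction, as in the text. A short Python sketch, with the scores transcribed from this abstract, reproduces that reading:

```python
# Docking energies (kcal/mol) transcribed from the abstract.
# Following the chapter's reading, lower = stronger binding.
scores = {
    "EGFR":  {"curcumin": 5.320, "ellagic acid": -2.892, "quercetin": -1.249},
    "K-ras": {"curcumin": 2.730, "ellagic acid":  0.921, "quercetin": -1.154},
    "TP53":  {"curcumin": 1.633, "ellagic acid":  0.054, "quercetin": -0.809},
}

def strongest_ligand(protein_scores):
    """Return the ligand with the lowest (most favourable) docking energy."""
    return min(protein_scores, key=protein_scores.get)

best = {protein: strongest_ligand(ligands) for protein, ligands in scores.items()}
print(best)
# {'EGFR': 'ellagic acid', 'K-ras': 'quercetin', 'TP53': 'quercetin'}
```

The output matches the conclusion stated in the abstract: ellagic acid for EGFR, and quercetin for both K-ras oncogene protein and TP53.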
1.1 Introduction
Lung cancer is the leading cause of cancer death among all cancers, in both men and women, worldwide. According to a World Health Organization (WHO) survey, lung cancer caused 19.1 deaths per 100,000 people in Malaysia, or 4,088 deaths per year (3.22% of all deaths) [1]. Moreover, 1.69 million deaths worldwide were recorded in 2015 due to lung cancer, and research in the UK estimated that there will be 23.6 million new cases of cancer worldwide each year by 2030 [1]. The main cause of lung cancer deaths is smoking; the second is exposure to secondhand smoke. Smoking is thus clearly the leading risk factor for lung cancer. However, not everyone who gets lung cancer is a smoker: many people with lung cancer are former smokers, while many others never smoked at all. Radiation exposure, an unhealthy lifestyle, secondhand smoke, air pollution, genetic markers, prolonged inhalation of asbestos and chemicals, and other factors can cause lung cancer in non-smokers [2].
Furthermore, most lung cancer signs do not appear until the cancer has spread, although some people with early lung cancer do have symptoms. Generally, the symptoms of lung cancer are a cough that does not go away and instead gets worse, shortness of breath, chest pain, feeling tired or weak, new onset of wheezing, and, in some cases, certain syndromes [3]. A number of tests can be conducted to look for cancerous cells, such as an X-ray image of the lung, which can disclose an abnormal mass or nodule; a CT scan, which can exhibit small lesions in the lungs that may not be detected on X-ray; blood investigations; sputum cytology; and tissue biopsy [4]. Lung cancer treatments include adjuvant therapy, which may involve radiation, chemotherapy, targeted therapy, or immunotherapy.
Small-cell lung carcinoma (SCLC) and non-small-cell lung carcinoma (NSCLC) are the two main clinicopathological classes of lung cancer; because they originate from the bronchi within the lungs, they are also known as bronchogenic carcinomas [4]. Lung cancer is believed to arise after a series of continuous pathological changes (preneoplastic lesions), which are very often discovered accompanying lung cancers as well as in the respiratory mucosa of smokers. Apart from that, the genes involved in lung cancer include epidermal growth factor receptor (EGFR), KRAS, MET, LKB1, BRAF, ALK, RET, and the tumor suppressor gene TP53 [5].
The three most common genes in lung cancer are EGFR, KRAS, and TP53, and the structures of these genes are explored in this study. EGFR is a transmembrane protein with cytoplasmic kinase activity that transduces important growth factor signaling from the extracellular milieu to the cell. According to da Cunha Santos, more than 60% of NSCLCs express EGFR, which has therefore become an essential target for the treatment of these tumors [6]. In addition, the KRAS mutation is the most widely recognized oncogenic driver mutation in patients with NSCLC and confers a poor prognosis in the metastatic setting, making it an important target for drug development. Patients with KRAS mutations are difficult to treat, since no targeted therapy is available yet [7]. Among all mutations, the most common found in lung cancer are TP53 mutations, whose frequency increases with tobacco consumption [8]. In this study, three compounds (curcumin, ellagic acid, and quercetin) were used for docking with the three mutant proteins. Curcumin has an excellent safety profile and targets different diseases with solid molecular-level evidence; improvements in its formulation can therefore aid in developing a therapeutic drug [9]. Next, ellagic acid has the ability to bind cancer cells and render them inactive, and research has shown it to be effective against cancer in rats and mice [10]. Quercetin is a plant pigment (flavonoid) with anti-oxidant and anti-inflammatory effects, and it has been shown to inhibit the multiplication of cancer cells according to Pao-Chen Kuo et al. [11].
Bioinformatics is a multidisciplinary field that creates methods
and software tools for storing, extracting, organizing, and interpreting
biological data. To analyze biological data, a combination of bioinformatics
with computer science, statistics, physics, chemistry, mathematics, and
engineering is useful. This field is currently growing rapidly because
it is inexpensive and considerably faster than experimental approaches.
Computational biology tools such as protein modeling (e.g., Swiss Model,
Easy Modeller, and Modeller), molecular dynamics simulation (e.g., Gromacs
and Amber), and docking (e.g., Autodock version 4.2, AutodockVina,
Swissdock, and Haddock) help design substrate-based drugs by studying
the interaction between the target proteins (cancer cell proteins) and
ligands (phytocomponents).
The aim of this research is to generate three-dimensional (3D)
models of lung cancer cell line proteins (EGFR, K-ras oncogene protein, and
TP53) and to estimate their binding affinities with curcumin, ellagic acid,
and quercetin via a docking approach.
4
The Internet of Medical Things (IoMT)
1.2 Methodology
1.2.1 Sequence of Protein
The entire amino acid sequences of EGFR (GI: 110002567), K-ras oncogene
protein (GI: 186764), and TP53 (GI: 1233272225) were obtained from the
National Center for Biotechnology Information (NCBI). EGFR consists of
464 amino acids, K-ras oncogene protein contains 188 amino acids, and
TP53 consists of 346 amino acids.
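The records retrieved from NCBI arrive as FASTA text; as an illustrative sketch (the parser and the abbreviated sequences below are hypothetical, not part of the original workflow), residue counts like those quoted above can be checked with a few lines of Python:

```python
def fasta_lengths(fasta_text):
    """Parse FASTA-formatted text and return {record_id: sequence_length}."""
    lengths = {}
    current_id = None
    for line in fasta_text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            # Header line: take the first token after '>' as the record id
            current_id = line[1:].split()[0]
            lengths[current_id] = 0
        elif current_id is not None:
            lengths[current_id] += len(line)
    return lengths

# Toy example with shortened sequences (real records come from NCBI)
fasta = """>EGFR_fragment
MRPSGTAGAALLALLAALCPASRA
>KRAS_fragment
MTEYKLVVVGAGGVGKSALTIQLI
"""
print(fasta_lengths(fasta))  # {'EGFR_fragment': 24, 'KRAS_fragment': 24}
```

For the full-length records, the returned lengths would be compared against the expected 464, 188, and 346 residues.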
1.2.2 Homology Modeling
As of now, the 3D models of EGFR, K-ras oncogene protein, and TP53
are not available in the Protein Data Bank (PDB). The models were
therefore generated with Swiss Model [12] and then visualized with PyMOL [13].
1.2.3 Physiochemical Characterization
The physical and chemical characters of the protein structures were analyzed
using the ExPASy ProtParam proteomics tool [14]. Besides that,
hydrophobic and hydrophilic residues were predicted by ColorSeq analysis [15].
Furthermore, the ESBRI program [16] was used to reveal salt bridges in the
protein structures, while the Cys_Rec program was used to count the number of disulfide bonds [17].
1.2.4 Determination of Secondary Models
The secondary structural properties were determined using the
Self-Optimized Prediction Method from Alignment (SOPMA) [18].
1.2.5 Determination of Stability of Protein Structures
PROCHECK was used to check the protein models [19]. The ProQ [20], ERRAT
[21], and Verify3D [22] programs were used for further validation.
1.2.6 Identification of Active Site
The 3D models of EGFR, K-ras oncogene protein, and TP53 were submitted
to an active site-prediction server [23] in order to discover their binding sites.
1.2.7 Preparation of Ligand Model
The tertiary structures of quercetin, curcumin, and ellagic acid are not
openly accessible elsewhere; they were therefore obtained from PubChem,
National Center for Biotechnology Information (2017) [24].
1.2.8 Docking of Target Protein and Phytocompound
The 3D structure of EGFR was docked with quercetin, curcumin, and
ellagic acid using the BSP-Slim server [25]. The best docking
complex model was chosen based on the lowest binding score. The same
docking procedure was carried out between the other two protein models
and the phytocompounds (quercetin, curcumin, and ellagic acid).
1.3 Results and Discussion
1.3.1 Determination of Physiochemical Characters
The isoelectric point (pI) values indicate that EGFR and K-ras oncogene
protein (pI > 7) have a basic character, while TP53 (pI < 7) is acidic. Besides
that, the molecular weights of EGFR, K-ras oncogene protein, and TP53
are 50,343.70, 21,470.62, and 38,532.60 Daltons, respectively. The extent of
light absorbed by a protein at a specific wavelength was used to calculate
the extinction coefficient from the TYR, TRP, and CYS residues: 38,305 M/cm
for EGFR, 12,170 M/cm for K-ras oncogene protein, and 43,025 M/cm for
TP53. In addition, −R denotes the negatively charged (ASP + GLU) and +R
the positively charged (ARG + LYS) residues in the amino acid sequence;
the totals of −R and +R for each protein model are given in Table 1.1.
According to the instability index from ExPASy ProtParam, EGFR is
classified as stable because its instability index is less than 40, while K-ras
oncogene protein and TP53 are categorized as unstable because their
instability indices are greater than 40. The instability index for EGFR is
35.56, for K-ras oncogene protein 43.95, and for TP53 80.17. On top of that,
the very low grand average of hydropathicity (GRAVY) index (a negative
GRAVY value) of EGFR, K-ras oncogene protein, and TP53 denotes their
hydrophilic nature (Table 1.1). Apart from that, EGFR, K-ras, and TP53
have more polar residues (41.52%, 53.33%, and 45.29%) than non-polar
residues (26.74%, 30.0%, and 27.35%), as determined using Color
Protein Sequence.
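Several of the quantities in Table 1.1 can be reproduced directly from a raw sequence. The sketch below is a simplified stand-in for ExPASy ProtParam (not the tool itself): GRAVY uses the standard Kyte-Doolittle hydropathy scale, the −R/+R counts follow the definition given above, and the extinction coefficient uses the usual 280-nm contributions per Trp, Tyr, and cystine.

```python
# Kyte-Doolittle hydropathy values for the 20 standard amino acids
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
      "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
      "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
      "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

def gravy(seq):
    """Grand average of hydropathicity: mean Kyte-Doolittle value per residue."""
    return sum(KD[aa] for aa in seq) / len(seq)

def charged_counts(seq):
    """Return (-R, +R): negatively (Asp+Glu) and positively (Arg+Lys) charged residues."""
    neg = seq.count("D") + seq.count("E")
    pos = seq.count("R") + seq.count("K")
    return neg, pos

def extinction_coefficient(seq, cystines=0):
    """Molar extinction coefficient at 280 nm (M^-1 cm^-1), ProtParam-style:
    5500 per Trp, 1490 per Tyr, 125 per cystine (disulfide-bonded Cys pair)."""
    return 5500 * seq.count("W") + 1490 * seq.count("Y") + 125 * cystines

seq = "MTEYKLVVVGAGGVGKSALTIQLIQNHFVDEYDPTIEDSY"  # first 40 residues of K-ras
print(round(gravy(seq), 3), charged_counts(seq), extinction_coefficient(seq))
```

A negative GRAVY value, as obtained for all three models here, indicates a hydrophilic protein.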
Table 1.1 Physiochemical characters of EGFR, K-ras, and TP53 proteins as determined by the ExPASy ProtParam program.

Protein   Length   Molecular weight (Da)   pI     −R   +R   Extinction coefficient (M/cm)   Instability index   Aliphatic index   GRAVY
EGFR      460      50,343.70               7.10   49   49   38,305                          35.56               72.91             −0.269
KRAS      180      21,470.62               8.18   29   31   12,170                          43.95               77.18             −0.559
TP53      340      38,532.60               5.64   41   33   43,025                          80.17               63.99             −0.592
Furthermore, the structure and function of a protein can be affected
by salt bridges, and salt bridge disruption reduces the stability of a
protein [26]. Salt bridges are also associated with regulation, molecular
recognition, oligomerization, flexibility, domain motions, and thermostability [27, 28].
A greater number of arginine residues in a protein model enhances its
stability through the electrostatic interactions between their guanidinium
groups [29]. Hence, it was confirmed that all the protein models are in
similarly stable condition. The output of the Cys_Rec server shows that the
number of disulfide bonds in EGFR is 42, in K-ras oncogene protein 5,
and in TP53 11 (Table 1.2).
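ESBRI-style salt-bridge detection reduces to a distance test: a side-chain oxygen of Asp/Glu lying within roughly 4 Å of a side-chain nitrogen of Arg/Lys is counted as a salt bridge. The sketch below shows that test on made-up coordinates (the residue labels are borrowed from Table 1.6 for flavor, but the coordinates are invented for illustration):

```python
import math

CUTOFF = 4.0  # Å, a commonly used salt-bridge distance threshold

def distance(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def salt_bridges(acidic, basic, cutoff=CUTOFF):
    """Pair acidic side-chain O atoms with basic side-chain N atoms within cutoff.

    acidic/basic: dicts mapping residue:atom labels to (x, y, z) coordinates."""
    return [(a, b) for a, pa in acidic.items()
                   for b, pb in basic.items()
                   if distance(pa, pb) <= cutoff]

# Hypothetical coordinates for illustration only
acidic = {"ASP33:OD1": (1.0, 2.0, 3.0), "GLU62:OE2": (9.0, 9.0, 9.0)}
basic = {"ARG68:NH1": (2.5, 3.0, 3.5), "LYS88:NZ": (20.0, 20.0, 20.0)}
print(salt_bridges(acidic, basic))  # [('ASP33:OD1', 'ARG68:NH1')]
```

A real run would read the atom coordinates from the modeled PDB files rather than from literals.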
1.3.2 Prediction of Secondary Structures
Results from SOPMA analysis shows that random coils dominant among
secondary structure components in the protein models (Figure 1.1). The
constitution of alpha helix in EGFR, K-ras oncogene protein, and TP53
were shown in Table 1.3.
The outcome from this analysis specified that EGFR, K-ras oncogene
protein, and TP53 constitutes of 15, 11, and 10α helices, respectively.
Besides that, Table 1.4 represents the details of the longest and shortest
alpha helix of all the protein models.
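SOPMA classifies each residue into one of four states (H, alpha helix; E, extended strand; T, beta turn; C, random coil), so the percentages in Table 1.3 are simple tallies over the per-residue prediction string. A sketch with a toy prediction string:

```python
from collections import Counter

def ss_composition(states):
    """Percentage of each secondary-structure state (H/E/T/C) in a prediction string."""
    counts = Counter(states.upper())
    total = len(states)
    return {s: round(100.0 * counts.get(s, 0) / total, 2)
            for s in ("H", "E", "T", "C")}

# Toy prediction string; a real SOPMA run outputs one state per residue
pred = "CCHHHHHHCCEEEECTTCC"
print(ss_composition(pred))  # {'H': 31.58, 'E': 21.05, 'T': 10.53, 'C': 36.84}
```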
1.3.3 Verification of Stability of Protein Structures
The PROCHECK server was used to verify the stereochemical quality and
the geometry of the protein models through Ramachandran plots (Figure 1.2).
It revealed that all the protein structures have more than 80% of their
residues in the most favored region (Table 1.5); the quality of these models
was therefore assessed as high and reliable. On top of that, the PROCHECK
analysis disclosed that a few residues, TYR265 and GLU51 for EGFR and
LYS180 for K-ras oncogene protein, were located away from the energetically
favored regions of the Ramachandran plot, while no residues were found in
the disallowed region for the TP53 protein model.
Thereby, the stereochemical interpretation of the backbone phi/psi dihedral
angles deduced that EGFR, K-ras oncogene protein, and TP53 have a low
percentage of residues in unfavorable regions. Moreover, ProQ was utilized
to validate model quality using the Levitt-Gerstein (LG) and MaxSub scores.
According to the output, all the protein models were within the LG and
MaxSub score ranges expected for a good model (Table 1.5).
Table 1.2 The number of disulfide bonds quantitated by the Cys_Rec prediction program.

Protein   Residue (score)
EGFR      Cys_9 (−13.0), Cys_13 (39.2), Cys_17 (100.1), Cys_25 (98.3), Cys_26 (104.2),
          Cys_30 (104.1), Cys_34 (90.5), Cys_42 (48.2), Cys_45 (56.0), Cys_54 (58.2),
          Cys_58 (49.5), Cys_85 (55.7), Cys_89 (54.9), Cys_101 (50.4), Cys_105 (44.0),
          Cys_120 (63.0), Cys_123 (73.3), Cys_127 (75.3), Cys_131 (61.6), Cys_156 (33.0),
          Cys_264 (45.3), Cys_293 (43.8), Cys_300 (56.5), Cys_304 (46.8), Cys_309 (66.0),
          Cys_317 (65.6), Cys_320 (60.2), Cys_329 (49.1), Cys_333 (42.2), Cys_349 (42.2),
          Cys_352 (32.9), Cys_356 (62.9), Cys_365 (70.2), Cys_373 (54.2), Cys_376 (54.8),
          Cys_385 (35.8), Cys_389 (41.2), Cys_411 (78.8), Cys_414 (85.1), Cys_418 (84.4),
          Cys_422 (26.5), Cys_430 (3.7)
KRAS      Cys_12 (−28.5), Cys_51 (−74.2), Cys_80 (−72.6), Cys_118 (−56.4), Cys_185 (−15.2)
TP53      Cys_124 (−19.4), Cys_135 (−1.6), Cys_141 (−17.9), Cys_176 (−9.1), Cys_182 (−45.1),
          Cys_229 (−54.4), Cys_238 (1.6), Cys_242 (−5.5), Cys_275 (−34.4), Cys_277 (−51.5),
          Cys_339 (−32.8)
ERRAT analysis assesses protein models in the same way as structures
determined by X-ray crystallography: its value relies on the statistics of
non-bonded atomic interactions in the 3D protein structure, and a protein
is generally accepted as high quality if the score is greater than 50%. The
ERRAT results show that K-ras oncogene protein had the highest score at
94.767 and therefore has the highest quality among the protein models,
while the score for EGFR is 88.010 and that for TP53 is 90.374 (Figure 1.3).
The Verify3D server was used to examine the residues in each protein:
98.59%, 100.00%, and 92.96% of the residues in EGFR, K-ras oncogene
protein, and TP53, respectively, had an averaged 3D-1D score of more
than 0.2. This indicates that each sequence is compatible with its protein
model (Figure 1.4). Certainly, the resulting energy-minimized EGFR, K-ras
oncogene protein, and TP53 protein models satisfied the standards for
protein evaluation; hence, the docking analysis with the ligands was
carried out.
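Verify3D averages each residue's raw 3D-1D score over a sliding window (typically 21 residues) and then reports the fraction of residues whose averaged score reaches 0.2, which is how percentages like the 98.59% above arise. A sketch of that tallying step on toy per-residue scores, with a smaller window for brevity:

```python
def verify3d_fraction(raw_scores, window=5, threshold=0.2):
    """Fraction of residues whose window-averaged 3D-1D score >= threshold.

    The window is clipped at the chain ends, as in profile-window averaging."""
    n = len(raw_scores)
    half = window // 2
    passing = 0
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        avg = sum(raw_scores[lo:hi]) / (hi - lo)
        if avg >= threshold:
            passing += 1
    return passing / n

# Toy per-residue raw scores (a real run produces one score per residue)
scores = [0.5, 0.4, 0.3, 0.1, 0.0, -0.2, 0.6, 0.7, 0.5, 0.4]
print(round(verify3d_fraction(scores), 2))  # 0.8
```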
Figure 1.1 SOPMA plots for (a) EGFR, (b) K-ras oncogene protein, and (c) TP53.
Table 1.3 Secondary structure composition (%) of the EGFR, K-ras oncogene protein, and TP53.

Secondary structure    EGFR    KRAS    TP53
Alpha helix (Hh)       16.81   43.62   18.79
Extended strand (Ee)   16.81   21.81   18.21
Beta turn (Tt)         3.23    7.45    3.18
Random coil (Cc)       64.44   27.13   59.83
Table 1.4 Composition of α-helices in EGFR, K-ras oncogene protein, and TP53.

Protein   Longest α-helix   Residues   Shortest α-helix    Residues
EGFR      α14               14         α3, α6, α11, α15    1
KRAS      α11               20         α1, α10             1
TP53      α5                11         α7                  1
Figure 1.2 Ramachandran plots for (a) EGFR, (b) K-ras oncogene protein, and (c) TP53.
Table 1.5 Validation of the EGFR, K-ras oncogene protein, and TP53.

Ramachandran plot statistics (%):
Structure   Most favored   Additionally allowed   Generously allowed   Disallowed
EGFR        90.6           8.6                    0.3                  0.6
KRAS        90.5           8.9                    0.0                  0.6
TP53        92.9           7.1                    0.0                  0.0

Goodness factor:
Structure   Dihedral angles   Covalent forces   Overall average
EGFR        −0.27             0.02              −0.14
KRAS        −0.12             0.04              −0.04
TP53        −0.16             0.07              −0.06

ProQ:
Structure   LG score   MaxSub
EGFR        3.814      0.302
KRAS        4.094      0.474
TP53        4.417      0.454
Figure 1.3 ERRAT plots for (a) EGFR, (b) K-ras oncogene protein, and (c) TP53.
1.3.4 Identification of Active Sites
For EGFR, K-ras oncogene protein, and TP53, the BSP-Slim server was
used to obtain the active site protein volume and the residues that form
an active site pocket (Table 1.6). The protein volumes for EGFR, K-ras, and
TP53 were 837 Å³, 718 Å³, and 647 Å³, respectively.
1.3.5 Target Protein-Ligand Docking
Figure 1.4 Verify3D plots for (a) EGFR, (b) K-ras oncogene protein, and (c) TP53.

Based on the Murugesan et al. study [30], plant compounds from the
methanolic leaf extract of Vitex negundo were successfully docked with the
cyclooxygenase-2 (COX-2) enzyme. The phytocompounds had a better
interaction compared with aspirin and ibuprofen, with good binding
energies and docking results.
Besides that, four components [1,3-Dioxolane, 2-(3-bromo-5,5,5-trichloro-2,2-dimethylpentyl)-;
Butanoic acid, 2-hydroxy-2-methyl-, methyl ester; DL-3,4-Dimethyl-3,4-hexanediol; and
Pantolactone] from Moringa concanensis had good binding affinity with brain cancer
receptors, with the lowest binding energies of −3.90, −2.75, −3.05, and −4.15 kcal/mol,
respectively [31].
According to the Deepa et al. study, plant compounds from the ethanolic
leaf extract of Vitex negundo [(4S)-2-Methyl-2-phenylpentane-1,4-diol,
7-Methoxy-2,3-dihydro-2-phenyl-4-quinolone, 3-(tert-Butoxycarbonyl)-6-(3-benzoylprop-2-yl)phenol,
(3R,4S)-4-(methylamino)-1-phenylpent-1-en-3-ol,
and (2S,1'S)-1-Benzyl-2-[1'-(dibenzylamino)ethyl]aziridine] were docked
Table 1.6 Predicted active sites of the EGFR, K-ras oncogene protein, and TP53.

Protein: EGFR (volume 837 Å³)
Residues forming the pocket: GLY100, ALA101, ASP102, SER103, TYR104, GLU105, MET106,
GLU107, GLU108, LYS113, LYS115, LYS116, CYS117, GLU118, GLY119, PRO120, CYS121,
ARG122, LYS123, VAL124, ASN149, THR151, SER152, SER154, THR185, LYS187, GLU188,
THR190, ASN210, GLU212, ILE213, ARG215, LYS242, CYS99

Protein: K-ras oncogene protein (volume 718 Å³)
Residues forming the pocket: GLY10, ALA11, CYS12, GLY13, VAL14, GLY15, LYS16, ASP33,
PRO34, THR35, GLU37, LEU56, ASP57, THR58, ALA59, GLY60, GLN61, GLU62, GLU63, SER65,
ARG68, MET72, ALA83, ASN86, LYS88, SER89, GLU91, ASP92, ILE93, HIE94, HIE95, TYR96,
ARG97, GLU98, GLN99, ILE100, ARG102, VAL103

Protein: TP53 (volume 647 Å³)
Residues forming the pocket: GLU107, ASN109, THR11, PRO128, TYR129, GLN13, GLU130,
PRO131, PRO132, GLU133, VAL134, GLY135, SER136, ASP137, CYS138, THR139, THR140,
ILE141, HIE142, TYR143, TYR16, GLY17, ASN177, SER178, PHE18, ARG19, LEU20, GLY21,
PHE22, LEU23, HIE24, TYR35, ASN40, MET42, THR49, CYS50, PRO51, GLN53, LEU54, TRP55,
VAL56, ASP57, THR59, PRO60, PRO61, THR64
with glucosamine-6-phosphate synthase. They had the lowest, most negative binding
energies (−36.53, −33.57, −35.90, −33.88, and −37.65 kcal/mol, respectively) [32].
According to Kasilingam and Elengoe study, apigenin successfully
docked with p53, caspase-3, and MADCAM1 using BSP-Slim server.
Apigenin was the plant compound while p53, caspase-3, and MADCAM1
were the target proteins in lung cancer cell line. Apigenin bound strongly
with p53, caspase-3, and MADCAM1 at the lowest binding energies (4.611,
5.750, and 5.307 kcal/mol, respectively) [33].
Based on the Ashwini et al. study, coumarin, camptothecin, epigallocatechin,
quercetin, and gallic acid were screened for potential binding with
caspase-3 (the target protein) in the human cervical cancer cell line (HeLa).
Coumarin had the strongest interaction with caspase-3 at the lowest
binding affinity (−378.3 kJ/mol). Therefore, it could be a potential anti-cancer
drug. In contrast, gallic acid had the weakest interaction with caspase-3,
with the least negative binding energy (−181.3 kJ/mol). The docking was
carried out using the Hex 8.0.0 docking software [34].
The Chakrabarty et al. study demonstrated that 1-hexanol and 1-octen-3-ol
suppressed the enzyme activity of Ach (PDB ID: 2CKM) and BACE1 (PDB
ID: 4IVT), the proteins responsible for Alzheimer's disease. 1-Hexanol and
1-octen-3-ol were plant compounds derived from the leaf extract of Lantana
camara (L.). Glide Standard Precision (SP) ligand docking was performed
to determine the binding energies. The results show that 1-hexanol and
1-octen-3-ol bound strongly with Ach at −2.291 and −2.465 kJ/mol, respectively,
whereas they bound BACE1 more weakly, at −0.948 and −1.267 kJ/mol,
respectively. 1-Octen-3-ol may have the potential to be an effective drug
against Alzheimer's disease, as it had the better interaction with both
enzymes (Ach and BACE1) compared with 1-hexanol [35].
Based on Supramaniam and Elengoe study, glycyrrhizin successfully
docked with p53, NF-kB-p105, and MADCAM1 using BSP-Slim server.
Glycyrrhizin was the plant compound while p53, NF-kB-p105, and
MADCAM1 were the target proteins in breast cancer cell line. Glycyrrhizin
bound strongly with p53, NF-kB-p105, and MADCAM1 at the lowest binding affinities (−4.040, −5.127, and −5.251 kcal/mol, respectively). Therefore,
glycyrrhizin could be a potential drug in breast cancer treatment [36].
According to Elengoe and Sebestian study, p53, adenomatous polyposis coli (APC), and EGFR were generated using homology modeling
approach. These proteins were the target proteins. They were docked successfully with plant compounds such as allicin, epigallocatechin-3-gallate,
and gingerol. Plant compounds were used as ligands in docking process.
p53 had the most stable interaction with allicin among the three target
proteins, docking with allicin at the lowest binding energy of 4.968.
However, the other target proteins had good docking scores too [37].
In this study, EGFR was successfully docked with quercetin, curcumin,
and ellagic acid using the BSP-Slim server. The same target protein-phytocompound
complex docking method was repeated with K-ras oncogene protein and TP53.
Furthermore, the most suitable docking complex was selected based on the
lowest binding energy (ΔGbind). The docking results showed that EGFR
bonded strongly with ellagic acid, which was the most favorable compound
with the lowest energy value (−2.892 kcal/mol) when compared to curcumin
and quercetin (Table 1.7). In addition, there was
Table 1.7 Docking results for the EGFR, K-ras oncogene protein, and TP53.

Protein                  Compound       Binding energy (kcal/mol)
EGFR                     Curcumin       5.320
                         Ellagic acid   −2.892
                         Quercetin      −1.249
K-ras oncogene protein   Curcumin       2.730
                         Ellagic acid   0.921
                         Quercetin      −1.154
TP53                     Curcumin       1.633
                         Ellagic acid   0.054
                         Quercetin      −0.809
a strong interaction between K-ras oncogene protein and quercetin, the
most favorable compound with the lowest energy (−1.154 kcal/mol) when
compared to curcumin and ellagic acid. Likewise, according to the docking
analysis, the strongest interaction for TP53 was with quercetin, which had
the lowest energy (−0.809 kcal/mol) of the three compounds.
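The selection rule used here, choosing the phytocompound with the lowest (most negative) binding energy per target, can be expressed directly over the values in Table 1.7:

```python
# Binding energies (kcal/mol) from Table 1.7
docking = {
    "EGFR": {"curcumin": 5.320, "ellagic acid": -2.892, "quercetin": -1.249},
    "K-ras": {"curcumin": 2.730, "ellagic acid": 0.921, "quercetin": -1.154},
    "TP53": {"curcumin": 1.633, "ellagic acid": 0.054, "quercetin": -0.809},
}

def best_ligand(energies):
    """Return the (ligand, energy) pair with the lowest binding energy."""
    return min(energies.items(), key=lambda kv: kv[1])

for protein, energies in docking.items():
    ligand, energy = best_ligand(energies)
    print(f"{protein}: {ligand} ({energy} kcal/mol)")
# EGFR: ellagic acid (-2.892 kcal/mol)
# K-ras: quercetin (-1.154 kcal/mol)
# TP53: quercetin (-0.809 kcal/mol)
```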
1.4 Conclusion
In a nutshell, EGFR was successfully docked with curcumin, ellagic acid,
and quercetin, and the same docking simulation approach was performed
for K-ras oncogene protein and TP53. Among the three protein models,
EGFR had a strong interaction with ellagic acid, which gave the lowest
energy value, while K-ras oncogene protein and TP53 had strong
interactions with quercetin, whose binding energy was the lowest. The
results of this study will therefore aid in designing a suitable
structure-based drug. However, wet-lab experiments must be carried out to
verify these results.
References
1. Cancer Research UK, Worldwide cancer statistics, 2012, https://www.cancer
researchuk.org/health-professional/cancer-statistics/worldwide-cancer#
collapseZero.
2. American Cancer Society, Lung cancer prevention and early detection, 2016,
https://www.cancer.org/cancer/lung-cancer/prevention-and-early-detection/​
signs-and-symptoms.html.
3. American Cancer Society, Causes, risk factors and prevention, 2016, https://
www.cancer.org/cancer/non-small-cell-lung-cancer/causes-risks-prevention/​
what causes.
4. Mayo Clinic, Lung cancer, 2018, https://www.mayoclinic.org/diseases-conditions/
lung-cancer/diagnosis-treatment/drc-20374627.
5. El-Telbany, A. and Patrick, C.M., Cancer genes in lung cancer. Genes Cancer,
3, 7–8, 467–480, 2012.
6. da Cunha Santos, G., Shepherd, F.A., Tsao, M.S., EGFR mutations and lung cancer.
Annu. Rev. Pathol.: Mechanisms of Disease, 6, 49–69, 2011.
7. Bhattacharya, S., Socinski, M.A., Burns, T.F., KRAS mutant lung cancer: progress
thus far on an elusive therapeutic target. Clin. Transl. Med., 4, 35, 2015.
8. Halverson, A.R., Silwal-Pandit, L., Meza-Zepeda, L.A. et al., TP53 mutation spectrum in smokers and never smoking lung cancer patients. Front.
Genet., 7, 85, 2016.
9. Basnet, P. and Skalko-Basnet, N., Curcumin: An anti-inflammatory molecule from a curry spice on the path to cancer treatment. Molecules, 6, 6,
4567–4598, 2011.
10. Healthline, Why ellagic acid is important?, 2020, https://www.healthline.
com/health/ellagic-acid.
11. Kuo, P.-C., Liu, H.-F., Chao, J.-I., Survivin and p53 modulate quercetin-induced
cell growth inhibition and apoptosis in human lung carcinoma cells.
J. Biol. Chem., 279, 53, 55875–55885, 2004.
12. Biasini, M., Bienert, S., Waterhouse, A., Arnold, K., Studer, G., Schmidt,
T. et al., SWISS-MODEL: Modelling protein tertiary and quaternary
structure using evolutionary information. Nucleic Acids Res., 42, W252–W258, 2014.
13. Delano, W.L., The PyMOL molecular graphics system, 2001, http://www.
pymol.org.
14. Gasteiger, E., Hoogland, C., Gattiker, A. et al., Protein identification and
analysis tools on the ExPASy server, in: The proteomics protocols handbook,
J.M. Walker (Ed.), Humana Press, Totowa, 2015.
15. Prabi, L.G., Color protein sequence analysis, 1998, https://npsa-prabi.ibcp.fr/
cgi-bin/npsa_automat.pl?page=/NPSA/npsa_color.html.
16. Costantini, S., Colonna, G., Facchiano, A.M., ESBRI: a web server for evaluating salt bridges in proteins. Bioinformation, 3, 137–138, 2008.
17. Roy, S., Maheshwari, N., Chauhan, R. et al., Structure prediction and functional characterization of secondary metabolite proteins of Ocimum.
Bioinformation, 6, 8, 315–319, 2011.
18. Geourjon, C. and Deleage, G., SOPMA: significant improvements in protein
secondary structure prediction by consensus prediction from multiple alignments. Comput. Appl. Biosci., 11, 681–684, 1995.
19. Laskowski, R.A., MacArthur, M.W., Moss, D.S. et al., PROCHECK: a program to check the stereo chemical quality of protein structures. J. Appl.
Cryst., 26, 283–291, 1993.
20. Wallner, B. and Elofsson, A., Can correct protein models be identified?
Protein Sci., 12, 1073–1086, 2003.
21. Colovos, C. and Yeates, T.O., Verification of protein structures: patterns of
non-bonded atomic interactions. Protein Sci., 2, 1511–1519, 1993.
22. Eisenberg, D., Luthy, R., Bowie, J.U., VERIFY3D: assessment of protein models with three-dimensional profiles. Methods Enzymol., 277, 396–404, 1997.
23. Jayaram, B., Active site prediction server, 2004, http://www.scfbio-iitd.res.in/
dock/ActiveSite.jsp.
24. National Center for Biotechnology Information, Pubchem., 2017, https://
pubchem.ncbi.nlm.nih.gov/.
25. Hui, S.L. and Yang, Z., BSP-SLIM: A blind low-resolution ligand-protein
docking approach using theoretically predicted protein structures. Proteins,
80, 93–110, 2012.
26. Kumar, S., Tsai, C.J., Ma, B. et al., Contribution of salt bridges toward protein
thermo-stability. J. Biomol. Struct. Dyn., 1, 79–86, 2000.
27. Kumar, S. and Nussinov, R., Salt bridge stability in monomeric proteins.
J. Mol. Biol., 293, 1241–1255, 1999.
28. Kumar, S. and Nussinov, R., Relationship between ion pair geometries and
electrostatic strengths in proteins. Biophys. J., 83, 1595–1612, 2002.
29. Parvizpour, S., Shamsir, M.S., Razmara, J. et al., Structural and functional
analysis of a novel psychrophilic b-mannanase from Glaciozyma Antarctica
PI12. J. Comput. Aided Mol. Des., 28, 6, 685–698, 2014.
30. Murugesan, D., Ponnusamy, R.D., Gopalan, D.K., Molecular docking study
of active phytocompounds from the methanolic leaf extract of Vitex negundo against cyclooxygenase-2. Bangladesh J. Pharmacol., 9, 2, 146–53, 2014.
31. Balamurugan, V. and Balakrishnan, V., Molecular docking studies of
Moringa concanensis Nimmo leaf phytocompounds for brain cancer. Res. Rev.:
J. Life Sci., 8, 1, 26–34, 2018.
32. Santhanakrishnan, D., Sipriya, N., Chandrasekaran, B., Studies on the phytochemistry, spectroscopic characterization and antibacterial efficacy of
Salicornia Brachiata. Int. J. Pharm. Pharm. Sci., 6, 6430–6432, 2014.
33. Kasilingam, T. and Elengoe, A., In silico molecular modeling and docking of
apigenin against the lung cancer cell proteins. Asian J. Pharm. Clin. Res., 11,
9, 246–252, 2018.
34. Ashwini, S., Varkey, S.P., Shantaram, M., In silico docking of polyphenolic
compounds against Caspase 3-HeLa cell line protein. Int. J. Drug Dev. Res., 9,
28–32, 2017.
35. Chakrabarty, N., Hossain, A., Barua, J., Kar, H., Akther, S., Al Mahabub, A.,
Majumder, M., In silico molecular docking of some isolated selected compounds of Lantana camara against Alzheimer's disease. Biomed. J. Sci. Tech.
Res., 12, 2, 9168–9171, 2018.
36. Supramaniam, G. and Elengoe, A., In silico molecular docking of glycyrrhizin
and breast cancer cell line proteins, in: Plant-Derived Bioactives, 1, 575–589,
2020.
37. Elengoe, A. and Sebestian, E., In silico molecular modelling and docking of
allicin, epigallocatechin-3-gallate and gingerol against colon cancer cell proteins. Asia Pac. J. Mol. Biol. Biotechnol., 4, 2851–67, 2020.
2
Medical Data Classification in Cloud
Computing Using Soft Computing
With Voting Classifier: A Review
Saurabh Sharma1*, Harish K. Shakya1† and Ashish Mishra2‡

1Dept. of CSE, Amity School of Engineering & Technology, Amity University (M.P.), Gwalior, India
2Department of CSE, Gyan Ganga Institute of Technology and Sciences, Jabalpur, India
Abstract
In the current context, telemedicine is a rising medical service in which
health professionals use telecommunication technology to treat, evaluate, and
diagnose patients. The data in the healthcare system comprise medical
data that are complex and large in volume (X-rays, fMRI data, scans of the
lungs, brain, etc.), and typical hardware and software cannot manage such
medical data collections. Therefore, a practical approach that balances privacy
protection and data exchange is required. Several approaches have been
established to address these questions, most of them focusing on only a small
part of the problem with a single notion. This review paper analyzes the data
protection research carried out in cloud computing systems and examines the
major difficulties that conventional solutions confront. This helps researchers
to better address existing issues in protecting the privacy of medical data in
cloud systems.
Keywords: Medical data, soft computing, fuzzy, cloud computing, data privacy,
SVM, FCM
*Corresponding author: saurabhgyangit@gmail.com
†Corresponding author: hkshakya@gwa.amity.edu
‡Corresponding author: ashish.mish2009@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (23–44) © 2022 Scrivener Publishing LLC
2.1 Introduction
There are many definitions of the Electronic Health Record (EHR), such as:
the electronic record that holds patient information in a health record system
operated by healthcare providers [1]. Although the EHR has a positive effect on
healthcare services, its adoption in many healthcare institutions globally,
particularly in poor nations, is delayed by numerous common problems.
Patient data security has been a concern since the beginning of medical history
and remains an important issue today. Rooted in the idea of confidentiality, the
Oath of Hippocrates established an honorable practice in clinical and medical
ethics. It is of the highest importance to protect the privacy and confidentiality
of patient information, and security is what makes that trust possible. Medical
record security generally involves privacy and confidentiality [2]. Cloud computing
provides the option of accessing massive amounts of patient information in a
short period of time, which makes it easier for an unauthorized person to obtain
patient records. One observation confirms this concern: "illegal access to
traditional medical records (paper-based) has always been conceivable, but
the introduction of computers turns a little problem into a large problem."
Cloud computing is a model for convenient, on-demand access to a shared
pool of configurable computing resources (e.g., networks, servers, storage,
applications, and services) that can be rapidly provisioned and released with
minimal management effort or service provider interaction [4]. Cloud computing
is the newest, most exciting, and most comprehensive solution in the world of IT.
Its major purpose is to use the Internet or an intranet to share resources among
users [5]. Cloud computing is an affordable, automatically scalable, multi-tenant,
and secure platform offered by cloud service providers (CSPs).
2.1.1 Security in Medical Big Data Analytics
Big data is complex and unwieldy by its very nature, and it requires suppliers
to take a close look at their techniques for collecting, storing, analyzing,
and presenting their data to personnel, business partners, and patients.
What are some of the most challenging tasks for enterprises when starting up a big data analytics program, and how can they overcome these
problems to reach their clinical and financial goals?
2.1.1.1 Capture
All data comes from someone, but regrettably, it is not always from someone with flawless data management habits for many healthcare providers.
Collecting clean, comprehensive, precise, and correctly structured data for
Medical Data Classification in Cloud Computing
25
numerous systems is a constant battle for businesses, many of which are not on the winning side of the conflict.
In a recent investigation at an ophthalmology clinic, EHR data matched patient-reported data only 23.5% of the time. When patients reported three or more eye problems, their EHR data were never in full agreement.
Poor EHR usability, convoluted workflows, and an incomplete understanding of why big data is important to capture properly can all contribute to quality problems that afflict data throughout its life cycle.
Providers can begin to improve their data capture routines by prioritizing the data types most valuable for their specific projects, by enlisting the data management and integrity expertise of professional health information managers, and by developing clinical documentation improvement programs that train clinicians to ensure data are useful for downstream analysis.
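As a concrete illustration of the capture problem, a provider might screen incoming records against a minimal set of completeness rules before they enter downstream analysis. The field names and rules below are hypothetical, not drawn from any real EHR schema or standard:

```python
# Hypothetical completeness rules for one captured record; the required
# field names are illustrative, not from a real EHR schema.
REQUIRED_FIELDS = {"patient_id", "date_of_birth", "diagnosis_code"}

def capture_issues(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one record."""
    issues = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    for field, value in record.items():
        if value in ("", None):
            issues.append(f"empty value: {field}")
    return issues
```

A record with an empty birth date or a missing diagnosis code would be flagged at capture time, before it pollutes an analytics store.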
2.1.1.2 Cleaning
Health providers are familiar with the necessity of cleanliness in the clinic and the operating room, but may be less aware of the importance of cleaning their data.
Dirty data can swiftly derail a big data analytics project, especially when multiple sources record clinical or operational elements in slightly different formats. Data cleaning, also known as scrubbing, ensures that datasets are accurate, correct, consistent, relevant, and free of corruption.
While most data cleaning is still done manually, some IT vendors offer automated scrubbing tools that compare, contrast, and rectify large data sets using logic rules. As machine learning techniques continue to progress, these tools should grow more sophisticated and accurate, lowering the time and cost necessary to guarantee high levels of accuracy and integrity in health data stores.
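A minimal sketch of such rule-based scrubbing, with hypothetical fields and rules chosen only for illustration, might look like this:

```python
import re

def clean_record(record: dict) -> dict:
    """Apply simple logic rules to scrub one record (illustrative only)."""
    out = dict(record)
    # Rule 1: normalize free-text sex codes to a single vocabulary.
    sex_map = {"m": "male", "male": "male", "f": "female", "female": "female"}
    if "sex" in out:
        out["sex"] = sex_map.get(str(out["sex"]).strip().lower(), "unknown")
    # Rule 2: keep only digits in phone numbers.
    if "phone" in out:
        out["phone"] = re.sub(r"\D", "", str(out["phone"]))
    # Rule 3: flag implausible ages (negative or > 120) as None for review.
    if "age" in out and not (0 <= out["age"] <= 120):
        out["age"] = None
    return out
```

Automated scrubbers apply many such rules consistently across sources, which is exactly where manual cleaning tends to break down.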
2.1.1.3 Storage
Clinicians at the front line rarely worry about where their data live, yet storage is a critical cost, security, and performance issue for the IT department. With the exponential growth in the volume of health data, many providers can no longer manage the costs and implications of local data centers.
While many firms are more comfortable keeping data on premises, which promises control over security, access, and uptime, an on-site server network can be costly, hard to operate, and prone to producing data silos across departments.
26
The Internet of Medical Things (IoMT)
Cloud storage is becoming more and more common as costs decrease and reliability increases. In a 2016 survey, nearly 90% of healthcare firms reported using some cloud-based IT infrastructure, including warehousing and applications.
The cloud promises smooth disaster recovery, reduced upfront costs, and simpler expansion, though enterprises must be exceedingly careful to select partners who understand the significance of HIPAA and other healthcare compliance and safety requirements.
Many firms take a hybrid approach to their data storage initiatives, which can offer providers with diverse access and storage requirements the most flexible and workable solution. When establishing a hybrid infrastructure, however, providers should take care that the separate systems can communicate and share data with other parts of the organization when appropriate.
2.1.1.4 Security
Data security is the number one issue for healthcare businesses, particularly after a rapid-fire succession of high-profile breaches, hackings, and ransomware outbreaks. From phishing attacks and viruses to laptops accidentally left in a cab, health information is exposed to an almost endless range of dangers.
The HIPAA Security Rule specifies a broad set of technical safeguards for organizations storing PHI, including transmission security, authentication and access procedures, and integrity and auditing measures.
In practice, these safeguards translate into common-sense security measures such as up-to-date anti-virus software, firewalls, encryption of sensitive data, and multi-factor authentication.
However, even the most tightly secured data center can be undermined by personnel who tend to prioritize convenience over lengthy software updates and complicated restrictions on their access to data or software.
Health organizations should regularly remind their staff of the critical nature of data security standards and continuously review who has access to high-value data in order to prevent damage by malevolent parties.
2.1.1.5 Stewardship
Health data have a long shelf life, especially on the clinical side. In addition to keeping patient data accessible for at least six years, clinicians may choose to use de-identified datasets for research projects, which makes continued stewardship and curation vital. Data may also be repurposed or re-examined for additional objectives, such as quality measurement or performance benchmarking.
Understanding when and for what purposes the data were created, as well as who used them previously, why, how, and when, is vital to academics and data analysts.
The development of complete, accurate, and up-to-date metadata is an important component of a successful data management plan. Metadata enables analysts to precisely duplicate earlier queries, which is critical for scientific investigations and proper benchmarking, and prevents the creation of "data trash".
Health organizations should employ a data steward to produce and curate valuable metadata. A data steward can ensure that all elements have standard definitions and formats, are properly documented from creation to deletion, and remain valuable for the tasks at hand.
2.2 Access Control–Based Security
Access control is a mechanism to ensure that users are who they say they are and that they have appropriate access to company data.
Access control at a high level is a selective restriction of data access. It
comprises two primary components: authentication and authorization, as
explained by Daniel Crowley, IBM’s X-Force Red research manager with a
focus on data security.
Authentication is a technique used to verify that someone is who they claim to be. Authentication alone is not enough to protect data, Crowley notes; what is required is an additional authorization layer that determines whether a user should be allowed to access the data or execute the transaction.
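The distinction can be sketched in a few lines. The user table, password, and permission names here are hypothetical, and a real system would store password hashes rather than plaintext:

```python
# Minimal sketch separating the two access-control components.
USERS = {"alice": "s3cret"}               # authentication data (plaintext only for illustration)
PERMISSIONS = {"alice": {"read_record"}}  # authorization data

def authenticate(user: str, password: str) -> bool:
    """Is the user who they claim to be?"""
    return USERS.get(user) == password

def authorize(user: str, action: str) -> bool:
    """Is the authenticated user allowed to perform the action?"""
    return action in PERMISSIONS.get(user, set())

def access(user: str, password: str, action: str) -> bool:
    # Authentication alone is not enough: both layers must pass.
    return authenticate(user, password) and authorize(user, action)
```

A correct password with no matching permission is refused, just as a valid permission with a wrong password is.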
2.2.1 Authentication
Authentication is the process of establishing trust in a user's identity. The required level of authentication assurance should match the nature and sensitivity of the application and the risk involved. A growing number of cloud services are accessed with previously issued credentials, both by users of applications and data and by those who support and administer them. Strong two-factor authentication, of the kind used in online banking, should be applied wherever possible; in principle, network access should be protected by strong authentication. The strictest requirements apply to CSP employees, who have privileged access to IT resources: their access should be granted only through strong authentication, for example via a chip card, a USB token, or a hardware-based one-time-password system, and this is especially necessary for access over the Internet. Strict procedures should also govern the trust relationships between any two participants: once a trust relationship has been established through a chain of certificates issued by a certification authority, participants can use those certificates to authenticate each other [3]. Organizations can choose from a variety of authentication methods and techniques, as follows.
2.2.1.1 User Password Authentication
Authentication is the process of identifying users who ask for system, network or device access. Access control frequently determines user identity
using credentials such as login and password. Additional authentication
technologies, such as biometric and authentication applications, are also
utilized to authenticate user identification.
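A common way to store such credentials safely is a salted, slow hash rather than the password itself. The sketch below uses Python's standard-library PBKDF2; the iteration count is an illustrative choice, not a recommendation:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; real deployments tune this upward

def hash_password(password: str, salt: bytes = b"") -> tuple[bytes, bytes]:
    """Derive a salted hash so a stolen credential table does not reveal passwords."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

The random salt ensures two users with the same password store different hashes, defeating precomputed lookup tables.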
2.2.1.2 Windows-Based User Authentication
Typically, the list is stored in the Windows Active Directory for the organization. The access control framework must be enabled to provide authentication for the user’s primary domain controller (PDC).
2.2.1.3 Directory-Based Authentication
As business volumes continue to grow, millions of users often try to use resources simultaneously, and the authentication authority must be able to authenticate them quickly. Directory-based authentication meets this need by verifying user credentials against an LDAP user directory.
2.2.1.4 Certificate-Based Authentication
Certificate-based authentication is a strong authentication technology in which a digital ID is bound to the user. A trusted certification authority issues the digital ID, also known as a digital certificate. To confirm identity, a variety of other parameters about the user are also checked.
2.2.1.5 Smart Card–Based Authentication
Here, the certificate is used as a second factor [13]. A smart card is a small cryptographic device with an onboard co-processor for data operations.
2.2.1.6 Biometrics
Biometrics provides strong authentication [9]. It adds a third factor: beyond something users know (a username and password) and something they have (a token or card), it verifies something they are (a retinal scan, fingerprint, or thermal scan). It is used where data are especially sensitive, such as in military and defense settings.
2.2.1.7 Grid-Based Authentication
Grid-based authentication is used as a second authentication factor. The user first proves something they know (a username and password, verified by the authentication service) and is then asked for something they have (information from a grid card). Entrust Identity Protector provides this form of authentication.
2.2.1.8 Knowledge-Based Authentication
To gain additional confidence in a user's identity, the organization challenges the user with questions that an attacker is unlikely to be able to answer [2]. The questions are based on a "shared secret", such as information the user supplied during registration or details confirming a previous transaction, and must be answered before the requested action is allowed.
2.2.1.9 Machine Authentication
Machine authentication is the authorization of automated machine-to-machine (M2M) communication through verification of digital certificates or digital credentials.
Digital certificates used in machine authentication are like a digital passport that provides a trustworthy identity for secure information exchange on the Web. Digital credentials are similar to a machine-issued ID and password.
Machine authentication is used to permit machine interactions on wired and wireless networks, allowing autonomous communication and information sharing between computers and other machines. Machine authentication can be carried out even with simple devices such as sensors and utility meters.
2.2.1.10 One-Time Password (OTP)
A one-time password (OTP) is generated dynamically and is valid only once. As the name suggests, OTP systems provide a mechanism for logging on to a network or service with a unique password that can be used only once; its advantage is that even if an intruder captures it, it cannot be reused. OTP generators come in two types: synchronous and asynchronous. The static password, by contrast, remains the most common authentication method and the least secure.
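The synchronous (counter-based) variant can be sketched with the standard HMAC construction from RFC 4226, where client and server share a secret and a counter:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based one-time password in the style of RFC 4226 (HOTP)."""
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Each successful login advances the counter on both sides, so a captured code is useless for replay; time-based OTPs (TOTP) replace the counter with a timestep.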
2.2.1.11 Authority
Maintaining proper authorization is an important information security requirement for the integrity of cloud computing, and it governs the controls and privileges applied throughout processing in the cloud. The rights management system should ensure that each role can see only the data (including metadata) it needs to perform its function. Access control should be role-based, and the established roles and their holders should be reviewed regularly. In general, the model of least privilege should be used: users and administrators of the CSP receive only the rights necessary to carry out their functions [14].
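A least-privilege rights check can be sketched as a role-to-rights mapping; the roles, rights, and users below are hypothetical:

```python
# Least-privilege sketch: each role is granted only the rights needed
# for its function; names are illustrative.
ROLE_RIGHTS = {
    "doctor": {"read_record", "write_record"},
    "billing": {"read_billing"},
    "admin": {"manage_users"},
}
USER_ROLES = {"dr_lee": {"doctor"}, "omar": {"billing"}}

def allowed(user: str, right: str) -> bool:
    """A user holds a right only through one of their assigned roles."""
    return any(right in ROLE_RIGHTS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

Reviewing authority then reduces to auditing two small tables rather than per-user permission sprawl.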
2.2.1.12 Global Authorization
Authorization decisions must combine global information (such as membership in a virtual organization) with local rules and restrictions (such as a locally banned user). In grid environments, users subscribe to virtual organizations. In early versions of the Globus software, subscription information was held locally: a user's distinguished name (DN) was mapped through a grid map file to a local account, so users required an account on every resource they wished to use [12]. The authorization process is performed on the Grid DAS side by exploiting the Community Authorization (VO-based) extensions present in the user's credentials (e.g., a proxy).
2.3 System Model
In this section, we propose a system model and describe the architecture of an EHR system with fuzzy keyword search.
2.3.1 Role and Purpose of Design
We consider EHR services hosted in a cloud computing environment. In particular, as shown in Figure 2.1, four entities are involved in the system.
2.3.1.1 Patients
Patients are the entities that generate their EHRs and place them on the cloud server.
2.3.1.2 Cloud Server
A cloud server is a virtual server (rather than a physical server) running in
a cloud computing environment.
2.3.1.3 Doctor
Accessing a patient's chart, a doctor gets summarized data including patient demographics, immunization dates, allergies, medical history, lab and test results, radiology images, vital signs, prescribed medications, and current health problems, along with the health insurance plan and billing details.
Figure 2.1 Architecture for the PHR system. (A global authority issues private keys and publishes public parameters; patients store PHRs on the cloud server, and doctors retrieve them.)
2.4 Data Classification
Data classification is the process of data to identify data elements in relation to value in the business of the classification process. Cost, use, and
control of access restrictions depend on whether they are identified, as
shown in Figure 2.2.
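For instance, the classification level assigned to a data element can drive its handling controls directly. The levels and controls below are an assumed illustration, not a published standard:

```python
# Assumed mapping from classification level to handling controls.
CONTROLS = {
    "public":     {"encrypt_at_rest": False, "audience": "anyone"},
    "internal":   {"encrypt_at_rest": True,  "audience": "employees"},
    "restricted": {"encrypt_at_rest": True,  "audience": "need-to-know"},
}

def handling(level: str) -> dict:
    """Look up the controls implied by a data element's classification."""
    return CONTROLS[level]
```

Once a record is labeled, every downstream system can apply the same cost, storage, and access decisions without re-inspecting the data.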
2.4.1 Access Control
The aim of access control is to grant access only to those who are authorized to be in a building or workplace. Together with its matching metal key, the deadbolt lock was the gold standard of access control for many years, but modern enterprises want more. Yes, you want to control who passes through your doors, but you also want to monitor and manage that access. Keys have now passed the baton to computer-based electronic access control systems that give authorized users fast, convenient access while prohibiting entry to unauthorized persons.
Today, we carry access cards or ID badges to enter secure places instead of keys. Access control systems may also be used to restrict access to workstations, file rooms containing sensitive information, printers, and entry doors. In larger buildings, access through the exterior door is typically managed by the landlord or managing agency, while access through interior office doors is controlled by the tenant.
Figure 2.2 Data classification in cloud computing. (Classification properties fall into three groups: access control, covering frequency of access, frequency of update, visibility and accessibility, and retention; content, covering precision/accuracy, reliability/validity, degree of completeness, and consistency and auditability; and storage, covering storage encryption, communication encryption, integrity, access control, backup and recovery plans, and data quality standards.)

Frequency of access: Access control is a fundamental component of data security that dictates who is allowed to access and use company information and resources. Through authentication and authorization, access control policies make sure users are who they say they are and that they have appropriate access to company data.
Frequency of update: how often the data are updated or replicated; is the frequency low, medium, or high?
Visibility and accessibility: the ability of one entity to "see" (i.e., have direct access to) another. A related concept is lexical scope: the scope of a name binding is the part of the source code in which the name can refer to the entity.
Retention: Data retention, or record retention, is exactly what it sounds like: the practice of storing and managing data and records for a designated period of time. There are many reasons why a business might need to retain data: to maintain accurate financial records; to abide by local, state, and federal laws; to comply with industry regulations; to ensure that information is easily accessible for eDiscovery and litigation purposes; and so on. To fulfill these and other business requirements, it is imperative that every organization develop and implement data retention policies.
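A retention policy ultimately reduces to a date comparison. The sketch below assumes the six-year clinical minimum mentioned earlier in the chapter; the exact period is jurisdiction-dependent:

```python
from datetime import date, timedelta

# Assumed policy: clinical records are retained for at least six years
# before becoming eligible for disposal.
RETENTION = timedelta(days=6 * 365)

def eligible_for_disposal(created: date, today: date) -> bool:
    """A record may be disposed of only after its retention period ends."""
    return today - created > RETENTION
```

In practice such checks run as scheduled jobs, with legal holds able to override the result.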
2.4.2 Content
These properties relate to the quality of the data content, and several of them can be used to classify data, as follows:
Accuracy: data with low accuracy can be classified as low quality or poor. High precision and accuracy of content, on the other hand, are required for some data elements.
Reliability/Validity: reliability and validity are concepts used to assess the quality of research. They indicate how well a method, technique, or test measures something.
Degree of completeness: the extent to which all required data elements are present and usable for the intended purpose.
Auditability: a data audit refers to auditing data to assess its quality or utility for a specific purpose. Unlike auditing finances, auditing data involves looking at key metrics other than quantity to draw conclusions about the properties of a data set.
2.4.3 Storage
Storage-related policies can be applied according to a diverse set of criteria, including the following.
Storage encryption: the use of encryption for data both in transit and on storage media. Data are encrypted as they pass to storage devices such as individual hard disks, tape drives, or the libraries and arrays that contain them.
Communication encryption: data leaving the system are at risk of leakage or eavesdropping, so sensitive data and communications must be protected with encryption.
Integrity: data integrity is a critical issue and is typically handled with hash algorithms such as MD5 and SHA. The strength required depends on the security level of the data element.
Access control policy: aims to ensure that, with the appropriate access controls in place, the right information is accessible to the right people at the right time, and that access to information in all its forms is appropriately managed and periodically audited.
Backup and recovery plan: a backup plan is required for disaster recovery purposes, and data must be tied to a baseline backup scheme.
Data quality standards: separate standards govern the validation of user data as required for classifying the data.
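The integrity item above can be sketched with a standard-library digest. SHA-256 is used here because MD5, though mentioned in older texts, is no longer considered safe against deliberate tampering:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest used as an integrity fingerprint for stored data."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, expected: str) -> bool:
    # Any change to the data, even a single bit, changes the digest.
    return fingerprint(data) == expected
```

Storing the fingerprint separately from the data lets a later reader detect silent corruption or unauthorized modification.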
2.4.4 Soft Computing Techniques for Data Classification
Soft computing is a collection of methodologies that:
• Exploit the tolerance for imperfection and uncertainty.
• Provide capability to handle real-life ambiguous situations.
• Try to achieve robustness against imperfection.
One of the most popular soft computing-based classification techniques is fuzzy classification. Fuzzy classes can represent transitional areas better than hard classification because class membership is not binary; one item can belong to several classes at once. In fuzzy set-based systems, membership values of data items range between 0 and 1, where 1 indicates full membership and 0 indicates no membership. Figure 2.3 shows a block diagram of the fuzzy classification technique.
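The idea of partial membership can be made concrete with a triangular membership function; the body-temperature sets below are invented for illustration:

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership in a triangular fuzzy set rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative fuzzy sets over body temperature (deg C): a reading of 37.5
# belongs partially to "normal" and partially to "fever" at the same time.
def normal(x: float) -> float:
    return triangular(x, 35.0, 36.8, 38.0)

def fever(x: float) -> float:
    return triangular(x, 37.0, 38.5, 41.0)
```

At 37.5 deg C both memberships are nonzero, which is exactly the transitional behavior a hard (binary) classifier cannot express.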
Figure 2.3 Fuzzy classification block diagram. (A training set is used to learn a fuzzy model, which is then applied to a test set.)

This section explains the various layers of the analysis framework. The analytical framework is divided into a user interface layer and a processing layer. The user interface layer is responsible for taking input from the user; the processing layer is responsible for classification and comparison. A data access layer connects the application to the databases that store the data. Figure 2.4 shows the system architecture and the interaction between the various components. Each layer is implemented as a class file that implements its interface and data processing.
Figure 2.4 illustrates that the analytical framework consists of two layers: the first provides the user interface that allows users to select the desired dataset and algorithm, and the second provides the processing component for the selected algorithm.
Figure 2.4 Analysis framework architecture. (The user works through the user interface layer to select a dataset and a classification algorithm; the processing layer performs the classification and comparison.)
2.5 Related Work
The authors [1] analyzed the security management of health data and proposed the use of Blockchain. However, Blockchains are computationally expensive, demand high bandwidth and additional computing resources, and are not fully suitable for resource-limited IoT devices, even in the smart-city settings for which such systems are built. To address these problems, the authors proposed a novel device architecture, IoT Blockchain, a model that provides additional privacy and security properties beyond the conventional properties of the network. In their model, these additional properties rest on sophisticated cryptographic primitives. The solution offers more secure and anonymous transactions for IoT applications built on Blockchain-based data networks.
Whitney and Dwyer [2] introduced the advantages of the Blockchain approach in the medical field and proposed a Blockchain-based personal health record (PHR) technology. Data can be handled well if properly classified; for example, different medical data such as a person's BMI can be classified as lean, normal, fat, or obese. Important applications of data mining techniques in medicine include health informatics, medical data management, patient monitoring systems, analysis of medical images to extract unknown information, and automatic identification of diseases.
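The BMI example can be written as a crisp classifier using the chapter's four labels; the numeric cutoffs are the commonly used WHO-style thresholds, assumed here for illustration:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms over height in meters squared."""
    return weight_kg / height_m ** 2

def bmi_class(value: float) -> str:
    """Map a BMI value to the chapter's four labels (WHO-style cutoffs assumed)."""
    if value < 18.5:
        return "lean"
    if value < 25:
        return "normal"
    if value < 30:
        return "fat"
    return "obese"
```

A fuzzy variant would instead assign partial membership near the cutoffs, e.g., a BMI of 24.9 being almost "normal" and slightly "fat".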
In [3], the authors proposed a novel EHR-sharing design with a decentralized structure built on a mobile cloud platform and Blockchain. In particular, the system is designed to achieve secure sharing of EHRs among various patients and medical providers through a reliable access-control smart contract. They provide a prototype implementation of real-data sharing scenarios using an Ethereum Blockchain and mobile applications on Amazon cloud computing. Empirical results suggest that the proposal is an effective solution for reliable data exchange that protects sensitive medical information from potential threats on the mobile cloud. Evaluations of system security and performance likewise show a lightweight design, performance improvements under high security standards, and minimal network latency with control over data confidentiality, compared with existing designs.
The authors [4] proposed a system for detecting lung cancer using a backpropagation neural network and a genetic algorithm. Classification was performed with the backpropagation neural network, which classifies digital X-rays, CT images, MRIs, and so forth as normal or abnormal. The normal condition is that which is characteristic of a healthy patient; abnormal images are considered further for feature study. The genetic algorithm is used for adaptive analysis to extract and select characteristics based on the fitness of the extracted features. For images previously classified as abnormal, the selected features are then classified as cancerous or noncancerous. This method helps make an informed judgment on the status of the patient.
The authors [5] proposed segmentation techniques to improve tumor detection and computational efficiency; a GA is used for automated tumor stage classification. The choice in the classification stage is based on extracting the relevant features and calculating the area. A comparative approach is developed to compare four segmentation techniques, based on watershed, FCM, DCT, and BWT, and the best is chosen by evaluating the segmentation score. The practical products of the proposed approach are evaluated and validated on MRI brain images using the segmentation score, accuracy, sensitivity, specificity, and the dice similarity index coefficient.
In [6], the authors proposed a Blockchain-based platform that can be used to store and manage electronic medical records in cloud environments. In this study, they proposed a Blockchain-based model for structuring health data in cloud computing environments. Their contributions include the proposed solution and a presentation of future directions for medical data on Blockchain. The paper provides an overview of the handling of heterogeneous health data and a description of the internal functions and protocols.
The authors in [7] presented a fuzzy-based method for iterative image reconstruction in emission tomography (ET). Two simple operations, fuzzy filtering and fuzzy smoothing, are performed: fuzzy filtering is used during reconstruction to identify edges, while fuzzy smoothing penalizes only those pixels for which edges are missing in the nearest neighborhood. These operations are repeated iteratively until appropriate convergence is achieved.
The authors in [8] developed an image segmentation technique using a fuzzy-based artificial bee colony (FABC), combining fuzzy c-means (FCM) with artificial bee colony (ABC) optimization to search for better cluster centers. The proposed FABC method is more reliable than other optimization approaches such as GA and particle swarm optimization (PSO). The experiments, performed on grayscale images, include synthetic, medical, and texture images. The proposed method has the advantages of fast convergence and low computational cost.
The authors in [9] proposed an adaptive fuzzy hexagonal bilateral filter that eliminates Gaussian noise while preserving useful data. Local and global evaluation metrics are used to create the fuzzy hexagonal membership function, and the method combines the median filter and the bilateral filter in an adaptive way. The bilateral filter retains edges by smoothing the noise in the MRI image, together with a local filter that maintains edges and obtains structural information. The proposed and existing approaches were compared in a series of experiments on synthetic and clinical brain MRI data at various noise levels. The outcome demonstrates that the proposed method restores the image to improved quality, suitable for diagnostic purposes, at both low and high Gaussian noise densities.
In [10], the authors conceptualized a transparent, dynamic design based on Blockchain and cloud storage for sharing personal health data while protecting it. They also provide a machine learning-based data quality module for checking the shared data. The main objective of the proposed system is to let individuals share personal health data, in accordance with the GDPR, with consent, control, and security over each shared dataset. This allows researchers to conduct high-quality research on effectively protected personal health data, and consumers to share their data for commercial purposes. The work first characterizes personal health data, grouped into dynamic and static categories, and describes a method for acquiring health-related data from mobile-enabled devices (continuous and real-time data). For integrated storage, hashed pointers to storage spaces of various sizes are used. First, they proposed sharing dynamic data in segments of different sizes. Second, they proposed a dynamic Blockchain and cloud storage system for health data, in which encrypted health data of various sizes can be stored in either form. To control the quality inherent in the proposed system, a data quality module is included, and lineage and ownership can be associated with transactions and metadata. Third, the system is supported by hardware and software technology.
The authors proposed a robust sparse-representation method for medical image classification based on adaptive type-2 fuzzy dictionary learning (T2-FDL). In this procedure, sparse coding and dictionary learning are performed iteratively until a near-optimal dictionary is produced. Two open-access brain tumor MRI databases, REMBRANDT and TCGA-LGG, from The Cancer Imaging Archive (TCIA), are used to conduct the experiments. The findings on a brain tumor classification task indicate that the implemented T2-FDL approach can effectively mitigate the adverse impact of ambiguity in image data. The outcomes show the improved performance of T2-FDL in terms of accuracy, specificity, and sensitivity compared with other relevant classification methods in the literature.
The authors proposed a framework to briefly introduce the various soft computing methodologies and to present their applications in medicine. The scope is to demonstrate the possibilities of applying soft computing to medicine-related problems. Recently published knowledge about the use of soft computing in medicine is surveyed and reviewed from the literature, and the study identifies which soft computing methodologies are most frequently used together to solve particular problems in medicine. According to database searches, the preference rates for combined soft computing methodologies in medicine are 70% for fuzzy logic with neural networks, 27% for neural networks with genetic algorithms, and 3% for fuzzy logic with genetic algorithms. So far, the fuzzy logic-neural network combination has been used predominantly in the clinical sciences of medicine, while the neural network-genetic algorithm and fuzzy logic-genetic algorithm combinations have mostly been preferred in the basic sciences. The study shows undeniable interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology, and neurology disciplines.
The authors proposed a method for automatically analyzing machine learning prediction results. Predictive modeling is a process that uses data mining and probability to forecast outcomes. Each model is made up of
several predictors, which are variables that are likely to influence future
results. Once data has been collected for relevant predictors, a statistical
model is formulated. The model may employ a simple linear equation,
or it may be a complex Neural Network, mapped out by sophisticated
software. As additional data becomes available, the statistical analysis
model is validated or revised. Predictive analytics can support population health management, financial success, and better outcomes across
the value-based care sequence. Instead of simply presenting information
about past events to a user, predictive analytics estimates the likelihood of
a future outcome based on patterns in the historical data. The electronic medical record data set from the Practice Fusion diabetes classification competition, containing patient records from all 50 states in the United States, was utilized in this work to illustrate the method of predicting type 2 diabetes diagnosis within the next year. The prediction was done using two models, one for prediction and another for explanation. The
first model is used only for making predictions and aims at maximizing
accuracy without being concerned about interpretability. It can be any
machine learning model and arbitrarily complex. The second model is a
rule-based associative classifier used only for explaining the first model’s
results without being concerned about its accuracy.
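The two-model idea can be sketched in miniature: an arbitrarily complex "black-box" model makes the predictions, and a simple rule-based surrogate is mined to agree with it as closely as possible, serving as the explanation. The data, features, and coefficients below are synthetic illustrations, not the Practice Fusion models.

```python
# Sketch of the two-model approach: an accurate "black-box" predictor,
# plus a rule-based surrogate trained to mimic (and thus explain) it.
# All data and thresholds here are synthetic illustrations.

def black_box(patient):
    # Stand-in for an arbitrarily complex model (e.g., a neural network):
    # predicts type 2 diabetes risk from glucose and BMI.
    score = 0.03 * patient["glucose"] + 0.1 * patient["bmi"] - 7.0
    return 1 if score > 0 else 0

def mine_rules(patients, labels):
    # Surrogate: find, per feature, the threshold rule that best agrees
    # with the black-box labels (a crude associative classifier).
    best = None
    for feat in ("glucose", "bmi"):
        for t in sorted({p[feat] for p in patients}):
            preds = [1 if p[feat] > t else 0 for p in patients]
            agree = sum(p == y for p, y in zip(preds, labels)) / len(labels)
            if best is None or agree > best[2]:
                best = (feat, t, agree)
    return best  # (feature, threshold, fidelity to the black box)

patients = [{"glucose": g, "bmi": b}
            for g in (80, 100, 120, 140, 160, 180) for b in (20, 25, 30, 35)]
labels = [black_box(p) for p in patients]
feat, t, fidelity = mine_rules(patients, labels)
print(f"rule: {feat} > {t} (fidelity {fidelity:.2f})")
```

The surrogate is evaluated only on how faithfully it reproduces the first model's outputs, mirroring the paper's split between accuracy and interpretability.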
The authors also described a decentralized system for managing personal data in which users create and control their own data. They implemented a protocol that turns the Blockchain into an automated access-control manager that does not require trust in a third party. Unlike Bitcoin, the transactions in this system are not strictly financial; they carry instructions such as storing, querying, and sharing data. Finally, they discussed the future potential of Blockchain as a solution for trusted computing problems in the community. The platform enables more: Blockchain is intended as an access control layer combined with storage solutions. Users do not need to rely on third parties and can always be aware of what kind of data is being collected about them and how it is used. Additionally, users are recognized as the owners of their personal data. Companies, in turn, can focus on using data properly without being overly concerned about securing it. Moreover, with a decentralized platform where sensitive data is gathered, it should be simpler to comply with legal rulings and regulations on data storage and sharing. Laws and regulations can even be programmed into the Blockchain so that they are enforced automatically. In other cases, access records on the Blockchain may serve as legal evidence, as they cannot be tampered with.
This review proposed a machine learning-based framework to identify type 2 diabetes using EHRs. The work utilized 3 years of EHR data. The
data was stored in the central repository, which has been managed by the
District Bureau of Health in Changning, Shanghai since 2008. The EHR
data generated from 10 local EHR systems are automatically deposited into
the centralized repository hourly. The machine learning models within the framework include K-Nearest Neighbors, Naïve Bayes, Decision Tree, Random Forest, Support Vector Machine, and Logistic Regression. The identification performance was higher than that of the state-of-the-art algorithm.
Traditional health systems must handle disaster response, workforce development, dissemination of information, and reputation. There is no appropriate guidance for the experienced physician, and forecasting and tracking are constrained by financial funding, scheduling, organization, measures, and government estimates. This supports the evidence that has been provided, though researchers expect complications to increase in the future. Remedial healthcare review and its problems are inseparable. The forms of education and future developments that are ready in today's context are defined here and will essentially help testers with scientific design. Remedial health services will honor traditional medicine, healthcare systems, and modern diagnostic technology, given how far this notion has come.

A number of cloud-based techniques have been proposed, and tools that support them continue to be developed. Performance was evaluated on cloud software in a simulated cloud computing environment: a typical cloud environment simulation test was performed and the final test results were taken. Faulty devices were provisioned to measure the fault tolerance and efficiency of simulated cloud workflow software, for both scientific and social networking workloads [1, 12, 13]. Other work continues to develop methods that obtain a higher degree of security by exploiting hardware capabilities: a cryptographic co-processor increases cost but also strengthens data protection in distributed computing [9].
PaaS (data protection as a service): security standards are provided on behalf of the user. The proposals offer data protection, data security, and data authentication for administrators, defending against some malicious software. A single-cloud platform is beneficial for protecting large numbers of application users.
A fuzzy k-nearest neighbor technique is also proposed as a framework for fuzzy prototype decision rules. Three strategies are given for determining the membership values of the training samples, allowing an input sample vector to be classified with graded (fuzzy) membership in each class rather than a crisp label. This matters when an unlabeled sample lies among more than two classes and a large number of neighbors tie, as happens in the crisp k-nearest neighbor rule.
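A minimal sketch of the fuzzy k-nearest neighbor rule along the lines of Keller et al. [4] follows; the training points, the query, and the crisp (1.0) class memberships are illustrative, and a real application could assign graded memberships to the training samples as well.

```python
import math

# A minimal sketch of a fuzzy k-nearest neighbor rule. The training
# points, memberships, and query below are illustrative values only.

def fuzzy_knn(query, samples, k=3, m=2.0):
    """samples: list of (vector, {class: membership}) pairs.
    Returns the query's fuzzy membership in each class."""
    # Distance-sorted k nearest labeled samples.
    neighbors = sorted(samples, key=lambda s: math.dist(query, s[0]))[:k]
    classes = {c for _, u in samples for c in u}
    result = {}
    for c in classes:
        num = den = 0.0
        for vec, u in neighbors:
            d = math.dist(query, vec) or 1e-12   # avoid divide-by-zero
            w = d ** (-2.0 / (m - 1.0))          # inverse-distance weight
            num += u.get(c, 0.0) * w
            den += w
        result[c] = num / den
    return result

# Crisp memberships (1.0 for the labeled class) for a toy 2-class problem.
samples = [((0.0, 0.0), {"A": 1.0}), ((0.1, 0.2), {"A": 1.0}),
           ((1.0, 1.0), {"B": 1.0}), ((0.9, 1.1), {"B": 1.0})]
print(fuzzy_knn((0.2, 0.1), samples))   # high membership in class A
```

The returned memberships sum to one across classes, giving the "blurred" result the text describes instead of a hard tie-breaking vote.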
Cloud technologies for avoiding data duplication currently use convergent encryption, which remains an important management strategy for secure de-duplication. Secure de-duplication of redundant data is widely applied in cloud storage by means of convergent encryption, but distributing the keys securely remains a major concern. Because convergent encryption derives a key from the data itself, a large number of keys must be maintained, which is a difficult task.
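The key-management tension can be seen in a toy sketch of convergent encryption: the key is derived from the content itself, so identical plaintexts deduplicate, but every distinct file needs its own key. The XOR keystream below stands in for a real block cipher such as AES.

```python
import hashlib

# Toy sketch of convergent encryption for deduplication: the key is
# derived from the content itself, so identical plaintexts yield
# identical ciphertexts and can be deduplicated. The XOR keystream is
# only a stand-in for a real block cipher such as AES.

def convergent_key(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(data: bytes):
    key = convergent_key(data)
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
    return key, ct

record = b"patient: 1234, glucose: 140"
key1, ct1 = encrypt(record)
key2, ct2 = encrypt(record)
assert ct1 == ct2   # identical data -> identical ciphertext (dedup works)
```

Note that each distinct record produces a distinct key, which is exactly the key-maintenance burden mentioned above.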
Research shows that some approaches classify data into private and public categories to share data securely and protect personal data. One such classification technology is the k-nearest neighbor algorithm, a machine learning technique that classifies data as personal or public. Personal data is encrypted using RSA and sent to the cloud server.
2.6 Conclusion
Personal data is one of the main issues when dealing with data storage in cloud security. Classification of data in the cloud is the identification of a set of standards; this proposal depends on the security level of the content and its access type. We are able to provide the level of security needed in cloud storage through privacy and restrictions on access to a set of data. We classified the data based on analysis of multiple data elements and criteria. This chapter focuses on data security in the cloud technology environment. The main objective of this study was to classify data protection elements based on the data itself. Partitioning the data into sensitive and non-sensitive classes allows better techniques to be applied. Sensitive data is encrypted with the Blowfish algorithm before being sent to the cloud, while non-sensitive data is stored on the cloud server without encryption. The cloud also keeps the classes in separate, isolated partitions, but all data is stored in the same cloud.
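The partition-then-encrypt flow described above can be sketched as follows. The sensitivity criteria are hypothetical, and the toy XOR cipher is only a placeholder for Blowfish, which in practice would come from a cryptography library such as PyCryptodome.

```python
import hashlib

# Sketch of the chapter's partition-then-encrypt flow: records are
# classified as sensitive or non-sensitive; only sensitive records are
# encrypted before upload. The XOR "cipher" is a placeholder for
# Blowfish; the sensitivity criteria are illustrative assumptions.

SENSITIVE_FIELDS = {"diagnosis", "ssn"}          # hypothetical criteria

def is_sensitive(record: dict) -> bool:
    return bool(SENSITIVE_FIELDS & record.keys())

def toy_encrypt(text: str, key: bytes) -> bytes:
    stream = hashlib.sha256(key).digest() * (len(text) // 32 + 1)
    return bytes(a ^ b for a, b in zip(text.encode(), stream))

def upload(records, key=b"demo-key"):
    sensitive_part, public_part = [], []
    for r in records:
        if is_sensitive(r):
            sensitive_part.append(toy_encrypt(str(r), key))
        else:
            public_part.append(r)
    # Separate partitions, but both live in the same cloud store.
    return {"sensitive": sensitive_part, "public": public_part}

cloud = upload([{"name": "A", "diagnosis": "T2D"}, {"name": "B"}])
print(len(cloud["sensitive"]), len(cloud["public"]))
```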
A clinical decision support system (CDSS) is an application that analyzes
data to help healthcare providers make decisions, and improve patient care.
A CDSS focuses on using knowledge management to get clinical advice
based on multiple factors of patient-related data. Clinical decision support
systems enable integrated workflows, provide assistance at the time of care,
and offer care plan recommendations. Physicians use a CDSS to diagnose
and improve care by eliminating unnecessary testing, enhancing patient
safety, and avoiding potentially dangerous and costly complications. The applications of big data in healthcare include cost reduction in medical treatment, elimination of risk factors associated with disease, prediction of diseases, improved preventive care, and analysis of drug efficiency. Some challenging tasks for the healthcare industry are:
(i) How can the most effective treatment for a particular disease be decided?
(ii) How do certain policies impact outlay and behavior?
(iii) How is healthcare cost likely to rise in different future scenarios?
(iv) How can fraudulent claims be identified?
(v) Does healthcare outlay vary geographically?
These challenges can be overcome by utilizing big data analytical tools and techniques. There are four major pillars of quality healthcare: real-time patient monitoring, patient-centric care, improved treatment methods, and predictive analytics of diseases. All four pillars can be potently managed using descriptive, predictive, and prescriptive big data analytical techniques.
References
1. Rawat, P.S., Saroha, G.P., Bartwal, V., Evaluating SaaS modeler running on the virtual cloud computing environment using CloudSim. Int. J. Comput. Appl. (0975-8887), 5, 13, September 2012.
2. Whitney, A.W. and Dwyer, S.J., III, Performance and implementation of the k-nearest neighbor decision rule with incorrectly identified training samples, in: Proc. 4th Ann. Allerton Conf. on Circuit and System Theory, 1966.
3. Dasarathy, B.V., Nosing around the neighborhood: A new system structure and classification rule for recognition in partially exposed environments. IEEE Trans. Pattern Anal. Mach. Intell., PAMI-2, 67–71, 1980.
4. Keller, J.M., Gray, M.R., Givens, Jr., J.A., A fuzzy k-nearest neighbor algorithm. IEEE Trans. Syst. Man Cybern., SMC-15, 4, 580–585, July/August 1985.
5. Bauer, E. and Kohavi, R., An empirical comparison of voting classification algorithms: Bagging, boosting, and variants. Mach. Learn., 36, 1–2, 105–139, 1999.
6. Khan, M., Ding, Q., Perrizo, W., K-nearest neighbor classification on spatial data streams using P-trees, in: PAKDD 2002, LNAI, vol. 2336, pp. 517–528, 2002.
7. Catteddu, D. and Hogben, G., Cloud Computing: Benefits, risks and recommendations for information security, ENISA, 2009.
8. Phyu, T.N., Survey of data mining classification techniques, in: Proceedings of the International MultiConference of Engineers and Computer Scientists 2009 (IMECS 2009), vol. I, March 18–20, 2009.
9. Ram, C.P. and Sreenivaasan, G., Security as a Service (SASS): Securing user data by the coprocessor and distributing the data. Trendz in Information Sciences and Computing (TISC 2010), December 2010, pp. 152–155.
10. Yau, S.S. and Ho, G., Privacy protection in cloud computing systems. Int. J. Software Inform., 4, 4, 351–365, December 2010.
11. Mishra, A., An authentication mechanism based on client-server architecture for accessing cloud computing, Int. J. Emerging Technology and Advanced Engineering, 2, 7, 95–99, July 2012.
12. Huang, S.-C. and Chen, B.-H., Highly accurate moving object detection in variable bit rate video-based traffic monitoring systems. IEEE Trans. Neural Netw. Learn. Syst., 24, 12, 1920–1931, 2013.
13. Kafhali, S.E. and Haqiq, A., Effect of mobility and traffic models on the energy consumption in MANET routing protocols. arXiv preprint arXiv:1304.3259, 2013.
14. Mishra, A. et al., A review on DDoS attack, TCP flood attack in cloud environment. Elsevier SSRN International Conference on Intelligent Communication and Computation Research, available at https://ssrn.com/abstract=3565043, March 31, 2020.
3
Research Challenges in Pre-Copy Virtual
Machine Migration in Cloud Environment
Nirmala Devi N.1 and Vengatesh Kumar S.2*
1Department of Computer Science, Auxilium College, Vellore, India
2Department of Computer Applications, Dr. SNS Rajalakshmi College of Arts and Science, Coimbatore, India
Abstract
The Internet of Medical Things (IoMT) is a blend of medical devices and applications, which promotes clinical knowledge of network technology to analyze
medical data. The IoMT supports the patient to share their medical data with the
doctor in a secure manner. There is an exponential growth in the implementation
of IoMT in healthcare, but there are still some crucial problems in medical data
security. Cloud computing technology offers new applications in an optimized manner, without network issues and with the quickest access to services. In cloud computing, medical data management takes place in an Infrastructure as a Service within the data center. These medical data are stored in a virtual machine (VM) that can be transferred easily, without data loss, to other data centers. VM migration is a part of cloud computing that provides system maintenance, fault tolerance, load balancing, and reliability to the users. Many migration approaches are available in the present scenario. We discuss the pre-copy live migration technique, in which all active memory pages are transferred and then execution
takes place. Downtime and overall migration time are significant problems in pre-copy migration. To manage these issues, several methodologies are available.
In this chapter, we analyze the challenges in various pre-copy live migration techniques and explore the way to reduce downtime and migration time.
Keywords: IoMT, virtual machine, cloud computing, live migration,
pre-copy technique
*Corresponding author: gowthamvenkyy@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (45–72) © 2022 Scrivener Publishing LLC
3.1 Introduction
Cloud computing has been recognized as one of the mainstream computing paradigms of recent decades, offering user-based storage, processing power, and software services. Owing to its numerous advantages, business applications are deployed on cloud platforms, which has driven its popularity. With the enormous growth in cloud service demand, service overloading and maintenance events are very frequent. These issues can be addressed through live migration of a virtual machine (VM) [11] from one data center to another. Cloud service providers normally utilize the pre-copy method for VM migration. The efficiency of a VM migration technique is determined by three parameters: the size of the VM, the dirty rate of the running application, and the total number of migration iterations. We analyze these key factors in this study and determine the strategy that improves the performance of the VM migration technique.
3.1.1 Cloud Computing
Cloud computing provides on-demand IT resources through the internet on a pay-as-you-go pricing basis. Instead of buying or owning a physical data center, the customer can access data through the cloud as required. Several applications connected to cloud computing for transferring data among users are depicted in Figure 3.1.
Figure 3.1 Cloud computing: devices such as mobiles, tablets, laptops, smartphones, servers, and databases connected through the network to the cloud.
3.1.1.1 Cloud Service Provider
Three service models are available to users through service providers: Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Some of the major providers are Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform.
1. Software as a Service: The cloud service provides software to customers for a limited period or for a lifetime. The customer does not need to download the software and can use it directly through a browser.
2. Platform as a Service: It provides a framework for the customer where they can develop their application or customize
the pre-built application.
3. Infrastructure as a Service: It supports the customer to rent
the IT infrastructure such as a server, OS, storage, and VM
from the cloud.
3.1.1.2 Data Storage and Security
In cloud-based data security, the data are stored on internet-connected servers managed by data centers, thereby preventing illegal access to the data. The data center secures the user's data and provides secure access. The cloud is categorized into three types, public, private, and hybrid, as illustrated in Figure 3.2.
(a) Public cloud: The public cloud is suited to unstructured data such as files in folders. It is not preferable for customers who need customized storage.
(b) Private cloud: Private cloud provides a single-tenant environment for a specific organization. It provides an enhanced
security level for the organization data.
(c) Hybrid cloud: Hybrid cloud is the integration of public and
private cloud. It is the balance between affordability and customization. Many companies prefer it for keeping common
data on public and sensitive data on the private cloud.
Figure 3.2 Types of cloud: hybrid, private, and public.
For security, advanced firewalls are used to analyze the content and integrity of each packet traveling from source to destination. The firewall inspects packet contents and raises alerts about security threats in the data. Since cloud data is accessible by many users at a time, intrusion detection is essential, and cloud providers can apply multiple levels of detection to verify whether an authorized user is logging in. Providers harden the cloud server by supplying initial network defenses. User logs are monitored to build a narrative of network events, which helps prevent security breaches. Internal firewalls ensure that a compromised account does not have full access to the stored data, which boosts the security level. Finally, encryption keeps the data safe from unauthorized users: hackers cannot access files in cloud storage without the key.
3.1.2 Virtualization
Virtualization is the process of converting physical computing objects (servers and network gear) into software-based alternatives [10]. Virtualization technology allows various operating systems (OSs) to run on a single PC or server, enabling the host machine to support OSs with different characteristics, as represented in Figure 3.3. For example, suppose you have five physical computers (PCs), each with its own OS and related applications. Hypervisor software allows the resulting VMs to be managed and organized as multiple OSs on a single PC.
Figure 3.3 Virtualization: applications and OSs run inside VMs (VM 1, VM 2) on top of a virtual machine monitor (VMM)/hypervisor, which runs on the hardware.
Each VM conveniently shares memory, network bandwidth, processor cycles, and other resources depending on demand. The five PCs are combined onto a single computer, yet they operate independently.
3.1.2.1 Virtualization Terminology
3.1.2.1.1 Virtual Machines
The main building block of virtualization is the VM, which adds an extra OS, with its software, on a single device. The work of a VM is completely independent of the host, and its processing does not affect host performance. Examples of guest OSs are Windows 8, Windows 10, macOS, etc.
3.1.2.1.2 Virtual Server
Numerous servers are available to a VM for running server-based applications, such as Windows Server.
3.1.2.1.3 Virtual Network Interface Card
Software that emulates the behavior of an Ethernet adapter; it has its own MAC address, through which Ethernet packet transmission takes place.
3.1.2.1.4 Virtual SCSI Adapter
Software that emulates a SCSI adapter; it generates SCSI commands and can be attached to multiple virtual disks.
3.1.2.1.5 Virtual CPU
Software that emulates a physical CPU, categorized as software-emulated or software-modified.
3.1.2.1.6 Virtual Disk
Similar to a physical disk, it contains files, sets of files, and software, and is presented as a SCSI disk.
3.1.2.1.7 Virtual Machine Monitor
The VM monitor (VMM) controls all execution of the VMs. It notifies each VM when it may use the physical resources of the host.
3.1.3 Approach to Virtualization
A VM is built from the physical server's features, configured via network ports with a number of processors, RAM, storage resources, and networking. The hypervisor supports the OS in transferring data between the host and destination node in an efficient way. The configuration files of a VM describe its attributes. These files contain details such as the number of virtual processors (vCPUs) allocated, the RAM allocated, which types of I/O devices can be accessed, how many Network Interface Cards (NICs) are allowed in a virtual server, etc. Because a VM is constructed from a set of files, it is possible to copy an entire server, including its OS, applications, and hardware configuration. VMMs are classified into two types, which are illustrated in Tables 3.1 and 3.2.
Table 3.1 Type 1 VMM approach (layers, top to bottom).
VM1 (OS1 + Applications) | VM2 (OS2 + Applications)
Virtual Machine Monitor
Shared Hardware

Table 3.2 Type 2 VMM approach (layers, top to bottom).
VM1 (OS1 + Applications) | VM2 (OS2 + Applications)
Virtual Machine Monitor
Host Operating System
Shared Hardware
3.1.4 Processor Issues
In a virtual environment, there are two approaches to providing processor resources: emulate a chip in software, or provide a share of processing time on the physical processors (pCPUs). Most virtualization hypervisors offer pCPU shares to users. In processor analysis, it is essential to determine the virtual processors allocated to each VM. Owing to Moore's law, current processors can work four times faster than the original physical server's. For sizing a VM, numerous resource-monitoring tools are available that analyze the VM's usage capacity.
3.1.5 Memory Management
The hypervisor handles the VMs' page sharing, and the VM is unaware of the physical memory management mechanism. Ballooning is one of the powerful methods of memory management: it triggers the balloon driver to inflate, forcing the guest to flush pages to disk. Deflation takes place after inflation is complete, when the reclaimed memory has been cleaned up and is ready to be allocated to another VM.
3.1.6 Benefits of Virtualization
• Centralized resource management
• Reduced cost
• Better backup and disaster recovery
• Faster rollout of new applications
• Improved migration process
3.1.7 Virtual Machine Migration
Migration is a vital feature of modern VM technology that enables users to switch to another OS without interrupting the host node. It provides an efficient means of system maintenance, load balancing, and fault tolerance, used in clusters and data centers for analysis purposes. In addition to tracking the usage of a VM's resources, such a technique needs to be mindful of the impact of VM migrations when remapping VMs to physical machines from time to time. VM migration is divided into live and non-live migration. Live migration is the transition of the application, without disconnecting it, between the current
machine and the target machine, while non-live migration is the technique in which the current machine ceases operating and then moves all of its software to the target machine.
Figure 3.4 Phases of virtual machine migration: live migration comprises a pre-copy phase (warm-up) and a post-copy phase (stop-copy).
Live migration reduces the source
machine’s downtime. Live migration has been further separated into two
phases which are depicted in Figure 3.4. The pre-copy and post-copy
migration strategies that are introduced in the hypervisor play a crucial
role in the migration of VMs from one data center to another.
The migration time plays a vital role in differentiating pre-copy [12]
and post-copy methods. Two methods used in pre-copy migration are
warm-up and stop-and-copy.
Apart from live and non-live migration, there are other concerns to address in VM migration, such as hyper-jacking mitigation [9], VM escape mitigation, unsafe VM migration, VM sprawl mitigation, guest OS vulnerabilities mitigation, and denial-of-service mitigation.
3.1.7.1 Pre-Copy
Pre-copy transfers all VM data to the target VM in an iterative manner: the pages modified during one iteration are transferred in the next. This iterative process continues as long as the maximum iteration count is not exceeded; once the count is reached, the VM sends the remaining memory pages to the target VM [14].
3.1.7.2 Post-Copy
The post-copy method initiates execution at the destination node after transferring a minimum number of pages. Post-copy minimizes the total migration time and downtime, but it produces service degradation caused by page faults, which must be resolved from the source node over the network.
3.1.7.3 Stop and Copy
The stop-and-copy process occurs in the non-live migration method. At the end of data fetching, the source node is paused and transfers all memory pages to the destination. After the complete transfer of all pages, the destination VM resumes and executes the VM data.
Live migration easily achieves the transfer of a running VM from host to destination and is considered seamless in the digital environment. Numerous hypervisors perform live VM migration, both commercial, such as VMware, and open source, such as Xen and KVM [4]. The performance of live migration is measured by total pages transferred, downtime, total migration time, and overhead.
3.1.7.3.1 Total Page Transfer
Total page transfer represents the complete number of pages transferred to the destination node in n iterations:

V_mig = ∑_{i=0}^{n} V_i

where V_mig is the total number of pages transferred during migration and V_i is the number of pages transferred in the i-th iteration.
3.1.7.3.2 Downtime
Downtime is the time taken by the migration process to terminate the VM on the host node and start executing it on the target node. This value depends on the pages of the last iteration.
3.1.7.3.3 Total Migration Time
Total migration time is characterized as the complete time taken for the
information to move from host to destination node.
3.1.7.3.4 Overhead
It is a transfer of additional pages during migration.
R_d = V_mig / V_mem

where R_d is the redundancy ratio, V_mig is the total data transferred during migration, and V_mem is the virtual memory size.
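The interplay of these metrics can be illustrated with a small simulation of the pre-copy arithmetic: each iteration takes time proportional to the pages sent, and the pages dirtied during that time must be re-sent. The page counts, dirty rate, and bandwidth are illustrative assumptions, not measurements from the chapter.

```python
# A simple sketch of pre-copy live migration arithmetic: each iteration
# re-sends the pages dirtied during the previous transfer. Page counts,
# dirty rate, and bandwidth are illustrative assumptions.

def simulate_precopy(vmem_pages, dirty_rate, bandwidth, max_iters=30,
                     stop_threshold=50):
    """dirty_rate and bandwidth are in pages per second."""
    to_send = vmem_pages          # iteration 0 sends the whole memory
    total_sent = 0
    for _ in range(max_iters):
        t = to_send / bandwidth                    # time for this iteration
        total_sent += to_send
        to_send = min(vmem_pages, dirty_rate * t)  # pages dirtied meanwhile
        if to_send <= stop_threshold:              # small enough: stop-copy
            break
    downtime = to_send / bandwidth                 # final stop-and-copy
    total_sent += to_send
    total_time = total_sent / bandwidth
    redundancy = total_sent / vmem_pages           # R_d = V_mig / V_mem
    return total_sent, downtime, total_time, redundancy

v, d, t, r = simulate_precopy(vmem_pages=100_000, dirty_rate=5_000,
                              bandwidth=50_000)
print(f"pages={v} downtime={d:.4f}s total={t:.3f}s Rd={r:.3f}")
```

With a dirty rate well below the bandwidth, the iterations converge quickly and R_d stays close to 1; as the dirty rate approaches the bandwidth, redundancy and downtime grow.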
3.2 Existing Technology and Its Review
In cloud computing, numerous problems have to be dealt with in VM migration, such as load balancing, error handling, low-level device management, and reducing energy consumption during data transfer. In this section, we review several existing works on pre-copy live migration.
Kasidit et al. [1] presented a concept for improving pre-copy migration of VMs in a cloud computing environment. Pre-copy migration is used in most hypervisors, but it is not considered ideal for CPU-intensive and memory-intensive workloads. The workload determines the VM operation, which is processed based on configuration parameters such as downtime; this can cause the VM to take a long time to migrate, or long downtime, and it also affects automated VM migration in the cloud. To overcome these issues, memory-bound pre-copy live migration is implemented on the VM computation in its present state, where dirty and non-dirty pages are separated.
Problem Definition: In the above article, the technique was not validated against different VM workloads.
Gursharan Singh et al. [2] address the memory starvation problem: migration increases the storage space required on the host. To overcome this, they suggested a technique that reduces the size of the data image stored on the host before migration. Based on a probability factor, unwanted data are removed from the data images. This so-called memory reusing technique was evaluated using CloudSim 3.0, Java NetBeans IDE 8.0, and MySQL. The evaluation showed that the memory size of the image is reduced to 33% based on the threshold level and probability factor.
Problem Definition: In this article, pre-copy migration techniques are used to transfer data from the host to the destination node. Analysis of the results shows that the technique gives only an approximate value of memory reduction. To improve the accuracy of the prediction, the authors suggest the post-copy method for the same task: in post-copy there is no need to transfer the whole data and store the images, so it provides better results than the pre-copy technique.
Adithya et al. [3] analyze the factors affecting pre-copy VM migration in cloud computing and state that VM migration is influenced by VM size, migration iterations, and the dirty rate of the running application. Based on the simulation results, half the iterations of pre-copy migration are adequate to decrease migration time and downtime.
Problem Definition: In the above article, the evaluation was thorough and was done in terms of downtime and migration time using the MATLAB 9.2 simulation tool, but no mechanism is conveyed for controlling the migration delay. We suggest that data reduction techniques (content redundancy elimination, compression, container-based virtualization) would resolve these problems and give faster migration with less migration delay.
Yi Zhong et al. [4] presented a pre-copy approach in which the system optimizes memory state transfer during the iteration period. The system is designed to predict hot dirty pages accurately by periodically analyzing historical dirty page information; the pages' dirty weights determine the transfer order in the VM. The simulation results show that the optimization method is far better than the traditional method in terms of downtime, total migration time, and total number of transferred pages.
Problem Definition: In the above article, they did not mention how much time it takes to complete the migration process.
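The intuition behind hot-page prediction can be sketched as follows: pages with a high historical dirty frequency are deferred to the final stop-and-copy round instead of being re-sent every iteration. The dirty probabilities, page counts, and thresholds are synthetic, not from the cited work.

```python
import random

# Sketch of the hot-dirty-page idea: pages with a high historical dirty
# frequency are predicted to be dirtied again, so re-sending them every
# iteration is wasted work; a hot-aware strategy defers them to the
# final stop-and-copy round. All probabilities are synthetic.

N_PAGES = 1000
# Per-page probability of being dirtied during one iteration:
# the first 50 pages are "hot", the rest are "cold".
DIRTY_PROB = [0.9 if i < 50 else 0.01 for i in range(N_PAGES)]

def pages_sent(iterations=5, defer_hot=False, hot_threshold=0.5, seed=1):
    rng = random.Random(seed)          # same seed -> same dirtying pattern
    sent = N_PAGES                     # iteration 0: full memory copy
    dirty = set()
    for _ in range(iterations):
        dirty |= {i for i in range(N_PAGES) if rng.random() < DIRTY_PROB[i]}
        resend = {i for i in dirty
                  if not (defer_hot and DIRTY_PROB[i] > hot_threshold)}
        sent += len(resend)
        dirty -= resend                # sent pages are clean again
    sent += len(dirty)                 # final stop-and-copy of leftovers
    return sent

print("naive:", pages_sent(), "hot-aware:", pages_sent(defer_hot=True))
```

Since a hot page would be re-sent almost every iteration anyway, deferring it cuts total pages transferred at the cost of a slightly larger final round.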
Ashu Dadrwal et al. [6] introduced a pre-copy-based live migration technique with a new checkpoint algorithm: checkpoints are created and the last checkpoint values are saved in a reserved memory location on each VM separately. If any fault occurs, the work can be recovered from the checkpoints; otherwise, the migration process completes normally. The threshold and workload values are compared, and if the workload value is greater than the threshold, the pages are transferred to the new host during migration.
Problem Definition: In the above article, the memory pages are preserved if a fault occurs, but no steps are described for resolving those faults.
Hongliang Liang et al. [5] proposed a novel pre-copy strategy that reduces the total number of memory pages transferred. Historical and current statistics of the memory pages are collected periodically, and from these the frequency (weight) of each dirty page is calculated. This calculation determines whether a memory page is frequently updated and should be skipped in the current iteration. The dirty rate of the memory pages is weighed against the transfer rate between the source and destination nodes. The optimized strategy, implemented in the ImpLM tool, significantly reduces the total data transferred and the migration time.
Problem Definition: In the above article, the improvement in live migration output is small and should be further improved by an optimization method.
Praveen Jain et al. [7] proposed a pre-copy method for transferring VM data during migration. During processing, the pages are divided into two groups: modified and unmodified. The modified pages are further partitioned, using the historical iterations of each page, into highly modified and normally modified pages; each page keeps its own history. The normal pages are left aside, and for the highly modified pages a bit value is maintained to track each page's status. The pages are transferred to the target VM according to their bit values. The evaluation, using the CloudSim simulator, showed significantly reduced downtime and total migration time.
Ruchi et al. [8] presented a solution to overcome the load-balancing problem for a cloud provider by implementing a multiphase pre-copy live migration technique. In the first phase, the host node transfers all the memory pages to the destination node. In the second phase, the history of each page is analyzed, and the sending procedure is carried out based on it. An AR forecasting approach is used to predict page behavior, which helps the system decide whether to send a page. Overall performance is measured using the CloudSim simulation tool.
3.3 Research Design
In this research analysis, the performance of VM pre-copy live migration is evaluated on four metrics: total migration time, downtime, total data transferred, and iteration count. Here, we focus on balancing the loads in cloud data centers by applying the VM migration technique.
Pre-Copy Virtual Machine Migration
57
For an effective migration process, migration time and downtime should be minimal [7]. Numerous techniques have been implemented for pre-copy live migration; among them, selected techniques are analyzed and their metric parameters evaluated for research purposes, as explained below.
3.3.1 Basic Overview of VM Pre-Copy Live Migration
The pre-copy migration transfers the source machine's memory pages to the destination node without stopping the execution of the VM. The memory pages are fetched to the destination through an iterative process in which updated pages are re-sent to the destination node. The migration time depends upon the rate at which pages are updated during the transaction in the VMs. The major benefit of this approach is that the destination node receives all updates from the host machine. However, some pages are updated frequently, which may cause poor VM migration performance. The pre-copy algorithm, which represents the flow of data transfer between host and destination nodes, is shown in Figure 3.5. In the algorithm, consider a VM running on Host A, where the resources are maintained at phase 0. At phase 1, the reservation process occurs when the target host is connected.
The VM initially selects a container to receive all the pages. At phase 2, iterative pre-copy is carried out: shadow paging is enabled and the dirty pages are iterated over successive rounds of data copying. When the VM goes out of service (downtime), the system stops and redirects the traffic to Host B, then synchronizes the remaining VM state to Host B. At phase 4, the VM state on Host A is released, and the VM runs normally on Host B at phase 5. Normal operation resumes on Host B, which is connected to the local devices. The timeline of pre-copy migration is illustrated in Figure 3.6. There are three stages in the live migration approach: warm-up, pre-copy, and stop-and-copy. Live migration offers various advantages, including load balancing, server consolidation, and hotspot and cold-spot migration.
In VM migration, the VMs are transferred from host to destination, but performance depends on the downtime and total migration time. A major problem of pre-copy migration is that it sends the same pages several times due to page modification, which increases the downtime and migration time. To overcome this problem and improve the performance of pre-copy live migration, we analyze the workings of different techniques and evaluate their metric parameter performance, as listed below.
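The iterative round structure just described (copy everything once, resend pages dirtied during each round, then a final stop-and-copy) can be sketched as a small simulation. This is a toy model, not code from the chapter: the page count, dirty probability, and the stopping rule are illustrative assumptions.

```python
import random

def precopy_migrate(total_pages, dirty_prob, pages_per_round, max_rounds, seed=0):
    """Simulate iterative pre-copy: send all pages, then resend pages
    dirtied during each round until the dirty set is small or rounds
    run out. Returns (rounds_used, pages_sent_total, downtime_pages)."""
    rng = random.Random(seed)
    to_send = total_pages          # round 1 sends every page
    pages_sent = 0
    rounds = 0
    while rounds < max_rounds and to_send > pages_per_round // 10:
        pages_sent += to_send
        rounds += 1
        # pages dirtied while this round was being copied must be resent
        to_send = sum(1 for _ in range(total_pages) if rng.random() < dirty_prob)
    # stop-and-copy: remaining dirty pages are sent while the VM is paused
    downtime_pages = to_send
    pages_sent += to_send
    return rounds, pages_sent, downtime_pages

rounds, sent, down = precopy_migrate(total_pages=4096, dirty_prob=0.02,
                                     pages_per_round=4096, max_rounds=30)
print(rounds, sent, down)
```

With a high dirty probability the loop hits `max_rounds` and the stop-and-copy set stays large, which mirrors the downtime problem the following techniques try to address.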
Figure 3.5 Pre-copy algorithm (Phase 0, pre-migration: active VM on Host A, an alternate physical host may be preselected, block devices mirrored and free resources maintained; Phase 1, reservation: a container is initialized on the target host; Phase 2, iterative pre-copy: shadow paging enabled, dirty pages copied in successive rounds, with copying overhead; Phase 3, stop-and-copy: VM suspended on Host A during downtime, ARP generated to redirect traffic to Host B, remaining VM state synchronized; Phase 4, commitment: VM state on Host A released; Phase 5: VM resumes normal operation on Host B and connects to local devices).
Figure 3.6 Timeline for pre-copy (the total migration time spans the live preparation, iterative copy rounds 1…N, the stop-and-copy phase with its downtime, and the resume time).
3.3.2 Improved Pre-Copy Approach
In this method, a bitmap page that tracks the frequently updated pages is added to Xen 3.3.0 [13]. At the end of each iteration, the updated pages are recorded in a bitmap, which prevents the migration time from increasing with the data transferred, as depicted in Figure 3.7. To achieve this, the memory pages in Xen are categorized into three kinds of bitmap pages: TO_SEND_ON, TO_SKIP_OUT, and TO_FIX. These bitmap pages are described below.
• TO_SEND_ON: this bitmap marks the dirty pages that have to be transferred in the current iteration.
• TO_SKIP_OUT: this bitmap points to the pages that have to be skipped in the current iteration.
• TO_FIX: this bitmap represents the pages that have to be transferred in the last iteration.
The pages categorized under the TO_SKIP_OUT bitmap are moved to the TO_SEND_LAST bitmap page, and those pages are transmitted only at the last iteration. Finally, the updated memory pages are transferred via the TO_SEND_LAST bitmap page.
Figure 3.7 Improved pre-copy live migration (the source host's VM memory pages are split into dirty and clean pages; frequently updated pages are transmitted to the destination host only in the last iteration).
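As a toy sketch of the three-bitmap classification above (the set-based "bitmaps" and the fixed update-frequency threshold are illustrative assumptions, not the Xen implementation):

```python
def classify_pages(dirty_now, update_counts, freq_threshold):
    """Split the currently dirty pages into the bitmaps of the improved
    pre-copy approach: frequently updated pages are skipped now and
    deferred to the final iteration; the rest are sent immediately.
    `dirty_now` is a set of page ids; `update_counts` maps page id to
    how often that page has been dirtied so far."""
    to_send_on = set()    # send in the current iteration
    to_skip_out = set()   # skip now, resend later
    for p in dirty_now:
        if update_counts.get(p, 0) >= freq_threshold:
            to_skip_out.add(p)
        else:
            to_send_on.add(p)
    # pages deferred until the last iteration (TO_SEND_LAST in the text)
    to_send_last = set(to_skip_out)
    return to_send_on, to_skip_out, to_send_last

send, skip, last = classify_pages({1, 2, 3, 4},
                                  {1: 5, 2: 1, 3: 7, 4: 0},
                                  freq_threshold=3)
print(sorted(send), sorted(skip))   # pages 2 and 4 go now; 1 and 3 wait
```

The design point is that a hot page is copied exactly once, at the end, instead of once per round.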
3.3.3 Time Series–Based Pre-Copy Approach
The pre-copy approach is unsuitable when many pages are dirtied, because dirty pages are transmitted repeatedly across cycles while the VM's memory is under read and write activity. To overcome this problem, the time series–based pre-copy approach [15] was put forth, which identifies the high dirty pages within each cycle. The historical statistics of the dirty pages are stored in the TO_SEND_ON bitmap, which avoids further repetition of the pages across iterations. These high dirty pages are transmitted to the destination node only at the final iteration, reducing the iteration count, migration time, and downtime. The approach meets the user's Service-Level Agreement (SLA) and thereby increases the transmission rate between host and destination. TO_SEND_ON_H is an array of historical bitmaps, where N is the maximum size of the time series and K is the threshold for a dirty page. Let p be a memory page, TO_SEND_ON the dirty pages, and TO_SKIP_OUT the dirty pages skipped in the iteration. The TO_SEND_ON_H array helps decide whether p is sent or not. To avoid consuming excessive migration time, the time series–based approach provides sufficient detail about the dirty pages in each iteration. An overview of the approach is depicted in Figure 3.8. The pages are classified into high and low dirty pages, and whether a page is high dirty, and hence whether it is sent, is determined by Equation (3.1):
Σ_{i=1}^{N} (p ∈ to_send_on_h[i]) ≥ K    (3.1)
Figure 3.8 Iteration of time series–based pre-copy live migration (over iterations 1 to n, the source host's memory pages are classified as clean, dirty, or high dirty; high dirty pages are sent to the destination host only in the final iteration).
Algorithm of Time Series–Based Pre-Copy
Input:
K: threshold of high dirty pages
N: the size of the time series of dirty pages
to_send_on: dirty pages of the previous iteration
to_skip_out: dirty pages of the current iteration
to_send_on_h: historical dirty pages
Begin
1 Send all memory pages in the first round;
2 to_send_on ← dirty pages;
3 to_skip_out ← null; i ← 0;
4 For every iteration do
5 For each page p do
6 If (p∈to_send_on and p∈to_skip_out) or
7 (p∉to_send_on and p∈to_skip_out) or
8 (p∉to_send_on and p∉to_skip_out) then
9 Continue;
10 Else if (p∈to_send_on and p∉to_skip_out) then
11 If Condition (3.1) holds then
12 Continue;
13 Else goto 17;
14 Else if (last_cycle and p∈to_send_on) then
15 goto 17;
16 Else Continue;
17 Send page p to the target host;
18 End For
19 to_send_on_h[i] ← to_send_on;
20 i ← (i+1) % N;
21 to_send_on ← to_skip_out;
22 Update to_skip_out;
23 End For
End
Initially, all the memory pages are sent to the destination node, where N is the size of the time series of dirty pages and K is the dirty-page threshold. TO_SEND_ON provides data about the previous iteration, and TO_SKIP_OUT provides data about the current iteration. Based on the dirty-page arrays, the data are shared between host and destination in an iterative manner.
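Condition (3.1) amounts to counting how often a page appears in the last N dirty bitmaps; a minimal sketch of the send decision under that reading (the list-of-sets history is an illustrative representation):

```python
def is_high_dirty(page, history, k):
    """Equation (3.1): a page is 'high dirty' (deferred to the final
    iteration) if it appears in at least k of the last N dirty bitmaps."""
    return sum(1 for bitmap in history if page in bitmap) >= k

def pages_to_send(to_send_on, to_skip_out, history, k, last_cycle=False):
    """Decide which pages of the previous iteration's dirty set are sent
    now, following the branch structure of the algorithm above."""
    out = []
    for p in to_send_on:
        if p in to_skip_out:            # dirtied again already: wait
            continue
        if is_high_dirty(p, history, k) and not last_cycle:
            continue                    # frequently dirty: defer
        out.append(p)
    return out

history = [{1, 2}, {1, 3}, {1, 2}]       # N = 3 past bitmaps
print(pages_to_send({1, 2, 3}, {3}, history, k=3))   # [2]
```

Page 1 is deferred because it was dirty in all three recorded bitmaps; in the last cycle it would be sent regardless.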
3.3.4 Memory-Bound Pre-Copy Live Migration
The major drawback of pre-copy live migration is that it requires a maximum tolerable downtime. In pre-copy, the downtime is set to a default value, which cannot support high-performance computing (HPC) applications. To resolve this issue, the VM needs automatic configuration to accomplish the migration within a bounded period of time. Memory-bound pre-copy live migration (MPLM) does not need the maximum tolerable downtime parameter for VM live migration; it considers the current state of the VM computation without imposing a downtime constraint [1].
The memory pages are split into dirty and non-dirty sets, and the migration is divided into three stages: the startup stage, the live migration stage, and the stop-and-copy stage. In the startup stage (S1), MPLM examines the dirty-page generation in VM memory. In the live migration stage (S2), the memory pages are partitioned into dirty and non-dirty sets, which are moved in a multiplexed style to the destination. When all the non-dirty pages have been moved to the destination, the VM enters the stop-and-copy stage (S3): the VM halts for a short time and transfers the remaining dirty pages to the destination node. MPLM is more successful and efficient than plain pre-copy. The I/O thread is the main thread of MPLM, which connects with users; the VCPU cores are responsible for workload execution, which is monitored by the VCPU threads. At Epoch 0, MPLM transfers the non-dirty pages using the Bitmap_sync(0) function. Inv is the interval between data transfers, set to 3 seconds by default. The approach uses less migration time, and the migration thread operates over a series of epochs. The migration time is determined by the generation of dirty pages during live migration, as shown in Figure 3.9.
3.3.5 Three-Phase Optimization Method (TPO)
To resolve the overhead problem, an optimization algorithm is applied at each iteration, reducing the number of memory pages transferred between hosts and destinations. The method is split into three phases: Phase 1, Phase 2, and Phase 3. The To_SEND_h[i] array stores all the pages modified during the iteration process [16]. Based on a threshold value, the less modified pages are transferred earlier, and the remaining pages are transferred at the last, in the third phase. In Phase 1, the first iteration transfers all the data, and the TPO passes only unmodified pages to Phase 2.
Figure 3.9 Structure and operations of MPLM (I/O threads, migration threads, and VCPU threads; stages S1–S3; Bitmap_sync(0) through Bitmap_sync(4) delimit Epochs 0–3, separated by the interval Inv).
If all the pages are modified, their historical statistics are stored in the To_SEND_h[i] array. In Phase 2, iterations 2 to n − 1 are handled, during which the TPO traces the frequently updated pages. The dirty pages are divided into two groups, G1 and G2. The threshold value T1 is determined from the highest and lowest page modification rates, as given in Equation (3.2):
T1 = [max_i(modification rate in page i) + min_i(modification rate in page i)] ÷ 2    (3.2)
T1 is calculated from the To_SEND_h[i] array, which efficiently reduces repetitive page transfers and hence the migration time. Phase 3 is a stop-and-copy phase in which the remaining pages are copied once the stopping conditions are met. If the dirty-page count is more than 1.5 times that of the previous iteration, the pages are compressed before transmission; run-length encoding is used to compress the pages for transfer to the destination node. The structure of TPO is shown in Figure 3.10.
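A sketch of the Phase 2 threshold of Equation (3.2) and the Phase 3 run-length compression; the grouping rule and the byte-level page encoding here are simplified assumptions:

```python
def tpo_threshold(mod_rates):
    """Equation (3.2): T1 is the midpoint of the highest and lowest
    page modification rates."""
    return (max(mod_rates.values()) + min(mod_rates.values())) / 2

def split_groups(mod_rates):
    """Phase 2: dirty pages below T1 (group G1) are sent early; pages
    at or above T1 (group G2) are deferred."""
    t1 = tpo_threshold(mod_rates)
    g1 = [p for p, r in mod_rates.items() if r < t1]
    g2 = [p for p, r in mod_rates.items() if r >= t1]
    return g1, g2

def run_length_encode(page_bytes):
    """Phase 3: run-length encoding of a page before transmission."""
    out = []
    for b in page_bytes:
        if out and out[-1][0] == b:
            out[-1][1] += 1
        else:
            out.append([b, 1])
    return out

rates = {"p1": 0.9, "p2": 0.1, "p3": 0.4}
print(tpo_threshold(rates))             # (0.9 + 0.1) / 2 = 0.5
print(split_groups(rates))              # p2 and p3 early, p1 deferred
print(run_length_encode(b"aaab"))       # [[97, 3], [98, 1]]
```

Run-length encoding pays off on memory pages because they often contain long runs of identical bytes (for example, zero-filled regions).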
Figure 3.10 Structure of TPO (Phase 1: iteration 1; Phase 2: iterations 2 to n − 1; Phase 3: stop and copy).
3.3.6 Multiphase Pre-Copy Strategy
To resolve the load-balancing problems for the cloud provider, the multiphase pre-copy live migration approach is implemented, as shown in Figure 3.11. The first phase transfers all the clean pages. In the second phase, the historical behavior of the pages is mapped by an Auto-Regressive (AR) forecasting prediction model, which decides whether to send each page [17].
Figure 3.11 Structure of multiphase pre-copy live migration (Phase 1: iteration 1; Phases 2 and 3: iterations 2 to n − 1; Phase 4: prediction model and stop and copy).
In the third phase, the less modified pages are sent, and the volume of highly modified pages is minimized by the AR forecasting approach. The threshold value for each dirty page is calculated from its page modification rate. Finally, the remaining pages are transferred in the fourth phase. AR is a time series–based prediction algorithm that estimates the present value as a weighted sum of several prior values plus an error term, AR(p), as given in Equation (3.3):
AR(p): X_t = m_0 + m_1·x_{t−1} + ⋯ + m_p·x_{t−p} + err_t    (3.3)
where err_t is white noise and {x_1, x_2, x_3, …} is the time series.
The history of each page is maintained in SEND_TO and SKIP_TO: if SEND_TO = 0 and SKIP_TO = 1, the page will be sent in the next phase; otherwise, it is not sent in the current iteration. The SEND_TO_H array determines the pages that have to be sent in the current iteration, where pages are categorized into the most and least modified. Let K be the number of times the page has been updated and N the size of the array. If a page's history contains many 1 values, it is assigned as a high dirty page, and vice versa, based on Equation (3.4):
K = Σ_{i=1}^{N} (p ∈ SEND_TO_H[i])    (3.4)
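The AR(p) estimate of Equation (3.3) and the dirty count of Equation (3.4) can be sketched as follows; the coefficients are supplied by hand here, whereas in practice they would be fitted to each page's history:

```python
def ar_predict(coeffs, history, intercept=0.0):
    """Equation (3.3): AR(p) estimate X_t = m0 + sum_i m_i * x_{t-i}.
    `history` holds the most recent observations, newest first."""
    return intercept + sum(m * x for m, x in zip(coeffs, history))

def dirty_count(page, send_to_h):
    """Equation (3.4): K = number of historical bitmaps that contain
    the page; a large K marks the page as high dirty."""
    return sum(1 for bitmap in send_to_h if page in bitmap)

# predict a page's next dirtying rate from its last three observations
print(ar_predict([0.5, 0.3, 0.2], [1.0, 0.0, 1.0]))   # 0.5 + 0.0 + 0.2 = 0.7
print(dirty_count("p", [{"p"}, set(), {"p"}]))        # 2
```

A page whose predicted rate stays high can thus be deferred before it is ever retransmitted, rather than after several wasted rounds.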
3.4 Results
We researched the challenges and issues of the pre-copy live migration techniques and discuss the performance of live migration by analyzing the migration time, downtime, total data transferred, and iteration count. This study provides an elaborated perspective on workload balancing in cloud data centers.
3.4.1 Findings
We analyzed the improved pre-copy approach and found that it reduced the transferred-data ratio and the average migration time. Each machine has 4 GB of memory for data storage, and the guest VM memory size varies from 64 MB to 1,024 MB. To test the improved pre-copy approach, the MUMmer program is run in the VM, which increases the workload because aligning genomes is memory intensive. Network traffic during data transmission is reduced by 10% (1,024 MB) and 63% (64 MB) [13]. This approach is highly suitable for low-bandwidth wide area networks (WANs).
The approach gives good results in data transmission, with less data per round. Its average migration time makes it highly attractive for an administrator running VMs in a cluster or data center. However, the downtime ratio is higher than in the plain pre-copy method because of the transmission of the duplicate pages in the last iteration. Overall, it provides low migration time with low overhead. The performance metric parameters evaluated for the improved pre-copy technique are plotted in Figure 3.12. The evaluation shows that the improved pre-copy method, used with small VMs with 4 GB of memory for data transmission, provides low migration time for small amounts of data transferred between host and destination.
In time series–based pre-copy migration, both low and high dirty pages are considered. The guest's memory size varies from 128 MB to 1,024 MB for testing purposes. The approach focuses on improving migration time by avoiding duplicate pages; the downtime ratio here is lower than in the improved pre-copy approach, and it meets the users' SLAs [15]. Xen is installed as the hypervisor with a Network File System (NFS) service.
The time series–based metric parameters are calculated and depicted in Figure 3.13. The migration time increases as more dirty pages are generated, and two different hypervisor configurations are used to evaluate the metric parameters (Xen for low dirty pages; improved Xen for high dirty pages). This method provides good results, but it needs different hypervisors to achieve this performance.
Figure 3.12 Pre-copy vs. improved pre-copy (page transfer, total migration time, and downtime, in seconds).
Figure 3.13 Low dirty and high dirty pages based on time series (page transfer, total migration time, downtime, and iterations, in seconds).
MPLM targets high-performance computing (HPC) applications, whose maximum tolerable downtime must be satisfied to attain good performance; the dirty-page generation correlates with the downtime during live migration of the VM [1]. MPLM supports the data center in performing live migration automatically. The host and client VMs run on Ubuntu 14.04, NFS is utilized for the MPLM implementation, and each VM is configured with 8 VCPUs and 8 GB of RAM for data transmission, which lets it attain the maximum tolerable downtime parameter. The pre-copy approaches waste migration resources and bandwidth because of the default tolerable downtime. MPLM acts as a middle way, generating efficient migration and downtime for each benchmark automatically. Thus, MPLM provides lower migration time and downtime, letting HPC applications transmit data quickly.
For TPO, four different workloads were used to test live migration performance: Idle, Kernel compile, Memtester, and Stress. Creating the virtualization environment requires 4 GB of RAM, an Intel Core i5-2400 CPU @ 3.10 GHz, and VT-x technology enabled. The host machine runs CentOS 6.3 with Xen 4.3.0, and files are transferred through the NFS protocol. Iteration count is a key performance factor in pre-copy live migration. Log files store all the necessary data about the metric parameters during migration, and CloudSim supports the analyzer in measuring the efficiency of the approaches [16]. The performance of the techniques is tested under the standard workloads described below.
• Idle: a booted CentOS OS without any running application.
• Kernel-built: the kernel source is compiled within the VM, providing an intensive workload.
• Memtester 4.3: tests the memory subsystem under faults and imposes an in-memory workload.
• Stress 1.0.1: provides a high workload for testing purposes by imposing a configurable amount of memory, CPU, and I/O load.
The overhead value is reduced efficiently, and bandwidth is utilized effectively for live migration. The TPO is evaluated based on the metric parameter values depicted in Figure 3.14. For higher workloads, the TPO attains better total page transfer (701%), total migration time (70%), and downtime (3%) than the pre-copy method.
The multiphase pre-copy live migration was developed based on the TPO, and the CloudSim simulator was used to measure the efficiency of these approaches. It reduces the migration time and the number of transferred pages, and it has been implemented in a real cloud for performance evaluation [18]. The metric parameters calculated for the multiphase approach are depicted in Figure 3.15.
Figure 3.14 Pre-copy vs. TPO (page transfer, total migration time, and downtime, in seconds).
Figure 3.15 TPO vs. multiphase (page transfer, total migration time, and downtime, in seconds).
Based on the simulation results of the pre-copy live migration techniques, some methods improve the migration time and overhead values and reduce the memory pages transferred. In the multiphase technique, the AR forecasting method helps reduce the live migration time. The memory-bound technique provides an automatically tolerable downtime for VM live migration, which is highly acceptable to cloud service providers. As per our research analysis, the combination of multiphase and MPLM provides a balanced environment for both the cloud and the hypervisor during live migration, and binding the two also improves the four metric parameter performances. We are researching this combined technique to achieve fewer iterations, fewer page transfers, and lower migration time and downtime; it will be highly beneficial for transferring VMs among data centers efficiently.
3.5 Discussion
3.5.1 Limitation
Even though we analyzed the VM pre-copy live migration techniques, some issues remain unsolved and need further improvement. Some techniques achieve good migration time but can be used only for small amounts of data transfer. To attain efficient live migration, we need a method that can maintain the workload with low migration time and downtime. VMs have different configuration patterns and memory sizes, and it is difficult to stop the iteration phases at the right time. To balance the workload in the cloud, users need an optimal termination approach suited to all types of OS configuration and memory size. The method must integrate with the cloud automatically, without manual support for data transmission, and must then be evaluated on overall application execution and resource allocation performance.
3.5.2 Future Scope
We integrate the multiphase and MPLM methods to improve pre-copy live migration. Thanks to the AR forecasting approach, the multiphase method avoids repetitive pages, while the memory-bound method generates an automatically tolerable downtime based on the workload. The multiphase method not only decreases the migration time but also minimizes the pages transferred, thereby properly utilizing the VM's resources. Unlike the traditional pre-copy method, MPLM provides automatic configuration of the maximum tolerable downtime based on the data center and VM. Building on these benefits, we plan to implement a VM architecture that supports all types of system configuration and memory size for data transfer, which will improve the overall performance of pre-copy live migration. Combining these two techniques provides a prominent solution to the problems of pre-copy live migration. We are researching how to integrate the multiphase-MPLM technique effectively to attain better live migration results; if this approach succeeds, we will add different kinds of VM workloads to evaluate its performance.
3.6 Conclusion
Cloud computing offers many benefits to users and cloud providers through VM migration: it helps users manage their hardware performance, data, and load balancing efficiently, and the mobility of VMs among data centers removes many barriers for users. Plenty of research is available on improving live migration of VMs. We concentrate on pre-copy–based live migration because it transfers selective pages during transactions rather than duplicating pages, whereas post-copy live migration produces numerous duplicate pages during execution, which causes serious errors in live migrations. Pre-copy live migration concentrates on effectively transferring the memory pages and then providing the execution
process in the destination node. We researched numerous pre-copy live migration methods and their metric parameters, such as migration time, iteration count, downtime, and page transfer. Still, some modifications are needed to improve the live migration method. From our research analysis, we concluded that live migration can be improved by combining the advantages of the multiphase and MPLM methods. The multiphase evaluation shows that it provides lower migration time, downtime, and iteration counts in the VM, while MPLM automatically assigns a maximum tolerable downtime parameter for computation-intensive and memory-intensive workloads. Therefore, the integration of multiphase-MPLM will be a promising solution for improving the pre-copy live migration technique.
References
1. Chanchio, K. and Yaothanee, J., Efficient Pre-Copy Live Migration of
Virtual Machines for High Performance Computing in Cloud Computing
Environments. 3rd International Conference on Computer and Communication
Systems, 2018.
2. Singh, G., Behal, S., Taneja, M., Advanced Memory Reusing Mechanism
for Virtual Machines in Cloud Computing. 3rd International Conference on
Recent Trends in Computing 2015, Proc. Comput. Sci., 57, 91–103, 2015.
3. Bhardwaj, A. and Krishnab, C.R., Impart factors of affecting VM migration technique for cloud computing. ICN3I-2017, Proceedings, vol. 18,
ScienceDirect, pp. 1138–1145, 2019.
4. Zhong, Y., Xu, J., Li, Q., Zhang, H., Liu, F., Memory State Transfer
Optimization for Pre-copy based Live VM Migration. IEEE Workshop on
Advanced Research and Technology in Industry Applications (WARTIA), 2014.
5. Liang, H., Dai, H., Pei, X., Zhang, Q., A New Pre-Copy Strategy for Live
Migration of Virtual Machines. International Conference on Identification,
Information and Knowledge in the Internet of Things, 2016.
6. Dadrwal, A., Nehra, S., Khan, A.A., Dua, M., Checkpoint based Live
Migration of Virtual Machine. International Conference on Communication
and Signal Processing, India, April 3-5, 2018.
7. Jain, P., and Agrawal, R., An Improved Pre-copy Approach for Transferring the
VM Data during the Virtual Machine Migration for the Cloud Environment.
Int. J. Eng. Manuf., 6, 51–59, 2016.
8. Shukla, R., Gupta, R.K., Kashyap, R., A Multiphase Pre-copy Strategy for the
Virtual Machine Migration in Cloud. Proceeding of the Second International
Conference on SCI 2018, vol. 1.
9. Venkata Subramanian, N., Saravanan, N., Shankar Sriram, V.S., Survey on
Mitigation Techniques of Virtualization Technique. ARPN J. Eng. Appl. Sci.,
12, 2, 2017.
10. Kumar, D.K., Sarachandrica, T.P., Rajasekhar, B., Jayasankar, P., Review on
Virtualization for Cloud Computing. Int. J. Adv. Res. Comput. Commun.
Eng., 3, 8, 48, August 2014.
11. Sunitha Rekha, G., A Study on Virtualization and Virtual Machines. Int. J.
Eng. Sci. Invention (IJESI), 7, 5, 51–55, 2018.
12. Shukla, R., Gupta, R.K., Kashyap, R., A Multiphase Pre-copy Strategy for
the Virtual Machine Migration in Cloud, in: Smart Innovation, Systems and
Technologies, vol. 104, 2018.
13. Ma, F., Liu, F., Liu, Z., Live Virtual Machine Migration based on Improved Precopy Approach, in: IEEE International Conference on Software Engineering
and Service Sciences, 2010.
14. Cui, W. and Song, M., Live Memory Migration with Matrix Bitmap Algorithm. IEEE 2nd Symposium on Web Society, 2010.
15. Hu, B., Lei, Z., Lei, Y., Xu, D., Li, J., A Time-Series Based Precopy Approach
for Live Migration of Virtual Machines. IEEE 17th International Conference
on Parallel and Distributed Systems, 2011.
16. Sharma, S. and Chawla, M., A Three Phase Optimization Method for Precopy Based VM Live Migration. SpringerPlus, 5, 1022, 2016.
17. Shukla R., Gupta R.K., Kashyap R., A Multiphase Pre-copy Strategy for
the Virtual Machine Migration in Cloud, in: Smart Intelligent Computing
and Applications. Smart Innovation, Systems and Technologies. Satapathy
S., Bhateja V., Das S. (eds), vol. 104. Springer, Singapore, 2019. https://doi.
org/10.1007/978-981-13-1921-1_43
4
Estimation and Analysis of Prediction Rate
of Pre-Trained Deep Learning Network in
Classification of Brain Tumor MRI Images
Krishnamoorthy Raghavan Narasu*, Anima Nanda, Marshiana D., Bestley Joe
and Vinoth Kumar
Sathyabama Institute of Science and Technology, Chennai, Tamilnadu, India
Abstract
Early detection and classification of brain tumors is very important in clinical practice. In recent decades, deep learning has gained increasing interest in various fields such as image classification, self-driving cars, natural language processing, and healthcare applications, as it solves complex problems in a more effective and efficient manner. A deep network is a stack of layers comprising convolutional layers, activation functions, and decision-making layers. In this article, the AlexNet, GoogleNet, and ResNet101 networks are used to classify MRI images into four classes: normal, glioma, meningioma, and pituitary tumors.
The dataset consists of 120 MRI images across the four classes. The pre-trained networks are used to classify the images in three different ways. In the first case, 10 sample images are considered, and the prediction rate and the training and validation times are recorded. Similarly, in the second and third cases, 20 and 30 images are used to evaluate the metrics. The results show that with more images the processing time increases, yielding better accuracy. The highest accuracy of 91% is achieved by the ResNet101 network with a processing time of 6 minutes. Researchers can further reduce this processing time and increase the accuracy rate by applying image augmentation techniques to the raw MRI images.
Keywords: Electroencephalography, convolutional neural network, GoogleNet, balanced accuracy, AlexNet, confusion matrix, ResNet
*Corresponding author: moorthy26.82@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (73–98) © 2022 Scrivener Publishing LLC
4.1 Introduction
The most complex organ in the human body is the brain, which, together with the spinal cord, forms the central nervous system. It controls the major activities of the body by collecting information, processing it, and coordinating with other parts of the body through billions of neurons. It receives input from the sense organs and makes decisions that are sent as instructions to the rest of the body. Medical imaging technologies such as functional neuro-imaging and electroencephalography (EEG) recordings are important in studying the brain.
A brain tumor occurs when abnormal cells form within the brain. There are two types of tumors: malignant (cancerous) and benign (non-cancerous). Tumor cells damage healthy cells and impair normal brain activity. Benign tumor cells grow slowly and originate inside the brain; their distinguishing feature is that they do not spread to organs other than the brain, which is why they are also termed non-progressive. Malignant tumor cells, on the other hand, are of the progressive type and can spread to any part of the human body. Depending on the mode of origination, malignant tumors are further classified into primary and secondary: primary malignant cells originate in the brain itself, whereas secondary ones can originate anywhere in the body and affect the functionality of the brain.
To detect and classify the type of tumor cells, researcher relies on the
Magnetic Resonance Imaging (MRI). It is one of the oldest technique
through which the brain activity can be analyzed and proper treatment will
be given if the patient is affected by the tumor cells. MRI image has very
high resolution which gives information about the abnormalities of the
brain and also provides the clear structure of the brain. This will enhance
the doctors and researchers to perform various analyses and give the proper
treatment to the patients. Over the last few decades, neural networks and
support vector machine approach plays a vital role in the detection and
classification of brain tumor cells. Recently, due to the processing speed
and user friendly nature, deep learning technique is widely applied in various applications including the healthcare. It consists of series of activation
function, convolutional layer, and pooling layers. This will easily out to
represent even the complex relationship of the input images. Convolutional
Neural Network (CNN), Recurrent Neural Network (RNN), and K-nearest
Neural Network (KNN) are the new architecture which are widely used by
the researchers in various medical applications.
Deep learning is a subset of machine learning and it has set of layers which
are trained for classification or regression purpose. It plays a vital role in many
DL Network in Classification of Brain Tumor
75
applications like driverless cars, to distinguish a pedestrian from a lamppost
and enabling to recognize a stop sign. In real-time scenario, our human brain
is trained to perform lot of work. In child hood, our brain is getting trained by
parents by showing some objects and naming them. After when we grow up,
we started learning our-self through the training phase. Similar way, the deep
learning is working; sets of data are feed to the pre-trained networks during the
training phase. Once training has been done successfully, it can able to predict
or classification of the new set of data during the validation phase. The various applications of DL in the medicine field are analyses of medical insurance
fraud crimes, prediction of Alzheimer’s diseases, image diagnostics, determination of genome defects, detect heart problems, and tracking the glucose
level of diabetic patients. It is also applied to non-medical applications such
as mobile advertising, natural language processing, and visual art processing.
4.2 Classes of Brain Tumors
Brain tumors are basically classified into three types, namely, meningioma,
gliomas, and pituitary tumor. Meningioma is also referred as meningeal
tumor. It is slow-growing tumor which forms as a layer in the brain and
spinal cord. Women are most affected then men in the ratio of 1:2 due to
their weight and excess fat in the body. It causes Neurofibromatosis type 2, a
genetic disorder and exposure to radiation. Second type is brain tumor is glioma and it occurs in the spinal cord and brain. Glial cells surround the nerve
cells and make them works properly. It is actually begin in the gluey supportive cells. Classifications of gliomas are based on its types and its genetic features. Gliomas comprise about 30% of all brain tumors and central nervous
system tumors and 80% of all malignant tumors. Glioma is further classified
into Astrocytomas, Ependymomas, and Oligodendrogliomas. The treatment
procedure for glioma includes targeted therapy surgery, chemotherapy, radiation therapy, and experimental clinical trials.
Abnormal growth in the pituitary gland results in the pituitary tumors.
Tumor cells make the glands to segregate lower levels of hormones and
thereby result in the improper function of human body. Generally this type
of tumor cells is non-cancerous and measure in large scale in the size of
about 1 cm or even more. Tumor cells having size of more than 1 cm are
termed as macroadenomas and less than 1 cm are referred as microadenomas. Due to its larger size compared to other type of tumor cells, it puts
pressure on the pituitary gland and its supporting structures. The various
symptoms of these tumors are listed in Table 4.1.
76
The Internet of Medical Things (IoMT)
Table 4.1 Various symptoms of brain tumors.
Causes
Meningioma
Glioma
Pituitary tumor
Symptoms
• Blurred vision
• Weakness in arms
and legs
• Numbness
• Speech problems
• Headaches
• Seizures and
dementia
• Loss of bladder
control
• Headache
• Vomiting
• Decline in brain
function
• Memory loss
• Personality
changes
• Difficulty with
balance
• Urinary
incontinence
• Blurred vision
• Speech
difficulties
• Seizures
• Headache
• Vision loss
• Over
functioning
• Deficiency
• ACTH tumor
• Growth
hormone–
secreting tumor
• Prolactin
secreting tumor
• Thyroidstimulating
hormone
secreting tumor
4.3 Literature Survey
Detection of breast cancer [1] has been tested massively using deep learning
technique. Assessment of deep learning tools carried out by both academic
and community imaging radiologists. The deep learning model showed
promising result compared to personal investigation by radiologist. A total
of 2,174 sample images were taken and validation of deep learning model
was done. The proportion of assessed mammograms by radiologist was
dropped from 47% to 41% after implementation of deep learning model.
A rapid growth of application of deep learning in the field of radiology
was studied in [2–5]. Using mammographic breast density parameter [6,
7], one can easily estimate the presence of breast cancer. In [8], radiologist
accepted that deep learning has plays a vital role in the medical filed after
proper real-time investigation.
MRI images are affected by various factors such as noise, low image resolution, and quality of MRI devices. To overcome these drawbacks and to
classify the class of MRI images as either benign or malignant, single image
super resolution technique [9] was proposed. The input image is segmented
using Maximum Fuzzy Entropy Segmentation (MFES) value, and later,
the classification of image is done through pre-trained neural networks.
Features of images are extracted by the CNN block of residual neural network (ResNet) architecture, followed by that support vector machine are
DL Network in Classification of Brain Tumor
77
used for classification of class of the MRI images. Combination of both
SISR and MFES showed promising improvement in the performance of
image segmentation. Other existing classification models for brain tumor
classification are sparse representation [10], support vector machines [11],
ANFIS [12], transfer learning and fine-tuning [13], concatenated random
forests [14], superpixel-based classification [15], watershed [16], and deep
CNN [17].
Astrocytoma is a class of brain tumor falls as the subclass in the glioma type. Radiologist finds difficulty in predicting this type of cancer. To
easy out the prediction process, the input MRI image needs to be carefully
pre-processed and its features has to be extracted correctly [18]. Digital
image processing and machine learning comes together to do this. In Back
Propagation Neural Network (BPNN) algorithm, determination of weights
of neurons was a crucial part. Weights are basically adjusted in the direction of steepest descent rule. Dolphin clicks [19] are modeled and it was
used as carrier signal to demonstrate the effective data transmission in
underwater communication.
In [20], Scaled Conjugate Gradient was used for calculating the neuron weights. In addition to this, 19 Principal Components Analysis (PCA)
and PNN (21) were employed followed by the BPNN. Over-fitting arises in
classifying the Low Grade Glioma (LGG) and High Grade Glioma (HGG)
cancer images. To overcome this, CNN operated as patched using 3*3 kernels [22]. Almost 450,000 and 350,000 patched were employed to train the
CNN architecture for HGG and LGG classification. The investigation of
BPNN and CNN [23] architecture was carried out for grading multiphase
MR images.
The various stages of computer aided methods in the brain tumor
diagnosis involve detection, segmentation, and classification processes.
Recently, interest has developed in using DL techniques for diagnosing brain tumors with better robustness and accuracy. A comprehensive
study of deep learning in MRI image classification [24] was carried out. It
plays a roadmap for future research and can apply different strategy and
evolve their findings. Performance of pre-trained network AlexNet [25]
was tested using its CNN blocks. Discrete Wavelet Transform (DWT) was
combined with the PCA [26] for feature extraction in the brain tumor MRI
classification process.
In [27], new method was designed by combining the concepts of
Stationary Wavelet Transform and Growing CNNs for the brain tumor
image segmentation. Comparative analysis also carried out between this
method and SVM and CNN. The results concluded that performance metrics such as accuracy, MSE, and PSNR will gives better than the CNN and
78
The Internet of Medical Things (IoMT)
SVM. K-Nearest Neighbor (K-NN) [28] has duplicate property in sense
that it has quiet long processing time. It is executed at run time when the
training data-set is large. This disadvantage has overcome by applying
SVM [29] in training phase of KNN. SOM [30] and genetic algorithm [31]
in brain tumor classification also has promising result.
Deep learning technique also plays strong role in the underwater communication medium. To train the process, basically, deep learning requires
large set of dataset. It is available for air medium, whereas for underwater
medium, notable dataset is there. In [32], the authors proposed use of photorealistic synthetic imagery for training the network. SegNet also referred
as seep encoder-decoder network was trained using synthetic image which
has dimension of 960 × 540 pixel image for biofouling detection purpose.
The various other techniques and neural networks used for detection of
natural images are Trellis Coded Modulation–Orthogonal Frequency
Division Multiplexing (TCM-OFDM) [33], CNN [34], GoogleNet [35],
AlexNet [36], ResNet [37], and Adaptive Equalizer [38].
A novel-based CNN [39] was proposed for multi-grade brain tumor
classification. Initially, tumor regions were detected using deep learning
technique, and then, data augmentation was used for training the dataset.
Finally, pre-train CNN model was used for classification purpose. A class
of feed-forward artificial neural network (ANN) CNN has been extensively applied for various non-medical applications such as speech recognition [40], computer vision [41], authentication system [42], and image
processing [43].
4.4 Methodology
Deep learning concept is used in this article to perform a classification of
tumors cells using brain MRI images and its various performance metrics
are valuated. It aims to differentiate between normal with other three types
of tumors occurring in brain such as glioma, meningioma, and pituitary
tumors. The design flow of the proposed system is shown in Figure 4.1.
The raw MRI datasets are collected as .mat files. It consists of 120 .mat
file with four subclasses, each class carries 30 files. To train the images using
deep learning, one has to maintain the image in RGB format. Therefore,
these raw images are initially converted into gray format. Then, using
string concatenated method, we have attained the MRI images in RGB
format. The .mat file contains different pixel size, using resize technique,
all the datasets are modified into image of 240 × 240 pixels size. Sample
images of all the four classes are shown in Table 4.2.
DL Network in Classification of Brain Tumor
79
INPUT
DATASET
IMAGE
PREPROCESS
CLASSIFICATION USING
PRE-TRAINED NETWORK
(ALEXNET, GOOGLENET,
RESNET101)
NORMAL
MENINGIOMA
PITUITARY TUMOR
GLIOMA
Figure 4.1 Design flow of the proposed system.
Training phase has been carried out in three different categories. In the
first phase, only 10 images are considered, in the second phase, 20 images
are taken. In the last category, all 30 images are taken into account from
all subclasses for training phase. The splitting of images for training and
validation phases is in the ratio of 7:3. The input images are augmented and
stored in the augmented data store. The layers of the pre-trained networks
are modified and appropriate learning rates are applied for both training
and validation purpose. The few parameters considered for training the
images are minimum batch size of 10, Max Epochs as 4 with learning rate
of 10−4. Three different pre-trained networks, namely, AlexNet, GoogleNet,
and ResNet101 are applied to the dataset and its performance is evaluated.
Description of the networks is as follows.
• AlexNet: It consists of CNN, ReLu Activation function, and
pooling blocks which has been trained with more than a
million images from the ImageNet database. AlexNet contained eight layers; the first five were convolutional layers,
some of them followed by max-pooling layers, and the last
three were fully connected layers.
Pituitary
Tumor
Glioma
Meningioma
Class
Sample images
Table 4.2 Sample images used for classification purpose.
(Continued)
80
The Internet of Medical Things (IoMT)
Normal
Class
Sample images
Table 4.2 Sample images used for classification purpose. (Continued)
DL Network in Classification of Brain Tumor
81
82
The Internet of Medical Things (IoMT)
• GoogleNet: GoogleNet is a pre-trained CNN that is 22 layers
deep. You can load a network trained on either the ImageNet
or Places 365 datasets. The network trained on ImageNet
classifies data into 1,000 object categories, such as mouse,
keyboard, many animals, and pencil.
• ResNet101: ResNet101 is a CNN that is trained on more
than a million images from the ImageNet database. It is 101
layers deep. A ResNet is class of ANN which builds on constructs similar to pyramidal cells in the cerebral cortex.
The confusion matrix consists of 2 × 2 matrix. It is useful in the field
of machine learning for the problem of statistical classification. It is also
known as an error matrix. Each row of the matrix represents the instances
in predicted class while each column represents the instances in an actual
class. The four outcomes of classification are as follows.
• True positive (TP): These are cases in which we predicted as
disease, and they do have the disease.
• False positive (FP): We predicted as disease, but they do not
actually have the disease
• True negative (TN): We predicted as no disease, and they do
not have the disease.
• False negative (FN): We predicted no disease, but they actually do have the disease
The performance of the classifier is calculated in terms of accuracy,
recall, precision, and F-measure using the confusion matrix. The accuracy
is defined as the ratio of number of correctly labeled predictions to the
total number of predictions. It is given by
Accuracy
TP
TP TN
TN FP
FN
The sensitivity or recall is the number of correctly positive labeled predictions divided by the number of results that should have been returned.
It is given by
Sensitivity
TP
100
TP FN
DL Network in Classification of Brain Tumor
83
The prevalence is the number of actual yes condition the occurs in a
sample to the total number of predictions. It is given by
Precision =
TP
TP FP
Specificity is the ratio of correctly negative labeled by the program to the
number of all results. It is given by
Specificity =
TN
100
TN FP
The two scores are combined into the calculation of the F-scores. It is
given by
F1 score =
2 Precision Recall
Precision Recall
Balanced accuracy (Bac) is the sum of sensitivity and specificity to the
fraction of two. It is given by
Bac
Sensitivity
Specificity
2
The training progress and its corresponding confusion matrix for the
three pre-trained networks are simulated and its results are listed from
Tables 4.3 to 4.5.
The training process is performed with AlexNet, GoogleNet, and
ResNet101 for 10 images, 20 images, and 30 images under each classification. The output of different classes is obtained. Figure 4.2 illustrates
the Comparative analysis of Pre-trained networks in the classification
brain tumor images. Table 4.6 shows the comparison between AlexNet,
GoogleNet, and ResNet101. This table compares the validation accuracy
between the three architectures and time taken by each architecture to
compare 10, 20, and 30 images. Hence, AlexNet shows higher accuracy
compared to other. The more the images we input, the more the accuracy
we get and time taken for training also increases.
10
No. of sample
(Images)
0
0.5
1
1.5
2
0
20
40
60
80
100
Epoch 1
Epoch 1
0
0
1
1
2
2
Training process
3
3
Epoch 2
Epoch 2
4
Iteration
4
Iteration
Table 4.3 Performance of AlexNet pre-trained network.
Accuracy (%)
Loss
5
5
Epoch 3
Epoch 3
6
6
7
7
Epoch 4
Epoch 4
8
8
Final
Final
0.17
0
0
0.33
0.50
0
0
1
Confusion matrix
0.17
0.83
0.17
0
(Continued)
0.83
0
0
0
84
The Internet of Medical Things (IoMT)
20
0
0.5
1
1.5
2
0
20
40
60
80
100
0
Epoch 1
0
Epoch 1
2
2
4
4
Training process
Accuracy (%)
Loss
No. of sample
(Images)
6
6
Epoch 2
Epoch 2
8
8
10
12
Iteration
Epoch 3
10
12
Iteration
Epoch 3
14
14
Table 4.3 Performance of AlexNet pre-trained network. (Continued)
Epoch 4
16
Epoch 4
16
18
18
20
20
Final
Final
0
0.67
0
0
0.67
0
0
0
Confusion matrix
0
1
0
0.17
(Continued)
1
0
0.33
0.17
DL Network in Classification of Brain Tumor
85
30
No. of sample
(Images)
0
0.5
1
1.5
2
0
20
40
60
80
100
0
0
Epoch 1
Epoch 1
5
5
Training process
10
Epoch 2
10
Epoch 2
15
15
Iteration
20
Epoch 3
Iteration
20
Epoch 3
Table 4.3 Performance of AlexNet pre-trained network. (Continued)
Accuracy (%)
Loss
Epoch 4
25
Epoch 4
25
30
30
Final
Final
0.67
0
0
0.11
0
0
0.33
1
Confusion matrix
0
0.89
0
0
1
0
0
0
86
The Internet of Medical Things (IoMT)
10
0
0
0.5
1
1.5
0
0
20
40
60
80
100
Epoch 1
Epoch 1
1
1
Training process
Accuracy (%)
Loss
No. of sample
(images)
2
2
3
3
Epoch 2
Epoch 2
4
4
Iteration
Iteration
Table 4.4 Performance of GoogleNet pre-trained network.
5
5
Epoch 3
Epoch 3
6
6
Epoch 4
7
Epoch 4
7
8
8
Final
Final
0.33
1
0
0
0
0
0
0
Confusion matrix
0
1
0
0.67
(Continued)
1
0
0
0
DL Network in Classification of Brain Tumor
87
20
No. of sample
(images)
0
0
0.5
1
1.5
0
0
20
40
60
80
100
Epoch 1
Epoch 1
2
2
4
4
Training process
6
6
Epoch 2
Epoch 2
8
8
10
12
Iteration
Epoch 3
10
12
Iteration
Epoch 3
14
14
Epoch 4
16
Epoch 4
16
Table 4.4 Performance of GoogleNet pre-trained network. (Continued)
Accuracy (%)
Loss
18
18
20
20
Final
Final
0.17
0
0
0.33
0.5
0
0
1
Confusion matrix
0.17
0.83
0.17
0
(Continued)
0.83
0
0
0
88
The Internet of Medical Things (IoMT)
30
0
0
0
0.5
1
1.5
0
20
40
60
80
100
Epoch 1
Epoch 1
5
5
Training process
Accuracy (%)
Loss
No. of sample
(images)
10
Epoch 2
10
Epoch 2
15
15
Iteration
20
Epoch 3
Iteration
20
Epoch 3
25
Epoch 4
25
Epoch 4
Table 4.4 Performance of GoogleNet pre-trained network. (Continued)
30
30
Final
Final
0.22
0.11
0.22
0.67
0.22
0
0.56
0.33
Confusion matrix
0
0.78
0.11
0.11
0.67
0
0
0
DL Network in Classification of Brain Tumor
89
10
No. of sample
(Images)
0
0.5
1
0
20
40
60
80
100
0
0
Epoch 1
Epoch 1
1
1
2
2
Training process
3
3
Epoch 2
Epoch 2
4
Iteration
4
Iteration
5
5
Epoch 3
Epoch 3
Table 4.5 Performance of ResNet101 pre-trained network.
Accuracy (%)
Loss
6
6
7
7
Epoch 4
Epoch 4
8
8
Final
Final
0
0
0
0.67
0
0
0
0
Confusion matrix
0
1
0
0.67
(Continued)
1
0
0.33
0.33
90
The Internet of Medical Things (IoMT)
20
0
0
0
0.5
1
0
20
40
60
80
100
Epoch 1
Epoch 1
2
2
4
4
Training process
Accuracy (%)
Loss
No. of sample
(Images)
6
6
Epoch 2
Epoch 2
8
8
10
12
Iteration
Epoch 3
10
12
Iteration
Epoch 3
14
14
16
Epoch 4
16
Epoch 4
Table 4.5 Performance of ResNet101 pre-trained network. (Continued)
18
18
20
20
Final
Final
0
0.83
0
0
1
0.17
0
0.17
Confusion matrix
0
1
0
0
(Continued)
0.83
0
0
0
DL Network in Classification of Brain Tumor
91
30
No. of sample
(Images)
0
0
0
0.5
1
1.5
0
20
40
60
80
100
Epoch 1
Epoch 1
5
5
Training process
10
Epoch 2
10
Epoch 2
15
15
Iteration
20
Epoch 3
Iteration
20
Epoch 3
25
Epoch 4
25
Epoch 4
Table 4.5 Performance of ResNet101 pre-trained network. (Continued)
Accuracy (%)
Loss
30
30
Final
Final
0
0
0.11
0.67
0.22
0
0.11
0.78
Confusion matrix
0
1
0
0
0.89
0
0.11
0.11
92
The Internet of Medical Things (IoMT)
93
DL Network in Classification of Brain Tumor
COMPARISON OF ALEXNET, GOOGLENET AND RESNET 101
120
100
80
97
86
98
92
97
94
85
95
86 88
78
74
66
59
60
47
40
28 24
20
14
cy
Ba
la
nc
e
d
Ac
cu
ra
FSc
or
e
en
ce
al
Pr
ev
y
Sp
ec
ifi
cit
ty
vi
iti
ns
Se
Ac
cu
ra
cy
0
Alexnet Googlenet Resnet 101
Figure 4.2 Comparative analysis of pre-trained networks for brain tumor classification.
4.5 Conclusion
The detection of brain tumor at the early stage is important. Brain tumor
includes three types: meningioma, glioma, and pituitary tumor. In this
research, a technique has been proposed where the automatic detection
of brain tumor is involved and performed using deep learning where the
tumor can be classified at the early stage. In this article, we used three
types of architectures: AlexNet, ResNet101, and GoogleNet, accuracy and
confusion matrix is obtained. Finally, we compared which network gives
high accuracy than the other and classified the different types of tumor to
which category belongs to. The validation accuracy and confusion matrix
is obtained—10 images, 20 images, and 30 images for each architecture.
Many researches proposed different categories in deep learning. Heba
Mohsen et al. proposed a theory on classifying brain tumor images with
deep neural networks. They used DWT and DNN transforms for classifying, whereas we used three different types of architectures. Deep learning
has been advanced in various industries for object detection, mammography etc. Brian N. Dontchos et al., in 2020, had used a deep learning model
for predicting mammographic breast density in clinical practice.
The segmentation of glioma tumors was proposed by Saddam Hussian,
using a two- phase weighted training method and improves its performance parameters. Using transfer learning, the accuracy ranges to 90%,
whereas our proposed system classifies three types of architectures with
accuracy of 82%. Even if the accuracy less compared to transfer learning,
30
0.76
0.88
GoogleNet
ResNet101
0.72
ResNet101
0.91
0.76
GoogleNet
AlexNet
1
1
Resnet101
AlexNet
0.72
GoogleNet
20
1
AlexNet
10
Accuracy
Architecture
No. of samples
(Images)
0.77
0.44
0.77
0
0.66
1
1
0.33
1
Sensitivity
0.92
0.90
0.96
1
0.78
0.17
0.20
0.22
0
0.34
0.27
0.27
1
1
0.18
0.37
Prevalence
0.87
1
Specificity
Table 4.6 Comparison of performance metrics between AlexNet, GoogleNet, and ResNet101.
0.77
0.53
0.84
0.67
0.86
0.5
0
0.82
0.72
1
1
0.6
1
Balanced
accuracy
0.61
1
1
0.39
1
F-Score
94
The Internet of Medical Things (IoMT)
DL Network in Classification of Brain Tumor
95
Table 4.7 Evaluation of accuracy and processing time of pre-trained networks.
Network
No. of sample
images
Accuracy (in
percentage)
Processing time
(in minutes)
AlexNet
10
75
4
20
83.33
5
30
88.89
6
10
66.67
3
20
62.5
4
30
75
6
10
58.33
4
20
75
5
30
91.67
6
GoogleNet
ResNet101
deep learning is very cheap and can calculate its quantitative parameters
easily and diagnose the disease at very early stage of tumor. By this proposed project, the performance analysis is done and can be analyzed by
the doctor.
The performance of proposed method was validated based on accuracy,
prevalence, sensitivity, specificity, balanced accuracy, and F-measures,
which gave better accuracy than the other network. Table 4.7 shows the
accuracy and processing time of three different architectures with sets of
sample data. The accuracy value for AlexNet, GoogleNet, and ResNet101
pre-trained networks was calculated as 82.40%, 68.05%, and 75%, respectively. Hence, the classification method compared with AlexNet provided
higher accuracy than GoogleNet and ResNet101. The tumors are classified
successfully to their respective categories.
References
1. Dontchos, B.N., Yala, A., Barzilay, R., Xiang, J., Lehman, C.D., External
Validation of a Deep Learning Model for Predicting Mammographic Breast
Density in Routine Clinical Practice. Acad. Radiol., 27, 157–310, Feb 2020.
2. Bahl, M., Barzilay, R., Yedidia, A.B. et al., High-risk breast lesions: a machine
learning model to predict pathologic upgrade and reduce unnecessary surgical excision. Radiology, 286, 3, 810–818, 2017.
96
The Internet of Medical Things (IoMT)
3. Kohli, M., Prevedello, L.M., Filice, R.W. et al., Implementing machine learning in radiology practice and research. AJR Am. J. Roentgenol., 208, 754–760,
2017.
4. Lakhani, P. and Sundaram, B., Deep learning at chest radiography: automated classification of pulmonary tuberculosis by using convolutional neural networks. Radiology, 284, 574–582, 2017.
5. Geras, K., Wolfson, S., Shen, Y. et al., High-resolution breast cancer screening with multi-view deep convolutional neural networks, arXivorg, 2017, 6
Cornell University, November 2017, Epub.
6. Kallenberg, M. et al., Unsupervised Deep Learning Applied to Breast Density
Segmentation and Mammographic Risk Scoring. IEEE Trans. Med. Imag., 35,
1322–1331, 2016.
7. Wu, N., Geras Krzystof, J., Shen, Y. et al., Library CU (Ed.), arXiv.org, Cornell
University, 2020.
8. Lehman, C.D., Yala, A., Schuster, T. et al., Mammographic breast density
assessment using deep learning: clinical implementation. Radiology, 290, 1,
52–58, 2018.
9. Sert, E., ozyurt, F., Dogantekin, A., A new approach for brain tumor diagnosis system: Single image super resolution based maximum fuzzy entropy
segmentation and Convolutional Neural Network. Med. Hypotheses, 133,
109413, December 2019.
10. Yuhong, L., Fucang, J., Jing, Q., Brain tumor segmentation from multimodal
magnetic resonance images via sparse representation. Artif. Intell. Med., 73,
1–13, 2016.
11. Zhang, J., Ma, K., Er, M., Chong, V., Tumor segmentation from magnetic resonance imaging by learning via one-class support vector machine. Workshop
on advanced image technology, pp. 207–11, 2004.
12. Selvapandian, A. and Manivannan, K., Fusion based Glioma brain tumor
detection and segmentation using ANFIS classification. Comput. Methods
Programs Biomed., 166, 33–8, 2018.
13. Swati, Z.N.K., Zhao, Q., Kabir, M., Ali, F., Ali, Z., Ahmed, S. et al., Brain
tumor classification for MR images using transfer learning and fine-tuning.
Comput. Med. Imaging Graph., 75, 34–44, 2019.
14. Tustison, N., Shrinidhi, K.L., Wintermark, M., Durst, C.R., Kandel, B.M.,
Gee, J.C. et al., Optimal symmetric multimodal templates and concatenated
random forests for supervised brain tumor segmentation (simplified) with
ANTsr. Neuroinformatics, 13, 209–25, 2015.
15. Rehman, Z.U., Naqvi, S.S., Khan, T.M., Khan, M.A., Bashir, T., Fully automated multi-parametric brain tumour segmentation using superpixel based
classification. Expert. Syst. Appl., 118, 598–613, 2019.
16. Letteboer, M.M.J., Niessen, W.J., Willems, P.W.A., Dam, E.B., Viergever,
M.A., Interactive multi-scale watershed segmentation of tumors in MR brain
images. Proceedings of interactive medical image visualization and analysis,
pp. 11–6, 2001.
DL Network in Classification of Brain Tumor
97
17. Deepak, S. and Ameer, P.M., Brain tumor classification using deep CNN features via transfer learning. Comput. Biol. Med., 111, 103345, 2019.
18. Geethu, M. and Subashini, M.M., MRI based medical image analysis: Survey
on brain tumor grade classification. Biomed. Signal Process. Control, 39, 139–
161, January 2018.
19. Krishnamoorthy, N.R., Suriyakala, C.D., Subramaniom, M., Ramadevi, R.,
Marshiana, D., Kumaran, S., Modelling of dolphin clicks and its performance
analysis compared with adaptive equalizer in ocean environment. Biomed.
Res. J., 29, 12, 2454–2458, 2018.
20. Zhang, Y., Dong, Z., Wu, L., Wang, S., A hybrid method for MRI brain image
classification. Expert Syst. Appl., 38, 10049–10053, 2011.
21. Gaikwad, S.B. and Joshi, M.S., Brain tumor classification using principal
component analysis and probabilistic neural network. Int. J. Comput. Appl.,
120, 5–9, 2015.
22. Pereira, S., Pinto, A., Alves, V., Silva, C.A., Brain tumor segmentation using
convolutional neural networks in MRI images. IEEE Trans. Med. Imaging,
35, 1240–1251, 2016.
23. Pan, Y., Huang, W., Lin, Z., Zhu, W., Zhou, J., Wong, J., Ding, Z., Brain tumor
grading based on neural networks and convolutional neural networks. Eng.
Med. Biol. Soc (EMBC), 2015 37th Annu. Int. Conf. IEEE, pp. 699–702, 2015.
24. Abd-Ellah, M.K., Awad, A.I., Khalaf, A.A.M., Hamed, H.F.A., A review on
brain tumor diagnosis from MR images: Practical implications, key achievements, and lessons learned. Magn. Reson. Imaging, 61, 300–318, September
2019.
25. Abd-Ellah, M.K., Awad, A.I., Khalaf, A.A.M., Hamed, H.F.A., Two-phase
multi-model automatic brain tumour diagnosis system from magnetic resonance images using convolutional neural networks. EURASIP J. Image Video
Process, 97, 1, 1–10, 2018.
26. Mohsen, H., El-Dahshan, E.S.A., El-Horbaty, E.S.M., Salem, A.B.M.,
Classification using deep learning neural networks for brain tumors. Future
Comput. Inf. J., 3, 68–71, 2018.
27. Mittal, M., Goyal, L.M., Kaur, S., Kaur, I., Verma, A., Hemanth, D.J., Deep
learning based enhanced tumor segmentation approach for MR brain
images. Appl. soft computing, 78, 346–354, May 2019.
28. Vrooman, H.A., Cocosco, C.A., Lijn, F.V.D., Stokking, R., Ikram, M.A.,
Vernooij, M.W. et al., Multi-spectral brain tissue segmentation using automatically trained k-nearest-neighbor classification. NeuroImage, 37, 1, 71–81,
2007.
29. Ayachi, R. and Amor, N.B., Brain tumor segmentation using support vector
machines; symbolic and quantitative approaches to reasoning with uncertainty. Lecture Notes Comput. Sci., 5590, 2009, 736–747.
30. Logeswari, T. and Karnan, M., An improved implementation of brain tumor
detection using segmentation based on hierarchical self organizing map. Int.
J. Comput. Theory Eng., 2, 4, 1793–8201, 2010.
98
The Internet of Medical Things (IoMT)
31. Kharrat, A., Gasmi, K., Messaoud, M.B., Benamrane, N., Abid, M., A hybrid
approach for automatic classification of brain MRI using genetic algorithm
and support vector machine. Leonardo J. Sci., 9, 17, 71–82, 2010.
5
An Intelligent Healthcare Monitoring
System for Coma Patients
Bethanney Janney J.*, T. Sudhakar, Sindu Divakaran, Chandana H.
and Caroline Chriselda L.
Department of Biomedical Engineering, Sathyabama Institute of Science and
Technology, Chennai, Tamilnadu, India
Abstract
Real-time monitoring of changes caused by body movement is an essential tool. A patient-movement monitoring system tracks changes of motion in patients in a state of coma. Coma is a disorder of profound loss of consciousness that can have multiple causes, and clinically important changes can happen rapidly, sometimes with therapeutic implications. The purpose of this work is to provide a detailed analysis of the coma patient's EEG, number of eye blinks, hand movement, leg movement, heart rate, temperature, and oxygen saturation. A camera integrated with a Raspberry Pi is fitted to clearly distinguish any motion of the patient's eyes and to identify yawning. Patient records are preserved in the cloud for quick access and long-term review. The system examines the coma patient's vital signs on a continuous basis; whenever any movement occurs in the patient, the device identifies it and sends an alert message to the doctor and the central station through the IoMT. Upon processing, the vital signs expose dramatic changes in the coma and provide precise data about the causative agent and treatment plan. Consistent tracking and observation of these health parameters improves medical assurance and allows coma events to be tracked.
Keywords: Eye blinks, heart rate, temperature, oxygen saturation, EEG,
Raspberry Pi, cloud server, Internet of Medical Things (IoMT)
*Corresponding author: jannydoll@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (99–120) © 2022 Scrivener Publishing LLC
5.1 Introduction
A comatose subject has a total lack of alertness and is therefore unable to voluntarily feel, speak, hear, or move. Two significant neurological structures must function in order for a patient to achieve consciousness. One is the cerebral cortex, the gray matter that forms the outer layer of the brain. The other is the reticular activating system (RAS), which is situated in the brainstem [1].
Damage to one or both of these components is likely to lead to a coma. The cerebral cortex is a dense collection of gray matter composed of neurons whose axons constitute the white matter; it accounts for vision, the relay of sensory information through thalamic pathways, and many other neurological processes such as abstract thinking [2].
Coma can be caused by many kinds of issues. Drug toxicity accounts for about 40% of comatose states. Side effects of medications, including irregular heart rate and blood pressure as well as irregular respiration and sweating, can also harm the ARAS and induce coma indirectly. Considering that a large number of coma patients have been exposed to opioids, health facilities first examine all coma patients using the vestibulo-ocular reflex and by analysing pupil diameter and eye movement [2].
The second main cause is lack of oxygen, usually attributed to cardiac arrest, which accounts for about 25% of cases. Neurons of the central nervous system (CNS) need a great deal of oxygen. Lack of oxygen in the brain, known as hypoxia, causes sodium and calcium outside the neurons to decrease and intracellular calcium to rise, which is harmful to neuronal communication. Inadequate oxygen in the brain also causes ATP depletion and cell breakdown through cytoskeletal and nitric oxide damage [3].
About 20% of comatose states are due to the effects of a stroke, in which the flow of blood to part of the brain is restricted or blocked. An ischemic stroke, brain haemorrhage, or tumor may restrict blood flow. Lack of blood supply to brain cells prevents oxygen from reaching the neurons, thereby disrupting the cells and causing them to die. The death of brain cells may affect the working of the ARAS and further aggravate brain injury. In the remaining 15% of comas, trauma, severe blood loss, malnutrition, hypothermia, high glucose, and many other biological disorders are involved [4].
To assess brain injuries and diagnose comas, emergency doctors use physical exams as well as medical technology. After treating open wounds and establishing adequate respiration and blood flow to the brain, they assess the patient's medical records to check for medications and illnesses such as asthma, and for prior medical events such as strokes. Then, to assess the degree of consciousness, they examine the patient's reflexes. In addition, physicians take a blood sample to check blood counts, electrolytes, and glucose levels and to search for any residual drugs or toxins, including carbon monoxide [5].
Medical technology helps physicians to classify the location and severity of the injuries suffered. Haemorrhage, brain-stem trauma, and non-convulsive seizures, each an underlying cause of coma, are tested for by CT scans, MRI scans, and EEG tests. These also allow physicians to gauge the level of consciousness and to create appropriate treatment plans. Electroencephalography (EEG) is a type of electrophysiological monitoring that employs electrodes/sensors mounted on the scalp to monitor the electrical activity that occurs on the surface of the brain [6].
Blinking is a bodily function: a semi-autonomous process of fast eyelid closure. A quick blink is characterized by vigorous closure of the eyelid. The fundamental function of blinking is the spreading of tears and the removal of irritants from the corneal and conjunctival surfaces; blinking also keeps the eye lubricated. Factors such as fatigue, eye damage, medication, and disease can affect blink rate. The "blinking center" controls blink intensity, but blinking can also be affected by external stimuli [7].
A yawn is a reflex composed mainly of simultaneous inhalation of air and stretching of the eardrums, followed by an exhalation of breath. Yawning occurs most frequently in adulthood shortly before and after sleep and during tiring activities, and it is notably contagious. It is usually associated with fatigue, stress, drowsiness, boredom, or even hunger. In humans, yawning is often triggered by the perception that others are yawning. One possible function of lacrimation during yawning is to keep the eyes well lubricated during the pressure changes to which they are exposed. The flow of tears through the nasolacrimal duct can be sufficient to make the nose run [8].
The proposed system detects coma patients' yawning, brain waves, number of eye blinks, hand movement, leg movement, heart rate, temperature, and oxygen saturation. The EEG measures electrical impulses from the brain as voltage variations occurring within brain neurons. This activity appears on the computer screen, which is connected to the scalp electrodes, as output waveforms observed as voltages or as digital values of varying amplitude and frequency.
5.2 Related Works
Saadeh et al. developed a machine learning classification processor for reliable depth-of-anaesthesia (DOA) prediction, irrespective of the patient's age and anaesthetic agent. The classification is based entirely on six features derived from the EEG signal, and a machine-learning fine-tree classifier is implemented to achieve four-class DOA classification. A 256-point Fast Fourier Transform (FFT) accelerator is introduced to compute the SEF, beta-ratio, and FBSE features with reduced latency and high precision. The DOA processor is implemented in a 65-nm CMOS process, and FPGA verification is conducted with EEG measurements from 75 patients given different forms of anaesthetic medication [9].
Y. Cui et al. suggested a novel solution, Feature-Weighted Episodic Training (FWET), to fully remove the need for calibration. Measuring driver drowsiness levels from the electroencephalogram (EEG) signal can increase driving safety through preventive steps. The work incorporates two techniques: feature weighting, to learn the importance of particular features, and episodic training, for domain generalization. Experiments on EEG-based driver drowsiness evaluation showed that both feature weighting and episodic training are effective and can further improve generalization. FWET needs no labeled or unlabeled calibration data from a new subject and could be very useful for plug-and-play brain-computer interfaces [10].
Veena Tripathi et al. introduced a concept for hospitalized patients whose physiological condition requires continuous IoT-led monitoring. The Internet of Things (IoT) is strongly adopted in connected healthcare. In this approach, sensors collect comprehensive physiological data, gateways and clouds process and store the information, and the analysed data are transmitted wirelessly to be evaluated. The work provided an overview of this research area and of the sensing equipment used for medical surveillance, the role of wearable health-monitoring systems, and data collection and reporting based on different parameters [11].
Yusuf A.N.A. et al. proposed a system for patient safety monitoring and disease prevention called Mooble (Monitoring for the Better Life Experience). Mooble is composed of a Web client, a database and API architecture, and an Android mobile application. The work focused on the design and development of the mobile-device subsystem; its three major aspects are the design, production, and testing of the application. The software is designed using the RUP paradigm, resulting in a mobile application used by patients [12].
Bertrand Massot et al. developed continuous monitoring of many of the main patient parameters, including evaluation of autonomic nervous system function using non-invasive sensors, and the provision of information to patients for mental, tactile, cognitive, and physiological examinations, to improve consistency and effectiveness of health care at home and in hospital. This ambulatory system consists of a small wrist device linked to multiple sensors that detect autonomic nervous system activity through skin resistance, skin temperature, and heart rate. It can also store data or pass it to another device through removable media or wireless communication [13].
Purnima Puneet Singh designed and developed a reliable, energy-efficient patient monitoring system capable of sending patient parameters in real time. It enables physicians to track patient health parameters (temperature, pulse, ECG, and position) in real time. Patient status is constantly tracked by the proposed program, and the collected data is analyzed by a centralized ARM microcontroller. If a patient's health parameter falls outside the standard threshold values, an automatic SMS is sent to the doctor's pre-configured mobile number. The doctor can get a record of the patient's details by simply accessing the patient database, which is constantly updated via the Zigbee receiver module [14].
Qian D. et al. aimed to identify drowsiness during short daytime naps, to better grasp the intermittent rhythms of physiological conditions and to foster a healthy sense of alertness. A Bayesian-Copula Discriminant Classifier (BCDC) was applied to diagnose human drowsiness using physiological features derived from electroencephalograms (EEGs). In comparison to conventional Bayesian decision theory, the BCDC approach constructs the class-conditional probability density functions using copula theory and kernel density estimation. The proposed BCDC approach was tested on sample datasets and contrasted with other standard approaches for the diagnosis of drowsiness; the findings revealed that it surpassed those systems on three assessment criteria [15].
Cheolsoo Park et al. put forward a new method for the evaluation of asymmetry and brain lateralization by extending the Empirical Mode Decomposition (EMD) algorithm. The localized and adaptive architecture of EMD makes it highly suitable for estimating amplitude information across frequency for non-linear and non-stationary data. The research shows how the bivariate EMD extension (BEMD) allows enhanced spectrum estimates of multi-channel EEG recordings consisting of similar signal components. The proposed asymmetry estimation technique is evaluated on simulated data and on feature extraction for brain-computer interface (BCI) software [16].
Pimplaskar D. suggested an approach tested for eye-position tracking under high and low occlusion. For the machine-vision problem of real-time eye monitoring, a robust and accurate algorithm was proposed, based on centroid analysis for estimating eye position and path. Connected-component methodology and the centroid approach are used to track eyes and detect blinks with OpenCV, the open-source library originally built by Intel [17].
Gang Liu et al. assessed the significance of EEG reactivity (EEG-R) to quantifiable electrical stimulation for the early prognosis of outcomes in comatose cases. EEG was recorded in consecutive adult patients in coma after cardiopulmonary resuscitation (CPR) or stroke, and EEG-R was tested with standard electrical stimulation. Cerebral performance category (CPC) or modified Rankin scale (mRS) scores were assigned to each patient at 3 months of follow-up. Across all patients, EEG-R achieved 92.3% sensitivity, 77.7% specificity, 85.7% PPV, and 87.5% NPV. EEG-R to quantifiable electrical stimulation is a strong indicator of prognostic outcome in comatose patients following CPR or stroke [18].
5.3 Materials and Methods
The system is mainly intended for coma patients undergoing treatment in hospitals. The duration of time a person spends in a coma largely determines the degree of recovery, and a coma patient needs to be checked 24/7. Since the patient cannot realistically be monitored manually around the clock, sensors detect the coma patient's pulse rate, oxygen saturation (SpO2), and eye responses and register the patient's details. The data is then transferred to healthcare professionals, helping the doctor continue treatment so that patients recover quickly.
5.3.1 Existing System
In the existing system, the device used for coma patients in the ward uses a MEMS sensor to track leg motions. Eye-blink sensors in the form of glasses were used to detect eyeball activity, and Zigbee was used to convey information from the patient to the nurse station. The demerits of the system are that it does not provide real-time tracking, the design did not cover cloud computing, and real-time execution is limited; the study focused only on eye blink and the movement of the leg [14].

Figure 5.1 Block diagram (power supply; USB camera with OpenCV for eye blink and yawn detection; EEG electrodes, pulse sensor, temperature sensor, and spirometer feeding an AVR controller; USB-to-TTL converter to the Raspberry Pi; MATLAB output of the EEG pattern; display of heart rate, temperature, and SpO2 level; IoMT SMS API integration).
5.3.2 Proposed System
The block diagram shown in Figure 5.1 describes the overall design and each module to be implemented.
5.3.3 Working
The proposed automatic system monitors the patient's body temperature, heart rate, body movements, yawning, SpO2, and EEG pattern. The MAX30100 pulse-oximeter heart-rate sensor module and the eye and mouth motion-detection camera provide outputs that are received by the AVR microcontroller for processing. This information is passed to the Raspberry Pi module, which collects the signals and automatically sends an alert message to the caretaker whenever an abnormal action is detected. A USB digital camera tracks the coma patient's eye blinks; when an eye blink is detected, a fully automated message is sent to the caregiver through the Internet of Medical Things (IoMT). This notification is sent using a third-party SMS API. The brainwave patterns of coma patients are continuously monitored and processed in MATLAB; variations in the EEG signal are detected and reported to the doctors. Thus, the proposed system helps to protect coma patients effectively and to take immediate action whenever necessary.
The eye blink and yawn detection system is implemented primarily through two measures: the frequency of eye blinks and the number of yawns. On the basis of these two criteria, movement in a coma patient can be detected. Initially, the dependencies or libraries needed by the code are loaded; these support the different functionalities used when the algorithm runs. This machine-vision-based system tests for the coma patient's eye twitches and yawns by observing the face in real time; if movement is found, an alarm is generated. In this algorithm, a facial-landmark file is imported to locate the coordinates of a person's facial features. These coordinates are used when the contours are established and the distance ratios of the eyes and mouth are registered. Pattern and edge detection are implemented using Imutils library functions. The aspect ratio of the eye is calculated from the contours of the eye, from its left to its right corner; a Euclidean distance function is used to measure the distances between the eye landmarks. Likewise, for counting yawns, the mouth opening distance is also measured.
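The aspect-ratio computation described above can be sketched with the widely used eye-aspect-ratio (EAR) formula over the six eye landmarks of dlib's 68-point model; the threshold values shown are assumptions that would be tuned per camera setup:

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks ordered as in dlib's 68-point model:
    left corner, two upper-lid points, right corner, two lower-lid points.
    EAR = (||p2-p6|| + ||p3-p5||) / (2 * ||p1-p4||); it drops sharply
    when the eyelid closes, which is how a blink is counted."""
    a = math.dist(eye[1], eye[5])  # first vertical distance
    b = math.dist(eye[2], eye[4])  # second vertical distance
    c = math.dist(eye[0], eye[3])  # horizontal corner-to-corner distance
    return (a + b) / (2.0 * c)

def mouth_aspect_ratio(top, bottom, left, right):
    """Vertical mouth opening over mouth width; a large value sustained
    across consecutive frames is counted as a yawn."""
    return math.dist(top, bottom) / math.dist(left, right)

# Illustrative thresholds (assumptions; tuned empirically in practice).
EAR_BLINK_THRESHOLD = 0.2
MAR_YAWN_THRESHOLD = 0.6
```

In the full pipeline, the landmark coordinates would come from dlib's shape predictor on each camera frame; only the geometry is shown here.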
5.3.4 Module Description
5.3.4.1 Pulse Sensor
The heart-rate module carries Maxim's MAX30100 combined pulse oximeter and heart-rate monitor, shown in Figure 5.2. It is an optical sensor that obtains its readings by transmitting light at two wavelengths from two LEDs, red and infrared, and then measuring the absorption through pulsing blood with a photodetector. This particular LED colour combination is optimized for reading data from a fingertip. The signal is conditioned by a low-noise analog signal-processing unit and delivered over the I2C interface to the target MCU. Developers of end-user devices should remember that unnecessary movement and temperature changes will adversely affect the readings; likewise, excessive pressure can reduce the capillary blood supply and thus decrease data reliability. A programmable INT pin is also available. The module runs from a regulated 3.3-V supply [19].

Figure 5.2 Pulse oximeter and heart rate sensor.
5.3.4.2 Temperature Sensor
The LM35 temperature sensor (Figure 5.3) is an IC sensor whose analogue output voltage is proportional to temperature in degrees Celsius. The LM35 has an advantage over linear temperature sensors calibrated in Kelvin, as the user does not have to subtract a large constant voltage from the output to obtain a Celsius reading. These key features of the LM35 make it easy to interface with almost any form of circuit [20].
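The LM35's scale factor of 10 mV per degree Celsius makes the conversion from a raw ADC reading straightforward; the ADC resolution and reference voltage below are assumptions for illustration:

```python
def lm35_celsius(adc_value, adc_max=1023, vref=3.3):
    """Convert a raw ADC reading of the LM35 output to degrees Celsius.
    The LM35 produces 10 mV per degree C, so T = Vout / 0.010 V."""
    vout = (adc_value / adc_max) * vref   # ADC counts -> volts
    return vout / 0.010                   # volts -> degrees Celsius
```

For example, an output of 0.25 V corresponds to 25 °C.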
5.3.4.3 Spirometer
Figure 5.3 Temperature sensor.

Spirometry refers to a collection of basic measurements of an individual's respiratory capacity. It involves measuring the volume of air inhaled and exhaled by the lungs over a specified period of time to assess lung capacity; the instrument used for this purpose is called a spirometer. Such tests are helpful for assessing the pulmonary function of a coma patient. With a pneumatic spirometer, the evaluation typically involves the patient breathing into a hose or tube, one end of which holds a sensor that quantifies the airflow; the spirometer thus measures the amount of air that the lungs inspire and expire. In this system it is attached to a pressure sensor, and the average value and the spirometry ratio FEV1/FVC (in percent) are passed to the Raspberry Pi [21].
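The FEV1/FVC ratio mentioned above can be estimated from sampled flow by integrating flow over time: FEV1 is the volume exhaled in the first second, FVC the total exhaled volume. A minimal sketch, assuming the flow signal is already calibrated in litres per second:

```python
def fev1_fvc_percent(flow_lps, sample_rate_hz):
    """Estimate FEV1/FVC (%) from one forced expiration.

    flow_lps:       exhaled flow samples in litres/second
    sample_rate_hz: sampling rate of the flow sensor

    Volumes come from a rectangle-rule integration (sum of flow * dt);
    FEV1 uses only the first second of samples, FVC uses all of them.
    """
    dt = 1.0 / sample_rate_hz
    fev1 = sum(flow_lps[:int(sample_rate_hz)]) * dt
    fvc = sum(flow_lps) * dt
    return 100.0 * fev1 / fvc
```

A healthy adult typically exhales well over 70% of the forced vital capacity within the first second.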
5.3.4.4 OpenCV (Open Source Computer Vision)
The OpenCV library includes a variety of programming functions that provide computer-vision capability, and it can be used from languages such as C/C++ and Python. It supports a wide range of tasks, such as reading and writing images, displaying images in a window, adjusting image colour, resizing, rotating, thresholding, segmentation, edge detection, filtering, and contour extraction. OpenCV also interoperates with deep-learning frameworks such as TensorFlow and Caffe [22].
5.3.4.4.1 Imutils
The imutils library is primarily used to translate, rotate, resize, skeletonize, and display images conveniently using the OpenCV and matplotlib libraries. In this work, the coordinates of the facial landmarks, predefined in a .dat file, are imported through it.
5.3.4.4.2 Dlib
Dlib is a machine learning library used here specifically for the recognition of faces and facial landmarks. It allows multiple objects to be tracked in a single frame and can be used for general object detection. The library is a linear-algebra-based toolkit and supports C++ in addition to Python.
5.3.4.5 Raspberry Pi
The Raspberry Pi is used as the processing module for this system. The Raspberry Pi is a series of small single-board computers produced by the Raspberry Pi Foundation to promote basic computer science education in schools and developing countries. The initial version sold much better than planned and was marketed outside its target market, for example for robotics applications. The boards do not ship with peripherals (such as keyboards and mice) or cases, although some are included in various official and unofficial kits. Processor speeds range from 700 MHz up to 1.4 GHz for the Pi 3 Model B+, with 256 MB to 1 GB of RAM on board and storage on Secure Digital (SD/SDHC) cards in early models or MicroSDHC cards in later ones. Three or four USB ports are available on the boards; HDMI and composite video are supported for video output, with a standard 3.5-mm jack for audio output. A number of GPIO pins supporting common protocols such as I2C provide lower-level interfacing. Together, the Raspberry Pi and the IoT promise a highly effective healthcare system [23].
5.3.4.6 USB Camera
A webcam is a video camera that feeds or streams a picture or video in real time to or from a device over a network such as the Web. Webcams are typically small cameras that sit on a desk or attach to the monitor of a system, and webcam software lets users capture a picture or stream video over the Internet. Since video transmission over the Web needs a lot of bandwidth, such streams typically use compressed formats. The resolution of a webcam is often lower than that of other portable video cameras, because higher quality would be lost during transmission anyway; the lower quality makes webcams comparatively cheap compared to most video cameras, while the result is sufficient for video chat sessions. In healthcare, advanced webcams can measure variations in facial expression using basic algorithmic techniques. For video monitoring, webcams can be placed in locations such as hospital ICUs to track patients' movement and general condition [23].
5.3.4.7 AVR Module
The AVR microcontroller is used as a processing module that transfers data to the LCD screen and gateway modules. AVR is among the first microcontroller families to use on-chip flash memory for program storage, as opposed to the one-time-programmable ROM, EPROM, or EEPROM used by other microcontrollers at the time [24].
5.3.4.8 Power Supply
In this circuit, diodes form a bridge rectifier that delivers a pulsating DC voltage. A filter capacitor is fed from the rectifier output to remove the AC components that remain after rectification, and the filtered DC voltage is applied to a constant-voltage regulator. The 230-V AC mains is stepped down to 12-V AC (12-V RMS, with a peak of about 17 V), but 5-V DC is required; the AC must therefore first be converted to DC and then reduced to 5-V DC. The power-conversion stage that transforms AC into DC is called a rectifier; the main types are the half-wave rectifier, the full-wave rectifier, and the bridge rectifier. The bridge rectifier is used here owing to its advantages over the half-wave and centre-tapped full-wave rectifiers [23, 24].
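The 12-V RMS to roughly 17-V peak figure above follows from Vpeak = Vrms·√2; a bridge rectifier then loses about two diode drops per half-cycle. A small sketch of that arithmetic (the 0.7-V silicon diode drop is a typical assumed value):

```python
import math

def peak_from_rms(v_rms):
    """Peak voltage of a sinusoid given its RMS value."""
    return v_rms * math.sqrt(2.0)

def bridge_rectified_peak(v_rms, diode_drop=0.7):
    """Approximate peak DC after a bridge rectifier: the sinusoid's peak
    minus two diode drops, since a bridge conducts through two diodes in
    each half-cycle. Ripple and regulator headroom are ignored here."""
    return peak_from_rms(v_rms) - 2.0 * diode_drop
```

So a 12-V RMS winding yields about 17 V at the peak and roughly 15.6 V after the bridge, leaving ample headroom for a 5-V regulator.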
5.3.4.9 USB to TTL Converter
USB TTL serial cables are a range of USB-to-serial converter cables that provide communication between USB and serial UART interfaces. A variety of cables with different connector interfaces offer compatibility at 5-V, 3.3-V, or user-specified signal levels [25].
5.3.4.10 EEG of Comatose Patients
EEG testing is used to gauge the precise depth of unconsciousness in coma. The goal is to prepare a set of reliable EEG metrics for automated evaluation on coma scales; in comatose patients there is a link between EEG indicators and clinical ratings [26]. The use of a learning classifier is one potential approach to this pattern-recognition problem. In moderate disturbance of consciousness, diffuse changes prevail, with a reduction in alpha and a rise in theta and delta activity. In shallower stages of coma, intermittent rhythmic delta activity is observed, arising most commonly over the frontal areas and sometimes posteriorly. In a number of etiologies, sustained bursts of slow-wave activity may occur in deeper coma stages; these are most commonly diffuse but can also be lateralized, even without spatiotemporal evolution [27, 28].
EEG signals are received through 12 electrodes and processed in MATLAB. A graph showing the distinctions between normal and abnormal signals was recorded and plotted. For additional precision, each data set was approximated with the five separate brain waves along with their frequencies (Hz). MATLAB assists in the collection, evaluation, and training of the EEG signals. Finally, the plot differentiating the normal and abnormal EEG patterns is obtained on the basis of the frequencies of the five brain-wave types [29].
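The band-wise frequency analysis performed in MATLAB can be sketched equivalently in Python: compute a periodogram of one EEG channel and sum the power falling in each band. The band edges follow the ranges quoted in this chapter (the 100-Hz gamma upper cutoff is an assumption):

```python
import numpy as np

# Band edges (Hz) as quoted in the chapter; gamma's upper cutoff is assumed.
BANDS = {
    "delta": (0.1, 3.5),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 12.0),
    "beta": (12.0, 30.0),
    "gamma": (30.0, 100.0),
}

def band_powers(signal, fs):
    """Sum periodogram power over each EEG band for one channel."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

def dominant_band(signal, fs):
    """Name of the band holding the most power, e.g. 'alpha'."""
    powers = band_powers(signal, fs)
    return max(powers, key=powers.get)
```

Classifying each epoch by its dominant band is the simplest version of the normal-versus-abnormal comparison described in the results below; real pipelines would add windowing and artifact rejection.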
5.4 Results and Discussion
The proposed health-parameter monitoring kit for homebound coma patients has, for now, been tested on normal subjects. The parameters eye blink, yawn detection, temperature, heart rate, and SpO2 were tested; a camera was used for eye blink and yawn detection, while heart rate and SpO2 were obtained from the same sensor. Tests conducted on the subjects produced real-time results. For the EEG study, the signals obtained using the 12 electrodes were processed via MATLAB; they were trained and plotted in a graph showing the differences between normal and abnormal signals. For further accuracy, the different brain waves along with their frequencies (Hz) were approximated for each data set.
This study provides knowledge of normal versus abnormal conditions. The graphical representation that differentiates normal from abnormal is based on the frequencies of the five types of brain waves, whose normal ranges lie between certain limits: Delta waves range between 0.1 and 3.5 Hz, Theta waves between 4 and 8 Hz, Alpha waves from 8 to 12 Hz, Beta waves above 12 Hz, and Gamma waves above 30 Hz. As in sleep, which coma superficially resembles, the EEG in the initial stages of coma shows high-amplitude, sluggish waves, primarily in the delta band (<4 Hz) but intermingled with spindles (7–14 Hz). Recent work shows that there is a deeper form of coma that goes beyond the flat line, and during this state of very deep coma cortical activity revives. In the normal EEG, the frequency, amplitude, shape, and location of each type of brain wave are characteristic. An EEG can show abnormal results in two ways. First, normal brain activity can be halted or altered unexpectedly, as occurs in epileptic seizures; in partial seizures, a section of the brain displays a sudden pause. The other way an EEG can show abnormal results is through non-epileptic changes in form: the activity may have an abnormal frequency, amplitude, or shape.
Figure 5.4 shows the complete hardware of the coma patient monitoring system. Figures 5.5 and 5.6 show the detection of eye blinking and yawning in the subject being tested, along with the percentage of yawning and the number of eye blinks detected on screen. When a detection occurs, a message is sent to the caretaker's mobile, as shown in Figure 5.7. The sensor tested on the subject is shown in Figure 5.8, and the results for heart rate and oxygen saturation levels are given in Figure 5.9. Here, the coding is done using Arduino.
Figure 5.4 Complete hardware of the coma patient monitoring system.
Figure 5.5 Eye blink detection.
Figure 5.6 Yawning detection.
Figure 5.7 Alert message page.
Figure 5.8 Subject testing.
For the treatment of EEG signal originating from patients, the coding
for MATLAB is introduced. Figure 5.10 shows the background frequency
of operation, such as Alpha, Theta, Delta, and Beta, and the different EEG
background reactivity may predominate in different coma encephalopathies. In patients with beta coma, generalized 12- to 16-Hz bottom activity
is often seen across the frontal areas. In unspeakable patients with alpha
(8–13 Hz), alpha-coma is distinguished by electroencephalographic patterns. Over the headland areas, the activity of alpha is mainly seen. Theta
coma refers to a 4- to 7-Hz diffuse coma history [30]. This pattern can
Intelligent Healthcare Monitoring System
115
Figure 5.9 Results of heart rate and SpO2.
occur both with and without intermixed alpha or delta activity. Theta activity is more diffuse and reactive than the patterns above and typically carries a poor prognosis. High-voltage delta coma activity is characterized by a 1- to 3-Hz background with amplitudes often exceeding 100 μV. The delta pattern in coma can be polymorphic or take the form of more rhythmic triphasic waves. This pattern, while typically seen in late coma stages, largely remains reactive to noxious stimuli. However, the background reactivity
Figure 5.10 EEG patterns in coma patients (beta, theta, alpha, and delta).
to external stimuli decreases, and the EEG becomes unreactive as the coma deepens.
EEG analysis also provides early evidence on the cause and prognosis of coma conditions. Repeated EEG recordings enhance diagnostic efficiency and make it easier to monitor changes in the coma, mainly to determine the prognosis, and partly because epileptic activity requiring care may arise along the way. Regardless of the cause, the risk of developing non-convulsive status epilepticus is relatively high in comas. This highlights the significance of EEG for comatose patients.
5.5 Conclusion
An intelligent healthcare monitoring system for coma patients is implemented using IoMT. This intelligent system effectively detects yawning, eye blinks, and brain-wave changes in coma patients. The proposed approach is designed using a Raspberry Pi and an AVR controller. Eye movement and yawning are determined through live monitoring using the OpenCV library in a Windows environment with a single camera view. This efficient system helps provide the best and fastest treatment possible for coma patients, so this work supports effective monitoring of coma patients. Future work will review the application of coma patient monitoring technology in the healthcare field and can promote more accurate advanced patient monitoring technologies.
References
1. Guerit, J.M., Fischer, C., Facco, E., Tinuper, P., Murri, L., Ronne-Engstrom,
E., Nuwer, M., Standards of clinical practice of EEG and EPs in comatose and
other unresponsive states. Electroencephalogr. Clin. Neurophysiol. Suppl., 52,
117–131, 1999.
2. Brenner, R.P., The electroencephalogram in altered states of consciousness.
Neurol. Clin., 3, 3, 615–631, 1985.
3. Kansal, N. and Dhillon, H.S., Advanced Coma Patient Monitoring System.
Int. J. Sci. Eng. Res., 2, 6, 347–359, 2011.
4. Kansal, N. and Dhillon, H.S., Advanced Coma Patient Monitoring System.
Int. J. Sci. Eng. Res., 2, 6, 1–10, 2011.
5. Lokesh, B., Angulakshmi, A., Ashwini, K.P., Manojkumar, S., Lokesh, M., IoT
based Coma Patient Monitoring System using wearable sensors. Int. J. Adv.
Sci. Technol., 29, 10, 4401–4409, 2020.
6. Brenner, R.P., The electroencephalogram in altered states of consciousness.
Neurol. Clin., 3, 3, 615–631, 1985.
7. Norrima, B., Hamzah, A., Masahiro, I., Real Time Eyeball tracking via
DDTW. Int. J. Innov. Comput. Inf. Control, 7, 7, 146–161, 2011.
8. Tipprasert, W., Charoenpong, T., Chianrabutra, C., Sukjamsri, C., A Method
of Driver’s Eyes Closure and Yawning Detection for Drowsiness Analysis by
Infrared Camera. 2019 First International Symposium on Instrumentation,
Control, Artificial Intelligence, and Robotics (ICA-SYMP), Bangkok, pp.
61–64, 2019.
9. Saadeh, W., Khan, F.H., Altaf, M.A.B., Design and Implementation of a
Machine Learning Based EEG Processor for Accurate Estimation of Depth
of Anesthesia. IEEE Trans. Biomed. Circuits Syst., 13, 4, 658–669, 2019.
10. Cui, Y., Xu, Y., Wu, D., EEG-based driver drowsiness estimation using feature
weighted episodic training. IEEE Trans. Neural Syst. Rehabil. Eng., 27, 11,
2263–2273, 2019.
11. Tripathi, V. and Shakeel, F., Monitoring Healthcare System using Internet of
Things - An Immaculate Pairing. International Conference on Next Generation
Computing and Information Systems (ICNGCIS), pp. 153–158, 2017.
12. Yusuf, A.N.A., Zulkifli, F.Y., Mustika, I.W., Development of monitoring
and health service information system to support smart health on android
platform. 4th International Conference on Nano Electronics Research and
Education (ICNERE), Hamamatsu, Japan, pp. 1–6, 2018.
13. Massot, B., Gehin, C., Nocua, R., Dittmar, A., McAdams, E., A wearable, low-power, health-monitoring instrumentation based on a programmable system-on-chip™. 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, pp. 4852–4855, 2009.
14. Purnima, P.S., Zigbee and GSM based patient health monitoring system. International Conference on Electronics and Communication Systems (ICECS), 2014, pp. 1–5, 2014.
15. Qian, D., Wang, B., Qing, X., Zhang, T., Zhang, Y., Wang, X., Drowsiness Detection by Bayesian-Copula Discriminant Classifier Based on EEG Signals During Daytime Short Nap. IEEE Trans. Biomed. Eng., 64, 4, 743–754, 2017.
16. Park, C., Looney, D., Kidmose, P., Ungstrup, M., Mandic, D.P., Time-Frequency Analysis of EEG Asymmetry Using Bivariate Empirical Mode Decomposition. IEEE Trans. Neural Syst. Rehabil. Eng., 19, 4, 261–274, 2011.
17. Pimplaskar, D., Nagmode, M.S., Borkar, A., Real time eye blinking detection and tracking using opencv. Int. J. Eng. Res. Appl., 13, 14, 15–29, 2015.
18. Liu, G., Su, Y., Liu, Y., Jiang, M., Zhang, Y., Zhang, Y., Gao, D., Predicting Outcome in Comatose Patients: The Role of EEG Reactivity to Quantifiable Electrical Stimuli. Evid.-Based Complementary Altern. Med., 2016, 1–7, 2016.
19. Gu, Y., Shen, J., Chen, Y., A Smart Watch Based Health Monitoring System. 2019 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies, Arlington, pp. 7–8, 2019.
20. Fati, S.M., Muneer, A., Mungur, D., Badawi, A., Integrated Health Monitoring System using GSM and IoT. International Conference on Smart Computing and Electronic Enterprise (ICSCEE), pp. 1–7, 2018.
21. Soliński, M., Łepek, M., Kołtowski, Ł., Automatic cough detection based on airflow signals for portable spirometry system. Inform. Med. Unlocked, 100313, 1–9, 2020.
22. Maganti, M., Vikas, B., Subhadra, K., Drowsiness detection using Eye-Blink frequency and Yawn count for Driver Alert. Int. J. Innov. Technol. Explor. Eng., 9, 2, 314–317, 2019.
23. Ramtirthkar, A., Digge, J., Koli, V.R., IoT based Healthcare System for Coma Patient. Int. J. Eng. Advanced Technol., 9, 3, 327–341, 2020.
24. Su, M.-C., Wang, K.-C., Chen, G.-D., An eye tracking system and its application in aids for people with severe disabilities. Biomed. Eng. Appl. Basis Commun., 18, 6, 197–210, 2006.
25. Goto, S., Nakamura, M., Uosaki, K., On-line spectral estimation of nonstationary time series based on AR model parameter estimation and order selection with a forgetting factor. IEEE Trans. Signal Process., 43, 6, 1519–1522, 1995.
26. Zamin, S.A., Altaf, M.A.B., Saadeh, W., A Single Channel EEG-based All AASM Sleep Stages Classifier for Neurodegenerative Disorder. Biomedical Circuits and Systems Conference (BioCAS), 2019, IEEE, pp. 1–4, 2019.
27. Yoo, J., Yan, L., El-Damak, D., Altaf, M.A.B., Shoeb, A.H., Chandrakasan, A.P., An 8-Channel Scalable EEG Acquisition SoC With Patient-Specific Seizure Classification and Recording Processor. IEEE J. Solid-State Circuits, 48, 1, 214–228, 2013.
28. Besancon, G., Becq, G., Voda, A., Fractional-Order Modelling and
Identification for a Phantom EEG System. IEEE Trans. Control Syst. Technol.,
28, 1, 130–138, 2020.
29. Patel, J., Chavda, R., Christian, M., Patel, S., Gupta, R., Image Processing
based coma patient monitoring system with Feedback. Int. J. Recent Sci. Res.,
7, 2, 8885–8888, 2016.
30. Fleischmann, A., Pilge, S., Kiel, T., Kratzer, S., Schneider, G., Kreuzer, M.,
Substance-Specific Differences in Human Electroencephalographic Burst
Suppression Patterns. Front. Hum. Neurosci., 12, 368–380, 2018.
6
Deep Learning Interpretation
of Biomedical Data
T.R. Thamizhvani1*, R. Chandrasekaran1 and T.R. Ineyathendral2
1Department of Biomedical Engineering, Vels Institute of Science, Technology and Advanced Studies, Chennai, India
2Department of Zoology, Queen Mary's College (Autonomous), Chennai, India
Abstract
Deep learning can be described as a new field within machine learning, related to artificial intelligence. This learning technique resembles human functions in processing and defining patterns used for decision making. Deep learning algorithms are mainly developed using neural networks that perform unsupervised learning on unstructured data. These learning algorithms perform feature extraction and classification to identify the patterns of a system. Deep learning, also described as a deep neural network, possesses different layers for processing the learning algorithms, which helps in active functioning and detection of patterns. A deep learning network consists of basic conceptual features such as layers and activation functions. The layer is the basic building block of the deep learning process and can be categorized by its function. Deep learning is used in various applications, one of which is the field of biomedical engineering, where big data observations are made in the form of biosignals, medical images, pathological reports, patient histories, and medical reports. Biomedical data possess time- and frequency-domain features for analysis and classification. The study of large amounts of data can be performed using deep learning algorithms. Thus, deep learning algorithms are used for the interpretation and classification of biomedical big data.
Keywords: Deep learning algorithms, biomedical applications, learning models,
interpretation
*Corresponding author: thamizhvani.se@velsuniv.ac.in
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (121–142) © 2022 Scrivener Publishing LLC
6.1 Introduction
Deep learning can be defined as a subcategory of machine learning whose learning algorithms and models are shaped by the functional and structural activity of the brain. These models are part of artificial intelligence and include networks for unsupervised learning, which are configured for unstructured data or samples. Deep learning networks make use of artificial neural networks (ANNs) arranged in a hierarchical order that extends traditional machine learning techniques. These ANNs are designed and developed with features similar to the human brain and are connected through neural nodes [1]. Traditional methods analyze data or samples in a linear manner, whereas the hierarchical methods of deep learning models allow machines to analyze data or samples using a non-linear approach. The key features of deep learning are given below:
• Deep learning algorithms and models are a form of artificial intelligence that resembles human brain activity in defining, analyzing, and processing samples, including decision making.
• Deep learning in artificial intelligence is used for learning from unstructured data.
• Deep learning, described as a subset of machine learning, mainly helps in configuring and detecting changes in system functions that are unstructured; for example, fraud or money laundering.
Deep learning methods are used in different fields to analyze unstructured samples. These methods extract high-level, complex, descriptive features from the original raw data. Certain factors and feature variations show that human-level understanding is necessary for the analysis and categorization of such features [2, 3]. Human-level intelligence is approached using these deep learning processes. Representation learning alone does not define the features that require human-level intelligence; deep learning resolves this representation problem by building complex representations out of simpler ones. In this way, deep learning helps computer systems design complex modules from simple learning techniques, as illustrated in Figure 6.1.
The deep learning technique illustrates the concept of learning as a combination of models built from simple processes such as convolution. For example, an image of any object can be described using basic features such as edges, which include contours and corners. The classic deep learning architecture built from such simple concepts is the multilayer perceptron (MLP), or feed-forward deep network. An MLP is a mapping function that depends on mathematical expressions relating the input and output variables. These mathematical statements are composed of simpler feasible functions, and the resultant function, built from different expressions, defines a new frame for the analysis of the input.
The perspective of the deep learning process is defined both by the exact representation of the features or input values and by the development of the system to perform multi-tasking actions. These perspectives associate the deep learning process with representation and with functional activity based on the input and output values. The representative layers of the learning process can be viewed as the memory of a system after executing sets of instructions in sequence. The greater the depth of the network, the longer the sequence of instructions described by the deep learning process. Sequential execution of instructions is a powerful resource because the results of the previous instructions can be analyzed. The instructions can be used
Figure 6.1 Basic structural representation of the deep learning process (input, convolutional layers, fully connected layers, output).
as a reference for later sets of instructions, which gives the process greater power [4, 5].
Deep learning forms layers that represent the input values, illustrating how the information changes with the input factors. The main aim of the layer representation is information and data storage. Computer systems find it difficult to analyze raw input information; an example of raw input is an image defined by its smallest units, the pixels. Identifying an object by functionally mapping the raw pixel sets is difficult, so layered representation and analysis of the pixel sets is necessary for object categorization, pattern learning, and validation. The deep learning process thus effectively finds solutions to the functional mapping through a series of layered representations. Each layer in the deep learning architecture is designed and described in a unique manner. The input layer, called the visible layer, consists of the observed values of the data. The hidden layers gather information about the significant features derived from the image pixel sets. The term "hidden" indicates that the values of these features are not given in the data; instead, the model must be able to describe the relationships among the observed values of the data (the inputs) [6].
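The layered mapping described above, an overall function composed of simpler feasible functions, can be sketched in a few lines. The layer sizes, weights, and activations below are invented purely for illustration.

```python
def dense(weights, biases, activation):
    """Build one fully connected layer as a function:
    y_j = activation(sum_i weights[j][i] * x[i] + biases[j])."""
    def layer(x):
        return [activation(sum(w * xi for w, xi in zip(row, x)) + b)
                for row, b in zip(weights, biases)]
    return layer

relu = lambda v: max(0.0, v)      # common hidden-layer activation
identity = lambda v: v            # linear output for this sketch

# A tiny two-layer MLP: f(x) = f2(f1(x)), with made-up weights.
f1 = dense([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0], relu)
f2 = dense([[1.0, 1.0]], [0.0], identity)
mlp = lambda x: f2(f1(x))
```

The composition f2(f1(x)) is the point: each layer is a simple function, and depth comes from chaining them, exactly the "simple processes combined into a complex model" view in the text.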
Images can be represented using the visualization features defined by each hidden unit. For images, the first hidden layer illustrates the edges in the image by comparing the brightness of neighboring pixels. The second hidden layer defines corners and extended contours, which are combinations of edges [7]. The third hidden layer describes parts or areas of objects by identifying the contours and edges. This layer-by-layer description of an image expresses the objects in simple terms and is helpful for object recognition. The architecture is illustrated in Figure 6.2.
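The first-hidden-layer behavior described above, detecting edges by comparing the brightness of neighboring pixels, can be mimicked with a short sketch; the threshold and the simple difference filter are illustrative assumptions, not a learned layer.

```python
def horizontal_edges(image, threshold=0.5):
    """Mark positions where brightness jumps between horizontal neighbours,
    mimicking what a first hidden layer's edge detectors respond to.
    image: list of rows of brightness values; returns a 0/1 edge map."""
    edges = []
    for row in image:
        edges.append([1 if abs(row[x + 1] - row[x]) > threshold else 0
                      for x in range(len(row) - 1)])
    return edges
```

On a dark-to-bright image the map fires exactly at the brightness step; a trained network learns many such detectors, at several orientations, as its first-layer weights.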
Figure 6.2 Simple architecture with hidden layers (input image layer, hidden layers, output layer).
6.2 Deep Learning Models
Deep learning technology possesses different deep network systems with varying topological models. Neural networks consist of a number of network layers that are framed based on the feature functions [8]. These neural networks are used in practical applications. Layers are added to the networks, which involves interconnections and weights within the layer networks. A deep learning technique begins by defining the features of the system, with the categorization process largely hidden behind the user interface. In deep learning systems, the graphics processing unit (GPU) benefits the technique in training and executing the layered networks. Various algorithms and architectural models are designed and defined in the process of deep learning. The different models of deep learning techniques are illustrated in Figure 6.3.
These architectural models are applied across a wide range of practical applications. Table 6.1 illustrates the various applications of the deep learning models.
6.2.1 Recurrent Neural Networks
The recurrent neural network (RNN) is one of the basic functional networks in the family of deep learning architectures. The primary difference between a typical multilayer network and a recurrent network is that, rather than having completely feed-forward connections, a recurrent network may have connections that feed back into prior layers (or into the same layer). This feedback allows RNNs to maintain a memory of past inputs and model problems in time [9]. The architecture of the recurrent neural network is illustrated in Figure 6.4.
Figure 6.3 Architectural models of deep learning (RNN, LSTM, GRU, CNN, DBN, DSN).
Table 6.1 Applications of deep learning networks.
Recurrent Neural Network (RNN): speech recognition and handwritten pattern recognition.
LSTM/GRU networks: text language compression, recognition of handwritten documents, gesture and speech recognition, image identification.
CNN: video analysis and processing, image categorization, language processing and analysis.
DBN: recognition of images, retrieval of data or information, understanding of language, failure prediction.
DSN: retrieval of data or information, continuous speech recognition.
Figure 6.4 Architecture of recurrent neural networks.
RNNs encompass a rich set of structured networks and architectures in which feedback within the network acts as the differentiator. Through this feedback, the network's hidden layer influences itself over time, contributing to the combined output layer. This feedback is what characterizes the neural network behavior of the RNN.
An RNN is designed so that it can be unfolded in time and trained using standard or variant back-propagation; this training procedure is described as back-propagation through time.
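The feedback the text describes, the hidden state feeding into the next step, can be sketched minimally with a single recurrent unit; the weight values are hand-picked for illustration.

```python
import math

def rnn_forward(inputs, w_in=1.0, w_rec=0.5, b=0.0):
    """Minimal single-unit RNN: h_t = tanh(w_in*x_t + w_rec*h_{t-1} + b).
    The recurrent term w_rec*h_{t-1} is the feedback that gives the network
    memory of past inputs."""
    h = 0.0
    states = []
    for x in inputs:
        h = math.tanh(w_in * x + w_rec * h + b)
        states.append(h)
    return states
```

Feeding the same current input after different histories yields different hidden states, which is exactly the "memory of past inputs" property; back-propagation through time would then compute gradients by unrolling this loop.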
6.2.2 LSTM/GRU Networks
The long short-term memory (LSTM) network was designed in 1997 by Hochreiter and Schmidhuber and has been used in various applications more predominantly than the plain RNN architecture. LSTM networks appear in everyday applications such as smartphones; for example, IBM applied LSTMs to milestone applications such as speech recognition. The LSTM architecture departs from the standard neural network by introducing the memory cell as its primary concept. Through the memory cell, a function of the inputs can be retained for a long or short span, enabling the cell to remember the value that is important rather than only the most recently computed value [10].
The memory cell contains three gates that control the flow of information into and out of the cell. The input gate controls the flow of new information into the memory cell. The forget gate controls which stored information is forgotten, allowing the cell to remember new input. Finally, the output gate controls, in a regulated manner, what the cell outputs. Weights, which are considered part of the cell, are associated with each gate. Any training algorithm, for example back-propagation through time, analyzes the network's output error with respect to these weights during optimization.
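The three gates just described can be sketched as one step of a single-unit LSTM cell; the gate equations follow the standard formulation, while the weight values in the usage below are purely illustrative.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a single-unit LSTM cell.
    w maps each gate name to a (w_x, w_h, b) triple; values are illustrative."""
    i = sigmoid(w["input"][0] * x + w["input"][1] * h_prev + w["input"][2])    # input gate
    f = sigmoid(w["forget"][0] * x + w["forget"][1] * h_prev + w["forget"][2])  # forget gate
    o = sigmoid(w["output"][0] * x + w["output"][1] * h_prev + w["output"][2])  # output gate
    g = math.tanh(w["cell"][0] * x + w["cell"][1] * h_prev + w["cell"][2])      # candidate value
    c = f * c_prev + i * g   # forget gate scales old memory; input gate admits new
    h = o * math.tanh(c)     # output gate controls what leaves the cell
    return h, c
```

Driving the forget gate toward zero (a large negative bias) erases the old cell state, which is the gating behavior the paragraph above describes in words.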
Figure 6.5 represents the LSTM memory cell with “ot” as output gate,
“it” as input gate, and “ft” as forgot gate. The simplified LSTM network was
ot
it
ct
ft
Figure 6.5 LSTM memory cell.
Figure 6.6 GRU cell.
designed in 2014 and is based entirely on the gated recurrent unit (GRU). This model has two gates, lacking the output gate defined in the LSTM network. The GRU has a similar form and similar application performance to the LSTM, but the major advantages of this architecture are fewer weights and higher execution speed.
The two gates of the GRU are the update gate and the reset gate.
Update gate: this gate helps the GRU identify and maintain the contents of the previous cell.
Reset gate: this gate is mainly used to combine the new input with the values in the previous cell.
The GRU is an architecture that reduces to the plain RNN model when the reset and update gates are set to 1 and 0, respectively [11].
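The reduction just stated can be checked with a one-unit GRU sketch. Note that GRU gate conventions differ between papers; this sketch uses h = z*h_prev + (1-z)*h_candidate, the convention under which the text's reset = 1, update = 0 setting yields a plain RNN step. The weights are illustrative.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def gru_step(x, h_prev, w, z=None, r=None):
    """One step of a single-unit GRU. w maps gate names to (w_x, w_h, b) triples.
    z and r can be forced to fixed values to show the reduction to a plain RNN."""
    if z is None:  # update gate: how much of the previous state to keep
        z = sigmoid(w["update"][0] * x + w["update"][1] * h_prev + w["update"][2])
    if r is None:  # reset gate: how much of the previous state enters the candidate
        r = sigmoid(w["reset"][0] * x + w["reset"][1] * h_prev + w["reset"][2])
    h_cand = math.tanh(w["cand"][0] * x + w["cand"][1] * (r * h_prev) + w["cand"][2])
    return z * h_prev + (1.0 - z) * h_cand
```

With r forced to 1 and z to 0, the output is exactly tanh(w_x*x + w_h*h_prev), i.e., the vanilla RNN update from the previous section.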
The GRU cell is shown in Figure 6.6 with the reset (r) and update (z) gates. By comparison, the LSTM is a little more complicated than the GRU: the GRU can be trained easily and executes efficiently, whereas the LSTM is more expressive and can give better results when more data are available.
6.2.3 Convolutional Neural Networks
Convolutional neural networks (CNNs) can be defined as ANNs with multiple layers, inspired by the biological visual cortex in the human system. The CNN architecture is mainly used for imaging applications; Yann LeCun designed the first CNNs. For example, the CNN architecture was applied early on to character recognition in handwriting, such as interpretation of postal codes. In a CNN deep neural network (DNN), the first layers recognize extracted low-level features that
are described as edges. The layers at the end perform a recombination process that binds these features into higher-level forms of the input.
The LeNet architecture in CNNs is formed of various layers that make use of the extracted features, and classification is performed using these features. Image processing is the best example of this classification process. The input images considered for classification are divided into regions and fed into the convolution layers. These convolution layers in the DNN derive features from the input units. Pooling is the next step, designed to reduce the dimensions of the derived features, and can be described as down-sampling. These networks commonly perform max pooling to retain the most significant features or information. After this, the convolution and pooling processes are repeated, and the result is fed into a fully interconnected multilayer perceptron. The final output layer identifies the unit from the extracted features. Training can be performed using different techniques; basically, back-propagation is used for the analysis and classification process in DNNs.
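The convolution-then-pooling steps described above can be sketched in pure Python; the tiny image and the single hand-written filter are illustrative stand-ins for learned kernels.

```python
def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution (strictly, cross-correlation, as in most DNN
    libraries): slide the kernel over the image and sum elementwise products."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        out.append([sum(kernel[a][b] * image[i + a][j + b]
                        for a in range(kh) for b in range(kw))
                    for j in range(len(image[0]) - kw + 1)])
    return out

def max_pool2(fmap):
    """2x2 max pooling: keep the strongest response in each block (down-sampling)."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]
```

A [[-1, 1]] kernel responds to a left-to-right brightness increase; pooling then halves each spatial dimension while keeping the strongest edge responses, which is the "retain the most significant features" step in the text.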
This pipeline of convolutions, pooling, and a fully connected multilayer system has paved the way for various applications. Figure 6.7 illustrates the basic framework of the CNN. Applications in different fields use such deep learning pipelines for classification and processing. Beyond image processing, CNN deep networks are used for video processing and recognition and are applied to numerous tasks, one of them being natural language processing [12, 13].
Image and video processing systems are among the recent applications of the major deep learning networks such as CNNs and LSTMs.
Figure 6.7 Basic structural framework of a convolutional neural network (CNN): input layer, convolution-1, pool-1, convolution-2, pool-2 (feature extraction), then hidden layer and output layer (classification).
CNN-based DNNs are applied in processing videos and images. In these systems, an LSTM model is trained to convert the output of the CNN into understandable language.
6.2.4 Deep Belief Networks
Deep belief networks (DBNs) are a unique network model with a specific training algorithm. A DBN is a multilayered system in which every pair of connected layers forms a restricted Boltzmann machine (RBM), so the DBN is designed as a stack of RBMs. The model's input layer receives the raw sensory data, and the hidden layers abstract representations of the input values. The output layer differs from the other layers in that it is used for the classification performed by the network model. The basic framework of the deep belief network is described in Figure 6.8.
The training process in deep belief networks always occurs in two distinct steps:
• Unsupervised pretraining
• Supervised fine-tuning
In unsupervised pretraining, each RBM is trained to reconstruct its input. The first RBM reconstructs the input through the primary hidden layer, and the next restricted network gets trained
Figure 6.8 Architectural framework for deep belief networks.
similarly, treating the primary hidden layer as its input layer [14]. Thus, each successive machine is trained using the outputs of the previously trained hidden layer as its input units. This process continues layer by layer until the whole network has been pretrained.
Fine-tuning in deep networks occurs once pretraining is complete. In these network models, the output units are given labels that provide meaning in the model's context. The full network is then trained using back-propagation or gradient-descent learning; these learning algorithms complete the overall training process [15].
6.2.5 Deep Stacking Networks
The deep stacking network (DSN), also described as a deep convex network, is presented as the final architecture. A DSN differs from traditional DNNs, which consist of a single deep model; a simple DSN architecture is shown in Figure 6.9. In fact, the DSN is a set of individual networks, each with its own hidden layers. The specific disadvantage of deep learning
Figure 6.9 Simple architecture of deep stacking networks.
is training complexity, which the DSN addresses: rather than treating training as a single problem, it treats it as a set of individual training problems [16].
A DSN possesses a set of modules, and each module is a subnetwork within the complete network system; three modules are designed for this network. Each module consists primarily of input, hidden, and output layers, and the modules are arranged one on top of another. In this network, the input to each module consists of the outputs of the previous module together with the original input vector. This layering lets the complete network learn more complex classifications than a single module could describe.
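The stacking rule just described (each module receives the original input vector plus the previous module's outputs) can be sketched as a forward pass; the stand-in module below is a deliberate toy so the widening of the feature vector is visible.

```python
def dsn_forward(x, modules):
    """Deep stacking network forward pass: module k receives the ORIGINAL
    input vector concatenated with the outputs of module k-1."""
    features = list(x)
    out = []
    for module in modules:
        out = module(features)
        features = list(x) + out   # stack: raw input + this module's outputs
    return out

# A stand-in module that returns [number of features it received]; this makes
# the stacking visible, since later modules see a wider input vector.
def width_probe(features):
    return [len(features)]
```

In a real DSN each module would be a small trained network (and each can be trained in isolation, as the next paragraph notes); the probe merely demonstrates the concatenation structure.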
DSNs allow the modules to be trained individually, in isolation, which makes the network efficient and enables training to occur in parallel. Supervised learning algorithms apply back-propagation to each module rather than to the entire network. DSNs play a major role in various applications and perform the classification and identification processes more effectively than deep belief networks. These are the basic architectures of the deep learning algorithms and networks used in different applications. Deep learning algorithms are used in many fields, especially in the analysis, processing, and interpretation of biomedical data. The interpretation and application of deep learning networks are illustrated in this chapter for clear understanding.
6.3 Interpretation of Deep Learning With Biomedical Data
Deep learning offers different structural and architectural networks that can be used for analysis, processing, and classification in various fields. Deep networks are used for interpretation in several domains, one of which is biomedical engineering. Biomedical data encompass many areas, such as biosignal data, medical image data, genomics data, structural analysis, protein studies, identification of disease conditions, and so on.
The biomedical field is a rich source of research across many applications and platforms involving the medical field and its associated departments, such as pathology. Here, physicians are well versed and most confident with certain pathology-related results.
Physicians in biomedical engineering use heterogeneous datasets to capture technical and advanced scientific variations. These datasets possess a wide range of parameters for analyzing the biological changes due to various physiological adaptations and are also used in the imaging field by determining the modalities [17].
Multiple parameters and systems can be described using biomedical data derived from different acquisition devices or systems. Biomedical datasets are normally imbalanced owing to multiple variations over short spans and complete structural changes across disease conditions. In general, from the viewpoint of learning algorithms, these datasets are non-stationary and must be synchronized and classified using highly complex methods. Machine and deep learning techniques suit these kinds of non-stationary, variable datasets, which test the performance of the networks and give the designed networks an opportunity to demonstrate their efficiency. The opportunities are listed below:
• To improve big data analysis in the medical field and help all medical professionals.
• To reduce the risks created by errors in the medical field.
• To create harmony between the diagnostic process and the prescribed treatment protocols.
Deep learning and ANNs are commonly used in different fields, such as image processing and fault detection; these networks and algorithms are the predominant learning tools. Deep learning applied to biomedical engineering reaches every level of the medical field. For example, deep learning is used for gene expression detection in genome applications, administrative health management, decision-making intelligence for disease diagnosis, prediction of the infection rate of epidemic disease conditions, structural detection of anatomical changes, and so on. Recent publications show clearly that deep learning algorithms are heavily used in processing biomedical datasets, and the cited references indicate exponential growth in the use of deep learning algorithms for the diagnosis, analysis, prediction, and management of biomedical data.
Comparing all the research performed using biomedical data, two sub-fields contribute the most: medical imaging and genomics. In medical imaging, deep learning mainly helps in the diagnosis and identification of disease conditions or abnormalities. The recognition and description of an abnormality based on images depends on certain
134
The Internet of Medical Things (IoMT)
features such as the imaging modality, the acquisition process, and the interpretation of the images [18, 19].
Technical advancements in imaging modalities and acquisition systems have enabled the use of deep learning networks for diagnosis and analysis. Recent technological innovations in medical imaging with respect to artificial intelligence are described using DNNs. Medical images derived from different modalities are analyzed and interpreted by physicians, and variation in this interpretation process may occur. The main aim of applying deep learning in bioimaging is computer-aided diagnosis and interpretation of imaging datasets. Analysis of medical images using deep networks is more effective in describing abnormalities, even with slight variations in the structural features of images derived using various modalities, including computed tomography, magnetic resonance imaging, and ultrasound imaging.
For example, histopathological images of cancers derived from electron microscopes undergo digital, systematic analysis focused on biomarkers that are defined by the anatomical and physiological changes in the human system; these changes are clearly visible in the images for the purposes of diagnosis and treatment. Deep learning algorithms in medical imaging are used in different ways, including segmentation, identification, classification, recognition, and interpretation. Biomarkers illustrate structural changes in depth, which helps in the effective classification of abnormalities. Another example is the grading of rheumatoid arthritis, which is categorized based on the extension of the synovial region; using ultrasound images, deep learning algorithms can distinguish the different grades of arthritis [22].
Electronic Health Records (EHRs), a typical kind of biomedical data, can maintain any information about an individual's health status and care. Methods to make maximal use of EHR data for clinical support are a major application focus of deep learning in biomedical informatics research. EHRs are also widely used as a resource for storing medical images. High-level tasks such as classification, detection, and segmentation are performed by applying pre-trained features, which researchers acquire by training deep learning models in the traditional manner. For example, for tumor architecture classification, accuracy can be improved by learning the features of histopathology tumor images with DNNs; different types of pathologies in chest x-ray images can be identified using CNNs pre-trained even on non-medical images;
Deep Learning Interpretation of Biomedical Data
135
in low-field MRI scans, hierarchical representations can be learned using CNNs for segmentation of the tibial cartilage; and feature representation and automatic prostate MR segmentation can be learned within a unified deep learning framework.
Deep learning models can be applied to clinical radiology research and can assist physicians. Yet, the existing deep learning models and their applications are still being studied: even though these models are commonly applicable in applied science, their designs still need investigation for use in the medical domain. How the images are obtained, and how their analysis and results are presented, are further issues for the adoption of deep learning by physicians.
Many studies have tried to develop mimic models for interpreting the results of deep learning models. A main drawback is that the most important risk factors can be displaced by other factors with which they are highly correlated. For example, age and sex are important risk factors in predicting a bone fracture; when training a deep learning model, cardiovascular factors can be overweighted while age and sex are underweighted, because the former are highly correlated with them and become the main contribution in the final feature representations.
Deep learning has considerable future potential in biomedical informatics. Physicians and healthcare workers might be interested in studies that utilize both medical images and clinical diagnosis reports in designing deep learning models. Making clinical data public or shareable can be a big obstacle because of the Protected Health Information (PHI) provisions of the Health Insurance Portability and Accountability Act (HIPAA). Because of this, the lack of publicly available clinical data obstructs computer science researchers from tackling real clinical problems. To overcome this, deep learning can be used for feature representations and word embeddings, and representing PHI in encoded vector form makes sharing clinical data secure for researchers to use. These techniques can be applied by researchers through collaborations with hospitals and healthcare agencies.
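As a concrete illustration of representing PHI in an encoded, shareable form, the minimal Python sketch below replaces identifying fields with keyed one-way pseudonyms while leaving the clinical content intact. The field names and the custodian-held key are hypothetical, and a keyed hash stands in for the learned embeddings discussed above:

```python
import hashlib
import hmac

# Hypothetical secret key held by the data custodian and never shared
# with researchers (illustrative value only).
SECRET_KEY = b"institution-held-secret"

def pseudonymize(value: str) -> str:
    """Replace a PHI field with a keyed one-way hash, so records from
    different tables can still be linked without exposing the identity."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# A toy record: the identifying fields are encoded, the clinical
# content (here an ICD-10 code) is kept as-is for research use.
record = {"patient_name": "Jane Doe", "mrn": "12345678", "diagnosis": "I25.1"}
shared = {
    "patient_id": pseudonymize(record["mrn"]),
    "diagnosis": record["diagnosis"],
}
```

Because the hash is keyed, the same patient maps to the same pseudonym across datasets, enabling linkage without re-identification by anyone who lacks the key.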
Successful applications of deep learning can be found in various fields including image recognition, speech recognition, and machine translation, and it has been applied in industrial systems such as AlphaGo, developed by Google DeepMind. These recent accomplishments, together with the large amount of available data, have carried deep learning into the medical domain.
Various machine learning algorithms are being widely applied in bioinformatics to extract knowledge from the huge amount of information available as biomedical data. Major advances in several domains such as image recognition, speech recognition, and natural language processing are mainly due to deep learning, which evolved from large-scale data acquisition, parallel and distributed computing, and sophisticated training algorithms. Deep learning research in bioinformatics covers the analysis and diagnosis of abnormalities; these bioinformatics domains include medical imaging and signal processing. The architectures of DNNs define the processing systems for the categorization and description of biomedical data. Deep learning is applied in various forms in bioinformatics, covering the analysis of imbalanced information, hyperparameter optimization, deep learning models, and accelerated training. Emerging innovations in bioinformatics are largely due to deep learning algorithms.
Biomedical data also involves brain- and body-interfacing machines, in which electric signals are derived from human systems such as the brain and muscles. Sensors are used to acquire the electric signals from these physiological systems and serve various applications in biomedical engineering. A complete brain- or body-interfacing device consists of four main components: a sensing device, an amplifier, a filter, and a control system. The interfacing system processes and decodes the signals from the mechanisms of the complex brain to facilitate digital transmission between computer systems and the brain. The electric signals from the brain, generated by its current activity, effectively help in initiating reflex neural actions.
Deep learning is used in biosignal acquisition and processing techniques, and deep learning algorithms have enabled developments and advancements in signal processing, for example with invasive techniques in which implanted electrodes record electrical activity. Different diagnostic techniques are available for signal analysis; signal acquisition techniques include electroencephalography (EEG), magnetoencephalography (MEG), functional near-infrared spectroscopy (fNIRS), and functional magnetic resonance imaging (fMRI). After the machine interfaces with the brain, the deep learning algorithm forms the second part, which particularly helps in the detection and diagnosis of various abnormalities in the physiological systems.
Applications of deep learning involve various processes such as detection of coronary artery disease through electrocardiograph (ECG) signals, automatic detection of infarction with the help of the ST segment of the ECG, seizure identification, and Alzheimer's disease detection using electroencephalography (EEG). Deep learning algorithms also incorporate muscular activity in the development of muscle–computer interfaces: electromyographic movements and electric activities recorded with surface EMG electrodes can be used for the processing and classification of prosthetic hand control and gesture recognition.
Transfer learning mainly exploits the similarities between different knowledge systems. In biomedical engineering, existing datasets help in learning new tasks and analyzing the characteristics of a system. The CNN is the deep neural architecture most commonly used to study knowledge transfer, especially in image classification. Transfer learning is enabled through weight transfer, in which a network is trained as a source for one task and the weights of its layers are transferred to form a second network that performs another task. Transfer learning is applied in image analysis, specifically in medical imaging. In the biomedical field, acquiring labeled datasets matching the requirements is a challenge in the learning process, and transfer helps in fine-tuning the model: a DNN architecture is pre-trained on natural image datasets or images from another medical domain, and the resulting models are fine-tuned for classification. Transfer learning has been applied to seizure detection, mental task classification, and prediction of enhancement processes.
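The weight-transfer step can be sketched in a few lines. This is a toy NumPy illustration, not a real training pipeline: the networks are plain lists of weight matrices, the source "training" is elided, and the layer sizes are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_network(layer_sizes):
    """A network represented simply as a list of weight matrices."""
    return [rng.standard_normal((m, n)) * 0.1
            for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

# Source network trained on a large task (e.g., natural images);
# here the training is elided and the weights are random placeholders.
source = init_network([64, 32, 16, 10])

# Weight transfer: copy the early layers, re-initialize only the
# task-specific head for the new (e.g., medical) classification task.
target = [w.copy() for w in source[:-1]]
target.append(rng.standard_normal((16, 3)) * 0.1)  # new 3-class head

# Fine-tuning would then update `target` on the (smaller) medical dataset.
```

The design point is that only the final layer starts from scratch; the transferred layers give the new network a useful starting representation when labeled medical data is scarce.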
For researchers, biomedical signal processing makes use of electrical signals acquired from the human system to help solve problems. Recorded signals contain artifacts and noise, which must be reduced using filter systems. Raw signals, in general, can be decomposed into spatial or frequency components for analysis in DNNs, and certain hand-designed features can also be fed into the layers of the network for effective functioning of the deep networks. Based on previous works, signal processing in the biomedical field falls into two categories: decoding of brain signals, for example using EEG, and diagnosis of abnormalities and disease conditions.
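Decomposing a raw signal into frequency components before feeding a network can be sketched with an FFT. The sampling rate and the synthetic "EEG" below are assumptions chosen purely for illustration:

```python
import numpy as np

fs = 250.0                       # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)      # 4 s of signal
# Synthetic "EEG": a 10 Hz alpha-band component plus broadband noise.
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

def band_power(x, fs, lo, hi):
    """Fraction of the signal's power within the band [lo, hi] Hz."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum() / spectrum.sum()

alpha = band_power(signal, fs, 8, 13)    # alpha band (contains the 10 Hz tone)
delta = band_power(signal, fs, 0.5, 4)   # delta band (noise only here)
```

Band powers such as `alpha` and `delta` are exactly the kind of frequency-domain features that can be fed into the layers of a network in place of the raw trace.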
In biomedical data, as discussed, imbalanced data plays a significant role; solutions are categorized into three groups:
• Pre-processing of the data, which includes sampling through different methods.
• Cost-sensitive learning, which is applied to the loss function.
• Modification of the algorithmic techniques.
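The first two groups can be sketched on a toy dataset (the class names and counts below are invented for illustration):

```python
import random
from collections import Counter

random.seed(0)
# A toy imbalanced biomedical dataset: 95 healthy vs. 5 diseased samples.
data = [("healthy", i) for i in range(95)] + [("diseased", i) for i in range(5)]

# (1) Pre-processing: random oversampling of the minority class
# until both classes are equally represented.
minority = [s for s in data if s[0] == "diseased"]
balanced = data + [random.choice(minority) for _ in range(90)]

# (2) Cost-sensitive learning: inverse-frequency class weights that
# would scale each sample's contribution to the loss function.
counts = Counter(label for label, _ in data)
weights = {label: len(data) / (len(counts) * n) for label, n in counts.items()}
```

With these weights, a misclassified minority sample costs the model far more than a misclassified majority sample, counteracting the imbalance without touching the data itself.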
DNNs used in the machine learning phase are applied across biomedical areas for development and advancement; these areas include bioinformatics, genomics, imaging, health management, and brain–body interfacing systems. A growing trend in deep learning is the CNN developed as an end-to-end process; these networks replace the traditional networks used in learning algorithms. The literature surveyed states that the CNN is a main part of DNN architectures and possesses classification abilities that can be carried over through weight transfer. DNNs are used in different analysis processes in medical imaging; for example, among the image processing algorithms for grading rheumatoid arthritis, the grades are specifically detected using DNNs.
In many cases, transfer learning concentrates on biomedical imaging processes and their applications. The most advanced emerging architecture designed for biomedical applications is the generative adversarial network. These neural architectures use an augmentation process to enhance networks by enlarging the annotated training dataset, and they are mainly applied in biomedical imaging. Despite the great success of DNNs in biomedical applications, deep learning users still encounter difficulties such as model building and the interpretability of the obtained results.
In deep learning, the term "deep" refers to the several layers through which data is transformed; the two to three layers of traditional neural networks are replaced by the many layers of DL for automated analysis of data. Deep learning, a sub-field of machine learning, has recently been used in proteomics, where publicly available genome and peptide sequencing data have benefited from it. Until the introduction of high-performance GPUs and other specialized hardware, DL models were unrealistic and much more limited by model capacity and expensive computation. DL has the ability to deal with large datasets and complex patterns and represents the future of proteomics data analysis. Recently, deep learning has been used in magnetic resonance imaging, computed tomography, and in solving numerous image-related problems.
ANNs in deep learning are developed with inspiration from the neurons and neuronal networks of the human brain: an ANN is a set of connected neurons modeling the synapses and the passing of stimuli across a neural network. Nowadays, DNNs are being applied in speech recognition, vision, and many other fields. DNNs contain multiple hidden layers, and additional layers give the capability to capture more complex data patterns [20, 21].
In contrast to the ANN, which is a collection of connected units that pass a signal from one unit to another, the CNN is a type of DNN in which each layer is a combination of a convolutional layer and a non-linear operator, receives its input from the previous layers, and together they produce an output. Using machine learning techniques, the filters in the convolutional layers can be made to perform specific tasks. CNNs are being used in image classification, style transfer, and deconvolution in photography.
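The combination of a convolutional layer and a non-linear operator described above can be sketched directly in NumPy. This is a single toy layer with a hand-made edge-detecting filter, not a trained network:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: the basic operation of a convolutional layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

def relu(x):
    """The non-linear operator applied after the convolution."""
    return np.maximum(x, 0)

# One CNN layer: a vertical-edge kernel applied to a toy 6x6 "image".
image = np.zeros((6, 6))
image[:, 3:] = 1.0                       # left half dark, right half bright
kernel = np.array([[-1.0, 1.0]] * 3)     # 3x2 vertical-edge detector
feature_map = relu(conv2d(image, kernel))
```

In a real CNN the kernel values are not hand-written but learned, which is exactly the sense in which the filters "can be made to perform specific tasks".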
A CNN can give a clear difference between lower-resolution and higher-resolution images of a specimen. RNNs display temporal dynamic behaviour and integrate internal memory, because RNNs have connections between nodes that form a directed graph along a temporal sequence. One important advantage of the RNN is that the present task can collect information from previous tasks; thus, with the help of previous states, predictive models with sequential signaling can be built. In contrast to this short-term memory, the LSTM is a variation of the RNN designed to retain information over longer sequences.
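The internal memory of an RNN can be sketched as follows. The weights here are random and untrained; they serve only to show how the hidden state carries information from earlier steps of the sequence:

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_hidden = 4, 8
W_x = rng.standard_normal((n_hidden, n_in)) * 0.5      # input weights
W_h = rng.standard_normal((n_hidden, n_hidden)) * 0.5  # recurrent weights

def rnn_forward(sequence):
    """Run a vanilla RNN over a sequence; the hidden state h is the
    internal memory carrying information from earlier steps."""
    h = np.zeros(n_hidden)
    for x in sequence:
        h = np.tanh(W_x @ x + W_h @ h)  # the current step sees the previous h
    return h

seq = [rng.standard_normal(n_in) for _ in range(5)]
h_full = rnn_forward(seq)            # state after the whole sequence
h_last_only = rnn_forward(seq[-1:])  # state if earlier inputs are dropped
# The two states differ: the network's output depends on its memory
# of the previous steps, not only on the current input.
```

An LSTM replaces the single `tanh` update with gated cell-state updates so that this memory survives over much longer sequences.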
The use of deep learning in microscopy is significant across the various fields that use microscopy tools. By enhancing spatial resolution, a DNN can improve optical microscopy: an image acquired with a regular optical microscope is used as the input, and low-resolution images are converted to better resolution using deep learning. This approach can also be used in other imaging techniques, spanning different parts of the electromagnetic spectrum, designing computational imagers, and establishing a variety of transformations between different imaging modes.
Similarly, physiological conditions that affect health in various ways can be diagnosed using deep learning techniques. Currently, life science and various other fields depend primarily on genomic studies, which are made easier using deep learning techniques.
6.4 Conclusion
Deep learning algorithms aim at the development and improvement of object recognition and identification across different fields. The architecture and algorithm of a deep network describe the training and testing process with the help of input, hidden, and output units, as in a simple network; the deep learning network, in general, mimics the functional flow of neurons in the human system. Deep learning algorithms are used for various applications such as processing, segmentation, feature extraction, optimization, recognition, and analysis.
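The flow through input, hidden, and output units mentioned above can be sketched as a minimal forward pass (random, untrained weights; the layer sizes are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Input (3 units) -> hidden (5 units) -> output (2 units).
W1 = rng.standard_normal((5, 3)) * 0.3   # input-to-hidden weights
W2 = rng.standard_normal((2, 5)) * 0.3   # hidden-to-output weights

def forward(x):
    hidden = np.tanh(W1 @ x)             # hidden-unit activations
    logits = W2 @ hidden                 # output-unit activations
    e = np.exp(logits - logits.max())    # softmax for class probabilities
    return e / e.sum()

probs = forward(np.array([0.2, -1.0, 0.5]))
```

Training would then adjust `W1` and `W2` to reduce the error between `probs` and the desired labels; a deep network simply stacks many more such hidden layers.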
Deep learning is closely related to a specific field connected with medical devices, i.e., biomedical engineering. Biomedical data evolves continuously and feeds each network with new information. The features of biomedical data, its complex behavior and expanded size, stimulate the design of DNNs. With the application of deep learning networks, biomedical data derived from various sources has paved the way for new innovations and discoveries that are in practical use; these applications illustrate the technical direction of the field. In general, biomedical data includes a vast collection of information from different medical fields. In the research studied, the applications of deep learning algorithms are categorized into five sub-areas of increasing spatial scale, which include genomics, proteomics, chemoinformatics, biomedical imaging, and healthcare and transcriptomics.
Deep learning applied to the medical field is one of the emerging areas in the world of science and development. With this in mind, every piece of equipment designed and developed for medical analysis has extended abilities, such as processing modules for signal and image data, and these data can be featurized and categorized using deep learning algorithms. Quantitative information is required for diagnostic purposes, and these features are independent of variable changes and device type; the devices should be capable of producing highly efficient results even in noise-filled environments. Deep learning helps in detecting minute changes or variations in the system with respect to an individual's data. Interpretation of biomedical data using DNNs initiates a new era in diagnosing and predicting the abnormalities produced in the human system due to certain changes. Deep learning techniques are widely used in diagnostic systems and produce effective advancements in the field of biomedical engineering; thus, the deep learning network currently plays an ambitious role in the diagnostic and prediction process.
References
1. LeCun, Y., Bengio, Y., Hinton, G., Deep learning. Nature, 521, 436–444, 2015.
2. Shrestha, A. and Mahmood, A., Review of Deep Learning Algorithms and
Architectures. IEEE Access, 7, 53040–53065, 2019.
3. Leung, M.K., Xiong, H.Y., Lee, L.J., Frey, B.J., Deep learning of the tissue-regulated splicing code. Bioinformatics, 30, i121–i129, 2014.
4. Ravì, D. et al., Deep Learning for Health Informatics. IEEE J. Biomed. Health
Inf., 21, 1, 4–21, Jan. 2017.
5. Yang, X., Ye, Y., Li, X., Lau, R.Y.K., Zhang, X., Huang, X., Hyperspectral
Image Classification With Deep Learning Models. IEEE Trans. Geosci.
Remote Sens., 56, 9, 5408–5423, Sept. 2018.
6. Wang, Y., Application of Deep Learning to Biomedical Informatics. Int. J.
Appl. Sci. – Res. Rev., 5, 3, 1–3, 2016.
7. Zemouri, R., Zerhouni, N., Racoceanu, D., Deep Learning in the Biomedical
Applications: Recent and Future Status. Appl. Sci., 9, 1526, 2019.
8. Wang, S., Fu, L., Yao, J., Li, Y., The Application of Deep Learning in Biomedical
Informatics. 2018 International Conference on Robots & Intelligent System
(ICRIS), Changsha, pp. 391–394, 2018.
9. Yu, Y., Si, X., Hu, C., Zhang, J., A Review of Recurrent Neural Networks: LSTM
Cells and Network Architectures. Neural Comput., 31, 7, 1235–1270, 2019.
10. Tong, W., Li, L., Zhou, X. et al., Deep learning PM2.5 concentrations with
bidirectional LSTM RNN. Air Qual Atmos. Health, 12, 411–423, 2019.
11. Dey, R. and Salem, F.M., Gate-variants of Gated Recurrent Unit (GRU) neural networks. 2017 IEEE 60th International Midwest Symposium on Circuits
and Systems (MWSCAS), Boston, MA, pp. 1597–1600, 2017.
12. Jiang, P., Chen, Y., Liu, B., He, D., Liang, C., Real-Time Detection of Apple Leaf
Diseases Using Deep Learning Approach Based on Improved Convolutional
Neural Networks. IEEE Access, 7, 59069–59080, 2019.
13. Amin, S.U., Alsulaiman, M., Muhammad, G., Bencherif, M.A., Hossain, M.S.,
Multilevel Weighted Feature Fusion Using Convolutional Neural Networks
for EEG Motor Imagery Classification. IEEE Access, 7, 18940–18950, 2019.
14. Kaur, M. and Singh, D., Fusion of medical images using deep belief networks.
Clust. Comput., 23, 1439–1453, 2020.
15. Zhang, N., Ding, S., Liao, H. et al., Multimodal correlation deep belief networks for multi-view classification. Appl. Intell., 49, 1925–1936, 2019.
16. Khamparia, A. and Mehtab Singh, K., A systematic review on deep learning architectures and applications. Expert Syst., 36, 1–22, 2019, https://doi.
org/10.1111/exsy.12400.
17. Mosavi, A., Ardabili, S., Várkonyi-Kóczy, A.R., List of Deep Learning Models,
in: Engineering for Sustainable Future. INTER-ACADEMIA 2019. Lecture
Notes in Networks and Systems, vol. 101, A. Várkonyi-Kóczy (Ed.), Springer,
Cham, 2020.
18. Warburton, K., Deep learning and education for sustainability. Int. J. Sust.
Higher Ed., 4, 1, 44–56, 2003.
19. Gawehn, E., Hiss, J.A., Schneider, G., Deep Learning in Drug Discovery. Mol.
Inf., 35, 3–14, 2016.
20. Norgeot, B., Glicksberg, B.S., Butte, A.J., A call for deep-learning healthcare.
Nat. Med., 25, 14–15, 2019, https://doi.org/10.1038/s41591-018-0320-3.
21. Liu, W., Wang, Z., Liu, X., Zeng, N., Liu, Y., Alsaadi, F.E., A survey of deep
neural network architectures and their applications. Neurocomputing, 234,
11–26, 2017.
22. Hemalatha, R.J., Vijaybaskar, V., Thamizhvani, T.R., Automatic localization
of anatomical regions in medical ultrasound images of rheumatoid arthritis using deep learning. Proc. Inst. Mech. Eng., Part H: J. Eng. Med., 233, 6,
657–667, 2019.
7
Evolution of Electronic Health Records
G. Umashankar*, Abinaya P., J. Premkumar, T. Sudhakar
and S. Krishnakumar
Department of Biomedical Engineering, Sathyabama Institute of Science and
Technology, Chennai, India
Abstract
IoMT is a connected infrastructure of medical devices, software applications, health systems, and services, integrated in various forms of medical applications. The evolution of IoMT begins in the 1500s, and the field is still growing dramatically. The evolution of electronic health records (EHRs) began in the 1960s, and they will keep their own importance in the future. EHRs automate and streamline the clinician's workflow and make the process easy: they can generate the complete history of a patient and also assist further treatment, helping the patient recover in a more effective way. EHRs are designed for convenience depending on the sector in which they are implemented. The main aim of EHRs is to make records available to the concerned person wherever they are, to reduce the workload of maintaining clinical book records, and to allow the details to be used for research purposes with the concerned person's acknowledgement. Thus, with the influence of the IoT, the process of maintaining medical records has become even easier and more effective.
Keywords: IoMT, EHR, sensors, health data, data security
7.1 Introduction
Electronic health records (EHRs) are an essential component of the healthcare industry because they allow accurate, systematic organization of patient data. EHRs improve healthcare, lower healthcare costs, and improve clinical diagnosis [1]. EHRs (allowing pragmatic clinical trials)
*Corresponding author: umashankar.bme@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (143–160) © 2022 Scrivener Publishing LLC
on a macro-economic scale can help decide whether new therapies or improvements in the delivery of healthcare result in better performance or health savings [2]. EHRs have primarily developed as a way of improving the quality of healthcare and gathering billing data. Primary clinical care and public health share the priority of strengthening the health of patients and families, but they seldom create concrete partnerships to strengthen the well-being of patients and populations [3].
In the healthcare and general practice communities, there is extensive research on the use of information technology (IT), including EMRs. These reports discuss diabetes care problems, accuracy of the health record, decision-making mechanisms, electronic correspondence, provider results, and patient outcomes. While there is some evidence of improved quality in areas such as treatment avoidance and compliance with requirements, several challenges have been reported. These include the variable clarity and reliability of medical record content, lack of time, and the funding needed to tackle progress and to provide proper training and support [4].
7.2 Traditional Paper Method
The paper-based medical record is woefully insufficient to meet the needs of modern medicine. In the 19th century, it emerged as a highly personalized "lab diary" that clinicians could use to document their observations and plans so that, when they next visited the same patient, they could be reminded of important information. There were no bureaucratic requirements, no assumption that the record would be used to support communication between various providers of treatment, and remarkably little data or test results to fill the record sheets. Over the decades, as healthcare and medicine have changed, the record that fulfilled the needs of clinicians a century ago has struggled mightily to meet modern standards [5].
7.3 IoMT
Medical devices that are able to transmit data over a network without involving human-to-human or human-to-machine contact are referred to as the Internet of Medical Things (IoMT) [6]. IoMT extends the Internet of Everything (IoE) notion to capture and preserve the overwhelming amount of information produced during sensing and transmission. The interconnection of wearable systems not only generates uncertainty but also encourages massive use of energy and other resources, which is the greatest barrier to managing the medical environment in an efficient way. Energy-aware practices are the most promising way to cope with measuring energy drain and efficiently provisioning health data delivery [7].
IoT products such as electrodes, including ECG, blood pressure monitors, and EGD, may have varying uses in the healthcare industry. This network of sensors, actuators, and other mobile communication instruments is poised to revolutionize the working of the healthcare sector. These networks, known as the Internet of Medical Things (IoMT), are an integrated web of medical instruments and software that gather data, which is then supplied via online computing networks to healthcare IT systems. Today, 3.7 million connected electronic instruments are in operation, monitoring different areas of the body to advise healthcare decisions. The literature variously refers to the Internet of Things in healthcare as IoT-MD, IoMT, Medical IoT, mIoT, and IoHT; we use IoMT to link to the definition-based Internet of Medical Things [8].
7.4 Telemedicine and IoMT
Healthcare is one of a human being’s most significant fundamental needs.
Access to healthcare facilities is still really critical for ordinary residents.
There are, however, some challenges in this field and it is difficult to overcome them overnight. In this case, telemedicine and IoMT help improve
the quality and affordability of healthcare. It is important to expand the
telemedicine and IoMT infrastructure anywhere [9].
A medical system is where medical professionals have the ability to
diagnose, analyze, and treat a patient from a remote location with cellular
technology, and the patient has the ability to obtain medical information
easily, efficiently, and without any risk of communication [9].
7.4.1 Advantages of Telemedicine
Various opportunities are provided through telemedicine. Few of them are
mentioned before and few of them are listed below:
• All-round service is the most common telemedicine service
available
• Less risk of booking cancellation
• Cost efficiency
• Less chance of infectious diseases
• Less waiting time
146
The Internet of Medical Things (IoMT)
• Most efficient in emergency cases
• Cost-effective
• Doctors and nurses act as second eyes [9].
7.4.2 Drawbacks
The gift of the century is a telemedicine program. However, there are
some limitations that we may call disadvantages. Below are some of the
drawbacks.
• Difficult to handle the big stuff
• The level of trust could be lower
• There are also rural areas outside telecommunications
• Access to 3G data is also expensive
• Constant change in technology
• Consultation is a major concern [9].
7.4.3 IoMT Advantages with Telemedicine
IoMT provides us the ability to access higher quality, more customized,
and cost-effective healthcare facilities. Basically, through connected
devices, IoMT provides different healthcare support. There are several
IoMT advantages, some of which are described below:
• Cost-effective: IoMT gives patients the ability to access healthcare facilities using their linked smartphones. This removes the hassle of standing in front of the doctor's room, since appointments can be accessed remotely by patients.
• Foster care results: Connected instruments provide the patient with real-time observation, which helps to enhance the medication's outcome.
• Efficient disease management: IoMT enables systems to monitor a person 24/7, transmitting regular reports on the individual's health. This knowledge makes it easier for healthcare practitioners to make an advanced judgment about possible diseases, and it prevents infections from spreading within a single area.
• Shrink errors: When both the data and the patient are under surveillance, errors are less likely to be made. IoMT addresses patients' needs for proactive care, increasing precision and, most importantly, enabling early action by clinicians to help develop patient confidence and increase the consistency of treatment outcomes [9].
• Improved drug control: It helps improve the management of medications in the supply chain. IoMT is all about real-time assessment and effective data that help physicians make decisions, and it is the path to the development of future healthcare systems.
7.4.4 Limitations of IoMT With Telemedicine
The IoMT includes WiFi-connected wearable devices. The key thing is that
there is no Wi-Fi available anywhere and the gadgets are both pricey and
amid control. Other aspects that we need to take into account are also
available. The most critical thing is to take data confidentiality into account
because health information is vulnerable and it will also ruin the future if
it spills, hampering personal life. Another thing is that the IoMT needs a
hybrid cloud environment. Maintaining cloud encryption is difficult.
Telemedicine is the health treatment of the day. In Bangladesh, due to its
large population and scarce healthcare services, telemedicine is desperately
needed to expand. Telemedicine and IoMT are both the most powerful
means of developing healthcare services and the responsibility of the government to resolve the barrier [9].
7.5 Cyber Security
Every layer of a networked architecture is vulnerable, so each layer of the structure has to be secured; a single weak layer can bring down the entire system. Due to the heterogeneity of the data and the need for decision-making intelligence, a multilayered security model is suggested [10].
7.6 Materials and Methods
7.6.1 General Method
Design intelligent algorithms to aid decision-making, administer customized treatment, and ensure treatment compliance.
Build automated technologies that can store and evaluate disease data, improve our knowledge of disease, and assess the efficiency of physicians.
148
The Internet of Medical Things (IoMT)
Ensure that access to care is available from home, not only from the hospital, and make access to one’s own health records a fundamental right.
Make computers and sensors that capture health data readily accessible, while protecting patients’ health records and privacy to discourage abuse of information.
We need to digitize healthcare to guarantee access to safe, affordable services for all, while averting the danger of ubiquitous access to private health information [11].
7.6.2 Data Security
There has lately been a growing trend toward supporting medical and e-health systems with blockchain technologies. With its open and trustworthy nature, blockchain has shown tremendous promise in numerous sectors of e-health, such as the protected exchange of EHRs and the control of data access among multiple medical agencies. Blockchain implementations also offer promising solutions to improve the delivery of healthcare and thereby revolutionize the healthcare industry [12].
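As a toy illustration of why a blockchain makes shared EHRs tamper-evident, the sketch below chains record hashes so that altering any entry invalidates every later hash. The record fields and block layout are hypothetical, not the design of the system in [12]:

```python
import hashlib
import json

def block_hash(prev_hash: str, record: dict) -> str:
    """Hash a record together with the previous block's hash (SHA-256)."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Chain EHR entries so altering any one invalidates all later hashes."""
    chain, prev = [], "0" * 64  # genesis hash
    for rec in records:
        h = block_hash(prev, rec)
        chain.append({"record": rec, "hash": h, "prev": prev})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; any tampered record breaks the chain."""
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or block_hash(prev, blk["record"]) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

chain = build_chain([{"patient": "P001", "bp": "120/80"},
                     {"patient": "P001", "bp": "135/90"}])
assert verify(chain)
chain[0]["record"]["bp"] = "110/70"   # tamper with an earlier record
assert not verify(chain)
```

A real deployment would add signatures and consensus; the point here is only that the hash chain makes any retroactive edit detectable.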
A layered security model has also been proposed to mitigate these problems. Every tiered network-access arrangement is susceptible, and each layer of the structure must be secured in order to be safe; a single faulty layer might bring the entire system to a halt. A multi-layered security strategy is recommended due to the heterogeneity of the data and the need for decision-making intelligence [10].
7.7 Literature Review
This report compiles data on EHR systems from a range of national and healthcare contexts, including low- and middle-income country (LMIC) settings, diverse institutional systems, and different types of health systems. It illustrates the “maturity” of different technical infrastructures and EHR frameworks and their consequent demands on human capital. The analysis highlights the problems facing the use of EHR programs to strengthen Asia’s public health. Highly variable infrastructural constraints associated with supporting EHR systems (e.g., stable energy and mobile technologies) add a degree of complexity to the device specifications and limit the level of EHR sophistication that can be enabled. Harm can also be implied in the implementation of EHRs for public health use in a given environment [3].
Energy: Effective data transfer in the IoMT is a desperate need of today’s medical industry. To satisfy the requirements of patients and physicians, and to conserve energy for the betterment of society as a whole, the EEOOA (energy-efficient on/off algorithm) has been proposed. The paper’s main contribution is two-fold. First, EEOOA is proposed with an energy model, focusing on the energy drain of the transmission component during data transmission in the IoMT. Second, a new three-layer data-transfer system for the IoMT is introduced. The working principle of the proposed EEOOA is based on the sensor devices’ active and sleep cycles; in this manner, more electricity is saved and less resource drain occurs [7].
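The active/sleep working principle can be illustrated with a simple duty-cycle energy model. This is a minimal sketch under assumed power figures, not the actual EEOOA of [7]:

```python
def energy_consumed(t_total_s, duty_cycle, p_active_mw, p_sleep_mw):
    """Energy (in mJ) for a sensor radio that is only on part of the time.

    duty_cycle: fraction of the interval the radio is active (0..1).
    """
    t_active = t_total_s * duty_cycle
    t_sleep = t_total_s - t_active
    return t_active * p_active_mw + t_sleep * p_sleep_mw

# Always-on radio vs. a 10% on/off schedule over one hour (3600 s).
# The 60 mW active / 0.05 mW sleep figures are illustrative assumptions.
always_on = energy_consumed(3600, 1.0, p_active_mw=60.0, p_sleep_mw=0.05)
duty_10 = energy_consumed(3600, 0.1, p_active_mw=60.0, p_sleep_mw=0.05)
saving = 1 - duty_10 / always_on
print(f"saving: {saving:.1%}")
```

Even with these rough numbers, the model shows why an on/off schedule dominated by sleep time cuts the energy drain of the transmission component by roughly an order of magnitude.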
Security is one of the big issues in the digital age of mobile devices. With so many devices connecting to the network, the number of vulnerabilities they expose to an attacker is very high. In the case of the IoT, there are also several hardware weaknesses, besides the software vulnerabilities, that an attacker can exploit to gain network access. This work introduces a device-authentication method that authenticates the devices present on the network using PUFs (physical unclonable functions). A PUF module that can be used for verification is required for each device, but the challenge–response pairs are not stored in the server memory. This helps in cases where a device is compromised, since device data is not revealed to the adversary [13].
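The challenge–response flow behind PUF authentication can be sketched as follows. A real PUF derives its response from physical chip variations; the keyed hash standing in for it here, and the single enrollment-time challenge–response pair the server keeps, are illustrative simplifications rather than the storage-free design described in [13]:

```python
import hashlib
import os

class SimulatedPUF:
    """Stand-in for a hardware PUF: a device-unique secret mapped through a hash.

    A real PUF derives the response from intrinsic physical variations of the
    chip rather than from a stored secret.
    """
    def __init__(self):
        self._secret = os.urandom(16)  # models the chip's intrinsic uniqueness

    def respond(self, challenge: bytes) -> bytes:
        return hashlib.sha256(self._secret + challenge).digest()

# Enrollment: the server records one challenge-response pair (CRP) for the
# device, so no device key material needs to be extracted or shared.
device_puf = SimulatedPUF()
challenge = os.urandom(16)
server_crp = (challenge, device_puf.respond(challenge))

# Authentication: the server replays the challenge and compares responses.
def authenticate(puf, crp) -> bool:
    challenge, expected = crp
    return puf.respond(challenge) == expected

assert authenticate(device_puf, server_crp)          # genuine device passes
assert not authenticate(SimulatedPUF(), server_crp)  # clone/other device fails
```

Because the response depends on hardware no attacker can copy, a stolen database of challenges alone is not enough to impersonate a device.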
The field of IoMT-based technologies and IoMT systems was discussed from a multi-layer perspective in this study. The authors also found that the cyber-physical systems (CPS) method allows better control not only of system robustness, stability, and reliability but also of inspection and validation. Cyber-physical modeling is an efficient simulation tool for designing, building, testing, and deploying such structures, since these challenges are important when designing biomedical systems. A full list of CPS methods used in the IoMT was compiled and discussed, and potential IoMT research directions were proposed [6].
Steady growth has occurred in the use of EHR systems by healthcare organizations. However, many healthcare providers have been hesitant to implement the EHR, and accessibility, interoperability, and protection need to be improved. Policies that resolve the existing obstacles to more extensive use of EHRs, and study findings that analyze whether these reward schemes translate into improved healthcare outcomes, must follow [5].
Perspective of social influence: This study analyzed physicians’ plans to increase their usage actions in relation to the volume and diversity of functions of EHR systems in the healthcare operating environment. It advances an understanding of how physicians’ intent to extend their use of EHRs is influenced by social factors (rewards and group preferences in this particular study), by designing and testing a theoretical model. Furthermore, the research instructs clinicians to be mindful of the social conditions that can influence their widespread use of EHRs [14].
This paper introduces an IoMT program called DiabLoop for the identification and assistance of diabetic patients. It offers many features, such as diagnosis alerts and recommendation warnings, doctors’ drug orders, and a predictive-curve dashboard for tracking the evolution of the patient’s blood sugar [15].
Work has also been undertaken to decrease the insecurity of records. One such work documents the security changes that can be made: health information management systems face different security threats, and the authors were able to effectively reduce the security vulnerabilities and compromises created by hackers. They implemented SHA-3 (Secure Hash Algorithm 3) and used a given salt and key to protect the data stored on the server. To avoid any loopholes in the device, SHA-3 protection is applied across the entire server GUI. This architecture is easy to execute and requires few code changes to the system. The protection of the framework can be upgraded by adopting a newer version of the SHA-3 salt [16].
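Salted, keyed hashing with SHA-3 can be sketched with Python’s standard library. Strictly speaking, SHA-3 is a hash rather than an encryption cipher, so this sketch protects record integrity with an HMAC over salt plus record; the field layout and key handling are illustrative assumptions, not the cited system’s design:

```python
import hashlib
import hmac
import os

SERVER_KEY = os.urandom(32)  # server-held secret key ("pepper")

def protect(record: bytes, salt: bytes = None):
    """Return (salt, digest): HMAC-SHA3-256 over salt + record."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hmac.new(SERVER_KEY, salt + record, hashlib.sha3_256).hexdigest()
    return salt, digest

def verify(record: bytes, salt: bytes, digest: str) -> bool:
    """Constant-time comparison against the stored digest."""
    return hmac.compare_digest(protect(record, salt)[1], digest)

salt, tag = protect(b"patient:P001;bp:120/80")
assert verify(b"patient:P001;bp:120/80", salt, tag)
assert not verify(b"patient:P001;bp:999/99", salt, tag)  # tampering detected
```

The per-record salt prevents identical records from producing identical digests, and the server key means an attacker who copies the database alone cannot forge valid tags.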
7.8 Applications of Electronic Health Records
7.8.1 Clinical Research
7.8.1.1 Introduction
Medical records provide tools for optimizing the treatment of patients, for tracking changes in the efficacy of clinical practice, and for optimizing the identification and recruitment of eligible patients and healthcare personnel in clinical studies. On a macro-economic scale, EHRs (through pragmatic clinical trials) will help determine whether novel therapies or advancements in healthcare delivery result in improved health outcomes or savings [2].
7.8.1.2 Data Significance and Evaluation
The consistency and validation of data are critical factors when determining if EHRs may be an acceptable source of data in clinical trials. When
healthcare facilities enter data directly into the EHRs or when EHRs are
used in all aspects of the health system, questions about coding inconsistencies or bias presented by reviewing codes based on billing benefits
instead of on medical evaluation may be minimized, but such programs
have not yet been commonly used. Errors have indeed been recorded in
the EHR. Reliable data capture is indeed a key aspect of EHR-based clinical
trials, particularly where EHRs are being used for computation of endpoints or the compilation of SAEs [2].
7.8.1.3 Conclusion
EHRs are a promising platform for increasing the reliability of clinical trials and for drawing on new testing methods. The pace of technology has provided ever-accelerating processing capability, but appropriate measures must be taken to protect privacy and to ensure the adequacy of informed consent. Current projects have put inventive solutions to these problems in place, using distributed analyses that let organizations retain access to their information and that involve patient stakeholders. Whether EHRs can be applied directly to standard trial conduct and enrollment checks remains to be shown, and will depend on cost efficiency, evidence of authenticity, and the achievement of anticipated efficiencies [2].
7.8.2 Diagnosis and Monitoring
7.8.2.1 Introduction
The Internet of Medical Things (IoMT) is a collection of medical software applications that are linked to health information systems via online computer networks. IoMT-based equipment networking is facilitated by medical devices equipped with Wi-Fi or other modern wireless communication technologies. IoMT machines can be connected to cloud platforms, such as Amazon Web Services, where the generated data can be processed and analyzed. In this sense, cited as one of the focal areas in accomplishing the mission of the ESP 2025 plan, the healthcare market benefits from the allocation of 36 billion CFA francs of economic means specifically devoted to the development of software programs covering a wide range of subjects, including the control of chronic diseases [15].
7.8.2.2 Contributions
The main contributions are as follows: a proposed IoMT system architecture to identify and improve the daily lives of diabetic patients; a scheduling approach that lets the device take the blood glucose recorded by the patient and analyze it against in-depth knowledge data and trends, immediately sending a medical note, specific advice, or a warning to the parent and/or the doctor in the event of an emergency; and implementation of the DiabLoop IoMT program, with screenshot-based validation and evaluation of the devices in the case study [15].
7.8.2.3 Applications
The program has three main features: formulating meals and estimating the amount of carbohydrate listed in the nutritional directory; customizing the nutritional base with one’s own data; and including one’s own insulin-sensitivity measurements, insulin doses, and similar factors that may influence blood sugar in the logbook, with graphical visualization [15].
In the evaluation and validation of the method, judging from the screenshots of the program, it can be inferred that the DiabLoop program is capable of changing the living conditions of diabetic patients and of letting physicians track their patients remotely without cluttering the health system with visits from many patients. The mobile front end of the system is ready for testing; the back end is under development and is expected to be finalized shortly, with all the required functionalities implemented [15].
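A DiabLoop-style alerting step can be sketched as a threshold check on each reading that routes a routine note, specific advice, or a warning to the appropriate recipients. The thresholds and routing below are hypothetical illustrations, not values from [15], and are not clinical guidance:

```python
def classify_glucose(mg_dl: float) -> str:
    """Map a blood-glucose reading (mg/dL) to an action level.

    Thresholds are illustrative assumptions only.
    """
    if mg_dl < 54 or mg_dl > 250:
        return "warning"   # notify relative and doctor immediately
    if mg_dl < 70 or mg_dl > 180:
        return "advice"    # send specific advice to the patient
    return "note"          # log a routine medical note

def route(mg_dl: float):
    """Return the action level and who should receive the message."""
    level = classify_glucose(mg_dl)
    recipients = {"warning": ["relative", "doctor"],
                  "advice": ["patient"],
                  "note": ["logbook"]}[level]
    return level, recipients

assert route(110) == ("note", ["logbook"])
assert route(190) == ("advice", ["patient"])
assert route(300) == ("warning", ["relative", "doctor"])
```

A production system would analyze trends over many readings rather than a single value, as the paper’s scheduling approach describes.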
7.8.3 Track Medical Progression
7.8.3.1 Introduction
Circulatory disease remains the major cause of death in the United States, and coronary artery disease accounts for more than 60% of all heart attack cases. In addition, conditions such as heart failure, valvular heart disease, and irregular heartbeat can lead to a wide range of complications. Reducing the identified risk factors for heart events, including tobacco use, hypertension, hyperlipidemia, diabetes mellitus, and obesity, and tracking their changes over time, are vital to lowering the rate of new cardiac events. To track such changes, the usual approach is to first recognize all temporal expressions and then assign each to the nearest target term. The association between them may be based on the distance between the expression and the term, or purely on syntactic parsing. Associations derived from such a strategy, on the other hand, may not be correct, especially if the text perceived by the natural language processing (NLP) system is inaccurate or includes arbitrary line breaks. In view of this, the study proposes a context-aware approach, which first enriches the context of a concept by rearranging it with the appropriate temporal details. The algorithm then determines the correct temporal attributes of all recognized terms, based on the formulated context-time analysis, relative to the reference time of the medical documents [17].
7.8.3.2 Method Used
Baseline system relying on a pipeline: This analysis integrates a baseline system to discover the efficacy of current programs for monitoring the progression of risk factors for heart disease. The post-processing section maps identified medicinal products to their medicinal-product categories using the same standardization measures as specified in the “Identification of Medicines” section. All instances of medical terminology in the training collection are regarded as training cases, and their related class labels serve as the associated time attributes. The features used include the tokens of the target clinical concept, the surrounding phrase tokens, the section information, and the type of the concept (e.g., mention, event, or symptom). The custom components are merged into cTAKES using the Unstructured Information Management Architecture (UIMA) platform [17].
7.8.3.3 Conclusion
This article proposes a context-aware approach for tracking the progression of medical concepts by analyzing their temporal attributes. The system recognizes concepts using pattern-matching and machine-learning methods. After the relevant temporal information has been identified, the knowledge-search algorithm enriches the context of a known concept. For the five main risk factors considered, a fair F-score of 0.827 was achieved using a baseline method that relies on the average probability of the time attribute and its classification model, without recognizing the temporal information in the text. Nevertheless, in terms of precision, the suggested technique surpassed the non-context-aware method and yielded an improvement in the F-score of 0.055. The advantage of the discussed method is that it intelligently extracts temporal information from the text, including knowledge obscured across separate sentences [17].
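For reference, the F-score reported in such evaluations is the harmonic mean of precision and recall; the precision/recall values below are back-solved for illustration only, not figures from [17]:

```python
def f_score(precision: float, recall: float) -> float:
    """F1 score: the harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Illustrative values only: a system at P = 0.85, R = 0.805 lands near 0.827.
print(round(f_score(0.85, 0.805), 3))
```

Because the harmonic mean punishes imbalance, a 0.055 gain in F-score generally requires improving precision and recall together rather than trading one for the other.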
7.8.4 Wearable Devices
7.8.4.1 Introduction
Nowadays, we all live in an extremely developed world, where everything is driven by technology, and we swim in a vast pool of knowledge and entertainment. A smartphone is currently used by about 60% of the global population. Today we use our smart devices for almost everything: information, work, entertainment, play, connectivity, and discovery. According to new estimates, by 2024, 1,012 computers will be accessed through the internet, and the estimated global value of the IoT will be US$6.25 trillion, with a 31% rise from healthcare devices ($2.4 trillion), that is, around 26 wearable technology objects for every human being on Earth. This illustrates how interconnected today’s world is and the growth of connected devices over the decades. By any metric, the amount of open IoT data produced, involving thousands of wired devices, would be astronomically massive. The planet has entered a new age of networking that also encompasses the human realm. More devices and objects in our modern world will connect with other devices and with us through built-in sensors, without human influence. These “sensor nodes” have the power to recognize, to sound, to feel, and to hear the globe around them. According to numerous surveys conducted by a wide variety of organizations, the wearable segment is likely to soon become one of the biggest industries across the globe [18].
7.8.4.2 Proposed Method
More systems and things in our real world will communicate with other devices and with us through built-in sensors, without human involvement; these “sensor nodes” have the strength to perceive, to sound, to feel, and to hear the globe around them. If the correlated values of a sensing element exceed the threshold value, a caution appears on the screen of the mobile application requesting the user’s authorization to turn on the air-conditioning unit. The quick-press knob acts as a distress button: pressing it five times within 3 seconds activates a set of app operations. First, it sends the longitude and latitude values (the real-time location) to the 911 dispatch and police force; second, it automatically places the 911 patrol-officer call previously assigned to the user; and third, it instantaneously shows the location and contact on the app screen.
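The five-presses-within-3-seconds trigger can be implemented as a sliding window over press timestamps. The constants mirror the text; everything else is an assumed sketch:

```python
from collections import deque

class DistressButton:
    """Fires once `presses` clicks arrive within a `window_s`-second window."""

    def __init__(self, presses: int = 5, window_s: float = 3.0):
        self.presses = presses
        self.window_s = window_s
        self._times = deque()

    def press(self, t: float) -> bool:
        """Record a press at time t (seconds); return True if the alarm fires."""
        self._times.append(t)
        # Drop presses that have fallen out of the sliding window.
        while self._times and t - self._times[0] > self.window_s:
            self._times.popleft()
        return len(self._times) >= self.presses

btn = DistressButton()
assert not any(btn.press(t) for t in [0.0, 0.8, 1.5])  # too few presses so far
assert btn.press(2.0) is False
assert btn.press(2.9) is True                          # 5 presses within 3 s
```

On a real device, `t` would come from the hardware clock, and firing would kick off the location-reporting and emergency-call actions described above.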
7.8.4.3 Conclusion
Wearable technology can be an integral part of a person’s everyday life. Various multi-purpose technical features for consumer comfort need to be implemented in the future, but it would be cumbersome for the consumer to carry more than two wearable devices. In the proposed scheme, multidisciplinary objectives spanning a multitude of features have been implemented. The common aspects of the planned scheme include medical checks, remote telemedicine, calling and delivering critical information to the appropriate authorities and individuals of interest in times of emergency, and the functionality of IoT-based smart devices. Combined with the powerful introduction of technologies that use less power and memory, the user-friendly device and user interface make the whole platform a standard managed-care system [18].
7.9 Results and Discussion
EHRs are the parts of a patient’s medical record that are stored in a database file, together with the technical advantages gained from providing an EHR. The Institute of Medicine (IOM) laid out eight main tasks that an EHR should be able to perform:
• Health information and data: This includes better access to the information required by care professionals, using a defined data set that includes medical and nursing conditions, a list of drugs, allergies, demographics, clinical narratives, laboratory test reports, and more.
• Control of results: Electronic results allow improved analysis and easier identification and evaluation of medical problems; this eliminates redundant assessments and increases the quality of care across different providers.
• CPOE (computerized physician order entry) systems improve the workflow of order processing, remove missing orders and ambiguities created by illegible handwriting, track redundant orders, and decrease the time needed to fill orders.
• Assistance for decisions: This covers treatment, drug prescription, monitoring and control, and the identification of adverse effects and disease outbreaks.
• In fields such as vaccination, breast cancer screening, colorectal screening, and cardiovascular risk mitigation, device alerts and prompts boost prevention activities.
• Electronic communication and connectivity: Improving patient safety and quality of care among care partners, particularly for patients with multiple providers; for instance, patient education and home supervision using electronic devices.
• Patient care: Administrative practices and monitoring improve the efficacy of healthcare facilities and provide customers with faster, timelier treatment.
• Monitoring and population health: Promotes the monitoring and prompt reporting of adverse reactions and epidemic outbreaks against primary standard metrics [19].
Three key requirements for an EHR were defined by the CPRI:
• Collect data at the point of treatment and combine data from various sources.
• Provide decision support.
• Reach the next level in measurement and data collection when compared with IoMT for health information.
The uses of IoMT are comparatively broader, and it is very accurate and beneficial in the field of medicine. It is all the more fitting that, when medicine meets technology, it reaches the next level in terms of operation and protection.
Many healthcare providers use IoMT apps to optimize procedures, control illnesses, minimize complications, improve patient service, control medications, and decrease costs. They suggest that this increase may be due to the upper hand of remote health facilities, which can diagnose persistent life-threatening diseases. From this, we can conclude that the IoT has taken hold, and that people will have autonomous awareness of their well-being needs; they can tune their systems to notify them of appointments, calorie counts, exercise management, changes in blood pressure, and much more. There are, however, new safety challenges related to confidentiality, integrity, and availability (CIA). As most IoT components transfer signals over wireless links, the IoMT is at risk of data breaches at the wireless sensor nodes. Extreme data breaches affect IoT well-being and the storage and exchange of data in the network. All of these concerns bear on the protection of patient data and privacy; the confidentiality and safety of patient data have been compromised by attacks on multiple connected computers, which can also contribute to adverse outcomes [20].
The use of new information technology has strongly facilitated the implementation of innovative healthcare. Electronic health has become a wide and growing field, with a variety of scientific publications in medical journals.
• The nations that have published e-health documents collaborate very effectively, particularly the relatively close countries of the European Union, with the USA ranking first in the number of documents filed.
• Smart medical records, telehealth, and e-health have become a major part of e-health and telehealth scientific knowledge, and academics have also concentrated on common topics such as privacy, security, and improving social quality.
• The first advance in e-health is interactive well-being, which concentrates on IoT-based smart wearables; the second is the combination of patient monitoring and healthcare.
Moreover, research on the health-related convergence of massive data and complicated cloud-based information services is a key issue [21].
7.10 Challenges Ahead
The IoMT market is still booming with all the technical developments we see today, and its environment is still evolving. Changing demographics, the digital revolution, the influence of government, and the demand for value-based customer service are just some of the reasons behind this transition. Although these changes are gaining momentum, there are also concerns that need to be addressed: record protection, synchronization, and regulation, among many others [22].
Players such as Helix, 23andMe, Myriad Genetics, UK Biobank, and the Broad Institute are making major strides in predictive analytics using genomic data. Genetic data processing makes it possible to estimate the risk of diseases such as cancer, or even IQ. This appears to be the next quantum leap in public health, but it still raises tremendous ethics issues, including those around genetic risk [11].
A number of clever IoMT apps are already being applied across the healthcare sector. When patients can use them themselves, procedures become increasingly simpler. The data produced is processed and analyzed for proper diagnosis and further use. The market demand for this technology is strong, as it creates many jobs and is not costly at all. Data protection is the only concern with this technology [23].
Quantum computing will continue to change the way patients are treated with medication in hospitals, and healthcare providers will deliver a greater level of treatment personalized to each patient on the basis of multiple sensors and analyses [19].
7.11 Conclusion
IoMT is revolutionizing the world of healthcare. Doctors will be able to identify and treat patients more effectively, administer tailored and customized medication, and improve hospital staff coordination and workflow. At a certain stage, the IoT will be the planet’s main bastion of truth. Healthcare practitioners will be able to understand and leverage data mined from interconnected networks, and to understand and model current and future health trends, in order to make more informed decisions. Digital health monitoring, or telehealth, is possible with the IoT, and for patients who live in rural areas it may help in managing chronic diseases. The large variety of biosensors and medical wearables readily available on the market today is another reason why digital health tracking is gaining popularity [24].
References
1. Butt, N. and Shan, J., CyberCare: A Novel Electronic Health Record Management System. 2016 IEEE First Conference on Connected Health: Applications, Systems and Engineering Technologies, 326–327, 2016.
2. Cowie, M.R., Blomster, J.I., Curtis, L.H., Duclaux, S., Ford, I., Fritz, F., Goldman, S., Janmohamed, S., Kreuzer, J., Leenay, M., Michel, A., Ong, S., Pell, J.P., Southworth, M.R., Stough, W.G., Thoenes, M., Zannad, F., Zalewski, A., Electronic health records to facilitate clinical research. Clin. Res. Cardiol., 106, 1, 1–9, 2017.
3. Dornan, L., Pinyopornpanish, K., Jiraporncharoen, W., Hashmi, A., Dejkriengkraikul, N., Angkurawaranon, C., Utilisation of Electronic Health Records for Public Health in Asia: A Review of Success Factors and Potential Challenges. Biomed. Res. Int., 2019, 1–9, 2019.
4. Lau, F., Price, M., Boyd, J., Partridge, C., Bell, H., Raworth, R., Impact of electronic medical record on physician practice in office settings: a systematic review. BMC Med. Inform. Decis. Mak., 12, 10, 2012.
5. Shortliffe, E.H., The Evolution of Electronic Medical Records. Acad. Med., 74, 4, 414–419, 1999.
6. Vishnu, S., Ramson, S.R.J., Jegan, R., Internet of Medical Things (IoMT) - An overview. 2020 5th International Conference on Devices, Circuits and Systems (ICDCS), 101–104, 2020.
7. Pirbhulal, S., Wu, W., Mukhopadhyay, S.C., A Medical-IoT based Framework for eHealthcare. 2018 International Symposium in Sensing and Instrumentation in IoT Era (ISSI), 1–4, 2018.
8. Anandarajan, M. and Malik, S., Protecting the Internet of medical things: A situational crime-prevention approach. Cogent Med., 5, 1, 1513349, 2018.
9. Emon, T.A., Rahman, Md.T., Prodhan, U.K., Rahman, M.Z., Telemedicine and IoMT: Its importance regarding healthcare in Bangladesh. Int. J. Sci. Eng. Res., 9, 2, 1782–178, February 2018.
10. Rizk, D., Rizk, R., Hsu, S., Applied Layered Security Model to IoMT. 2019 IEEE International Conference on Intelligence and Security Informatics (ISI), 227, 2019.
11. Singh, M., Internet of Medical Things: Advent of Digital Doctor. AKGEC Int. J. Technol., 9, 1, 26–34, 2013.
12. Nguyen, D.C., Pathirana, P.N., Ding, M., Seneviratne, A., Blockchain for Secure EHRs Sharing of Mobile Cloud Based E-Health Systems. IEEE Access, 7, 66792–66806, 2019.
13. Yanambaka, V.P., Mohanty, S.P., Kougianos, E., Puthal, D., PMsec: Physical Unclonable Function-Based Robust and Lightweight Authentication in the Internet of Medical Things. IEEE Trans. Consum. Electron., 65, 3, 388–397, 2019.
14. Wang, W., Zhao, X., AccFin, J.S., Zhou, G., Exploring physicians’ extended use
of electronic health records (EHRs): A social influence perspective. Health
Information Management: journal of the Health Information Management
Association of Australia. 45, 3, 134–143, 2016.
15. Mbengue, S.M.K., Diallo, O., EL Hadji, M.N., Rodrigues, J.J.P.C., Neto, A.,
Al-Muhtadi, J., Internet of Medical Things : Remote diagnosis and monitoring application for diabetics, 2020 International Wireless Communications
and Mobile Computing (IWCMC), pp. 583–588, 2020.
16. Azhagiri, M., Amrita, R., Aparna, R., Jashmitha, B., Secured Electronic
Health Record Management System. 2018 3rd International Conference on
Communication and Electronics Systems (ICCES), pp. 915–919, 2018.
17. Chang, N.-W., Dai, H.-J., Jonnagaddala, J., Chen, C.-W., Tsai, R.T.-H., Hsu,
W.-L., A context-aware approach for progression tracking of medical concepts in electronic medical records. J. Biomed. Inform., 58, S150–S157, 2015.
18. Sood, R., Kaur, P., Sharma, S., Mehmuda, A., Kumar, A., IoT Enabled Smart
Wearable Device – Sukoon, 2018 Fourteenth International Conference on
Information Processing (ICINPRO), pp. 1–4, 2018.
19. Doyle-Lindrud S. The evolution of the electronic health record. Clinical journal of oncology nursing, 19, 2, 153–154, 2015.
20. Bilal, M.A. and Hameed, S., Comparative Analysis of encryption Techniques
for Sharing Data in IoMT Devices. Am. J. Comput. Scie. Inform. Technol., 8,
1, 46, 2020.
21. Gu, D., Li, T., Wang, X., Yang, X., Yu, Z., Visualizing the intellectual structure and evolution of electronic health and telemedicine research. Int. J. Med. Inform., 130, 103947, 2019.
22. Chenthara, S., Ahmed, K., Wang, H., Whittaker, F., Security and PrivacyPreserving Challenges of e-Health Solutions in Cloud Computing, IEEE
Access. vol. 7, pp. 74361–74382, 2019.
23. Giri, A., Chatterjee, S., Paul, P., Chakraborty, S., Biswas, S., Impact of “Smart
Applications of IoMT (Internet of Medical Things)” on HealthCare Domain
in India. Int. J. Recent Technol. Eng. (IJRTE), 8, 4, 881–885. November 2019.
24. Dias, D., & Paulo Silva Cunha, J. Wearable Health Devices-Vital Sign
Monitoring, Systems and Technologies. Sensors (Basel, Switzerland). 18, 8,
2414, 2018.
8
Architecture of IoMT in Healthcare
A. Josephin Arockia Dhiyya*
Department of Biomedical Engineering, Vels Institute of Science Technology and
Advanced Studies, Chennai, India
Abstract
In today's world, the Internet of Medical Things (IoMT) plays a vital role in the healthcare industry. It opens a new platform for automation and data-processing techniques, and it is reshaping e-health and m-health for the better. The technology chiefly helps medical engineers integrate the Internet of Things with medicine. ECG sensors and other medical sensors are used to track a patient's status, such as skin temperature, glucose, and blood pressure, and they can take the form of wearables such as activity trackers, garments, and watches. Devices must be appropriately fabricated and then licensed by the concerned authorities. The IoMT also plays a vital role in remote monitoring systems for the elderly and in telemedicine. Though the architecture of the IoMT is complex, the processed data can be stored and analyzed. Major applications include assistive care using e-health. The basic architecture is as follows: information is tracked from the patient using sensors, processed by a smart device, stored and analyzed in the cloud, and then transmitted to healthcare workers, family members, and others. Further, the IoMT has the advantage of expanding research to the core, and it also highlights research in the medical field.
Keywords: IoMT, e-health, health, storage, analysis, research
8.1 Introduction
The Internet of Medical Things (IoMT) is a blend of medical devices and applications connected to healthcare information technology systems
Email: a.dhivya.se@velsuniv.ac.in
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (161–172) © 2022 Scrivener Publishing LLC
using networking technologies [1]. It can reduce unnecessary clinic visits and the burden on healthcare systems by connecting patients with their physicians and allowing timely clinical information to be shared over a secure network.
The IoMT market comprises devices, such as wearables and clinical/vital-sign monitors, used strictly for medical purposes on the body, in the home, or in community and clinical settings, and such devices are widely used in medical technology [2].
8.1.1 On-Body Segment
This segment covers wearable technology, which is used extensively in medical fields.
Wearables include consumer-grade devices for personal health or fitness, such as activity trackers, bands, wristbands, sports watches, and smart clothing. Most of these devices are not regulated by health authorities, but they may be endorsed by experts for specific health applications on the basis of informal clinical validation and consumer studies [3].
8.1.2 In-Home Segment
The in-home segment includes personal emergency response systems (PERS), remote patient monitoring (RPM), and medication-related virtual visits [4]. PERS integrates wearable device transfer units with a live medical call-center service to increase independence for home-bound or mobility-restricted seniors. The package lets users request and receive emergency medical care quickly.
RPM comprises all home monitoring devices and sensors used for chronic disease management, which involves continuous monitoring of physiological parameters to support long-term care in a patient's home in an effort to slow disease progression; acute home monitoring, for continuous observation of discharged patients to speed recovery and prevent re-hospitalization; and medication management, to give users drug reminders and dosing information to improve adherence and outcomes.
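The RPM loop described above can be sketched in a few lines; the metric names, normal ranges, and alerting scheme below are illustrative assumptions, not a specification of any particular product:

```python
# Simplified remote patient monitoring (RPM) sketch: read vital signs,
# compare each against a clinician-set normal range, and collect alerts.
# All metric names, ranges, and readings here are illustrative.

NORMAL_RANGE = {"heart_rate": (50, 110), "glucose_mg_dl": (70, 180)}

def check_reading(metric, value):
    """Return an alert message if the reading is outside the normal range."""
    low, high = NORMAL_RANGE[metric]
    if value < low:
        return f"ALERT: {metric} low ({value} < {low})"
    if value > high:
        return f"ALERT: {metric} high ({value} > {high})"
    return None  # reading is within range

# A day's worth of simulated readings from home monitoring devices
readings = [("glucose_mg_dl", 95), ("glucose_mg_dl", 210), ("heart_rate", 72)]
alerts = [msg for metric, value in readings
          if (msg := check_reading(metric, value)) is not None]
# Only the out-of-range glucose reading produces an alert.
```

In a real deployment, the alert would be forwarded to the clinical call center rather than collected in a list, but the threshold-check structure is the same.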
Medication-related virtual visits include virtual health consultations that help patients manage their conditions and obtain prescriptions as recommended. They rely heavily on video protocols, and assessments are made on the basis of computerized tests [5].
8.1.3 Community Segment
There are five components in this segment:
Mobility services allow passenger vehicles to track health parameters during transit.
Emergency response intelligence is intended to help first responders, paramedics, and hospital emergency department care providers.
Kiosks are physical structures, typically with a touch-screen computer, through which products and assistance are provided.
Point-of-care devices are medical devices used by a provider outside the home in non-traditional clinical settings [6].
Logistics covers the transport and delivery of healthcare goods and services, including drugs, medical and surgical supplies, medical devices and equipment, and other products required by care providers. IoMT examples include sensors on drug shipments that measure temperature, shock, humidity, and tilt; end-to-end visibility solutions that track personalized medication for a particular patient using radio-frequency identification (RFID) and barcodes; and drones that offer faster last-mile delivery.
8.1.4 In-Clinic Segment
This segment includes IoMT devices used for administrative or clinical functions, whether in the clinic, in the community model, or at the point of care. Point-of-care devices differ from those in the community segment in one key respect: rather than the care provider physically operating a device, the provider can be located remotely while the device is used by technically qualified staff [7].
8.1.5 In-Hospital Segment
This segment is divided into IoMT-enabled devices and a larger group of solutions in several management areas. Asset management monitors and tracks high-value capital equipment and mobile assets, such as infusion pumps and wheelchairs, throughout the facility [8].
Personnel management measures staff effectiveness and productivity.
Patient flow management improves facility operations by enhancing the patient experience, for instance, by monitoring a patient's movement from an operating room to post-operative care in a ward.
Inventory management streamlines the ordering, storage, and use of hospital supplies, consumables, drugs, and medical devices to reduce inventory costs and improve staff efficiency [9].
Environment (e.g., temperature and humidity) and energy monitoring oversees power use and ensures ideal conditions in patient areas and storage spaces.
Innovative devices include a wearable defibrillator that continuously monitors patients at risk of ventricular tachycardia or fibrillation. Several research works combine an occupancy sensor with a real-time location system receiver to identify which employees use a sanitizer dispenser, applying usage analysis to determine whether staff are following hygiene protocols.
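The hygiene-tracking idea can be sketched as follows; the badge identifiers, timestamps, and the five-second attribution window are invented for illustration:

```python
# Illustrative hand-hygiene compliance sketch: attribute each sanitizer
# dispenser activation to the employee badge the real-time location
# system (RTLS) receiver saw at that moment. All events are invented.

from datetime import datetime, timedelta

dispenser_events = [datetime(2022, 1, 5, 9, 0, 4), datetime(2022, 1, 5, 9, 30, 0)]
badge_reads = [  # (timestamp, badge id) reported by the RTLS receiver
    (datetime(2022, 1, 5, 9, 0, 2), "RN-017"),
    (datetime(2022, 1, 5, 10, 15, 0), "RN-022"),
]

def attribute_use(event, reads, window=timedelta(seconds=5)):
    """Return the badge id seen within `window` of a dispenser event, if any."""
    for ts, badge in reads:
        if abs(ts - event) <= window:
            return badge
    return None  # nobody was detected near the dispenser

usage = {event: attribute_use(event, badge_reads) for event in dispenser_events}
# The 09:00:04 activation is attributed to RN-017; the 09:30:00 one to nobody.
```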
8.1.6 Future of IoMT?
Around 60% of worldwide healthcare organizations have already implemented Internet of Things (IoT) technologies, and an additional 27% were expected to do so by 2019. Traditional healthcare is seeing a paradigm shift as digital transformation puts diagnostically advanced and connected products in the hands of patients and clinicians for better process management.
The benefits of the IoT have changed how small and medium-sized businesses (SMBs) approach devices in the workplace. In today's digital landscape, devices, machines, and objects of all kinds can automatically transfer data over a network, effectively "talking" with one another in real time.
The top five advantages of the IoT are as follows:
1. Cost reduction
2. Efficiency and productivity
3. Business opportunities
4. Customer experience
5. Mobility and agility
For small businesses, increasing reliance on the IoT represents a kind of modern revolution, with 80% of organizations already using an IoT foundation of some sort [10].
For SMBs, this wave of innovation provides opportunities to expand their digital capabilities and a chance to use IoT technology to better their operations and become more productive, more secure, and more profitable.
8.2 Advantages of the Internet of Things
Let us look at several ways to use the future of the IoT and its cutting-edge technology to improve critical parts of a business.
8.2.1 Cost Reduction
Organizations use this type of technology to strengthen their operations, including the company's security and cyber-protection work.
Maintenance costs can be positively affected when IoT devices with sensors are used to keep business hardware running at peak efficiency. On-the-fly troubleshooting of office equipment catches issues before they affect staff and employees, heading off the expense the problems would otherwise cause. This limits costly extended downtime for repairs, just one of the benefits the IoT brings to operations and maintenance workflows [11].
As you can imagine, this technology is especially beneficial to organizations in the manufacturing, logistics, and food and beverage sectors, to name a few.
There are also various ways to use IoT technology to positively affect your bottom line by streamlining common working processes, a top driver of IoT investment for many organizations.
8.2.2 Efficiency and Productivity
Efficiency plays a vital role in the productivity gains the IoT can deliver, and IoT devices improve efficiency in measurable ways [12]. In fact, according to a survey conducted by Harvard Business Review, 58% of organizations see increased collaboration through the use of IoT devices.
Finally, these technologies improve profitability and help raise the level of a running business, with the needed information presented in diagrammatic form.
8.2.3 Business Opportunities
Many organizations see the IoT as central to a strong business strategy, since it supports administrative processes, and client satisfaction depends on timely access to relevant information. These methods bring new ideas into implementation that have not yet been used extensively.
IoT sensors have been fitted in vehicles to deter drunk driving, manage driver fatigue, and avoid collisions [13].
Such technical innovations are used by business concerns for the best outputs. The use of the IoT has such a great effect on business frameworks that 36% of organizations are considering new ventures on account of their IoT activities. New kinds of administrative software for commercial industries are emerging to satisfy client needs.
Using IoT-based devices, clients play a vital role in setting standards and managing customer relationships. With more information available through IoT devices than ever before on client preferences and product performance over the long term, organizations can use this data to understand the behavior and needs of customers better than ever.
8.2.4 Customer Experience
Customers once had to contact organizations and push hard to have their inputs heard and receive a fair outcome. Clients who prioritize connection can widen their circle by linking to other clients using IoT devices. When all specifications and needs are met, the technology sees wide adoption. In addition, 40% of shoppers do not care whether a chatbot or a real human assists them, as long as they get the help they need. Consolidating the resulting outputs helps an organization improve efficiency and serve its clients better.
8.2.5 Mobility and Agility
This technology is inherently portable, and its adaptive nature plays a vital role: it helps employees work from any location.
In situations such as a pandemic or a flood, the IoT paves the way for employers to let staff work from anywhere, and these technologies enrich the knowledge of both the employee and the organization.
Such work styles proved helpful to people during the COVID-19 pandemic. Connected firms must take proper steps to help their workers operate comfortably from any place in the world [14]. By handling daily routine needs through telecommunication, the effect and the share of output gained are all the more advantageous.
8.3 IoMT Progress in the COVID-19 Situation
As per WHO guidelines, the novel COVID-19 has been identified as a deadly infection with a high chance of transmission. As reported by the WHO, this fatal disease is infectious and belongs to the same family as SARS and MERS. The disease was identified on December 31, 2019, first found in Wuhan, China. It has a high rate of transmission and contamination.
The seriousness of the disease soon led it to be considered a pandemic. It has infected hundreds of thousands of people globally across numerous countries.
The coronavirus has severe consequences because it comes from the same family as SARS and MERS.
It causes a certain degree of sickness and has a notable mortality rate. Existing technologies also draw on the prior history of SARS and MERS, which likewise caused severe sickness and illness. The World Health Organization works worldwide to improve people's wellness, and medical teams keep researching therapeutic and diagnostic approaches for treating COVID-19 patients.
The IoMT is widely available and well suited to handling illness. IoMT technology is used for monitoring via mobile phones, laptops, personal systems, and so on. Here, artificial intelligence (AI) and computerized analytical reasoning are applied, and they are heavily used in treating COVID-19 patients. This type of framework is used extensively for pandemic diseases, and cloud computing is also widely used. The technology is applied extensively in creating frameworks for treating the pandemic; it helps capture information for assessment, and sensors can be deployed based on client requirements.
This research work analyzes the frameworks used in COVID-19 settings. The concerns relate to engineering and security issues, with particular importance given to novelty.
This work concentrates on areas of relief in this pandemic in the engineering sector. It involves studying the causes and effects of the SARS family and the study of contamination. The chapter starts with a review of the IoMT environment, followed by a discussion of recently proposed IoMT designs, the standard reference model, and a possible IoMT pandemic mitigation architecture.
8.3.1 The IoMT Environment
The clinical and biological ecosystem has developed through rapid advances in science, technology, and medicine, and through the proliferation of smart clinical devices. Advances in communication technologies have transformed various clinical services into accessible virtual frameworks and remote applications.
Current implementations of the IoT in clinical systems have gradually reshaped clinical life. Experts suggest that the IoMT plays a vital role in finding solutions for COVID-related issues, and highly specialized designers adopt the IoMT for excellent results.
The conventional clinical ecosystem generally includes the patient, specialist, prescription (pharmacist), and treatment. In addition to these, the IoMT clinical ecosystem incorporates cloud information, applications (web, mobile, real-time, and non-real-time), wearable sensor devices, and security frameworks.
Many technical specialists have been implementing techniques to create a sound health environment by carefully making plans and building platforms through design, innovation, and implementation. Clinical ecosystem security covers vulnerability, attack, defense, and mitigation. Prior work highlights advances in IoMT technologies, structures, and security applications; the security highlights include security prerequisites, threat models, attacks, and threat management.
To support a secure IoMT framework, one approach provides a methodology for information verification and evaluation. The method defines analysis strategies suitable for selected IoT devices, covering weaknesses, attacks, and threats. Among the analyzed data are sensor data collection, data queries, client registration, and the management platform. An IoMT monitoring framework that preserves privacy is designed to adhere to a blockchain component; its main aim is to safeguard the information gathered from body sensors [15]. More focus has been given to microarchitectures that provide clinical cues and security safeguards, and IoMT techniques have been used widely to achieve this goal. Communication is a significant consideration in the IoMT. For instance, one designed IoMT framework utilizes the narrowband IoT protocol; another investigated an IoMT framework with Long-Term Evolution communication and consolidated 5G-based communication to support long-range wireless links, while other work focused on short-range wireless protocols, i.e., Wi-Fi, which plays a vital role in the IoMT.
For the cloud platform in the IoMT, one design proposes a multi-cloud IoMT architecture to support large framework growth and, simultaneously, fail-over for storage-failure recovery. The design incorporates a cascading manager, storage backup, resource routing, and failure recovery. Other advances feature sensor-embedded smart apparel and wearable devices to support remote health detection and diagnostic services, along with cloud-based cognitive computing and AI-driven robot-patient interaction. Fitting sensors onto the body plays a vital role in forming such body networks.
8.3.2 IoMT Pandemic Mitigation Architecture
A discussion of this major application and its novelty would not be complete without a reference system model. Standards should be defined for industrial specialists' reference, enabling better improvement and adoption in business culture. An IoT framework should be built properly; a reference model also gives direction to poorly structured IoT frameworks and is intended to unify IoT systems and limit industry fragmentation.
There are certain targets: i) to provide a secure IoT framework structure for various application domains; ii) to provide a structure for evaluating and comparing available IoT frameworks; and iii) to provide a system to help accelerate the design, operation, and organization of IoT frameworks [16].
The model is an application-oriented platform architecture. The standard incorporates connections among the core, big data, cloud computing, and edge computing technologies, with unified viewpoints.
The device layer comprises hardware, for example, sensors, controllers, face cameras, fitness smart devices, health-monitoring sensors, insulin pumps, and infrared temperature sensors, which are among the currently used equipment. The sensors can be body-wearable, embedded in smart gadgets, implantable, or placed in the surrounding environment.
The next layer is the communication network layer. Some of the current communication technologies used are wireless networks such as Wi-Fi and Bluetooth; these are lightweight protocols suitable for low-power devices in wireless networks. Another significant component is the aggregator, for example, routers that act as gateways to provide multi-device connectivity. The data-driven nature of the IoT structure makes content the critical component of the infrastructure; information-centric networking (ICN) offers scalability, efficient routing, mobility, caching strategies, and security components to the IoMT.
The IoT platform layer is a middle layer that offers service support, data, cloud computing, and middleware technology. For the cloud platform, examples include companies such as Google, Amazon, and Oracle. The highest layer of the IoT design is the application layer. This layer includes a range of services, for example, monitoring frameworks, tracking/locator frameworks, fitness/well-being frameworks, clinical e-records, remote diagnosis frameworks, telemedicine, and so on.
Other analysts have also proposed IoMT reference models, though without focusing on IoMT-specific engineering; for instance, one reference design has base, computing, and edge tiers. Another work presents an IoMT system architecture containing a four-layer design: a perception layer, transport layer, cloud-administration layer, and cloud-to-end integration; it identifies the health framework as an IoT-based smart-city application and settles on design layers of sensing, network, cloud computing, and application.
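The four-layer flow just described (device, network, platform, application) can be sketched as a chain of transformations; all field names, identifiers, and the fever threshold are illustrative assumptions:

```python
# Sketch of the layered flow described above: a device-layer sample is
# wrapped at the network layer, enriched at the platform layer, and
# interpreted at the application layer. Field names are assumptions.

def device_layer():
    # e.g., one sample from an infrared temperature sensor
    return {"sensor": "ir_temp", "celsius": 37.2}

def network_layer(sample):
    # an aggregator/gateway adds routing metadata before forwarding
    return {"payload": sample, "gateway": "ward3-router", "proto": "wifi"}

def platform_layer(packet):
    # the cloud platform stores the sample and attaches patient context
    return dict(packet["payload"], patient_id="P-104")

def application_layer(record):
    # a monitoring application flags a fever (threshold is an assumption)
    return "fever" if record["celsius"] >= 38.0 else "normal"

status = application_layer(platform_layer(network_layer(device_layer())))
# 37.2 degrees C is below the assumed 38.0 threshold, so status is "normal".
```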
8.3.3 Artificial Intelligence and Big Data Technology in IoMT
These technologies help health organizations put COVID-19 diagnostic devices to use. Building diagnostic models for COVID-19 is not a straightforward process: different models need to be assessed, and some of the criteria conflict with one another. Thus, a decision matrix that consolidates assessment parameters and diagnostic models is required for multi-criteria decision-making over the evaluation rules. These models need proper training, and the resulting classifiers help interpret the outcomes obtained.
Benchmarking and evaluating COVID-19 classification frameworks is known to be a multi-objective/multi-criteria problem. The objective is to provide an integrated system for the appraisal and benchmarking of different COVID-19 diagnostic classifiers. This supports the development of unified classifiers within a single framework, covering all the efficiency metrics for appraising the candidate classifier models.
The method was created as a support system to help decision-makers in clinical and health settings determine which classification schemes can best be used to diagnose COVID-19 by comparing different classification models.
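Such a decision matrix for benchmarking diagnostic classifiers can be sketched as a weighted sum over criteria; the criteria, weights, scores, and model names below are hypothetical:

```python
# Minimal decision-matrix sketch for ranking COVID-19 diagnostic
# classifiers against several (possibly conflicting) criteria.
# Criteria, weights, scores, and model names are all hypothetical.

criteria = ["accuracy", "sensitivity", "inference_speed"]  # higher is better
weights = {"accuracy": 0.5, "sensitivity": 0.3, "inference_speed": 0.2}

models = {  # normalized scores in [0, 1] per criterion (invented)
    "cnn":      {"accuracy": 0.92, "sensitivity": 0.88, "inference_speed": 0.40},
    "svm":      {"accuracy": 0.85, "sensitivity": 0.80, "inference_speed": 0.90},
    "ensemble": {"accuracy": 0.94, "sensitivity": 0.91, "inference_speed": 0.25},
}

def weighted_score(scores):
    """Aggregate the per-criterion scores into one weighted value."""
    return sum(weights[c] * scores[c] for c in criteria)

ranking = sorted(models, key=lambda m: weighted_score(models[m]), reverse=True)
# With these weights, the fast "svm" edges out the more accurate models.
```

Real multi-criteria methods (e.g., TOPSIS or AHP) normalize and weight the matrix more carefully, but the principle of trading off conflicting criteria is the same.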
Another line of work gives an overview of the COVID-19 illness and estimates the seriousness of the pandemic, the survival rate, and the fatality rate using established AI strategies as well as mathematical simulation techniques. The objective is to determine the relationship between the dependent variable and the independent variable. Transmission occurs through contact with contaminated individuals, who are categorized as infected (the person has contracted the disease) or recovered/deceased. A neural model was the best of the AI strategies used in the trials. The findings are useful in anticipating and forestalling the outbreak of epidemics or pandemics in each nation or across the globe.
Advanced AI strategies have been used in the systematic characterization of COVID-19, CRISP-based COVID-19 detection tests, and COVID-19 discrimination. Many investigations examine related areas such as AI, big data analytics, and cutting-edge 5G communication, which can play a vital role in preventing the spread of infectious diseases. This supports simultaneous data recording, patient health tracking, data analysis, and alerting.
The use of this type of IoMT in China was demonstrated during the ongoing COVID-19 outbreak. Notable uses of clinical IoT against COVID-19 include infrared cameras and facial-recognition networks. To reduce the possible risk of COVID-19 exposure to personnel when taking direct body-temperature measurements, China deployed robots fitted with AI, using infrared cameras and thermometers.
8.4 Major Applications of IoMT
Nowadays, the IoT is a blooming field from which many health technicians benefit. AI and deep-learning algorithms are embedded into the field of IoMT.
The main domains are as follows:
1. IoMT-based trackers
2. Real-time patient monitoring systems
3. Diagnosing and tracking an individual's fitness data
4. Ingestible smart devices
5. Continuous diabetes monitoring
References
1. C., Towards Non-invasive Extraction and Determination of Blood Glucose Levels. National Library of Medicine, 4, 4, 82, Sep 27, 2017.
2. Saasa, V., Sensing technologies for detection of acetone in human breath for diabetes diagnosis and monitoring. J. Colloid Interface Sci., 8, 1, 12, Mar 2018.
3. Liu, Acetone gas sensor based on NiO/ZnO hollow spheres: fast response and recovery, and low (ppb) detection limit. National Library of Medicine, 495, 207–215, Jan 31, 2017, Epub.
4. Ahmad, R., Halitosis: a review article. Int. J. Curr. Res., 5, 12, 3758–3762, December 2013.
5. Schnabel, R., Analysis of volatile organic compounds in exhaled breath to diagnose ventilator-associated pneumonia. Sci. Rep., 5, 17179, 2015.
6. Lourenço, Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications, 465–498, PubMed, June 2014.
7. Moon, All villi-like metal oxide nanostructures-based chemiresistive electronic nose for an exhaled breath analyzer. Sens. Actuators B Chem., 257, 295–302, October 2017.
8. Kang, Human breath simulator using controlled evaporator mixer to test the performance of a breath analyzer. Appl. Therm. Eng., 10, 363–368, 1359–4311, 2016.
9. Burgi, Portable electronic device with breath analyzer, United States patent, Feb 2017.
10. Nakhleh, M.K., Detection of halitosis in breath: Between the past, present, and future, 24, 5, 685–695, Jul 14, 2018.
11. Rahman, R.A.B., IoT based personal healthcare monitoring device for diabetic patients. IEEE, 19th October 2017, https://doi.org/10.1109/ISCAIE.2017.8074971.
12. Aylikci, B.U., Halitosis: From diagnosis to management. J. Nat. Sci. Biol. Med., 4, 1, 14–23, Jan–Jun 2013.
13. Alasqah, M., Assessment of halitosis using the organoleptic method and volatile sulfur compounds monitoring. J. Dental Res. Rev., 3, 3, 94–98, 2016.
14. Choi, Selective Detection of Acetone and Hydrogen Sulfide for the Diagnosis of Diabetes and Halitosis Using SnO2 Nanofibers Functionalized with Reduced Graphene Oxide Nanosheets, 26, 6, 4, 2588–2597, Feb 2014, PubMed.
15. Thati, A., Breath acetone-based non-invasive detection of blood glucose levels. Int. J. Smart Sens. Intell. Syst., 8, 2, 1244–1260, June 2015.
16. Dhivya, A.J.A., Chandrasekaran, R., Sharika, S., Thamizhvani, T.R., Hemalatha, R.J., IoT based halitosis detection. Indian J. Public Health Res. Dev., 10, 5, June 2019.
9
Performance Assessment of
IoMT Services and Protocols
A. Keerthana1* and Karthiga2
1Biomedical Engineering, VISTAS, Chennai, India
2Biomedical Engineering, Agni College of Engineering, Chennai, India
Abstract
The Internet of Medical Things (IoMT) is the combination of connected healthcare devices and their applications. The IoMT is useful for researchers, patients, and medical advisers, and it also helps assist patients and track their details from rural areas. Medical resources and health services are interconnected by the digital healthcare system, to which researchers are contributing. Patient data transfer should be accurate for the best healthcare service using the Internet of Things. There are three layers in the architecture of the protocol system: 1. data receiving; 2. "protocols"; and 3. data collection by sensors (the "sensing layer"). Many medical sensors, such as temperature, ECG, heartbeat, pressure-measurement, and glucose sensors, reside in the sensing layer. Sensors measure the subject data, and the received data is transferred to the doctor through a layer known as the server layer. Communication protocols sit in the server layer and help authorize data exchange between devices and the IoT in many fields. Protocols such as CoAP and MQTT are very important and highly compatible. The information transferred to the doctor is then sent back to the sensing layer, where it is processed and stored. An IoT protocol designed by researchers should be highly efficient in energy consumption and suitable for data transfer in electronic health; since devices often operate on batteries, energy can be saved by using a suitable protocol.
Keywords: IoMT, medical sensors, digital healthcare system, architecture,
protocols, CoAP, MQTT, data transfer
*Corresponding author: keerthana0792@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (173–186) © 2022 Scrivener Publishing LLC
9.1 Introduction
The Internet of Things (IoT) concept expands into different domain applications, such as the Internet of Robotic Things (IoRT), the Internet of Medical Things (IoMT), the Autonomous System of Things (ASoT), the Autonomous Internet of Things (A-IoT), and Internet of Things Clouds (IoT-C). Many protocols are emerging because of the great expansion of IoT applications built on hardware, sensors, and smart objects.
The IoMT is useful for patients, medical professionals, researchers, and insurers, and it also helps with assistance, data insights, drug management, operations augmentation, and tracking patients and staff in rural areas. Researchers are contributing toward a digitized healthcare system by interconnecting the available medical resources and healthcare services [1]. Patient data transfer should be accurate for the best healthcare service using the IoT.
Security and privacy fragility are problems faced in the IoMT. In recent years, the IoMT has been at the forefront of cyber-attacks. IoMT stakeholders have no option but to trust security solutions, so it is necessary to develop an assessment model that allows expandability in terms of security.
In the coming years, many IoMT devices are expected to arrive, and they may contain components that lead to interoperability and privacy-related problems. A pervasive healthcare platform must therefore be flexible enough to handle all these concerns [2].
There are open issues related to the heterogeneity of devices and possible data representations, as well as the volume and complexity of the collected data:
1. Standardization of healthcare.
2. The performance of communication protocols in the IoT/IoMT depends on the functionalities they offer. Depending on the application context, an IoMT platform should adapt its use of existing protocols and must integrate new ones.
3. The electronic health record (EHR) [3] is a technique used for storing patient data in hospital information systems. The lack of integration of IoMT platforms with health records makes it harder for health professionals to manage patient information.
4. Greater volume and data complexity make interpretation by the physician difficult; this challenge can be addressed using artificial intelligence algorithms.
9.2 IoMT Architecture and Platform
An IoMT architecture is shown in Figure 9.1; it includes the main components of clouds and electronic healthcare devices based on IoMT techniques.
The medical or vital data collected from different sensors or repositories is sent to the gateway and then to the cloud system. Smartphones used by patients are also considered gateways for monitoring. The data stored in the cloud can be processed for statistical analysis using machine learning (ML), big data analysis, and online analytics processing. The healthcare information can be used by local hospitals for managing and visualizing patient records, and the information in the cloud can be used by third-party applications.
Figure 9.1 IoMT healthcare system.
The Internet of Medical Things (IoMT)
9.2.1 Architecture
The proposed architecture is shown in Figure 9.2. The platform consists of three layers, as follows [4]:
1. Device integration layer
2. Data integration layer
3. Knowledge extraction and data visualization layer.
Figure 9.2 Platform architecture.
9.2.2 Devices Integration Layer
This layer consist of integrated heterogeneous sensors, administers, and
data sources. The following different data sources are shown:
1. One M2M compatible IoMT devices—which send data to
gateway.
2. Non-one M2M IoMT devices—independent.
3. An IoMT platform data source—handles heterogeneous
healthcare data.
Transformation of data to open EHR is done by non-one M2M IoMT
devices—independent and IoMT platform data source—which handles
heterogeneous healthcare data.
9.3 Types of Protocols
The increasing use of smart sensors, wearable technology, digital assistants, and cheap data access has produced large amounts of medical data that require safe communication between devices, users, and clients. To enable communication with the world via IP (Internet Protocol), IoMT follows several standards and protocols.
Communication (networking) protocol for medical IoT devices includes
the following:
a) Device-to-Device (D2D) Communication or Machine-to-Machine (M2M) Communication.
b) Device-to-Server (D2S) Communication or Machine-to-Server (M2S) Communication.
c) Server-to-Server (S2S) Communication.
All the above communication types are enabled by the IP protocol stack for smart devices (Session, Network Encapsulation, and Network Routing) and by low-power technologies (Data Link) [4]. Refer to Figure 9.3 for an overview of the communication protocols.
9.3.1 Internet Protocol for Medical IoT Smart Devices
Regardless of their design, structure, integrity, and application, devices are connected to the end user through the network (Internet) and follow various
Figure 9.3 Communication protocol overview.
session session-layer protocols such as HTTP, MQTT, MQTT-SN, XMPP, CoAP, DDS, and AMQP.
9.3.1.1 HTTP
HTTP (Hypertext Transfer Protocol) is the most common protocol used for IoT. It is an application layer protocol that allows users to communicate over the Internet [5]. HTTP relies on an open TCP connection with a server; once a connection is established, large amounts of data can be reliably transferred, and the transfer stops when the connection is cut off (a request/response protocol). While HTTP may seem a simple network protocol, it is not desirable for IoMT because it is unidirectional (i.e., only the client or the server can send data at a time). While a client or server waits for a response, it consumes I/O threads and CPU cycles on both sides. Since many sensors are connected to a single IoMT device, this puts a heavy load on the server: for every HTTP transfer, a TCP connection has to be established. This leads to high resource utilization, thereby resulting in high power consumption.
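The per-reading cost described above can be sketched with the Python standard library (the endpoint, payload, and use of a local toy server are illustrative, not from the chapter): each sensor reading opens a fresh TCP connection, sends a request, and blocks for the response.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.client import HTTPConnection

class VitalsHandler(BaseHTTPRequestHandler):
    """Toy server standing in for a hospital backend."""
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        reading = json.loads(body)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(json.dumps({"stored": reading["pulse"]}).encode())

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), VitalsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Each sensor reading costs a full TCP connect plus an HTTP round trip.
results = []
for pulse in (72, 75, 71):
    conn = HTTPConnection("127.0.0.1", server.server_port)  # new connection per reading
    conn.request("POST", "/vitals", json.dumps({"pulse": pulse}),
                 {"Content-Type": "application/json"})
    resp = conn.getresponse()
    results.append(json.loads(resp.read())["stored"])
    conn.close()

server.shutdown()
print(results)  # [72, 75, 71]
```

Three readings mean three connection setups and teardowns; multiplied across many sensors, this is the resource and power overhead the text describes.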
9.3.1.2 Message Queue Telemetry Transport (MQTT)
MQTT is a protocol designed by IBM for sending simple data flows from sensors to applications and middleware, and for communicating them to servers (D2S) [6]. MQTT’s architecture involves three components: connected devices known as “clients”, server communicators called “brokers”, and interactive “subscribers”. When a client sends data to the broker, this is known as a “publish”. MQTT follows a publish/subscribe model, which contrasts with the request/response (HTTP) model, as explained in Figure 9.4.
All messages go through the server (“broker”) before they can be delivered to the subscribers, so choosing the server requires careful consideration of scalability and capabilities. MQTT rides on the TCP/IP protocol, and securing it with SSL/TLS service certificates adds enough overhead that MQTT with SSL/TLS may not be a viable option for resource-constrained IoT devices. This disadvantage is commonly worked around by using a plain username and password in the client-server handshake.
MQTT has clear advantages over competing protocols. These are as follows:
• It is a lightweight protocol (messages have a small footprint, with a fixed header and a QoS level), which makes protocol sessions quick to implement.
• It has low network usage, owing to minimized data packets.
• Transmission of data from a client to a broker involves a handshake through authenticated server certificates, requires only small amounts of power, and optimizes the bandwidth.
Figure 9.4 The MQTT publish and subscribe model for IoT.
Meanwhile, the drawbacks of MQTT include the following:
• MQTT itself is unencrypted, and adding SSL/TLS encryption is not suitable for IoT devices with many connected sensors.
• It is difficult to create a globally scalable MQTT network.
Secure Message Queue Telemetry Transport (SMQTT) is an extension of the MQTT protocol that adds encryption for delivering a message to multiple nodes. MQTT-SN (Message Queue Telemetry Transport for Sensor Networks) is another form of MQTT designed specifically for wireless sensor networks.
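The decoupling that publish/subscribe provides can be sketched with a minimal in-memory broker (a toy model of the pattern only, not a real MQTT stack; the topic names and payloads are illustrative):

```python
from collections import defaultdict

class Broker:
    """Toy stand-in for an MQTT broker: routes published messages to topic subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Clients never address each other directly; the broker fans out.
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = Broker()
received = []

# A monitoring dashboard subscribes to a glucose topic.
broker.subscribe("ward1/patient42/glucose", lambda t, p: received.append((t, p)))

# A sensor publishes without knowing who, if anyone, is listening.
broker.publish("ward1/patient42/glucose", 94)
broker.publish("ward1/patient42/pulse", 71)  # no subscriber: silently dropped

print(received)  # [('ward1/patient42/glucose', 94)]
```

A real broker adds QoS levels, retained messages, and network transport; the point here is only that publishers and subscribers are fully decoupled through the broker.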
9.3.1.3 Constrained Application Protocol (CoAP)
CoAP is a protocol created by the IETF CoRE working group. It addresses the constrained environments faced by IoT devices (small battery-powered devices), such as energy constraints, memory limitations, unreliable networks, higher communication latency, and unattended network operation. CoAP runs on UDP, much as HTTP runs on TCP/IP, and is designed for D2D communication. Since CoAP runs on UDP, it is a connectionless protocol. It is also an asynchronous, lightweight protocol based on a Request/Response Client-Server system. CoAP uses “confirmable” (CON), “non-confirmable” (NON), “acknowledgement” (ACK), and “reset” (RST) as the four messages in its messaging model. Refer to Figures 9.5 and 9.6 for the CoAP message model and request/response model.

Figure 9.5 CoAP message model.

Figure 9.6 CoAP request/response model.

The CoAP messaging model carries a small 4-byte binary header. Each message is assigned a Message-ID, and a reliable transaction between client and server uses the CON (Confirmable) and ACK (Acknowledgement) messages; for unreliable message transactions, the interaction between client and server uses NON (Non-Confirmable) and RST (Reset). Like HTTP, CoAP uses GET, PUT, POST, and DELETE methods.
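The fixed 4-byte header can be packed with Python's `struct` module. The field layout (2-bit version, 2-bit type, 4-bit token length, 8-bit code, 16-bit message ID) follows RFC 7252; the values used below are illustrative.

```python
import struct

# CoAP message types (RFC 7252): CON=0, NON=1, ACK=2, RST=3
CON, NON, ACK, RST = 0, 1, 2, 3

def encode_header(version, mtype, token_len, code, message_id):
    """Pack the fixed 4-byte CoAP header: Ver(2b) Type(2b) TKL(4b) | Code(8b) | Message ID(16b)."""
    first = (version << 6) | (mtype << 4) | token_len
    return struct.pack("!BBH", first, code, message_id)

def decode_header(data):
    """Unpack the same 4-byte header back into its fields."""
    first, code, message_id = struct.unpack("!BBH", data[:4])
    return {"version": first >> 6, "type": (first >> 4) & 0x3,
            "token_len": first & 0xF, "code": code, "message_id": message_id}

# A confirmable GET (method code 0.01 == 1) with message ID 0x1234:
hdr = encode_header(1, CON, 0, 1, 0x1234)
assert len(hdr) == 4                      # the whole fixed header is 4 bytes
assert decode_header(hdr)["type"] == CON  # the server's ACK must echo the same message ID
```

The message ID in the header is what lets the client match an ACK to its CON request, which is how CoAP builds reliability on top of connectionless UDP.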
9.3.1.4 Advanced Message Queuing Protocol (AMQP)
In a fast-moving software industry, companies own software applications, packages, and associated frameworks that require ongoing support from the parent company [7].
AMQP can connect across organizations, technologies, third-party applications, unattended systems, and poor networks [8]. It allows interconnection between various vendors, middleware, and clients based on a message exchange protocol, and its goals include security and reliability. Refer to Figure 9.7 for the AMQP interaction model; Figures 9.8 and 9.9 show AMQP capabilities and AMQP for cloud connections.
9.3.1.5 Extensible Message and Presence Protocol (XMPP)
XMPP is an XML-based real-time messaging protocol developed in 1999 by the Jabber open-source community. XMPP allows real-time (online/offline presence) access to data such as instant messaging, voice-video calls, and multiple group calls.
Figure 9.7 AMQP interaction model with middleware.

Figure 9.8 AMQP capabilities.

Figure 9.9 AMQP for cloud connection.

Figure 9.10 DDS protocol architecture.
XMPP is a preferred open-source protocol because of its advantages: it is an open, public standard that is stable, secure, decentralized, and flexible. XMPP applications include network management, gaming, cloud computing, and remote systems.
9.3.1.6 DDS
The Data Distribution Service (DDS) is a real-time D2D (M2M) communication protocol. It utilizes scalable multicasting technologies for data transmission and QoS. DDS rides on the DCPS (Data-Centric Publish-Subscribe) layer to communicate from publishers to subscribers through reliable IoT devices, and the DLRL (Data Local Reconstruction Layer) enables sharing of distributed data among IoT devices. Refer to Figure 9.10 for the DDS protocol architecture.
DDS uses a fully distributed GDS (Global Data Space) to avoid a single point of failure or bottleneck. The GDS also makes DDS fully distributed, with a broker-less architecture, unlike the MQTT and CoAP protocols.
9.4 Testing Process in IoMT
In the healthcare domain, the range of IoMT is not limited to wearable tech and telemetry devices. It involves data propagation among 10–20 devices, patients, and the whole hospital, so testing should also cover the propagated data. Moreover, IoMT solutions are complex and multi-perspective, which means they need multilevel testing approaches, as follows.
Usability Testing (UT): Usability testing ensures that the interface between device/application and user is satisfactory. The primary focus of UT lies on ease of use, ease of learning and/or familiarization, responsiveness, throughput of the device/app without bottlenecks, and its ability to throw exceptions, warnings, and errors to communicate. An IoMT device/app that passes UT can be used without training or a guide.
Reliability Testing (RT) and Scalability Testing: Reliability factors include the identification and selection of sensors as well as proper IoT network protocols. Scalability is another important factor, decided by the number of devices connected to a system and the data consumption involved. For instance, data transfer over HTTP/IP incurs high power consumption and load and is therefore not scalable for a large IoT network.
Security Testing: IoMT generally involves the transfer of clinical data, and there is always a probability that the data can be accessed, read, or updated during transfer. From a testing standpoint, it must be checked that the data is protected/encrypted while being transferred from one device to the other, restricting unauthorized data access. Wherever there is a UI (User Interface), password protection should be mandatory and validated [9].
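One concrete check of this kind can be sketched with the standard library (the shared key, field names, and HMAC scheme below are illustrative, not a production design or anything prescribed by the chapter): verify that an untampered payload passes an integrity check and that a payload altered in transit is rejected.

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-device-key"  # illustrative only; real keys come from provisioning

def sign(payload: dict) -> bytes:
    """Attach an HMAC-SHA256 tag so the receiver can detect tampering."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return json.dumps({"body": payload, "tag": tag}).encode()

def verify(wire: bytes) -> bool:
    """Recompute the tag over the received body and compare in constant time."""
    msg = json.loads(wire)
    body = json.dumps(msg["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

# Security-test style assertions:
wire = sign({"patient": 42, "glucose": 94})
assert verify(wire)                               # untampered data passes

tampered = json.loads(wire)
tampered["body"]["glucose"] = 300                 # attacker alters the reading in transit
assert not verify(json.dumps(tampered).encode())  # tampering is detected
```

This covers integrity only; confidentiality would additionally require encryption of the payload, as the text notes.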
Performance Testing: As mentioned previously, IoMT devices are scaled to an entire hospital and can involve 180–200 or more people. Hence, performance testing should encompass testing of large-scale operations, user responsiveness, traffic handling, device/system response, usage, and temperature.
Compatibility Testing: In a large-scale IoT system, there may be an integration of different platforms, applications, software packages, device compatibilities, browser specifications, operating system versions, etc. Therefore, it is necessary to test for compatibility among different devices, platforms, and operating systems.
Pilot Testing: Pilot testing is controlled and/or limited real-time field testing. During this testing, bugs and errors are carefully noted, received, and reviewed for upgrades. An IoMT device that passes pilot testing is ready for production deployment.
Regulatory Testing: Every device should follow design considerations and regulations. Following proper regulations in sensor, network, data, and session selection will ensure that regulatory testing is cleared.
Upgrade Testing: Upgradability is a key factor in testing; any developed device should give the user and the developer room for upgrades, since only then can the challenges be overcome. Upgrade testing can be done after UT, PT, and RT, or after technical advancements.
9.5 Issues and Challenges
There are many limitations and challenges in the implementation of an IoMT-based patient monitoring system. Issues may occur in interoperability between hardware and software, bandwidth, quality of health services, battery life limitations, and sensor biocompatibility. However, emerging technology with advanced intercommunication will help overcome these challenges.
9.6 Conclusion
IoMT is an emerging approach for enhancing healthcare services. In combination with data mining, cloud computing, and ML, it will help physicians make good diagnoses and will also increase their knowledge. There are many security problems in IoMT, and these can be reduced by designing proper protocol architectures for healthcare applications.
Different protocols can improve healthcare data interchange, making it faster and reducing data loss. The storage and transmission of healthcare observation data, which promotes interoperability of data representation formats, should be considered while designing an IoMT platform.
References
1. Joyia, G.J. and Liaqat, R.M., Internet of Medical Things (IoMT): Applications, Benefits and Future Challenges in Healthcare Domain. J. Commun., 12, 4, 240–247, 2017.
2. Rubí, J.N.S. and Gondim, P.R.L., IoMT Platform for Pervasive Healthcare Data Aggregation, Processing, and Sharing Based on OneM2M and OpenEHR. Sensors (Basel), 19, 1–25, 2019.
3. Mandel, J.C., Kreda, D.A., Mandl, K.D., Kohane, I.S., Ramoni, R.B., SMART on FHIR: A standards-based, interoperable apps platform for electronic health records. J. Am. Med. Inform. Assoc., 23, 899–908, 2016.
4. Sethi, P. and Sarangi, S., Internet of Things: Architectures, Protocols, and Applications. J. Electr. Comput. Eng., 2017, 1–25, 2017.
5. Protocol Support Library: Hypertext Transfer Protocol. Internet source: https://www.extrahop.com/resources/protocols/http.
6. IoT Agenda: MQTT. Internet source: https://internetofthingsagenda.techtarget.com/definition/MQTT-MQ-Telemetry-Transport.
7. Azure Service Bus messaging overview, Microsoft Documentation. Internet source: https://docs.microsoft.com/en-us/azure/service-bus-messaging/.
8. ISO/IEC JTC 1 and the ISO and IEC Councils, International Standards. Internet source: https://www.amqp.org/.
9. OWASP Top 10 Security Risks & Vulnerabilities. Internet source: https://sucuri.net/guides/owasp-top-10-security-vulnerabilities-2020/.
10
Performance Evaluation of Wearable IoT-Enabled Mesh Network for Rural Health Monitoring
G. Merlin Sheeba¹* and Y. Bevish Jinila²
¹School of Electrical and Electronics, Sathyabama Institute of Science and Technology, Chennai, Tamilnadu, India
²School of Computing, Sathyabama Institute of Science and Technology, Chennai, Tamilnadu, India
Abstract
Wearable Internet of Things (IoT)–enabled biosensors are attracting endless interest day by day. A biosensor is a device made up of a transducer, a biosensor reader device, and a biological element. Growing healthcare demand and health consciousness among elderly people have become important concerns. Owing to huge technological growth, medical treatment in urban and rural areas has accelerated to greater dimensions; in rural regions, however, the elderly are often not treated in time or are treated only reactively. The sensors come in the form of bandages, tattoos, shirts, etc., which allow continuous monitoring of blood pressure, glucose, and other physiological data. To address this issue, a point-of-care monitoring unit is developed in rural areas for healthcare and awareness. To enhance the performance of the system, a smart and intelligent mesh backbone is integrated for fast transmission of critical medical data to a remote health IoT cloud server. Experimental analysis shows that the survival rate of critical patients is 10% better than with the conventional scheme. In addition, the end-to-end delay in data transmission is 10% to 30% less than with the conventional scheme.
Keywords: Wearable biosensors, biological receptors, mesh backbone, glucose
sensor, diabetics, point-of-care, IoT
*Corresponding author: merlinsheebu@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (187–206) © 2022 Scrivener Publishing LLC
10.1 Introduction
With the growing need for reliable point-of-care monitoring, huge interest has arisen in designing wireless integrated networks for faster transmission of medical data. According to the International Diabetics Federation Council (2013), the number of diabetics has reached 65.1 million. The conventional risk factors are due to modernization, unhealthy eating habits, and lack of physical activity, coupled with differences in body weight. The most disturbing trend is that young people get into trouble more quickly than in Western countries. Type 1 diabetes often goes unnoticed and untreated. Usually, in rural areas, elders are not disposed to take medicines and also avoid regular check-ups. People are not aware whether they have type 1 or type 2 diabetes, and since the symptoms are mostly very mild, they go unnoticed. Smoking and drinking habits decrease the glomerular filtration rate in diabetic patients with normal renal function; they also increase the risk of microalbuminuria and the progression of renal failure in patients with type 2 diabetes. Trend analysis shows diabetic prevalence among the rural population increasing at the rate of 2.02 per 1,000 population per year
[1, 2]. A WBAN network [4, 8, 9] has significant advantages over a traditional wired patient monitoring system: it improves the quality of diagnosis and rehabilitation, and the network requires less investment than a conventional deployment. The physiological conditions of patients with chronic diseases must be monitored continuously.
A Wireless Mesh Network (WMN) [3] makes patient monitoring easier and more reliable. The low investment cost of a mesh network makes it a promising solution for rural medical facilities. In a WMN, each node can send and receive messages over multiple hops, so the network can be deployed to provide dynamic and cost-effective connectivity over various geographic areas. A node also functions as a router and relays messages to its neighbors; through this relaying, a packet of wireless data is forwarded to its destination through intermediate nodes. The mesh network is redundant and reliable: if one node fails, the remaining nodes can still communicate with each other. A WMN is a special type of wireless ad hoc network, often with a more planned configuration.
Based on the World Health Statistics survey, underdeveloped countries face many complex health issues that go unnoticed by families or by the patients themselves because of economic constraints and the distance between home and hospital [11]. Access to medical facilities in remote areas is a major problem, and chronic health conditions are a major challenge in global health. Diseases like diabetes and heart disease are rarely diagnosed or treated until too late, and the costs of treatment often impoverish the family. To address such situations, a wireless-enabled smart and intelligent E-health service is of utmost importance, and a point-of-care unit in each village is indeed necessary. Hence, the proposed system is targeted at rural older adults for early identification of health risks. Several works on wireless patient monitoring have been carried out.
Lee et al. [4, 5] proposed an indoor mobile care acquisition device attached to the patient’s body, from which the physiological parameters are transferred to a central management system via WLAN. However, the signal can be weakened as it passes through obstacles such as doors, walls, and windows, and dead spots cause communication disconnections between the mobile care device and the WLAN. Chih Lai et al. [6] proposed a wireless multihop relay network for patient monitoring, with a case study on elderly patients living home alone. The ECG data from the patient’s body is acquired by sensors, and a residential gateway is responsible for gathering and uploading the data to a remote care server. The authors developed a prototype only for ECG data acquisition; GSM modems connected to the residential gateway send alert SMS messages, which is not a cost-effective solution.
M.R. Yuce [7] provided the idea of using miniaturized wearable sensor nodes with portable wireless gateway nodes to connect the sensor nodes to the Internet or a WLAN network, with a detailed discussion of the wireless technologies to be integrated with the patient monitoring system. Youm et al. [9] designed a web-based self-checkup system for users to promote a healthy lifestyle. The health check-up terminal consists of an external measurement device from which the physiological parameters are taken from the user’s body and transferred to a software application installed on the server terminal. The system does not support illiterate users.
Yena Kim [10] proposed an energy-efficient patient monitoring system with a Body Area Network (BAN). The monitoring sensor nodes are formed into clusters on each patient’s body, each with a cluster head, and a smartphone integrated with the system acts as a master node to increase the lifetime of the sensors. Yang et al. [12] discussed emerging technologies for healthcare systems, observing trends in sensing, data analysis, and cloud computing services. The physiological parameters are sensed from the patient, providing a means of continuous record monitoring and of seeking emergency assistance in critical scenarios. Cloud computing offers an ideal platform for the efficient use of computing resources. Still, there are many issues and
challenges to be addressed. Ludwig et al. [13], Steele et al. [15], and Hague et al. [16] have conducted extensive studies on health services rendered for older adults.
To summarize, in the existing schemes, patients in rural areas are not serviced on time, and a network supporting faster transmission is not addressed, which creates the risk of critical data loss. To address this issue, this work proposes an effective system that integrates an intelligent mesh backbone with IoT-enabled wearable sensors. The IoT cloud server stores the data received from the patients, which is monitored by doctors at the remote end. The rest of the work is organized as follows. Section 10.2 details the proposed system, including its architecture and components. The subsequent sections present the experimental results obtained, analyze the performance of the proposed system against the conventional scheme, and conclude the work.
10.2 Proposed System Framework
10.2.1 System Description
The proposed system includes a health monitoring IoT cloud server [20] integrated with biosensors and a mesh backbone to observe physiological data from patients, and an emergency service [18, 19] interconnected using a backbone WMN. Rural areas where medical assistance is at a subsidiary level can benefit from this proposed system. The key to successfully interconnecting the two types of networks is the ability to reduce radio disturbances [3]; the addressing schemes and routing strategies can differ between the sensor network and the mesh network. Figure 10.1 shows the modules of the proposed system, namely, the Health Monitoring Center (HMC), the self-configurable backbone mesh network, the faraway E-health service, and the emergency ambulatory service.
The HMC continuously monitors and records the physiological parameters of the patients in care using the wearable biosensors. Adults suffering from chronic diseases are monitored periodically, and a few of them must be under the care of a physician continuously. The physiological conditions of a patient are acquired through a sensor coordinator node, a device compatible with acquisition and transmission modes, called a wireless transceiver. For example, if an admitted patient suffers from a chronic diabetic disorder, then the blood pressure and
Figure 10.1 Architecture of wearable IoT-enabled rural health monitoring system.
glucose sensors are actively used to diagnose the patient’s health condition. The sensed medical data is collected by a data acquisition unit and transferred wirelessly to the neighboring mesh router (MR). When a critical scenario emerges, an alert request is routed multihop through the mesh backbone, which is auto-reconfigurable and self-healing. If any neighboring router fails, the forwarding router automatically reconfigures an alternate path to reach the Mesh Gateway (MG). The response time of the system is an important, life-saving factor, and no human intervention is needed to reroute the message. Simultaneously, at the instant of the alert, an SMS/e-mail is forwarded to the faraway doctor. The E-health service is then activated, and the doctor reviews the medical records of the intensive care patients. Suggestions regarding treatment are provided; in the worst emergencies, the patients are advised to get admitted to nearby town/city hospitals. An ambulatory service is networked with the mesh backbone to move people from rural to urban hospitals.
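The self-healing behaviour described above amounts to recomputing a route when a router fails. A minimal sketch follows (breadth-first search over an illustrative five-node topology; real WMNs run routing protocols, not this toy search, and the node names are invented):

```python
from collections import deque

def shortest_path(links, src, dst, failed=frozenset()):
    """BFS over the mesh, skipping failed routers; returns a hop list or None."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# Illustrative village mesh: HMC -> routers R1..R3 -> gateway MG
links = {
    "HMC": ["R1", "R2"],
    "R1": ["HMC", "MG"],
    "R2": ["HMC", "R3"],
    "R3": ["R2", "MG"],
    "MG": ["R1", "R3"],
}

print(shortest_path(links, "HMC", "MG"))                 # ['HMC', 'R1', 'MG']
print(shortest_path(links, "HMC", "MG", failed={"R1"}))  # reroutes: ['HMC', 'R2', 'R3', 'MG']
```

When R1 fails, the alert still reaches the gateway over the longer R2-R3 path with no human intervention, which is the redundancy property the text attributes to the mesh backbone.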
10.2.2 Health Monitoring Center
The HMC consists of the following components: (a) biosensor nodes, (b) sensor coordinator, (c) mesh backbone, and (d) E-health IoT cloud server.
10.2.2.1 Body Sensor
A body sensor unit, shown in Figure 10.2, consists of a RISC processor and a flash memory for storing and reading. The WBSN runs TinyOS, a small, open-source, energy-efficient operating system. The OS manages both the hardware and the WMN tasks, such as taking the physiological sensor measurements, transmitting or routing along the energy-efficient path, and checking the power dissipation. The nodes are approximately 26 mm in size and require only 0.01 mA in active mode and 1.36 mA for complex computations.
10.2.2.2 Wireless Sensor Coordinator/Transceiver
For critical patients, the sensor nodes are placed on the areas of the body where the physiological readings should be monitored continuously. The overall proposed design of the HMC is shown in Figure 10.3, and the main sensor functions are as follows.
(a) Glucose Sensor: A small sensor with an electronic coordinator to monitor a diabetic patient’s sugar levels. This allows the physician to see the trend of the glucose levels so
Figure 10.2 Body sensor node and its internal architecture.
Figure 10.3 System framework of health monitoring center (HMC). (WBAN: Wireless Body Area Network; VANET: Vehicular Ad hoc Network.)
as to avoid episodes of hypoglycaemia. Based on the diabetes repository datasets, the sugar levels are classified in Table 10.1.
(b) Blood Pressure Sensor: A non-invasive sensor that measures human blood pressure. It measures systolic, diastolic, and mean arterial pressure using the oscillometric technique, and also monitors the pulse rate. With reference to the blood pressure datasets, the levels are categorized as in Table 10.2.
(c) ECG Sensor: The sensor is attached to the patient using disposable electrodes on the left and the right side of the chest.
The signal obtained from the sensor is filtered and amplified.
Table 10.1 Blood glucose classification.

Fasting glucose:
  Hypoglycaemia: <70
  Normal: 70–99
  Hyperglycaemia: ≥126

Within 2 hours after meal:
  Hypoglycaemia: <70
  Normal: 70–139
  Hyperglycaemia: ≥200
Table 10.2 Blood pressure classification.

Blood pressure level  | Systolic (mmHg) | Diastolic (mmHg)
Hypotension           | <100            | <60
Normal                | 100–120         | 60–80
Level-1 Hypertension  | 141–159         | 91–99
Level-2 Hypertension  | >160            | >100
The analog signal is converted to a digital signal using an ADC converter, and a serial-to-Bluetooth or Wi-Fi module passes the result to the coordinator node. The transmission range is about 10 m, with a frequency range of 0.05 to 16 Hz.
(d) EMG Sensor: Electromyography is a diagnostic technique for checking the electrical activity of the muscles. The sensor measures the filtered and rectified electrical activity of the muscles, and the signals are used to analyze the biomechanics of human movement.
(e) Coordinator Node: A wireless transceiver near, or attached to, the patient’s body. This node collects the sensed readings from the body and communicates with a server system where the patient’s records are maintained. In our proposed framework, multiple villages are integrated through the mesh backbone for easy health services; the database server in each HMC sends critical requests to a mesh point, which routes the data through the mesh backbone.
10.2.2.3 Ontology Information Center
The coordinator node acquires all the physiological parameters from the patient’s body through the respective sensors. The acquired data is transferred to the server system, where periodic monitoring of admitted patients is performed, and the medical history for regular checkups is also maintained. As stated earlier, the Indian population is heavily affected by diabetes, the silent killer of this era; we therefore consider a case study on diabetes. Generally, diabetes is classified as pre-diabetes, type 1 diabetes, and type 2 diabetes. Pre-diabetics do not show any signs or symptoms. The signs and symptoms of the disease [1] are listed in Table 10.3.
Each server in the HMC runs a lookup request algorithm to respond to critical situations. The evaluation of blood pressure and glucose is based on the heart disease (1988) and diabetes UCI repository datasets. The positions of the sensors are based on the wearable computing classification of body postures and movements (PUC-Rio 2013) dataset.
Table 10.3 Symptoms and signs of diabetic types.

Type 1 diabetes: increased or extreme thirst; increased appetite; increased fatigue; increased or frequent urination; unusual weight loss; blurred vision; fruity odour on the breath; in some cases, no symptoms.

Type 2 diabetes: increased thirst; increased appetite; fatigue; increased urination, especially at night; weight loss; blurred vision; sores that do not heal; in some cases, no symptoms.
The server runs the lookup request Algorithm 10.1 in the GUI. The critical request alert is initiated when the condition evaluates to very high or very low.
Algorithm 10.1 Lookup request
Initialize
  bp = min; glu = min;
  Range 1 (blood pressure): a ↔ b; b ↔ c; c ↔ d
  Range 2 (glucose): a1 ↔ b1; b1 ↔ c1; c1 ↔ d1
Repeat steps n times
  if (bp < min && glu < min)
    var = very low;
    initiate critical request;
  elseif (bp > min && glu > min)
    if bp in (a ↔ b) && glu in (a1 ↔ b1)
      var = normal;
    elseif bp in (b ↔ c) && glu in (b1 ↔ c1)
      var = high;
    elseif bp in (c ↔ d) && glu in (c1 ↔ d1)
      var = very high;
      initiate critical request;
    endif
  endif
stop
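As an illustrative sketch (not the authors' implementation), the lookup request can be written in Python. The algorithm leaves the range bounds a ↔ d and a1 ↔ d1 symbolic; here they are filled in, as an assumption, with the systolic-pressure bands of Table 10.2 and the fasting-glucose thresholds of Table 10.1, using the gap between normal and hyperglycaemia as the "high" band:

```python
def lookup_request(bp, glu):
    """Classify systolic blood pressure (mmHg) and fasting glucose (mg/dL)
    and decide whether to raise a critical request.

    Returns (level, critical_request_initiated). The numeric bands are
    assumptions drawn from Tables 10.1 and 10.2; the chapter's algorithm
    keeps them symbolic (a..d, a1..d1)."""
    BP_MIN, GLU_MIN = 100, 70            # hypotension / hypoglycaemia thresholds
    if bp < BP_MIN and glu < GLU_MIN:    # both dangerously low
        return "very low", True
    if 100 <= bp <= 120 and 70 <= glu <= 99:
        return "normal", False
    if 121 <= bp <= 159 and 100 <= glu <= 125:
        return "high", False
    if bp >= 160 and glu >= 126:
        return "very high", True         # critical request initiated
    return "unclassified", False         # outside the tabulated bands
```

For example, lookup_request(150, 110) classifies the patient as "high" without raising a critical request, while lookup_request(170, 130) raises one.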
10.2.2.4 Mesh Backbone: Placement and Routing
The design of the mesh backbone is a challenging task, since the performance of WMNs depends on the placement of nodes. In a real-world scenario, it is difficult to always place the nodes in a uniform pattern. In a rural region, the population is sparse and the vegetation dense, so there are many topological restrictions under which the nodes cannot follow a uniform pattern. Even with these limitations, the ultimate objective is to implement a mesh backbone with maximum coverage. Here, we consider the positions of the mesh clients to be known a priori. We also assume a given number of MRs as input. It is common for such decision making to be guided by optimization. This optimization problem can be related
to a facility location problem, where the facilities are the MRs and the demand points are the mesh clients. Minimizing the number of routers while maximizing coverage is an NP-hard problem; hence, heuristic approaches are used to manage this complexity. Many evolutionary algorithms are available, such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), Bee Colony Optimization, Differential Evolution (DE), Tabu Search, and Simulated Annealing [14, 17].
DE is a stochastic optimization metaheuristic. It is inspired by the GA combined with a geometric search methodology [20]. It is a simple and powerful solver of non-linear and multimodal optimization problems. The key operators of Algorithm 10.2 are mutation, crossover, and selection. The first two operators generate new trial vectors, and selection determines which vectors survive to the next generation. The current population P(G) contains the encoded individuals Xi, where G indicates the generation. Np is a control parameter selected by the user for the D-dimensional vectors and remains constant throughout the optimization process.

P(G) = [X1(G), ..., XNp(G)]    (10.1)

Xi(G) = [X1,i(G), ..., XD,i(G)]    (10.2)
Algorithm 10.2 Differential Evolution for mesh backbone.
1. Select the control parameters: number of MRs, HMCs (clients), and grid size.
2. Decide the upper and lower limits of the control parameters.
3. Use a random number generator randj(0,1) to generate uniformly distributed random numbers within the range [0, 1]:
   Xj,i(0) = Xj_min + randj(0,1) * (Xj_max − Xj_min), where i = 1, ..., Np and j = 1, ..., D; Xj_min and Xj_max are the minimum and maximum values.
4. Mutation: DE mutates and recombines the population to produce a population of Np trial vectors:
   X'i(G) = Xa(G) + S * [Xb(G) − Xc(G)], where a ≠ b ≠ c ≠ i and S ∈ (0, 1) is a scaling factor.
5. Crossover: DE builds a trial vector X''i(G) by mixing the mutant vector and the target vector Xi according to the probability distribution function:
   X''j,i(G) = X'j,i(G) if randj(0,1) ≤ CR, and Xj,i(G) otherwise, where the crossover constant CR ∈ (0, 1).
   CR is a user-defined parameter that controls the crossover. If the random number ≤ CR, the mutant component is selected; otherwise, the component Xj,i(G) is inherited from the target.
6. Selection: the selection operator determines the next population by comparing the trial and target vectors. If the trial vector X''i(G) gives a lower (better) fitness, it replaces the target vector in the next generation; otherwise, the target vector retains its place for one more generation:
   Xi(G + 1) = X''i(G) if f[X''i(G)] ≤ f[Xi(G)], and Xi(G) otherwise.
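A minimal, self-contained Python sketch of Algorithm 10.2 follows. This is an illustrative re-implementation, not the authors' code; the toy objective and client coordinates are hypothetical stand-ins for the real coverage objective, while S and CR match Table 10.4:

```python
import random

def differential_evolution(fitness, bounds, Np=20, S=0.6, CR=0.5,
                           generations=200, seed=1):
    """Minimal DE following Algorithm 10.2: random initialization,
    mutation, crossover, and greedy selection. `bounds` is a list of
    (min, max) per dimension; returns the best vector found."""
    rng = random.Random(seed)
    D = len(bounds)
    # Step 3: uniform random initial population inside the search box
    pop = [[lo + rng.random() * (hi - lo) for lo, hi in bounds]
           for _ in range(Np)]
    for _ in range(generations):
        for i in range(Np):
            # Step 4: mutation, X' = Xa + S * (Xb - Xc), with a != b != c != i
            a, b, c = rng.sample([k for k in range(Np) if k != i], 3)
            mutant = [pop[a][j] + S * (pop[b][j] - pop[c][j]) for j in range(D)]
            # Step 5: per-component crossover between mutant and target
            trial = [mutant[j] if rng.random() <= CR else pop[i][j]
                     for j in range(D)]
            # keep the trial vector inside the grid
            trial = [min(max(trial[j], bounds[j][0]), bounds[j][1])
                     for j in range(D)]
            # Step 6: selection, keep whichever vector has better fitness
            if fitness(trial) <= fitness(pop[i]):
                pop[i] = trial
    return min(pop, key=fitness)

# Toy objective standing in for "uncovered clients": squared distance of a
# single MR position from a cluster of client positions (hypothetical).
clients = [(10, 12), (14, 9), (11, 15)]
cost = lambda x: sum((x[0] - cx) ** 2 + (x[1] - cy) ** 2 for cx, cy in clients)
best = differential_evolution(cost, bounds=[(0, 64), (0, 64)])
```

With this quadratic objective, the best vector converges to the centroid of the clients; for the real problem, the fitness would score an entire set of MR positions by the coverage they achieve.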
MRs differ from other routers in terms of coverage and power constraints. One of the significant features of a WMN is its robustness: every node in the network topology is connected in a multihop fashion, which enables information to be transmitted redundantly over the available paths. Carrying sensor traffic over a mesh backbone has several advantages, such as more bandwidth and better energy and power efficiency. When the transmission power of the sensors is reduced, the energy used is also reduced. Hence, only a small number of intermediate hops occur between the source and the destination, which reduces the end-to-end delay [14, 21]. There are several methods to route the traffic from a sensor network into a mesh network: (1) the mesh backbone simply acting as a repeater to route the traffic, (2) treating gateway nodes as super nodes, (3) adding intelligence to the backbone to avoid unwanted packet transmission, (4) providing backward compatibility using a protocol translation gateway, and (5) providing a virtual stack instead of replacing the existing one. The latter two methods are complex. The sequence diagram showing the routing between the coordinator node and the mesh backbone is illustrated in Figure 10.4.
Figure 10.4 Sequence diagram of mesh peering and routing of medical data (SCN: Sensor Coordinator Node; MR: Mesh Router; MG: Mesh Gateway). The exchange proceeds as Beacon, Peer Link Open, Peer Link Confirm, mesh peer link creation, ACK, Bio_data, ACK.
When a critical request is initiated in the HMC, the coordinator node sends a Beacon signal to the neighboring MRs. An MR that is ready responds to the signal, and a mesh peer link is established between the routers and the gateway. A peer link ACK is sent from the MR to the coordinator node. On receiving the ACK packet, the Bio_data signal containing the medical record of the critical patient is transmitted from the coordinator node to the MR. The MR then uses the Open Shortest Path First (OSPF) routing protocol [14] to route the medical data over the shortest path without losing any data. The MG receives the Bio_data and acknowledges the MR. The life of the patient depends on the response time of each node involved in transferring the data to the faraway E-health server. The response time of the E-health server doctor is also important and challenging. Ambulance services networked in ad hoc mode serve the villages if the HMC cannot assist with further treatment.
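OSPF selects routes through a shortest-path computation (Dijkstra's algorithm). As a sketch of that computation over a hypothetical mesh topology, with link weights standing in for link cost:

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra's algorithm, the computation underlying OSPF route
    selection. `graph` maps node -> {neighbor: cost}.
    Returns (total_cost, path)."""
    pq = [(0, src, [src])]          # priority queue of (cost so far, node, path)
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []         # destination unreachable

# Hypothetical mesh: coordinator (SCN) -> mesh routers -> gateway (MG)
mesh = {
    "SCN": {"MR1": 1, "MR2": 4},
    "MR1": {"MR2": 1, "MG": 5},
    "MR2": {"MG": 1},
}
cost, route = shortest_path(mesh, "SCN", "MG")
```

Here the route found is SCN, MR1, MR2, MG at a total cost of 3, cheaper than the direct two-hop alternatives.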
10.3 Experimental Evaluation
In this section, an experimental evaluation is performed for 100 critical-request patients. A GUI framework is created for the server end, as shown in Figure 10.5, in which the medical records of the patients are maintained. The datasets are referred from the UCI diabetes repository created from AIM'94, with 20 attributes. The codes 58, 59, and 65 are deciphered and used in the database for reference against the real-time patient data. The pre- and post-breakfast glucose readings are monitored, and the patient's diabetic level is classified. The cuff-less blood pressure dataset (2015) provides preprocessed and cleaned vital signals with 3 attributes. The lookup request algorithm checks the levels of glucose and blood pressure and gives an alert message to initiate a critical request for the patient. The DE method of node placement uses a population size of 1,000, 48 MRs, 10 HMCs (clients), and a grid size of 64 × 64 in a 3,000 m × 3,000 m
Figure 10.5 GUI alert when the patient's blood pressure and sugar are critically high. (a) Main menu. (b) Monitoring blood sugar. (c) Monitoring blood sugar.
Table 10.4 DE parameter settings.

Parameter: Value
Number of generations: 200
Population size: 1,000
Crossover constant (CR): 0.5
Scaling factor (S): 0.6
geographical area. The DE parameter settings are given in Table 10.4. The optimal placement topology is selected to maximize coverage.
10.4 Performance Evaluation
The proposed system is evaluated using metrics such as the energy consumption of the sensor coordinator node, the survival rate of critical patients, end-to-end delay, latency, and the response time of the server. The experimental results of the proposed system are compared with the conventional placement method.
10.4.1 Energy Consumption
Energy is an important factor for the sensor coordinator nodes in the HMC; it acts as a parameter that decides the network lifetime. The sensors need energy for acquisition, communication, and processing. As the number of patient samples increases, more energy is consumed, as shown in Figure 10.6. Sleep and wake strategies in the sensor nodes keep them from draining energy quickly. The physician in the HMC monitors the coordinator so that it always has maximum energy to process the data.
10.4.2 Survival Rate
To evaluate the proposed system, the survival rate is calculated from the medical records in the database server of the HMC. It is defined as the ratio of the number of newly diagnosed patients under observation (A) minus the number of deaths that occurred in a specified period (D) to the number of newly diagnosed patients (A).
Figure 10.6 Energy consumption (mJ) in the HMC versus the number of patient samples.
S = ((A − D) / A) × 100    (10.3)
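Equation (10.3) translates directly to code; a small sketch:

```python
def survival_rate(diagnosed, deaths):
    """Survival rate per Equation (10.3): S = ((A - D) / A) * 100,
    where A is the number of newly diagnosed patients under observation
    and D is the number of deaths in the period."""
    if diagnosed == 0:
        raise ValueError("no patients under observation")
    return (diagnosed - deaths) / diagnosed * 100
```

For example, 100 patients under observation with 8 deaths gives a survival rate of 92%.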
Figure 10.7 shows the survival rate of the critical-request patients with random placement of nodes and with DE placement of nodes. Our system shows an improved survival rate over the conventional method.
10.4.3 End-to-End Delay
End-to-end delay is the time taken to transmit from the source to the destination node, i.e., the average time taken by a data packet to arrive at the destination. It includes the delay caused by the route discovery process and the queue in data packet transmission. Only the data packets successfully delivered to destinations are counted. A lower end-to-end delay means the routing protocol is performing well. Figure 10.8 shows the end-to-end delay as a function of the number of bio-data packets.
End-to-End Delay = Σ(arrive time − send time) / Σ(number of connections)    (10.4)
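A sketch of Equation (10.4), counting only successfully delivered packets as the text requires (reading "number of connections" as the count of delivered packets, which is an interpretation):

```python
def end_to_end_delay(packets):
    """Average end-to-end delay per Equation (10.4).
    `packets` is a list of (send_time, arrive_time) pairs in ms;
    arrive_time is None for packets that were never delivered."""
    delays = [arrive - send for send, arrive in packets if arrive is not None]
    if not delays:
        return 0.0               # nothing delivered, no measurable delay
    return sum(delays) / len(delays)
```

For example, two delivered packets with delays of 40 ms and 60 ms, plus one lost packet, yield an average delay of 50 ms.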
Figure 10.7 Survival rate versus number of critical patients, for random placement and DE placement.

Figure 10.8 End-to-end delay (ms) versus number of bio_data packets, for random placement and DE placement.
10.5 Conclusion
The proposed system stands out for its wide coverage of health services for underdeveloped countries. The system allows early identification of risk cases so that they can be treated to increase the survival rate. It also eliminates the burden of manual record keeping and serves as an alert service for public health. The statistical analysis and individual follow-ups help patients who are unable to travel long distances to city hospitals. The performance of the HMC is improved by a mesh backbone optimally placed using differential evolution, compared with conventional placement. The work can be further extended with fault-tolerant mechanisms in the mesh backbone to further reduce data loss and delay.
References
1. Aljumah, A.A. et al., Application of data mining: Diabetes healthcare in
young and old patients. J. King Saud Univ.–Comp. Inform. Sci., 25, 127–136,
2013.
2. Misra, P., Upadhyaya, R.P. et al., A review of the epidemiology of diabetes in
rural India. Diabetes Res. Clin. Pract., 92, 3, 303–311, 2011.
3. Buckaert, S., Interconnecting Wireless Sensor Networks and Wireless Mesh Networks: Challenges and Strategies, in: Proc. of IEEE Communication Society GLOBECOM, 2009.
4. Lee, R.-G., Hsiao, C.-C., Chen, C.-C., Liu, M.-H., A mobile-care system integrated with Bluetooth blood pressure and pulse monitor, and cellular phone.
IEICE Trans. Inform. Syst., E89-D, 5, 1702–11, 2006.
5. Lee, R.-G., Chen, K.-C., Hsiao, C.-C., Tseng, C.-L., A mobile care system
with alert mechanism. IEEE Trans. Inform. Technol. Biomed., 11, 5, 507–17,
2007.
6. Lai, C.-C. et al., A H-QOS Demand personalized home physiological monitoring system over a wireless multihop relay network for mobile home
healthcare application. J. Network Comput. Appl., 32, 6, 1229–1241, 2009.
7. Yuce, M.R., Implementation of Wireless Body area network for healthcare
system. Sens. Actuators A: Phys., 162, 1, 116–129, 2010.
8. Darwish and Hassanien, A.E., Wearable and Implantable Wireless Sensor
Network Solution for Healthcare Monitoring. Sensors, 11, 5561–5595, 2010.
9. Youm, S. et al., Development of remote healthcare system for measuring and
promoting healthy lifestyle. Expert Syst. Appl., 38, 2828–2834, 2011.
10. Kim, Y. and Lee, S., Energy-efficient wireless hospital sensor networking for
remote patient monitoring. Inform. Sci., 282, 332–349, 2014.
11. Sung, W.-T. and Chang, K.-Y., Health parameter monitoring via a novel wireless system. Appl. Soft Comput., 22, 667–680, 2014.
12. Yang, J.-J. et al., Emerging information technologies for enhanced healthcare.
Comput. Ind., 69, 3–11, 2015.
13. Wolfram et al., Health-Enabling Technologies for the elderly: An overview of services based on a literature review. Comput. Methods Programs Biomed., 106, 70–78, 2012.
14. Sheeba, G.M. and Nachiappan, A., Improving Link Quality using OSPF
Routing Protocol in a Stable WiFi Mesh Network. International Conference
on Communication and Signal Processing, IEEE, pp. 23–26, 2012.
15. Steele, R. et al., Elderly persons’ perception and acceptance of using wireless
sensor networks to assist healthcare. Int. J. Med. Inform., 78, 788–801, 2009.
16. Hawley-Hague, H. et al., Older adults' perceptions of technologies aimed at falls prevention, detection or monitoring: A systematic review. Int. J. Med. Inform., 83, 416–426, 2014.
17. Sheeba, G.M. and Nachiappan, A., Fuzzy Differential Evolution Based
Gateway Placements in WMN for Cost Optimization. Adv. Intell. Syst.
Comput., 385, 137–145, 2015.
18. Jothi, K.R. and Jeyakumar, A.E., An Effective Approach for Bio-Medical Data Transmission Using Hop Scheduled Data Dissemination Through VANET. J. Pure Appl. Microbiol., 9, 147–153, 2015.
19. Sheeba, G.M. and Nachiappan, A., Gateway Placements in WMN with Cost
Minimization and Optimization using SA and DE Techniques. Int. J. Pharm.
Technol., 7, 1, 8274–8281, 2015.
20. Wan, J., Al-awlaqi, M.A.A.H., Li, M. et al., Wearable IoT enabled real-time health monitoring system. J. Wirel. Commun. Netw., 2018, 298, 2018, https://doi.org/10.1186/s13638-018-1308-x.
21. Sheeba, G.M. and Nachiappan, A., Computation of Mesh Node Placements Using DE Approach to Minimize Deployment Cost with Maximum Connectivity. Wirel. Pers. Commun., 107, 291–302, 2019, https://doi.org/10.1007/s11277-019-06255-8.
11
Management of Diabetes Mellitus (DM) for Children and Adults Based on Internet of Things (IoT)
Krishnakumar S.1*, Umashankar G.1, Lumen Christy V.1, Vikas1 and Hemalatha R.J.2

1Department of Biomedical Engineering, School of Bio and Chemical Engineering, Sathyabama Institute of Science and Technology, Chennai, Tamilnadu, India
2Department of Biomedical Engineering, Vels Institute of Science, Technology & Advanced Studies, Chennai, Tamilnadu, India
Abstract
Diabetes mellitus (DM) is a metabolic disorder characterized by hyperglycemia, resulting from defects in the secretion or action of insulin. Diabetic disorders are increasingly common across all groups of people due to various factors. The aim of this study is to design and develop an assistant system for DM in children and adults using the Internet of Things (IoT). The prototype device contains microcontrollers that update the patients' real-time parameters to the IoT cloud database. The sensors used are the MAX30100 to measure SpO2, a non-invasive glucose sensor to quantify the blood glucose level, and a temperature sensor to measure the patient's temperature continuously. A chatbot transmits data to communicate about the patient's health conditions. The result is an advanced assistive device that helps children and adults manage diabetes in their daily lives for better treatment. The main target of this project is to protect children from the harmful conditions of diabetes with the help of technical aids such as a robotic assistant and an emergency alarm. The present study monitors blood glucose level, body temperature, pulse rate, and SpO2 effectively. The data recorded by the device are sent directly to doctors through the internet, so the doctors can monitor the patient continuously from medical centres. Based on the patient's health parameters and blood glucose level, doctors can send prescriptions to the nurses or caretakers of the patient through the IoT.
Keywords: Diabetes mellitus, blood glucose, glucose sensor, chatbot, IoT
*Corresponding author: drkrishnakumar_phd@yahoo.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (207–224) © 2022 Scrivener Publishing LLC
11.1 Introduction
Diabetes mellitus (DM) is one of the most widely recognized dangerous illnesses for all age groups in the world. These days, diabetes is becoming a serious and disturbing sickness because of the lifestyle and food habits of individuals [1]. Diabetes is a condition known to increase blood glucose levels, and it may thus also alter different metabolic pathways. The change in metabolism directly or indirectly influences the electrochemistry of various body fluids, such as blood, saliva, urine, and tears. In DM, the body cannot control its blood glucose levels. Diabetes is distinguished into type 1 and type 2. Type 1 diabetes is a disease in which the affected person cannot produce the insulin hormone needed to regulate blood glucose levels; the insulin hormone is fundamental for the human body to convert glucose into energy. In type 2 diabetes, the body's insulin supply is inadequate to convert glucose to energy. This generally happens in individuals 40 or more years old. Type 2 diabetes is spreading worldwide more quickly than type 1, and it may lead to many serious ailments, for example, cardiovascular diseases, eye disorders, renal disorders, brain dysfunction, and premature mortality [2].
There are 415 million adults who have been affected by diabetes, and the number is expected to rise to 642 million by the year 2040 [3]. Diabetes is a leading cause of sickness and accounts for one out of 10 deaths among individuals 20–59 years of age. In the UK, three individuals are diagnosed with DM at regular intervals, and the most upsetting fact is that around 5 lakh (500,000) individuals with diabetes are still undiagnosed. As indicated by an earlier report of the International Diabetes Federation, 382 million individuals were found to be diabetic in the year 2013. With the largest number of people with diabetes, Malaysia is ranked tenth in the world (World Health Organization, 2016). The fundamental cause of DM is still unclear, but body weight, gender, diet, genetics, and physical activity are firmly implicated. Because blood glucose can remain persistently high without symptoms, diabetes may go unnoticed for 1 to 6 years, which may further lead to other critical medical problems, such as kidney failure, cardiovascular disease, vision impairment, stroke, and neuropathy. Research indicates that half of all diabetes patients are expected to experience nervous disorders and vision issues. The major drawback associated with diabetes testing is its blood dependency, which makes it an invasive approach and also increases the risk of infection for the patient. In addition, the results of such testing take considerable time.
11.1.1 Prevalence
The prevalence of diabetes (both type 1 and type 2) in adults aged 20–70 years was 415 million worldwide, and this is expected to reach 642 million adults by 2040. In the UK, 10% of people with diabetes have type 1 diabetes and 90% have type 2, corresponding to 400,000 and 3.6 million individuals, respectively. In the UK, there are around 31,500 children and young people under 19 with diabetes. This may be an underestimate, since not all children over 15 years of age are under pediatric care. Around 95% have type 1 diabetes, 2% have type 2 diabetes, and 3% have maturity-onset diabetes of the young or cystic fibrosis-related diabetes.
11.1.2 Management of Diabetes
Effective diabetes treatment decreases the risk of long-term disease-related complications, which include cardiovascular disease, vision impairment, stroke, kidney disease, and amputations that cause disability and premature mortality. With treatment that keeps circulating glucose levels as close to normal as reasonably possible, the risk of complications, and thus of tissue damage, is dramatically decreased. If blood glucose levels go too high (hyperglycemia) or too low (hypoglycemia), short-term complications may occur. Extreme hypoglycemia is the more serious and requires urgent support: it can cause fits, lack of concentration, unconsciousness, and even death. Hyperglycemia side effects, by contrast, include increased urination, migraines, sleepiness, lethargy, and increased thirst. For many people with diabetes, dealing with the condition affects their lifestyle and quality of life. Diabetes is managed day to day by the person with the disorder or with the aid of a caretaker. Type 1 diabetes requires regular checking of blood glucose levels and injecting insulin as and when needed, to ensure a sufficient degree of glycemic regulation and to detect low blood glucose before hypoglycemia occurs. The standards recommend that adults with type 1 diabetes monitor their blood glucose levels at least four times a day, before each meal and before sleep. Children with type 1 diabetes are advised to monitor their blood glucose a minimum of five times daily. Conventional invasive strategies for assessing glucose levels require a patient to prick his or her finger (penetrating the skin) to gather a blood sample and determine the blood glucose level. This traditional strategy presents difficulty for patients with diabetes because they have to prick their fingers several times each day. The patients feel discomfort, distress, and possibly pain, depending on the severity of the finger puncture. The invasive technique can damage finger tissue and cause serious pain as well; in addition, the needle can introduce dangerous infections into the bloodstream. All in all, the most common commercially available glucose monitoring devices are invasive, requiring a blood sample to determine the glucose concentration in the blood. To lessen patient discomfort, various non-invasive strategies are used.
Portable and wearable body sensors have recently been developed, with broad attention in medical-service applications for persistent, real-time observation of physical parameters and the individual health of patients. These sensors are deployed to measure pulse, blood SpO2 level, body temperature, and glucose detected from sweat. In this regard, it is essential to create non-invasive wearable sensors and frameworks that determine and track blood glucose levels in a continuous monitoring framework. Continuous Glucose Monitoring (CGM) and implantable frameworks are well known in the medical-care industry, but they are invasive, require replacement after 2 or 3 days, and carry restrictions such as limited battery life. The glucometer works on the principle of electrochemical detection in the body [4]. In earlier days, glucose levels could be observed by GBP-coated sensors, for example, in on-body CGM gadgets. CGM gadgets commonly have glucose sensors with a needle or probe that is embedded into the tissue of a user to measure the glucose level in the surrounding tissue fluid [5]. This monitoring is also performed by designing non-invasive, microcontroller-based blood glucose detection.
11.1.3 Blood Glucose Monitoring
People with diabetes make better choices about their diet, activity, and insulin needs by tracking their blood glucose levels. Individuals with diabetes usually use a handheld device known as a blood glucose meter to monitor their blood glucose levels. More than 65 blood glucose meters are presently available, varying in size, weight, test time, memory capability, and special features.
11.1.4 Continuous Glucose Monitors
CGM gadgets can track glucose continuously and automatically. A standard framework includes a disposable glucose sensor inserted just under the skin and worn for a few days before replacement. The sensor connects to a non-embedded transmitter, while a radio receiver and an electronic circuit, worn like a pager, track and display glucose levels. These gadgets measure the glucose levels in the interstitial fluid in and around cells.
11.1.5 Minimally Invasive Glucose Monitors
Minimally invasive glucose control systems compromise the skin barrier without entering the blood vessels. However, such frameworks fall short of the accuracy and control of currently available systems, particularly during the night. Minimally invasive systems that sample the interstitial fluid (ISF) have been established.
11.1.6 Non-Invasive Glucose Monitors
Non-invasive glucose monitors measure glucose levels without compromising the skin barrier. Such technologies are expected to provide continuous readings, like the CGMs currently in use, or sporadic readings where testing is essential to understanding trends. An advanced assistive device can help manage diabetes in youngsters and guide their day-by-day life toward better treatment. The principal target of this venture is to protect children from the destructive states of diabetes with the help of technical aids such as a robotic assistant and an emergency alert. By 2020, wellness was expected to be a commonplace, inevitable thing on a global scale, with fewer actual trips to medical services and outright smart clinics: a rough picture of Internet of Things (IoT) progress. As young as the idea seems to be, many of the progressive hospitals of the present are already embracing it; a significant portion of them are either rolling out large IoT processes or have improved components now in their adjustment phase. The production of IoT devices in medical services is estimated to cross 161 million units by the end of 2020.
11.1.7 Existing System
In the existing system, things are done manually and there is no assistive device to monitor the real condition of the patient at every moment. Parents have to visit a hospital or medical center for checkups and have to keep a nurse or caretaker for their child, though, as we know, no one can provide better care than a mother. In the existing system, patients need to visit the hospital for treatment even in the beginning stage of diabetes, and the data are monitored using a machine that displays them.
The main objective of the present study is to design a compact non-invasive blood glucose monitoring gadget. The device should be able to detect the glucose level in blood using a red laser, determine the glucose level, and show it on an LCD screen. This work is planned around a chip-based implementation of the glucose monitoring framework. Persistent observation of the blood glucose level is done non-invasively using red laser light transmittance and absorbance. Hence, the analysis is performed for early detection, to avoid visual impairment and mortality due to DM.
11.2 Materials and Methods
The Beer-Lambert law is an optical relation that connects a material's absorption to its concentration. It provides a way to compute the amount of a substance in a sample from its absorption rate, since the material's light absorption correlates with the amount of the substance present. The Beer-Lambert law describes how light is attenuated by the object in question: the power of transmitted light decreases exponentially as the concentration of the substance increases, and it decreases exponentially as the distance traveled through the material increases. Using this as the fundamental concept, a laser beam is sent through the fingertip to measure the glucose content in a blood sample. A simplified model of this law gives the equation:
absorption = (intensity of incident light) / (intensity of transmitted light)

11.2.1 Artificial Neural Network
The artificial neural network (ANN) used to determine the blood glucose concentration was developed and trained using TensorFlow, an open-source platform for deep learning and machine learning created and maintained by Google. It enables the creation of tensor-based neural network models that operate on multidimensional data arrays, and it can run on multiple CPUs and GPUs and be driven from C++, Python, and Java, among other languages. The ANN was served through Flask, a Python microframework for developing web application programming interfaces (APIs); the end device uses the same framework to provide Python microservices.
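The chapter's trained TensorFlow model is not reproduced here; as a sketch of the computation such a network performs at inference time, a tiny fully connected forward pass in plain Python, with hypothetical weights:

```python
import math

def forward(x, layer_params):
    """Forward pass of a small fully connected network, the kind of
    computation a trained TensorFlow model performs at inference time.
    `layer_params` is a list of (weights, biases); hidden layers use a
    sigmoid activation, the last layer is linear."""
    for k, (W, b) in enumerate(layer_params):
        x = [sum(wi * xi for wi, xi in zip(row, x)) + bi
             for row, bi in zip(W, b)]
        if k < len(layer_params) - 1:        # activate hidden layers only
            x = [1.0 / (1.0 + math.exp(-v)) for v in x]
    return x

# Hypothetical weights: 3 image-derived features -> 2 hidden units ->
# one glucose estimate in mg/dL (illustrative values, not the real model).
layer_params = [
    ([[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]], [0.0, 0.1]),  # hidden layer
    ([[1.2, 0.7]], [90.0]),                              # linear output
]
estimate = forward([0.4, 0.6, 0.2], layer_params)[0]
```

The real model would learn these weights from histogram features of the fingertip images rather than having them fixed by hand.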
11.2.2 Data Acquisition
The acquisition setup comprises a Raspberry Pi camera and a 650-nm laser, both embedded in a 3D-printed case attached to the glove's index fingertip. The laser is positioned in the case facing the camera lens, with enough space in between to enclose an individual's fingertip comfortably. This configuration is designed for data collection that captures how the laser beam interacts with the finger:
1. The laser beam acts as the light source that travels through the medium.
2. The finger is the medium through which the light is sent.
3. The camera functions as a detector, recording the emerging light and how it disperses as it travels through the finger.
At this stage, 640 × 480 px fingertip images are taken, allowing the camera an 8-second interval to achieve an accurate focus for each capture. The loop completes in 2 minutes, yielding a total of 14 images. The first and last photos may be corrupted by the act of inserting and withdrawing the finger, introducing errors into later phases. Consequently, the framework considers only the central 12 images, discarding the first and the last. The camera used in this project was configured to night exposure mode with an exposure correction of 25, ISO of 800, and brightness and contrast levels of 70.
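The keep-the-central-frames policy just described can be sketched as follows. The file names and capture constants are illustrative; the real system drives a Raspberry Pi camera:

```python
CAPTURE_COUNT = 14      # images taken over the 2-minute loop
CAPTURE_INTERVAL_S = 8  # seconds allowed per capture for accurate focus

def select_central_frames(frames):
    """Discard the first and last captures, which may be corrupted by
    inserting and withdrawing the finger, and keep the central images."""
    if len(frames) < 3:
        raise ValueError("need at least three captures")
    return frames[1:-1]

captures = [f"frame_{i:02d}.png" for i in range(CAPTURE_COUNT)]
kept = select_central_frames(captures)
assert len(kept) == 12
assert kept[0] == "frame_01.png" and kept[-1] == "frame_12.png"
```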
11.2.3 Histogram Calculation
Image histograms, in general, summarize global statistics of pixel intensities and help in optimizing images. Histograms are also part of the information flow that allows light scattering to be studied in different ways.
214
The Internet of Medical Things (IoMT)
In the presented design, the histogram serves as a descriptor of the images obtained, reflecting the intensity of the light transmitted through the finger at capture time. Because the histogram is used as an object descriptor, the dissipation of light in the finger becomes directly observable.
Significant differences in blood glucose can be observed between fasting and several hours after a meal. Previous studies ran a series of experiments using the red, green, and blue histograms of fingertip photographs of subjects who had fasted for 8 hours and were then photographed again two hours after eating. As shown by the Mann-Whitney statistical test, the blue histograms exhibited statistically significant variation. This work therefore uses blue-channel histograms.
In order to perform an effective data transfer, the histogram of the blue channel of the images obtained is computed on the RPi prior to transferring information to the IoT cloud-processing level. Instead of transmitting the whole file, only 256 values are transmitted for each image of more than 300,000 pixels, dramatically reducing data-transmission latency.
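The on-device reduction described above can be sketched with NumPy. The frame here is synthetic; real frames come from the Pi camera:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 640 x 480 RGB frame standing in for a fingertip capture.
frame = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)

# 256-bin histogram of the blue channel (index 2 in RGB order).
blue = frame[:, :, 2]
hist, _ = np.histogram(blue, bins=256, range=(0, 256))

assert hist.shape == (256,)
assert int(hist.sum()) == 480 * 640   # every pixel is counted exactly once
# The payload shrinks from 307,200 pixel values to 256 bin counts.
print(f"{blue.size} pixel values -> {hist.size} transmitted values")
```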
11.2.4 IoT Cloud Computing
The ANN was trained in Python using the TensorFlow library. The training set consists of 514 histograms, each obtained by averaging the 12 histograms captured per subject.
The ANN used in this analysis has 256 input neurons (the pixel-intensity bins), two hidden layers of 1,024 neurons each, and a single output neuron giving the glucose concentration level. A dropout of 0.20 was applied at the end of both hidden layers, and the ReLU activation function was used throughout the model. The ADAM optimizer was used to train the ANN to minimize the error, with a total of 100 epochs and a batch size of 50. Two evaluation metrics were used to assess the model's performance: the mean absolute error (MAE) and Clarke error grid analysis. The MAE is determined from the equation MAE = (1/n) Σ |y − ŷ|, where y represents the reference glucose values and ŷ the levels estimated by the algorithm. In the Clarke error grid, the reference glucose values are plotted against the estimated values and divided into five regions: zones A and B denote clinically accurate or acceptable estimates, zone C may lead to unnecessary treatment, while zones D and E may lead to potentially dangerous failures of treatment.
Cross-validation: the entire input data set was split into a training/testing subset and a validation subset. A randomly chosen 70% of the data was used with 10-fold cross-validation as the training/testing subset; the remaining 30% served as the validation subset. In 10-fold cross-validation, the data is uniformly partitioned into 10 groups or folds, each with an equal number of samples. One fold functions as the test subset, while the other nine folds are used to train the classifier. This procedure is repeated 10 times until each fold has served as the test subset.
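The split described above can be sketched as follows; the shuffling seed and the strided fold assignment are illustrative details, not taken from the chapter:

```python
import random

def split_and_fold(indices, train_frac=0.70, k=10, seed=1):
    """Hold out (1 - train_frac) of the data for validation and partition
    the rest into k equal folds for cross-validation."""
    idx = list(indices)
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * train_frac)
    train, validation = idx[:cut], idx[cut:]
    folds = [train[i::k] for i in range(k)]
    return folds, validation

folds, validation = split_and_fold(range(500))
assert len(validation) == 150                 # the 30% validation subset
assert sum(len(f) for f in folds) == 350      # the 70% used for 10-fold CV
# Each fold serves once as the test subset; the other nine train the model.
for test_fold in folds:
    training_folds = [f for f in folds if f is not test_fold]
    assert len(training_folds) == 9
```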
11.2.5 Proposed System
The present study designs and implements a system for 24/7 health monitoring of diabetic children. The NodeMCU board is used for collecting and processing all data, and its ESP8266-12E module connects the system to the internet. Several sensors measure the different parameters: an artificial medical assistant-based chatbot monitors glucose, temperature, heartbeat, and SpO2 through the IoT. A noninvasive glucose sensor reads the patient's glucose value from the fingertip, and the other sensors are also connected to the patient to gather the relevant data.
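As an illustration of how such readings might be packaged for the cloud level, a hypothetical JSON payload is sketched below. All field names and values are invented for illustration; the chapter does not specify a message format:

```python
import json

# Hypothetical reading set; field names, units, and values are illustrative.
reading = {
    "patient_id": "child-042",
    "glucose_mg_dl": 112,
    "temperature_c": 36.8,
    "heart_rate_bpm": 88,
    "spo2_pct": 97,
}
payload = json.dumps(reading)

# Round-trip check: the cloud side decodes the same reading.
decoded = json.loads(payload)
assert decoded["spo2_pct"] == 97
assert decoded["patient_id"] == "child-042"
```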
11.2.6 Advantages
IoT makes many clinic visits avoidable by gathering and thoroughly reviewing critical health data remotely. It also opens space for valuable long-term technologies. Perhaps the most favorable role of IoT in medical care is the efficient autonomous device that costs less to run and to "employ" over the long term. Specialists can see all the critical details and check patients' real-time condition from their office.
11.2.7 Disadvantages
The widespread use of the IoT for medical treatment carries risks. Privacy can conceivably be compromised: as already mentioned, frameworks get hacked, so considerable attention, and substantial additional investment, must be devoted to protecting the data. There is also the risk of unauthorized access to centralized systems: malicious intruders may break into them to pursue harmful goals. In addition, worldwide health associations are releasing recommendations that government clinical institutions must actively adopt when implementing IoT into their work processes; to some extent, these can restrict future capabilities.
Figure 11.1 Block diagram of the proposed system: NodeMCU and power supply connected to an I2C LCD, a MAX30100, a noninvasive glucose sensor, a temperature sensor, and a Nano with the chatbot.
11.2.8 Applications
The IoT creates a centralized network of interconnected devices within a single system that can generate and exchange data (Figure 11.1). All that information can be tracked and assembled in real time, providing a latent accumulation of analytical material. In terms of developing clinical facilities, the regular clinic can be turned into a smart hospital: an advanced facility where everything is simultaneously tracked and monitored, with all the information collected in a centralized database. The advantages of IoT applications in medical services are numerous and growing, and the technology has an extremely diverse field of use in medicine.
11.2.9 Arduino Pro Mini
The Arduino Pro Mini is a microcontroller board developed by Arduino.cc around the ATmega328 microcontroller. The board has 14 digital I/O pins, 6 of which can provide PWM output, and 8 usable analog pins. It is remarkably thin, roughly one-sixth the overall size of the Arduino Uno. A single voltage regulator is consolidated on the board, for either 3.3 or 5 V depending on the board's version. The 5-V version runs at 16 MHz, like the Arduino Uno, while the 3.3-V version runs at 8 MHz.
There is no USB port on the board, so it requires an external programmer. The marking on the controller characterizes the version of the board; for example, KB33 indicates a 3.3-V edition and KB50 indicates a 5-V edition. The board version can also be determined by measuring the voltage between the Vcc and GND pins. The board does not come with built-in connectors, giving you the flexibility to solder connectors in whatever way the prerequisites and available space allow. The Arduino Pro Mini is open source, so you can modify and use the board according to your specifications, since all the documentation and support for this board is readily available. Another feature that makes this device safe to use, in applications where excess current would affect the overall project output, is its overcurrent protection. It comes with 32 KB of flash memory, 0.5 KB of which is used for the bootloader; the flash memory stores the board's code and is non-volatile, retaining its contents even when power is lost. The Static Random-Access Memory (SRAM) is volatile in nature and depends on a constant power supply. The EEPROM can be erased and reprogrammed by applying higher-than-normal electrical signals.
11.2.10 LM78XX
The LM78XX is a three-terminal regulator offered in several fixed output voltages, making it suitable for a wide range of applications. One advantage is local on-card regulation, which avoids the distribution difficulties of single-point regulation. Because of the voltages available, these regulators can be used in logic systems, instrumentation, HiFi, and other solid-state electronic equipment. The devices can be combined with external components to obtain adjustable voltages and currents, although they were designed as fixed voltage regulators. The LM78XX series is available in an aluminum TO-3 package that allows over 1.0 A of load current. Safe-area protection for the output transistor is provided to limit internal power dissipation; if internal dissipation becomes too high for the heat sinking provided, the thermal shutdown circuit takes over and prevents the IC from overheating. Considerable effort went into making the LM78XX regulators convenient to use and into minimizing the number of external components. Output bypassing is not necessary, although it does improve transient response; input bypassing is needed only if the regulator is located far from the power supply's filter capacitor. For output voltages other than 5, 12, and 15 V, the LM117 series offers an output voltage range of 1.2 to 37 V. The LM78XX features include: output current in excess of 1 A; internal thermal overload protection;
no external components required; output transistor safe-area protection; internal short-circuit current limiting; availability in the aluminum TO-3 package; and output voltages of 5 V (LM7805C), 12 V (LM7812C), and 15 V (LM7815C).

The I2C 1602 LCD module is a 2-line by 16-character display interfaced through an I2C board. The I2C interface needs only two data connections plus +5 VDC and GND. Its specifications include: 2 lines by 16 characters; I2C address range 0x20 to 0x27; operating voltage 5 VDC; white backlight, adjustable by a potentiometer on the I2C interface; dimensions 80 mm × 36 mm × 20 mm, with a 66 mm × 16 mm display area. The device is powered by a single 5 VDC connection.
11.2.11 MAX30100
The MAX30100 is an integrated sensor solution for pulse oximetry and heart-rate monitoring. It combines LEDs for sensing oxygen saturation and pulse-rate signals, a photodetector, optimized optics, and low-noise analog signal processing in a single unit. The MAX30100 operates from 1.8 V and 3.3 V power supplies and can be powered down through software with negligible standby current, allowing the power supply to remain connected at all times. Other highlights of the part include ultra-low-power operation, which increases battery life in wearable devices; programmable sample rate and LED current for power savings; ultra-low shutdown current (0.7 μA typical); and advanced functionality with high-SNR measurement that offers robust motion-artifact resilience.
11.2.12 LM35 Temperature Sensors
The LM35 is a precision integrated-circuit temperature sensor whose output voltage is linearly proportional to the Centigrade temperature. The LM35 has an advantage over linear temperature sensors calibrated in Kelvin, since the user does not have to subtract a large constant voltage from the output to obtain convenient Centigrade scaling. The LM35 requires no external calibration or trimming to achieve typical accuracies of ±1/4°C at room temperature and ±3/4°C over the full −55°C to 150°C temperature range. Low cost is ensured by trimming and calibration at the wafer level. The low output impedance, linear output, and precise inherent calibration of the LM35 make interfacing to readout or control circuitry especially easy. The device can be used with single power supplies or with plus and minus supplies.
Since it draws only 60 μA from the supply, the LM35 has very low self-heating of less than 0.1°C in still air. The LM35C is rated for a −40°C to 110°C range (−10°C with improved accuracy). The LM35 series devices are available packaged in hermetic TO transistor packages, while the LM35C, LM35CA, and LM35D are also available in the plastic TO-92 transistor package. The LM35D is additionally available in an 8-lead surface-mount small-outline package and a plastic TO-220 package.
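Because the LM35's output is linearly proportional to temperature (10 mV/°C per the datasheet), converting a reading is a one-line calculation. The ADC reference voltage and resolution below are assumptions for illustration:

```python
def lm35_temperature_c(adc_count, vref=3.3, resolution=1024):
    """Convert a raw ADC reading of the LM35 output to degrees Celsius,
    using the LM35's 10 mV per degree C scale factor."""
    volts = adc_count * vref / resolution
    return volts / 0.010

# An ADC count of 93 on a hypothetical 3.3 V, 10-bit ADC is roughly 30 C.
t = lm35_temperature_c(93)
assert 29.0 < t < 31.0
```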
11.3 Results and Discussion
The estimation of blood glucose levels, which individuals with diabetes need in order to prevent both chronic and acute complications of the disease, without drawing blood, penetrating the skin, or causing pain or injury, remains the challenge driving this field (Figure 11.2 and Figure 11.3). The quest for a successful strategy started around 1975 and has continued to the present without a clinically or commercially viable product. As of 1999, just a single such product had been approved for sale by the FDA, based on a method of electrically drawing glucose through intact skin, and it was withdrawn after a while owing to poor performance and occasional damage to the skin of users.

Figure 11.2 Components of the noninvasive glucose monitoring system.

Figure 11.3 Prototype of the glucose monitoring system.
New methodologies that have been tried include near-infrared spectroscopy (estimating glucose through the skin using light of slightly longer wavelength than the visible region), transdermal measurement (drawing glucose through the skin using chemicals, electricity, or ultrasound), and measuring the degree to which glucose rotates polarized light in the anterior chamber of the eye.
A noninvasive glucose meter has been marketed in various nations across the globe [6]. Even so, as the mean absolute deviation of this device was almost 30% in clinical trials, further research was needed to substantially improve its precision. A recent continuous glucose monitoring system, using electrochemical detection of glucose in the blood, has been described by Cappon et al. [7]. To test blood sugar levels in the interstitial tissue fluid, a glucose sensor is inserted underneath the skin and attached to the transmitter as a tiny electrode. The transmitter relays the signal to the control and display system over a radio-frequency link. The device then detects whether the glucose level is below or above the normal range and informs the patient.
Figure 11.4 Output of glucose monitoring.
The advantage of the framework is that glucose levels can be continuously quantified throughout the day and night (Figure 11.4). Lai et al. [8] suggested another approach using electrochemical detection: an electrochemical sensor containing a glucose test strip and an automated test device is used in the application. Another ultrasonic methodology has been proposed by Buda and Addi [9], who also combined transdermal extraction of interstitial fluid with external plasmon-resonance detection. The two studies revealed that the processes used can accurately quantify the glucose content in the blood. The use of a subcutaneous implantation technique has the benefit of avoiding complications such as septicemia, blood-clot fouling, and embolism [10]. A glucose sensor with a fine needle or flexible wire has been designed, with the active sensing element fabricated and inserted in the subcutaneous tissue at its tip. Various types of continuous glucose monitoring mechanisms are marketed these days; such devices may use electrochemical detection or glucose oxidase optical detection to quantify glucose in the blood.
11.4 Summary
Diabetes patients are not continuously supervised; whether linked to a clinic or hospital, they normally manage their condition on their own. Patients must therefore make the best individualized day-by-day diabetes care decisions. For example, for patients with T1DM to keep their glucose levels within a satisfactory range, precise calculation of the insulin bolus per meal or snack is important, and caregivers are traditionally called on to support their young patients with it. In the proposed system, the robot provides decision support for calculating the insulin bolus and gives real-time feedback, summarizing the BG readings over recent hours and how they compare with the trend of readings obtained earlier. In addition, the system offers details on the detected BG patterns (through the robot) alongside fitting advice generated from both the present and historical knowledge stored in the patient's clinical record. This kind of assistance is produced by DMH and continuously transmitted to the patient (through the robot) without the direct mediation of the professional caregiver. The framework likewise supports the caregivers by producing a compliance index for every patient.
11.5 Conclusion
A fully functional IoT-based eHealth platform that pairs humanoid robot assistance with diabetes care has been successfully designed and implemented for children. This was achieved through participatory design of an informative, adjustable, and reconfigurable platform through which patients are intensively involved in creating their customized well-being profile, follow-up, and therapy schedules. The built platform supports a constant yet loosely coupled connection across the distance between patients and their caregivers; it thus increases the engagement of patients with their caregivers and limits the expense, time, and effort of traditional periodic clinic visits, and will likewise contribute to long-term behavioral change from unhealthy to healthy ways of life. The end-to-end functionality and data quality of the created platform were tested through a pilot clinical acceptability study. The proposed design and applications can also be viewed as a blueprint for building a generic eHealth platform for the management of chronic diseases other than diabetes. The platform consequently remains open for further technical upgrades and clinical studies.
References
1. Tabish, S.A., Is Diabetes Becoming the Biggest Epidemic of the Twenty-first
Century? Int. J. Health Sci. (Qassim), 1, 2, V–VIII, 2007.
2. Petrie, J.R., Guzik, T.J., Touyz, R.M., Diabetes, Hypertension, and
Cardiovascular Disease: Clinical Insights and Vascular Mechanisms. Can. J.
Cardiol., 34, 5, 575–584, May 2018.
3. Ogurtsova, K., da Rocha Fernandes, J.D., Huang, Y., Linnenkamp, U.,
Guariguata, L., Cho, N.H., Cavan, D., Shaw, J.E., Makaroff, L.E., IDF Diabetes
Atlas: Global estimates for the prevalence of diabetes for 2015 and 2040.
Diabetes Res. Clin. Pract., 128, 40–50, 2017.
4. Chen, C., Zhao, X.-L., Li, Z.-H., Zhu, Z.-G., Qian, S.-H., Flewitt, A.J., Current
and Emerging Technology for Continuous Glucose Monitoring. Sensors
(Basel), 17, 1, 182, Jan 2017.
5. Reddy, N., Verma, N., Dungan, K., Monitoring Technologies- Continuous
Glucose Monitoring, Mobile Technology, Biomarkers of Glycemic Control,
in: Endotext [Internet], 2020.
6. Gonzales, W.V., Mobashsher, A.T., Abbosh, A., The Progress of Glucose
Monitoring—A Review of Invasive to Minimally and Non-Invasive
Techniques, Devices and Sensors. Sensors (Basel), 19, 4, 800, Feb 2019.
7. Cappon, G., Vettoretti, M., Sparacino, G., Facchinetti, A., Continuous Glucose
Monitoring Sensors for Diabetes Management: A Review of Technologies
and Applications. Diabetes Metab. J., 43, 4, 383–397, Aug 2019.
8. Lai, J.-L., Wu, H.-N., Chang, H.-H., Chen, R.-J., Design a Portable Bio-Sensing System for Glucose Measurement. International Conference on Complex, Intelligent and Software Intensive Systems, CISIS 2011, Seoul, Korea.
9. Buda, R.A. and Addi, M.M., A Portable Non-Invasive Blood Glucose Monitoring Device. In: Global Report on Diabetes, World Health Organization, France, 2016 (NLM classification: WK 810).
10. Nichols, S.P., Koh, A., Storm, W.L., Shin, J.H., Schoenfisch, M.H.,
Biocompatible Materials for Continuous Glucose Monitoring Devices.
Chem. Rev., 113, 4, 2528–2549, 2013.
12
Wearable Health Monitoring
Systems Using IoMT
Jaya Rubi* and A. Josephin Arockia Dhivya
Department of Biomedical Engineering, VISTAS, Pallavaram, Chennai, India
Abstract
Our world today is dominated by the internet and technology. Digital technologies have come into effect in various sectors of our daily lives and have successfully influenced and reshaped our day-to-day activities. The Internet of Medical Things (IoMT) is one such discipline attracting great interest, as it combines various medical devices and allows them to converse among themselves over a network, forming a connected system of advanced smart devices. To elucidate some of the salient features, interests, and issues related to optimized wearable devices, this chapter first elaborates briefly on the use of IoMT in developing wearable health monitoring systems. As a backdrop to this discussion, it also briefly reflects on the various sensors equipped to capture and transmit healthcare data. The chapter then presents a brief investigation of the drawbacks faced in customizing IoMT devices, offers a perspective on future advancements that would facilitate healthcare delivery and improve patient outcomes, and concludes by considering a few solutions with potential impact on the current challenges faced by healthcare systems.
Keywords: IoMT, healthcare, patient care, wearable devices, monitoring systems
12.1 Introduction
Health is one of the most important and primary needs of every individual. Good health obviously leads to a better and more successful life. One of
*Corresponding author: rubijames1604@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (225–246) © 2022 Scrivener Publishing LLC
225
the most important trends in our society is improving healthcare facilities; as they improve, people's lives become better. According to a recent study, many healthcare workers are trying to improve diagnostic and therapeutic processes using Internet of Things (IoT) technologies. This approach gradually reduces errors and lowers costs, improving the efficiency and effectiveness of healthcare processes. IoT has turned out to be a crucial field and has led to rapid revolution in the healthcare sector. Recent technological advancements have changed individuals' perception of the use of IoT in healthcare; however, these changes have also created uncertainties and raised several questions over data security. Wearable healthcare devices (WHDs) are an emerging technology that enables continuous monitoring of the vital signs of the human body while we perform our daily activities. WHDs have become so compact and user friendly that they have become a part of human life. WHDs are an important part of personal healthcare systems, giving the user a sense of confidence and self-reliance. The main aim of this technology is to raise people's interest in self-care and health status and to provide them a sense of satisfaction and self-empowerment. Wearable healthcare devices are designed to be more reliable and cost efficient, too, as they have the potential to provide clinicians with more valuable data, leading to earlier diagnostics and guidance of treatment.

One of the most important reasons for the success of wearable healthcare devices is the miniaturization of electronic equipment, which enables the design of adaptable wearables contributing to major changes in the worldwide approach.
12.2 IoMT in Developing Wearable Health
Surveillance System
The main benefit of using Internet of Medical Things (IoMT) technology is that it allows patients to carry out their regular activities even while under continuous health surveillance. The conventional health monitoring systems already available have many drawbacks and cause discomfort for the patient owing to the number and size of the modules attached to the body; one of the most important drawbacks is the frequent charging of the modules or replacement of batteries. Greater use of IoMT technology also gives the patient the benefit of a more economical hospital bill. IoMT has evolved considerably, integrating a greater number of sensors and becoming more adaptable for users. This evolution has resolved some of the major concerns of the healthcare industry through the design of modest, low-power miniature sensors that are compact and communication friendly.

IoMT technology predominantly comprises a compact, mobile patient monitoring unit built from electronic circuits and sensors. This setup is capable of acquiring all the vital parameters from the patient's home and sending them to the real-time monitoring system present in the hospital [1].

Just as the IoMT will bring automation to innumerable daily activities and assignments, these portable devices will not only keep track of our physical well-being but will also blend seamlessly into our lives, providing a strong connection to the IoMT technology [2]. The main motive behind this chapter is to bring together the imperative features of customized wearable devices and to examine in detail the drawbacks, and the solutions, involved in optimizing these devices.

Various surveys conducted by different institutions predict that wearable technology will very soon become dominant globally. It is also reported that 6 out of 10 mobile phone users are confident that wearables will have enormous uses beyond health, fitness, and well-being [2].
Now, let us discuss some trending wearable technologies in detail.
12.2.1 A Wearable Health Monitoring System with Multi-Parameters
This wearable health monitoring system is a belt-type data acquisition and communication module worn around the chest. It provides continuous monitoring of the user's ECG, respiratory function, and body temperature. The device incorporates advanced sensors and algorithms to collect the data and send it to a mobile device. The sensors are of extremely high precision and have given accurate results for heart rate, the QRS segment, local body temperature, and respiration-related activity. The device gives a comprehensive analysis of the physiological parameters with great stability, and the system can also store medical records effectively. The mobile device communicates with the customer management center via a WiFi network [3]. Data mining and pattern recognition can be applied to obtain advanced results and prevent chronic disabilities.
12.2.2 Wearable Input Device for Smart Glasses Based on a Wristband-Type Motion-Aware Touch Panel
A novel device has been designed: wearable smart glasses paired with a wristband-type, motion-aware touch panel. Using this device, the user can easily manipulate the smart glasses' AR system to select and move content by touching, scrolling, or dragging 3D objects with a rotating wrist. The device also has an advanced touch panel that allows the user to control and operate the system effectively, and tasks can be carried out via a head-mounted display, which makes them easier. The designed device is simple hardware that can be embedded into any existing smart watch or smart phone already available on the market [4].
12.2.3 Smart Belt: A Wearable Device for Managing Abdominal Obesity
This novel device has been designed for people with an unhealthy lifestyle, such as lack of exercise and overeating. Such a lifestyle leads to deposition of fat in the abdomen, and the resulting abdominal obesity can cause problems like high blood pressure (BP) and heart failure. Maintaining proper posture can help reduce abdominal obesity, and this tool was designed to enhance users' ability to measure, record, and correct their postures by themselves. The wearable device, known as the Smart Belt, uses data processing technology to monitor and analyze data acquired from the living body; several sensors, such as a force sensor and an acceleration sensor, were combined to detect incorrect postures [5]. The system can be further enhanced by designing an application that provides personalized feedback to the user instantly.
12.2.4 Smart Bracelets: Automating the Personal Safety Using Wearable Smart Jewelry
As we know that there are many devices already existing in the market that can save a person from becoming a victim of physical assaults.
Most panic button-type devices require the intervention of the victim in
order to contact any emergency services. To resolve this issue, a wearable device was designed in the form of a bangle or a bracelet to instinctively recognize and distinguish violations or incursions. The use of
certain sensors and the machine learning techniques makes this device a
smart device. The system takes measures to find the alternative services
Wearable Health Monitoring Systems Using IoMT
229
and gives a sense of protection against the assailant by taking a series of
protective actions. The device is designed in such a way that it concentrates particularly on differentiating the regular movements as well as
the movements related to the incursions and assaults, which eventually
automates the process of shouting for help throughout the assault. This
wearable jewelry bracelet accounts to great stability and results and it is
highly favorable to potential victims of physical assault as well as elderly
persons [6].
There are certain other sensors also known as sensor patches that can be
bonded to the skin for either fitness tracking or for touch-sensitive applications. There is also a technology called the electronic tattoo, which is very user friendly, adjustable, and malleable. Using these tattoos, information can be transmitted wirelessly by placing an electronic circuit just beneath the surface of the skin.
12.3 Vital Parameters That Can Be Monitored Using Wearable Devices
Selection of vital parameters plays a vital role in designing a wearable
device. Sensors are used to capture, analyze, and record these vital
parameters. The selection of sensor plays an important role as it must
minimize the power consumption and maximize the gain output. Usage
of low power sensing components is useful but we also have to note that
these sensors alone are not enough to store and transmit the data. Since the
data has to be stored and transmitted in the IoMT platform, we must
use certain sensors which can transmit the data in real time [7]. It is also
noted that turning off these sensors can reduce the sensing abilities in
certain wearable devices. Using energy-efficient sensors can also increase
the cost of the sensors. Several projects have proposed schemes for sensor selection that reduce power utilization in human activity detection. Such a scheme is very useful as it fuses the information acquired from the classifiers and thus helps the user operate the individual sensors easily. The sensor modules are adopted with respect to the accuracy they provide. Any method adopted for sensor selection should satisfy the most important criterion: conservation of energy.
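The energy-aware sensor-selection idea can be sketched as a simple greedy choice under a power budget; the sensor names and the accuracy/power figures below are purely hypothetical, not taken from any cited project.

```python
# Illustrative sketch (hypothetical accuracy/power figures): pick the subset
# of sensors that fits a power budget, preferring accuracy gain per milliwatt.

def select_sensors(sensors, power_budget_mw):
    """sensors: list of (name, accuracy_gain, power_mw). Greedy by gain/power."""
    chosen, used = [], 0.0
    for name, gain, power in sorted(sensors, key=lambda s: s[1] / s[2], reverse=True):
        if used + power <= power_budget_mw:
            chosen.append(name)
            used += power
    return chosen

catalog = [("accelerometer", 0.30, 1.0),
           ("gyroscope",     0.20, 2.5),
           ("barometer",     0.05, 0.4)]
print(select_sensors(catalog, power_budget_mw=1.5))  # ['accelerometer', 'barometer']
```

A real scheme would also fuse classifier feedback to switch sensors on and off at run time; the greedy step above only captures the static energy/accuracy trade-off.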
As we know, various types of sensors are available which are capable of acquiring vital parameters such as pulse rate, heart rate, respiration rate (RR), and body temperature. Sensors play a major role in converting
the vital physiological signals into electrical signals that can be recorded,
230
The Internet of Medical Things (IoMT)
transmitted, or stored. They are capable of acquiring the vital parameters and also processing the acquired data; this data is further analyzed by uploading it over a network device [8]. Now, let us discuss certain parameters which play a major role in designing wearable IoMT devices. These include the following.
12.3.1 Electrocardiogram
The electrocardiogram (ECG) is the most widely used biosignal-based technique, serving as an indicative tool in the healthcare world. These biosignals
provide information about the cardiac activity and cycle of human body.
The ECG waveform is characterized by certain peaks named as P, Q, R, S,
T, and U.
Each peak in the ECG denotes a variation in the electrical potential of the heart, which in turn corresponds to changes in the heart's muscle activity.
The peak which is different from all the other peaks is the R peak which
plays a vital role in detecting most of the heart diseases. This peak represents the depolarization of the ventricles, and thus, it shows high differential potential. The R-R interval is mostly preferred to measure the heart
cycles and to analyze the cardiac rhythm.
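As a minimal illustration of the R-R interval computation described above, assuming R-peak timestamps have already been extracted by a peak detector (which is outside this sketch):

```python
# Sketch: derive heart rate from detected R-peak times. Peak detection itself
# would come from an ECG processing pipeline; only the R-R arithmetic is shown.

def rr_intervals(r_peak_times):
    """R-R intervals in seconds from a sorted list of R-peak timestamps."""
    return [t2 - t1 for t1, t2 in zip(r_peak_times, r_peak_times[1:])]

def heart_rate_bpm(r_peak_times):
    """Mean heart rate in beats per minute over the recording."""
    rr = rr_intervals(r_peak_times)
    mean_rr = sum(rr) / len(rr)
    return 60.0 / mean_rr

# Example: R peaks 0.8 s apart correspond to 75 bpm.
peaks = [0.0, 0.8, 1.6, 2.4, 3.2]
print(heart_rate_bpm(peaks))  # 75.0
```

Rhythm analysis would additionally look at the variability of the individual R-R intervals, not just their mean.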
Several complications like ischemia, coronary blockage, and even myocardial infarctions can be analyzed and predicted by measuring the QRS
waveform. The analysis of the patterns of ECG waveform plays a crucial
role in diagnosing several cardiovascular diseases such as congestive
heart failure, heart attack, bradycardia, and cardiac dysrhythmia.
Thus, the analysis of ECG in real time can play an important role in
continuous monitoring of cardiac patients. For this purpose, it has to be
integrated to an optimized wearable device which uses IoMT to transmit
the data safely over a network. The advantage of using wearable devices
for medical purposes is one of the major steps toward detection of cardiac
diseases in their early stages. It is also important for the early diagnosis of atrial fibrillation.
The initial step toward improvement includes detection of defects
related to atrial fibrillation which is a widespread concern all over the
world. Continuous ECG monitoring is very important as it allows early detection of such diseases. For signal acquisition, wet electrodes such as silver/silver chloride (Ag/AgCl) are the most widely used, transducing the ionic current from the heart into an electron current
in metallic wires. Its manufacturing and design characteristics allow the
cell to work with very low electric potential. The electrodes used are very
compact and reliable [9]. The only drawback might be the initiation of skin
irritation due to its adhesive properties. When the contact is not made
properly, the probability of the gel getting dried out increases.
Holter monitors are the devices usually used for long acquisitions. Their main disadvantage is that they interrupt the daily routine of patients, which makes them really uncomfortable to use. Several developments were made to overcome this issue. The use of fabric-embedded
electronics and dry electrodes was proposed and implemented using different materials. The major advantage is that this type of material does not
cause any skin irritation. There is also a limitation with this type of monitoring: artifacts appear because of body movement, and this makes the usage
of such electrodes clinically unfeasible [10].
Several new technologies were made to reduce the artifacts that are
produced by body movement and skin irritation. Development of dry and
stretchable sensor was useful as it could easily attach with the human skin
but it had several drawbacks too. The best way to overcome the adhesive problem is to have dry electrodes which are non-sticky and which can easily be used in wearable technologies.
The other types of ECG sensors include non-contact capacitive electrodes which are very useful in acquiring the ECG data without direct skin
contact but are more sensitive to motion artifacts.
12.3.2 Heart Rate
Heart rate (HR) is one of the vital signs that can be easily extracted using either an ECG or photoplethysmography (PPG) equipment. It is
very important during any fitness activity to know whether the heart’s reaction to the exercise or activity is appropriate or not. Recently, the analysis of
HR is gaining a lot of attention in the field of biomedical engineering. It is
one of the simplest indicators of the condition of heart and continuous and
real-time monitoring of this parameter will be really useful in designing
wearable IoMT devices. There are several other methods to measure heart
rate, one of the most common examples being ballistocardiogram (BCG),
which uses inertial sensors. The measurement taken from a BCG provides more information than heart rate alone, such as the strength, amplitude,
and regularity of pulse. These methods do have several drawbacks like difficulty of usage and feasibility issues which will be discussed in detail in the
next section [11].
The pulse signal must not be considered the same as the heart rate, although HR can be measured using pulse oximetry principles, which are also used to measure the oxygen saturation of blood.
12.3.3 Blood Pressure
As we know, BP is considered the most important cardiopulmonary
parameter which indicates the pressure imposed by the blood against the
arterial wall [12]. Measurement of BP indirectly gives us the information
about blood flow during contraction and relaxation. It can also indicate
cellular oxygen delivery. It is also influenced by several human physiological characteristics, such as the following:
1. Cardiac output
2. Peripheral vascular resistance
3. Blood volume and viscosity
4. Vessel wall elasticity
BP monitoring allows getting BP readings several times a day, and this
process allows monitoring of high BP, also known as hypertension. Hypertension is one of the greatest threats leading to several cardiovascular diseases. Traditionally, BP measurement takes place using an inflatable pressure cuff with a stethoscope placed on the patient's arm. In some cases, the same equipment is used with a fully automated inflatable cuff that measures BP from the relation between the magnitudes of arterial volume pulsations.
There are several limitations related to continuous monitoring using a
cuff. This can result in undesirable side effects which might include sleep
disruptions, skin irritations, and increased stress levels which would gradually increase the HR of the individual. Several new technologies were
introduced to solve this problem. Ambulatory BP monitoring is one of the
best examples, which was developed to estimate BP based on the pulse
wave transit time between the pulse wave obtained by PPG and the R peak
of ECG. Both the signals were measured from the chest. Yu-Pin Hsu [13]
also developed a new technique for BP measurement based on the measurement of pulse wave velocity by using a series of microelectromechanical electrodes placed on the wrist and neck region of the human body.
One of the recent advancements is the development of a watch-type prototype which uses a pressure sensor to measure the activity of the radial artery and can provide fairly accurate BP measurements in real time [14].
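A hedged sketch of the transit-time idea described above: models in the literature often relate BP to pulse transit time (PTT) through an inverse or logarithmic form. The coefficients below are hypothetical placeholders; any real use requires per-subject calibration against a cuff.

```python
import math

# Sketch of PTT-based BP estimation. The logarithmic form is a commonly
# cited model shape; coefficients a and b here are hypothetical and would
# need per-subject calibration in practice.

def systolic_bp_from_ptt(ptt_s, a=-60.0, b=25.0):
    """Estimate systolic BP (mmHg) from pulse transit time (seconds)."""
    return a * math.log(ptt_s) + b

# A shorter transit time (stiffer arteries) maps to a higher estimated BP.
print(systolic_bp_from_ptt(0.20))  # higher estimate
print(systolic_bp_from_ptt(0.30))  # lower estimate
```

The PPG-to-R-peak delay mentioned in the text is one practical way to obtain the `ptt_s` input from chest-worn sensors.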
12.3.4 Respiration Rate
RR is a basic physiological parameter that deals with the respiratory system of the human body. The measurement of RR is an important piece of health
information, and it can give us an early warning about the respiratory condition of a person. In many cases, the respiratory rate is
often guessed or not recorded properly with clinical equipment. It is very important to know that the analysis of oxygen saturation provides a much better reflection of a patient's respiratory function.
This vital parameter is usually calculated based on two important parameters that are inspiration and expiration. This process is done by acquiring
the respiratory waveform that represents the chest volume variation during
inhalation and exhalation. Another important parameter which plays
a vital role is the measurement of thoracic expansion. The movement of
muscles also is taken into consideration to calculate the respiratory defects.
These analyses and calculations are of great use in achieving better respiratory performance, especially in athletes.
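The inspiration/expiration counting described above can be sketched by counting local maxima in a sampled chest-volume waveform; real devices add filtering and artifact rejection, which are omitted here.

```python
import math

# Sketch: estimate respiration rate by counting inhalation peaks (local
# maxima) in a chest-volume signal. Filtering and motion-artifact rejection
# are deliberately omitted; only the counting idea is shown.

def respiration_rate(signal, sample_rate_hz):
    """Breaths per minute from a sampled chest-volume signal."""
    peaks = sum(1 for i in range(1, len(signal) - 1)
                if signal[i - 1] < signal[i] >= signal[i + 1])
    duration_min = len(signal) / sample_rate_hz / 60.0
    return peaks / duration_min

# Synthetic 0.25 Hz breathing (15 breaths/min) sampled at 10 Hz for 60 s.
sig = [math.sin(2 * math.pi * 0.25 * t / 10) for t in range(600)]
print(round(respiration_rate(sig, 10)))  # 15
```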
Nowadays, in order to obtain the respiratory function, there are three
primary methods being used often:
1. Elastomeric plethysmography (EP)
2. Impedance plethysmography (IP)
3. Respiratory inductive plethysmography (RIP)
EP is a technique that converts the current variation of the piezoelectric sensors into voltage by using an elastic belt. Guo et al. [15] developed a
prototype garment which was capable of measuring both abdominal and
chest volume changes with great accuracy and precision. This technique
basically used a piezoresistive fabric sensor.
IP uses impedance changes of the outer body surface which is caused
due to the expansion and contraction of the body during breathing. One of
the best examples of this technology is design of uniform vest which could
be used by the soldiers.
The above-mentioned methods are most widely used. Other than these
technologies, several other methods and devices are used to obtain the
respiratory waveforms. Some of them are listed below:
1.
2.
3.
4.
Accelerometers
Waveform derived from ECG
Waveform taken from pulse oximetry
Optical fibers
Many methods are available and used commercially but are not suitable for wearable healthcare devices (WHDs). The usage of infrared cameras or acoustic methods, for example, has several drawbacks, as it might increase the complexity of the device. Several research works are going on to obtain the chest volume variations, which would broaden the chances of detecting secondary respiratory diseases at an early stage.
12.3.5 Blood Oxygen Saturation
Blood oxygen saturation (SpO2) is an extremely important vital parameter.
It can be easily measured using a PPG technology or using pulse oximetry
principles. The PPG method helps us to acquire variations in blood vessel
waveforms. SpO2 is measured using two wavelengths, 660 and 905 nm. Using a PPG, it is possible to get an estimation of SpO2.
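A sketch of the classic "ratio of ratios" computation used in pulse oximetry with the two wavelengths mentioned above. The linear calibration SpO2 ≈ 110 − 25R is a widely quoted empirical approximation, not a device-grade calibration; commercial oximeters use device-specific curves.

```python
# Sketch of the "ratio of ratios" pulse-oximetry computation for red
# (~660 nm) and infrared (~905 nm) PPG channels. AC is the pulsatile
# amplitude, DC the baseline level at each wavelength.

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 (%) via the empirical linear calibration 110 - 25R."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r

print(spo2_estimate(ac_red=0.012, dc_red=1.0, ac_ir=0.025, dc_ir=1.0))  # 98.0
```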
The hemoglobin absorbance spectrum plays an important role in determining the changes in oxygen saturation. This estimate further leads to early detection of conditions such as hypoxia (a lower-than-normal percentage of oxygen), which might lead to insufficient oxygen supply to the human body. There are several problems related to the measurement of SpO2; one of them arises when the patient is severely anemic.
The inclusion of a SpO2 sensor in any wearable device can be very useful. The evaluation of the aerobic efficiency of a person who performs routine exercise becomes very easy, and the continuous analysis of SpO2 levels can also help maximize athlete performance. The system is also very useful in military and space applications, as changes in gravity levels can directly affect the oxygen level in blood.
Several non-invasive technologies are available in the market that can
measure SpO2 but PPG is most widely used in the medical applications of
wearable devices. Of all body parts, the finger is the one most commonly used to acquire SpO2 levels in clinical conditions. Most advanced PPG sensors use ring electrodes
instead of clipping electrodes. These sensors are easy to adapt and are more
comfortable to the patients. The best way in which these sensors can be
used is by integrating it with mobile phones which can help in acquiring
continuous instantaneous results.
Much recent research work is based on the development of ear-lobe sensors in the form of a very small chip; these sensors are capable of measuring SpO2. PPG sensors placed on the forehead are useful in measuring brain oxygenation. Similarly, PPG sensors for use on the surface of the chest were developed, demonstrating the viability of measuring oxygen in the human body.
Giovangrandi et al. [33] designed a prototype which could adjust the
parameters such that the depth of tissue measurement can be analyzed; this could gradually increase the functionality of the equipment
for clinical applications.
12.3.6 Blood Glucose
Blood glucose (BG) measurement is a vital parameter which is carried
out all over the world for diabetic subjects. Diabetes causes several other physiological disorders which can pose a serious threat to human
beings. Some of the most common diseases which develop as a result of
being a diabetic patient include the following:
1. Cardiovascular diseases
2. Cerebral vascular disturbance
3. Retinopathy
4. Nephropathy
In order to prevent these complications, diabetic patients try to control
BG concentration by continuously monitoring the BG level and also by
following a very strict diet. Depending upon the BG levels, some patients
also tend to inject insulin whenever required in order to maintain the standard values.
Finger pricking is the most commonly used method to evaluate blood sugar concentration. It is done by collecting a blood sample using a pricking device called a lancet. A lot of research activity is going on in order to make this process completely noninvasive. Several
devices were designed and are already in use in the market.
A Continuous Glucose Monitoring (CGM) device is capable of measuring
the BG levels using an adhesive patch which has a needle along with it. This
device sends the data required to measure the BG into a wearable insulin
pump that can release insulin into the human body. Adding to the advancement in BG level measurement, another device was designed which had an
adhesive patch with a needle to measure the blood sugar, and thus, the data
acquired can be sent wirelessly to a mobile device immediately. This device
was quite successful, as it provides continuous monitoring of the BG level through its mobile application [15, 25]. The needle used in this device can
be inserted just inside the skin and the measurements can be obtained.
Several non-invasive BG measurement techniques have been developed
to increase the efficacy as well as to improve the self-monitoring abilities
of the diabetic patients. It is also important to know that GlucoWatch was
one of the first commercially available devices able to measure the glucose level every 20 min through the skin via a process known as reverse iontophoresis. It had several drawbacks, one of the major ones being skin irritation, and it was eventually withdrawn from medical use. Non-invasive techniques have always been attractive because of their ease
of use and less pain. Some of the noninvasive techniques used to measure
BG levels are as follows:
1. Bioimpedance spectroscopy
2. Electromagnetic sensing
3. Fluorescence technology
4. Measurement of BG through the eye
5. Ultrasound technology
6. Near-infrared spectroscopy
These non-invasive techniques were developed but each and every technique had one or the other drawback. The bioimpedance spectroscopy was
a good technique for monitoring the BG levels, but it had poor reliability. The components required for data acquisition were costly and inefficient. Similarly, the use of the eye to measure the BG level also had several barriers. Interference from biological compounds present inside the
body was also a matter of concern for the above-mentioned techniques.
The high signal strength and sensitivity required by the ultrasound technology were
also a matter of concern. The penetration power and temperature changes
in the surrounding tissues also affected the measurement of BG levels.
The devices which used the above-mentioned technologies also had to
include several other sensors such as temperature sensors, skin perspiration,
and actimetry sensors in order to predict and monitor the energy expenditure
in the human body. Based on these data, the amount of insulin to be administered to the patient would be estimated. Several barriers and challenges are still faced by researchers, driving constant improvement and encouraging young researchers and research groups to develop a reliable, convenient, and stable wearable device with continuous monitoring capability. Estimation of BG within a provided time window is
also important. Several new algorithms and software technologies are being
designed to determine the BG level within a particular time window.
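One minimal sketch of time-window BG estimation is linear extrapolation of the recent CGM trend; real prediction algorithms are far richer, and this only illustrates the time-window idea with hypothetical readings.

```python
# Sketch: predict a BG value a few samples ahead by extrapolating the most
# recent trend. Readings are assumed equally spaced (e.g., one per 5 min).

def predict_bg(readings, horizon_steps=1):
    """readings: equally spaced BG values (mg/dL); extrapolate the last slope."""
    slope = readings[-1] - readings[-2]
    return readings[-1] + slope * horizon_steps

history = [110, 114, 118, 122]  # rising ~4 mg/dL per sample (hypothetical)
print(predict_bg(history, horizon_steps=3))  # 134
```

A device would pair such a prediction with alert thresholds so the patient can act before the window closes.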
12.3.7 Skin Perspiration
Skin perspiration is one of the most important physiological signs, which is
very useful in analysis of human reaction during different situations. Even
though it is not considered as a clinical parameter, it plays a major role in
learning about the stimulation produced by the nervous system. As we know, life situations can cause several neurological reactions from the
autonomic nervous system which stimulates an increase of skin sweating.
Because of this sweating, or moisture, there can be certain changes in the electrical conductance of the skin. This phenomenon can be measured as the galvanic skin response (GSR), which reflects the amount of sweat produced by the sweat glands. When certain other sensors are combined with GSR, it becomes easy for the physician to learn about the mental state of the patient. For example, when an HR sensor is combined with skin perspiration, the physician can get an idea of the patient's mental stress instantaneously [16].
Measurement of skin perspiration plays a vital role in designing several pieces of fitness equipment which are wearable and user friendly. In recent times,
several fitness bands have been developed which can give information
about the user’s heart rate, perspiration, and other parameters. Sometimes,
without knowing the physical activity context, the data gets misinterpreted, leading to miscommunication between the device and the user. Some of
the important ions and molecules which play a major role in determining the skin perspiration include ammonium, calcium, and sodium. These
molecules give a clear indication about the electrolyte imbalance caused
in our body due to different situations. Some of the situations when not
treated on time can further lead to severe complications like cystic fibrosis,
osteoporosis, and mental and physical stress. One of the best examples for
this analysis is the detection of psychological and physiological stress that
military personnel undergo during intense training. Evaluation of such stress can help us attain important information about different individuals, and their reactions to the training can be recorded [17].
The sensors which are used in skin sweat monitoring can be divided into
two main categories.
1. Epidermal sensors
2. Sensors based on fabric or plastic material
The epidermal-based sensors offer compliance between the surface of the electrode and the biofluid with which it is in contact. Similarly, elastomeric stamps can also be used to print the electrodes directly on the
epidermal layer of the human skin for continuous monitoring of skin
perspiration.
The second type of sensor which is based on fabric or flexible plastic is
most commonly used as it is in constant contact with the large surface area
of the skin. These sensors can be embedded into a fabric which can be easily used to obtain the measurements of pH and also the ion concentrations
of sodium, potassium, etc.
A new sensor compatible with a wide variety of fabrics was introduced which was able to measure GSR. The device had good wearability and was user friendly.
The device designed was small and flexible, because of which it was able
to maintain a stable contact with the surface of the skin. The surface of
the sensor was made with a dry polymer foam electrode. Another device
which is under development is based on the analysis of sweat, termed microfluidic-based sweat analysis. The important components of this device are the Bluetooth module and the microcontroller, which play a vital role in continuous skin perspiration monitoring [18].
12.3.8 Capnography
Capnography is a great way to assess ventilation. It is a non-invasive way to acquire data about the partial pressure of carbon dioxide from
the airway. It provides many physiological details on ventilation, metabolism, perfusion analysis, etc. These details play an important role in determining the necessary details about air way management in several devices.
The output of the device is represented as the maximum partial pressure of
carbon dioxide which is obtained at the end of exhalation. The results are
mostly obtained as numeric values, sometimes, it can also be represented
in graphical format. A capnograph helps to represent the expired carbon
dioxide as a graphic waveform display. In most cases, pulse oximetry can be used to get the required information about arterial oxygenation, but when it comes to the assessment of human ventilation, that method has several drawbacks. Capnography is one of the most important non-invasive techniques, and it is also a cost-effective method to analyze the carbon dioxide levels present in the lungs. Capnography is not just used to evaluate
the carbon dioxide levels and the RR; it was also widely used for anesthesia
care in operating theatres. The physicians and the anesthesiologists could
easily evaluate the consciousness levels of the patient. This was especially
useful during the sedation process in the operation theatres.
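The end-of-exhalation maximum described above (the EtCO2 value) can be sketched as follows, assuming breaths have already been segmented; real capnographs detect breath boundaries from the waveform itself.

```python
# Sketch: extract end-tidal CO2 (EtCO2) as the maximum partial pressure
# reached during each exhalation, per the definition given in the text.
# breath_segments holds pre-segmented CO2 samples (mmHg) for each breath.

def etco2_per_breath(breath_segments):
    return [max(segment) for segment in breath_segments]

breaths = [[2, 10, 30, 38, 5],   # one breath: CO2 rises, plateaus, falls
           [1, 12, 33, 40, 4]]
print(etco2_per_breath(breaths))  # [38, 40]
```

A monitor would report these per-breath maxima numerically and, as the text notes, also count the segments per minute to obtain the RR.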
Capnography use in medical applications is still very limited. Several studies show a strong correlation between mortality and the underutilization of capnography in ICUs. Several scientists strongly
believe that capnography must be considered as an essential monitor to
show the integrity of airway and ventilation. Adapting this process would
help us predict the health status of the patient continuously. Capnography is also used specifically for a disorder called sleep apnea. Sleep apnea
monitors usually have a capnography device integrated into them in order to continuously measure the RR of the patient. Usually, this disorder is continuously monitored using polygraphy. Several studies have shown that capnography can make an early diagnosis of sleep apnea syndrome on its own
without any supportive sensors [19].
Wearable and portable devices are already available for commercial
use which can be used at home for continuous monitoring of sleep apnea
syndrome. Capnography is becoming a widespread parameter on many
portable devices, and in the near future, it is believed that many portable and wearable devices will offer this technology as a reliable and cost-effective product [20].
12.3.9 Body Temperature
Body temperature is one of the predominant parameters which would give
a brief analysis of a person’s well-being. It is defined as the consequence
of the balance between the heat produced and heat lost from the body. It
is also a known fact that certain proteins denature or lose their function
when the body temperature increases. The body temperature can be widely
classified into two types:
1. Core temperature
2. Skin temperature
The body’s core temperature is basically regulated by thermoregulation
mechanisms, whereas the skin temperature keeps varying depending upon
external environmental changes. Skin temperature is also affected by blood
circulation and heart rate. Some of the external factors which play a vital
role in regulating the body temperature include the following:
1. Humidity
2. Atmospheric temperature
3. Air circulation
Different wearable devices are already present in the market which can
give precise skin temperature measurements. Wearable devices with adhesive property are useful in continuous monitoring of body temperature.
The recent advancement in measurement of skin temperature includes
non-contact technology which uses radio frequency identification system.
This temperature sensor has many advantages as it is wireless and reusable,
and it can easily acquire epidermal temperature. It is a batteryless radio
frequency thermometer that can be used to measure the core temperature.
Still, measuring the core temperature using noninvasive methods remains a challenge in the medical field. Several algorithms have been proposed to identify the core temperature of the human body [21]. The changes that occur in the human body due to certain external factors are the actual reason for this challenge.
The telemetric pill is a new advancement in the measurement of body temperature. Even though it allows better usability, it has several complications,
too. One of the greatest complications is that the temperature measurement is easily influenced by the temperature of the food and the fluids
ingested by the subject.
12.4 Challenges Faced in Customizing Wearable Devices
The research activities in the field of wearable devices are improving every
day. Some significant challenges play a major role in the domain of IoMT.
If these challenges are met, then the design and commercial use of wearable devices would improve drastically, and IoMT could provide more reliable, user-friendly, and better healthcare services. Finding solutions
to these challenges would play a major role in bridging the gap between
the doctors and patients. IoMT would also play a major role in helping the
health professionals to work with more ease and flexibility.
12.4.1 Data Privacy
Data privacy is one of the major issues faced by hospital network and IoMT
devices. Cybercriminals targeting vulnerabilities in these IoMT devices can easily obtain unauthorized access to sensitive healthcare information.
It is also possible that they can access or attack other connected devices
that can cause significant life-threatening harm to the patient. Scientists
are working on various domains such as lightweight security solutions,
blockchain for healthcare security, and privacy preserving technologies
which would provide authentication to licensed users.
12.4.2 Data Exchange
Data exchanges play an important role in development and deployment
of healthcare devices. A lot of research is going on to achieve a sustainable
solution for data exchange. One of the advanced technologies employed
for this purpose is blockchain technology. It has enormous potential in
governing the complex requirements happening in the background of
several sensors and networks. There are several more concerns related to
data exchange which need to be solved before initiating the data exchange.
Some of the barriers include the format in which data is available, interoperability, APIs, etc. New data streams need to be developed to smooth the flow of data for IoMT deployments.
12.4.3 Availability of Resources
Resource availability is very closely related to development of IoMT in
remote geographic regions. If we do not have an overview of what raw materials are available, then setting up a network to make connections
between various components becomes very difficult. Several sensors, storage devices, miniaturized microcontrollers, etc., are required to build up a
customized device. It is very important to have a data acquisition and data
transmission system also [21].
Since we are dealing with customized wearable devices, it is also important for us to send the recorded data to a physician for further assistance.
This procedure requires a stable internet connection. So, all these resources
are together capable of creating a smart wearable device which can be commercially used. Depending upon the availability and connections between
the products, identity validation is imposed based on which the device can
restrict itself from establishing multiple connections with a given server.
Similarly, the magnitude of any security breach has direct connection with
the protection of these resources. Lightweight encryption schemes are used to authenticate the connections between various resources [22].
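One possible shape for such lightweight authentication is a challenge-response exchange over a pre-shared key, sketched here with HMAC-SHA256 from the Python standard library; the key value is a hypothetical placeholder, and key provisioning and replay protection are out of scope.

```python
import hashlib
import hmac
import os

# Sketch of lightweight challenge-response authentication between an IoMT
# device and a server, using a pre-shared key and HMAC-SHA256.

PRE_SHARED_KEY = b"device-secret"  # hypothetical; provisioned at manufacture

def device_respond(challenge: bytes) -> bytes:
    """Device proves key possession by MACing the server's challenge."""
    return hmac.new(PRE_SHARED_KEY, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response: bytes) -> bool:
    """Server recomputes the MAC and compares in constant time."""
    expected = hmac.new(PRE_SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)          # fresh random challenge per connection
assert server_verify(challenge, device_respond(challenge))
```

Using `hmac.compare_digest` rather than `==` avoids leaking information through timing, which matters on constrained devices.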
12.4.4 Storage Capacity
Designing a wearable device which is also customized is not an easy task.
One of the most important challenges involved in designing an IoMT device
is storage capacity. The device must have an inbuilt mechanism to store the recorded data either in the device itself or in the cloud. The device must have certain features to compute, pre-process, and present data in a certain format. Later, the data is uploaded and sent for long-term processing.
A lot of companies are working on data storage and data management
standards and tools that treat the data with a high level of security and consistency. These data are processed outside the datacenter in public and private clouds. IoMT devices usually have low processing capabilities along with limited RAM and storage capacity. It is a remarkable challenge for the manufacturers to design and develop various components
with a good storage capacity as well as comprehensive security measures.
The design must be kept simple, with all the necessary features in it while
also leaving enough space for security software. Certain companies are also
designing customized hardware which can process the data in real time as
it is fed as input from the external sources [23].
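One way to picture on-device storage under tight memory limits is a fixed-size ring buffer that keeps only the most recent readings and flushes a batch for upload when enough accumulate. This is a minimal sketch under assumed capacities, not a design taken from the text:

```python
from collections import deque

# Minimal sketch (assumed design) of how a memory-constrained IoMT device
# might buffer readings locally: a fixed-size ring buffer keeps only the
# most recent samples, and a batch is handed off for upload when it fills.

class DeviceStore:
    def __init__(self, capacity, batch_size):
        self.buffer = deque(maxlen=capacity)  # oldest samples drop off automatically
        self.batch_size = batch_size

    def record(self, sample):
        """Store one reading; return a batch to upload when enough accumulate."""
        self.buffer.append(sample)
        if len(self.buffer) >= self.batch_size:
            batch = list(self.buffer)
            self.buffer.clear()
            return batch          # caller would send this to the cloud
        return None

store = DeviceStore(capacity=8, batch_size=4)
for reading in [72.0, 71.5, 73.2, 74.0]:
    batch = store.record(reading)
print(batch)  # [72.0, 71.5, 73.2, 74.0]
```

The `maxlen` bound guarantees the buffer can never outgrow the device's RAM budget, which mirrors the trade-off described above between storage capacity and leaving room for security software.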
12.4.5 Modeling the Relationship Between Acquired Measurements and Diseases
A lot of management tools, such as mathematical and computational models
for making qualitative and quantitative predictions, are being proposed.
These would give a brief idea about different control measures [25].
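As one classic example of such a computational model (chosen for illustration; the text does not prescribe a specific model), the SIR compartmental model predicts how an infection spreads through a population. The parameter values below are arbitrary assumptions:

```python
# Illustrative sketch: a simple SIR (Susceptible-Infectious-Recovered) model,
# one classic example of the mathematical models mentioned above. Parameter
# values (beta, gamma, population split) are arbitrary assumptions.

def simulate_sir(s, i, r, beta, gamma, days):
    """Integrate the SIR equations with a one-day Euler step."""
    n = s + i + r
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i / n   # contact-driven transmission
        new_recoveries = gamma * i          # 1/gamma ~ mean infectious period
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate_sir(s=9990, i=10, r=0, beta=0.3, gamma=0.1, days=60)
peak_infected = max(i for _, i, _ in history)
print(round(peak_infected))  # rough epidemic peak under these parameters
```

Sweeping `beta` or `gamma` in such a model is how qualitative "what if" questions about control measures (e.g., reducing contact rates) can be turned into quantitative predictions.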
12.4.6 Real-Time Processing
New and highly pathogenic infections are increasing at an incredible rate.
The past decade has seen a dramatic increase in the significance attached
to infectious diseases from the public health perspective [26]. A more
sensible formulation would be to specify the probability of leaving a
category as a function of the time spent within the category, such that
initially the chance of leaving the category is tiny, but the probability
increases as the mean infectious/latent period is reached. Stream processing
is a technology enabling applications to act on (collect, integrate,
visualize, and analyze) real-time streaming data while the information is
being produced. When you are able to process real-time streaming data as
fast as you collect it, you can respond to changing conditions like never
before [27]. It becomes possible to capture and aggregate many events per
second and then instantly take action, for example, to prevent credit card
theft, make a real-time offer, or prevent a medical device failure [28].
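The act-on-arrival idea above can be sketched with a sliding window that aggregates each sample as it arrives and raises an alert the moment a threshold is crossed. The window size, threshold, and heart-rate values are illustrative assumptions:

```python
from collections import deque

# Hedged sketch of stream processing: aggregate readings over a sliding
# window and act the moment a threshold is crossed, rather than after batch
# storage. Window size and alarm threshold are illustrative assumptions.

def monitor(stream, window_size=5, alarm_threshold=120.0):
    """Yield (reading, windowed mean, alarm flag) for each incoming sample."""
    window = deque(maxlen=window_size)
    for reading in stream:
        window.append(reading)
        mean = sum(window) / len(window)
        yield reading, mean, mean > alarm_threshold

heart_rates = [80, 85, 90, 130, 135, 140, 145]
for reading, mean, alarm in monitor(heart_rates):
    if alarm:
        print(f"alert: windowed mean {mean:.1f} bpm")  # act in real time
```

Because `monitor` is a generator, each event is processed as it is produced, which is the essence of acting on streaming data rather than on data at rest.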
12.4.7 Intelligence in Medical Care
Companies are integrating AI-driven platforms into medical scanning
devices in order to boost image clarity and clinical outcomes by reducing
exposure to radiation. Since the inclusion of artificial intelligence in the
healthcare industry, several developments are worth noting [29]:
i. forms of AI applications currently in development at industry-leading firms;
ii. common trends among innovation efforts and their effect on the longer term of healthcare;
iii. applications that appear to deliver the foremost value for healthcare professionals;
iv. companies integrating AI-driven platforms in medical scanning devices to boost image clarity and clinical outcomes by reducing radiation exposure;
v. AI and IoT: several companies are integrating AI and IoT in order to monitor patient activities [24].
12.5 Conclusion
IoMT is an upcoming field which can play a major role in enhancing the
healthcare business. Several modern technologies can be combined to
acquire better knowledge about the healthcare domain. Several paradigms,
such as enhanced data processing, cloud computing, and deep learning
techniques, can be used to empower the currently available wearable devices.
This process would help physicians to make better decisions and to
improve their knowledge about wearable healthcare devices. However,
these wearable devices also face several issues during acquisition, processing, and transmission of data. These problems might hamper the applications of IoMT in healthcare [30].
The processing of data in the IoMT platform requires the integration of multiple
healthcare technologies as well as devices in order to share knowledge
about a patient's vital parameters. This act of processing is important, and
during this process the data must be standardized. Standardized data
formats play a major role in the analytic processing of healthcare information.
These IoMT procedures follow different interoperability formats for the transmission and reception of data. The wearable devices discussed above will
allow humans to interact in real time over great distances. IoMT technology will allow remote learning, searching, and conducting surgeries.
Certain factors like end point security and internal segmentation play a
major role in IoMT devices [31]. Application of better policies in order to
authenticate the performance of the wearable device as well as to monitor
the user activity in real time makes the whole system more efficient. It is a
well-known fact that physicians and organizations are discarding the currently available conventional methods and opting for technologically
advanced digital solutions [32].
The IoMT will transform the perspective with which the healthcare
industry operates. It can connect both digital and non-digital devices,
which becomes another major advantage for physicians. One of the
best examples is connecting a heart rate monitor with patient beds via the
internet. This facility would be very helpful for both the physicians and the
patients. The above-mentioned devices and sensors will facilitate patient
monitoring even from remote locations; as a result, the patient need not
visit a physician regularly. This system would save a lot of money as well as
the time of both the patient and the physician. It would become a great
proof of success and act as a bridge between doctors and patients to
keep track of all the physical activities of the patient. This process would
gradually improve the physician-patient relationship and help in
providing on-time medical alerts for the patient.
References
1. Vishnu, S., Internet of Medical Things (IoMT) - An Overview. 5th International Conference on Devices, Circuits and Systems (ICDCS), IEEE, 7, 101–104, 2020.
2. Sood, R., Kaur, P., Sharma, S., Mehmuda, A., Kumar, A., IoT Enabled Smart Wearable Device – Sukoon. 2018 Fourteenth International Conference on Information Processing (ICINPRO), Bangalore, India, 1–4, 2018.
3. Liu, J., Xie, F., Zhou, Y., Zou, Q., A Wearable Health Monitoring System with Multi-parameters. 2013 6th International Conference on Biomedical Engineering and Informatics (BMEI 2013), 40, 332–336, 2013.
4. Ham, J., Hong, J., Jang, Y., Ko, S.H., Woo, W., Wearable Input Device for Smart Glasses Based on a Wristband-Type Motion-Aware Touch Panel. IEEE Symposium on 3D User Interfaces 2014, IEEE, USA, 147–148, 29–30 March 2014.
5. Smart Belt: A wearable device for managing abdominal obesity. 2016 International Conference on Big Data and Smart Computing (BigComp), IEEE, Hong Kong, China, 430–434, 2016.
6. Smart Bracelets: Towards Automating Personal Safety using Wearable Smart Jewelry. 2018 15th IEEE Annual Consumer Communications & Networking Conference (CCNC), IEEE, Las Vegas, USA, 2018.
7. Seneviratne, S., Hu, Y., Nguyen, T., Lan, G., Khalifa, S., Thilakarathna, K., Hassan, M., Seneviratne, A., A Survey of Wearable Devices and Challenges. IEEE Communications Surveys & Tutorials, 19, 2573–2620, 2017.
8. Yang, Z., Wang, Z., Zhang, J., Huang, C., Zhang, Q., Wearables can afford: Light-weight indoor positioning with visible light, in: Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, ACM, pp. 317–330, 2015.
9. Lee, Y.H. and Medioni, G., RGB-D camera based wearable navigation system for the visually impaired. Comput. Vis. Image Underst., 149, 3–20, 2016.
10. Mansour, A., Mootoo, R. et al., Sensor based home automation and security system. Conference Record - IEEE Instrumentation and Measurement Technology Conference, 722–727, May 2012.
11. Mohd Azlan Abu, Siti Fatimah Nordin, Mohd Zubir Suboh et al., Design and Development of Home Security Systems based on Internet of Things Via Favoriot Platform. Int. J. Appl. Eng. Res., 13, 2, 1253–1260, 2018.
12. Vidal, M., Turner, J., Bulling, A., Gellersen, H., Wearable eye tracking for mental health monitoring. Comput. Commun., 35, 11, 1306–1311, 2012.
13. H., Y.-P. and Young, D.J., Skin-Coupled Personal Wearable Ambulatory Pulse Wave Velocity Monitoring System Using Microelectromechanical Sensors. IEEE Sens. J., 14, 3490–3497, 2014.
14. Woo, S.H., Choi, Y.Y., Kim, D.J., Bien, F., Kim, J.J., Tissue-informative mechanism for wearable non-invasive continuous blood pressure monitoring. Sci. Rep., 4, 6618, 2014.
15. Guo, L., Berglin, L., Wiklund, U., Mattila, H., Design of a garment-based sensing system for breathing monitoring. Text. Res. J., 83, 499–509, 2012.
16. Gandis, G., Mazeika, M., Swanson, R., Respiratory Inductance Plethysmography: An Introduction, PRO-TECH Services, Inc., USA. Available online: http://www.pro-tech.com/ (accessed on 5 June 2017).
17. Anmin, J., Bin, Y., Morren, G., Duric, H., Aarts, R.M., Performance evaluation of a tri-axial accelerometry-based respiration monitoring for ambient assisted living, in: Proceedings of the Engineering in Medicine and Biology Society, Minneapolis, MN, USA, pp. 5677–5680, 3–6 September 2009.
18. Mendelson, Y., Dao, D.K., Chon, K.H., Multi-channel pulse oximetry for wearable physiological monitoring, in: Proceedings of the 2013 IEEE International Conference on Body Sensor Networks (BSN), MA, USA, pp. 1–6, 6–9 May 2013.
19. Chen, C.-M., Kwasnicki, R., Lo, B., Yang, G.Z., Wearable Tissue Oxygenation Monitoring Sensor and a Forearm Vascular Phantom Design for Data Validation, in: Proceedings of the 11th International Conference on Wearable and Implantable Body Sensor Networks, Zurich, Switzerland, pp. 64–68, 16–19 June 2014.
20. Zysset, C., Nasseri, N., Büthe, L., Münzenrieder, N., Kinkeldei, T., Petti, L., Kleiser, S., Salvatore, G.A., Wolf, M., Tröster, G., Textile integrated sensors and actuators for near-infrared spectroscopy. Opt. Express, 21, 3213, 2013.
21. Krehel, M., Wolf, M., Boesel, L.F., Rossi, R.M., Bona, G.L., Scherer, L.J., Development of a luminous textile for reflective pulse oximetry measurements. Biomed. Opt. Express, 5, 2537–2547, 2014.
22. Medtronic MiniMed, Continuous Glucose Monitoring. Electronics, 6, 65, 1–16, 2017. Available online: https://www.medtronicdiabetes.com (accessed on 7 July 2017). doi:10.3390/electronics6030065.
23. Dexcom, Dexcom G4 Platinum. J. Diabetes Sci. Technol., 9, March 2015. Available online: http://www.dexcom.com/pt-PT (accessed on 7 July 2017). doi:10.1177/1932296815577812.
24. Burton, A., Stuart, T., Ausra, J., Gutruf, P., Smartphone for monitoring basic vital signs: Miniaturized, near-field communication based devices for chronic recording of health. Smartphone Based Med. Diagn., 10, 177–208, 2020.
25. Takahashi, M., Heo, Y.J., Kawanishi, T., Okitsu, T., Takeuchi, S., Portable continuous glucose monitoring systems with implantable fluorescent hydrogel microfibers, in: Proceedings of the 2013 IEEE 26th International Conference on Micro Electro Mechanical Systems (MEMS), Taipei, Taiwan, pp. 1089–1092, 20–24 January 2013.
26. Di Rienzo, M., P., G., Brambilla, G., Ferratini, M., Castiglioni, P., MagIC System: A New Textile-Based Wearable Device for Biological Signal Monitoring. Applicability in Daily Life and Clinical Setting, in: Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, China, pp. 7167–7169, 1–4 September 2005.
27. Lymberis, A.G.L., Wearable health systems: From smart technologies to real applications, in: Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, pp. 6789–6792, 30 August–3 September 2006.
28. Rita Paradiso, G.L. and Taccini, N., A Wearable Healthcare System Based on Knitted Integrated Sensors. IEEE Trans. Inf. Technol. Biomed., 9, 337–344, 2005.
29. Seoane, F., Mohino-Herranz, I., Ferreira, J., Alvarez, L., Buendia, R., Ayllon, D., Llerena, C., Gil-Pita, R., Wearable biomedical measurement systems for assessment of mental stress of combatants in real time. Sensors, 14, 7120–7141, 2014.
30. Yilmaz, T., Foster, R., Hao, Y., Detecting vital signs with wearable wireless sensors. Sensors, 10, 10837–10862, 2010.
31. Statista, Wearable Device Sales Revenue Worldwide from 2016 to 2022 (in Billion U.S. Dollars), Statista Inc., New York, NY, USA, 2017.
32. Yussuff, V. and Sanderson, R., The World Market for Wireless Charging in Wearable Technology, IHS, Englewood, CO, USA, 2014.
33. Giovangrandi, L., Inan, O.T., Banerjee, D., Kovacs, G.T., Preliminary results from BCG and ECG measurements in the heart failure clinic, in: Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, pp. 3780–3783, 28 August–1 September 2012.
13
Future of Healthcare: Biomedical
Big Data Analysis and IoMT
Tamiziniyan G.1* and Keerthana A.2
1GRT Institute of Engineering and Technology, Tiruttani, Tamil Nadu, India
2Vels Institute of Science, Technology & Advanced Studies (VISTAS), Velan Nagar, P.V. Vaithiyalingam Road, Pallavaram, Chennai, Tamil Nadu, India
Abstract
Biomedical big data analysis is an advanced technique exploring a plethora of
datasets for extracting useful diagnostic and therapeutic information. It assists
biomedical researchers to develop new algorithms and prediction models.
Advancements in big data will improve the quality of diagnosis and prophylaxis. Leveraging big data analysis will reduce the challenges in the healthcare ecosystem. Integration of datasets can support healthcare providers in achieving
better patient outcomes. Big data analysis has a huge impact on personalized medicine.
Development of different biomedical data repositories for medical images, biosignals and biochemistry will strengthen medical data analysis. Data collection
and storage over the cloud are getting attention as the usage of wearable sensors
is becoming well accepted. Cloud storage can share information across different healthcare systems and impart possibilities for big data analytics. Internet
of Things (IoT) can be used in biomedical and health monitoring applications,
which comprise various biosensors and medical devices connected
to the network. This will produce enormous data for better diagnostics and therapeutics. Physiological data of patients can be acquired using smart sensors, and
the data can be stored in the cloud using the internet. This will radically change the
diagnostic approach and provide better point of care.
Keywords: Biomedical big data analysis, Internet of Things, biomedical data
acquisition, biomedical data management, clustering algorithms
*Corresponding author: gteniyan@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (247–268) © 2022 Scrivener Publishing LLC
13.1 Introduction
Big data is a collection of large sets of complex data used to computationally analyze and extract useful information. Big data can be defined simply
by three V’s: volume, velocity, and variety (Figure 13.1).
• Volume is the data size in the range of terabytes or petabytes.
Data collected from various sources will be stored in a variety of platforms. This requires extensible storage devices and
support for maintaining the complex multiple data sources.
One of the greatest challenges in big data is maneuvering the
massive amounts of data stored in different databases and also
identifying specific data in an immense structured dataset.
• Velocity defines the processing and analyzing time of big
data. Since data are streaming at every instant, it is very
important how fast the data is processed and updated in
the database for real-time analysis. Capturing and processing the data in real time or near real time is a major
challenge in the healthcare sector for clinical decision-making.
Automated decision-making systems require instantaneous data to predict outcomes accurately and
efficiently.

Figure 13.1 Three important V of big data: volume, velocity, and variety.
• Variety is the format of data stored in the database as either
structured numerical data or unstructured documents like
audios, pictures, and videos. Big data is a collection of different types of data acquired from various sources, including
web content, multimedia, web server logs, audio and video
records, transaction activities, and location data. Processing
the different format of data also makes big data analysis
complicated. Advancements in data processing algorithms
every day allow us to handle these diverse formats of data
skillfully.
There has been a surge of data over the past twenty years as the world
has entered the digital era of the internet, numerous communication technologies, and smart gadgets. Data acquired from various sources and in
various formats are stored digitally for different applications. Even before
the digital era, plenty of data existed in the form of archived files and
records documented simply on paper. Today, we have highly advanced
technology such as online and offline databases, spreadsheets, and cloud
computing systems that facilitate data access from anywhere
in the world. Large amounts of data are created every day, and this
tends to increase steadily. Every internet user knowingly or unknowingly
leaves a digital trace of their online activity. In the next 10 years, the
digital data available globally from various sources will reach around 100
zettabytes [1, 2]. This stream of data flow will challenge us to acquire,
process, store, analyze, and exchange the datasets, and the currently
available technologies and systems may not handle this huge volume of
data. This is where big data analysis is implemented to organize and manage the databases.
Big data is growing every day and changing the way decisions are made
in many sectors such as production industries, healthcare, finance, and
marketing. Storing, sorting, processing, and analyzing the data requires
well-equipped storage devices, algorithms, and communication system
which are evolving daily. Big data analysis strives to resolve the challenges
easily and cost efficiently. It is possible to predict the future outcomes
using big data analysis by developing predictive models [3, 4]. This concept is majorly used in finance sector to find the root cause of the past and
expected outcome in future.
13.2 Big Data and IoT in the Healthcare Industry
Medical field records large volume of patient data such as vital biosignals,
medicines prescribed for various diseases, human genetic data, pharmacokinetics, pathogen genomics, course assessments, routine clinical documentation, medical scan records, and biochemical results which are
acquired every day at a high rate [5]. Apart from the above-mentioned
datasets, personal health monitoring devices such as smart watches, activity trackers, sleep cycle monitors, and body area networks have been used widely
in recent years, and these also acquire and store a lot of patient information.
This causes data overflow in the healthcare sector and requires sophisticated
technology to manage the data. Also, medical big data is different from
other big data as it is hard to access frequently and it has legal complications associated with it. These data also possess a lot of variation, as the
datasets are acquired from different patients, demographics, age groups,
and clinical approaches. Big data analytics
can be used in medical field to understand the nature of a disease by developing predictive relational models using the available large volumes of
medical data [6]. Using this big data approach, it is possible to predict the
outcome of a disease or a diagnostic technique which leads to advancements in treatment design [7]. With big data analytics, patient analysis
can be made extremely easy and early diagnosis of critical diseases is also
possible. Big data can be utilized greatly by scientists and specialists to
estimate the risk population for many chronic diseases and develop new
drugs for those diseases [8].
The healthcare industry is one of the crucial applications of IoT. IoT is simply defined as "anything, anytime, anywhere" [9]. The healthcare industry is
being completely remodeled and enhanced using IoT, as many advancements in technology take place in this field every day. IoT applications provide better health monitoring facilities in hospital, home,
and outdoor environments. Developments in communication technology
allow the healthcare industry to provide doorstep services to patients.
Figure 13.2 Applications of big data and IoT in healthcare: predictive analytics, medical and pharmacological research, preventive care strategies, personalized healthcare, identification of clinicians' and patients' needs, prevention of medical errors, clinical trials, and disease trend analysis.
Enhanced internet facilities help the patients to consult the doctor from
home through video conferencing technology which greatly reduces the
patient waiting time in hospitals. There are a lot of developments happening in the big data and IoT fields every day, which will definitely enhance the
healthcare infrastructure in the near future (Figure 13.2).
13.3 Biomedical Big Data Types
Biomedical data is a predominant source of information for most of the
healthcare and medical research. Biomedical data can be collected from
the patient during a hospital visit, or individually from a clinical trial program, or personally through IoT applications. These biomedical data
come in different types, which are discussed below.
13.3.1 Electronic Health Records
Electronic health records (EHRs) are obtained from the patient at a hospital, a clinic, or a point-of-care medical center (Figure 13.3). EHRs are
also referred to as electronic medical records (EMRs), and these data are not
provided to anyone outside the concerned place. The data collected from
patient includes diagnostic and therapeutic details, biochemistry laboratory results, pharmaceutical details, health history, patient hospitalization
and discharge details, patient billing, and insurance claims. All these data
will be collected and stored securely in hospital database and used for
cohort studies within the hospital. In some clinical research work, experts
from the outside will be allowed to access the hospital data repositories for
finer research outcomes [10–12].
13.3.2 Administrative and Claims Data
Administrative data consist of statistical reports on the number of patients
admitted to and discharged from the hospital. These data are periodically
updated to the government health agency to generate demography-wise
patient information along with health statistics and expenses.
Claims data show the insurance claims of individual patients as well
as the healthcare providers. They also record the details of the insurance company
through which the hospital billing transaction was made. A patient can claim
insurance for diagnostic and therapeutic purposes (including pharmacy).
All these transactions are billed by the healthcare provider (hospital or
health center) and sent to the insurance company for claiming the insurance based on the patient's consent. These data are shared with the government
agency during a critical case investigation [13].
13.3.3 International Patient Disease Registries
Patient disease registry provides information about various chronic diseases and the different treatment approaches used for those diseases. These
registries are updated regularly by all the hospitals and healthcare centers
in a particular region. Information stored in these registries will be used
for managing chronic conditions such as diabetes, hypertension, respiratory diseases, and cancer. Some of the international disease registries
are Global Alzheimer’s Association Interactive Network (GAAIN) [14];
Australian EEG Database [18]; National Cardiovascular Data Registry
(NCDR) [15]; Personal Genome Project (PGP) [16]; Epilepsiae European
Database on Epilepsy [17]; National Program of Cancer Registries [19];
Figure 13.3 Basic details available in an electronic health record: diagnostic reports, billing and insurance claims, health history, pharmaceutical information, and discharge instructions.
National Trauma Data Bank (NTDB) [20]; National Patient Care Database
(NPCD); Alzheimer’s Disease Neuroimaging Initiative (ADNI) [21]; and
Surveillance, Prevention, and Management of Diabetes Mellitus Data Link
(SUPREME DM) [22].
13.3.4 National Health Surveys
National health surveys are conducted in every country to assess the
health condition of their population. Chronic disease surveys are mostly
conducted in economically weaker countries to accurately estimate the
health expenses of the country. These surveys evaluate the number of
people affected by specific disease and document the data electronically
by means of spreadsheets or software provided by the government agencies. Collected information will be used for research purpose to develop
new medicines, diagnostic methods, patient specific therapy, etc. Some
of the national health survey includes Medicare Current Beneficiary
Survey (MCBS) [23]; National Health and Nutrition Examination Survey
(NHANES) [24]; Medical Expenditure Panel Survey (MEPS) [25]; National
Center for Health Statistics; and CMS Data Navigator, National Health and
Aging Trends Study (NHATS) [26].
13.3.5 Clinical Research and Trials Data
Clinical research is one of the important fields where an enormous amount of
data is recorded at every instant of time. Clinical research is conducted by
both government and private agencies. Data obtained from the research
is evaluated and stored in national laboratories for reference. Data can be
accessed only by some limited officials and restricted for general public
use. These research data consist of details about new vaccines, drugs under
research, genetics, molecular research, etc. [27].
13.4 Biomedical Data Acquisition Using IoT
The healthcare industry has abundant sources of information, which include
electronic hospital records, medical examination reports from laboratories,
and biomedical sensors and devices connected to the internet.
Apart from the above specified information sources, personal health monitoring gadgets like smart watches, activity trackers, and smartphone applications provide substantial amount of health information of individuals.
13.4.1 Wearable Sensor Suit
Physiological parameters such as ECG, pulse rate, blood oxygen level, temperature, respiration rate, and sweat rate can be acquired using wearable
sensors, and the data can be pre-processed using suitable hardware and/or
software before the physiological parameters are transmitted. Developments
in smart textiles have made a noticeable contribution toward the healthcare
industry for continuous evaluation of physiological parameters. Wearable
sensors must be designed to be lightweight and miniature and to operate
accurately. The wearable device should also be durable and convenient to
use. Sensors used in the wearable device should perform continuously
without shortcomings such as power supply limitations. Advancements in
sensor fabrication techniques provide better sensor designs in terms of size,
accuracy, precision, resolution, and durability. Some of the commonly
used wearable sensors are potentiometric and amperometric biosensors
for measuring the biochemical composition of body fluids, fiber optic
biosensors for measuring blood oxygen concentration levels, and calorimetric biosensors for detecting bacteria and other pathogens.
Real-time embedded systems are interfaced with wearable sensors to
acquire the physiological signals continuously, either wired or wirelessly.
Digitization of the physiological signal is executed by the signal conditioning
circuit embedded in the data acquisition system. These digitized signals
can be stored in hard disks or solid-state drives and uploaded to the
cloud when required. In some data acquisition systems, physiological data
are directly stored to the cloud continuously in real time [28].
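As a rough sketch of the digitize-then-upload pipeline described above (the ADC resolution, voltage range, and payload format are hypothetical assumptions, not values from the text), the embedded side might look like this:

```python
import json
import time

# Hypothetical sketch of a wearable acquisition step: raw ADC counts are
# converted to engineering units, timestamped, and serialized for cloud
# upload. ADC resolution, reference voltage, and schema are assumptions.

ADC_BITS = 12
V_REF = 3.3   # volts

def adc_to_voltage(raw_count):
    """Convert a raw ADC count to a voltage reading."""
    return raw_count * V_REF / (2 ** ADC_BITS - 1)

def build_upload_payload(device_id, raw_samples):
    """Digitize a batch of raw samples and serialize them for transmission."""
    records = [
        {"t": time.time(), "volts": round(adc_to_voltage(c), 4)}
        for c in raw_samples
    ]
    return json.dumps({"device": device_id, "samples": records})

payload = build_upload_payload("wearable-01", [0, 2048, 4095])
print(payload)  # JSON string an uplink would send to the cloud endpoint
```

The signal-conditioning circuit mentioned above plays the role of `adc_to_voltage` in hardware; the serialized batch is what gets written to local storage or pushed to the cloud in real time.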
13.4.2 Smartphones
Future smartphones will be integrated with biosensors to monitor personal
health using smart applications. This helps the user to monitor their health
anytime and anywhere. Also, smartphones are well designed to transmit
the data to anyone which allows the user to share their health report using
mobile internet. Health monitoring applications can store the user data
regularly in the phone memory or on the cloud if the user has access to
cloud storage and this enables the user to retrieve their health report anytime and anywhere. Some hospitals already have their own software applications for e-consultation and e-pharmacy. Patients can directly consult
the doctor without visiting the hospital using these software applications.
The information exchanged by the patient with the doctor may be documented for future purpose. This information such as health history, drug
prescription, and diagnostic approaches can be later used by the hospital
for clinical research after getting consent from the patient [29].
13.4.3 Smart Watches
Smart watches are digital watches, also called wearable computers, which
have been in existence since the 1970s. These watches were initially designed with
some basic digital capabilities like digital timekeeping, information storage, arithmetic operations, and gaming applications. Smart watches developed after 2010 have more functionality, much like smartphones, including
mobile operating system, mobile applications, media players, FM radio,
and wireless connectivity such as Wi-Fi and Bluetooth. Nowadays, smart
watches have advanced functionalities that include mobile phone calling
features using LTE networks, message notifications, and calendar synchronization. Most of the smart watches are integrated with many peripheral
devices like compass, accelerometers, altimeters, GPS receivers, barometers, temperature sensor, heart rate sensor, SpO2 sensor, pedometers,
memory card slots, and small speakers [30].
Smart watches are widely used in telemetry applications to monitor
the health condition of individuals. Since many sensors are integrated in
a smart watch, it is possible to acquire vital parameters such as heart rate,
blood oxygen saturation level, body temperature, calorie burned level,
and sleep cycle. Some smart watch makers integrated ECG sensor that
can monitor the heart’s electrical activity instantaneously. In the advancements in sensor fabrication, embedded system, software, and communication systems, it is possible to acquire, monitor, as well as transmit the vital
parameters using a smart watch. Data can be stored and monitored in real
time with connected devices like smartphones, computers, and cloud computing systems. These devices allow the doctors, care providers or family
members to monitor the elder people’s vital parameters anytime and anywhere. Currently, many manufacturers are involved in developing many
healthcare applications and biosensors to acquire various vital parameters
for continuous long-term monitoring.
13.5 Biomedical Data Management Using IoT
Generally, big data means a huge volume of discrete data generated and collected at every instant of time. The collected data is used for optimizing consumer services. Data collected from different sources must be categorized
and organized optimally so that the data can be analyzed for future work. Handling
biomedical big data is a major challenge since heterogeneous file formats
are stored in different databases. Data must be stored in a simply readable file format for easy access and efficient analysis, since experts
from different backgrounds are required to work together to achieve
it. The acquired data can be stored using cloud computing and it can be
analyzed whenever required. This will help the scientific community to do
better research and come out with new decision-making approaches. The
other challenge in medical big data is the implementation of protocols and
hardware for healthcare applications. Biomedical big data also requires
advanced embedded system architecture for biomedical sensor interfacing, neural algorithms for signal processing, and communication systems
for information exchange. Machine learning applications and artificial
intelligence can also be used to analyze and process the stored medical
data. Most important work to be done for managing data is annotating,
integrating and presenting the complex data in order to understand the
Biomedical Big Data Analysis and IoMT
257
[Figure: big data management stages (data collection, storage, integration, mining, analysis, visualization, languages, and cleaning) with example tools: Import.io, Hadoop, Pentaho, IBM Modeler, BigML, Tableau, Python, and Data Cleaner.]
Figure 13.4 Commonly used big data management tools.
information much better. The absence of relevant information affects the precision of the physician's prediction or diagnosis. Visualization tools can also be used to explore the data better and to easily review medical scan reports. Some of the biomedical data management software tools are discussed here (Figure 13.4).
13.5.1 Apache Spark Framework
Apache Spark is an open-source cluster computing framework. It can handle large volumes of data rapidly using its in-memory cluster computing. Distributed data processing can be done easily in this framework using higher-level libraries such as GraphX for graph processing, MLlib for machine learning, Spark SQL for SQL queries, SparkR for data processing in R, and Spark Streaming for data streaming. These libraries allow developers to build, compute, and analyze applications effortlessly. Spark executes applications at a faster rate by keeping data in memory and reducing the total number of read and write operations. Spark can be used from various programming languages like Java, R, Scala, and Python. Spark reduces the
administration load of datasets and supports functions like collaborative queries, group applications, and streaming and iterative procedures in a single system. Spark achieves much faster multi-pass analytics by implementing resilient distributed datasets (RDDs). However, processing very large datasets may demand a lot of memory, which can increase system cost and complexity. Another Apache real-time framework, called Storm, was developed for data stream processing. Storm provides better built-in fault tolerance and scalability, and, like Spark, it can be used from many programming languages [31].
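The RDD-style transformation chain that Spark exposes can be illustrated in plain Python. A real Spark job would express the same map, filter, and reduce steps through pyspark and distribute them across a cluster, so this is only a local sketch of the programming model (the heart-rate data and threshold are invented):

```python
# Plain-Python sketch of Spark's RDD-style transformation chain
# (map -> filter -> reduce), run locally instead of on a cluster.
from functools import reduce

readings = [72, 68, 155, 80, 190, 64]  # e.g. heart-rate samples in bpm

mapped   = map(lambda bpm: ("tachycardia" if bpm > 100 else "normal", 1), readings)
filtered = filter(lambda kv: kv[0] == "tachycardia", mapped)
count    = reduce(lambda acc, kv: acc + kv[1], filtered, 0)

print(count)  # 2 abnormal readings (155 and 190)
```

In Spark these transformations would be lazy and evaluated in parallel over partitions of an RDD; the chaining style and the final reducing action are the same.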
13.5.2 MapReduce
MapReduce is one of the most commonly used data management programming models for handling massive volumes of distinct datasets. It is based on the Java language. A MapReduce program has a map procedure (which performs filtering and sorting) and a reduce procedure (which performs a summary operation). A MapReduce job usually consists of three steps, namely, map, shuffle, and reduce. The map operation applies the map function to local data and writes the output to temporary storage through a worker node; the master node ensures that only a single copy of redundant input data is processed. In the shuffle procedure, worker nodes redistribute the data based on output keys and place all data with the same key on the same worker node. In the reduce procedure, each group of output data is processed in parallel. MapReduce libraries have different levels of optimization and are available in many programming languages [32].
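The three phases can be simulated locally in a few lines. The diagnosis codes below are invented, and a real MapReduce job would run the map and reduce functions on distributed worker nodes:

```python
# Minimal local simulation of the three MapReduce phases described above:
# map, shuffle (group by key), and reduce. The dataset is illustrative.
from collections import defaultdict

records = ["diabetes", "asthma", "diabetes", "hypertension", "diabetes"]

# Map: emit (key, 1) pairs
pairs = [(diagnosis, 1) for diagnosis in records]

# Shuffle: group values with the same key on the same "worker"
groups = defaultdict(list)
for key, value in pairs:
    groups[key].append(value)

# Reduce: summarize each group (here, a count)
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # {'diabetes': 3, 'asthma': 1, 'hypertension': 1}
```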
13.5.3 Apache Hadoop
Apache Hadoop is an open-source software framework used to store and process big data. Hadoop can handle both structured and unstructured data. It uses the MapReduce programming model for processing and generating large datasets. Hadoop analyzes huge volumes of complex datasets by distributing the data across multiple nodes and processing it in parallel. This approach greatly increases processing speed because computation runs locally on each node's portion of the data. Hadoop efficiently processes the data, handles multiple programming issues, plans machine-to-machine communication, and manages multiple nodes using the map and reduce operations. The Hadoop Distributed File System (HDFS) is the basic distributed storage used by Hadoop applications. HDFS collects and stores enormous amounts of data in clusters using cloud as well as physical storage devices [33, 34].
In 2011, about 150 billion gigabytes of data were generated by the US health industry, and the volume of healthcare data increases tremendously every year. It is estimated that 80% of biomedical data is unstructured, and it can be handled successfully using the Hadoop framework. Hadoop technology is used in cancer treatment for mapping billions of DNA base pairs. This helps scientists develop patient-specific drugs and treatments based on the genetic information available in gene databases. Hadoop technology is also used for monitoring patient vital parameters with the support of smart sensors and smart gadgets. Wearable sensors acquire the patient's vital parameters, smart devices store these data in the cloud, and the data are managed by Hadoop ecosystem components like Spark, Hive, Impala, and Flume. Hadoop technology is used in hospital networks to provide better clinical support. It helps patients with critical diseases by providing the best treatment plans based on real-time clinical data analysis. Hadoop technology is used in healthcare intelligence applications to assist healthcare providers, healthcare agencies, and insurance companies. Disease data and the cost spent on treating that disease in a specific demography can be investigated using such applications. Hadoop technology is also used for fraud prevention and detection in health insurance payments, made possible by real-time processing of available information such as an individual's medical claims data. Hadoop supports the healthcare sector in reducing treatment cost, predicting epidemics, discovering drugs and vaccines, innovating new diagnostic and therapeutic approaches, supporting scientific research, and improving the quality of healthcare as well as the quality of human life [35].
13.5.4 Clustering Algorithms
Clustering is an unsupervised machine learning technique commonly used for statistical analysis of data, in which data points are grouped according to several criteria. A clustering algorithm analyzes every data point in a given database and organizes similar data points into separate clusters. Each cluster has different similarity properties (e.g., mean, variance, standard deviation, and size), and data points are grouped accordingly [36]. Some of the commonly used clustering algorithms are discussed below.
13.5.5 K-Means Clustering
K-means clustering is one of the most commonly used clustering algorithms
in data science and machine learning. It is an easy and simple clustering
procedure in which the data points are grouped into a number of clusters defined in advance. Here, "K" represents the total number of clusters identified from the given database. The algorithm assigns each data point to a specific cluster by computing the least sum of squared distances between the centroids and the data points. The algorithm is executed as follows:
Consider, D = [d1, d2, d3,…..,dn] = set of data points,
C = [c1, c2, c3,…..,cn] = set of centers.
• Step 1: Choose a cluster center randomly.
• Step 2: Compute the distance between each and every data
point and the cluster centers.
• Step 3: Data points with minimum distance (pre-defined)
from the cluster center are grouped together.
• Step 4: New cluster center is computed.
• Step 5: Distance between the remaining data points and the
new cluster center (obtained in step 4) is computed.
• Step 6: Repeat Step 3.
• Step 7: Stop if all the data points are grouped; else, repeat
from Step 4.
The K-means clustering algorithm is quite fast, easy to implement, and efficient when the clusters are well separated from each other. It also has some drawbacks: the number of groups must be selected before the process starts, and the random choice of initial cluster centers may produce different outputs, which leads to a lack of repeatability and consistency [37, 38].
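The steps above can be sketched in NumPy; the toy dataset and the choice of k are illustrative:

```python
# NumPy sketch of the K-means steps above: random initial centers,
# assignment by least squared distance, and centroid updates until stable.
import numpy as np

def kmeans(data, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), size=k, replace=False)]   # Step 1
    for _ in range(iters):
        # Steps 2-3: distance to each center, assign each point to the nearest
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Steps 4-5: each center becomes the mean of its assigned points
        new_centers = np.array([data[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):                      # Step 7: stable
            break
        centers = new_centers
    return labels, centers

data = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
labels, centers = kmeans(data, k=2)
print(labels)  # the two tight pairs land in different clusters
```

Note the drawback mentioned above: a different seed for the initial centers can change which cluster receives which label, even when the grouping itself is the same.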
13.5.6 Fuzzy C-Means Clustering
Fuzzy c-means (FCM) clustering is one of the most widely used algorithms in big data mining for pattern analysis. FCM is also referred to as soft k-means or soft clustering. In FCM, a data point in a dataset can be associated with more than one cluster. Data points are associated with each cluster center based on the membership grades (derived from the distance between the data points and the cluster center) assigned to each data point.
Membership grades show the degree of closeness (distance) between the
data point and the cluster center. Higher membership grade indicates that
the data point and the cluster center are very close (inside the cluster) and
lower membership grade means that the data point may be located at the
edge of the cluster. FCM algorithm is executed as follows:
Consider,
D = [d1, d2, d3,…..,dn] = set of data points,
C = [c1, c2, c3,…..,cn] = set of centers.
• Step 1: Choose a cluster center randomly.
• Step 2: Compute the fuzzy membership.
• Step 3: Determine the fuzzy centers using the fuzzy membership function.
• Step 4: Repeat Steps 2 and 3 until the objective function
reduces to a minimum value.
FCM provides better outcomes than the k-means algorithm, and the data points are highly correlated to specific clusters based on their membership grades, which increases the degree of similarity. FCM is a multidimensional clustering algorithm that has many applications in bioinformatics, economics, image analysis, marketing, pharmacology, and many other industries [39, 40].
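A minimal NumPy sketch of the FCM loop, assuming the standard membership and center update rules with fuzzifier m = 2 (the one-dimensional dataset is illustrative):

```python
# NumPy sketch of fuzzy c-means: memberships computed from distances,
# fuzzy centers computed from memberships, repeated until the change is small.
import numpy as np

def fcm(data, c, m=2.0, iters=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(data), c))
    u /= u.sum(axis=1, keepdims=True)                  # random initial memberships
    centers = None
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ data) / um.sum(axis=0)[:, None]           # Step 3
        dist = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-10)                    # guard against zero distance
        inv = dist ** (-2.0 / (m - 1.0))
        new_u = inv / inv.sum(axis=1, keepdims=True)                # Step 2
        change = np.abs(new_u - u).max()
        u = new_u
        if change < tol:                               # Step 4: objective settled
            break
    return u, centers

data = np.array([[0.0], [0.1], [4.0], [4.1]])
u, centers = fcm(data, c=2)
print(np.round(u, 2))  # memberships close to 1 for each point's nearest cluster
```

Each row of the membership matrix sums to 1; a point near a cluster center gets a high grade for that cluster and low grades for the others, which is exactly the "degree of closeness" described above.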
13.5.7 DBSCAN
Density-based spatial clustering of applications with noise (DBSCAN) is a
commonly used non-parametric clustering algorithm. It is mostly used to
find the non-linear shapes and structures based on the concept of density
reachability and density connectivity to discriminate the data points from
the noise present in the dataset. The basic concept of this algorithm is that
a data point is assigned to a cluster if the data point is close to many data
points from that cluster. For example, consider a data point “x” which is
stated to be density reachable from a data point “y” if the point “x” is within
a distance “d” from point “y” and also “y” has a greater number of data
points in its neighborhood that are located within the distance “d”. Similarly,
the data points “x” and “y” are stated to be density connected if a data point
“h” has a greater number of data points in its neighborhood and both “x”
and “y” are within the distance “d”. Therefore, if “y” is neighbor of “h”, “h”
is neighbor of “z”, “z” is neighbor of “w”, and “w” is neighbor of “x” which
implies that “y” is neighbor of “x”. So, all the neighbors are inter-related
and this approach is similar to a chaining operation. The key parameters of the DBSCAN algorithm are eps and minPts: eps is the neighborhood radius (the maximum distance between two points considered neighbors), and minPts defines the minimum number of data points required to form a cluster. The DBSCAN algorithm is implemented as follows:
Consider, D = [d1, d2, d3,…..,dn] = set of data points,
• Step 1: Start with a random data point.
• Step 2: Find the neighborhood of this data point using eps.
• Step 3: If there are at least minPts data points in the eps neighborhood, start the clustering process and mark this data point as visited; otherwise, mark this data point as noise.
• Step 4: If any data point is close to the cluster, then its eps
neighborhood will be considered as the part of the cluster.
This procedure is repeated for all eps neighborhood points
until all the data points in the cluster are found.
• Step 5: A new unvisited data point will be considered and the
procedure is repeated from Step 2.
• Step 6: Stop the process if all the data points are noted as
visited.
This algorithm efficiently detects arbitrarily shaped clusters (high-density regions) and outliers (noise), and it is designed to analyze very large image databases. There is no need to specify the number of clusters before the operation starts. However, the DBSCAN algorithm fails to handle high-dimensional data with clusters of varying density, since it is difficult to choose eps and minPts appropriately for all the clusters [41, 42].
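The procedure above can be sketched as a brute-force implementation. The eps and minPts values fit only this toy dataset, and a production version would use a spatial index (or an off-the-shelf implementation such as scikit-learn's) for the neighbor queries:

```python
# Compact pure-Python DBSCAN following the steps above.
def dbscan(points, eps, min_pts):
    """Label each point with a cluster id (0, 1, ...) or -1 for noise."""
    def neighbors(i):
        # Step 2: indices within distance eps (brute force, includes i itself)
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)          # None = unvisited
    cluster = -1
    for i in range(len(points)):           # Steps 1 and 5: next unvisited point
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:           # Step 3: too sparse, mark as noise
            labels[i] = -1
            continue
        cluster += 1                       # Step 3: dense point starts a cluster
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:                       # Step 4: grow by density reachability
            j = queue.pop()
            if labels[j] == -1:            # border point previously marked noise
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_pts:       # core point: keep expanding
                queue.extend(more)
    return labels                          # Step 6: all points visited

pts = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9), (50, 50)]
print(dbscan(pts, eps=2.0, min_pts=3))  # [0, 0, 0, 1, 1, 1, -1]
```

The two dense groups become clusters 0 and 1 without the number of clusters being specified in advance, while the isolated point at (50, 50) is labeled -1 (noise).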
13.6 Impact of Big Data and IoMT in Healthcare
Enormous volumes of data stream into the healthcare field every day, most of them containing unstructured information. It is quite hard to analyze these huge volumes of data without a well-designed computational method. Many researchers from different organizations are involved in developing efficient and reliable computing algorithms and methodologies for managing and processing healthcare data. Some companies provide open-access healthcare datasets for stakeholders to advance data analytics research, and some organizations also provide solutions to many healthcare data related issues [43–46]. IBM Healthcare and Life Science provide solutions to healthcare organizations by developing customized analytical software to manage structured and unstructured data. GE Healthcare provides digital solutions such as clinical networking and cybersecurity for safer data sharing. They also provide clinical performance management solutions for better operational, clinical,
and financial outcomes. Dell Healthcare IT Solutions and Transformations provide advanced methodologies for monitoring healthcare research and tools for predictive models in biomedical data science. Cisco Healthcare offers solutions for various healthcare needs, including telehealth consulting, virtual triage, network infrastructure, and cybersecurity. Amazon Web Services (AWS) provides cloud computing services to many organizations, including the healthcare sector; AWS creates organization-specific network infrastructure for data acquisition, storage, processing, and management. Oracle Health Sciences is developing many digital applications for biomedical and pharmaceutical research, including the mHealth Connector Cloud Service, which allows data acquisition from patients in real time through wearable sensors, telemetry units, and network-connected instruments and equipment. Intel Healthcare and Life Sciences Technology has collaborated with many government and private healthcare organizations to develop artificial intelligence applications for data management. They also provide solutions for issues related to healthcare databases [47–50].
13.7 Discussions and Conclusions
Big data is a state-of-the-art approach in the healthcare field that is going to revolutionize the future of clinical diagnostics and therapeutics. There are many challenges in biomedical big data and IoMT, including data management, data analysis, and data quality. Efficient management of data can increase the level of data quality, and it enables promising analytics applications. The unstructured and semi-structured data from different sources can be managed using robust data management algorithms. Systematic biomedical data classification will ensure that the data can be managed quickly. Big data in healthcare will reduce the time physicians need for diagnosis. Developments in intelligent management solutions will help clinicians predict diseases at early stages. EHRs are digital records that store the medical history and laboratory reports of patients, which is useful for physicians to track patient data anytime and anywhere. EHRs also implement a paperless policy, which ensures better document organization and data retrieval. Real-time alerting methods with IoMT will help patients get consultation and treatment from home. Big data and IoMT in the biomedical sector will enhance patient and clinician engagement. Healthcare managers can analyze patient results and identify solutions seamlessly. Big data analytics along with IoMT can help in cancer pharmaceutical research based on the data stored in cancer patient databases, which are used to find the highest-success treatment
for a cure. Integrating big data and IoMT with telemedicine can provide personalized treatment plans and can avoid re-admission. Developments in neural algorithms may identify obscure patterns in medical scan reports and assist physicians in diagnostics and therapeutics with high precision. Clinical prediction models can be designed virtually using big data by exploring the wealth of information available in bioinformatics databases. Advancements in big data and IoMT will certainly enhance the healthcare field.
References
1. Sagiroglu, S. and Sinanc, D., Big data: A review. International Conference on Collaboration Technologies and Systems (CTS), San Diego, CA, IEEE, 2013.
2. Abdrabo, M., Elmogy, M., Eltaweel, G., Barakat, S., Enhancing Big Data Value
Using Knowledge Discovery Techniques. Int. J. Inform. Technol. Comput.
Sci., 8, 8, 1–12, 2016.
3. Acharjya, D. and Anitha, A., A comparative study of statistical and rough
computing models in predictive data analysis. Int. J. Ambient Comput. Intell.,
8, 2, 32–51, 2017.
4. Ng, K., Ghoting, A. et al., PARAMO: A parallel predictive modeling platform
for healthcare analytic research using electronic health records. J. Biomed.
Inform., 48, 160–70, 2014.
5. Wan, X., Kuo, P., Tao, S., Hierarchical medical system based on big data and mobile internet: A new strategic choice in healthcare. JMIR Med. Inform., 5, 3, 1–6, e22, 2017.
6. Tomasic, I., Petrovic, N., Fotouhi, H., Linden, M., Bjorkman, M., Relational database to a web interface in real time. European Medical and Biological Engineering Conference & Nordic-Baltic Conference on Biomedical Engineering and Medical Physics (EMBEC & NBC 2017), Tampere, Finland, pp. 89–92, 2017.
7. Dabek, F. and Caban, J.J., A neural network-based model for predicting psychological conditions, pp. 252–261, 2015.
8. Grieco, L.A. et al., IoT-aided robotics applications: Technological implications, target domains and open issues. Comput. Commun., 54, 32–47, 2014.
9. White Paper, Internet of Things Strategic Research Roadmap, Antoine de
Saint-Exupery, European Commission - Information Society and Media DG,
2009.
10. Miller, P. et al., Exploring a clinically friendly web-based approach to clinical
decision support linked to the electronic health record: Design philosophy,
prototype implementation, and framework for assessment. JMIR Med. Info.,
2, 2, e20, 2014.
11. Gkoulalas-Divanis, A. and Loukides, G., Anonymization of Electronic
Medical Records to Support Clinical Analysis, in: SpringerBriefs in Electrical
and Computer Engineering, 2013.
12. Fernandez-Aleman, J.L., Senor, I.C., Lozoya, P.A.O., Toval, A., Security and
privacy in electronic health records: A systematic literature review. J. Biomed.
Inform., 46, 3, 541–562, 2013.
13. Wang, L. and Alexander, C.A., Big data in medical applications and healthcare.
Curr. Res. Med., 6, 1, 1–8, 2015.
14. Toga, A.W. et al., The Global Alzheimer’s Association Interactive Network.
Alzheimers. Dement.: J. Alzheimers. Assoc., 12, 1, 49–54, 2016.
15. Dehmer, G.J. et al., The National Cardiovascular Data Registry Voluntary
Public Reporting Program: An Interim Report From the NCDR Public
Reporting Advisory Group. J. Am. Coll. Cardiol., 67, 2, 205–215, 2016.
16. Church, G.M., The personal genome project. Mol. Syst. Biol., 1, 1: 2005.0030,
2005.
17. Ihle, M. et al., EPILEPSIAE - a European epilepsy database. Comput. Methods
Programs Biomed., 106, 3, 127–38, 2012.
18. Hunter, M. et al., The Australian EEG database. Clin. EEG Neurosci., 36, 2,
76–81, 2005.
19. Weir, H.K. et al., The National Program of Cancer Registries: explaining state
variations in average cost per case reported. Prev. Chronic Dis., 2, 3, A10,
2005.
20. Haider, A.H. et al., Influence of the National Trauma Data Bank on the
study of trauma outcomes: is it time to set research best practices to further
enhance its impact? J. Am. Coll. Surg., 214, 5, 756–68, 2012.
21. Petersen, R.C. et al., Alzheimer’s Disease Neuroimaging Initiative (ADNI):
clinical characterization. Neurology, 74, 3, 201–9, 2010.
22. Nichols, G.A. et al., Construction of a multisite DataLink using electronic
health records for the identification, surveillance, prevention, and management of diabetes mellitus: the SUPREME-DM project. Prev. Chronic Dis., 9,
E110, 2012.
23. Adler, G.S., A profile of the Medicare Current Beneficiary Survey. Healthcare
Financ. Rev., 15, 4, 153–63, 1994.
24. National Research Council (US), Coordinating Committee on Evaluation of
Food Consumption Surveys, in: National Survey Data on Food Consumption:
Uses and Recommendations, National Academies Press (US), Washington
(DC), 1984.
25. Cohen, J.W. et al., The Medical Expenditure Panel Survey: a national health
information resource. Inquiry: J. Med. Care Organ., Provision Financ., 33, 4,
373–89, 1996.
26. Freedman, V.A. and Kasper, J.D., Cohort Profile: The National Health
and Aging Trends Study (NHATS). Int. J. Epidemiol., 48, 4, 1044–1045g,
2019.
27. Zhongheng, Z., Big data and clinical research: perspective from a clinician.
J. Thorac. Dis., 6, 12, 1659–64, 2014.
28. Mansoor, B.M. et al., A Systematic Review of Wearable Sensors and IoT-Based
Monitoring Applications for Older Adults - a Focus on Ageing Population
and Independent Living. J. Med. Syst., 43, 8, 233, 15 Jun. 2019.
29. Dimitrov, D.V., Medical Internet of Things and Big Data in Healthcare.
Healthc. Inform. Res., 22, 3, 156–63, 2016.
30. de Arriba-Pérez, F. et al., Collection and Processing of Data from Wrist
Wearable Devices in Heterogeneous and Multiple-User Scenarios. Sensors
(Basel, Switzerland), 16, 9, 1538, 21 Sep. 2016.
31. Guo, R. et al., Bioinformatics applications on Apache Spark. GigaSci., 7, 8,
giy098, 1 Aug. 2018.
32. Mohammed, E.A. et al., Applications of the MapReduce programming
framework to clinical big data analysis: current landscape and future trends.
BioData Min., 7, 22, 29 Oct. 2014.
33. O’Driscoll, A. et al., ‘Big data’, Hadoop and cloud computing in genomics.
J. Biomed. Inform., 46, 5, 774–81, 2013.
34. Yao, Q. et al., Design and development of a medical big data processing system based on Hadoop. J. Med. Syst., 39, 3, 23, 2015.
35. Kuo, A. et al., A Hadoop/MapReduce Based Platform for Supporting Health
Big Data Analytics. Stud. Health Technol. Inform., 257, 229–235, 2019.
36. Liao, M. et al., Cluster analysis and its application to healthcare claims data:
a study of end-stage renal disease patients who initiated hemodialysis. BMC
Nephrol., 17, 25, 2 Mar. 2016.
37. Alashwal, H. et al., The Application of Unsupervised Clustering Methods to
Alzheimer’s Disease. Front. Comput. Neurosci., 13, 31, 24 May 2019.
38. Steinley, D., K-means clustering: a half-century synthesis. Br. J. Math. Stat.
Psychol., 59, Pt 1, 1–34, 2006.
39. Wu, Y. et al., Multiple fuzzy c-means clustering algorithm in medical diagnosis. Technol. Healthcare: Off. J. Eur. Soc Eng. Med., 23, Suppl 2, S519–27, 2015.
40. Belhassen, S. and Zaidi, H., A novel fuzzy C-means algorithm for unsupervised heterogeneous tumor quantification in PET. Med. Phys., 37, 3, 1309–24,
2010.
41. Al-Shammari, A. et al., An effective density-based clustering and dynamic
maintenance framework for evolving medical data streams. Int. J. Med.
Inform., 126, 176–186, 2019.
42. Plant, C. et al., Automated detection of brain atrophy patterns based on MRI
for the prediction of Alzheimer’s disease. NeuroImage, 50, 1, 162–74, 2010.
43. Barrett, T., Troup, D. et al., NCBI GEO: mining tens of millions of expression
profiles–database and tools update. Nucleic Acids Res., 35, D760–D765, 2007.
44. Tanabe, L., Scherf, U. et al., MedMiner: an Internet text-mining tool for
biomedical information, with application to gene expression profiling.
Biotechniques, 27, 1210–1214, 1216–1217, 1999.
45. Weeber, M., Kors, J. et al., Online tools to support literature-based discovery
in the life Sciences. Briefings Bioinform., 6, 277–286, 2005.
46. Knoppers, B.M. and Thorogood, A.M., Ethics and big data in health. Curr. Opin. Syst. Biol., 4, 53–57, 2017.
47. Farahani, B. et al., Towards fog-driven IoT eHealth: promises and challenges of IoT in medicine and healthcare. Future Gener. Comput. Syst., 78, Part 2, 659–676, 2018.
48. Lomotey, R.K. et al., Wearable IoT data stream traceability in a distributed health information system. Pervasive Mob. Comput., 40, 692–707, 2017.
49. Fatt, Q.K. and Ramadas, A., The usefulness and challenges of big data in
healthcare. J. Healthc. Commun., 3, 2, 3:21, 2018.
50. Viceconti, M., Hunter, P., Hose, R., Big data, big knowledge: big data for personalized healthcare. J. Biomed. Health Inform., 19, 4, 1209–1215, 2015.
14
Medical Data Security Using
Blockchain With Soft Computing
Techniques: A Review
Saurabh Sharma¹*, Harish K. Shakya¹ and Ashish Mishra²
¹Dept. of CSE, Amity School of Engineering & Technology, Amity University (M.P.), Gwalior, India
²Department of CSE, Gyan Ganga Institute of Technology and Sciences, Jabalpur, India
Abstract
Healthcare and genomics data are among the most significant kinds of information for cross-institution predictive modeling, which assesses patient outcomes by analyzing observed data and generating scientific evidence using information from multiple institutions. Records are stored in different hospitals' databases; therefore, it is difficult to construct a summarized EMR (Electronic Medical Record) for one patient from multiple hospital databases due to security and privacy concerns. Blockchain technology and the cloud environment have proved their usability separately, and these two technologies can be combined to enhance exciting applications in the healthcare industry. Blockchain is a highly secure and decentralized networking platform of multiple computers called nodes. To solve the above problems, we propose a blockchain-based information management system to handle medical information security. In this paper, we investigate privacy issues and privacy protection within cloud computing. The proposed framework ensures data privacy, integrity, and fine-grained access control over the shared data with better efficiency. The proposed research will reduce the turnaround time for data sharing, improve the decision-making process, reduce the overall cost, and provide better security for EMRs.
*Corresponding author: saurabh.sharma44@gmail.com
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (269–288) © 2022 Scrivener Publishing LLC
Keywords: Medical data security, soft computing, cloud computing, data
privacy, EMR
14.1 Introduction
One advantage of using the cloud in a healthcare framework is that it provides an archive for medical records and reports. Through video calling, healthcare experts can collaborate in a coordinated way, and mobile applications make these services portable, which helps patients in emergency situations. When using a cloud infrastructure, however, patients depend on an outside party, namely, the cloud service provider, to hold their information [16]. Information in the cloud is not inherently protected, so security and data ownership remain the primary concerns whenever sensitive medical data could be exposed to an attacker. To address these issues, the proposed research framework uses a cloud-based blockchain for medical databases. The core modules and functions of the proposed system are introduced in the following sections.
Data quality validation: As shown in Figure 14.1, our study focuses only on continuous dynamic data. These data are usually generated by standard sensors, whose information is accessible through the APIs of the devices. Moreover, the pattern of the collected data can be evaluated using advanced machine learning techniques to make sure that the data are valid according to certain validation patterns or checks. This enables us to validate the quality of the data from both hardware and software aspects.
[Figure: sensors feed data through their APIs to a user app containing a quality validation module; each transaction (timestamp, content, quality, size) passes a blockchain consensus check or is cancelled; accepted data are compressed, encrypted, and placed in cloud storage; a consumer app searches the available data, downloads it, and asks the key keepers to release keys for decryption.]
Figure 14.1 General architecture and workflow of the proposed system [7].
Medical Data Security Using Blockchain
271
• Hardware aspect: When a new device is connected to the user app, the hardware information of that device and the sensors embedded in it is acquired by the app. If a device or a sensor is from a validated manufacturer, then it is recognized as qualified hardware and the data it produces are considered reliable; otherwise, it is refused connection with the app. For this purpose, a database of validated manufacturers and devices should be predefined and well maintained.
• Software aspect: Supported by advanced machine learning techniques, it is possible to classify the patterns of a time-series dataset with high accuracy, and there have been many studies on this topic. For instance, it is possible to recognize a user's daily activities using the data collected from an accelerometer embedded in a wearable device. Using similar machine learning techniques, we can create quality classifiers for different health data. Only data with predefined features will be saved; meaningless data and noise will be eliminated. Here, the quality of the data is a relative standard. Take the above-mentioned acceleration data as an example and imagine that a user's acceleration data are collected by a smart watch during 24 hours. The quality validation algorithms will be able to distinguish sleep from other daily activities. The data corresponding to the sleep period could be classified as high-quality data or as noise, depending on whether the user wants to share sleep-related data or only other daily activities.
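A deliberately simplified sketch of such a software-side quality check: instead of a trained classifier, it labels accelerometer windows by their variance (the threshold and sample values are invented for illustration):

```python
# Illustrative stand-in for the quality classifier described above:
# low-variance accelerometer windows are labeled "sleep", the rest "active".
# A real system would use a trained model; this threshold is invented.
import statistics

def label_window(samples, threshold=0.05):
    """Classify one window of accelerometer magnitudes by its variance."""
    return "sleep" if statistics.pvariance(samples) < threshold else "active"

night = [1.00, 1.01, 0.99, 1.00, 1.01]   # near-constant gravity reading
day   = [0.8, 1.6, 0.4, 1.9, 0.7]        # large swings while moving

print(label_window(night), label_window(day))  # sleep active
```

Whether the "sleep" windows are then kept as high-quality data or discarded as noise is a policy decision, matching the relative quality standard described above.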
As shown in Figure 14.2, such devices can be embedded in gadgets worn on the body, such as small microcontroller-based electronics integrated with clothes or jewellery, to enable healthcare services.
[Figure: a patient monitored at home sends a health alert through the blockchain network to the health service provider, which reacts to the alert and interacts with the patient.]
Figure 14.2 Remote patient monitoring [8].
272
The Internet of Medical Things (IoMT)
• Stationary Medical Devices: Static medical devices tied to a specific physical location (e.g., chemotherapy dispensing stations for home-based healthcare).
• Medical Embedded Devices: Medical devices that are implanted in the body (e.g., pacemakers).
• Medical Devices: Medical equipment prescribed by a qualified physician (e.g., an insulin pump).
• Health Monitoring Devices: Consumer products (e.g., Fitbit and FuelBand).
14.2 Blockchain
A Blockchain can be described as a growing chain of data blocks, linked in a way that makes it impractical to remove or alter them once recorded. Blockchain is used to transfer items such as cash, property, contracts, and bank or secured transactions without requiring an external intermediary such as a government. Once information is stored in a Blockchain, it is very difficult to change it. The underlying idea dates back to 1991, when a group of specialists described a cryptographically timestamped chain of records, the concept that later made Blockchain possible.
14.2.1 Blockchain Architecture
Conceptually, a Blockchain is a chain of blocks assembled into a protected (certified), secured, and verifiable data system. Physically, a Blockchain is a collection of networked computer servers, implying that the entire system is decentralized (Figure 14.3).
Figure 14.3 Blockchain architecture categories [7]: centralized vs. decentralized systems, and public distributed ledgers (anonymous users) vs. private ones (users are not anonymous).
To make this a little clearer, the way Blockchain works can be compared to a Google Docs document: several members can view the record at the same time and make the necessary changes, and each of them is confident of seeing its current state. Blockchain technology allows digital information to be distributed rather than copied. Distributed ledgers provide transparency, and confidence and data security are the main components of the Blockchain architecture. Because the ledger is replicated, the Blockchain process makes it possible to deploy tamper-evident data.
14.2.2 Types of Blockchain Architecture
All Blockchain structures fall into one of three categories (Figure 14.4):
I. Public Blockchain Architecture: A public Blockchain design means open access: anyone who is willing to participate can join and read the ledger (e.g., Bitcoin and other open Blockchain frameworks).
II. Private Blockchain Architecture: In contrast to public Blockchain engineering, a private structure is controlled by a particular institution, and only users approved by that institution are welcome to participate.
III. Consortium Blockchain Architecture: A consortium structure is governed jointly by a group of organizations; the consortium regulates which customers may use the system and how new participants are admitted.
Figure 14.4 Nodes in public vs. private Blockchain [8]: validator nodes can both initiate/receive and validate transactions, while member nodes can only initiate/receive transactions.
As mentioned, a Blockchain is a distributed ledger of which all parties hold a local copy. Nevertheless, depending on the Blockchain structure and the specific conditions, the system can be more centralized or more decentralized; this is essentially how Blockchain architectures structure and control the record. A private Blockchain is more centralized, since it is restricted to a designated group, with stronger security. A public Blockchain, in contrast, is decentralized: all records are visible to the general public and anyone can take part in the consensus process. However, appending each new record is less efficient, because an open Blockchain design requires a large amount of computation.
14.2.3 Blockchain Applications
Blockchain has a wide range of applications and uses in healthcare as
shown in Figure 14.5. The ledger technology facilitates the secure transfer
of patient medical records, manages the medicine supply chain, and helps
healthcare researchers unlock genetic code [2].
Scenario 1: Primary patient care. This scenario shows how the Blockchain system works in the medical domain (the data life cycle in a Blockchain architecture). The healthcare data are
Figure 14.5 Scenarios of using Blockchain in different healthcare situations [8].
sensitive and their management is cumbersome. Yet, there is no privacy-preserving system in clinical practice that allows patients to maintain an access control policy in an efficient manner.
• Sharing data between different healthcare providers may
require major effort and could be time consuming.
Next, we propose two approaches that can be implemented
separately or combined to improve patient care.
• Institution-based: The network would be formed by trusted peers: healthcare institutions or general practitioners (caregivers). The peers will run a consensus protocol and maintain a distributed ledger. The patient (or his relatives) will be able to access and manage his data through an application at any node where his information is stored. If a peer is offline, then the patient could access the data through any other online node. The key management process and the access control policy will be encoded in chain code, thus ensuring data security and the patient's privacy.
• Case-specific (serious medical conditions, examination, and elderly care): During a patient's stay in a hospital for treatment, rehabilitation, examination, or surgery, a case-specific ledger could be created. The network would connect doctors, nurses, and family to achieve efficiency and transparency of the treatment. This will help to eliminate human-made mistakes and to ensure consensus in case of a debate about a certain stage of the treatment.
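In its simplest form, the access control policy encoded in chain code could look like the following sketch. The party names and record categories are hypothetical, and a real chain-code implementation would also authenticate requesters and log every decision on the ledger:

```python
# Hypothetical per-patient policy: maps each authorized party to the set of
# record categories it may read. Names are illustrative only; real chain
# code would verify identities cryptographically and record each decision.
def is_allowed(policy, requester, category):
    """Return True if the requester may read the given record category."""
    return category in policy.get(requester, set())

# A patient-defined policy: the GP sees everything listed,
# a relative sees only prescriptions.
policy = {
    "dr_smith": {"lab_results", "prescriptions"},
    "relative_anna": {"prescriptions"},
}
```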
Scenario 2: Data aggregation for research purposes. It is highly important to ensure that the sources of the data are trusted medical institutions and, therefore, that the data are authentic. Using a shared distributed ledger will provide traceability and will guarantee patients' privacy as well as the transparency of the data aggregation process. Due to the current lack of appropriate mechanisms, patients are often unwilling to participate in data sharing. Using blockchain technology within a network of researchers, biobanks, and healthcare institutions will facilitate the process of collecting patients' data for research purposes.
Scenario 3: Connecting different healthcare players for better patient care. Connected health is a model for healthcare delivery that aims to maximize healthcare resources and provide opportunities for consumers to engage with caregivers and improve self-management of a health condition. Sharing the ledger (using the permission-based approach) among entities
(such as insurance companies and pharmacies) will facilitate medication and cost management for a patient, especially in the case of chronic disease management. Providing pharmacies with accurately updated data about prescriptions will improve the logistics. Access to a common ledger would allow transparency in the whole process of the treatment, from monitoring whether a patient correctly follows the prescribed treatment, to facilitating communication with an insurance company regarding the costs of the treatment and medications.
14.2.4 General Applications of the Blockchain
From a business perspective, it is helpful to think of blockchain technology as a type of next-generation business process improvement software.
Collaborative technology such as blockchain (Figure 14.6) promises the ability to improve the business processes that occur between companies, radically lowering the "cost of trust". For this reason, it may offer significantly higher returns for each investment dollar spent than most traditional internal investments.
Figure 14.6 Potential applications of the Blockchain [10]: financial services (asset management, insurance claims processing, cross-border payments, money lending), smart property (smart cars, smart appliances, smart phones, supply-chain sensors, the Internet of Things, smart healthcare), smart government (personal health record keeping, electronic passports, access control, birth and wedding certificates), healthcare management, insurance processing, personal identification, and smart communities.
Financial institutions are exploring how they could also use blockchain
technology to upend everything from clearing and settlement to insurance.
With this landscape in mind, what should an organization consider before adopting a Blockchain-based security solution?
1. The degree of dependence on outsiders, whether for a single task or for many.
2. Whether the outsider can be trusted, and whether the validity of an exchange can be established without one.
3. Whether acceptance of a transaction is a requirement, and whether the credibility of the framework, and hence the reliability of the information it holds, is important.
4. Whether the data justify the overhead of preparation and classification: Blockchain requires an investment that is not appropriate when the data do not need to be accepted into the chain.
14.3 Blockchain as a Decentralized Security Framework
Blockchain is emerging as one of the most propitious and ingenious technologies of cybersecurity. In its germinal state, the technology has successfully
replaced economic transaction systems in various organizations and has the
potential to revamp heterogeneous business models in different industries.
Although it promises a secure distributed framework to facilitate sharing,
exchanging, and the integration of information across all users and third
parties, it is important for the planners and decision makers to analyze it
in depth for its suitability in their industry and business applications. The
blockchain should be deployed only if it is applicable and provides security
with better opportunities for obtaining increased revenue and reductions in
cost. This chapter outlines how the adoption of this security innovation can be expected to spread in a broad and immediate manner.
14.3.1 Characteristics of Blockchain
Figure 14.7 shows the characteristics of blockchain.
i. Decentralized systems: In blockchain, decentralization refers
to the transfer of control and decision-making from a centralized entity (individual, organization, or group thereof)
to a distributed network. Decentralized networks strive to
reduce the level of trust that participants must place in one
another, and deter their ability to exert authority or control
over one another in ways that degrade the functionality of
the network.
ii. Immutability: Immutability can be defined as the ability of
a blockchain ledger to remain unchanged, for a blockchain
to remain unaltered and indelible. More succinctly, data in
the blockchain cannot be altered.
Each block of information, such as facts or transaction
details, proceeds using a cryptographic principle or a hash
value. That hash value consists of an alphanumeric string
generated by each block separately. Every block contains
a hash or digital signature not only for itself but also for the previous one. This ensures that blocks are retroactively coupled together and tamper-evident. This functionality of
blockchain technology ensures that no one can intrude in
the system or alter the data saved to the block.
It is also important to know that blockchains are decentralized and distributed in nature, where a consensus is made
among the various nodes that store the replica of data. This
consensus ensures that the originality of data must be maintained. Undoubtedly, immutability is a definitive feature of
this technology. This concept has the ability to redefine the
overall data auditing process and makes it more efficient,
cost-effective, and brings more trust and integrity to the data.
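The hash-linking just described can be sketched in a few lines. This is an illustrative toy, not a production block format: the field names and JSON serialization are assumptions, and real blockchains also include timestamps, signatures, and consensus metadata.

```python
import hashlib
import json

# Illustrative block format: each block stores the hash of its predecessor,
# so altering any earlier block breaks every later link in the chain.
def block_hash(block):
    """Deterministic SHA-256 hash of a block's canonical JSON form."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash}

def verify_chain(chain):
    """True only if every stored prev_hash matches the predecessor's hash."""
    return all(curr["prev_hash"] == block_hash(prev)
               for prev, curr in zip(chain, chain[1:]))
```

Tampering with any block's data changes its hash, so the next block's stored `prev_hash` no longer matches and `verify_chain` fails, which is the immutability property described above.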
iii. Digital Cryptocurrency: This is the most recognizable component of a Blockchain, for example, Bitcoin (BTC) or Ethereum's Ether (ETH).
iv. Software Development Platform: Developers treat the chain itself as a platform: a highly secure foundation of decentralized software and cryptography on which to build. The APIs available for developing Blockchain applications may vary.
v. A Distributed Ledger: A Blockchain is an open record of the data from every exchange and every member, maintained with a degree of automation never achieved before. This innovation supports the exchange of assets and the keeping of accounts on the system. Every participant holds a duplicate of the records, rather than relying on a single direct copy, and can therefore validate exchanges. Security and accuracy are maintained through cryptography, with keys and signatures shared among the members.
vi. Minting: Minting is the process of validating information, creating a new block, and recording that information into the blockchain. This supports the currency and supply side of the system: each new record is reflected in every duplicate of the ledger within minutes or seconds. Here too, security and cryptographic precision are maintained as advantages through the keys shared among the members.
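As a rough illustration of minting, the following sketch validates an incoming record and then searches for a nonce whose block hash meets a difficulty target before the block can be appended. The record format, the trivial validation, and the difficulty value are all hypothetical; this mirrors proof-of-work mining, which is only one of several ways blocks are minted in practice.

```python
import hashlib

# Illustrative minting: validate the record, then search for a nonce whose
# block hash starts with `difficulty` leading zeros. All values are toy
# assumptions, not a real protocol.
def mint(prev_hash, record, difficulty=2):
    if not record:                      # trivial stand-in for real validation
        raise ValueError("empty record")
    nonce = 0
    while True:
        digest = hashlib.sha256(
            f"{prev_hash}{record}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return {"record": record, "prev_hash": prev_hash,
                    "nonce": nonce, "hash": digest}
        nonce += 1
```

Any node can cheaply re-check the result by hashing `prev_hash + record + nonce` once, even though finding the nonce took many attempts; that asymmetry is what makes the minted block easy to verify and costly to forge.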
Figure 14.7 Characteristics of Blockchain: decentralized; distributed digital ledger; transparent and verifiable; cryptographically secured; chronological and time-stamped; irrevocable and auditable; immutable and non-repudiable; reduces dependencies on third parties; "trustless" operation (based on consensus).
• Cryptography: Blockchain transactions are approved and trusted on the basis of cryptographic computation and verification performed by the participating parties.
• Anonymity: Bitcoin is often viewed as an untraceable method of payment that lets lawbreakers carry out transactions without being traced. In practice, this implies that users can carry out transactions with a high degree of anonymity.
• Transparency: Blockchain is supposed to be a transparency machine in which anyone can join the network and,
as a result, view all information on that network. In the
case of crypto currencies, the transparency of blockchain
offers users an opportunity to look through the history of
all transactions.
14.3.2 Limitations of Blockchain Technology
i. Greater expenses: Nodes, working on supply-and-demand principles, tend to seek a higher reward for completing transactions.
ii. Exchanges: Nodes prioritize exchanges carrying higher rewards, so backlogs of less rewarding exchanges develop.
iii. Smaller ledgers: It is not realistic for every node to hold a full copy of the Blockchain, which can affect the irreversibility of the agreed record.
iv. Transaction costs and network speed: Bitcoin transaction costs, described as nearly "free" in the early years, have since become substantial.
v. Risk of error: As long as the human factor is involved, a constant risk of error remains. If Blockchain is to serve as a database, all information entered must be of high quality, since errors introduced through human involvement cannot easily be corrected afterwards.
vi. Wasteful: Every node runs the blockchain in order to maintain consensus across the blockchain. This gives extreme levels of fault tolerance, ensures zero downtime, and makes data stored on the blockchain forever unchangeable and censorship-resistant. It is, however, inefficient, since every node repeats the same work to reach each agreement.
14.4 Existing Healthcare Data Predictive Analytics Using Soft Computing Techniques in Data Science
Healthcare organizations aim to derive valuable insights by employing data mining and soft computing techniques on the vast data stores that have
been accumulated over the years. This data, however, might consist of
missing, incorrect, and, most of the time, incomplete instances that can
have a detrimental effect on the predictive analytics of the healthcare data.
Preprocessing of this data, specifically the imputation of missing values,
offers a challenge for reliable modeling.
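As a minimal example of the imputation step mentioned above, missing numeric values can be replaced by the column mean. This is a sketch with hypothetical column names; real preprocessing pipelines typically use richer strategies (median, model-based, or multiple imputation), which matter precisely because naive imputation can bias the predictive analytics.

```python
# Mean-imputation sketch: replace missing values (None) in each numeric
# column with that column's mean, so downstream predictive models receive
# complete instances. Column names ("hr", "bp") are hypothetical.
def impute_mean(rows):
    """rows: list of dicts sharing the same numeric keys; None = missing."""
    keys = rows[0].keys()
    means = {}
    for k in keys:
        present = [r[k] for r in rows if r[k] is not None]
        means[k] = sum(present) / len(present)
    return [{k: (r[k] if r[k] is not None else means[k]) for k in keys}
            for r in rows]
```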
14.4.1 Data Science in Healthcare
A shift toward a data-driven socio-economic health model is occurring
as a result of the increased volume, velocity, and variety of data collected
from the public and private sectors involved in healthcare and science. In
this context, the last 5-year period has seen an impressive revolution in the
theory and application of computational intelligence and informatics in
health and biomedical science.
However, the effective use of data to address the scale and scope of
human health problems has yet to realize its full potential. The barriers
limiting the impact of practical application of standard data mining and
machine learning methods are inherent to the “big data” characteristics
that, besides the volume of the data, can be summarized in the challenges
of data heterogeneity, complexity, variability, and dynamic nature together
with data management and interpretability of the results.
14.5 Literature Review: Medical Data Security in Cloud Storage
In Dwivedi et al. [1], the authors analyzed health data security management and proposed a Blockchain-based approach. However, Blockchains are computationally expensive, demand high bandwidth and extra computing resources, and are not fully suitable for resource-constrained smart-city IoT devices. The authors used two neural network techniques, the Back Propagation Algorithm (BPA) and Radial Basis Function (RBF), and one non-linear classifier, the Support Vector Machine (SVM), and compared them in terms of efficiency and accuracy. They used the WEKA 3.6.5 tool to find the best of the three algorithms for kidney stone diagnosis. The main purpose of their work was to propose the best tool for medical diagnosis, such as kidney stone identification, in order to reduce diagnosis time and improve efficiency and accuracy. From the experimental results, they concluded that back propagation (BPA) significantly improved on the conventional classification techniques used in the medical field. In their model, additional privacy and security properties rest on sophisticated cryptographic primitives, yielding more secure and anonymous transactions for IoT applications on Blockchain-based data networks.
In Park et al. [2], the authors used data pre-processing, data transformation, and data mining approaches to elicit knowledge about the interaction between many measured parameters and patient survival. Two different data mining algorithms were employed to extract knowledge in the form of decision rules. Those rules were used by a decision-making algorithm that predicts the survival of new, unseen patients, and the important parameters identified by data mining were interpreted for their medical significance. They introduced a new concept in their research work, which was applied and tested using data collected at four dialysis sites. The approach presented in their paper reduced the cost and effort of selecting patients for clinical studies: patients can be chosen based on the prediction results and the most significant parameters discovered. The authors privately de-identified 300 patients and verified the approach by constructing a PHR network on Ethereum Blockchain version 1.8.4, a private Blockchain network whose nodes comprise the 300 patients and the hospitals. Their findings support the feasibility of exchanging data from genuine patients over a private Blockchain network, while noting that Blockchain-based management of medical data must take data management, cost, and privacy into account.
In Nguyen and Pathirana [3], the authors proposed a novel EHR-sharing scheme built on a decentralized structure: a mobile cloud platform combined with Blockchain. In particular, the system is designed to achieve secure sharing of EHRs between various patients and medical providers using a reliable access-control smart contract. They provide a prototype implementation of data-sharing scenarios using a real Ethereum Blockchain, mobile applications, and Amazon cloud computing. Empirical results suggest that the proposal provides an effective solution for reliable data exchange, protecting sensitive medical information from potential threats in the mobile cloud. Evaluations of the security model and shared analysis also show a lightweight design with improved performance: high security standards and minimal network latency for access control, with better data confidentiality than existing systems.
In Zheng and Mukkamala [4], the authors proposed a conceptual design for sharing personal health data securely and transparently, combining Blockchain technology with dynamic cloud storage for continuous information sharing. The main purpose of the proposed system is to allow users to share their personal health data in accordance with the General Data Protection Regulation (GDPR), with control and safe sharing for the benefit of each dataset. It also lets researchers use stored personal health data for high-quality research, and makes commercial data available to consumers in an effective way for commercial purposes. This work first characterizes personal health data (classified into dynamic and static categories) and the acquisition methods for health-related data (continuous and real-time data) enabled by mobile devices, and proposes a solution that uses hash pointers to manage storage space while dynamically sharing data of various sizes. Second, the proposed system integrates the dynamic Blockchain with cloud storage of health data of various sizes; the authors also propose that data be stored in the cloud in encrypted form, with the Blockchain holding only transactions and the metadata needed to share the data. Third, a data recognition module included in the proposed system is supported by hardware and software technology to control the quality of the data from both sides.
In Liu et al. [5], the authors proposed a privacy-preserving data-sharing scheme for Electronic Medical Records (EMRs), called BPDS. In BPDS, the original EMRs are stored securely in the cloud and their indexes are recorded in a tamper-proof consortium Blockchain. In this way, the risk of medical data leakage can be significantly reduced, and the index Blockchain ensures that EMRs cannot be arbitrarily changed. Access permissions can be granted automatically, according to policies the patients have established, through secure data-sharing smart contracts. The authors also presented work using machine learning techniques, namely, SVM and Random Forest (RF), which were used to study, classify, and compare breast cancer, liver disease, and heart disease datasets with varying kernels and kernel parameters. The results of RF and SVM were compared across these datasets, the kernels were tuned with proper parameter selection, and the results were analyzed to establish the better learning technique for prediction; varying results were observed with SVM classification under different kernel functions. By implementing the proposed BPDS, patients can access their data easily and retain full control of their own EMRs, preserving user privacy without risk to the institutions involved.
In Kaur et al. [6], the authors proposed a Blockchain-based platform that can be used to store and manage electronic medical records in cloud environments. In this study, they proposed a Blockchain-based model for structuring health data in cloud computing environments. Their contributions include the proposed solution and a discussion of future directions for medical data on Blockchain. The paper provides an overview of the handling of heterogeneous health data and a description of internal functions and protocols.
Theodouli et al. [7] developed MedBlock, a framework based on blockchain technology, to solve the data management and data sharing problems of an electronic medical records (EMRs) system and improve medical information sharing. Patients can access the EMRs of different hospitals through the MedBlock framework, avoiding the previous situation in which medical data were segmented across various databases. In addition, data sharing and collaboration via blockchain could help hospitals obtain a prior understanding of a patient's medical history before the consultation.
In Roehrs et al. [8], the authors presented the implementation and evaluation of a PHR model that integrates distributed health records using Blockchain technology and the openEHR interoperability standard. They concentrated on the diagnosis of optic nerve disease through the analysis of pattern electroretinography (PERG) signals with the help of an artificial neural network (ANN), implementing a multilayer feedforward ANN trained with a Levenberg-Marquardt (LM) BPA. The end results were classified as healthy or diseased, and the stated results showed that the proposed PERG-based method could produce an effective interpretation. The integrated approach focused on non-functional requirements, such as response time, in addition to evaluation against their stated criteria.
In Abdur Rahman et al. [9], the authors presented a secure therapy framework that allows a patient to own and control his or her personal data without any trusted third party, such as a therapy center. The Blockchain stores the medical data in a distributed (or centralized) database chain based on immutable hashes, covering metadata as well as the actual multimedia content: images, audio, video, data, and other augmented reality therapies in the application. This feature makes it possible to record every use or update of the metadata and multimedia data.
In Zheng et al. [10], the authors conceptualized the protected, transparent sharing of personal health data through a dynamic Blockchain combined with cloud storage. In addition, they provide a machine learning based quality-control checking module for the underlying data. The main objective of the proposed system is to allow individuals to share their personal health data in accordance with the GDPR, with control and security over each dataset of common interest. This allows researchers to use personal health data for high-quality research while effectively protecting it, and makes consumer and commercial data available for commercial purposes. The work first characterizes personal health data (grouped into dynamic and static categories) and the acquisition methods for health-related data (continuous and real-time) enabled by mobile devices, and integrates a solution that uses hash pointers to manage storage of data in a variety of sizes.
In Guo et al. [11], the authors proposed an attribute-based signature scheme built around the patient's rights, which supports verifying the validity of a message on the Blockchain according to the attributes it attests. In addition, the scheme relies on multiple authorities instead of a single trusted central one, which avoids the key-escrow problem, and its distributed data-collection mode makes Blockchain public/private keys compatible with patients in a distributed setting.
In Xia et al. [12], the authors proposed a framework that admits only invited, verified users, based on access permissions and Blockchain. As a result of this design, all users are known before being granted access, accountability is guaranteed, and their actions are stored in the Blockchain log. The system allows users to verify their identity and request data from the shared pool using cryptographic keys. The communication and authentication protocols and algorithms instituted in the proposed system are not yet fully investigated, and future studies to improve this work with further research will be of real interest. The authors state that the architecture described in their paper, a Blockchain-based access control system, is being implemented and tested.
In Xia et al. [13], the authors presented MeDShare, which uses a data-provenance and auditing mechanism to monitor the institutions that hold data in a depository. MeDShare performs comparably with current state-of-the-art solutions for data sharing between cloud service providers. By implementing MeDShare, cloud service providers and other custodians of data will be able to share medical data with medical institutions, including research institutions, with minimal risk to confidentiality and with full auditing.
In Zyskind et al. [14], the authors described a decentralized system for managing personal data, in which users themselves create and control their data. They implement a protocol that turns the Blockchain into an automated access-control manager that does not require trust in a third party. Unlike Bitcoin, transactions in their system are not strictly financial: they carry instructions such as storing, querying, and sharing data. Finally, they discussed future extensions in which Blockchain could be used to solve trusted computing problems in the community. The platform enables more: the Blockchain is intended as an access-control moderator combined with off-chain storage solutions. Users need not rely on third parties and are always aware of what data are being collected about them and how they are used. Companies, in turn, can focus on using data properly and on how to protect them, without being concerned about holding them. In addition, with a decentralized platform for sensitive data, compliance with legal rulings and rules on storage and sharing should become simpler; laws and regulations could even be programmed into the Blockchain so that they are enforced automatically. In other cases, the Blockchain record of access to data (or storage) could serve as legal evidence of whether the data were compromised.
In a review, Dhamodaran et al. [15] used data pre-processing, data
transformation, and data mining approaches to elicit knowledge about the
interaction between the many measured parameters and patient survival.
14.6 Conclusion
At the end of this review, the core points explored were considered in order
to identify the tangible next steps to be taken toward state-of-the-art
dissemination of these investigations, motivating investigators to learn
novel platforms and procedures and to identify venues dedicated to
publicizing the discoveries of their efforts. Still, what constitutes real
propagation (in terms of influence and return on investment) remains
uncertain. Investigators need better and clearer guidance on how best to
plan, resource, and enable their dissemination activities. Experienced,
knowledgeable, and skilled human assets are a prime qualification for any
established development. All traditional healthcare systems are handling a
severe crisis of manpower growth, deployment, and leadership. There is no
appropriate automated guidance for conventional doctors. Academic and other
establishments are falling behind, with deprived financial funding,
scheduling, organization, measurement, forecasting, and governance. It is
trusted that the evidence provided will support researchers in the time
ahead in addressing the complications of reviewing remedial healthcare and
the problems inherent in this form of study.
Medical Data Security Using Blockchain
287
References
1. Dwivedi, A.D., Srivastava, G., Dhar, S., A Decentralized Privacy-Preserving
Healthcare Blockchain for IoT. Sensors, 19, 2, 326, 2019, www.mdpi.com/
journal/sensors.
2. Park, Y.R., Lee, E., Na, W., Is Blockchain Technology Suitable for Managing
Personal Health Records? Mixed-Methods Study to Test Feasibility. J. Med.
Internet Res., 21, 2, e12533, 2019.
3. Nguyen, D.C. and Pathirana, P.N., Blockchain for Secure EHRs Sharing
of Mobile Cloud Based E-Health Systems, Special Section On Healthcare
Information Technology For The Extreme and Remote Environments. IEEE,
7, 66792–66806, 2019.
4. Zheng, X. and Mukkamala, R.R., Blockchain-based Personal Health
Data Sharing System Using Cloud Storage. 2018 IEEE 20th International
Conference on e-Health Networking, Applications and Services (Healthcom),
IEEE, 2018.
5. Liu, J., Li, X., Ye, L., Zhang, H., BPDS: A Blockchain based Privacy-reserving
Data Sharing for Electronic Medical Records, arXiv:1811.03223v1 [cs.CR], 8
Nov 2018.
6. Kaur, H., AfsharAlam, M., Jameel, R., A Proposed Solution and Future
Direction for Blockchain-Based Heterogeneous Medicare Data in Cloud
Environment. J. Med. Syst., Springer, 42, 8, 1–11, 2018.
7. Theodouli, A., Arakliotis, S., Moschou, K., On the design of a Blockchain-based system to facilitate Healthcare Data Sharing, 2018.
8. Roehrs, A., André da Costa, C., da Rosa Righi, R., Analyzing the Performance
of a Blockchain-based Personal Health Record Implementation. J. Latex
Class Files, 92, 103140, October 2018.
9. Abdur Rahman, Md., Shamim Hossain, M., Hassanain, E., Blockchain-Based
Mobile Edge Computing Framework for Secure Therapy Applications. IEEE,
6, 2169–3536, 2018.
10. Zheng, X., Mukkamala, R.R., Vatrapu, R.K., Ordieres, J., Blockchain-based
Personal Health Data Sharing System Using Cloud Storage. IEEE 20th
International Conference on e-Health Networking, Applications and Services
(Healthcom), 2018.
11. Guo, R., Shi, H., Zhao, Q., Zheng, D., Secure Attribute-Based Signature
Scheme with Multiple Authorities for Blockchain in Electronic Health
Records Systems. IEEE, 6, 2169–3536, 2018.
12. Xia, Q., Sifah, E.B., Asamoah, K.O., Gao, J., MeDShare: Trust-Less Medical
Data Sharing Among Cloud Service Providers via Blockchain, Digital Object
Identifier. IEEE, 5, 2169–3536, 2017.
13. Xia, Q., Sifah, E.B., Asamoah, K.O., Gao, J., MeDShare: Trust-Less Medical
Data Sharing Among Cloud Service Providers via Blockchain. IEEE, 5,
2169–3536, 2017.
14. Zyskind, G., Nathan, O., Pentland, A.S., Decentralizing Privacy: Using
Blockchain to Protect Personal Data. CS Security and Privacy Workshops,
IEEE, 2015.
15. Dhamodaran, S. and Balmoor, A., Future Trends of the Healthcare Data
Predictive Analytics using Soft Computing Techniques in Data Science. CVR
J. Sci. Technol., 16, 89–96, June 2019.
16. Mishra, A., An Authentication Mechanism Based on Client-Server
Architecture for Accessing Cloud Computing, International Journal of
Emerging Technology Advanced Engineering, ISSN 2250-2459, 2, 7, 95–99,
July 2012.
15
Electronic Health Records:
A Transitional View
Srividhya G.*
Vels Institute of Science Technology and Advanced Studies, Chennai, India
Abstract
The electronic health record (EHR) is an unavoidable and vital tool for the medical and behavioral health professionals. It is difficult to imagine what patient care
would appear like today without EHRs, especially when these systems substitute
the paper record maintenance procedures. Documenting medical records of
patient data which included the nature of disease, symptoms, treatments, and medicines prescribed started at the very early age of Egyptians. Nowadays, the health
record has evolved to a far extent that with any one patient ID, the entire medical
history of the patient can be retrieved by the consulting physicians. This
chapter traces the history and evolution of the health record system, starting
from the Egyptian era, when the first health record was written, up to the
computerized health record system. This chapter also includes the various documentation
procedures for the health records that were followed from the ancient times and
contributions by various civilizations around the world.
Keywords: EHR, health record, data, documentation, symptom, treatment
15.1 Introduction
The electronic health record (EHR) is an unavoidable and vital tool for the
medical and behavioral health professionals. It is difficult to imagine what
patient care would appear like today without EHRs, especially when these
systems substitute the paper record maintenance procedures. Documenting
medical records of patient data which included the nature of disease,
Email: srividhya.se@velsuniv.ac.in
R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul (eds.) The Internet of Medical Things (IoMT):
Healthcare Transformation, (289–300) © 2022 Scrivener Publishing LLC
Figure 15.1 Evolution of EHR: Egyptian hieroglyphs → Greek literature →
Arabic civilization record → European medical record system → Swedish study
→ American EHR.
symptoms, treatments, and medicines prescribed started at the very early
age of Egyptians. Early patient medical records included brief, written case
history reports maintained for teaching purposes. As shown in Figure 15.1,
the evolution of the electronic health record dates from Egyptian medical
transcripts and became common practice with the American health record
system, owing to the invention of computers. Between approximately 3000
and 1600 BC, doctors filed patient health records in ancient Egyptian
hieroglyphs [1]. However, the creation and maintenance of paper medical
records were not in regular practice until the early 1900s. Nowadays, the health
record has evolved to a far extent that with any one patient ID, the entire
medical history of the patient can be retrieved by the consulting physicians.
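The retrieval capability described above, where one patient ID yields the entire medical history, can be illustrated with a minimal sketch. The `RecordStore` and `Encounter` names and fields below are hypothetical, invented for exposition; real EHR systems use standardized schemas and persistent, access-controlled storage rather than an in-memory dictionary.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Encounter:
    # A single documented visit: the same elements recorded since antiquity.
    date: str
    symptoms: str
    diagnosis: str
    treatment: str

class RecordStore:
    """Hypothetical in-memory EHR store keyed by patient ID."""

    def __init__(self):
        self._histories = defaultdict(list)

    def add(self, patient_id: str, encounter: Encounter) -> None:
        self._histories[patient_id].append(encounter)

    def history(self, patient_id: str) -> list:
        # One patient ID retrieves the entire recorded medical history.
        return list(self._histories[patient_id])
```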
During the ancient era of medical records, health information was written
and maintained on paper, so only one copy of the medical record was available
for reference. The development of computer technology in the 1960s and 1970s
laid the foundation for the significant progression of the electronic health
information system.
15.2 Ancient Medical Record, 1600 BC
The ancient Egyptians were fastidious recorders of their history and they
had thousands of scribes to record the medical data. Papyrus is a material
made from a water plant; the Egyptians used this material, and the first
medical record was transcribed on scrolls of papyrus. The records transcribed
by the Egyptians more than 4,000 years ago acknowledge that medicine was
practiced in many forms, from general medicine and dentistry to surgery.
Around 1600 BC, the Egyptians documented a surgical procedure for war wounds
on papyrus [11]. This might be the first known medical record [7].
It attests to the Egyptians’ knowledge of the relation of the pulse to the
heart, and also of the workings of the stomach, bowels, and larger blood
vessels [12]. Several case histories originated from Hippocrates and were
written several hundred years later.
15.3 Greek Medical Record
In 129 AD, during the Roman Empire, a Greek physician named Galen compiled
extensive literature on the diseases and treatments of patients. Around
700 AD, a Greek physician named Paul of Aegina wrote seven books, known as
the Medical Compendium. The Byzantine Empire, which existed between 330 AD
and 1453 AD, formed the junction between Greco-Roman medicine and Arabic
medicine [6, 7].
15.4 Islamic Medical Record
During the eighth century, Greek medicinal knowledge influenced the Arabs
to develop Islamic medicine, and they were the first to introduce the concept
of hospitals. The Arabs pioneered the keeping of written records of patients
and the details of their treatments. The medical knowledge passed
on by Rhazes is a compilation and a synthesis of Arabic civilization’s
achievements in the early Middle Ages. It consisted of scientific advancements of ancient Greece and the entire Hellenic world, as well as ancient
Indian civilizations taking their roots in the very first human civilizations
of Harappa and Mohenjo-Daro, where medicine was practiced on a relatively high level.
Another accomplished physician of the early Islamic civilization is Ibn Sina.
Ibn Sina studied law and the natural sciences, which helped him develop an
analytical approach in his medical texts, encapsulated in over 400 books.
The most fundamental of these is “Kanun fi’t-tibb” (translated as “The Canon
of Medicine”; in Europe, it is known as “Canon medicinae”). The medical
knowledge it contained was
highly organized. The entire “Canon medicinae” consisted of five books,
each of which was divided into parts and then chapters. They described
a variety of cases based on the previous educational health records [3–6].
15.5 European Civilization
In the early 18th century, everything in nature was categorized and
described. During the 17th century, knowledge of natural science increased,
owing to observations from the dissection of cadavers. This growth in medical
applications stimulated the systematic recording of case histories for
anatomical purposes.
This civilization brought the rapid development of the natural sciences
in Europe, as a consequence of curiosity awakened by the renaissance.
Post-mortem examinations were being conducted on a previously unheard-of
scale, which provided material for a gigantic number of health records. The
phenomenon proved favorable to the development of science as a whole
[8, 9].
While discussing the health records created during that time, it is
impossible not to mention Philip Verheyen (1648–1710), who had his left
leg amputated during the second year of his studies. In 1693, Verheyen
started performing post-mortem examinations on his amputated leg,
which resulted in the discovery of the Achilles tendon. Based on his exemplary
notes, he wrote and published a book called “Corporis Humani Anatomia”.
In the first decade of the 18th century, it was considered the best medical
textbook by the majority of European universities [10, 11].
15.6 Swedish Health Record System
In 1728, the Swedish physician Rosén von Rosenstein received his doctoral
degree in medicine at the University of Harderwijk in Holland, with a
dissertation on the correct documentation of disease progression and the
improvement status of patients at various stages. This inspired the
scientists of that time to determine the principles of patient record
preparation, including details of the patient’s surroundings. Rosén von
Rosenstein was the first physician to introduce the careful noting of patient
details, symptoms, diagnosis, and treatment, along with social conditions.
With the inauguration of the Seraphim Hospital in Stockholm, Sweden, in 1752,
the first medical record system was developed and systematic medical health
recording began.
15.7 French and German Contributions
The quantitative and qualitative measurements that began emerging from the
French clinical school and German laboratories were a significant stimulus
for systematically recording medical data [7, 8]. To test hypotheses on the
causes of disease or the efficacy of therapy, many mathematical methods were
applied, such as Pierre Louis’s numerical method on many case histories [19].
In Paris, Hôtel-Dieu hospital became an important center for development of medicine and medical education thanks to Pierre Foubert (1696–
1766) and Pierre-Joseph Desault (1744–1795). Everyday checkups on
patients were obligatory and provided data needed for research. In 1791,
Pierre-Joseph Desault established “Journal de chirurgerie”, which included
the most interesting cases he came across, with his personal comments.
In that way, for the first time in modern Europe, in-depth health records
became not only a set of tips for treating patients but also a basis for
scientific research.
15.8 American Descriptions
The amount of health records in the form of sketches and descriptions
made up until the beginning of the 18th century is difficult to estimate.
Meanwhile, an accomplished American physician Benjamin Rush (1745–
1813) educated in Edinburgh, Scotland kept very detailed health records of
his patients in the form of a book. Nowadays, his work is considered to be
an archetype for medical history.
The United States started developing a permanent patients’ case record
system independently from Europe. According to American sources, the stepping
stone in the process was the introduction, in 1793, of The Book of Admissions
and The Book of Discharges in a New York hospital opened in 1791 [13, 14].
The Governors of the New York Hospital’s society approved
the first hospital rules in 1793. The hospital dispensary prepared and delivered a monthly report of the “Names and Diseases of the Persons, received,
deceased or discharged details in the same, with the date of each event, and
the place from when the Patients last came”.
In the 18th century, diagnostic technique depended upon the symptoms caused
by the disease, and physical examination also played a significant role. The
New York hospital gathered notes from physicians’ notebooks and entered them
in bound medical and surgical volumes preserved in the library. By the end
of the 19th century, medical records had been used
as legal documents for insurance so that malpractices in hospitals could
be identified and minimized. Medical records contained tabulations of
admissions and discharge details of patients to document expenditures.
Medical records similar in structure to modern ones were first developed
for educational purposes. The reviewed sources mention ancient Egyptian
medical papyri. In 1862, an American Egyptologist bought a manuscript written
between 1600 and 1700 BC, which was named after him: the “Edwin Smith
papyrus”. It is the oldest known medical script about
various injuries. It describes the methods of examination and determination of a diagnosis and ends with a treatment plan. Another example,
the “Ebers papyrus”, bought in the 19th century by a German Egyptologist of
the same name, was an extensive source of knowledge about the treatments, surgical
procedures, and healing herbs known in ancient Egypt [17].
In 1724 in Berlin, formerly the capital of Prussia, a garrison hospital
was rearranged into a collegium medico-chirurgicum, later called Charité
by Frederick William I of Prussia. The first director of the institution was
Johann Theodor Eller (1689–1760), the Royal Doctor. One of the routines
in the college was everyday inspection of patients conducted by junior
surgeons, which involved writing up the patient’s condition and the history
of treatment in the form of a journal. Johann T. Eller considered it the
best form of education, one that enabled doctors to gain new skills and
brought benefits to patients. He introduced a hierarchical system where
health records were a form of communication between experienced physicians and their pupils. All these modern ideas fell into the concept of
enlightened absolutism, the Prussian version of Enlightenment. The strong
centralized political power of the monarch supported by the developing
bureaucracy became an example to follow in institutions such as Charité.
It also influenced the way of creating health records [19, 20].
It was not until 4 June 1805 that Dr. David Hosack, now best known for
attending to Alexander Hamilton after his fatal duel with Aaron Burr [8],
suggested recording the cases of greatest value for the teaching of medical students [17]. The Board of Governors agreed: “The house physician,
with the aid of his assistant, under the direction of the attending physician,
shall keep a register of all medical cases which occur in the hospital, and
which the latter shall think worthy of preservation, which book shall be
neatly bound, and kept in the library for the inspection of the friends of the
patients, the governors, physicians and surgeons, and the students attending the hospital” [18]. Few entries were written initially [19], and the first
casebook consequently spanned from 1810 to 1834 [21].
In the mid-1800s, at Massachusetts General Hospital, a physician
recorded his findings on admission and also recorded notes given by
attending physicians. Notes were also copied from the hospital casebooks
and recorded in bound ledgers for future reference whenever required.
At Harvard Medical School and Harvard Law School, the recorded data were
used for teaching case studies [19].
A quarter century after Hosack’s proposal, a physician proposed in 1830
that all cases be recorded [20]. The Board of Governors added that “no
Assistant shall be entitled to the appointment of House Physician or House
Surgeon until he shall have entered on the Register at least twelve cases”
[21]. Still written in retrospect, these casebooks demonstrate a slow evolution of practice, with no clear change in the recording methodology until
the mid-1860s.
In major medical centers of Paris and Berlin, medical records were in the
form of loose paper files. Later at the end of the 19th century, these medical paper files had family history, patient habits, previous illnesses, present
illness, symptoms, physical examination, admission, urine and blood analyses, and discharge report and instructions. These medical records were
arranged in serial numbered bound volume books. Inpatient medical and
surgical details and treatment details were maintained separately. Hence,
patient data were widely distributed and hard to retrieve in complete form.
Medicine and health records may simply be connected to the term “hospital”. However, in medieval Europe, unlike nowadays, hospitals were treated
as asylums for the poor and ill. They were managed mainly by convents,
which was an effect of the Christian moral imperative to do good and show
mercy to those in need.
Civilizations were functioning independently from each other, so their
health records differed accordingly. Regardless of the place, their primary
purpose was determined by the administrative organs, almost always
connected to the church. The lists of patients admitted and released from
the hospitals have been kept in many such institutions and are nowadays
considered one of the first examples of medical data archiving in Europe.
Medieval health records can be considered as more autonomous than
ancient ones, and a habit of documenting the medical procedures or observations became a constant element in medical practice [19].
Because the Governor’s Council required annual reports, staff’s duties
regarding health records were clearly defined: hospital admissions, discharges, the results of the treatments, and expenditures. Putting together
admissions and discharges was necessary to document the medical
achievements, but also to justify the expenditures [2]. That is why in 1830,
all patients were supposed to be registered, and their numbers were obligatorily connected to the prospects of the doctors’ promotions.
By the early 1860s, case histories became more elaborate. They included
negative as well as positive information and began to reflect thought processes similar to those of today. Descriptions of hospital courses included
some data but often skipped many days at a time. The Queen Elizabeth
Hospital was chosen for record preparation in 1874 because it was a
metropolitan institution, had been a specialty hospital in its early days,
and its records were easily accessible and available for study [19, 21].
A common, general classification method adopted by all the hospitals
provided a basis for comparing the medical records. The records were divided
into various classes according to the order of admission and the type of
disease of the patients. The hospitals of that time had common features in
their administration, so these records had broad and meaningful similarity
in purpose, structure, and function [18, 19]. The accumulated records of
many hospitals are arranged on the basis of a common classification
procedure, providing a large and diverse source for retrieving patient
records and for studying the general record-maintaining function.
Since 1880, health records in the US and Europe have become a subject of
insurance matters and of possible abuse in this regard.
Along with the development of medical insurance, health records were
becoming increasingly significant. The changes became noticeable as late
as the mid-19th century when doctors started registering data of all their
patients. Universal templates of health records were introduced to avoid
confusion during case conferences.
During the phase in which medical records were being transformed into other
forms, paper remained the medium for storing information in hospitals. The
growing specialization in healthcare that began emerging in the second half
of the 19th century affected the structures of hospitals and the form of
medical records. The sheer number of records was also growing ever larger,
and records were copied and preserved in libraries. Along with the
development of medical records, financial ledgers, wage and salary books,
and case files were also developed. Changes were brought about in the way
hospitals prepared and arranged medical records. For standardization,
manuscripts were replaced by typescripts between 1890 and 1945. Loose paper
files were also replaced by bound volumes in the medical record offices [23].
After 1890, the maintenance of records in typescript became common in the
“Policy and Management” and “Patient Care” record categories.
Until the year 1890, all types of hospital records in all categories were in
the form of manuscripts. By 1948, minutes, reports, clinical summaries,
hospital correspondence, etc., had been taken over by typescripts; only financial records were still prepared in manuscript form [23].
Introducing universal history-of-the-present-illness forms and diagrams
at the beginning of the 20th century became common practice. It was a
result of applying some of the models already used in economics that had
proved to be effective, such as displaying information in a graphic form
[19, 21]. In 1916, in the US, there was a recommendation of writing down
the basic information about the illness in a standardized form. In 1918,
the American College of Surgeons decided that registering all patients in
all hospitals in order to better monitor their treatment and compare the
results was a necessity. Offices and administrative networks were created to
keep the centralized registers in order. Hospitals started hiring professionals to handle the statistical data derived from the records.
15.9 Beginning of Electronic Health Recording
The base of the health information system can be traced back to the 1920s
when medical practitioners started using medical records to document
details of patients, their complications, treatments given, and recovery
status of the patient at the time of discharge [20]. In 1928, the American
College of Surgeons (ACOS) formed the Association of Record Librarians of
North America (ARLNA), now renamed the American Health Information
Management Association (AHIMA), to standardize medical records. The ARLNA
decided how to standardize the usage of medical records and how this
information should be documented.
As new technological alternatives to paper documentation were developed,
proposals for replacing the traditional medical charts with electronic
systems began to appear. The drastic change in keeping medical records, a
gradual process of introducing EHRs, began in the 1960s. Initially, the data
were filled in using punch cards, which proved to be a tedious process.
However, it allowed the data collected from diagnostics to be evaluated and
used for research, educational, therapeutic, economic, and administrative
purposes more efficiently than paper-based documentation.
Since the existing technologies were not mature enough, at the initial
stages those propositions were much ahead of their time. With time this
changed and there were at least four major technological leaps, which
moved the idea of EHR from the realm of a futuristic concept into reality
[23]. The technological leaps include creation of mainframe computers,
the invention of personal computing, the development of the Internet and
cloud computing technology, and the availability of hand-held devices.
In 2009, HITECH (the Health Information Technology for Economic and
Clinical Health Act) instructed all medical centers to introduce health
record systems in all hospitals and health centers [19].
Currently, around 80% of hospitals and doctors’ offices use EHR systems,
which has allowed big databases of patients to be created. These databases
serve as sources of information for treatment plans, the modeling of
potential costs, the clearance of medical procedures, and research.
15.10 Conclusion
In light of the ongoing COVID-19 pandemic, the application of EHRs could
be very beneficial for better coordination among hospitals handling COVID-19
patients. The symptoms of COVID-19 may be indistinguishable from those of
common flu or fever, so finding common patterns of the disease among larger
numbers of patients could improve diagnostic procedures, and disease cases
in hospitals or among people in home quarantine would be easily accessible.
References
1. Gillum, R.F., From papyrus to the electronic tablet: a brief history of the
clinical medical record with lessons for the digital age. Am. J. Med., 126, 10,
853–857, 2013.
2. Craig, B.L., Hospital records and record-keeping, c. 1850-c. 1950. Part I:
The development of records in hospitals. Archivaria, 29, 57–80, 1989-1990.
3. Magner L.N., A History of Medicine. 2nd ed. Published by Taylor & Francis
Group, Boca Raton, 2005.
4. Amr S.S. and Tbakhi A., Ibn Sina (Avicenna): the prince of physicians. Ann.
Saudi Med. 27, 2, 134–135, 2007.
5. Markatos, K., Androutsos, G., Karamanou, M., Kaseta, M., Korres, D.,
Mavrogenis, A., Spine deformities and trauma in Avicenna’s Canon of
Medicine. Int. Orthop., 43, 5, 1271–1274, 2019.
6. Moosavi, J., The place of Avicenna in the history of medicine. Avicenna J.
Med. Biotechnol., 1, 1, 3–8, 2009.
7. Dalianis, H., Clinical Text Mining, Secondary Use of Electronic Patient Records,
Springer.
8. Reiser, S.J., The clinical record in medicine. Part 1: learning from cases. Ann.
Intern. Med., 114, 10, 902–907, 1991.
9. Musil, V., Stingl, J., Bacova, T., Baca, V., Achilles tendon: the 305th anniversary of the French priority on the introduction of the famous anatomical
eponym. 33, 5, 421–427, 2011.
10. Sakai, T., Historical evolution of anatomical terminology from ancient to
modern. Anat Sci. Int., 82, 2, 65–81, 2007.
11. Al-Awqati, Q., How to write a case report: lessons from 1600 B.C. Kidney Int.,
69, 12, 2113–2114, 2006.
12. Salem, M.E. and Eknoyan, G., The kidney in ancient Egyptian medicine:
where does it stand? Am. J. Nephrol., 19, 140–147, 1999.
13. Richet, G., La néphrologie naît avec Pierre J. Desault en 1785-179. Nephrologie,
24, 8, 437–42, 2003.
14. Engle, R.L., Jr., The evolution, uses, and present problems of the patient’s
medical record as exemplified by the records of the New York Hospital from
1793 to the present. Trans. Am. Clin. Climatol. Assoc., 102, 182–189, 1991.
discussion 189–192.
15. Lorkowski, J. and Jugowicz, A., The Historical Determinations of Creating
Health Records – A New Approach In Terms Of The Ongoing Covid-19
Pandemic, Poland, 2020, 10.20944/preprints202005.0352.v1.
16. Erskine, A., Culture and Power in Ptolemaic Egypt: The Museum and Library
of Alexandria. Greece Rome, 42, 1, 38–48, 1995.
17. Amr, S.S. and Tbakhi, A., Ibn Sina (Avicenna): the prince of physicians. Ann.
Saudi Med., 27, 2, 134–135, 2007.
18. Gillum, R.F., From Papyrus to the Electronic Tablet: A Brief History of the
Clinical Medical Record with Lessons for the Digital Age. Am. J. Med. 126,
10, 853–857, 2013, http://dx.doi.org/10.1016/j.amjmed.2013.03.024.
19. Reiser, S.J., The clinical record in medicine. Part 2: reforming content and
purpose. Ann. Intern. Med., 114, 11, 980–985, 1991.
20. Engle, R.L., Jr, The evolution, uses, and present problems of the patient’s
medical record as exemplified by the records of the New York Hospital from
1793 to the present. Trans. Am. Clin. Climatol Assoc., 102, 182–189, 1991.
discussion 189–192.
21. Siegler, E.L., The evolving medical record. Ann. Intern Med., 153, 10, 671–
677, 2010.
22. Fry, J., Five Years of General Practice. BMJ, 2, 5059, 1453–1457, 1957, doi:
10.1136/bmj.2.5059.1453.
23. Craig, B.L. Hospital Records and Record-Keeping, c. 1850-c. 1950, Part 1:
The Development of Records in Hospitals. Society of American Archivists,
United States, 29, 57–87, Winter 1989/1990.
Index
23andMe, 158
Access control, 30
Access control–based security, 27–30
Adaptive type-2 fuzzy learning
(T2-FDL) method, 38–39
Advanced message queuing protocol
(AMQP), 181
AlexNet, 73, 78, 79, 83, 87–89, 93, 95
Amazon’s web database, 151
American descriptions, 293
Anonymity, 280
Anonymity, integrity, and
compatibility (CIA), 157
Artificial bee colony (ABC), 37
Artificial neural network (ANN), 212,
284
Ethereum (ETH), 278
Authentication, 27–30
Authority, 30
Auto-regressive, 64
AVR module, 109
Back propagation algorithm (BPA),
281, 282
Ballooning, 51
Beer-Lambert law, 212
Beginning, 297
Big data,
variety, 248
velocity, 248
volume, 248
Bilateral filter, 38
Binding energy, 1, 15–18
Biomedical big data, 247
Biomedical big data types,
administrative and claims data, 252
clinical research and trials data,
254
electronic health records, 252
international patient disease
registries, 252
national health surveys, 253
Biomedical data acquisition,
smart watches, 255
smartphones, 255
wearable sensor suit, 254
Biomedical data management,
apache hadoop, 258
apache spark framework, 257
clustering algorithms, 259
DBSCAN, 261
fuzzy c-means clustering, 260
K-means clustering, 259
MapReduce, 258
Biometrics, 29
Bitcoin (BTC), 278
Bitmap, 58–60, 62–63, 72
UK Biobank, 158
Blinking, 101
Blockchain, 36, 40
applications, 274–276
architecture, 272–273
as a decentralized security
framework, 277–280
existing healthcare data predictive
analytics, 281
general applications of, 276–277
literature review, 281–286
types of blockchain architecture,
273–274
Blockchain technologies, 148
Blood pressure, 187, 190, 193–196,
200
BPDS, 283
Brain dead, 100
Brain tumor, 73
Brain waves, 111
Business openings, 165–166
Certificate-based authentication, 28
Challenges faced in customizing
wearable devices, 240
Civilization, 290–292, 295
Client experience, 166
Cloud computing, 24, 45–46, 54,
70–72
Cloud server, 31
Cloud service provider platform
(CSP), 24
Cloud storage, 26
Coma, 100–101
Constrained application protocol
(CoAP), 180
Context time analysis, 153
Coordinator node, 190, 194–195,
198–199, 201
Cost decrease, 165
CPOE (computerized physician order
entry) systems, 156
CPRI, 156
Crossover, 197–198, 201
Cryptographic co-processor, 41
Cryptography, 280
CTAKES, 153
Curcumin, 1, 3, 5, 18
Cyber security, 147
Cyber-physical structure (CPS), 149
Data,
aggregation process, 275
capture, 24–25
classification in cloud computing, 32
cleaning, 25
confidentiality, 36
controller, 27
quality validation, 270
science in healthcare, 281
security, 26
storage, 25–26
trash, 27
Data center, 45–47, 51–52, 56, 65, 67,
69, 70
Data classification,
access control, 32–33
content, 33
in cloud computing, 32
soft computing techniques for,
34–35
storage, 33–34
Data distribution system (DDS), 183
Data security, 148
Deep learning, 73, 78
Deep learning models,
convolutional neural networks,
128–130
deep belief networks, 130–131
deep stacking networks, 131–132
LSTM/GRU networks, 127–128
recurrent neural networks, 125–127
Denial-of-service, 52
Destination, 47, 52, 53, 56, 57, 59–64,
66
Diabetes mellitus, 207
Diabetics, 188, 195, 200
DiabLoop, 150
DiabLoop IoMT program, 152
Diagnosis, 292, 294
Dictionary learning method, 38
Digital crypto currency, 278
Digitalized healthcare system, 174
Directory-based authentication, 28
Dirty pages, 55, 57–63, 66–67
Dissection, 292
Docking, 1, 3, 5, 9–11, 13–15, 17–18
Downtime, 45, 52, 53–60, 62, 64–71
Economics, 297
Educational health records, 292
EEG signals, 110
EEOOA (energy efficient on/off
algorithm), 149
E-health, 189–192, 199
Electroencephalograph (EEG), 74
Electroencephalography (EEG), 101
Electronic health record (EHR), 174
Electronic health records (EHRs),
evolution of,
applications of electronic health
records, 150–155
challenges ahead, 157–158
cyber security, 147
Internet of medical items (IoMT),
144–145
literature review, 148–150
materials and methods, 147–148
results and discussion, 155–157
telemedicine and IoMT, 145–147
Ellagic acid, 1, 3, 5, 17–18
End-to-end delay, 198, 201–203
Epidermal growth factor receptor (EGFR), 1–8,
10–18
ERRAT, 1, 4, 10, 14
Ethereum blockchain, 36
European civilization, 292
Extensible messaging and presence
protocol (XMPP), 181
Fault, 45, 51, 53, 55, 68
Financial institutions, 277
Financial ledgers, 296
Firewalls, 47–48
FitBit, 272
French and German contributions, 293
FuelBand, 272
Future of IoMT, 164
Fuzzy-based artificial bee colony
(FABC), 37
Fuzzy c-means (FCM), 37
Fuzzy filtering, 37
Fuzzy logic-neural networks, 39
Fuzzy smoothing, 37
General data protection regulation
(GDPR), 283, 285
Genetic algorithm, 37
Genetic algorithm backpropagation,
36
Glioma, 73, 75, 76, 80, 93
Global data space (GDS), 183
Global organizations, 30
Glucose, 207
Google docs, 273
GoogLeNet, 73, 78, 79, 83, 87–89, 93, 95
Greek medical record, 291
Grid-based authentication, 29
Health monitoring center (HMC),
190–193, 195, 197, 199–202, 204
Healthcare service, 173, 174, 185
Heart rate sensor, 106–107
Helix, 158
HER, describes blurred system
architecture, keyword search,
role and purpose of design, 31
HIPAA, 26
Host, 48–50, 53–61, 66–67
Hybrid cloud, 47
Hyperglycaemia, 207
Hypertext transfer protocol (HTTP),
178, 179, 180
Hypervisor, 48, 51, 52, 54, 66, 69
Hypoglycaemia, 209
IBM’s X-Force Red, 27
Immutability, 278
Improved drug control, 147
In-clinic segment, 163
Infrastructure as a service, 45, 47
In-home segment, 162
In-hospital segment, 163–164
Insulin, 207
Insulin pump, 272
Intensive, 54, 65, 68, 71
Internet of medical items (IoMT),
144–145
and telemedicine, 145–147
Interpretation of deep learning with
biomedical data, 132–139
IoMT,
architecture, 175, 176, 177
platform, 175, 177
testing process, 184
IoMT environment, 168–169
IoMT in developing wearable health
surveillance system, 226
IoMT pandemic alleviation design,
169–170
IoMT progress in COVID-19
situations, 167–168
IoT blockchain, 36
IoT cloud, 187, 190, 192–193
Islamic medical record, 291
Iterative, 52, 57–59, 61
Knowledge-based authentication, 29
K-ras oncogene protein, 1–3, 6, 9,
12–13
Large institute, 158
Levenberg-Marquardt (LM), 284
Lookup request, 195–196, 200
Machine authentication, 29–30
Major applications of IoMT, 171
Man-made consciousness and large
information innovation in IoMT,
170–171
Medical compendium, 291
Medical data classification in cloud
computing,
access control–based security, 27–30
data classification, 32–35
introduction, 24
related work, 36–41
security in medical big data
analytics, 24–27
system model, 30–31
Medical data security, 270–286
blockchain, 272–277
blockchain as a decentralized
security framework, 277–280
existing healthcare data predictive
analytics, 281
in cloud storage, 281–286
introduction, 270–272
Medical record, 290–298
Medicine, 290–294
MeDShare, 285
Memory pages, 45, 52–53, 55–57,
59–62, 70
Meningioma, 73, 75, 76, 80, 93
Mesh backbone, 196–198, 204
Mesh gateway, 191, 199
Message queue telemetry transport
(MQTT), 179
Metadata, 27
Migration time, 45, 52, 53, 55–57,
59–60, 62–63, 65–71
Minting, 279
Mitigation, 52
Monitoring system, 99
MRI databases, 38
Mutation, 197
Myriad genetics, 158
Natural language processing system
(NLP), 153
Network, 45–46, 48–50, 53, 65–66
Network bandwidth, 49
Network interface cards, 50
Network segment layer, 163
Neural network, 36
Neural network backpropagation, 36
Neural networks-genetic algorithms,
39
Node, 50, 52–53, 56–57, 60–63, 71
Non-invasive, 210
On-body segment, 162
One-time password (OTP), 30
OS, 48–52, 68, 70
Overhead, 53, 58, 66, 68–69
PaaS (as a protection of data services),
41
Pacemakers, 272
Papyrus, 290–294
Pattern electroretinography (PERG),
284
Personal health record (PHR), 36
Pharmacies, 276
PHI storage organizations, 26
PHR ethereum blockchain version
1.8.4, 282
Pituitary tumor, 73, 75, 76, 80, 93
Platform as a service, 47
Point-of-care, 187–189
Portability and nimbleness, 166–167
Post-copy, 55, 65
Practice fusion diabetes classification,
39
Pre-copy, 45–46, 52–62, 64–72
Pre-copy live migration, 45, 54, 56–57,
59, 60, 62, 64–65, 68–71
Predictive analytics, 39
Preferences of the internet of things,
165
Primary domain controller (PDC), 28
Private cloud, 47
Processor, 49, 50–51, 67
PROCHECK, 1, 4, 7
Proficiency and efficiency, 165
ProQ, 1, 4, 7, 13
PSO (particle swarm optimization), 37
Public cloud, 47
PUF module, 149
Pulse sensor, 107
Quantum computing, 158
Quarantine, 298
Quercetin, 1, 3, 5, 16–18
Radial basis function (RBF), 281
RAM, 50, 67
Random forest (RF), 283
Raspberry Pi, 108
ResNet101, 73, 78, 79, 83, 87–89, 93, 95
Reticular activation system (RAS), 100
SCSI disk, 49
Secure message queue telemetry
transport (SMQTT), 180
Secure sockets layer (SSL), 179
Security, 47–48
Security breaches, 48
Security in medical big data analytics,
24–27
capture, 24–25
cleaning, 25
security, 26
stewardship, 26–27
storage, 25–26
Sensitive data, 47
Sensors, 145, 148, 149, 154, 155, 157,
158
Server, 47–51
Service-level agreement, 60
SHA-3 (secure hash algorithm), 150
Shrink errors, 146–147
Simulation, 55–56
Smart card–based authentication, 29
Soft computing techniques for data
classification, 34–35
Software, 46–49, 51–52, 72
Software as a service, 47
Source, 47, 53, 57, 59, 64, 68
Sparse coding, 38
Spirometer, 107–108
Stimulus, 293
Stop-and-copy, 52–53, 59, 63
Storage, 46–47, 50, 54, 65
Support vector machine (SVM), 281,
283
Survival rate, 187, 201–204
Swedish health record system, 292
SWISS-MODEL, 1, 3–4
Symptoms, 290, 293, 295, 298
System maintenance, 45, 51
T2-FDL, 38–39
Telemedicine and IoMT, 145–147
advantages of, 145–146
drawbacks, 146
IoMT advantages with telemedicine,
146–147
limitations of IoMT with
telemedicine, 147
Temperature sensor, 107
Therapeutic, 297, 298
Transparency, 280
Tumor suppressor (TP53), 1–7, 10–16,
18
Type 1 diabetes, 208
Type 2 diabetes, 208
Unstructured knowledge management
system, 153
User password authentication, 28
Users, 47–48, 51, 62, 66, 70
Verify-3D, 1, 4, 10, 15
Virtual CPU, 49
Virtual machine, 49–52, 71–72
Virtual machine monitor, 48, 49,
50
Virtual network, 49
Virtual SCSI adapter, 49
Virtual server, 49, 50
Virtualization, 48–51, 55, 71–72
Vital parameters using wearable
devices, 229
VM migration, 45–46, 52–57,
70–71
Warm-up, 52, 57–58
Wearable devices, 154–155
WEKA 3.6.5 tool, 281–282
Windows-based user authentication,
28
Wireless mesh networks (WMN), 188,
190, 192, 196, 198
Workload, 54–55, 62, 65, 67–68,
70–71
Yawn, 101
Also of Interest
Check out these published and forthcoming titles
in the “Advances in Learning Analytics for Intelligent
Cloud-IoT Systems” series from Scrivener Publishing
Artificial Intelligence for Cyber Security
An IoT Perspective
Edited by Noor Zaman, Mamoona Humayun, Vasaki Ponnusamy
and G. Suseendran
Forthcoming 2022. ISBN 978-1-119-76226-3
Industrial Internet of Things (IIoT)
Intelligent Analytics for Predictive Maintenance
Edited by R. Anandan, G. Suseendran, Souvik Pal and Noor Zaman
Published 2022. ISBN 978-1-119-76877-7
The Internet of Medical Things (IoMT)
Healthcare Transformation
Edited by R. J. Hemalatha, D. Akila, D. Balaganesh and Anand Paul
Published 2022. ISBN 978-1-119-76883-8
Integration of Cloud Computing with Internet of Things
Foundations, Analytics, and Applications
Edited by Monika Mangla, Suneeta Satpathy, Bhagirathi Nayak
and Sachi Nandan Mohanty
Published 2021. ISBN 978-1-119-76887-6
Digital Cities Roadmap
IoT-Based Architecture and Sustainable Buildings
Edited by Arun Solanki, Adarsh Kumar and Anand Nayyar
Published 2021. ISBN 978-1-119-79159-1
Agricultural Informatics
Automation Using IoT and Machine Learning
Edited by Amitava Choudhury, Arindam Biswas, Manish Prateek and
Amlan Chakraborty
Published 2021. ISBN 978-1-119-76884-5
Smart Healthcare System Design
Security and Privacy Aspects
Edited by SK Hafizul Islam and Debabrata Samanta
Published 2021. ISBN 978-1-119-79168-3
Machine Learning Techniques and Analytics for Cloud Security
Edited by Rajdeep Chakraborty, Anupam Ghosh and Jyotsna Kumar Mandal
Published 2021. ISBN 978-1-119-76225-6
www.scrivenerpublishing.com