
Control Systems
Lecture 1: Introduction
Dr. Mohammed Gulam Ahamad
Text Books
1. Control Systems Engineering, (6th Edition)
By: Norman S. Nise.
2. Modern Control Systems, (12th Edition)
By: Richard C. Dorf and Robert H. Bishop.
3. Schaum’s Outline of Feedback and Control Systems, (2nd Edition)
By: Joseph J. Distefano, Allen R. Stubberud, and Ivan J. Williams.
Reference Books
1. Modern Control Engineering, (5th Edition)
By: Katsuhiko Ogata.
2. Basic Feedback Controls in Biomedicine.
By: Charles S. Lessard.
3. Feedback Control of Dynamic Systems, (6th Edition)
By: Gene Franklin, J.D. Powell, and Abbas Emami-Naeini
4. Feedback Systems: An Introduction for Scientists and Engineers, by Karl J.
Åström and Richard M. Murray.
5. Physiological Control Systems: Analysis, Simulation and Estimation, by:
Michael C. K. Khoo.
Marks Distribution
Theory
 Total Marks = 100
 Sessional Marks = 40
• Homework = 10 marks
• Class Tests/Quizzes = 10 marks
 Final Exam Marks = 40
Expectations from Students
• Full attendance is expected, except with prior-notified excuses in writing.
• On-time arrival.
• Active participation.
  • Ask questions.
  • Answer questions from the instructor or other students.
• Help each other in reviewing notes and solving complex problems.
• Promptly report/share problems/issues, including typos on slides or misspoken words from the instructor.
• Cheating is a very serious offense. It will be dealt with in the most severe manner allowable under University regulations. If caught cheating, you can expect a failing grade and initiation of a cheating case in the University system.
Course Outline
Classical Control
• System Modelling
• Transfer Function
• Block Diagrams
• Signal Flow Graphs
• System Analysis
• Time Domain Analysis (Test Signals)
• Frequency Domain Analysis
• Bode Plots
• Nyquist Plots
• Nichol's Chart
Modern Control
• State Space Modelling
• Eigenvalue Analysis
• Observability and Controllability
• Solution of State Equations (State Transition Matrix)
• State Space to Transfer Function
• Transfer Function to State Space
• Direct Decomposition of Transfer Function
• Cascade Decomposition of Transfer Function
• Parallel Decomposition of Transfer Function
Course Outline
• Open-loop and closed-loop control systems.
• Block diagram models and reduction techniques.
• Signal flow graphs.
• Mathematical modeling of electrical and mechanical systems.
• Transient and steady-state response of linear control systems.
• State-space representation and analysis.
• Eigenvalues and eigenvectors.
• Closed-loop system stability analysis using the Routh-Hurwitz criterion.
• Stability, performance analysis, and control system design using the root locus technique.
• Bode diagrams.
• Polar plots.
• Nyquist stability criterion.
• Gain and phase margins.
• Nichol's chart.
Prerequisites
 For Classical Control Theory
 Differential Equations
 Laplace Transform
 Basic Physics
 Ordinary and semi-logarithmic graph papers
 For Modern Control Theory: all of the above, plus
 Linear Algebra
 Matrices
Objectives of CS
On completion of this module, students will be able to do the following:
• Define the basic terminologies used in control systems.
• Explain advantages and drawbacks of open-loop and closed-loop control systems.
• Obtain models of linear control systems in ordinary differential equation, transfer function, state space, or block diagram form.
• Obtain the overall transfer function of a linear control system using either block diagram algebra or signal flow graphs.
• Simplify complex control system models using block diagram and signal flow graph reduction techniques.
• Explain the relationship between system output response and transfer function characteristics or pole/zero locations.
• Determine the stability of a closed-loop control system using the Routh-Hurwitz criterion.
• Analyze the closed-loop stability and performance of control systems based on open-loop transfer functions using the root locus technique.
• Compute and analyze the frequency response of control systems using Bode diagrams.
• Analyze the closed-loop stability and performance of control systems based on open-loop transfer functions using polar plots and the Nyquist stability criterion.
What is a Control System?
 A system controlling the operation of another system.
 A system that can regulate itself and another system.
 A control system is a device, or set of devices, that manages, commands, directs, or regulates the behaviour of other device(s) or system(s).
Definitions
System – An interconnection of elements and devices for a desired purpose.
Control System – An interconnection of components forming a system
configuration that will provide a desired response.
Process – The device, plant, or system under control. The input and output
relationship represents the cause-and-effect relationship of the process.
Input → Process → Output
Definitions
Controlled Variable – It is the quantity or condition that is measured and controlled. Normally, the controlled variable is the output of the control system.
Manipulated Variable – It is the quantity or condition that is varied by the controller so as to affect the value of the controlled variable.
Control – Control means measuring the value of the controlled variable of the system and applying the manipulated variable to the system to correct or limit the deviation of the measured value from a desired value.
Definitions
Input (set point / reference) → Controller → Manipulated Variable → Process → Output (controlled variable)
Disturbances – A disturbance is a signal that tends to adversely affect the value of the system output. It is an unwanted input to the system.
• If a disturbance is generated within the system, it is called an internal disturbance, while an external disturbance is generated outside the system.
Types of Control System
 Natural Control System
 Universe
 Human Body
 Manmade Control System
 Vehicles
 Aeroplanes
Types of Control System
 Manual Control Systems
 Room Temperature regulation Via Electric Fan
 Water Level Control
 Automatic Control System
 Room Temperature regulation Via A.C
 Human Body Temperature Control
Types of Control System
Open-Loop Control Systems utilize a controller or control actuator to
obtain the desired response.
• The output has no effect on the control action. No feedback, so no correction of disturbances.
• In other words, the output is neither measured nor fed back.
Input → Controller → Process → Output
Examples: Washing Machine, Toaster, Electric Fan
Types of Control System
• Since in open-loop control systems the reference input is not compared with the measured output, each reference input corresponds to a fixed operating condition.
• Therefore, the accuracy of the system depends on calibration.
• The performance of an open-loop system is severely affected by the presence of disturbances or variations in operating/environmental conditions (see the sketch below).
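A tiny Python sketch makes these points concrete. The plant, its gain of 2, and the disturbance value are all invented for illustration; the command is fixed by calibration, so when a disturbance appears the output simply drifts and nothing corrects it.

```python
# Open-loop control sketch (illustrative numbers only): the command is fixed
# by calibration and the output is never measured, so a disturbance or a
# calibration error goes uncorrected.

def plant(u, disturbance=0.0):
    """A hypothetical static process: output = 2*u + disturbance."""
    return 2.0 * u + disturbance

desired_output = 10.0
u_calibrated = desired_output / 2.0            # calibration assumes the gain is exactly 2

print(plant(u_calibrated))                     # 10.0 -> accurate while calibration holds
print(plant(u_calibrated, disturbance=-3.0))   # 7.0  -> error remains, nothing corrects it
```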
Types of Control System
Closed-Loop Control Systems utilize feedback to compare the actual output with the desired output response.
Input → Comparator → Controller → Process → Output, with a Measurement of the output fed back to the Comparator
Examples: Refrigerator, Iron
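For contrast, here is a minimal closed-loop sketch of the block diagram above, using the same invented plant and disturbance as the open-loop sketch. The simple integral-style update is only one possible controller; the point is that measuring the output and feeding the error back removes the offset that the open-loop scheme left uncorrected.

```python
# Closed-loop control sketch: measure the output, compare it with the
# reference in the comparator, and let the error drive the controller.
# Plant, gains and disturbance are invented numbers for illustration.

def plant(u, disturbance=0.0):
    return 2.0 * u + disturbance

reference = 10.0
u = 0.0
for _ in range(50):
    y = plant(u, disturbance=-3.0)   # measurement of the actual output
    error = reference - y            # comparator: desired minus actual
    u += 0.2 * error                 # a simple integral-type control action

print(round(plant(u, disturbance=-3.0), 3))   # ~10.0: the disturbance is rejected
```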
Types of Control System
Multivariable Control System
Set points (Temp, Humidity, Pressure) → Comparator → Controller → Process → Outputs, with Measurements fed back to the Comparator
Types of Control System
Feedback Control System
• A system that maintains a prescribed relationship between the output and
some reference input by comparing them and using the difference (i.e.
error) as a means of control is called a feedback control system.
Input → summing junction (+/−) → error → Controller → Process → Output, with Feedback from the Output returned to the summing junction
• Feedback can be positive or negative (see the numeric sketch below).
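The positive/negative remark is easy to see numerically. In the toy discrete-time loop below (all numbers invented), subtracting the measured output at the summing junction (negative feedback) drives the output toward the reference, while adding it (positive feedback) makes the deviation grow.

```python
# Negative vs. positive feedback in a toy discrete-time loop (illustration only).

def run(feedback_sign, steps=10):
    reference, y = 1.0, 0.0
    for _ in range(steps):
        # feedback_sign = +1: error = r - y (negative feedback)
        # feedback_sign = -1: error = r + y (positive feedback)
        error = reference - feedback_sign * y
        y += 0.5 * error                 # controller and process lumped into one gain
    return y

print(run(+1))   # ~1.0: converges to the reference
print(run(-1))   # large: positive feedback reinforces the deviation
```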
Types of Control System
Servo System
• A Servo System (or servomechanism) is a feedback control system in
which the output is some mechanical position, velocity or
acceleration.
Antenna Positioning System
Modular Servo System (MS150)
Types of Control System
Linear vs. Nonlinear Control System
• A control system in which the output varies linearly with the input is called a linear control system.
u(t) → Process → y(t)
y(t) = 3u(t) + 5
y(t) = -2u(t) + 1
[Plots: y(t) versus u(t) for y(t) = 3u(t) + 5 (increasing line) and y(t) = -2u(t) + 1 (decreasing line)]
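The two straight-line examples can be reproduced with a few lines of Python (numpy and matplotlib assumed available); the only point being illustrated is that the output is a straight-line function of the input.

```python
# Plot the two straight-line input/output relationships from the slide.
import numpy as np
import matplotlib.pyplot as plt

u = np.linspace(0, 10, 100)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.plot(u, 3 * u + 5)
ax1.set(xlabel="u(t)", ylabel="y(t)", title="y(t) = 3u(t) + 5")
ax2.plot(u, -2 * u + 1)
ax2.set(xlabel="u(t)", ylabel="y(t)", title="y(t) = -2u(t) + 1")
plt.tight_layout()
plt.show()
```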
Types of Control System
Linear vs. Nonlinear Control System
• When the input and output have a nonlinear relationship, the system is said to be nonlinear.
[Plot: Adhesion Characteristics of Road, adhesion coefficient versus creep]
Types of Control System
Linear vs. Nonlinear Control System
• Linear control systems do not exist in practice.
• Linear control systems are idealized models fabricated by the analyst purely for the simplicity of analysis and design.
• When the magnitudes of the signals in a control system are limited to a range in which the system components exhibit linear characteristics, the system is essentially linear (see the saturation sketch below).
[Plot: Adhesion Characteristics of Road, adhesion coefficient versus creep]
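A hedged sketch of the "essentially linear" idea: the saturating element below is an invented component (gain 2, output clipped at ±5). It behaves like a pure gain as long as the input stays inside the linear range and departs from linearity only outside it.

```python
# A saturating element is approximately linear while its signals stay small.
# Gain and saturation limit are made-up values for illustration.

def saturating_gain(u, gain=2.0, limit=5.0):
    """Linear gain for small inputs, output clipped at +/- limit."""
    return max(-limit, min(limit, gain * u))

for u in (0.5, 1.0, 2.0, 2.5, 5.0, 10.0):
    print(u, saturating_gain(u))   # proportional up to u = 2.5, flat beyond that
```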
Types of Control System
Linear vs. Nonlinear Control System
• Temperature control of petroleum product in a distillation column.
[Plot: Temperature (°C, up to 500 °C) versus Valve Position (0 to 100% open), showing a nonlinear relationship]
Types of Control System
Time invariant vs. Time variant
• When the characteristics of the system do not depend upon time itself, the system is said to be a time-invariant control system.
y(t) = 2u(t) + 1
• A time-varying control system is a system in which one or more parameters vary with time (a simple time-shift test is sketched below).
y(t) = 2u(t) + 3t
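One practical way to see the difference is a time-shift test: delay the input and check whether the output is simply delayed by the same amount. The sketch below applies this test to the two example relations, treating them as maps evaluated on a discrete time grid; the step input and the delay of 3 samples are arbitrary choices.

```python
# Time-shift test: a time-invariant system answers a delayed input with an
# equally delayed output; a time-varying system does not.
import numpy as np

t = np.arange(0.0, 10.0)                 # a simple time grid
u = np.where(t >= 2, 1.0, 0.0)           # a unit step applied at t = 2
u_delayed = np.where(t >= 5, 1.0, 0.0)   # the same step delayed by 3

def time_invariant(u, t):
    return 2 * u + 1                     # y(t) = 2u(t) + 1

def time_varying(u, t):
    return 2 * u + 3 * t                 # y(t) = 2u(t) + 3t

shift = 3
for system in (time_invariant, time_varying):
    shifted_original = np.roll(system(u, t), shift)[shift:]
    delayed_response = system(u_delayed, t)[shift:]
    print(system.__name__, np.allclose(shifted_original, delayed_response))
# time_invariant True, time_varying False
```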
Types of Control System
Lumped parameter vs. Distributed Parameter
• Control systems that can be described by ordinary differential equations are lumped-parameter control systems, for example
M d²x/dt² + C dx/dt + k x
• Whereas distributed-parameter control systems are described by partial differential equations, for example
g ∂²x/∂z² = f₁ ∂x/∂y + f₂ ∂x/∂z
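The lumped-parameter example above is the familiar mass-spring-damper equation. The short sketch below (scipy assumed available, with made-up values for M, C and k and no external force) integrates it numerically from an initial displacement.

```python
# Lumped-parameter example: M d2x/dt2 + C dx/dt + k x = 0, simulated from an
# initial displacement.  M, C and k are made-up values for illustration.
from scipy.integrate import solve_ivp

M, C, k = 1.0, 0.5, 4.0

def dynamics(t, state):
    x, v = state                          # position and velocity
    return [v, (-C * v - k * x) / M]      # dx/dt = v, dv/dt from the ODE

sol = solve_ivp(dynamics, t_span=(0.0, 10.0), y0=[1.0, 0.0], max_step=0.01)
print(round(sol.y[0, -1], 4))             # the damped oscillation has mostly decayed by t = 10
```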
Types of Control System
Continuous Data vs. Discrete Data System
• In a continuous-data control system, all system variables are functions of the continuous time t.
[Sketch: a continuous-time signal x(t) plotted against t]
• A discrete-time control system involves one or more variables that are known only at discrete instants of time.
[Sketch: a discrete-time sequence x[n] plotted against n]
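A minimal illustration of the continuous/discrete distinction: sample a continuous-time signal x(t) at a fixed period T to obtain the sequence x[n] = x(nT). The sinusoid and the sampling period are arbitrary choices.

```python
# Sampling a continuous-time signal gives a discrete-time sequence x[n] = x(nT).
import numpy as np

def x(t):
    return np.sin(2 * np.pi * 0.5 * t)    # an arbitrary 0.5 Hz continuous-time signal

T = 0.25                                   # sampling period (seconds)
n = np.arange(9)                           # sample indices 0..8
x_n = x(n * T)                             # the discrete-time sequence
print(np.round(x_n, 3))
```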
Types of Control System
Deterministic vs. Stochastic Control System
• A control System is deterministic if the response to input is predictable and
repeatable.
[Sketch: an input x(t) and its repeatable response y(t)]
• If not, the control system is a stochastic control system.
[Sketch: a non-repeatable (stochastic) response z(t)]
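A quick numeric reading of "predictable and repeatable": feed the same input twice to a deterministic map and twice to a map with an invented random disturbance, and compare the two outputs.

```python
# Deterministic vs. stochastic response to the same input (toy example).
import numpy as np

u = np.ones(5)
rng = np.random.default_rng()

def deterministic(u):
    return 2 * u + 1

def stochastic(u):
    return 2 * u + 1 + rng.normal(scale=0.1, size=u.shape)   # random disturbance

print(np.array_equal(deterministic(u), deterministic(u)))    # True: repeatable
print(np.array_equal(stochastic(u), stochastic(u)))          # almost surely False
```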
Types of Control System
Adaptive Control System
 The dynamic characteristics of most control systems are not constant
for several reasons.
 The effect of small changes on the system parameters is attenuated in a
feedback control system.
 An adaptive control system is required when the changes in the system
parameters are significant.
Types of Control System
Learning Control System
 A control system that can learn from the environment in which it is operating is called a learning control system.
Types of Control System
Control Systems
 Natural
 Man-made
   Manual
   Automatic
     Open-loop
       Non-linear (time variant or time invariant)
       Linear (time variant or time invariant)
     Closed-loop
       Non-linear (time variant or time invariant)
       Linear (time variant or time invariant)
Examples of Control Systems
Water-level float regulator
History
18th century: James Watt's centrifugal governor for the speed control of a steam engine.
1920s: Minorsky worked on automatic controllers for steering ships.
1930s: Nyquist developed a method for analyzing the stability of controlled systems.
1940s: Frequency response methods made it possible to design linear closed-loop control systems.
1950s: The root-locus method due to Evans was fully developed.
1960s to 1980s: State space methods, optimal control, adaptive control, and learning controls began to be investigated and developed.
Present: on-going research continues; recent applications of modern control theory include non-engineering systems such as biological, biomedical, economic, and socio-economic systems.
Examples of Modern Control Systems
(a) Automobile steering control system.
(b) The driver uses the difference between the actual and the desired direction of travel to generate a controlled adjustment of the steering wheel.
(c) Typical direction-of-travel response.
Examples of Modern Control Systems
Open-loop & Closed-loop Models of Blood Glucose Control System
Examples of Control Systems
A Model of Heart Rate Control System