• Complex networks applications
– Ubiquitous, pervasive, embedded control, computing, and communication networks
– Biological regulatory networks
• New mathematics and algorithms
– robustness analysis
– systematic design
– multiscale physics
Collaborators and contributors
(partial list)
Biology: Csete, Yi, Borisuk, Bolouri, Kitano, Kurata, Khammash, El-Samad, …
Alliance for Cellular Signaling: Gilman, Simon, Sternberg, Arkin, …
HOT: Carlson, Zhou, …
Theory: Lall, Parrilo, Paganini, Barahona, D'Andrea, …
Web/Internet: Low, Effros, Zhu, Yu, Chandy, Willinger, …
Turbulence: Bamieh, Dahleh, Gharib, Marsden, Bobba, …
Physics: Mabuchi, Doherty, Marsden, Asimakapoulos, …
Engineering CAD: Ortiz, Murray, Schroder, Burdick, Barr, …
Disturbance ecology: Moritz, Carlson, Robert, …
Power systems: Verghese, Lesieutre, …
Finance: Primbs, Yamada, Giannelli, …
…and casts of thousands…
• On website accessible from SFI talk abstract
• Papers with minimal math
– HOT and power laws
– Chemotaxis, heat shock in E. coli
– Web & Internet traffic, protocols, future issues
• Thesis: Structured semidefinite programs and semialgebraic geometry methods in robustness and optimization
• Recommended books
– A Course in Robust Control Theory, Dullerud and Paganini, Springer
– Essentials of Robust Control, Zhou, Prentice-Hall
– Cells, Embryos, and Evolution, Gerhart and Kirschner
Biochemical Network: E. coli Metabolism
[Figure: mass transfer in metabolism (from EcoCYC, by Peter Karp) plus regulatory interactions (from Adam Arkin). Supplies of materials and energy flow through the metabolic network; the regulatory layer adds complexity but provides robustness.]
Component behavior seems to be gratuitously uncertain, yet the systems have robust performance.
Mutation
Selection
Darwinian evolution uses selection on random mutations to create complexity.
Transcription/translation
Microtubules
Neurogenesis
Angiogenesis
Immune/pathogen
Chemotaxis
….
Regulatory feedback control
• Such feedback strategies appear throughout biology
(and advanced technology).
• Gerhart and Kirschner (correctly) emphasize that this “exploratory” behavior is ubiquitous in biology…
• …but claim it is rare in our machines.
• This is true of primitive, but not advanced, technologies.
• Robust control theory provides a clear explanation.
• Without extensive engineering theory and math, even reverse engineering complex engineering systems would be hopeless. (Let alone actual design.)
• Why should biology be much easier?
• With respect to robustness and complexity, there is too much theory, not too little.
• Two great abstractions of the 20th Century:
– Separate systems engineering into control, communications, and computing
• Theory
• Applications
– Separate systems from physical substrate
• Facilitated massive, wildly successful, and explosive growth in both mathematical theory and technology…
• …but creating a new Tower of Babel where even the experts do not read papers or understand systems outside their subspecialty.
“Any sufficiently advanced technology is indistinguishable from magic.”
Arthur C. Clarke
“Those who say do not know, those who know do not say.”
Zen saying
• Introduce basic ideas about robustness and complexity
• Minimal math
• Hopefully familiar (but unconventional) example systems
• Caveat: the “real thing” is much more complicated
• Perhaps any such “story” is necessarily misleading
• Hopefully less misleading than existing popular accounts of complexity and robustness
• Complexity phenotype : robust, yet fragile
• Complexity genotype : internally complicated
• New theoretical framework: HOT (Highly optimized tolerance, with Jean Carlson, Physics, UCSB)
• Applies to biological and technological systems
– Pre-technology: simple tools
– Primitive technologies use simple strategies to build fragile machines from precision parts.
– Advanced technologies use complicated architectures to create robust systems from sloppy components…
– … but are also vulnerable to cascading failures…
• Robust to large variations in environment and component parts (reliable, insensitive, resilient, evolvable, simple, scalable, verifiable, ...)
• Fragile, often catastrophically so, to cascading failure events (sensitive, brittle, ...)
• Cascading failures can be initiated by small perturbations (cryptic mutations, viruses and other infectious agents, exotic species, …)
• There is a tradeoff between
– ideal or nominal performance (no uncertainty)
– robust performance (with uncertainty)
• Greater “pheno-complexity”= more extreme robust, yet fragile
• In many complex systems, the size of a cascading failure event is often unrelated to the size of the initiating perturbation
• Fragility is interesting when it arises not from large perturbations, but from catastrophic responses to small variations
• Robustness is achieved by building barriers to cascading failures
• This often requires complicated internal structure, hierarchies, self-dissimilarity, layers of feedback, signaling, regulation, computation, protocols, ...
• Greater “geno-complexity” = more parts, more structure
• Molecular biology is about biological simplicity: what are the parts, and how do they interact?
• If the complexity phenotypes and genotypes are linked, then robustness is the key to biological complexity.
• “Nominal function” may tell little.
[Figure: heat shock as a cascading failure. Rising environmental temperature raises cell temperature; folded proteins unfold; unfolded proteins form aggregates, leading to loss of protein function, network failure, and cell death.]
How does the cell build “barriers” (in state space) to stop this cascading failure event?
[Figure: barriers to the cascade. (1) Insulate and regulate temperature, so the cell temperature tracks the environment less; (2) thermotax away from heat; (3) use more robust (temperature-stable) proteins, so fewer proteins unfold and aggregate.]
• Key proteins can have multiple (allelic or paralogous) variants
• Allelic variants allow populations to adapt
• Regulated multiple gene loci allow individuals to adapt
[Figure: the folded/unfolded balance shifts with cell temperature, with rates following the Arrhenius law v = A e^(−E_A/RT). Plot of the log of E. coli growth rate versus temperature: growth rises from 21° through 37° toward 42°, then collapses by 46°.]
Heat Shock Response
[Plot: log of E. coli growth rate versus temperature (21°, 37°, 42°, 46°). Robustness/performance tradeoff?]
[Figure: the heat shock response refolds denatured proteins, pushing the unfolded/folded balance back. Heat shock response involves complex feedback and feedforward control.]
Why does biology (and advanced technology) overwhelmingly opt for complex control systems instead of just robust components?
• Robust proteins
– Temperature stability
– Allelic variants
– Paralogous isozymes
• Regulate temperature
• Thermotax
• Heat shock response
– Up-regulate chaperones and proteases
– Refold or degrade denatured proteins
E. coli Heat Shock
(with Kurata, El-Samad, Khammash, Yi)
[Figure: block diagram of the E. coli heat shock control system. Outer feedback loop: the rpoH gene is transcribed to σ32 mRNA; heat increases the σ32 translation rate (feedforward) and transiently stabilizes σ32. Free σ32 drives transcription and translation of the heat shock proteins (hsp1, hsp2): chaperones (DnaK, GroL, GroS) and proteases (FtsH, Lon). DnaK sequesters σ32 (σ32:DnaK complex), and σ32 bound with protease (σ32:DnaK:FtsH) sets the temperature-dependent σ32 degradation rate, closing a local feedback loop through DnaK transcription and translation dynamics; unfolded proteins (DnaK:P_unfold) titrate DnaK away, raising free σ32 when heat stress rises. The whole system acts as a thermostat.]
Tail
[Figure: added mass moves the center of mass forward; the tail moves the center of pressure aft, thus stabilizing forward flight, at the expense of extra weight and drag. For minimum weight and drag (and other performance issues), eliminate fuselage and tail.]
Why do we love building robust systems from highly uncertain and unstable components?
[Figure: open loop, y = P(r + d), with disturbance d.]
Assumptions on components:
• Everything is just numbers
• Uncertainty in P: the gain lies in an interval [P1, P2]
• Higher gain = more uncertain
[Figure: the spread of possible plant gains P1 ≤ P ≤ P2 maps r directly into uncertainty in y.]
[Figure: closed loop with disturbance d: y = G(r − Ky) + d, with plant G and feedback gain K.]
Negative feedback:
y = GSr + Sd, where S = 1/(1 + GK)
Design recipe:
• 1 >> K >> 1/G
• G >> 1/K >> 1
• G maximally uncertain!
• K small, low uncertainty
With G >> 1 and K << 1 chosen so GK >> 1:
S = 1/(1 + GK) << 1 and y = GSr + Sd ≈ (1/K) r
Results for y ≈ (1/K) r:
• high gain
• low uncertainty
• d attenuated
S = sensitivity function
[Figure: the loop redrawn with G in the forward path and K in feedback.]
Extensions to:
• Dynamics
• Multivariable
• Nonlinear
• Structured uncertainty
All cost more computationally.
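The recipe can be checked with plain numbers. A minimal sketch (the specific gain values are illustrative assumptions): a sloppy plant whose gain G is uncertain over two orders of magnitude, closed with a precise low-gain K, gives a command gain pinned near 1/K and a disturbance gain S << 1.

```python
# Closed-loop statics for the loop y = G*(r - K*y) + d:
#   y = (G*S)*r + S*d,   with sensitivity S = 1/(1 + G*K)
K = 0.1                                   # precise, low-gain feedback part
gains, sens = [], []
for G in [500.0, 5000.0, 50000.0]:        # sloppy plant: gain uncertain over 100x
    S = 1.0 / (1.0 + G * K)               # disturbance gain
    gains.append(G * S)                   # command gain, should approach 1/K
    sens.append(S)
    print(f"G={G:>8.0f}   y/r = {G*S:6.3f}   (1/K = {1/K:.0f})   y/d = {S:.5f}")
```

Despite the 100x uncertainty in G, the closed-loop command gain stays within a couple of percent of 1/K = 10, and disturbances are attenuated by at least 50x.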
[Figure: feedback loop with G and K; G is the uncertain high-gain element.]
Transcription/translation
Microtubule formation
Neurogenesis
Angiogenesis
Antibody production
Chemotaxis
….
Regulatory feedback control
• Primitive technologies build fragile systems from precision components.
• Advanced technologies build robust systems from sloppy components.
• There are many other examples of regulatory strategies deliberately employing uncertain and stochastic components…
• …to create robust systems .
• High gain negative feedback is the most powerful mechanism, and also the most dangerous.
• In addition to the added complexity, what can go wrong?
[Figure: feedback loop with loop gain F = −GK; the disturbance d circulates around the loop, y = d + Fy.]
y = 1/(1 − F) d ≈ −(1/F) d if F >> 1
If y, d and F are just numbers:
S = y/d = 1/(1 − F)
S measures disturbance rejection. S = sensitivity function. It's convenient to study ln(S).
[Plot: ln(S) versus F, with S = 1/(1 − F). Negative F (F < 0): ln(S) < 0, disturbance attenuated. Positive F (F > 0): ln(S) > 0, disturbance amplified, and ln(S) diverges as F → 1.]
[Figure: feedback loop; y = S d with S = 1/(1 − F).]
If these model physical processes, then d and y are signals and F is an operator. We can still define S(ω) = |Y(ω)/D(ω)|, where Y and D are the Fourier transforms of y and d. (If F is linear, then S is independent of D.)
Under assumptions that are consistent with F and d modeling physical systems (in particular, causality), it is possible to prove that:
∫₀^∞ log S(ω) dω ≥ 0
The amplification (F > 0) must at least balance the attenuation (F < 0).
(Bode, ~1940)
[Plot: log|S| versus frequency: the area of attenuation (log|S| < 0) is balanced by the area of amplification (log|S| > 0).]
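Bode's constraint can be checked numerically. The loop below, L(s) = k/(s+1)², is an illustrative assumption (open-loop stable, relative degree two), for which the classical result says the integral of ln|S| over all frequencies is exactly zero: every bit of attenuation is paid back in amplification.

```python
import numpy as np

# Bode's integral check for the illustrative loop L(s) = k/(s+1)^2:
# sensitivity S = 1/(1+L); for a stable loop of relative degree >= 2,
#   integral_0^inf ln|S(jw)| dw = 0.
k = 10.0
w = np.linspace(1e-4, 1e4, 2_000_000)            # frequency grid (rad/s)
L = k / (1j * w + 1.0) ** 2
lnS = np.log(np.abs(1.0 / (1.0 + L)))
dw = w[1] - w[0]
area = dw * (lnS.sum() - 0.5 * (lnS[0] + lnS[-1]))   # trapezoid rule
att = dw * np.minimum(lnS, 0.0).sum()                # attenuation area only
print(f"net area under ln|S|: {area:+.4f}   (theory: 0)")
print(f"attenuation area    : {att:+.2f}")
```

Raising k deepens the attenuation at low frequency, but the amplification peak near crossover grows to match: the "waterbed" cannot be flattened, only moved.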
Robustness of
HOT systems
Fragile
Robust
(to known and designed-for uncertainties)
Fragile
(to unknown or rare perturbations)
Uncertainties
Robust
• Negative feedback is both the most powerful and most dangerous mechanism for robustness.
• It is everywhere in engineering, but appears hidden as long as it works.
• Biology seems to use it even more aggressively, but also uses other familiar engineering strategies:
– Positive feedback to create switches (digital systems)
– Protocol stacks
– Feedforward control
– Randomized strategies
– Coding
Robustness Complexity
• So far, this is all undergraduate level material
• Current research involves lots of math not traditionally thought of as “applied”
• New theoretical connections between robustness, evolvability, and verifiability
• Beginnings of a more integrated theory of control, communications and computing
• Both biology and the future of ubiquitous, embedded networking will drive the development of new mathematics.
Robustness of HOT systems
[Figure: examples placed along the fragile-robust axis: Chess, Meteors, Machines, Archaea, Humans, Humans + machines?]
[Figure: fragilities along the fragile-robust axis: cancer, epidemics, viral infections, auto-immune disease.]
Uncertainty
• In a system
– Environmental perturbations
– Component variations
• In a model
– Parameter variations
– Unmodeled dynamics
– Assumptions
– Noise
[Figure: robustness posed as a property of a function F(δ) of the uncertainty δ: does F(δ) stay on the robust side?]
Is there some δ for which F(δ) fails? Typically NP hard.
• If true, there is always a short proof.
• Which may be hard to find.
Is F(δ) robust for all δ? Typically coNP hard.
• More important problem.
• Short proofs may not exist.
Fundamental asymmetries*
• Between P and NP
• Between NP and coNP
* Unless they’re the same…
How do we prove that F(δ) is robust for all δ?
• Standard techniques include relaxations, Gröbner bases, resultants, numerical homotopy, etc…
• Powerful new method based on real algebraic geometry and semidefinite programming (Parrilo, Shor, …)
• Nested series of polynomial time relaxations search for polynomial sized certificates
• Exhausts coNP (but no uniform bound)
• Relaxations have both computational and physical interpretations
• Beats gold standard algorithms (e.g. MAX CUT) handcrafted for special cases
• Completely changes the P/NP/coNP picture
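The flavor of such certificates fits in a few lines (the polynomial here is constructed for illustration, not taken from the thesis): exhibiting p(x) = zᵀQz with z a vector of monomials and Q positive semidefinite proves p ≥ 0 for all x, and verifying the certificate is polynomial time even when finding it is the hard part that the semidefinite relaxations automate.

```python
import numpy as np

# p(x) = (x^2 + x - 1)^2 + (x - 1)^2 is nonnegative by construction.
# Expanded: p(x) = x^4 + 2x^3 - 4x + 2.
# SOS certificate: p(x) = z^T Q z with z = [1, x, x^2] and Q PSD.
a = np.array([-1.0, 1.0, 1.0])   # x^2 + x - 1 in the basis [1, x, x^2]
b = np.array([-1.0, 1.0, 0.0])   # x - 1     in the basis [1, x, x^2]
Q = np.outer(a, a) + np.outer(b, b)

# Verification (polynomial time): Q is positive semidefinite ...
assert np.all(np.linalg.eigvalsh(Q) >= -1e-9)
# ... and the quadratic form reproduces p at (here, randomly sampled) points.
rng = np.random.default_rng(0)
for x in rng.uniform(-5, 5, 100):
    z = np.array([1.0, x, x * x])
    p = x**4 + 2 * x**3 - 4 * x + 2
    assert abs(z @ Q @ z - p) < 1e-8
print("certificate verified: p is a sum of squares, hence p(x) >= 0 for all x")
```

The asymmetry is exactly the one on the slide: the certificate Q is a short proof of a coNP-flavored statement ("for all x, p(x) ≥ 0"), and checking it is cheap; the search for Q is what the nested semidefinite relaxations perform.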
Bacterial chemotaxis
[Figure: with no gradient, a random walk: ligand → motion → motor.]
Bacterial chemotaxis (Yi, Huang, Simon, Doyle)
[Figure: in a gradient, a biased random walk: ligand → signal transduction → CheYp → motor. The motor's response to CheYp is high gain (cooperativity), “ultrasensitivity.”]
References: Cluzel, Surette, Leibler
[Figure: ligand → signal transduction → CheYp → motor.]
[Figure: the chemotaxis signal transduction network. Ligand (attractant, +ATT/−ATT) binding to the MCP receptor complex (with CheW and CheA) is FAST; CheA autophosphorylates (using ATP) and passes the phosphate to CheY; CheY-P acts on the flagellar motor (CW rotation) and is dephosphorylated by CheZ. Methylation of the MCPs (+CH3 by CheR, −CH3 by CheB-P) is SLOW.]
References: Cluzel, Surette, Leibler + Alon, Barkai, Bray, Simon, Spiro, Stock, Berg, …
[Figure: the network with only the fast dynamics highlighted: ligand binding, phosphorylation, and the flagellar motor, with no methylation.]
[Plot: CheYp versus time (seconds) after a ligand step (Barkai, et al): more ligand lowers CheYp, extending the run; with no methylation, CheYp does not return to its pre-stimulus level.]
[Figure: the slow methylation feedback (CheR adds CH3, CheB-P removes it) shown alone, then the full network again.]
[Plot: CheYp versus time (seconds, out to ~7000): with methylation, each ligand step produces a transient, up (tumble, less ligand) or down (extend run, more ligand), followed by a return to the pre-stimulus level; the no-methylation response does not return.]
Biologists call this “perfect adaptation.”
• Methylation produces “perfect adaptation” by integral feedback .
• Integral feedback is ubiquitous in both engineering systems and biological systems.
• Integral feedback is necessary for robust perfect adaptation.
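The logic can be sketched in a toy integral-feedback model (purely illustrative, not the actual CheR/CheB biochemistry): an internal state m integrates the activity error, so the output returns exactly to zero after any constant stimulus step, regardless of the step's size.

```python
# Toy integral-feedback adapter (illustrative stand-in for methylation):
#   y(t) = u(t) - m(t)      activity = stimulus minus adaptation state
#   dm/dt = k * y(t)        "methylation" m integrates the activity error
# At steady state dm/dt = 0 forces y = 0 for ANY constant u: perfect adaptation.
def simulate(stimulus, k=0.5, dt=0.01, t_end=120.0):
    m, t, trace = 0.0, 0.0, []
    while t < t_end:
        y = stimulus(t) - m
        m += k * y * dt          # forward-Euler step of the integrator
        trace.append((t, y))
        t += dt
    return trace

# Two ligand steps of very different sizes; y transients then re-zeroes.
trace = simulate(lambda t: 0.0 if t < 20 else (1.0 if t < 70 else 5.0))
print(f"final activity y = {trace[-1][1]:.2e}")
```

The adaptation is robust because it rests on the structure (an integrator in the loop), not on parameter values: changing k changes only how fast y returns to zero, not where it settles.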
Perfect adaptation is necessary…
[Figure: tumbling bias versus CheYp: the motor responds only over a narrow range of CheYp.]
…to keep CheYp in the responsive range of the motor.
• Maybe just not the right question.
• Fine tuned for robustness…
• …with resource costs and new fragilities as the price.
• Information & entropy
• Fractals & self-similarity
• Chaos
• Criticality and power laws
• Undecidability
• Fuzzy logic, neural nets, genetic algorithms
• Emergence
• Self-organization
• Complex adaptive systems
• New science of complexity
• Not really about complexity
• These concepts themselves are “robust, yet fragile”
• Powerful in their niche
• Brittle (break easily) when moved or extended
• Some are relevant to biology and engineering systems
• Comfortably reductionist
• Remarkably useful in getting published
• Tuning 1-2 parameters yields a critical point
• In certain model systems (percolation, Ising, …) power laws and universality hold iff at criticality.
• Physics: power laws are suggestive of criticality
• Engineers/mathematicians have opposite interpretation:
– Power laws arise from tuning and optimization.
– Criticality is a very rare and extreme special case.
– What if many parameters are optimized?
– Are evolution and engineering design different? How?
• Which perspective has greater explanatory power for power laws in natural and man-made systems?
[Plot: cumulative frequency versus size of events, log (base 10), decimated data: WWW files in Mbytes (Crovella); forest fires in 1000 km² (Malamud), with the Los Alamos fire marked; data compression (Huffman).]
Size of events x versus frequency:
p(x) = dP/dx ∝ x^−(α+1), P(X > x) ∝ x^−α
so log(Prob > size) (or log(rank)) versus log(size) is a line of slope −α.
[Plot: cumulative frequency versus size of events, log (base 10): web files have slope −1, fires slope −1/2, codewords fall off exponentially.]
• Engineers design (and evolution selects) for systems with certain typical properties:
• Optimized for average (mean) behavior
• Optimizing the mean often (but not always) yields high variance and heavy tails
• Power laws arise from heavy tails when there is enough aggregate data
• One symptom of “robust, yet fragile”
Data Compression
Based on the frequencies of source word occurrences, select codewords to minimize message length.
• Ignore value of information, consider only “surprise”
• Minimize average codeword length (over stochastic ensembles of source words rather than actual files)
• Constraint on codewords of unique decodability
• Equivalent to building barriers in a zero dimensional tree
• Optimal codeword lengths, the (exponential) optimal distribution, and the optimal cost are:
length l_i = −log(p_i), equivalently p_i = exp(−c l_i)
Avg. length = Σ p_i l_i = −Σ p_i log(p_i)
How well does the model predict the data?
[Plot: the DC (data compression) data versus the predicted codeword-length distribution p_i = exp(−c l_i).]
How well does the model predict the data? Not surprising, because the file was compressed using Shannon theory.
[Plot: data + model agree; small discrepancy due to integer lengths.]
• Keep parts of Shannon abstraction:
– Minimize downloaded file size
– Averaged over an ensemble of user access
• But add in feedback and topology, which completely breaks standard Shannon theory
• Logical and aesthetic structure determines topology
• Navigation involves dynamic user feedback
• Breaks standard theory, but extensions are possible
• Equivalent to building 0-dimensional barriers in a 1- dimensional tree of content
[Figure: a toy website: a document split into N files to minimize download time; # links = # files.]
[Figure: forest fires. Spark sources and weather (intensity, frequency, extent) interact with flora and fauna, topography, soil type, and climate/season.]
Burnt regions are 2-d. Fire suppression mechanisms must stop a 1-d front. Optimal strategies must trade off resources with risk.
Generalized “coding” problems
• Optimizing d−1 dimensional cuts in d dimensional spaces…
• To minimize average size of files or fires, subject to resource constraint.
• Models of greatly varying detail all give a consistent story.
• Power laws have α = 1/d.
• Completely unlike criticality.
d = 0: data compression; d = 1: web layout; d = 2: forest fires
p_i ∝ l_i^−(1+1/d), P(L ≥ l) ∝ l^−1/d for d > 0
p_i ∝ exp(−c l_i) for d = 0
[Plot: cumulative frequency versus size (log base 10) for the three data sets: WWW, FF (forest fires), DC (data compression).]
[Plot: data + model/theory: the model curves for d = 1, 2, 0 overlay the three data sets.]
Burnt regions are 2-d; fire suppression mechanisms must stop a 1-d front. Geography could make d < 2:
• Rugged terrain, mountains, deserts
• Fractal dimension d ≈ 1?
• Dry Santa Ana winds drive large (≈1-d) fires
[Plot: data + HOT model/theory. National forest fire data follow d = 2; California brushfires follow d = 1.]
Data + HOT + SOC
[Plot: the SOC forest fire model (d = 2) gives slope ≈ 0.15, far shallower than the data and the HOT d = 1, 2 fits.]
Critical/SOC exponents are way off:
Data: α > 0.5
SOC: α < 0.15
18 Sep 1998
“Forest Fires: An Example of Self-Organized Critical Behavior,” Bruce D. Malamud, Gleb Morein, Donald L. Turcotte
[Log-log plot: fire data 1930-1990 and 1991-1995, plus 3 additional data sets, spanning roughly 10⁻² to 10⁴, compared against the HOT FF (d = 2) and SOC FF predictions.]
SOC and HOT have very different power laws.
• HOT: α = 1/d decreases with dimension.
• SOC: α increases with dimension.
• HOT yields compact events of nontrivial size.
• SOC has infinitesimal, fractal events.
SOC and HOT are extremely different.

                       SOC             HOT        Data
Max event size         Infinitesimal   Large      Large
Large event shape      Fractal         Compact    Compact
Slope                  Small           Large      Large
α versus dimension d   d−1             1/d        1/d
[Plot: cumulative log(freq.) versus log(event sizes). Robust design pushes the distribution from Gaussian/exponential toward heavier tails; with improved design and more resources, power laws are inevitable.]
• Power laws are ubiquitous
• HOT may be a unifying perspective for many
• Criticality, SOC is an interesting and extreme special case…
• … but very rare in the lab, and rarer still outside it.
• Viewing a complex system as HOT is just the beginning of study.
• The real work is in new Internet protocol design, forest fire suppression strategies, etc…
[Plot: throughput versus demand shows a congestion induced “phase transition.”]
Similar for:
• Power grid?
• Freeway traffic?
• Gene regulation?
• Ecosystems?
• Finance?
[Plots: power laws in log(file size); log(thru-put) versus log(demand) for real networks versus a broadcast network.]
Making a “random network:”
• Remove protocols
– No IP routing
– No TCP congestion control
• Broadcast everything
Random (broadcast) networks are many orders of magnitude slower than real networks.
[Plot: flow versus pressure drop: streamlined pipes (HOT) achieve far higher flow than random pipes.]
• Through streamlined design
• High throughput
• Robust to bifurcation transition (Reynolds number)
• Yet fragile to small perturbations
• Which are irrelevant for more “generic” flows
• Shear flows are ubiquitous and important
• HOT may be a unifying perspective
• Chaos is interesting, but may not be very important for many important flows
• Viewing a turbulent or transitioning flow as HOT is just the beginning of study
The yield/density curve predicted using random ensembles is way off.
[Plot: yield, flow, … versus densities, pressure, …: designed/HOT systems versus random ensembles.]
Similar for:
• Power grid
• Freeway traffic
• Gene regulation
• Ecosystems
• Finance?
Kumar Bobba, Bassam Bamieh
“Turbulence is the graveyard of theories.”
Hans Liepmann, Caltech
• The orthodox view:
• Adjusting 1 parameter (Reynolds number) leads to a bifurcation cascade to chaos
• Turbulence transition is a bifurcation
• Turbulent flows are chaotic, intrinsically nonlinear
• There are certainly many situations where this view is useful.
[Plot: pressure drop versus average flow speed: equilibrium at low speed, then periodic, then chaotic at high speed; flow (average speed) versus pressure (drop) shows the laminar-to-turbulent bifurcation.]
Random pipes are like bluff bodies.
[Plot: flow versus pressure for a typical flow.]
[Log-log plot: flow versus pressure over 10² ≤ Re ≤ 10⁵: the streamlined pipe follows laminar “theory” until the turbulent transition; the random pipe is turbulent throughout.]
This transition is extremely delicate (and controversial). It can be promoted (or delayed!) with tiny perturbations.
Transition to turbulence is promoted (occurs at lower speeds) by:
• Surface roughness
• Inlet distortions
• Vibrations
• Thermodynamic fluctuations?
• Non-Newtonian effects?
None of which makes much difference for “random” pipes.
[Log-log plot: flow versus pressure with 80 ppm guar in water: turbulent drag can be reduced with small amounts of polymers.]
Streamwise streaks in Couette flow
[Figure (from Kline): downflow and upflow between high-speed regions create low-speed streaks. The perturbation is spanwise periodic and streamwise constant. Coordinates: x streamwise, y wall-normal, z spanwise; velocity components (u, v, w).]
Navier-Stokes equations:
∂u/∂x + ∂v/∂y + ∂w/∂z = 0
∂u/∂t + (u·∇)u = −∇p + (1/R)(∂²/∂x² + ∂²/∂y² + ∂²/∂z²)u
Streamwise constant (∂/∂x = 0): define a streamfunction ψ(y, z, t) with v = ∂ψ/∂z, w = −∂ψ/∂y.
2d NS: the flow lives in 2 dimensions (y, z) but keeps 3 velocity components (u, v, w).
( , , )
z
, w
y
2d-3c model
u
t
t
z u y
z
y
u y z R
1
y
z R u
x
0
These equations are globally stable!
Laminar flow is global attractor.
[Plot: energy versus time for the 2d-3c model (N = 10, R = 1000, α = 2, t up to 1000): large transient amplification before decay; the worst-case energy growth scales like R³ (Bamieh and Dahleh).]
What you'll see next: a log-log plot of the time response, with random initial conditions (at t = 0) concentrated at the lower boundary. Streamwise streaks; long range correlation; exponential decay.
• The orthodox view:
– Power laws suggest criticality
– Turbulence is chaos
• HOT view:
– Robust design often leads to power laws
– Just one symptom of “robust, yet fragile”
– Shear flow turbulence is noise amplification
• Other orthodoxies:
– Dissipation, time irreversibility, ergodicity and mixing
– Quantum to classical transitions
– Quantum measurement and decoherence
• HOT may make little difference for explaining much of traditional physics lab experiments,
• So if you’re happy with orthodox treatments of power laws, turbulence, dissipation, quantum measurement, etc then you can ignore HOT.
• Otherwise, the differences between the orthodox and
HOT views are large and profound, particularly for…
• Forward or reverse (e.g. biology) engineering of complex, highly designed or evolved systems,
• But perhaps also, surprisingly, for some foundational problems in physics