40 Years in Research
at USC
Joseph E. Johnson, PhD
Professor
Department of Physics and Astronomy, USC
March 18, 2009
jjohnson@sc.edu
www.nexus.sc.edu www.asg.sc.edu
My Personal Premise:
 Science is the study of information – the measured values of observables.
 If something cannot be measured (observed), its existence is meaningless.
 We should be very cautious when we base our scientific theory on something that cannot be measured.

Some Background Math We Need
 Linear Vector Space (LVS), n dimensions:
|A> + |B> = |C> and a|A> = |B>, with |A> = a_i |i>
 Metric Space = an LVS with a symmetric product:
<A|B> = a number = A_i B_i = Σ <A|i><i|B> = AB cos(θ)
 Lie Algebra = an LVS with an antisymmetric product:
[L_i, L_j] = C_ijk L_k, which obeys the Jacobi identity
 It generates a Lie Group via G(a) = exp(a L),
where L = a_i L_i and the L_i are a basis for the space
The Classical Physics
 The classical state of a particle is a vector in 3-space and time, in a metric space, pointing to the particle, along with derivatives of these vectors and functions of them.
 Space can be measured by the timing of a light pulse.
 Time is measured by the number on a clock, which is
   a periodic process cycling in the smallest time, and
   an aperiodic process that counts the cycles.
 Gravitational and EM fields provide a complementary system with associated fields (and waves).
 We here assume that no measurement interferes with any other and that there is no limit to information or accuracy (space, time, mass…) as real numbers.
The Quantum Theory
 Observables are operators with known commutators [ , ] (interference properties) and form a Lie Algebra.
 The state of a system is a vector | > in a metric (infinite-dimensional Hilbert) space.
 | > must be a representation space of the operators; thus the C_ijk determine reality.
 A many-particle system is described by the outer product of these vectors, optimally written with creation a+ and annihilation a operators acting on a vacuum: |A> = a_A+ |0>, with a_A |0> = 0.
 The state vectors must also support the discrete group T, C, P.

Eight Questions that Bother me:
1. What are the fundamental observables and how are they related (in relativistic quantum theory)?
2. What is information & entropy – diffusion, the arrow of time, …?
3. How can order (information – life) emerge in view of the second law?
4. How can uncertain information be best represented mathematically?
5. What are networks – how can we understand them?
6. Is there a better way to represent data structures?
7. What is the truth? A possible new game theory.
   Can this ‘game theory’ improve education – knowledge & learning?
8. What is the ‘measurement’ (of an observable)?
   How is that related to information, entanglement, & classical mechanics?
1. What are the fundamental observables and how
are they related (in relativistic quantum theory)?
 X, P, I: the Heisenberg algebra, [X, P] = I = iħ = ih/2π
 M_μν: the Lorentz algebra with rotations (symmetry group)
 P_μ, M_μν: the Poincare algebra (larger symmetry group)
 I did extensions to X, P, M, I (a 15-parameter algebra).
 This extends the Poincare algebra with relativistic position operators.
 It results in a nice formal structure with TCP.
 But a proper-time dynamics of multiple particles is not easy.
 Still, it provided a beautiful group theory basis for particles that is much easier to utilize than the Poincare group.
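For reference, the standard textbook commutators of the Heisenberg and Poincare algebras mentioned above are shown below (in one common convention); the additional relations of the 15-parameter XPMI extension are not reproduced here.

```latex
% Standard commutation relations (one common convention).
\begin{align}
  [X_j, P_k] &= i\hbar\,\delta_{jk}\,I, \qquad [X_j, X_k] = [P_j, P_k] = 0
  && \text{(Heisenberg algebra)} \\
  [P_\mu, P_\nu] &= 0, \qquad
  [M_{\mu\nu}, P_\rho] = i\left(\eta_{\nu\rho}P_\mu - \eta_{\mu\rho}P_\nu\right) \\
  [M_{\mu\nu}, M_{\rho\sigma}] &= i\left(\eta_{\nu\rho}M_{\mu\sigma} - \eta_{\mu\rho}M_{\nu\sigma}
      - \eta_{\nu\sigma}M_{\mu\rho} + \eta_{\mu\sigma}M_{\nu\rho}\right)
  && \text{(Poincar\'e algebra)}
\end{align}
```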
The XPMI Lie Algebra
 The general state of an elementary particle is given by the simultaneously commuting operators for:
|h, |k0|, e(k0), k, b0, b1, s, s, + internal…>
 Heisenberg representation of the equations of motion of a free spin-½ particle
 Heisenberg representation of the equations of motion of a spin-½ charged particle
 I looked for extensions to XPMI to explain internal symmetries (charge, strangeness, …).
 I found an interesting regularity in the hadron mass spectra (possibly a 5th force).
 But it did not extend to higher-dimensional representations as I had thought.
 We know that internal symmetries are the result of the composite nature of matter – at least for the hadrons, from quarks.

Rethink these problems:
 My direction then changed to consider what it means to measure an observable & the meaning of information.
 In particular, I was bothered by the exactness of position and momentum measurements (which are impossible, i.e. the XP algebra is only approximate).
 I was also bothered by our number system and our representations of uncertainty – even classically.
 I was also bothered by the concept of measurement, its relationship to information, how it collapses the wave function, and how it provides a link to classical theory.
2. What is information & entropy –
diffusion, GL(n,R)?
 Information (which measures order) is the negative of entropy (which measures disorder).
 Disorder can be studied with the random walk of Einstein, or diffusion via a Markov matrix.

Markov matrix example (note: columns sum to unity) – each column holds 0.8 on the diagonal and 0.1 off the diagonal, e.g.

    ( 0.8  0.1  0.1 )
    ( 0.1  0.8  0.1 )
    ( 0.1  0.1  0.8 )
The Markov Lie Algebra basis elements form a basis for all off-diagonal elements.

Markov Lie Algebra basis element L_ij: +1 in the off-diagonal position (i, j), −1 in the diagonal position (j, j), and zero elsewhere, so every column sums to zero. For example, in 5 dimensions:

    L_24 = ( 0  0  0  0  0 )
           ( 0  0  0  1  0 )
           ( 0  0  0  0  0 )
           ( 0  0  0 -1  0 )
           ( 0  0  0  0  0 )
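A small numerical check (my own example, following the definition above): exponentiating a nonnegative combination of these basis elements gives a Markov matrix whose columns sum to unity.

```python
# Numerical check (my own example, following the definition above): a basis
# element L_ij has +1 at (i, j) and -1 at (j, j), so its columns sum to zero;
# exponentiating a nonnegative combination gives a Markov matrix.
import numpy as np
from scipy.linalg import expm

def markov_generator(n, i, j):
    """Markov Lie algebra basis element L_ij (n x n)."""
    L = np.zeros((n, n))
    L[i, j] = 1.0
    L[j, j] = -1.0
    return L

n = 5
rng = np.random.default_rng(0)
L = sum(rng.uniform(0.0, 1.0) * markov_generator(n, i, j)
        for i in range(n) for j in range(n) if i != j)

M = expm(0.3 * L)                        # a Markov (column-stochastic) matrix
print(np.allclose(M.sum(axis=0), 1.0))   # columns sum to unity
print(bool((M >= 0).all()))              # entries are non-negative
```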
 Although diffusion has no inverse, I began studying it from the point of view of group theory and found a new decomposition of the general linear group into a Markov-type group and the Abelian scaling group.
 GL(n,R) = M(n,R) + A(n,R)
 This gave insight into irreversibility but did not immediately explain ‘information’.
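As an illustration of how such a split can look at the Lie algebra level (my own sketch, stated for matrices rather than for the groups themselves): any real matrix separates into a zero-column-sum "Markov" part plus a diagonal "scaling" part.

```python
# Sketch of the split at the Lie algebra level (my own reading of the
# decomposition): A = M + D, where M has zero column sums (Markov type)
# and D is diagonal (Abelian scaling), with D carrying the column sums of A.
import numpy as np

def markov_scaling_split(A):
    """Split A into a zero-column-sum part M and a diagonal part D."""
    D = np.diag(A.sum(axis=0))           # diagonal part holds the column sums
    M = A - D                            # remainder has columns summing to zero
    return M, D

A = np.random.default_rng(1).normal(size=(4, 4))
M, D = markov_scaling_split(A)
print(np.allclose(M.sum(axis=0), 0.0))   # Markov-type part: zero column sums
print(np.allclose(M + D, A))             # the two parts reconstruct A
```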

What was the result of this work?
 Lie Algebras & Lie Groups were now connected to Markov theory & thus to diffusion.
 Two very different branches of mathematics were now connected, and we can use the understanding in one area to see further in the other.

3. How can order emerge in view of
the second law?
 I attended a talk in 1991 on Fibonacci numbers.
 I suspected that these numbers represented some kind of natural order in nature, but what was their source?
 Rather than a numerical sequence, I looked at them as (old rabbits, new rabbits) – a vector, as in the sketch below.
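A minimal sketch of that vector viewpoint (the standard matrix form of the Fibonacci recursion; the code is my illustration, not from the talk):

```python
# Standard matrix form of the Fibonacci recursion (my illustration): the state
# is a 2-vector of consecutive terms, evolved by a fixed linear map.
import numpy as np

F = np.array([[1, 1],        # new term = sum of the previous two
              [1, 0]])       # the previous "new" term becomes the "old" one

state = np.array([1, 0])     # (current, previous) terms of the sequence
for _ in range(10):
    state = F @ state

print(state)                 # (89, 55): consecutive Fibonacci numbers
```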

Formal Study of Order in Systems
 I reframed them as motion in GL(n,R) – but there were infinitely many such motions.
 Then I ‘quantized’ this classical continuous system as a two-component linear entropy-seeking system.
 This result led to an understanding of how order can emerge in non-equilibrium systems.
 I showed that the Fibonacci sequence is the simplest quantized linear system that preserves information.
 It also suggested a whole family of new types of Fibonacci sequences and how they are related.
 It also opened the question of why quantization occurs in a classical system.
4. How can uncertain information
(& numbers) be best represented?
 The fundamental bit of information (1 & 0) could be generalized as probabilities, but this does not form a ‘closed mathematical structure’.
 In looking at the fundamental 2-dimensional representations of the Markov algebra, (x1, x0), i.e. the transformations that leave x1 + x0 = 1 invariant, I realized that this ‘vector’ could be used to generalize a bit (1 or 0) of information.
 Take x1 = the probability to be true and x0 = the probability to be false.
 Thus I am suggesting that probabilities should always be treated as components of a vector, NOT a scalar.
 We already do this when we use a probability (or probability amplitude) function, as it includes all possible states.

 I developed a new mathematics based upon these objects (x1, x0), which I called a bit vector or ‘bittor’ (similar to the spinor).
 I (smoothly) generalized the Boolean truth tables as z_i = c^a_ijk x_j y_k (i, j, k = 1, 0 and a = 0, 1, …, 15 for the 16 types of AND, OR, NOR, … logic operations); a sketch follows below.
 I generalized numbers as outer products of these objects, similar to binary numbers.
 Renyi’s second-order entropy is a natural object in the truth table (EQV) and is extremely close to Shannon entropy numerically.
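A minimal sketch of what such generalized truth tables can look like. This is my own illustration assuming independent inputs; the talk's coefficients c^a_ijk are not reproduced here.

```python
# My own illustration (assuming independent inputs; the talk's coefficients
# c^a_ijk are not reproduced here): a bittor (x1, x0) with x1 + x0 = 1, and
# probabilistic versions of two of the 16 logic operations, plus the two
# entropy measures mentioned above.
import math

def bittor(p_true):
    return (p_true, 1.0 - p_true)                 # (x1, x0)

def AND(x, y):                                    # independent-probability AND
    p = x[0] * y[0]
    return (p, 1.0 - p)

def OR(x, y):                                     # independent-probability OR
    p = x[0] + y[0] - x[0] * y[0]
    return (p, 1.0 - p)

def shannon(x):
    return -sum(p * math.log2(p) for p in x if p > 0)

def renyi2(x):                                    # second-order Renyi entropy
    return -math.log2(sum(p * p for p in x))

a, b = bittor(0.9), bittor(0.7)
print(AND(a, b), OR(a, b))
print(shannon(a), renyi2(a))                      # compare the two entropies
```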




 The rebuilding of all arithmetic operations can be done smoothly, but with some difficult aspects (probability correlations).
 One hope is to automatically manage uncertainty in all of mathematics in computers.
 Numbers are now Markov ‘group’ representations ()()…().
 Eigenvalues are now bittors, and a quantum state can represent a broad location of any probability (as defined by the bittor product).
 The fundamental branch operation IF x>y THEN aaa ELSE bbb now evokes a bifurcation, generating new threads of computation like particle creation in quantum theory (sketched below).
 This is much like how our mind reasons, using simultaneous multiple threads and dropping outcomes with low probability (annihilation of threads).
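A toy sketch of the branching idea (my illustration only; the threshold is arbitrary): when the truth of the condition is itself a bittor, the IF/ELSE spawns weighted threads instead of choosing one.

```python
# Toy sketch (my illustration only; the threshold is arbitrary): when the truth
# of "x > y" is itself uncertain, IF/ELSE spawns two weighted threads instead of
# choosing one, and low-probability threads are dropped ("annihilated").
def uncertain_if(p_condition, then_value, else_value, threshold=0.01):
    threads = [(p_condition, then_value), (1.0 - p_condition, else_value)]
    return [(p, v) for p, v in threads if p >= threshold]

print(uncertain_if(0.995, "aaa", "bbb"))   # keeps only the 'then' thread
print(uncertain_if(0.6,   "aaa", "bbb"))   # keeps both threads, with weights
```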
5. What are networks – how can we
understand them?
 In classical physics the state of a system (position, momentum, …) is given by a vector.
 This is also true in both relativity and in quantum theory – just a vector in higher dimensions.
 But a network is a set of relationships that must be expressed by a matrix.
 Networks describe power grids, transportation networks, neural networks, communications networks (internet & phone), electrical networks, and financial networks.
 Their topologies are of exceptional difficulty and complexity.


 The relationship of entity i to entity j is given by a positive value or zero, C_ij – a matrix with zero diagonal and non-negative off-diagonal values.
 I was able to show that if the diagonal of C_ij is set to the negative sum of all other elements in that column, then the result is always a member of the Markov Lie algebra.
 Thus each network exactly determines a continuous set of Markov transformations, analogous to diffusion among the nodes at the rates of connection.
 We have now connected all possible continuous Markov transformations in a 1-1 fashion to all possible network topologies.
 We now have connected three branches of mathematics – Lie algebras to Markov theory to network theory – uniquely.
 Results in one area can be used in another.
 This allowed us to derive entropy spectra over the network columns and identify network attacks and aberrations (see the sketch below).
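A small numerical sketch of this construction (my own code and example weights, following the description above): the connection matrix becomes a Markov generator, its exponential diffuses probability over the network, and the column entropies give a spectrum that can be monitored.

```python
# Numerical sketch of this construction (my own code and example weights):
# the connection matrix C (non-negative, zero diagonal) becomes a Markov
# generator by placing minus the column sums on the diagonal; exp(t L) then
# diffuses probability over the network, and each column has an entropy.
import numpy as np
from scipy.linalg import expm

def network_to_generator(C):
    """Non-negative weights with zero diagonal -> Markov Lie algebra element."""
    L = C.astype(float).copy()
    np.fill_diagonal(L, -C.sum(axis=0))      # diagonal = minus the column sums
    return L

def column_entropies(M):
    """Shannon entropy of each column of a column-stochastic matrix."""
    return [-sum(p * np.log2(p) for p in col if p > 0) for col in M.T]

C = np.array([[0, 2, 0, 1],                  # illustrative network weights
              [2, 0, 1, 0],
              [0, 1, 0, 3],
              [1, 0, 3, 0]], dtype=float)

L = network_to_generator(C)
M = expm(0.2 * L)                            # diffusion on the network
print(np.allclose(M.sum(axis=0), 1.0))       # columns sum to unity
print(column_entropies(M))                   # the column entropy spectrum
```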

6. Is there a better way to
represent data structures?
 Relational databases are powerful but very difficult to build.
 Hierarchical-type data structures (like XML) are highly flexible but lack speed of search and power of analysis.
 I have designed a new kind of database that is in the middle – part XML, part RDB.
 This concept uses information ‘structures’, like Lego blocks, connected in pairs by relationships, thus building a database that is a network of simple tables.
 Our work resembles new research on the semantic web by Tim Berners-Lee.


Example record: a person (fn Joe, ln Johnson, ss 123-45-6789) linked by the relationship “works at” to a place (USC), which in turn carries a “work contact” block (Ph 803-777-6431, Web www.asg.sc.edu).
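One way such a network of simple tables could be represented (my own sketch of the concept, not the actual system; the field names mirror the example record above):

```python
# My own sketch of the concept (not the actual system): a database as a network
# of small information units connected in pairs by named relationships; the
# field names and values mirror the example record above.
units = {
    "person:1":  {"fn": "Joe", "ln": "Johnson", "ss": "123-45-6789"},
    "place:1":   {"name": "USC"},
    "contact:1": {"Ph": "803-777-6431", "Web": "www.asg.sc.edu"},
}

# Each relationship links exactly two units, like Lego blocks snapped together.
links = [
    ("person:1", "works at", "place:1"),
    ("place:1", "work contact", "contact:1"),
]

# Fields can be added or removed at will, since each unit is a simple table.
units["person:1"]["title"] = "Professor"

def neighbors(unit_id):
    """Everything reachable from a unit through one relationship."""
    return [(rel, b) for a, rel, b in links if a == unit_id]

print(neighbors("person:1"))
```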
 This new database has the advantages that (a) one can build a database after an hour of instruction, and (b) fields can be added and removed at will.
 With an icon click, the system will build an associated relational database and also an XML database.
 Thus I suggest that our fundamental information databases should be networks of information units.
 Complex information is thus a network (or Markov Lie algebra) with weighted links among information units.

7. A New Type of Game Theory – What is True?
 Recall the two-person game as a payoff matrix.
 Dr. John Nash (Nobel Prize for the Nash equilibrium in von Neumann game theory) gave the invited dinner talk at an MIT conference on complex systems and spoke of the difficulties of solving even a three-person game when alliances are allowed.
 We already know that there are extreme difficulties with N-person games, non-zero-sum games, and, even worse, with how to assign the payoff matrix in any real situation.
 Standard von Neumann game theory has the payoffs as the matrix elements. A two-person game:

         1    2    3
    A   -4   12    5
    B    7   -3    2
Proposal:
 Based upon my work with the automatic grading of questions posed to a class, it occurred to me that a different kind of game theory could be designed that solved all four of the problems with the von Neumann formulation.
 Questions are posed that have one-word or one-number unique, unambiguous answers.
 One gets ‘paid’ for the right answer.
 The probability that each person is right (eq. 1) and the probability that each response is right (eq. 2) are coupled nonlinear equations that can be solved iteratively for self-consistent solutions (a sketch follows below).
 The payoff is automatically determined from information theory by the weight of a vote, log(P/(1-P)), the logit function.
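A toy version of such a self-consistent iteration (my own sketch, not the patented design or algorithm; all names and data are made up):

```python
# Toy iteration (my own sketch, not the patented design or algorithm; names and
# data are made up): couple "probability person p is right" with "probability
# answer a to question q is right", and weight each vote by the logit log(P/(1-P)).
import math
from collections import defaultdict

# responses[(person, question)] = the one-word / one-number answer given
responses = {("Ann", "q1"): "4", ("Bob", "q1"): "4", ("Cal", "q1"): "5",
             ("Ann", "q2"): "x", ("Bob", "q2"): "y", ("Cal", "q2"): "x"}

people = {p for p, _ in responses}
questions = {q for _, q in responses}
p_right = {p: 0.6 for p in people}       # initial guess of each person's reliability

for _ in range(20):
    # Schematic "eq. 2": score each answer by the logit-weighted votes it received.
    score = defaultdict(float)
    for (p, q), a in responses.items():
        score[(q, a)] += math.log(p_right[p] / (1.0 - p_right[p]))
    consensus = {}
    for q in questions:
        answers = {a for (pp, qq), a in responses.items() if qq == q}
        consensus[q] = max(answers, key=lambda a: score[(q, a)])
    # Schematic "eq. 1": update each person's reliability from the consensus.
    for p in people:
        given = [(q, a) for (pp, q), a in responses.items() if pp == p]
        frac = sum(a == consensus[q] for q, a in given) / len(given)
        p_right[p] = min(max(frac, 0.05), 0.95)   # clamp to keep the logit finite

print(consensus)    # self-consistent consensus answers
print(p_right)      # self-consistent per-person reliabilities
```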




 I presented this framework at the last MIT Complexity Conference and have applied for a patent on the design and algorithm.
 I also gave a seminar at the University of Utah last summer on this new game theory design.
 We intend to apply this to expert consensus voting for investments, petroleum mining, medical & pharmaceutical judgments, and related problems.
7’. Can this ‘game theory’ improve
education – knowledge & learning?
 I am working on first applying this system to education by having students with pocket web devices log on upon entering a class and enter simple responses to questions from the instructor.
 The instructor has a laptop showing all responses, which are automatically graded.


 Building on this system, with demographic data collection, one can monitor the learning rates of all students, keep them engaged, and send them their grades after each class (as well as to the parents and the school, district, and state administrators).
 I envision a system that also estimates the value, validity, and other metrics of the questions, along with extensive correlates.
 One aspect of the system is to monitor keystroke rates to flag and track the well-being of participants.
 Error and misspelling rates are also tracked for the respondents.
 The quality, validity, difficulty, and correlates of the instructor’s questions are monitored automatically against other indicators.
 Peer, subordinate, and supervisory evaluations are performed periodically, are anonymous, and are in themselves self-correcting and graded.
8. What is measurement of an observable?
How is that related to information and
entanglement?
 If regular numbers are special cases of bittor numbers, then the eigenvalues in quantum theory become the representations of the Markov-type Lie group (monoid) – a kind of ‘third quantization’.
 These bittor eigenvalues provide a ‘fuzzy quantization’ of continuum eigenvalues such as space-time and energy-momentum. Space-time becomes ‘pixilated’ automatically.
 The localization of particles is thus limited by bittor values whose uncertainty is at least bounded by their Compton wavelength.
 The determination of any momentum is also limited by bittor values whose uncertainty is bounded by the values in a box the size of the universe.


 Consider an eigenvector |x….; p......> where the uncertainty in each does not violate the Heisenberg principle.
 There are an infinite number of degrees of freedom for the uncertain bittor string, but each bittor is only half as important as the preceding bittor.
 Thus we get 1 + ½ + ¼ + … (= 2) new degrees of freedom, or dimensions, in each of the 4 space-time directions, so 4 + 4×2 = 12 – a 12-dimensional space-time.



 A consequence of the numerical uncertainty for space-time is that one adds sequentially (with decreasing importance) new dimensions, 4·(2^n − 1), to space-time (giving it an extra 4, 12, … dimensions), with some topological constraints that are complicated.
 These are analogous to the ground-state energies (e.g. as with the harmonic oscillator) due to the uncertainty principle, which manifests itself here in a similar way.


 The manifold that joins the extra ‘degrees of freedom of uncertainty’ is a joining of a set of 4 positive-quadrant sections of two-dimensional complex spaces – we get a nine-dimensional space and a three-dimensional time.
 These eight additional tiny curled-up dimensions that emerge are twice as many as in string theory, but turn out to have the feature of being ‘quantized’ (points are designated by integers).
Thank You
 I deeply appreciate the years of support, in all of its forms, which you, my colleagues, have given me.
 It has been wonderful to be a part of this University and this faculty.
 I intend to continue working on grants, contracts, and these research directions.
