DLP®-Driven, Optical Neural Network Results and Future Design

Emmett Redd
Professor
Missouri State University
Neural Network Applications
• Stock Prediction: Currency, Bonds, S&P 500, Natural Gas
• Business: Direct Mail, Credit Scoring, Appraisal, Summoning Juries
• Medical: Breast Cancer, Heart Attack Diagnosis, ER Test Ordering
• Sports: Horse and Dog Racing
• Science: Solar Flares, Protein Sequencing, Mosquito ID, Weather
• Manufacturing: Welding Quality, Plastics or Concrete Testing
• Pattern Recognition: Speech, Article Classification, Chemical Drawings
• No Optical Applications—We are starting with Boolean
• Most from www.calsci.com/Applications.html
Optical Computing & Neural Networks
• Optical Parallel Processing Gives Speed
– Lenslet's Enlight 256—8000 Giga Multiply and Accumulate per second
– Order 10¹¹ connections per second possible with holographic attenuators
• Neural Networks
– Parallel versus Serial
– Learn versus Program
– Solutions beyond Programming
– Deal with Ambiguous Inputs
– Solve Non-Linear Problems
– Thinking versus Constrained Results
Optical Neural Networks
• Sources are modulated light beams (pulse or amplitude)
• Synaptic multiplications are due to attenuation of light passing through an optical medium (30 fs)
• Geometric or Holographic
• Target neurons sum signals from many source neurons
• Squashing by operational amplifiers or nonlinear optics
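The optical forward pass described above (modulated sources, attenuation as multiplication, summation at target neurons, then squashing) can be sketched numerically. This is a minimal NumPy illustration, not the talk's actual MATLAB demo code; `optical_layer` and its `gain` parameter are made-up names.

```python
import numpy as np

def logsig(z):
    """Sigmoid squashing, standing in for the op-amp or nonlinear optics."""
    return 1.0 / (1.0 + np.exp(-z))

def optical_layer(x, W, gain=1.0):
    """One optical layer: per-synapse attenuation (multiply), target-neuron
    summation, then squashing.

    x : source-neuron light amplitudes, values in [0, 1]
    W : attenuation pattern of the synaptic medium, entries in [-1, 1]
    """
    s = W @ x                 # each target neuron sums its attenuated beams
    return logsig(gain * s)

# hypothetical example: two source neurons, one target neuron
x = np.array([1.0, 0.0])
W = np.array([[0.5, 0.5]])
y = optical_layer(x, W)
```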
Standard Neural Net Learning
• We use a Training or Learning algorithm to adjust the weights, usually in an iterative manner.
[Figure: feed-forward network with inputs x1 … xN, summation neurons Σ, and outputs y1 … yM; the learning algorithm adjusts the weights using the target output (T) and other information.]
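The iterative weight adjustment on this slide can be illustrated with the delta rule; this NumPy sketch is only one common choice of learning algorithm, not necessarily the one used in the talk, and all names and values are illustrative.

```python
import numpy as np

def logsig(z):
    return 1.0 / (1.0 + np.exp(-z))

def delta_rule_step(W, x, T, lr=0.5):
    """One iterative adjustment of the weights toward target output T."""
    y = logsig(W @ x)
    err = T - y
    # gradient of squared error propagated through the sigmoid
    W += lr * np.outer(err * y * (1.0 - y), x)
    return W, err

rng = np.random.default_rng(0)
W = rng.uniform(-0.5, 0.5, size=(1, 2))   # 2 inputs, 1 output
x = np.array([1.0, 0.5])
T = np.array([0.8])
for _ in range(500):                       # "usually in an iterative manner"
    W, err = delta_rule_step(W, x, T)
```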
FWL-NN is equivalent to a standard Neural Network + Learning Algorithm
[Figure: the FWL-NN drawn as a single recurrent network of summation neurons with the learning algorithm built in.]
Optical Recurrent Neural Network
[Figure: Signal Source (Layer Input) → Presynaptic Optics → Synaptic Medium (35 mm Slide) → Postsynaptic Optics → Target Neuron Summation → Squashing Functions → Micromirror Array; Recurrent Connections feed back from the Layer Output.]
A Single Layer of an Optical Recurrent Neural Network. Only
four synapses are shown. Actual networks will have a large
number of synapses. A multi-layer network has several
consecutive layers.
Definitions
• Fixed-Weight Learning Neural Network (FWL-NN)
– A recurrent network that learns without
changing synaptic weights
• Potency – A weight signal
• Tranapse – A Potency modulated synapse
• Planapse – Supplies Potency error signal
• Zenapse – Non-Participatory synapse
• Recurron – A recurrent neuron
• Recurral Network – A network of Recurrons
Optical Fixed-Weight Learning Synapse
[Figure: a Tranapse (inputs x(t) and the Potency W(t-1)) and a Planapse (inputs x(t-1), y(t-1), and T(t-1)) feeding the summation Σ.]
Page Representation of a Recurron
[Figure: a Recurron page: a Bias and three Tranapses (signals U(t-1), B(t), A(t), O(t-1)) plus a Planapse feed the summation Σ.]
Optical Neural Network Constraints
• Finite Range Unipolar Signals [0, +1]
• Finite Range Bipolar Attenuation [-1, +1]
• Excitatory/Inhibitory handled separately
• Limited Resolution Signals
• Limited Resolution Synaptic Weights
• Alignment and Calibration Issues
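A sketch of how software might model these hardware constraints: clipping signals to the unipolar range, quantizing to a limited resolution, and splitting a bipolar weight into separate excitatory and inhibitory attenuations. The function names and the default bit depth are illustrative assumptions, not values from the talk.

```python
import numpy as np

def quantize(v, bits):
    """Round values in [0, 1] to the limited resolution of the hardware."""
    levels = 2**bits - 1
    return np.round(v * levels) / levels

def constrain_signal(x, bits=8):
    """Unipolar signal: clipped to [0, +1], then quantized."""
    return quantize(np.clip(x, 0.0, 1.0), bits)

def split_weight(w):
    """Bipolar attenuation in [-1, +1] handled as separate excitatory
    and inhibitory channels, each a unipolar attenuation in [0, 1]."""
    w = np.clip(w, -1.0, 1.0)
    return np.maximum(w, 0.0), np.maximum(-w, 0.0)
```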
Optical System
[Figure: the optical system, driven by a DMD or DLP®.]
Design Details and Networks
• Digital Micromirror Device
• 35 mm slide Synaptic Media
• CCD Camera
• Synaptic Weights - Positionally Encoded - Digital Attenuation
– Allows flexibility for evaluation.
• Networks: Recurrent AND, Unsigned Multiply, FWL Recurron
DMD/DLP®—A Versatile Tool
• Alignment and Distortion Correction
– Align DMD/DLP® to CCD——PEGS
– Align Synaptic Media to CCD——HOLES
– Calculate DMD/DLP® to Synaptic Media
Alignment——Putting PEGS in HOLES
– Correct projected DMD/DLP® Images
• Nonlinearities
Stretch, Squash, and Rotate
The distortion correction fits a polynomial map: each calibration point contributes one row of polynomial terms to the design matrix

D = [ 1  x₁  y₁  x₁²  x₁y₁  y₁²
      ⋮
      1  x_P  y_P  x_P²  x_P·y_P  y_P² ]

Terms by polynomial order:
• None: 1
• Linear: x¹y⁰, x⁰y¹
• Quadratic: x²y⁰, x¹y¹, x⁰y²
• Cubic: x³y⁰, x²y¹, x¹y², x⁰y³
• etc.
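Under the usual reading of this slide, the correction is a linear least-squares fit over those polynomial terms. A NumPy sketch (the function names are hypothetical, and the quadratic order matches the matrix above):

```python
import numpy as np

def design_matrix(x, y, order=2):
    """Rows of polynomial terms per point: 1; x, y; x^2, xy, y^2; ..."""
    cols = [x**i * y**(j - i) for j in range(order + 1)
                              for i in range(j, -1, -1)]
    return np.stack(cols, axis=1)

def fit_map(x, y, target, order=2):
    """Least-squares polynomial coefficients mapping (x, y) to target."""
    D = design_matrix(x, y, order)
    coef, *_ = np.linalg.lstsq(D, target, rcond=None)
    return coef
```

Given matched source and measured coordinates for the calibration points, `fit_map` is called once per output axis; applying `design_matrix(...) @ coef` then corrects new points.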
Where We Are and
Where We Want to Be
[Figure: two panels, "DMD Dots Image (pegs)" showing the point set C and "Slide Dots Image (holes)" showing C', both on roughly 1000 × 1200 pixel axes.]
CCD Image of Known DLP® Positions
DMD Dots Image (pegs): the software automatically finds the points by projecting 42 individual pegs.
C = H ● D
H = C ● Dᴾ
[Figure: CCD image of the 42 projected pegs on roughly 1000 × 1200 pixel axes.]
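Assuming the superscript P denotes the matrix pseudoinverse and the points are in homogeneous coordinates, the peg calibration H = C ● Dᴾ can be sketched as follows; every coordinate value and both matrices below are made up for illustration.

```python
import numpy as np

# Known DMD peg coordinates in homogeneous form (hypothetical values)
D = np.array([[100.0, 100.0, 900.0, 900.0],
              [100.0, 900.0, 100.0, 900.0],
              [  1.0,   1.0,   1.0,   1.0]])

# A known mapping, used here only to synthesize the CCD observations C
H_true = np.array([[1.10, 0.02,  5.0],
                   [0.01, 0.95, -3.0],
                   [0.00, 0.00,  1.0]])
C = H_true @ D                      # C = H . D

# Recover the DMD-to-CCD map from the observed pegs: H = C . D^P
H = C @ np.linalg.pinv(D)
```

Because the peg matrix D has full row rank, the pseudoinverse recovers the map exactly; with noisy CCD measurements it gives the least-squares estimate instead.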
CCD Image of Holes in Slide Film
Slide Dots Image (holes): manually click on each interference pattern to zoom in.
[Figure: CCD image of the hole diffraction patterns on roughly 1000 × 1200 pixel axes.]
Mark the Center
[Figure: a roughly 90 × 80 pixel zoom, titled "ZOOM OF DIFFRACTION PATTERN #1. Click on center".]
Plans for Automatic Alignment
• Methods and details for automatically finding the holes. We might add a couple more slides, more matrix manipulation, and even an Excel spreadsheet demonstration.
Eighty-four Clicks Later
Slide Dots Image (holes):
C' = M ● C
M = C' ● Cᴾ
[Figure: the manually marked hole centers on roughly 1000 × 1200 pixel axes.]
DLP® Projected Regions of Interest
D' = H⁻¹ ● M⁻¹ ● H ● D
and
C' = H ● D'
[Figure: CCD image of the regions of interest projected by the DLP®.]
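With small invertible matrices standing in for H and M, the pre-correction chain on this slide can be checked numerically. All values below are made up; the point is only that projecting the pre-corrected pattern D' yields C' = H ● D' = M⁻¹ ● (H ● D).

```python
import numpy as np

# Hypothetical 3x3 homogeneous maps: H (DMD -> CCD) and M (slide distortion)
H = np.array([[1.10, 0.02,  5.0],
              [0.01, 0.95, -3.0],
              [0.00, 0.00,  1.0]])
M = np.array([[1.02, 0.00,  2.0],
              [0.00, 0.98, -1.0],
              [0.00, 0.00,  1.0]])
D = np.array([[200.0, 600.0],
              [300.0, 500.0],
              [  1.0,   1.0]])      # two DMD points

# Pre-corrected DMD pattern: D' = H^-1 . M^-1 . H . D
D_prime = np.linalg.inv(H) @ np.linalg.inv(M) @ H @ D

# Projecting D' through the optics gives C' = H . D'
C_prime = H @ D_prime
```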
Nonlinearities
[Figure: measured light signals vs. weights for the FWL Recurron, both axes spanning 0 to 1. Opaque slides aren't: ~6% leakage.]
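The ~6% leakage means a commanded weight of zero still transmits light. A one-line model of the effective transmission is sketched below; the affine form is an assumption for illustration, since the measured curve on this slide is nonlinear.

```python
import numpy as np

LEAK = 0.06  # nominally opaque slide regions still pass ~6% of the light

def effective_transmission(w):
    """Map a commanded weight in [0, 1] to the light actually transmitted,
    assuming a simple affine model with the slide's leakage floor."""
    w = np.clip(w, 0.0, 1.0)
    return LEAK + (1.0 - LEAK) * w
```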
Neural Networks and Results
• Recurrent AND
• Unsigned Multiply
• Fixed Weight Learning Recurron
Recurrent AND
[Figure: IN and its 1-cycle-delayed copy IN-1 feed an AND, producing IN ∧ IN-1.]
Recurrent AND Neural Network
[Figure: source neurons (buffers) for IN and IN-1 plus a Bias (1); weights -14/16, 10/16, and 10/16 feed a summation Σ squashed by logsig(16·Σ) to give IN ∧ IN-1; the terminal-neuron delay path uses weights -1/2 and 2/2 with linsig(2·Σ).]
Synaptic Weight Slide
[Figure: the synaptic weight slide; weights 0.5, 10/16, -1/2, -14/16, 10/16, and 2/2.]
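With the weights above, the Recurrent AND behavior can be simulated directly. The demo itself ran in MATLAB; this is an equivalent Python sketch of the recurron output[t] = logsig(16·(-14/16 + (10/16)·IN[t] + (10/16)·IN[t-1])), with the one-cycle delay kept as a plain variable.

```python
import math

def logsig(z):
    return 1.0 / (1.0 + math.exp(-z))

def recurrent_and(pulses):
    """Simulate the Recurrent AND recurron: output[t] ~ IN[t] AND IN[t-1].

    Weights from the synaptic weight slide: bias -14/16, inputs 10/16 each;
    squashing is logsig(16 * sum)."""
    prev, out = 0.0, []
    for x in pulses:
        s = -14/16 + (10/16) * x + (10/16) * prev
        out.append(logsig(16 * s))
        prev = x          # one-cycle delay through the recurrent connection
    return out

y = recurrent_and([0, 1, 1, 0, 1])
```

Rounding the outputs shows the network fires only when two consecutive inputs are 1.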
Recurrent AND Demo
• MATLAB
Pulse Image (Regions of Interest)
[Figure: CCD pulse image with the regions of interest, on roughly 1000 × 1200 pixel axes.]
Output Swings Larger than Input
[Figure: pulse amplitudes over 100 cycles, signal range 0 to 1; the output swings are larger than the input swings.]
Synaptic Weight Slide
Unsigned Multiply Results
About 4 bits of accuracy.
[Figure: expected (blue) and obtained (black) products with the squared error (red) over 25 trials, signal range 0 to 1.]
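One way to read the "about 4 bits" figure: effective resolution from the RMS error of a unit-range signal, bits ≈ -log₂(RMS error). This interpretation is an assumption, not stated on the slide, and the helper name is made up.

```python
import numpy as np

def effective_bits(expected, obtained):
    """Effective resolution of a unit-range signal from its RMS error:
    an error of 1/2^n corresponds to roughly n bits."""
    e = np.asarray(expected, dtype=float)
    o = np.asarray(obtained, dtype=float)
    rms = np.sqrt(np.mean((e - o) ** 2))
    return -np.log2(rms)
```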
Page Representation of a Recurron
[Figure (repeated from earlier): a Recurron page with a Bias, three Tranapses (signals U(t-1), B(t), A(t), O(t-1)), and a Planapse feeding the summation Σ.]
FWL Recurron Synaptic Weight Slide
Optical Fixed-Weight Learning Synapse
[Figure (repeated from earlier): the Tranapse/Planapse fixed-weight learning synapse with signals x(t), x(t-1), y(t-1), W(t-1), and T(t-1).]
Future: Integrated Photonics
• Photonic (analog)
i. Concept
α. Neuron
β. Weights
γ. Synapses
Photonics Spectra and Luxtera
Continued
ii. Needs
α. Laser
β. Amplifier (detectors and control)
γ. Splitters
δ. Waveguides on Two Layers
ε. Attenuators
ζ. Combiners
η. Constructive Interference
θ. Destructive Interference
ι. Phase
Photonics Spectra and Luxtera
DLP®-Driven, Optical Neural Network
Results and Future Design
Emmett Redd & A. Steven Younger
Missouri State University
EmmettRedd@MissouriState.edu
Source Pulse
[Figure: CCD image of a source pulse.]