Synthesis for Finite State Machines
FSM (Finite State Machine) Optimization
State tables
   ↓
State minimization: identify and remove equivalent states
   ↓
State assignment: assign a unique binary code to each state
   ↓
Combinational logic optimization: use unassigned state codes as don't cares
   ↓
Net-list
FSM Optimization
[Figure: an example state-transition graph (states S1-S4 with input/output edge labels) and the standard FSM structure: combinational logic maps primary inputs (PI) and present state (PS) to primary outputs (PO) and next state (NS), with state registers closing the feedback loop.]
State Minimization
Goal: identify and remove redundant states (states that cannot be distinguished by observing the FSM's I/O behavior)
Why:
1. Reduce the number of latches
   – assign minimum-length encoding
   – the number of state bits grows only as the logarithm of the number of states
2. Increase the number of unassigned state codes
   – a heuristic to improve state assignment and logic optimization
State Minimization Definition
• Completely-specified state machines
  – two states are equivalent if the outputs are identical for all input combinations and the next states are equivalent for all input combinations
  – equivalence of states is an equivalence relation, which partitions the states into disjoint equivalence classes
• Incompletely-specified state machines
  – states are compatible (rather than equivalent) when their outputs and next states agree wherever both are specified; compatibility is not an equivalence relation, which makes exact minimization harder
Classical State Minimization
1. Partition the states based on the input/output values asserted in each state.
2. Refine the partitions so that all states in a partition transition into the same next-state partition (under corresponding inputs).
(A code sketch of this procedure follows the example below.)
Example
Ex:

state   input 0   input 1
A       B, 0      C, 0
B       D, 0      E, 0
C       F, 0      A, 0
D       H, 0      G, 0
E       B, 0      C, 0
F       D, 0      E, 0
G       F, 1      A, 0
H       H, 0      A, 0
(each entry is next state, output)

Successive partition refinements:
(A,B,C,D,E,F,H)(G)       G alone asserts output 1
(A,B,C,E,F,H)(G)(D)      D alone transitions into (G)
(A,C,E,H)(G)(D)(B,F)     B and F alone transition into (D)
(A,C,E)(G)(D)(B,F)(H)    H transitions into (A,C,E,H) on input 0, not into (B,F)
The final partition is stable, so the minimized machine has five states.
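A minimal Python sketch of this partition-refinement procedure, run on the table above (the data structures and names are illustrative, not from any particular tool):

    # State table: (state, input) -> (next state, output)
    table = {
        ('A', 0): ('B', 0), ('A', 1): ('C', 0),
        ('B', 0): ('D', 0), ('B', 1): ('E', 0),
        ('C', 0): ('F', 0), ('C', 1): ('A', 0),
        ('D', 0): ('H', 0), ('D', 1): ('G', 0),
        ('E', 0): ('B', 0), ('E', 1): ('C', 0),
        ('F', 0): ('D', 0), ('F', 1): ('E', 0),
        ('G', 0): ('F', 1), ('G', 1): ('A', 0),
        ('H', 0): ('H', 0), ('H', 1): ('A', 0),
    }
    states = sorted({s for s, _ in table})
    inputs = sorted({i for _, i in table})

    # Step 1: partition by the outputs asserted in each state.
    blocks = {}
    for s in states:
        blocks.setdefault(tuple(table[s, i][1] for i in inputs), []).append(s)
    partition = list(blocks.values())

    # Step 2: refine until every block transitions into whole blocks.
    def block_index(s):
        return next(k for k, b in enumerate(partition) if s in b)

    while True:
        refined = {}
        for s in states:
            key = (block_index(s),
                   tuple(block_index(table[s, i][0]) for i in inputs))
            refined.setdefault(key, []).append(s)
        if len(refined) == len(partition):
            break
        partition = list(refined.values())

    print(partition)  # five blocks: [A,C,E], [B,F], [D], [G], [H]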
State Assignment
• Assign a unique code to each state to produce a logic-level description
  – utilize unassigned codes effectively as don't cares
• Choices for an S-state machine
  – minimum-bit encoding: ⌈log2 S⌉ bits
  – maximum-bit encoding: one-hot encoding, using one bit per state
  – something in between
• Modern techniques
  – hypercube embedding of face constraints derived for collections of states (Kiss, Nova)
  – adjacency embedding guided by weights derived between state pairs (Mustang)
Hypercube Embedding Technique
• Observation: one-hot encoding is the easiest to decode.
  Am I in state 2, 5, 12, or 17?
  binary:  x4'x3'x2'x1x0' (00010) + x4'x3'x2x1'x0 (00101) +
           x4'x3x2x1'x0' (01100) + x4x3'x2'x1'x0 (10001)
  one-hot: x2 + x5 + x12 + x17
  But one-hot uses too many flip-flops.
• Exploit this observation:
  1. Two-level minimization after one-hot encoding identifies useful state groups for decoding.
  2. Assigning the states in each group to a single face of the hypercube allows a single product term to decode the group of states.
State Group Identification
Ex: state machine
input   current state   next state   output
0       start           S6           00
0       S2              S5           00
0       S3              S5           00
0       S4              S6           00
0       S5              start        10
0       S6              start        01
0       S7              S5           00
1       start           S4           01
1       S2              S3           10
1       S3              S7           10
1       S4              S6           10
1       S5              S2           00
1       S6              S2           00
1       S7              S6           00
Symbolic implicant: represents a transition from one or more states to a next state under some input condition.
Representation of Symbolic Implicant
The symbolic cover representation is related to multiple-valued logic.
Positional cube notation: a p-valued logic variable is represented as p bits
(V1, V2, ..., Vp)
Ex: the value V = 4 of a 5-valued variable is
(00010)
A set of values is represented by one string.
Ex: V = 2 or V = 4 is
(01010)
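A tiny Python sketch of positional cube notation, mirroring the two examples above (the helper name is mine):

    # Encode a set of values of a p-valued variable as p bits.
    def positional_cube(values, p):
        return ''.join('1' if v in values else '0' for v in range(1, p + 1))

    print(positional_cube({4}, 5))     # 00010
    print(positional_cube({2, 4}, 5))  # 01010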
Minimization of Multi-valued Logic
Find a minimum multiple-valued-input cover
- espresso
Ex: A minimal multiple-valued-input cover
0 0110001 0000100 00
0 1001000 0000010 00
1 0001001 0000010 10
State Group
Consider the first symbolic implicant
0 0110001 0000100 00
• This implicant shows that input "0" maps "state-2", "state-3", or "state-7" into "state-5" and asserts output "00".
• It shows that the effect of symbolic logic minimization is to group together the states that are mapped by some input into the same next state while asserting the same output.
• We call such a set of states a "state group". If the states in a state group are given adjacent binary codes (so that they span a single face) and no other state is encoded on that group face, then the state group can be decoded by a single cube.
Group Face
• group face: the minimal-dimension subspace containing the encodings assigned to that group.
Ex: encodings 0010, 0100, 0110 → group face 0**0
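A short Python sketch of computing a group face as the smallest cube containing a set of codes (illustrative helper name):

    # Bits that agree across all codes stay fixed; others become '*'.
    def group_face(codes):
        face = list(codes[0])
        for code in codes[1:]:
            for i, bit in enumerate(code):
                if face[i] != bit:
                    face[i] = '*'
        return ''.join(face)

    print(group_face(['0010', '0100', '0110']))  # 0**0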
Hyper-cube Embedding
[Figure: two embeddings of the state groups {2,5,12,17} and {2,6,17} on a 3-cube with axes a, b, c; in the first embedding each group lies on its own face, while the second is marked "wrong!" because a group does not fit a single face.]
Hyper-cube Embedding
[Figure: two embeddings of the state groups {2,6,17} and {2,4,5} on a 3-cube with axes a, b, c; again the first embedding satisfies the face constraints and the second is marked "wrong!".]
How to Check if a State Assignment Satisfies the Constraint Matrix?
Step 1: Find the group face of each encoded state group.
Step 2: For every state that does not belong to a state group, check that its code does not intersect that group's face.
Example
Constraint matrix A, state encoding S and
group-face matrix F
      0110001        010
A =   1001000    S = 110
      0001001        101
                     000
                     001
                     011
                     100

Step 1: group-face matrix F = A · S (each row of F is the smallest cube containing the codes of the states in that group):

      1 * *
F =   0 * 0
      * 0 0

Step 2: Check the encoding of state-6 = [0 1 1].
Since state-6 does not belong to groups 1, 2, or 3, check:
[0 1 1] ∩ [1 * *] = ∅
[0 1 1] ∩ [0 * 0] = ∅
[0 1 1] ∩ [* 0 0] = ∅
All intersections are empty, so the encoding of state-6 satisfies the constraints.
Other State Encoding
If the encoding of state-6 = [1 1 1], check:
[1 1 1] ∩ [1 * *] = 111
[1 1 1] ∩ [0 * 0] = ∅
[1 1 1] ∩ [* 0 0] = ∅
The intersection with the first group face is non-empty, so this encoding does not satisfy the constraints.
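A Python sketch of the whole check on the example above (helper names are mine; intersects tests whether a code lies on a face):

    def group_face(codes):
        face = list(codes[0])
        for code in codes[1:]:
            for i, bit in enumerate(code):
                if face[i] != bit:
                    face[i] = '*'
        return ''.join(face)

    def intersects(code, face):
        # A code meets a face iff it matches every non-'*' position.
        return all(f in ('*', c) for c, f in zip(code, face))

    A = ['0110001', '1001000', '0001001']                  # constraint matrix
    S = ['010', '110', '101', '000', '001', '011', '100']  # state encodings

    # Step 1: group faces F = A . S
    F = [group_face([S[i] for i, m in enumerate(row) if m == '1'])
         for row in A]
    print(F)  # ['1**', '0*0', '*00']

    # Step 2: a state may lie on a group face only if it is in that group.
    ok = all(intersects(S[i], face) == (row[i] == '1')
             for row, face in zip(A, F) for i in range(len(S)))
    print(ok)  # True here; False if state-6 were encoded 111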
Algorithm for State Assignment
Step 1: Select an uncoded state (or a state subset).
Step 2: Determine the encodings for that state
(states) satisfying the constraint relation.
Step 3: If no encoding exists, increase the state
code dimension and go to Step 2.
Step 4: Assign an encoding to the selected state
(states).
Step 5: If all states have been encoded, stop. Else
go to Step 1.
Step 3
– We can always increase the coding length by one bit.
– New state assignment:
  1. For the states already assigned, append a 0 at the end.
  2. For the new state, ns:
     case 1: ns does not belong to any state group:
             encoding of ns = [c | 1], where c is any vector
     case 2: ns belongs to some state group:
             encoding of ns = [c | 1], where c is the encoding of any state that belongs to the state group
Example
Ex:

      0101        00
A =   1010    S = 10
      1100        01
                  11

To encode a new state (state-5), we have a new constraint matrix:

       01011
A' =   10100
       11000

For the states already assigned, we append a 0 to get a new encoding:

       000
S' =   100
       010
       110

The new state (state-5) belongs to group 1 = {2, 4, 5}, so its encoding is [c | 1] with c the code of state-2 or state-4:
ns = [1 0 1] or [1 1 1]
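A small Python sketch of this lengthening step on the example (variable names are mine):

    A_new = ['01011', '10100', '11000']   # new constraint matrix
    S_old = ['00', '10', '01', '11']      # existing 2-bit codes

    # Already-assigned states: append a 0.
    S_new = [c + '0' for c in S_old]
    print(S_new)                          # ['000', '100', '010', '110']

    # state-5 belongs to group 1 = {2, 4, 5} (case 2): its code is
    # [c | 1] for c the code of an already-assigned state in the group.
    group1 = [i for i, m in enumerate(A_new[0]) if m == '1']    # [1, 3, 4]
    print([S_old[i] + '1' for i in group1 if i < len(S_old)])   # ['101', '111']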
Hyper-cube Embedding Method
• Advantages:
  – uses a two-level logic minimizer to identify good state groups
  – retains almost all of the advantages of one-hot encoding, but with fewer state bits
Adjacency-Based State Assignment
Basic algorithm:
(1) Assign weight w(s,t) to each pair of states
– the weight reflects the desirability of placing the states adjacent on the hypercube
(2) Define a cost function for the assignment of codes to the states
– penalize each pair's weight by the distance between the state codes,
  e.g. w(s,t) * distance(enc(s), enc(t))
(3) Find an assignment of codes that minimizes this cost function summed over all pairs of states.
– heuristic to find an initial solution
– pair-wise interchange (simulated annealing)
to improve solution
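A compact Python sketch of this cost function (the weights below are placeholders, not derived from any slide):

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def cost(weights, enc):
        # Sum of w(s,t) * distance(enc(s), enc(t)) over weighted pairs.
        return sum(w * hamming(enc[s], enc[t])
                   for (s, t), w in weights.items())

    weights = {('S1', 'S3'): 3, ('S2', 'S4'): 2}   # illustrative only
    enc = {'S1': '00', 'S2': '01', 'S3': '10', 'S4': '11'}
    print(cost(weights, enc))  # 3*1 + 2*1 = 5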
Adjacency-Based State Assignment
• Mustang: a weight-assignment technique based on loosely maximizing common cube factors
How to Assign Weight to State Pair
• Assign weights to state pairs based on the ability to extract a common-cube factor if the two states are adjacent on the hypercube.
Fan-Out-Oriented
(examine present-state pairs)
• A present-state pair transitions to the same next state:
    $$$  S1    S2  $$$$
    $$$  S3    S2  $$$$
  Add n to w(S1, S3) because of S2.
Fan-Out-Oriented
• A present-state pair asserts the same output j:
    $$$  S1    S2  $$$1$
    $$$  S3    S4  $$$1$
  Add 1 to w(S1, S3) because of output j.
Fanin-Oriented (examine next-state pairs)
• The same present state causes transitions to a next-state pair:
    $$$  S1    S2  $$$$
    $$$  S1    S4  $$$$
  Add n/2 to w(S2, S4) because of S1.
Fanin-Oriented (examine next-state pairs)
• The same input causes transitions to a next-state pair:
    $0$  S1    S2  $$$$
    $0$  S3    S4  $$$$
  Add 1 to w(S2, S4) because of input i.
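A rough Python sketch of the four weight rules above (my reading of these slides, not Mustang's exact algorithm; I take n to be the number of encoding bits, and compare inputs as plain strings, ignoring '$' don't cares):

    from collections import defaultdict
    from itertools import combinations

    def mustang_weights(rows, n):
        """rows: (input, present_state, next_state, output) tuples."""
        w = defaultdict(int)
        def add(s, t, amount):
            if s != t and amount > 0:
                w[tuple(sorted((s, t)))] += amount
        for (i1, p1, n1, o1), (i2, p2, n2, o2) in combinations(rows, 2):
            if n1 == n2:            # present-state pair, same next state
                add(p1, p2, n)
            shared = sum(a == b == '1' for a, b in zip(o1, o2))
            add(p1, p2, shared)     # present-state pair, same asserted outputs
            if p1 == p2:            # next-state pair, same present state
                add(n1, n2, n // 2)
            if i1 == i2:            # next-state pair, same input
                add(n1, n2, 1)
        return dict(w)

Run on the 14-row example table earlier, this accumulates weights such as w(S2, S3), since both states map to S5 under input 0.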
Which Method Is Better?
• Which is better?
  – FSMs with no useful two-level face constraints => adjacency embedding
  – FSMs with many two-level face constraints => face embedding