Lecture Overview – October 5, 2015
• Housekeeping
– Third assignment posted
– Second lab posted
– Questions about second assignment
– Review of first assignment/first lab
• Computer circuitry from logic gates to silicon
– AND, OR and NOT
– NAND, NOR and XOR
– Basic circuitry to add binary numbers
– Transistors
• Moore’s Law
• Abstract computers
– State machines (finite automata)
– Turing machines and Church’s thesis
Results of first assignments
[Histogram: scores on first assignment, ranging from 15 to 32]
How many college students in America? 12-20 million?? 4-6 million??
scores on first lab
[Histogram: scores on first lab, deductions ranging from 0 to -6]
What really happens in the CPU circuitry
• Three types of gates
– AND
– OR
– NOT
• How they behave logically
• What their circuitry looks like
• How we build things from gates
What defines a basic building block
• # inputs
• # outputs
• Relation between inputs and outputs (aka truth table)
Simplest circuit (pass through)
– 1 bit in, 1 bit out, output = input
– Truth table:

Input  Output
0      0
1      1
• For any truth table, we can build a circuit that realizes it
Basic gates
Notation: AND = multiplication, OR = addition, NOT = bar above
Often the goal is to use these circuits to synthesize a truth table
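As an illustrative sketch only (Python is not part of the lecture, and these function names are made up), the three basic gates and the claim that any truth table can be realized can be demonstrated with a sum-of-products construction:

```python
# Basic gates as functions on bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b): return a | b
def NOT(a): return 1 - a

# Sum-of-products: OR together one AND-term per truth-table row
# whose output is 1. This realizes any truth table using only
# AND, OR and NOT.
def synthesize(truth_table):
    """truth_table: list of ((input bits...), output bit) pairs."""
    ones = [row for row, out in truth_table if out == 1]
    def circuit(*inputs):
        result = 0
        for row in ones:
            term = 1
            for bit, want in zip(inputs, row):
                term = AND(term, bit if want == 1 else NOT(bit))
            result = OR(result, term)
        return result
    return circuit

# Example: synthesize XOR from its truth table.
xor = synthesize([((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)])
```

The synthesized `xor` behaves exactly like the exclusive-OR gate: 1 when the inputs differ, 0 when they agree.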
About transistors
• Transistor = transfer resistor
Their circuitry
And 3 additional gates: NAND, NOR and XOR
• Exclusive OR is often written as XOR, with symbol ⊕
What can you do with these gates?
• Add 1-bit numbers
– Sum is A XOR B
– Carry is A AND B

A  B  Sum  Carry
0  0  0    0
0  1  1    0
1  0  1    0
1  1  0    1
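The half-adder truth table above can be checked directly; this is an illustrative Python sketch, not part of the lecture (`^` is Python's XOR operator, `&` is AND):

```python
def half_adder(a, b):
    # Sum is A XOR B; Carry is A AND B.
    return a ^ b, a & b

# Print the four rows of the truth table.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(a, b, s, c)
```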
Full adder
S1 = (A1 XOR B1) XOR Carry1
Carry2 = (A1 AND B1) XOR ((A1 XOR B1) AND Carry1)

A1  B1  Carry1  S1  Carry2
0   0   0       0   0
0   1   0       1   0
1   0   0       1   0
1   1   0       0   1
0   0   1       1   0
0   1   1       0   1
1   0   1       0   1
1   1   1       1   1
• S1 = (A1 XOR B1) XOR Carry1
• Carry2 = (A1 AND B1) XOR ((A1 XOR B1) AND Carry1)

A1  B1  A1 AND B1  A1 XOR B1  Carry1  S1  Carry2
0   0   0          0          0       0   0
0   1   0          1          0       1   0
1   0   0          1          0       1   0
1   1   1          0          0       0   1
0   0   0          0          1       1   0
0   1   0          1          1       0   1
1   0   0          1          1       0   1
1   1   1          0          1       1   1
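The two formulas can be verified exhaustively: for every row of the table, the sum bit plus twice the carry-out equals A1 + B1 + Carry1. A Python sketch (illustrative, not part of the lecture):

```python
def full_adder(a1, b1, carry1):
    # S1 = (A1 XOR B1) XOR Carry1
    # Carry2 = (A1 AND B1) XOR ((A1 XOR B1) AND Carry1)
    s1 = (a1 ^ b1) ^ carry1
    carry2 = (a1 & b1) ^ ((a1 ^ b1) & carry1)
    return s1, carry2

# Check all 8 rows: the two output bits encode the arithmetic sum.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            s, c2 = full_adder(a, b, c)
            assert s + 2 * c2 == a + b + c
```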
Carry-Ripple Adder
[Diagram: three full adders chained together. X0 and Y0 feed the first full adder, whose carry-in is fixed at 0, producing Z0 and carry C0; X1, Y1 and C0 feed the second, producing Z1 and C1; X2, Y2 and C1 feed the third, producing Z2 and C2.]

    X2 X1 X0
  + Y2 Y1 Y0
============
C2 Z2 Z1 Z0

Could easily expand to 16 or 32 or 64 bits
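The chaining described above (each carry-out feeding the next adder's carry-in, with the first carry fixed at 0) can be sketched in Python for any width; names here are made up for illustration:

```python
def full_adder(a, b, carry_in):
    s = (a ^ b) ^ carry_in
    carry_out = (a & b) ^ ((a ^ b) & carry_in)
    return s, carry_out

def ripple_add(xs, ys):
    """Add two equal-length bit lists, least significant bit first."""
    carry = 0                      # first carry-in fixed at 0
    zs = []
    for x, y in zip(xs, ys):
        z, carry = full_adder(x, y, carry)
        zs.append(z)
    return zs + [carry]            # final carry-out becomes the top bit

# 3 + 3 = 6: bits are LSB-first, so 3 is [1, 1, 0]
# ripple_add([1, 1, 0], [1, 1, 0]) -> [0, 1, 1, 0], i.e. binary 0110 = 6
```

Widening the adder to 16, 32 or 64 bits just means passing longer bit lists.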
Going from logic to silicon
A memory cell built from 6 transistors
Fabrication: making chips
• grow layers of conducting and insulating materials on a thin
wafer of very pure silicon
• each layer has intricate pattern of connections
– created by complex sequence of chemical and photographic processes
• dice wafer into individual chips, put into packages
– yield is less than 100%, especially in early stages
• how does this make a computer?
– when conductor on one layer crosses one on lower layer,
voltage on upper layer controls current on lower layer
– this creates a transistor that acts as off-on switch
that can control what happens at another transistor
• wire widths keep getting smaller: more components in given area
– today ~0.022 micron = 22 nanometers; next is 14 nm
– 1 micron = 1/1000 of a millimeter (a human hair is about 100 microns)
– eventually this will stop
– has been "10 years from now" for a long time, but seems closer now
Moore's Law
(1965, Gordon Moore, founder & former CEO of Intel)
• computing power (roughly, number of transistors on a chip)
– doubles about every 18 months
– and has done so since ~1961
• consequences
– cheaper, faster, smaller, less power consumption per unit
– ubiquitous computers and computing
• limits to growth
– fabrication plants now cost $2-4B; most are outside US
– line widths are nearing fundamental limits
– complexity is increasing
• maybe some other technology will come along
– atomic level; quantum computing
– optical
– biological: DNA computing
From the NYTimes (9/26/15)
One transistor, about as wide as a cotton fiber, cost roughly $8 in
today’s dollars in the early 1960s; Intel was founded in 1968. Today,
billions of transistors can be squeezed onto a chip the size of a
fingernail, and transistor costs have fallen to a tiny fraction of a cent.
From the NYTimes 10/2/15
The advance would make it possible, probably sometime after the beginning of the next decade, to shrink the contact point between the two materials to just 40 atoms in width, the researchers said. Three years later, the number will shrink to just 28 atoms.
The ability to reduce electrical resistance will not only make it possible to extend the process of
shrinking transistors beyond long-held beliefs about physical limits. It may also be the key to once again
increasing the speed of computer processors, which has been stalled for the last decade.
Transistor counts and Moore's Law
A linear plot of Moore’s Law
A computer now is around 130,000 times more powerful than in 1988
From http://www.dai.ed.ac.uk/homes/cam/Robots_Wont_Rule.shtml
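As a rough sanity check (my arithmetic, not from the slides): a 130,000x gain corresponds to about 17 doublings, and at one doubling per ~18 months that takes roughly 25 years, consistent with the gap from 1988 to the mid-2010s.

```python
import math

# How many Moore's-law doublings give a 130,000x gain,
# and how long that takes at one doubling per ~18 months.
doublings = math.log2(130_000)   # about 17
years = doublings * 1.5          # about 25.5
```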
CPU block diagram
[Diagram: control unit, ALU, registers, PC and cache, connected to memory]
• ALU = arithmetic/logic unit
• PC = program counter = location of next instruction
Caching: making things seem faster than they are
• cache: a small very fast memory for recently-used information
– loads a block of info around the requested info
• CPU looks in the cache first, before looking in main memory
– separate caches for instructions and data
• CPU chip usually includes multiple levels of cache
– faster caches are smaller
• caching works because recently-used info is more likely to be
used again soon
– therefore more likely to be in the cache already
• cache usually loads nearby information at the same time
– nearby information is more likely to be used soon
– therefore more likely to be in the cache when needed
• this kind of caching is invisible to users
– except that machine runs faster than it would without caching
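The behavior described above (recently used info is likely still in the cache; misses go to main memory and may evict old entries) can be modeled with a tiny least-recently-used cache in Python. This is an illustrative model only, not how hardware caches are actually organized (real caches use sets, tags, and block loading):

```python
from collections import OrderedDict

class Cache:
    """Toy LRU cache: keeps the `capacity` most recently used addresses."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.lines = OrderedDict()
        self.hits = self.misses = 0

    def access(self, address):
        if address in self.lines:
            self.hits += 1
            self.lines.move_to_end(address)     # mark as recently used
        else:
            self.misses += 1                    # would fetch from main memory
            self.lines[address] = True
            if len(self.lines) > self.capacity:
                self.lines.popitem(last=False)  # evict least recently used

# Repeated accesses to a small working set mostly hit:
c = Cache(2)
for address in ["a", "b", "a", "b", "a"]:
    c.access(address)
# only the first access to each address misses: 3 hits, 2 misses
```

Time locality is exactly what makes this pay off: after the first two misses, every access finds its address already in the cache.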
Caching is a much more general idea
• things work more efficiently if what we need is close
• if we use something now
– we will likely use it again soon (time locality)
– or we will likely use something nearby soon (space locality)
• other caches in computers:
– CPU registers
– cache(s) in CPU
– RAM as a cache for disk or network or …
– disk as a cache for network
– network caches as a cache for faraway networks
– caches at servers
• some are automatic (in hardware), some are controlled by
software, some you have some control over
Pentium II (1997): 7.5M transistors
Pentium III (1999): 24M transistors
Intel Core i7 (2008): 731M transistors – this is likely to be in your PC/Mac
Intel Atom (2008): 47M transistors
Other kinds of computers
• not all computers are PCs or Macs
• "supercomputers"
– usually large number of fairly standard processors
– extra instructions for well-structured data
• "distributed" computing
– sharing computers and computation by network
– e.g., web servers
• embedded computers
– phones, games, music players, ...
– cars, planes, weapons, ...
• each represents some set of tradeoffs among cost, computing
power, size, speed, reliability, ...