TM-5A: A Comparison of Wafer Retest Methods (SVTC)

A Comparison of Wafer Retest Methods: In‐Situ vs. Post‐sort Selective
Mark Banke, Altera Corp, mbanke@altera.com
Silicon Valley Test Conference 2010
Agenda
• What is Wafer Retest, and Why do We do It?
• What are the Types of Wafer Retest?
• Wafer Retest Attributes
• Software Issues Involving Retest
• Results from Altera Wafer Retest Data
• Cost Savings Potential
• Retest Yield Distribution
• Rule Based Retest
• Summary
What is Wafer Retest, and Why do We do it?
• The semiconductor business is driven by yield: Yield → $
• Testing doesn’t always yield the expected amount on the first pass.
• To reclaim possible false failures, wafers may be retested.
– False Failure: Dice which fail on the first test, but pass on subsequent retest(s). Can be caused by test hardware, test program, wafer / probe contamination, etc.
• The result is that money is reclaimed from the scrap bin.
What are the Types of Wafer Retest?
• Whole Wafer Retest
  – Whole Wafer: Retest every die after the complete lot has finished test
• In-situ and Selective Wafer Retest
  – In-situ: Retest only specific dice immediately after whole-wafer test
  – Selective: Retest only specific dice after the lot has finished test
Wafer Retest Attributes
• Whole Wafer:
– Verifies all dice. May eliminate false passes and false fails due to test program issues, setup error, hardware problems
– No additional test software – just retest the wafer and nullify past results
– Increases chance for bond pad damage
– Increases sort cycle time
Wafer Retest Attributes (continued)
• In‐situ:
– Only retests failing dice (or specified dice based on whole wafer binning) immediately upon whole wafer completion
– Saves on setup time, reduces setup error frequency, catches most of the false failures
– May not catch gross failures due to initial setup errors.
– May subject passing dice in multi‐site probing setup to more probe damage
– Requires more complicated software
(Figure: in a multi-site setup, the passing dice adjacent to a retested failure get extra probe contact)
Wafer Retest Attributes (continued)
• Selective:
– More likely to catch gross failures due to setup errors
– Single site probe card less likely to cause damage to adjacent dice
– Effective for testing dice at non‐yielding steps
– Increases number of setups, may increase setup errors
– May increase test cost
– Increases test time (tester-controlled prober indexing)
– Possible increase in cycle time
– Requires more complicated test software
Software Issues Involving Retest
• In‐Situ: Matching summaries. The total die quantity passing on retest needs to be added to the original passing die total, but the total tested shouldn't be incremented (see the sketch below the table).
              W1F    W1I    W2F    W2I    W3F    W3I    Grand Total
Total Tested  XXX     65    XXX     70    XXX     48    XXXX
Total Passed  YYY      3    YYY      2    YYY      0    YYYY
Total Failed   65     62     70     68     48     48    361

Grand Total Tested = sum of the "F" (full wafer) totals only
Grand Total Passed = sum of all passing dice (full pass + in-situ retest)
Why? Need to keep accurate yield data!
Yield based on Total Passed / Total Tested (full wafer dice)
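To make the bookkeeping concrete, here is a minimal Python sketch of the merge rule above. It is illustrative only, not Altera's sort software; the dictionary fields and the example full-wafer totals (600 tested, 535 passed) are assumptions, while the W1 failed/retest counts come from the table.

```python
# Illustrative only (not Altera's sort software): merging an in-situ retest
# summary into its full-wafer summary per the rule above.

def merge_in_situ(full, in_situ):
    """Dice passing on retest are added to the passing total, but the
    total-tested count is NOT incremented, so yield stays based on the
    full-wafer die count."""
    passed = full["passed"] + in_situ["passed"]
    return {
        "tested": full["tested"],            # only the "F" total counts
        "passed": passed,                    # all passing dice (F + I)
        "failed": full["tested"] - passed,
    }

def wafer_yield(summary):
    return summary["passed"] / summary["tested"]

# Example for W1: the 65/3/62 counts come from the table; the full-wafer
# tested/passed totals (shown as XXX/YYY above) are assumed here.
w1_full    = {"tested": 600, "passed": 535, "failed": 65}
w1_in_situ = {"tested": 65,  "passed": 3,   "failed": 62}
merged = merge_in_situ(w1_full, w1_in_situ)
print(merged, f"yield = {wafer_yield(merged):.1%}")
```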
Software Issues Involving Retest (continued)
• Selective: Summary data must be downloaded from the server to the tester so that only the specified failing dice are retested (a sketch of the yield calculation follows the tables)
Full Sort Summary:
              W1 FS   W2 FS   W3 FS   Grand Total
Total Tested    XXX     XXX     XXX     XXX
Total Passed    YYY     YYY     YYY     YYY
Total Failed     46      63      44     153

Selective Retest Summary:
              W1 SR   W2 SR   W3 SR   Grand Total
Total Tested     46      63      44     153
Total Passed      5      25      10      40
Total Failed     41      38      34     113

Final Yield = (Total Passed Full + Total Passed Selective) / Total Tested Full
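A small sketch of the final-yield calculation on this slide, in illustrative Python; the full-sort tested and passed totals are assumed, since the slide shows them only as XXX/YYY.

```python
# Final Yield per this slide:
#   (Total Passed Full + Total Passed Selective) / Total Tested Full
def final_yield(passed_full, passed_selective, tested_full):
    return (passed_full + passed_selective) / tested_full

# The selective grand total passed (40) comes from the table above; the
# full-sort tested/passed totals (shown as XXX/YYY) are assumed here.
tested_full = 1800          # assumed lot total
passed_full = 1800 - 153    # assumed: lot total minus the 153 first-pass failures
print(f"Final Yield = {final_yield(passed_full, 40, tested_full):.1%}")
```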
Software Issues Involving Retest (continued)
• Wafer OCR / barcode reader must be used to prevent the wrong wafer from being associated with retest data
• Tester must gather summary data from the network so it can direct the prober to the retest die locations (sketched below)
(Diagram: network server, tester, prober)
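As a rough illustration of that flow, the sketch below fetches a wafer's first-pass summary from a server and steps the prober to each failing die. None of these interfaces (`read_wafer_id`, `move_to`, `test_die`, the summary URL layout) are real tester or prober APIs; they are placeholders for whatever the test-cell software actually provides.

```python
# Illustrative-only sketch of the selective-retest data flow on this slide.
import json
from urllib.request import urlopen

def failing_die_locations(server_url, lot_id, wafer_id):
    """Download the first-pass summary for one wafer and return the
    (x, y) coordinates of dice binned as failures."""
    with urlopen(f"{server_url}/summaries/{lot_id}/{wafer_id}") as resp:
        summary = json.load(resp)
    return [(d["x"], d["y"]) for d in summary["dice"] if not d["passed"]]

def selective_retest(prober, tester, server_url, lot_id):
    # OCR / barcode read prevents pairing the wrong wafer with retest data.
    wafer_id = prober.read_wafer_id()
    for x, y in failing_die_locations(server_url, lot_id, wafer_id):
        prober.move_to(x, y)      # tester-controlled prober indexing
        tester.test_die(x, y)     # result logged against the "SR" pass
```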
Which retest method provides the best compromise between efficiency and yield?
In‐Situ vs. Selective
Results from Altera Wafer Sort Data
                                 Total Wafers   Avg Retest Recovery Yield
Total Unique Wafers                      9428        -
# In‐situ Recovery "Winners"             5439        9.0%
# Selective Recovery "Winners"           1091        4.6%
No Die Difference                        2898        -

Over 6 Million Dice Tested
In‐Situ vs. Selective Retest: Difference of Die Yield Reclaimed
(Chart: % of wafers tested by % reclaim yield difference by retest method)
In‐Situ vs. Selective Retest: Difference of Die Yield Reclaimed
(Chart: % of wafers tested by difference in dice reclaimed by retest method)
In‐Situ vs. Selective Retest :
• So, it looks like the winner is in‐situ…
• Upon "further review…"
  – It's not just the recovery-rate difference between retest methods that matters, it's the number of dice actually recovered because of the retest…
In‐Situ vs. Selective Retest :
                                 Total Wafers   Avg Retest Recovery Yield   Additional Recovery Dice
Total Unique Wafers                      9428        -                           -
# In‐situ Recovery "Winners"             5439        9.0%                        50,014
# Selective Recovery "Winners"           1091        4.6%                        21,199
No Die Difference                        2898        -                           -
In‐Situ vs. Selective Retest: Difference of Die Yield Reclaimed
(Chart: % of wafers tested by difference in dice reclaimed by retest method, with the callout "What about these?")
In‐Situ vs. Selective Retest: Difference of Die Yield Reclaimed – by Product Family
(Chart: % of wafers tested by difference in dice reclaimed by retest method, broken out by low, medium, and high cost product families)
Cost Savings Potential
• Possible savings may accrue if the test cost model charges extra for selective retest setup time.
Cost of Selective Retest Setup Time
(Chart: cost of setup time vs. number of setups with a 10-minute average setup time; y-axis: test cost, $0 to $25,000; x-axis: # of setups, 10 to 1000; series: $50, $75, $100, and $125 per hour)
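The arithmetic behind the chart is simply setups × setup time × hourly test cost; a short sketch, with the rates and setup counts taken from the chart axes:

```python
# Arithmetic behind the chart: setup cost = setups x setup time x hourly rate.
def setup_cost(n_setups, rate_per_hour, setup_minutes=10):
    return n_setups * (setup_minutes / 60.0) * rate_per_hour

for rate in (50, 75, 100, 125):          # $/hr series from the chart
    costs = [round(setup_cost(n, rate)) for n in (10, 100, 500, 1000)]
    print(f"${rate}/hr:", costs)
# e.g. 1000 setups at $125/hr and 10 min each is roughly $20,833
```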
Potential $ Value of Units Not Recovered
(Chart: number of units not recovered, 1 to 1000, vs. $ cost per non-recovered part at $1, $5, $10, $35, $50, $100, and $500 per unit; y-axis: $0 to $20,000; a callout marks the selective retest setup cost at 500 setups)
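To see where selective retest pays for itself, the value of the dice that would otherwise be scrapped can be compared with the cost of the extra setups. The sketch below is illustrative only: the $100/hr rate, the example of 300 non-recovered units, and the 500-setup figure from the chart's callout describe one assumed operating point, not a general model.

```python
# Compare the $ value of dice lost without selective retest against the
# cost of the extra selective setups (illustrative numbers only).
def lost_value(units_not_recovered, unit_cost):
    return units_not_recovered * unit_cost

def setup_cost(n_setups, rate_per_hour=100, setup_minutes=10):
    return n_setups * (setup_minutes / 60.0) * rate_per_hour

# The chart's callout marks the selective-retest setup cost at 500 setups;
# at an assumed $100/hr and 10 min per setup that is about $8,333.
threshold = setup_cost(500)
for unit_cost in (1, 5, 10, 35, 50, 100, 500):   # $/unit series from the chart
    units = 300                                   # example point on the x-axis
    value = lost_value(units, unit_cost)
    verdict = "worth it" if value > threshold else "not worth it"
    print(f"300 units at ${unit_cost}/unit = ${value:,.0f} -> {verdict}")
```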
Cost Savings Potential with In‐Situ Retest
IF…
– Setup Cost
– Unit Cost
– # of Retest Units
Cost Savings Potential with In‐Situ Retest
• On average, the results favored in‐situ retest when the yield from the initial test was greater than certain yield thresholds, but selective sort proved necessary when problematic test setup conditions occurred.
In‐Situ Recovery Rate vs. Increase over Yield Threshold
(Chart: for products A through E, % of wafers with and without selective resort dice recovered, % above the yield threshold, number of dice recovered by selective retest, and number of dice lost if selective resort is not used)
Retest Yield Distribution
• In‐situ retest does in fact, on average, increase yield more than selective retest. But the data suggest that a single retest pass may not always be enough to capture all available yield.
Distribution of Additional Dice Recovered
(Chart: % of total wafers tested by additional dice recovered, in bins 0, <5, 5 to 9, and >=10, comparing the 1st in-situ resort attempt against all selective resort attempts)
Distribution of Additional Dice Recovered Per Retest Attempt
(Chart: % of total wafers tested by additional dice recovered, in bins 0, <5, 5 to 9, and >=10, for the 1st in-situ resort attempt and the 1st, 2nd, and 3rd selective resort attempts)
Rule Based Retest
• In‐Situ Retest will recover the majority of false failures. But there are cases where Selective Retest should be used.
– Rule Based Retest

Type of Part     Condition                           Action
Any              1st Pass Yield << Std (Bad Setup)   Selective Retest
Low Cost Die     1st Pass Low Yield                  In‐Situ Retest
Long Test Time   Particular Failure Type             Selective Retest on Particular Failure Dice
High Cost Die                                        1st Pass In‐Situ Retest, then Selective Retest on Sample Wafers; if yield improves, Selective Retest on all wafers
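A hedged sketch of how the rule table could be encoded in software; the yield-gap threshold and the string labels are placeholders, not values from the presentation.

```python
# Sketch of the rule-based retest table above (thresholds are placeholders).
def retest_rule(part_type, first_pass_yield, std_yield, failure_type=None):
    if first_pass_yield < std_yield - 0.20:          # "yield << std": bad setup
        return "selective retest"
    if part_type == "low cost die" and first_pass_yield < std_yield:
        return "in-situ retest"
    if part_type == "long test time" and failure_type is not None:
        return f"selective retest on {failure_type} failing dice"
    if part_type == "high cost die":
        return ("in-situ retest, then selective retest on sample wafers; "
                "selective retest on all wafers if yield improves")
    return "no retest"

print(retest_rule("high cost die", 0.88, 0.90))
```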
Summary
• Both in‐situ and selective retest methods are viable
• In‐situ will likely recover most invalid failures, usually on the first attempt
• Rule‐based retest will likely provide the most cost-effective solution
Thank you!
• Questions?