Assessment Procedure and Criteria
for Determining Suitability of
Commercial Components for Space Use

DLR-RF-PS-003
Issue 1.1
Sept. 2008

DLR
Deutsches Zentrum für Luft- und Raumfahrt
German Space Agency
Prepared:  J. Tetzlaff / QP-NB, Standardization and EEE-Components, DLR (Date: 19.09.2008)
Approved:  M. Scheuer-Leeser / QP-L, Head of Product Assurance, DLR (Date: 19.09.2008)
Released:  C. Hohage / RD-J, Director Space Projects, DLR (Date: 22.09.2008)
Document Change Record

Issue/Rev  Date        Change Notice No.  Modified Pages or Paragraphs  Nature of Change
1.0        July 2002   -                  -                             Initial Issue
1.1        Sept. 2008  -                  -                             Cover Sheet Added
TESAT-Spacecom GmbH & Astrium GmbH
Prepared under DLR Contract No. 50 PS 0010
PHASE D, WORK PACKAGE 1
Contract 50 PS 0010
DLR-RF-PS-003_ASSESSMENT_PROCEDURE_COTS_V1.1.DOC
-1-
19.09.2008
CONTENTS

1 INTRODUCTION  4
  1.1 Scope  4
  1.2 Assessment Approach  4
  1.3 Document Layout  4
  1.4 Definitions and Abbreviations  5
  1.5 References  6
2 ASSESSMENT GUIDE  7
  2.1 Key Elements  7
  2.2 Assessment Flow  8
  2.3 Assessment Procedure  9
  2.4 Risk Control  11
3 SUBSTITUTES FOR MISSING INFORMATION  12
  3.1 Introduction  12
    3.1.1 Information from Other Sources  12
    3.1.2 Additional Testing/Inspection  12
  3.2 Detailed Guide for each Key Element  13
    3.2.1 Thermal Performance Characteristic  13
    3.2.2 Mechanical Performance Characteristic  13
    3.2.3 Constructional Performance Characteristic  13
    3.2.4 Radiation  13
    3.2.5 Process Control  14
    3.2.6 Reliability Assessment System  14
    3.2.7 Reliability Data  14
    3.2.8 Final Production Electrical Measurements  14
    3.2.9 Average Outgoing Quality  14
    3.2.10 Quality System  14
    3.2.11 Traceability  14
    3.2.12 Specification  14
    3.2.13 Delivery Time  15
4 DETAILED ASSESSMENT INFORMATION  16
  4.1 Key Element Importance and Assessment  16
  4.2 Detailed Assessment Criteria for Key Elements  19
    4.2.1 Thermal Performance Characteristic  19
    4.2.2 Mechanical Performance Characteristic  21
    4.2.3 Constructional Performance Characteristic  23
    4.2.4 Radiation  26
    4.2.5 Process Control  30
    4.2.6 Reliability Assessment System  36
    4.2.7 Reliability Data  39
    4.2.8 Final Production Electrical Measurements  42
    4.2.9 Average Outgoing Quality  44
    4.2.10 Quality System  46
    4.2.11 Traceability  48
    4.2.12 Specification  50
    4.2.13 Delivery Time  52
5 BACKGROUND INFORMATION  53
  5.1 Use of Commercial Components in Space  53
  5.2 Equivalence of Commercial and Space Qualified Parts  54

APPENDIX 1 FORM SHEETS  55
1 INTRODUCTION

1.1 Scope
This document describes a simple, effective and standardised procedure for assessing whether
commercially available components meet the performance requirements for space applications.
The approach adopted is straightforward so that users can apply it without needing an in-depth
knowledge of any technical aspects of the part under assessment. It is also flexible enough to
be used for commercial components fabricated using widely different processes, employing
many different structures, and manufactured by companies having different quality systems,
methods and standards.
1.2 Assessment Approach
The basic assessment approach relies on the assessment of individual “key” elements which
have been identified as both sufficient and essential for the overall acceptability of a component.
The "key" elements are parameters that quantify the quality and reliability of the component, the
manufacturer of the component and its availability for use in space programmes. A component
assessed using this procedure can be individually assessed as:
• Acceptable for general space use (all key elements are acceptable).
• Acceptable for use on a specific project (one or more key elements are only acceptable for specific project use).
• Unacceptable (one or more key elements are unacceptable).
A detailed description of the key elements for assessment is given in chapter 4 of this
document. The assessment procedure and criteria for individual key elements are designed to
make the maximum possible use of existing information, although appropriate methods for
generating any missing information are also covered.
If a key element cannot be assessed directly, then it may be possible to give an assessment on
the basis of supporting information. For example, if no reliability data is available, it may be
possible to make an assessment on the basis of reliability data for a similar component from the
same manufacturer. In this document, the indirect assessment of a key element is called
assessment using "alternative information".
The basic assessment approach described in this document is extended in a supplementary
document entitled Risk Analysis and Management. If appropriate risk analysis and management
actions are carried out the extended approach can allow components to be accepted for specific
project use even if not all the key element criteria are satisfied.
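The three possible outcomes follow a worst-of rule: the overall component result equals the worst result obtained for any individual key element. A minimal sketch of that rule, as a hypothetical Python illustration (not part of the procedure itself):

```python
from enum import IntEnum

class Result(IntEnum):
    """Possible outcomes for a key element, ordered from best to worst."""
    GENERALLY_ACCEPTABLE = 0   # acceptable for general space use
    SPECIFIC_PROJECT_ONLY = 1  # acceptable only for a specific project
    UNACCEPTABLE = 2

def component_result(element_results):
    """The component result equals the worst individual key element result."""
    return Result(max(element_results))

# Example: one element restricted to specific project use drags the whole
# component down to 'acceptable for a specific project only'.
results = [Result.GENERALLY_ACCEPTABLE] * 12 + [Result.SPECIFIC_PROJECT_ONLY]
print(component_result(results).name)  # SPECIFIC_PROJECT_ONLY
```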
1.3 Document Layout
The remainder of this document comprises five sections which contain:

• A step by step guide to the assessment procedure, including an illustrative flow chart.
• An additional guide covering the possibility of completing a key element assessment by using alternative information obtained from other sources. This can be applicable if some of the information normally required to complete the assessment is not available.
• A detailed description of each key element and the appropriate acceptance criteria. This should be consulted during an assessment to ensure consistent and standard results.
• Background information about the reasons for using commercial components in space applications, and the justification for considering some commercial components to be of equivalent quality to traditional space qualified components.
• An appendix containing a set of suitable form sheets which can be used for documenting a component assessment.
1.4 Definitions and Abbreviations

The following definitions and abbreviations are used in this document.

AOQL   Average outgoing quality level
CVCM   Collected volatile condensable material
DDEF   Displacement damage equivalent fluence
DEC    Dynamic evaluation circuit
DOE    Design of experiments
ESD    Electrostatic discharge
FET    Field effect transistor
FIT    Failures in time
FMECA  Failure modes, effects and criticality analysis
PCM    Process control monitor
PIND   Particle impact noise detection
QA     Quality assurance
RF     Radio frequency
RML    Recovered mass loss
RVT    Radiation verification testing
SEBO   Single event burn-out
SEE    Single event effect
SEGR   Single event gate rupture
SEL    Single event latch-up
SEM    Scanning electron microscope
SEU    Single event upset
SMD    Surface mount device
SPC    Statistical process control
TCV    Technology control vehicle
TID    Total ionising dose
TML    Total mass loss
WLR    Wafer level reliability
1.5 References

AQAP-110        NATO Quality Assurance Requirements for Design, Development and Production
ASTM-E595-90    Outgassing
EIA/JESD-46-A   Guidelines for User Notification of Product/Process Changes by Semiconductor Suppliers
EIA/JESD-47     Stress-Test-Driven Qualification of Integrated Circuits
EIA/JESD-48     Product Discontinuance
EIA/JEP-122     Failure Mechanisms and Models for Silicon Semiconductor Devices
EIA/JEP-131     Process Failure Mode and Effects Analysis (FMEA)
EIA/JEP-132     Process Characterisation Guidelines
EIA-554-A       Method Selection for Assessment of Non-conforming Levels in Parts-Per-Million (PPM)
EIA-554-1       Assessment of Average Outgoing Quality Levels in Parts-Per-Million (PPM)
EIA-554-2       Assessment of Non-conforming Levels in Parts-Per-Million (PPM)
EIA-557-A       Statistical Process Control Systems
EIA-591         Assessment of Quality Levels in PPM using Variables Test Data
EIA-659         Failure-Mechanism-Driven Reliability Monitoring
EIA-738         Guideline of the Use and Application of Cpk
ESA/SCC 22900   Total Dose Steady State Irradiation Test Method
ESA/SCC 25100   Single Event Effects Test Method and Guidelines
ECSS-Q-70-02A   Thermal vacuum outgassing test for the screening of space materials
JESD-16A        Assessment of Average Outgoing Quality Levels in Parts-Per-Million (PPM)
JESD-34         Failure-Mechanism-Driven Reliability Qualification of Silicon Devices
MIL-STD-202     Test Methods for Electronic and Electrical Component Parts
MIL-STD-750     Test Methods for Semiconductor Devices
MIL-STD-883     Test Method Standard, Microcircuits
MIL-PRF-19500   Performance Specification: Semiconductor Devices, General Specification for
MIL-PRF-38534   Performance Specification: Hybrid Microcircuits, General Specification for
MIL-PRF-38535   Performance Specification: Integrated Circuits (Microcircuits) Manufacturing, General Specification for
MIL-HDBK-683    Statistical Process Control (SPC) Implementation
MIL-HDBK-814    Ionising Dose and Neutron Hardness Assurance Guidelines
QS9000          Quality Systems Requirements
ESA PSS-01-302  Failure Rates for ESA Space Systems
2 ASSESSMENT GUIDE

2.1 Key Elements
As explained in Paragraph 1.2 this procedure relies on the assessment of key elements in order
to make an overall assessment of the component type acceptability for space applications. The
chart below shows those key elements which were identified, during the development of this
procedure, as being necessary and sufficient for assessing a component. To permit an easier
understanding of their significance they are grouped under the headings of ‘Parts Related’,
‘Manufacturer Related’ and ‘Procurement Related’, which are also often used in describing
Project PA requirements for components.
ASSESSMENT

Parts Related
  Performance characteristics:
    - Thermal
    - Mechanical
    - Constructional
    - Radiation
  Dependability:
    - Process control
    - Reliability assessment system
    - Reliability data
    - Final production electrical measurements
    - Average outgoing quality

Manufacturer Related
  Quality assurance system:
    - Quality system
  Customer Support:
    - Traceability

Procurement Related
  Specification:
    - Specification
  Availability:
    - Delivery time

Figure 1 – Key assessment elements grouped into three general categories
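The grouping in Figure 1 can equally be written down as a simple data structure. The following Python sketch is purely illustrative; the names come from the figure, not from any tool or form sheet:

```python
# Key elements organised by category and sub-heading, as in Figure 1.
KEY_ELEMENTS = {
    "Parts Related": {
        "Performance characteristics": ["Thermal", "Mechanical",
                                        "Constructional", "Radiation"],
        "Dependability": ["Process control", "Reliability assessment system",
                          "Reliability data",
                          "Final production electrical measurements",
                          "Average outgoing quality"],
    },
    "Manufacturer Related": {
        "Quality assurance system": ["Quality system"],
        "Customer Support": ["Traceability"],
    },
    "Procurement Related": {
        "Specification": ["Specification"],
        "Availability": ["Delivery time"],
    },
}

# The procedure identifies 13 key elements in total.
total = sum(len(v) for group in KEY_ELEMENTS.values() for v in group.values())
print(total)  # 13
```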
2.2 Assessment Flow
The following flowchart depicts graphically the basic approach to be used when assessing
commercial components for space applications.
[Flow chart: First, the sequence of assessment of the key elements is defined *). Each key element is then assessed in turn. A direct assessment of the key element is attempted first; if the data are insufficient, an indirect assessment of the key element ***) is made. The outcome for each element is one of: key element generally acceptable; key element acceptable for specific project requirements (a risk analysis may lead to acceptance for specific projects, possibly with additional screening as risk control **)); or key element not acceptable. If all key elements are acceptable, the component is acceptable for general space use. If one or more key elements are only acceptable for specific project use, the component is acceptable for use on a specific project. If one or more key elements are unacceptable, the component is unacceptable.]

*) The sequence of assessment shall be defined such that unacceptable key elements are found as early as possible in order to minimise the overall effort.
**) The assessment result is only valid for procurement with the applied screening measures.
***) Assessment on the basis of alternative data. Can also include additional testing.

Figure 2 – Basic assessment flow
If it is decided to perform a risk analysis exercise, this is covered by a separate procedure included in the document entitled 'Risk Analysis and Management'. A decision to perform a risk analysis for a key element can allow the component assessment to continue either:

- After satisfactory completion of the risk analysis.
- Immediately, in order to allow any other key elements requiring risk analysis to be identified. The necessary risk analyses can then all be performed together at the completion of the assessment.

Performing an optional risk analysis exercise can allow a component to be accepted for specific project use (possibly with the requirement that appropriate risk control measures are adopted) even though not all the key element acceptance criteria are met.
2.3 Assessment Procedure
In this section a simple step by step guide to completing a basic component assessment is
given. While performing an assessment it will frequently be necessary to refer to Section 4 of
this document as it contains a detailed description of each key element and the appropriate
acceptance criteria. It will therefore be useful to read Section 4 before starting an assessment in
order to become familiar with its contents and to understand what types of information need to
be available to perform an assessment.
Step 1. Identify component to be assessed
The assessor should have a clear knowledge and understanding of exactly which component is to be assessed in terms of
manufacturer, generic type, variant/option/revision, qualification/test level, etc.
Step 2. Obtain relevant information
All readily available published information relevant to the component to be assessed should be obtained. This will generally be
manufacturer’s information and information from other sources in the public domain, but might also include proprietary information
which has been made available to the assessor. The detailed criteria in Section 4 of this document show the areas which the
information must cover.
Step 3. Prepare assessment result forms
To record the results of the assessment a suitable set of forms in printed or electronic format should be used. The Appendix to this
document includes a set of forms suitable for this purpose. They should initially be prepared by completing all the necessary
information to identify the exact component type which is being assessed
ASSESSMENT OF KEY ELEMENTS
Step 4. Select a key element to be assessed
A key element which has not yet been assessed should be selected from those identified in Section 2.1 and on the
form. It is not necessary to assess the key elements in the sequence used in this document and any order can be
adopted. To reduce the overall effort required assessors might prefer to start the assessment with those key
elements for which a result of ‘not acceptable’ is considered most likely.
Step 5. Assessment of the selected key element
Assessment of the selected key element should be started using the criteria given in Section 4 of this document. It
should be identified at this stage whether all the necessary information to complete the assessment and reach a
conclusion is available. If a direct assessment is possible go to step 7.
(Steps 4 to 8 are repeated for each key element.)

Step 6. Decide whether additional information can be obtained
A decision should be based on whether it is at all possible that alternative, equivalent or related
information could be a suitable substitute for the missing information, and whether the effort
necessary to generate or obtain such information is considered justifiable. Section 3 of this document
should be used as an aid to making this decision. If the decision is negative go to step 7. Additional
information can be either already existing information which is obtained from alternative sources, or
new information which is generated by the assessor, e.g. by procuring and testing sample
components or by auditing the manufacturer.
Step 7. Decide whether the key element is acceptable
A decision as to whether the key element is acceptable should be based on all the available information (including
additional information if applicable) and should be made using the criteria given in Section 4 of this document.
Step 8. Decide whether risk analysis and management are appropriate
If a key element is assessed as ‘not acceptable’, either because it clearly does not meet the criteria or because
insufficient information is available to verify that it meets the criteria, then risk analysis and management might be
appropriate. Risk analysis and management are not covered in this document but are described in a supplementary
document entitled ‘Risk Analysis and Management’.
The result of a risk analysis and management exercise could be that a component is considered potentially
acceptable for specific use despite having a key element which is assessed as not acceptable to the defined criteria.
Step 9. Decide on acceptability of component
For a basic assessment the result for the component is determined on the basis of the key element results. It is equal to the worst result obtained for any of the individual key elements; the possible component assessment results are therefore: acceptable for general space use, acceptable for use on a specific project, or unacceptable.
If a risk analysis and management exercise is performed in accordance with the supplementary document then it is possible to accept a component for specific project use even if a key element is assessed as not acceptable.
Step 10. Complete assessment forms
The assessment forms for the component which has been assessed should be completed giving all the relevant results and
references to the sources of information used to justify the conclusions.
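The repeated Steps 4 to 8 and the concluding Step 9 can be sketched as a loop. This is a hypothetical illustration with invented function names, not a replacement for the documented manual procedure:

```python
# Minimal sketch of the Step 4-9 loop. 'direct_assess' and 'indirect_assess'
# stand for the manual assessments against the Section 4 criteria; each
# returns a result string or None if the available data are insufficient.
def assess_component(key_elements, direct_assess, indirect_assess):
    order = ["acceptable", "specific project only", "not acceptable"]
    results = {}
    for element in key_elements:               # Step 4: select a key element
        result = direct_assess(element)        # Step 5: direct assessment
        if result is None:                     # insufficient data
            result = indirect_assess(element)  # Step 6: alternative information
        if result is None:                     # Step 7: still cannot verify
            result = "not acceptable"          # Step 8: risk analysis candidate
        results[element] = result
    worst = max(results.values(), key=order.index)  # Step 9: worst result wins
    return worst, results

# Example: thermal passes directly; radiation has no data from any source.
worst, detail = assess_component(
    ["thermal", "radiation"],
    direct_assess=lambda e: "acceptable" if e == "thermal" else None,
    indirect_assess=lambda e: None,
)
print(worst)  # not acceptable
```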
2.4 Risk Control
This assessment procedure allows measures for risk reduction to be applied in order to give increased confidence in the quality and reliability of the component. Allowing risk control measures (including screening) will increase the number of commercial part types assessed as usable for space applications. Risk reduction measures similar to those often applied to the procurement of space qualified components can also be applied to commercial components in order to increase confidence in the quality and reliability of the flight parts. Such measures are generally not applied to commercial components for terrestrial applications, where a strategy of "built-in" quality and reliability without additional screening has been established. However, for space applications, with relatively low quantities of parts, risk reduction measures can be usefully applied to both commercial and space qualified parts.
These additional risk reduction measures are not intended to be implemented on all part types, as is the case for space qualified parts. Rather, as in some commercial applications, they are intended to eliminate specific weaknesses of the components.
Typical risk control measures are:

- Burn-in
- Inspections (visual, X-ray, SEM, etc.)
- Selection on the basis of electrical measurements
- Other screening measures (e.g. PIND testing of cavity parts, vibration and shock testing)
- Radiation verification tests
- Destructive physical analysis
Some of these measures are applied to the parts to be used in flight application and some are
applied to sample quantities of the available parts and may or may not be destructive.
If risk reduction measures have been included in the assessment of a component then it is important that those measures are clearly identified. The result of the assessment is
only valid for parts procured and verified using the proposed measures for risk reduction. Any
risk reduction measures have to clearly define the conditions that have to be satisfied for
acceptance. The assessment of each key element may identify measures for risk reduction that
need to be applied in order to achieve sufficient confidence for a positive assessment of that key
element. The risk reduction measures must be registered in the assessment summary sheet
given in Appendix 1 of this document.
The assessor must decide if possible risk control measures are reasonable and acceptable for a
space parts procurement. If not, the key element is not acceptable and hence the component is
unacceptable for space use.
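The requirement to register risk reduction measures, together with their acceptance conditions and the resulting restriction on the validity of the assessment, could be recorded in a structure like the following. All names here are illustrative assumptions and are not taken from the Appendix 1 form sheets:

```python
from dataclasses import dataclass, field

# Hypothetical record types for registering risk reduction measures
# in an assessment summary; field names are illustrative only.
@dataclass
class RiskControlMeasure:
    name: str                  # e.g. "Burn-in", "PIND testing"
    applied_to: str            # "flight parts" or "sample quantities"
    destructive: bool
    acceptance_condition: str  # condition to be satisfied for acceptance

@dataclass
class KeyElementAssessment:
    key_element: str
    result: str
    risk_controls: list = field(default_factory=list)

    def validity_note(self):
        # The assessment result is only valid for parts procured and
        # verified using the registered risk reduction measures.
        if not self.risk_controls:
            return "valid without additional measures"
        names = ", ".join(m.name for m in self.risk_controls)
        return "valid only with: " + names

element = KeyElementAssessment(
    "Average outgoing quality", "specific project only",
    [RiskControlMeasure("Burn-in", "flight parts", False,
                        "no failures allowed")])
print(element.validity_note())  # valid only with: Burn-in
```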
In addition to this document for the assessment of commercial components, a supporting document entitled "Risk Analysis and Management" has also been prepared. That document is designed to support the assessment of components that have been found to be only acceptable for specific project requirements.
3 SUBSTITUTES FOR MISSING INFORMATION

3.1 Introduction
If a key element assessment cannot be completed because essential data or information are
missing, it is possible with some elements to obtain suitable information from other sources or to
generate it by performing additional tests or inspections. Component information obtained in
these ways need not necessarily be identical to the data originally required, but could be
alternative/equivalent/related information which is nevertheless suitable for assessing the
acceptability of the key element or performing risk analysis on the component.
3.1.1 Information from Other Sources
If the information necessary for directly assessing the acceptability of a key element is not
obtainable directly from the manufacturer then generally the only other source will be a major
customer who procures large quantities of the component. Such a customer might have been
able to exert sufficient pressure on the manufacturer to extract additional information, or might
have performed their own testing to generate the additional information. However, even if a
customer is identified who possesses the necessary information it might not be possible to
obtain it because the manufacturer and/or customer might have defined it as proprietary.
If a customer has generated information by performing their own testing or inspection then this
could cover any of the key element assessment areas described in the following paragraph
3.1.2. For a major customer it could also include an AOQL figure generated by checking all the
received components to determine how many defective parts are supplied by the manufacturer.
If a major customer has used their larger purchasing power to apply pressure on the
manufacturer to supply additional information then this information could cover any of the areas
included in the assessment procedure.
If additional information is obtained for use in risk analysis, rather than for use in directly
assessing a key element, the range of potentially relevant information and possible sources is
much wider and cannot be defined here in detail.
3.1.2 Additional Testing/Inspection
If the necessary financial and manpower resources are available to procure and test special test samples as part of an assessment exercise, then this option should always be carefully considered as a way of generating information which cannot be obtained from the manufacturer.
The following table gives the key elements and assessment areas for which additional testing is
a realistic option for generating missing information. For other key elements it is not possible to
replace any missing information with sample test results because acceptance of the element
requires long term test results or requires the manufacturer to have specific systems installed.
Key Element: Mechanical performance characteristic (Paragraph 3.2.2)
  Assessment area: Vibration and shock testing
  Suggested test method: Relevant test methods in MIL-STD-202, MIL-STD-750 and MIL-STD-883

Key Element: Constructional performance characteristic (Paragraph 3.2.3)
  Assessment area: Design, materials and workmanship
  Suggested test method: Constructional analysis to an established space level procedure
  Assessment area: Outgassing
  Suggested test method: ECSS-Q-70-02A or ASTM-E595-90

Key Element: Radiation (Paragraph 3.2.4)
  Assessment area: Total ionising dose and single event effects
  Suggested test method: ESA/SCC 22900, ESA/SCC 25100 or MIL-STD-883 methods 1019, 1020, 1021, 1022, 1023
3.2 Detailed Guide for each Key Element
Note that to maintain correspondence between the paragraph numbers in this section and in
Section 4.2 all the key elements are included below. Those for which obtaining additional
information is not considered a realistic option are included in italics.
3.2.1 Thermal Performance Characteristic
If insufficient information is available from the manufacturer to determine the true operating and
storage temperature ranges then generating the necessary information is not considered a
realistic option. This is because of the extensive testing which would be necessary to determine
with sufficient confidence the true range over which the parts could be operated/stored without
compromising their reliability.
3.2.2 Mechanical Performance Characteristic
If insufficient information is available from the manufacturer to confirm the component’s shock
and vibration capabilities then generating the necessary information is considered a realistic
option. This is because the mechanical performance of a component is design related and could
be expected to be similar for all components of the same type. The necessary information could
therefore be generated by relatively simple shock and vibration testing of a limited number of
specifically procured test samples.
3.2.3 Constructional Performance Characteristic
If insufficient information is available from the manufacturer to confirm the component’s
constructional performance then generating the necessary information is considered a realistic
option. This is because it would be possible to check the design, workmanship, outgassing, and
use of prohibited materials or constructions by inspection/test of a limited number of specifically
procured test samples.
3.2.4 Radiation
If insufficient information is available from the manufacturer to confirm the component’s radiation
performance then generating the necessary information is considered a realistic option. As it is
anticipated that radiation data will be unavailable for most commercial components the
necessary testing for generating suitable data is already described in this assessment
procedure document.
3.2.5 Process Control
If insufficient information is available from the manufacturer to confirm that the applied process
control is acceptable then generating alternative suitable information is not considered a realistic
option. This is because assessment of this key element relies entirely on information which can
only be provided by the manufacturer.
3.2.6 Reliability Assessment System
If insufficient information is available from the manufacturer to confirm that the reliability
assessment system is acceptable then generating alternative suitable information is not
considered a realistic option. This is because assessment of this key element relies entirely on
information which can only be provided by the manufacturer.
3.2.7 Reliability Data
If insufficient information is available from the manufacturer to determine the component
reliability then generating the necessary information is not considered a realistic option. This is
because of the extensive testing which would be necessary to determine with sufficient
confidence the reliability of the parts.
3.2.8 Final Production Electrical Measurements
If insufficient information is available from the manufacturer to demonstrate that all critical
electrical parameters are measured on 100% of parts before shipping then it is not possible to
demonstrate this using information from any other source. This is because only information from
the manufacturer can confirm exactly what in-house final testing is performed on the delivered
components. The possibility of accepting such components on the basis that all procured parts
are electrically tested after receipt is covered in the supplementary risk analysis and
management document.
3.2.9 Average Outgoing Quality
If insufficient information is available from the manufacturer to determine the average outgoing
quality level then obtaining the necessary information from another source might be a realistic
option. If the identical part is supplied to a commercial customer in very high (multi-million) quantities, then such a customer might have suitable information available on the risk of a defective part being
delivered. For anyone else to perform testing to determine the AOQL would not be a realistic
option because of the extremely high number of components which would need to be tested.
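As an illustration of the scale involved, an AOQ figure in parts-per-million is simply the observed defect fraction scaled by one million. The helper below is a hypothetical sketch, not the formal EIA-554-1 / JESD-16A assessment method:

```python
# Derive an average outgoing quality figure in PPM from incoming
# inspection counts (illustrative only).
def aoq_ppm(defective_parts, inspected_parts):
    if inspected_parts <= 0:
        raise ValueError("inspected_parts must be positive")
    return defective_parts / inspected_parts * 1_000_000

# Example: 27 defective parts found among 3 million delivered components.
print(round(aoq_ppm(27, 3_000_000)))  # 9
```

This also shows why only a very large customer can measure AOQL: resolving single-digit PPM levels requires checking millions of parts.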
3.2.10 Quality System
If insufficient information is available from the manufacturer to determine whether the quality
system used is acceptable then obtaining suitable information from another source is not
considered a realistic option. It is assumed that if another customer with similar requirements
had already audited the manufacturer’s QA system and found it to be satisfactory then the
manufacturer would inform other potential customers of this.
3.2.11 Traceability
If insufficient information is available from the manufacturer to determine whether the level of
traceability is acceptable then obtaining suitable information from another source is not
considered a realistic option.
3.2.12 Specification
If there is insufficient information available from the manufacturer then the criterion that the
available documentation must form an acceptable specification cannot be met. Generating additional
information is therefore not considered a realistic option.
3.2.13 Delivery Time
If there is insufficient information available from the manufacturer to confirm an acceptable
delivery time then this is always unacceptable. Obtaining alternative information is therefore not
considered a realistic option.
4 DETAILED ASSESSMENT INFORMATION

4.1 Key Element Importance and Assessment
The following table lists the key elements which have been identified as necessary for
performing a component assessment together with the reasons why it is important that the
element should be found acceptable.
PARTS RELATED

Performance Characteristics
  1  Thermal: Thermal requirements must be satisfied to be certain that a part's electrical performance and failure rate will be as specified.
  2  Mechanical: Mechanical requirements must be satisfied to be certain that a part will not fail due to mechanical stresses in typical space applications.
  3  Constructional: Constructional requirements must be satisfied to be certain that a part does not use materials or processes which could cause failures.
  4  Radiation: Radiation requirements must be satisfied to be certain that a part will not fail electrical function or parameter requirements in typical space applications.

Dependability
  5  Process control: The level of process control must be sufficient to ensure good reproducibility and therefore parts with consistent performance.
  6  Reliability assessment system: The reliability assessment system must be well designed in order to give confidence in the reliability data generated.
  7  Reliability data: The indicated level of reliability must be sufficiently high for typical space applications.
  8  Final production electrical measurements: Sufficient characterisation of each delivered component must be carried out to verify that the defined electrical performance and parameters are met.
  9  Average outgoing quality: The quality level of the part must be good enough to ensure that the risk of receiving defective parts is sufficiently low to be acceptable for typical space applications.

MANUFACTURER RELATED

Quality Assurance System
  10  Quality system: The quality management system must be good enough to ensure that control of all processes and procedures is sufficient to consistently produce components of adequate quality and reliability.

Customer Support
  11  Traceability: A minimum level of traceability must be assured, particularly for radiation sensitive components, to allow the identification of parts which potentially have lot related problems.

PROCUREMENT RELATED

Specification
  12  Specification: A written specification, or equivalent, defining essential characteristics and important parameter limits must be available in order to specify exactly what is being procured.

Availability
  13  Delivery time: The delivery time of the component must be compatible with space project schedules.

Table 1 – Key assessment elements
To assess a component each key element must be systematically assessed using the detailed
criteria and guidance given in Section 4.2. For each key element a description of the element, a
table containing the appropriate criteria for accepting the component for general or project
specific use, and supporting information and guidance to help in the assessment are provided.
In some cases additional supporting information and guidance is given which is only applicable
to specific component families or technologies, but any limitations to applicability are always
identified. At the completion of the element assessment a decision must be made as to whether
it is acceptable for general use, acceptable for specific use, or not acceptable.
The assessment result for the component is then equal to the worst result obtained for any of
the individual key elements. The possible component assessment results are therefore:
- acceptable for general use
- acceptable for specific project use
- not acceptable.
As all of the key elements must be acceptable if the component is to be acceptable, a component assessment should be terminated as soon as a single key element is found to be not acceptable.
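As an illustration only, and not as part of the procedure itself, the combination and early-termination rules above can be sketched in a few lines. The result encoding and function name are hypothetical:

```python
# Hypothetical sketch: the component result is the worst individual key
# element result, and the assessment terminates as soon as one element
# is found "not acceptable". The integer encoding is invented here.
GENERAL_USE, SPECIFIC_USE, NOT_ACCEPTABLE = 0, 1, 2  # ordered best to worst

def assess_component(element_results):
    """element_results: per-element results, in assessment order."""
    worst = GENERAL_USE
    for result in element_results:
        if result == NOT_ACCEPTABLE:
            return NOT_ACCEPTABLE      # terminate the assessment early
        worst = max(worst, result)     # overall result = worst element result
    return worst
```

A component with even one element acceptable only for specific project use is itself acceptable only for specific project use.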
To extend the capabilities of this basic assessment procedure there are three areas where
additional activities can be performed. These cover:
(a) Insufficient information – if insufficient information is available to verify that a key
element meets the specified criteria then that element must normally be assessed as
“not acceptable”, even if there is also insufficient information to definitely verify that it
does not meet the criteria. For some key elements, however, it is possible to identify
alternative, equivalent or related information which could be obtained from other sources
or which could be generated by additional testing or inspection. This additional
information might then allow the key element to be assessed as acceptable. It should be
noted that only relatively limited additional testing or inspection is envisaged as forming
part of an assessment covered by this document, and not anything as extensive as
traditional space level evaluation or qualification testing.
Obtaining alternative, equivalent or related information in place of missing information is
covered in this document (Section 3).
(b) Risk analysis – if a key element cannot be assessed as meeting the specified criteria,
even if additional information is obtained and used, the component might still be
acceptable if the information can be used to demonstrate an acceptable risk in a specific
application. Risk analysis will not be applicable for all key elements and when performed
might identify that the components can only be used if appropriate risk control actions
(preventive or back-up solutions) are taken.
Risk analysis and control are not covered in this document but are covered in a separate
document entitled ‘Risk Assessment and Control’.
(c) Risk control – if risk analysis shows that a component is not acceptable for use “as is”, it
might still be possible to use it if appropriate risk control activities are defined and taken.
Risk control activities can cover either preventive actions (to prevent a risk event from
occurring) or back-up actions (to mitigate any unwanted effects if the risk event does
occur).
Risk control also covers the acceptance of commercial component types on the basis
that 100% component screening, or sample lot acceptance testing, is performed on all
procured parts.
Risk control is not covered in this document but is covered in a separate document entitled ‘Risk Analysis and Control’.
In Section 4.2 of this document it is indicated for each key element whether or not additional
activities could be appropriate if it is not initially possible to assess the element as “acceptable”.
This indication is given after the table in part B of the detailed criteria for each key element.
If it is identified during a component assessment that any of these additional activities are
appropriate for a key element then a number of options are available to the assessor. The two
principal options would be to complete all applicable additional activities on each individual key
element as the need is identified, or to complete the basic assessment on all the key elements
and then perform all the additional activities together at the end.
The following table also gives a quick overview of which additional activities can be applicable
for each key element.
[Table: overview of which additional activities (insufficient information: assessment using supporting data; risk analysis; risk control) can be applicable for each key element, grouped as parts related (performance): Thermal, Mechanical, Construction, Radiation; parts related (dependability): Process Control, Reliability Assessment System, Reliability Data, Final Electrical Test, AOQL; manufacturer related: Quality System, Traceability; procurement related: Specification, Delivery Time.]
4.2 Detailed Assessment Criteria for Key Elements
4.2.1 Thermal Performance Characteristic
A. Description of Element
The thermal performance is defined by two different temperature ranges, the operating
temperature range and the storage temperature range.
The operating temperature range is the range of temperature given in the data sheet or the
procurement specification over which the manufacturer guarantees that the electrical
performance of the component is within the specified limits.
The storage temperature range is the range of temperature given in the data sheet or the
procurement specification over which the manufacturer guarantees that the components can be
non-operatively stored without degradation.
Both temperature ranges are applicable to the external package or case temperature of the
component.
B. Assessment
Assessment Criteria

Acceptable for general use:
Operating temperature -25 °C to +85 °C, or better, and
storage temperature -30 °C to +85 °C, or better.

Acceptable for specific project use:
Lower parts temperature of operation / storage is 5 °C or more below the lowest temperature of equipment operation / storage, and
upper parts temperature of operation is 20 °C or more above the highest operation temperature of the equipment, and
upper parts storage temperature is 10 °C or more above the highest storage temperature of the equipment.

Not acceptable:
Neither of the above conditions is satisfied, or sufficient information for an assessment is not available.
Further analysis is not considered appropriate for this element.
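The project-specific criterion above amounts to a simple margin check. The following sketch is illustrative only; the function name and interface are hypothetical, and temperature ranges are taken as (min, max) tuples in °C:

```python
# Hypothetical margin check for the "acceptable for specific project use"
# thermal criterion: the part ranges must enclose the equipment ranges
# with the 5 C, 20 C and 10 C margins stated in the criteria table.
def thermal_ok_for_project(part_op, part_st, equip_op, equip_st):
    return (part_op[0] <= equip_op[0] - 5 and   # 5 C or more below lowest operation
            part_st[0] <= equip_st[0] - 5 and   # 5 C or more below lowest storage
            part_op[1] >= equip_op[1] + 20 and  # 20 C or more above highest operation
            part_st[1] >= equip_st[1] + 10)     # 10 C or more above highest storage
```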
C. Supporting Information
Extended temperature range
The operation and storage temperature ranges for the components are usually given in the data
sheet or in the procurement specification. If these temperature ranges do not cover the required
temperature ranges for the application, either for general use or for specific use, then the
component should not normally be accepted for space use. An exception can be made if the
manufacturer has additional information which allows him to guarantee the performance over
the extended temperature range.
Temperature derating
For space projects it is common practice that the operating temperature ranges at component
level be derated by a specific factor. The derated operational and storage temperature ranges
must not be exceeded during testing or during the mission. This derating is applied in order to
give more confidence that the space equipment will survive the foreseen life period.
The operational and storage temperature ranges of commercial components are normally smaller than those of traditional hi-rel space components. Full temperature derating cannot therefore be
applied for most commercial component types because this would lead to derated temperature
ranges that would not meet typical project needs.
When considering the applicability of temperature derating for commercial components, it should be taken into account that commercial components are often manufactured using a technology identical to that of the equivalent space qualified components, even though a narrower temperature range is specified. The real temperature capabilities of some commercial components might therefore be greater than the published temperature range indicates, in which case the manufacturer has effectively already incorporated some derating into his published figures.
4.2.2 Mechanical Performance Characteristic
A. Description of Element
The mechanical performance covers the ability of the component to withstand mechanical
stresses during handling and transportation. This applies to the individual component, to the
component after assembly into a unit of equipment, and during the launch of the equipment by
the carrier vehicle. The normal mechanical stresses are vibration and shock. Commercial
components for use in space applications must have demonstrated their ability to withstand
these stresses.
The values given below for general assessments are an average of different actual space
programme requirements.
B. Assessment
Assessment Criteria

Acceptable for general use:
The part is robust against
sine vibration: 20 to 2000 Hz, 50 g, 2 min per axis, and
shock: 5 shocks per direction, 1000 g or higher.

Acceptable for specific project use:
The values are within the limits specified for the specific space project.

Not acceptable:
Neither of the above conditions is satisfied, or sufficient information for an assessment is not available.
Further analysis could be appropriate for this element if sufficient information is not initially
available.
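As a minimal sketch, the general-use criterion above can be expressed as a check on the demonstrated test levels. The helper below is hypothetical and assumes the 2 min per axis sine duration was part of the demonstrated test:

```python
# Hypothetical check of the general-use mechanical robustness criterion:
# sine vibration 20-2000 Hz at 50 g (2 min per axis) and 5 shocks per
# direction at 1000 g or higher must have been demonstrated for the part.
def mechanical_ok_general(sine_lo_hz, sine_hi_hz, sine_g,
                          shocks_per_direction, shock_g):
    return (sine_lo_hz <= 20 and sine_hi_hz >= 2000 and sine_g >= 50 and
            shocks_per_direction >= 5 and shock_g >= 1000)
```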
C. Supporting Information
Self-resonance of printed circuit boards used in space equipment
The vibrational stresses experienced by components assembled onto printed circuit boards
which have been integrated into space equipment might be higher than the levels given in the
performance specification for the space equipment. This is due to self-resonance which can
randomly occur at some specific locations on the boards. In some cases this amplified stress
might be more than 10 times higher than the inducing stress at equipment level. This aspect has
to be taken into consideration if the individual project requirements are to be used for the
assessment.
Assessment for specific space projects
The vibrational stress at equipment level is usually of a random nature, whereas the vibration
testing done at component level is normally of a sinusoidal nature. Therefore, the random
vibration requirements at equipment level must be transformed into appropriate sinusoidal
vibration requirements at component level. The self-resonance, as described above, which
might occur at equipment level must be considered additionally.
Testing method and mounting of the components
The methods used for vibration testing and mechanical shock shall be equal to or similar to the relevant test methods described in the MIL standards MIL-STD-202, MIL-STD-750 or MIL-STD-883. The device shall be rigidly fastened to the test platform and the leads or cables, if there are any, should be adequately secured.
4.2.3 Constructional Performance Characteristic
A. Description of Element
A component with good constructional performance shall have a mature construction and
design using materials with proven compatibility. The design has to be robust enough to
withstand handling and to be easily assembled into the equipment without damage using well
established assembly techniques. The components shall be made using a reproducible and
stable production process and shall show good workmanship. Materials and component constructions which are known to be critical for space applications, due to their nature or due to experience gained from previous space programmes, shall be avoided.
B. Assessment
Assessment Criteria

Acceptable for general use:
The component meets the requirements stated in the supporting information in part C of this paragraph.

Acceptable for specific project use:
The materials and construction are compliant with the specific space project requirements.

Not acceptable:
Neither of the above conditions is satisfied, or sufficient information for an assessment is not available.
Further analysis could be appropriate for this element if sufficient information is not initially
available.
No specific limits are specified for the assessment described above; the assessor has to make a decision based on his experience and on the data available. It is therefore highly recommended that the person selected to carry out the assessment has been involved in the technical aspects of parts procurement, or in the production of the kind of component under consideration, so as to have sufficient experience and skill to make the assessment.
C. Supporting Information
Construction and design
The design and construction of the component shall be mature and robust against handling and transportation stresses such as vibration, shock, ESD, etc. The individual parts of the component shall use appropriate materials, e.g. with good matching of the thermal coefficients of the lead material, package or case, insulation materials, die attachment material, etc. They shall also be able to withstand the electrical, mechanical and thermal stresses which are expected in the application by having, e.g., adequate current density, field strength, thermal conductivity, etc.
Ideally there should be evidence that, after the component was designed, the manufacturer processed at least one engineering lot and then subjected parts to constructional analysis to verify the integrity of the design.
Special design features in discrete semiconductors and integrated circuits, such as diffusion
barriers in metal layer constructions and coatings, internal bond-wiring and bonding material, die
attachment techniques and die attachment materials shall be adequate for space use.
Workmanship and production
The components shall show good and reproducible workmanship, which is an indication of
stable and reproducible processes.
Testing method for outgassing testing
The method used for testing the outgassing of components should be equivalent to or similar to
those in ECSS-Q70-02A or ASTM-E595-90. The general outgassing limits allowed are:
• TML (Total Mass Loss) < 1.0 %
• CVCM (Collected Volatile Condensable Material) < 0.10 %
• RML (Recovered Mass Loss) < 1.0 %
In some cases, the allowed limits might be lower due to special application conditions inside the
equipment, e.g. in equipment using optical components. However, in some cases it might be
allowable to use components with outgassing values higher than the above-mentioned limits.
This shall only be allowed if the total amount of outgassing material used in the space
equipment is comparatively low. This has to be verified on a case by case basis for each
individual project.
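The general limits above reduce to a simple comparison. The sketch below is a hypothetical helper; the default limits are the general values quoted above, and project-specific limits can be passed in where tighter values apply:

```python
# Hypothetical check of the general outgassing limits (all values in
# percent); per-project limits may be lower, as noted in the text above.
def outgassing_acceptable(tml, cvcm, rml,
                          tml_max=1.0, cvcm_max=0.10, rml_max=1.0):
    return tml < tml_max and cvcm < cvcm_max and rml < rml_max
```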
Prohibited Materials
Based on their nature and their behaviour in space, the following materials should be avoided
as far as possible:
• Tin coatings with less than 3 % lead, electroplated or fused (when the surface is exposed to vacuum); components with pure tin coated leads can, however, be used if the leads are 100 % dipped in tin-lead solder before use
• Cadmium, zinc or selenium, except in hermetically sealed packages
• Mercury and mercury compounds
• Lithium
• Magnesium
• Radioactive materials
• Polyvinylchloride
• Materials subject to reversion
• Materials which evolve corrosive compounds
Component types and component constructions prohibited for space use
The following component types or component constructions should be avoided for use in space programmes:
• Point contact or whisker diode construction
• Dice without glassivation
• Unpassivated power transistors
• Electrical crystals with mounting at less than 3 locations around the crystal periphery
• Non-solid electrolytic capacitors, except CLR79
• Wire-link fuses
• Hollow core resistors or capacitors
• Non-double welded relays
• Any part whose internal construction uses metallurgical bonding or solder joints with a melting temperature not compatible with end-application mounting conditions
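A screening of a component's declared materials and construction features against such lists can be sketched as follows. The sets below are abbreviated examples taken from the lists above, not the complete lists, and the stated exceptions (e.g. hermetic sealing for cadmium and zinc) are deliberately ignored in this illustration:

```python
# Illustrative screening against abbreviated prohibited lists; the set
# contents are examples only and the stated exceptions are not modelled.
PROHIBITED_MATERIALS = {
    "cadmium", "zinc", "selenium", "mercury",
    "lithium", "magnesium", "polyvinylchloride",
}
PROHIBITED_CONSTRUCTIONS = {
    "point contact diode", "wire-link fuse", "hollow core resistor",
}

def constructional_findings(materials, constructions):
    """Return the prohibited items found in the declared lists."""
    found_mat = {m for m in materials if m.lower() in PROHIBITED_MATERIALS}
    found_con = {c for c in constructions if c.lower() in PROHIBITED_CONSTRUCTIONS}
    return found_mat, found_con
```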
Assessment for specific space projects
The materials and/or parts constructions prohibited might differ from project to project. In some
exceptional cases, the use of prohibited materials or components is possible when it can be
justified by the equipment manufacturer and is accepted by the prime contractor and/or the
customer. This has to be verified on a case by case basis for each individual project.
4.2.4 Radiation
A. Description of Element
The capability of the component to withstand radiation shall be determined and shall include:
• Total ionising dose tolerance
• SEE sensitivity
For commercial parts, the values for total ionising dose shall be determined for multiple lots and shall be measured on components in the same package as that to be used. The assessment limits for SEE behaviour are an average of different space programme requirements.
B. Assessment
Assessment Criteria

Acceptable for general use:
Total ionising dose sensitivity ≥ 50 krad (estimated over multiple lots), and (where applicable)
LET for SEU > 35 MeV·cm²/mg, and
LET for SEL > 75 MeV·cm²/mg, and
no SEGR occurred at 100 % VDS at 40 MeV·cm²/mg, and
no SEBO occurred at 100 % VDS at 40 MeV·cm²/mg.

Acceptable for specific project use:
The values are within the limits specified for the specific space project.

Not acceptable:
Neither of the above conditions is satisfied, or sufficient information for an assessment is not available.
Risk analysis could be appropriate for this element if sufficient information is not initially
available.
Note that the radiation limits given in the above table are applicable at parts level.
In addition to the above assessment of the usability of the component, it is important to
determine if the component requires radiation testing of each procured lot (Radiation Verification
Testing (RVT)). The following general rule shall be applied in cases where radiation
characterisation has been performed using the multiple lot method as described in part C.
TID design margin between 1 and < 1.3 × measured threshold: use with RVT
TID design margin ≥ 1.3 × measured threshold: use without RVT
Where radiation characterisation data have been obtained from a single lot the following rule shall be applied:
TID design margin < 1.5 × measured threshold: not accepted
TID design margin between 1.5 and < 3 × measured threshold: use with RVT
TID design margin ≥ 3 × measured threshold: use without RVT
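The decision rules above can be sketched as a small function. The encoding is hypothetical; for multiple-lot data a margin below 1, which the rules leave implicit, is treated here as not accepted:

```python
# Hypothetical encoding of the RVT decision rules: margin is the TID
# design margin expressed as a multiple of the measured failure
# threshold; multiple_lot_data selects which rule set applies.
def rvt_decision(margin, multiple_lot_data):
    if multiple_lot_data:
        if margin >= 1.3:
            return "use without RVT"
        if margin >= 1.0:
            return "use with RVT"
        return "not accepted"       # implicit below a margin of 1
    if margin >= 3.0:
        return "use without RVT"
    if margin >= 1.5:
        return "use with RVT"
    return "not accepted"
```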
C. Supporting Information
Applicability
Radiation susceptibility and testing are principally applicable to integrated circuits, discrete
semiconductor devices and quartz crystal oscillators. There are however a number of other
radiation sensitive materials which, although not generally used in traditional space qualified
components, might be found in commercial components. These include a variety of plastic
materials which can display significant changes in their mechanical properties when subjected
to irradiation.
Procedure for assessment of total ionisation dose data
In order to simplify the processing of commercial components foreseen for space use the
following approach for multiple lot testing shall be used as far as possible. This approach is
similar to the requirements in Paragraph 5.8.8 of Military Handbook MIL-HDBK-814 (Ionising
Dose and Neutron Hardness Assurance Guidelines for Microcircuits and Semiconductor
Devices).
• Total dose data assessment will be carried out on small sample sizes (5 to 15 samples) taken from different wafer lots. Care must be taken that the samples really come from different wafer lots, and not only from different packaging or inspection lots taken from the same wafer lot.
• Total dose data assessment shall be carried out for each sample separately. The test results for each parameter that is affected by total dose irradiation shall be assessed by statistical methods: mean value and standard deviation shall be calculated for each sample separately.
• The end points of parameter variation are then calculated for each parameter and for each sample (this is an assessment of each individual lot).
• The number of samples from different lots to be subjected to characterisation testing is not specified in MIL-HDBK-814. However, in order to get a representative test result, the number of samples (number of different lots) should be 3 or larger.
• Finally, the end point of parameter variation over all production lots is assessed by statistical methods. Assuming that this assessment is carried out for a high confidence level, the obtained final end point represents the worst case of parameter degradation over all lots.
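The per-lot and over-all-lots end-point calculation can be sketched as follows. This is illustrative only, for a parameter that degrades upwards; the factor k = 3 is a hypothetical high-confidence choice, not a value taken from MIL-HDBK-814:

```python
import statistics

# Illustrative sketch of the multiple-lot end-point calculation: each
# lot's end point is mean + k*sigma of the post-irradiation parameter,
# and the final end point is the worst case over all lots.
def lot_end_point(values, k=3.0):
    return statistics.mean(values) + k * statistics.stdev(values)

def final_end_point(lots, k=3.0):
    return max(lot_end_point(values, k) for values in lots)
```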
Traceability requirements
If radiation verification testing is required for total ionising dose, based on the assessment
requirements as given in part B, then the lot traceability must be sufficient to allow all delivered
parts to be traced back to their appropriate wafer processing lot. If radiation verification testing is
not considered necessary for delivered lots then the traceability must still be sufficient to ensure
that all delivered parts are made using the same processing as the assessed parts.
Basis of measurement data
• Total ionising dose
The testing used for the assessment shall have requirements equivalent or similar to those stipulated in ESA/SCC 22900 or MIL-STD-883, Method 1019. The type of radiation source shall be as defined therein. Whether or not low dose rate test data are required depends on whether the technology of the component and/or similar components is known to be low dose rate sensitive. Existing test data used for assessment should not be older than 5 years and shall be representative of the technology, manufacturer, production line, process, design, etc. which will be applied for the commercial components for space use. Whether the electrical parameter change limits applied in the total dose test data are appropriate can be assessed on the basis of the limits for a similar traditional hi-rel space component type given in Table 7 of an ESA/SCC detail specification or applied in a MIL specification for radiation hardness assurance testing.
• Single event effects
The testing for the data used for the assessment of SEE behaviour shall be equivalent or similar to ESA/SCC 25100 or MIL-STD-883, Methods 1020, 1021, 1022 and 1023. Test methods, sample sizes and failure indication shall be equivalent to these standards. The data used for assessment shall not be older than 5 years, and it has to be verified that the technology, manufacturer, production line, process, design, etc. are representative of the commercial components foreseen for space use.
Radiation Verification Testing
RVT of total ionising dose sensitivity shall be conducted in accordance with the requirements stated in ESA/SCC 22900 or in MIL-STD-883, Method 1019. Part types or
technologies that have been identified in accordance with any project requirements as sensitive
to low dose rates shall be submitted to low dose rate testing.
Displacement Damage Degradation
Electronic devices may also suffer from displacement damage degradation when the spacecraft
or satellite orbit is exposed to a proton environment. Although transistors and linear
microcircuits may also be affected by this effect, it is primarily opto-electronic devices (CCDs,
opto-couplers, etc.) which are typically categorised as sensitive.
No fixed testing method exists yet for the assessment of proton induced damage in
semiconductor devices. That part of the deposited energy which creates displacement defects
(in the semiconductor’s lattice structure) is called Non-Ionising Energy Loss (NIEL). The trapped
proton flux spectrum is converted into a fluence of mono-energetic particles producing the same
amount of defects (typically 1 MeV neutrons or 10 MeV protons). For convenience an equivalent
fluence of 10 MeV protons in silicon is usually selected for any assessment. In defining the
fluence at component (or semiconductor) level some shielding can be included and the value is
typically expressed as a fluence of 10 MeV equivalent protons behind aluminium spherical
shields of various thicknesses. The transmitted proton fluxes have been computed with NOVICE
code for a silicon detector in a 1 m diameter aluminium sphere.
For any mission this curve provides a lower bound Displacement Damage Equivalent Fluence of
φeq(10 MeV, Si) = DDEF p/cm² at 10 MeV protons
For projects where displacement damage degradation is applicable all sensitive parts (bi-polar
and opto-electronic devices) should survive the DDEF calculated for the mission. The
acceptance of the parts should be based on displacement damage test data which are
considered in addition to the total dose electrical parameter drifts in a worst case analysis. The
data should be taken from neutron testing data bases and proton test results and equivalence
between proton and neutrons can be deduced from the environment specification. If no data are
available proton irradiation evaluation testing should be considered. Due to the extremely high
cost of proton testing, however, the use of commercial versions of displacement damage
sensitive components types will generally not be a realistic option.
4.2.5 Process Control
A. Description of Element
Process control is a methodology used to detect defective processes prior to completion of assembly. It includes the preventive, monitoring and corrective actions conducted during a process run, using adequate quality tools, with the intention of manufacturing components that meet the quality and reliability requirements. Unfortunately, direct assessment of process control is difficult because there is no single quantitative result or limit which can be associated with it. Also, any quantitative results which a manufacturer might generate, or limits which might be applied, are normally treated as highly confidential proprietary information and are not released (although they are, of course, used internally to control the processes as an intrinsic monitor of a process's or product's quality).
B. Assessment
Assessment Criteria

Acceptable for general use:
The topics described in part C of this paragraph show satisfactory results when assessed.

Acceptable for specific project use:
Not applicable.

Not acceptable:
The above condition is not satisfied, or sufficient information for an assessment is not available.
Further analysis is not considered appropriate for this element.
C. Supporting Information
The key element "Process Control" cannot be assessed by estimating an individual figure or by comparison with a specific limit. The acceptability of this key element therefore has to be judged on the basis of an assessment of the following related elements:
• Process Maturity
• Process Description and Documentation
• Wafer Level Reliability (for semiconductor components)
• Statistical Process Control (SPC)
• Defect Control
• Yield Analysis
• Design Rules
Process Maturity
Process maturity gives a measure of the stability and the reproducibility of a production process
since introduction of that process to the market. It also covers the status of process
improvement over the period of application due to lessons learned from previous production
runs using the same or a similar production process.
The following points shall be considered for the assessment of process maturity:
1. The history of the technology and processes used, including the date of establishment and
any changes that have been introduced.
2. The justification for any changes which have been made to the technology or process, if
applicable.
3. The volume of components manufactured by the same processes without major changes.
4. The date and history of changes to the production line, e.g. change of facility, change of
production line, changes within an existing production line or the corresponding processes.
5. Feedback data from field experience, which may be correlated to the processes used for
manufacturing.
6. The long term trend of AOQL, FIT rates and process capability figures (Cp, Cpk). AOQL and
FIT rate should show a decreasing tendency and the process capability values should show
an increasing tendency.
No specific limits are given for the above mentioned points for assessment. The assessor has to
make a decision based on his experience and on the data available, while considering the
impact and the importance of the points for the key element that is under consideration.
A baseline for the assessment may be to compare these points with the performance of a state
of the art manufacturer or process, or with the best in class component or manufacturer.
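The trend check in point 6 above can be sketched with a few helpers. These are illustrative only, with hypothetical names; the Cpk formula is the standard capability index from specification limits (LSL, USL) and process statistics:

```python
# Illustrative helpers for point 6: process capability Cpk, plus simple
# monotonic-trend checks for periodic AOQL / FIT / capability figures.
def cpk(mean, sigma, lsl, usl):
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

def shows_decreasing_trend(series):     # expected for AOQL and FIT rate
    return all(b <= a for a, b in zip(series, series[1:]))

def shows_increasing_trend(series):     # expected for Cp / Cpk
    return all(b >= a for a, b in zip(series, series[1:]))
```

In practice the trend would be judged over periodic figures (e.g. quarterly AOQL values) rather than required to be strictly monotonic.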
Process Description and Documentation
This element is applied in order to verify that all activities for designing and producing a component, as well as the test steps and the test data obtained during these procedures, are well documented by the manufacturer.
The following points shall be considered for assessment of process description and
documentation:
1. The design of the component and the material used for the individual parts of the
component are well documented in drawings which are under the configuration control of
the manufacturer.
2. All processes used for production and testing of the component as well as the elements and
variables that make up a process, such as equipment, personnel, materials, environment,
etc., shall be documented. The manufacturer's documents shall be under configuration
control.
3. The test data obtained during in-process control, final production testing or screening
testing, if applicable, shall be documented and stored for a defined time period.
4. Notice of process changes shall be made available to the customers according to
EIA/JESD-46-A or a similar standard.
No specific limits are given for the above mentioned points for assessment. The assessor has to
make a decision based on his experience and on the data available, while considering the
impact and the importance of the points for the key element under consideration.
A baseline for the assessment may be to compare these points with the performance of a state
of the art manufacturer or process, or with the best in class component or manufacturer.
Wafer Level Reliability
For semiconductor components, Wafer Level Reliability (WLR) makes use of dedicated test
structures which are fabricated with the products on production wafers and which are used to
test the reliability of the products on that wafer. WLR test structures can often be designed so
that measurements can be carried out before the wafers have been fully processed, giving an
early assessment of the reliability of the production devices. The test structures are designed so
that a specific failure mechanism is assessed by a specific test structure. In that test structure,
changes due to the failure mechanism are accelerated so that an assessment can be achieved
in a matter of seconds.
It has been established that the quality and reliability of products are strongly related to lot
processing, i.e. a small number of lots exhibits poor reliability performance. Wafer level screening
and removal of lots with bad reliability performance can drastically improve the overall product
reliability. This is one of the major advances in integrated circuit reliability control and has
enabled manufacturers to reduce or eliminate expensive screening of many packaged devices.
Due to the relatively short time required to conduct WLR measurements, the statistical basis for
reliability prediction can be substantially increased.
Statistical Process Control (SPC)
Statistical Process Control is a system of established and documented measurements and
inspections, which are carried out at pre-defined stages of the production. The obtained data
shall be analysed and assessed. The test results shall be entered into feedback loops to other
nodes of the production process, which enables the operational personnel to conduct corrective
actions, if required, or to adjust the process in such a way that the outcome will be of sufficient
quality and reliability.
SPC is a generic term and covers many different methods and philosophies, e.g. control charts,
capability indexes, Parts Per Million (PPM), six σ, attributive testing or testing by variables etc.
Numerous papers and standards dealing with this subject are available. In many cases
standards for SPC testing do not only specify statistical methods, they also include
implementation and maintenance rules for such a system. These elements are not stand-alone
quality elements; they are elements of the overall quality system of the manufacturer. Therefore,
it is not considered appropriate to specify the use of a specific standard or methodology. The
identification or description of the kind of statistical methods applied is therefore also part of this
element, together with their effectiveness for control of the production process and the
introduced corrective actions resulting from the findings.
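One of the control-chart methods named above can be sketched in a few lines. This is an illustrative example only, not a method prescribed by this procedure: the subgroup data are invented, and deriving the 3-sigma limits from the standard deviation of the subgroup means (rather than the conventional R-bar/A2 chart factors) is a simplification.

```python
# Hedged sketch of a Shewhart X-bar chart: center line and 3-sigma control
# limits on subgroup means, then a check for out-of-control subgroups.
import statistics

def xbar_limits(subgroups):
    """Center line and 3-sigma control limits for subgroup means."""
    means = [statistics.mean(g) for g in subgroups]
    center = statistics.mean(means)
    spread = statistics.stdev(means)      # std dev of the subgroup means
    return center - 3 * spread, center, center + 3 * spread

# Five subgroups of an in-process measurement (illustrative numbers)
data = [[5.01, 4.98, 5.02], [4.99, 5.00, 5.03], [5.02, 5.01, 4.97],
        [4.98, 5.04, 5.00], [5.00, 4.99, 5.01]]
lcl, cl, ucl = xbar_limits(data)
out_of_control = [i for i, g in enumerate(data)
                  if not lcl <= statistics.mean(g) <= ucl]
print(lcl, cl, ucl, out_of_control)
```

A subgroup mean falling outside the limits would trigger the corrective-action feedback loop described above.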
The following points shall be considered for assessment of in-process control and SPC:
1. Assessment of capability figures shall be carried out according to the supporting information
given at the end of this element. Further baselines for the use and application of Cpk are
given in EIA-738.
2. The statistical rules and methods used by the manufacturer for the determination and
calculation of SPC values and the assessment of their effectiveness shall be equivalent to or
similar to those given in EIA/JEP-132, EIA-557-A, MIL-HDBK-683. For example, an audit
checklist for a typical SPC system is given in annex C of EIA-557-A, or an evaluation guide
for a supplier's SPC programme is given in chapter 5 of MIL-HDBK-683.
3. Critical production steps and production nodes for the technology and processes used shall
have been identified by the manufacturer.
4. The SPC methods used must be able to control the intended quality goals of the
manufacturer in an effective manner. That is, the sample size applied for statistics, the
periods when they are repeated, either time dependent or volume dependent, the test
parameters chosen for measurements, and the mathematics used for the calculation of the
statistical values must be chosen in such a manner that there is enough test severity to
demonstrate that process control is effectively achieved.
5. The in-process control / SPC shall be effectively embedded into the overall quality system of
the manufacturer. Guidelines for that can be taken and derived, for example, from MILHDBK-683. In addition the in-process control / SPC system of the manufacturer shall be
able to carry out any necessary corrective actions in an effective manner.
6. Component or technology specific in-process controls shall be equivalent to state of the
art standards.
7. For integrated circuits, the use of Process Control Monitors (PCMs), Technology Control
Vehicles (TCVs), Dynamic Evaluation Circuits (DECs), etc. to monitor process quality.
8. For discrete semiconductors and integrated circuits, the monitoring of wafer defect densities
and distributions and the comparison of adjacent dice on a wafer.
9. The use of in-process tests and controls to detect and remove “anomalous” components at
an early stage in the manufacturing process wherever this is possible.
No specific limits are given for the above mentioned points for assessment. The assessor has to
make a decision based on his experience and on the data available, while considering the
impact and the importance of the points for the key element that is under consideration. A
baseline for this assessment might be to compare these points with the performance of a state
of the art manufacturer or process, or with the best in class component or manufacturer.
If a manufacturer uses capability figures Cp and Cpk for in-process control of his process, then
the following values for Cp and Cpk can be taken as guidance for the assessment of the
process.
Tolerance zone    Cp      Comments, with respect to state of the art technology
in σ units
 6                1.00    Not acceptable
 8                1.33    Not recommended
10                1.67    Acceptable standard
12                2.00    Recommended; comparable to 6σ philosophy

Process average to next     Cpk     Comments, with respect to state of the art technology
tolerance limit in σ units
3                           1.00    Not acceptable
4                           1.33    Not recommended
4.5                         1.50    Recommended; minimum Cpk value which is comparable
                                    to the 6σ philosophy
5                           1.67    Good
6                           2.00    Excellent; this value would be reached in the 6σ
                                    philosophy for processes which are stable and whose
                                    average is located in the middle of the tolerance band
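The Cp and Cpk definitions underlying the guidance values above can be computed directly from in-process measurement data. The following is a minimal illustrative sketch; the tolerance limits and readings are invented, not values taken from this procedure.

```python
# Illustrative computation of the process capability indices Cp and Cpk
# for a measured parameter with a two-sided tolerance band.
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indices for lower/upper specification limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # tolerance width in 6-sigma units
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # distance to the nearer limit
    return cp, cpk

# Example: a trimming process with limits 990..1010 ohms (invented data)
readings = [999.8, 1000.5, 1000.1, 999.2, 1000.9, 999.6, 1000.3, 999.9]
cp, cpk = cp_cpk(readings, 990.0, 1010.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

The resulting indices would then be compared against the bands in the tables above; Cpk is always less than or equal to Cp, and the two coincide only for a perfectly centred process.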
Defect Control
Because early and random failures are generally the result of defects in components, defect
removal or reduction is the most effective means to increase the reliability of components. Some
measures that are applied to control defects, mainly in semiconductor components, are:
• Fabrication in extreme clean room conditions so that particle contamination is reduced to a
minimum.
• Effective monitoring procedures to control particle defects and processing defects.
• Inspection procedures to detect defects, for example visual inspection of chips or
automatic optical pattern recognition techniques (commercial equipment is available for
comparison of neighbouring dice).
In addition to optical inspection, electrical measurements during manufacturing can also be
used to identify defects at wafer level.
Manufacturers have implemented a wide range of different measures for defect control. Defect
control has become an important part of modern semiconductor manufacturing techniques.
Visual inspection is the most well known method for defect control and a number of standards
for internal visual inspection are available, e.g. MIL-STD-883 Method 2010. In military, space
and some automotive applications 100% visual inspection of completed wafers is carried out.
For commercial components, visual inspection is generally carried out on a sampling basis. A
further possibility for visual inspection is the use of pattern recognition equipment to identify
problem dice and then manual visual inspection to check those dice.
Yield analysis
The manufacturer shall have procedures implemented in his manufacturing facilities and
processes to determine the yield of the products for each production lot. Moreover, individual
yield limits might be necessary for individual, critical process steps in order to achieve an
acceptable yield figure for the finished product.
The application of yield limits at critical process steps is to be handled as a gating control for
further processing, i.e. if minimum yield figures are not met, the lot has to be rejected and either
scrapped or returned for re-processing (if this is at all feasible). In addition an analysis must be
initiated to determine the root cause of insufficient yield, which will then be fed back to the
critical process for appropriate adjustment.
Accordingly a manufacturer shall have documented procedures for the disposition of lots or
semi-finished products with yield outside of limits, which shall be defined for each product.
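The gating logic described above can be sketched as follows. This is an illustrative example only; the step names and yield limits are invented assumptions, not values from this procedure.

```python
# Hedged sketch of yield gating at critical process steps: a lot is flagged
# for rejection and root-cause analysis if its yield at any critical step
# falls below the limit defined for that step.
def gate_lot(step_yields, step_limits):
    """Return the critical steps at which the lot misses its yield limit."""
    return [step for step, y in step_yields.items()
            if y < step_limits.get(step, 0.0)]

# Hypothetical limits per critical step, and one lot's measured yields
limits = {"metallisation": 0.95, "wafer probe": 0.80, "final test": 0.97}
lot = {"metallisation": 0.97, "wafer probe": 0.74, "final test": 0.98}

failed = gate_lot(lot, limits)
if failed:
    print("reject lot, start root-cause analysis at:", failed)
else:
    print("lot released for further processing")
```

In this sketch the lot fails the wafer-probe gate, so it would be scrapped or returned for re-processing and the root-cause analysis fed back to that process step.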
Design Rules
These shall consist of a set of rules, which are applied to each new design or design
modification of a component or a part of a component, with the intention of designing high
quality and reliability into the components. The applied rules shall be documented in an
appropriate manner and shall be updated due to lessons learned from realised designs and
from state of the art requirements. The design rules may either be very specific and only related
to individual part types or part type families, e.g. specific rules for current density, power
dissipation, field strength, etc. for a specific technology or component, or they can be general
rules, e.g. the use of design tools such as FMECA (Failure Modes, Effects and Criticality
Analysis), DOE (Design of Experiments), etc. for products or processes.
The following points shall be considered for assessment of design rules:
1. The manufacturer shall have established a set of design rules for each component,
component family or technology used.
2. The manufacturer shall have established a set of design rules, which are applied for each
new design, or design modification of a component, or part of a component (e.g. a die). The
design rules shall be continuously improved and amended, e.g. state of the art design rules
and lessons learned from realised design shall be implemented.
3. Process or construction FMECA or similar tools shall have been applied prior to
implementation of new and modified processes or designs into the production process.
Basic rules for conducting a FMECA are given in ECSS-Q30-02A.
4. The application of specific design rules for specific component types of families of
components, or for a specific technology shall be considered for the assessment. The
assessment of which design rules have to be used for which technology or process depends
on the experience of the assessor with this subject. State of the art design rules shall be
considered for the assessment and estimation of this element.
5. Design of experiments (DOE), factorial analysis, or similar design tools shall have been
used for determination of new or changed processes or technologies prior to their
implementation into the production process.
6. The manufacturer shall apply design models for worst case temperature and electrical
extremes, and for rules of thermal design and of package design.
7. Implementation of feedback loops from design, material and process development activities
into design guidelines shall have been implemented by the manufacturer.
8. For integrated circuits it shall be verified that each new or modified design will be checked
as follows:
- Design rule check of geometrical and physical data
- Design verification by simulation
- Electrical, shorts and connectivity design rule check
- Check of reliability design rules like electromigration and current density, reverse current
drops, latch-up, SEU (Single Event Upset), hot electrons, ESD (Electrostatic Discharge),
burnout, or back-gating, as applicable
- Verification of full functional and parametric testability at wafer level and on packaged
devices
As a baseline for the assessment of the design rules applied at a manufacturer for a specific
component or technology the following points may be considered:
1. For transformers, inductors and coils the requirements of MIL-STD-981 can be considered.
2. Design rules for discrete semiconductors, integrated circuits and hybrid circuits can be
derived from the MIL performance specifications MIL-PRF-19500, MIL-PRF-38534 and MIL-PRF-38535.
3. Guidelines for process FMEA are given in document EIA/JEP-131.
4. Guidelines for design tools like DOE, FMEA, finite element modelling and analysis, etc. are
given in EIA/JEP-132.
The examples given above for design rules are only a baseline; the application of specific rules
depends strongly on the component type and on the technology used.
No specific limits are given for the above mentioned points for assessment. The assessor has to
make a decision based on his experience and on the data available, while considering the
impact and the importance of the points for the key element under consideration. Therefore, it is
highly recommended that the person carrying out this assessment has been involved with the
technical aspects of parts procurement or with the production of this kind of components in
order to have sufficient experience and skills to be able to make the assessment.
4.2.6 Reliability Assessment System

A. Description of Element
The reliability assessment system is a documented manufacturer's standard of the procedures,
methods and test conditions which are used for obtaining reliability data on components, either
by special reliability testing or by other means for evaluating reliability such as existing field data
analysis for a similar component or technology. The basis for reliability calculations and the
background and the justification for using the applied rules shall also be considered. The
outcome of the assessment of the manufacturer's reliability assessment system shall determine
whether or not there is sufficient justification and confidence that the manufacturer's reliability
data have been determined in an adequate manner according to state of the art methods.
B. Assessment

Assessment Criteria:
Acceptable for general use: The reliability system is based on state of the art means and
inspires confidence that useful data can be derived from the system, considering the baselines
given in the supporting information in part C of this paragraph.
Not acceptable: The above condition is not satisfied, or sufficient information for an
assessment is not available.

Further analysis is not considered appropriate for this element.
C. Supporting Information
Definition of failure
For comparison of different data it is essential to have a clear understanding of what is defined
as a component failure and how many different failure categories are defined. The assessor has
to distinguish carefully between catastrophic failures, where a component is no longer working
at all; parametric failures, where the component is still working but at least one parameter is
out of the specified range; and drift failures, where the component is still working well but at
least one parameter has exceeded a specified drift limit after a specific period of operation.
The nature of the failure as well as the limit which is exceeded have to be carefully
considered by the assessor in order to guarantee that the calculated reliability figures can be
used to make a comparison between different component types or manufacturers.
Test conditions and acceleration factors
The conditions which were applied during testing for reliability figures shall be well defined. The
nature and the value of the acceleration factors used for the calculations, e.g. high temperature,
higher voltage, higher current or higher power shall be clearly defined. There shall also be a
justification, based on test results or on published state of the art theoretical considerations, that
the conditions defined for testing and for use, and the resulting acceleration factors used, are
applicable for the components.
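Temperature acceleration of the kind discussed above is commonly modelled with the Arrhenius relation. The following sketch illustrates this; the activation energy of 0.7 eV is an assumed example value, not one taken from this document, and the applicable value must be justified per failure mechanism as the text requires.

```python
# Hedged sketch of the Arrhenius acceleration model for a high-temperature
# life test relative to the use condition. Ea = 0.7 eV is an assumption.
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    """Acceleration factor of a stress temperature relative to use temperature."""
    t_use = t_use_c + 273.15       # convert degC to K
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Life test at 125 degC versus the 55 degC reference temperature used
# for reliability data in this procedure
af = arrhenius_af(55.0, 125.0)
print(f"acceleration factor ~ {af:.0f}")
```

Each device-hour at the stress temperature then counts as roughly `af` device-hours at the use temperature when failure rates are transformed to the 55 °C reference condition.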
Additionally it should be checked whether the parts life tested to calculate the reliability figures
were submitted to any burn-in prior to the test. If any burn-in testing was applied, then it should
be verified whether procured parts will also be burned-in before delivery. If not, it must be
verified whether the calculated reliability figures can be modified to allow for this or whether
burn-in can be performed on all parts after procurement.
The status of the components for which the data is applicable should be well specified, e.g. are
the data applicable for packaged components, or are they only applicable for dice, so that a
package related factor has to be added in order to obtain a comparable baseline?
Standards for reliability data assessment
As guidelines for the assessment of a reliability assessment system the standards and
publications JESD-34, EIA/JESD47, EIA/JEP-122, EIA-659 or any other similar documents
can be used. JESD-34 deals with failure-mechanism-driven qualification of silicon devices and
EIA/JESD47 deals with stress-test-driven qualification of integrated circuits, whereas
EIA/JEP-122 provides the failure mechanisms and models for silicon semiconductor devices and
EIA-659 gives support for failure-mechanism-driven reliability monitoring systems.
Baselines to be considered
The following points shall be considered for the assessment of the reliability system:
• The nature of the failures considered for the reliability system as well as the limits applied
for the analysis have to be well defined.
• The testing conditions for reliability figures and the natures of the acceleration factors as
well as the applied values of acceleration have to be well defined and should be justified by
appropriate means.
• The system shall have clear definitions for when a new or an updated reliability assessment
for a component, a technology or a production line has to be conducted, e.g. for new or
changed components design, for new or changed technology, or for new or changed
production line processes.
• The calculated reliability figures for a specific component or technology shall be updated
after any new reliability assessment or verified for validity at least once a year.
• The methods, means and processes used by the manufacturer for reliability assessment
shall be well supported by appropriate justification and background information.
• For integrated circuits: does the manufacturer monitor reliability at wafer level by including
appropriate test structures which can be used for highly accelerated on-wafer reliability
measurements of the known failure mechanisms of the device?
• For integrated circuits: is the correlation between wafer level reliability and component
reliability well established and verified?
• Is failure analysis performed on field failures to provide additional information on failure
mechanisms and reliability?
No specific decision limits are given for the above mentioned assessment points. The assessor
has to decide whether or not the system is able to deliver actual reliability figures with a high
level of confidence, while covering all critical parameters under the temperature conditions used
in the space equipment.
4.2.7 Reliability Data

A. Description of Element
The values of failure rates, which were estimated and calculated by the manufacturer based on
the manufacturer's reliability assessment system, are defined as reliability data. For
comparability of assessed values obtained from different testing or application conditions the
values shall be transformed to values for 55°C ambient package or case temperature
considering all the acceleration factors applied during testing.
B. Assessment

Assessment Criteria:
Acceptable for general use: The failure rates of the components are equal to or better than
the values given in the supporting information in part C of this paragraph.
Acceptable for specific project use: The values are within the limits specified for the specific
space project.
Not acceptable: Neither of the above conditions is satisfied, or sufficient information for an
assessment is not available.
If reliability data are available but it is uncertain how applicable they are to procured lots (e.g.
reliability testing was performed on screened parts but procured parts are supplied unscreened)
then further analysis could be appropriate.
C. Supporting Information
Baseline for failure rates
The component failure rates used as a baseline for the reliability analysis of specific space
equipments are traditionally those described in ESA PSS-01-302 and MIL-HDBK-217. These
rates can still be valid for current projects, even if the documents are no longer under
configuration control and will not be updated. Other national or international documents or
databases can also be used to provide suitable baseline figures.
The target maximum failure rates for commercial components for general space use are stated
in the table below, but actual values differing from these by an order of magnitude would still be
acceptable for most space projects. These failure rates should be applied for a confidence level
of 60% at a package or case temperature of 55 °C.
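A failure rate at 60% confidence is typically derived from life-test results with the chi-square upper bound, λ = χ²(CL; 2f+2) / (2 · device-hours · AF). The sketch below is illustrative only: the test data and acceleration factor are invented, and the chi-square quantile uses the Wilson-Hilferty approximation so that only the Python standard library is needed.

```python
# Hedged sketch: upper-bound failure rate in FIT (failures per 1e9 device
# hours) at a given confidence level from accelerated life-test data.
from statistics import NormalDist

def chi2_quantile(p, dof):
    """Wilson-Hilferty approximation of the chi-square quantile."""
    z = NormalDist().inv_cdf(p)
    a = 2.0 / (9.0 * dof)
    return dof * (1.0 - a + z * (a ** 0.5)) ** 3

def fit_at_confidence(failures, device_hours, accel_factor, confidence=0.60):
    """Chi-square upper bound on the failure rate, expressed in FIT."""
    chi2 = chi2_quantile(confidence, 2 * failures + 2)
    lam = chi2 / (2.0 * device_hours * accel_factor)  # failures per hour
    return lam * 1e9

# Example: 2 failures in 1000 devices * 2000 h at an acceleration factor of 78
fit = fit_at_confidence(2, 1000 * 2000, 78.0)
print(f"{fit:.1f} FIT at 60% confidence")
```

The resulting FIT value would then be compared against the target rates in the table below for the relevant component group.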
Component Group                                  Proposed FIT rates in 1/10⁹ hrs

Resistors, fixed                                 0,3
Resistors, variable                              0,5
Resistors, chip                                  0,3
Capacitors, fixed                                0,3
Capacitors, variable                             2,0
Capacitors, ceramic chip                         0,5
Diodes, general purpose, analogue, switching,    0,5
rectifier, Schottky rectifier, Zener, voltage
regulator, current regulator
Diodes, Schottky, detector, mixer; Si and GaAs   1
Transistors, NPN, PNP                            ≤1 W: 0,05   >1 W: 1
Transistors, RF                                  No specific value proposed
Transistors, microwave                           No specific value proposed
Transistors, FET, Si                             No specific value proposed
Transistors, FET, GaAs                           No specific value proposed
Integrated Circuits, digital, bipolar            100 G: 1,5    1000 G: 2     3000 G: 4
(G = Gates)                                      10000 G: 11   30000 G: 18   60000 G: 25
Integrated Circuits, digital, MOS                100 G: 2      1000 G: 4     3000 G: 7
(G = Gates)                                      10000 G: 17   30000 G: 28   60000 G: 45
Integrated Circuits, linear, bipolar             100 T: 4      300 T: 6
(T = Transistor functions)                       1000 T: 11    10000 T: 17
Integrated Circuits, linear, MOS                 100 T: 4      300 T: 6
(T = Transistor functions)                       1000 T: 11    10000 T: 17
Integrated Circuits, memories, RAM, bipolar      16 k: 2,5     64 k: 4     256 k: 6     1000 k: 11
Integrated Circuits, memories, RAM, MOS          16 k: 3       64 k: 5     256 k: 10    1 M: 15    4 M: 25
Integrated Circuits, memories, ROM, bipolar      16 k: 4       64 k: 6     256 k: 10    1000 k: 18
Integrated Circuits, memories, ROM, MOS          16 k: 2,5     64 k: 3     256 k: 3,5   1000 k: 6
Integrated Circuits, µP bipolar                  8 bit: 10     16 bit: 20   32 bit: 40
Integrated Circuits, µP MOS                      8 bit: 15     16 bit: 30   32 bit: 60
Hybrids                                          No specific value proposed, depends on complexity
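As a simple illustration of how such FIT targets are used in equipment reliability analysis, the expected number of failures over a mission is the summed FIT burden multiplied by the mission hours. The parts list below is hypothetical; the FIT values are taken from the table above.

```python
# Illustrative use of FIT rates: expected failures for a small hypothetical
# parts list over a 10-year continuously powered mission.
MISSION_HOURS = 10 * 365.25 * 24  # about 87,660 h

parts = {                                # (quantity, FIT from the table)
    "resistor, fixed": (120, 0.3),
    "capacitor, ceramic chip": (80, 0.5),
    "IC, uP MOS, 16 bit": (2, 30.0),
}

total_fit = sum(qty * fit for qty, fit in parts.values())
expected_failures = total_fit * MISSION_HOURS / 1e9  # FIT = failures per 1e9 h
print(f"total {total_fit:.0f} FIT -> {expected_failures:.4f} expected failures")
```

Note that a sum of exponential failure rates assumes a constant-rate (random failure) regime; early failures are addressed separately via burn-in, as discussed below.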
Consideration of burn-in testing
If the above mentioned failure rates are achieved without any burn-in testing, then the
components are considered to be applicable for space use without burn-in testing on the flight
components. However, if the above failure rates can only be achieved by components after
application of a burn-in, then the components should only be considered for space flight
application if they have been burned-in before procurement or can be burned-in after
procurement. The burn-in conditions for the space components shall be equivalent to or
comparable to the burn-in conditions applied to the components used for failure rate testing.
Age of the reliability data and verification of conformity of failure rate figures
The most recent data used for reliability calculations should not be older than 2 years. If this
requirement is not met, there should at least have been a verification of the validity of the
reliability figures during the last year.
4.2.8 Final Production Electrical Measurements

A. Description of Element
The final production electrical measurement is a 100% electrical testing of the components at
the end of the production process with the intention to detect and remove all those components
from the production lot which are unable to satisfy the electrical performance requirements
given in the data sheet or the procurement specification.
B. Assessment

Assessment Criteria:
Acceptable for general use: At least all critical electrical parameters of the components are
submitted to 100% final production electrical measurements.
Not acceptable: Not all critical electrical parameters are 100% measured, or sufficient
information for an assessment is not available.
Further analysis could be appropriate for this element if it is not certain that all critical
parameters are 100% measured by the manufacturer. As a risk control exercise it would be
possible to perform 100% measurement of critical parameters on all flight parts after receipt.
C. Supporting Information
Test methods and parameters to be measured
Neither the parameters tested nor the test methods used need be identical to those specified in
the data sheet or the procurement specification. Final production electrical testing may either be
performed by measuring the parameters specified in the data sheet or the detail specification
directly, or by using substitute measurement methods and parameter limits, e.g. for better
throughput. However, there must be a high correlation between the guaranteed performance
and the substitute methods and limits used. This shall be demonstrated by the assessment of
appropriate test data or by analysis reports.
Ideally final production electrical testing should be performed to tighter limits than those
specified in the data sheet (guard banding). In many cases there is a correlation between the
amount of guard banding and the reliability of the supplied devices.
4.2.9 Average Outgoing Quality

A. Description of Element
The average outgoing quality is the average proportion of non-conforming components, in parts
per million, from a series of lots. The value of the average outgoing quality shall be determined
by sample testing of finished components prior to shipment or storage. The method of AOQL
determination shall be equal to or similar to a state of the art standard. The parameters to be
measured should have been defined by the manufacturer. Assessing whether or not all critical
parameters are covered has to be based on the experience and skills of the assessor.
The average outgoing quality shows the average number of faulty components that are
accepted for shipment even after they have been subjected to all production testing including
final production electrical measurements. Possible reasons for shipment of faulty components
could be test equipment temporarily functioning incorrectly or bad handling of components after
production testing. Examples of these might be a component with a high internal leakage
current being accepted because a bad high resistance test connection reduced the current, or a
component being damaged by ESD. The AOQL values given below for general assessment
have been derived from values required for traditional hi-rel space components and given in
publications on commercial parts.
B. Assessment

Assessment Criteria:
Acceptable for general use: The method for AOQL determination is state of the art, and
AOQL ≤ 50 ppm for electrical parameters and
AOQL ≤ 100 ppm for mechanical parameters (but see part C of this paragraph).
Not acceptable: AOQL unknown, or
AOQL > 50 ppm for electrical parameters, or
AOQL > 100 ppm for mechanical parameters (but see part C of this paragraph), or
the method for AOQL determination is not considered satisfactory,
or sufficient information for an assessment is not available.
Further analysis could be appropriate for this element if it is initially assessed as not acceptable
for any reason.
C. Supporting Information
Method of AOQL determination
The methods and rules for calculation of the AOQL values shall be equivalent to or similar to the
requirements given in EIA-554-A, EIA-554-1, EIA-554-2, EIA-591 and JESD16-A. These
documents deal with standards and baselines for the assessment of quality levels in parts per
million (PPM). If none of these standards is applied it shall be verified that the calculation rules
applied by the manufacturer are equivalent to or comparable to the requirements stipulated in
those standards.
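As a simple illustration of a ppm-based outgoing quality figure (a point estimate only, not one of the standard AOQL calculation methods named above), defectives found in pre-shipment samples can be pooled over a series of lots. The sample sizes and defect counts below are invented.

```python
# Hedged sketch: pooled point estimate of outgoing quality in ppm from
# pre-shipment sample results, checked against the 50 ppm electrical limit.
def outgoing_ppm(lot_samples):
    """Pooled defectives per million across (sample_size, defectives) pairs."""
    tested = sum(n for n, _ in lot_samples)
    defective = sum(d for _, d in lot_samples)
    return 1e6 * defective / tested

# (sample size, defectives found) per lot -- illustrative numbers
lots = [(8000, 0), (8000, 1), (8000, 0), (8000, 0), (8000, 1)]
ppm = outgoing_ppm(lots)
print(f"{ppm:.0f} ppm -> electrical limit met: {ppm <= 50}")
```

A proper AOQL determination per the standards above additionally accounts for sampling statistics and confidence, which this point estimate ignores.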
Parameters applied for AOQL determination
All critical electrical AC and DC parameters given in the procurement specification measured at
room temperature and high and low temperatures should at least be considered for AOQL
determination. In addition all mechanical parameters which are considered to be critical
parameters should be considered for mechanical AOQL determination.
For the estimation of whether or not a parameter shall be considered to be critical, either the
specific conditions of application of the component in a space programme, or the parameters
normally tested for equivalent or similar traditional hi-rel space components shall be used as a
baseline.
Test Methods and Procedures
The test conditions for samples submitted to electrical testing for AOQL determination shall be
applied according to the conditions stated in the data sheet and/or the detail specification.
Electrical testing of components is often carried out by measurement of substitute parameters,
especially for final production electrical measurements, in order to have an effective and quick
method of measurement. For AOQL determination, such substitute measurements have to be
avoided: one intention of AOQL determination is to assess the proportion of delivered
components not satisfying the specified performance requirements, and poor (high) AOQL
values might simply reflect a poor correlation between the limits and measurement methods for
substitute parameters and the specified performance of the component.
Relaxed Mechanical Requirements
When performing the testing necessary to establish AOQL, some manufacturers use a
significantly smaller sample for mechanical testing than for electrical testing. This can
mean that the figure for mechanical AOQL does not reflect the true AOQL, but is driven by the
statistical limitations of the small sample size tested. In addition, some
manufacturers include “cosmetic” defects, which have no effect on performance, as mechanical
failures when determining AOQL. If either of these two cases applies, the requirement for
mechanical AOQL can be relaxed to ≤ 1000 ppm.
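The statistical limitation mentioned above can be quantified with a standard binomial argument: if zero failures are observed in a sample of size n, the defect rate that can be excluded at a given confidence level is roughly -ln(1 - confidence)/n. This calculation is not part of the procedure itself, only a sketch of why small mechanical samples cannot demonstrate low PPM values.

```python
import math

def demonstrable_ppm(sample_size: int, confidence: float = 0.90) -> float:
    """Upper confidence bound (in PPM) on the defect rate when zero
    failures are observed in sample_size parts (binomial model)."""
    return -math.log(1.0 - confidence) / sample_size * 1_000_000

# A mechanical sample of 100 parts with zero failures only demonstrates
# a defect rate below ~23,000 ppm at 90 % confidence ...
print(round(demonstrable_ppm(100)))      # 23026
# ... whereas an electrical sample of 100,000 parts resolves ~23 ppm.
print(round(demonstrable_ppm(100_000)))  # 23
```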
4.2.10 Quality System
A. Description of Element
To provide confidence in the quality of the tasks performed and the product supplied, the
manufacturer must have an established and maintained system for quality control and
conformance which is described in a configuration controlled in-house document. The system
must include mandatory requirements covering all design, manufacturing, assembly and testing
of the components undergoing assessment, and the areas where these are performed. The
requirements must also include placing equivalent requirements on all subcontractors which are
used for any part of the processing.
B. Assessment

Assessment Criteria:
• Quality system is compliant with ISO-9001 requirements (or equivalent requirements), or it contains only minor non-compliances which would not be expected to result in an inferior product and which could easily be rectified → Acceptable for general use
• There is no defined quality system, or it does not meet the above requirements, or sufficient information for an assessment is not available → Not acceptable
Further analysis is not considered appropriate for this element.
Note that if two or more companies are jointly responsible for the production of a component,
e.g. if it is designed by one company but manufactured by another, then each company must
independently be able to meet the relevant criteria for its own activities.
C. Supporting Information
ISO-9001
The ESA/SCC quality system requirements are very closely based on those of ISO-9001.
Therefore any quality system which complies with ISO-9001 requirements must also be largely
compliant with ESA/SCC requirements and therefore acceptable for producing components for
space use.
Many quality systems which are not specifically described as being compliant with ISO-9001
might nevertheless be compliant with its requirements. This will most commonly occur if the
quality system is based on requirements which have been copied almost unchanged from
ISO-9001 (as is the case for ESA/SCC requirements).
Equivalent Requirements
Equivalent requirements can be based on ISO-9001, and therefore be effectively identical, or
can be created independently of ISO-9001, but nevertheless define alternative quality system
requirements capable of achieving the same results.
Examples of requirements which are closely based on ISO-9001 are:
• AQAP-110, NATO Quality Assurance Requirements for Design, Development and Production (which refers to the ISO document for most requirements)
• QS 9000, Automotive Quality Management System (which is described as an enhanced version of ISO-9001 which reflects the needs of the automotive industry)
4.2.11 Traceability
A. Description of Element
Traceability is the capability of a Manufacturer to trace backwards from completed, delivered
components and identify all the basic materials and processes used to produce them. It also
includes the capability to then trace forwards to identify all other components made using the
same materials or processes.
Traceability is “traditionally” required for space components so that:
• if any defects are discovered in components then the use of other components from the same manufacturing lot or flow can be restricted;
• the results of any testing on components from a specific manufacturing lot or flow can also be used for other components from the same lot or flow;
• testing for radiation hardness, which can show an extremely high lot dependence for some component types, can be effectively performed when required (see Paragraph 4.2.4 for a fuller discussion of the importance of lot definition and traceability with respect to radiation hardness assurance).
B. Assessment

Assessment Criteria:
• A level of traceability can be demonstrated equal to that required for ESA/SCC or MIL space level qualified components, OR the traceability is assessed according to the scheme described in section C of this paragraph and all requirements are met → Acceptable for general use
• All traceability requirements defined for a specific project are met, even if the ESA/SCC, MIL or section C requirements are not met → Acceptable for specific project use
• None of the above conditions is satisfied, or sufficient information to make an assessment is not available → Not acceptable
Further analysis is not considered appropriate for this element.
C. Supporting Information
Information needed for Assessment
The level of traceability which a manufacturer maintains for its components can be assessed on
the basis of:
• A manufacturer’s statement, made in response to a request for information, which describes the traceability which he maintains.
• A manufacturer’s quality assurance manual, manufacturing manual, or documented procedure(s) which describe and control the actions taken during component manufacture to maintain traceability.
• A sample of any documentation (traveller) which is produced for, and which accompanies, a production lot and which demonstrates the level of traceability.
It is possible that some manufacturer’s in-house documentation would not be supplied outside
the company, but it would be sufficient if this was made available for inspection at the
manufacturer’s premises.
Assessment of Information

Any information which is received must be evaluated against the following requirements. For
each requirement, record whether it is met (“Yes”, which includes cases where it is not
applicable) or not met (“No”, which includes cases where information is not available):

• Is the expression “lot” well defined in the manufacturer’s documents?
• Is the definition of “lot” such that different designs, die constructions, production lines or facilities are excluded within one individual “lot” of a specific component type?
• Is the definition of “lot” such that the components will be made out of the same materials charge and manufactured according to the same processes?
• If manufacturing or testing lots or dates can be determined, are the associated test personnel, equipment and data recorded?
• Are data and documents retained for a minimum of 5 years?

For each assessed manufacturer/component all five requirements must be met if the traceability
is to be assessed as acceptable for general use.
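The five-requirement check above can be expressed as a simple sketch. The field names are illustrative assumptions introduced here, not identifiers defined by the procedure.

```python
# Sketch of the traceability assessment: all five requirements must be
# met ("Yes", including "not applicable") for the traceability to be
# acceptable for general use; "No" includes "information not available".

TRACEABILITY_REQUIREMENTS = [
    "lot_well_defined",
    "lot_excludes_mixed_designs_lines_facilities",
    "lot_same_materials_charge_and_processes",
    "test_personnel_equipment_data_recorded",
    "records_retained_5_years",
]

def traceability_acceptable(answers: dict[str, bool]) -> bool:
    """answers maps each requirement to True (met / not applicable)
    or False (not met / information not available)."""
    return all(answers.get(req, False) for req in TRACEABILITY_REQUIREMENTS)

answers = {req: True for req in TRACEABILITY_REQUIREMENTS}
print(traceability_acceptable(answers))  # True
answers["records_retained_5_years"] = False
print(traceability_acceptable(answers))  # False
```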
4.2.12 Specification
A. Description of Element
A specification is any form of configuration controlled document, or documents, which defines
the components which have been, or are to be, procured or tested. It can be used as a
contractual definition of which requirements have to be met by the parts which will be delivered
by the manufacturer. The specification must define the components in enough detail for a
potential procurer to be certain that they are what he wants for his application, and to be certain
that any procured components will be essentially unchanged from those which were used to
generate the assessment data.
B. Assessment

Assessment Criteria:
• Documentation is configuration controlled as described in the Supporting Information and satisfactorily covers the following as an absolute minimum → Acceptable for general use:
  - Component Marking
  - Mechanical Requirements: package type and pin allocations; maximum overall package dimensions
  - Electrical Requirements: limits for externally applied conditions; all critical DC parameters; all critical AC parameters
  - Thermal Requirements: maximum external temperature at which the component can be stored without risk of damage; maximum external temperature at which the component can be operated without risk of damage; maximum permitted internal power dissipation for the component above which there is a risk of damage
  - Quality and reliability aspects: information about the quality and reliability level of the component should be given (this need not necessarily be a specific FIT value; it could be some other information, such as a reference to an established reliability or a reference to the standard which is applied for quality and reliability control, etc.)
• Available documentation does not meet all the above criteria → Not acceptable
Further analysis is not considered appropriate for this element.
C. Supporting Information
Suitable forms of specification documentation
A suitable specification is needed to closely define the component type which is undergoing
evaluation for space use. Without such a document it is not possible for the organisation which
initially assesses the component type for space use to have any certainty that future
components with the same part type number will actually have the same performance. This
would mean that any assessment results could only be applied to the already existing
components used to produce the data on which the assessment was based. A suitable
specification is also needed so that potential users of the component can determine whether or
not it will meet the performance requirements for their intended application.
For commercial components any available “specification” will normally take the form of a
manufacturer’s data sheet, which could either be an individual data sheet or part of a data book.
A data sheet might be specific for one component type, or cover a range of components from
which the type specific data can be extracted.
If the commercial components are already being supplied to a major user with specific
requirements, for example an automotive user, then the manufacturer might have accepted a
procurement specification prepared by that user. Any such specification would normally contain
additional information to that in the manufacturer’s data sheet and could potentially also be used
for space components. Its actual use would depend on whether it was released for use by third
parties and whether the manufacturer could confirm that components supplied to third parties
would be identical to those supplied to the user who prepared the procurement specification.
If any deficiencies are identified during initial assessment then additional information, data or
documentation could be requested from the manufacturer to rectify these. Any further material
which a manufacturer supplied in response to such a request could, if it was confirmed to be
generally applicable to the component type, be considered to form part of the component
specification.
Configuration control of specification documentation

For the configuration control of whatever documentation forms the specification to be
acceptable, it must meet the following requirements:
• All documents which effectively form the procurement specification must have a reference number, issue number, date of issue, and/or any other identification needed to uniquely identify them.
• If any of the documents are described as “Draft”, “Advance Information”, “Preliminary” or anything similar, written confirmation must be available from the manufacturer that the contents are nevertheless accepted as a basis for procurement and will not be altered without customer notification.
• Every document must have been approved by an authorised signatory, either on the document itself or as part of an internal approval system.
• The organisation preparing any document must have an internal system for approving proposed changes before they are incorporated into the document.
• For documents which could have more than one issue the preparer must keep a record showing which issue is current and what changes have been made from previous issues.
4.2.13 Delivery Time

A. Description of Element

Delivery time is the expected interval between placing an order for components and receipt of
the components by the procurer. Commercial components could generally be expected to be in
continuous or frequent production with ex-stock delivery.

B. Assessment

Assessment Criteria:
• Production is continuous and/or parts are always available ex-stock → Acceptable for general use
• Delivery time is compatible with the project schedule requirements for the specific use → Acceptable for specific project use
• Delivery time is not compatible with the project schedule requirements for the specific use, or sufficient information for an assessment is not available → Not acceptable
Further analysis is not considered appropriate for this element.
C. Supporting Information

None
5 BACKGROUND INFORMATION

5.1 Use of Commercial Components in Space
Electronic components designed and manufactured for use in terrestrial non-military and
non-space applications (commercial components) are being increasingly used in space. This
trend is a result of the rapidly shrinking supplier market for military and space components, the
much improved reliability and the much lower cost of commercial components. In addition, many
modern “state of the art” components are only available as commercial parts, so that equipment
manufacturers wishing to make use of the increased functionality and lower power consumption
of new components, particularly integrated circuits, are forced to use commercial parts.
In mini and micro-satellites, the low budget of these programmes often prohibits the use of
conventional space qualified parts. Many programmes in Europe and the United States have
made use of commercial and/or plastic packaged parts. Those programmes include amateur
and low budget projects but also include projects like LANDSAT 7, SPARTAN, TDRSS, EOS,
EO-1 and the Hubble service missions. They have also been used in commercial space
ventures such as IRIDIUM, but have had only limited use in “mainstream” satellites such as
those used for telecommunications. The previous use of commercial parts in space has
demonstrated their usability; however, there have been some problems and failures. An
effective assessment procedure is needed in order to determine the viability of a commercial
part type either for general space applications or for a specific mission.
In contrast to space qualified parts, many commercial parts have been designed for a more
limited temperature range, are offered in SMD plastic packaging, have no lot identification and
must be procured via distributors. In addition, specification and data-sheet values are often not
well defined, the parts have short product life cycles and the radiation tolerance of the parts is in
many cases unknown. These negative aspects of commercial parts are to a great extent
counterbalanced by the excellent quality and reliability now achieved due to the high volume
production of commercial parts. Indeed, the quality and reliability of many commercial parts is
better than that obtained for space components because space components are in most cases
manually packaged, bonded and tested and cannot profit from the advantages of automated
production processes.
An impressive array of measures to improve the quality and reliability of integrated circuits has
been successfully implemented at the manufacturers. Those measures generally lead to
improved designs, improved processing techniques or improved measurement and inspection
procedures in order to reduce the number of defective components in the outgoing production.
Examples of these measures are the introduction of wafer level reliability testing on production
wafers, automated optical inspection of chips and the removal of abnormal chips, wafers or lots
identified using ingenious techniques for analysing the results of the electrical measurements.
Much improved control of the overall manufacturing process has been achieved by the
introduction of statistical process control and in-process sampling techniques. Also the users of
commercial components have contributed to their greatly improved quality and reliability. For
example, by feeding the results of application specific testing and field failure analysis back to
the manufacturers, the sources of these failures could be eliminated. Generally, the enormous
growth in the commercial semiconductor industry and the vast numbers of people and amounts
of money involved in the manufacture and use of these parts have led to continuous
improvement of their quality and reliability, so that many components now exhibit excellent
quality and reliability.
The viability of commercial part types for space applications has to be assessed in a logical
manner before a decision regarding their use can be made. The procedure described in this
document is a scheme for the assessment of commercial parts based on documented
information available on the parts and is intended as a tool for users and procurement agents. It
can be extended by applying a further analysis and/or specific testing to cover any areas where
insufficient data is available.
5.2 Equivalence of Commercial and Space Qualified Parts
Due to the major advances in the quality and reliability of commercial components, many
commercial parts can be considered to be equivalent to or better than space qualified parts. The
objective of the assessment procedure described herein is to identify commercial components
that are at least equivalent to space qualified components. The approach used to assure the
quality and reliability of commercial and space components is completely different, so that
equivalence here means an equivalent level of confidence in the component.
Although the quality and reliability of components successfully assessed using this procedure
should be equivalent to those of traditional space qualified parts, it is possible that they will have
reduced capabilities in such areas as their overall temperature range. Despite this it is
anticipated that components successfully assessed using this procedure will be suitable for
about 80% of the applications covered by the traditional space qualified components.
The confidence in the quality and reliability of space products is achieved by evaluation,
qualification, control of the manufacturing and subsequent screening and lot testing. For
commercial components, those parts with a basically high level of quality and reliability must be
selected and their ability to satisfy the additional requirements of space applications assessed
or tested. Commercial components selected in this way can be used with an equivalent level of
confidence to space qualified components.
In general, the supplier market for space components is shrinking because many suppliers
regard the space components market as a small niche market that detracts from their main
business. As a result, many of the remaining space qualified products are manufactured using
out of date technologies with associated quality problems. Some assembly and test houses
have specialised in the assembly and screening of small quantities of parts for space
applications using old stocks of wafers. It is obvious that these parts cannot attain the level of
quality and reliability achieved by high volume commercial components.
Appendix 1 - Form Sheets

Assessment Summary Sheet 1/3
Assessment Summary Sheet 2/3
Assessment Summary Sheet 3/3
Assessment Summary Sheet 1 of 3

ASSESSMENT FOR GENERAL USE IN SPACE APPLICATIONS

Part Type:                 Part No.:
Manufacturer:
Package / Case Size:       Variant / Revision:
Reference Date:

Key  Key Element        Result       Risk control               Result
Ele. Description        (indirect    measures                   (direct      Limitations
No.                     assessment)                             assessment)
 1   Thermal            √                                       √
 2   Mechanical         ?                                       √
 3   Construction       ?            Apply DPA                  √
 4   Radiation          ?            Apply RVT                  √
 5   Process Control    √                                       √
 6   Rel. Ass. System   √                                       √
 7   Reliability Data                Additional 168 hr burn-in  √
 8   Final El. Test     √                                       √
 9   AOQL               √                                       √
10   Quality System     √                                       √
11   Traceability       √                                       √
12   Specification      √                                       √
13   Delivery Time      √                                       √
Overall result:

Legend: √ = Acceptable for general use; NO = Not Acceptable; ? = Insufficient Info/Data; N/A = not applicable

Final Disposition:
( ) The ref. part type is acceptable for general space use.
    (This assessment result is only valid for procurement with the risk control measures indicated.)
( ) NOT ACCEPTABLE FOR SPACE USE !!!

ORGANISATION:        Name:        Date:
Assessment Summary Sheet 2 of 3

ASSESSMENT FOR SPECIFIC PROJECT USE    PROJECT: _________________

Part Type:                 Part No.:
Manufacturer:
Package / Case Size:       Variant / Revision:
Reference Date:

Key  Key Element        Result       Result       Risk control                    Result after
Ele. Description        (indirect    (direct      measures                        risk          Limitations
No.                     assessment)  assessment)                                  analysis
 1   Thermal            √                                                         √
 2   Mechanical         ?            ?            No suitable measures available  NO
 3   Construction       ?            √                                            √
 4   Radiation          ?            √                                            √
 5   Process Control    √                                                         √
 6   Rel. Ass. System   √                                                         √
 7   Reliability Data   √                                                         √
 8   Final El. Test     √                                                         √
 9   AOQL               √                                                         √
10   Quality System     √                                                         √
11   Traceability       √                                                         √
12   Specification      √                                                         √
13   Delivery Time      √                                                         √
Overall result:

Legend: √ = Acceptable for general use; NO = Not Acceptable; ? = Insufficient Info/Data; N/A = not applicable

Final Disposition:
( ) The ref. part type is acceptable for specific project use. Project: _________________
    (This assessment result is only valid for procurement with the risk control measures indicated.)
( ) NOT ACCEPTABLE FOR USE IN THE PROJECT !!!

ORGANISATION:        Name:        Date:
Assessment Summary Sheet 3 of 3

Detailed description of risk control measures

No.  Risk control measures       Detailed description
 1
 2
 3
 4
 5

Notes: