Project: IEEE 802.20 Working Group on Mobile Broadband Wireless Access
<http://grouper.ieee.org/groups/802/20/>

Title: Procedure for Evaluation of the Calibration Outcomes

Date Submitted: 2005-05-16

Source(s): David Huo
67 Whippany Road
Whippany, NJ 07981
Voice: +1 973 428 7787
Fax: +1 973 386 4555
Email: dhuo@lucent.com

Re: Evaluation Criteria

Abstract: This document evaluates available statistical methods for hypothesis testing and proposes a method for assessing convergence of the simulator calibration process.

Purpose: Discuss and adopt

Notice: This document has been prepared to assist the IEEE 802.20 Working Group to accomplish the simulator calibration process.

Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE Standards publication; to copyright in the IEEE's name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE's sole discretion to permit others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.20.

Patent Policy: The contributor is familiar with IEEE patent policy, as outlined in Section 6.3 of the IEEE-SA Standards Board Operations Manual <http://standards.ieee.org/guides/opman/sect6.html#6.3> and in Understanding Patent Issues During IEEE Standards Development <http://standards.ieee.org/board/pat/guide.html>.
1. Introduction
A calibration procedure was proposed in [1] so that the simulators used to evaluate proposals are equivalent in terms of conformance with [2]. A question was raised at the last meeting in San Antonio, however, on how to judge when the participating simulators can be considered calibrated. The following are some suggestions.
2. Deterministic Simulation
In the deterministic part of the calibration, all numerical parameters are fixed. Therefore, there is a single unique result, and any deviation between the outcomes of different simulators is necessarily man-made.
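As a rough sketch of such a deterministic check (not part of [1]), two independently coded implementations of the same fixed path gain computation can be compared directly; any mismatch beyond floating-point rounding indicates an implementation error. The log-distance formula, distances, and tolerance below are illustrative placeholders, not values taken from the evaluation criteria.

```python
# Minimal sketch of a deterministic calibration check: with all parameters fixed,
# two independent implementations must produce identical results, so any
# discrepancy points to a man-made (implementation) error.
# The path-loss formula and constants are illustrative placeholders only.

import math

def path_gain_db_impl_a(distance_m: float) -> float:
    """Implementation A: log-distance path loss, returned as a (negative) gain in dB."""
    return -(128.1 + 37.6 * math.log10(distance_m / 1000.0))

def path_gain_db_impl_b(distance_m: float) -> float:
    """Implementation B: the same formula coded independently."""
    return -(128.1 + 37.6 * math.log10(distance_m * 1e-3))

# Fixed MS-to-sector distances (metres); no random component at all.
fixed_distances = [50.0, 200.0, 750.0, 1500.0]

for d in fixed_distances:
    a, b = path_gain_db_impl_a(d), path_gain_db_impl_b(d)
    # A tolerance of ~1e-9 dB allows only for floating-point rounding;
    # anything larger indicates a coding or specification error.
    assert abs(a - b) < 1e-9, f"Deviation at d={d} m: {a} vs {b} dB"
```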
3. Stochastic Simulation
Some parameters used in the simulation drive a pseudo-random number generator, which generates random events. The random number generators in use produce either uniformly distributed discrete digits or Gaussian distributed real/complex numbers. When a difference occurs, one can gradually reduce the randomness, falling back to the deterministic scenario, to find the cause. In any case, human interaction is required to resolve the problem.
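A minimal sketch of this fall-back, assuming a Python simulator with a seeded generator, is given below; the -110 dB path gain, 8 dB shadowing standard deviation, and function name are placeholders invented for the illustration.

```python
# Minimal sketch of "reducing the randomness": random components are switched
# off one by one (or run from a shared seed) so that the simulation gradually
# falls back to the deterministic scenario and the source of a discrepancy
# between simulators can be isolated. All names and numbers are illustrative.

import numpy as np

def snapshot_gain_db(seed, use_shadowing=True, use_fast_fading=True):
    rng = np.random.default_rng(seed)     # seeded source of uniform/Gaussian variates
    gain_db = -110.0                       # fixed, deterministic path gain (placeholder)
    if use_shadowing:
        gain_db += rng.normal(0.0, 8.0)    # log-normal shadowing: Gaussian in dB
    if use_fast_fading:
        # Rayleigh fading from a complex Gaussian tap
        h = (rng.normal() + 1j * rng.normal()) / np.sqrt(2.0)
        gain_db += 20.0 * np.log10(abs(h))
    return gain_db

# Fall-back sequence: full randomness -> shadowing only -> deterministic.
print(snapshot_gain_db(seed=1))                            # both random effects on
print(snapshot_gain_db(seed=1, use_fast_fading=False))     # shadowing only
print(snapshot_gain_db(seed=1, use_shadowing=False,
                       use_fast_fading=False))             # deterministic: -110.0
```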
4. Evaluation Procedure
The calibration of simulators is an iterative procedure, and resolution of differences can be facilitated by
1. checking the implementation code, as that is where most problems come from, e.g. typing errors that cause a different operation to be performed than intended;
2. checking the implementation of the given assumptions and parameters, as that is where misunderstanding can lead to an incorrect implementation, e.g. a different formula is used due to misunderstanding or insufficient specification;
3. checking the given assumptions and parameters themselves, as the assumptions may not be appropriate for a computer simulation to produce the expected results, and the parameters may not be compatible with the physical limits of the available computer systems, e.g. the assumptions do not lead to practically usable results;
4. changing or modifying the description of the task, as the task may turn out to be nonproductive or not worthwhile due to technical or political concerns, e.g. a task may be defined over-ambitiously and cannot be accomplished in the given time.
Specifically, the calibration proposed in [1] is based on the following assumptions: slow fading is semi-static, i.e. the mobile is not moving and each sector has a fixed amount of path gain in dB accounting for the shadowing, and the path loss model is fixed. Then, as an example, the evaluation may be performed as shown in the following:
[Figure 1 flow chart, summarized: the simulation assumptions feed three calibration steps, each iterated until its outcomes are accepted; "not accepted after more than 3 rounds" loops back to the simulation assumptions.
Step A: fixed MS locations; compare geometry per MS and uplink power per MS.
Step B: fixed MS locations with slow fading; compare the CDF of geometry and the CDF of uplink MS power.
Step C: random MS locations with slow and fast fading; compare the CDF of C/I (uplink and downlink).]
Figure 1: Calibration of Simulators via Simulation of Simple Scenarios
Steps A, B, and C do not have to be performed in the order given in the flow chart.
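For step B, for example, the accepted/not-accepted decision could be supported by simply overlaying the empirical geometry CDFs from two simulators and reporting their largest gap, leaving the judgment to the participants rather than to a hard statistical criterion. The sketch below assumes Python with NumPy; the sample data, dB range, and sample sizes are placeholders standing in for each simulator's per-MS geometry export.

```python
# Minimal sketch of comparing calibration outputs from two simulators:
# empirical CDFs of geometry (serving power over interference plus noise, in dB)
# are evaluated on a common grid and their maximum gap reported. No hard
# pass/fail threshold is applied; the number is meant to support discussion
# among the participants. The sample arrays are placeholders.

import numpy as np

def empirical_cdf(samples, grid):
    """Fraction of samples at or below each grid point."""
    s = np.sort(samples)
    return np.searchsorted(s, grid, side="right") / s.size

# Placeholder per-MS geometry values from two simulators (dB).
sim1_geometry_db = np.random.default_rng(1).normal(5.0, 6.0, 10_000)
sim2_geometry_db = np.random.default_rng(2).normal(5.2, 6.0, 10_000)

grid = np.linspace(-10.0, 25.0, 141)          # dB range of interest (illustrative)
cdf1 = empirical_cdf(sim1_geometry_db, grid)
cdf2 = empirical_cdf(sim2_geometry_db, grid)

gap = np.abs(cdf1 - cdf2)
print(f"Largest CDF gap: {gap.max():.3f} at {grid[gap.argmax()]:.1f} dB")
```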
5. Conclusion
Since the entire calibration aims at eliminating coding errors, ambiguity in the specification, and vague understanding of the task, it differs from statistical hypothesis testing in that there is feedback from the result to the statistical process. Therefore, a hard criterion based on statistical test theory is not an appropriate tool here. In comparison, the cooperation and common sense of the participants are the more useful ingredients in the recipe, and discussion among the participants should be seen as the major vehicle towards success. We recommend that the group adopt this approach to the calibration process.
6. References
[1] C802.20-04-83r6
[2] IEEE 802.20: Evaluation Criteria