39912115 - Telecommunications Industry Association

Telecommunications Industry Association
Clearwater, FL, 01 December 1999
Technical Committee TR-30 Meetings
Satchell Evaluations
Stephen Satchell
Phone: (775) 832-7157
(775) 831-2011
E-mail: satch@concentric.net
Why Move PN 3509 to Timed Transfer
Members of TR-30.3 and meeting attendees
Copyright Statement
The contributor grants a free, irrevocable license to the Telecommunications Industry Association
(TIA) to incorporate text contained in this contribution and any modifications thereof in the creation
of a TIA Standards publication; to copyright in the TIA's name any TIA Standards publication even
though it may include portions of this contribution; and at the TIA's sole discretion to permit others
to reproduce in whole or in part the resulting TIA Standards publication.
Intellectual Property Statement
The individual preparing this contribution does not know of patents, the use of which may be
essential to a Standard resulting in whole or in part from this contribution.
Prior Publication: This paper has been published on the Web since October 20, 1999. Holders
of this paper are hereby given license to reproduce this paper for use within their
company only. Anyone wishing to republish or distribute copies of this paper
outside their own company should contact the author for licensing arrangements.
The author examines the characteristics of testing as specified in the
Telecommunications Industry Association (TIA) Telecommunications System
Bulletin (TSB) 38, and based on that examination makes specification
recommendations for TIA Project Number (PN) 3509.
In the conclusion, the author contends that the standard throughput test should
take place over an interval of 150 seconds, and that an optional 315-second transfer
interval be explicitly specified in the Standard.
Further, the author's analysis serves as a rebuttal to the idea of retaining the
TSB 38 testing technique for V.32 bis and V.34 modems, showing that the tests
could be considered marginal, particularly for V.34.
This white paper describes the reasoning behind moving to time-based transfers in throughput
testing. This discussion pays particular attention to issues regarding V.90 (PCM) modem testing.
Finally, this paper tries to arrive at a rational time interval to use for testing, based on specific
goals and requirements.
The intent is to provide the basis of a rationale for the change in the procedure for measuring
modem data throughput between TIA TSB 38 and PN 3509.
Throughput and the Error Model for Modem Testing
The usual method for determining the "goodness" or robustness of a signal converter in a modem
is to pass blocks of random (or pseudo-random) bits through a modem connection over a
telephone channel with well-known characteristics, and measure the number of blocks that were
received correctly versus the number of blocks transmitted. The result is called the error ratio
measurement; a number of these measurements can be made to estimate the error rate of the
data converters over the particular connection. (See TIA TSB 38 and PN 3509 for definitions of
error rate and error ratio. There is a difference.)
Some errors are normal. The typical design goal is to have at most one block error in
one thousand blocks. In order to accurately measure the error ratio for a signal converter, and
thus accurately characterize the error rate for the modem, there needs to be enough data
transferred during each test such that at least ten (10) error events occur during each test. This
means that to verify that the modem meets the design specification of one error in one thousand,
the tester runs ten times that number of blocks, or ten thousand (10,000) blocks of data, so that if
the number of blocks lost or corrupted is less than eleven (11) then the modem meets the criteria.
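The test-sizing arithmetic above can be sketched as follows. The function name and structure are illustrative; the ten-error-event rule of thumb and the one-in-one-thousand design goal come from the text.

```python
def bler_test_plan(target_error_ratio, min_error_events=10):
    """Blocks to send, and maximum errors allowed, for a BLER test.

    To observe at least `min_error_events` error events at the target
    ratio, we must send min_error_events / target_error_ratio blocks;
    the modem passes if it shows no more than that many errored blocks.
    """
    blocks = round(min_error_events / target_error_ratio)
    return blocks, min_error_events

# Design goal from the text: at most one block error in one thousand.
blocks, max_errors = bler_test_plan(1 / 1000)
print(blocks, max_errors)   # 10000 blocks; fewer than 11 errors passes
```

Note the use of `round()` rather than `int()`: 1/1000 is not exactly representable in binary floating point, and truncation could yield 9999 instead of 10000.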
In order to perform a block-error ratio test the modem needs to have the capability of transferring
data in a bit-synchronous manner. Fewer and fewer modems include this capability as a standard
part of the modem product, because the market need for synchronous modems continues to
drop. Also, critics have made the case that users transfer characters, not bits or blocks of bits, and
that those users typically take advantage of error-control schemes such as ITU
Recommendation V.42 for doing so. This suggests that a throughput
measurement can provide an accurate, indirect measurement of block-error ratio.
The key to using throughput for accurate error ratio measurement is ensuring that you have
enough opportunities for errors in the datastream. Unlike the block-error ratio test, where a
fixed-size block of 1000 bits is used as the "test cookie," the V.42 packetizing scheme uses
varying-size blocks of data. To make matters worse, typical V.42 error-control implementations will
change the block size to maximize throughput in the face of what V.42 perceives the current error
ratio to be.
The block size can range from 512 data bits (before headers, flags, and zero-bit insertion) to 8192
data bits.
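To make the range concrete, here is a rough sketch of how many symbols (error opportunities) a single V.42 block of each size occupies on a V.32 bis link carrying six bits per symbol. The six-bits-per-symbol figure comes from the text; counting data bits only, before headers, flags, and zero-bit insertion, is a simplifying assumption.

```python
import math

BITS_PER_SYMBOL = 6   # V.32 bis at 14,400 bits/s, per the text

def symbols_per_block(data_bits, bits_per_symbol=BITS_PER_SYMBOL):
    """Symbols occupied by one V.42 block, counting data bits only."""
    return math.ceil(data_bits / bits_per_symbol)

# V.42 block-size range quoted above: 512 to 8192 data bits.
for data_bits in (512, 1024, 8192):
    print(data_bits, symbols_per_block(data_bits))
```

The figures quoted later in this paper (1,365 and 173 symbols) differ slightly from this simplified count because they reflect rounding and framing overhead.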
One final note. There is a tradition in the computing industry that the real test of a system is to
offer it a million opportunities to fail and measure the number of successes. In modems, this
tradition was upheld by requiring any test of robustness to pass one million bits of data. This was
back when one baud (symbol per second) held one bit.
TSB 38 Throughput Measurement
In TSB 38, the method developed to test modem throughput was to take files of fixed length and
content, determine the compressibility of those files, and determine the number of times the files
needed to be transmitted in particular conditions. In TSB 38 the basic file size was fixed at 32
kilobytes; there was no technical reason at the time to require larger files, and the size
accommodated BERT equipment available at the time so that testing could be performed without
waiting for equipment updates.
For throughput tests where V.42 bis compression is turned on, a software implementation of V.42
bis was utilized to determine the optimum source file size in order to obtain one million bits of
compressed data (before packetizing and zero-bit insertion) through the modem's signal
converter. The compression characteristics were determined for a 2048-element dictionary with
strings of up to 32 characters. It was felt that, for modems conforming to ITU Recommendation
V.32 bis, this would result in a test long enough to adequately exercise the modem.
V.32 bis
Using files of this length, the tests over V.32 bis ran for at least sixty-nine (69) seconds, and resulted in
the processing of at least 165,600 symbols of data, each symbol carrying six bits. At
slower speeds, the number of symbols blossoms to over a quarter million, which is of the
correct magnitude to adequately test the signal converter without extending the tests overmuch.
There was also time for problems with equalizer convergence to emerge.
The problem was that errors happen to symbols, not to bits, and so even in V.32 bis there weren't
a million opportunities for error; there were about one-sixth that many opportunities, worst case. The
other effect is that the impact of a single symbol error on the throughput measurement is magnified
by the larger blocks used by the V.42 protocol; the retransmission size is larger. (The effect is
mitigated by the fact that this happens in real life.) In the worst case, any one of 1,365 symbols
can cause a maximum-sized block to be discarded. Smaller V.42 blocks, at 173 symbols each,
correlate better with the 166 symbols that cover the 1000-bit block used in BLER testing.
V.34
The fastest rate, 33,600 bits/s, uses 3,429 symbols per second with almost 10 bits encoded per
symbol. This means that the minimum transfer time for a test is around 29.8 seconds, transferring
only 102,184 symbols during a test. At the slower rate of 28,800 bits/s, test time extends to 34.7
seconds, and a maximum of just under 119,000 symbols during the test.
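The timing figures above can be checked with a little arithmetic: a million-bit transfer takes TEST_BITS / bit_rate seconds, during which symbol_rate × seconds symbols cross the channel. The bit rates and symbol rates below are the ones quoted in the text; small differences from the quoted symbol counts are rounding.

```python
TEST_BITS = 1_000_000   # the traditional million-bit transfer

def transfer_stats(bit_rate, symbol_rate):
    """Seconds needed, and symbols carried, for a million-bit transfer."""
    seconds = TEST_BITS / bit_rate
    return seconds, seconds * symbol_rate

# (bit rate, symbol rate) pairs quoted in the text
for bit_rate, symbol_rate in ((33_600, 3_429), (28_800, 3_429), (14_400, 2_400)):
    secs, syms = transfer_stats(bit_rate, symbol_rate)
    print(f"{bit_rate} bits/s: {secs:.1f} s, {syms:,.0f} symbols")
```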
With this modulation scheme, the number of opportunities for errors drops close to an order of
magnitude under the target value of one million. Interestingly, the transfer time comes close to the
time required for the two modems to complete the call establishment sequence, about 20-30
seconds. In short, half of the call time is spent setting up the call, and the other half is spent
sending a million bits of data.
The data-transfer time is roughly half of the time taken by V.32 bis (as you would expect), so any
effects of equalizer drifting or hunting may be masked by the short transfer time.
V.90
With V.90, we add an interesting wrinkle: we have two different modulation methods, two different
symbol rates, and two different data rates. In the downstream direction, the PCM "modulation
method" transfers 8000 symbols per second, encoding just under 7 bits per symbol for a top rate
of 53,333 bits/s. In the upstream direction, the V.34 modulation transfers 3200 symbols per
second for a top practical rate of 28,800 bits/s.
Assuming a one-way transfer downstream, the transfer of the test file would require at least
18-3/4 seconds. In that time, only 150,000 samples would occur in the downstream channel during
the test. This is far too short a time to measure reliably and repeatably with anything other than
millisecond-granularity clocks in the testing DTE. Moreover, any stability problems with
equalization, clock recovery (a major issue with PCM modems), or drift are not tested by so short
a transfer.
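The downstream figures work out as follows; the 53,333 bits/s and 8000 samples/s values are the ones quoted above, and the calculation is a check on the text, not part of any procedure.

```python
# V.90 downstream: PCM at 53,333 bits/s, 8000 symbols (samples) per second.
PCM_BIT_RATE = 53_333
PCM_SYMBOL_RATE = 8_000

seconds = 1_000_000 / PCM_BIT_RATE      # about 18.75 s, i.e. 18-3/4 seconds
samples = seconds * PCM_SYMBOL_RATE     # about 150,000 samples
print(f"{seconds:.2f} s, {samples:,.0f} samples")
```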
In a one-way upstream test, the test would extend out to a minimum of 34.7 seconds, with
111,000 symbols transferred during the test. This is to be expected, because the upstream
direction uses V.34 modulation techniques, and so all the comments regarding V.34 testing apply.
When we start talking about two-way tests (transmitting data in both directions at the same time),
we are no longer measuring what we think we are measuring. The original intent of the two-way
test was to measure how well the modem handles CPU resource allocation within the controller
portion of the modem. With the asymmetric character of the modem, though, some of the resource
sharing is masked by the early completion of data transfer in the downstream channel, leaving the
entire CPU free to catch up in the upstream channel if that is a problem.
The PN 3509 Proposed Method
The method of testing throughput changes when you move to the PN 3509 method of performing
throughput tests. In that document, the testing is done based on time instead of on volume; that
is, you transfer data for a set amount of time and determine how many characters you sent in that
time. This is exactly the opposite of the method used in TSB 38, where you transmit a fixed
number of characters and see how long it takes.
Both methods yield the same measurement, characters per second. The difference is in the
transmit-stop conditions.
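The two stop conditions can be sketched as follows. `recv` stands in for a hypothetical blocking read of received characters from the test DTE; both functions are illustrative, not drawn from either document.

```python
import time

def throughput_fixed_volume(recv, total_chars):
    """TSB 38 style: transfer a fixed number of characters, measure elapsed time."""
    start = time.perf_counter()
    received = 0
    while received < total_chars:
        received += len(recv())
    return received / (time.perf_counter() - start)   # characters per second

def throughput_fixed_time(recv, interval_s):
    """PN 3509 style: transfer for a fixed interval, count the characters."""
    start = time.perf_counter()
    received = 0
    while time.perf_counter() - start < interval_s:
        received += len(recv())
    return received / interval_s                      # characters per second
```

Both report characters per second; only the condition that stops the transfer differs.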
Picking a transfer interval
Setting the length of the interval for the data transfer is an interesting exercise. This paper
suggests that the appropriate transfer time is one that balances tradition -- allow for one million
opportunities for error -- with total testing time. The current practice of offering around 160,000
opportunities to fail is, by that tradition, too small. Further, the time selected should be long
enough that equalizer settling, jitter, and wander are properly exposed by the testing. The
current minimum of 19 seconds is entirely too short for this purpose.
If we were to strictly follow tradition, and assume that virtually all modem products under test
would run at symbol rates of 3200 or higher, then the time that allows for one million symbols is
5-1/4 minutes. This would cause the TIA 3700 network model coverage test, or the PN 3857
integrated network model coverage test, to require just under 19 hours to complete. (This allows
45 seconds for call establishment, handshake, and call teardown, and 45 seconds to set up the
test equipment for each test channel.) The PN 3857 universal network model coverage test would
add another 13.5 hours, for a total test time of 32.5 hours for network model coverage testing.
32.5 hours is just too long for most purposes, although this paper suggests that the 5-1/4-minute
(315-second) test time be provided as a guideline for "thorough testing." From a statistical
standpoint, cutting each test's time doesn't reduce the confidence of the testing overmuch, but it
does reduce the amount of time it takes to perform the testing. To keep it simple, we would
specify the test time as 150 seconds. This cuts the TIA 3700 testing time to 11.2 hours, and full
PN 3857 testing time to just over 19 hours. The results from such a test represent roughly a
threefold improvement over existing TSB 38 testing for V.32 bis and V.34, and, as a bonus,
provide a full million screwup opportunities for V.90 downstream tests.
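The hour figures above can be reproduced with the arithmetic below. The per-test overhead (45 s of call handling plus 45 s of equipment setup) is stated in the text; the channel counts (168 for the TIA 3700 model, a further 120 for the PN 3857 universal model) are inferred from the quoted hours and should be treated as assumptions.

```python
OVERHEAD_S = 90   # 45 s call setup/teardown + 45 s test-equipment setup

def campaign_hours(channels, transfer_s, overhead_s=OVERHEAD_S):
    """Total wall-clock hours to run one test per network-model channel."""
    return channels * (transfer_s + overhead_s) / 3600

TIA_3700_CHANNELS = 168        # inferred from the quoted hours, not stated
PN_3857_EXTRA_CHANNELS = 120   # inferred from the quoted hours, not stated

for transfer_s in (315, 150):
    tia = campaign_hours(TIA_3700_CHANNELS, transfer_s)
    full = campaign_hours(TIA_3700_CHANNELS + PN_3857_EXTRA_CHANNELS, transfer_s)
    print(f"{transfer_s} s tests: TIA 3700 {tia:.1f} h, full PN 3857 {full:.1f} h")
```

With these channel counts, 315-second tests give 18.9 and 32.4 hours, and 150-second tests give 11.2 and 19.2 hours, matching the paper's figures to rounding.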
Conclusions
PN 3509 testing for V.32, V.32 bis, and V.34 should use the time-based testing technique
so that the resulting tests better measure the effects of equalizer settling and drift, as well as
increasing the number of symbols transferred during a test.
The benefit of keeping the old data-volume-based throughput testing technique for V.32
bis and V.34 modems, specifically the ability to compare results with previously performed tests, is
outweighed by the improved test coverage of the time-based technique. The argument pales
when you realize that much V.90 testing has been published using TSB 38 procedures.
Grandfathering one without grandfathering all just doesn't make sense.
Additionally, the standard should specify 150 seconds for "standard" testing, and specify the
optional value of 315 seconds for performing a million-opportunity test for V.32, V.32 bis, and V.34.
We would also need to caution users that any comparison between TSB 38 data and PN 3509
data would have to be done carefully, if at all. It would be better if testing of V.34 and V.90 products
were redone under PN 3509.