Introductory Notes to Lectures on Distributed Processing
in Telecommunications
Jan A Audestad
1 Brief History of Telecommunications
1.1 The early inventions
Electromagnetic theory and the first electromagnetic devices were developed during the first
half of the nineteenth century. Theories and experiments of Alessandro Volta of Italy, Hans
Christian Ørsted of Denmark, Michael Faraday of Britain and Joseph Henry of the United
States laid the foundation for the theories that later (1837) were used by Sir William Cooke
and Sir Charles Wheatstone for designing the first workable telegraph equipment. Samuel
Morse was granted a patent for the telegraph using dots and dashes for representing letters and
numbers. In one respect Morse was ahead of his time: the Morse code reduces the average length of the encoded message by allocating short codes to frequent letters and long codes to the less frequent letters. The Morse code is therefore a good example of the practical use of information theory.
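As a small illustration (the letter frequencies and the restriction to five letters are assumptions made for this example, not figures from the text), the following sketch compares the average number of dot/dash symbols per letter for a Morse-like variable-length code with a fixed-length code:

```python
# Illustration with assumed, approximate English letter frequencies restricted to
# five letters; it only shows why short codes for frequent letters shorten the
# average encoded message.
freq = {"e": 0.127, "t": 0.091, "a": 0.082, "q": 0.001, "z": 0.001}

# Number of dot/dash symbols per letter in Morse code: e=".", t="-", a=".-",
# q="--.-", z="--..".
morse_len = {"e": 1, "t": 1, "a": 2, "q": 4, "z": 4}

total = sum(freq.values())
p = {c: f / total for c, f in freq.items()}          # normalise over this small alphabet

avg_variable = sum(p[c] * morse_len[c] for c in p)   # Morse-like variable-length code
avg_fixed = 4                                        # a fixed four-symbol code for every letter

print(f"average symbols per letter, variable-length code: {avg_variable:.2f}")
print(f"average symbols per letter, fixed-length code:    {avg_fixed}")
```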
The people who developed the electromagnetic theory were physicists and
technologists. Morse was not. He was professor of painting and sculpture at the University of
the City of New York.
Some inventors of telecommunications
We just saw that Morse was professor of painting and sculpture. He had no training as a
technologist. Still he is regarded as the father of the largest technological field ever:
telecommunications. In 1843 he was granted money to build a system for demonstration of his
invention between Washington D.C. and Baltimore. The first telegraph message, sent at the inauguration on May 24, 1844, was: “What hath God wrought!” Now, 155 years later in 1999, the telegraph has been taken out of professional use: it is no longer mandatory at sea in order to meet the requirements of the SOLAS (Safety of Life at Sea) convention.
Another example is Strowger, the inventor of the automatic telephone exchange, who was the owner
of a funeral agency. The anecdote is that he invented the automatic switch in order to get rid
of the manual operator. She was the wife of his competitor and ensured that her husband got
more funerals than Strowger. I do not know whether the story is true. Anyhow, it is a good one because now we may expect actors from other businesses to enter the
telecommunications field. The reason is that telecommunications is more and more being used
as a production factor, and owning the telecommunications business may make the primary
product cheaper. Strowger made his invention in 1889. This invention revolutionised
telephony, but it took almost one hundred years to automate the industrialised countries completely. There are still countries with mainly manual systems.
Frequency hopping is a transmission method where the transmitter sends individual bits on
different frequencies. Each transmitter is assigned a fixed pattern of frequencies and the
various patterns are different for different transmitters. This method has the advantage that it
is difficult to jam the transmission because the intruder must first synchronise to the frequency pattern. The method was invented during World War II by a Hollywood actress and
her friend, a composer of film music who developed the hopping pattern by using notepaper.
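A minimal sketch of the idea (the channel set, the seed and the use of a software pseudo-random generator are assumptions for illustration; this is not the mechanism of the original patent): transmitter and receiver derive the same hopping pattern from a shared secret, so the receiver can follow the hops while a jammer without the secret cannot.

```python
import random

FREQUENCIES = [88, 89, 90, 91, 92, 93, 94, 95]        # assumed channel set, in MHz

def hopping_pattern(seed: int, length: int) -> list:
    """Deterministic pseudo-random pattern: the same seed gives the same hops."""
    rng = random.Random(seed)
    return [rng.choice(FREQUENCIES) for _ in range(length)]

bits = [1, 0, 1, 1, 0]
tx_hops = hopping_pattern(seed=42, length=len(bits))
rx_hops = hopping_pattern(seed=42, length=len(bits))  # receiver shares the seed

for bit, f_tx, f_rx in zip(bits, tx_hops, rx_hops):
    assert f_tx == f_rx                               # receiver stays synchronised
    print(f"bit {bit} sent on {f_tx} MHz and received on {f_rx} MHz")
```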
Facsimile transmission is almost as old as the telegraph. The first patent, for a machine that was never built, was granted to Alexander Bain, a Scottish inventor, in 1843. He suggested a
system where a stylus mounted on a pendulum scanned the page to be transferred. The stylus
was conducting and the text had to be written on a metallic surface in order to produce areas of
high and low electrical conductivity. An improved principle was demonstrated by the English
physicist Frederick Bakewell in 1851. The first commercial system was opened between Paris and
Lyon in 1863. The system was based on an invention by the Italian inventor Giovanni Caselli.
However, facsimile did not become a success until about 1980. The reason was the lack of standards. Many systems were in operation prior to this time but were mainly used for special purposes
(Telephoto) or within small segments of the industry and government. The equipment was
expensive, big and slow.
The Group 3 fax standard developed by the ITU in 1980 resulted in simple and
lightweight equipment. This caused an enormous increase in demand, which was remarkable because fax is an example of a service where network economics really is at play: one machine at one site has no value; a few machines at a few places have limited value; many machines at many places have much value (the sketch after this paragraph illustrates the point). The success came from the fact that
the standard combined several elements at the same time. It catered for the need already
established in small segments by offering improved services. The equipment was cheap so
that the fax became a common element in office automation. Users could send pictures and
text messages to partners that had never communicated in this way before. This happened at a
time when the industry was changing: increasing internationalisation, new forms of cooperation between manufacturers, sales channels and customers, and geographic distribution
of the companies. This development required fast and reliable transfer of documents.
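The network-economics point can be made quantitative with a simple count of possible connections (a Metcalfe-style illustration of the argument, not a figure from the text): the number of pairs of machines that can exchange documents grows roughly as the square of the number of machines.

```python
def possible_pairs(n: int) -> int:
    """Number of distinct pairs of fax machines that could exchange documents."""
    return n * (n - 1) // 2

for n in (1, 2, 10, 100, 10_000):
    print(f"{n:>6} machines -> {possible_pairs(n):>12} possible connections")
```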
The lifetime of the fax service may only be about 15 years. Fax is probably the fastest
growing service we have ever had; it may also be the service entering obsolescence faster than
any other service. The reason is that fax can be replaced by e-mail and PC technology. The
fax combines scanning and printing in one machine. The PC can be equipped with equivalent
equipment and replace the fax as a separate entity. This development has already changed the
traffic patterns of telecommunications from telephony to the Internet [2]. In consequence, the
sources of income, costs and investments in the telecommunications industry are also
changing. However, not all operators and industry have given up developing the fax machine
further because there is still a considerable demand for the service in developing countries
where the World Wide Web is still not available at scale. It is still an important service for
small businesses in developed countries because they have already invested in the equipment.
The evolution suggested for the fax service is taking the same route as the evolution of the
World Wide Web, so it is indeed questionable whether the new fax service represents anything different from the web.

[2] This was particularly noticeable in Australia, where international telephone traffic dropped considerably while Internet traffic showed an exceptional increase. The reason that so much international traffic in and out of Australia is text-based is of course that Australia is several time zones away from its business partners.
The telephone is a younger invention than the fax. Bell’s patent is from 1876. The
telephone was commercialised the year after. However, the practical instrument with a rotary
dial was invented in 1896. This was an improvement required for operating the automatic
exchange, which had been invented in 1889 by Strowger.
The youngest of the basic services is the telex service. The first invention of
importance was made by Murray in 1903. He constructed a system that could transfer written
text using the five-bit alphabet of Jean-Maurice-Émile Baudot [3]. Automation of the telex
service came after World War II based on a service inaugurated in Germany in the early
1930s. A similar service was introduced in 1932 in the United States called TWX
(Teletypewriter Exchange Service). Telex and TWX were based on different standards and
Western Union – the operator of the TWX service in the United States – established an
interworking centre where messages could be translated automatically between the two
standards. This was as late as 1970. At this time, telex became a worldwide service. This
service has had the same fate as the fax service. First it was replaced by the fax service.
Thereafter, what was left of the service was replaced by e-mail and the web service. The
teletex service was developed during the 1980s in order to improve the telex service. The
teletex service was never realised because the World Wide Web made it obsolete while it was still on the drawing board.

[3] The unit baud for the transfer rate of bits is named after Baudot. One baud corresponds to one signalling symbol per second, which for the binary telex signal equals one bit per second. The data transfer rate of telex is usually given as 50 baud.
Of all the old services, only telephony has survived the 1990s. It was during this
decennium that the telegraph service and the telex service were replaced by data transmission.
It is likely that the facsimile service will also disappear.
The first computers using the new semiconductor technology were put into operation
about 1960. At about the same time it was observed that these new machines would require
communication. This communication would be different from other telecommunications
where the source and destination of communication were human minds. The human mind can
accept imprecise and inconsistent semantics and still interpret them correctly.
Machines cannot. Here the semantics of communication must be precise and consistent (at
least within the context it is used; note that it is hard to prove the consistency of any language
– including computer languages). One important and often forgotten reason that it became
urgent to implement data transmission services was that time-sharing had already been developed in 1961. Time-sharing implies that several tasks can be performed by the computer at
the same time by slicing each task into segments and processing the segments of the different
tasks in cyclic order. The computer is then utilised better and the processing becomes more
efficient. Time-sharing allowed several users to access the computer at the same time.
Such systems were only efficient if multiple users were connected to the computer over a
switched network.
The first practical data network was introduced by the U.S. Department of Defence in
1969: the ARPANET (Advanced Research Projects Agency Network). The network connected computers located at universities and military
installations. At the same time ITU developed the concept of public data networks and came
up with two techniques: the circuit switched method and the packet switched method.
ARPANET was based on packet switching only. The packet switching method defined by
ITU is called X.25.
Circuit switching was based on establishing a unique circuit all the way between two
computers for as long as they were communicating, in the same way as in telephony. The idea
was to transfer information symbols in much the same way as in telex but at higher bit rate
allowing large amounts of data to be transferred at the same time. The proponents of this
technique believed that the main requirement for data transmission was to transfer data in
bulk.
In packet switching, the symbols were embedded in packets and transferred when required. This means that several users may send packets over the same circuit and thus utilise the resources much better for interactive communication between computers. The
proponents of this technique assumed that data transmission would contain a significant
component of interactive communication consisting of short messages. In the ARPANET,
information is sent in datagrams containing complete addressing of the receiver. Datagrams
are thus independent entities. One data message may consist of several datagrams and it is the
responsibility of the transmitter and the receiver to ensure that the message is assembled in the
correct way from the individual datagrams. In X.25, a “virtual” connection is established
when information is to be exchanged. The X.25 protocol consists of an establishment phase, a
data transfer phase, and a release phase. During the establishment phase, a reference number
is assigned to the virtual call. The reference number is then used during the data transfer phase
and the release phase as address information in order to indicate which packets belong to the
same data-call.
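The following sketch contrasts the two addressing ideas (the field names and the example are assumptions for illustration, not the actual ARPANET datagram or X.25 packet formats): each datagram carries the complete destination address and a sequence number so that the receiver can reassemble the message, whereas a virtual-call packet carries only the short reference number assigned during the establishment phase.

```python
from dataclasses import dataclass

@dataclass
class Datagram:
    destination: str       # complete address carried in every datagram
    sequence_no: int       # lets the receiver reassemble the message in order
    payload: str

@dataclass
class VirtualCallPacket:
    call_reference: int    # assigned once during the establishment phase
    payload: str

def reassemble(datagrams):
    """Receiver-side reassembly: datagrams may arrive in any order."""
    ordered = sorted(datagrams, key=lambda d: d.sequence_no)
    return "".join(d.payload for d in ordered)

# A message split into three datagrams arriving out of order is still reassembled correctly.
message = [Datagram("host-B", 2, "wrought!"),
           Datagram("host-B", 0, "What hath "),
           Datagram("host-B", 1, "God ")]
print(reassemble(message))                            # -> What hath God wrought!

# A virtual-call packet needs only the reference number, not the full address.
print(VirtualCallPacket(call_reference=17, payload="hello"))
```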
The X.25 protocol got its name from the reference number assigned to the
recommendation of ITU in which the specification is contained. The major X.25 networks
were implemented during the first half of the 1980s. At the same time, Internet emerged from
the ARPANET since the military lost much of the interest in the network. The Internet used
the datagram concept of ARPANET but with a new protocol stack known as TCP/IP 5. This
protocol has now taken over much of the data transmission business, in particular after the
World Wide Web was commercialised in the autumn of 1993.
The first systems for mobile communication were manual. They were developed
during the early 1960s. However, automatic mobile communication was implemented about
1980: AMPS (Advanced Mobile Phone System) in the USA and NMT (Nordic Mobile Telephone) in the Nordic countries represented the original concepts of
automatic cellular radiotelephone systems. Both systems used analogue techniques. The first
digital system was GSM, brought into operation in 1991. Mobile satellite communication
came at the same time as the early automatic land mobile systems. Marisat for communication
to ships was deployed by Comsat General in 1978. Inmarsat took over this system in 1981
and has since then developed the technology further. Iridium of 1998 offers satellite
communication to hand-held mobile stations. However, this system was inaugurated at a time when most of the globe was well covered by much cheaper land mobile systems. Iridium found itself without a market and went bankrupt after only one year of operation.
The transistor, the microprocessor and the laser: three technologies that changed
telecommunications
Data transmission and satellite communication became possible because of a single invention:
the transistor. The transistor was invented in 1947 by the physicists Bardeen, Brattain and
Shockley at Bell Telephone Laboratories. They shared the Nobel Prize in 1956 for this
invention. By the end of the 1950s, the transistor had become a practicable device replacing
the vacuum tube. As compared to the vacuum tube, the transistor was small, reliable and used
little power. The full replacement of the vacuum tube by transistors took place during the
1960s. Products and systems that came as a result of the transistor were the small and
portable transistor radio, the PCM technology that digitised the network, the active satellite
transponder, and the reliable computer. Modern telecommunication could not have existed
without the transistor. The world economy would have been a shadow of what it is now without
this tiny component. The transistor is probably the most important invention of all time.
The microprocessor was developed during the 1970s. This device is, of course, one of the spin-off technologies of the transistor. Because of the very-large-scale integration (VLSI)
technique, it was then possible to manufacture the central processing unit (CPU) of a
computer on a single silicon chip. Memory devices could be manufactured in a similar way.
The PC is one of the main products made possible by the microprocessor. The first mobile systems such as NMT and Marisat could not have been designed at
reasonable size and with reasonable power consumption without these devices. Systems like
GSM have taken advantage of the evolution towards faster CPUs and larger memory chips.
See also Moore’s law below.
Laser is an acronym for light amplification by stimulated emission of radiation. The existence
of stimulated emission or radiation was predicted by Albert Einstein from pure
thermodynamic arguments in 1917. The argument is that when light is scattered by an atom,
the atom may emit two photons. One of the emitted photons has the same frequency and
phase as the incident photon. The other photon has random frequency and phase. The first
component is called stimulated emission. The second component is called spontaneous
emission. The component of stimulated emission is so rare that one of the most prestigious textbooks in theoretical physics ever written (L. D. Landau and E. M. Lifshitz, Electrodynamics of Continuous Media, Volume 8 of Course of Theoretical Physics, 1960 English edition, page 377) states that stimulated emission processes
“... are, under normal conditions, very rare, ..., and are of little importance as regards the
phenomenon of scattering.” This was written in the late 1950s. In 1957, Townes and his
student Schawlow proposed the construction of the laser and claimed that it could become a
practicable device if only suitable materials could be found. Maiman constructed the first ruby
laser in 1960. It has been much criticised that only Townes was awarded the Nobel Prize
(1964) but not also his co-inventors. It may be worth noting that Townes had already built a
similar device in 1953 using microwave frequencies. This device was called maser as
acronym for microwave amplification by stimulated emission of radiation. From this
viewpoint, Townes may have deserved to receive the prize alone. Many suitable materials
have later been found, and lasers are now available over a large range of emitted energies and frequencies.
The concept of laser lay dormant for 40 years before a practical way to implement the
principle was found. Once the laser had been invented, the evolution of the device and of the technologies in which it is used was rapid.
Some lasers can produce enormous energies and the materials through which the light is
transmitted become nonlinear. This has given rise to the study and applications of nonlinear
optics in physics and technology. Applications are accurate focusing of light over long
distances, optical radar, super-cooling of clusters of atoms and construction of molecules from
single atoms by using the laser to cool them to a temperature close to absolute zero. Less
energetic lasers have become the core elements of optical communication. Lasers are also
found in all sizes from huge machines used for ranging of remote objects and cutting of steel
to soldering of micro-miniature devices on integrated circuit chips for communication and
computing.
The evolution of telecommunications is summarised in Figure 1.1. The time axis
shows approximately when major events took place. The ordinate provides some idea of how
the complexity of telecommunication systems has developed. At the turn of the century the
basic technologies where automatic switching, facsimile and telegraph. The next step in
increasing complexity came in the 1930s by the development of the telex systems.
In the 1960s, there was a huge leap in complexity. At this time the satellite
communications systems were implemented, the first computer controlled switches were
made (No 1 ESS in USA in 1965), and data transmission was developed. The enabler of this
leap was the transistor.
The next big leap is related to mobile communication. The first systems were put into
service around 1980. They were enabled by the microprocessor. During the 1990s, we have
seen an enormous increase in complexity because of the World Wide Web and its offspring or derivative technologies such as:
- the programming language Java;
- the network computer (NC);
- the platform CORBA (Common Object Request Broker Architecture) for distributed processing;
- the agent technologies.
[Figure 1.1 is a chart of system complexity against time (1900, 1930, 1960, 1980, 1990, 1995), with layers for transmission, processing and applications. It places the telegraph, facsimile and automatic switching around 1900; telex in the 1930s; satellites, data transmission and computer controlled switching, enabled by the transistor, in the 1960s; land mobile, Inmarsat, IN and GSM, enabled by the microprocessor, around 1980-1990; and the web, Java, the NC, CORBA and agents in the 1990s. The balance shifts from about 95% transmission / 5% processing to about 5% transmission / 95% processing.]
Figure 1.1 Evolution of telecommunications
None of these technologies was developed by the telecommunications industry; they came from the data industry. The telecommunications industry developed the intelligent network (IN)
platforms, which are platforms on which telecommunications services can be implemented
quickly. IN will probably be replaced by CORBA-type platforms or other platforms for
distributed computing more suitable for the web technology.
Figure 1.1 also illustrates that the focus when designing platforms has changed from
transmission to processing. Based on my own experience, about 95 % of the specification of
the Inmarsat system was concerned with transmission problems and only 5% with processing
(including signalling). In contrast, 95% of the GSM specification is concerned with
processing while 5% is about transmission. The focus has again changed from processing to
content.
Finally, Figure 1.2 summarises the evolution. The telegraph service and the telex
service disappeared by the end of the 1990s. The fax service is likely to meet the same destiny, being replaced by technologies derived from the World Wide Web. Only two services will then have survived into the new millennium: telephony and data transmission (in the form of the
World Wide Web). It is likely that these services will merge into a single service within a few
years. Then we may achieve what we have dreamed of before: a single service on a single
network. This was the idea behind the ISDN. That idea never succeeded.
[Figure 1.2 is a timeline (1900-1950-2000) of the services: the telegraph and telex end before 2000, fax is fading, and telephony, the Internet and the web continue and possibly merge.]
Figure 1.2 Evolution of services
2 The Last Decennium
2.1 Moore's law
[Figure 2.1 is a plot of the number of bits per chip (logarithmic scale, 10^3 to 10^12) against the year (1970-2020), with reference marks for one page of text, one book, an encyclopaedia and 0.01% of all written knowledge.]
Figure 2.1 Moore’s law
During the last decade telecommunications has changed in a fundamental way. This change is
related to the increase in computational power that took place after the microprocessor was
introduced during the 1970s. This has led to the development of more complex systems
allowing higher information transfer rates, larger processing platforms and realisation of new
services and applications. Processors have become larger with regard to memory capacity and
the processing speed has increased. This evolution is called Moore’s law and is shown in
Figure 2.1.
One way of expressing Moore’s law is in terms of the development that has taken
place with regard to memory capacity. A similar development has taken place for processing
capacity. In applications like real time motion picture processing both high processing speed
and large memory capacity are required. The standard developed by the Moving Picture Experts Group (MPEG) requires almost the full capacity of the fastest PCs available today. These PCs
are required for full multimedia performance.
The number of bits that can be stored on one chip is used as ordinate in Figure 2.1.
Note that the scale on the ordinate is logarithmic. The year is shown as abscissa. This scale is
linear. The curve (or line) represents actual observations until about 1995. The actual
evolution of memory capacity follows the straight line with amazing accuracy. Note that, because of the scales used, a straight line means that the evolution is exponential.
The mathematical form of Moore’s law
The law can be written as follows:
N = 1000 e^{0.55(t − 1970)} = 1000 exp[0.55(t − 1970)],
where N is the number of bits and t − 1970 is the time in years since 1970; e = 2.718... is the base of the natural logarithms. Often, the function e^x is written exp(x), in particular when the exponent is a complicated expression.
The time, t = t2  t1, in years until the storage capacity has been doubled is easily found from
this equation. At t1 the storage capacity is N and at t2 the storage capacity is twice as much.
This gives
N = 1000 exp[0.55(t1  1970)] and 2N = 1000 exp[0.55(t2  1970)].
Dividing the two expressions with each other and cancelling equal terms gives:
2 = exp [0.55(t2  t1)] = exp (0.55t).
Solving this equation for t gives:
t = (ln 2)/0.55 years = 1 year and 3 months.
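A minimal sketch of the calculation (the constants are those of the formula above; the years chosen for the printout are only illustrative):

```python
import math

def bits_per_chip(year: float) -> float:
    """Moore's law in the form used above: N(t) = 1000 * exp(0.55 * (t - 1970))."""
    return 1000.0 * math.exp(0.55 * (year - 1970))

# Doubling time: 2 = exp(0.55 * dt)  =>  dt = ln(2) / 0.55.
dt = math.log(2) / 0.55
print(f"doubling time: {dt:.2f} years (about 1 year and {12 * (dt - 1):.0f} months)")

for year in (1970, 1980, 1990, 2000, 2010):
    print(f"{year}: about {bits_per_chip(year):.1e} bits per chip")
```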
The figure shows that the number of bits that can be stored on one dynamic random
access memory chip increases exponentially with time and at a rate where the capacity is
doubled every one year and three months. Dynamic random access memory (or DRAM)
means that information in the memory may be dynamically inserted, changed, deleted or read.
The memory is made of semiconductors and is the “working” memory of the computer. In
computers, this is the memory where all data required actively for processing is stored. All
active windows, files and programs on your PC are stored in this way together with all
temporary data resulting from the processing. In a computer with 32 Mbyte of such memory, 8 × 32 million = 256 million bits can be stored. (One byte consists of 8 bits.) Since one transistor is
required for storing one bit, the DRAM memory of this particular computer consists of 256
million transistors. Memories of this size became commercially available in 1997. The
memory size is now more than one billion bits per chip.
Memories such as hard disks are also dynamical in the same sense as DRAMs. They
may store several orders of magnitude more information than the DRAM but they are slower
and are only used for storing information or programs when they are not actively used. If the
hard disk had been used in the same way as the DRAM, the processing speed of the PC would
have been unacceptably slow. In even bigger memories such as CD-ROM (ROM = read only
memory) information can only be read. The memory content is inserted mechanically by
“burning” the bit pattern onto the disk and cannot be altered afterwards. Some types of read
only memories are also manufactured on semiconductor chips. Such memories can also
contain a large number of bits. They are fast but can only contain information or programs that
are not frequently changed such as software in telephone exchanges and other big installations
where part of the software is installed once and for all. Programs stored on certain types of
ROM may be erased and changed. Such ROMs are sometimes referred to as E-PROM
(erasable programmable read only memory) or EE-PROM (electrically erasable
programmable read only memory).
Let us look more closely at the figure and observe some consequences of the
evolution. In 1978, one page of written text could be stored on one DRAM chip. Ten years
later, a complete book could be stored in the same physical size. Now it is possible to store on
one chip the complete Encyclopaedia Britannica or all information contained in the human
DNA. In 2010, it may be possible to store 0.01 percent of all written information on one chip.
The collective knowledge of humanity stored in the stacks of the libraries and in
bookshelves at home may then be contained on only 10,000 semiconductor chips, each being
as small as a grain of sand. Then it is fair to say that humanity has just a teaspoon of
knowledge!
In contrast, such an enormous memory can only contain one high-resolution video
film. This indicates how much more capacity-demanding moving pictures are compared with
other computer applications.
Moore formulated his law in 1975. What is amazing is how accurately the law has
predicted the evolution. Deviations from the law during the last 25 years have been
insignificant. The current development in the small-scale electronics called nanotechnology
indicates that the law may still be accurate for the next decade. On the other hand, other
developments based on the tiniest of the tiny – quantum mechanics – indicate that the slope
of the curve may become even steeper if it turns out that the quantum computer can be
realised. If the quantum computer cannot be realised, the size of the components on the chip
may become so small that individual components cannot be made. A transistor on the chip
must contain a few hundred molecules in order to prevent quantum effects from predominating.
There are several other consequences of the evolution of the semiconductor
technology. The price of storing a bit is going down in the same way as the size of the
memory is going up. This curve is also exponential (but with negative exponent). In 1970, the
price of storing one bit on a semiconductor was 1 cent. In current memories, the cost per
stored bit is only about 0.01 millicent, or 100,000 times less. This decrease is dramatic but
much smaller than could be expected from the increasing capacity of the chip. The reason is
that it is now about 100 times more expensive to produce the chip than in 1970. This is because the size of each transistor is so much smaller, demanding more
mechanical precision and chemically purer semiconductor materials. The increase in cost
follows a law similar to Moore’s law. However, the value of the sales of semiconductor
products is following the same development. The three curves representing memory density,
production cost and semiconductor sales in US $ are shown in Figure 2.2.
[Figure 2.2 plots three curves against the year (1970-2020) on logarithmic scales: the number of bits per chip, the price per bit (in cents, falling), and the cost of fabrication and the semiconductor sales (in million US $, rising).]
Figure 2.2 The “extended Moore’s law”
Figures 2.3, 2.4 and 2.5 show a similar development of other parameters. Figure 2.3
shows the number of atoms per bit. We have already reached a number of atoms per bit where
pure collective or classical theories for the materials cease to hold. The curve predicts that we will be deep into this region within ten years and that, by 2020, only one atom per bit will be required.
However, when the number of atoms in a system such as a memory cell becomes very small,
the behaviour alters because of quantum effects. Therefore, there may be a lower limit for the
size of memory cells somewhere that cannot be surpassed. On the other hand, nanotechnology
and quantum computation may press this limit downward because of new ways of designing
the computer.
[Figure 2.3 plots the number of atoms per bit (logarithmic scale, 10^4 to 10^20) against the year, 1970-2020.]
Figure 2.3 Decrease in number of atoms per bit during the last 40 years
[Figure 2.4 plots the clock frequency (logarithmic scale, 1 MHz to 10 GHz) against the year, 1970-2020.]
Figure 2.4 Evolution of clock frequency
The performance of a computer is determined by the clock frequency. The number of
logical operations the computer can make in a certain time is directly proportional to the clock
frequency. The clock rate passed 200 MHz in 1996. The clock rate has doubled in less than 5
years for the last 30 years. This trend predicts a clock frequency of 4 GHz in 2010 and 16
GHz in 2020. The evolution and the forecast are shown in Figure 2.4.
Finally, Figure 2.5 shows the energy dissipated per logical operation in pico-Joule
(pico = 10^-12). The physical constraint imposed by noise is indicated as the thermal noise energy produced in a material at room temperature. This energy level is 4.5 × 10^-9 pico-Joule.
The working signal must be more energetic than this in order to distinguish it from the noise.
This practical limit is reached about 2020 if the evolution continues to hold.
[Figure 2.5 plots the energy dissipated per logical operation (logarithmic scale, 10^-8 to 10^8 pico-Joules) against the year, 1970-2020, with the thermal noise level of 4.5 × 10^-9 pico-Joule marked.]
Figure 2.5 Energy dissipation per logic operation.
Observe that all the parameters (transistors per chip, chip price, clock frequency, number of atoms per bit and energy dissipation per logic operation) have closely followed exponential laws [8]. The forecast is that they will continue to do so for a long time.
One consequence of exponential behaviour is that the value of the parameter is doubled (or is
halved) at a constant rate. In order to make a prediction, all we need to know is the doubling
(or halving) time of the process.
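A minimal sketch of such a prediction (the figures are those quoted earlier in this chapter: a 1.26-year doubling time and about 1000 bits per chip in 1970 for memory, and a price per bit of 1 cent in 1970 falling to about 10^-5 cent today, read here as 1999; the projected years are only illustrations):

```python
import math

def doubling_time(v1: float, year1: float, v2: float, year2: float) -> float:
    """Doubling time (a negative result means a halving time) of the exponential
    trend passing through the observations (year1, v1) and (year2, v2)."""
    return (year2 - year1) * math.log(2) / math.log(v2 / v1)

def project(v_now: float, year_now: float, t_double: float, year: float) -> float:
    """Project the trend to another year; the doubling time is all that is needed."""
    return v_now * 2.0 ** ((year - year_now) / t_double)

# Memory capacity: 1.26-year doubling time, about 1000 bits per chip in 1970.
print(f"bits per chip in 2010: about {project(1e3, 1970, 1.26, 2010):.1e}")

# Price per stored bit: 1 cent in 1970 and about 1e-5 cent in 1999.
t_price = doubling_time(1.0, 1970, 1e-5, 1999)        # negative: a halving time
print(f"price-per-bit halving time: about {abs(t_price):.1f} years")
print(f"price per bit in 2010: about {project(1e-5, 1999, t_price, 2010):.1e} cent")
```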
2.2 Is it possible to take Moore’s law into account in planning?
The answer to this question is yes. GSM is a good example where the system was planned in
this way. When the system was first conceived in 1982, it was decided to include capabilities
that had never before been exploited: authentication using cryptographic algorithms, digital
transmission over a channel with adverse propagation conditions, and use of speech encoding
techniques that had not been invented yet. During the development of the system the different
techniques matured because Moore’s law made it possible to implement the required software
processes: the microprocessors simply became powerful enough to perform the task.
[8] In all the figures, the ordinate is logarithmic while the time axis is linear. This means that the evolution follows the law parameter = a · exp(b · Δt), where a and b are constants with different values for different parameters. If the parameter increases with time, b is positive; if the parameter decreases with time, b is negative.
ISDN is an example of a development where the advances yet to come in computer
science were not exploited in the same way. Therefore, the methods used in this system seem
old and conservative, while the GSM system is still regarded as modern ten years after it
was implemented. ISDN and GSM were industrialised at the same time.
Let us look at a few examples from GSM and determine how we were “saved” by
Moore’s law. One of the first decisions made for the GSM system was that it should contain
cryptographic authentication. This means that it should be impossible, or at least extremely
difficult, to steal and use identities belonging to other subscribers. This had become a problem
for NMT in the Netherlands because of organised crime stealing identities in order to avoid
identifying the user. Cryptographic methods require in general much processing such as
raising large numbers (consisting of thirty or more digits as in GSM) to powers which are
even larger (consisting of one hundred digits or more). It was also decided early (1986) that
all subscription information should be contained on a separate module called the SIM
(subscriber identity module). The SIM should also contain the authentication algorithm in
order to make the subscription independent of the mobile station itself. One convenient way
of designing the SIM was to use a smart card containing a single chip computer. In 1986, it
was not possible to make such a device that was even close to performing the calculations
required. In 1987, it was reported that such a device had been constructed, only that the
calculation took more than one minute. In comparison, our requirement was that the
calculation should be done in less than 50 ms, that is, 1000 times faster. Faster and larger
microprocessors and better computation algorithms reduced this time to about one second
one and a half years later. The processor was still 20 times too slow. When the system was put into operation in 1991-92, the processing time was down to an acceptable 100 ms.
This is a good example of how Moore’s law works in practice. However, another
reason for the development was that there was a demand for the technology. This led both to the development of improved computational algorithms and to new design concepts for the
microprocessors that had to be used in the smart card.
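To give a feel for the kind of arithmetic involved (an illustration only: this is not the GSM authentication algorithm, and the digit counts are simply taken from the description above), the sketch below raises a number of roughly thirty digits to a power of roughly one hundred digits, reduced modulo another large number as cryptographic algorithms do:

```python
import random
import time

base = random.getrandbits(100)         # about thirty decimal digits
exponent = random.getrandbits(340)     # about one hundred decimal digits
modulus = random.getrandbits(340) | 1  # keep the modulus odd and non-zero

start = time.perf_counter()
result = pow(base, exponent, modulus)  # modular exponentiation (square-and-multiply)
elapsed_ms = 1000 * (time.perf_counter() - start)

print(f"result has {len(str(result))} digits, computed in {elapsed_ms:.3f} ms")
```

On a PC of today this takes well under a millisecond; the point of the story above is that the single-chip smart-card processors of the late 1980s needed orders of magnitude longer for computations of this kind.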
Another example was that in order to send information across the radio channel, new
methods for modulating the information onto the carrier wave and for recovering it at the
receiver had to be developed. A method, which is not much different from the one
implemented in all the tiny hand portable GSM stations, was tested on the Cray
supercomputer at the Norwegian University of Science and Technology in 1987. The
computer was just able to modulate, demodulate and compute the channel characteristics in
real time. The method that eventually had to be used in the commercial system seemed
simply too complex for microprocessors. However, in 1991-92 similar algorithms were
implemented on the microprocessors in the mobile station. This was another result of Moore’s
law in practice.
2.3 Why does the evolution go so slowly?
This question may seem odd now that the whole telecommunications industry is moving
forward at an enormous pace. The problem is that we are facing two types of evolution at the
same time:

- One evolution is fast and corresponds to how we develop new services and applications on platforms like intelligent networks, the Internet and the World Wide Web.
- The other evolution is slow and is related to the development of the platforms themselves.
Let us now try to understand these mechanisms and the way in which they interact.
Platforms in telecommunications cover a large range of technologies:

- switching systems consisting of exchanges (for telephony, ISDN, telex and data transmission) and transport systems for transferring bits between them;
- intelligent network (IN) nodes performing service processing that is otherwise not economic to implement in individual exchanges (for example freephone service, premium rate service and centralised queue service);
- Internet platforms consisting of servers, routers, bridges and Ethernet cables, and search engines, web systems and network browsers;
- mobile system platforms supporting technologies like GSM, DECT (Digital European Cordless Telephone), UMTS (Universal Mobile Telecommunications System), and TETRA (Trans-European Trunked Radio Access);
- platforms for satellite communication such as Inmarsat, Iridium and Intelsat;
- platforms for distributed processing such as CORBA (Common Object Request Broker Architecture) and TINA (Telecommunications Information Networking Architecture); and
- platforms for wearable computing where part of the platform is carried on the person and consists of sensors, web interfaces, CORBA and mobile communications.
Some of these platforms are designed on old and well-established technologies; some
are so new that they are still not used in large numbers. Examples of the first category are the
platforms used for telephony, GSM, Inmarsat and Intelsat. Platforms that are still evolving are
Internet and intelligent networks. Platforms that have hardly been taken into use are CORBA and
TINA. What is important to note is that it has taken several years to develop each of these
technologies. Moore’s law does not act such that the time to develop platforms has become
significantly shorter. On the other hand, because of Moore’s law, the platforms have become
more complex.
Some illustrative examples of evolution time are the following. It took seven years to
develop the specification for the GSM system (1982 to 1989). It took Bellcore about five
years to develop specifications for the intelligent network (1985 to 1990). These systems are
complex. However, the development time was short: the GSM system was developed with an
obligation to introduce it as soon as it was implemented (the memorandum of understanding
of 1987); the intelligent network was developed by one organisation. In standardisation, GSM
is an exception rather than the rule: the development time for standards is usually very long
and the resulting standard is often technologically outdated (ISDN) or misses the market
(teletex). Teletex was an attempt to offer a fast text transfer service. However, the terminal
was too expensive and was later replaced by the more flexible and cheaper PC and the
protocols of the Internet.
It took about ten years to develop maritime satellite communication (1969 to 1978).
Half the time was used in order to determine the feasibility of such systems. In comparison, it
took only five years to develop the NMT (Nordic Mobile Telephone) system (1975 to 1980).
These systems are comparable from a technological viewpoint. The difference in development
time was due to the maturity of the technology. The microprocessor came just in time to make the
first system for maritime satellite communication (Marisat) economically feasible. NMT was
originally meant to be a manual mobile system common for the Nordic countries. However,
the microprocessor made it possible to automate the system. The processing of signalling
required in automatic systems could not have been implemented without it.
The CORBA platform has been developed for about ten years and the first generation
platform has been marketed for the last two or three years. The platform will enable
distributed or co-operative processing in all types of systems. TINA is an adaptation of
CORBA to telecommunications systems. It has been developed for six years and is still not
finished. TINA may also replace Internet platforms – if it is successfully implemented and
cheap enough. CORBA was developed by an enormously large group owned by about 640 manufacturers, users and universities. TINA was owned by about 40 telecommunications
operators and manufacturers. Note that CORBA and TINA are among the newest efforts in
developing platforms.
In the discussion, we usually forget that the Internet and the protocols used for
information transfer (for example, IP and TCP) were developed over many years. The Internet
has existed since 1983 as a communication tool for universities and research institutes. Before
that it existed for 14 years as the ARPANET (Advanced Research Projects Agency Network)
developed by the U.S. Department of Defence for linking together military research facilities
and universities working on defence projects. The World Wide Web was developed at CERN
(Conseil Européen pour la Recherche Nucléaire) in Geneva. The work started in 1989 and
was finished in 1992 when the hypertext markup language (HTML) and the hypertext transfer protocol (HTTP) were released. The time between the creation of the ARPANET and the World
Wide Web has been long. During the whole period, the network has been used but not as a
commercial product. The commercialisation took place in late 1993. The success of the World
Wide Web lies in the fact that the “standard” was developed by one organisation where the
aim was not commercial telecommunications but a support vehicle for basic research in
particle physics. CERN needed a tool by which they could co-operate with universities in
interpreting pictures of reaction products resulting from particle collisions.
Internet was developed without commercial interests. In some examples, such as
GSM, intelligent networks, Inmarsat and TINA, the commercial interests have been focussed.
The development times of other systems such as ISDN, CORBA, UMTS and ATM have been unnecessarily long because the work has opened up for commercial conflicts and positioning of interests. However, all platform development has taken much time, and there are no indications that this time is becoming shorter.
In contrast, it is easy to develop new applications as soon as the platform is available
and the product does not require enhancement of the platform. It may take a few days to
create a new web site. Larger products like electronic newspapers, payment services and
electronic commerce take a little longer but we are still talking of months rather than years.
The same applies to services on intelligent nodes (freephone, premium rate services and
universal access number) and in mobile systems (GSM) where the service does not require
development of the platform itself. It is easy to develop web-services, supplementary services,
fleet management services and other services in the GSM system as soon as the basic platform
has been built. The development is illustrated in Figure 2.6.
The figure contains examples of the development time for platforms and applications –
or derived products. The numbers are only approximations of the time taken. The exact time
is not important. The figure rather brings out the duality of developing the platform and
the products the platform can support. The main driving force behind the intelligent network
was to develop a platform on which new services could be realised cheaply and marketed in a
short time. The driving force behind computer telephone integration (CTI) was similar for
developing applications on PABXs (private automatic branch exchange = the automatic
switch at the subscriber premises) by terminating both the telephone services and the Internet
services on a common platform. On such platforms, the calling number – if available – may
be used to get the customer profile from the computer automatically. The platform may be
used to route calls to the terminal the user is currently using, or direct them to a storage
device. The platform may combine several services such as mailboxes, paging systems and
mobile systems, and distinguish between different user contexts: available, unavailable, do-not-disturb or alternate termination. Many of these services are easy to develop when the
platform exists.
Platform products (development time):
- Intelligent network (IN) node: 10 years
- GSM: 10 years
- World Wide Web: 4 years
- CORBA: 10 years
- ISDN: 15 years
- Computer telephone integration (CTI): 10 years

Derived products (development time):
- Freephone: 1 month
- Universal personal telecommunications (UPT): 6 months
- Premium rate service: 1 month
- Web page: 1 week
- Electronic newspaper: 2 months
- Transaction services: 2 months

Products with enhanced technology (development time):
- Centralised queue: 1-2 years
- Intelligent building: 1-2 years
- Wearables (experiments): months
- Wearables (product adapted to market): several years
Figure 2.6 Development time for different projects
The figure also contains a third category: services that require enhancement of the
platform – or products with enhanced technology. One example is the centralised queue
service where the problem was not to realise the functions in the intelligent network node
managing the queue but to implement the procedures and protocols in the PABXs for
indicating when an operator is busy, engaged or absent. This required that PABXs were
retrofitted with such functionality.
The products from the three categories may exist at the same time in the company’s
portfolio. If the composition of this mix is not understood, neither is the way in which the
company may earn money from the different products. Typically, the temporal development
of the portfolio of competing companies may look like Figure 2.7. Each triangle represents
the development time of a new product. Some development times are long while others are
short depending on whether a new platform must be developed or the product can be designed
on an existing platform. The profit of the company may then oscillate, depending upon capital
costs and how well the product is received in the market. The point is that the mix of a
complex portfolio with many time constants may make it difficult to assess when return on the investment can be expected and how the market will perceive the product. It is easy to assess the situation wrongly: a profitable product is not developed because it takes time to design it.
Similarly, products that are easy to make are marketed without getting any return, or products
are withdrawn from the market before they have matured.
[Figure 2.7 sketches profit against time for two companies; the triangles along the time axis represent development cycles of different lengths.]
Figure 2.7 Development and profit cycles
Finally, let us observe that services and telecommunication products of the type we
have just described are new. GSM, the World Wide Web, computer telephone integration (CTI),
and intelligent networks were implemented during the first half of the 1990s. These networks
and the services developed on them are still new: all of them are about five years old.
Therefore, we cannot expect that they have stabilised from a business viewpoint. Instead we
are facing a business and a market which are chaotic, characterised by rapid growth of the
market, much innovation, and creation of new products at a high rate. In such an environment,
it is likely that traditional methods for strategic and economic analysis break down and must
be replaced by new ways of perceiving the business.
3 Convergence
The term convergence is used for describing several phenomena in telecommunications:
convergence of information technology and telecommunications, convergence of different
businesses (C4 convergence) and fixed-mobile convergence. Each of these phenomena will be
discussed briefly.
3.1 Trends
The background for the development is depicted in Figure 3.1.
[Figure 3.1 sketches relative traffic volume against time: fixed traffic and speech traffic flatten out while mobile traffic and data traffic grow.]
Figure 3.1 Evolution of the market
The figure illustrates two developments that take place at the same time. Firstly, the
land mobile market is, at least in some countries, just as big as the market for fixed
interconnections. What is even more important is that the mobile market is increasing while
the market for fixed services has stagnated or is decreasing. Mobility is also taking new forms
in the fixed network. The most important evolutions are personal mobility allowing the user to
move from one terminal to another while receiving services, and session mobility allowing
ongoing sessions to be handed over from one terminal to another without losing the context.
Soon most applications will require mobility of some kind. Secondly, the market for data
transmission is becoming bigger than the market for speech services. The reason for this
development is the World Wide Web. IP telephony will merge the services in such a way that
the traditional telephony service may disappear entirely and the whole network will be designed on
conditions optimal for transferring data.
These two trends will merge and the result is a mobile data network offering mobility
of terminals, persons and sessions. The capability of designing user-friendly, cheap and reliable mobile data services will create the winning position in the marketplace.
3.2 Convergence of information technology and telecommunications
There are two observations behind this convergence. First, information technology – in terms
of computer science – is the building block of all equipment in telecommunications. All
equipment in the network and at the user premises contains processing devices and software.
This simply reflects the general development of computation. Instead of designing dedicated
circuits performing such tasks as speech and picture coding, switching, service processing and
system management, these functions are realised on standard computer hardware programmed
with software performing the function. This development is not specific to telecommunications. Computer hardware is now found in all types of equipment, ranging from
complex control systems of chemical plants to refrigerators. This is simply a consequence of
the development of computer hardware where the processor is becoming fast enough and the
memory circuits big enough to replace dedicated circuits. Ten years ago we saw a need for
centralised supercomputers doing advanced computer tomography for several hospitals at the
same time or inverting the enormous matrices required for controlling large systems such as
a chemical plant. These needs do not exist anymore. It is now possible to do the
computations locally.
The telecommunication networks merge to become a single network. This merged
network will be designed using technologies derived from the Internet. All services will be
provided on this network realising the multimedia network where sound, picture, graphics and
text is combined. The convergence is thus towards a functionally integrated network offering
everything seamlessly to the users.
Information technology has thus already become the most important ingredient in
telecommunications systems.
On the other hand, computation is distributed in most systems. Search for information
on the Internet is one example of distributed computing. Other examples are banking, e-commerce, remote database management, mobile agents and nomadic computing. All these
applications, and many more, require telecommunications in order to interconnect the
software components. Therefore, information technology and telecommunications are
merging as technologies.
3.3 C4 convergence
C4 stands for Content, Computer, Communication and Consumer electronics. All these
industries are involved in producing and providing multimedia services. Content means
creation and management of electronic information. Industries involved in this business are
newspapers, educational institutions, producers and distributors of music, movies and video
programs, broadcast corporations, entertainment developers, banks, merchants and so on.
Computer incorporates all those who store and process information. This may be electronic
libraries, transaction managers, information navigators, information hosts, clearinghouses and
owners of platforms for electronic commerce, telecommuting and co-operative work.
Communication means distribution of the content to consumers like enterprises and homes.
The distribution may be via ISDNs, the Internet, cable television systems, satellites and
mobile radio. Consumer electronics refers to the equipment at the consumer premises. This
may be PCs, television sets, telephones, and microprocessors in the car or the refrigerator.
The convergence means that modern electronic products cannot be produced and
provided to the consumer unless all four components of the C4 hierarchy are present. For the
telecommunications industry this means that the environment of the business has changed.
Going back to 1990, the products of the telecommunications industry consisted of person-to-person and machine-to-machine communication. The presentation medium for the majority of
the applications was the telephone apparatus, the radio receiver and the television set. By the
end of the 1990s, this picture is changing dramatically. The content-rich applications are
growing both in volume and in number because of the World Wide Web. This growth is by no
means a result of the efforts of the telecommunications industry but the result of the entry of
the other industries into the arena.
Another facet of the C4 convergence has been concerned with dominance. One
problem is who owns the customer: is it the content provider owning the information or the
communication operator providing the plug in the wall? Is the computer business a
component of the communication network or is this an independent business? How will the
communication operator allow businesses offering computer services access to the network?
Who will determine the prices of the services: the content provider, the enterprise offering
computer services or the communication operator? Or is it part of the price paid for the
terminal equipment? These are just a few of the complicated questions raised as a result of the
development.
3.4 Fixed-mobile convergence
The fixed-mobile convergence is motivated by still other concerns. When the GSM system
was developed, mobile communication was still regarded as independent of the fixed
network. About 1990 ITU developed the concept of personal mobility inheriting principles
from the GSM development. The idea was to offer mobility independent of network access.
The concept was called Universal Personal Telecommunication (UPT) in Europe and Personal
Communication Services (PCS) in USA. The PCS concept is nothing but a mobile network
with capabilities similar to those of GSM. The UPT service would enable a user to access, and be accessed from, the network at any mobile or fixed access point using one unique
telephone number. The service was implemented in the telephone network in Norway as early
as 1994. However, it did not become a success for two reasons. First, the access procedure
required on telephones was too complicated, requiring that between 30 and 40 digits be dialled. Second, the development in price and availability of the GSM system offered the users the personal mobility they requested at an acceptable quality and price.
A study performed by EURESCOM [9] in 1993 concluded that UPT could not be used in the GSM system because of the way charging and service handling were implemented in this system. The usage of the GSM system has increased so much that it has become a serious
competitor to the fixed network. This has led to two problems. First, it is evident that the users
no longer view GSM and fixed services differently. The networks simply
provide the same services to the user in different situations. Second, it is convenient to
provide all the services a customer requires as one subscription. These objectives are also
becoming relevant for the convergence of traditional services (both mobile and fixed) with the
new opportunities offered by the Internet.
The fixed-mobile convergence therefore represents possible solutions both for
harmonising networks employing different technologies and for designing uniform
subscription arrangements encompassing several types of network.
The development of the mobile Internet makes these efforts more urgent.
[9] EURESCOM: European Institute for Research and Strategic Studies in Telecommunications.