History:
The first cathode ray tube scanning device was invented
by the German scientist Karl Ferdinand Braun in 1897.
Braun introduced a CRT with a fluorescent screen,
known as the cathode ray oscilloscope. The screen
would emit a visible light when struck by a beam of
electrons. In 1907, the Russian scientist Boris Rosing
(who worked with Vladimir Zworykin) used a CRT in the
receiver of a television system that used mechanical mirror-drum scanning at the camera end.
Time line:
-1855
The German Heinrich Geissler invents the Geissler tube, created
using his mercury pump; this was the first well-evacuated
vacuum tube, later modified by Sir William Crookes.
-1859
The German mathematician and physicist Julius Plücker
experiments with invisible cathode rays, becoming the first
to identify them.
-1878
The Englishman Sir William Crookes confirms the existence of
cathode rays by displaying them with his invention of the
Crookes tube, a crude prototype for all future cathode ray tubes.
-1897
The German Karl Ferdinand Braun invents the CRT oscilloscope; the Braun tube was the forerunner of today’s television and radar tubes.
-1929
Vladimir Kosma Zworykin invents a cathode ray tube called the kinescope, for use with a primitive
television system.
How it works:
Almost all TVs in use today rely on a device known as
the cathode ray tube, or CRT, to display their images.
LCDs and plasma displays are sometimes seen, but they
are still rare when compared to CRTs. It is even possible
to make a television screen out of thousands of ordinary
60-watt light bulbs! You may have seen something like
this at an outdoor event like a football game. Let’s start
with the CRT, however, because CRTs are the most
common way of displaying images today. The terms
anode and cathode are used in electronics as synonyms
for positive and negative terminals. For example, you
could refer to the positive terminal of a battery as the anode and the negative terminal as the cathode. In a cathode ray tube, the “cathode” is a heated filament (not unlike the filament in a normal
light bulb). The heated filament sits in a vacuum created inside a glass “tube.” The “ray” is a stream of
electrons that naturally pour off the heated cathode into the vacuum. Electrons are negatively charged. The anode
is positive, so it attracts the electrons pouring off the cathode. In a TV’s cathode ray tube, the stream
of electrons is focused into a tight beam, accelerated by the anode voltage, and aimed at the phosphor-coated screen, which glows where the beam strikes it.
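To give a sense of scale, the speed the electrons reach is set by the anode voltage. A minimal sketch using the non-relativistic energy balance eV = ½mv² (the 25 kV anode voltage is an illustrative assumption, not a figure from this article):

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602e-19   # electron charge, C
E_MASS = 9.109e-31     # electron mass, kg

def electron_speed(anode_voltage):
    """Non-relativistic speed gained by an electron accelerated
    through the given potential difference: eV = (1/2) m v^2."""
    return math.sqrt(2 * E_CHARGE * anode_voltage / E_MASS)

# A colour TV CRT anode voltage of roughly 25 kV (illustrative value)
v = electron_speed(25e3)
print(f"{v:.2e} m/s")  # roughly 9.4e7 m/s
```

At this voltage the result is already about 30% of the speed of light, so a fully relativistic treatment would shave a few percent off the figure.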
Practical Uses for Cathode Rays:
While many scientists were busy trying to unlock the secrets of
cathode rays, others were searching for ways to apply them toward
practical ends. The first such application came in 1897 in the form
of Karl Ferdinand Braun’s oscilloscope. This device used a cathode
ray tube to produce luminescence on a chemically treated screen.
The cathode rays were allowed to pass through a narrow aperture, effectively focusing them into a beam which appeared on the
screen as a dot. The dot was then made to “scan” across the screen according to the frequency of
an incoming signal. An observer viewing the oscilloscope’s screen would then see a visual representation of an electrical pulse.
During the first three decades of the twentieth century, inventors continued to devise uses for
cathode ray technology. Inspired by Braun’s oscilloscope, A. A. Campbell-Swinton suggested that a
cathode ray tube could be used to project a video image upon a screen. Unfortunately, the technology of the time was unable to match Campbell-Swinton’s vision. It was not until 1922 that Philo T.
Farnsworth used a magnet to focus a stream of electrons onto a screen, producing a crude image.
Though the first of its kind, Farnsworth’s invention was quickly superseded by Vladimir Zworykin’s
kinescope, the ancestor of the modern television.
Today, most image-viewing devices are based upon cathode-ray technology. In addition,
electron guns are used widely in scientific and medical applications. One important use of cathode-ray
research has been the electron microscope, invented in 1928 by Ernst Ruska. The electron
microscope uses a stream of electrons to magnify an image. Because electrons have a much smaller
wavelength than visible light, they can be used to resolve
objects that are far too small to be seen by optical means.
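The wavelength argument can be made concrete with the de Broglie relation λ = h/√(2meV). A minimal sketch, assuming a 100 kV accelerating voltage (an illustrative figure, not from the article):

```python
import math

H_PLANCK = 6.626e-34   # Planck constant, J*s
E_CHARGE = 1.602e-19   # electron charge, C
E_MASS = 9.109e-31     # electron mass, kg

def de_broglie_wavelength(accel_voltage):
    """Non-relativistic de Broglie wavelength of an electron
    accelerated through accel_voltage: lambda = h / sqrt(2 m e V)."""
    return H_PLANCK / math.sqrt(2 * E_MASS * E_CHARGE * accel_voltage)

lam = de_broglie_wavelength(100e3)   # 100 kV, an assumed typical value
print(f"{lam * 1e12:.2f} pm")        # a few picometres
```

The result is on the order of picometres, five orders of magnitude below the 400–700 nm wavelengths of visible light, which is why an electron beam can resolve so much finer detail.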
The demise of the CRT monitor:
- they’re heavy and bulky
- they’re power hungry - typically 150W for a 17in monitor
- their high-voltage electric field, high- and low-frequency magnetic fields and X-ray emissions have in the past proven harmful to humans
- they cannot be used in laptops
- the scanning technology they employ makes flickering unavoidable, causing eye strain and fatigue
- their susceptibility to electromagnetic fields makes them vulnerable in military environments
- their surface is often either spherical or cylindrical, with the result that straight lines do not appear straight at the edges.
Whilst competing technologies - such as LCDs and PDPs - had established themselves in specialist areas, there are several good
reasons to explain why the CRT was able to maintain its dominance in the PC monitor market into the new millennium:
- phosphors have been developed over a long period of time, to the point where they offer excellent colour saturation at the very small particle size required by high-resolution displays
- the fact that phosphors emit light in all directions means that viewing angles of close to 180 degrees are possible
- since an electron current can be focused to a small spot, CRTs can deliver peak luminances as high as 1000 cd/m2 (1000 nits)
- CRTs use a simple and mature technology and can therefore be manufactured inexpensively in many industrialised countries
- whilst the gap is getting smaller all the time, they remain significantly cheaper than alternative display technologies.
However, by 2001 the writing was clearly on the wall and the CRT’s long period of dominance appeared
finally to be coming to an end. In the summer of that year Philips Electronics - the world’s largest CRT
manufacturer - had agreed to merge its business with that of rival LG Electronics, Apple had begun shipping all its systems with LCD monitors and Hitachi had closed its $500m-a-year CRT operation, proclaiming that “there are no prospects for growth of the monitor CRT market”. Having peaked at a high of
approaching $20 billion in 1999, revenues from CRT monitor sales were forecast to plunge to about half
that figure by 2007.
Health concerns:
Ionizing radiation
CRTs can emit a small amount of X-ray radiation as a result of
the electron beam’s bombardment of the shadow mask/aperture
grille and phosphors. The amount of radiation escaping the front
of the monitor is widely considered unharmful. The Food and
Drug Administration regulations in 21 C.F.R. 1020.10 strictly
limit television receivers, for instance, to 0.5 milliroentgens per hour (mR/h) (0.13 µC/(kg·h) or 36 pA/kg) at a distance
of 5 cm (2 in) from any external surface; since 2007, most CRTs
have emissions that fall well below this limit.[42]
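The parenthetical unit conversions in that limit can be cross-checked in a few lines, using the definition 1 roentgen = 2.58×10⁻⁴ C/kg:

```python
# Cross-check the FDA limit's unit conversions.
R_TO_C_PER_KG = 2.58e-4   # 1 roentgen in coulombs of charge per kg

limit_mR_per_h = 0.5
# 0.5 mR/h expressed in microcoulombs per kilogram per hour
limit_uC_per_kg_h = limit_mR_per_h * 1e-3 * R_TO_C_PER_KG * 1e6
print(f"{limit_uC_per_kg_h:.3f} uC/(kg*h)")   # 0.129, quoted as 0.13

# Averaged over the hour, that charge rate is a current per kilogram
limit_pA_per_kg = limit_uC_per_kg_h * 1e-6 / 3600 * 1e12
print(f"{limit_pA_per_kg:.1f} pA/kg")         # ~35.8, quoted as 36
```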
Toxicity
Color and monochrome CRTs may contain toxic substances, such as cadmium, in the phosphors.[43][44]
The rear glass tube of modern CRTs may be made from leaded glass, which represents an environmental
hazard if disposed of improperly.[45] By the time personal computers were produced, glass in the front
panel (the viewable portion of the CRT) used barium rather than lead, though the rear of the CRT was still
produced from leaded glass. Monochrome CRTs typically do not contain enough leaded glass to fail EPA
tests.
In October 2001, the United States Environmental Protection Agency created rules stating that CRTs
must be brought to special recycling facilities. In November 2002, the EPA began fining companies that
disposed of CRTs through landfills or incineration. Regulatory agencies, local and statewide, monitor the
disposal of CRTs and other computer equipment.[46]
In Europe, disposal of CRT televisions and monitors is covered by the WEEE Directive.[47]
Flicker
At low refresh rates (below 50 Hz), the periodic scanning of the display may produce an irritating flicker
that some people perceive more easily than others, especially when viewed with peripheral vision. A high
refresh rate (above 72 Hz) reduces the effect. Computer displays and televisions with CRTs driven by digital electronics often use refresh rates of 100 Hz or more to largely eliminate any perception of flicker.[48]
Non-computer CRTs, such as those used for sonar or radar, may have long-persistence phosphor and are thus flicker-free. If the persistence is too long on a video display, moving images will be blurred.
High-frequency noise
CRTs used for television operate with horizontal scanning frequencies of 15,734 Hz (for NTSC systems)
or 15,625 Hz (for PAL systems).[49] These frequencies are at the upper range of human hearing and are
inaudible to many people; some people will perceive a high-pitched tone near an operating television CRT.
[50] The sound is due to magnetostriction in the magnetic core of the flyback transformer.
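The quoted line frequencies follow directly from each standard’s line count and frame rate; a quick check:

```python
# NTSC: 525 lines per frame at 30000/1001 (~29.97) frames per second.
ntsc_line_rate = 525 * 30000 / 1001
print(f"NTSC: {ntsc_line_rate:.2f} Hz")   # ~15734.27 Hz, quoted as 15,734 Hz

# PAL: 625 lines per frame at exactly 25 frames per second.
pal_line_rate = 625 * 25
print(f"PAL:  {pal_line_rate} Hz")        # 15625 Hz
```

Both figures sit at the upper edge of the roughly 20 Hz–20 kHz range of human hearing, which is why only some listeners notice the whine.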
Implosion