2.875 - Fall 2001
Mechanical Assembly and Its Role in Product Development
Term Project: Report #2
Analyzing the Assembly Design of a Computer Mouse Using Datum Flow Chains
October 31, 2001
Annabel Flores
James Katzen
Photos taken from:
http://www.petergof.com/x-ray/mouse.htm
http://www.mousemorf.com/images/mouse2b.jpg
Analyzing the Assembly Design of a Computer Mouse1
Using Datum Flow Chains
The Microsoft Mouse Version 2.0 is an ergonomic, dual-button mouse. The simple, ten-part
design provides an opportunity to analyze the product’s assembly characteristics. In addition, we
can apply assembly analysis tools such as a liaison diagram and datum flow chains to evaluate
the current design and propose product redesigns. Below is a figure of the computer mouse.
Figure 1: Computer Mouse Semi-Exploded View
The Liaison Diagram for the mouse is shown below:
[Liaison diagram linking the Mouse Base (Part 8), Mouse Cover (Part 5), Circuit Board (Part 11), Ball Holder (Part 1), Ball (Part 2), Sticker Pads #1 and #2 (Part 3), Screw (Part 4), Cord (Part 6), Gears #1 and #2 (Part 7), Spring (Part 9), and Wheel (Part 10). Labeled key characteristics include an appearance KC between the base and cover, functional KCs for the gears aligned with respect to the encoders, functional KCs for the ball tangent with the gear axes, and a functional KC for the buttons aligned with respect to the switches.]
Figure 2: Liaison Diagram for Microsoft® Mouse v2.2A
Note that this diagram shows the numerous key characteristics that must be delivered in this
device. A designer must pay attention to numerous aspects of the assembly in order to properly
deliver the important functional and customer requirements. This diagram appears fairly
straightforward, and shows that there is no coupling of key characteristics. Thus, we assume that each of these key characteristics can be met under nominal conditions.

1 Refer to the Bill of Materials and exploded view of the mouse attached at the end of this report.
In observing the liaison diagram, as well as the various parts of the mouse, we found a number of key characteristics that we could have focused on for this report. For example, the alignment between the mouse base and cover is a key characteristic of particular importance to the end user, who would like a smooth transition between the parts. This key characteristic is therefore an appearance characteristic. Another key characteristic is the positioning of the mouse cover with respect to the circuit board, so that the buttons properly actuate the electronic switches. This key characteristic is therefore a functional characteristic.
In choosing a key characteristic for the purpose of this analysis, we were compelled to choose one that is crucial to the functional performance of the mouse. We determined that, other than the monitoring of switch inputs, the key driver of mouse functionality is its ability to convert mechanical motion into optical information that can be transferred onto a computer screen.
A user will move the mouse on a surface that, in turn, will move the ball within. Refer to the
figure below to follow our nomenclature. The ball transfers up/down and left/right motion to
two independent gear axles that are in constant contact with the ball. The gear head rotates
according to the ball’s speed and direction. The teeth on the gear head pass through optical
sensors that record the speed and direction of the ball and hence the user’s movement.
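To make this motion-to-signal chain concrete, the sketch below estimates how many sensor counts a given mouse travel produces. The axle radius and slot count are assumptions chosen only for illustration; they were not measured from the actual mouse.

```python
import math

# Illustrative sketch of the motion-to-counts conversion described above.
# The axle radius and slot count are assumed values, not measurements of
# the actual mouse; the ball radius cancels out because the ball rolls
# without slipping on the desk.
AXLE_RADIUS_MM = 2.0   # assumed radius of the gear axle that rides on the ball
SLOTS_PER_REV = 48     # assumed number of light-passing slots in the gear head

def encoder_counts(mouse_travel_mm: float) -> int:
    """Counts one sensor pair registers for travel along its axis."""
    # The ball surface moves the same distance as the mouse, and the axle
    # surface is friction-driven by the ball, so the axle turns through
    # an angle of (travel / axle radius) radians.
    axle_rotation_rad = mouse_travel_mm / AXLE_RADIUS_MM
    # Each slot that sweeps past the sensor pair produces one count.
    return int(axle_rotation_rad / (2 * math.pi) * SLOTS_PER_REV)

print(encoder_counts(25.4))  # counts for one inch of travel along one axis
```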
[Figure: gear head and optical sensor pair, with the key characteristic (sensor spacing) and the X/Y coordinate axes labeled.]
Figure 3: Gear/Sensor with Coordinate System
The position of the gear’s teeth with respect to the sensor is critical to provide the sensitivity
necessary to capture the user’s movement. If the gear is grossly out of position in or about any
direction, the sensors will not be able to process the data accurately. However, for the purpose of
this analysis, we are focusing on the key characteristic that is the relative position of the gear and
sensor in the direction parallel to the gear's rotational axis. This key characteristic and its associated coordinate system are denoted in the figure above. Note that we have chosen to analyze only one gear/sensor pair instead of both, as the same analysis can be applied to each.
The parts that deliver the key characteristic are the mouse base (part #8), circuit board (part #11),
optical sensors (subassembly on part #11) and gear (part #7). The most likely root for this datum
flow chain is the mouse base as this part locates all others. The datum flow chain for the
gear/sensor subassembly is as follows:
[Datum flow chain diagram: the Mouse Base (root) locates the Circuit Board and the Gear through its features, and the Circuit Board in turn locates the Sensors. The numbered feature coordinate frames (1–16, listed in Table 1) hang off each part's group coordinate frame.]
Figure 4: Datum Flow Chain Delivering Key Characteristic
As the assembly root, the mouse base is the starting point to locate all other parts. The table
below lists the important features on the base that locate the circuit board and the gear.
Feature Number   Feature          Part
1                Support          Base
2                Pin 1            Base
3                Pin 2            Base
4                Tab 1            Base
5                Tab 2            Base
6                Bottom Surface   Circuit Board
7                Hole 1           Circuit Board
8                Hole 2           Circuit Board
9                Hole Set 1       Circuit Board
10               Hole Set 2       Circuit Board
11               Pegs             Photo Eye Sensor
12               Pegs             LED Sensor
13               Sensor Spacing   Sensors
14               Teeth            Gear
15               Peg End 1        Gear
16               Peg End 2        Gear
Table 1: Features Delivering the KC
The Support, Pin 1 & Pin 2 define the location of the circuit board within the mouse base. Tabs
1 & 2 locate the position of the gears with respect to the base. The injection-molded features of
the base are shown in the figure below. The part’s coordinate frame locates the coordinate frame
of the features.
[Photograph of the mouse base with the locating features labeled: Support, Pin 1, Pin 2, Tab 1, and Tab 2.]
Figure 5: Mouse Base Features Delivering the KC
The circuit board features that connect to those on the mouse base are the bottom surface and Holes 1 & 2. The circuit board's coordinate frame is located in the assembly through these features. The part coordinate frame, in turn, defines the frames of Hole Sets 1 & 2, which define the position of the optical sensors. Figure 6 locates these features on the circuit board.
[Photographs of the circuit board with the locating features labeled: Hole 1, Hole 2, Hole Set 1, and Hole Set 2.]
Figure 6: Circuit Board Features Delivering the KC
The two sensors that translate the mechanical motion of the gears into electrical signals are the photo eye sensor and the LED sensor. The position of these sensors and the spacing between them are defined through the position of the circuit board's hole sets.
[Photograph of the sensor pair on the circuit board: the LED sensor and the photo eye sensor.]
Figure 7: Sensor Features Delivering the KC
As shown in Figure 8, the gear’s coordinate frame is located through its pegs and the respective
tabs on the mouse base.
[Photograph of the gear with its locating features labeled: Teeth, Peg End 1, and Peg End 2.]
Figure 8: Gear Features Delivering the KC
The datum flow chain developed in Figure 4 is a direct representation of the assembly scheme.
The circuit board and gear are located through features on the mouse base. The circuit board
subassembly locates the sensors. The position of the gear with respect to the sensor spacing is
the key characteristic under analysis.
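The locating relationships summarized above can also be written down as a small directed graph. The sketch below is only a restatement of Figure 4 and Table 1; the pairing of individual hole sets with individual sensors is our assumption.

```python
# Datum flow chain of Figure 4 restated as a directed graph: each edge runs
# from the locating part to the part it locates, labeled with the mating
# features from Table 1. Which hole set locates which sensor is assumed.
datum_flow_chain = {
    ("Mouse Base", "Circuit Board"): ["Support", "Pin 1", "Pin 2"],
    ("Mouse Base", "Gear"): ["Tab 1", "Tab 2"],
    ("Circuit Board", "Photo Eye Sensor"): ["Hole Set 1"],
    ("Circuit Board", "LED Sensor"): ["Hole Set 2"],
}

# The key characteristic (gear teeth positioned within the sensor spacing)
# is closed across the two branches that share the mouse base as their root.
for (parent, child), features in datum_flow_chain.items():
    print(f"{parent} locates {child} via {', '.join(features)}")
```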
In studying the assembly scheme of the mouse, it became apparent that parts in the assembly
were over-constrained. Adding the degrees of freedom to the datum flow diagram developed in
Figure 4 results in the following figure.
[Datum flow chain of Figure 4 annotated with the degrees of freedom each mate constrains (capital letters denote translations, lower-case letters rotations). The circuit board is located on the mouse base through the support (Z, x, y), Pin 1 (X, Y), and Pin 2 (X, Y, z), and two over-constraints are flagged along the chain; the labels X, Z, x, z and Y appear on the remaining mates.]
Figure 9: Datum Flow Chain with Assembly Degrees of Freedom
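A simple bookkeeping of the degrees of freedom constrained by each mate makes the over-constraint explicit: any degree of freedom delivered by more than one mate is over-constrained, while any degree of freedom delivered by none leaves the part under-constrained. The sketch below uses our reading of the labels in Figure 9; the assignment of degrees of freedom to specific mates is an assumption.

```python
from collections import Counter

# Bookkeeping of the degrees of freedom each circuit-board/mouse-base mate
# constrains, using the labels of Figure 9 (capital letters = translations,
# lower case = rotations). The assignment of DOF to specific mates is our
# reading of the figure, not a statement taken from the drawing itself.
mates = {
    "Support / bottom surface": ["Z", "x", "y"],
    "Pin 1 / Hole 1": ["X", "Y"],
    "Pin 2 / Hole 2": ["X", "Y", "z"],
}

counts = Counter(dof for dofs in mates.values() for dof in dofs)
over_constrained = sorted(dof for dof, n in counts.items() if n > 1)
unconstrained = [dof for dof in ("X", "Y", "Z", "x", "y", "z") if dof not in counts]

print("constrained more than once:", over_constrained)  # ['X', 'Y']
print("left unconstrained:", unconstrained)             # []
```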
There are a number of potential redesigns of the computer mouse that can eliminate the over-constrained conditions. In addition, a product redesign can improve the delivery of the key characteristic. We have developed a number of design proposals that improve the product assembly through a more robust design; they are described in detail below.
Design Proposal 1: Revise Circuit Board – Mouse Base Locating Method
Currently, the circuit board (Part #11) is oriented with respect to the mouse base (Part #8)
through the use of two peg-hole mates. These are shown in the picture below:
Figure 10: Assembly Features linking the mouse base and the circuit board
As we have seen in this class, this is an inherently over-constrained design. The over-constraint
in the x and y directions will cause numerous problems, such as assembly difficulty, and the
locking-in of internal stresses.
This over-constraint can be overcome by introducing a peg-slot mate at one of the features. This
design is shown below:
Figure 11: Proposed Assembly Features linking the mouse base and the circuit board
Note that since the two pegs do not share a common x or y location, the slot must be placed on
an angle. The long axis of the slot must be parallel to the line that connects the two center points
of the pegs. This will result in a completely constrained assembly, with no over-constraint. The
circuit board will be completely positioned and oriented properly with respect to the mouse base.
This will likely result in reduced assembly efforts, and reduced “locked-in” stresses.
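As a small check of the slot orientation rule above, the angle of the slot's long axis can be computed directly from the two peg-center coordinates. The coordinates used below are hypothetical; the actual peg locations were not measured.

```python
import math

# Hypothetical peg-center coordinates on the mouse base, in mm (assumed, not
# measured). The slot replaces the second hole on the circuit board, and its
# long axis must lie along the line joining the two peg centers.
peg_1 = (12.0, 8.0)
peg_2 = (47.0, 31.0)

slot_angle_deg = math.degrees(math.atan2(peg_2[1] - peg_1[1], peg_2[0] - peg_1[0]))
print(f"slot long-axis angle from the board x-axis: {slot_angle_deg:.1f} degrees")
```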
Design Proposal 2: Revise Optical Encoder Sensors’ LED Unit – Circuit Board Locating
Method
Currently, the optical encoder sensors’ LED units are oriented with respect to the circuit board
(Part #11) through the use of two peg-hole mates. These are shown in the picture below:
Figure 12: Assembly Features linking the optical encoder sensors’ LED units and the
circuit board
This is recognized as an inherently over-constrained design. The over-constraint in the x and y
directions will cause numerous problems, such as assembly difficulty, and the locking-in of
internal stresses. It is assumed that these problems occur very regularly, since both the LED
units are visibly misoriented about their x-axes on the unit we examined. We assume that this
misorientation does not affect the amount of light produced by the LED units in the direction of
the photo eye (due to a wide field of illumination). However, this over-constraint likely requires more assembly effort than should be needed for these relatively simple parts.
This over-constraint can be overcome by introducing a peg-slot mate at one of the features. This
design is shown below:
Figure 13: Proposed Assembly Features linking the optical encoder sensors’ LED units
and the circuit board
Although it has not been confirmed, it is assumed that the through holes in the circuit board are
large enough to have clearance between the sensor element leads and the hole edges. If this is
the case, the item is not over-constrained. However, if these items are soldered one lead at a
time rather than both at once, the item will be over-constrained, since an adjustable contact
feature was fixed before the mates were fixed. But since it is known that this part is made with a
wave-soldering process, we can assume that the contacts are all affixed at the same time.
This new design will result in a completely constrained assembly, with no over-constraint. The
LED unit will be completely positioned and oriented properly with respect to the circuit board.
This will likely result in reduced assembly efforts, and reduced “locked-in” stresses.
Design Proposal 3: Revised Optical Encoder Sensors’ Photo Eye Unit – Circuit Board
Locating Method
Currently, the optical encoder sensors’ photo eye units are oriented with respect to the circuit
board (Part #11) through the use of three collinear peg-hole mates. These are shown in the
picture below:
Figure 14: Assembly Features linking the optical encoder sensors’ photo eye units and the
circuit board
As with the circuit board – mouse base and the LED units – circuit board mates, this is
recognized as an inherently over-constrained design. The over-constraint in the x and y
directions will cause numerous problems, such as assembly difficulty, and the locking-in of
internal stresses. It is assumed that these problems also occur very regularly, since both the photo eye units are visibly misoriented about their x-axes on the unit we examined. We assume that this misorientation does not affect the amount of light sensed by the photo eye units (due to a wide sensing field). However, this over-constraint likely requires more assembly effort than should be needed for these relatively simple parts.
Eliminating one of the three pegs and introducing a peg-slot mate at one of the features can
overcome this over-constraint. This design is shown below:
Figure 15: Proposed Assembly Features linking the optical encoder sensors’ photo eye
units and the circuit board
This will result in a completely constrained assembly, with no over-constraint. The photo eye
unit will be completely positioned and oriented properly with respect to the circuit board. This
will likely result in reduced assembly efforts, and reduced “locked-in” stresses.
It is not known whether the new two-lead photo eye units would be compatible with the existing
circuitry and sensing algorithms. Therefore, this change to correct an over-constrained condition
may require additional analysis and changes.
The same concern regarding the soldering process exists for this item, as it did for the LED units.
Suggested Redesigns That Improve KC Delivery
The Key Characteristic that has been identified is the alignment of the gear, the optical encoder
sensor LED unit and the optical encoder sensor photo eye unit. This alignment must be tightly
controlled, since the mechanical motion of the mouse is directly transferred to electrical signals
via the use of the optical encoder. Figure 16 shows this nominal case.
[Schematic: the gear sits between the LED unit and the photo eye unit, both centered on the alignment centerline, with X/Y axes marked for each component.]
Figure 16: Nominal case where the light transmitted by the optical encoder sensor LED unit is sensed by the optical encoder sensor photo eye unit
If this alignment were in error, the amount of light passing through the gear (Part #7) would be
affected. If large misalignments occur, no light could pass through, and even though the mouse
is in motion, the unit would not sense the movement. The condition of lateral error is shown
below:
[Schematic: the LED unit is offset laterally from the alignment centerline, so only part of its beam reaches the photo eye unit through the gear.]
Figure 17: Effect on the amount of light sensed by the optical encoder sensor photo eye unit as the result of lateral misalignment of the optical encoder sensor LED unit and the optical encoder sensor photo eye unit
The condition of angular error is shown below:
[Schematic: the LED unit is tilted relative to the alignment centerline, so its beam is directed partly away from the photo eye unit.]
Figure 18: Effect on the amount of light sensed by the optical encoder sensor photo eye unit as the result of angular misalignment of the optical encoder sensor LED unit and the optical encoder sensor photo eye unit
Multiple design proposals are being offered to make the sensing system more robust to variation
in the alignment of the optical encoder sensor LED unit and the optical encoder sensor photo eye
unit.
Design Proposal 4: Increase Diameter of Toothed Portion of Gear
Since the openings that allow the passage of light are slots rather than point holes, the current design of the gear (Part #7) is tolerant of some lateral error in the x-direction. A non-robust design for the gear, with point holes, is shown below:
Figure 19: Non-Robust design for optical encoder gear
However, due to geometric concerns, the current gear cannot be made more tolerant of lateral
error in the x-direction, since expanding the slots would soon weaken the gear tremendously.
But, if the overall diameter of the gear is increased, the slots could then be lengthened, making
the gear much more tolerant of lateral error. This design modification is shown below:
Figure 20: Proposed design improvement for optical encoder gear
Note that the accuracy of the device would be negatively affected as the lateral error in the x-direction increases. Knowing the elapsed time over which the light passes through a gap in the gear teeth, a rotational velocity can be calculated, as long as the width of the gap is known. However, if this gap increases, the relation between the time and the corresponding rotational velocity is affected, and false readings can be created. This is precisely what would happen if the gear placement were in error. Because the width of the slot increases with increasing radius, this increased gap would fool the electronics into thinking that the mouse was moving more slowly than it actually was. However, software utilities exist that allow the computer user to customize the behavior of the mouse. Therefore, the customer could correct the slight calculation error in mouse position, making integration of this design proposal feasible.
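A rough worked example of this effect, with assumed dimensions, is shown below: the electronics infer speed from the time the beam is unblocked and a calibrated slot width, so a physically wider slot makes the inferred speed read low, which a software sensitivity setting can compensate for.

```python
# Speed inferred by the electronics from the time the beam is unblocked,
# assuming a calibrated slot arc length. All numbers are assumed values
# used only to show the direction of the error.
def inferred_speed_mm_s(unblocked_time_s: float, calibrated_gap_mm: float) -> float:
    return calibrated_gap_mm / unblocked_time_s

calibrated_gap_mm = 1.0   # slot arc length the electronics expect (assumed)
actual_gap_mm = 1.5       # longer slot after the diameter increase (assumed)
true_speed_mm_s = 30.0    # actual surface speed of the gear rim (assumed)

unblocked_time_s = actual_gap_mm / true_speed_mm_s
print(inferred_speed_mm_s(unblocked_time_s, calibrated_gap_mm))  # 20.0: reads
# slower than the true 30.0 until the sensitivity is recalibrated in software
```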
Design Proposal 5: Decrease Distance Between Optical Encoder Sensor LED and Photo
Eye Unit
Changing the distance between the optical encoder sensor LED unit and the optical encoder sensor photo eye unit will affect the robustness of the light sensing performance.
Shortening this distance will result in an increased ability of the photo eye to detect the light
illumination, even in the presence of LED / Sensor misalignment. The same angular
misalignment that was shown in Figure 18 is repeated in Figure 21. However, in Figure 21, the
distance between the LED and the photo eye unit is decreased.
[Schematic: the same angular misalignment as in Figure 18, but with the LED unit and photo eye unit placed closer together, so more of the beam reaches the photo eye.]
Figure 21: Improved robustness of sensing device created by the shortening of the distance between the sensor LED and the photo eye unit
As can be seen, with a misalignment between the LED and the sensor unit, the photo eye “sees”
more of the beam if it is brought closer to the LED. Therefore, this shortened distance would allow a greater amount of light to be sensed. This change would improve the KC delivery by reducing the sensitivity of the system performance to variation in the alignment of the optical encoder sensor LED unit and the optical encoder sensor photo eye unit.
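The trend can be illustrated with the relation offset = distance × tan(misalignment angle), evaluated at the current spacing and at roughly half that spacing. The distances and angle below are assumptions, not measurements.

```python
import math

# Lateral offset of the beam center at the photo eye produced by an angular
# misalignment of the LED. The spacing values and the angle are assumed.
def beam_offset_mm(led_to_eye_mm: float, misalignment_deg: float) -> float:
    return led_to_eye_mm * math.tan(math.radians(misalignment_deg))

misalignment_deg = 5.0
for spacing_mm in (8.0, 4.0):  # current spacing vs. roughly half of it (assumed)
    offset = beam_offset_mm(spacing_mm, misalignment_deg)
    print(f"spacing {spacing_mm} mm -> beam offset {offset:.2f} mm")
# Halving the spacing halves the offset, so more of the beam stays on the
# photo eye for the same angular misalignment.
```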
Upon examination of the assembly, it does appear that some of the distance between the two components can be eliminated. This can be seen in the following photograph:
Figure 22: Distance between Optical Encoder Sensor LED and Photo Eye Unit
A rough estimate predicts that almost 50 per cent of the existing distance can be eliminated.
Placing the components closer together would require adding more material to the circuit board
(which may introduce some cost), but is well within the realm of possibility.
Design Proposal 6: Change Field of Illumination of Optical Encoder Sensor LED and
Sensing Field of Photo Eye Unit
Changing the field of illumination of the optical encoder sensor LED unit and the sensing field of the optical encoder sensor photo eye unit will also affect the robustness of the light sensing performance. Elements that have a large field of illumination and/or a large sensing region will transmit and sense a constant amount of light, even in the presence of LED / Sensor
misalignment. The same angular misalignment that was shown in Figure 18 is repeated in Figure
23. However, in Figure 23, the field of illumination and the sensing field have been increased in
area.
[Schematic: the same angular misalignment as in Figure 18, but with a wider LED field of illumination and a wider photo eye sensing field, so the photo eye still receives the beam.]
Figure 23: Improved robustness of sensing device created by the use of electronic components with wider field of illumination and sensing areas
As can be seen, with a misalignment between the LED and the sensor unit, the photo eye “sees”
more of the beam if the optical properties are widened. Therefore, these upgraded components
would allow a constant amount of light to be sensed, even in the presence of LED / Sensor
misalignment. As in the prior proposal, this change would improve the KC delivery by reducing the sensitivity of the system performance to variation in the alignment of the optical encoder sensor LED unit and the optical encoder sensor photo eye unit.
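The same point can be made with a one-dimensional overlap calculation: the fraction of the photo eye's sensing aperture that remains illuminated for a given lateral beam offset grows as the beam and aperture widths grow. The widths and offset below are assumed for illustration only.

```python
# Fraction of the photo eye's sensing aperture that stays illuminated for a
# given lateral offset of the beam, treated in one dimension. Widths assumed.
def illuminated_fraction(beam_width_mm: float, aperture_mm: float, offset_mm: float) -> float:
    beam_lo, beam_hi = offset_mm - beam_width_mm / 2, offset_mm + beam_width_mm / 2
    ap_lo, ap_hi = -aperture_mm / 2, aperture_mm / 2
    overlap = max(0.0, min(beam_hi, ap_hi) - max(beam_lo, ap_lo))
    return overlap / aperture_mm

offset_mm = 0.7  # assumed lateral offset of the beam at the photo eye
print(illuminated_fraction(1.0, 1.0, offset_mm))  # ~0.30 with narrow components
print(illuminated_fraction(2.0, 2.0, offset_mm))  # ~0.65 with wider components
```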
It is expected that this design change will not increase the per-piece cost of the finished unit. LEDs and photo eyes are commodity electronic components and are available with a range of illumination fields and sensing areas. It is very likely that improved components can be found that will still satisfy the overall quality and price design requirements.
Design Proposal 7: Incorporate Optical Encoder Sensor LED and Photo Eye Unit Into One
Component
We have seen that an important Key Characteristic (and one that can be addressed) is the
alignment of the gear face with the axis connecting the optical encoder sensor LED unit and the
optical encoder sensor photo eye unit. This KC is obviously affected by the horizontal position,
the vertical position, and the rotational position about the horizontal and vertical axes. To
achieve this properly, both the optical encoder sensor LED unit and the optical encoder sensor photo eye unit must be placed accurately relative to each other. However, these components are currently mounted to the circuit board separately. Therefore, much care must be taken to ensure their alignment. Often, these components are placed improperly, as seen below:
Figure 24: Misalignment between optical encoder sensor LED and photo eye unit
The sensitivity to the relative position of the optical encoder sensor LED and the optical encoder sensor photo eye unit can be eliminated if these two items are incorporated into one unit. In effect, the two items become their own fully constrained sub-assembly. These integrated photo eye gates are common electronic devices, and it is expected that one can be found that would meet the quality and cost requirements.
Note that this integrated unit will likely have at least three external leads (power, signal, ground) that will be used to mount the unit to the circuit board (Part #11). Care should be taken that the mounting of these three leads does not create an over-constrained condition similar to the one that currently exists with the optical sensor units' photo eye sensors.
Design Proposal 8: Replacement of Mouse Wheel/Trackball, Gear, and Optical Encoder
Technology with Alternative Motion Sensing Technology
The objective of the entire ball (Part #2), gear (Part #7), and optical encoder electronics (Part #11) linkage is to reliably translate planar motion (input by a human user) into an electrical signal (output to an integrated circuit for analysis).
The original mouse, developed by Douglas Engelbart in 1968, utilized this wheel/trackball
technology. This design is shown below:
Figure 25: Underside view of mouse developed by Douglas Engelbart2
Note the horizontal and vertical discs protruding from the underside of the mouse. Since this
early design, this method has been improved upon incrementally, and evolved to its current form.
The figure below is a partially disassembled view of an original Apple mouse from 1983. This
mouse incorporated the ball into the design to transfer mechanical motion into optical
information. There are significant differences in this design as the key characteristic is integral
to the circuit board subassembly.
Figure 26: 1983 Apple Mouse
The figure below is an earlier version of the Microsoft Mouse we have studied thus far.
Figure 27: 1998 Microsoft Mouse 2.0
2 http://www.superkids.com/aweb/pages/features/mouse/mouse.html
While there were a number of design improvements, the key characteristic is still delivered via the same parts. The features on the base that locate the circuit board were redesigned, though the mate is still over-constrained.
[Photograph of the 1998 mouse base and circuit board with the two pin/hole locating features labeled.]
Figure 28: 1998 Microsoft Mouse – Base & Circuit Board Locating Features
However, there are numerous other technologies that can sense planar motion. A number of these are solid-state components (e.g., dual-axis accelerometers, texture recognition cameras) that require no moving parts. Solid-state electronics provide multiple advantages, such as: the reduction in total parts; the elimination of the need to periodically clean the device; and the reduction in overall weight (due to the lack of the need to create friction between the ball and the table surface).
The reduction in parts as a result of using solid-state technology will likely yield lower material costs, lower tooling costs, lower assembly costs and times, and overall higher reliability. Additional effort may be needed to process and filter the input signals to obtain the required resolution and bandwidth; however, these algorithms are readily available and would be simple to integrate.
This design approach has been implemented by a number of computer mouse manufacturers.
Optical mice have come to the forefront of the market, and are currently displacing the
wheel/trackball mouse as the dominant technology. The following Figures present different
optical mouse designs that are currently on the market.
Figure 29: Microsoft® Optical Mouse3
Figure 30: Macally Peripherals® Optical Mouse4
3 http://www.overclockers.co.uk/acatalog/ms_intellimouse_optical.jpg
4 http://www.macally.com/gif/products/usb/micromouse.gif
Figure 31: Logitech® Optical Mouse5
Other technologies that monitor input motion include virtual reality gloves and hand mounted
ring devices that use accelerometers to accurately track the desired motion and resolve it into
planar motion. Perhaps the computer mouse will eventually be replaced!
5 http://www.theshipcarver.com/images/products/peripherals/log_ifeel_optical.gif
Appendix 1:
List of Figures
FIGURE 1: COMPUTER MOUSE SEMI-EXPLODED VIEW
FIGURE 2: LIAISON DIAGRAM FOR MICROSOFT® MOUSE V2.2A
FIGURE 3: GEAR/SENSOR WITH COORDINATE SYSTEM
FIGURE 4: DATUM FLOW CHAIN DELIVERING KEY CHARACTERISTIC
FIGURE 5: MOUSE BASE FEATURES DELIVERING THE KC
FIGURE 6: CIRCUIT BOARD FEATURES DELIVERING THE KC
FIGURE 7: SENSOR FEATURES DELIVERING THE KC
FIGURE 8: GEAR FEATURES DELIVERING THE KC
FIGURE 9: DATUM FLOW CHAIN WITH ASSEMBLY DEGREES OF FREEDOM
FIGURE 10: ASSEMBLY FEATURES LINKING THE MOUSE BASE AND THE CIRCUIT BOARD
FIGURE 11: PROPOSED ASSEMBLY FEATURES LINKING THE MOUSE BASE AND THE CIRCUIT BOARD
FIGURE 12: ASSEMBLY FEATURES LINKING THE OPTICAL ENCODER SENSORS' LED UNITS AND THE CIRCUIT BOARD
FIGURE 13: PROPOSED ASSEMBLY FEATURES LINKING THE OPTICAL ENCODER SENSORS' LED UNITS AND THE CIRCUIT BOARD
FIGURE 14: ASSEMBLY FEATURES LINKING THE OPTICAL ENCODER SENSORS' PHOTO EYE UNITS AND THE CIRCUIT BOARD
FIGURE 15: PROPOSED ASSEMBLY FEATURES LINKING THE OPTICAL ENCODER SENSORS' PHOTO EYE UNITS AND THE CIRCUIT BOARD
FIGURE 16: NOMINAL CASE WHERE THE LIGHT TRANSMITTED BY THE OPTICAL ENCODER SENSOR LED UNIT IS SENSED BY THE OPTICAL ENCODER SENSOR PHOTO EYE UNIT
FIGURE 17: EFFECT ON THE AMOUNT OF LIGHT SENSED BY THE OPTICAL ENCODER SENSOR PHOTO EYE UNIT AS THE RESULT OF LATERAL MISALIGNMENT OF THE OPTICAL ENCODER SENSOR LED UNIT AND THE OPTICAL ENCODER SENSOR PHOTO EYE UNIT
FIGURE 18: EFFECT ON THE AMOUNT OF LIGHT SENSED BY THE OPTICAL ENCODER SENSOR PHOTO EYE UNIT AS THE RESULT OF ANGULAR MISALIGNMENT OF THE OPTICAL ENCODER SENSOR LED UNIT AND THE OPTICAL ENCODER SENSOR PHOTO EYE UNIT
FIGURE 19: NON-ROBUST DESIGN FOR OPTICAL ENCODER GEAR
FIGURE 20: PROPOSED DESIGN IMPROVEMENT FOR OPTICAL ENCODER GEAR
FIGURE 21: IMPROVED ROBUSTNESS OF SENSING DEVICE CREATED BY THE SHORTENING OF THE DISTANCE BETWEEN THE SENSOR LED AND THE PHOTO EYE UNIT
FIGURE 22: DISTANCE BETWEEN OPTICAL ENCODER SENSOR LED AND PHOTO EYE UNIT
FIGURE 23: IMPROVED ROBUSTNESS OF SENSING DEVICE CREATED BY THE USE OF ELECTRONIC COMPONENTS WITH WIDER FIELD OF ILLUMINATION AND SENSING AREAS
FIGURE 24: MISALIGNMENT BETWEEN OPTICAL ENCODER SENSOR LED AND PHOTO EYE UNIT
FIGURE 25: UNDERSIDE VIEW OF MOUSE DEVELOPED BY DOUGLAS ENGELBART
FIGURE 26: 1983 APPLE MOUSE
FIGURE 27: 1998 MICROSOFT MOUSE 2.0
FIGURE 28: 1998 MICROSOFT MOUSE – BASE & CIRCUIT BOARD LOCATING FEATURES
FIGURE 29: MICROSOFT® OPTICAL MOUSE
FIGURE 30: MACALLY PERIPHERALS® OPTICAL MOUSE
FIGURE 31: LOGITECH® OPTICAL MOUSE
Appendix 2:
List of Tables
TABLE 1: FEATURES DELIVERING THE KC