The Web of System Performance
A multi-goal model of information system performance
Brian Whitworth, Jerry Fjermestad and Edward Mahinda
Introduction
Modern information systems face demands so diverse that some postulate a virtual “evolution”, where only the fit succeed (David, McCarthy, & Sommer, 2003). Yet what is information system (IS) “fitness”? Successful life varies from simple viruses to powerful predators. Not only the strong are fit, and one-sided “excellence” can precede extinction. IS progress may be equally non-linear. Laptops give less power than a PC at more cost, yet sell well. The variability of modern technology suggests a multi-goal model of system performance (Chung, Nixon, Yu, & Mylopoulos, 1999): a web of system performance.
The web of system performance
We take performance as how well a system interacts with its environment, and suggest
four elements: a system/non-system boundary, a supporting internal structure, output
effectors, and input receptors (Whitworth & Zaic, 2003), e.g.
a. Cells have a membrane boundary, internal support (nucleus), flagella to move
(effectors), and photo-receptors.
b. People have a skin boundary, internal brain and organs, acting muscles, and sensory
input.
c. Computers have a physical case, motherboard architecture, printer/screen effectors,
and keyboard/mouse “receptors”.
d. Software has a memory boundary, an internal program structure, and specialized
input/output modules.
Successful systems interact with their environment to gain value and avoid loss. Each
system element can maximize opportunity or minimize risk, giving eight performance
goals:
1) Boundary:
a) To enable useful entry (extendibility).
b) To deny harmful entry (security).
2) Internal structure:
a) To accommodate external change (flexibility).
b) To accommodate internal change (reliability).
3) Effector:
a) To maximize external effects (functionality).
b) To minimize internal effort (usability).
4) Receptor:
a) To enable meaning exchange (connectivity).
b) To limit meaning exchange (privacy).
An information system today is not “high performance” if it is:
1. Ineffectual – it cannot do the job.
2. Unusable – you cannot make it work.
3. Unreliable – it breaks down often.
4. Insecure – it succumbs to viruses.
5. Inflexible – it fails when things change.
6. Incompatible – it cannot use standard plug-ins or data.
7. Disconnected – it cannot communicate.
8. Indiscreet – it reveals private information.
The web of system performance (WOSP) model has four success-creating sub-goals
(functionality, flexibility, extendibility, connectivity), and four failure-avoiding sub-goals
(security, reliability, privacy, usability). In Figure 1, the web area represents system performance, the lines represent the “tension” between its aspects, and the shape shows the type of performance (see later). None of the WOSP goals are new (Table 1), but their integration
into a common framework is.
Figure 1. Web of System Performance
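To make the web-area idea concrete, the sketch below (a minimal illustration, not part of the WOSP model itself) scores each sub-goal on a 0-1 scale, treats the scores as spokes on eight equally spaced radar axes, and takes the enclosed polygon area as a rough performance index. The scale, the example profiles and the function names are our own illustrative assumptions.

    import math

    # The eight WOSP sub-goals, in the order they appear around the web.
    WOSP_GOALS = ["extendibility", "security", "flexibility", "reliability",
                  "functionality", "usability", "connectivity", "privacy"]

    def web_area(scores):
        """Area of the radar polygon formed by eight 0-1 sub-goal scores.

        Each score is a spoke on equally spaced axes (45 degrees apart);
        adjacent spokes bound triangles whose areas are summed.
        """
        r = [scores[g] for g in WOSP_GOALS]
        step = 2 * math.pi / len(r)
        return sum(0.5 * r[i] * r[(i + 1) % len(r)] * math.sin(step)
                   for i in range(len(r)))

    # Two hypothetical profiles with the same average score (0.575).
    feature_rich = dict(zip(WOSP_GOALS, [0.6, 0.3, 0.5, 0.7, 0.9, 0.6, 0.8, 0.2]))
    balanced = dict(zip(WOSP_GOALS, [0.575] * 8))

    print(round(web_area(feature_rich), 3))  # 0.923 - uneven shape, smaller web
    print(round(web_area(balanced), 3))      # 0.935 - even shape, larger web

Holding the average score fixed while changing the shape changes the area, which is why the shape of the web, and not just its strongest point, matters for performance.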
Table 1. System Design Goal Terminology

Sub-Goal       Similar Terms
Extendibility  Openness, interoperability, permeability, compatibility, scalability.
Security       Defense, protection, safety, threat resistance, integrity.
Flexibility    Adaptability, portability, customizability, plasticity, agility, modifiability.
Reliability    Stability, dependability, robustness, ruggedness, durability, availability, maintainability.
Functionality  Effectualness, capability, usefulness, effectiveness, power, utility.
Usability      Ease of use, simplicity, user friendliness, efficiency, accessibility.
Connectivity   Networkability, communicativeness, interactivity, sociability.
Privacy        Confidentiality, secrecy, camouflage, stealth, social rights, ownership.
Boundary goals
A system’s boundary determines what enters it, and can be designed to repel external
threats (security), or accept external opportunities (extendibility).
Extendibility
Extendibility, or openness, is a system’s ability to make use of outside elements, e.g. a
car with a tow hitch can add a trailer. Human tool-use extends performance the same way,
and programs can use third party plug-ins given the equivalent of an “open” human hand.
However, to use a trailer the car’s tow hitch must match the trailer link. This explains the benefit of IS open standards. Who would have thought, in the software copy protection days, that Netscape would thrive with publicly available source code? Yet this encouraged third party development, which increased performance. Openness seems to be why the early IBM PC superseded a more reliable, secure and usable Apple Macintosh: one was a proprietary black box, while the other had standard slots for third party cards. Extendibility is a factor in IS performance.
Security
Security is a system’s ability to protect against unauthorized entry, misuse or
takeover, e.g. a secure car has locks. A security breach is a system failure, and so a
performance failure. Secure hardware is sealed and tamper-proof, and distributors prefer
compiled to interpreted software because users cannot alter it. The entry denial principle is
the same for hardware and software. Virus and hacker threats make boundary firewalls
and logon checks critical to system survival. Security is part of IS performance.
Structure goals
A system’s internal structure can be designed to manage internal changes (reliability)
or external changes (flexibility).
Flexibility
Flexibility is a system’s ability to work in new environments, e.g. tracked vehicles
operate in any terrain, and mobile computers can work anywhere. CSMA/CD (Ethernet)
protocols outperformed more reliable but less flexible polling protocols. Flexible relational
databases displaced more efficient but less flexible hierarchical and network models. Most
modern software has a “preferences” module to configure it for hardware, software or user
environments, e.g. the Windows control panel. Flexibility is part of IS performance.
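As a small illustration of this pattern, the sketch below keeps environment-specific settings apart from fixed program logic, so the same code can be reconfigured for a new environment without change. The file name, setting names and defaults are illustrative assumptions.

    import json

    # Illustrative defaults; a real preferences module would expose many more.
    DEFAULTS = {"language": "en", "network": "wifi", "font_size": 12}

    def load_preferences(path="prefs.json"):
        """Merge user/environment overrides over the defaults."""
        prefs = dict(DEFAULTS)
        try:
            with open(path) as f:
                prefs.update(json.load(f))  # e.g. {"network": "cellular"} on a phone
        except FileNotFoundError:
            pass                            # no overrides: run with the defaults
        return prefs

    print(load_preferences())  # the rest of the program reads its settings from here

The point is that flexibility is designed in as a separate, configurable layer rather than scattered through the code.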
Reliability
Reliability is a system’s ability to continue operating despite internal changes like
part failure, e.g. a car that always starts is a great thing. Reliable systems can be trusted to
work, survive stress or load, and if affected, degrade “gracefully” rather than crash
catastrophically. In IS, mean time between failures (MTBF) is a common reliability measure, from which the probability of failure-free operation over a given period can be estimated. Equally important is fast recovery, whether by error code or state rollback. Computer companies, like Dell, with better after-sales support and warranties, succeed because reliability is part of IS performance.
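For concreteness, the sketch below estimates two standard reliability figures from hypothetical numbers: the probability of failure-free operation over a period, assuming (as is common) exponentially distributed failures, and availability, which also rewards fast recovery. The figures and function names are illustrative assumptions, not WOSP prescriptions.

    import math

    def reliability(mtbf_hours, period_hours):
        """P(no failure during the period), assuming exponentially distributed failures."""
        return math.exp(-period_hours / mtbf_hours)

    def availability(mtbf_hours, mttr_hours):
        """Long-run fraction of time the system is up: MTBF / (MTBF + MTTR)."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    # Hypothetical server: fails every 2,000 hours on average, takes 4 hours to restore.
    print(f"30-day reliability: {reliability(2000, 24 * 30):.1%}")  # ~69.8%
    print(f"Availability:       {availability(2000, 4):.3%}")       # ~99.800%

A lower MTTR raises availability even when MTBF is unchanged, which is why fast recovery matters as much as failure avoidance.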
Effector goals
System effectors change the external environment, and can be designed for maximum
effect (functionality), or minimum cost (usability).
Functionality
Functionality is a system’s ability to act directly on its environment to produce a
desired change, e.g. a car’s speed, or a word processor’s power to change documents. A
focus on functional requirements gives “feature”-laden software that gets the job done, so the effort to use it is never wasted. People upgrade for functionality, so functionality is
part of IS performance.
Usability
Usability is a system’s ability to minimize the resource costs of action, e.g. a usable
car is easy to drive and easy on petrol (economical). Reduced instruction set computing (RISC) chips can outperform complex instruction set computing (CISC) chips by using simpler, cheaper instructions for the same work. “Light” software runs well in the background because it uses little CPU and memory. In human-computer interaction (HCI), graphical user interfaces (GUIs) replaced command user interfaces (CUIs) because they reduced user cognitive effort. Usability is part of IS
performance.
Receptor goals
Social interaction magnifies human performance, both for good and ill. A system’s receptors can enable information exchange (connectivity) or limit it (privacy), adding a social dimension to system performance.
Connectivity
Connectivity is a system’s ability to communicate with other systems, e.g. connected
cars could detect other cars and avoid collisions. Earlier we linked actions to effectors, though actions occur in an input-guided feedback loop. Likewise, meaning links to receptors, even though communicative acts require effectors. Receptors create meaning, the end result of communication, just as effectors create the end result of actions. Connected software can download updates and communicate on various levels (Whitworth et al., 2001). Connectivity is part of IS performance.
Privacy
Privacy is a system’s ability to control the release of information about itself, e.g. a car’s tinted windows hide the occupants. The military values stealth airplanes for the same reason that animals camouflage themselves. In society, withholding information is important because public ridicule or censure can have physical consequences. In social-technical worlds, privacy is part of IS performance.
Implications
The WOSP model can be used in system design or system evaluation.
Environment “tuning”
Much system design assumes performance is absolute, but in the WOSP model,
performance is relative to the environment, and has no “perfect” form:
1. In opportunity environments, right action gives great benefit.
2. In risk environments, wrong action gives great loss.
3. In dynamic environments, risk and opportunity change rapidly.
If performance has a shape as well as an area, different shapes may suit different
environments (Figure 2). The WOSP model can help developers analyze the system shape
that fits a given environment (Table 2).
Table 2. WOSP sub-goal weights

Goal           Detail                                       Weight %
Extendibility  Use outside component/data add-ins?
Security       Resist outside attack/take-over?
Flexibility    Predict/adapt to external changes?
Reliability    Avoid/recover from internal failure?
Functionality  What task functionality is required?
Usability      Conserve system/user effort or training?
Connectivity   Communicate/connect with other systems?
Privacy        Manage self-disclosure?
Performance    Interact successfully with the environment.  100%
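One way to apply Table 2 is to weight the eight sub-goals for a target environment (totalling 100%), rate a candidate system on each, and compare weighted totals across designs or environments. The sketch below shows the mechanics; the weights, ratings and names are illustrative assumptions, not values the WOSP model prescribes.

    # Illustrative weights (%) for a hypothetical mobile banking environment:
    # risk goals dominate, but connectivity still matters.
    weights = {"extendibility": 5, "security": 25, "flexibility": 10, "reliability": 20,
               "functionality": 10, "usability": 10, "connectivity": 10, "privacy": 10}
    assert sum(weights.values()) == 100  # Table 2 requires the weights to total 100%

    # Hypothetical 0-10 expert ratings of one candidate system on each sub-goal.
    ratings = {"extendibility": 6, "security": 8, "flexibility": 5, "reliability": 9,
               "functionality": 7, "usability": 6, "connectivity": 8, "privacy": 7}

    def weighted_performance(weights, ratings):
        """Environment-weighted performance, on the same 0-10 scale as the ratings."""
        return sum(weights[g] * ratings[g] for g in weights) / 100

    print(weighted_performance(weights, ratings))  # 7.4 for this profile

Re-weighting the same ratings for a different environment shows how the “best” system changes with the environment, which is the point of tuning the web shape to its setting.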
Multi-dimensional requirements
In traditional IS development, functionality is the primary goal, leaving “non-functional” requirements (NFRs) as second-rate needs. Yet many systems have more lines of error or interface code than functional code, and systems fail surprisingly often for “unexpected” non-functional, or “quality”, reasons (Cysneiros & Leita, 2002, p. 699). If
NFRs can cause system failure, they define performance as well as modify it. In the
WOSP model, functionality differs from other system goals only in being more obvious.
Low usability can nullify functionality, just as low functionality can make usability
irrelevant. Modern technology illustrates the multi-dimensional nature of performance.
Cell phones let people talk anywhere, anytime (flexibility), but must still be effective
(voice quality), usable (keys not too small), reliable (if dropped), secure (if stolen),
extendable (earphones, phone covers), connected (reception) and private (prevent
snooping). Ubiquitous wireless software must flexibly use different devices/networks, but
also be resilient to data entry errors and power failures. It must be scalable, yet secure
against virus attack, connect yet maintain user privacy. If success needs many causes,
failure may need only one. The WOSP model is a useful checklist for complex new
technology.
Combination breakthroughs
When is an advance not an advance? In the WOSP model, only an area increase is
progress, and this may require combination breakthroughs. In 1992, Apple CEO John
Sculley introduced the hand-held Newton, saying portability (flexibility) was the wave of
the future. We now know he was right, but the Newton’s smallness made data-entry hard,
and its handwriting recognition was poor. The usability reduction neutralized the
flexibility advance, and in 1998 Apple dropped the line. Yet, when Palm’s Graffiti
language solved the usability problem, the PDA market revived (though it is now under
threat from cell phone devices with better connectivity).
When the web is slack, any advance increases performance, but as systems evolve
“tension” occurs. Specialties pull the system in different directions, and one purpose can
“cut across” another (Moreira, Araujo, & Brita, 2002), e.g. upgrading connectivity from
one-to-one 1G circuit switching to 2.5G “always on” packet switching created security
problems. The WOSP sub-goals are as if connected by elastic, so pulling one point can
reduce others. Hence “feature creep” can give complex “bloatware” that is hard to use,
maintain and defend. This explains why later versions of successful products can, after
much development effort, perform worse than the original, and why “progress” in general
can “bite back” (Tenner, 1997).
Yet flexibility need not deny reliability, nor functionality reduce usability, nor
Internet connectivity abuse privacy (Whitworth & de Moor, 2003). One can expand the
web by “pulling” two or more sides at once, e.g. a logon sub-system (security) can welcome users by name and recall their preferences (increasing usability). The WOSP
model suggests that apparent opposites, like security and openness, can be reconciled – if
the tension is recognized and resolved.
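A minimal sketch of that example, assuming a toy in-memory user store: the same logon step both denies harmful entry and personalizes the session. A production logon would use salted, slow password hashing and proper session handling; the names and data here are illustrative.

    import hashlib

    # Toy user store: a password hash (security) plus stored preferences (usability).
    USERS = {
        "ana": {"pw_hash": hashlib.sha256(b"s3cret").hexdigest(),
                "prefs": {"theme": "dark", "font_size": 14}},
    }

    def login(name, password):
        """Deny entry on a bad password, but welcome known users with their settings."""
        user = USERS.get(name)
        if user is None or hashlib.sha256(password.encode()).hexdigest() != user["pw_hash"]:
            return None                                    # security: harmful entry denied
        return {"greeting": f"Welcome back, {name}", **user["prefs"]}  # usability gained

    print(login("ana", "s3cret"))  # personalized session settings
    print(login("ana", "wrong"))   # None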
Project integration
Traditional projects often prepare system requirements by specialty since, for example, interface and database design need different skills. Designing for add-ins needs standards
knowledge, while security design needs virus knowledge. The WOSP model recognizes
this goal specialization. Depending on the project, up to eight project specialty teams
could design different system layers, with separate specifications, code and testing (Table
3). However, extreme specialization can produce what Tenner (1997) calls the “Frankenstein effect”: Dr Frankenstein stitched together the “best” of each body part he found in the graveyard, and the result was a (low-performance) monster. The modular WOSP goals interact in implementation because
there is only one final system. This adds a new factor to the above specialties – their
integration.
Table 3. System specialist performance layers

Goal           Analysis                             System Layer           Testing
Extendibility  Standards/Interoperability analysis  Import/Add-in          Compatibility
Security       Threat analysis                      Security/Log-on        Penetration
Flexibility    Contingency analysis                 Configuration/Control  Situation
Reliability    Error recovery analysis              Recovery               Stress
Functionality  Task analysis                        Application            Task
Usability      Usability analysis                   Interface              User
Connectivity   Network analysis                     Communication          Channel
Privacy        Legitimacy analysis                  Rights                 Social
Integration maximizes when everyone designs all the system, as in “extreme” or “agile” project management styles. However, who today knows all aspects of system design? The WOSP model suggests ways to increase integration while keeping the benefit of specialization:
1. Designate integrators: Cross-disciplinary integrators can liaise and chair meetings to reconcile cross-over conflicts and create synergies.
2. Combine specialists: For example, create four teams, for actions (functionality + usability), interactions (security + extendibility), contingencies (reliability + flexibility) and sociability (connectivity + privacy).
New projects may need more integration, while familiar projects may specialize more. The
WOSP tension lines suggest that developing complex social-technical systems is as much
about integration as specialization.
Discussion
Limitations
The WOSP model describes system performance causes, not external causes like marketing, politics and distribution. It also ignores system cost, assuming buyers expect to pay more for more performance. Finally, the WOSP logic assumes the system “world”
is defined. Systems can be viewed on four levels: mechanical, informational, cognitive and
social (Whitworth & Zaic, 2003). Each depends on the previous (e.g. software depends on
hardware), yet each level “emerges” with higher demands and greater potential, e.g. social
systems increase productivity, but require legitimate interaction to do so (Whitworth & de Moor, 2003). The WOSP model applies to any IS level, but not to all at once.
Theoretical
Some system design theories, like the waterfall method, suggest ways to achieve
presumed known goals. The WOSP model, however, analyses IS ends and finds a web of performance. Most goal models exist within specialties, like security, usability and flexibility. They recognize their own goals, and tend to subsume others under their specialty,
e.g. the European general security model includes availability, integrity, reliability and
confidentiality under a general security concept. The WOSP model modularizes these concepts under system performance. It treats confidentiality as privacy, integrity as functionality, and availability as usability. It separates security from reliability, as the engineers who maintain society differ from the police who protect it. Mechanisms that increase fault-tolerance (reliability) can reduce security, which is illogical if reliability is part of security.
Other models support this reliability/security distinction, as one involves the provision of
services, while the other involves denial of services (Jonsson, 1998). Finally, the security
model under-represents opportunity goals, like connectivity and flexibility.
The Technology Acceptance Model (TAM) raised the importance of usability in
system design (Davis, 1989). What value is a powerful system that one cannot use? Yet it
ignores factors like security and reliability, which obviously affect acceptance. When
“usability” measures include “suitability for task” (functionality) and “error tolerance”
(reliability) (Gediga, Hamborg, & Duntsch, 1999), then usability, like security before it, is
becoming a confusing catch-all term for performance. Likewise, when flexibility
proponents suggest “scalability” and “connectivity” as aspects of flexibility (Knoll &
Jarvenpaa, 1994, p6), specialist terms are expanding to fill the available theory space. This
creates confusion. The WOSP model modularizes concepts like usability, security,
reliability and flexibility, keeping “system performance” as the general rubric.
Progress points
It is easy to forget how yesterday saw today’s progress. The Internet was for techno-geeks, until virtual reality became “real”. Email was socially inept lean communication,
until text became “rich”. Berners-Lee’s World Wide Web was passed over by CERN, the
Hypertext community and Microsoft, before MIT took it up to create online society. Cell
phones were yuppy toys, until everyone got one.
The WOSP model suggests that prevalent IS theories have repeatedly failed to predict
evident IS progress because performance is multi-dimensional: the Internet offered
massive connectivity; text e-mail was easy to use; the World Wide Web was scalable; and
the cell phone was flexible. What is the common factor? Traditional models see progress
as a train on one track moving forward. The WOSP model sees progress as a train on
many tracks, switching between them to expand the covered area. Progress in only one direction soon gives diminishing returns, and creates tension in other areas.
If progress in one aspect tends to be followed by progress in another, today’s trends
may not be tomorrow’s progress points. A decade ago multi-media was “hot”, but Star
Trek’s vid-phone did not happen, nor did video-conferencing boom, nor were virtual
reality goggles the future of computer gaming. How did the computer game frontier
progress? Game editors became popular, opening games to user map and scenario
contributions (e.g. Doom’s WAD files). Games became connected, allowing virtual social
worlds, like The Sims and massively multi-player online role-playing games
(MMORPGs).
Experts are, by the nature of things, experts in the past, so the progress they predict is
not always the progress that occurs. The WOSP model suggests, counter-intuitively, that
while a system’s strongest aspect(s) may create its current success, developing its weakest
aspect gives the greatest long-term performance gains. If performance is the WOSP area,
it is most increased by extending its shortest extent. Perhaps the future of online gaming
lies in exclusive groups! We are currently developing a WOSP checklist to evaluate
social-technical system performance in a more balanced way.
Conclusion
The lesson that progress is not linear is hard and ongoing. When something works, we
want to do it again – and again. Yet if biological evolution applies to software, IS progress
will have many forms. It is interesting that “killer” applications, like browsers, e-mail and chat, are functionally quite simple. Perhaps this simplicity creates the “slack” necessary
for all-round performance.
Today’s users want it all: functionality, usability, reliability, flexibility, security,
extendibility, connectivity and privacy. To deliver this web of performance, system
designers need specialization (to handle complexity) and integration (so one advance does
not nullify another). We suggest:
1. Cross-disciplinary research into system design conflicts and synergies.
2. Project teams allocate personnel and time to integration.
In biological evolution, one-dimensional progress is not enough. Why should software
evolution be different? In the short term, specialization may succeed, but in the long term,
it can be a one-way street. If today’s specialties need a motto, we suggest: “You are not
advancing alone”. Unless specialties work with other specialties, we could create
technology monsters. In academia, specialists who cross-train are disadvantaged in their
second field. Yet in software practice, these same generalists create progress. If IS theory
is to drive IS practice, it needs an integration “specialty”, because system performance is
balance as well as excellence.
REFERENCES
Chung, L., Nixon, B. A., Yu, E., & Mylopoulos, J. (1999). Non-functional requirements in
Software Engineering. Kluwer Academic.
Cysneiros, L. M., & Leita, J. (2002). Non-functional requirements: From elicitation to
modeling languages. Paper at ICSE, Orlando, Florida.
David, J. S., McCarthy, W. E., & Sommer, B. S. (2003). Agility - The key to survival of
the fittest. Communications of the ACM, 46(5), 65-69.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.
Gediga, G., Hamborg, K., & Duntsch, I. (1999). The IsoMetrics usability inventory: an
operationalization of ISO9241-10 supporting summative and formative evaluation
of software systems. Behaviour & Information Technology, 18(3), 151-164.
Jonsson, E. (1998). An integrated framework for security and dependability. Paper at NSPW, Charlottesville, VA, USA.
Knoll, K., & Jarvenpaa, S. L. (1994). Information technology alignment or "fit" in highly
turbulent environments: The concept of flexibility. Paper at SIGCPR 94,
Alexandria, Virginia, USA.
Moreira, A., Araujo, J., & Brita, I. (2002). Crosscutting quality attributes for requirements
engineering. Paper at SEKE, Ischia, Italy.
Tenner, E. (1997). Why Things Bite Back. New York: Vintage Books, Random House.
Whitworth, B., Gallupe, B., & McQueen, R. (2001). Generating agreement in computer-mediated groups. Small Group Research, 32(5), 621-661.
Whitworth, B., & de Moor, A. (2003). Legitimate by design: Towards trusted virtual
community environments. Behaviour & Information Technology, 22(1), 31-51.
Whitworth, B., & Zaic, M. (2003). The WOSP Model: Balanced Information System
Design and Evaluation. Communications of the Association for Information
Systems, 12, 258-282.