PROGRESS REPORT
FIRST YEAR
GRANT CCR-9624846
Improving the Design of Interactive Software
David F. Redmiles
Department of Information and Computer Science
University of California, Irvine
Irvine, CA 92697-3425
email: redmiles@ics.uci.edu
voice: (714) 824-3823
fax: (714) 824-1715
web http://www.ics.uci.edu/~redmiles/
July 1996 - July 1997
Table of Contents

Overall Summary of Progress for Year 1
Collaborations
Theoretical Progress
    Human-Centered Software Development Lifecycle
    Software Design Environments
    Usage Data Collection
    Information Review and Project Awareness
System Implementation Progress
    System Integration — A Scenario of Design, Usage, and Review
    The Argo Software Design Environment
    EDEM Software for Usage Data Collection
    The Knowledge Depot for Information Review and Project Awareness
Immediate, On-going Work Plan
Personnel Supported on the Grant
Research Support of Senior Personnel
List of Publications—Year 1
List of Presentations—Year 1
Web Site Guide
References
List of Figures

Figure 1: Human-Centered Software Development Lifecycle
Figure 2: The Argo Software Design Environment
Figure 3: Mapping between Cognitive Theory and Design Environment Features
Figure 4: Differing Expectations between Developers and Users
Figure 5: Travel Voucher Application
Figure 6: Agents Editor
Figure 7: Monitoring Events
Figure 8: Organizing Information for Review in Knowledge Depot
Figure 9: Architecture of the Travel Voucher Application in Argo
Figure 10: Process Model for Constructing the Architecture of an Application

List of Tables

Table 1: Characterizing Individual and Group Awareness
Improving the Design of Interactive Software
Overall Summary of Progress for Year 1
The goals of our research are to combine elements of research in human-computer interaction
with research in software engineering. More specifically, we have made progress with the following approaches. First, we have focused on the communication that takes place between software developers and the intended end users of interactive systems. To facilitate this communication we have developed an approach we call expectation agents, which automatically collect usability data and report it back to developers. Second, we have focused on the collection and organization of usability data and other information that software developers, end users, and other stakeholders exchange during the course of a project. Namely, we have explored an approach based on
group memory and organizational awareness. Third, we have re-examined the highest level of
activity in software development, namely software architecture design. We have developed this
approach in light of current cognitive analyses of human problem solving and design. We continually seek to integrate our efforts and refer to them under the collective term of human-centered
software development.
Three papers are included with this progress report to provide details of these approaches (Robbins, Hilbert, Redmiles 1996; Hilbert, Robbins, Redmiles 1997; Kantor, Zimmermann, Redmiles
1997). In the body of this progress report, we include a scenario of how we envision the three specific approaches working together in the context of a software development effort. This
project is strengthened by our close collaboration with NYNEX/Bell Atlantic in White Plains.
This collaboration grounds our theories in software practice and may in the future provide one
venue for evaluation. We also corroborate our approach when possible with other research
projects of similar interest. Thus, herein we also report on our specific efforts in collaborations.
In summary, this report describes our progress in the following areas:
• Collaborations: developments in our on-going collaboration with NYNEX/Bell Atlantic
and collaborations with other research groups;
• Theoretical Progress: an integrated theoretical framework for human-centered software
development; and
• System Implementation Progress: implementation of our theory through three systems for
supporting, respectively, expectation agents, organizational awareness, and software architecture design.
In addition, this report provides information on the following:
• Immediate, On-going Work Plan: specific current goals (consistent with the original proposal);
• Personnel Supported on the Grant: people contributing to the effort with or without direct
funding;
• Research Support of Senior Personnel: efforts to expand our research through new proposals;
• List of Publications—Year 1;
• List of Presentations—Year 1; and
• Web Site Guide.
Collaborations
A key aspect of our research program is our collaboration with outside industry and research
groups. These collaborations provide valuable feedback on our research ideas. The feedback
comes both from other researchers and actual practitioners. Foremost, this feedback is a critical
element of any research. However, it also is extremely valuable to graduate students in that it provides them the opportunity to think more realistically about their work. Finally, it provides opportunities for internships and potential future employment for them.
With respect to this grant, our primary collaboration is with NYNEX/Bell Atlantic based in White
Plains. Our principal contacts there are Michael Atwood and Bea Zimmermann. Through this collaboration we gain access to end users in actual workplace settings and have the opportunity to
jointly develop prototype systems. Some milestones in this collaboration include the following:
• internship at NYNEX/Bell Atlantic for one student during Summer 1996 and for Summer
1997;
• licensing agreement, completed in June 1997, for joint development of the Knowledge
Depot software tool described in this report;
• visits by technical staff, UCI to NYNEX/Bell Atlantic in June 1996, NYNEX/Bell Atlantic
to UCI in April 1997;
• joint authorship of a technical report on group awareness (Kantor, Zimmermann, Redmiles 1997).
In the past six months, we have cultivated a working relationship with a Los Angeles-based company, Computing Services Support Solutions (Cs3). They are working on an event-based language relevant to our work on expectation agents. K. “Swamy” Narayanaswamy and Martin Feather are our principal contacts there. To date, we have
• hosted a research visit by Cs3 staff at UCI;
• explored plans to jointly evaluate our software.
Theoretical Progress
Our research has three objectives: 1) to develop a model of software development as a process of
on-going communication; 2) to support this model through active mechanisms in software tools;
and 3) to improve the accessibility and acceptance of usability methods by software practitioners. In
general, the objectives reflect a theory of human-centered software development.
Human-Centered Software Development Lifecycle
Figure 1 shows one way to illustrate a human-centered software development lifecycle. Namely,
prototype systems are designed, evaluated in a usage situation, and the results of the evaluation
are reviewed and incorporated back into a revised design. Of course, by this diagram, we do not
mean to imply that these activities are segregated phases of a lifecycle; rather, the activities are
separated out for ease of discussion and to emphasize areas where software tool support is possible.
Figure 1: Human-Centered Software Development Lifecycle. (The figure shows a cycle of Design, Usage, and Review: a prototype with expectation agents is designed in Argo and deployed; during usage, EDEM's expectation agents deposit observations in the Knowledge Depot, alongside direct observations, interviews, and other manual usability methods; the expectation agents' reports and other information are reviewed, together with other design reviews including email discussions, and the reviews are incorporated back into the design.)
This lifecycle is human-centered in two ways. First, the emphasis on prototypes and evaluation of
usage maintains a focus on the requirements of end users and other stakeholders in the project.
Second, there is an underlying assumption that the software developers need cognitive support for
managing the complexity of usability information and incorporating that data and other information into design. Thus, the applications being developed are engineered to the human cognitive needs of the end users, and the tools used to develop those applications are engineered to the human cognitive needs of the software developers.
Software Design Environments
The approach we have taken emphasizes the use of software prototypes. This is well motivated by
the research on participatory design (Greenbaum and Kyung 1991). Prototypes provide a focus
around which software developers, end users, and other stakeholders can communicate and
expose requirements. While we acknowledge the necessity of non-executable prototypes (such as
mock-ups or informal drawings) early in the development lifecycle, we seek to enable software
developers to create executable prototypes as soon as possible. The approach we have taken is to
focus on high-level design through software architecture and support that design through domain-oriented design environments (Fischer 1994). In particular, we have developed a design environment, called Argo, for the domain of software architecture design (Robbins, Hilbert, Redmiles
1996). In this domain, it is possible for a designer to generate a software prototype from a high
level architectural specification. Figure 2 provides a view of Argo (center) being used to design a
video game (left). A “to do” checklist window appears on the right in this figure.
Figure 2: The Argo Software Design Environment
In general, a domain-oriented design environment is a tool that augments a human designer’s ability to design complex artifacts. Design environments must address systems-oriented issues such
as design representation, transformations on those representations (e.g., code generation), and
application of analysis algorithms. Furthermore, design environments exceed the capabilities of
most CASE (computer-aided software engineering) tools by addressing the cognitive needs of
designers. Specifically, we have examined three cognitive theories of design and problem solving:
reflection-in-action, opportunistic design, and comprehension and problem solving. Figure 3 summarizes our work by showing a mapping from these cognitive theories to specific design environment features that address them.
The cognitive theory of reflection-in-action observes that designers of complex systems do not
conceive a design fully-formed (Schoen, 1983, 1992). Instead, they must construct a partial
design, evaluate, reflect on, and revise it, until they are ready to extend it further. Design environments support reflection-in-action with critics that give design feedback. Critics are agents that
watch for specific conditions in the partial design as it is being constructed and present the
designer with relevant feedback. Critics can be used to deliver knowledge to designers about the
implications of, or alternatives to, a design decision. Examples of domain knowledge that can be
delivered by critics include well-formedness of the design, hard constraints on the design, rules of
thumb about what makes a good design, industry standards, organizational guidelines, and the
opinions of fellow project stakeholders and domain experts. In Argo, critics are displayed in the
“to do” checklist window on the right in Figure 2.
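To make the critic mechanism concrete, the following minimal Java sketch pairs an analysis predicate over a partial design with a feedback message; the class and method names (Component, DesignCritic, and so on) are our own illustrative assumptions, not Argo's actual interfaces.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch only: these names are hypothetical, not Argo's actual API.
    class Component {
        String name;
        List<String> requiredInterfaces = new ArrayList<>();
        List<String> connectedInterfaces = new ArrayList<>();
        Component(String name) { this.name = name; }
    }

    interface DesignCritic {
        boolean matches(Component c);   // predicate over the partial design
        String advice(Component c);     // feedback delivered when the predicate holds
    }

    // A critic that flags components with required interfaces that are not yet connected.
    class UnconnectedInterfaceCritic implements DesignCritic {
        public boolean matches(Component c) {
            return c.requiredInterfaces.size() > c.connectedInterfaces.size();
        }
        public String advice(Component c) {
            return "Component '" + c.name + "' has required interfaces that are not yet connected.";
        }
    }

    public class CriticDemo {
        public static void main(String[] args) {
            Component expenseService = new Component("ExpenseReportService");
            expenseService.requiredInterfaces.add("IReimbursement");
            DesignCritic critic = new UnconnectedInterfaceCritic();
            if (critic.matches(expenseService)) {
                System.out.println("TO DO: " + critic.advice(expenseService));  // would appear as a "to do" item
            }
        }
    }

In this sketch the critic simply prints its advice; in the design environment the corresponding feedback is posted to the "to do" list described above.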
Figure 3: Mapping between Cognitive Theory and Design Environment Features. (The figure maps theories of cognitive needs to support in Argo. Reflection-in-action, with its needs for diversity of knowledge, evaluation during design, and provision of missing knowledge, maps to critics, which have low barriers to authorship, run continuously and pessimistically, are kept relevant and timely by control mechanisms, and produce "to do" items with links. Opportunistic design, with its needs for timeliness, reminding, process flexibility, process guidance, and process visibility, maps to the "to do" list, which presents feedback, holds reminders to oneself, and allows choice, and to the process model, with process editing, process critics, and process perspectives. Comprehension and problem solving, with its needs to divide complexity and to have multiple perspectives that match mental models, maps to design perspectives, which are multiple and overlapping, with customizable presentation graphics and presentation critics.)
The cognitive theory of opportunistic design explains that although designers plan and describe
their work in an ordered, hierarchical fashion, in actuality, they choose successive tasks based on
the criterion of cognitive cost (Hayes-Roth, Hayes-Roth, 1979; Guindon, Krasner, Curtis 1987; Visser, 1990). Simply stated, designers do not follow even their own plans in order, but choose the steps with the least cognitive cost among the alternatives. The cognitive cost of a task depends on
the background knowledge of designers, accessibility of pertinent information, and complexity of
the task. Thus, although it is customary to think of solutions to design problems in terms of a hierarchical plan since hierarchical decomposition is a common strategy to cope with complex design
situations, in practice, designers have been observed to perform tasks in an opportunistic order.
Design environments can allow the benefits of both an opportunistic and a prescribed design process. They should allow, and where possible augment, human designers’ abilities to choose the
next design task to be performed. They can also provide information to lower the cost of following the prescribed process and avoid context switches that deviate from it.
The theory of comprehension and problem solving observes that designers must bridge a mental
gap between their model of the problem or situation and the formal model of a solution or system
(Kintsch, Greeno, 1985; Fischer, 1987; Pennington 1987). Problem solving or design proceeds
through successive refinements of the mapping between elements in the design situation and elements in the formal description. Multiple, overlapping design perspectives in design environments
allow for improved comprehension and problem solving through the decomposition of complexity, the leveraging of the familiar to comprehend the unfamiliar, and the capture of multiple stakeholders’ interests. It is our contention that no fixed set of perspectives is appropriate for every
possible design; instead, perspective views should emphasize what is currently important in the
project. When new issues arise in the design, it may be appropriate to use a new perspective on the
design to address them.
Usage Data Collection
Usability breakdowns occur when developers’ expectations about system usage do not match
users’ expectations. A usage expectation determines whether a given interaction (e.g., a sequence
of mouse clicks) is expected or unexpected. For example, a developer may hold the usage expectation that users will fill in forms from top to bottom with minor variation, while a user may hold
the expectation that independent sections may be filled out in any order.
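As an illustration only, a developer's "top to bottom" expectation could be checked against an observed sequence of field edits with a few lines of Java; the class and field names below are hypothetical and are not part of EDEM.

    import java.util.Arrays;
    import java.util.List;

    // Hypothetical sketch: does an observed sequence of field edits follow the expected top-to-bottom order?
    public class FormOrderExpectation {

        static boolean followsExpectedOrder(List<String> expectedOrder, List<String> observedEdits) {
            int lastPosition = -1;
            for (String field : observedEdits) {
                int position = expectedOrder.indexOf(field);
                if (position < lastPosition) {
                    return false;   // the user jumped back to an earlier field
                }
                lastPosition = position;
            }
            return true;
        }

        public static void main(String[] args) {
            List<String> expected = Arrays.asList("Traveler", "Address", "City", "State", "ZIP");
            List<String> observed = Arrays.asList("Traveler", "ZIP", "Address");
            System.out.println("Expectation met: " + followsExpectedOrder(expected, observed));  // false
        }
    }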
Figure 4: Differing Expectations between Developers and Users
Figure 4 shows a conceptual picture of expectations held by two groups of stakeholders (developers and users) and expectations encoded in the system being used. Each lowercase “e” represents
a tacit expectation held in the mind of a person or in the code of a program. Developers get their
expectations from their knowledge of the requirements, past experience in developing systems,
domain knowledge, and past experience in using applications themselves. Users get their expectations from domain knowledge, past experience using applications, and interactions with the system at hand. The software system embodies implicit assumptions about usage that are encoded in
screen layout, key assignments, program structure, and user interface libraries. Each upper case
“E” represents an explicit expectation. Several usability methods seek to make implicit expectations of developers and users explicit, e.g., cognitive walkthroughs (Wharton et al. 1994), participatory design (Greenbaum, Kyung 1991), and use cases (Rumbaugh, Booch, Jacobson 1997).
Once a mismatch between users’ and developers’ expectations is detected, it can be corrected in
one of two ways. Developers can change their expectations about usage to better match users’
expectations, thus refining the system requirements and eventually making a more usable system.
For example, features that were expected to be used rarely but are used often in practice should be
made easier to access. Alternatively, users can adjust their expectations to better match those of
developers, thus learning how to use the existing system more effectively.
Detecting a breakdown or difference in expectations is useful, but aligning expectations requires
knowledge of the other expectations and the specific differences. For developers to learn about
users’ expectations they need specific details of actual usage, including context, history, timing,
and intent. For users to learn of developers’ expectations they need clear documentation of the
intended system operation and rationale to be presented to them at the time of breakdown. In
either case, dialog between users and developers can help clarify and expose expectations.
Information Review and Project Awareness
In the CSCW literature, systems labeled as group awareness tools are generally designed to provide relatively instantaneous awareness of a group of individuals. This research explores a type of
awareness tool that does not fit within this implicit classification of awareness. The approach
instead focuses on providing individuals with awareness of multiple groups (rather than multiple
individuals), over days or weeks (rather than fractions of seconds to minutes). The types of informational benefits derived from such a system differ greatly from the benefits of a typical group
awareness tool. Table 1 illustrates the types of questions answered by different classes of awareness systems. The two dimensions are the frequency with which users receive information and the
unit of observation.
Table 1: Characterizing Individual and Group Awareness
(rows give the frequency with which users receive information; columns give the unit of observation)

Seconds to Minutes, Individual: What is a person’s location and current activity? (Example tools are Portholes and Piazza.)
Seconds to Minutes, Group: Is there a group meeting? Where? What types of tasks is the group working on? Who is in the group? (Example tools are Video Windows and wOrlds.)
Days to Months, Individual: What is a person trying to accomplish this week? What are a person’s plans for this week? What problems is a person working on solving? (Example tools are various calendar applications and distribution lists.)
Days to Months, Group: What is a group working on this week? What kinds of problems is the group encountering? What changes has the group made in the task the group is working on? When will the task be complete? (We seek to fill this cell with our tool, Knowledge Depot.)
A tool whose unit of observation is an individual helps users maintain awareness of a set of individuals; example tools are Portholes (Dourish & Bly, 1992; Mantei, Baecker & Sellen, 1991; Lee, Girgensohn, Schlueter 1997) and Piazza (Isaacs, Tang & Morris, 1996). A tool whose unit of observation is a group helps users maintain awareness of a set of groups; example tools are Video Windows (Abel, 1990) and wOrlds (Bogia & Kaplan, 1995). Information that users need to be
aware of over days rather than over minutes tends to be much more conceptual and task oriented
than physical. Instead of providing awareness of some physical fact through graphical or audio means (the user is in room X, sitting, talking on the phone, and browsing the web), we instead have to provide information that is much more abstract (a person plans to work on the following tasks, the tasks have this set of priorities, and the following problems are delaying them); this type of information is best represented textually, as is done in calendars. Whereas the unit of information in a typical awareness system is an individual, the unit of awareness in a system aimed at groups over days is the task or set of tasks of the group. Awareness of the task, then, is what makes this type of awareness so important.
We have focused on an approach being developed to fill the Group/Days-to-Months quadrant of
the diagram. We refer to tools that fit this quadrant as project awareness tools. We have developed one such tool, called Knowledge Depot, which we illustrate below.
All awareness systems need three features to function: information capture, information distribution, and information presentation. The effectiveness of an awareness tool will largely depend upon how well it provides these three features. It must capture useful information, distribute it quickly to the appropriate people, and display the information in a meaningful manner. By this definition, a bulletin board system could be argued to be an awareness system, as it captures information, distributes it by making it available to bulletin board readers, and presents it as a list of messages. However, its information capture requires users with information to deliberately start the news reader, navigate to a topic, and submit the information, and its distribution requires users to go to the appropriate software and search for the information that they want to be aware of.
System Implementation Progress
The human-centered software development lifecycle presented above is being explored through
three systems. The Argo Software Design Environment implements the design theory component
illustrated in Figure 1. The EDEM (Expectation-Driven Event Monitoring) system substrate
implements the observation of usage component. The Knowledge Depot implements the review
component. The preceding sections introduced our theoretical understanding of a human-centered software development lifecycle and each of its component activities. The following sections illustrate how the components could work together through a hypothesized scenario and then explain more about each system’s details. As noted earlier, additional details can be found in the technical reports and paper accompanying this annual report (Robbins, Hilbert, Redmiles 1996;
Hilbert, Robbins, Redmiles 1997; Kantor, Zimmermann, Redmiles 1997).
System Integration — A Scenario of Design, Usage, and Review
We are working on integrating our tools for design, observation, and review (project awareness). Our objective is to provide integrated support of the human-centered software development lifecycle described earlier and shown in Figure 1. We present our integration effort by means of a scenario in which a developer uses Argo, EDEM, and Knowledge Depot to develop a simple Travel Expense Form application written in Java, monitor its usage, and review the data from the monitoring.
The interface shown in Figure 5 is representative of applications used to report travel expenses for reimbursement purposes. The layout of the interface is based on certain tacit, cognitive assumptions
about the tasks and the users of the form. For example, the grouping and order of fields reflects
developers’ expectations about how users mentally organize the requested data and the sequence
in which it is most convenient or efficient to provide it.
Figure 5: Travel Voucher Application
In order to test some of these assumptions, a developer uses the EDEM Agent Editor to specify an
Expectation Agent that will trigger whenever a user begins editing the “Address Section” of the
Travel Expense Form (see Figure 6). The developer uses the Agent Editor to express interest in
detecting when the user begins to edit the “ZIP” text field. The developer then adds this simple
event specification to an Expectation Agent that will trigger whenever a user begins editing the
“Address,” “City,” “State,” or “ZIP” fields in the Travel Expense Form.
Figure 6: Agents Editor
The agent specification consists of: (1) a name; (2) an activate pattern (an event pattern which, if satisfied, causes the agent to be activated); (3) a cancel pattern (an event pattern which, if satisfied, causes the agent to cancel evaluation of its activate pattern); (4) timing constraints; (5) a condition to be checked once the activate pattern has been satisfied; (6) an action to be performed if the agent is activated while the condition is TRUE; and (7) a flag indicating whether the agent should continue checking after successfully triggering once.
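For illustration, these seven parts could be represented as fields of a simple data structure; the Java sketch below uses hypothetical type names rather than the actual EDEM classes.

    import java.util.function.BooleanSupplier;

    // Hypothetical sketch of the seven parts of an agent specification; not the actual EDEM classes.
    class EventPattern {
        final String description;
        EventPattern(String description) { this.description = description; }
    }

    class AgentSpecification {
        String name;                    // (1) name
        EventPattern activatePattern;   // (2) pattern that, if satisfied, activates the agent
        EventPattern cancelPattern;     // (3) pattern that cancels evaluation of the activate pattern
        long timeoutMillis;             // (4) timing constraint
        BooleanSupplier condition;      // (5) condition checked once the activate pattern is satisfied
        Runnable action;                // (6) action performed when activated and the condition holds
        boolean repeats;                // (7) keep checking after triggering once?
    }

    public class AgentSpecDemo {
        public static void main(String[] args) {
            AgentSpecification spec = new AgentSpecification();
            spec.name = "Address Started";
            spec.activatePattern = new EventPattern("GOT_EDIT on Address, City, State, or ZIP");
            spec.cancelPattern = new EventPattern("form submitted");
            spec.timeoutMillis = 0;        // no timing constraint in this example
            spec.condition = () -> true;
            spec.action = () -> System.out.println("Report: user began editing the Address section");
            spec.repeats = false;
            spec.action.run();             // the monitoring substrate would invoke this when the agent triggers
            System.out.println("Defined agent: " + spec.name);
        }
    }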
Figure 7 presents a display of the state of the event monitors when a user has edited the “Traveler”
field and then shifted input focus to the “Address” field. This display is primarily for our development and debugging. Note that the LOST_EDIT event is not generated for the “Traveler” field
until the user actually begins editing another field in the form. This is an example of how EDEM
provides events at a higher level of abstraction than the window system events generated by Java.
In this fashion, an agent centered on detecting when editing has been completed in a particular
field does not need to be concerned with monitoring every other field in the interface.
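A minimal sketch of how such a higher-level event might be derived from lower-level events follows; the event names echo Figure 7, but the class and method names are our own assumptions, not EDEM's actual implementation.

    // Hypothetical sketch: derive a higher-level LOST_EDIT event from low-level edit events.
    public class LostEditMonitor {

        private String fieldBeingEdited = null;   // the field most recently edited, if any

        // Called for low-level events such as ("EDIT", "Traveler") or ("FOCUS", "Address").
        void onLowLevelEvent(String type, String field) {
            if (type.equals("EDIT")) {
                if (fieldBeingEdited != null && !field.equals(fieldBeingEdited)) {
                    System.out.println("LOST_EDIT " + fieldBeingEdited);   // higher-level event
                }
                fieldBeingEdited = field;
            }
            // Focus changes alone do not generate LOST_EDIT in this sketch.
        }

        public static void main(String[] args) {
            LostEditMonitor monitor = new LostEditMonitor();
            monitor.onLowLevelEvent("EDIT", "Traveler");
            monitor.onLowLevelEvent("FOCUS", "Address");   // no LOST_EDIT yet
            monitor.onLowLevelEvent("EDIT", "Address");    // prints: LOST_EDIT Traveler
        }
    }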
The agent defined in this example generates a single event (“Address Started”) as soon as any of
the fields in the “Address Section” has been edited. This agent can then be used in conjunction
with other agents that detect when other sections have been started and/or finished. The developer can thus compare the actual order in which users complete the sections and fields in the Travel Expense Form with the order that was originally expected.
Figure 7: Monitoring Events
Figure 8: Organizing Information for Review in Knowledge Depot
Feedback from agents must be reviewed and analyzed to inform future design decisions. To this
end, we have integrated EDEM with the Knowledge Depot which collects and organizes Expectation Agent reports for later review. Figure 8 shows several categories of data collected from
Expectation Agents monitoring the Travel Expense form. These categories can then be used to
link back into the appropriate (or affected) architectural elements in the Argo design of the system
(Figure 9), thus providing an indication of where changes to the system will need to occur.
A software developer can use the Argo design environment to specify the components of the
Travel Voucher Application. In future work, we seek to make the feedback from agents link
directly to these design components.
Figure 9: Architecture of the Travel Voucher Application in Argo
Figure 10: Process Model for Constructing the Architecture of an Application
The Argo Software Design Environment
Argo is our software architecture design environment illustrated in Figures 1, 9, and 10. It is based
on the cognitive theories of reflection-in-action, opportunistic design, and comprehension and
problem solving (see the earlier section, “Software Design Environments”). Argo offers basic CASE tool features for entering an architectural model and applying code-generating transformations. Furthermore, Argo contains critics that automatically provide design feedback, a user
interface for browsing design feedback, a process model to guide the architect and help control
critics, and multiple coordinated design perspectives.
Our current Argo implementation supports critics that run in a background thread of control to continuously analyze a software architecture as it is being manipulated. Critics are made up of an
analysis predicate, a design feedback item, and various attributes used to determine if the critic is
timely and relevant. Criticism control mechanisms are objects that define Argo’s policies on when
to apply individual critics to a given part of the architecture. For example, one criticism control
mechanism selects critics that are relevant to stated design goals, while another selects critics that
are timely (relevant to design decisions that the architect is currently considering). Argo associates
critics with specific types of design materials (i.e., elements of the architecture model) rather than storing all critics in a central knowledge base. This allows us to keep Argo’s kernel simple and flexible, and will soon allow us to load models and their associated critics over the Internet as
needed.
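The following minimal Java sketch shows the kind of filtering a criticism control mechanism might perform, selecting critics that are relevant to stated design goals and timely with respect to the decisions the architect is currently considering (as indicated by the process model); all names here are illustrative assumptions, not Argo's actual implementation.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Set;
    import java.util.stream.Collectors;

    // Illustrative sketch of a criticism control mechanism; the types are hypothetical, not Argo's classes.
    class Critic {
        final String name;
        final String goal;       // design goal the critic serves, e.g., "completeness"
        final String decision;   // design decision it pertains to, e.g., "component communication"
        Critic(String name, String goal, String decision) {
            this.name = name; this.goal = goal; this.decision = decision;
        }
    }

    public class ControlMechanismDemo {

        // Select critics that are relevant (match stated goals) and timely (match active decisions).
        static List<Critic> select(List<Critic> all, Set<String> statedGoals, Set<String> activeDecisions) {
            return all.stream()
                      .filter(c -> statedGoals.contains(c.goal))
                      .filter(c -> activeDecisions.contains(c.decision))
                      .collect(Collectors.toList());
        }

        public static void main(String[] args) {
            List<Critic> all = Arrays.asList(
                new Critic("UnconnectedInterface", "completeness", "component communication"),
                new Critic("SingleThreadBottleneck", "performance", "concurrency"));
            List<Critic> active = select(all, Set.of("completeness"), Set.of("component communication"));
            active.forEach(c -> System.out.println("Apply critic: " + c.name));
        }
    }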
Argo presents design feedback to the architect through a “to do list” metaphor. Each item on the “to do list” indicates an issue that the architect must consider before the design is finalized. Critics unobtrusively add and remove items as design problems arise and are resolved. When the architect selects an item, an explanation of the issue and possible resolutions is displayed, and the “offending” elements of the architecture are highlighted to help build the mental context needed to
resolve the issue. Furthermore, the architect can send email to the expert who authored the critic
to open a dialog and draw the architect into the organizational context needed to resolve some
issues. In future work we will address recording these design issues and resolutions as a form of
design rationale.
Perspectives are implemented as predicates that select a part of the architecture model to be displayed in a given context. For example, one perspective may show software components and their
communication relationships, while another perspective may show the relationship between components and operating system resources.
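The sketch below shows, in simplified form, how a perspective can be expressed as a predicate that selects which design elements to display; as before, the types are hypothetical rather than Argo's own.

    import java.util.Arrays;
    import java.util.List;
    import java.util.function.Predicate;
    import java.util.stream.Collectors;

    // Hypothetical sketch of a perspective as a predicate over design elements.
    class DesignElement {
        final String name;
        final String kind;   // e.g., "component", "connector", "os-resource"
        DesignElement(String name, String kind) { this.name = name; this.kind = kind; }
    }

    public class PerspectiveDemo {
        public static void main(String[] args) {
            List<DesignElement> model = Arrays.asList(
                new DesignElement("TravelExpenseForm", "component"),
                new DesignElement("FormToDatabase", "connector"),
                new DesignElement("PrintQueue", "os-resource"));

            // A "communication" perspective selects components and the connectors between them.
            Predicate<DesignElement> communicationPerspective =
                e -> e.kind.equals("component") || e.kind.equals("connector");

            List<String> visible = model.stream()
                                        .filter(communicationPerspective)
                                        .map(e -> e.name)
                                        .collect(Collectors.toList());
            System.out.println("Visible in the communication perspective: " + visible);
        }
    }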
The Argo process model consists of a set of design materials for representing design tasks and
dependencies between them. Since the process modeling domain is built on top of the Argo kernel
(just as is the architecture modeling domain), we can apply Argo’s own facilities to editing, visualizing, and critiquing the design process. Each step in the design process indicates a set of design
decisions that the architect will consider during that step. As the architect progresses through the
process model, process status information is used to infer which critics are timely.
Critics, criticism control mechanisms, and perspectives are each implemented as Java boolean
functions. This allows great freedom in expressing the analyses to be performed and permits us to
use standard Java development tools. In the future, we hope to integrate an expert system or other
technologies that focus on the kinds of predicates that are most often needed. Support for end user
customizability is also planned.
EDEM Software for Usage Data Collection
We have developed prototype software for detecting and resolving mismatches in usage expectations by allowing developers to specify their expectations in terms of user interface events. These
expectations are encoded in the form of expectation agents that continually monitor usage of the
application and perform various actions when encapsulated expectations are violated.
In order to support this functionality, expectation agents need access to events occurring in an
application’s user interface. However, in order for expectations to be expressed at an appropriate
level of abstraction, they need access to higher level events than are produced by the window system. As a result, expectation agents are implemented on top of an expectation-driven event monitoring substrate (EDEM) that provides a multi-level event model. Aspects of the EDEM software
are illustrated in Figures 6 and 7.
Other researchers have proposed event monitoring as a means for collecting usability data; however, existing approaches suffer from one or more of the following limitations: (1) low-level
semantics—events are captured and analyzed at the window system level, or just slightly above;
(2) decontextualization—analysis is done post-hoc on raw event traces, potentially relevant contextual cues are lost; (3) “one-way” communication—data flows from users to developers who
must then infer meaning, no “dialogue” is established to facilitate mutual understanding; (4) batch
orientation—hypothesis formation and analysis is performed after large amounts of (potentially
irrelevant) data have been collected, no means for hypotheses to be analyzed and action taken
while users are engaged; and (5) privacy issues—arbitrary events are collected without any
explicit constraints on the purposes of collection, no way to provide users with discretionary control over what information is collected and what information is kept private.
The EDEM software addresses these issues and goes beyond existing approaches in supporting
user involvement in development. Our system provides: (1) a multi-level event model—allowing
agents to compare usability expectations against actual usage at reasonable levels of abstraction;
(2) contextualization—taking action and collecting information while users are engaged in using
the application; (3) two-way communication—helping initiate dialog between users and developers when breakdowns occur; and finally, (4) specializable monitoring and analysis—promoting a
shift from batch-oriented data collection and analysis to hypotheses-guided collection and analysis.
The Knowledge Depot for Information Review and Project Awareness
In our scenario, groups need a mechanism for communicating about usability and other data,
design decisions, and general progress. The Knowledge Depot, an enhanced version of GIMMe (Zimmermann, Lindstaedt, Girgensohn, 1997), captures email conversations and other information and organizes it, allowing users to browse through the information to rediscover (or learn for the first time) why different decisions were made and what problems were encountered as a result of those decisions, and to regain some of the context in which those decisions were made. The Knowledge Depot is illustrated in Figure 8.
The system organizes its knowledge around a set of topics defined over time by all users of the
system. The topics frame is shown in Figure 8 as a hierarchical list of discussion items. A topic is
four things:
1. A phrase describing a concept, task or activity representing aspects of the group’s work;
2. A place where people go to find information;
3. A definition of the type of information the system looks for to determine whether something
belongs in the topic; and
4. A destination that people will aim their messages at in order to have the message stored correctly for later retrieval.
A topic then might have a name like “Form Completion Sequence.” This says that any message
that has “Form Completion Sequence” in the subject line will be captured in this topic, and people
browsing for messages will know to look for email discussing the Form Completion Sequence
within this topic. People having an email conversation about the Form Completion Sequence then
need only put “Form Completion Sequence” in the subject, and CC the Knowledge Depot system
for the information to be captured and put in an appropriate place. Furthermore, the hierarchical
organization of topics allows a message that has “Form Completion Sequence Violation” in the
subject to enter the “Form Completion Sequence” topic, and then enter the “Violation” subtopic.
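A minimal sketch of this subject-line matching follows; the matching rule (a substring test on the subject) follows the description above, while the classes and the handling of subtopics are our own simplification, not the Knowledge Depot implementation.

    import java.util.ArrayList;
    import java.util.List;

    // Simplified sketch of filing a message into hierarchical topics by subject line.
    class Topic {
        final String phrase;                 // e.g., "Form Completion Sequence"
        final List<Topic> subtopics = new ArrayList<>();
        final List<String> messages = new ArrayList<>();
        Topic(String phrase) { this.phrase = phrase; }

        // File the message here if the subject contains this topic's phrase,
        // then try to file it into any matching subtopic as well.
        void file(String subject, String message) {
            if (!subject.toLowerCase().contains(phrase.toLowerCase())) return;
            messages.add(message);
            for (Topic subtopic : subtopics) subtopic.file(subject, message);
        }
    }

    public class DepotDemo {
        public static void main(String[] args) {
            Topic sequence = new Topic("Form Completion Sequence");
            Topic violation = new Topic("Violation");
            sequence.subtopics.add(violation);

            sequence.file("Form Completion Sequence Violation in ZIP field",
                          "EDEM report: Address section edited out of the expected order");
            System.out.println("Messages under the Violation subtopic: " + violation.messages.size());
        }
    }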
If a new type of discussion begins, any group member can create a new topic or subtopic to capture the new type of discussion. If the terminology changes, users can change the definition of the
topics. For example, if there is a topic “NYNEX, collaboration” for discussing work done with
NYNEX, and NYNEX changes its name to Bell Atlantic, the topic would simply be updated to
“NYNEX, Bell Atlantic, collaboration”. This allows the organization of knowledge to evolve over
time as the group itself evolves.
The Knowledge Depot uses the same type of privacy mechanism as the Information Lens: information does not become publicly available unless explicitly emailed to a special user name (Malone et al. 1989). The Information Lens used the account “Anyone” to determine which knowledge
should be publicly distributed, while Knowledge Depot uses a mail account named after the group
to capture mail that is to be archived in the group memory.
Although the Depot focuses on email, it can also process electronic bulletin boards, and users can upload formatted documents (e.g., PDF, HTML, GIF), which are all treated the same as mail messages in how users are made aware of them. These documents cannot be searched by their contents the way mail and newsgroup messages can, but they can still be accessed by browsing topics and by giving the system queries by date, author, and/or subject.
The Knowledge Depot has many similarities to a newsgroup, if the newsgroup is used as a place
where people Cc mail so that it can be made available to non-recipients of the message, and so
that it can be stored for a time. The Knowledge Depot, though, is not constrained to a single newsgroup with a linear list of messages, nor is it equivalent to a large number of newsgroups where users have to determine which group (or groups) are most appropriate. The Depot uses a flexible
set of topics where any user can at any time add a new topic, and based on the definition of the
topic, all existing messages will be checked to see if they belong in the topic (in addition to any
other topics that the message is already in). Potentially, the entire topic structure could be
removed, and replaced with a new topic structure, with messages automatically reorganizing
themselves to account for the new hierarchy. While a newsgroup acts as an oversized shared email
box, Knowledge Depot acts as a shared database. The database is used to organize all of a group’s discussions and documents, and it provides standard types of database queries and reorganizations.
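To illustrate the re-checking described above, here is a small sketch in which defining a new topic re-files the existing message archive against its definition; again, the types are hypothetical, not the actual Knowledge Depot code.

    import java.util.ArrayList;
    import java.util.List;

    // Sketch: when a new topic is defined, every archived message is re-checked against its definition.
    public class RefileDemo {

        static class Message {
            final String subject;
            Message(String subject) { this.subject = subject; }
        }

        // Return the archived messages that belong in a newly defined topic.
        static List<Message> refile(List<Message> archive, String newTopicPhrase) {
            List<Message> matches = new ArrayList<>();
            for (Message m : archive) {
                if (m.subject.toLowerCase().contains(newTopicPhrase.toLowerCase())) {
                    matches.add(m);   // the message now also appears under the new topic
                }
            }
            return matches;
        }

        public static void main(String[] args) {
            List<Message> archive = new ArrayList<>();
            archive.add(new Message("NYNEX, Bell Atlantic, collaboration: meeting notes"));
            archive.add(new Message("Form Completion Sequence Violation"));
            System.out.println("Messages re-filed under 'collaboration': "
                               + refile(archive, "collaboration").size());
        }
    }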
Immediate, On-going Work Plan
Below are some immediate goals for the second year of the grant, organized by system.
All Systems
• Develop revised evaluation criteria for the software prototypes (Argo, EDEM, Knowledge
Depot).
• Revise and enhance the technologies based on usage and feedback: there are plans to test
all three software prototypes (Argo, EDEM, Knowledge Depot) in industry (respectively
at Rockwell, Lockheed-Martin, and NYNEX/Bell Atlantic).
• Continue to report on progress through technical reports, conference papers, and journal
articles.
Argo
• Increase the critic analysis and the visualization support in the software design environment.
• Investigate the application of the design environment architecture to new domains such as
requirements representations (CoRE method) and newer architectural specification languages (UML–Unified Modeling Language).
EDEM
• Refine the interface for developers defining usage expectations.
• Assess the generality of the expectation agent model and, in particular, identify possibilities for generalizing the expectation model beyond usability testing.
Knowledge Depot
• Improve the integration with usability reports from EDEM.
• Add more general searching of text and documents classified in the Knowledge Depot.
Personnel Supported on the Grant
The Principal Investigator is
• David F. Redmiles, Assistant Professor
The grant supports one graduate research assistant throughout the year. However, this funding has
occasionally been split between two PhD students:
• David Hilbert
• Michael Kantor
In addition, a third PhD student, funded from an ARPA grant awarded to Professors Taylor and Redmiles, has contributed to the research. He is
• Jason Robbins
Finally, the grant benefits from the collaboration with NYNEX/Bell Atlantic in White Plains. In
particular, the following personnel lend their time to this effort:
• Michael Atwood, Technical Director, Advanced Software Development Environments
Group
• Bea Zimmermann, Member Technical Staff, Advanced Software Development Environments Group
Research Support of Senior Personnel
ARPA. Since the application for this proposal, the Principal Investigator has been awarded a grant from the Advanced Research Projects Agency (ARPA). He is Co-Principal Investigator on this grant; Richard Taylor is the Principal Investigator. The title of the grant is “Open Technology for
Software Evolution: Hyperware, Architecture, and Process.” The planned, three-year budget is
$2,606,666. The starting date was October 1996.
This grant supports Taylor’s on-going effort in the area of software tools for large-scale, complex
systems. Redmiles’ contribution focuses on the application of cognitive theories of design, continuing work he participated in at the University of Colorado on design environments. The grant
also provides another venue for applying the framework and tools which are the focus of this NSF grant, namely the EDEM system for supporting communication between developers and end users through expectation agents.
MICRO. The Principal Investigator has also submitted a one-year project proposal to the University of California Microelectronics Innovation and Computer Research Opportunities (MICRO).
The proposal is entitled “Applying Design Critics to Software Requirements Engineering.” The
requested budget is $36,000. The proposal describes the potential application of software design
environments to the specific area of formal requirements.
List of Publications—Year 1
1. Robbins, J., Hilbert, D., Redmiles, D. Extending Design Environments to Software Architecture Design, Journal of Automated Software Engineering, to appear.
2. Kantor, M., Zimmermann, B., Redmiles, D. From Group Memory to Group Awareness
Through Use of the Knowledge Depot, Technical Report UCI-ICS-97-20, Department of
Information and Computer Science, University of California, Irvine, CA, May 1997.
3. Hilbert, D., Robbins, J., Redmiles, D. Supporting Ongoing User Involvement in Development
via Expectation-Driven Event Monitoring, Technical Report UCI-ICS-97-19, Department of
Information and Computer Science, University of California, Irvine, CA, May 1997.
4. Robbins, J., Hilbert, D., Redmiles, D. Using Critics to Support Software Architects, Technical
Report UCI-ICS-97-18, Department of Information and Computer Science, University of California, Irvine, CA, April 1997.
5. Shukla, S., Redmiles, D. Collaborative Learning in a Bug-Tracking Scenario, Workshop on
Approaches for Distributed Learning through Computer Supported Collaborative Learning
(Boston, MA), held in conjunction with the Conference on Computer Supported Cooperative
Work (CSCW 96), Association for Computing Machinery, November 1996.
6. Robbins, J., Hilbert, D., Redmiles, D. Using Critics to Analyze Evolving Architectures, Second International Software Architecture Workshop (ISAW-2 — San Francisco, CA), held in
conjunction with SIGSOFT’96: the Fourth Symposium on the Foundations of Software Engineering (FSE4), Association for Computing Machinery, October 1996.
7. Robbins, J., Hilbert, D., Redmiles, D. Extending Design Environments to Software Architecture Design, Proceedings of the 11th Annual Knowledge-Based Software Engineering
(KBSE-96) Conference (Syracuse, NY), IEEE Computer Society, Los Alamitos, CA, September 1996, pp. 63-72—Best Paper Award.
8. Robbins, J., Morley, D., Redmiles, D., Filatov, V., Kononov, D. Visual Language Features
Supporting Human-Human and Human-Computer Communication, Proceedings of the IEEE
Symposium on Visual Languages (Boulder, CO), IEEE Computer Society, Los Alamitos, CA,
September 1996, pp. 247-254.
9. Robbins, J., Redmiles, D. Software Design From the Perspective of Human Cognitive Needs,
Proceedings of the 1996 California Software Symposium (Los Angeles, CA), UCI Irvine
Research Unit in Software, Irvine, CA, April 1996, pp. 16-27.
List of Presentations—Year 1
1. Expectation-Driven Event Monitoring (EDEM), UC Irvine, CU Boulder, UMass Amherst,
ARCADIA Consortium Meeting (Boulder, CO), presented by D. Redmiles, June 1997.
2. Supporting Ongoing User Involvement in Development via Expectation-Driven Event Monitoring, CU Boulder, Meeting of the “Life-Long Learning and Design” Research Group (Boulder, CO), presented by D. Redmiles, June 1997.
3. Argo: A Design Environment for Evolving Software Architectures, IEEE, Nineteenth International Conference on Software Engineering (Boston, MA), presented by J. Robbins, May
1997.
4. Supporting Evolutionary Design via Expectation Agents, DARPA, Design Rationale Cluster
(EDCS DR) Meeting (Stone Mountain, GA), presented by D. Redmiles, March 1997.
5. Applying Usability Inspections to Web Page Design, UC Irvine, Bay Area Round Table
(BART) Meeting (Palo Alto, CA), presented by D. Redmiles, February 1997.
6. Argo Software Architecture Design Environment, DARPA, Information Management Cluster
(EDCS IM) Meeting (Manassas, VA), presented by D. Redmiles and D. Hilbert, October
1996.
7. Using Critics to Analyze Evolving Architectures, ACM, Second International Software Architecture Workshop (ISAW-2 — San Francisco, CA), held in conjunction with SIGSOFT’96:
the Fourth Symposium on the Foundations of Software Engineering (FSE4), presented by J.
Robbins, October 1996.
8. Extending Design Environments to Software Architecture Design, IEEE, The 11th Annual
Knowledge-Based Software Engineering (KBSE-96) Conference (Syracuse, NY), presented
by J. Robbins, September 1996.
9. Visual Language Features Supporting Human-Human and Human-Computer Communication, IEEE, The IEEE Symposium on Visual Languages (Boulder, CO), presented by J. Robbins, September 1996.
Web Site Guide
Project information and selected publications are accessible from the PI’s home page:
http://www.ics.uci.edu/~redmiles/
A direct link to the research funded under this grant is the following:
http://www.ics.uci.edu/pub/eden/
As noted earlier, some funding from DARPA is being leveraged in this work. A direct link to
research funded under that project is the following:
http://www.ics.uci.edu/pub/edcs/
References
Abel, M. Experiences in an Exploratory Distributed Organization, in E.A. Galegher (Ed.), Intellectual Teamwork: Social and Technological Foundations of Cooperative Work, Lawrence
Erlbaum Associates, 1990, pp. 489-510.
Bogia, D. P., Kaplan, S. M. Flexibility and Control for Dynamic Workflows in the wOrlds Environment, Proceedings of the Conference on Organizational Computing Systems, 1995, pp.
148-159.
Dourish, P., Bly, S. Portholes: Supporting Awareness in a Distributed Work Group, Proceedings
of the Human Factors in Computing Systems Conference (CHI ‘92), 1992, pp. 541-547.
Fischer, G. Cognitive View of Reuse and Redesign, IEEE Software, Special Issue on Reusability,
22
Vol. 4, No. 4, 1987, pp. 60-72.
Fischer, G. Domain-Oriented Design Environments, Journal of Automated Software Engineering,
Vol. 1, 1994, pp. 177-203.
Greenbaum, J., Kyung, M. (Eds.). Design at Work: Cooperative Design of Computer Systems,
Lawrence Erlbaum Associates, Hillsdale, NJ, 1991.
Guindon, R., Krasner, H., Curtis, B. Breakdown and Processes During Early Activities of Software Design by Professionals, in G.M. Olson, S. Sheppard, E. Soloway (Eds.), Empirical Studies of Programmers: Second Workshop, Ablex Publishing Corporation, Norwood, NJ, 1987, pp. 65-82.
Hayes-Roth, B., Hayes-Roth, F. A Cognitive Model of Planning, Cognitive Science, Vol. 3, 1979, pp. 275-310.
Hilbert, D., Robbins, J., Redmiles, D. Supporting Ongoing User Involvement in Development via
Expectation-Driven Event Monitoring, Technical Report UCI-ICS-97-19, Department of
Information and Computer Science, University of California, Irvine, CA, May 1997.
Isaacs, E., Tang, J., Morris, T. Piazza: a Desktop Environment Supporting Impromptu and
Planned Interactions, Proceedings of the ACM 1996 Conference on Computer Supported
Cooperative Work (CSCW ‘96), 1996, pp. 315-324.
Kantor, M., Zimmermann, B., Redmiles, D. From Group Memory to Group Awareness Through
Use of the Knowledge Depot, Technical Report UCI-ICS-97-20, Department of Information
and Computer Science, University of California, Irvine, CA, May 1997.
Kintsch, W., Greeno, J.G. Understanding and Solving Word Arithmetic Problems, Psychological
Review, Vol. 92, 1985, pp. 109-129.
Lee, A., Girgensohn, A., Schlueter, K. NYNEX Portholes: Initial User Reactions and Redesign
Implications, Proceedings of the International Conference on Supporting Group Work
(GROUP ‘97—Phoenix, AZ), ACM, November 16-19, 1997, (in press).
Malone, T. W., Grant, K. R., Lai, K.-Y., Rao, R., & Rosenblitt, D. A. The Information Lens: An
Intelligent System For Information Sharing And Coordination, in M. H. Olson (Ed.), Technological Support for Work Group Collaboration, Lawrence Erlbaum, Hillsdale, NJ, 1989, pp. 65-88.
Mantei, M. M., Baecker, R. M., Sellen, A. J. Experiences in the Use of a Media Space, Proceedings of the Human Factors in Computing Systems Conference (CHI ‘91), 1991, pp. 203-208.
Pennington, N. Stimulus Structures and Mental Representations in Expert Comprehension of
Computer Programs, Cognitive Psychology, Vol. 19, 1987, pp. 295-341.
Robbins, J., Hilbert, D., Redmiles, D. Extending Design Environments to Software Architecture
Design, Proceedings of the 11th Annual Knowledge-Based Software Engineering (KBSE96) Conference (Syracuse, NY), IEEE Computer Society, Los Alamitos, CA, September
1996, pp. 63-72—Best Paper Award.
Rumbaugh, J., Booch, G., Jacobson, I. Unified Modeling Language Reference Manual, Addison-Wesley, Reading, MA, 1997.
Schoen, D. The Reflective Practitioner: How Professionals Think in Action, Basic Books, New
York, 1983.
Schoen, D. Designing as Reflective Conversation with the Materials of a Design Situation,
Knowledge-Based Systems, Vol. 5, No. 1, 1992, pp. 3-14.
Visser, W. More or Less Following a Plan During Design: Opportunistic Deviations in Specification, International Journal of Man-Machine Studies, Vol. 33, 1990, pp. 247-278.
Wharton, C., Rieman, J., Lewis, C., Polson, P. The Cognitive Walkthrough Method: A Practitioner's Guide, in J. Nielsen, R. Mack (Eds.), Usability Inspection Methods, John Wiley &
Sons, Inc., New York, 1994, pp. 105-140, Ch. 5.
Zimmermann, B., Lindstaedt, S., & Girgensohn, A. Growing Group Memories in the Workplace:
A Field Study of GIMMe, Technical Report, NYNEX Science & Technology, White Plains,
NY, 1997.