Characteristics of Good Data

1. Accurate
2. Timely (Timeliness)
3. Relevant to the task (Relevance)
4. Verifiable

Developing an information system must take the characteristics of MIS into account

5. Data manipulation capability (Data manipulation)
6. Data security
7. Flexibility
8. User satisfaction

Benefits of MIS

9. Provides the information that is needed, when it is needed
10. Supports formulating strategy and planning operations
11. Supports monitoring operating results
12. Supports studying and analyzing the causes of problems
13. Supports analyzing problems or obstacles to find solutions
14. Helps reduce costs

Structuring information by how it is used divides it into four levels

15. Top management: strategic planning, policy, and decision making
16. Middle management: operational planning at the tactical level
17. Bottom management: control of operations and procedures
18. Operation: the operational level

Subsystems, or components, of MIS

19. Transaction Processing Systems (TPS), e.g., recording accounting, sales, and production transactions
20. Management Reporting System (MRS): prepares reports that meet user needs, e.g., a grade report
21. Decision Support System (DSS): prepares reports useful for decision making by managers at various levels
22. Office Information System (OIS): information systems in the office built on computer equipment

Integrating the relationships among the subsystems

23. ESS = Executive Support Systems
24. MIS = Management Information Systems
25. DSS = Decision Support Systems
26. OIS = Office Information Systems
27. TPS = Transaction Processing Systems

Information Resource Management

Information is a resource essential to administering an organization, so it must be allocated and managed systematically.

The processing workflow and its use

28. Data
29. Process
30. Information
31. Administrator
32. Decision
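The five steps above describe a data → processing → information → decision flow. A minimal sketch of that pipeline, with all names and figures invented for illustration:

```python
# Minimal sketch of the data -> processing -> information -> decision
# flow described above. All names and figures are illustrative.

def process(data):
    """Processing step: turn raw sales records into summary information."""
    total = sum(amount for _, amount in data)
    return {"transactions": len(data), "total_sales": total}

def decide(information, target=1000):
    """Decision step: an administrator compares information to a target."""
    return "increase promotion" if information["total_sales"] < target else "maintain plan"

raw_data = [("2024-01", 400), ("2024-02", 350)]   # Data
info = process(raw_data)                           # Process -> Information
action = decide(info)                              # Administrator -> Decision
```

Each arrow in the list corresponds to one function call: processing produces information, and the administrator turns information into a decision.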


Objectives of an information system

33. Increase operational efficiency
34. Increase productivity
35. Improve the quality of customer service
36. Produce new goods and extend product lines
37. Create competitive alternatives
38. Create business opportunities
39. Win customer loyalty and defend it against competitors

Information technology fundamentals

40. Components of a computer-based information system (input, memory, arithmetic and logic, control, output)

- Hardware
- Software
- Data
- People

41. Programmers, users, and systems analysts

42. Technical operation of Computer-Based Information Systems (CBIS)
43. Computer-based data organization
44. Processing modes


Sources of information

1. External sources
2. Internal sources
2.1 Information obtained from data processing
2.2 Information from other sources not related to processing
2.3 Information obtained from lower-level managers

A Brief History of Decision Support Systems
by D. J. Power, Editor, DSSResources.COM (version 4.0; see also version 2.8)

Summary

Information Systems researchers and technologists have built and investigated Decision Support Systems (DSS) for approximately 40 years. This paper chronicles and explores the developments in DSS, beginning with the building of model-driven DSS in the late 1960s, theory developments in the 1970s, and the implementation of financial planning systems, spreadsheet DSS and Group DSS in the early and mid-1980s.

Data warehouses, Executive Information Systems, OLAP and Business Intelligence evolved in the late 1980s and early 1990s. Finally, the chronicle ends with knowledge-driven DSS and the implementation of Web-based DSS in the mid-1990s.

I. Introduction

Computerized decision support systems became practical with the development of minicomputers, timeshare operating systems and distributed computing. The history of the implementation of such systems begins in the mid-1960s.

In a technology field as diverse as DSS, chronicling history is neither neat nor linear. Different people perceive the field of Decision Support Systems from various vantage points and report different accounts of what happened and what was important (cf., Arnott & Pervan, 2005; Eom & Lee, 1990b; McCosh & Correa-Perez, 2006; Power, 2003; Power, 2004a; Silver, 1991). As technology evolved, new computerized decision support applications were developed and studied.

Researchers used multiple frameworks to help build and understand these systems. Today one can organize the history of DSS into the five broad DSS categories explained in Power (2001; 2002; 2004b): communications-driven, data-driven, document-driven, knowledge-driven and model-driven decision support systems.

This hypertext document is a starting point in explaining the origins of the various technology threads that are converging to provide integrated support for managers working alone, in teams and in organization hierarchies to manage organizations and make more rational decisions. History is both a guide to future activity in this field and a record of the ideas and actions of those who have helped advance our thinking and practice. Historical facts can be sorted out and better understood, but more information gathering is necessary. This web page is a starting point in collecting more first-hand accounts and in building a more complete mosaic of what was occurring in universities, software companies and organizations to build and use DSS.

This document traces decision support applications and research studies related to model- and data-oriented systems, management expert systems, multidimensional data analysis, query and reporting tools, online analytical processing (OLAP), Business Intelligence, group DSS, conferencing and groupware, document management, spatial DSS and Executive Information Systems as the technologies emerge, converge and diverge. All of these technologies have been used to support decision making. A timeline of major historical milestones relevant to DSS is included in Appendix I.

The study of decision support systems is an applied discipline that uses knowledge and especially theory from other disciplines. For this reason, many DSS research questions have been examined because they were of concern to people who were building and using specific DSS. Hence much of the broad DSS knowledge base provides generalizations and directions for building more effective DSS (cf., Baskerville & Myers, 2002; Keen, 1980).

The next section describes the origins of the field of decision support systems. Section 3 discusses the decision support systems theory development that occurred in the late 1970s and early 1980s. Section 4 discusses important developments in communications-driven, data-driven, document-driven, knowledge-driven and model-driven DSS (cf., Power, 2002). The final section briefly discusses how DSS practice, research and technology are continuing to evolve.

II. Decision Support Systems Origins

In the 1960s, researchers began systematically studying the use of computerized quantitative models to assist in decision making and planning (Raymond, 1966; Turban, 1967; Urban, 1967; Holt and Huber, 1969). Ferguson and Jones (1969) reported the first experimental study using a computer-aided decision system. They investigated a production scheduling application running on an IBM 7094. In retrospect, a major historical turning point was Michael S. Scott Morton's (1967) dissertation field research at Harvard University.

Scott Morton's study involved building, implementing and then testing an interactive, model-driven management decision system. Fellow Harvard Ph.D. student Andrew McCosh asserts that the "concept of decision support systems was first articulated by Scott Morton in February 1964 in a basement office in Sherman Hall, Harvard Business School" (McCosh email, 2002) in a discussion they had about Scott Morton's dissertation. During 1966, Scott Morton (1971) studied how computers and analytical models could help managers make a recurring key business planning decision. He conducted an experiment in which managers actually used a Management Decision System (MDS). Marketing and production managers used an MDS to coordinate production planning for laundry equipment. The MDS ran on an IDI 21-inch CRT with a light pen connected using a 2400 bps modem to a pair of Univac 494 systems.

The pioneering work of George Dantzig, Douglas Engelbart and Jay Forrester likely influenced the feasibility of building computerized decision support systems. In 1952, Dantzig became a research mathematician at the Rand Corporation, where he began implementing linear programming on its experimental computers. In the mid-1960s, Engelbart and colleagues developed the first hypermedia/groupware system, called NLS (oNLine System). NLS facilitated the creation of digital libraries and the storage and retrieval of electronic documents using hypertext. NLS also provided for on-screen video teleconferencing and was a forerunner to group decision support systems. Forrester was involved in building the SAGE (Semi-Automatic Ground Environment) air defense system for North America, completed in 1962. SAGE is probably the first computerized data-driven DSS. Also, Professor Forrester started the System Dynamics Group at the Massachusetts Institute of Technology Sloan School. His work on corporate modeling led to programming DYNAMO, a general simulation compiler.

In 1960, J.C.R. Licklider published his ideas about the future role of multiaccess interactive computing in a paper titled "Man-Computer Symbiosis." He saw man-computer interaction as enhancing both the quality and efficiency of human problem solving, and his paper provided a guide for decades of computer research to follow. Licklider was the architect of Project MAC at MIT that furthered the study of interactive computing.

By April 1964, the development of the IBM System 360 and other more powerful mainframe systems made it practical and cost-effective to develop Management Information Systems (MIS) for large companies (cf., Davis, 1974). These early MIS focused on providing managers with structured, periodic reports, and the information came primarily from accounting and transaction processing systems, but the systems did not provide interactive support to assist managers in decision making.

Around 1970, business journals started to publish articles on management decision systems, strategic planning systems and decision support systems (cf., Sprague and Watson, 1979). For example, Scott Morton and colleagues McCosh and Stephens published decision support related articles in 1968. The first use of the term decision support system was in Gorry and Scott Morton's (1971) Sloan Management Review article. They argued that Management Information Systems primarily focused on structured decisions and suggested that the supporting information systems for semistructured and unstructured decisions should be termed "Decision Support Systems".

T.P. Gerrity, Jr. focused on Decision Support Systems design issues in his 1971 Sloan Management Review article titled "The Design of Man-Machine Decision Systems: An Application to Portfolio Management". The article was based on his MIT Ph.D. dissertation. His system was designed to support investment managers in their daily administration of clients' stock portfolios.

John D.C. Little, also at the Massachusetts Institute of Technology, was studying DSS for marketing. Little and Lodish (1969) reported research on MEDIAC, a media planning support system. Also, Little (1970) identified criteria for designing models and systems to support management decision-making. His four criteria included: robustness, ease of control, simplicity, and completeness of relevant detail. All four criteria remain relevant in evaluating modern Decision Support Systems. By 1975, Little was expanding the frontiers of computer-supported modeling. His DSS, called Brandaid, was designed to support product, promotion, pricing and advertising decisions. Little also helped develop the financial and marketing modeling language known as EXPRESS.

In 1974, Gordon Davis, a Professor at the University of Minnesota, published his influential text on Management Information Systems. He defined a Management Information System as "an integrated, man/machine system for providing information to support the operations, management, and decision-making functions in an organization (p. 5)." Davis's Chapter 12 was titled "Information System Support for Decision Making" and Chapter 13 was titled "Information System Support for Planning and Control". Davis's framework incorporated computerized decision support systems into the emerging field of management information systems.

Peter Keen and Charles Stabell claim the concept of decision support systems evolved from "the theoretical studies of organizational decision making done at the Carnegie Institute of Technology during the late 1950s and early '60s and the technical work on interactive computer systems, mainly carried out at the Massachusetts Institute of Technology in the 1960s" (Keen and Scott Morton, 1978). Herbert Simon's books (1947, 1960) and articles provide a context for understanding and supporting decision making.

In 1995, Hans Klein and Leif Methlie noted: "A study of the origin of DSS has still to be written. It seems that the first DSS papers were published by PhD students or professors in business schools, who had access to the first time-sharing computer system: Project MAC at the Sloan School, the Dartmouth Time Sharing Systems at the Tuck School. In France, HEC was the first French business school to have a time-sharing system (installed in 1967), and the first DSS papers were published by professors of the School in 1970 (p. 112)."

III. Theory Development

In the mid- to late 1970s, both practice and theory issues related to DSS were discussed at academic conferences, including the American Institute for Decision Sciences meetings and the ACM SIGBDP Conference on Decision Support Systems in San Jose, CA in January 1977 (the proceedings were included in the journal Database). The first International Conference on Decision Support Systems was held in Atlanta, Georgia in 1981. Academic conferences provided forums for idea sharing, theory discussions and information exchange.

At about this same time, Keen and Scott Morton's DSS textbook (1978) provided the first broad behavioral orientation to decision support system analysis, design, implementation, evaluation and development. This influential text provided a framework for teaching DSS in business schools. McCosh and Scott Morton's (1978) DSS book was more influential in Europe.

In 1980, Steven Alter published his MIT doctoral dissertation results in an influential book. Alter's research and papers (1975; 1977) expanded the framework for thinking about business and management DSS. Also, his case studies provided a firm descriptive foundation of decision support system examples. A number of other MIT dissertations completed in the late 1970s also dealt with issues related to using models for decision support.


Alter concluded from his research (1980) that decision support systems could be categorized in terms of the generic operations that can be performed by such systems. These generic operations extend along a single dimension, ranging from extremely data-oriented to extremely model-oriented.

Alter conducted a field study of 56 DSS that he categorized into seven distinct types of DSS. His seven types include:

- File drawer systems that provide access to data items.
- Data analysis systems that support the manipulation of data by computerized tools tailored to a specific task and setting or by more general tools and operators.
- Analysis information systems that provide access to a series of decision-oriented databases and small models.
- Accounting and financial models that calculate the consequences of possible actions.
- Representational models that estimate the consequences of actions on the basis of simulation models.
- Optimization models that provide guidelines for action by generating an optimal solution consistent with a series of constraints.
- Suggestion models that perform the logical processing leading to a specific suggested decision for a fairly structured or well-understood task.
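Alter's optimization-model type, for example, generates a best action consistent with constraints. A toy sketch of that operation, in which the products, profits and capacity limit are all invented:

```python
from itertools import product

# Toy "optimization model" DSS operation: enumerate feasible production
# mixes of two products and return the most profitable one.
# All figures are illustrative.
PROFIT = {"A": 3, "B": 5}        # profit per unit
HOURS = {"A": 1, "B": 2}         # machine hours per unit
CAPACITY = 14                    # machine hours available

feasible = [(a, b) for a, b in product(range(11), repeat=2)
            if a * HOURS["A"] + b * HOURS["B"] <= CAPACITY]
best = max(feasible, key=lambda m: m[0] * PROFIT["A"] + m[1] * PROFIT["B"])
```

A real optimization model would use linear programming rather than enumeration, but the input (constraints) and output (a recommended action) are the same in kind.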


Donovan and Madnick (1977) classified DSS as institutional or ad hoc. Institutional DSS support decisions that are recurring. An ad hoc DSS supports querying data for one-time requests. Hackathorn and Keen (1981) identified DSS in three distinct yet interrelated categories: Personal DSS, Group DSS and Organizational DSS.

In 1979, John Rockart published a groundbreaking Harvard Business Review article that led to the development of executive information systems (EIS) or executive support systems (ESS). Rockart developed the concept of using information systems to display critical success metrics for managers.

Robert Bonczek, Clyde Holsapple, and Andrew Whinston (1981) explained a theoretical framework for understanding the issues associated with designing knowledge-oriented Decision Support Systems. They identified four essential "aspects" or general components that were common to all DSS: 1. A language system (LS) that specifies all messages a specific DSS can accept; 2. A presentation system (PS) for all messages a DSS can emit; 3. A knowledge system (KS) for all knowledge a DSS has; and 4. A problem-processing system (PPS), the "software engine" that tries to recognize and solve problems during the use of a specific DSS. Their book explained how Artificial Intelligence and Expert Systems technologies were relevant to developing DSS.
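The four components can be sketched as cooperating objects. The component names follow Bonczek, Holsapple, and Whinston's terms; everything else here (the single accepted message, the stored facts) is invented for illustration:

```python
# Sketch of the LS / PS / KS / PPS decomposition described above.
# This toy DSS accepts one message ("reorder?") -- its language system --
# and emits one of two messages -- its presentation system.

class KnowledgeSystem:                      # KS: all knowledge the DSS has
    def __init__(self):
        self.facts = {"inventory": 120, "reorder_point": 150}

class ProblemProcessingSystem:              # PPS: the "software engine"
    LANGUAGE = {"reorder?"}                 # LS: messages the DSS accepts

    def __init__(self, ks):
        self.ks = ks

    def solve(self, message):
        if message not in self.LANGUAGE:
            raise ValueError("message not in the language system")
        low = self.ks.facts["inventory"] < self.ks.facts["reorder_point"]
        return "reorder now" if low else "hold"   # PS: messages it can emit

pps = ProblemProcessingSystem(KnowledgeSystem())
```

The separation matters architecturally: the PPS can be swapped (rules, models, AI techniques) without changing what the DSS accepts or emits.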

Finally, Ralph Sprague and Eric Carlson's (1982) book Building Effective Decision Support Systems was an important milestone. Much of the book further explained the Sprague (1980) DSS framework of data base, model base and dialog generation and management software. Also, it provided a practical and understandable overview of how organizations could and should build DSS. Sprague and Carlson (1982) defined DSS as "a class of information system that draws on transaction processing systems and interacts with the other parts of the overall information system to support the decision-making activities of managers and other knowledge workers in organizations (p. 9)."

IV. DSS Applications Development

Beginning in about 1980, many activities associated with building and studying DSS occurred in universities and organizations and resulted in expanding the scope of DSS applications. These actions also expanded the field of decision support systems beyond the initial business and management application domain. These diverse systems were all called Decision Support Systems. From those early days, it was recognized that DSS could be designed to support decision makers at any level in an organization. Also, DSS could support operations decision making, financial management and strategic decision making.

A literature survey and citation studies (Alavi & Joachimsthaler, 1990; Eom & Lee, 1990a; Eom, 2002; Arnott & Pervan, 2005) suggest the major applications for DSS emphasized manipulating quantitative models, accessing and analyzing large data bases, and supporting group decision making. Much of the model-driven DSS research emphasized use of the systems by individuals, i.e., personal DSS, while data-driven DSS were usually institutional, ad hoc or organizational DSS. Group DSS research emphasized impacts on decision process structuring and especially brainstorming.


The discussion in this section follows the broad historical progression of DSS research. The first subsection examines model-driven DSS; then the focus turns to data-driven DSS and executive information systems, noting the growing prominence of such systems beginning in the late 1980s. The origins of communications-driven DSS are then briefly explored, along with the bifurcation into two types of group DSS, model-driven and communications-driven. Developments in document storage technologies and search engines then made document-driven DSS more widely available as web-based systems. The last subsection summarizes major developments in Artificial Intelligence (AI) and expert systems that made suggestion or knowledge-driven DSS practical.

IV.1 Model-driven DSS

Scott Morton's (1971) production planning management decision system was the first widely discussed model-driven DSS, but Ferguson and Jones' (1969) production scheduling application was also a model-driven DSS. Many of the early decision systems mentioned in Section 2, e.g., Sprinter, MEDIAC and Brandaid, are probably model-driven DSS.

A model-driven DSS emphasizes access to and manipulation of financial, optimization and/or simulation models. Simple quantitative models provide the most elementary level of functionality. Model-driven DSS use limited data and parameters provided by decision makers to aid decision makers in analyzing a situation, but in general large data bases are not needed for model-driven DSS (Power, 2002).
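A what-if model of the kind such systems manipulate can be tiny: the decision maker supplies a few parameters and compares the outcomes. A sketch in the spirit of the financial planning tools discussed below, with all figures invented:

```python
# What-if analysis with a simple quantitative model: the decision maker
# supplies a few parameters and compares scenarios. Figures are invented.

def profit(units, price, unit_cost, fixed_cost):
    return units * (price - unit_cost) - fixed_cost

# Try alternative price assumptions against the same cost structure:
scenarios = {p: profit(units=500, price=p, unit_cost=6, fixed_cost=1500)
             for p in (9, 10, 11)}
```

This illustrates why model-driven DSS need little stored data: the model plus a handful of decision-maker-supplied parameters is the whole system.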

Early versions of model-driven DSS were called model-oriented DSS by Alter (1980), computationally oriented DSS by Bonczek, Holsapple and Whinston (1981), and later spreadsheet-oriented and solver-oriented DSS by Holsapple and Whinston (1996).

The first commercial tool for building model-driven DSS using financial and quantitative models was called IFPS, an acronym for interactive financial planning system. It was developed in the late 1970s by Gerald R. Wagner and his students at the University of Texas. Wagner's company, EXECUCOM Systems, marketed IFPS until the mid-1990s. Gray's Guide to IFPS (1983) promoted the use of the system in business schools. Another DSS generator for building specific systems based upon the Analytic Hierarchy Process (Saaty, 1982), called Expert Choice, was released in 1983. Expert Choice supports personal or group decision making. Ernest Forman worked closely with Thomas Saaty to design Expert Choice.
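The Analytic Hierarchy Process underlying Expert Choice derives priority weights from a reciprocal pairwise-comparison matrix; the standard hand approximation normalizes each column and averages the rows. A sketch with an invented 3-criterion matrix:

```python
# Approximate AHP priority weights (Saaty) by column-normalizing a
# pairwise comparison matrix and averaging the rows. The matrix is
# invented: criterion 1 is judged 3x as important as criterion 2 and
# 5x as important as criterion 3.

A = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]

n = len(A)
col_sums = [sum(row[j] for row in A) for j in range(n)]
weights = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]
```

The weights sum to 1 and rank the criteria; the full method uses the principal eigenvector of the matrix, for which this row average is the common approximation.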

In 1978, Dan Bricklin and Bob Frankston co-invented the software program VisiCalc (Visible Calculator). VisiCalc provided managers the opportunity for hands-on computer-based analysis and decision support at a reasonably low cost. VisiCalc was the first "killer" application for personal computers and made possible the development of many model-oriented, personal DSS for use by managers. The history of microcomputer spreadsheets is described in Power (2000). In 1987, Frontline Systems, founded by Dan Fylstra, marketed the first optimization solver add-in for Microsoft Excel.

In a 1988 paper, Sharda, Barr, and McDonnell reviewed the first 15 years of model-driven DSS research. They concluded that research related to using models and financial planning systems for decision support was encouraging but certainly not uniformly positive. As computerized models became more numerous, research focused on model management and on enhancing more diverse types of models for use in DSS, such as multicriteria, optimization and simulation models.

The idea of the model-driven spatial decision support system (SDSS) evolved in the late 1980s (Armstrong, Densham, and Rushton, 1986), and by 1995 the SDSS concept had become firmly established in the literature (Crossland, Wynne, and Perkins, 1995). Data-driven spatial DSS are also common.

IV.2 Data-driven DSS

In general, a data-driven DSS emphasizes access to and manipulation of a time-series of internal company data and sometimes external and real-time data. Simple file systems accessed by query and retrieval tools provide the most elementary level of functionality. Data warehouse systems that allow the manipulation of data by computerized tools tailored to a specific task and setting, or by more general tools and operators, provide additional functionality. Data-driven DSS with On-line Analytical Processing (cf., Codd et al., 1993) provide the highest level of functionality and decision support, linked to analysis of large collections of historical data. Executive Information Systems are examples of data-driven DSS (Power, 2002). Initial examples of these systems were called data-oriented DSS, Analysis Information Systems (Alter, 1980) and retrieval-only DSS by Bonczek, Holsapple and Whinston (1981).
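The elementary query-and-aggregate functionality described above can be sketched as a roll-up over a small fact table (the fact rows are invented):

```python
from collections import defaultdict

# OLAP-style roll-up: aggregate a time-series of sales facts by region,
# the elementary operation a data-driven DSS provides. Data is invented.

facts = [
    {"month": "1995-01", "region": "East", "sales": 120},
    {"month": "1995-01", "region": "West", "sales": 90},
    {"month": "1995-02", "region": "East", "sales": 150},
]

by_region = defaultdict(int)
for row in facts:
    by_region[row["region"]] += row["sales"]
```

A data warehouse does the same slice-and-aggregate over millions of rows along many dimensions (region, period, product), but the operation is the same in kind.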

One of the first data-driven DSS was built using an APL-based software package called AAIMS, An Analytical Information Management System. It was developed from 1970 to 1974 by Richard Klaas and Charles Weiss at American Airlines (cf., Alter, 1980).


As noted previously, in 1979 John Rockart's research stimulated the development of executive information systems (EIS) and executive support systems (ESS). These systems evolved from single-user model-driven decision support systems and from the development of relational database products. The first EIS used pre-defined information screens maintained by analysts for senior executives. For example, in the fall of 1978, development of an EIS called the Management Information and Decision Support (MIDS) system began at Lockheed-Georgia (cf., Houdeshel and Watson, 1987).

The first EIS were developed in the late 1970s by Northwest Industries and Lockheed, "who risked being on the 'bleeding edge' of technology …. Few even knew about the existence of EIS until John Rockart and Michael Treacy's article, 'The CEO Goes On-line,' appeared in the January-February 1982 issue of the Harvard Business Review (Watson, Houdeshel and Rainer, 1997, p. 6)". Watson and colleagues further note: "A major contributor to the growth of EIS was the appearance of vendor-supplied EIS software in the mid-1980s. Pilot Software's Command Center and Comshare's Commander EIS made it much easier for firms to develop an EIS by providing capabilities for (relatively) easy screen design, data importation, user-friendly front ends, and access to news services (p. 6)". In a related development, in 1984 Teradata's parallel processing relational database management system shipped to customers Wells Fargo and AT&T.

In about 1990, data warehousing and On-Line Analytical Processing (OLAP) began broadening the realm of EIS and defined a broader category of data-driven DSS (cf., Dhar and Stein, 1997). Nigel Pendse (1997), author of the OLAP Report, claims both multidimensional analysis and OLAP had origins in the APL programming language and in systems like Express and Comshare's System W. Nylund (1999) traces the developments associated with Business Intelligence (BI) to Procter & Gamble's efforts in 1985 to build a DSS that linked sales information and retail scanner data. Metaphor Computer Systems, founded by researchers like Ralph Kimball from Xerox's Palo Alto Research Center (PARC), built the early P&G data-driven DSS. Staff from Metaphor later founded many of the Business Intelligence vendors. The term BI is a popularized, umbrella term coined and promoted by Howard Dresner of the Gartner Group in 1989. It describes a set of concepts and methods to improve business decision making by using fact-based support systems. BI is sometimes used interchangeably with briefing books, report and query tools and executive information systems. In general, business intelligence systems are data-driven DSS.

Bill Inmon and Ralph Kimball actively promoted decision support systems built using relational database technologies. For many Information Systems practitioners, DSS built using Oracle or DB2 were the first decision support systems they read about in the popular computing literature. Ralph Kimball was "The Doctor of DSS" and Bill Inmon was the "father of the data warehouse". By 1995, Wal-Mart's data-driven DSS had more than 5 terabytes of on-line storage from Teradata, which expanded to more than 24 terabytes in 1997. In more recent years, vendors added tools to create web-based dashboards and scorecards.

IV.3 Communications-driven DSS


Communications-driven DSS use network and communications technologies to facilitate decision-relevant collaboration and communication. In these systems, communication technologies are the dominant architectural component. Tools used include groupware, video conferencing and computer-based bulletin boards (Power, 2002).

Engelbart's 1962 paper "Augmenting Human Intellect: A Conceptual Framework" is the anchor for much of the later work related to communications-driven DSS. In 1969, he demonstrated the first hypermedia/groupware system, NLS (oNLine System), at the Fall Joint Computer Conference in San Francisco. Engelbart invented both the computer mouse and groupware.

Joyner and Tunstall's article (1970), reporting testing of their Conference Coordinator computer software, is the first empirical study in this research area. Murray Turoff's (1970) article introduced the concept of Computerized Conferencing. He developed and implemented the first Computer Mediated Communications System (EMISARI), tailored to facilitate group communications.

In the early 1980s, academic researchers developed a new category of software to support group decision making, called Group Decision Support Systems, abbreviated GDSS (cf., Gray, 1981; Huber, 1982; Turoff and Hiltz, 1982). Mindsight from Execucom Systems, GroupSystems developed at the University of Arizona, and the SAMM system developed by University of Minnesota researchers were early Group DSS.


Eventually GroupSystems matured into a commercial product. Jay Nunamaker, Jr. and his colleagues wrote in 1992 that the underlying concept for GroupSystems had its beginning in 1965 with the development of the Problem Statement Language/Problem Statement Analyzer at Case Institute of Technology. In 1984, the forerunner to GroupSystems, called PLEXSYS, was completed and a computer-assisted group meeting facility was constructed at the University of Arizona. The first Arizona facility, called the PlexCenter, housed a large U-shaped conference table with 16 computer workstations.

On the origins of SAMM, Dickson, Poole and DeSanctis (1992) report that Brent Gallupe, a Ph.D. student at the University of Minnesota, decided in 1984 "to program his own small GDSS system in BASIC and run it on his university's VAX computer".

DeSanctis and Gallupe (1987) defined two types of GDSS. Basic, or level 1, GDSS are systems with tools to reduce communication barriers, such as large screens for display of ideas, voting mechanisms, and anonymous input of ideas and preferences. These are communications-driven DSS. Advanced, or level 2, GDSS provide problem-structuring techniques, such as planning and modeling tools. These are model-driven group DSS. Since the mid-1980s, many research studies have examined the impacts and consequences of both types of group DSS. Also, companies have commercialized model-driven group DSS and groupware.
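A level 1 tool such as anonymous voting reduces communication barriers simply by separating ballots from their authors; a minimal sketch (the idea names and ballots are invented):

```python
from collections import Counter

# Anonymous preference voting, a level 1 GDSS mechanism: only the
# tally is reported, never who cast which ballot. Ballots are invented.

def tally(ballots):
    """Return (idea, votes) pairs, most popular first."""
    return Counter(ballots).most_common()

ranked = tally(["idea B", "idea A", "idea B", "idea C", "idea B"])
```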

Kersten (1985) developed NEGO, a computerized group tool to support negotiations. Bui and Jarke (1986) reported developing Co-op, a system for cooperative multiple-criteria group decision support. Kraemer and King (1988) introduced the concept of Collaborative Decision Support Systems (CDSSs). They defined them as interactive computer-based systems that facilitate the solution of ill-structured problems by a set of decision makers working together as a team.

In 1989, Lotus introduced a groupware product called Notes and broadened the focus of GDSS to include enhancing communication, collaboration and coordination among groups of people. Notes had its roots in a product called PLATO Notes, written at the Computer-based Education Research Laboratory (CERL) at the University of Illinois in 1973 by David R. Woolley.

In general, groupware, bulletin boards, audio and videoconferencing are the primary technologies for communications-driven decision support. In the past few years, voice and video delivered using the Internet protocol have greatly expanded the possibilities for synchronous communications-driven DSS.

IV.4 Document-driven DSS

A document-driven DSS uses computer storage and processing technologies to provide document retrieval and analysis. Large document databases may include scanned documents, hypertext documents, images, sounds and video. Examples of documents that might be accessed by a document-driven DSS are policies and procedures, product specifications, catalogs, and corporate historical documents, including minutes of meetings and correspondence. A search engine is a primary decision-aiding tool associated with a document-driven DSS (Power, 2002). These systems have also been called text-oriented DSS (Holsapple and Whinston, 1996).
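The search-engine role can be illustrated with a minimal sketch: an inverted index over a handful of hypothetical corporate documents. The document names and texts below are invented for illustration only.

```python
from collections import defaultdict

def build_index(documents):
    """Build an inverted index mapping each term to the ids of documents containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term (a simple AND search)."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index[terms[0]].copy()
    for term in terms[1:]:
        results &= index[term]
    return results

# Hypothetical documents of the kinds listed above: a policy, meeting minutes, a catalog.
docs = {
    "policy-17": "travel expense policy and reimbursement procedures",
    "minutes-03": "board minutes discussing travel budget overruns",
    "catalog-9": "product catalog for industrial sensors",
}
index = build_index(docs)
print(sorted(search(index, "travel")))  # ['minutes-03', 'policy-17']
```

A production document-driven DSS would add ranking, stemming, and metadata filters, but the retrieval core is the same idea.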

The precursor for this type of DSS is Vannevar Bush's (1945) article titled "As We May Think". Bush wrote, "Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and to coin one at random, 'memex' will do". Bush's memex is a much broader vision than that of today's document-driven DSS.

Text and document management emerged in the 1970s and 1980s as an important, widely used computerized means for representing and processing pieces of text (Holsapple and Whinston, 1996). The first scholarly article for this category of DSS was written by Swanson and Culnan (1978), who reviewed document-based systems for management planning and control. Until the mid-1990s little progress was made in helping managers find documents to support their decision making. Fedorowicz (1993, 1996) helped define the need for such systems; she estimated in her 1996 article that only 5 to 10 percent of stored business documents are available to managers for use in decision making. World-wide web technologies significantly increased the availability of documents and facilitated the development of document-driven DSS.

IV.5 Knowledge-driven DSS

Knowledge-driven DSS can suggest or recommend actions to managers. These DSS are person-computer systems with specialized problem-solving expertise. The "expertise" consists of knowledge about a particular domain, understanding of problems within that domain, and "skill" at solving some of these problems (Power, 2002). These systems have been called suggestion DSS (Alter, 1980) and knowledge-based DSS (Klein and Methlie, 1995). Goul, Henderson, and Tonge (1992) examined Artificial Intelligence (AI) contributions to DSS.

In 1965, a Stanford University research team led by Edward Feigenbaum created the DENDRAL expert system. DENDRAL led to the development of other rule-based reasoning programs, including MYCIN, which helped physicians diagnose blood diseases based on sets of clinical symptoms. The MYCIN project resulted in development of the first expert-system shell (Buchanan and Shortliffe, 1984).
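The rule-based reasoning style that MYCIN popularized can be sketched as a tiny forward-chaining engine. The two rules below are invented for illustration and are not drawn from MYCIN's actual rule base.

```python
# Each rule maps a set of required facts (premises) to a new conclusion.
RULES = [
    ({"fever", "stiff_neck"}, "suspect_meningitis"),
    ({"suspect_meningitis"}, "recommend_lumbar_puncture"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose premises are satisfied until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"fever", "stiff_neck"}, RULES)
print("recommend_lumbar_puncture" in derived)  # True
```

Real expert systems add certainty factors and explanation facilities, but the fire-rules-until-quiescence loop is the core of this style of knowledge-driven DSS.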

Bonczek, Holsapple and Whinston's (1981) book created interest in using these technologies for DSS. In 1983, Dustin Huntington established EXSYS. That company and product made it practical to use PC-based tools to develop expert systems. By 1992, some 11 shell programs were available for the Macintosh platform, 29 for IBM-DOS platforms, 4 for Unix platforms, and 12 for dedicated mainframe applications (National Research Council, 1999). Artificial Intelligence systems have been developed to detect fraud and expedite financial transactions, many additional medical diagnostic systems have been based on AI, and expert systems have been used for scheduling in manufacturing operations and in web-based advisory systems. In recent years, connecting expert-system technologies to relational databases with web-based front ends has broadened the deployment and use of knowledge-driven DSS.

V. Web-based DSS


Beginning in approximately 1995, the World-wide Web and global Internet provided a technology platform for further extending the capabilities and deployment of computerized decision support. The release of the HTML 2.0 specification with form tags and tables was a turning point in the development of web-based DSS. In 1995, a number of papers on using the Web and Internet for decision support were presented at the 3rd International Conference of the International Society for Decision Support Systems (ISDSS). In addition to Web-based, model-driven DSS, researchers were reporting Web access to data warehouses. DSS Research Resources was started as a web-based collection of bookmarks. By 1995, the World-Wide Web (Berners-Lee, 1996) was recognized by a number of software developers and academics as a serious platform for implementing all types of Decision Support Systems (cf. Bhargava and Power, 2001).

In November 1995, Power, Bhargava and Quek submitted the Decision Support Systems Research page for inclusion in ISWorld. The goal was to provide a useful starting point for accessing Web-based material related to the design, development, evaluation, and implementation of Decision Support Systems. Nine months later, a DSS/WWW Workshop organized by Power and Quek was held as part of the IFIP Working Group 8.3 Conference on "Implementing Systems for Supporting Management Decisions: Concepts, Methods and Experiences", July 21-24, 1996 in London, UK.

In 1996-97, corporate intranets were developed to support information exchange and knowledge management. The primary decision support tools included ad hoc query and reporting tools, optimization and simulation models, online analytical processing (OLAP), data mining and data visualization (cf. Powell, 2001). Enterprise-wide DSS using database technologies were especially popular in Fortune 2000 companies (Power, 1997). Bhargava, Krishnan and Müller (1997) continued to discuss and experiment with electronic markets for decision technologies.

In 1999, vendors introduced new Web-based analytical applications. Many DBMS vendors shifted their focus to Web-based analytical applications and business intelligence solutions. In 2000, application service providers (ASPs) began hosting the application software and technical infrastructure for decision support capabilities. 2000 was also the year of the portal: vendors introduced more sophisticated "enterprise knowledge portals" that combined information portals, knowledge management, business intelligence, and communications-driven DSS in an integrated Web environment (cf. Bhargava and Power, 2001).

Power (1998) defined a Web-based decision support system as a computerized system that delivers decision support information or decision support tools to a manager or business analyst using a "thin-client" Web browser like Netscape Navigator or Internet Explorer. The computer server hosting the DSS application is linked to the user's computer by a network using the TCP/IP protocol.
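A minimal sketch of this thin-client architecture, using Python's standard http.server module and an invented single-cash-flow present-value model: the model runs on the server, and any Web browser can retrieve results over HTTP by passing parameters in the query string.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def npv(cash, rate, years):
    """Present value of a single cash flow received `years` from now."""
    return cash / (1 + rate) ** years

class DSSHandler(BaseHTTPRequestHandler):
    """Serves the model to any thin-client Web browser over HTTP."""
    def do_GET(self):
        qs = parse_qs(urlparse(self.path).query)
        value = npv(float(qs.get("cash", ["1000"])[0]),
                    float(qs.get("rate", ["0.05"])[0]),
                    int(qs.get("years", ["3"])[0]))
        body = f"<html><body>NPV: {value:.2f}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("localhost", 8000), DSSHandler).serve_forever()
# A browser request to http://localhost:8000/?cash=1000&rate=0.05&years=3
# would then return the computed present value.
```

The decision model and the data stay on the server; the browser needs nothing beyond HTML rendering, which is the essence of Power's "thin-client" definition.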

VI. Conclusions

DSS practice, research and technology continue to evolve. By 1996, Holsapple and Whinston had identified five specialized types of DSS: text-oriented DSS, database-oriented DSS, spreadsheet-oriented DSS, solver-oriented DSS, and rule-oriented DSS. The last four of these types match up with some of Alter's (1980) categories. Arnott and Pervan (2005) traced the evolution of DSS using seven sub-groupings of research and practice: personal DSS, group support systems, negotiation support systems, intelligent DSS, knowledge management-based DSS, executive information systems/business intelligence, and data warehousing. These sub-groupings overlap, but they reflect the diverse evolution of prior research.

This chapter used an expanded DSS framework (Power, 2001, 2002) to retrospectively discuss the historical evolution of decision support systems. The Web has had a significant impact on the variety, distribution and sophistication of DSS, but handheld PCs, wireless networks, and expanding parallel processing coupled with very large databases and visualization tools continue to encourage the development of innovative decision support applications.

Historians use two approaches to apply the past to the future: reasoning by analogy and projection of trends. In many ways computerized decision support systems are like airplanes: they come in various shapes, sizes and forms, they are technologically sophisticated, and they are a very necessary tool in many organizations. Decision support systems research and development will continue to exploit new technology developments and will benefit from progress in very large databases, artificial intelligence, human-computer interaction, simulation and optimization, software engineering, and telecommunications, and from more basic research on behavioral topics like organizational decision making, planning, behavioral decision theory and organizational behavior.


Trends suggest that data-driven DSS will use faster, real-time access to larger, better integrated databases. Model-driven DSS will be more complex yet understandable, and systems built using simulations and their accompanying visual displays will be increasingly realistic. Communications-driven DSS will provide more real-time video communications support. Document-driven DSS will access larger repositories of unstructured data, and these systems will present appropriate documents in more useable formats. Finally, knowledge-driven DSS will likely be more sophisticated and more comprehensive: the advice from knowledge-driven DSS will be better and the applications will cover broader domains.

Decision Support Systems pioneers came from a wide variety of backgrounds and faced many challenges that they successfully overcame to demonstrate the value of using computers, information technologies and specific decision support software to enhance, and in some situations improve, decision making. The DSS pioneers created particular and distinct streams of technology development and research that serve as the foundation for much of today's interest in building and studying computerized decision support systems. The legacy of the pioneers must be preserved. Check the Decision Support Systems Pioneers list at DSSResources.com/history/pioneers/pioneerslist.html.

The future of decision support systems will certainly be different from the opportunistic and incremental innovations seen in the recent past. Decision support systems as an academic discipline is likely to follow a path similar to computer architecture and software engineering, becoming more rigorous and more clearly delineated. DSS consulting, teaching and research can be mutually supportive, and each task can help establish a niche for those interested in building and studying DSS, whether in Colleges of Information, Business or Engineering.

The history of Decision Support Systems covers a relatively brief span of years, and the concepts and technologies are still evolving. Today it is still possible to reconstruct the history of Decision Support Systems (DSS) from retrospective accounts of key participants as well as from published and unpublished materials. Many of the early innovators and developers are retiring, but their insights and actions can be captured to guide future innovation in this field. It is hoped this paper leads to email and retrospective accounts that can help us understand the "real" history of DSS. The Internet and Web have speeded up developments in decision support and have provided a new means of capturing and documenting the development of knowledge in this research area. Decision support pioneers include many academic researchers from programs at MIT, the University of Arizona, the University of Hawaii, the University of Minnesota and Purdue University. The DSS pioneers created particular and distinct streams of technology development and research that serve as the foundation for much of today's work in DSS.

VII. References

Alavi, M. and E. A. Joachimsthaler, "Revisiting DSS Implementation Research: A Meta-Analysis of the Literature and Suggestions for Researchers," MIS Quarterly, 16, 1, 1992, 95-116.

Alter, S. L., "A Study of Computer Aided Decision Making in Organizations," Ph.D. dissertation, M.I.T., 1975.

Alter, S. L., "Why Is Man-Computer Interaction Important for Decision Support Systems?", Interfaces, 7, 2, Feb. 1977, 109-115.

Alter, S. L., Decision Support Systems: Current Practice and Continuing Challenge, Reading, MA: Addison-Wesley, 1980.

Armstrong, M. P., P. J. Densham and G. Rushton, "Architecture for a Microcomputer Based Spatial Decision Support System," Second International Symposium on Spatial Data Handling, International Geographics Union, 1986, 120-131.

Arnott, D. and G. Pervan, "A Critical Analysis of Decision Support Systems Research," Journal of Information Technology, 20, 2, 2005, 67-87.

Baskerville, R. and M. Myers, "Information Systems as a Reference Discipline," MIS Quarterly, 26, 1, 2002, 1-14.

Berners-Lee, T., "The World Wide Web: Past, Present and Future," August 1996, URL http://www.w3.org/People/Berners-Lee/1996/ppf.html, last accessed March 5, 2007.

Bhargava, H. K., R. Krishnan and R. Müller, "Decision Support on Demand: Emerging Electronic Markets for Decision Technologies," Decision Support Systems, 19, 3, 1997, 193-214.

Bhargava, H. and D. J. Power, "Decision Support Systems and Web Technologies: A Status Report," Proceedings of the 2001 Americas Conference on Information Systems, Boston, MA, August 3-5, 2001.

Bonczek, R. H., C. W. Holsapple and A. B. Whinston, Foundations of Decision Support Systems, New York: Academic Press, 1981.

Buchanan, B. G. and E. H. Shortliffe (eds.), Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, 1984.

Bui, T. X. and M. Jarke, "Communications Design for Co-op: A Group Decision Support System," ACM Transactions on Office Information Systems, 4, 2, 1986, 81-103.

Bush, V., "As We May Think," The Atlantic Monthly, 176, 1, July 1945, 101-108, URL http://www.theatlantic.com/unbound/flashbks/computer/bushf.htm.

Codd, E. F., S. B. Codd and C. T. Salley, "Providing OLAP (On-Line Analytical Processing) to User-Analysts: An IT Mandate," E. F. Codd and Associates, 1993 (sponsored by Arbor Software Corporation).

Crossland, M. D., B. E. Wynne and W. C. Perkins, "Spatial Decision Support Systems: An Overview of Technology and a Test of Efficacy," Decision Support Systems, 14, 3, 1995, 219-235.

Davis, G., Management Information Systems: Conceptual Foundations, Structure, and Development, New York: McGraw-Hill, 1974.

DeSanctis, G. and R. B. Gallupe, "A Foundation for the Study of Group Decision Support Systems," Management Science, 33, 5, May 1987, 589-609.

Dickson, G. W., M. S. Poole and G. DeSanctis, "An Overview of the GDSS Research Project and the SAMM System," in Bostrom, R. P., R. T. Watson and S. T. Kinney (eds.), Computer Augmented Teamwork: A Guided Tour, New York: Van Nostrand Reinhold, 1992, 163-179.

Dhar, V. and R. Stein, Intelligent Decision Support Methods: The Science of Knowledge, Upper Saddle River, NJ: Prentice-Hall, 1997.

Donovan, J. J. and S. E. Madnick, "Institutional and Ad Hoc DSS and Their Effective Use," Data Base, 8, 3, 1977.

Engelbart, D. C., "Augmenting Human Intellect: A Conceptual Framework," Air Force Office of Scientific Research, AFOSR-3233, October 1962, URL www.bootstrap.org/augdocs/friedewald030402/augmentinghumanintellect/ahi62index.html.

Eom, S. B. and S. M. Lee, "DSS Applications Development Research: Leading Institutions and Most Frequent Contributors (1971-April 1988)," Decision Support Systems, 6, 3, 1990a, 269-275.

Eom, S. B. and S. M. Lee, "A Survey of Decision Support System Applications (1971-April 1988)," Interfaces, 20, 3, 1990b, 65-79.

Eom, S. B., Decision Support Systems Research (1970-1999), Lewiston, NY: Edwin Mellen Press, 2002.

Fedorowicz, J., "A Technology Infrastructure for Document-Based Decision Support Systems," in Sprague, R. and H. J. Watson (eds.), Decision Support Systems: Putting Theory into Practice (Third Edition), Prentice-Hall, 1993, 125-136.

Fedorowicz, J., "Document Based Decision Support," in Sprague, R., Jr. and H. J. Watson (eds.), Decision Support for Management, Upper Saddle River, NJ: Prentice-Hall, 1996.

Ferguson, R. L. and C. H. Jones, "A Computer Aided Decision System," Management Science, 15, 10, 1969, B550-B562.

Gerrity, T. P., Jr., "Design of Man-Machine Decision Systems: An Application to Portfolio Management," Sloan Management Review, 12, 2, 1971, 59-75.

Gorry, A. and M. S. Scott Morton, "A Framework for Management Information Systems," Sloan Management Review, 13, 1, Fall 1971, 56-79.

Goul, M., J. C. Henderson and F. M. Tonge, "The Emergence of Artificial Intelligence as a Reference Discipline for Decision Support Systems Research," Decision Sciences, 23, 6, 1992, 1263-1276.

Gray, P., "The SMU Decision Room Project," Transactions of the 1st International Conference on Decision Support Systems, Atlanta, GA, 1981, 122-129.

Gray, P., Guide to IFPS (Interactive Financial Planning System), New York: McGraw-Hill, 1983.

Hackathorn, R. D. and P. G. W. Keen, "Organizational Strategies for Personal Computing in Decision Support Systems," MIS Quarterly, 5, 3, September 1981, 21-26.

Holt, C. C. and G. P. Huber, "A Computer Aided Approach to Employment Service Placement and Counseling," Management Science, 15, 11, 1969, 573-595.

Holsapple, C. and A. Whinston, Decision Support Systems: A Knowledge-Based Approach, Minneapolis/St. Paul, MN: West Publishing, 1996.

Houdeshel, G. and H. Watson, "The Management Information and Decision Support (MIDS) System at Lockheed-Georgia," MIS Quarterly, 11, 1, March 1987, 127-140.

Huber, G. P., "Group Decision Support Systems as Aids in the Use of Structured Group Management Techniques," Transactions of the 2nd International Conference on Decision Support Systems, 1982, 96-103.

Joyner, R. and K. Tunstall, "Computer Augmented Organizational Problem Solving," Management Science, 17, 4, 1970, B212-B226.

Keen, P. G. W. and M. S. Scott Morton, Decision Support Systems: An Organizational Perspective, Reading, MA: Addison-Wesley, 1978.

Keen, P. G. W., "MIS Research: Reference Disciplines and Cumulative Tradition," in McLean, E. (ed.), Proceedings of the First International Conference on Information Systems, Philadelphia, PA, December 1980, 9-18.

Kersten, G. E., "NEGO - Group Decision Support System," Information and Management, 8, 5, 1985, 237-246.

Kraemer, K. L. and J. L. King, "Computer-Based Systems for Cooperative Work and Group Decision Making," ACM Computing Surveys, 20, 2, 1988, 115-146.

Klein, M. and L. B. Methlie, Knowledge-Based Decision Support Systems with Applications in Business, Chichester, UK: John Wiley & Sons, 1995.

Little, J. D. C. and L. M. Lodish, "A Media Planning Calculus," Operations Research, 17, Jan.-Feb. 1969, 1-35.

Little, J. D. C., "Models and Managers: The Concept of a Decision Calculus," Management Science, 16, 8, April 1970, B466-B485.

Little, J. D. C., "Brandaid, an On-Line Marketing Mix Model, Part 2: Implementation, Calibration and Case Study," Operations Research, 23, 4, 1975, 656-673.

McCosh, A., "Comments on 'A Brief History of DSS'," email to D. Power, Oct 3, 2002, URL http://dssresources.com/history/dsshistory.html, last accessed March 10, 2007.

McCosh, A. M. and B. A. Correa-Perez, "The Optimization of What?" in Gupta, J., G. Forgionne and M. Mora (eds.), Intelligent Decision-making Support Systems: Foundations, Applications and Challenges, Springer-Verlag, 2006, 475-494.

McCosh, A. M. and M. S. Scott Morton, Management Decision Support Systems, London: Macmillan, 1978.

Nunamaker, J. F., Jr., A. R. Dennis, J. F. George, W. B. Martz, Jr., J. S. Valacich and D. R. Vogel, "GroupSystems," in Bostrom, R. P., R. T. Watson and S. T. Kinney (eds.), Computer Augmented Teamwork: A Guided Tour, New York: Van Nostrand Reinhold, 1992, 143-162.

Nylund, A., "Tracing the BI Family Tree," Knowledge Management, July 1999.

National Research Council, Committee on Innovations in Computing and Communications, "Funding a Revolution: Government Support for Computing Research," 1999, URL http://www.nap.edu/readingroom/books/far/contents.html.

Pendse, N., "Origins of Today's OLAP Products," The OLAP Report, URL www.olapreport.com, 1997.

Power, D. J., "What is a DSS?", DSstar, The On-Line Executive Journal for Data-Intensive Decision Support, 1, 3, October 21, 1997.

Power, D. J., "Web-based Decision Support Systems," DSstar, The On-Line Executive Journal for Data-Intensive Decision Support, 2, 33 and 34, August 18 and 25, 1998b.

Power, D. J., "A History of Microcomputer Spreadsheets," Communications of the Association for Information Systems, 4, 9, October 2000, 154-162.

Power, D. J., "Supporting Decision-Makers: An Expanded Framework," in Harriger, A. (ed.), e-Proceedings Informing Science Conference, Krakow, Poland, June 19-22, 2001, 431-436.

Power, D. J., Decision Support Systems: Concepts and Resources for Managers, Westport, CT: Greenwood/Quorum, 2002.

Power, D. J., "A Brief History of Decision Support Systems," DSSResources.COM, World Wide Web, URL DSSResources.COM/history/dsshistory2.8.html, version 2.8, May 31, 2003.

Power, D. J., "Decision Support Systems: From the Past to the Future," Proceedings of the 2004 Americas Conference on Information Systems, New York, NY, August 6-8, 2004a, 2025-2031.

Power, D. J., "Specifying an Expanded Framework for Classifying and Describing Decision Support Systems," Communications of the Association for Information Systems, 13, Article 13, February 2004b, 158-166.

Powell, R., "DM Review: A 10 Year Journey," DM Review, February 2001, URL http://www.dmreview.com, last accessed March 10, 2001.

Raymond, R. C., "Use of the Time-sharing Computer in Business Planning and Budgeting," Management Science, 12, 8, 1966, B363-B381.

Rockart, J. F., "Chief Executives Define Their Own Data Needs," Harvard Business Review, 67, 2, March-April 1979, 81-93.

Rockart, J. F. and M. E. Treacy, "The CEO Goes On-Line," Harvard Business Review, January-February 1982, 82-88.

Scott Morton, M. S., "Computer-Driven Visual Display Devices -- Their Impact on the Management Decision-Making Process," Doctoral Dissertation, Harvard Business School, 1967.

Scott Morton, M. S. and J. A. Stephens, "The Impact of Interactive Visual Display Systems on the Management Planning Process," IFIP Congress, 2, 1968, 1178-1184.

Scott Morton, M. S. and A. M. McCosh, "Terminal Costing for Better Decisions," Harvard Business Review, 46, 3, May-June 1968, 147-156.

Scott Morton, M. S., Management Decision Systems: Computer-Based Support for Decision Making, Boston: Division of Research, Graduate School of Business Administration, Harvard University, 1971.

Saaty, T., Decision Making for Leaders: The Analytical Hierarchy Process for Decisions in a Complex World, Belmont, CA: Wadsworth, 1982.

Sharda, R., S. Barr and J. McDonnell, "Decision Support Systems Effectiveness: A Review and an Empirical Test," Management Science, 34, 2, 1988, 139-159.

Silver, M. S., Systems that Support Decision Makers: Description and Analysis, New York: John Wiley & Sons, 1991.

Simon, H. A., Administrative Behavior, New York, NY: Macmillan, 1947.

Simon, H. A., The New Science of Management Decision, New York, NY: Harper and Row, 1960.

Sprague, R. H., Jr. and H. J. Watson, "Bit by Bit: Toward Decision Support Systems," California Management Review, XXII, 1, Fall 1979, 60-68.

Sprague, R. H., Jr., "A Framework for the Development of Decision Support Systems," Management Information Systems Quarterly, 4, 4, Dec. 1980, 1-26.

Sprague, R. H., Jr. and E. D. Carlson, Building Effective Decision Support Systems, Englewood Cliffs, NJ: Prentice-Hall, 1982.

Swanson, E. B. and M. J. Culnan, "Document-Based Systems for Management Planning and Control: A Classification, Survey, and Assessment," MIS Quarterly, 2, 4, Dec. 1978, 31-46.

Turban, E., "The Use of Mathematical Models in Plant Maintenance Decision Making," Management Science, 13, 6, 1967, B342-B359.

Turoff, M., "Delphi Conferencing: Computer Based Conferencing with Anonymity," Journal of Technological Forecasting and Social Change, 3, 2, 1970, 159-204.

Turoff, M. and S. R. Hiltz, "Computer Support for Group Versus Individual Decisions," IEEE Transactions on Communications, COM-30, 1, 1982, 82-90.

Urban, G. L., "SPRINTER: A Tool for New Products Decision Makers," Industrial Management Review, 8, 2, Spring 1967, 43-54.

Watson, H., G. Houdeshel and R. K. Rainer, Jr., Building Executive Information Systems and Other Decision Support Applications, New York: John Wiley, 1997.

Appendix I. DSS Timeline

Year  Major Milestones

1945  Bush proposed Memex
1947  Simon book Administrative Behavior
1952  Dantzig joined RAND and continued research on linear programming
1955  Semiautomatic Ground Environment (SAGE) project at M.I.T. Lincoln Lab uses first light pen; SAGE, completed in 1962, was the first data-driven DSS
1956  Forrester started System Dynamics Group at the M.I.T. Sloan School
1960  Simon book The New Science of Management Decision; Licklider article on "Man-Computer Symbiosis"
1962  Licklider architect of Project MAC program at M.I.T.; Iverson's book A Programming Language (APL); Engelbart's paper "Augmenting Human Intellect: A Conceptual Framework"
1963  Engelbart established Augmentation Research Center at SRI
1965  Stanford team led by Feigenbaum created DENDRAL expert system; Problem Statement Language/Problem Statement Analyzer (PSL/PSA) developed at Case Institute of Technology
1966  UNIVAC 494 introduced; Tymshare founded; Raymond article on computer time-sharing for business planning and budgeting
1967  Scott Morton's dissertation completed on the impact of computer-driven visual display devices on the management decision-making process; Turban reports national survey on use of mathematical models in plant maintenance decision making
1968  Scott Morton and McCosh article; Scott Morton and Stephens article; Engelbart demonstrated hypermedia-groupware system NLS (oNLine System) at the Fall Joint Computer Conference in San Francisco
1969  Ferguson and Jones article on lab study of a production scheduling computer-aided decision system running on an IBM 7094; Little and Lodish MEDIAC media planning model; Urban's new product model-based system called SPRINTER
1970  Little article on decision calculus support system; Joyner and Tunstall article on Conference Coordinator computer software; IRI Express, a multidimensional analytic tool for time-sharing systems, becomes available; Turoff conferencing system
1971  Gorry and Scott Morton SMR article, first published use of the term Decision Support System; Scott Morton book Management Decision Systems; Gerrity article on man-machine decision systems; Klein and Tixier article on SCARABEE
1973  PLATO Notes written at the Computer-based Education Research Laboratory (CERL) at the University of Illinois by David R. Woolley
1974  Davis's book Management Information Systems; Meador and Ness article on DSS application to corporate planning
1975  Alter completed M.I.T. Ph.D. dissertation "A Study of Computer Aided Decision Making in Organizations"; Keen SMR article on evaluating computer-based decision aids; Boulden book on computer-assisted planning systems
1976  Sprague and Watson article "A Decision Support System for Banks"; Grace paper on Geodata Analysis and Display System
1977  Alter article "A Taxonomy of Decision Support Systems"; Klein article on Finsim; Carlson and Scott Morton chair ACM SIGBDP DSS Conference
1978  Development began on Management Information and Decision Support (MIDS) at Lockheed-Georgia; Keen and Scott Morton book; McCosh and Scott Morton book; Holsapple dissertation completed; Wagner founded Execucom to market IFPS; Bricklin and Frankston created VisiCalc (Visible Calculator) microcomputer spreadsheet; Carlson of IBM San Jose plenary speaker at HICSS-11; Swanson and Culnan article on document-based systems for management planning
1979  Rockart HBR article on CEO data needs
1980  Sprague MISQ article on a DSS framework; Alter book; Hackathorn founded MicroDecisionware
1981  First International Conference on DSS, Atlanta, Georgia; Bonczek, Holsapple and Whinston book; Gray paper on SMU decision rooms and GDSS
1982  Computer named the "Man" of the Year by Time Magazine; Rockart and Treacy HBR article "The CEO Goes On-Line"; Sprague and Carlson book; Metaphor Computer Systems founded by Kimball and others from Xerox PARC; ESRI launched its first commercial GIS software, ARC/INFO; IFIP Working Group 8.3 on Decision Support Systems established
1983  Inmon Computerworld article on relational DBMS; IBM DB2 decision support database released; Student Guide to IFPS by Gray; Huntington established EXSYS; Expert Choice software released
1984  PLEXSYS, Mindsight and SAMM GDSS; first Teradata computer with relational database management system shipped to customers Wells Fargo and AT&T; MYCIN expert system shell explained
1985  Procter & Gamble use first data mart from Metaphor to analyze data from checkout-counter scanners; Whinston founded Decision Support Systems journal; Kersten developed NEGO
1987  Houdeshel and Watson article on MIDS; DeSanctis and Gallupe article on GDSS; Frontline Systems founded by Fylstra, marketed solver add-in for Excel
1988  Turban DSS textbook; Pilot Software EIS for Balanced Scorecard deployed at Analog Devices
1989  Gartner analyst Dresner coins term business intelligence; release of Lotus Notes; International Society for Decision Support Systems (ISDSS) founded by Holsapple and Whinston
1990  Inmon book Using Oracle to Build Decision Support Systems; Eom and Lee co-citation analysis of DSS research 1971-1988
1991  Inmon books Building the Data Warehouse and Database Machines and Decision Support Systems; Berners-Lee's World Wide Web server and browser become publicly available
1993  Codd et al. paper defines online analytical processing (OLAP)
1994  HTML 2.0 with form tags and tables; Pendse's OLAP Report project began
1995  The Data Warehousing Institute (TDWI) established; DSS journal issue on the Next Generation of Decision Support; Crossland, Wynne and Perkins article on spatial DSS; ISWorld DSS Research pages and DSS Research Resources
1996  InterNeg negotiation software renamed Inspire; OLAPReport.com established
1997  Wal-Mart and Teradata created then world's largest production data warehouse at 24 Terabytes (TB)
1998  ACM First International Workshop on Data Warehousing and OLAP
1999  DSSResources.com domain name registered
2000  First AIS Americas Conference mini-track on Decision Support Systems
2001  Association for Information Systems (AIS) Special Interest Group on Decision Support, Knowledge and Data Management Systems (SIG DSS) founded
2003  International Society for Decision Support Systems (ISDSS) merged with AIS SIG DSS

Author Profile


Daniel J. Power is a Professor of Information Systems and Management at the College of Business Administration at the University of Northern Iowa, Cedar Falls, Iowa. He is the editor of DSSResources.COM, the Web-based knowledge repository about computerized systems that support decision making; the editor of PlanningSkills.COM; and the editor of DSS News, a bi-weekly e-newsletter. Dan writes the column "Ask Dan!" in DSS News.


Dr. Power's research interests include the design and development of Decision Support Systems and how DSS impact individual and organizational decision behavior. Since 1982, Power has published more than 40 articles, book chapters and proceedings papers. He was founding Chair of the Association for Information Systems Special Interest Group on Decision Support, Knowledge and Data Management Systems (SIG DSS).

Thanks for visiting. If you have any suggestions for improving this brief history of DSS, I'd like to hear from you. I'm trying to collect retrospective reports for my "Brief History of Decision Support Systems" hypertext document at DSSResources.COM. I'm including recollections, reflections and comments of those involved in the various DSS "threads" and I'm trying to correct any errors of omission or misinterpretation.

How to cite

A Brief History of Decision Support Systems should be cited as:

Power, D.J., A Brief History of Decision Support Systems. DSSResources.COM, World Wide Web, http://DSSResources.COM/history/dsshistory.html, version 4.0, March 10, 2007.

A model-driven DSS emphasizes access to and manipulation of a statistical, financial, optimization, or simulation model. Model-driven DSS use data and parameters provided by users to assist decision makers in analyzing a situation; they are not necessarily data intensive. Dicodess is an example of an open source model-driven DSS generator [14].

A communication-driven DSS supports more than one person working on a shared task; examples include integrated tools like Microsoft's NetMeeting or Groove [15].

A data-driven DSS or data-oriented DSS emphasizes access to and manipulation of a time series of internal company data and, sometimes, external data.

A document-driven DSS manages, retrieves and manipulates unstructured information in a variety of electronic formats.

A knowledge-driven DSS provides specialized problem solving expertise stored as facts, rules, procedures, or in similar structures [13].

Stevenson’s Shockabsorber Model

Business Architecture

Data Architecture

Application Architecture

Technical Architecture


Enterprise Architecture Alignment Heuristics

Pedro Sousa

Carla Marques Pereira

Jose Alves Marques

Link Consulting, SA

January 2005

Summary: The alignment between Business Processes (BP) and Information Technologies (IT) is a major issue in most organizations, as it directly impacts the organization's agility and flexibility to change according to business needs. The concepts upon which alignment is perceived are addressed in what is today called the "Enterprise Architecture," gathering business and IT together. The focus of this paper is to show how alignment between Business and IT can be stated in terms of the components found in most Enterprise Architectures. (11 printed pages)

Contents

Introduction

Enterprise Architecture Frameworks

Identifying Enterprise Architecture Components from an

Alignment Perspective

Alignment Heuristics

Final Remarks

References

About the Authors

Introduction

The alignment between Business Processes (BP) and Information Technologies (IT) is a major issue in most organizations, as it directly impacts the organization's agility and flexibility to change according to business needs. The concepts upon which alignment is perceived are addressed in what is today called the "Enterprise Architecture", gathering business and IT together.


Many Enterprise Architecture Frameworks have been proposed, focusing on different concerns and taking different approaches to guiding the development of an IT infrastructure well suited to the organization. Each Enterprise Architecture Framework has its own concepts, components, and methodologies to derive all the required artifacts. However, when the main concern is alignment, we may consider simpler architecture concepts and simpler methodologies, because the focus is not to define development artifacts but only to check their consistency.

The focus of this paper is to show how alignment between Business and IT can be stated in terms of the components found in most Enterprise Architectures.

In the next section, we briefly introduce three well-known Enterprise Architecture Frameworks, namely: the Zachman Framework, Capgemini's Integrated Architecture Framework, and the Microsoft Enterprise Architecture. We do not intend to fully describe them, but solely present the main aspects.


Next, we present the basic concepts common to these frameworks, focusing on their generic properties and leaving out the specificities of each framework. We will consider four basic components of an Enterprise Architecture: Business Architecture, Information Architecture, Application Architecture and Technical Architecture.

Finally, we show how alignment between Business and IT can be disaggregated into alignment between these basic components, present general heuristics defined in terms of the architectural components, and describe the work in progress. We will not address the Technical Architecture, the main reason being that technical alignment is mostly dependent on the technology itself.

Enterprise Architecture Frameworks

The Zachman Framework

The Zachman Framework for Enterprise Architecture (www.zifa.com) proposes a logical structure for classifying and organizing the descriptive representations of an enterprise. It considers six dimensions, which can be analyzed from different perspectives, as presented in Figure 1; the rows represent the perspectives and the columns the dimensions within a perspective.

The Framework is structured around the views of different users involved in planning, designing, building and maintaining an enterprise's Information Systems:


Scope (Planner's Perspective): The planner is concerned with the strategic aspects of the organization, thus addressing the context of its environment and scope.

Enterprise Model (Owner's Perspective): The owner is interested in the business perspective of the organization: what it delivers and how it will be used.

System Model (Designer's Perspective): The designer is concerned with the systems of the organization to ensure that they will, in fact, fulfill the owner's expectations.

Technology Model (Builder's Perspective): The builder is concerned with the technology used to support the systems and the business in the organization.

Detailed Representations (Subcontractor's Perspective): This perspective addresses the builder's specifications of system components to be subcontracted to third parties.

While the rows describe the organization users' views, the columns allow focusing on each dimension:


Data (What?): Each of the cells in this column addresses the information of the organization. Each of the above perspectives should have an understanding of the enterprise's data and how it is used.

Function (How?): The cells in the function column describe the process of translating the mission of the organization into the business and into successively more detailed definitions of its operations.

Network (Where?): This column is concerned with the geographical distribution of the organization's activities and artifacts, and how they relate with each perspective of the organization.

People (Who?): This column describes who is related to each artifact of the organization, namely business processes, information and IT. In higher-level cells, the "who" refers to organizational units, whereas in lower cells it refers to system users and usernames.

Time (When?): This column describes how each artifact of the organization relates to a timeline, in each perspective.

Motivation (Why?): This column is concerned with the translation of goals in each row into actions and objectives in lower rows.

Figure 1. The Zachman Framework, © John A. Zachman International


Figure 2. The Integrated Architecture Framework

Capgemini's Integrated Architecture Framework

Capgemini has developed an approach to the analysis and development of enterprise and project-level architectures known as the Integrated Architecture Framework (IAF), shown in Figure 2 [AM04].


IAF breaks down the overall problem into a number of related areas covering Business (people and processes), Information (including knowledge), Information Systems, and Technology Infrastructure, with two special areas addressing the Governance and Security aspects across all of these. Analysis of each of these areas is structured into four levels of abstraction: Contextual, Conceptual, Logical and Physical.

The Contextual view presents the overall justification for the organization and describes the contextual environment. It corresponds largely to Zachman's Planner's Perspective row.

The Conceptual view describes what the requirements are and what the vision for the solution is. The Logical view describes how these requirements and vision are met. Finally, the Physical view describes the artifacts of the solution.

These views have no direct relation to Zachman's perspectives because in IAF, Business, Information, Information Systems and Technology Infrastructure are the artifacts of the architecture, whereas in Zachman, Business, Information Systems and Technology are views (perspectives).

Microsoft Enterprise Architecture


Microsoft Enterprise Architecture, shown in Figure 3, is a two-dimensional framework that considers four basic perspectives (business, application, information, and technology) and four levels of detail (conceptual, logical, physical, and implementation).

The business perspective describes how a business works. It includes broad business strategies along with plans for moving the organization from its current state to an envisaged future state.

Figure 3. Microsoft Enterprise Architecture Perspectives


Figure 4. Decomposing Business and IT Alignment into Architectural Components

The application perspective defines the enterprise's application portfolio and is application-centered. The application perspective may represent cross-organization services, information, and functionality, linking users of different skills and job functions in order to achieve common business objectives.

The information perspective describes what the organization needs to know to run its business processes and operations. The information perspective also describes how data is bound into the business processes, including structured data stores such as databases, and unstructured data stores such as documents, spreadsheets, and presentations that exist throughout the organization.


The technology perspective provides a logical, vendor-independent description of infrastructure and system components that are necessary to support the application and information perspectives. It defines the set of technology standards and services needed to execute the business mission.

Each of these perspectives has a conceptual view, a logical view, and a physical view. Elements of the physical view also have an implementation view. Microsoft Enterprise Architecture is described in detail in the MSDN Library.

Drawing a parallel between Microsoft's and Zachman's Enterprise Frameworks: the Business perspective corresponds to Zachman's Planner's and Owner's perspectives; the Application perspective to Zachman's Designer's perspective; the Technology perspective to the Builder's and Subcontractor's perspectives; and finally, the Microsoft Information perspective corresponds to the Data column in the Zachman framework.


Identifying Enterprise Architecture Components from an Alignment Perspective

As we saw in the previous sections, different Enterprise Frameworks have different ways to model the artifacts of the Enterprise, their perspectives, and the different levels at which they can be described.

The Enterprise Frameworks address a large number of problems and therefore have a degree of complexity far larger than needed if the sole problem is the alignment of the business and IT architecture. Thus we can simplify these models and consider just the sub-architectures which have a certain commonality. It is out of the scope of this paper to fully study and justify similar concepts in these Enterprise Architecture Frameworks. If alignment is the main concern, an Enterprise Architecture has four fundamental components: Business Architecture, Information Architecture, Application Architecture and Technical Architecture. This is not new, and it has long been accepted in the Enterprise Architect Community (for instance, see www.eacommunity.com).

We will address the issue of alignment based on the coherency between elements of the Business Architecture, elements of the Information Architecture and elements of the Application Architecture. The more elements each of these Architectures has, the richer and more complex the concept of alignment becomes, because more rules and heuristics need to be stated to govern the relation between these elements. So, in order to build up alignment, one must first clarify the elements of each architecture (see Figure 4).

As for the Technical Architecture, its alignment is mostly dependent on the technology itself. We are currently investigating how Service Oriented Architecture (SOA) concepts overlap with the previous architectures and how alignment could be formulated in its model. This is ongoing work and is beyond the scope of this paper.

Business Architecture

The Business Architecture is the result of defining business strategies, processes, and functional requirements. It is the base for identifying the requirements for the information systems that support business activities. It typically includes the following: the enterprise's high-level objectives and goals; the business processes carried out by the enterprise as a whole, or at least a significant part; the business functions performed; major organizational structures; and the relationships between these elements.


In this paper, we consider a simpler case where the Business Architecture includes only Business Processes; each business process is composed of a flow of business activities, and each activity is associated with information entities, time, and people. Business Processes have attributes such as criticality, security level, delayed (batch) or on-line execution, and so on.

Information Architecture

The Information Architecture describes what the organization needs to know to run its processes and operations, as described in the Business Architecture. It provides a view of the business information independent of the IT view of databases. In the Information Architecture, business information is structured in Information Entities, each having a business owner responsible for its management and for performing operations such as: Acquisition, Classification, Quality Control, Presentation, Distribution, Assessment, and so on. Information Entities must have an identifier, defined from a business perspective, a description, and a set of attributes. Attributes are related to business processes (that use or produce them) and to applications (that create, read, update and delete them). Attributes are classified according to different properties such as Security, Availability and so on.

As an example, Client and Employee are typical Information Entities. Employee has attributes such as "courses taken", "competences", "labor accidents", and "career". Each of these attributes can be physically supported by a complex database schema in different databases used by several applications.
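The Employee example above can be sketched as a small data model. This is a minimal illustration of the paper's concepts; the class names, field names, and values are our own assumptions, not from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    used_by_processes: set = field(default_factory=set)   # business processes that use or produce it
    crud_applications: set = field(default_factory=set)   # applications that create/read/update/delete it
    security_level: str = "internal"                      # illustrative classification property

@dataclass
class InformationEntity:
    identifier: str        # defined from a business perspective
    description: str
    owner: str             # the business responsible for managing the entity
    attributes: dict = field(default_factory=dict)

# The Employee entity from the text, with its four example attributes.
employee = InformationEntity(
    identifier="EMP",
    description="A person employed by the organization",
    owner="Human Resources",
)
for name in ["courses taken", "competences", "labor accidents", "career"]:
    employee.attributes[name] = Attribute(name)
```

Each `Attribute` carries the two relations the text describes (to processes and to applications), while the physical database schemas behind it remain outside this business-level view.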

Application Architecture

The Application Architecture describes the applications required to fulfill two major goals:

1. Support the business requirements, and

2. Allow efficient management of Information Entities.

Application Architecture is normally derived from the analyses of both Business and Information Architectures.

An Application Architecture typically includes: descriptions of automated services that support the business processes; descriptions of the interactions and interdependencies (interfaces) of the organization's application systems; and plans for developing new applications and revising old applications based on the enterprise's objectives, goals, and evolving technology platforms.

Applications also have required attributes, such as availability (up time), scalability (the ability to scale up performance), and profile-based access (the ability to identify who performs each task).

Alignment and Architecture Components

After identifying the major architectural components from an alignment point of view, we are now in a position to address the relations between these components in terms of alignment.

Alignment between Business and Applications

In a fully aligned Business and Applications scenario, the time and effort business people spend to run the business should be devoted only to "reasoning" functions. On the contrary, misalignments force business people to perform extra, mechanical work such as:

Inserting the same data multiple times in different applications.

Logging in multiple times, once for each application they need to access.


Recovering from a failed operation across multiple systems, requiring careful human analyses to rollback to a coherent state.

Overcoming inappropriate application functionality. For example, printing invoices one by one because applications do not have an interface for multiple printing.

Notice that alignment between Business and Applications in the above context does not imply a flexible and agile IT Architecture; in fact, a measure of a flexible and agile IT Architecture is the effort IT people make to keep the Business and Applications aligned when the Business is changing. This topic is addressed next.

Alignment between Information and Application

In fully aligned Information and Applications Architectures, IT people spend effort and time only on coding business functions and logic. On the contrary, misalignments between information and applications require IT people to do extra coding for:

Keeping multiple replicas of the same data coherent, because they are updated by multiple applications.

Assuring coherency across multiple transactions, because a single business process crosses multiple applications.

Gathering information from multiple systems and coding rules to produce a coherent view of the organization's business information.

Transforming data structures when data migrates between applications.


This extra coding is required to consistently modify both architectures. However, since the information critical to running the business (Information Architecture) is far more stable than the applications that support it (Applications Architecture), most of the effort really goes into changing the Applications.

Alignment between Business and Information

Information and Business Architectures are aligned when business people have the information they need to run the business. This means accurate, appropriately detailed, and timely information. Unlike the previous misalignments, here the impact is neither time nor effort, but the impossibility of getting the piece of information relevant to the business.


Examples are abundant: a CEO asks for a report in which sales figures are disaggregated by type of service. Assuming the report requested by the CEO has either actual or foreseen business relevance, the possibility or impossibility of producing such a report is evidence of the alignment or misalignment between the Information and Business Architectures. To produce the report we must have the adequate basic data and data-exploring applications, and thus this is an issue that should be dealt with by the previous alignments (Information/Applications and Business/Applications).

Alignment Heuristics

We developed alignment heuristics as common-sense rules to increase the probability of finding an easier way to achieve alignment among the Business, Information and Application Architectures.

The heuristics presented result from mapping our experience, both in academic teaching at university and in professional consultancy services, into the context of the Business, Information and Application Architectures presented in this paper. We present the heuristics that we consider to have the greatest value, given their simplicity and their results.

The main heuristics to consider when checking alignment between Business and Application Architectures are:


Each business process should be supported by a minimum number of applications. This simplifies user interfaces among applications, reduces the need for application integration, and also minimizes the number of applications that must be modified when the business process changes.

Business activities should be supported by a single application. This reduces the need for distributed transactions among applications.

Critical business processes should be supported by scalable and highly available applications.

Critical business processes/activities should be supported by different applications than the noncritical business processes/activities. This helps to keep critical hardware and permanent maintenance teams as small as possible.

Each application's functionality should support at least one business process activity. Otherwise, it plays no role in supporting the business.

Information required for critical processes should be also supported by scalable and highly available systems.

Business process activities requiring on-line/batch support should be supported by applications running on different infrastructures, making it easier to tune the systems for their operating windows.
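The first heuristic above lends itself to a mechanical check once each business process is mapped to the applications supporting its activities. A minimal sketch; the process and application names, and the threshold, are illustrative assumptions:

```python
# Map each business process to the applications supporting its activities
# (hypothetical names, not taken from the paper).
process_apps = {
    "order-to-cash": {"CRM", "Billing", "ERP"},
    "invoicing": {"Billing"},
}

def processes_violating_min_apps(process_apps, threshold=2):
    """Return processes supported by more than `threshold` applications,
    flagging candidates for the 'minimum number of applications' heuristic."""
    return {proc for proc, apps in process_apps.items() if len(apps) > threshold}

violations = processes_violating_min_apps(process_apps)
# "order-to-cash" is flagged: three applications support a single process.
```

The same traversal extends naturally to the other heuristics, e.g. flagging applications whose functionality supports no business process activity at all.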


The main heuristics to check alignment between Application and Information Architectures are:

An information entity is managed by only one application. This means that entities are identified, created and reused by a single application.

Information entities are created when identifiers are assigned to them, even if at that time no attributes are known. For example, the Client information entity may be created before its name, address and other attributes are known. Even so, the application that manages the Client information entity must be the application that manages its IDs.

Applications that manage information entities should provide means to make the entity information distributable across the organization using agreed-on protocols and formats.

Exporting and distributing information entities across organization applications should make use of a "data store", rather than point-to-point application integration. Applications managing a given information entity should export its contents to the data store when those contents have changed. Applications requiring a given information entity should query the data store for up-to-date information. This allows for computational independence between applications, and makes it possible to size the hardware required to run an application without knowing the rate at which other applications demand information from it. Further, if an application goes down, it allows others to continue operating using the best possible data.
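The data-store heuristic above amounts to a publish/query pattern that decouples the application managing an entity from the applications consuming it. A minimal sketch, assuming a single in-memory store; all names and values are illustrative:

```python
class DataStore:
    """Central store decoupling producers from consumers of entity data."""

    def __init__(self):
        self._latest = {}

    def publish(self, entity_id, contents):
        # The application managing the entity exports its contents
        # whenever they change.
        self._latest[entity_id] = dict(contents)

    def query(self, entity_id):
        # Consumers read the best available copy, even if the managing
        # application is currently down.
        return self._latest.get(entity_id)

store = DataStore()
store.publish("client:42", {"name": "Acme", "address": "Lisbon"})
assert store.query("client:42")["name"] == "Acme"
```

Because consumers only ever touch the store, the producer's sizing and availability are independent of how often other applications request the entity, which is exactly the computational independence the text describes.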

Whenever possible, applications should manage information entities of the same security level. This simplifies the implementation of controls and procedures in accordance with the security policy.

Finally, the main heuristics to apply for Business and Information alignment are:

All business process activities create, update and/or delete at least one information entity.

All information entity attributes are read by at least one business process activity.

All information entities have an identifier understood by business people.


All information entities must have a means of being transformed for presentation to appropriate audiences using enterprise-standard applications and tools.

All information entities must derive from known sources, and must have a business person responsible for their coherency, accuracy, relevance and quality control.

All information entities must be classified and named within the Information Architecture.

For each information entity, business people should be responsible for assessing the usefulness and cost/benefit of the information and for sustaining its continued use.
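The first two heuristics above are essentially checks on a CRUD matrix relating business process activities to information entities. A minimal sketch with hypothetical activities and entities:

```python
# CRUD matrix: (activity, entity) -> operations performed
# (hypothetical activities and entities, for illustration only).
crud = {
    ("register-client", "Client"): "C",
    ("update-address", "Client"): "U",
    ("hire-employee", "Employee"): "C",
}

entities = {"Client", "Employee", "Supplier"}

def entities_never_written(crud, entities):
    """Entities that no activity creates, updates, or deletes,
    violating the first Business/Information heuristic."""
    written = {entity for (_, entity), ops in crud.items() if set(ops) & set("CUD")}
    return entities - written

# "Supplier" is flagged: no business process activity writes it.
assert entities_never_written(crud, entities) == {"Supplier"}
```

A symmetric scan for attributes never read by any activity covers the second heuristic; together they expose information that the Business Architecture carries but never actually uses or produces.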

Final Remarks

The heuristics presented have been validated and tested in real projects. In some cases, different heuristics produce opposite recommendations; this means that a compromise solution must be reached. In other cases, heuristics do not favor optimal solutions from an engineering point of view, because optimal solutions do not normally take into account flexibility and ease of change.

Another remark: the heuristics presented are intended to validate alignment among architectures, but assume that each architecture is "aligned within itself". This means that we are not checking whether, for example, the Business Architecture presents a good and coherent schema of the business processes and activities. Likewise, we are not checking whether the Applications Architecture makes sense or not. This requires more complex models, such as the ones initially proposed in the original Frameworks (Zachman, IAF and Microsoft).

The work presented was developed using the Zachman Framework as the general approach, but is strongly focused on the alignment issues. We have coded a large percentage of the heuristics presented in a modeling tool (System Architect from Popkin Software) and are able to derive a measurement of alignment given a Business, Application and Information Architecture. We have started work to include these heuristics in Microsoft Visio.

We consider the heuristics to be valuable because they force architects to think about the justification of their decisions, leaving architectures better documented and more solid.

References

[AM04] Andrew Macaulay, "Enterprise Architecture Design and the Integrated Architecture Framework," Microsoft Architects Journal, Issue 1, January 2004.

About the Authors

Pedro Sousa pedro.sousa@link.pt


Pedro Sousa is responsible for Enterprise Architecture professional services at Link Consulting SA, where he has been involved in several Enterprise Architecture projects over the past six years. He is an Associate Professor at the Technical University of Lisbon (IST), where he teaches these subjects in Masters of Computer Science courses. He has published a series of papers about Business and IT alignment.

Carla Marques Pereira

Carla Marques Pereira has an MSc in Computer Science from the Technical University of Lisbon (IST). She is reading for a PhD on the subject of "Business and IT alignment". Carla is a member of Link Consulting SA's Enterprise Architects team and a researcher in the Center for Organizational Engineering (ceo.inesc.pt).

Jose Alves Marques

Jose Alves Marques is CEO of Link Consulting SA and a full Professor at the Technical University of Lisbon (IST). Technically, his main focus is Distributed Systems Architectures and Service Oriented Architectures. He has a long track record of published papers and projects in these domains.


This article was published in the Architecture Journal, a print and online publication produced by Microsoft. For more articles from this publication, please visit the Architecture Journal website.

This chapter describes the development of a Technology Architecture.


Figure: Phase D: Technology Architecture

The detailed description of the process to develop the Target Technology Architecture is given in Target Technology Architecture - Detail.

Objective

The objective of Phase D is to develop a Technology Architecture that will form the basis of the following implementation work.

Approach

General

Detailed guidelines for Phase D, including Inputs, Steps, and Outputs, are given in Target Technology Architecture - Detail.

Architecture Continuum

As part of Phase D, the architecture team will need to consider what relevant Technology Architecture resources are available in the Architecture Continuum.

In particular:

The TOGAF Technical Reference Model (TRM)

Generic technology models relevant to the organization's industry "vertical" sector.

For example, the TeleManagement Forum (TMF - www.tmforum.org) has developed detailed technology models relevant to the Telecommunications industry.


Technology models relevant to Common Systems Architectures. For example, The Open Group has a Reference Model for Integrated Information Infrastructure (III-RM: see Integrated Information Infrastructure Reference Model) that focuses on the application-level components and underlying services necessary to provide an integrated information infrastructure.

Inputs

Inputs to Phase D are:

Technology principles (Technology Principles), if existing

Request for Architecture Work (Request for Architecture Work)

Statement of Architecture Work (Major Output Descriptions)

Architecture Vision (Business Scenario/Architecture Vision)

Baseline Technology Architecture, Version 0.1 (from Phase A)

Target Technology Architecture, Version 0.1 (from Phase A)

Relevant technical requirements from previous phases

Gap analysis results (from Data Architecture)

Gap analysis results (from Applications Architecture)

Baseline Business Architecture, Version 1.0 (detailed), if appropriate

Baseline Data Architecture, Version 1.0, if appropriate

Baseline Applications Architecture, Version 1.0, if appropriate

Target Business Architecture, Version 1.0 (detailed) (Business Architecture)

Re-usable building blocks, from the organization's Enterprise Continuum (Introduction to the Enterprise Continuum), if available

Target Data Architecture, Version 1.0

Target Applications Architecture, Version 1.0


Steps

Key steps in Phase D include:

1. Develop Baseline Technology Architecture Description

i. Review the Baseline Business Architecture, Baseline Data Architecture, and Baseline Applications Architecture, to the degree necessary to inform decisions and subsequent work.

ii. Develop a Baseline Description of the existing Technology Architecture, to the extent necessary to support the Target Technology Architecture. The scope and level of detail to be defined will depend on the extent to which existing technology components are likely to be carried over into the Target Technology Architecture, and on whether existing architectural descriptions exist, as described in Approach. Define for each major hardware or software platform type:

Name (short and long)

Physical location

Owner(s)

Other users

Plain language description of what the hardware/software platform is and what it is used for

Business functions supported

Organizational units supported

Networks accessed

Applications and data supported

System inter-dependencies (for example, fallback configurations)

iii. To the extent possible, identify and document candidate Technology Architecture Building Blocks (potential re-usable assets).

iv. Draft the Technology Architecture Baseline report: summarize key findings and conclusions, developing suitable graphics and schematics to illustrate baseline configuration(s). If warranted, provide individual Baseline Technology Architecture Descriptions as Annexes.

2. Develop Target Technology Architecture; see detailed steps. Detailed activities for this step, including Inputs, Activities, and Outputs, are given in Target Technology Architecture - Detail.
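The per-platform fields listed for the Baseline Description in step 1.ii can be captured as a simple record. This is a sketch only; the field names paraphrase the list above, and all values are invented for illustration:

```python
# One record per major hardware/software platform type, mirroring the
# Baseline Description fields (all values are hypothetical examples).
platform = {
    "name_short": "ERP-PROD",
    "name_long": "Production ERP cluster",
    "physical_location": "Lisbon data centre",
    "owners": ["IT Operations"],
    "other_users": ["Finance", "Logistics"],
    "description": "Hosts the ERP suite used for order processing",
    "business_functions_supported": ["invoicing", "procurement"],
    "organizational_units_supported": ["Finance"],
    "networks_accessed": ["corp-lan"],
    "applications_and_data_supported": ["ERP", "orders-db"],
    "interdependencies": ["fails over to ERP-DR"],  # e.g. fallback configuration
}
```

Collecting these records in one inventory makes step 1.iv straightforward: the Baseline report can be summarized directly from the structured records rather than from scattered product documentation.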

Outputs

The outputs of Phase D are:

Statement of Architecture Work (Major Output Descriptions), updated if necessary

Baseline Technology Architecture, Version 1.0, if appropriate

Validated technology principles, or new technology principles (if generated here)

Technology Architecture Report, summarizing what was done and the key findings

Target Technology Architecture (Technology Architecture), Version 1.0

Technology Architecture gap report

Viewpoints addressing key stakeholder concerns

Views corresponding to the selected viewpoints

Target Technology Architecture - Detail

Introduction

This is the detailed description of the process to develop the Target Technology Architecture.

Overview

An organization creating or adapting a Technology Architecture may already mandate the use of a list of approved suppliers/products for that organization. The list will be an input to the definition of the organization-specific architecture framework. The architectures can then be used as procurement tools to govern the future growth and development of the organization's IT infrastructure. The key steps are expanded in the following subsections.

Note:

The order of the following steps should be adapted to the situation as described in Introduction to the ADM.

Step 1

Step 1 is to create a Baseline Description in the TOGAF format.

Objective

The objective of this step is to convert the description of the existing system into services terminology using the organization's Foundation Architecture (e.g., the TOGAF Foundation Architecture's TRM). The rationale behind this is to structure the existing system description in a way which makes it compatible with the breakdown of standards and the descriptions used within your Foundation Architecture.

Approach

This step is intended to facilitate moving from product documentation to a service-oriented description. The step will aid in specifying standards for the Target Architecture in Step 4. An additional step, Step 3, oriented to defining building blocks, provides the means to cross-check the architectural definition process in the form of implementation-related decisions.

Additionally, this step captures relevant parts of the existing architecture (using the scope definition established in Phase A) as candidates for re-usable building blocks, along with inhibitors to meeting business requirements using the existing system. The existing architecture is assessed against the Business Architecture, identifying the key inhibitors and opportunities for re-use. Finally, the existing architecture assessment ends with the capture of implied or explicit architecture principles that should be carried forward and imposed on this architecture exercise.

Begin by converting the description of the existing environment into the terms of your organization's Foundation Architecture (e.g., the TOGAF Foundation Architecture's TRM). This will allow the team developing the architecture to gain experience with the model and to understand its component parts. The team may be able to take advantage of a previous architectural definition, but it is assumed that some adaptation may be required to match the architectural definition techniques described as part of this process. Another important task is to set down a list of key questions which can be used later in the development process to measure the effectiveness of the new architecture.

A key process in the creation of a broad architectural model of the target system is the conceptualization of Architecture Building Blocks (ABBs). ABBs are not intended to be solutions, but depictions of how the architecture might be looked on in implementable terms. Their functionality is clearly defined, but without the detail introduced by specific products. The method of defining ABBs, along with some general guidelines for their use in creating an architectural model, is described in Part IV: Resource Base, Building Blocks and the ADM, and illustrated in detail in Building Blocks.

It is recommended that Architecture Building Blocks be documented (e.g., with an architecture description language) and stored (e.g., in a repository or information base), in order to maximize re-use potential.

Applying the ABB method introduces application space into the architectural process. This is the means of linking services, which address functionality that must be considered on an enterprise basis, with applications, which may or may not address global functionality. The building blocks example in Part IV: Resource Base, Building Blocks, provides insight into both application-specific and more global considerations in defining building blocks in order to illustrate this.

Inputs

The inputs to Step 1 are:

Technology principles (Technology Principles), if existing

Request for Architecture Work (Request for Architecture Work)

Statement of Architecture Work (Major Output Descriptions)

Architecture Vision (Business Scenario/Architecture Vision)

Baseline Technology Architecture, Version 0.1

Target Technology Architecture, Version 0.1

Relevant technical requirements from previous phases

Gap analysis results (from Data Architecture)

Gap analysis results (from Applications Architecture)

Baseline Business Architecture, Version 1.0 (detailed), if appropriate

Baseline Data Architecture, Version 1.0, if appropriate

Baseline Applications Architecture, Version 1.0, if appropriate

Target Business Architecture (Business Architecture), Version 1.0 (detailed)

Re-usable building blocks, from organization's Enterprise Continuum (Introduction to the Enterprise Continuum), if available

Target Data Architecture, Version 1.0

Target Applications Architecture, Version 1.0

Re-usable Architecture Building Blocks, from organization's Architecture Continuum (The Architecture Continuum), if available

Re-usable Solution Building Blocks, from organization's Solutions Continuum (The Solutions Continuum), if available

Activities

Key activities in Step 1 include:

1. Collect data on current system.

2. Document all constraints.

3. Review and validate (or generate, if necessary) the set of Technology Architecture principles. These will normally form part of an overarching set of architecture principles. Guidelines for developing and applying principles, and a sample set of Technology Architecture principles, are given in Part IV: Resource Base, Architecture Principles.

4. List distinct functionality.

5. Produce affinity groupings of functionality using TOGAF TRM service groupings (or your business' Foundation Architecture).

6. Analyze relationships between groupings.

7. Sanity check functionality to assure all of the current system is considered.

8. Identify interfaces.

9. Produce Technology Architecture model.

10. Verify Technology Architecture model.

11. Document key questions to test merits of the Technology Architecture.

12. Document criteria for selection of service portfolio architecture.
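Activities 5-7 (affinity grouping against TRM service categories, then a sanity check) can be sketched as inverting a function-to-category mapping. The category names below follow TOGAF TRM service groupings, but the function names and their assignments are invented for illustration:

```python
# Hypothetical mapping of current-system functionality to TRM service
# categories; the assignments are assumptions for illustration only.
functions = {
    "send purchase orders to suppliers": "Data Interchange Services",
    "staff directory lookup": "Location and Directory Services",
    "nightly batch file transfer": "Network Services",
    "user authentication": "Security Services",
    "customer records storage": "Data Management Services",
}

# Produce affinity groupings: invert the mapping (category -> functions).
groupings: dict = {}
for function, category in functions.items():
    groupings.setdefault(category, []).append(function)

# Sanity check (activity 7): every current-system function appears in
# exactly one grouping, so nothing has been dropped.
assert sum(len(fs) for fs in groupings.values()) == len(functions)

for category in sorted(groupings):
    print(f"{category}: {groupings[category]}")
```

The resulting groupings feed directly into the relationship analysis of activity 6 and, later, the service portfolio selection in Step 4.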

Outputs

The outputs of Step 1 are:

Technology principles (Technology Principles), if not existing

Baseline Technology Architecture (Technology Architecture), Version 1.0

Target Technology Architecture, Version 0.2:

Technology Architecture - constraints

Technology Architecture - architecture principles

Technology Architecture - requirements traceability, key questions list

Technology Architecture - requirements traceability, criteria for selection of service portfolio

Technology Architecture Model, Version 0.1

Step 2

Step 2 is to consider different architecture reference models, viewpoints, and tools.

Objective

The objective of this step is to perform an analysis of the Technology Architecture from a number of different concerns (requirements) or viewpoints, and to document each relevant viewpoint. The purpose of considering these viewpoints is to ensure that all relevant stakeholder concerns will have been considered in the final Technology Architecture, so ensuring that the target system will meet all the requirements put on it.

Approach

The Business Architecture is used to select the most relevant viewpoints for the project. It is important to recognize that in practice it will rarely be possible to reach 100% coverage of stakeholder concerns.

Pertinent viewpoints are created first from the existing system to identify the salient elements of the current systems requirements that the stakeholders confirm must also be satisfied in the target system. A comprehensive set of stakeholder viewpoints must also be created for the target system. The corresponding views of the existing system will be compared with the views of the target system to identify elements of the existing system that are intended for replacement or improvement.

If a set of viewpoints is carefully chosen, it will expose the most important aspects of the existing architecture and the requirements of the target system.

Several different viewpoints may be useful. Architecture viewpoints and views are described in greater detail in Part IV: Resource Base, Developing Architecture Views. The viewpoints presented there should not be considered an exhaustive set, but simply a starting point. In developing a Technology Architecture, it is very likely that some of the viewpoints given there will not be useful, while others not given there will be essential. Again, use the Business Architecture as a guide in selecting the pertinent viewpoints.

Inputs

The inputs to Step 2 are:

Request for Architecture Work (Request for Architecture Work)

Statement of Architecture Work (Major Output Descriptions)

Target Business Architecture (Business Architecture), Version 1.0

Target Technology Architecture (Technology Architecture), Version 0.2

Activities

Key activities in Step 2 include:

1. Select relevant Technology Architecture resources (reference models, patterns, etc.) from the Architecture Continuum, on the basis of the business drivers, and the stakeholders and concerns.

2. Select relevant Technology Architecture viewpoints; i.e., those that will enable the architect to demonstrate how the stakeholder concerns are being addressed in the Technology Architecture. (See Part IV: Resource Base, Developing Architecture Views for examples.)

o Document the selected viewpoints, if not already documented. Consider using ANSI/IEEE Std 1471-2000 as a guide for documenting a viewpoint.

o A primary reference model will be the TOGAF TRM. Other reference models will be taken from the Architecture Continuum.

o Consider developing at least the following views:

Networked Computing/Hardware view

Communications view

Processing view

Cost view

Standards view

o Brainstorm and document technical constraints deriving from analysis of the concerns, and ensure they are covered by the viewpoints.

3. Identify appropriate tools and techniques to be used for capture, modeling, and analysis, in association with the selected viewpoints. Depending on the degree of sophistication warranted, these may comprise simple documents or spreadsheets, or more sophisticated modeling tools and techniques.

4. Perform trade-off analysis to resolve conflicts (if any) among the different viewpoints.

o One method of doing this is CMU/SEI's Architecture Trade-off Analysis (ATA) Method (refer to www.sei.cmu.edu/ata/ata_method.html).

Outputs

The outputs of Step 2 are:

Target Technology Architecture (Technology Architecture), Version 0.3:

Technology Architecture - architecture viewpoints

Networked Computing/Hardware view

Communications view

Processing view

Cost view

Standards view

Technology Architecture - constraints

Step 3

Step 3 is to create an architectural model of building blocks.

Objective

The reason for selecting viewpoints in Step 2 is to be able to develop views for each of those viewpoints in Step 3. The architectural model created in Step 3 comprises those several views.

The objective of this step is to broadly determine how the services required in the target system will be grouped after considering all pertinent viewpoints of the architecture's use. This differs from Step 1 in that Step 1 dealt mainly with the required functionality of the system, whereas here we are considering many viewpoints that are not expressed explicitly as required functionality.

The rationale behind this is to enable the services required within the system to be selected during the next step, through the creation of an architecture model that clearly depicts the required services.

Approach

At Step 3, the purpose of examining different viewpoints in Step 2 becomes clear. The constraints defined and the unique system insights gained through an examination of the viewpoints pertinent to the current system and the target system can be used to validate the ability of the broad architectural model to accommodate the system requirements.

The broad architectural model starts as a TOGAF TRM-based model (or a model based upon the organization's Foundation Architecture), derived from the service-to-function mapping carried out as part of the service examination in Step 1. An architecture based exactly on the TOGAF TRM may not be able to accommodate the stakeholder needs of all organizations. If the examination of different viewpoints identifies architectural features that cannot be expressed in terms of the TOGAF TRM, changes and amendments to the TOGAF TRM should be made to create an organization-specific TRM.

Once the Baseline Description has been established and appropriate views described, it is possible to make decisions about how the various elements of system functionality should be implemented. This should only be in broad terms, to a level of detail which establishes how the major business functions will be implemented; for example, as a transaction processing application or using a client/server model.

Therefore this step defines the future model of building blocks (e.g., collections of functions and services generated from previous steps). It is here that re-use of building blocks from your business' Architecture Continuum is examined carefully, assuring that maximum re-use of existing material is realized.

Once the architecture model of building blocks is created, the model must be tested for coverage and completeness of the required technical functions and services. For each building block decision, completely follow through its impact and note the rationale for decisions, including the rationale for decisions not to do something.

Inputs

The inputs to Step 3 are:

Target Business Architecture (Business Architecture), Version 1.0

Target Technology Architecture (Technology Architecture), Version 0.3:

Technology Architecture - viewpoints

Technology Architecture - constraints

Re-usable Architecture Building Blocks, from organization's Architecture Continuum (The Architecture Continuum), if available

Activities

Key activities in Step 3 include:

1. To the extent possible, identify the relevant Technology Architecture building blocks, drawing on the Architecture Continuum.

2. For each viewpoint, create the model for the specific view required, using the selected tool or method. Consider developing at least the following views:

o Networked Computing/Hardware view

o Communications view

o Processing view

o Cost view

o Standards view

3. Assure that all stakeholder concerns are covered. If they are not, create new models to address concerns not covered, or augment existing models.

4. Ensure that all information requirements in the Business Architecture, Data Architecture, and Applications Architecture are met.

5. Perform trade-off analysis to resolve conflicts (if any) among the different views. One method of doing this is CMU/SEI's Architecture Trade-off Analysis (ATA) Method (refer to www.sei.cmu.edu/ata/ata_method.html).

6. Validate that the models support the principles, objectives, and constraints.

7. Note changes to the viewpoint represented in the selected models from the Architecture Continuum, and document.

8. Identify Solution Building Blocks that would be used to implement the system, and create a model of building blocks.

9. Check building blocks against existing library of building blocks and re-use as appropriate.

10. Test architecture models for completeness against requirements.

11. Document rationale for building block decisions in the architecture document.

Outputs

The outputs of Step 3 are:

Target Technology Architecture (Technology Architecture), Version 0.4:

Technology Architecture Model

Networked Computing/Hardware view

Communications view

Processing view

Cost view

Standards view

Technology Architecture - change requests and/or extensions or amendments to be incorporated in an organization-specific Architecture Continuum

Step 4

Step 4 is to select the services portfolio required per building block.

Objective

The objective of this step is to select services portfolios for each building block generated in Step 3.

Approach

The services portfolios are combinations of basic services from the service categories in the TOGAF TRM that do not conflict. The combination of services is again tested to ensure support for the applications. This is a pre-requisite to the later step of defining the architecture fully.

The constraints output from Step 2 can provide more detailed information about:

Requirements for organization-specific elements or pre-existing decisions (as applicable)

Pre-existing and unchanging organizational elements (as applicable)

Inherited external environment constraints

Where requirements demand definition of specialized services that are not identified in TOGAF, consideration should be given to how these might be replaced if standardized services become available in the future.

For each Architecture Building Block (ABB), build up a service description portfolio as a set of non-conflicting services. The set of services must be tested to ensure that the functionality provided meets application requirements.
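The non-conflict test can be sketched as a pairwise check over each ABB's proposed service set. The service names and the conflict pairs below are hypothetical, invented purely to illustrate the mechanism:

```python
# Pairs of services assumed (for this sketch only) to be mutually exclusive.
CONFLICTS = [
    {"proprietary file sharing", "standards-based file sharing"},
    {"store-and-forward messaging", "synchronous-only messaging"},
]

def build_portfolio(abb_name: str, services: set) -> dict:
    """Return a service description portfolio if no two services conflict."""
    for pair in CONFLICTS:
        if pair <= services:  # both members of a conflicting pair present
            raise ValueError(f"{abb_name}: conflicting services {sorted(pair)}")
    return {"abb": abb_name, "services": sorted(services)}

portfolio = build_portfolio(
    "Workgroup server",
    {"standards-based file sharing", "store-and-forward messaging"},
)
print(portfolio["services"])
```

A real conflict catalogue would be derived from the TRM service categories and the constraints captured in Step 2; the point of the sketch is that the check is mechanical once conflicts are recorded explicitly.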

Inputs

The inputs to Step 4 are:

Target Business Architecture (Business Architecture), Version 1.0

Target Technology Architecture (Technology Architecture), Version 0.4

Technical Reference Model (TRM)

Standards Information Base (SIB)

Activities

Key activities in Step 4 include:

1. Produce affinity grouping of services.

2. Cross-check affinity groups against needs.

3. Document service description portfolio for each ABB, cross-checking for non-conflicting services.

4. Document change requests to architectures in the Architecture Continuum.

Outputs

The outputs of Step 4 are:

Target Technology Architecture ( Technology

Architecture ), Version 0.5:

Technology Architecture - target services (a description of the service portfolios required also known as an Organization-Specific Framework)

Technology Architecture - change requests and/or extensions or amendments to be incorporated in an organization-specific Architecture Continuum

Step 5

Step 5 is to confirm that the business goals and objectives are met.

Objective

The objective of this step is to clarify and check the business goals and other objectives of implementing the architecture. This is required as a cross-check that the Technology Architecture meets these objectives.

Approach

The key question list is used to pose questions against the architecture model and service description portfolio to test its merit and completeness.

Inputs

The inputs to Step 5 are:

Target Business Architecture (Business Architecture), Version 1.0 (business goals)

Target Technology Architecture (Technology Architecture), Version 0.5

Activities

Key activities in Step 5 include:

1. Conduct a formal checkpoint review of the architecture model and building blocks with stakeholders, validating that business goals are met. Utilizing the key questions list, ensure that the architecture addresses each question.

2. Document findings.

Outputs

The outputs of Step 5 are:

Target Technology Architecture (Technology Architecture), Version 0.6:

Technology Architecture - requirements traceability (business objectives criteria)

Step 6

Step 6 is to determine criteria for specification selection.

Objective

The objective of this step is to develop a set of criteria for choosing specifications and portfolios of specifications.

Approach

Choosing the right criteria is vital if the final architecture is to meet its objectives. These criteria will depend on the existing system and the overall objectives for the new architecture. The overall objectives should be developed from the organization's business goals, so it is hard to give specific advice here, but some example objectives are listed in Part IV: Resource Base , Business Scenarios .

Here are some example criteria, selected by a large government organization with the intention of building a stable and widely applicable architecture:

"A standard or specification:

Must meet the organization's requirements

Must meet legal requirements

Should be a publicly available specification

Should have been developed by a process which sought a high level of consensus from a wide variety of sources

Should be supported by a range of readily available products

Should be complete

Should be well understood, mature technology

Should be testable, so that components or products can be checked for conformance

Should support internationalization

Should have no serious implications for ongoing support of legacy systems

Should be stable

Should be in wide use

Should have few, if any, problems or limitations"
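One way to operationalize such a list is to treat the "must" criteria as mandatory filters and the "should" criteria as a preference score. The candidate specifications and their attribute values below are invented for illustration; real criteria would come from the organization's own list, as above:

```python
def screen(candidates, musts, shoulds):
    """Drop candidates failing any 'must'; rank survivors by 'should' count."""
    passed = [c for c in candidates if all(c[m] for m in musts)]
    return sorted(passed, key=lambda c: -sum(c[s] for s in shoulds))

# Hypothetical candidate specifications with boolean criterion attributes.
candidates = [
    {"name": "Spec A", "meets_requirements": True, "legal": True,
     "public": True, "mature": True, "wide_use": False},
    {"name": "Spec B", "meets_requirements": True, "legal": True,
     "public": True, "mature": True, "wide_use": True},
    {"name": "Spec C", "meets_requirements": False, "legal": True,
     "public": True, "mature": True, "wide_use": True},
]

ranked = screen(candidates,
                musts=["meets_requirements", "legal"],
                shoulds=["public", "mature", "wide_use"])
print([c["name"] for c in ranked])  # Spec C is filtered out; B outranks A
```

Weighting the "should" criteria differently (e.g., consensus weighted above completeness) is a straightforward extension, and the weights themselves become part of the documented selection criteria.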

A high level of consensus is often considered the most important factor by large organizations, because the standards and specifications chosen have to accommodate a wide range of user needs. For example, in determining the level of consensus for standards in their architecture, the Application Portability Profile (APP), the US National Institute of Standards and Technology (NIST) prefers to use international standards as the basis of specifications. The process through which these international standards have evolved requires a very high level of consensus. A number of US Federal Information Processing Standards (FIPS) specified in the APP are based on approved international standards. The use of international standards has significant benefits for any organization which works or trades with organizations in other countries.

Inputs

The inputs to Step 6 are:

Target Business Architecture (Business Architecture), Version 1.0

Target Technology Architecture (Technology Architecture), Version 0.6

Standards Information Base (SIB)

Activities

Key activities in Step 6 include:

1. Brainstorm criteria for choosing specifications and portfolios of specifications, relying on previously used criteria for the existing system and extrapolating for new architectural elements.

2. Meet with sponsors and present the current state, to negotiate their agreement to continue.

Outputs

The outputs of Step 6 are:

Target Technology Architecture (Technology Architecture), Version 0.7:

Technology Architecture - requirements traceability (standards selection criteria)

Step 7

Step 7 is to complete the architecture definition.

Objective

The objective of this step is to fully specify the Technology Architecture. This is a complex and iterative process in which the selection of building blocks and interfaces has a big impact on how the original requirements are met. See Part IV: Resource Base, Building Blocks for further details.

Approach

Completion of the architecture definition may be achieved in two stages, by defining an intermediate Transitional Architecture in addition to the final Target Architecture, if the complexity of migration requires it.

The specification of building blocks as a portfolio of services is an evolutionary process:

The earliest building block definitions start as relatively abstract ones, defined by standards and services that map most easily to the architecture framework. These building blocks are most probably ABBs.

At this stage a model and a portfolio of services have been established. The next step is to select the set of specifications that provide the services and that can be combined as required to create the building blocks.

During this final step in the development of building blocks it must be verified that the organization-specific requirements will be met. The development process must include recognition of dependencies and boundaries for functions, and should take account of what products are available in the marketplace. There are architectural and related solution-oriented building blocks. An example of how this might be expressed can be seen in the building blocks example (Part IV: Resource Base, Building Blocks). Building blocks can be defined at a number of levels matching the degree of integration that best defines the architecture of the system at any stage:

Fundamental functionality and attributes - semantic, unambiguous, including security capability and manageability

Interfaces - chosen set, supplied (APIs, data formats, protocols, hardware interfaces, standards)

Dependent building blocks with required functionality and named used interfaces

Map to business/organizational entities and policies

Finally, the building blocks become more implementation-specific as Solution Building Blocks (SBBs), and their interfaces become the detailed architecture specification. SBBs are a means to determine how portions of the Target Architecture might be procured, developed, or re-used. The SBB architecture should have separate elements for developed, re-used, and procured building blocks, each described in terms of their minimum specification.
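As a small illustration of keeping developed, re-used, and procured building blocks as separate elements, each carrying a minimum specification (all names and specifications below are invented):

```python
# Hypothetical SBB records; "sourcing" separates developed, re-used,
# and procured elements, and "min_spec" holds the minimum specification.
sbbs = [
    {"name": "Message broker", "sourcing": "procured",
     "min_spec": "Standards-based message-queuing product"},
    {"name": "Order entry UI", "sourcing": "developed",
     "min_spec": "Web client conforming to in-house UI standards"},
    {"name": "Directory service", "sourcing": "re-used",
     "min_spec": "Existing LDAP v3 directory service"},
]

# Group the SBB names by sourcing category for the architecture document.
by_sourcing: dict = {}
for sbb in sbbs:
    by_sourcing.setdefault(sbb["sourcing"], []).append(sbb["name"])

print(sorted(by_sourcing))  # ['developed', 'procured', 're-used']
```

Keeping the three categories separate makes the later procurement and development planning traceable back to individual building block decisions.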

A full list of standards and specifications recommended by The Open Group can be found in Part III: Enterprise Continuum, Foundation Architecture: Standards Information Base.

Inputs

The inputs to Step 7 are:

Target Business Architecture (Business Architecture), Version 1.0

Target Technology Architecture (Technology Architecture), Version 0.7

Re-usable Architecture Building Blocks, from organization's Architecture Continuum (The Architecture Continuum), if available

Standards Information Base (SIB)

Activities

Key activities in Step 7 include:

1. Ensure clear documentation of all interfaces for each building block (APIs, data formats, protocols, hardware interfaces).

2. Select standards for each of the Architecture Building Blocks, re-using as much as possible from the reference models selected from the Architecture Continuum.

3. Fully document each Architecture Building Block.

4. Perform a final cross-check of the overall architecture against business requirements. Document rationale for building block decisions in the architecture document.

5. Document final requirements traceability reports.

6. Document final mapping of the architecture within the Architecture Continuum. From the selected Architecture Building Blocks, identify those that might be re-used, and publish via the architecture repository.

7. Document rationale for building block decisions in the architecture document.

8. Generate the Technology Architecture document.

9. Prepare the Technology Architecture Report. If appropriate, use reports and/or graphics generated by modeling tools to demonstrate key views of the architecture. Route the Technology Architecture document for review by relevant stakeholders, and incorporate feedback.

10. Checkpoint/Impact Analysis: Check the original motivation for the architecture project and the Statement of Architecture Work against the proposed Technology Architecture. Conduct an Impact Analysis, to:

a. Identify any areas where the Business Architecture (e.g., business practices) may need to change to cater for changes in the Technology Architecture. If the impact is significant, this may warrant the Business Architecture being revisited.

b. Identify any areas where the Data Architecture may need to change to cater for changes in the Technology Architecture. If the impact is significant, this may warrant the Data Architecture being revisited.

c. Identify any areas where the Applications Architecture may need to change to cater for changes in the Technology Architecture. If the impact is significant, this may warrant the Applications Architecture being revisited.

d. Refine the proposed Technology Architecture only if necessary.

Outputs

The outputs of Step 7 are:

Target Technology Architecture (Technology Architecture), Version 0.8:

Technology Architecture - architecture specification

Technology Architecture - requirements traceability

Technology Architecture - mapping of the architectures in the Architecture Continuum

Technology Architecture Report

Step 8

Step 8 is to conduct a gap analysis.

Objective

The objective of this step is to identify areas of the current and target system for which provision has not been made in the Technology Architecture. This is required in order to identify projects to be undertaken as part of the implementation of the target system.

Approach

A key step in validating an architecture is to consider what may have been forgotten. The architecture must support all of the essential information processing needs of the organization, as driven by the required applications. The most critical source of gaps that should be considered is stakeholder concerns that have not been addressed in subsequent architectural work.

Gap analysis highlights services and/or functions that have been accidentally left out, deliberately eliminated, or are yet to be developed or procured:

Draw up a matrix with all the business functions of the Baseline Architecture on the vertical axis, and all the business functions of the Target Technology Architecture on the horizontal axis. In creating the matrix, it is imperative to use terminology that is accurate and consistent.

Add to the Baseline Architecture axis a final row labeled "New Services", and to the Target Architecture axis a final column labeled "Eliminated Services".

Where a function is available in both the Baseline and Target Architectures, record this with "Included" at the intersecting cell.

Where a function from the Baseline Architecture is missing in the Target Architecture (in the example, "broadcast services" and "shared screen services"), each must be reviewed. If it was correctly eliminated, mark it as such in the appropriate "Eliminated Services" cell. If it was not, you have uncovered an accidental omission in your new architecture that must be addressed by reinstating the function in the next iteration of the design - mark it as such in the appropriate "Eliminated Services" cell.

Where a function from the Target Architecture cannot be found in the Baseline Architecture (in the example, "mailing list services"), mark it at the intersection with the "New Services" row, as a gap that needs to be filled, either by developing or procuring the function.

When the exercise is complete, anything under "Eliminated Services" or "New Services" is a gap, which should either be explained as correctly eliminated, or marked as to be addressed by reinstating or developing/procuring the function.
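The matrix-building procedure above reduces to three set operations. A minimal sketch, using the function names from the text's example plus "video services" as an invented function common to both architectures:

```python
# Business functions of each architecture. "broadcast services",
# "shared screen services", and "mailing list services" come from the
# text's example; "video services" is invented as a common function.
baseline = {"broadcast services", "shared screen services", "video services"}
target = {"video services", "mailing list services"}

matrix = {}
for f in baseline & target:          # present in both: included
    matrix[(f, f)] = "Included"
for f in baseline - target:          # dropped: review each elimination
    matrix[(f, "Eliminated Services")] = "Review: intentional or omission?"
for f in target - baseline:          # new: develop or procure
    matrix[("New Services", f)] = "Gap: develop or procure"

for cell in sorted(matrix):
    print(cell, "->", matrix[cell])
```

Every entry in the "Eliminated Services" column and the "New Services" row then becomes a review item, matching the rule stated above.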

Gap Analysis Matrix shows an example from the Network Services category when functions from the Baseline Architecture are missing from the Target Architecture.

Table: Gap Analysis Matrix

Inputs

The inputs to Step 8 are:

Target Business Architecture (Business Architecture), Version 1.0

Target Technology Architecture (Technology Architecture), Version 0.8

Target Data Architecture, Version 1.0

Target Applications Architecture, Version 1.0

Activities

Key activities in Step 8 include:

1. Create gap matrix as described above.

2. Identify building blocks to be carried over, classifying as either changed or unchanged.

3. Identify eliminated building blocks.

4. Identify new building blocks.

5. Identify gaps and classify as those that should be developed, those that should be procured, and those inherited.

Outputs

The output of Step 8 is:

Target Technology Architecture (Technology Architecture), Version 1.0:

Technology Architecture - gap report

Postscript

The Technology Architecture development process described above includes iterations. Financial and timing constraints should explicitly limit the number of iterations within Steps 1 through 8, and drive to implementation. After that, a new cycle of architecture evolution may ensue.

Choosing the scope of an architecture development cycle carefully will accelerate the pay-back. In contrast, an excessively large scope is unlikely to lead to successful implementation.


"How do you eat an elephant? - One bite at a time."


Downloads

Downloads of the TOGAF documentation are available under license from the TOGAF information web site. The license is free to any organization wishing to use TOGAF entirely for internal purposes (for example, to develop an information system architecture for use within that organization). A hardcopy book is also available from The Open Group Bookstore as document G063.


Architecture Principles

Introduction | Characteristics of Architecture Principles | Components of Architecture Principles | Developing Architecture Principles | Applying Architecture Principles | Example Set of Architecture Principles

This chapter provides principles for the use and deployment of IT resources across the enterprise.

This chapter builds on work done by the US Air Force in establishing its Headquarters Air Force Principles for Information Management (June 29, 1998), with the addition of other input materials.

Introduction

Principles are general rules and guidelines, intended to be enduring and seldom amended, that inform and support the way in which an organization sets about fulfilling its mission.

In their turn, principles may be just one element in a structured set of ideas that collectively define and guide the organization, from values through to actions and results.

Depending on the organization, principles may be established at any or all of three levels:

Enterprise principles provide a basis for decision-making throughout an enterprise, and inform how the organization sets about fulfilling its mission. Such enterprise-level principles are commonly found in governmental and not-for-profit organizations, but are encountered in commercial organizations also, as a means of harmonizing decision-making across a distributed organization. In particular, they are a key element in a successful architecture governance strategy (see Architecture Governance).

Information Technology (IT) principles provide guidance on the use and deployment of all IT resources and assets across the enterprise. They are developed in order to make the information environment as productive and cost-effective as possible.

Architecture principles are a subset of IT principles that relate to architecture work. They reflect a level of consensus across the enterprise, and embody the spirit and thinking of the enterprise architecture. Architecture principles can be further divided into:

Principles that govern the architecture process, affecting the development, maintenance, and use of the enterprise architecture

Principles that govern the implementation of the architecture, establishing the first tenets and related guidance for designing and developing information systems

These sets of principles form a hierarchy, in that IT principles will be informed by, and elaborate on, the principles at the enterprise level; and architecture principles will likewise be informed by the principles at the two higher levels.

The remainder of this section deals exclusively with architecture principles.

Characteristics of Architecture Principles


Architecture principles define the underlying general rules and guidelines for the use and deployment of all IT resources and assets across the enterprise. They reflect a level of consensus among the various elements of the enterprise, and form the basis for making future IT decisions.

Each architecture principle should be clearly related back to the business objectives and key architecture drivers.

Components of Architecture Principles

It is useful to have a standard way of defining principles. In addition to a definition statement, each principle should have associated rationale and implications statements, both to promote understanding and acceptance of the principles themselves, and to support the use of the principles in explaining and justifying why specific decisions are made.

A recommended template is given in Recommended Format for Defining Principles.

Name: Should both represent the essence of the rule and be easy to remember. Specific technology platforms should not be mentioned in the name or statement of a principle. Avoid ambiguous words in the Name and in the Statement, such as "support", "open", "consider", and (for lack of good measure) the word "avoid" itself; be careful with "manage(ment)"; and look for unnecessary adjectives and adverbs (fluff).

Statement: Should succinctly and unambiguously communicate the fundamental rule. For the most part, the principles statements for managing information are similar from one organization to the next. It is vital that the principles statement be unambiguous.

Rationale: Should highlight the business benefits of adhering to the principle, using business terminology. Point to the similarity of information and technology principles to the principles governing business operations.

Also describe the relationship to other principles, and the intentions regarding a balanced interpretation. Describe situations where one principle would be given precedence or carry more weight than another for making a decision.

Implications: Should highlight the requirements, both for the business and IT, for carrying out the principle - in terms of resources, costs, and activities/tasks. It will often be apparent that current systems, standards, or practices would be incongruent with the principle upon adoption. The impact to the business and consequences of adopting a principle should be clearly stated. The reader should readily discern the answer to: "How does this affect me?" It is important not to oversimplify, trivialize, or judge the merit of the impact.

Some of the implications will be identified as potential impacts only, and may be speculative rather than fully analyzed.

Table: Recommended Format for Defining Principles


An example set of architecture principles following this template is given in Example Set of Architecture Principles.
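As a non-normative sketch, the template can be captured as a simple record, together with a check for the ambiguous words discouraged above (the `Principle` class and the exact contents of the word list are assumptions, not part of TOGAF):

```python
from dataclasses import dataclass, field

# Words the guidance above discourages in a principle's Name and Statement.
AMBIGUOUS = {"support", "open", "consider", "avoid", "manage", "management"}

@dataclass
class Principle:
    name: str
    statement: str
    rationale: str
    implications: list[str] = field(default_factory=list)

    def ambiguous_words(self) -> set[str]:
        """Flag discouraged words appearing in the Name or Statement."""
        words = (self.name + " " + self.statement).lower().split()
        return {w.strip('".,;:()') for w in words} & AMBIGUOUS

# A deliberately poor principle name/statement to show the check firing.
p = Principle(
    name="Support Open Interfaces",
    statement="We will consider open interfaces wherever possible.",
    rationale="(omitted)",
)
print(sorted(p.ambiguous_words()))  # ['consider', 'open', 'support']
```

A well-formed principle such as "Data is an Asset" would pass the check with an empty result.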

Developing Architecture Principles

Architecture principles are typically developed by the Lead Architect, in conjunction with the enterprise CIO, Architecture Board, and other key business stakeholders.

Appropriate policies and procedures must be developed to support the implementation of the principles.

Architecture principles will be informed by overall IT principles and principles at the enterprise level, if they exist.

They are chosen so as to ensure alignment of IT strategies with business strategies and visions. Specifically, the development of architecture principles is typically influenced by the following:

Enterprise mission and plans: the mission, plans, and organizational infrastructure of the enterprise.

Enterprise strategic initiatives: the characteristics of the enterprise - its strengths, weaknesses, opportunities, and threats - and its current enterprise-wide initiatives (such as process improvement and quality management).

External constraints: market factors (time-to-market imperatives, customer expectations, etc.); existing and potential legislation.

Current systems and technology: the set of information resources deployed within the enterprise, including systems documentation, equipment inventories, network configuration diagrams, policies, and procedures.


Computer industry trends: predictions about the usage, availability, and cost of computer and communication technologies, referenced from credible sources along with associated best practices presently in use.

Qualities of Principles

Merely having a written statement that is called a principle does not mean that the principle is good, even if everyone agrees with it.

A good set of principles will be founded in the beliefs and values of the organization and expressed in language that the business understands and uses. Principles should be few in number, future-oriented, and endorsed and championed by senior management. They provide a firm foundation for making architecture and planning decisions, framing policies, procedures, and standards, and supporting resolution of contradictory situations. A poor set of principles will quickly become disused, and the resultant architectures, policies, and standards will appear arbitrary or self-serving, and thus lack credibility. Essentially, principles drive behavior.

There are five criteria that distinguish a good set of principles:

Understandable: the underlying tenets can be quickly grasped and understood by individuals throughout the organization. The intention of the principle is clear and unambiguous, so that violations, whether intentional or not, are minimized.

Robust: enable good quality decisions about architectures and plans to be made, and enforceable policies and standards to be created. Each principle should be sufficiently definitive and precise to support consistent decision-making in complex, potentially controversial situations.

Complete: every potentially important principle governing the management of information and technology for the organization is defined. The principles cover every situation perceived.

Consistent: strict adherence to one principle may require a loose interpretation of another principle. The set of principles must be expressed in a way that allows a balance of interpretations. Principles should not be contradictory to the point where adhering to one principle would violate the spirit of another. Every word in a principle statement should be carefully chosen to allow consistent yet flexible interpretation.

Stable: principles should be enduring, yet able to accommodate changes. An amendment process should be established for adding, removing, or altering principles after they are ratified initially.

Applying Architecture Principles

Architecture principles are used to capture the fundamental truths about how the enterprise will use and deploy IT resources and assets. The principles are used in a number of different ways:

1.

To provide a framework within which the enterprise can start to make conscious decisions about IT

2.

As a guide to establishing relevant evaluation criteria, thus exerting strong influence on the selection of products or product architectures in the later stages of managing compliance to the IT architecture


3.

As drivers for defining the functional requirements of the architecture

4.

As an input to assessing both existing IS/IT systems and the future strategic portfolio, for compliance with the defined architectures. These assessments will provide valuable insights into the transition activities needed to implement an architecture, in support of business goals and priorities

5.

The Rationale statements (see below) highlight the value of the architecture to the enterprise, and therefore provide a basis for justifying architecture activities

6.

The Implications statements (see below) provide an outline of the key tasks, resources, and potential costs to the enterprise of following the principle; they also provide valuable inputs to future transition initiative and planning activities

7.

Support the architecture governance activities in terms of:

Providing a "back-stop" for the standard Architecture Compliance assessments where some interpretation is allowed or required

Supporting the decision to initiate a dispensation request where the implications of a particular architecture amendment cannot be resolved within local operating procedure

Principles are inter-related, and need to be applied as a set.

Principles will sometimes compete; for example, the principles of "accessibility" and "security" tend towards conflicting decisions. Each principle must be considered in the context of "all other things being equal".


At times a decision will be required as to which information principle will take precedence on a particular issue. The rationale for such decisions should always be documented.

A common reaction on first reading of a principle is "this is motherhood", but the fact that a principle seems self-evident does not mean that the principle is actually observed in an organization, even when there are verbal acknowledgements of the principle.

Although specific penalties are not prescribed in a declaration of principles, violations of principles generally cause operational problems and inhibit the ability of the organization to fulfil its mission.

Example Set of Architecture Principles

Too many principles can reduce the flexibility of the architecture. Many organizations prefer to define only high-level principles, and to limit the number to between 10 and 20.

The following example illustrates both the typical content of a set of architecture principles, and the recommended format for defining them, as explained above.

Another example of architecture principles is contained in the US Government's Federal Enterprise Architecture Framework (FEAF - see FEAF).

Business Principles

Principle 1: Primacy of Principles

Statement:

These principles of information management apply to all organizations within the enterprise.

Rationale:

The only way we can provide a consistent and measurable level of quality information to decision-makers is if all organizations abide by the principles.

Implications:

Without this principle, exclusions, favoritism, and inconsistency would rapidly undermine the management of information.

Information management initiatives will not begin until they are examined for compliance with the principles.

A conflict with a principle will be resolved by changing the framework of the initiative.

Principle 2: Maximize Benefit to the Enterprise

Statement:

Information management decisions are made to provide maximum benefit to the enterprise as a whole.

Rationale:

This principle embodies "service above self". Decisions made from an enterprise-wide perspective have greater long-term value than decisions made from any particular organizational perspective. Maximum return on investment requires information management decisions to adhere to enterprise-wide drivers and priorities. No minority group will detract from the benefit of the whole. However, this principle will not preclude any minority group from getting its job done.

Implications:


Achieving maximum enterprise-wide benefit will require changes in the way we plan and manage information. Technology alone will not bring about this change.

Some organizations may have to concede their own preferences for the greater benefit of the entire enterprise.

Application development priorities must be established by the entire enterprise for the entire enterprise.

Applications components should be shared across organizational boundaries.

Information management initiatives should be conducted in accordance with the enterprise plan.

Individual organizations should pursue information management initiatives which conform to the blueprints and priorities established by the enterprise. We will change the plan as we need to.

As needs arise, priorities must be adjusted. A forum with comprehensive enterprise representation should make these decisions.

Principle 3: Information Management is Everybody's Business

Statement:

All organizations in the enterprise participate in information management decisions needed to accomplish business objectives.

Rationale:

Information users are the key stakeholders, or customers, in the application of technology to address a business need. In order to ensure information management is aligned with the business, all organizations in the enterprise must be involved in all aspects of the information environment. The business experts from across the enterprise and the technical staff responsible for developing and sustaining the information environment need to come together as a team to jointly define the goals and objectives of IT.

Implications:

To operate as a team, every stakeholder, or customer, will need to accept responsibility for developing the information environment.

Commitment of resources will be required to implement this principle.

Principle 4: Business Continuity

Statement:

Enterprise operations are maintained in spite of system interruptions.

Rationale:

As system operations become more pervasive, we become more dependent on them; therefore, we must consider the reliability of such systems throughout their design and use. Business premises throughout the enterprise must be provided with the capability to continue their business functions regardless of external events. Hardware failure, natural disasters, and data corruption should not be allowed to disrupt or stop enterprise activities. The enterprise business functions must be capable of operating on alternative information delivery mechanisms.

Implications:


Dependency on shared system applications mandates that the risks of business interruption must be established in advance and managed.

Management includes but is not limited to periodic reviews, testing for vulnerability and exposure, or designing mission-critical services to assure business function continuity through redundant or alternative capabilities.

Recoverability, redundancy, and maintainability should be addressed at the time of design.

Applications must be assessed for criticality and impact on the enterprise mission, in order to determine what level of continuity is required and what corresponding recovery plan is necessary.

Principle 5: Common Use Applications

Statement:

Development of applications used across the enterprise is preferred over the development of similar or duplicative applications which are only provided to a particular organization.

Rationale:

Duplicative capability is expensive and proliferates conflicting data.

Implications:

Organizations which depend on a capability which does not serve the entire enterprise must change over to the replacement enterprise-wide capability.

This will require establishment of and adherence to a policy requiring this.

Organizations will not be allowed to develop capabilities for their own use which are similar/duplicative of enterprise-wide capabilities.

In this way, expenditures of scarce resources to develop essentially the same capability in marginally different ways will be reduced.

Data and information used to support enterprise decision-making will be standardized to a much greater extent than previously. This is because the smaller, organizational capabilities which produced different data (which was not shared among other organizations) will be replaced by enterprise-wide capabilities. The impetus for adding to the set of enterprise-wide capabilities may well come from an organization making a convincing case for the value of the data/information previously produced by its organizational capability, but the resulting capability will become part of the enterprise-wide system, and the data it produces will be shared across the enterprise.

Principle 6: Compliance with Law

Statement:

Enterprise information management processes comply with all relevant laws, policies, and regulations.

Rationale:

Enterprise policy is to abide by laws, policies, and regulations. This will not preclude business process improvements that lead to changes in policies and regulations.

Implications:

The enterprise must be mindful to comply with laws, regulations, and external policies regarding the collection, retention, and management of data.

Education on, and access to, the rules will be required; efficiency, need, and common sense are not the only drivers.

Changes in the law and changes in regulations may drive changes in our processes or applications.

Principle 7: IT Responsibility

Statement:

The IT organization is responsible for owning and implementing IT processes and infrastructure that enable solutions to meet user-defined requirements for functionality, service levels, cost, and delivery timing.

Rationale:

This principle effectively aligns expectations with capabilities and costs, so that all projects are cost-effective. Efficient and effective solutions have reasonable costs and clear benefits.

Implications:

A process must be created to prioritize projects.

The IT function must define processes to manage business unit expectations.

Data, application, and technology models must be created to enable integrated quality solutions and to maximize results.

Principle 8: Protection of Intellectual Property

Statement:

The enterprise's Intellectual Property (IP) must be protected. This protection must be reflected in the IT architecture, implementation, and governance processes.

Rationale:

A major part of an enterprise's IP is hosted in the IT domain.

Implications:

While protection of IP assets is everybody's business, much of the actual protection is implemented in the IT domain. Even trust in non-IT processes can be managed by IT processes (email, mandatory notes, etc.).

A security policy, governing human and IT actors, will be required that can substantially improve protection of IP. This must be capable of both avoiding compromises and reducing liabilities.

Resources on such policies can be found at the SANS Institute ( www.sans.org/newlook/home.php ).

Data Principles

Principle 9: Data is an Asset

Statement:

Data is an asset that has value to the enterprise and is managed accordingly.

Rationale:

Data is a valuable corporate resource; it has real, measurable value. In simple terms, the purpose of data is to aid decision-making. Accurate, timely data is critical to accurate, timely decisions. Most corporate assets are carefully managed, and data is no exception.

Data is the foundation of our decision-making, so we must also carefully manage data to ensure that we know where it is, can rely upon its accuracy, and can obtain it when and where we need it.

Implications:


This is one of three closely-related principles regarding data: data is an asset; data is shared; and data is easily accessible. The implication is that there is an education task to ensure that all organizations within the enterprise understand the relationship between value of data, sharing of data, and accessibility to data.

Stewards must have the authority and means to manage the data for which they are accountable.

We must make the cultural transition from "data ownership" thinking to "data stewardship" thinking.

The role of data steward is critical because obsolete, incorrect, or inconsistent data could be passed to enterprise personnel and adversely affect decisions across the enterprise.

Part of the role of data steward, who manages the data, is to ensure data quality. Procedures must be developed and used to prevent and correct errors in the information and to improve those processes that produce flawed information. Data quality will need to be measured and steps taken to improve data quality - it is probable that policy and procedures will need to be developed for this as well.

A forum with comprehensive enterprise-wide representation should decide on process changes suggested by the steward.

Since data is an asset of value to the entire enterprise, data stewards accountable for properly managing the data must be assigned at the enterprise level.

Principle 10: Data is Shared

Statement:

Users have access to the data necessary to perform their duties; therefore, data is shared across enterprise functions and organizations.

Rationale:

Timely access to accurate data is essential to improving the quality and efficiency of enterprise decision-making.

It is less costly to maintain timely, accurate data in a single application, and then share it, than it is to maintain duplicative data in multiple applications. The enterprise holds a wealth of data, but it is stored in hundreds of incompatible stovepipe databases. The speed of data collection, creation, transfer, and assimilation is driven by the ability of the organization to efficiently share these islands of data across the organization.

Shared data will result in improved decisions since we will rely on fewer (ultimately one virtual) sources of more accurate and timely managed data for all of our decision-making. Electronically shared data will result in increased efficiency when existing data entities can be used, without re-keying, to create new entities.

Implications:

This is one of three closely-related principles regarding data: data is an asset; data is shared; and data is easily accessible. The implication is that there is an education task to ensure that all organizations within the enterprise understand the relationship between value of data, sharing of data, and accessibility to data.

To enable data sharing we must develop and abide by a common set of policies, procedures, and standards governing data management and access for both the short and the long term.

For the short term, to preserve our significant investment in legacy systems, we must invest in software capable of migrating legacy system data into a shared data environment.

We will also need to develop standard data models, data elements, and other metadata that defines this shared environment and develop a repository system for storing this metadata to make it accessible.

For the long term, as legacy systems are replaced, we must adopt and enforce common data access policies and guidelines for new application developers to ensure that data in new applications remains available to the shared environment and that data in the shared environment can continue to be used by the new applications.

For both the short term and the long term we must adopt common methods and tools for creating, maintaining, and accessing the data shared across the enterprise.

Data sharing will require a significant cultural change.

This principle of data sharing will continually "bump up against" the principle of data security. Under no circumstances will the data sharing principle cause confidential data to be compromised.


Data made available for sharing will have to be relied upon by all users to execute their respective tasks. This will ensure that only the most accurate and timely data is relied upon for decision-making.

Shared data will become the enterprise-wide "virtual single source" of data.

Principle 11: Data is Accessible

Statement:

Data is accessible for users to perform their functions.

Rationale:

Wide access to data leads to efficiency and effectiveness in decision-making, and affords timely response to information requests and service delivery.

Using information must be considered from an enterprise perspective to allow access by a wide variety of users. Staff time is saved and consistency of data is improved.

Implications:

This is one of three closely-related principles regarding data: data is an asset; data is shared; and data is easily accessible. The implication is that there is an education task to ensure that all organizations within the enterprise understand the relationship between value of data, sharing of data, and accessibility to data.

Accessibility involves the ease with which users obtain information.

The way information is accessed and displayed must be sufficiently adaptable to meet a wide range of enterprise users and their corresponding methods of access.


Access to data does not constitute understanding of the data. Personnel should take caution not to misinterpret information.

Access to data does not necessarily grant the user access rights to modify or disclose the data. This will require an education process and a change in the organizational culture, which currently supports a belief in "ownership" of data by functional units.

Principle 12: Data Trustee

Statement:

Each data element has a trustee accountable for data quality.

Rationale:

One of the benefits of an architected environment is the ability to share data (e.g., text, video, sound, etc.) across the enterprise. As the degree of data sharing grows and business units rely upon common information, it becomes essential that only the data trustee makes decisions about the content of data.

Since data can lose its integrity when it is entered multiple times, the data trustee will have sole responsibility for data entry which eliminates redundant human effort and data storage resources.

Note:

A trustee is different than a steward - a trustee is responsible for accuracy and currency of the data, while responsibilities of a steward may be broader and include data standardization and definition tasks.

Implications:

Real trusteeship dissolves the data "ownership" issues and allows the data to be available to meet all users' needs. This implies that a cultural change from data "ownership" to data "trusteeship" may be required.

The data trustee will be responsible for meeting quality requirements levied upon the data for which the trustee is accountable.

It is essential that the trustee has the ability to provide user confidence in the data based upon attributes such as "data source".

It is essential to identify the true source of the data in order that the data authority can be assigned this trustee responsibility. This does not mean that classified sources will be revealed nor does it mean the source will be the trustee.

Information should be captured electronically once and immediately validated as close to the source as possible. Quality control measures must be implemented to ensure the integrity of the data.

As a result of sharing data across the enterprise, the trustee is accountable and responsible for the accuracy and currency of their designated data element(s) and, subsequently, must then recognize the importance of this trusteeship responsibility.

Principle 13: Common Vocabulary and Data Definitions

Statement:

Data is defined consistently throughout the enterprise, and the definitions are understandable and available to all users.


Rationale:

The data that will be used in the development of applications must have a common definition throughout the Headquarters to enable sharing of data. A common vocabulary will facilitate communications and enable dialogue to be effective. In addition, it is required to interface systems and exchange data.

Implications:

We are lulled into thinking that this issue is adequately addressed because there are people with "data administration" job titles and forums with charters implying responsibility. Significant additional energy and resources must be committed to this task. It is key to the success of efforts to improve the information environment.

This is separate from but related to the issue of data element definition, which is addressed by a broad community - this is more like a common vocabulary and definition.

The enterprise must establish the initial common vocabulary for the business. The definitions will be used uniformly throughout the enterprise.

Whenever a new data definition is required, the definition effort will be co-ordinated and reconciled with the corporate "glossary" of data descriptions. The enterprise data administrator will provide this co-ordination.

Ambiguities resulting from multiple parochial definitions of data must give way to accepted enterprise-wide definitions and understanding.

Multiple data standardization initiatives need to be co-ordinated.


Functional data administration responsibilities must be assigned.

Principle 14:

Data Security

Statement:

Data is protected from unauthorized use and disclosure.

In addition to the traditional aspects of national security classification, this includes, but is not limited to, protection of pre-decisional, sensitive, source selection-sensitive, and proprietary information.

Rationale:

Open sharing of information and the release of information via relevant legislation must be balanced against the need to restrict the availability of classified, proprietary, and sensitive information.

Existing laws and regulations require the safeguarding of national security and the privacy of data, while permitting free and open access. Pre-decisional (work-in-progress, not yet authorized for release) information must be protected to avoid unwarranted speculation, misinterpretation, and inappropriate use.

Implications:

Aggregation of data, both classified and not, will create a large target requiring review and declassification procedures to maintain appropriate control. Data owners and/or functional users must determine whether the aggregation results in an increased classification level. We will need appropriate policy and procedures to handle this review and de-classification. Access to information based on a need-to-know policy will force regular reviews of the body of information.

The current practice of having separate systems to contain different classifications needs to be rethought. Is there a software solution to separating classified and unclassified data? The current hardware solution is unwieldy, inefficient, and costly. It is more expensive to manage unclassified data on a classified system. Currently, the only way to combine the two is to place the unclassified data on the classified system, where it must remain.

In order to adequately provide access to open information while maintaining secure information, security needs must be identified and developed at the data level, not the application level.

Data security safeguards can be put in place to restrict access to "view only", or "never see".
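As a rough illustration of security applied at the data level rather than the application level, a record's fields can carry sensitivity labels and be filtered by the caller's clearance before any application sees them. The labels, levels, and field names below are assumptions invented for this sketch, not part of the source.

```python
# Sketch of data-level (rather than application-level) access control.
# Sensitivity labels, clearance levels, and field names are illustrative
# assumptions; "never see" fields are dropped entirely, not blanked.

SENSITIVITY = {"title": "open", "budget": "sensitive", "source": "classified"}
CLEARANCE = {"open": 0, "sensitive": 1, "classified": 2}

def visible_fields(record, user_level):
    """Return only the fields the user's clearance permits."""
    return {k: v for k, v in record.items()
            if CLEARANCE[SENSITIVITY[k]] <= CLEARANCE[user_level]}

record = {"title": "Project X", "budget": 100000, "source": "unit-7"}
public_view = visible_fields(record, "open")
```

Because the filter runs against the data itself, every application built on top of it inherits the same restrictions, which is the point of pushing security to the data level.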

Sensitivity labeling for access to pre-decisional, decisional, classified, sensitive, or proprietary information must be determined.

Security must be designed into data elements from the beginning; it cannot be added later.

Systems, data, and technologies must be protected from unauthorized access and manipulation. Headquarters information must be safeguarded against inadvertent or unauthorized alteration, sabotage, disaster, or disclosure.

Need new policies on managing duration of protection for pre-decisional information and other works-in-progress, in consideration of content freshness.

Application Principles


Principle 15:

Technology Independence

Statement:

Applications are independent of specific technology choices and therefore can operate on a variety of technology platforms.

Rationale:

Independence of applications from the underlying technology allows applications to be developed, upgraded, and operated in the most cost-effective and timely way. Otherwise technology, which is subject to continual obsolescence and vendor dependence, becomes the driver rather than the user requirements themselves.

Realizing that every decision made with respect to IT makes us dependent on that technology, the intent of this principle is to ensure that Application Software is not dependent on specific hardware and operating systems software.

Implications:

This principle will require standards which support portability.

For Commercial Off-The-Shelf (COTS) and Government Off-The-Shelf (GOTS) applications, there may be limited current choices, as many of these applications are technology- and platform-dependent.

Application Program Interfaces (APIs) will need to be developed to enable legacy applications to interoperate with applications and operating environments developed under the enterprise architecture.

Middleware should be used to decouple applications from specific software solutions.

As an example, this principle could lead to use of Java, and future Java-like protocols, which give a high degree of priority to platform-independence.
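As a sketch of the decoupling idea (this is not TOGAF guidance, and all names are illustrative), an application can depend on a thin transport interface so that the concrete middleware product remains swappable without touching application code:

```python
# Sketch: decoupling an application from a specific middleware product
# via a thin interface. Class and method names are invented for
# illustration.

class MessageTransport:
    """What the application depends on; nothing vendor-specific leaks in."""
    def send(self, destination, body):
        raise NotImplementedError

class InMemoryTransport(MessageTransport):
    """One pluggable implementation; could equally be JMS, MQ, or HTTP."""
    def __init__(self):
        self.delivered = []
    def send(self, destination, body):
        self.delivered.append((destination, body))

def submit_order(transport, order_id):
    # The application never names a concrete product or protocol.
    transport.send("orders", {"id": order_id})

t = InMemoryTransport()
submit_order(t, "po-1")
```

Swapping the middleware then means providing another `MessageTransport` implementation, which is the portability this principle asks standards to support.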

Principle 16:

Ease-of-Use

Statement:

Applications are easy to use. The underlying technology is transparent to users, so they can concentrate on tasks at hand.

Rationale:

The more a user has to understand the underlying technology, the less productive that user is. Ease-of-use is a positive incentive for use of applications. It encourages users to work within the integrated information environment instead of developing isolated systems to accomplish the task outside of the enterprise's integrated information environment. Most of the knowledge required to operate one system will be similar to others. Training is kept to a minimum, and the risk of using a system improperly is low.

Using an application should be as intuitive as driving a different car.

Implications:

Applications will be required to have a common "look and feel" and support ergonomic requirements. Hence, the common look and feel standard must be designed and usability test criteria must be developed.

Guidelines for user interfaces should not be constrained by narrow assumptions about user location, language, systems training, or physical capability. Factors such as linguistics, customer physical infirmities (visual acuity, ability to use keyboard/mouse), and proficiency in the use of technology have broad ramifications in determining the ease-of-use of an application.

Technology Principles

Principle 17:

Requirements-Based Change

Statement:

Only in response to business needs are changes to applications and technology made.

Rationale:

This principle will foster an atmosphere where the information environment changes in response to the needs of the business, rather than having the business change in response to IT changes. This is to ensure that the purpose of the information support - the transaction of business - is the basis for any proposed change. Unintended effects on business due to IT changes will be minimized. A change in technology may provide an opportunity to improve the business process and, hence, change business needs.

Implications:


Changes in implementation will follow full examination of the proposed changes using the enterprise architecture.

We don't fund a technical improvement or system development unless a documented business need exists.

Change management processes conforming to this principle will be developed and implemented.

This principle may bump up against the responsive change principle. We must ensure the requirements documentation process does not hinder responsive change to meet legitimate business needs. The purpose of this principle is to keep us focused on business, not technology needs - responsive change is also a business need.

Principle 18:

Responsive Change Management

Statement:

Changes to the enterprise information environment are implemented in a timely manner.

Rationale:

If people are to be expected to work within the enterprise information environment, that information environment must be responsive to their needs.

Implications:

We have to develop processes for managing and implementing change that do not create delays.

A user who feels a need for change will need to connect with a "business expert" to facilitate explanation and implementation of that need.

If we are going to make changes, we must keep the architectures updated.


Adopting this principle might require additional resources.

This will conflict with other principles (e.g., maximum enterprise-wide benefit, enterprise-wide applications, etc.).

Principle 19:

Control Technical Diversity

Statement:

Technological diversity is controlled to minimize the non-trivial cost of maintaining expertise in and connectivity between multiple processing environments.

Rationale:

There is a real, non-trivial cost of infrastructure required to support alternative technologies for processing environments. There are further infrastructure costs incurred to keep multiple processor constructs interconnected and maintained.

Limiting the number of supported components will simplify maintainability and reduce costs.

The business advantages of minimum technical diversity include: standard packaging of components; predictable implementation impact; predictable valuations and returns; redefined testing; utility status; and increased flexibility to accommodate technological advancements. Common technology across the enterprise brings the benefits of economies of scale to the enterprise. Technical administration and support costs are better controlled when limited resources can focus on this shared set of technology.

Implications:


Policies, standards, and procedures that govern acquisition of technology must be tied directly to this principle.

Technology choices will be constrained by the choices available within the technology blueprint.

Procedures for augmenting the acceptable technology set to meet evolving requirements will have to be developed and emplaced.

We are not freezing our technology baseline. We welcome technology advances and will change the technology blueprint when compatibility with the current infrastructure, improvement in operational efficiency, or a required capability has been demonstrated.

Principle 20:

Interoperability

Statement:

Software and hardware should conform to defined standards that promote interoperability for data, applications, and technology.

Rationale:

Standards help ensure consistency, thus improving the ability to manage systems and improve user satisfaction, and protect existing IT investments, thus maximizing return on investment and reducing costs. Standards for interoperability additionally help ensure support from multiple vendors for their products, and facilitate supply chain integration.

Implications:

Interoperability standards and industry standards will be followed unless there is a compelling business reason to implement a non-standard solution.

A process for setting standards, reviewing and revising them periodically, and granting exceptions must be established.

The existing IT platforms must be identified and documented.


Downloads

Downloads of the TOGAF documentation are available under license from the TOGAF information web site. The license is free to any organization wishing to use TOGAF entirely for internal purposes (for example, to develop an information system architecture for use within that organization). A hardcopy book is also available from The Open Group Bookstore as document G063.


Introduction

Well, here is an attempt to bring together ebXML, Web Services, and PMLs (BPML, BPEL, WS-CDL...) and define their respective roles and responsibilities, while showing how they fit in an overall IT infrastructure.

So we start with the basics and the π-calculus theory. For those of you who are not familiar with it but remember a few lectures of your college math classes, I really recommend reading about it. The best source is the book by the inventor of the theory, Robin Milner. You can also read this section, which explains the differences between State, Actions and Interactions in the context of finite state machines.
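To make the state/action distinction concrete, here is a minimal finite-state-machine sketch. The states, actions, and transitions are invented for illustration; this is not Milner's formal notation.

```python
# Minimal FSM sketch: states capture where a process is, actions label
# the transitions between states, and an interaction is an action shared
# with another party (here, the "receive_*" actions). All names are
# illustrative.

TRANSITIONS = {
    ("awaiting_order", "receive_order"): "order_received",
    ("order_received", "send_invoice"): "awaiting_payment",
    ("awaiting_payment", "receive_payment"): "done",
}

def step(state, action):
    """Apply one action; reject actions not allowed in the current state."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"action {action!r} not allowed in state {state!r}")

state = "awaiting_order"
for action in ("receive_order", "send_invoice", "receive_payment"):
    state = step(state, action)
```

The guard in `step` is what distinguishes a process description from a mere list of steps: the same action is legal in one state and illegal in another.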

We then move on and review the current state of Web Service specifications and explain why web services are the foundations of all three modern Process Markup Languages (BPML, XLANG and WSFL).

You should be ready now to take a look at the architecture of Business Process Management systems. We will then move on to the analysis of the Process Markup Languages.

What is a business process anyway?

Everything that looks like a series of steps tends to be labelled a business process. Here is a simple taxonomy of "business processes". For the purpose of this discussion we will distinguish five concepts, all of which are referenced in the literature as "business processes":

Enterprise business processes

Executable business processes

ebXML business processes (a.k.a. collaborations)

Business process activities

Workflows

An Enterprise Business Process (eBP) is the description of the steps needed to carry out a business activity, regardless of the systems involved. It provides a high-level view of the steps involved and can be used to model, benchmark and document existing or future designs. Enterprise business processes are free to span multiple corporations because, by nature, they are not bound to systems. An example would be describing all the steps required for a pair of shoes to be manufactured in Asia and appear at your favourite store at the mall.

An Executable Business Process (xBP, or Business Process for short) is a kind of eBP whose lifecycle is controlled by one or a combination of systems. We will call these systems business process management systems (BPMS). An xBP is limited to run within a single corporation. One of the important characteristics of an eBP or xBP is that it is long running. Its execution is not limited to minutes or hours like the session of a web-based application; it rather spans days, months, or years. An xBP relies on specific interactions between users, systems, and business partners, which it ties together. The BPMS provides all the facilities and services necessary for design and execution, and mediates the integration with its environment. As we will see in the later sections of this chapter, a BPML, XLANG or WSFL business process is an xBP.
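The contrast with session-scoped web-application state can be sketched as a process instance that persists its state between events, so that it survives days or months between interactions. This is a minimal illustrative sketch; the file-based store, class name, and state values are assumptions, not part of any BPMS product.

```python
import json
import os
import tempfile

# Sketch: a long-running process instance survives between events by
# persisting its state, instead of living in an in-memory web session.
# The JSON file store and state names are illustrative assumptions.

class ProcessInstance:
    def __init__(self, path, state="started"):
        self.path = path
        self.state = state

    def save(self):
        with open(self.path, "w") as f:
            json.dump({"state": self.state}, f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            return cls(path, json.load(f)["state"])

store = os.path.join(tempfile.mkdtemp(), "po-1042.json")

# Day 1: an event arrives, the instance advances and is persisted.
inst = ProcessInstance(store)
inst.state = "awaiting_goods"
inst.save()

# Day 30: another event arrives; the instance is rehydrated, not recreated.
inst2 = ProcessInstance.load(store)
inst2.state = "completed"
inst2.save()
```

The rehydration step is what a BPMS provides out of the box and what, as noted below, a J2EE developer has historically had to code by hand.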

An ebXML Business Process (Collaboration) is a business collaboration specification which can be used to specify how two concurrent executable business processes interact at the business level.

A Business Process Activity (Task) represents a short-lived interaction between users or, in certain cases, systems.

BPAs are often managed by a Session Bean in a J2EE-based application. A BPA can be viewed as one step in an executable business process. A typical example is a user browsing a catalogue and filling a shopping cart. Once the user is finished, he or she pushes the checkout button, which in turn completes the activity. The proper information is passed to a business process management system as part of a completion message. Overall, the concept of long-running executable business processes is not part of the J2EE architecture - this architecture was solely designed to provide the services to build standalone web-based OLTP applications. Typically, in the current J2EE model, a developer has to hard-code the long-running state management.
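The checkout handoff described above can be sketched as a short-lived activity that emits a completion message to the process engine. The message fields and the in-memory stand-in for the BPMS inbox are hypothetical; a real system would post to a process engine's API.

```python
# Sketch of a short-lived activity (BPA) handing off to a BPMS on
# completion. The message shape and the in-memory "BPMS inbox" are
# illustrative assumptions.

completed_messages = []   # stands in for the BPMS inbox

def checkout(process_id, cart):
    """Finish the shopping activity and emit a completion message."""
    message = {
        "process_id": process_id,
        "activity": "fill_shopping_cart",
        "status": "completed",
        "payload": {"items": cart, "total": sum(p for _, p in cart)},
    }
    completed_messages.append(message)
    return message

msg = checkout("order-77", [("shoes", 59.0), ("socks", 6.0)])
```

The activity itself holds no long-running state: everything the process needs to continue travels in the completion message.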

Lastly, some people talk about Workflow or, strangely enough, use the term Business Process Workflow. We can often associate workflow with "automated document management" which requires reviews and approvals: for instance, the review of a proposal or a contract by a large number of people. The engine in charge of this task does not know much about the documents themselves and is merely routing them through different people while keeping an audit trail. There is little or no integration with enterprise systems, let alone with other partners.


Business Process or Workflow?

Ultimus provides an excellent definition of workflow: "Any task performed in series or in parallel by two or more members of a workgroup to reach a common goal."

Note the words with emphasis:

Any Task: implies that workflow refers to a very wide range of business-related activities.

Series or in Parallel: implies that steps in the task may be performed one after the other, or simultaneously by different individuals, or a combination of the two.

Two or More Members: implies that if only one person performs a task, it is not workflow. As the workflow name suggests, a task is workflow if it "flows" from one individual to another.

Common Goal: individuals participating in workflow must be working towards a common goal. If they are working on independent projects, that does not constitute workflow.


Why Business Process Management Systems? Why Today?

Since the Internet revolution, enterprises have opened their core functions to customers, business partners, suppliers and financial institutions.


In doing so, they have created a maze of intranets, extranets, and B2B integrations, using various technologies ranging from EAI solutions to portals. Business Process Management Systems are the ultimate evolution of this quest for better interactions between the enterprise and its environment. A BPMS is a strategic platform that manages these relationships at the enterprise level, regardless of geography, organizational structure and IT.

The major barrier to a B2B implementation is not necessarily technology but rather cost. If you spend $1,000 (hardware, software licenses, labor, maintenance, support…) on each of 1,000 business partners, you have already spent $1,000,000. The cost of such a project could easily run into several hundred million dollars for a given business community or industry. Even if this cost is not fully supported by a single corporation, the higher it is, the less likely the overall project will be successful, because partners will delay their decision to implement an expensive infrastructure. The complexity and technical risks are also directly related to the cost of implementation.


Problem to solve: How do I enable my company to let its customers, partners and suppliers interact with its core functions (Sales, Production, Design and Marketing) in a secure environment?

Value proposition: bring rationalization and agility across the dozens, if not hundreds, of business processes a global corporation needs to support, which touch an increasing number of enterprise information systems (EIS).

Measure of success: how easy is it to adapt a business process to a changing economic environment? Can I manage personalized relationships with my customers, partners and suppliers?

Key technologies for BPMS

The Internet has dramatically changed the technical infrastructure of most IT departments, sometimes bringing extreme challenges (security) while enabling customers, suppliers and employees to reach the information they need and transact at will with core enterprise systems. These capabilities have also created an environment where enterprise systems are solicited daily by thousands of events which require near real-time processing.

Business Process Management Systems are the key infrastructure for routing all these events to the appropriate processing end point within the enterprise or onto a business partner system. There are four key technologies which make BPMS architecture more efficient:

XML

B2B middleware

Enterprise Application Integration


Web Services

XML has freed BPMS from the intricacies of interacting with data sources. In the past, most workflow engines had a collection of libraries to fetch or update data from a large variety of data sources. XML brings data into the data flow as "documents" which can be handled generically by the BPMS. In addition, business rules (decisions, guarded transitions, ...) can be written as standard XPath predicates, sometimes even in a fashion independent of the overall XML document structure. XML combined with XSLT (and often CSS) can also bring the data flow to the user's desktop, so that an engineering change order or the purchase of your laptop can be approved.
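A guarded transition of this kind can be sketched over an XML document with Python's standard library. Note that `xml.etree.ElementTree` supports only a limited XPath subset, so the numeric predicate itself is evaluated in Python here; the document shape and the 10,000 threshold are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Sketch: a business rule written as a predicate over the document,
# independent of which system produced it. Document shape and threshold
# are illustrative assumptions.

order = ET.fromstring(
    "<PurchaseOrder>"
    "<total currency='USD'>12500</total>"
    "</PurchaseOrder>"
)

# Guarded transition: route high-value orders to manual approval.
total = float(order.findtext("total"))
next_step = "manager-approval" if total > 10000 else "auto-approve"
```

Because the rule addresses the document rather than a database schema, it stays valid no matter which backend system emitted the order.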

B2B Middleware based on standards such as ebXML also provides a homogeneous way to connect business process management systems securely to the outside world. A BPMS can take advantage of Trading Partner Agreements to route messages dynamically to the correct business partner system. Again, without such a common infrastructure, BPMS vendors would spend a good deal of their resources reconciling different protocols.

Enterprise Application Integration is actually the oldest of these technologies, since major frameworks appeared in the mid-1990s with New Era of Networks (NEON), Mercator, or Oberon Software. Most of these players have been acquired, while newer players such as WebMethods or CrossWorlds have brought this technology to a level of maturity which provides a homogeneous environment enabling a BPMS to interact with virtually any enterprise and legacy system.


Web Services are the latest wave of technology aiming to standardize the interactions between systems. They are the direct heirs of the 1990s middleware technologies and provide a common abstraction over web-based applications, component-based systems, and legacy systems.

Because the interfaces to data sources, enterprise systems and business partners have been "standardized", the BPMS technology can start developing and act as a coordinator of the information flow, transaction flow and value flow across and beyond the enterprise.

BPMS Architecture

The BPMS architecture has to be optimized to connect to the largest number of business partners, employees and enterprise systems.


The BPMS acts as a bridge between three interfaces. The first two are the Application Service Interface (ASI) and the Business Service Interface (BSI). The ASI wraps all enterprise, legacy, and e-business applications into a set of Web Services. The BSI manages the interactions between executable business processes and the business partners.

The third interface is "browser-based document exchange" for users like you and me. This is the paradox of automation: we can only automate if we are guaranteed that exceptions will be properly raised and resolved, with the appropriate person or business partner employee approving a document or filling in missing information.

The architecture should also support legacy formats such as EDI or flat files. It is probably better to optimize the design of the BPMS for XML, since most messages will use this format. The necessary transformations should be applied to go to and from other data formats, as well as from one XML format to another (RosettaNet to OAGIS...).
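Such a format-to-format transformation can be sketched as a recursive element rename. The tag names below are invented stand-ins rather than actual RosettaNet or OAGIS vocabulary, and a production BPMS would more likely apply an XSLT stylesheet; this only illustrates the shape of the mapping.

```python
import xml.etree.ElementTree as ET

# Sketch of a vocabulary-to-vocabulary transformation. Element names are
# illustrative stand-ins, not real RosettaNet or OAGIS tags.

RENAME = {
    "SourcePurchaseOrder": "TargetPurchaseOrder",
    "GlobalProductId": "ItemId",
}

def translate(elem):
    """Recursively copy a tree, mapping source tags to target tags."""
    out = ET.Element(RENAME.get(elem.tag, elem.tag))
    out.text = elem.text
    for child in elem:
        out.append(translate(child))
    return out

src = ET.fromstring(
    "<SourcePurchaseOrder>"
    "<GlobalProductId>SKU-1</GlobalProductId>"
    "</SourcePurchaseOrder>"
)
dst = translate(src)
```

Real-world mappings also reshape structure and units, which is why a declarative stylesheet usually beats hand-written traversal once the vocabularies diverge.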

Most products currently available on the market follow closely or loosely the model we just presented.
