Dr. Lixin Tao
Pace University, http://csis.pace.edu/~lixin
We are facing real challenges in computing education. Enrollment in many computing programs has shrunk over the last three years, and people worry about computing job opportunities and the impact of outsourcing computing jobs. At the same time, many IT companies cannot easily find graduates with the proper qualifications and must invest significant time in educating and training new hires. For computing education to help promote a competitive US IT industry, we must answer three questions: what are the fundamental new technologies in the IT industry over the last decade; what is their academic value; and, if they have academic value, how can we integrate their essence and basic principles into our computing curricula?
1 Fundamental New IT Technologies over the Last Decade
While there are many competing computing platforms and technologies, often buried under a huge volume of misleading buzzwords and commercial pitches, we find that they share many fundamental concepts, principles, and structures. They differ from pre-1990s IT technologies in many ways; the following are two major ones.
1. Component-based software engineering. We have gone beyond the object-oriented (OO) computing paradigm. While OO is still the cornerstone for building in-house software, where we have access to and control of the source code, the latest industry practice is to let specialists develop reusable software components; it is normally much more cost-effective to buy these COTS (commercial off-the-shelf) components and integrate them with in-house code to develop a new application than to reinvent the wheel. In its elementary form, a software component is an instance of a unit of specially structured and customizable code, normally in binary form, that has well-defined public interfaces and can be individually deployed and dynamically shared by multiple applications (a minimal sketch of the interface idea follows this point). More mature components may support reflection, networking, component discovery, and learning. Components are computing abstractions at a higher level than objects. Their development starts with clear interface specifications, and their implementations can be based on various algorithms, languages, and technologies. This reflects the fact that no single computing platform or language can dominate the IT industry. Since 1995, the US Department of Defense has mandated that all of its major software projects be based on software component technology so that no single company can monopolize their production and maintenance. Almost all of the prevailing computing technologies, including J2EE (Java 2 Enterprise Edition) and Microsoft's .NET, are based on software components. If the public interfaces of major software components are standardized by national or international industry consortia, the software component approach can improve the IT industry's competitiveness and promote specialized and collaborative computing.
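As a minimal illustration of this idea in Java, with a hypothetical SpellChecker component, a client programs against a well-defined public interface and never against the implementing class, which can be delivered in binary form (e.g., a .jar file) and shared by multiple applications:

```java
// A hypothetical component interface: this is all that client
// applications ever see. Any vendor's implementation can ship as
// compiled code and be swapped in without changing the clients.
public interface SpellChecker {
    boolean isCorrect(String word);
    java.util.List<String> suggest(String word);
}
```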
2. Server-based computing. Server-based computing is now ubiquitous; Web-server-based e-commerce is its most visible example. The Web browser is becoming a universal user interface through which clients obtain computing services from remote service providers. But Web computing is only one special case of server-based computing, and Service-Oriented Architecture (SOA) is one of the emerging server-based technologies that provide a higher-level abstraction of computing than software components. Server-based computing is characterized by multiple concurrent computation threads serving remote clients simultaneously (a minimal Java sketch of this trait follows this point); the integration of distributed software components and services to benefit from specialized computing and load balancing; distributed transactions, which abstract multiple distributed operations into a single atomic one; and software-framework-based tiered implementation, which lets application developers focus on algorithms and business logic instead of being overwhelmed by server infrastructure details. Server-based services are sometimes loosely coupled by events or messages. Server-based computing is the foundation of distributed enterprise system integration, while XML, a standardized neutral language, is becoming the foundation for enterprise data integration. Grid computing further virtualizes and integrates server-based computing and data into a single large virtual computer.
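The defining trait above, multiple concurrent threads serving remote clients, can be shown in a few lines of Java. This is a minimal sketch, not production code; the port number, the pool size, and the trivial echo service are illustrative assumptions:

```java
import java.io.*;
import java.net.*;
import java.util.concurrent.*;

// A minimal server-based computing sketch: a pool of threads serves
// several remote clients at the same time.
public class EchoServer {
    public static void main(String[] args) throws IOException {
        ExecutorService workers = Executors.newFixedThreadPool(8);
        try (ServerSocket server = new ServerSocket(8080)) {
            while (true) {
                Socket client = server.accept();      // wait for the next client
                workers.submit(() -> serve(client));  // serve it on a pool thread
            }
        }
    }

    private static void serve(Socket client) {
        try (Socket c = client;
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(c.getInputStream()));
             PrintWriter out = new PrintWriter(c.getOutputStream(), true)) {
            for (String line; (line = in.readLine()) != null; )
                out.println(line);                    // echo each request back
        } catch (IOException e) {
            // a real server would log and recover here
        }
    }
}
```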
2 Academic Value of Software Components and Server-Based Computing
1. While objects abstract the data, operations, and common computation of applications from the same producer, software components also abstract implementation languages and common computation across multiple applications from different producers; they delay code binding; and they support strict separation of interfaces from their implementations (a sketch of delayed binding follows this point). Server-based services further abstract the server infrastructure, data, and expertise, as well as the server site's location. Objects, components, and server-based services all serve the purpose of controlling computation complexity, but at different grain sizes and levels. Typically, a software component is made of objects, and a server-based service is implemented by integrating multiple local and remote software components or services.
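Delayed code binding can be made concrete with Java reflection. Reusing the SpellChecker interface sketched earlier, the client below is compiled against the interface only; the vendor class name com.vendor.FastChecker is hypothetical and would typically come from a configuration file:

```java
public class ComponentClient {
    public static void main(String[] args) throws Exception {
        // The concrete class is chosen by name at run time, so a
        // different producer's component can be substituted without
        // recompiling this client.
        SpellChecker checker = (SpellChecker) Class
                .forName("com.vendor.FastChecker")   // hypothetical vendor class
                .getDeclaredConstructor()
                .newInstance();
        System.out.println(checker.suggest("curricula"));
    }
}
```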
2. Software components normally function inside component containers, which abstract away most infrastructure issues, such as multithreading, networking, component life cycle (when components are created, activated, deactivated, or recycled), and configuration, so that component authors can focus solely on algorithms or business logic. This is a good application of the problem-solving strategies of abstraction and divide-and-conquer (a toy container is sketched after this point).
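The life-cycle idea can be suggested with a toy container in Java. The LifeCycle interface and the pooling policy below are illustrative assumptions, not any real container's API:

```java
import java.util.*;

// A toy container: it owns component creation, activation, deactivation,
// and recycling, so component authors write only business logic.
interface LifeCycle {
    void activate();     // called by the container before each use
    void deactivate();   // called when the container idles the component
}

class Container {
    private final Deque<LifeCycle> pool = new ArrayDeque<>();

    LifeCycle acquire(Class<? extends LifeCycle> type) throws Exception {
        LifeCycle c = pool.isEmpty()
                ? type.getDeclaredConstructor().newInstance() // create on demand
                : pool.pop();                                 // or recycle one
        c.activate();
        return c;
    }

    void release(LifeCycle c) {
        c.deactivate();
        pool.push(c);    // keep the instance for reuse instead of discarding it
    }
}
```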
3. Most of today's applications use a GUI (graphical user interface) to improve the efficiency of human-computer communication, and multithreading is a necessity for keeping a GUI-based application responsive. But GUI programming is based on a more fundamental computing paradigm, event-driven programming, which is critical for loosely coupling computations that may be implemented in different languages by different authors and run at different sites. Multithreaded programming can be viewed as another fundamental computing paradigm, since it allows us to model more realistically a world that is intrinsically concurrent. Sequential computing, OO computing, event-driven computing, and multithreaded computing are fundamental to all aspects of today's computing technologies, including the server-based ones; and they are needed from year one of many undergraduate computing programs, both to form the student mind-set needed for innovation and to make programming exercises more engaging (the sketch after this point combines these two paradigms).
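The following minimal Java sketch combines the two paradigms: listeners register for events (event-driven, loose coupling between producer and consumers) and are notified on a separate thread (multithreading), so the publisher never blocks. All names here are illustrative:

```java
import java.util.*;
import java.util.concurrent.*;

public class EventDemo {
    interface Listener { void onEvent(String event); }   // the event contract

    private final List<Listener> listeners = new CopyOnWriteArrayList<>();
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    void subscribe(Listener l) { listeners.add(l); }

    void publish(String event) {
        // Dispatch asynchronously: the publisher neither knows nor
        // waits for its subscribers.
        worker.submit(() -> listeners.forEach(l -> l.onEvent(event)));
    }

    public static void main(String[] args) {
        EventDemo bus = new EventDemo();
        bus.subscribe(e -> System.out.println("received: " + e));
        bus.publish("button-clicked");
        bus.worker.shutdown();   // let the pending dispatch finish, then exit
    }
}
```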
3 Computing Curricula Reality Check
ACM Computing Curriculum 2001 (CC2001) acknowledged the technical changes and the increased importance of "the Web and its application, networking, interoperability, and the use of APIs." It also identified "net-centric computing" as a new knowledge area (though it mainly used the term for traditional client-server technologies, with little attention to peer-to-peer computing) and "component-based computing" as an optional knowledge topic. CC2001 assigned 4 core hours to "event-driven programming", 2 hours to "introduction to net-centric computing", and 3 hours to "the Web as an example of client-server computing". The ACM Software Engineering 2004 Curriculum Guidelines listed "middleware (components and containers)" (one of 14 topics sharing a total of 20 hours) and "peer-to-peer, publish-subscribe, event-based, client-server" (less than one hour) as knowledge topics, and listed "net-centric systems" as one of its optional senior specialties. At this time, server-based computing technologies, except for Web servers from the IT perspective, have not been properly integrated into computing curricula, even though they have great impact on multiple computing disciplines, including CS, SE, IS, and IT. Few faculty members are aware of application servers, the core infrastructure and abstraction software for server-based computing. Multithreading is still treated only as an OS topic. Network programming is normally covered only marginally, in a traditional Data Communications or Computer Networks course in the third or fourth year. Most computing faculty members view the new topics as incremental knowledge extensions to pre-1990 courses, and expect students to pick up the "mundane" industry technologies by reading product manuals.
4 Integration Strategy
Server-based computing platforms are best viewed as the generalization and decentralization of traditional operating systems, and many of the key technologies developed over the last decade are about controlling the extra complexity of decentralized computing platforms and supporting better server-based computing performance. The adoption of modern programming languages like Java as the first programming language by many computing programs has paved the way for a more holistic solution to the problem of integrating software component and server-based computing technologies into computing curricula, though many institutions still teach only the C++ counterparts of the relevant Java material.

While many colleagues find the teaching of object-oriented programming a challenge, the best way to attract and retain students in computing disciplines is to excite and motivate them with real-world challenges, such as developing their own client-server applications and networked computer games, during the first two years of their college study. This is possible and practical because the layered abstractions provided by the new technologies mentioned in this article enable us to take a top-down approach to teaching networking, event-driven programming, and multithreading. For example, with a simple software framework consisting of three classes for an abstract server and an abstract client (sketched below), we can let students in the second or third programming course design and implement a simple FTP server, a simple Web server, a simple Web browser, or a simple networked talk utility. This has been confirmed by our experiment at Pace University. The key to the success of this integration process is the faculty's attitude toward, and understanding of, the current mainstream computing technologies.
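The Java sketch below suggests what such a three-class framework might look like; the class and method names are our assumptions for illustration, not the exact framework used at Pace University. Students subclass the two abstract classes and override only the protocol logic:

```java
import java.io.*;
import java.net.*;

// AbstractServer hides socket handling and threading; a subclass
// supplies only the per-session protocol logic.
abstract class AbstractServer {
    private final int port;
    protected AbstractServer(int port) { this.port = port; }

    public void start() throws IOException {
        try (ServerSocket server = new ServerSocket(port)) {
            while (true) {
                Socket s = server.accept();
                new Thread(() -> {                    // one thread per client
                    try (Socket client = s) { handleSession(client); }
                    catch (IOException ignored) { }
                }).start();
            }
        }
    }

    // the one method a student must write, e.g. a tiny HTTP or FTP dialect
    protected abstract void handleSession(Socket client) throws IOException;
}

// AbstractClient hides connection setup in the same way.
abstract class AbstractClient {
    public void connect(String host, int port) throws IOException {
        try (Socket s = new Socket(host, port)) { converse(s); }
    }
    protected abstract void converse(Socket server) throws IOException;
}

// A sample exercise: a "talk" server that prefixes and echoes each line.
class TalkServer extends AbstractServer {
    TalkServer() { super(9000); }                     // illustrative port

    @Override protected void handleSession(Socket c) throws IOException {
        BufferedReader in = new BufferedReader(
                new InputStreamReader(c.getInputStream()));
        PrintWriter out = new PrintWriter(c.getOutputStream(), true);
        for (String line; (line = in.readLine()) != null; )
            out.println("peer: " + line);
    }
}
```

With this scaffolding, a student's entire main program can be as short as new TalkServer().start().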