Computational Science and Engineering
A. SAMEH ET AL.¹
Department of Computer Science, University of Minnesota, Minneapolis, MN ⟨sameh@cs.umn.edu⟩

¹ Participants in the Workshop on Science and Engineering include G. Cybenko (Dartmouth College, Hanover, NH), M. Kalos (Cornell Theory Center, Cornell University, Ithaca, NY), K. Neves (Boeing, Seattle, WA), J. Rice (Department of Computer Sciences, Purdue University, Lafayette, IN), D. Sorensen (Computational and Applied Mathematics, Rice University, Houston, TX), and F. Sullivan (IDA Center for Computing Sciences).
1. WHAT IS CSE?
Computational science and engineering
(CSE) is a multifaceted field; it is not
just the study of computational aspects
of science and engineering disciplines
but also the “science” of scientific computing. Therefore, this field concerns
the whole computational process. It
seeks to advance science and engineering disciplines through better understanding of advanced computers and
computational methods, as well as to advance the state of the art in computer
architecture, system software, and algorithm design through better understanding of science and engineering applications. Thus the slight ambiguity in
the name is actually useful. In essence,
CSE is that interdisciplinary field that
represents the intersection of three domains: applied mathematics, computer
science, and science and engineering
disciplines. Think of CSE as a pyramid
with a square base in which the five
nodes represent the whole computational process. The driving science and
engineering applications sit at the apex, and the four nodes of the base represent
the components of the enabling technology: geometric, numerical, and symbolic
algorithms; system software; computer
architecture; and performance evaluation and analysis.
A few of the driving science and engineering applications are outlined below. They
range from computational biotechnology
to computational structures technology.
A surprising application from another
area that nevertheless has much in
common with those in science and engineering, namely, computational methods in finance, is added to the following
list. Six applications are described
briefly. Almost all of these are quotations from the overviews of articles published in IEEE Computational Science and Engineering, a
magazine whose main purpose is to help
define this interdisciplinary field.
Several web sites provide information regarding research and educational activities in this field; see, for example:
—Problem Solving Environments: http://www.cs.purdue.edu/research/cse/pses/.
—Symbolic Net, the Symbolic Mathematical Computation Information Center: http://symbolicnet.mcs.kent.edu/.
—IEEE CS&E magazine: http://www.computer.org/pubs/cs&e/cs&e.htm.
—CSEP: http://csep1.phy.ornl.gov/csep.html.
Parts of this report are reprinted with permission of the IEEE Computer Society and IEEE Computational Science and Engineering.
1.1 Computational Biotechnology
1.1.1 Atomic Structure of Large Viruses
and Proteins. “Viruses and proteins
are very large (100,000 to 1 million atoms) molecules whose behavior can be
studied from knowing their exact
atomic structure. The determination of
their 3D atomic structure requires a
variety of algorithms for: (a) calculation
of the structure factors from X-ray diffraction images and for solving the
“phase problem” in X-ray crystallography, (b) identification of virus particles
in electron micrographs and 3-D image
reconstruction in electron microscopy,
(c) combining lower resolution data obtained through electron microscopy with
higher resolution electron density maps
obtained through X-ray crystallography,
and (d) molecular dynamics computations able to exploit their high (60 or
120-fold) symmetry. Calculations to determine these structures involve data
sets with several gigabytes and even
with the most powerful computers today
and very efficient algorithms they require hundreds of thousands of processor-hours.”
CORNEA-HASEGAN ET AL. 1995
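For background on point (a) of the quotation above, the structure factor is related to the atomic positions (x_j, y_j, z_j) and atomic scattering factors f_j by the standard crystallographic textbook relation (this formula is general background, not taken from the cited article):

F(h,k,l) = \sum_{j} f_j \, \exp\!\bigl[2\pi i \,(h x_j + k y_j + l z_j)\bigr]

A diffraction experiment records only the intensities |F|^2, so the phases of F are lost; recovering them is the “phase problem” referred to in (a).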
1.1.2 Modeling Biomolecules: Larger
Scales, Longer Durations. “The fast
growth of molecular modeling as a research tool in biology and medicine has
been tightly coupled to the advent of the
supercomputer and to advances in applied and computational mathematics
over the past decade. Three features
characterize the progress made to date:
bigger molecular systems described in
atomic detail, longer simulation time
scales, and more realistic representations of interatomic forces. With these
improvements, molecular modeling by
computer has given us many insights
into the relationship between structure
and function of biopolymers and drugs.
Researchers now find it indispensable
for structure refinement.”
J. BOARD ET AL. 1994

1.2 Computational Structures Technology

“Computational structures technology represents one of the most significant developments in the history of structural engineering. Applying computational methods to structural modeling has transformed much of theoretical structural mechanics and materials science into practical tools that affect all phases of the design, fabrication, and testing of engineering systems. Designers use CST to model materials; predict the response, performance, failure, and life of structures and their components; and automate structural synthesis and design optimization.

CST led the way among computational aerosciences until the 1970s, when the emphasis shifted to other disciplines, particularly computational fluid dynamics. However, we are seeing renewed interest in CST for at least three compelling reasons. First, there are practical problems awaiting solutions, such as a vehicle’s response to crash impact forces or the effects that joints have on the response of a large, flexible structure. Second, we need to reduce our dependence on costly testing. Third, the power of new and emerging high-performance computers makes using them to achieve cost-effective design and operation of future engineering structures particularly alluring.”

A. NOOR AND S. VENNERI 1993
1.3 Computational Electromagnetics
“Maxwell’s partial differential equations of electrodynamics, formulated
about 130 years ago, combined the then-separate concepts of electric and magnetic fields into a unified theory of electromagnetic wave phenomena. Nobel
laureate Richard Feynman has called
this theory the most outstanding
achievement of 19th-century science.
Now, engineers worldwide solve Maxwell’s equations with computers ranging from simple desktop machines to
massively parallel supercomputing arrays to investigate electromagnetic
wave guiding, radiation, and scattering.
As we approach the 21st century, it may
seem a little odd to devote so much
effort to solving the 19th century’s best
equations. Thus, I pose the question: Of
what relevance are Maxwell’s equations
to modern society?
Until 1990, the answer to this question would almost certainly have related
to the perceived need for a strong military defense. Solutions of Maxwell’s
equations for wave phenomena in this
era were driven primarily by defense
requirements for aerospace vehicles
having low radar cross sections.”
“The models discussed here, developed in my laboratory, represent a cross
section of the wide-ranging applications
of FD-TD computational electromagnetics. These include
—radar scattering by a jet fighter up to
1 GHz;
—radiation by wideband antennas and
phased arrays;
—UHF, microwave, and optical interactions with human tissues for planning
heat treatment of cancer and for
studying how the retina works;
—crosstalk and ground-loop coupling in
complex, multilayer circuit boards
and connectors;
—analog and digital operation of transistor and logic circuits operating well
above 250 MHz; and
—subpicosecond optical switches based
on non-linear interactions of self-focussed laser beams.”
A. TAFLOVE 1995
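To make the FD-TD method named above concrete, the sketch below advances a one-dimensional version of Maxwell's curl equations on a staggered grid with the classic leapfrog update. It is a minimal, normalized illustration written for this report, not code from Taflove's article; the grid size, update coefficient, and source pulse are arbitrary choices.

# Minimal 1-D FD-TD (Yee) update for Maxwell's equations in free space.
# Fields are in normalized units; all constants are illustrative only.
import math

nz, nt = 200, 500              # spatial cells, time steps
ez = [0.0] * nz                # electric field samples
hy = [0.0] * nz                # magnetic field samples

for t in range(nt):
    # advance H from the spatial difference of E
    for k in range(nz - 1):
        hy[k] += 0.5 * (ez[k + 1] - ez[k])
    # advance E from the spatial difference of H
    for k in range(1, nz):
        ez[k] += 0.5 * (hy[k] - hy[k - 1])
    # inject a Gaussian pulse at the center of the grid
    ez[nz // 2] += math.exp(-((t - 40.0) / 12.0) ** 2)

Production FD-TD codes differ mainly in dimensionality, absorbing boundary conditions, and material models, but this leapfrog exchange between the electric and magnetic fields is the core of the method.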
1.4 Computational Medicine
“Over the past two decades, the techniques of computer modeling and simulation have become increasingly important to the fields of bioengineering and
medicine. The reasons for this are numerous. As in other areas of science and
engineering, mathematical modeling
lets biomedical researchers subject increasingly complex hypotheses to quantitative examination. Furthermore, the
sophistication of computer hardware
configurations, particularly increases in
memory capacity and CPU speed, have
produced a parallel increase in the level
of biological complexity that realistically can be modeled.
Although biological complexity outstrips the capabilities of even the largest computational systems, and will for
some time to come, the computational
methodology has taken hold in biology
and medicine and has been used successfully to suggest physiologically and
clinically important scenarios and results.”
C. JOHNSON AND M. MATHESON 1993
1.5 Interactive Scientific Visualization of
Fluid Flow
“Researchers in the field of scientific
computation have the extreme good fortune to see their capabilities increase by
more than an order of magnitude each
decade. Few other areas of scientific
endeavor are advancing so rapidly.
Two decades ago, when fluid dynamicists were simulating flows in one dimension, a series of line plots on a simple graph sufficed to communicate the
experiments’ results. Since the simulations commonly used grids of only one
or two hundred computational zones,
researchers could easily peruse the raw
numeric output while the computer generated the plots.
Computational fluid dynamicists now
routinely perform experiments that rival the complexity of laboratory experiments, so it should be no surprise that
they use similar means to view their
results. Inspecting the 650 million or so
numbers representing a snapshot of a
3D flow computation on a grid with 512
computational zones in each dimension
is simply out of the question.
Like the laboratory experimenters, today’s computational fluid dynamicists
use visualization techniques to see their
results and to suggest useful comparisons with analytic theory.”
P. WOODWARD 1993
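The 650 million figure quoted above is easy to reconstruct: a grid with 512 zones in each dimension has 512^3, or about 1.3 x 10^8, zones, and storing roughly five flow variables per zone (density, pressure, and three velocity components, our assumption rather than Woodward's) gives

5 \times 512^3 \approx 6.7 \times 10^8

numbers per snapshot.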
1.6 Computational Finance
“Finance theory is one of the most active fields of research in modern applied
mathematics. Those who enter it are
sometimes surprised by the number of
problems in finance that, in their abstract form, seem to be lifted straight
out of the concepts or methods of physics and other scientific disciplines. Over
the last 20 years computational finance
has attracted many mathematicians,
physicists, and engineers to work on a
host of interesting and unsettled theoretical issues related primarily to asset
evaluation.”
E. BARUCCI ET AL. 1996
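To make the connection to computation concrete, the sketch below prices a European call option by Monte Carlo simulation of a lognormal asset path, one of the standard textbook calculations in this field. It is an illustrative example only, with arbitrary parameter values, and is not drawn from the Barucci et al. article.

# Monte Carlo pricing of a European call under Black-Scholes assumptions.
# Parameter values are arbitrary illustrative choices.
import math
import random

def mc_call_price(s0, strike, rate, vol, maturity, n_paths=100000):
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)
        # risk-neutral terminal asset price after one lognormal step
        st = s0 * math.exp((rate - 0.5 * vol ** 2) * maturity
                           + vol * math.sqrt(maturity) * z)
        payoff_sum += max(st - strike, 0.0)
    # discount the average payoff back to the present
    return math.exp(-rate * maturity) * payoff_sum / n_paths

print(mc_call_price(s0=100.0, strike=105.0, rate=0.05, vol=0.2, maturity=1.0))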
2. WHY SHOULD CSE EXIST?
Computation now pervades all of the
scientific and engineering fields. Yet
groups in various science and engineering disciplines tend to operate independently in ignorance of each other’s activities. In many cases, there is extensive
duplication of effort in algorithm development. Worse, a superior method can
exist in one discipline but be unknown
in another. This is a result of the “self-teaching within a discipline” philosophy
and the tendency to develop computing
techniques in isolation. In addition to
duplication, this approach fosters inconsistent and conflicting notation for similar objects and concepts. Compounding this unfortunate situation,
most students in computer science are
rarely encouraged (or forced) to learn
about other science and engineering disciplines. In fact, most of them avoid
learning much about applied mathematics and numerical methods. Since major
science and engineering problems cannot be solved without innovative computational methods, this gap in preparing
graduate students must be bridged by a
structured program in CSE. Moreover,
computation is the natural medium for
quantitative interdisciplinary research,
and CSE offers a means to greater understanding of the connections between
many scientific and engineering disciplines and computer science.
3. WHAT ARE CSE’S CHALLENGES?
With the advent of practical scalable
computing, CSE is undergoing a revolution. Computing resources have increased by three orders of magnitude in
speed and memory during the past decade. With such advances, computational techniques have begun to attain
their longstanding promise of permeating all science and engineering disciplines. This revolution, however, poses
significant challenges. One challenge is
that, in spite of roughly two decades of
research in parallel computing, the
rapid developments of the last few years
have confronted the community with
different and evolving architectures and
computational models. In addition, users have had to cope with immature
software at every level and, in many
cases, the need to program at a lower
level than had been necessary for many
years. Another major issue is the question of “large-scale” computing versus
“large-scope” computing. Whereas large-scale means increasing the resolution of
the solution to a fixed physical model
problem, large-scope means increasing
the physical complexity of the model
itself. Increasing the scope involves
adding more physical realism to the
simulation, making the actual code
more complex and heterogeneous, while
keeping the resolution more or less constant.
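In rough, invented code (the model and solver names below are hypothetical, not anything prescribed by the workshop), the contrast looks like this: a large-scale run keeps one physical model and refines its grid, while a large-scope run keeps the grid fixed and couples in more physics.

# Hypothetical sketch contrasting the two directions of growth.
class Model:
    """Stand-in for one physics component; invented for illustration."""
    def __init__(self, name):
        self.name = name

    def solve(self, resolution, coupled_state=None):
        # a real solver would do work that grows rapidly with resolution
        return {self.name: resolution}

def large_scale_run(model, base_resolution):
    # large scale: the same physics on progressively finer grids
    for level in range(4):
        model.solve(resolution=base_resolution * 2 ** level)

def large_scope_run(models, resolution):
    # large scope: a fixed grid, but more coupled physics per step,
    # which makes the code more heterogeneous and harder to optimize
    state = {}
    for m in models:
        state.update(m.solve(resolution=resolution, coupled_state=state))
    return state

large_scale_run(Model("fluid"), base_resolution=64)
print(large_scope_run([Model("fluid"), Model("structure"), Model("thermal")], 64))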
The emphasis over the last few years
on massive parallelism and teraflops
computing has focused some researchers on large-scale computing. Some took
“large-scale” literally and scaled up the
data sets needlessly just to achieve high
computational rates. (Admittedly, this
is a simplification to make a point but
it’s not too far off the mark.) Going
further, the recent collapse of the massively parallel computer market can be
largely attributed to the fact that commercial customers have not been impressed by overzealous large-scale computing and seek large-scope computing
solutions instead. There are some fundamentally important large-scale problems and we need machines to solve
them, but this alone is not enough reason to deflect a whole industry and a
large part of the academic research
community.
The technological opportunities are
too great to ignore. We now have microprocessors that deliver a major fraction
of a gigaflops each plus memory cheap
enough to make possible large systems
with scores of gigabytes of random access memory along with terabytes of
attached disk storage. It is memory as
much as processing power that has
made possible many of the important
and striking new application successes
in the last few years. A consequence is
that scalably parallel computing has become a primary source of high-performance computing before it was quite
ready. The challenge remains to make
this technology readily accessible to the
entire scientific computing community.
As we move into the 21st century,
large scope is where the action can increasingly be found. It is important to
note that large-scope problems are not
helped much by asymptotically efficient
theoretical parallel algorithms or simple parallel language extensions. Four
application areas can serve as examples: microscale electromechanical systems (MEMS), large mechanical systems (e.g., aircraft, automobiles, and
robots), natural environments, and data
mining. In each of these areas, modeling
and simulating a broad assortment of a
system’s dynamics and properties are
required.
MEMS involve electromechanical devices at the micron scale. They are fabricated using semiconductor manufacturing processes. At the micron level,
things like friction, wear (tribological
degradation), surface tension from
moisture, structural flexibility, and
electromagnetic forces contribute to the
operation of a device in ways quite different from normal scales. A meaningful
simulation of a MEMS device requires
modeling most if not all such influences:
taking a single aspect of the model and
performing high-resolution simulations
does not suffice. An even larger and
more critical issue, however, is that of
determining the physical models reliably. This situation illustrates the close
interaction that must occur among
physical experimentation, physical modeling, and computational science. When
experiments and simulation disagree
(as they will in frontier areas like
MEMS), scientists can hypothesize new
models and then ask experimentalists
and computational scientists to work together to test them. As the agreement
between experiment and computation
improves, these models will move from
being hypothetical to being working
models to becoming the “standard.”
This is happening also in the arena of
large mechanical systems such as aircraft. The design approach for the Boeing 777 is a sign of things to come,
incorporating in the model all aspects of
the whole system before manufacturing—aerodynamic, structural, thermal,
electrical, and control systems—using
the most advanced software tools available. All automobile makers see this as
a goal in the next few years and are
taking steps to achieve it. Even though
every Boeing plane is 100 percent custom-made, Boeing is trying to standardize options. The design process for the
Boeing 777 generated a vast amount of
data that included not only manufacturing plans but also parts and maintenance data. This, in turn, required expertise in scalability, performance, and
modeling in the data area, not just in
the computer performance area.
Better simulation of an environmental system also involves modeling more
of the dynamics that govern it such as
weather, water flow, terrain, flora,
fauna, chemical reactions, and biological interactions. Global climate modeling, for instance, can get only so far by
increasing resolution. At some point,
and perhaps we are already there, the
models must become more realistic and
must include more elements of the natural world.
Data mining likewise involves a combination of different computational expertise. Data mining generally refers to
the process of building models of realworld systems from very large databases of transaction data. Such models
might just be simple relationships (e.g.,
high school graduates from suburbs who
enroll in liberal arts colleges) or intricate relationships such as those that
occur in the flow of money in a small
town. Take, for example, the problem of
categorizing consumers so as to identify
buying trends and use them in marketing studies. The scope of this problem
includes techniques from the management of very large databases, efficient
access methods, dealing with incomplete or noisy data, pattern analysis,
nonlinear statistical modeling, and visualization of large data sets. Similar
problems arise in determining trends or
disparities in health care at the national level from digital patient records.
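As one small, concrete instance of the pattern-analysis step mentioned above, the sketch below groups customers described by numeric features with a plain k-means clustering loop. The choice of method and the tiny data set are our own illustrative assumptions; the report does not prescribe any particular technique.

# Minimal k-means sketch for grouping customers by numeric features
# (illustrative only; real systems add scaling, validation, and I/O).
import random

def kmeans(points, k, iters=20):
    centers = random.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # move each center to the mean of its cluster
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = tuple(sum(col) / len(cl) for col in zip(*cl))
    return centers

# e.g., customers described by (age, annual_spend) pairs
segments = kmeans([(23, 310.0), (67, 820.0), (35, 400.0), (71, 900.0)], k=2)
print(segments)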
The bad news is that what has been
good for large-scale computing may not
be very good for large-scope computing.
Performance will likely decrease as we
add more physical realism to a simulation because the computation becomes
less homogeneous and harder to optimize. Unfortunately, many of the tools,
technologies, and advances in high-performance computing of the past few
years work best for uniform, structured, and large-data/small-code-complexity programs.
If this large-scope computing trend is
real, the computational science and engineering community needs to rally. We
need to devote time and energy to the
computational challenges arising from
increasing the scope of models and the
complexity of scientific programs. This
will require a new approach towards
computing research wherein the goal is
to render realistic, meaningful simulations of complex real-world phenomena
and not just focus on rarefied performance metrics and isolated algorithms.
A number of existing and innovative
developments need to be encouraged
and accelerated. The first, and most
challenging, is to find novel algorithmic
approaches to computational modeling
that better match existing and future
architectures. These would include
higher-order methods, adaptive schemes,
polyalgorithms, as well as Monte Carlo,
particle, and cellular automata methods, and no doubt still newer and more
radical ideas. New languages and programming paradigms better suited for
parallel computing must be brought to
the state where their practical and efficient use is straightforward. Standards
must be defined and serious software of
commercial strength created as quickly
as possible. Undoubtedly, still more creative ideas will be needed here as well.
Finally, we must seek to advance the
uses of computational science in hitherto underrepresented areas.
4. WHAT IS THE ROLE OF COMPUTER
SCIENCE IN CSE?
Although the development of CSE is not
strictly tied to the fortunes of parallel
computing and the parallel-computing
industry, its development as an interdisciplinary field is certainly dependent
on the availability of powerful computational engines. Recent turbulence in the
industry represents only necessary corrections in the evolution of a tool that
has changed the process of scientific
investigation in a fundamental way. It
is important to emphasize that significant computational advances in science
and engineering disciplines will not be
realized without innovations in computers and the computational process.
In addition, there is a need to educate
the scientific computing community
about advances in the design of algorithms, especially nonnumerical ones,
that are effective for scientific computing problems. Although there certainly
are examples of nonnumeric algorithms
that have had a dramatic effect on scientific applications, the time required
for this to happen appears to be much
greater than in the numeric case. Moreover, many important ideas have never
been adopted in scientific computing.
Indeed, methods found in undergraduate algorithm textbooks, such as balanced trees, spanning trees, depth-first search, and skip lists, are virtually unknown in scientific computing.
A more direct involvement of computer
scientists with application work would
help to alleviate this problem.
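To illustrate how one of the textbook methods named above can serve a scientific code, the sketch below uses an iterative depth-first search to find the disconnected pieces of a mesh described by a cell-adjacency map. The setting and the tiny mesh are invented for illustration.

# Depth-first search applied to a nonnumeric task that arises in
# scientific codes: finding disconnected fragments of a computational mesh.
def connected_components(adjacency):
    """adjacency: dict mapping a cell id to the ids of its neighbors."""
    seen, components = set(), []
    for start in adjacency:
        if start in seen:
            continue
        stack, comp = [start], []
        while stack:                      # iterative DFS
            cell = stack.pop()
            if cell in seen:
                continue
            seen.add(cell)
            comp.append(cell)
            stack.extend(adjacency[cell])
        components.append(comp)
    return components

# two separate mesh fragments -> two components
mesh = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
print(connected_components(mesh))   # [[0, 1, 2], [3, 4]]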
In spite of ample evidence to the contrary, most physical scientists do not
think of the nonnumeric aspects of algorithm design as important in speeding
up the execution of codes. At the same
time, many computer science researchers do not think of scientific computing
as a customer for their products. An
aggravating factor contributing to the
underappreciation of nonnumeric algorithms can be traced back to the current
style in theoretical computer science literature. The only way that computer
science researchers can change the
thinking of physical scientists is by generating and promulgating useful results. Specialists in algorithm design
can make an enormous contribution to
scientific computing. However, in order
to do so, they will have to change how
they do business and even go out of
their way to learn about scientific applications. If even a relatively small group
of computer scientists is willing to make
this effort, both theory and applications
will flourish.
Computer science can provide the enabling technology essential to the design and manufacture of robots and
other complex physical objects. Examples of such enabling technology components are:
—interactive compilers,
—parallel programming languages,
—computer graphics and scientific visualization,
—methodologies for performance evaluation,
—distributed operating systems,
—innovative computer architectures,
—database management,
—geographical information systems,
—parallel I/O,
—data structures and algorithms for
adaptive grid generation,
—real-time systems (robotics, speech
analysis, computer vision),
—user interfaces, and
—problem-solving environments.
Bill Wulf comments that “computer
science in the past has provided structure without regards to applications or
content; it is time for the field to include
content with structure.” CSE is an area
that naturally provides such content
and also gives feedback and poses new
challenges to computer science.
More on the various driving application areas of computational science and
engineering and the components of the
enabling technology may be found in the
following selected references.
REFERENCES
BARUCCI, E., LANDI, L., AND CHERUBINI, V. 1996.
Computational methods in finance: Option
pricing. IEEE Comput. Sci. Eng. (Spring), 66 –
80.
BERRY, M. ET AL. 1996. Lucas: A system for
modeling land-use change. IEEE Comput. Sci.
Eng. (Spring), 24 –35.
BOARD, J., KALÉ, L., SCHULTEN, K., SKEEL, R., AND SCHLICK, T. 1994. Modeling biomolecules: Larger scales, longer durations. IEEE Comput. Sci. Eng. (Winter), 19–30.
BROWNE, S., ET AL. 1995. The National HPCC
Software Exchange. IEEE Comput. Sci. Eng.
(Summer), 62– 69.
COE, T., ET AL. 1995. Computational aspects of
the pentium affair. IEEE Comput. Sci. Eng.
(Spring), 18 –31.
CORNEA-HASEGAN ET AL. 1995. Phase refinement and extension by means of non-crystallographic symmetry averaging using parallel
computers. Acta Crystallogr. D51, 749 –759.
CRUTCHER, R. 1994. Imaging the universe at radio wavelengths. IEEE Comput. Sci. Eng.
(Summer), 39 – 49.
DONGARRA, J., GROSSE, E., PATT, Y., CHANDY, K. M., AND MURAOKA, Y. 1996. Taking stock, looking ahead: Five essays. IEEE Comput. Sci. Eng. (Summer), 38–45.
GALLOPOULOS, E., HOUSTIS, E., AND RICE, J. 1994.
Computer as thinker/doer: Problem-solving
environments for computational science.
IEEE Comput. Sci. Eng. (Summer), 39 – 49.
GOUDREAU, G. 1994. Computational structural
mechanics: From national defense to national
resource. IEEE Comput. Sci. Eng. (Spring),
33– 42.
IEEE 1994a. NSF panel charts future of HPC.
IEEE Comput. Sci. Eng. (Spring), 79 – 81.
IEEE 1994b. Site report: The Army High Performance Computing Research Center. IEEE
Comput. Sci. Eng. (Summer), 6 – 8.
IEEE 1994c. Site report: Cornell Theory Center. IEEE Comput. Sci. Eng. (Winter), 10 –13.
IEEE 1994d. Site report: San Diego Supercomputer Center. IEEE Comput. Sci. Eng. (Fall),
10 –14.
IEEE 1995a. Site report: ICASE: NASA Langley’s CSE Center. IEEE Comput. Sci. Eng.
(Spring), 6 –14.
IEEE 1995b. Site report: Sandia National Laboratories. IEEE Comput. Sci. Eng. (Summer),
10 –15.
IEEE 1996a. Interview with David Kuck: What
is good parallel performance and how do we get
it? IEEE Comput. Sci. Eng. (Spring), 81– 85.
IEEE 1996b. Site report: Pittsburgh Supercomputing Center, IEEE Comput. Sci. Eng.
(Spring), 8 –12.
JOHNSON, C. ED. 1995. Theme issue on: Computational inverse problems in medicine. IEEE
Comput. Sci. Eng. (Winter), 42–77.
JOHNSON, C., AND MATHESON, M. 1993. Computational medicine: Bioelectric field problems. IEEE Computer (Oct.), 59–67.
KAHANER, D. 1994. Stiff competition: HPC in
Japan. IEEE Comput. Sci. Eng. (Summer),
84 – 86.
KING, S. 1995. A numerical journey to the
Earth’s interior. IEEE Comput. Sci. Eng.
(Fall), 12–23.
MARCHIARO, T., ET AL. 1995. UCES: An undergraduate CSE initiative. IEEE Comput. Sci. Eng. (Fall), 69–73.
NOOR, A., AND VENNERI, S. 1993. A perspective on computational structures technology. IEEE Computer (Oct.), 38–46.
PANCAKE, C. 1996. Is parallelism for you? IEEE
Comput. Sci. Eng. (Summer), 18 –37.
RICE, J. 1994. Academic programs in computational science and engineering. IEEE Comput.
Sci. Eng. (Spring), 13–21.
RICE, J. 1995. Computational science and the
future of computing research. IEEE Comput.
Sci. Eng. (Winter), 35– 41.
SAMEH, A., AND RIGANATI, J. 1993. Computational science and engineering. IEEE Computer (Oct.), 8–12.
STONEBRAKER, M. 1994. Sequoia 2000: A reflection on the first three years. IEEE Comput.
Sci. Eng. (Winter), 63–72.
TAFLOVE, A. 1995. Reinventing electromagnetics: Emerging applications for FD-TD computation. IEEE Comput. Sci. Eng. (Winter), 24 –
34.
VOLAKIS, J., AND KEMPEL, L. 1995. Electromagnetics: Computational methods and considerations. IEEE Comput. Sci. Eng. (Spring),
42–57.
WARREN, J. 1995. How does a metal freeze? A
phase-field model of alloy solidification. IEEE
Comput. Sci. Eng. (Summer), 38 – 49.
WOODWARD, P. 1993. Interactive scientific visualization of fluid flow. IEEE Computer (Oct.), 13–25.
WILSON, G. 1996. What should computer scientists teach to physical scientists and engineers? IEEE Comput. Sci. Eng. (Summer),
46 –54. (See also responses by R. Landau and
by S. McConnell.)