McMail6Nov2007 - Oregon State University

Next Generation Research
and Breakthrough Innovation
Indicators from U.S. Academic Research 2007
Tom McMail
Microsoft Research
External Research Group
Table of Contents
Introduction ......................................................................................................................................................................................3
Background .............................................................................................................................................................................................................................................. 3
Changing the World ................................................................................................................................................................................................................................. 3
A new era of computing....................................................................................................................................................................................................................... 3
Predicting the most important new directions for research ................................................................................................................................................................. 3
The Worldwide Research Outlook .....................................................................................................................................................3
Everywhere, innovation of all kinds is driving productivity ...................................................................................................................................................................... 4
Globalization and IT are accelerating both competition and innovation ................................................................................................................................................. 4
Global businesses and open innovation ................................................................................................................................................................................................... 4
Indicators from Government and Academia ......................................................................................................................................5
Following specific funding trends ............................................................................................................................................................................................................. 5
What does the funding data actually reflect? .......................................................................................................................................................................................... 5
Conversations with academic leaders ...................................................................................................................................................................................................... 5
Alignment of funding priorities ................................................................................................................................................................................................................ 6
Academic hiring trends ............................................................................................................................................................................................................................ 6
Transcending Current Architectures ..................................................................................................................................................7
Creating manycore architectures of the future ........................................................................................................................................................................................ 7
Will other technologies surpass manycore? ............................................................................................................................................................................................ 7
Industry and academic perspectives on manycore .................................................................................................................................................................................. 7
Manycore chip architectures ................................................................................................................................................................................................................... 7
A unique combination of strategies from Berkeley .................................................................................................................................................................................. 8
Other architectural approaches ............................................................................................................................................................................................................... 8
Another look at parallel computing research challenges ......................................................................................................................................................................... 9
Quantum Computers ............................................................................................................................................................................................................................... 9
Bio-computing and nano-computing ....................................................................................................................................................................................................... 9
Next generation internet architectures ................................................................................................................................................................................................... 9
Multidisciplinary and Breakthrough Research ................................................................................................................................. 10
Importance of multidisciplinary research .............................................................................................................................................................................................. 10
Multidisciplinary funding activity to watch ............................................................................................................................................................................................ 10
Promising design approaches to multidisciplinary research .................................................................................................................................................................. 10
Changing collaboration models .............................................................................................................................................................................................................. 11
Collaboration with the social sciences ................................................................................................................................................................................................... 11
Implications for multidisciplinary computer science education ............................................................................................................................................................. 12
Creating Breakthrough Solutions for Important Problems ..................................................................................................................................................................... 13
Global issues that drive accelerated research .................................................................................................................... 13
Problems unique to the 21st century ................................................................................................................................................................................................. 13
The promise of “green computing”................................................................................................................................................................................................... 13
Computing enables science ............................................................................................................................................................................................................... 13
The need for fast discovery................................................................................................................................................................................................................ 14
Breakthroughs are nearly impossible to predict ................................................................................................................................................................................ 14
Anticipating where breakthroughs are likely..................................................................................................................................................................................... 15
Observing current top innovators ..................................................................................................................................................................................................... 15
Unknown next-generation innovators ............................................................................................................................................................................................... 15
New innovators and “risky” research ............................................................................................................................................................................................... 16
Innovation uptake: fuel cells in China before the U.S.? .................................................................................................................................................................... 16
Spontaneity takes time ....................................................................................................................................................................................................................... 16
Snapshots of Recent Developments in Several Domains ................................................................................................................. 17
Life Sciences ........................................................................................................................................................................................................................................... 17
The next disciplines to be transformed by the data integration revolution ........................................................................................................................................ 17
Getting organized: workflows, provenance and semantics................................................................................................................................................................ 17
Breakthrough Research Samples from the Edge .................................................................................................................................................................................... 17
Robotics has entered into the Mainstream............................................................................................................................................................................................ 18
Taking Computer Gaming to the Next Level .......................................................................................................................................................................................... 19
Is Computing a Natural Science? ..................................................................................................................................................... 21
Two divergent opinions.......................................................................................................................................................................................................................... 21
The study of natural and artificial information processes ...................................................................................................................................................................... 21
Conclusions ..................................................................................................................................................................................... 22
Urgency and opportunity accelerate innovation .................................................................................................................................................................................... 22
Computers complement human ingenuity ............................................................................................................................................................................................ 22
Popular myths must be abandoned ....................................................................................................................................................................................................... 22
Creating more opportunities for breakthrough innovation ................................................................................................................................................................... 22
Guidelines for increasing innovation ...................................................................................................................................................................................................... 23
Promising prospects for significant positive change .............................................................................................................................................................................. 23
Summary ......................................................................................................................................................................................... 24
Acknowledgements ......................................................................................................................................................................... 25
Introduction
Background
Over the past few years, during discussions with thought leaders in academia about future research directions, my attention was drawn
to a number of powerful and recurrent themes. Insights from these conversations, enhanced by examining recent changes in funding
policy and new approaches to research, have illuminated a number of salient, unpredictable and sometimes disconcerting trends.
It is clear to many that computing has become indispensable for creating scientific advances needed to solve immensely complex
problems that are highly important to society. While the dramatic increases in computing power necessary to support this progress can
no longer be taken for granted, serious global issues facing humanity are driving increasingly urgent requirements for accelerated
innovation. As a result, multidisciplinary explorations are proliferating rapidly, while dramatic new strategies for riskier research are
being devised to create breakthrough solutions. My analysis of these dynamic and startling trends, together with qualitative and
quantitative evidence, form the basis and rationale for this paper.
Changing the World
A new era of computing
Computing is playing an increasingly important role in society as the power of computing becomes indispensable for progress in nearly
all disciplines. At the forefront of enabling the sciences with computing, Jim Gray defined many of the principles for achieving this type
of transformation and inspired others to continue with and to expand the work, with dramatic results.1 Enabling the sciences with
computing is a top priority for the computational disciplines, today more than ever.
The three primary challenges facing the computational disciplines today are to figure out how to:
1. Transcend current architecture limitations.
It is necessary to invent and develop revolutionary next-generation computing architectures which conquer the limitations and far
surpass the performance of today’s platforms and technologies. These new architectures will provide the dramatic increases in
computational power that are urgently needed to solve the most demanding problems facing researchers today, and for the
foreseeable future.
2. Transform all disciplines computationally.
Computing must drive major efforts to enable practitioners computationally in all fields and empower them to extract breakthrough
insights and discoveries from the valuable yet overwhelming mass of information now available.
3. Create breakthrough solutions with computing for the problems most important to humanity.
Computational approaches and principles should be applied to find answers for the most important problems the world faces,
enabling researchers to work collaboratively across domains, applying computing to change the world by addressing key problems
for science and humanity.
Predicting the most important new directions for research
How should the research agendas that will further these overarching goals be set? Where should research funding be invested? How
can significant and relevant discoveries be predicted? Should computing be considered a natural science? This paper explores several
strong indicators of research directions for the coming years and will suggest additional means for predicting new areas for
investigation. In addition, the discussion will cover possible ways for identifying the researchers who will make essential breakthroughs
determining the future direction of both computing and computing-enabled disciplines.
“The sciences all have their own terminology and approaches and use different and often incompatible systems of
measurement. Empowering science computationally involves first standardizing data and reducing complex problems
to simpler, solvable ones, with computing professionals working closely with other disciplines. Our strategic intent is
information at your fingertips, so I have taken the task of getting all the scientific information online. In science it’s
very, very hard to get people to agree. The fundamental thing we are trying to do with Terra Server is to build a
fundamental conceptual model for astronomy, and that means you have to agree what a star is. There is no simplicity
theory, there’s only complexity theory. The problem is to find something that is simple enough that you can actually
make progress on. We are all optimists; most of us would not have undertaken the projects we have if we had known
at the outset how hard they would be.” - Jim Gray, December 2006 2 (Jim Gray was reported missing at sea off the
coast of California on January 28, 2007. 3 An extensive search by the U.S. Coast Guard, his family and friends found
no trace of him. Additional information is available on the Microsoft Research web site.)
The Worldwide Research Outlook
From a global perspective, the European Journal of Education covered 30 developed countries including the U.S. and reported that,
“Computers, digital data, and networks have indeed revolutionized the research environment (as much as society at large). The
exponential growth of computing and storage capacity will continue in the foreseeable future and many experiments that are still
impossible because they would involve too massive data collection and computation will soon become possible, for example in sky
modeling (astronomy) or climate modeling (atmospheric science).”4 Computers are transforming research in many fields and have
been essential in addressing problems ranging from the decoding of the human genome to the discovery of unknown celestial bodies
through mathematical inference. Sometimes these efforts result in cascading innovations; for instance, computing was essential in the
development of robotic surgery, which in turn has aided three other major advances: remote surgery, minimally invasive surgery and
unmanned surgery.
Everywhere, innovation of all kinds is driving productivity
Business people care about increased productivity and are aware that innovation is a prime means for generating it, although not
all innovation is based on new technologies. The McKinsey Global Institute shows that the enormous productivity gains of the
1990s were based on innovation in products and business practices, not on technology alone.5 The Organisation for Economic
Co-operation and Development (OECD),6 a think-tank for rich countries, defines innovation as “new products, business processes
and organic changes that create wealth or social welfare.”7 Richard Lyons of the investment bank Goldman Sachs defines
innovation more generally and succinctly as “fresh thinking that creates value.”8 Innovation matters even more to business today
than in the past because manufacturing now accounts for less than 20% of economic activity in rich countries, making the
“knowledge economy” accordingly more important. “But even if innovation is the key to global competitiveness, it is not
necessarily a zero-sum game. On the contrary, because the well of human ingenuity is bottomless, innovation strategies that tap
into hitherto neglected intellectual capital and connect it better with financial capital can help both rich and poor countries
prosper. That is starting to happen in the developing world.”9 Table 1 shows the rising proportion of U.S. productivity that can
be attributed to innovation.10
Table 1: Innovation as an increasing source of the rise in productivity that grows economies.
Globalization and IT are accelerating both competition and innovation
A special report on innovation from The Economist points out that “North America still leads the world in research spending, but the big
labs’ advantage over their smaller rivals and the developing world is being eroded by two powerful forces.” Globalization is the first
factor driving the worldwide acceleration of innovation;11 its leveling effects have made it possible for innovation to occur
anywhere, not only in economically elite regions. More researchers are engaged worldwide, increasing the rate of discovery and
creating a competition-driven feedback loop that further accelerates innovation in all fields. Table 2 reflects the global investment
in R&D by region.12 The second factor contributing to accelerated innovation is “the rapid advance of information technologies,
which are spreading far beyond the internet and into older industries such as steel, aerospace and carmaking.”13
According to a report from the World Economic Forum, the United States once again rates
first in competitiveness compared to other economies, after falling to 6th place last year,14
reflecting a shift in the metrics being measured rather than actual trends. “The U.S. does
amazingly well on innovation and markets, but on the macroeconomic stability pillar it ranks
75th” out of the 131 countries included in the survey, said Jennifer Blanke, the forum’s chief
economist. “This still reflects a very serious problem that could hurt the U.S. in the future.” 15
Although innovation is important, the U.S. scores badly in areas of government debt,
deficits and a low savings rate.16 Surprising performers are the oil-rich nations of the Middle
East, whose high ranking reflects a windfall from higher fuel prices. Meanwhile, China rose
from 54th to 34th place. India, interestingly, does not rank higher than 48th because “the
advantages of its top rate engineers and schools are dwarfed by a sea of poverty and low
skills, a big long-term challenge.”17
Table 2: Worldwide investment in innovation18
Global businesses and open innovation
In an era of increasing need for innovation, the “open innovation” approach has been adopted to some degree by a number of
companies. However, there are tradeoffs that must be managed in collaborating with others across a wide spectrum of
partnerships. Separation, both geographical and cultural, can make certain kinds of innovation more expensive than they are
worth. It must be noted that firms have always been open to a degree, and the benefits of open collaboration vary depending on
the type of business.19 For instance, Kenneth Morse, head of MIT’s Entrepreneurship Center, scoffs at IBM’s claim to be an open
company: “They’re open only in markets, like software, where they have fallen behind. In hardware markets, where they have the
lead, they are extremely closed.”20 David Gann and Linus Dahlander, of London’s Imperial College, are also skeptical. They point
out that “the costs of open innovation, in management distraction or lost intellectual-property rights, are not nearly as well
studied as its putative benefits.”21
Much of innovation still originates with employees, R&D units, and internal sales and service units, illustrating the fact that not
all innovation can be outsourced or derived from external collaborations. Innovation deriving from both internal and external
sources is illustrated in Table 3.22
Table 3: Sources for innovative ideas worldwide.
Indicators from Government and Academia
Following specific funding trends
One way to identify the most important research areas to track and to enable with research funding is to examine how major
organizations are “placing their bets”. At a high level, government and academia describe and fund research on many of the largest and
the most important problems humanity faces today. These projects are also related to global needs for innovation, for instance, in the
environmental and biomedical fields. In the United States, it is informative to examine the policies and priorities of the National Science
Foundation (NSF), the National Institutes of Health (NIH), and other large scale funding agencies. Recent NSF funding trends reveal
the increasing importance of computing research to other areas of research. By 2003, NSF funding for computing had surpassed
funding for all other areas, as seen in Table 4.
Table 4: NSF Support for math & computer science research now surpasses funding for all other disciplines 23
What does the funding data actually reflect?
While NSF funding has increased in a number of areas, it is clear from this data that the most
significant growth is taking place in support of the five categories listed in Table 5, ranked by funding levels.
Rank   NSF Highest Funded Categories
1      Math & Computer Science – steepest recent growth in funding; these are enablers for other disciplines
2      Physical Sciences – includes chemistry, physics, metallurgy, materials science, and so on
3      Environmental Sciences – funding has grown steadily since 1970 as the issues have become clearer
4      Engineering – includes all types of engineering not covered by other categories
5      Life Sciences – a wide definition, subsuming biology, medicine, genomics, and related fields
Table 5: Trends in NSF funding
Conversations with academic leaders
Conversations with deans, top researchers, and other leaders in academia during the past year24 have exposed some of the best
thinking about future research trends in computation and related areas. These academic thought leaders are familiar with current
research activities and can point to the most important new directions for the near future. A review of recent publications, combined with
discussions with scores of academics from a variety of different institutions points to important growth areas in the list of future research
priorities summarized in Table 6.
Priority   Future Research Directions
1          Multidisciplinary research, with computation at the core, and enabling other domains
2          New computer architectures including multi-core and many-core systems
3          Large-scale data mining, and analysis of vast data sets
4          Programming in the large, concurrently, and in parallel
Table 6: Academic thought leader opinions on the highest priorities for future research
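Two of the priorities in Table 6, large-scale data analysis and parallel programming, can be suggested with a minimal sketch: partition a data set, summarize each partition concurrently, then merge the partial results. The function names and data here are hypothetical illustrations, and a real petascale workload would run on process pools or clusters rather than a single thread pool:

```python
# Illustrative map-then-reduce over a partitioned data set. The names
# (chunk_stats, parallel_summary) and the data are hypothetical; they
# stand in for the kind of large-scale, parallel analysis Table 6 names.
from concurrent.futures import ThreadPoolExecutor

def chunk_stats(chunk):
    """Summarize one partition of the data set."""
    return sum(chunk), len(chunk), max(chunk)

def parallel_summary(data, workers=4):
    """Map chunk_stats over partitions in parallel, then reduce."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(chunk_stats, chunks))
    total = sum(t for t, _, _ in partials)
    count = sum(n for _, n, _ in partials)
    peak = max(m for _, _, m in partials)
    return total / count, peak  # mean and maximum of the whole set
```

The pattern scales because each partition is summarized independently; only the small partial results ever need to be combined.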
Alignment of funding priorities
These views highlight research directions from a macro level, and closely align with the most pressing problems of our day. Further
analysis of funding allocation in the computing disciplines demonstrates considerable alignment between the research priorities of
thought leaders in academia and those set by various funding agencies.
For instance, the NSF has started a program called Accelerating Discovery in Science and Engineering through Petascale Simulations
and Analysis (“PetaApps”) which involves the creation of new types of programs for quickly manipulating and data mining huge
information stores. These programs enable simulations that provide visualizations of trends and relationships that would otherwise
remain buried in the data. A summary of research funding planned by NSF or the NSF directorate for Computer Information Science
and Engineering (CISE), as of June, 2007 can be found in Table 7.
Amount         Area                                    Directorates / sources
$52,000,000    Cyber-enabled Discoveries               $20M from the CISE budget
$50,000,000    Computing Foundations                   CISE
$21,500,000    PetaApps (for Science & Engineering)    Multidisciplinary focus
$10,000,000    Software Design and Productivity        CISE
Table 7: NSF funding focus for computational initiatives for 200825
Cyber-enabled Discoveries refer to multidisciplinary research with a strong computational component. By funding this research at the
highest level, the NSF recognizes that computational empowerment is vital to the progress of all research fields. The “computational
enablers” developed from this research will benefit many disciplines, not just the sciences.
Computing foundations are also a major focus and will receive nearly as much funding as Cyber-enabled Discoveries. Processors
based on complementary metal-oxide semiconductor (CMOS) technology, which drove growth in computing power for decades, have
now reached their limits. Defining next-generation computer architectures is therefore essential to both research and industry.
Large-scale applications for science and engineering, with an emphasis on applications for mining huge data stores for
simulations (PetaApps), are also receiving significant funding, although about half as much as each of the first two categories. The
Cyber-enabled Discoveries research may be related to the PetaApps call for proposals, an NSF-wide program involving a wide
array of scientists from other fields, including chemists, biologists, and materials scientists.
Software design and productivity will receive the least funding of these areas, but it is still a major focus because the new
architectures will require vastly different approaches to software production.
This information aligns well with future research priorities described by academic thought leaders. After all, academia and government
frequently share the same opinions, often arrived at by the same people who are making decisions and recommendations in both
environments, based on equivalent data. Funding priorities of NSF vs. those of academic thought leaders are summarized in Table 8.
Academic Thought Leader Opinions | NSF Funding Priorities
Multidisciplinary research, with computation at the core, enabling other domains | Cyber-enabled Discoveries
New computer architectures including multi-core and many-core systems | Computing Foundations
Large-scale data mining, and analysis of vast data sets | “PetaApps”
Programming in the large, concurrently, and in parallel | Software Design and Productivity

Similarity of priorities is not unexpected; many NSF leaders come from academia.

Table 8: Academic thought leaders’ projections for future research directions and current NSF funding priorities
Academic hiring trends
Another possible way to discern future trends is to see who thought leaders in academia are looking to hire. Recruiting priorities at
universities could offer some insight into academic perceptions of future research trends by uncovering areas where department heads
feel that expertise will be needed. However, most available hiring-trend data confirms what is already known: departments are hiring in
response to research opportunities driven by funding organizations. While some academics prefer to keep their specific hiring priorities
to themselves, others have indicated in personal interviews that they are usually working to fill two or more specific needs at once.
For instance, with only one position available, it may be necessary to find a replacement for an embedded systems specialist who is
just retiring, while also finding a candidate who is proficient in another area such as bioengineering where the department has identified
a need for expertise. In that case, finding a candidate who excels in both disciplines becomes very important. Another factor influencing
the hiring decision is that while industry hires for its needs over the next few years, universities are considering a 30-year investment26
and must choose the person rather than just a particular specialization. Some of the most desirable researchers to hire are those
who are good at reinventing themselves in response to new challenges. 27
Transcending Current Architectures
Creating manycore architectures of the future
The increase in the speed of processors has met an upper limit at about 4 gigahertz. New architectures are clearly needed, but it is
difficult to predict how they will evolve. Many researchers, architecture experts, chip designers, software engineers, and manufacturers
have turned their attention to figuring out how to develop optimal manycore systems. Efforts in these areas merit close monitoring.
There are multiple sets of problems in this domain, some of which may be solved in a few years, while others may take significantly
longer. For customers to fully benefit from hardware improvements in the years ahead, developers must create applications that take
full advantage of manycore processors and the increased power they promise.28 One of the biggest challenges in the manycore space
is to develop new methods of software engineering for optimizing the programming of these processors in parallel.
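The data-parallel style these processors reward can be sketched in a few lines: map one pure function over independent chunks of work. The Python below is a hypothetical illustration, not code from any project discussed here; a thread pool is used so the sketch runs anywhere, though CPU-bound Python would need a process pool to sidestep the interpreter's global lock.

```python
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(chunk):
    """CPU-bound work on one independent slice of the data."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the input into one independent chunk per worker; because the
    # chunks share no state, each can be processed on a separate core.
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_of_squares, chunks))
```

The hard part, as the researchers above note, is that most real applications do not decompose this cleanly.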
Will other technologies surpass manycore?
Although much activity is currently focused on manycore/parallel approaches, the strategy is still based on use of familiar technologies
to produce gains in performance. Meanwhile, computing architectures based on completely different models are under investigation
and may at some point produce breakthroughs that rival a purely parallel strategy for improvement.
If these other models can increase computational performance by orders of magnitude, the problems of coordinating
multiple processors and programming them in parallel may become less pressing. Alternative computing models under
investigation include reconfigurable fabrics, streaming machines, quantum computing, biocomputing, and nanocomputing. These
probably will not render parallel computing irrelevant anytime soon. However, it is not inconceivable that practical breakthroughs in
these alternative approaches might at some point surpass manycore solutions in usefulness. It is important to avoid overlooking the
importance of the intended applications in these evaluations; the value of specific architectures is greatly dependent on the type of
application they are intended to run.
Industry and academic perspectives on manycore
The emergence of multicore/manycore chips and the predicted growth in number of cores per chip is changing the landscape of
computing and presents the computing industry with the critical challenges enumerated in Table 9.
Important questions about next-generation architectures
1. Can today’s operating systems and applications evolve to exploit the power of multicore/manycore architectures?
2. With per-core performance improvement projected to be flat for the foreseeable future, what will drive consumers to purchase new systems if their existing applications and operating environment don’t run significantly faster?
3. If performance stays flat, how can the runtime stack continue to add new features and functionalities without degrading application performance?
4. Can current programming languages and models enable current and future programmers to create safe and scalable concurrent applications?
5. Are current synchronization mechanisms sufficient for creating safe, composable, and scalable applications, libraries, and runtimes?
6. More broadly, can the current infrastructure enable the next generation of “killer apps” that will make upgrading to multicore systems compelling?
7. Can current operating systems address the increasingly pressing issue of energy efficiency and find an optimal balance between throughput and power consumption?

Not to be forgotten: The value of a parallel (or any) architecture depends greatly on the type of applications it is meant to run.

Table 9: Architectural issues: the industry perspective29
Obviously, the computing industry faces many serious questions in this space. For instance, what will be the “killer app” for manycore
computing? In the academic community, multicore computing is similarly redefining research priorities in many areas including
programming languages, compilers, runtimes, testing, program analysis, verification, operating systems, and virtualization. The
multicore/manycore inflection point offers an opportunity to rethink the abstractions and runtime mechanisms used to build and execute
safe, scalable concurrent applications and systems. New programming languages, parallel run-time systems, transactional memory,
heterogeneous multiprocessing, and speculative execution are among the many areas deserving attention. 30
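The synchronization concern above can be made concrete with a classic example: transferring money between two lock-protected accounts. Naively taking the two locks in argument order can deadlock when transfers run in opposite directions; the hypothetical Python sketch below avoids this by imposing a single global lock order, exactly the kind of manual discipline that transactional memory aims to make unnecessary.

```python
import threading

class Account:
    def __init__(self, balance):
        self.balance = balance
        self.lock = threading.Lock()

def transfer(src, dst, amount):
    # Acquire the two locks in one global order (here, by object id) so
    # that two concurrent, opposite-direction transfers can never each
    # hold one lock while waiting forever for the other.
    first, second = (src, dst) if id(src) < id(dst) else (dst, src)
    with first.lock:
        with second.lock:
            src.balance -= amount
            dst.balance += amount
```

The fragility of this pattern, which every caller must know and follow, is one reason composable alternatives such as transactional memory are listed among the areas deserving attention.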
Manycore chip architectures
Optimal configuration models are just one of the problems. While industry proceeds with designing and fabricating
the next manycore chip, researchers are working diligently to understand whether these are in fact the best
configurations and what to do with them. Is the optimal number of processors 128, 1024 or a million?
In late August 2006, Ken Kennedy of Rice University shared his thoughts about strategies for finding a “sweet
spot” in configurations that would allow for an optimal balance of the number of processors, failover/backup
components, shared/separate memories and caches, etc.31 Note: Ken Kennedy both taught and conducted
research at Rice for 36 years, and until his unfortunate death from a long illness on February 2, 2007, he also
directed the Center for High Performance Software Research at Rice (HiPerSoft).
These perspectives are currently evolving rapidly in many new directions, driven by the urgency for finding solutions for the very tough
problems the manycore/parallel processing domain faces.
A unique combination of strategies from Berkeley
Meanwhile, researchers at the University of California at Berkeley are investigating configurations involving 128 or more
processors. Based on their assumptions of what the future holds, the Berkeley group is adopting some very specific strategies:
“The recent switch to parallel microprocessors is a milestone in the history of computing. Industry has laid out a
roadmap for multicore designs that preserves the programming paradigm of the past via binary compatibility and
cache coherence. Conventional wisdom is now to double the number of cores on a chip with each silicon
generation. A multidisciplinary group of Berkeley researchers met nearly two years to discuss this change. Our
view is that this evolutionary approach to parallel hardware and software may work from 2 or 8 processor systems,
but is likely to face diminishing returns as 16 and 32 processor systems are realized, just as returns fell with greater
instruction-level parallelism. We believe that much can be learned by examining the success of parallelism at the
extremes of the computing spectrum, namely embedded computing and high performance computing.
This led us to frame the parallel landscape with seven questions, and to recommend the following:
1. The overarching goal should be to make it easy to write programs that execute efficiently on highly parallel
computing systems.
2. The target should be 1000s of cores per chip, as these chips are built from processing elements that are the
most efficient in MIPS (Million Instructions per Second) per watt, MIPS per area of silicon, and MIPS per
development dollar.
3. Instead of traditional benchmarks, use 13 "Dwarfs" to design and evaluate parallel programming models and
architectures. (A dwarf is an algorithmic method that captures a pattern of computation and communication.)
4. "Autotuners" should play a larger role than conventional compilers in translating parallel programs.
5. To maximize programmer productivity, future programming models must be more human-centric than the
conventional focus on hardware or applications.
6. To be successful, programming models should be independent of the number of processors.
7. To maximize application efficiency, programming models should support a wide range of data types and
successful models of parallelism: task-level parallelism, word-level parallelism, and bit-level parallelism.
8. Architects should not include features that significantly affect performance or energy if programmers cannot
accurately measure their impact via performance counters and energy counters.
9. Traditional operating systems will be deconstructed and operating system functionality will be orchestrated
using libraries and virtual machines.
10. To explore the design space rapidly, use system emulators based on Field Programmable Gate Arrays
(FPGAs) that are highly scalable and low cost.
Since real world applications are naturally parallel and hardware is naturally parallel, what we need is a
programming model, system software, and a supporting architecture that are naturally parallel. Researchers have
the rare opportunity to re-invent these cornerstones of computing, provided they simplify the efficient programming
of highly parallel systems.”32
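Recommendation 4 above, autotuning, can be illustrated with a toy example: rather than trusting a static heuristic, an autotuner times several candidate configurations on the actual machine and keeps the fastest. The block-size parameter below is purely illustrative; real autotuners such as those the Berkeley group has in mind search far richer spaces.

```python
import time

def run_with_block_size(data, block):
    """Sum the data in fixed-size blocks (the tunable 'kernel')."""
    total = 0
    for i in range(0, len(data), block):
        total += sum(data[i:i + block])
    return total

def autotune(data, candidates=(64, 256, 1024, 4096)):
    """Time each candidate configuration on this machine and return the
    fastest one -- the essence of the autotuner approach."""
    timings = {}
    for block in candidates:
        start = time.perf_counter()
        run_with_block_size(data, block)
        timings[block] = time.perf_counter() - start
    return min(timings, key=timings.get)
```

Because the winning configuration is measured rather than predicted, the same program adapts itself to whatever parallel hardware it lands on.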
Other architectural approaches
Other researchers are focusing on flexible architectures that enable powerful cores to be reconfigured based on their purpose and
on computational requirements. They have begun examining many alternative architectures and the possibility of building and using
highly configurable and heterogeneous arrays. For instance, Scott Hauck at the University of Washington is one of the investigators
focusing on enabling configurable systems by applying FPGA technology to multicore processors (Berkeley is doing this for different
reasons). Hauck recommends FPGAs for situations that can exploit their novel features, such as high-performance computing and
reconfigurable subsystems for a system-on-a-chip.33 The biggest promise of this approach is dynamic routing of
memory that ensures something happens on every clock cycle, so that virtually no cycles are wasted.
Robert Colwell (formerly Intel’s chief IA-32 architect) explained at the June 2007 Federated Computing Research Conference in
San Diego that more research is needed, and identified where he believes much of it will be focused, as shown in Table 10.
Approaches to solving next generation architectural issues
1. CMOS end-game electrical problems
2. Multicore software
3. Power/thermals management
4. Thread and manycore sync: software needs help
5. Expand synergies between embedded & general-purpose (GP) computing
6. Design-in-the-large
7. Grand Challenges
8. New technologies including reconfigurable fabrics, streaming machines, quantum, bio, & nanocomputing
Table 10: Architectural research priorities34
Every researcher I interviewed in the past two years reported concerns about the difficulty of programming these devices, at least in
ways that would be cost effective and could leverage most of the power of the cores. Other architecture-related issues to be solved
include how to build systems capable of handling enormous data sets. In this vein, Randall Bryant from Carnegie Mellon University
focused on Data-Intensive Supercomputing (DISC), the need for such systems with today’s petabyte scale requirements and the
challenges they face.35
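The programming style behind such data-intensive systems can be sketched in miniature: partition the data, run an independent "map" over each partition, then merge ("reduce") the partial results. This hypothetical Python word count mirrors the model popularized by MapReduce-style systems; it is not code from the DISC proposal itself.

```python
from collections import Counter

def map_phase(document):
    # Each mapper runs independently on one partition of the data set,
    # so mappers can be spread across thousands of machines.
    return Counter(document.split())

def reduce_phase(partial_counts):
    # The reducer merges the per-partition results into a global answer.
    total = Counter()
    for counts in partial_counts:
        total.update(counts)
    return total

documents = ["big data big compute", "data intensive supercomputing"]
word_counts = reduce_phase(map(map_phase, documents))
```

At petabyte scale, the system's real work lies in distributing the partitions and surviving machine failures, which is precisely the challenge Bryant highlights.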
Another look at parallel computing research challenges
Burton Smith of Microsoft Research has observed that “Thread parallelism is upon us”, and then asks, “Which and how much thread
parallelism can we exploit? … [this is] a good question for both hardware and software [domains].”36 Table 11 enumerates the specific
parallelism questions that Smith believes will drive research in this area.
Parallel computing challenges
1. Languages for mainstream parallel computing
2. Compilation techniques for parallel programs
3. Debugging and performance tuning of parallel programs
4. Operating systems for parallel computing at all scales
5. Computer architecture for mainstream parallel computing
Table 11: Parallel computing challenges37
Quantum Computers
A quantum computer is a computational device that leverages phenomena of quantum mechanics, such as superposition and
entanglement, to perform operations on data. The basic principle of quantum computation is that quantum properties can be used to
represent and structure data, and that quantum mechanisms can be devised and built to perform operations with this data.38 While bits
are the basis of current digital computing, holding either a one or a zero, qubits are the representational model for quantum computing;
a qubit can hold a one, a zero, or any superposition of the two. Qubits are made up of controlled
particles, and the means of control are devices that trap particles and switch them from one state to another. A quantum computer
operates by manipulating qubits with quantum logic gates. If large-scale quantum computers can be built, they will be able to solve
certain problems exponentially faster than any of our current classical computers (for example, those addressed by Shor's algorithm). The Bloch sphere is a
representation of a single qubit.39
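For a single qubit, the amplitudes-and-gates model just described can be mimicked classically by tracking the two complex amplitudes directly. The hypothetical Python sketch below applies a Hadamard gate, which puts a |0⟩ qubit into an equal superposition of both basis states.

```python
import math

# A single-qubit state is a pair of amplitudes for the basis states |0> and |1>.
ZERO = (1.0, 0.0)

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to (|0> + |1>) / sqrt(2)."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1 (the Born rule)."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)
```

Applying `hadamard` once yields a 50/50 measurement distribution; applying it twice returns the qubit exactly to |0⟩, because the amplitudes interfere, something no classical coin flip can reproduce. A real quantum computer's power comes from doing this with many entangled qubits at once, which a classical simulation like this cannot scale to.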
Quantum computers are different from other advanced architectures such as DNA computers and optical
computers. Although optical computers may use classical superposition of electromagnetic waves, they do
so without using specifically quantum mechanical resources such as entanglement, and they have
less potential for increasing computational speed than quantum computers. Research in both theoretical
and practical applications of quantum computing is proceeding rapidly, and many national government and
military funding agencies support quantum computing research to develop special quantum computers for
both civilian and national security purposes, such as cryptanalysis.40
Although quantum computing offers better potential for increased speed than other advanced architectures,
such as DNA computers and optical computers, it has serious practical problems that remain to be solved.
The dramatic advantage of quantum computers has so far been discovered only for certain types of
problems, such as factoring and computing discrete logarithms, and it is still possible that equally fast
classical algorithms may be discovered for these problems. For one other problem, database search,
quantum computers hold a smaller though still significant (quadratic) advantage. Quantum computers
would also excel at advanced cryptographic analysis, breaking codes that traditional computer
architectures could not break in reasonable timeframes. 41
The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.
Bio-computing and nano-computing
Biocomputers use DNA and proteins to perform computational functions through metabolic or other pathways. Three types of
biocomputers are under investigation: biochemical computers, biomechanical computers, and bioelectronic computers. Interest in
these approaches is growing. “The U.S. Department of Defense is awarding a team of nine professors from six universities $6 million
over five years to exploit precise biological assembly for the study of quantum physics in nanoparticle arrays. This research will help to
produce a fundamental understanding of quantum electronic systems, which could impact the way future electronics are created.” 42
Next generation internet architectures
Network research is proceeding along many paths simultaneously, including Internet research, sensor nets and many others. A very
challenging area is architecting the next Internet. Nick McKeown of Stanford is one of a growing number of researchers who are
interested in what is called the “clean slate” approach, and he is currently assembling a consortium of industry partners to investigate
redesigning the Internet “from scratch” without preconceived notions or legacy infrastructure issues. 43 Scott Shenker from Berkeley also
focuses on problems of designing the next Internet. He points out that unfortunately, we have no way of validating these designs.
Analysis and simulation are informative but inadequate, and we cannot experimentally deploy designs on a large enough scale to know
how they would perform in practice. The Global Environment for Network Innovations (GENI) is a proposal for building an experimental
facility that would support such large-scale deployments. Shenker is interested in the technical challenges, the proposals being
discussed, and the role GENI might play in furthering the research agenda. 44
Multidisciplinary and Breakthrough Research
Importance of multidisciplinary research
Much has been said about the escalating importance of multidisciplinary research and the need for higher funding in multidisciplinary
areas. One primary reason for this is that many of the greatest problems humanity faces today are much too complex to address using
just the knowledge and methodology of a single discipline. Collaboration in the digital domain enables a greater intersection of
disciplines and easier cooperation between their practitioners. According to David Dornfeld, Dean of Engineering at the University of
California, Berkeley, “The intersection of technology, design, multidisciplinary approaches and entrepreneurship are the most important
focus for the future. For research, that’s where it’s at.”45
The European Journal of Education reinforces this assessment, pointing out that, “Interestingly, the digitization of data also enables
more interdisciplinary work, and sometimes the emergence of new fields, thanks to the reuse of data sets in unexpected ways or the
linking of several data sets.”46
Multidisciplinary funding activity to watch
For fiscal year 2008, NSF has proposed an overall budget of $6.43 billion, $409 million more than for FY 2007. The agency will
use the funds to build on recent advances and to support promising initiatives to strengthen the nation's capacity for discovery and
innovation. In the words of NSF Director Arden L. Bement, Jr., "NSF works at the frontier of knowledge where high-risk, high-reward
research can lay the foundation for revolutionary technologies and tackle difficult problems that challenge society. The opportunities to
advance the frontiers of discovery and learning have never been more promising." 47 The budget emphasizes new research on
improving computing abilities to meet the challenges of 21st century inquiry, as well as polar research, ocean research,
nanotechnology, education, and international collaborations. NSF describes research opportunities by discipline and denotes efforts
across two or more disciplines as “cross-cutting”. Some interesting new multidisciplinary areas of investigation to be sponsored by NSF
and NIH in 2008 are summarized in Table 12.
Programs encouraging multidisciplinary and breakthrough research
NSF: Collaborative Research in Computational Neuroscience (CRCNS)48
The most exciting and difficult challenge for neuroscientists is to understand the functions of complex neurobiological systems.
Computational neuroscience provides a theoretical foundation and a set of technological approaches that may enhance our
understanding of nervous system functions by providing analytical and modeling tools that describe, traverse, and integrate
different levels of organization, spanning vast temporal and spatial scales and levels of abstraction. Computational approaches
are needed in the study of neuroscience as the requirement for comprehensive analysis and interpretation of complex data sets
becomes increasingly important. Collaborations among computer scientists, engineers, mathematicians, statisticians,
theoreticians and experimental neuroscientists, are imperative to advance our understanding of the nervous system and
mechanisms underlying brain disorders. Computational understanding of the nervous system may also have a significant
impact on the theory and design of engineered systems.
NSF: Advanced Learning Technologies (ALT)49
Through the Advanced Learning Technologies (ALT) program, the CISE and EHR Directorates of NSF support research that
(1) enables radical improvements in learning through innovative computer and information technologies, and (2) advances
research in computer science, information technology, learning, and cognitive science through the unique challenges posed by
learning environments and learning technology platforms. Integrative research approaches that build across disciplines and
link theory, experiment, and design are strongly encouraged. Technology goals may include systems for tutoring or
assessment, modeling and sensing of cognitive or emotional states, context awareness, natural language interfaces,
collaboration, knowledge management, and non-traditional goals that redefine the roles of technology in learning. Educational
foci for ALT projects must include an area of science, technology, engineering, or mathematics (STEM), or general crosscutting skills directly relevant to STEM.
NIH: Exploratory/Developmental Research Grant Award (R21) 50
R21 funds provide seed funding to identify projects that qualify for the next level of grants at NIH (multi-year, multi-million dollar
investments). NIH has standardized the Exploratory/Developmental Grant (R21) application requirements to accommodate
investigator-initiated (unsolicited) applications. R21 follows the “let a thousand flowers bloom” model and is intended to
encourage exploratory/developmental research projects by providing support for the early and conceptual stages of
development. Investigators wishing to apply for an R21 grant should be aware that not all NIH Institutes and Centers (ICs)
accept these unsolicited, investigator-initiated R21 applications.
NSF: Small Grants for Exploratory Research (SGER) 51
SGER is a similar effort to the NIH R21 program in that it essentially encourages “riskier” research that may later qualify for
more substantial funding.
Table 12: Innovative programs from NSF and NIH
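The kind of analytical and modeling tool the CRCNS description envisions can be as simple as a leaky integrate-and-fire neuron, a standard computational-neuroscience abstraction: the membrane potential integrates input current, leaks back toward rest, and emits a spike when it crosses a threshold. The parameters in this hypothetical Python sketch are illustrative, not drawn from the program description.

```python
def simulate_lif(current, steps=200, dt=0.1, tau=10.0, threshold=1.0):
    """Leaky integrate-and-fire neuron: potential v integrates the input
    current, decays with time constant tau, and resets to 0 whenever it
    crosses the threshold, recording a spike. Returns the spike times."""
    v, spikes = 0.0, []
    for step in range(steps):
        v += dt * (-v / tau + current)  # leak toward rest + drive
        if v >= threshold:
            spikes.append(step)
            v = 0.0  # reset after the spike
    return spikes
```

Even this toy model exhibits the multi-scale character the CRCNS text describes: a millisecond-level membrane equation produces the spike trains analyzed at far longer behavioral time scales.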
Promising design approaches to multidisciplinary research
Traditional research usually focuses on existing disciplines and structures, while “multidisciplinary research” is often used to describe
solutions for which one area of expertise will not suffice. However, discussions about multidisciplinary areas often miss the point. In
reality, this reaching across subject boundaries is much more fluid than connecting a computer scientist with a biologist to eventually
create the new discipline of bioinformatics.
Thus, it may not be very productive to ask, “Which will be the best new combination of
disciplines to watch for?” Rather than focusing on specific combinations of disciplines, the
design approach to multidisciplinary research stresses fully understanding the research goals,
and then assembling a team that includes whatever areas of expertise are required to produce
the intended outcome. Rather than using a “multidisciplinary” framework of disciplines at the
start, a design approach to problem solving focuses holistically on both the problem and the
solution, involving as many areas of expertise and research roles as necessary.52
Designers approach problem
solving holistically, only then
deciding which disciplines and
expertise are needed.
In this context, design is a global concept rather than a physical plan for a specific item. In “The Ten Faces of Innovation,” by Tom Kelley of
the famed IDEO design firm, ten roles for team members are described as useful and necessary for producing solutions to
specific problems.53 The Technology, Entertainment, Design organization (TED) also shares this approach and maintains a vibrant site
devoted to technology-focused multidisciplinary design and research.54
Because of its fluid approach to problems and to problem-solving, the design approach provides the
ultimate framework for engineering and for all innovation involving multiple disciplines. Traditional
“design engineering” curricula at universities were originally based on Mechanical Engineering courses,
typically supplemented by a few psychology and drawing courses. Today, Stanford’s Design Institute, or
d.school,55 involves five or six departments and approaches design as envisioned by IDEO, whose CEO
serves as adjunct faculty.56
The Berkeley Institute of Design (BiD) is a research group that also fosters a deeply holistic and
multidisciplinary approach to design for the 21st century, spanning human-computer interaction, mechanical design, education,
architecture and art practice.57
“The future is already
here. It’s just not well
distributed.”
Based on his extensive design experience, Bill Buxton warns in his recent book58 that, “We should not
count on any deus ex machina. We should not expect any silver bullets. Most of what will be invented is
already here in some form. It is highly unlikely that there will be any technology that we don’t know about
today that will have a major impact over the next 10 or 20 years.” He summarizes this central idea with a
William Gibson quote: “…the future is already here. It’s just not well distributed.”59
“Most of what will be
invented is already
here in some form.”
Changing collaboration models
Multidisciplinary research involving computing is growing. However, in many of these projects, computing professionals are not included
from the beginning. “An important role exists for computer scientists in this new multidisciplinary world as primary collaborators.
However, computing professionals are often hired at the end of the research process and merely as service personnel – as
programmers.”60
To maintain continually increasing multidisciplinary advancement, this model must change. Multidisciplinary research requiring
computational expertise is most enhanced by including a computer scientist/engineer at the outset as an integral member of the team.
New and old paradigms for computing professionals’ involvement in research are summarized in the table below.
Old model | New model
Discipline experts define specific problems | Computing professionals aid in defining the problem
Computing professionals are merely coders | Computational thinking aids in process and problem solving
Computing professionals apply available tools | Collaboration results in better solutions, new tools, and algorithms

Computing professionals add more value when included from the beginning of a project.

Table 12: Changes in computational collaboration models
Collaboration with the social sciences
Increasingly, significant progress is facilitated through close collaborations between computational and social science disciplines,
combining to work on problems that neither discipline could make progress on alone. NSF cross-cutting initiatives clearly emphasize
educational psychology. Mor Harchol-Balter, a researcher from Carnegie Mellon recently began work on an auction-based model for
scheduling the TeraGrid, which combines approaches of theoretical computer science, game theory, and economics. 61 This work not
only provides insights into theoretical mathematics, but also has strong and immediate practical value.
Despite the long-standing prejudice that the social sciences are not “real science,” recent research is quite rigorous and is revealing
important information about the human dynamics within which all progress takes place. 62 The social science domains that seem to be
involved most frequently in multidisciplinary work include:
Social Sciences and Computational Collaboration
Cognitive Psychology
Educational Psychology
Sociology
Linguistics
Economics
Anthropology
Political Science
Archaeology
Table 13: Social sciences that frequently collaborate with computational disciplines
The social sciences, which have been collecting enormous amounts of data for years, now have available to them advanced data
mining techniques that can help definitively prove or disprove conjectures that were previously intractable, vastly increasing the
definitiveness of evidence from these disciplines and their value as powerful scientific tools.
Psychological disciplines, in particular, are increasingly becoming involved as key components of many multidisciplinary investigations.
Usability researchers are beginning to equip computers so that they will be able to anticipate specific human needs, “understanding”
what humans are thinking in order to provide services to them in more appropriate contexts. This would solve the problem
characterized by the familiar complaint, “Why doesn’t the computer just do what I want?” Doing this will require a better understanding
of how people think and how those thoughts can be discerned from expressions, and it will require software built so that computers can
attend to these demonstrable attributes. A resurgence of interest in artificial intelligence research, together with advances in robotic vision, is
likely to provide some answers in the very near future. 63
Sociology has also been playing a major role in solving large multidisciplinary problems in recent years. The field of social computing
includes a broad range of focus areas, including online community as well as the computational analysis of social mechanisms. Some
researchers driving investigations in computer science theory, such as Jon Kleinberg of Cornell64 and Henry Cohn from Microsoft
Research65, are connecting with social computing experts, since the relevance of graph theory to social networking analysis has
become strikingly apparent. Interestingly, theory as applied to real problems often raises new issues, which then drive further advances
in basic research in a sort of feedback loop.66
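The connection between graph theory and social network analysis can be seen in even the simplest measure, degree centrality: how many ties each actor in a friendship graph has. The edge list in this Python sketch is hypothetical.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Count each node's ties in an undirected social graph -- one of the
    most basic graph-theoretic measures used in social network analysis."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    return dict(degree)
```

Measures like this are the starting point; the theoretical work cited above supplies deeper structural tools (clustering, small-world distances, link prediction) on the same graph representation.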
Implications for multidisciplinary computer science education
Discussions about the decline in student enrollments in computer science started around the year 2000; the decline has since been
mapped and recommendations issued.67 These conversations about the pipeline have also included issues in the areas of gender
equity and immigration law. Computer science, once viewed by some students as irrelevant to their interests, nerdy, macho, and
boring, has in the last year or two seen some significant changes. For example, the increase in multidisciplinary focus in computer
science education may be having some effect in slowing down, and in some cases reversing, declining enrollment trends in these
programs. Some smaller schools, however, are facing the possibility of having to close their Computer Science programs entirely.68
Faculty note that the rationale for multidisciplinary education is based on implicit assumptions that:
 Multidisciplinary excellence is considered a competitive advantage for students from the U.S. 69
 Multidisciplinary programs are attractive to students concerned about the relevance of their training. 70
In response to the recent decline in enrollments, some Computer Science departments have instituted significant changes. Instead of
taking the attitude that students should learn core computing principles and theory that they can apply anywhere, Computer Science
departments are actively embracing collaborations with other disciplines and driving up the relevance and attractiveness of the
discipline. For several years, a number of schools have been including robotics and gaming not only as an attractant, but also as core
elements that provide valuable tangible means for teaching the principles of computing. At some schools, changes in the syllabi and
approaches to computing education have improved the attractiveness of computer science as a major. The Georgia Institute of
Technology’s School of Computing, for instance, deconstructed its curriculum and completely rebuilt it based on a multidisciplinary
threads model. Although a number of these ideas have been in circulation in one form or another for quite some time, Georgia Tech
studied the problem, designed the solution, and implemented it, all in the space of nine months.71 It remains to be seen whether
degrees earned through this type of multi-focused coursework will be as respected as more traditional degrees; they may even be
more respected today, in an age when multidisciplinary expertise is considered highly desirable.
There is also considerable interest in defining computational education for other disciplines, which raises many questions. What type of
computer skills and techniques do the scientists and engineers of tomorrow require? Should these be taught by Computer Science
departments?72 Should all incoming freshmen be required to take some form of computing coursework to prepare them for other
disciplines, all of which can benefit from this expertise? Harvey Mudd College 73 requires computing studies for all students, and
Arizona State University74 requires informatics training for all 10,000 incoming freshmen, regardless of the students’ intended
disciplinary focus.
Focusing on these same issues, in September 2007 Microsoft Research held a workshop entitled “Computational Training for
Scientists.” This gathering was convened to examine specific problems associated with providing the proper computational education
for the next generation of researchers in biomedicine, biology, chemistry, physics, astronomy and other disciplines. This type of
education presents new challenges for science educators as they innovate and create new curricula to prepare the upcoming
generation of computational scientists. The workshop provided opportunities to discuss, learn about, and influence the development of an
interdisciplinary computational education standard.75
Creating Breakthrough Solutions for Important Problems
Global issues that drive accelerated research
What are the most important problems to address? Although prediction is difficult, it is important to discern
the future directions of research as an indicator of where to invest funding to support the most potentially
valuable research. New strategies are needed to identify tomorrow’s breakthroughs, which are urgently
needed to solve the leading worldwide challenges.76 These challenges provide strong incentives for
increasing innovation. “A crisis is a terrible thing to waste,” remarks Vinod Khosla, one of the founders of
Sun Microsystems.77 Global concerns provide strong indicators of where investments are likely to grow. Note that all of the global
problems listed in Table 14 lend themselves to computationally driven solutions.
Global Problem | Central Issues
Energy | Supply - how to get more; Efficiency - making it go further; Impact - minimizing pollution; Renewable sources - wind, geothermal, solar
Healthcare | Eradication of “big killer” diseases; Turning catastrophic illnesses into manageable ones; Affordability - providing healthcare for everyone; Adaptive technologies
Population | Providing care and support for the huge aging segment; Increasing urbanization - how to make cities more livable; Transportation - easing congestion and crowding
Education | Scale - education for all, including college; Affordability; Active vs. passive
Quality of Life | Meaningful experiences; Connection with others; Entertainment
Climate Change | Managing huge, complex information sets; Accuracy of predictions
Urbanization | Effective public transportation vs. private vehicles; Concentration of pollution; Crime control; Accelerated spread of disease in cities
Table 14: Pressing global problems
Problems unique to the 21st century
Urbanization: Many issues are exacerbated by population density in urban areas. By 2008, humanity will pass a turning point. In that
year, for the first time in history, more than 50% of all humans on earth will live in cities. In 2006 the number was 49%. 78
Climate Change: Whatever the cause(s) of climate change, the fact remains that a significant transformation is underway, threatening
much of the world’s population with drastic weather change, floods, and other natural forces destructive to agriculture, which may
endanger humanity’s means of survival. The Worldwatch Institute has stated that “Venice, Italy, might be the Atlantis of the future. By
2100, the city could be uninhabitable due to floods triggered by climate change.”79
The promise of “green computing”
Academia has responded to these problems with a variety of new programs focusing on the idea that computing can positively impact
society. The terms and emphasis vary and include “socially relevant computing”, “computing for change”, and “green computing”. The
University of Colorado has published a “Green Computing Guide”.80 Other schools call this “green” focus “sustainable computing”.
Ultimately, the needs of society are often the best predictors of change. Mounting scientific evidence has established climate change as
an undeniable fact. Science will be providing the solutions needed in this and other areas, and computing is empowering scientists in
this effort. Urgent issues such as global warming are accelerating investment in research, science, and technology and supporting
significant progress in these areas.81
Computing enables science
Future research directions are also indicated by the benefits computing brings to science and engineering, and by the specific
advantages that have proven to be highly effective for accelerating discovery. Researchers leverage specific computational assets,
which spur greater growth in these areas as demand increases.
Computational advantages for scientific discovery
Demand for these tools, typically referred to as computational enablers, has never been higher.
 Strong, reliable, secure data management
 Fast analysis
 Discovery of hidden trends (often buried in massive data sets)
 Simulations and automated investigation (examining “what if” scenarios)
 Highly integrated computing tools providing access and visualization for all data in all contexts
 Networks of all types
 Mobile devices
 Robotics, in the broadest sense, including techniques based on automated vision
Table 16: Specific benefits computing brings to science
While computing currently brings significant advantages to scientific research, many more advances are needed in a number of areas
to maintain the current accelerated rate of discovery and problem solving. This is becoming a very pressing issue because many of
these problems must be solved in less than 50 years. In fact, some of the more critical trends must be changed even sooner or they
will become irreversible.
Next Steps for Computational Advancement
Creating the next generation of computational tools is vital for scientific progress.
 Development of advanced new data mining tools and algorithms
 More speed: leveraging the advantages of manycore processors
 Practical concurrent / parallel programming for new architectures
 Next generation embedded systems and networks
 Furthering of deep multidisciplinary collaborations
 Specialized computing tools for specific disciplines and combinations of disciplines
Table 17: Computing advances necessary to maintain the current rate of scientific progress
The need for fast discovery
The impetus for an aggressive research agenda likely to produce breakthrough results originates in the desire for quick,
high-impact results valuable to science, industry and society.
 Accelerated innovation is needed to address the most urgent problems facing the world today.
 Traditional research leads to innovation; however, breakthroughs accelerate this process.
 Industry is interested in discovering “the next big thing”.
 Citizens worldwide want scientific breakthroughs that will improve access to adequate food, water, housing, healthcare, security, satisfying life experiences, and the means for connecting to others.
 Researchers need to know, “What might we be missing?”
Breakthroughs are nearly impossible to predict
Current global problems will require more than incremental changes; transformational progress
will be needed. However, most transformational discoveries are by their nature largely
unpredictable. The single most important characteristic common to breakthrough discoveries is, in
fact, that they were unforeseen. Add to this the fact that nearly all transformational change results
from breakthroughs. In a sense, “the next big thing” is nearly unpredictable. In his definitive
treatment of uncertainty, “The Black Swan: The Impact of the Highly Improbable”, Nassim
Nicholas Taleb asserts, “The inability to predict outliers implies the inability to predict the course of
history.”82 In essence, to create awareness of these opportunities requires us to develop means
beyond traditional statistical and trend analyses, focusing on outliers rather than on the data that
produces familiar visualizations of growth.
Taleb advises approaching this conundrum like an investment portfolio, allocating about 15% of
the total allotted resources to as many high-risk, potentially high-gain investments as possible,
thus expanding the possibilities for breakthrough discoveries without discarding the security of
lower-gain, but safer research. Since most of what will be invented is already here in some form,
breakthroughs are usually based on reuse, novel recombination and repurposing of elements
already at hand.
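Taleb’s portfolio analogy can be made concrete with a toy simulation. The payoff probabilities and multiples below are invented purely for illustration (they come neither from the book nor from any funding data); the sketch only shows why adding a slice of low-probability, high-payoff projects can raise the expected return of a research portfolio without giving up the safe base.

```python
# Toy Monte Carlo sketch of a ~15% "barbell" research allocation.
# Safe projects return a modest, near-certain gain; risky projects
# usually fail completely but occasionally pay off enormously.
# All probabilities and payoffs are invented for illustration only.
import random

random.seed(7)

def portfolio_return(risky_fraction, trials=100_000):
    """Average multiple on the total budget over many simulated years."""
    total = 0.0
    for _ in range(trials):
        safe = (1 - risky_fraction) * 1.05           # safe research: +5%
        # risky research: 95% chance of total loss, 5% chance of a 30x breakthrough
        risky = risky_fraction * (30.0 if random.random() < 0.05 else 0.0)
        total += safe + risky
    return total / trials

for frac in (0.0, 0.15):
    print(f"{frac:.0%} risky -> average multiple {portfolio_return(frac):.2f}")
```

Under these made-up numbers the all-safe portfolio returns 1.05x, while the 15% barbell averages roughly 1.12x; the point is not the specific figures but that the downside is capped at the risky slice while the upside is unbounded.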
Anticipating where breakthroughs are likely
Despite difficulties in foretelling the exact form that breakthroughs will take, there are ways to identify the general areas where the most
transformative new work will occur, as listed in Table 18.
Identifying breakthrough research areas
Sources | Process
Educated Opinions | Gather ideas from thought leaders most knowledgeable about academic research
Domains to watch | Identify emerging areas, following latest developments in promising directions
Researchers (known) | Focus on top innovators, watching current directions of the best researchers
Researchers (unknown) | Attract new innovators, encouraging their best, most daring new ideas
Current research | Track for infrequent but exceptional progress in existing areas
Table 18: Identifying the likely origins of breakthrough research
We have already examined the first two of these sources for identifying potential areas for breakthrough research by gathering the
opinions of academic thought leaders, and by identifying the key areas of research receiving the most funding, and an analysis has
shown that both of these sources are in general agreement. The next approach involves examining innovative individuals, rather than
innovative areas. Some of these are well known, while others are working “off the radar”, often in areas not receiving much attention or
funding.
Observing current top innovators
While it is true that 10 percent of scientists write 50 percent of scientific articles,83 the scientists most
likely to make significant research breakthroughs conduct the largest number of total investigations.
These investigations produce mixed results: some projects lead to significant research discoveries, and
some do not produce much useful information. Despite mixed outcomes, successful innovators just keep
having ideas.84 Linus Pauling, when asked by students how he got good ideas, said,
“My answer is simple: First, have a lot of ideas. Then, throw away the bad ones.”85 Sir Isaac Newton spent many years on alchemy,
now known to be as unscientific as astrology and numerology.86 When identifying potential innovators, the obvious first questions are,
“Who has written the most articles in the past three years?” and “Whose work has been cited the most?” The answers to these
questions will reveal information that is already widely known, enumerating those who are already successful and well-funded.87
Unknown next-generation innovators
Another very promising approach for identifying potential future innovators is to issue a broad call for proposals in new, uncrowded
areas of research that have great promise, and that are not currently well-known or well-funded. This method can be used to identify
unknown but talented people who have not been introduced to us by the usual thought leaders and researchers. Disruptive
technologies and other breakthrough discoveries are transformative precisely because they occur outside the current focus, and are
often driven by people who are not established researchers. The people identified by this method may not have many publications, but
may have uniquely useful insights and attractive team-oriented attributes. We know several important facts about inventive and
multidisciplinary innovators:
 Productive innovators are also effective collaborators.88
 Innovation is almost always the product of collaboration, despite myths about lone genius inventors.89
 Creativity increases with collaboration across multiple disciplines.90
 New researchers are as interested in creating a good network of relationships as in obtaining necessary funding.91
Prime candidates for research in little-known areas will often display some of the teamwork attributes listed in Table 19.
Attribute | Team Focus
Research style | Highly effective with others as part of cooperative team
Discipline focus | Collaborates readily across subject boundaries
Group dynamic | Desires to leverage diverse strengths of team members
Social intelligence | Understands the value of relationships and cooperative networks
Primary motivation | Passion for the area under investigation
Table 19: Profile of individuals likely to produce breakthrough research92
Funding in little known areas with new researchers can sometimes be relatively inexpensive (despite assumptions about the high cost
of funding breakthrough research) because they may have several motivations for doing the research and they may require relatively
modest investment to get started.93
New innovators and “risky” research
Another way to identify key innovators is through programs that identify and reward “rising star” performers in research, as occurs
with the Microsoft Research New Faculty Fellowship.94 (“Risky” in this sense refers to scenarios for discovery with low probability of
success but potentially high reward.) This program garners recommendations from academic leaders at universities and yearly
awards $200,000 each to five exceptional individuals based on their potential for producing results of profound impact in
computational areas, often multidisciplinary areas in which computing is at the core. This enables them to engage in somewhat
riskier research, since they are freed for a while from the constant search for new funding, which tends to confine their work within
the narrow definitions of current funding initiatives. The rationale for this type of award is that it provides the space and time needed
to create truly transformational breakthroughs. Ultimately, the best way to increase exposure to high-impact breakthroughs is to
target a small but significant percentage of funding resources at riskier, but potentially more rewarding, research without
abandoning all efforts in more traditional areas.
Innovation uptake: fuel cells in China before the U.S.?
GM’s head of product development says “investment in and enthusiasm for
clean technologies in Asia is so great that cars powered by fuel cells are likely
to take off in China before they do in the United States.”95 Developing countries
already have higher levels of “early stage” entrepreneurship, with more people
starting new ventures – often because there is greater necessity driving the
urgency.
Many executives feel the heat is on and that they must innovate faster just to
stand still, because production cycles are getting so much shorter. Gil Cloyd,
chief technology officer at Procter & Gamble (P&G), examined consumer goods
life cycles in the U.S. over a ten-year period. He found that life cycles had fallen
by half; now firms believe they need to innovate twice as fast.96
3M is an American company famous for its culture of innovation, and it also
believes the world is moving much faster. Andrew Ouderkirk, one of 3M’s
famous inventors, points out that many activities the company once performed
in-house are now outsourced. 3M is now trying to shorten development times by
engaging in “concurrent development”: talking to customers much earlier in the
product life cycle to accelerate new product innovation.97
Table 15: “Necessity is the mother of invention” everywhere
Spontaneity takes time
Although highly innovative research is valued for sudden insights and the rapid progress it enables, the reality is that research can
proceed for quite some time without transformative solutions, which are sometimes not apparent until the third year of investigation.98
One version of the law of unintended consequences99 claims that for every six outcomes, one unpredictable outcome will occur.
Although this is not actually a rule, the point remains that research can produce unexpected outcomes with greater impact
than those sought originally. The history of science and technology is replete with well-documented findings of this type, ranging from
the discovery of penicillin to the unintended medical results and product opportunities behind Rogaine and Viagra.100
Snapshots of Recent Developments in Various Domains
Life Sciences
The next disciplines to be transformed by the data integration revolution
It is important to note that in terms of future trends, the life sciences are becoming the next
set of disciplines to be transformed by the data integration revolution that Jim Gray was
instrumental in starting. This is because the life science community is one of the most
motivated to do data integration and to build data repositories similar to “Sky Server”.101
(Sky Server is a website presenting data from the Sloan Digital Sky Survey, a project to
map a large part of the universe.102) They are also highly motivated to create digital
libraries.103 The recent 4th International Workshop on Data Integration in the Life Sciences (DILS), held in Philadelphia on
June 27-29, 2007, included strong industry and foundation participation.104 The results from this conference are available online on the
BioInform site.105
Getting organized: workflows, provenance and semantics
The three-day workshop, hosted by the University of Pennsylvania’s engineering department and genomics institute in collaboration
with Microsoft Research, was centered on several key themes, focused on helping researchers combine and analyze disparate data
sets and tools: namely workflows, provenance, and the importance of semantics and controlled vocabularies for data annotation. Topics
ranged from neuroscience to proteomics to single nucleotide polymorphism (SNP) data analysis and beyond, all centering on data
integration. Experiments in workflow design leverage the interactive functionality of social networking sites to create a collaboration
environment for scientists looking to share their work with peers. Interesting news from this domain includes:
 The Allen Institute for Brain Science (Seattle, WA) is building a mouse brain image databank and is hoping to branch out to human
subjects through collaborations with other organizations.
 The first U.S.-based HealthGrid meeting is being organized to take place in 2008.106
 Zoé Lacroix of Arizona State University is working on building a meta-life science data repository to link together life science
databanks.107
Breakthrough Research Samples from the Edge
Some research projects are sufficiently out of the ordinary that they defy simple categorization. The following are some examples of this
kind of research.
Symmetry Studies have become the major focus of researcher Yanxi Liu at Pennsylvania State University. This project uses robotic
vision to detect asymmetry, uncovering previously difficult-to-ascertain facts such as whether a person is lying, diagnosing
schizophrenia, and detecting dangerous physical abnormalities that escape usual methods of inspection.108
Protein Folding is an area receiving increasing attention. The enormous complexity of the
3-dimensional configurations that proteins can take is a central focus in the development of new
medicines and treatments. Lydia Kavraki, a computer scientist at Rice University is collaborating with
medical school colleagues to effectively accelerate this process computationally. This has had the
unexpected outcome of identifying and saving viable configurations that the chemists had already
erroneously discarded as impossible.109
Virtual Worlds for Psychotherapy: Hunter Hoffman from the University of Washington has
produced a number of immersive simulations, which have been successful in pain mitigation and
eliminating phobias. Snow World,110 a recent project, is proving useful in alleviating post-traumatic
stress disorder. Burn victims who are too young for pharmaceutical medication have been able to endure painful therapy for skin
recovery while immersed in an engaging, wintry wonderland complemented by Paul Simon’s music. New York citizens who have
blocked out the painful memories of 9/11, resulting in significant personal debilitation, have been able to relive and participate in
those traumatic events, accompanied by a therapist, to speed their recovery. Even people afraid of spiders can conquer arachnophobia
through a gradual introduction via simulations and desensitization therapy.
A Game to Cure AIDS describes a recently funded project proposed by Zoran Popović 111
from the University of Washington. In a different approach to the protein-folding problem, he
is leveraging a user-assisted optimization algorithm to leverage networked humans to solve
these essentially three-dimensional puzzles. This is something that human beings tend to do
fairly well. He has thus combined approaches to gaming, biomedical research, graphics, and
human computation,112 a field initiated by Luis von Ahn of Carnegie Mellon.
Programmable Matter is under investigation in fairly unique ways by Seth Goldstein at Carnegie Mellon University, where he is
designing experimental systems of nanobots, barely visible to the human eye, that can reconfigure themselves into the shape
of humans or other objects through electromagnetic and other means of manipulation. He is exploring how to build and use a novel form of
programmable matter, which he calls claytronics. This system is made up of an ensemble of mobile, inter-communicating computational
units, each of which can be viewed as a modular robot, and can then be programmed to take on an arbitrary 3D shape in the physical
world. One of the main challenges in the claytronics project, and similarly for any system of modular robots, is to develop a method to
program the ensemble to carry out a given task. One of the obstacles faced when programming an ensemble of modular robots is that
the programmer must achieve a global goal for the ensemble by managing the collection of programs running on each robot. 113
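The ensemble-programming challenge described here can be caricatured in a few lines: every simulated unit runs the same local program, yet a global shape emerges. This is only a toy sketch under invented assumptions (a 2-D grid, pre-assigned target slots, and no collisions between units); it is not the claytronics project's actual programming model.

```python
# Toy sketch of programming a modular-robot ensemble: each simulated
# "catom" runs the identical local rule (step one grid cell toward its
# assigned slot), and the ensemble converges on a global shape.
# The grid model, positions, and target shape are invented; real
# claytronics must also handle collisions, which this toy ignores.

def step_toward(pos, target):
    """One local move: close the gap to the target by one step per axis."""
    return tuple(p + (t > p) - (t < p) for p, t in zip(pos, target))

# Ensemble state: current position -> assigned slot in the goal shape (an "L").
catoms = {(5, 5): (0, 0), (7, 1): (0, 1), (2, 8): (0, 2), (6, 6): (1, 0)}

rounds = 0
# The global goal is reached when the occupied cells equal the target cells.
while set(catoms) != set(catoms.values()):
    catoms = {step_toward(pos, tgt): tgt for pos, tgt in catoms.items()}
    rounds += 1

print(f"shape formed after {rounds} rounds")
```

Even in this trivial setting, the hard part the text identifies is visible: the programmer specifies only the per-unit rule and the goal assignment, and must reason about how thousands of identical local programs compose into the desired global behavior.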
Musculo-Skeletal Conduction for Secure Data Communication, Cost-Effective Diagnosis, and User Authentication with
Integrated Mobile Computing is a recently funded proposal from Michael Liebschner. He intends to design and realize a human-body-based communication technology using mobile devices for applications in body-area communication, user authentication, and
disease diagnosis. This technology will be used to excite the musculo-skeleton system with acoustic wave patterns and detect its
response, and will be implemented on cost-effective, standard mobile phones, an approach with potentially high impact for mobile
healthcare.114
These are just a few samples of very original investigations contemplated or underway. They do not represent all highly innovative
research available today, of which there are many fine examples distributed across leading research institutions.
Robotics Entering the Mainstream
Many directions in computing research are now so complex that it is possible to refer to them as core, applied, or multidisciplinary
depending on the context. Robotics is one of these, spoken of in terms of embedded systems, networking, computer vision,
electromechanical systems, or artificial intelligence, among others, depending on the approach being taken and the background of the
speaker. Suffice it to say that robotics encompasses all of these and more, and could be said to be finally coming of age. Robots have
long proven their value in areas such as industrial manipulation, are appearing regularly as increasingly sophisticated toys every
Christmas, are standard equipment on the battlefield and in selected operating theaters, and are now entering the mainstream of our
lives in applications such as assistive care, intelligent vehicles, and even vacuum cleaners.
U.S. Congressmen Zach Wamp (R-TN) and Mike Doyle (D-PA) announced a Congressional Caucus
on Robotics to convene in September 2007. The discussion will center on the usual topics, for
instance robots on the factory floor, and will probably include some starry-eyed futurism. According to
the press release, this will include “recent technological advances that enable robots to perform functions beyond traditional assembly
line tasks and to operate in environments beyond the factory floor.” “Today, [robots] are also being used to defend our nation, perform
surgery, fill prescriptions, deliver supplies and materials, and even as tools to educate our children,” said Doyle, “so it is important that
we create a forum by which Congress can familiarize itself with the impact this first great technology of the 21st century is likely to have
on the lives of all Americans.” Congressman Wamp added, “The increase in the number of emerging and potential applications for
robotics is astounding.”115
This public governmental recognition of robotics as a major new area of development strongly resonates with a recent Scientific
American article by Microsoft Corporation chairman Bill Gates, “A Robot in Every Home”, in which he stated his belief that the robotics
industry is developing in much the same way that the computer business did 30 years ago.116
Interestingly, the aging of the population in developed nations, mentioned earlier as one of the big global challenges, may drive
enormous business opportunity and spawn many companies developing helpful robots for the elderly. Exploding healthcare
costs in general are also a key driver in seeking the high availability (24x7x365) and capabilities of robotic assistive care.
This diversity of opportunity and technology underscores the value of investing in many small projects early in order to increase
exposure to the possibility of the “next big thing”. For instance, the External Research & Programs group of Microsoft Research began
investigating robotics futures with academic research partners more than five years ago. The results of this were pivotal in the formation
of the Robotics product incubation group at Microsoft and its Microsoft Robotics Studio software platform, which has been designed for
researchers and commercial robotics engineers alike.117 It also directly led to such programs as the Institute for Personal Robots in
Education hosted at Georgia Tech with Bryn Mawr College, which is researching the challenge of applying robotics technology to
address attraction and retention in beginner Computer Science classes. These efforts are milestones on a path that we expect to lead
to robots into the mainstream as envisioned in Bill Gates’ article.118
Taking Computer Gaming to the Next Level
Application of gaming to non-traditional gaming areas
It would be a serious omission not to include gaming in any discussion of new developments in computer research and technology.
Interestingly, gaming is appearing as a central theme in conferences not even focused on games per se. Beyond the usual focus on
gaming at SIGGRAPH, the Game Developers Conference (GDC), and a few other gatherings, several AI, medical, and security
conferences have included a strong focus on gaming, simulations, and the use of PCs and Xboxes as their research platforms. SIGCSE
(ACM’s Special Interest Group on Computer Science Education) now devotes a much larger portion of the agenda to gaming and
gaming-related topics at its annual conference.
These have included discussions on game-like simulations for training doctors and emergency responders, and how to prepare for
counterterrorist activity. With the strong focus on gaming in both consumer and academic arenas, the time is ripe to improve the state of
the art.
Host | Conference
Dartmouth Schools of Engineering & Medicine | Polytrauma and Pandemic Conferences 2006119 (focus: virtual medicine)
Aligned Management Associates | Medicine Meets Virtual Reality (San Diego)
Dartmouth School of Engineering | Conference on Adversarial Reasoning (select intelligence community experts only)
Association for the Advancement of Artificial Intelligence | AAAI-07 and IAAI-07 Conferences
ACM Special Interest Group for Computer Science Education | SIGCSE 2007
Table 21: Several non-gaming conferences with strong gaming focus
Gaming aggressively drives computer requirements
Computer Gaming and the technology on which it is built are appearing everywhere in academic research. Gaming has long been
known to push the boundaries of computer hardware, driving the state of the art in consumer graphics and audio, and placing heavy
stress on all computer resources, from network capacity to memory to storage to display. In fact, John Dvorak claims that “Real
progress in desktop computing always stems from gaming.”120
The consumer hunger for the best computer game system is relentless and hardware vendors (console and PC) have responded in
terms of high end PCs, dazzling consoles and other devices, including PDAs and cell phones. Computer Gaming also provides an
ideal focus for advancing Computer Science Research.
Gaming + Internet expansion → dramatic growth in computer usage
Gaming has already captured high interest in academic and corporate research. It is also driving consumer interest in the internet worldwide. Among the top five countries in internet usage, China has emerged as the second largest, with 137 million users. Following China are Japan (86 million), Germany (51 million) and then India (40 million), which recently passed the UK (37 million Internet users).
Although currently in the lead with 211 million total Internet users, the U.S. registered only a marginal growth rate of 2% last year (2006). China’s usage, on the other hand, is expected to keep growing rapidly for the foreseeable future and, by the end of 2007, will surpass the U.S. in total number of Internet consumers.121 The rapid growth of gaming, internet participation and computer use is establishing a new economic reality in the world market. It is now economically viable to consider broader roles for this combination of technologies: beyond entertainment, gaming has great relevance in education and other socially relevant computing applications, and represents a large potential growth area.
Next step: serious games
As networking expansion fosters a new age of gaming and entertainment, serious gaming has taken hold in three major venues: games for learning, games for health and games for training.
The emerging area of “Games for Learning” has seen great advances in games for learning math and for language acquisition. The opportunity is especially strong in China, where both internet usage and gaming are dramatically on the rise. The Chinese government is seeking proposals for games for language training (both to learn Mandarin and to learn English).122 Similarly, and in some ways inspired by the Chinese, institutes in the U.S. are investigating the use of gaming to significantly strengthen math skills.123
Games scale beautifully and have the potential to draw vast numbers of consumers. World of Warcraft engages 9 million players worldwide, and Xbox usage has grown to 7 million as this report is being written.124 Governments, educators, a few game producers, and a number of consumers are looking for the next great examples of serious games (or games for learning).
Table 22 contains only a partial listing of the serious games titles available and under development. Building on this momentum, game producers are now beginning to focus more attention on the genre of serious games, which includes games with an instructive, therapeutic, or otherwise beneficial impact. For instance, applications in health and fitness have shown that a game can make both exercise and physical therapy less painful while decreasing the tedium of these activities.
Game | Benefit | Specific use | Innovator or sponsor
Dance Dance Revolution | Fitness | Aerobic exercise | Konami
Yourself! Fitness | Fitness | Personal trainer | Respond Design
EyeToy | Fitness | Aerobic exercise | Sony
Wii Sports | Fitness | Sports simulator | Nintendo
Second Life | Health | Treatment for Asperger’s Syndrome | Linden Labs
Snow World | Health | Pain mitigation for burn victims | University of Washington HIT Lab
Everquest II | Language acquisition | Learning French, ESL | Sony Entertainment125
Levantine Iraqi | Language acquisition | Learn Iraqi Arabic | USC/ICT and Alelo
DimensionM | Learning | Algebra | Jim Gee and Tabula Digita
Under consideration | Learning | Algebra | Microsoft Research, U.S. NAS
Brain Academy (series) | Learning | Math, English and other skills | Nintendo
Darfur is Dying | Learning | Social awareness | University of Southern California
Language, Proof and Logic | Learning | Discrete math, symbolic logic | Stanford
Table 22: Serious games: an industry poised for explosive growth
The role of gaming in the future direction of computing
Dozens of universities have established BS or MS degrees in gaming. This is a deep trend and not just a fad. In the category of coordination, described as one of the seven Great Principles of Computation,126 a game is a model for rules of interaction governing complex adaptive social-technical systems. James Carse points out that “There are at least two kinds of games. One could be called finite, the other, infinite. A finite game is played for the purpose of winning, an infinite game for the purpose of continuing the play.”127
Carse’s finite game bears a striking resemblance to the notion of closed (terminating) computation, and his infinite game to open (non-terminating) computation. Currently we are moving from the study of closed computations to open ones, and engaging in new fields as infinite rather than finite games. For example:
• Theoretical computer science is moving from closed computation toward interactive computation.
• Considerable information on the Web is provided through databases that cannot be queried by search engines. Social science researchers believe this is possible if approached as a game built on policies.
• Evolving knowledge communities have become research laboratories for innovation.
• The Web and the Internet are both infinite games opening up new areas of science due to computation.
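Carse’s distinction can be made concrete with a minimal sketch (my own illustration, not drawn from any of the cited works): in programming terms, a closed computation is a function that runs to completion and returns, while an open computation is an ongoing interaction with no built-in end.

```python
# A sketch contrasting closed (terminating) and open (non-terminating)
# computation, echoing Carse's finite vs. infinite games.

def finite_game(numbers):
    """Closed computation: runs to completion and returns a final result."""
    return sum(numbers)

def infinite_game():
    """Open computation: an interaction that never 'ends' on its own.
    Each send() is a move; each yield is the game's response."""
    total = 0
    while True:                 # no termination condition: play continues
        move = yield total      # respond with the running total, await a move
        total += move

print(finite_game([1, 2, 3]))  # -> 6, and the computation is over

game = infinite_game()
next(game)                     # start the interaction
print(game.send(1))            # -> 1, and the game is still in play
print(game.send(5))            # -> 6, interaction can continue indefinitely
```

The point of the sketch is that the open computation has no result “at the end,” because there is no end; its value lies in the ongoing exchange, which is the sense in which the Web and the Internet are infinite games.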
Examples from biology, physics, materials science, economics, and management science show that these fields have moved beyond computing as a description of their information processes to a malleable generator of ongoing new behaviors.128 To underscore the growing seriousness and importance of computer gaming, the July 2007 issue of the Communications of the ACM includes a feature on “Creating a Science of Games” comprising seven articles by thought leaders in this area:129
• “Creating a Science of Games,” by Michael Zyda, Communications guest editor
• “Games for Science and Engineering Education,” by Merrilea J. Mayo
• “Games for Training,” by Ralph E. Chatham
• “How to Build Serious Games,” by H. Kelly, K. Howell, E. Glinert, L. Holding, C. Swain, A. Burrowbridge and M. Roper
• “Carnegie Mellon’s Entertainment and Technology Center: Combining the Left and Right Brain,” by Randy Pausch and Don Marinelli
• “Using Storytelling to Motivate Programming,” by Caitlin Kelleher and Randy Pausch
• “Real-Time Sound Synthesis and Propagation for Games,” by N. Raghuvanshi, C. Lauterbach, A. Chandak, D. Manocha and M. C. Lin
Is Computing a Natural Science?
Two divergent opinions
Natural science is defined as “A science, such as biology, chemistry, or physics, that deals with the objects, phenomena, or laws of nature and the physical world.”130 In the preface to the 1996 edition of his lectures on computation, Richard P. Feynman asserted that “Computer science also differs from physics in that it is not actually a science. It does not study natural objects… Rather, computer science is like engineering – it is all about getting something to do something.”131 Feynman, who originally composed his lectures in the 1980s, focused on the limitations of computing and computing machinery; in the current era, however, computers possess vastly superior capabilities compared to those Feynman discussed. In the July 2007 issue of Communications of the ACM, Peter J. Denning disagrees. Denning, a past president of ACM and currently director of the Cebrowski Institute for Innovation and Information Superiority at the Naval Postgraduate School in Monterey, points out that “Computing is not – in fact, never was – a science only of the artificial.”132
The study of natural and artificial information processes
Denning goes on to note that scientists from many fields have been saying they have discovered information processes in the deep
structures of their fields.133 Nobel Laureate and Caltech President David Baltimore asserts in “The Invisible Future” that “Biology today
has essentially become an information science. The output of the system, the mechanics of life, are encoded in a digital medium and
read out by a series of reading heads.”134 There are analogous stories for physics and many other disciplines. In fact, Stephen Wolfram
in “A New Kind of Science” states that nature is written in the language of computation.135 The old definition of computer science – the
study of phenomena surrounding computers – is now obsolete. Computing is the study of natural and artificial information processes. 136
A top-level framework of seven (overlapping) categories has been devised by Denning and others to develop the basis for mapping
essential computational principles that cut across many technologies.
Framework
Computation: meaning and limits of computation
Communication: reliable data transmission
Coordination: cooperation among networked entities
Recollection: storage and retrieval of information
Automation: meaning and limits of automation
Evaluation: performance prediction and capacity planning
Design: building reliable software systems
Table 23: Framework for great principles of computing137
Intractability (Computation)
Summary: Over 3,000 key problems in science, engineering and commerce require more computation, even for small inputs, than can be done in the life of the universe.
Examples: Searching for optimal solutions. Parcel delivery. Truck transportation. Taxi routing. Airline routing. Traveling salesmen. Knapsack packing. Bin packing. Tiling a plane. Scheduling (industrial engineering).

Compression (Communication)
Summary: Representations of data and algorithms can be significantly compressed and the most valuable information recovered later.
Examples: Compression of voice (MP3, MP4, AAC), images (JPG, GIF), files (ZIP). Fourier transform. Operation of the cochlea in the ear. Morse code.

Choosing (Coordination)
Summary: An uncertainty principle: it is not possible to make an unambiguous choice of one of several alternatives within a fixed deadline.
Examples: Hardware that never crashes while responding to interrupts. Mutual exclusion. Deadlocks. Traffic control. Telephone and network routers. DNA sequencing. Free will (psychology).

Locality (Recollection)
Summary: Computations cluster their information and recall actions into hierarchically aggregated regions of space and time for extended periods.
Examples: Virtual memory. Hardware caching. Web caching. Interconnection structures in parallel machines. Functional brain cell clusters. Nearly decomposable economic systems.

Search (Automation)
Summary: Finding a pattern or configuration in a very large space of possibilities.
Examples: Genetic algorithms. Evolutionary computing. Branch and bound. Gradient search. Punctuated equilibrium (biology). Genetic evolution. Passing of genes to descendants.

Bottlenecks (Evaluation)
Summary: Forced flow laws: in any network, the throughput at any node is the product of the network throughput and the visits per task to the node.
Examples: Saturation and bottlenecks in computer networks. Fast-propagating urban gridlock. Assembly lines (industrial engineering).

Hierarchical Aggregation (Design)
Summary: Larger entities are composed of many smaller ones.
Examples: OS and network software levels. Information hiding. Modularity. Abstraction. Ladder of scale (astronomy and physics). Functional organs (biology). Fractals.

Table 24: Examples of the application of principles of computing138
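The intractability principle in Table 24 can be illustrated with a short sketch (my own, not from the report or from Denning’s article): brute-force search over traveling-salesman tours grows factorially, so even modest inputs exceed any feasible amount of computation.

```python
# Brute-force traveling salesman: an instance of the Intractability principle.
# The search works for tiny inputs but the tour count grows factorially.

import math
from itertools import permutations

def tour_count(n):
    """Number of distinct undirected tours over n cities: (n - 1)! / 2."""
    return math.factorial(n - 1) // 2

def brute_force_tsp(dist):
    """Length of the shortest round trip starting at city 0, by trying
    every possible ordering of the remaining cities."""
    n = len(dist)
    best = None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if best is None or length < best:
            best = length
    return best

# A tiny symmetric distance matrix: exhaustive search is instant here...
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
print(brute_force_tsp(dist))   # -> 23

# ...but the search space explodes with the number of cities:
print(tour_count(5))    # -> 12
print(tour_count(20))   # -> 60822550204416000
```

At twenty cities there are already some 6 × 10^16 tours, which is why such problems are attacked with the heuristics listed under the Search principle (branch and bound, gradient search, evolutionary methods) rather than by enumeration.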
Since information processes have been discovered in the deep structures of most fields, unlocking the mysteries of those processes will undoubtedly open the way to many advances in those fields. The principles of computation are helping to pick the lock.139
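The forced-flow law behind the Bottlenecks principle also lends itself to a direct calculation. The sketch below is my own, with hypothetical numbers chosen only for illustration: if the system completes X tasks per second and each task visits node i an average of V_i times, then node i’s throughput is X_i = X · V_i, and the node with the largest per-task service demand saturates first.

```python
# The forced-flow law (Evaluation principle): X_i = X * V_i for every node,
# and the node with the largest demand D_i = V_i * S_i is the bottleneck.

def node_throughputs(system_throughput, visits_per_task):
    """Apply the forced-flow law X_i = X * V_i to each node."""
    return {node: system_throughput * v for node, v in visits_per_task.items()}

def bottleneck(visits_per_task, service_times):
    """The node with the largest service demand D_i = V_i * S_i saturates first."""
    return max(visits_per_task,
               key=lambda n: visits_per_task[n] * service_times[n])

# Hypothetical system: 10 tasks/sec; each task makes 1 CPU visit,
# 3 disk visits and 2 network visits on average.
visits = {"cpu": 1.0, "disk": 3.0, "net": 2.0}
print(node_throughputs(10.0, visits))  # {'cpu': 10.0, 'disk': 30.0, 'net': 20.0}

# With per-visit service times (seconds), the disk has the highest demand
# (3 visits x 0.03 s = 0.09 s per task), so it saturates first.
print(bottleneck(visits, {"cpu": 0.05, "disk": 0.03, "net": 0.01}))  # disk
```

This is the basis of capacity planning: total throughput cannot exceed 1 / max_i(D_i) tasks per second, about 11.1 in this made-up example, no matter how fast the other nodes become.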
Conclusions
Urgency and opportunity accelerate innovation
Times of crisis engender the need for greatly accelerated innovation. For instance, during World War II, the Manhattan Project140 produced the atomic bomb, which ultimately led to the end of the conflict with Japan. On May 25, 1961, in response to cold war competition with the Soviet Union, an exhortation by John F. Kennedy launched the “race for space” and established the goal of putting a man on the moon by the end of the decade.141 This resulted in a national obsession in the United States with educating scientists and engineers.
The power and advantages of computing have never been more critically needed than they are today. The year 2005 qualified as the
hottest in recorded history.142 It is well-documented but not widely realized that 90% of all of the ocean’s large fish have been depleted
(“fished out”).143 Meanwhile, the human population is expected to reach eleven billion by 2050.144 There is little doubt that these grand
challenges of epic proportion also require firm resolve at the national and international level. Meanwhile, scientists, who are expected to
provide the necessary solutions, must increasingly rely on advanced computational approaches to accelerate discovery.
Computers complement human ingenuity
Computers calculate and human beings invent. Making computers more powerful is the manycore goal: many processors working quickly, seamlessly and in parallel. Increasing human inventiveness means empowering multiple human brains to work quickly and seamlessly in parallel. This, in fact, describes the next age of human collaboration. Indeed, in addition to providing the best “number-crunching” available, networked computers today have become the consummate medium for communication and collaboration.
Computers extend human abilities in two broad classes of activity:
Computers extending human abilities
Mathematical / Computational advantages:
• High-speed calculation and numerical analysis
• Management, search, easy access and manipulation of enormous amounts of data
• Robotic control to enhance surgery, fly airplanes and spacecraft, perform activities dangerous to humans, enable remote laboratory access, and automate many other activities
• Rapid generation of complex simulations for examining multiple “what if” scenarios
• Powerful and infinitely flexible visualization tools
Communication / Collaboration advantages:
• Instantaneous connection with other researchers, anytime, anyplace, synchronously or asynchronously
• Collaboration spaces designed for simultaneous group sharing and manipulation of messages, images, sounds and scientific data of all types
• Easy integration and synthesis of petabytes of digital information from multiple domains, deriving from virtually any source, made available to research teams worldwide
Table 25: Computers increase productivity and inventiveness
Popular myths must be abandoned
In today’s world, the highly innovative enjoy the greatest competitive advantage. (Of course, this was also true in yesterday’s world.) Anything promising to accelerate the creation of more “breakthroughs” has huge appeal. In reality, the idea that achieving breakthrough findings requires a lone genius working intensely in a back room is a delusion, a myth.145 Teams of collaborators create breakthroughs. Edison did not invent the light bulb alone; the Wright Brothers did not “invent” the airplane (though their movable rudder was a great insight); and Einstein did not create the theory of relativity in a vacuum; there were many other great thinkers whose ideas he built upon. The idea that breakthroughs accelerate progress in research is true but also an oversimplification: breakthroughs often come only after many years of research, while popular culture focuses narrowly on the breakthrough moment captured in history books.
Creating more opportunities for breakthrough innovation
A number of practical and useful techniques for encouraging creativity and increasing the rate of discovery are available. These are neither mysterious nor experimental. In fact, many strategies to increase innovation have been in practice, with some success, for some time. These include increasing funding, having great patience for long research efforts, investing in new researchers and exploring little-known areas. Removing the word “failure” from discussions about innovative attempts would contribute greatly to a better culture for supporting innovation. Breakthroughs result from unexpected connections; increasing these accelerates innovation.
Increasing exposure to the unexpected can be quite productive, with significant supporting evidence. 146 Assembling teams that
embody the most beneficial attributes for creativity can greatly enhance the likelihood of useful innovation. Increasing the likelihood of
unforeseen intersections of ideas is a powerful technique that plays to the advantages pointed out by Frans Johansson in
“The Medici Effect.”147
Guidelines for increasing innovation
Although innovation, and especially breakthroughs, can be difficult or even impossible to predict, one strategy is to optimize for the conditions in which they occur. Team environments that engender increased innovation can be deliberately constructed and then monitored for results. Much of this depends on encouraging the team dynamics most helpful for creative collaboration, enumerated in Table 26.
To optimize collaboration and increase breakthrough innovation:
1. Increase the potential for the unexpected intersection of ideas and approaches
2. Encourage broad diversity of expertise, approaches and cultural perspectives across the team
3. Engage highly multidisciplinary teams to increase cross-fertilization of ideas
4. Generate as many new ideas as possible
5. Adopt tolerance for failure: high innovation generates ideas both good and bad
6. Leverage collaboration tools in every way possible to enable rapid group problem solving
Table 26: Team dynamics for increasing the rate of discovery
Successful innovation is a combination of many small sparks. As the need for rapid innovation increases, it is important to engage in
riskier research areas, encouraging many small investments in many places. To do this requires courage and resolve, since significant
pressure still exists to continue in established paths proven to create incremental change.
Promising prospects for significant positive change
We live in an exciting time for both science and engineering. This is especially true for the computing disciplines, which are expected to lead the way. The decline of interest in the computing professions seen in years past is showing signs of reversal, and excitement about these fields will likely only grow in coming years. In the 21st century, with humanity facing growing problems of global proportions, the urgency of finding major solutions is greater than ever and will propel researchers, governments and academia to pursue ever more aggressive innovation strategies in close cooperation.
Indisputably, the unique and important challenges present today will significantly influence the directions of the research, industrial
development and educational agendas for years to come. Certainly, computing, as enabler for research, is central and essential to
these agendas. Therefore, it is important to recognize that next generation innovation and breakthrough research are inextricably
intertwined with the future of computing, whose central challenges are, simply put, to:
• Develop the most powerful next-generation architectures possible.
• Drive efforts to empower all disciplines computationally.
• Apply computing to find breakthrough solutions that will change the world.
Increased corporate R&D is not the entire answer to these challenges. Samuel Kortum and Josh Lerner, two American academics, have shown that “a dollar of venture capital could be up to ten times more effective in stimulating patenting than a dollar of traditional corporate R&D”.148 Nor can government be relied upon to solve the problems, since sometimes (especially in some European countries) “the best thing that governments can do to encourage innovation is to get out of the way.”149
Perhaps the most interesting development of the current era is that broad availability of powerful IT has made innovation more open to
others outside the scientific elite; in a sense, in this “age of mass innovation, we are all innovators now”.150
Theory is important, but it must be put into action to be verifiable: “walking the walk” is as important as “talking the talk”. Based on the lessons and approaches developed over the past two years of investigation into processes for accelerating innovation, the External Research & Programs group at Microsoft Research has initiated a strategic collaborations initiative to discover breakthrough research. In 2007, 45 proposals were submitted by academic researchers, with 9 chosen for funding; these are projects that would not have been found otherwise. Based on the success of this activity, a Breakthrough Research Program has been undertaken, initiated with a $1 million investment. The A. Richard Newton Breakthrough Research Award,151 dedicated to the vision and accomplishments of the late Berkeley dean and researcher, takes the form of a Request for Proposals (RFP) intended to yield highly innovative and impactful research as well as new directions. Results from this type of activity, combined with other approaches currently underway, may reveal whether these techniques can be used effectively to accelerate research innovation, and especially whether they can also generate important breakthroughs.
This is an occasion for guarded optimism. The goals are extremely difficult but have now become achievable, and new models for innovation are emerging continuously. Today, as people everywhere become increasingly aware of these issues, progressive initiatives are beginning to take root. An explosion of new investigations into multidisciplinary approaches is transforming research, as well as influencing education. It is now more important than ever to increase opportunities for innovation by collaborating with diverse groups, embracing different perspectives, taking more risks, and especially by examining research in the context of social challenges. Important facts about nature and science that were previously unknowable are now discoverable, and the key issues can finally be made understandable and actionable. It may not be too late.152
Summary
Computational innovation has entered a new era
The power computing provides is already acknowledged as indispensable for nearly all disciplines. Computational approaches are now also essential for solving massive global problems around such issues as pollution, population, and affordable healthcare. Computing is also looked to as a tool for addressing urgent issues, some never before faced by humanity, such as massive climate change and the urbanization of the planet.
Computing is widely recognized as the prime enabler for scientific progress
Research agendas emphasize this in their funding priorities. The National Science Foundation's (NSF) funding for research in
math and computer science exceeds that for all other areas. Worldwide funding priorities are similar, reinforcing the value of
computing as an enabler accelerating all scientific discoveries. Priorities in academia are also in close alignment. Computing
advantages of fast analysis, simulations, visualization, integrated IT, networking, multiple devices, and robotics are essential and
will improve more rapidly than ever in coming years.
However, core computational advancement has reached a critical crossroads
Engineers and researchers have “hit the wall”. No further increases in processor speed are available with the technologies that have driven growth for decades, yet increasing power will be needed for future applications. There are multiple paths to investigate, including manycore chip architectures and the difficult problems of parallel computing (especially around software development); researchers are hard at work exploring solutions in a variety of directions, work that may take a decade or more.
Multidisciplinary research and approaches are becoming increasingly important
Multidisciplinary work is essential to innovation in complex areas that span many disciplines. The new model for collaboration is for computer scientists and engineers to be included from the outset in complex problem solving scenarios. Some multidisciplinary areas poised for exponential growth are computational neuroscience, advanced learning technology, gaming for many purposes other than entertainment, and of course all the life sciences including medicine and genetics; biology is now widely acknowledged as an information science. Social sciences are playing an increasingly important role, and leading educators are actively formulating specialized multidisciplinary computing education for use in the sciences, engineering and nearly all other disciplines.
Support is growing for “riskier research” to find breakthroughs
Difficult to identify in advance, breakthroughs are extremely important for accelerating discovery, solving complex problems, and
finding out what we may have missed through more traditional approaches to research. Fortunately there are a number of useful
strategies for finding and encouraging highly innovative research.
Exceptional new research is emerging, bridging a number of disciplines
A large number of computationally-enabled efforts “from the edge” are underway, including symmetry studies, programmable matter, and other highly innovative combinations of disciplines. The life sciences promise to become the next SkyServer-style eScience conversion. Robotics has come of age and has entered the mainstream. Computer gaming has transcended its original entertainment focus and is driving computer requirements and Internet growth. Serious games are prompting many investigations in disciplines not aligned with traditional approaches to gaming. Computer science, rather than a study of phenomena associated with computers, is now considered by some as a natural science in its own right.
Computing will enable transformational change by answering three primary challenges
Figuring out how to solve these problems will significantly influence the direction of computational (and other) research, with
profound impact, for many years to come.
The three straightforward challenges are to:
1. Quickly develop powerful next-generation computing architectures transcending previous limitations.
2. Work collaboratively to empower all disciplines computationally, as soon as possible.
3. Create breakthrough solutions enabled by computing to change the world, addressing key problems for science and
society.
Next generation academic research in computational disciplines is being planned to address these imperatives today, and key
innovators will continue to align their efforts with these themes for years to come. Breakthroughs that occur while addressing the
most important problems facing humanity will transform both the computing and scientific research communities as well as the
way they work together, and will provide the critical solutions most urgently needed by all of society.
Acknowledgements
More than 100 academics at all levels, including faculty, staff, researchers, department heads, deans (and even a college president) have shared their
thoughts, which have led to this collection of ideas about the future of academic research. Their areas of expertise include Computer Science, Electrical
Engineering, Bioinformatics, Physics, Chemistry, Design, English, Psychology, Medicine and many other fields. Their insights have been invaluable in
the preparation of this paper, and conversations with them are deeply appreciated. Below is a partial listing.
Contributor | Institution
Alfred Aho | Columbia
Jonathan Aldrich | Carnegie Mellon
Magdalena Balazinska | University of Washington
Brian Barsky | UC Berkeley
Suzanne Bigas | Stanford
Barbara Blackford | UC Berkeley
Steve Blank | UC Berkeley
Guy Blelloch | Carnegie Mellon
Lenore Blum | Carnegie Mellon
Avrim Blum | Carnegie Mellon
Nick Bontis | McMaster University
Randall Bryant | Carnegie Mellon
John Canny | UC Berkeley
Robert T. Collins | Penn State
Tom Colton | UC Berkeley
Robert Constable | Cornell
Richard De Millo | Georgia Tech
David Dornfeld | UC Berkeley
William Dally | Stanford
John Domico | Penn State
Tony Elam | Rice
Magy El-Nasr | Penn State
Greg Ganger | Carnegie Mellon
Scott Hauck | University of Washington
Maria Klawe | Harvey Mudd College
Hector Garcia-Molina | Stanford
C. Lee Giles | Penn State
Seth Goldstein | Carnegie Mellon
Mark Gross | Carnegie Mellon
Dennis Groth | Indiana University
David Hall | Penn State
Mor Harchol-Balter | Carnegie Mellon
Hunter Hoffman | University of Washington
Mary Jane Irwin | Penn State
Martina Jacobs | Carnegie Mellon
Marti Hearst | UC Berkeley
Thomas I. M. Ho | Purdue
Trent Jaeger | Penn State
Harry Johnson | Stanford
Anthony Joseph | UC Berkeley
Tom Kalil | UC Berkeley
Kam Biu Luk | UC Berkeley
David Kauffer | Carnegie Mellon
Lydia Kavraki | Rice
Sung-Hou Kim | Lawrence Berkeley Laboratory
Nick McKeown | Stanford
Sallie Keller-McNulty | Rice
Ken Kennedy | Rice
Kurt Keutzer | UC Berkeley
Dan Klein | UC Berkeley
Scott Klemmer | Stanford
Ed Knightly | Rice
John Lafferty | Carnegie Mellon
Patrick Langley | Stanford and Arizona State
Edward Lazowska | University of Washington
Edward A. Lee | UC Berkeley
Peter Lee | Carnegie Mellon
Michael Liebschner | Rice
Todd Logan | Stanford
Wang-Chien Lee | Penn State
Yanxi Liu | Penn State
Henry Levy | University of Washington
Michael Lyons | Stanford
Donald Marinelli | Carnegie Mellon
Audrey McLean | Stanford
Deirdre Meldrum | Arizona State
Luay Nakhleh | Rice
Vijay Narayanan | Penn State
David Notkin | University of Washington
Don Orlando | UC Berkeley
Jim “Oz” Osborn | Carnegie Mellon
“Panch” Panchanathan | Arizona State
Randy Pausch | Carnegie Mellon
Leonard K. Peters | Pacific Northwest National Laboratory
Maria Larrondo Petrie | Florida Atlantic University
Zoran Popović | University of Washington
Padma Raghavan | Penn State
Michael Reiter | Carnegie Mellon
Carolyn Remick | UC Berkeley
Jeremy Rowe | Arizona State
Joseph M. Rosen | Dartmouth
Paul E. Rybski | Carnegie Mellon
Norman Sadeh | Carnegie Mellon
Anant Sahai | UC Berkeley
Shankar Sastry | UC Berkeley
Keith Sawyer | Washington University
Steve Shaffer | Penn State
Ikhlaq Sidhu | UC Berkeley
James Siegrist | UC Berkeley
Dan Siewiorek | Carnegie Mellon
Suzanne Shontz | Penn State
Brian Smith | Penn State
Dan Stamper-Kurn | UC Berkeley
Ric Stoll | Rice
Devika Subramanian | Rice
Sebastian Thrun | Stanford
Joseph Tront | Virginia Tech
Moshe Vardi | Rice
Luis Von Ahn | Carnegie Mellon
Howard Wactlar | Carnegie Mellon
Dan Wallach | Rice
Jean Walrand | UC Berkeley
May Wang | Emory
Joe Warren | Rice
Dan Seth Wallace | Rice
Steve Wasserman | MIT (formerly UC Berkeley)
Jeannette Wing | NSF (formerly Carnegie Mellon)
Terry Winograd | Stanford
Willy Zwaenepoel | EPFL (Switzerland)
Michael Zyda | USC
Microsoft colleagues, through a number of discussions, have also been very helpful in formulating these ideas, although they may not have realized so
at the time. These include Eric Brill, Bill Buxton, Jennifer Chayes, Lili Cheng, Sailesh Chutani, Henry Cohn, Phil Fawcett, Dan Fay, Alessandro Forin,
Michael Fors, Jason Hartline, Tom Healy, Tony Hey, Harold Javid, Jim Kajiya, Mark Lewin, Roy Levin, Simon Mercer, John Nordlinger, Michel Pahud,
Jane Prey, Mike Schroeder, Marc Smith, Stewart Tansley, David Tarditi, Kristin Tolle, Turner Whitted, Curtis Wong, and Yan Xu, to name a few.
Gray, Jim, in video interview posted online, “Conversation with scientist, engineer and database legend Jim Gray”
Channel 9 Forums, 2007, which can be accessed online at: http://channel9.msdn.com/Showpost.aspx?postid=168181
2
Ibid.
3
Silberman, Steve, “Where is Jim Gray?” Wired, August, 2007.
4
Vincent-Lancrin, Stéphan, “What is Changing in Academic Research? Trends and Futures Scenarios”, the European Journal of
Education, 41, 2, June 2006. This report covers 30 member countries of the Organization for Economic Cooperation
and Development (OECD).
5
A Special Report on Innovation, The Economist, October 13, 2007, page 4.
6
The Organization for Economic Cooperation and Development (OECD): http://www.oecd.org/home/0,2987,en_2649_201185_1_1_1_1_1,00.html
7
A Special Report on Innovation, The Economist, October 13, 2007, page 4.
8
Ibid.
9
Ibid.
10
Ibid.
11
A Special Report on Innovation, The Economist, October 13, 2007, page 4.
12
Ibid.
13
Ibid.
14
Champion, Marc. U.S. Ranked Most Competitive; Oil-Rich Nations Show Promise. The Wall Street Journal, November 1, 2007, page A4.
15
Ibid.
16
Ibid.
17
Ibid.
18
A Special Report on Innovation, The Economist, October 13, 2007, page 4.
19
A Special Report on Innovation, The Economist, October 13, 2007, page 14.
20
Ibid.
21
Ibid.
22
Ibid.
23
Trends in NSF Research by Discipline, FY 1970-2006, graphic representation based on NSF Federal Funds for Research and Development
FY 2004, 2005 and 2006 from the National Science Foundation. American Association for the Advancement of Science (AAAS), February 2007.
24
From August 2006 through June of 2007, over 100 academics from more than a dozen U.S. schools were interviewed for this report.
25
Before its original print release, this information was made public at the Federated Computing Research Conference, San Diego, June 8-16, 2007.
26
Richard A. DeMillo, Dean of the School of Computing at Georgia Tech, shared this insight in personal communications, January 15, 2007.
27
This information was also obtained during the interviews conducted between August 2006 and June 2007.
28
Mundie, Craig, address delivered at the 2007 WinHEC Conference in Los Angeles, May 15, 2007; available online at the Access Global
Knowledge site at: http://access.globalknowledge.com/article.asp?ID=7563
29
This summary of contrasting industry and academic perspectives on manycore issues was provided in June 2007 by Mark Lewin, from
the External Research group at Microsoft Research. Mark focuses on connecting with academic researchers on topics of
compilers, runtimes, programming languages and tools, and new architectures.
30
Ibid.
31
For more information about the Rice University Center for High Performance Computing (HiPerSoft): http://www.hipersoft.rice.edu/
32
Krste Asanovic, Ras Bodik, Bryan Christopher Catanzaro, Joseph James Gebis, Parry Husbands, Kurt Keutzer, David A. Patterson, William Lester
Plishker, John Shalf, Samuel Webb Williams and Katherine A. Yelick, “The Landscape of Parallel Computing Research: A View from Berkeley”
December 18, 2006. http://www.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-183.pdf
33
See Scott Hauck’s latest research focus at: http://www.ee.washington.edu/people/faculty/hauck/
34
This presentation is available online at: http://lazowska.cs.washington.edu/fcrc/Colwell.FCRC.pdf
35
Randy Bryant’s DISC Supercomputing presentation is available online at: http://www.cs.cmu.edu/~bryant
36
Smith, Burton. “Reinventing Computing” presentation delivered at the Workshop on the Interaction between Operating Systems and Computer
Architecture (WIOSCA), June 10, 2007, Held in conjunction with the 2007 International Symposium on
Computer Architecture (ISCA-34) http://www.ideal.ece.ufl.edu/workshops/wiosca07/wioscafinal.html
37
Ibid.
38
Gershenfeld, Neil and Isaac L. Chuang. “Quantum Computing with Molecules”, reproduced online by Scientific American:
http://www.media.mit.edu/physics/publications/papers/98.06.sciam/0698gershenfeld.html , 1998. This article provides a generally accessible overview
of quantum computing and related topics.
39
Glendenning, Ian. The Bloch Sphere. Presentation at the QIA meeting of VCPC, the European Centre for Parallel Computing at Vienna. February 16,
2005. Available online at: http://www.vcpc.univie.ac.at/~ian/hotlist/qc/talks/bloch-sphere.pdf
40
Fumy, Walter and Joerg Sauerbrey. Enterprise Security: IT Security Solutions: Concepts, Practical Experiences. 2006, Wiley-VCH. See the section
on pg. 162 on Quantum Cryptanalysis. See also online at http://books.google.com/books
41
Ibid.
42
Biologically Based Quantum Computers? DNA, Proteins, and Peptides Could Help Construct New Nanoscale Electronics. Article in Science Daily,
http://www.sciencedaily.com/releases/2007/03/070321093451.htm
43
Nick McKeown’s perspectives on redesigning the Internet from scratch: http://mediax.stanford.edu/conference_06/presentation/mckeown.pdf
44
Scott Shenker’s ideas about GENI can be seen at: http://lazowska.cs.washington.edu/fcrc/Shenker.FCRC.pdf
45
Dornfeld, David, Dean of Engineering, Berkeley. Personal communication, September 20, 2006.
46
Vincent-Lancrin, Stéphan, “What is Changing in Academic Research? Trends and Futures Scenarios”,
the European Journal of Education, 41, 2, June 2006.
47
Bement, Arden L., on the NSF site at: http://www.nsf.gov/news/news_summ.jsp?cntn_id=108364&org=CISE&from=news
48
NSF Cross-cutting computational neuroscience focus: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5147&org=NSF
49
NSF Cross-cutting computational advanced learning technology: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=12834&org=NSF
50
The NIH R21 program can be examined online at: http://grants.nih.gov/grants/funding/r21.htm
51
The SGER program from NSF can be found online at: http://www.nsf.gov/od/lpa/news/publicat/nsf0203/cross/ocpa.html#section8
52
Buxton, William. “Sketching user experience: getting the design right and the right design”. Elsevier, 2007, p 211.
53
Kelley, Tom, “The Ten Faces of Innovation", Doubleday, 2005.
54
Technology, Engineering, Design, TED: Ideas worth spreading, at: http://www.ted.com/index.php/
55
Information about the Stanford Institute of Design (d.school) can be found at: http://www.stanford.edu/group/dschool/
56
IDEO is at the forefront of thinking in terms of a design approach to innovation, cutting across many disciplines, and describes
itself as “Specialists in human factors, psychology, business, design, engineering and manufacturing [who] provide full service
consulting for product innovation and design.” See: http://www.ideo.com.
57
Berkeley Institute of Design; see online at: http://bid.berkeley.edu/
58
Buxton, William. “Sketching user experience: getting the design right and the right design”. Elsevier, 2007.
59
Gibson, William, from an NPR interview, Nov. 30, 1999.
60
Langley, Patrick, Professor of Computer Science and a computational cognitive psychologist, UC Berkeley and
Arizona State University. Personal communication, October 11, 2006.
61
Harchol-Balter, Mor. Scheduling the TeraGrid: Bringing Auction-based Scheduling to Grid Computing, proposal submitted January 2007.
62
Interview with Hunter Hoffman, Psychologist from the University of Washington.
63
Constable, Robert. The concept that computer usability is moving to a new level in close collaboration with psychology – with computers
equipped to “understand” and react more like users want them to – was discussed in personal communication in November 2007.
64
Jon Kleinberg’s Cornell web page: http://www.cis.cornell.edu/kleinberg.html
65
Henry Cohn currently heads the cryptography group at Microsoft Research. See: http://research.microsoft.com/~cohn/
66
Ibid.
67
Microsoft Research External Research & Programs report: The Future of Information Technology: Growing the Talent Critical for
Innovation, July 2007. Available at: http://research.microsoft.com/Workshops/FS2006/papers/TheFutureofInformationTechnology.pdf
68
Hard data is not yet available; this is anecdotal information from personal interviews with deans at several universities.
69
The assumption that multidisciplinary excellence is a competitive advantage for students is not always stated explicitly. There is some
anecdotal evidence to support this assertion; further data would be useful.
70
This makes sense; many students are attracted to bioinformatics even though there are far more jobs available in traditional
computing occupations. More hard data is needed to substantiate the attractiveness hypothesis.
71
See the Georgia Tech threads description at: http://www.cc.gatech.edu/content/view/692/144/
72
This was covered at the roundtable discussion on computational education for other disciplines at The Symposium on
Computational Thinking at CMU, March 26, 2007: http://www.cs.cmu.edu/compthinkingsymp/
73
Harvey Mudd College strategic vision: http://www.hmc.edu/files/HMCStrategicVision2007.pdf
74
Rowe, Jeremy. Dr. Rowe is the director for Strategic Planning (IT) at Arizona State and in personal communication in October 2006
indicated that all 10,000 incoming freshmen would take a new computing course intended to be useful for them in all disciplines.
75
For more information on the computational education for scientists workshop and initiative, see: http://research.microsoft.com/erp/CEfS/
76
Worldwatch Institute Report, “Vital Signs 2006-2007: The Trends that are Shaping our Future”, Norton, 2006.
77
Khosla, Vinod, in A Special Report on Innovation, The Economist, October 13, 2007, page 3.
78
United Nations Population Fund (UNFPA), “The State of World Population 2007”:
http://www.unfpa.org/swp/2007/english/introduction.html
79
Worldwatch Institute Report, “Vital Signs 2006-2007: The Trends that are Shaping our Future”, Norton, 2006.
80
University of Colorado website: http://ecenter.colorado.edu/energy/projects/green_computing.html
81
Worldwatch Institute Report, “Vital Signs 2006-2007: The Trends that are Shaping our Future”, Norton, 2006.
82
Taleb, Nassim Nicholas, “The Black Swan: The Impact of the Highly Improbable”.
83
Simonton, D.K., “Creative Productivity: A Predictive and Explanatory Model of Career Trajectories and Landmarks,”
Psychological Review 104, no. 1 (1997): 66-89.
84
Sawyer, R. Keith. “Group genius: the creative power of collaborations”, Basic Books, 2007, p 107.
85
Ibid., p 109.
86
Ibid., p 109.
87
A good site for reviewing researchers’ publications and citations is CiteSeer.IST at: http://citeseer.ist.psu.edu/statistics.html; also see
ISI Highly Cited.com at: http://hcr3.isiknowledge.com/formBrowse.cgi
88
Sawyer, R. Keith. “Group genius: the creative power of collaborations”, Basic Books, 2007, p 36.
89
Ibid., p ix-xiii.
90
Johansson, Frans. “The Medici Effect: What elephants and epidemics can teach us about innovation.” Harvard Business School Press, 2006.
91
Personal communication with research faculty and university administrators at Berkeley and Stanford, February - March, 2007.
92
Sawyer, R. Keith. “Group genius: the creative power of collaborations”, Basic Books, 2007.
93
This is based on Microsoft Research funding of projects in academia over the last ten years.
94
The Microsoft Research New Faculty Fellowship Program can be found online at: http://research.microsoft.com/erp/nff/default.aspx
95
A Special Report on Innovation, The Economist, October 13, 2007, page 8.
96
Ibid.
97
Ibid.
98
In conversations with researchers and academics, they often mention that significant breakthroughs in research take three
years or longer to be realized. Although breakthroughs are highly valued and may seem to happen almost instantaneously once a new direction,
connection or approach is found (the “eureka effect”), this may have been preceded by years of incremental progress and typically many dead ends.
99
The law of unintended consequences is one of the building blocks of economics. See:
http://www.econlib.org/library/Enc/UnintendedConsequences.html
100
Rogaine is the brand name for minoxidil, which originally was developed as a hypertension medication which proved useful in treating male pattern
baldness. Viagra, at first a treatment for pulmonary arterial hypertension, is now famous after its value as a treatment for erectile dysfunction was
inadvertently discovered.
101
Tolle, Kristin. From the 4th International Workshop on Data Integration in the Life Sciences (DILS).
Internal Report, June 30, 2007, Microsoft Research.
102
Sky Server leaders are enthusiastic about using their website to show the beauty of the universe, and share their excitement as they
build the largest map in the history of the world. The Sixth Data Release is “dedicated to Jim Gray for his fundamental contribution to
the SDSS project and the extraordinary energy and passion he shared with everybody!” Visit: http://cas.sdss.org/dr6/en/
103
Tolle, Kristin. From the 4th International Workshop on Data Integration in the Life Sciences (DILS).
Internal Report, June 30, 2007, Microsoft Research.
104
The 4th International Workshop on Data Integration in the Life Sciences (DILS), hosted at The University of Pennsylvania and sponsored by Microsoft
Research, included significant industry representation and presentations by several companies (e.g., Siemens and MITRE) and foundations as well as
the research community. See: http://dils07.cis.upenn.edu/
105
Bioinform online can be found at: http://www.bioinform.com/issues/
106
The first US based HealthGrid meeting is being organized by Howard Bilowski from UPenn to take place June 2-3, 2008.
107
Zoé Lacroix is a research professor at the Design Automation Laboratory http://asudesign.eas.asu.edu/people/zoe.html
108
Yanxi Liu of Penn State has a strong background in computer vision, computer graphics, biomedical image analysis (including computer aided
diagnosis, large biomedical image database indexing and retrieval), machine learning and intelligent robotics. This has led to an increased interest in
Computational Symmetry and Group Theory Applications, including image analysis of brains, faces and gaits (computer vision), mechanical and
kinematic assemblies (robotics), periodical and near-regular patterns (pattern recognition), and texture synthesis (computer graphics).
109
Lydia Kavraki is professor of Computer Science and Bioengineering at Rice University
http://cohesion.rice.edu/engineering/computerscience/Facultydetail.cfm?riceid=886
110
See article online about Hunter Hoffman’s work which describes his successes in cutting a person’s pain in half using
virtual reality (VR) techniques. http://triennial.cooperhewitt.org/designers/hunter-hoffman
111
See the website of Zoran Popović at http://www.cs.washington.edu/homes/zoran/
112
Luis von Ahn’s research focus can be seen at: http://www.cs.cmu.edu/~biglou/research.html . He has also been named as a Microsoft
Research New Faculty Fellow for 2007: http://research.microsoft.com/erp/nff/default.aspx
113
Seth Goldstein, Jason Campbell, and Todd Mowry. Programmable matter. IEEE Computer, June 2005.
114
Michael Liebschner’s expertise is in computational and experimental methods for biomechanical research. He is head of
Rice University’s new Computational Biomechanics Laboratory. http://bioe.rice.edu/FacultyDetail.cfm?RiceID=95
115
Mims, Christopher, “U.S. Congress to give robots a big think”, in Scientific American Online, June 21, 2007
http://blog.sciam.com/index.php?title=us_congress_to_give_robots_a_big_think&more=1&c=1&tb=1&pb=1
116
Gates, William H. “A Robot in Every Home” Scientific American, June, 2007
http://www.sciam.com/article.cfm?articleID=9312A198-E7F2-99DF-31DA639D6C4BA567&pageNumber=1&catID=2
117
The Microsoft Robotics Studio online overview can be seen at: http://msdn2.microsoft.com/en-us/robotics/default.aspx
118
See the robotics in computer science initiative underway today at http://research.microsoft.com/erp/robotics/default.aspx; the Institute for Personal
Robotics in Education at http://roboteducation.org and visit http://research.microsoft.com/~stansley/
119
The Dartmouth School of Engineering Polytrauma Conference can be found at: http://engineering.dartmouth.edu/polytrauma/index.html
120
Dvorak, John. Xbox 360 to the Rescue, article in PC Magazine, can be seen online at http://www.pcmag.com/article2/0,1895,1884539,00.asp
121
Wray, Richard. “China Overtaking US for fast internet access as Africa gets left behind” Guardian Unlimited, June 14, 2007.
http://business.guardian.co.uk/story/0,,2102517,00.html
122
John Nordlinger prepared the internal report from which much of this gaming data is derived. He drives efforts in the External Research group at
Microsoft Research, collaborating with academic researchers to find best ways to use gaming for education, health
and other socially beneficial applications.
123
Mayo, Merrilea. Presentation at Microsoft Research, Redmond, Washington, August 21, 2007, discussed the interest in gaming for these
purposes at the National Academy of Sciences (NAS).
124
“World of Warcraft”, Blizzard Entertainment. See: http://www.worldofwarcraft.com/index.xml
125
Rankin, Yolanda. “Evaluating Interactive Gaming as a Language Learning Tool” SIGGRAPH paper, August 3, 2006. Available online at:
http://www.siggraph.org/s2006/main.php?f=conference&p=edu&s=40
126
Denning, Peter J. “Computing is a Natural Science” Communications of the ACM, Vol. 50, Number 7, July 2007, p. 16.
127
Carse, James P. “Finite and Infinite Games”, Ballantine, 1986, p. 1. This small book is quite influential and continues to be referred to as a source of
thought leadership in this area. See discussion at: http://www.glg.net/pdf/Finite_Infinite_Games.pdf
128
Denning, Peter J. “Computing is a Natural Science” Communications of the ACM, Vol. 50, Number 7, July 2007, p. 16-17.
129
Creating a Science of Games. Feature in Communications of the ACM, July 2007. Includes a number of articles of interest:
- “Creating a Science of Games,” by Michael Zyda, guest editor
- “Games for Science and Engineering Education,” by Merrilea J. Mayo
- “Games for Training,” by Ralph E. Chatham
- “How to Build Serious Games,” by H. Kelly, K. Howell, E. Glinert, L. Holding, C. Swain, A. Burrowbridge, M. Roper
- “Carnegie Mellon’s Entertainment and Technology Center: Combining the Left and Right Brain,” by Randy Pausch and Don Marinelli
- “Using Storytelling to Motivate Programming,” by Caitlin Kelleher and Randy Pausch
- “Real-Time Sound Synthesis and Propagation for Games,” by N. Raghuvanshi, C. Lauterbach, A. Chandak, D. Manocha and M. C. Lin.
130
The dictionary definition of natural science was obtained from The American Heritage® Dictionary of the English Language,
Fourth Edition Copyright © 2007, 2000 by Houghton Mifflin Company.
131
Feynman, Richard P. “Feynman Lectures on Computation,” Tony Hey and Robin W. Allen, editors. Westview Press, 1996.
132
Denning, Peter J. “Computing is a Natural Science” Communications of the ACM, Vol. 50, Number 7, July 2007, p. 13-18.
133
Ibid., p. 13.
134
Denning, Peter J. “The Invisible Future: The Seamless Integration of Technology into Everyday Life”, Wiley, 2001, p. 45.
135
Wolfram, Stephen. “A New Kind of Science”, Wolfram Media, 2002.
136
Denning, Peter J. “Computing is a Natural Science” Communications of the ACM, Vol. 50, Number 7, July 2007, p. 14.
137
Ibid., p. 16.
138
Ibid., p. 17.
139
Review the work undertaken to describe the Great Principles of Computation at: http://cs.gmu.edu/cne/pjd/GP/GP-site/welcome.html
140
The Manhattan Project, on the U.S. Department of Energy (DOE) site at: http://www.mbe.doe.gov/me70/manhattan/index.htm
141
The story of the decision to go to the moon, and the entire speech transcript, can be seen at: http://history.nasa.gov/moondec.html
142
Global Warming report by the Union of Concerned Scientists is at:
http://www.ucsusa.org/global_warming/science/recordtemp2005.html
143
World Wildlife Federation: http://www.panda.org/about_wwf/what_we_do/marine/problems/problems_fishing/index.cfm
144
For more information on population projections and related issues, see the United Nations Population Fund (UNFPA): http://www.unfpa.org
145
Sawyer, R. Keith. “Group genius: the creative power of collaborations”, Basic Books, 2007, p 7.
146
Taleb, Nassim Nicholas, “The Black Swan: The Impact of the Highly Improbable”.
147
Johansson, Frans. “The Medici Effect: What elephants and epidemics can teach us about innovation.” Harvard Business School Press, 2006.
148
A Special Report on Innovation, The Economist, October 13, 2007, page 16.
149
Ibid.
150
Ibid., p. 19.
151
The A. Richard Newton Breakthrough Research Award can be found at:
http://research.microsoft.com/ur/us/fundingopps/rfps/ARichardNewtonAward.aspx
152
Martin, James. “The Meaning of the 21st Century: an urgent plan for ensuring our future.” Riverhead Hardcover, August 3, 2006.
The information contained in this document represents the current view of the
author on the issues discussed as of the date of publication. This White Paper
is for informational purposes only. MICROSOFT MAKES NO WARRANTIES,
EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS
DOCUMENT.