
NeSC News
The monthly newsletter from the National e-Science Centre
Issue 62, July 2008
www.nesc.ac.uk
In Voting Machines We Trust
by Iain Coleman
If you want to see democracy in the UK up close and
personal, go to an election count. You’ll find yourself
in an unglamorous hall, with a rectangle of tables
marking the boundary of an inner zone that you will
not be permitted to enter. Within this privileged area,
local council staff unseal ballot boxes, dump the ballot
papers onto the tables, count them up and sort them
into piles depending on which candidate’s name has a
little “X” pencilled beside it. One pile per candidate, and
the biggest pile wins. Meanwhile, candidates and their
agents, consigned to the outer zone, press as close as
they dare to the counting tables, keenly watching out for
any error that might disadvantage their own side. From
time to time, the Returning Officer calls all the agents to
huddle together and peruse the pencil marks on unclear
ballots: papers judged to be valid votes go onto the
appropriate pile, while the spoiled papers are held aside.
Hand and eye, pencil and paper: it’s all very human.
All too human, some would say. Ballot papers can be misplaced or miscounted, people can try to interfere with the votes that have been cast, and sleep deprivation can make even the most honest and careful process falter. Over the years, various democracies have introduced voting machines of one kind or another to remove human error from the system.

But whether it’s hanging chads in Florida or spoiled ballots in Scotland, none of these has yet lived up to the hope of delivering a secure, reliable, easy-to-use voting system.

Can the Holy Grail of electronic voting be found? Peter Ryan, of Newcastle University, suggested that it can in his e-Science Institute Public Lecture on “Trust and Security in Voting Systems”, held on 12 June.

Voting in a secret ballot can be viewed as a special case of secure distributed computation. Several parties input secret data – the votes – and want to compute a function of these inputs – the election result – without revealing the inputs themselves. But voting has special features: the system has to be very easy to use, public confidence in the outcome has to be very high, and dependence on the software and other components has to be minimal. So the trick is to minimise trust in the sense of dependence, while maximising trust in the sense of reliability.

One approach is to give each voter a “protected receipt” that carries, in an encoded form, the information about how they voted. Copies of the receipts are then posted to a secure web bulletin board, which voters inspect to confirm that their receipt is correctly posted. The receipts are tabulated using cryptographic techniques that preserve the secrecy of the ballot. Once the votes are decrypted, they can no longer be linked back to a receipt – hence they can no longer be linked back to the voter. (continued on page 2)

Photo: Vote counting arrangements in the Victoria Hall, Leeds, early 1900s

Congratulations!
Congratulations from all at NeSC to Professor Richard Kenway, chairman of NeSC and EPCC, who was appointed OBE for services to science in Her Majesty the Queen’s Birthday 2008 Honours List. Prof Kenway is also Vice-Principal (High Performance Computing and e-Science) and Tait Professor of Mathematical Physics.
In Voting Machines We Trust
by Iain Coleman (continued)
Ryan has developed a working version of this scheme, which he calls Prêt à Voter. The ballot paper is in the traditional layout, but the order in which the candidates appear is randomised, differing from paper to paper. Each paper is also marked with a code that encodes the candidate order on that particular ballot. The cunning bit is that the ballot paper is perforated, so that the voter detaches and discards the section with the candidates’ names on it, and only hands over the section that contains the boxes – one of which the voter has marked with an “X” – and the code. A digital copy of this receipt section is made, and it is then returned to the voter to keep.
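To make this concrete, here is a toy Python sketch of a ballot and receipt of this shape. It is emphatically not the real Prêt à Voter cryptography – the actual scheme uses onion-encrypted codes and a chain of independent tellers so that no single party can link a receipt to a vote – and every name and parameter below is invented for illustration; a single symmetric key stands in for the tellers.

    import json
    import random
    from cryptography.fernet import Fernet  # third-party: pip install cryptography

    CANDIDATES = ["Asher", "Blum", "Chen", "Dietz"]
    TELLER_KEY = Fernet.generate_key()  # in the real scheme, no single key exists

    def print_ballot():
        """One ballot: a randomised candidate order, plus a code that encodes
        that order and is readable only by the (notional) tellers."""
        order = random.sample(CANDIDATES, k=len(CANDIDATES))
        code = Fernet(TELLER_KEY).encrypt(json.dumps(order).encode())
        return order, code

    def cast_vote(order, code, choice):
        """The voter marks an X, detaches and discards the candidate-name
        column, and keeps only the receipt: the X's position plus the code."""
        return {"x_position": order.index(choice), "code": code}

    def tally(receipts):
        """Tellers decrypt each code to recover that ballot's candidate order,
        and hence the vote. A published receipt alone reveals nothing."""
        counts = {c: 0 for c in CANDIDATES}
        for r in receipts:
            order = json.loads(Fernet(TELLER_KEY).decrypt(r["code"]))
            counts[order[r["x_position"]]] += 1
        return counts

    receipts = [cast_vote(*print_ballot(), choice=c)
                for c in ("Chen", "Asher", "Chen")]
    print(tally(receipts))  # e.g. {'Asher': 1, 'Blum': 0, 'Chen': 2, 'Dietz': 0}

Because the candidate order differs from ballot to ballot, the position of the X on a published receipt is meaningless without the tellers’ cooperation.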
The receipt reveals nothing about the vote unless it is decrypted. Once the polls have closed, all receipts are posted to the web bulletin board, and the ballots are decrypted and counted. There are checks within the system to catch faulty ballots, and random audits at intermediate stages in the processing. This functions as a defence in depth against attempts to subvert the secrecy of the ballot. Each stage is carried out by different tellers, and almost all the tellers would have to be compromised in order to compromise secrecy.

An advantage of this scheme is that the user interface – the ballot paper – is simple and familiar. Unlike some other secure voting systems, voters don’t need to have personal cryptographic keys or computing devices. The scheme works for a wide range of electoral systems, including traditional First Past the Post and rank-ordering systems like the Single Transferable Vote.

In principle, this scheme should provide a high degree of assurance with minimal trust assumptions. But public trust is essential, and it can only be achieved if the public understand the system and perceive it to be sound.

A recent trial of Prêt à Voter on the Newcastle University campus proved reasonably successful, but it was clear that a significant number of voters did not understand the mechanisms. If the system is to be more widely used, it will require better instructions and explanations for voters, and better training for poll workers.

Voting technology is an area of dynamic research, and voter-verifiable schemes like Prêt à Voter are now mature enough to be trialled. Whether such a scheme is adopted for full-scale government elections, however, may depend on the systems currently in place in a given country. Where some kind of electronic voting system is already in use, but is known to have problems, a move to a more robust and transparent system may be attractive. Replacing a traditional pencil-and-paper system with verifiable electronic voting might prove to be a harder sell. The old system has safeguards of its own, particularly the supervision by party political agents, that are not present in the potential replacement, and the case would need to be made that the new safeguards are at least as good as the old. Furthermore, this system deliberately does not address postal voting, which is the main form of electoral fraud in the UK at present.

So there is great potential for ballot technology, but there are also great dangers, and systems need to be carefully thought out if they are to achieve their goal of accurately, reliably and securely reflecting the political choices of the general public. The e-Science Institute theme on Trust and Security is examining security challenges in a wide range of fields: this one might just turn out to be the trickiest.

Slides and a webcast of this event can be downloaded from:
http://www.nesc.ac.uk/esi/events/896/
Congratulations Professor Seidel!
Congratulations to Ed Seidel who has recently been appointed as Director of the
Office of CyberInfrastructure (OCI) at the National Science Foundation.
OCI awards grants to researchers who demonstrate cutting-edge information
technology that can lead to breakthroughs in science, engineering and other
academic disciplines.
Seidel, who is Floating Point Systems Professor in the Louisiana State University (LSU) Departments of Physics & Astronomy and Computer Science, and is also director of the LSU Center for Computation & Technology (CCT), takes up the directorship on 1st September 2008.
For more information, see:
http://www.nsf.gov/news/news_summ.jsp?cntn_id=111689&org=NSF&from=news
Crossing Boundaries: Computational Science, E-Science
and Global E-Infrastructures
8th – 11th September 2008 in Edinburgh, Scotland
http://www.allhands.org.uk/2008/conference/venue.cfm
Early Bird Registration is now open until 31st July 2008. Any registrations received from 1st August 2008 will incur a late fee of £50.
AHM 2008 is the principal e-Science meeting in the UK and brings together researchers from all disciplines,
computer scientists and developers to meet and exchange ideas. For more details about the event, see:
http://www.allhands.org.uk/
The general format of the meeting will include cross-community symposia (kicked off by invited key speakers) and
workshops. For more details about the workshops, see:
http://www.allhands.org.uk/2008/programme/call.cfm
OGF24
The 24th Open Grid Forum - OGF24, Singapore, September 15-19, 2008
OGF returns to Asia-Pacific for OGF24 in beautiful Singapore. Co-located with the GridAsia 2008 event, and hosted
by Singapore Grid Forum, OGF24 will offer a unique opportunity for distributed computing experts, builders, and
users to meet with OGF’s global community to discuss emerging trends and success stories of grid deployments,
and collaborate on standards. In addition to high-profile keynote speakers, participants can expect a wide variety
of program content, ranging from hot technologies like cloud and virtualization to practical uses of grids in scientific
and industrial settings. A commercial exhibition will also be an important part of this multi-faceted event. Participants
will learn about products and services and see demonstrations of successful projects that have leveraged OGF
technologies and standards. Contact us for more information on sponsorship and exhibition opportunities.
Hotel accommodation:
Rooms must be reserved by mid-July. See http://www.ogf.org/OGF24/lodging.php
Due to other high-profile events in Singapore immediately following the conference, attendees are highly
encouraged to make reservations as quickly as possible. A shuttle service will be provided to the meeting location
from all OGF-recommended hotels.
Call for Participation:
Session proposals are being accepted until July 25. See http://www.ogf.org/OGF24/cfp.php
Registration:
Advance registration is available until July 25. See http://www.ogf.org/OGF24/registration.php
Sponsorship opportunities:
A variety of sponsorship opportunities are available for the event. Contact wulf@ogf.org
Questions: Please contact khamilton@ogf.org with any questions.
Call for Topics: e-Science Institute Thematic Programme, late 2009

The e-Science Institute invites proposals for new themes to run in late 2009.
The e-Science Institute (eSI), hosted by the University of Edinburgh, is the UK’s Centre for e-Science Meetings.
Funded by the e-Science Core Programme, it has been operating since August 2001, during which time it has run
more than 470 meetings attended by some 14,000 delegates and hosted 65 visitors who have stayed for varying
periods from one day to a year.
As well as hosting meetings, summer schools and the visitors’ programme, the Institute runs a thematic programme,
which concentrates on in-depth and sustained investigation of a topic by a series of linked talks, visitors, workshops
and conferences over a period of six months to a year. Such themes are led by a theme leader who is a long-term
funded visitor to the Institute.
Theme topics, as well as being interesting in their own right, should address issues that are relevant to applications researchers and be able to demonstrate significant buy-in from both the applications and computational science communities. It is not intended that they address only the sciences – all areas of academic research present opportunities for the application of e-Science techniques.
This programme is now in its fourth year and has run/is running a total of nine themes in a wide range of subjects.
For further details, see: http://wiki.esi.ac.uk/Themes_Information
To continue our rolling programme, we are now calling for submissions of topics for themes to start in mid-2009. Proposals should be submitted no later than 17 September 2008 for initial consideration by the Programme Committee, and will then be reviewed by the eSI Science Advisory Board, which meets in late September 2008. Proposals for theme topics can be made by the research community, in which case eSI will undertake to try to find an appropriate leader, or potential theme leaders may put themselves forward along with the theme they wish to develop. Themes carry a typical budget of about £60k.
Further information on eSI themes and how to apply is available at:
http://www.nesc.ac.uk/esi/themes/index.htm
The eSI Wiki http://wiki.esi.ac.uk/ESI_Themes provides further information on the work in progress, including
reports and discussions.
To propose a theme, or if you have any questions, please contact Anna Kenway by email (anna.kenway@ed.ac.uk) or by phone (+44 (0)131 650 9818).
Grid Computing Now! Competition 2008
Grid Computing Now! is pleased to announce its second competition for applying innovative grid computing solutions
to an environmental problem. The competition is supported by the British Computer Society, The 451 Group,
Intellect, Memset, Microsoft, National e-Science Centre, Oxford e-Research Centre, the Technology Strategy Board
and WWF. Entering the competition is free, and there are two tracks for entrants: an IT Professional Track and an IT Non-Professional Track.
Prizes for the winners include free one-year membership to the BCS, a mentor from industry to help the winner
progress their idea, a week long internship at the National e-Science Centre at the University of Edinburgh, a Sony
VAIO or Xbox 360 and a one-year limited subscription to The 451 Group’s EcoEfficient and Grid services research.
The deadline for initial proposals of 1000 words is 1st September 2008. There will be a further two stages to the
competition.
For further details of the competition, see:
http://grid.globalwatchonline.com/epicentric_portal/site/GRID/competition2008.html/
NGS and the Social Sciences
The NGS exhibited at the 4th International Conference on Social Science, recently held at the University of Manchester and organised by the ESRC National Centre for e-Social Science (NCeSS).

The exhibition stand was very busy with enquiries about the services available from the NGS for social scientists, and also about our roadshows. The large amount of traffic to the NGS stand was partly due to the very positive exposure the NGS received from two relevant projects, both heavy users of our service, being presented at the conference.

The Modelling and Simulation for e-Social Science (MoSeS) project, based at the University of Leeds, has been an NCeSS research node for the past three years. It has focused on developing a representation of the entire UK population as individuals and households, together with a package of modelling tools that allows specific research and policy questions to be addressed. These questions range from estimating demand for services such as transport schemes to analysing access to health and social care services in specific areas.

The MoSeS project team has written its modelling code in Java, and parts of it have been parallelised to take advantage of high-end computers such as those available on the NGS. NGS staff have also helped the project migrate its code from its own 32-node cluster to the larger NGS clusters.

More information can be found at:
http://www.ncess.ac.uk/research/geographic/moses/

Another project being presented was R, an efficient and easy-to-use tool for statistical modelling. The work was presented by Prof Rob Crouchley from Lancaster University, who explained that tools for computationally demanding social research are becoming available as part of commercial systems, e.g. SAS grid computing and Stata MP. However, these systems can be of limited use on a public grid, and there are often problems with cost and licensing. Crouchley highlighted the use of R as the framework for statistical modelling on the grid for several reasons: many existing R methods can easily be modified for grid computing, R is easy to install on most popular operating systems, and there are no licensing issues.

R is available on all four of the NGS core nodes and further information is available at:
http://www.ngs.ac.uk/sites/ral/applications/analysis/R.html

Photo: Alex Voss from NCeSS and Gillian Sinclair, NGS Liaison Officer, at the NGS exhibition stand. Courtesy of Ed Swinden Photography.
NGS Labs
NGS Labs has been launched on the NGS website. It is designed to publicise new beta NGS services and projects that are available to users. Although these new services and projects are offered without the full support of the core NGS services, it is hoped that they will be of use to NGS end users, systems administrators and grid developers.
If you know of a project that has been developed in association with the National Grid Service and would like it to be
publicised as an NGS Labs project, then please contact the NGS Helpdesk (support@grid-support.ac.uk) for
more information.
A list of projects currently included can be found in the new website section. Simply follow the new tab on the NGS
homepage (http://www.ngs.ac.uk) or go direct to: http://www.ngs.ac.uk/livecd
Congratulations Professor Coveney!
Congratulations to Prof Peter Coveney and the theoretical chemistry team
at the Centre for Computational Science, UCL, who were presented with the
Transformational Science Award at TeraGrid 08, for developing a computational
model that could revolutionise magnetic resonance imaging (MRI) systems by
providing real-time, three-dimensional analysis of blood flow in the brain.
The paper that was presented at TeraGrid 08 is entitled Patient-specific Whole
Cerebral Blood Flow Simulation: A Future Role in Surgical Treatment for
Neurovascular Pathologies. It will shortly be available at:
http://teragrid.org/events/teragrid08/
Prof Coveney’s team have now largely completed the full infrastructure that enables them to capture very high resolution patient-specific MRI images (at the National Hospital, Queen Square), to reconstruct the intracranial vasculature, and to perform patient-specific whole cerebral blood flow simulations on remote supercomputing resources – currently mainly located on the US TeraGrid, but also including HPCx in the UK – with real-time rendering on the same processors/cores. The visualisation is piped back to UCL, where it can be viewed and interactively steered within a 15-20 minute time span, and then accessed by neurologists and neurosurgeons at the National Hospital. Such computational work requires resources which support advance reservation and/or urgent computing capabilities.
The work is part of the Grid Enabled Neurosurgical Imaging Using Simulation (GENIUS) programme – a global effort
to deliver advanced neurological diagnostic tools. The GENIUS project aims to augment clinical decisions where a
physician uses judgement and experience to decide on the course of treatment best suited to an individual patient’s
condition. GENIUS will involve the development of patient-specific models based on image data, to simulate blood
flow within a given patient’s brain. These simulations are used for planning of surgical intervention for arteriovenous
malformations and aneurysms.
Such work involves a vast amount of number-crunching, so Prof Coveney’s group made use of the TeraGrid, a distributed supercomputing grid funded by the US National Science Foundation (NSF) that includes the huge Ranger supercomputer at the Texas Advanced Computing Center (TACC) in Austin, the world’s largest public-access supercomputer.
GENIUS is an early example of a new European
Commission initiative called the Virtual Physiological
Human (VPH), in which Prof Coveney now has a
leading role. The goal of the VPH Initiative is to create
the research environment in which progress in the
investigation of the human body as a single complex
system may be achieved.
UCL has been awarded funding to coordinate the VPH Network of Excellence (VPH NoE). Its aims range from the development of a ‘VPH ToolKit’ and associated computational resources, through integration of models and biological data across the various relevant levels of structure and function within the body, to VPH research community building and support.
Prof Coveney’s team were also awarded 5K Club Membership for the performance of two of their codes, LB3D and HYPO4D, as reported in two other papers presented at the same conference. Membership of the 5K Club is awarded for demonstrable linear scaling of codes to 5,000 or more cores. Prof Coveney was able to demonstrate linear scaling of LB3D to 33k cores and HYPO4D to 16k cores on Ranger, the ca. 0.52 petaflop/s Sun cluster at TACC. Ranger has ca. 63,000 cores; 33,000 cores is the largest queue size currently available (hitherto only available by special agreement with TACC support). The significance of these benchmarks is that Prof Coveney’s team can be reasonably confident that these codes will scale efficiently well into the petascale regime, where core counts will number in the hundreds of thousands.
A Predictable Response
by Iain Coleman
“Prediction is very difficult, especially of the future.” So said the great quantum physicist Niels Bohr, and what is
true of subatomic particles goes double for the global climate system. It is now well-established that human activity
is causing the Earth to grow steadily hotter, and that this process will continue for some decades at least. But what
consequences will this warming have, and how should we respond to it?
This was the focus of the Modelling and Climate Change Research Workshop, held at the e-Science Institute on 19th
June. The workshop consisted of a set of parallel demonstrations of various kinds of model that can answer some
of these questions, from models of the whole Earth system to predictions of land use changes and the future of the
economy.
Modelling the Earth
Physical models of the global climate system are becoming ever more sophisticated as our computing power and
scientific knowledge increase, and yet the uncertainties in the predictions are as large as ever. As Richard Essery of
Edinburgh University’s Centre for Earth System Dynamics explained, this is because, while modelling of individual
processes has improved, scientists keep adding new processes to the models. Every time a crude – but precise
– parametrisation of some quantity is replaced by a sophisticated – but uncertain – model calculation, a little more
noise is added to the system. The resulting predictions are hopefully more accurate than previously, but they are no
more precise.
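As a toy numerical illustration of that trade-off (the numbers are invented, and this is no one’s actual climate model): a crude parametrisation returns a single fixed, biased value, while a more sophisticated process model is right on average but scatters from run to run, so the ensemble mean improves while the spread grows.

    # Hypothetical numbers only: compare a fixed, biased parametrisation with
    # an unbiased but noisy process model, in a small ensemble.
    import random
    import statistics

    TRUE_FEEDBACK = 1.00          # the "real" value of some climate feedback

    def parametrisation():
        return 0.80               # precise (zero spread) but biased

    def process_model():
        return random.gauss(TRUE_FEEDBACK, 0.25)  # accurate on average, noisy

    ensemble_old = [parametrisation() for _ in range(1000)]
    ensemble_new = [process_model() for _ in range(1000)]

    for name, ens in [("parametrised", ensemble_old), ("process model", ensemble_new)]:
        print(f"{name:>13}: mean={statistics.mean(ens):.2f} "
              f"spread={statistics.pstdev(ens):.2f}")
    # The new model's mean is closer to 1.00 (more accurate), but its spread
    # is larger (less precise): more noise in the headline prediction.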
Modelling individual parts of the climate system can be tricky enough – atmospheric convection and cloud processes
are still crudely parametrised, for example – but it’s when it comes to linking all the pieces together into a truly global
model that the problems really start. Processes on land and in the ocean typically occur much more slowly than in
the atmosphere, making it challenging to couple them together. Models of human inputs to the environment also
need to be included, generally in the form of external forcings to the coupled model. And it was those human factors
that were the focus of most of the workshop’s models. (continued on page 8)
A Predictable Response
by Iain Coleman (continued)
Like cloud cover and sea ice, human activity has a feedback relationship with the global climate. Unlike these other
processes, that feedback is intelligent and directed. Or at least, it could be. The Regional Impact Simulator, REGIS,
presented by Mark Rounsevell (Edinburgh), is a tool that goes further than simulating the impact of various climate
change scenarios on regions of Britain. It also predicts the effects of various public policy choices that might mitigate
or exacerbate the problems caused by global warming. This is far from being an academic model: it is driven by the
needs of policymakers, and is intended to help them make informed choices about practical medium- and long-term
planning.
In contrast to this vision of rational central planning, Sandeep Kakani (Edinburgh) presented an agent-based model
of land use that built up a picture of the future development of East Anglia from simulations of the economic and
social choices made by individuals. The main focus is on the built environment, and the simulation is driven by
property decisions made by individual actors in the context of public policy and scenarios of environmental change.
The assumptions about central planning versus the free market, implicit in many models, are explicitly addressed in FARO – Foresight Analysis of Rural Areas of Europe – presented by Marc Metzger (Edinburgh). This looks at changes in agricultural land use on a European scale, under different scenarios that vary along two axes: whether technology and infrastructure are mainly developed in the public or the private sector, and whether or not the investment has the desired effect. The model combines quantitative and qualitative approaches to provide an analysis tool for EU policymakers concerned with the future of the rural economy.
Taking a more purely economic approach were two models that examine economic initiatives aimed at limiting
climate change. Sandeep Kakani’s second agent-based model looks at the EU’s Emissions Trading Scheme. This
policy limits the total permitted carbon dioxide emissions from major industries in Europe, and divides that total up
into tradeable emissions permits. The agents in the model are industries, power suppliers, market traders, brokers
and regulatory bodies. Their interactions generate modelled emissions levels and permit prices as a function of time,
allowing researchers and policymakers to explore the predicted behaviour of the scheme.
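For readers unfamiliar with how a cap-and-trade market works, here is a heavily simplified sketch – far cruder than the agent-based model described above, with invented firms and numbers – showing how a cap and per-firm abatement costs generate a permit price:

    # Toy cap-and-trade sketch; all firms, costs and the cap are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Firm:
        name: str
        emissions_needed: float   # tonnes CO2 the firm would emit unabated
        abatement_cost: float     # cost per tonne to cut emissions instead

    FIRMS = [
        Firm("steelworks", 100, 40.0),
        Firm("power_plant", 120, 25.0),
        Firm("cement_kiln", 80, 55.0),
    ]
    CAP = 180.0  # total permits issued (tonnes)

    def clearing_price(firms, cap):
        """Find the permit price at which demand for permits equals the cap.
        A firm buys permits only while that is cheaper than abating, so
        demand falls as the price rises past each firm's abatement cost."""
        for price in sorted(f.abatement_cost for f in firms):
            demand = sum(f.emissions_needed for f in firms
                         if f.abatement_cost > price)
            if demand <= cap:
                return price
        return 0.0

    price = clearing_price(FIRMS, CAP)
    print(f"permit price clears at ~{price}/tonne")
    for f in FIRMS:
        action = "buys permits" if f.abatement_cost > price else "abates and sells"
        print(f"{f.name}: {action}")

The real agent-based model replaces this one-shot clearing calculation with repeated interactions between industries, traders, brokers and regulators, which is what yields prices and emissions as functions of time.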
A more holistic view of economics is at the heart of ECCO – Evaluating Capital Creation Options – presented by
Nigel Goddard (Edinburgh). This model addresses the physical, rather than financial, economy with energy rather
than money as the primary unit of value. The aim is to explore the physical limits of the economy: if something isn’t
physically possible, it isn’t economically possible either. Everything we create has an amount of energy embodied in
it, comprising all the resources that were used up in its manufacture. For our current way of life to continue, we need
to maintain the input of energy that can be turned into houses, laptops and ready meals.
The big question is this: as fossil fuels run out, and as we try to move away from carbon-based energy generation,
can we switch to a new energy infrastructure while maintaining our way of life? The ECCO model seeks to answer
this, by predicting the effects of proposed policies on the amount of energy that can be embedded in human-made
capital and relating this to standards of living. Thus national policymakers and their advisors can explore which
economic models can have the desired physical effects.
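The embodied-energy accounting at the heart of this approach can be illustrated with a toy recursive calculation. This is not the ECCO model itself, and the bill-of-materials figures below are invented:

    # Toy embodied-energy sketch: an artefact's embodied energy is its direct
    # process energy plus the embodied energy of everything consumed making it.
    # Hypothetical bills of materials: (direct process energy in MJ, inputs).
    PRODUCTS = {
        "steel_kg":  (25.0, {}),
        "glass_kg":  (15.0, {}),
        "silicon_g": (2.0,  {}),
        "laptop":    (500.0, {"steel_kg": 2, "glass_kg": 0.5, "silicon_g": 100}),
        "house":     (80_000.0, {"steel_kg": 3_000, "glass_kg": 400}),
    }

    def embodied_energy(product: str) -> float:
        """Recursively sum direct energy plus the energy embodied in inputs."""
        direct, inputs = PRODUCTS[product]
        return direct + sum(qty * embodied_energy(p) for p, qty in inputs.items())

    for p in ("laptop", "house"):
        print(f"{p}: ~{embodied_energy(p):,.0f} MJ embodied")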
This focus on providing advice and insight to policymakers carried on to the concluding panel discussion, featuring
Stewart Russell (ISSTI) and Steve Yearley (Genomics Forum). Policymakers need to be able to judge the usefulness
and trustworthiness of models, and make decisions between disputed models. At the same time, researchers need
to be realistic about the world that the various public stakeholders operate in. This linkage between natural science
and public policy creates dilemmas for academic scientists. Hedge all results with the appropriate caveats, qualifiers
and uncertainties, and policymakers get frustrated at the lack of clear answers to their questions while mendacious
critics exploit the guardedness of the predictions to dismiss the whole idea of global warming. On the other hand,
oversimplifying the results and overstating the case risks damaging the public credibility of the entire scientific
enterprise, particularly if the predictions are subsequently revised, and is arguably an abdication of professional and
intellectual ethics.
Beyond the features of the individual models themselves, what this workshop illustrated was the increasing
interdisciplinarity of climate studies. Economists need to know about atmospheric chemistry, land use researchers
need to understand what technological innovations are likely to appear in the future, and regional planners need a
quantitative understanding of the range of likely rainfall scenarios. Climate change is the overarching challenge of
our time, and only by combining all our intellectual resources can we prepare ourselves for life on a rapidly changing
planet.
Slides and other material from this event can be downloaded from http://www.hss.ed.ac.uk/climatechange/
Congratulations Dr Jha!
Congratulations are also due to Dr Shantenu Jha, who has been awarded prizes for papers at two recent conferences: TeraGrid 08 and the International SuperComputer Conference (ISC). Dr Jha is also the theme leader for the e-Science Institute’s theme on Distributed Programming Abstractions.

At ISC in Dresden, Dr Jha received the best paper award as one of the co-authors of Distributed I/O with ParaMEDIC: Experiences with a Worldwide Supercomputer, for which Pavan Balaji of Argonne National Laboratory was the lead author.
Dr Jha was part of the worldwide team which took a completely new and non-traditional approach to distributed I/O, called ParaMEDIC (Parallel Metadata Environment for Distributed I/O and Computing), which uses application-specific transformation of data into orders-of-magnitude smaller metadata before performing the actual I/O. The team deployed a large-scale system to facilitate the discovery of missing genes and construct a genome similarity tree by encapsulating the mpiBLAST sequence-search algorithm within ParaMEDIC. The overall project involved nine computational sites spread across the US, generating more than a petabyte of data that was then “teleported” to a large-scale facility in Tokyo for storage.
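The core trick can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual ParaMEDIC code (which encapsulates mpiBLAST): the compute site reduces bulky results to tiny application-specific metadata, and the storage site regenerates the full output from its own replica of the database, so only the metadata crosses the wide-area network.

    # Illustrative sketch only: DATABASE, search() and the record format are
    # invented stand-ins for the mpiBLAST pipeline described above.
    DATABASE = {i: "ACGT" * 256 for i in range(1000)}  # replicated at both sites

    def search(query):
        """Compute site: produce bulky results (IDs plus full sequences)."""
        return [(i, DATABASE[i]) for i in range(0, 1000, 100)]

    def to_metadata(results):
        """Application-specific reduction: keep only the matching sequence
        IDs, orders of magnitude smaller than the sequences themselves."""
        return [i for i, _seq in results]

    def from_metadata(meta):
        """Storage site: regenerate the full results from the local replica,
        so only the tiny metadata had to cross the WAN."""
        return [(i, DATABASE[i]) for i in meta]

    results = search("ACGTACGT")
    meta = to_metadata(results)  # this is what gets "teleported"
    assert from_metadata(meta) == results
    raw = sum(len(seq) for _i, seq in results)
    print(f"shipped {len(meta)} IDs instead of ~{raw} characters of sequence")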
Dr Jha’s SAGA-based paper, Developing Adaptive Scientific Applications with Hard to Predict Runtime Resource Requirements with SAGA, won the Performance Challenge Prize at TeraGrid 08. The Performance Challenge Prize is given for papers that demonstrate new levels of performance in any dimension of value to TeraGrid users, such as WAN bandwidth, file transfer performance, visualisation display size and so on. Dr Jha was the lead author of this paper and is also the PI for the project. The work on SAGA is primarily funded by OMII-UK. SAGA is still under active development, and it is hoped it will eventually be made available on the NGS.

Figure: The application, written using SAGA, makes run-time decisions about which resources to use by dynamically invoking the Batch Queue Prediction tool, which runs on several supercomputers on the TeraGrid and around the world.
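The decision logic in that caption can be sketched as follows. Everything here is a hypothetical stand-in – the resource names, predicted_queue_wait and estimated_runtime are invented, not the actual SAGA or Batch Queue Prediction APIs – but it shows the shape of the run-time choice: pick the machine minimising predicted queue wait plus estimated runtime.

    # Hypothetical sketch of run-time resource selection; all names invented.
    RESOURCES = ["ranger.tacc", "abe.ncsa", "hpcx.ac.uk"]

    def predicted_queue_wait(resource: str, cores: int) -> float:
        """Stand-in for a BQP-style service: predicted wait (s) for `cores`."""
        fake = {"ranger.tacc": 1200.0, "abe.ncsa": 300.0, "hpcx.ac.uk": 5400.0}
        return fake[resource] * (cores / 256)

    def estimated_runtime(resource: str, cores: int) -> float:
        """The application's own model of its runtime on this machine (s)."""
        speed = {"ranger.tacc": 1.0, "abe.ncsa": 1.6, "hpcx.ac.uk": 1.2}
        return 3600.0 * speed[resource]

    def pick_resource(cores: int) -> str:
        """Choose the resource minimising predicted time-to-solution."""
        return min(RESOURCES, key=lambda r: predicted_queue_wait(r, cores)
                                            + estimated_runtime(r, cores))

    print(pick_resource(cores=256))  # then submit the job there via SAGA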
ISSGC08 for All
Anyone with Internet access can now see the thrills and spills of the sixth International Summer School on Grid
Computing, which is being held from 6-18 July in Hungary. The school is training approximately 40 researchers
from 20 countries in the ways of grid computing — a technology that helps scientists tackle complex problems by
combining the power of computers around the world to create a powerful, shared computing resource.
In the spirit of shared resources, the school has made all of its teaching materials available to the public via the
ICEAGE digital library. Further, the school has invited a team of GridCast bloggers to blog live from behind the
scenes, ensuring that the excitement of the school is also shared with those learning from home.
Those interested in following the blog or accessing teaching materials should go to: http://www.gridcast.org.
More information:
International Summer School on Grid Computing ’08:
http://www.iceage-eu.org/issgc08
Media Contact: ISSGC’08 GridCast
Cristy Burne - Grid Outreach Coordinator, GridTalk
Phone: +41 (0)22 76 75590; Mobile: +41 (0) 76 487 0486
Email: cristy.burne@gridtalk-project.eu
Website: http://www.gridcast.org
Forthcoming Events Timetable
July

17-18: EPSRC/TSB e-Science Projects Meeting – NeSC
http://www.nesc.ac.uk/esi/events/888/

23: JISC Community Engagement Working Meeting: UK One-Stop-Shop – eSI
http://www.nesc.ac.uk/esi/events/905/

30-31: PASTA Workshop 2008 – Edinburgh University Informatics Dept
http://www.nesc.ac.uk/esi/events/889/

October

1-3: Network Inference in Genetic Studies: the eSI GeneSys Inaugural Meeting

6-10: The DCC Digital Curation 101 – NeSC

13-15: Microarray data analysis and meta-analysis – NeSC

16-17: Living texts: interdisciplinary approaches and methodological commonalities in biology and textual analysis – eSI

20-21: OMII-UK Face-to-Face Meeting – NeSC
http://www.nesc.ac.uk/esi/events/868/
http://www.nesc.ac.uk/esi/events/907/
This is only a selection of events that are happening in the next few months. For the full listing go to the following
websites:
Events at the e-Science Institute: http://www.nesc.ac.uk/esi/esi.html
External events: http://www.nesc.ac.uk/events/ww_events.html
If you would like to hold an e-Science event at the e-Science Institute, please contact:
Conference Administrator,
National e-Science Centre, 15 South College Street, Edinburgh, EH8 9AA
Tel: 0131 650 9833 Fax: 0131 650 9819
Email: events@nesc.ac.uk
This NeSC Newsletter was edited by Katharine Woods. Layout by Jennifer Hurst.
This is the last edition of NeSC News edited by Katharine Woods. Gillian Law will return as editor for NeSC News in September.
There will be no NeSC News in August 08. The next edition of NeSC News will appear in September/October 08.