handout - School of Informatics

Innovative, Significant, Rigorous:
Can we describe ourselves the way the REF wants us to describe our papers?
Samples taken from Category A staff at Cambridge, Glasgow, Manchester, Oxford and UCL:
RAE 2008 (Computer Science and Informatics):
http://www.rae.ac.uk/submissions/submissions.aspx?id=23&type=uoa
Oxford English Dictionary definitions
 Innovation: a change made in the nature or fashion of anything; something newly
introduced; a novel practice, method, etc.
 Significance: the quality of being worthy of attention; importance, consequence.
 Rigour: strict sense or interpretation; precision, exactness; (in later use also) the quality
or condition of being highly detailed, accurate, and thorough.
REF: identifying & assessing excellent research of all kinds, in up to 100 words
 Innovation - are they novel tools, theories, methodologies in their creation or in their
field of application? What are you changing, and why does it need to be changed?
 Significance - who is this piece of work significant for? How ‘worthy’ are your modes
of dissemination? Does the work have a quantifiable impact on your field?
 Rigour - how can you prove the quality of your research methods? Where did this
research originate and where is it going?
Beyond the REF: applying the criteria to your personal life, in up to 100 words
 Innovation - how is this a new thing for you? Is it also new for the wider world?
 Significance – what does this change about your life? How have you developed as a
result? Do your actions have a wider impact?
 Rigour – is this something you have done to the best of your abilities?
Show, don’t tell
 Try to back up everything you claim with proof. For example, reconsider phrasings such
as ‘This is an innovative paper’ & ‘This is a prestigious conference’ and demonstrate
rather than describe: ‘For the first time in X context’ & ‘European conference
(acceptance rate <12%)’.
Cut to the chase
 Be specific, be accurate, be immodest.
 When editing fiction, you generally end up cutting the first and last paragraph because
they’re waffle: are you describing the content of your research or the significance of it?
Journal article
This was one of the first papers concerned with e-Science published in the Journal of Computer
Supported Cooperative Work. Drawing from analyses of a major e-Science project, the paper
systematically re-specifies the notion of trust and ethics as practices in context, unpacking the
notion of ‘trust’ in both computer and social sciences. This work led to the funding of the ESRC
Oxford e-Social Science project: Ethical, Legal and Institutional Dynamics of e-sciences, an
interdisciplinary project between the Computing Laboratory and the Oxford Internet Institute. It
also formed the foundation for the SSRC/ESRC Visiting Fellowship Award for international
research. [98 words]
Journal article
Efficient implementation of a biologically motivated model of associative memory. Takes an
established model of associative memory - Kanerva's Sparse Distributed Memory (SDM) - and
simplifies it to yield a much lower-cost implementation, based on N-of-M codes, that can be
implemented using spiking neurons and binary unipolar synaptic weights with a simple Hebbian
learning rule. The new SDM is analysed theoretically and experimentally (by numerical
simulation) and is shown to be efficient, robust to input noise, and more 'biologically-plausible'
than Kanerva's original SDM. [83 words]
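The mechanism this sample summarises can be made concrete in a few lines. The following toy Python model is an invented illustration only (names and sizes are assumptions, not the paper's implementation): it stores one N-of-M code word against another using binary unipolar weights and a simple Hebbian rule, then recovers it by picking the N most active output lines.

```python
# Toy sketch of an N-of-M coded associative memory with binary unipolar
# weights and a simple Hebbian rule. Hypothetical illustration only --
# sizes and names are invented, not taken from the paper.
import random

M, N = 16, 4  # each code word activates exactly N of M lines

def random_nofm(rng):
    """A code word: the set of N active lines out of M."""
    return set(rng.sample(range(M), N))

class BinaryHebbMemory:
    def __init__(self):
        # W[i][j] = 1 once address line i and data line j have fired together
        self.W = [[0] * M for _ in range(M)]

    def write(self, addr, data):
        for i in addr:          # Hebbian: strengthen co-active pairs
            for j in data:
                self.W[i][j] = 1

    def read(self, addr):
        # activation of output line j = number of active address lines
        # connected to it; recover the data word as the N most active lines
        act = [sum(self.W[i][j] for i in addr) for j in range(M)]
        return set(sorted(range(M), key=lambda j: act[j])[-N:])

rng = random.Random(0)
mem = BinaryHebbMemory()
addr, data = random_nofm(rng), random_nofm(rng)
mem.write(addr, data)
print(mem.read(addr) == data)  # True: the stored word is recovered
```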
Journal article
This paper examines recent developments in cross-disciplinary science and argues that a ‘Big
Science’ approach is increasingly evident in the life sciences - facilitated by a breakdown of the
traditional barriers between academic disciplines and the application of technologies across these
disciplines. Drawing on ethnographic analyses of a global e-Science project, we identify key
social, organisational and policy criteria for successful uptake of such large infrastructure
initiatives. This work contributed to the specification of an interdisciplinary research agenda for
EPSRC that resulted in a call for proposals by the EPSRC on Usability in e-Science. [94 words]
Conference contribution
This paper is in the context of software verification. Software programs have a state space that is
typically too large to be analyzed directly by means of explicit and symbolic state representations.
Boolean programs are therefore a commonly used abstraction. Boolean programs have the same
control flow constructs as the original program, but only Boolean variables. This paper describes
an algorithm to analyze Boolean programs with unbounded thread creation. The experimental
results have been obtained on automatically generated abstractions of the Apache Web server.
[84 words]
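The abstraction step described above (same control flow, only Boolean variables) can be sketched with a toy example. The code below is an invented illustration, not the paper's algorithm: a concrete loop over an integer is abstracted to a Boolean program whose only state is the predicate b == (x > 0); after the decrement the abstraction cannot tell whether x is still positive, so b is updated nondeterministically and we enumerate both choices.

```python
# Minimal sketch of abstracting a program to a Boolean program.
# Invented toy example -- not the paper's algorithm.

def concrete(x):
    """Concrete program: decrements x until it is no longer positive."""
    while x > 0:
        x = x - 1
    return x > 0          # the predicate of interest at exit

def boolean_program(b):
    """Boolean abstraction: same control flow, but the only state is the
    predicate b == (x > 0). After x = x - 1 the new value of b is unknown,
    so it is chosen nondeterministically; we enumerate both choices and
    over-approximate the reachable states."""
    reachable = {b}
    changed = True
    while changed:
        changed = False
        for cur in list(reachable):
            if cur:                        # loop guard b holds
                for nxt in (True, False):  # nondeterministic update of b
                    if nxt not in reachable:
                        reachable.add(nxt)
                        changed = True
    # exit states are those where the guard fails
    return {s for s in reachable if not s}

print(concrete(5))            # False
print(boolean_program(True))  # {False}: abstraction agrees at exit
```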
Journal article
Overview of an innovative self-timed Network-on-Chip (NoC) technology - the world's first
fully-developed self-timed NoC fabric, supporting packet-switched on-chip communications
using asynchronous circuits. Describes the technology behind Silistix Ltd (John Bainbridge, Chief
Technical Officer), a company spun-out to commercialise tools for self-timed NoC design. Since
its launch in December 2003 Silistix has attracted over £6M of venture capital funding from a
consortium led by Intel Capital, and currently employs 14 engineers in Manchester with a sales
office in California. ISI citations 11; Google-Scholar citations 55(11pa). [85 words]
Journal article
This paper discusses eDiaMoND - a UK e-Science Flagship Project that explored the use of grid
architectures to support Breast Imaging in the UK. It explores the vision of a working context
that moves from film to digital in the light of preliminary results from the initial requirements
investigations, and how such technologies could be incorporated into existing work practices.
The challenges discussed informed two interdisciplinary proposals funded by 1) ESRC the
‘IMaGE’ project, investigating ownership of intellectual property rights in medical data in
collaborative grid computing environments 2) EPSRC Embedding e-Science Applications:
Designing and Managing for Usability project. [99 words]
Journal article
This paper introduces propositional SAT solvers to abstraction-based software verification. To
the best of our knowledge, this paper reports a technique that accurately handles any ANSI-C
program expression for the first time. It also describes a precise method to reason about pointer
arithmetic, and the first model checker to do so for ANSI-C programs. The technique has been
adopted by a number of other tools. In particular, there is now a version of SLAM that uses
SAT-based reasoning for Windows device drivers. 48 citations (Google Scholar). [86 words]
Conference contribution
This paper reports on the work conducted by the ESRC funded IMaGE project, investigating the
challenges to copyright and database law of collaborative medical databases. The paper is the first
of its kind, where ethnographers and lawyers collaborated to detail and analyse the formal and
informal laws applying to the various parties in a major eHealth project, to form more general
conclusions regarding ownership rights in distributed medical databases. This work has generated
long-needed debate regarding models of ownership. The KnowRight conference focuses on the
interaction of Intellectual Property Rights, related Information Rights, Ethical Problems, and
Information Dependent Technology. [99 words]
Journal article
Computational complexity results for determining logical relationships between sentences in
various fragments of English. Applies techniques from model-theory, proof theory and
complexity theory to natural language. Extends earlier work [Journal of Logic, Language and
Information 13, 2004, 207-223], where techniques of complexity theory were applied to
semantics of fragments of natural language for the first time. [56 words]
Conference contribution
Outstanding Paper Award from the program committee of the flagship conference in the field
(IEEE Security & Privacy). This paper also received quite wide popular-science media coverage
(e.g., New Scientist, issue 2334, 16 March 2002, page 22; BBC News online, 9 March 2002, etc.).
Most of the follow-up research that I am aware of was and is being done for government security
agencies that do not routinely publish their results, therefore academic citation indices may not
yet fully reflect the impact that this paper had in the compromising-emanations community.
Paper was quoted as having inspired other hardware-security work. [98 words]
Conference contribution
Evaluation of Information Retrieval Systems is fundamental to the discipline, but under the
current methodology comparisons may be biased. This paper defines a new measure that extends
the current evaluation methodology to determine whether two systems can be compared fairly,
which was validated on a number of TREC Test Collections. Recently published and presented
at the European Conference on IR (ECIR 2007) (acceptance rate <20%), this work takes a
significant step forward in addressing bias in the evaluation process and identifies a key
variable which affects such comparisons. An extended version of this work has been
published in the IPM journal. [104 words]
Conference contribution
Simple model of the learning mechanism in animals for expected-time-to-reward in delayed-reward reinforcement learning, based on a spiking neuron model and the Sutton-Barto model.
First attempt to explain, via an ab initio model, the 'scalar property' which occurs in a wide range
of interval-timing experiments across many animal species, including birds, non-human
mammals, and even fish. Successfully accounts for many aspects of the experimental data. [65
words]
Journal article
This paper introduces SAT-based software verification techniques to SpecC, a system-level
modeling language based on ANSI-C. System-level modeling makes extensive use of bit-level
constructs, and thus, the theorem provers used by tools that are not based on SAT are not
applicable. To the best of our knowledge, our tools are the only ones to support bit-level
constructs and a sound treatment of variable overflow and shared-variable concurrency. [67
words]
Conference contribution
This paper won the EAPLS (www.eapls.org) prize for best paper in the ETAPS'06 conference
federation. It provides an intermediate language which can express both lazy and eager
evaluation. This is not merely a "union of two languages" embedding, but a full embedding
whereby eager values appear as a subset of lazy values and lazy values retract to eager values
through a variant of Haskell "deep seq". Technically this is achieved by providing type-negation
(representing continuation) rather than the function arrow; double negation then expresses laziness.
[84 words]
Journal article
This pioneering paper presented results from the first few years of our AutoHan Project. We
implemented a self-configuring software architecture for home networks using an XML-based
registry and HTTP-based event service. The ideas are now well-known in the field of Ubiquitous
Computing and they fed directly into the Oxygen project at MIT. They formed the basis for our
current work on safety-critical co-operating embedded systems. [65 words]
Conference contribution
This paper demonstrates how Aspects can be used to provide flexibility in service-based
computing. It addresses one of the most important practical problems in developing web service
architectures, the emerging dominant style in industry, that is providing support for dynamic
adaptation. This paper shows how flexibility can be achieved at a number of different levels by
using mechanisms from aspect programming. The work was published in the leading software
engineering conference, ICSE (14% acceptance rate), and has a growing number of citations (25
according to Google Scholar) as the issue comes increasingly to the fore. [96 words]
Journal article
Fast, on-line algorithm for learning geometrically consistent maps using only local metric
information from an exploring robot. One of the first on-line Simultaneous Localisation and
Map-building (SLAM) algorithms; described as the '... first efficient algorithm for solving such
problems' in a recent, major textbook on probabilistic inference in robotics ['Probabilistic
Robotics', Thrun et al, MIT Press, 2005, p381]. Provides both theoretical and experimental
validation of the approach. [67 words]
Conference contribution
This paper was the first detailed demonstration and explanation of compromising RF emanations
from modern flat-panel displays. The work has received substantial interest among European
manufacturers and users of electronic voting machines in 2007, after the effect described led to
the revocation of the certification of an electronic voting machine (sdu NewVote) for use in
elections in the Netherlands. [60 words]
Journal article
Journal version of a paper published in the 11th Logic in Computer Science Conference
(LICS'96), one of the major conferences in computer science. The first mathematical model of its
kind, it spurred much work in the area on models of name-passing process calculi and beyond,
e.g. in the context of algebraic theories with binding operators. [57 words]
Conference contribution
Deep looping constructs are a challenge for software analysis tools. The promise of Model
Checking is to deliver a counterexample when the property does not hold. In complex programs,
such a counterexample may need to have millions of steps through a single loop. Previous
verification algorithms based on predicate abstraction have to perform one pass through their
refinement procedure in order to proceed one iteration through a loop of a program. This paper
presents a technique that generates recurrences for paths through loops, thus scaling to much
deeper counterexamples. The experimental results demonstrate exploitable buffer overflows in C
programs. [99 words]
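The gain described above (a recurrence replaces millions of refinement passes) can be illustrated with a toy calculation. The example below is invented, not the paper's procedure: if the loop body is i = i + step, the recurrence i_k = i_0 + step*k lets us solve directly for the iteration count at which the property fails, instead of unrolling the loop once per refinement pass.

```python
# Sketch: summarising a loop path by a recurrence instead of unrolling it.
# Invented toy example -- not the paper's procedure. Loop body: i = i + step;
# the property fails when i == target. Unrolling needs one refinement pass
# per iteration; the recurrence i_k = i_0 + step*k solves for k directly.

def counterexample_depth(i0, step, target):
    """Solve i0 + step*k == target for the iteration count k, or return
    None when no non-negative integer solution exists."""
    k, rem = divmod(target - i0, step)
    return k if rem == 0 and k >= 0 else None

print(counterexample_depth(0, 2, 1_000_000))  # 500000 -- in one step
print(counterexample_depth(0, 2, 3))          # None -- target unreachable
```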
Conference contribution
This paper was shortlisted for the best paper prize at JCDL, a conference with an acceptance rate
below 30%. It is widely cited in the literature on use and usability of digital libraries (69 citations
on Google Scholar as at 26/8/07). It is one of the earliest naturalistic studies of users'
experiences of using DLs, and related empirical data to an evolving theory ('Interaction
Framework') on the properties of interactions, introducing and exemplifying interaction concepts
such as "blind alleys", "discriminable events" and "transitions" in interactions. This work has
been foundational to our subsequent work on DL use and usability. [99 words]
Journal article
This paper first appeared at DesignCon 2003 (Santa Clara, CA, 2003), one of the major
international conferences in this area, under the title "Using Tension VTOC for Large SoC
Concurrent Engineering: A Real World Case Study". The company Tension Technology EDA
was started to exploit it. Several patents and other papers cannot currently be disclosed for
commercial reasons. This work is in use or under evaluation by nearly every major IC
manufacturer (50+ companies). The co-authors are staff at one of the IC manufacturers. [84
words]
Conference contribution
Previous work on distributed information retrieval (DIR) has assumed that the collections were
static. In this work, we consider the problems of DIR in dynamic environments where the
collections change. Our contributions in this work are in first identifying how static
representations of collections adversely affect the performance of a DIR system, and then
proposing a number of different methods which can combat this problem with minimal cost.
This work was recently published and presented at the leading IR conference ACM SIGIR
(acceptance rate 17%) and signifies a change in paradigm from static to dynamic. [95 words]
Journal article
This paper gives the most general construction known so far for probabilistic event structures.
It shows how probability and causality combine within the model of event structures for
concurrency. [31 words]
Conference contribution
It is well known that doing almost any non-trivial transformation on concurrent programs is
difficult. But many problems (e.g. network packet processing) are naturally specified this way in
terms of model implementations. Such programs must be optimised to fit modern multi-core
processors. This paper is the first development of widely applicable optimisations implemented
inside a real compiler (available from SourceForge); notably it builds on type-based control of
interference previously developed by the same authors. Due to the increase in multi-core
processors this is expected to be a foundational work. [89 words]
Conference contribution
Published in one of the major conferences on programming languages. First significant advance
on the study of type isomorphism for recursive types, surprisingly putting the problem in the
context of computational algebra. [32 words]
Journal article
The paper describes the xlinkit technology that supports consistency checking of semi-structured
distributed documents. The paper is an extended journal version of an earlier paper that received
the best paper award at Automated Software Engineering 2001. The algorithms described in the
paper have been patented (US Patent No. 7143103) and have formed the basis for the formation
of Systemwire Ltd. Xlinkit has been used to define the consistency rules of the Financial Product
Markup Language, the International Swaps and Derivatives Association standard that governs all electronic
derivative trading world-wide. The paper has been cited more than 50 times. [98 words]
Conference contribution
This paper describes models and algorithms for the real-time segmentation of foreground from
background layers in stereo video sequences. The main idea is to combine several sources of
information (colour, contrast, and stereo). We show that our methods outperform existing
techniques in terms of accuracy on sequences with ground truth segmentation, and give a good
quality composite video output in the application of background substitution. This paper
received a best paper honourable mention award at CVPR'05. The journal version of this paper
'Probabilistic fusion of stereo with color and contrast for bilayer segmentation' was published in
PAMI, 28(9):1480-1492, September 2006. [100 words]
Journal article
This paper is the main publication on xlinkit. It describes both the theory and its application.
This work has been internationally patented and resulted in a successful spinout company
acquired by trade sale in August 2008. xlinkit is widely used in the finance industry and many of
the leading investment banks now use it to support billions of pounds of trading. Several
hundred research licenses of xlinkit have been granted, and there have been many thousands of
downloads of the trial workbench. ACM TOIT is a high impact journal. The paper has 82 citations according to Google
Scholar (the tool URL gets many more). [100 words]
Conference contribution
This paper was awarded the Ted Nelson prize at the conference (acceptance rate 31% in 2004). It
reports on an integration of digital library and spatial hypertext technologies, and the usability
evaluation of the resulting system. These technologies had not previously been integrated. The
evaluation established the value of integration (creating a richer environment for information
work rather than just seeking or organising), but also identified limitations of this early prototype.
This work was foundational for further developments on integrated information working
environments that include finding, organising and using information. [90 words]
Journal article
The paper is a result from work funded in EPSRC project GR/M90641. The paper identifies
features that distinguish pure routing problems from pure scheduling problems and shows how
mixing these features (such that pure problems become richer and more realistic) affects solution
technologies, and in particular commercially available toolkits popular in these two domains. This
information is then used to blend technologies. A number of conference papers (ICGT2002
paper on graph transformations, SARA2002 on reformulation, ICAPS2003 paper "What's the
difference?") and workshop papers underpin this journal paper. [87 words]
Journal article
Journal version of a paper published in the 17th Logic in Computer Science Conference
(LICS'02), one of the major conferences in theoretical computer science. First significant advance
on the study of type isomorphism for simply typed lambda calculus with sums, establishing a
very strong link to Tarski's high school algebra problem in mathematical logic and thereby
settling an open problem. [60 words]