This is a preprint of an article whose final and definitive form will be published in Serials
Review, v.40, no. 3, 2014 by Beverly Dowdy and Ros Raeford.
Electronic Resources Workflow: Design, Analysis and Technologies for an Overdue
Solution
1. Background - Describing the Problem
Academic libraries are in the middle of a huge transition from print to electronic. This is hardly
news to anyone, but technical services divisions are still wrestling with the impact to
workflows. In prior decades our workflows focused on processing print items that were tangible
and moved through an assembly line of sorts. Once items arrived at the end of the assembly line,
they were put on shelves and rarely handled again by technical services staff. The workflow was
linear; things rarely changed.
The transition to electronic resources began around twelve years ago. Processes grew slowly for
handling them and librarians tended to fall back on what they knew best: a linear workflow.
Except, electronic resources are not linear. They start, stop, sometimes backtrack, branch off,
and come back again. There is no physical piece to alert someone that a part of the process
needs to happen. This process has no linear end. It was assumed that the end was reached once
access was provided. At the point of access, library staff often walked away and assumed e-resources would continue to work. Many of us were not prepared for the changes that needed to be managed: platforms change, subscriptions move in and out of packages, holdings dates shift, and titles change. Publishers turn access on and off. Licenses need to be renewed and access maintained. And worse, we often didn't realize when a change was happening and were
unaware if an e-resource made it through the entire lifecycle. Many of these processes were
handled on a piecemeal basis.
E-resources at Duke Libraries, like every other library, have grown exponentially. We began
keeping statistics on activated databases and unique e-journals in 2003. At that time 100
databases and around 28,000 unique e-journals were made available. Our focus and energies
were spent on purchasing, registering, activating, and cataloging these resources. Once they
were made available, little thought was given to maintaining them. Wouldn’t the publishers or
vendors tell us if we needed to change something?
By 2011 troublesome rumblings were afoot that led us to question our assumptions. That year
Duke Libraries completed a LibQual survey. The resulting data showed that Information
Control, which concerned e-resources and their access, placed last. When results were divided
by user type, faculty and post-docs rated Information Control as unacceptable. Many comments
expressed frustration with getting to full text, as illustrated in the following examples:

“Sometimes articles are listed on the library website and when you click "Get it at Duke"
it is unavailable, and it makes it very inconvenient when doing a research paper. Also,
sometimes the wrong articles come up when you click the "Get it at Duke" button.”

“Sometimes I end up clicking and finding myself at the journal's website where it asks for
a log in (when Duke already has access).”

“Access to E-Journals such as Elsevier's seems fragmented and error ridden. One gets a
menu of possible sources, half of which end up not having the items they claim, blocking
access, or failing to deliver for other reasons.”
At the same time we also had anecdotal information that there were serious problems with our
electronic resource processes. Technical services staff decided to complete an audit of our
databases to gain some hard data. How many of our databases made it all the way to the end of
the workflow process and how many were still working? Out of 694 subscribed databases and
210 free ones, 156 (16%) were represented in one silo (Aleph or Serials Solutions) but not the other, and six were never activated at all. A similar check of single-subscription e-journal
titles discovered that 7% were never activated.
The results were unacceptable. The cost of six databases could easily top $100,000 per
year. Our subject librarians, our faculty, and our students expected that when something was
purchased, they should have access to it. It was embarrassing for a faculty member to request an
e-journal purchase, only to discover the library had purchased it three years ago but it was never
activated. Or to troubleshoot a database that still had a record in our catalog, only to discover it
was cancelled five years ago and the cataloging record never removed.
We realized at that time that all of the errors made between 2003 and 2011 were cumulative. We
were dealing with the fallout. A major change in the workflow was called for. Our data
management had been ad hoc, and we were losing control. The major problems boiled down to
the following: (1) uncertainty among our staff because the workflow was complicated and many
understood only their piece of the process, (2) work was often duplicated because staff were
unsure, (3) big picture knowledge was lacking, (4) there was a lack of transparency with
information that was time-consuming to dig out, (5) no one had any way of knowing when
something dropped out of the process, and (6) we had a reactive and time-consuming approach to troubleshooting access problems. All of these taken together pointed to processes that were ineffective and staff who were overwhelmed.
By May of 2014, Duke Libraries had acquired and made available 2,011 databases or platforms
containing 181,000 unique e-journals. The numbers grew to the extent that two or three people
could no longer handle the work, so the workflow across departments was decentralized. Other
major changes followed: between 2003 and 2014 we changed knowledgebases,
re-organized and created a new department called “Electronic Resources and Serials
Management,” assigned and trained more staff to handle these resources, and moved from a
federated search technology to a discovery tool. All the changes with staff, technologies, and
decentralization of work ensured that by default, email became our workflow driver. Once a
discrete task involving an electronic resource was completed by one staff member, an email was
sent to another staff member containing information needed for the next step. By the end of the
process, a cascade of emails had been sent. Once an e-resource was made available to our
patrons it was rarely checked again. More work was always arriving for recent purchases, so no
one had time to review older activations. All troubleshooting was done on an as-needed basis.
We relied on our patrons to tell us when something wasn’t working.
A business report written for the Libraries’ Executive Group stated, “The library’s investment in
electronic resources has now eclipsed its print-based expenditures (e-based collections
expenditures constituted 53% of the entire collections expenditures in 2012), and the need to
address these concerns has reached a tipping point. Furthermore, the growth rate of the
Libraries’ electronic collections continues to slope sharply upward, yet the staffing levels within
technical services remain static and primarily dedicated to print resources. Given e-resource
growth, current staff practices are becoming unsustainable.” (Mitchell, 2013).
2. Literature Review - Others Are Having the Same Problem
Duke library staff knew they were not alone. Over the last decade, nearly every academic library
has been fighting the same battle. Some libraries purchased electronic resource management
systems (ERMS) in hope of better data management. Duke Libraries was one of them. In 2007-2008 there was a failed attempt at implementing Verde, which took us back to square
one. Going forward we were leery of investing either time or money in another ERM system
unless an excellent case could be made for how it would solve specific problems.
Library literature echoed with similar disappointment with commercial ERMs. In many cases
implementations were still ongoing. At a 2010 presentation at the Electronic Resources and
Libraries (ER&L) Conference, Tim Jewell argued for standards and reported that “many libraries
report slow, difficult, partial or failed implementations (of ERMs).” Entering data into an ERM
system can be manually intensive. Collins and Grogg (2011) reported, “ERM system offers a
one-stop place for various pieces of information (contacts, licenses, orders, past problems), and
the amount of work involved in gathering and entering all of this information has been
formidable.” In response, some libraries, such as York University in Ontario (Lupton and Salmon, 2012), built their own ERM system in-house. Some implemented open source systems such as CORAL (Imre, Hartnett, et al., 2013). Silton and LeMaistre (2011), working with the Innovative Interfaces Inc. (III) ERM, stated, “The current literature overwhelmingly indicates that III's ERM presents workflow challenges so great as to impede completing an implementation.” Others also noted concerns related to the time-consuming nature of ERM system
implementation and maintenance. Condic (2008) noted that “[t]he big question that cannot be
overlooked is the amount of time that is needed to input this information into ERM products.
This is not a trivial matter and must be addressed before serious consideration is given to
subscribing to any of these products.” In addition, Collins and Grogg (2011) reported “Concerns
about data ingestion quickly turn into concerns about data maintenance. Several librarians
described the data within their ERM systems as static and only as good as their ability to
maintain it. The increase in workload has resulted in increased staffing needs for many libraries.”
3. Our Team and Staff Interviews
At Duke Libraries, library staff decided not to assume that all we needed was a different ERM.
Instead, it was critical for us to start at the very beginning and see where our own workflows (or
lack thereof) were failing. By taking a holistic approach, we aimed to determine exactly what was required rather than letting a commercial system decide that for us. The
Electronic Resources Workflow Analysis and Process Improvement Team, or ERWAPIT team,
was born. This team brought together experts from across the library, not just those inside
Technical Services.
The charge was to document and analyze all existing e-resource workflows. This was a major
challenge. Thinking we would not need to reinvent the wheel, we started by gathering other
published library workflows. Most were disarmingly simple (select, acquire, license, remit,
activate = A, B, C, D, E). The guts of the processes were lacking. Also, most of the workflows
found had a definite slant towards selecting and acquiring. Description of all of the work that
happened after initial activation was one activity box called “provide access and maintain.” The
ERWAPIT team needed to tease out what was encapsulated in that one box.
The team began with a published e-resources lifecycle from EBSCO and extrapolated from there
(see Figure 1). We also developed a staff responsibility matrix (see Figure 2). In order to
discover the tasks our staff were actually doing, and not what we thought they were doing,
interviews were conducted. A total of about forty staff were interviewed. Our guiding principles
were these: a supervisor could not interview a direct report, and no judgment could be made during the interview about how well or how poorly a staff member was doing their
job. The interviews were laborious but necessary. After all interviews were completed,
workflow diagrams were drawn (see Figures 3.1 and 3.2). The diagrams were then analyzed for
weak points, overall logic, and to see if tasks could be moved or automated for greater
efficiency.
INSERT Figure 1. Electronic Resources Lifecycle for Databases.
INSERT Figure 2. Staff Responsibility Matrix.
INSERT Figure 3.1 “As Is” Workflow Diagram for Databases.
INSERT Figure 3.2 “As Is” Workflow Diagram for Databases.
4. Results of Workflow Analysis
Findings from the team’s analysis included the following points.
1. Effective communication across departmental units was hampered by inefficient and
largely non-automated techniques. In large part, it was driven by email and human
memory.
2. Existing information about e-resources was often inaccessible to workflow
participants, or time-consuming to retrieve. Information was stored in many places:
various spreadsheets, our integrated library system (ILS), or vendor knowledgebase.
3. Many existing e-resource processes did not follow any standards and often resulted in lost
information and a high level of duplication across units. Conscientious staff would
duplicate what they could not find.
4. Quality control measures were largely reactive rather than proactive and relied heavily
on patron-initiated notifications. We had no way of knowing how many or what
percentage of our e-resource activations were accurate and operating.
5. Current staff practices were no longer sustainable, either financially or temporally. E-resources were increasing exponentially while staffing remained static.
6. The Library risked both its collections management structure and the user experience
by failing to address these systemic issues. Systemic issues needed a systems solution.
The overarching consideration of the ERWAPIT report centered on the key role that operational
workflows play in the success or failure of the Libraries’ ability to manage its resources. In the
words of the ERWAPIT team’s summary, a chain is only as strong as its weakest link. We decided
the weakest links in our decentralized, fragmented, error-prone system were email and human
memory.
5. Possible Solutions
Reviewing library literature about ERMs again, we heard another repeated complaint. In 2011,
Collins and Grogg completed a survey of 66 academic librarians about their experiences with
ERMs. “Over one-third of librarians surveyed prioritized workflow or communications
management, and they called it one of the biggest deficiencies (and disappointments) of ERM
functionality.” And, “Librarians noted a lack of flexibility in workflow components from
commercial ERM systems. On the flip side, librarians said that workflow support from other
commercial systems is too general and not granular enough. Librarians were forced to create
workarounds.” Survey respondents continued:
The number one priority for ERM systems was workflow management. The ERM system
has met this challenge at a relatively basic level. Across commercial, open source and
locally developed solutions, workflow management was noted as a partial success;
specifically mentioned were ticklers, status tracking, notifications, and alerts about
downtime. Successes with workflow vary significantly from system to system. . . .
Sophisticated workflow management obviously hinges on a notification system that can
recognize the role of staff members in the ERM process, allow for flexible email
communication throughout the system and provide alerts of status changes to both
external and internal users (Collins & Grogg, 2011).
Numerous other librarians also mentioned this inability to customize ERM system workflows to
their local processes. “Consequently, developing workflow management remains a catch-22—
functionality is either too specific to match local practice, or too general to support local
workflows” (Collins & Grogg, 2011).
Another complaint from the Collins & Grogg (2011) survey was lack of system interoperability,
which “has created a domino effect of problems…the data traditionally housed in the ILS
environment—such as cost, fund, and vendor data—remains segregated from the ERMS without
easy means for data transfer.”
Because Duke had many of the same concerns, an “ERM-like” team was formed. This team
decided to consider products that functioned like ERMs, only better. Traditional ERMs were not
considered because:

• By and large these systems were data warehouses without any workflow or “push” technology.
• An average of 2-3 years is required to implement the system because of data entry.
• Our associate university librarian (AUL) for Information Technology (IT) and Technical Services encouraged us to look outside of library vendors to other solutions.
From the outset Duke Libraries wanted to look at technologies that could do the job, not
necessarily ERMs and not necessarily products developed by library vendors. Norm Medeiros,
associate librarian and coordinator of bibliographic and digital services at Haverford College,
PA, noted “that using workflow management systems from business sectors might possibly fill
this functionality gap [of ERMs]” (Collins and Grogg, 2011).
Vendor products had been vetted over several years and found lacking. Most of all we realized
that bringing in another “information silo” was not the answer. We already had several
information silos: our ILS, spreadsheets, a link resolver/knowledgebase, a database for trialing
databases, and print files. We needed to consider solutions that included non-ERMs, and in doing
so, think outside the box. We looked for a technology to meet our current needs, especially
systems that had the capability to reduce or eliminate our heavy reliance on email and human
memory. We needed a technology that did not require us to parse out or enter the large volumes
of information concerning e-resources we had created over the last twelve years. There was
simply too much data.
6. Products Considered
The ERM-like team went outside the library vendor world to consider the following
technologies: Microsoft SharePoint Designer, ImageNow, and IBM products. IBM offered two
systems that worked hand in hand: BlueWorks Live and BPM (Business Process Manager). The
team chose to review these products because they held the promise of workflow and data
management. When looking at systems, the team prioritized two capabilities: push technology
and the ability to search across full-text data. Push technology would replace our reliance on email and human memory, and full-text searching across accumulated documents (license agreements, title lists, and vendor communications) would eliminate the need to re-enter data.
6.1. Microsoft SharePoint Designer
Microsoft states that SharePoint Designer can be used to create workflows and manage complex
business processes in two ways: by automating business applications and human collaboration.
This is accomplished through a workflow editor that allows for nested logic and sub-steps. Not
surprisingly, Visio workflows can be exported into it (Microsoft SharePoint Designer, 2010).
We found that SharePoint Designer had promise in that it could both search across full text of
multiple documents and utilize workflows. Tags and metadata could be added to documents,
and it could be set up with permissions. There was definite low-hanging fruit we could
implement.
But it had major weaknesses. Upon investigation we discovered that it offered only very basic
workflows; a complicated workflow would require the use of the server. Its workflow
management consisted of sending emails to staff in Outlook with a link to a task in SharePoint.
Or it could add the task to their Outlook task list. This put us back into the realm of relying on
emails for work to be completed. Also, our institutional support was murky because there was
no on-campus expertise. Finally, for our needs it was judged as simply not robust enough.
6.2. ImageNow
ImageNow is a document imaging and management system owned by Perceptive Software, a
business unit of Lexmark International. Duke offered on-campus support for entering our
workflows and handling all upgrades of ImageNow, which made it very attractive. Documents
could be tagged, and checklists could be attached to documents. Thus the history of a resource
could be seen at a glance. Multiple documents could be linked together. If the proper mail client
was installed, the system could ingest emails and pull in files from vendors. Any workflow
queue could be opened to see how long an item had been there and where it fit into the
queue. An inbox could be assigned to a workflow, and a supervisor could be assigned as the
queue lead. This would allow the supervisor to assign work or move assignments around to
different staff in a parallel workflow.
Upon logging in, the system showed what tasks arrived in the queue for staff to complete. All
work stayed in the system. It could report out on everything in the workflow queue. Users could
right-click on any document to display related documents. Emails and documents could be put
into holding queues while waiting for responses from vendors, etc. Permissions and security
could be assigned at the “add/edit” or “view only” level. Ingests could be automated or manual.
Workflows could be scripted.
But the system as purchased by Duke lacked several add-ons that were viewed as essential, and purchasing them would increase the cost of implementation. Critically, it lacked
optical character recognition (OCR) software and the ability to search across multiple full-text
documents. We could not input license agreements without extra cost. It lacked a document
check-in and check-out function (for Excel files, etc.) because the Document Management add-on had not been purchased by the University. It could not import Visio files that contained our already drawn workflows; they would have to be rebuilt in the system. At its heart, ImageNow was a document management system, not a workflow technology.
6.3. IBM BlueWorks Live and Business Process Manager
BlueWorks Live is a collaborative, browser-based environment in which teams can generate
workflows. A workflow can start either with imported Visio files, or one can be created in the
system. When BlueWorks Live is opened, a large canvas is generated upon which a set of
activity boxes can be moved and re-arranged, stacked and grouped through drag and drop. Once
a workflow has been imported or created, the system offers multiple views of the workflow. The
mapping view demonstrates problem severity and frequency as the process is documented. The
analysis view shows the steps in which the problems occur and their overall impact.
Documentation can be added to any activity anywhere along the workflow and files attached. It
can automatically generate PowerPoint or Word documents. The entire workflow can be shared
through different accounts (IBM BlueWorks Live, 2014). After the map is created and analysis
made, a Process Diagram is automatically generated that goes deeper within the modeling
activities. All work can then be imported into the more robust Business Process Manager
(BPM). BlueWorks Live seamlessly integrates with IBM Business Process Manager.
Business Process Manager is described by IBM as a comprehensive management platform which
provides full visibility. It provides tooling and a runtime environment for process design,
execution, monitoring and optimization, along with basic system integration support. The
product can be configured to support various levels of complexity. (IBM, Product Overview,
2014).
Both BlueWorks Live and BPM reside in IBM’s cloud, so no library IT costs would be incurred.
BPM has a good infrastructure and is very robust. Mentoring and training by IBM staff are built
into the contract. Most importantly, we found this system was intended to be a process designer
with push technology. It offered us the big picture of our workflow and that picture could be as
big as we wanted to design it. The number of workflows it could handle was almost unlimited.
These workflows could be improved and tweaked as often as needed. It was heavily iterative.
All user tasks were kept within the system; the workflow was not driven by email. Thus it would
end our reliance on email and human memory to move the data. It had the ability to integrate
with other systems (such as our ILS, Microsoft Outlook, and database trial management system)
if those systems could serve up the data. It was hugely transparent; anyone in the system could
see where any particular e-resource was in the workflow at any time. The staff interface was
pleasing. Staff could collaborate through the site’s social features. The reporting capabilities
were extraordinary; with them we could finally identify bottlenecks and add metrics. With the
metrics we could then iterate the workflows to make our processes even more efficient.
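To make the idea of workflow metrics concrete, the following sketch (our own illustration, not a BPM feature) assumes the system can export, for each completed task, the step name and its start and end times; averaging elapsed time per step is enough to surface the slowest links in the chain. The sample data is invented.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical export of completed tasks: (step name, start, end).
# Real values would come from the workflow system's own reports.
completed_tasks = [
    ("Set up trial", datetime(2014, 1, 6, 9, 0), datetime(2014, 1, 6, 11, 30)),
    ("Negotiate license", datetime(2014, 1, 8, 9, 0), datetime(2014, 1, 22, 17, 0)),
    ("Activate in knowledgebase", datetime(2014, 1, 23, 9, 0), datetime(2014, 1, 23, 10, 0)),
    ("Negotiate license", datetime(2014, 2, 3, 9, 0), datetime(2014, 2, 28, 17, 0)),
]

def average_hours_per_step(tasks):
    """Average elapsed hours for each workflow step, slowest first."""
    totals, counts = defaultdict(float), defaultdict(int)
    for step, start, end in tasks:
        totals[step] += (end - start).total_seconds() / 3600
        counts[step] += 1
    return sorted(((step, totals[step] / counts[step]) for step in totals),
                  key=lambda item: item[1], reverse=True)

for step, hours in average_hours_per_step(completed_tasks):
    print(f"{step}: {hours:.1f} hours on average")  # slowest steps surface first
```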
We found that BPM was a scalable solution that would also work for other parts of the library,
such as digital collections. There was a large potential for use in Duke Libraries beyond
electronic resources.
BPM was designed to be a workflow management system. It was not a document storage
system. Workflows with documents attached could flow through the system, but the documents could not
remain there. Once a workflow reached an end, BPM needed to be told where to store any
attached documents. The team decided to use SharePoint for this purpose, which we had
investigated earlier. SharePoint was perfect for document storage because of its capability for
full-text searching across multiple documents.
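The division of labor between the two systems can be sketched roughly as follows. This is only an illustration of the hand-off we had in mind, with placeholder names and URLs; it is not IBM's or Microsoft's API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Attachment:
    filename: str
    content: bytes

@dataclass
class WorkflowInstance:
    resource_title: str
    attachments: List[Attachment] = field(default_factory=list)
    document_links: List[str] = field(default_factory=list)

def upload_to_document_store(library_path: str, attachment: Attachment) -> str:
    """Placeholder for the real SharePoint upload; returns the stored document's URL."""
    # An actual implementation would call the document repository's API here.
    return f"https://sharepoint.example.edu/{library_path}/{attachment.filename}"

def archive_completed_workflow(instance: WorkflowInstance, library_path: str) -> None:
    """At the end of a workflow, move attached documents out and keep only links."""
    for attachment in instance.attachments:
        instance.document_links.append(upload_to_document_store(library_path, attachment))
    instance.attachments.clear()  # documents do not remain in the workflow system
```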
IBM required selected library staff to be trained intensively. After ten weeks of training we
would have our own local experts. Obviously it would be a steep learning curve that required
a large investment in time and effort.
In conclusion, BlueWorks Live and BPM were the closest fit for the majority of our articulated
needs and went beyond those needs. The team recommended to the Libraries Executive Group
that the IBM products were unsurpassed in their potential to re-make our electronic resources
workflow.
7. Implementation
7.1. BlueWorks Live
The process of implementing IBM's BlueWorks Live (BWL) and Business Process Manager
(BPM) systems was no easy feat. First, we had to develop a clearer understanding of how these
tools worked. Following our investigation, BWL proved fairly simple. As indicated earlier, it is
a process modeling tool, similar to Microsoft Visio, which allows its users to build process
diagrams. However, unlike Visio, one can include documentation, which is the equivalent of programming specifications that describe what needs to take place to make the process work in a BPM environment. For example, if the process required BPM to send an automatic email message, the documentation would provide the recipients and the contents of the email. One who is familiar with process modeling and developing programming specifications can adapt that knowledge quite easily to the use and application of BWL. Duke is fortunate to have staff who possess skills in these areas. In light of this, BWL became a self-directed learning process, which
enabled us to build an understanding of the tool and its application. Once a process is completed
in BWL, it can be imported into BPM. It is important to note that BPM can be implemented
without BWL. However, the programming specification component proved critical to our
implementation of the BPM system.
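As an illustration of documentation serving as a specification, the sketch below shows the kind of detail attached to an automatic email step: the recipients plus a message template filled from data captured earlier in the process. The field names, addresses, and code are ours, not BWL or BPM syntax.

```python
from string import Template

# Hypothetical specification attached to a "Notify subject librarian" activity.
email_step_spec = {
    "activity": "Notify subject librarian that trial is live",
    "recipients": ["subject.librarian@library.example.edu"],
    "subject": Template("Trial available: $title"),
    "body": Template(
        "The trial for $title ($provider) is now active and runs through $end_date."
    ),
}

def render_email(spec, **process_data):
    """Fill the specification's templates with data captured earlier in the process."""
    return {
        "to": spec["recipients"],
        "subject": spec["subject"].substitute(process_data),
        "body": spec["body"].substitute(process_data),
    }

message = render_email(email_step_spec, title="Example Database",
                       provider="Example Provider", end_date="2014-03-31")
print(message["subject"])  # Trial available: Example Database
```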
7.2. IBM's Business Process Manager and Proof of Technology Workshop
BPM, on the other hand, demanded a deeper understanding of the system to determine whether
this tool could solve our more pressing problems. First, would it allow us to make the process
more transparent by enabling us to see where in its lifecycle an e-resource rested
and to know when processing was complete? Second, would it provide the metrics necessary to
determine how long it took for an e-resource to make it through its workflow and help us identify
and address bottlenecks or process interruptions? Third, would it automate hand-offs from one
individual to the next and from one department to the next? Fourth, would it ensure easy access
to the information and documents required at each stage of the process? Finally, would it provide
the highly desirable replacement for email and human memory as the primary drivers of our
workflows?
The answers to these questions emerged at a "Proof of Technology Workshop" led by IBM on
December 11, 2012. A small group of Duke University Libraries' employees participated in this
one-day, hands-on workshop that was held at IBM in the Research Triangle Park. During this
workshop, we had the opportunity to develop a workflow in the BPM system. This hands-on
experience became a game changer for us. It was through the use of this tool that we began to
re-envision how we could approach workflow development to address all of the questions that
were described previously.
The "Proof of Technology Workshop" introduced significant change in how we viewed the
management of our e-resources workflows. The process of transforming our "To Be" workflow
to a BPM workflow became fairly simple once we could see our processes as sets of check lists
for the work that needed to be accomplished at each stage of the e-resources lifecycle (see Figure
5). Our initial draft essentially eliminated the need to manually send, receive, and respond to
email messages. The BPM system's ability to integrate with other library systems also indicated
that data entered earlier in the process could be captured and used to (1) generate and send an
email to the provider to set up trials, (2) update the trial management system (Django), (3) send
an email message to Subject Librarians that the trial had been established, and (4) create a
bibliographic record in our online catalog to facilitate the ordering process.
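A minimal sketch of that reuse of captured data, assuming each surrounding system exposes some programmatic interface; every helper below is a stand-in, since the actual integration code is not described here.

```python
from dataclasses import dataclass

@dataclass
class TrialRequest:
    title: str
    provider: str
    provider_email: str
    requesting_librarian: str

# Placeholder integrations (email service, Django trial manager, online catalog).
def send_email(to: str, subject: str, body: str) -> None: ...
def update_trial_manager(title: str, provider: str) -> None: ...
def create_brief_bib_record(title: str, provider: str) -> None: ...

def on_trial_data_captured(request: TrialRequest) -> None:
    """Reuse data entered once to drive each downstream step."""
    # (1) ask the provider to set up the trial
    send_email(request.provider_email, f"Trial request: {request.title}",
               f"Please set up a trial of {request.title} for our library.")
    # (2) record the trial in the Django trial management system
    update_trial_manager(request.title, request.provider)
    # (3) tell the subject librarian the trial has been established
    send_email(request.requesting_librarian, f"Trial established: {request.title}",
               "The trial has been set up; details are in the trial management system.")
    # (4) create a bibliographic record to facilitate the ordering process
    create_brief_bib_record(request.title, request.provider)
```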
7.3. An Evolution: “To Be” Workflows
A review of the "To Be" workflow (see Figure 4) shows a dramatic difference from the "As Is"
illustration exhibited previously. In this clearly less complicated "To Be" workflow diagram, we have eliminated duplication of effort, specified who needs to be informed and when, and
provided clarity about the documentation requirements at each stage of the e-resource lifecycle.
Additionally, we are in the process of developing a license negotiation team to review and
contribute to negotiation of the business terms. Similarly, we moved the verification of access to
the initial receipt of a message from the provider to reduce delays in access. OCLC records
downloaded at the point of acquisition are now suppressed and not available to users. Finally,
we have begun a periodic review of our e-journal holdings and coverage dates and our e-book title lists in order to take a proactive stance toward managing access to our e-resources.
INSERT Figure 4: "To Be" Database Workflow diagram
Despite these major changes, we had yet to address our more pressing problem: how do we make our e-resources process more transparent, ensure that a new acquisition makes its way through its entire lifecycle, and replace email and human memory in the management of e-resources workflows?
7.4. The Quick Win Pilot
Once we had our vision in hand and approval from our administration to move forward, IBM sent its team
onsite to work with a small group of Duke employees to implement what IBM refers to as its
Quick Win Pilot (QWP). The Duke team chose its database workflow to be the first BPM
process because it was the least complicated of our e-resource processes. IBM's QWP was
designed for success and it enabled us to transform our vision for a new database workflow into
reality. Our first order of business was to import our Microsoft Visio diagram into BWL and enhance it. Although we were confident that we had done a fairly decent job on our "To Be"
workflow diagram, the IBM team managed to tease out more details on how the process would
work in BPM (see Figure 5). Through this effort we gained a thorough understanding of the
BWL system, along with a deeper understanding of developing specifications for BPM.
INSERT Figure 5: BPM preliminary Database Workflow Diagram
The new BPM Database Procurement process (Figure 6) closely matched our vision of how the
process would work in the new environment. While initially we wanted to kick-start the process with a subject librarian's email request to set up a trial, it was determined that there was no way to know for certain the sender or the contents of the email. As an alternative, subject librarians would initiate the process by completing a form in BPM that includes the title and provider of the e-resource (see Figure 7). If the trial request is received during a time when trials are not being
conducted, BPM notifies the subject librarian and holds the request until the appropriate time.
Once trials resume, BPM automatically generates a check list of tasks to be completed by the
trial manager, who provides the additional information necessary to establish the trial (see Figure
8). BPM generates and sends an email to the provider and updates Django, our trial management
system, with the database name and the trial time period.
INSERT Figure 6: BPM Database Procurement Process
INSERT Figure 7: BPM Form Completed by Subject Librarians
INSERT Figure 8: Trial Manager Completes Pre-Trial Tasks
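The routing just described reduces to a small branch, sketched here with hypothetical helper names standing in for the actual BPM activities.

```python
from datetime import date

# Placeholder helpers; the real steps are BPM activities, not Python functions.
def trials_in_session(today: date) -> bool: ...
def notify(person: str, message: str) -> None: ...
def hold_request(title: str, provider: str, librarian: str) -> None: ...
def generate_pretrial_checklist(title: str, provider: str) -> list: ...
def email_provider(provider: str, subject: str) -> None: ...
def update_django(title: str, trial_period: str) -> None: ...

def handle_trial_request(title: str, provider: str, librarian: str, today: date) -> None:
    """Route a new trial request as described above."""
    if not trials_in_session(today):
        # Requests arriving outside the trial season are held, and the
        # subject librarian is told the request will wait until trials resume.
        notify(librarian, f"Your trial request for {title} is on hold until trials resume.")
        hold_request(title, provider, librarian)
        return
    generate_pretrial_checklist(title, provider)         # tasks assigned to the trial manager
    email_provider(provider, f"Trial request: {title}")  # system-generated message
    update_django(title, trial_period="30 days")         # record name and trial period (illustrative value)
```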
Upon completion of the database trial, the subject librarian logs on to BPM to make his or her
purchasing decision (see Figure 9). The system offers three options: (1) purchase, (2) decline, or
(3) defer for later consideration. BPM updates the trial management system with the outcome of
the subject librarian's decision. If the decision is to proceed with the purchase, the subject
librarian completes the order request. BPM generates another check list to be completed by the
license negotiator (see Figure 10). The process continues through the entire lifecycle for the
database procurement workflow with check lists generated for each of the remaining stages of
the process.
INSERT Figure 9: Subject Librarian Makes Purchasing Decision
INSERT Figure 10: Acquisitions Librarian Negotiates License Agreement
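The decision stage can be paraphrased the same way; the helper names below are illustrative stand-ins for the BPM activities and system updates described above.

```python
from enum import Enum

class Decision(Enum):
    PURCHASE = "purchase"
    DECLINE = "decline"
    DEFER = "defer"

# Placeholder helpers standing in for BPM activities and system updates.
def update_trial_system(title: str, outcome: Decision) -> None: ...
def collect_order_request(title: str, librarian: str) -> None: ...
def generate_license_checklist(title: str) -> None: ...

def handle_purchase_decision(title: str, librarian: str, decision: Decision) -> None:
    """Record the subject librarian's decision and advance the workflow if purchasing."""
    update_trial_system(title, decision)          # the trial system receives the outcome in every case
    if decision is Decision.PURCHASE:
        collect_order_request(title, librarian)   # librarian completes the order request
        generate_license_checklist(title)         # next checklist goes to the license negotiator
    # DECLINE ends the process; DEFER parks the title for later consideration.
```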
8. Lessons Learned From Our Journey
Our journey to become better stewards of our responsibility to improve the user experience has
forced us to look beyond library vendors to find solutions to our problems. Some may find it difficult to look at their own dirty laundry. We must admit that it was not easy. However, it was
absolutely critical in developing a big-picture view of the process and identifying the problems
with our workflow. Unsurprisingly, we discovered that the problems unearthed were due to the
systems and processes in place. Our staff remain dedicated and committed to meeting the needs
of our users. They were instrumental in informing the Electronic Resources Workflow Analysis
and Process Improvement Team about our previous process and arming us with the knowledge to
introduce change.
9. Conclusion
If libraries are serious about improving the user experience, then they will need to translate that commitment into the funding to support it. Fortunately for Duke University Libraries,
our administration understood the need to improve ERM workflows and provided the support
that made these changes possible.
Future development of BPM workflows will focus on cancellations and renewals, e-journal single title purchases, e-journal packages, e-book single title purchases, e-book packages, MARC record loads, and troubleshooting and proactive quality control. Clearly, we still have a long way to go, but we now have a road map that will lead us to our final destination.
References
Collins, M. & Grogg, J. E. (2011, March). Building a Better ERMS. Library Journal 136, no. 4,
22-28.
Condic, K. (2008). Uncharted Waters: ERM Implementation in a Medium-Sized Academic
Library. Internet Reference Services Quarterly 13, no. 2, 135. Retrieved from
http://dx.doi.org.prox.lib.ncsu.edu/10.1080/10875300802103643
IBM BlueWorks Live. (2014). Process made simple. Retrieved from:
https://www.blueworkslive.com/#!gettingStarted:overview
IBM Knowledge Center. (2014). Getting started with IBM Business Process Manager. Retrieved from: http://www-01.ibm.com/support/knowledgecenter/SSFTN5_8.0.1/com.ibm.wbpm.main.doc/topics/cbpm_wbpm_gsg.html?lang=en
IBM Knowledge Center. (2014). Product Overview. Retrieved from: http://www-01.ibm.com/support/knowledgecenter/SSFTN5_8.0.1/com.ibm.wbpm.main.doc/topics/cbpm_ibpmarch.html?lang=en
Imre, A., Hartnett, E., & Hiatt, D. (2013, Jan.–June). CORAL: implementing an open-source
ERM. Serials Librarian, 64, no.1-4, 224-234.
Jewell, T. (2010, Feb. 1). The NISO ERM data standards and best practices review. Electronic
Resources and Libraries conference.
Lupton, A. & Salmon, M. (2012) Building an Electronic Resource Management (ERM) Solution
at York University. Journal of Library Innovation 3, no.2, 105-122.
Microsoft SharePoint Designer. (2010). Introducing SharePoint Designer 2010. Retrieved from: http://office.microsoft.com/en-us/sharepoint-designer-help/introducing-sharepoint-designer-2010-HA101782482.aspx?CTT=1
Silton, K. & LeMaistre, T. (2011, June). Innovative Interfaces' Electronic Resources
Management System: A Survey on the State of Implementation and Usage. Serials Review, 37, no. 2, 86.