CE2-2014-0263-VLO_analysis_full


Title: VLO analysis
Version: 1
Author(s): Twan Goosen (CLARIN ERIC), Thomas Eckart (University of Leipzig)
Date: 2014-01-17
Status: Final
Distribution: public
ID: CE-2014-0263 (full version)

Analysis based on VLO version 2.18 (under development at time of analysis)

1 Table of contents

1 Table of contents
2 Functional analysis
   2.1 Functionality
   2.2 Information retrieval
   2.3 Usability
   2.4 Cross-platform accessibility
   2.5 Scalability
3 End-product analysis
   3.1 Documentation
   3.2 Performance
   3.3 Stability
   3.4 Configurability
   3.5 Analysability
   3.6 Integration in CLARIN ecosystem
4 Code base analysis
   4.1 Code quality
   4.2 Testability
   4.3 Portability
   4.4 Modularity
   4.5 Documentation
   4.6 Expertise level required
   4.7 Up-to-datedness

2 Functional analysis

2.1 Functionality

The following requirements have been stated 1 for the VLO from an end-user perspective:

• Providing a “uniform and easy to use interface to search in […] metadata records”

• Direct access to the language resources. Some resources can only be accessed after providing a username and password – this should be made clear

• Exclusion of records that are irrelevant “in a scenario of electronically enhanced research, e.g. descriptions of books in a library without any ISBN-number or further information”

• “In [cases where search requirements cannot be fulfilled with a facet interface] users should be informed about other ways of querying the metadata (e.g. the CLARIN Metadata Browser).”

• A feedback mechanism “to attend the responsible repository administrator about [issues concerning the state of metadata records]” and to “enhance the awareness of the providers” with respect to erroneous records.

The current version of the VLO in principle meets all of the above criteria. Some aspects (such as usability and access aspects) can be improved upon and are discussed in the sections below.

2.2 Information retrieval

This section deals with the degree to which the VLO, aside from usability aspects, enables a user to find the resources she/he is looking for. This largely depends on the structure of the underlying data source: the SOLR index of records presented in the VLO.

The indexes are created by the importer component of the VLO on the basis of a mapping, defined by code and configuration, which is applied to the available CMDI records.

From this it follows that the likelihood of the VLO’s users successfully finding the records they are interested in depends on the following chain of criteria:

Profile: whether the CMDI instance is based on an appropriate, well-designed profile with proper semantic annotations (by means of data categories), usage of vocabularies where appropriate, etc. This may not always be the case; curation on the metadata model level is required to detect and combat these cases.

Metadata instance: the quality and quantity of values in the CMDI instance. Indexes are based on values in the records; therefore useful indexes can only be generated if the available values conform to the constraints expressed in the profile. This may not always be the case; curation on the metadata instance level is required to detect and combat these cases.

Mapping: the aptness of the process of normalisation, post-processing and final mapping to the defined set of facets carried out by the importer. This mapping can get fairly complex due to the heterogeneity of the metadata space, and in some cases may lead to unexpected or undesirable results. The challenge is to achieve a scalable and maintainable mapping strategy that is generic but also flexible enough to cater for specific data sets.

Presentation: the method of presentation of the various facets and their value domains. A selection of facets is explicitly presented on the main page of the VLO web application to allow the user to narrow down the result set. Other facets are only used for presentation in the record view, or are used indirectly or internally only. The choice of ‘navigational’ facets can be discussed and may change over time 2. Providing too many options might confuse the user, while providing too few limits the functionality. Some facets might be presentable in alternative ways (as discussed below), while others could be made available optionally as ‘expert’ or ‘niche’ options.

1 Van Uytvanck, D., Stehouwer, H., & Lampen, L. (2012). Semantic metadata mapping in practice: The Virtual Language Observatory. In N. Calzolari (Ed.), Proceedings of the Eighth International Conference on Language Resources and Evaluation (LREC 2012), Istanbul, May 23rd-25th, 2012 (pp. 1029-1034). European Language Resources Association (ELRA). http://hdl.handle.net/11858/00-001M-0000-000F-85F4-5

2.3 Usability

In general the usability seems to be good as long as users understand the faceted browsing paradigm. A simple instruction or description could add to the intuitiveness.

Recommendation: add a short instruction along the lines of “Select a value in one or more facets to narrow down the search”

Users may not directly understand the relation between the facet boxes and the results panel on the right.

Recommendation: clearly mark the results panel as such, maybe use the “selected facets” list as a link between this and the facet boxes

Not all users may realize that clicking “All” on the box of a filtered facet removes the value from the filter.

Recommendation: as an alternative way of achieving this, ‘remove’ links (probably marked as X) could be added to the ‘selected facets’ list (related to https://trac.clarin.eu/ticket/419)

Some facets could be presented in alternative ways. For example, country and continent could be presented as an interactive map (the former would require more space or zooming facilities to allow accurate selection) or some slider functionality could be used to support selection of temporal range. As an alternative to the ‘all values’ page for the facets, a tag cloud could be considered.

Recommendation: consider the implementation of additional facet presentation methods

2.4 Cross-platform accessibility

The VLO works fine on wider screens such as those of PCs and tablets, but less so on handheld devices such as smartphones because of the table-based layout.

Recommendation: migrate to a more flexible, CSS-based layout and optionally provide separate styling for different form factors

Part of the functionality of the “result page” (which represents a single metadata record) depends on JavaScript. While for the majority of platforms and users this will not pose any problems 3, in some cases it might – especially more technical users might have JavaScript disabled for a variety of reasons 4. The affected features are the “previous/next” navigator and the list of resources, which simply do not render without JavaScript enabled.

2 The CLARIN-D VLO taskforce is assessing the suitability of the current and potential facets and will come forward with recommendations in this regard in February 2014.

3 Recent statistics are hard to find, but a study from 2010 indicates that roughly 1 to 2 percent of general visitor traffic represents clients without support for JavaScript: http://developer.yahoo.com/blogs/ydn/many-users-javascript-disabled-14121.html

Recommendation: consider refactoring the affected features to use “AJAX fallback components”, which offer the benefits of a JavaScript-based solution but retain support for clients without JavaScript. Wicket has good support for this approach, which is already employed in parts of the application.
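As an illustration, the following is a minimal sketch of such a fallback component based on Wicket's AjaxFallbackLink (Wicket 6 API); the panel and method names are invented for the example and do not correspond to actual VLO classes.

// Sketch only: a navigation link that works both with and without client-side JavaScript.
import org.apache.wicket.ajax.AjaxRequestTarget;
import org.apache.wicket.ajax.markup.html.AjaxFallbackLink;
import org.apache.wicket.markup.html.panel.Panel;

public class RecordNavigatorPanel extends Panel { // hypothetical panel name

    public RecordNavigatorPanel(String id) {
        super(id);
        setOutputMarkupId(true); // required for partial (AJAX) repaints
        add(new AjaxFallbackLink<Void>("next") {
            @Override
            public void onClick(AjaxRequestTarget target) {
                moveToNextRecord(); // hypothetical model update
                if (target != null) {
                    // JavaScript available: repaint only this panel
                    target.add(RecordNavigatorPanel.this);
                }
                // target == null: plain HTTP request, Wicket re-renders the full page
            }
        });
    }

    private void moveToNextRecord() {
        // hypothetical: advance the current record index in the page model
    }
}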

2.5 Scalability

“The responsibility for the mapping (i.e. the ISOcat links) now fully lies at the side of the resource provider. This approach is also a guarantee for scalability: adding more CMDI-defined metadata schemes comes at little additional cost.” (Van Uytvanck, Stehouwer and Lampen, 2012)

4 See http://programmers.stackexchange.com/questions/26179/why-do-people-disable-javascript for an overview


3 End-product analysis

3.1 Documentation

The following pieces of documentation are available:

1. CLARIN Trac page 5: a basic high-level overview of the technologies used and some links to information on the current data sources and mapping. It contains up-to-date information about the deployed versions and receives regular content updates. As a source for technical documentation it is helpful, but it could be extended and restructured. Some high-level UML diagrams would be helpful here.

2. The document ‘Introduction to the VLO Faceted Browser’ 6: describes the software components (importer, database, web application) and gives some notes on configuring these. It has not been updated since mid-2011 and seems to be out of date in the details. The contents could probably be merged with the Trac page.

3. The ‘DEPLOY-README’ file 7: gets packaged along with the application and provides technical documentation targeted at the application manager, primarily in the form of step-by-step instructions on how to install, update and configure the software components on the server.

4. The paper ‘Semantic metadata mapping in practice: The Virtual Language Observatory’ 1: a high-level introduction followed by a description of the components and workflow without too many technical details. The main focus is on the harvesting, ingestion and post-processing of metadata records.

5. The VLO section of the CMDI FAQ 8: answers some questions about the VLO, mainly from the provider’s point of view.

Recommendation: merge the introduction document (2) into the Trac page (1), removing all outdated information. Extend the Trac page with UML diagrams showing the relations between the various components (by means of a UML component diagram). Refer to the packaged documentation (3) and the FAQ (5) from the Trac page so that the latter can function as an entry point for all technical information. Make a reference from the VLO web application itself to a (new?) page that collects all relevant end-user documentation.

As an outcome of discussions within the CLARIN-D VLO taskforce, it was planned to create an additional help page in the VLO web application, generated on the basis of the facetconcepts.xml file and targeted at CMDI developers. However, this has not been implemented yet.

3.2 Performance

Performance of the SOLR database

The performance of the SOLR database seems to be good enough to support the current number of records and the actual request load. There have been no obvious problems with the rising number of records over the last years. However, the performance has not been evaluated with respect to the number of parallel requests, the cost of different kinds of queries or the number of records in the database. An evaluation of the capabilities along these lines could therefore help decide whether the current storage solution is future-proof.

5 https://trac.clarin.eu/wiki/CmdiVirtualLanguageObservatory?version=61

6 Included in the Subversion source code repository: https://svn.clarin.eu/vlo/trunk/docs/Introduction%20to%20the%20VLO%20Faceted%20Browser.docx

7 Included in the Subversion source code repository: https://svn.clarin.eu/vlo/trunk/DEPLOY-README

8 http://www.clarin.eu/faq-page/275

Performance of the importer

The performance of the importer process has not been a problem in the past. A complete import/update round for the current number of files takes around 2-4 hours, which seems to be fast enough for the current import strategy. However, very large metadata files (currently: >10 MB) are skipped during the import process to avoid performance degradation of the whole system. This concerned roughly 100 files in one of the most recent import runs.

Performance of the web application

The perceived performance of the VLO web application seems to be quite good. The search/facet selection page is very responsive, both on initial load and while selecting facets or browsing through the results. The page showing individual records has some performance bottlenecks, but the lazy-loading techniques employed for the ‘expensive’ elements have reduced the urgency of these. Still, the loading of the record navigator (showing the number of records, the index of the current record and previous/next links) is somewhat slow. This may be improved by an alternative query strategy between the web application backend and the SOLR database.

3.3 Stability

Stability of the SOLR database

There are no known problems with the stability of the SOLR database.

Stability of the importer

There are no known problems with the stability of the importer.

Stability of the web application

Recently the VLO web application has suffered from stability issues in production. In spite of increased heap space assignments, the application has regularly stopped working with critical levels of heap space usage. As a result, the application becomes unresponsive and a restart of the Tomcat servlet container is required. This has happened at least once a week in the period between November 2013 and the beginning of January 2014. The exact source of these issues has not been determined but a number of related observations have been made:

• As was observed in a heap dump from the production environment, the page cache generated by the Wicket framework consumes the largest amount of heap space. The parameters governing this page cache are easier to configure in the more recent version of Wicket on which the upcoming release of the VLO web application will be based. This will allow for better tuning of the page cache size.

• Instructing ‘Googlebot’ to skip the VLO pages while crawling, along with an increase of the heap space from 2 GB to 8 GB, has improved the situation significantly, as can be seen in Figure 1. From the date of this change on, the application has run stably, with occasional high load levels comparable to before (around 2 GB) but without sustained peak loads, and with recovery as expected. This leads to the hypothesis that the crawling process carried out by Googlebot is too taxing for the application, for example because the cache fills up faster than it can be flushed. The most straightforward way to test this would be to re-enable crawling by Googlebot, but this is not desirable in production.

Figure 1. Heap usage of the VLO before and after disallowing Googlebot and increasing the max heap as shown in the PNP4nagios monitoring console

Recommendation: Experiments with the upcoming version of the VLO, tweaking the cache size parameters, will have to show whether this improves the stability of the web application. Once sensible values have been discovered, the Googlebot could be allowed again, the result of which will need to be closely monitored.

3.4 Configurability

The VLO has a single configuration file that is shared between the VLO web application and the importer, which makes configuring the application as a whole quite convenient.

3.5 Analysability

Access statistics

Access statistics, showing the number of (unique) views, both global and per page, can provide valuable insight into the way the VLO is used.

There is a potential for delivering (aggregated) statistics to metadata providers on their respective resources and collections. Gross totals (i.e. number of views per metadata record) are probably already very interesting to these providers. This could be augmented with information about the queries that lead users to the records accessed.

In general, the potential of “path analysis” 9 is not being exploited at the moment.

Google Analytics offers advanced functionality but its usage may not be preferable due to privacy concerns (as the data is being stored externally). However, alternative analytic tools offering comparable functionality (possibly with the exception of statistics on Google search terms) are available. Currently AWStats 10 is employed, which provides good basic functionality but alternatives may be considered.

Recommendation: investigate possible alternatives for access analysis and ways to provide (part of) the results to the metadata providers

Performance analysis

The VLO production server (Catalog) is being monitored with Nagios 11, allowing real-time monitoring of a number of performance-related variables. In particular, the development of memory usage and system load can be seen over time for the individual Tomcat servlet containers (the web application and SOLR run in separate instances).

9 See e.g. http://www.mediapost.com/publications/article/94308

10 http://awstats.sourceforge.net/

11 http://www.nagios.org/

A more fine-grained performance analysis is conceivable, but not completely straightforward, and thus far there has not been a real need for it. Nevertheless, timing measurements at the level of sets of individual requests (e.g. per page in the web application or per query in the SOLR database) could provide some interesting insight into potential performance bottlenecks in both the web application and the SOLR database. Tools like JMeter 12, Selenium 13 and a number of SOLR monitoring tools 14 could be of use here.

3.6 Integration in CLARIN ecosystem

Customisability

The VLO web application has support for multiple themes. This allows for a relatively easy adaptation of the looks of the VLO to make it fit in different contexts. Currently there are two built-in themes: a default theme and one for CLARIN-D. Theme selection happens through a GET parameter in the request.

Theme support is currently implemented as a custom extension of Wicket; the framework also has some native support for this (called skinning/styles 15), which might be preferable in case of a re-implementation.
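For illustration, a minimal sketch of Wicket's native style mechanism; the page and parameter names are invented and do not reflect the current VLO implementation.

// Sketch only: selecting a Wicket "style" per session makes Wicket resolve
// style-specific markup and resources, e.g. SearchPage_clarin-d.html.
import org.apache.wicket.Session;
import org.apache.wicket.markup.html.WebPage;
import org.apache.wicket.request.mapper.parameter.PageParameters;
import org.apache.wicket.util.string.StringValue;

public class SearchPage extends WebPage { // hypothetical page name

    public SearchPage(PageParameters parameters) {
        super(parameters);
        // e.g. /vlo/?theme=clarin-d selects the 'clarin-d' style for this session
        StringValue theme = parameters.get("theme");
        if (!theme.isEmpty()) {
            Session.get().setStyle(theme.toString());
        }
    }
}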

In principle it is possible to deploy separate instances of the VLO, each with their own set of records, mapping specification, configuration, theming etc. If this turns out to be common practice within CLARIN, it would be desirable to further extend the configurability and simplify the configuration process of the various components.

Integration with Federated Content Search

A coupling with the CLARIN-D Federated Content Search (FCS) has been established. If a selected record has a reference to a content search endpoint, a link to the Aggregator 16 is presented that allows the user to search within the context of the selected resources. In the upcoming version, it will be possible to search within all FCS-enabled records included in a result set from the search page of the VLO. The suggested addition of a filter (facet) for FCS support, allowing the user to quickly select only those resources that are content searchable, would further enhance the usability of this feature.

Recommendation: add availability of a content search link as a facet and add support for this in the VLO web application interface (not necessarily as a facet box, could for example be a checkbox/toggle) 17 .
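One conceivable shape for such a toggle, sketched with Wicket's AjaxCheckBox; the SOLR field name, the FilterModel interface and the results panel wiring are assumptions made for the purpose of illustration, not part of the current VLO code base.

// Sketch only: a "content searchable" toggle as suggested in the recommendation above.
import org.apache.wicket.Component;
import org.apache.wicket.ajax.AjaxRequestTarget;
import org.apache.wicket.ajax.markup.html.form.AjaxCheckBox;
import org.apache.wicket.model.Model;

public class ContentSearchToggle extends AjaxCheckBox {

    private final FilterModel filterModel;  // hypothetical holder of active filter queries
    private final Component resultsPanel;   // panel to repaint (needs setOutputMarkupId(true))

    public ContentSearchToggle(String id, FilterModel filterModel, Component resultsPanel) {
        super(id, Model.of(Boolean.FALSE));
        this.filterModel = filterModel;
        this.resultsPanel = resultsPanel;
    }

    @Override
    protected void onUpdate(AjaxRequestTarget target) {
        if (getModelObject()) {
            // restrict the result set to records with a content search endpoint
            filterModel.addFilterQuery("_hasContentSearch:true"); // hypothetical SOLR field
        } else {
            filterModel.removeFilterQuery("_hasContentSearch:true");
        }
        target.add(resultsPanel);
    }

    // Hypothetical interface describing what the toggle expects from the search model
    public interface FilterModel {
        void addFilterQuery(String filterQuery);
        void removeFilterQuery(String filterQuery);
    }
}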

Integration with CLARIN EU website/portal

Thus far, the VLO has been positioned as a stand-alone search front-end. However, at the moment a CLARIN portal is being considered, which is likely to have a direct search facility into the VLO as a required feature. This is already possible, since a free-text search parameter can simply be added as a GET parameter to the URL of the VLO web application 18; optionally, facet values can be selected in a similar fashion.

12 http://jmeter.apache.org/

13 http://docs.seleniumhq.org/

14 http://wiki.apache.org/solr/SolrMonitoring, http://grokbase.com/t/lucene/solr-user/1239xn7knt/reporting-tools, http://docs.lucidworks.com/display/lweug/Integrating+Monitoring+Services

15 https://cwiki.apache.org/confluence/display/WICKET/Localization+and+Skinning+of+Applications

16 http://weblicht.sfs.uni-tuebingen.de/Aggregator/

17 https://trac.clarin.eu/ticket/289

The available theming options may help enhance integration with such a portal with relatively little effort.

18 E.g. http://catalog.clarin.eu/vlo/?q=nijmegen


4 Code base analysis

4.1 Code quality

All of the components of the VLO (SOLR database, importer and web application) are Java-based and are managed as a set of Maven projects grouped together in a reactor project. This makes it relatively easy to manage the internal dependencies as well as external libraries and to achieve proper packaging in a portable way. The structuring of these projects and their interdependency could be slightly improved, but the overall structure can be maintained.

Recommendation: turn the reactor project into a parent project to the sub-modules so that dependencies and properties can be shared between these. The web application project should not depend on the importer project. Some general cleaning up of the Maven build specifications (“POM” files) would be desirable.

The code of the importer is of reasonable quality. Some aspects could be improved, especially to enhance the code’s structure and its customisability:

• Remove the hard-wired relation between facets and post-processors, instead retrieving this information from the configuration file (see the sketch after this list)

• Make relevant filters of the import process more explicit in the code (such as the filtering of resource-less CMDI files) or move them to the configuration

• Modularise a number of large classes and methods for the sake of readability and maintainability
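A rough sketch of the first point, replacing the hard-wired mapping with a configuration-driven registry; the class, interface and configuration structure shown here are hypothetical and only illustrate the general idea.

// Sketch only: post-processors are looked up by facet name instead of being hard-wired.
// Assumption: the shared VLO configuration can provide a facet name -> class name map.
import java.util.HashMap;
import java.util.Map;

public class PostProcessorRegistry {

    private final Map<String, PostProcessor> processors = new HashMap<String, PostProcessor>();

    public PostProcessorRegistry(Map<String, String> facetToProcessorClass) throws Exception {
        for (Map.Entry<String, String> entry : facetToProcessorClass.entrySet()) {
            // Instantiate the configured post-processor for each facet
            PostProcessor processor = (PostProcessor) Class.forName(entry.getValue()).newInstance();
            processors.put(entry.getKey(), processor);
        }
    }

    public String postProcess(String facet, String value) {
        PostProcessor processor = processors.get(facet);
        return (processor == null) ? value : processor.process(value);
    }

    // Hypothetical common interface for the existing post-processor classes
    public interface PostProcessor {
        String process(String value);
    }
}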

The code of the web application is of reasonable quality. However, a couple of aspects could be improved by various degrees of refactoring:

• Usage of static references, which reduces the modularity and testability (see section 4.2 below) of the application.

• Some general Java best practices are not followed as consistently as they could be, for example with respect to variable declaration and scope and the mutability of objects.

• Error handling is a bit crude; often all Exceptions are caught and handled uniformly.

• The presentation layer is somewhat “old-fashioned” HTML, containing layout attributes. Valid XHTML would be preferable, amongst other reasons because it mixes better with Wicket markup (which has its own XML namespace).

• The code could benefit from the dependency injection pattern, which could be achieved relatively easily by using a framework such as Spring 19. This, too, would increase the degree of modularity and testability of the application (see the sketch after this list).

• A number of built-in features of Wicket are not taken advantage of, in favour of “home-brew” solutions, such as in the case of theming and hide/show functionality.

• A more sophisticated design of component models 20 could improve the cleanliness of the code as well as the runtime performance of the web application.
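To illustrate the dependency injection point, a minimal sketch using the wicket-spring integration; the page and DAO names are hypothetical, and a SpringComponentInjector is assumed to be registered in the application's init() method.

// Sketch only: the DAO is injected instead of being reached through a static reference,
// which makes it easy to substitute a mock in tests. SearchResultsDao is hypothetical.
import org.apache.wicket.markup.html.WebPage;
import org.apache.wicket.markup.html.basic.Label;
import org.apache.wicket.spring.injection.annot.SpringBean;

public class RecordPage extends WebPage { // hypothetical page name

    @SpringBean
    private SearchResultsDao searchResultsDao; // injected by the SpringComponentInjector

    public RecordPage() {
        // assumes a RecordPage.html with a component with wicket:id="name"
        add(new Label("name", searchResultsDao.getRecordName("some-record-id")));
    }

    // Hypothetical service interface standing in for the VLO's DAO classes
    public interface SearchResultsDao {
        String getRecordName(String recordId);
    }
}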

4.2 Testability

Both the importer and the web application have unit tests defined for automated testing of the code base. According to a Cobertura 21 analysis, the proportion of code covered by tests in these components is 65% and 17% respectively. The latter number is low mostly because the Wicket components lack tests. Such tests could be written relatively easily with the test facilities of the Wicket framework, although the urgency of unit tests for the UI components can be debated. The Data Access Object (DAO) classes are not well covered by tests either, which makes the coupling with the data layer quite brittle.

19 http://projects.spring.io/spring-framework/

20 In the sense as described in the Wicket Guide: http://wicket.apache.org/guide/guide/chapter10.html

21 http://cobertura.github.io/cobertura/

Recommendation: add test cases for the Wicket components and SOLR DAO’s.

Introduction of dependency injection (see section 4.1 above) will improve the “unit” aspect of the unit tests, as it will allow for easy mocking of services (such as the DAO’s) in the controller and presentation layers (Wicket pages), which makes it possible to test their behaviour in isolation.

Recommendation: after the introduction of dependency injection, use a mocking framework to test classes that consume services in isolation
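A possible shape for such a test, sketched with WicketTester, Mockito and the wicket-spring test utilities; it reuses the hypothetical RecordPage and SearchResultsDao from the sketch in section 4.1 and is not based on the actual VLO classes.

// Sketch only: the DAO is mocked so the Wicket page can be tested in isolation.
import org.apache.wicket.spring.injection.annot.SpringComponentInjector;
import org.apache.wicket.spring.test.ApplicationContextMock;
import org.apache.wicket.util.tester.WicketTester;
import org.junit.Before;
import org.junit.Test;
import org.mockito.Mockito;

public class RecordPageTest {

    private WicketTester tester;

    @Before
    public void setUp() {
        // Mock the service layer
        RecordPage.SearchResultsDao dao = Mockito.mock(RecordPage.SearchResultsDao.class);
        Mockito.when(dao.getRecordName("some-record-id")).thenReturn("Test record");

        // Provide the mock through a mock Spring application context
        ApplicationContextMock context = new ApplicationContextMock();
        context.putBean("searchResultsDao", dao);

        tester = new WicketTester();
        tester.getApplication().getComponentInstantiationListeners()
                .add(new SpringComponentInjector(tester.getApplication(), context));
    }

    @Test
    public void rendersRecordName() {
        tester.startPage(RecordPage.class);
        tester.assertRenderedPage(RecordPage.class);
        tester.assertLabel("name", "Test record");
    }
}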

The test cases for the importer do not run against an instance of a SOLR database but instead partly re-implement the import process.

Recommendation: consider adding integration tests that make use of an embedded SOLR instance 22
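Such an integration test could, very roughly, take the following shape; this sketch assumes SOLR 4.x, a test SOLR home with a schema matching the VLO index under src/test/resources/solr, and a core name that is made up for the example (the exact CoreContainer set-up differs between SOLR versions).

// Sketch only: an importer integration test against an embedded SOLR instance.
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.core.CoreContainer;
import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;

public class ImporterEmbeddedSolrTest {

    private static CoreContainer container;
    private static EmbeddedSolrServer server;

    @BeforeClass
    public static void startEmbeddedSolr() throws Exception {
        // Assumption: test resources contain a minimal solr.xml and core configuration
        container = new CoreContainer("src/test/resources/solr");
        container.load();
        server = new EmbeddedSolrServer(container, "vlo-test-core"); // hypothetical core name
    }

    @Test
    public void importedRecordIsSearchable() throws Exception {
        // In a real test the importer would be run against the embedded instance;
        // here a single document stands in for its output.
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "test-record-1");
        doc.addField("name", "Test resource");
        server.add(doc);
        server.commit();

        Assert.assertEquals(1L,
                server.query(new SolrQuery("id:test-record-1")).getResults().getNumFound());
    }

    @AfterClass
    public static void shutdown() {
        container.shutdown();
    }
}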

A plan for carrying out an acceptance test (by a human tester) has been formalized and implemented at MPI-PL (but appears to be out of date, last update mid 2012). The acceptance test only covers the functionality of the web application front-end. A broader scope of acceptance testing does not seem feasible.

Recommendation: upgrade the acceptance test plan and integrate execution of this test plan in the release cycle (on the basis of the beta releases).

4.3 Portability

Maven projects make the VLO code base highly portable. Some references to local configuration files need to be filtered out.

4.4 Modularity

There are clear division lines between the main components of the VLO: SOLR database, importer and web application. Modularity within the latter two code bases is reasonable but could be improved, as explained in sections 4.1 and 4.2 above.

4.5 Documentation

The statements below concern both the importer and the web application.

• Javadoc: reasonable but far from complete. Certain classes are well covered with Javadoc while others have virtually none at all.

• Inline documentation: a reasonable number of comments are spread throughout the code, making the flow of execution quite easy to follow in most cases. The page classes of the web application as well as the test cases could be improved by extending the inline documentation. The same applies to the Maven POM files.

• There is no README file that introduces the project to the developer. There could at least be one that contains a link to the Trac page.

Recommendation: extend the Javadoc and inline comments where needed (NetBeans IDE 7.4 can generate an overview of Javadoc issues), and add a README file for the developer.

22 This process is described at a number of places on the web, e.g. https://wiki.searchtechnologies.com/index.php/Unit_Testing_with_Embedded_Solr


4.6 Expertise level required

Development on the VLO requires knowledge of the following libraries that cannot be considered part of the Java core stack:

• Apache Wicket web application framework
  o Well documented, with a free guide and a good text book, but known to have a steep learning curve nevertheless
  o Used in other CLARIN and MPI projects: Component Registry, Metadata Browser

• SOLR search platform
  o Well documented, with a free guide
  o There are few free alternatives with similar functionality, support and community

• VTD-XML high-performance streaming XML parser
  o Well documented through free online documentation
  o Chosen for its high performance (it may be worth re-assessing this). StAX 23 might be considered as an alternative with a larger adoption base; possibly a performance-maintenance trade-off

• Simple XML serialization framework
  o Well documented
  o Used to serialize/deserialise the configuration XML
  o In our use case it does not have any advantages over JAXB, which is highly standardized within Java (part of SE 6 and later) and would most likely lead to a more compact implementation of configuration management

Recommendation: keep Wicket, SOLR and VTD-XML as frameworks; replace usage of Simple with JAXB.
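To illustrate what such a replacement might look like, a minimal sketch of binding the shared configuration with JAXB; the element and property names are invented and do not necessarily match the actual VLO configuration file.

// Sketch only: JAXB (part of Java SE 6 and later) replaces the Simple framework for
// reading the configuration file shared by the importer and the web application.
import java.io.File;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "VloConfig")
public class VloConfig { // hypothetical configuration class

    private String solrUrl;
    private int maxFileSize;

    @XmlElement(name = "solrUrl")
    public String getSolrUrl() { return solrUrl; }
    public void setSolrUrl(String solrUrl) { this.solrUrl = solrUrl; }

    @XmlElement(name = "maxFileSize")
    public int getMaxFileSize() { return maxFileSize; }
    public void setMaxFileSize(int maxFileSize) { this.maxFileSize = maxFileSize; }

    // Read the shared configuration file
    public static VloConfig read(File configFile) throws Exception {
        JAXBContext context = JAXBContext.newInstance(VloConfig.class);
        return (VloConfig) context.createUnmarshaller().unmarshal(configFile);
    }
}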

4.7 Up-to-datedness

Libraries

A number of the libraries on which the VLO depends have newer versions available (mostly minor, but in some cases, such as Wicket and SOLR, there are new major releases). Upgrading increases the need for testing and in some cases implies some quite fundamental refactoring, but in general it is advisable to stay up-to-date with the latest versions of libraries, as they generally have improved performance, stability and security.

• Wicket will be upgraded from 1.4 to 6 in the upcoming 2.18 release of the VLO

• VTD-XML can be upgraded from 2.10 to 2.11, which has some improvements and fixes 24

• SOLR can be upgraded from 3.6.0 to 4.6.0, which has a large number of changes 25, some of which may need to be accounted for in the importer and/or client

23 https://sjsxp.java.net/

24 http://ximpleware.wordpress.com/2012/10/15/what-is-new-in-2-11/

25 https://cwiki.apache.org/confluence/display/solr/Major+Changes+from+Solr+3+to+Solr+4

