Alan Orth committed 40e284dac0

DSpace's stats-util script splits the Solr statistics core into yearly shards. We need to use Solr's `shards` query parameter in order to get the statistics for previous years. This commit adds a helper function that enumerates the active Solr cores, finds yearly shards matching the statistics-YYYY pattern, and adds them to the query.
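A minimal sketch of that approach, assuming Solr's CoreAdmin STATUS endpoint is reachable under the same base URL; the function name and use of the requests library are illustrative, not the project's actual code:

# Illustrative sketch only -- the helper name and requests usage are
# assumptions, not the project's actual implementation.
import re
import requests

def get_statistics_shards(solr_server):
    """Return a Solr 'shards' parameter covering the statistics core
    and any yearly statistics-YYYY shards created by stats-util."""
    # CoreAdmin STATUS lists all active cores
    res = requests.get(solr_server + '/admin/cores',
                       params={'action': 'STATUS', 'wt': 'json'})
    cores = res.json()['status'].keys()

    # Match the main core and yearly shards like "statistics-2018"
    shards = [core for core in cores if re.match(r'^statistics(-\d{4})?$', core)]

    # The shards parameter expects host:port/path entries without the scheme
    host = solr_server.split('://', 1)[-1]
    return ','.join('{0}/{1}'.format(host, core) for core in shards)

Passing the returned value as the shards parameter on statistics queries makes Solr aggregate results across all of the yearly cores.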
DSpace Statistics API
DSpace stores item view and download events in a Solr "statistics" core. This information is available for use in the various DSpace user interfaces, but is not exposed externally via any APIs. The DSpace 4+ REST API, for example, only exposes information about communities, collections, item metadata, and bitstreams.
This project contains an indexer and a Falcon-based web application to make the statistics available via a simple REST API. You can read more about the Solr queries used to gather the item view and download statistics on the DSpace wiki.
Requirements
- Python 3.5+
- PostgreSQL version 9.5+ (due to UPSERT support; see the sketch after this list)
- DSpace with Solr usage statistics enabled (tested with 5.x)
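The PostgreSQL 9.5+ requirement exists because the indexer relies on INSERT ... ON CONFLICT (upsert), which older versions lack. A hedged illustration of the pattern using psycopg2; the items table and its columns are assumptions, not necessarily the project's actual schema:

# Hypothetical upsert -- the table and column names are illustrative.
import psycopg2

connection = psycopg2.connect('dbname=dspacestatistics user=dspacestatistics '
                              'password=dspacestatistics host=localhost')
with connection.cursor() as cursor:
    # ON CONFLICT (PostgreSQL 9.5+) updates the existing row instead of
    # raising a duplicate key error when the item id is already present.
    cursor.execute('''INSERT INTO items (id, views, downloads)
                      VALUES (%s, %s, %s)
                      ON CONFLICT (id) DO UPDATE
                      SET views = excluded.views,
                          downloads = excluded.downloads''',
                   (1234, 56, 7))
connection.commit()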
Installation
Create a Python virtual environment and install the dependencies:
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install -r requirements.txt
Running
Set up the environment variables for Solr and PostgreSQL:
$ export SOLR_SERVER=http://localhost:8080/solr
$ export DATABASE_NAME=dspacestatistics
$ export DATABASE_USER=dspacestatistics
$ export DATABASE_PASS=dspacestatistics
$ export DATABASE_HOST=localhost
Index the Solr statistics core to populate the PostgreSQL database:
$ python -m dspace_statistics_api.indexer
Run the REST API:
$ gunicorn dspace_statistics_api.app
Test to see if there are any statistics:
$ curl 'http://localhost:8000/items?limit=1'
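The exact response shape may vary between versions; an illustrative response for one item (all ids and counts here are made up) could look like:

{
  "currentPage": 0,
  "totalPages": 1297,
  "limit": 1,
  "statistics": [
    {
      "id": 1234,
      "views": 56,
      "downloads": 7
    }
  ]
}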
Testing
Install development packages using pip:
$ pip install -r requirements-dev.txt
Run tests:
$ pytest
Deployment
There are example systemd service and timer units in the contrib directory. The API service listens on localhost by default, so you will need to expose it publicly using a web server like nginx.
An example nginx configuration is:
server {
#...
location ~ /rest/statistics/?(.*) {
access_log /var/log/nginx/statistics.log;
proxy_pass http://statistics_api/$1$is_args$args;
}
}
upstream statistics_api {
server 127.0.0.1:5000;
}
This would expose the API at /rest/statistics.
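Note that the upstream above points at port 5000, while gunicorn binds to 127.0.0.1:8000 by default, so start the service with a matching --bind (the port here is only an example):

$ gunicorn --bind 127.0.0.1:5000 dspace_statistics_api.app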
Using the API
The API exposes the following endpoints:
- GET / — return a basic API documentation page.
- GET /items — return views and downloads for all items that Solr knows about¹. Accepts limit and page query parameters for pagination of results (limit must be an integer between 1 and 100, and page must be an integer greater than or equal to 0).
- GET /item/id — return views and downloads for a single item (id must be a positive integer). Returns HTTP 404 if an item id is not found.
The item id is the item's internal id. You can get these ids from the standard DSpace REST API.
¹ We are querying the Solr statistics core, which technically only knows about items that have either views or downloads. If an item is not present here you can assume it has zero views and zero downloads, but not necessarily that it does not exist in the repository.
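For example, to fetch the statistics for a single item (the id and the counts in the response are illustrative):

$ curl 'http://localhost:8000/item/1234'
{
  "id": 1234,
  "views": 56,
  "downloads": 7
}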
Todo
- Better logging
- Version API
- Use JSON in PostgreSQL
- Make community and collection stats available
- Fix results for sharded statistics cores (#10)
- Switch to Python 3.6+ f-string syntax
License
This work is licensed under the GPLv3.
The license allows you to use and modify the work for personal and commercial purposes, but if you distribute the work you must provide users with a means to access the source code for the version you are distributing. Read more about the GPLv3 at TL;DR Legal.