mirror of https://github.com/ilri/dspace-statistics-api.git synced 2025-05-10 15:16:02 +02:00

Compare commits


45 Commits

Author SHA1 Message Date
5dd50ff998 CHANGELOG.md: Version 1.2.0
This version only works with DSpace 6+ where the internal item
identifiers are UUIDs instead of integers. Version 1.1.1 was the
last version to work with DSpace 4 and 5.
2020-03-02 11:34:58 +02:00
6704e7375f CHANGELOG.md: Add note about Python dependencies 2020-03-02 11:34:13 +02:00
37630d8dac CHANGELOG.md: Add note about DSpace 6+ UUIDs 2020-03-02 11:27:10 +02:00
0ef071a91d dspace_statistics_api: Use f-strings instead of format()
We had previously been avoiding the f-strings because we needed to
run on Python 3.5 and they were only available in Python 3.6+, but
now the black formatter requires Python 3.6 and all our systems are
running Python 3.6+ anyways.
2020-03-02 11:24:29 +02:00
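For illustration, a minimal sketch of the kind of change this commit describes, using the Solr URL that appears later in the indexer.py diff (the standalone SOLR_SERVER assignment exists only for this example; the project reads it from the environment):

    SOLR_SERVER = "http://localhost:8080/solr"

    # Old style: str.format(), which still works on Python 3.5
    solr_url = "{}/statistics/select".format(SOLR_SERVER)

    # New style: an f-string, available from Python 3.6 onwards
    solr_url = f"{SOLR_SERVER}/statistics/select"

    print(solr_url)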
9e7dd28156 dspace_statistics_api/app.py: Use parameterized SQL queries
This is a better way to run SQL queries because psycopg2 takes care
of the quoting for us.
2020-03-02 11:16:05 +02:00
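A minimal sketch of the difference, assuming a psycopg2 cursor named "cursor" and an item_id value (both are stand-ins here); the two query strings are the ones visible in the app.py diff further down:

    # Before: the value is interpolated into the SQL string by hand
    cursor.execute("SELECT views, downloads FROM items WHERE id={}".format(item_id))

    # After: a parameterized query; psycopg2 quotes and escapes the value itself
    cursor.execute("SELECT views, downloads FROM items WHERE id=%s", [str(item_id)])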
60e6ea57b1 tests/test_api.py: Use UUID
DSpace 6+ uses a UUID for item identifiers instead of an integer so
we need to adapt our tests accordingly. The Python UUID object must
be cast to a string to use it elsewhere in the code.
2020-03-02 11:10:41 +02:00
5955868b9a dspace_statistics_api/app.py: Use UUID
DSpace 6+ uses a UUID for item identifiers instead of an integer so
we need to adapt our PostgreSQL queries to use those. Note that we
can no longer sort results in the "all items" endpoint by ID. Also,
we need to use parameterized psycopg2 queries instead of strings to
support queries with UUIDs properly. To use the Python UUID objects
elsewhere in the code we need to make sure that we cast them to str.
2020-03-02 11:06:48 +02:00
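A short, self-contained sketch of the UUID handling described here, based on the register_uuid() call and the str() casts visible in the app.py diff below (the example UUID is the one used in the test fixtures; the counts are placeholders):

    import uuid

    import psycopg2.extras

    # Adapt Python's uuid.UUID type to PostgreSQL's uuid type so UUID values
    # can be passed directly as query parameters.
    # See: https://www.psycopg.org/docs/extras.html
    psycopg2.extras.register_uuid()

    # Falcon's "{item_id:uuid}" route converter hands the resource a uuid.UUID;
    # cast it to str before using it in SQL parameters or the JSON response.
    item_id = uuid.UUID("c3910974-c3a5-4053-9dce-104aa7bb1621")
    statistics = {"id": str(item_id), "views": 0, "downloads": 0}
    print(statistics)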
250fd8164f dspace_statistics_api/indexer.py: Use UUID
DSpace 6+ uses a UUID for item identifiers instead of an integer so
we need to update the PostgreSQL schema accordingly. Solr still
refers to them as "id" in its schema so we don't need to change
anything there.
2020-03-01 21:22:10 +02:00
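The corresponding CREATE TABLE change appears near the end of the indexer.py diff below; as an isolated sketch, assuming a psycopg2 cursor named "cursor":

    # The old schema used "id INT PRIMARY KEY"; DSpace 6+ identifiers are UUIDs
    cursor.execute(
        """CREATE TABLE IF NOT EXISTS items
        (id UUID PRIMARY KEY, views INT DEFAULT 0, downloads INT DEFAULT 0)"""
    )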
82be1a4d00 Update requirements
Generated from pipenv with:

  $ pipenv lock -r > requirements.txt
  $ pipenv lock -r -d > requirements-dev.txt
2020-03-01 21:21:13 +02:00
0615064e3d Add pytest-clarity to pipenv
Makes pytest output easier to understand.
2020-03-01 21:19:28 +02:00
76be1b749a Run pipenv update 2020-03-01 21:13:32 +02:00
92146fe426 tests/test_api.py: Format with black 2019-12-14 12:39:58 +02:00
440b2f2dfa Pipfile.lock: Run pipenv update 2019-12-14 12:38:11 +02:00
67bc30ead0 Pipfile: Specify exact version of black
Black only releases pre-release versions, which causes issues with
pipenv. Instead of always running pipenv with "--pre" and
potentially letting in some other pre-release versions for other
dependencies, I would rather specify the latest black version
explicitly.

See: https://github.com/psf/black/issues/517
See: https://github.com/microsoft/vscode-python/issues/5171
2019-12-14 12:37:10 +02:00
142959acdb CHANGELOG.md: Unreleased changes 2019-11-27 12:56:39 +02:00
322f5a8db8 .travis.yml: Remove Python 3.5
black does not work with Python 3.5. It's not such a big deal, as
this is only required for running tests, not for running the app.
2019-11-27 12:55:34 +02:00
90dcaa6ec6 CHANGELOG.md: Fix typo 2019-11-27 12:47:07 +02:00
9aca827d69 Update requirements-dev.txt
Generated with pipenv:

    $ pipenv lock -r -d > requirements-dev.txt
2019-11-27 12:36:05 +02:00
1b394ec50e CHANGELOG.md: Move unreleased changes to 1.1.1 2019-11-27 12:32:54 +02:00
3e9753b600 CHANGELOG.md: Add unreleased changes 2019-11-27 12:32:16 +02:00
cb3c3d37fa Sort imports with isort 2019-11-27 12:31:04 +02:00
4ff1fd4a22 Format code with black 2019-11-27 12:30:06 +02:00
d2fe420a9a Add configuration for isort and black
This does linting and automatic code formatting according to PEP8.

See: https://sourcery.ai/blog/python-best-practices/
2019-11-27 12:26:55 +02:00
3197b79578 CHANGELOG.md: Update unreleased changes 2019-11-27 12:14:49 +02:00
eeb8e6bba1 dspace_statistics_api/indexer.py: Fix minor issues raised by flake8 2019-11-27 12:12:05 +02:00
3540ce328b Update requirements
Generated from pipenv with:

  $ pipenv lock -r > requirements.txt
  $ pipenv lock -r -d > requirements-dev.txt
2019-11-27 12:08:32 +02:00
520e04f9be Pipfile.lock: run pipenv update
Brings gunicorn 20.0.4, pytest 5.3.1, and others. I hadn't noticed
that gunicorn was bumped from 19.x.x to 20.x.x last week.

See: https://docs.gunicorn.org/en/stable/news.html#id6
2019-11-27 12:06:09 +02:00
8a46a64cfc CHANGELOG.md: Use Python 3.8 for pipenv 2019-11-27 10:53:38 +02:00
b8442f8cce .travis.yml: Remove pipenv-specific environment variables 2019-11-15 00:48:57 +02:00
95f7871cc1 .travis.yml: Use vanilla pip 2019-11-15 00:46:58 +02:00
3bc07027e5 .travis.yml: Test with Python 3.8 2019-11-15 00:46:04 +02:00
afcc445855 Update requirements
Generated from pipenv with:

  $ pipenv lock -r > requirements.txt
  $ pipenv lock -r -d > requirements-dev.txt
2019-11-15 00:41:12 +02:00
494548c691 Use Python 3.8.0 for pipenv
Python 3.8.0 was released several months ago and has made it into
Arch Linux's core repositories so it's time to start moving.
2019-11-15 00:38:45 +02:00
feb60b6adf CHANGELOG.md: Update unreleased changes 2019-11-15 00:06:49 +02:00
1541ae3e3b .travis.yml: Use Ubuntu 18.04 "Bionic" 2019-11-14 23:57:46 +02:00
1aedc0ca29 CHANGELOG.md: Add note about Python dependencies 2019-08-29 00:31:31 +03:00
a648183f35 Update requirements
Generated from pipenv with:

  $ pipenv lock -r > requirements.txt
  $ pipenv lock -r -d > requirements-dev.txt
2019-08-29 00:31:06 +03:00
b8f379e7fa Pipfile.lock: Run pipenv update
This brings in, among others, psycopg2 2.8.3, requests 2.22.0, and
pytest 5.1.1.
2019-08-29 00:30:06 +03:00
78f9949ecb CHANGELOG.md: Release version 1.1.0 2019-05-05 23:38:04 +03:00
af80c4b447 CHANGELOG.md: Add falcon 2.0.0 to unreleased changes 2019-05-03 16:33:00 +03:00
edd9e90f59 Update requirements
Generated using pipenv:

  $ pipenv lock -r > requirements.txt
  $ pipenv lock -r -d > requirements-dev.txt
2019-05-03 16:32:17 +03:00
1806d50a51 Pipfile: Use falcon 2.0.0
See: https://github.com/falconry/falcon/releases/tag/2.0.0
2019-05-03 16:31:06 +03:00
a459e66fd9 Use falcon 2.0.0rc2 2019-04-18 10:04:43 +03:00
5a3b392a1d dspace_statistics_api/app.py: Fix Falcon 2.0 syntax
See: dspace_statistics_api/app.py
2019-04-18 09:57:18 +03:00
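One Falcon 2.0 change visible in the app.py diff below is that get_param_as_int() renamed its range keywords; a minimal sketch, assuming a Falcon request object named "req":

    # Falcon 1.x accepted min=/max= keyword arguments:
    # limit = req.get_param_as_int("limit", min=0, max=100) or 100

    # Falcon 2.x uses min_value=/max_value= instead:
    limit = req.get_param_as_int("limit", min_value=0, max_value=100) or 100
    page = req.get_param_as_int("page", min_value=0) or 0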
9dcda114c6 Bump Falcon version to 2.0.0b1
See: https://github.com/falconry/falcon/releases/tag/2.0.0b1
2019-04-18 09:57:18 +03:00
12 changed files with 479 additions and 291 deletions

.travis.yml

@ -1,9 +1,9 @@
dist: xenial dist: bionic
language: python language: python
python: python:
- "3.5"
- "3.6" - "3.6"
- "3.7" - "3.7"
- "3.8"
addons: addons:
postgresql: "9.6" postgresql: "9.6"
before_script: before_script:
@ -13,11 +13,8 @@ before_script:
- createdb -U postgres -O dspacestatistics --encoding=UNICODE dspacestatistics - createdb -U postgres -O dspacestatistics --encoding=UNICODE dspacestatistics
- psql -U postgres -d dspacestatistics < tests/dspacestatistics.sql - psql -U postgres -d dspacestatistics < tests/dspacestatistics.sql
install: install:
- "pip install pipenv --upgrade-strategy=only-if-needed" - "pip install -r requirements.txt"
- "pipenv install --dev" - "pip install -r requirements-dev.txt"
script: pytest script: pytest
env:
- PIPENV_NOSPIN=True
- PIPENV_HIDE_EMOJIS=True
# vim: ts=2 sw=2 et # vim: ts=2 sw=2 et

CHANGELOG.md

@ -4,6 +4,31 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.2.0] - 2020-03-02
### Changed
- Remove Python 3.5 from TravisCI because black requires Python >= 3.6
- Adapt API for DSpace 6+ UUIDs
- This requires dropping the statistics database and re-indexing
- Run pipenv update, bringing requests 2.23.0 and pytest 5.3.5
## [1.1.1] - 2019-11-27
### Added
- Configuration for automatic sorting of imports with isort
- Configuration for automatic code formatting with black
### Updated
- Run pipenv update, bringing psycopg2 2.8.4, requests 2.22.0, pytest 5.3.1,
and gunicorn 20.0.4
### Changed
- Use Ubuntu 18.04 "Bionic" for TravisCI builds
- Use Python 3.8.0 for pipenv
- Minor syntax issues highlighted by flake8
## [1.1.0] - 2019-05-05
### Updated
- Falcon 2.0.0 (@alanorth)
## [1.0.0] - 2019-04-15 ## [1.0.0] - 2019-04-15
### Added ### Added
- Build configuration for build.sr.ht - Build configuration for build.sr.ht

Pipfile

@ -5,7 +5,7 @@ name = "pypi"
[packages] [packages]
gunicorn = "*" gunicorn = "*"
falcon = "*" falcon = "==2.0.0"
"psycopg2-binary" = "*" "psycopg2-binary" = "*"
requests = "*" requests = "*"
@ -13,6 +13,9 @@ requests = "*"
ipython = "*" ipython = "*"
"flake8" = "*" "flake8" = "*"
pytest = "*" pytest = "*"
isort = "*"
black = "==19.10b0"
pytest-clarity = "==0.3.0a0"
[requires] [requires]
python_version = "3.7" python_version = "3.8"

Pipfile.lock generated

@ -1,11 +1,11 @@
{ {
"_meta": { "_meta": {
"hash": { "hash": {
"sha256": "d18369b55f85594f6acfa49779909334a794a4002656508fc85acea43df2521d" "sha256": "be968d3927117f9ac14b9a6f60d6147b2d57ce55f694f34ed6e53abcd2197823"
}, },
"pipfile-spec": 6, "pipfile-spec": 6,
"requires": { "requires": {
"python_version": "3.7" "python_version": "3.8"
}, },
"sources": [ "sources": [
{ {
@ -18,10 +18,10 @@
"default": { "default": {
"certifi": { "certifi": {
"hashes": [ "hashes": [
"sha256:59b7658e26ca9c7339e00f8f4636cdfe59d34fa37b9b04f6f9e9926b3cece1a5", "sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3",
"sha256:b26104d6835d1f5e49452a26eb2ff87fe7090b89dfcaee5ea2212697e1e1d7ae" "sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f"
], ],
"version": "==2019.3.9" "version": "==2019.11.28"
}, },
"chardet": { "chardet": {
"hashes": [ "hashes": [
@ -32,105 +32,107 @@
}, },
"falcon": { "falcon": {
"hashes": [ "hashes": [
"sha256:0a66b33458fab9c1e400a9be1a68056abda178eb02a8cb4b8f795e9df20b053b", "sha256:18157af2a4fc3feedf2b5dcc6196f448639acf01c68bc33d4d5a04c3ef87f494",
"sha256:3981f609c0358a9fcdb25b0e7fab3d9e23019356fb429c635ce4133135ae1bc4" "sha256:24adcd2b29a8ffa9d552dc79638cd21736a3fb04eda7d102c6cebafdaadb88ad",
"sha256:54f2cb4b687035b2a03206dbfc538055cc48b59a953187b0458aa1b574d47b53",
"sha256:59d1e8c993b9a37ea06df9d72cf907a46cc8063b30717cdac2f34d1658b6f936",
"sha256:733033ec80c896e30a43ab3e776856096836787197a44eb21022320a61311983",
"sha256:74cf1d18207381c665b9e6292d65100ce146d958707793174b03869dc6e614f4",
"sha256:95bf6ce986c1119aef12c9b348f4dee9c6dcc58391bdd0bc2b0bf353c2b15986",
"sha256:9712975adcf8c6e12876239085ad757b8fdeba223d46d23daef82b47658f83a9",
"sha256:a5ebb22a04c9cc65081938ee7651b4e3b4d2a28522ea8ec04c7bdd2b3e9e8cd8",
"sha256:aa184895d1ad4573fbfaaf803563d02f019ebdf4790e41cc568a330607eae439",
"sha256:e3782b7b92fefd46a6ad1fd8fe63fe6c6f1b7740a95ca56957f48d1aee34b357",
"sha256:e9efa0791b5d9f9dd9689015ea6bce0a27fcd5ecbcd30e6d940bffa4f7f03389",
"sha256:eea593cf466b9c126ce667f6d30503624ef24459f118c75594a69353b6c3d5fc",
"sha256:f93351459f110b4c1ee28556aef9a791832df6f910bea7b3f616109d534df06b"
], ],
"index": "pypi", "index": "pypi",
"version": "==1.4.1" "version": "==2.0.0"
}, },
"gunicorn": { "gunicorn": {
"hashes": [ "hashes": [
"sha256:aa8e0b40b4157b36a5df5e599f45c9c76d6af43845ba3b3b0efe2c70473c2471", "sha256:1904bb2b8a43658807108d59c3f3d56c2b6121a701161de0ddf9ad140073c626",
"sha256:fa2662097c66f920f53f70621c6c58ca4a3c4d3434205e608e121b5b3b71f4f3" "sha256:cd4a810dd51bf497552cf3f863b575dabd73d6ad6a91075b65936b151cbf4f9c"
], ],
"index": "pypi", "index": "pypi",
"version": "==19.9.0" "version": "==20.0.4"
}, },
"idna": { "idna": {
"hashes": [ "hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407", "sha256:7588d1c14ae4c77d74036e8c22ff447b26d0fde8f007354fd48a7814db15b7cb",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c" "sha256:a068a21ceac8a4d63dbfd964670474107f541babbd2250d61922f029858365fa"
], ],
"version": "==2.8" "version": "==2.9"
}, },
"psycopg2-binary": { "psycopg2-binary": {
"hashes": [ "hashes": [
"sha256:007ca0df127b1862fc010125bc4100b7a630efc6841047bd11afceadb4754611", "sha256:040234f8a4a8dfd692662a8308d78f63f31a97e1c42d2480e5e6810c48966a29",
"sha256:03c49e02adf0b4d68f422fdbd98f7a7c547beb27e99a75ed02298f85cb48406a", "sha256:086f7e89ec85a6704db51f68f0dcae432eff9300809723a6e8782c41c2f48e03",
"sha256:0a1232cdd314e08848825edda06600455ad2a7adaa463ebfb12ece2d09f3370e", "sha256:18ca813fdb17bc1db73fe61b196b05dd1ca2165b884dd5ec5568877cabf9b039",
"sha256:131c80d0958c89273d9720b9adf9df1d7600bb3120e16019a7389ab15b079af5", "sha256:19dc39616850342a2a6db70559af55b22955f86667b5f652f40c0e99253d9881",
"sha256:2de34cc3b775724623f86617d2601308083176a495f5b2efc2bbb0da154f483a", "sha256:2166e770cb98f02ed5ee2b0b569d40db26788e0bf2ec3ae1a0d864ea6f1d8309",
"sha256:2eddc31500f73544a2a54123d4c4b249c3c711d31e64deddb0890982ea37397a", "sha256:3a2522b1d9178575acee4adf8fd9f979f9c0449b00b4164bb63c3475ea6528ed",
"sha256:484f6c62bdc166ee0e5be3aa831120423bf399786d1f3b0304526c86180fbc0b", "sha256:3aa773580f85a28ffdf6f862e59cb5a3cc7ef6885121f2de3fca8d6ada4dbf3b",
"sha256:4c2d9369ed40b4a44a8ccd6bc3a7db6272b8314812d2d1091f95c4c836d92e06", "sha256:3b5deaa3ee7180585a296af33e14c9b18c218d148e735c7accf78130765a47e3",
"sha256:70f570b5fa44413b9f30dbc053d17ef3ce6a4100147a10822f8662e58d473656", "sha256:407af6d7e46593415f216c7f56ba087a9a42bd6dc2ecb86028760aa45b802bd7",
"sha256:7a2b5b095f3bd733aab101c89c0e1a3f0dfb4ebdc26f6374805c086ffe29d5b2", "sha256:4c3c09fb674401f630626310bcaf6cd6285daf0d5e4c26d6e55ca26a2734e39b",
"sha256:804914a669186e2843c1f7fbe12b55aad1b36d40a28274abe6027deffad9433d", "sha256:4c6717962247445b4f9e21c962ea61d2e884fc17df5ddf5e35863b016f8a1f03",
"sha256:8520c03172da18345d012949a53617a963e0191ccb3c666f23276d5326af27b5", "sha256:50446fae5681fc99f87e505d4e77c9407e683ab60c555ec302f9ac9bffa61103",
"sha256:90da901fc33ea393fc644607e4a3916b509387e9339ec6ebc7bfded45b7a0ae9", "sha256:5057669b6a66aa9ca118a2a860159f0ee3acf837eda937bdd2a64f3431361a2d",
"sha256:a582416ad123291a82c300d1d872bdc4136d69ad0b41d57dc5ca3df7ef8e3088", "sha256:5dd90c5438b4f935c9d01fcbad3620253da89d19c1f5fca9158646407ed7df35",
"sha256:ac8c5e20309f4989c296d62cac20ee456b69c41fd1bc03829e27de23b6fa9dd0", "sha256:659c815b5b8e2a55193ede2795c1e2349b8011497310bb936da7d4745652823b",
"sha256:b2cf82f55a619879f8557fdaae5cec7a294fac815e0087c4f67026fdf5259844", "sha256:69b13fdf12878b10dc6003acc8d0abf3ad93e79813fd5f3812497c1c9fb9be49",
"sha256:b59d6f8cfca2983d8fdbe457bf95d2192f7b7efdb2b483bf5fa4e8981b04e8b2", "sha256:7a1cb80e35e1ccea3e11a48afe65d38744a0e0bde88795cc56a4d05b6e4f9d70",
"sha256:be08168197021d669b9964bd87628fa88f910b1be31e7010901070f2540c05fd", "sha256:7e6e3c52e6732c219c07bd97fff6c088f8df4dae3b79752ee3a817e6f32e177e",
"sha256:be0f952f1c365061041bad16e27e224e29615d4eb1fb5b7e7760a1d3d12b90b6", "sha256:7f42a8490c4fe854325504ce7a6e4796b207960dabb2cbafe3c3959cb00d1d7e",
"sha256:c1c9a33e46d7c12b9c96cf2d4349d783e3127163fd96254dcd44663cf0a1d438", "sha256:84156313f258eafff716b2961644a4483a9be44a5d43551d554844d15d4d224e",
"sha256:d18c89957ac57dd2a2724ecfe9a759912d776f96ecabba23acb9ecbf5c731035", "sha256:8578d6b8192e4c805e85f187bc530d0f52ba86c39172e61cd51f68fddd648103",
"sha256:d7e7b0ff21f39433c50397e60bf0995d078802c591ca3b8d99857ea18a7496ee", "sha256:890167d5091279a27e2505ff0e1fb273f8c48c41d35c5b92adbf4af80e6b2ed6",
"sha256:da0929b2bf0d1f365345e5eb940d8713c1d516312e010135b14402e2a3d2404d", "sha256:98e10634792ac0e9e7a92a76b4991b44c2325d3e7798270a808407355e7bb0a1",
"sha256:de24a4962e361c512d3e528ded6c7480eab24c655b8ca1f0b761d3b3650d2f07", "sha256:9aadff9032e967865f9778485571e93908d27dab21d0fdfdec0ca779bb6f8ad9",
"sha256:e45f93ff3f7dae2202248cf413a87aeb330821bf76998b3cf374eda2fc893dd7", "sha256:9f24f383a298a0c0f9b3113b982e21751a8ecde6615494a3f1470eb4a9d70e9e",
"sha256:f046aeae1f7a845041b8661bb7a52449202b6c5d3fb59eb4724e7ca088811904", "sha256:a73021b44813b5c84eda4a3af5826dd72356a900bac9bd9dd1f0f81ee1c22c2f",
"sha256:f1dc2b7b2748084b890f5d05b65a47cd03188824890e9a60818721fd492249fb", "sha256:afd96845e12638d2c44d213d4810a08f4dc4a563f9a98204b7428e567014b1cd",
"sha256:fcbe7cf3a786572b73d2cd5f34ed452a5f5fac47c9c9d1e0642c457a148f9f88" "sha256:b73ddf033d8cd4cc9dfed6324b1ad2a89ba52c410ef6877998422fcb9c23e3a8",
"sha256:b8f490f5fad1767a1331df1259763b3bad7d7af12a75b950c2843ba319b2415f",
"sha256:dbc5cd56fff1a6152ca59445178652756f4e509f672e49ccdf3d79c1043113a4",
"sha256:eac8a3499754790187bb00574ab980df13e754777d346f85e0ff6df929bcd964",
"sha256:eaed1c65f461a959284649e37b5051224f4db6ebdc84e40b5e65f2986f101a08"
], ],
"index": "pypi", "index": "pypi",
"version": "==2.8.2" "version": "==2.8.4"
},
"python-mimeparse": {
"hashes": [
"sha256:76e4b03d700a641fd7761d3cd4fdbbdcd787eade1ebfac43f877016328334f78",
"sha256:a295f03ff20341491bfe4717a39cd0a8cc9afad619ba44b77e86b0ab8a2b8282"
],
"version": "==1.6.0"
}, },
"requests": { "requests": {
"hashes": [ "hashes": [
"sha256:502a824f31acdacb3a35b6690b5fbf0bc41d63a24a45c4004352b0242707598e", "sha256:43999036bfa82904b6af1d99e4882b560e5e2c68e5c4b0aa03b655f3d7d73fee",
"sha256:7bf2a778576d825600030a110f3c0e3e8edc51dfaafe1c146e39a2027784957b" "sha256:b3f43d496c6daba4493e7c431722aeb7dbc6288f52a6e04e7b6023b0247817e6"
], ],
"index": "pypi", "index": "pypi",
"version": "==2.21.0" "version": "==2.23.0"
},
"six": {
"hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
],
"version": "==1.12.0"
}, },
"urllib3": { "urllib3": {
"hashes": [ "hashes": [
"sha256:61bf29cada3fc2fbefad4fdf059ea4bd1b4a86d2b6d15e1c7c0b582b9752fe39", "sha256:2f3db8b19923a873b3e5256dc9c2dedfa883e33d87c690d9c7913e1f40673cdc",
"sha256:de9529817c93f27c8ccbfead6985011db27bd0ddfcdb2d86f3f663385c6a9c22" "sha256:87716c2d2a7121198ebcb7ce7cccf6ce5e9ba539041cfbaeecfb641dc0bf6acc"
], ],
"version": "==1.24.1" "version": "==1.25.8"
} }
}, },
"develop": { "develop": {
"atomicwrites": { "appdirs": {
"hashes": [ "hashes": [
"sha256:03472c30eb2c5d1ba9227e4c2ca66ab8287fbfbbda3888aa93dc2e28fc6811b4", "sha256:9e5896d1372858f8dd3344faf4e5014d21849c756c8d5701f78f8a103b372d92",
"sha256:75a9445bac02d8d058d5e1fe689654ba5a6556a1dfd8ce6ec55a0ed79866cfa6" "sha256:d8b24664561d0d34ddfaec54636d502d7cea6e29c3eaf68f3df6180863e2166e"
], ],
"version": "==1.3.0" "version": "==1.4.3"
}, },
"attrs": { "attrs": {
"hashes": [ "hashes": [
"sha256:69c0dbf2ed392de1cb5ec704444b08a5ef81680a61cb899dc08127123af36a79", "sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f0b870f674851ecbfbbbd364d6b5cbdff9dcedbc7f3f5e18a6891057f21fe399" "sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
], ],
"version": "==19.1.0" "version": "==19.3.0"
}, },
"backcall": { "backcall": {
"hashes": [ "hashes": [
@ -139,12 +141,27 @@
], ],
"version": "==0.1.0" "version": "==0.1.0"
}, },
"black": {
"hashes": [
"sha256:1b30e59be925fafc1ee4565e5e08abef6b03fe455102883820fe5ee2e4734e0b",
"sha256:c2edb73a08e9e0e6f65a0e6af18b059b8b1cdd5bef997d7a0b181df93dc81539"
],
"index": "pypi",
"version": "==19.10b0"
},
"click": {
"hashes": [
"sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13",
"sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7"
],
"version": "==7.0"
},
"decorator": { "decorator": {
"hashes": [ "hashes": [
"sha256:86156361c50488b84a3f148056ea716ca587df2f0de1d34750d35c21312725de", "sha256:41fa54c2a0cc4ba648be4fd43cff00aedf5b9465c9bf18d64325bc225f08f760",
"sha256:f069f3a01830ca754ba5258fde2278454a0b5b79e0d7f5c13b3b97e57d4acff6" "sha256:e3a62f0520172440ca0dcc823749319382e377f37f140a0b99ef45fecb84bfe7"
], ],
"version": "==4.4.0" "version": "==4.4.2"
}, },
"entrypoints": { "entrypoints": {
"hashes": [ "hashes": [
@ -155,19 +172,19 @@
}, },
"flake8": { "flake8": {
"hashes": [ "hashes": [
"sha256:859996073f341f2670741b51ec1e67a01da142831aa1fdc6242dbf88dffbe661", "sha256:45681a117ecc81e870cbf1262835ae4af5e7a8b08e40b944a8a6e6b895914cfb",
"sha256:a796a115208f5c03b18f332f7c11729812c8c3ded6c46319c59b53efd3819da8" "sha256:49356e766643ad15072a789a20915d3c91dc89fd313ccd71802303fd67e4deca"
], ],
"index": "pypi", "index": "pypi",
"version": "==3.7.7" "version": "==3.7.9"
}, },
"ipython": { "ipython": {
"hashes": [ "hashes": [
"sha256:b038baa489c38f6d853a3cfc4c635b0cda66f2864d136fe8f40c1a6e334e2a6b", "sha256:ca478e52ae1f88da0102360e57e528b92f3ae4316aabac80a2cd7f7ab2efb48a",
"sha256:f5102c1cd67e399ec8ea66bcebe6e3968ea25a8977e53f012963e5affeb1fe38" "sha256:eb8d075de37f678424527b5ef6ea23f7b80240ca031c2dd6de5879d687a65333"
], ],
"index": "pypi", "index": "pypi",
"version": "==7.4.0" "version": "==7.13.0"
}, },
"ipython-genutils": { "ipython-genutils": {
"hashes": [ "hashes": [
@ -176,12 +193,20 @@
], ],
"version": "==0.2.0" "version": "==0.2.0"
}, },
"isort": {
"hashes": [
"sha256:54da7e92468955c4fceacd0c86bd0ec997b0e1ee80d97f67c35a78b719dccab1",
"sha256:6e811fcb295968434526407adb8796944f1988c5b65e8139058f2014cbe100fd"
],
"index": "pypi",
"version": "==4.3.21"
},
"jedi": { "jedi": {
"hashes": [ "hashes": [
"sha256:2bb0603e3506f708e792c7f4ad8fc2a7a9d9c2d292a358fbbd58da531695595b", "sha256:b4f4052551025c6b0b0b193b29a6ff7bdb74c52450631206c262aef9f7159ad2",
"sha256:2c6bcd9545c7d6440951b12b44d373479bf18123a401a52025cf98563fbd826c" "sha256:d5c871cb9360b414f981e7072c52c33258d598305280fef91c6cae34739d65d5"
], ],
"version": "==0.13.3" "version": "==0.16.0"
}, },
"mccabe": { "mccabe": {
"hashes": [ "hashes": [
@ -192,26 +217,39 @@
}, },
"more-itertools": { "more-itertools": {
"hashes": [ "hashes": [
"sha256:2112d2ca570bb7c3e53ea1a35cd5df42bb0fd10c45f0fb97178679c3c03d64c7", "sha256:5dd8bcf33e5f9513ffa06d5ad33d78f31e1931ac9a18f33d37e77a180d393a7c",
"sha256:c3e4748ba1aad8dba30a4886b0b1a2004f9a863837b8654e7059eebf727afa5a" "sha256:b1ddb932186d8a6ac451e1d95844b382f55e12686d51ca0c68b6f61f2ab7a507"
], ],
"markers": "python_version > '2.7'", "version": "==8.2.0"
"version": "==7.0.0" },
"packaging": {
"hashes": [
"sha256:170748228214b70b672c581a3dd610ee51f733018650740e98c7df862a583f73",
"sha256:e665345f9eef0c621aa0bf2f8d78cf6d21904eef16a93f020240b704a57f1334"
],
"version": "==20.1"
}, },
"parso": { "parso": {
"hashes": [ "hashes": [
"sha256:17cc2d7a945eb42c3569d4564cdf49bde221bc2b552af3eca9c1aad517dcdd33", "sha256:0c5659e0c6eba20636f99a04f469798dca8da279645ce5c387315b2c23912157",
"sha256:2e9574cb12e7112a87253e14e2c380ce312060269d04bd018478a3c92ea9a376" "sha256:8515fc12cfca6ee3aa59138741fc5624d62340c97e401c74875769948d4f2995"
], ],
"version": "==0.4.0" "version": "==0.6.2"
},
"pathspec": {
"hashes": [
"sha256:163b0632d4e31cef212976cf57b43d9fd6b0bac6e67c26015d611a647d5e7424",
"sha256:562aa70af2e0d434367d9790ad37aed893de47f1693e4201fd1d3dca15d19b96"
],
"version": "==0.7.0"
}, },
"pexpect": { "pexpect": {
"hashes": [ "hashes": [
"sha256:2094eefdfcf37a1fdbfb9aa090862c1a4878e5c7e0e7e7088bdb511c558e5cd1", "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937",
"sha256:9e2c1fd0e6ee3a49b28f95d4b33bc389c89b20af6a1255906e90ff1262ce62eb" "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"
], ],
"markers": "sys_platform != 'win32'", "markers": "sys_platform != 'win32'",
"version": "==4.7.0" "version": "==4.8.0"
}, },
"pickleshare": { "pickleshare": {
"hashes": [ "hashes": [
@ -222,18 +260,17 @@
}, },
"pluggy": { "pluggy": {
"hashes": [ "hashes": [
"sha256:19ecf9ce9db2fce065a7a0586e07cfb4ac8614fe96edf628a264b1c70116cf8f", "sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0",
"sha256:84d306a647cc805219916e62aab89caa97a33a1dd8c342e87a37f91073cd4746" "sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"
], ],
"version": "==0.9.0" "version": "==0.13.1"
}, },
"prompt-toolkit": { "prompt-toolkit": {
"hashes": [ "hashes": [
"sha256:11adf3389a996a6d45cc277580d0d53e8a5afd281d0c9ec71b28e6f121463780", "sha256:a402e9bf468b63314e37460b68ba68243d55b2f8c4d0192f85a019af3945050e",
"sha256:2519ad1d8038fd5fc8e770362237ad0364d16a7650fb5724af6997ed5515e3c1", "sha256:c93e53af97f630f12f5f62a3274e79527936ed466f038953dfa379d4941f651a"
"sha256:977c6583ae813a37dc1c2e1b715892461fcbdaa57f6fc62f33a528c4886c8f55"
], ],
"version": "==2.0.9" "version": "==3.0.3"
}, },
"ptyprocess": { "ptyprocess": {
"hashes": [ "hashes": [
@ -244,10 +281,10 @@
}, },
"py": { "py": {
"hashes": [ "hashes": [
"sha256:64f65755aee5b381cea27766a3a147c3f15b9b6b9ac88676de66ba2ae36793fa", "sha256:5e27081401262157467ad6e7f851b7aa402c5852dbcb3dae06768434de5752aa",
"sha256:dc639b046a6e2cff5bbe40194ad65936d6ba360b52b3c3fe1d08a82dd50b5e53" "sha256:c20fdd83a5dbc0af9efd622bee9a5564e278f6380fffcacc43ba6f43db2813b0"
], ],
"version": "==1.8.0" "version": "==1.8.1"
}, },
"pycodestyle": { "pycodestyle": {
"hashes": [ "hashes": [
@ -265,39 +302,118 @@
}, },
"pygments": { "pygments": {
"hashes": [ "hashes": [
"sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a", "sha256:2a3fe295e54a20164a9df49c75fa58526d3be48e14aceba6d6b1e8ac0bfd6f1b",
"sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d" "sha256:98c8aa5a9f778fcd1026a17361ddaf7330d1b7c62ae97c3bb0ae73e0b9b6b0fe"
], ],
"version": "==2.3.1" "version": "==2.5.2"
},
"pyparsing": {
"hashes": [
"sha256:4c830582a84fb022400b85429791bc551f1f4871c33f23e44f353119e92f969f",
"sha256:c342dccb5250c08d45fd6f8b4a559613ca603b57498511740e65cd11a2e7dcec"
],
"version": "==2.4.6"
}, },
"pytest": { "pytest": {
"hashes": [ "hashes": [
"sha256:13c5e9fb5ec5179995e9357111ab089af350d788cbc944c628f3cde72285809b", "sha256:0d5fe9189a148acc3c3eb2ac8e1ac0742cb7618c084f3d228baaec0c254b318d",
"sha256:f21d2f1fb8200830dcbb5d8ec466a9c9120e20d8b53c7585d180125cce1d297a" "sha256:ff615c761e25eb25df19edddc0b970302d2a9091fbce0e7213298d85fb61fef6"
], ],
"index": "pypi", "index": "pypi",
"version": "==4.4.0" "version": "==5.3.5"
},
"pytest-clarity": {
"hashes": [
"sha256:5cc99e3d9b7969dfe17e5f6072d45a917c59d363b679686d3c958a1ded2e4dcf"
],
"index": "pypi",
"version": "==0.3.0a0"
},
"regex": {
"hashes": [
"sha256:01b2d70cbaed11f72e57c1cfbaca71b02e3b98f739ce33f5f26f71859ad90431",
"sha256:046e83a8b160aff37e7034139a336b660b01dbfe58706f9d73f5cdc6b3460242",
"sha256:113309e819634f499d0006f6200700c8209a2a8bf6bd1bdc863a4d9d6776a5d1",
"sha256:200539b5124bc4721247a823a47d116a7a23e62cc6695744e3eb5454a8888e6d",
"sha256:25f4ce26b68425b80a233ce7b6218743c71cf7297dbe02feab1d711a2bf90045",
"sha256:269f0c5ff23639316b29f31df199f401e4cb87529eafff0c76828071635d417b",
"sha256:5de40649d4f88a15c9489ed37f88f053c15400257eeb18425ac7ed0a4e119400",
"sha256:7f78f963e62a61e294adb6ff5db901b629ef78cb2a1cfce3cf4eeba80c1c67aa",
"sha256:82469a0c1330a4beb3d42568f82dffa32226ced006e0b063719468dcd40ffdf0",
"sha256:8c2b7fa4d72781577ac45ab658da44c7518e6d96e2a50d04ecb0fd8f28b21d69",
"sha256:974535648f31c2b712a6b2595969f8ab370834080e00ab24e5dbb9d19b8bfb74",
"sha256:99272d6b6a68c7ae4391908fc15f6b8c9a6c345a46b632d7fdb7ef6c883a2bbb",
"sha256:9b64a4cc825ec4df262050c17e18f60252cdd94742b4ba1286bcfe481f1c0f26",
"sha256:9e9624440d754733eddbcd4614378c18713d2d9d0dc647cf9c72f64e39671be5",
"sha256:9ff16d994309b26a1cdf666a6309c1ef51ad4f72f99d3392bcd7b7139577a1f2",
"sha256:b33ebcd0222c1d77e61dbcd04a9fd139359bded86803063d3d2d197b796c63ce",
"sha256:bba52d72e16a554d1894a0cc74041da50eea99a8483e591a9edf1025a66843ab",
"sha256:bed7986547ce54d230fd8721aba6fd19459cdc6d315497b98686d0416efaff4e",
"sha256:c7f58a0e0e13fb44623b65b01052dae8e820ed9b8b654bb6296bc9c41f571b70",
"sha256:d58a4fa7910102500722defbde6e2816b0372a4fcc85c7e239323767c74f5cbc",
"sha256:f1ac2dc65105a53c1c2d72b1d3e98c2464a133b4067a51a3d2477b28449709a0"
],
"version": "==2020.2.20"
}, },
"six": { "six": {
"hashes": [ "hashes": [
"sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c", "sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73" "sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
], ],
"version": "==1.12.0" "version": "==1.14.0"
},
"termcolor": {
"hashes": [
"sha256:1d6d69ce66211143803fbc56652b41d73b4a400a2891d7bf7a1cdf4c02de613b"
],
"version": "==1.1.0"
},
"toml": {
"hashes": [
"sha256:229f81c57791a41d65e399fc06bf0848bab550a9dfd5ed66df18ce5f05e73d5c",
"sha256:235682dd292d5899d361a811df37e04a8828a5b1da3115886b73cf81ebc9100e"
],
"version": "==0.10.0"
}, },
"traitlets": { "traitlets": {
"hashes": [ "hashes": [
"sha256:9c4bd2d267b7153df9152698efb1050a5d84982d3384a37b2c1f7723ba3e7835", "sha256:70b4c6a1d9019d7b4f6846832288f86998aa3b9207c6821f3578a6a6a467fe44",
"sha256:c6cb5e6f57c5a9bdaa40fa71ce7b4af30298fbab9ece9815b5d995ab6217c7d9" "sha256:d023ee369ddd2763310e4c3eae1ff649689440d4ae59d7485eb4cfbbe3e359f7"
], ],
"version": "==4.3.2" "version": "==4.3.3"
},
"typed-ast": {
"hashes": [
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"version": "==1.4.1"
}, },
"wcwidth": { "wcwidth": {
"hashes": [ "hashes": [
"sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e", "sha256:8fd29383f539be45b20bd4df0dc29c20ba48654a41e661925e612311e9f3c603",
"sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c" "sha256:f28b3e8a6483e5d49e7f8949ac1a78314e740333ae305b4ba5defd3e74fb37a8"
], ],
"version": "==0.1.7" "version": "==0.1.8"
} }
} }
} }

dspace_statistics_api/app.py

@ -1,12 +1,13 @@
from .database import DatabaseManager
import falcon import falcon
from .database import DatabaseManager
class RootResource: class RootResource:
def on_get(self, req, resp): def on_get(self, req, resp):
resp.status = falcon.HTTP_200 resp.status = falcon.HTTP_200
resp.content_type = 'text/html' resp.content_type = "text/html"
with open('dspace_statistics_api/docs/index.html', 'r') as f: with open("dspace_statistics_api/docs/index.html", "r") as f:
resp.body = f.read() resp.body = f.read()
@ -14,8 +15,8 @@ class AllItemsResource:
def on_get(self, req, resp): def on_get(self, req, resp):
"""Handles GET requests""" """Handles GET requests"""
# Return HTTPBadRequest if id parameter is not present and valid # Return HTTPBadRequest if id parameter is not present and valid
limit = req.get_param_as_int("limit", min=0, max=100) or 100 limit = req.get_param_as_int("limit", min_value=0, max_value=100) or 100
page = req.get_param_as_int("page", min=0) or 0 page = req.get_param_as_int("page", min_value=0) or 0
offset = limit * page offset = limit * page
with DatabaseManager() as db: with DatabaseManager() as db:
@ -23,24 +24,33 @@ class AllItemsResource:
with db.cursor() as cursor: with db.cursor() as cursor:
# get total number of items so we can estimate the pages # get total number of items so we can estimate the pages
cursor.execute('SELECT COUNT(id) FROM items') cursor.execute("SELECT COUNT(id) FROM items")
pages = round(cursor.fetchone()[0] / limit) pages = round(cursor.fetchone()[0] / limit)
# get statistics, ordered by id, and use limit and offset to page through results # get statistics and use limit and offset to page through results
cursor.execute('SELECT id, views, downloads FROM items ORDER BY id ASC LIMIT {} OFFSET {}'.format(limit, offset)) cursor.execute(
"SELECT id, views, downloads FROM items LIMIT %s OFFSET %s",
[limit, offset],
)
# create a list to hold dicts of item stats # create a list to hold dicts of item stats
statistics = list() statistics = list()
# iterate over results and build statistics object # iterate over results and build statistics object
for item in cursor: for item in cursor:
statistics.append({'id': item['id'], 'views': item['views'], 'downloads': item['downloads']}) statistics.append(
{
"id": str(item["id"]),
"views": item["views"],
"downloads": item["downloads"],
}
)
message = { message = {
'currentPage': page, "currentPage": page,
'totalPages': pages, "totalPages": pages,
'limit': limit, "limit": limit,
'statistics': statistics "statistics": statistics,
} }
resp.media = message resp.media = message
@ -50,32 +60,40 @@ class ItemResource:
def on_get(self, req, resp, item_id): def on_get(self, req, resp, item_id):
"""Handles GET requests""" """Handles GET requests"""
import psycopg2.extras
# Adapt Python's uuid.UUID type to PostgreSQL's uuid
# See: https://www.psycopg.org/docs/extras.html
psycopg2.extras.register_uuid()
with DatabaseManager() as db: with DatabaseManager() as db:
db.set_session(readonly=True) db.set_session(readonly=True)
with db.cursor() as cursor: with db.cursor() as cursor:
cursor = db.cursor() cursor = db.cursor()
cursor.execute('SELECT views, downloads FROM items WHERE id={}'.format(item_id)) cursor.execute(
"SELECT views, downloads FROM items WHERE id=%s", [str(item_id)]
)
if cursor.rowcount == 0: if cursor.rowcount == 0:
raise falcon.HTTPNotFound( raise falcon.HTTPNotFound(
title='Item not found', title="Item not found",
description='The item with id "{}" was not found.'.format(item_id) description=f'The item with id "{str(item_id)}" was not found.',
) )
else: else:
results = cursor.fetchone() results = cursor.fetchone()
statistics = { statistics = {
'id': item_id, "id": str(item_id),
'views': results['views'], "views": results["views"],
'downloads': results['downloads'] "downloads": results["downloads"],
} }
resp.media = statistics resp.media = statistics
api = application = falcon.API() api = application = falcon.API()
api.add_route('/', RootResource()) api.add_route("/", RootResource())
api.add_route('/items', AllItemsResource()) api.add_route("/items", AllItemsResource())
api.add_route('/item/{item_id:int}', ItemResource()) api.add_route("/item/{item_id:uuid}", ItemResource())
# vim: set sw=4 ts=4 expandtab: # vim: set sw=4 ts=4 expandtab:

dspace_statistics_api/config.py

@ -1,12 +1,12 @@
import os import os
# Check if Solr connection information was provided in the environment # Check if Solr connection information was provided in the environment
SOLR_SERVER = os.environ.get('SOLR_SERVER', 'http://localhost:8080/solr') SOLR_SERVER = os.environ.get("SOLR_SERVER", "http://localhost:8080/solr")
DATABASE_NAME = os.environ.get('DATABASE_NAME', 'dspacestatistics') DATABASE_NAME = os.environ.get("DATABASE_NAME", "dspacestatistics")
DATABASE_USER = os.environ.get('DATABASE_USER', 'dspacestatistics') DATABASE_USER = os.environ.get("DATABASE_USER", "dspacestatistics")
DATABASE_PASS = os.environ.get('DATABASE_PASS', 'dspacestatistics') DATABASE_PASS = os.environ.get("DATABASE_PASS", "dspacestatistics")
DATABASE_HOST = os.environ.get('DATABASE_HOST', 'localhost') DATABASE_HOST = os.environ.get("DATABASE_HOST", "localhost")
DATABASE_PORT = os.environ.get('DATABASE_PORT', '5432') DATABASE_PORT = os.environ.get("DATABASE_PORT", "5432")
# vim: set sw=4 ts=4 expandtab: # vim: set sw=4 ts=4 expandtab:

dspace_statistics_api/database.py

@ -1,25 +1,30 @@
from .config import DATABASE_NAME
from .config import DATABASE_USER
from .config import DATABASE_PASS
from .config import DATABASE_HOST
from .config import DATABASE_PORT
import falcon import falcon
import psycopg2 import psycopg2
import psycopg2.extras import psycopg2.extras
from .config import (
DATABASE_HOST,
DATABASE_NAME,
DATABASE_PASS,
DATABASE_PORT,
DATABASE_USER,
)
class DatabaseManager():
'''Manage database connection.''' class DatabaseManager:
"""Manage database connection."""
def __init__(self): def __init__(self):
self._connection_uri = 'dbname={} user={} password={} host={} port={}'.format(DATABASE_NAME, DATABASE_USER, DATABASE_PASS, DATABASE_HOST, DATABASE_PORT) self._connection_uri = f"dbname={DATABASE_NAME} user={DATABASE_USER} password={DATABASE_PASS} host={DATABASE_HOST} port={DATABASE_PORT}"
def __enter__(self): def __enter__(self):
try: try:
self._connection = psycopg2.connect(self._connection_uri, cursor_factory=psycopg2.extras.DictCursor) self._connection = psycopg2.connect(
self._connection_uri, cursor_factory=psycopg2.extras.DictCursor
)
except psycopg2.OperationalError: except psycopg2.OperationalError:
title = '500 Internal Server Error' title = "500 Internal Server Error"
description = 'Could not connect to database' description = "Could not connect to database"
raise falcon.HTTPInternalServerError(title, description) raise falcon.HTTPInternalServerError(title, description)
return self._connection return self._connection
@ -27,4 +32,5 @@ class DatabaseManager():
def __exit__(self, exc_type, exc_value, exc_traceback): def __exit__(self, exc_type, exc_value, exc_traceback):
self._connection.close() self._connection.close()
# vim: set sw=4 ts=4 expandtab: # vim: set sw=4 ts=4 expandtab:

dspace_statistics_api/indexer.py

@ -29,12 +29,13 @@
# See: https://solrclient.readthedocs.io/en/latest/SolrClient.html # See: https://solrclient.readthedocs.io/en/latest/SolrClient.html
# See: https://wiki.duraspace.org/display/DSPACE/Solr # See: https://wiki.duraspace.org/display/DSPACE/Solr
import re
import psycopg2.extras
import requests
from .config import SOLR_SERVER from .config import SOLR_SERVER
from .database import DatabaseManager from .database import DatabaseManager
import json
import psycopg2.extras
import re
import requests
# Enumerate the cores in Solr to determine if statistics have been sharded into # Enumerate the cores in Solr to determine if statistics have been sharded into
@ -44,11 +45,8 @@ def get_statistics_shards():
statistics_core_years = [] statistics_core_years = []
# URL for Solr status to check active cores # URL for Solr status to check active cores
solr_query_params = { solr_query_params = {"action": "STATUS", "wt": "json"}
'action': 'STATUS', solr_url = SOLR_SERVER + "/admin/cores"
'wt': 'json'
}
solr_url = SOLR_SERVER + '/admin/cores'
res = requests.get(solr_url, params=solr_query_params) res = requests.get(solr_url, params=solr_query_params)
if res.status_code == requests.codes.ok: if res.status_code == requests.codes.ok:
@ -56,9 +54,9 @@ def get_statistics_shards():
# Iterate over active cores from Solr's STATUS response (cores are in # Iterate over active cores from Solr's STATUS response (cores are in
# the status array of this response). # the status array of this response).
for core in data['status']: for core in data["status"]:
# Pattern to match, for example: statistics-2018 # Pattern to match, for example: statistics-2018
pattern = re.compile('^statistics-[0-9]{4}$') pattern = re.compile("^statistics-[0-9]{4}$")
if not pattern.match(core): if not pattern.match(core):
continue continue
@ -72,13 +70,13 @@ def get_statistics_shards():
if len(statistics_core_years) > 0: if len(statistics_core_years) > 0:
# Begin building a string of shards starting with the default one # Begin building a string of shards starting with the default one
shards = '{}/statistics'.format(SOLR_SERVER) shards = f"{SOLR_SERVER}/statistics"
for core in statistics_core_years: for core in statistics_core_years:
# Create a comma-separated list of shards to pass to our Solr query # Create a comma-separated list of shards to pass to our Solr query
# #
# See: https://wiki.apache.org/solr/DistributedSearch # See: https://wiki.apache.org/solr/DistributedSearch
shards += ',{}/{}'.format(SOLR_SERVER, core) shards += f",{SOLR_SERVER}/{core}"
# Return the string of shards, which may actually be empty. Solr doesn't # Return the string of shards, which may actually be empty. Solr doesn't
# seem to mind if the shards query parameter is empty and I haven't seen # seem to mind if the shards query parameter is empty and I haven't seen
@ -94,30 +92,32 @@ def index_views():
# #
# see: https://lucene.apache.org/solr/guide/6_6/the-stats-component.html # see: https://lucene.apache.org/solr/guide/6_6/the-stats-component.html
solr_query_params = { solr_query_params = {
'q': 'type:2', "q": "type:2",
'fq': 'isBot:false AND statistics_type:view', "fq": "isBot:false AND statistics_type:view",
'facet': 'true', "facet": "true",
'facet.field': 'id', "facet.field": "id",
'facet.mincount': 1, "facet.mincount": 1,
'facet.limit': 1, "facet.limit": 1,
'facet.offset': 0, "facet.offset": 0,
'stats': 'true', "stats": "true",
'stats.field': 'id', "stats.field": "id",
'stats.calcdistinct': 'true', "stats.calcdistinct": "true",
'shards': shards, "shards": shards,
'rows': 0, "rows": 0,
'wt': 'json' "wt": "json",
} }
solr_url = SOLR_SERVER + '/statistics/select' solr_url = SOLR_SERVER + "/statistics/select"
res = requests.get(solr_url, params=solr_query_params) res = requests.get(solr_url, params=solr_query_params)
try: try:
# get total number of distinct facets (countDistinct) # get total number of distinct facets (countDistinct)
results_totalNumFacets = res.json()['stats']['stats_fields']['id']['countDistinct'] results_totalNumFacets = res.json()["stats"]["stats_fields"]["id"][
"countDistinct"
]
except TypeError: except TypeError:
print('No item views to index, exiting.') print("No item views to index, exiting.")
exit(0) exit(0)
@ -133,35 +133,37 @@ def index_views():
while results_current_page <= results_num_pages: while results_current_page <= results_num_pages:
# "pages" are zero based, but one based is more human readable # "pages" are zero based, but one based is more human readable
print('Indexing item views (page {} of {})'.format(results_current_page + 1, results_num_pages + 1)) print(
f"Indexing item views (page {results_current_page + 1} of {results_num_pages + 1})"
)
solr_query_params = { solr_query_params = {
'q': 'type:2', "q": "type:2",
'fq': 'isBot:false AND statistics_type:view', "fq": "isBot:false AND statistics_type:view",
'facet': 'true', "facet": "true",
'facet.field': 'id', "facet.field": "id",
'facet.mincount': 1, "facet.mincount": 1,
'facet.limit': results_per_page, "facet.limit": results_per_page,
'facet.offset': results_current_page * results_per_page, "facet.offset": results_current_page * results_per_page,
'shards': shards, "shards": shards,
'rows': 0, "rows": 0,
'wt': 'json', "wt": "json",
'json.nl': 'map' # return facets as a dict instead of a flat list "json.nl": "map", # return facets as a dict instead of a flat list
} }
solr_url = SOLR_SERVER + '/statistics/select' solr_url = SOLR_SERVER + "/statistics/select"
res = requests.get(solr_url, params=solr_query_params) res = requests.get(solr_url, params=solr_query_params)
# Solr returns facets as a dict of dicts (see json.nl parameter) # Solr returns facets as a dict of dicts (see json.nl parameter)
views = res.json()['facet_counts']['facet_fields'] views = res.json()["facet_counts"]["facet_fields"]
# iterate over the 'id' dict and get the item ids and views # iterate over the 'id' dict and get the item ids and views
for item_id, item_views in views['id'].items(): for item_id, item_views in views["id"].items():
data.append((item_id, item_views)) data.append((item_id, item_views))
# do a batch insert of values from the current "page" of results # do a batch insert of values from the current "page" of results
sql = 'INSERT INTO items(id, views) VALUES %s ON CONFLICT(id) DO UPDATE SET views=excluded.views' sql = "INSERT INTO items(id, views) VALUES %s ON CONFLICT(id) DO UPDATE SET views=excluded.views"
psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)') psycopg2.extras.execute_values(cursor, sql, data, template="(%s, %s)")
db.commit() db.commit()
# clear all items from the list so we can populate it with the next batch # clear all items from the list so we can populate it with the next batch
@ -172,31 +174,33 @@ def index_views():
def index_downloads(): def index_downloads():
# get the total number of distinct facets for items with at least 1 download # get the total number of distinct facets for items with at least 1 download
solr_query_params= { solr_query_params = {
'q': 'type:0', "q": "type:0",
'fq': 'isBot:false AND statistics_type:view AND bundleName:ORIGINAL', "fq": "isBot:false AND statistics_type:view AND bundleName:ORIGINAL",
'facet': 'true', "facet": "true",
'facet.field': 'owningItem', "facet.field": "owningItem",
'facet.mincount': 1, "facet.mincount": 1,
'facet.limit': 1, "facet.limit": 1,
'facet.offset': 0, "facet.offset": 0,
'stats': 'true', "stats": "true",
'stats.field': 'owningItem', "stats.field": "owningItem",
'stats.calcdistinct': 'true', "stats.calcdistinct": "true",
'shards': shards, "shards": shards,
'rows': 0, "rows": 0,
'wt': 'json' "wt": "json",
} }
solr_url = SOLR_SERVER + '/statistics/select' solr_url = SOLR_SERVER + "/statistics/select"
res = requests.get(solr_url, params=solr_query_params) res = requests.get(solr_url, params=solr_query_params)
try: try:
# get total number of distinct facets (countDistinct) # get total number of distinct facets (countDistinct)
results_totalNumFacets = res.json()['stats']['stats_fields']['owningItem']['countDistinct'] results_totalNumFacets = res.json()["stats"]["stats_fields"]["owningItem"][
"countDistinct"
]
except TypeError: except TypeError:
print('No item downloads to index, exiting.') print("No item downloads to index, exiting.")
exit(0) exit(0)
@ -212,35 +216,37 @@ def index_downloads():
while results_current_page <= results_num_pages: while results_current_page <= results_num_pages:
# "pages" are zero based, but one based is more human readable # "pages" are zero based, but one based is more human readable
print('Indexing item downloads (page {} of {})'.format(results_current_page + 1, results_num_pages + 1)) print(
f"Indexing item downloads (page {results_current_page + 1} of {results_num_pages + 1})"
)
solr_query_params = { solr_query_params = {
'q': 'type:0', "q": "type:0",
'fq': 'isBot:false AND statistics_type:view AND bundleName:ORIGINAL', "fq": "isBot:false AND statistics_type:view AND bundleName:ORIGINAL",
'facet': 'true', "facet": "true",
'facet.field': 'owningItem', "facet.field": "owningItem",
'facet.mincount': 1, "facet.mincount": 1,
'facet.limit': results_per_page, "facet.limit": results_per_page,
'facet.offset': results_current_page * results_per_page, "facet.offset": results_current_page * results_per_page,
'shards': shards, "shards": shards,
'rows': 0, "rows": 0,
'wt': 'json', "wt": "json",
'json.nl': 'map' # return facets as a dict instead of a flat list "json.nl": "map", # return facets as a dict instead of a flat list
} }
solr_url = SOLR_SERVER + '/statistics/select' solr_url = SOLR_SERVER + "/statistics/select"
res = requests.get(solr_url, params=solr_query_params) res = requests.get(solr_url, params=solr_query_params)
# Solr returns facets as a dict of dicts (see json.nl parameter) # Solr returns facets as a dict of dicts (see json.nl parameter)
downloads = res.json()['facet_counts']['facet_fields'] downloads = res.json()["facet_counts"]["facet_fields"]
# iterate over the 'owningItem' dict and get the item ids and downloads # iterate over the 'owningItem' dict and get the item ids and downloads
for item_id, item_downloads in downloads['owningItem'].items(): for item_id, item_downloads in downloads["owningItem"].items():
data.append((item_id, item_downloads)) data.append((item_id, item_downloads))
# do a batch insert of values from the current "page" of results # do a batch insert of values from the current "page" of results
sql = 'INSERT INTO items(id, downloads) VALUES %s ON CONFLICT(id) DO UPDATE SET downloads=excluded.downloads' sql = "INSERT INTO items(id, downloads) VALUES %s ON CONFLICT(id) DO UPDATE SET downloads=excluded.downloads"
psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)') psycopg2.extras.execute_values(cursor, sql, data, template="(%s, %s)")
db.commit() db.commit()
# clear all items from the list so we can populate it with the next batch # clear all items from the list so we can populate it with the next batch
@ -252,8 +258,10 @@ def index_downloads():
with DatabaseManager() as db: with DatabaseManager() as db:
with db.cursor() as cursor: with db.cursor() as cursor:
# create table to store item views and downloads # create table to store item views and downloads
cursor.execute('''CREATE TABLE IF NOT EXISTS items cursor.execute(
(id INT PRIMARY KEY, views INT DEFAULT 0, downloads INT DEFAULT 0)''') """CREATE TABLE IF NOT EXISTS items
(id UUID PRIMARY KEY, views INT DEFAULT 0, downloads INT DEFAULT 0)"""
)
# commit the table creation before closing the database connection # commit the table creation before closing the database connection
db.commit() db.commit()

requirements-dev.txt

@ -1,26 +1,37 @@
-i https://pypi.org/simple -i https://pypi.org/simple
atomicwrites==1.3.0 appdirs==1.4.3
attrs==19.1.0 attrs==19.3.0
backcall==0.1.0 backcall==0.1.0
decorator==4.4.0 black==19.10b0
click==7.0
decorator==4.4.2
entrypoints==0.3 entrypoints==0.3
flake8==3.7.7 flake8==3.7.9
ipython-genutils==0.2.0 ipython-genutils==0.2.0
ipython==7.4.0 ipython==7.13.0
jedi==0.13.3 isort==4.3.21
jedi==0.16.0
mccabe==0.6.1 mccabe==0.6.1
more-itertools==7.0.0 ; python_version > '2.7' more-itertools==8.2.0
parso==0.4.0 packaging==20.1
pexpect==4.7.0 ; sys_platform != 'win32' parso==0.6.2
pathspec==0.7.0
pexpect==4.8.0 ; sys_platform != 'win32'
pickleshare==0.7.5 pickleshare==0.7.5
pluggy==0.9.0 pluggy==0.13.1
prompt-toolkit==2.0.9 prompt-toolkit==3.0.3
ptyprocess==0.6.0 ptyprocess==0.6.0
py==1.8.0 py==1.8.1
pycodestyle==2.5.0 pycodestyle==2.5.0
pyflakes==2.1.1 pyflakes==2.1.1
pygments==2.3.1 pygments==2.5.2
pytest==4.4.0 pyparsing==2.4.6
six==1.12.0 pytest-clarity==0.3.0a0
traitlets==4.3.2 pytest==5.3.5
wcwidth==0.1.7 regex==2020.2.20
six==1.14.0
termcolor==1.1.0
toml==0.10.0
traitlets==4.3.3
typed-ast==1.4.1
wcwidth==0.1.8

requirements.txt

@ -1,11 +1,9 @@
-i https://pypi.org/simple -i https://pypi.org/simple
certifi==2019.3.9 certifi==2019.11.28
chardet==3.0.4 chardet==3.0.4
falcon==1.4.1 falcon==2.0.0
gunicorn==19.9.0 gunicorn==20.0.4
idna==2.8 idna==2.9
psycopg2-binary==2.8.2 psycopg2-binary==2.8.4
python-mimeparse==1.6.0 requests==2.23.0
requests==2.21.0 urllib3==1.25.8
six==1.12.0
urllib3==1.24.1

setup.cfg Normal file

@ -0,0 +1,6 @@
[isort]
multi_line_output=3
include_trailing_comma=True
force_grid_wrap=0
use_parentheses=True
line_length=88

tests/test_api.py

@ -11,57 +11,57 @@ def client():
def test_get_docs(client): def test_get_docs(client):
'''Test requesting the documentation at the root.''' """Test requesting the documentation at the root."""
response = client.simulate_get('/') response = client.simulate_get("/")
assert isinstance(response.content, bytes) assert isinstance(response.content, bytes)
assert response.status_code == 200 assert response.status_code == 200
def test_get_item(client): def test_get_item(client):
'''Test requesting a single item.''' """Test requesting a single item."""
response = client.simulate_get('/item/17') response = client.simulate_get("/item/c3910974-c3a5-4053-9dce-104aa7bb1621")
response_doc = json.loads(response.text) response_doc = json.loads(response.text)
assert isinstance(response_doc['downloads'], int) assert isinstance(response_doc["downloads"], int)
assert isinstance(response_doc['id'], int) assert isinstance(response_doc["id"], str)
assert isinstance(response_doc['views'], int) assert isinstance(response_doc["views"], int)
assert response.status_code == 200 assert response.status_code == 200
def test_get_missing_item(client): def test_get_missing_item(client):
'''Test requesting a single non-existing item.''' """Test requesting a single non-existing item."""
response = client.simulate_get('/item/1') response = client.simulate_get("/item/c3910974-c3a5-4053-9dce-104aa7bb1620")
assert response.status_code == 404 assert response.status_code == 404
def test_get_items(client): def test_get_items(client):
'''Test requesting 100 items.''' """Test requesting 100 items."""
response = client.simulate_get('/items', query_string='limit=100') response = client.simulate_get("/items", query_string="limit=100")
response_doc = json.loads(response.text) response_doc = json.loads(response.text)
assert isinstance(response_doc['currentPage'], int) assert isinstance(response_doc["currentPage"], int)
assert isinstance(response_doc['totalPages'], int) assert isinstance(response_doc["totalPages"], int)
assert isinstance(response_doc['statistics'], list) assert isinstance(response_doc["statistics"], list)
assert response.status_code == 200 assert response.status_code == 200
def test_get_items_invalid_limit(client): def test_get_items_invalid_limit(client):
'''Test requesting 100 items with an invalid limit parameter.''' """Test requesting 100 items with an invalid limit parameter."""
response = client.simulate_get('/items', query_string='limit=101') response = client.simulate_get("/items", query_string="limit=101")
assert response.status_code == 400 assert response.status_code == 400
def test_get_items_invalid_page(client): def test_get_items_invalid_page(client):
'''Test requesting 100 items with an invalid page parameter.''' """Test requesting 100 items with an invalid page parameter."""
response = client.simulate_get('/items', query_string='page=-1') response = client.simulate_get("/items", query_string="page=-1")
assert response.status_code == 400 assert response.status_code == 400