Mirror of https://github.com/ilri/dspace-statistics-api.git (synced 2025-05-10 15:16:02 +02:00)
Compare commits
75 Commits
Commit SHA1s:

914ec52fbb, 5524066656, 043d897cef, bd28353cda, e23d66c2a2, 40e284dac0, 934fa9db9b, 1fabb72b58, c7f95f0b60, c95a98dd2d,
3f70f94a10, 9b8ad9defd, d69ab20220, 378f56ddc2, 5a2a7d684c, 18276e910f, 8de8c2765f, 11a1755e59, a835b0fdc5, a88600c92b,
019d9242c9, f4d7312a3f, 9c46cfc7e2, c1c2e319ac, 0895b4f469, dcfef06a65, 13736d6359, 4fc64edeb8, 2a8901dc4f, e25c974796,
ffc62e9ee6, 556c5ae088, d94134f80a, 586231eb2d, 766b77a3b6, 1959e8154e, d40b2f0b2e, 061d0a8f5f, e57660ff88, 5c8756bede,
bae9fb80e4, 8a65d99e08, d479b7dc6c, 40aac8bf89, 53ba6f2936, 140cc4cb07, d5d2d2149b, 4c51d12eb4, a6ce44e852, f6e866a589,
eb5c187d41, b06c82bb16, 2f342be948, e39f2b260c, 60ad474b88, 888f85d19e, df7de93964, 7218631cc4, 085e525b2f, e1580df12f,
be18779ff9, 60cfd8f23b, 87fd117d77, f262ebdca2, 64d7f1a3b2, a238a727d2, cc5ce3ab98, 70dfcb93c5, 69bcd1b5e4, 5f3bd61998,
e54dd8888f, 2ba09f8693, a468a87a5a, 6a30b6550d, 18f013bfa0

.hound.yml (new file, 4 lines)

@@ -0,0 +1,4 @@
flake8:
  enabled: true
  config_file: .flake8
fail_on_violations: true

.travis.yml (16 changed lines)

@@ -3,9 +3,17 @@ python:
- "3.5"
- "3.6"
- "3.7-dev"
script: pip install -r requirements.txt
branches:
  only:
    - master
addons:
  postgresql: "9.5"
before_script:
  - psql --version
  - createuser -U postgres dspacestatistics
  - psql -U postgres -c "ALTER USER dspacestatistics WITH PASSWORD 'dspacestatistics'"
  - createdb -U postgres -O dspacestatistics --encoding=UNICODE dspacestatistics
  - psql -U postgres -d dspacestatistics < tests/dspacestatistics.sql
install:
  - "pip install pipenv --upgrade-strategy=only-if-needed"
  - "pipenv install --dev"
script: pytest

# vim: ts=2 sw=2 et

CHANGELOG.md (63 changed lines)

@@ -4,34 +4,67 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
-### [0.6.1] - 2018-10-31
-## Added
+## [0.9.0] - 2019-01-22
+### Updated
+- pytest version 4.0.0
+- Fix indexing of sharded statistics cores (#10)
+- Handle case of missing views/downloads gracefully
+
+## [0.8.1] - 2018-11-14
+### Changed
+- README.md to recommend using vanilla Python virtual environments and pip instead of pipenv
+- Regenerate pipenv environment to capture only direct dependencies
+
+### Added
+- `requirements-dev.txt` for installing development packages with pip
+
+## [0.8.0] - 2018-11-11
+### Changed
+- Properly handle database connection errors
+
+### Added
+- API tests with pytest
+
+## [0.7.0] - 2018-11-07
+### Added
+- Ability to configure PostgreSQL database port with DATABASE_PORT environment variable (defaults to 5432)
+- Hound CI configuration to validate pull requests against PEP 8 code style with Flake8
+- Configuration for [pipenv](https://pipenv.readthedocs.io/en/latest/)
+
+### Changed
+- Use a database management class with Python context management to automatically open/close connections and cursors
+
+### Changed
+- Validate code against PEP 8 style guide with Flake8
+
+## [0.6.1] - 2018-10-31
+### Added
 - API documentation at root path (/)
 
-### [0.6.0] - 2018-10-31
-## Changed
+## [0.6.0] - 2018-10-31
+### Changed
 - Refactor project structure (note breaking changes to API and indexing invocation, see contrib and README.md)
 
-### [0.5.2] - 2018-10-28
-## Changed
+## [0.5.2] - 2018-10-28
+### Changed
 - Update library versions in requirements.txt
 
-### [0.5.1] - 2018-10-24
-## Changed
+## [0.5.1] - 2018-10-24
+### Changed
 - Use Python's native json instead of ujson
 
-### [0.5.0] - 2018-10-24
-## Added
+## [0.5.0] - 2018-10-24
+### Added
 - Example nginx configuration to README.md
 
-## Changed
+### Changed
 - Don't initialize Solr connection in API
 
-### [0.4.3] - 2018-10-17
-## Changed
+## [0.4.3] - 2018-10-17
+### Changed
 - Use pip install as script for Travis CI
 
-## Improved
+### Improved
 - Documentation for deployment and testing
 
 ## [0.4.2] - 2018-10-04

@@ -45,7 +78,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [0.4.1] - 2018-09-26
 ### Changed
-- Use execute_values() to batch insert records to PostgreSQL
+- Use `execute_values()` to batch insert records to PostgreSQL
 
 ## [0.4.0] - 2018-09-25
 ### Fixed

Pipfile (new file, 18 lines)

@@ -0,0 +1,18 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
gunicorn = "*"
falcon = "*"
"psycopg2-binary" = "*"
solrclient = {ref = "kazoo-2.5.0", git = "https://github.com/alanorth/SolrClient.git"}

[dev-packages]
ipython = "*"
"flake8" = "*"
pytest = "*"

[requires]
python_version = "3.7"

Pipfile.lock (generated, new file, 266 lines)

@@ -0,0 +1,266 @@
{
    "_meta": {
        "hash": {
            "sha256": "a846fdab4de5765a7e7fc19424a97a6196248e29f87285cf81fd76e8e9ae3e28"
        },
        "pipfile-spec": 6,
        "requires": {
            "python_version": "3.7"
        },
        "sources": [
            {
                "name": "pypi",
                "url": "https://pypi.org/simple",
                "verify_ssl": true
            }
        ]
    },
    "default": {
        "falcon": {
            "hashes": [
                "sha256:0a66b33458fab9c1e400a9be1a68056abda178eb02a8cb4b8f795e9df20b053b",
                "sha256:3981f609c0358a9fcdb25b0e7fab3d9e23019356fb429c635ce4133135ae1bc4"
            ],
            "index": "pypi",
            "version": "==1.4.1"
        },
        "gunicorn": {
            "hashes": [
                "sha256:aa8e0b40b4157b36a5df5e599f45c9c76d6af43845ba3b3b0efe2c70473c2471",
                "sha256:fa2662097c66f920f53f70621c6c58ca4a3c4d3434205e608e121b5b3b71f4f3"
            ],
            "index": "pypi",
            "version": "==19.9.0"
        },
        "psycopg2-binary": {
            "hashes": [
                "sha256:036bcb198a7cc4ce0fe43344f8c2c9a8155aefa411633f426c8c6ed58a6c0426",
                "sha256:1d770fcc02cdf628aebac7404d56b28a7e9ebec8cfc0e63260bd54d6edfa16d4",
                "sha256:1fdc6f369dcf229de6c873522d54336af598b9470ccd5300e2f58ee506f5ca13",
                "sha256:21f9ddc0ff6e07f7d7b6b484eb9da2c03bc9931dd13e36796b111d631f7135a3",
                "sha256:247873cda726f7956f745a3e03158b00de79c4abea8776dc2f611d5ba368d72d",
                "sha256:3aa31c42f29f1da6f4fd41433ad15052d5ff045f2214002e027a321f79d64e2c",
                "sha256:475f694f87dbc619010b26de7d0fc575a4accf503f2200885cc21f526bffe2ad",
                "sha256:4b5e332a24bf6e2fda1f51ca2a57ae1083352293a08eeea1fa1112dc7dd542d1",
                "sha256:570d521660574aca40be7b4d532dfb6f156aad7b16b5ed62d1534f64f1ef72d8",
                "sha256:59072de7def0690dd13112d2bdb453e20570a97297070f876fbbb7cbc1c26b05",
                "sha256:5f0b658989e918ef187f8a08db0420528126f2c7da182a7b9f8bf7f85144d4e4",
                "sha256:649199c84a966917d86cdc2046e03d536763576c0b2a756059ae0b3a9656bc20",
                "sha256:6645fc9b4705ae8fbf1ef7674f416f89ae1559deec810f6dd15197dfa52893da",
                "sha256:6872dd54d4e398d781efe8fe2e2d7eafe4450d61b5c4898aced7610109a6df75",
                "sha256:6ce34fbc251fc0d691c8d131250ba6f42fd2b28ef28558d528ba8c558cb28804",
                "sha256:73920d167a0a4d1006f5f3b9a3efce6f0e5e883a99599d38206d43f27697df00",
                "sha256:8a671732b87ae423e34b51139628123bc0306c2cb85c226e71b28d3d57d7e42a",
                "sha256:8d517e8fda2efebca27c2018e14c90ed7dc3f04d7098b3da2912e62a1a5585fe",
                "sha256:9475a008eb7279e20d400c76471843c321b46acacc7ee3de0b47233a1e3fa2cf",
                "sha256:96947b8cd7b3148fb0e6549fcb31258a736595d6f2a599f8cd450e9a80a14781",
                "sha256:abf229f24daa93f67ac53e2e17c8798a71a01711eb9fcdd029abba8637164338",
                "sha256:b1ab012f276df584beb74f81acb63905762c25803ece647016613c3d6ad4e432",
                "sha256:b22b33f6f0071fe57cb4e9158f353c88d41e739a3ec0d76f7b704539e7076427",
                "sha256:b3b2d53274858e50ad2ffdd6d97ce1d014e1e530f82ec8b307edd5d4c921badf",
                "sha256:bab26a729befc7b9fab9ded1bba9c51b785188b79f8a2796ba03e7e734269e2e",
                "sha256:daa1a593629aa49f506eddc9d23dc7f89b35693b90e1fbcd4480182d1203ea90",
                "sha256:dd111280ce40e89fd17b19c1269fd1b74a30fce9d44a550840e86edb33924eb8",
                "sha256:e0b86084f1e2e78c451994410de756deba206884d6bed68d5a3d7f39ff5fea1d",
                "sha256:eb86520753560a7e89639500e2a254bb6f683342af598088cb72c73edcad21e6",
                "sha256:ff18c5c40a38d41811c23e2480615425c97ea81fd7e9118b8b899c512d97c737"
            ],
            "index": "pypi",
            "version": "==2.7.6.1"
        },
        "python-mimeparse": {
            "hashes": [
                "sha256:76e4b03d700a641fd7761d3cd4fdbbdcd787eade1ebfac43f877016328334f78",
                "sha256:a295f03ff20341491bfe4717a39cd0a8cc9afad619ba44b77e86b0ab8a2b8282"
            ],
            "version": "==1.6.0"
        },
        "six": {
            "hashes": [
                "sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
                "sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
            ],
            "version": "==1.12.0"
        },
        "solrclient": {
            "git": "https://github.com/alanorth/SolrClient.git",
            "ref": "c629e3475be37c82770b2be61748be7e29882648"
        }
    },
    "develop": {
        "atomicwrites": {
            "hashes": [
                "sha256:0312ad34fcad8fac3704d441f7b317e50af620823353ec657a53e981f92920c0",
                "sha256:ec9ae8adaae229e4f8446952d204a3e4b5fdd2d099f9be3aaf556120135fb3ee"
            ],
            "version": "==1.2.1"
        },
        "attrs": {
            "hashes": [
                "sha256:10cbf6e27dbce8c30807caf056c8eb50917e0eaafe86347671b57254006c3e69",
                "sha256:ca4be454458f9dec299268d472aaa5a11f67a4ff70093396e1ceae9c76cf4bbb"
            ],
            "version": "==18.2.0"
        },
        "backcall": {
            "hashes": [
                "sha256:38ecd85be2c1e78f77fd91700c76e14667dc21e2713b63876c0eb901196e01e4",
                "sha256:bbbf4b1e5cd2bdb08f915895b51081c041bac22394fdfcfdfbe9f14b77c08bf2"
            ],
            "version": "==0.1.0"
        },
        "decorator": {
            "hashes": [
                "sha256:2c51dff8ef3c447388fe5e4453d24a2bf128d3a4c32af3fabef1f01c6851ab82",
                "sha256:c39efa13fbdeb4506c476c9b3babf6a718da943dab7811c206005a4a956c080c"
            ],
            "version": "==4.3.0"
        },
        "flake8": {
            "hashes": [
                "sha256:6a35f5b8761f45c5513e3405f110a86bea57982c3b75b766ce7b65217abe1670",
                "sha256:c01f8a3963b3571a8e6bd7a4063359aff90749e160778e03817cd9b71c9e07d2"
            ],
            "index": "pypi",
            "version": "==3.6.0"
        },
        "ipython": {
            "hashes": [
                "sha256:6a9496209b76463f1dec126ab928919aaf1f55b38beb9219af3fe202f6bbdd12",
                "sha256:f69932b1e806b38a7818d9a1e918e5821b685715040b48e59c657b3c7961b742"
            ],
            "index": "pypi",
            "version": "==7.2.0"
        },
        "ipython-genutils": {
            "hashes": [
                "sha256:72dd37233799e619666c9f639a9da83c34013a73e8bbc79a7a6348d93c61fab8",
                "sha256:eb2e116e75ecef9d4d228fdc66af54269afa26ab4463042e33785b887c628ba8"
            ],
            "version": "==0.2.0"
        },
        "jedi": {
            "hashes": [
                "sha256:571702b5bd167911fe9036e5039ba67f820d6502832285cde8c881ab2b2149fd",
                "sha256:c8481b5e59d34a5c7c42e98f6625e633f6ef59353abea6437472c7ec2093f191"
            ],
            "version": "==0.13.2"
        },
        "mccabe": {
            "hashes": [
                "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
                "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
            ],
            "version": "==0.6.1"
        },
        "more-itertools": {
            "hashes": [
                "sha256:38a936c0a6d98a38bcc2d03fdaaedaba9f412879461dd2ceff8d37564d6522e4",
                "sha256:c0a5785b1109a6bd7fac76d6837fd1feca158e54e521ccd2ae8bfe393cc9d4fc",
                "sha256:fe7a7cae1ccb57d33952113ff4fa1bc5f879963600ed74918f1236e212ee50b9"
            ],
            "version": "==5.0.0"
        },
        "parso": {
            "hashes": [
                "sha256:35704a43a3c113cce4de228ddb39aab374b8004f4f2407d070b6a2ca784ce8a2",
                "sha256:895c63e93b94ac1e1690f5fdd40b65f07c8171e3e53cbd7793b5b96c0e0a7f24"
            ],
            "version": "==0.3.1"
        },
        "pexpect": {
            "hashes": [
                "sha256:2a8e88259839571d1251d278476f3eec5db26deb73a70be5ed5dc5435e418aba",
                "sha256:3fbd41d4caf27fa4a377bfd16fef87271099463e6fa73e92a52f92dfee5d425b"
            ],
            "markers": "sys_platform != 'win32'",
            "version": "==4.6.0"
        },
        "pickleshare": {
            "hashes": [
                "sha256:87683d47965c1da65cdacaf31c8441d12b8044cdec9aca500cd78fc2c683afca",
                "sha256:9649af414d74d4df115d5d718f82acb59c9d418196b7b4290ed47a12ce62df56"
            ],
            "version": "==0.7.5"
        },
        "pluggy": {
            "hashes": [
                "sha256:8ddc32f03971bfdf900a81961a48ccf2fb677cf7715108f85295c67405798616",
                "sha256:980710797ff6a041e9a73a5787804f848996ecaa6f8a1b1e08224a5894f2074a"
            ],
            "version": "==0.8.1"
        },
        "prompt-toolkit": {
            "hashes": [
                "sha256:c1d6aff5252ab2ef391c2fe498ed8c088066f66bc64a8d5c095bbf795d9fec34",
                "sha256:d4c47f79b635a0e70b84fdb97ebd9a274203706b1ee5ed44c10da62755cf3ec9",
                "sha256:fd17048d8335c1e6d5ee403c3569953ba3eb8555d710bfc548faf0712666ea39"
            ],
            "version": "==2.0.7"
        },
        "ptyprocess": {
            "hashes": [
                "sha256:923f299cc5ad920c68f2bc0bc98b75b9f838b93b599941a6b63ddbc2476394c0",
                "sha256:d7cc528d76e76342423ca640335bd3633420dc1366f258cb31d05e865ef5ca1f"
            ],
            "version": "==0.6.0"
        },
        "py": {
            "hashes": [
                "sha256:bf92637198836372b520efcba9e020c330123be8ce527e535d185ed4b6f45694",
                "sha256:e76826342cefe3c3d5f7e8ee4316b80d1dd8a300781612ddbc765c17ba25a6c6"
            ],
            "version": "==1.7.0"
        },
        "pycodestyle": {
            "hashes": [
                "sha256:cbc619d09254895b0d12c2c691e237b2e91e9b2ecf5e84c26b35400f93dcfb83",
                "sha256:cbfca99bd594a10f674d0cd97a3d802a1fdef635d4361e1a2658de47ed261e3a"
            ],
            "version": "==2.4.0"
        },
        "pyflakes": {
            "hashes": [
                "sha256:9a7662ec724d0120012f6e29d6248ae3727d821bba522a0e6b356eff19126a49",
                "sha256:f661252913bc1dbe7fcfcbf0af0db3f42ab65aabd1a6ca68fe5d466bace94dae"
            ],
            "version": "==2.0.0"
        },
        "pygments": {
            "hashes": [
                "sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a",
                "sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d"
            ],
            "version": "==2.3.1"
        },
        "pytest": {
            "hashes": [
                "sha256:41568ea7ecb4a68d7f63837cf65b92ce8d0105e43196ff2b26622995bb3dc4b2",
                "sha256:c3c573a29d7c9547fb90217ece8a8843aa0c1328a797e200290dc3d0b4b823be"
            ],
            "index": "pypi",
            "version": "==4.1.1"
        },
        "six": {
            "hashes": [
                "sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c",
                "sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73"
            ],
            "version": "==1.12.0"
        },
        "traitlets": {
            "hashes": [
                "sha256:9c4bd2d267b7153df9152698efb1050a5d84982d3384a37b2c1f7723ba3e7835",
                "sha256:c6cb5e6f57c5a9bdaa40fa71ce7b4af30298fbab9ece9815b5d995ab6217c7d9"
            ],
            "version": "==4.3.2"
        },
        "wcwidth": {
            "hashes": [
                "sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e",
                "sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c"
            ],
            "version": "==0.1.7"
        }
    }
}

README.md (29 changed lines)

@@ -1,21 +1,23 @@
 # DSpace Statistics API [](https://travis-ci.org/ilri/dspace-statistics-api)
-DSpace versions 4.0 and up include a [REST API](https://wiki.duraspace.org/display/DSDOC5x/REST+API) that allows the repository to be queried programmatically. The API exposes information about communities, collections, items, and bitstreams, but not item views or downloads. This project contains a lightweight indexer and a web application to make the view and download statistics available via a simple REST API that can be deployed simultaneously with DSpace's own.
+DSpace stores item view and download events in a Solr "statistics" core. This information is available for use in the various DSpace user interfaces, but is not exposed externally via any APIs. The DSpace 4+ [REST API](https://wiki.duraspace.org/display/DSDOC5x/REST+API), for example, only exposes information about communities, collections, item metadata, and bitstreams.
 
-You can read more about the Solr queries used to gather the item view and download statistics on the [DSpace wiki](https://wiki.duraspace.org/display/DSPACE/Solr).
+This project contains an indexer and a [Falcon-based](https://falcon.readthedocs.io/) web application to make the statistics available via a simple REST API. You can read more about the Solr queries used to gather the item view and download statistics on the [DSpace wiki](https://wiki.duraspace.org/display/DSPACE/Solr).
 
 ## Requirements
 
 - Python 3.5+
 - PostgreSQL version 9.5+ (due to [`UPSERT` support](https://wiki.postgresql.org/wiki/UPSERT))
-- DSpace 4+ with [Solr usage statistics enabled](https://wiki.duraspace.org/display/DSDOC5x/SOLR+Statistics)
+- DSpace with [Solr usage statistics enabled](https://wiki.duraspace.org/display/DSDOC5x/SOLR+Statistics) (tested with 5.x)
 
-## Installation and Testing
+## Installation
 Create a Python virtual environment and install the dependencies:
 
-    $ python -m venv venv
-    $ . venv/bin/activate
+    $ python3 -m venv venv
+    $ source venv/bin/activate
     $ pip install -r requirements.txt
 
 ## Running
 
 Set up the environment variables for Solr and PostgreSQL:
 
     $ export SOLR_SERVER=http://localhost:8080/solr

@@ -36,6 +38,15 @@ Test to see if there are any statistics:
 
     $ curl 'http://localhost:8000/items?limit=1'
 
+## Testing
+Install development packages using pip:
+
+    $ pip install -r requirements-dev.txt
+
+Run tests:
+
+    $ pytest
+
 ## Deployment
 There are example systemd service and timer units in the `contrib` directory. The API service listens on localhost by default so you will need to expose it publicly using a web server like nginx.

@@ -71,13 +82,13 @@ The item id is the *internal* id for an item. You can get these from the standar
 
 ## Todo
 
 - Close DB connection when gunicorn shuts down gracefully
 - Better logging
-- Tests
 - Check if database exists (try/except)
 - Version API
 - Use JSON in PostgreSQL
 - Make community and collection stats available
+- Switch to [Python 3.6+ f-string syntax](https://realpython.com/python-f-strings/)
 
 ## License
 This work is licensed under the [GPLv3](https://www.gnu.org/licenses/gpl-3.0.en.html).
 
 The license allows you to use and modify the work for personal and commercial purposes, but if you distribute the work you must provide users with a means to access the source code for the version you are distributing. Read more about the [GPLv3 at TL;DR Legal](https://tldrlegal.com/license/gnu-general-public-license-v3-(gpl-3)).

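As a usage sketch to go with the README above: paging through the statistics endpoint from Python with the requests library (requests is already a dependency of the indexer). The base URL and the limit/page parameters follow the README's curl example and the AllItemsResource code below; the helper name and host are illustrative assumptions, not part of the project.

import requests

BASE_URL = 'http://localhost:8000'  # assumed, as in the README's curl example


def fetch_all_statistics(limit=100):
    '''Page through /items using the limit and page parameters.'''
    page = 0
    while True:
        res = requests.get('{}/items'.format(BASE_URL), params={'limit': limit, 'page': page})
        res.raise_for_status()
        body = res.json()

        # each element of 'statistics' has 'id', 'views', and 'downloads'
        for statistic in body['statistics']:
            yield statistic

        # 'totalPages' is the API's own page estimate; stop once we have passed it
        if page >= body['totalPages']:
            break

        page += 1


for statistic in fetch_all_statistics():
    print(statistic['id'], statistic['views'], statistic['downloads'])
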
dspace_statistics_api/app.py

@@ -1,8 +1,6 @@
-from .database import database_connection
+from .database import DatabaseManager
 import falcon
 
-db = database_connection()
-db.set_session(readonly=True)
-
+
 class RootResource:
     def on_get(self, req, resp):

@@ -11,6 +9,7 @@ class RootResource:
         with open('dspace_statistics_api/docs/index.html', 'r') as f:
             resp.body = f.read()
 
+
 class AllItemsResource:
     def on_get(self, req, resp):
         """Handles GET requests"""

@@ -19,56 +18,60 @@ class AllItemsResource:
         page = req.get_param_as_int("page", min=0) or 0
         offset = limit * page
 
-        cursor = db.cursor()
+        with DatabaseManager() as db:
+            db.set_session(readonly=True)
 
-        # get total number of items so we can estimate the pages
-        cursor.execute('SELECT COUNT(id) FROM items')
-        pages = round(cursor.fetchone()[0] / limit)
+            with db.cursor() as cursor:
+                # get total number of items so we can estimate the pages
+                cursor.execute('SELECT COUNT(id) FROM items')
+                pages = round(cursor.fetchone()[0] / limit)
 
-        # get statistics, ordered by id, and use limit and offset to page through results
-        cursor.execute('SELECT id, views, downloads FROM items ORDER BY id ASC LIMIT {} OFFSET {}'.format(limit, offset))
+                # get statistics, ordered by id, and use limit and offset to page through results
+                cursor.execute('SELECT id, views, downloads FROM items ORDER BY id ASC LIMIT {} OFFSET {}'.format(limit, offset))
 
-        # create a list to hold dicts of item stats
-        statistics = list()
+                # create a list to hold dicts of item stats
+                statistics = list()
 
-        # iterate over results and build statistics object
-        for item in cursor:
-            statistics.append({ 'id': item['id'], 'views': item['views'], 'downloads': item['downloads'] })
-
-        cursor.close()
+                # iterate over results and build statistics object
+                for item in cursor:
+                    statistics.append({'id': item['id'], 'views': item['views'], 'downloads': item['downloads']})
 
         message = {
            'currentPage': page,
            'totalPages': pages,
            'limit': limit,
            'statistics': statistics
         }
 
         resp.media = message
 
 
 class ItemResource:
     def on_get(self, req, resp, item_id):
         """Handles GET requests"""
 
-        cursor = db.cursor()
-        cursor.execute('SELECT views, downloads FROM items WHERE id={}'.format(item_id))
-        if cursor.rowcount == 0:
-            raise falcon.HTTPNotFound(
-                title='Item not found',
-                description='The item with id "{}" was not found.'.format(item_id)
-            )
-        else:
-            results = cursor.fetchone()
+        with DatabaseManager() as db:
+            db.set_session(readonly=True)
 
-            statistics = {
-                'id': item_id,
-                'views': results['views'],
-                'downloads': results['downloads']
-            }
+            with db.cursor() as cursor:
+                cursor = db.cursor()
+                cursor.execute('SELECT views, downloads FROM items WHERE id={}'.format(item_id))
+                if cursor.rowcount == 0:
+                    raise falcon.HTTPNotFound(
+                        title='Item not found',
+                        description='The item with id "{}" was not found.'.format(item_id)
+                    )
+                else:
+                    results = cursor.fetchone()
 
-            resp.media = statistics
+                    statistics = {
+                        'id': item_id,
+                        'views': results['views'],
+                        'downloads': results['downloads']
+                    }
 
-        cursor.close()
+                    resp.media = statistics
 
 api = application = falcon.API()
 api.add_route('/', RootResource())

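The limit, page, and offset handling in AllItemsResource above is easiest to see in isolation. A small illustrative sketch of the same arithmetic (the function name and sample numbers are assumptions for demonstration only, not part of the project):

def pagination_window(page, total_items, limit=100):
    # offset passed to the SQL LIMIT/OFFSET query, exactly as in AllItemsResource
    offset = limit * page
    # page estimate returned as 'totalPages', computed from SELECT COUNT(id)
    total_pages = round(total_items / limit)
    return offset, total_pages


# with 350 indexed items and the default limit of 100, pages 0-3 map to offsets 0, 100, 200, 300
for page in range(4):
    print(page, pagination_window(page, total_items=350))
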
dspace_statistics_api/config.py

@@ -7,5 +7,6 @@ DATABASE_NAME = os.environ.get('DATABASE_NAME', 'dspacestatistics')
 DATABASE_USER = os.environ.get('DATABASE_USER', 'dspacestatistics')
 DATABASE_PASS = os.environ.get('DATABASE_PASS', 'dspacestatistics')
 DATABASE_HOST = os.environ.get('DATABASE_HOST', 'localhost')
+DATABASE_PORT = os.environ.get('DATABASE_PORT', '5432')
 
 # vim: set sw=4 ts=4 expandtab:

dspace_statistics_api/database.py

@@ -2,11 +2,29 @@ from .config import DATABASE_NAME
 from .config import DATABASE_USER
 from .config import DATABASE_PASS
 from .config import DATABASE_HOST
-import psycopg2, psycopg2.extras
+from .config import DATABASE_PORT
+import falcon
+import psycopg2
+import psycopg2.extras
 
-def database_connection():
-    connection = psycopg2.connect("dbname={} user={} password={} host='{}'".format(DATABASE_NAME, DATABASE_USER, DATABASE_PASS, DATABASE_HOST), cursor_factory=psycopg2.extras.DictCursor)
 
-    return connection
+class DatabaseManager():
+    '''Manage database connection.'''
+
+    def __init__(self):
+        self._connection_uri = 'dbname={} user={} password={} host={} port={}'.format(DATABASE_NAME, DATABASE_USER, DATABASE_PASS, DATABASE_HOST, DATABASE_PORT)
+
+    def __enter__(self):
+        try:
+            self._connection = psycopg2.connect(self._connection_uri, cursor_factory=psycopg2.extras.DictCursor)
+        except psycopg2.OperationalError:
+            title = '500 Internal Server Error'
+            description = 'Could not connect to database'
+            raise falcon.HTTPInternalServerError(title, description)
+
+        return self._connection
+
+    def __exit__(self, exc_type, exc_value, exc_traceback):
+        self._connection.close()
 
 # vim: set sw=4 ts=4 expandtab:

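A minimal usage sketch for the DatabaseManager context manager added above, following the same pattern the API and indexer adopt; the query here is illustrative only:

from dspace_statistics_api.database import DatabaseManager

# __enter__ returns a psycopg2 connection created with a DictCursor factory,
# so rows can be addressed by column name; the connection's cursor() is itself
# a context manager, hence the nested "with" blocks.
with DatabaseManager() as db:
    db.set_session(readonly=True)

    with db.cursor() as cursor:
        cursor.execute('SELECT COUNT(id) FROM items')
        print('items indexed:', cursor.fetchone()[0])

# leaving the outer block calls __exit__, which closes the connection
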
dspace_statistics_api/indexer.py

@@ -29,11 +29,59 @@
 # See: https://solrclient.readthedocs.io/en/latest/SolrClient.html
 # See: https://wiki.duraspace.org/display/DSPACE/Solr
 
-from .database import database_connection
+from .database import DatabaseManager
 import json
 import psycopg2.extras
+import re
+import requests
 from .solr import solr_connection
 
 
+# Enumerate the cores in Solr to determine if statistics have been sharded into
+# yearly shards by DSpace's stats-util or not (for example: statistics-2018).
+def get_statistics_shards():
+    # Initialize an empty list for statistics core years
+    statistics_core_years = []
+
+    # URL for Solr status to check active cores
+    solr_url = solr.host + '/admin/cores?action=STATUS&wt=json'
+    res = requests.get(solr_url)
+
+    if res.status_code == requests.codes.ok:
+        data = res.json()
+
+        # Iterate over active cores from Solr's STATUS response (cores are in
+        # the status array of this response).
+        for core in data['status']:
+            # Pattern to match, for example: statistics-2018
+            pattern = re.compile('^statistics-[0-9]{4}$')
+
+            if not pattern.match(core):
+                continue
+
+            # Append current core to list
+            statistics_core_years.append(core)
+
+    # Initialize a string to hold our shards (may end up being empty if the Solr
+    # core has not been processed by stats-util).
+    shards = str()
+
+    if len(statistics_core_years) > 0:
+        # Begin building a string of shards starting with the default one
+        shards = '{}/statistics'.format(solr.host)
+
+        for core in statistics_core_years:
+            # Create a comma-separated list of shards to pass to our Solr query
+            #
+            # See: https://wiki.apache.org/solr/DistributedSearch
+            shards += ',{}/{}'.format(solr.host, core)
+
+    # Return the string of shards, which may actually be empty. Solr doesn't
+    # seem to mind if the shards query parameter is empty and I haven't seen
+    # any negative performance impact so this should be fine.
+    return shards
+
+
 def index_views():
     # get total number of distinct facets for items with a minimum of 1 view,
     # otherwise Solr returns all kinds of weird ids that are actually not in

@@ -42,131 +90,147 @@ def index_views():
     #
     # see: https://lucene.apache.org/solr/guide/6_6/the-stats-component.html
     res = solr.query('statistics', {
-        'q':'type:2',
-        'fq':'isBot:false AND statistics_type:view',
-        'facet':True,
-        'facet.field':'id',
-        'facet.mincount':1,
-        'facet.limit':1,
-        'facet.offset':0,
-        'stats':True,
-        'stats.field':'id',
-        'stats.calcdistinct':True
+        'q': 'type:2',
+        'fq': 'isBot:false AND statistics_type:view',
+        'facet': True,
+        'facet.field': 'id',
+        'facet.mincount': 1,
+        'facet.limit': 1,
+        'facet.offset': 0,
+        'stats': True,
+        'stats.field': 'id',
+        'stats.calcdistinct': True,
+        'shards': shards
     }, rows=0)
 
-    # get total number of distinct facets (countDistinct)
-    results_totalNumFacets = json.loads(res.get_json())['stats']['stats_fields']['id']['countDistinct']
+    try:
+        # get total number of distinct facets (countDistinct)
+        results_totalNumFacets = json.loads(res.get_json())['stats']['stats_fields']['id']['countDistinct']
+    except TypeError:
+        print('No item views to index, exiting.')
+
+        exit(0)
 
     # divide results into "pages" (cast to int to effectively round down)
     results_per_page = 100
     results_num_pages = int(results_totalNumFacets / results_per_page)
     results_current_page = 0
 
-    cursor = db.cursor()
+    with DatabaseManager() as db:
+        with db.cursor() as cursor:
+            # create an empty list to store values for batch insertion
+            data = []
 
-    # create an empty list to store values for batch insertion
-    data = []
+            while results_current_page <= results_num_pages:
+                print('Indexing item views (page {} of {})'.format(results_current_page, results_num_pages))
 
-    while results_current_page <= results_num_pages:
-        print('Indexing item views (page {} of {})'.format(results_current_page, results_num_pages))
+                res = solr.query('statistics', {
+                    'q': 'type:2',
+                    'fq': 'isBot:false AND statistics_type:view',
+                    'facet': True,
+                    'facet.field': 'id',
+                    'facet.mincount': 1,
+                    'facet.limit': results_per_page,
+                    'facet.offset': results_current_page * results_per_page,
+                    'shards': shards
+                }, rows=0)
 
-        res = solr.query('statistics', {
-            'q':'type:2',
-            'fq':'isBot:false AND statistics_type:view',
-            'facet':True,
-            'facet.field':'id',
-            'facet.mincount':1,
-            'facet.limit':results_per_page,
-            'facet.offset':results_current_page * results_per_page
-        }, rows=0)
+                # SolrClient's get_facets() returns a dict of dicts
+                views = res.get_facets()
+                # in this case iterate over the 'id' dict and get the item ids and views
+                for item_id, item_views in views['id'].items():
+                    data.append((item_id, item_views))
 
-        # SolrClient's get_facets() returns a dict of dicts
-        views = res.get_facets()
-        # in this case iterate over the 'id' dict and get the item ids and views
-        for item_id, item_views in views['id'].items():
-            data.append((item_id, item_views))
+                # do a batch insert of values from the current "page" of results
+                sql = 'INSERT INTO items(id, views) VALUES %s ON CONFLICT(id) DO UPDATE SET views=excluded.views'
+                psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)')
+                db.commit()
 
-        # do a batch insert of values from the current "page" of results
-        sql = 'INSERT INTO items(id, views) VALUES %s ON CONFLICT(id) DO UPDATE SET views=excluded.views'
-        psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)')
-        db.commit()
+                # clear all items from the list so we can populate it with the next batch
+                data.clear()
 
-        # clear all items from the list so we can populate it with the next batch
-        data.clear()
+                results_current_page += 1
 
-        results_current_page += 1
-
-    cursor.close()
 
 def index_downloads():
     # get the total number of distinct facets for items with at least 1 download
     res = solr.query('statistics', {
-        'q':'type:0',
-        'fq':'isBot:false AND statistics_type:view AND bundleName:ORIGINAL',
-        'facet':True,
-        'facet.field':'owningItem',
-        'facet.mincount':1,
-        'facet.limit':1,
-        'facet.offset':0,
-        'stats':True,
-        'stats.field':'owningItem',
-        'stats.calcdistinct':True
+        'q': 'type:0',
+        'fq': 'isBot:false AND statistics_type:view AND bundleName:ORIGINAL',
+        'facet': True,
+        'facet.field': 'owningItem',
+        'facet.mincount': 1,
+        'facet.limit': 1,
+        'facet.offset': 0,
+        'stats': True,
+        'stats.field': 'owningItem',
+        'stats.calcdistinct': True,
+        'shards': shards
    }, rows=0)
 
-    # get total number of distinct facets (countDistinct)
-    results_totalNumFacets = json.loads(res.get_json())['stats']['stats_fields']['owningItem']['countDistinct']
+    try:
+        # get total number of distinct facets (countDistinct)
+        results_totalNumFacets = json.loads(res.get_json())['stats']['stats_fields']['owningItem']['countDistinct']
+    except TypeError:
+        print('No item downloads to index, exiting.')
+
+        exit(0)
 
     # divide results into "pages" (cast to int to effectively round down)
     results_per_page = 100
     results_num_pages = int(results_totalNumFacets / results_per_page)
     results_current_page = 0
 
-    cursor = db.cursor()
+    with DatabaseManager() as db:
+        with db.cursor() as cursor:
+            # create an empty list to store values for batch insertion
+            data = []
 
-    # create an empty list to store values for batch insertion
-    data = []
+            while results_current_page <= results_num_pages:
+                print('Indexing item downloads (page {} of {})'.format(results_current_page, results_num_pages))
 
-    while results_current_page <= results_num_pages:
-        print('Indexing item downloads (page {} of {})'.format(results_current_page, results_num_pages))
+                res = solr.query('statistics', {
+                    'q': 'type:0',
+                    'fq': 'isBot:false AND statistics_type:view AND bundleName:ORIGINAL',
+                    'facet': True,
+                    'facet.field': 'owningItem',
+                    'facet.mincount': 1,
+                    'facet.limit': results_per_page,
+                    'facet.offset': results_current_page * results_per_page,
+                    'shards': shards
+                }, rows=0)
 
-        res = solr.query('statistics', {
-            'q':'type:0',
-            'fq':'isBot:false AND statistics_type:view AND bundleName:ORIGINAL',
-            'facet':True,
-            'facet.field':'owningItem',
-            'facet.mincount':1,
-            'facet.limit':results_per_page,
-            'facet.offset':results_current_page * results_per_page
-        }, rows=0)
+                # SolrClient's get_facets() returns a dict of dicts
+                downloads = res.get_facets()
+                # in this case iterate over the 'owningItem' dict and get the item ids and downloads
+                for item_id, item_downloads in downloads['owningItem'].items():
+                    data.append((item_id, item_downloads))
 
-        # SolrClient's get_facets() returns a dict of dicts
-        downloads = res.get_facets()
-        # in this case iterate over the 'owningItem' dict and get the item ids and downloads
-        for item_id, item_downloads in downloads['owningItem'].items():
-            data.append((item_id, item_downloads))
+                # do a batch insert of values from the current "page" of results
+                sql = 'INSERT INTO items(id, downloads) VALUES %s ON CONFLICT(id) DO UPDATE SET downloads=excluded.downloads'
+                psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)')
+                db.commit()
 
-        # do a batch insert of values from the current "page" of results
-        sql = 'INSERT INTO items(id, downloads) VALUES %s ON CONFLICT(id) DO UPDATE SET downloads=excluded.downloads'
-        psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)')
-        db.commit()
+                # clear all items from the list so we can populate it with the next batch
+                data.clear()
 
-        # clear all items from the list so we can populate it with the next batch
-        data.clear()
+                results_current_page += 1
 
-        results_current_page += 1
 
-    cursor.close()
-
-db = database_connection()
 solr = solr_connection()
 
-# create table to store item views and downloads
-cursor = db.cursor()
-cursor.execute('''CREATE TABLE IF NOT EXISTS items
-                  (id INT PRIMARY KEY, views INT DEFAULT 0, downloads INT DEFAULT 0)''')
-
-# commit the table creation before closing the database connection
-db.commit()
+with DatabaseManager() as db:
+    with db.cursor() as cursor:
+        # create table to store item views and downloads
+        cursor.execute('''CREATE TABLE IF NOT EXISTS items
+                          (id INT PRIMARY KEY, views INT DEFAULT 0, downloads INT DEFAULT 0)''')
+
+        # commit the table creation before closing the database connection
+        db.commit()
+
+shards = get_statistics_shards()
 
 index_views()
 index_downloads()
 
-db.close()
-
 # vim: set sw=4 ts=4 expandtab:

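To make the shards string concrete: when DSpace's stats-util has split statistics into yearly cores, get_statistics_shards() returns a comma-separated list of core URLs, which is then passed straight through as the shards parameter in the Solr queries above. A small sketch with assumed host and core names:

# assumed values for illustration; in the indexer, solr.host comes from SolrClient
solr_host = 'http://localhost:8080/solr'
statistics_core_years = ['statistics-2017', 'statistics-2018']

# same string building as get_statistics_shards()
shards = '{}/statistics'.format(solr_host)
for core in statistics_core_years:
    shards += ',{}/{}'.format(solr_host, core)

print(shards)
# -> http://localhost:8080/solr/statistics,http://localhost:8080/solr/statistics-2017,http://localhost:8080/solr/statistics-2018
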
dspace_statistics_api/solr.py

@@ -1,6 +1,7 @@
 from .config import SOLR_SERVER
 from SolrClient import SolrClient
 
+
 def solr_connection():
     connection = SolrClient(SOLR_SERVER)
 

pytest.ini (new file, 4 lines)

@@ -0,0 +1,4 @@
[pytest]
addopts= -rsxX -s -v --strict
filterwarnings =
    error::UserWarning

requirements-dev.txt (new file, 25 lines)

@@ -0,0 +1,25 @@
-i https://pypi.org/simple
atomicwrites==1.2.1
attrs==18.2.0
backcall==0.1.0
decorator==4.3.0
flake8==3.6.0
ipython-genutils==0.2.0
ipython==7.2.0
jedi==0.13.2
mccabe==0.6.1
more-itertools==5.0.0
parso==0.3.1
pexpect==4.6.0 ; sys_platform != 'win32'
pickleshare==0.7.5
pluggy==0.8.1
prompt-toolkit==2.0.7
ptyprocess==0.6.0
py==1.7.0
pycodestyle==2.4.0
pyflakes==2.0.0
pygments==2.3.1
pytest==4.1.1
six==1.12.0
traitlets==4.3.2
wcwidth==0.1.7

requirements.txt

@@ -1,12 +1,7 @@
-certifi==2018.10.15
-chardet==3.0.4
+-i https://pypi.org/simple
 falcon==1.4.1
+git+https://github.com/alanorth/SolrClient.git@c629e3475be37c82770b2be61748be7e29882648#egg=solrclient
 gunicorn==19.9.0
-idna==2.7
-kazoo==2.5.0
-psycopg2-binary==2.7.5
+psycopg2-binary==2.7.6.1
 python-mimeparse==1.6.0
-requests==2.20.0
-six==1.11.0
--e git://github.com/alanorth/SolrClient.git@c629e3475be37c82770b2be61748be7e29882648#egg=SolrClient
-urllib3==1.24
+six==1.12.0

tests/__init__.py (new file, 0 lines)

tests/dspacestatistics.sql (new file, 77226 lines)
File diff suppressed because it is too large

tests/test_api.py (new file, 67 lines)

@@ -0,0 +1,67 @@
from falcon import testing
import json
import pytest

from dspace_statistics_api.app import api


@pytest.fixture
def client():
    return testing.TestClient(api)


def test_get_docs(client):
    '''Test requesting the documentation at the root.'''

    response = client.simulate_get('/')

    assert isinstance(response.content, bytes)
    assert response.status_code == 200


def test_get_item(client):
    '''Test requesting a single item.'''

    response = client.simulate_get('/item/17')
    response_doc = json.loads(response.text)

    assert isinstance(response_doc['downloads'], int)
    assert isinstance(response_doc['id'], int)
    assert isinstance(response_doc['views'], int)
    assert response.status_code == 200


def test_get_missing_item(client):
    '''Test requesting a single non-existing item.'''

    response = client.simulate_get('/item/1')

    assert response.status_code == 404


def test_get_items(client):
    '''Test requesting 100 items.'''

    response = client.simulate_get('/items', query_string='limit=100')
    response_doc = json.loads(response.text)

    assert isinstance(response_doc['currentPage'], int)
    assert isinstance(response_doc['totalPages'], int)
    assert isinstance(response_doc['statistics'], list)
    assert response.status_code == 200


def test_get_items_invalid_limit(client):
    '''Test requesting 100 items with an invalid limit parameter.'''

    response = client.simulate_get('/items', query_string='limit=101')

    assert response.status_code == 400


def test_get_items_invalid_page(client):
    '''Test requesting 100 items with an invalid page parameter.'''

    response = client.simulate_get('/items', query_string='page=-1')

    assert response.status_code == 400