mirror of https://github.com/ilri/dspace-statistics-api.git synced 2024-11-22 22:35:06 +01:00
Commit Graph

219 Commits

Author SHA1 Message Date
b7d723ef7c
README.md: Fix sentence 2019-01-22 14:23:13 +02:00
914ec52fbb
CHANGELOG.md: Move unreleased changes to 0.9.0 2019-01-22 09:02:29 +02:00
5524066656
CHANGELOG.md: Add note about catching errors 2019-01-22 09:01:54 +02:00
043d897cef
dspace_statistics_api/indexer.py: Catch case of no views/downloads
Don't fail with an exception when there are no views or downloads,
for example on a new DSpace installation.
2019-01-22 09:00:22 +02:00
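A minimal sketch of the guard described in the commit above, assuming the facet counts come back as a dict keyed by item id; the function and variable names are illustrative, not the project's actual code:

    # Skip indexing gracefully when the statistics core has no view or
    # download facets yet, for example on a brand new DSpace installation.
    def index_views(facets):
        views = facets.get("id")
        if not views:
            print("No item views to index, skipping.")
            return

        for item_id, count in views.items():
            # the real indexer would write the count to PostgreSQL here
            print("item {}: {} views".format(item_id, count))

    index_views({})                    # new installation: nothing to do
    index_views({"id": {"1234": 56}})  # normal case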
bd28353cda
README.md: Remove TODO for fixing querying of shards 2019-01-22 08:41:39 +02:00
e23d66c2a2
CHANGELOG.md: Add note about fixing querying of sharded cores 2019-01-22 08:41:31 +02:00
40e284dac0
dspace_statistics_api/indexer.py: Query multiple shards
DSpace's stats-util script splits the Solr statistics core into yearly
shards. We need to use Solr's `shards` query parameter in order to get
the statistics for previous years. This commit adds a helper function
that enumerates the active Solr cores, finds yearly shards matching the
statistics-YYYY pattern, and adds them to the query.
2019-01-22 08:39:36 +02:00
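A hedged sketch of such a helper using Solr's CoreAdmin STATUS API; the Solr base URL and the use of requests here are assumptions, not necessarily how the project implements it:

    import re
    import requests

    SOLR_SERVER = "http://localhost:8080/solr"  # assumed Solr base URL

    def get_statistics_shards():
        """Build a comma-separated shard list for Solr's `shards` parameter,
        for example: .../solr/statistics-2017,.../solr/statistics-2018
        """
        res = requests.get(SOLR_SERVER + "/admin/cores",
                           params={"action": "STATUS", "wt": "json"})
        cores = res.json()["status"].keys()
        # keep only the yearly shards named statistics-YYYY
        shards = [SOLR_SERVER + "/" + core for core in cores
                  if re.match(r"^statistics-\d{4}$", core)]
        return ",".join(shards)

The resulting string would then be passed as the `shards` parameter alongside the normal statistics query so that Solr fans the request out to the yearly cores.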
934fa9db9b
README.md: Add TODO about sharded statistics cores 2019-01-21 12:55:43 +02:00
1fabb72b58
Update requirements
Generated from pipenv with:

  $ pipenv lock -r > requirements.txt
  $ pipenv lock -r -d > requirements-dev.txt
2019-01-16 12:34:50 +02:00
c7f95f0b60
README.md: Update TODO
I think it might be possible to compute community and collection
statistics from Solr and make them available at new endpoints:

  - /communities
  - /community/id
  - /collections
  - /collection/id
2019-01-16 09:59:29 +02:00
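A speculative sketch of how such endpoints could be routed with Falcon (the framework this project already uses); the resource classes here are hypothetical placeholders, not existing code:

    import falcon

    class AllCommunitiesResource:
        def on_get(self, req, resp):
            # would be computed from the Solr statistics core
            resp.media = {"communities": []}

    class CommunityResource:
        def on_get(self, req, resp, community_id):
            resp.media = {"id": community_id, "views": 0, "downloads": 0}

    api = falcon.API()
    api.add_route("/communities", AllCommunitiesResource())
    api.add_route("/community/{community_id}", CommunityResource())
    # /collections and /collection/{collection_id} would follow the same pattern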
c95a98dd2d
Pipfile.lock: update dependencies
Updated with `pipenv update`.
2019-01-15 10:22:46 +02:00
3f70f94a10
Pipfile.lock: Run pipenv update 2018-11-26 11:53:37 +02:00
9b8ad9defd
Merge pull request #9 from ilri/pipenv-update
Pipenv update
2018-11-19 23:50:44 +02:00
d69ab20220
CHANGELOG.md: pytest version 4.0.0 2018-11-19 23:46:03 +02:00
378f56ddc2
Pipfile.lock: Run pipenv update 2018-11-19 23:34:34 +02:00
5a2a7d684c
CHANGELOG.md: Move unreleased changes to version 0.8.1 2018-11-14 09:37:00 +02:00
18276e910f
CHANGELOG.md: Add notes about pipenv 2018-11-14 09:36:13 +02:00
8de8c2765f
Merge pull request #8 from ilri/update-dependencies
Update dependencies
2018-11-14 09:34:45 +02:00
11a1755e59
Update requirements.txt
Generated from pipenv with:

    $ pipenv lock -r > requirements.txt
2018-11-14 09:19:47 +02:00
a835b0fdc5
Re-create pipenv environment from scratch
When I originally created the pipenv environment I used the standard
pip requirements.txt that I already had, which captured all the modules
and their exact versions at the time. This makes it hard to separate
the project's actual dependencies from the dependencies' dependencies,
complicating the Pipfile and making it hard to update module versions
later.

I've re-created the environment with the following commands:

    $ pipenv install gunicorn falcon psycopg2-binary git+https://github.com/alanorth/SolrClient.git@kazoo-2.5.0#egg=SolrClient
    $ pipenv install --dev ipython flake8 pytest
2018-11-14 09:07:32 +02:00
a88600c92b
README.md: Add note about GPLv3 2018-11-13 12:34:31 +02:00
019d9242c9
Merge pull request #7 from ilri/use-pip
Rework to use pip instead of pipenv
2018-11-12 09:17:16 +02:00
f4d7312a3f
CHANGELOG.md: Add unreleased changes 2018-11-12 09:02:04 +02:00
9c46cfc7e2
Use Python 3.7 for pipenv
Now that I'm only using pipenv locally it shouldn't create problems
for people. They can still just create a vanilla virtualenv and use
pip to install the dependencies.
2018-11-12 08:54:54 +02:00
c1c2e319ac
README.md: Rework to use pip instead of pipenv
Pipenv is great for local development, but I don't think many people
are using it yet. I can use it locally and on Travis, but still keep
vanilla requirements.txt for use with pip. The requirements.txt file
can be generated easily from pipenv itself:

    $ pipenv lock -r > requirements.txt

The same for the development requirements:

    $ pipenv lock -r -d > requirements-dev.txt
2018-11-12 08:49:02 +02:00
0895b4f469
Add requirements-dev.txt for pip
Generated with pipenv lock -r -d. Will be used for separating the
development dependencies.
2018-11-12 08:48:45 +02:00
dcfef06a65
Pipfile.lock: Run pipenv update 2018-11-12 08:20:47 +02:00
13736d6359
CHANGELOG.md: Move unreleased changes to version 0.8.0 2018-11-11 17:16:26 +02:00
4fc64edeb8
Merge pull request #6 from ilri/pytest
Pytest
2018-11-11 17:14:49 +02:00
2a8901dc4f
CHANGELOG.md: Update notes 2018-11-11 17:10:45 +02:00
e25c974796
README.md: We have tests now 2018-11-11 17:08:51 +02:00
ffc62e9ee6
tests/test_api.py: Use response.text for all json.loads()
This allows the code to work in Python 3.5 as well as 3.6+.
2018-11-11 17:05:31 +02:00
556c5ae088
tests/test_api.py: Use response.text instead of content
Falcon's response content is raw bytes, while its text is a string.
Let's use the latter so we can use json.loads() in Python 3.5, 3.6,
and 3.7 with the same code.

See: https://falcon.readthedocs.io/en/stable/api/testing.html
2018-11-11 17:01:17 +02:00
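A minimal sketch of that pattern with Falcon's test client; the `/items` route and the application import are assumptions about the project layout:

    import json

    from falcon import testing

    from dspace_statistics_api.app import api  # assumed application object

    def test_get_items():
        client = testing.TestClient(api)
        response = client.simulate_get("/items")

        # response.content is raw bytes; response.text is a str, which
        # json.loads() accepts on Python 3.5 as well as 3.6+
        body = json.loads(response.text)
        assert isinstance(body["items"], list)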
d94134f80a
tests/test_api.py: Try to add workaround for Python 3.5
In Python 3.5 it seems that json.loads() cannot decode a bytes object,
it works in Python 3.6 and 3.7. Let's try a workaround to see if we
can get it working on both Python 3.5 and 3.6+.

See: https://docs.python.org/3.5/library/json.html#json.loads
See: https://docs.python.org/3.6/library/json.html#json.loads
2018-11-11 17:00:20 +02:00
586231eb2d
.travis.yml: Use PostgreSQL 9.5
The default PostgreSQL in Travis CI is 9.2, which is very old, so let's
try to use 9.5.

See: https://docs.travis-ci.com/user/database-setup/#postgresql
2018-11-11 16:41:35 +02:00
766b77a3b6
.travis.yml: Use PostgreSQL directly
It seems that Travis CI already has a PostgreSQL service running.
2018-11-11 16:35:28 +02:00
1959e8154e
.travis.yml: Use localhost for Docker's PostgreSQL ports
See: https://docs.travis-ci.com/user/docker/
2018-11-11 16:28:18 +02:00
d40b2f0b2e Test API using pytest and PostgreSQL on Travis
First attempt at getting the Travis Docker setup correct. Inspired
by the Travis pipenv setup used in Responder.

See: https://docs.travis-ci.com/user/docker/
See: https://github.com/kennethreitz/responder/blob/master/.travis.yml
2018-11-11 16:25:16 +02:00
061d0a8f5f CHANGELOG.md: Add API tests to unreleased changes 2018-11-11 16:24:54 +02:00
e57660ff88 Add initial pytest configuration
From: https://github.com/kennethreitz/responder/blob/master/pytest.ini
2018-11-11 16:24:54 +02:00
5c8756bede Add pytest to pipenv development packages 2018-11-11 16:24:54 +02:00
bae9fb80e4 Add initial API tests
Test the basic assumptions of the API like response codes and types.
2018-11-11 16:24:54 +02:00
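An illustrative pytest sketch of tests like these, checking the response codes; the routes and the application import are assumptions about the project layout:

    from falcon import testing

    from dspace_statistics_api.app import api  # assumed application object

    def test_get_missing_item():
        client = testing.TestClient(api)
        # an id that should not exist returns a clean 404 rather than an exception
        response = client.simulate_get("/item/999999999")
        assert response.status_code == 404

    def test_get_items_ok():
        client = testing.TestClient(api)
        response = client.simulate_get("/items")
        assert response.status_code == 200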
8a65d99e08
.travis.yml: Don't limit builds to master
This is good in theory but it means we can't trigger builds for other
branches on the fly from the Travis web interface.
2018-11-11 16:21:48 +02:00
d479b7dc6c
CHANGELOG.md: Syntax fixes 2018-11-11 00:08:44 +02:00
40aac8bf89
Merge pull request #5 from ilri/database-error-handling
Database error handling
2018-11-11 00:07:25 +02:00
53ba6f2936
CHANGELOG.md: Add database try/except to unreleased changes 2018-11-11 00:05:49 +02:00
140cc4cb07
README.md: Remove TODO for database try/except
Now database connection errors are properly caught and raised as HTTP errors.
2018-11-11 00:04:28 +02:00
d5d2d2149b
dspace_statistics_api/database.py: Raise HTTP 500 on error
Properly catch database connection errors and raise an HTTP 500
instead of spamming the console/log with twenty lines of text.
2018-11-10 23:58:58 +02:00
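A hedged sketch of the idea: catch the psycopg2 connection error and turn it into a clean HTTP 500; the connection string and function name are assumptions, not the project's actual configuration:

    import falcon
    import psycopg2

    def database_connection():
        try:
            return psycopg2.connect(
                "dbname=dspacestatistics user=dspacestatistics host=localhost"
            )
        except psycopg2.OperationalError:
            # one tidy 500 response instead of a long traceback in the log
            raise falcon.HTTPInternalServerError(
                "Internal Server Error", "Unable to connect to the database."
            )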
4c51d12eb4
CHANGELOG.md: Move unreleased changes to version 0.7.0 2018-11-07 17:55:01 +02:00
a6ce44e852
Merge pull request #4 from ilri/database-refactor
Database refactor
2018-11-07 17:54:04 +02:00