mirror of https://github.com/ilri/csv-metadata-quality.git synced 2025-05-09 22:56:01 +02:00

251 Commits

Author SHA1 Message Date
ec95f69074 chore(deps): update dependency numpy to v2.2.4 2025-03-16 21:12:45 +00:00
b7a81b8ec7 csv_metadata_quality/fix.py: minor logic fix
Minor logic fix in missing regions.
2025-02-19 16:01:36 +03:00
8a2c567d1f Version 0.7.0 2025-01-31 10:12:43 +03:00
42eb9437e3 pyproject.toml: bump a few deps 2025-01-31 10:11:38 +03:00
5400bcb19b Remove pytest-clarity dependency
I think pytest itself has gotten much more readable over the years.
2025-01-31 10:10:33 +03:00
febea54f1b Remove poetry.lock
We switched to rye and then uv, so this is not needed.
2025-01-31 10:03:03 +03:00
b5565124de .github/workflows: fix pytest task 2025-01-30 11:56:27 +03:00
2869919507 pyproject.toml: minimum Python version is 3.10
Apparently this is due to ipython.
2025-01-30 11:55:17 +03:00
f7d66947f7 Use uv instead of rye 2025-01-30 11:55:09 +03:00
1d701f4056 renovate.json: use pep621 manager
We are not using pip requirements files, but I'm wondering if we
need to switch to uv instead.

See: https://docs.renovatebot.com/modules/manager/pep621/
2025-01-30 11:38:49 +03:00
1e339609a6 chore(config): migrate config renovate.json 2025-01-30 11:34:33 +03:00
2b0568de30 .github/workflows: use ubuntu-latest
This is now Ubuntu 24.04.

See: https://github.com/actions/runner-images
2025-01-30 11:28:32 +03:00
9903ada97a pyproject.toml: update isort to v6.0.0 2025-01-30 11:27:51 +03:00
d4b20e282c Python 3.13 2025-01-30 11:27:19 +03:00
9785c18301 Merge pull request #64 from ilri/renovate/ftfy-6.x 2024-10-11 20:02:53 +03:00
de5e292f1a chore(deps): update dependency ftfy to ~=6.3.0 2024-10-11 00:32:38 +00:00
2675cd288e Remove and re-sync rye requirements
Seems there is no way to force rye to sync?
2024-09-26 12:24:36 +03:00
78dc1336d0 requirements-dev.lock: run rye sync 2024-09-26 12:22:09 +03:00
28bbb919ce pyproject.toml: remove fixit from deps
I read about this online and was testing it but never used it.
2024-09-06 15:13:59 +03:00
b1de9552a4 Remove Mjanja CI (drone) 2024-09-06 12:54:16 +03:00
81e3ca3d9c .github/workflows: use rye in CI
Use rye instead of poetry in CI.
2024-08-21 18:56:09 +03:00
c470f8b375 requirements-dev.lock: run rye sync 2024-08-21 17:41:49 +03:00
0f45448517 pyproject.toml: bump dev dependencies 2024-08-21 17:41:36 +03:00
7dd52ca491 requirements: run rye sync 2024-07-29 19:58:42 -07:00
92ff0ee51b Normalize DOIs with %2f
These seem to be incorrectly URL encoded.
2024-06-25 11:54:09 +03:00
ae38a826ec csv_metadata_quality/fix.py: minor update to DOI fix
Normalize www.doi.org to doi.org in DOI field.
2024-06-25 11:48:45 +03:00
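The combined effect of these two DOI fixes can be sketched as follows; normalize_doi is a hypothetical helper for illustration, not the project's actual function:

    from urllib.parse import unquote

    def normalize_doi(doi: str) -> str:
        # Hypothetical sketch: decode percent-encoded slashes (%2F -> /)
        # and normalize the www.doi.org host to doi.org
        doi = unquote(doi)
        return doi.replace("https://www.doi.org/", "https://doi.org/")

    print(normalize_doi("https://www.doi.org/10.1016%2Fj.worlddev.2010.06.006"))
    # https://doi.org/10.1016/j.worlddev.2010.06.006
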
c1f630c298 Bump dependencies
All tests passing...
2024-06-18 22:17:38 +03:00
82b056f0ea Use py3langid v0.3.0 2024-06-18 21:51:32 +03:00
7fca981b95 Add .python-version
This was created with:

    rye pin --relaxed cpython@3.12

Rye will now always use the latest 3.12.x apparently.

See: https://rye-up.com/guide/toolchains/
2024-05-23 10:11:10 +03:00
1a9424197b Run rye lock --update-all and rye sync 2024-05-23 10:04:43 +03:00
f6c6c94a1e pyproject.toml: use ~= for dependencies
These are the closest to semantic versioning in Python that I can
find with PEP 621 syntax. For example:

> ~=3.1: version 3.1 or later, but not version 4.0 or later.
> ~=3.1.2: version 3.1.2 or later, but not version 3.2.0 or later.

For most cases I want to bump the minor and micro / patch.

See: https://packaging.python.org/en/latest/specifications/version-specifiers/#examples
2024-05-23 10:01:46 +03:00
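Those two specifier examples are easy to verify with the third-party packaging library:

    from packaging.specifiers import SpecifierSet

    # ~=3.1 means >=3.1, <4.0
    print(SpecifierSet("~=3.1").contains("3.9"))      # True
    print(SpecifierSet("~=3.1").contains("4.0"))      # False

    # ~=3.1.2 means >=3.1.2, <3.2.0
    print(SpecifierSet("~=3.1.2").contains("3.1.9"))  # True
    print(SpecifierSet("~=3.1.2").contains("3.2.0"))  # False
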
f500fac64b pyproject.toml: remove scalene from dev deps 2024-05-23 10:00:01 +03:00
8143a7d978 pyproject.toml: align better with PEP 621
This PEP was approved years ago and has become a standard for the
way pyproject.toml file is laid out. We need to make some changes
to the license, URLs, add classifiers, etc.

See: https://peps.python.org/pep-0621/
2024-05-23 09:44:16 +03:00
94cec080d6 pyproject.toml: remove Hatch direct-references
Apparently I copied this from somewhere but it's not needed in this
project because we are not using direct dependency references (which
seem to be local packages).
2024-05-23 09:43:08 +03:00
9402af1e30 pyproject.toml: add comment about packages
Important for Hatch.
2024-05-23 09:42:11 +03:00
d71ff9082b pyproject.toml: add comment about backend 2024-05-23 09:41:08 +03:00
f309b694c4 Run rye sync to update lockfiles 2024-05-22 23:19:20 +03:00
4d879f6d13 pyproject.toml: remove black
rye bundles ruff so we can use that instead via `rye fmt`.
2024-05-22 23:19:20 +03:00
a30fefcd52 pyproject.toml: update formatting
rye requires a slightly different formatting.
2024-05-22 23:19:14 +03:00
2341c56c40 poetry.lock: run poetry update 2024-04-25 12:50:30 +03:00
5be2195325 Add fix for normalizing DOIs 2024-04-25 12:49:19 +03:00
736948ed2c csv_metadata_quality/check.py: run rye fmt 2024-04-12 13:40:55 +03:00
ee0b448355 csv_metadata_quality/check.py: remove unused import 2024-04-12 11:07:36 +03:00
4f3174a543 CHANGELOG.md: add note about SPDX license list
2024-03-02 10:39:00 +03:00
d5c25f82fa Update SPDX license list
From: https://github.com/spdx/license-list-data/blob/main/json/licenses.json
2024-03-02 10:38:27 +03:00
7b3e2b4e68 Merge pull request #43 from ilri/renovate/pytest-7.x-lockfile
chore(deps): update dependency pytest to v7.4.4
2024-01-05 16:40:13 +03:00
f92b2fe206 Merge pull request #44 from ilri/renovate/flake8-7.x
chore(deps): update dependency flake8 to v7
2024-01-05 16:25:22 +03:00
df040b70c7 chore(deps): update dependency flake8 to v7
2024-01-05 00:58:28 +00:00
10bc8f3e14 chore(deps): update dependency pytest to v7.4.4
2023-12-31 13:47:46 +00:00
7e6e92ecaa poetry.lock: run poetry lock
2023-12-28 14:12:03 +03:00
a21ffb0fa8 Use py3langid instead of langid
Faster and more modern code for Python 3 as a drop-in replacement.

See: https://adrien.barbaresi.eu/blog/language-detection-langid-py-faster.html
2023-12-28 14:11:21 +03:00
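Since it is a drop-in replacement, only the import changes; a minimal usage sketch:

    import py3langid as langid

    # Same classify() API as the original langid
    lang, confidence = langid.classify("This text is in English.")
    print(lang)  # 'en'
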
fb341dd9fa Merge pull request #37 from ilri/renovate/actions-setup-python-5.x
chore(deps): update actions/setup-python action to v5
2023-12-28 09:02:41 +03:00
2e943ee4db Merge pull request #39 from ilri/renovate/isort-5.x-lockfile
chore(deps): update dependency isort to v5.13.2
2023-12-28 09:01:48 +03:00
6d3a9870d6 Merge pull request #41 from ilri/renovate/pycountry-23.x-lockfile
fix(deps): update dependency pycountry to v23.12.11
2023-12-28 09:01:21 +03:00
82ecf7119a Merge pull request #42 from ilri/renovate/black-23.x-lockfile
chore(deps): update dependency black to v23.12.1
2023-12-28 09:00:39 +03:00
1db21cf275 chore(deps): update dependency black to v23.12.1
2023-12-23 00:35:13 +00:00
bcd1408798 chore(deps): update dependency isort to v5.13.2
2023-12-13 22:21:38 +00:00
ee8d255811 fix(deps): update dependency pycountry to v23.12.11
2023-12-11 21:50:09 +00:00
2cc2dbe952 tests: apply fixes from fixit
RewriteToLiteral: It's slower to call list() than using the empty literal
2023-12-09 12:20:35 +03:00
940a325d61 poetry.lock: run poetry lock 2023-12-09 12:05:26 +03:00
59b3b307c9 pyproject.toml: use official pycountry
The project is moving again and has all the latest data from the
iso-codes project.
2023-12-09 12:04:14 +03:00
b305da3f0b poetry.lock: run poetry update
2023-12-07 17:10:01 +03:00
96a486471c Update actions/setup-python action to v5
2023-12-06 13:13:11 +00:00
530cd5863b poetry.lock: run poetry update
2023-11-22 22:07:30 +03:00
f6018c51b6 Apply fixes from fixit
Apply recommended fix from fixit:

    RewriteToLiteral: It's slower to call list() than using the empty literal, because the name list must
    be looked up in the global scope in case it has been rebound.
2023-11-22 21:54:50 +03:00
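The fix itself, before and after:

    # Before: the name `list` is resolved in the global scope at runtime
    data = list()

    # After: the empty literal is handled directly by the compiler
    data = []
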
80c3f5b45a Add fixit to dev dependencies 2023-11-22 21:54:09 +03:00
ba4637ea34 Merge pull request #31 from ilri/renovate/black-23.x-lockfile
Update dependency black to v23.11.0
2023-11-20 21:41:43 +03:00
355428a691 Merge pull request #32 from ilri/renovate/country-converter-1.x
Update dependency country-converter to ~1.1.0
2023-11-20 21:39:36 +03:00
58d4de973e Update dependency country-converter to ~1.1.0
2023-11-20 18:37:44 +00:00
e1216dae3c Merge pull request #33 from ilri/renovate/pandas-2.x-lockfile
Update dependency pandas to v2.1.3
2023-11-20 21:36:20 +03:00
6b650ff1b3 Update dependency pandas to v2.1.3
2023-11-20 18:33:42 +00:00
fa7bde6fc0 Merge pull request #34 from ilri/renovate/requests-cache-1.x-lockfile
Update dependency requests-cache to v1.1.1
2023-11-20 21:32:50 +03:00
f89159fe32 Update dependency requests-cache to v1.1.1
2023-11-19 09:26:49 +00:00
02058c5a65 Update dependency black to v23.11.0
2023-11-08 07:49:15 +00:00
8fed6b71ff Merge pull request #30 from ilri/renovate/ipython-8.x-lockfile
Update dependency ipython to v8.17.2
2023-10-31 22:15:50 +03:00
b005b28cbe Merge pull request #29 from ilri/renovate/pandas-2.x-lockfile
Update dependency pandas to v2.1.2
2023-10-31 22:15:27 +03:00
c626290599 Update dependency ipython to v8.17.2
2023-10-31 13:47:08 +00:00
1a06470b64 Update dependency pandas to v2.1.2
2023-10-26 23:01:25 +00:00
d46a81672e Merge pull request #28 from ilri/renovate/pytest-7.x-lockfile
Update dependency pytest to v7.4.3
2023-10-25 12:08:23 +03:00
2a50e75082 Merge pull request #27 from ilri/renovate/csvkit-1.x-lockfile
Update dependency csvkit to v1.3.0
2023-10-25 12:08:05 +03:00
0d45e73983 Merge pull request #25 from ilri/renovate/black-23.x-lockfile
Update dependency black to v23.10.1
2023-10-25 12:07:15 +03:00
3611aab425 Update dependency pytest to v7.4.3
2023-10-24 22:36:05 +00:00
5c4ad0eb41 Update dependency black to v23.10.1
2023-10-23 20:03:53 +00:00
f1f39722f6 Update dependency csvkit to v1.3.0
2023-10-18 07:56:03 +00:00
1c03999582 Merge pull request #24 from ilri/renovate/actions-checkout-4.x
Update actions/checkout action to v4
2023-10-15 23:39:45 +03:00
1f637f32cd Rework requests-cache
We should only be running this once per invocation, not for every
row we check. This should be more efficient, but it means that we
don't cache responses when running via pytest, which is actually
probably a good thing.
2023-10-15 23:37:38 +03:00
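A minimal sketch of installing the cache once per invocation; the cache name here is illustrative:

    from datetime import timedelta

    import requests_cache

    def run():
        # Install the transparent cache a single time per invocation,
        # instead of once per row; later requests calls use it implicitly
        requests_cache.install_cache("requests-cache", expire_after=timedelta(days=30))
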
b8241e919d poetry.lock: run poetry update 2023-10-15 23:22:48 +03:00
b8dc19cc3f csv_metadata_quality/check.py: enable requests-cache
This was disabled at some point. We also need to use the new delete
method instead.
2023-10-15 23:21:58 +03:00
93c9b739ac csv_metadata_quality/check.py: use HTTPS
Use HTTPS for AGROVOC REST API.
2023-10-15 22:38:45 +03:00
4ed2786703 pyproject.toml: update pycountry
Use the latest branch in my fork that has iso-codes 4.15.0.
2023-10-15 21:53:09 +03:00
8728789183 Update actions/checkout action to v4
2023-09-04 14:26:25 +00:00
bf90464809 poetry.lock: run poetry update
2023-08-08 09:55:41 +02:00
1878002391 poetry.lock: run poetry update
2023-06-12 10:42:50 +03:00
d21d2621e3 csv_metadata_quality/app.py: read fields as strings
I suspect this undermines the PyArrow backend performance gains in
recent Pandas 2.0.0, but we are dealing with messy data sometimes
and we must rely on data being strings.
2023-06-12 10:42:50 +03:00
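The failure mode this guards against, shown with a column that has a missing value:

    import io

    import pandas as pd

    data = "id,dcterms.issued\n1,1998\n2,\n"

    # With type inference the missing value forces a float column: 1998 -> 1998.0
    print(pd.read_csv(io.StringIO(data))["dcterms.issued"].tolist())  # [1998.0, nan]

    # Reading everything as strings keeps the values intact
    print(pd.read_csv(io.StringIO(data), dtype=str)["dcterms.issued"].tolist())  # ['1998', nan]
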
f3fb1ff7fb Don't crash when title is missing
We shouldn't crash the country/region checker/fixer when the title
field is missing, since we only use it to show status to the user.
2023-06-12 10:42:50 +03:00
1fa81f7558 Merge pull request #13 from ilri/renovate/ipython-8.x-lockfile
Update dependency ipython to v8.14.0
2023-06-03 17:09:21 +03:00
7409193b6b Update dependency ipython to v8.14.0
2023-06-02 15:58:34 +00:00
a84fcf0b7b .drone.yml: try to use poetry instead of pip
2023-05-30 11:39:08 +03:00
25ac290df4 .github: update Python actions
We don't need to use `python setup.py install` anymore. We can use
poetry directly in CI.

See: https://github.com/actions/setup-python/blob/main/docs/advanced-usage.md
2023-05-29 22:58:01 +03:00
3f52bad1e3 Remove setup.py
As far as I understand this is deprecated.
2023-05-29 22:41:37 +03:00
0208ad0ade Merge pull request #12 from ilri/renovate/requests-cache-1.x
Update dependency requests-cache to v1
2023-05-29 22:37:23 +03:00
3632ae0fc9 Update dependency requests-cache to v1
2023-05-29 19:25:58 +00:00
17d089cc6e poetry.lock: run poetry update
2023-05-29 22:24:22 +03:00
bc470a4343 pyproject.toml: rework pandas and pyarrow
We don't explicitly depend on PyArrow. It should come as a pandas
extra. I installed it like this:

    $ poetry add pandas=="^2.0.2[feather,performance]"

See: https://pandas.pydata.org/docs/getting_started/install.html#other-data-sources
2023-05-29 22:24:04 +03:00
be609a809d setup.py: add Python 3.11 classifier 2023-05-29 21:32:59 +03:00
de3387ded7 Use Python 3.11 in Drone CI and GitHub Actions 2023-05-29 21:31:03 +03:00
f343e87f0c renovate.json: fix json 2023-05-29 21:26:03 +03:00
7d3524fbd5 renovate.json: disable requirements.txt support
Poetry is used to manage dependencies. The requirements.txt files
are generated manually by exporting from Poetry.
2023-05-29 21:11:48 +03:00
c614b71a52 Merge pull request #5 from ilri/renovate/configure
Configure Renovate
2023-05-29 21:02:16 +03:00
d159a839f3 Add renovate.json 2023-05-29 17:40:33 +00:00
36e2ebe5f4 poetry.lock: run poetry update
2023-05-10 15:06:41 +03:00
33f67b7a7c Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-05-03 14:29:12 +03:00
c0e1448439 poetry.lock: run poetry update 2023-05-03 14:28:47 +03:00
5d0804a08f Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-04-22 12:44:54 -07:00
f01c9edf17 poetry.lock: run poetry update 2023-04-22 12:44:16 -07:00
8d4295b2b3 CHANGELOG.md: add note about description field 2023-04-22 12:17:44 -07:00
e2d46e9495 csv_metadata_quality/app.py: skip newline fix on description
The description field often has free-form text like the abstract and
there are too many legitimate newlines here to be correcting them
automatically.
2023-04-22 12:16:13 -07:00
1491e1edb0 Fix path to data/licenses.json
When we install and run this from CI, this file needs to exist in
the package's folder inside site-packages. Then we can use __file__
to get the path relative to the package.

See: https://python-packaging.readthedocs.io/en/latest/non-code-files.html
2023-04-05 15:28:21 +03:00
34142c3e6b Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-04-05 12:51:56 +03:00
0c88b96e8d poetry.lock: run poetry update 2023-04-05 12:51:19 +03:00
2e55b4d6e3 pyproject.toml: add pyarrow explicitly
CI was failing because pyarrow is not an extra provided by pandas.
Indeed, according to the docs the named extras installing pyarrow
are actually feather and parquet, so we need to install pyarrow
explicitly.

See: https://pandas.pydata.org/pandas-docs/version/2.0/getting_started/install.html#install-dependencies
2023-04-05 12:49:40 +03:00
c90aad29f0 Use poetry dev group
This is the new syntax since Poetry 1.2.0.

See: https://python-poetry.org/docs/managing-dependencies/#installing-group-dependencies
2023-04-05 12:37:03 +03:00
6fd1e1377f Add pyarrow extra to Python Pandas deps 2023-04-05 11:40:22 +03:00
c64b7eb1f1 CHANGELOG.md: add note about Pandas 2.0.0 2023-04-05 11:17:48 +03:00
29cbc4f3a3 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-04-05 11:17:06 +03:00
307af1acfc poetry.lock: run poetry update 2023-04-05 11:15:55 +03:00
b5106de9df pyproject.toml: Pandas 2.0.0 2023-04-05 11:15:40 +03:00
9eeadfc44e poetry.lock: after adding pandas 2.0.0rc1
This is going to be an issue on the master branch if I update any
dependencies in the meantime...
2023-03-22 12:17:26 +03:00
d4aed378cf Switch to pandas 2.0.0rc1
Seems to work fine with the new PyArrow datatypes.
2023-03-22 12:16:56 +03:00
20a2cce34b CHANGELOG.md: add fixes
2023-03-10 16:17:20 +03:00
d661ffe439 Check comma space on bibliographicCitation too
The regex was only matching `dc.identifier.citation`, but we need
to match `dcterms.bibliographicCitation` too.
2023-03-10 16:13:16 +03:00
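A hypothetical sketch of a pattern that matches both field names; the exact regex in the code may differ:

    import re

    citation_pattern = re.compile(r"^(dc\.identifier\.citation|dcterms\.bibliographicCitation)")

    for field in ("dc.identifier.citation", "dcterms.bibliographicCitation", "dc.title"):
        print(field, bool(citation_pattern.match(field)))
    # dc.identifier.citation True
    # dcterms.bibliographicCitation True
    # dc.title False
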
45a310387a Don't fix multi-value separators on citations 2023-03-10 16:12:30 +03:00
47b03c49ba README.md: Update TODOs
2023-03-07 10:45:04 +03:00
986b81cbf4 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-03-04 07:35:36 +03:00
d43a47ae32 poetry.lock: run poetry update 2023-03-04 07:34:50 +03:00
ede37569f1 pyproject.toml: use pycountry with iso-codes 4.13.0 2023-03-04 07:33:48 +03:00
0c53efe60a Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-03-04 06:54:34 +03:00
5f0e25b818 poetry.lock: run poetry update 2023-03-04 06:53:55 +03:00
4776154d6c pyproject.toml: switch back to upstream country_converter
Version 1.0.0 incorporates my change to Myanmar.

See: https://github.com/IndEcol/country_converter/releases/tag/v1.0.0
2023-03-04 06:52:56 +03:00
fdccdf7318 Version 0.6.1
2023-02-23 13:46:56 +03:00
ff2c986eec setup.py: minimum python 3.9 2023-02-23 11:47:40 +03:00
547574866e Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-02-23 11:46:24 +03:00
8aa7b93d87 poetry.lock: run poetry update 2023-02-23 11:45:53 +03:00
53fdb50906 csv_metadata_quality/check.py: run black
2023-02-18 22:10:04 +03:00
3e0e9a7f8b poetry.lock: run poetry update 2023-02-18 22:09:33 +03:00
03d824b78e pyproject.toml: update some dependencies 2023-02-18 22:09:05 +03:00
8bc4cd419c Strip filename descriptions before checking
When checking for uncommon file extensions in the filename field
we should strip descriptions that are meant for SAF Bundler, for
example: Annual_Report_2020.pdf__description:Report. This ends up
as a false positive that spams the output with warnings.
2023-02-13 11:00:57 +03:00
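One way to strip such a suffix before checking the extension, using the example from this commit; a sketch, not necessarily the project's exact approach:

    from os.path import splitext

    filename = "Annual_Report_2020.pdf__description:Report"

    # Drop the SAF Bundler description before looking at the extension
    bare_filename = filename.split("__description:")[0]
    print(splitext(bare_filename)[1])  # .pdf
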
bde38e9ed4 CHANGELOG.md: add notes about abstracts 2023-02-13 10:39:03 +03:00
8db1e36a6d csv_metadata_quality/app.py: skip abstract in separator check
Also skip abstract in the separator check, since it's rare to have
any "|" here, but more likely that if one is present then it's for
a reason.
2023-02-13 10:37:33 +03:00
fbb625be5c Ignore common non-SPDX licenses
This is meant to catch licenses that are supposed to be SPDX but
aren't, not licenses that *aren't* supposed to be SPDX. We have so
many free-text license descriptions like "Copyrighted" and "Other"
that I'm sick of seeing warnings for them!
2023-02-07 17:01:56 +03:00
084b970798 CHANGELOG.md: add note about abstract field 2023-02-07 16:52:34 +03:00
171b35b015 Add data/abstract-check.csv
A test file with several whitespace and newline scenarios in the
abstract. I am currently disabling whitespace/newline fixes in the
abstract because they are too aggressive.
2023-02-07 16:50:47 +03:00
545bb8cd0c csv_metadata_quality/app.py: disable whitespace on abstracts
It's too aggressive on abstracts. If people paste in text from a
PDF there are often newlines, and most of the time this is what
they want.
2023-02-07 16:48:40 +03:00
d5afbad788 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2023-01-24 14:18:19 +03:00
d40c9ed97a poetry.lock: run poetry update 2023-01-24 14:17:44 +03:00
c4a2ee8563 CHANGELOG.md: add note about fix.separators() 2023-01-24 14:16:23 +03:00
3596381d03 csv_metadata_quality/app.py: separators fix
Don't run the invalid separators fix on title fields because some
items use "|" in the title to indicate something like a subtitle.

For example:

    Progress Review and Work Planning Meeting | Day 1
2023-01-24 14:13:55 +03:00
5abd32a41f CHANGELOG.md: run poetry update 2022-12-20 15:09:58 +02:00
0ed0fabe21 tests/test_check.py: remove local variables
This was raised by ruff.

> F841 Local variable `result` is assigned to but never used

We don't actually need the output of the function since these tests
capture the stdout.
2022-12-20 15:09:20 +02:00
d5cfec65bd tests/test_check.py: fix logic in assert
This was raised by ruff.

> E711 Comparison to `None` should be `cond is None`
2022-12-20 15:07:41 +02:00
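The difference in a nutshell:

    result = None

    # E711: == invokes __eq__, which a class may override arbitrarily
    print(result == None)  # flagged by ruff

    # `is` compares identity against the None singleton, which is unambiguous
    print(result is None)  # preferred
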
66893753ba Move isort config to pyproject.toml
See: https://pycqa.github.io/isort/docs/configuration/black_compatibility.html
2022-12-20 15:03:10 +02:00
57be05ebb6 poetry.lock: run poetry update 2022-12-20 14:59:35 +02:00
8c23382b22 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-12-13 10:47:16 +03:00
f640161d87 CHANGELOG.md: add notes about SPDX and Python 2022-12-13 10:45:36 +03:00
e81ae93bf0 poetry.lock: run poetry update 2022-12-13 10:44:06 +03:00
50ea5863dd .drone.yml: only test on Python 3.9+ 2022-12-13 10:43:18 +03:00
2dfb073b6b Update minimum Python version to 3.9
Due to importlib.resources.files. It's a very minor thing and there
are ways to use back-ported third-party modules with this
functionality, but I'm the only one using this so...

See: https://docs.python.org/3/library/importlib.resources.html#importlib.resources.files
2022-12-13 10:41:32 +03:00
7cc49b500d Use licenses.json from SPDX instead of spdx-license-list
spdx-license-list has been deprecated[1] and already has outdated
information compared to recent SPDX data releases. Now I use the
JSON license data directly from SPDX[2] (currently version 3.19).

The JSON file is loaded from the package's data directory using
Python 3's stdlib functions from importlib[3], though we now need
Python 3.9 as a minimum for importlib.resources.files[4].

Also note that the data directory is not properly packaged via
setuptools, so this only works for local installs, and not via
versions published to pypi, for example (I'm currently not doing
this anyways). If I want to publish this in the future I will
need to modify setup.py/pyproject.toml to include the data files.

[1] https://gitlab.com/uniqx/spdx-license-list
[2] https://github.com/spdx/license-list-data/blob/main/json/licenses.json
[3] https://copdips.com/2022/09/adding-data-files-to-python-package-with-setup-py.html
[4] https://docs.python.org/3/library/importlib.resources.html#importlib.resources.files
2022-12-13 10:39:17 +03:00
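A sketch of that loading pattern, assuming the SPDX JSON schema's top-level licenses array with licenseId keys:

    import json
    from importlib.resources import files  # requires Python 3.9+

    # Resolve data/licenses.json relative to the installed package
    licenses_json = files("csv_metadata_quality").joinpath("data/licenses.json")

    with licenses_json.open("r", encoding="utf-8") as f:
        spdx = json.load(f)

    spdx_ids = [lic["licenseId"] for lic in spdx["licenses"]]
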
051777bcec Ignore subregion field for missing region checks
Due to a sloppy regex I was sometimes matching the subregion field
when checking for missing UN M.49 regions in the region field.
2022-12-07 23:18:47 +01:00
58e956360a Add tests/test_check.py: fix test
2022-11-28 22:12:17 +03:00
3532175748 .drone.yml: install git
Apparently the slim images don't come with git, which we need for
cloning some dependencies.
2022-11-28 22:05:34 +03:00
a7bc929af8 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-11-28 17:42:26 +03:00
141b2e1da3 csv_metadata_quality/check.py: update region output
Add the country to the message about missing regions. This makes it
easier to see which country is triggering the missing region error,
and helps in case of debugging possible mistakes in the data coming
from the country_converter library.
2022-11-28 17:40:27 +03:00
7097136b7e Use my fork of country_converter again
There is an issue with the UN M.49 region for Myanmar.
2022-11-28 17:38:45 +03:00
d134c93663 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-11-28 17:16:09 +03:00
9858406894 poetry.lock: run poetry update 2022-11-28 17:15:19 +03:00
b02f1f65ee pyproject.toml: use upstream country_converter
Version 0.8.0 has the country and UN M.49 region fixes.

See: https://github.com/konstantinstadler/country_converter/releases/tag/v0.8.0
2022-11-28 17:14:16 +03:00
4d5ef38dde pyproject.toml: add ipython to dev dependencies 2022-11-28 17:11:18 +03:00
eaa8f31faf Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-11-08 10:22:39 +03:00
df57988e5a Use my fork of pycountry
Until they update to iso-codes 4.12.0.

See: https://github.com/flyingcircusio/pycountry/pull/149
2022-11-08 10:21:28 +03:00
bddf4da559 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-11-08 10:06:26 +03:00
15f52f8be8 Switch to my fork of country-converter
Until a few issues are resolved regarding new countries and regions.

See: https://github.com/konstantinstadler/country_converter/pull/122
See: https://github.com/konstantinstadler/country_converter/pull/123
2022-11-08 10:04:31 +03:00
bc909464c7 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-11-07 12:14:46 +03:00
2d46259dfe poetry.lock: run poetry update 2022-11-07 12:13:44 +03:00
ca82820a8e pyproject.toml: update dependencies to latest 2022-11-07 12:13:28 +03:00
86b4e5e182 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-11-01 12:21:41 +03:00
e5d5ae7e5d poetry.lock: run poetry update 2022-11-01 12:20:43 +03:00
8f3db86a36 CHANGELOG.md: fix header
2022-10-31 11:43:14 +03:00
b0721b0a7a .github: use ubuntu-22.04 for actions
Apparently 'ubuntu-latest' is still 20.04 and today is 2022-10-03,
which seems a bit old!

See: https://github.com/actions/runner-images
2022-10-03 19:49:24 +03:00
4e5faf51bd .github/workflows: use pip caching
See: https://github.com/actions/setup-python/blob/main/docs/advanced-usage.md#caching-packages
2022-10-03 19:39:52 +03:00
5ea38d65bd .github/workflows: update actions
Update actions to latest versions:

- actions/checkout@v3
- actions/setup-python@v4
2022-10-03 19:39:52 +03:00
58b7b6e9d8 Version 0.6.0
2022-09-02 16:35:58 +03:00
ffdf1eca7b setup.py: remove Python 3.7 support
I had already set the minimum to Python 3.8 elsewhere, but forgot
to do it here. I am not sure if Python 3.7 will still work here or
not, so let's just keep it in sync with the other docs.
2022-09-02 16:34:16 +03:00
59742e47f1 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-09-02 16:32:04 +03:00
9c741b1d49 poetry.lock: sync latest deps 2022-09-02 16:31:19 +03:00
21e9948a75 pyproject.toml: manually updated all deps
Update all deps to their latest versions on pypi.org and remove the
explicit dependency on SQLAlchemy.
2022-09-02 16:30:40 +03:00
f64435fc9d tests/test_check.py: add missing excludes 2022-09-02 16:24:33 +03:00
566c2b45cf Remove Excel support
I never used this and it seems xlrd doesn't even support .xlsx anymore
anyways. If this was needed I could theoretically use openpyxl
but I'd rather just stick to CSV.
2022-09-02 16:14:24 +03:00
41b813be6e CHANGELOG.md: add note about exclude logic 2022-09-02 16:03:51 +03:00
040e56fc76 Improve exclude function
When a user explicitly requests that a field be excluded with -x we
skip that field in most checks. Up until now that did not include
the item-based checks using a transposed dataframe because we don't
know the metadata field names (labels) until we iterate over them.

Now the excludes are respected for item-based checks.
2022-09-02 15:59:22 +03:00
1f76247353 csv_metadata_quality/app.py: rework exclude/skip
Instead of processing the excludes inside the for column loop we do
it once before and then only need to check if the current column is
in the list.
2022-09-02 10:35:04 +03:00
2e489fc921 Add new data/test-geography.csv test file
This file has metadata to test different scenarios related to
checking and fixing missing regions.
2022-09-01 16:57:29 +03:00
117c6ca85d csv_metadata_quality/check.py: missing region fixes
Port over the recent fixes and logic improvements to regions from
fix.py.
2022-09-01 16:38:35 +03:00
f49214fa2e csv_metadata_quality/fix.py: fix bug in regions
We need to make sure we're only manipulating the regions if we have
any missing. The previous code was always manipulating the existing
row, even when there were no missing regions, which resulted in new
values like "Eastern Africa||".
2022-09-01 16:15:32 +03:00
7ce20726d0 csv_metadata_quality/fix.py: minor change
Print missing regions when we know they are missing, instead of
doing another check later and looping over them again.
2022-09-01 16:03:49 +03:00
473be5ac2f csv_metadata_quality/fix.py: don't add "not found" region
country_converter returns the literal "not found" string if a
country cannot be found. In that case we do not want to consider
that as a region!
2022-09-01 15:46:21 +03:00
7c61cae417 csv_metadata_quality/fix.py: silence warning
By default country_converter prints "not found in regex" if a
country is not found. We can silence this by switching the logging
level to something above WARNING.
2022-09-01 15:44:50 +03:00
ae16289637 csv_metadata_quality/fix.py: Minor change
The country_converter documentation says we should instantiate the
CountryConverter() class once instead of calling coco.convert() in
each iteration of the loop so we don't end up loading the data file
more than once.
2022-09-01 15:40:45 +03:00
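Taken together, the three country_converter changes above look roughly like this; the UNregion classification name comes from country_converter's documentation:

    import logging

    import country_converter as coco

    # Raise the logging level so "not found in regex" warnings are silenced
    coco_logger = coco.logging.getLogger()
    coco_logger.setLevel(logging.ERROR)

    # Instantiate once so the underlying data file is only loaded one time
    cc = coco.CountryConverter()

    for country in ("Kenya", "Atlantis"):
        region = cc.convert(names=country, to="UNregion")
        # convert() returns the literal string "not found" for unknown names
        if region != "not found":
            print(country, "->", region)
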
fdb7900cd0 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --with dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==
2022-09-01 11:21:10 +03:00
9c65569c43 poetry.lock: run poetry update 2022-09-01 08:44:12 +03:00
0cf0bc97f0 csv_metadata_quality/fix.py: fix logic error again
It seems there was another logic error raised by the test in pytest.
With my real data, it was enough to check if the region column was
None, but with my test I was explicitly setting the region to "" (an
empty string). So to be really sure we should check if the string
is not None *and* if its length is greater than 0.
2022-08-03 20:51:14 +03:00
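The resulting guard, sketched with a hypothetical variable name:

    # An empty string is not None, so both conditions are needed before
    # concatenating new regions onto the existing field (name is hypothetical)
    existing_regions = ""

    if existing_regions is not None and len(existing_regions) > 0:
        print("append with || separator")
    else:
        print("start a fresh value")
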
40c3585bab csv_metadata_quality/fix.py: fix logic error
Fix string concatenation with existing regions.
2022-08-03 18:26:08 +03:00
b9c44aed7d csv_metadata_quality/fix.py: fix logic issue
Forgot to return the row as-is if we don't find any countries.
2022-08-02 10:17:30 +03:00
032a1db392 README.md: Add note about missing regions
2022-07-28 16:58:01 +03:00
da87531779 CHANGELOG.md: Add note about adding missing regions 2022-07-28 16:54:05 +03:00
689ee184f7 Add unsafe check to add missing regions 2022-07-28 16:52:43 +03:00
344993370c Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==.
2022-07-08 15:50:42 +03:00
00b4dca185 poetry.lock: run poetry update 2022-07-08 15:50:03 +03:00
5a87bf4317 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==.
2022-03-21 14:37:38 +03:00
c706719d8b poetry.lock: run poetry update 2022-03-21 14:37:03 +03:00
e7ea8ef9f0 README.md: add note about spdx-license-list
This Python module was deprecated in favor of using the SPDX license
data directly.

See: https://github.com/spdx/license-list-data
2022-01-30 13:27:20 +03:00
ea050376fc Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==.
2022-01-30 13:26:37 +03:00
4ba615cd41 poetry.lock: run poetry update 2022-01-30 13:26:04 +03:00
b0d46cd864 pyproject.toml: update black
It's no longer in beta!
2022-01-30 13:22:47 +03:00
3ee9319d84 pyproject.toml: bump flake8 2022-01-30 13:21:09 +03:00
4d5f4b5abb pyproject.toml: update pycountry
Seems to be a few major versions from 19.x.x to 21.x.x. All tests
passing in pytest so it's probably fine.
2022-01-30 13:15:38 +03:00
98d38801fa pyproject.toml: update requests and requests-cache 2022-01-30 13:11:01 +03:00
dad7a8765c .github/workflows/python-app.yml: use Python 3.10
That's what I use for testing locally. Note that we need to quote
the version here because otherwise GitHub Actions will interpret it
as 3.1 due to how YAML works.
2022-01-30 13:06:51 +03:00
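The YAML behavior is easy to demonstrate with PyYAML:

    import yaml

    # Unquoted, YAML 1.1 parses 3.10 as the float 3.1
    print(yaml.safe_load("python-version: 3.10"))    # {'python-version': 3.1}

    # Quoted, the version survives as a string
    print(yaml.safe_load("python-version: '3.10'"))  # {'python-version': '3.10'}
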
d126304534 README.md: update note about Python version 2022-01-30 13:05:36 +03:00
38c2584863 .drone.yml: don't test on Python 3.7 anymore
Pandas 1.4.0 has a minimum Python requirement of 3.8.

See: https://pandas.pydata.org/docs/whatsnew/v1.4.0.html
2022-01-30 13:04:52 +03:00
e94a4539bf pyproject.toml: bump Pandas to v1.4.0
As of Pandas v1.4.0 the minimum Python version is 3.8.

See: https://pandas.pydata.org/docs/whatsnew/v1.4.0.html
2022-01-30 13:03:56 +03:00
a589d39e38 poetry.lock: run poetry lock 2022-01-29 16:26:16 +03:00
d9e427a80e pyproject.toml: don't install ipython
It always complains about running in a virtual environment anyways,
and I can use the one from the OS instead.
2022-01-29 16:25:58 +03:00
8ee5e2e306 setup.py: denote that Python 3.10 works
I have been using Python 3.10 for months, and already added it to
the CI builds.
2022-01-29 16:08:01 +03:00
490701f244 Run more CLI tests in CI
2021-12-24 14:47:25 +02:00
e1b270cf83 CHANGELOG.md: add note about dropping invalid AGROVOC values
2021-12-23 12:47:42 +02:00
b7efe2de40 data/test.csv: update invalid AGROVOC entry
Now that we can drop invalid AGROVOC values we should have a valid
value and an invalid value here. Depending on how the checker is
invoked we will either print a warning or drop the invalid value.
2021-12-23 12:45:38 +02:00
c43095139a tests/test_check.py: add tests for dropping invalid AGROVOC 2021-12-23 12:44:32 +02:00
a7727b8431 Add support for dropping invalid AGROVOC terms
Requires --agrovoc-fields <field.name> to do the actual validation,
and -d to drop invalid ones.
2021-12-23 12:43:55 +02:00
7763a021c5 csv_metadata_quality/fix.py: sort imports with isort
2021-12-15 23:15:02 +02:00
3c12ef3f66 Update requirements
Generated with poetry export:

    $ poetry export --without-hashes -f requirements.txt > requirements.txt
    $ poetry export --without-hashes --dev -f requirements.txt > requirements-dev.txt

I am trying `--without-hashes` to work around an error on pip install
when running in CI:

    ERROR: In --require-hashes mode, all requirements must have their versions pinned with ==.
2021-12-15 23:11:44 +02:00
aee2438e94 poetry.lock: run poetry update 2021-12-15 23:10:27 +02:00
a351ba9706 CHANGELOG.md: add notes about ftfy 2021-12-15 22:09:01 +02:00
e4faf114dc csv_metadata_quality/util.py: update for ftfy 6.0
The sequence_weirdness() heuristic is deprecated. Now we should use
is_bad().

See: https://ftfy.readthedocs.io/en/v6.0/heuristic.html
See: https://github.com/rspeer/python-ftfy/blob/master/CHANGELOG.md#version-60-april-2-2021
2021-12-15 21:58:07 +02:00
ff49a80432 csv_metadata_quality/fix.py: configure ftfy
Don't replace smart quotes in ftfy. If our text has them we should
keep them.
2021-12-15 21:51:51 +02:00
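These two ftfy changes combined, sketched against the ftfy 6.x API:

    import ftfy
    from ftfy.badness import is_bad

    # is_bad() replaces the deprecated sequence_weirdness() heuristic
    print(is_bad("PÃ©rez"))  # True: classic UTF-8-as-Latin-1 mojibake

    # Fix the mojibake but leave “smart quotes” alone
    print(ftfy.fix_text("“PÃ©rez”", uncurl_quotes=False))  # “Pérez”
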
8b15154285 pyproject.toml: use ftfy 6.0
Lots of improvements here! Improvements to heuristics and a new way
to configure which fixes get applied.

See: https://github.com/rspeer/python-ftfy/blob/master/CHANGELOG.md#version-60-april-2-2021
2021-12-15 21:48:56 +02:00
5854f8e865 CHANGELOG.md: add note about unnecessary Unicode 2021-12-15 13:56:31 +02:00
e7322efadd csv_metadata_quality/app.py: move unnecessary Unicode fix
We actually want to do this after we try to fix mojibake with ftfy.
These "unnecessary" Unicode characters could actually help ftfy in
some cases because often times they indicate that some character
from another encoding was there before (like an accent, dash, or
smart quote).
2021-12-15 13:53:25 +02:00
95015febbd csv_metadata_quality/fix.py: fix thin spaces
Replace thin spaces with normal spaces. Sometimes I see these get
mishandled on Windows machines and they end up as "?" or so.
2021-12-09 23:22:53 +02:00
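The fix is a simple character substitution:

    # Replace U+2009 THIN SPACE with a regular space
    text = "10\u2009000 ha"
    print(text.replace("\u2009", " "))  # 10 000 ha
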
cef6c66b30 CHANGELOG.md: start next changes 2021-12-09 23:21:58 +02:00
9905e183ea Bump version to 0.6.0-dev 2021-12-09 23:21:30 +02:00
26 changed files with 10282 additions and 2055 deletions

.drone.yml (deleted)

@@ -1,69 +0,0 @@
---
kind: pipeline
type: docker
name: python310
steps:
- name: test
  image: python:3.10-slim
  commands:
  - id
  - python -V
  - apt update && apt install -y gcc g++ libicu-dev pkg-config
  - pip install -r requirements-dev.txt
  - pytest
  - python setup.py install
  - csv-metadata-quality -i data/test.csv -o /tmp/test.csv -e -u --agrovoc-fields dcterms.subject,cg.coverage.country
---
kind: pipeline
type: docker
name: python39
steps:
- name: test
  image: python:3.9-slim
  commands:
  - id
  - python -V
  - apt update && apt install -y gcc g++ libicu-dev pkg-config
  - pip install -r requirements-dev.txt
  - pytest
  - python setup.py install
  - csv-metadata-quality -i data/test.csv -o /tmp/test.csv -e -u --agrovoc-fields dcterms.subject,cg.coverage.country
---
kind: pipeline
type: docker
name: python38
steps:
- name: test
  image: python:3.8-slim
  commands:
  - id
  - python -V
  - apt update && apt install -y gcc g++ libicu-dev pkg-config
  - pip install -r requirements-dev.txt
  - pytest
  - python setup.py install
  - csv-metadata-quality -i data/test.csv -o /tmp/test.csv -e -u --agrovoc-fields dcterms.subject,cg.coverage.country
---
kind: pipeline
type: docker
name: python37
steps:
- name: test
  image: python:3.7-slim
  commands:
  - id
  - python -V
  - apt update && apt install -y gcc g++ libicu-dev pkg-config
  - pip install -r requirements-dev.txt
  - pytest
  - python setup.py install
  - csv-metadata-quality -i data/test.csv -o /tmp/test.csv -e -u --agrovoc-fields dcterms.subject,cg.coverage.country
# vim: ts=2 sw=2 et

.github/workflows/python-app.yml

@@ -15,27 +15,23 @@ jobs:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python 3.9
-        uses: actions/setup-python@v2
+      - uses: actions/checkout@v4
+      - name: Install uv
+        uses: astral-sh/setup-uv@v5
         with:
-          python-version: 3.9
-      - name: Install dependencies
-        run: |
-          python -m pip install --upgrade pip
-          pip install flake8 pytest
-          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
-          if [ -f requirements-dev.txt ]; then pip install -r requirements-dev.txt; fi
-      - name: Lint with flake8
-        run: |
-          # stop the build if there are Python syntax errors or undefined names
-          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
-          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
-          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
+          version: 'latest'
+      - run: uv sync
       - name: Test with pytest
-        run: |
-          pytest
+        run: uv run pytest
       - name: Test CLI
         run: |
-          python setup.py install
-          csv-metadata-quality -i data/test.csv -o /tmp/test.csv -e -u --agrovoc-fields dcterms.subject,cg.coverage.country
+          # Basic test
+          uv run csv-metadata-quality -i data/test.csv -o /tmp/test.csv
+          # Test with unsafe fixes
+          uv run csv-metadata-quality -i data/test.csv -o /tmp/test.csv -u
+          # Test with experimental checks
+          uv run csv-metadata-quality -i data/test.csv -o /tmp/test.csv -e
+          # Test with AGROVOC validation
+          uv run csv-metadata-quality -i data/test.csv -o /tmp/test.csv --agrovoc-fields dcterms.subject
+          # Test with AGROVOC validation (and dropping invalid)
+          uv run csv-metadata-quality -i data/test.csv -o /tmp/test.csv --agrovoc-fields dcterms.subject -d

.python-version (new file)

@@ -0,0 +1 @@
+3.13

CHANGELOG.md

@@ -4,6 +4,66 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+## [0.7.0] - 2025-01-31
+### Added
+- Ability to normalize DOIs to https://doi.org URI format
+### Fixed
+- Fixed regex so we don't run the invalid multi-value separator fix on
+  `dcterms.bibliographicCitation` fields
+- Fixed regex so we run the comma space fix on `dcterms.bibliographicCitation`
+  fields
+- Don't crash the country/region checker/fixer when a title field is missing
+### Changed
+- Don't run newline fix on description fields
+- Install requests-cache in main run() function instead of check.agrovoc() function so we only incur the overhead once
+- Use py3langid instead of langid, see: [How to make language detection with langid.py faster](https://adrien.barbaresi.eu/blog/language-detection-langid-py-faster.html)
+- Use uv instead of rye
+- Remove pytest-clarity — I think pytest itself has gotten much better in the past few years
+### Updated
+- Python dependencies, including Pandas 2.0.0 and [Arrow-backed dtypes](https://datapythonista.me/blog/pandas-20-and-the-arrow-revolution-part-i)
+- SPDX license list
+## [0.6.1] - 2023-02-23
+### Fixed
+- Missing region check should ignore subregion field, if it exists
+### Changed
+- Use SPDX license data from SPDX themselves instead of spdx-license-list
+  because it is deprecated and outdated
+- Require Python 3.9+
+- Don't run `fix.separators()` on title or abstract fields
+- Don't run whitespace or newline fixes on abstract fields
+- Ignore some common non-SPDX licenses
+- Ignore `__description` suffix in filenames meant for SAFBuilder when checking
+  for uncommon file extensions
+### Updated
+- Python dependencies
+## [0.6.0] - 2022-09-02
+### Changed
+- Perform fix for "unnecessary" Unicode characters after we try to fix encoding
+  issues with ftfy
+- ftfy heuristics to use `is_bad()` instead of `sequence_weirdness()`
+- ftfy `fix_text()` to *not* change “smart quotes” to "ASCII quotes"
+### Updated
+- Python dependencies
+- Metadata field exclude logic
+### Added
+- Ability to drop invalid AGROVOC values with `-d` when checking AGROVOC values
+  with `-a <field.name>`
+- Ability to add missing UN M.49 regions when both country and region columns
+  are present. Enable with `-u` (unsafe fixes) for now.
+### Removed
+- Support for reading Excel files (both `.xls` and `.xlsx`) as it was completely
+  untested
 ## [0.5.0] - 2021-12-08
 ### Added
 - Ability to check for, and fix, "mojibake" characters using [ftfy](https://github.com/LuminosoInsight/python-ftfy)

MANIFEST.in (new file)

@@ -0,0 +1 @@
+include csv_metadata_quality/data/licenses.json

View File

@ -1,14 +1,13 @@
<h1 align="center">DSpace CSV Metadata Quality Checker</h1>
<p align="center">
<a href="https://ci.mjanja.ch/alanorth/csv-metadata-quality"><img alt="Build Status" src="https://ci.mjanja.ch/api/badges/alanorth/csv-metadata-quality/status.svg"></a>
<a href="https://github.com/ilri/csv-metadata-quality/actions"><img alt="Build and Test" src="https://github.com/ilri/csv-metadata-quality/workflows/Build%20and%20Test/badge.svg"></a>
<a href="https://github.com/psf/black"><img alt="Code style: black" src="https://img.shields.io/badge/code%20style-black-000000.svg"></a>
</p>
A simple, but opinionated metadata quality checker and fixer designed to work with CSVs in the DSpace ecosystem (though it could theoretically work on any CSV that uses Dublin Core fields as columns). The implementation is essentially a pipeline of checks and fixes that begins with splitting multi-value fields on the standard DSpace "||" separator, trimming leading/trailing whitespace, and then proceeding to more specialized cases like ISSNs, ISBNs, languages, unnecessary Unicode, AGROVOC terms, etc.
Requires Python 3.7.1 or greater (3.8+ recommended). CSV and Excel support comes from the [Pandas](https://pandas.pydata.org/) library, though your mileage may vary with Excel because this is much less tested.
Requires Python 3.9 or greater. CSV support comes from the [Pandas](https://pandas.pydata.org/) library.
If you use the DSpace CSV metadata quality checker please cite:
@ -28,26 +27,28 @@ If you use the DSpace CSV metadata quality checker please cite:
- Remove unnecessary Unicode like [non-breaking spaces](https://en.wikipedia.org/wiki/Non-breaking_space), [replacement characters](https://en.wikipedia.org/wiki/Specials_(Unicode_block)#Replacement_character), etc
- Check for "suspicious" characters that indicate encoding or copy/paste issues, for example "foreˆt" should be "forêt"
- Check for "mojibake" characters (and attempt to fix with `--unsafe-fixes`)
- Check for countries with missing regions (and attempt to fix with `--unsafe-fixes`)
- Remove duplicate metadata values
- Check for duplicate items, using the title, type, and date issued as an indicator
- [Normalize DOIs](https://www.crossref.org/documentation/member-setup/constructing-your-dois/) to https://doi.org URI format
## Installation
The easiest way to install CSV Metadata Quality is with [poetry](https://python-poetry.org):
The easiest way to install CSV Metadata Quality is with [uv](https://docs.astral.sh/uv/):
```
$ git clone https://github.com/ilri/csv-metadata-quality.git
$ cd csv-metadata-quality
$ poetry install
$ poetry shell
$ uv sync
$ source .venv/bin/activate
```
Otherwise, if you don't have poetry, you can use a vanilla Python virtual environment:
Otherwise, if you don't have uv, you can use a vanilla Python virtual environment:
```
$ git clone https://github.com/ilri/csv-metadata-quality.git
$ cd csv-metadata-quality
$ python3 -m venv venv
$ source venv/bin/activate
$ python3 -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt
```
@ -70,7 +71,7 @@ While it is *theoretically* possible for a single `|` character to be used legit
This will also remove unnecessary trailing multi-value separators, for example `Kenya||Tanzania||`.
## Unsafe Fixes
You can enable several "unsafe" fixes with the `--unsafe-fixes` option. Currently this will remove newlines, perform Unicode normalization, and attempt to fix "mojibake" characters.
You can enable several "unsafe" fixes with the `--unsafe-fixes` option. Currently this will remove newlines, perform Unicode normalization, attempt to fix "mojibake" characters, and add missing UN M.49 regions.
### Newlines
This is considered "unsafe" because some systems give special importance to vertical space and render it properly. DSpace does not support rendering newlines in its XMLUI and has, at times, suffered from parsing errors that cause the import process to fail if an input file had newlines. The `--unsafe-fixes` option strips Unix line feeds (U+000A).
@ -91,6 +92,9 @@ Read more about [Unicode normalization](https://withblue.ink/2019/03/11/why-you-
Pay special attention to the output of the script as well as the resulting file to make sure no new issues have been introduced. The ideal way to solve these issues is to avoid them in the first place. See [this guide about opening CSVs in UTF-8 format in Excel](https://www.itg.ias.edu/content/how-import-csv-file-uses-utf-8-character-encoding-0).
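As background on what NFC normalization does (a standard-library snippet, independent of this tool), it merges decomposed accents into single code points so visually identical strings compare equal:
```
from unicodedata import normalize

decomposed = "Decompose\u0301d"     # "e" plus U+0301 combining acute accent
composed = normalize("NFC", decomposed)
print(decomposed == composed)       # False: same text, different code points
print(composed == "Decomposéd")     # True after normalization
```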
### Countries With Missing Regions
When an input file has both country and region columns we can check to see if the ISO 3166 country names have matching UN M.49 regions and add them when they are missing.
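You can preview what the fix would add with the same country_converter lookup the tool uses (here called directly, outside the tool):
```
import country_converter as coco

cc = coco.CountryConverter()
# CoCo returns the direct UN M.49 region for each country name.
print(cc.convert(names="Kenya", to="UNRegion"))    # Eastern Africa
print(cc.convert(names="Bahamas", to="UNRegion"))  # Caribbean
```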
## AGROVOC Validation
You can enable validation of metadata values in certain fields against the AGROVOC REST API with the `--agrovoc-fields` option. For example, in addition to agricultural subjects, many countries and regions are also present in AGROVOC. Enable this validation by specifying a comma-separated list of fields, for example: `--agrovoc-fields dcterms.subject,cg.coverage.country`.
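Under the hood this is a plain HTTP query against the AGROVOC search endpoint; a term is treated as valid when the API returns at least one result. A minimal standalone sketch:
```
import requests

def is_valid_agrovoc(term):
    """Sketch: a term is valid if the AGROVOC search API returns results."""
    url = "https://agrovoc.uniroma2.it/agrovoc/rest/v1/agrovoc/search"
    response = requests.get(url, params={"query": term})
    response.raise_for_status()
    return len(response.json()["results"]) > 0

print(is_valid_agrovoc("LIVESTOCK"))  # True
print(is_valid_agrovoc("FOREST"))     # False (flagged invalid in the test data)
```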
@ -121,9 +125,7 @@ This currently uses the [Python langid](https://github.com/saffsd/langid.py) lib
- Better logging, for example with INFO, WARN, and ERR levels
- Verbose, debug, or quiet options
- Warn if an author is shorter than 3 characters?
- Validate DOIs? Normalize to https://doi.org format? Or use just the DOI part: 10.1016/j.worlddev.2010.06.006
- Warn if two items use the same file in `filename` column
- Add an option to drop invalid AGROVOC subjects?
- Add tests for application invocation, ie `tests/test_app.py`?
- Validate ISSNs or journal titles against CrossRef API?
- Add configurable field validation, like specify a field name and a validation file?
@ -133,6 +135,7 @@ This currently uses the [Python langid](https://github.com/saffsd/langid.py) lib
- Warn if item is Open Access, but missing a license
- Warn if item has an ISSN but no journal title
- Update journal titles from ISSN
- Migrate from Pandas to Polars
## License
This work is licensed under the [GPLv3](https://www.gnu.org/licenses/gpl-3.0.en.html).


@ -1,11 +1,14 @@
# SPDX-License-Identifier: GPL-3.0-only
import argparse
import os
import re
import signal
import sys
from datetime import timedelta
import pandas as pd
import requests_cache
from colorama import Fore
import csv_metadata_quality.check as check
@ -21,6 +24,12 @@ def parse_args(argv):
"-a",
help="Comma-separated list of fields to validate against AGROVOC, for example: dcterms.subject,cg.coverage.country",
)
parser.add_argument(
"--drop-invalid-agrovoc",
"-d",
help="After validating metadata values against AGROVOC, drop invalid values.",
action="store_true",
)
parser.add_argument(
"--experimental-checks",
"-e",
@ -30,7 +39,7 @@ def parse_args(argv):
parser.add_argument(
"--input-file",
"-i",
help="Path to input file. Can be UTF-8 CSV or Excel XLSX.",
help="Path to input file. Must be a UTF-8 CSV.",
required=True,
type=argparse.FileType("r", encoding="UTF-8"),
)
@ -68,33 +77,50 @@ def run(argv):
signal.signal(signal.SIGINT, signal_handler)
# Read all fields as strings so dates don't get converted from 1998 to 1998.0
df = pd.read_csv(args.input_file, dtype=str)
df = pd.read_csv(args.input_file, dtype_backend="pyarrow", dtype="str")
# Check if the user requested to skip any fields
if args.exclude_fields:
# Split the list of excluded fields on ',' into a list. Note that the
# user should be careful not to include spaces here.
exclude = args.exclude_fields.split(",")
else:
exclude = []
# enable transparent request cache with thirty days expiry
expire_after = timedelta(days=30)
# Allow overriding the location of the requests cache, just in case we are
# running in an environment where we can't write to the current working di-
# rectory (for example from csv-metadata-quality-web).
REQUESTS_CACHE_DIR = os.environ.get("REQUESTS_CACHE_DIR", ".")
requests_cache.install_cache(
f"{REQUESTS_CACHE_DIR}/agrovoc-response-cache", expire_after=expire_after
)
# prune old cache entries
requests_cache.delete()
for column in df.columns:
# Check if the user requested to skip any fields
if args.exclude_fields:
skip = False
# Split the list of excludes on ',' so we can test exact matches
# rather than fuzzy matches with regexes or "if word in string"
for exclude in args.exclude_fields.split(","):
if column == exclude and skip is False:
skip = True
if skip:
print(f"{Fore.YELLOW}Skipping {Fore.RESET}{column}")
if column in exclude:
print(f"{Fore.YELLOW}Skipping {Fore.RESET}{column}")
continue
continue
# Fix: whitespace
df[column] = df[column].apply(fix.whitespace, field_name=column)
# Fix: newlines
if args.unsafe_fixes:
df[column] = df[column].apply(fix.newlines, field_name=column)
# Skip whitespace and newline fixes on abstracts and descriptions
# because there are too many with legitimate multi-line metadata.
match = re.match(r"^.*?(abstract|description).*$", column)
if match is None:
# Fix: whitespace
df[column] = df[column].apply(fix.whitespace, field_name=column)
# Fix: newlines
df[column] = df[column].apply(fix.newlines, field_name=column)
# Fix: missing space after comma. Only run on author and citation
# fields for now, as this problem is mostly an issue in names.
if args.unsafe_fixes:
match = re.match(r"^.*?(author|citation).*$", column)
match = re.match(r"^.*?(author|[Cc]itation).*$", column)
if match is not None:
df[column] = df[column].apply(fix.comma_space, field_name=column)
@ -103,9 +129,6 @@ def run(argv):
if args.unsafe_fixes:
df[column] = df[column].apply(fix.normalize_unicode, field_name=column)
# Fix: unnecessary Unicode
df[column] = df[column].apply(fix.unnecessary_unicode)
# Check: suspicious characters
df[column].apply(check.suspicious_characters, field_name=column)
@ -115,20 +138,34 @@ def run(argv):
else:
df[column].apply(check.mojibake, field_name=column)
# Fix: invalid and unnecessary multi-value separators
df[column] = df[column].apply(fix.separators, field_name=column)
# Run whitespace fix again after fixing invalid separators
df[column] = df[column].apply(fix.whitespace, field_name=column)
# Fix: unnecessary Unicode
df[column] = df[column].apply(fix.unnecessary_unicode)
# Fix: normalize DOIs
match = re.match(r"^.*?identifier\.doi.*$", column)
if match is not None:
df[column] = df[column].apply(fix.normalize_dois)
# Fix: invalid and unnecessary multi-value separators. Skip the title
# and abstract fields because "|" is used to indicate something like
# a subtitle.
match = re.match(r"^.*?(abstract|[Cc]itation|title).*$", column)
if match is None:
df[column] = df[column].apply(fix.separators, field_name=column)
# Run whitespace fix again after fixing invalid separators
df[column] = df[column].apply(fix.whitespace, field_name=column)
# Fix: duplicate metadata values
df[column] = df[column].apply(fix.duplicates, field_name=column)
# Check: invalid AGROVOC subject
# Check: invalid AGROVOC subject and optionally drop them
if args.agrovoc_fields:
# Identify fields the user wants to validate against AGROVOC
for field in args.agrovoc_fields.split(","):
if column == field:
df[column].apply(check.agrovoc, field_name=column)
df[column] = df[column].apply(
check.agrovoc, field_name=column, drop=args.drop_invalid_agrovoc
)
# Check: invalid language
match = re.match(r"^.*?language.*$", column)
@ -192,19 +229,30 @@ def run(argv):
# should rename column in this for loop...
for column in df_transposed.columns:
# Check: citation DOI
check.citation_doi(df_transposed[column])
check.citation_doi(df_transposed[column], exclude)
# Check: title in citation
check.title_in_citation(df_transposed[column])
check.title_in_citation(df_transposed[column], exclude)
# Check: countries match regions
check.countries_match_regions(df_transposed[column])
if args.unsafe_fixes:
# Fix: countries match regions
df_transposed[column] = fix.countries_match_regions(
df_transposed[column], exclude
)
else:
# Check: countries match regions
check.countries_match_regions(df_transposed[column], exclude)
if args.experimental_checks:
experimental.correct_language(df_transposed[column])
experimental.correct_language(df_transposed[column], exclude)
# Transpose the DataFrame back before writing. This is probably wasteful to
# do every time since we technically only need to do it if we've done the
# countries/regions fix above, but I can't think of another way for now.
df_transposed_back = df_transposed.T
# Write
df.to_csv(args.output_file, index=False)
df_transposed_back.to_csv(args.output_file, index=False)
# Close the input and output files before exiting
args.input_file.close()


@ -1,20 +1,18 @@
# SPDX-License-Identifier: GPL-3.0-only
import os
import logging
import re
from datetime import datetime, timedelta
from datetime import datetime
import country_converter as coco
import pandas as pd
import requests
import requests_cache
import spdx_license_list
from colorama import Fore
from pycountry import languages
from stdnum import isbn as stdnum_isbn
from stdnum import issn as stdnum_issn
from csv_metadata_quality.util import is_mojibake
from csv_metadata_quality.util import is_mojibake, load_spdx_licenses
def issn(field):
@ -33,7 +31,6 @@ def issn(field):
# Try to split multi-value field on "||" separator
for value in field.split("||"):
if not stdnum_issn.is_valid(value):
print(f"{Fore.RED}Invalid ISSN: {Fore.RESET}{value}")
@ -56,7 +53,6 @@ def isbn(field):
# Try to split multi-value field on "||" separator
for value in field.split("||"):
if not stdnum_isbn.is_valid(value):
print(f"{Fore.RED}Invalid ISBN: {Fore.RESET}{value}")
@ -137,7 +133,7 @@ def suspicious_characters(field, field_name):
return
# List of suspicious characters, for example: ́ˆ~`
suspicious_characters = ["\u00B4", "\u02C6", "\u007E", "\u0060"]
suspicious_characters = ["\u00b4", "\u02c6", "\u007e", "\u0060"]
for character in suspicious_characters:
# Find the position of the suspicious character in the string
@ -173,7 +169,6 @@ def language(field):
# Try to split multi-value field on "||" separator
for value in field.split("||"):
# After splitting, check if language value is 2 or 3 characters so we
# can check it against ISO 639-1 or ISO 639-3 accordingly.
if len(value) == 2:
@ -188,7 +183,7 @@ def language(field):
return
def agrovoc(field, field_name):
def agrovoc(field, field_name, drop):
"""Check subject terms against AGROVOC REST API.
Function constructor expects the field as well as the field name because
@ -206,22 +201,12 @@ def agrovoc(field, field_name):
if pd.isna(field):
return
# enable transparent request cache with thirty days expiry
expire_after = timedelta(days=30)
# Allow overriding the location of the requests cache, just in case we are
# running in an environment where we can't write to the current working di-
# rectory (for example from csv-metadata-quality-web).
REQUESTS_CACHE_DIR = os.environ.get("REQUESTS_CACHE_DIR", ".")
requests_cache.install_cache(
f"{REQUESTS_CACHE_DIR}/agrovoc-response-cache", expire_after=expire_after
)
# prune old cache entries
requests_cache.remove_expired_responses()
# Initialize an empty list to hold the validated AGROVOC values
values = []
# Try to split multi-value field on "||" separator
for value in field.split("||"):
request_url = "http://agrovoc.uniroma2.it/agrovoc/rest/v1/agrovoc/search"
request_url = "https://agrovoc.uniroma2.it/agrovoc/rest/v1/agrovoc/search"
request_params = {"query": value}
request = requests.get(request_url, params=request_params)
@ -231,9 +216,25 @@ def agrovoc(field, field_name):
# check if there are any results
if len(data["results"]) == 0:
print(f"{Fore.RED}Invalid AGROVOC ({field_name}): {Fore.RESET}{value}")
if drop:
print(
f"{Fore.GREEN}Dropping invalid AGROVOC ({field_name}): {Fore.RESET}{value}"
)
else:
print(
f"{Fore.RED}Invalid AGROVOC ({field_name}): {Fore.RESET}{value}"
)
return
# value is invalid AGROVOC, but we are not dropping
values.append(value)
else:
# value is valid AGROVOC so save it
values.append(value)
# Create a new field consisting of all values joined with "||"
new_field = "||".join(values)
return new_field
def filename_extension(field):
@ -267,6 +268,11 @@ def filename_extension(field):
# Iterate over all values
for value in values:
# Strip filename descriptions that are meant for SAF Bundler, for
# example: Annual_Report_2020.pdf__description:Report
if "__description" in value:
value = value.split("__")[0]
# Assume filename extension does not match
filename_extension_match = False
@ -293,13 +299,26 @@ def spdx_license_identifier(field):
Prints the value if it is invalid.
"""
# List of common non-SPDX licenses to ignore
# See: https://ilri.github.io/cgspace-submission-guidelines/dcterms-license/dcterms-license.txt
ignore_licenses = {
"All rights reserved; no re-use allowed",
"All rights reserved; self-archive copy only",
"Copyrighted; Non-commercial educational use only",
"Copyrighted; Non-commercial use only",
"Copyrighted; all rights reserved",
"Other",
}
# Skip fields with missing values
if pd.isna(field):
if pd.isna(field) or field in ignore_licenses:
return
spdx_licenses = load_spdx_licenses()
# Try to split multi-value field on "||" separator
for value in field.split("||"):
if value not in spdx_license_list.LICENSES:
if value not in spdx_licenses:
print(f"{Fore.YELLOW}Non-SPDX license identifier: {Fore.RESET}{value}")
return
@ -339,7 +358,7 @@ def duplicate_items(df):
if items_count_unique < items_count_total:
# Create a list to hold our items while we check for duplicates
items = list()
items = []
for index, row in df.iterrows():
item_title_type_date = f"{row[title_column_name]}{row[type_column_name]}{row[date_column_name]}"
@ -371,13 +390,20 @@ def mojibake(field, field_name):
return
def citation_doi(row):
def citation_doi(row, exclude):
"""Check for the scenario where an item has a DOI listed in its citation,
but does not have a cg.identifier.doi field.
Function prints a warning if the DOI field is missing, but there is a DOI
in the citation.
"""
# Check if the user requested us to skip any DOI fields so we can
# just return before going any further.
for field in exclude:
match = re.match(r"^.*?doi.*$", field)
if match is not None:
return
# Initialize some variables at global scope so that we can set them in the
# loop scope below and still be able to access them afterwards.
citation = ""
@ -395,9 +421,10 @@ def citation_doi(row):
if match is not None:
return
# Get the name of the citation field
# Check if the current label is a citation field and make sure the user
# hasn't asked to skip it. If not, then set the citation.
match = re.match(r"^.*?[cC]itation.*$", label)
if match is not None:
if match is not None and label not in exclude:
citation = row[label]
if citation != "":
@ -413,7 +440,7 @@ def citation_doi(row):
return
def title_in_citation(row):
def title_in_citation(row, exclude):
"""Check for the scenario where an item's title is missing from its cita-
tion. This could mean that it is missing entirely, or perhaps just exists
in a different format (whitespace, accents, etc).
@ -435,12 +462,12 @@ def title_in_citation(row):
# Find the name of the title column
match = re.match(r"^(dc|dcterms)\.title.*$", label)
if match is not None:
if match is not None and label not in exclude:
title = row[label]
# Find the name of the citation column
match = re.match(r"^.*?[cC]itation.*$", label)
if match is not None:
if match is not None and label not in exclude:
citation = row[label]
if citation != "":
@ -450,7 +477,7 @@ def title_in_citation(row):
return
def countries_match_regions(row):
def countries_match_regions(row, exclude):
"""Check for the scenario where an item has country coverage metadata, but
does not have the corresponding region metadata. For example, an item that
has country coverage "Kenya" should also have region "Eastern Africa" acc-
@ -466,6 +493,15 @@ def countries_match_regions(row):
region_column_name = ""
title_column_name = ""
# Instantiate a CountryConverter() object here. According to the docs it is
# more performant to do that as opposed to calling coco.convert() directly
# because we don't need to re-load the country data with each iteration.
cc = coco.CountryConverter()
# Set logging to ERROR so country_converter's convert() doesn't print the
# "not found in regex" warning message to the screen.
logging.basicConfig(level=logging.ERROR)
# Iterate over the labels of the current row's values to get the names of
# the country, region, and title columns.
@ -475,9 +511,9 @@ def countries_match_regions(row):
if match is not None:
country_column_name = label
# Find the name of the region column
# Find the name of the region column, but make sure it's not subregion!
match = re.match(r"^.*?region.*$", label)
if match is not None:
if match is not None and "sub" not in label:
region_column_name = label
# Find the name of the title column
@ -485,6 +521,12 @@ def countries_match_regions(row):
if match is not None:
title_column_name = label
# Make sure the user has not asked to exclude any metadata fields. If so, we
# should return immediately.
column_names = [country_column_name, region_column_name, title_column_name]
if any(field in column_names for field in exclude):
return
# Make sure we found the country and region columns
if country_column_name != "" and region_column_name != "":
# If we don't have any countries then we should return early before
@ -497,25 +539,22 @@ def countries_match_regions(row):
if row[region_column_name] is not None:
regions = row[region_column_name].split("||")
else:
regions = list()
# An empty list for our regions so we can keep track for all countries
missing_regions = list()
regions = []
for country in countries:
# Look up the UN M.49 regions for this country code. CoCo seems to
# only list the direct region, ie Western Africa, rather than all
# the parent regions ("Sub-Saharan Africa", "Africa", "World")
un_region = coco.convert(names=country, to="UNRegion")
un_region = cc.convert(names=country, to="UNRegion")
if un_region not in regions:
if un_region not in missing_regions:
missing_regions.append(un_region)
if len(missing_regions) > 0:
for missing_region in missing_regions:
print(
f"{Fore.YELLOW}Missing region ({missing_region}): {Fore.RESET}{row[title_column_name]}"
)
if un_region != "not found" and un_region not in regions:
try:
print(
f"{Fore.YELLOW}Missing region ({country} → {un_region}): {Fore.RESET}{row[title_column_name]}"
)
except KeyError:
print(
f"{Fore.YELLOW}Missing region ({country} → {un_region}): {Fore.RESET}<title field not present>"
)
return

File diff suppressed because it is too large


@ -2,13 +2,13 @@
import re
import langid
import pandas as pd
import py3langid as langid
from colorama import Fore
from pycountry import languages
def correct_language(row):
def correct_language(row, exclude):
"""Analyze the text used in the title, abstract, and citation fields to pre-
dict the language being used and compare it with the item's dc.language.iso
field.
@ -20,7 +20,7 @@ def correct_language(row):
# Initialize some variables at global scope so that we can set them in the
# loop scope below and still be able to access them afterwards.
language = ""
sample_strings = list()
sample_strings = []
title = None
# Iterate over the labels of the current row's values. Before we transposed
@ -39,7 +39,8 @@ def correct_language(row):
language = row[label]
# Extract title if it is present
# Extract title if it is present (note that we don't allow excluding
# the title here because it complicates things).
match = re.match(r"^.*?title.*$", label)
if match is not None:
title = row[label]
@ -48,12 +49,12 @@ def correct_language(row):
# Extract abstract if it is present
match = re.match(r"^.*?abstract.*$", label)
if match is not None:
if match is not None and label not in exclude:
sample_strings.append(row[label])
# Extract citation if it is present
match = re.match(r"^.*?[cC]itation.*$", label)
if match is not None:
if match is not None and label not in exclude:
sample_strings.append(row[label])
# Make sure language is not blank and is valid ISO 639-1/639-3 before proceeding with language prediction


@ -1,11 +1,13 @@
# SPDX-License-Identifier: GPL-3.0-only
import logging
import re
from unicodedata import normalize
import country_converter as coco
import pandas as pd
from colorama import Fore
from ftfy import fix_text
from ftfy import TextFixerConfig, fix_text
from csv_metadata_quality.util import is_mojibake, is_nfc
@ -21,7 +23,7 @@ def whitespace(field, field_name):
return
# Initialize an empty list to hold the cleaned values
values = list()
values = []
# Try to split multi-value field on "||" separator
for value in field.split("||"):
@ -62,7 +64,7 @@ def separators(field, field_name):
return
# Initialize an empty list to hold the cleaned values
values = list()
values = []
# Try to split multi-value field on "||" separator
for value in field.split("||"):
@ -104,6 +106,7 @@ def unnecessary_unicode(field):
Replaces unnecessary Unicode characters like:
- Soft hyphen (U+00AD) → hyphen
- No-break space (U+00A0) → space
- Thin space (U+2009) → space
Return string with characters removed or replaced.
"""
@ -148,6 +151,16 @@ def unnecessary_unicode(field):
)
field = re.sub(pattern, "-", field)
# Check for thin spaces (U+2009)
pattern = re.compile(r"\u2009")
match = re.findall(pattern, field)
if match:
print(
f"{Fore.GREEN}Replacing unnecessary Unicode (U+2009): {Fore.RESET}{field}"
)
field = re.sub(pattern, " ", field)
return field
@ -162,7 +175,7 @@ def duplicates(field, field_name):
values = field.split("||")
# Initialize an empty list to hold the de-duplicated values
new_values = list()
new_values = []
# Iterate over all values
for value in values:
@ -269,9 +282,201 @@ def mojibake(field, field_name):
if pd.isna(field):
return field
# We don't want ftfy to change “smart quotes” to "ASCII quotes"
config = TextFixerConfig(uncurl_quotes=False)
if is_mojibake(field):
print(f"{Fore.GREEN}Fixing encoding issue ({field_name}): {Fore.RESET}{field}")
return fix_text(field)
return fix_text(field, config)
else:
return field
def countries_match_regions(row, exclude):
"""Check for the scenario where an item has country coverage metadata, but
does not have the corresponding region metadata. For example, an item that
has country coverage "Kenya" should also have region "Eastern Africa" acc-
ording to the UN M.49 classification scheme.
See: https://unstats.un.org/unsd/methodology/m49/
Return fixed string.
"""
# Initialize some variables at global scope so that we can set them in the
# loop scope below and still be able to access them afterwards.
country_column_name = ""
region_column_name = ""
title_column_name = ""
# Instantiate a CountryConverter() object here. According to the docs it is
# more performant to do that as opposed to calling coco.convert() directly
# because we don't need to re-load the country data with each iteration.
cc = coco.CountryConverter()
# Set logging to ERROR so country_converter's convert() doesn't print the
# "not found in regex" warning message to the screen.
logging.basicConfig(level=logging.ERROR)
# Iterate over the labels of the current row's values to get the names of
# the country, region, and title columns.
for label in row.axes[0]:
# Find the name of the country column
match = re.match(r"^.*?country.*$", label)
if match is not None:
country_column_name = label
# Find the name of the region column, but make sure it's not subregion!
match = re.match(r"^.*?region.*$", label)
if match is not None and "sub" not in label:
region_column_name = label
# Find the name of the title column
match = re.match(r"^(dc|dcterms)\.title.*$", label)
if match is not None:
title_column_name = label
# Make sure the user has not asked to exclude any metadata fields. If so, we
# should return immediately.
column_names = [country_column_name, region_column_name, title_column_name]
if any(field in column_names for field in exclude):
return row
# Make sure we found the country and region columns
if country_column_name != "" and region_column_name != "":
# If we don't have any countries then we should return early before
# suggesting regions.
if row[country_column_name] is not None:
countries = row[country_column_name].split("||")
else:
return row
if row[region_column_name] is not None:
regions = row[region_column_name].split("||")
else:
regions = []
# An empty list for our regions so we can keep track for all countries
missing_regions = []
for country in countries:
# Look up the UN M.49 regions for this country code. CoCo seems to
# only list the direct region, ie Western Africa, rather than all
# the parent regions ("Sub-Saharan Africa", "Africa", "World")
un_region = cc.convert(names=country, to="UNRegion")
# Add the new un_region to regions if it is not "not found" and if
# it doesn't already exist in regions.
if un_region != "not found" and un_region not in regions:
if un_region not in missing_regions:
try:
print(
f"{Fore.YELLOW}Adding missing region ({un_region}): {Fore.RESET}{row[title_column_name]}"
)
except KeyError:
# If there is no title column in the CSV we will print
# the fix without the title instead of crashing.
print(
f"{Fore.YELLOW}Adding missing region ({un_region}): {Fore.RESET}<title field not present>"
)
missing_regions.append(un_region)
if len(missing_regions) > 0:
# Add the missing regions back to the row, paying attention to whether
# or not the row's region column is None (aka null) or just an empty
# string (length would be 0).
if row[region_column_name] is not None and len(row[region_column_name]) > 0:
row[region_column_name] = (
row[region_column_name] + "||" + "||".join(missing_regions)
)
else:
row[region_column_name] = "||".join(missing_regions)
return row
def normalize_dois(field):
"""Normalize DOIs.
DOIs are meant to be globally unique identifiers. They are case insensitive,
but in order to compare them robustly they should be normalized to a common
format:
- strip leading and trailing whitespace
- lowercase all ASCII characters
- convert all variations to https://doi.org/10.xxxx/xxxx URI format
Return string with normalized DOI.
See: https://www.crossref.org/documentation/member-setup/constructing-your-dois/
"""
# Skip fields with missing values
if pd.isna(field):
return
# Try to split multi-value field on "||" separator
values = field.split("||")
# Initialize an empty list to hold the de-duplicated values
new_values = []
# Iterate over all values (most items will only have one DOI)
for value in values:
# Strip leading and trailing whitespace
new_value = value.strip()
new_value = new_value.lower()
# Convert to HTTPS
pattern = re.compile(r"^http://")
match = re.findall(pattern, new_value)
if match:
new_value = re.sub(pattern, "https://", new_value)
# Convert dx.doi.org to doi.org
pattern = re.compile(r"dx\.doi\.org")
match = re.findall(pattern, new_value)
if match:
new_value = re.sub(pattern, "doi.org", new_value)
# Convert www.doi.org to doi.org
pattern = re.compile(r"www\.doi\.org")
match = re.findall(pattern, new_value)
if match:
new_value = re.sub(pattern, "doi.org", new_value)
# Convert erroneous %2f to /
pattern = re.compile("%2f")
match = re.findall(pattern, new_value)
if match:
new_value = re.sub(pattern, "/", new_value)
# Replace values like doi: 10.11648/j.jps.20140201.14
pattern = re.compile(r"^doi: 10\.")
match = re.findall(pattern, new_value)
if match:
new_value = re.sub(pattern, "https://doi.org/10.", new_value)
# Replace values like 10.3390/foods12010115
pattern = re.compile(r"^10\.")
match = re.findall(pattern, new_value)
if match:
new_value = re.sub(pattern, "https://doi.org/10.", new_value)
if new_value != value:
print(f"{Fore.GREEN}Normalized DOI: {Fore.RESET}{value}")
new_values.append(new_value)
new_field = "||".join(new_values)
return new_field


@ -1,6 +1,10 @@
# SPDX-License-Identifier: GPL-3.0-only
from ftfy.badness import sequence_weirdness
import json
import os
from ftfy.badness import is_bad
def is_nfc(field):
@ -38,7 +42,7 @@ def is_mojibake(field):
Return boolean.
"""
if not sequence_weirdness(field):
if not is_bad(field):
# Nothing weird, should be okay
return False
try:
@ -49,3 +53,13 @@ def is_mojibake(field):
else:
# Encodable as CP-1252, Mojibake alert level high
return True
def load_spdx_licenses():
"""Returns a Python list of SPDX short license identifiers."""
with open(os.path.join(os.path.dirname(__file__), "data/licenses.json")) as f:
licenses = json.load(f)
# List comprehension to extract the license ID for each license
return [license["licenseId"] for license in licenses["licenses"]]


@ -1,3 +1,3 @@
# SPDX-License-Identifier: GPL-3.0-only
VERSION = "0.5.0"
VERSION = "0.7.0"

17
data/abstract-check.csv Normal file

@ -0,0 +1,17 @@
id,dc.title,dcterms.abstract
1,Normal item,This is an abstract
2,Leading whitespace, This is an abstract
3,Trailing whitespace,This is an abstract
4,Consecutive whitespace,This is an abstract
5,Newline,"This
is an abstract"
6,Newline with leading whitespace," This
is an abstract"
7,Newline with trailing whitespace,"This
is an abstract "
8,Newline with consecutive whitespace,"This
is an abstract"
9,Multiple newlines,"This
is
an
abstract"

13
data/test-geography.csv Normal file

@ -0,0 +1,13 @@
dc.title,dcterms.issued,dcterms.type,dc.contributor.author,cg.coverage.country,cg.coverage.region
No country,2022-09-01,Report,"Orth, Alan",,
Matching country and region,2022-09-01,Report,"Orth, Alan",Kenya,Eastern Africa
Missing region,2022-09-01,Report,"Orth, Alan",Kenya,
Caribbean country with matching region,2022-09-01,Report,"Orth, Alan",Bahamas,Caribbean
Caribbean country with no region,2022-09-01,Report,"Orth, Alan",Bahamas,
Fake country with no region,2022-09-01,Report,"Orth, Alan",Yeah Baby,
SE Asian country with matching region,2022-09-01,Report,"Orth, Alan",Cambodia,South-eastern Asia
SE Asian country with no region,2022-09-01,Report,"Orth, Alan",Cambodia,
Duplicate countries with matching region,2022-09-01,Report,"Orth, Alan",Kenya||Kenya,Eastern Africa
Duplicate countries with missing regions,2022-09-01,Report,"Orth, Alan",Kenya||Kenya,
Multiple countries with no regions,2022-09-01,Report,"Orth, Alan",Kenya||Bahamas,
Multiple countries with mixed matching regions,2022-09-01,Report,"Orth, Alan",Kenya||Bahamas,Eastern Africa


@ -1,38 +1,43 @@
dc.title,dcterms.issued,dc.identifier.issn,dc.identifier.isbn,dcterms.language,dcterms.subject,cg.coverage.country,filename,dcterms.license,dcterms.type,dcterms.bibliographicCitation,cg.identifier.doi,cg.coverage.region
Leading space,2019-07-29,,,,,,,,,,,
Trailing space ,2019-07-29,,,,,,,,,,,
Excessive space,2019-07-29,,,,,,,,,,,
Miscellaenous ||whitespace | issues ,2019-07-29,,,,,,,,,,,
Duplicate||Duplicate,2019-07-29,,,,,,,,,,,
Invalid ISSN,2019-07-29,2321-2302,,,,,,,,,,
Invalid ISBN,2019-07-29,,978-0-306-40615-6,,,,,,,,,
Multiple valid ISSNs,2019-07-29,0378-5955||0024-9319,,,,,,,,,,
Multiple valid ISBNs,2019-07-29,,99921-58-10-7||978-0-306-40615-7,,,,,,,,,
Invalid date,2019-07-260,,,,,,,,,,,
Multiple dates,2019-07-26||2019-01-10,,,,,,,,,,,
Invalid multi-value separator,2019-07-29,0378-5955|0024-9319,,,,,,,,,,
Unnecessary Unicode,2019-07-29,,,,,,,,,,,
Suspicious character||foreˆt,2019-07-29,,,,,,,,,,,
Invalid ISO 639-1 (alpha 2) language,2019-07-29,,,jp,,,,,,,,
Invalid ISO 639-3 (alpha 3) language,2019-07-29,,,chi,,,,,,,,
Invalid language,2019-07-29,,,Span,,,,,,,,
Invalid AGROVOC subject,2019-07-29,,,,FOREST,,,,,,,
dc.title,dcterms.issued,dc.identifier.issn,dc.identifier.isbn,dcterms.language,dcterms.subject,cg.coverage.country,filename,dcterms.license,dcterms.type,dcterms.bibliographicCitation,cg.identifier.doi,cg.coverage.region,cg.coverage.subregion
Leading space,2019-07-29,,,,,,,,,,,,
Trailing space ,2019-07-29,,,,,,,,,,,,
Excessive space,2019-07-29,,,,,,,,,,,,
Miscellaenous ||whitespace | issues ,2019-07-29,,,,,,,,,,,,
Duplicate||Duplicate,2019-07-29,,,,,,,,,,,,
Invalid ISSN,2019-07-29,2321-2302,,,,,,,,,,,
Invalid ISBN,2019-07-29,,978-0-306-40615-6,,,,,,,,,,
Multiple valid ISSNs,2019-07-29,0378-5955||0024-9319,,,,,,,,,,,
Multiple valid ISBNs,2019-07-29,,99921-58-10-7||978-0-306-40615-7,,,,,,,,,,
Invalid date,2019-07-260,,,,,,,,,,,,
Multiple dates,2019-07-26||2019-01-10,,,,,,,,,,,,
Invalid multi-value separator,2019-07-29,0378-5955|0024-9319,,,,,,,,,,,
Unnecessary Unicode,2019-07-29,,,,,,,,,,,,
Suspicious character||foreˆt,2019-07-29,,,,,,,,,,,,
Invalid ISO 639-1 (alpha 2) language,2019-07-29,,,jp,,,,,,,,,
Invalid ISO 639-3 (alpha 3) language,2019-07-29,,,chi,,,,,,,,,
Invalid language,2019-07-29,,,Span,,,,,,,,,
Invalid AGROVOC subject,2019-07-29,,,,LIVESTOCK||FOREST,,,,,,,,
Newline (LF),2019-07-30,,,,"TANZA
NIA",,,,,,,
Missing date,,,,,,,,,,,,
Invalid country,2019-08-01,,,,,KENYAA,,,,,,
Uncommon filename extension,2019-08-10,,,,,,file.pdf.lck,,,,,
Unneccesary unicode (U+002D + U+00AD),2019-08-10,,978-­92-­9043-­823-­6,,,,,,,,,
"Missing space,after comma",2019-08-27,,,,,,,,,,,
Incorrect ISO 639-1 language,2019-09-26,,,es,,,,,,,,
Incorrect ISO 639-3 language,2019-09-26,,,spa,,,,,,,,
Composéd Unicode,2020-01-14,,,,,,,,,,,
Decomposéd Unicode,2020-01-14,,,,,,,,,,,
Unnecessary multi-value separator,2021-01-03,0378-5955||,,,,,,,,,,
Invalid SPDX license identifier,2021-03-11,,,,,,,CC-BY,,,,
Duplicate Title,2021-03-17,,,,,,,,Report,,,
Duplicate Title,2021-03-17,,,,,,,,Report,,,
Mojibake,2021-03-18,,,,Publicaçao CIAT,,,,Report,,,
"DOI in citation, but missing cg.identifier.doi",2021-10-06,,,,,,,,,"Orth, A. 2021. DOI in citation, but missing cg.identifier.doi. doi: 10.1186/1743-422X-9-218",,
Title missing from citation,2021-12-05,,,,,,,,,"Orth, A. 2021. Title missing f rom citation.",,
Country missing region,2021-12-08,,,,,Kenya,,,,,,
NIA",,,,,,,,
Missing date,,,,,,,,,,,,,
Invalid country,2019-08-01,,,,,KENYAA,,,,,,,
Uncommon filename extension,2019-08-10,,,,,,file.pdf.lck,,,,,,
Unneccesary unicode (U+002D + U+00AD),2019-08-10,,978-­92-­9043-­823-­6,,,,,,,,,,
"Missing space,after comma",2019-08-27,,,,,,,,,,,,
Incorrect ISO 639-1 language,2019-09-26,,,es,,,,,,,,,
Incorrect ISO 639-3 language,2019-09-26,,,spa,,,,,,,,,
Composéd Unicode,2020-01-14,,,,,,,,,,,,
Decomposéd Unicode,2020-01-14,,,,,,,,,,,,
Unnecessary multi-value separator,2021-01-03,0378-5955||,,,,,,,,,,,
Invalid SPDX license identifier,2021-03-11,,,,,,,CC-BY,,,,,
Duplicate Title,2021-03-17,,,,,,,,Report,,,,
Duplicate Title,2021-03-17,,,,,,,,Report,,,,
Mojibake,2021-03-18,,,,Publicaçao CIAT,,,,Report,,,,
"DOI in citation, but missing cg.identifier.doi",2021-10-06,,,,,,,,,"Orth, A. 2021. DOI in citation, but missing cg.identifier.doi. doi: 10.1186/1743-422X-9-218",,,
Title missing from citation,2021-12-05,,,,,,,,,"Orth, A. 2021. Title missing f rom citation.",,,
Country missing region,2021-12-08,,,,,Kenya,,,,,,,
Subregion field shouldnt trigger region checks,2022-12-07,,,,,Kenya,,,,,,Eastern Africa,Baringo
DOI with HTTP and dx.doi.org,2024-04-23,,,,,,,,,,http://dx.doi.org/10.1016/j.envc.2023.100794,,
DOI with colon,2024-04-23,,,,,,,,,,doi: 10.11648/j.jps.20140201.14,,
Upper case bare DOI,2024-04-23,,,,,,,,,,10.19103/AS.2018.0043.16,,
DOI with %2f,2024-06-25,,,,,,,,,,https://doi.org/10.1016%2fj.envc.2023.100794,,


1431
poetry.lock generated

File diff suppressed because it is too large


@ -1,39 +1,61 @@
[tool.poetry]
[project]
name = "csv-metadata-quality"
version = "0.5.0"
version = "0.7.0"
description="A simple, but opinionated CSV quality checking and fixing pipeline for CSVs in the DSpace ecosystem."
authors = ["Alan Orth <alan.orth@gmail.com>"]
license="GPL-3.0-only"
authors = [
{ name = "Alan Orth", email = "alan.orth@gmail.com" }
]
license= { file = "LICENSE.txt" }
dependencies = [
"pandas[feather,performance]~=2.2.3",
"python-stdnum~=1.20",
"requests~=2.32.3",
"requests-cache~=1.2.1",
"colorama~=0.4",
"ftfy~=6.3.0",
"country-converter~=1.3",
"pycountry~=24.6.1",
"py3langid~=0.3",
]
readme = "README.md"
requires-python = ">= 3.10"
classifiers = [
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Natural Language :: English",
"Operating System :: OS Independent",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: Implementation :: CPython",
]
[project.urls]
repository = "https://github.com/ilri/csv-metadata-quality"
homepage = "https://github.com/ilri/csv-metadata-quality"
[tool.poetry.scripts]
[project.scripts]
csv-metadata-quality = 'csv_metadata_quality.__main__:main'
[tool.poetry.dependencies]
python = "^3.7.1"
pandas = "^1.0.4"
python-stdnum = "^1.13"
xlrd = "^1.2.0"
requests = "^2.23.0"
requests-cache = "~0.6.4"
pycountry = "^19.8.18"
langid = "^1.1.6"
colorama = "^0.4.4"
spdx-license-list = "^0.5.2"
ftfy = "^5.9"
SQLAlchemy = ">=1.3.3,<1.4.23"
country-converter = "^0.7.4"
[tool.poetry.dev-dependencies]
pytest = "^6.1.1"
ipython = { version = "^7.18.1", python = "^3.7" }
flake8 = "^3.8.4"
pytest-clarity = "^1.0.1"
black = "^21.6b0"
isort = "^5.5.4"
csvkit = "^1.0.5"
# So uv doesn't fall back to setuptools
# See: https://packaging.python.org/en/latest/tutorials/packaging-projects/#choosing-build-backend
[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
requires = ["hatchling"]
build-backend = "hatchling.build"
[dependency-groups]
dev = [
"pytest~=8.3",
"isort~=6.0",
"csvkit~=2.0",
"ipython~=8.31",
]
# So hatch doesn't try to build other top-level directories like "data"
[tool.hatch.build.targets.wheel]
packages = ["csv_metadata_quality"]
[tool.isort]
profile = "black"
line_length=88

9
renovate.json Normal file

@ -0,0 +1,9 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"config:recommended"
],
"pep621": {
"enabled": false
}
}


@ -1,82 +0,0 @@
agate-dbf==0.2.2
agate-excel==0.2.5
agate-sql==0.5.8
agate==1.6.3
appnope==0.1.2; python_version >= "3.7" and python_version < "4.0" and sys_platform == "darwin"
atomicwrites==1.4.0; python_version >= "3.6" and python_full_version < "3.0.0" and sys_platform == "win32" and (python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6") or sys_platform == "win32" and python_version >= "3.6" and python_full_version >= "3.4.0" and (python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6")
attrs==21.2.0; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6"
babel==2.9.1; python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.4.0"
backcall==0.2.0; python_version >= "3.7" and python_version < "4.0"
black==21.12b0; python_full_version >= "3.6.2"
certifi==2021.10.8; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6"
charset-normalizer==2.0.9; python_full_version >= "3.6.0" and python_version >= "3.6"
click==8.0.3; python_version >= "3.6" and python_full_version >= "3.6.2"
colorama==0.4.4; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.5.0")
commonmark==0.9.1; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and (python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.4.0")
country-converter==0.7.4
csvkit==1.0.6
dbfread==2.0.7
decorator==5.1.0; python_version >= "3.7" and python_version < "4.0"
et-xmlfile==1.1.0; python_version >= "3.6"
flake8==3.9.2; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.5.0")
ftfy==5.9; python_version >= "3.5"
future==0.18.2; python_version >= "2.6" and python_full_version < "3.0.0" or python_full_version >= "3.3.0"
greenlet==1.1.2; python_version >= "3" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3"
idna==3.3; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6"
importlib-metadata==4.8.2; python_full_version >= "3.6.2" and python_version < "3.8" and python_version >= "3.6" and (python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6") and (python_version >= "3.6" and python_full_version < "3.0.0" and python_version < "3.8" or python_full_version >= "3.6.0" and python_version < "3.8" and python_version >= "3.6")
iniconfig==1.1.1; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
ipython==7.30.1; python_version >= "3.7" and python_version < "4.0"
isodate==0.6.0
isort==5.10.1; python_full_version >= "3.6.1" and python_version < "4.0"
itsdangerous==2.0.1; python_version >= "3.6"
jedi==0.18.1; python_version >= "3.7" and python_version < "4.0"
langid==1.1.6
leather==0.3.4
matplotlib-inline==0.1.3; python_version >= "3.7" and python_version < "4.0"
mccabe==0.6.1; python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.5.0"
mypy-extensions==0.4.3; python_full_version >= "3.6.2"
numpy==1.21.1
olefile==0.46; python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.4.0"
openpyxl==3.0.9; python_version >= "3.6"
packaging==21.3; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
pandas==1.3.4; python_full_version >= "3.7.1"
parsedatetime==2.4
parso==0.8.3; python_version >= "3.7" and python_version < "4.0"
pathspec==0.9.0; python_full_version >= "3.6.2"
pexpect==4.8.0; python_version >= "3.7" and python_version < "4.0" and sys_platform != "win32"
pickleshare==0.7.5; python_version >= "3.7" and python_version < "4.0"
platformdirs==2.4.0; python_version >= "3.6" and python_full_version >= "3.6.2"
pluggy==1.0.0; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
pprintpp==0.4.0; python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.4.0"
prompt-toolkit==3.0.23; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
ptyprocess==0.7.0; python_version >= "3.7" and python_version < "4.0" and sys_platform != "win32"
py==1.11.0; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6"
pycodestyle==2.7.0; python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.5.0"
pycountry==19.8.18
pyflakes==2.3.1; python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.5.0"
pygments==2.10.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2" and python_full_version < "4.0.0" and (python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.4.0")
pyparsing==3.0.6; python_version >= "3.6"
pytest-clarity==1.0.1; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.4.0")
pytest==6.2.5; python_version >= "3.6"
python-dateutil==2.8.2; python_full_version >= "3.7.1"
python-slugify==5.0.2; python_version >= "3.6"
python-stdnum==1.17
pytimeparse==1.1.8
pytz==2021.3; python_full_version >= "3.7.1"
requests-cache==0.6.4; python_version >= "3.6"
requests==2.26.0; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.6.0")
rich==10.15.2; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and (python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.4.0")
six==1.16.0; python_full_version >= "3.7.1" and python_version >= "3.6" and (python_version >= "2.7" and python_full_version < "3.0.0" or python_full_version >= "3.3.0")
spdx-license-list==0.5.2
sqlalchemy==1.4.22; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.6.0")
text-unidecode==1.3; python_version >= "3.6"
toml==0.10.2; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
tomli==1.2.2; python_version >= "3.6" and python_full_version >= "3.6.2"
traitlets==5.1.1; python_version >= "3.7" and python_version < "4.0"
typed-ast==1.5.1; python_version < "3.8" and implementation_name == "cpython" and python_full_version >= "3.6.2" and python_version >= "3.6"
typing-extensions==4.0.1
url-normalize==1.4.3; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6"
urllib3==1.26.7; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version < "4" and python_version >= "3.6"
wcwidth==0.2.5; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
xlrd==1.2.0; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.4.0")
zipp==3.6.0; python_version >= "3.6" and python_full_version < "3.0.0" and python_version < "3.8" or python_full_version >= "3.6.0" and python_version < "3.8" and python_version >= "3.6"


@ -1,201 +1,356 @@
certifi==2021.10.8; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6" \
--hash=sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569 \
--hash=sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872
charset-normalizer==2.0.9; python_full_version >= "3.6.0" and python_version >= "3.6" \
--hash=sha256:b0b883e8e874edfdece9c28f314e3dd5badf067342e42fb162203335ae61aa2c \
--hash=sha256:1eecaa09422db5be9e29d7fc65664e6c33bd06f9ced7838578ba40d58bdf3721
colorama==0.4.4; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.5.0") \
--hash=sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2 \
--hash=sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b
country-converter==0.7.4 \
--hash=sha256:0291cc91c4a4efe7f128a11c8c6e4cb761f7fea7cde2517f8677c7c56da334d3
ftfy==5.9; python_version >= "3.5" \
--hash=sha256:8c4fb2863c0b82eae2ab3cf353d9ade268dfbde863d322f78d6a9fd5cefb31e9
greenlet==1.1.2; python_version >= "3" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3" \
--hash=sha256:58df5c2a0e293bf665a51f8a100d3e9956febfbf1d9aaf8c0677cf70218910c6 \
--hash=sha256:aec52725173bd3a7b56fe91bc56eccb26fbdff1386ef123abb63c84c5b43b63a \
--hash=sha256:833e1551925ed51e6b44c800e71e77dacd7e49181fdc9ac9a0bf3714d515785d \
--hash=sha256:aa5b467f15e78b82257319aebc78dd2915e4c1436c3c0d1ad6f53e47ba6e2713 \
--hash=sha256:40b951f601af999a8bf2ce8c71e8aaa4e8c6f78ff8afae7b808aae2dc50d4c40 \
--hash=sha256:95e69877983ea39b7303570fa6760f81a3eec23d0e3ab2021b7144b94d06202d \
--hash=sha256:356b3576ad078c89a6107caa9c50cc14e98e3a6c4874a37c3e0273e4baf33de8 \
--hash=sha256:8639cadfda96737427330a094476d4c7a56ac03de7265622fcf4cfe57c8ae18d \
--hash=sha256:97e5306482182170ade15c4b0d8386ded995a07d7cc2ca8f27958d34d6736497 \
--hash=sha256:e6a36bb9474218c7a5b27ae476035497a6990e21d04c279884eb10d9b290f1b1 \
--hash=sha256:abb7a75ed8b968f3061327c433a0fbd17b729947b400747c334a9c29a9af6c58 \
--hash=sha256:14d4f3cd4e8b524ae9b8aa567858beed70c392fdec26dbdb0a8a418392e71708 \
--hash=sha256:17ff94e7a83aa8671a25bf5b59326ec26da379ace2ebc4411d690d80a7fbcf23 \
--hash=sha256:9f3cba480d3deb69f6ee2c1825060177a22c7826431458c697df88e6aeb3caee \
--hash=sha256:fa877ca7f6b48054f847b61d6fa7bed5cebb663ebc55e018fda12db09dcc664c \
--hash=sha256:7cbd7574ce8e138bda9df4efc6bf2ab8572c9aff640d8ecfece1b006b68da963 \
--hash=sha256:903bbd302a2378f984aef528f76d4c9b1748f318fe1294961c072bdc7f2ffa3e \
--hash=sha256:049fe7579230e44daef03a259faa24511d10ebfa44f69411d99e6a184fe68073 \
--hash=sha256:dd0b1e9e891f69e7675ba5c92e28b90eaa045f6ab134ffe70b52e948aa175b3c \
--hash=sha256:7418b6bfc7fe3331541b84bb2141c9baf1ec7132a7ecd9f375912eca810e714e \
--hash=sha256:f9d29ca8a77117315101425ec7ec2a47a22ccf59f5593378fc4077ac5b754fce \
--hash=sha256:21915eb821a6b3d9d8eefdaf57d6c345b970ad722f856cd71739493ce003ad08 \
--hash=sha256:eff9d20417ff9dcb0d25e2defc2574d10b491bf2e693b4e491914738b7908168 \
--hash=sha256:32ca72bbc673adbcfecb935bb3fb1b74e663d10a4b241aaa2f5a75fe1d1f90aa \
--hash=sha256:f0214eb2a23b85528310dad848ad2ac58e735612929c8072f6093f3585fd342d \
--hash=sha256:b92e29e58bef6d9cfd340c72b04d74c4b4e9f70c9fa7c78b674d1fec18896dc4 \
--hash=sha256:fdcec0b8399108577ec290f55551d926d9a1fa6cad45882093a7a07ac5ec147b \
--hash=sha256:93f81b134a165cc17123626ab8da2e30c0455441d4ab5576eed73a64c025b25c \
--hash=sha256:1e12bdc622676ce47ae9abbf455c189e442afdde8818d9da983085df6312e7a1 \
--hash=sha256:8c790abda465726cfb8bb08bd4ca9a5d0a7bd77c7ac1ca1b839ad823b948ea28 \
--hash=sha256:f276df9830dba7a333544bd41070e8175762a7ac20350786b322b714b0e654f5 \
--hash=sha256:64e6175c2e53195278d7388c454e0b30997573f3f4bd63697f88d855f7a6a1fc \
--hash=sha256:b11548073a2213d950c3f671aa88e6f83cda6e2fb97a8b6317b1b5b33d850e06 \
--hash=sha256:9633b3034d3d901f0a46b7939f8c4d64427dfba6bbc5a36b1a67364cf148a1b0 \
--hash=sha256:eb6ea6da4c787111adf40f697b4e58732ee0942b5d3bd8f435277643329ba627 \
--hash=sha256:f3acda1924472472ddd60c29e5b9db0cec629fbe3c5c5accb74d6d6d14773478 \
--hash=sha256:e859fcb4cbe93504ea18008d1df98dee4f7766db66c435e4882ab35cf70cac43 \
--hash=sha256:00e44c8afdbe5467e4f7b5851be223be68adb4272f44696ee71fe46b7036a711 \
--hash=sha256:ec8c433b3ab0419100bd45b47c9c8551248a5aee30ca5e9d399a0b57ac04651b \
--hash=sha256:288c6a76705dc54fba69fbcb59904ae4ad768b4c768839b8ca5fdadec6dd8cfd \
--hash=sha256:8d2f1fb53a421b410751887eb4ff21386d119ef9cde3797bf5e7ed49fb51a3b3 \
--hash=sha256:166eac03e48784a6a6e0e5f041cfebb1ab400b394db188c48b3a84737f505b67 \
--hash=sha256:572e1787d1460da79590bf44304abbc0a2da944ea64ec549188fa84d89bba7ab \
--hash=sha256:be5f425ff1f5f4b3c1e33ad64ab994eed12fc284a6ea71c5243fd564502ecbe5 \
--hash=sha256:b1692f7d6bc45e3200844be0dba153612103db241691088626a33ff1f24a0d88 \
--hash=sha256:7227b47e73dedaa513cdebb98469705ef0d66eb5a1250144468e9c3097d6b59b \
--hash=sha256:7ff61ff178250f9bb3cd89752df0f1dd0e27316a8bd1465351652b1b4a4cdfd3 \
--hash=sha256:f70a9e237bb792c7cc7e44c531fd48f5897961701cdaa06cf22fc14965c496cf \
--hash=sha256:013d61294b6cd8fe3242932c1c5e36e5d1db2c8afb58606c5a67efce62c1f5fd \
--hash=sha256:e30f5ea4ae2346e62cedde8794a56858a67b878dd79f7df76a0767e356b1744a
idna==3.3; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6" \
--hash=sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff \
--hash=sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d
importlib-metadata==4.8.2; python_version >= "3.6" and python_full_version < "3.0.0" and python_version < "3.8" or python_full_version >= "3.6.0" and python_version < "3.8" and python_version >= "3.6" \
--hash=sha256:53ccfd5c134223e497627b9815d5030edf77d2ed573922f7a0b8f8bb81a1c100 \
--hash=sha256:75bdec14c397f528724c1bfd9709d660b33a4d2e77387a3358f20b848bb5e5fb
itsdangerous==2.0.1; python_version >= "3.6" \
--hash=sha256:5174094b9637652bdb841a3029700391451bd092ba3db90600dea710ba28e97c \
--hash=sha256:9e724d68fc22902a1435351f84c3fb8623f303fffcc566a4cb952df8c572cff0
langid==1.1.6 \
--hash=sha256:044bcae1912dab85c33d8e98f2811b8f4ff1213e5e9a9e9510137b84da2cb293
numpy==1.21.1 \
--hash=sha256:38e8648f9449a549a7dfe8d8755a5979b45b3538520d1e735637ef28e8c2dc50 \
--hash=sha256:fd7d7409fa643a91d0a05c7554dd68aa9c9bb16e186f6ccfe40d6e003156e33a \
--hash=sha256:a75b4498b1e93d8b700282dc8e655b8bd559c0904b3910b144646dbbbc03e062 \
--hash=sha256:1412aa0aec3e00bc23fbb8664d76552b4efde98fb71f60737c83efbac24112f1 \
--hash=sha256:e46ceaff65609b5399163de5893d8f2a82d3c77d5e56d976c8b5fb01faa6b671 \
--hash=sha256:c6a2324085dd52f96498419ba95b5777e40b6bcbc20088fddb9e8cbb58885e8e \
--hash=sha256:73101b2a1fef16602696d133db402a7e7586654682244344b8329cdcbbb82172 \
--hash=sha256:7a708a79c9a9d26904d1cca8d383bf869edf6f8e7650d85dbc77b041e8c5a0f8 \
--hash=sha256:95b995d0c413f5d0428b3f880e8fe1660ff9396dcd1f9eedbc311f37b5652e16 \
--hash=sha256:635e6bd31c9fb3d475c8f44a089569070d10a9ef18ed13738b03049280281267 \
--hash=sha256:4a3d5fb89bfe21be2ef47c0614b9c9c707b7362386c9a3ff1feae63e0267ccb6 \
--hash=sha256:8a326af80e86d0e9ce92bcc1e65c8ff88297de4fa14ee936cb2293d414c9ec63 \
--hash=sha256:791492091744b0fe390a6ce85cc1bf5149968ac7d5f0477288f78c89b385d9af \
--hash=sha256:0318c465786c1f63ac05d7c4dbcecd4d2d7e13f0959b01b534ea1e92202235c5 \
--hash=sha256:9a513bd9c1551894ee3d31369f9b07460ef223694098cf27d399513415855b68 \
--hash=sha256:91c6f5fc58df1e0a3cc0c3a717bb3308ff850abdaa6d2d802573ee2b11f674a8 \
--hash=sha256:978010b68e17150db8765355d1ccdd450f9fc916824e8c4e35ee620590e234cd \
--hash=sha256:9749a40a5b22333467f02fe11edc98f022133ee1bfa8ab99bda5e5437b831214 \
--hash=sha256:d7a4aeac3b94af92a9373d6e77b37691b86411f9745190d2c351f410ab3a791f \
--hash=sha256:d9e7912a56108aba9b31df688a4c4f5cb0d9d3787386b87d504762b6754fbb1b \
--hash=sha256:25b40b98ebdd272bc3020935427a4530b7d60dfbe1ab9381a39147834e985eac \
--hash=sha256:8a92c5aea763d14ba9d6475803fc7904bda7decc2a0a68153f587ad82941fec1 \
--hash=sha256:05a0f648eb28bae4bcb204e6fd14603de2908de982e761a2fc78efe0f19e96e1 \
--hash=sha256:f01f28075a92eede918b965e86e8f0ba7b7797a95aa8d35e1cc8821f5fc3ad6a \
--hash=sha256:88c0b89ad1cc24a5efbb99ff9ab5db0f9a86e9cc50240177a571fbe9c2860ac2 \
--hash=sha256:01721eefe70544d548425a07c80be8377096a54118070b8a62476866d5208e33 \
--hash=sha256:2d4d1de6e6fb3d28781c73fbde702ac97f03d79e4ffd6598b880b2d95d62ead4 \
--hash=sha256:dff4af63638afcc57a3dfb9e4b26d434a7a602d225b42d746ea7fe2edf1342fd
pandas==1.3.4; python_full_version >= "3.7.1" \
--hash=sha256:9707bdc1ea9639c886b4d3be6e2a45812c1ac0c2080f94c31b71c9fa35556f9b \
--hash=sha256:c2f44425594ae85e119459bb5abb0748d76ef01d9c08583a667e3339e134218e \
--hash=sha256:372d72a3d8a5f2dbaf566a5fa5fa7f230842ac80f29a931fb4b071502cf86b9a \
--hash=sha256:d99d2350adb7b6c3f7f8f0e5dfb7d34ff8dd4bc0a53e62c445b7e43e163fce63 \
--hash=sha256:4acc28364863127bca1029fb72228e6f473bb50c32e77155e80b410e2068eeac \
--hash=sha256:c2646458e1dce44df9f71a01dc65f7e8fa4307f29e5c0f2f92c97f47a5bf22f5 \
--hash=sha256:5298a733e5bfbb761181fd4672c36d0c627320eb999c59c65156c6a90c7e1b4f \
--hash=sha256:22808afb8f96e2269dcc5b846decacb2f526dd0b47baebc63d913bf847317c8f \
--hash=sha256:b528e126c13816a4374e56b7b18bfe91f7a7f6576d1aadba5dee6a87a7f479ae \
--hash=sha256:fe48e4925455c964db914b958f6e7032d285848b7538a5e1b19aeb26ffaea3ec \
--hash=sha256:eaca36a80acaacb8183930e2e5ad7f71539a66805d6204ea88736570b2876a7b \
--hash=sha256:42493f8ae67918bf129869abea8204df899902287a7f5eaf596c8e54e0ac7ff4 \
--hash=sha256:a388960f979665b447f0847626e40f99af8cf191bce9dc571d716433130cb3a7 \
--hash=sha256:5ba0aac1397e1d7b654fccf263a4798a9e84ef749866060d19e577e927d66e1b \
--hash=sha256:f567e972dce3bbc3a8076e0b675273b4a9e8576ac629149cf8286ee13c259ae5 \
--hash=sha256:c1aa4de4919358c5ef119f6377bc5964b3a7023c23e845d9db7d9016fa0c5b1c \
--hash=sha256:dd324f8ee05925ee85de0ea3f0d66e1362e8c80799eb4eb04927d32335a3e44a \
--hash=sha256:d47750cf07dee6b55d8423471be70d627314277976ff2edd1381f02d52dbadf9 \
--hash=sha256:2d1dc09c0013d8faa7474574d61b575f9af6257ab95c93dcf33a14fd8d2c1bab \
--hash=sha256:10e10a2527db79af6e830c3d5842a4d60383b162885270f8cffc15abca4ba4a9 \
--hash=sha256:35c77609acd2e4d517da41bae0c11c70d31c87aae8dd1aabd2670906c6d2c143 \
--hash=sha256:003ba92db58b71a5f8add604a17a059f3068ef4e8c0c365b088468d0d64935fd \
--hash=sha256:a51528192755f7429c5bcc9e80832c517340317c861318fea9cea081b57c9afd \
--hash=sha256:a2aa18d3f0b7d538e21932f637fbfe8518d085238b429e4790a35e1e44a96ffc
pycountry==19.8.18 \
--hash=sha256:3c57aa40adcf293d59bebaffbe60d8c39976fba78d846a018dc0c2ec9c6cb3cb
python-dateutil==2.8.2; python_full_version >= "3.7.1" \
--hash=sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86 \
--hash=sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9
python-stdnum==1.17 \
--hash=sha256:374e2b5e13912ccdbf50b0b23fca2c3e0531174805c32d74e145f37756328340 \
--hash=sha256:a46e6cf9652807314d369b654b255c86a59f93d18be2834f3d567ed1a346c547
pytz==2021.3; python_full_version >= "3.7.1" \
--hash=sha256:3672058bc3453457b622aab7a1c3bfd5ab0bdae451512f6cf25f64ed37f5b87c \
--hash=sha256:acad2d8b20a1af07d4e4c9d2e9285c5ed9104354062f275f3fcd88dcef4f1326
requests-cache==0.6.4; python_version >= "3.6" \
--hash=sha256:dd9120a4ab7b8128cba9b6b120d8b5560d566a3cd0f828cced3d3fd60a42ec40 \
--hash=sha256:1102daa13a804abe23fad62d694e7dee58d6063a35d94bf6e8c9821e22e5a78b
requests==2.26.0; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.6.0") \
--hash=sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24 \
--hash=sha256:b8aa58f8cf793ffd8782d3d8cb19e66ef36f7aba4353eec859e74678b01b07a7
six==1.16.0; python_full_version >= "3.7.1" and python_version >= "3.6" \
--hash=sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254 \
--hash=sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926
spdx-license-list==0.5.2 \
--hash=sha256:1b338470c7b403dbecceca563a316382c7977516128ca6c1e8f7078e3ed6e7b0 \
--hash=sha256:952996f72ab807972dc2278bb9b91e5294767211e51f09aad9c0e2ff5b82a31b
sqlalchemy==1.4.22; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.6.0") \
--hash=sha256:488608953385d6c127d2dcbc4b11f8d7f2f30b89f6bd27c01b042253d985cc2f \
--hash=sha256:5d856cc50fd26fc8dd04892ed5a5a3d7eeb914fea2c2e484183e2d84c14926e0 \
--hash=sha256:a00d9c6d3a8afe1d1681cd8a5266d2f0ed684b0b44bada2ca82403b9e8b25d39 \
--hash=sha256:5908ea6c652a050d768580d01219c98c071e71910ab8e7b42c02af4010608397 \
--hash=sha256:b7fb937c720847879c7402fe300cfdb2aeff22349fa4ea3651bca4e2d6555939 \
--hash=sha256:9bfe882d5a1bbde0245dca0bd48da0976bd6634cf2041d2fdf0417c5463e40e5 \
--hash=sha256:eedd76f135461cf237534a6dc0d1e0f6bb88a1dc193678fab48a11d223462da5 \
--hash=sha256:6a16c7c4452293da5143afa3056680db2d187b380b3ef4d470d4e29885720de3 \
--hash=sha256:44d23ea797a5e0be71bc5454b9ae99158ea0edc79e2393c6e9a2354de88329c0 \
--hash=sha256:a5e14cb0c0a4ac095395f24575a0e7ab5d1be27f5f9347f1762f21505e3ba9f1 \
--hash=sha256:bc34a007e604091ca3a4a057525efc4cefd2b7fe970f44d20b9cfa109ab1bddb \
--hash=sha256:756f5d2f5b92d27450167247fb574b09c4cd192a3f8c2e493b3e518a204ee543 \
--hash=sha256:9fcbb4b4756b250ed19adc5e28c005b8ed56fdb5c21efa24c6822c0575b4964d \
--hash=sha256:09dbb4bc01a734ccddbf188deb2a69aede4b3c153a72b6d5c6900be7fb2945b1 \
--hash=sha256:f028ef6a1d828bc754852a022b2160e036202ac8658a6c7d34875aafd14a9a15 \
--hash=sha256:68393d3fd31469845b6ba11f5b4209edbea0b58506be0e077aafbf9aa2e21e11 \
--hash=sha256:891927a49b2363a4199763a9d436d97b0b42c65922a4ea09025600b81a00d17e \
--hash=sha256:fd2102a8f8a659522719ed73865dff3d3cc76eb0833039dc473e0ad3041d04be \
--hash=sha256:4014978de28163cd8027434916a92d0f5bb1a3a38dff5e8bf8bff4d9372a9117 \
--hash=sha256:f814d80844969b0d22ea63663da4de5ca1c434cfbae226188901e5d368792c17 \
--hash=sha256:d09a760b0a045b4d799102ae7965b5491ccf102123f14b2a8cc6c01d1021a2d9 \
--hash=sha256:26daa429f039e29b1e523bf763bfab17490556b974c77b5ca7acb545b9230e9a \
--hash=sha256:12bac5fa1a6ea870bdccb96fe01610641dd44ebe001ed91ef7fcd980e9702db5 \
--hash=sha256:39b5d36ab71f73c068cdcf70c38075511de73616e6c7fdd112d6268c2704d9f5 \
--hash=sha256:5102b9face693e8b2db3b2539c7e1a5d9a5b4dc0d79967670626ffd2f710d6e6 \
--hash=sha256:c9373ef67a127799027091fa53449125351a8c943ddaa97bec4e99271dbb21f4 \
--hash=sha256:36a089dc604032d41343d86290ce85d4e6886012eea73faa88001260abf5ff81 \
--hash=sha256:b48148ceedfb55f764562e04c00539bb9ea72bf07820ca15a594a9a049ff6b0e \
--hash=sha256:1fdae7d980a2fa617d119d0dc13ecb5c23cc63a8b04ffcb5298f2c59d86851e9 \
--hash=sha256:ec1be26cdccd60d180359a527d5980d959a26269a2c7b1b327a1eea0cab37ed8
typing-extensions==4.0.1; python_version >= "3.6" and python_full_version < "3.0.0" and python_version < "3.8" or python_full_version >= "3.6.0" and python_version < "3.8" and python_version >= "3.6" \
--hash=sha256:7f001e5ac290a0c0401508864c7ec868be4e701886d5b573a9528ed3973d9d3b \
--hash=sha256:4ca091dea149f945ec56afb48dae714f21e8692ef22a395223bcd328961b6a0e
url-normalize==1.4.3; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6" \
--hash=sha256:d23d3a070ac52a67b83a1c59a0e68f8608d1cd538783b401bc9de2c0fac999b2 \
--hash=sha256:ec3c301f04e5bb676d333a7fa162fa977ad2ca04b7e652bfc9fac4e405728eed
urllib3==1.26.7; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version < "4" and python_version >= "3.6" \
--hash=sha256:c4fdf4019605b6e5423637e01bc9fe4daef873709a7973e195ceba0a62bbc844 \
--hash=sha256:4987c65554f7a2dbf30c18fd48778ef124af6fab771a377103da0585e2336ece
wcwidth==0.2.5; python_version >= "3.5" \
--hash=sha256:beb4802a9cebb9144e99086eff703a642a13d6a0052920003a230f3294bbe784 \
--hash=sha256:c4d647b99872929fdb7bdcaa4fbe7f01413ed3d98077df798530e5b04f116c83
xlrd==1.2.0; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.4.0") \
--hash=sha256:e551fb498759fa3a5384a94ccd4c3c02eb7c00ea424426e212ac0c57be9dfbde \
--hash=sha256:546eb36cee8db40c3eaa46c351e67ffee6eeb5fa2650b71bc4c758a29a1b29b2
zipp==3.6.0; python_version >= "3.6" and python_full_version < "3.0.0" and python_version < "3.8" or python_full_version >= "3.6.0" and python_version < "3.8" and python_version >= "3.6" \
--hash=sha256:9fe5ea21568a0a70e50f273397638d39b03353731e6cbbb3fd8502a33fec40bc \
--hash=sha256:71c644c5369f4a6e07636f0aa966270449561fcea2e3d6747b8d23efaa9d7832

# This file was autogenerated by uv via the following command:
# uv export --no-dev
-e .
attrs==25.1.0 \
--hash=sha256:1c97078a80c814273a76b2a298a932eb681c87415c11dee0a6921de7f1b02c3e \
--hash=sha256:c75a69e28a550a7e93789579c22aa26b0f5b83b75dc4e08fe092980051e1090a
bottleneck==1.4.2 \
--hash=sha256:037315c56605128a39f77d19af6a6019dc8c21a63694a4bfef3c026ed963be2e \
--hash=sha256:070d22f2f62ab81297380a89492cca931e4d9443fa4b84c2baeb52db09c3b1b4 \
--hash=sha256:122845e3106c85465551d4a9a3777841347cfedfbebb3aa985cca110e07030b1 \
--hash=sha256:125436df93751a226eab1732783aa8f6125e88e779587aa61be071fb66e41f9d \
--hash=sha256:1f61658ebdf5a178298544336b65020730bf86cc092dab5f6579a99a86bd888b \
--hash=sha256:1fc4e7645bd425c05e05acd5541e9e09cb4179e71164e862f082561bf4509eac \
--hash=sha256:26b5f0531f7044befaad95c20365dd666372e66bdacbfaf009ff65d60285534d \
--hash=sha256:2c9dbaf737b605b30c81611f2c1d197c2fd2e46c33f605876c1d332d3360c4fc \
--hash=sha256:2db287f6ecdbb1c998085eca9b717fec2bfc48a4ab6ae070a9820ba8ab59c90b \
--hash=sha256:2e2fe327dc2d0564e295a5857a252755103f8c6e05b07d3ff80a69afaa9f5065 \
--hash=sha256:48c6b9d9287c4102b803fcb01ae66ae7ef6b310b711b4b7b7e23bf952894dc05 \
--hash=sha256:4c6df9a60ec6ab88fec934ca864266ba95edd89c490af71dc9cd8afb2a54ebd9 \
--hash=sha256:6282fa925ac3768f66e3547f89a512376d3f9de7ef53bdd37aa29232fd864054 \
--hash=sha256:6b7790ca8658cd69e3cc0d0e4ff0e9829d60849bf7945fbd7344fbce05b2bbb8 \
--hash=sha256:7363b3c8ce6ca433779cd7e96bcb94c0e516dcacadff0011adcbf0b3ac86bc9d \
--hash=sha256:7c7d29c044a3511b36fd744503c3e697e279c273a8477a6d91a2831d04fd19e0 \
--hash=sha256:7ebbcbe5d4062e37507b9a81e2aacdb1fcccc6193f7feff124ef2b5a6a5eb740 \
--hash=sha256:89651ef18c06616850203bf8875c958c5d316ea48d8ba60d9b450199d39ae391 \
--hash=sha256:964f6ac4118ddab3bbbac79d4f726b093459be751baba73ee0aa364666e8068e \
--hash=sha256:99778329331d5fae8df19772a019e8b73ba4d9d1650f110cd995ab7657114db0 \
--hash=sha256:a74ddd0417f42eeaba37375f0fc065b28451e0fba45cb2f99e88880b10b3fa43 \
--hash=sha256:b6902ebf3e85315b481bc084f10c5770f8240275ad1e039ac69c7c8d2013b040 \
--hash=sha256:c1c885ad02a6a8fa1f7ee9099f29b9d4c03eb1da2c7ab25839482d5cce739021 \
--hash=sha256:c2fd34b9b490204f95288f0dd35d37042486a95029617246c88c0f94a0ab49fe \
--hash=sha256:c663cbba8f52011fd82ee08c6a85c93b34b19e0e7ebba322d2d67809f34e0597 \
--hash=sha256:e56a206fbf48e3b8054a964398bf1ed843e9625d3c6bdbeb7898cb48bf97441b \
--hash=sha256:e7a1b023de1de3d84b18826462718fba548fed41870df44354f9ab6a414ea82f \
--hash=sha256:eb0c611d15b0fd8f511d288e8964e4725b4b3b0d9d310880cf0ff6b8dd03c859 \
--hash=sha256:fa8e8e1799dea5483ce6669462660f9d9a95649f6f98a80d315b84ec89f449f4
cattrs==24.1.2 \
--hash=sha256:67c7495b760168d931a10233f979b28dc04daf853b30752246f4f8471c6d68d0 \
--hash=sha256:8028cfe1ff5382df59dd36474a86e02d817b06eaf8af84555441bac915d2ef85
certifi==2024.12.14 \
--hash=sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56 \
--hash=sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db
charset-normalizer==3.4.1 \
--hash=sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd \
--hash=sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601 \
--hash=sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd \
--hash=sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d \
--hash=sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313 \
--hash=sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd \
--hash=sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa \
--hash=sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8 \
--hash=sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1 \
--hash=sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2 \
--hash=sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d \
--hash=sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b \
--hash=sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408 \
--hash=sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3 \
--hash=sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f \
--hash=sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a \
--hash=sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146 \
--hash=sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6 \
--hash=sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9 \
--hash=sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f \
--hash=sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545 \
--hash=sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176 \
--hash=sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b \
--hash=sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f \
--hash=sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b \
--hash=sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9 \
--hash=sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125 \
--hash=sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de \
--hash=sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11 \
--hash=sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35 \
--hash=sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f \
--hash=sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda \
--hash=sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a \
--hash=sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971 \
--hash=sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d \
--hash=sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f \
--hash=sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757 \
--hash=sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a \
--hash=sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886 \
--hash=sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77 \
--hash=sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76 \
--hash=sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247 \
--hash=sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85 \
--hash=sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb \
--hash=sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7 \
--hash=sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037 \
--hash=sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1 \
--hash=sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807 \
--hash=sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407 \
--hash=sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12 \
--hash=sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3 \
--hash=sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd \
--hash=sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00 \
--hash=sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616
colorama==0.4.6 \
--hash=sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44 \
--hash=sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6
country-converter==1.3 \
--hash=sha256:006958c83adeada455d2f178921fdd051def736259ff250fada912eaf3ca8cf1 \
--hash=sha256:f6a1a14d1f98112ca90a5198f645f4e60bb73840e98f3f733893ff5b617c2f38
exceptiongroup==1.2.2 ; python_full_version < '3.11' \
--hash=sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b \
--hash=sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc
ftfy==6.3.1 \
--hash=sha256:7c70eb532015cd2f9adb53f101fb6c7945988d023a085d127d1573dc49dd0083 \
--hash=sha256:9b3c3d90f84fb267fe64d375a07b7f8912d817cf86009ae134aa03e1819506ec
idna==3.10 \
--hash=sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9 \
--hash=sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3
llvmlite==0.44.0 \
--hash=sha256:07667d66a5d150abed9157ab6c0b9393c9356f229784a4385c02f99e94fc94d4 \
--hash=sha256:1d671a56acf725bf1b531d5ef76b86660a5ab8ef19bb6a46064a705c6ca80aad \
--hash=sha256:2fb7c4f2fb86cbae6dca3db9ab203eeea0e22d73b99bc2341cdf9de93612e930 \
--hash=sha256:319bddd44e5f71ae2689859b7203080716448a3cd1128fb144fe5c055219d516 \
--hash=sha256:40526fb5e313d7b96bda4cbb2c85cd5374e04d80732dd36a282d72a560bb6408 \
--hash=sha256:41e3839150db4330e1b2716c0be3b5c4672525b4c9005e17c7597f835f351ce2 \
--hash=sha256:46224058b13c96af1365290bdfebe9a6264ae62fb79b2b55693deed11657a8bf \
--hash=sha256:5f79a728e0435493611c9f405168682bb75ffd1fbe6fc360733b850c80a026db \
--hash=sha256:7202b678cdf904823c764ee0fe2dfe38a76981f4c1e51715b4cb5abb6cf1d9e8 \
--hash=sha256:9c58867118bad04a0bb22a2e0068c693719658105e40009ffe95c7000fcde88e \
--hash=sha256:9fbadbfba8422123bab5535b293da1cf72f9f478a65645ecd73e781f962ca614 \
--hash=sha256:aa0097052c32bf721a4efc03bd109d335dfa57d9bffb3d4c24cc680711b8b4fc \
--hash=sha256:ace564d9fa44bb91eb6e6d8e7754977783c68e90a471ea7ce913bff30bd62427 \
--hash=sha256:c0143a5ef336da14deaa8ec26c5449ad5b6a2b564df82fcef4be040b9cacfea9 \
--hash=sha256:c5d22c3bfc842668168a786af4205ec8e3ad29fb1bc03fd11fd48460d0df64c1 \
--hash=sha256:cccf8eb28f24840f2689fb1a45f9c0f7e582dd24e088dcf96e424834af11f791 \
--hash=sha256:d752f89e31b66db6f8da06df8b39f9b91e78c5feea1bf9e8c1fba1d1c24c065d \
--hash=sha256:d8489634d43c20cd0ad71330dde1d5bc7b9966937a263ff1ec1cebb90dc50955 \
--hash=sha256:eae7e2d4ca8f88f89d315b48c6b741dcb925d6a1042da694aa16ab3dd4cbd3a1 \
--hash=sha256:eed7d5f29136bda63b6d7804c279e2b72e08c952b7c5df61f45db408e0ee52f3 \
--hash=sha256:f01a394e9c9b7b1d4e63c327b096d10f6f0ed149ef53d38a09b3749dcf8c9610
numba==0.61.0 \
--hash=sha256:074cd38c5b1f9c65a4319d1f3928165f48975ef0537ad43385b2bd908e6e2e35 \
--hash=sha256:0ebbd4827091384ab8c4615ba1b3ca8bc639a3a000157d9c37ba85d34cd0da1b \
--hash=sha256:152146ecdbb8d8176f294e9f755411e6f270103a11c3ff50cecc413f794e52c8 \
--hash=sha256:21c2fe25019267a608e2710a6a947f557486b4b0478b02e45a81cf606a05a7d4 \
--hash=sha256:43aa4d7d10c542d3c78106b8481e0cbaaec788c39ee8e3d7901682748ffdf0b4 \
--hash=sha256:44240e694d4aa321430c97b21453e46014fe6c7b8b7d932afa7f6a88cc5d7e5e \
--hash=sha256:46c5ae094fb3706f5adf9021bfb7fc11e44818d61afee695cdee4eadfed45e98 \
--hash=sha256:550d389573bc3b895e1ccb18289feea11d937011de4d278b09dc7ed585d1cdcb \
--hash=sha256:5cafa6095716fcb081618c28a8d27bf7c001e09696f595b41836dec114be2905 \
--hash=sha256:5f6c452dca1de8e60e593f7066df052dd8da09b243566ecd26d2b796e5d3087d \
--hash=sha256:6fb74e81aa78a2303e30593d8331327dfc0d2522b5db05ac967556a26db3ef87 \
--hash=sha256:74250b26ed6a1428763e774dc5b2d4e70d93f73795635b5412b8346a4d054574 \
--hash=sha256:764f0e47004f126f58c3b28e0a02374c420a9d15157b90806d68590f5c20cc89 \
--hash=sha256:888d2e89b8160899e19591467e8fdd4970e07606e1fbc248f239c89818d5f925 \
--hash=sha256:9cab9783a700fa428b1a54d65295122bc03b3de1d01fb819a6b9dbbddfdb8c43 \
--hash=sha256:9f25f7fef0206d55c1cfb796ad833cbbc044e2884751e56e798351280038484c \
--hash=sha256:b72bbc8708e98b3741ad0c63f9929c47b623cc4ee86e17030a4f3e301e8401ac \
--hash=sha256:b96fafbdcf6f69b69855273e988696aae4974115a815f6818fef4af7afa1f6b8 \
--hash=sha256:bf64c2d0f3d161af603de3825172fb83c2600bcb1d53ae8ea568d4c53ba6ac08 \
--hash=sha256:de5aa7904741425f28e1028b85850b31f0a245e9eb4f7c38507fb893283a066c \
--hash=sha256:ffe9fe373ed30638d6e20a0269f817b2c75d447141f55a675bfcf2d1fe2e87fb
numexpr==2.10.2 \
--hash=sha256:0db5ff5183935d1612653559c319922143e8fa3019007696571b13135f216458 \
--hash=sha256:15f59655458056fdb3a621b1bb8e071581ccf7e823916c7568bb7c9a3e393025 \
--hash=sha256:3bf01ec502d89944e49e9c1b5cc7c7085be8ca2eb9dd46a0eafd218afbdbd5f5 \
--hash=sha256:3fc2b8035a0c2cdc352e58c3875cb668836018065cbf5752cb531015d9a568d8 \
--hash=sha256:4213a92efa9770bc28e3792134e27c7e5c7e97068bdfb8ba395baebbd12f991b \
--hash=sha256:5191ba8f2975cb9703afc04ae845a929e193498c0e8bcd408ecb147b35978470 \
--hash=sha256:57b59cbb5dcce4edf09cd6ce0b57ff60312479930099ca8d944c2fac896a1ead \
--hash=sha256:5b3f814437d5a10797f8d89d2037cca2c9d9fa578520fc911f894edafed6ea3e \
--hash=sha256:6b360eb8d392483410fe6a3d5a7144afa298c9a0aa3e9fe193e89590b47dd477 \
--hash=sha256:81d1dde7dd6166d8ff5727bb46ab42a6b0048db0e97ceb84a121334a404a800f \
--hash=sha256:83fcb11988b57cc25b028a36d285287d706d1f536ebf2662ea30bd990e0de8b9 \
--hash=sha256:9309f2e43fe6e4560699ef5c27d7a848b3ff38549b6b57194207cf0e88900527 \
--hash=sha256:97298b14f0105a794bea06fd9fbc5c423bd3ff4d88cbc618860b83eb7a436ad6 \
--hash=sha256:a37d6a51ec328c561b2ca8a2bef07025642eca995b8553a5267d0018c732976d \
--hash=sha256:a42963bd4c62d8afa4f51e7974debfa39a048383f653544ab54f50a2f7ec6c42 \
--hash=sha256:b0aff6b48ebc99d2f54f27b5f73a58cb92fde650aeff1b397c71c8788b4fff1a \
--hash=sha256:b5323a46e75832334f1af86da1ef6ff0add00fbacdd266250be872b438bdf2be \
--hash=sha256:b5b0e82d2109c1d9e63fcd5ea177d80a11b881157ab61178ddbdebd4c561ea46 \
--hash=sha256:ba85371c9a8d03e115f4dfb6d25dfbce05387002b9bc85016af939a1da9624f0 \
--hash=sha256:cb845b2d4f9f8ef0eb1c9884f2b64780a85d3b5ae4eeb26ae2b0019f489cd35e \
--hash=sha256:ce8cccf944339051e44a49a124a06287fe3066d0acbff33d1aa5aee10a96abb7 \
--hash=sha256:d7a3fc83c959288544db3adc70612475d8ad53a66c69198105c74036182d10dd \
--hash=sha256:d9a42f5c24880350d88933c4efee91b857c378aaea7e8b86221fff569069841e \
--hash=sha256:deb64235af9eeba59fcefa67e82fa80cfc0662e1b0aa373b7118a28da124d51d \
--hash=sha256:e2d0ae24b0728e4bc3f1d3f33310340d67321d36d6043f7ce26897f4f1042db0 \
--hash=sha256:ebb73b93f5c4d6994f357fa5a47a9f7a5485577e633b3c46a603cb01445bbb19 \
--hash=sha256:ebdbef5763ca057eea0c2b5698e4439d084a0505d9d6e94f4804f26e8890c45e \
--hash=sha256:ec04c9a3c050c175348801e27c18c68d28673b7bfb865ef88ce333be523bbc01 \
--hash=sha256:f9d7805ccb6be2d3b0f7f6fad3707a09ac537811e8e9964f4074d28cb35543db
numpy==2.2.4 \
--hash=sha256:05c076d531e9998e7e694c36e8b349969c56eadd2cdcd07242958489d79a7286 \
--hash=sha256:0d54974f9cf14acf49c60f0f7f4084b6579d24d439453d5fc5805d46a165b542 \
--hash=sha256:11c43995255eb4127115956495f43e9343736edb7fcdb0d973defd9de14cd84f \
--hash=sha256:188dcbca89834cc2e14eb2f106c96d6d46f200fe0200310fc29089657379c58d \
--hash=sha256:1974afec0b479e50438fc3648974268f972e2d908ddb6d7fb634598cdb8260a0 \
--hash=sha256:1cf4e5c6a278d620dee9ddeb487dc6a860f9b199eadeecc567f777daace1e9e7 \
--hash=sha256:207a2b8441cc8b6a2a78c9ddc64d00d20c303d79fba08c577752f080c4007ee3 \
--hash=sha256:218f061d2faa73621fa23d6359442b0fc658d5b9a70801373625d958259eaca3 \
--hash=sha256:2aad3c17ed2ff455b8eaafe06bcdae0062a1db77cb99f4b9cbb5f4ecb13c5146 \
--hash=sha256:2fa8fa7697ad1646b5c93de1719965844e004fcad23c91228aca1cf0800044a1 \
--hash=sha256:31504f970f563d99f71a3512d0c01a645b692b12a63630d6aafa0939e52361e6 \
--hash=sha256:3387dd7232804b341165cedcb90694565a6015433ee076c6754775e85d86f1fc \
--hash=sha256:4ba5054787e89c59c593a4169830ab362ac2bee8a969249dc56e5d7d20ff8df9 \
--hash=sha256:4f92084defa704deadd4e0a5ab1dc52d8ac9e8a8ef617f3fbb853e79b0ea3592 \
--hash=sha256:65ef3468b53269eb5fdb3a5c09508c032b793da03251d5f8722b1194f1790c00 \
--hash=sha256:6f527d8fdb0286fd2fd97a2a96c6be17ba4232da346931d967a0630050dfd298 \
--hash=sha256:7051ee569db5fbac144335e0f3b9c2337e0c8d5c9fee015f259a5bd70772b7e8 \
--hash=sha256:7716e4a9b7af82c06a2543c53ca476fa0b57e4d760481273e09da04b74ee6ee2 \
--hash=sha256:79bd5f0a02aa16808fcbc79a9a376a147cc1045f7dfe44c6e7d53fa8b8a79392 \
--hash=sha256:7a4e84a6283b36632e2a5b56e121961f6542ab886bc9e12f8f9818b3c266bfbb \
--hash=sha256:8120575cb4882318c791f839a4fd66161a6fa46f3f0a5e613071aae35b5dd8f8 \
--hash=sha256:81413336ef121a6ba746892fad881a83351ee3e1e4011f52e97fba79233611fd \
--hash=sha256:8146f3550d627252269ac42ae660281d673eb6f8b32f113538e0cc2a9aed42b9 \
--hash=sha256:879cf3a9a2b53a4672a168c21375166171bc3932b7e21f622201811c43cdd3b0 \
--hash=sha256:892c10d6a73e0f14935c31229e03325a7b3093fafd6ce0af704be7f894d95687 \
--hash=sha256:92bda934a791c01d6d9d8e038363c50918ef7c40601552a58ac84c9613a665bc \
--hash=sha256:9ba03692a45d3eef66559efe1d1096c4b9b75c0986b5dff5530c378fb8331d4f \
--hash=sha256:9eeea959168ea555e556b8188da5fa7831e21d91ce031e95ce23747b7609f8a4 \
--hash=sha256:a0258ad1f44f138b791327961caedffbf9612bfa504ab9597157806faa95194a \
--hash=sha256:a761ba0fa886a7bb33c6c8f6f20213735cb19642c580a931c625ee377ee8bd39 \
--hash=sha256:a7b9084668aa0f64e64bd00d27ba5146ef1c3a8835f3bd912e7a9e01326804c4 \
--hash=sha256:a84eda42bd12edc36eb5b53bbcc9b406820d3353f1994b6cfe453a33ff101775 \
--hash=sha256:ab2939cd5bec30a7430cbdb2287b63151b77cf9624de0532d629c9a1c59b1d5c \
--hash=sha256:ac0280f1ba4a4bfff363a99a6aceed4f8e123f8a9b234c89140f5e894e452ecd \
--hash=sha256:adf8c1d66f432ce577d0197dceaac2ac00c0759f573f28516246351c58a85020 \
--hash=sha256:b4adfbbc64014976d2f91084915ca4e626fbf2057fb81af209c1a6d776d23e3d \
--hash=sha256:bb649f8b207ab07caebba230d851b579a3c8711a851d29efe15008e31bb4de24 \
--hash=sha256:bce43e386c16898b91e162e5baaad90c4b06f9dcbe36282490032cec98dc8ae7 \
--hash=sha256:bd3ad3b0a40e713fc68f99ecfd07124195333f1e689387c180813f0e94309d6f \
--hash=sha256:c3f7ac96b16955634e223b579a3e5798df59007ca43e8d451a0e6a50f6bfdfba \
--hash=sha256:cf28633d64294969c019c6df4ff37f5698e8326db68cc2b66576a51fad634880 \
--hash=sha256:d0f35b19894a9e08639fd60a1ec1978cb7f5f7f1eace62f38dd36be8aecdef4d \
--hash=sha256:db1f1c22173ac1c58db249ae48aa7ead29f534b9a948bc56828337aa84a32ed6 \
--hash=sha256:dbe512c511956b893d2dacd007d955a3f03d555ae05cfa3ff1c1ff6df8851854 \
--hash=sha256:df2f57871a96bbc1b69733cd4c51dc33bea66146b8c63cacbfed73eec0883017 \
--hash=sha256:e2f085ce2e813a50dfd0e01fbfc0c12bbe5d2063d99f8b29da30e544fb6483b8 \
--hash=sha256:e642d86b8f956098b564a45e6f6ce68a22c2c97a04f5acd3f221f57b8cb850ae \
--hash=sha256:e9e0a277bb2eb5d8a7407e14688b85fd8ad628ee4e0c7930415687b6564207a4 \
--hash=sha256:ea2bb7e2ae9e37d96835b3576a4fa4b3a97592fbea8ef7c3587078b0068b8f09 \
--hash=sha256:ee4d528022f4c5ff67332469e10efe06a267e32f4067dc76bb7e2cddf3cd25ff \
--hash=sha256:f05d4198c1bacc9124018109c5fba2f3201dbe7ab6e92ff100494f236209c960 \
--hash=sha256:f34dc300df798742b3d06515aa2a0aee20941c13579d7a2f2e10af01ae4901ee \
--hash=sha256:f4162988a360a29af158aeb4a2f4f09ffed6a969c9776f8f3bdee9b06a8ab7e5 \
--hash=sha256:f486038e44caa08dbd97275a9a35a283a8f1d2f0ee60ac260a1790e76660833c \
--hash=sha256:f7de08cbe5551911886d1ab60de58448c6df0f67d9feb7d1fb21e9875ef95e91
pandas==2.2.3 \
--hash=sha256:062309c1b9ea12a50e8ce661145c6aab431b1e99530d3cd60640e255778bd43a \
--hash=sha256:15c0e1e02e93116177d29ff83e8b1619c93ddc9c49083f237d4312337a61165d \
--hash=sha256:1948ddde24197a0f7add2bdc4ca83bf2b1ef84a1bc8ccffd95eda17fd836ecb5 \
--hash=sha256:1db71525a1538b30142094edb9adc10be3f3e176748cd7acc2240c2f2e5aa3a4 \
--hash=sha256:22a9d949bfc9a502d320aa04e5d02feab689d61da4e7764b62c30b991c42c5f0 \
--hash=sha256:29401dbfa9ad77319367d36940cd8a0b3a11aba16063e39632d98b0e931ddf32 \
--hash=sha256:3508d914817e153ad359d7e069d752cdd736a247c322d932eb89e6bc84217f28 \
--hash=sha256:37e0aced3e8f539eccf2e099f65cdb9c8aa85109b0be6e93e2baff94264bdc6f \
--hash=sha256:381175499d3802cde0eabbaf6324cce0c4f5d52ca6f8c377c29ad442f50f6348 \
--hash=sha256:38cf8125c40dae9d5acc10fa66af8ea6fdf760b2714ee482ca691fc66e6fcb18 \
--hash=sha256:3b71f27954685ee685317063bf13c7709a7ba74fc996b84fc6821c59b0f06468 \
--hash=sha256:3fc6873a41186404dad67245896a6e440baacc92f5b716ccd1bc9ed2995ab2c5 \
--hash=sha256:4f18ba62b61d7e192368b84517265a99b4d7ee8912f8708660fb4a366cc82667 \
--hash=sha256:56534ce0746a58afaf7942ba4863e0ef81c9c50d3f0ae93e9497d6a41a057645 \
--hash=sha256:59ef3764d0fe818125a5097d2ae867ca3fa64df032331b7e0917cf5d7bf66b13 \
--hash=sha256:5de54125a92bb4d1c051c0659e6fcb75256bf799a732a87184e5ea503965bce3 \
--hash=sha256:61c5ad4043f791b61dd4752191d9f07f0ae412515d59ba8f005832a532f8736d \
--hash=sha256:6374c452ff3ec675a8f46fd9ab25c4ad0ba590b71cf0656f8b6daa5202bca3fb \
--hash=sha256:63cc132e40a2e084cf01adf0775b15ac515ba905d7dcca47e9a251819c575ef3 \
--hash=sha256:66108071e1b935240e74525006034333f98bcdb87ea116de573a6a0dccb6c039 \
--hash=sha256:6dfcb5ee8d4d50c06a51c2fffa6cff6272098ad6540aed1a76d15fb9318194d8 \
--hash=sha256:7c2875855b0ff77b2a64a0365e24455d9990730d6431b9e0ee18ad8acee13dbd \
--hash=sha256:800250ecdadb6d9c78eae4990da62743b857b470883fa27f652db8bdde7f6659 \
--hash=sha256:86976a1c5b25ae3f8ccae3a5306e443569ee3c3faf444dfd0f41cda24667ad57 \
--hash=sha256:a5a1595fe639f5988ba6a8e5bc9649af3baf26df3998a0abe56c02609392e0a4 \
--hash=sha256:ad5b65698ab28ed8d7f18790a0dc58005c7629f227be9ecc1072aa74c0c1d43a \
--hash=sha256:b1d432e8d08679a40e2a6d8b2f9770a5c21793a6f9f47fdd52c5ce1948a5a8a9 \
--hash=sha256:b8661b0238a69d7aafe156b7fa86c44b881387509653fdf857bebc5e4008ad42 \
--hash=sha256:ba96630bc17c875161df3818780af30e43be9b166ce51c9a18c1feae342906c2 \
--hash=sha256:c124333816c3a9b03fbeef3a9f230ba9a737e9e5bb4060aa2107a86cc0a497fc \
--hash=sha256:cd8d0c3be0515c12fed0bdbae072551c8b54b7192c7b1fda0ba56059a0179698 \
--hash=sha256:d9c45366def9a3dd85a6454c0e7908f2b3b8e9c138f5dc38fed7ce720d8453ed \
--hash=sha256:f00d1345d84d8c86a63e476bb4955e46458b304b9575dcf71102b5c705320015 \
--hash=sha256:f3a255b2c19987fbbe62a9dfd6cff7ff2aa9ccab3fc75218fd4b7530f01efa24 \
--hash=sha256:fffb8ae78d8af97f849404f21411c95062db1496aeb3e56f146f0355c9989319
platformdirs==4.3.6 \
--hash=sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907 \
--hash=sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb
py3langid==0.3.0 \
--hash=sha256:0a875a031a58aaf9dbda7bb8285fd75e801a7bd276216ffabe037901d4b449ec \
--hash=sha256:38f022eec31cf9a2bf6f142acb2a9b350fd7d0d5ae7762b1392c6d3567401fd3
pyarrow==19.0.0 \
--hash=sha256:239ca66d9a05844bdf5af128861af525e14df3c9591bcc05bac25918e650d3a2 \
--hash=sha256:2795064647add0f16563e57e3d294dbfc067b723f0fd82ecd80af56dad15f503 \
--hash=sha256:29cd86c8001a94f768f79440bf83fee23963af5e7bc68ce3a7e5f120e17edf89 \
--hash=sha256:2a0144a712d990d60f7f42b7a31f0acaccf4c1e43e957f7b1ad58150d6f639c1 \
--hash=sha256:2a1a109dfda558eb011e5f6385837daffd920d54ca00669f7a11132d0b1e6042 \
--hash=sha256:2b6d3ce4288793350dc2d08d1e184fd70631ea22a4ff9ea5c4ff182130249d9b \
--hash=sha256:2f672f5364b2d7829ef7c94be199bb88bf5661dd485e21d2d37de12ccb78a136 \
--hash=sha256:450a7d27e840e4d9a384b5c77199d489b401529e75a3b7a3799d4cd7957f2f9c \
--hash=sha256:4624c89d6f777c580e8732c27bb8e77fd1433b89707f17c04af7635dd9638351 \
--hash=sha256:4d8b0c0de0a73df1f1bf439af1b60f273d719d70648e898bc077547649bb8352 \
--hash=sha256:5418d4d0fab3a0ed497bad21d17a7973aad336d66ad4932a3f5f7480d4ca0c04 \
--hash=sha256:5e8a28b918e2e878c918f6d89137386c06fe577cd08d73a6be8dafb317dc2d73 \
--hash=sha256:62ef8360ff256e960f57ce0299090fb86423afed5e46f18f1225f960e05aae3d \
--hash=sha256:66732e39eaa2247996a6b04c8aa33e3503d351831424cdf8d2e9a0582ac54b34 \
--hash=sha256:8d47c691765cf497aaeed4954d226568563f1b3b74ff61139f2d77876717084b \
--hash=sha256:8e3a839bf36ec03b4315dc924d36dcde5444a50066f1c10f8290293c0427b46a \
--hash=sha256:9348a0137568c45601b031a8d118275069435f151cbb77e6a08a27e8125f59d4 \
--hash=sha256:a08e2a8a039a3f72afb67a6668180f09fddaa38fe0d21f13212b4aba4b5d2451 \
--hash=sha256:a218670b26fb1bc74796458d97bcab072765f9b524f95b2fccad70158feb8b17 \
--hash=sha256:a22a4bc0937856263df8b94f2f2781b33dd7f876f787ed746608e06902d691a5 \
--hash=sha256:a7bbe7109ab6198688b7079cbad5a8c22de4d47c4880d8e4847520a83b0d1b68 \
--hash=sha256:a92aff08e23d281c69835e4a47b80569242a504095ef6a6223c1f6bb8883431d \
--hash=sha256:b34d3bde38eba66190b215bae441646330f8e9da05c29e4b5dd3e41bde701098 \
--hash=sha256:b903afaa5df66d50fc38672ad095806443b05f202c792694f3a604ead7c6ea6e \
--hash=sha256:be686bf625aa7b9bada18defb3a3ea3981c1099697239788ff111d87f04cd263 \
--hash=sha256:c318eda14f6627966997a7d8c374a87d084a94e4e38e9abbe97395c215830e0c \
--hash=sha256:c3b78eff5968a1889a0f3bc81ca57e1e19b75f664d9c61a42a604bf9d8402aae \
--hash=sha256:c751c1c93955b7a84c06794df46f1cec93e18610dcd5ab7d08e89a81df70a849 \
--hash=sha256:ce42275097512d9e4e4a39aade58ef2b3798a93aa3026566b7892177c266f735 \
--hash=sha256:cf3bf0ce511b833f7bc5f5bb3127ba731e97222023a444b7359f3a22e2a3b463 \
--hash=sha256:e675a3ad4732b92d72e4d24009707e923cab76b0d088e5054914f11a797ebe44 \
--hash=sha256:e82c3d5e44e969c217827b780ed8faf7ac4c53f934ae9238872e749fa531f7c9 \
--hash=sha256:f094742275586cdd6b1a03655ccff3b24b2610c3af76f810356c4c71d24a2a6c \
--hash=sha256:f208c3b58a6df3b239e0bb130e13bc7487ed14f39a9ff357b6415e3f6339b560 \
--hash=sha256:f43f5aef2a13d4d56adadae5720d1fed4c1356c993eda8b59dace4b5983843c1
pycountry==24.6.1 \
--hash=sha256:b61b3faccea67f87d10c1f2b0fc0be714409e8fcdcc1315613174f6466c10221 \
--hash=sha256:f1a4fb391cd7214f8eefd39556d740adcc233c778a27f8942c8dca351d6ce06f
python-dateutil==2.9.0.post0 \
--hash=sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3 \
--hash=sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427
python-stdnum==1.20 \
--hash=sha256:111008e10391d54fb2afad2a10df70d5cb0c6c0a7ec82fec6f022cb8712961d3 \
--hash=sha256:ad2a2cf2eb025de408210235f36b4ae31252de3186240ccaa8126e117cb82690
pytz==2024.2 \
--hash=sha256:2aa355083c50a0f93fa581709deac0c9ad65cca8a9e9beac660adcbd493c798a \
--hash=sha256:31c7c1817eb7fae7ca4b8c7ee50c72f93aa2dd863de768e1ef4245d426aa0725
requests==2.32.3 \
--hash=sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760 \
--hash=sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6
requests-cache==1.2.1 \
--hash=sha256:1285151cddf5331067baa82598afe2d47c7495a1334bfe7a7d329b43e9fd3603 \
--hash=sha256:68abc986fdc5b8d0911318fbb5f7c80eebcd4d01bfacc6685ecf8876052511d1
six==1.17.0 \
--hash=sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274 \
--hash=sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81
typing-extensions==4.12.2 ; python_full_version < '3.11' \
--hash=sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d \
--hash=sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8
tzdata==2025.1 \
--hash=sha256:24894909e88cdb28bd1636c6887801df64cb485bd593f2fd83ef29075a81d694 \
--hash=sha256:7e127113816800496f027041c570f50bcd464a020098a3b6b199517772303639
url-normalize==1.4.3 \
--hash=sha256:d23d3a070ac52a67b83a1c59a0e68f8608d1cd538783b401bc9de2c0fac999b2 \
--hash=sha256:ec3c301f04e5bb676d333a7fa162fa977ad2ca04b7e652bfc9fac4e405728eed
urllib3==2.3.0 \
--hash=sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df \
--hash=sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d
wcwidth==0.2.13 \
--hash=sha256:3da69048e4540d84af32131829ff948f1e022c1c6bdb8d6102117aac784f6859 \
--hash=sha256:72ea0c06399eb286d978fdedb6923a9eb47e1c486ce63e9b4e64fc18303972b5


@@ -1,6 +0,0 @@
-[isort]
-multi_line_output=3
-include_trailing_comma=True
-force_grid_wrap=0
-use_parentheses=True
-line_length=88


@@ -1,37 +0,0 @@
-import setuptools
-
-with open("README.md", "r") as fh:
-    long_description = fh.read()
-
-install_requires = [
-    "pandas",
-    "python-stdnum",
-    "requests",
-    "requests-cache",
-    "pycountry",
-    "langid",
-]
-
-setuptools.setup(
-    name="csv-metadata-quality",
-    version="0.5.0",
-    author="Alan Orth",
-    author_email="aorth@mjanja.ch",
-    description="A simple, but opinionated CSV quality checking and fixing pipeline for CSVs in the DSpace ecosystem.",
-    license="GPLv3",
-    long_description=long_description,
-    long_description_content_type="text/markdown",
-    url="https://github.com/alanorth/csv-metadata-quality",
-    classifiers=[
-        "Programming Language :: Python :: 3.7",
-        "Programming Language :: Python :: 3.8",
-        "Programming Language :: Python :: 3.9",
-        "License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
-        "Operating System :: OS Independent",
-    ],
-    packages=["csv_metadata_quality"],
-    entry_points={
-        "console_scripts": ["csv-metadata-quality = csv_metadata_quality.__main__:main"]
-    },
-    install_requires=install_requires,
-)


@@ -25,7 +25,7 @@ def test_check_valid_issn():
     result = check.issn(value)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_invalid_isbn(capsys):
@@ -46,7 +46,7 @@ def test_check_valid_isbn():
     result = check.isbn(value)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_missing_date(capsys):
@@ -102,7 +102,7 @@ def test_check_valid_date():
     result = check.date(value, field_name)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_suspicious_characters(capsys):
@@ -128,7 +128,7 @@ def test_check_valid_iso639_1_language():
     result = check.language(value)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_valid_iso639_3_language():
@@ -138,7 +138,7 @@ def test_check_valid_iso639_3_language():
     result = check.language(value)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_invalid_iso639_1_language(capsys):
@@ -179,18 +179,41 @@ def test_check_invalid_language(capsys):
 def test_check_invalid_agrovoc(capsys):
-    """Test invalid AGROVOC subject."""
+    """Test invalid AGROVOC subject. Invalid values *will not* be dropped."""
 
-    value = "FOREST"
+    valid_agrovoc = "LIVESTOCK"
+    invalid_agrovoc = "FOREST"
+    value = f"{valid_agrovoc}||{invalid_agrovoc}"
     field_name = "dcterms.subject"
+    drop = False
 
-    check.agrovoc(value, field_name)
+    new_value = check.agrovoc(value, field_name, drop)
 
     captured = capsys.readouterr()
     assert (
         captured.out
-        == f"{Fore.RED}Invalid AGROVOC ({field_name}): {Fore.RESET}{value}\n"
+        == f"{Fore.RED}Invalid AGROVOC ({field_name}): {Fore.RESET}{invalid_agrovoc}\n"
     )
+    assert new_value == value
+
+
+def test_check_invalid_agrovoc_dropped(capsys):
+    """Test invalid AGROVOC subjects. Invalid values *will* be dropped."""
+
+    valid_agrovoc = "LIVESTOCK"
+    invalid_agrovoc = "FOREST"
+    value = f"{valid_agrovoc}||{invalid_agrovoc}"
+    field_name = "dcterms.subject"
+    drop = True
+
+    new_value = check.agrovoc(value, field_name, drop)
+
+    captured = capsys.readouterr()
+    assert (
+        captured.out
+        == f"{Fore.GREEN}Dropping invalid AGROVOC ({field_name}): {Fore.RESET}{invalid_agrovoc}\n"
+    )
+
+    assert new_value == valid_agrovoc
 
 
 def test_check_valid_agrovoc():
@@ -198,10 +221,11 @@ def test_check_valid_agrovoc():
     value = "FORESTS"
     field_name = "dcterms.subject"
+    drop = False
 
-    result = check.agrovoc(value, field_name)
+    result = check.agrovoc(value, field_name, drop)
 
-    assert result == None
+    assert result == "FORESTS"
 
 
 def test_check_uncommon_filename_extension(capsys):
@@ -225,7 +249,7 @@ def test_check_common_filename_extension():
     result = check.filename_extension(value)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_incorrect_iso_639_1_language(capsys):
@@ -233,12 +257,13 @@ def test_check_incorrect_iso_639_1_language(capsys):
     title = "A randomised vaccine field trial in Kenya demonstrates protection against wildebeest-associated malignant catarrhal fever in cattle"
     language = "es"
+    exclude = []
 
     # Create a dictionary to mimic Pandas series
     row = {"dc.title": title, "dc.language.iso": language}
     series = pd.Series(row)
 
-    experimental.correct_language(series)
+    experimental.correct_language(series, exclude)
 
     captured = capsys.readouterr()
     assert (
@@ -252,12 +277,13 @@ def test_check_incorrect_iso_639_3_language(capsys):
     title = "A randomised vaccine field trial in Kenya demonstrates protection against wildebeest-associated malignant catarrhal fever in cattle"
     language = "spa"
+    exclude = []
 
     # Create a dictionary to mimic Pandas series
     row = {"dc.title": title, "dc.language.iso": language}
     series = pd.Series(row)
 
-    experimental.correct_language(series)
+    experimental.correct_language(series, exclude)
 
     captured = capsys.readouterr()
     assert (
@@ -271,14 +297,15 @@ def test_check_correct_iso_639_1_language():
     title = "A randomised vaccine field trial in Kenya demonstrates protection against wildebeest-associated malignant catarrhal fever in cattle"
     language = "en"
+    exclude = []
 
     # Create a dictionary to mimic Pandas series
     row = {"dc.title": title, "dc.language.iso": language}
     series = pd.Series(row)
 
-    result = experimental.correct_language(series)
+    result = experimental.correct_language(series, exclude)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_correct_iso_639_3_language():
@@ -286,14 +313,15 @@ def test_check_correct_iso_639_3_language():
     title = "A randomised vaccine field trial in Kenya demonstrates protection against wildebeest-associated malignant catarrhal fever in cattle"
     language = "eng"
+    exclude = []
 
     # Create a dictionary to mimic Pandas series
     row = {"dc.title": title, "dc.language.iso": language}
     series = pd.Series(row)
 
-    result = experimental.correct_language(series)
+    result = experimental.correct_language(series, exclude)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_valid_spdx_license_identifier():
@@ -303,7 +331,7 @@ def test_check_valid_spdx_license_identifier():
     result = check.spdx_license_identifier(license)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_invalid_spdx_license_identifier(capsys):
@@ -311,7 +339,7 @@ def test_check_invalid_spdx_license_identifier(capsys):
     license = "CC-BY-SA"
 
-    result = check.spdx_license_identifier(license)
+    check.spdx_license_identifier(license)
 
     captured = capsys.readouterr()
     assert (
@@ -334,7 +362,7 @@ def test_check_duplicate_item(capsys):
     }
     df = pd.DataFrame(data=d)
 
-    result = check.duplicate_items(df)
+    check.duplicate_items(df)
 
     captured = capsys.readouterr()
     assert (
@@ -351,7 +379,7 @@ def test_check_no_mojibake():
     result = check.mojibake(field, field_name)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_mojibake(capsys):
@@ -360,7 +388,7 @@ def test_check_mojibake(capsys):
     field = "CIAT Publicaçao"
     field_name = "dcterms.isPartOf"
 
-    result = check.mojibake(field, field_name)
+    check.mojibake(field, field_name)
 
     captured = capsys.readouterr()
     assert (
@@ -379,23 +407,25 @@ def test_check_doi_field():
     # the citation and a DOI field.
     d = {"cg.identifier.doi": doi, "dcterms.bibliographicCitation": citation}
     series = pd.Series(data=d)
+    exclude = []
 
-    result = check.citation_doi(series)
+    result = check.citation_doi(series, exclude)
 
-    assert result == None
+    assert result is None
 
 
 def test_check_doi_only_in_citation(capsys):
     """Test an item with a DOI in its citation, but no DOI field."""
 
     citation = "Orth, A. 2021. Testing all the things. doi: 10.1186/1743-422X-9-218"
+    exclude = []
 
     # Emulate a column in a transposed dataframe (which is just a series), with
     # an empty DOI field and a citation containing a DOI.
     d = {"cg.identifier.doi": None, "dcterms.bibliographicCitation": citation}
     series = pd.Series(data=d)
 
-    check.citation_doi(series)
+    check.citation_doi(series, exclude)
 
     captured = capsys.readouterr()
     assert (
@@ -409,15 +439,16 @@ def test_title_in_citation():
     title = "Testing all the things"
     citation = "Orth, A. 2021. Testing all the things."
+    exclude = []
 
     # Emulate a column in a transposed dataframe (which is just a series), with
     # the title and citation.
     d = {"dc.title": title, "dcterms.bibliographicCitation": citation}
     series = pd.Series(data=d)
 
-    result = check.title_in_citation(series)
+    result = check.title_in_citation(series, exclude)
 
-    assert result == None
+    assert result is None
 
 
 def test_title_not_in_citation(capsys):
@@ -425,13 +456,14 @@ def test_title_not_in_citation(capsys):
     title = "Testing all the things"
     citation = "Orth, A. 2021. Testing all teh things."
+    exclude = []
 
     # Emulate a column in a transposed dataframe (which is just a series), with
     # the title and citation.
     d = {"dc.title": title, "dcterms.bibliographicCitation": citation}
     series = pd.Series(data=d)
 
-    check.title_in_citation(series)
+    check.title_in_citation(series, exclude)
 
     captured = capsys.readouterr()
     assert (
@@ -445,14 +477,15 @@ def test_country_matches_region():
     country = "Kenya"
     region = "Eastern Africa"
+    exclude = []
 
     # Emulate a column in a transposed dataframe (which is just a series)
     d = {"cg.coverage.country": country, "cg.coverage.region": region}
     series = pd.Series(data=d)
 
-    result = check.countries_match_regions(series)
+    result = check.countries_match_regions(series, exclude)
 
-    assert result == None
+    assert result is None
 
 
 def test_country_not_matching_region(capsys):
@@ -462,6 +495,7 @@ def test_country_not_matching_region(capsys):
     country = "Kenya"
     region = ""
     missing_region = "Eastern Africa"
+    exclude = []
 
     # Emulate a column in a transposed dataframe (which is just a series)
     d = {
@@ -471,10 +505,10 @@ def test_country_not_matching_region(capsys):
     }
     series = pd.Series(data=d)
 
-    check.countries_match_regions(series)
+    check.countries_match_regions(series, exclude)
 
     captured = capsys.readouterr()
     assert (
         captured.out
-        == f"{Fore.YELLOW}Missing region ({missing_region}): {Fore.RESET}{title}\n"
+        == f"{Fore.YELLOW}Missing region ({country} → {missing_region}): {Fore.RESET}{title}\n"
     )


@@ -1,5 +1,7 @@
 # SPDX-License-Identifier: GPL-3.0-only
 
+import pandas as pd
+
 import csv_metadata_quality.fix as fix
@@ -120,3 +122,41 @@ def test_fix_mojibake():
     field_name = "dcterms.isPartOf"
 
     assert fix.mojibake(field, field_name) == "CIAT Publicaçao"
+
+
+def test_fix_country_not_matching_region():
+    """Test an item with regions not matching its country list."""
+
+    title = "Testing an item with no matching region."
+    country = "Kenya"
+    region = ""
+    missing_region = "Eastern Africa"
+    exclude = []
+
+    # Emulate a column in a transposed dataframe (which is just a series)
+    d = {
+        "dc.title": title,
+        "cg.coverage.country": country,
+        "cg.coverage.region": region,
+    }
+    series = pd.Series(data=d)
+
+    result = fix.countries_match_regions(series, exclude)
+
+    # Emulate the correct series we are expecting
+    d_correct = {
+        "dc.title": title,
+        "cg.coverage.country": country,
+        "cg.coverage.region": missing_region,
+    }
+    series_correct = pd.Series(data=d_correct)
+
+    pd.testing.assert_series_equal(result, series_correct)
+
+
+def test_fix_normalize_dois():
+    """Test normalizing a DOI."""
+
+    value = "doi: 10.11648/j.jps.20140201.14"
+
+    assert fix.normalize_dois(value) == "https://doi.org/10.11648/j.jps.20140201.14"

uv.lock (generated new file, 1180 lines): diff suppressed because it is too large.