CSV Metadata Quality

A simple but opinionated metadata quality checker and fixer designed to work with CSVs in the DSpace ecosystem. The implementation is essentially a pipeline of checks and fixes that begins by splitting multi-value fields on the standard DSpace "||" separator and trimming leading/trailing whitespace, then proceeds to more specialized checks for ISSNs, ISBNs, languages, etc.

Requires Python 3.6 or greater. CSV and Excel support comes from the Pandas library, though Excel support is much less tested, so your mileage may vary.
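To illustrate the pipeline idea, a minimal sketch using Pandas for I/O might look like the following. The fix function, column handling, and file paths here are illustrative assumptions, not the project's actual code:

    # Minimal sketch of the check/fix pipeline idea, assuming Pandas for I/O.
    # The fix function and file paths are illustrative, not the project's API.
    import pandas as pd

    def fix_whitespace(value: str) -> str:
        # Trim and collapse whitespace in each "||"-separated component.
        return "||".join(" ".join(c.split()) for c in value.split("||"))

    df = pd.read_csv("data/test.csv", dtype=str)

    # Apply each fix to every string cell, leaving empty cells untouched.
    for fix in (fix_whitespace,):
        df = df.applymap(lambda v: fix(v) if isinstance(v, str) else v)

    df.to_csv("/tmp/test.csv", index=False)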

Functionality

  • Validate dates, ISSNs, ISBNs, and multi-value separators ("||")
  • Validate languages against ISO 639-2 and ISO 639-3 (see the sketch after this list)
  • Validate subjects against the AGROVOC REST API
  • Fix leading, trailing, and excessive (i.e., more than one) whitespace
  • Fix invalid multi-value separators (|) using --unsafe-fixes
  • Fix problematic newlines (line feeds) using --unsafe-fixes
  • Remove unnecessary Unicode like non-breaking spaces, replacement characters, etc.
  • Check for "suspicious" characters that indicate encoding or copy/paste issues, for example "foreˆt" should be "forêt"
  • Remove duplicate metadata values
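
For example, the language validation listed above can lean on pycountry (which the project uses for language lookups). The check function below is only a sketch, not the actual implementation:

    # Sketch of an ISO 639 language check using pycountry (a project dependency);
    # the function itself is illustrative, not the project's actual code.
    import pycountry

    def language_is_valid(code: str) -> bool:
        # pycountry covers both two-letter and three-letter language codes;
        # lookup() raises LookupError for unknown codes.
        try:
            pycountry.languages.lookup(code)
            return True
        except LookupError:
            return False

    print(language_is_valid("en"))              # True  (two-letter code)
    print(language_is_valid("spa"))             # True  (three-letter code)
    print(language_is_valid("not-a-language"))  # False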

Installation

The easiest way to install CSV Metadata Quality is with pipenv:

$ git clone https://git.sr.ht/~alanorth/csv-metadata-quality
$ cd csv-metadata-quality
$ pipenv install
$ pipenv shell

Otherwise, if you don't have pipenv, you can use a vanilla Python virtual environment:

$ git clone https://git.sr.ht/~alanorth/csv-metadata-quality
$ cd csv-metadata-quality
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install -r requirements.txt

Usage

Run CSV Metadata Quality with the --help flag to see available options:

$ python -m csv_metadata_quality --help

To validate and clean a CSV file you must specify input and output files using the -i and -o options. For example, using the included test file:

$ python -m csv_metadata_quality -i data/test.csv -o /tmp/test.csv

You can enable "unsafe fixes" with the --unsafe-fixes option. Currently this attempts to fix invalid multi-value separators (|) and problematic newlines. Fixing separators is considered "unsafe" because it's theoretically possible for a | to be used legitimately in a metadata value, but in my experience it's always a typo where the user was attempting to enter multiple metadata values, for example: Kenya|Tanzania.
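A minimal sketch of how such a fix could work, assuming a lone | (one not already part of a ||) is a mistyped separator. This is illustrative only, not the tool's actual implementation:

    # Sketch: treat a lone "|" (not part of "||") as a mistyped separator.
    # Illustrative only -- not the tool's actual implementation.
    import re

    def fix_multi_value_separator(value: str) -> str:
        # Replace any "|" that is neither preceded nor followed by another "|".
        return re.sub(r"(?<!\|)\|(?!\|)", "||", value)

    print(fix_multi_value_separator("Kenya|Tanzania"))   # Kenya||Tanzania
    print(fix_multi_value_separator("Kenya||Tanzania"))  # unchanged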

Todo

  • Reporting / summary
  • Real logging

License

This work is licensed under the GPLv3.

The license allows you to use and modify the work for personal and commercial purposes, but if you distribute the work you must provide users with a means to access the source code for the version you are distributing. Read more about the GPLv3 at TL;DR Legal.