---
title: "October, 2019"
date: 2019-10-01T13:20:51+03:00
author: "Alan Orth"
tags: ["Notes"]
---

## 2019-10-01

- Udana from IWMI asked me for a CSV export of their community on CGSpace
- I exported it, but a quick run through the `csv-metadata-quality` tool shows that there is some low-hanging fruit we can fix before I send him the data
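- For reference, a community export like this can be done with the DSpace CLI, roughly as follows (just a sketch, the handle here is only inferred from the CSV file name, and the export may just as well have been done from the web UI):

```
# export the IWMI community metadata to CSV (handle inferred from the file name)
$ dspace metadata-export -i 10568/16814 -f ~/Downloads/10568-16814.csv
```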
- I will limit the scope to the titles, regions, subregions, and river basins for now so I can manually fix some non-breaking spaces (U+00A0) there that would otherwise be removed by the csv-metadata-quality script's "unnecessary Unicode" fix:

```
$ csvcut -c 'id,dc.title[en_US],cg.coverage.region[en_US],cg.coverage.subregion[en_US],cg.river.basin[en_US]' ~/Downloads/10568-16814.csv > /tmp/iwmi-title-region-subregion-river.csv
```

- Then I replace them in vim with `:% s/\%u00a0/ /g` because I can't figure out the correct sed syntax to do it directly from the pipe above
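- For what it's worth, something like this with GNU sed would probably have worked directly in the pipe (an untested sketch):

```
# \xc2\xa0 is the UTF-8 byte sequence for a non-breaking space (U+00A0)
$ csvcut -c 'id,dc.title[en_US],cg.coverage.region[en_US],cg.coverage.subregion[en_US],cg.river.basin[en_US]' ~/Downloads/10568-16814.csv | sed 's/\xc2\xa0/ /g' > /tmp/iwmi-title-region-subregion-river.csv
```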
- I uploaded those to CGSpace and then re-exported the metadata
- Now that I think about it, I shouldn't be removing non-breaking spaces (U+00A0), I should be replacing them with normal spaces!
- I modified the script so it replaces the non-breaking spaces instead of removing them
- Then I ran the csv-metadata-quality script to do some general cleanups (though I temporarily commented out the whitespace fixes because there were too many thousands of rows):

```
$ csv-metadata-quality -i ~/Downloads/10568-16814.csv -o /tmp/iwmi.csv -x 'dc.date.issued,dc.date.issued[],dc.date.issued[en_US]' -u
```
- That fixed 153 items (unnecessary Unicode, duplicates, comma–space fixes, etc)
- Release [version 0.3.1 of the csv-metadata-quality script](https://github.com/ilri/csv-metadata-quality/releases/tag/v0.3.1) with the non-breaking spaces change

## 2019-10-03

- Upload the 117 IITA records that we had been working on last month (aka 20196th.xls aka Sept 6) to CGSpace

## 2019-10-04

- Create an account for Bioversity's ICT consultant Francesco on DSpace Test:

```
$ dspace user -a -m blah@mail.it -g Francesco -s Vernocchi -p 'fffff'
```
- Email Francesca and Carol to follow up about the test upload I did on 2019-09-21
- I suggested that if they still want to do value addition of those records (like adding countries, regions, etc) they could maybe do it after we migrate the records to CGSpace
- Carol responded to tell me where to map the items with type Brochure, Journal Item, and Thesis, so I applied them to the [collection on DSpace Test](https://dspacetest.cgiar.org/handle/10568/103688)

## 2019-10-06

- Hector from CCAFS responded about my feedback on their CLARISA API
- He made some fixes to the metadata values they are using based on my feedback and said they would be happy for us to use it
- Gabriela from CIP asked me if it was possible to generate an RSS feed of items that have the CIP subject "POTATO AGRI-FOOD SYSTEMS"
- I noticed that there is a similar term "SWEETPOTATO AGRI-FOOD SYSTEMS" so I had to come up with a way to exclude it using the boolean "AND NOT" in the [OpenSearch query](https://cgspace.cgiar.org/open-search/discover?query=cipsubject:POTATO%20AGRI%E2%80%90FOOD%20SYSTEMS%20AND%20NOT%20cipsubject:SWEETPOTATO%20AGRI%E2%80%90FOOD%20SYSTEMS&scope=10568/51671&sort_by=3&order=DESC)
- Again, the `sort_by=3` parameter is the accession date, as configured in `dspace.cfg`
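- For reference, the sort option mapping in a stock DSpace 5 `dspace.cfg` looks roughly like this (quoted from memory, so CGSpace's exact configuration may differ), with option 3 being the accession date:

```
webui.itemlist.sort-option.1 = title:dc.title:title
webui.itemlist.sort-option.2 = dateissued:dc.date.issued:date
webui.itemlist.sort-option.3 = dateaccessioned:dc.date.accessioned:date
webui.itemlist.sort-option.4 = type:dc.type:text
```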
## 2019-10-08

- Fix 108 more issues with authors in the ongoing Bioversity migration on DSpace Test, for example:
  - European Cooperative Programme for Plant Genetic Resources
  - Bioversity International. Capacity Development Unit
  - W.M. van der Heide, W.M., Tripp, R.
  - International Plant Genetic Resources Institute
- Start looking at duplicates in the Bioversity migration data on DSpace Test
- I'm keeping track of the originals and duplicates in a Google Docs spreadsheet that I will share with Bioversity

## 2019-10-09
- Continue working on identifying duplicates in the Bioversity migration
- I have been recording the originals and duplicates in a spreadsheet so I can map them later
- For now I am just reconciling any incorrect or missing metadata in the original items on CGSpace, deleting the duplicate from DSpace Test, and mapping the original to the correct place on CGSpace
- So far I have deleted thirty duplicates and mapped fourteen
- Run all system updates on DSpace Test (linode19) and reboot the server

## 2019-10-10

- Felix Shaw from Earlham emailed me to ask about his admin account on DSpace Test
- His old one got lost when I re-sync'd DSpace Test with CGSpace a few weeks ago
- I created a new account for him and added it to the Administrators group:

```
$ dspace user -a -m wow@me.com -g Felix -s Shaw -p 'fuananaaa'
```

<!-- vim: set sw=2 ts=2: -->