+++
date = "2017-02-07T07:04:52-08:00"
author = "Alan Orth"
title = "February, 2017"
tags = ["Notes"]
+++

## 2017-02-07

- An item was mapped twice erroneously again, so I had to remove one of the mappings manually:

```
dspace=# select * from collection2item where item_id = '80278';
  id   | collection_id | item_id
-------+---------------+---------
 92551 |           313 |   80278
 92550 |           313 |   80278
 90774 |          1051 |   80278
(3 rows)
dspace=# delete from collection2item where id = 92551 and item_id = 80278;
DELETE 1
```

- Create issue on GitHub to track the addition of CCAFS Phase II project tags ([#301](https://github.com/ilri/DSpace/issues/301))
- Looks like we'll be using `cg.identifier.ccafsprojectpii` as the field name

<!--more-->

## 2017-02-08

- We also need to rename some of the CCAFS Phase I flagships:
  - CLIMATE-SMART AGRICULTURAL PRACTICES → CLIMATE-SMART TECHNOLOGIES AND PRACTICES
  - CLIMATE RISK MANAGEMENT → CLIMATE SERVICES AND SAFETY NETS
  - LOW EMISSIONS AGRICULTURE → LOW EMISSIONS DEVELOPMENT
  - POLICIES AND INSTITUTIONS → PRIORITIES AND POLICIES FOR CSA
- The climate risk management one doesn't exist, so I will have to ask Magdalena if they want me to add it to the input forms
- Start testing the nearly 500 author corrections that CCAFS sent me:

```
$ ./fix-metadata-values.py -i /tmp/CCAFS-Authors-Feb-7.csv -f dc.contributor.author -t 'correct name' -m 3 -d dspace -u dspace -p fuuu
```

## 2017-02-09

- More work on CCAFS Phase II stuff
- Looks like simply adding a new metadata field to `dspace/config/registries/cgiar-types.xml` and restarting DSpace causes the field to get added to the registry (a sketch of such a registry entry is below, after the code block)
- It requires a restart, but at least it allows you to manage the registry programmatically
- It's not a very good way to manage the registry, though, as removing a field from the XML doesn't remove it from the registry, and we always restore from database backups, so there would never be a scenario where we needed these to be created
- Testing some corrections on CCAFS Phase II flagships (`cg.subject.ccafs`):

```
$ ./fix-metadata-values.py -i ccafs-flagships-feb7.csv -f cg.subject.ccafs -t correct -m 210 -d dspace -u dspace -p fuuu
```
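
- A minimal sketch of what the new field's entry in `cgiar-types.xml` might look like, following DSpace's standard metadata registry format (the scope note text here is just a placeholder assumption):

```
<dc-type>
  <schema>cg</schema>
  <element>identifier</element>
  <qualifier>ccafsprojectpii</qualifier>
  <scope_note>CCAFS Phase II project identifier (placeholder description)</scope_note>
</dc-type>
```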

## 2017-02-10

- CCAFS said they want to wait on the flagship updates (`cg.subject.ccafs`) on CGSpace, perhaps for a month or so
- Help Marianne Gadeberg (WLE) with some user permissions, as it seems she had previously been using a personal email account and is now on a CGIAR one
- I manually added her new account to the ~25 authorizations that her old user was on

## 2017-02-14

- Add `SCALING` to ILRI subjects ([#304](https://github.com/ilri/DSpace/pull/304)), as Sisay's attempts were all sloppy
- Cherry pick some patches from the DSpace 5.7 branch:
  - DS-3363 CSV import error says "row", means "column": f7b6c83e991db099003ee4e28ca33d3c7bab48c0
  - DS-3479 avoid adding empty metadata values during import: 329f3b48a6de7fad074d825fd12118f7e181e151
  - [DS-3456] 5x Clarify command line options for statistics import/export tools (#1623): 567ec083c8a94eb2bcc1189816eb4f767745b278
  - [DS-3458] 5x Allow Shard Process to Append to an existing repo: 3c8ecb5d1fd69a1dcfee01feed259e80abbb7749
- I still need to test these, especially the last two, which change some things related to Solr maintenance

## 2017-02-15

- Update rvm on DSpace Test and CGSpace as there was a [security disclosure about versions less than 1.28.0](https://github.com/justinsteven/advisories/blob/master/2017_rvm_cd_command_execution.md)

## 2017-02-16

- Looking at memory info from munin on CGSpace:

![CGSpace meminfo](/cgspace-notes/2017/02/meminfo_phisical-week.png)

- We are using only ~8GB of RAM for applications, and 16GB for caches!
- The Linode machine we're on has 24GB of RAM, but only because that's the only instance that had enough disk space for us (384GB)...
- We should probably look into Google Compute Engine or DigitalOcean, where we can get more storage without having to follow a linear increase in instance pricing for CPU/memory as well
- Especially because we basically only use 2 of our 8 CPUs:

![CGSpace CPU](/cgspace-notes/2017/02/cpu-week.png)

- Fix an issue with a duplicate declaration in the atmire-dspace-xmlui `pom.xml` (causing non-fatal warnings during the Maven build)
- Experiment with making DSpace generate HTTPS handle links, first with a change in `dspace.cfg` or the site's properties file:

```
handle.canonical.prefix = https://hdl.handle.net/
```

- And then a SQL command to update existing records:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'http://hdl.handle.net', 'https://hdl.handle.net') where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'uri');
UPDATE 58193
```

- Seems to work fine!
- I noticed a few items that have incorrect DOI links (`dc.identifier.doi`), and after looking in the database I see there are over 100 that are missing the scheme or are just plain wrong:

```
dspace=# select distinct text_value from metadatavalue where resource_type_id=2 and metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value not like 'http%://%';
```

- This will replace any that begin with `10.` and change them to `https://dx.doi.org/10.`:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, '(^10\..+$)', 'https://dx.doi.org/\1') where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value like '10.%';
```

- This will get any that begin with `doi:10.` and change them to `https://dx.doi.org/10.`:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, '^doi:(10\..+$)', 'https://dx.doi.org/\1') where resource_type_id=2 and metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value like 'doi:10%';
```

- Fix DOIs like `dx.doi.org/10.` to be `https://dx.doi.org/10.`:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, '(^dx.doi.org/.+$)', 'https://dx.doi.org/\1') where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value like 'dx.doi.org/%';
```

- Fix DOIs like `http//`:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, '^http//(dx.doi.org/.+$)', 'https://dx.doi.org/\1') where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value like 'http//%';
```

- Fix DOIs like `dx.doi.org./`:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, '(^dx.doi.org\./.+$)', 'https://dx.doi.org/\1') where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value like 'dx.doi.org./%';
```

- Delete some invalid DOIs:

```
dspace=# delete from metadatavalue where resource_type_id=2 and metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value in ('DOI','CPWF Mekong','Bulawayo, Zimbabwe','bb');
```

- Fix some other random outliers:

```
dspace=# update metadatavalue set text_value = 'https://dx.doi.org/10.1016/j.aquaculture.2015.09.003' where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value = 'http:/dx.doi.org/10.1016/j.aquaculture.2015.09.003';
dspace=# update metadatavalue set text_value = 'https://dx.doi.org/10.5337/2016.200' where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value = 'doi: https://dx.doi.org/10.5337/2016.200';
dspace=# update metadatavalue set text_value = 'https://dx.doi.org/doi:10.1371/journal.pone.0062898' where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value = 'Http://dx.doi.org/doi:10.1371/journal.pone.0062898';
dspace=# update metadatavalue set text_value = 'https://dx.doi.10.1016/j.cosust.2013.11.012' where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value = 'http:dx.doi.10.1016/j.cosust.2013.11.012';
dspace=# update metadatavalue set text_value = 'https://dx.doi.org/10.1080/03632415.2014.883570' where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value = 'org/10.1080/03632415.2014.883570';
dspace=# update metadatavalue set text_value = 'https://dx.doi.org/10.15446/agron.colomb.v32n3.46052' where metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value = 'Doi: 10.15446/agron.colomb.v32n3.46052';
```

- And do another round of `http://` → `https://` cleanups:

```
dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'http://dx.doi.org', 'https://dx.doi.org') where resource_type_id=2 and metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value like 'http://dx.doi.org%';
```
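
- A quick sanity check afterwards (just a sketch, not something I ran) would be to count how many DOI values still don't match the canonical form:

```
dspace=# select count(*) from metadatavalue where resource_type_id=2 and metadata_field_id IN (select metadata_field_id from metadatafieldregistry where element = 'identifier' and qualifier = 'doi') and text_value not like 'https://dx.doi.org/%';
```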

- Run all DOI corrections on CGSpace
- Something to think about here is to write a [Curation Task](https://wiki.lyrasis.org/display/DSDOC5x/Curation+System#CurationSystem-ScriptedTasks) in Java to do these sanity checks / corrections every night
- Then we could add a cron job for them and run them from the command line like:

```
[dspace]/bin/dspace curate -t noop -i 10568/79891
```
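
- A cron job for that might look something like this (just a sketch; `fixdois` is a hypothetical task name that would have to be registered in DSpace's curation configuration first):

```
# hypothetical crontab entry: run the DOI sanity-check curation task on the whole repository every night at 3 AM
0 3 * * * [dspace]/bin/dspace curate -t fixdois -i all >> [dspace]/log/curate-dois.log 2>&1
```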

## 2017-02-20

- Run all system updates on DSpace Test and reboot the server
- Run CCAFS author corrections on DSpace Test and CGSpace and force a full Discovery reindex
- Fix the label of CCAFS subjects in the Atmire Listings and Reports module
- Help Sisay with SQL commands
- Help Paola from CCAFS with the Atmire Listings and Reports module
- Testing the `fix-metadata-values.py` script on macOS, and it seems like we don't need to use `.encode('utf-8')` anymore when printing strings to the screen
- It seems this might have only been a temporary problem, as both Python 3.5.2 and 3.6.0 are able to print the problematic string "Entwicklung & Ländlicher Raum" without the `encode()` call, but print it as bytes when it *is* used:

```
$ python
Python 3.6.0 (default, Dec 25 2016, 17:30:53)
>>> print('Entwicklung & Ländlicher Raum')
Entwicklung & Ländlicher Raum
>>> print('Entwicklung & Ländlicher Raum'.encode())
b'Entwicklung & L\xc3\xa4ndlicher Raum'
```

- So for now I will remove the encode call from the script (though it was never used on the versions on the Linux hosts), leading me to believe it really *was* a temporary problem, perhaps due to macOS or the Python build I was using

## 2017-02-21

- Testing regenerating PDF thumbnails, like I started in 2016-11
- It seems there is a bug in `filter-media` that causes it to process formats that aren't part of its configuration:

```
$ [dspace]/bin/dspace filter-media -f -i 10568/16856 -p "ImageMagick PDF Thumbnail"
File: earlywinproposal_esa_postharvest.pdf.jpg
FILTERED: bitstream 13787 (item: 10568/16881) and created 'earlywinproposal_esa_postharvest.pdf.jpg'
File: postHarvest.jpg.jpg
FILTERED: bitstream 16524 (item: 10568/24655) and created 'postHarvest.jpg.jpg'
```

- According to `dspace.cfg` the ImageMagick PDF Thumbnail plugin should only process PDFs:

```
filter.org.dspace.app.mediafilter.ImageMagickImageThumbnailFilter.inputFormats = BMP, GIF, image/png, JPG, TIFF, JPEG, JPEG 2000
filter.org.dspace.app.mediafilter.ImageMagickPdfThumbnailFilter.inputFormats = Adobe PDF
```

- I've sent a message to the mailing list and might file a Jira issue
- Ask Atmire about the failed interpolation of the `dspace.internalUrl` variable in `atmire-cua.cfg`

## 2017-02-22

- Atmire said I can add `dspace.internalUrl` to my build properties and the error will go away
- It should be the local URL for accessing Tomcat from the server's own perspective, i.e.: http://localhost:8080
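
- For example, something like this in the build properties file (a sketch; I haven't confirmed exactly which properties file Atmire expects it in):

```
# local URL for Tomcat as seen from the server itself
dspace.internalUrl = http://localhost:8080
```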

## 2017-02-26

- Find all fields with "http://hdl.handle.net" values (most are in `dc.identifier.uri`, but some are in other URL-related fields like `cg.link.reference`, `cg.identifier.dataurl`, and `cg.identifier.url`):

```
dspace=# select distinct metadata_field_id from metadatavalue where resource_type_id=2 and text_value like 'http://hdl.handle.net%';
dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'http://hdl.handle.net', 'https://hdl.handle.net') where resource_type_id=2 and metadata_field_id IN (25, 113, 179, 219, 220, 223) and text_value like 'http://hdl.handle.net%';
UPDATE 58633
```

- This works, but I'm thinking I'll wait on the replacement, as there are perhaps some other places that rely on `http://hdl.handle.net` (grep the code, it's scary how many things are hard coded)
- Send a message to the dspace-tech mailing list with concerns about this

## 2017-02-27

- LDAP users cannot log in today, looks to be an issue with CGIAR's LDAP server:

```
$ openssl s_client -connect svcgroot2.cgiarad.org:3269
CONNECTED(00000003)
depth=0 CN = SVCGROOT2.CGIARAD.ORG
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 CN = SVCGROOT2.CGIARAD.ORG
verify error:num=21:unable to verify the first certificate
verify return:1
---
Certificate chain
 0 s:/CN=SVCGROOT2.CGIARAD.ORG
   i:/CN=CGIARAD-RDWA-CA
---
```

- For some reason it is now signed by a private certificate authority
- This error seems to have started on 2017-02-25:

```
$ grep -c "unable to find valid certification path" [dspace]/log/dspace.log.2017-02-*
[dspace]/log/dspace.log.2017-02-01:0
[dspace]/log/dspace.log.2017-02-02:0
[dspace]/log/dspace.log.2017-02-03:0
[dspace]/log/dspace.log.2017-02-04:0
[dspace]/log/dspace.log.2017-02-05:0
[dspace]/log/dspace.log.2017-02-06:0
[dspace]/log/dspace.log.2017-02-07:0
[dspace]/log/dspace.log.2017-02-08:0
[dspace]/log/dspace.log.2017-02-09:0
[dspace]/log/dspace.log.2017-02-10:0
[dspace]/log/dspace.log.2017-02-11:0
[dspace]/log/dspace.log.2017-02-12:0
[dspace]/log/dspace.log.2017-02-13:0
[dspace]/log/dspace.log.2017-02-14:0
[dspace]/log/dspace.log.2017-02-15:0
[dspace]/log/dspace.log.2017-02-16:0
[dspace]/log/dspace.log.2017-02-17:0
[dspace]/log/dspace.log.2017-02-18:0
[dspace]/log/dspace.log.2017-02-19:0
[dspace]/log/dspace.log.2017-02-20:0
[dspace]/log/dspace.log.2017-02-21:0
[dspace]/log/dspace.log.2017-02-22:0
[dspace]/log/dspace.log.2017-02-23:0
[dspace]/log/dspace.log.2017-02-24:0
[dspace]/log/dspace.log.2017-02-25:7
[dspace]/log/dspace.log.2017-02-26:8
[dspace]/log/dspace.log.2017-02-27:90
```

- Also, it seems that we need to use a different user for LDAP binds, as we're still using the temporary one from the root migration, so maybe we can go back to the previous user we were using
- So it looks like the certificate is invalid AND the bind users we had been using were deleted
- Biruk Debebe recreated the bind user and now we are just waiting for CGNET to update their certificates
- Regarding the `filter-media` issue I found earlier, it seems that the ImageMagick PDF plugin will also process JPGs if they are in the "Content Files" (aka `ORIGINAL`) bundle
- The problem likely lies in the logic of `ImageMagickThumbnailFilter.java`, as `ImageMagickPdfThumbnailFilter.java` extends it
- Run CIAT corrections on CGSpace:

```
dspace=# update metadatavalue set authority='3026b1de-9302-4f3e-85ab-ef48da024eb2', confidence=600 where resource_type_id=2 and metadata_field_id=3 and text_value = 'International Center for Tropical Agriculture';
```

- CGNET has fixed the certificate chain on their LDAP server
- Redeploy CGSpace and DSpace Test on the latest `5_x-prod` branch with fixes for the LDAP bind user
- Run all system updates on the CGSpace server and reboot

## 2017-02-28

- After running the CIAT corrections and updating the Discovery and authority indexes, there is still no change in the number of items listed for CIAT in Discovery
- Ah, this is probably because some items have the `International Center for Tropical Agriculture` author twice, which I first noticed in 2016-12 but couldn't figure out how to fix
- I think I can do it by first exporting all metadatavalues that have the author `International Center for Tropical Agriculture`:

```
dspace=# \copy (select resource_id, metadata_value_id from metadatavalue where resource_type_id=2 and metadata_field_id=3 and text_value='International Center for Tropical Agriculture') to /tmp/ciat.csv with csv;
COPY 1968
```

- And then use awk to print the duplicate lines (rows whose resource_id has already been seen) to a separate file:

```
$ awk -F',' 'seen[$1]++' /tmp/ciat.csv > /tmp/ciat-dupes.csv
```

- From that file I can create a list of 279 deletes and put them in a batch script like:

```
delete from metadatavalue where resource_type_id=2 and metadata_field_id=3 and metadata_value_id=2742061;
```
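
- One way to generate that batch script from the dupes file and run it (a sketch; it assumes the second CSV column is the `metadata_value_id`, as in the `\copy` above):

```
$ awk -F',' '{print "delete from metadatavalue where resource_type_id=2 and metadata_field_id=3 and metadata_value_id=" $2 ";"}' /tmp/ciat-dupes.csv > /tmp/ciat-deletes.sql
$ psql -d dspace -U dspace -f /tmp/ciat-deletes.sql
```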