diff --git a/content/posts/2019-09.md b/content/posts/2019-09.md
index 6bc4ca2eb..15c2baea1 100644
--- a/content/posts/2019-09.md
+++ b/content/posts/2019-09.md
@@ -291,4 +291,18 @@ $ dspace import -a me@cgiar.org -m 2019-09-20-bioversity2.map -s /home/aorth/Bio
 - Continue with institutional author normalization
 - Ask which collection to map items with type Brochure, Journal Item, and Thesis?
 
+## 2019-09-21
+
+- Re-upload the [IITA Sept 6 (20196th.xls) records to DSpace Test](https://dspacetest.cgiar.org/handle/10568/105116) after I did the re-sync yesterday
+  - Then I looked at the records again and sent some feedback about three duplicates to Bosede
+  - Also I noticed that many journal articles have the journal and page information in the citation, but are missing `dc.source` and `dc.format.extent` fields
+- Play with language identification using the langdetect, fasttext, polyglot, and langid libraries
+  - polyglot requires too many system dependencies to compile
+  - langdetect didn't seem as accurate as the others
+  - fasttext is likely the best, but [prints a blank line to the console when loading a model](https://github.com/facebookresearch/fastText/issues/909)
+  - langid seems to be the best choice overall considering the above experiences
+- I added very experimental language detection to the [csv-metadata-quality](https://github.com/ilri/csv-metadata-quality) module
+  - It works by checking the predicted language of the `dc.title` field against the item's `dc.language.iso` field
+  - I tested it on the Bioversity migration data set and actually managed to correct about eight incorrect language fields in their records!
+
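For context, here is a minimal sketch of the title-versus-declared-language idea described in the new notes above, using langid (one of the libraries tested). The `check_title_language` helper, the CSV file name, and the column handling are illustrative assumptions, not the actual csv-metadata-quality implementation; only the `dc.title` and `dc.language.iso` field names come from the notes.

```python
# Rough sketch (not the actual csv-metadata-quality code): predict the
# language of each item's dc.title and compare it with the declared
# dc.language.iso value, flagging mismatches for manual review.
#
# Requires: pip install langid
import csv

import langid


def check_title_language(csv_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.get("dc.title", "")
            declared = row.get("dc.language.iso", "").strip().lower()
            if not title or not declared:
                continue
            # langid.classify() returns a (language code, score) tuple
            predicted, _score = langid.classify(title)
            # NOTE: langid returns ISO 639-1 codes; if the repository stores
            # three-letter codes (e.g. "eng"), a real check would normalize
            # both sides before comparing.
            if predicted != declared:
                print(f"Possible wrong language '{declared}' "
                      f"(title looks like '{predicted}'): {title}")


# Hypothetical metadata export, e.g. from a DSpace collection
check_title_language("bioversity-migration.csv")
```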
"/> - + diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html index 1b97cbd98..e5ec4a91c 100644 --- a/docs/2016-02/index.html +++ b/docs/2016-02/index.html @@ -41,7 +41,7 @@ I noticed we have a very interesting list of countries on CGSpace: Not only are there 49,000 countries, we have some blanks (25)… Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE” "/> - + diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html index 95c43f53c..812e2daf5 100644 --- a/docs/2016-03/index.html +++ b/docs/2016-03/index.html @@ -27,7 +27,7 @@ Looking at issues with author authorities on CGSpace For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server "/> - + diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html index 0e02d3039..6f4c63cbb 100644 --- a/docs/2016-04/index.html +++ b/docs/2016-04/index.html @@ -31,7 +31,7 @@ After running DSpace for over five years I’ve never needed to look in any This will save us a few gigs of backup space we’re paying for on S3 Also, I noticed the checker log has some errors we should pay attention to: "/> - + diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html index 8492b6f86..d1eb524ce 100644 --- a/docs/2016-05/index.html +++ b/docs/2016-05/index.html @@ -37,7 +37,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period! 3168 "/> - + diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html index 325dd7d2e..6e762e06b 100644 --- a/docs/2016-06/index.html +++ b/docs/2016-06/index.html @@ -33,7 +33,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship "/> - + diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html index 6a4bd6c57..d1b012bd7 100644 --- a/docs/2016-07/index.html +++ b/docs/2016-07/index.html @@ -47,7 +47,7 @@ text_value In this case the select query was showing 95 results before the update "/> - + diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html index 0b8a541e7..058746d78 100644 --- a/docs/2016-08/index.html +++ b/docs/2016-08/index.html @@ -45,7 +45,7 @@ $ git reset --hard ilri/5_x-prod $ git rebase -i dspace-5.5 "/> - + diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html index 4b8e29ba7..19076d277 100644 --- a/docs/2016-09/index.html +++ b/docs/2016-09/index.html @@ -37,7 +37,7 @@ It looks like we might be able to use OUs now, instead of DCs: $ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)" "/> - + diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html index 476ec6f42..2d37dd10d 100644 --- a/docs/2016-10/index.html +++ b/docs/2016-10/index.html @@ -45,7 +45,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id 0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X "/> - + diff --git a/docs/2016-11/index.html b/docs/2016-11/index.html index eb83a0654..90a08cf99 100644 --- a/docs/2016-11/index.html +++ b/docs/2016-11/index.html @@ -27,7 +27,7 @@ 
Add dc.type to the output options for Atmire’s Listings and Reports module "/> - + diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html index 02a719d7d..db18b6488 100644 --- a/docs/2016-12/index.html +++ b/docs/2016-12/index.html @@ -53,7 +53,7 @@ I’ve raised a ticket with Atmire to ask Another worrying error from dspace.log is: "/> - + diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html index 319cc6eea..8a3e29cf8 100644 --- a/docs/2017-01/index.html +++ b/docs/2017-01/index.html @@ -27,7 +27,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s I tested on DSpace Test as well and it doesn’t work there either I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years "/> - + diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html index 413e224ea..8d9942d5b 100644 --- a/docs/2017-02/index.html +++ b/docs/2017-02/index.html @@ -53,7 +53,7 @@ Create issue on GitHub to track the addition of CCAFS Phase II project tags (#30 Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name "/> - + diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html index 959f5a4fd..b3f730199 100644 --- a/docs/2017-03/index.html +++ b/docs/2017-03/index.html @@ -61,7 +61,7 @@ $ identify ~/Desktop/alc_contrastes_desafios.jpg /Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000 "/> - + diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html index 8aafd5e71..ee37ee6f2 100644 --- a/docs/2017-04/index.html +++ b/docs/2017-04/index.html @@ -47,7 +47,7 @@ Testing the CMYK patch on a collection with 650 items: $ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt "/> - + diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html index 05d338e09..bf7705d20 100644 --- a/docs/2017-05/index.html +++ b/docs/2017-05/index.html @@ -15,7 +15,7 @@ - + diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html index c97d17df7..6fd91eada 100644 --- a/docs/2017-06/index.html +++ b/docs/2017-06/index.html @@ -15,7 +15,7 @@ - + diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html index d67304833..92c0e126c 100644 --- a/docs/2017-07/index.html +++ b/docs/2017-07/index.html @@ -39,7 +39,7 @@ Merge changes for WLE Phase II theme rename (#329) Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML: "/> - + diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html index e4b395440..620c5c5ee 100644 --- a/docs/2017-08/index.html +++ b/docs/2017-08/index.html @@ -59,7 +59,7 @@ This was due to newline characters in the dc.description.abstract column, which I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet "/> - + diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html index a9fff9f28..ee8386e82 100644 --- a/docs/2017-09/index.html +++ b/docs/2017-09/index.html @@ -35,7 +35,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as 
the group "/> - + diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html index 2ec06ef65..55be1a756 100644 --- a/docs/2017-10/index.html +++ b/docs/2017-10/index.html @@ -37,7 +37,7 @@ There appears to be a pattern but I’ll have to look a bit closer and try t Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections "/> - + diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html index 2c5fc90d0..0a62c0b63 100644 --- a/docs/2017-11/index.html +++ b/docs/2017-11/index.html @@ -55,7 +55,7 @@ dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue COPY 54701 "/> - + diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html index 597535068..93ff94c35 100644 --- a/docs/2017-12/index.html +++ b/docs/2017-12/index.html @@ -29,7 +29,7 @@ The logs say “Timeout waiting for idle object” PostgreSQL activity says there are 115 connections currently The list of connections to XMLUI and REST API for today: "/> - + diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html index 4e2a025b1..9783b4979 100644 --- a/docs/2018-01/index.html +++ b/docs/2018-01/index.html @@ -163,7 +163,7 @@ dspace.log.2018-01-02:34 Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains "/> - + diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html index 3b856e99e..7c984fe29 100644 --- a/docs/2018-02/index.html +++ b/docs/2018-02/index.html @@ -29,7 +29,7 @@ We don’t need to distinguish between internal and external works, so that Yesterday I figured out how to monitor DSpace sessions using JMX I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01 "/> - + diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html index cfb9d0369..d8b72f0ae 100644 --- a/docs/2018-03/index.html +++ b/docs/2018-03/index.html @@ -23,7 +23,7 @@ Export a CSV of the IITA community metadata for Martin Mueller Export a CSV of the IITA community metadata for Martin Mueller "/> - + diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html index d989338f3..ae7b062ce 100644 --- a/docs/2018-04/index.html +++ b/docs/2018-04/index.html @@ -25,7 +25,7 @@ Catalina logs at least show some memory errors yesterday: I tried to test something on DSpace Test but noticed that it’s down since god knows when Catalina logs at least show some memory errors yesterday: "/> - + diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html index 15acfb0a7..814246e9e 100644 --- a/docs/2018-05/index.html +++ b/docs/2018-05/index.html @@ -37,7 +37,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E Then I reduced the JVM heap size from 6144 back to 5120m Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use "/> - + diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html index b7e547562..355530003 100644 --- a/docs/2018-06/index.html +++ b/docs/2018-06/index.html @@ -65,7 +65,7 @@ user 8m5.056s sys 2m7.289s "/> - + diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html index 4de8aa202..b6907627a 100644 --- a/docs/2018-07/index.html +++ b/docs/2018-07/index.html @@ -39,7 +39,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r There is insufficient memory for the Java Runtime Environment 
to continue. "/> - + diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html index 629300df4..7ddcec4f1 100644 --- a/docs/2018-08/index.html +++ b/docs/2018-08/index.html @@ -57,7 +57,7 @@ The server only has 8GB of RAM so we’ll eventually need to upgrade to a la I ran all system updates on DSpace Test and rebooted it "/> - + diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html index d4a159d0d..9aab90b0e 100644 --- a/docs/2018-09/index.html +++ b/docs/2018-09/index.html @@ -29,7 +29,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again: "/> - + diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html index 39bb3566a..f1043ddf9 100644 --- a/docs/2018-10/index.html +++ b/docs/2018-10/index.html @@ -25,7 +25,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now "/> - + diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html index 4d304eb26..8deb8ef42 100644 --- a/docs/2018-11/index.html +++ b/docs/2018-11/index.html @@ -39,7 +39,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage Today these are the top 10 IPs: "/> - + diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html index 2371b3dee..5dc401a2b 100644 --- a/docs/2018-12/index.html +++ b/docs/2018-12/index.html @@ -39,7 +39,7 @@ Then I ran all system updates and restarted the server I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week "/> - + diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html index 46653ac1e..5af3f47d3 100644 --- a/docs/2019-01/index.html +++ b/docs/2019-01/index.html @@ -53,7 +53,7 @@ I don’t see anything interesting in the web server logs around that time t 903 54.70.40.11 "/> - + diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html index ec9ea768f..5762bebc9 100644 --- a/docs/2019-02/index.html +++ b/docs/2019-02/index.html @@ -81,7 +81,7 @@ user 0m22.203s sys 0m1.979s "/> - + diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html index a051663e5..6b4d63a41 100644 --- a/docs/2019-03/index.html +++ b/docs/2019-03/index.html @@ -45,7 +45,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs "/> - + diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html index 63189f5bd..090094814 100644 --- a/docs/2019-04/index.html +++ b/docs/2019-04/index.html @@ -71,7 +71,7 @@ $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspa $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d "/> - + diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html index d2df507d7..09207f012 100644 --- a/docs/2019-05/index.html +++ b/docs/2019-05/index.html @@ -51,7 
+51,7 @@ DELETE 1 But after this I tried to delete the item from the XMLUI and it is still present… "/> - + diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html index a1f42d976..3c9fe775b 100644 --- a/docs/2019-06/index.html +++ b/docs/2019-06/index.html @@ -37,7 +37,7 @@ Run system updates on CGSpace (linode18) and reboot it Skype with Marie-Angélique and Abenet about CG Core v2 "/> - + diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html index c081437b9..391fffb62 100644 --- a/docs/2019-07/index.html +++ b/docs/2019-07/index.html @@ -37,7 +37,7 @@ CGSpace Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community "/> - + diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html index 76d169dd0..141532d12 100644 --- a/docs/2019-08/index.html +++ b/docs/2019-08/index.html @@ -49,7 +49,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck Run system updates on DSpace Test (linode19) and reboot it "/> - + diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html index 916e1a1b4..fc9eef51b 100644 --- a/docs/2019-09/index.html +++ b/docs/2019-09/index.html @@ -40,7 +40,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning: - + @@ -75,7 +75,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning: 9124 45.5.186.2 "/> - + @@ -85,9 +85,9 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning: "@type": "BlogPosting", "headline": "September, 2019", "url": "https:\/\/alanorth.github.io\/cgspace-notes\/2019-09\/", - "wordCount": "2166", + "wordCount": "2325", "datePublished": "2019-09-01T10:17:51\x2b03:00", - "dateModified": "2019-09-20T13:25:59\x2b03:00", + "dateModified": "2019-09-21T02:25:19\x2b03:00", "author": { "@type": "Person", "name": "Alan Orth" @@ -510,6 +510,31 @@ $ dspace import -a me@cgiar.org -m 2019-09-20-bioversity2.map -s /home/aorth/Bio +
dc.source
and dc.format.extent
fieldsdc.title
field against the item’s dc.language.iso
field