diff --git a/content/posts/2021-07.md b/content/posts/2021-07.md
index 8acd7b04b..504e549ac 100644
--- a/content/posts/2021-07.md
+++ b/content/posts/2021-07.md
@@ -558,3 +558,19 @@ $ cat AS* /tmp/ddos-networks-to-block.txt | sed -e '/^$/d' -e '/^#/d' -e '/^{/d'
 - The last time we changed this was in 2020 (XMLUI's `Navigation.java`), and I think it makes a lot of sense so I moved it up, under the account block:
 
 ![CGSpace XMLUI navigation](/cgspace-notes/2021/07/context-navigation-menu.png)
+
+## 2021-07-23
+
+- Spend some time reviewing patches for the upcoming DSpace 6.4 release
+
+## 2021-07-24
+
+- Spend some time reviewing patches for the upcoming DSpace 6.4 release
+- Run all system updates on DSpace Test (linode26) and reboot it
+
+## 2021-07-29
+
+- I figured out why [some communities / collections were seemingly missing from AReS](https://github.com/ilri/OpenRXV/issues/62)
+  - It was not related to harvesting, but rather to our value mappings replacing values like "CGIAR Research Program on Livestock" with "Livestock"
+
diff --git a/content/posts/2021-08.md b/content/posts/2021-08.md
new file mode 100644
index 000000000..3c7073704
--- /dev/null
+++ b/content/posts/2021-08.md
@@ -0,0 +1,56 @@
+---
+title: "August, 2021"
+date: 2021-08-01T09:01:07+03:00
+author: "Alan Orth"
+categories: ["Notes"]
+---
+
+## 2021-08-01
+
+- Update Docker images on AReS server (linode20) and reboot the server:
+
+```console
+# docker images | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2 | grep -v none | xargs -L1 docker pull
+```
+
+- I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
+- First running all existing updates, taking some backups, checking for broken packages, and then rebooting:
+
+```console
+# apt update && apt dist-upgrade
+# apt autoremove && apt autoclean
+# check for any packages with residual configs we can purge
+# dpkg -l | grep -E '^rc' | awk '{print $2}'
+# dpkg -l | grep -E '^rc' | awk '{print $2}' | xargs dpkg -P
+# dpkg -C
+# dpkg -l > 2021-08-01-linode20-dpkg.txt
+# tar -I zstd -cvf 2021-08-01-etc.tar.zst /etc
+# reboot
+# sed -i 's/bionic/focal/' /etc/apt/sources.list.d/*.list
+# do-release-upgrade
+```
+
+- ... but of course it hit [the libxcrypt bug](https://bugs.launchpad.net/ubuntu/+source/libxcrypt/+bug/1903838)
+- I had to get a copy of libcrypt.so.1.1.0 from a working Ubuntu 20.04 system and finish the upgrade manually
+
+```console
+# apt install -f
+# apt dist-upgrade
+# reboot
+```
+
+- After rebooting I purged all packages with residual configs and cleaned up again:
+
+```console
+# dpkg -l | grep -E '^rc' | awk '{print $2}' | xargs dpkg -P
+# apt autoremove && apt autoclean
+```
+
+- Then I cleared my local Ansible fact cache and re-ran the [infrastructure playbooks](https://github.com/ilri/rmg-ansible-public)
+- Open [an issue for the value mappings global replacement bug in OpenRXV](https://github.com/ilri/OpenRXV/issues/111)
+- Advise Peter and Abenet on expected CGSpace budget for 2022
+- Start a fresh harvesting on AReS (linode20)
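Two notes on the trickier steps in the new August entry; both are hedged sketches, not part of the committed diff.

The `docker images | sed | cut` pipeline above reconstructs `repository:tag` pairs from the padded table output. A minimal alternative sketch using Docker's built-in Go-template formatting, which sidesteps the whitespace munging (assuming a Docker release with `--format` support):

```console
# docker images --format '{{.Repository}}:{{.Tag}}' | grep -v '<none>' | xargs -L1 docker pull
```

The libxcrypt failure bites because the release upgrade can remove `libcrypt.so.1` mid-run, breaking anything linked against it (Perl, PAM, and friends), including the upgrader itself. A minimal sketch of the manual recovery described in the post, assuming an amd64 host; `donor-host` is an illustrative placeholder:

```console
# copy the library from a healthy Ubuntu 20.04 machine (donor-host is a placeholder)
# scp donor-host:/usr/lib/x86_64-linux-gnu/libcrypt.so.1.1.0 /usr/lib/x86_64-linux-gnu/
# regenerate the libcrypt.so.1 SONAME symlink and the linker cache
# ldconfig
# then the interrupted upgrade can continue
# apt install -f
```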
[docs/2015-11/index.html … docs/2021-06/index.html: regenerated static-site pages. Each monthly page's sidebar archive list gains an "August, 2021" entry and drops "March, 2021", and the last-modified <meta> tags are bumped; a few pages (2018-03, 2020-10, 2021-03, 2021-04) also re-emit rendered code blocks with identical content and markup-only changes.]
[docs/2021-07/index.html: regenerated with the new 2021-07-23, 2021-07-24, and 2021-07-29 sections rendered; "wordCount" goes from "3400" to "3471" and "dateModified" from 2021-07-20T22:37:59+03:00 to 2021-07-22T12:45:40+03:00; sidebar archive list updated the same way.]
[docs/2021-08/index.html: new 233-line page rendering the "August, 2021" post above.]
[docs/404.html, docs/categories/index.html, docs/categories/index.xml, docs/categories/notes/index.html: sidebar archive lists updated the same way; the Categories/Notes feed dates move from Thu, 01 Jul 2021 08:53:07 +0300 to Sun, 01 Aug 2021 09:01:07 +0300; the Notes listing gains an "August, 2021" teaser at the top and its older teasers shift down.]