diff --git a/content/posts/2021-02.md b/content/posts/2021-02.md
index 7134bd0af..6889fcaf1 100644
--- a/content/posts/2021-02.md
+++ b/content/posts/2021-02.md
@@ -642,7 +642,7 @@ UPDATE 18659
 $ dspace metadata-import -f /tmp/0.csv
 ```
 
-- It took FOREVER to import each file... like several hours. MY GOD DSpace 6 is slow.
+- It took FOREVER to import each file... like several hours *each*. MY GOD DSpace 6 is slow.
 - Help Dominique Perera debug some issues with the WordPress DSpace importer plugin from Macaroni Bros
 - She is not seeing the community list for CGSpace, and I see weird requests like this in the logs:
@@ -653,4 +653,90 @@ $ dspace metadata-import -f /tmp/0.csv
 
 - The first request is OK, but the second one is malformed for sure
 
+## 2021-02-24
+
+- Export a list of journals for Peter to look through:
+
+```console
+localhost/dspace63= > \COPY (SELECT DISTINCT text_value as "cg.journal", count(*) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=251 GROUP BY text_value ORDER BY count DESC) to /tmp/2021-02-24-journals.csv WITH CSV HEADER;
+COPY 3345
+```
+
+- Start a fresh harvesting on AReS because Udana mapped some items today and wants to include them in his report:
+
+```console
+$ curl -XDELETE 'http://localhost:9200/openrxv-items-temp'
+# start indexing in AReS
+```
+
+- Also, I want to include the new series name/number cleanups so it's not a total waste of time
+
+## 2021-02-25
+
+- Hmm, the AReS harvest last night seems to have finished successfully, but the number of items is less than I was expecting:
+
+```console
+$ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty'
+{
+  "count" : 99546,
+  "_shards" : {
+    "total" : 1,
+    "successful" : 1,
+    "skipped" : 0,
+    "failed" : 0
+  }
+}
+```
+
+- The current items index has 101380 items... I wonder what happened
+  - I started a new indexing
+
+## 2021-02-26
+
+- Last night's indexing was more successful: there are now 101479 items in the index
+- Yesterday Yousef sent a [pull request](https://github.com/ilri/OpenRXV/pull/77/) for the next/previous buttons on OpenRXV
+  - I tested it this morning and it seems to be working
+
+## 2021-02-28
+
+- Abenet asked me to import seventy-three records for CRP Forests, Trees and Agroforestry
+  - I checked them briefly and found that more than thirty were journal articles, and none of them had `cg.journal`, `cg.volume`, `cg.issue`, or `dcterms.license`, so I spent a little time adding them
+  - I used GREL expressions to extract the journal volume and issue from the citation into new columns:
+
+```console
+value.partition(/[0-9]+\([0-9]+\)/)[1].replace(/\(.*\)/,"")
+value.partition(/[0-9]+\([0-9]+\)/)[1].replace(/^\d+\((\d+)\)/,"$1")
+```
+
+- This `value.partition` was new to me... and it took me a bit of time to figure out whether I needed to escape the parentheses in the issue number or not (no) and how to reference a capture group with `value.replace`
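+- A rough sketch of what those two expressions do, using a made-up citation string (the input and outputs here are hypothetical; GREL's `value.partition` returns an array of `[before, match, after]`, so `[1]` is the matched fragment):
+
+```console
+# value = "Agricultural Systems 186(3): 102-117" (hypothetical)
+value.partition(/[0-9]+\([0-9]+\)/)[1]                               # "186(3)", the volume(issue) fragment
+value.partition(/[0-9]+\([0-9]+\)/)[1].replace(/\(.*\)/,"")          # "186", the volume
+value.partition(/[0-9]+\([0-9]+\)/)[1].replace(/^\d+\((\d+)\)/,"$1") # "3", the issue, via the capture group
+```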
+- I tried to check the 1095 CIFOR records from last week for duplicates on DSpace Test, but the page says "Processing" and never loads
+  - I don't see any errors in the logs, but there are two jQuery errors in the browser console
+  - I filed [an issue](https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=934) with Atmire
+- Upload twelve items to CGSpace for Peter
+- Niroshini from IWMI is still having issues adding WLE subjects to items during the metadata review step in the workflow
+- It seems the BatchEditConsumer log spam is gone since I applied [Atmire's patch](https://github.com/ilri/DSpace/pull/462):
+
+```console
+$ grep -c 'BatchEditConsumer should not have been given' dspace.log.2021-02-[12]*
+dspace.log.2021-02-10:5067
+dspace.log.2021-02-11:2647
+dspace.log.2021-02-12:4231
+dspace.log.2021-02-13:221
+dspace.log.2021-02-14:0
+dspace.log.2021-02-15:0
+dspace.log.2021-02-16:0
+dspace.log.2021-02-17:0
+dspace.log.2021-02-18:0
+dspace.log.2021-02-19:0
+dspace.log.2021-02-20:0
+dspace.log.2021-02-21:0
+dspace.log.2021-02-22:0
+dspace.log.2021-02-23:0
+dspace.log.2021-02-24:0
+dspace.log.2021-02-25:0
+dspace.log.2021-02-26:0
+dspace.log.2021-02-27:0
+dspace.log.2021-02-28:0
+```
+
diff --git a/content/posts/2021-03.md b/content/posts/2021-03.md
new file mode 100644
index 000000000..9d627ca77
--- /dev/null
+++ b/content/posts/2021-03.md
@@ -0,0 +1,97 @@
+---
+title: "March, 2021"
+date: 2021-03-01T10:13:54+02:00
+author: "Alan Orth"
+categories: ["Notes"]
+---
+
+## 2021-03-01
+
+- Discuss some OpenRXV issues with Abdullah from CodeObia
+  - He's trying to work on the DSpace 6+ metadata schema autoimport using the DSpace 6+ REST API
+  - Also, we found some issues building and running OpenRXV currently due to shifts in the Node.js dependency ecosystem
+
+<!--more-->
+
+## 2021-03-02
+
+- I fixed three build and runtime issues in OpenRXV:
+  - [fix highcharts-angular and ngx-tour-core build](https://github.com/ilri/OpenRXV/pull/80)
+  - [frontend/package.json: Pin @types/ramda at 0.27.34](https://github.com/ilri/OpenRXV/pull/82)
+- Then I merged a few fixes that Abdullah had worked on last week
+
+## 2021-03-03
+
+- I [fixed another frontend build warning on OpenRXV](https://github.com/ilri/OpenRXV/issues/83)
+- Then I [updated the frontend container to use Node.js 12 and Ubuntu 20.04](https://github.com/ilri/OpenRXV/pull/84)
+- Also, I [added a GitHub Actions workflow to build the frontend](https://github.com/ilri/OpenRXV/pull/85)
+- I did some testing of Abdullah's patch for the values mapping search on OpenRXV
+  - It still doesn't work with multi-word values, so I recorded a video with wf-recorder and uploaded it to [the issue](https://github.com/ilri/OpenRXV/issues/43) for him to investigate
+
+## 2021-03-04
+
+- Peter has been having issues with the workflow since yesterday
+  - I looked at the Munin stats and see a high number of database locks since yesterday
+
+![PostgreSQL locks week](/cgspace-notes/2021/03/postgres_locks_ALL-week.png)
+![PostgreSQL connections week](/cgspace-notes/2021/03/postgres_connections_cgspace-week.png)
+
+- I looked at the number of connections in PostgreSQL and it's definitely high again:
+
+```console
+$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | wc -l
+1020
+```
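+- Note that this count includes the psql header and footer rows, and it counts locks rather than distinct connections; a rough sketch of how one might see the connections themselves, grouped by state (standard `pg_stat_activity` columns, so the exact output will vary):
+
+```console
+$ psql -c 'SELECT state, count(*) FROM pg_stat_activity GROUP BY state ORDER BY count DESC;'
+```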
+- I reported it to Atmire to take a look, on the [same issue](https://tracker.atmire.com/tickets-cgiar-ilri/view-ticket?id=851) where we had been tracking this before
+- Abenet asked me to add a new ORCID identifier for ILRI staff member Zoe Campbell
+- I added it to the controlled vocabulary and then tagged her existing items on CGSpace using my `add-orcid-identifiers-csv.py` script:
+
+```console
+$ cat 2021-03-04-add-zoe-campbell-orcid.csv
+dc.contributor.author,cg.creator.identifier
+"Campbell, Zoë","Zoe Campbell: 0000-0002-4759-9976"
+"Campbell, Zoe A.","Zoe Campbell: 0000-0002-4759-9976"
+$ ./ilri/add-orcid-identifiers-csv.py -i 2021-03-04-add-zoe-campbell-orcid.csv -db dspace -u dspace -p 'fuuu'
+```
+
+- I still need to do cleanup on the journal articles metadata
+  - Peter sent me some cleanups, but I can't use them in the search/replace format he sent
+  - I think it's better to export the metadata values with IDs and import the cleaned-up ones as a CSV
+
+```console
+localhost/dspace63= > \COPY (SELECT dspace_object_id AS id, text_value as "cg.journal" FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=251) to /tmp/2021-02-24-journals.csv WITH CSV HEADER;
+COPY 32087
+```
+
+- I used OpenRefine to remove all journal values that didn't contain one of these characters: `;` `(` `)`
+  - Then I cloned the `cg.journal` field to `cg.volume` and `cg.issue`
+  - I used GREL expressions like these to extract the journal name, volume, and issue:
+
+```console
+value.partition(';')[0].trim() # to get journal names
+value.partition(/[0-9]+\([0-9]+\)/)[1].replace(/^(\d+)\(\d+\)/,"$1") # to get journal volumes
+value.partition(/[0-9]+\([0-9]+\)/)[1].replace(/^\d+\((\d+)\)/,"$1") # to get journal issues
+```
+
+- Then I uploaded the changes to CGSpace using `dspace metadata-import`
+- Margarita from CCAFS was asking about an error deleting some items that were showing up in Google and should have been private
+  - The error was "Authorization denied for action OBSOLETE (DELETE) on BITSTREAM:bd157345-448e ..."
+  - I searched the DSpace issue tracker and found several issues reporting this:
+    - [DS-3985 Delete item fails](https://jira.lyrasis.org/browse/DS-3985)
+    - [DS-4004 Authorization denied Exception when trying to delete permanently an item, collection or community as a non-Admin user](https://jira.lyrasis.org/browse/DS-4004)
+    - [DS-4297 Authorization error when trying to delete item by submitter/administrator](https://jira.lyrasis.org/browse/DS-4297)
+  - The issue apparently affects non-admin users who are in the admin and submit groups of the owning collection...
+  - In this case the item was uploaded to the CCAFS Reports collection, and Margarita is a non-admin user who is a member of the collection's admin and submit groups, exactly as the issue described
+  - I added a comment about our issue to [DS-4297](https://jira.lyrasis.org/browse/DS-4297)
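+  - If it helps debug this, the policies on the affected bitstream could be inspected directly in the database; a sketch assuming the DSpace 6 `resourcepolicy` schema, with the truncated UUID from the error message left as a placeholder:
+
+```console
+localhost/dspace63= > SELECT policy_id, action_id, eperson_id, epersongroup_id FROM resourcepolicy WHERE dspace_object = 'bd157345-448e-...';
+```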
+- Yesterday Abenet added me to the approver/editor steps of a WLE collection so we can try to figure out why Niroshini is having issues adding metadata to Udana's submissions
+  - I edited Udana's submission to CGSpace:
+    - corrected the title
+    - added language English
+    - changed the link to the external item page instead of the PDF
+    - added SDGs from the external item page
+    - added AGROVOC subjects from the external item page
+    - added pagination (extent)
+    - changed the license to "other" because CC-BY-NC-ND is not printed anywhere in the PDF or on the external item page
+

[The remaining hunks in this commit touch only the regenerated HTML under docs/: every monthly page from docs/2015-11/index.html through docs/2021-01/index.html gets the same update, a refreshed <meta> tag and a "Recent Posts" sidebar that now includes "March, 2021" and no longer lists "November, 2020". docs/2021-02/index.html also updates its JSON-LD wordCount from 3754 to 4170 and its dateModified from 2021-02-21T20:37:27+02:00 to 2021-02-24T09:21:07+02:00.]