-
diff --git a/docs/2016-01/index.html b/docs/2016-01/index.html
index da66cd10d..dac05fb9d 100644
--- a/docs/2016-01/index.html
+++ b/docs/2016-01/index.html
@@ -28,7 +28,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_
I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.
Update GitHub wiki for documentation of maintenance tasks.
"/>
-
+
@@ -200,7 +200,9 @@ $ find SimpleArchiveForBio/ -iname “*.pdf” -exec basename {} ; | sor
-
-
diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html
index 39d4699e6..4ed64076f 100644
--- a/docs/2016-02/index.html
+++ b/docs/2016-02/index.html
@@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace:
Not only are there 49,000 countries, we have some blanks (25)…
Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE”
"/>
-
+
@@ -378,7 +378,9 @@ Bitstream: tést señora alimentación.pdf
-
-
diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html
index 8f2cf0348..1b8f72c5c 100644
--- a/docs/2016-03/index.html
+++ b/docs/2016-03/index.html
@@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace
For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module
Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server
"/>
-
+
@@ -316,7 +316,9 @@ Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Ja
-
-
diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html
index c4e3f6fe5..a7d8f6d87 100644
--- a/docs/2016-04/index.html
+++ b/docs/2016-04/index.html
@@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any
This will save us a few gigs of backup space we’re paying for on S3
Also, I noticed the checker log has some errors we should pay attention to:
"/>
-
+
@@ -495,7 +495,9 @@ dspace.log.2016-04-27:7271
-
-
diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html
index 50e06a83f..b152417df 100644
--- a/docs/2016-06/index.html
+++ b/docs/2016-06/index.html
@@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec
You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets
Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship
"/>
-
+
@@ -409,7 +409,9 @@ $ ./delete-metadata-values.py -f dc.contributor.corporate -i Corporate-Authors-D
-
-
diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html
index 31f4a07ec..837ad1bbf 100644
--- a/docs/2016-07/index.html
+++ b/docs/2016-07/index.html
@@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
In this case the select query was showing 95 results before the update
"/>
-
+
@@ -325,7 +325,9 @@ discovery.index.authority.ignore-variants=true
-
-
diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html
index 940083167..f6fd34275 100644
--- a/docs/2016-12/index.html
+++ b/docs/2016-12/index.html
@@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r
I’ve raised a ticket with Atmire to ask
Another worrying error from dspace.log is:
"/>
-
+
@@ -784,7 +784,9 @@ $ exit
-
-
diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html
index 81c41c029..2c954db9a 100644
--- a/docs/2017-01/index.html
+++ b/docs/2017-01/index.html
@@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s
I tested on DSpace Test as well and it doesn’t work there either
I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years
"/>
-
+
@@ -369,7 +369,9 @@ $ gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -
-
-
diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html
index 5140ddff4..a5b5509c4 100644
--- a/docs/2017-02/index.html
+++ b/docs/2017-02/index.html
@@ -50,7 +50,7 @@ DELETE 1
Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301)
Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name
"/>
-
+
@@ -423,7 +423,9 @@ COPY 1968
-
-
diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html
index 312913a4e..7c99de375 100644
--- a/docs/2017-07/index.html
+++ b/docs/2017-07/index.html
@@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329)
Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace
We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML:
"/>
-
+
@@ -275,7 +275,9 @@ delete from metadatavalue where resource_type_id=2 and metadata_field_id=235 and
-
-
diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html
index 15d410f2c..d6ce10dc0 100644
--- a/docs/2017-08/index.html
+++ b/docs/2017-08/index.html
@@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which
I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d
Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet
"/>
-
+
@@ -517,7 +517,9 @@ org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error
-
-
diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html
index 991f00896..99a7d5239 100644
--- a/docs/2017-09/index.html
+++ b/docs/2017-09/index.html
@@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two
Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group
"/>
-
+
@@ -659,7 +659,9 @@ Cert Status: good
-
-
diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html
index a679cefd1..40f4f16b3 100644
--- a/docs/2017-10/index.html
+++ b/docs/2017-10/index.html
@@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine
Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections
"/>
-
+
@@ -443,7 +443,9 @@ session_id=6C30F10B4351A4ED83EC6ED50AFD6B6A
-
-
diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html
index 8ebe56287..a8da52e97 100644
--- a/docs/2017-11/index.html
+++ b/docs/2017-11/index.html
@@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct:
dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701
"/>
-
+
@@ -944,7 +944,9 @@ $ cat dspace.log.2017-11-28 | grep -o -E 'session_id=[A-Z0-9]{32}' | sor
-
-
diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html
index 3972bc306..c80ea238b 100644
--- a/docs/2017-12/index.html
+++ b/docs/2017-12/index.html
@@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
"/>
-
+
@@ -783,7 +783,9 @@ DELETE 20
-
-
diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html
index 0b031c338..d9c8e3ed1 100644
--- a/docs/2018-01/index.html
+++ b/docs/2018-01/index.html
@@ -150,7 +150,7 @@ dspace.log.2018-01-02:34
Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains
"/>
-
+
@@ -1452,7 +1452,9 @@ Catalina:type=Manager,context=/,host=localhost activeSessions 8
-
-
diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html
index b36676b2c..6a4480104 100644
--- a/docs/2018-02/index.html
+++ b/docs/2018-02/index.html
@@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that
Yesterday I figured out how to monitor DSpace sessions using JMX
I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01
"/>
-
+
@@ -1038,7 +1038,9 @@ UPDATE 3
-
-
diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html
index fa40e9e73..0472578eb 100644
--- a/docs/2018-03/index.html
+++ b/docs/2018-03/index.html
@@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller
Export a CSV of the IITA community metadata for Martin Mueller
"/>
-
+
@@ -585,7 +585,9 @@ Fixed 5 occurences of: GENEBANKS
-
-
diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html
index 81d04afae..c326def61 100644
--- a/docs/2018-04/index.html
+++ b/docs/2018-04/index.html
@@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday:
I tried to test something on DSpace Test but noticed that it’s down since god knows when
Catalina logs at least show some memory errors yesterday:
"/>
-
+
@@ -594,7 +594,9 @@ $ pg_restore -O -U dspacetest -d dspacetest -W -h localhost /tmp/dspace_2018-04-
-
-
diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html
index f86a7a747..263b83d47 100644
--- a/docs/2018-05/index.html
+++ b/docs/2018-05/index.html
@@ -38,7 +38,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E
Then I reduced the JVM heap size from 6144 back to 5120m
Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use
"/>
-
+
@@ -523,7 +523,9 @@ $ psql -h localhost -U postgres dspacetest
-
-
diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html
index d3c88b45c..6baf30232 100644
--- a/docs/2018-07/index.html
+++ b/docs/2018-07/index.html
@@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r
There is insufficient memory for the Java Runtime Environment to continue.
"/>
-
+
@@ -569,7 +569,9 @@ dspace=# select count(text_value) from metadatavalue where resource_type_id=2 an
-
-
diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html
index 4e66272bf..0773cff26 100644
--- a/docs/2018-08/index.html
+++ b/docs/2018-08/index.html
@@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did
The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes
I ran all system updates on DSpace Test and rebooted it
"/>
-
+
@@ -442,7 +442,9 @@ $ dspace database migrate ignored
-
-
diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html
index 419c00e94..3334d4bd2 100644
--- a/docs/2018-09/index.html
+++ b/docs/2018-09/index.html
@@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru
Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month
I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again:
"/>
-
+
@@ -748,7 +748,9 @@ UPDATE metadatavalue SET text_value='ja' WHERE resource_type_id=2 AND me
-
-
diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html
index 25b70c6a5..92b636702 100644
--- a/docs/2018-10/index.html
+++ b/docs/2018-10/index.html
@@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai
Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items
I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now
"/>
-
+
@@ -656,7 +656,9 @@ $ curl -X GET -H "Content-Type: application/json" -H "Accept: applic
-
-
diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html
index 989efa63a..f69a1aaa5 100644
--- a/docs/2018-11/index.html
+++ b/docs/2018-11/index.html
@@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list
Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
"/>
-
+
@@ -553,7 +553,9 @@ $ dspace dsrun org.dspace.eperson.Groomer -a -b 11/27/2016 -d
-
-
diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html
index 22cd4eaaf..36b9b898e 100644
--- a/docs/2018-12/index.html
+++ b/docs/2018-12/index.html
@@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server
I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week
"/>
-
+
@@ -594,7 +594,9 @@ UPDATE 1
-
-
diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html
index fbd03b298..9e6b920ea 100644
--- a/docs/2019-01/index.html
+++ b/docs/2019-01/index.html
@@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t
357 207.46.13.1
903 54.70.40.11
"/>
-
+
@@ -1265,7 +1265,9 @@ identify: CorruptImageProfile `xmp' @ warning/profile.c/SetImageProfileInter
-
-
diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html
index d0ab2e03f..3ca929e57 100644
--- a/docs/2019-03/index.html
+++ b/docs/2019-03/index.html
@@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo
I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs
"/>
-
+
@@ -1208,7 +1208,9 @@ sys 0m2.551s
-
-
diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html
index 454f0da53..e55ee8af7 100644
--- a/docs/2019-05/index.html
+++ b/docs/2019-05/index.html
@@ -48,7 +48,7 @@ DELETE 1
But after this I tried to delete the item from the XMLUI and it is still present…
"/>
-
+
@@ -631,7 +631,9 @@ COPY 64871
-
-
diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html
index 847ae14ba..a80728f3f 100644
--- a/docs/2019-06/index.html
+++ b/docs/2019-06/index.html
@@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it
Skype with Marie-Angélique and Abenet about CG Core v2
"/>
-
+
@@ -317,7 +317,9 @@ UPDATE 2
-
-
diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html
index 774177a9f..1b9a4fd58 100644
--- a/docs/2019-07/index.html
+++ b/docs/2019-07/index.html
@@ -38,7 +38,7 @@ CGSpace
Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community
"/>
-
+
@@ -554,7 +554,9 @@ issn.validate('1020-3362')
-
-
diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html
index af3a75f74..714c8a099 100644
--- a/docs/2019-08/index.html
+++ b/docs/2019-08/index.html
@@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck
Run system updates on DSpace Test (linode19) and reboot it
"/>
-
+
@@ -573,7 +573,9 @@ sys 2m27.496s
-
-
diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html
index 6ebd03046..26bdb47bd 100644
--- a/docs/2019-09/index.html
+++ b/docs/2019-09/index.html
@@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
7249 2a01:7e00::f03c:91ff:fe18:7396
9124 45.5.186.2
"/>
-
+
@@ -581,7 +581,9 @@ $ csv-metadata-quality -i /tmp/clarisa-institutions.csv -o /tmp/clarisa-institut
-
-
diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html
index 8182b585e..ca20339f0 100644
--- a/docs/2019-12/index.html
+++ b/docs/2019-12/index.html
@@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the
# dpkg -C
# reboot
"/>
-
+
@@ -404,7 +404,9 @@ UPDATE 1
-
-
diff --git a/docs/2020-04/index.html b/docs/2020-04/index.html
index 001626022..f674ea245 100644
--- a/docs/2020-04/index.html
+++ b/docs/2020-04/index.html
@@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week
On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week
"/>
-
+
@@ -658,7 +658,9 @@ $ psql -c 'select * from pg_stat_activity' | wc -l
-
-
diff --git a/docs/2020-05/index.html b/docs/2020-05/index.html
index 0d5f7b83f..0b9e191c4 100644
--- a/docs/2020-05/index.html
+++ b/docs/2020-05/index.html
@@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2
"/>
-
+
@@ -477,7 +477,9 @@ Caused by: java.lang.NullPointerException
-
-
diff --git a/docs/2020-06/index.html b/docs/2020-06/index.html
index 73637822d..b4840c63d 100644
--- a/docs/2020-06/index.html
+++ b/docs/2020-06/index.html
@@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to
In other news, I checked the statistics API on DSpace 6 and it’s working
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
"/>
-
+
@@ -811,7 +811,9 @@ $ csvcut -c 'id,cg.subject.ilri[],cg.subject.ilri[en_US],dc.subject[en_US]
-
-
diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html
index f5fa96507..7e53ca75d 100644
--- a/docs/2020-07/index.html
+++ b/docs/2020-07/index.html
@@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone
Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request
"/>
-
+
@@ -1142,7 +1142,9 @@ Fixed 4 occurences of: Muloi, D.M.
-
-
diff --git a/docs/2020-08/index.html b/docs/2020-08/index.html
index 2709a3eaf..c0ab16a8f 100644
--- a/docs/2020-08/index.html
+++ b/docs/2020-08/index.html
@@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te
"/>
-
+
@@ -798,7 +798,9 @@ $ grep -c added /tmp/2020-08-27-countrycodetagger.log
-
-
diff --git a/docs/2020-09/index.html b/docs/2020-09/index.html
index a0c67568d..b8abf5e00 100644
--- a/docs/2020-09/index.html
+++ b/docs/2020-09/index.html
@@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39
I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40
"/>
-
+
@@ -717,7 +717,9 @@ solr_query_params = {
-
-
diff --git a/docs/2020-11/index.html b/docs/2020-11/index.html
index 816ea800b..9f5950620 100644
--- a/docs/2020-11/index.html
+++ b/docs/2020-11/index.html
@@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat
"/>
-
+
@@ -731,7 +731,9 @@ $ ./fix-metadata-values.py -i 2020-11-30-fix-hung-orcid.csv -db dspace63 -u dspa
-
-
diff --git a/docs/2021-01/index.html b/docs/2021-01/index.html
index c4d19d845..94093b61c 100644
--- a/docs/2021-01/index.html
+++ b/docs/2021-01/index.html
@@ -50,7 +50,7 @@ For example, this item has 51 views on CGSpace, but 0 on AReS
"/>
-
+
@@ -688,7 +688,9 @@ java.lang.IllegalArgumentException: Invalid character found in the request targe
-
-
diff --git a/docs/2021-03/index.html b/docs/2021-03/index.html
index b224b745f..4638a70de 100644
--- a/docs/2021-03/index.html
+++ b/docs/2021-03/index.html
@@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
"/>
-
+
@@ -875,7 +875,9 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
-
-
diff --git a/docs/2021-04/index.html b/docs/2021-04/index.html
index 224510539..507a3533b 100644
--- a/docs/2021-04/index.html
+++ b/docs/2021-04/index.html
@@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in
"/>
-
+
@@ -1042,7 +1042,9 @@ $ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisti
-
-
diff --git a/docs/2021-05/index.html b/docs/2021-05/index.html
index 946671fec..3b9f5a3d9 100644
--- a/docs/2021-05/index.html
+++ b/docs/2021-05/index.html
@@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an
I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user…
"/>
-
+
@@ -685,7 +685,9 @@ May 26, 02:57 UTC
-
-
diff --git a/docs/2021-06/index.html b/docs/2021-06/index.html
index 95f3c627c..c715f9689 100644
--- a/docs/2021-06/index.html
+++ b/docs/2021-06/index.html
@@ -36,7 +36,7 @@ I simply started it and AReS was running again:
"/>
-
+
@@ -693,7 +693,9 @@ I simply started it and AReS was running again:
-
-
diff --git a/docs/2021-07/index.html b/docs/2021-07/index.html
index 083459e67..4bce5e6da 100644
--- a/docs/2021-07/index.html
+++ b/docs/2021-07/index.html
@@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO
localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER;
COPY 20994
"/>
-
+
@@ -715,7 +715,9 @@ COPY 20994
-
-
diff --git a/docs/2021-08/index.html b/docs/2021-08/index.html
index 4d271deb5..093405e3c 100644
--- a/docs/2021-08/index.html
+++ b/docs/2021-08/index.html
@@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server:
I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
"/>
-
+
@@ -606,7 +606,9 @@ I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
-
-
diff --git a/docs/2021-09/index.html b/docs/2021-09/index.html
index 35e921002..e6a458694 100644
--- a/docs/2021-09/index.html
+++ b/docs/2021-09/index.html
@@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu
"/>
-
+
@@ -588,7 +588,9 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu
-
-
diff --git a/docs/2021-10/index.html b/docs/2021-10/index.html
index 7b98934c9..485ac9dfd 100644
--- a/docs/2021-10/index.html
+++ b/docs/2021-10/index.html
@@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt
So we have 1879/7100 (26.46%) matching already
"/>
-
+
@@ -791,7 +791,9 @@ Try doing it in two imports. In first import, remove all authors. In second impo
-
-
diff --git a/docs/2021-12/index.html b/docs/2021-12/index.html
index 41bdbb06f..97b7ea871 100644
--- a/docs/2021-12/index.html
+++ b/docs/2021-12/index.html
@@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics
Total number of bot hits purged: 3679
"/>
-
+
@@ -577,7 +577,9 @@ Total number of bot hits purged: 3679
-
-
diff --git a/docs/2022-01/index.html b/docs/2022-01/index.html
index 4e3231301..c2b4eb0d5 100644
--- a/docs/2022-01/index.html
+++ b/docs/2022-01/index.html
@@ -24,7 +24,7 @@ Start a full harvest on AReS
Start a full harvest on AReS
"/>
-
+
@@ -380,7 +380,9 @@ Start a full harvest on AReS
-
-
diff --git a/docs/2022-02/index.html b/docs/2022-02/index.html
index 3b54c83b0..21c16db8e 100644
--- a/docs/2022-02/index.html
+++ b/docs/2022-02/index.html
@@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR
"/>
-
+
@@ -724,7 +724,9 @@ isNotNull(value.match('699'))
-
-
diff --git a/docs/2022-05/index.html b/docs/2022-05/index.html
index c048a0525..69920ba50 100644
--- a/docs/2022-05/index.html
+++ b/docs/2022-05/index.html
@@ -66,7 +66,7 @@ If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a ha
I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script
"/>
-
+
@@ -445,7 +445,9 @@ I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script
-
-
diff --git a/docs/2022-06/index.html b/docs/2022-06/index.html
index 8b38a296f..85fa733f8 100644
--- a/docs/2022-06/index.html
+++ b/docs/2022-06/index.html
@@ -48,7 +48,7 @@ There seem to be many more of these:
"/>
-
+
@@ -458,7 +458,9 @@ There seem to be many more of these:
-
-
diff --git a/docs/2022-07/index.html b/docs/2022-07/index.html
index d11f586b9..99e418bf5 100644
--- a/docs/2022-07/index.html
+++ b/docs/2022-07/index.html
@@ -34,7 +34,7 @@ Also, the trgm functions I’ve used before are case insensitive, but Levens
"/>
-
+
@@ -736,7 +736,9 @@ Also, the trgm functions I’ve used before are case insensitive, but Levens
-
-
diff --git a/docs/2022-08/index.html b/docs/2022-08/index.html
index 3b1d54f24..ee516a06d 100644
--- a/docs/2022-08/index.html
+++ b/docs/2022-08/index.html
@@ -24,7 +24,7 @@ Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago
Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago
"/>
-
+
@@ -522,7 +522,9 @@ Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago
-
-
diff --git a/docs/2022-09/index.html b/docs/2022-09/index.html
index acc0e3f88..7292688e8 100644
--- a/docs/2022-09/index.html
+++ b/docs/2022-09/index.html
@@ -46,7 +46,7 @@ I also fixed a few bugs and improved the region-matching logic
"/>
-
+
@@ -783,7 +783,9 @@ harvesting of meat from wildlife and not from livestock.
-
diff --git a/docs/2022-10/index.html b/docs/2022-10/index.html
index d7d0496eb..337a74ea0 100644
--- a/docs/2022-10/index.html
+++ b/docs/2022-10/index.html
@@ -36,7 +36,7 @@ I filed an issue to ask about Java 11+ support
"/>
-
+
@@ -978,7 +978,9 @@ I filed an issue to ask about Java 11+ support
-
-
diff --git a/docs/2022-11/index.html b/docs/2022-11/index.html
index d27630afd..16794ecfa 100644
--- a/docs/2022-11/index.html
+++ b/docs/2022-11/index.html
@@ -44,7 +44,7 @@ I want to make sure they use groups instead of individuals where possible!
I reverted the Cocoon autosave change because it was more of a nuisance that Peter can’t upload CSVs from the web interface and is a very low severity security issue
"/>
-
+
@@ -757,7 +757,9 @@ I reverted the Cocoon autosave change because it was more of a nuissance that Pe
-
-
diff --git a/docs/2022-12/index.html b/docs/2022-12/index.html
index 9c8556e31..6e4971304 100644
--- a/docs/2022-12/index.html
+++ b/docs/2022-12/index.html
@@ -36,7 +36,7 @@ I exported the CCAFS and IITA communities, extracted just the country and region
Add a few more authors to my CSV with author names and ORCID identifiers and tag 283 items!
Replace “East Asia” with “Eastern Asia” region on CGSpace (UN M.49 region)
"/>
-
+
@@ -577,7 +577,9 @@ Replace “East Asia” with “Eastern Asia” region on CGSpac
-
-
diff --git a/docs/2023-01/index.html b/docs/2023-01/index.html
index 0320b3e67..459db0d92 100644
--- a/docs/2023-01/index.html
+++ b/docs/2023-01/index.html
@@ -34,7 +34,7 @@ I see we have some new ones that aren’t in our list if I combine with this
"/>
-
+
@@ -827,7 +827,9 @@ I see we have some new ones that aren’t in our list if I combine with this
-
-
diff --git a/docs/2023-02/index.html b/docs/2023-02/index.html
index 507b246a9..0919e7992 100644
--- a/docs/2023-02/index.html
+++ b/docs/2023-02/index.html
@@ -32,7 +32,7 @@ I want to try to expand my use of their data to journals, publishers, volumes, i
"/>
-
+
@@ -647,7 +647,9 @@ I want to try to expand my use of their data to journals, publishers, volumes, i
-
-
diff --git a/docs/2023-03/index.html b/docs/2023-03/index.html
index 338fef960..0ede20f24 100644
--- a/docs/2023-03/index.html
+++ b/docs/2023-03/index.html
@@ -28,7 +28,7 @@ Remove cg.subject.wle and cg.identifier.wletheme from CGSpace input form after c
iso-codes 4.13.0 was released, which incorporates my changes to the common names for Iran, Laos, and Syria
I finally got through with porting the input form from DSpace 6 to DSpace 7
"/>
-
+
@@ -859,7 +859,9 @@ RL: performed 0 reads and 16 write i/o operations
-
-
diff --git a/docs/2023-04/index.html b/docs/2023-04/index.html
index 399c2b155..754e331f1 100644
--- a/docs/2023-04/index.html
+++ b/docs/2023-04/index.html
@@ -36,7 +36,7 @@ I also did a check for missing country/region mappings with csv-metadata-quality
Start a harvest on AReS
"/>
-
+
@@ -751,7 +751,9 @@ Start a harvest on AReS
-
-
diff --git a/docs/2023-05/index.html b/docs/2023-05/index.html
index 2226d7e31..9b6efb502 100644
--- a/docs/2023-05/index.html
+++ b/docs/2023-05/index.html
@@ -46,7 +46,7 @@ Also I found at least two spelling mistakes, for example “decison support
Work on cleaning, proofing, and uploading twenty-seven records for IFPRI to CGSpace
"/>
-
+
@@ -374,7 +374,9 @@ Work on cleaning, proofing, and uploading twenty-seven records for IFPRI to CGSp
-
-
diff --git a/docs/2023-06/index.html b/docs/2023-06/index.html
index 52d81dae8..148f0e6ca 100644
--- a/docs/2023-06/index.html
+++ b/docs/2023-06/index.html
@@ -44,7 +44,7 @@ From what I can see we need to upgrade the MODS schema from 3.1 to 3.7 and then
"/>
-
+
@@ -446,7 +446,9 @@ From what I can see we need to upgrade the MODS schema from 3.1 to 3.7 and then
-
-
diff --git a/docs/2023-08/index.html b/docs/2023-08/index.html
index 88d2fa523..1678d646b 100644
--- a/docs/2023-08/index.html
+++ b/docs/2023-08/index.html
@@ -34,7 +34,7 @@ I did some minor cleanups myself and applied them to CGSpace
Start working on some batch uploads for IFPRI
"/>
-
+
@@ -460,7 +460,9 @@ UPDATE 1
-
-
diff --git a/docs/2023-10/index.html b/docs/2023-10/index.html
index 333cc4ec7..186907ecf 100644
--- a/docs/2023-10/index.html
+++ b/docs/2023-10/index.html
@@ -36,7 +36,7 @@ We can be on the safe side by using only abstracts for items that are licensed u
"/>
-
+
@@ -345,7 +345,9 @@ We can be on the safe side by using only abstracts for items that are licensed u
-
-
diff --git a/docs/2023-11/index.html b/docs/2023-11/index.html
index e4799a47d..dbbe537f2 100644
--- a/docs/2023-11/index.html
+++ b/docs/2023-11/index.html
@@ -42,7 +42,7 @@ I improved the filtering and wrote some Python using pandas to merge my sources
Export CGSpace to check missing Initiative collection mappings
Start a harvest on AReS
"/>
-
+
@@ -389,7 +389,9 @@ tomcat9[732]: [9955.666s][info ][gc] GC(6292) To-space exhausted
Work on preparation of new server for DSpace 7 migration
I’m not quite sure what we need to do for the Handle server
For now I just ran the dspace make-handle-config script and diffed it with the one from DSpace 6
I sent the bundle to the Handle admins to make sure it’s OK before we do the migration
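Roughly, that comparison might look like the following, where the paths are my assumptions rather than from the original notes:
$ /dspace/bin/dspace make-handle-config /dspace/handle-server
$ diff -u /root/handle-server-dspace6/config.dct /dspace/handle-server/config.dct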
Help Peter proof and upload 252 items from the 2023 Gender conference to CGSpace
Meeting with IFPRI to discuss their migration to CGSpace
We agreed to add two new fields, one for IFPRI project and one for IFPRI publication ranking
Most likely we will use cg.identifier.project as a general field and consolidate other project fields there
Not sure which field to use for the publication rank…
2024-01-05
Proof and upload 51 items in bulk for IFPRI
I did a big cleanup of user groups in anticipation of complaints about slow workflow tasks etc in DSpace 7
I removed ILRI editors from all the dozens of CCAFS community and collection groups, and I should do the same for other CRPs since they are closed for two years now
2024-01-06
Migrate CGSpace to DSpace 7
2024-01-07
High load on the server and UptimeRobot saying the frontend is flapping
I noticed tons of logs from pm2 in the systemd journal, so I disabled those in the systemd unit because they are available from pm2’s log directory anyway
I also noticed the same for Solr, so I disabled stdout for that systemd unit as well
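For reference, a minimal sketch of dropping a unit’s stdout from the journal with a systemd drop-in override — the unit name dspace-angular here is an assumption:
# mkdir -p /etc/systemd/system/dspace-angular.service.d
# printf '[Service]\nStandardOutput=null\n' > /etc/systemd/system/dspace-angular.service.d/override.conf
# systemctl daemon-reload
# systemctl restart dspace-angular.service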
I spent a lot of time bringing back the nginx rate limits we used in DSpace 6 and it seems to have helped
I see some client doing weird HEAD requests to search pages:
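The request sample itself didn’t survive here, but counting such clients would presumably be something like this, assuming the default nginx access log path:
# grep '"HEAD /search' /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -h | tail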
localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS "dcterms.publisher", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 178 GROUP BY "dcterms.publisher" ORDER BY count DESC) to /tmp/2024-01-publishers.csv WITH CSV HEADER;
COPY 4332
Address some feedback on DSpace 7 from users, including filing some issues on GitHub
The Alliance TIP team was having issues posting to one collection via the legacy DSpace 6 REST API
In the DSpace logs I see the same issue that they had last month:
ERROR unknown unknown org.dspace.rest.Resource @ Something get wrong. Aborting context in finally statement.
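For context, posting an item via the legacy REST API looks roughly like this — the host, credentials, collection UUID, and item.json are placeholders, not the TIP team’s actual request:
$ curl -s -c /tmp/cookies.txt -X POST -d "email=user@example.org&password=..." "https://HOST/rest/login"
$ curl -s -b /tmp/cookies.txt -X POST -H "Content-Type: application/json" -d @item.json "https://HOST/rest/collections/COLLECTION-UUID/items"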
2024-01-09
I restarted Tomcat to see if it helps the REST issue
After talking with Peter about publishers we decided to get a clean list of the top ~100 publishers and then make sure all CGIAR centers, Initiatives, and Impact Platforms are there as well

dspace=# BEGIN;
BEGIN
dspace=*# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=187 AND text_value ~ '[[:upper:]]';
UPDATE 180
dspace=*# COMMIT;
COMMIT
2024-02-06
Discuss IWMI using the CGSpace REST API for their new website
Export the IWMI community to extract their ORCID identifiers:
Minor work on OpenRXV to fix a bug in the ng-select drop downs
Minor work on the DSpace 7 nginx configuration to allow requesting robots.txt and sitemaps without hitting rate limits
2024-02-21
Minor updates on OpenRXV, including one bug fix for missing mapped collections
Salem had to re-work the harvester for DSpace 7 since the mapped collections and parent collection list are separate!
2024-02-22
Discuss tagging of datasets and re-work the submission form to encourage use of DOI field for any item that has a DOI, and the normal URL field if not
The “cg.identifier.dataurl” field will be used for “related” datasets
I still have to check and move some metadata for existing datasets
2024-02-23
This morning Tomcat died due to an OOM kill from the kernel:
kernel: Out of memory: Killed process 698 (java) total-vm:14151300kB, anon-rss:9665812kB, file-rss:320kB, shmem-rss:0kB, UID:997 pgtables:20436kB oom_score_adj:0
I don’t see any abnormal pattern in my Grafana graphs, for JVM or system load… very weird
I updated the submission form on CGSpace to include the new changes to URLs for datasets
I also updated about 80 datasets to move the URLs to the correct field
2024-02-25
This morning Tomcat died while I was doing a CSV export, with an OOM kill from the kernel:
kernel: Out of memory: Killed process 720768 (java) total-vm:14079976kB, anon-rss:9301684kB, file-rss:152kB, shmem-rss:0kB, UID:997 pgtables:19488kB oom_score_adj:0
I don’t know why this is happening so often recently…
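A quick way to confirm such kills from the kernel ring buffer, for what it’s worth:
# journalctl -k --since today | grep -i 'out of memory'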
2024-02-27
IFPRI sent me a list of authors to add to our list for now, until we can find a better way of doing it
I extracted the existing authors from our controlled vocabulary and combined them with IFPRI’s:
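The extraction might look something like this, assuming the authors are stored as label attributes in a controlled vocabulary XML — the file and list names are my assumptions:
$ grep -oE 'label="[^"]+"' dspace/config/controlled-vocabularies/dc-contributor-author.xml | sed -e 's/^label="//' -e 's/"$//' > /tmp/our-authors.txt
$ cat /tmp/our-authors.txt /tmp/ifpri-authors.txt | sort -u > /tmp/2024-combined-authors.txt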