diff --git a/content/posts/2020-07.md b/content/posts/2020-07.md
index 1388f0bc6..e3d9fd2f6 100644
--- a/content/posts/2020-07.md
+++ b/content/posts/2020-07.md
@@ -857,4 +857,46 @@
 Fixed 13 occurences of: Muloi, D.
 Fixed 4 occurences of: Muloi, D.M.
 ```
+## 2020-07-28
+
+- I started analyzing the situation with the cases I've seen where a Solr record fails to be migrated:
+  - `id: 0-unmigrated` are mostly (all?) `type: 5` aka site view
+  - `id: -1-unmigrated` are mostly (all?) `type: 5` aka site view
+  - `id: -1` are mostly (all?) `type: 5` aka site view
+  - `id: 59184-unmigrated` where "59184" is the id of an item or bitstream that no longer exists
+- Why doesn't Atmire's code ignore any id with "-unmigrated"?
+- I sent feedback to Atmire since they had responded to my previous question yesterday
+  - They said that the DSpace 6 version of CUA does not work with Tomcat 8.5...
+- I spent a few hours trying to write a [Jython-based curation task](https://wiki.lyrasis.org/display/DSDOC5x/Curation+tasks+in+Jython) to update ISO 3166-1 Alpha2 country codes based on each item's ISO 3166-1 country
+  - Peter doesn't want to use the ISO 3166-1 list because he objects to a few names, so I thought we might be able to use country codes or numeric codes and update the names with a curation task
+  - The work is very rough but kinda works: [mytask.py](https://gist.github.com/alanorth/6a31af592b3467f7b63ac8aea7c75d52)
+  - What is nice is that the `dso.update()` method updates the data the "DSpace way" so we don't need to re-index Solr
+  - I had a clever idea to "vendor" the pycountry code using `pip install pycountry -t`, but pycountry dropped support for Python 2 in 2019 so we can only use an outdated version
+  - In the end Jython is really limiting for this particular task because we are stuck with Python 2, we can't use virtual environments, and we'd need to write a lot of code to handle the ISO 3166 country lists
+  - Python 2 is no longer supported by the Python community anyways, so it's probably better to figure out how to do this in Java
+
+## 2020-07-29
+
+- The Atmire stats tool (`com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI`) created 150GB of log files due to errors and the disk got full on DSpace Test (linode26)
+  - This morning I had noticed that the run I started last night said that 54,000,000 (54 million!) records failed to process, but the core only had 6 million or so documents to process...!
+  - I removed the large log files and optimized the Solr core
+
+## 2020-07-30
+
+- Looking into ISO 3166-1 from the iso-codes package
+  - I see that all current 249 countries have names, 173 have official names, and 6 have common names:
+
+```
+# grep -c numeric /usr/share/iso-codes/json/iso_3166-1.json
+249
+# grep -c -E '"name":' /usr/share/iso-codes/json/iso_3166-1.json
+249
+# grep -c -E '"official_name":' /usr/share/iso-codes/json/iso_3166-1.json
+173
+# grep -c -E '"common_name":' /usr/share/iso-codes/json/iso_3166-1.json
+6
+```
+
+- Wow, the `CC-BY-NC-ND-3.0-IGO` license that I had [requested in 2019-02](https://github.com/spdx/license-list-XML/issues/767) was finally merged into SPDX…
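The counts above suggest a simple strategy for the planned curation task: index every name variant (`name`, `official_name`, `common_name`) from the iso-codes JSON against its alpha-2 code. A rough Python sketch, assuming the iso-codes layout of a top-level `"3166-1"` array of entries — the inline sample below is illustrative, not the full file:

```python
import json

def build_country_lookup(iso_json):
    """Map every known name variant (name, official_name, common_name)
    to its ISO 3166-1 alpha-2 code, case-insensitively."""
    lookup = {}
    for entry in iso_json["3166-1"]:
        for key in ("name", "official_name", "common_name"):
            if key in entry:
                lookup[entry[key].lower()] = entry["alpha_2"]
    return lookup

# In practice one would json.load("/usr/share/iso-codes/json/iso_3166-1.json");
# here a tiny sample mirroring its structure:
sample = json.loads('''
{"3166-1": [
  {"alpha_2": "BO", "name": "Bolivia, Plurinational State of",
   "official_name": "Plurinational State of Bolivia",
   "common_name": "Bolivia", "numeric": "068"},
  {"alpha_2": "KE", "name": "Kenya", "numeric": "404"}
]}
''')

lookup = build_country_lookup(sample)
print(lookup["bolivia"])  # BO
print(lookup["kenya"])    # KE
```

With all three name keys indexed, an item tagged with either the strict ISO name or the shorter common name resolves to the same code.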
+
diff --git a/content/posts/2020-08.md b/content/posts/2020-08.md
new file mode 100644
index 000000000..d091e88a1
--- /dev/null
+++ b/content/posts/2020-08.md
@@ -0,0 +1,21 @@
+---
+title: "August, 2020"
+date: 2020-07-02T15:35:54+03:00
+author: "Alan Orth"
+categories: ["Notes"]
+---
+
+## 2020-08-02
+
+- I spent a few days working on a Java-based curation task to tag items with ISO 3166-1 Alpha2 country codes based on their `cg.coverage.country` text values
+  - It looks up the names in ISO 3166-1 first, and then in our CGSpace countries mapping (which has five or so of Peter's preferred "display" country names)
+  - It implements a "force" mode too that will clear existing country codes and re-tag everything
+  - It is class based so I can easily add support for other vocabularies, and the technique could even be used for organizations with mappings to ROR and Clarisa...
+
+
+
+- The code is currently on my personal GitHub: https://github.com/alanorth/dspace-curation-tasks
+  - I still need to figure out how to integrate this with the DSpace build because currently you have to package it and copy the JAR to the `dspace/lib` directory (not to mention the config)
+
+
+
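The Java task's lookup order (ISO 3166-1 first, then the CGSpace mapping) boils down to a simple fallback chain; a rough Python sketch, with a hypothetical two-entry mapping standing in for the real "display" names:

```python
def resolve_country_code(name, iso_lookup, cgspace_mapping):
    """Return the ISO 3166-1 alpha-2 code for a country name, trying the
    ISO name lookup first and falling back to the CGSpace display-name
    mapping; None means the name could not be resolved."""
    key = name.strip().lower()
    if key in iso_lookup:
        return iso_lookup[key]
    return cgspace_mapping.get(key)

# Minimal stand-ins: iso_lookup would be built from iso_3166-1.json,
# and cgspace_mapping holds hypothetical preferred display names.
iso_lookup = {"kenya": "KE"}
cgspace_mapping = {"cote d'ivoire": "CI", "tanzania": "TZ"}

print(resolve_country_code("Kenya", iso_lookup, cgspace_mapping))     # KE
print(resolve_country_code("Tanzania", iso_lookup, cgspace_mapping))  # TZ
print(resolve_country_code("Atlantis", iso_lookup, cgspace_mapping))  # None
```

Unresolvable names fall through as `None`, which a "force" mode could then treat as "clear the existing code and skip re-tagging".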
diff --git a/docs/2015-12/index.html b/docs/2015-12/index.html index c868203ed..d18b312ef 100644 --- a/docs/2015-12/index.html +++ b/docs/2015-12/index.html @@ -33,7 +33,7 @@ Replace lzop with xz in log compression cron jobs on DSpace Test—it uses less -rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo -rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz "/> - + @@ -261,6 +261,8 @@ $ curl -o /dev/null -s -w %{time_total}\\n https://cgspace.cgiar.org/rest/handle
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -269,8 +271,6 @@ $ curl -o /dev/null -s -w %{time_total}\\n https://cgspace.cgiar.org/rest/handle
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-01/index.html b/docs/2016-01/index.html index 212831ab1..b2699eaf2 100644 --- a/docs/2016-01/index.html +++ b/docs/2016-01/index.html @@ -25,7 +25,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_ I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated. Update GitHub wiki for documentation of maintenance tasks. "/> - + @@ -197,6 +197,8 @@ $ find SimpleArchiveForBio/ -iname “*.pdf” -exec basename {} ; | sor
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -205,8 +207,6 @@ $ find SimpleArchiveForBio/ -iname “*.pdf” -exec basename {} ; | sor
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html index bd7e22c16..fb4860263 100644 --- a/docs/2016-02/index.html +++ b/docs/2016-02/index.html @@ -35,7 +35,7 @@ I noticed we have a very interesting list of countries on CGSpace: Not only are there 49,000 countries, we have some blanks (25)… Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE” "/> - + @@ -375,6 +375,8 @@ Bitstream: tést señora alimentación.pdf
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -383,8 +385,6 @@ Bitstream: tést señora alimentación.pdf
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html index dfcd5dc01..0175b5a0e 100644 --- a/docs/2016-03/index.html +++ b/docs/2016-03/index.html @@ -25,7 +25,7 @@ Looking at issues with author authorities on CGSpace For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server "/> - + @@ -313,6 +313,8 @@ Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Ja
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -321,8 +323,6 @@ Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Ja
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html index bab077700..b8138d130 100644 --- a/docs/2016-04/index.html +++ b/docs/2016-04/index.html @@ -29,7 +29,7 @@ After running DSpace for over five years I’ve never needed to look in any This will save us a few gigs of backup space we’re paying for on S3 Also, I noticed the checker log has some errors we should pay attention to: "/> - + @@ -492,6 +492,8 @@ dspace.log.2016-04-27:7271
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -500,8 +502,6 @@ dspace.log.2016-04-27:7271
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html index 63759c61b..bd26b7903 100644 --- a/docs/2016-05/index.html +++ b/docs/2016-05/index.html @@ -31,7 +31,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period! # awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l 3168 "/> - + @@ -368,6 +368,8 @@ sys 0m20.540s
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -376,8 +378,6 @@ sys 0m20.540s
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html index e7aa02f86..7149c7135 100644 --- a/docs/2016-06/index.html +++ b/docs/2016-06/index.html @@ -31,7 +31,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship "/> - + @@ -406,6 +406,8 @@ $ ./delete-metadata-values.py -f dc.contributor.corporate -i Corporate-Authors-D
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -414,8 +416,6 @@ $ ./delete-metadata-values.py -f dc.contributor.corporate -i Corporate-Authors-D
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html index bdd390d2b..9b5418253 100644 --- a/docs/2016-07/index.html +++ b/docs/2016-07/index.html @@ -41,7 +41,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and In this case the select query was showing 95 results before the update "/> - + @@ -322,6 +322,8 @@ discovery.index.authority.ignore-variants=true
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -330,8 +332,6 @@ discovery.index.authority.ignore-variants=true
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html index b018b7ce9..91ba6d934 100644 --- a/docs/2016-08/index.html +++ b/docs/2016-08/index.html @@ -39,7 +39,7 @@ $ git checkout -b 55new 5_x-prod $ git reset --hard ilri/5_x-prod $ git rebase -i dspace-5.5 "/> - + @@ -386,6 +386,8 @@ $ JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx512m" /home/cgspace.cgiar.org/b
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -394,8 +396,6 @@ $ JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx512m" /home/cgspace.cgiar.org/b
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html index cdfef1438..071c1d94f 100644 --- a/docs/2016-09/index.html +++ b/docs/2016-09/index.html @@ -31,7 +31,7 @@ It looks like we might be able to use OUs now, instead of DCs: $ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)" "/> - + @@ -603,6 +603,8 @@ $ ./delete-metadata-values.py -i ilrisubjects-delete-13.csv -f cg.subject.ilri -
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -611,8 +613,6 @@ $ ./delete-metadata-values.py -i ilrisubjects-delete-13.csv -f cg.subject.ilri -
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html index c0d572fe8..b551b3eb0 100644 --- a/docs/2016-10/index.html +++ b/docs/2016-10/index.html @@ -39,7 +39,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id 0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X "/> - + @@ -369,6 +369,8 @@ dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'http:
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -377,8 +379,6 @@ dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'http:
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-11/index.html b/docs/2016-11/index.html index 3584a611b..ee5107b78 100644 --- a/docs/2016-11/index.html +++ b/docs/2016-11/index.html @@ -23,7 +23,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module Add dc.type to the output options for Atmire’s Listings and Reports module (#286) "/> - + @@ -545,6 +545,8 @@ org.dspace.discovery.SearchServiceException: Error executing query
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -553,8 +555,6 @@ org.dspace.discovery.SearchServiceException: Error executing query
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html index 17d265736..aad5c9a4d 100644 --- a/docs/2016-12/index.html +++ b/docs/2016-12/index.html @@ -43,7 +43,7 @@ I see thousands of them in the logs for the last few months, so it’s not r I’ve raised a ticket with Atmire to ask Another worrying error from dspace.log is: "/> - + @@ -781,6 +781,8 @@ $ exit
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -789,8 +791,6 @@ $ exit
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html index 78f5025a4..fdfa47f96 100644 --- a/docs/2017-01/index.html +++ b/docs/2017-01/index.html @@ -25,7 +25,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s I tested on DSpace Test as well and it doesn’t work there either I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years "/> - + @@ -366,6 +366,8 @@ $ gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -374,8 +376,6 @@ $ gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html index 6a47a31c6..2ae4c4e8c 100644 --- a/docs/2017-02/index.html +++ b/docs/2017-02/index.html @@ -47,7 +47,7 @@ DELETE 1 Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301) Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name "/> - + @@ -421,6 +421,8 @@ COPY 1968
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -429,8 +431,6 @@ COPY 1968
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html index e810e87bb..92f309d8c 100644 --- a/docs/2017-03/index.html +++ b/docs/2017-03/index.html @@ -51,7 +51,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg $ identify ~/Desktop/alc_contrastes_desafios.jpg /Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000 "/> - + @@ -352,6 +352,8 @@ $ ./delete-metadata-values.py -i Investors-Delete-121.csv -f dc.description.spon
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -360,8 +362,6 @@ $ ./delete-metadata-values.py -i Investors-Delete-121.csv -f dc.description.spon
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html index 37b59ff4e..dd457e0f4 100644 --- a/docs/2017-04/index.html +++ b/docs/2017-04/index.html @@ -37,7 +37,7 @@ Testing the CMYK patch on a collection with 650 items: $ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt "/> - + @@ -582,6 +582,8 @@ $ gem install compass -v 1.0.3
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -590,8 +592,6 @@ $ gem install compass -v 1.0.3
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html index 9f9358abb..d2f9f7187 100644 --- a/docs/2017-05/index.html +++ b/docs/2017-05/index.html @@ -15,7 +15,7 @@ - + @@ -388,6 +388,8 @@ UPDATE 187
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -396,8 +398,6 @@ UPDATE 187
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html index 1a2de44f4..8417dc412 100644 --- a/docs/2017-06/index.html +++ b/docs/2017-06/index.html @@ -15,7 +15,7 @@ - + @@ -267,6 +267,8 @@ $ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" [dspace]/bin/dspace impo
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -275,8 +277,6 @@ $ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" [dspace]/bin/dspace impo
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html index ab8f4349c..f4ba6b9b1 100644 --- a/docs/2017-07/index.html +++ b/docs/2017-07/index.html @@ -33,7 +33,7 @@ Merge changes for WLE Phase II theme rename (#329) Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML: "/> - + @@ -272,6 +272,8 @@ delete from metadatavalue where resource_type_id=2 and metadata_field_id=235 and
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -280,8 +282,6 @@ delete from metadatavalue where resource_type_id=2 and metadata_field_id=235 and
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html index d44acf4e9..b2a4bead7 100644 --- a/docs/2017-08/index.html +++ b/docs/2017-08/index.html @@ -57,7 +57,7 @@ This was due to newline characters in the dc.description.abstract column, which I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet "/> - + @@ -514,6 +514,8 @@ org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -522,8 +524,6 @@ org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html index f8730e3e9..060c633b0 100644 --- a/docs/2017-09/index.html +++ b/docs/2017-09/index.html @@ -29,7 +29,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group "/> - + @@ -656,6 +656,8 @@ Cert Status: good
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -664,8 +666,6 @@ Cert Status: good
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html index fabc06053..477a3a528 100644 --- a/docs/2017-10/index.html +++ b/docs/2017-10/index.html @@ -31,7 +31,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336 There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections "/> - + @@ -440,6 +440,8 @@ session_id=6C30F10B4351A4ED83EC6ED50AFD6B6A
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -448,8 +450,6 @@ session_id=6C30F10B4351A4ED83EC6ED50AFD6B6A
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html index 1b27e04c9..89869b287 100644 --- a/docs/2017-11/index.html +++ b/docs/2017-11/index.html @@ -45,7 +45,7 @@ Generate list of authors on CGSpace for Peter to go through and correct: dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv; COPY 54701 "/> - + @@ -941,6 +941,8 @@ $ cat dspace.log.2017-11-28 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | u
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -949,8 +951,6 @@ $ cat dspace.log.2017-11-28 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | u
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html index f6442a801..4af5eb6ec 100644 --- a/docs/2017-12/index.html +++ b/docs/2017-12/index.html @@ -27,7 +27,7 @@ The logs say “Timeout waiting for idle object” PostgreSQL activity says there are 115 connections currently The list of connections to XMLUI and REST API for today: "/> - + @@ -780,6 +780,8 @@ DELETE 20
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -788,8 +790,6 @@ DELETE 20
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html index 1e7666109..638915559 100644 --- a/docs/2018-01/index.html +++ b/docs/2018-01/index.html @@ -147,7 +147,7 @@ dspace.log.2018-01-02:34 Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains "/> - + @@ -1449,6 +1449,8 @@ Catalina:type=Manager,context=/,host=localhost activeSessions 8
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -1457,8 +1459,6 @@ Catalina:type=Manager,context=/,host=localhost activeSessions 8
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html index e5995248e..f024445ba 100644 --- a/docs/2018-02/index.html +++ b/docs/2018-02/index.html @@ -27,7 +27,7 @@ We don’t need to distinguish between internal and external works, so that Yesterday I figured out how to monitor DSpace sessions using JMX I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01 "/> - + @@ -1036,6 +1036,8 @@ UPDATE 3
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -1044,8 +1046,6 @@ UPDATE 3
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html index 7585a2d56..367b83a31 100644 --- a/docs/2018-03/index.html +++ b/docs/2018-03/index.html @@ -21,7 +21,7 @@ Export a CSV of the IITA community metadata for Martin Mueller Export a CSV of the IITA community metadata for Martin Mueller "/> - + @@ -582,6 +582,8 @@ Fixed 5 occurences of: GENEBANKS
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -590,8 +592,6 @@ Fixed 5 occurences of: GENEBANKS
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html index 50ec083f4..7ae7a05a8 100644 --- a/docs/2018-04/index.html +++ b/docs/2018-04/index.html @@ -23,7 +23,7 @@ Catalina logs at least show some memory errors yesterday: I tried to test something on DSpace Test but noticed that it’s down since god knows when Catalina logs at least show some memory errors yesterday: "/> - + @@ -591,6 +591,8 @@ $ pg_restore -O -U dspacetest -d dspacetest -W -h localhost /tmp/dspace_2018-04-
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -599,8 +601,6 @@ $ pg_restore -O -U dspacetest -d dspacetest -W -h localhost /tmp/dspace_2018-04-
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html index 7740196a4..92070fa26 100644 --- a/docs/2018-05/index.html +++ b/docs/2018-05/index.html @@ -35,7 +35,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E Then I reduced the JVM heap size from 6144 back to 5120m Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use "/> - + @@ -520,6 +520,8 @@ $ psql -h localhost -U postgres dspacetest
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -528,8 +530,6 @@ $ psql -h localhost -U postgres dspacetest
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html index 99af9aa76..a094188c1 100644 --- a/docs/2018-06/index.html +++ b/docs/2018-06/index.html @@ -55,7 +55,7 @@ real 74m42.646s user 8m5.056s sys 2m7.289s "/> - + @@ -514,6 +514,8 @@ $ sed '/^id/d' 10568-*.csv | csvcut -c 1,2 > map-to-cifor-archive.csv
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -522,8 +524,6 @@ $ sed '/^id/d' 10568-*.csv | csvcut -c 1,2 > map-to-cifor-archive.csv
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html index c0ef5955c..289aa7d3a 100644 --- a/docs/2018-07/index.html +++ b/docs/2018-07/index.html @@ -33,7 +33,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r There is insufficient memory for the Java Runtime Environment to continue. "/> - + @@ -566,6 +566,8 @@ dspace=# select count(text_value) from metadatavalue where resource_type_id=2 an
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -574,8 +576,6 @@ dspace=# select count(text_value) from metadatavalue where resource_type_id=2 an
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html index 43700c7af..24d2e0faa 100644 --- a/docs/2018-08/index.html +++ b/docs/2018-08/index.html @@ -43,7 +43,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes I ran all system updates on DSpace Test and rebooted it "/> - + @@ -439,6 +439,8 @@ $ dspace database migrate ignored
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -447,8 +449,6 @@ $ dspace database migrate ignored
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html index 457628968..29e614d32 100644 --- a/docs/2018-09/index.html +++ b/docs/2018-09/index.html @@ -27,7 +27,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again: "/> - + @@ -745,6 +745,8 @@ UPDATE metadatavalue SET text_value='ja' WHERE resource_type_id=2 AND metadata_f
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -753,8 +755,6 @@ UPDATE metadatavalue SET text_value='ja' WHERE resource_type_id=2 AND metadata_f
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html index c51e1a9bd..efc859706 100644 --- a/docs/2018-10/index.html +++ b/docs/2018-10/index.html @@ -23,7 +23,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now "/> - + @@ -653,6 +653,8 @@ $ curl -X GET -H "Content-Type: application/json" -H "Accept: app
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -661,8 +663,6 @@ $ curl -X GET -H "Content-Type: application/json" -H "Accept: app
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html index 385f838e9..6dfc54343 100644 --- a/docs/2018-11/index.html +++ b/docs/2018-11/index.html @@ -33,7 +33,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage Today these are the top 10 IPs: "/> - + @@ -550,6 +550,8 @@ $ dspace dsrun org.dspace.eperson.Groomer -a -b 11/27/2016 -d
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -558,8 +560,6 @@ $ dspace dsrun org.dspace.eperson.Groomer -a -b 11/27/2016 -d
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html index f22a5cd62..3050930f7 100644 --- a/docs/2018-12/index.html +++ b/docs/2018-12/index.html @@ -33,7 +33,7 @@ Then I ran all system updates and restarted the server I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week "/> - + @@ -591,6 +591,8 @@ UPDATE 1
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -599,8 +601,6 @@ UPDATE 1
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html index fa7a75ced..5c84b3687 100644 --- a/docs/2019-01/index.html +++ b/docs/2019-01/index.html @@ -47,7 +47,7 @@ I don’t see anything interesting in the web server logs around that time t 357 207.46.13.1 903 54.70.40.11 "/> - + @@ -1261,6 +1261,8 @@ identify: CorruptImageProfile `xmp' @ warning/profile.c/SetImageProfileInternal/
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -1269,8 +1271,6 @@ identify: CorruptImageProfile `xmp' @ warning/profile.c/SetImageProfileInternal/
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html index 551cb833d..f42b1e144 100644 --- a/docs/2019-02/index.html +++ b/docs/2019-02/index.html @@ -69,7 +69,7 @@ real 0m19.873s user 0m22.203s sys 0m1.979s "/> - + @@ -1341,6 +1341,8 @@ Please see the DSpace documentation for assistance.
    +
  1. August, 2020
  2. +
  3. July, 2020
  4. June, 2020
  5. @@ -1349,8 +1351,6 @@ Please see the DSpace documentation for assistance.
  6. April, 2020
  7. -
  8. March, 2020
  9. -
diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html index 94c1f4b53..10b32dd7a 100644 --- a/docs/2020-07/index.html +++ b/docs/2020-07/index.html @@ -20,7 +20,7 @@ Since I was restarting Tomcat anyways I decided to redeploy the latest changes f - + @@ -35,7 +35,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request "/> - + @@ -45,9 +45,9 @@ Since I was restarting Tomcat anyways I decided to redeploy the latest changes f "@type": "BlogPosting", "headline": "July, 2020", "url": "https://alanorth.github.io/cgspace-notes/2020-07/", - "wordCount": "5184", + "wordCount": "5618", "datePublished": "2020-07-01T10:53:54+03:00", - "dateModified": "2020-07-26T22:24:52+03:00", + "dateModified": "2020-07-27T20:07:52+03:00", "author": { "@type": "Person", "name": "Alan Orth" @@ -1063,7 +1063,62 @@ If run the update again with the resume option (-r) they will be reattempted
$ ./fix-metadata-values.py -i /tmp/2020-07-27-fix-ILRI-author.csv -db dspace -u cgspace -p 'fuuu' -f dc.contributor.author -t 'correct' -m 3
 Fixed 13 occurences of: Muloi, D.
 Fixed 4 occurences of: Muloi, D.M.
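For reference, a minimal Python sketch of the CSV-driven replacement that fix-metadata-values.py performs (hypothetical: the column names follow the command above, and a plain list stands in for the database rows it actually updates):

```python
import csv
import io

# Hypothetical sketch of the CSV-driven fix: one column holds the incorrect
# value, the "correct" column holds the replacement. A plain list stands in
# for DSpace's metadatavalue rows.
csv_text = '''dc.contributor.author,correct
"Muloi, D.","Muloi, D.M."
'''

metadata_values = ["Muloi, D.", "Muloi, D.M.", "Muloi, D."]

for row in csv.DictReader(io.StringIO(csv_text)):
    fixed = 0
    for i, value in enumerate(metadata_values):
        if value == row["dc.contributor.author"]:
            metadata_values[i] = row["correct"]
            fixed += 1
    print(f"Fixed {fixed} occurrences of: {row['dc.contributor.author']}")
```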
-
+
+2020-07-28
+
+2020-07-29
+
+2020-07-30
+
# grep -c numeric /usr/share/iso-codes/json/iso_3166-1.json
+249
+# grep -c -E '"name":' /usr/share/iso-codes/json/iso_3166-1.json
+249
+# grep -c -E '"official_name":' /usr/share/iso-codes/json/iso_3166-1.json
+173
+# grep -c -E '"common_name":' /usr/share/iso-codes/json/iso_3166-1.json
+6
+
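Those counts line up with how a name-to-code lookup could be built from that JSON: every entry has `name` and `numeric`, while `official_name` and `common_name` are optional. A minimal Python sketch, using an inline fragment in the same shape as `/usr/share/iso-codes/json/iso_3166-1.json` (hypothetical sample data, since the real file may not be at hand):

```python
import json

# Tiny fragment in the same shape as iso_3166-1.json from the iso-codes
# package (hypothetical sample; the real file has 249 entries).
sample = json.loads("""
{
  "3166-1": [
    {"alpha_2": "KE", "alpha_3": "KEN", "name": "Kenya", "numeric": "404"},
    {"alpha_2": "BO", "alpha_3": "BOL", "name": "Bolivia, Plurinational State of",
     "official_name": "Plurinational State of Bolivia",
     "common_name": "Bolivia", "numeric": "068"},
    {"alpha_2": "TW", "alpha_3": "TWN", "name": "Taiwan, Province of China",
     "common_name": "Taiwan", "numeric": "158"}
  ]
}
""")

# Map every known name form to its Alpha2 code; "official_name" and
# "common_name" are only present on some entries, as the grep counts show.
alpha2 = {}
for entry in sample["3166-1"]:
    for key in ("name", "official_name", "common_name"):
        if key in entry:
            alpha2[entry[key].lower()] = entry["alpha_2"]

print(alpha2["bolivia"])  # BO
print(alpha2["kenya"])    # KE
```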
+ @@ -1084,6 +1139,8 @@ Fixed 4 occurences of: Muloi, D.M.
diff --git a/docs/categories/index.html b/docs/categories/index.html index 899eb29bd..f7f38475c 100644 --- a/docs/categories/index.html +++ b/docs/categories/index.html @@ -9,12 +9,12 @@ - + - + @@ -83,7 +83,7 @@

Notes

- +
Read more → @@ -107,6 +107,8 @@
diff --git a/docs/categories/index.xml b/docs/categories/index.xml index bba300c3d..1b253bc19 100644 --- a/docs/categories/index.xml +++ b/docs/categories/index.xml @@ -6,7 +6,7 @@ Recent content in Categories on CGSpace Notes Hugo -- gohugo.io en-us - Wed, 01 Jul 2020 10:53:54 +0300 + Thu, 02 Jul 2020 15:35:54 +0300 @@ -14,7 +14,7 @@ Notes https://alanorth.github.io/cgspace-notes/categories/notes/ - Wed, 01 Jul 2020 10:53:54 +0300 + Thu, 02 Jul 2020 15:35:54 +0300 https://alanorth.github.io/cgspace-notes/categories/notes/ diff --git a/docs/categories/notes/index.html b/docs/categories/notes/index.html index 9da93e7fc..f9a530894 100644 --- a/docs/categories/notes/index.html +++ b/docs/categories/notes/index.html @@ -9,12 +9,12 @@ - + - + @@ -80,6 +80,33 @@ +
+
+August, 2020
+
+2020-08-02
+
+  • I spent a few days working on a Java-based curation task to tag items with ISO 3166-1 Alpha2 country codes based on their cg.coverage.country text values
+    • It looks up the names in ISO 3166-1 first, and then in our CGSpace countries mapping (which has five or so of Peter’s preferred “display” country names)
+    • It implements a “force” mode too that will clear existing country codes and re-tag everything
+    • It is class based so I can easily add support for other vocabularies, and the technique could even be used for organizations with mappings to ROR and Clarisa…
+  Read more →

July, 2020

@@ -358,27 +385,6 @@