From b0ba32c97c7d960bf3cf9ae1d594ead483bc1e65 Mon Sep 17 00:00:00 2001 From: Alan Orth Date: Wed, 4 May 2022 11:09:45 +0300 Subject: [PATCH] Add notes for 2022-05-04 --- content/posts/2022-04.md | 7 + content/posts/2022-05.md | 44 ++++ docs/2015-11/index.html | 6 +- docs/2015-12/index.html | 6 +- docs/2016-01/index.html | 6 +- docs/2016-02/index.html | 6 +- docs/2016-03/index.html | 6 +- docs/2016-04/index.html | 6 +- docs/2016-05/index.html | 6 +- docs/2016-06/index.html | 6 +- docs/2016-07/index.html | 6 +- docs/2016-08/index.html | 6 +- docs/2016-09/index.html | 6 +- docs/2016-10/index.html | 6 +- docs/2016-11/index.html | 6 +- docs/2016-12/index.html | 6 +- docs/2017-01/index.html | 6 +- docs/2017-02/index.html | 6 +- docs/2017-03/index.html | 6 +- docs/2017-04/index.html | 6 +- docs/2017-05/index.html | 6 +- docs/2017-06/index.html | 6 +- docs/2017-07/index.html | 6 +- docs/2017-08/index.html | 6 +- docs/2017-09/index.html | 6 +- docs/2017-10/index.html | 6 +- docs/2017-11/index.html | 6 +- docs/2017-12/index.html | 6 +- docs/2018-01/index.html | 6 +- docs/2018-02/index.html | 6 +- docs/2018-03/index.html | 6 +- docs/2018-04/index.html | 6 +- docs/2018-05/index.html | 6 +- docs/2018-06/index.html | 6 +- docs/2018-07/index.html | 6 +- docs/2018-08/index.html | 6 +- docs/2018-09/index.html | 6 +- docs/2018-10/index.html | 6 +- docs/2018-11/index.html | 6 +- docs/2018-12/index.html | 6 +- docs/2019-01/index.html | 6 +- docs/2019-02/index.html | 6 +- docs/2019-03/index.html | 6 +- docs/2019-04/index.html | 6 +- docs/2019-05/index.html | 6 +- docs/2019-06/index.html | 6 +- docs/2019-07/index.html | 6 +- docs/2019-08/index.html | 6 +- docs/2019-09/index.html | 6 +- docs/2019-10/index.html | 6 +- docs/2019-11/index.html | 6 +- docs/2019-12/index.html | 6 +- docs/2020-01/index.html | 6 +- docs/2020-02/index.html | 6 +- docs/2020-03/index.html | 6 +- docs/2020-04/index.html | 6 +- docs/2020-05/index.html | 6 +- docs/2020-06/index.html | 6 +- docs/2020-07/index.html | 6 +- docs/2020-08/index.html | 6 +- docs/2020-09/index.html | 6 +- docs/2020-10/index.html | 6 +- docs/2020-11/index.html | 6 +- docs/2020-12/index.html | 6 +- docs/2021-01/index.html | 6 +- docs/2021-02/index.html | 6 +- docs/2021-03/index.html | 6 +- docs/2021-04/index.html | 6 +- docs/2021-05/index.html | 6 +- docs/2021-06/index.html | 6 +- docs/2021-07/index.html | 6 +- docs/2021-08/index.html | 6 +- docs/2021-09/index.html | 6 +- docs/2021-10/index.html | 6 +- docs/2021-11/index.html | 6 +- docs/2021-12/index.html | 6 +- docs/2022-01/index.html | 6 +- docs/2022-02/index.html | 6 +- docs/2022-03/index.html | 6 +- docs/2022-04/index.html | 22 +- docs/2022-05/index.html | 274 +++++++++++++++++++++ docs/404.html | 6 +- docs/categories/index.html | 10 +- docs/categories/index.xml | 4 +- docs/categories/notes/index.html | 74 +++--- docs/categories/notes/index.xml | 35 ++- docs/categories/notes/page/2/index.html | 57 +++-- docs/categories/notes/page/3/index.html | 70 +++--- docs/categories/notes/page/4/index.html | 77 +++--- docs/categories/notes/page/5/index.html | 68 ++--- docs/categories/notes/page/6/index.html | 36 ++- docs/cgiar-library-migration/index.html | 6 +- docs/cgspace-cgcorev2-migration/index.html | 6 +- docs/cgspace-dspace6-upgrade/index.html | 6 +- docs/index.html | 76 +++--- docs/index.xml | 35 ++- docs/page/2/index.html | 59 +++-- docs/page/3/index.html | 72 +++--- docs/page/4/index.html | 79 +++--- docs/page/5/index.html | 70 +++--- docs/page/6/index.html | 77 +++--- docs/page/7/index.html | 79 +++--- 
docs/page/8/index.html | 67 ++--- docs/page/9/index.html | 37 ++- docs/posts/index.html | 76 +++--- docs/posts/index.xml | 35 ++- docs/posts/page/2/index.html | 59 +++-- docs/posts/page/3/index.html | 72 +++--- docs/posts/page/4/index.html | 79 +++--- docs/posts/page/5/index.html | 70 +++--- docs/posts/page/6/index.html | 77 +++--- docs/posts/page/7/index.html | 79 +++--- docs/posts/page/8/index.html | 67 ++--- docs/posts/page/9/index.html | 37 ++- docs/robots.txt | 3 +- docs/sitemap.xml | 17 +- docs/tags/index.html | 6 +- docs/tags/migration/index.html | 6 +- docs/tags/notes/index.html | 6 +- docs/tags/notes/page/2/index.html | 6 +- docs/tags/notes/page/3/index.html | 6 +- 121 files changed, 1590 insertions(+), 1026 deletions(-) create mode 100644 content/posts/2022-05.md create mode 100644 docs/2022-05/index.html diff --git a/content/posts/2022-04.md b/content/posts/2022-04.md index ce79468bc..acd54e9bc 100644 --- a/content/posts/2022-04.md +++ b/content/posts/2022-04.md @@ -392,4 +392,11 @@ Total number of bot hits purged: 343 - 54.162.92.93 - 54.226.171.89 +## 2022-04-28 + +- Had a meeting with FAO and the team from SEAFDAC, who run many repositories that are integrated with AGROVOC + - Elvi from SEAFDAC has modified the [DSpace-CRIS 6.x VIAF lookup plugin to query AGROVOC](https://github.com/eulereadgbe/DSpace/blob/sair-6.3/dspace-api/src/main/java/org/dspace/content/authority/AgrovocAuthority.java) + - Also, they are doing a nice integration similar to the WorldFish / MELSpace repositories where they store the AGROVOC URIs in DSpace and show the terms with an icon in the UI + - See: https://repository.seafdec.org.ph/handle/10862/6320 + diff --git a/content/posts/2022-05.md b/content/posts/2022-05.md new file mode 100644 index 000000000..ca3dd27d9 --- /dev/null +++ b/content/posts/2022-05.md @@ -0,0 +1,44 @@ +--- +title: "May, 2022" +date: 2022-05-04T09:13:39+03:00 +author: "Alan Orth" +categories: ["Notes"] +--- + +## 2022-05-04 + +- I found a few more IPs making requests using the shady Chrome 44 user agent in the last few days so I will add them to the block list too: + - 18.207.136.176 + - 185.189.36.248 + - 50.118.223.78 + - 52.70.76.123 + - 3.236.10.11 +- Looking at the Solr statistics for 2022-04 + - 52.191.137.59 is Microsoft, but they are using a normal user agent and making tens of thousands of requests + - 64.39.98.62 is owned by Qualys, and all their requests are probing for /etc/passwd etc + - 185.192.69.15 is in the Netherlands and is using a normal user agent, but making excessive automated HTTP requests to paths forbidden in robots.txt + - 157.55.39.159 is owned by Microsoft and identifies as bingbot so I don't know why its requests were logged in Solr + - 52.233.67.176 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests + - 157.55.39.144 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests + - 207.46.13.177 is owned by Microsoft and identifies as bingbot so I don't know why its requests were logged in Solr + - If I query Solr for `time:2022-04* AND dns:*msnbot* AND dns:*.msn.com.` I see a handful of IPs that made 41,000 requests +- I purged 93,974 hits from these IPs using my `check-spider-ip-hits.sh` script + + + +- Now looking at the Solr statistics by user agent I see: + - `SomeRandomText` + - `RestSharp/106.11.7.0` + - `MetaInspector/5.7.0 (+https://github.com/jaimeiniesta/metainspector)` + - `wp_is_mobile` + - `Mozilla/5.0 (compatible; um-LN/1.0; mailto: 
techinfo@ubermetrics-technologies.com; Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1"` + - `insomnia/2022.2.1` + - `ZoteroTranslationServer` + - `omgili/0.5 +http://omgili.com` + - `curb` + - `Sprout Social (Link Attachment)` +- I purged 2,900 hits from these user agents from Solr using my `check-spider-hits.sh` script +- I made a [pull request to COUNTER-Robots](https://github.com/atmire/COUNTER-Robots/pull/54) for some of these agents + - In the mean time I will add them to our local overrides in DSpace + + diff --git a/docs/2015-11/index.html b/docs/2015-11/index.html index 52ce62961..174288536 100644 --- a/docs/2015-11/index.html +++ b/docs/2015-11/index.html @@ -34,7 +34,7 @@ Last week I had increased the limit from 30 to 60, which seemed to help, but now $ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace 78 "/> - + @@ -242,6 +242,8 @@ db.statementpool = true
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -250,8 +252,6 @@ db.statementpool = true
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2015-12/index.html b/docs/2015-12/index.html index e956e8848..5abb3d859 100644 --- a/docs/2015-12/index.html +++ b/docs/2015-12/index.html @@ -36,7 +36,7 @@ Replace lzop with xz in log compression cron jobs on DSpace Test—it uses less -rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo -rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz "/> - + @@ -264,6 +264,8 @@ $ curl -o /dev/null -s -w %{time_total}\\n https://cgspace.cgiar.org/rest/handle
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -272,8 +274,6 @@ $ curl -o /dev/null -s -w %{time_total}\\n https://cgspace.cgiar.org/rest/handle
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-01/index.html b/docs/2016-01/index.html index 92adc73c0..167bb684a 100644 --- a/docs/2016-01/index.html +++ b/docs/2016-01/index.html @@ -28,7 +28,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_ I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated. Update GitHub wiki for documentation of maintenance tasks. "/> - + @@ -200,6 +200,8 @@ $ find SimpleArchiveForBio/ -iname “*.pdf” -exec basename {} ; | sor
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -208,8 +210,6 @@ $ find SimpleArchiveForBio/ -iname “*.pdf” -exec basename {} ; | sor
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html index b9ff7a981..9ffd251d3 100644 --- a/docs/2016-02/index.html +++ b/docs/2016-02/index.html @@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace: Not only are there 49,000 countries, we have some blanks (25)… Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE” "/> - + @@ -378,6 +378,8 @@ Bitstream: tést señora alimentación.pdf
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -386,8 +388,6 @@ Bitstream: tést señora alimentación.pdf
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html index 6e14dc9df..f9aa426c9 100644 --- a/docs/2016-03/index.html +++ b/docs/2016-03/index.html @@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server "/> - + @@ -316,6 +316,8 @@ Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Ja
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -324,8 +326,6 @@ Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Ja
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html index ce8bdd564..79b7d192c 100644 --- a/docs/2016-04/index.html +++ b/docs/2016-04/index.html @@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any This will save us a few gigs of backup space we’re paying for on S3 Also, I noticed the checker log has some errors we should pay attention to: "/> - + @@ -495,6 +495,8 @@ dspace.log.2016-04-27:7271
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -503,8 +505,6 @@ dspace.log.2016-04-27:7271
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html index 199543e46..6c49ee8b2 100644 --- a/docs/2016-05/index.html +++ b/docs/2016-05/index.html @@ -34,7 +34,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period! # awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l 3168 "/> - + @@ -371,6 +371,8 @@ sys 0m20.540s
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -379,8 +381,6 @@ sys 0m20.540s
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html index 6500ee628..64e053945 100644 --- a/docs/2016-06/index.html +++ b/docs/2016-06/index.html @@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship "/> - + @@ -409,6 +409,8 @@ $ ./delete-metadata-values.py -f dc.contributor.corporate -i Corporate-Authors-D
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -417,8 +419,6 @@ $ ./delete-metadata-values.py -f dc.contributor.corporate -i Corporate-Authors-D
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html index c568d5b8b..437fa44ea 100644 --- a/docs/2016-07/index.html +++ b/docs/2016-07/index.html @@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and In this case the select query was showing 95 results before the update "/> - + @@ -325,6 +325,8 @@ discovery.index.authority.ignore-variants=true
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -333,8 +335,6 @@ discovery.index.authority.ignore-variants=true
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html index 86991b14f..526ee00ab 100644 --- a/docs/2016-08/index.html +++ b/docs/2016-08/index.html @@ -42,7 +42,7 @@ $ git checkout -b 55new 5_x-prod $ git reset --hard ilri/5_x-prod $ git rebase -i dspace-5.5 "/> - + @@ -389,6 +389,8 @@ $ JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx512m" /home/cgspace.cgiar.org/bin
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -397,8 +399,6 @@ $ JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx512m" /home/cgspace.cgiar.org/bin
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html index 783807a21..96ffeab85 100644 --- a/docs/2016-09/index.html +++ b/docs/2016-09/index.html @@ -34,7 +34,7 @@ It looks like we might be able to use OUs now, instead of DCs: $ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)" "/> - + @@ -606,6 +606,8 @@ $ ./delete-metadata-values.py -i ilrisubjects-delete-13.csv -f cg.subject.ilri -
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -614,8 +616,6 @@ $ ./delete-metadata-values.py -i ilrisubjects-delete-13.csv -f cg.subject.ilri -
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html index 6c3c7a011..236a3b5db 100644 --- a/docs/2016-10/index.html +++ b/docs/2016-10/index.html @@ -42,7 +42,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id 0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X "/> - + @@ -372,6 +372,8 @@ dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'h
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -380,8 +382,6 @@ dspace=# update metadatavalue set text_value = regexp_replace(text_value, 'h
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-11/index.html b/docs/2016-11/index.html index f2719555b..2b38f49cb 100644 --- a/docs/2016-11/index.html +++ b/docs/2016-11/index.html @@ -26,7 +26,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module Add dc.type to the output options for Atmire’s Listings and Reports module (#286) "/> - + @@ -548,6 +548,8 @@ org.dspace.discovery.SearchServiceException: Error executing query
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -556,8 +558,6 @@ org.dspace.discovery.SearchServiceException: Error executing query
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html index 5c54ca72e..33f40ead0 100644 --- a/docs/2016-12/index.html +++ b/docs/2016-12/index.html @@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r I’ve raised a ticket with Atmire to ask Another worrying error from dspace.log is: "/> - + @@ -784,6 +784,8 @@ $ exit
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -792,8 +794,6 @@ $ exit
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html index d540deb26..f46fdd1ba 100644 --- a/docs/2017-01/index.html +++ b/docs/2017-01/index.html @@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s I tested on DSpace Test as well and it doesn’t work there either I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years "/> - + @@ -369,6 +369,8 @@ $ gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -377,8 +379,6 @@ $ gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html index 5c109c156..43ee55ccf 100644 --- a/docs/2017-02/index.html +++ b/docs/2017-02/index.html @@ -50,7 +50,7 @@ DELETE 1 Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301) Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name "/> - + @@ -423,6 +423,8 @@ COPY 1968
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -431,8 +433,6 @@ COPY 1968
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html index dd4e4acc1..8f6962626 100644 --- a/docs/2017-03/index.html +++ b/docs/2017-03/index.html @@ -54,7 +54,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg $ identify ~/Desktop/alc_contrastes_desafios.jpg /Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000 "/> - + @@ -355,6 +355,8 @@ $ ./delete-metadata-values.py -i Investors-Delete-121.csv -f dc.description.spon
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -363,8 +365,6 @@ $ ./delete-metadata-values.py -i Investors-Delete-121.csv -f dc.description.spon
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html index 8e6df6fe8..20f07715c 100644 --- a/docs/2017-04/index.html +++ b/docs/2017-04/index.html @@ -40,7 +40,7 @@ Testing the CMYK patch on a collection with 650 items: $ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt "/> - + @@ -585,6 +585,8 @@ $ gem install compass -v 1.0.3
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -593,8 +595,6 @@ $ gem install compass -v 1.0.3
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html index f9311d1b0..18b2a0e56 100644 --- a/docs/2017-05/index.html +++ b/docs/2017-05/index.html @@ -18,7 +18,7 @@ - + @@ -391,6 +391,8 @@ UPDATE 187
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -399,8 +401,6 @@ UPDATE 187
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html index 096f60d93..470aee475 100644 --- a/docs/2017-06/index.html +++ b/docs/2017-06/index.html @@ -18,7 +18,7 @@ - + @@ -270,6 +270,8 @@ $ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" [dspace]/bin/dspace import
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -278,8 +280,6 @@ $ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" [dspace]/bin/dspace import
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html index 7ecf11826..bbf8b071d 100644 --- a/docs/2017-07/index.html +++ b/docs/2017-07/index.html @@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329) Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML: "/> - + @@ -275,6 +275,8 @@ delete from metadatavalue where resource_type_id=2 and metadata_field_id=235 and
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -283,8 +285,6 @@ delete from metadatavalue where resource_type_id=2 and metadata_field_id=235 and
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html index b37f55d54..b5f0b45b5 100644 --- a/docs/2017-08/index.html +++ b/docs/2017-08/index.html @@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet "/> - + @@ -517,6 +517,8 @@ org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -525,8 +527,6 @@ org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html index 32a0ea80f..3dce2b095 100644 --- a/docs/2017-09/index.html +++ b/docs/2017-09/index.html @@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group "/> - + @@ -659,6 +659,8 @@ Cert Status: good
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -667,8 +669,6 @@ Cert Status: good
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html index 55809997c..4a16368b4 100644 --- a/docs/2017-10/index.html +++ b/docs/2017-10/index.html @@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336 There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections "/> - + @@ -443,6 +443,8 @@ session_id=6C30F10B4351A4ED83EC6ED50AFD6B6A
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -451,8 +453,6 @@ session_id=6C30F10B4351A4ED83EC6ED50AFD6B6A
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html index a15ce88a6..259c97a32 100644 --- a/docs/2017-11/index.html +++ b/docs/2017-11/index.html @@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct: dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv; COPY 54701 "/> - + @@ -944,6 +944,8 @@ $ cat dspace.log.2017-11-28 | grep -o -E 'session_id=[A-Z0-9]{32}' | sor
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -952,8 +954,6 @@ $ cat dspace.log.2017-11-28 | grep -o -E 'session_id=[A-Z0-9]{32}' | sor
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html index 8f0680559..335daf33a 100644 --- a/docs/2017-12/index.html +++ b/docs/2017-12/index.html @@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object” PostgreSQL activity says there are 115 connections currently The list of connections to XMLUI and REST API for today: "/> - + @@ -783,6 +783,8 @@ DELETE 20
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -791,8 +793,6 @@ DELETE 20
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html index 729913157..45a428269 100644 --- a/docs/2018-01/index.html +++ b/docs/2018-01/index.html @@ -150,7 +150,7 @@ dspace.log.2018-01-02:34 Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains "/> - + @@ -1452,6 +1452,8 @@ Catalina:type=Manager,context=/,host=localhost activeSessions 8
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1460,8 +1462,6 @@ Catalina:type=Manager,context=/,host=localhost activeSessions 8
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html index 5ee2b904a..8fef7ccb8 100644 --- a/docs/2018-02/index.html +++ b/docs/2018-02/index.html @@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that Yesterday I figured out how to monitor DSpace sessions using JMX I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01 "/> - + @@ -1038,6 +1038,8 @@ UPDATE 3
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1046,8 +1048,6 @@ UPDATE 3
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html index 00c6ac5bb..9fffdfcab 100644 --- a/docs/2018-03/index.html +++ b/docs/2018-03/index.html @@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller Export a CSV of the IITA community metadata for Martin Mueller "/> - + @@ -585,6 +585,8 @@ Fixed 5 occurences of: GENEBANKS
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -593,8 +595,6 @@ Fixed 5 occurences of: GENEBANKS
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html index 6258f59dc..3f15a96cd 100644 --- a/docs/2018-04/index.html +++ b/docs/2018-04/index.html @@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday: I tried to test something on DSpace Test but noticed that it’s down since god knows when Catalina logs at least show some memory errors yesterday: "/> - + @@ -594,6 +594,8 @@ $ pg_restore -O -U dspacetest -d dspacetest -W -h localhost /tmp/dspace_2018-04-
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -602,8 +604,6 @@ $ pg_restore -O -U dspacetest -d dspacetest -W -h localhost /tmp/dspace_2018-04-
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html index c9e6b901d..0f4cb813e 100644 --- a/docs/2018-05/index.html +++ b/docs/2018-05/index.html @@ -38,7 +38,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E Then I reduced the JVM heap size from 6144 back to 5120m Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use "/> - + @@ -523,6 +523,8 @@ $ psql -h localhost -U postgres dspacetest
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -531,8 +533,6 @@ $ psql -h localhost -U postgres dspacetest
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html index fe988455a..2763fd8dc 100644 --- a/docs/2018-06/index.html +++ b/docs/2018-06/index.html @@ -58,7 +58,7 @@ real 74m42.646s user 8m5.056s sys 2m7.289s "/> - + @@ -517,6 +517,8 @@ $ sed '/^id/d' 10568-*.csv | csvcut -c 1,2 > map-to-cifor-archive.csv
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -525,8 +527,6 @@ $ sed '/^id/d' 10568-*.csv | csvcut -c 1,2 > map-to-cifor-archive.csv
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html index 224cefdcc..6b004abc0 100644 --- a/docs/2018-07/index.html +++ b/docs/2018-07/index.html @@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r There is insufficient memory for the Java Runtime Environment to continue. "/> - + @@ -569,6 +569,8 @@ dspace=# select count(text_value) from metadatavalue where resource_type_id=2 an
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -577,8 +579,6 @@ dspace=# select count(text_value) from metadatavalue where resource_type_id=2 an
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html index 940531f64..cab4f9c32 100644 --- a/docs/2018-08/index.html +++ b/docs/2018-08/index.html @@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes I ran all system updates on DSpace Test and rebooted it "/> - + @@ -442,6 +442,8 @@ $ dspace database migrate ignored
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -450,8 +452,6 @@ $ dspace database migrate ignored
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html index c75973f27..79220fbca 100644 --- a/docs/2018-09/index.html +++ b/docs/2018-09/index.html @@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again: "/> - + @@ -748,6 +748,8 @@ UPDATE metadatavalue SET text_value='ja' WHERE resource_type_id=2 AND me
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -756,8 +758,6 @@ UPDATE metadatavalue SET text_value='ja' WHERE resource_type_id=2 AND me
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html index 03ccd6a44..7a61b9066 100644 --- a/docs/2018-10/index.html +++ b/docs/2018-10/index.html @@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now "/> - + @@ -656,6 +656,8 @@ $ curl -X GET -H "Content-Type: application/json" -H "Accept: applic
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -664,8 +666,6 @@ $ curl -X GET -H "Content-Type: application/json" -H "Accept: applic
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html index 0cab8ff3c..3f4456f96 100644 --- a/docs/2018-11/index.html +++ b/docs/2018-11/index.html @@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage Today these are the top 10 IPs: "/> - + @@ -553,6 +553,8 @@ $ dspace dsrun org.dspace.eperson.Groomer -a -b 11/27/2016 -d
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -561,8 +563,6 @@ $ dspace dsrun org.dspace.eperson.Groomer -a -b 11/27/2016 -d
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html index 51e6326fb..296a4affc 100644 --- a/docs/2018-12/index.html +++ b/docs/2018-12/index.html @@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week "/> - + @@ -594,6 +594,8 @@ UPDATE 1
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -602,8 +604,6 @@ UPDATE 1
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html index 8f9681b5e..5956f0d74 100644 --- a/docs/2019-01/index.html +++ b/docs/2019-01/index.html @@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t 357 207.46.13.1 903 54.70.40.11 "/> - + @@ -1264,6 +1264,8 @@ identify: CorruptImageProfile `xmp' @ warning/profile.c/SetImageProfileInter
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1272,8 +1274,6 @@ identify: CorruptImageProfile `xmp' @ warning/profile.c/SetImageProfileInter
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html index cc74fb8ff..ec470028f 100644 --- a/docs/2019-02/index.html +++ b/docs/2019-02/index.html @@ -72,7 +72,7 @@ real 0m19.873s user 0m22.203s sys 0m1.979s "/> - + @@ -1344,6 +1344,8 @@ Please see the DSpace documentation for assistance.
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1352,8 +1354,6 @@ Please see the DSpace documentation for assistance.
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html index 567c0ce4b..8a31fd965 100644 --- a/docs/2019-03/index.html +++ b/docs/2019-03/index.html @@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs "/> - + @@ -1208,6 +1208,8 @@ sys 0m2.551s
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1216,8 +1218,6 @@ sys 0m2.551s
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html index d3eb711ef..19c10b0cf 100644 --- a/docs/2019-04/index.html +++ b/docs/2019-04/index.html @@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d "/> - + @@ -1299,6 +1299,8 @@ UPDATE 14
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1307,8 +1309,6 @@ UPDATE 14
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html index 16e601de6..3437dc919 100644 --- a/docs/2019-05/index.html +++ b/docs/2019-05/index.html @@ -48,7 +48,7 @@ DELETE 1 But after this I tried to delete the item from the XMLUI and it is still present… "/> - + @@ -631,6 +631,8 @@ COPY 64871
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -639,8 +641,6 @@ COPY 64871
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html index ca0a35872..5af43c2cc 100644 --- a/docs/2019-06/index.html +++ b/docs/2019-06/index.html @@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it Skype with Marie-Angélique and Abenet about CG Core v2 "/> - + @@ -317,6 +317,8 @@ UPDATE 2
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -325,8 +327,6 @@ UPDATE 2
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html index d8a20e98e..d4569be08 100644 --- a/docs/2019-07/index.html +++ b/docs/2019-07/index.html @@ -38,7 +38,7 @@ CGSpace Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community "/> - + @@ -554,6 +554,8 @@ issn.validate('1020-3362')
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -562,8 +564,6 @@ issn.validate('1020-3362')
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html index 82faaaddb..08ebcdf63 100644 --- a/docs/2019-08/index.html +++ b/docs/2019-08/index.html @@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck Run system updates on DSpace Test (linode19) and reboot it "/> - + @@ -573,6 +573,8 @@ sys 2m27.496s
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -581,8 +583,6 @@ sys 2m27.496s
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html index c23215a7e..4c756ece9 100644 --- a/docs/2019-09/index.html +++ b/docs/2019-09/index.html @@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning: 7249 2a01:7e00::f03c:91ff:fe18:7396 9124 45.5.186.2 "/> - + @@ -581,6 +581,8 @@ $ csv-metadata-quality -i /tmp/clarisa-institutions.csv -o /tmp/clarisa-institut
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -589,8 +591,6 @@ $ csv-metadata-quality -i /tmp/clarisa-institutions.csv -o /tmp/clarisa-institut
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-10/index.html b/docs/2019-10/index.html index 143a6f578..ab4243892 100644 --- a/docs/2019-10/index.html +++ b/docs/2019-10/index.html @@ -18,7 +18,7 @@ - + @@ -385,6 +385,8 @@ $ dspace import -a -c 10568/104057 -e fuu@cgiar.org -m 2019-10-15-Bioversity.map
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -393,8 +395,6 @@ $ dspace import -a -c 10568/104057 -e fuu@cgiar.org -m 2019-10-15-Bioversity.map
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-11/index.html b/docs/2019-11/index.html index 76482e183..4f36ca2b4 100644 --- a/docs/2019-11/index.html +++ b/docs/2019-11/index.html @@ -58,7 +58,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t # zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams" 106781 "/> - + @@ -692,6 +692,8 @@ $ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-i
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -700,8 +702,6 @@ $ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-i
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html index 3d5468901..3f759db92 100644 --- a/docs/2019-12/index.html +++ b/docs/2019-12/index.html @@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the # dpkg -C # reboot "/> - + @@ -404,6 +404,8 @@ UPDATE 1
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -412,8 +414,6 @@ UPDATE 1
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-01/index.html b/docs/2020-01/index.html index d54d8e605..b9f249f87 100644 --- a/docs/2020-01/index.html +++ b/docs/2020-01/index.html @@ -56,7 +56,7 @@ I tweeted the CGSpace repository link "/> - + @@ -604,6 +604,8 @@ COPY 2900
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -612,8 +614,6 @@ COPY 2900
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-02/index.html b/docs/2020-02/index.html index fb78989ea..801185828 100644 --- a/docs/2020-02/index.html +++ b/docs/2020-02/index.html @@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install "/> - + @@ -1275,6 +1275,8 @@ Moving: 21993 into core statistics-2019
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1283,8 +1285,6 @@ Moving: 21993 into core statistics-2019
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-03/index.html b/docs/2020-03/index.html index 54b20a443..a61ae3f35 100644 --- a/docs/2020-03/index.html +++ b/docs/2020-03/index.html @@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it "/> - + @@ -484,6 +484,8 @@ $ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-i
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -492,8 +494,6 @@ $ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-i
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-04/index.html b/docs/2020-04/index.html index 1d0b0382e..2e196d008 100644 --- a/docs/2020-04/index.html +++ b/docs/2020-04/index.html @@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week "/> - + @@ -658,6 +658,8 @@ $ psql -c 'select * from pg_stat_activity' | wc -l
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -666,8 +668,6 @@ $ psql -c 'select * from pg_stat_activity' | wc -l
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-05/index.html b/docs/2020-05/index.html index 8be3c40e8..f26af5cfa 100644 --- a/docs/2020-05/index.html +++ b/docs/2020-05/index.html @@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2 "/> - + @@ -477,6 +477,8 @@ Caused by: java.lang.NullPointerException
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -485,8 +487,6 @@ Caused by: java.lang.NullPointerException
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-06/index.html b/docs/2020-06/index.html index cce24ce34..b9abf933c 100644 --- a/docs/2020-06/index.html +++ b/docs/2020-06/index.html @@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to In other news, I checked the statistics API on DSpace 6 and it’s working I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error: "/> - + @@ -811,6 +811,8 @@ $ csvcut -c 'id,cg.subject.ilri[],cg.subject.ilri[en_US],dc.subject[en_US]&#
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -819,8 +821,6 @@ $ csvcut -c 'id,cg.subject.ilri[],cg.subject.ilri[en_US],dc.subject[en_US]&#
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html index 178c21910..f5a2168c0 100644 --- a/docs/2020-07/index.html +++ b/docs/2020-07/index.html @@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request "/> - + @@ -1142,6 +1142,8 @@ Fixed 4 occurences of: Muloi, D.M.
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1150,8 +1152,6 @@ Fixed 4 occurences of: Muloi, D.M.
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-08/index.html b/docs/2020-08/index.html index 5bcd2f625..da1f3d13d 100644 --- a/docs/2020-08/index.html +++ b/docs/2020-08/index.html @@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te "/> - + @@ -798,6 +798,8 @@ $ grep -c added /tmp/2020-08-27-countrycodetagger.log
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -806,8 +808,6 @@ $ grep -c added /tmp/2020-08-27-countrycodetagger.log
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-09/index.html b/docs/2020-09/index.html index f658278f0..d1469b06e 100644 --- a/docs/2020-09/index.html +++ b/docs/2020-09/index.html @@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39 I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40 "/> - + @@ -717,6 +717,8 @@ solr_query_params = {
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -725,8 +727,6 @@ solr_query_params = {
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-10/index.html b/docs/2020-10/index.html index 15f8ed23f..d79237a04 100644 --- a/docs/2020-10/index.html +++ b/docs/2020-10/index.html @@ -44,7 +44,7 @@ During the FlywayDB migration I got an error: "/> - + @@ -1241,6 +1241,8 @@ $ ./delete-metadata-values.py -i 2020-10-31-delete-74-sponsors.csv -db dspace -u
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1249,8 +1251,6 @@ $ ./delete-metadata-values.py -i 2020-10-31-delete-74-sponsors.csv -db dspace -u
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-11/index.html b/docs/2020-11/index.html index 63733f291..bd1e8f54d 100644 --- a/docs/2020-11/index.html +++ b/docs/2020-11/index.html @@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat "/> - + @@ -731,6 +731,8 @@ $ ./fix-metadata-values.py -i 2020-11-30-fix-hung-orcid.csv -db dspace63 -u dspa
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -739,8 +741,6 @@ $ ./fix-metadata-values.py -i 2020-11-30-fix-hung-orcid.csv -db dspace63 -u dspa
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2020-12/index.html b/docs/2020-12/index.html index 668e7f33e..defd71388 100644 --- a/docs/2020-12/index.html +++ b/docs/2020-12/index.html @@ -36,7 +36,7 @@ I started processing those (about 411,000 records): "/> - + @@ -869,6 +869,8 @@ $ query-json '.items | length' /tmp/policy2.json
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -877,8 +879,6 @@ $ query-json '.items | length' /tmp/policy2.json
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-01/index.html b/docs/2021-01/index.html index f6c98c807..92d19af81 100644 --- a/docs/2021-01/index.html +++ b/docs/2021-01/index.html @@ -50,7 +50,7 @@ For example, this item has 51 views on CGSpace, but 0 on AReS "/> - + @@ -688,6 +688,8 @@ java.lang.IllegalArgumentException: Invalid character found in the request targe
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -696,8 +698,6 @@ java.lang.IllegalArgumentException: Invalid character found in the request targe
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-02/index.html b/docs/2021-02/index.html index 62f28a8ee..cad00c2a8 100644 --- a/docs/2021-02/index.html +++ b/docs/2021-02/index.html @@ -60,7 +60,7 @@ $ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty } } "/> - + @@ -898,6 +898,8 @@ dspace=# UPDATE metadatavalue SET text_lang='en_US' WHERE dspace_object_
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -906,8 +908,6 @@ dspace=# UPDATE metadatavalue SET text_lang='en_US' WHERE dspace_object_
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-03/index.html b/docs/2021-03/index.html index 2fd8e17bd..3604df1e1 100644 --- a/docs/2021-03/index.html +++ b/docs/2021-03/index.html @@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst "/> - + @@ -875,6 +875,8 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -883,8 +885,6 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-04/index.html b/docs/2021-04/index.html index d55023cb1..b38b2922f 100644 --- a/docs/2021-04/index.html +++ b/docs/2021-04/index.html @@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in "/> - + @@ -1042,6 +1042,8 @@ $ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisti
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -1050,8 +1052,6 @@ $ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisti
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-05/index.html b/docs/2021-05/index.html index ec54b09d1..201e7750d 100644 --- a/docs/2021-05/index.html +++ b/docs/2021-05/index.html @@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user… "/> - + @@ -685,6 +685,8 @@ May 26, 02:57 UTC
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -693,8 +695,6 @@ May 26, 02:57 UTC
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-06/index.html b/docs/2021-06/index.html index b89a5dbf0..968d43b93 100644 --- a/docs/2021-06/index.html +++ b/docs/2021-06/index.html @@ -36,7 +36,7 @@ I simply started it and AReS was running again: "/> - + @@ -693,6 +693,8 @@ I simply started it and AReS was running again:
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -701,8 +703,6 @@ I simply started it and AReS was running again:
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-07/index.html b/docs/2021-07/index.html index 4e0fcc2a6..6c9545405 100644 --- a/docs/2021-07/index.html +++ b/docs/2021-07/index.html @@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER; COPY 20994 "/> - + @@ -715,6 +715,8 @@ COPY 20994
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -723,8 +725,6 @@ COPY 20994
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-08/index.html b/docs/2021-08/index.html index 40c732c88..d1da63914 100644 --- a/docs/2021-08/index.html +++ b/docs/2021-08/index.html @@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server: I decided to upgrade linode20 from Ubuntu 18.04 to 20.04 "/> - + @@ -606,6 +606,8 @@ I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -614,8 +616,6 @@ I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-09/index.html b/docs/2021-09/index.html index 9334c865b..bf44b725b 100644 --- a/docs/2021-09/index.html +++ b/docs/2021-09/index.html @@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu "/> - + @@ -588,6 +588,8 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -596,8 +598,6 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-10/index.html b/docs/2021-10/index.html index f853c4105..40d2c1583 100644 --- a/docs/2021-10/index.html +++ b/docs/2021-10/index.html @@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt So we have 1879/7100 (26.46%) matching already "/> - + @@ -791,6 +791,8 @@ Try doing it in two imports. In first import, remove all authors. In second impo
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -799,8 +801,6 @@ Try doing it in two imports. In first import, remove all authors. In second impo
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-11/index.html b/docs/2021-11/index.html index fa7c92b6c..1c9268ce2 100644 --- a/docs/2021-11/index.html +++ b/docs/2021-11/index.html @@ -32,7 +32,7 @@ First I exported all the 2019 stats from CGSpace: $ ./run.sh -s http://localhost:8081/solr/statistics -f 'time:2019-*' -a export -o statistics-2019.json -k uid $ zstd statistics-2019.json "/> - + @@ -494,6 +494,8 @@ $ zstd statistics-2019.json
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -502,8 +504,6 @@ $ zstd statistics-2019.json
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2021-12/index.html b/docs/2021-12/index.html index d3c3b0451..59754df8a 100644 --- a/docs/2021-12/index.html +++ b/docs/2021-12/index.html @@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics Total number of bot hits purged: 3679 "/> - + @@ -577,6 +577,8 @@ Total number of bot hits purged: 3679
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -585,8 +587,6 @@ Total number of bot hits purged: 3679
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2022-01/index.html b/docs/2022-01/index.html index 7f0b7c215..8287c06a0 100644 --- a/docs/2022-01/index.html +++ b/docs/2022-01/index.html @@ -24,7 +24,7 @@ Start a full harvest on AReS Start a full harvest on AReS "/> - + @@ -380,6 +380,8 @@ Start a full harvest on AReS
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -388,8 +390,6 @@ Start a full harvest on AReS
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2022-02/index.html b/docs/2022-02/index.html index 5ac983ab1..16053e9d6 100644 --- a/docs/2022-02/index.html +++ b/docs/2022-02/index.html @@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR "/> - + @@ -724,6 +724,8 @@ isNotNull(value.match('699'))
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -732,8 +734,6 @@ isNotNull(value.match('699'))
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2022-03/index.html b/docs/2022-03/index.html index 4e3d954ba..5a1a84e2d 100644 --- a/docs/2022-03/index.html +++ b/docs/2022-03/index.html @@ -34,7 +34,7 @@ $ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu& $ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv $ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv "/> - + @@ -476,6 +476,8 @@ isNotNull(value.match('889'))
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -484,8 +486,6 @@ isNotNull(value.match('889'))
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2022-04/index.html b/docs/2022-04/index.html index 8602efff5..68e28c3bf 100644 --- a/docs/2022-04/index.html +++ b/docs/2022-04/index.html @@ -11,14 +11,14 @@ - + - + @@ -28,9 +28,9 @@ "@type": "BlogPosting", "headline": "April, 2022", "url": "https://alanorth.github.io/cgspace-notes/2022-04/", - "wordCount": "1947", + "wordCount": "2015", "datePublished": "2022-04-01T10:53:39+03:00", - "dateModified": "2022-04-27T09:58:45+03:00", + "dateModified": "2022-04-28T08:49:31+03:00", "author": { "@type": "Person", "name": "Alan Orth" @@ -477,6 +477,16 @@ +

2022-04-28

+ @@ -499,6 +509,8 @@
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -507,8 +519,6 @@
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/2022-05/index.html b/docs/2022-05/index.html new file mode 100644 index 000000000..42f884b30 --- /dev/null +++ b/docs/2022-05/index.html @@ -0,0 +1,274 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + May, 2022 | CGSpace Notes + + + + + + + + + + + + + + + + + + + + + +
+
+ +
+
+ + + + +
+
+

CGSpace Notes

+

Documenting day-to-day work on the CGSpace repository.

+
+
+ + + + +
+
+
+ + + + +
+
+

May, 2022

+ +
+

2022-05-04

+
    +
  • I found a few more IPs making requests using the shady Chrome 44 user agent in the last few days so I will add them to the block list too: +
      +
    • 18.207.136.176
    • +
    • 185.189.36.248
    • +
    • 50.118.223.78
    • +
    • 52.70.76.123
    • +
    • 3.236.10.11
    • +
    +
  • +
  • Looking at the Solr statistics for 2022-04 +
      +
    • 52.191.137.59 is Microsoft, but they are using a normal user agent and making tens of thousands of requests
    • +
    • 64.39.98.62 is owned by Qualys, and all their requests are probing for /etc/passwd etc
    • +
    • 185.192.69.15 is in the Netherlands and is using a normal user agent, but making excessive automated HTTP requests to paths forbidden in robots.txt
    • +
    • 157.55.39.159 is owned by Microsoft and identifies as bingbot so I don’t know why its requests were logged in Solr
    • +
    • 52.233.67.176 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests
    • +
    • 157.55.39.144 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests
    • +
    • 207.46.13.177 is owned by Microsoft and identifies as bingbot so I don’t know why its requests were logged in Solr
    • +
    • If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a handful of IPs that made 41,000 requests
    • +
    +
  • +
  • I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script (a rough sketch of the equivalent Solr queries follows this list)
  • +
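For my own reference, the Solr side of the above boils down to queries like these (a sketch, not the exact commands I ran; the statistics core on localhost:8081 is our usual setup and the facet parameters are only illustrative). First, faceting the 2022-04 hits by IP for requests whose reverse DNS looks like msnbot:
$ curl -s 'http://localhost:8081/solr/statistics/select' \
    --data-urlencode 'q=time:2022-04* AND dns:*msnbot* AND dns:*.msn.com.' \
    --data-urlencode 'rows=0' \
    --data-urlencode 'facet=true' \
    --data-urlencode 'facet.field=ip' \
    --data-urlencode 'wt=json'
Then purging a given IP is essentially a delete-by-query against the same core, which is roughly what check-spider-ip-hits.sh does for each IP in its input list:
$ curl -s 'http://localhost:8081/solr/statistics/update?commit=true' \
    -H 'Content-Type: text/xml' \
    --data-binary '<delete><query>ip:52.191.137.59</query></delete>'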
+
    +
  • Now looking at the Solr statistics by user agent I see: +
      +
    • SomeRandomText
    • +
    • RestSharp/106.11.7.0
    • +
    • MetaInspector/5.7.0 (+https://github.com/jaimeiniesta/metainspector)
    • +
    • wp_is_mobile
    • +
    • Mozilla/5.0 (compatible; um-LN/1.0; mailto: techinfo@ubermetrics-technologies.com; Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1"
    • +
    • insomnia/2022.2.1
    • +
    • ZoteroTranslationServer
    • +
    • omgili/0.5 +http://omgili.com
    • +
    • curb
    • +
    • Sprout Social (Link Attachment)
    • +
    +
  • +
  • I purged 2,900 hits from these user agents from Solr using my check-spider-hits.sh script
  • +
  • I made a pull request to COUNTER-Robots for some of these agents +
      +
    • In the meantime I will add them to our local overrides in DSpace (see the sketch of those override patterns below)
    • +
    +
  • +
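For reference, the local overrides are just regular expressions, one per line, in files under dspace/config/spiders/agents/, so adding these agents is a matter of appending conservative patterns there. A sketch only — the file name below is a placeholder for our local file, and any pattern containing regex metacharacters like parentheses would need escaping:
$ cat >> dspace/config/spiders/agents/local-overrides << 'EOF'
SomeRandomText
RestSharp
MetaInspector
wp_is_mobile
um-LN
insomnia
ZoteroTranslationServer
omgili
curb
Sprout Social
EOF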
+ + + + + + +
+ + + +
+ + + + +
+
+ + + + + + + + + diff --git a/docs/404.html b/docs/404.html index 5c30c95f4..c12791d22 100644 --- a/docs/404.html +++ b/docs/404.html @@ -17,7 +17,7 @@ - + @@ -95,6 +95,8 @@
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -103,8 +105,6 @@
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/categories/index.html b/docs/categories/index.html index 28cc944e0..bc5afcc14 100644 --- a/docs/categories/index.html +++ b/docs/categories/index.html @@ -10,14 +10,14 @@ - + - + @@ -84,7 +84,7 @@

Notes

- +
Read more → @@ -108,6 +108,8 @@
    +
  1. May, 2022
  2. +
  3. April, 2022
  4. March, 2022
  5. @@ -116,8 +118,6 @@
  6. January, 2022
  7. -
  8. December, 2021
  9. -
diff --git a/docs/categories/index.xml b/docs/categories/index.xml index a62cb3e0b..a19330ada 100644 --- a/docs/categories/index.xml +++ b/docs/categories/index.xml @@ -6,11 +6,11 @@ Recent content in Categories on CGSpace Notes Hugo -- gohugo.io en-us - Fri, 01 Apr 2022 10:53:39 +0300 + Wed, 04 May 2022 09:13:39 +0300 Notes https://alanorth.github.io/cgspace-notes/categories/notes/ - Fri, 01 Apr 2022 10:53:39 +0300 + Wed, 04 May 2022 09:13:39 +0300 https://alanorth.github.io/cgspace-notes/categories/notes/ diff --git a/docs/categories/notes/index.html b/docs/categories/notes/index.html index f87af5e29..0f668535c 100644 --- a/docs/categories/notes/index.html +++ b/docs/categories/notes/index.html @@ -10,14 +10,14 @@ - + - + @@ -81,6 +81,48 @@ +
+
+

May, 2022

+ +
+

2022-05-04

+
    +
  • I found a few more IPs making requests using the shady Chrome 44 user agent in the last few days so I will add them to the block list too: +
      +
    • 18.207.136.176
    • +
    • 185.189.36.248
    • +
    • 50.118.223.78
    • +
    • 52.70.76.123
    • +
    • 3.236.10.11
    • +
    +
  • +
  • Looking at the Solr statistics for 2022-04 +
      +
    • 52.191.137.59 is Microsoft, but they are using a normal user agent and making tens of thousands of requests
    • +
    • 64.39.98.62 is owned by Qualys, and all their requests are probing for /etc/passwd etc
    • +
    • 185.192.69.15 is in the Netherlands and is using a normal user agent, but making excessive automated HTTP requests to paths forbidden in robots.txt
    • +
    • 157.55.39.159 is owned by Microsoft and identifies as bingbot so I don’t know why its requests were logged in Solr
    • +
    • 52.233.67.176 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests
    • +
    • 157.55.39.144 is owned by Microsoft and uses a normal user agent, but making excessive automated HTTP requests
    • +
    • 207.46.13.177 is owned by Microsoft and identifies as bingbot so I don’t know why its requests were logged in Solr
    • +
    • If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a handful of IPs that made 41,000 requests
    • +
    +
  • +
  • I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script
  • +
+ Read more → +
+ + + + + +

April, 2022

@@ -317,30 +359,6 @@ - -
-
-

July, 2021

- -
-

2021-07-01

-
    -
  • Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVOC for Enrico:
  • -
-
localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER;
-COPY 20994
-
- Read more → -
- - - - -