Fixed 1 occurences of: Amanda De Filippo: 0000-0002-1536-3221
...
```

## 2022-09-29

- I've been checking the size of the nginx proxy cache over the last few days and it always seems to hover around 14,000 entries and 385MB:

```console
# find /var/cache/nginx/rest_cache/ -type f | wc -l
14202
# du -sh /var/cache/nginx/rest_cache
384M    /var/cache/nginx/rest_cache
```

- Also on that note, I'm trying to implement a workaround for a potential caching issue that prevents MEL from updating items on DSpace Test
  - I *think* we might need to allow requests with a JSESSIONID to bypass the cache, but I have to verify with Salem
  - We can do this with an nginx map:

```console
# Check if the JSESSIONID cookie is present and contains a 32-character hex
# value, which would mean that a user is actively attempting to re-use their
# Tomcat session. Then we set the $active_user_session variable and use it
# to bypass the nginx proxy cache in REST requests.
map $cookie_jsessionid $active_user_session {
    # requests with an empty key are not evaluated by limit_req
    # see: http://nginx.org/en/docs/http/ngx_http_limit_req_module.html
    default '';

    '~[A-Z0-9]{32}' 1;
}
```

- Then in the location block where we do the proxy cache:

```console
            # Don't cache when user Shift-refreshes (Cache-Control: no-cache) or
            # when a client has an active session (see the $cookie_jsessionid map).
            proxy_cache_bypass $http_cache_control $active_user_session;
            proxy_no_cache $http_cache_control $active_user_session;
```

- I found one client making 10,000 requests using a Windows 98 user agent:

```console
Mozilla/4.0 (compatible; MSIE 5.00; Windows 98)
```

- They all come from one IP address (129.227.149.43) in Hong Kong
  - The IP belongs to a hosting provider called Zenlayer
  - I will add this IP to the nginx bot networks and purge its hits

```console
$ ./ilri/check-spider-ip-hits.sh -f /tmp/ip -p
Purging 33027 hits from 129.227.149.43 in statistics

Total number of bot hits purged: 33027
```

- So it seems we've seen this bot before, and its total is much higher than the 10,000 hits from this month
- I had a call with Salem and we verified that the nginx cache bypass for clients that provide a JSESSIONID fixes their issue with updating items/bitstreams from MEL (a quick curl spot check is sketched below)
  - The issue was that they delete all metadata and bitstreams, then add them again to make sure everything is up to date; in that process they also re-request the item with all expands to get the bitstreams, and because that response gets cached, they end up trying to delete the old bitstream
- I also noticed that someone made a [pull request to enable POSTing bitstreams to a particular bundle](https://github.com/DSpace/DSpace/pull/8343) and it works, so that's awesome!
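- To spot check the bypass from the command line, something like the following should work; this is only a sketch, since it assumes the REST location also exposes the upstream cache status with `add_header X-Cache-Status $upstream_cache_status;` (not part of the config shown above), and the item path and JSESSIONID value are just placeholders:

```console
# Plain request: repeat it a couple of times and X-Cache-Status should
# eventually report HIT once nginx has cached the response
$ curl -s -o /dev/null -D - 'https://dspacetest.cgiar.org/rest/items/some-item-id?expand=bitstreams' | grep -i x-cache-status
# Same request with a Tomcat-style 32-character session cookie: the map
# above sets $active_user_session, so this should always report BYPASS
$ curl -s -o /dev/null -D - -b 'JSESSIONID=0123456789ABCDEF0123456789ABCDEF' 'https://dspacetest.cgiar.org/rest/items/some-item-id?expand=bitstreams' | grep -i x-cache-status
```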
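- On the subject of the Zenlayer bot, a rough way to see how many hits an IP has before purging is to query the Solr statistics core directly; this is just a sketch using the core location that appears elsewhere in these notes, so adjust the port and core name if the setup differs:

```console
# Count the statistics hits recorded for this IP (look at numFound)
$ curl -s 'http://localhost:8081/solr/statistics/select?q=ip:129.227.149.43&rows=0&wt=json' | python3 -m json.tool | grep numFound
```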

## 2022-09-30

- I applied [the patch for POSTing bitstreams to other bundles](https://github.com/DSpace/DSpace/pull/8343) on CGSpace
- Testing a few other DSpace 6.4 patches on DSpace Test:
  - [DS-3791 Make sure the "yearDifference" takes into account that a gap of 10 year contains 11 years](https://github.com/DSpace/DSpace/pull/1901)
  - [DS-3873 Limit the usage of PDFBoxThumbnail to PDFs](https://github.com/DSpace/DSpace/pull/2501)
  - [Reduce itemCounter init](https://github.com/DSpace/DSpace/pull/2161)
  - [ImageMagick: Only execute "identify" on first page](https://github.com/DSpace/DSpace/pull/2201)
  - [DS-3881: Show no total results on search-filter](https://github.com/DSpace/DSpace/pull/2371)
  - [pass value instead of qualifier to method](https://github.com/DSpace/DSpace/pull/2699)
  - [dspace-api: check for null AND empty qualifier in findByElement()](https://github.com/DSpace/DSpace/pull/7993)
  - [Avoid exporting mapped Item more than once](https://github.com/DSpace/DSpace/pull/7995)
  - [[DS-4574] v. 6 - Upgrade DBCP2 dependency](https://github.com/DSpace/DSpace/pull/3162)
  - [bump up pdfbox version on 6.x to match main branch](https://github.com/DSpace/DSpace/pull/2742)
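- For reference, a rough sketch of how one of these pull requests can be pulled into a local branch for testing; the branch names here are made up and this is not necessarily the exact workflow used on DSpace Test:

```console
# Create a throwaway branch from the production branch (names are hypothetical)
$ git checkout -b 6_x-dev-64-patches 6_x-prod
# GitHub exposes every pull request as refs/pull/<number>/head, so we can
# fetch and merge one directly, for example the bundle POST patch (#8343)
$ git fetch https://github.com/DSpace/DSpace.git refs/pull/8343/head
$ git merge FETCH_HEAD
```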
"/> - + diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html index 7dd369f9b..60e0623db 100644 --- a/docs/2016-02/index.html +++ b/docs/2016-02/index.html @@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace: Not only are there 49,000 countries, we have some blanks (25)… Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE” "/> - + diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html index 21373d292..ae2fe011c 100644 --- a/docs/2016-03/index.html +++ b/docs/2016-03/index.html @@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server "/> - + diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html index e7cfb6a0e..55065454a 100644 --- a/docs/2016-04/index.html +++ b/docs/2016-04/index.html @@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any This will save us a few gigs of backup space we’re paying for on S3 Also, I noticed the checker log has some errors we should pay attention to: "/> - + diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html index a87cb6616..d1b99d72e 100644 --- a/docs/2016-05/index.html +++ b/docs/2016-05/index.html @@ -34,7 +34,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period! # awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l 3168 "/> - + diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html index 7cbb33182..485603067 100644 --- a/docs/2016-06/index.html +++ b/docs/2016-06/index.html @@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship "/> - + diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html index 1d825f3f6..20839f6ac 100644 --- a/docs/2016-07/index.html +++ b/docs/2016-07/index.html @@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and In this case the select query was showing 95 results before the update "/> - + diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html index c44e2fc3f..51074936c 100644 --- a/docs/2016-08/index.html +++ b/docs/2016-08/index.html @@ -42,7 +42,7 @@ $ git checkout -b 55new 5_x-prod $ git reset --hard ilri/5_x-prod $ git rebase -i dspace-5.5 "/> - + diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html index 6ab3089f8..b78b2144b 100644 --- a/docs/2016-09/index.html +++ b/docs/2016-09/index.html @@ -34,7 +34,7 @@ It looks like we might be able to use OUs now, instead of DCs: $ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)" "/> - + diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html index c713905ca..0c35c6c4b 100644 --- a/docs/2016-10/index.html +++ b/docs/2016-10/index.html @@ -42,7 +42,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id 0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X "/> - + diff --git 
a/docs/2016-11/index.html b/docs/2016-11/index.html index 1b3dbfd7a..9c2200f6b 100644 --- a/docs/2016-11/index.html +++ b/docs/2016-11/index.html @@ -26,7 +26,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module Add dc.type to the output options for Atmire’s Listings and Reports module (#286) "/> - + diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html index d96b5e4e6..21beafc78 100644 --- a/docs/2016-12/index.html +++ b/docs/2016-12/index.html @@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r I’ve raised a ticket with Atmire to ask Another worrying error from dspace.log is: "/> - + diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html index b63145f2a..ef96e258d 100644 --- a/docs/2017-01/index.html +++ b/docs/2017-01/index.html @@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s I tested on DSpace Test as well and it doesn’t work there either I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years "/> - + diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html index f43fd8b85..729ac5ae5 100644 --- a/docs/2017-02/index.html +++ b/docs/2017-02/index.html @@ -50,7 +50,7 @@ DELETE 1 Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301) Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name "/> - + diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html index 130b448d4..1c97936fd 100644 --- a/docs/2017-03/index.html +++ b/docs/2017-03/index.html @@ -54,7 +54,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg $ identify ~/Desktop/alc_contrastes_desafios.jpg /Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000 "/> - + diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html index c5be98613..2e6cf51f9 100644 --- a/docs/2017-04/index.html +++ b/docs/2017-04/index.html @@ -40,7 +40,7 @@ Testing the CMYK patch on a collection with 650 items: $ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt "/> - + diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html index f59474320..ab263cb82 100644 --- a/docs/2017-05/index.html +++ b/docs/2017-05/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html index c96e23e7e..d054b445d 100644 --- a/docs/2017-06/index.html +++ b/docs/2017-06/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html index d286d2f7a..1e0e93e86 100644 --- a/docs/2017-07/index.html +++ b/docs/2017-07/index.html @@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329) Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML: "/> - + diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html index 512f5204e..002d6504d 100644 --- a/docs/2017-08/index.html +++ b/docs/2017-08/index.html @@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d Then I cleaned up the author authorities and HTML characters in OpenRefine and 
sent the file back to Abenet "/> - + diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html index c0043c376..9c4dd9a5a 100644 --- a/docs/2017-09/index.html +++ b/docs/2017-09/index.html @@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group "/> - + diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html index cf857c2d6..7859936f5 100644 --- a/docs/2017-10/index.html +++ b/docs/2017-10/index.html @@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336 There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections "/> - + diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html index 94780fe06..97a1410cd 100644 --- a/docs/2017-11/index.html +++ b/docs/2017-11/index.html @@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct: dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv; COPY 54701 "/> - + diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html index b4199998e..71ab2b610 100644 --- a/docs/2017-12/index.html +++ b/docs/2017-12/index.html @@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object” PostgreSQL activity says there are 115 connections currently The list of connections to XMLUI and REST API for today: "/> - + diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html index 7340d10b3..d3fb70954 100644 --- a/docs/2018-01/index.html +++ b/docs/2018-01/index.html @@ -150,7 +150,7 @@ dspace.log.2018-01-02:34 Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains "/> - + diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html index 3a9e51eb7..f73351f7e 100644 --- a/docs/2018-02/index.html +++ b/docs/2018-02/index.html @@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that Yesterday I figured out how to monitor DSpace sessions using JMX I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01 "/> - + diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html index 59be9d262..0a4b82424 100644 --- a/docs/2018-03/index.html +++ b/docs/2018-03/index.html @@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller Export a CSV of the IITA community metadata for Martin Mueller "/> - + diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html index 6108ef00c..0af6c02d0 100644 --- a/docs/2018-04/index.html +++ b/docs/2018-04/index.html @@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday: I tried to test something on DSpace Test but noticed that it’s down since god knows when Catalina logs at least show some memory errors yesterday: "/> - + diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html index 79bc243c5..ee2eb02b9 100644 --- a/docs/2018-05/index.html +++ b/docs/2018-05/index.html @@ -38,7 +38,7 @@ 
http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E Then I reduced the JVM heap size from 6144 back to 5120m Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use "/> - + diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html index acd48541d..7e344aa2d 100644 --- a/docs/2018-06/index.html +++ b/docs/2018-06/index.html @@ -58,7 +58,7 @@ real 74m42.646s user 8m5.056s sys 2m7.289s "/> - + diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html index 289ec252f..8d6f7848c 100644 --- a/docs/2018-07/index.html +++ b/docs/2018-07/index.html @@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r There is insufficient memory for the Java Runtime Environment to continue. "/> - + diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html index e1ee08b76..2858e8816 100644 --- a/docs/2018-08/index.html +++ b/docs/2018-08/index.html @@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes I ran all system updates on DSpace Test and rebooted it "/> - + diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html index 0a90098e3..42b3e8b6e 100644 --- a/docs/2018-09/index.html +++ b/docs/2018-09/index.html @@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again: "/> - + diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html index 731025be6..305cbeb26 100644 --- a/docs/2018-10/index.html +++ b/docs/2018-10/index.html @@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now "/> - + diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html index 2bf4e58e2..d304b4989 100644 --- a/docs/2018-11/index.html +++ b/docs/2018-11/index.html @@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage Today these are the top 10 IPs: "/> - + diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html index 81038f358..8e819812e 100644 --- a/docs/2018-12/index.html +++ b/docs/2018-12/index.html @@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week "/> - + diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html index 8c58ab213..4d56c5c77 100644 --- a/docs/2019-01/index.html +++ b/docs/2019-01/index.html @@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t 357 207.46.13.1 903 54.70.40.11 "/> - + diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html index 9b84f581d..43cc2929b 100644 
--- a/docs/2019-02/index.html +++ b/docs/2019-02/index.html @@ -72,7 +72,7 @@ real 0m19.873s user 0m22.203s sys 0m1.979s "/> - + diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html index 431bb7d22..6bb6dd0c8 100644 --- a/docs/2019-03/index.html +++ b/docs/2019-03/index.html @@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs "/> - + diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html index d3b07fc66..80b89109e 100644 --- a/docs/2019-04/index.html +++ b/docs/2019-04/index.html @@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d "/> - + diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html index c72498316..ab368bd7a 100644 --- a/docs/2019-05/index.html +++ b/docs/2019-05/index.html @@ -48,7 +48,7 @@ DELETE 1 But after this I tried to delete the item from the XMLUI and it is still present… "/> - + diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html index f10f6579d..51168863f 100644 --- a/docs/2019-06/index.html +++ b/docs/2019-06/index.html @@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it Skype with Marie-Angélique and Abenet about CG Core v2 "/> - + diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html index 0c7c2d1a0..d79ee6f89 100644 --- a/docs/2019-07/index.html +++ b/docs/2019-07/index.html @@ -38,7 +38,7 @@ CGSpace Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community "/> - + diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html index 04f7197cb..ab74fd55f 100644 --- a/docs/2019-08/index.html +++ b/docs/2019-08/index.html @@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck Run system updates on DSpace Test (linode19) and reboot it "/> - + diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html index 550c3bac7..035d32bb4 100644 --- a/docs/2019-09/index.html +++ b/docs/2019-09/index.html @@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning: 7249 2a01:7e00::f03c:91ff:fe18:7396 9124 45.5.186.2 "/> - + diff --git a/docs/2019-10/index.html b/docs/2019-10/index.html index a98215d3b..548f6d64c 100644 --- a/docs/2019-10/index.html +++ b/docs/2019-10/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2019-11/index.html b/docs/2019-11/index.html index d3c9c4051..fa8858f27 100644 --- a/docs/2019-11/index.html +++ b/docs/2019-11/index.html @@ -58,7 +58,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t # zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams" 106781 "/> - + diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html index 02edea322..aeb5a42fe 100644 --- a/docs/2019-12/index.html +++ b/docs/2019-12/index.html @@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the # dpkg -C # reboot "/> - + diff --git a/docs/2020-01/index.html b/docs/2020-01/index.html index daa871948..ae5918ad2 100644 --- a/docs/2020-01/index.html +++ b/docs/2020-01/index.html @@ -56,7 +56,7 @@ I tweeted the CGSpace 
repository link "/> - + diff --git a/docs/2020-02/index.html b/docs/2020-02/index.html index 9f7313512..84ad91582 100644 --- a/docs/2020-02/index.html +++ b/docs/2020-02/index.html @@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install "/> - + diff --git a/docs/2020-03/index.html b/docs/2020-03/index.html index 374eb9456..3862b936c 100644 --- a/docs/2020-03/index.html +++ b/docs/2020-03/index.html @@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it "/> - + diff --git a/docs/2020-04/index.html b/docs/2020-04/index.html index a78e686d8..0a120554e 100644 --- a/docs/2020-04/index.html +++ b/docs/2020-04/index.html @@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week "/> - + diff --git a/docs/2020-05/index.html b/docs/2020-05/index.html index fcaa0cce9..ada72c9f8 100644 --- a/docs/2020-05/index.html +++ b/docs/2020-05/index.html @@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2 "/> - + diff --git a/docs/2020-06/index.html b/docs/2020-06/index.html index af7d0978e..abdc23312 100644 --- a/docs/2020-06/index.html +++ b/docs/2020-06/index.html @@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to In other news, I checked the statistics API on DSpace 6 and it’s working I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error: "/> - + diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html index 68758a927..fcc0471d1 100644 --- a/docs/2020-07/index.html +++ b/docs/2020-07/index.html @@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request "/> - + diff --git a/docs/2020-08/index.html b/docs/2020-08/index.html index 44912228a..c44bc3bd8 100644 --- a/docs/2020-08/index.html +++ b/docs/2020-08/index.html @@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te "/> - + diff --git a/docs/2020-09/index.html b/docs/2020-09/index.html index 422cdc322..8f4936253 100644 --- a/docs/2020-09/index.html +++ b/docs/2020-09/index.html @@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39 I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40 "/> - + diff --git a/docs/2020-10/index.html b/docs/2020-10/index.html index 06792e2b8..81255ac0c 100644 --- a/docs/2020-10/index.html +++ b/docs/2020-10/index.html @@ -44,7 +44,7 @@ During the FlywayDB migration I got an error: "/> - + diff --git a/docs/2020-11/index.html b/docs/2020-11/index.html index 68af7f3e8..5abc166b4 100644 --- a/docs/2020-11/index.html +++ b/docs/2020-11/index.html @@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat "/> - + diff --git a/docs/2020-12/index.html b/docs/2020-12/index.html index a6371870b..c307f1230 100644 --- a/docs/2020-12/index.html +++ b/docs/2020-12/index.html @@ -36,7 +36,7 @@ I started processing those (about 411,000 records): "/> - + diff --git a/docs/2021-01/index.html b/docs/2021-01/index.html index fd3a1c7a8..7edc612af 100644 --- a/docs/2021-01/index.html +++ b/docs/2021-01/index.html @@ -50,7 +50,7 @@ For 
example, this item has 51 views on CGSpace, but 0 on AReS "/> - + diff --git a/docs/2021-02/index.html b/docs/2021-02/index.html index 6be4ec581..06fd7c0fb 100644 --- a/docs/2021-02/index.html +++ b/docs/2021-02/index.html @@ -60,7 +60,7 @@ $ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty } } "/> - + diff --git a/docs/2021-03/index.html b/docs/2021-03/index.html index 2b005b932..0c756c298 100644 --- a/docs/2021-03/index.html +++ b/docs/2021-03/index.html @@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst "/> - + diff --git a/docs/2021-04/index.html b/docs/2021-04/index.html index c190583d9..32a04db6e 100644 --- a/docs/2021-04/index.html +++ b/docs/2021-04/index.html @@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in "/> - + diff --git a/docs/2021-05/index.html b/docs/2021-05/index.html index fb7080e3d..d77e9fd33 100644 --- a/docs/2021-05/index.html +++ b/docs/2021-05/index.html @@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user… "/> - + diff --git a/docs/2021-06/index.html b/docs/2021-06/index.html index 3caee853a..9458ed6dd 100644 --- a/docs/2021-06/index.html +++ b/docs/2021-06/index.html @@ -36,7 +36,7 @@ I simply started it and AReS was running again: "/> - + diff --git a/docs/2021-07/index.html b/docs/2021-07/index.html index 0a9380c5d..389e33b12 100644 --- a/docs/2021-07/index.html +++ b/docs/2021-07/index.html @@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER; COPY 20994 "/> - + diff --git a/docs/2021-08/index.html b/docs/2021-08/index.html index 71ed471e3..9694b39fa 100644 --- a/docs/2021-08/index.html +++ b/docs/2021-08/index.html @@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server: I decided to upgrade linode20 from Ubuntu 18.04 to 20.04 "/> - + diff --git a/docs/2021-09/index.html b/docs/2021-09/index.html index 2926ac42a..914935c93 100644 --- a/docs/2021-09/index.html +++ b/docs/2021-09/index.html @@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu "/> - + diff --git a/docs/2021-10/index.html b/docs/2021-10/index.html index 44bcb62e4..a0d3407b4 100644 --- a/docs/2021-10/index.html +++ b/docs/2021-10/index.html @@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt So we have 1879/7100 (26.46%) matching already "/> - + diff --git a/docs/2021-11/index.html b/docs/2021-11/index.html index 4e7d3808f..fa79a15a8 100644 --- a/docs/2021-11/index.html +++ b/docs/2021-11/index.html @@ -32,7 +32,7 @@ First I exported all the 2019 stats from CGSpace: $ ./run.sh -s http://localhost:8081/solr/statistics -f 'time:2019-*' -a export -o statistics-2019.json -k uid $ zstd statistics-2019.json "/> - + diff --git a/docs/2021-12/index.html b/docs/2021-12/index.html index 56e9315e8..c49e0c500 100644 --- a/docs/2021-12/index.html +++ 
b/docs/2021-12/index.html @@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics Total number of bot hits purged: 3679 "/> - + diff --git a/docs/2022-01/index.html b/docs/2022-01/index.html index f5c2c78c0..6a2446f72 100644 --- a/docs/2022-01/index.html +++ b/docs/2022-01/index.html @@ -24,7 +24,7 @@ Start a full harvest on AReS Start a full harvest on AReS "/> - + diff --git a/docs/2022-02/index.html b/docs/2022-02/index.html index 63acd0535..97999b8e1 100644 --- a/docs/2022-02/index.html +++ b/docs/2022-02/index.html @@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR "/> - + diff --git a/docs/2022-03/index.html b/docs/2022-03/index.html index 686b3fecc..fe8ac4aeb 100644 --- a/docs/2022-03/index.html +++ b/docs/2022-03/index.html @@ -34,7 +34,7 @@ $ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu& $ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv $ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv "/> - + diff --git a/docs/2022-04/index.html b/docs/2022-04/index.html index 8b1f3cfc0..a0adc48f0 100644 --- a/docs/2022-04/index.html +++ b/docs/2022-04/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2022-05/index.html b/docs/2022-05/index.html index 0a268cd3f..44ee3cc99 100644 --- a/docs/2022-05/index.html +++ b/docs/2022-05/index.html @@ -66,7 +66,7 @@ If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a ha I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script "/> - + diff --git a/docs/2022-06/index.html b/docs/2022-06/index.html index da1b82abf..5b0073974 100644 --- a/docs/2022-06/index.html +++ b/docs/2022-06/index.html @@ -48,7 +48,7 @@ There seem to be many more of these: "/> - + diff --git a/docs/2022-07/index.html b/docs/2022-07/index.html index 3486874e7..f3d160c28 100644 --- a/docs/2022-07/index.html +++ b/docs/2022-07/index.html @@ -34,7 +34,7 @@ Also, the trgm functions I’ve used before are case insensitive, but Levens "/> - + diff --git a/docs/2022-08/index.html b/docs/2022-08/index.html index 405359a0f..a27d66893 100644 --- a/docs/2022-08/index.html +++ b/docs/2022-08/index.html @@ -24,7 +24,7 @@ Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago "/> - + diff --git a/docs/2022-09/index.html b/docs/2022-09/index.html index 654ca81e1..711a63d58 100644 --- a/docs/2022-09/index.html +++ b/docs/2022-09/index.html @@ -25,7 +25,7 @@ I also fixed a few bugs and improved the region-matching logic - + @@ -46,7 +46,7 @@ I also fixed a few bugs and improved the region-matching logic "/> - + @@ -56,9 +56,9 @@ I also fixed a few bugs and improved the region-matching logic "@type": "BlogPosting", "headline": "September, 2022", "url": "https://alanorth.github.io/cgspace-notes/2022-09/", - "wordCount": "3112", + "wordCount": "3621", "datePublished": "2022-09-01T09:41:36+03:00", - "dateModified": "2022-09-28T17:10:23+03:00", + "dateModified": "2022-09-28T21:22:59+03:00", "author": { "@type": "Person", "name": "Alan Orth" @@ -685,7 +685,84 @@ harvesting of meat from wildlife and not from livestock.

Fixed 3 occurences of: Alessandra Galie: 0000-0001-9868-7733 Fixed 1 occurences of: Amanda De Filippo: 0000-0002-1536-3221 ... - +

2022-09-29

+ +
# find /var/cache/nginx/rest_cache/ -type f | wc -l
+14202
+# du -sh /var/cache/nginx/rest_cache
+384M    /var/cache/nginx/rest_cache
+
+
# Check if the JSESSIONID cookie is present and contains a 32-character hex
+# value, which would mean that a user is actively attempting to re-use their
+# Tomcat session. Then we set the $active_user_session variable and use it
+# to bypass the nginx proxy cache in REST requests.
+map $cookie_jsessionid $active_user_session {
+    # requests with an empty key are not evaluated by limit_req
+    # see: http://nginx.org/en/docs/http/ngx_http_limit_req_module.html
+    default '';
+
+    '~[A-Z0-9]{32}' 1;
+}
+
+
            # Don't cache when user Shift-refreshes (Cache-Control: no-cache) or
+            # when a client has an active session (see the $cookie_jsessionid map).
+            proxy_cache_bypass $http_cache_control $active_user_session;
+            proxy_no_cache $http_cache_control $active_user_session;
+
+
Mozilla/4.0 (compatible; MSIE 5.00; Windows 98)
+
+
$ ./ilri/check-spider-ip-hits.sh -f /tmp/ip -p
+Purging 33027 hits from 129.227.149.43 in statistics
+
+Total number of bot hits purged: 33027
+
+

2022-09-30

+ + diff --git a/docs/404.html b/docs/404.html index 49af71e02..864391a65 100644 --- a/docs/404.html +++ b/docs/404.html @@ -17,7 +17,7 @@ - + diff --git a/docs/categories/index.html b/docs/categories/index.html index 3306e8f57..f31ee4a9c 100644 --- a/docs/categories/index.html +++ b/docs/categories/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/index.html b/docs/categories/notes/index.html index fabf8b865..91ea3afbf 100644 --- a/docs/categories/notes/index.html +++ b/docs/categories/notes/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/page/2/index.html b/docs/categories/notes/page/2/index.html index 66b79b061..ef1801691 100644 --- a/docs/categories/notes/page/2/index.html +++ b/docs/categories/notes/page/2/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/page/3/index.html b/docs/categories/notes/page/3/index.html index 84ae17385..9d9004b3e 100644 --- a/docs/categories/notes/page/3/index.html +++ b/docs/categories/notes/page/3/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/page/4/index.html b/docs/categories/notes/page/4/index.html index 8bcff8ace..ee33cd074 100644 --- a/docs/categories/notes/page/4/index.html +++ b/docs/categories/notes/page/4/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/page/5/index.html b/docs/categories/notes/page/5/index.html index 04c2db950..bcf00219e 100644 --- a/docs/categories/notes/page/5/index.html +++ b/docs/categories/notes/page/5/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/page/6/index.html b/docs/categories/notes/page/6/index.html index e23411d19..bceb7f678 100644 --- a/docs/categories/notes/page/6/index.html +++ b/docs/categories/notes/page/6/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/categories/notes/page/7/index.html b/docs/categories/notes/page/7/index.html index 8155ead52..17cb4cc5b 100644 --- a/docs/categories/notes/page/7/index.html +++ b/docs/categories/notes/page/7/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/cgiar-library-migration/index.html b/docs/cgiar-library-migration/index.html index 177f42ff1..7144885fc 100644 --- a/docs/cgiar-library-migration/index.html +++ b/docs/cgiar-library-migration/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/cgspace-cgcorev2-migration/index.html b/docs/cgspace-cgcorev2-migration/index.html index 41460e33d..f586ac4db 100644 --- a/docs/cgspace-cgcorev2-migration/index.html +++ b/docs/cgspace-cgcorev2-migration/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/cgspace-dspace6-upgrade/index.html b/docs/cgspace-dspace6-upgrade/index.html index 7c2971cc2..8ccff7a08 100644 --- a/docs/cgspace-dspace6-upgrade/index.html +++ b/docs/cgspace-dspace6-upgrade/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/index.html b/docs/index.html index 71db70cb8..48ba97e48 100644 --- a/docs/index.html +++ b/docs/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/2/index.html b/docs/page/2/index.html index 133884319..b964c930d 100644 --- a/docs/page/2/index.html +++ b/docs/page/2/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/3/index.html b/docs/page/3/index.html index a58f1702d..4c55cde1c 100644 --- a/docs/page/3/index.html +++ b/docs/page/3/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/4/index.html b/docs/page/4/index.html index bf2f194b0..87d333f62 100644 --- a/docs/page/4/index.html +++ b/docs/page/4/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/5/index.html b/docs/page/5/index.html 
index af1f6f676..32f4675d9 100644 --- a/docs/page/5/index.html +++ b/docs/page/5/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/6/index.html b/docs/page/6/index.html index e10d55edc..ab0fa077a 100644 --- a/docs/page/6/index.html +++ b/docs/page/6/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/7/index.html b/docs/page/7/index.html index 59a0c6596..3cd4d6a30 100644 --- a/docs/page/7/index.html +++ b/docs/page/7/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/8/index.html b/docs/page/8/index.html index e8f047387..82c55bedf 100644 --- a/docs/page/8/index.html +++ b/docs/page/8/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/page/9/index.html b/docs/page/9/index.html index 27a160196..f8e81e257 100644 --- a/docs/page/9/index.html +++ b/docs/page/9/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/index.html b/docs/posts/index.html index 04caae457..7a72ae01f 100644 --- a/docs/posts/index.html +++ b/docs/posts/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/2/index.html b/docs/posts/page/2/index.html index 893f79340..9e36bf57e 100644 --- a/docs/posts/page/2/index.html +++ b/docs/posts/page/2/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/3/index.html b/docs/posts/page/3/index.html index 9b99252f8..02bf0e403 100644 --- a/docs/posts/page/3/index.html +++ b/docs/posts/page/3/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/4/index.html b/docs/posts/page/4/index.html index ebd715136..af5c67dea 100644 --- a/docs/posts/page/4/index.html +++ b/docs/posts/page/4/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/5/index.html b/docs/posts/page/5/index.html index 398d4aeaf..377f9378d 100644 --- a/docs/posts/page/5/index.html +++ b/docs/posts/page/5/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/6/index.html b/docs/posts/page/6/index.html index 3c0e1c5fb..4544f8658 100644 --- a/docs/posts/page/6/index.html +++ b/docs/posts/page/6/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/7/index.html b/docs/posts/page/7/index.html index 01c6dc798..76f9f832e 100644 --- a/docs/posts/page/7/index.html +++ b/docs/posts/page/7/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/8/index.html b/docs/posts/page/8/index.html index eea6ad26a..b1aeff28f 100644 --- a/docs/posts/page/8/index.html +++ b/docs/posts/page/8/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/posts/page/9/index.html b/docs/posts/page/9/index.html index b898ba434..9c3e3e60a 100644 --- a/docs/posts/page/9/index.html +++ b/docs/posts/page/9/index.html @@ -10,14 +10,14 @@ - + - + diff --git a/docs/sitemap.xml b/docs/sitemap.xml index 7afd200f2..5e21dae8e 100644 --- a/docs/sitemap.xml +++ b/docs/sitemap.xml @@ -3,19 +3,19 @@ xmlns:xhtml="http://www.w3.org/1999/xhtml"> https://alanorth.github.io/cgspace-notes/categories/ - 2022-09-28T17:10:23+03:00 + 2022-09-28T21:22:59+03:00 https://alanorth.github.io/cgspace-notes/ - 2022-09-28T17:10:23+03:00 + 2022-09-28T21:22:59+03:00 https://alanorth.github.io/cgspace-notes/categories/notes/ - 2022-09-28T17:10:23+03:00 + 2022-09-28T21:22:59+03:00 https://alanorth.github.io/cgspace-notes/posts/ - 2022-09-28T17:10:23+03:00 + 2022-09-28T21:22:59+03:00 https://alanorth.github.io/cgspace-notes/2022-09/ - 2022-09-28T17:10:23+03:00 + 2022-09-28T21:22:59+03:00 https://alanorth.github.io/cgspace-notes/2022-08/ 2022-09-27T14:35:26+03:00 diff --git a/docs/tags/index.html b/docs/tags/index.html index 6d12796a1..7248dfa57 100644 --- 
a/docs/tags/index.html +++ b/docs/tags/index.html @@ -17,7 +17,7 @@ - + diff --git a/docs/tags/migration/index.html b/docs/tags/migration/index.html index a9f223431..1987a412b 100644 --- a/docs/tags/migration/index.html +++ b/docs/tags/migration/index.html @@ -17,7 +17,7 @@ - + diff --git a/docs/tags/notes/index.html b/docs/tags/notes/index.html index 370669e79..eb563321b 100644 --- a/docs/tags/notes/index.html +++ b/docs/tags/notes/index.html @@ -17,7 +17,7 @@ - + diff --git a/docs/tags/notes/page/2/index.html b/docs/tags/notes/page/2/index.html index 512c2bd83..980ec1f2b 100644 --- a/docs/tags/notes/page/2/index.html +++ b/docs/tags/notes/page/2/index.html @@ -17,7 +17,7 @@ - + diff --git a/docs/tags/notes/page/3/index.html b/docs/tags/notes/page/3/index.html index b028b63c1..3a58b384e 100644 --- a/docs/tags/notes/page/3/index.html +++ b/docs/tags/notes/page/3/index.html @@ -17,7 +17,7 @@ - +