diff --git a/content/post/2017-11.md b/content/post/2017-11.md
index 785398ea3..b75f1a728 100644
--- a/content/post/2017-11.md
+++ b/content/post/2017-11.md
@@ -474,3 +474,5 @@ proxy_set_header User-Agent $ua;
 - I have been looking for a reason to ban Baidu and this is definitely a good one
 - Disallowing `Baiduspider` in `robots.txt` probably won't work because this bot doesn't seem to respect the robot exclusion standard anyways!
 - I will whip up something in nginx later
+- Run system updates on CGSpace and reboot the server
+- Re-deploy latest `5_x-prod` branch on CGSpace and DSpace Test (includes the clickable thumbnails, CCAFS phase II project tags, and updated news text)
diff --git a/public/2017-11/index.html b/public/2017-11/index.html
index 92d4ce565..8448e2932 100644
--- a/public/2017-11/index.html
+++ b/public/2017-11/index.html
@@ -38,7 +38,7 @@ COPY 54701
-
+
@@ -86,9 +86,9 @@ COPY 54701
   "@type": "BlogPosting",
   "headline": "November, 2017",
   "url": "https://alanorth.github.io/cgspace-notes/2017-11/",
-  "wordCount": "2694",
+  "wordCount": "2725",
   "datePublished": "2017-11-02T09:37:54+02:00",
-  "dateModified": "2017-11-08T15:20:49+02:00",
+  "dateModified": "2017-11-08T22:26:37+02:00",
   "author": {
     "@type": "Person",
     "name": "Alan Orth"
@@ -674,6 +674,8 @@ proxy_set_header User-Agent $ua;
 <li>Disallowing <code>Baiduspider</code> in <code>robots.txt</code> probably won’t work because this bot doesn’t seem to respect the robot exclusion standard anyways!</li>
+<li>Run system updates on CGSpace and reboot the server</li>
+<li>Re-deploy latest <code>5_x-prod</code> branch on CGSpace and DSpace Test (includes the clickable thumbnails, CCAFS phase II project tags, and updated news text)</li>
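The "whip up something in nginx later" item above implies blocking Baiduspider by user agent at the web server, since it ignores `robots.txt`. This is not the author's actual implementation — just a minimal sketch of one common approach, assuming the existing nginx proxy config where `$http_user_agent` is available; the variable name `$is_bad_bot` is hypothetical:

```nginx
# Hypothetical sketch, not from this diff: map the request's User-Agent
# to a flag, then refuse matching requests before they reach DSpace.
map $http_user_agent $is_bad_bot {
    default        0;
    ~*Baiduspider  1;    # case-insensitive match on the bot's UA string
}

server {
    # ... existing CGSpace server config ...

    if ($is_bad_bot) {
        return 403;      # reject the crawler outright
    }
}
```

The `map` block must live at the `http` level; using `map` plus a single `if`/`return` is the form the nginx documentation recommends over chains of `if` checks.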