<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="November, 2019" />
<meta property="og:description" content="2019-11-04
Peter noticed that there were 5.2 million hits on CGSpace in 2019-10 according to the Atmire usage statistics
I looked in the nginx logs and see 4.6 million in the access logs, and 1.2 million in the API logs:
# zcat --force /var/log/nginx/*access.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot;
4671942
# zcat --force /var/log/nginx/{rest,oai,statistics}.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot;
1277694
So 4.6 million from XMLUI and another 1.2 million from API requests
Let&rsquo;s see how many of the REST API requests were for bitstreams (because they are counted in Solr stats):
# zcat --force /var/log/nginx/rest.log.*.gz | grep -c -E &quot;[0-9]{1,2}/Oct/2019&quot;
1183456
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E &quot;[0-9]{1,2}/Oct/2019&quot; | grep -c -E &quot;/rest/bitstreams&quot;
106781
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2019-11/" />
<meta property="article:published_time" content="2019-11-04T12:20:30+02:00" />
<meta property="article:modified_time" content="2019-11-17T14:21:58+02:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="November, 2019"/>
<meta name="twitter:description" content="2019-11-04
Peter noticed that there were 5.2 million hits on CGSpace in 2019-10 according to the Atmire usage statistics
I looked in the nginx logs and see 4.6 million in the access logs, and 1.2 million in the API logs:
# zcat --force /var/log/nginx/*access.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot;
4671942
# zcat --force /var/log/nginx/{rest,oai,statistics}.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot;
1277694
So 4.6 million from XMLUI and another 1.2 million from API requests
Let&rsquo;s see how many of the REST API requests were for bitstreams (because they are counted in Solr stats):
# zcat --force /var/log/nginx/rest.log.*.gz | grep -c -E &quot;[0-9]{1,2}/Oct/2019&quot;
1183456
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E &quot;[0-9]{1,2}/Oct/2019&quot; | grep -c -E &quot;/rest/bitstreams&quot;
106781
"/>
<meta name="generator" content="Hugo 0.59.1" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "November, 2019",
"url": "https:\/\/alanorth.github.io\/cgspace-notes\/2019-11\/",
"wordCount": "2595",
"datePublished": "2019-11-04T12:20:30+02:00",
"dateModified": "2019-11-17T14:21:58+02:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2019-11/">
<title>November, 2019 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.css" rel="stylesheet" integrity="sha384-G5B34w7DFTumWTswxYzTX7NWfbvQEg1HbFFEg6ItN03uTAAoS2qkPS/fu3LhuuSA" crossorigin="anonymous">
<!-- RSS 2.0 feed -->
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2019-11/">November, 2019</a></h2>
<p class="blog-post-meta"><time datetime="2019-11-04T12:20:30&#43;02:00">Mon Nov 04, 2019</time> by Alan Orth in
<i class="fa fa-folder" aria-hidden="true"></i>&nbsp;<a href="/cgspace-notes/categories/notes" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2019-11-04">2019-11-04</h2>
<ul>
<li><p>Peter noticed that there were 5.2 million hits on CGSpace in 2019-10 according to the Atmire usage statistics</p>
<ul>
<li><p>I looked in the nginx logs and see 4.6 million in the access logs, and 1.2 million in the API logs:</p>
<pre><code># zcat --force /var/log/nginx/*access.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot;
4671942
# zcat --force /var/log/nginx/{rest,oai,statistics}.log.*.gz | grep -cE &quot;[0-9]{1,2}/Oct/2019&quot;
1277694
</code></pre></li>
</ul></li>
<li><p>So 4.6 million from XMLUI and another 1.2 million from API requests</p></li>
<li><p>Let&rsquo;s see how many of the REST API requests were for bitstreams (because they are counted in Solr stats):</p>
<pre><code># zcat --force /var/log/nginx/rest.log.*.gz | grep -c -E &quot;[0-9]{1,2}/Oct/2019&quot;
1183456
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E &quot;[0-9]{1,2}/Oct/2019&quot; | grep -c -E &quot;/rest/bitstreams&quot;
106781
</code></pre></li>
</ul>
<ul>
<li><p>The types of requests in the access logs are (by lazily extracting the sixth field in the nginx log)</p>
<pre><code># zcat --force /var/log/nginx/*access.log.*.gz | grep -E &quot;[0-9]{1,2}/Oct/2019&quot; | awk '{print $6}' | sed 's/&quot;//' | sort | uniq -c | sort -n
1 PUT
8 PROPFIND
283 OPTIONS
30102 POST
46581 HEAD
4594967 GET
</code></pre></li>
<li><p>Two very active IPs are 34.224.4.16 and 34.234.204.152, which made over 360,000 requests in October:</p>
<pre><code># zcat --force /var/log/nginx/*access.log.*.gz | grep -E &quot;[0-9]{1,2}/Oct/2019&quot; | grep -c -E '(34\.224\.4\.16|34\.234\.204\.152)'
365288
</code></pre></li>
<li><p>Their user agent is one I&rsquo;ve never seen before:</p>
<pre><code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot)
</code></pre></li>
<li><p>Most of them seem to be to community or collection discover and browse results pages like <code>/handle/10568/103/discover</code>:</p></li>
</ul>
<h1 id="zcat-force-var-log-nginx-access-log-gz-grep-e-0-9-1-2-oct-2019-grep-amazonbot-grep-o-e-get-bitstream-discover-handle-sort-uniq-c">zcat &ndash;force /var/log/nginx/<em>access.log.</em>.gz | grep -E &ldquo;[0-9]{1,2}/Oct/2019&rdquo; | grep Amazonbot | grep -o -E &ldquo;GET /(bitstream|discover|handle)&rdquo; | sort | uniq -c</h1>
<p>6566 GET /bitstream
351928 GET /handle</p>
<h1 id="zcat-force-var-log-nginx-access-log-gz-grep-e-0-9-1-2-oct-2019-grep-amazonbot-grep-e-get-bitstream-discover-handle-grep-c-discover">zcat &ndash;force /var/log/nginx/<em>access.log.</em>.gz | grep -E &ldquo;[0-9]{1,2}/Oct/2019&rdquo; | grep Amazonbot | grep -E &ldquo;GET /(bitstream|discover|handle)&rdquo; | grep -c discover</h1>
<p>214209</p>
<h1 id="zcat-force-var-log-nginx-access-log-gz-grep-e-0-9-1-2-oct-2019-grep-amazonbot-grep-e-get-bitstream-discover-handle-grep-c-browse">zcat &ndash;force /var/log/nginx/<em>access.log.</em>.gz | grep -E &ldquo;[0-9]{1,2}/Oct/2019&rdquo; | grep Amazonbot | grep -E &ldquo;GET /(bitstream|discover|handle)&rdquo; | grep -c browse</h1>
<p>86874</p>
<pre><code>
- As far as I can tell, none of their requests are counted in the Solr statistics:
</code></pre>
<p>$ http &ndash;print b &lsquo;<a href="http://localhost:8081/solr/statistics/select?q=(ip%3A34.224.4.16+OR+ip%3A34.234.204.152)&amp;rows=0&amp;wt=json&amp;indent=true'">http://localhost:8081/solr/statistics/select?q=(ip%3A34.224.4.16+OR+ip%3A34.234.204.152)&amp;rows=0&amp;wt=json&amp;indent=true'</a></p>
<pre><code>
- Still, those requests are CPU intensive so I will add their user agent to the &quot;badbots&quot; rate limiting in nginx to reduce the impact on server load
- After deploying it I checked by setting my user agent to Amazonbot and making a few requests (which were denied with HTTP 503):
</code></pre>
2019-11-05 09:37:16 +01:00
<p>$ http &ndash;print Hh &lsquo;<a href="https://dspacetest.cgiar.org/handle/10568/1/discover'">https://dspacetest.cgiar.org/handle/10568/1/discover'</a> User-Agent:&ldquo;Amazonbot/0.1&rdquo;</p>
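<ul>
<li><p>For reference, the mechanism behind the &ldquo;badbots&rdquo; limiting is an nginx <code>map</code> on the user agent feeding a <code>limit_req</code> zone; this is only a sketch with illustrative zone names and rates, not our production config:</p>
<pre><code># Sketch only: funnel bot-ish user agents into a small rate limit.
# Requests over the limit get HTTP 503, nginx's default for limit_req.
map $http_user_agent $ua_badbot {
    default        '';
    ~*Amazonbot    'badbot';
}

limit_req_zone $ua_badbot zone=badbots:1m rate=1r/s;

server {
    location / {
        limit_req zone=badbots;
    }
}
</code></pre></li>
</ul>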
<ul>
<li><p>On the topic of spiders, I have been wanting to update DSpace&rsquo;s default list of spiders in <code>config/spiders/agents</code>, perhaps by dropping a new list in from <a href="https://github.com/atmire/COUNTER-Robots">Atmire&rsquo;s COUNTER-Robots</a> project</p></li>
<li><p>First I checked for a user agent that is in COUNTER-Robots, but NOT in the current <code>dspace/config/spiders/example</code> list</p></li>
<li><p>Then I made some item and bitstream requests on DSpace Test using that user agent:</p>
<pre><code>$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;iskanie&quot;
$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;iskanie&quot;
$ http --print Hh 'https://dspacetest.cgiar.org/bitstream/handle/10568/105487/csl_Crane_oct2019.pptx?sequence=1&amp;isAllowed=y' User-Agent:&quot;iskanie&quot;
</code></pre></li>
<li><p>A bit later I checked Solr and found three requests from my IP with that user agent this month:</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics/select?q=ip:73.178.9.24+AND+userAgent:iskanie&amp;fq=dateYearMonth%3A2019-11&amp;rows=0'
&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&gt;
&lt;response&gt;
&lt;lst name=&quot;responseHeader&quot;&gt;&lt;int name=&quot;status&quot;&gt;0&lt;/int&gt;&lt;int name=&quot;QTime&quot;&gt;1&lt;/int&gt;&lt;lst name=&quot;params&quot;&gt;&lt;str name=&quot;q&quot;&gt;ip:73.178.9.24 AND userAgent:iskanie&lt;/str&gt;&lt;str name=&quot;fq&quot;&gt;dateYearMonth:2019-11&lt;/str&gt;&lt;str name=&quot;rows&quot;&gt;0&lt;/str&gt;&lt;/lst&gt;&lt;/lst&gt;&lt;result name=&quot;response&quot; numFound=&quot;3&quot; start=&quot;0&quot;&gt;&lt;/result&gt;
&lt;/response&gt;
</code></pre></li>
<li><p>Now I want to make similar requests with a user agent that is included in DSpace&rsquo;s current user agent list:</p>
<pre><code>$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;celestial&quot;
$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;celestial&quot;
$ http --print Hh 'https://dspacetest.cgiar.org/bitstream/handle/10568/105487/csl_Crane_oct2019.pptx?sequence=1&amp;isAllowed=y' User-Agent:&quot;celestial&quot;
</code></pre></li>
<li><p>After twenty minutes I didn&rsquo;t see any requests in Solr, so I assume they did not get logged because they matched a bot list&hellip;</p></li>
<li><p>What&rsquo;s strange is that the Solr spider agent configuration in <code>dspace/config/modules/solr-statistics.cfg</code> points to a file that doesn&rsquo;t exist&hellip;</p>
<pre><code>spider.agentregex.regexfile = ${dspace.dir}/config/spiders/Bots-2013-03.txt
</code></pre></li>
<li><p>Apparently that is part of Atmire&rsquo;s CUA, despite being in a standard DSpace configuration file&hellip;</p></li>
<li><p>I tried with some other garbage user agents like &ldquo;fuuuualan&rdquo; and they were visible in Solr</p></li>
<li><p>Now I want to try adding &ldquo;iskanie&rdquo; and &ldquo;fuuuualan&rdquo; to the list of spider regexes in <code>dspace/config/spiders/example</code> and then try to use DSpace&rsquo;s &ldquo;mark spiders&rdquo; feature to change them to &ldquo;isBot:true&rdquo; in Solr</p></li>
<li><p>I restarted Tomcat and ran <code>dspace stats-util -m</code> and it did some stuff for a while, but I still don&rsquo;t see any items in Solr with <code>isBot:true</code></p></li>
<li><p>According to <code>dspace-api/src/main/java/org/dspace/statistics/util/SpiderDetector.java</code> the patterns for user agents are loaded from any file in the <code>config/spiders/agents</code> directory</p></li>
<li><p>I downloaded the COUNTER-Robots list to DSpace Test and overwrote the example file, then ran <code>dspace stats-util -m</code> and still there were no new items marked as being bots in Solr, so I think there is still something wrong</p></li>
<li><p>Jesus, the code in <code>./dspace-api/src/main/java/org/dspace/statistics/util/StatisticsClient.java</code> says that <code>stats-util -m</code> marks spider requests by their IPs, not by their user agents&hellip; WTF:</p>
<pre><code>else if (line.hasOption('m'))
{
    SolrLogger.markRobotsByIP();
}
</code></pre></li>
<li><p>WTF again, there is actually a function called <code>markRobotByUserAgent()</code> that is never called anywhere!</p></li>
<li><p>It appears to be unimplemented&hellip;</p></li>
<li><p>I sent a message to the dspace-tech mailing list to ask if I should file an issue</p></li>
</ul>
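<ul>
<li><p>For what it&rsquo;s worth, a quick way to check this yourself from a DSpace 5.x source checkout is to grep for callers of the two functions (nothing calls the user agent one):</p>
<pre><code>$ grep -rn 'markRobotsByIP\|markRobotByUserAgent' dspace-api/src/main/java
</code></pre></li>
</ul>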
<h2 id="2019-11-05">2019-11-05</h2>
<ul>
<li><p>I added &ldquo;alanfuuu2&rdquo; to the example spiders file, restarted Tomcat, then made two requests to DSpace Test:</p>
<pre><code>$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;alanfuuu1&quot;
$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;alanfuuu2&quot;
</code></pre></li>
<li><p>After committing the changes in Solr I saw one request for &ldquo;alanfuuu1&rdquo; and no requests for &ldquo;alanfuuu2&rdquo;:</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics/update?commit=true'
$ http --print b 'http://localhost:8081/solr/statistics/select?q=userAgent:alanfuuu1&amp;fq=dateYearMonth%3A2019-11' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;1&quot; start=&quot;0&quot;&gt;
$ http --print b 'http://localhost:8081/solr/statistics/select?q=userAgent:alanfuuu2&amp;fq=dateYearMonth%3A2019-11' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;0&quot; start=&quot;0&quot;/&gt;
</code></pre></li>
<li><p>So basically it seems like a win to update the example file with the latest one from Atmire&rsquo;s COUNTER-Robots list</p></li>
<li><p>Even though the &ldquo;mark by user agent&rdquo; function is not working (see email to dspace-tech mailing list) DSpace will still not log Solr events from these user agents</p></li>
<li><p>I&rsquo;m curious how the special character matching is in Solr, so I will test two requests: one with &ldquo;www.gnip.com&rdquo; which is in the spider list, and one with &ldquo;www.gnyp.com&rdquo; which isn&rsquo;t:</p>
<pre><code>$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;www.gnip.com&quot;
$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;www.gnyp.com&quot;
</code></pre></li>
<li><p>Then commit changes to Solr so we don&rsquo;t have to wait:</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics/update?commit=true'
$ http --print b 'http://localhost:8081/solr/statistics/select?q=userAgent:www.gnip.com&amp;fq=dateYearMonth%3A2019-11' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;0&quot; start=&quot;0&quot;/&gt;
$ http --print b 'http://localhost:8081/solr/statistics/select?q=userAgent:www.gnyp.com&amp;fq=dateYearMonth%3A2019-11' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;1&quot; start=&quot;0&quot;&gt;
</code></pre></li>
<li><p>So the blocking seems to be working because <code>www\.gnip\.com</code> is one of the new patterns added to the spiders file&hellip;</p></li>
</ul>
<h2 id="2019-11-07">2019-11-07</h2>
<ul>
<li><p>CCAFS finally confirmed that they do indeed need the confusing new project tag that looks like a duplicate</p></li>
<li><p>They had proposed a batch of new tags in 2019-09 and we never merged them due to this uncertainty</p></li>
<li><p>I have now merged the changes into the <code>5_x-prod</code> branch (<a href="https://github.com/ilri/DSpace/pull/432">#432</a>)</p></li>
<li><p>I am reconsidering the move of <code>cg.identifier.dataurl</code> to <code>cg.hasMetadata</code> in CG Core v2</p></li>
<li><p>The values of this field are mostly links to data sets on Dataverse and partner sites</p></li>
<li><p>I opened an <a href="https://github.com/AgriculturalSemantics/cg-core/issues/10">issue on GitHub</a> to ask Marie-Angelique for clarification</p></li>
<li><p>Looking into CGSpace statistics again</p></li>
<li><p>I searched for hits in Solr from the BUbiNG bot and found 63,000 in the <code>statistics-2018</code> core:</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics-2018/select?facet=true&amp;facet.field=ip&amp;facet.mincount=1&amp;type:0&amp;q=userAgent:BUbiNG*' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;62944&quot; start=&quot;0&quot;&gt;
</code></pre></li>
<li><p>Similar for com.plumanalytics, Grammarly, and ltx71!</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics-2018/select?facet=true&amp;facet.field=ip&amp;facet.mincount=1&amp;type:0&amp;q=userAgent:*com.plumanalytics*' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;28256&quot; start=&quot;0&quot;&gt;
$ http --print b 'http://localhost:8081/solr/statistics-2018/select?facet=true&amp;facet.field=ip&amp;facet.mincount=1&amp;type:0&amp;q=userAgent:*Grammarly*' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;6288&quot; start=&quot;0&quot;&gt;
$ http --print b 'http://localhost:8081/solr/statistics-2018/select?facet=true&amp;facet.field=ip&amp;facet.mincount=1&amp;type:0&amp;q=userAgent:*ltx71*' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;105663&quot; start=&quot;0&quot;&gt;
</code></pre></li>
<li><p>Deleting these seems to work, for example the 105,000 ltx71 records from 2018:</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics-2018/update?stream.body=&lt;delete&gt;&lt;query&gt;userAgent:*ltx71*&lt;/query&gt;&lt;query&gt;type:0&lt;/query&gt;&lt;/delete&gt;&amp;commit=true'
$ http --print b 'http://localhost:8081/solr/statistics-2018/select?facet=true&amp;facet.field=ip&amp;facet.mincount=1&amp;type:0&amp;q=userAgent:*ltx71*' | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;0&quot; start=&quot;0&quot;/&gt;
</code></pre></li>
<li><p>I wrote a quick bash script to check all these user agents against the CGSpace Solr statistics cores</p></li>
<li><p>For years 2010 until 2019 there are 1.6 million hits from these spider user agents</p></li>
<li><p>For 2019 alone there are 740,000, over half of which come from Unpaywall!</p></li>
<li><p>Looking at the facets I see there were about 200,000 hits from Unpaywall in 2019-10:</p>
<pre><code>$ curl -s 'http://localhost:8081/solr/statistics/select?facet=true&amp;facet.field=dateYearMonth&amp;facet.mincount=1&amp;facet.offset=0&amp;facet.limit=12&amp;q=userAgent:*Unpaywall*' | xmllint --format - | less
...
&lt;lst name=&quot;facet_counts&quot;&gt;
  &lt;lst name=&quot;facet_queries&quot;/&gt;
  &lt;lst name=&quot;facet_fields&quot;&gt;
    &lt;lst name=&quot;dateYearMonth&quot;&gt;
      &lt;int name=&quot;2019-10&quot;&gt;198624&lt;/int&gt;
      &lt;int name=&quot;2019-05&quot;&gt;88422&lt;/int&gt;
      &lt;int name=&quot;2019-06&quot;&gt;79911&lt;/int&gt;
      &lt;int name=&quot;2019-09&quot;&gt;67065&lt;/int&gt;
      &lt;int name=&quot;2019-07&quot;&gt;39026&lt;/int&gt;
      &lt;int name=&quot;2019-08&quot;&gt;36889&lt;/int&gt;
      &lt;int name=&quot;2019-04&quot;&gt;36512&lt;/int&gt;
      &lt;int name=&quot;2019-11&quot;&gt;760&lt;/int&gt;
    &lt;/lst&gt;
  &lt;/lst&gt;
</code></pre></li>
</ul>
<ul>
<li><p>That answers Peter&rsquo;s question about why the stats jumped in October&hellip;</p></li>
</ul>
<h2 id="2019-11-08">2019-11-08</h2>
<ul>
<li><p>I saw a bunch of user agents that have the literal string <code>User-Agent</code> in their user agent HTTP header, for example:</p>
<ul>
<li><code>User-Agent: Drupal (+http://drupal.org/)</code></li>
<li><code>User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.64 Safari/537.31</code></li>
<li><code>User-Agent:Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0) IKU/7.0.5.9226;IKUCID/IKU;</code></li>
<li><code>User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)</code></li>
<li><code>User-Agent:User-Agent:Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; GTB7.5; .NET4.0C)IKU/6.7.6.12189;IKUCID/IKU;IKU/6.7.6.12189;IKUCID/IKU;</code></li>
<li><code>User-Agent:Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0) IKU/7.0.5.9226;IKUCID/IKU;</code></li>
</ul></li>
<li><p>I filed <a href="https://github.com/atmire/COUNTER-Robots/issues/27">an issue</a> on the COUNTER-Robots project to see if they agree to add <code>User-Agent:</code> to the list of robot user agents</p></li>
</ul>
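<ul>
<li><p>In the meantime, something like this phrase query should give a rough count of them in the current core (an illustration only, assuming the <code>userAgent</code> field is tokenized so the literal string matches):</p>
<pre><code>$ http --print b 'http://localhost:8081/solr/statistics/select?q=userAgent:%22User-Agent%22&amp;rows=0' | xmllint --format - | grep numFound
</code></pre></li>
</ul>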
<h2 id="2019-11-09">2019-11-09</h2>
<ul>
<li><p>Deploy the latest <code>5_x-prod</code> branch on CGSpace (linode19)</p></li>
<li><p>This includes the updated CCAFS phase II project tags and the updated spider user agents</p></li>
<li><p>Run all system updates on CGSpace and reboot the server</p></li>
<li><p>After rebooting it seems that all Solr statistics cores came back up fine&hellip;</p></li>
<li><p>I did some work to clean up my bot processing script and removed about 2 million hits from the statistics cores on CGSpace</p></li>
<li><p>The script is called <code>check-spider-hits.sh</code> (see the sketch below)</p></li>
<li><p>After a bunch of tests and checks I ran it for each statistics shard like so:</p>
<pre><code>$ for shard in statistics statistics-2018 statistics-2017 statistics-2016 statistics-2015 statistics-2014 statistics-2013 statistics-2012 statistics-2011 statistics-2010; do ./check-spider-hits.sh -s $shard -p yes; done
</code></pre></li>
</ul>
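<ul>
<li><p>I haven&rsquo;t reproduced the script in these notes, but the core of it is roughly this: loop over the spider agent patterns, count matching hits in a given core, and optionally purge them (a simplified sketch; the file path, flag handling, and exact queries here are assumptions for illustration):</p>
<pre><code>#!/usr/bin/env bash
# Simplified sketch of check-spider-hits.sh: count and purge Solr statistics
# hits for each spider user agent pattern. Not the real script!
solr_url=&quot;http://localhost:8081/solr&quot;
shard=&quot;$1&quot;

while read -r agent; do
    # count hits matching this user agent pattern
    numfound=$(curl -s &quot;$solr_url/$shard/select&quot; -d &quot;q=userAgent:$agent&amp;rows=0&quot; \
        | xmllint --xpath 'string(//result/@numFound)' -)
    if [[ $numfound -gt 0 ]]; then
        echo &quot;Purging $numfound hits from $agent in $shard&quot;
        # delete the matching hits and commit
        curl -s &quot;$solr_url/$shard/update?commit=true&quot; \
            -H 'Content-Type: text/xml' \
            --data-binary &quot;&lt;delete&gt;&lt;query&gt;userAgent:$agent&lt;/query&gt;&lt;/delete&gt;&quot; &gt;/dev/null
    fi
done &lt; dspace/config/spiders/agents/example
</code></pre></li>
</ul>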
<ul>
<li><p>Open a <a href="https://github.com/atmire/COUNTER-Robots/pull/28">pull request</a> against COUNTER-Robots to remove unnecessary escaping of dashes</p></li>
</ul>
<h2 id="2019-11-12">2019-11-12</h2>
<ul>
<li><p>Udana and Chandima emailed me to ask why <a href="https://hdl.handle.net/10568/81236">one of their WLE items</a> that is mapped from IWMI only shows up in the IWMI &ldquo;department&rdquo; on the Altmetric dashboard</p></li>
<li><p>A <a href="https://www.altmetric.com/explorer/outputs?department_id%5B%5D=CGSpace%3Agroup%3Acom_10568_16814&amp;q=Towards%20sustainable%20sanitation%20management">search in the IWMI department shows the item</a></p></li>
<li><p>A <a href="https://www.altmetric.com/explorer/outputs?department_id%5B%5D=CGSpace%3Agroup%3Acom_10568_34494&amp;q=Towards%20sustainable%20sanitation%20management">search in the WLE department shows no results</a></p></li>
<li><p>I emailed Altmetric support to ask for help</p></li>
<li><p>Also, while analysing this, I looked through some of the other top WLE items and fixed some metadata issues (adding <code>dc.rights</code>, fixing DOIs, adding ISSNs, etc) and noticed one issue with <a href="https://hdl.handle.net/10568/97087">an item</a> that has an Altmetric score for its Handle (lower) despite it having a correct DOI (with a higher score)</p></li>
<li><p>I tweeted the Handle to see if the score would get linked once Altmetric noticed it</p></li>
</ul>
<h2 id="2019-11-13">2019-11-13</h2>
<ul>
<li><p>The <a href="https://hdl.handle.net/10568/97087">item with a low Altmetric score for its Handle</a> that I tweeted yesterday still hasn&rsquo;t linked with the DOI&rsquo;s score</p></li>
<li><p>I tweeted it again with the Handle and the DOI</p></li>
<li><p>Testing modifying some of the COUNTER-Robots patterns to use <code>[0-9]</code> instead of the <code>\d</code> digit character type, as Solr&rsquo;s regex search can&rsquo;t use those:</p>
<pre><code>$ http --print Hh 'https://dspacetest.cgiar.org/handle/10568/105487' User-Agent:&quot;Scrapoo/1&quot;
$ http &quot;http://localhost:8081/solr/statistics/update?commit=true&quot;
$ http &quot;http://localhost:8081/solr/statistics/select?q=userAgent:Scrapoo*&quot; | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;1&quot; start=&quot;0&quot;&gt;
$ http &quot;http://localhost:8081/solr/statistics/select?q=userAgent:/Scrapoo\/[0-9]/&quot; | xmllint --format - | grep numFound
&lt;result name=&quot;response&quot; numFound=&quot;1&quot; start=&quot;0&quot;&gt;
</code></pre></li>
</ul>
<ul>
<li><p>Nice, so searching with regex in Solr with <code>//</code> syntax works for those digits!</p></li>
<li><p>I realized that it&rsquo;s easier to search Solr from curl via POST using this syntax:</p>
<pre><code>$ curl -s &quot;http://localhost:8081/solr/statistics/select&quot; -d &quot;q=userAgent:*Scrapoo*&amp;rows=0&quot;
</code></pre></li>
<li><p>If the parameters include something like &ldquo;[0-9]&rdquo; then curl interprets it as a range and will make ten requests</p></li>
<li><p>You can disable this using the <code>-g</code> option, but there are other benefits to searching with POST, for example it seems that I have fewer issues with escaping special parameters when using Solr&rsquo;s regex search:</p>
<pre><code>$ curl -s 'http://localhost:8081/solr/statistics/select' -d 'q=userAgent:/Postgenomic(\s|\+)v2/&amp;rows=2'
</code></pre></li>
</ul>
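<ul>
<li><p>(For the record, the GET form works too if you pass <code>-g</code> to turn off curl&rsquo;s bracket globbing; just an illustration of the flag:)</p>
<pre><code>$ curl -g -s 'http://localhost:8081/solr/statistics/select?q=userAgent:/Scrapoo\/[0-9]/&amp;rows=0'
</code></pre></li>
</ul>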
<ul>
<li><p>I updated the <code>check-spider-hits.sh</code> script to use the POST syntax, and I&rsquo;m evaluating the feasibility of including the regex search patterns from the spider agent file, as I had been filtering them out due to differences in PCRE and Solr regex syntax and issues with shell handling</p></li>
</ul>
<h2 id="2019-11-14">2019-11-14</h2>
<ul>
<li><p>IWMI sent a few new ORCID identifiers for us to add to our controlled vocabulary</p></li>
<li><p>I will merge them with our existing list and then resolve their names using my <code>resolve-orcids.py</code> script:</p>
<pre><code>$ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-id.xml /tmp/iwmi-orcids.txt | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort | uniq &gt; /tmp/2019-11-14-combined-orcids.txt
$ ./resolve-orcids.py -i /tmp/2019-11-14-combined-orcids.txt -o /tmp/2019-11-14-combined-names.txt -d
# sort names, copy to cg-creator-id.xml, add XML formatting, and then format with tidy (preserving accents)
$ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-id.xml
</code></pre></li>
</ul>
<ul>
<li><p>I created a <a href="https://github.com/ilri/DSpace/pull/437">pull request</a> and merged them into the <code>5_x-prod</code> branch</p></li>
<li><p>I will deploy them to CGSpace in the next few days</p></li>
<li><p>Greatly improve my <code>check-spider-hits.sh</code> script to handle regular expressions in the spider agents patterns file</p></li>
<li><p>This allows me to detect and purge many more hits from the Solr statistics core</p></li>
<li><p>I&rsquo;ve tested it quite a bit on DSpace Test, but I need to do a little more before I feel comfortable running the new code on CGSpace&rsquo;s Solr cores</p></li>
</ul>
<h2 id="2019-11-15">2019-11-15</h2>
<ul>
<li><p>Run the new version of <code>check-spider-hits.sh</code> on CGSpace&rsquo;s Solr statistics cores one by one, starting from the oldest just in case something goes wrong</p></li>
<li><p>But then I noticed that some (all?) of the hits weren&rsquo;t actually getting purged, all of which were using regular expressions like:</p>
<ul>
<li><code>MetaURI[\+\s]API\/[0-9]\.[0-9]</code></li>
<li><code>FDM(\s|\+)[0-9]</code></li>
<li><code>Goldfire(\s|\+)Server</code></li>
<li><code>^Mozilla\/4\.0\+\(compatible;\)$</code></li>
<li><code>^Mozilla\/4\.0\+\(compatible;\+ICS\)$</code></li>
<li><code>^Mozilla\/4\.5\+\[en]\+\(Win98;\+I\)$</code></li>
</ul></li>
<li><p>Upon closer inspection, the plus signs seem to be getting misinterpreted somehow in the delete, but not in the select!</p></li>
<li><p>Plus signs are special in regular expressions, URLs, and Solr&rsquo;s Lucene query parser, so I&rsquo;m actually not sure where the issue is</p></li>
<li><p>I tried to do URL encoding of the +, double escaping, etc&hellip; but nothing worked</p></li>
<li><p>I&rsquo;m going to ignore regular expressions that have pluses for now, as sketched below</p></li>
<li><p>I think I might also have to ignore patterns that have percent signs, like <code>^\%?default\%?$</code></p></li>
</ul>
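<ul>
<li><p>A guard for this can be as simple as a shell <code>case</code> statement (a sketch; the variable name is illustrative and the real script may differ):</p>
<pre><code># Skip patterns with literal pluses or percent signs: they get mangled
# somewhere between the shell, URL encoding, and Solr's parsers.
case &quot;$agent&quot; in
    *+*|*%*)
        echo &quot;SKIP: $agent&quot;
        continue
        ;;
esac
</code></pre></li>
</ul>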
<ul>
<li><p>After I added the ignores and did some more testing I finally ran the <code>check-spider-hits.sh</code> on all CGSpace Solr statistics cores and these are the number of hits purged from each core:</p>
<ul>
<li>statistics-2010: 113</li>
<li>statistics-2011: 7235</li>
<li>statistics-2012: 0</li>
<li>statistics-2013: 0</li>
<li>statistics-2014: 316</li>
<li>statistics-2015: 16809</li>
<li>statistics-2016: 41732</li>
<li>statistics-2017: 39207</li>
<li>statistics-2018: 295546</li>
<li>statistics: 1043373</li>
</ul></li>
<li><p>That&rsquo;s 1.4 million hits in addition to the 2 million I purged earlier this week&hellip;</p></li>
<li><p>For posterity, the major contributors to the hits on the statistics core were:</p>
<ul>
<li>Purging 812429 hits from <code>curl\/</code> in statistics</li>
<li>Purging 48206 hits from <code>facebookexternalhit\/</code> in statistics</li>
<li>Purging 72004 hits from <code>PHP\/</code> in statistics</li>
<li>Purging 76072 hits from <code>Yeti\/[0-9]</code> in statistics</li>
</ul></li>
<li><p>Most of the curl hits were from CIAT in mid-2019, where they were using <a href="https://guzzle3.readthedocs.io/http-client/client.html">GuzzleHttp</a> from PHP, which uses something like this for its user agent:</p>
<pre><code>Guzzle/&lt;Guzzle_Version&gt; curl/&lt;curl_version&gt; PHP/&lt;PHP_VERSION&gt;
</code></pre></li>
</ul>
<ul>
<li>Run system updates on DSpace Test and reboot the server</li>
</ul>
<h2 id="2019-11-17">2019-11-17</h2>
<ul>
<li>Altmetric support responded about our dashboard question, asking if the second &ldquo;department&rdquo; (aka WLE&rsquo;s collection) was added recently and might not have been in the last harvesting yet
<ul>
<li>I told her no, that the department is several years old, and the item was added in 2017</li>
<li>Then I looked again at the dashboard for each department and I see the item in both departments now&hellip; shit.</li>
<li>A <a href="https://www.altmetric.com/explorer/outputs?department_id%5B%5D=CGSpace%3Agroup%3Acom_10568_16814&amp;q=Towards%20sustainable%20sanitation%20management">search in the IWMI department shows the item</a></li>
<li>A <a href="https://www.altmetric.com/explorer/outputs?department_id%5B%5D=CGSpace%3Agroup%3Acom_10568_34494&amp;q=Towards%20sustainable%20sanitation%20management">search in the WLE department shows the item</a></li>
</ul></li>
<li>I finally decided to revert <code>cg.hasMetadata</code> back to <code>cg.identifier.dataurl</code> in my CG Core v2 branch (see <a href="https://github.com/AgriculturalSemantics/cg-core/issues/10">#10</a>)</li>
<li>Regarding the <a href="https://hdl.handle.net/10568/97087">WLE item</a> that has a much lower score than its DOI&hellip;
<ul>
<li>I tweeted the item twice last week and the score never got linked</li>
<li>Then I noticed that I had already made a note about the same issue in 2019-04, when I also tweeted it several times&hellip;</li>
<li>I will ask Altmetric support for help with that</li>
</ul></li>
</ul>
<!-- vim: set sw=2 ts=2: -->
</article>
</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2019-11/">November, 2019</a></li>
<li><a href="/cgspace-notes/cgspace-cgcorev2-migration/">CGSpace CG Core v2 Migration</a></li>
<li><a href="/cgspace-notes/2019-10/">October, 2019</a></li>
<li><a href="/cgspace-notes/2019-09/">September, 2019</a></li>
<li><a href="/cgspace-notes/2019-08/">August, 2019</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->
<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>