<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="December, 2017" />
<meta property="og:description" content="2017-12-01
Uptime Robot noticed that CGSpace went down
The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2017-12/" />
<meta property="article:published_time" content="2017-12-01T13:53:54+03:00" />
<meta property="article:modified_time" content="2017-12-06T09:51:05+03:00" />
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="December, 2017" />
<meta name="twitter:description" content="2017-12-01
Uptime Robot noticed that CGSpace went down
The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
"/>
<meta name="generator" content="Hugo 0.31.1" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "December, 2017",
"url": "https://alanorth.github.io/cgspace-notes/2017-12/",
"wordCount": "823",
"datePublished": "2017-12-01T13:53:54+ 03:00",
"dateModified": "2017-12-06T09:51:05+ 03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2017-12/">
<title>December, 2017 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.css" rel="stylesheet" integrity="sha384-O8wjsnz02XiyrPxnhfF6AVOv6YLBaEGRCnVF+DL3gCPBy9cieyHcpixIrVyD2JS5" crossorigin="anonymous">
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title"><a href="https://alanorth.github.io/cgspace-notes/2017-12/">December, 2017</a></h2>
<p class="blog-post-meta"><time datetime="2017-12-01T13:53:54+03:00">Fri Dec 01, 2017</time> by Alan Orth in
<i class="fa fa-tag" aria-hidden="true"></i> <a href="/cgspace-notes/tags/notes" rel="tag">Notes</a>
</p>
</header>
<h2 id="2017-12-01">2017-12-01</h2>
<ul>
<li>Uptime Robot noticed that CGSpace went down</li>
<li>The logs say “Timeout waiting for idle object”</li>
<li>PostgreSQL activity says there are 115 connections currently</li>
<li>The list of connections to XMLUI and REST API for today:</li>
</ul>
<p></p>
<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "1/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
763 2.86.122.76
907 207.46.13.94
1018 157.55.39.206
1021 157.55.39.235
1407 66.249.66.70
1411 104.196.152.243
1503 50.116.102.77
1805 66.249.66.90
4007 70.32.83.92
6061 45.5.184.196
</code></pre>
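<ul>
<li>For the record, the PostgreSQL connection count mentioned above can be checked with a quick query against <code>pg_stat_activity</code> (a sketch; the psql user is an assumption):</li>
</ul>
<pre><code>$ psql -U postgres -c 'SELECT count(*) FROM pg_stat_activity;'
</code></pre>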
<ul>
<li>The number of DSpace sessions isn’t even that high:</li>
</ul>
<pre><code>$ cat /home/cgspace.cgiar.org/log/dspace.log.2017-12-01 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
5815
</code></pre>
<ul>
<li>Connections in the last two hours:</li>
</ul>
<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "1/Dec/2017:(09|10)" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
78 93.160.60.22
101 40.77.167.122
113 66.249.66.70
129 157.55.39.206
130 157.55.39.235
135 40.77.167.58
164 68.180.229.254
177 87.100.118.220
188 66.249.66.90
314 2.86.122.76
</code></pre>
<ul>
<li>What the fuck is going on?</li>
<li>I’ve never seen this 2.86.122.76 before; it has made quite a few unique Tomcat sessions today:</li>
</ul>
<pre><code>$ grep 2.86.122.76 /home/cgspace.cgiar.org/log/dspace.log.2017-12-01 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
822
</code></pre>
<ul>
<li>Appears to be some new bot:</li>
</ul>
<pre><code>2.86.122.76 - - [01/Dec/2017:09:02:53 +0000] "GET /handle/10568/78444?show=full HTTP/1.1" 200 29307 "-" "Mozilla/3.0 (compatible; Indy Library)"
</code></pre>
<ul>
<li>I restarted Tomcat and everything came back up</li>
<li>I can add Indy Library to the Tomcat crawler session manager valve, but it would be nice if I could simply remap the useragent in nginx (a quick way to verify either approach is sketched below)</li>
<li>I will also add ‘Drupal’ to the Tomcat crawler session manager valve because there are Drupals out there harvesting and they should be considered as bots</li>
</ul>
<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "1/Dec/2017" | grep Drupal | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
3 54.75.205.145
6 70.32.83.92
14 2a01:7e00::f03c:91ff:fe18:7396
46 2001:4b99:1:1:216:3eff:fe2c:dc6c
319 2001:4b99:1:1:216:3eff:fe76:205b
</code></pre>
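<ul>
<li>A quick way to verify whether the crawler session manager valve (or an nginx useragent remap) is taking effect would be to request a page with one of these useragents and watch whether Tomcat hands out a new JSESSIONID every time; a rough sketch, reusing the handle and useragent from the log line above:</li>
</ul>
<pre><code>$ curl -s -o /dev/null -D - -A 'Mozilla/3.0 (compatible; Indy Library)' 'https://cgspace.cgiar.org/handle/10568/78444?show=full' | grep -i '^set-cookie'
</code></pre>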
<h2 id="2017-12-03">2017-12-03</h2>
<ul>
<li>Linode alerted that CGSpace’s load was 327.5% from 6 to 8 AM again</li>
</ul>
<h2 id="2017-12-04">2017-12-04</h2>
<ul>
<li>Linode alerted that CGSpace’s load was 255.5% from 8 to 10 AM again</li>
<li>I looked at the Munin stats on DSpace Test (linode02) again to see how the PostgreSQL tweaks from a few weeks ago were holding up:</li>
</ul>
<p><img src="/cgspace-notes/2017/12/postgres-connections-month.png" alt="DSpace Test PostgreSQL connections month" /></p>
<ul>
<li>The results look fantastic! So the <code>random_page_cost</code> tweak is massively important for telling the PostgreSQL query planner that random page access costs essentially the same as sequential access, since we’re on an SSD!</li>
<li>I guess we could probably even reduce the PostgreSQL connection limits in DSpace and PostgreSQL after using this</li>
<li>Run system updates on DSpace Test (linode02) and reboot it</li>
<li>I’m going to enable the PostgreSQL <code>random_page_cost</code> tweak on CGSpace (the change itself is sketched below the graph)</li>
<li>For reference, here is the past month’s connections:</li>
</ul>
<p><img src="/cgspace-notes/2017/12/postgres-connections-month-cgspace.png" alt="CGSpace PostgreSQL connections month" /></p>
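<ul>
<li>The tweak itself is a one-liner in PostgreSQL; a sketch, assuming superuser access, with <code>1.0</code> being the value commonly recommended for SSDs rather than necessarily the exact value we use:</li>
</ul>
<pre><code>$ psql -U postgres -c "ALTER SYSTEM SET random_page_cost = 1.0;"
$ psql -U postgres -c 'SELECT pg_reload_conf();'
</code></pre>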
<h2 id="2017-12-05">2017-12-05</h2>
<ul>
<li>Linode alerted again that the CPU usage on CGSpace was high this morning from 8 to 10 AM</li>
<li>CORE updated the entry for CGSpace on their index: <a href="https://core.ac.uk/search?q=repositories.id:(1016)&fullTextOnly=false">https://core.ac.uk/search?q=repositories.id:(1016)&fullTextOnly=false</a></li>
<li>Linode alerted again that the CPU usage on CGSpace was high this evening from 8 to 10 PM</li>
</ul>
<h2 id="2017-12-06">2017-12-06</h2>
<ul>
<li>Linode alerted again that the CPU usage on CGSpace was high this morning from 6 to 8 AM</li>
<li>Uptime Robot alerted that the server went down and up around 8:53 this morning</li>
<li>Uptime Robot alerted that CGSpace was down and up again a few minutes later</li>
<li>I don’t see any errors in the DSpace logs, but in nginx’s access.log I see that UptimeRobot’s requests were returned HTTP 499 (Client Closed Request); there is a quick count of those after the log excerpt below</li>
<li>Looking at the REST API logs I see some new client IP I haven’t noticed before:</li>
</ul>
<pre><code># cat /var/log/nginx/rest.log /var/log/nginx/rest.log.1 | grep -E "6/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
18 95.108.181.88
19 68.180.229.254
30 207.46.13.151
33 207.46.13.110
38 40.77.167.20
41 157.55.39.223
82 104.196.152.243
1529 50.116.102.77
4005 70.32.83.92
6045 45.5.184.196
</code></pre>
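<ul>
<li>Counting the UptimeRobot 499s mentioned above is straightforward; a sketch, assuming the default log format where the status code is the ninth whitespace-separated field:</li>
</ul>
<pre><code># grep UptimeRobot /var/log/nginx/access.log | awk '{print $9}' | sort | uniq -c
</code></pre>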
<ul>
<li>50.116.102.77 is apparently in the US on websitewelcome.com</li>
</ul>
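<ul>
<li>For reference, one way to check who an IP like that belongs to (the grep is a guess at the field names in the registry output, which vary):</li>
</ul>
<pre><code>$ whois 50.116.102.77 | grep -Ei 'orgname|netname|country'
</code></pre>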
<h2 id="2017-12-07">2017-12-07</h2>
<ul>
<li>Uptime Robot reported a few times today that CGSpace was down and then up</li>
<li>At one point Tsega restarted Tomcat</li>
<li>I never got any alerts about high load from Linode though…</li>
<li>I looked just now and see that there are 121 PostgreSQL connections!</li>
<li>The top users right now are:</li>
</ul>
<pre><code># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 /var/log/nginx/library-access.log /var/log/nginx/library-access.log.1 | grep -E "7/Dec/2017" | awk '{print $1}' | sort -n | uniq -c | sort -h | tail
838 40.77.167.11
939 66.249.66.223
1149 66.249.66.206
1316 207.46.13.110
1322 207.46.13.151
1323 2001:da8:203:2224:c912:1106:d94f:9189
1414 157.55.39.223
2378 104.196.152.243
2662 66.249.66.219
5110 124.17.34.60
</code></pre>
<ul>
<li>We’ve never seen 124.17.34.60 yet, but it’s really hammering us!</li>
<li>Apparently it is from China, and here is one of its user agents:</li>
</ul>
<pre><code>Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.2; Win64; x64; Trident/7.0; LCTE)
</code></pre>
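<ul>
<li>User agents for a given IP can be pulled straight out of the nginx logs, assuming the default combined log format, where splitting on double quotes puts the useragent in the sixth field (a sketch):</li>
</ul>
<pre><code># grep 124.17.34.60 /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk -F'"' '{print $6}' | sort | uniq -c | sort -h
</code></pre>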
<ul>
<li>It is responsible for 4,500 Tomcat sessions today alone:</li>
</ul>
<pre><code>$ grep 124.17.34.60 /home/cgspace.cgiar.org/log/dspace.log.2017-12-07 | grep -o -E 'session_id=[A-Z0-9]{32}' | sort -n | uniq | wc -l
4574
</code></pre>
<ul>
<li>I’ve adjusted the nginx IP mapping that I set up last month to account for 124.17.34.60 and 124.17.34.59 using a regex, as it’s the same bot on the same subnet (the general shape of the change is sketched below)</li>
</ul>
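<ul>
<li>The general shape is a single regex entry in the existing map, something like <code>~^124\.17\.34\.(59|60)$</code> (the actual map variable and overridden useragent are in last month’s notes), followed by a config test and reload:</li>
</ul>
<pre><code># nginx -t && systemctl reload nginx
</code></pre>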
</article>
</div><!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2017-12/">December, 2017</a></li>
<li><a href="/cgspace-notes/2017-11/">November, 2017</a></li>
<li><a href="/cgspace-notes/2017-10/">October, 2017</a></li>
<li><a href="/cgspace-notes/cgiar-library-migration/">CGIAR Library Migration</a></li>
<li><a href="/cgspace-notes/2017-09/">September, 2017</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div><!-- /.row -->
</div><!-- /.container -->
<footer class="blog-footer">
<p>
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>