<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="November, 2015" />
<meta property="og:description" content="2015-11-22
CGSpace went down
Looks like DSpace exhausted its PostgreSQL connection pool
Last week I had increased the limit from 30 to 60, which seemed to help, but now there are many more idle connections:
$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
78
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2015-11/" />
<meta property="article:published_time" content="2015-11-23T17:00:57+03:00" />
<meta property="article:modified_time" content="2018-03-09T22:10:33+02:00" />

<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="November, 2015" />
<meta name="twitter:description" content="2015-11-22
CGSpace went down
Looks like DSpace exhausted its PostgreSQL connection pool
Last week I had increased the limit from 30 to 60, which seemed to help, but now there are many more idle connections:
$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
78
"/>
<meta name="generator" content="Hugo 0.58.2" />

<script type="application/ld+json">
{
    "@context": "http://schema.org",
    "@type": "BlogPosting",
    "headline": "November, 2015",
    "url": "https:\/\/alanorth.github.io\/cgspace-notes\/2015-11\/",
    "wordCount": "798",
    "datePublished": "2015-11-23T17:00:57+03:00",
    "dateModified": "2018-03-09T22:10:33+02:00",
    "author": {
        "@type": "Person",
        "name": "Alan Orth"
    },
    "keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2015-11/">

<title>November, 2015 | CGSpace Notes</title>

<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.css" rel="stylesheet" integrity="sha384-G5B34w7DFTumWTswxYzTX7NWfbvQEg1HbFFEg6ItN03uTAAoS2qkPS/fu3LhuuSA" crossorigin="anonymous">

<!-- RSS 2.0 feed -->
</head>
<body>

<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>

<header class="blog-header">
<div class="container">
<h1 class="blog-title"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>

<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title"><a href="https://alanorth.github.io/cgspace-notes/2015-11/">November, 2015</a></h2>
<p class="blog-post-meta"><time datetime="2015-11-23T17:00:57+03:00">Mon Nov 23, 2015</time> by Alan Orth in
<i class="fa fa-tag" aria-hidden="true"></i> <a href="/cgspace-notes/tags/notes" rel="tag">Notes</a>
</p>
</header>
<h2 id="2015-11-22">2015-11-22</h2>
<ul>
<li>CGSpace went down</li>
<li>Looks like DSpace exhausted its PostgreSQL connection pool</li>
<li><p>Last week I had increased the limit from 30 to 60, which seemed to help, but now there are many more idle connections:</p>
<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
78
</code></pre></li>
</ul>
<ul>
<li>For now I have increased the limit from 60 to 90, run updates, and rebooted the server</li>
</ul>
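<ul>
<li><p>For reference, the connection pool is controlled in <code>dspace.cfg</code>; a rough sketch of the relevant settings after today&rsquo;s change (values illustrative, <code>db.maxidle</code> still at the DSpace default of unlimited):</p>
<pre><code>db.maxconnections = 90
db.maxwait = 5000
db.maxidle = -1
</code></pre></li>
</ul>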
<h2 id="2015-11-24">2015-11-24</h2>
<ul>
<li>CGSpace went down again</li>
<li>Getting emails from uptimeRobot and uptimeButler that it’s down, and Google Webmaster Tools is sending emails that there is an increase in crawl errors</li>
<li><p>Looks like there are still a bunch of idle PostgreSQL connections:</p>
<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
96
</code></pre></li>
<li><p>For some reason the number of idle connections is very high since we upgraded to DSpace 5</p></li>
</ul>
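<ul>
<li><p>Instead of grepping, a quick way to see where connections are piling up is to group them by state (a sketch; the <code>state</code> column exists in PostgreSQL 9.2 and newer):</p>
<pre><code>$ psql -c 'SELECT state, count(*) FROM pg_stat_activity GROUP BY state;'
</code></pre></li>
</ul>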
<h2 id="2015-11-25">2015-11-25</h2>
<ul>
<li>Troubleshoot the DSpace 5 OAI breakage caused by nginx routing config</li>
<li><p>The OAI application requests stylesheets and javascript files with the path <code>/oai/static/css</code>, which gets matched here:</p>
<pre><code># static assets we can load from the file system directly with nginx
location ~ /(themes|static|aspects/ReportingSuite) {
    try_files $uri @tomcat;
...
</code></pre></li>
<li><p>The document root is relative to the xmlui app, so this gets a 404—I’m not sure why it doesn’t pass to <code>@tomcat</code></p></li>
<li><p>Anyways, I can’t find any URIs with path <code>/static</code>, and the more important point is to handle all the static theme assets, so we can just remove <code>static</code> from the regex for now (who cares if we can’t use nginx to send Etags for OAI CSS!)</p></li>
<li><p>Also, I noticed we aren’t setting CSP headers on the static assets, because in nginx headers are inherited in child blocks, but if you use <code>add_header</code> in a child block it doesn’t inherit the others</p></li>
<li><p>We simply need to add <code>include extra-security.conf;</code> to the above location block (but research and test first)</p></li>
<li><p>We should add WOFF assets to the list of things to set expires for:</p>
<pre><code>location ~* \.(?:ico|css|js|gif|jpe?g|png|woff)$ {
</code></pre></li>
<li><p>We should also add <code>aspects/Statistics</code> to the location block for static assets (minus <code>static</code> from above):</p>
<pre><code>location ~ /(themes|aspects/ReportingSuite|aspects/Statistics) {
</code></pre></li>
<li><p>Need to check <code>/about</code> on CGSpace, as it’s blank on my local test server and we might need to add something there</p></li>
<li><p>CGSpace has been up and down all day due to PostgreSQL idle connections (current DSpace pool is 90):</p>
<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
93
</code></pre></li>
<li><p>I looked closer at the idle connections and saw that many have been idle for hours (current time on server is <code>2015-11-25T20:20:42+0000</code>):</p>
<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | less -S
datid | datname | pid   | usesysid | usename | application_name | client_addr | client_hostname | client_port | backend_start                 | xact_start                    |
-------+----------+-------+----------+----------+------------------+-------------+-----------------+-------------+-------------------------------+-------------------------------+---
20951 | cgspace | 10966 | 18205    | cgspace |                  | 127.0.0.1   |                 | 37731       | 2015-11-25 13:13:02.837624+00 |                               | 20
20951 | cgspace | 10967 | 18205    | cgspace |                  | 127.0.0.1   |                 | 37737       | 2015-11-25 13:13:03.069421+00 |                               | 20
...
</code></pre></li>
<li><p>There is a relevant Jira issue about this: <a href="https://jira.duraspace.org/browse/DS-1458">https://jira.duraspace.org/browse/DS-1458</a></p></li>
<li><p>It seems there is some sense in changing DSpace’s default <code>db.maxidle</code> from unlimited (-1) to something like 8 (Tomcat default) or 10 (Confluence default)</p></li>
<li><p>Change <code>db.maxidle</code> from -1 to 10, reduce <code>db.maxconnections</code> from 90 to 50, and restart postgres and tomcat7</p></li>
<li><p>Also redeploy DSpace Test with a clean sync of CGSpace and mirror these database settings there as well</p></li>
<li><p>Also deploy the nginx fixes for the <code>try_files</code> location block as well as the expires block</p></li>
</ul>
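<ul>
<li><p>Putting the proposed nginx changes together, the static-assets configuration might end up looking roughly like this (a sketch only, assuming <code>extra-security.conf</code> contains our usual security headers and that the <code>expires</code> value is still to be decided; research and test before deploying):</p>
<pre><code># static assets we can load from the file system directly with nginx
# (static removed, aspects/Statistics added)
location ~ /(themes|aspects/ReportingSuite|aspects/Statistics) {
    try_files $uri @tomcat;

    # re-add security headers, since add_header in a child block
    # stops inheritance from the parent
    include extra-security.conf;
}

# cache static assets, now including WOFF fonts
location ~* \.(?:ico|css|js|gif|jpe?g|png|woff)$ {
    expires 1d;
    try_files $uri @tomcat;
}
</code></pre></li>
</ul>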
<h2 id="2015-11-26">2015-11-26</h2>
<ul>
<li>CGSpace behaving much better since changing <code>db.maxidle</code> yesterday, but still two up/down notices from monitoring this morning (better than 50!)</li>
<li>CCAFS colleagues mentioned that the REST API is very slow, 24 seconds for one item</li>
<li><p>Not as bad for me, but still unsustainable if you have to get many:</p>
<pre><code>$ curl -o /dev/null -s -w %{time_total}\\n https://cgspace.cgiar.org/rest/handle/10568/32802?expand=all
8.415
</code></pre></li>
<li><p>Monitoring e-mailed in the evening to say CGSpace was down</p></li>
<li><p>Idle connections in PostgreSQL again:</p>
<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep cgspace | grep -c idle
66
</code></pre></li>
<li><p>At the time, the current DSpace pool size was 50…</p></li>
<li><p>I reduced the pool back to the default of 30, and reduced the <code>db.maxidle</code> setting from 10 to 8</p></li>
</ul>
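<ul>
<li><p>A single request is a noisy measurement, so repeating it a few times gives a better picture of typical REST latency (a sketch, using the same handle tested above):</p>
<pre><code>$ for i in 1 2 3 4 5; do curl -o /dev/null -s -w '%{time_total}\n' 'https://cgspace.cgiar.org/rest/handle/10568/32802?expand=all'; done
</code></pre></li>
</ul>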
<h2 id="2015-11-29">2015-11-29</h2>
<ul>
<li>Still more alerts that CGSpace has been up and down all day</li>
<li><p>Current database settings for DSpace:</p>
<pre><code>db.maxconnections = 30
db.maxwait = 5000
db.maxidle = 8
db.statementpool = true
</code></pre></li>
<li><p>And idle connections:</p>
<pre><code>$ psql -c 'SELECT * from pg_stat_activity;' | grep cgspace | grep -c idle
49
</code></pre></li>
<li><p>Perhaps I need to start drastically increasing the connection limits—like to 300—to see if DSpace’s thirst can ever be quenched</p></li>
<li><p>On another note, SUNScholar’s notes suggest adjusting some other postgres variables: <a href="http://wiki.lib.sun.ac.za/index.php/SUNScholar/Optimisations/Database">http://wiki.lib.sun.ac.za/index.php/SUNScholar/Optimisations/Database</a></p></li>
<li><p>This might help with REST API speed (which I mentioned above and still need to do real tests)</p></li>
</ul>
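<ul>
<li><p>Rather than just counting idle connections, it would be useful to see how long they have actually been idle; something like this should work (a sketch; <code>state_change</code> exists in PostgreSQL 9.2 and newer):</p>
<pre><code>$ psql -c "SELECT now() - state_change AS idle_for FROM pg_stat_activity WHERE state = 'idle' ORDER BY idle_for DESC LIMIT 5;"
</code></pre></li>
</ul>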
</article>

</div> <!-- /.blog-main -->

<aside class="col-sm-3 ml-auto blog-sidebar">

<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/posts/">Posts</a></li>
<li><a href="/cgspace-notes/2019-09/">September, 2019</a></li>
<li><a href="/cgspace-notes/2019-08/">August, 2019</a></li>
<li><a href="/cgspace-notes/2019-07/">July, 2019</a></li>
<li><a href="/cgspace-notes/2019-06/">June, 2019</a></li>
</ol>
</section>

<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>

</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->

<footer class="blog-footer">
<p>
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>

</body>
</html>