content/post: Fix image links

2016-09-21 15:19:43 +03:00
parent 08f725b92d
commit f50e2455a4
9 changed files with 27 additions and 27 deletions


@@ -38,7 +38,7 @@ Exception in thread "Lucene Merge Thread #19" org.apache.lucene.index.MergePolic
- Start cleaning up the configuration for Atmire's CUA module ([#184](https://github.com/ilri/DSpace/issues/185))
- It is very messed up because some labels are incorrect, fields are missing, etc
-![Mixed up label in Atmire CUA](../images/2016/03/cua-label-mixup.png)
+![Mixed up label in Atmire CUA](2016/03/cua-label-mixup.png)
- Update documentation for Atmire modules
@@ -57,7 +57,7 @@ Exception in thread "Lucene Merge Thread #19" org.apache.lucene.index.MergePolic
- Make titles in Discovery and Browse by more consistent (singular, sentence case, etc) ([#186](https://github.com/ilri/DSpace/issues/186))
- Also four or so center-specific subject strings were missing for Discovery
-![Missing XMLUI string](../images/2016/03/missing-xmlui-string.png)
+![Missing XMLUI string](2016/03/missing-xmlui-string.png)
## 2016-03-15
@@ -105,11 +105,11 @@ Exception in thread "Lucene Merge Thread #19" org.apache.lucene.index.MergePolic
- Discuss thumbnails with Francesca from Bioversity
- Some of their items end up with thumbnails that have a big white border around them:
-![Excessive whitespace in thumbnail](../images/2016/03/bioversity-thumbnail-bad.jpg)
+![Excessive whitespace in thumbnail](2016/03/bioversity-thumbnail-bad.jpg)
- Turns out we can add `-trim` to the GraphicsMagick options to trim the whitespace
-![Trimmed thumbnail](../images/2016/03/bioversity-thumbnail-good.jpg)
+![Trimmed thumbnail](2016/03/bioversity-thumbnail-good.jpg)
- Command used:
@@ -135,14 +135,14 @@ $ gm convert -trim -quality 82 -thumbnail x300 -flatten Descriptor\ for\ Butia_E
- Google says the first time it saw this particular error was September 29, 2015... so maybe it accidentally saw it somehow...
- On a related note, we have 51,000 items indexed from the sitemap, but 500,000 items in the Google index, so we DEFINITELY have a problem with duplicate content
-![CGSpace pages in Google index](../images/2016/03/google-index.png)
+![CGSpace pages in Google index](2016/03/google-index.png)
- Turns out this is a problem with DSpace's `robots.txt`, and there has been a Jira ticket open since December 2015: https://jira.duraspace.org/browse/DS-2962
- I am not sure if I want to apply it yet
- For now I've just set a bunch of these dynamic pages to not appear in search results by using the URL Parameters tool in Webmaster Tools
-![URL parameters cause millions of dynamic pages](../images/2016/03/url-parameters.png)
-![Setting pages with the filter_0 param not to show in search results](../images/2016/03/url-parameters2.png)
+![URL parameters cause millions of dynamic pages](2016/03/url-parameters.png)
+![Setting pages with the filter_0 param not to show in search results](2016/03/url-parameters2.png)
- Move AVCD collection to new community and update `move_collection.sh` script: https://gist.github.com/alanorth/392c4660e8b022d99dfa
- It seems Feedburner can do HTTPS now, so we might be able to update our feeds and simplify the nginx configs