{"id":928,"date":"2016-11-17T13:53:07","date_gmt":"2016-11-17T13:53:07","guid":{"rendered":"http:\/\/tpodolak.com\/blog\/?p=928"},"modified":"2016-11-17T13:53:07","modified_gmt":"2016-11-17T13:53:07","slug":"elastic-stack-reading-logs-queue","status":"publish","type":"post","link":"https:\/\/tpodolak.com\/blog\/2016\/11\/17\/elastic-stack-reading-logs-queue\/","title":{"rendered":"Logstash &#8211; reading logs from RabbitMQ"},"content":{"rendered":"<h3>1. Introduction<\/h3>\n<p>In my previous post, I&#8217;ve shown how to configure <i>Logstash<\/i> to parse logs from files. This is pretty useful; however, if your application is deployed on multiple servers, you usually log to some kind of central log storage &#8211; in my case to a queue, <i>RabbitMQ<\/i> to be more specific. In this post, I will show how to configure <i>Logstash<\/i> so it reads the logs from that queue.<\/p>\n<h3>2. Preparing the queue<\/h3>\n<p>Before we move to the <i>Logstash<\/i> configuration, we first have to prepare a <i>RabbitMQ<\/i> test instance. If you don&#8217;t have <i>RabbitMQ<\/i> yet, go to <a href=\"https:\/\/www.rabbitmq.com\/download.html\">this<\/a> website and install the queue. Once the installation is done, go to the installation folder (<i>C:\\Program Files\\RabbitMQ Server\\rabbitmq_server-3.6.5\\sbin<\/i> in my case) and run the following command in the console<\/p>\n<pre lang=\"bash\">\r\nrabbitmq-plugins.bat enable rabbitmq_management\r\n<\/pre>\n<p>This command enables the <i>RabbitMQ<\/i> management website, which makes it easier to see what is going on in a given queue. In the next step, we have to prepare the queue the logs will be sent to. You can do it via the website we&#8217;ve just enabled (<i>http:\/\/localhost:15672\/<\/i>) or via the <i>RabbitMQ<\/i> admin console. As I prefer to automate things as much as possible, I will do it via the command line.<\/p>\n<p>
What is quite unusual when it comes to the <i>RabbitMQ CLI<\/i> is the fact that it is a Python script you have to download and run locally (it is not an executable). The script can be found on the management site under this <a href=\"http:\/\/localhost:15672\/cli\/rabbitmqadmin\">address<\/a>. Once the script is downloaded (in my case it is saved as <i>rabbitmqadmin.py<\/i>) you can start preparing the necessary elements: the exchange, the queue and the binding.<\/p>\n<pre lang=\"bash\">\r\npython rabbitmqadmin.py declare exchange name=logger type=topic -u username -p password\r\n<\/pre>\n<pre lang=\"bash\">\r\npython rabbitmqadmin.py declare queue name=MyAppLogginQueue auto_delete=false durable=true -u username -p password\r\n<\/pre>\n<pre lang=\"bash\">\r\npython rabbitmqadmin.py declare binding source=logger destination=MyAppLogginQueue routing_key=MyApp -u username -p password\r\n<\/pre>\n<p>As you can see, I&#8217;ve created an exchange called <i>logger<\/i> which is bound to the <i>MyAppLogginQueue<\/i> queue using the <i>MyApp<\/i> routing key. This means that every message sent to the <i>logger<\/i> exchange with the routing key <i>MyApp<\/i> will be pushed to <i>MyAppLogginQueue<\/i>.<\/p>\n<h3>3. Preparing Logstash<\/h3>\n<p>The <i>Logstash<\/i> configuration will be a modified version of my <a href=\"http:\/\/tpodolak.com\/blog\/2016\/11\/03\/logstash-parsing-multiline-log-entries\/\">previous<\/a> config. I will just add another input source. Here is the basic usage<\/p>\n<pre lang=\"javascript\">\r\ninput {\r\n\trabbitmq {\r\n        host => \"localhost\"\r\n        queue => \"MyAppLogginQueue\"\r\n        heartbeat => 30\r\n        durable => true\r\n        password => \"password\"\r\n        user => \"username\"\r\n    }\r\n}\r\n<\/pre>\n<p>As you can see, we will be consuming messages from the <i>MyAppLogginQueue<\/i> queue deployed on <i>localhost<\/i>. For the <i>password<\/i> and <i>user<\/i> properties use your own credentials.<\/p>\n<p>
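Before testing the full pipeline, you can check that the queue itself receives messages by publishing a test entry from code. Here is a minimal sketch using the <i>pika<\/i> client (<i>pika<\/i>, the helper names and the credentials are my assumptions, not part of the original setup):<\/p>\n<pre lang=\"python\">\r\ndef build_log_line(correlation_id, level, message):\r\n    # assumed helper - mirrors the single-line log format used in this post\r\n    fmt = 'TimeStamp=2016-11-01 00:13:01.1669 CorrelationId=%s Level=%s Message=%s'\r\n    return fmt % (correlation_id, level, message)\r\n\r\ndef publish_test_message(host='localhost', user='username', password='password'):\r\n    import pika  # third-party AMQP client (pip install pika), imported lazily\r\n    params = pika.ConnectionParameters(host, credentials=pika.PlainCredentials(user, password))\r\n    connection = pika.BlockingConnection(params)\r\n    channel = connection.channel()\r\n    # publish through the exchange - the MyApp routing key matches the binding declared earlier\r\n    channel.basic_publish(exchange='logger', routing_key='MyApp',\r\n                          body=build_log_line('77530786-8e6b-45c2-bbc1-31837d911c14', 'INFO', 'test'),\r\n                          properties=pika.BasicProperties(delivery_mode=2))  # 2 = persistent\r\n    connection.close()\r\n<\/pre>\n<p>If the routing works, the message count for <i>MyAppLogginQueue<\/i> shown on the management site should go up.<\/p>\n<p>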
That is basically it, so now it is time to see if everything is working.<\/p>\n<h3>4. Testing the configuration<\/h3>\n<p>In order to test the configuration you have to run <i>Elasticsearch<\/i> and <i>Kibana<\/i> and use the new config for <i>Logstash<\/i>. I&#8217;ve shown how to do it in one of my <a href=\"http:\/\/tpodolak.com\/blog\/2016\/11\/02\/getting-started-elastic-stack\/\">recent posts<\/a>. For sending messages to the queue I will just use the <i>RabbitMQ<\/i> management website <i>API<\/i>. The <i>API<\/i> exposes the<\/p>\n<pre lang=\"bash\">\r\napi\/exchanges\/%2F\/{exchange_name}\/publish\r\n<\/pre>\n<p>endpoint, which accepts <i>POST<\/i> requests and can be used for publishing messages to a given exchange. In my case the <i>POST<\/i> body will look as follows<\/p>\n<pre lang=\"javascript\">\r\n{\r\n  \"vhost\": \"\/\",\r\n  \"name\": \"logger\",\r\n  \"properties\": {\r\n    \"delivery_mode\": 2,\r\n    \"headers\": {}\r\n  },\r\n  \"routing_key\": \"MyApp\",\r\n  \"delivery_mode\": \"2\",\r\n  \"payload\": \"TimeStamp=2016-11-01 00:13:01.1669 CorrelationId=77530786-8e6b-45c2-bbc1-31837d911c14 Level=INFO Message=Request completed with status code: 200\",\r\n  \"headers\": {},\r\n  \"props\": {},\r\n  \"payload_encoding\": \"string\"\r\n}\r\n<\/pre>\n<p>and I will be sending it to<\/p>\n<pre lang=\"bash\">\r\nhttp:\/\/localhost:15672\/api\/exchanges\/%2F\/logger\/publish\r\n<\/pre>\n<p>Note that I will be sending messages to the exchange, not to the queue itself. The exchange&#8217;s responsibility is to route the message to all bound queues.<\/p>\n<p>
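This publish call can also be scripted instead of clicked through; below is a rough sketch using only Python&#8217;s standard library (the function names and credentials are my assumptions):<\/p>\n<pre lang=\"python\">\r\nimport base64\r\nimport json\r\nfrom urllib import request\r\n\r\ndef build_publish_body(payload):\r\n    # assumed helper - mirrors the fields used in the POST body above\r\n    return {'properties': {'delivery_mode': 2, 'headers': {}},\r\n            'routing_key': 'MyApp',\r\n            'payload': payload,\r\n            'payload_encoding': 'string'}\r\n\r\ndef publish(payload, host='localhost', user='username', password='password'):\r\n    # %2F is the URL-encoded default vhost '\/'\r\n    url = 'http:\/\/%s:15672\/api\/exchanges\/%%2F\/logger\/publish' % host\r\n    data = json.dumps(build_publish_body(payload)).encode('utf-8')\r\n    req = request.Request(url, data=data, headers={'Content-Type': 'application\/json'})\r\n    token = base64.b64encode(('%s:%s' % (user, password)).encode('utf-8')).decode('ascii')\r\n    req.add_header('Authorization', 'Basic ' + token)\r\n    with request.urlopen(req) as response:\r\n        return json.loads(response.read().decode('utf-8'))\r\n<\/pre>\n<p>If at least one queue received the message, the response should contain <i>routed: true<\/i>.<\/p>\n<p>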
Here is how it looks in practice<br \/>\n<a href=\"http:\/\/tpodolak.com\/blog\/wp-content\/uploads\/2016\/11\/elastic-stack-reading-logs-queue\/ezgif.com-gif-maker.gif\" rel=\"attachment wp-att-942\"><img decoding=\"async\" src=\"http:\/\/tpodolak.com\/blog\/wp-content\/uploads\/2016\/11\/elastic-stack-reading-logs-queue\/ezgif.com-gif-maker.gif\" alt=\"ezgif-com-gif-maker\" width=\"960\" class=\"aligncenter size-full wp-image-942\" \/><\/a><br \/>\nAs you can see, our configuration is valid and messages show up on <i>Kibana&#8217;s<\/i> dashboard almost in real time.<\/p>\n<p>The full <i>Logstash<\/i> config can be found <a href=\"https:\/\/github.com\/tpodolak\/Blog\/tree\/master\/ElasticStackReadingFromRabbitMQ\">here<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>1. Introduction In my previous post, I&#8217;ve shown how to configure Logstash to parse logs from files. This is pretty useful; however, if your application is deployed on multiple servers, you usually log to some kind of central log storage &#8211; in my case to a queue, RabbitMQ to be more specific. 
In this post, I [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[309],"tags":[311,312,310],"class_list":["post-928","post","type-post","status-publish","format-standard","hentry","category-elasticstack","tag-elasticsearch","tag-kibana","tag-logstash"],"_links":{"self":[{"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/posts\/928","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/comments?post=928"}],"version-history":[{"count":30,"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/posts\/928\/revisions"}],"predecessor-version":[{"id":961,"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/posts\/928\/revisions\/961"}],"wp:attachment":[{"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/media?parent=928"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/categories?post=928"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/tpodolak.com\/blog\/wp-json\/wp\/v2\/tags?post=928"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}