
Pushing logs from OpenShift’s EFK Stack to Logstash

If you have your own ELK (ElasticSearch, Logstash, Kibana) stack implementation, then you may want to forward the logs from OpenShift’s bundled EFK (ElasticSearch, Fluentd, Kibana) stack.

This should be simple, but a clear recipe is surprisingly hard to find. What’s described here works for the EFK stack distributed with OpenShift 3.9 and recent versions of Logstash.

First, configure your Logstash. In the minimal configuration, Logstash accepts logs with the fluent codec and prints each message to standard output as it is received. The following one-liner is the minimum required for Logstash to accept messages:

# Minimal Logstash Implementation

$ ./bin/logstash -e 'input { tcp { port => 8080 codec => fluent } } output { stdout {} }'
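If you would rather keep the pipeline in a file than on the command line, the equivalent configuration looks like this (a sketch; the filename is arbitrary). Note that Logstash settings inside a plugin block are whitespace-separated, with no commas:

# logstash-fluent.conf — same pipeline as the one-liner above
input {
  tcp {
    port => 8080
    codec => fluent
  }
}
output {
  stdout {}
}

Start it with ./bin/logstash -f logstash-fluent.conf.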

Next, edit the secure-forward.conf file in the logging/logging-fluentd ConfigMap to contain the following:

<store>
  @type forward
  send_timeout 60s
  recover_wait 10s
  hard_timeout 60s
  # heartbeat_type must be none, because Logstash
  # doesn't implement Fluentd's heartbeat protocol
  heartbeat_type none

  <server>
    name your-logstash-host
    host 10.0.0.1
    port 8080
  </server>
</store>
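One way to edit that file is through oc (a sketch, assuming the stock logging project name from the ConfigMap reference above):

# Open the ConfigMap for editing; secure-forward.conf is one of its keys
$ oc edit configmap logging-fluentd -n logging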

After restarting the fluentd DaemonSet pods, logs should start flowing to Logstash.
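There is no rolling-restart command for a DaemonSet in OpenShift 3.9, so one approach is simply to delete the pods and let the DaemonSet recreate them with the updated ConfigMap mounted (a sketch, assuming the stock component=fluentd label on the logging pods):

# Delete the fluentd pods; the DaemonSet recreates them with the new config
$ oc delete pods -l component=fluentd -n logging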

It’s important to note that this is a proof of concept: the connection is not encrypted and could use some tuning. YMMV!