Friday, November 1, 2019

PFSense logs in Graylog and Grafana using Elasticsearch

I recently felt the need to experiment with various "stacks" after seeing a Medium article on setting up these components (among others) on a Rock64 board (basically a souped-up Raspberry Pi). In the process I stumbled across a great video on YouTube showing what you can do when Grafana is added to the mix, and I was hooked.

Initially, I tried to follow the instructions from the Medium article to build the Elasticsearch+Graylog part of the stack myself, but ran into some issues and decided it wasn't worth the pain. Instead, I used the prebuilt OVA image from Graylog, which uses an Ubuntu server base OS and has a basic Elasticsearch+Graylog system preconfigured.

From there, I still needed to configure the various customizations in Graylog and get Grafana going. I found various resources, but even the most helpful site wasn't all that helpful: the information wasn't quite complete, and it wasn't entirely in order. However, the fork of the original pfsense-graylog implementation by opc40772 on GitHub, linked from that blog, was critical to getting everything going.

So, here are the steps I would take if I were to do this again:

Note: I know this is lacking screenshots. I'll add some as I have more time; I really wanted to get the content down before I forgot everything I knew.

  1. Install and configure the Graylog OVA image on my VM server. This includes setting the hostname and static IP as I pleased.
  2. Change the port for Graylog to 9400, as Cerebro uses port 9000, which is also Graylog's default. (I later found that you can change the default port for Cerebro instead; it really doesn't matter which one you move.)
    1. Edit /etc/graylog/server/server.conf
    2. Find "http_bind_address"
    3. Change the value to "0.0.0.0:9400". This allows external connections to Graylog, which is useful if you're not doing all of this configuration on a server with a GUI (Ubuntu Server, for example, has none). :)
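For reference, the relevant part of /etc/graylog/server/server.conf ends up looking something like this (the surrounding lines will vary with your Graylog version):

```ini
# Bind to all interfaces on the non-default port 9400
# (0.0.0.0 accepts connections from other machines, not just localhost)
http_bind_address = 0.0.0.0:9400
```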
  3. Download the pfsense-graylog content pack: "wget https://github.com/devopstales/pfsense-graylog/archive/master.zip".
  4. Unzip the content pack.
  5. Download and install Cerebro.
    1. Cerebro releases can be found at https://github.com/lmenezes/cerebro/releases/; for Ubuntu or Debian, you can simply run "wget https://github.com/lmenezes/cerebro/releases/download/v0.8.4/cerebro_0.8.4_all.deb", then "dpkg -i cerebro_0.8.4_all.deb". For other operating systems, the download target and install command will change, and of course Cerebro could release v0.8.5 tomorrow, so make sure you check the original link.
  6. Open a web browser to your server's IP on port 9000; you should hopefully see the Cerebro dashboard.
  7. Import the index template provided by pfsense-graylog into Elasticsearch using Cerebro. This template provides the fields needed for parsing and using the PFSense data in Grafana.
    1. In the Cerebro dashboard, navigate to "more" > "index templates" (image needed here)
    2. On the right-hand side under "create new template", provide the name "pfsense-custom".
    3. For the template data in the new template, copy and paste the contents of the file at "pfsense-graylog/Elasticsearch_pfsense_custom_template/pfsense_custom_template_es6.json" (downloaded in step 3).
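If you'd rather skip the copy-and-paste, the same template can be pushed straight into Elasticsearch's template API with curl. This is a sketch, assuming Elasticsearch 6.x listening on localhost:9200 and the content pack unzipped in your current directory (adjust the path to wherever you actually unzipped it):

```shell
# PUT the pfsense-custom index template directly into Elasticsearch
curl -XPUT 'http://localhost:9200/_template/pfsense-custom' \
  -H 'Content-Type: application/json' \
  -d @pfsense-graylog/Elasticsearch_pfsense_custom_template/pfsense_custom_template_es6.json
```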
  8. Download the GeoIP database using "wget -t0 -c http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz".
  9. Extract the downloaded GeoIP database using "tar -xvf GeoLite2-City.tar.gz"
  10. Copy the .mmdb file to the Graylog server directory using "cp GeoLite2-City_*/GeoLite2-City.mmdb /etc/graylog/server"
  11. Also copy the service-names-port-numbers file to the Graylog server directory using "cp service-names-port-numbers.csv /etc/graylog/server/"
  12. Restart Graylog: "systemctl restart graylog-server" on Ubuntu/Debian.
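A quick way to confirm the restart worked and the new port took effect is to hit the Graylog REST API root from the server itself; it should respond with a small JSON blob (including the server version) rather than a connection error:

```shell
# If this hangs or refuses the connection, check http_bind_address and the service status
curl -s http://localhost:9400/api/
```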
  13. Open Graylog in a web browser at your server's IP port 9400.
  14. Navigate to "System" > "Content Packs", then click "Upload".
    1. Choose file "pfsense-graylog/pfsense_content_pack/graylog3/3-pfsense-analysis.json"
    2. A new content pack should appear entitled "3 pfsense analysis"; click the Install button.
  15. Navigate to "System" > "Inputs", a new input should appear entitled "pfsense".
  16. Navigate to "System" > "Indices", a new index should appear entitled "pfsense-logs".
  17. Navigate to "Streams", a new stream should appear entitled "pfsense".
  18. Navigate to "System" > "Configurations".
    1. Under "Geo-Location Processor", click "Update", then check the box to enable the processor.
    2. Under "Message Processors Configuration", click "Update", then change the order of the processors as follows:
      1. AWS Instance Name Lookup
      2. Message Filter Chain
      3. Pipeline Processor
      4. GeoIP Resolver
  19. Here's where things may diverge a bit for you. In my particular setup, my PFSense box operates in my local timezone while my Ubuntu server uses UTC, so the logs I see in Graylog all carry timestamps in my local timezone; for some reason this seems to break Graylog's "last x minutes" search. When I got to setting up Grafana, I then had to configure it to always display times as UTC, even though they aren't really UTC. I messed with this a bit but ultimately haven't resolved it totally to my liking; it "works" as-is for now.
    1. Nevertheless, navigate to "System" > "Pipelines", click the "pfsense" pipeline, then click "timestamp_pfsense_for_grafana".
    2. On line 6, edit the timezone to indicate your local timezone. For example, I entered "America/Detroit".
    3. Click "Save".
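To see why mixed timezones cause this kind of confusion, consider what happens when a timestamp written in local time gets read as UTC. A quick standard-library sketch (Python 3.9+ for zoneinfo; America/Detroit is my timezone from the step above):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# pfSense stamps a log entry at noon local time
# (America/Detroit is UTC-4 on this date, still in daylight saving time)
local = datetime(2019, 11, 1, 12, 0, tzinfo=ZoneInfo("America/Detroit"))

# The same instant expressed in UTC is 16:00
utc = local.astimezone(timezone.utc)
print(utc.strftime("%H:%M"))  # 16:00
```

If Graylog instead treats that "12:00" as if it were already UTC, the event lands four hours in the past, which is exactly the sort of skew that makes relative time-range searches misbehave.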
  20. Download Grafana using "wget <package url>" (grab the current package URL from Grafana's download page).
  21. Install Grafana. On Ubuntu/Debian: 
    1. "apt install -y adduser libfontconfig1"
    2. "dpkg -i grafana_<version>.deb"
  22. Install the Grafana plugins you'll want for the PFSense dashboard, then restart Grafana:
    1. "grafana-cli plugins install grafana-piechart-panel"
    2. "grafana-cli plugins install grafana-worldmap-panel"
    3. "grafana-cli plugins install savantly-heatmap-panel"
    4. "systemctl restart grafana-server"
  23. Configure PFSense to push logs to the Graylog server:
    1. Log into PFSense
    2. Under "Status" > "System Logs" > "Settings":
      1. Check the box for "Enable Remote Logging"
      2. Set Source Address as needed for your particular system (default should be fine).
      3. Set Remote log server to the IP of your Graylog server, port 5442. For example, if the server IP was 192.168.1.1, the field should show "192.168.1.1:5442".
      4. Under Remote Syslog Contents, check the "Everything" box.
      5. Click "Save".
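Before digging through Graylog if nothing shows up, it can help to confirm that syslog traffic reaches the input at all. On a Linux box on the same network, util-linux's logger can fire a one-off test message; this assumes the pfsense input listens for UDP syslog on port 5442, and uses the example server IP from the step above:

```shell
# Send a single UDP syslog test message to the Graylog pfsense input
logger --udp --server 192.168.1.1 --port 5442 "graylog input test"
```

The message should appear in the pfsense stream within a few seconds.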
  24. Open Grafana in your web browser, http://<server ip>:3000
  25. On the left-hand side, hover over the gear icon to get to "Configuration", then click "Data Sources".
    1. Configure a new Elasticsearch data source:
      1. Name: PFSense Graylog
      2. Type: Elasticsearch
      3. URL: <server ip>:9200
      4. Access: Browser
      5. Index Name: pfsense_*
      6. Time Field Name: here again I diverge; I used "timestamp", but the original instructions say to use "real_timestamp".
      7. Click "Save & Test". If you see a red message, you have a problem. If not, yay!
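If you prefer configuration files over clicking, Grafana can also provision this data source from YAML at startup. A sketch of a file such as /etc/grafana/provisioning/datasources/pfsense.yaml, mirroring the choices above (note that the UI's "Browser" access mode is called "direct" in provisioning files, and esVersion 60 means Elasticsearch 6.x):

```yaml
apiVersion: 1
datasources:
  - name: PFSense Graylog
    type: elasticsearch
    access: direct          # "Browser" access in the UI
    url: http://localhost:9200
    database: "pfsense_*"   # the index name pattern
    jsonData:
      esVersion: 60         # Elasticsearch 6.x
      timeField: timestamp  # or real_timestamp, per the original instructions
```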
  26. If Grafana failed to connect in step 25, you may have an issue with the CORS (cross-origin request) protections in Elasticsearch. This seems to come up primarily when all of these servers run on the same machine. Adding the following to /etc/elasticsearch/elasticsearch.yml (and restarting Elasticsearch) resolved the problem for me, but it's ultimately not a security "best practice".
    1. # Leave CORS enabled, but allow requests from any origin (basically disabling the protection)
      http.cors.enabled : true
      http.cors.allow-origin : "*"
      http.cors.allow-methods : OPTIONS, HEAD, GET, POST, PUT, DELETE
      http.cors.allow-headers : X-Requested-With,X-Auth-Token,Content-Type, Content-Length
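You can check whether the change took effect by asking Elasticsearch for its response headers directly; this assumes it's listening on localhost:9200:

```shell
# After restarting Elasticsearch, look for Access-Control-Allow-* headers in the output
curl -s -D - -o /dev/null -H 'Origin: http://example.com' http://localhost:9200/ | grep -i access-control
```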
  27. If you're new to Grafana, I would suggest starting with a prebuilt dashboard and tinkering around. Click the + icon on the left, then select "Import". Use dashboard ID "5420" to download a pretty interesting and useful basic dashboard. I started with this one and HEAVILY customized it; I can provide a link to mine later once I get around to registering with Grafana. Dashboard configuration could be another post all in itself, and I think that topic is best left to other sources.
That's it! 27 steps later and you have a working GEG stack (not quite as catchy as ELK, huh?).

2 comments:

  1. Had the same problem and went through so many Git projects for this. I'm an MS guy, so I'm patting my own shoulder now: I installed everything on a single Ubuntu box and used "timestamp" instead of "real_timestamp" as the time field name in the Grafana data source.

  2. Hi, thank you for the helpful information. When I try to create the new Cerebro "pfsense-custom" index template, pressing Create produces the error message "Error creating template". Any clue?
