Tshark is the terminal version of the packet capture application Wireshark. By combining Tshark with an ELK stack (Elasticsearch, Logstash, Kibana), you can visualize your capture results in graphs. In this post I will explain how to capture network traffic, send it to Elasticsearch using Logstash, and display graphs in Kibana. The client runs Windows; the ELK server runs on Ubuntu.
The following command captures network traffic for 1 minute, writes it to a .csv file, and then starts over, again and again. Restarting tshark every minute keeps its memory use in check and prevents the .csv file from growing too large. Install Wireshark, open a command prompt in C:\Program Files\Wireshark and run tshark:
for /L %G in (0,0,1) do tshark -a duration:60 -i 1 -t ad -l -T fields -E separator=, -E quote=d -e _ws.col.Time -e _ws.col.Source -e _ws.col.Destination -e ip.src -e ip.dst -e tcp.srcport -e tcp.dstport -e _ws.col.Protocol -e ip.len -e _ws.col.Info > C:\Windows\Temp\tshark.csv
The resulting .csv file will contain lines that look like this:
"2016-02-12 20:04:12.137523","22.214.171.124","192.168.1.1","22.214.171.124","192.168.1.1","443","63103","TLSv1.2","987","Application Data"
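To sanity-check that the values line up with the -e fields passed to tshark, you can parse such a line with Python's csv module. This is a quick sketch; the field names simply mirror the tshark options, and the sample line is made-up example data:

```python
import csv
import io

# Field names in the same order as the -e options passed to tshark
FIELDS = ["col.Time", "col.Source", "col.Destination", "ip.src", "ip.dst",
          "tcp.srcport", "tcp.dstport", "col.Protocol", "ip.len", "col.Info"]

line = ('"2016-02-12 20:04:12.137523","22.214.171.124","192.168.1.1",'
        '"22.214.171.124","192.168.1.1","443","63103","TLSv1.2","987",'
        '"Application Data"')

# csv.reader handles the double quoting that -E quote=d adds around each value
row = next(csv.reader(io.StringIO(line)))
record = dict(zip(FIELDS, row))
print(record["col.Protocol"])   # TLSv1.2
print(int(record["ip.len"]))    # 987
```

If a line produces the wrong number of columns here, Logstash's csv filter will mis-assign fields in the same way, so this is a cheap check before shipping data.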
On the Windows client, either Logstash or Filebeat needs to be installed to ship the .csv file to Elasticsearch. Filebeat is designed for exactly this; you can install it using a Puppet module. On the ELK server, Logstash picks up the beat and applies a filter. Use the csv filter to assign the correct field names to the values in the .csv file, convert ip.len to an integer, and parse the timestamp with the date filter:
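On the client side, a minimal Filebeat configuration only needs to watch the .csv file and point at the Logstash host. This is a sketch following the Filebeat 1.x layout; the hostname and port are assumptions, so adjust them to your setup:

```yaml
# filebeat.yml -- sketch; the ELK hostname and port 5044 are assumptions
filebeat:
  prospectors:
    - paths:
        - C:\Windows\Temp\tshark.csv
      input_type: log
output:
  logstash:
    hosts: ["elk.example.com:5044"]
```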
filter {
  csv {
    source => "message"
    columns => [ "col.Time", "col.Source", "col.Destination", "ip.src", "ip.dst", "tcp.srcport", "tcp.dstport", "col.Protocol", "ip.len", "col.Info" ]
  }
  mutate {
    convert => [ "ip.len", "integer" ]
  }
  date {
    match => [ "col.Time", "YYYY-MM-dd HH:mm:ss.SSSSSS" ]
  }
}
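The filter settings above live inside a Logstash pipeline configuration file on the ELK server. A minimal sketch of the surrounding input and output sections, assuming the beats input plugin on its default port 5044 and Elasticsearch running on the same host, could look like:

```
# /etc/logstash/conf.d/tshark.conf -- sketch; port and hosts are assumptions
input {
  beats {
    port => 5044
  }
}

filter {
  # the csv, mutate and date filters shown above go here
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```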
Send the results to Elasticsearch, and Kibana will display the data. Now you can start analyzing your network traffic.