
Setting Up Filebeat and Logstash in DevOps

When an application generates logs, it is important to collect, process, and analyze them. In DevOps, the ELK stack (Elasticsearch, Logstash, Kibana) along with Filebeat is often used for this purpose. Filebeat reads logs from your server, Logstash processes and filters the data, and Elasticsearch stores it for further analysis.

You will learn how to configure Filebeat and Logstash to work with our Flask application so that all user events are properly collected and sent to the ELK stack.

How It Works

Filebeat acts as an agent on the server running your application. It monitors log files and sends new entries to Logstash. Logstash receives the logs, processes them (for example, parsing JSON structures), and then forwards them to Elasticsearch. Elasticsearch stores and organizes the data so you can search, analyze, and visualize it easily in Kibana.

In our example, Filebeat will monitor the app.log file generated by the Flask application. Logstash will process these logs and send them to Elasticsearch. Elasticsearch creates a separate index for each day, named flask-logs-YYYY.MM.DD. These daily indexes help you organize the logs by date, making it easier to search for events, analyze trends, or troubleshoot issues from a specific day without having to sift through all the logs at once.
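For illustration, an entry in app.log might look like the line below. The exact field names depend on how the Flask application formats its logs; these are assumptions:

```json
{"timestamp": "2024-05-12T09:15:03Z", "level": "INFO", "event": "user_login", "user_id": 42}
```

Because each entry is a single JSON object per line, Logstash can parse it into structured fields rather than treating it as plain text.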

Configuring Filebeat

First, create a configuration file for Filebeat. Make sure you are in the elk-demo folder that you created earlier. Then run the following command to create and open the Filebeat configuration file:
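A typical command, assuming the nano editor (the Ctrl + O / Ctrl + X shortcuts mentioned below are nano's):

```shell
nano filebeat.yml
```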

This will open the editor where you can paste the Filebeat configuration.

Note

After pasting, save the file with Ctrl + O, press Enter, and exit with Ctrl + X.

In the editor, insert the following configuration:

filebeat.yml
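The snippet itself is rendered by an interactive widget and is not reproduced here. A minimal filebeat.yml consistent with the surrounding description would look like this; the `logstash` hostname is an assumption, typical when the services run under Docker Compose:

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /logs/app.log      # the log file written by the Flask application

output.logstash:
  hosts: ["logstash:5044"]  # Logstash listens on port 5044 for Beats input
```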

This configuration tells Filebeat to monitor the file /logs/app.log. Every new log entry is sent to Logstash, which listens on port 5044. This ensures that all events from the application are automatically sent for processing and indexing.

Configuring Logstash

Next, create a configuration file for Logstash. Make sure you are in the elk-demo folder that you created earlier. Then run the following command to create and open the Logstash configuration file:
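As before, a typical command, assuming the nano editor:

```shell
nano logstash.conf
```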

This will open the editor where you can paste the Logstash configuration. Paste the following configuration into the file:

logstash.conf
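The snippet itself is rendered by an interactive widget and is not reproduced here. A sketch of a logstash.conf matching the surrounding description follows; the `elasticsearch` hostname and port 9200 are assumptions, typical for a Docker Compose setup:

```
input {
  beats {
    port => 5044                # receive events shipped by Filebeat
  }
}

filter {
  json {
    source => "message"         # parse each JSON log line into structured fields
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "flask-logs-%{+YYYY.MM.dd}"   # one index per day
  }
}
```

The `%{+YYYY.MM.dd}` pattern in the index name is what produces the daily indexes described above.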

With these configurations, Filebeat and Logstash are ready to work together. Filebeat will monitor the application logs, Logstash will filter and process them, and Elasticsearch will store and index the data, making it ready for analysis and visualization in Kibana.

1. What is the main role of Filebeat in the ELK stack?

2. Why does Logstash create a separate index for each day in Elasticsearch?


Section 4. Chapter 4

