Logging#

General information#

The ELK cloud service provides centralized, automated collection of data from PaaS service logs. It can also be used to extract data from the logs of other systems. The service is based on the Elastic Stack, which consists of:

  • Elasticsearch to store and index logs;

  • Logstash to filter and process logs;

  • Kibana to visualize retrieved data.

Activity logs are integral to monitoring overall system health. When running, applications and services log various debugging information, save error messages and warnings, and record actions and operations. This helps you quickly identify service problems and analyze any abnormal situation, be it a failure or performance degradation. In addition, the stored data lets you make forecasts and take measures to prevent problems in the future.

Analyzing a problem can take a lot of time and effort if you manually scrutinize log files of different components from multiple servers and try to find correlations. Moreover, programs often log a lot of excessive data that is only needed under certain circumstances. In such cases, it is critical to be able to quickly find and filter data to extract the required information from different sources.

Kibana’s advanced capabilities allow you to perform a full-text search across all logs, select only the desired time intervals, show or hide individual message fields, and count the number of events. This helps you quickly and easily filter the required information, highlight it in search results, and, if necessary, visualize it as charts and graphs.
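For example, in the Kibana search bar you can combine full-text and field-level filters using Kibana Query Language (KQL). The field names below (message, response.status, host.name) are illustrative and depend on how your logs are indexed:

    message: "timeout"
    response.status >= 500 and host.name: web-*

The first query finds all log entries whose message field mentions “timeout”; the second selects server errors from hosts whose names start with web-.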

The ELK service is easy to use and scalable. It helps you monitor the health of various services in a timely manner and receive application performance data in a form convenient for analysis and decision-making, so you can optimize application performance and ensure service uptime through quick troubleshooting.

Before you begin#

To get started with the logging service, follow these steps:

  1. Create a project, if you don’t have one.

  2. In the IAM section, create a user with the PaaS Administrator or Cloud Administrator role and add this user to the project with the PaaS privilege.

  3. Make sure that the project has all the required resources – subnets, SSH keys, and security groups. If any are missing, create them.

  4. Read the recommendations on how to work with the logging service in the cloud.

ELK service launch#

To launch the service, go to the Service Store or Installed Services section, select the ELK service in the Logging tab, and click Create.

The service launch procedure comprises the following stages:

  1. Set the network parameters required for ELK service:

    • VPC where the service will be deployed.

    • The High-availability service option.

    • Security groups to control traffic through interfaces of the instances on which ELK service will run.

    • Subnets to which instances with the service deployed on them will be attached, or network interfaces through which these instances will be attached to subnets.

  2. Specify the configuration of the instance or instances where the data search and analysis service will run. Select the instance type and parameters of its volumes: type, size and IOPS (if available for the selected type).

    Note

    ELK service performance depends on the configuration of its nodes. We recommend using the Memory Optimized instance type.

    In addition, you can specify an SSH key. In this case, after automatic service configuration, you will have SSH access to the respective instances.

    Attention

    We provide the option to connect to instances using an SSH key while the new ELK service is in beta testing. This feature may be disabled in the future.

  3. Set additional service parameters:

    • Service name – any unique name for the logging service.

    • Option to install a monitoring agent.

    • Elasticsearch version.

    • Whether to allow anonymous access in Kibana. If you select this option, also select a role for anonymous access with viewing (viewer) or editing (editor) rights.

    • Elasticsearch superuser password. You can set it manually or generate it automatically. If a password is set, authentication is required to log in to Kibana.

  4. Click Create.

    Note

    The service launching process usually takes 5 to 15 minutes.
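Once the service is up, you can check that Elasticsearch is reachable from any host with network access to it. Below is a minimal sketch in Python using the requests library; the service address is a placeholder, and the superuser login is assumed to be elastic (the Elasticsearch default):

    import requests

    # Placeholder address: replace with the address of your ELK instance.
    ELASTIC_URL = "http://10.0.0.10:9200"

    # Authenticate as the superuser with the password set at service launch.
    resp = requests.get(
        f"{ELASTIC_URL}/_cluster/health",
        auth=("elastic", "your-superuser-password"),
        timeout=10,
    )
    resp.raise_for_status()

    # The cluster status is "green", "yellow", or "red".
    print(resp.json()["status"])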

Processing event logs#

The Logstash component allows you to filter, aggregate, and modify event logs before they are sent to Elasticsearch. You can configure pipelines to pre-process logs automatically.

Attention

Pipelines are currently supported only for a service running on a single instance.

Note

For pipeline configuration details, see the official documentation.
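For illustration, a minimal pipeline configuration might receive logs from Beats shippers, parse them with the grok filter, and forward the result to the local Elasticsearch. The port, index pattern, and credentials below are assumptions; adapt them to your deployment:

    input {
      beats {
        # Listen for Filebeat and other Beats shippers.
        port => 5044
      }
    }

    filter {
      grok {
        # Parse standard Apache/Nginx access-log lines into structured fields.
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
    }

    output {
      elasticsearch {
        hosts    => ["localhost:9200"]
        index    => "weblogs-%{+YYYY.MM.dd}"   # one index per day
        user     => "elastic"
        password => "your-superuser-password"
      }
    }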

Create a pipeline#

To set a pipeline configuration:

  • Go to the Installed Services section.

  • Open the Logging tab and click the ELK service name to go to its page.

  • Open the Pipelines tab and click Create.

  • In the dialog that opens, specify the pipeline name and configuration.

  • Click Create.

Modify a pipeline#

To modify a pipeline configuration:

  • Go to the Installed Services section.

  • Open the Logging tab and click the ELK service name to go to its page.

  • Open the Pipelines tab, select the pipeline from the list and click Modify.

  • Edit the pipeline configuration.

  • Click Save.

Delete a pipeline#

To delete a pipeline:

  • Go to the Installed Services section.

  • Open the Logging tab and click the ELK service name to go to its page.

  • Open the Pipelines tab, select a pipeline from the list, and click Delete. You can select multiple pipelines for deletion at the same time.

  • In the dialog window, confirm the action.

Deleting ELK service#

Deleting the ELK service also deletes all instances and volumes created with it.

You can delete the service using one of the following methods.

From the services table:

  1. Go to the Installed Services section.

  2. Open the Logging tab.

  3. Find the service in the table and click the delete icon.

From the service page:

  1. Go to the Installed Services section.

  2. Open the Logging tab.

  3. Find the service in the table and go to the service page.

  4. Click Delete in the Information tab.