
GooseFS Log Reporting
Last updated:2025-07-17 17:42:55

Overview

The GooseFS log reporting function supports sending GooseFS runtime logs to Tencent Cloud's Cloud Log Service (CLS) or Elasticsearch Service (ES). This document describes how to report GooseFS runtime logs to these two log systems.

Preparations

Deploy a GooseFS cluster. For deployment instructions, see Quick Start in Console.

Log Reporting

GooseFS Log Reporting to CLS

Creating a CLS Topic

GooseFS log reporting relies on a third-party log collection platform; this section uses CLS as an example.
Create a CLS log topic. For details, see the CLS Quick Start Guide.

Configuring Filebeat

1. Configure the log collection directory by editing the filebeat.yml file under $GOOSEFS_HOME/conf in the GooseFS deployment root directory.
- type: log
  enabled: true
  paths:
    - ${path.home}/../logs/job_master.log*
  fields:
    type: "master"
  exclude_files: ['.gz$']

  multiline.pattern: '^[[:space:]]+(at|\.{3})[[:space:]]+\b|^Caused by:'
  multiline.negate: false
  multiline.match: after
Note:
paths: the log paths to collect. Use the wildcard * to match multiple log files.
fields.type: a custom type name used to tag events.
multiline.pattern: the rule for merging multi-line log entries (such as Java stack traces) into a single event.
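To collect logs from another GooseFS component alongside the master, add a second prospector with its own fields.type. The sketch below is an illustration only; the job_worker.log filename is an assumption modeled on the master example above, so adjust it to the actual log filenames in your deployment.

```yaml
- type: log
  enabled: true
  paths:
    # assumed worker log filename, modeled on the master example
    - ${path.home}/../logs/job_worker.log*
  fields:
    type: "worker"
  exclude_files: ['.gz$']
```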
2. Configure the CLS platform account for log reporting.
output.kafka:
  hosts: ["sh-producer.cls.tencentcs.com:9096"]
  topic: "a99cf1de-81d4-47a-97xxxxx-xxxx"
  version: "0.11.0.0"
  compression: "none"
  username: "cc098474-b387-381xxxx-xxxxx"
  password: "secretId#secretKey"
Note:
hosts: the Kafka access address of the CLS platform in your region. For specific addresses, see CLS Log Service - Uploading Log over Kafka.
topic: the log topic ID assigned when the log topic was created on the CLS platform.
version: the Kafka version supported by the CLS service. Default value: 0.11.0.0.
username: the logset ID assigned when the log topic was created on the CLS platform.
password: SecretId#SecretKey, where SecretId and SecretKey are the API keys from API Key Management under your Tencent Cloud account.
3. Enter the $GOOSEFS_HOME/filebeat directory and run the following command to start filebeat.
./goosefs-filebeat -c filebeat.yml
4. After Filebeat starts, it collects logs generated by the GooseFS service in real time and reports them to the CLS platform. You can view the reported log details in the CLS console.

GooseFS Log Reporting to ES

Creating an ES Cluster

GooseFS log reporting relies on a third-party log collection platform; this section uses ES as an example.
Create an ES cluster. For details, see Create an ES cluster.
Access the ES cluster. For the access method, see Access an ES cluster.

Configuring Filebeat

1. Configure the log collection directory by editing the filebeat.yml file under $GOOSEFS_HOME/conf in the GooseFS deployment root directory.
- type: log
  enabled: true
  paths:
    - ${path.home}/../logs/job_master.log*
  fields:
    type: "master"
  exclude_files: ['.gz$']

  multiline.pattern: '^[[:space:]]+(at|\.{3})[[:space:]]+\b|^Caused by:'
  multiline.negate: false
  multiline.match: after
Note:
paths: the log paths to collect. Use the wildcard * to match multiple log files.
fields.type: a custom type name used to tag events.
multiline.pattern: the rule for merging multi-line log entries (such as Java stack traces) into a single event.
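To see how the multiline settings fold a Java stack trace into a single event, here is a small Python sketch. It is a simplified stand-in for Filebeat's behavior with negate: false and match: after, and Python's re module uses \s in place of POSIX [[:space:]]:

```python
import re

# Python equivalent of the filebeat pattern; \s replaces POSIX [[:space:]].
PATTERN = re.compile(r'^\s+(at|\.{3})\s+\b|^Caused by:')

def group_multiline(lines):
    """Simplified stand-in for filebeat's multiline handling with
    negate: false and match: after - a line matching PATTERN is
    appended to the previous event instead of starting a new one."""
    events = []
    for line in lines:
        if events and PATTERN.search(line):
            events[-1] += "\n" + line
        else:
            events.append(line)
    return events

# Hypothetical log lines for illustration only.
log = [
    "2022-01-18 10:00:01 ERROR Master RPC failed",
    "\tat com.qcloud.goosefs.master.Rpc.call(Rpc.java:42)",
    "Caused by: java.io.IOException: connection reset",
    "2022-01-18 10:00:02 INFO Master recovered",
]
# The four raw lines collapse into two log events: the error with its
# stack-trace continuation lines, and the recovery message.
```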
2. Configure the ES platform account for log reporting.
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["es-nli7xxxxx.public.tencentelasticsearch.com"]

  protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  username: "elasticxxx"
  password: "costestxxx"

  index: "GooseFS-%{[fields.type]}-%{+yyyy.MM.dd}"
Note:
hosts: the ES endpoint that provides external access.
username: the ES login username.
password: the ES login password.
index: the index pattern for the reported logs. For example, master logs go to the index GooseFS-master-2022.01.18 and worker logs to GooseFS-worker-2022.01.18.
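As a quick illustration of how that index pattern expands, the sketch below substitutes fields.type and the event date by hand. It is an approximation for clarity, not Filebeat's actual templating engine:

```python
from datetime import date

def render_index(fields_type, day):
    # Approximates filebeat's expansion of the index setting
    # "GooseFS-%{[fields.type]}-%{+yyyy.MM.dd}":
    # fields.type comes from the prospector config, the date from the event.
    return f"GooseFS-{fields_type}-{day.strftime('%Y.%m.%d')}"

print(render_index("master", date(2022, 1, 18)))  # GooseFS-master-2022.01.18
print(render_index("worker", date(2022, 1, 18)))  # GooseFS-worker-2022.01.18
```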
3. Enter the $GOOSEFS_HOME/filebeat directory and run the following command to start filebeat.
./goosefs-filebeat -c filebeat.yml
4. After Filebeat starts, it collects logs generated by the GooseFS service in real time and reports them to the ES platform. You can view the reported log details on the ES platform.


Enabling GooseFS Audit Log Reporting

If you need to report audit logs for the GooseFS service, enable audit log configuration as follows:
1. Edit the goosefs-site.properties file under $GOOSEFS_HOME/conf in the GooseFS deployment directory and add the following optional configuration.
goosefs.master.audit.logging.enabled=true
2. Run the following command to copy the file $GOOSEFS_HOME/conf/goosefs-site.properties to all worker nodes.
goosefs copyDir conf/
3. Start the GooseFS cluster.
./bin/goosefs-start.sh all
