@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Amazon OpenSearch destination to send logs to Amazon OpenSearch.

## Setup
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use the Amazon S3 destination to send logs to Amazon S3. If you want to send logs to Amazon S3 for [archiving][1] and [rehydration][2], you must [configure Log Archives](#configure-log-archives). If you don't want to rehydrate your logs in Datadog, skip to [Set up the destination for your pipeline](#set-up-the-destination-for-your-pipeline).

You can also [route logs to Snowflake using the Amazon S3 destination](#route-logs-to-snowflake-using-the-amazon-s3-destination).
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Amazon Security Lake destination to send logs to Amazon Security Lake.

## Prerequisites
@@ -41,13 +43,7 @@ Set up the Amazon Security Lake destination and its environment variables when y

##### Enable TLS

Toggle the switch to **Enable TLS**. If you enable TLS, the following certificate and key files are required.
**Note**: All file paths are made relative to the configuration data directory, which is `/var/lib/observability-pipelines-worker/config/` by default. See [Advanced Worker Configurations][4] for more information. The file must be owned by the `observability-pipelines-worker` group and `observability-pipelines-worker` user, or at least readable by the group or user.
- Enter the identifier for your Amazon Security Lake key pass. If you leave it blank, the [default](#set-secrets) is used.
- **Note**: Only enter the identifier for the key pass. Do **not** enter the actual key pass.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}
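The TLS paths documented above are relative to the Worker's configuration data directory. As a rough sketch of staging such files (the directory name, file names, and throwaway self-signed certificate below are illustrative assumptions, not Datadog-mandated values):

```shell
# Sketch: stage TLS files the way the Worker expects them.
# "worker-config" stands in for /var/lib/observability-pipelines-worker/config/;
# all names and the self-signed certificate are illustrative only.
CONFIG_DIR="worker-config"
mkdir -p "$CONFIG_DIR"

# Generate a throwaway X.509 certificate (PEM) and PKCS#8 private key for testing
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=op-worker-test" \
  -keyout "$CONFIG_DIR/server.key" \
  -out "$CONFIG_DIR/server.crt"

# Confirm the certificate parses as X.509
openssl x509 -in "$CONFIG_DIR/server.crt" -noout -subject
```

In a real deployment, the files would also need to be readable by the `observability-pipelines-worker` user and group, for example via `chown`/`chmod` on the host.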

##### Buffering

@@ -93,5 +89,4 @@ A batch of events is flushed when one of these parameters is met. See [event bat
[1]: https://app.datadoghq.com/observability-pipelines
[2]: /observability_pipelines/destinations/#event-batching
[3]: /observability_pipelines/processors/remap_ocsf
[4]: /observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/
[5]: /observability_pipelines/destinations/amazon_security_lake/#aws-authentication
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use the Azure Storage destination to send logs to an Azure Storage bucket. If you want to send logs to Azure Storage for [archiving][1] and [rehydration][2], you must [configure Log Archives](#configure-log-archives). If you don't want to rehydrate logs in Datadog, skip to [Set up the destination for your pipeline](#set-up-the-destination-for-your-pipeline).

## Configure Log Archives
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' CloudPrem destination to send logs to Datadog CloudPrem.


@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' CrowdStrike Next-Gen SIEM destination to send logs to CrowdStrike Next-Gen SIEM.

## Setup
@@ -34,13 +36,7 @@ To use the CrowdStrike NG-SIEM destination, you need to set up a CrowdStrike dat

##### Enable TLS

Toggle the switch to **Enable TLS**. If you enable TLS, the following certificate and key files are required.
**Note**: All file paths are made relative to the configuration data directory, which is `/var/lib/observability-pipelines-worker/config/` by default. See [Advanced Worker Configurations][4] for more information. The file must be owned by the `observability-pipelines-worker` group and `observability-pipelines-worker` user, or at least readable by the group or user.

- Enter the identifier for your CrowdStrike NG-SIEM key pass. If you leave it blank, the [default](#set-secrets) is used.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

##### Buffering

@@ -83,4 +79,3 @@ A batch of events is flushed when one of these parameters is met. See [event bat
[1]: https://app.datadoghq.com/observability-pipelines
[2]: /observability_pipelines/destinations/#event-batching
[3]: https://falcon.us-2.crowdstrike.com/documentation/page/bdded008/hec-http-event-connector-guide
[4]: /observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Datadog Logs destination to send logs to Datadog Log Management. You can also use [AWS PrivateLink](#aws-privatelink) to send logs from Observability Pipelines to Datadog.

## Setup
@@ -10,6 +10,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Datadog Metrics destination ({{< tooltip glossary="preview" case="title" >}}) to send metrics to Datadog. You can also use [AWS PrivateLink](#aws-privatelink) to send metrics from Observability Pipelines to Datadog.

## Setup
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Elasticsearch destination to send logs to Elasticsearch.

## Setup
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

<div class="alert alert-info">For Worker versions 2.7 and later, the Google Cloud destination supports <a href = "https://cloud.google.com/storage/docs/uniform-bucket-level-access">uniform bucket-level access</a>. Google <a href = "https://cloud.google.com/storage/docs/uniform-bucket-level-access#should-you-use">recommends</a> using uniform bucket-level access. <br>For Worker versions older than 2.7, only <a href = "https://cloud.google.com/storage/docs/access-control/lists">Access Control Lists</a> are supported.</div>

Use the Google Cloud Storage destination to send your logs to a Google Cloud Storage bucket. If you want to send logs to Google Cloud Storage for [archiving][1] and [rehydration][2], you must [configure Log Archives](#configure-log-archives). If you do not want to rehydrate logs in Datadog, skip to [Set up the destination for your pipeline](#set-up-the-destinations).
@@ -101,10 +101,7 @@ Set up the Google Pub/Sub destination and its environment variables when you [se

##### Enable TLS

Toggle the switch to **Enable TLS** if your organization requires secure connections with custom certificates.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) Root File in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) Root File in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

##### Buffering

@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Google SecOps destination to send logs to Google SecOps.

The Observability Pipelines Worker uses standard Google authentication methods. See [Authentication methods at Google][3] for more information about choosing the authentication method for your use case.
@@ -41,12 +41,7 @@ Toggle the switch to **Enable Compression**. If enabled:

#### Enable TLS

Toggle the switch to enable TLS. If you enable TLS, the following certificate and key files are required:
- Enter the identifier for your HTTP Client key pass. If you leave it blank, the [default](#set-secrets) is used.
- **Note**: Only enter the identifier for the key pass. Do **not** enter the actual key pass.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

#### Buffering

7 changes: 1 addition & 6 deletions content/en/observability_pipelines/destinations/kafka.md
@@ -43,11 +43,7 @@ Set up the Kafka destination and its environment variables when you [set up a pi

##### Enable TLS

Toggle the switch to enable **TLS**. The following certificate and key files are required.<br>**Note**: All file paths are made relative to the configuration data directory, which is `/var/lib/observability-pipelines-worker/config/` by default. See [Advanced Worker Configurations][6] for more information. The file must be owned by the `observability-pipelines-worker` group and `observability-pipelines-worker` user, or at least readable by the group or user.
- Enter the identifier for your Kafka TLS key pass. If you leave it blank, the [default](#set-secrets) is used.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

##### Enable SASL authentication

@@ -155,7 +151,6 @@ A batch of events is flushed when one of these parameters is met. See [event bat
[3]: https://docs.databricks.com/aws/en/connect/streaming/kafka
[4]: https://learn.microsoft.com/en-us/azure/event-hubs/azure-event-hubs-apache-kafka-overview
[5]: https://app.datadoghq.com/observability-pipelines
[6]: /observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/
[7]: https://docs.confluent.io/platform/current/clients/librdkafka/html/md_CONFIGURATION.html
[8]: /observability_pipelines/monitoring/metrics/
[9]: /observability_pipelines/destinations/#event-batching
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Microsoft Sentinel destination to send logs to Microsoft Sentinel. See [Logs Ingestion API][3] for API call limits in Microsoft Sentinel.

## Setup
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' New Relic destination to send logs to New Relic.

## Setup
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' OpenSearch destination to send logs to OpenSearch.

## Setup
@@ -13,6 +13,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' SentinelOne destination to send logs to SentinelOne.

## Setup
9 changes: 3 additions & 6 deletions content/en/observability_pipelines/destinations/socket.md
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Socket destination to send logs to a socket endpoint.

## Setup
@@ -27,12 +29,7 @@ Set up the Socket destination and its environment variables when you [set up a p

##### Enable TLS

If you enabled **TCP** mode, you can toggle the switch to **Enable TLS**. The following certificate and key files are required for TLS:
- Enter the identifier for your socket key pass. If you leave it blank, the [default](#set-secrets) is used.

- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}
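Before enabling TLS on the Socket destination, it can help to confirm that the downstream endpoint completes a TLS handshake at all. A minimal loopback sketch, assuming nothing about your real endpoint (the port, file names, and throwaway test server below are made up for illustration):

```shell
# Sketch: verify a TLS handshake against a TCP socket endpoint.
# Port 9443 and all file names are illustrative assumptions.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=localhost" -keyout sock.key -out sock.crt

# Throwaway TLS server standing in for the real endpoint
openssl s_server -accept 9443 -cert sock.crt -key sock.key -quiet &
SERVER_PID=$!
sleep 1

# Send one line over TLS; record success if the handshake completes
if echo "test event" | openssl s_client -connect 127.0.0.1:9443 \
     -CAfile sock.crt -no_ign_eof > /dev/null 2>&1; then
  echo ok > handshake_ok
fi

kill "$SERVER_PID"
```

Against a real endpoint, you would point `s_client` at the destination's host and port instead of the loopback server.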

#### Buffering

@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Splunk HTTP Event Collector (HEC) destination to send logs to Splunk HEC.

## Setup
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' Sumo Logic destination to send logs to your Sumo Logic Hosted Collector.

## Setup
8 changes: 3 additions & 5 deletions content/en/observability_pipelines/destinations/syslog.md
@@ -9,6 +9,8 @@ products:

{{< product-availability >}}

## Overview

Use Observability Pipelines' syslog destinations to send logs to rsyslog or syslog-ng.

## Setup
@@ -42,11 +44,7 @@ To set up the syslog destination in the UI:

##### Enable TLS

Toggle the switch to enable TLS. If you enable TLS, the following certificate and key files are required:
- Enter the identifier for your syslog key pass. If you leave it blank, the [default](#set-secrets) is used.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

##### Wait time for TCP keepalive probes

@@ -35,11 +35,7 @@ Select an **AWS authentication** option. If you select **Assume role**:

#### Enable TLS

Toggle the switch to **Enable TLS**. If you enable TLS, the following certificate and key files are required.<br>**Note**: All file paths are made relative to the configuration data directory, which is `/var/lib/observability-pipelines-worker/config/` by default. See [Advanced Worker Configurations][2] for more information. The file must be owned by the `observability-pipelines-worker` group and `observability-pipelines-worker` user, or at least readable by the group or user.
- Enter the identifier for your Amazon Data Firehose key pass. If you leave it blank, the [default](#set-secrets) is used.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

## Set secrets

@@ -76,7 +72,6 @@ Toggle the switch to **Enable TLS**. If you enable TLS, the following certificat
{{% observability_pipelines/aws_authentication/amazon_s3_source/permissions %}}

[1]: /observability_pipelines/configuration/set_up_pipelines/
[2]: /observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/
[3]: https://app.datadoghq.com/observability-pipelines
[4]: /api/latest/observability-pipelines/
[5]: https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/observability_pipeline
7 changes: 1 addition & 6 deletions content/en/observability_pipelines/sources/amazon_s3.md
@@ -36,11 +36,7 @@ Select an **AWS authentication** option. If you select **Assume role**:

#### Enable TLS

Toggle the switch to **Enable TLS**. If you enable TLS, the following certificate and key files are required.<br>**Note**: All file paths are made relative to the configuration data directory, which is `/var/lib/observability-pipelines-worker/config/` by default. See [Advanced Worker Configurations][2] for more information. The file must be owned by the `observability-pipelines-worker` group and `observability-pipelines-worker` user, or at least readable by the group or user.
- Enter the identifier for your Amazon S3 key pass. If you leave it blank, the [default](#set-secrets) is used.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER or PEM (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER or PEM (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER or PEM (PKCS#8) format.
{{% observability_pipelines/tls_settings %}}

## Set secrets

@@ -74,7 +70,6 @@ Toggle the switch to **Enable TLS**. If you enable TLS, the following certificat


[1]: /observability_pipelines/configuration/set_up_pipelines/
[2]: /observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/
[3]: https://app.datadoghq.com/observability-pipelines
[4]: /api/latest/observability-pipelines/
[5]: https://registry.terraform.io/providers/datadog/datadog/latest/docs/resources/observability_pipeline
@@ -36,15 +36,7 @@ The following are required to send Cloudflare Logpush logs to Observability Pipe
1. Select your authorization strategy. If you selected **Plain**:
- Enter the identifiers for the HTTP/S Server username and password. See [Set secrets][3] for the defaults used.
1. In the **Decoding** dropdown menu, select **Bytes**.
1. Toggle the switch to **Enable TLS**.
- If you are using Secrets Management, enter the identifier for the HTTP/S Server key pass. See [Set secrets][3] for the defaults used.
- The following certificate and key files are required.
- `Server Certificate Path`: The path to the certificate file that has been signed by your Certificate Authority (CA) root file in DER, PEM, or CRT (X.509).
- `CA Certificate Path`: The path to the certificate file that is your Certificate Authority (CA) root file in DER, PEM, or CERT (X.509).
- `Private Key Path`: The path to the `.key` private key file that belongs to your Server Certificate Path in DER, PEM, or CERT (PKCS #8) format.
- **Notes**:
- The configuration data directory `/var/lib/observability-pipelines-worker/config/` is automatically appended to the file paths. See [Advanced Worker Configurations][7] for more information.
- The file must be readable by the `observability-pipelines-worker` group and user.
{{% observability_pipelines/tls_settings %}}
1. Copy your certificates into the configuration directory:
```shell
# Create the configuration directory
```
@@ -90,4 +82,3 @@ After your Logpush job has been successfully created, you can view your Cloudfla
[4]: /observability_pipelines/configuration/install_the_worker/?tab=docker#pipeline-ui-setup
[5]: https://developers.cloudflare.com/logs/logpush/logpush-job/enable-destinations/http/
[6]: https://app.datadoghq.com/logs
[7]: /observability_pipelines/configuration/install_the_worker/advanced_worker_configurations/