Extracting logs from Logs Data Platform
Objective
This guide explains how to export logs stored in the Logs Data Platform (LDP) using tooling that speaks the OpenSearch API. It presents two reference implementations:
- Logstash, suited to long-running pipelines and complex transformations.
- Elasticdump, a lightweight CLI utility designed for one-off or scheduled exports.
Exporting logs is a common need when you want to analyse data outside the LDP ecosystem or feed it to external BI tools. For long-term archiving, we provide another solution.
The next sections explain how to pull documents from your alias with Logstash or with the Elasticdump CLI, letting you choose the tool that best fits your operational model.
The final part explains how you can export your logs from your cold-stored archives.
Requirements
- You are already sending logs to a stream you own — see the quick start tutorial.
- You know the OpenSearch endpoint of your LDP cluster (https://<ldp-cluster>.logs.ovh.com:9200).
- Your host can reach TCP port 9200 on the cluster endpoint over TLS.
- You have credentials for the alias you want to export (basic authentication or IAM bearer token).
- You can install either Logstash ≥ 8.0 or Elasticdump ≥ 6.0 on the host that will run the export.
Instructions
Alias naming conventions
When you create an OpenSearch alias that points to a Graylog stream, the only customizable part is the suffix after -a-, e.g. ldp-ti-98765-a-your_suffix.
The remainder of the name is generated by the platform:
| IAM status | Generated part | Full alias example |
|---|---|---|
| IAM enabled | the service identifier (e.g. ldp-ti-98765) | ldp-ti-98765-a-logs-export |
| IAM disabled (before IAM migration) | your username (e.g. logs-ab-12345) | logs-ab-12345-a-logs-export |
The <suffix> part (here logs-export) is a free‑form string you choose to describe the purpose of the alias.
Create a stream alias
- Log in to the OVHcloud Control Panel and go to the Identity, Security & Operations section.
- Click on Logs Data Platform under Operations, then click on the desired account.
- Select the Alias tab and click Add an alias.
- Choose a suffix, add a description and save the alias.
- Click the … menu on the right of the newly created alias and select Attach content to the alias.
- Select the Graylog stream(s) you want to export and confirm.
The alias now points to the underlying OpenSearch alias that stores the logs of the chosen stream.
Export logs with Logstash
Prerequisites
| Requirement | Details |
|---|---|
| Logstash ≥ 8.0 | Install Logstash on a host that can reach your LDP cluster. |
| Java 8 or 11 | Required by Logstash. |
| OpenSearch endpoint | https://<ldp-cluster>.logs.ovh.com:9200 |
| Authentication | Basic user/password or IAM bearer token (see the IAM FAQ). |
| Alias name | The alias created in the previous step (e.g. ldp-ti-98765-a-logs-export). |
Install Logstash and the OpenSearch plugins
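After installing Logstash itself, add the plugins the pipeline needs. A sketch assuming you run the commands from the Logstash installation directory; logstash-input-opensearch is the community OpenSearch input plugin, and the CSV output plugin is bundled with most distributions but can be installed the same way if missing:

```shell
# Install the OpenSearch input plugin (distributed separately from Logstash)
bin/logstash-plugin install logstash-input-opensearch

# Install the CSV output plugin if your distribution does not bundle it
bin/logstash-plugin install logstash-output-csv
```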
Logstash pipeline
Create a file pipeline.conf (any location, e.g. config/pipeline.conf):
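A minimal pipeline sketch, assuming basic authentication, an alias named ldp-ti-98765-a-logs-export, and a 24-hour export window; the field names in fields => [...] are placeholders to replace with fields that actually exist in your logs:

```conf
input {
  opensearch {
    hosts    => ["https://<ldp-cluster>.logs.ovh.com:9200"]
    user     => "<username>"
    password => "<password>"
    index    => "ldp-ti-98765-a-logs-export"
    # Pull documents from the last 24 hours; adjust the range to your needs
    query    => '{ "query": { "range": { "timestamp": { "gte": "now-24h" } } } }'
  }
}

output {
  csv {
    # Placeholder field names; replace with fields present in your stream
    fields => ["timestamp", "source", "message"]
    # One file per day under /var/log/ldp/
    path   => "/var/log/ldp/export-%{+YYYY-MM-dd}.csv"
  }
}
```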
Run the pipeline
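With the configuration file in place, start Logstash and point it at the pipeline (the path assumes the config/pipeline.conf location suggested above):

```shell
bin/logstash -f config/pipeline.conf
```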
Logstash will connect to the OpenSearch endpoint, read the documents that belong to the alias defined above, and write the selected fields to a daily CSV file under /var/log/ldp/.
Export logs with Elasticdump
Prerequisites
| Requirement | Details |
|---|---|
| Elasticdump ≥ 6.0 | Install on a host that can reach your LDP cluster over HTTPS. |
| Node.js runtime | Elasticdump is a Node.js CLI; install Node.js 18 LTS or later. |
| Network access | Allow outbound TCP connectivity to <ldp-cluster>.logs.ovh.com on port 9200. |
| Authentication | Use either basic auth credentials (legacy users) or an IAM bearer token. |
| TLS trust store | Ensure the system trust store contains public Certificate Authorities, or supply a CA bundle with --input-ca. |
| Alias name | The alias created earlier (e.g. ldp-ti-98765-a-logs-export). |
Install Elasticdump
Elasticdump is an open-source tool designed to export data from Elasticsearch/OpenSearch to a file or to another OpenSearch/Elasticsearch cluster. It is well suited to migrating from Elasticsearch to OpenSearch and to downloading the data contained in your aliases or indices. It is written in JavaScript and therefore requires a JavaScript runtime.
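Assuming a working Node.js installation (18 LTS or later, as listed in the prerequisites), Elasticdump can be installed globally through npm:

```shell
# Install the CLI globally
npm install -g elasticdump

# Verify the installation
elasticdump --version
```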
For air-gapped or containerised environments, you can download the official Docker image and run the same commands with docker run --rm -v "$PWD":/work -w /work elasticdump/elasticsearch-dump.
Authenticate to the Logs Data Platform
Elasticdump relies on HTTP headers for authentication. Choose the method that matches your account:
- Basic authentication — append credentials in the URL: https://<username>:<password>@<ldp-cluster>.logs.ovh.com:9200/<alias>.
- IAM bearer token — add an Authorization header: --input-headers '{"Authorization":"Bearer <iam-token>"}'.
When using IAM tokens, leave the credentials out of the URL and rely solely on the header. Tokens are short-lived; plan to refresh them before launching long exports.
Export an alias to JSON
Create a JSON file (search-body.json) that defines your query. The example below retrieves events from the last 24 hours:
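A sketch of search-body.json, assuming your documents carry a timestamp field (adjust the field name to match your stream):

```json
{
  "query": {
    "range": {
      "timestamp": {
        "gte": "now-24h",
        "lte": "now"
      }
    }
  }
}
```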
Run Elasticdump to export the documents:
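A command sketch using basic authentication; the credentials and alias name are placeholders, and the @ prefix tells Elasticdump to read the search body from the file created above:

```shell
elasticdump \
  --input="https://<username>:<password>@<ldp-cluster>.logs.ovh.com:9200/ldp-ti-98765-a-logs-export" \
  --output=ldp-export.json \
  --type=data \
  --searchBody="@search-body.json"
```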
To authenticate with IAM, you can use the header flag or the hybrid authentication:
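For example, supplying the token through the --input-headers flag and leaving the credentials out of the URL (alias and token are placeholders):

```shell
elasticdump \
  --input="https://<ldp-cluster>.logs.ovh.com:9200/ldp-ti-98765-a-logs-export" \
  --input-headers='{"Authorization":"Bearer <iam-token>"}' \
  --output=ldp-export.json \
  --type=data \
  --searchBody="@search-body.json"
```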
Elasticdump streams the results to ldp-export.json in newline-delimited JSON format, which can be loaded into analytics tools or archived for compliance.
Handle formats, pagination and large time ranges
To export data in CSV format, use the csv scheme in the output file:
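For example, writing the same export as a CSV file (credentials, alias and output path are placeholders):

```shell
elasticdump \
  --input="https://<username>:<password>@<ldp-cluster>.logs.ovh.com:9200/ldp-ti-98765-a-logs-export" \
  --output="csv:///tmp/ldp-export.csv" \
  --type=data
```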
Elasticdump paginates results automatically using the OpenSearch scroll API. Tune the export with the following options:
- --limit <n>: controls how many documents Elasticdump pulls per batch. Reduce the value (e.g. 200) if you experience timeouts.
- --maxSockets <n>: adjusts the number of concurrent HTTP connections. Set it to 1 for strict rate limiting, or increase it to accelerate exports on aliases with high throughput.
- --input-parameters '{"scroll":"10m"}': extends the server-side cursor to 10 minutes, useful for large datasets.
To export specific time windows, modify search-body.json with a range filter and run several commands in sequence:
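For example, assuming a timestamp field, you could export one day per run by editing the range in search-body.json between invocations (dates, credentials and file names are placeholders):

```shell
# First window: set search-body.json to
#   "gte": "2024-01-01T00:00:00Z", "lt": "2024-01-02T00:00:00Z"
elasticdump \
  --input="https://<username>:<password>@<ldp-cluster>.logs.ovh.com:9200/<alias>" \
  --output=ldp-export-2024-01-01.json \
  --type=data \
  --searchBody="@search-body.json"

# Second window: shift the range to the next day and re-run
elasticdump \
  --input="https://<username>:<password>@<ldp-cluster>.logs.ovh.com:9200/<alias>" \
  --output=ldp-export-2024-01-02.json \
  --type=data \
  --searchBody="@search-body.json"
```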
The --transform flag lets you adjust each document before writing it to disk. For example, to remove the _id field:
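A sketch of that transform; the JavaScript snippet receives each document as doc and deletes the top-level _id key before the document is written:

```shell
elasticdump \
  --input="https://<username>:<password>@<ldp-cluster>.logs.ovh.com:9200/<alias>" \
  --output=ldp-export.json \
  --type=data \
  --transform="delete doc._id"
```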
Export logs from archive
To export logs from your archives, we provide a tool to download them: ldp-archive-mirror. This software requires Python ≥ 3.6 to run.
First, install ldp-archive-mirror using pip:
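Assuming the package name matches the tool name given above, a virtual environment keeps the installation isolated:

```shell
python3 -m venv ldp-mirror-env
source ldp-mirror-env/bin/activate
pip install ldp-archive-mirror
```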
Then you can use the binary ldp-mirror:
Please refer to the GitHub page to set up this software and obtain the latest information about it.
Go further
For more details on the OpenSearch input plugin, see the official documentation. The CSV output plugin reference is available here. Documentation for Elasticdump is available here.