Getting started with Terraform for Analytics services
Objective
Public Cloud Managed Analytics services allow you to focus on building and deploying cloud applications while OVHcloud takes care of the analytics infrastructure and maintenance.
This guide explains how to order an OpenSearch instance of an Analytics service using Terraform.
Requirements
- Terraform installed. This guide was tested with version v1.14.6.
- Access to the OVHcloud API (create your credentials by consulting this guide)
- A Public Cloud project in your OVHcloud account
Instructions
Step 1: Gather the OVHcloud required parameters
Getting your cluster/API tokens information
The "OVH provider" needs to be configured with a set of credentials:
- an application_key
- an application_secret
- a consumer_key
Why?
Because, behind the scenes, the OVH Terraform provider makes requests to the OVHcloud APIs.
To retrieve this information, follow our First steps with the OVHcloud APIs tutorial.
Specifically, you have to generate these credentials via the OVHcloud token generation page with the following rights:
- GET /cloud/project/*/database/*
- POST /cloud/project/*/database/*
- PUT /cloud/project/*/database/*
- DELETE /cloud/project/*/database/*
Once you have generated your tokens, save them — you will need them shortly.
The last needed information is the service_name: it is the ID of your Public Cloud project.
How to get it?
In the Public Cloud section, you can retrieve your service name ID using the Copy to clipboard button.
You will also use this information in your Terraform resource definition files.
Step 2: Gather the set of required parameters
To create a new OpenSearch cluster, specify at least:
- the engine (e.g. "opensearch")
- the version (e.g. "3.3")
- the region (e.g. "EU-WEST-PAR")
- the plan (e.g. "production")
- the flavor of the cluster (e.g. "b3-8")
Step 3: Create Terraform files
First, create a main.tf file defining the resources that will be created.
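As a sketch, a main.tf could look like the following. It assumes the ovh/ovh provider and its ovh_cloud_project_database and ovh_cloud_project_database_ip_restriction resources; the resource name "opensearch", the description, and the single nodes block are illustrative (check the provider documentation for the number of nodes your plan requires):

```hcl
terraform {
  required_providers {
    ovh = {
      source = "ovh/ovh"
    }
  }
}

# Provider configuration: credentials are supplied via variables
provider "ovh" {
  endpoint           = "ovh-eu"
  application_key    = var.application_key
  application_secret = var.application_secret
  consumer_key       = var.consumer_key
}

# The OpenSearch cluster itself
resource "ovh_cloud_project_database" "opensearch" {
  service_name = var.service_name
  description  = "my-opensearch-cluster" # illustrative name
  engine       = "opensearch"
  version      = "3.3"
  plan         = "production"
  flavor       = "b3-8"

  # Depending on the plan, several nodes blocks may be required
  nodes {
    region = "EU-WEST-PAR"
  }
}

# Authorise an IP range to reach the cluster (uses var.ip_range)
resource "ovh_cloud_project_database_ip_restriction" "access" {
  service_name = var.service_name
  engine       = ovh_cloud_project_database.opensearch.engine
  cluster_id   = ovh_cloud_project_database.opensearch.id
  ip           = var.ip_range
}
```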
Then, create a variables.tf file defining the variables used in main.tf:
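A variables.tf matching the placeholders used in this guide might look like this (descriptions are illustrative; marking credentials as sensitive keeps them out of plan output):

```hcl
variable "service_name" {
  type        = string
  description = "Public Cloud project ID"
}

variable "application_key" {
  type        = string
  sensitive   = true
  description = "OVHcloud API application key"
}

variable "application_secret" {
  type      = string
  sensitive = true
}

variable "consumer_key" {
  type      = string
  sensitive = true
}

variable "ip_range" {
  type        = string
  description = "IP range authorised to access the cluster"
}
```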
Here, we defined the ovh-eu endpoint because we want to call the OVHcloud Europe API. Other endpoints exist, depending on your needs:
- ovh-eu for the OVHcloud Europe API
- ovh-ca for the OVHcloud North America API
Then, create a secrets.tfvars file containing the required variables values:
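A minimal secrets.tfvars sketch, keeping the placeholders used in this guide:

```hcl
service_name       = "<service_name>"
application_key    = "<application_key>"
application_secret = "<application_secret>"
consumer_key       = "<consumer_key>"
ip_range           = "<ip_range>"
```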
Don't forget to replace <service_name>, <application_key>, <application_secret>, <consumer_key> and <ip_range> with your real values.
Finally, create an outputs.tf file defining the resources that will be exported:
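A possible outputs.tf, assuming a resource named ovh_cloud_project_database.opensearch was declared in main.tf; the endpoints attribute used below is an assumption, so check the exported attributes in the provider documentation:

```hcl
output "cluster_id" {
  value = ovh_cloud_project_database.opensearch.id
}

output "opensearch_uri" {
  description = "Connection URI of the cluster"
  value       = ovh_cloud_project_database.opensearch.endpoints[0].uri
}
```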
Step 4: Run
Now we need to initialize Terraform, generate a plan, and apply it.
The init command will initialize your working directory, which contains the .tf configuration files.
Run it first for any new configuration, or after checking out a configuration from a git repository.
The init command will:
- Download and install Terraform providers/plugins
- Initialize the backend (if defined)
- Download and install modules (if defined)
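Run it from the directory containing your .tf files:

```bash
terraform init
```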
Now, we can generate our plan:
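Since the credentials live in secrets.tfvars rather than in the default variable files, pass that file explicitly with the standard -var-file flag:

```bash
terraform plan -var-file="secrets.tfvars"
```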
Thanks to the plan command, we can check what Terraform wants to create, modify or remove.
The plan is OK for us, so let's apply it:
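Apply with the same variables file; Terraform will ask for confirmation before creating the resources:

```bash
terraform apply -var-file="secrets.tfvars"
```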
Finally, export the user credentials and the URI:
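Assuming an output named opensearch_uri was declared in outputs.tf (the name here is illustrative), you can read it back with:

```bash
terraform output opensearch_uri
```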
And that's it, the OpenSearch cluster is created.
How to deploy with another engine
This guide covered deploying an OpenSearch service. You can find a Kafka example here:
Go further
Starting with OpenSearch analytics service
Configuring vRack for Public Cloud
Visit our dedicated Discord channel: https://discord.gg/ovhcloud. Ask questions, provide feedback and interact directly with the team that builds our Analytics services.
If you need training or technical assistance to implement our solutions, contact your sales representative or click on this link to get a quote and ask our Professional Services experts for a custom analysis of your project.
Join our community of users.