AI Endpoints - Create a code assistant with Continue


89 views · 19.12.2025 · AI Endpoints

Introduction

What is Continue?

Continue is an IDE‑agnostic AI assistant that brings chat, code generation, and autocomplete capabilities directly into your editor. Compatible with VS Code and JetBrains IDEs (e.g., IntelliJ, PyCharm), Continue empowers you to plug in any compatible LLM hosted on AI Endpoints.

Continue enables you to configure and use your own LLMs, giving you full control over the models you use and how they interact with your code.

This tutorial describes how to configure Continue to connect to AI Endpoints LLMs, allowing these models to integrate with your editor to enhance coding efficiency, accuracy, and productivity.

Requirements

To follow this tutorial, you need:

  • A supported IDE: VS Code or a JetBrains IDE (e.g., IntelliJ, PyCharm)
  • An AI Endpoints API key to authenticate your requests

Instructions

Install Continue

Follow the official Continue installation instructions for your IDE.

Once installed, Continue will share the same configuration across your IDEs.

Configure Continue with AI Endpoints

Continue uses a YAML-based configuration file to manage:

  • Chatbot tool models
  • Tab autocomplete models

You can customize this configuration file to connect the plugin to AI Endpoints:

Continue configuration for OVHcloud AI Endpoints

name: ide-configuration
version: 0.0.1
schema: v1
models:
  - name: Meta-Llama-3_3-70B-Instruct
    provider: openai
    model: Meta-Llama-3_3-70B-Instruct
    apiBase: https://oai.endpoints.kepler.ai.cloud.ovh.net/v1
    apiKey: <your AI Endpoints API key> # replace with your API key
    roles: [chat, edit, apply, summarize]
  - name: Qwen3-Coder-30B-A3B-Instruct
    provider: openai
    model: Qwen3-Coder-30B-A3B-Instruct
    apiBase: https://oai.endpoints.kepler.ai.cloud.ovh.net/v1
    apiKey: <your AI Endpoints API key> # replace with your API key
    roles: [chat, edit, apply, summarize, autocomplete]

Whenever you modify your config file, make sure to reload it before interacting with your configured models!
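Because AI Endpoints exposes an OpenAI-compatible API, you can also sanity-check your `apiBase` and API key outside the IDE. The sketch below builds (but does not send) the kind of chat completion request Continue issues against the endpoint, using only the Python standard library; the model name and placeholder key are taken from the config above.

```python
import json

API_BASE = "https://oai.endpoints.kepler.ai.cloud.ovh.net/v1"  # apiBase from the config above
API_KEY = "<your AI Endpoints API key>"  # placeholder - replace with your real key

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion request for the AI Endpoints API."""
    return {
        "url": f"{API_BASE}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("Meta-Llama-3_3-70B-Instruct", "Explain this function")
print(req["url"])  # → https://oai.endpoints.kepler.ai.cloud.ovh.net/v1/chat/completions
```

Sending this request with any HTTP client (e.g., curl or `urllib`) and a valid key should return a JSON chat completion; an authentication error at this stage points to the key, not to Continue.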

Try out different LLMs from our catalog and choose the one that best fits your use case. You can switch between them easily in the IDE UI.

Try It Out

Once Continue is configured with your AI Endpoints models, you're ready to test both features:

Chatbot Tool

Use the chatbot sidebar to ask for help, generate code, or refactor logic with any of your configured models.

Chatbot tool animation

Tab Completion Tool

Just start typing in your editor. The autocomplete model will complete code as you go — powered by your custom-configured model from AI Endpoints.

Tab completion animation

Troubleshooting

  • Model not loading: Verify the apiBase URL and ensure the apiKey variable is correctly set.
  • Autocomplete not working: Make sure the model role includes autocomplete and that the selected model supports this capability.
  • Connection errors: Check network connectivity and confirm your apiKey is valid.
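Several of these issues come down to a malformed model entry. A minimal, hypothetical checker for the fields used in this tutorial's config (the required keys mirror the YAML entries above; this helper is an illustration, not part of Continue):

```python
# Keys every model entry in the config above must define.
REQUIRED_KEYS = {"name", "provider", "model", "apiBase", "apiKey", "roles"}

def check_model_entry(entry: dict) -> list:
    """Return a list of problems found in one 'models' entry of the config."""
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - entry.keys())]
    if entry.get("apiKey", "").startswith("<"):
        problems.append("apiKey is still the placeholder value")
    if not entry.get("apiBase", "").startswith("https://"):
        problems.append("apiBase should be an https URL")
    return problems

entry = {
    "name": "Qwen3-Coder-30B-A3B-Instruct",
    "provider": "openai",
    "model": "Qwen3-Coder-30B-A3B-Instruct",
    "apiBase": "https://oai.endpoints.kepler.ai.cloud.ovh.net/v1",
    "apiKey": "<your AI Endpoints API key>",
    "roles": ["chat", "autocomplete"],
}
print(check_model_entry(entry))  # → ['apiKey is still the placeholder value']
```

Running a check like this on each entry before reloading the config catches placeholder keys and truncated URLs early, before they surface as opaque errors in the IDE.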

Conclusion

By using Continue with AI Endpoints, you now have:

  • A fully customizable code assistant
  • Support for cutting-edge open-source large language models such as Qwen, Mixtral, and LLaMA 3
  • The ability to manage your own configuration and resources on AI Endpoints

If you need training or technical assistance to implement our solutions, contact your sales representative or request a quote from our Professional Services experts for a custom analysis of your project.

Feedback

Please feel free to send us your questions, feedback, and suggestions regarding AI Endpoints and its features:

  • In the #ai-endpoints channel of the OVHcloud Discord server, where you can engage with the community and OVHcloud team members.
