Is there an IDE that can use a local open-source model?

It seems that Cursor, VS Code, and others don't offer this as a separate option. It seems technically feasible, but is there really no way to do it?

3 points | by haebom 22 hours ago

6 comments

  • SMAAART 22 hours ago
    VS Code: https://code.visualstudio.com/docs/intelligentapps/models

    > Models downloaded locally from repositories like Ollama and ONNX.

    • verdverm 16 hours ago
      This. A simple search like "vs code copilot ollama" will find you the answer.

      tl;dr - OP is wrong, you can use local models with setups like Copilot in VS Code out-of-the-box

      https://code.visualstudio.com/docs/copilot/customization/lan...

      • locallyHosted2 3 hours ago
        "Currently, you cannot connect to a local model for code completions", and "Currently, using a locally hosted models still requires the Copilot service for some tasks. Therefore, your GitHub account needs to have access to a Copilot plan (for example, Copilot Free) and you need to be online. This requirement might change in a future release." - from the page you linked.
  • vunderba 16 hours ago
    I don't know about out-of-the-box, but there are several popular extensions for both JetBrains and VS Code that let you use local models (a sample Continue config is sketched below).

    Roo Code

    https://github.com/RooCodeInc/Roo-Code

    Continue

    https://github.com/continuedev/continue
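
    For instance, Continue can be pointed at a locally served model through its config file. A minimal sketch, assuming Ollama is serving a model named "llama3" locally (the model name is illustrative, and recent Continue versions may use config.yaml instead of config.json):

      ~/.continue/config.json:

      {
        "models": [
          {
            "title": "Llama 3 (local)",
            "provider": "ollama",
            "model": "llama3"
          }
        ]
      }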

  • animuchan 21 hours ago
    It's absolutely possible, but on an M4 Mac the largest (and slowest) model I could feasibly run was clearly inferior to the default paid Cursor experience.

    I tested with Kilo Code: https://kilocode.ai/ -- it's a VS Code / Cursor extension.

    To host models on desktop, there's this: https://ollama.com/
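
    Once it's running, Ollama exposes a local HTTP API on port 11434 that editor extensions (and anything else) can call. A minimal sketch in Python, assuming a model has already been pulled with "ollama pull llama3" (the model name is illustrative):

      import requests

      # Ask a locally running Ollama server for a completion.
      # Nothing leaves the machine; everything stays on localhost.
      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={"model": "llama3", "prompt": "Explain tail recursion.", "stream": False},
      )
      print(resp.json()["response"])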

  • aj1thkr1sh 5 hours ago
    You could try Zed IDE with Ollama.
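
    If you go that route, Zed talks to Ollama through its settings. A rough, untested sketch, assuming Ollama is running on its default port (the exact keys may differ between Zed versions):

      // Zed settings.json
      {
        "language_models": {
          "ollama": {
            "api_url": "http://localhost:11434"
          }
        }
      }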
  • almaira 19 hours ago
    I've been looking for this too. It seems to me as though all the IDEs are trying to sell the LLMs as a service, or trying to lock you in by downloading LLMs through their IDE.

    I have been downloading LLMs from Hugging Face as GGUF files and running them through https://github.com/oobabooga/text-generation-webui, and would like to keep using those downloads. It is possible to run those LLMs as a local API using something like llama-cpp-python (https://pypi.org/project/llama-cpp-python/), and I would prefer to use something like that method.

    Zed (https://zed.dev/), which is now available on Windows, might be able to do it, but I'd rather use something (FOSS) that doesn't have a pricing model (the development focus will always be on those who pay). Tbh I'm getting a bit sick of changing IDEs as their support changes, and would really prefer not to use (Microsoft) Visual Studio Code, which seems to be cornering the market. Starting to think I'm going to try to learn Emacs, with https://github.com/karthink/gptel looking as if it would meet my needs.
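
    A minimal sketch of that llama-cpp-python approach, assuming a GGUF file has already been downloaded from Hugging Face (the path and model are placeholders):

      from llama_cpp import Llama

      # Load a local GGUF model; n_ctx sets the context window
      llm = Llama(model_path="./models/model.Q4_K_M.gguf", n_ctx=4096)

      out = llm.create_chat_completion(
          messages=[{"role": "user", "content": "Write a haiku about IDEs."}],
          max_tokens=128,
      )
      print(out["choices"][0]["message"]["content"])

    The same package also ships an OpenAI-compatible server ("pip install llama-cpp-python[server]", then "python -m llama_cpp.server --model ./models/model.Q4_K_M.gguf"), which is one way to make those downloads look like a local API to an editor.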
  • justplay 17 hours ago
    Try Zed.