• Windows
  • Ollama, Gemini, or LLM Support?

Hi,
I know you have a Pro version for AI translation, but is it possible to add support for Ollama, Gemini, or other AI platforms with our own API key?

    leland Hi! Could you please share more details about the specific challenge you'd like to address by integrating with your own API key? This will help me explore the best possible solution for your needs.

    By the way, I'm wondering whether this could be a new premium feature, analogous to Anki: Anki lets you point the app at your own sync server if you host one yourself. Likewise here, if you buy a lifetime license, this kind of feature would be valuable, because it guarantees you can keep using the product forever. Don't take this the wrong way, and with respect: for a customer or user, an actionable way to retain the product you've paid for is more convincing than a verbal promise.

    But I wonder whether there are obstacles to implementing it, for example the risk that someone could use it to clone your product, which would not be a good thing to happen.

      leuas
      You make a valid point. Rather than implementing a custom API key feature, I'm leaning toward waiting for Microsoft's promised offline AI integration in Windows. This would allow me to build a fully offline version of Lookupper—a more sustainable long-term solution.

      Thanks for getting back to me—here’s a bit more context on why being able to plug in my own API key and prompt would make a tangible difference:

      1. Deeper grammar coverage for Norwegian or other languages:
        Norwegian Bokmål actually has three grammatical genders (masculine, feminine, neuter). Article choice, definite/indefinite endings, and agreement all shift with gender, and most generic LLM prompts only return the default masculine form. With a custom prompt I can reliably ask for:
        • grammatical gender (m/f/n)
        • indefinite + definite forms
        • short usage examples for each gender
        Running that prompt myself avoids the extra manual checks I currently have to do outside Lookupper.

      2. Offline use and quota:
        Using my local Ollama (or Gemini with my key) works even off‑grid and removes the 50‑lookups/3‑hours cap—usage is only limited by my own model/account quotas.

      3. You could keep this Pro‑only, so power users pay for their own compute while casual users stay on your servers. Of course, it would be even better for me if you kept it in the free version 😃
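
      To make the request concrete, here is a minimal sketch of what points 1 and 2 could look like in practice: a user-supplied prompt asking for the Norwegian grammar details listed above, packaged as a request body for a local Ollama instance. The endpoint path and `model`/`prompt`/`stream` fields follow Ollama's `/api/generate` API; the prompt wording and the model name `llama3` are just illustrative assumptions, not anything Lookupper ships today.

      ```python
      import json

      # Ollama's default local endpoint (assumes Ollama is running locally).
      OLLAMA_URL = "http://localhost:11434/api/generate"

      # A hypothetical user-editable prompt covering the fields from point 1:
      # gender, indefinite/definite forms, and a short usage example.
      PROMPT_TEMPLATE = (
          "For the Norwegian Bokmål noun '{word}', give:\n"
          "1. grammatical gender (m/f/n)\n"
          "2. indefinite and definite singular forms\n"
          "3. one short usage example\n"
          "Answer concisely."
      )

      def build_request(word: str, model: str = "llama3") -> dict:
          """Build the JSON body for one non-streaming Ollama generation call."""
          return {
              "model": model,
              "prompt": PROMPT_TEMPLATE.format(word=word),
              "stream": False,  # ask for one complete JSON reply, not chunks
          }

      payload = build_request("bok")
      print(json.dumps(payload, indent=2))

      # Actually sending it would be roughly:
      #   import urllib.request
      #   req = urllib.request.Request(
      #       OLLAMA_URL, json.dumps(payload).encode(),
      #       {"Content-Type": "application/json"})
      #   print(json.loads(urllib.request.urlopen(req).read())["response"])
      ```

      Because the prompt lives on the user's side, tweaking it for another language (or another model) would not require any change in the app itself.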

        leland Thank you, my friend. Now I understand.

        15 days later

        I landed here with the same request! For those of us who are a bit more tech-savvy, having the ability to add our own API keys and custom instructions for the AI would be fantastic. Also, finding a way around the 50 requests every 3 hours would be a huge plus.

          efstofd

          Totally get where you’re coming from — having the option to plug in your own API keys and customize things would definitely be powerful for tech-savvy users.

          The tricky part is that there are a lot of AI providers out there, and even supporting just a few of them would be a big task. On top of that, maintaining compatibility every time one of them decides to change their API would be a constant headache. It’s not impossible, of course — I could do it — but unless this is something a lot more people are asking for, it’s probably not worth the effort for now.

          As for the 50 requests — just to clarify, that’s only a limit for free users. If you’re using Lookupper Pro, there’s no such cap.

          Appreciate the feedback, though — if I see more interest in this, I’ll definitely consider it for the roadmap.