I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.

I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?

  • slazer2au@lemmy.world · 2 months ago

    Wanting answers to things you don’t want google to know that you don’t know.

      • umami_wasabi@lemmy.ml · 2 months ago

        IMO LLMs are OK for getting a head start on searching. Say you have a vague idea of something but don't know the exact keywords: an LLM can suggest them, and you feed its output into whatever search engine you like. This saves a lot of time fiddling with the right keywords.

        • dwindling7373@feddit.it · 2 months ago

          Sure, or you could send an email to the leading international institution on the matter to get a very accurate answer!

          Is it the most reasonable course of action? No. Is it more reasonable than wasting a gazillion watts so you can maybe get some better keywords to then paste into a search engine? Yes.
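
The keyword-bootstrapping workflow described a couple of comments up can be sketched against a local Ollama instance. This is a minimal sketch, assuming Ollama's default HTTP endpoint on `localhost:11434`; the model name `llama3` is a placeholder for whatever you have pulled.

```python
import json
import urllib.request


def build_payload(vague_idea: str, model: str = "llama3") -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    prompt = (
        "Suggest 3-5 concise search-engine keywords for this vague idea, "
        "one per line, with no explanations:\n" + vague_idea
    )
    return {"model": model, "prompt": prompt, "stream": False}


def suggest_keywords(vague_idea: str,
                     host: str = "http://localhost:11434") -> str:
    """Ask the local model for keywords; returns its raw text response."""
    req = urllib.request.Request(
        host + "/api/generate",
        data=json.dumps(build_payload(vague_idea)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires a running Ollama server):
#   print(suggest_keywords("that thing where a program calls itself to walk a tree"))
```

You then paste the suggested terms into a search engine yourself, so the slow CPU-only model only has to produce a few lines of text rather than a full answer.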