tl;dr

Can someone give me step-by-step instructions (ELI5) on how to access the LLMs on my rig from my phone?

Jan seems the easiest, but I’ve tried with Ollama, LibreChat, etc.

I’ve taken steps to secure my data and now I’m going the self-hosting route. I don’t care to become a savant with the technical aspects of this stuff, but even the basics are hard to grasp! I’ve been able to install an LLM provider on my rig (Ollama, LibreChat, Jan, all of ’em) and I can successfully get models running on them. BUT what I would LOVE to do is access the LLMs on my rig from my phone while I’m within proximity. I’ve read that I can do that via Wi-Fi or LAN or something like that, but I’ve had absolutely no luck. Jan seems the easiest because all you have to do is something with an API key, but I can’t even figure that out.

Any help?

  • DrDystopia@lemy.lol · edited · 22 days ago

    Just do like me - Install Ollama and OpenWebUI, install Termux on Android, connect through Termux with port forwarding.

ssh -L 0.0.0.0:3000:localhost:3000 user@ServerIP_OnLAN

    And access OpenWebUI at http://127.0.0.1:3000/ in your phone browser. Or SSH-forward the Ollama port to use the Ollama Android app. This requires you to be on the same LAN as the server. If you port forward SSH through your router, you can access it remotely through your public IP (if so, I’d recommend only allowing login through certs, or having a rate limiter for SSH login attempts).
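    The Ollama-port variant mentioned above would look something like this (a sketch assuming Ollama’s default port 11434; “user” and “ServerIP_OnLAN” are placeholders for your own SSH login and server address):

    ```shell
    # Forward the Ollama API port (default 11434) from the server to the phone.
    # -N: don't open a remote shell, just hold the tunnel open.
    ssh -N -L 0.0.0.0:11434:localhost:11434 user@ServerIP_OnLAN
    ```

    Then point the Ollama Android app at http://127.0.0.1:11434/ on the phone.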

    The shell command then becomes ssh -L 0.0.0.0:3000:localhost:3000 user@YourPublicIP
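    For the cert-only login I mentioned, a minimal sketch of the relevant OpenSSH server settings (edit /etc/ssh/sshd_config on the server, then restart sshd; a tool like fail2ban is one common choice for rate-limiting login attempts):

    ```
    # /etc/ssh/sshd_config — sketch: allow key-based login only
    PubkeyAuthentication yes
    PasswordAuthentication no
    PermitRootLogin no
    ```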

    But what are the chances that you run the LLM on a Linux machine and use an Android to connect, like me, and not a Windows machine with an iPhone? You tell me. No specs posted…

    • BlackSnack@lemmy.zip (OP) · 22 days ago

      Oh! Also, I’m using Windows on my PC. And my phone is an iPhone.

      I’m not using Linux yet, but that is on my to-do list for the future! After I get more comfortable with some more basics of self-hosting.