News Feed
  • DrugHub has agreed to fully refund all users who lost money in the SuperMarket exit scam.  
  • Retro Market has gone offline. Circumstances of the closure unknown.  
  • SuperMarket has closed following an exit scam by one of the admins.  
  • The admin of Incognito Market, Pharoah, has been arrested by the FBI several months after exit scamming.  
  • Popular P2P exchange LocalMonero has announced it is closing.  

Switching from ChatGPT : OpSec | Torhoo darknet markets

I'd like to run a local LLM because of the lack of privacy with chatbots like ChatGPT.
How does it work?
What LLM would you recommend for a mid-range PC?
I mainly want to use it for asking about software and tech in general.
/u/caseykc
1 point
4 days ago
Assuming you're on Linux it's not hard; I don't know about other OSes. You'd need a seriously beefy graphics card and a few hundred GB of free space to run any local model that's as good as something like Grok or ChatGPT.

I don't know what models to recommend because I don't know how good your mid-range PC is. Maybe start with OpenChat, it's pretty lightweight. See how well it runs and then decide if you want to try any more demanding models.

Here's how you can get a basic local LLM setup:

Install Docker if you don't have it.

Download Ollama: https://ollama.com/download/linux
Once it installs, you can pull models from ollama.com/models with:
ollama pull <model-name>
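The steps above can be sketched as shell commands. This is a setup sketch, not something to copy blindly: the install script URL is the one Ollama's Linux download page points at, and "openchat" is just the lightweight model suggested earlier, assuming it's available in the Ollama model library.

```shell
# Install Ollama on Linux via its official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull a lightweight model (OpenChat, as suggested above)
ollama pull openchat

# Chat with it interactively in the terminal
ollama run openchat
```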

You can run models in your terminal with Ollama, but you can also get a nice ChatGPT-style GUI for it, which I highly recommend.
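Under the hood, Ollama also exposes a local HTTP API (on port 11434 by default), and that API is what GUI frontends talk to. As a rough sketch of how a request to its /api/generate endpoint is put together, here's the body construction in Python; the model name "openchat" is just the example from above, and nothing is sent anywhere unless you actually have Ollama running.

```python
import json

def build_generate_request(model: str, prompt: str) -> str:
    """Build the JSON body for Ollama's POST /api/generate endpoint."""
    payload = {
        "model": model,    # name of a model you've already pulled
        "prompt": prompt,  # the text you want the model to respond to
        "stream": False,   # one complete response instead of a token stream
    }
    return json.dumps(payload)

body = build_generate_request("openchat", "Explain what a symlink is.")
print(body)

# To actually send it (requires Ollama running locally), something like:
#   curl http://localhost:11434/api/generate -d '{"model": "openchat", ...}'
```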
Go to https://github.com/open-webui/open-webui and scroll down to the Docker section. There's a list of commands for different setups; copy the one that's right for you and run it. Once Open WebUI installs, you should be able to access it at localhost:3000.

I don't know how to let it search the web yet, but there are probably tutorials online. Hope this helps.
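For reference, the Docker command for a plain default setup looks roughly like the one below. Treat it as a sketch and double-check it against the Open WebUI repo's README, which is the authoritative source; the port mapping, volume name, and container name here are assumed defaults.

```shell
# Run Open WebUI in the background, mapping its internal port 8080
# to localhost:3000 and persisting its data in a named Docker volume
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

After the container starts, the GUI should be reachable in a browser at http://localhost:3000.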