Can you trust locally run LLMs?
wuphysics87@lemmy.ml to Privacy@lemmy.ml · 16 hours ago · 20 comments
I’ve been playing around with ollama. Given that you download the model and run it locally, can you trust that it isn’t sending telemetry?
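One way to build that trust is not to take the software's word for it but to watch its sockets while it runs. A minimal sketch, assuming a Linux box with iproute2's `ss` available and an `ollama` process serving a prompt; any established outbound connection owned by that process would be the thing to investigate:

```shell
# List established TCP connections belonging to a process named "ollama".
# -H drops the header, -t is TCP, -n skips DNS lookups, -p shows the owning
# process. An empty result while the model is generating suggests nothing
# is phoning home (the local API listener on 11434 is a listening socket,
# not an established outbound connection, so it won't appear here).
ss -H -tnp state established 2>/dev/null | grep ollama || echo "no outbound connections from ollama"
```

For a stronger guarantee than observation, you can deny the process network access entirely, e.g. `firejail --net=none ollama serve` on Linux, so telemetry is impossible rather than merely unobserved.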
stink@lemmygrad.ml · 2 hours ago
It’s nice, but sadly it’s hard to load unsupported models. I really wish you could easily sideload, but it’s nice unless you have a niche use case.