Can you trust locally run LLMs?
wuphysics87@lemmy.ml to Privacy@lemmy.ml · 15 hours ago
I’ve been playing around with Ollama. Given that you download the model, can you trust it isn’t sending telemetry?
marcie (she/her)@lemmy.ml · 14 hours ago
Yeah, you could. Though I don’t see any evidence that the large open-source LLM programs like Jan.ai or Ollama are doing anything wrong with their programs or files, chucking it in a sandbox would solve the problem for good.
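One way to chuck it in a sandbox is a network-isolated container. A minimal sketch, assuming Docker and the official `ollama/ollama` image (the mount path is illustrative; the model must already be downloaded on the host):

```shell
# Start Ollama in a container with no network interface at all
# (--network=none), reusing models already downloaded to ~/.ollama.
docker run -d --rm \
  --name ollama-offline \
  --network=none \
  -v ~/.ollama:/root/.ollama \
  ollama/ollama

# With no network, the API port isn't reachable from the host,
# so talk to the model from inside the container instead:
docker exec -it ollama-offline ollama run llama3
```

With `--network=none` the process cannot send telemetry even if it wanted to; the trade-off is that pulling new models has to happen outside the sandbox.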
SeekPie@lemm.ee · edited 12 hours ago
You could use the “Alpaca” flatpak and remove its internet access with Flatseal after having downloaded the model (Linux). Or deny the app’s access to the internet in the app settings (Android).
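The Flatseal step can also be done from the command line with `flatpak override`. A sketch, assuming the Alpaca flatpak’s app ID is `com.jeffser.Alpaca` (verify yours with `flatpak list` first):

```shell
# After the model is downloaded, revoke the app's network access.
flatpak override --user --unshare=network com.jeffser.Alpaca

# Confirm the override is in place:
flatpak override --user --show com.jeffser.Alpaca
```

This is exactly what Flatseal does under the hood; the override persists across app updates until you remove it with `flatpak override --user --reset com.jeffser.Alpaca`.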