Pro@programming.dev to Technology@lemmy.world · English · 2 days ago
CursorAI "unlimited" plan rug pull: Cursor AI silently changed their "unlimited" Pro plan to severely rate-limited without notice, locking users out after 3-7 requests (consumerrights.wiki)
fmstrat@lemmy.nowsci.com · 2 days ago
I'm still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
And009@lemmynsfw.com · 2 days ago
I'm somewhat tech-savvy. How do I run an LLM locally? Any suggestions? And how do I know if my local data is safe?
Retro_unlimited@lemmy.world · 20 hours ago
I have been using a program called GPT4All, and you can download many models and run them locally. It prompts you at first launch to choose whether to share data; I select no and use it offline anyway.
Llak@lemmy.world · 2 days ago
Check out LM Studio (https://lmstudio.ai/), and you can pair it with the Continue extension for VS Code (https://docs.continue.dev/getting-started/overview).
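For anyone wondering how tools like Continue talk to LM Studio: LM Studio's local server exposes an OpenAI-compatible chat-completions endpoint (by default at http://localhost:1234/v1). Here's a minimal stdlib-only sketch of querying it — the port, the `local-model` name, and the prompt are assumptions; LM Studio largely ignores the model field and uses whatever model you have loaded:

```python
import json
from urllib.request import Request, urlopen

# LM Studio's default local endpoint (assumption: default port, server enabled)
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt: str, model: str = "local-model") -> Request:
    """Build an OpenAI-style chat-completion request for the local server."""
    payload = {
        "model": model,  # mostly informational; the loaded model answers
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    # Requires LM Studio running with its local server started and a model loaded.
    req = build_request("Write a Python one-liner to reverse a string.")
    with urlopen(req) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
```

Because everything goes to localhost, nothing leaves your machine — which also answers the data-safety question above.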