Article: https://proton.me/blog/deepseek

Calls it “Deepsneak”, failing to make it clear that the reason people love DeepSeek is that you can download it and run it securely on any of your own private devices or servers - unlike most of the competing SOTA AIs.

I can’t speak for Proton, but the last couple of weeks have shown some very clear biases coming out.

  • ReversalHatchery@beehaw.org · +3/−13 · 10 hours ago

    What??? Whoever wrote this sounds like he has 0 understanding of how it works. There is no “more privacy-friendly version” that could be developed; the models are already out, and you can run the entire model 100% locally. That’s as privacy-friendly as it gets.

    Unfortunately it is you who have 0 understanding of it. Read my comment below. TL;DR: good luck having the hardware.

    • v_krishna@lemmy.ml · +3 · 6 hours ago

      Obviously you need lots of GPUs to run large deep learning models. I don’t see how that’s a fault of the developers and researchers; it’s just a fact of this technology.

    • simple@lemm.ee · +14/−1 · edited · 10 hours ago

      I understand it well. It’s still relevant to mention that you can run the distilled models on consumer hardware if you really care about privacy (a sketch of this is below). 8GB+ of VRAM isn’t crazy, especially if you have a ton of unified memory on MacBooks or some Windows laptops releasing this year with 64+GB of unified memory. There are also websites re-hosting various versions of DeepSeek, like Hugging Face hosting the 32B model, which is good enough for most people.

      Instead, the article is written as though there is no way to use DeepSeek privately, which is simply wrong.
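
      For reference, here’s a minimal sketch of what running a distill locally looks like, using the Hugging Face transformers library. The model ID, dtype, and generation settings are assumptions for illustration; check the model card against your hardware (a laptop GPU with 8GB of VRAM would want the smaller 7B/8B distills, likely quantized).

      ```python
      # Minimal local-inference sketch with Hugging Face transformers.
      # Assumes: pip install torch transformers accelerate, and that the
      # distilled checkpoint below fits in your GPU/unified memory.
      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # assumed model ID

      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(
          model_id,
          torch_dtype=torch.float16,  # halves memory use vs. float32
          device_map="auto",          # spread layers across GPU/CPU as needed
      )

      messages = [{"role": "user", "content": "Explain RAID 1 in one paragraph."}]
      inputs = tokenizer.apply_chat_template(
          messages, add_generation_prompt=True, return_tensors="pt"
      ).to(model.device)

      outputs = model.generate(inputs, max_new_tokens=256)
      print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
      ```

      Nothing leaves the machine here: once the weights are downloaded, inference is entirely local.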

      • superglue@lemmy.dbzer0.com · +2 · 8 hours ago

        So I’ve been interested in running one locally, but honestly I’m pretty confused about which model I should be using. I have a laptop with a 3070 mobile in it. Which model should I be going after?

    • lily33@lemm.ee · +1 · 8 hours ago

      There are already other providers, like Deepinfra, offering DeepSeek. So while the average person (like me) couldn’t run it themselves, they do have alternative options.
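
      As an illustration, such providers typically expose an OpenAI-compatible endpoint; here’s a minimal sketch, assuming Deepinfra’s documented base URL and a DeepSeek model ID (both worth verifying against their current docs). Note that this trades away exactly the privacy point above: your prompts go to the provider.

      ```python
      # Hosted-inference sketch against an OpenAI-compatible API.
      # Assumes: pip install openai, a Deepinfra account/API key, and that
      # the base URL and model name below match Deepinfra's current docs.
      from openai import OpenAI

      client = OpenAI(
          api_key="YOUR_DEEPINFRA_API_KEY",                # placeholder
          base_url="https://api.deepinfra.com/v1/openai",  # assumed endpoint
      )

      resp = client.chat.completions.create(
          model="deepseek-ai/DeepSeek-R1",  # assumed model ID on Deepinfra
          messages=[{"role": "user", "content": "Summarize RAID 1 in one sentence."}],
      )
      print(resp.choices[0].message.content)
      ```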

    • azron@lemmy.ml · +2/−1 · edited · 8 hours ago

      Downvotes be damned, you are right to call out the parent; they clearly don’t articulate their point in a way that confirms they actually understand what is going on, and how an open-source model can still have privacy implications if the masses use the company’s hosted version.