• idkwhatimdoing@sh.itjust.works

    As someone who works in content marketing, this is already untrue at the current quality of LLMs. It still requires a LOT of human oversight, which obviously it was not given in this example, but a good writer paired with knowledgeable use of LLMs is already significantly better than a good content writer alone.

    One example is writing outside a person’s subject expertise at a relatively basic level. This used to take hours or days of entirely self-directed research on a given topic, even if the ultimate article was going to be written for beginners and therefore in broad strokes. With diligent fact-checking and ChatGPT alone, the whole process, including final copy, takes maybe 4 hours.

    It’s also an enormously useful research tool. Rather than poring over research journals, you can ask LLMs with academic plug-ins to give a list of studies that fit very specific criteria and link to full texts. Sometimes it misfires, of course, hence the need for a good writer still, but on average this can cut hours from journalistic and review pieces without harming (often improving) quality.

    All the time writers save by having AI do legwork is then time they can instead spend improving the actual prose and content of an article, post, whatever it is. The folks I know who were hired as writers because they love writing and have incredible commitment to quality are actually happier now using AI and being more “productive” because it deals mostly with the shittiest parts of writing to a deadline and leaves the rest to the human.

    • circuitfarmer@lemmy.sdf.org

      It still requires a LOT of human oversight, which obviously it was not given in this example, but a good writer paired with knowledgeable use of LLMs is already significantly better than a good content writer alone.

      I’m talking about the future state. The goal is clearly to avoid the need for human oversight altogether, and the purpose of that is saving some rich people more money. I also disagree that LLMs improve the output of good writers, but even if they did, the cost to society is high.

      I’d much rather just have the human author, and I just hope that saying “we don’t use AI” becomes a plus for PR due to shifting public opinion.

      • kromem@lemmy.world

        No, it’s not the ‘goal’.

        Somehow when it comes to AI it’s humans who have the binary thinking.

        It’s not going to be “either/or” anytime soon.

        Collaboration between humans and ML is going to be the paradigm for the foreseeable future.

        • M0oP0o@mander.xyz

          The hundreds of clearly AI-written help articles with bad or useless info every time I’ve tried to look something up in the last few months say otherwise…

          • Womble@lemmy.world

            Because the internet was so clear of junk and spam before LLMs were released?

            • M0oP0o@mander.xyz

              There once was a time, long long ago, when the interwebs had good information on it. It was even easier to find then, before the googles went hard.

              But really, I have noticed a massive increase in AI junk writing popping up first in anything I try to look up.

              • Womble@lemmy.world

                If you want to go back to the 90s or early 2000s, sure. But 4 years ago the internet was already full of blogspam, clickbait articles, and fake news. LLMs haven’t increased that perceptibly to me; the first 10 results on Google were often crap 4 years ago and they’re often crap now.

                • M0oP0o@mander.xyz

                  Yes, some of us are old and still remember the hope and utility.

                  I will agree that things have been on the downslide for a while. Maybe it’s just the way Google now works, or that AI articles are free to churn out, but I get a ton of them for any “how to” or “walkthrough” type search. At least if I look up “how to make taco sauce,” the old-style article will actually tell me how, after the mandatory life story and other bullshit.