• halcyoncmdr@lemmy.world
    10 hours ago

    The LLM isn’t limited to just what it does. It can interact with other programs.

    There are a ton of audio recognition systems available, almost all of which predate this LLM bubble. There’s already an API for interacting with the ordering system. So it’s just down to having the LLM parse what was said and then trigger the corresponding action for the order.

    This is so simple it doesn’t require anything nearly as complicated as an LLM. Older voice assistants like Siri and Alexa could do this type of thing. It’s literally the same as telling Alexa to place an order for something, and that’s been possible for years.
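
    To illustrate the point, here is a minimal sketch of the “Alexa-style” pipeline described above, with no LLM involved. The menu, prices, and order format are all made up for illustration; a plain keyword matcher turns a speech transcript into a structured order that an ordering API could consume.

    ```python
    # Hypothetical sketch: keyword-match a transcript against a made-up menu.
    MENU = {
        "cheeseburger": 4.99,
        "fries": 2.49,
        "cola": 1.99,
    }

    NUMBER_WORDS = {"a": 1, "one": 1, "two": 2, "three": 3}

    def parse_order(transcript: str) -> dict:
        """Match menu keywords (and a preceding count word) in a transcript."""
        words = transcript.lower().replace(",", "").split()
        items = {}
        for i, word in enumerate(words):
            # Allow a plural form ("cheeseburgers") to match the menu entry.
            name = word.rstrip("s") if word.rstrip("s") in MENU else word
            if name in MENU:
                qty = NUMBER_WORDS.get(words[i - 1], 1) if i > 0 else 1
                items[name] = items.get(name, 0) + qty
        total = round(sum(MENU[n] * q for n, q in items.items()), 2)
        return {"items": items, "total": total}
    ```

    Something like `parse_order("I'd like two cheeseburgers and a cola")` then yields a structured order ready to hand to whatever ordering API exists, which is exactly the kind of thing pre-LLM assistants already did.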

    • ch00f@lemmy.world
      10 hours ago

      So the output from the LLM is just a text description that’s fed into another, smarter piece of software that interprets that text into an order? What task is the LLM actually doing in this case?

      • Dashi@lemmy.world
        10 hours ago

        The LLM is taking the order: interpreting what people say into that simple text description. Not everyone talks the same or describes things the same way. That, I believe, is where the LLM is doing the bulk of the work. Then I’m sure there is some background stock management and health checking it manages as well.
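
        One way that division of labor could look (the menu, prompt, and limits here are all made up): the LLM only normalizes free-form speech into a fixed JSON shape, and ordinary code validates that JSON before the ordering system ever sees it.

        ```python
        import json

        MENU = {"cheeseburger", "fries", "cola"}  # illustrative menu

        # Hypothetical prompt asking the model for a fixed JSON order format.
        PROMPT = (
            "Convert the customer's words into JSON of the form "
            '{"items": [{"name": ..., "qty": ...}]}. '
            "Use only these menu names: cheeseburger, fries, cola.\n"
            "Customer: {utterance}"
        )

        def validate_order(llm_output: str) -> list:
            """Check the model's JSON before it reaches the ordering system."""
            order = json.loads(llm_output)
            items = []
            for item in order["items"]:
                name, qty = item["name"], int(item["qty"])
                if name not in MENU:
                    raise ValueError(f"unknown menu item: {name}")
                if not 1 <= qty <= 20:
                    raise ValueError(f"implausible quantity: {qty}")
                items.append((name, qty))
            return items
        ```

        The model absorbs the variation in how people phrase things; everything after that is deterministic code.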

      • Vanth@reddthat.com
        10 hours ago

        I don’t think there is an LLM in this application. Not all AI tools involve an LLM.

    • danc4498@lemmy.world
      10 hours ago

      I think the role of the LLM is just to make the system understand the order more accurately.