• Khanzarate@lemmy.world
    10 hours ago

    It's just an API.

    There are a few ways they could go about it. They could have part of the prompt be something like “when the customer is done placing their order, create a JSON file with the order contents,” and essentially set up a dumb register that watches for those files and rings up each order the way a standard POS would.
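    A minimal sketch of that “dumb register” side, assuming a hypothetical setup where the model writes one *.json file per finished order into a shared directory:

    ```python
    import json
    import pathlib
    import tempfile

    def collect_orders(order_dir):
        # Pick up every order file the model wrote, parse it, and
        # consume it so the register doesn't ring the same order twice.
        orders = []
        for path in sorted(pathlib.Path(order_dir).glob("*.json")):
            orders.append(json.loads(path.read_text()))
            path.unlink()
        return orders

    # Demo: pretend the model just finished taking an order.
    tmp = tempfile.mkdtemp()
    pathlib.Path(tmp, "order-001.json").write_text(
        json.dumps({"items": [{"name": "number 6 meal", "qty": 1}]})
    )
    orders = collect_orders(tmp)
    print(orders)
    ```

    The real register loop would poll on a timer, but the idea is the same: the model never touches the POS directly, it just drops files in a folder.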

    They could spell out a tutorial in the prompt — “to order a number 6 meal, type ‘system.order.meal(6)’” — calling the same functions that a POS system would, and have that output go straight to a terminal.
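    That terminal side could be a thin dispatcher. A sketch, assuming a made-up POS function `order_meal()` and a model that was told to type `system.order.meal(6)`:

    ```python
    import re

    ordered = []

    def order_meal(n):
        # Stand-in for the real POS call the touch screen would make.
        ordered.append(n)

    COMMAND = re.compile(r"system\.order\.meal\((\d+)\)")

    def run_model_output(text):
        # Scan whatever the model printed for recognized commands and
        # route them to the same function the POS UI would call.
        for match in COMMAND.finditer(text):
            order_meal(int(match.group(1)))

    run_model_output("Sure thing! system.order.meal(6)")
    print(ordered)  # [6]
    ```

    Anything the model prints that doesn’t match the command pattern is simply ignored, which is part of the appeal of this approach.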

    They could have their POS system open on an internal screen, use a model that can process images, and have it output coordinate pairs to simulate touches on that screen — manually entering the order the way an employee would.
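    The glue for the simulated-touch idea is just hit-testing. A sketch with a made-up button layout, assuming the vision model answers with an (x, y) pair:

    ```python
    # Hypothetical on-screen button layout: name -> (x, y, width, height).
    BUTTONS = {
        "meal_6": (0, 0, 100, 50),
        "checkout": (0, 60, 100, 50),
    }

    def tap(x, y):
        # Translate a coordinate pair from the model into the button
        # an employee would have pressed on the touch screen.
        for name, (bx, by, w, h) in BUTTONS.items():
            if bx <= x < bx + w and by <= y < by + h:
                return name
        return None  # tapped dead space; nothing happens

    print(tap(40, 25))   # meal_6
    print(tap(40, 80))   # checkout
    print(tap(200, 200)) # None
    ```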

    There are lots of ways to hook up the AI, and it’s not actually that different from hooking up a normal POS system in the first place — although just because one method lets an AI interact doesn’t mean it’ll go about it correctly.

      • Khanzarate@lemmy.world
        9 hours ago

        They do; my concern is more about whether that JSON is correct, not just well-formed.

        Also, 18000 waters might be valid JSON, but it makes an AI a bad cashier.

        • staph@sopuli.xyz
          9 hours ago

          There is a lot more that goes into it than just being correct. 18000 waters may have been the actual order, because somebody decided to screw with the machine. A human who isn’t terminally autistic would reliably interpret that as a joke and would simply refuse to punch it in. The LLM will likely do whatever a human tells it to do: it has no contextual awareness, only the system prompt and whatever interaction with the user it has had so far.

          • Khanzarate@lemmy.world
            7 hours ago

            That’s part of correctness to me; delivering an order that Taco Bell actually would make is important.

            Semantics aside, though, we agree. That’s very important.

          • tomiant@programming.dev
            5 hours ago

            So they just tighten the instructions so it doesn’t take joke orders and can make more reasonable decisions, like:

            “May I take your order?”

            “Two double whoppers with extra mayo and a chocolate cherry banana sundae”

            “Oh you’ve GOTTA be joking!”