• Strawberry@lemmy.blahaj.zone · 1 day ago

    The bots scrape costly endpoints like the entire edit histories of every page on a wiki. You can’t always just cache every possible generated page at the same time.

    • jagged_circle@feddit.nl · edited 4 hours ago

      Of course you can. This is why people use CDNs.

      Put the entire site behind a CDN with a 24-hour cache TTL for unauthenticated users.
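
      A minimal sketch of that idea, assuming a Flask app sitting behind a CDN that honors Cache-Control headers (the cookie name, route, and TTL here are illustrative, not from any particular wiki):

          from flask import Flask, request

          app = Flask(__name__)

          SESSION_COOKIE = "session"   # assumed auth cookie name; adjust for your setup
          ANON_TTL = 60 * 60 * 24      # the 24-hour cache suggested above

          @app.route("/wiki/<path:page>")
          def wiki_page(page):
              # Stand-in for an expensive render, e.g. a full edit-history view.
              return f"rendered content for {page}"

          @app.after_request
          def set_cache_headers(response):
              if SESSION_COOKIE in request.cookies:
                  # Logged-in users bypass the shared cache.
                  response.headers["Cache-Control"] = "private, no-store"
              else:
                  # Anonymous traffic (including scrapers) can be served from the CDN edge.
                  response.headers["Cache-Control"] = f"public, max-age={ANON_TTL}"
              return response

      Whether this actually shields the origin depends on the CDN respecting those headers and on how it keys the cache, e.g. whether query-string variants like history views are cached separately.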