• frongt@lemmy.zip · 6 days ago

    Yeah, you’ll have to keep a bypass list for some sites.
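
    For concreteness, a minimal sketch of what that bypass list could look like, assuming the proxy is Squid (the domains are just placeholders):

    ```
    # Hypothetical Squid bypass list: never cache responses from these sites.
    acl bypass_sites dstdomain .bank.example .login.example
    cache deny bypass_sites
    ```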

    Honestly, unless you’re on a very limited connection, you probably won’t see any real value from it. Even if you do cache everything, each site hosts its own copy of jQuery or whatever the kids use these days, and your proxy isn’t going to cache that any better than the client already does.

    • WhyJiffie@sh.itjust.works · 2 days ago

      > Even if you do cache everything, each site hosts its own copy of jQuery or whatever the kids use these days, and your proxy isn’t going to cache that any better than the client already does.

      Don’t they always have a short cache timeout? The proxy could just tell the client that the timeout is a long one, and when the browser checks whether the asset is really up to date, the proxy would re-download it from the origin but just return the right status code (a 304) to the browser if it hadn’t actually changed.
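
      That is roughly what Squid’s refresh_pattern overrides do, assuming Squid is the proxy in question; a hedged sketch (the regex and lifetimes are arbitrary):

      ```
      # Hypothetical rule: treat static assets as fresh far longer than the
      # origin says, and turn client reloads into If-Modified-Since checks
      # upstream, so an unchanged asset comes back as a cheap 304.
      refresh_pattern -i \.(js|css|woff2?)$ 1440 90% 43200 override-expire reload-into-ims
      ```

      Squid’s own documentation warns that override-expire technically violates HTTP semantics, which is part of why a bypass list for fussy sites ends up necessary.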

      and all the jQuery copies could also be eliminated with a filesystem that can do deduplication, even if just periodically. I think btrfs and XFS can do that with reflink copies, and rmlint helps there.
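
      A hedged sketch of that, assuming the cached files live under a made-up /var/cache/proxy on a reflink-capable filesystem (btrfs or XFS):

      ```
      # Find duplicate files and emit a script that replaces each duplicate
      # with a reflink clone, so identical jQuery copies share data blocks.
      rmlint -T df -c sh:clone /var/cache/proxy
      sh rmlint.sh   # review the generated script before running it
      ```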

    • undefined@lemmy.hogru.ch · edited · 6 days ago

      For my personal setup I’ve been wanting to do this on a VPS I have. I route my traffic through a chain of VPNs from the US to Switzerland, and I end up needing to clear the browser cache on my end devices often (I’m a web developer testing JavaScript, etc.).

      > each site hosts its own copy of jQuery or whatever the kids use these days

      I do this in my projects (Hotwire), but I wish I could say the same for other websites. I still run into websites that break because they try to import jQuery from Google, for example. That would be another nice thing to have cached.
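
      Again assuming Squid, a hedged sketch of a rule pinned to that CDN (the pattern and lifetimes are illustrative):

      ```
      # Hypothetical rule: hold Google-hosted libraries (jQuery etc.) for a
      # long time; their URLs are versioned, so long caching is fairly safe.
      refresh_pattern -i ^https?://ajax\.googleapis\.com/ajax/libs/ 10080 90% 525600 override-expire
      ```

      The catch is that caching HTTPS traffic like this means the proxy has to do TLS interception (ssl-bump in Squid terms), which is its own can of worms.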