• undefined@lemmy.hogru.ch
    3 days ago

    This very much bothers me as a web developer. I go hard on conditional GET request support and compression, as well as HTTP/2+. I’m tired of using websites (outside of work) that need to load a fuckton of assets (even after I block 99% of advertising and tracking domains).
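    The conditional-GET dance boils down to something like this — a minimal Python sketch (stdlib only; the function names are mine, not any framework’s):

    ```python
    # Sketch of conditional GET: the server derives an ETag from the body and
    # answers 304 Not Modified when the client's If-None-Match still matches.
    import hashlib

    def make_etag(body: bytes) -> str:
        # Strong ETag derived from the response body content.
        return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

    def conditional_get(body: bytes, if_none_match: str | None):
        """Return (status, payload) for a GET with an optional If-None-Match."""
        etag = make_etag(body)
        if if_none_match == etag:
            return 304, b""   # client's cached copy is still fresh; skip the body
        return 200, body      # full response (would be sent with an ETag header)

    body = b"<html>hello</html>"
    status, _ = conditional_get(body, None)                 # first request
    revalidate, _ = conditional_get(body, make_etag(body))  # revalidation
    print(status, revalidate)  # 200 304
    ```

    Same idea with Last-Modified/If-Modified-Since, just comparing timestamps instead of hashes.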

    macOS and iOS actually allow updates to be cached locally on the network, and if I remember correctly Windows has some sort of peer-to-peer mechanism for updates too (I can’t remember if that works over the LAN though; I don’t use Windows).

    The part I struggle with is caching HTTP traffic. It used to be easy pre-HTTPS, but now it’s practically impossible. I do think other kinds of apps do a poor job of caching too, though.

    • frongt@lemmy.zip
      3 days ago

      Yes, Windows peer to peer update downloads work over LAN. (In theory, I’ve never verified it.)

      HTTP caching still works fine if your proxy performs SSL termination and re-encryption. In an enterprise environment that’s fine; for individuals it’s a non-starter. In that case, you’d want a local CDN mirror instead.

      • undefined@lemmy.hogru.ch
        3 days ago

        I couldn’t get SSL bumping working in Squid on Alpine Linux about a year ago, but I’m willing to give it another shot.

        My home router is also a mini PC running Alpine Linux. I do transparent caching of plain HTTP (it’s minimal, but it works), but with others using the router I feel uneasy about SSL bumping, not to mention some apps (banking apps especially) are a lot stricter about it.

        • frongt@lemmy.zip
          3 days ago

          Yeah, you’ll have to have a bypass list for some sites.
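          The usual way to do that in Squid (4+) is peek-and-splice: peek at the TLS ClientHello, splice (pass through untouched) anything on the bypass list, and bump the rest. Roughly like this — the paths, CA cert, and bypass file are placeholders, and Squid needs to be built with ssl-crtd support:

          ```
          # Bypass list: pinned/strict domains (banks etc.), one per line
          acl no_bump ssl::server_name "/etc/squid/bypass-domains.txt"

          http_port 3129 intercept ssl-bump \
              tls-cert=/etc/squid/ca.pem \
              generate-host-certificates=on
          sslcrtd_program /usr/lib/squid/security_file_certgen \
              -s /var/cache/squid/ssl_db -M 4MB

          acl step1 at_step SslBump1
          ssl_bump peek step1      # read SNI without terminating TLS
          ssl_bump splice no_bump  # tunnel bypassed sites untouched
          ssl_bump bump all        # terminate and re-encrypt the rest
          ```

          Spliced connections never see your CA, so pinned apps keep working; everything else gets cached.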

          Honestly, unless you’re actually on a very limited connection, you probably won’t see any actual value from it. Even if you do cache everything, each site hosts their own copy of jQuery or whatever the kids use these days, and your proxy isn’t going to cache that any better than the client already does.

          • undefined@lemmy.hogru.ch
            3 days ago

            For my personal setup I’ve been wanting to do it on a VPS I have. I route my traffic through a bundle of VPNs from the US to Switzerland, and I end up needing to clear my browser cache often (web developer testing JavaScript, etc.).

            each site hosts their own copy of jQuery or whatever the kids use these days

            I do this in my projects (Hotwire), but I wish I could say the same for other websites. I still run into websites that break because they try to import jQuery from Google, for example. That would be another nice thing to have cached.