Hi everyone,

I have a bit of an issue / I'm a bit lost when it comes to photo management software and the special use case I have.

My situation: I have two main Proxmox servers - one at home, one as a dedicated server with a hoster. The former is pretty capable and has plenty of storage. The latter is doing okay, but storage gets freaking expensive pretty fast on dedicated machines, so I don't have that much space available.

I usually use the public machine for anything "public facing", e.g. services I host for friends and family, my website and - and here comes my problem - photo backup from mobile devices, as well as sharing photos with relatives, friends, etc.

The home server originally started as a NAS and acts as storage for my relatively large photo collection (I worked as a photographer as a side gig for a bit and therefore have, well, a relatively large collection).

My task/issue: I can't really make the home box public facing (home internet is way too unstable here) and honestly don't want to, for security reasons. On the other hand, I can't really put my collection on the public machine - that would quadruple my costs, as I would need a much better dedicated machine. For the small volume of new photos coming in through backups it's not an issue, but for the whole collection it would be. Now, very rightfully, the family complains that uploading and sorting the photos twice can't be a proper solution either. (Side note: shared photos are basically only newly added ones.)

So I had the idea to enable a one-way push from the public-facing instance to the private one. That can of course be done by an export script once per day or something, as sketched below. But that would only export the actual pictures - no software I know of provides an option to one-way sync the surrounding metadata as well. Which is quite odd, as I don't think I'm the only one with this issue.
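
The scheduling part is the trivial bit; the export script itself is the hypothetical piece here:

```bash
# Hypothetical crontab entry: run a nightly export at 03:00.
# /usr/local/bin/photo-export.sh stands in for whatever the export turns out to be.
0 3 * * * /usr/local/bin/photo-export.sh >> /var/log/photo-export.log 2>&1
```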

So… people… am I overthinking this? Am I doing something wrong? Does anyone have an idea how to solve this?

  • kwa@lemmy.zip · 12 hours ago

    I have been using frp to expose one port of my private server to the public one. Then on the public server, I'm using nginx as a reverse proxy to enable HTTPS.

    This works great for my use case. Regarding security: if the application has a vulnerability, it is still an open door to your private server. My app runs in rootless Podman, so only the container and the data it contains would be compromised.
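
    For reference, a minimal sketch of that layout using frp's classic INI config (newer frp versions use TOML); ports and addresses are placeholders:

    ```ini
    # frps.ini on the public server
    [common]
    bind_port = 7000

    # frpc.ini on the private server
    [common]
    server_addr = <public-server-ip>
    server_port = 7000

    [photos]
    type = tcp
    local_ip = 127.0.0.1
    local_port = 8080
    remote_port = 8080
    ```

    nginx then listens on 443 on the public box and proxies to 127.0.0.1:8080.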

  • UnfortunateShort@lemmy.world · edited · 14 hours ago

    For any kind of storage, cloud or not, I think rclone should provide whatever you need for free (not entirely sure about metadata). You will likely want at least one script tho, since the commands can get quite long depending on configuration.
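
    For example, a sketch of a nightly push; the remote name and paths are placeholders, with "home" being e.g. an SFTP remote reachable over a tunnel:

    ```bash
    #!/usr/bin/env bash
    # One-way push of new uploads from the public box to the home box.
    rclone copy /srv/photos/uploads home:photos/incoming \
      --metadata \
      --min-age 5m \
      --log-file /var/log/rclone-photos.log \
      --log-level INFO
    # Note: --metadata carries file-level metadata (mtime, ownership, etc.) where
    # the backend supports it; it does not touch a photo app's own database.
    ```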

    I don't know whether this fits your needs better, but Ente and Proton offer E2E-encrypted cloud storage with the option to share parts of it via link or accounts, as well as clients that automatically sync stuff for you (Android only for Ente, I think).

  • Statick@programming.dev · edited · 15 hours ago

    WireGuard VPN tunnel + rclone sync (or rclone copy) to a specific drive/folder on your home network
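
    A rough sketch of the tunnel side, with placeholder keys and addresses; the home box dials out, so the unstable home connection only has to re-establish the tunnel:

    ```ini
    # /etc/wireguard/wg0.conf on the public box
    [Interface]
    Address = 10.8.0.1/24
    ListenPort = 51820
    PrivateKey = <public-box-private-key>

    [Peer]
    PublicKey = <home-box-public-key>
    AllowedIPs = 10.8.0.2/32

    # /etc/wireguard/wg0.conf on the home box
    [Interface]
    Address = 10.8.0.2/24
    PrivateKey = <home-box-private-key>

    [Peer]
    PublicKey = <public-box-public-key>
    Endpoint = <public-box-ip>:51820
    AllowedIPs = 10.8.0.1/32
    PersistentKeepalive = 25
    ```

    An rclone remote pointed at 10.8.0.2 (e.g. over SFTP) then handles the file side, as in the rclone suggestion above.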

    Edit: Not sure why it posted 3 times… Deleted the others.

    • philpo@feddit.org (OP) · 10 hours ago

      Yeah, that would be ideal, but the issue is that I need photo management software that exports the metadata and imports it again - this is the main issue. Otherwise things would be far easier.

  • VeryFrugal@sh.itjust.works · 14 hours ago

    If you are concerned about storage price, you should use something like S3 or B2. It’s dirt cheap and it will work with almost anything you are currently using.

    • philpo@feddit.org (OP) · 11 hours ago

      I would love to do that, but the issue is getting the software to accept it - basically I need a solution that exports the metadata as well and then adds it to a larger library - and that is the problem.

      • zlatko@programming.dev · 2 hours ago

        That really depends on the software you use. Some software might have a way to do it, but it may be indirect.

        E.g. digiKam is photo library management software. It can move albums between "libraries", and it is designed so that some of those libraries can be offline occasionally (more in the sense of SD cards, but also e.g. USB storage). So you could map one mountable library to one disk and another to your "network storage" (however you attach your home server). That includes the metadata, depending on where and how you store it. And the digiKam database itself is just a file as well (an SQLite database), so you can back that up at the same time.

        I'm not sure how to automate this process. Even a manual "cheat" - moving the files to the network drive, then symlinking them back, once per month or something - might work. It's a bit of a manual process, but digiKam is designed to be storage-based, and a lot of other software is as well.
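
        A very rough sketch of what an automated push could look like, assuming digiKam with metadata written to the files/sidecars and the default SQLite setup; paths and the remote name are placeholders:

        ```bash
        #!/usr/bin/env bash
        # Push the album tree plus digiKam's SQLite database to the home box.
        LIBRARY=/srv/photos/library   # digikam4.db lives in the library root by default
        DEST=home:photos              # rclone remote over a VPN tunnel or similar

        # Copy pictures and XMP sidecars (tags/ratings travel with the files).
        rclone copy "$LIBRARY" "$DEST/library"

        # Snapshot the database safely even while digiKam may be running, then ship it.
        sqlite3 "$LIBRARY/digikam4.db" ".backup /tmp/digikam4.db"
        rclone copy /tmp/digikam4.db "$DEST/db-backup/"
        ```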

        But again, I don't know if you're using digiKam or something else, and how you've set it up. So, what software do you have? How do your users sync their photos and albums? That might help with planning.

  • Use a script to copy over the photos from the public to the private. I assume the software you're using has a database to store all the metadata. You can set up one of your servers as a master DB and mirror that through a variety of tools to the other location. I think you would need to set the master DB as the one on your private machine (as it has all the content) and mirror that to the public. I'm not sure how your software will handle having metadata for images it doesn't actually have.

    There might be some way to do bidirectional DB sync. Or perhaps run the DB on the public box and connect your private one to that DB, so you would have two services running off the same DB (and syncing files back to the private as necessary).
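
    Assuming the app's database is Postgres (as with Immich, below), a crude nightly one-way mirror could be a plain dump/restore; hostnames and DB names are placeholders, and replaying over a live instance needs care:

    ```bash
    #!/usr/bin/env bash
    # Dump the DB on the public box and replay it on the private one.
    pg_dump -h public-box -U immich -Fc immich > /tmp/immich.dump
    pg_restore -h 10.8.0.2 -U immich -d immich --clean /tmp/immich.dump
    ```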

    I use Immich as my image management and you can point it at any DB service you want.
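
    For instance, Immich's server reads the database location from environment variables, so pointing it at a DB elsewhere is just configuration (values here are placeholders):

    ```bash
    # Excerpt of Immich's .env, per its documented DB settings
    DB_HOSTNAME=10.8.0.2      # Postgres reachable over the tunnel
    DB_PORT=5432
    DB_USERNAME=immich
    DB_PASSWORD=example-secret
    DB_DATABASE_NAME=immich
    ```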

    Edit: you might be able to do some custom delivery config in nginx or similar, to serve the content itself from the public box and, if that fails, serve it from the private one (you would need to make the private box reachable for this).
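
    In nginx terms, that fallback could look roughly like this, assuming the private box is reachable from the public one (e.g. over a tunnel, per the other comments); paths and the upstream address are placeholders:

    ```nginx
    # Serve a photo from local storage on the public box if present,
    # otherwise proxy the request to the private box.
    location /photos/ {
        root /srv;
        try_files $uri @private;
    }
    location @private {
        proxy_pass http://10.8.0.2:8080;
    }
    ```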

  • Abe@civv.es · 18 hours ago

    @[email protected] Why not do something like WireGuard/Tailscale on your public-facing instance, have nginx there, and reroute photos.domain.com to your private server?

    So, something like: subdomain.domain -> DNS pointed at the public box -> hits your public box -> nginx there routes to the WireGuard IP (10.1.1.5) of your private box -> <over WireGuard> -> hits your private 10.1.1.5:8080, or something like that?
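
    Roughly, the nginx side of that on the public box (certificate setup omitted; the domain and addresses are the placeholders from above):

    ```nginx
    server {
        listen 443 ssl;
        server_name photos.domain.com;
        # ssl_certificate / ssl_certificate_key omitted for brevity

        location / {
            proxy_pass http://10.1.1.5:8080;   # private box over the WireGuard tunnel
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }
    ```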

    • philpo@feddit.org (OP) · 10 hours ago

      That would need a stable enough connection - as written above, that is sadly not the case and won't be for the next few years.

      Nightly backups/syncs might work; putting the private machine "public facing" won't - I tried that.