Replying to Loki

I think the best method is to use a shell script. Something like this:

```
# copy everything first; delete the local files only if the copy succeeded
scp -r path/to/files/* remote.ssh.server:/path/to/backup/ && rm -r path/to/files/*
```

This way, if it gets stuck, the `&&` stops the delete from running, and you can run it again after you deal with the error. If you used a remote file mount, anything at all, sshfs, SMB, whatever, you could take advantage of `cp -rfvpu` and preserve the permissions, and the `-u` flag would skip any file whose copy at the destination is at least as new (it compares modification times).
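For example, a minimal sketch with sshfs, assuming a hypothetical local mount point at `/mnt/backup`:

```
# mount the remote backup directory locally over SSH (mount point is an assumption)
mkdir -p /mnt/backup
sshfs remote.ssh.server:/path/to/backup /mnt/backup

# recursive, force, verbose, preserve attributes, update-only
cp -rfvpu path/to/files/* /mnt/backup/ && rm -r path/to/files/*

# unmount when done
fusermount -u /mnt/backup
```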

To be honest, scp is crappy for this: it can't resume an interrupted transfer, and it re-copies everything every time.

You could also use `rsync`; I believe it has all the functionality you're looking for in one tool.
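Something like this, as a sketch using the same paths as above:

```
# -a (archive) preserves permissions and times; -P resumes partial transfers;
# --remove-source-files deletes each local file only once it has copied successfully
rsync -avP --remove-source-files path/to/files/ remote.ssh.server:/path/to/backup/
```

By default rsync also skips files that already match in size and modification time at the destination, so re-running after a failure only transfers what's missing.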

ᴄʏʙᴇʀɢᴜʏ 👽 2y ago

I use Nextcloud with bind-mounted SMB drives.

But I'll steal your shell script 😁, it's simple and it seems effective.
