Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I used rsync, but specifying the IP address and the folders and everything is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I am missing out on an awesome tool I wasn’t even thinking about.
rsync is indeed fiddly. Consider SFTP in your GUI of choice. I mount the folder I need in my file browser and grab the files I need. No terminal needed and I can put the folders as favorites in the side bar.
If you want to use the terminal though, there is
scp
which is supported on both Windows and Linux. It’s just
scp [file to copy] [username]@[server IP]:[remote location]
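For example, assuming a server at 192.168.1.20 and a user called alice (both just placeholders), copying a file into their home directory would look something like:
scp ./backup.tar.gz alice@192.168.1.20:/home/alice/
Add -r if you want to copy a whole directory recursively.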
Magic wormhole is pretty dead simple https://magic-wormhole.readthedocs.io/en/latest/welcome.html#installation
I use this a lot at work for moving stuff between different test vms, as you don’t need to check IPs/hostnames
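A minimal send/receive looks something like this (filename and code are made up):
wormhole send ./backup.tar.gz
The sender gets a short code like 7-guitarist-revenge, and on the receiving machine you run:
wormhole receive 7-guitarist-revenge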
As I understand it, establishing the connection relies on a relay server. So this would not work on a local network without a relay server and would, by default, try to reach a server on the internet to make connections.
People have already covered most of the tools I typically use, but one I haven’t seen listed yet that is sometimes convenient is
python3 -m http.server
which runs a small web server that shares whatever is in the directory you launched it from. I’ve used that to download files onto my phone before when I didn’t have the right USB cables/adapters handy, as well as for getting data out of VMs when I didn’t want to bother setting up something more complex.
Syncthing
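Picking up the http.server tip above, a minimal sketch of that workflow, with the port and LAN address made up. On the machine holding the files:
python3 -m http.server 8000
Then from the phone or other machine, open http://192.168.1.20:8000/ in a browser, or grab a single file with:
curl -O http://192.168.1.20:8000/backup.tar.gz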
Not gonna lie, I just map a network share and copy and paste through the gui.
Same lol, somebody please enlighten me on a faster way!
Sounds very straightforward. Do you have a Samba docker container running on your server, or how do you do that?
I don’t have a docker container, I just have Samba running on the server itself.
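For reference, a minimal share in /etc/samba/smb.conf looks roughly like this (share name and path are placeholders):
[homelab]
    path = /srv/share
    read only = no
    guest ok = no
Restart smbd after editing and the share shows up as \\server\homelab from Windows or smb://server/homelab from Linux/Mac file managers.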
I do have an owncloud container running, which is mapped to a directory. And I have that shared out through samba so I can access it through my file manager. But that’s unnecessary because owncloud is kind of trash.
I have two servers, one Mac and one Windows. For the Mac I just map directly to the smb share, for the Windows it’s a standard network share. My desktop runs Linux and connects to both with ease.
I just type
sftp://[ip, domain or SSH alias]
into my file manager and browse it as a regular folder.
Dolphin?
Any file manager on Linux supports this
Rsync and NFS for me.
And me.
scp
scp is deprecated.
SCP, the protocol, is deprecated. scp, the command, just uses the SFTP protocol these days. I find its syntax convenient.
Oh does it? I didn’t realize that. I’ve just switched over to rsync completely.
Checks username… yeah that tracks
By “homelab”, do you mean your local network? I tend to use shared folders, kdeconnect, or WebDAV.
I like WebDAV, which I can activate on Android with DavX5 and Material Files, and I use it for Joplin.
Nice thing about this setup is that I also have a certificate-secured OpenVPN, so in a pinch I can access it all remotely when necessary by activating that VPN, then disconnecting.
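As an aside, WebDAV is also easy to script against from a shell if you ever need it; a rough example with a made-up server URL and username:
curl -u alice -T ./notes.md https://home.example.com/webdav/notes.md
curl -T uploads the file with an HTTP PUT, which is all a basic WebDAV upload needs.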
- sftp for quick shit like config files off a random server, because it’s easy and is on by default with sshd in most distros
- rsync for big one-time moves (quick example below the list)
- smb for client-facing network shares
- NFS for SAN usage (mostly storage for virtual machines)
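For those big one-time rsync moves, a typical invocation might look like this (host and paths are placeholders):
rsync -avP /data/photos/ alice@nas.local:/tank/photos/
-a preserves permissions and timestamps, -v is verbose, and -P shows progress and keeps partial files so an interrupted transfer can resume.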
As a lazy person, I just prefer
sftp
on Thunar.
rclone. I have a few helper functions:
fn mount { rclone mount http: X: --network-mode }
fn kdrama {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/KDrama/$x --filter-from ~/.config/filter.txt }
fn tv {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/TV/$x --filter-from ~/.config/filter.txt }
fn downloads {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/Downloads/$x --filter-from ~/.config/filter.txt }
So I download something to my seedbox, then use
rclone lsd http:
to get the exact name of the folder/files, and run
tv "filename"
and it runs my function. It pulls all the files (based on filter.txt) using multiple threads to the correct folder on my NAS. Works great, and maxes out my connection.
I’d say use something like zeroconf(?) for local computer names. Or give them names in either your DNS forwarder (router), hosts file or SSH config. Along with shell autocompletion, that might do the job. I use scp and rsync, and I have an NFS share on the NAS and some bookmarks in Gnome’s file manager, so I just click on that or type scp or rsync with the target computer’s name.
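If you go the SSH config route, a minimal ~/.ssh/config entry might look like this (alias, address and user are made up):
Host nas
    HostName 192.168.1.20
    User alice
After that, plain scp backup.tar.gz nas:/tank/ or rsync -avP ./photos/ nas:/tank/photos/ works without remembering the IP, and shell tab completion usually picks the alias up.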
rsync if it’s a from/to I don’t need very often
More common transfer locations are done via NFS
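For the NFS side, the usual pattern is an export on the server and a mount on the client; a rough sketch with made-up hostname and paths:
sudo mount -t nfs nas.local:/export/media /mnt/media
or the equivalent /etc/fstab line so it comes back after a reboot:
nas.local:/export/media  /mnt/media  nfs  defaults  0  0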
Resilio Sync
I have a shared syncthing folder on all my devices