Setting up a dedicated SFTP Ubuntu Linux root server with basic security, plus some other things you can do on a Linux server

I wrote this post about using Filemail, a dedicated SFTP server, IBM Aspera and Amazon S3 to transfer big files like DCPs and ProRes files. This is a follow-up post on some things you can do on a dedicated Ubuntu server that can be useful if you download and upload large 100 GB files like DCPs and ProRes files. I use a Hetzner dedicated server, which is easy to install Ubuntu Linux on. A dedicated server with unlimited data running SFTP at Hetzner can be set up on the cheapest auction servers, which cost around 30 Euro a month. Alternatively you can buy a Hetzner storage box with FTP, SFTP and rsync over SSH. For 11 Euro you get a storage box with 2 TB of storage, 10 concurrent users and 10 TB of monthly data transfer. For 100 Euro a month you can buy a managed server.

Setting up basic security

Updating the server

sudo apt update
sudo apt upgrade

Make a user with sudo privileges instead of using root

When Ubuntu is installed you have root access and can log on to the server in an SSH shell with the root account. For better security I make a user with sudo privileges that is used instead of root.
I used the instructions from here

sudo adduser newuser
sudo usermod -aG sudo newuser
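Before logging out of the root session it is worth confirming that the new user really is in the sudo group. A minimal sketch, assuming a POSIX shell; the `in_group` helper and the `newuser` name are just illustrations:

```shell
# Check that a user is in a given group -- a quick sanity check
# before you close the root session.
in_group() {
  id -nG "$1" | grep -qw "$2"
}

if in_group newuser sudo; then
  echo "newuser can sudo"
else
  echo "newuser is NOT in the sudo group"
fi
```

You can also simply log in as the new user and run `sudo whoami`, which should print `root`.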

Disable root login in /etc/ssh/sshd_config

sudo apt install nano
sudo nano /etc/ssh/sshd_config

Set PermitRootLogin to no to disable root logins completely. (Only commenting out PermitRootLogin yes falls back to the default, prohibit-password, which still allows root to log in with SSH keys.)

PermitRootLogin no
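Before disconnecting, it is a good idea to validate the config and restart the SSH daemon so the change takes effect; `sshd -t` only checks the syntax and prints nothing when the file is valid. This is a small admin fragment to run on the server:

```shell
# Validate sshd_config syntax before restarting --
# a typo in this file could lock you out of the server.
sudo sshd -t && sudo systemctl restart sshd.service
```

Keep your current SSH session open and test a fresh login in a second terminal before closing it.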

Stop brute force attacks with sshguard and nftables

Install nftables if necessary

Ubuntu 20.10 has nftables as the default firewall backend, but if you are running 20.04 you can install it.
I used the instructions from here

sudo apt install nftables
sudo systemctl enable nftables
sudo systemctl start nftables
sudo systemctl status nftables

To set up sshguard I used the instructions from here


sudo apt install sshguard

Edit /etc/sshguard/sshguard.conf and point BACKEND= at the sshg-fw-nft-sets helper

sudo nano /etc/sshguard/sshguard.conf

BACKEND="/usr/lib/x86_64-linux-gnu/sshg-fw-nft-sets"

sudo systemctl enable sshguard
sudo systemctl restart sshguard
sudo systemctl status sshguard

Soon you will see blocked IP addresses when you list the NFT ruleset

sudo nft list ruleset
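If you ever need to unblock an address (for example your own, after a few typos), you can manipulate sshguard's nft set directly. A sketch under the assumption that sshguard's nftables backend uses a table named sshguard with a set named attackers, and with a documentation-range IP as a placeholder; check `nft list ruleset` for the actual names on your system:

```shell
# List only sshguard's blocked addresses (table and set names
# are assumptions -- verify them with `nft list ruleset`)
sudo nft list set ip sshguard attackers

# Remove a single address from the block list (hypothetical IP)
sudo nft delete element ip sshguard attackers { 192.0.2.10 }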

Make chrooted SFTP jail users that can download and upload only from a folder in their home directory

Instead of setting up sshd_config with the correct settings and manually adding chrooted users with folders, I use these shell scripts on GitHub from Matthieu Petiteau.

https://github.com/smallwat3r/jailed-sftp-users

To make a user called download1 with the password goodpassword I would run these commands. I generate my passwords without symbols with the LastPass generator at https://www.lastpass.com/password-generator

sudo apt install git
git clone https://github.com/smallwat3r/jailed-sftp-users.git
cd jailed-sftp-users
sudo ./initialize
sudo ./create_user download1 goodpassword
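If you would rather generate a symbol-free password on the server itself instead of in a browser, the standard tr/head trick works. A small sketch; the 24-character length is an arbitrary choice:

```shell
# Generate a 24-character password from letters and digits only.
# 512 random bytes yield ~120 alphanumeric characters on average,
# comfortably more than the 24 we keep.
pw=$(head -c 512 /dev/urandom | tr -dc 'A-Za-z0-9' | head -c 24)
echo "$pw"
```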


If you prefer to run all the commands from the scripts manually, you can follow the instructions from here

Change these settings in /etc/ssh/sshd_config

nano /etc/ssh/sshd_config

Put a comment # before this line
#Subsystem      sftp    /usr/lib/openssh/sftp-server
Add this line

Subsystem sftp internal-sftp

Add this Match block for the group sftpusers at the end of the file

Match Group sftpusers
ChrootDirectory %h
ForceCommand internal-sftp
AllowTcpForwarding no
X11Forwarding no

Make a download user called download1 that can download and upload only to /home/download1/data. Create the sftpusers group first if it does not already exist.

sudo groupadd sftpusers
sudo useradd -g sftpusers -s /sbin/nologin -m -d /home/download1 download1
sudo passwd download1
sudo chown root:root /home/download1
sudo chmod 755 /home/download1/
sudo mkdir /home/download1/data
sudo chown download1:sftpusers /home/download1/data
sudo chmod 755 /home/download1/data
sudo systemctl restart sshd.service
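After restarting sshd you can verify the jail from a client machine. The IP below is a placeholder; the exact sftp prompt output depends on your client:

```shell
# From a client machine (hypothetical server IP):
sftp download1@111.111.111.111
# The user lands inside the chroot and should only see the data folder.
# A normal interactive ssh login will not give a shell,
# since ForceCommand internal-sftp is set for the group.
```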

Using SFTPGo with S3 storage and awscli

SFTPGo and awscli are an alternative if you want to store data in Amazon S3 instead of locally. Each user is chrooted inside an Amazon S3 bucket. You can also use SFTPGo with ordinary local storage.

I have successfully used SFTPGo with S3 storage, uploading and downloading 100+ GB files. (Note: this solution does not support resume, and you need to turn off segmented downloads in Cyberduck.)

SFTPGo uses more CPU resources than the standard SFTP server, but you get a lot of extra features. You can add users and see who is logged on and transferring in a web interface.

I used the instructions from here to install it. For the user I changed UL Part Size (MB) to 50 and UL Concurrency to 5, and installed awscli 2 manually (the newest version).

Backup or copy data to Amazon S3 storage with awscli

Install awscli manually

I used the instructions from here to configure awscli for SFTPGo.

To backup new files from a folder to an Amazon s3 storage bucket use this command:

aws s3 sync /home/download1 s3://bucketname64352/download1backup/
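A couple of `aws s3 sync` flags are worth knowing: `--dryrun` previews what would be transferred, `--delete` mirrors local deletions to the bucket, and `--exclude` filters files. The bucket name below is the same placeholder as above:

```shell
# Preview what would be copied without transferring anything
aws s3 sync /home/download1 s3://bucketname64352/download1backup/ --dryrun

# Mirror the folder, deleting remote files that no longer exist locally,
# and skip generated checksum files
aws s3 sync /home/download1 s3://bucketname64352/download1backup/ \
    --delete --exclude "*.md5"
```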

Backup or copy data to another Linux server with rsync over SSH

More info on the commands is available here. This syncs a directory and its subdirectories to another server.

rsync -aP /home/download1/data/ download1@1.1.1.1:/home/download1/data

Other things you can do on a linux server

Logging on to the server with SSH and downloading a DCP folder or video files from another SFTP/FTP server

Sometimes it can be nice to download something directly from FTP servers and Filemail to the server. In a terminal you can use SSH to log in to the server and use the screen command to detach from the session, so you can resume it if the connection to the server is broken.

ssh user@111.111.111.111


If I want to download something to a folder /home/user/download/newfolderwiththings.

cd /home/user/download
mkdir newfolderwiththings
sudo apt update
sudo apt install ncftp
sudo apt install screen
screen
ncftp -u username 111.111.111.111

When you have started ncftp you can use get with the recursive flag -R to download a folder and its subdirectories. Use ls to list the contents and cd to change directories.

ls
cd folder
get -R /folderwiththings

Detach from the screen session by pressing Ctrl-a followed by d

When connecting to the server over SSH again, use screen -r to reattach to the session.

screen -r 

If you have started many screen sessions you will get a list of sessions so you can choose which one to attach to. To exit a screen session you can use

exit
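A few more screen invocations that make juggling several downloads easier; the session name is just an example:

```shell
# List running screen sessions
screen -ls
# Start a named session so it is easier to find later
screen -S bigdownload
# Reattach to it by name
screen -r bigdownload
```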

Compress and uncompress files on the server

You can make a 7z compressed archive of files on the server so it is faster to download them. You can also upload an archive and then uncompress it on the server. You can also split the archive into parts so you can download many parts at once. You can use screen to be able to log off the SSH session while the files are being compressed.

Install 7z

sudo apt update
sudo apt install p7zip p7zip-rar p7zip-full 

Compress a folder and its subdirectories

7z a -r directory.7z /directory

Compress a file

7z a archive.7z file.wav

Compress a directory and its subdirectories with normal compression, split into 3 GB parts. Useful for WAV files and similar files that compress to half the size. Cyberduck's default setting is to use segmented downloads, but splitting files into parts can also speed up uploads and downloads.

7z a -r -v3000m directory.7z /directory

Compress a file using -mx0 for no compression, split into 3 GB parts. Useful if you want to make the archive faster and for files that don't compress much, like ProRes video files.

7z a -mx0 -v3000m archive.7z prores.mov

You can also upload 7z archive files and extract them on the server.
Extract an archive recursively, keeping the subdirectories

7z x archive.7z
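For a split archive, point 7z at the first volume and it will find the remaining parts automatically:

```shell
# Extract a multi-part archive: give 7z the first volume
7z x directory.7z.001
```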

Making an MD5 checksum file of the files in a directory

To be able to test the integrity of files you can use MD5 checksum files. You can make a checksums.md5 file with the MD5 checksums of the files in a directory like this

md5sum * > checksums.md5

To check the files in a directory against the checksum file you can use

md5sum -c checksums.md5
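A quick end-to-end demo in a throwaway directory (the file names are arbitrary):

```shell
# Create two files, write their checksums, then verify them
cd "$(mktemp -d)"
echo "alpha" > a.txt
echo "beta"  > b.txt
md5sum *.txt > checksums.md5
md5sum -c checksums.md5   # prints "a.txt: OK" and "b.txt: OK"
```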

You can also use more advanced commands like these. This version makes md5Sum.md5 files in all subdirectories below the one you run the command in.

find "$PWD" -type d | sort | while read dir; do
  cd "${dir}"
  [ ! -f md5Sum.md5 ] && echo "Processing ${dir}" || echo "Skipped ${dir}, md5Sum.md5 already present"
  [ ! -f md5Sum.md5 ] && md5sum * > md5Sum.md5
  chmod a=r "${dir}"/md5Sum.md5
done

And run this command to check the md5 checksum of all the directories after md5Sum.md5 files have been generated.

find "$PWD" -name md5Sum.md5 | sort | while read file; do cd "${file%/*}"; md5sum -c md5Sum.md5; done > checklog.txt

You can check the result in the resulting checklog.txt with this command, or in a text editor like nano

cat checklog.txt

You can remove the .md5 files from the subdirectories with this command

find . -type f -name "*.md5" -exec rm -f {} +

Mount a remote directory with sshfs and FUSE


Sometimes it can be handy to change the name of a DCP that has already been uploaded, or do other things with files on the server as if they were on your local computer. With sshfs and FUSE you can mount a folder on the server as a local folder and open it in EasyDCP Creator to change the name or other metadata, like the content kind or the offset on reels. EasyDCP Creator saves the new metadata to the folder on the server.
To mount the folder from the server locally on my Mac I use the commands from here

Install Homebrew

Install macFUSE (formerly osxfuse) and sshfs in the terminal with these commands. On newer Homebrew versions use brew install --cask macfuse instead of brew cask install osxfuse.

brew cask install osxfuse
brew install sshfs

Reboot.
Make a local directory that the folder will be mounted in. In terminal I made a directory called server on the desktop.

cd Desktop
mkdir server



To mount the folder on the server you use the sshfs command (link: man sshfs). On a Linux machine you can install sshfs with apt:

sudo apt update
sudo apt upgrade
sudo apt install sshfs

The general syntax is:

sshfs [user@]host:[dir] mountpoint [options]

This is how you mount the home folder of the user yourname

sshfs yourname@111.111.111.111:/home/yourname /Users/yourname/Desktop/server
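Over a flaky connection it can help to pass mount options with -o; reconnect together with the ServerAliveInterval/ServerAliveCountMax SSH options keeps the mount from hanging forever when the link drops. These options come from the sshfs and ssh_config man pages; the paths are the same examples as above:

```shell
# Auto-reconnect if the SSH connection drops
sshfs -o reconnect,ServerAliveInterval=15,ServerAliveCountMax=3 \
    yourname@111.111.111.111:/home/yourname /Users/yourname/Desktop/server
```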

To open the DCP in EasyDCP Creator I drag the folder to the EasyDCP window and wait a little bit. You can also open a Resolve or Premiere project this way.


To unmount you can use the umount command

sudo umount /Users/yourname/Desktop/server


Converting a video file or sound file on the server with ffmpeg


Converting a high quality video file to a low-res version on the server makes it easier to download. Example: you need a file to check subtitles, or a small file to upload as a screener.
If you need a small h264 video file of a file on the server, you can use ffmpeg to convert it.

sudo apt update
sudo apt install ffmpeg 
ffmpeg -i bigfile.mov smallfile.mp4
ffmpeg -i bigfile.wav smallfile.aac
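The defaults already produce a smaller file, but you get more control by scaling down and setting the quality explicitly. A sketch using standard ffmpeg/libx264 options; the CRF value and bitrate are arbitrary screener-quality choices:

```shell
# 720p H.264 screener: scale to 720 lines (width follows, kept even),
# CRF 23 quality, AAC audio at 128 kbit/s
ffmpeg -i bigfile.mov -vf scale=-2:720 -c:v libx264 -crf 23 \
    -c:a aac -b:a 128k smallfile.mp4
```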
