Some tips on downloading and uploading DCPs and large files

Here are some tips on downloading and uploading DCPs and other large files, like ProRes QuickTime files:


-I recommend using a fast wired internet connection (20/20 to 50/50 Mbit/s) with a good network cable and router.
-I recommend using a download manager like Free Download Manager (Windows/Mac) that supports resume and multi-part downloading. Free Download Manager allocates disk space at the start of the transfer, after which the download speed increases.
-I recommend using Cyberduck if you want to download from Filemail via FTP.

-I recommend rebooting the computer, or at least restarting Chrome, before uploading large files (100 GB) with a web uploader.


I have used Filemail to receive files at many film festivals I have worked on.

-With Filemail you can send 30 GB files for free. With a Filemail Pro or Business account you can send and receive 100+ GB files.

-I have used the Filemail HTML5 uploader on many 100+ GB files. You can also use the HTML5 uploader to send folders. The Filemail desktop app also works well on 100+ GB files, supports resume and error checking, and can be used to auto-download all incoming files.

-With a Filemail Business account you will have access to the files for longer, and people can upload via a branded upload web page. You can also enable the MD5 hash value. Filemail now has an option for permanent storage.

Filemail MD5 hash check

Filemail shows the MD5 checksum on the download page if you enable it in the settings for the Business download page.

If the client sends you the original MD5 checksum, you can check that the uploaded file is intact. You can also ask for the MD5 checksum as info on the upload page, so both you and the client can verify that the file is correct.

Generating an MD5 checksum
The client can find the MD5 checksum of a file with utilities like HashTab for Windows or HashTab for Mac.

You can also use the command md5 "file" in a terminal on a Mac to generate an MD5 file hash value. The Terminal app is in Applications – Utilities. You can type md5 and press space, then drag the file in from a Finder window and press Enter.

You can use the command certutil -hashfile "file" md5 in a command prompt (CMD) on a Windows 7, 8, or 10 computer.

You can use md5sum "file" in a terminal on a Linux computer.
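If you need to script the check, for example to verify a whole batch of deliveries, the same MD5 value can be computed in Python; reading in chunks keeps memory use flat even on 100+ GB files. This is a minimal sketch, and the file name is just an example:

```python
import hashlib

def md5_of_file(path, chunk_size=1024 * 1024):
    """Compute the MD5 of a (possibly very large) file in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Tiny demo file standing in for a large delivery
with open("example.bin", "wb") as f:
    f.write(b"hello")
print(md5_of_file("example.bin"))  # same value md5sum/md5/certutil print
```

The hex string it prints can be compared directly against what the client's md5, md5sum, or certutil command reported.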

-If the sender sends a 7z archive, the receiver will know when unpacking whether the file fails the 7z checksum test.
-When sending ProRes QuickTime files it is important that the sender includes an MD5 checksum so the file can be verified.
-DCPs have their checksums in the XML files, which can be verified with EasyDCP Player or with this command:
openssl sha1 -binary "FILE_NAME" | openssl base64
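That openssl one-liner produces the base64-encoded SHA-1 digest used in DCP packing lists. As a sketch, the same value can be computed in Python if you want to script a check of every asset against the PKL; the file name here is a stand-in:

```python
import base64
import hashlib

def dcp_asset_hash(path, chunk_size=1024 * 1024):
    """Base64-encoded SHA-1 digest, the format used in a DCP PKL <Hash> element."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")

with open("asset.mxf", "wb") as f:  # tiny stand-in for a real MXF asset
    f.write(b"hello")
print(dcp_asset_hash("asset.mxf"))  # same output as the openssl pipeline
```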

7z archives
-If you want to send an uncompressed QuickTime file, TIFF or DPX image sequences, or WAV files, I suggest using 7z with normal compression. An uncompressed QuickTime file will be reduced to about half its size after 7z compression.
-I recommend Keka on Mac or 7-Zip on Windows.
-7z archives can be protected with encryption and a password, and have a built-in hash check.
-In Keka, choose 7z, then Store for no compression or Normal for normal compression, then drag the folder to the Keka window.
-You can also split the archive into parts (for example 20 GB) and upload it part by part. If one of the parts is corrupted, you only need to upload that part again.
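7-Zip can do the splitting for you (the -v switch, e.g. -v20g, writes 20 GB volumes). The idea behind re-uploading only the bad part can be sketched in Python; split_file and join_parts are hypothetical helper names, not part of any library:

```python
def split_file(path, part_size):
    """Split `path` into numbered part files of at most `part_size` bytes."""
    parts = []
    index = 1
    with open(path, "rb") as src:
        while True:
            chunk = src.read(part_size)
            if not chunk:
                break
            part_name = f"{path}.{index:03d}"
            with open(part_name, "wb") as dst:
                dst.write(chunk)
            parts.append(part_name)
            index += 1
    return parts

def join_parts(parts, out_path):
    """Reassemble the parts, in order, into one file."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())

# Demo: a 10-byte "archive" split into 4-byte parts and rejoined
with open("archive.7z", "wb") as f:
    f.write(b"0123456789")
parts = split_file("archive.7z", 4)
join_parts(parts, "rejoined.7z")
```

If one part fails its checksum after transfer, only that .00N file needs to be sent again before rejoining.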

Use a PAR2 recovery file as backup
-You can use a PAR2 file as an additional backup solution if there is no time to upload/download the corrupt file again. I have tried both MultiPar and MacPAR deLuxe to generate PAR2 files with 1 percent redundancy. A PAR2 file with 1 percent redundancy is 1 percent of the size of the original file. If the downloaded file is 0.5 percent corrupted, you can use the 1 percent PAR2 recovery file to repair it instead of downloading again. I have tried this and it works.
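PAR2 uses Reed-Solomon codes, which can repair several damaged blocks from a small amount of redundancy. A much simpler single-parity sketch in Python shows the principle: one extra XOR block is enough to rebuild any one lost block.

```python
def xor_blocks(a, b):
    """XOR two equal-length byte blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

# Three data blocks plus one parity block (PAR2's Reed-Solomon coding
# generalizes this idea to recover multiple blocks at once).
blocks = [b"AAAA", b"BBBB", b"CCCC"]
parity = blocks[0]
for block in blocks[1:]:
    parity = xor_blocks(parity, block)

# Pretend block 1 was corrupted in transfer; rebuild it from the rest.
recovered = xor_blocks(xor_blocks(parity, blocks[0]), blocks[2])
print(recovered)  # b'BBBB'
```

The parity data is small relative to the payload, which is why a 1 percent PAR2 file can fix a 0.5 percent corrupt download.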

Amazon S3

Amazon S3 can be used to deliver and receive big files on the internet.
-You can use the S3 web uploader, or use Cyberduck and similar programs to upload to S3. S3 uses integrity checks during upload, so you will know the uploaded file is intact.
-After uploading you can make the file accessible (public) and give the clients the S3 download link.
-S3 storage is very reliable and scalable. You pay for each download.
-If you want a client to upload a file to an S3 upload bucket, you can create an upload user with a policy that allows that user to upload to the bucket.

Like this policy, which lets the upload user list the bucket and put objects into it (the action lists shown are typical for an upload-only user):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": "arn:aws:s3:::storefiler"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::storefiler/*"
    }
  ]
}

– Then give the client the access key, secret key, and directory for the upload user. (With this policy the user will need to specify the directory to be able to upload.)
-You can also give the access key, secret key, and directory to clients who will be downloading via Cyberduck.


Aspera Cloud
I have used Aspera Cloud for many DCP transfers. It is used by many film festivals and post-production services, and it is very fast.

Google Drive
I have downloaded many DCPs with Free Download Manager from Google Drive. It is fast.

Dropbox
I have downloaded many DCPs with Free Download Manager from Dropbox. I had no problems.

DCP and QuickTime specific advice

-When downloading a DCP folder from an FTP server in FileZilla, it is important to set Transfer – Transfer type – Binary. It should not be ASCII or Auto; this way the XML files will pass the checksum test.
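If you script FTP downloads instead of using FileZilla, Python's ftplib avoids the same pitfall: retrbinary issues a binary-mode RETR, so XML files come through byte-for-byte. A minimal sketch, where the host, credentials, and paths are placeholders:

```python
from ftplib import FTP

def download_binary(host, user, password, remote_path, local_path):
    """Download one file in binary mode, so DCP XML files are not
    mangled by ASCII-mode line-ending conversion."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {remote_path}", f.write)
```

Call it once per asset in the DCP folder, then run the checksum verification described above.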

-Ask people who submit DCPs to either upload the folder or use the 7z archive format instead of zip. This way you can avoid the (large) Mac OS Archive Utility zip files that can only be opened by the Mac OS Archive Utility. Both Keka and 7-Zip can make 7z archives.

-I recommend checking DCPs in EasyDCP Player. The trial version will let you know if the DCP fails some checks. You can also run a file hash check.

You should also make a DCP quality checklist, which may include:
-Is the DCP named correctly?
-Is it in the correct scope or flat format? The DCP should have a correct digital cinema naming convention name. Example: a DCP is called Dcpname_SHR_S but is actually Dcpname_SHR_F, because the subtitles are burned in too low and will be cropped by the scope preset in your cinema.
-Is the DCP in the correct language/subtitle language?
-Will the DCP play at your venue? Example: you may have a server with old firmware that can play 25 fps SMPTE DCPs with burned-in subtitles, but not SMPTE 2010 subtitles. (All film festivals should have updated firmware/software.)

-If you have downloaded a DCP folder archive like DCP.7z on a Mac and want to ingest it on a digital cinema media block/server/player, you can extract it to a DX115 hard drive or similar.

Other tips

-If you get an error message extracting a zip file in Keka that says error code 2 using p7zip, you probably need to extract the zip file with the Mac OS Archive Utility.

3 Replies to “Some tips on downloading and uploading DCPs and large files”

  1. Hi Knut.
    Great work! Just the info I needed to solve my problems transporting DCP files over the net.
    Thank you.
    Best Regards Ulrik Lyhne, Aarhus Film Workshop

  2. Hi Knut, thanks for the tips.
    Quick question.. Do you know if there is a site out there with limits as high as say,, which allow you to upload via an FTP program like filezilla?
    Alternatively, a site which has a desktop uploader which you can use with a free account?
    – I seem to have trouble sending larger files over my internet connection using a web-browser; it often fails and I have to start again.

  3. Hi.
    An alternative is Amazon S3. You can use CrossFtp and similar programs to upload to S3. And the link you give clients will be very reliable.
