Uploading large files

Requirements

You need:

  • the URL of the media file to upload
  • a text file with the content for the first revision (the file description page)
  • the name of the user account to credit for the upload and first revision

Our Swift backend storage is limited to files smaller than 4 GB. See phab:T191804 and phab:T191802 for discussions about extending this limit to 5 GB and beyond, respectively.
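
Given that limit, it can help to check the file's size before downloading. A quick sketch, assuming the server answers HEAD requests and reports a Content-Length header (with redirects, the last value printed is the one that matters):

curl -sIL '<URL>' | grep -i content-length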

Step 1: download files

Download the files to mwmaint1002 (or, if there's not enough space, deploy1001).

wget <URL>

At this stage, verify the file's checksum if the requestor provided one.
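
For example, if a SHA-256 checksum was provided (substitute sha1sum or md5sum as appropriate), compare the output against the supplied value:

sha256sum <filename>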

Requestors are advised to provide a direct link to the file to be uploaded. However, they occasionally use a public cloud service instead (such as Google Drive), which usually does not provide direct download links.

From Google Drive, it is possible to download a file using its unique ID via rclone:

urbanecm@titanium  /nas/urbanecm/wmf-rehosting
$ rclone -P backend copyid <config>: '<fileid>' '<filename>'

where:

  • <config> is the name of an rclone remote configured for Google Drive
  • <fileid> is the unique ID of the file in Google Drive
  • <filename> is the name under which to save the file locally

Since rclone is not installed on production servers, this requires downloading the file to a temporary location first and then transferring it to the maintenance server. However, that temporary location does not need to be the administrator's own laptop (which might have capacity or connection speed issues).
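
A minimal sketch of that transfer, assuming the file was fetched on a host with rclone and SSH access to production; the destination host, path, and any bastion/ProxyJump configuration are assumptions:

rsync --partial --progress '<filename>' mwmaint1002.eqiad.wmnet:/tmp/uploads/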

Step 2: import images to Commons

Server-side uploads run much faster because of minimal network overhead, and as a result can put extra strain on the job queue, especially for videos, which require transcoding. It's recommended to add some delay between uploads with the --sleep parameter. Because various factors (resolution, frame rate, length) affect how long a video's transcodes take, it may be worth uploading one video, seeing how long the median transcode takes, and then sleeping for that length to avoid queuing up a large number of transcodes.
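
For instance, if the test video's transcodes finish in roughly 15 minutes, --sleep=900 would space the remaining uploads accordingly (the figure is purely illustrative).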

mwscript importImages.php --wiki=commonswiki --sleep=SECONDS --comment-ext=txt --user=USERNAME /tmp/uploads
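
As an illustration (filenames are hypothetical), /tmp/uploads could look like this for a single video; with --comment-ext=txt, importImages.php reads the first revision text from the matching .txt file, and the media file's name becomes its title on Commons:

/tmp/uploads/
  Example_video.webm       <- uploaded as File:Example_video.webm
  Example_video.webm.txt   <- wikitext of the first revision (description page)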