
Upload to Amazon S3 with Uploadify

Posted in Javascript, PHP

Update Jul 02 2011: crossdomains.xml should be crossdomain.xml

Update Jun 29 2011: Updated S3 upload URL – no more security error

If you’ve ever wanted to allow people on your website to upload files to Amazon S3, you’ll know the only real way to do this is with Flash, as it allows the client to upload directly to S3 instead of funneling all traffic through your server. You could write your own script for this, but there’s already a nifty little prebuilt tool you’ve probably heard of called Uploadify, which can not only do the job for you, it can do so with great flexibility.

For the lazy: Download the entire script here

You’ll need the following things:

- An Amazon S3 account, along with your AWS access key and secret key
- A bucket to upload into
- The Uploadify package (I’m using version 2.1.4 here) and jQuery
- A web server running PHP

Set up the bucket

Firstly, set up your bucket. I’ve called mine bucket.mysite.com. Inside the bucket needs to be a file called crossdomain.xml, the Flash cross-domain policy file that’s required before Flash will upload to S3. Here are its contents:

<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM
"http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <allow-access-from domain="*" secure="false" />
</cross-domain-policy>

Make sure this file is readable by everyone.
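
If you don’t want to set that permission through the AWS console, here’s a minimal sketch of uploading crossdomain.xml with a public-read ACL over the S3 REST API, using the same HMAC-SHA1 signing that appears later in this post. The key values are placeholders; substitute your own, and run it from the directory containing crossdomain.xml.

<?php
	//Sketch: PUT crossdomain.xml into the bucket with a public-read ACL
	//Assumes crossdomain.xml sits in the current directory
	$access_key = '< Your access key >';
	$secret_key = '< Your secret key >';
	$bucket     = 'bucket.mysite.com';
	$file       = 'crossdomain.xml';

	$date = gmdate('D, d M Y H:i:s \G\M\T');
	$type = 'text/xml';
	//String to sign for a signature-v2 REST PUT that carries an x-amz-acl header
	$string_to_sign = "PUT\n\n{$type}\n{$date}\nx-amz-acl:public-read\n/{$bucket}/{$file}";
	$signature = base64_encode(hash_hmac('sha1', $string_to_sign, $secret_key, true));

	$ch = curl_init("https://s3.amazonaws.com/{$bucket}/{$file}");
	curl_setopt_array($ch, array(
		CURLOPT_CUSTOMREQUEST  => 'PUT',
		CURLOPT_POSTFIELDS     => file_get_contents($file),
		CURLOPT_RETURNTRANSFER => true,
		CURLOPT_HTTPHEADER     => array(
			"Date: {$date}",
			"Content-Type: {$type}",
			"x-amz-acl: public-read",
			"Authorization: AWS {$access_key}:{$signature}",
		),
	));
	curl_exec($ch); //S3 returns an empty body and HTTP 200 on success
	curl_close($ch);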

Create your site

Create a folder for your site in your web directory (I’m calling mine S3Uploader) and inside it create includes and files directories. Drop both the jquery.js file and your uploadify folder into files.

Here is the code for index.php (to go in your S3Uploader directory):

<?php
	//System path for our website folder
	define('DOCROOT', realpath(dirname(__FILE__)).DIRECTORY_SEPARATOR);
	//URL for our website
	define('WEBROOT', htmlentities(
		substr($_SERVER['REQUEST_URI'], 0, strcspn($_SERVER['REQUEST_URI'], "\n\r")),
		ENT_QUOTES
	));
 
	//Which bucket are we placing our files into
	$bucket = 'bucket.mysite.com';
	// This will place uploads into the '20100920-234138' folder in the $bucket bucket
	$folder = date('Ymd-His').'/'; //Include trailing /
 
	//Include required S3 functions
	require_once DOCROOT."includes/s3.php";
 
	//Generate policy and signature
	list($policy, $signature) = S3::get_policy_and_signature(array(
		'bucket' 		=> $bucket,
		'folder'		=> $folder,
	));
?>
<html>
<head>
<title>test Upload</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<link rel="stylesheet" href="files/uploadify/uploadify.css" />
<script type='text/javascript' src="files/jquery.js"></script>
<script type='text/javascript' src="files/uploadify/swfobject.js"></script>
<script type='text/javascript' src="files/uploadify/jquery.uploadify.v2.1.4.min.js"></script>
<script type="text/javascript">
	$(document).ready(function() {
		$("#file_upload").uploadify({
			'uploader'		: '<?= WEBROOT ?>files/uploadify/uploadify.swf',
			'buttonText'		: 'Browse',
			'cancelImg'		: '<?= WEBROOT ?>files/uploadify/cancel.png',
			'script'		: 'http://s3.amazonaws.com/<?= $bucket ?>',
			'scriptAccess'		: 'always',
			'method'		: 'post',
			'scriptData'		: {
				"AWSAccessKeyId"		: "<?= S3::$AWS_ACCESS_KEY ?>",
				"key"				: "${filename}",
				"acl"				: "authenticated-read",
				"policy"			: "<?= $policy ?>",
				"signature"			: "<?= $signature ?>",
				"success_action_status"		: "201",
				"key"				: encodeURIComponent(encodeURIComponent("<?= $folder ?>${filename}")),
				"fileext"			: encodeURIComponent(encodeURIComponent("")),
				"Filename"			: encodeURIComponent(encodeURIComponent(""))
			},
			'fileExt'		: '*.*',
			'fileDataName' 		: 'file',
			'simUploadLimit'	: 2,
			'multi'			: true,
			'auto'			: true,
			'onError' 		: function(errorObj, q, f, err) { console.log(err); },
			'onComplete'		: function(event, ID, file, response, data) { console.log(file); }
		});
	});
</script>
</head>
<body>
 
	<div align='center'>
		<input type='file' id='file_upload' name='file_upload' />
	</div>
 
</body>
</html>

A breakdown: at the top I set up some convenience definitions, as well as the name of the bucket we’re uploading to (the $bucket variable) and the folder within that bucket. The $folder variable is useful for avoiding overwrites when two uploads share a filename, but it isn’t required. I then include the required stylesheet and script files (you can customize the stylesheet to change the way transfers look) and set up Uploadify.

The scriptData object is the most important part here. That’s the upload data being sent to S3. You shouldn’t need to change anything in this section and doing so may cause uploads to fail.
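
If you want to sanity-check the policy and signature without involving Flash at all, the same fields can be sent to S3 as an ordinary multipart POST. Below is a rough sketch using cURL in PHP; it assumes the includes/s3.php class shown further down and a small local file called test.txt, which is purely hypothetical. Note that new CURLFile() needs PHP 5.5 or later (older versions used the '@filename' syntax), the Filename, folder and fileext fields are included to satisfy the starts-with conditions in the policy, the file field must come last, and nothing is double-encoded here because that workaround is only needed for the Flash upload.

<?php
	//Sketch: POST a test file straight to S3 with the same fields Uploadify sends
	require_once 'includes/s3.php';

	$bucket = 'bucket.mysite.com';
	$folder = date('Ymd-His').'/';
	list($policy, $signature) = S3::get_policy_and_signature(array(
		'bucket' => $bucket,
		'folder' => $folder,
	));

	$fields = array(
		'key'                   => $folder.'test.txt',
		'AWSAccessKeyId'        => S3::$AWS_ACCESS_KEY,
		'acl'                   => 'authenticated-read',
		'policy'                => $policy,
		'signature'             => $signature,
		'success_action_status' => '201',
		'Filename'              => '',
		'folder'                => '',
		'fileext'               => '',
		'file'                  => new CURLFile('test.txt'), //must be the last field
	);

	$ch = curl_init("http://s3.amazonaws.com/{$bucket}");
	curl_setopt_array($ch, array(
		CURLOPT_POST           => true,
		CURLOPT_POSTFIELDS     => $fields,
		CURLOPT_RETURNTRANSFER => true,
	));
	echo curl_exec($ch); //expect a 201 XML response containing the object's Location
	curl_close($ch);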

One very common issue with using Uploadify to upload to S3 is that it will seemingly randomly fail to upload and instead return a ‘403’ error. This is due to ActionScript encoding ‘+’ characters in the signature incorrectly. To avoid this, I’ve written a get_policy_and_signature function in my S3 class (code below) that recursively regenerates the policy until a signature without that character is produced; it’s a band-aid, but it should fix the problem. Here’s includes/s3.php:

<?php
class S3 {
	public static $AWS_ACCESS_KEY 		= '< Your access key >';
	public static $AWS_SECRET_ACCESS_KEY 	= '< Your secret key >';
 
	/*
	 * Purpose:
	 * 		Actionscript encodes '+' characters in the signature incorrectly - it makes
	 * 		them a space instead of %2B the way PHP does. This causes uploadify to error
	 * 		out on upload. This function recursively generates a new policy and signature
	 * 		until a signature without a + character is created.
	 * Accepts: array $data
	 * Returns: policy and signature
	 */
	public static function get_policy_and_signature( array $data )
	{
		$policy = self::get_policy_doc( $data );
		$signature = self::get_signature( $policy );
 
		if ( strpos($signature, '+') !== FALSE )
		{
			$data['timestamp'] = intval(@$data['timestamp']) + 1;
			return self::get_policy_and_signature( $data );
		}
 
		return array($policy, $signature);
	}
 
	public static function get_policy_doc(array $data)
	{
		return base64_encode(
			'{'.
				'"expiration": "'.gmdate('Y-m-d\TH:i:s\Z', time()+60*60*24+intval(@$data['timestamp'])).'",'.
				'"conditions": '.
				'['.
					'{"bucket": "'.$data['bucket'].'"},'.
					'["starts-with", "$key", ""],'.
					'{"acl": "authenticated-read"},'.
					//'{"success_action_redirect": "'.$SWFSuccess_Redirect.'"},'.
					'{"success_action_status": "201"},'.
					'["starts-with","$key","'.str_replace('/', '\/', $data['folder'] ).'"],'.
					'["starts-with","$Filename",""],'.
					'["starts-with","$folder",""],'.
					'["starts-with","$fileext",""],'.
					'["content-length-range",0,5242880]'.
				']'.
			'}'
		);
	}
 
	public static function get_signature( $policy_doc ) {
		return base64_encode(hash_hmac(
			'sha1', $policy_doc, self::$AWS_SECRET_ACCESS_KEY, true
		));
	}
}
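
If uploads are being rejected, it helps to look at exactly what was generated, since the policy is just a base64-encoded JSON document. A quick throwaway script (a hypothetical debug.php) can print it alongside the signature:

<?php
	//Sketch: dump the generated policy document and signature for debugging
	require_once 'includes/s3.php';

	list($policy, $signature) = S3::get_policy_and_signature(array(
		'bucket' => 'bucket.mysite.com',
		'folder' => date('Ymd-His').'/',
	));

	echo base64_decode($policy)."\n"; //the raw JSON policy S3 will evaluate
	echo $signature."\n";             //should never contain a '+' character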

This should be all there is to it. Navigate to the S3Uploader folder in your browser and you’ll be presented with a single Browse button.
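
One thing to keep in mind: the files are uploaded with an acl of authenticated-read, so you can’t simply link to them afterwards. Here’s a rough sketch of generating a time-limited, signed GET URL (S3 query string authentication) with the same S3 class; the object key below is only an example.

<?php
	//Sketch: build a signed, expiring GET URL for an object uploaded with acl=authenticated-read
	require_once 'includes/s3.php';

	function s3_signed_url( $bucket, $key, $lifetime = 3600 )
	{
		$expires = time() + $lifetime;
		//Query string authentication (signature v2) signs "GET\n\n\n{expires}\n/bucket/key"
		$string_to_sign = "GET\n\n\n{$expires}\n/{$bucket}/{$key}";
		$signature = base64_encode(hash_hmac(
			'sha1', $string_to_sign, S3::$AWS_SECRET_ACCESS_KEY, true
		));

		return "http://s3.amazonaws.com/{$bucket}/".str_replace('%2F', '/', rawurlencode($key)).
			'?AWSAccessKeyId='.S3::$AWS_ACCESS_KEY.
			'&Expires='.$expires.
			'&Signature='.urlencode($signature);
	}

	echo s3_signed_url('bucket.mysite.com', '20100920-234138/photo.jpg');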

Download the full script here.