Category: PHP


Sending requests through a proxy

by Michael Dowling | in PHP

Some network configurations require that outbound connections be sent through a proxy server. Requiring a proxy for outbound HTTP requests is a common practice in many companies, and is often something that must be configured in a client.

You can send requests through a proxy with the AWS SDK for PHP by using the "request options" of a client. These request options are applied to each HTTP request sent from the client. One of the settings you can specify is the proxy option, which controls how the SDK connects through a proxy.

Request options are passed to a client through the client’s factory method. Here’s an example of how you can specify a proxy for an Amazon S3 client:

use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'request.options' => array(
        'proxy' => '127.0.0.1:123'
    )
));

The above example tells the client that all requests should be proxied through an HTTP proxy located at the 127.0.0.1 IP address using port 123.

Username and password

You can supply a username and password when specifying your proxy setting if needed:

$s3 = S3Client::factory(array(
    'request.options' => array(
        'proxy' => 'username:password@127.0.0.1:123'
    )
));

Proxy protocols

Because proxy support is handled through cURL, you can specify various protocols when specifying the proxy (e.g., socks5://127.0.0.1). More information on the proxy protocols supported by cURL can be found in the online cURL documentation.
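As a rough illustration of the format cURL accepts, the pieces of a proxy string can be assembled like this (buildProxyString is a hypothetical helper, not part of the SDK):

```php
// Hypothetical helper (not part of the SDK): assemble a cURL-style
// proxy string from its parts. cURL accepts strings such as
// "socks5://127.0.0.1:1080" or "username:password@127.0.0.1:123".
function buildProxyString($host, $port, $scheme = null, $user = null, $pass = null)
{
    $proxy = '';
    if ($scheme !== null) {
        $proxy .= $scheme . '://';
    }
    if ($user !== null) {
        $proxy .= $user;
        if ($pass !== null) {
            $proxy .= ':' . $pass;
        }
        $proxy .= '@';
    }
    return $proxy . $host . ':' . $port;
}
```

The resulting string is what you would pass as the value of the proxy request option.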

Wire Logging in the AWS SDK for PHP

by Jeremy Lindblom | in PHP

One of the features of the AWS SDK for PHP that I often recommend to customers is the LogPlugin, which can be used to do wire logging. It is one of the many plugins included with Guzzle, the underlying HTTP library used by the SDK. Guzzle’s LogPlugin includes a default configuration that will output the content of the requests and responses sent over the wire to AWS. You can use it to help debug requests or just learn more about how the AWS APIs work.

Adding the LogPlugin to any client in the SDK is simple. The following shows how to set it up.

$logPlugin = Guzzle\Plugin\Log\LogPlugin::getDebugPlugin();
$client->addSubscriber($logPlugin);

The output generated by LogPlugin for a single request looks similar to the following text (this request was for executing an Amazon S3 ListBuckets operation).

# Request:
GET / HTTP/1.1
Host: s3.amazonaws.com
User-Agent: aws-sdk-php2/2.4.6 Guzzle/3.7.3 curl/7.25.0 PHP/5.3.27
Date: Fri, 27 Sep 2013 15:53:10 +0000
Authorization: AWS AKIAEXAMPLEEXAMPLE:eEXAMPLEEsREXAMPLEWEFo=

# Response:
HTTP/1.1 200 OK
x-amz-id-2: EXAMPLE4j/v8onDxyeuFaQFsNvN66EXAMPLE30KQLfq0T6sVcLxj
x-amz-request-id: 4F3EXAMPLEE14
Date: Fri, 27 Sep 2013 15:53:09 GMT
Content-Type: application/xml
Transfer-Encoding: chunked
Server: AmazonS3

<?xml version="1.0" encoding="UTF-8"?>
<ListAllMyBucketsResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">[...]</ListAllMyBucketsResult>

This is the output generated using the default configuration. You can configure the LogPlugin to customize the behavior, format, and location of what is logged. It’s also possible to integrate with third-party logging libraries like Monolog. For more information, see the section about the wire logger in the AWS SDK for PHP User Guide.

Release: AWS SDK for PHP – Version 2.4.7

by Jeremy Lindblom | in PHP

We would like to announce the release of version 2.4.7 of the AWS SDK for PHP. This release adds support for audio transcoding features to the Amazon Elastic Transcoder client and updates to the Amazon CloudFront, Amazon EC2, Amazon RDS, Auto Scaling, and AWS OpsWorks clients.

Changelog

  • Added support for audio transcoding features to the Amazon Elastic Transcoder client
  • Added support for modifying Reserved Instances in a region to the Amazon EC2 client
  • Added support for new resource management features to the AWS OpsWorks client
  • Added support for additional HTTP methods to the Amazon CloudFront client
  • Added support for custom error page configuration to the Amazon CloudFront client
  • Added support for associating a public IP address with instances in Auto Scaling groups via the Auto Scaling client
  • Added support for tags and filters to various operations in the Amazon RDS client
  • Added the ability to easily specify event listeners on waiters
  • Added support for using the ap-southeast-2 region to the Amazon Glacier client
  • Added support for using the ap-southeast-1 and ap-southeast-2 regions to the Amazon Redshift client
  • Updated the Amazon EC2 client to use the 2013-09-11 API version
  • Updated the Amazon CloudFront client to use the 2013-09-27 API version
  • Updated the AWS OpsWorks client to use the 2013-07-15 API version
  • Updated the Amazon CloudSearch client to use Signature Version 4
  • Fixed an issue with the Amazon S3 Client so that the top-level XML element of the CompleteMultipartUpload operation is correctly sent as CompleteMultipartUpload
  • Fixed an issue with the Amazon S3 Client so that you can now disable bucket logging with the PutBucketLogging operation
  • Fixed an issue with the Amazon CloudFront client so that query string parameters in pre-signed URLs are correctly URL-encoded
  • Fixed an issue with the Signature Version 4 implementation where headers with multiple values were sometimes sorted and signed incorrectly

Install/Download the Latest SDK

AWS at Web & PHP Con 2013

by Jeremy Lindblom | in PHP

In September, I was able to attend and speak at Web & PHP Con in San Jose, CA. It was great to be around a good group of PHP developers, talk about web development and AWS, and meet new friends.

Getting Good with the AWS SDK for PHP

On Tuesday, September 17th, I gave a talk called Getting Good with the AWS SDK for PHP. In my session, I gave a brief introduction to AWS and its services, taught how to use the AWS SDK for PHP, and walked through some code examples from a small PHP application built with the SDK using Amazon S3, Amazon DynamoDB, and AWS Elastic Beanstalk. Here is the slide deck, joind.in page, and Lanyrd page for the talk.

Git Educated About Git

On September 18th, I gave a talk called Git Educated About Git – 20 Essential Commands (slide deck). This talk was not related to AWS or the AWS SDK for PHP, but I used the development of the SDK as a use case during the presentation. Since we work on a combination of publicly available and unannounced features, we don’t have a single canonical repository. Instead, we have two remotes: our public GitHub repository and a private, internal repository. For fun, I also wrote and performed a song called You’re Doing Git! during my session, and you can watch the performance on YouTube.

Attending PHP Conferences

I’ve enjoyed my opportunities to attend various PHP and developer conferences and user group meetings throughout this year. I’ve found them to be a great way to connect with PHP developers and help them learn more about developing on AWS and with the AWS SDK for PHP. I hope to see you at future conferences.

See You at ZendCon 2013

by Jeremy Lindblom | in PHP

Are you attending ZendCon this year? You are? Great! The AWS SDK for PHP team will be there too, and we hope to see you there.

We will have a booth in the expo area, so make sure to come and see us. We will have goodies to hand out, and will be ready to answer questions or help you get started with AWS, the AWS SDK for PHP, and AWS Elastic Beanstalk.

Also, make sure to check out the UnCon schedule, because we will be doing presentations about the AWS SDK for PHP and Guzzle.

See you there!

Streaming Amazon S3 Objects From a Web Server

by Michael Dowling | in PHP

Have you ever needed a memory-efficient way to stream an Amazon S3 object directly from your web server to a browser? Perhaps your website has its own authorization system and you want to limit access to a file to only users who have purchased it. Or maybe you need to perform a specific action each time a file is accessed (e.g., add an image to a user’s "recently viewed" list).

Using PHP’s readfile function and the Amazon S3 stream wrapper provides a simple way to efficiently stream data from Amazon S3 to your users while proxying the bytes sent over the wire through a web server.

Register the Amazon S3 stream wrapper

First you need to create an Amazon S3 client:

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'key'    => '****',
    'secret' => '****'
));

Next you need to register the Amazon S3 stream wrapper:

$client->registerStreamWrapper();

Send the appropriate headers

Now you need to send the appropriate headers from the web server to the client downloading the file. You can specify completely custom headers to send to the client, including any relevant headers of the Amazon S3 object.

Here’s how you could retrieve the headers of a particular Amazon S3 object:

// Send a HEAD request to the object to get headers
$command = $client->getCommand('HeadObject', array(
    'Bucket' => 'my-bucket',
    'Key'    => 'my-images/php.gif'
));

$headers = $command->getResponse()->getHeaders();

Now that you’ve retrieved the headers of the Amazon S3 object, you can send the headers to the client that is downloading the object using PHP’s header function.

// Only forward along specific headers
$proxyHeaders = array('Last-Modified', 'ETag', 'Content-Type', 'Content-Disposition');

foreach ($proxyHeaders as $header) {
    if ($headers[$header]) {
        header("{$header}: {$headers[$header]}");
    }
}

Disable output buffering

When you use functions like echo or readfile, you might actually be writing to an output buffer. Using output buffering while streaming large files will unnecessarily consume a large amount of memory and reduce the performance of the download. You should ensure that output buffering is disabled before streaming the contents of the file.

// Stop output buffering
if (ob_get_level()) {
    ob_end_flush();
}

flush();
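If your application uses nested output buffers, a single ob_end_flush() call only removes the top-most one. A minimal sketch (disableOutputBuffering is a hypothetical helper name) that ends every active buffer before streaming:

```php
// Hypothetical helper: end every active output buffer, then flush,
// so that readfile() writes directly to the client instead of into memory.
function disableOutputBuffering()
{
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
}
```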

Send the data

Now you’re ready to stream the file using the Amazon S3 stream wrapper and the readfile function. The stream wrapper uses a syntax of "s3://[bucket]/[key]" where "[bucket]" is the name of an Amazon S3 bucket and "[key]" is the key of an object (which can contain additional "/" characters to emulate folder hierarchies).

readfile('s3://my-bucket/my-images/php.gif');

Caching

Our very simple approach to serving files from Amazon S3 does not take advantage of HTTP caching mechanisms. By implementing cache revalidation into your script, you can allow users to use a cached version of an object.

A few slight modifications to the script will allow your application to benefit from HTTP caching. By passing the ETag and Last-Modified headers from Amazon S3 to the browser, we are allowing the browser to know how to cache and revalidate the response. When a web browser has previously downloaded a file, a subsequent request to download the file will typically include cache validation headers (e.g., "If-Modified-Since", "If-None-Match"). By checking for these cache validation headers in the HTTP request sent to the PHP server, we can forward these headers along in the HEAD request sent to Amazon S3.
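The validation-header check described above can be factored into a small helper (cacheValidationOptions is a hypothetical name) that maps the browser’s conditional headers to the matching HeadObject parameters:

```php
// Hypothetical helper: translate the browser's cache validation
// headers (as found in $_SERVER) into HeadObject request parameters.
function cacheValidationOptions(array $server)
{
    $options = array();
    if (isset($server['HTTP_IF_NONE_MATCH'])) {
        $options['IfNoneMatch'] = $server['HTTP_IF_NONE_MATCH'];
    }
    if (isset($server['HTTP_IF_MODIFIED_SINCE'])) {
        $options['IfModifiedSince'] = $server['HTTP_IF_MODIFIED_SINCE'];
    }
    return $options;
}
```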

Here’s a complete example that will pass along cache-specific HTTP headers from the Amazon S3 object.

// Assuming the SDK was installed via Composer
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Create a client object
$client = S3Client::factory(array(
    'key'    => '****',
    'secret' => '****',
));

// Register the Amazon S3 stream wrapper
$client->registerStreamWrapper();

readObject($client, 'my-bucket', 'my-images/php.gif');

/**
 * Streams an object from Amazon S3 to the browser
 *
 * @param S3Client $client Client used to send requests
 * @param string   $bucket Bucket to access
 * @param string   $key    Object to stream
 */
function readObject(S3Client $client, $bucket, $key)
{
    // Begin building the options for the HeadObject request
    $options = array('Bucket' => $bucket, 'Key' => $key);

    // Check if the client sent the If-None-Match header
    if (isset($_SERVER['HTTP_IF_NONE_MATCH'])) {
        $options['IfNoneMatch'] = $_SERVER['HTTP_IF_NONE_MATCH'];
    }

    // Check if the client sent the If-Modified-Since header
    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])) {
        $options['IfModifiedSince'] = $_SERVER['HTTP_IF_MODIFIED_SINCE'];
    }

    // Create the HeadObject command
    $command = $client->getCommand('HeadObject', $options);

    try {
        $response = $command->getResponse();
    } catch (Aws\S3\Exception\S3Exception $e) {
        // Handle 404 responses
        http_response_code(404);
        exit;
    }

    // Set the appropriate status code for the response (e.g., 200, 304)
    $statusCode = $response->getStatusCode();
    http_response_code($statusCode);

    // Let's carry some headers from the Amazon S3 object over to the web server
    $headers = $response->getHeaders();
    $proxyHeaders = array(
        'Last-Modified',
        'ETag',
        'Content-Type',
        'Content-Disposition'
    );

    foreach ($proxyHeaders as $header) {
        if ($headers[$header]) {
            header("{$header}: {$headers[$header]}");
        }
    }

    // Stop output buffering
    if (ob_get_level()) {
        ob_end_flush();
    }

    flush();

    // Only send the body if the object was modified (a 200 response, not a 304)
    if ($statusCode == 200) {
        readfile("s3://{$bucket}/{$key}");
    }
}

Caveats

In most cases, this simple solution will work as expected. However, various software components are interacting with one another, and each component must be able to properly stream data in order to achieve optimal performance.

The PHP.net documentation for flush() provides some useful information to keep in mind when attempting to stream data from a web server to a browser:

Several servers, especially on Win32, will still buffer the output from your script until it terminates before transmitting the results to the browser. Server modules for Apache like mod_gzip may do buffering of their own that will cause flush() to not result in data being sent immediately to the client. Even the browser may buffer its input before displaying it. Netscape, for example, buffers text until it receives an end-of-line or the beginning of a tag, and it won’t render tables until the </table> tag of the outermost table is seen. Some versions of Microsoft Internet Explorer will only start to display the page after they have received 256 bytes of output, so you may need to send extra whitespace before flushing to get those browsers to display the page.
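If you run into the old Internet Explorer behavior described above, one workaround is to send padding whitespace before flushing. A sketch (flushWithPadding is a hypothetical helper):

```php
// Hypothetical helper: emit enough whitespace that old browsers start
// rendering, then flush PHP's buffers to push the bytes to the client.
function flushWithPadding($minBytes = 256)
{
    echo str_repeat(' ', $minBytes);
    if (ob_get_level()) {
        ob_end_flush();
    }
    flush();
}
```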

Release: AWS SDK for PHP 2.4.5

by Michael Dowling | in PHP

We would like to announce the release of version 2.4.5 of the AWS SDK for PHP. This release adds support for using the Redis cache engine software with Amazon ElastiCache.

Changelog

  • Amazon ElastiCache now offers the Redis cache engine software, in addition to Memcached. Customers who currently use Redis can optionally "seed" a new ElastiCache Redis cache cluster with their existing data from a Redis snapshot file, easing migration to a managed ElastiCache environment. In addition, to support the Redis replication capabilities, the ElastiCache API now supports replication groups: Customers can create a replication group with a primary Redis cache node, and add one or more read replica nodes that automatically stay synchronized with cache data in the primary node. Read-intensive applications can be offloaded to a read replica, reducing the load on the primary node. Read replicas can also guard against data loss in the event of a primary cache node failure.
  • Added support for using the us-gov-west-1 region to the AWS CloudFormation client.

Install/Download the Latest SDK

AWS Service Provider for Laravel 1.1.0

by Jeremy Lindblom | in PHP

We would like to announce the availability of version 1.1.0 of the AWS Service Provider for Laravel. This release updates the config handling logic of the service provider and provides a package-level configuration that can be published to your Laravel application via Artisan for easy customization.

Are there any other features you would like to see in the service provider? Please let us know on our GitHub issue tracker. Better yet, please consider submitting a pull request!

Release: AWS SDK for PHP 2.4.4

by Jeremy Lindblom | in PHP

We would like to announce the release of version 2.4.4 of the AWS SDK for PHP. This release updates the Amazon EC2 client to use the 2013-07-15 API version and fixes issues reported on the forums and GitHub.

Changelog

  • Added support for assigning a public IP address to a VPC instance at launch to the Amazon EC2 client
  • Updated the Amazon EC2 client to use the 2013-07-15 API version
  • Updated the Amazon SWF client to sign requests with Signature V4
  • Updated the Instance Metadata client to allow for higher and more customizable connection timeouts
  • Fixed an issue with the SDK where XML map structures were not being serialized correctly in some cases
  • Fixed issue #136 where a few of the new Amazon SNS mobile push operations were not working properly
  • Fixed an issue where the AWS STS AssumeRoleWithWebIdentity operation was unnecessarily requiring credentials and a signature
  • Fixed an issue with the S3Client::uploadDirectory method so that true key prefixes can be used
  • Updated the API docs to include sample code for each operation that indicates the parameter structure
  • Updated the API docs to include more information in the descriptions of operations and parameters
  • Added a page about Iterators to the user guide

Install/Download the Latest SDK

Provision an Amazon EC2 Instance with PHP

by Jeremy Lindblom | in PHP

Amazon EC2 is a powerful AWS service that includes the ability to provision on-demand servers. While you can easily do this through the AWS Management Console, in this post, I want to show you how to use the AWS SDK for PHP to do it programmatically by interacting with the Amazon EC2 API.

Let’s create a single PHP script, piece by piece, that uses the SDK to do the following:

  1. Create and configure an Amazon EC2 client.
  2. Create an EC2 key pair and store the private key.
  3. Create and configure an EC2 security group.
  4. Launch an EC2 instance of an Amazon Machine Image (AMI) and retrieve its public DNS name so we can access it via SSH.

Create an EC2 client

First, let’s bootstrap the SDK and create an EC2 client object. Make sure to replace the placeholder values in the following code with your AWS credentials and desired region.

<?php

require 'vendor/autoload.php';

use Aws\Ec2\Ec2Client;

$ec2Client = Ec2Client::factory(array(
    'key'    => '[aws access key]',
    'secret' => '[aws secret key]',
    'region' => '[aws region]' // (e.g., us-east-1)
));

Create a key pair

Next, we’ll create a key pair that will provide SSH access to our server once it is running. We need to create the key pair first so we can specify it when we launch the EC2 instance. Creating the key pair is simple.

// Create the key pair
$keyPairName = 'my-keypair';
$result = $ec2Client->createKeyPair(array(
    'KeyName' => $keyPairName
));

In order to use the key pair later, we will need to save the private key locally. We can do this by extracting the key material from the response and using some of PHP’s file handling functions to save it to a file. We also need to adjust the file permissions so that the key can be used for SSH access.

// Save the private key
$saveKeyLocation = getenv('HOME') . "/.ssh/{$keyPairName}.pem";
file_put_contents($saveKeyLocation, $result['KeyMaterial']);

// Update the key's permissions so it can be used with SSH
chmod($saveKeyLocation, 0600);

Create and configure a security group

Next, let’s create and configure a security group which will allow the server to be accessed via HTTP (port 80) and SSH (port 22). By default, access to an EC2 instance is completely locked down. Security groups allow you to whitelist access to ports on an EC2 instance. Creating a security group requires only a name and description.

// Create the security group
$securityGroupName = 'my-security-group';
$result = $ec2Client->createSecurityGroup(array(
    'GroupName'   => $securityGroupName,
    'Description' => 'Basic web server security'
));

// Get the security group ID (optional)
$securityGroupId = $result->get('GroupId');

After creating the security group, you can then configure its rules. To open up ports 22 and 80 we will use the AuthorizeSecurityGroupIngress operation and specify the security group name.

// Set ingress rules for the security group
$ec2Client->authorizeSecurityGroupIngress(array(
    'GroupName'     => $securityGroupName,
    'IpPermissions' => array(
        array(
            'IpProtocol' => 'tcp',
            'FromPort'   => 80,
            'ToPort'     => 80,
            'IpRanges'   => array(
                array('CidrIp' => '0.0.0.0/0')
            ),
        ),
        array(
            'IpProtocol' => 'tcp',
            'FromPort'   => 22,
            'ToPort'     => 22,
            'IpRanges'   => array(
                array('CidrIp' => '0.0.0.0/0')
            ),
        )
    )
));

Note: In this simple example, we are granting all IP addresses access to these two ports, but in a production setting you should consider limiting the access to certain IP addresses or ranges as appropriate. Also, you may need to open additional ports for MySQL or HTTPS traffic.
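Since both rules above have the same shape, you could factor them into a small builder (tcpIngressRule is a hypothetical helper, not part of the SDK):

```php
// Hypothetical helper: build one TCP ingress rule in the array shape
// that the AuthorizeSecurityGroupIngress operation expects.
function tcpIngressRule($port, $cidr = '0.0.0.0/0')
{
    return array(
        'IpProtocol' => 'tcp',
        'FromPort'   => $port,
        'ToPort'     => $port,
        'IpRanges'   => array(
            array('CidrIp' => $cidr)
        ),
    );
}
```

With this helper, the IpPermissions value from the previous example becomes array(tcpIngressRule(80), tcpIngressRule(22)).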

Launch an instance

Now that we have a key pair and security group set up, we are ready to launch an EC2 instance (our server) with these settings. To launch an EC2 instance, you also need to specify the ImageId parameter, which is a reference to the AMI that the EC2 instance should be created from. In this example, we are going to use an Amazon Linux AMI. Use the EC2 RunInstances operation to launch the instance.

// Launch an instance with the key pair and security group
$result = $ec2Client->runInstances(array(
    'ImageId'        => 'ami-570f603e',
    'MinCount'       => 1,
    'MaxCount'       => 1,
    'InstanceType'   => 'm1.small',
    'KeyName'        => $keyPairName,
    'SecurityGroups' => array($securityGroupName),
));

From the result, we must get the ID of the instance. We do this using the getPath method available on the result object. This allows us to pull data out of the result that is deep within the result’s structure. The following line of code retrieves an array of instance IDs from the result. In this case, where we have launched only a single instance, the array contains only one value.

$instanceIds = $result->getPath('Instances/*/InstanceId');
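To make the path syntax concrete, here is a simplified, pure-PHP illustration of how a wildcard path walks a nested result array (extractPath is a hypothetical stand-in that only sketches what the SDK’s getPath method does):

```php
// Simplified stand-in for the SDK's getPath: walk a nested array by a
// "/"-delimited path, where "*" fans out over every element at a level.
function extractPath(array $data, $path)
{
    $current = array($data);
    foreach (explode('/', $path) as $part) {
        $next = array();
        foreach ($current as $item) {
            if (!is_array($item)) {
                continue;
            }
            if ($part === '*') {
                foreach ($item as $value) {
                    $next[] = $value;
                }
            } elseif (array_key_exists($part, $item)) {
                $next[] = $item[$part];
            }
        }
        $current = $next;
    }
    return $current;
}
```

For a result with two launched instances, 'Instances/*/InstanceId' would collect both instance IDs into a flat array.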

Now that the launch has been triggered, we must wait for the instance to become available. The AWS SDK for PHP provides a feature called Waiters, which allow you to poll a resource until it is in a desired state. We will use the waitUntilInstanceRunning method of the EC2 client to wait until the instance that we have just launched is in the “Running” state.

// Wait until the instance is launched
$ec2Client->waitUntilInstanceRunning(array(
    'InstanceIds' => $instanceIds,
));

Once the instance is running, we can use the DescribeInstances operation to retrieve information about the instance, including its public DNS name. We’ll use the getPath method again on the result to extract the PublicDnsName value.

// Describe the now-running instance to get the public URL
$result = $ec2Client->describeInstances(array(
    'InstanceIds' => $instanceIds,
));
echo current($result->getPath('Reservations/*/Instances/*/PublicDnsName'));

Using the public DNS name and the private key that you downloaded, you can SSH into the server. You can do this (from Linux/Unix and Mac devices) by using the ssh command from your CLI.

ssh -i <path to key> ec2-user@<public dns name>

Once you are logged in, you can install software (e.g., yum install php) and deploy your application. Good work! Hopefully, this tutorial helps you with your next PHP-related DevOps project.