Neo@Ogilvy is Ogilvy & Mather's global media agency and performance marketing network, employing more than 800 digital experts in 40 offices worldwide. Ogilvy & Mather launched Neo@Ogilvy in 2006 to help clients organize and relate complex consumer information so they can make time-sensitive decisions, monitor emerging trends, course-correct rapidly and jump on new business opportunities.

Neo@Ogilvy believes that technology is a crucial component of a new marketing world where every action has at least one data point. To deliver marketplace foresight to some of the world’s most innovative and respected brands, Neo@Ogilvy developed a proprietary T-SQL processing pipeline that uses clickstream data to analyze product and revenue information for its customers by region. “This raw data is an extremely valuable asset for us in order to fully understand user behavior. It has allowed us to evolve our strategic planning function fundamentally into a truly scientific process,” explains Rafael Garcia-Navarro, Head of Analytics, EMEA.
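As an illustration of the kind of question such a pipeline answers, the sketch below aggregates product and revenue figures by region from a clickstream extract. It is written in Python with pandas purely for illustration; the column names and data layout are assumptions, not Neo@Ogilvy’s actual schema or T-SQL code:

    import pandas as pd

    # Hypothetical clickstream extract: one row per user event.
    # Column names are illustrative, not the actual schema.
    events = pd.DataFrame({
        "region":  ["EMEA", "EMEA", "APAC", "AMER", "APAC"],
        "product": ["A", "B", "A", "A", "B"],
        "revenue": [120.0, 75.5, 0.0, 310.0, 42.25],
    })

    # Product and revenue information by region: the core
    # aggregation the pipeline produces for each market.
    summary = (
        events.groupby(["region", "product"])["revenue"]
              .agg(["count", "sum"])
              .rename(columns={"count": "events", "sum": "revenue"})
    )
    print(summary)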

The company initially ran the processing pipeline on a single in-house server that could only process three months of data for a single market at a time. Compute times became prohibitively long when the company tried to analyze larger volumes of data. This limitation affected the robustness of the statistical models and meant that deploying the solution for large global clients, whose campaigns and other activities ran across 35 markets, became an extremely resource-intensive process.

Neo@Ogilvy was using a ten-step extract, transform, load (ETL) process that included extensive data parsing, data aggregation, and pivot operations to provide information in a usable structure for its team of analysts. “The process was sequential in nature and didn’t take advantage of massively parallel processing (MPP) technologies,” says Garcia-Navarro. “We had to find an alternative and sustainable platform to carry out the ETL stage of our data modeling framework.”

Neo@Ogilvy approached Amazon Web Services (AWS) to learn how it could use cloud technology to continue to innovate for its customer base. “AWS is widely recognized across the industry as the leader in cloud computing platforms both in terms of breadth and depth of its offering,” says Garcia-Navarro. “We only work with the best and the AWS Cloud provides on-demand access to cost-effective, scalable big data technologies to help collect, store, compute, and collaborate around data sets of all sizes.”

Neo@Ogilvy uses a secure connection over the Internet to export data from Google DoubleClick to Amazon Simple Storage Service (Amazon S3). “We don’t have to think about capacity planning,” says Garcia-Navarro. “Migrating to Amazon S3 removes the need to keep an in-house storage platform, significantly reducing our operational and fixed costs.”
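A minimal sketch of that export step, using the boto3 AWS SDK for Python, is shown below. The bucket name and key layout are placeholders, and the sketch assumes the DoubleClick files have already been downloaded locally; the actual transfer mechanism may differ:

    import boto3

    # Upload a day's DoubleClick export to S3 over TLS.
    # Bucket name and key layout are illustrative placeholders.
    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="doubleclick_export_2013-01-15.csv.gz",
        Bucket="neo-ogilvy-clickstream",      # hypothetical bucket
        Key="raw/2013/01/15/export.csv.gz",   # date-partitioned layout
    )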

The company is using Amazon Elastic MapReduce (Amazon EMR) to run a more scalable, powerful and cost-effective ETL process. Amazon EMR runs the pipeline as a set of parallel Hadoop streaming operations written in the Python programming language. By using Amazon EMR, Neo@Ogilvy can provision the compute capacity needed to perform the ETL quickly, without spending time setting up, managing or tuning Hadoop clusters. “The benefit of using Amazon EMR is that we’re capable of matching our processing power to the urgency of the client’s needs,” says Garcia-Navarro.
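Hadoop streaming runs any executable that reads stdin and writes stdout as a mapper or reducer, so each string-transformation step can be an ordinary Python script. The following mapper sketch assumes a comma-delimited record layout and field positions that are purely illustrative:

    #!/usr/bin/env python
    # Hadoop streaming mapper: parse raw clickstream lines into
    # tab-separated (region-product, revenue) records. The field
    # layout below is illustrative, not the actual export format.
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split(",")
        if len(fields) < 5:
            continue  # skip malformed records
        region, product, revenue = fields[1], fields[3], fields[4]
        # Emit key<TAB>value so Hadoop groups records by key
        # before they reach the reducer.
        print("%s,%s\t%s" % (region, product, revenue))

Because Hadoop distributes the input splits across mappers, the same transformation scales from one market to 35 simply by adding nodes to the cluster.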

Once the basic string transformations are complete, Neo@Ogilvy creates data aggregations and joins with Apache Hive, and uses Amazon Relational Database Service (Amazon RDS) to generate a data pivot structure from the raw event data. The processed data is loaded into a Microsoft SQL Server instance in Neo@Ogilvy’s environment so analysts can perform statistical modeling using Microsoft SQL Server Analysis Services (SSAS) and the R statistical computing platform. By using AWS, Neo@Ogilvy designed its architecture to analyze a full year’s worth of data and to scale to larger data volumes as needed. Figure 1 shows Neo@Ogilvy’s architecture running on AWS.

Figure 1: Neo@Ogilvy architecture diagram
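To give a sense of the Hive stage, the sketch below submits an aggregation-and-join query to a running EMR cluster through boto3. The table names, columns, S3 paths, and cluster ID are placeholders, not Neo@Ogilvy’s actual objects:

    import boto3

    # HiveQL for the aggregation/join stage, embedded as a string.
    # Table, column, and path names are illustrative placeholders.
    HIVE_QUERY = """
    INSERT OVERWRITE DIRECTORY 's3://neo-ogilvy-clickstream/aggregated/'
    SELECT e.region, p.product_name,
           COUNT(*)       AS events,
           SUM(e.revenue) AS revenue
    FROM   events e
    JOIN   products p ON e.product_id = p.product_id
    GROUP  BY e.region, p.product_name;
    """

    emr = boto3.client("emr")
    emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXX",  # hypothetical running cluster
        Steps=[{
            "Name": "hive-aggregation",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["hive", "-e", HIVE_QUERY],
            },
        }],
    )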

Neo@Ogilvy was able to implement its complex infrastructure in only eight weeks and realized both performance improvements and cost savings. By using Amazon EMR, Neo@Ogilvy executed the core string transformations against the full 500 GB dataset in 193 minutes, a job that would have required overnight processing on its legacy system.

Neo@Ogilvy also takes advantage of the Amazon EC2 Spot Instance pricing model to lower computing costs significantly for time-flexible and interruption-tolerant tasks. “By using Spot Instances, we were able to perform the ETL process at a cost of only $250.00,” notes Garcia-Navarro. “The cost reduction coupled with the performance improvement has pushed our analytical delivery capability to a whole new level.”
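Spot capacity is requested by marking an EMR instance group with a maximum bid price. A minimal sketch through boto3 follows; the instance types, counts, release label, and bid price are placeholders chosen for illustration:

    import boto3

    emr = boto3.client("emr")

    # Launch a transient cluster whose task nodes bid for Spot
    # capacity; master and core nodes stay On-Demand so the job
    # survives Spot interruptions. All values are illustrative.
    emr.run_job_flow(
        Name="neo-ogilvy-etl",
        ReleaseLabel="emr-5.36.0",
        Instances={
            "InstanceGroups": [
                {"Name": "master", "InstanceRole": "MASTER",
                 "InstanceType": "m5.xlarge", "InstanceCount": 1,
                 "Market": "ON_DEMAND"},
                {"Name": "core", "InstanceRole": "CORE",
                 "InstanceType": "m5.xlarge", "InstanceCount": 2,
                 "Market": "ON_DEMAND"},
                {"Name": "task", "InstanceRole": "TASK",
                 "InstanceType": "m5.xlarge", "InstanceCount": 8,
                 "Market": "SPOT", "BidPrice": "0.10"},
            ],
            "KeepJobFlowAliveWhenNoSteps": False,  # terminate after the ETL
        },
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )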

Richard Wheaton, CEO of Neo@Ogilvy EMEA, sums up the AWS relationship as follows: “Working with AWS has given Neo@Ogilvy a genuine competitive advantage in a critical new phase of digital media innovation. The speed and power of the architecture that we have created on AWS provides our analysts with new insight and predictive tools to optimize performance between digital channels and within each channel. At the end of the day, it is all about finding new ways to drive better performance for our clients. Our new capability, courtesy of AWS, helps us crystallize our insights and make them scalable and repeatable for large international clients.”

Neo@Ogilvy plans to explore long-term data archiving with Amazon Glacier and EC2 Reserved Instances to continue to lower costs for time-critical processes. “AWS forms the backbone of our data proposition, enabling us to deliver business value to our clients from big data,” says Garcia-Navarro. “We view the relationship with AWS as a strategic one that will continue to grow in the future.”
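One common way to adopt Amazon Glacier for archiving is an S3 lifecycle rule that transitions older objects automatically. A minimal sketch using boto3 appears below; the bucket name, prefix, and 90-day threshold are assumptions, not a stated Neo@Ogilvy policy:

    import boto3

    s3 = boto3.client("s3")

    # Transition raw exports to Glacier storage after 90 days.
    # Bucket, prefix, and retention period are illustrative.
    s3.put_bucket_lifecycle_configuration(
        Bucket="neo-ogilvy-clickstream",
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-raw-exports",
                "Filter": {"Prefix": "raw/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            }],
        },
    )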

To learn more about how AWS can help your digital marketing needs, visit our Digital Marketing details page: http://aws.amazon.com/digital-marketing/.

To learn more about how AWS can help your data needs, visit our Big Data details page: http://aws.amazon.com/big-data/.