Looking for a way to preview online display ads and automatically capture a screenshot?
Based on conversations with one of our consulting clients, Datapoint Media, a company very familiar with the online advertising industry, it quickly became apparent that there really isn’t a good automated solution currently out there. When a buyer asks for “proofs” of their banner ads on the main sites they will appear on, Ad Operations personnel are faced with two less-than-thrilling (and quite time-intensive) options:
- Grab screenshots of the sites the client would like to preview and download the standalone display ad images the client is buying. Then open up Photoshop or another photo editor and copy and paste those ad images over the existing banner ads in the screenshots of the target websites.
- Wait until the campaign is in flight and hope to catch lightning in a bottle by loading up the website the ad is likely to rotate into, refreshing the page continuously until the ads the clients bought appear, and finally taking a screenshot of the site.
Imagine having to do this every day week in and week out for hundreds of client orders.
Given the strong demand for a tool and the lack of automated solutions, we worked with Datapoint Media to build one as part of their existing Audience Extension platform.
Here’s how the Banner Ad preview tool works:
- A simple web-based UI lets users enter the URL of the website on which they’d like to preview the ads. Once selected, the website is displayed in an iframe “preview window” so the user can get the lay of the land and see the site’s current ad layout.
- Users can choose from three methods for entering the banner ad/creative images they want displayed on the selected site (one of which is entering an Ad Server ID).
- At this point, users submit the preview request. If they chose the Ad Server ID entry method, the ad server’s API is queried for a listing of all the associated creative images, and users then select which creatives they want to include in the screenshot.
- The request is placed in a queue to be automatically processed, and users are presented with a confirmation that they will receive an email with the screenshot file attached within a few minutes. No more work needs to be done by humans; it’s time for the robots to do the heavy lifting.
- Behind the scenes, the tool loads up an “invisible” browser window on the server which points to the target website. Next, it executes a series of commands to inspect the website and determine where the valid ad slots are located. Once the slots are defined, it matches the open slots against the dimensions of the banner ads the user has selected. If the dimensions match, it replaces the existing ads on the website with the user-entered banner ads and takes a screenshot.
- The resulting screenshot file is saved on the application server and automatically emailed to the user.
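The slot-matching step described above can be sketched roughly as follows. This is a simplified illustration in Python; the tool’s actual implementation, slot-detection logic, and data structures are not public, so the function and field names here are hypothetical:

```python
# Hypothetical sketch: pair user-supplied creatives with detected ad slots
# by matching their pixel dimensions, as in the preview tool's matching step.

def match_creatives_to_slots(slots, creatives):
    """Return (slot_id, creative_url) pairs where dimensions match exactly.

    slots     -- list of dicts like {"id": ..., "width": 728, "height": 90}
    creatives -- list of dicts like {"url": ..., "width": 728, "height": 90}
    Slots with no matching creative are left untouched (not returned).
    """
    replacements = []
    for slot in slots:
        for creative in creatives:
            if (slot["width"], slot["height"]) == (creative["width"], creative["height"]):
                replacements.append((slot["id"], creative["url"]))
                break  # one creative per slot; move on to the next slot
    return replacements


slots = [
    {"id": "leaderboard", "width": 728, "height": 90},
    {"id": "sidebar", "width": 300, "height": 250},
    {"id": "skyscraper", "width": 160, "height": 600},
]
creatives = [
    {"url": "https://cdn.example.com/banner-728x90.png", "width": 728, "height": 90},
    {"url": "https://cdn.example.com/banner-300x250.png", "width": 300, "height": 250},
]

pairs = match_creatives_to_slots(slots, creatives)
print(pairs)
```

In the real tool this matching happens inside the headless browser session, and each matched slot’s ad markup is swapped for the user’s creative before the screenshot is taken; the skyscraper slot in the sample above has no matching creative, so it would keep its existing ad.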
If you have any questions or are interested in gaining access to the tool, feel free to contact the guys over at http://www.datapointmedia.com.
Posted In: General
Earlier this month one of our clients, SoonSpoon, launched a reservation site that caters to diners looking to score last-minute reservations at some of the most intimate and creative restaurants in Boston. The growing list of partner restaurants such as L’Espalier, Menton and Journeyman are able to use the SoonSpoon website to list last-minute reservations that surface as a result of cancellations. SoonSpoon then disseminates the open reservations to users through a constantly updated listing on their website, text, Twitter and email. As soon as the listed reservation is booked by a spontaneous diner, the restaurant receives a text informing them that SoonSpoon has them covered and their table is filled.
A few months ago SoonSpoon co-founders Travis Lowry and Conor Clary approached us for help finalizing the last pieces of back-end functionality required to get their platform production ready. We added a mobile-friendly dashboard for restaurants to easily list new reservations, as well as an integration with Twilio’s REST API for sending SMS messages. It was great to see this thing lift off successfully and we wish these guys the best of luck – with close to 500 diners, 14 partner restaurants and plans to expand to other cities it looks like they’re off to a great start!
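For the curious, sending an SMS through Twilio’s REST API boils down to an authenticated POST to the account’s Messages endpoint. Below is a minimal sketch in Python using only the standard library; the account SID, auth token, phone numbers, and message text are all placeholders, and SoonSpoon’s actual integration details are not public:

```python
# Hypothetical sketch: build a request for Twilio's Messages REST endpoint.
# Credentials and phone numbers are placeholders, not real values.
import base64
import urllib.parse
import urllib.request

TWILIO_MESSAGES_URL = "https://api.twilio.com/2010-04-01/Accounts/{sid}/Messages.json"

def build_sms_request(account_sid, auth_token, from_number, to_number, body):
    """Return an urllib Request for Twilio's Messages endpoint (not sent here)."""
    url = TWILIO_MESSAGES_URL.format(sid=account_sid)
    data = urllib.parse.urlencode(
        {"From": from_number, "To": to_number, "Body": body}
    ).encode()
    request = urllib.request.Request(url, data=data, method="POST")
    # Twilio uses HTTP Basic auth with the account SID and auth token.
    credentials = base64.b64encode(f"{account_sid}:{auth_token}".encode()).decode()
    request.add_header("Authorization", f"Basic {credentials}")
    return request


req = build_sms_request(
    "ACxxxxxxxx", "your_auth_token", "+15551230000", "+15559870000",
    "Good news: your open table has just been booked!",
)
print(req.full_url)
```

Actually sending the message would be a call to `urllib.request.urlopen(req)` (or, in practice, Twilio’s official helper library), which we skip here since the credentials are fake.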
If you want to read more about these guys, their website is www.soonspoon.com and their Twitter handle is @soonspoonhq. Here’s some recent press from Boston Eater, Boston Business Journal and Boston Magazine.
Posted In: Launch
Recently, one of our clients noticed that when they added text to the body field of a node that already had a lot of content, the changes would appear to save on the back-end edit screen, but the body content of the page would disappear on the front end without a trace and with no errors. At first, we thought it was a character or word count restriction placed on the body field, or that a text-format filter/HTML combination was throwing things off. After checking a bunch of settings on the admin screen and testing different combinations of words, characters and text-format filters, we came up empty-handed.
Turns out it was an obscure setting within sites/default/settings.php. If you open this file and search for ‘pcre.backtrack_limit’ you’ll find a surprisingly accurate description of the problem at hand:
* If you encounter a situation where users post a large amount of text, and
* the result is stripped out upon viewing but can still be edited, Drupal’s
* output filter may not have sufficient memory to process it. If you
* experience this issue, you may wish to uncomment the following two lines
* and increase the limits of these variables. For more information, see
# ini_set('pcre.backtrack_limit', 200000);
# ini_set('pcre.recursion_limit', 200000);
So once you uncomment these lines and increase the limits, you’ll find that the body content reappears on the front end.
Since everyone’s server setup is different, you’ll have to experiment with what values work best for you. Here’s a link to the php.net manual for this configuration setting: http://php.net/manual/en/pcre.configuration.php.
Hope this saves you some time and frustration!
Boston is one of the most active places in the US for technology innovation and home to hundreds of exciting young companies with incredible new ideas. In support of the Boston tech startup scene, I have been publishing a series of short blog posts spotlighting some of our most interesting neighbors.
Due to our continued fascination with big data and support for companies playing in the space it seemed only logical to write about Recorded Future for this edition. These guys are also headquartered in Cambridge, with offices in Göteborg, Sweden and Arlington, VA.
They constantly collect real-time data from web sources such as news, blogs, and public social media and use their technology to analyze trends and identify past, present, and future events. These events are then linked to the people, places, and organizations that matter to their clients, who include Fortune 500 companies and leading government agencies.
Recorded Future’s team of computer scientists, statisticians, linguists, and technical business people offer up an array of software products and services centered around web intelligence. They also provide the Recorded Future API, a web service that allows developers to get in on the action by accessing Recorded Future’s index for large scale analysis of online media flow.
If you’re interested, there’s lots more about their products and services on their website.
Stay tuned for the next startup spotlight.
Posted In: General
It’s that time of year again. Lines form outside the most popular retailers, filled with turkey-gorged shoppers eagerly awaiting this year’s biggest Black Friday deals. In an effort to curb their boredom, these shoppers take to Twitter to pass the time in line and share their shopping experiences. Since we’re not big shoppers ourselves, and certainly not fans of waiting in lines, we took a different approach to participating in Black Friday.
We decided to flex our big data muscles and hook into Twitter’s streaming API sample, which represents a random sampling of Twitter’s 400 million tweets per day, and record all tweets mentioning Black Friday. To handle the streaming data from Twitter, we set up a Storm cluster that processed close to 1 million Black Friday-related tweets and saved the data in a MySQL database we spun up on AWS.
For those of you not familiar, Storm is an open source distributed real-time computation system that can be used to reliably process unbounded streams of data. If you’re interested in the technical details, stay tuned because we’ll be putting out a separate blog post that will walk you through what we did. Also, if you’d like a copy of the MySQL table with the tweet data, you can download it here.
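Our actual topology ran on Storm, but the core filter step is language-agnostic: scan the incoming stream and keep only tweets mentioning Black Friday. Here’s a simplified stand-in in Python (the regex and sample tweets are illustrative, not our production code):

```python
# Simplified stand-in for the Storm topology's filter step: keep only
# tweets from the sample stream whose text mentions Black Friday.
import re

# Case-insensitive match for "black friday", "blackfriday", or the hashtag.
BLACK_FRIDAY = re.compile(r"black\s*friday", re.IGNORECASE)

def filter_black_friday(tweets):
    """Yield only the tweets whose text mentions Black Friday."""
    for tweet in tweets:
        if BLACK_FRIDAY.search(tweet):
            yield tweet


sample = [
    "Waiting in line since 4am #BlackFriday",
    "Just a normal Thursday tweet",
    "Black friday deals are wild this year",
]
matches = list(filter_black_friday(sample))
print(len(matches))  # 2 of the 3 sample tweets match
```

In the real pipeline this filtering happened inside a Storm bolt, with matching tweets written out to the MySQL database rather than collected in memory.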
We put together the infographic below based on the data we collected over the 24-hour period from Thursday 8pm EST to Friday 8pm EST. We hope you enjoy it.
Posted In: General