By Jesse Michelsen, Ad Proxy Technical Lead
Improving OTT advertising sourcing, playback, and verification
OTT is an excellent opportunity for broadcasters and content creators to go well beyond the linear TV experience by making it possible to personalize video streams based on each viewer’s interests. This high level of personalization is also a critical factor in attracting advertising revenue to OTT streams by enabling the delivery of highly targeted advertising at premium CPM rates.
This opportunity, however, is being held back by challenges around ad sourcing, playback, and verification. Many of the standards around OTT advertising are nascent and still evolving. Moreover, in-depth debugging and analysis of quality of service (QoS) are often limited. It’s also important to understand quality of experience (QoE), such as whether an ad played at consistent volume levels.
With these challenges in mind, and in keeping with our ongoing commitment to improve scaling and reduce latency, we developed a dedicated ad proxy service as part of our platform. Originally designed as a back-end enhancement to improve the scalability of our streaming platform, it also offers a number of management advantages, including far more visibility into, and control over, the ad sourcing and delivery workflow. These tools enable publishers to optimize the delivery of the right ad to the right viewer and to monitor both QoS and many aspects of QoE.
Personalized streams with the manifest server
In a previous blog post, we detailed the role of the manifest server in personalizing streams to incorporate tailored advertising content. As we discussed in that post, the manifest server is responsible for making ad requests, parsing the response, and then downloading and processing advertising creative just like it would with any other content. The manifest server then sends an integrated stream to the player, giving viewers a more consistent experience, maximizing device compatibility, and bypassing ad blockers.
While the manifest server is well-equipped to handle playback and personalization, the work involved in sourcing and verifying advertising brings additional complexity and new challenges. As we continued to optimize the streaming architectures that deliver personalized experiences to millions of concurrent viewers, we developed an ad proxy service focused on supporting these activities.
Sourcing and verification challenges
To obtain ads that are going to be inserted into a stream, ad content must be fetched from an ad decision server (ADS) such as FreeWheel or Google Ad Manager. This process involves requesting ads and passing along the stream and all its information, so the correct ads are placed. The challenge is that many of the ads on a given server are just a wrapper pointing to the actual ads on a different server.
For example, if there are four ad slots to be filled, two of them may be inserted directly, but the other two may not have ad assets and instead are wrappers that say, “Your ad isn’t here, it’s somewhere else and you need to go get it here.” We make every effort to unpack and source a playable video asset for every ad response we see. We validate responses as we unpack them to ensure a playable ad asset is ready to stitch into the stream. Given that our architecture is designed to deliver a personalized manifest to each viewer, this process is repeated for each session, which can add up to a considerable load.
Ad lookup latency
Tracking down assets through several wrappers can be a major cause of latency if not handled in parallel, and some wrappers never resolve to an actual ad asset. To prevent this from degrading the video experience, we limit this “waterfalling” before moving on to fetch the next ad. Exposing data and insights during this workflow helps publishers identify and resolve demand sources that don’t result in ads served, ensuring viewers have an uninterrupted viewing experience while also maximizing ad revenue.
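As a rough sketch of the cap on waterfalling, the snippet below follows wrapper redirects until it finds a playable asset or gives up. The response shape, field names, and the depth limit of five are illustrative assumptions, not our production implementation.

```python
MAX_WRAPPER_DEPTH = 5  # assumed cap on "waterfalling" before giving up on a slot

def resolve_ad(response, fetch, max_depth=MAX_WRAPPER_DEPTH):
    """Follow wrapper redirects until an inline, playable asset is found.

    `fetch` stands in for an HTTP call to the wrapper's redirect URL.
    Returns the asset URL, or None if the chain never resolves
    (too deep, or a wrapper points nowhere).
    """
    for _ in range(max_depth):
        if "asset_url" in response:      # inline ad: playable creative found
            return response["asset_url"]
        redirect = response.get("wrapper_url")
        if redirect is None:             # dead end: neither asset nor redirect
            return None
        response = fetch(redirect)       # follow the wrapper one level down
    return None                          # depth cap hit: move on to the next ad
```

With a dead-end or looping wrapper chain, the function returns None instead of stalling the session, so the next ad can be fetched.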
Ensuring a responsive ad experience also means looking at the impact of the ad lookup on the manifest server, which is busy assembling personalized streams with minimal latency. The manifest server doesn’t have unlimited resources dedicated to generating and storing ad performance data. It only stores the ad information it needs to generate the manifest, which can limit the availability of data to debug problematic ad calls and playback.
Ad Proxy Service takes over
Publishers today need a scalable platform that not only interacts with and manages the increasingly complex ad insertion process but also provides visibility into the workflow and their relationships with ad partners.
Shown below is the Ad Proxy Service flow architecture. At the front end of the flow, the player makes requests to the manifest server until it has enough information to request ads from the ADS. Once that happens, instead of reaching out to the ADS itself, the manifest server hands that task off to the Ad Proxy Service. Not only does this offload work from the manifest server, it also brings several other advantages, such as reduced latency and the capture of far more debug data.
The work of fetching and verifying an ad is handled by the Ad Proxy Service, which frees up resources for the manifest server to stitch the ads into the stream for playback and deliver a seamless viewing experience.
In the final step of the flow (step 4), the content service asks Ad Proxy, “Where are my ads for job x?”, referencing the unique job identifier. Ad Proxy returns the ads to the content service, which places them in the manifest and returns it to the player.
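The handoff can be sketched as a minimal in-memory service. The class and method names here are hypothetical, and in reality resolution happens asynchronously via ADS workers rather than through a direct call.

```python
import uuid

class AdProxyService:
    """Toy sketch of the handoff: the manifest server submits an ad
    request, gets a job ID back immediately, and collects the resolved
    ads later once a worker has fetched them from the ADS."""

    def __init__(self):
        self._jobs = {}  # job_id -> list of resolved ads, or None while pending

    def submit(self, stream_info):
        """Accept the ad request and return a job ID placeholder so the
        manifest server can move on without waiting."""
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = None
        return job_id

    def resolve(self, job_id, ads):
        """Called by an ADS worker once it has fetched ads for this job."""
        self._jobs[job_id] = ads

    def get_ads(self, job_id):
        """The 'Where are my ads for job x?' call: returns the resolved
        ads, or None if the job is still pending."""
        return self._jobs.get(job_id)
```

A typical exchange: the manifest server calls `submit(...)`, receives a job ID, and later calls `get_ads(job_id)` when it needs to stitch the ads into the manifest.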
Scaling ad lookup
As the Ad Proxy Service receives requests, it queues them up so that it can continue to receive new requests, improving scalability. It also provides the manifest server with a job ID as a placeholder while ads are tracked down so that the manifest server can move on without having to wait for Ad Proxy. The ADS worker then begins to chew through the “ad jobs” in the queue by calling out to the ADS and sending along all the player data captured and other stream information so the ADS can supply the appropriate ads. A key advantage of this process is that the ADS workers fetch ads in parallel, eliminating potential bottlenecks and reducing latency.
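The parallel-fetch pattern can be sketched with a worker pool, assuming a stand-in `call_ads` function in place of the real ADS round trip:

```python
from concurrent.futures import ThreadPoolExecutor

def call_ads(job):
    """Stand-in for the real ADS round trip: in production this sends
    along the captured player data and stream information and waits on
    the network for the appropriate ads."""
    return {"job_id": job["job_id"], "ads": [f"ad-for-{job['session']}"]}

def run_ad_jobs(jobs, workers=8):
    """Drain the ad-job queue with a pool of ADS workers. Fetching in
    parallel keeps one slow ADS response from blocking the others."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(call_ads, jobs))
```

The worker count of eight is arbitrary here; in practice it would be tuned to the ADS call latency and the request volume.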
Standardizing ADS data
Throughout the process, communication between the Ad Proxy and ADS is recorded along with the ads and stored in a database. The data, which can vary somewhat from provider to provider, is parsed and normalized with consistent naming conventions. This makes it much more efficient to use the ADS data during analysis or debug.
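Normalization can be as simple as renaming provider-specific fields to one consistent schema. The mappings below are invented for illustration and do not reflect actual FreeWheel or Google Ad Manager response formats.

```python
# Hypothetical per-provider field mappings; real ADS responses differ
# per provider and these names are assumptions, not actual schemas.
FIELD_MAP = {
    "freewheel": {"creativeId": "creative_id", "adDuration": "duration_s"},
    "gam":       {"creative":   "creative_id", "length":     "duration_s"},
}

def normalize(provider, raw):
    """Rename provider-specific keys to one consistent naming convention
    so stored records can be queried uniformly during analysis or debug."""
    mapping = FIELD_MAP[provider]
    return {mapping.get(key, key): value for key, value in raw.items()}
```

After normalization, records from different providers can be compared and aggregated with a single set of field names.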
Delivering the ads
The process is completed when the manifest server gets to the point where it needs the ads. It makes another call to Ad Proxy and says, “Here’s the job ID you gave me; give me the ads.” Ad Proxy then fetches them from the database and sends them along.
Indexing and storing ad beacon activity
The Ad Proxy Service is also responsible for capturing and storing beacon information from the player, a key to ensuring proper monetization. Beacons are stored as individual objects with a primary key, so when the manifest server requests ads, the Ad Proxy Service also provides the beacon information. Then, when the player hits a specific checkpoint, it fires a beacon based on the instructions it received in the manifest. The beacon worker then fetches the corresponding objects from the database and updates them with when the beacon fired, the response back from the ADS, and whether or not there was an error.
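The beacon lifecycle can be sketched as follows, with hypothetical field names: each beacon is an object under a primary key, created when ads are handed to the manifest server and updated in place when the player fires it.

```python
import time

class BeaconStore:
    """Sketch of beacon storage: individual objects keyed by a primary
    key, updated by the beacon worker when the player fires them.
    Field names here are illustrative assumptions."""

    def __init__(self):
        self._beacons = {}

    def register(self, beacon_id, checkpoint):
        """Created when the ads (and beacon info) are handed to the
        manifest server; nothing has fired yet."""
        self._beacons[beacon_id] = {
            "checkpoint": checkpoint,
            "fired_at": None,
            "ads_response": None,
            "error": None,
        }

    def record_fire(self, beacon_id, ads_response, error=None):
        """Beacon worker updates the object: when it fired, what the
        ADS said back, and whether there was an error."""
        beacon = self._beacons[beacon_id]
        beacon["fired_at"] = time.time()
        beacon["ads_response"] = ads_response
        beacon["error"] = error
        return beacon
```

Because every fire is recorded against its own object, a missing or errored beacon is directly visible in the stored data.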
Troubleshooting ad playback
Tracking and analysis are built into the process. The Ad Proxy architecture provides extensive information on ad performance and viewership through an API, a GUI, or pushed logs. We know “if” and “why” there’s an ad issue, so there’s no more finger pointing when an ad doesn’t load: you can point to the data. Every session is included without additional configuration, and data is accessible for a maximum of 14 days.
Through the API, content publishers can analyze this ad performance and viewership data in detail.
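As an illustration of the kind of analysis this enables, the sketch below filters session debug records to the 14-day retention window and tallies ad-error causes. The record shape is an assumption for illustration, not the actual API.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=14)  # debug data is accessible for at most 14 days

def sessions_in_window(sessions, now=None):
    """Keep only session debug records still inside the retention
    window. Record shape ({'started', 'ad_errors'}) is hypothetical."""
    now = now or datetime.now()
    return [s for s in sessions if now - s["started"] <= RETENTION]

def error_counts(sessions):
    """Tally ad-error causes across sessions so an 'ad didn't load'
    dispute can be settled by pointing at the data."""
    counts = {}
    for session in sessions:
        for err in session["ad_errors"]:
            counts[err] = counts.get(err, 0) + 1
    return counts
```

Since every session is captured without extra configuration, this kind of rollup needs no per-stream instrumentation.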
Publishers who are looking to engage each and every viewer with a personalized video experience must architect their streaming workloads to scale. Creating a dedicated service for ad processing not only improves the performance of the manifest server (the engine that powers personalized ads, content, and blackouts for individual viewers) but also creates a powerful tool for troubleshooting advertising-supported video streams and ensures a high-quality, TV-like viewing experience.
With the Ad Proxy Service providing a better understanding of the root cause of problems, content publishers and broadcasters gain visibility into the ad operations workflow and can correlate it with other data to increase viewer retention and maximize ad revenue.