Figure 1 - uploaded by Edward Bortnikov
Example of Dynamic Ad Campaign (for Yahoo! Shopping)  

Source publication
Article
Full-text available
One of the most challenging recommendation tasks is recommending to a new, previously unseen user. This is known as the user cold start problem. Assuming certain features or attributes of users are known, one approach for handling new users is to initially model them based on their features. Motivated by an ad targeting application, this paper desc...

Context in source publication

Context 1
... that once the ad variant has been selected, the resulting ad impression looks much like a non-dynamic ad. Figure 1 shows an instance of a dynamic ad campaign for Yahoo! Shopping. ...

Similar publications

Article
Full-text available
The primary objective of the present study was to factor analyze a recently developed mental ability test in order to develop verbal and numerical subtests, and then to establish the reliability of the subtests (and of the test as a whole) using 97 advanced college students as Ss. Using a varimax rotation, some 34 factors were extracted, the most promin...

Citations

... Examples of interactions are purchase and consumption of products (termed implicit interactions) or ratings that users entered to convey and share their level of satisfaction or enjoyment with the products (termed explicit interactions). Despite the success of collaborative filtering models, they present several limitations [7,50,84]. For instance, they tend to recommend popular items, may create filter bubbles, or fail to recommend relevant items to cold users. ...
... However, future work must comply with each dataset's license agreements. Table 4 summarizes relevant statistics of public datasets, such as the number of users, items, user-item interactions, and user-item impressions. In this section, we comprehensively describe each public dataset and its attributes, e.g., its definition of users and items, and its collection period. ...
Preprint
Full-text available
Novel data sources bring new opportunities to improve the quality of recommender systems. Impressions are a novel data source containing past recommendations (shown items) in addition to traditional interactions. Researchers may use impressions to refine user preferences and overcome current limitations in recommender systems research. The relevance of and interest in impressions have increased over the years; hence the need for a review of relevant work on this type of recommender. We present a systematic literature review on recommender systems using impressions, focusing on three fundamental angles in research: recommenders, datasets, and evaluation methodologies. We provide three categorizations of papers describing recommenders using impressions, present each reviewed paper in detail, describe datasets with impressions, and analyze the existing evaluation methodologies. Lastly, we present open questions and future directions of interest, highlighting aspects missing in the literature that can be addressed in future works.
... Collaborative Filtering (CF) and (iii) Hybridized mechanism of RS. CBF uses item characteristic data for its recommendations [7], and CF frameworks operate based on the relationships between users and items [8]. ...
Article
Full-text available
When a new customer enters the spectrum of an E-Commerce system, informative records such as the new user's purchasing history and other browsing data are insufficient, resulting in the emergence of a serious issue known as the cold start problem (CSP). Furthermore, when interactions among product items are limited, a new problem known as sparsity arises. To handle such problems in E-Commerce systems, we have designed an extensive and hybridized methodological approach known as the Cold start and sparsity aware hybridized recommendation system (CSSHRS), to reduce dataset sparsity as well as to overcome the cold start problem in the recommendation framework. The proposed CSSHRS technique was evaluated using the Last.FM and Book-Crossing datasets, resulting in a Mean absolute percentage error (MAPE) of 37%, recall of 0.07, precision of 0.18, Normalized Discounted Cumulative Gain (NDCG) of 0.61, and F-measure of 0.1. This article shows the proposed CSSHRS technique to be an effective and efficient hybrid RS against the issues of data sparsity and CSP.
... In order to rank the native ads for incoming users and their specific context according to the cost per click (CPC) price type, where advertisers pay for clicks, a score (or expected revenue) is calculated by multiplying the predicted click-through rate (pCTR) by the bid for each active ad. The pCTR is provided by a model that is periodically updated by OFFSET - a feature-enhanced collaborative-filtering (CF) based event-prediction algorithm [3][5][20]. OFFSET is a one-pass algorithm that updates its latent factor model for every new batch of logged data using an online gradient descent (OGD) based learning approach. ...
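The CPC scoring rule described in this excerpt (expected revenue = pCTR x bid) can be sketched as follows; the `Ad` class and `rank_ads` helper are illustrative names, not part of the cited systems:

```python
from dataclasses import dataclass

@dataclass
class Ad:
    ad_id: str
    bid: float   # advertiser's cost-per-click bid
    pctr: float  # predicted click-through rate from the model

def rank_ads(ads):
    """Sort ads by expected revenue (score = pCTR * bid), highest first."""
    return sorted(ads, key=lambda ad: ad.pctr * ad.bid, reverse=True)

ads = [Ad("a1", bid=0.50, pctr=0.02),   # score 0.010
       Ad("a2", bid=0.10, pctr=0.08),   # score 0.008
       Ad("a3", bid=0.30, pctr=0.05)]   # score 0.015
print([ad.ad_id for ad in rank_ads(ads)])  # ['a3', 'a1', 'a2']
```

Note that the highest pCTR alone does not win: the bid and the click probability jointly determine the expected revenue.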
... Powering the Gemini native models is OFFSET (One-pass Factorization of Feature Sets): a feature-enhanced collaborative-filtering (CF) based ad event-prediction algorithm [3][5][6][8][20]. According to OFFSET, the predicted event probability (pET) of a user u and ad a is roughly given by ...
Preprint
Full-text available
Yahoo Gemini native advertising marketplace serves billions of impressions daily to hundreds of millions of unique users, and reaches a yearly revenue of many hundreds of millions of USD. Powering the Gemini native models for predicting ad event probabilities, such as conversions and clicks, is OFFSET - a feature-enhanced collaborative-filtering (CF) based event-prediction algorithm. The predicted probabilities are then used in Gemini native auctions to determine which ads to present for every serving event (impression). Dynamic creative optimization (DCO) is a recent Gemini native product that was launched two years ago and is increasingly gaining attention from advertisers. The DCO product enables advertisers to issue several assets for each native ad attribute, creating multiple combinations for each DCO ad. Since different combinations may appeal to different crowds, it may be beneficial to present certain combinations more frequently than others to maximize revenue while keeping advertisers and users satisfied. The initial DCO offering optimized click-through rates (CTR); however, as the marketplace shifts more towards conversion-based campaigns, advertisers also ask for a conversion-based solution. To accommodate this request, we present a post-auction solution, where DCO ad combinations are favored according to their predicted conversion rate (CVR). The predictions are provided by an auxiliary OFFSET-based combination CVR prediction model, and are used to generate the combination distributions for DCO ad rendering during serving time. An online evaluation of this explore-exploit solution, via online bucket A/B testing serving Gemini native DCO traffic, showed a 53.5% CVR lift when compared to a control bucket serving all combinations uniformly at random.
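One way to read "generate the combination distributions for DCO ad rendering" is as normalizing predicted CVRs into rendering probabilities. The sketch below is a hypothetical illustration of that reading; `combination_distribution` and the plain normalization are assumptions, not the paper's actual scheme:

```python
import random

def combination_distribution(pcvr):
    """Normalize predicted per-combination CVRs into rendering probabilities,
    so higher-CVR combinations are rendered more often."""
    total = sum(pcvr.values())
    return {c: p / total for c, p in pcvr.items()}

dist = combination_distribution({"c1": 0.02, "c2": 0.01, "c3": 0.01})
print(dist["c1"])  # 0.5

# Sampling one combination for a single ad rendering at serving time:
choice = random.choices(list(dist), weights=list(dist.values()), k=1)[0]
```

A distribution (rather than always serving the argmax combination) keeps some exploration traffic flowing to lower-ranked combinations.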
... Launched seven years ago and operating with a yearly run-rate of many hundreds of millions of USD, the Yahoo Gemini native marketplace (see https://gemini.yahoo.com/advertiser/home) is one of Yahoo's largest and fastest growing businesses. With more than two billion impressions daily, and an inventory of a few hundred thousand active ads, Gemini native serves users with ads that are rendered to resemble the surrounding native content (see Figure 1 for examples of Gemini native ads on different devices). ...
... In order to rank native ads for incoming users and their specific context according to the cost per click (CPC) price type, a score (or expected revenue) is calculated by multiplying the advertiser's bid and the predicted click-through rate (pCTR) for each ad. The pCTR is calculated using models that are generated by OFFSET - a feature-enhanced collaborative-filtering (CF) based event-prediction algorithm [1]. OFFSET is a one-pass incremental algorithm that updates its latent factor model for every new batch of logged data using a gradient based learning approach. ...
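The one-pass incremental update described above can be illustrated with a minimal sketch: a logistic-loss gradient step on user and ad latent vectors for each logged event in a mini-batch. All names and the two-dimensional toy vectors are assumptions for illustration, not the OFFSET implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def ogd_minibatch_update(user_vecs, ad_vecs, batch, lr=0.1):
    """One incremental pass over a mini-batch of logged (user, ad, clicked)
    events; updates the latent vectors in place by a gradient step on the
    logistic (log) loss of the dot-product score."""
    for u, a, clicked in batch:
        vu, va = user_vecs[u], ad_vecs[a]
        p = sigmoid(sum(x * y for x, y in zip(vu, va)))
        g = p - clicked  # gradient of the log loss w.r.t. the score
        for k in range(len(vu)):
            vu[k], va[k] = vu[k] - lr * g * va[k], va[k] - lr * g * vu[k]

user_vecs = {"u1": [0.1, 0.1]}
ad_vecs = {"a1": [0.1, 0.1]}
before = sigmoid(sum(x * y for x, y in zip(user_vecs["u1"], ad_vecs["a1"])))
for _ in range(50):  # replay the same mini-batch to show the update direction
    ogd_minibatch_update(user_vecs, ad_vecs, [("u1", "a1", 1)])
after = sigmoid(sum(x * y for x, y in zip(user_vecs["u1"], ad_vecs["a1"])))
print(after > before)  # True: predicted probability rises for a clicked pair
```

The one-pass property means each logged batch is consumed once and discarded; the model state carries all accumulated information.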
... Yahoo Gemini native models are driven by OFFSET (Onepass Factorization of Feature Sets): a feature enhanced collaborative-filtering (CF) based event prediction algorithm [1]. ...
... In order to rank the native ads for the incoming users and their specific context according to the cost per click (CPC) price type, the expected revenue of each ad is computed as a product of the advertiser's bid and the predicted click probability (pCTR). The pCTR is calculated using models that are continually updated by OFFSET -a feature enhanced collaborative-filtering (CF) based event prediction latent factor model [2]. ...
... The algorithm driving Verizon Media native models is OFFSET (One-pass Factorization of Feature Sets): a feature-enhanced collaborative-filtering (CF) based ad click-prediction algorithm [2], which resembles a factorization machine [24]. The predicted click-through-rate (pCTR) of a given user u and an ad a according to OFFSET is given by ...
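The formula truncated in this excerpt is a factorization-machine-like prediction. A minimal sketch, assuming a sigmoid over a bias plus the user-ad latent-vector dot product (the actual OFFSET score includes further feature terms not shown here):

```python
import math

def pctr(user_vec, ad_vec, bias=0.0):
    """Sketch of a factorization-machine-like click prediction:
    pCTR = sigmoid(bias + <v_u, v_a>)."""
    score = bias + sum(u * a for u, a in zip(user_vec, ad_vec))
    return 1.0 / (1.0 + math.exp(-score))

print(round(pctr([0.0, 0.0], [0.3, -0.1]), 3))  # 0.5 when the score is 0
```

The sigmoid squashes the unbounded latent score into a valid probability, which is what allows the pCTR to be multiplied directly by the bid in the auction.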
Preprint
Full-text available
Real-world content recommendation marketplaces exhibit certain behaviors and are subject to constraints that are not always apparent in common static offline data sets. One example that is common in ad marketplaces is swift ad turnover: new ads are introduced and old ads disappear at high rates every day. Another example is ad discontinuity, where existing ads may appear and disappear from the market for non-negligible amounts of time due to a variety of reasons (e.g., depletion of budget, pausing by the advertiser, flagging by the system, and more). These behaviors sometimes cause the model's loss surface to change dramatically over short periods of time. To address these behaviors, fresh models are highly important, and to achieve this (and for several other reasons) incremental training on small chunks of past events is often employed. These behaviors and algorithmic optimizations occasionally cause model parameters to grow uncontrollably large, or diverge. In this work we present a systematic method to prevent model parameters from diverging by imposing a carefully chosen set of constraints on the model's latent vectors. We then devise a method inspired by primal-dual optimization algorithms to fulfill these constraints in a manner that both aligns well with incremental model training and does not require any major modifications to the underlying model training algorithm. We analyze, demonstrate, and motivate our method on OFFSET, a collaborative filtering algorithm which drives Yahoo native advertising, one of VZM's largest and fastest growing businesses, reaching a run-rate of many hundreds of millions of USD per year. Finally, we conduct an online experiment which shows a substantial reduction in the number of diverging instances, and a significant improvement to both user experience and revenue.
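The paper's primal-dual method is more involved, but the kind of constraint it enforces, keeping latent vectors from growing uncontrollably, can be illustrated by a simple norm-ball projection; `project_to_ball` and the chosen bound are assumptions for illustration, not the paper's algorithm:

```python
import math

def project_to_ball(vec, max_norm):
    """If the vector's Euclidean norm exceeds max_norm, rescale it back onto
    the ball of that radius; otherwise return it unchanged."""
    norm = math.sqrt(sum(x * x for x in vec))
    if norm <= max_norm:
        return vec
    scale = max_norm / norm
    return [x * scale for x in vec]

v = project_to_ball([3.0, 4.0], max_norm=1.0)  # norm 5 -> rescaled to norm 1
print([round(x, 2) for x in v])  # [0.6, 0.8]
```

Applying such a projection after each incremental update bounds every latent vector without touching the underlying gradient-based training step, which matches the abstract's stated design goal.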
... In contrast to search ads, user intent is usually unknown, which makes ad matching more challenging in the native marketplace. Nevertheless, the native ads market share has been steadily increasing over the years. ...
... The expected ad close loss, which can be any function of the predicted ad close probability (pClose), is estimated here as pClose times the loss due to an ad close - a system parameter. To predict the ad close probability we use Offset - a feature-enhanced collaborative-filtering (CF) based event prediction algorithm [2][3]. To facilitate Offset predicting ad close events, we use ad closes as positive events, and conduct a feature selection process to maximize model accuracy under certain system resource constraints. ...
... In particular, collaborative filtering (CF) in general and specifically matrix factorization (MF) based approaches are leading recommendation technologies, where entities are represented by latent vectors and learned by users' feedback (such as ratings, clicks and purchases) [16]. MF-CF based models are used successfully for many recommendation tasks such as movie recommendation [6], music recommendation [4], ad matching [2], and much more. ...
... Collaborative filtering (CF) in general and specifically matrix factorization (MF) based approaches are leading recommendation technologies, according to which entities are represented by latent vectors and learned by users' feedback (such as ratings, clicks and purchases) [14]. MF-CF based models are used successfully for many recommendation tasks such as movie recommendation [6], music recommendation [5], ad matching [3], and much more. CF is evolving constantly, where recently it was combined with deep learning (DL) for embedding entities into the model [12]. ...
... Yahoo has also shared its native ad click prediction algorithm with the community, where an earlier version of Offset was presented [3]. A mature version of Offset was presented in [4], where the focus was on its adaptive online hyper-parameter tuning approach, taking advantage of its parallel system architecture. ...
Conference Paper
Yahoo's native advertising (also known as Gemini native) serves billions of ad impressions daily, reaching a yearly run-rate of many hundreds of millions of USD. Driving the Gemini native models that are used to predict both click probability (pCTR) and conversion probability (pCONV) is OFFSET -- a feature-enhanced collaborative-filtering (CF) based event prediction algorithm. OFFSET is a one-pass algorithm that updates its model for every new batch of logged data using a stochastic gradient descent (SGD) based approach. Since OFFSET represents its users by their features (i.e., a user-less model) due to sparsity issues, rule-based hard frequency capping (HFC) is used to control the number of times a certain user views a certain ad. Moreover, related statistics reveal that user ad fatigue results in a dramatic drop in click-through rate (CTR). Therefore, to improve click prediction accuracy, we propose a soft frequency capping (SFC) approach, where the frequency feature is incorporated into the OFFSET model as a user-ad feature and its weight vector is learned via logistic regression as part of OFFSET training. Online evaluation of the soft frequency capping algorithm via bucket testing showed a significant 7.3% revenue lift. Since then, the frequency feature enhanced model has been pushed to production serving all traffic, and is generating a hefty revenue lift for Yahoo Gemini native. We also report related statistics that reveal, among other things, that while users' gender does not affect ad fatigue, the latter seems to increase with users' age.
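The soft frequency capping idea, a learned per-frequency weight added to the click score instead of a hard view cap, can be sketched as follows; the bucket boundaries and weight values here are invented for illustration and are not the learned values from the paper:

```python
import math

# Assumed per-bucket weights for the frequency feature; in SFC these would be
# learned via logistic regression as part of model training.
FREQ_WEIGHTS = {0: 0.0, 1: -0.3, 2: -0.6, 3: -1.0}

def pctr_with_sfc(base_score, views):
    """Add the frequency-bucket weight to the base click score, then squash.
    More prior views of the same ad -> lower predicted CTR (ad fatigue)."""
    bucket = min(views, max(FREQ_WEIGHTS))  # clamp into the last bucket
    score = base_score + FREQ_WEIGHTS[bucket]
    return 1.0 / (1.0 + math.exp(-score))

print(round(pctr_with_sfc(0.0, 0), 2))  # 0.5  (no fatigue penalty)
print(round(pctr_with_sfc(0.0, 3), 2))  # 0.27 (strong fatigue penalty)
```

Unlike a hard cap, which stops serving the ad entirely past a threshold, the soft approach lets the auction itself demote fatigued user-ad pairs via their lower pCTR.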
... The pCTR is calculated using models that are periodically updated by Offset - a feature-enhanced collaborative filtering (CF) based event-prediction algorithm. Offset is a one-pass algorithm that updates its latent factor model for every new mini-batch of logged data using a stochastic gradient descent (SGD) based learning approach [2][3]. ...
... In particular, collaborative filtering (CF) in general and specifically matrix factorization (MF) based approaches are leading recommendation technologies, where entities are represented by latent vectors and learned by users' feedback (such as ratings, clicks and purchases) [14]. MF-CF based models are used successfully for many recommendation tasks such as movie recommendation [6], music recommendation [4], ad matching [2], and much more. ...
... Yahoo has also shared its native ad click prediction algorithm with the community where an earlier version of Offset was presented in [2]. A mature version of Offset was presented in [3], where the focus was on the adaptive online hyper-parameter tuning approach, taking advantage of its parallel system architecture. ...
Conference Paper
Yahoo's native advertising marketplace (also known as Gemini native) serves billions of ad impressions daily, reaching many hundreds of millions USD in yearly revenue. Driving Gemini native models that are used to predict ad click probability (pCTR) is OFFSET - a feature enhanced collaborative-filtering (CF) based event prediction algorithm. While some of the user features used by OFFSET have high coverage, other features, especially those based on click patterns, suffer from extremely low coverage. In this work, we present a framework that simplifies complex interactions between users and other entities in a bipartite graph. The one mode projection of this bipartite graph onto users represents a user similarity network, allowing us to quantify similarities between users. This network is combined with existing user features to create an enhanced feature set. In particular, we describe the implementation and performance of our framework using user Internet browsing data (e.g., visited pages URLs) to enhance the user category feature. Using our framework we effectively increase the feature coverage by roughly 15%. Moreover, online results evaluated on 1% of Gemini native traffic show that using the enhanced feature increases revenue by almost 1% when compared to the baseline operating with the original feature, which is a substantial increase at scale.
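The one-mode projection described in this abstract, connecting users that share items (e.g., visited URLs) in a bipartite graph, can be sketched as follows; `project_onto_users` and the toy data are illustrative names, not the paper's implementation:

```python
from collections import defaultdict
from itertools import combinations

def project_onto_users(user_items):
    """One-mode projection of a user-item bipartite graph onto users:
    two users become neighbors if they share at least one item, and the
    edge weight counts the number of shared items."""
    item_to_users = defaultdict(set)
    for user, items in user_items.items():
        for item in items:
            item_to_users[item].add(user)
    weights = defaultdict(int)
    for users in item_to_users.values():
        for u, v in combinations(sorted(users), 2):
            weights[(u, v)] += 1
    return dict(weights)

graph = project_onto_users({
    "alice": {"url1", "url2"},
    "bob":   {"url2", "url3"},
    "carol": {"url1", "url2"},
})
print(graph[("alice", "carol")])  # 2 (they share url1 and url2)
```

The resulting user similarity network can then back-fill a sparse feature: a user missing the feature inherits it from strongly connected neighbors, which is one way to read the abstract's roughly 15% coverage increase.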
... In order to rank the native ads for an incoming user and her specific context according to the cost per click (CPC) price type, a score (or expected revenue) is calculated by multiplying the advertiser's bid and the predicted click probability (pCTR) for each active ad. The pCTR is calculated using models that are periodically updated by Offset -a feature enhanced collaborative-filtering (CF) based event-prediction algorithm [3] [4]. Offset is a one-pass algorithm that updates its latent factor model for every new mini-batch of logged data using a stochastic gradient descent (SGD) based learning approach. ...
... Since slots are not symmetrical and some are more conspicuous than others (e.g., the first slot is much bigger on desktops, while on mobile the first slot is the only one completely visible without scrolling), it is beneficial to optimize the mapping of assets to slots in terms of CTR and revenue. The Gemini native marketplace user-interface (UI) enables advertisers to map assets to slots for carousel ads, and also to choose whether the carousel is to be presented using this mapping or to let the system optimize the mapping automatically. (Footnotes: 1. https://gemini.yahoo.com/advertiser/home; 2. where Yahoo presents its ads on a third-party site and shares the revenues with the site owner; 3. due to commercial confidentiality, only rough estimations or relative numbers regarding traffic sizes and revenues, relative performance lifts, and rescaled CTRs are provided.) ...
... The algorithm driving Gemini native models is Offset (One-pass Factorization of Feature Sets): a feature-enhanced collaborative-filtering (CF) ad click-prediction algorithm [3][4]. The predicted click-probability or click-through-rate (pCTR) of a given user u and ad a according to Offset is given by pCTR(u, a) = ...
Conference Paper
Yahoo's native advertising (also known as Gemini native) serves billions of ad impressions daily, reaching a yearly run-rate of many hundreds of millions of USD. Driving the Gemini native models for predicting both click probability (pCTR) and conversion probability (pCONV) is OFFSET - a feature-enhanced collaborative-filtering (CF) based event prediction algorithm. The predicted pCTRs are then used in Gemini native auctions to determine which ads to present for each serving event. A fast growing segment of Gemini native is Carousel ads, which include several cards (or assets) that are used to populate several slots within the ad. Since Carousel ad slots are not symmetrical and some are more conspicuous than others, it is beneficial to render assets to slots in a way that maximizes revenue. In this work we present a post-auction successive-elimination based approach for ranking assets according to their click-through rate (CTR) and rendering the carousel accordingly, placing higher-CTR assets in more conspicuous slots. After a successful online bucket test showing 8.6% CTR and 4.3% CPM (or revenue) lifts over a control bucket that uses a predefined advertiser assets-to-slots mapping, the carousel asset optimization (CAO) system was pushed to production and has been serving all Gemini native traffic since. A few months after CAO deployment, we have already measured an almost 40% increase in carousel ads revenue. Moreover, the entire revenue growth is related to CAO traffic increase due to additional advertiser demand, which demonstrates high advertiser satisfaction with the product.
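The successive-elimination step can be sketched with a standard Hoeffding-style confidence bound (an assumption; the paper's exact bound and procedure may differ): an asset whose upper confidence bound on CTR falls below the best asset's lower bound is eliminated from contention for conspicuous slots.

```python
import math

def eliminate(trials, clicks, delta=0.05):
    """Successive-elimination round: keep only assets whose CTR upper
    confidence bound reaches the best asset's lower confidence bound.
    trials/clicks map asset -> impression and click counts."""
    def radius(n):
        # Hoeffding confidence radius at confidence level 1 - delta
        return math.sqrt(math.log(2.0 / delta) / (2.0 * n))
    means = {a: clicks[a] / trials[a] for a in trials}
    best_lower = max(means[a] - radius(trials[a]) for a in trials)
    return {a for a in trials if means[a] + radius(trials[a]) >= best_lower}

trials = {"asset1": 10000, "asset2": 10000, "asset3": 10000}
clicks = {"asset1": 500, "asset2": 480, "asset3": 100}
print(sorted(eliminate(trials, clicks)))  # ['asset1', 'asset2']
```

Here asset3 (CTR 1%) is confidently worse than asset1 (CTR 5%) and is eliminated, while asset1 and asset2 remain statistically indistinguishable and both survive to gather more impressions.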