{"id":37760,"date":"2021-06-05T00:15:11","date_gmt":"2021-06-04T18:45:11","guid":{"rendered":"https:\/\/www.algonomy.com.br\/?p=37760"},"modified":"2021-06-26T13:12:44","modified_gmt":"2021-06-26T07:42:44","slug":"solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp","status":"publish","type":"post","link":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/","title":{"rendered":"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\"><p align=\"justify\"><strong>A deep dive into how we leverage neural networks to create relevant cross-sell recommendations<\/strong><\/p><\/h2>\n\n\n\n<p>Have you ever wondered how, as you are shopping online, adding items to your cart, the website seems to suggest products that perfectly complement what you have already bought or added to your cart? This is known as cross-sell: recommending additional items that complement the products you are currently considering.&nbsp;<\/p>\n\n\n\n<p>Traditionally, cross-sell has been done using a method called collaborative filtering or statistical \u201cwisdom of the crowd\u201d models. These methods are great in that they are relatively simple and fast. Unfortunately, they do not work well for new products or categories of products that do not get much traffic, i.e. the long tail, because they cannot make recommendations for products without sufficient view or purchase data. 
Methods using more abundant data (like views or clicks) or content-based unsupervised methods can reliably provide similar product recommendations, but fail to capture the complex associations between products that make for good cross-sell.&nbsp;<\/p>\n\n\n\n<p><strong>Natural Language Processing (NLP) for Cross-Sell<\/strong><\/p>\n\n\n\n<p>NLP is a branch of linguistics and artificial intelligence that uses machine learning to analyze and convert natural language into useful representations or insights.&nbsp;<\/p>\n\n\n\n<p>At Algonomy, we use NLP in many ways. One of them is extracting structured information from unstructured text, for instance gathering product features or metadata from a product\u2019s description (a topic for another blog post). But here we will focus on our deep learning NLP cross-sell models, hereafter shortened to \u201c<strong>DeepRecs NLP<\/strong>.\u201d<\/p>\n\n\n\n<p><strong>What Data Do NLP Models Use?&nbsp;<\/strong><\/p>\n\n\n\n<p>Rather than using Product IDs as input, like a collaborative filtering or statistics-based method would, DeepRecs NLP uses natural language and structured metadata, such as product descriptions, reviews, brands, or categories. This allows the model to uncover patterns in the language used to describe a product and therefore inform cross-sell. More importantly, it allows the models to reach products that have rarely been viewed or bought, e.g. new or long-tail products. These are products with relatively little purchase (or view) data, as shown in Figure 1.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh5.googleusercontent.com\/YQcn465cxCnYnkgXCeWB1tg7xHq6TY78-j1nUsbTe9gaXGy5oNDbrVaawLCRLleIxQeBcu5HmAVsZm6QviifmVDTjsM-FDfUyWRpDZlWSPw6tQzIJ3HAljROmNXQH6S32MuxfFKz\" alt=\"\"\/><\/figure>\n\n\n\n<p class=\"has-small-font-size\"><em>Figure 1. Plot illustrating product long tail. 
The majority of products have relatively little purchase data, making it difficult to use traditional recommendation strategies. Purchase count was capped at 1000 for plot clarity, though in some cases the actual purchase count was much higher<\/em>.<\/p>\n\n\n\n<p>We leverage shopper data to build neural networks (a specific type of machine learning model, broadly encompassed by the field of deep learning). These networks consider broader patterns of shopper behavior by learning from pairs of products that co-occur. For example, products that were purchased by the same user, within a given time window, are \u201cconnected.\u201d Based on the number of times a product pair co-occurs across <em>all <\/em>users, each pair is assigned a score between 0 and 1, with a higher score denoting better cross-sell fit. Specifically,<\/p>\n\n\n\n<figure class=\"wp-block-image is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/lh4.googleusercontent.com\/mpKmQ9tQbHXL7VOpLR7e50lJF53XkAiE7eOzdRTT3Mu-3h1vBWhViCAf30iNEPnG_wB4EIDG3380LQ8Wj4lNHwpGxGnfZxAeUqfyS7_F4An0OjwC2yWUhdj9L8Y1uPObbVvo0Faz\" alt=\"\" width=\"200\" height=\"45\"\/><\/figure>\n\n\n\n<p>where <a href=\"https:\/\/www.codecogs.com\/eqnedit.php?latex=C_%7Bp_1%7D%5E%7Bp_2%7D#0\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/lh3.googleusercontent.com\/ageYZckJxXgWOOKGupupePUsVCbo7x7FvuRVkBrSz0GEvRhKu_n1rv_hvrkHVrTHZosXuK4L6QOVQohoWEiXEUivgarw3zM1L0MaGF0JIQAj_Gm65ay4w8JE9YBaKC4MFSoSgzNs\" width=\"20\" height=\"16\"><\/a> is the co-occurrence count of product&nbsp;<em>p2&nbsp;<\/em>with product&nbsp;<em>p1<\/em>.<\/p>\n\n\n\n<p>From these product pairs, a seed product and a recommendation (rec) product are defined. Each has associated information, such as a name, brand, description(s), or reviews. We normalize the text data. 
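The co-occurrence pairing and scoring described above can be sketched as follows. The session data is hypothetical, and the smoothed ratio stands in for Equation 1, whose exact form is shown in the image; the stand-in only preserves the key properties (monotone in the count, bounded between 0 and 1):

```python
from collections import Counter
from itertools import combinations

# Hypothetical sessions: each list is one user's purchases in a time window.
sessions = [
    ["tent", "sleeping-bag", "lantern"],
    ["tent", "sleeping-bag"],
    ["tent", "lantern"],
]

# Count how often each unordered product pair co-occurs across all users.
pair_counts = Counter()
for items in sessions:
    for p1, p2 in combinations(sorted(set(items)), 2):
        pair_counts[(p1, p2)] += 1

# Map raw counts to a 0-1 training target (assumed form, not Equation 1).
K = 1.0
def target_score(p1, p2):
    c = pair_counts[tuple(sorted((p1, p2)))]
    return c / (c + K)

print(target_score("tent", "sleeping-bag"))   # co-occurs twice
print(target_score("sleeping-bag", "lantern"))  # co-occurs once
```

Pairs that co-occur more often get targets closer to 1, and pairs that never co-occur score 0, which is the shape the training signal needs.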
From there, based on the input type, we might process inputs in different ways &#8211; more on that after we look at the results our clients are seeing.<\/p>\n\n\n\n<p><strong>Impact on eCommerce Metrics&nbsp;<\/strong><\/p>\n\n\n\n<p>DeepRecs NLP has consistently shown increases in key metrics our clients care about, such as click-through rate (CTR), average order value (AOV), and revenue per visit (RPV). In A\/B tests for five clients comparing slates of recommendation strategies with and without DeepRecs NLP, the treatment side (with DeepRecs NLP) showed a statistically significant increase in CTR for all five. In most cases, DeepRecs NLP also showed increases in AOV and RPV, with one site obtaining a 7.296% increase in RPV!&nbsp;<\/p>\n\n\n\n<p>Many clients were so convinced by the initial results from DeepRecs NLP that they skipped the A\/B test and immediately made it live to all traffic in key places like the Product Detail and Add-to-Cart pages. In one month alone, DeepRecs NLP provided over 14.5 million viewed recommendations across 20 sites worldwide.<\/p>\n\n\n\n<p><strong>DeepRecs NLP Model Architecture<\/strong><\/p>\n\n\n\n<p>In our DeepRecs NLP system, we use two identical \u201cstacks\u201d of neural networks, one for seed products and another for rec products. In this section, we detail some of the specifics of our model architecture.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh5.googleusercontent.com\/X7nZs7VFbs1Fk5XbgMcQdMUSFubB1ao6ivIWMtlkSrxBe1TViJBHtBem4VAjXjElsGnhKGP0T7kyjPhI5Id7zWwaZTpXhvglzLTTvwDsTCgrZTBmVKYwKG_Kz1zboKshNBHEfGZR\" alt=\"\"\/><\/figure>\n\n\n\n<p class=\"has-small-font-size\"><em>Figure 2. DeepRecs NLP model architecture. Orange boxes represent inputs and grey ovals represent model layers or interchangeable components. 
The green and gold rectangles represent intermediate seed and recommendation \u201cstack\u201d vectors, respectively.<\/em><\/p>\n\n\n\n<p>Each input feature gets <em>encoded <\/em>in different ways, based on its type. For instance, for discrete categorical data, e.g. brand, we learn a dense \u201cembedding\u201d vector representation for each unique value, or a \u201cmulti-hot\u201d embedded vector representation if there are multiple categories (a vector is basically a series of numbers like [0.52, 0.02, 0.97, 0.08, 0.55]). For unstructured text input we might \u201cembed\u201d each word and run the words through an Attention layer <sup>[1][2]<\/sup>, a mechanism for combining vectors that focuses on important values, to get a feature-level representation. Alternatively, we might run the raw text through a pre-trained BERT model <sup>[3]<\/sup>, followed by one or more fully-connected neural network layers to capture cross-sell-specific relationships from the pre-trained model.<\/p>\n\n\n\n<p>The result is that we have vectors of the same size for each input. A benefit of processing inputs in this way is that we can ingest an arbitrary number of inputs, determined by the client. We then combine them all (usually with a weighted sum or another Attention layer) and optionally add more fully-connected layers to get a final product vector.<\/p>\n\n\n\n<figure class=\"wp-block-video\">\n\n<video controls=\"\">\n<source src=\"http:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/video.mp4\" type=\"video\/mp4\"><\/video>\n\n<figcaption><em>Visualization of a product vector space in the children\u2019s clothing domain, generated using TensorBoard.<\/em><\/figcaption><\/figure>\n\n\n\n<p>To produce a score between seed and rec products, we take the cosine similarity of the normalized final product vectors. For training, this similarity score is compared to the score from Equation 1 using a hinge loss. 
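A minimal sketch of this scoring step, with random vectors standing in for the outputs of the seed and rec stacks. The product names, margin value, and exact hinge form are assumptions; the only stated facts are that normalized vectors are compared by cosine similarity against the Equation 1 target with a hinge loss:

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(v):
    # Scale a vector to unit length so cosine similarity is a dot product.
    return v / np.linalg.norm(v)

# Stand-ins for the final product vectors produced by the two stacks.
seed_vec = l2_normalize(rng.normal(size=16))
rec_vecs = {name: l2_normalize(rng.normal(size=16))
            for name in ["rain-jacket", "umbrella", "flip-flops"]}

def score(u, v):
    # Cosine similarity of already-normalized vectors.
    return float(u @ v)

def hinge_loss(pred_sim, target, margin=0.1):
    # One plausible hinge-style penalty: zero whenever the predicted
    # similarity is within `margin` of the Equation 1 target.
    return max(0.0, abs(pred_sim - target) - margin)

# At prediction time, rank candidate rec products by similarity to the seed.
ranked = sorted(rec_vecs, key=lambda n: -score(seed_vec, rec_vecs[n]))
print(ranked)
```

In production the rec vectors would typically be precomputed for the whole catalog, so ranking against a seed reduces to a batch of dot products.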
At prediction time, the similarity score can be used to rank pairs of products for cross-sell.<\/p>\n\n\n\n<p>Each of the \u201cusual\u201d model options mentioned above is automatically determined during hyper-parameter optimization based on the complexity of the site catalog and loss metrics on a validation dataset. This allows each of our clients to have a custom, personalized model that is best suited to their catalog.<\/p>\n\n\n\n<p><strong>Mitigating Inherent Model Bias<\/strong><\/p>\n\n\n\n<p>BERT has been a breakthrough in the world of NLP for many tasks, including our cross-sell recommendation systems. Because they were trained on a lot of data from the internet, pre-trained BERT models can process just about any text in over 100 languages into a meaningful, dense representation that can be used in downstream tasks without the costs of training from scratch.&nbsp;<\/p>\n\n\n\n<p>Unfortunately, training on internet text also means that the models implicitly encode potentially harmful language bias. Research suggests that BERT introduces language bias into systems that use it <sup>[4][5][6]<\/sup>. For example, when used in a Natural Language Generation system, it might always associate \u201cDoctor\u201d with male pronouns and \u201cNurse\u201d with female pronouns.<\/p>\n\n\n\n<p>In the context of cross-sell, consider children\u2019s clothing and toys. It is a highly gendered catalog space because that is what society and shoppers reinforce through norms and macro-patterns of purchases. If you add the gender bias of BERT into those interactions, it is not hard to imagine a scenario where NLP cross-sell disproportionately recommends a kitchen play-set or Barbie doll for a unicorn blouse seed, but swim trunks or a backyard science kit for a shark graphic tee seed. 
Although contrived, this example demonstrates how training data can inject unfair bias into recommendations, ultimately impacting the interests or opportunities of different groups of people. At Algonomy, we are actively researching how large an issue this is in our models and how such bias can be mitigated.<\/p>\n\n\n\n<p><strong>Conclusion&nbsp;<\/strong><\/p>\n\n\n\n<p>Cross-sell models that understand industry context (in this case retail) and address domain-specific challenges like the catalog long tail can create high business value. At Algonomy, we leverage deep learning techniques, as well as shopper history, patterns, and insights, to improve the relevance of cross-sell, enhance the shopping experience, and grow metrics such as AOV and RPV.<\/p>\n\n\n\n<p><strong>References<\/strong><\/p>\n\n\n\n<p>1. <a href=\"http:\/\/www.stanford.edu\/~lmthang\">Minh-Thang Luong<\/a>, <a href=\"http:\/\/cs.stanford.edu\/~hyhieu\">Hieu Pham<\/a>, and <a href=\"http:\/\/nlp.stanford.edu\/~manning\/\">Christopher D. Manning<\/a>. 2015. Effective Approaches to Attention-based Neural Machine Translation. In <em>Empirical Methods in Natural Language Processing (EMNLP)<\/em>.<\/p>\n\n\n\n<p>2. Nikolas Adaloglou. 2020. How Attention works in Deep Learning: understanding the attention mechanism in sequence models. <a href=\"https:\/\/theaisummer.com\/attention\/\">https:\/\/theaisummer.com\/attention\/<\/a><\/p>\n\n\n\n<p>3. Devlin, Jacob, et al. &#8220;BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.&#8221; <em>arXiv preprint arXiv:1810.04805<\/em> (2018).<\/p>\n\n\n\n<p>4. Tan, Y. and L. E. Celis. \u201cAssessing Social and Intersectional Biases in Contextualized Word Representations.\u201d <em>NeurIPS<\/em> (2019).<\/p>\n\n\n\n<p>5. Kurita, Keita et al. &#8220;Measuring Bias in Contextualized Word Representations.&#8221; Proceedings of the First Workshop on Gender Bias in Natural Language Processing (2019). 
Association for Computational Linguistics.<\/p>\n\n\n\n<p>6. Bender, Emily M. et al. &#8220;On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? \ud83e\udd9c.&#8221; Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. Association for Computing Machinery.<\/p>\n","protected":false},"excerpt":{"rendered":"<span class=\"span-reading-time rt-reading-time\" style=\"display: block;\"><span class=\"rt-label rt-prefix\">Reading Time: <\/span> <span class=\"rt-time\"> 6<\/span> <span class=\"rt-label rt-postfix\">minutes<\/span><\/span>A deep dive into how we leverage neural networks to create relevant cross-sell recommendations Have you ever wondered how, as you are shopping online, adding items to your cart, the website seems to suggest products that perfectly complement what you&#8230;","protected":false},"author":20,"featured_media":38744,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"inline_featured_image":false,"footnotes":""},"categories":[150],"tags":[159,158,162,160,161,163],"topics":[],"industry":[],"product":[],"class_list":["post-37760","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-personalization","tag-ai","tag-artificial-intelligence","tag-deep-learning","tag-machine-learning","tag-ml","tag-nlp"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.4 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP - Algonomy<\/title>\n<meta name=\"description\" content=\"Find out how to use deep learning and natural language processing to make cross-sell recommendations that are meaningful for the individual customer.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" 
href=\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP - Algonomy\" \/>\n<meta property=\"og:description\" content=\"Find out how to use deep learning and natural language processing to make cross-sell recommendations that are meaningful for the individual customer.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/\" \/>\n<meta property=\"og:site_name\" content=\"Algonomy\" \/>\n<meta property=\"article:published_time\" content=\"2021-06-04T18:45:11+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-06-26T07:42:44+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2048\" \/>\n\t<meta property=\"og:image:height\" content=\"1411\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Neal Digre\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Neal Digre\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/\",\"url\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/\",\"name\":\"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP - Algonomy\",\"isPartOf\":{\"@id\":\"https:\/\/www.algonomy.com.br\/en\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg\",\"datePublished\":\"2021-06-04T18:45:11+00:00\",\"dateModified\":\"2021-06-26T07:42:44+00:00\",\"author\":{\"@id\":\"https:\/\/www.algonomy.com.br\/en\/#\/schema\/person\/7615ccf7e0ea5b6046938eac5ea83c80\"},\"description\":\"Find out how to use deep learning and natural language processing to make cross-sell recommendations that are meaningful for the individual 
customer.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#primaryimage\",\"url\":\"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg\",\"contentUrl\":\"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg\",\"width\":2048,\"height\":1411,\"caption\":\"Machine learning 3 step infographic, artificial intelligence, Machine learning and Deep learning flat line vector banner with icons on white background.\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.algonomy.com.br\/en\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.algonomy.com.br\/en\/#website\",\"url\":\"https:\/\/www.algonomy.com.br\/en\/\",\"name\":\"Algonomy\",\"description\":\"Redo 
Digital\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.algonomy.com.br\/en\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.algonomy.com.br\/en\/#\/schema\/person\/7615ccf7e0ea5b6046938eac5ea83c80\",\"name\":\"Neal Digre\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.algonomy.com.br\/en\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/Neal-96x96.jpeg\",\"contentUrl\":\"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/Neal-96x96.jpeg\",\"caption\":\"Neal Digre\"},\"description\":\"Neal Digre is a Senior Data Scientist at Algonomy, where his current focus is applying deep learning and natural language processing to push the performance of recommendation strategies and provide tools for enriching catalog metadata. These interests and more are driven by his passion to explore the amazing complexities of human language, fostered through degrees in Linguistics (BA) and Computer Science (BS) from Western Washington University, a MSc in Informatics from the University of Edinburgh, and previous work as an intern at the National Institute of Informatics (Tokyo, Japan). In non-pandemic times, he enjoys watching movies on airplanes and curling (that funny sport on ice).\",\"url\":\"https:\/\/www.algonomy.com.br\/en\/author\/neal\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP - Algonomy","description":"Find out how to use deep learning and natural language processing to make cross-sell recommendations that are meaningful for the individual customer.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/","og_locale":"en_US","og_type":"article","og_title":"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP - Algonomy","og_description":"Find out how to use deep learning and natural language processing to make cross-sell recommendations that are meaningful for the individual customer.","og_url":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/","og_site_name":"Algonomy","article_published_time":"2021-06-04T18:45:11+00:00","article_modified_time":"2021-06-26T07:42:44+00:00","og_image":[{"width":2048,"height":1411,"url":"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg","type":"image\/jpeg"}],"author":"Neal Digre","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Neal Digre","Est. 
reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/","url":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/","name":"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP - Algonomy","isPartOf":{"@id":"https:\/\/www.algonomy.com.br\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#primaryimage"},"image":{"@id":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#primaryimage"},"thumbnailUrl":"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg","datePublished":"2021-06-04T18:45:11+00:00","dateModified":"2021-06-26T07:42:44+00:00","author":{"@id":"https:\/\/www.algonomy.com.br\/en\/#\/schema\/person\/7615ccf7e0ea5b6046938eac5ea83c80"},"description":"Find out how to use deep learning and natural language processing to make cross-sell recommendations that are meaningful for the individual 
customer.","breadcrumb":{"@id":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#primaryimage","url":"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg","contentUrl":"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/personalization-deep-learning-nlp-scaled-1.jpg","width":2048,"height":1411,"caption":"Machine learning 3 step infographic, artificial intelligence, Machine learning and Deep learning flat line vector banner with icons on white background."},{"@type":"BreadcrumbList","@id":"https:\/\/www.algonomy.com.br\/en\/blogs\/personalization\/solving-the-limitations-of-traditional-cross-sell-models-with-deep-learning-and-nlp\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.algonomy.com.br\/en\/"},{"@type":"ListItem","position":2,"name":"Solving the Limitations of Traditional Cross-Sell Models with Deep Learning and NLP"}]},{"@type":"WebSite","@id":"https:\/\/www.algonomy.com.br\/en\/#website","url":"https:\/\/www.algonomy.com.br\/en\/","name":"Algonomy","description":"Redo 
Digital","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.algonomy.com.br\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.algonomy.com.br\/en\/#\/schema\/person\/7615ccf7e0ea5b6046938eac5ea83c80","name":"Neal Digre","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.algonomy.com.br\/en\/#\/schema\/person\/image\/","url":"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/Neal-96x96.jpeg","contentUrl":"https:\/\/www.algonomy.com.br\/wp-content\/uploads\/2021\/06\/Neal-96x96.jpeg","caption":"Neal Digre"},"description":"Neal Digre is a Senior Data Scientist at Algonomy, where his current focus is applying deep learning and natural language processing to push the performance of recommendation strategies and provide tools for enriching catalog metadata. These interests and more are driven by his passion to explore the amazing complexities of human language, fostered through degrees in Linguistics (BA) and Computer Science (BS) from Western Washington University, a MSc in Informatics from the University of Edinburgh, and previous work as an intern at the National Institute of Informatics (Tokyo, Japan). 
In non-pandemic times, he enjoys watching movies on airplanes and curling (that funny sport on ice).","url":"https:\/\/www.algonomy.com.br\/en\/author\/neal\/"}]}},"_links":{"self":[{"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/posts\/37760","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/users\/20"}],"replies":[{"embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/comments?post=37760"}],"version-history":[{"count":17,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/posts\/37760\/revisions"}],"predecessor-version":[{"id":38405,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/posts\/37760\/revisions\/38405"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/media\/38744"}],"wp:attachment":[{"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/media?parent=37760"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/categories?post=37760"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/tags?post=37760"},{"taxonomy":"topics","embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/topics?post=37760"},{"taxonomy":"industry","embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/industry?post=37760"},{"taxonomy":"product","embeddable":true,"href":"https:\/\/www.algonomy.com.br\/en\/wp-json\/wp\/v2\/product?post=37760"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}