The age of machines is upon us – or, rather, we’ve been living in it for a while now. With the rise of Google’s RankBrain, self-driving cars, and companies in numerous industries pumping money into the development of robotics and artificial intelligence, it certainly appears that humanity’s end days are nigh. Software and robots have proven to be ever-more-efficient in a variety of fields – mostly service-focused and manual labor-oriented – and have eliminated positions for plenty of our flesh-and-blood-based brothers and sisters. Now, though, the machines have invaded a field not thought likely: content creation. Does this mark the end for writers across the globe? Will the next generation of CEOs oversee bullpens of new, chrome-faced fiends?
The answer is an emphatic “no.” Media outlets like the Associated Press and Yahoo! Sports have already been using automated content tools to produce summaries of things like financial reports and football games. These formulaic pieces would have otherwise occupied the valuable time of journalists. Rather than heralding the end of human writers, programs like Automated Insights allow them to focus on richer, more in-depth pieces. To put it simply, automated content tools take the jobs no writer wants, freeing writers up to compose the soulful stories that no robot could ever write. It’s precisely those in-depth stories that carry the most weight with both users and search engines.
But while your writers are safe, the growing popularity of tools like Wordsmith does raise some interesting concerns about how automated content will affect SEO going forward.
In the most recent development, Automated Insights released a public beta of their content automation platform, Wordsmith. By utilizing structured data, the tool can generate thousands of pieces of content in very little time. The software is highly flexible and can build a wide array of narratives, from election results to product descriptions and even “Game of Thrones” battle recaps. However, most of this content has a very short shelf life. Few investors will bookmark a company’s Q3 financials and come back in a year. The same goes for sports scores.
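To make the structured-data idea concrete, here is a minimal sketch in Python of how a formulaic recap can be rendered from a data record. The field names and template are invented for this illustration; this is the general template-filling approach, not Wordsmith’s actual implementation.

```python
# Toy illustration: turning structured data into a formulaic recap.
# Field names and phrasing are invented for this example.

def game_recap(stats):
    """Render a one-sentence recap from a structured stats record."""
    margin = abs(stats["home_score"] - stats["away_score"])
    verb = "edged" if margin <= 3 else "defeated"
    winner, loser = (
        (stats["home_team"], stats["away_team"])
        if stats["home_score"] > stats["away_score"]
        else (stats["away_team"], stats["home_team"])
    )
    return (
        f"{winner} {verb} {loser} "
        f"{max(stats['home_score'], stats['away_score'])}-"
        f"{min(stats['home_score'], stats['away_score'])} on {stats['date']}."
    )

print(game_recap({
    "home_team": "Eagles", "away_team": "Giants",
    "home_score": 27, "away_score": 10, "date": "Sunday",
}))
# → Eagles defeated Giants 27-10 on Sunday.
```

A real platform layers many templates, synonym choices, and data checks on top of this, but the core mechanic is the same: clean structured input in, formulaic narrative out. That is also why the output has a short shelf life and reads as repetitive at scale.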
Automated content also runs the risk of being duplicative. Devoid of critical analysis, financial statements posted on Site A will look a lot like those on Site B. Content automation is not a quality play – it’s all about quantity. Modern SEO is exactly the opposite.
For SEOs, the idea of Wordsmith may be a source of uncertainty and confusion, especially given Google’s stance on automatically-generated content. While automatically-generated content is listed in Google’s Webmaster Guidelines as a technique that can get a site penalized, the examples the guidelines cite are unlikely to rank to begin with. Software programs are not writing 1,500-word reflections on the state of the tech industry – they’re pushing updates on dividends.
So what can SEOs take away from this new wave of automatically-generated content? Platforms like Wordsmith offer a means to create loads of digestible information in less time than ever before. Time-consuming, data-driven content can be generated in an instant, leaving writers to focus on things that can affect your overall content authority.
We’re not quite at the point where writers can start being thrown out of windows in droves, as there are limits to what these tools can do. The platform authored by Automated Insights can only derive stories from structured data provided by people, and its output isn’t error-free. Most important of all, the software has no soul. Of course, by this I don’t mean some vaguely-defined, metaphysical essence (although it certainly doesn’t have that, either), but the range of emotions and creative energy that flows through human wordsmiths. The boundaries of the program’s abilities become more obvious once one has seen several pieces of its work, which start to read as repetitive and simplistic.
For SEOs, automated content generators aren’t nearly as dangerous as might be assumed. While some may accuse us of whistling past the graveyard on this issue, Google’s continued insistence on long-form, insightful content creates a high hill for automated assistants to climb. So let your SEOs continue to do what they do best: strategize and execute. Robotic writers are not going to serve as useful employees.
…Until the inevitable uprising, of course. But, hey, who’s counting the days until that? Beep-boop.
There’s nothing more frustrating to search marketers than receiving only a handful of clicks after spending countless hours researching and building out thousands, sometimes hundreds of thousands, of campaign keywords. For advertisers looking to expand lower-funnel traffic without all that work, there is now a viable solution—Google’s Dynamic Search Ads.
Google’s Dynamic Search Ads do not use keywords to target consumers, but rather target based on an advertiser’s website content. Hence, the more pages available with rich content to crawl, the higher the chances your ads will map to long-tail queries and drive a good amount of incremental volume. This targeting option thus provides advertisers with an automated and efficient way to serve ads, minimizing the need for exhaustive, time-consuming, difficult-to-manage, minimal-payoff long-tail keyword expansions.
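To picture content-based targeting, here is a deliberately simplified Python sketch: given a query and a set of crawled pages, pick the page whose text shares the most words with the query. The page data and scoring are invented for illustration; Google’s actual matching is far more sophisticated.

```python
# Toy illustration of content-based ad targeting: map a query to the
# best-matching page by word overlap. A simplification, not Google's
# actual algorithm.

def best_landing_page(query, pages):
    """pages: list of (url, title, body_text) tuples. Returns the
    (url, title) whose text shares the most words with the query."""
    q_words = set(query.lower().split())

    def score(page):
        url, title, body = page
        page_words = set((title + " " + body).lower().split())
        return len(q_words & page_words)

    url, title, _ = max(pages, key=score)
    return url, title

pages = [
    ("/red-running-shoes", "Red Running Shoes",
     "lightweight red running shoes for road races"),
    ("/hiking-boots", "Hiking Boots",
     "waterproof boots for mountain trails"),
]
print(best_landing_page("red shoes for running", pages))
# → ('/red-running-shoes', 'Red Running Shoes')
```

Even in this toy form, the key property is visible: no keyword list exists anywhere. The richer and more distinct each page’s content, the better the query-to-page mapping, which is exactly why thin or fast-changing pages make poor DSA candidates.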
Haven’t Dynamic Search Ads been around for a while now? Why post about them now?
Dynamic Search Ads, or DSAs, have indeed been around for a couple of years, but some advertisers are still reluctant to give them a try. Why this slow adoption rate? Possibly because advertisers remain wary of the limited control DSAs offer in terms of which queries are mapped to ads, which landing pages Google then sends traffic to via those ads, and which search terms dynamically populate those ad headlines.
Yes, *spoiler alert*: Dynamic Search Ads use dynamic keyword insertion to populate ad headlines, which means that in order to run these ads at all, you must relinquish control to Google’s algorithm. Some big-brand advertisers concerned about brand messaging avoid Dynamic Search Ads for this reason alone.
So how is it that Google’s DSA campaigns can drive a decent amount of incremental traffic from long-tail queries when seasoned search marketers are only able to drive a small number of clicks?
Roughly 15% of all searches are new queries—keywords, phrases, or questions that Google has never seen typed before. With that in mind, even the most experienced search marketers cannot possibly guess every single query their target audiences will type into a search bar; they can’t predict the future. That’s where Dynamic Search Ads can be of great use. Because Dynamic Search Ads use Google’s organic web-crawling technology, they can capture these never-before-seen queries and serve up relevant ads based on an advertiser’s site content.
How does Google factor in quality score when it comes to Dynamic Search Ads?
How Google factors in quality score for Dynamic Search Ads campaigns is somewhat of a black box. There is no real way to view quality score for these kinds of campaigns, since no specific keywords are targeted in the account. Does this mean that DSA is exempt from quality score? Not likely. However, think about how quality score is calculated in the first place: it measures the relevance of an ad to its subsequent landing-page content. If Google displays Dynamic Search Ads by crawling advertisers’ site content and then mapping a query to that crawled content, the experience stands to be innately relevant, which goes a long way toward easing quality-score concerns.
What should I expect from my Dynamic Search Ad campaigns in terms of performance levels?
Dynamic Search Ads will take longer to ramp up than a typical keyword-targeted campaign, as Google’s technology needs time to crawl and understand your site. Expect two to three weeks of crawling before you have a clear picture of how much traffic a DSA campaign will drive. Bids will also affect traffic levels, so be willing to increase them if your initial bids allow the campaign to capture only a small amount of impression share.
*Added bonus* Another advantage of Dynamic Search Ads is more-efficient CPCs. DSA CPCs are typically lower than in keyword-targeted campaigns, since by the campaign’s very nature the matched queries are long-tail, searched infrequently, and therefore less competitive. CTRs also tend to be higher than in regular keyword-targeted campaigns, due both to the long-tail nature of the search terms and to the dynamic keyword insertion in the ads, which makes for highly relevant, customized copy.
Are there any signs that might point to Dynamic Search Ads as a good option for my brand?
If your website has a substantial number of pages with static, robust content, then Dynamic Search Ads could very well be the answer to your long-tail strategy. Conversely, if you have a small number of pages with little, or frequently changing, content, DSA may not be the right fit. In any case, Dynamic Search Ads should always run alongside other campaigns and be used simply as a catch-all for relevant searched-on terms that you are not already actively bidding on in your account.
While Dynamic Search Ads are certainly not the be-all and end-all product to use when striving to reach any advertiser’s long-tail traffic goals, they do offer an effective way to drive incremental volume, while scaling long-tail expansions in an efficient and automated way. This automation can additionally free up your search marketing team to focus further optimization efforts on the higher-volume keywords driving the bulk of clicks and spend. Sounds like a Grade A time-management strategy to us!
RankBrain, Google’s newly launched artificial intelligence component, which works in conjunction with the search engine’s Hummingbird algorithm, was announced late last month – setting SEO experts spiraling into a frenzy. While to most, Google’s RankBrain announcement felt like a huge surprise, we’ve taken the time to backtrack, and in fact have found two major events that foreshadowed the announcement of this artificial intelligence system now live on Google Search.
First, Eric Schmidt, the Executive Chairman of Alphabet Inc., was quoted on March 16th:
“I think the biggest trend is going to be the use of machine intelligence of large data sets to solve every problem,” Schmidt said. “I cannot think of a field of study, a field of research — whether it is English, soft sciences, hard sciences or any corporation — that can’t become far more efficient, far more powerful, far more clever.” (McFarland, 2015)
This quote appears prophetic, creating anticipation of vast changes to revolutionize Alphabet’s many companies. Eric Schmidt, the former CEO of Google and now Executive Chairman of Alphabet, has spent many months trying to assure both national and international business interests, including public and private institutions, that the potential impacts of artificial intelligence, also called machine learning, on the world remain largely misunderstood. Then on September 28th of this year, Wired revealed another landmark Google announcement – that the D-Wave quantum computer Google operates is getting a major upgrade.
“GOOGLE IS UPGRADING its quantum computer. Known as the D-Wave, Google’s machine is making the leap from 512 qubits—the fundamental building block of a quantum computer—to more than a 1000 qubits. And according to the company that built the system, this leap does not require a significant increase in power, something that could augur well for the progress of quantum machines.” (Metz, 2015)
This change signals a deepening integration between hardware and the software behind Google’s live artificial intelligence, aka machine learning. If the upgrade bears out as described, Google can scale this computing power many times over with little need to expand its power infrastructure.
Wide-ranging coverage of RankBrain in numerous articles, blog posts, and tweets is now live all across the web. Moreover, like all multifaceted updates to Google Search, this change comes with numerous intricate details. I quickly realized that throughout this press storm, some critical facts have been misquoted or lost in translation.

RankBrain’s Launch Full of Press Surprises
Sundar Pichai, Google’s Chief Executive Officer, made a startling announcement on October 22nd at Alphabet’s earnings call, saying, “machine learning is a core transformative way by which we are rethinking everything we are doing” (Clark, Google Turning Its Lucrative Web Search Over to AI Machines, 2015). Google has never been so bold as to announce that artificial intelligence, aka machine learning, would take such an immediate and active role in search, let alone other parts of the company. Moreover, the surprises kept coming when, a few days later, Bloomberg Business interviewed Greg Corrado, a senior research scientist who revealed that a major update to Google.com had gone live a few months earlier. The conversation revealed that a new component in the search algorithm now assists in delivering results for a large portion of national and international searches. He christened this independent software “RankBrain.”
“For the past few months, a “very large fraction” of the millions of queries a second that people type into the company’s search engine have been interpreted by an artificial intelligence system, nicknamed RankBrain, said Greg Corrado, a senior research scientist with the company, outlining for the first time the emerging role of AI in search.” (Clark, Google Turning Its Lucrative Web Search Over to AI Machines, 2015)
This announcement is consistent with Google’s habit of announcing significant changes to its algorithm after the fact. However, with slight prodding Google uncharacteristically gave an interview offering answers to relatively direct questions. Greg went on to say, “…in the few months it has been deployed, RankBrain has become the third-most important signal contributing to the result of a search query.” (Clark, Google Turning Its Lucrative Web Search Over to AI Machines, 2015)
The final, and I believe most stunning revelation that came out of the Bloomberg Business article was a quote detailing how accurate RankBrain has become at matching search results with queries.
“Google search engineers, who spend their days crafting the algorithms that underpin the search software, were asked to eyeball some pages and guess which they thought Google’s search engine technology would rank on top. While the humans guessed correctly 70 percent of the time, RankBrain had an 80 percent success rate.” (Clark, Google Turning Its Lucrative Web Search Over to AI Machines, 2015)
Information experts and marketers have long known of Google’s (now Alphabet’s) mission to develop and launch artificial intelligence on Google.com. However, this interview reveals how far ahead of the competition Google has gotten. We now know that its AI is live, and it is learning within the machine at Google.com. I recommend following all the sources and reading every article carefully from this point on, so that we can each form our own insights and questions.
That being said, here are a few more insights I’ve gathered from relevant sources thus far.

The Bloomberg Business Video Interview & Jack Clark Article Insights
Bloomberg Business published an article which I feel is the trumpet announcing that Google’s AI is live and that Google’s yearlong effort to launch the update has been successful. Or, in the words of the interviewed Google engineer: “I was surprised,” Corrado said. “I would describe this as having gone better than we would have expected” (Clark, Google Turning Its Lucrative Web Search Over to AI Machines, 2015). Here are a few key takeaways.
Bill Slawski’s RankBrain coverage stands apart based on his thorough and extensive experience exploring technology patents. In his latest article, Investigating Google RankBrain and Query Term Substitutions, Slawski examines Kedar Dhamdhere’s patent titled Using Concepts as Contexts for Query Term Substitutions. It is relevant to note that one of the inventors of this patent, Thomas Strohmann, has written multiple papers on machine learning – Sparse Greedy Minimax Probability Machine Classification and A Formulation for Minimax Probability Machine Regression, to name two.
The patent reveals a treasure trove of insights into how RankBrain is an independent, autonomous entity that stands outside the traditional Google algorithm factors.
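As a toy sketch of the general idea behind concept-driven query term substitution (not the patent’s actual method), consider how the other words in a query can supply the concept that selects a substitution for an ambiguous term. The concept table and substitutions below are invented for illustration:

```python
# Toy sketch of concept-driven query term substitution. The concept
# vocabulary and substitution table are invented; the patent's actual
# method is far more involved.

SUBSTITUTIONS = {
    # (term, concept) -> preferred substitute
    ("ny", "travel"): "new york city",
    ("ny", "finance"): "new york stock exchange",
}

CONCEPT_WORDS = {
    "travel": {"hotels", "flights", "attractions"},
    "finance": {"stocks", "trading", "dividends"},
}

def substitute(query):
    """Rewrite ambiguous terms using the concept implied by the
    query's other words."""
    words = query.lower().split()
    # Infer the concept from the non-ambiguous terms in the query.
    concept = None
    for name, vocab in CONCEPT_WORDS.items():
        if vocab & set(words):
            concept = name
            break
    return " ".join(SUBSTITUTIONS.get((w, concept), w) for w in words)

print(substitute("ny hotels"))   # → new york city hotels
print(substitute("ny stocks"))   # → new york stock exchange stocks
```

The point of the sketch is simply that the same term resolves differently depending on its conceptual context, which is the behavior the patent formalizes at scale with learned concepts rather than hand-built tables.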
Bill summarizes the processes the patent describes step by step in his article.
Bill’s article brings the most clarity to the RankBrain coverage – though it still leaves many questions unanswered due to the shallow content provided in the initial Bloomberg interview and article.
The concepts discussed in Bill’s article sparked a lively conversation in our office. After our banter concluded, it dawned on me that I might have made an error in my understanding of RankBrain. I went back, re-read all the articles, and was shocked to find I had misquoted a central concept. From the original interview with Greg Corrado and Bill’s treatment of the Google patent, I had understood that RankBrain processes only the “completely new” queries Google has never seen before, which represent 15% of Google Search queries. That is a huge figure on its own, and it seemed to support the earlier Bloomberg quote about “a ‘very large fraction’ of the millions of queries a second that people type into the company’s search engine” (Clark, Google Turning Its Lucrative Web Search Over to AI Machines, 2015). However, I believe Greg Corrado was pointing to a much larger figure than I initially inferred, and I had therefore used the quote out of context. Greg never says that RankBrain processes that 15% exclusively; I suspect the percentage is much larger. How much larger is impossible to know at this point, but my takeaway from the discussion was that RankBrain is a much bigger update than I first realized.
Bill Slawski’s article offers further insights as well.
The more people understand RankBrain, the more faceted the coverage and conversations will become. Review the next section for further coverage available on the rising influence of artificial intelligence systems. I highly recommend reading the span of quality reporting on the web when you have a moment to take a deeper dive.
There are two additional articles published in The SEM Post that ask and answer a lot of important questions. The first article, RankBrain: Everything We Know About Google’s AI Algorithm, is a suitable primer on RankBrain with post-article/interview conversations from Bloomberg journalist Jack Clark and expanded coverage of the original interview with Greg Corrado. The second article, Google’s RankBrain: 9 Industry Experts Weigh In, includes conversations between nine thoughtful search engine experts and Google engineers. These two articles close the gap in understanding about RankBrain including addressing some of my discoveries.
Until we talk again, here are some major mysteries and rabbit holes to explore that the coverage leaves unanswered.

RankBrain Questions
There are several key takeaways that the process of researching RankBrain revealed. The most palatable, and I hope least surprising, is that AI still has a long way to go before the Robot Uprising begins. However, I am starting to get excited that AI might be taking its first breaths. Here are my takeaways from this article.
Acknowledgments: Thanks to Brandon Schakola for inspiration and support. Thanks to the significant research and articles by all the thinkers and writers in my industry.
TSA Upcoming Series: How to Thrive After the Machines Have Already Risen
Important questions we will try to answer in the new TSA series:
Clark, J. (2015, October 26). Google Turning Its Lucrative Web Search Over to AI Machines. Retrieved from Bloomberg: http://www.bloomberg.com/news/articles/2015-10-26/google-turning-its-lucrative-web-search-over-to-ai-machines
Clark, J. (2015, October 26). Jack Clark on Twitter. Retrieved from Twitter: https://twitter.com/mappingbabel/status/658780659889143812
Dhamdhere, K., & Strohmann, T. S. (2015, August 11). Using concepts as contexts for query term substitutions. Retrieved from google.com: https://www.google.com/patents/US9104750
McFarland, M. (2015, March 16). Google’s Eric Schmidt downplays fears over artificial intelligence. Retrieved from The Washington Post: https://www.washingtonpost.com/news/innovations/wp/2015/03/16/googles-eric-schmidt-downplays-fears-over-artificial-intelligence/
Metz, C. (2015, September 28). Google’s Quantum Computer Just Got a Big Upgrade. Retrieved from Wired.com: http://www.wired.com/2015/09/googles-quantum-computer-just-got-a-big-upgrade-1000-qubits/
Slawski, B. (2015, October 26). Investigating Google RankBrain and Query Term Substitutions. Retrieved from gofishdigital.com: http://gofishdigital.com/investigating-google-rankbrain-and-query-term-substitutions/
Slegg, J. (2015, October 27). Google’s RankBrain: 9 Industry Experts Weigh In. Retrieved from thesempost.com: http://www.thesempost.com/googles-rankbrain-9-industry-experts-weigh-in/
Slegg, J. (2015, October 27). RankBrain: Everything We Know About Google’s AI Algorithm. Retrieved from The SEM Post: http://www.thesempost.com/rankbrain-everything-we-know-about-googles-ai-algorithm/