Online Marketing Intelligence

True Lies, or “Hey, Google! You’re Trying Too Hard!”

Mon, 09/29/2014 - 14:35

Much has been written recently about the peculiarities of Google’s Quick Answer Boxes. Many site owners are concerned that content is being scraped and showcased on Google itself, while SEOs are puzzling over why some boxes come with a citation while many do not. Then there’s the pesky question of whether or not Quick Answer Boxes are even part of the proper Knowledge Graph. After all, how is scraping a page particularly intelligent?

I recently devoted time to poking and prodding this feature. What I found suggests that Quick Answer Boxes are less than ideal for certain types of queries, and are sometimes featured when organic results would better serve the user.

Example 1: Don’t Know Much about History

When it comes to search, Google is only as good as its sources. Unfortunately, plenty of people post things on the internet that just aren’t true (I know, Grandma – it shocked me, too!). Malicious lies aside, let’s consider an example regarding what many accept as common knowledge. Below is the answer box I was served for the query, “when was the declaration of independence signed”:

Google is confident in the July 4, 1776 answer, and has displayed text scraped from its source to back up this assertion. However, those who take the time to read the paragraph will realize they’ve been misled. The source is actually reporting that the document “wasn’t signed on July 4, 1776.” Many other sites also call attention to this misconception, so why has Google accepted the “common knowledge” answer?

My suspicion is that there are two culprits at play: bias and verb confusion. When I search for the exact statement, “The Declaration of Independence was signed on July 4, 1776”, Google returns 37,100 results. Meanwhile, when I run the same search with the date historians believe is correct (August 2, 1776), I only get 12,200 results.

If Google is simply crawling pages for an answer, July 4th would seem to be the safe bet. Most users are probably satisfied with this result, as it is the nationally celebrated birthday of our great nation. Due to this misconception, there are probably few behavioral signals to indicate to Google that the result is, in fact, wrong.

I repeated the original query several days later. While the same date was displayed, the accompanying text and source changed:

Here we have a source writing that the document was “adopted” on July 4, 1776. The delegates “began signing it” on August 2, 1776. At least in this case, “adopted” and “signed” seem to be close enough to keep the erroneous answer front and center. Google failed to understand the less straightforward line with the correct answer: “on August 2, 1776, delegates began signing it.”

I wanted to test the verb confusion hypothesis with another date, so I searched for “when was the magna carta signed”. The following answer box is courtesy of Wikipedia:

While the date isn’t served up top, it is bolded. What I find interesting is that there’s no version of the verb “signed” in the text. Instead, it was “sealed under oath” on 15 June 1215. I have no idea if anyone actually signed the Magna Carta, but Google has made the assumption that this is a satisfactory answer.

This is a case where the Quick Answer Box does a disservice to the user by jumping to conclusions. Simply presenting the organic results would’ve allowed users to consider the controversy, while bolding July 4, 1776 misleads them.

If the Knowledge Graph is ever to reach its full potential as an answer engine, it will need to be sensitive to conflicting results that require a human tie breaker. Sometimes “I don’t know” is the right answer.

Example 2: Welcome to Jurassic Park?

Misconceptions aren’t the only things Google has to be concerned with. Humanity has created fictional people, places, and things which have become integral parts of our culture. L. Frank Baum wrote dozens of books about the magical world of Oz, and, if you take films at face value, Indiana Jones was a thorn in Hitler’s side.

While our brains are powerful enough to consider contextual clues, a skill which makes distinguishing fact from fiction a cinch, Google’s not quite there yet.

To further prove this point, I tried my luck using Google to find a fictional location. Like many of my 30-something peers, I’m a big Jurassic Park fan. So I asked, “where is jurassic park”. Much to my surprise, I got a straightforward answer:

The culprit is actually WikiTravel, a seemingly upstanding site that decided to create a page for Jurassic Park on April Fools’ Day. Unfortunately, Google didn’t get the joke. The Jurassic Park page does contain a disclaimer at the bottom. I suppose that the content was so well-suited to my query, Google returned it without seeking corroborating evidence from other travel sites.

When I tried the same query on Bing, I was initially served information about the film. However, Bing offered a search suggestion for “Where is Jurassic Park located?” When I clicked on that, I received a suitable answer:

Bing managed to avoid the WikiTravel trap and determined that, unfortunately, I cannot book a trip to Jurassic Park after all (sorry for getting your hopes up, teenage me).

Regarding Google, this seems to be yet another case where the search engine was eager to please me with a Quick Answer Box. As a result, it assigned undeserved weight to a single source. While the ultimate goal of the Knowledge Graph – and advanced search in general – is to classify entities according to their relations, any search tool needs the ability to understand when an answer is ambiguous or contested. In fact, the examples above suggest that good, old-fashioned organic results are sometimes the best way to go.

As engineers build out the Knowledge Graph and Quick Answer Box, I’d remind them that it’s OK to ask for help. Humans create fiction and misconceptions, so bringing us back into the loop is often the best way to shake-out the truth.

In the end, this is not an indictment of the Knowledge Graph or search algorithms. Instead, we’re brought to the realization that human knowledge is strangely ambiguous, and until computers can integrate this into their structured knowledge bases, there will always be room for people to weigh in.



3 Ways to Adjust Your SEO Strategy in the Wake of Pigeon

Thu, 09/11/2014 - 11:34

A few weeks back, our Search Agents scoured the SERPs, investigating the effects of Google’s yet-to-be-confirmed-or-denied Pigeon update. While Google still has yet to offer any definitive remarks on the matter, no search marketer can deny that U.S. local SERP results have changed dramatically. Like every Google update before it, Pigeon’s landing raises the question: what do we need to do to adjust our strategy?

Yelp and Google Business Listings are Dominating Local SERPs

Problem: Yelp and Google-owned entities are heavily competing with their main page counterparts.

Solution: Think like Yelp.

Engage with your customers. Yelp is all about reviews, ratings, and comparisons. Add a reviews or testimonials page to your site with each customer’s name, rating, date, and other relevant information – after obtaining consent, of course. Including a widget on your Yelp page can also remedy this problem.

Another fix? Improve your pages’ semantic markup. For example, the reviews on your site can be marked up as “reviews,” which can help them stand out with star ratings in the SERPs, like Yelp. Schema.org is the data markup method preferred by Google, Bing, and Yahoo!.
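As one hedged illustration of the idea, here is what Schema.org review markup might look like in JSON-LD form (Schema.org also supports microdata); the business name, rating values, and reviewer below are all placeholders, not real data:

```html
<!-- Illustrative Schema.org review markup in JSON-LD. The business
     name, ratings, and author are hypothetical placeholder values. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Car Wash",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  },
  "review": {
    "@type": "Review",
    "author": {"@type": "Person", "name": "Jane D."},
    "datePublished": "2014-08-01",
    "reviewRating": {"@type": "Rating", "ratingValue": "5"}
  }
}
</script>
```

Markup like this is what can surface the star ratings that help a main domain stand out next to its Yelp counterpart in the SERPs.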

These fixes aside, managing profiles in Google My Business and Yelp is no longer optional if SMBs want to compete in local SERPs. In short, get chummy with Yelp and Google+ for SERP success. Customers are rating your business on these sites, so keep active profiles, engage with reviewers, and attempt to resolve issues for repeat business. Businesses should apply these same practices to other influential local directories such as TripAdvisor, Urbanspoon, and OpenTable.

Maps Are Hyper Local

Yes, SERP maps seem to be getting more localized under Pigeon’s reign. However, Pigeon shouldn’t change your local content strategy, just the way Google ranks that strategy. The new local search algorithm has been tied to more traditional Web standards, so sites still need quality, natural content with a strong backlink profile to build domain authority and rank locally.

 Here are some optimizing tips for new local searches:

  1. Make sure your business is verified, along with the right contact information, business description, hours, website URL, and more through Google My Business.
  2. Do you have multiple locations? Create dedicated local pages, each with unique, relevant content targeting your specific city/state in the meta data.
  3. Beware: Do not duplicate content across your location pages with the city/state as the only unique factor, as these duplications will not rank post-Panda. Each page should have standalone value to a user.
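To make tip 2 concrete, here is a hypothetical head section for one dedicated location page; the business name, city, and copy are invented for illustration. Each location page should pair unique metadata like this with unique body content, not just a swapped city/state:

```html
<!-- Hypothetical metadata for a dedicated local page. Pair unique
     titles/descriptions like these with unique on-page content. -->
<head>
  <title>Car Wash in Santa Monica, CA | Example Auto Care</title>
  <meta name="description" content="Example Auto Care's Santa Monica
    location offers hand washes and detailing seven days a week on
    Wilshire Blvd. Read reviews and book online.">
</head>
```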

Optimize for the Local Carousel

Unlike local listings packs, Google’s local carousel was not affected by Pigeon, but remains a local ranking opportunity for businesses. Google derives these results from a combination of Google My Business and Zagat (a Google-owned property) listings, so maintain an active profile with a high-quality photo and positive reviews on both sites. Results are not sorted by name, rating, or price, but rather by the most relevant query result. However, the listing’s image and rating affect CTR, since we’re talking side-by-side competitors.

At the end of the day, Google’s going to change. Regularly. However, any impending changes will undoubtedly be instituted to police the search engine in favor of marketers producing quality, responsible, creative content. The less marketers lean into exploiting new Google developments, and instead focus their attention on improving content quality, the lower the likelihood of Google slamming those marketers with any future updates.

It’s pesky for SEO marketers but, ultimately with Pigeon, Google continues refining the user experience…with Google products.

Are You Ready for Shopping Campaigns?

Tue, 09/02/2014 - 14:16

In Cinderella’s story, at the stroke of midnight, her carriage turns back into a pumpkin. Online advertising is no Cinderella story, unless Google’s involved. Google sets fairy godmother-style deadlines for all its upgrades and transitions, such as the upcoming Shopping campaign deadline. Basically, if you do not manage this upgrade by manually transitioning your PLA campaigns to Shopping campaigns, Google will flick its wand and change your campaigns for you. However, this change is better made manually. Here are the top 3 reasons why:


Google has offered to upgrade your PLA campaigns to Shopping campaigns automatically. However, when your PLA campaigns automatically upgrade, they may not upgrade in your desired structure. The window before Google’s automatic September 3rd switch date is thus a good chance to change your campaigns manually. Put together your thoughts based on past performance and future strategy to define the structure of your Shopping campaign yourself. Do you want to base your campaign on brands, ROI, or Product Type? Use a whiteboard. Draw a flowchart. Plan out the structure.


One of the new features of Google’s Shopping campaigns is the set of ‘Low’, ‘Medium’, and ‘High’ priority settings. These settings allow users to tell Google which campaign or bid to prioritize when running multiple campaigns for the same product groups. For example, if you set up a campaign with ‘Medium’ or ‘High’ priority, Google will select the bids in that campaign over bids in ‘Low’ priority campaigns. This feature could be especially useful for sales, promotions, and even seasonal campaigns.

Important to Note: Google’s default setting is ‘Low’. Thus, make sure you alter the priority setting tabs for each of your campaigns, setting ‘High’, ‘Medium’, and ‘Low’ campaigns accordingly before implementation, as otherwise all of your PLA campaigns will be set at ‘Low’ priority with the automatic upgrade.
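The priority logic described above can be sketched as a small function: among campaigns covering the same product group, the bid from the highest-priority campaign is the one served. The function name, campaign names, and bids below are our own invented illustration, not an AdWords API:

```python
# Sketch of Shopping campaign priority selection as described above:
# when multiple campaigns cover the same product group, the bid from
# the highest-priority campaign wins, regardless of bid amount.
PRIORITY_RANK = {"High": 2, "Medium": 1, "Low": 0}

def winning_campaign(campaigns):
    """Return the campaign whose bid would be used.

    `campaigns` is a list of dicts with 'name', 'priority', and 'bid'.
    """
    return max(campaigns, key=lambda c: PRIORITY_RANK[c["priority"]])

campaigns = [
    {"name": "Evergreen PLAs", "priority": "Low", "bid": 1.50},
    {"name": "Fall Sale", "priority": "High", "bid": 0.90},
]
print(winning_campaign(campaigns)["name"])  # the High-priority sale wins
```

Note how the ‘High’ priority sale campaign wins even though its bid is lower, which is exactly why leaving everything at the default ‘Low’ priority defeats the feature.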


Automatically upgraded campaigns may not include any proactive negative additions. So, if the structure of your campaign changes, consider proactively checking for negatives to add over the first few weeks, so that costs do not build up on irrelevant terms. Again, if you set up the campaigns manually, you will know what to exclude in terms of both keywords and product groups.

Fun Fact: Using Shopping campaigns, you can no longer find search terms in the Keywords tab. You can, instead, find them under Dimensions, by clicking on View by Search Terms.

With these tips in mind, design your structure, define your negatives, and plan out your settings before the clock strikes 12. Then just wait patiently on your pumpkin until Google strikes back with another deadline for a new upgrade.



The Internet of Things Isn’t Coming; It’s Here

Fri, 08/22/2014 - 14:45

Imagine a world where you can monitor your oven settings after leaving the house, and even shut the appliance off remotely if need be. Perhaps you’ve forgotten to lock your doors and fear intrusion? In this world, with one simple click, you can secure your home post-departure, control the smoke and carbon monoxide detectors, and even switch off the lights! That imagined world? It’s ours, right now.

By now you have probably heard of something called the Internet of Things (IoT). Simply put, the IoT is the technology described above: it connects electronic devices in your home, office, or anywhere else directly to the Internet, giving you, the user, remote control over these areas.

Speaking of homes, I was recently out window shopping for a new home myself. During my tour of a rather expensive house, I noticed a device on the wall labeled NEST. When I inquired about this device, the Realtor explained that it was used to control all of the home’s electronic devices from an app on the owner’s smartphone. An app controllable from anywhere in the world astounded me, and I realized our world is about to become much more connected by way of our surrounding devices. Bigger still, the role these devices will play has yet to be determined.

Attracting the BIG players

This technology is so real that Google, Samsung, and a slew of other brands have joined forces to standardize the IoT. This collaboration of companies is called the Thread Group. According to an article on Business Insider, this group’s goal is to develop “Thread, a new IP-based wireless networking protocol, which will enable devices to connect into a more open, secure, and low-power wireless mesh network”. This technology can be applied across many different industries including healthcare, aviation, entertainment, transportation and home automation. These categories barely scratch the surface of the IoT’s potential application, spanning to any industry where data collection, monitoring, and control could prove beneficial.

Chip manufacturers like Texas Instruments and Intel could see a huge upswing from the adoption of IoT, as well as internet security based companies like Cisco, Barracuda Networks and Checkpoint Software. Moreover, digital marketers and advertising agencies could see an increase in engagement metrics as products become increasingly more interactive. For example, media consumers of home television can, with the click of a button, purchase whatever product captures their interests on screen. The list of this technology’s possible applications is virtually endless.

Protecting Your Home Will Take On a Whole New Meaning

Of course, modern concerns for security and privacy stand to accompany this new technology’s arrival. By the year 2020, a projected 20 billion devices will have seamlessly connected to the IoT. However, connecting more devices to the internet increases the chances of hacking or other digital compromises. Hewlett Packard has a security check system called Fortify on Demand to protect against these concerns. According to a study by HP, “the company’s Fortify application security unit conducted an analysis of the 10 most popular consumer [IoT] devices on the market and found 250 different security vulnerabilities in the products, for an average of 25 faults each. They were from the manufacturers of TVs, webcams, home thermostats, remote power outlets…” “What’s happening,” says Mike Armistead, VP and general manager of HP’s Fortify unit, “is that manufacturers are rushing to get their products on the market without doing the harder work of locking their devices down against the most basic kinds of attacks.”

With that in mind, ask yourself: are we prepared for the kind of change that an increased number of connected devices will bring? While the application of the IoT continues to take shape, we need to shift our perception of and concerns regarding connected devices. What were once thought of as dumb devices may soon become a network of interconnected, living devices that communicate with each other and make intelligent decisions, dare I say, apart from human interaction and delegation; a possible Skynet of sorts. Ultimately, whether you feel excited about the Internet of Things revolution or skeptical due to looming security concerns, one thing is certain: we no longer have to dream or imagine. This technology is here to stay.

Share your thoughts and comments below to let us know how you feel about The Internet of Things.


Ready, Aim…wait, Read This: More on Google’s HTTPS Update

Tue, 08/19/2014 - 09:33

Last week, Google announced that HTTPS is now a ranking signal, a change for which Matt Cutts gave his personal endorsement months ago. Although the official announcement indicates that HTTPS is only a “lightweight signal” for now, effects may become stronger in the future. The following are site update considerations to prioritize in the interim:

  • Level of implementation effort
  • The cost of an SSL certificate
  • Effort and cost of maintenance

What Does this Mean for Brands?

For brand leaders, Google’s announcement signals yet another available avenue for getting a leg up on competitors. At The Search Agency, we see this announcement as a precursor to more expansive future Google initiatives. Thus, staying ahead of the curve by responding actively to Google’s current update will prove beneficial if this change does indeed become a signal that weighs heavily among ranking factors. Still, more impactful changes currently exist that can be implemented with less effort and at lower cost. As Google’s John Mueller indicates, this is not something we need to drop everything and implement right away, so it should be prioritized accordingly among other site updates.

The changes required to transition existing URLs to the HTTPS structure can seem overwhelming. In reality, however, they are not much different from those necessary for any other site migration. Firstly, E-commerce sites should already be utilizing HTTPS on pages requiring security encryption, such as shopping carts and transactions, so, for those sites, changes may be as simple as extending that same security site-wide. For non-E-commerce sites, once an SSL certificate has been set up and tested (we recommend following SSL best practices to ensure security and to verify your server will be able to hold up under the additional processing requirements), there are a number of steps to take to ensure no value is lost in organic search. Following these practices, switching to HTTPS is reminiscent of most other site migration patterns. Here are some of the high-level steps to consider in such a migration:

  • 301 redirect all non-HTTPS URLs to HTTPS
  • Update all internal linking to point to the correct HTTPS URLs; the easiest way to do this would be to use relative linking across the board, as long as there are no other sources of duplication at the domain level, such as coexisting www and non-www URLs
  • Add the HTTPS version of the site to Google Webmaster Tools, if it isn’t already there
  • Once Google’s Change of Address tool supports HTTPS migration, use the tool to inform Google of the site move
  • Reach out to webmasters of sites with valuable backlinks to your site and request changing the linked URLs from HTTP to HTTPS
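As a hedged sketch of the first step, here is what the site-wide 301 redirect might look like for a site running Apache with mod_rewrite enabled (server setups vary, and Nginx or IIS would use different directives):

```apache
# Hypothetical .htaccess rules: 301 redirect every HTTP request to
# its HTTPS equivalent, preserving host and path. Assumes the Apache
# mod_rewrite module is enabled and the SSL certificate is in place.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Using a permanent (301) redirect, rather than a temporary (302), is what signals search engines to pass ranking value to the HTTPS URLs.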

Some other things to keep in mind:

Heartbleed – In April, the Heartbleed bug showed the corporate community that even those technologies designed to enhance security can themselves be vulnerable to system breaches. Keep in mind that leak opportunities are still possible, reinforcing the importance of staying up-to-date with any and all security protocol advancements.

Page Speed – Technically speaking, the “round-trips” necessary for HTTPS may actually have a slightly negative effect on page speed, creating a slower page load time, which goes against Google’s interest in ranking sites with faster load times. Will Google take this into consideration when factoring in page speed?

Mobile Devices – Especially on older mobile devices, the possibility remains that certain, less common SSL certificate authorities may not be recognized.  Therefore, cross-platform testing should be performed to catch any such issues.


Overall, these changes, potential page speed drains aside, stand to boost rankings and better secure both brand and consumer Internet interactions. Implementing HTTPS is a safety precaution that’s well worth the up-front implementation efforts to ensure site security and compliance: a little work, and we’ll all be breathing easier. See? That wasn’t so bad.

Need help with a migration? We can help. Contact us for a free strategic consultation today.



A Whole New SERP

Thu, 08/14/2014 - 15:40
The Search Agency Crowd Sources on Google’s Recent Pigeon Implementation

Since Pigeon swept the search scene early last week, The Search Agency’s thought leaders have been evaluating Google’s changes, drawing their own conclusions concerning how these updates stand to affect the quality of search moving forward. Here is a sampling of their thoughts.

Local Products and Services SERPs:
The Search Agency started its Pigeon evaluation by examining the changes made to local products and services searches. Our team’s findings point to the notion that, under Pigeon, semantic markup will not necessarily improve SERP rank. Mary Hayes, The Search Agency’s Earned Media Content Manager, investigated this notion, running her own search for car wash west la. She found:

“A search for car wash west la yields a seven pack (surprising, since we’re mostly seeing only three or four packs). The resulting map also looks to cover a decent area.”

Mary then refined her search to car wash 90064, and while she got the same 7-pack, the resulting map was smaller and paid ads took up much more real estate underneath said map. As she scrolled down, Mary noted that Yelp takes this search’s top three organic SERP spots, further concluding that while semantic markup may not improve rank, it stands to at least help main domains stand out against their Yelp counterparts.

The Search Agency’s Gregory Sidor performed his own search for oil change as seen below.

From these results, Greg concluded, “To me, this seems like an example of Pigeon gone wrong. Three different Jiffy Lube URLs [appear] in the result, and one is even duplicated in a right-rail ad. I would expect Google to improve on this.”

Like Greg’s oil change query and Mary’s car wash queries, Kirby Burke, The Search Agency’s Earned Media Manager, looked up a couple service queries as well: electrician and plumber. Both results returned maps which reinforce local intent. They also completely pushed all organic results below the paid ads and six packs:

Furthermore, adding a geo term to his query (ex: electricians west los angeles) pushed the six pack down and front-loaded three Yelp results. To Kirby, the SERPs seemed identical after he flipped the service and geo (ex: west los angeles electrician).

These findings demonstrate that, with the new Pigeon implementations, Yelp still dominates local listings and is subsequently considered very relevant for specifically-targeted queries. Managing profiles in Google My Business and Yelp is no longer optional if SMBs want to compete in local SERPs. SMBs will need to make sure they are listed and getting active, positive reviews in influential local directories. Similarly, it’s recommended that SMBs keep an active Google+ business profile with high-quality images (which could get picked up by the Google Carousel) and encourage customer reviews. These are the staples of natural and fresh content, which has a positive association with the main domain.


Honing focus further on SERP maps themselves, Kirby compared queries for restaurants near me against restaurants west los angeles. He found:

“Queries for restaurants near me and restaurants west los angeles both provide very targeted results, within a few miles of the location. However, a search for restaurants los angeles provides broader results that cover the greater LA area. Thus, the maps on SERPs appear to be relative to the query.”

Mega Video:

Embedded within the wide scope of Pigeon updates, The Search Agency team also noticed that some video thumbnails had disappeared from SERPs, replaced instead by fuller-scale Mega Video ads, i.e. Google properties. Matt McKinley, Earned Media Manager, adds, “Mega Videos are a way for Google to promote YouTube. Brands who are optimized and active on YouTube can benefit from this [change] when users do a specific search for a video such as nike the last game rather than more general searches such as soccer videos, which still result in thumbnails.”

Promoting these larger scale ads also points to Google finding yet another opportunity to engage users within its own properties, expanding avenues for data accumulation.


Kirby continued to evaluate Pigeon changes, investigating what factors affect the new SERP carousel results.

“The listings appear to be an aggregate of Google My Business and Zagat (a Google-owned property) listings. I verified carousel appearance for keywords like bars, nightclubs, museums, movies, hotel (note the check-in and check-out date options) and specific food items like pizza, burgers and ramen.

The initial carousel results are not sorted by name, rating, price or any other factors I could discover. Most of the listings have Google My Business profiles and some have Zagat pages too. So having an active presence on both these platforms is highly encouraged.

I also found carousel results for entertainment queries, such as list of hip hop albums, list of popular movies, list of comedy tv shows and list of nba teams, which seem to function differently. Underlined keywords are interchangeable with other, relevant keywords, such as specific genres and sports leagues.”

Kirby noted that “the carousel also has options to filter results. For example, here are carousels for 2013 Heavy Metal albums and burger places that are cheap ($ or $$), have a rating of 4.0+, and are open now:”

“The creative works are especially of note because Google lists them as ‘[Creative Work] frequently mentioned on the web.’ Google is obviously tracking these creative works as entities, most likely through semantic markup, and tracking the frequency of occurrence in the index.

Today, the carousel is featuring albums and films. In the future, this could expand to other entities, such as products, places, and events. This reinforces the need to designate entities with semantic markup.”

Overall, Kirby found that “Clicking a carousel item does not lead to product, business, or review pages. Instead, it keeps the user on the search engine results page, updates the query, and displays results for the entity, all while maintaining the carousel at the top of the browser. This allows the user to browse and compare multiple listings within the same window.”

The Search Agents agree that, purposed as such, this carousel implementation seems to be yet another Google tactic to lengthen its users’ sessions while at the same time affording Google the opportunity to gather more data during comparison searches.

Best Practices:

All things considered, Matt McKinley offered best practices to follow regarding these Pigeon updates, saying, “I don’t think [Pigeon] changes local content strategy, just the way it is being ranked. The new local search algorithm has been tied to more traditional Web standards, so you still need quality, natural content with a strong backlink profile to build domain authority and rank locally. If you have multiple locations, having a dedicated local section with unique, relevant content and the targeted city/state in the meta data is recommended. Do not duplicate content across your location pages with the city/state as the only unique factor.”

To reiterate:

  • Make sure you are listed and getting active, positive reviews in influential local directories such as Yelp, Urbanspoon, OpenTable, and TripAdvisor.
  • Keep an active Google+ business profile with a high-quality image (for visibility in the Google Carousel) and encourage customer reviews.

While Pigeon may have landed in our laps in a hurry, seemingly a pest, The Search Agency’s thought leaders find most of its updates to be in line with Google’s commitment to further enhancing its user experience. Though SERP kinks may still need some working out, with Pigeon, Google continues to prioritize engaging, readable content in its rankings. Google has also found ways to lengthen user sessions, using both in-engine comparison shopping experiences like its carousel, and increased visibility for subsidiary platforms like Mega Video from its YouTube property. These changes provide Google increased data collection opportunities, helping it better answer the question: what do searchers want?

Who knows, Pigeon may be the wings on which Google soars to new heights. Only time will tell.

Troubleshooting in Google Webmaster Tools

Thu, 08/07/2014 - 14:15

Regularly monitoring your site in Google Webmaster Tools is critical to its success. For the purpose of this article, we are focusing specifically on monitoring the number of pages on your site that appear in Google’s index.

This number will naturally fluctuate and increase or decrease as you add and remove pages to your site. However, we are focusing on this from a troubleshooting perspective. This article is a resource to help you identify significant, unplanned spikes and dips in the number of indexed pages.

This chart is a basic overview of some standard troubleshooting to isolate and resolve the root cause(s) of indexation issues. Supporting documentation and resources can be found below.

Check Index Status

Index Status is a module in Google Webmaster Tools that displays the total number of indexed pages over a rolling 90-day period. Look for any significant spikes or dips in the number of pages.

Index Status is located under Google Index in the GWT sidebar navigation


Pages Decreased?

If the number of indexed pages in GWT dips, that is an indicator that something is preventing pages on the site from being crawled, or that certain pages are not being indexed by Google. This can be the result of the following:

  • Site Errors
  • Robots.txt Settings
  • Google Penalties

Resolving these errors is important because you want to ensure that all of your content is visible to crawlers so it can be indexed and accessed by users.


Site Errors

Site Errors in GWT refer to errors that prevent the crawler from accessing the site. These errors include DNS communication issues, server connectivity (server is down or Google is being blocked) and the ability to retrieve the robots.txt file.

Site Errors is at the top of the Crawl Errors module, under Crawl in GWT

Site Errors are also displayed in the Crawl Errors section of the GWT Site Dashboard.

The three types of Site Errors are displayed in tabs at the top of the window. Tabs will have a green check mark if no errors are found or a red exclamation point to designate errors. Click on each tab to display the specific errors.
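The three categories above can also be spot-checked from your own machine while you wait for GWT to refresh. A minimal Python sketch (the hostname and 10-second timeout are arbitrary, and a 4xx homepage will show up here as a connectivity error):

```python
import socket
import urllib.request
from urllib.error import HTTPError, URLError

def check_site_health(host):
    """Spot-check the three GWT Site Error categories: DNS resolution,
    server connectivity, and robots.txt retrieval."""
    results = {}

    # DNS: can the hostname be resolved at all?
    try:
        socket.gethostbyname(host)
        results["dns"] = "ok"
    except socket.gaierror as exc:
        results["dns"] = f"DNS error: {exc}"

    # Server connectivity: does the server answer an HTTP request?
    try:
        with urllib.request.urlopen(f"http://{host}/", timeout=10) as resp:
            results["server"] = f"ok ({resp.status})"
    except (HTTPError, URLError) as exc:
        results["server"] = f"connectivity error: {exc}"

    # Robots.txt: Google treats 200 (file exists) and 404 (no file) as
    # fine; any other outcome postpones crawling.
    try:
        with urllib.request.urlopen(f"http://{host}/robots.txt", timeout=10) as resp:
            results["robots"] = resp.status
    except HTTPError as exc:
        results["robots"] = exc.code
    except URLError as exc:
        results["robots"] = f"unreachable: {exc}"

    return results
```

This won't replicate Googlebot's view exactly (firewalls may treat Googlebot and your office IP differently), but it quickly rules out the obvious failures.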

[Back to Top]

DNS Errors

DNS Errors are produced when Google is not able to resolve the hostname of a site or the request times out.

[Back to Top]

Server Connectivity

Server Connectivity Errors occur when Google is having trouble connecting with the server, losing an established connection or receiving a bad response from the server.

[Back to Top]

Robots.txt Failure

Robots.txt Failures occur when the robots.txt file returns neither a 200 response (the file exists and is accessible) nor a 404 response (the site doesn't have one). If Google cannot confirm whether or not a robots.txt file exists, it will postpone crawling the site to mitigate the risk of crawling and indexing content that is restricted by the robots.txt file.

[Back to Top]

Blocked URLs

Blocked URLs in GWT are caused by robots.txt commands that prevent Google from crawling and indexing certain pages and directories.

Blocked URLs is under Crawl in GWT

This module includes an overview of the robots.txt file and allows it to be modified and tested. Note that it does not modify the actual robots.txt file, so, if necessary, be sure to update the actual robots.txt file on your server.
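If you'd rather test robots.txt rules locally, Python's standard-library parser applies the same Disallow logic; the rules and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration
rules = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard group in these rules
print(parser.can_fetch("Googlebot", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/products/"))            # True
```

Running a batch of important URLs through a check like this before deploying a robots.txt change can catch accidental blocks before Google does.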

[Back to Top]

Check for Google Penalties

To ensure quality content is being delivered to its users, Google created its Webmaster Guidelines. These guidelines are enforced by both manual reviewers and automated functions such as the Panda and Penguin algorithms.

More information about how Google manually reviews sites can be found in our blog article, Understanding Google Search Quality Rating Guidelines.

[Back to Top]

Manual Actions

Manual Actions are penalties which are applied after a manual review of the problem. These penalties can be isolated to a specific section or applied to the entire site.

Manual Actions is under Search Traffic in GWT

Once the issues are fixed, Manual Actions can be lifted by submitting a Reconsideration Request to Google for review.

[Back to Top]

Algorithmic Penalties


Algorithmic Penalties occur automatically when changes to the Google algorithms go live. Unfortunately, there is no reporting of these kinds of penalties in GWT.

If you believe you’ve been impacted by Algorithmic Penalties, compare your Index Status trends to see if they correspond with the algorithm updates. A comprehensive list of updates can be found in Moz’s Google Algorithm Change History article.

If the drops correspond to the updates, then you will need to review the details of the update and find the violations of the Quality Guidelines that correspond with the update (content, links, etc.).

Unfortunately, Algorithmic Penalties cannot be resolved with a Reconsideration Request. After you resolve the issues, the affected URLs will be crawled again and added to the index if they meet the Quality Guidelines. You can expedite this process by manually submitting URLs to Google using the "Fetch as Google" feature in the "Crawl" menu.
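That date comparison can be sketched in a few lines of Python. The counts below are illustrative, and the update date corresponds to Panda 4.0's May 20, 2014 rollout:

```python
from datetime import date

def drops_near_updates(index_counts, update_dates, window_days=7):
    """index_counts: {date: indexed-page count}, e.g. noted down from
    the Index Status chart. update_dates: known algorithm release dates
    (e.g. from Moz's Google Algorithm Change History).
    Flags any drop between consecutive data points that falls within
    `window_days` of a known update."""
    days = sorted(index_counts)
    flagged = []
    for prev, cur in zip(days, days[1:]):
        if index_counts[cur] < index_counts[prev]:  # a dip in indexed pages
            for upd in update_dates:
                if abs((cur - upd).days) <= window_days:
                    flagged.append((cur, upd))
    return flagged

# Illustrative numbers; Panda 4.0 rolled out around May 20, 2014
counts = {date(2014, 5, 18): 1000, date(2014, 5, 21): 600, date(2014, 5, 28): 610}
print(drops_near_updates(counts, [date(2014, 5, 20)]))
```

A match here is only circumstantial evidence; you still need to review the update's details against the Quality Guidelines as described above.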

[Back to Top]

Pages Increased?

If the number of pages has increased, Google is indexing additional pages beyond what you planned for the site. These pages are usually the result of the following:

  • URL Errors
  • Other factors that generate duplicate URLs on the site

Resolving these increases is important because it ensures the search engines are crawling and indexing valuable content instead of wasting their bandwidth chasing error pages.

[Back to Top]

URL Errors

URL Errors are also located under the Crawl Errors module, just like Site Errors. URL Errors provides specific examples of URLs with errors or bad URL requests made to the site.

URL Errors is at the bottom of the Crawl Errors module, under Crawl in GWT

First, select the tab that corresponds to the specific error. Then, look below the chart for the specific URLs. Clicking on them will provide details about the errors and from where they are linked.

Specific URLs are listed below the URL Errors graph

Clicking a specific URL will provide Error details and the Linked from information

Sudden spikes in errors can have several causes, such as a CMS bug that generates malformed URLs, or an increase in external links to non-existent URLs on the site.
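Once you've exported the flagged URLs, a short script can re-check their status codes before you mark the errors as fixed in GWT. A rough sketch (HEAD requests and the 10-second timeout are arbitrary choices):

```python
import urllib.request
from urllib.error import HTTPError, URLError

def audit_urls(urls, timeout=10):
    """Re-check URLs exported from the URL Errors report and return a
    {url: status} map, so fixes can be verified before marking errors
    as resolved in GWT."""
    statuses = {}
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")  # headers only
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                statuses[url] = resp.status
        except HTTPError as exc:
            statuses[url] = exc.code            # e.g. 404, 500
        except URLError as exc:
            statuses[url] = str(exc.reason)     # DNS/connection failure
    return statuses
```

Note that some servers mishandle HEAD requests; switching the method to GET trades bandwidth for accuracy.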

[Back to Top]

HTML Improvements

HTML Improvements displays any issues with title tags and meta descriptions that are duplicated, missing, or not properly optimized. For the purpose of researching spikes in indexed pages, we will focus on the duplicated tags.

HTML Improvements is under Search Appearance in GWT

Titles or meta descriptions duplicated across several versions of the exact same page can be an indicator that duplicate pages are being generated on the site.
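If you crawl your own site, spotting these duplicates is a simple grouping exercise. A sketch, assuming a hypothetical {url: title} mapping from your crawl:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by <title> text; any title shared by more than one
    URL is a duplicate candidate, mirroring the HTML Improvements
    report. `pages` is a {url: title} mapping from your own crawl."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

crawl = {
    "https://example.com/shoes":          "Buy Shoes Online",
    "https://example.com/shoes?sort=asc": "Buy Shoes Online",
    "https://example.com/hats":           "Buy Hats Online",
}
print(find_duplicate_titles(crawl))
# → {'buy shoes online': ['https://example.com/shoes', 'https://example.com/shoes?sort=asc']}
```

Query-string variants clustering under one title, as in this example, usually point to a canonicalization problem rather than genuinely new content.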

[Back to Top]


Continuously monitoring Index Status is one way to ensure your site is performing consistently and all of your content is available. It also allows you to identify and resolve issues quickly, getting your site back online or mitigating any spammy or low-value content.

Inside the Search Studio with Steve Sirich

Thu, 08/07/2014 - 14:03

We recently sat down with Steve Sirich, GM of Bing Ads Product Marketing, for this edition of Inside The Search Studio.

How long have you been working with Microsoft?

I’ve been with Microsoft for a little over 15 years. I started in the field organization, where I was involved in sales leadership, and then moved to corporate headquarters in Seattle. Over the past ten years I have had several different roles: running marketing, operations, and business development teams, all in the digital/online services space.


What made you become interested in search marketing/advertising? 

I actually became more intimately involved in that side of the business in 2009. Prior to that, I was very involved in display and video, so it made sense. At the time, Search was emerging as a strategic growth area for Microsoft and the industry as a whole, and I was interested in exploring that arm of the company. In fact, I helped to write the Yahoo deal, which included a technology deal with Bing services.


What has been the biggest surprise for you in your tenure at Microsoft?

There are a couple things that have surprised me, but perhaps the one that sticks out is the growth that we’ve achieved with Bing in the last five years, which has been outstanding. Today we are at 20% market share, whereas in 2009 we were trending down at 7% market share. At the time, we knew we could reverse that trend, but over the last five years an average of 2% growth a year is great, and possibly more than we anticipated.


What has been the project that you are most proud of during your tenure?

In the 15 years I’ve been here, one of the most satisfying projects I’ve worked on has been the Yahoo relationship and partnership. Not only what we outlined, but also what we executed. In 2009, we were operating adCenter, and set out a scope with Yahoo to transition as many as 30 markets worldwide across search and the ad platform, and to make a shift in how we were selling.

We knew that there were multiple layers of complexity and a short time to execute globally. Today we operate in 35 markets with Bing ads – 33 of those are Yahoo partnerships. It was very rewarding to have such a lead role in the deal and execution.


What are some of the biggest themes that Microsoft is tackling in the next quarter when it comes to search marketing and advertising?

At Microsoft, two major themes are mobile and cloud, in relation to the function of devices and a services strategy. The primary shift is toward a future built on devices and services. We’re moving away from the OS/Office Suite model and toward the question of how we will profit relative to tech; the disruption to the current landscape is devices and services.

Core to these changes is economics, more specifically a holistic means of monetization: advertising, transactions (services like an Xbox movie purchase), and subscriptions (licensing fees). We are profoundly shifting from classic packaged/licensed software to a mobile and device play. Google has had a lot of success with a services model, and we’re looking to do the same.

As far as search is concerned, we’re continuing to focus on customer experience as we think about the ongoing innovation of Bing ads. We’ve been investing in areas like user simplicity, driving industry standardization on electronic IO, and removing friction in what consumers want to buy. When it comes to reporting and analytics, today we measure customer campaign performance in hours, but soon we’ll move to less than 30 minutes.

We understand Google’s large market share, so we’re trying to make it easy for consumers to spend time with us. Microsoft recognizes the high standards Google has driven when it comes to AdWords, and we want to show the value of spending time on Bing Ads.

It’s important to understand that Microsoft thinks of search beyond the query box/destination search. Google has a strong presence, but search is more than just a web destination; it is a part of all of the services you engage in with digitally. Search is very pervasive across all our assets – Xbox, Windows 8.1, and our mobile phone – and can be the intelligent fabric that powers our devices and overall experiences. Search as a platform and technology is the foundation of our services moving forward.


What has surprised you in terms of the way search has evolved? How is/did Microsoft address(ing) this?

Search has taken on a more natural interface, weaving itself into our lifestyle and becoming part of our natural rhythms. We recently introduced Cortana, a voice-enabled experience anchored by Bing technology that operates as a personal digital assistant, as an answer to Siri. I’ve also seen search shift in reaction to our switching between devices and screens, becoming more personal. Search is now a means for technology to learn personal interests and become more intelligent about your daily tasks.


How has social impacted search marketing and advertising?

Social is a big part of the Bing strategy because we have a solid relationship with Facebook. Social makes marketing and advertising a much more personal experience because it’s an easy way to affirm and validate activity and interests. People naturally refer to a community environment when they make decisions, so it’s important for marketers and advertisers to have as much of a presence as possible.

Today, social is a big part of the Bing experience in terms of providing relevant info both through web searches and the social graph. We will continue to evolve search based on social’s ability to showcase what colleagues, family, and friends think about any given experience.


What are three predictions you see for search marketing/advertising in the last half of 2014? What are your predictions for the next 5-10 years in search? 

1.) The idea that today’s world has unprecedented digital experiences. We have to drive well-integrated, seamless experiences that are intelligent across a number of devices. PC at work, tablets at home, and phones as you travel: the experience needs to shift to the needs of the device. Currently, cross-device experiences are somewhat fragmented, but we will soon need to simplify and unify them into one seamless product.

2.) We will continue to see the industry push toward more human and intuitive experiences around natural interfaces. Today it’s voice, but tomorrow you can only imagine what the experience will be to engage with devices in a natural way.

3.) The convergence of media and experiences will continue to push across search, display and TV. There will be a significant conversion – especially TV – to an on-demand model across the digital landscape.


What issues or news has sparked some of the biggest conversations on Microsoft when it comes to search advertising?

The biggest thing for us over the last 12 months is the realization that we won’t win in search if we focus on search as a destination experience. Search is evolving into a platform, an intelligent fabric that will integrate into multiple experiences and produce something more profound for consumers, something that can separate us from the competition.


What do you think are the biggest challenges facing this industry?

1.) Fragmentation is more prevalent than ever. I have been involved in digital marketing since 1998 and have seen it move in several different directions with the explosion of sites and services available to marketers today. Advertisers and agencies are challenged because there are so many experiences available to buy. Programmatic is starting to solve for that, but it still has a long way to go. Microsoft has to make sure it’s connecting customers in the best, most profitable and profound way to desired audiences.

2.) The lack of standardization still makes it difficult for advertisers and agencies. Publishers offer many experiences to engage with audiences, but without standardization, it’s difficult to manage all of these relationships.


What is the most interesting search marketing/advertising campaign that you’ve seen? How about social?

In February 2013, Dodge Ram rolled out an emotive ad – “Year of the Farmer”, narrated by Paul Harvey – that linked into an integrated campaign that involved search. It appealed to your sense of the fabric of America, all while bringing awareness to Dodge Ram and their charity, FFA, for which they raised $1M. It’s a great example of a brand play that used search after showcasing what they wanted to achieve with the Super Bowl, and carried it out through the year. They maximized the footprint opportunity by using all of the different ad formats and innovations on both Google and Bing ads.


Any advice for search marketers using Bing Ads?

I recommend marketers continue to stay very close to Bing and Bing Ads in terms of what we are doing with innovation; we’re actively responding to the customer voice. There are actually a lot of things that we’re rolling out in the next six months that came directly from customer feedback. For example, today we offer 50,000 keywords, but by the end of the calendar year we will have 1M through the Bing Ads UI.

We’re also focusing on Electronic IO – Google has offered this, and we’re working to make it easier for customers. We’re spending time on innovation in our ad footprint – site, location and call extensions are available now, and we will continue to expand this so customers can connect with consumers in other ways like introducing zip code targeting.

Overall the experience will continue to improve. I encourage marketers to stay tuned and stay close.


Any final thoughts?

Microsoft is continuing to invest in Bing as an experience and a platform. We’re focusing on innovating for the customer, agency, and advertiser with Bing Ads, and also for Bing as it relates to consumers. Our next OS will feature Bing as that experience.

With the investment Microsoft is making in Bing today, we enjoy 19% query share in the US, and as we continue to invest, it is expected to grow. We plan to double down on what search means for both the device and service experience in what Microsoft is offering.


Major Shifts on Smartphones Led the Way in Q2 of 2014

Thu, 07/17/2014 - 11:01

Today we’ve released our quarterly state of paid search report for Q2 2014, and the data demonstrates solid growth across devices and search engines. Total clicks grew 36%, spend increased 22%, CTR jumped 58%, and CPCs fell 10%.

Much of the growth can be attributed to increased advertiser focus on mobile devices, including both tablets and smartphones. Spend share on mobile devices continues to climb on a quarter-over-quarter basis, with overall mobile spend share reaching 29%.

Mobile growth can be attributed in large part to smartphones, which are gradually becoming a “search anytime” device. Data on mobile impressions across the day and across the week demonstrate that users are increasingly using smartphones consistently throughout the day, no longer reserving smartphones for the evenings and weekends.

The increased usage of smartphones also becomes apparent with the large jump in smartphone click share on Google Product Listing Ads (PLAs)—in Q2 of this year, smartphones accounted for 12.7% of all PLA clicks. This indicates that consumers are becoming increasingly comfortable using smartphones as shopping devices.

Check out the full State of Paid Search Report for Q2 of 2014 for more graphs and analysis on the latest trends.


Domino’s Has No Excuse Not To Be Prepared During the World Cup

Wed, 07/02/2014 - 13:55

Watching overtime of the World Cup match between USA and Belgium yesterday afternoon was perhaps the only thing more stressful than the hour and a half wait our entire LA office endured for our order from Domino’s pizza.

There we were, 43 minutes into the match and 50 very hungry Americans. How did this happen? Ask Domino’s.  Apparently Domino’s didn’t plan ahead for the demand during a rather popular sporting event such as, oh I don’t know, the World Cup, and a match that just happened to coincide with lunch time (1PM). How could you not know? How could you not at least factor in a good response or backup plan if you absolutely get inundated with requests for pizza and wings?

Let’s set the stage; we knew on Monday afternoon that we would be hosting a work/watch party at the office. We called to place the order late Monday afternoon, then called again Tuesday morning to confirm the order and its delivery time of 12:30PM.

We called at 1PM to see what the status was.

We called at 1:30 to get an ETA on the driver delivery.

We called at 2PM to try to speak to the manager.

Finally, at 2:15PM, the pizza arrived. Was it the correct order? Who cares?

We know Americans love their pizza.

Far be it from me to know the inner workings of Domino’s but, as a marketer, wouldn’t it have been better to anticipate a large volume of orders in line with a popular sporting event? Based on interest over time* from Google Trends, “dominos” showed (4), “pizza” (21), and “world cup” (100), which means that today the search term “world cup” garnered by far the greatest share of search interest. Look at “pizza” over time: the term only continues to slowly rise and maintain popularity.

Now, what if you overlay this with another live, prominent sporting event like “super bowl”. Here’s what you get:

Now try the search term “olympics”. More search blips:

The point is, there are always major sporting events and there will always be pizza. There is no excuse not to be ready, and no reason not to plan your advertising and search marketing campaigns around major real-time events.  Hockey has the Stanley Cup, baseball has the World Series, and soccer has the World Cup going on right now!  If you’re a Quick Service Restaurant (QSR) or pizza delivery business like Domino’s, offering delivery but not capitalizing on the search opportunity, you’re missing the mark.

*Interest over time: numbers represent search interest relative to the highest point on the chart. If at most 10% of searches for the given region and time frame were for “pizza,” we’d consider this 100. This doesn’t convey absolute search volume.


Today: Canadian Legislation Attempts to Limit Online Spam

Tue, 07/01/2014 - 09:59

(Caveat: I am not a lawyer, nor have I played one on TV. However, I am a law graduate and have passed a bar exam once upon a time, so my reading of the law is based on that background. Please consult with a media regulatory lawyer for any compliance issues or questions with this law).

The Canada Anti-Spam Legislation, or CASL, goes into effect today, July 1, 2014. It is Canada’s attempt at limiting the amount of online spam people in Canada receive. CASL affects any advertising done over SMS, text messaging, email, or voicemail to anyone in Canada. The penalties for failing to follow the law are steep (up to $1 million in fines for individuals and $10 million for other persons, such as corporations), and any company wishing to advertise to someone in Canada will be impacted, even if it is based in the US.

What do you need to do to be in compliance?

  1. You must have implied or express consent from the consumer to advertise to them.
  2. You must clearly identify yourself and anyone else on behalf of whom the message is sent.
  3. In every message you send, you must provide a way for the recipients to unsubscribe from receiving messages in the future.

Let’s break point 1 down. The consumer must consent to the form of advertising, meaning the consumer must choose to receive it. You can’t have a checkbox saying ‘yes, please send me the latest information’ checked by default on your site; the consumer must choose to check the box themselves. And, you can’t fool the consumer into consenting to advertising by sending it bundled with other consent requests – each consent request for a different purpose must be sent separately.

The tricky part of the consent section of the law comes into play with ‘the installation of computer software programs.’ The law states that a consumer is considered to have expressly consented to the installation of a cookie, HTML code, or JavaScript code if the consumer’s conduct suggests a ‘reasonable belief’ that they consented to the program’s installation. Putting that in layman’s terms, the consumer is OK with getting “cookied” if their behavior suggests that they are. How businesses are supposed to determine this ‘reasonable belief’ standard is not yet clear in the law and most likely will be determined as the law is applied against businesses and individuals. However, this section of the law goes into effect on January 15, 2015, so there is still a little time to get clarity on this point from your friendly media lawyer.

Points 2 and 3 for compliance are pretty straightforward – you need to include valid contact information in anything you send and you must give the consumer a way to withdraw from receiving that advertising, whether by providing an email address or a web link where they can indicate their desire to opt out. For both of these, the information you provide must be valid for 60 days after you send the advertising to the consumer.

Basically, what you really need to know is that the law is changing for advertising online to consumers in Canada, and you need to be prepared. Or be prepared to pay a hefty fine.

For a look at the legal jargon, you can find the law here. If you need more guidance, the CRTC is another helpful source.


Travel Websites Lagging it on Mobile

Wed, 06/25/2014 - 09:22

Today, we released the latest edition of our Mobile Experience Scorecard report on the 100 most visited travel destination and accommodation websites based on data from Hitwise. The results reveal that a lot of travel sites are still failing to optimize for smartphone devices. Only 8 of the 100 sites used responsive web design, compared to 67 that served dedicated mobile sites and the remaining 25 that continued to serve the desktop version of their sites.

The average load time for the 100 travel sites was 2.64 seconds, exceeding Google’s recommended time of 1 second. However, load times varied widely depending on the type of site format used, with dedicated mobile sites loading significantly faster than both desktop and responsive web design sites.
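Load time is easy to spot-check yourself, though a simple probe only measures time to first byte rather than the full render time that figures like these reflect. A crude sketch:

```python
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    """Crude load-time probe: seconds until the first response byte
    arrives. This understates full page-load time, since it ignores
    rendering and subresources, but it's useful for quick comparisons
    between site formats."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # wait for the first byte of the body
    return time.monotonic() - start
```

To approximate the mobile experience, you would also need to send a smartphone User-Agent header, since many sites redirect mobile visitors to a different (often faster) page.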

To come up with a score for each site, we evaluated each mobile homepage on a variety of features that improve the mobile experience. These included a search function, click-to-call, sign in, social media buttons, and app download buttons. While these elements require minimal effort to set up on a site, many of the sites failed to include them on their mobile pages.

Download the full report to see more data on the mobile sites of the top 100 most visited travel sites in the United States.