Imagine a world where you can monitor your oven settings after leaving the house, and even shut the appliance off remotely if need be. Perhaps you’ve forgotten to lock your doors and fear intrusion? In this world, one simple click lets you secure your home even after departure, control your smoke and carbon monoxide detectors, and even switch off the lights! That imagined world? It’s ours, right now.
By now you have probably heard of something called the Internet of Things (IoT). Simply put, the IoT is the technology described above: it connects the electronic devices in your home, office, or anywhere else directly to the Internet, giving you, the user, remote control over those environments.
Speaking of homes, I was recently out window shopping for a new home myself. During my tour of a rather expensive house, I noticed a device on the wall labeled NEST. When I inquired about this device, the Realtor explained that it was used to control all of the home’s electronic devices from an app on the owner’s smartphone. An app, controllable from anywhere in the world, astounded me, and I realized our world is about to become much more connected by way of our surrounding devices. Bigger still, the role these devices will play has yet to be determined.
Attracting the BIG players
This technology is so real that Google, Samsung, and a slew of other brands have joined forces to standardize the IoT. This collaboration of companies is called the Thread Group. According to an article on Business Insider, this group’s goal is to develop “Thread, a new IP-based wireless networking protocol, which will enable devices to connect into a more open, secure, and low-power wireless mesh network”. This technology can be applied across many different industries, including healthcare, aviation, entertainment, transportation, and home automation. These categories barely scratch the surface of the IoT’s potential, which extends to any industry where data collection, monitoring, and control could prove beneficial.
Chip manufacturers like Texas Instruments and Intel could see a huge upswing from the adoption of the IoT, as could internet-security companies like Cisco, Barracuda Networks, and Check Point Software. Moreover, digital marketers and advertising agencies could see an increase in engagement metrics as products become increasingly interactive. For example, television viewers could, with the click of a button, purchase whatever product captures their interest on screen. The list of this technology’s possible applications is virtually endless.
Protecting Your Home Will Take On a Whole New Meaning
Of course, modern concerns for security and privacy stand to accompany this new technology’s arrival. By the year 2020, a projected 20 billion devices will have connected to the IoT. However, connecting more devices to the internet increases the chances of hacking and other digital compromises. Hewlett-Packard offers a security check system called Fortify on Demand to protect against these concerns. According to a study by HP, “the company’s Fortify application security unit conducted an analysis of the 10 most popular consumer [IoT] devices on the market and found 250 different security vulnerabilities in the products, for an average of 25 faults each. They were from the manufacturers of TVs, webcams, home thermostats, remote power outlets…” “What’s happening,” says Mike Armistead, VP and general manager of HP’s Fortify unit, “is that manufacturers are rushing to get their products on the market without doing the harder work of locking their devices down against the most basic kinds of attacks”.
With that in mind, ask yourself: are we prepared for the kind of change that an increased number of connected devices will bring? While the application of the IoT continues to take shape, we need to shift our perception of, and concerns about, connected devices. What were once thought of as dumb devices may soon become a network of interconnected, living devices that communicate with each other and make intelligent decisions, dare I say, apart from human interaction and delegation; a possible Skynet of sorts. Ultimately, whether you feel excited about the Internet of Things revolution or skeptical due to looming security concerns, one thing is certain: we no longer have to dream or imagine. This technology is here to stay.
Share your thoughts and comments below to let us know how you feel about The Internet of Things.
Last week, Google announced that HTTPS is now a ranking signal, a change for which Matt Cutts gave his personal endorsement months ago. Although the official announcement indicates that HTTPS is only a “lightweight signal” for now, its effects may become stronger in the future. The following are site update considerations to prioritize in the interim:
What Does this Mean for Brands?
For brand leaders, Google’s announcement signals yet another available avenue for getting a leg up on competitors. At The Search Agency, we see this announcement as a precursor to more expansive Google initiatives to come. Thus, staying ahead of the curve by responding to Google’s current update will prove beneficial if this change does indeed become a signal that weighs heavily among ranking factors. Still, more impactful changes exist that can be implemented with less effort and at lower cost. As Google’s John Mueller indicates, this is not something we need to drop everything and implement right away, so it should be prioritized accordingly among other site updates.
The changes required to transition existing URLs to the HTTPS structure can seem overwhelming. In reality, however, they are not much different from those necessary for any other site migration. First, e-commerce sites should already be using HTTPS on pages requiring encryption, such as shopping carts and checkout pages, so for those sites the change may be as simple as extending that same security site-wide. For non-e-commerce sites, once an SSL certificate has been set up and tested (we recommend following SSL best practices to ensure security and to verify your server can hold up under the additional processing requirements), there are a number of steps to take to ensure no value is lost in organic search. Following these practices, switching to HTTPS resembles most other site migrations. Here are some of the high-level steps to consider in such a migration:
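One of the core migration steps is mapping every legacy HTTP URL to the HTTPS URL it should 301-redirect to, so that link equity carries over. As a minimal, hypothetical sketch of that step (the domain and URL list are placeholders, not from any real site), something like this could generate the map that feeds your server’s redirect rules:

```python
from urllib.parse import urlsplit, urlunsplit

def to_https(url):
    """Rewrite an http:// URL to its https:// equivalent, leaving others untouched."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        return url  # already secure, or a relative/other-scheme URL
    return urlunsplit(("https",) + tuple(parts)[1:])

def build_redirect_map(urls):
    """Map each legacy HTTP URL to the HTTPS URL it should 301 to."""
    return {u: to_https(u) for u in urls if urlsplit(u).scheme == "http"}

# Hypothetical legacy URLs pulled from a crawl or XML sitemap
legacy = ["http://www.example.com/", "http://www.example.com/cart"]
for src, dest in build_redirect_map(legacy).items():
    print(f"{src} -> 301 -> {dest}")
```

The resulting map would then be translated into your server’s redirect configuration and spot-checked after launch to confirm each legacy URL returns a single 301 to its HTTPS counterpart rather than a redirect chain.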
Some other things to keep in mind:
Heartbleed – In April, the Heartbleed bug showed the corporate community that even those technologies designed to enhance security can themselves be vulnerable to system breaches. Keep in mind that leak opportunities are still possible, reinforcing the importance of staying up-to-date with any and all security protocol advancements.
Page Speed – Technically speaking, the extra round-trips required by HTTPS may have a slightly negative effect on page load time, which runs counter to Google’s interest in ranking sites with faster load times. Will Google take this into consideration when factoring in page speed?
Mobile Devices – Especially on older mobile devices, the possibility remains that certain, less common SSL certificate authorities may not be recognized. Therefore, cross-platform testing should be performed to catch any such issues.
Overall, potential page speed drains aside, these changes stand to boost rankings and better secure both brand and consumer interactions on the Internet. Implementing HTTPS is a safety precaution that’s well worth the up-front effort to ensure site security and compliance: a little work, and we’ll all be breathing easier. See? That wasn’t so bad.
Since Pigeon swept the search scene early last week, The Search Agency’s thought leaders have been evaluating Google’s changes, drawing their own conclusions concerning how these updates stand to affect the quality of search moving forward. Here is a sampling of their thoughts.
Local Products and Services SERPs:
The Search Agency started its Pigeon evaluation examining the changes made to local products and services searches. Our team’s findings point to the notion that, under Pigeon, semantic markup will now not necessarily improve SERP rank. Mary Hayes, The Search Agency’s Earned Media Content Manager, investigated this notion, running her own search for car wash west la. She found:
“A search for car wash west la yields a seven pack (surprising, since we’re mostly seeing only three or four packs). The resulting map also looks to cover a decent area.”
Mary then refined her search to car wash 90064, and while she got the same 7-pack, the resulting map was smaller and paid ads took up much more real estate underneath said map. As she scrolled down, Mary noted that Yelp takes this search’s top three organic SERP spots, further concluding that while semantic markup may not improve rank, it stands to at least help main domains stand out against their Yelp counterparts.
The Search Agency’s Gregory Sidor performed his own search for oil change as seen below.
From these results, Greg concluded, “To me, this seems like an example of Pigeon gone wrong. Three different Jiffy Lube URLs [appear] in the result, and one is even duplicated in a right-rail ad. I would expect Google to improve on this.”
Like Greg’s oil change query and Mary’s car wash queries, Kirby Burke, The Search Agency’s Earned Media Manager, looked up a couple service queries as well: electrician and plumber. Both results returned maps which reinforce local intent. They also completely pushed all organic results below the paid ads and six packs:
Furthermore, adding a geo term to his query (ex: electricians west los angeles) pushed the six pack down and front-loaded three Yelp results. To Kirby, the SERPs seemed identical after he flipped the service and geo (ex: west los angeles electrician).
These findings demonstrate that under the new Pigeon implementation, Yelp still dominates local listings and is considered very relevant for specifically targeted queries. Managing profiles in Google My Business and Yelp is no longer optional if SMBs want to compete in local SERPs. SMBs will need to make sure they are listed in influential local directories and are getting active, positive reviews there. Similarly, it’s recommended that SMBs keep an active Google+ business profile with high-quality images (which could get picked up by Google Carousel) and encourage customer reviews. These are the staples of natural, fresh content that has a positive association with the main domain.
Honing focus further on SERP maps themselves, Kirby compared queries for restaurants near me against restaurants west los angeles. He found:
“Queries for restaurants near me and restaurants west los angeles both provide very targeted results, within a few miles of the location. However, a search for restaurants los angeles provides broader results that canvas the greater LA area. Thus, the maps on SERPS appear to be relative to the query.”
Embedded within the wide scope of Pigeon updates, The Search Agency team also noticed that some video thumbnails had disappeared from SERPs, replaced instead by fuller-scale Mega Video ads, i.e. Google properties. Matt McKinley, Earned Media Manager, adds, “Mega Videos are a way for Google to promote YouTube. Brands who are optimized and active on YouTube can benefit from this [change] when users do a specific search for a video such as nike the last game rather than more general searches such as soccer videos, which still result in thumbnails.”
Promoting these larger scale ads also points to Google finding yet another opportunity to engage users within its own properties, expanding avenues for data accumulation.
Kirby continued to evaluate Pigeon changes, investigating what factors affect the new SERP carousel results.
“The listings appear to be an aggregate of Google My Business and Zagat (a Google-owned property) listings. I verified carousel appearance for keywords like bars, nightclubs, museums, movies, hotel (note the check-in and check-out date options) and specific food items like pizza, burgers and ramen.
The initial carousel results are not sorted by name, rating, price or any other factors I could discover. Most of the listings have Google My Business profiles and some have Zagat pages too. So having an active presence on both these platforms is highly encouraged.
I also found carousel results for entertainment queries, such as list of hip hop albums, list of popular movies, list of comedy tv shows and list of nba teams, which seem to function differently. Underlined keywords are interchangeable with other, relevant keywords, such as specific genres and sports leagues.”
Kirby noted that “the carousel also has options to filter results. For example, here are carousels for 2013 Heavy Metal albums and burger places that are cheap ($ or $$), have a rating of 4.0+, and are open now:”
“The creative works are especially of note because Google lists them as ‘[Creative Work] frequently mentioned on the web.’ Google is obviously tracking these creative works as entities, most likely through semantic markup, and tracking the frequency of occurrence in the index.
Today, the carousel is featuring albums and films. In the future, this could expand to other entities, such as products, places, and events. This reinforces the need to designate entities with semantic markup.”
Overall, Kirby found that “Clicking a carousel item does not lead to product, business, or review pages. Instead, it keeps the user on the search engine results page, updates the query, and displays results for the entity, all while maintaining the carousel at the top of the browser. This allows the user to browse and compare multiple listings within the same window.”
The Search Agents agree that, implemented this way, the carousel seems to be yet another Google tactic to lengthen user sessions while affording Google the opportunity to gather more data during comparison searches.
All things considered, Matt McKinley offered best practices to follow regarding these Pigeon updates, saying, “I don’t think [Pigeon] changes local content strategy, just the way it is being ranked. The new local search algorithm has been tied to more traditional Web standards, so you still need quality, natural content with a strong backlink profile to build domain authority and rank locally. If you have multiple locations, having a dedicated local section with unique, relevant content and the targeted city/state in the meta data is recommended. Do not duplicate content across your location pages with the city/state as the only unique factor.”
While Pigeon may have landed in our laps in a hurry, seemingly a pest, The Search Agency’s thought leaders find most of its updates to be in line with Google’s commitment to further enhancing its user experience. Though SERP kinks may still need working out, with Pigeon, Google continues to prioritize engaging, readable content in its rankings. Google has also found ways to lengthen user sessions, using both in-engine comparison shopping experiences like its carousel and increased visibility for subsidiary platforms like Mega Video from its YouTube property. These changes provide Google increased data collection opportunities, helping it better answer the question: what do searchers want?
Who knows, Pigeon may be the wings on which Google soars to new heights. Only time will tell.
Monitoring your site in Google Webmaster Tools regularly is critical to the success of your site. For the purpose of this article, we are focusing specifically on monitoring the number of pages on your site that appear in Google’s index.
This number will naturally fluctuate and increase or decrease as you add and remove pages to your site. However, we are focusing on this from a troubleshooting perspective. This article is a resource to help you identify significant, unplanned spikes and dips in the number of indexed pages.
This chart is a basic overview of some standard troubleshooting to isolate and resolve the root cause(s) of indexation issues. Supporting documentation and resources can be found below.

Check Index Status
Index Status is a module in Google Webmaster Tools that displays the total number of indexed pages over a rolling 90-day period. Look for any significant spikes or dips in the number of pages.
Index Status is located under Google Index in the GWT sidebar navigation
If the number of indexed pages in GWT dips, that is an indicator that something is preventing pages on the site from being crawled, or that certain pages are not being indexed by Google. This can be the result of the following:
Resolving these errors is important because you want to ensure that all of your content is visible to crawlers so it can be indexed and accessed by users.
Site Errors in GWT refer to errors that prevent the crawler from accessing the site. These errors include DNS communication issues, server connectivity (server is down or Google is being blocked) and the ability to retrieve the robots.txt file.
Site Errors is at the top of the Crawl Errors module, under Crawl in GWT
Site Errors are also displayed in the Crawl Errors section of the GWT Site Dashboard.
The three types of Site Errors are displayed in tabs at the top of the window. Tabs will have a green check mark if no errors are found or a red exclamation point to designate errors. Click on each tab to display the specific errors.
DNS Errors are produced when Google is not able to successfully recognize the hostname of a site or the request times out.
Server Connectivity Errors occur when Google is having trouble connecting with the server, losing an established connection or receiving a bad response from the server.
Robots.txt Failures are caused by the robots.txt file not returning a 200 response or a 404 response (because the site doesn’t have one). If Google cannot confirm whether or not a robots.txt file exists, it will postpone crawling the site to mitigate the risk of crawling and indexing content which is restricted by the robots.txt file.
Blocked URLs in GWT are caused by robots.txt commands that prevent Google from crawling and indexing certain pages and directories.
Blocked URLs is under Crawl in GWT
This module includes an overview of the robots.txt file and allows it to be modified and tested. It does not modify the actual robots.txt file though. So, if necessary, be sure to update the actual robots.txt file on your server.
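When auditing Blocked URLs, it can help to test specific paths against the robots.txt rules outside of GWT as well. Python’s standard library includes `urllib.robotparser` for exactly this; here is a minimal sketch using a hypothetical robots.txt supplied inline so the check runs offline (in practice you would point `set_url()` at your live `robots.txt` and call `read()`):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; replace with your site's actual rules.
robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check a few representative paths against the rules, as Googlebot would see them.
for path in ("/products/widget", "/cart/checkout", "/search?q=widget"):
    verdict = "allowed" if rp.can_fetch("Googlebot", path) else "blocked"
    print(f"{path}: {verdict}")
```

Running a list of your most important URLs through a check like this before and after editing robots.txt is a quick way to confirm that valuable pages are not accidentally blocked from crawling.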
To ensure quality content is being delivered to its users, Google created its Webmaster Guidelines. These guidelines are enforced by both manual reviewers and automated functions such as the Panda and Penguin algorithms.
More information about how Google manually reviews sites can be found in our blog article, Understanding Google Search Quality Rating Guidelines.
Manual Actions are penalties which are applied after a manual review of the problem. These penalties can be isolated to a specific section or applied to the entire site.
Manual Actions is under Search Traffic in GWT
Once the underlying issues are fixed, a Manual Action can be resolved by submitting a Reconsideration Request to Google for review.
Algorithmic Penalties occur automatically when changes to Google’s algorithms go live. Unfortunately, there is no reporting of this kind of penalty in GWT.
If you believe you’ve been impacted by Algorithmic Penalties, compare your Index Status trends to see if they correspond with the algorithm updates. A comprehensive list of updates can be found in Moz’s Google Algorithm Change History article.
If the drops correspond to the updates, then you will need to review the details of the update and find the violations of the Quality Guidelines that correspond with the update (content, links, etc.).
Unfortunately, Algorithmic Penalties cannot be resolved with a Reconsideration Request. After the issues are resolved, the affected URLs will be crawled again and added back to the index if they meet the Quality Guidelines. You can expedite this process by manually submitting URLs to Google using the “Fetch as Google” feature in the “Crawl” menu.
If the number of pages has increased, this means your indexed pages are being saturated with additional pages. These pages are usually the result of the following:
Resolving increased pages is important because it ensures the search engines are crawling and indexing valuable content instead of wasting their bandwidth chasing error pages.
URL Errors are also located in the Crawl Errors module, just like Site Errors. URL Errors provides specific examples of URLs with errors or bad URL requests to the site.
URL Errors is at the bottom of the Crawl Errors module, under Crawl in GWT
First, select the tab that corresponds to the specific error. Then, look below the chart for the specific URLs. Clicking on them will provide details about the errors and from where they are linked.
Specific URLs are listed below the URL Errors graph
Clicking a specific URL will provide Error details and the Linked from information
Sudden spikes in errors can have several causes, such as a CMS bug that generates malformed URLs or an increase in external links pointing to non-existent URLs on the site.
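When a spike appears, tallying the exported URL Errors by status code and by site section can quickly point to the responsible CMS area or link pattern. A minimal sketch of that triage, using hypothetical rows in place of a real GWT export:

```python
from collections import Counter

# Hypothetical rows exported from the GWT URL Errors report: (url, http_status)
crawl_errors = [
    ("http://www.example.com/blog/old-post", 404),
    ("http://www.example.com/blog/missing", 404),
    ("http://www.example.com/api/feed", 500),
    ("http://www.example.com/blog/tmp", 404),
]

# Tally errors by status code, then by first path segment, to spot
# a site section that is generating a disproportionate share of bad URLs.
by_status = Counter(status for _, status in crawl_errors)
by_section = Counter(url.split("/")[3] for url, _ in crawl_errors)

print("By status:", dict(by_status))
print("By section:", dict(by_section))
```

In this hypothetical export, the 404s cluster under `/blog/`, which would suggest checking that section’s templates or recent content deletions first rather than auditing the whole site.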
HTML Improvements displays any issues with title tags and meta descriptions that are duplicated, missing and not properly optimized. For the purpose of researching spikes in indexed pages, we will focus on the duplicated tags.
HTML Improvements is under Search Appearance in GWT
Meta descriptions or titles that span across several versions of the exact same page can be an indicator of an increase in duplicate pages being generated on the site.
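A quick way to surface that kind of duplication outside of GWT is to group crawled pages by their title tag and flag any title shared by multiple URLs. Here is a minimal sketch using only the standard library; the page HTML and URLs are hypothetical stand-ins for a real crawl:

```python
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the contents of the first <title> tag on a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

def find_duplicate_titles(pages):
    """pages: {url: html}. Return {title: [urls]} for titles used by more than one URL."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen[parser.title.strip()].append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}

# Hypothetical crawl: two URL variants serving the same page with the same title
pages = {
    "http://example.com/shoes": "<html><head><title>Shoes</title></head></html>",
    "http://example.com/shoes?sort=price": "<html><head><title>Shoes</title></head></html>",
    "http://example.com/hats": "<html><head><title>Hats</title></head></html>",
}
print(find_duplicate_titles(pages))
```

Here the duplicate title exposes a URL-parameter variant of the same page, the classic source of the indexed-page inflation this section describes; the same grouping approach works for meta descriptions.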
Continuously monitoring Index Status is one way to ensure your site is consistently performing and all of your content is available. This also allows you to identify and resolve any issues quickly to get your site back online or mitigate any spammy or low-value content.
We recently sat down with Steve Sirich, GM of Bing Ads Product Marketing, for this edition of Inside The Search Studio.
How long have you been working with Microsoft?
I’ve been with Microsoft for a little over 15 years. I started in the field organization, where I was involved in sales leadership, and then moved to corporate headquarters in Seattle. Over the past ten years I have had several different roles – running marketing, operations, and business development teams, all in the digital/online services space.
What made you become interested in search marketing/advertising?
I actually became more intimately involved in that side of the business in 2009. Prior to that, I was very involved in display and video, so it made sense. At the time, Search was emerging as a strategic growth area for Microsoft and the industry as a whole, and I was interested in exploring that arm of the company. In fact, I helped to write the Yahoo deal, which included a technology deal with Bing services.
What has been the biggest surprise for you in your tenure at Microsoft?
There are a couple things that have surprised me, but perhaps the one that sticks out is the growth that we’ve achieved with Bing in the last five years, which has been outstanding. Today we are at 20% market share, whereas in 2009 we were trending down at 7% market share. At the time, we knew we could reverse that trend, but over the last five years an average of 2% growth per year is great, and possibly more than we anticipated.
What has been the project that you are most proud of during your tenure?
In the 15 years I’ve been here, one of the most satisfying projects I’ve worked on has been the Yahoo relationship and partnership. Not only what we outlined, but also what we executed. In 2009, we were operating adCenter, and set out a scope with Yahoo to transition as many as 30 markets worldwide across search and the ad platform, and to make a shift in how we were selling.
We knew that there were multiple layers of complexity and a short time to execute globally. Today we operate in 35 markets with Bing ads – 33 of those are Yahoo partnerships. It was very rewarding to have such a lead role in the deal and execution.
What are some of the biggest themes that Microsoft is tackling in the next quarter when it comes to search marketing and advertising?
At Microsoft, two major themes are mobile and cloud, in relation to the function of devices and a services strategy. A primary theme we are shifting toward is a future in devices and services. We’re departing from the OS/Office Suite model and moving toward the idea of how we will profit relative to tech – the disruption to the current landscape is devices and services.
Core to these changes is economics, more specifically a holistic means of monetization like advertising, transactions (services like Xbox movie purchases), and subscriptions (licensing fees). We are profoundly shifting from classic packaged/licensed software to a mobile and device play. Google has had a lot of success with a services model, and we’re looking to do the same.
As far as search is concerned, we’re continuing to focus on customer experience as we think about the ongoing innovation of Bing ads. We’ve been investing in areas like user simplicity, driving industry standardization on electronic IO, and removing friction in what consumers want to buy. When it comes to reporting and analytics, today we measure customer campaign performance in hours, but soon we’ll move to less than 30 minutes.
We understand Google’s large market share, so we’re trying to make it easy for consumers to spend time with us. Microsoft recognizes the high standards Google has driven when it comes to AdWords, and we want to show the value of spending time on Bing Ads.
It’s important to understand that Microsoft thinks of search beyond the query box/destination search. Google has a strong presence, but search is more than just a web destination; it is a part of all of the services you engage in with digitally. Search is very pervasive across all our assets – Xbox, Windows 8.1, and our mobile phone – and can be the intelligent fabric that powers our devices and overall experiences. Search as a platform and technology is the foundation of our services moving forward.
What has surprised you in terms of the way search has evolved? How is/did Microsoft address(ing) this?
Search has taken on a more natural interface, weaving itself into our lifestyle and becoming part of our natural rhythms. We recently introduced Cortana – a voice enabled experience anchored by Bing technology that operates as a personal digital assistant – as an answer to Siri. I’ve also seen a shift in search as a reaction to our behaviors switching between our devices and screens that has become more personal. Search is now a means for technology to learn personal interests and become more intelligent when it comes to your daily tasks.
How has social impacted search marketing and advertising?
Social is a big part of Bing’s strategy because we have a solid relationship with Facebook. Social makes marketing and advertising a much more personal experience because it’s an easy way to affirm and validate activity and interests. People naturally refer to a community when they make decisions, so it’s important for marketers and advertisers to have as much of a presence as possible.
Today, social is a big part of the Bing experience in terms of providing relevant info through both web searches and the social graph. We will continue to evolve search based on social’s ability to showcase what colleagues, family, and friends think about any given experience.
What are three predictions you have for search marketing/advertising in the last half of 2014? What are your predictions for the next 5-10 years in search?
1.) Today’s world has unprecedented digital experiences. We have to drive well-integrated, seamless experiences that are intelligent across a number of devices. PC at work, tablet at home, and phone as you travel – the experience needs to shift to the needs of the device. Currently, cross-device experiences are somewhat fragmented, but we will soon need to simplify and unify them into one seamless product.
2.) We will continue to see the industry push toward more human and intuitive experiences around natural interfaces. Today it’s voice, but tomorrow you can only imagine what the experience will be to engage with devices in a natural way.
3.) The convergence of media and experiences will continue to push across search, display and TV. There will be a significant conversion – especially TV – to an on-demand model across the digital landscape.
What issues or news have sparked some of the biggest conversations at Microsoft when it comes to search advertising?
The biggest realization for us over the last 12 months is that we won’t win in search if we focus on search as a destination experience. Search is evolving into a platform, an intelligent fabric that will integrate into multiple experiences to produce a more profound experience for consumers and separate us from the competition.
What do you think are the biggest challenges facing this industry?
1.) Fragmentation is more prevalent than ever. I have been involved in digital marketing since 1998 and seen it move in several different directions with the explosion of sites and services available to marketers today. Advertisers and agencies are challenged because there are so many experiences available to buy. Programmatic is starting to solve for that, but it still has a long way to go. Microsoft has to make sure it’s connecting customers in the best, most profitable and profound way to desired audiences.
2.) The lack of standardization still makes it difficult for advertisers and agencies. Publishers offer many experiences to engage with audiences, but without standardization, it’s difficult to manage all of these relationships.
What is the most interesting search marketing/advertising campaign that you’ve seen? How about social?
In February 2013, Dodge Ram rolled out an emotive ad – “Year of the Farmer”, narrated by Paul Harvey – that linked into an integrated campaign that involved search. It appealed to your sense of the fabric of America, all while bringing awareness to Dodge Ram and their charity, FFA, for which they raised $1M. It’s a great example of a brand play that used search after showcasing what they wanted to achieve with the Super Bowl, and carried it out through the year. They maximized the footprint opportunity by using all of the different ad formats and innovations on both Google and Bing ads.
Any advice for search marketers using Bing Ads?
I recommend marketers continue to stay very close to Bing and Bing Ads in terms of what we are doing with innovation; we’re actively responding to the customer voice. There are actually a lot of things that we’re rolling out in the next six months that came directly from customer feedback. For example, today we offer 50,000 keywords, but by the end of the calendar year we will have 1M through the Bing Ads UI.
We’re also focusing on Electronic IO – Google has offered this, and we’re working to make it easier for customers. We’re spending time on innovation in our ad footprint – site, location and call extensions are available now, and we will continue to expand this so customers can connect with consumers in other ways like introducing zip code targeting.
Overall the experience will continue to improve. I encourage marketers to stay tuned and stay close.
Any final thoughts?
Microsoft is continuing to invest in Bing as an experience and a platform. We’re focusing on innovating for the customer, agency and advertiser with Bing Ads, and also for Bing as it relates to consumers. Our next OS system will feature Bing as that experience.
With the investment Microsoft is making in Bing today, we enjoy 19% query share in the US, and as we continue to invest, it is expected to grow. We plan to double down on what search means for both the device and service experience in what Microsoft is offering.
Today we’ve released our quarterly state of paid search report for Q2 2014, and the data demonstrates solid growth across devices and search engines. Total clicks grew 36%, spend increased 22%, CTR jumped 58%, and CPCs fell 10%.
Much of the growth can be attributed to increased advertiser focus on mobile devices, including both tablets and smartphones. Spend share on mobile devices continues to climb on a quarter-over-quarter basis, with overall mobile spend share reaching 29%.
Mobile growth can be attributed in large part to smartphones, which are gradually becoming a “search anytime” device. Data on mobile impressions across the day and across the week demonstrate that users are increasingly using smartphones consistently throughout the day, no longer reserving smartphones for the evenings and weekends.
The increased usage of smartphones also becomes apparent with the large jump in smartphone click share on Google Product Listing Ads (PLAs)—in Q2 of this year, smartphones accounted for 12.7% of all PLA clicks. This indicates that consumers are becoming increasingly comfortable using smartphones as shopping devices.
Check out the full State of Paid Search Report for Q2 of 2014 for more graphs and analysis on the latest trends.
Watching overtime of the World Cup match between USA and Belgium yesterday afternoon was perhaps the only thing more stressful than the hour and a half wait our entire LA office endured for our order from Domino’s pizza.
There we were, 43 minutes into the match and 50 very hungry Americans. How did this happen? Ask Domino’s. Apparently Domino’s didn’t plan ahead for the demand during a rather popular sporting event such as, oh I don’t know, the World Cup, and a match that just happened to coincide with lunch time (1PM). How could you not know? How could you not at least factor in a good response or backup plan if you absolutely get inundated with requests for pizza and wings?
Let’s set the stage; we knew on Monday afternoon that we would be hosting a work/watch party at the office. We called to place the order late Monday afternoon, then called again Tuesday morning to confirm the order and its delivery time of 12:30PM.
We called at 1PM to see what the status was.
We called at 1:30 to get an ETA on the driver delivery.
We called at 2PM to try to speak to the manager.
Finally, at 2:15PM pizza arrives. Is it the correct order? Who cares?
Far be it from me to know the inner workings of Domino’s, but, as a marketer, wouldn’t it have been better to anticipate a large volume of orders around a popular sporting event? Based on interest over time* from Google Trends, the relative scores were: “Domino’s” (4), “pizza” (21), “world cup” (100), meaning “world cup” was at its peak of search interest today. Look at “pizza” over time; the term only continues to slowly rise and maintain popularity.
Now, what if you overlay this with another live, prominent sporting event like “super bowl”. Here’s what you get:
Now try the search term “olympics”. More search blips:
The point is, there are always major sporting events and there will always be pizza. There is no excuse not to be ready, no reason not to plan your advertising and search marketing campaigns around major real-time events. Hockey has the Stanley Cup, baseball has the World Series, and soccer has the World Cup going on right now! If you’re a Quick Service Restaurant (QSR) or a pizza delivery business like Domino’s that offers delivery and you’re not capitalizing on the search opportunity, you’re missing the mark.
*Interest over time: numbers represent search interest relative to the highest point on the chart. If at most 10% of searches for the given region and time frame were for “pizza,” we’d consider this 100. This doesn’t convey absolute search volume.
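To make the footnote concrete, here is a minimal sketch of how that relative scoring works: every value in the series is rescaled so that the peak equals 100. The raw counts below are entirely hypothetical, invented for illustration.

```python
# Sketch of Google Trends-style relative scoring: each raw value is
# scaled so the series peak equals 100. Raw counts are hypothetical.
def relative_interest(raw_counts):
    """Scale a series of raw search counts to 0-100, relative to its peak."""
    peak = max(raw_counts)
    return [round(100 * count / peak) for count in raw_counts]

# Hypothetical weekly search counts for two terms
pizza = [2100, 2200, 2050, 2300]
world_cup = [500, 800, 9800, 11000]

print(relative_interest(pizza))      # each value relative to pizza's own peak
print(relative_interest(world_cup))  # the spike scores 100; earlier weeks near 0
```

Note that each term is scaled against its own peak, which is why a steady term like “pizza” hovers near 100 while a spiky term like “world cup” sits near zero outside the event window.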
(Caveat: I am not a lawyer, nor have I played one on TV. However, I am a law graduate and have passed a bar exam once upon a time, so my reading of the law is based on that background. Please consult with a media regulatory lawyer for any compliance issues or questions with this law).
The Canada Anti-Spam Legislation, or CASL, goes into effect today, July 1, 2014. It is Canada’s attempt at limiting the amount of online spam people in Canada receive. CASL affects any advertising sent over SMS/text messaging, email, or voicemail to anyone in Canada. The penalties for failing to follow the law are steep – up to $1 million in fines for individuals and $10 million for businesses and other organizations – and any company wishing to advertise to someone in Canada will be affected, even if you are based in the US.
What do you need to do to be in compliance?
In short, CASL imposes three main requirements:
1) You must obtain the consumer’s consent before sending the advertising.
2) You must include valid contact information in anything you send.
3) You must give the consumer a way to unsubscribe from receiving the advertising.
Let’s break point 1 down. The consumer must consent to the form of advertising, meaning the consumer must choose to receive it. You can’t have a checkbox saying ‘yes, please send me the latest information’ checked by default on your site; the consumer must choose to check the box themselves. And, you can’t fool the consumer into consenting to advertising by sending it bundled with other consent requests – each consent request for a different purpose must be sent separately.
Points 2 and 3 for compliance are pretty straightforward – you need to include valid contact information in anything you send and you must give the consumer a way to withdraw from receiving that advertising, whether by providing an email address or a web link where they can indicate their desire to opt out. For both of these, the information you provide must be valid for 60 days after you send the advertising to the consumer.
Basically, what you really need to know is that the law is changing for advertising online to consumers in Canada, and you need to be prepared. Or be prepared to pay a hefty fine.
Today, we released the latest edition of our Mobile Experience Scorecard report on the 100 most visited travel destination and accommodation websites based on data from Hitwise. The results reveal that a lot of travel sites are still failing to optimize for smartphone devices. Only 8 of the 100 sites used responsive web design, compared to 67 that served dedicated mobile sites and the remaining 25 that continued to serve the desktop version of their sites.
The average load time for the 100 travel sites was 2.64 seconds, exceeding Google’s recommended time of 1 second. However, load times varied widely depending on the type of site format used, with dedicated mobile sites loading significantly faster than both desktop and responsive web design sites.
To come up with a score for each site, we evaluated each mobile homepage on a variety of features that improve the mobile experience. These included a search function, click-to-call, sign in, social media buttons, and app download buttons. While these elements require minimal effort to set up on a site, many of the sites failed to include them on their mobile pages.
Download the full report to see more data on the mobile sites of the top 100 most visited travel sites in the United States.
If you have faulty redirects appearing in Google Webmaster Tools (GWT), now is the time to fix them! Faulty redirects DO impact rankings and now they’ll also impact CTR.
Google just released an update to their SERPs which will notify users that they might be redirected to a page they didn’t intend to go to.
What exactly is a faulty redirect, you ask? It’s when you re-route mobile traffic headed for a non-mobile-optimized page to your mobile-optimized home page instead. This often happens when only the home page and a select few high-value pages are mobile-optimized. How might this have happened? Your analytics probably showed that a significant amount of mobile traffic visited your desktop-optimized website only to quickly bounce or engage far less than your desktop users. You realized something should be done to stop the bleeding, but instead of fixing the problem the right way, you optimized only a handful of pages for lack of resources rather than the whole site.
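The fix Google expects is to send each mobile visitor to the mobile equivalent of the page they actually requested, falling back to the desktop page (not the homepage) when no equivalent exists. Here is a minimal sketch of that routing logic; the domain and paths are hypothetical:

```python
# Sketch of the "right way": send each mobile visitor to the mobile
# equivalent of the page they asked for, never blindly to the homepage.
# Domain names and paths are hypothetical.
MOBILE_HOST = "m.example.com"
MOBILE_READY_PATHS = {"/", "/menu", "/order", "/contact"}  # pages with mobile versions

def mobile_redirect_target(requested_path):
    """Return the URL a mobile user should land on for a given desktop path."""
    if requested_path in MOBILE_READY_PATHS:
        # An equivalent mobile page exists: preserve the deep link.
        return f"https://{MOBILE_HOST}{requested_path}"
    # No mobile equivalent yet: better to serve the desktop page as-is
    # than to strand the visitor on the homepage (the "faulty redirect").
    return None  # None = don't redirect

print(mobile_redirect_target("/order"))   # deep link preserved
print(mobile_redirect_target("/careers")) # no redirect; desktop page served
```

The key design choice is the `None` branch: serving the desktop version of an un-optimized page keeps the visitor on the content they asked for, which Google treats as acceptable, whereas a blanket homepage redirect is exactly what triggers the new SERP warning.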
Yep. So while everyone is jumping on the mobile bandwagon, you, for whatever reason, think mobile is just a fad that will soon go away (hint: mobile is here to stay). You still need to appease those demanding C-Level Execs as well as your mobile-savvy visitors so you decide that, instead of creating a mobile-friendly experience for your entire site, you’ll take a shortcut and only optimize the homepage, redirecting all mobile traffic to that one page.
Here’s essentially what you’re saying:
“Hello there, mobile user! I see you’re looking for a page on my site. I’d love to give you EXACTLY what you’re looking for but since I’m lazy and haven’t taken the time to implement a mobile-ready site, I’ll just simply redirect you to my homepage which is mobile-ready – cool shortcut ey?”
NOT! Why would you FORCE a potential customer back to the beginning of the conversion funnel? Why would you not give the best experience to your potential customer? You are trying to make money here, right?
Have you ever called a customer service line (remember, these lines are there to serve the customer, much like your site) and after selecting your option, are dropped from the call? I know! It’s like, WHAT just happened?
So what are your options? You can do one of the following:
1) Implement responsive web design across your entire site.
2) Build out a dedicated mobile version of every page, not just the homepage.
3) Remove the faulty redirect and serve the desktop page until a mobile equivalent exists.
Think about this when deciding which one is best for you:
Are you still using a dot matrix printer?
Are you still living in your parents’ basement?
Are you still using that old Motorola?
Yeah…neither is the rest of the world. This isn’t just a strong hint – it’s a huge push from Google to get your site in order and be mobile ready…NOW!
Need more information? Check out our white paper on Optimization Strategies for the Mobile Web. If you want to learn how to get your site to be responsive, read our white paper on Implementing Responsive Web Design. And if you need some real world examples of companies doing mobile right (and wrong), check out our Mobile Experience Scorecard research series.
Recently, Search Engine Journal published an article about how SEO has drastically changed. The article hits some very relevant points on what SEO looks like today. There’s no denying that search engine optimization has changed over the years, but one thing people tend to gloss over when they’re waxing poetic about the death, decay, or evolution of SEO is that in many ways SEO hasn’t changed.
Now that you’ve read my obligatory intro paragraph, here are 4 ways SEO has not changed.
Website Crawlability is Key
If search engines can’t crawl and index your website, it will rank for nothing. All the content marketing, PR efforts, infographic creation and authorship markup will be a huge waste of time if search engines can’t crawl your website. It always amazes me how many websites still struggle with maintaining a crawlable website. They either inadvertently orphan 50% of their content or somehow block their entire site from being crawled in their robots.txt file. It’s true that PR and content marketing are what typically move the organic needle, but having a crawlable website will always be step 1 in SEO.
Tips on Ensuring Website Crawlability:
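One quick check you can run is whether your robots.txt accidentally blocks important URLs, using Python’s built-in parser. This is a minimal sketch with hypothetical rules and URLs; in practice you would fetch your site’s live robots.txt and spot-check the pages that matter most:

```python
# Minimal crawlability check using Python's stdlib robots.txt parser.
# The rules and URLs below are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ["https://www.example.com/products/widget",
            "https://www.example.com/private/report"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "-> crawlable" if allowed else "-> BLOCKED")
```

Running a loop like this over your key landing pages takes minutes and catches the "somehow blocked the entire site in robots.txt" failure mode described above before it costs you rankings.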
On-page Content Counts
More and more I see people paying less and less attention to the mechanical details of on-page content. This worries me. Of course keyword stuffing doesn’t work anymore and quality of content outweighs quantity of on-page content, but you shouldn’t assume that just because a web page was written by an in-house copywriter or brand manager that all is well with the content. Great content can work against you if it’s not structured in an optimal way. Elements like alt text, image and video summaries, above-the-fold content and headers should be present and checked. Don’t abandon simple on-page optimization checks just because everyone’s talking about content marketing (it could be nothing but hype).
On-page Content Elements to Check:
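Checks like these are easy to script. As one example, here is a minimal sketch that flags `<img>` tags missing alt text using only Python’s stdlib HTML parser; the sample markup is hypothetical:

```python
# Quick check for <img> tags missing alt text, using the stdlib HTML parser.
# The sample page markup is hypothetical.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # alt attribute missing or empty
                self.missing_alt.append(attrs.get("src", "(no src)"))

page = """
<h1>Summer Menu</h1>
<img src="/img/pizza.jpg" alt="Pepperoni pizza">
<img src="/img/banner.jpg">
"""

checker = AltTextChecker()
checker.feed(page)
print("Images missing alt text:", checker.missing_alt)
```

The same pattern extends to the other elements listed above: walk the parsed page once and report anything structural (headers, above-the-fold copy, media summaries) that is absent.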
Meta Data Matters
For us old timers, we’ve seen the death of a few meta tags as ranking factors. First the meta keywords tag; then the meta description. But just because meta tags aren’t as gameable as they once were, it doesn’t mean they should be ignored. Meta tags still have an effect on organic search performance. Meta descriptions can still influence click-through rate. Meta keywords have been replaced by meta news keywords (something specific to those accepted into Google News). Title tags (although technically not a meta tag) are still a ranking factor. So don’t forget to check your meta data!
Meta Data Checks:
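A few of these checks can also be automated. The sketch below verifies that a page has a title within a typical display length and a meta description; the 60-character limit is a rough rule of thumb, not an official cutoff, and the sample markup is hypothetical:

```python
# Sketch of basic meta data checks: title present and within a typical
# display length, meta description present. The length limit is a rough
# rule of thumb; the sample markup is hypothetical.
from html.parser import HTMLParser

class MetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def problems(self):
        issues = []
        if not self.title:
            issues.append("missing <title>")
        elif len(self.title) > 60:
            issues.append("title may be truncated in SERPs")
        if not self.description:
            issues.append("missing meta description")
        return issues

page = '<head><title>Cheap Flights &amp; Hotels</title></head>'
checker = MetaChecker()
checker.feed(page)
print(checker.problems())
```

Even though meta tags are no longer the ranking levers they once were, a one-pass audit like this keeps the click-through-rate benefits of titles and descriptions from quietly slipping away.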
Organic Search Success Takes Time
One of the challenges with website optimization is that the results are never instantaneous. Although Google is much quicker at identifying website changes and updating its index, given the breadth of ranking factors and the steady stream of Google updates, it still takes a significant amount of time to see the fruits of your SEO labor. SEO is a journey, not a potato sack race. If you want instantaneous results, go sign up for Google AdWords.
Things to Keep in Mind
It is true that SEO has drastically changed. It takes much more to get on the first page. However, if you only focus on what’s changed and disregard what hasn’t, you’ll be missing a big piece of the SEO puzzle.
Today, Amazon and Twitter announced a partnership that would allow Twitter users to place products directly into their Amazon shopping carts while remaining within the Twitter ecosystem. How it works: Twitter users connect their Twitter accounts to their Amazon accounts. Then, whenever they see a tweet containing an Amazon product link, users hit reply followed by the hashtag #AmazonCart, and the product will automatically appear in their Amazon carts. This gives users the chance to save the item and check out later when it’s more convenient, without having to switch between apps or enter new user logins.
This unexpected new partnership poses a couple of benefits for both Twitter and Amazon. It makes it easier for Amazon customers to immediately respond to products they view on Twitter by eliminating a few extra clicks on the path to conversion. For Twitter, the deal enables users to remain on Twitter’s platform without switching over to Amazon.
The potential is there, but will this new partnership pan out for both Twitter and Amazon? David Waterman, Director of SEO & Content Development, thinks it might:
“I could see loyal Amazon customers using this. I can’t see this new partnership significantly increasing Amazon’s user base but I can see it increasing the average order size of loyal Amazon customers who are also loyal Twitter users. I’m guessing Amazon did the research and saw that they are receiving a significant amount of referral traffic from Twitter, but a small number of items added to the cart.
I believe the keys to making this a success are:
1) If Amazon started to significantly push more product content through Twitter via their category specific Amazon Twitter profiles (and possibly expanding them)
2) And if other brands pushed their Amazon hosted product pages, with links to Amazon products.”
This partnership might make brands on Twitter more inclined to tweet their products with Amazon product links, driving more purchases to Amazon’s site over other merchant sites, and simultaneously driving more revenue for Amazon.
But Twitter has a lot to gain from this new relationship as well – the move might be a way for Twitter to more directly demonstrate the ROI of promoted tweets, thereby getting more brands to invest in them. If a company sends out a promoted tweet of an Amazon product and a Twitter user replies with #AmazonCart, that product goes directly into the user’s Amazon cart, and Twitter can directly attribute the cart addition to that promoted tweet. This new connection forged between the Twitter and Amazon empires might just make it easier to track conversions across platforms, from the social media stream into the shopping bag.
While this new partnership is certainly innovative and full of possibility, it remains to be seen if Twitter/Amazon users actually end up using it! Without user engagement, it’s nothing more than an interesting idea.