Posts in SEO
Google Penguin 4.0 is Finally Here

It's Happening

It's finally here.

After 23 months of waiting, Google just confirmed that the Penguin 4.0 algorithm is rolling out to all languages.

Per Google:

Penguin is now real-time. Historically, the list of sites affected by Penguin was periodically refreshed at the same time. Once a webmaster considerably improved their site and its presence on the internet, many of Google's algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed. With this change, Penguin's data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page. It also means we're not going to comment on future refreshes.

Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.

Now that the wait is finally over, sites that have worked hard to clean up their link profiles may see a faster road to recovery, while sites that keep building "unnatural" links may see even swifter penalties. Any interesting thoughts on your end? Let us know in the comments!

The Rising Role of Voice Search

Google says 20 percent of mobile queries are voice searches:

Earlier today, Google CEO Sundar Pichai announced during his Google I/O keynote that 20 percent of queries on its mobile app and on Android devices are voice searches. He spoke about this in the context of introducing Google’s new Amazon Echo competitor, Google Home.

This 20-percent figure is actually lower than one mentioned by Google Executive Chairman Eric Schmidt in September 2010. At that time, he said “25 percent of Android searches in the US are voice searches.” Regardless of the precise number, it’s clear that voice searches are growing.

[...]


The range of virtual assistants, such as Siri, Cortana, Google Voice Search/Now, Viv, Amazon Alexa, and now, Google Home, are collectively training people to search using their voices and to become more “conversational” with search and mobile devices.

As the world of virtual assistants continues to surge, so does the world of voice search. In the last six months, we have seen a significant increase in search terms that are clearly voice-driven, for example, "hey google what is the best office chair 2016". No one would ever type a search like that, but advertisers should be prepared for this new (and growing) form of search term.

Google's Most Recent Mobile-Friendly Algorithm Update

Google’s mobile-friendly algorithm boost has rolled out

Google has fully rolled out the second version of the mobile-friendly update today. Google Webmaster Trends Analyst John Mueller announced it this morning on Twitter, saying, “The mobile changes mentioned here are now fully rolled out.”

Google gave us a heads-up in March that it was preparing to boost the mobile-friendly algorithm in May, and that boost has clearly finished rolling out today.

This technically is supposed to “increase the effect of the [mobile-friendly] ranking signal.” As we reported in March, Google said if you are already mobile-friendly, you do not have to worry, because “you will not be impacted by this update.”

As a reminder, the Google mobile-friendly algorithm is a page-by-page signal, so it can take time for Google to assess each page, and that is why it took some time to roll out fully. So depending on how fast Google crawls and indexes all of the pages on your site, the impact can be slow to show up.

As we first discussed in February of 2015, Google is putting increasing pressure on advertisers to make their sites responsive. In late May of this year, we saw the most recent step in this initiative, as Google finished rolling out the latest mobile-friendly algorithm update. The moral of the story: if your site is not mobile-friendly, you will face an increasingly uphill battle to rank well organically. There are also indicators that ranking through AdWords will become increasingly difficult. If you haven't taken the responsive plunge already, let this serve as a strong reminder: moving to responsive is an investment in your future advertising efforts.

Do SEM Ads Result In Organic Drop?

New Google Test: AdWords Ads Can Result In 15 Point Drop In Organic Click Through Rates:

There haven't been any big studies on the impact of PPC, AdWords ads, on organic click through rates since Google dropped ads from the right rail in February. But a new small test has one advertiser saying their click through rates on their number one listings drop 15 points when their ads show, that is a CTR of 35% dropping to 20% when their ads show. [...]

He wrote:

We've done some tests where we have toned down on PPC, but for the most part this test has been brought around because we're spending 3x as much on PPC for certain commercial keywords. This has impacted the traffic we get through organic significantly. The most extreme case is where we spend a lot more money on PPC and our SEO CTR for keyword goes from 35% CTR to 20% CTR. It's almost like dropping from position 1 for a term to position 2.

Although this study is very interesting, it raises a few concerns. First, we are talking about brands that rank #1 for the studied search terms, which is certainly the minority of advertisers. Second, what happens to organic listings when competitors advertise above them? There would certainly be some lost traffic in that situation. So the question should revolve around the total net impact of the two scenarios, rather than one ad versus one organic listing.

Two Big Changes In Local Search

Google makes 2 ad updates that will affect local search marketers:

Ads in Local Finder results

The addition of the ads (more than one ad can display) in the Local Finder results means retailers and service providers that aren’t featured in the local three-pack have a new way of getting to the top of the results if users click through to see more listings. (It also means another haven for organic listings has been infiltrated with advertising.)

The ads in the Local Finder rely on AdWords location extensions just like Google Maps, which started featuring ads that used location extensions when Google updated Maps in 2013. Unlike the results in Maps, however, advertisers featured in Local Finder results do not get a pin on the map results.

Google Maps is no longer considered a Search Partner

Google has also announced changes to how ads display in Google Maps. Soon, Google will only show ads that include location extensions in Maps; regular text ads will not be featured. The other big change is that Google Maps is no longer considered part of Search Partners. Google has alerted advertisers, and Maps has been removed from the list of Google sites included in Search Partners in the AdWords help pages.

This change in Maps’ status means:

1. Advertisers that use location extensions but had opted out of Search Partners will now be able to have their ads shown in Maps and may see an increase in impressions and clicks as their ads start showing there.

2. Advertisers that don’t use location extensions but were opted into Search Partners could see a drop in impressions and clicks with ads no longer showing in Maps.

These changes signal some interesting new pay-to-play opportunities for brick-and-mortar advertisers looking to connect with local traffic.

Google Issues Ultimatum to Non-Responsive Sites

Google announced yesterday that they'll be making a change to their rankings that will have "a significant impact in [their] search results." Mobile-friendliness will become a much more important ranking factor and will affect Google searches worldwide starting April 21, 2015. Google claims that the positive outcome of this update will be that "users will find it easier to get relevant, high quality search results that are optimized for their devices." This is consistent with statistics Google has previously touted; see "Mobile Users Hold a Grudge."

That mobile-friendliness would become a permanent part of the algorithm comes as no surprise; however, Google isn't known for giving early warnings, so the fact that they did suggests this change will have big effects on the search results landscape. Is your site ready?

Google's Mobile Friendly Test Tool

You have less than 2 months to make sure your site is responsive and mobile-friendly or you may see a negative impact on your organic traffic. Not sure if your site is responsive? Check here with Google's Mobile Friendly Tool.  In light of this change, our responsive design and development team is taking on a limited number of accelerated projects to help our clients beat this April 21st deadline. If you're interested in learning more about this process and how your site could benefit, contact us today.

"Fix Mobile Usability Issues Found On" Your Site

Have you received a message like the one below in your Google Webmaster Tools inbox recently? If so, you're not alone. Just a few days ago, Google started sending out large-scale notifications via email and Webmaster Tools, warning non-mobile-friendly sites that their pages "will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users."

[Screenshot: "Fix mobile usability issues found on" Webmaster Tools message]

So just what does this mean? Historically, Google has only notified sites it already considered "mobile-friendly" when they had mobile usability issues. Now that Google is sending these warnings at such a massive scale, it seems the mobile-friendly ranking factors they've been testing for the past several months may be incorporated into the overall algorithm in the near future. This move would make responsive web design more important than ever, as your mobile rankings could eventually differ widely from your desktop rankings.

To start fixing your problems, you'll want to follow Google's checklist:

  1. Find problematic pages.
    • Log in to your Google Webmaster Tools account and navigate to the "Mobile Usability" section under the "Search Traffic" left-nav option.
    • Here, Google will give you a breakdown of the pages they deem to have mobile usability errors and will list out the specific errors so you can work on problems individually.
    • Also utilize the Google PageSpeed Insights Tool to test your site speed and usability issues on a per-page basis. Again, Google lists out errors individually and even offers a "show how to fix" option to help walk you through what needs to happen to make certain elements responsive.
  2. Learn more about mobile-friendly design
    • Google offers a pretty expansive "Web Fundamentals" reference guide for designing responsive sites. After you've made a list of all of your problematic pages from Step 1, use this guide to help formulate the best strategy for your site on fixing these mobile usability issues. This reference guide has 114+ sections including: general principles, look and feel, building multi-device layouts, forms and user input, optimizing performance, and more.
  3. Fix mobile usability issues
    • After you've compiled the list of pages with mobile usability errors and read up on how to fix them, start altering your site template/pages to improve their mobile usability as much as possible. After each improvement, re-test each page with the Google PageSpeed Insights Tool and chart how your score improves. Every element you can improve will better your score, work you back into Google's good graces, and can eventually net you the "Mobile-friendly" tag for mobile searches like the screenshot below.

[Screenshot: "Mobile-friendly" tag in mobile search results]

Now, there's no need to panic. Lots of sites received the same warning you did and will all be working toward the same goal you are. In the meantime, you might see a small dip in your mobile rankings if Google's overall algorithm does start taking mobile-friendliness into account. If you fix most or all of the issues that Google highlights and move your site to a fully responsive design, you will very likely qualify for the "mobile-friendly" tag shown above and may benefit from better rankings than competing sites that still aren't mobile-friendly. If you have any questions about this post or would like to know more about our Responsive Design process, just let us know how we can help.

10 Technical SEO Problems & How to Solve Them

With Google getting more and more proactive in trying to provide the best possible search experience, keeping your site near the top of the search rankings is as hard as it's ever been. You can no longer easily game your way to the top, especially in competitive industries where multibillion-dollar companies crowd the first page. While high-quality content remains a big piece of the puzzle, overlooking technical SEO issues can leave you spinning your wheels rather than driving your way to the top. A lot of these issues have been around for a while, but (a) they continue to be overlooked and (b) they are more important than ever now that Google is closing the door on ways to game the system. Without further ado, here are our top 10 technical SEO problems and how to solve them.

(1) Overlooking 301 redirect best practices

The problem: Search engines consider http://www.example.com and http://example.com to be two different websites. If other sites have linked to yours using a mixture of the two URLs, you are effectively splitting your ranking potential in half. The screenshot below, from Open Site Explorer, highlights this: the www version of the site has 143 unique root domains linking to it, while the non-www version has 75.

[Screenshot: Open Site Explorer link metrics for a site without an established www versus non-www best practice]

The solution: Choose the domain you prefer (www or non-www) and implement a 301 redirect rule on all instances of the other, pointing to the version you chose, to consolidate all of your ranking potential.
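
As a rough illustration, here is what that rule might look like in an Apache .htaccess file, assuming the www version is preferred (the domain is a placeholder; nginx and most CMSes offer equivalent settings):

    RewriteEngine On
    # Send any request for the bare domain to the www version with a permanent (301) redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]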

(2) Duplicate Content

The problem: Similar to the first point, search engines have a harder time knowing which pages on your site deserve to rank if you have a lot of duplicate content. This duplicate content often stems from common CMS functionality, most frequently sorting parameters. It's an issue because search engines only have a finite amount of time to crawl your site on a regular basis, and too many duplicate pages can dilute the weight given to each one compared with serving just one "clean" version of the page.

The solution: Avoiding duplicate content will always depend on the CMS used and a variety of other factors, but one of the most common ways to tell search engines which page you want to rank is to place a canonical link element (<link rel="canonical">) on all of the duplicate pages, pointing to the page you do want to rank. As a general rule of thumb, every unique page should be accessible under only one URL. Alternatively, there are some cases where you would be better off using 301 redirects or excluding certain site pages via robots.txt, but be sure to do your research first.
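
For example, a parameter-driven variant of a category page might carry a canonical tag like this (the URLs are hypothetical):

    <!-- Served on http://www.example.com/widgets?sort=price-asc -->
    <!-- Tells search engines to consolidate ranking signals on the clean URL -->
    <link rel="canonical" href="http://www.example.com/widgets" />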

(3) Not Helping Search Engines

The problem: Again, search engines only have a fixed amount of time to crawl the sites on the web, so helping them crawl your site efficiently ensures that all of your important pages get indexed. One common mistake is not having a robots.txt file to identify which sections of your site you DON'T want search engines crawling. An even bigger mistake is not having a sitemap.xml file, which shows search engines which pages on your site are the most important and which should be crawled most frequently. A site with no robots.txt or sitemap.xml gives Google little to no guidance on how it should be crawled.

The solution: Place a hand-built robots.txt file at the root of your server that lists the pages/sections of your site you don't want crawled. Also at the root of your server, include a curated sitemap.xml file that outlines all of the unique pages on your site. This will help Google crawl the site more efficiently.
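
Here is roughly what those two files might look like for a hypothetical store at www.example.com (the paths are illustrative, not a recommendation for any particular CMS):

    # robots.txt - served from http://www.example.com/robots.txt
    User-agent: *
    Disallow: /checkout/
    Disallow: /internal-search/
    Sitemap: http://www.example.com/sitemap.xml

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap.xml: one <url> entry per unique, indexable page -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/widgets/3-inch/heavy-duty/blue-widgets</loc>
      </url>
    </urlset>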

(4) Poor Meta Tag Usage

The problem: Some CMSes auto-generate page titles for you, but in many cases the result is less than ideal. The <title> tag and the meta description are two of the most important pieces of metadata you can serve to search engines, because they tell Google how you would like your pages to show up in search results.

The solution: Where possible, handwrite the meta title and meta description for every page on your site. Keep titles under 65 characters and meta descriptions under 150 characters to avoid truncation. The title tag is an ideal place for the keyword(s) you want a given page to rank for. Think of the meta description as your headline: try to convince the searcher to click on your result rather than a competing site in the search results.
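
For instance, the head section of a hypothetical page about blue widgets might include something like this (the copy is made up; lengths stay under the truncation limits above):

    <title>Heavy-Duty 3" Blue Widgets | Example Widget Co.</title>
    <meta name="description" content="Shop heavy-duty 3-inch blue widgets with free shipping and a lifetime warranty. Order online or call for bulk pricing.">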

(5) Disorganized URL Structure

The problem: Presenting a clean URL structure to search engines is very important. Search engines look at your URLs to see how you silo off sections of the site; for example, a search engine can tell from www.example.com/widgets/3-inch/heavy-duty/blue-widgets that the page is about 3" heavy-duty blue widgets. Some CMSes don't produce such clean URLs out of the box, as in the screenshot below, but having them is very beneficial to SEO.

[Screenshot: example of a poor URL naming convention next to a proper one]

The solution: Always aim for URLs that a reader can look at and know exactly what type of page they are on; that is the best guideline for restructuring bad URLs in a way that helps both users and search engines. How to get there depends on the CMS at hand and can be an arduous task. When migrating an entire site from "ugly" to "clean" URLs, one-to-one 301 redirect mapping of almost the entire site is usually needed to ensure that minimal SEO value is lost.
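
As a loose illustration of that one-to-one mapping, an Apache rewrite rule for a single hypothetical parameter URL might look like the following (real migrations typically generate hundreds of these from a full URL inventory):

    RewriteEngine On
    # Map the old parameter-based URL to its new descriptive URL with a 301
    RewriteCond %{QUERY_STRING} ^cat=42&prod=137$
    RewriteRule ^product\.php$ /widgets/3-inch/heavy-duty/blue-widgets? [R=301,L]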

(6) Poor Use of Local Search Data & Structured Markup

The problem: One big missed opportunity (or rather, two) is sites that don't take advantage of local search data or structured data markup. In 2014, Google started recognizing local search intent better than ever, and sites that maintain a presence on all the local search data providers, such as Yelp, Foursquare, Facebook, Bing, and YellowPages, can see boosts for local searches within their immediate city. Taking advantage of structured data markup can also qualify certain sites for enhanced results in Google, such as "Authorship," which shows a picture beside the link and gives users a more enticing reason to click. There are dozens of kinds of schema markup as well, covering products, breadcrumbs, publishers, local businesses, and more.

The solution: Go out and claim all the major local search listings you can for your site, starting with the free "big name" providers. Where relevant, look into the kinds of schema markup that could be implemented on your site to help differentiate it from competitors in the search results.
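
For example, a brick-and-mortar business might add a LocalBusiness block like this to its pages (all of the business details here are made up; the schema.org vocabulary supports many more properties):

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Widget Co.",
      "url": "http://www.example.com",
      "telephone": "+1-512-555-0100",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Austin",
        "addressRegion": "TX",
        "postalCode": "78701"
      }
    }
    </script>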

(7) Shady Link Building

The problem: One big issue that some sites still run into is buying links from SEO companies. In 2012, Google started cracking down on what it considers unnatural linking practices, with particular attention to sites that accumulate lots of "artificial" links simply to game the search engines into ranking them higher; this became known as the "Penguin" algorithm update. Google also took a strong stance against sites that repost low-quality content from other sites, known as the "Panda" algorithm. Both of these algorithms are still updated with great regularity.

[Screenshot: Google Analytics traffic for a site hit by a Penguin algorithmic penalty]

The solution: Only link to sites where it is natural to do so, and vice versa. You should never have to pay for a link unless it is a sponsored-type link, and in that case the link should include the rel="nofollow" attribute or it risks raising a red flag with Google. Buying X number of links from SEO companies is a very bad practice that will likely lead to eventual penalization in the search engines. See the screenshot above? You don't wanna be that guy.
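
For reference, a sponsored link marked up the way described above might look like this in a page's HTML (the URL and anchor text are placeholders):

    <!-- rel="nofollow" tells search engines not to pass ranking credit through this paid link -->
    <a href="http://www.example.com/widgets" rel="nofollow">Example Widget Co. (sponsor)</a>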

(8) Broken Links/404

The problem: A “broken link” is a hyperlink that points to a page that is no longer active (also known as a 404 page). There are few things more frustrating than finding a resource you need only to follow the link and find out the resource no longer exists. Search engines recognize this and will downgrade rankings of sites that accumulate large amounts of internal 404 links.

The solution: Keep an eye on the 404s that Google reports for your site in Google Webmaster Tools. Do regular "housecleaning" to keep the number of 404s to a minimum, implementing 301 redirects whenever you move a resource from one URL to another or combine heavily similar pages.
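
A single moved page can be handled with a one-line permanent redirect, for example in Apache (the paths are hypothetical):

    # Send visitors and link equity from the retired URL to its replacement
    Redirect 301 /old-widgets-guide/ http://www.example.com/widgets-guide/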

(9) Slow Site Speed

The problem: Your pages may load so slowly that visitors abandon them or circle back and click another search result. Pages that render quickly provide a good user experience, while slow pages drive visitors away. If slow speeds heavily affect the experience of your mobile visitors, Google will weigh that when serving your site in mobile search engine result pages.

[Screenshot: Google Page Speed Tool flagging slow site speed]

The solution: Regularly monitor your average page load times in Google Analytics and run your site through the Google Page Speed Tool. Follow the recommendations provided to help increase your overall and mobile site speeds.

(10) Quality On-Page Content

The problem: At the end of the day, one of the main ranking factors remains on-page content. If your page about blue widgets only has 100 unique words of content but a competitor’s page about blue widgets has over 3,000 unique words of content, the search engines will almost always give more algorithmic weight to the site with more quality, unique content - all other things equal.

The solution: If you want one of the simplest ways to potentially improve your ranking, plan to continually revisit top pages of your site to revise and expand on the content every few months. As long as it’s high quality, the more content the better. Every time new pages are launched, be sure they include plenty of content that's helpful for your users and describes the product/service in question. Always write for your visitors, not the search engines. Avoid intentional keyword spamming just to try and rank higher.

Any big technical SEO issues you deal with regularly that weren't mentioned on our list? Feel free to let us know in the comments below!

3 Time-Saving Tips When Cleaning Up Your Link Profile

[Screenshot: Google Webmaster Tools manual penalty section]

Prepping for a link cleanup project is no easy task. Ever since Google's Penguin updates started cracking down on unnatural linking practices in 2012, many site owners who enlisted unscrupulous SEO services to build hundreds or even thousands of links have eventually found themselves on the receiving end of a penalty. Seeing your organic traffic tank after a Google algorithm update is a scary realization, but do not lose hope.

If you think your site may have been penalized by a recent algorithm update, our AlgoSleuth tool can help you see your organic traffic trends and know for sure. You will also want to check Google Webmaster Tools to see if you've been dealt a manual penalty; if there are no messages in Webmaster Tools, the penalty is most likely algorithmic. If AlgoSleuth identifies a particular update where you were hit hard, or you see a manual penalty in Webmaster Tools, you need to audit and most likely clean up your backlink profile. A thorough write-up of the steps involved can be found here. Below we outline 3 time-saving recommendations to keep in mind when undertaking a link cleanup project.

1. Pull from as many data sources as possible.

Google has said that you generally don't need third-party tools to find all of the "unnatural" links pointing to your website, but in our experience and the experience of others, this just hasn't proven true. Pulling from multiple backlink data sources such as Ahrefs, Moz, and Majestic SEO (in addition to Webmaster Tools) will help uncover as many of these unnatural backlinks as possible, ensuring that you do the cleanup right the first time around.

2. De-duplicate.

If you followed the previous step, you'll have at least 3-4 data pulls in CSV format to deal with. To save time, import all of this data into one spreadsheet. The only column headers you really need are Backlink URL and Anchor Text; feel free to also include the URL on your site that each backlink points to, if desired. The rest can be deleted as you import your pulls into one spreadsheet. With everything in one master file, de-duplicate your list so you are not wasting time reviewing the same link twice. This can be done in both Excel and Google Docs. For Google Docs, create a new "Master Data" tab and, in cell A1, paste the formula shown below, replacing the quoted sheet names with the tab names of your own data pulls.

Google Docs de-duplication formula:

=unique(query(vmerge('Open Site Explorer'!A1:B;'Majestic SEO'!A1:B;'Ahrefs'!A1:B);"select * where Col1<>'' ";0))

If you are working in Excel, you'll first need to copy all of your data from each of the 3-4 tabs into one single "Master" tab. After you've done that, you can follow the directions here under "Filter for unique values". The end result will be the same as the Google Docs method, outputting a list of unique backlinks to work with.

3. Filter out 404s.

If your backlink profile is sizable, this step can save you quite a few hours. The SEO tools mentioned earlier, such as Ahrefs and Moz, do a great job of crawling the web for these backlinks, but your lists will undoubtedly contain 404s from sites that have since closed up shop. Before you start going through your master list to ask for link removals, filter out these 404s to double back on later or add to your final disavow list. This is easily accomplished in Google Docs with a simple custom script. After installing the script, add a new column to your master tab called "Link Status". You can then call =HTTPResponse() on any cell containing a URL and see the current HTTP status. After doing this for one cell, drag the formula down and it will automatically run on the rest of your URLs, as shown below.

[Screenshot: Google Docs HTTP status checker]
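
If you'd rather write the custom function yourself, a minimal Google Apps Script sketch along these lines should behave similarly, mirroring the =HTTPResponse() usage above (the implementation details here are our own assumption, not the linked script's exact code):

    /**
     * Hypothetical custom function for Google Sheets: =HTTPResponse(A2)
     * Returns the HTTP status code for the URL in the referenced cell.
     */
    function HTTPResponse(url) {
      try {
        var response = UrlFetchApp.fetch(url, {
          muteHttpExceptions: true, // return 404/500 codes instead of throwing
          followRedirects: false    // report the link's own status, not its destination's
        });
        return response.getResponseCode();
      } catch (e) {
        return "ERROR: " + e.message; // dead domains, DNS failures, etc.
      }
    }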

If you are working in Excel, install the free SeoTools plugin, which lets you call HttpStatus() on a cell in the same way as the Google Docs method. Once the statuses are populated in Google Docs or Excel, you can quickly filter your "Link Status" column to exclude backlinks with a 404 status. This lets you focus on the live links first and double back later on the 404s.

Final Thoughts

Now that you've combined your lists, de-duplicated them, and prioritized the live links, you're ready to start combing through the list manually to find the bad eggs among the bunch. It's not too hard to discern natural links from "unnatural" ones, but if you want to save even more time, a link cleanup tool such as LinkDetox is recommended. It helps you quickly identify the bad links and even serves as an organized place to contact webmasters, request link removals, and keep good records.

Cleaning up a backlink profile that has been hit by Penguin is never an easy task. Even for smaller sites, a full-scale analysis and cleanup can take weeks. Just as you encounter frustrations when building new links, you will when cleaning up old ones. Many webmasters will be unresponsive when you ask for link removals; some will even ask for money to remove the link(s) or flat out refuse to remove them. There will be times when you just want to bang your head against your desk, but in the end, cleaning things up can give you a much better idea of how to move forward and build better-quality links in the future. As always, if you have any questions about this post or need some extra help, feel free to drop us a line.

Were You Affected by Penguin 2.0? Find Out With AlgoSleuth!

[Image: "If You Hated Panda And Penguin, Just Wait For Pterodactyl"]

It's been just over a month now since Google rolled out Penguin 2.0, and many people are still left wondering: just how big of an impact did it have on search results?

We just finished updating our free SEO tool, AlgoSleuth, so you can check your site and see what was won or lost in the biggest algorithm update of the year so far. AlgoSleuth uses the Google Analytics API and Google Docs to provide a quick snapshot of your site's Google organic traffic over the past two and a half years, highlighting the updates that positively or negatively affected your traffic along the way. As long as interest in the tool remains high, we will continue to update it with all major algorithm changes acknowledged by Google. If you have any questions or feedback, feel free to leave us a comment below!

Get AlgoSleuth for Free