Delegator

Blog

10 Technical SEO Problems & How to Solve Them

June 18, 2014

With Google getting more and more proactive in trying to provide the best possible search experience, keeping your site near the top of the search rankings is as hard as it's ever been. No longer can you easily game your way to the top, especially in competitive industries where multibillion-dollar companies crowd the first page. While high-quality content remains a big piece of the puzzle, overlooking technical SEO issues can leave you spinning your wheels rather than driving your way to the top. Many of these issues have been around for a while, but (A) they continue to be overlooked and (B) they are more important than ever now that Google is closing the door on ways to game the system. Without further ado, here are our top 10 technical SEO problems and how to solve them.

(1) Overlooking 301 redirect best practices

The problem:
Search engines consider http://www.example.com and http://example.com to be two different websites. If other sites have linked to yours using a mixture of the two URLs, you may effectively be splitting your ranking potential in half. The screenshot below, from Open Site Explorer, highlights this: the www version of the site has 143 unique root domains linking to it, while the non-www version has 75.
Graph highlighting link and ranking metrics for a site that doesn't have an established www versus non-www best practice.

The solution:
Choose the domain you prefer (www or non-www) and implement a 301 redirect rule that points all instances of the other version to it, consolidating all of your ranking potential on one domain.
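For illustration only, here is a minimal Apache .htaccess sketch that forces the www version. It assumes your site runs on Apache with mod_rewrite enabled, and example.com stands in for your own domain; your server or CMS may handle this differently.

  # Send all non-www requests to the www version with a 301 (permanent) redirect
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]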

(2) Duplicate Content

The problem:
Similar to the first point, search engines have a harder time knowing which pages on your site deserve to rank if you have a lot of duplicate content. Duplicate content often stems from common CMS functionality, most frequently sorting parameters. This is an issue because search engines only have a finite amount of time to crawl your site on a regular basis, and too many duplicate pages can dilute how your pages are weighted compared to serving just one "clean" version of each page.

The solution:
How you avoid duplicate content will always depend on the CMS used and a variety of other factors, but one of the most common methods of telling search engines which page you want to rank is to place a rel="canonical" link element on all of the duplicate pages, pointing to the page you do want to rank. As a general rule of thumb, every unique page should only be accessible at one URL. In some cases you would be better off using 301 redirects or excluding certain pages via robots.txt, but be sure to do your research first.
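For reference, a canonical tag is just a single line in the <head> of each duplicate page (the URL below is a made-up example):

  <!-- Tells search engines which URL to treat as the one to index and rank -->
  <link rel="canonical" href="http://www.example.com/widgets/blue-widgets" />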

(3) Not Helping Search Engines

The problem:
Again, search engines only have a fixed amount of time to crawl sites on the web. Helping them efficiently crawl your site ensures that all of your important pages get indexed. One common mistake is not having a robots.txt file for the site to identify which sections of your site you DON’T want the search engines crawling. An even bigger mistake is not having a sitemap.xml file, which helps show the search engines which pages on your site are the most important and which should be crawled most frequently.
A site that has no robots.txt or sitemap.xml gives Google few clues as to how it should be crawled.

The solution:
Be sure to include a hand-built robots.txt file at the root of your server that lists the pages/sections of your site that you don't want crawled. Also at the root of your server, include a curated sitemap.xml file that outlines all the unique pages on your site. Both will help Google crawl your site more efficiently.
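As a simple sketch (the paths and sitemap URL are placeholders, not recommendations for your specific site), a basic robots.txt might look like this:

  # Allow all crawlers, but keep them out of admin and internal search pages
  User-agent: *
  Disallow: /admin/
  Disallow: /search/

  # Point crawlers at the sitemap
  Sitemap: http://www.example.com/sitemap.xml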

(4) Poor Meta Tag Usage

The problem:
Some CMSes auto-generate page titles for you, but in many cases the result is less than ideal. The <title> tag and the meta description tag are two of the most important pieces of metadata you can serve to search engines, as they tell them how you would like your pages to show up in search results.

The solution:
Where possible, handwrite the meta title and meta description for every page on your site. Keep titles under roughly 65 characters and descriptions under roughly 150 characters to avoid truncation. The title tag is an ideal place for the keyword(s) you want a given page to rank for. Think of the meta description as a headline: try to convince the searcher to click on your result rather than a competing site in the search results.
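A bare-bones example of what that looks like in a page's <head> (the widget-shop wording is invented for illustration):

  <head>
    <!-- Under 65 characters, leading with the target keyword -->
    <title>Heavy Duty Blue Widgets | Example Widget Co.</title>
    <!-- Under 150 characters, written like a headline to earn the click -->
    <meta name="description" content="Shop our full line of 3-inch heavy duty blue widgets. Free shipping on orders over $50 and a lifetime guarantee on every widget.">
  </head>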

(5) Disorganized URL Structure

The problem:
Presenting a clean URL structure to the search engines is very important. Search engines look at your URLs to see how you silo off sections of the site. A search engine can infer that www.example.com/widgets/3-inch/heavy-duty/blue-widgets is a page about 3″ heavy duty blue widgets. Some CMSes don't present such clean URLs out of the box, as in the screenshot below, but having them is very beneficial for SEO.
Example of a poor URL naming convention and a proper one.

The solution:
Aim for URLs that a reader can look at and know exactly what type of page they are on; clean, descriptive URLs help both the user and the search engines. How to get there depends on the CMS at hand and can be an arduous task to fix. When migrating an entire site from "ugly" to "clean" URLs, a one-to-one 301 redirect mapping of almost the entire site is usually needed to ensure that minimal SEO value is lost.
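Conceptually, each mapping is just an old path redirected to its new, descriptive equivalent. A rough Apache sketch (the old paths here are invented; a real migration would list one rule per page):

  # Map each old, "ugly" URL to its new, descriptive URL with a 301
  Redirect 301 /catalog/item-3872 /widgets/3-inch/heavy-duty/blue-widgets
  Redirect 301 /catalog/item-3873 /widgets/3-inch/heavy-duty/red-widgets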

(6) Poor Use of Local Search Data & Structured Markup

The problem:
One, or rather, two big missed opportunities are sites that don't take advantage of local search data or structured data markup. In 2014, Google started recognizing local search intent better than ever, and sites that maintain a presence on all the local search data providers (Yelp, Foursquare, Facebook, Bing, YellowPages, etc.) can see boosts in local searches within their immediate city. Taking advantage of structured data markup can also qualify certain sites for "Authorship" in Google search results, which shows a picture beside the link and gives users a more enticing reason to click. There are dozens of kinds of schema markup as well, covering products, breadcrumbs, publishers, local businesses, and more.

The solution:
Go out and claim all the major local search listings you can for your site, starting with the free "big name" ones. Where relevant, look into the kinds of schema markup that could be implemented on your site to help differentiate it from competitors in the search results.
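As one hedged illustration, here is a minimal JSON-LD snippet for local business schema markup (JSON-LD is one of several ways to express schema.org data, and every name, address, and number below is a placeholder):

  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Widget Co.",
    "telephone": "+1-423-555-0100",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St.",
      "addressLocality": "Chattanooga",
      "addressRegion": "TN",
      "postalCode": "37402"
    }
  }
  </script>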

(7) Shady Link Building

The problem:
One big issue that some sites still run into is buying links from SEO companies. In 2012, Google started cracking down on what it considers unnatural linking practices, specifically targeting sites that accumulate lots of "artificial" links in an attempt to game the search engines into ranking them higher. This became known as the "Penguin" algorithm update. Google also took a strong stance against sites that repost low-quality content from other sites, known as the "Panda" algorithm. Both of these algorithms are still updated with great regularity.
Screenshot from Google Analytics of a site hit by a Penguin algorithmic penalty.

The solution:
Only link to sites where it is natural to do so, and vice versa. You should never have to pay for a link unless it is a sponsored placement, and in that case the link should carry the rel="nofollow" attribute or it risks raising a red flag with Google. Buying links in bulk from SEO companies is a very bad practice that will likely lead to an eventual penalty. See the screenshot above? You don't wanna be that guy.
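For reference, a sponsored link with nofollow looks like this (the URL and anchor text are invented):

  <!-- The nofollow attribute tells search engines not to pass ranking credit through this paid link -->
  <a href="http://www.example.com/widgets" rel="nofollow">Sponsored: Example Widget Co.</a>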

(8) Broken Links/404

The problem:
A "broken link" is a hyperlink that points to a page that no longer exists (also known as a 404 page). There are few things more frustrating than finding a resource you need, only to follow the link and discover the resource is gone. Search engines recognize this and will downgrade the rankings of sites that accumulate large numbers of internal 404 links.

The solution:
Keep an eye on the 404s that Google finds on your site within Google Webmaster Tools. Do regular "housecleaning" to keep the number of 404s to a minimum, implementing 301 redirects when moving a resource from one URL to another or when combining heavily similar pages.

(9) Slow Site Speed

The problem:
The speed at which pages load for your visitors may be so slow that they abandon your pages, or circle back and click another search result. Having site pages render quickly provides a good user experience while the opposite can cause visitors to leave. If your site speed proves to heavily affect the experience of your mobile visitors, Google will weigh that when serving your site in mobile search engine result pages.
Google Page Speed Tool - Slow Site Speed

The solution:
Regularly monitor your average site load speeds in Google Analytics and also run your site through the Google Page Speed Tool. Follow the recommendations provided to help increase your overall and mobile site speeds.
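Two of the most common recommendations from the tool, text compression and browser caching, can often be switched on at the server level. A rough Apache sketch, assuming mod_deflate and mod_expires are available (your host or CMS may already handle this for you):

  # Compress text-based responses before sending them to the browser
  <IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
  </IfModule>

  # Let browsers cache static assets so repeat visits load faster
  <IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
  </IfModule>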

(10) Thin On-Page Content

The problem:
At the end of the day, one of the main ranking factors remains on-page content. If your page about blue widgets has only 100 unique words of content while a competitor's page about blue widgets has over 3,000, the search engines will almost always give more algorithmic weight to the site with more quality, unique content, all other things being equal.

The solution:
One of the simplest ways to potentially improve your rankings is to revisit the top pages of your site every few months and revise and expand their content. As long as it's high quality, the more content the better. Every time new pages are launched, be sure they include plenty of content that is helpful for your users and describes the product or service in question. Always write for your visitors, not the search engines, and avoid keyword stuffing just to try to rank higher.

Any big technical SEO issues you deal with regularly that weren’t mentioned on our list? Feel free to let us know in the comments below!


How Long Should I Run My Experiment in AdWords?

April 4, 2014

"How long should I run my experiment?" asks a ridiculously high number of online marketing managers. Typically, the answer is more complicated than most want to hear. It may be a buzz phrase by now, but the goal with any experiment is to reach statistical significance, not simply to run it for a pre-determined period of time. When measuring some AdWords metrics for a "winner," it can be difficult to determine when statistical significance has been reached, especially with the limitations of the AdWords Experiments feature.

For any AdWords managers who are very hands-on with their accounts and want to verify statistically significant results, Delegator has provided the AdWords Statistical Significance Calculator. Whether you’re testing ad copy, different landing pages, or any assortment of variables that can be separated into “Group A” and “Group B,” this calculator allows you to work on actionable data alone.

Example: I am testing two separate landing pages with the exact same ad copy to see which page leads to more conversions. I've been running my experiment for one week and have gathered this data so far:

Landing Page 1

  • Clicks: 958
  • Goal Completions: 33

Landing Page 2

  • Clicks: 1,014
  • Goal Completions: 45
The data is not significant.

Using the calculator, the results for my experiment are not significant, so I must either run my test longer or try a different test that may be more conclusive.
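If you want to sanity-check a result like this by hand, the standard approach is a two-proportion z-test on the conversion rates. Here is a minimal Python sketch (Python 3.8+ for statistics.NormalDist) using the example numbers above; this is generic statistics, not the calculator's actual code:

  from math import sqrt
  from statistics import NormalDist

  # Clicks and goal completions from the two landing pages above
  clicks_a, conversions_a = 958, 33
  clicks_b, conversions_b = 1014, 45

  rate_a = conversions_a / clicks_a
  rate_b = conversions_b / clicks_b

  # Pooled rate under the assumption that both pages convert equally well
  pooled = (conversions_a + conversions_b) / (clicks_a + clicks_b)
  std_err = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))

  z = (rate_b - rate_a) / std_err
  p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed test

  print(f"z = {z:.2f}, p = {p_value:.2f}")
  # Prints roughly z = 1.13, p = 0.26 -- well above 0.05, so not significant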

With any test, Delegator encourages running for at least one entire week since shopping and browsing patterns can vary by day of the week. After one full week, the calculator will let you know whether your data is conclusive enough for you to act on it.

Yes, this calculator is designed specifically for AdWords account managers, but the functionality can work on any data you collect about your website through any channel. For additional testing ideas or help managing your account, give us a shout.

Start Calculating Now!

4 Best Practices For User Testing Your Website

March 13, 2014

So you've built out a new website or landing page, and you're excited to show it to the world. Your colleagues, friends, and family think your new creation is awesome, and you're feeling pretty good about yourself. You're thinking that you might be ready to launch…


DO NOT GO ANY FURTHER WITHOUT USER TESTING!

Here at Delegator, we preach the gospel of thorough and unbiased user testing. In fact, we have distilled our approach down to a single partner (most of the time) that we really enjoy working with because they have a great platform: UserTesting.com.

Simply signing up for their service, however, isn’t enough to get you the actionable data you need to properly optimize your new site.

Here are four best practices that will allow you to user test efficiently and effectively:

Get involved first-hand in user testing videos AND analysis sessions:

User testing is one of the most important pre-launch protocols. If you are a decision maker, there is no substitute for first-hand consumption of user testing content. If you pass the task off through multiple degrees of separation, you are opening yourself up to multiple layers of bias. Instead, watch the user tests first-hand to see for yourself EXACTLY how people interact with your site.

Do not make definitive conclusions based on just a couple of user tests:

Although you may feel that a single user test is providing you with a goldmine of actionable data, temper your eagerness to make changes with the understanding that one or two tests are not statistically significant relative to hundreds of site visitors. If multiple user tests start revealing similar faults or potential enhancements, AND your team agrees with said faults, you can probably feel safe making that change.

Find the right balance between instruction specificity and freedom:

Unless you want your user tester to be floundering around the site with no clear direction, be specific in dictating to the tester who they are, and what their goal is.  Don’t, however, instruct them on every step they need to take to reach the goal.  You want your user tester to best emulate your actual customers.  You, unfortunately, won’t be able to instruct each customer on how to use your website step by step, so take that into account during your UserTesting.com session setup.

Use the convenient annotations feature of UserTesting.com for efficient & effective sharing:

You’ll want to share the user testing intel with your team members.  Since the tests are delivered in the form of a narrated video and can often be quite long, take a pass through the videos and annotate the important revelations.  This way, other team members can quickly scan the video and watch the important parts where your tester might be stumbling, or (hopefully) completing tasks with ease.

Delegator is an official partner of UserTesting.com and can help you set your account up,  work through your testing, and analyze the tests to form actionable recommendations.  Contact us here if you would like to learn more!


12 Google AdWords Facts & Trends From 2013

February 4, 2014

2013 was easily the biggest year ever for Google AdWords, both financially and in terms of expanded and new features. A service that started with a trial of just 350 users, AdWords now offers a platform where an advertiser can set up a campaign and have ads shown to millions of highly targeted searchers worldwide within a few short hours.

As we march on into 2014, it’s worth keeping a close eye on how AdWords continues to affect the way we search and find things both on our computers as well as our mobile devices.

Here are 12 interesting Google AdWords facts and trends from 2013:

  1. In 2013, Google officially surpassed $50 billion in total advertising revenue. This comprised 85% of their total revenues for 2013.
  2. Of that $50 billion in ad revenue, Google reported $12.9 billion in net income for 2013.
  3. Total paid clicks on Google and Google Display Network sites were up 31% over the year prior and 13% over the third quarter of 2013.
  4. The most expensive keywords in Google AdWords continue to be insurance and lawyer related with keywords like "mesothelioma lawyers" now costing advertisers well over $200/click. (Screenshot: Mesothelioma Average Cost Per Click Amounts)
  5. Smartphones and tablets combined for 32% of paid search clicks in Q4 2013 and accounted for 25% of paid search spending.
  6. On Christmas Day an estimated 45% of Google searches worldwide occurred on a smartphone or tablet. This was the highest mobile traffic share day in 2013.
  7. 85% of desktop clicks on Google AdWords ads are incremental. That is, when ads are paused, only about 15% of that traffic, on average, will still find its way to your site. (Screenshot: Google's Search Ads Pause Experiment)
  8. On average, the top 3 AdWords spots take 41.1% of the total clicks on a given results page. This includes both paid and organic results.
  9. The average click-through rate for an AdWords ad in the first position is just over 7%. (Screenshot: Google Search - Average Click Through Rates)
  10. Spending on Google's Product Listing Ads, which were once free, increased 72% in 2013. Retailers who ran both PLAs and AdWords text ads generated 42% of their non-branded traffic from PLAs.
  11. Google removed over 350 million bad ads from AdWords and the Google Display Network in 2013 – a 59% increase versus 2012.
  12. Google rolled out over 1,000 changes to the AdWords ecosystem in 2013. Some of the most important updates include Enhanced Campaigns, an updated Ad Rank algorithm, and a wider variety of remarketing options.

Bonus Facts:

  1. An estimated $52 million of Google AdWords spend in the US was wasted on click fraud in just the first six months of 2013.
  2. An estimated 40+% of clicks on AdWords ads originating from China are ultimately classified as suspected click fraud. Indonesia and Iraq sometimes show rates of over 60%!


Last updated: April 17, 2014


Apple’s Iconic 1984 Macintosh Ad, As It Happened

January 17, 2014

January 22, 2014 marks the 30th anniversary of Apple's iconic Macintosh ad, directed by Ridley Scott and aired during Super Bowl XVIII. To commemorate the occasion, here's a never-before-seen, original home recording of the ad as it first appeared, watched and recorded by our CEO as a kid.

Two unexpected things happened on that day 30 years ago. One was seeing the Los Angeles Raiders' Marcus Allen, Howie Long, and Lyle Alzado roll over the Washington Redskins' Joe Theismann, Art Monk, and John Riggins. The game was announced by Pat Summerall and John Madden (former coach of the Super Bowl champion Oakland Raiders), and the upset was so surprising, and such a beat-down, that the day was called "Black Sunday".

The other unexpected event occurred as I was eating my Beefaroni about midway through the 3rd quarter. The broadcast cut to a commercial, the screen momentarily went dark, and what aired next became part of marketing and tech history.

For some reason I was recording it all on a VCR, most likely because our new VCR (VHS, not Beta, featuring an innovative blue blinking “12:00AM” indicator) was the coolest thing in the world at the time besides my Atari 2600, so I was likely recording everything on TV.

In any case, for all you Apple lovers and haters, we dug through our old tapes, digitized them, and zeroed in on the iconic ad, nestled within a little Super Bowl context, for your enjoyment, love and hate.

It features Ridley Scott as director, Anya Major as the hammer-thrower, and, most likely, IBM as Big Brother (later replaced by Microsoft, and later still the NSA). Many consider it an advertising masterpiece and a watershed event. Steve Jobs was fired a year later.

(NOTE: the Apple 1984 ad starts at :09. We wanted you to see it as it happened, in context.)
 


3 Time-Saving Tips When Cleaning Up Your Link Profile

December 31, 2013

Google Webmaster Tools Manual Penalty Section

Prepping for a link cleanup project is no easy task. Ever since Google's Penguin updates started cracking down on unnatural linking practices in 2012, many site owners who enlisted the aid of unscrupulous SEO services to build hundreds or even thousands of links for them have eventually found themselves on the receiving end of a penalty. Seeing your organic traffic tank after a Google algorithm update is a scary realization, but do not lose hope.

If you think your site may have been penalized by a recent algorithm update, our AlgoSleuth Tool can help you see the organic traffic trends and know for sure. You will also want to check Google Webmaster Tools for your site and see if you’ve been dealt a Manual Penalty. If there are no messages in Webmaster Tools then the penalty is most likely algorithmic. If AlgoSleuth identifies a particular update where you were hit hard or you see a manual penalty identified in Webmaster Tools, you need to audit and most likely clean up your backlink profile. A thorough write-up of the steps involved can be found here. Below we outline 3 time-saving recommendations to keep in mind when undertaking a link cleanup project.

1. Pull from as many data sources as possible.

Google has said that you generally don't need third-party tools to find all the "unnatural" links pointing to your website, but from personal experience and the experience of others, this just hasn't proven true. Multiple backlink data sources such as Ahrefs, Moz, and Majestic SEO (in addition to WMT) will help uncover as many of these unnatural backlinks as possible, ensuring that you do the cleanup right the first time around.

2. De-duplicate.

If you followed the previous step, you'll have at least 3-4 data pulls in CSV format to deal with. To save yourself time, import all of this data into one spreadsheet. The only column headers you really need are Backlink URL and Anchor Text; feel free to also include the URL on your site that the backlink points to, if desired. The rest can be deleted when importing your pulls into one spreadsheet. Once they are all in one master file, de-duplicate your list so you are not wasting time going over anything twice. This can be accomplished in both Excel and Google Docs. For Google Docs, create a new "Master Data" tab. In cell A1, paste the formula shown below, replacing the portions underlined in red with the tab names of your data pulls.

Google Docs De-Duplication Formula:

=unique(query(vmerge('Open Site Explorer'!A1:B; 'Majestic SEO'!A1:B; 'Ahrefs'!A1:B); "select * where Col1 <> ''"; 0))

If you are working in Excel, you’ll first need to copy all of your data from each of the 3-4 tabs into one single “Master” tab. After you’ve done that, you can follow the directions here under “Filter for unique values”. The end result will be the same as the Google Docs method, outputting a list of unique backlinks to work with.

3. Filter out 404s.

If your backlink profile is sizable, this step can save you quite a few hours. The SEO tools mentioned earlier, such as Ahrefs or Moz, do a great job of crawling the web for these backlinks, but your backlink lists will undoubtedly contain 404s from sites that have since closed up shop. Before you start going through your master list to ask for link removals, filter out these 404s to double back on later or add to your final disavow list. This is easily accomplished in Google Docs with a simple custom script. After installing the script, add a new column to your master tab called "Link Status". Now you can call =HTTPResponse() on any cell containing a URL and see the current HTTP status. After doing this for one cell, you can drag the formula down and it will automatically run on your URLs, as shown below.

Google Docs HTTP Status Checker

If you are working in Excel, install the free SeoTools plugin, which lets you call HttpStatus() on a cell in the same way as the Google Docs method. After you've done this in Google Docs or Excel, you can quickly filter your "Link Status" column to exclude backlinks with a 404 status. This lets you focus on all the live links first and double back later on the 404s.
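If you'd rather check statuses outside of a spreadsheet altogether, a few lines of Python do the same job. This is a rough sketch assuming the third-party requests library is installed, and the CSV file name is a placeholder:

  import csv
  import requests

  # Read backlink URLs from the de-duplicated master list (one URL per row, first column)
  with open("master_backlinks.csv", newline="") as f:
      urls = [row[0] for row in csv.reader(f) if row]

  # Record the HTTP status for each backlink so 404s can be filtered out later
  for url in urls:
      try:
          status = requests.head(url, allow_redirects=True, timeout=10).status_code
      except requests.RequestException:
          status = "error"
      print(url, status)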

Final Thoughts

Now that you've combined your lists, de-duplicated, and prioritized by live links, you're ready to start combing through the list manually to find potential bad eggs among the bunch. It's not too hard to discern natural links from "unnatural" ones, but if you want to save even more time, a link cleanup tool such as LinkDetox is recommended. It helps you quickly ID the bad links and even serves as an organized place to contact webmasters, request link removals, and keep good records.

Cleaning up a backlink profile that has been hit by Penguin is never easy. Even for smaller sites, a full-scale analysis and cleanup can take weeks. Just as you encounter frustrations when building new links, you will when cleaning up old ones. Many webmasters will be unresponsive when asked for link removals. Some will even ask for money to remove the link(s) or flat out refuse to remove them. There will be times when you just want to bang your head against your desk, but in the end, cleaning things up can give you a much better idea of how to move forward and build better-quality links in the future. As always, if you have any questions about this post or need some extra help, feel free to drop us a line.


How Do I Measure Remarketing Performance?

October 22, 2013

How do you know if you're remarketing well? Remarketing is a tricky campaign type to measure, especially since it's more likely to play a big part in assisted conversions than any other campaign type. For ecommerce clients, the "Time to Purchase" report can give you some insight into how remarketing compares to the typical conversion cycle for last-click conversions.

The “Time to Purchase” report (categorized under the Ecommerce Conversion reports) has two views: “Days to Transactions” and “Visits to Transactions.” These two reports together give you a sense of just how engaged remarketed users are within the conversion cycle.

  
Photo: Company A remarketing results showing low engagement.

In the above report to the left, 18.18% of remarketing transactions have occurred 7-13 days after the initial visit. However, we see to the right that most remarketing transactions occurred in less than 4 visits to the site. What we’re seeing here is that, potentially, one to two weeks can go by with few visits back to the site before a remarketed customer will convert. This company may be interested in trying to close that time gap with a more robust remarketing strategy.

Alternatively, in the example below, most remarketing transactions occurred less than 4 days after the initial remarketed visit, yet a significant portion of visitors came back to the site 7-25 times before converting. These visitors are highly engaged over fewer days. The company below had remarketing results much like what's shown above until we put an expansive remarketing strategy into place. Now remarketing brings in a significant portion of last-click conversions and assists in almost every AdWords conversion.

  
Photo: Company B remarketing results showing high engagement.

Take a look at your “Time to Purchase” reports. If it looks like you’re not getting much engagement through remarketing, it might be time to rework or start up a new strategy. If you’re not sure where to get started, let us know so that we can help.


10 Interesting Ecommerce Facts & Trends

October 10, 2013

The ecommerce industry has become a major part of worldwide consumerism and is now baked into popular culture and daily life. Companies like Amazon and eBay are household names and often the first choice when something, anything, needs to be purchased.

Estimates place worldwide ecommerce sales at $1 trillion in 2012, a 26% increase from the previous year.

With ecommerce representing such a massive money-making opportunity with relatively few barriers to entry, it is no surprise that this industry experiences more disruption than many others – often leading to wild swings in consumer trends.

Here are 10 interesting facts and trends about ecommerce that you may not currently know:

  1. Pizza Hut was one of the first major brands to experiment with online commerce, starting in 1994. (Screenshot: Pizza Hut - Welcome to PizzaNet!)
  2. Ecommerce is predicted to represent 10% of all US retail by 2017.
  3. North Dakota, Connecticut, and Alaska lead all US states in ecommerce sales per capita.
  4. India is home to the fastest growing ecommerce market, and France is experiencing the slowest growth.
  5. 80% of the online population has used the internet to make a purchase, and 50% of the online population has purchased online more than once.
  6. ‘Apparel and Accessories’ is the fastest growing ecommerce sector of the 9 major categories.
  7. Although it launched in 1995, Amazon wasn't able to turn a profit until 2003. (Screenshot: Amazon's First Gateway Page)
  8. 26% of all products added to cart are abandoned and never purchased.
  9. 44% of smartphone users admitted to "showrooming": they browsed products in brick-and-mortar stores, picked what they liked, then purchased online.
  10. During the third quarter of 2012, $4,423 was transacted via PayPal every second.



The 4 Most Overlooked Items During Google Analytics Setup

September 10, 2013

Google Analytics offers incredible insights into your site visitors, and the data it provides helps you make optimal business decisions. That being said, there are countless items in Analytics that end up overlooked when a new site is launched. If you want the cleanest, most relevant data possible from Google Analytics, here are four key items to pay attention to during setup.

1. Site Search

An easily overlooked Analytics option, site search can provide invaluable data on how your users are utilizing your internal search. Setting this up is a very simple process. First, perform a search on your site, then look in the resulting URL for the term you searched for. The string between the "?" and the "=" is the query parameter. In the case of Delegator.com, the query parameter is "s", as seen below.
Site Search Query Parameter
Now that you know your query parameter, head over to the Admin panel in Google Analytics and click “View Settings” at the Profile level. Towards the bottom, you will see what’s in the screenshot below. Just check the box for “Do track Site Search”, enter your query parameter, click Apply, and you’re good to go!
Google Analytics Site Search Setting

2. IP Filters

Decisions are best made when backed by accurate, meaningful data. There’s no quicker way to muddle your Google Analytics data than to ignore its awesome filter options, specifically IP filters. A common best practice is to exclude the IP addresses of anyone on your team who is regularly on your site, so you’re not skewing your data.

To find out your IP address, just visit WhatIsMyIP.com and record the IP address it returns. Next, head over to the Admin panel in Google Analytics, click on "Filters" at the Profile level, then click on "New Filter". Enter a name for the filter and follow the layout in the screenshot below, inserting the IP address you recorded earlier. Then just click Save and you're done! Repeat this process as needed for additional employees or internal computers.

IP Filter - Google Analytics

3. Linking AdWords & Webmaster Tools

As simple as these two items are, I’ve seen them overlooked time and time again in Analytics audits. If you don’t link your AdWords account to Analytics, you are flying blind on your AdWords spend with respect to on-site metrics and ecommerce data. Don’t be that guy – it’s foolish to ignore such juicy, free data. Google walks you through the process in very clear detail here.

While you’re at it, linking your Webmaster Tools is a very simple process as well, outlined here. This lets you view Webmaster tool data within the Analytics interface, which is a lot more streamlined and easier to navigate.

4. Missing/Inaccurate Ecommerce Tracking

There are few things more frustrating in Google Analytics than having no ecommerce data for an ecommerce website. Setting up ecommerce tracking is a trickier piece that will almost certainly require a developer's help to implement, but it is absolutely invaluable to your long-term success. If you don't have crystal-clear transparency into how much revenue your various traffic sources are driving, then you are making suboptimal decisions, plain and simple.

Ecommerce Tracking Fail
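For a sense of what your developer will be implementing, here is a minimal sketch of the Universal Analytics (analytics.js) ecommerce calls fired on an order confirmation page. It assumes the standard tracking snippet is already installed, every value below is a placeholder, and older ga.js installs would use the _addTrans/_trackTrans equivalents instead.

  <script>
    // Load the ecommerce plugin, describe the transaction and its line items, then send
    ga('require', 'ecommerce');
    ga('ecommerce:addTransaction', {
      'id': '1234',                  // unique transaction ID
      'affiliation': 'Example Widget Co.',
      'revenue': '119.99',
      'shipping': '5.00',
      'tax': '9.60'
    });
    ga('ecommerce:addItem', {
      'id': '1234',                  // same transaction ID as above
      'name': 'Blue Widget',
      'sku': 'BW-3IN',
      'category': 'Widgets',
      'price': '114.99',
      'quantity': '1'
    });
    ga('ecommerce:send');
  </script>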

These four items represent only a small portion of what often gets overlooked in your average Google Analytics setup. There are countless things you can do with remarketing lists and custom metrics, and the same can be said for custom dashboards, reports, and advanced segments. The moral of the story? Don't settle for "Vanilla Analytics". Get out there and make your GA data more accurate and relevant!


Congrats to Client Global Green Lighting

August 29, 2013

We’d like to congratulate Global Green Lighting President & CEO, Don Lepard, and the entire GGL team on a successful Grand Opening! We were proud to be a part of such an exciting day and enjoyed celebrating the Grand Opening event with other business leaders and elected officials.

Global Green Lighting President & CEO Don Lepard Cuts The Ribbon

GGL President & CEO, Don Lepard, cuts the ribbon with Gov. Haslam, GGL Vice President, Mack Davis, Rep. Chuck Fleischmann and Chattanooga Mayor Andy Berke.

Global Green Lighting Grand Opening With Governor Haslam, Mayor Berke And Elected Officials

Gov. Haslam, Mayor Berke and other elected officials listen to a training demo showcasing GGL’s lighting and wireless technology.

Governor Haslam Speaks at Global Green Lighting Grand Opening

Gov. Haslam speaks during the Grand Opening Ceremony in front of GGL’s new manufacturing lines.

To read more about the event, see the Chattanooga Times Free Press or Nooga.com stories. For more information about Global Green Lighting, visit globalgreenlighting.com.