Category Archives: Search Marketing

SMX East 2014 – The Value Part 1

There are many conference options available for individuals interested in staying sharp with their online marketing skills. Popular events include Pubcon, SMX, SES & ClickZ Live, Internet Retailer (IRCE), Content Marketing World, and many others. The biggest organizations host more than one event each year in various cities around the world. For example, at last count there are 16 different SES/ClickZ events from Atlanta to Shanghai scheduled to take place in 2014.

With all these choices, TKG has decided to send two of our online marketing strategists to Search Engine Marketing Expo – SMX East. (For the record, we’re also sending a few individuals to Content Marketing World.)

SMX East takes place on September 30, 2014, in New York City.

With all the options out there, why did we choose this event?

  • Trusted Host: The SMX events are run by Third Door Media, the company behind SearchEngineLand.com, an authority in the SEO space that we follow regularly.
  • Proximity: While we’d like to attend one of the SMX events in Milan, Beijing, or London, New York City is the most cost-effective travel option for a company based in Ohio.
  • Convenient Date: Let’s be honest, if an event doesn’t fit in your schedule, it doesn’t matter how great the content is – you’re not going! The end of September works well for us.

What we hope to gain from the event:

  1. Actionable Ideas: We want specific tips, tricks and strategies from our marketing peers that we can take back to the office and implement for our clients. On a small scale, it could be something as simple as a new Google AdWords report in Google Analytics we weren’t aware of. On a larger scale, it could be a social media strategy that boosted engagement for a company with a small budget. At any rate, our number one goal is obtaining real-world knowledge that can help our clients grow their businesses.
  2. Latest Online Marketing Trends: With the constant changes in Google, content marketing, mobile marketing, and more, it’s important to stay ahead of the curve. A few years ago, responsive design was a new trend shaping the way users experience websites on mobile devices. Now it’s being utilized by nearly every new website we build for clients. We want to know what the next big thing in online marketing is going to be so we can adjust our strategies for the future.
  3. B2B and Small Business Insights: Many conferences of this size bring in guest speakers from Fortune 500 companies that spend hundreds of thousands of dollars on a single marketing campaign. TKG doesn’t have any clients willing to spend $50,000/month on advanced analytics platforms. Instead, our sweet spot is mid-sized businesses that understand the value of getting sales and leads from their website. Many of these businesses sell to other businesses rather than end users. Accordingly, while it’s nice to hear from presenters who spent $5 million to help Abercrombie & Fitch launch a new line of jeans, that doesn’t really help our client that provides financing to staffing agencies. We’re hoping SMX understands the diversity of its audience and doesn’t just focus on B2C marketing.

Stay tuned for part 2 of our SMX East post in October when we recap the event and describe whether or not we learned what we hoped.

What is Ranking Retention?

If you’ve ever re-launched your website, hopefully you’ve heard the phrase “ranking retention” or maybe “ranking protection”. At its most basic, it’s a series of steps you take when re-launching your website, designed to minimize the negative impact of the change. While the phrase implies that you’re preserving your search engine rankings, these steps also preserve the user experience.

So, what is ranking retention exactly?

We tend to think of it as two separate tasks, actually. The first is matching up old URLs with new URLs and applying re-directs, and the second is updating deep links to your site.

301 Re-directs

Let’s say that your old site was built with URLs that ended in .html (or .asp or .php) and your new site doesn’t use this extension (a common scenario for sites that haven’t been rebuilt in the last 5 or 6 years). Even if you keep the page names exactly the same (i.e. www.site.com/about-us.html vs. www.site.com/about-us), because the extension was removed, Google (and other search engines) still view these as two separate URLs. Since one is really just an updated version of the other, you need to re-direct about-us.html to about-us so that the search engines understand that they should stop indexing the old URL and start indexing the new URL in its place. These are called 301 re-directs. Wikipedia defines a 301 re-direct as follows:

“The HTTP response status code 301 Moved Permanently is used for permanent redirection, meaning current links or records using the URL that the 301 Moved Permanently response is received for should be updated to the new URL provided in the Location field of the response.”

And Google in particular recommends using them when you want to change the URL of a page as it’s shown in search results.
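
Curious what that looks like under the hood? Here’s a minimal sketch of a 301 at the application level, written in Python with Flask. It’s purely illustrative – on most sites, redirects live in the web server’s configuration (Apache, nginx, IIS) rather than in application code, and your development team will know the right place for yours.

```python
# Illustrative sketch only: a 301 redirect issued from application code.
# Most sites configure this at the web server level instead, but the idea
# is the same: respond with status 301 and the new URL in a Location header.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/about-us.html")
def old_about_page():
    # Permanently redirect the old .html URL to the new extensionless one
    return redirect("/about-us", code=301)

@app.route("/about-us")
def about_page():
    return "About Us page content"

if __name__ == "__main__":
    app.run()
```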

While the actual setting up of the re-directs is probably something you’ll have your web development or hosting team handle, creating the list is something that you or your marketing team should do. It’s usually as simple as setting up an Excel file with the old URL in one column and the new URL in another; every page from your old site that has a version on the new site should be matched up against a new URL.

If there are pages on the old site that won’t exist in any form on the new site, you do not need to (or want to) create a re-direct – just let these pages go. Once you have the list, provide it to the appropriate people and they should be able to apply these at launch.
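
Once the re-directs are live, it’s worth spot-checking them automatically rather than clicking through by hand. Here’s a minimal sketch in Python, assuming you’ve saved the Excel file as a CSV with hypothetical old_url and new_url columns and have the requests library installed:

```python
# Sketch: verify each old URL returns a 301 pointing at the expected new URL.
# Assumes redirect_map.csv with old_url,new_url columns (hypothetical names -
# match them to your own spreadsheet export).
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Don't follow the redirect; we want to inspect the 301 itself
        resp = requests.get(row["old_url"], allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code == 301 and location == row["new_url"]:
            print(f"OK   {row['old_url']} -> {location}")
        else:
            print(f"FAIL {row['old_url']} returned {resp.status_code} -> {location}")
```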

Once applied, there are 2 benefits:

  1. You’ve now directed Google (and other search engines) on how you would like them to index and rank your site. There is, of course, no guarantee – Google might decide that even though it showed your page for a particular keyword search before, it’s no longer the best fit – but if you’ve lined up your re-directs well and have had an eye towards choosing the right keywords and creating good content, you are in the best shape possible for this to be successful.
  2. Users who click on your site on a search results page will be re-directed to your new site instead of just getting a dead link.

Can you skip this step? Sure, but why would you want to? This is the easiest way to a) make your links in the search results still clickable and b) tell the search engines what you would like them to do. And, having seen sites try to recover from a re-launch without doing this vs. sites that have done it, I’d vote for this every time – if for no other reason than it is incredibly easy to do, and it is terribly frustrating sending users to dead page after dead page while you wait for Google to crawl your new site and figure out on its own what pages it should index and de-index and how it should rank them.

One caveat:

As you continue to grow and market your website, you will likely come up with other legitimate reasons to use 301 re-directs, and you may wind up re-directing a page to a page that has been re-directed to another page, and so on. Sounds crazy, but we’ve seen it happen. What you’re left with is a chain of re-directs that don’t always work the way they’re supposed to and can sometimes land visitors in the wrong place or no place at all. So be sure to monitor your site’s performance (is Google indexing the updated URLs?) and consider removing 301s when they are no longer needed. You will lighten the load on the server and minimize the possibility of crazy 301 loops down the road.
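
If you suspect chained re-directs have already crept in, a short script can trace them. A sketch, again using Python’s requests library (the URL at the bottom is just a placeholder):

```python
# Sketch: follow a URL's redirect chain hop by hop to spot long chains or loops.
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    """Print each hop; stop at the final page, a loop, or max_hops."""
    seen = {url}
    for hop in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            print(f"Final destination ({resp.status_code}) after {hop} hop(s): {url}")
            return
        # Location headers can be relative, so resolve against the current URL
        target = urljoin(url, resp.headers["Location"])
        print(f"{url} -> {target} ({resp.status_code})")
        if target in seen:
            print("Redirect loop detected!")
            return
        seen.add(target)
        url = target
    print(f"Gave up after {max_hops} hops - this chain is worth cleaning up.")

trace_redirects("https://www.example.com/old-page.html")  # placeholder URL
```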

Link Updates

The second step in a ranking retention project is updating links to your site. To be clear, these are not links from your site to your distributors or partners, but rather links from their sites to your site.

There are a lot of different tools that can help you find a list of sites that are linking to you – Google Webmaster Tools is usually a good place to start. Once you have this list, you’ll need to separate the links so that you have a list of sites that link to your homepage (this URL usually isn’t changing, unless you are changing domains) and a list of deep links (i.e. links that go to a product page or somewhere else in your site other than your homepage). It’s usually this second list where you might consider requesting updates. If you’ve only got a few links, it’s not such a big deal. But if you’re a larger site that’s been around for years, you might have hundreds or even thousands, and you’ll want to set up a spreadsheet similar to the 301 list so that you can see what sites are linking to you, where they are linking and where you want them to link to. During this research stage, you should also be collecting contact information for these sites so that you can reach out to them.
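
If your backlink export is large, a few lines of scripting can do the homepage/deep-link split for you. A sketch in Python, assuming a CSV export with hypothetical source_url and target_url columns (match these to whatever your backlink tool actually exports):

```python
# Sketch: split a backlink export into homepage links and deep links.
import csv
from urllib.parse import urlparse

homepage_links, deep_links = [], []

with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        path = urlparse(row["target_url"]).path
        # Treat "/" (or an empty path) as the homepage; everything else is deep
        if path in ("", "/"):
            homepage_links.append(row)
        else:
            deep_links.append(row)

print(f"{len(homepage_links)} homepage links, {len(deep_links)} deep links")
# The deep_links list is the one to review for update requests
```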

There have been a lot of questions over the years about whether this step is even necessary. Is it more valuable to have a site that correctly links to exactly the right URL, or to leave the link alone and let your 301 re-directs handle passing the user over? A recent response from Google says that it doesn’t actually matter.

So… can you skip this step? As much as I want to say “yes” (because, let me tell you, requesting link updates is maybe my least favorite thing to do at work ever), there is at least one good reason to keep doing it:

Your 301s might not always be on the server. Let’s say that you re-launch your site and put 301s in place. Some time down the road, someone (maybe you!) notes that Google is indexing all of the new pages and decides to remove the 301s from the server. Although the 301s served their purpose and helped Google re-index your site, if you were counting on them to re-direct users who clicked on now-defunct links, once they are gone, users are reaching dead links. Talk about a bad user experience! This is especially troublesome for the sites that were referring a lot of traffic to you. Once the 301s are gone, all of that traffic is instead going to your 404 page (if you have one – hopefully you have one. You do have one, right?). Bummer.

If you’ve got thousands and thousands of backlinks, you may need to prioritize where you make these link update requests (for example, the sites that actually send traffic), but it’s worth taking the time to determine where you’ll get the biggest bang for your buck and, at minimum, make those update requests.

Sadly (and this is why it’s my least favorite task), despite your best efforts, you may not get a lot of response. We’ve done some really successful link update campaigns and a lot where our requests fell on deaf ears… but as part of our due diligence and setting sites up in the best possible way, it’s still something we recommend doing. And, we tend to find that the sites with the best links get the best response – there’s a lesson in that too!

Have you re-launched your site recently? Did you engage in ranking retention? How did it go? If you didn’t do it, what impact did you have from skipping this step? Tell us in the comments!

Google Panda 4.0 Impacting Press Releases

On May 20, Google announced Panda 4.0, an update to the Google Panda Algorithm, a filter on Google’s search results designed to stop sites with poor quality content from working their way into Google’s top search results. Updates to Panda happen every now and then and have had various impacts on sites’ performance. The overall message has been and continues to be: build good quality content that users want and the search engines will figure it out.

But what about press releases? Surely news from your company is considered good content?

Well, maybe.

The problem is that in the link building years of SEO, press releases were an easy way to publish content on another site that links back to yours – and SEOs abused the heck out of it! Entire link building strategies were built around buying links in directories (now usually considered a spammy link building method) and publishing press releases on sites like PRWeb, PRNewswire, PR.com, etc. And not only were you linking back to your site when you published these press releases, you were providing an opportunity for your brand to be found on those PR sites, too.

It worked for a while.

But Search Engine Land is reporting that many of the favorite PR sites are feeling the impact of Panda 4.0 and it’s not a happy change! For example:

  • PRNewswire.com saw a 63% drop in visibility after Panda 4.0 was released
  • BusinessWire.com saw a 60% drop

If you’ve sent press releases to these sites and spent the hundreds of dollars they typically cost, this is a real blow to your strategy. (Not a surprising one – but a real one!)

So, what should you do?

First of all: stop it. Stop sending press releases to press release sites that cost hundreds of dollars and will publish just about anything as long as you can try to make it sound like news and pay the fee.

Second: Only write good press releases about REAL news. (That’s certainly why these sites were hit – plenty of PR content that really wasn’t news!) If you aren’t sure if something is real news, it’s not.

Third: Create a hub and spoke content strategy. Publish real press releases with real news on your web properties and distribute via social and other channels, pointing back to your hub. This way you own the content, you are driving traffic to your site (where you can then pull visitors to other things you want them to know or do) and it won’t cost you a dime. You’ll also be on your way to building out your site in a way that is natural, non-spammy and won’t be impacted by the content that other people are publishing on the site.

Has your site been affected by a Google update? How so? What have you done to minimize the impact? Tell us in the comments!

The Real Deal: Why You Should Ask Your SEO Firm for Their Case Studies

Why should you ask an SEO firm for some case studies before hiring them?

Well, there’s the obvious. You want to see their successes, and how they were achieved. But more than that, you want to know if they really know what they are doing.

For example, it’s not all that uncommon to see case studies posted on an SEO firm’s site that are a little vague – they don’t mention the company by name, or have some other irregularity. This could be a red flag for you. So a few questions you should ask:

  • What company was the subject of this case study? (Then follow up on your own – make sure it is a real company.)
  • Who is your contact at the company in question? Can we call for a reference?
  • Was the work done in house or contracted out?
  • Does the case study list the tactics used to generate results?

These questions should give you a pretty good idea of whether or not the firm is up front about their business practices, if they really know SEO, and if they really know their client.

Take a good look at their case studies and do your own homework. Look up the company, even the contact. You want to make sure both are real. You never know – we’ve come across more than one phony case study while doing our own research.

Asking for a reference is always a good idea. Most firms will want to give you the chance to hear about the great work they did, and speaking with a reference will confirm that the SEO firm actually did the work and that the company was pleased, or at least satisfied.

Determining whether the SEO work was done in house or contracted out is a bigger deal than it might seem on the surface. You want to know that the firm/person doing the work understands the goals of your business. Hopefully, if you have a web presence, you have a goal for your site and have a pretty good idea of what you want to achieve. If the work is contracted out, some of that gets lost in translation. It also could mean that the same person or firm who did all the great work on the case study company won’t be the person doing the work for your company.

If you’ve been engaged with the web long enough, you know that there are SEO tactics that are good ideas and some that aren’t so great. Make sure that the SEO firm you hire has outlined how they approach SEO, and that they don’t make any kind of huge promise of traffic, sales, etc. No one can promise those things, but good, solid SEO and all that entails will certainly get you headed in the right direction. A good case study will tell you what tactics were used and what impact they had.

In the end, due diligence here will serve you well, just like in every other aspect of your business. Read up on the case studies before you decide on a firm. Take a look at some of TKG’s case studies.

Do you have a great or phony case study that you’d like to share? Post it in the comments – we’d love to see it too!

2014 Search Engine Trends

A year has passed since we first evaluated the usage data from all the major U.S. search engines. Now it’s time to check in and see what changed 12 months later.

As a reminder, comScore releases search engine data monthly. With January officially in the books, we can compare January 2014 with January 2013, January 2012, and January 2011.

Search Share:

Explicit Core Search Share

Ready for the broken record? With 67.6%, Google continues to dominate U.S. market share. In addition to being the most popular search engine, the company continues to grow its search share each year. In second is Microsoft’s Bing with 18.3%. This is a nice bump up from 16.5% in January 2013 and an all-time high for the search engine. While Bing is showing growth, the fact remains that it still failed in its quest to take a chunk out of Google’s pie. Instead, it continues to pull market share from Bing-powered Yahoo, Ask, and AOL, all of which continue their slow and painful descent into irrelevance.

Explicit Core Search Queries

In addition to market share data, comScore released totals for explicit core search queries. This is a measure of how many traditional searches take place in the U.S. across all the search engines. In January 2014, there were 19.561 billion searches completed, compared to 19.484 billion in January 2013 and 17.804 billion in January 2012. That’s nearly a 10% increase from 2012 to 2014. This increase is smaller than the one we noted in last year’s review, but it’s still steady growth.

What does this mean for your business in 2014?

With more visitors than ever using search engines to find and research your company, it’s important to make sure the information on your website is accurate. Have you introduced a new product or service recently? If so, have you incorporated it into all the logical places on your site? Does your VP of Operations have a bio page saying, “I love my two boys” even though she had a third boy last summer? Did you create a great video for a tradeshow but never add it to your YouTube channel? Do you still have that “new features” PDF available for download that your supplier sent you in 2012? You’d be surprised how much of your site’s information might be outdated even though you’re convinced your business hasn’t changed. Time to get an early head start on your web content spring cleaning project!

Will Google Analytics Always Be Free?

Google Analytics is one of our favorite tools for understanding how our clients’ websites are working. It is powerful, easy to use and free. But that last point is increasingly being questioned. With the recent increases in (not provided) keyword data (see our look at that topic), many online marketers are beginning to suspect that Google is positioning itself to be able to charge for the popular analytics platform.

Source: Google Analytics on Google+

While the idea that Google could be withholding information in an effort to make a little money from analytics users certainly makes for a good conspiracy theory, I don’t think it’s credible. Google’s business model is based on organizing all the information on the internet. By providing data and information about how sites are performing, Google encourages site owners to improve their sites, which in turn benefits its services. My guess is that what Google gains from helping site owners understand and improve their sites is worth more than what it calculates it could make by charging for the platform.

But that isn’t the only approach Google could take. They already offer a Premium service level for enterprise-level customers. With an annual price of $150,000, it isn’t targeting the majority of sites; instead, it offers big sites (with big budgets) the extras that they need. This approach lets Google focus on making fewer, high-value sales while providing a great tool to smaller sites.

So, will Google Analytics always be free? I think that Google will continue to offer a very powerful analytics platform at no cost to small and medium sites. The Premium option will continue to target the big, high value sites. So, if you are looking for a solid platform to measure and track the performance of your site, I think you are safe going with Google Analytics without fear of surprise costs.

Do Meta Keywords Impact SEO?

Many content management systems have fields available for common SEO tags such as page titles, keyword-friendly URLs, meta descriptions, and meta keywords. Since those fields are often under a section labeled “Search Engine Optimization,” one may assume filling out all of them would increase the chances of having the pages of a website rank well with Google and the other search engines.

Not to mention, websites like Social Media Today continue to make statements such as:

Meta keywords and Meta descriptions are an important part of your SEO strategy.

The casual web marketer would be fairly confident meta keywords should be included in their SEO strategy.

Those casual web marketers would be wrong.

The Karcher Group stopped providing meta keywords to clients about a year ago and more recently took it one step further and started removing existing meta keywords from websites that previously had them.

But why?

Here’s an article from Google’s own Matt Cutts from waaaaaaay back in 2009 that clearly states that Google does not use the meta keywords tag.

Yes, the last paragraph does include, “It’s possible that Google could use this information in the future, but it’s unlikely.” That’s why we continued to provide meta keywords for our clients.

Then meta keywords went from harmless to harmful.

Over the last year, reputable SEO websites such as Search Engine Land and SEM Rush started writing about how populating meta keywords can actually penalize your SEO efforts, particularly with Bing.

At this point, we’re confident meta keywords will not benefit SEO rankings and should be avoided.
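
If you want to confirm your own pages are clean, a quick script can flag any leftover meta keywords tags. A minimal sketch in Python using the requests and beautifulsoup4 libraries (the page list is just an example):

```python
# Sketch: flag pages that still carry a meta keywords tag.
import requests
from bs4 import BeautifulSoup

pages = [  # example URLs - substitute your own page list or sitemap
    "https://www.example.com/",
    "https://www.example.com/about-us",
]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "keywords"})
    if tag:
        print(f"{url} still has meta keywords: {tag.get('content', '')!r}")
    else:
        print(f"{url} is clean")
```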

Not all meta tags are bad

This article is specific to meta keywords. NOT meta descriptions and NOT keywords in general. You should still write unique, quality meta descriptions for each page of your site and you should still identify relevant keywords to incorporate into the other SEO tags.

Just don’t waste your time filling in that meta keywords box!

Google Doesn’t Really Care if You’re Ugly

Google seems to favor the fast and easy type. We’re talking websites here. Google says fast and optimized pages lead to visitor engagement and conversions – two things The Karcher Group keeps an eye on when offering professional SEO services as part of our online marketing strategy.

Google offers valuable tools for you to look under the hood of your website and perform necessary analysis and optimization. These include the following options:

PageSpeed Insights will identify performance best practices you can use for your website. The PageSpeed optimization tools look to provide a level of automation to the process.

PageSpeed Insights offers up a lot of recommendations and technical guidance to help you get a lean and mean website – things to avoid, such as landing page redirects, and things to consider, such as server response time. All of these recommendations aim to optimize the performance of your website.
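
You can run these checks by hand on the PageSpeed Insights site, or pull scores programmatically. Here’s a minimal sketch in Python against the public PageSpeed Insights API – API versions and response fields change over time, so treat the endpoint and field names below as an illustration and check Google’s current documentation:

```python
# Sketch: pull a performance score from the PageSpeed Insights API (v5).
# Endpoint and field names are illustrative - confirm against Google's docs.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(API, params={"url": "https://www.example.com/"}, timeout=60)
data = resp.json()

# v5 responses carry a Lighthouse performance score between 0 and 1
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score: {score * 100:.0f}/100")
```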

All of this is great information to use on your site to help it rank higher when someone searches for your product or service. The truth of the matter, though, is that once Google serves up your website, a human must look at it and actually use it. That’s where good looks (a.k.a. User Experience Design) and a friendly smile can set you apart from the competition.

Good User Experience Design (UX) helps serve up a good experience when customers find your website. It considers how the website looks and how its content is organized, all for the sake of usability and accessibility. UX aims to present things the way a human would want to see them, as opposed to how a computer would organize and store data.

So, is your website looking ugly? Or maybe it’s real easy on the eyes but still not making sales? Take a look at Google’s tools and then let me know if you have questions. We can talk about design and search engine optimization.


2014 SEO Strategy: Microdata

If you’re like us, you’re already thinking about what your 2014 SEO strategy will include. If you’ve got us, don’t worry, we’re on it. :) If you don’t, well, it’s time to get semantic markup/schema/microdata/whatever you want to call it on your radar.

What is Microdata?

At a high level, it’s a way of formatting information about your site so that it is highly readable by Google. At a more detailed level, well, Josh McDermitt already covered it so I’ll let you read his microdata post.

What does Microdata mean for your site?

Improved user experience

Microdata isn’t actually visible to your visitors, but it is visible to search engines and impacts how your site shows on the search results page (these enhanced listings are called “rich snippets”). If you can get Google to display all the important information about your site, you’ll attract the right visitors – you know, the ones actually interested in what you have to offer. And visitors finding the right information are much more likely to engage with your business (fill out a form, call you, spend money!).
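
Not sure whether a site already has microdata in place? It’s easy to check: the markup shows up as itemscope/itemtype attributes in the HTML. A minimal sketch in Python using requests and beautifulsoup4 (the URL is a placeholder):

```python
# Sketch: check whether a page contains any schema.org microdata.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder - use the page you care about
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# Microdata rides along on ordinary HTML tags as itemscope/itemtype attributes
items = soup.find_all(attrs={"itemtype": True})
for item in items:
    print(item["itemtype"])  # e.g. http://schema.org/Product
print(f"{len(items)} microdata item(s) found")
```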

Improved click-through rates

Which listing would you click on?

Two search listings: one with an author photo, one without

If you said the one with an image, you would be right. Of course not every B2B site, for example, has a need for showing an author’s picture, but if you run a blog, you can bet it’s going to make a difference! And if you’re selling something, getting the image of your products to show on the results page can also make a huge impact.

And, finally, if you’ve got everything else in place:

An improved bottom line

If you’re attracting more of the right visitors because Google is displaying the right information about what you offer, and if you’ve got good conversion points in place… Well, it stands to reason that microdata can help you make more money. And, let’s be honest, we all like making more money.

So, is microdata on your radar for 2014? Are you already using it? Tell me in the comments!

An Introduction to the Google Search Algorithm

In the world of Search Engine Optimization (SEO) the Google Search Algorithm is of central importance. It is the focus of many blog posts, rants, arguments and complaints. But what is it? At the simplest level the Google Search Algorithm identifies what stuff on the internet is most relevant to your search. But let’s take a closer look and get an idea of what is going on.

What is an Algorithm?

Before we start looking at Google Search, let’s step back and look at what an algorithm is. Wikipedia defines algorithm as a “step-by-step procedure for calculations.” Essentially, an algorithm takes an input, does some calculating and provides an output. For our conversation today we can think of a Google search. Consider this very simple visual representation using a flow chart.

Simple Search Algorithm Flowchart

Simple, right? You give Google a few words as an input, the search algorithm does some calculations and gives you a list of relevant stuff on the internet. While this isn’t wrong, there is a lot more going on.
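
To make that flow chart concrete, here’s a toy version in Python. It’s nothing like Google’s actual algorithm – it uses a single, crude signal (how many of the query’s words appear in each document) – but it has the same input, calculation, and output shape:

```python
# Toy illustration of the flow chart above: words in, ranked list out.
# Real search weighs hundreds of signals; this uses exactly one crude signal -
# how many of the query's words appear in each document.
def search(query, documents):
    query_words = set(query.lower().split())
    scored = []
    for doc in documents:
        score = len(query_words & set(doc.lower().split()))
        if score > 0:
            scored.append((score, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)  # best matches first
    return [doc for _, doc in scored]

docs = [
    "ranking retention and 301 redirects",
    "google analytics pricing",
    "how 301 redirects work",
]
print(search("301 redirects", docs))
# -> ['ranking retention and 301 redirects', 'how 301 redirects work']
```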

Input: More Than What You Type

In the early days of search, the input was what you typed into the search box and maybe a setting or two, but it has expanded to include many other clues about what may be relevant to you. Everything from search history and location to elements like spell check, autocomplete, and synonyms is factored in.

Single or Plural?

Up to this point we have been talking about the Google Search Algorithm as a single entity, but in reality there are over 200 different factors being weighed to determine the most relevant information for you. Each of these factors is rigorously evaluated and adjusted so each search returns the best results. New factors are always being tested and tweaked to make things a little better.

If you want to explore some of the elements that go into search, I’d recommend looking at Google’s How Search Works and Dr. Pete Meyers’s A [Poorly] Illustrated Guide to Google’s Algorithm.

End Result

This is where it all comes together: moments after you start typing, a list of results is displayed, sorted using all the information available to help you find exactly what you are looking for. The beauty of the Google Search Algorithm is that for all the complexity that goes into it, the experience is incredibly simple. For most users, the flow chart at the beginning of this post covers all they need to know about how Google works. But if your business depends on people finding your site online, having someone who cares about how all those algorithms work is incredibly valuable. Check out our online marketing offerings to learn how we can apply our nerdy love of all things web to your site.