PubCon 2014: SEO Part III


SEO Managing Partners Carol Morgan and Mitch Levinson attended PubCon last fall in Las Vegas. The premier search engine optimization (SEO) conference in the U.S. provided them with plenty of great information on how to conduct SEO amid all of the recent algorithm changes. The conference provided so much useful information that we had to spread it across three blog posts! From Panda and Penguin to Hummingbird and now Pigeon, SEO changes rapidly. Here are more tips for keeping up with all of the recent changes at Google.

This post will cover these basic SEO tactics:

  • H tags
  • Death of keywords
  • SEO tools
  • Basic SEO survival tips

H tags need to be used on websites and controlled in the cascading style sheet (CSS). H tags are a built-in feature of HTML that denotes text as a heading. They are assigned a value from H1 to H6, with the H1 tag representing the most important content. There should be only one title tag and one H1 tag on any given page, but H2-H6 tags can be used more than once on the same page. H tags are the structural tags for content and can help determine the strength and hierarchy of the content.

Here are tips for H tags (a quick audit sketch follows the list):

  • Use the H1 tag for the title or most important text on the page, and use it only once per page
  • Use CSS to control how H1 tags are displayed by the browser
  • Do not overuse H1 tags as an optimization technique
  • H1 tags may be used alongside H2 tags for subheadings
  • Using an Hx tag gives some extra weight to words within the tag
  • The most important keywords should appear towards the beginning of the tag
  • The H1 tag should not exactly match the HTML title
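
To make these tips concrete, here is a minimal audit sketch in Python. It assumes the requests and beautifulsoup4 packages are installed and uses a placeholder URL; it checks that a page has exactly one H1, that the H1 does not exactly match the HTML title, and prints the full heading outline.

```python
# Minimal H tag audit sketch (assumes the requests and beautifulsoup4
# packages are installed; the URL at the bottom is a placeholder).
import requests
from bs4 import BeautifulSoup

def audit_headings(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    h1_tags = soup.find_all("h1")

    # There should be exactly one H1 tag on any given page.
    if len(h1_tags) != 1:
        print(f"Warning: found {len(h1_tags)} H1 tags (expected 1)")

    # The H1 tag should not exactly match the HTML title.
    if h1_tags and h1_tags[0].get_text(strip=True) == title:
        print("Warning: the H1 exactly matches the <title> tag")

    # Print the full H1-H6 outline so the content hierarchy is easy to review.
    for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
        print(f"{tag.name.upper()}: {tag.get_text(strip=True)}")

audit_headings("https://www.example.com/")  # placeholder URL
```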

Did you know that 80% of keywords are now “not provided”? That statistic signals the death of the keyword. There is no analytics platform workaround for the lost keyword-level attribution data in Google Analytics, which Google turned off on September 23, 2013. This change effectively turns Google and the other search engines into simple referring sites. Searches are down, not only because of this, but because of apps and mobile too. Once someone has downloaded an app, it is just easier to go back to the app to get things done than to go to a search engine.

We suggest implementing these four steps to solve this keyword issue:

  1. Link Webmaster Tools to AdWords (search query data is saved)
  2. Wait for a year of data to become available
  3. Archive the search query data with the API or a third-party solution (see the sketch after this list)
  4. Learn from paid search where to grow organic
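
Step 3 can be scripted. As one possible approach (not the only one), the sketch below assumes you already have OAuth credentials (creds) for Google’s Search Console API, the current home of the old Webmaster Tools search query data, plus the google-api-python-client package, and it archives the top queries for a site to a CSV file.

```python
# Hedged sketch: archive search query data through the Search Console API.
# Assumes google-api-python-client is installed and `creds` already holds
# valid OAuth credentials with access to the site in Search Console.
import csv
from googleapiclient.discovery import build

def archive_queries(creds, site_url, start_date, end_date, out_path):
    service = build("searchconsole", "v1", credentials=creds)
    request = {
        "startDate": start_date,   # e.g. "2014-01-01"
        "endDate": end_date,       # e.g. "2014-12-31"
        "dimensions": ["query"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=request).execute()

    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["query", "clicks", "impressions", "ctr", "position"])
        for row in response.get("rows", []):
            writer.writerow([row["keys"][0], row["clicks"],
                             row["impressions"], row["ctr"], row["position"]])
```

Run something like this on a schedule and the archive quietly builds up the year of history that step 2 is waiting for.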

Let’s introduce some specific SEO tools that make SEO easier. The common thread among all of them is that they help diagnose search problems.

  • Google Webmaster Tools is always one of our favorite tools for SEO (after all, it is Google)
  • SpyFu – a keyword tool that helps with organic keyword research, paid keyword research and researching the competition.
  • gSiteCrawler – a tool that quickly creates a meta tag spreadsheet to be used as an SEO worksheet for detecting duplicate content, and that checks for 404 errors (a crawl sketch follows the tool lists below).
  • Firefox and its SEO extensions/plugins – these perform multiple tasks, including disabling JavaScript, looking at alt tags, linearizing the page, and using SearchStatus to show nofollow links and index status.

Other tools to consider are:

  • Bing’s Webmaster Center
  • Seobook.com
  • Bruce Clay’s SEO Toolset
  • Raven SEO Tools
  • Moz Open Site Explorer
  • Google Tag Assistant
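
If you want a do-it-yourself version of the worksheet gSiteCrawler builds, here is a minimal sketch. It assumes the requests and beautifulsoup4 packages and uses a placeholder list of URLs; it writes each page’s status code, title and meta description to a CSV, which makes 404 errors and duplicate titles or descriptions easy to spot.

```python
# Minimal meta tag worksheet sketch, in the spirit of gSiteCrawler (assumes
# the requests and beautifulsoup4 packages; the URL list is a placeholder).
import csv
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
]  # placeholder URLs

with open("seo_worksheet.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "status", "title", "meta_description"])
    for url in urls:
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        desc_tag = soup.find("meta", attrs={"name": "description"})
        description = desc_tag.get("content", "") if desc_tag else ""
        # 404s show up in the status column; duplicate titles and
        # descriptions are easy to spot once the sheet is sorted.
        writer.writerow([url, resp.status_code, title, description])
```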

Here are some basic survival tips that help with SEO techniques:

  • Recognize the impact of personalized search. It is absolutely essential, and it could be the reason rankings are not an effective measure for your specific KPIs.
  • Recognize the impact of social activity on rankings. Social signals correlate with rankings, but more time is needed before that correlation turns into a direct connection.
  • Look at the specific words that earn engagement and sharing
  • Local SEO is different from local search – web traffic versus foot traffic/phone traffic. There are tactical differences between the two:
    1. Local SEO is a content strategy aimed at increasing website traffic from visitors using geo-targeted search queries in search engines.
    2. Local search is a lead generation strategy aimed at increasing foot traffic and phone calls to a physical location.

Be aware that there are some common issues related to updating and claiming your online listing profiles, including those listed below. Contact us if you are having trouble and need help completing the verification process.

  • Duplicate listings with no option to remove them, plus incomplete and partial listings
  • Lack of phone support for non-ads
  • Existing accounts locked out
  • Verification with the owner via outgoing caller ID
  • Claiming existing listings via a call to the location
  • Paying to fix bad data (referring visits data)
  • Verification postcards that never arrive
  • Phone verification bugs

Claiming listings is much better than submitting a listing. Claiming is easy if all locations have been submitted previously, the information is accurate and up to date, and someone else has not claimed the listing first. Remember, pulling information is easier than pushing new information, and claiming improves the content that links to your site because you can fill in every field.

SEO content must match local search terms and appear on the website home page exactly where it needs to be.

Also, it is essential to continuously update listing sites. Infogroup, Acxiom, D&B, Localeze, Factual and nSphere are a few that can be used. A UBL (Universal Business Listing) service, which submits your business information to the top data centers, covers several of them and costs about $79.

This post, along with SEO Part I and SEO Part II, covers all you need to know about SEO basics and best practices. Remember, SEO is tricky, so if you are tired of trying to figure out SEO on your own, contact mRELEVANCE for your SEO needs.