iMarket Solutions Blog : Archive for the ‘SEO’ Category

Google Penguin 3.0 – The Algorithm Refresh That Should Have Been an Update

Wednesday, October 22nd, 2014

On October 4, 2013, Matt Cutts announced the release of Penguin 2.1 – an update to Google’s infamous algorithm that targets spammy onsite and offsite SEO strategies. Quite a few webmasters in the SEO community reported that the 2.1 update had a significantly negative impact on their websites. And due to the nature of this particular algorithm, a recovery is not possible until Google pushes through a manual update, or refresh, of the algorithm.

The Google Penguin thug


Many people perceive the Penguin algorithm as nothing more than a thug, here to force thousands of small businesses into paid advertising on Google by tanking their organic visibility. So I’m sure you can imagine the unrest within the community as the one-year anniversary of the last update passed. But on Friday (October 17th, 2014), webmasters finally got their wish – Google began rolling out Penguin 3.0. Whether or not it was what they had hoped for is yet to be determined.


The Difference between a Penguin Refresh and Update

As I mentioned above, a website that has been negatively impacted by the Penguin algorithm cannot recover, unless Google manually updates the algorithm, or refreshes the current version of that algorithm. So what is the difference between an algorithm update and an algorithm refresh? I’m glad you asked …

What is an Algorithm Refresh?

An algorithm refresh simply means Google has re-run a previously introduced algorithm without modifying, removing, or adding any signals. Think of it as running anti-virus software: all of the “viruses” (a.k.a. spammy tactics) it finds on its initial run are blocked from impacting your “computer” (a.k.a. website) again.

But just like with viruses, there are always new variations of spammy SEO strategies being utilized in an attempt to stay one step ahead of the “software” (i.e. the algorithm). Google can then run a refresh of their algorithm, in hopes of catching any new spammy tactics that have been used since the previous refresh or update. However, just like anti-virus software, these refreshes can become obsolete as SEOs find new ways to avoid the algorithm. That’s when an update comes in handy.

What is an Algorithm Update?

An algorithm update means Google has modified, removed, or added new signals to a pre-existing algorithm, in hopes that these signals will catch any new strategy variations that previous updates had missed. Again, using the previous analogy; it would be just like Norton providing updates to the anti-virus software on your computer, in hopes of catching any newly found viruses, or variations of old ones.


What Do We Know About Penguin 3.0 So Far?

It’s tough to say at this point. Commentary from Matt Cutts and other Google representatives led us all to believe that the next version of Penguin was to be a significant update, which implied new signals would be introduced. Barry Schwartz wrote an article at the beginning of October suggesting a Penguin 3.0 update might come as soon as within a week of his post. Barry made this prediction based on input from Gary Illyes, a Google Webmaster Trends Analyst who was apparently involved in working on the algorithm. However, it seems Barry may have been a bit too ambitious with his choice of words, as Gary even commented on Google+, “I love how you guys could twist ‘soon’ into this”. Some useful insights on the Penguin algorithm were extracted from Barry’s post, though.

Penguin Insights:

  • Gary confirmed that a disavow file (a .txt file you can submit within Google Webmaster Tools to indicate which backlinks pointing to your site you do not want any credit from) is taken into consideration when the Penguin algorithm is updated or refreshed.
  • Disavow files submitted less than two weeks before Gary’s presentation at SMX East would not be taken into consideration in this next iteration of the Penguin update/refresh.
  • Google is working on speeding up the rate at which future Penguin refreshes will happen.
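For reference, a disavow file is a plain .txt file following Google’s documented format: comment lines start with “#”, a “domain:” prefix disavows an entire domain, and a bare URL disavows a single page. A minimal sketch (the domains and URLs below are hypothetical examples, not real spam sources):

```text
# Spammy directory links we could not get removed manually.
# Lines starting with "#" are comments and are ignored by Google.
domain:spammy-directory.example.com

# Disavow a single page rather than a whole domain:
http://low-quality-blog.example.net/guest-post-links.html
```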

In a Google+ Hangout session on October 20th, John Mueller stated that as far as he knows, the Penguin update had rolled out completely – but when asked by Barry to clarify if it was indeed an update or just a refresh, he declined to comment. However, that same day, he followed up with Barry in a Google+ comment stating, “I might have spoken a bit early, hah – it looks like things may still be happening. I’ll double-check in the morning.”

It turns out that John did indeed jump the gun in stating Penguin had rolled out completely, as Pierre Far (who works at Google UK) stated the following:

“On Friday last week, we started rolling out a Penguin refresh affecting fewer than 1% of queries in US English search results. This refresh helps sites that have already cleaned up the webspam signals discovered in the previous Penguin iteration, and demotes sites with newly-discovered spam.

It’s a slow worldwide roll-out, so you may notice it settling down over the next few weeks.”

Notice the choice of words in his first sentence – Penguin refresh? Although it’s not an official confirmation, it definitely suggests that this iteration of Penguin is indeed just a refresh and not an actual update of the algorithm. He uses the word refresh again in the next line, and his description of what this refresh does coincides with what you would expect from a typical refresh, not an update. Lastly, he confirms that this refresh (which supposedly impacts less than 1% of queries) is still rolling out worldwide, and as such, fluctuations in rankings and traffic can be expected over the next few weeks.


What Should You Do?

First and foremost, it is of the utmost importance that you DO NOT panic. As we’ve seen with rolling updates (which so far have only been confirmed with the Panda update), they can take some time to impact all websites. And in some instances, we have seen rankings move up, down, and up again over the period of a roll-out (and significantly in some cases).

Don't panic; organize your SEO strategies!


So before you decide to throw in the towel, wait to see how your website’s traffic and impressions are impacted. If you see noticeable increases, then keep doing what you’re doing, as it’s obviously working at this point in time. And if your website was negatively impacted, then it’s important to understand why, so you can develop a strategy to fix the issues on your website or within your backlink profile.

Fortunately for us and our clients, we build the majority of our clients’ websites from scratch, so it would be rare for one of our sites to be targeted as a result of its onsite work. Much more common are domains that were involved in unscrupulous link building campaigns prior to hiring our services. Nevertheless, we’re adequately prepared to tackle either issue, should it arise, and want any webmasters reading this blog post to be prepared as well.

Below Are Some Steps You Can Take to Help You Recover from Penguin:

Step 1 – Review Your Analytics Data

It is important to know if your site has been impacted, and which pages in particular, before you can devise a strategy to keep Penguin from targeting you. What my team and I do is look only at Google organic analytics data, comparing landing page traffic for the 2 – 4 weeks following the day the algorithm rolled out against the same amount of time immediately prior. You have to compare apples to apples (i.e. Monday – Sunday vs. the previous Monday – Sunday) in order to get an accurate representation of what your traffic trend should look like.

Also, it’s important to look at absolute data vs. average data, as the number of visits lost is a much more telling sign than the average percentage (you could have a -100% decrease on a specific landing page, but that does not really tell you anything if the page was getting 4 visits in the two weeks prior and is now getting none).
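As a rough sketch of that apples-to-apples comparison, here is how the filtering logic could look in Python. The landing pages, visit counts, and the 25-visit baseline threshold are all hypothetical assumptions for illustration, not figures from an actual audit:

```python
# A minimal sketch of the before/after window comparison described above.
# Landing pages and visit counts are hypothetical sample data.

# Google organic visits per landing page for the two weeks before
# and the two weeks after the algorithm rolled out (same weekdays).
before = {"/denver-plumber": 310, "/blog/penguin": 4, "/contact": 95}
after = {"/denver-plumber": 140, "/blog/penguin": 0, "/contact": 92}

def compare_windows(before, after, min_visits=25):
    """Return pages whose absolute traffic loss is worth investigating.

    Pages below min_visits in the baseline window are ignored: a
    -100% swing on a 4-visit page tells you nothing, as noted above.
    """
    flagged = []
    for page, old in before.items():
        if old < min_visits:
            continue  # too little baseline traffic to be meaningful
        new = after.get(page, 0)
        change = new - old
        if change < 0:
            pct = round(100.0 * change / old, 1)
            flagged.append((page, change, pct))
    return flagged

for page, change, pct in compare_windows(before, after):
    print(f"{page}: {change} visits ({pct}%)")
```

Because the baseline filter runs first, low-traffic pages never reach the percentage calculation, which keeps the report focused on absolute losses.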

Step 2 – Understand Why Certain Landing Pages Were Targeted

Penguin can and does impact traffic to your entire site, but more often than not, specific landing pages are the cause of you being targeted in the first place (especially if you’re dealing with onsite spam vs. low quality backlinking). So what you need to do is evaluate the pages on your site you have deemed as being targeted and determine why Google thinks those pages are spammy. Are you stuffing important keywords or mentions of a geo-target into the content or headers? Are the meta tags extremely long and stuffed with near-identical variations of a keyword? For further guidance, you can read my previous post on Penguin 2.1, where I specify the instances of onsite spam I would consider “Penguin bait”.
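To illustrate the kind of onsite review described above, here is a minimal Python sketch that flags an overlong or keyword-repeating title tag and a suspicious keyword density. The thresholds are my own arbitrary assumptions for illustration; Google has never published such numbers:

```python
import re

def audit_page(title, content, keyword, max_title_len=70, max_density=3.0):
    """Flag obvious keyword stuffing in a page's title tag and body copy.

    The length and density thresholds are illustrative guesses only.
    """
    issues = []
    if len(title) > max_title_len:
        issues.append("title tag is unusually long")
    if title.lower().count(keyword.lower()) > 1:
        issues.append("keyword repeated in title tag")
    words = re.findall(r"[a-z']+", content.lower())
    hits = content.lower().count(keyword.lower())
    # Approximate the share of body words consumed by the keyword phrase.
    density = 100.0 * hits * len(keyword.split()) / max(len(words), 1)
    if density > max_density:
        issues.append(f"keyword density ~{density:.1f}% looks stuffed")
    return issues

print(audit_page(
    title="Denver Plumber | Plumber Denver CO | Best Denver Plumber",
    content="Our Denver plumber team offers Denver plumber services ...",
    keyword="denver plumber",
))
```

A clean page with the keyword used once or twice naturally would come back with an empty list; the stuffed example above trips both the title-repetition and density checks.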

And beyond onsite, you also have to take offsite work into consideration: backlinks. Using backlink analysis tools, we are able to evaluate the quality, quantity and methodologies our clients have used in the past to build backlinks for their websites. My blog post on “How to Recover from a Google Unnatural Linking Penalty” will walk you through the steps of not only identifying spammy backlinks, but disavowing them as well.

Step 3 – Take Action!

As pointed out above, understanding why you have been targeted by Penguin is the key to recovering. Once you have identified the culprit strategies, you need to work diligently on remedying them. If spammy onsite work is to blame, then you need to clean up the SEO strategies you have in place on your website. If spammy backlinks are to blame, then you need to identify which backlinks are harming your site and ask the webmasters of the linking sites to remove those links.

And always remember: what worked wonderfully in the past will not always work as well in the future. So appreciate the ride you enjoyed, implement a revised search engine optimization strategy as soon as possible, clean up what needs to be cleaned up onsite or offsite, and hope that a future refresh or update of the Penguin algorithm will work in your favor.

Continue Reading

The Life and Death of the Google Authorship Markup

Wednesday, October 8th, 2014

It was exciting while it lasted, but unfortunately, Google authorship is no longer supported by Google. But first, allow me to shed a little light on the rise and fall of Google’s authorship markup.


A screenshot from the Google Webmaster Tools Help forum, which states, “Authorship markup is no longer supported in web search.”

The Google authorship rich snippet was first introduced by Matt Cutts at the SMX Advanced conference, back in 2011. For those unfamiliar with this rich snippet: it allowed you to identify yourself as the author of the content within a blog post, which would then display a small thumbnail of your Google+ profile photo directly to the left of your blog post’s snippet within Google’s search results. In the beginning, the process for setting up authorship on your website and verifying it with Google was quite unclear to many, which proved to be quite the perk for those who successfully set up their authorship markup early on.
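For context, verification involved linking your content to your Google+ profile (and listing your site under “Contributor to” on that profile). The markup looked roughly like the sketch below; the profile ID and author name are made-up placeholders:

```html
<!-- In the page <head>; the numeric Google+ profile ID is a placeholder. -->
<link rel="author" href="https://plus.google.com/112233445566778899000"/>

<!-- Or as a visible byline link within the post itself: -->
<a rel="author" href="https://plus.google.com/112233445566778899000">by John Smith</a>
```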


The Benefits of Google Authorship

As mentioned previously, the early adopters of Google’s authorship rich snippet quickly realized that the most obvious benefit was the increased click-through rate (CTR) for their blog posts that ranked on the first page of Google. It’s no surprise, really; the photo thumbnail made a listing stand out prominently on search results pages, especially when it was the only one showing for a particular keyword.


An example screenshot of the authorship rich snippet for Nadia Romeo (the President of iMarket Solutions) within Google’s search results

Another large benefit of having authorship was the ease with which one could find all of the content an author had written and associated with their Google+ profile. All you had to do was click on the author’s name within the byline of their search snippet, and it would take you directly to a Google search page listing all of the content associated with their authorship. This was beneficial to brands and authors alike, as people who followed their work could more easily find other content the author had written which they may not have previously read.

The last benefit, AuthorRank, was largely considered a myth up until Matt Cutts confirmed its existence on Twitter after Amit Singhal publicly denied its existence at SMX West this past March.

[twitter_stream id=”443560265808756736″]

AuthorRank was the concept that Google assigned quality scores to authors who created a Google+ account and correctly associated their authorship with content they had written and published to websites. The assumption was that the more authoritative you appeared to Google on a particular topic or topics, the more likely their AuthorRank algorithm was to rank your related content higher in their search results. Many of us in the search engine optimization (SEO) community truly believed AuthorRank was going to be the key to becoming more visible in Google’s search results. But alas, it appears it was never used to the broad extent that so many SEOs had imagined.


R.I.P. Google Authorship

On June 25, 2014, it came as a shock to most SEOs when Google’s John Mueller announced on Google+ that they would be “simplifying the way authorship is shown in mobile and desktop searches”, so as to provide a better mobile experience. The simplification turned out to be a mass reduction of authorship photographs showing for particular searches on Google. However, the byline and date of publication still showed below the URL of a search result for those who were affected. The significance of the impact that change had on search results was documented by the MozCast feature graph and can be further reviewed here.

Then, just two months later, John Mueller reported that Google had made the decision to remove authorship from their search results entirely. To paraphrase: he stated Google found that the authorship markup was not very useful to visitors of their search engine and, in fact, was potentially distracting.

A Few Last Words for Google Authorship

For those of you who have the authorship markup installed on your websites: don’t worry about Google punishing you for leaving the code in place. Barry Schwartz, the owner of the popular SEO blog Search Engine Roundtable, asked John Mueller in a comment on his announcement of the death of authorship whether Google would penalize websites that left the authorship markup in place. John confirmed that Google would not punish you for leaving the markup on your website, and that, in fact, visitors to your website might still appreciate being able to find out more about you and/or your company through your Google+ profile.

Continue Reading

The Google Algorithm Frenzy:
Pigeon, HTTPS, Authorship and MUCH More

Thursday, August 14th, 2014

On July 3, 2014, Matt Cutts declared to the search community that he was going on leave for 4 months, all the way through October. Upon hearing the news, I had a big sigh of relief. For you see, I thought to myself, “There is absolutely no way Google is going to launch any algorithms or make any significant updates to their existing algorithms while the face of their search quality department is on a leave of absence.” I mean, who are SEOs going to yell at and blame for all of their woes while he is away, right? John Mueller, a Google Webmaster Trends Analyst who navigates Google’s help forums and hosts Google Hangouts, giving advice on how to build a website that’s both user and search engine friendly, would be my first guess – but is he adequately prepared to fend off the masses of SEO questions ahead?


A screenshot of Matt Cutts stating he is going on leave in 2014.


Google Algorithms Launched and Updated in 2014

Well, it turns out that Google had no plans to hold back during Matt’s absence. In fact, it seems they’ve decided to turn up the heat a bit. They’ve made quite a splash by announcing two new algorithms, and there have even been a couple of periods of unconfirmed algorithm changes hitting the streets, causing significant SERP (search engine results page) changes. But before we get into the more recent algorithms, let’s first have a quick look at the algorithms that Google has acknowledged launching or updating prior to Matt skipping town.

Page Layout Algorithm #3

Release Date: February 6, 2014

The Page Layout algorithm (originally referred to as the “Ads above the Fold” algorithm) was first launched on January 19, 2012, and then updated on October 9, 2012. Google believes having too much ad space / too many advertisements above the page fold of a website makes for a bad user experience, so the algorithm aims to hinder the rankings of websites that are overly ad-heavy. In trying to find what impact the Page Layout algorithm had on our 130+ HVAC, plumbing and electrical clients, I found that this algorithm negatively impacted thin content pages. I also suspected that it treated having too many images above the page fold the same as having too many advertisements. The moral of the story is to have as much unique and valuable content above the page fold as possible, rather than too many images or ads.

Unnamed Update

Release Date: March 24, 2014

Although it was never publicly confirmed by Google, there was a major shake-up in the SERPs around March 24 – 25. According to Moz’s documented Google Algorithm Change History, SEOs speculated this was the softer Panda update Matt Cutts confirmed at SMX West 2014 would be rolling out relatively soon.

Payday Loan 2.0

Release Date: May 16, 2014

The Payday Loan algorithm was first introduced by Matt Cutts via Twitter on June 11, 2013. In his tweet, he suggested webmasters watch a video he had posted previously, in which he discussed some upcoming changes to Google’s search. In the video, he hints that the algorithm is geared towards cleaning up search results that tend to be more aggressively spammed than others, such as payday loan, Viagra, gambling, pornographic and other similar keywords.

Panda 4.0

Release Date: May 19, 2014

In what’s now becoming more common, Google is releasing significant algorithm updates within short periods of one another – in what I believe is a tactic to make it more difficult for SEOs (such as myself) to reverse engineer the exact metrics their algorithms target. Apparently Matt Cutts stated the algorithm started rolling out on May 20, but the SERP data collected indicated it may have started a day earlier.

Payday Loan 3.0

Release Date: June 12, 2014

Not even a month after the 2nd confirmed update to the Payday Loan algorithm, we received word from Google that they had launched yet another update. According to Barry Schwartz’s Search Engine Roundtable article, Matt Cutts stated that the 2.0 update not only helped to better prevent negative SEO (the black hat art of pointing spammy, irrelevant and/or low quality backlinks at competitor sites), but also specifically targeted spammy websites, whereas this 3rd update targeted spammy search queries, which is consistent with what the initial algorithm did.

Authorship Photo Dropped

Release Date: June 28, 2014

This is more of a SERP display change than an algorithm update, but it is a significant enough change to have earned a spot on this list. John Mueller announced on June 25, 2014 that Google would no longer be showing authorship photos within their search results. Many webmasters, including myself, noticed in the days and weeks prior to the official announcement that there were some rare instances of authorship photos disappearing, but I don’t believe any of us expected Google to remove the authorship photos entirely. Nevertheless, Google has opted to leave the authorship name, which links directly to the Google+ profile of the respective author, as well as the date the blog article was published. You can see an example of this in the screenshot below showing my authorship info for my blog post on keyword cannibalization.


A screenshot of the authorship search result for my post on keyword cannibalization.


Above I discussed the new, updated and unconfirmed algorithm tweaks that occurred prior to Matt Cutts going on leave.

Below are the algorithms which launched after Matt Cutts went on leave.

Pigeon Algorithm

Release Date: July 24, 2014

This update was actually never officially named internally at Google. So in an attempt to make it easier for SEOs to reference it later, Search Engine Land (SEL) was, as usual, quick to dub this algorithm – they decided on “Pigeon.” Keeping with the theme of P-named algorithm updates (i.e. Panda, Penguin, Payday Loan, Page Layout), SEL felt this name was appropriate because this algorithm specifically targeted local search results, and “pigeons tend to fly back home.”

Now, getting to the good stuff – what we do know about this algorithm is that it is said to be the 2nd largest algorithm released since the Venice update. Barry Schwartz also claims Google told him that the new local-focused algorithm made local SERP ranking signals more similar to organic SERP ranking signals, and that it improved Google’s ability to interpret and factor in both distance and location for improved local ranking. To me, this suggests that websites will have a harder time ranking locally for cities in which they are not physically located.

MozCast’s Google SERP Feature Graph, a tool that shows changes in Google SERP results, indicated a local SERP drop from 19.3% to a low of 9.2% in the days following the algorithm’s launch, and a slight increase in knowledge graph results, from 26.7% to 28%.


A screenshot of the MozCast Google SERPs Feature Graph tool showing the impact of the Pigeon algorithm update.


The commonality spotted by many SEOs in the industry is that many keyword phrases which used to show a local map pack no longer do (Mike Blumenthal, a well-known local SEO expert, noticed that real estate keywords in particular seemed to be the most significantly impacted, which he indicates in his comment here). Moz’s graph shows that local search results seem to have settled back to normal around July 29, 2014 – Barry Schwartz reached out to Google to confirm whether we were already seeing a refresh of the Pigeon algorithm, but they would neither confirm nor deny.

HTTPS / SSL Algorithm

Release Date: August 6, 2014

During my trip to SMX West this year, Matt Cutts stated that he would love to see websites that utilize SSL certificates (note the HTTPS vs. HTTP at the start of a secured website’s address) receive a ranking boost for providing a secure experience to their visitors. Well, it seems he knew more than he was letting on – five months later, Google officially announced that they now provide a minor ranking boost to websites with SSL certificates installed. They also suggested that they might increase the weight of this ranking signal in the future.

If this post has been informative for you, feel free to share it with your friends.




The Google Algorithms Keep Coming, and Coming, and Coming ….

As you can see, we SEOs have our hands full. Only a handful of algorithms (out of the hundreds of updates Google makes per year) are significant enough for Google to publicly announce, but it truly is a never-ending battle. One day you might have 1st page rankings for “Denver Plumber”, the next week you could fall to the 5th page or even farther, and then find yourself ranking higher again in the weeks following.

My seven-plus years of SEO experience have taught me that Google’s SERPs are in a never-ending state of flux, but if you build and optimize your website to the highest standard, you tend not to have to worry about any negative impact from these frequent updates. And this holds true for the majority of clients here at iMarket Solutions. We only utilize white hat methodologies, and we stay up to date on Google’s quality guidelines and the algorithms they do publicly announce, so we know exactly what Google prefers in a website.

We and our methodologies aren’t perfect; we have had some clients who have seen negative results from algorithmic changes, but I’ve learned to accept that as collateral damage, if you will. It’s simply impossible to build a perfect website and marketing campaign, especially with so many great minds and websites competing with our own, and not expect some sort of backwards movement at one time or another. The important thing to do in those situations, which is pretty much a favorite pastime of ours, is to review ranking, Google Analytics and Google Webmaster Tools data to determine which pages or strategies an algorithm has targeted on a site, and to use that knowledge to better your methodologies as a whole. This is what iMarket Solutions does on a weekly basis for our clients, and a large reason why we are capable of building such successful SEO campaigns.


If you are in the HVAC, electrical, plumbing or home remodeling industry and want a website that dominates organic search results, feel free to give us a call – (800) 825-7935!

 And if you have any questions or comments regarding this blog post, I’d love to hear about them below.

Continue Reading

The Google Penguin 2.1 Algorithm Update Is Here, And It’s Scarier Than Ever!

Friday, October 4th, 2013

Scary Penguin 2.1 algorithm update


All right, so maybe Google’s Penguin 2.1 algorithm update isn’t nearly as scary as the picture above. But with Halloween right around the corner, would you expect anything less?

Go ahead, share it with your friends – you know you want to.



What Is the Penguin Algorithm, You Ask?

Although Penguin 2.1 doesn’t have glaring red eyes, snarling fangs, and a tattoo on its upper right shoulder, I know many webmasters would rather tango with the beast portrayed above than with Google’s infamous Penguin algorithm. Matt Cutts, the head of the Google Web Spam team (a.k.a. the Search Quality team), announced the launch of the Penguin 2.1 update via Twitter. And of course, it wasn’t long before the story was covered by Danny Sullivan of Search Engine Land.

Google first released the Penguin algorithm on April 24, 2012. At the time, many SEOs considered it to be the beginning of the end of search engine optimization. And for many website owners that indulged in paid, spammy or low quality link building, or black hat onsite tactics, it was just that. In fact, Matt Cutts reported that the first Penguin algorithm impacted 3.1% of English search queries. Fortunately, Penguin 2.1 only “affects less than 1% of searches to noticeable degree”, again, as reported by Matt Cutts.

Since the inception of Penguin in 2012, there have been four subsequent modifications to the algorithm, with Penguin 2.1 being the fifth and most recent release. Keep in mind, Google only bumps the version number by a full integer when they feel a significant enough amount of modifications have been made for them to consider it a full algorithm update. In instances where only minor modifications were made to the previous algorithm, as with Penguin 2.1, they tend to increment by decimal points, similar to how WordPress numbers their installation updates.

What Does Penguin 2.1 Look for?

It’s likely that Penguin 2.1 looks for everything that the preceding Penguin algorithms have looked for, but it’s hard to say at this point in time. To give you some history: when the first Penguin algorithm was released, no one knew what to expect. As the months passed, certain characteristics emerged that were synonymous with being negatively affected by a Penguin release. But what no one had anticipated was that the first Penguin algorithm likely only analyzed and took action on backlinks pointing to the homepage (or at most, only the top level pages of a site). This of course wasn’t discovered until Penguin 2.0 was announced, and Matt Cutts hinted at the fact that Penguin 2.0 went much deeper into a website than the original Penguin algorithm.

My personal understanding of the Penguin algorithm, based on my own research and personal experiences, is that it largely targets the following:

  • Low quality links
  • Spammy links
  • Paid links that are not marked as nofollow
  • Spammy or black hat SEO techniques
  • Article syndication links
  • Forum or blog posting abuse (such as planting links within forum signatures, or creating really low quality blog posts on 3rd party sites, like Blogger, for the sole sake of acquiring backlinks)
  • Large amounts of exact match keywords used within external links
  • Large amounts of backlinks coming from one website (usually referred to as site wide links)
  • Widget links
  • Aggressive link exchanges
  • In fact, pretty much anything listed on the Google Link Schemes page, which is a part of the quality guidelines set forth by Google
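Regarding the paid-links item in the list above: the accepted way to keep a paid or sponsored link from passing PageRank credit is to mark it with a rel="nofollow" attribute. A minimal sketch (the destination URL is a hypothetical placeholder):

```html
<!-- A paid or sponsored link, marked so it passes no PageRank credit.
     The destination URL below is hypothetical. -->
<a href="http://www.example.com/sponsored-offer" rel="nofollow">Our sponsor</a>
```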

Keep in mind the Penguin algorithm is just that – an algorithm. There is no human intervention. It follows a formula that was designed by the engineers working at Google, and adjusts how a website or a specific webpage ranks, based on its analysis of that entity. It is my own personal belief that the Penguin algorithm acts as a flagging system for the Google Search Quality team, alerting them to instances of unnatural linking. I believe this is how they were able to assign so many manual actions (a.k.a. unnatural linking penalties) to websites throughout 2012 – 2013. Granted, Google does claim that every manual action is reviewed manually by an actual human being, but too many websites were penalized for this process to not be somewhat automated.

As a side note: if your website is showing you have a manual action within Google Webmaster Tools and you want to better understand what you can do to remove it, read my blog post to learn how to recover from a Google unnatural linking penalty.

How Can I Keep Track of These Penguin Changes and Other Algorithm Updates?

Besides trawling through countless SEO blogs in hopes of keeping up-to-date with all the most recent Google algorithm changes and modifications, you have a few more options for staying in the loop.

  1. Subscribe to RSS feeds. SEO sites like Search Engine Land and Search Engine Roundtable are constantly reporting on all things SEO. In many cases, they’ll learn of an algorithm change or feature introduction far before others in the industry. Sites like these almost always have an RSS feed which you can sync to your RSS reader, delivering the stories directly to your computer, email, or mobile phone.
  2. Read our blog. We at iMarket Solutions love to read about SEO to help us stay on top of our game, and we equally enjoy writing about it. As such, we feel it’s in the best interest of our clients and our readers to keep them up-to-date on all of the most important search engine related updates. With that being said, feel free to keep an eye on our blog for all of the latest SEO news and tips.
  3. Bookmark the Google algorithm change history webpage. The company that maintains it is one of the most trusted sources in the search engine optimization industry. Originally, they started off as SEO consultants. But their love of SEO and helping people grew so much beyond what they had initially anticipated that they had a change of heart, choosing instead to help SEOs become better at what they do so they can better assist their own clients. One of the ways they help SEOs is by keeping track of all of the Google algorithm updates – and now you can too, by bookmarking this page.


In Conclusion

One thing is for certain – Google sure has been busy. With the introduction of the Google Hummingbird algorithm, their LARGEST algorithm release in over a decade, it is almost certain that many webmasters will see some sort of fluctuation in their rankings, traffic and leads. The important thing is not to panic. If you are an iMarket Solutions client, chances are slim you will be negatively affected by Penguin 3.0, as we only engage in white hat SEO (also referred to as "best practices" throughout the industry) and only pursue organic link opportunities. If by chance you do experience any dramatic shifts in rankings, traffic or leads, do be sure to give us a call. We can have an SEO specialist look at your website to determine what is affecting it and come up with an actionable plan to reverse the results.


The Google Hummingbird Algorithm: The Beginning of Contextual Search Results

Friday, September 27th, 2013

Today marks Google's 15th anniversary. And with birthdays come birthday presents. But oddly enough, Google has decided to give us all a gift instead, by announcing a brand-new algorithm: the Google Hummingbird algorithm. However, the Google Hummingbird algorithm is unlike previous algorithm updates such as Panda, Penguin, Freshness, and others. In fact, this isn't an update at all. The Google Hummingbird algorithm is actually a completely revamped version of Google's search algorithm, one that helps Google better understand the implied intent and context of a search query, allowing it to return search results more relevant to the searcher's intent.

The Google Hummingbird algorithm is here!


What Is the Google Hummingbird Algorithm?

So far, we know very little about the Hummingbird algorithm, beyond the fact that it was apparently implemented under the radar sometime in August, 2013. Danny Sullivan of Search Engine Land was the first to bring the algorithm to light, having learned about its introduction at a press event some Google executives held yesterday at the Silicon Valley home where Google was first launched 15 years ago. Having covered search engine news since 1996, Danny was able to wield his mighty reputation to gather a few more details on Hummingbird from two well-known top Google executives: Amit Singhal and Ben Gomes. To keep his explanation as simple as possible, Danny compared this algorithm change to an engine swap, rather than a modification like the infamous Panda or Penguin algorithms.

When asked to give an example of another algorithm change Google introduced that is comparable in size to this one, the Google executives struggled to reference another time the company introduced an algorithm of this size, impact, and nature. Amit stated that, to his knowledge, the last time a change this significant happened to their algorithm was back in 2001. And, of course, there was the Google Caffeine update, which was a significant change in their system, but that affected how they indexed pages, not how they presented them to the end-user.

How Will the Google Hummingbird Algorithm Affect My Website's Rankings?

It’s really difficult to determine the effect the Google Hummingbird algorithm is having on search results at this point. From what I’ve gathered, the purpose of the Hummingbird algorithm is to help Google present webpages which are more in tune with the searcher’s intent, versus just a page that has the specific words used within the search query throughout its content, headers, meta tags, and backlinks. What this means to me is that long tail search queries will start to play a more pivotal role in organic online marketing.

And as this algorithm’s goal is to better understand the intent of the searcher who is using phrases (or as Google refers to it, “conversational search”), unusual traffic spikes or dips are unlikely to happen, as only exact match keywords (like “plumbing Irvine”) would have such a drastic impact on inbound search traffic to a website. Nevertheless, I encourage you to check your Google Analytics and Google Webmaster Tools accounts to see what, if any, impact the Google Hummingbird algorithm has had on your organic search traffic.

Personally, I believe the most significant impact this algorithm will have on the search engine world is that it will force webmasters to stop focusing so much on the progress of exact match keywords (a.k.a. “money keywords”). At this point, I expect many of you reading this will be laughing hysterically, thinking, “Right buddy… Let me just go saddle up my unicorn so I can deliver that message to my clients.” Well, before you laugh your socks off, allow me to share a 2009 post from Rand Fishkin, where he references a study done by HitWise in the year prior that outlines exactly how important long tail keywords really are.

Understanding “Fat Head,” “Chunky Middle,” and “Long Tail” Keywords

First: Yes, this study is more than five years old. But considering that Google has only progressively gotten better at understanding a user’s intent, I’m willing to wager that the percentage of long tail keywords contributing to one’s organic search traffic volume has only increased since then. The post that Rand Fishkin initially referenced is apparently no longer available on the Experian website; however, we still have Rand Fishkin’s post on “Illustrating the Long Tail” to reference.

In his article, Rand Fishkin breaks the types of keywords referenced in the HitWise study into three different groups: fat head, chunky middle, and long tail. I have placed his graph of “The Search Demand Curve” below to better illustrate this data for you.

A graph of the search demand curve showing the importance of long tail keywords in Google organic search results.

Source: Rand Fishkin

As you can see above, the fat head accounts for the top 10,000 keywords from the study, making up just 18.5% of the total search traffic. The chunky middle consists of roughly 11.5% of monthly searches, and long tail keywords account for the other 70%. Keep in mind that these numbers are based on just 3 months’ worth of sample keyword data from search engines, with adult search terms filtered out, so the actual long tail could potentially be much, much longer.


Keep your friends and clients in the loop regarding the Google Hummingbird algorithm:



So How Can I Take Advantage of Google Hummingbird?

The most important thing you can do is to understand the intent of visitors on your website. In fact, I’m asking you to think like Google does. Tom Anthony from Distilled does a great job of breaking down the differences between implicit and explicit searches in his Moz blog, “From Keywords to Context: the New Query Model,” which should help you better understand how Google interprets search results. But allow me to summarize his points:

Tom talks about how Google may deliver varying results to you based on a number of different conditions. For example: if you type in “fast food” on your desktop computer at home, Google may just show you results for fast food restaurants within the immediate area. But if you do the same search while on a bike or driving a car, Google would interpret that you can cover a much larger distance, and therefore will display results for fast food restaurants outside of your immediate area; he references this as an implicit search. And with the public announcement of the Google Hummingbird algorithm, it is clear his train of thought was pretty spot-on.

If you are an iMarket Solutions client, you are most likely a contractor in the HVAC, plumbing, electrical, or general contracting industry, so your website’s visitors will almost always be using implicit queries (as you provide a service to fill their needs). Fortunately, we already have strategies in place for our clients that tend to the specific needs of their customers for each of the different services they provide, and for all the different regions they service. As a client, if you want to know what else you can do to help us improve your organic search engine visibility, I strongly recommend you read my previous post on “How Your SEO Company Is Like Jerry Maguire.” I give three SEO tips on exactly what you can do to help complement our current SEO strategies.

If you’d like to learn even more about the Google Hummingbird algorithm, read what Danny Sullivan reported on here.


How to Recover from a Google Unnatural Linking Penalty (a.k.a. Manual Action)

Friday, August 23rd, 2013

Unnatural linking is a no-no. Google has made this very clear for quite some time. And yet, even to this day, there are many professional search engine optimizers (SEOs) who simply refuse to believe it. Although many SEOs acknowledge the perils of unnatural linking, they nevertheless continue their black hat methods (I like to refer to them as “the special few”). And why do they still condone and practice unnatural linking? Because, to an extent, it still works well for increasing keyword rankings. In this blog post, I talk about why unnatural linking can be more harmful than helpful, give you some information about the various types of unnatural linking I have found to be most common in websites I have helped recover, and explain how to recover from an unnatural linking penalty (a.k.a. a manual action).

Not All Backlinks Are Created Equal

Although inbound links still play an influential role in Google’s ranking algorithm, I would definitely consider backlink building to be a double-edged sword. On the one hand, building backlinks can be a great way of increasing keyword rankings for the keywords targeted within the anchor text of those links. But on the other hand, obtaining those links unnaturally poisons your site. In the more severe cases I have seen, Google has taken manual action on entire websites instead of just specific pages, removing all pages of the affected site from their index completely. Needless to say, not all backlinks were created equal.

Not all links were created equal

For those unfamiliar with the term, “manual action” is the terminology Google uses in-house when referring to their unnatural linking penalties. A Google unnatural linking penalty is the result of a manual action, which means that someone from the Search Quality team personally reviewed the website in question and found it to be in violation of their quality guidelines. However, even with this being a manual process, Google has still been able to target tens of thousands of webmasters (this was confirmed by Matt Cutts on the official Google Webmaster Central blog when they first notified the public about the new unnatural link notifications and provided vague examples of unnatural links). I think Google has been able to do this at scale with the help of one of their more formidable algorithms to date: Penguin.

The Penguin Algorithm May Help Google Dish Out Unnatural Linking Penalties

Penguin is an algorithm, which means that it does its work according to set formulas and adjusts your website’s rankings accordingly. From what we know, the Penguin algorithm was designed by the Search Quality team at Google and released for two very apparent reasons: to identify unnatural links and to nullify them.

The inception of the Penguin algorithm on April 24, 2012 has since forced SEOs to find alternative, more natural ways of link building to keep their clients’ websites ranking above their competition. But I think there is more to this algorithm than I first realized. I think that Google is using this algorithm to collect large amounts of data on unnatural backlinks, which they then use to identify and take manual action on sites which have practiced the most egregious of unnatural link building.

Who Needs to Worry About Google Linking Penalties?

There are still many SEO companies and SEO consultants who have ignored Google’s advice. Unfortunately, this means that the clients who put their faith in these companies are putting their websites, and consequently the livelihood of their online businesses, at risk. I have even seen unnatural linking penalties innocently triggered by website owners who were simply uneducated on Google’s strict quality guidelines. This is all the more reason why every SEO and webmaster alike should read the Google quality guidelines. Not doing so can result in a significant loss of rankings, which can lead to a reduction of both traffic and leads.

Read Google's quality guidelines to learn why you may have received an unnatural linking penalty


Here is Your Unnatural Linking Penalty Recovery Plan

This isn’t rehab, but admitting that you have a problem is definitely the first step on the road to unnatural linking recovery. Because Google now confirms whether or not a website actually has an unnatural linking penalty within Google Webmaster Tools, finding out whether or not your site has been penalized could not be easier. Assuming you have already set up and verified Google Webmaster Tools, and have confirmed that your website has an unnatural linking penalty, the next step is identifying the toxic links pointing to your website.

How to Identify Unnatural Links

The process for identifying unnatural links is a long and tedious one. There are many different reasons why Google may take manual action on your website. The following steps will outline some of the ways I have been able to find unnatural links which are capable of triggering manual actions:

  1. Verify ownership of your website within Majestic SEO. Once you do this, you can gain access to your website’s link data (as seen by Majestic SEO) for free.
  2. Once you are logged into Majestic SEO, create a historical report of all links pointing to the root domain of your website (the root domain is your website’s domain name, without the http:// or www preceding it) and identify any links deemed as unnatural. Some things that would classify a link as unnatural:
    1. A poor quality or spammy website. Although a tedious process, manually reviewing each backlink pointing to your website is the best and most failsafe way for identifying unnatural, spammy or low quality backlinks.
    2. A completely irrelevant website. The websites pointing to the target site should relate to that site. And just because the anchor text used to link back to the website is relevant does not make the entire site relevant. If the content of the page linking to you does not relate, the link should be classified as unnatural.
    3. Large amounts of backlinks coming from one referring domain (also known as sitewide links). The more backlinks you receive from a single website, the less valuable they become. In my experience, large numbers of backlinks from one domain are a clear indicator of unnatural linking, as they usually stem from widget or footer links which replicate throughout a large portion or all of a website’s pages.
    4. Large amounts of links using exact match anchor text (e.g. “Orlando Plumbing”, “Visit this website”, etc.). I have found exact match anchor text to be probably the most common culprit for manual actions being assigned to a website. When natural link building occurs, it is rare that you will see many instances of the same anchor text being used twice. So you can imagine why Penguin would raise a red flag on your website when it notices half of your backlinks are using one anchor text variation. This is why it’s important you learn how to do keyword research properly before even thinking about starting a link building campaign. And of course, make sure you steer clear of keyword cannibalization.
    5. Paid links (i.e. where a webmaster pays a 3rd party to place a followed link on their own site pointing back to the webmaster’s site). Paid backlinks are also one of the more common reasons why webmasters receive unnatural linking penalties. It is a clear violation of Google’s quality guidelines to purchase links which flow PageRank. If you do pay for any directory submissions or to have a backlink point to your website, make sure the webmaster adds a rel="nofollow" attribute to that link. This will prevent the flow of PageRank and will keep you manual action-free (assuming you aren’t partaking in any other forms of unnatural link building).
  3. In my experience, the more backlink data you have to evaluate, the better your chances of succeeding. Some other ways for you to find backlinks to analyze are:
    1. Download the provided sample links within the “Links to Your Site” section of Google Webmaster Tools. Matt Cutts states this is all you need to focus on in order to recover from an unnatural linking manual action, though some in the SEO industry are questioning Matt Cutts’ statement.
    2. Subscribe to a dedicated backlink analysis tool. These paid services let you analyze your backlink data down to a very granular level, and some offer a Link Detox report which algorithmically calculates the most likely unnatural backlinks within your backlink profile.
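To make the sitewide and exact match anchor text checks above more concrete, here is a minimal Python sketch that flags suspicious patterns in a backlink export. The sample data, column layout, and thresholds are all hypothetical – real backlink profiles are far larger, and the cutoffs you choose should depend on your own niche and profile size.

```python
from collections import Counter

# Hypothetical backlink sample: (referring domain, anchor text) pairs,
# as you might export them from a backlink analysis tool.
backlinks = [
    ("blog-a.example", "orlando plumbing"),
    ("blog-b.example", "orlando plumbing"),
    ("blog-c.example", "orlando plumbing"),
    ("news.example", "Acme Plumbing Co."),
    ("widgets.example", "orlando plumbing"),
    ("widgets.example", "orlando plumbing"),
    ("widgets.example", "orlando plumbing"),
    ("forum.example", "https://acmeplumbing.example"),
]

ANCHOR_SHARE_LIMIT = 0.3   # flag anchors used in more than 30% of all links
SITEWIDE_LINK_LIMIT = 2    # flag domains linking more than twice

total = len(backlinks)
anchor_counts = Counter(anchor for _, anchor in backlinks)
domain_counts = Counter(domain for domain, _ in backlinks)

# Anchors that dominate the profile suggest exact match over-optimization.
overused_anchors = {a for a, n in anchor_counts.items() if n / total > ANCHOR_SHARE_LIMIT}
# Domains with many links suggest sitewide widget or footer links.
sitewide_domains = {d for d, n in domain_counts.items() if n > SITEWIDE_LINK_LIMIT}

print("Over-optimized anchors:", sorted(overused_anchors))
print("Possible sitewide links:", sorted(sitewide_domains))
```

A script like this only shortlists candidates for manual review – it cannot judge the quality or relevance of a linking page, so every flagged link still needs human eyes before you request removal or disavow it.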

Steps to Recover from a Google Unnatural Linking Penalty

Once you have compiled a comprehensive list of unnatural links, you will need to start contacting the webmasters behind those links, politely asking them to either nofollow or remove the backlinks on their sites pointing to yours. Keep in mind that many webmasters might get offended when you contact them, implying their backlinks may be hurting your rankings, so word your e-mail as politely as possible. And in some instances, the number of backlinks is so large that it may be in your best interest to request that they remove all backlinks pointing from their site to your own, rather than just the handful of backlinks you identified. Below is an outline of the exact steps you need to take in order to request removal of an unnatural linking manual action placed upon your website:

  1. Reach out to the webmasters of the sites you identified as having unnatural links. In fact, you should contact them multiple times via either contact forms or e-mail addresses found on their sites, asking that they remove all backlinks pointing to the target site. Link removal services and outreach tools can make this tedious task much more manageable, as they automate a lot of the initial contact and follow-up process.
  2. Document all link removal efforts within a Google Docs spreadsheet. Details you should track are the target URL, the dates you attempted to contact the webmaster, and the status of the link at the end of all your efforts. If any of the webmasters request payment for link removal, disregard them and add a note to your document for the respective website. Remember to make this document viewable by anyone with the link, so you can share it within the reconsideration request you submit to Google’s Search Quality team.
  3. Submit a disavow file. Once you have reached out at least three different times to the target webmasters, I would then recommend you create and submit a disavow file, asking Google to disregard all of the links you’ve deemed unnatural. Make sure you read Google’s instructions on how to create and submit a disavow file, so you know that you’re completing this process correctly.
  4. Submit a reconsideration request. After you have successfully submitted your disavow file, I would recommend waiting at least two weeks before submitting a reconsideration request to Google. This should give Google ample time to review the file and disavow the targeted links accordingly. However, please note that it is extremely rare for Google to revoke a manual action after the first submitted reconsideration request. The worst-case scenario could very well be that you have to wait for the manual action to time out on your website, but that’s not to say Google’s Search Quality team won’t simply apply another of greater magnitude later down the road (which they have been known to do, as verified by Barry Schwartz). Some webmasters have even disavowed all backlinks pointing to their site, but this blanket approach should not be taken lightly.
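For reference, a disavow file is a plain UTF-8 text file: lines starting with # are comments, domain: entries disavow every link from that domain, and bare URLs disavow individual pages. A minimal example (all domains shown are placeholders):

```text
# Contacted webmaster three times (2013-06-01, 2013-06-15, 2013-07-01); no response.
domain:spammy-directory.example

# Sitewide footer links we could not get removed.
domain:widget-links.example

# A single page hosting a paid link.
http://blog.example/sponsored-post.html
```

The comments double as a record of your removal efforts, which complements the documentation you share in your reconsideration request.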

My hopes are that this blog post will help you with your unnatural linking road to recovery, but the hard truth is that you may not be able to do this yourself. Identifying unnatural links can be difficult enough, and making yourself available to reach out to webmasters can be a full-time gig on its own. You don’t have to walk this path alone, however. There are plenty of SEO companies out there who have extensive experience in manual action recovery. If you need further guidance or would like to speak with iMarket Solutions about helping you recover from an unnatural linking penalty, feel free to reach out to us. We’d be delighted to steer you in the right direction to a brighter, unnatural link-free future. And as always, if you found this article to be of use to you, please consider sharing it with your social circles.


How Your SEO Company is Like Jerry Maguire

Friday, August 9th, 2013

As an SEO (search engine optimizer) with years of experience at multiple successful SEO companies, I have had the pleasure of optimizing hundreds of websites. Local, national, international; service-focused, informational and e-commerce – I have worked on them all. And in that time, my points of contact at these companies have ranged from interns all the way up to CEOs. Despite the vast differences in positions held, along with what they offer and who they offer it to, there is one common denominator among the majority of these companies – they all say to their SEO company, in so many words, “SHOW ME THE MONEY!”

Rod Tidwell saying, "Show me the money."

The phrase, you may recall, comes from Rod Tidwell: the NFL receiver played by Cuba Gooding, Jr. in the movie Jerry Maguire. People in business expect results from their partners, just like Rod did. In this case, you have most likely invested in your SEO company’s experience, knowledge and capabilities as a company that specializes in search engine optimization because you want to improve your online marketing results. But many business owners tend to forget that their success online is largely based on their cooperation, involvement and dedication to their online marketing campaign. This blog post serves two purposes: to offer SEO tips for local contractors in the HVAC, plumbing and electrical industries (though these SEO tips can apply to most industries), and to educate business owners on why they should not be uninvolved like Rod.


Your SEO Company LOVES You!

All right, so maybe “love” is a bit of a strong word, but great SEOs truly are invested in helping their clients achieve better online visibility, and ultimately, leads. Why, you might ask? Because we understand that your success is our success. We know from experience that providing great customer service and delivering exceptional results for our clients’ online marketing campaigns is the key to continuing viable business relationships, and consequently, the best way to secure our place as one of the best SEO companies around. But we cannot do it all without you.

I referenced Jerry Maguire’s “help me, help you” scene in my headline because it perfectly describes our intent as an SEO. Our only purpose is to help your online marketing campaign succeed. But as hard as we work, there are simply some things we cannot do successfully without the help of our clients.

Here are a few helpful tips from an SEO company that specializes in helping electrical, plumbing and HVAC contractors with their website development and local online marketing needs.

And while you’re at it, feel free to share this blog post with any friends or business partners you feel may benefit from it, especially if they are in the plumbing or heating and air conditioning industries.



3 SEO Tips to Improve Your Online Marketing Strategy

SEO Tip #1: I Don’t Know What I Don’t Know

This tip isn’t as applicable for our company because we are experts in HVAC and plumbing marketing. All we do is build, write about and optimize HVAC and plumbing websites – and we’ll soon be rolling out social media services to all of our clients. It’s hard not to become an expert when dealing with the same industry all day, every day. Needless to say, one couldn’t expect an SEO company that offers services to hundreds of clients within a vast array of industries to have expert knowledge on all of them. So don’t assume most SEOs know what they are doing; many of them just follow a formula they devised years prior and scaled to all of their clients’ websites. Instead of accepting generic boilerplate content, get involved and offer ideas. Whether it’s suggestions for performing keyword research or an idea for a new infographic – we love to hear your thoughts! After all, your knowledge and expertise in your field will most likely always surpass that of your SEOs.

SEO Tip #2: Don’t Be Stingy!

We get it. You’re busy and can never find a time to meet with your SEO. Well guess what: if your SEO is good at their job, you shouldn’t have much time to chat with them. You’ll be too busy with all your new customers! But that doesn’t mean you still can’t help them help you! If you can’t set aside the time to get on the phone, you can still send an e-mail.

A faux portrayal of an SEO company representative screaming at a client

Some of the ways we encourage our clients to help benefit our online marketing efforts include:

  1. Send us videos and photos. Everyone loves watching videos. Seeing something on video paints a far more engaging picture than a string of words can ever accomplish. Still pictures can be just as beneficial. We encourage our contractors to send photographs and videos of common issues experienced among their clients, so potential clientele and site visitors know what to keep an eye out for. Not only can these videos and photos potentially bring in quite a bit of traffic to the site, but they help build your brand as an authority in your area and service, and make you look that much more professional.
  2. Send us past interviews or publications. If you have ever done any interviews or published any articles, be sure to make your SEO company aware of that. Not only do those references help build your credibility when posted on your site, but they ensure your potential customers see them. (Such articles rarely get the benefit of links to your website when posted on other sites.) Knowing about previously published material helps us find any websites that may be hosting that content, and a friendly e-mail to the webmaster could result in a valuable backlink to your site.
  3. Send us content ideas. Again, this isn’t as applicable for our company because we are very familiar with the heating and air conditioning industry, but every SEO company can learn new things from their clients. As Tip #1 makes clear, “I don’t know what I don’t know.” So don’t be scared or modest – speak up and share your ideas! Collaborations between a client and an SEO can lead to a more successful Internet marketing strategy.

Even if it takes you a month to put together just a few things, that can still make a difference. And who knows; your help may be the deciding factor that makes your website outrank your competition.

SEO Tip #3: Introduce Us to Your Friends

Surprisingly enough, we’ve met many business owners who are simply not comfortable with introducing their friends to their SEO company. It doesn’t have to be this way! Can’t we all just be friends? Seriously though, your affiliates can benefit you in more ways than one. We encourage our clients to put us in touch with their affiliates. With a proper introduction and a little bit of charm, gaining backlinks from these affiliates to your website can be a breeze. And the best part about it?

  1. With any luck, your affiliate partner has a reputable business, which often means they also have a website with high PageRank (PR). And higher-caliber websites are less likely to link out freely, giving you a leg up on your competition.
  2. Affiliates tend to be in the same industry or a vertical industry. For example, HVAC contractors travel a lot and carry large amounts of weight in their vehicles. And the larger the HVAC company is, the more vehicles they have. In order to keep their technicians on the road, some will hire companies that specialize in servicing HVAC vehicles. More often than not, such a company would be more than happy to place a testimonial from you on their website, which could (at our request) be linked back to your own site. As testimonial pages tend to appear in the main navigation on most websites, this could result in a backlink from a more authoritative page, which only builds your authority and relevancy in this industry that much more.


Do You Feel More Obligated to Help Your SEO Company Now?

Hopefully these SEO tips have given you a different perspective of your SEO, or at least their intent. Remember: the next time you question the success of your SEO campaign, ask yourself if you’re doing everything within your power to help your SEO company succeed. After all, you can’t expect us to “show you the money” if you’re not willing to “help us help you.”


The Hows and Whys of Performing Industry-Specific Keyword Research to Maximize Organic Search Leads

Tuesday, July 23rd, 2013

Practically all of us in the Internet marketing industry have heard the phrase “content is king,” but just because you have written relevant content for your website does not make it worthy of the 1st position in Google search results (a.k.a. the throne). Sure, your website’s content may rank decently for some keywords and phrases. But without performing industry-specific keyword research, your website’s traffic (and the flow of online organic search leads) will suffer in the long run. And with so many factors contributing to a website’s rank at the top of search engines in today’s competitive world, one must capitalize on every possible strategy to gain a competitive advantage. Again, none of this can be achieved without performing thorough and industry-specific keyword research.

We at iMarket Solutions regularly create and manage relevant content on behalf of our clients. But we understand that some of our clients want to proactively perform keyword research to identify any low-hanging fruit for their online marketing campaigns. To help guide them down the right path, I have put together this blog post that will briefly detail why you should perform keyword research specific to your industry, and how you can go about doing so.

Perform Keyword Research Like You Dress for Work: Appropriately

Would you consider flying to Alaska in October without a jacket? The most likely answer: no. The same concept applies to online marketing. What is appropriate for one region or industry may not necessarily apply to all. One must also take into account seasonal changes. For example, “air conditioning” keywords are not as popular in Seattle, WA as “heating” keywords, especially in the winter months. This is why you need to put in the time and effort to ensure you are targeting as many of the most relevant and widely searched combinations of keyword phrases used to find businesses of your nature.

This is how you maximize your intake of organic search engine leads.

THIS is how you perform keyword research!


Fortunately, your hard work will not go unnoticed, as this keyword research will lay the foundation for your content creation – both on the main pages of your website and on your blog (if you have one). For the sake of this blog post, though, I will focus primarily on how to perform keyword research to rank better in Google search results.

What Are the Best Ways to Perform Keyword Research?

You can utilize many different techniques to help ascertain the best possible keywords to use for your Internet marketing campaign, but I’ll share with you some of the techniques we use here at iMarket Solutions.

  1. One of the best ways to perform keyword research is to study the top-ranking websites in your industry. After all, they are there for a reason. Chances are your competition has put in the time and effort to determine the most viable keywords to pursue. If you notice particular competitive websites ranking for numerous keywords you would also like to target, we suggest you reverse engineer them using a tool like SEMrush or SpyFu. At no cost to you (besides the time spent researching), you can get a better understanding of which keywords your competitors are ranking for, and an estimate of how much search volume they are receiving as a result of their efforts.

  2. If you have not already done so, set up Google Webmaster Tools. Once you have gathered enough data (we recommend waiting at least one month before viewing the report), go to the Search Queries report within the Search Traffic tab to view a list of keywords for which your website is ranking. To better determine the keywords that will drive the most traffic to your site, take your filtering a step further by sorting impressions from greatest to least – I refer to this as the “keyword opportunities” report. Look for keywords with large numbers of impressions that are not on the first page, or better yet, not in the top 3 positions. These keywords stand a much better chance of reaching the first page than others, and we have found they can end up being some of the more widely searched keywords in your targeted service area.

    Perform keyword research using the Google Webmaster Tools search queries report



  3. Even though this SEO tool is being phased out, the Google AdWords Keyword Tool has always been a favorite among SEO consultants and agencies alike. And as Google has still not pulled it offline, I figured it was worthy of at least a farewell posting. It has long been the go-to keyword research tool, as it provides data directly “from the horse’s mouth.” Once you have completed your competitive keyword research, take your list of keywords and input them into the “Word or phrase” box. For a more targeted search, you can specify the website you are performing keyword research for, and even choose a relevant category. Additional filtering options let you select match types (broad, exact, and phrase), choose the location and language of interest, view desktop or mobile search data separately, and view search volume on either a global or local level.

     Once the Google AdWords Keyword Tool is phased out, it will be replaced with the Google AdWords Keyword Planner. This modified version of the tool has a much more user-friendly interface. The most useful new features are the ability to combine multiple keyword lists and the ability to view and select cities from an interactive online map, making it that much easier to ascertain search volume for targeted keywords in very specific cities or regions.
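If you export the Search Queries report as a CSV, the “keyword opportunities” filtering described in step 2 can be sketched in a few lines. This is a hypothetical example – the column names (`Query`, `Impressions`, `Avg. position`) and thresholds are assumptions about your export, not a documented format:

```python
import csv

def keyword_opportunities(csv_path, min_impressions=100, min_position=3.0):
    """Return queries with high impressions that rank below the top 3 --
    the candidates most likely to benefit from targeted content."""
    opportunities = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Impressions may be formatted with thousands separators.
            impressions = int(row["Impressions"].replace(",", ""))
            position = float(row["Avg. position"])
            if impressions >= min_impressions and position > min_position:
                opportunities.append((row["Query"], impressions, position))
    # Sort by impressions, greatest to least, as described above.
    opportunities.sort(key=lambda r: r[1], reverse=True)
    return opportunities
```

Running this against your export gives you a ranked shortlist of queries already earning visibility but not yet clicks – exactly the keywords worth building content around.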

Help us help you with your online marketing strategy – call iMarket Solutions today!

Now that you are armed with some basic strategies and tools that can help you further expand your company’s reach within your target demographic, get out there and start researching new potential keywords to use for your site. Once you have a thorough list of keywords to consider, the next thing you need to do is create some awesome content to publish to your website. Not sure where to start? Then learn how to write your own website content first. If at any time you find yourself overwhelmed by any part of the process, or would feel more comfortable passing off your duties to a professional online marketing company that specializes in helping HVAC, electrical and plumbing contractors, then consider calling iMarket Solutions.

Internet marketing for electrical, plumbing, heating and air conditioning contractors is what we do.

Our SEO experts will build and market a website that can dominate the plumbing, heating and air conditioning, and electrical SERPs in your region – and all with a goal CPL (cost per lead) that will make your competitors weep. Contact us today for pricing.


The History and Benefits of Google Authorship

Wednesday, July 10th, 2013

June 7, 2011 marks a significant date in the history of Google – the introduction of Google Authorship. Othar Hansson, a software engineer at Google, announced the introduction of authorship markup in Google’s web search via the Google Webmaster Central blog. Experts on search engine optimization and Google’s algorithms could immediately foresee the importance of this feature. But interestingly enough, Google’s authorship markup flew under the radar for quite some time. To help shed light on the value of Google authorship, allow us to elaborate on its history and its perceived benefits.

What is Google Authorship?

In the most straightforward terms, Google authorship is markup (code) that can be implemented on websites with blogs, allowing authors who post on those blogs to take credit for their content. When properly set up, the author’s Google Plus profile picture appears next to his or her content in search results (as seen in the screenshot below), along with a home page address, if pertinent. Though initially welcomed with open arms by well-known writers in the SEO industry (such as Barry Schwartz), as well as other large brands (such as the Washington Post, Entertainment Weekly and CNET), authorship has only recently become more common in Google’s search results.

Google authorship snippet for Nadia Romeo

So the question remains: why is Google authorship only now becoming more common?

We believe there are multiple reasons:

  1. At the time of its introduction, quality content creation did not stand at the forefront of most SEO company proposals (if they even listed it at all). “Fly by the seat of their pants” SEO companies were neither prepared for this shift nor convinced that delivering quality content would be essential to gaining better rankings, especially when they could still reap rewards by focusing on large amounts of outsourced or scripted, low-quality link building. And until Penguin reared its “beak” in April of 2012, the majority of Google’s search results reflected these spammy techniques. Fortunately, with the release of Penguin 2.0, we’re seeing fewer low-quality sites and more original, quality content creators appearing higher in Google searches.

  2. “We wanted to make sure the markup was as easy to implement as possible,” Othar stated in the original announcement from Google. Ironically enough, that wasn’t the case. The comments on the authorship markup announcement post reflect the frustrations of countless webmasters perplexed at how to implement the new markup on their websites. Since then, Google has built several resources to help users set up authorship. You can easily set up authorship here, and the structured data testing tool can help you verify the accuracy of your authorship markup code.

  3. With Panda, Penguin and other Google algorithm updates rolling out on a regular basis, everyone may just be too busy trying to recover and adapt to the new algorithms. After all, who has time to figure out the set-up process for some new, complex Google feature that may or may not positively impact their site’s search visibility?
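For reference, the markup itself is small. Below is a minimal sketch of the two pieces Google’s announcement describes: a rel="author" link from your content to your Google+ profile, plus a matching “Contributor to” link back from the profile. The profile ID and author name here are hypothetical placeholders, not real accounts:

```html
<!-- On each article page: a byline link pointing to the author's
     Google+ profile, marked with rel="author". The profile ID is a
     placeholder for illustration only. -->
<a href="https://plus.google.com/112233445566778899000?rel=author">by Jane Author</a>

<!-- Alternatively, site-wide in the <head> when the whole site has
     a single author: -->
<link rel="author" href="https://plus.google.com/112233445566778899000"/>
```

To complete the loop, the Google+ profile must list your site in its “Contributor to” section; the structured data testing tool mentioned above will confirm whether Google sees the connection.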

Now that the storm has settled and fewer algorithm updates are being announced, more and more users are finally setting up and verifying Google authorship for their websites. If you’re contemplating such a step – setting up authorship for either yourself or a client – you should consider the potential benefits.

Google Authorship Benefits

We can still only speculate on exactly how Google Authorship will tie into Google’s search engine ranking algorithms. However, you can still reap some direct benefits from setting up authorship sooner rather than later.

The benefits of Google authorship include:

  1. Increased Click-Through Rate – Justin Briggs wrote a great post that included statistics on how user behavior is changing in Google’s search results. He makes it clear that an authorship image draws far more attention than a regular search snippet. In an attempt to understand why an authorship image increases click-through rates in Google, our SEO Manager Matt Dimock found a small test from a few years back indicating that the addition of an author tag increased visits to a specific landing page by almost 15%, while the listing maintained the same position in the SERPs.

  2. Content Ownership – Prior to authorship, it could be difficult to find quality content from a trusted source. Now people can more easily find the content they want from sources they trust by clicking on the “by [Name]” link. For example, if you click on “by Nadia Romeo” while searching for “HVAC Nadia Romeo,” you will see all the content Nadia Romeo, the President of iMarket Solutions, has written and verified ownership of, as well as her own personal Google knowledge graph.

     Nadia Romeo Google authorship search results
  3. Personal Branding – Everyone is aware of the thought leaders and big brands in their respective industries. Naturally, we gravitate to content written by these more reputable sources because they are more likely to share content worth reading. If you’re well known within your industry, then it makes all kinds of sense to let your image show within Google’s search results.

  4. Build Trust with Your Community – Unless you look like Barry Schwartz (who speculates that Google removed his authorship image for being ugly), your authorship image could help build trust with your community. After all, they say a picture is worth a thousand words and that you only get one first impression, so don’t blow it; dress sharp and smile for the birdie.

Now that you know the history and some of the known benefits of Google authorship, what are you waiting for? Go and verify your Google authorship today!

If you are in the HVAC, plumbing or electrical business and need help with setting up authorship on your blog, give us a call or contact us today.


Copyright Infringement and Google Search Results

Wednesday, August 15th, 2012

Imagine you are enjoying a nice walk in the park and someone snatches your smartphone right out of your hands! How would you feel? Would you do this to someone else?

These seem like silly questions – the answer to both, of course, is no. On the Internet, however, many seem to have let the rules slide when it comes to copying content from other websites.

Some may think, “What’s the harm, really?” Beyond it being against the law, Google now takes copyright infringement very seriously. Google recently announced an algorithm update that demotes, or even penalizes, sites that have been reported for violating copyright.

We have seen other companies steal content from our clients’ sites. Now that Google is serious about it, we will report (on your behalf) anyone who steals content from your site. This means we are looking out for our clients and will pursue a complaint against any site that has copied you and refuses to remove the copied content. This is part of your ongoing service with iMarket Solutions.

What if you copy content or images from another site without permission? Well, we know you would never do that! But just in case: if you or a company you work with adds copied material to your iMarket Solutions website, we will remove it immediately upon notification or discovery. We are legally obligated to take this action as your host, and we want to prevent your site from dropping in rankings over copyright complaints filed by other sites.

If you have any questions about this important update, please contact us!
