Posts Tagged ‘Google’

Google Penguin 3.0 – The Algorithm Refresh That Should Have Been an Update

Wednesday, October 22nd, 2014 by Matt Dimock

On October 4, 2013, Matt Cutts announced the release of Penguin 2.1 – an update to Google’s infamous algorithm that targets spammy onsite and offsite SEO strategies. Many webmasters in the SEO community reported that the 2.1 update had a markedly negative impact on their websites. And due to the nature of this particular algorithm, recovery is not possible until Google pushes through another update, or a refresh, of the algorithm.

The Google Penguin thug

Many people perceive the Penguin algorithm as nothing more than a thug, here to force thousands of small businesses into paid advertising on Google by tanking their organic visibility. So I’m sure you can imagine the unrest within the community as the one-year anniversary of the last update passed. But on Friday (October 17th, 2014), webmasters finally got their wish – Google began rolling out Penguin 3.0. Whether or not it was what they had hoped for is yet to be determined.

The Difference between a Penguin Refresh and Update

As I mentioned above, a website that has been negatively impacted by the Penguin algorithm cannot recover unless Google updates the algorithm or refreshes the current version of it. So what is the difference between an algorithm update and an algorithm refresh? I’m glad you asked …

What is an Algorithm Refresh?

An algorithm refresh simply means Google has re-run a previously introduced algorithm without modifying, removing, or adding any signals. Think of it as running anti-virus software, except that all of the “viruses” (a.k.a. spammy tactics) it finds when it is run are not allowed to impact your “computer” (a.k.a. website) again.

But just like with viruses, there are always new variations of spammy SEO strategies being used to try to stay one step ahead of the “software” (i.e. the algorithm). Google can then run a refresh of the algorithm, in hopes of catching any new spammy tactics that have been used since the previous refresh or update. However, just like anti-virus software, these refreshes can become obsolete as SEOs find new ways to evade the algorithm. That’s when an update comes in handy.

What is an Algorithm Update?

An algorithm update means Google has modified, removed, or added new signals to a pre-existing algorithm, in hopes that these signals will catch any new strategy variations that previous iterations had missed. Again, using the previous analogy: it would be just like Norton providing updates to the anti-virus software on your computer, in hopes of catching any newly found viruses or variations of old ones.

What Do We Know About Penguin 3.0 So Far?

It’s tough to say at this point. Commentary from Matt Cutts and other Google representatives led us all to believe that the next version of Penguin would be a significant update, which implied new signals would be introduced. Barry Schwartz wrote an article at the beginning of October suggesting that a Penguin 3.0 update might come within a week of his post. Barry made this prediction based on input provided by Gary Illyes, a Google Webmaster Trends Analyst who was apparently involved in work on the algorithm. However, it seems Barry may have been a bit too ambitious with his choice of words, as Gary even commented on Google+, “I love how you guys could twist ‘soon’ into this”. Some useful insights on the Penguin algorithm were extracted from Barry’s post, though.

Penguin Insights:

  • Gary confirmed that disavow files (a disavow file is a .txt file you can submit within Google Webmaster Tools that essentially tells Google which backlinks pointing to your site you do not want any credit from) are taken into consideration when the Penguin algorithm is updated or refreshed.
  • Disavow files submitted less than two weeks prior to Gary’s presentation at SMX East would not be taken into consideration in this next iteration of the Penguin update/refresh.
  • Google is working on speeding up the rate at which future Penguin refreshes will happen.

In a Google+ Hangout session on October 20th, John Mueller stated that, as far as he knew, the Penguin update had rolled out completely – but when asked by Barry to clarify whether it was indeed an update or just a refresh, he declined to comment. However, that same day, he followed up with Barry in a Google+ comment stating, “I might have spoken a bit early, hah – it looks like things may still be happening. I’ll double-check in the morning.”

It turns out that John did indeed jump the gun in stating Penguin had rolled out completely, as Pierre Far (who works at Google UK) stated the following:

“On Friday last week, we started rolling out a Penguin refresh affecting fewer than 1% of queries in US English search results. This refresh helps sites that have already cleaned up the webspam signals discovered in the previous Penguin iteration, and demotes sites with newly-discovered spam.

It’s a slow worldwide roll-out, so you may notice it settling down over the next few weeks.”

Notice the choice of words in his first sentence – “Penguin refresh.” Although it’s not an official confirmation, it definitely suggests that this iteration of Penguin is indeed just a refresh and not an actual update of the algorithm. He goes on to use “refresh” again in the next line, and his description of what this refresh does coincides with what you would expect from a typical refresh, not an update. Lastly, he confirms that this refresh (which supposedly impacts fewer than 1% of queries) is still rolling out worldwide, so fluctuations in rankings and traffic can be expected over the next few weeks.

What Should You Do?

First and foremost, it is of the utmost importance that you DO NOT panic. As we’ve seen with rolling updates (which so far have only been confirmed with the Panda update), they can take some time to impact all websites. And in some instances, we have seen rankings move up, down, and up again over the period of a roll-out (and significantly in some cases).

Don’t panic; organize your SEO strategies!

So before you decide to throw in the towel, wait to see how your website’s traffic and impressions are impacted. If you see noticeable positive increases, then keep on doing what you’re doing, as it’s obviously working at this point in time. And if you see that your website was negatively impacted, then it’s important to understand why, so you can develop a strategy to fix the issues on your website or within your backlink profile.

Fortunately for us and our clients, we build the majority of our clients’ websites from scratch, so it would be rare for one of our sites to be targeted as a result of its onsite work. A much more common scenario is a domain that was involved in unscrupulous link building campaigns prior to hiring our services. Nevertheless, we’re adequately prepared to tackle either issue, should it arise, and want any webmasters reading this blog post to be prepared as well.

Below Are Some Steps You Can Take to Help You Recover from Penguin:

Step 1 – Review Your Analytics Data

It is important to know if your site has been impacted, and which pages in particular, before you can devise a strategy to keep Penguin from targeting you. What my team and I do is look at Google organic analytics data only, comparing landing-page traffic for the 2 – 4 weeks following the day the algorithm rolled out against the same amount of time immediately prior. You have to compare apples to apples (i.e. looking at Monday – Sunday vs. the previous Monday – Sunday) in order to get an accurate representation of what your traffic trend should look like.

Also, it’s important to look at absolute data vs. average data, as the number of visits lost is a much more telling sign than the average percentage (you could have a 100% decrease to a specific landing page, but that does not really tell you anything if the page was getting 4 visits in the two weeks prior and is now getting none).
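To make that comparison concrete, here is a minimal sketch in Python (using the pandas library), assuming you have exported Google organic landing-page sessions for the two windows as CSV files; the file names and column headers are placeholders for whatever your Analytics export actually contains.

    import pandas as pd

    # Hypothetical exports of Google organic sessions per landing page for two
    # equal windows: the weeks before the rollout and the weeks after it.
    before = pd.read_csv("sessions_before.csv", index_col="Landing Page")
    after = pd.read_csv("sessions_after.csv", index_col="Landing Page")

    # Join on landing page; a page missing from one window counts as 0 sessions.
    merged = before[["Sessions"]].join(
        after[["Sessions"]], how="outer", lsuffix="_before", rsuffix="_after"
    ).fillna(0)

    # Absolute change is the telling number: dropping from 4 visits to 0 is
    # noise, while dropping from 400 to 0 is a Penguin-sized signal.
    merged["delta"] = merged["Sessions_after"] - merged["Sessions_before"]

    # Landing pages that lost the most traffic, biggest losses first.
    print(merged.sort_values("delta").head(20))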

Step 2 – Understand Why Certain Landing Pages Were Targeted

Penguin can and does impact traffic to your entire site, but more often than not, specific landing pages are the cause of you being targeted in the first place (especially if you’re dealing with onsite spam vs. low quality backlinking). So what you need to do is evaluate the pages on your site you have deemed as being targeted and determine why Google thinks those pages are spammy. Are you stuffing important keywords or mentions of a geo-target within the content or headers? Are the meta tags extremely long and stuffed with near-identical variations of a keyword? For further guidance, you can read my previous post on Penguin 2.1, where I specify the instances of onsite spam I would consider “Penguin bait”.

And beyond onsite, you also have to take into consideration offsite work: backlinks. Using www.MajesticSeo.com, we are able to evaluate the quality, quantity and methodologies our clients have used in the past to build backlinks for their websites. My blog post on “How to Recover from a Google Unnatural Linking Penalty” will walk you through the steps of not only identifying spammy backlinks, but disavowing them as well.

Step 3 – Take Action!

As pointed out above, understanding why you have been targeted by Penguin is the key to recovering. Once you have identified the culprit strategies, you need to work diligently on remedying them. If spammy onsite is to blame, then you need to work on cleaning up the SEO strategies you have in place on your website. If spammy backlinks are to blame, then you need to identify what backlinks are harming your site and work on asking the webmasters of the linking sites to remove those links.

And always remember: what worked wonderfully in the past will not always work as well in the future. So appreciate the ride you enjoyed, implement a revised search engine optimization strategy as soon as possible, clean up what needs to be cleaned up onsite or offsite, and hope that a future refresh or update of the Penguin algorithm will work in your favor.

The Life and Death of the Google Authorship Markup

Wednesday, October 8th, 2014 by Matt Dimock

It was exciting while it lasted, but unfortunately, Google authorship is no longer supported by Google. But first, allow me to shed a little light on the rise and fall of Google’s authorship markup.


A screenshot from the Google Webmaster Tools Help forum, which states, “Authorship markup is no longer supported in web search.”

The Google authorship rich snippet was first introduced by Matt Cutts at the SMX Advanced conference, back in 2011. For those unfamiliar with this rich snippet: it allowed you to identify yourself as the author of the content within a blog post, which would then display a small thumbnail of your Google+ profile photo directly to the left of your blog post snippet within Google’s search results. In the beginning, the process for setting up authorship on your website and verifying it with Google was unclear to many, which proved to be quite the perk for those who were successful in setting up their authorship markup early on.
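For those curious what the markup itself looked like, below is a minimal sketch of the two common implementations; the Google+ profile ID shown is a hypothetical placeholder, and verification also required adding your site to the “Contributor to” section of that Google+ profile.

    <!-- Option 1: link your byline to your Google+ profile with ?rel=author
         (the profile ID here is a hypothetical placeholder): -->
    <a href="https://plus.google.com/110000000000000000000?rel=author">Matt Dimock</a>

    <!-- Option 2: a rel="author" link element in the page head: -->
    <link rel="author" href="https://plus.google.com/110000000000000000000"/>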

The Benefits of Google Authorship

As mentioned previously, the early adopters of Google’s authorship rich snippet quickly realized that the most obvious benefit was the increased click-through rate (CTR) for their blog posts that ranked on the first page of Google. It’s no surprise, really; the photo thumbnail displayed prominently on search results pages, especially when it was the only one showing for a particular keyword.


An example screenshot of the authorship rich snippet for Nadia Romeo (the President of iMarket Solutions) within Google’s search results

Another large benefit of having authorship was the ease with which one could find all of the content an author had written and associated with their Google+ profile. All you had to do was click on their name within the byline of their search snippet and it would take you directly to a Google search page that listed all of the content associated with their authorship. This was beneficial to brands and authors alike, as people who followed their work could more easily find other content the author had written which they may not have previously read.

The last benefit, AuthorRank, was largely considered a myth up until Matt Cutts confirmed it on Twitter, after Amit Singhal had publicly denied its existence at SMX West this past March.

AuthorRank was the concept that Google assigned quality scores to authors who created a Google+ account and correctly associated their authorship with content they had written and published to websites. The assumption was that the more authoritative you appeared to Google on a particular topic or topics, the more likely the AuthorRank algorithm was to rank your related content higher in search results. Many of us in the search engine optimization (SEO) community truly believed AuthorRank was going to be the key to becoming more visible in Google’s search results. But alas, it appears it was never used to the broad extent that so many SEOs had imagined.

R.I.P. Google Authorship

On June 25, 2014, it came as a shock to most SEOs when Google’s John Mueller announced on Google+ that Google would be “simplifying the way authorship is shown in mobile and desktop searches”, so as to provide a better mobile experience. The simplification turned out to be a mass reduction of authorship photographs showing for particular searches on Google. However, the byline and date of publication still showed below the URL of a search result for those who were affected. The significance of the impact that change had on search results was documented by the MozCast feature graph and can be further reviewed here.

Then just two months later, John Mueller reported that Google had made the decision to remove authorship from their search results entirely. To paraphrase: he stated Google found that the authorship markup was not very useful to visitors of their search engine and, in fact, was potentially distracting.

A Few Last Words for Google Authorship

For those of you who have the authorship markup installed on your websites; don’t worry about Google punishing you for leaving the code in place. Barry Schwartz, the owner of the popular SEO blog SERoundtable.com, asked John Mueller in a comment on his announcement of the death of authorship if Google would penalize websites which left the authorship markup in place. John confirmed that Google would not punish you for leaving the markup on your website, and that in fact, visitors to your website still might appreciate being able to find out more about you and/or your company through your Google+ profile.

The Google Algorithm Frenzy:
Pigeon, HTTPS, Authorship and MUCH More

Thursday, August 14th, 2014 by Matt Dimock

On July 3, 2014, Matt Cutts declared to the search community that he was going on leave for 4 months, all the way through October. Upon hearing the news, I had a big sigh of relief. For you see, I thought to myself, “There is absolutely no way Google is going to launch any algorithms or make any significant updates to their existing algorithms while the face of their search quality department is on a leave of absence.” I mean, who are SEOs going to yell at and blame for all of their woes while he is away, right? My first guess would be John Mueller, a Google Webmaster Trends Analyst who tends to navigate Google’s help forums and host Google Hangouts, giving advice on how to build a website that’s both user and search engine friendly – but is he adequately prepared to fend off the masses of SEO questions ahead?


A screenshot of Matt Cutts stating he is going on leave in 2014.


Google Algorithms Launched and Updated in 2014

Well, it turns out that Google had no plans of holding back during Matt’s absence. In fact, it seems they’ve decided to turn up the heat a bit. They’ve made quite a splash by announcing two new algorithms, and there have even been a couple of periods of unconfirmed algorithm changes hitting the streets, causing significant SERP (search engine results page) changes. But before we get into the more recent algorithms, let’s first have a quick look at the algorithms that Google acknowledged launching or updating prior to Matt skipping town.

Page Layout Algorithm #3

Release Date: February 6, 2014

The Page Layout algorithm (originally referred to as the “Ads above the Fold” algorithm) was first launched on January 19, 2012, and then updated on October 9, 2012. Google believes having too many advertisements above the page fold of a website makes for a bad user experience, so the algorithm aims to hinder the rankings of websites that are dominantly ad-heavy. In trying to find what impact the Page Layout algorithm had on our 130+ HVAC, plumbing and electrical clients, I found that this algorithm negatively impacted thin content pages in particular. I also suspected that it treated having too many images above the page fold the same as having too many advertisements. The moral of the story is to have as much unique and valuable content above the page fold as possible, instead of too many images or ads.

Unnamed Update

Release Date: March 24, 2014

Although it was never publicly confirmed by Google, there was a major shake-up in the SERPs around March 24 – 25. According to Moz.com’s documented Google Algorithm Change History, SEOs speculated this was the softer Panda update that Matt Cutts confirmed at SMX West 2014 would be rolling out relatively soon.

Payday Loan 2.0

Release Date: May 16, 2014

The Payday Loan algorithm was first introduced by Matt Cutts via Twitter on June 11, 2013. In his tweet, he suggested webmasters watch a video he had posted previously in which he discussed some upcoming changes to Google’s search. In the video, he hints that the algorithm is geared towards cleaning up search results for queries that tend to be more aggressively spammed than others, such as payday loan, Viagra, gambling, pornographic and other such keywords.

Panda 4.0

Release Date: May 19, 2014

In what’s now becoming a more common occurrence, Google is releasing significant algorithm updates within short periods of one another, in what I believe is a tactic to make it more difficult for SEOs (such as myself) to reverse engineer the exact metrics their algorithms target. Apparently Matt Cutts stated the algorithm started rolling out on May 20, but the SERP data Moz.com collects indicated it may have started a day earlier.

Payday Loan 3.0

Release Date: June 12, 2014

Not even a month after the 2nd confirmed update to the Payday Loan algorithm, we received word from Google that they had launched yet another update. According to Barry Schwartz’s Search Engine Roundtable article, Matt Cutts stated that the 2.0 update not only helped to better prevent negative SEO (the black hat art of pointing spammy, irrelevant and/or low quality backlinks at competitor sites), but also specifically targeted spammy websites, whereas this 3rd update targeted spammy search queries, which is consistent with what the initial algorithm did.

Authorship Photo Dropped

Release Date: June 28, 2014

This is more of a SERP display change than an algorithm update, but it is a significant enough change to have earned a spot on this list. John Mueller announced on June 25, 2014 that Google would no longer be showing authorship photos within their search results. Many webmasters, myself included, noticed in the days and weeks prior to the official announcement that there were some rare instances of authorship photos disappearing, but I don’t believe any of us expected Google to remove the authorship photos entirely. Nevertheless, Google has opted to keep the authorship name, which links directly to the Google+ profile of the respective author, as well as the date the blog article was published. You can see an example of this in the screenshot below showing my authorship info for my blog post on keyword cannibalization.


A screenshot of the authorship search result for my post on keyword cannibalization.


Above I discussed new, updated and unconfirmed algorithm tweaks that occurred prior to Matt Cutts going on leave.

Below are the algorithms which launched after Matt Cutts went on leave.

Pigeon Algorithm

Release Date: July 24, 2014

This update was actually never officially named internally at Google. So in an attempt to make it easier for SEOs to reference it later, Search Engine Land (SEL) was, as usual, quick to dub this algorithm – they decided on “Pigeon.” Keeping with the theme of P-named algorithm updates (i.e. Panda, Penguin, Payday Loan, Page Layout), SEL felt this name was appropriate because this algorithm specifically targeted local search results, and “pigeons tend to fly back home.”

Now getting to the good stuff – what we do know about this algorithm is that it is said to be the 2nd largest algorithm released since the Venice update. Barry Schwartz also claims Google told him that the new local-focused algorithm made local SERP ranking signals more similar to organic SERP ranking signals, and that it improved Google’s ability to interpret and factor in both distance and location for improved local ranking. To me, this suggests that websites will have a harder time ranking locally for cities in which they are not physically located.

MozCast’s Google SERP Feature Graph, a tool that shows changes in Google SERP results, indicated a local SERP drop from 19.3% to a low of 9.2% within the days following the algorithm’s launch and a slight increase of knowledge graph results, from 26.7% to 28%.


A screenshot of the MozCast Google SERPs Feature Graph tool showing the impact of the Pigeon algorithm update.


The commonality spotted by many SEOs in the industry is that many keyword phrases which used to show a local map pack no longer do (Mike Blumenthal, a well-known local SEO expert, noticed that real estate keywords in particular seemed to be the most significantly impacted, which he indicates in his comment here). Moz’s graph shows that local search results seem to have returned to normal around July 29, 2014 – Barry Schwartz reached out to Google to confirm whether we were already seeing a refresh of the Pigeon algorithm, but they would neither confirm nor deny.

HTTPS / SSL Algorithm

Release Date: August 6, 2014

During my trip to SMX West this year, Matt Cutts stated that he would love to see websites that utilize SSL certificates (i.e. https://www.wellsfargo.com; note the HTTPS vs. HTTP, which means the website is secured) receive a ranking boost for providing a secure website to their visitors. Well, it seems he knew more than he was letting on – five months later, Google officially announced that they now provide a minor ranking boost to websites that have SSL certificates installed. They also suggested that they might increase the weight of this ranking signal in the future.
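As a quick sanity check after installing a certificate, a few lines of Python (assuming the third-party requests library) can confirm that your plain HTTP pages actually forward visitors to their HTTPS equivalents; the domain shown is a placeholder.

    import requests

    # Placeholder domain; substitute your own site.
    response = requests.get("http://www.example.com/", allow_redirects=True)

    # response.url is the final URL after any redirects have been followed.
    if response.url.startswith("https://"):
        print("Good: HTTP requests are redirected to HTTPS ->", response.url)
    else:
        print("Warning: the site is still served over plain HTTP ->", response.url)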

If this post has been informative for you, feel free to share it with your friends.


The Google Algorithms Keep Coming, and Coming, and Coming ….

As you can see, we SEOs have our hands full. Only a handful of algorithms (out of the hundreds of updates Google makes per year) are significant enough for Google to publicly announce, but it truly is a never-ending battle. One day you might have 1st page rankings for “Denver Plumber”, the next week you could fall to the 5th page or even farther, and then you could find yourself ranking higher in the weeks following.

My seven-plus years of SEO experience have taught me that Google’s SERPs are in a never-ending state of flux, but if you build and optimize your website to the highest standard, you tend not to have to worry about any sort of negative impact from these frequent updates. And this holds true for the majority of clients here at iMarket Solutions. We only utilize white hat methodologies, and we stay up to date on Google’s quality guidelines and the algorithms they do publicly announce, so we know exactly what Google prefers in a website.

We and our methodologies aren’t perfect; we have had some clients who have seen negative results from algorithmic changes, but I’ve learned to accept that as collateral damage, if you will. It’s simply impossible to build a perfect website and marketing campaign, especially with so many great minds and websites competing with our own, and not expect some sort of backwards movement at one time or another. The important thing to do in those situations, which is pretty much a favorite pastime of ours, is to review ranking, Google Analytics and Google Webmaster Tools data to try to determine which pages or strategies an algorithm has targeted on a site, and to use that knowledge to better your methodologies as a whole. This is what iMarket Solutions does on a weekly basis for our clients, and a large reason why we are capable of building such successful SEO campaigns.


If you are in the HVAC, electrical, plumbing or home remodeling industry and want a website that dominates organic search results, feel free to give us a call - (800) 825-7935!

 And if you have any questions or comments regarding this blog post, I’d love to hear about them below.

The Google Penguin 2.1 Algorithm Update Is Here, And It’s Scarier Than Ever!

Friday, October 4th, 2013 by Matt Dimock

Scary Penguin 2.1 algorithm update


All right, so maybe Google’s Penguin 2.1 algorithm update isn’t nearly as scary as the picture above. But with Halloween right around the corner, would you expect anything less?

Go ahead, share it with your friends – you know you want to.


What Is the Penguin Algorithm, You Ask?

Although Penguin 2.1 doesn’t have glaring red eyes, snarling fangs, and a tattoo on its upper right shoulder, I know many webmasters would rather tango with the beast portrayed above than with Google’s infamous Penguin algorithm. Matt Cutts, the head of the Google Web Spam team (a.k.a. the Search Quality team), announced the launch of the Penguin 2.1 update via Twitter. And of course, it wasn’t long before the story was covered by Danny Sullivan of Search Engine Land.

Google first released the Penguin algorithm on April 24, 2012. At the time, many SEOs largely considered it to be the beginning of the end of search engine optimization. And for many website owners who indulged in paid, spammy or low quality link building or black hat onsite tactics, it was just that. In fact, Matt Cutts reported that the first Penguin algorithm impacted 3.1% of English search queries. Fortunately, Penguin 2.1 only “affects less than 1% of searches to noticeable degree”, again, as reported by Matt Cutts.

Since the inception of Penguin in 2012, there have been four other modifications to the algorithm, with Penguin 2.1 being the fifth and most recent change. Keep in mind, Google only increments the version by a full number when they feel enough modifications have been made to the algorithm to consider it a full algorithm update. In instances where only minor modifications were made to the previous version, as with Penguin 2.1, they tend to increment by decimal points, similar to how WordPress numbers its installation updates.

What Does Penguin 2.1 Look for?

It’s likely that Penguin 2.1 looks for everything that the preceding Penguin algorithms looked for, but it’s hard to say at this point in time. To give you some history: when the first Penguin algorithm was released, no one knew what to expect. As the months passed, certain characteristics emerged that were common to sites negatively affected by a Penguin release. But what no one had anticipated was that the first Penguin algorithm likely only analyzed and took action on backlinks pointing to the homepage (or at the very least, only the top-level pages of a site). This of course wasn’t discovered until Penguin 2 was announced, and Matt Cutts hinted at the fact that Penguin 2 went much deeper into a website than the original Penguin algorithm.

My personal understanding of the Penguin algorithm, based on my own research and personal experiences, is that it largely targets the following:

  • Low quality links
  • Spammy links
  • Paid links that are not marked as nofollow
  • Spammy or black hat SEO techniques
  • Article syndication links
  • Forum or blog posting abuse (such as planting links within forum signatures, or creating really low quality blog posts on 3rd party sites, like Blogger, for the sole sake of acquiring backlinks)
  • Large amounts of exact match keywords used within external links
  • Large amounts of backlinks coming from one website (usually referred to as site wide links)
  • Widget links
  • Aggressive link exchanges
  • In fact, pretty much anything listed on the Google Link Schemes page, which is a part of the quality guidelines set forth by Google

Keep in mind the Penguin algorithm is just that – an algorithm. There is no human intervention. It follows a formula that was designed by the engineers working at Google, and adjusts how a website or a specific webpage ranks, based on its analysis of that entity. It is my own personal belief that the Penguin algorithm acts as a flagging system for the Google Search Quality team, alerting them to instances of unnatural linking. I believe this is how they were able to assign so many manual actions (a.k.a. unnatural linking penalties) to websites throughout 2012 – 2013. Granted, Google does claim that every manual action is reviewed manually by an actual human being, but too many websites were penalized for this process to not be somewhat automated.

As a side note: if your website is showing you have a manual action within Google Webmaster Tools and you want to better understand what you can do to remove it, read my blog post to learn how to recover from a Google unnatural linking penalty.

How Can I Keep Track of These Penguin Changes and Other Algorithm Updates?

Besides trawling through countless SEO blogs in hopes of keeping up-to-date with all the most recent Google algorithm changes and modifications, you have a few more options for staying in the loop.

  1. Subscribe to RSS feeds. SEO sites like SearchEngineLand.com and SERoundTable.com are constantly reporting on all things SEO. In many cases, they’ll learn of an algorithm change or feature introduction far before others in the industry do. Sites like these almost always have an RSS feed which you can sync to your RSS reader, delivering the stories directly to your computer, email, or mobile phone.
  2. Read our blog. We at iMarket Solutions love to read about SEO to help us stay on top of our game, and we equally enjoy writing about it. As such, we feel it’s in the best interest of our clients and our readers to keep them all up-to-date on all of the most important search engine related updates. With that being said, feel free to keep an eye on our blog for all of the latest SEO news and tips.
  3. Bookmark the Google algorithm change history webpage. Moz.com is one of the most trusted sources in the search engine optimization industry. Originally, they started off as SEO consultants. But their love of SEO and helping people grew so much beyond what they had initially anticipated that they had a change of heart, choosing instead to help SEOs become better at what they do so they can better assist their own clients. One of the ways they help SEOs is by keeping track of all of the Google algorithm updates – and now you can too, by bookmarking this page.

In Conclusion

One thing is for certain – Google sure has been busy. With the introduction of the Google Hummingbird algorithm, their LARGEST algorithm release in over a decade, it is almost certain that many webmasters will see some sort of fluctuation in their rankings, traffic and leads. The important thing is not to panic. If you are an iMarket Solutions client, chances are slim that you will be negatively affected by Penguin 2.1, as we only engage in white hat SEO (also referred to as “best practices” throughout the industry), and only pursue organic link opportunities. If by chance you do experience any dramatic shifts in rankings, traffic or leads, do be sure to give us a call. We can have an SEO specialist look at your website to determine what is affecting it and come up with an actionable plan to reverse the results.

How to Recover from a Google Unnatural Linking Penalty (a.k.a. Manual Action)

Friday, August 23rd, 2013 by Matt Dimock

Unnatural linking is a no-no. Google has made this very clear for quite some time. And yet, even to this day, there are many professional search engine optimizers (SEOs) who simply refuse to believe it. Although many SEOs acknowledge the perils of unnatural linking, they nevertheless continue their black hat methods (I like to refer to them as “the special few”). And do you know why they still condone and practice unnatural linking? Because, to an extent, it still works well for increasing keyword rankings. In this blog post, I talk about why unnatural linking can be more harmful than helpful and give you some information about the various types of unnatural linking I have found to be most common in websites that I have helped recover. I also explain how to recover from an unnatural linking penalty (a.k.a. a manual action).

Not All Backlinks Are Created Equal

Although inbound links still play an influential role in Google’s ranking algorithm, I would definitely consider backlink building to be a double-edged sword. On the one hand, building backlinks can be a great way of increasing keyword rankings for the keywords targeted within the anchor text of those links. But on the other hand, obtaining those links unnaturally poisons your site. In the more severe cases I have seen, Google has taken manual action on entire websites instead of just specific pages, removing all pages of the affected site from their index completely. Needless to say, not all backlinks were created equal.

Not all links were created equal

For those unfamiliar with the term “manual action”: this is the terminology Google uses in-house when referring to their unnatural linking penalties. The Google unnatural linking penalty is the result of a manual action, which means that someone from the Search Quality team personally reviewed the website in question and found it to be in violation of their quality guidelines. However, even with this being a manual process, Google has still been able to target tens of thousands of webmasters (this was confirmed by Matt Cutts on the official Google Webmaster Central blog when they first notified the public about the new unnatural link notifications and provided vague examples of unnatural links). I think Google has been able to do this on a scalable level with the help of one of their more formidable algorithms to date: Penguin.

The Penguin Algorithm May Help Google Dish Out Unnatural Linking Penalties

Penguin is an algorithm, which means that it completes its work according to set formulas and adjusts your website’s rankings accordingly. From what we know, the Penguin algorithm was designed by the Search Quality team at Google and released for two very apparent purposes: to identify and to nullify unnatural links.

The inception of the Penguin algorithm on April 24, 2012 has since forced SEOs to find alternative, more natural ways of link building to keep their clients’ websites ranking above their competition. But I think there is more to this algorithm than I first realized. I think that Google is using this algorithm to collect large amounts of data on unnatural backlinks, which they then use to identify and take manual action on sites which have practiced the most egregious of unnatural link building.

Who Needs to Worry About Google Linking Penalties?

There are still many SEO companies and SEO consultants who have ignored Google’s advice. Unfortunately, this means that the clients who put their faith in these companies are putting their websites, and consequently the livelihood of their online business, at risk. I have even seen unnatural linking penalties innocently triggered by website owners who were simply uneducated about Google’s strict quality guidelines. This is all the more reason why every SEO and webmaster alike should read the Google quality guidelines. Not doing so can result in a significant loss of rankings, which can lead to a reduction in both traffic and leads.

Read Google's quality guidelines to learn why you may have received an unnatural linking penalty

Here is Your Unnatural Linking Penalty Recovery Plan

This isn’t rehab, but admitting that you have a problem is definitely the first step on the road to unnatural linking recovery. Because Google now confirms whether or not a website actually has an unnatural linking penalty within Google Webmaster Tools, finding out whether or not your site has been penalized could not be easier. Assuming you have already set up and verified Google Webmaster Tools, and have confirmed that your website has an unnatural linking penalty, the next step is identifying the toxic links pointing to your website.

How to Identify Unnatural Links

The process for identifying unnatural links is a long and tedious one. There are many different reasons why Google may take manual action on your website. The following steps will outline some of the ways I have been able to find unnatural links which are capable of triggering manual actions:

  1. Verify ownership of your website within www.MajesticSEO.com. Once you do this, you can gain access to your website’s link data (as seen by Majestic SEO) for free.
  2. Once you are logged into Majestic SEO, create a historical report of all links pointing to the root domain of your website (the root domain is your website’s domain name, without the http:// or www preceding it) and identify any links deemed unnatural. Some things that would classify a link as unnatural:
    1. A poor quality or spammy website. Although a tedious process, manually reviewing each backlink pointing to your website is the best and most failsafe way for identifying unnatural, spammy or low quality backlinks.
    2. A completely irrelevant website. The websites pointing to the target site should relate to that site. And just because the anchor text used to link back to the website is relevant does not make the entire site relevant. If the content of the page linking to you does not relate, the link should be classified as unnatural.
    3. Large amounts of backlinks coming from one referring domain (also known as site wide links). The more backlinks you receive from a website, the less valuable they become. In my experience, large amounts of backlinks are a clear indicator of unnatural linking, as they usually stem from widget or footer links which replicate throughout a large portion or all of a website’s pages.
    4. Large amounts of links using exact match anchor text (i.e. “Orlando Plumbing”, “Visit this website”, etc.). I have found exact match anchor text to be probably the most common culprit for manual actions being assigned to a website. When natural link building occurs, it is rare that you will see many instances of the same anchor text being used twice. So you can imagine why Penguin would raise a red flag on your website when it notices half of your backlinks are using one anchor text variation (a rough way to measure this is sketched after this list). This is why it’s important you learn how to do keyword research properly before even thinking about starting a link building campaign. And of course, make sure you stay clear of keyword cannibalization.
    5. Paid links (i.e. where a webmaster pays a 3rd party to place a followed link on the 3rd party’s site pointing back to the webmaster’s own site). Paid backlinks are also one of the more common reasons why webmasters receive unnatural linking penalties. It is a clear violation of Google’s quality guidelines to purchase links which flow PageRank. If you do pay for any directory submissions or to have a backlink point to your website, make sure the webmaster adds a rel=nofollow tag to that link. This will prevent the flow of PageRank and will keep you manual action-free (assuming you aren’t partaking in any other forms of unnatural link building).
  3. In my experience, the more backlink data you have to evaluate, the better your chances of succeeding. Some other ways for you to find backlinks to analyze are:
    1. Download the provided sample links within the “Links to Your Site” section of Google Webmaster Tools. Matt Cutts states this is all you need to focus on in order to recover from an unnatural linking manual action, though some in the SEO industry are questioning Matt Cutts’ statement.
    2. Subscribe to www.Ahrefs.com or www.Linkresearchtools.com. Ahrefs.com is great for analyzing your backlink data down to a very granular level, similar to what is now possible at Majesticseo.com as well. And Linkresearchtools.com has a Link Detox report which algorithmically calculates the most likely unnatural backlinks within your backlink profile.

Steps to Recover from a Google Unnatural Linking Penalty

Once you have compiled a comprehensive list of unnatural links, you will need to start contacting the webmasters behind those links, politely asking them to either nofollow or remove the backlinks on their site pointing to yours. Keep in mind that many webmasters might get offended when you contact them, as you are implying their backlinks may be hurting your rankings, so word your e-mail as politely as possible. And in some instances, the number of backlinks is so large that it may be in your best interest to request that they remove all backlinks pointing from their site to your own vs. just a handful of backlinks you identified. Below is an outline of the exact steps you need to take in order to request removal of an unnatural linking manual action placed upon your website:

  1. Reach out to the webmasters of the sites you identified as having unnatural links. In fact, you need to contact them multiple times via either contact forms or e-mail addresses found on their site, asking that they remove all backlinks on their site pointing to the target site. Sites like www.Removeem.com and tools like www.BoomerangGmail.com can make this tedious task that much more manageable, as they automate a lot of the initial contact and follow up process.
  2. Document all link removal efforts within a Google Docs spreadsheet. Statistics you should track are the target URL, the dates you attempted to contact the webmaster, and the status of the link at the end of all your efforts. If any of the webmasters request that you make a payment for link removal, just disregard them and add that note to your document for the respective website. Remember to make this document available to all, so you can share it within your submitted reconsideration request to Google’s Search Quality team.
  3. Submit a disavow file. Once you have reached out at least three different times to the target webmasters, I would then recommend you create and submit a disavow file, asking Google to disregard all of the links you’ve deemed unnatural (a minimal sample file is sketched after this list). Make sure you read Google’s instructions on how to create and submit a disavow file, so you know that you’re completing this process correctly.
  4. Submit a reconsideration request. After you have successfully submitted your disavow file, I would recommend waiting at least two weeks before submitting a reconsideration request to Google. This should give Google ample time to review the file and disavow the targeted links accordingly. However, please note that it is extremely rare for Google to revoke a manual action after the first submitted reconsideration request. The worst-case scenario could very well be that you may have to wait for the manual action to time out on your website, but that’s not to say Google’s Search Quality team won’t simply apply another of greater magnitude later down the road (which they have been known to do, as verified by Barry Schwartz). Some webmasters have even disavowed all backlinks pointing to their site, but this blanketed approach should not be taken lightly.
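For reference, here is a minimal sketch of what a disavow file looks like: a plain .txt file with one URL or domain: entry per line, where # lines are treated as comments. The domains below are placeholders, and documenting your outreach in the comments is optional but courteous.

    # Contacted the webmaster on 6/1/2013, 6/15/2013 and 7/1/2013; no response.
    domain:spammy-directory.example.com

    # A single page can be disavowed without disavowing its whole domain.
    http://low-quality-articles.example.com/seo-syndicated-post.html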

My hopes are that this blog post will help you with your unnatural linking road to recovery, but the hard truth is that you may not be able to do this yourself. Identifying unnatural links can be difficult enough, and making yourself available to reach out to webmasters can be a full-time gig on its own. You don’t have to walk this path alone, however. There are plenty of SEO companies out there who have extensive experience in manual action recovery. If you need further guidance or would like to speak with iMarket Solutions about helping you recover from an unnatural linking penalty, feel free to reach out to us. We’d be delighted to steer you in the right direction to a brighter, unnatural link-free future. And as always, if you found this article to be of use to you, please consider sharing it with your social circles.

The How’s and Why’s of Performing Industry Specific Keyword Research to Maximize Organic Search Leads

Tuesday, July 23rd, 2013 by Matt Dimock

Practically all of us in the Internet marketing industry have heard the phrase, “content is king”, but just because you have written relevant content for your website does not make it worthy of the 1st position in Google search results (a.k.a. the throne). Sure, your website’s content may rank decently for some keywords and phrases. But without performing industry-specific keyword research, your website’s traffic (and the flow of online organic search leads) will suffer in the long run. And with so many factors contributing to a website’s rank at the top of search engines in today’s competitive world, one must capitalize on every possible strategy to gain the competitive advantage. Again, none of this can be achieved without performing thorough and industry-specific keyword research.

We at iMarket Solutions regularly create and manage relevant content on behalf of our clients. But we understand that some of our clients want to proactively perform keyword research to identify any low-hanging fruit for their online marketing campaigns. To help guide them down the right path, I have put together this blog post that will briefly detail why you should perform keyword research specific to your industry, and how you can go about doing so.

Perform Keyword Research Like You Dress for Work: Appropriately

Would you consider flying to Alaska in October without a jacket? The most likely answer: no. The same concept can be applied to online marketing. What may be appropriate for one region or industry may not necessarily apply to all. One must also take into account seasonal changes. For example, “air conditioning” focused keywords are not as popular in Seattle, WA as “heating” related keywords, especially in the winter months. This is why you need to put in the time and effort to ensure you are targeting as many of the most relevant, likely, and widely searched combinations of keyword phrases used to find businesses of your nature.

This is how you maximize your intake of organic search engine leads.

THIS is how you perform keyword research!

Fortunately, your hard work will not go unnoticed, as this keyword research will lay the foundation for the focus of your content creation – both on the main pages of your website and on your blog (if you have one). For the sake of this blog post, though, I will focus primarily on how to perform keyword research to rank better in Google search results.

What Are the Best Ways to Perform Keyword Research?

You can utilize many different techniques to help ascertain the best possible keywords to use for your Internet marketing campaign, but I’ll share with you some of the techniques we use here at iMarket Solutions.

  1. One of the best ways for you to perform keyword research is to watch the top-ranking websites in your industry. After all, they are there for a reason. Chances are your competition has put in the time and effort to determine the most viable keywords to pursue. If you notice any particular competing websites ranking for numerous keywords you would also like to target, then we suggest you reverse engineer them using a tool like SEMrush or SpyFu. At no cost to you (besides the time spent researching), you can get a better understanding of which keywords your competitors are ranking for, and an estimate of how much search volume they are receiving as a result of their efforts.
  2. If you have not already done so, set up Google Webmaster Tools. Once you have gathered enough data (we recommend waiting at least one month before viewing the report), you can go to the Search Queries report within the Search Traffic tab and view a list of keywords your website is ranking for. To better determine the keywords that will drive the most traffic to your site, we recommend you take your filtering a step further by sorting impressions from greatest to least – I refer to this as the “keyword opportunities” report (a rough way to build it is sketched after this list). Search for keywords with large amounts of impressions that are not on the first page, or better yet, not in the top 3 positions. These keywords stand a much better chance of getting to the first page than others, and we have found they can end up being some of the more widely searched keywords in your targeted service area.
    Perform keyword research using the Google Webmaster Tools Search Queries report

  3. Even though this SEO tool is being phased out, the Google Adwords Keyword Tool has always been a favorite among SEO consultants and agencies alike. And as it has still not been pulled offline by Google, I figured it was still worthy of at least a farewell mention. The Google Adwords Keyword Tool has long been the go-to keyword research tool, as it provides data directly “from the horse’s mouth.” Once you have completed your competitive keyword research, take your list of keywords and input them into the “Word or phrase” box. For a more targeted search, you can specify the website you are performing keyword research for, and even choose a relevant category. Additional filtering options let you choose match types (which allows you to receive search data for broad, exact, and phrase match keywords), select the location and language of interest, view desktop or mobile search data separately, and view search volume on either a global or local level. Once the Google Adwords Keyword Tool is phased out, it will be replaced with the Google Adwords Keyword Planner. This modified version of the tool has a much more user-friendly interface. The most useful of the new features is not only the ability to combine multiple keyword lists together, but also being able to view and select cities from an interactive map, making it that much easier to ascertain search volume for targeted keywords in very specific cities or regions.
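And here is the rough “keyword opportunities” sketch promised in step 2, in Python, assuming you have downloaded the Search Queries report as a CSV; the file name and the “Query”, “Impressions” and “Avg. position” headers are placeholders for whatever your export actually contains.

    import pandas as pd

    # Hypothetical CSV download of the Webmaster Tools Search Queries report.
    queries = pd.read_csv("search_queries.csv")

    # Keyword opportunities: plenty of impressions, but an average position
    # outside the top 3 (or off the first page entirely).
    opportunities = queries[queries["Avg. position"] > 3]
    opportunities = opportunities.sort_values("Impressions", ascending=False)

    print(opportunities[["Query", "Impressions", "Avg. position"]].head(25))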

Help us help you with your online marketing strategy – call iMarket Solutions today!

Now that you are armed with some basic strategies and tools that can help you further expand your company’s reach within your target demographic, get out there and start researching new potential keywords to use for your site. Once you have a thorough list of keywords to consider, the next thing you need to do is create some awesome content to publish to your website. Not sure where to start? Then learn how to write your own website content first. If at any time you find yourself overwhelmed by any part of the process, or feel more comfortable passing off your duties to a professional online marketing company that specializes in helping HVAC, electrical and plumbing contractors, then consider calling iMarket Solutions.

Internet marketing for electrical, plumbing, heating and air conditioning contractors is what we do.

Our SEO experts will build and market a website that can dominate the plumbing, heating and air conditioning, and electrical SERPs in your region – and all with a goal CPL (cost per lead) that will make your competitors weep. Contact us today for pricing.

Google Places has been replaced with Google+ Local! What does this mean for local business owners?

Wednesday, May 30th, 2012 by Nadia Romeo

There have been a lot of changes at Google lately. The recent search algorithm updates have definitely changed how people searching on the web see results and find exactly what they’re looking for; this latest Google announcement is no different.

Earlier today, Google announced the merging of Google Places into a new aspect of Google+ called Google+ Local. Over 80 million Google Places pages have already been automatically converted to the new Google+ Local format, and more will follow.

The new format is much more user-friendly and will help your business be represented in a more powerful way. Of course, as with any change, you probably have some questions.

What does this Google+ Local announcement mean for your business?

The new Google+ Local listings are a way for you to begin to manage your local business listing with Google in a more meaningful way. Many industry experts believe that this will help solve some of the issues with listings needing to be reclaimed or corrected so often due to inaccurate information.

Is it going to be more work for you?

Not at all. If you are an iMarket Solutions client with ongoing SEO services, we will continue to manage your Google+ Local page just as we have with your Google Places listing. If you do not have ongoing SEO or are not an iMarket Solutions client, contact us to learn more about how we can help!

If you like to be hands-on with your local presence on Google, you can continue to manage your Google+ Local page through your normal Google Places login. Google does recommend that you create a Google+ Business page as they will soon release a way to connect the Business page with the Google+ Local listing. iMarket Solutions will be working on this process for our SEO clients.

Will I lose my reviews?

Your reviews will be migrated over to your Google+ Local page. The reviews will be attributed to “A Google User” until the owners of the reviews verify their old reviews can be attributed to their identity on Google+ Local. The good news is that Google will ask users to do this now that Google+ Local is rolled out.

Google will now be incorporating the 30-point review system created by ZAGAT (which Google acquired in September 2011). Reviews will be moving away from the star system to this new scoring system.

Will I lose my photos?

The photos that were part of your Google Places listing will remain part of your Google+ Local page. If you have user-uploaded photos, they will be migrated the same way as reviews.

Does this change how potential customers find my business in Google?

Your potential customers will be able to search for your local business as normal, and their experience will remain consistent whether they are searching on Google, Google+, Google Maps, or through mobile apps.

Other changes to expect…

  • You will be able to develop followers and interact with them through posts and messages on your Google+ Local page.
  • The new Google+ Local page is more visually interesting with photos and reviews given a more prominent location.
  • A new “Local” tab has been added for Google+ users making it easier to find local businesses like yours.
  • Google+ Local pages will be integrated throughout all Google search properties.
  • Users will have a more personalized experience with the integration of Google+ Circles to help highlight businesses that their friends and family have recommended. This means if you have a customer who has recommended you, their online connections will be more likely to see your business in a search for services you provide!

Overall, the migration of Google Places to Google+ Local appears to be a positive change for local business owners like you. Please do contact us with any questions about Google+ Local, or your overall SEO strategy for your website. We are happy to help!

For more information on the Google Places change to Google+ Local, we recommend Search Engine Land as a fantastic source of up-to-the-minute information!

Serve your customers off site? Time to hide your address in Google Places!

Wednesday, March 28th, 2012 by Nadia Romeo

Google has stated that businesses that serve their clients off-site, at a client’s home or “on the road”, should hide the address in their Google Places listing. As an HVAC, plumbing, or other on-location service provider, your business falls into this category. This is definitely a big adjustment, but one that is necessary for your optimum success in Google Places and in local search overall.

Show your address or hide your address?

Until the last few months, it was always key to have an address showing on your Google Places listing; a visible address was one way to optimize for searches in your area. However, we have noticed changes in ranking factors, and those changes have now been confirmed by Google’s latest update to the Google Places Guidelines, as well as by prominent industry sources.

Google’s newly updated guidelines state, “If you don’t receive customers at your location, you must select the ‘Do not show my business address on my Maps listing’ option within your dashboard. If you don’t hide your address, your listing may be removed from Google Maps.”

While we know that you are used to seeing your address on your Google Maps / Places listing, we are sure you don’t want to risk your listing being removed! If you are an iMarket customer, we have already begun to make this change for you.

What if my business has a showroom where clients do come to our location for service and purchases?

If you have a showroom, we can leave your address visible; however, this may be something you want to change in the future. We believe Google is making this change across all verticals known for primarily serving customers at the customer’s home or office rather than at a brick-and-mortar address.

We are testing to determine the best course of action for clients who have a showroom. Let me know if you have one and we can discuss your listing.

What if I have (or wanted to have) multiple Google Places listings for different addresses?

We are no longer recommending that you have multiple listings. On a single Google Places listing, you can specify your location and any additional cities or areas you serve. We have two options for this:

  • We can list all towns you service one by one.
  • Or we can select a certain radius (in miles) surrounding your location.

Both options will enable you to come up for searches in the areas you service without the need for a separate Google Places listing in each city or region.

If you have multiple Google Places listings, we recommend deleting the extra ones before Google removes them as spam. Remember, Google may remove the one you wanted to keep and leave up the others, so it’s best that we choose which ones to remove and optimize the best one.

What can I do to help my Google Places page rank better?

You can add photos and videos to your Places listing as a start. If you’re an iMarket Solutions customer, you can send your photos and videos to us and we’ll take care of it for you.

Another post is coming soon with even more information about local ranking factors – stay tuned!

We realize this is a lot of new information, so please contact us with any questions you have! iMarket Solutions is 100% committed to you having the most successful online presence possible, and everyone on our team is happy to help you achieve your goals!

Bing, Facebook, and Google +1: Who’s Going to Dominate Social Search, and What It All Might Mean for HVAC, Plumbing, and Electrical Service Businesses

Wednesday, July 13th, 2011 by Nadia Romeo

As we discussed in some previous blog posts, Bing is making a bid to challenge Google’s dominance in the search world. We’ve already looked at Bing’s deal with Blackberry, which may give Bing a lot more traction in the mobile search market. Now we’re going to look at Bing’s deal with Facebook, which adds an exciting “social” aspect to Bing’s search results – and Google’s attempt to strike back.

Here’s how it works. Bing search results will now include “Likes” from Facebook. If you’re logged into Facebook when you make a search – say, to find a local plumber – you’ll get a list of plumbers’ websites, and if any of your Facebook Friends have “Liked” any of the websites in the list, you’ll see a little thumbs-up “Like” icon and the name of your friend(s) who “Like” the site.

Of course, when your Facebook Friends make searches in Bing, they in turn will be able to see which websites you have “Liked”. To make it as easy as possible for everyone to contribute their opinion, there’s even a Bing toolbar that lets you “Like” a website right from your browser controls, so you don’t have to hunt on the website for the little “Like” icon.

It’s certainly handy to be able to see at a glance if one or more of your Friends “Likes” a website – especially when you’re trying to evaluate something you don’t know much about. Many people rely primarily on word-of-mouth to choose heating, plumbing, and electrical contractors, and we think consumers searching for service businesses will quickly come to use “Likes” to choose one contractor over another.

But the Bing/Facebook integration goes deeper than that. Bing will actually use your Friends’ “Likes” to help calculate the search engine results it presents to you – that is, the sites that your Friends “Like” will get a higher spot in the listings. And because each person has a different group of Facebook Friends with different “Likes”, Bing will present a different set of search results to each person – a truly personalized search.
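
Conceptually, that personalization can be pictured as a small boost applied to friend-liked sites before results are sorted. The following TypeScript sketch is a toy illustration under our own assumptions (a flat boost and simple relevance scores); Bing’s actual ranking formula is not public.

    // A toy model of personalized re-ranking: sites a friend has "Liked"
    // get a score boost before sorting. The boost value and data shapes
    // are assumptions for demonstration only.
    interface SearchResult {
      url: string;
      score: number; // relevance score from the ordinary algorithm
    }

    function personalize(results: SearchResult[], friendLikes: Set<string>): SearchResult[] {
      const FRIEND_BOOST = 0.25; // assumed flat boost per friend-liked site
      return [...results]
        .map((r) => ({
          ...r,
          score: friendLikes.has(r.url) ? r.score + FRIEND_BOOST : r.score,
        }))
        .sort((a, b) => b.score - a.score);
    }

    // Example: a friend has "Liked" the second plumber's site, so it ranks first.
    const ranked = personalize(
      [
        { url: "plumberA.example.com", score: 0.9 },
        { url: "plumberB.example.com", score: 0.8 },
      ],
      new Set(["plumberB.example.com"]),
    );
    console.log(ranked.map((r) => r.url));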

For businesses, this means it will be increasingly important to encourage happy customers to register their satisfaction online. There are some great ways to do this, and we’ll revisit the topic in future blog posts.

Why Do People Use a Smartphone to Call a Plumbing, Electrical, or HVAC Company? Does Your Website Give Smartphone Users What They Need?

Wednesday, June 15th, 2011 by Nadia Romeo

For the last few weeks, we’ve been talking about the increase in smartphone use and the decrease in Yellow Pages use. As we have already observed in previous blogs, that adds up to two very important conclusions: 1) more and more of your potential customers will be looking for you on the web; and 2) many of those potential customers will be using a smartphone to do it.

Your website needs to be smartphone-friendly, that’s for sure.

But what exactly does “smartphone-friendly” mean?

Well, let’s start with the basics: your website must be compatible with Safari, the most common smartphone web browser.

Also, you need to make sure that there’s no Flash on your website. Flash can be cool, but iPhones (arguably cooler) just don’t get along with it. There are work-arounds that let people view Flash on iPhones, sure, but you have to have some technical interest and skill to implement them. Chances are your customers won’t be interested in doing extra work to see your website; they’ll call your competitor instead. Leave the Flash for your teenager to play with on YouTube.

But…even if your website is compatible with Safari and is Flash-free, it might not be truly smartphone-friendly. If you’ve ever used a smartphone to surf the web, you know that on some websites it can be really hard to get to the information you want. Often, navigation points like tabs, links, or buttons are so small on a smartphone screen that it can be difficult even to know what a website offers, much less get there. If there are a lot of navigation points spreading horizontally across the screen, you have to scroll and scroll and scroll to see them all. If “Contact Us” is all the way on the right of all that navigation, will your customers have the patience to travel all the way over to it? And, if the “Contact Us” button is really small, will they be able to press it comfortably or will they hit another button by accident? You get the idea.

To make sure that all our clients’ websites work optimally on smartphones, we’ve created a “quick action” mobile template and install it on every website we build. The experience is seamless for the user: if someone visits one of our clients’ websites using a smartphone, he or she is immediately taken to the mobile version of the website. (Special code built into the site detects the kind of device the visitor is using; a sketch of the general technique follows below.)
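
For the technically curious, here is a minimal TypeScript sketch of that kind of device detection: test the browser’s user-agent string and redirect smartphone visitors to the mobile version of the site. The device patterns and the m.example.com URL are illustrative assumptions, not our production code.

    // A minimal sketch of user-agent-based device detection with a redirect.
    // The device patterns and mobile URL below are illustrative assumptions,
    // not the actual template code.
    const MOBILE_PATTERN = /iPhone|iPod|Android|BlackBerry|Windows Phone/i;

    function isMobileDevice(userAgent: string): boolean {
      return MOBILE_PATTERN.test(userAgent);
    }

    // Runs in the browser: send smartphone visitors to the mobile version
    // of the page they asked for.
    if (isMobileDevice(navigator.userAgent)) {
      window.location.replace("http://m.example.com" + window.location.pathname);
    }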

To design our mobile website template, we asked ourselves two questions:

  1. When and why are people using their smartphones to call service companies?
  2. What information will satisfy their needs?

Based on our own experience and what our clients have told us about their customers, people don’t use a smartphone to look for extensive information about a service company. They may enjoy your blog, but they’re not going to use their smartphone to read it. When they go to your website via smartphone, they probably have an emergency or already know what they want to do (book an annual inspection, buy a service contract, etc.). They want what we call “quick action” information: When are you open? Do you have 24-hour service? Do you fix toilets, or do you only do HVAC equipment? How can they email or call you right now?

Our mobile templates put all that primary information out there, literally at users’ fingertips. Then, to make sure that everyone can see and access it easily, we make sure that the logo is big enough to see, the font is big enough to read, and the links and phone numbers are big enough for everyone to click on, even if they have great big clumsy fingers (like our CTO’s).

Of course, our mobile websites also contain a link to the client’s main website in case someone does want to research something more closely or read the blog – because after all, what is more exciting than a blog?

On that note of shameless self-promotion, iMarket’s blogger will sign off for this week. Next week, we’ll look at the other big new market that Bing is trying to break into – social search.

