
Is Your FileMaker Website Safe from an SEO Penalty?


With all of the buzz surrounding Google’s algorithm updates and the huge number of sites getting penalized, many webmasters, SEOs, and marketers are concerned that their website could be next on Google’s hit list.

Yes, deciphering which SEO strategies are worth implementing and which should be avoided with a 10-foot pole can sometimes be challenging. But if you’re performing SEO correctly, abiding by Google’s Webmaster Guidelines and using only honest white-hat strategies, there’s no need to worry about any future Google update.

According to Moz’s algorithm update page, there have been 67 updates since 2012. With an average of 1% of sites affected per update, roughly 8 million websites have experienced serious ranking drops, fluctuations, or improvements in the past few years (based on the fact that there are roughly 800 million websites on the web).


While some people are dreading the next Google update, personally I am waiting in anticipation, hoping it happens as soon as possible.

Why? Because I’m confident that the strategies I use are solid, Google-approved, and future-proofed strategies that have historically improved website rankings with every new update.

Below are 10 Do’s & Don’ts I recommend to ensure your website is safe from an SEO penalty.

The Do’s

#1: DO make sure your website sticks to Google’s Webmaster Guidelines

This is an obvious one that I shouldn’t even have to mention.

But the point must be made. If you want to rank well in Google, I advise you to listen to their rules. It’s that simple.

Sometimes their guidelines are a bit ambiguous and hard to translate into strategy. But the next time you strategize, or before you implement a technique, run it through Google’s Webmaster Guidelines and make sure Google can’t find any fault with it. Assume Google is the referee: if you can’t do it in plain sight in front of Google, consider it cheating and not worth doing.

#2: DO make sure everything is clean in your webmaster toolsets


This is another seemingly obvious point, but again, if you want to rank well on Google’s search engine, listen to them.

If they provide you with HTML improvements, or tell you about crawl errors, or recommend submitting your XML sitemap to them, do it. We are fortunate enough that Google (and Bing) provide us with tools to help us view our sites the way they do. Any red flags, errors or alarms you see within their toolsets should be fixed ASAP.

(You can access these webmaster tools here: Google Webmaster Tools / Bing Webmaster Tools.)
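If you haven’t submitted an XML sitemap yet, it’s a small file to produce. A minimal one looks something like this (the domain, dates, and paths below are placeholders, not real URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
    <lastmod>2014-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Once the file is uploaded to your site’s root, you can submit its URL through either webmaster toolset.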

#3: DO make sure all of your pages provide value to your users

Google’s #1 goal is to deliver websites that keep searchers trusting its results through the thousands of searches they’ll perform in the years to come.

The goal of your website should be to impress your users first, then sell them on whatever you’re trying to sell.

Think about what your users really want when they search your target keywords, and give it to them. If they have questions, answer them. If a video would make them happy, make one. If pictures are what they need to be sure about your products, take some. If users want a good looking website that’s easy to navigate, give it to them.

Always think user first.

And don’t let your visitors leave your website thinking about visiting another. This can go a long way, and its impact extends well beyond the search results.

#4: DO seek to improve analytics data


Believe it or not, how a user behaves before, during, and after visiting your website plays a role in how you look in Google’s eyes.

How many websites a user visits before landing on yours, as well as how many they visit afterward, can help Google determine whether your content is better (or worse) than the others and whether it was enough to satisfy their needs.

A high percentage of bounces, only a page or two viewed, or just a few seconds spent on your website all tell stories about your website’s value. Keeping users on your website longer and getting them to visit more pages is extremely important for website growth.

When brainstorming about improving on-page SEO, don’t put all of your effort into keyword placement. Remember to invest in ways to keep users on your website longer and make them want to return. Always aim to improve your bounce rate, time on site, pages per visit, and return visits.
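To make those metrics concrete, here is a small sketch (in Python, with made-up session data) of how bounce rate and pages per visit can be computed. Analytics packages calculate these for you, so treat this only as a way to understand what the numbers mean:

```python
# Illustrative sketch: a "session" here is just the list of pages one
# visitor viewed. The page paths below are invented for this example.

def bounce_rate(sessions):
    """A 'bounce' is a session with exactly one page view."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return bounces / len(sessions)

def avg_pages_per_session(sessions):
    """Average number of pages viewed per visit."""
    if not sessions:
        return 0.0
    return sum(len(pages) for pages in sessions) / len(sessions)

# Example: three visits; one visitor left after a single page.
sessions = [
    ["/home", "/pricing", "/contact"],
    ["/blog/seo-tips"],
    ["/home", "/features"],
]
print(round(bounce_rate(sessions), 2))   # one of three sessions bounced
print(avg_pages_per_session(sessions))   # six page views over three visits
```

Lowering the first number and raising the second, by making pages genuinely worth staying on, is the goal.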

#5: DO recognize what past updates had in common

When you look at past algorithm update announcements (or tweets), they tend to fall into only a few buckets (Panda, Local Packs, Penguin, Hummingbird). The updates are usually ‘updates’ to an already established algorithm, such as Panda targeting weak content or Penguin targeting spammy links. It’s been made clear what each of these algorithms is targeting.

Over the years, Google has had to make numerous updates to its algorithm, whether because a variable in its giant formula needed changing or because of the grey/black-hat SEOs who keep trying to find loopholes and ways to continue being naughty.

Whatever the case may be, the updates are just updates.

If you raised your link-building bar, stopped getting spammy links, and have been putting effort into getting rid of questionable links, you can most likely rest assured that future Penguin updates will actually be good for you. If you’ve been focusing on the above DO’s and making your website awesome, you are probably safe from a giant panda attack.

Only those still doing questionable things can’t be certain their site will remain on top when the next update comes. It all boils down to DO #1: are you sticking to the guidelines, and would you do it if a Google rep were looking right over your shoulder? Or are you doing something Google has already said it doesn’t approve of? Like the saying goes: when in doubt, rule it out.

#6: DO aim to really be the best in your niche

This really should be the #1 DO.

After all, you use Google because it does the best job of delivering results. There was a time when the top 10 results weren’t the top 10 websites for that term. But with millions and millions of dollars, years of experience, and several algorithm updates behind it, we can pretty much trust Google to provide a legitimate list of the top 10 sites on the first page of each search result.

Gone are the days when you could manipulate Google’s results.

Google is now sophisticated enough that you might as well try to be the very best site in your niche. If they don’t reward you today, they will with future updates. Just keep working really hard at being the best for what you do.

Also do this: look at the top 10 ranking sites for any given term and study why they are there. Look at their layouts, site offerings, what they provide users, trust indicators, testimonials, pictures, pricing, etc., and have it set in your head that you are going to be better.

To be the best, you have to try your best. Get to the point where you honestly feel you have the BEST site for your users. Not just the best company, services, or products; those are all very important, but if they don’t show in your website, how are users and Google supposed to know?

The Don’ts

#8: DON’T make pages purposefully to deceive bots

For example, a common thing I’ve seen clients and competitors do is create a page for every city and neighborhood in hopes of ranking all over the country or region, yet all with similar content.

If you really want to target every city, give each page its own unique content relevant to that city. Sometimes it’s just a matter of using the right LSI terms (e.g., “the Windy City,” Soldier Field, Sears Tower, or Lake Michigan if targeting Chicago).

But if you are just creating these pages and saying those terms for SEO purposes, there’s no guarantee those pages will be future update proof.

The good news is that there are honest ways to target each city, but they require serious investment in the people of each of those cities: time, social media, other forms of marketing, local knowledge, etc. I’m not advising against creating hundreds of city pages. I just don’t recommend it without putting serious thought into what it will really take to make the campaign honest and not deceitful.

Another common mistake some websites make is having the same three line description for hundreds of products that are all completely different. Again, it may take some time and investment to write a unique description for each product. But in the end, it’s totally worth it.
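One quick way to audit your own pages for this problem is a simple word-overlap (Jaccard) check, sketched below. The threshold and product blurbs are made up for illustration, and real duplicate-content detection is far more sophisticated; this only flags obvious near-copies:

```python
# Illustrative sketch: flag product descriptions that are suspiciously
# similar. The 0.7 threshold and sample blurbs are invented for this demo.

def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two strings (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa and not wb:
        return 1.0
    return len(wa & wb) / len(wa | wb)

def find_near_duplicates(descriptions, threshold=0.7):
    """Return index pairs of descriptions whose similarity exceeds threshold."""
    pairs = []
    for i in range(len(descriptions)):
        for j in range(i + 1, len(descriptions)):
            if jaccard(descriptions[i], descriptions[j]) >= threshold:
                pairs.append((i, j))
    return pairs

descriptions = [
    "Durable blue widget with a two year warranty",
    "Durable red widget with a two year warranty",
    "Handcrafted oak bookshelf, assembled in the USA",
]
print(find_near_duplicates(descriptions))  # the two widget blurbs match
```

Any pair this flags is a good candidate for a rewrite into genuinely distinct copy.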

#9: DON’T over optimize your website

Always avoid “over-optimizing” your website.

Don’t put extra effort into meeting exact keyword densities or making sure you say a specific keyword phrase in the best places of your web pages. Yes, placing relevant phrases in the right places may help. But only if you are doing it naturally.

Are you saying your target keyword in the heading because you need it there for SEO or because it’s the most appropriate way to say what you do on the most relevant page? There’s a fine line between keyword placement optimization and end-user optimization. Sometimes there isn’t a difference, but sometimes there’s a huge one.
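If you’re worried a page has drifted into accidental keyword stuffing, a rough density check is easy to script. This is a hypothetical sanity check with made-up sample copy, not a number to optimize toward:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of the page's words occupied by occurrences of `phrase`.
    A crude measure; useful only for spotting obvious over-repetition."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits / len(words)

# Invented sample copy for a hypothetical hosting page.
page = ("FileMaker hosting made simple. Our FileMaker hosting plans "
        "include backups, SSL, and support for every FileMaker version.")
print(f"{keyword_density(page, 'filemaker hosting'):.1%}")
```

If a phrase dominates the word count, the page was probably written for a crawler rather than a reader.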

#10: A DO and a DON’T.

DO try to get social signals from real users

Creating content that people like, share, tweet, and send is a good indicator to Google and the other engines that your site and content are user-liked and trustworthy.

DON’T buy social signals!

Never buy followers, likes, fans, video views, etc!

Those are the type of things future algorithm updates will be looking for.

Buying followers and fans is an obvious red flag telling the search engines, “Hey, I know my site isn’t that interesting, so I’ll just buy interest”. Instead of putting money and time into buying signals, put effort into creating things that the social world will love.

It’s expected that future algorithms will weigh referral traffic and repeat visits when trying to determine the value of your followers and content, which makes this something you want to get right now so you can benefit later.

How safe is your site?

After reviewing this list, how safe is your site? Are you following Google’s pre-determined rules, or are you using gray-hat techniques that are questionable but not yet detrimental?

Be sure to follow these 10 steps to future-proof your website against upcoming algorithm updates. It’s fairly straightforward and not as difficult as most people think. Instead of trying to find ways to game the search engines to rank higher in SERPs, invest your efforts into being a better site for your users, and like me, you’ll be itching for the next update to push down any sites using questionable tactics so yours can rise to the top.

Author James Harrison is a seasoned search engine marketing professional specializing in SEO, local search, and social media. He has extensive experience working with a wide variety of brands, from major companies such as Mitsubishi Electronics, Oakley, and Rubio’s to small local businesses including hotels, lawyers, and carpet cleaners. His internet marketing company, Better Web Marketing, is dedicated to providing superior services for both clients and their users.