What is SEO?
Search Engine Optimisation in 2017 is a technical, analytical and creative process to improve the visibility of a website in search engines. Its primary function is to drive more visits to a site that convert into sales.
The free SEO tips you will read on this page will help you create a successful SEO friendly website yourself.
This article is a beginner’s guide to effective white hat SEO.
I deliberately steer clear of techniques that might be ‘grey hat’, as what is grey today is often ‘black hat’ tomorrow, as far as Google is concerned.
No one-page guide can explore this complex topic in full. What you’ll read here are answers to questions I had when I was starting out in this field.
Google insists webmasters adhere to their ‘rules’ and aims to reward sites with high-quality content and remarkable ‘white hat’ web marketing techniques with high rankings.
Conversely, it also needs to penalise websites that manage to rank in Google by breaking these rules.
These rules are not ‘laws’ but ‘guidelines’ for ranking in Google, laid down by Google. You should note, however, that some methods of ranking in Google are, in fact, illegal. Hacking, for instance, is illegal in the UK and the US.
You can choose to follow and abide by these rules, bend them or ignore them – all with different levels of success (and levels of retribution, from Google’s web spam team).
White hats do it by the ‘rules’; black hats ignore the ‘rules’.
What you read in this article is perfectly within the laws and also within the guidelines and will help you increase the traffic to your website through organic, or natural search engine results pages (SERPs).
There are a lot of definitions of SEO (spelled ‘search engine optimisation’ in the UK, Australia and New Zealand, or ‘search engine optimization’ in the United States and Canada) but organic SEO in 2017 is mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK):
The art of web SEO lies in understanding how people search for things and understanding what type of results Google wants to (or will) display to its users. It’s about putting a lot of things together to look for opportunity.
A good optimiser has an understanding of how search engines like Google generate their natural SERPs to satisfy users’ navigational, informational and transactional keyword queries.
A good search engine marketer has a good understanding of the short term and long term risks involved in optimising rankings in search engines, and an understanding of the type of content and sites Google (especially) WANTS to return in its natural SERPs.
The aim of any campaign is more visibility in search engines and this would be a simple process if it were not for the many pitfalls.
There are rules to be followed or ignored, risks to take, gains to make, and battles to be won or lost.
A Mountain View spokesman once called the search engine ‘kingmakers’, and that’s no lie.
Ranking high in Google is VERY VALUABLE – it’s effectively ‘free advertising’ on the best advertising space in the world.
Traffic from Google natural listings is STILL the most valuable organic traffic to a website in the world, and it can make or break an online business.
The state of play, in 2017, is that you can STILL generate highly targeted leads, for FREE, just by improving your website and optimising your content to be as relevant as possible for a buyer looking for your company, product or service.
As you can imagine, there’s a LOT of competition now for that free traffic – even from Google (!) in some niches.
You shouldn’t compete with Google. You should focus on competing with your competitors.
The process can be practised, successfully, in a bedroom or a workplace, but it has traditionally always involved mastering many skills as they arose, including diverse marketing technologies such as:
- Website design
- User experience
- Website development
- PHP, HTML, CSS, etc.
- Server management
- Domain management
- Backlink analysis
- Keyword research
- Social media promotion
- Software development
- Analytics and data analysis
- Information architecture
- Log Analysis
- Looking at Google for hours on end
It takes a lot, in 2017, to rank a page on merit in Google in competitive niches.
The big stick Google is hitting every webmaster with (at the moment, and for the foreseeable future) is the ‘QUALITY USER EXPERIENCE‘ stick.
If you expect to rank in Google in 2017, you’d better have a quality offering, not based entirely on manipulation, or old school tactics.
Is a visit to your site a good user experience?
If not – beware manual ‘Quality Raters’ and beware the Google Panda/Site Quality algorithms that are looking out for a poor user experience.
Google raising the ‘quality bar’, year on year, ensures a higher level of quality in online marketing in general (above the very low quality we’ve seen in recent years).
Success online involves investment in higher-quality on-page content, website architecture, usability, conversion optimisation, and promotion.
If you don’t take that route, you’ll find yourself chased down by Google’s algorithms at some point in the coming year.
This ‘what is SEO’ guide (and this entire website) is not about ‘churn and burn’ Google SEO (classed as webspam by Google), as that is too risky to deploy on a real business website in 2017.
What Is A Successful Strategy?
Get relevant. Get trusted. Get Popular.
It is no longer just about manipulation in 2017.
It’s about adding quality and often useful content to your website that together meet a PURPOSE that delivers USER SATISFACTION.
If you are serious about getting more free traffic from search engines, get ready to invest time and effort in your website and online marketing.
Google wants to rank QUALITY documents in its results, and force those who wish to rank high to invest in higher-quality content or great service that attracts editorial links from reputable websites.
If you’re willing to add a lot of great content to your website, and create buzz about your company, Google will rank you high.
If you try to manipulate Google, it will penalise you for a period, and often until you fix the offending issue – which we know can LAST YEARS.
Backlinks in general, for instance, are STILL weighed FAR too positively by Google and they are manipulated to drive a site to the top positions – for a while. That’s why blackhats do it – and they have the business model to do it. It’s the easiest way to rank a site, still today.
If you are a real business that intends to build a brand online – you can’t use black hat methods. Full stop.
Fixing the problems will not necessarily bring organic traffic back as it was before a penalty.
Recovery from a Google penalty is a ‘new growth’ process as much as it is a ‘clean-up’ process.
Google Rankings Are In Constant Ever-Flux
It’s Google’s job to MAKE MANIPULATING SERPs HARD.
So – the people behind the algorithms keep ‘moving the goalposts’, modifying the ‘rules’ and raising ‘quality standards’ for pages that compete for top ten rankings.
In 2017 – we have ever-flux in the SERPs – and that seems to suit Google and keep everybody guessing.
Google is very secretive about its ‘secret sauce’ and offers sometimes helpful and sometimes vague advice – and some say offers misdirection – about how to get more valuable traffic from Google.
Google is on record as saying the engine is intent on ‘frustrating’ search engine optimisers’ attempts to improve the amount of high-quality traffic to a website – at least (but not limited to) when they use low-quality strategies classed as web spam.
At its core, Google search engine optimisation is still about KEYWORDS and LINKS. It’s about RELEVANCE, REPUTATION and TRUST. It is about QUALITY OF CONTENT & VISITOR SATISFACTION.
A Good USER EXPERIENCE is a key to winning – and keeping – the highest rankings in many verticals.
Relevance, Authority & Trust
Web page optimisation is about making a web page relevant and trusted enough to rank for a query.
It’s about ranking for valuable keywords for the long term, on merit. You can play by the ‘white hat’ rules laid down by Google, and aim to build this Authority and Trust naturally, over time, or you can choose to ignore the rules and go full-time ‘black hat’.
MOST SEO tactics still work, for some time, on some level, depending on who’s doing them, and how the campaign is deployed.
Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, then it will class you as a web spammer, and your site will be penalised (you will not rank high for relevant keywords).
These penalties can last years if not addressed, as some penalties expire and some do not – and Google wants you to clean up any violations.
Google does not want you to be able to easily modify where you rank. Critics would say Google would prefer you paid them to do that, using Google AdWords.
The problem for Google is – ranking high in Google organic listings is a real social proof for a business, a way to avoid PPC costs and still, simply, the BEST WAY to drive VALUABLE traffic to a site.
It’s FREE, too, once you’ve met the always-increasing criteria it takes to rank top.
‘User Experience’ Matters
Is User Experience A Ranking Factor?
User experience is mentioned 16 times in the main content of the quality raters’ guidelines (official PDF), but we have been told by Google it is not, per se, a classifiable ‘ranking factor‘ on desktop search, at least.
On mobile, sure, since UX is the base of the mobile friendly update. On desktop currently no. (Gary Illyes: Google, May 2015)
While UX, we are told, is not literally a ‘ranking factor’, it is useful to understand exactly what Google calls a ‘poor user experience’ because if any poor UX signals are identified on your website, that is not going to be a healthy thing for your rankings anytime soon.
Matt Cutts’ consistent SEO advice was to focus on a satisfying user experience.
What is Bad UX?
For Google – rating UX, at least from a quality rater’s perspective, revolves around marking the page down for:
- Misleading or potentially deceptive design
- Sneaky redirects (cloaked affiliate links)
- Malicious downloads
- Spammy user-generated content (unmoderated comments and posts)
- Low-quality MC (main content of the page)
- Low-quality SC (supplementary content)
What is SC (supplementary content)?
When it comes to a web page and positive UX, Google talks a lot about the functionality and utility of Helpful Supplementary Content – e.g. helpful navigation links for users (that are not, generally, MC or Ads).
Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.
To summarize, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. We have different standards for small websites which exist to serve their communities versus large websites with a large volume of webpages and content. For some types of “webpages,” such as PDFs and JPEG files, we expect no SC at all.
It is worth remembering that Good SC cannot save Poor MC (“Main Content is any part of the page that directly helps the page achieve its purpose“.) from a low-quality rating.
Good SC seems to certainly be a sensible option. It always has been.
Key Points about SC
- Supplementary Content can be a large part of what makes a High-quality page very satisfying for its purpose.
- Helpful SC is content that is specifically targeted to the content and purpose of the page.
- Smaller websites such as websites for local businesses and community organizations, or personal websites and blogs, may need less SC for their purpose.
- A page can still receive a High or even Highest rating with no SC at all.
Here are the specific quotes containing the term SC:
- Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.
- SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.
- SC which contributes to a satisfying user experience on the page and website. – (A mark of a high-quality site – this statement was repeated 5 times)
- However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.
- However, some pages are deliberately designed to shift the user’s attention from the MC to the Ads, monetized links, or SC. In these cases, the MC becomes difficult to read or use, resulting in a poor user experience. These pages should be rated Low.
- Misleading or potentially deceptive design makes it hard to tell that there’s no answer, making this page a poor user experience.
- Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
- However, you may encounter pages with a large amount of spammed forum discussions or spammed user comments. We’ll consider a comment or forum discussion to be “spammed” if someone posts unrelated comments which are not intended to help other users, but rather to advertise a product or create a link to a website. Frequently these comments are posted by a “bot” rather than a real person. Spammed comments are easy to recognize. They may include Ads, download, or other links, or sometimes just short strings of text unrelated to the topic, such as “Good,” “Hello,” “I’m new here,” “How are you today,” etc. Webmasters should find and remove this content because it is a bad user experience.
- The modifications make it very difficult to read and are a poor user experience. (Lowest quality MC (copied content with little or no time, effort, expertise, manual curation, or added value for users))
- Sometimes, the MC of a landing page is helpful for the query, but the page happens to display porn ads or porn links outside the MC, which can be very distracting and potentially provide a poor user experience.
- The query and the helpfulness of the MC have to be balanced with the user experience of the page.
- Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.
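On the ‘sneaky redirects’ point in the quotes above – you can inspect exactly where a URL sends users before raters or algorithms do. Here is a minimal sketch using only Python’s standard library that traces a redirect chain; the URL in the usage note is hypothetical:

```python
import urllib.request


class ChainRecorder(urllib.request.HTTPRedirectHandler):
    """Redirect handler that records each hop in the redirect chain."""

    def __init__(self):
        self.hops = []  # list of (status_code, target_url) tuples

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        self.hops.append((code, newurl))
        return super().redirect_request(req, fp, code, msg, headers, newurl)


def trace_redirects(url):
    """Follow a URL and return (hops, final_url)."""
    recorder = ChainRecorder()
    opener = urllib.request.build_opener(recorder)
    with opener.open(url) as response:
        final_url = response.geturl()
    return recorder.hops, final_url
```

Calling, say, `trace_redirects("http://example.com/old-page")` lists every 301/302 hop before the final URL. A chain that ends somewhere the user never asked to go is exactly the kind of thing the raters’ document tells evaluators to rate Lowest.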
In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters, who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although, on certain levels, what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way as Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.
Google is still, evidently, more interested in rating the main content of the webpage in question and the reputation of the domain the page is on – relative to your site, and competing pages on other domains.
A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria, e.g. a lack of reputation, or old-school SEO stuff like keyword stuffing a site.
If you are improving user experience by focusing primarily on the quality of the MC of your pages, and avoiding – even removing – old-school SEO techniques, those are certainly positive steps towards getting more traffic from Google in 2017 – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.
Balancing Conversions With Usability & User Satisfaction
Take pop-up windows or pop-unders as an example:
According to usability expert Jakob Nielsen, 95% of website visitors hated unexpected or unwanted pop-up windows, especially those that contain unsolicited advertising.
In fact, Pop-Ups have been consistently voted the Number 1 Most Hated Advertising Technique since they first appeared many years ago.
Accessibility advocates will also agree:
- Creating a new browser window should be the authority of the user.
- Pop-up new windows should not clutter the user’s screen.
- All links should open in the same window by default. (An exception may be made for pages containing a links list. It is convenient in such cases to open links in another window, so that the user can come back to the links page easily. Even then, it is advisable to give the user prior notice that links will open in a new window.)
- Tell visitors they are about to invoke a pop-up window (using the link’s title attribute).
- Pop-up windows do not work in all browsers.
- They are disorienting for users.
- Provide the user with an alternative.
It is inconvenient for usability aficionados to hear that pop-ups can be used successfully to vastly increase subscription sign-ups.
EXAMPLE: A TEST Using A Pop-Up Window
Pop ups suck, everybody seems to agree. Here’s the little test I carried out on a subset of pages, an experiment to see if pop-ups work on this site to convert more visitors to subscribers.
I tested it out when I didn’t blog for a few months and traffic was very stable.
Testing Pop Up Windows Results
[Table: total % change in email subscribers – Week 1 with the pop-up window on vs Week 2 with it off]
That’s a fair increase in email subscribers across the board in this small experiment on this site. Using a pop up does seem to have an immediate impact.
I have since tested it on and off for a few months and the results from the small test above have been repeated over and over.
I’ve tested different layouts and different calls to actions without pop-ups, and they work too, to some degree, but they typically take a bit longer to deploy than activating a plugin.
I don’t really like pop-ups as they have been an impediment to web accessibility but it’s stupid to dismiss out-of-hand any technique that works. I’ve also not found a client who, if they had that kind of result, would choose accessibility over sign-ups.
I don’t really use the pop up on days I post on the blog, as in other tests, it really seemed to kill how many people share a post in social media circles.
With Google now showing an interest in interstitials, I would be very nervous of employing a pop-up window that obscures the primary reason for visiting the page. If Google detects dissatisfaction, I think this would be very bad news for your rankings.
I am, at the moment, using an exit-intent pop-up window, as hopefully, by the time a user sees this device, they are FIRST satisfied with the content they came to read. I can recommend this as a way to increase your subscribers, at the moment, with a similar conversion rate to pop-ups – if NOT BETTER.
I think, as an optimiser, it is sensible to convert customers without using techniques that potentially negatively impact Google rankings.
Do NOT let conversion get in the way of the PRIMARY reason a visitor is CURRENTLY on ANY PARTICULAR PAGE, or you risk Google detecting relative dissatisfaction with your site, and that is not going to help you as Google’s RankBrain gets better at working out what ‘quality’ actually means.
Google Wants To Rank High-Quality Websites
Google has a history of classifying your site as some type of entity, and whatever that is, you don’t want a low-quality label on it – whether put there by algorithm or human. Manual evaluators might not directly impact your rankings, but any signal associated with Google marking your site as low-quality should probably be avoided.
If you are making websites to rank in Google without unnatural practices, you are going to have to meet Google’s expectations in the Quality Raters Guidelines (PDF).
Low-quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well.
There is ‘sufficient reason’ in some cases to immediately mark the page down on some areas, and Google directs quality raters to do so:
- An unsatisfying amount of MC is a sufficient reason to give a page a Low-quality rating.
- Low-quality MC is a sufficient reason to give a page a Low-quality rating.
- Lacking appropriate E-A-T is sufficient reason to give a page a Low-quality rating.
- Negative reputation is sufficient reason to give a page a Low-quality rating.
What are low-quality pages?
When it comes to defining what a low-quality page is, Google is evidently VERY interested in the quality of the Main Content (MC) of a page:
Main Content (MC)
Google says MC should be the ‘main reason a page exists’.
- The quality of the MC is low.
- There is an unsatisfying amount of MC for the purpose of the page.
- There is an unsatisfying amount of website information.
POOR MC & POOR USER EXPERIENCE
- This content has many problems: poor spelling and grammar, complete lack of editing, inaccurate information. The poor quality of the MC is a reason for the Lowest+ to Low rating. In addition, the popover ads (the words that are double underlined in blue) can make the main content difficult to read, resulting in a poor user experience.
- Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.
DESIGN FOCUS NOT ON MC
- If a page seems poorly designed, take a good look. Ask yourself if the page was deliberately designed to draw attention away from the MC. If so, the Low rating is appropriate.
- The page design is lacking. For example, the page layout or use of space distracts from the MC, making it difficult to use the MC.
MC LACK OF AUTHOR EXPERTISE
- You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.
- There is no evidence that the author has medical expertise. Because this is a YMYL medical article, lacking expertise is a reason for a Low rating.
- The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking E-A-T.
After page content, the following are given the most weight in determining if you have a high-quality page.
POOR SUPPLEMENTARY CONTENT
- Unhelpful or distracting SC that benefits the website rather than helping the user is a reason for a Low rating.
- The SC is distracting or unhelpful for the purpose of the page.
- The page is lacking helpful SC.
- For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating
- For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits, however, an extremely distracting and graphic porn ad may warrant a Low rating.
- If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.
- The website is lacking maintenance and updates.
SERP SENTIMENT & NEGATIVE REVIEWS
- Credible negative (though not malicious or financially fraudulent) reputation is a reason for a Low rating, especially for a YMYL page.
- The website has a negative reputation.
When it comes to Google assigning your page the lowest rating, you would probably have to go some way to hit this, but it gives you a direction you want to avoid at all costs.
Google says throughout the document, that there are certain pages that…
should always receive the Lowest rating
…and these are presented below. Note – these statements are spread throughout the raters’ document and not listed the way I have listed them here. I don’t think any context is lost presenting them like this, and it makes the whole thing more digestible.
Anyone familiar with Google Webmaster Guidelines will be familiar with most of the following:
- True lack of purpose pages or websites.
- Sometimes it is difficult to determine the real purpose of a page.
- Pages on YMYL websites with completely inadequate or no website information.
- Pages or websites that are created to make money with little to no attempt to help users.
- Pages with extremely low or lowest quality MC.
- If a page is deliberately created with no MC, use the Lowest rating. Why would a page exist without MC? Pages with no MC are usually lack of purpose pages or deceptive pages.
- Webpages that are deliberately created with a bare minimum of MC, or with MC which is completely unhelpful for the purpose of the page, should be considered to have no MC
- Pages deliberately created with no MC should be rated Lowest.
- Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
- Pages on YMYL (Your Money or Your Life) websites with completely inadequate or no website information.
- Pages on abandoned, hacked, or defaced websites.
- Pages or websites created with no expertise or pages that are highly untrustworthy, unreliable, unauthoritative, inaccurate, or misleading.
- Harmful or malicious pages or websites.
- Websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.
- Deceptive pages or websites. Deceptive webpages appear to have a helpful purpose (the stated purpose), but are actually created for some other reason. Use the Lowest rating if a webpage page is deliberately created to deceive and potentially harm users in order to benefit the website.
- Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
- Sometimes, pages just don’t “feel” trustworthy. Use the Lowest rating for any of the following: Pages or websites that you strongly suspect are scams
- Pages that ask for personal information without a legitimate reason (for example, pages which ask for name, birthdate, address, bank account, government ID number, etc.). Websites that “phish” for passwords to Facebook, Gmail, or other popular online services. Pages with suspicious download links, which may be malware.
- Use the Lowest rating for websites with extremely negative reputations.
Websites ‘Lacking Care and Maintenance’ Are Rated ‘Low Quality’.
Sometimes a website may seem a little neglected: links may be broken, images may not load, and content may feel stale or out-dated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.
“Broken” or Non-Functioning Pages Classed As Low Quality
I touched on 404 pages in my recent post about investigating why a site has lost traffic.
Google gives clear advice on creating useful 404 pages:
- Tell visitors clearly that the page they’re looking for can’t be found
- Use language that is friendly and inviting
- Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site.
- Consider adding links to your most popular articles or posts, as well as a link to your site’s home page.
- Think about providing a way for users to report a broken link.
- Make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested
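The last point in the list above – returning an actual 404 HTTP status code – is easy to verify yourself. A minimal sketch using only Python’s standard library; the probe path is a made-up URL that should not exist on your site:

```python
import urllib.error
import urllib.request


def returns_real_404(base_url):
    """Request a URL that should not exist and confirm the server answers
    with an actual 404 status, rather than a 'soft 404' (a 200 OK page
    that merely says the page cannot be found)."""
    probe = base_url.rstrip("/") + "/this-page-should-not-exist-1234567890"
    try:
        with urllib.request.urlopen(probe):
            pass
    except urllib.error.HTTPError as error:
        return error.code == 404
    return False  # a 200 response to a nonsense URL is a soft 404
```

If this returns `False`, your server is serving ‘soft 404s’ – pages that look like errors to users but like thin, low-quality pages to a crawler.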
Ratings for Pages with Error Messages or No MC
Google doesn’t want to index pages without a specific purpose or sufficient main content. A good 404 page and proper setup prevents a lot of this from happening in the first place.
Some pages load with content created by the webmaster, but have an error message or are missing MC. Pages may lack MC for various reasons. Sometimes, the page is “broken” and the content does not load properly or at all. Sometimes, the content is no longer available and the page displays an error message with this information. Many websites have a few “broken” or non-functioning pages. This is normal, and those individual non-functioning or broken pages on an otherwise maintained site should be rated Low quality. This is true even if other pages on the website are overall High or Highest quality.
Does Google programmatically look at 404 pages?
We are told NO in a recent hangout – but the Quality Raters’ Guidelines note that “Users probably care a lot”.
Do 404 Errors in Search Console Hurt My Rankings?
404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. JOHN MUELLER
It appears this isn’t a one-size-fits-all answer. If you properly deal with mishandled 404 errors that have some link equity, you reconnect equity that was once lost – and this ‘backlink reclamation’ evidently has value.
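In practice, ‘backlink reclamation’ usually just means 301-redirecting old, linked-to URLs that now 404 to their closest live equivalents. You would normally do this in your web server or CMS configuration; the mapping below is entirely hypothetical, but the logic is simple enough to sketch:

```python
# Hypothetical examples: old URLs that still attract backlinks,
# mapped to the closest live equivalent page.
REDIRECT_MAP = {
    "/old-seo-guide": "/what-is-seo",
    "/services.html": "/services/",
}


def resolve(path):
    """Return (status, location): a 301 permanent redirect for reclaimed
    URLs, or a plain 404 for everything else."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 404, None
```

The point of the 301 (rather than a 302 or a soft landing page) is that it is the signal search engines treat as a permanent move, so any equity pointing at the dead URL has a chance of being passed on.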
The issue here is that Google introduces a lot of noise into that Crawl Errors report to make it unwieldy and not very user-friendly.
A lot of broken links Google tells you about can often be totally irrelevant and legacy issues. Google could make it instantly more valuable by telling us which 404s are linked to from only external websites.
Fortunately, you can find your own broken links on site using the myriad of SEO tools available.
I also prefer to use Analytics to look for broken backlinks on a site with some history of migrations, for instance.
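If you prefer a scripted approach, a single page’s outbound links can be checked with nothing but Python’s standard library. This is a minimal sketch – a real audit tool would crawl the whole site, rate-limit itself, and respect robots.txt:

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkCollector(HTMLParser):
    """Collects href values from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_broken_links(page_url):
    """Fetch one page and report links that answer with an HTTP error."""
    with urllib.request.urlopen(page_url) as response:
        html = response.read().decode(errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    broken = []
    for href in collector.links:
        target = urljoin(page_url, href)
        if not target.startswith("http"):
            continue  # skip mailto:, tel:, javascript: etc.
        try:
            with urllib.request.urlopen(target):
                pass
        except urllib.error.HTTPError as error:
            broken.append((target, error.code))
        except urllib.error.URLError:
            broken.append((target, None))  # DNS failure, refused connection
    return broken
```

Running this over a page returns a list of `(url, status)` pairs for links that errored – the same class of problem the raters’ document flags under ‘lacking care and maintenance’.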
John has clarified some of this before, although he is talking specifically (I think) about errors found by Google in Search Console (formerly Google Webmaster Tools):
- In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there’s a broken link on your site, in your page’s static HTML, then that’s always worth fixing
If you are making websites and want them to rank, the 2015 and 2014 Quality Raters Guidelines documents are a great guide for webmasters to avoid low-quality ratings and potentially avoid punishment algorithms.
Google Is Not Going To Rank Low-Quality Pages When It Has Better Options
If you have exact match instances of key-phrases on low-quality pages, mostly these pages won’t have all the compound ingredients it takes to rank high in Google in 2017.
I was working on this long before I understood it well enough to write anything about it.
Here are a few examples of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed around a user’s intent:
Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.
- Investigating A Website Traffic Drop
- Dealing With Low Quality Pages On A Website
- Example of a High Quality Webpage
- Making High Quality Websites
If you are doing a professional SEO audit for a real business, you are going to have to think like a Google Search Quality Rater AND a Google search engineer to provide real long term value to a client.
Google has a LONG list of technical requirements it advises you meet, on top of all the things it tells you NOT to do to optimise your website.
Meeting Google’s technical guidelines is no magic bullet to success – but failing to meet them can impact your rankings in the long run – and the odd technical issue can actually severely impact your entire site if rolled out across multiple pages.
The benefit of adhering to technical guidelines is often a second-order benefit.
You don’t get penalised, or filtered, when others do. When others fall, you rise.
Mostly, individual technical issues will not be the reason you have ranking problems, but they still need to be addressed for any second-order benefit they provide.
When making a site for Google in 2017, you really need to understand that Google has a long list of things it will mark sites down for, and that’s usually old-school SEO tactics which are now classed as ‘web spam’.
Conversely, sites that are not marked down improve in rankings as demoted sites fall. Sites with higher rankings pick up more organic links, and this process can quickly float a high-quality page to the top of Google.
So – the sensible thing for any webmaster is to NOT give Google ANY reason to DEMOTE a site. Tick all the boxes Google tells you to tick.
I have used this simple (but longer-term) strategy to rank on page 1 or thereabouts for ‘SEO’ in the UK over the last few years, and to drive 100,000 relevant organic visitors to this site every month, to only about 70 pages, without building any links over the last few years (and very much working on it part-time).
What Is Domain Authority?
Domain authority, or as Google calls it, ‘online business authority’, is an important ranking factor in Google. What is domain authority? Well, nobody knows exactly how Google calculates popularity, reputation, intent or trust, outside of Google, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted – all of which can be faked, of course.
Most sites that have domain authority/online business authority have lots of links to them – that’s for sure – which is why link building has traditionally been such a popular tactic – and counting these links is generally how most third-party tools calculate a pseudo domain authority score, too.
Massive domain authority and ranking ‘trust’ was in the past awarded to very successful sites that have gained a lot of links from credible sources, and other online business authorities too.
Amazon has a lot of online business authority…. (Official Google Webmaster Blog)
SEOs more usually talk about domain trust and domain authority based on the number, type and quality of incoming links to a site.
Examples of trusted, authority domains include Wikipedia, the W3C and Apple. How do you become an OBA? By building a killer online or offline brand or service with, usually, a lot of useful content on your site.
How do you take advantage of being an online business authority? Either you turn the site into an SEO Black Hole (only for the very biggest brands) or you pump out information – all the time, on any subject. Because Google will rank it!
EXCEPT – If what you publish is deemed low quality and not suitable for your domain to have visibility on Google.
I think this ‘quality score’ Google has developed could be Google’s answer to this sort of historical domain authority abuse.
Can you (on a smaller scale, in certain niches) mimic an online business authority by recognising what OBAs do for Google, and why Google ranks them high in search results? These provide THE service, THE content, THE experience. This takes a lot of work and a lot of time to create, or even mimic.
In fact, as an SEO, I honestly think the content route is the only sustainable way for most businesses to try to achieve OBA status, at least in their niche or locale. I concede a little focused link building goes a long way to help, and you have certainly got to get out there and tell others about your site…
Have other relevant sites link to yours. Google Webmaster Guidelines
“Brands are the solution, not the problem,” Mr. Schmidt said. “Brands are how you sort out the cesspool.“
Google CEO Eric Schmidt said this. Reading between the lines, I’ve long thought this is good SEO advice.
If you are a ‘brand’ in your space, or well-cited site, Google wants to rank your stuff at the top because it trusts you won’t spam it and fill results pages with crap and make Google look stupid.
That’s money just sitting on the table the way Google currently awards massive domain authority and trust to particular sites they rate highly.
Tip – Keep content within your topic, unless you are producing high-quality content, of course. (e.g. the algorithms detect no unnatural practices)
I am always thinking:
“how do I get links from big KNOWN sites to my site. Where is my next quality link coming from?”
Getting links from ‘Brands’ (or well-cited websites) in niches can mean ‘quality links’.
Easier said than done, for most, of course, but that is the point.
But the aim with your main site should always be to become an online brand.
Does Google Prefer Big Brands In Organic SERPs?
Well, yes. It’s hard to imagine that a system like Google’s was not designed over the last few years to deliver exactly the listings it does today – and it IS filled with a lot of pages that rank high LARGELY because of the domain the content is on.
Big brands have an inherent advantage in Google’s ecosystem, and that kind of sucks for small businesses. There are more small businesses than big brands for Google to get Adwords bucks out of, too.
That being said – small businesses can still succeed if they focus on a strategy based on depth of content, rather than breadth.