SEO Tutorials

Image optimization 101: How to rank higher in image search

SEO is not only about optimizing written content.

The increasing dominance of visual content online has brought with it new opportunities for increasing a site’s search traffic by optimizing videos and images.

Optimizing your images gives your website an additional chance to be found via image search, and a good logo or some eye-catching graphics can be just as effective at attracting visitors to your website as your written content.


Do you need a PPC management expert?

The eternal question for businesses both large and small: should you run your marketing in-house, or should you hire an expert?

There are numerous factors to take into account including level of expertise, the complexity of the campaign, existing internal resources and the management fee of said expert.

We’ll come clean. Most of these types of articles are written by an agency of some sort, and will therefore naturally tend to be biased towards the benefits of external help. Some would call it bias; some would call it scaremongering.

This article has also been written by an agency, but one looking to give as objective a view as possible. The truth is that both options are completely viable when everything goes to plan. In turn, both options can have significant downsides when those cogs do not turn quite as smoothly as intended.

Note: We previously explored this topic on Search Engine Watch back in 2014. I will include some of the points made in the aforementioned article, as well as adding some new ones.

Cost effectiveness

On the face of it, one of the main points of the argument boils down to cost effectiveness. For a business, the obvious question is: why would you pay a potentially hefty management fee if you could find the time in-house and do it yourself? Especially when PPC is just a bidding system, and does not require design or development skills.

Dig a little deeper, though, and there are key questions that you need to ask yourself on both sides of the coin.

Using an agency

  • How does the management fee stack up against the actual PPC spend and subsequent ROI offered by the campaign?
  • More importantly, what is the risk profile of the agency not hitting the expected ROI and as such their management fee actually removing all of the margin in the campaign?

Managing the campaign in-house

  • Do you actually have the in-house resources to properly manage a campaign, or are you just going to try (unsuccessfully) to squeeze more time out of an already busy team? How will this impact other critical daily tasks within the business (i.e. opportunity cost)?
  • If you are hiring a new person to run the campaign, what are their total costs? Basic salary is an initial indicator but what about benefits, pension, increased desk costs?
  • Furthermore, is this something that you are committed to for the mid to long term? Hiring someone is easy, but if this is their sole responsibility, they may become an unnecessary part of your cost base should you not continue with the campaign. In the UK and EU especially, you can’t then just get rid of this unfortunate soul without going through a somewhat arduous process which will likely involve additional costs such as redundancy pay.

As you can see above, assessing the cost effectiveness of agency vs in-house involves a whole swathe of variables. No two businesses are the same, and as such, applying a standard equation to this situation is simply not possible.

Hopefully you will have enough information available to put together your own version of this equation and come to a decision with regards to the cost effectiveness of a PPC expert.

It all depends on the level of knowledge

Knowledge is power, or in the case of PPC, knowledge and experience will result in a campaign that outperforms one managed by a beginner. You might get lucky, but as with most things in life, the more experienced practitioner will come out on top.

Taking this into account when looking at how to manage a PPC campaign (agency or in-house), you must first look at the level of knowledge within your team.

The fundamentals of a bidding system and focusing on search terms that deliver ROI for the business are easy to grasp. However, to really squeeze the benefits of your spend on PPC you simply need to know what you are doing. Of course this required level of knowledge is only likely to increase with more complex campaigns.

As mentioned in the previous blog post on this topic, having an expert conduct the initial research and set up the campaign is often a cost effective method of making sure that the campaign gets off on the right foot. Much like an initial SEO audit, getting an expert to set up the campaign could provide you with a sufficiently stable foundation to then manage the campaign in-house.

Again, we have seen a plethora of pretty decent campaigns that have been set up by business owners after doing their own research so it still comes back to the level of knowledge (and time) available to you.

How far do you want to go?

There is a reason why different marketing campaigns have different associated costs. Some are very simple, single product localised campaigns whereas others may involve national or international coverage with thousands of individual products.

Generally speaking, the more complex the campaign, the higher the budget, and therefore the higher the risk liability of an underperforming campaign. It doesn’t just come down to bid adjustments.

Great digital marketing campaigns break down the barriers between departments and channels. Teams now communicate with one another instead of remaining in their silos, combining to generate greater results than the sum of their individual parts.

As mentioned back in 2014, using an agency does give you access to more than just PPC experts at the top of their game. An awesome PPC campaign will also pull in web design experts, taking into account UX/UI, conversion rate optimization, content writing and remarketing (among others).

In sport it is often referred to as ‘marginal gains’, although you’ll find that an overhaul of your website’s user flow could deliver far more than just a marginal gain. Access to multiple disciplines is what you potentially turn your back on when running a campaign in house and it can make or break a campaign.

The core functions apply to the aforementioned single product campaign just as much as they do to the international campaign; it’s just that the latter has more to lose. As campaigns increase in complexity, there comes a point where you need to fully commit to the process and give the campaign the very best chance of success.

In this case, it is worth handing it over to a team that specializes in PPC.

Money talks!

In the end, if you put an agency’s campaign next to one that has been run in-house, you would expect the additional level of expertise and experience to mean that the agency’s campaign would produce better results. All other things being equal, you would choose the agency nine times out of ten.

The elephant in the room is of course the agency fees.

Yes, you could argue that in-house teams have a higher level of industry knowledge, or that a PPC expert will be able to make sense of the campaign dashboard and utilize some of the more advanced tools. But you already know this. That argument is pretty clear (and well covered!).

It all comes down to money. In basic terms, how much money will you get in return for the corresponding costs?

Agencies often have a minimum fee associated with PPC campaigns. Therefore if your budget for PPC is particularly modest, for the sake of argument let’s say £500 per month, you will probably find that the agency fees will exceed the margin made from the campaign.

Remember to look at your internal margins rather than revenue! In this case it is likely that having an expert set up the campaign initially will be the only option that would produce a meaningful ROI.

On the other hand, if the agency fee is only a small percentage of the total budget spend and in turn, the margins from the campaign exceed the agency fee then it makes sense. As the Americans love to say, “if it makes money it makes sense”. The agency provides a higher level of expertise, and may also allow you the flexibility to stop the campaign at any moment.

One size doesn’t fit all

Hopefully my ramblings have shown that this is not a one-size-fits-all argument. There are simply too many variables.

My advice would be to base your decision on real world facts. For example, we often see business owners try to run campaigns themselves (or ask one of their team to do it) without seriously considering the time expenditure required. Trust us when we say that if you don’t have the time, it won’t get done. You’ll end up spending a chunk of money for three months and then giving up without making any adjustments.

On the other hand, if you do have the time you could find yourself getting to grips with the campaign and increasing your sales as a result. It may even mean that you can reduce spend on other marketing channels, all without having to pay an agency fee.

Consider your options before making a decision. Don’t just do PPC for the sake of doing it; there should be a real business case put forward which will provide real data as to whether hiring a PPC expert is going to be a viable option.

If it transpires that running the campaign in-house is going to work in the real world, then great, go for it! Be flexible, be realistic and you should find yourself making the right decision. Remember – both options can work.

How to Improve Page Speed for More Traffic & Conversions

Unfortunately, most websites perform poorly when it comes to page speed, and that has a direct negative impact on their revenue.

There is an almost infinite number of things we can spend our days doing as digital marketers, and there’s never enough time to do them all. As a result, some things get pushed to the back burner.

One of the things that seem to get pushed back most often is optimizing page speed. This is easy to understand because most people don’t truly comprehend the importance of this often overlooked detail, so they don’t see the value in investing time and money to improve it by a few seconds or less.

What may seem like an inconsequential amount of time to some marketers, including those who focus solely on search engine optimization, has been proven to be monumental by data from industry giants all the way down to our own analytics data.

I’ll assume that you’re like me and you want to maximize your results, and of course, your revenue, right? Then let’s get started in making your website faster than greased snot! (That’s quite a visual, isn’t it?)

1. Ditch the Budget Web Hosting

We’re all trying to save money these days, after all, those subscriptions to Raven, SEMrush, Moz, and all the other tools we use on a daily basis add up quickly. It’s almost like having an extra kid.

One way a lot of people try to save money is by choosing the kind of cheap shared hosting that crams as many websites as they can fit onto a server, much like a bunch of clowns piling into a single car. Performance be damned!

Sure, your website will be available most of the time, as it would be with almost any web host, but it will load so bloody slowly that your visitors will leave frustrated without ever converting into buyers.

“But it’s barely noticeable!” these bargain shoppers insist.

Here’s the thing — it might be barely noticeable to you because it’s your baby and you love it. But everyone else only wants to get in and get out of your website as quickly as possible.

People want to be on your site for just long enough to do what they came to do, whether that means to get an answer, buy a product, or some other specific objective. If you slow them down even a little bit, they will be likely to hate their experience and leave without converting.

Think about it like this:

Most people love their own kids unconditionally. But someone else’s kid screaming, throwing things, disrupting their night out at a restaurant? They hate that kid. It’s the same with your website.

How Much of a Difference Does It Really Make?

According to a study conducted by Amazon, a difference of just 100ms (a delay too short for a human to even perceive) was enough to reduce their sales by 1 percent. Walmart found similar results.

If that tiny unit of time has that much direct impact on sales, what kind of impact do you think an extra second or more will have?

But it doesn’t stop there because how quickly (or slowly) your website loads also has an impact on organic search ranking and pay-per-click costs. In other words, if your website loads slowly, you should expect your competitors who have invested in this critical area to eat your lunch.

Bottom line: skip the budget web hosting. If they are selling it like a commodity (based mainly on price) then they’ll treat their customers like a commodity too.

There are a lot of web hosts that are optimized for speed, particularly for WordPress websites, and some of them are priced similarly to the budget options. So ask around, do some testing, and invest in a web host that will give you the performance to satisfy both your visitors and Google.

2. Reduce HTTP Calls

Every file needed for a webpage to render and function, such as HTML, CSS, JavaScript, images, and fonts, requires a separate HTTP request. The more requests made, the slower that page will load.

Now if you’re anything like most of the people I talk to, you’re probably thinking “Oh, I don’t need to worry about that, Jeremy. I know what I’m doing and I don’t add a bunch of bloated garbage into my website!”

That may be partially true. You may not add a bunch of bloated garbage to your website, but for 90 percent+ of the websites that I encounter — it’s still there anyway.

That bloat isn’t there because the Bloat Fairy snuck it in while you were sleeping. It’s there because a majority of web designers, regardless of skill or experience, don’t make page speed a priority. The sad truth is that most don’t even know how.

Here’s where the problem starts:

Most themes load one or more CSS files and several JavaScript files. Some, such as jQuery or Font Awesome, are usually loaded remotely from another server, which dramatically increases the time it takes a page to load.
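A minimal sketch of the fix (the file names here are hypothetical): combine your stylesheets and scripts into as few files as possible, serve them from your own server, and defer any JavaScript that isn’t needed to render the page:

<!-- One combined stylesheet instead of several separate requests -->
<link rel="stylesheet" href="/assets/combined.min.css">

<!-- One combined script, deferred so it doesn't block rendering -->
<script src="/assets/combined.min.js" defer></script>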

3. Include the Trailing Slash

Omitting the trailing slash on links pointing to your website, whether from external sources (link building efforts) or from within your own website, has an adverse impact on speed.

Here’s how:

When you visit a URL without the trailing slash, the web server will look for a file with that name. If it doesn’t find a file with that name, it will then treat it as a directory and look for the default file in that directory.

In other words, by omitting the trailing slash, you’re forcing the server to execute an unnecessary 301 redirect. While it may seem instantaneous to you, it does take slightly longer, and as we’ve already established, every little bit adds up.

https://example.com (this is bad)
or
https://example.com/services (this is also bad)
vs
https://example.com/ (this is good)
or
https://example.com/services/ (this is also good)
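The same goes for your internal links – include the trailing slash in the href so the server can serve the page directly, with no 301 redirect (the URL here is just an example):

<a href="https://example.com/services/">Our Services</a>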

4. Enable Compression

Enabling GZIP compression can significantly reduce the amount of time it takes to download your HTML, CSS, and JavaScript files, because they are downloaded as much smaller compressed files, which are then decompressed once they get to the browser.

Don’t worry — your visitors won’t have to do anything extra because all modern browsers support GZIP and automatically process it for all HTTP requests already.

5. Enable Browser Caching

With browser caching enabled, the elements of a webpage are stored in your visitors’ browser so the next time they visit your site, or when they visit another page, their browser can load the page without having to send another HTTP request to the server for any of the cached elements.

Once the first page has been loaded and its elements are stored in the user’s cache, only new elements need to be downloaded on subsequent pages. This can drastically reduce the number of files that need to be downloaded during a typical browsing session.

6. Minify Resources

Minifying your CSS and JavaScript files removes unnecessary white space and comments to reduce the file size, and as a result, the time it takes to download them.

Fortunately, this doesn’t have to be a manual process, because there are several tools available online to convert a file into a smaller, minified version of itself. There are also several plugins available for WordPress that will replace the links in your website’s head to your regular CSS and JavaScript files with minified versions of them, without modifying your original files, including popular caching plugins such as:

  • W3 Total Cache
  • WP Super Cache
  • WP Rocket

It may take a bit of effort to get the settings just right because minification can often break CSS and JavaScript, so once you’ve minified everything, be sure to test your website thoroughly.
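To illustrate what minification actually does, here is a tiny before-and-after sketch (the CSS is hypothetical):

<style>
  /* Before: readable, with white space and comments */
  .button {
      color: #ffffff;
      background-color: #0066cc; /* brand blue */
  }
</style>

<style>
  /* After: the same rules, with white space, comments and redundant characters stripped */
  .button{color:#fff;background-color:#0066cc}
</style>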

7. Prioritize Above-the-Fold Content

Your website can appear to the visitor to load more quickly if it’s coded to prioritize above-the-fold content — in other words, the content that is visible before a visitor scrolls.

This means ensuring that any elements that appear above the fold are also as near the beginning of the HTML code as possible, so the browser can download and render them first.

It’s also critical to include any CSS and JavaScript required to render that area inline, rather than in external files.
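A minimal sketch of the pattern (the selectors and file name are hypothetical): inline the small amount of CSS needed to render the top of the page in the <head>, and load the full stylesheet at the end of the <body> so it doesn’t block the first paint (one common approach, though there are other ways to defer CSS):

<head>
  <title>Example Page</title>
  <!-- Critical, above-the-fold styles inlined so they render immediately -->
  <style>
    .header{background:#0066cc;color:#fff}
    .hero{min-height:400px}
  </style>
</head>
<body>
  <!-- ...page content... -->
  <!-- Full stylesheet loaded last so it doesn't block rendering of the content above -->
  <link rel="stylesheet" href="/assets/full.css">
</body>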

8. Optimize Media Files

Because mobile devices with high-quality cameras are common, and modern content management systems such as WordPress make it convenient to upload images, many people simply shoot a photo and upload it without realizing that, often, the image is at least four times bigger than is necessary. This slows your website down considerably — especially for mobile users.

Optimizing the media files on your website has the potential to improve your page speed tremendously, and doing so is relatively easy, so it’s a good investment of your time.

Optimizing Images

  • Opt for the ideal format. JPG is perfect for photographic images, while GIF or PNG are best for images with large areas of solid color. 8-bit PNG files are for images without an alpha channel (transparent background) and 24-bit files are for images with an alpha channel.
  • Ensure images are properly sized. If an image is displayed at 800 pixels wide on your website, there is no benefit to using a 1600 pixels wide image.
  • Compress the image file. Aside from being the top image editing program, Adobe Photoshop has awesome image compression capabilities and starts at $9.99/month. You can also use free WordPress plugins – such as EWWW Image Optimizer, Imsanity, and TinyJPG – that automatically compress uploaded images.

Optimizing Video

  • Choose the ideal format. MP4 is best in most cases because it produces the smallest file size.
  • Serve the optimal size (dimensions) based on visitors’ screen size.
  • Eliminate the audio track if the video is used in the background as a design element.
  • Compress the video file. I use Adobe Premiere most of the time, but Camtasia is a solid choice too.
  • Reduce the video length.
  • Consider uploading videos to YouTube or Vimeo instead of serving them locally and use their iframe embedding code.

You shouldn’t stop there though because that only scratches the surface.

To truly optimize the media on your website, you need to serve the appropriately-sized images based on the screen size rather than simply resizing them. There are two ways to handle this, based on the implementation of an image.

  • Images within the HTML of your website can be served using srcset, which enables the browser to select, download, and display the appropriate image based on the screen size of the device a visitor is using.
  • Images placed via CSS – typically as background images – can be served using media queries to select the appropriate image based on the screen size of the device a visitor is using. A sketch of both techniques follows this list.
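A minimal sketch of both techniques (the file names and the 800px breakpoint are hypothetical):

<!-- HTML: the browser picks the smallest adequate file from the srcset list -->
<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 800px) 100vw, 800px"
     alt="Product photo">

<!-- CSS: a media query serves a smaller background image to smaller screens -->
<style>
  .hero { background-image: url(hero-1600.jpg); }
  @media (max-width: 800px) {
    .hero { background-image: url(hero-800.jpg); }
  }
</style>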

9. Utilize Caching & CDNs

Caching enables your web server to store a static copy of your webpages so they can be delivered more quickly to a visitor’s browser, while a CDN allows those copies to be distributed to servers all over the world so that a visitor’s browser can download them from the server closest to their location. This improves page speed dramatically.

How To Write Page Titles For Google & Other Search Engines in 2017

Page Titles Are An Important Ranking Signal

SEO Page Title Tag Best Practices for Google in 2017

Ranking in Google in 2017 is about SO MUCH more than just optimising one element of a page. The Page Title Tag (or more accurately the HTML Title Element) is still, however, arguably the most important on-page SEO factor to address on any web page.

Keywords in page titles can HELP your pages rank higher in Google results pages (SERPs). The page title is also often used by Google as the title of a search snippet link in search engine results pages. Keywords in page titles often end up as links to your web page.

As arguably an important ranking element, it is very important NOT to keyword-stuff this element across multiple pages using boilerplate techniques.

Page Titles: Example Use

<title>What Is The Best Title Tag For Google?</title>
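If you do include branding (see the notes below), it usually goes at the end, after a divider – a hypothetical example:

<title>What Is The Best Title Tag For Google? | Brand Name</title>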

Note also that the text link Google DISPLAYS as your page snippet title can change dynamically. A search snippet title is very much QUERY DEPENDENT in 2017. If you want to almost ensure a page title will display in full in google.co.uk on a desktop machine – stick to about 55 characters in length max – although if you do that – you are missing out on some longer tail ranking benefit. I go into all this in detail below.

The same page might have at least a few variations displayed in Google, all dependent on words typed by the user, and this process starts as soon as a page is published:

[Screenshots, March 2015: the same page displayed with two different snippet titles in Google.]

 

….and over time links and other data soon give Google even more options to change that search snippet title. I go into these options below, but you can in most instances prevent Google from changing your title tag if you are succinct and a little more careful when creating it, so Google is not compelled to modify it:

[Screenshot, March 2015: a succinct page title displayed unmodified in Google’s snippet.]

Note that desktop and mobile versions of Google are different too.

It was a surprise the first time I saw that Google displays longer title tags in MOBILE view than it does in DESKTOP view (in Google.co.uk, at least):

[Screenshot: a longer title displayed in Google’s mobile results on an iPhone.]

What Is The Perfect Page Title Tag?

There is no one-size-fits-all formula for creating the perfect title tag, as the perfect title tag is perfectly relevant to the words that are on a specific page. An effective page title is created with how people search for things on the page in mind.

It’s difficult to describe in the abstract, but basically it all comes down to keywords and keyword phrases taken from text on the page (naturally) and expected popular/valuable search phrases (based on data available).


Different kinds of pages require different kinds of title tags. For certain pages, the perfect page title might change over time to guarantee diversity in anchor text to the page, as the title ends up often used as anchor text in backlinks from other sites. For some pages – a more permanent page title might suffice.

The perfect page title for the page is usually going to be very page-dependent, and eventually user-query-dependent.

I will point out I optimise for raw search engine traffic performance before I optimise for display performance.

There is a big difference.

I will use a long title if it suits the page content. Vice versa, I will use a short title if that instead suits the page.

I rarely try to be sensational with my titles. My titles are functional. And longer than some best practices recommend. I very rarely trim my titles to meet a recommendation laid down by any third party.

I’ll often post a functional title and then revisit it after I publish, once I observe how it is performing against competing pages, or when I have spotted an error or have new inspiration to modify it.

I expect my page titles to change over time for I too am always testing and optimising.

For me, a perfect title tag in Google is going to be dependent on a number of competing factors:

  1. A page title that is highly relevant to the page it refers to will maximise usability, search engine ranking performance and user experience ratings as Google measures these. It will probably be displayed in a web browser’s window title bar, bookmarks and in clickable search snippet links used by Google, Bing & other search engines. The title element is the “crown” of a web page, with your important keyword phrase featuring AT LEAST ONCE within it.
  2. Most modern search engines have traditionally placed a lot of importance in the words contained within this HTML element. A good page title is made up of keyword phrases of value and high search volumes.
  3. The last time I looked, Google displayed as many characters as it could fit into “a block element that’s 512px wide and doesn’t exceed 1 line of text”. So – THERE IS NO AMOUNT OF CHARACTERS any SEO can lay down as exact best practice to GUARANTEE a title will display, in full, in Google, at least, as the search snippet title. Ultimately – only the characters and words you use will determine if your entire page title will be seen in a Google search snippet. Google used to display 70 characters in a title – but that changed in 2011/2012.
  4. If you want to *ENSURE* your FULL title tag shows in the desktop UK version of Google SERPs, stick to a shorter title of about 55 characters but that does not mean your title tag MUST end at 55 characters and remember your mobile visitors see a longer title (in the UK, in March 2015 at least). What you see displayed in SERPs depends on the characters you use. In 2017 – I just expect what Google displays to change – so I don’t obsess about what Google is doing in terms of display. See the tests later on in this article.
  5. Google is all about ‘user experience’ and ‘visitor satisfaction’ in 2017, so it’s worth remembering that usability studies have shown that a good page title length is about seven or eight words long and fewer than 64 total characters. Longer titles are less scannable in bookmark lists, and might not display correctly in many browsers (and of course probably will be truncated in SERPs).
  6. Google will INDEX perhaps 1000s of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE when determining relevance for ranking purposes. It is a very hard thing to try to isolate accurately with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
  7. You can probably include up to 12 words that will be counted as part of a page title, and consider using your important keywords in the first 8 words. The rest of your page title will be counted as normal text on the page.
  8. NOTE: in 2017, the HTML title element you choose for your page may not be what Google chooses to include in your SERP snippet. The search snippet title and description are very much QUERY dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or in links to that page, to create a very different SERP snippet title.
  9. When optimising a title, you are looking to rank for as many terms as possible, without keyword stuffing your title. Often, the best bet is to optimise for a particular phrase (or phrases) – and take a more long-tail approach. Note that too much page title and not enough actual page text per page could lead to Google Panda or other ‘user experience’ performance issues. A highly relevant unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
  10. Some page titles do better with a call to action – a call to action which reflects exactly a searcher’s intent (e.g. to learn something, or buy something, or hire something). THINK CAREFULLY before autogenerating keyword phrase footprints across a site. Remember this is your hook in search engines, if Google chooses to use your page title in its search snippet, and there are a lot of competing pages out there in 2017.
  11. The perfect title tag on a page is unique to other pages on the site. In light of Google Panda, an algorithm that looks for a ‘quality’ in sites, you REALLY need to make your page titles UNIQUE (ESPECIALLY RELEVANT TO OTHER PAGES ON YOUR SITE), and minimise any duplication, especially on larger sites.
  12. I like to make sure my keywords feature as early as possible in a title tag but the important thing is to have important keywords and key phrases in your page title tag SOMEWHERE.
  13. For me, when SEO is more important than branding, the company name goes at the end of the tag, and I use a variety of dividers to separate as no one way performs best. If you have a recognisable brand – then there is an argument for putting this at the front of titles – although Google often will change your title dynamically – sometimes putting your brand at the front of your snippet link title itself. I often leave out branding. There is no one size fits all approach as the strategy will depend on the type of page you are working with.
  14. Note that Google is pretty good these days at removing any special characters you have in your page title – and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA mark-up.
  15. I like to think I write titles for search engines AND humans.
  16. Know that Google tweaks everything regularly – why not what the perfect title keys off? So MIX it up…
  17. Don’t obsess. Natural is probably better, and will only get better as engines evolve. I optimise for key-phrases, rather than just keywords.
  18. I prefer mixed case page titles as I find them more scannable than titles with ALL CAPS or all lowercase.
  19. Generally speaking, the more domain trust/authority your SITE has in Google, the easier it is for a new page to rank for something. So bear that in mind. There is only so much you can do with your page titles – your website’s rankings in Google are a LOT more to do with OFFSITE factors than ONSITE ones – negative and positive.
  20. Click satisfaction (whatever that is) is something that is likely measured by Google when ranking pages (Bing say they use it too), so it is really worth considering whether you are best optimising your page titles for user click-through satisfaction or optimising for more search engine rankings (the latter being risky in 2017).
  21. I would think keyword stuffing your page titles could be one area that Google could look at.
  22. Remember… think ‘keyword phrase’ rather than ‘keyword’, ‘keyword’, ‘keyword’… think Long Tail.
  23. Google will select the best title it wants for your search snippet – and it will take that information from multiple sources, NOT just your page title element. A small title is often appended with more information about the domain. Sometimes, if Google is confident in the BRAND name, it will replace it with that (often adding it to the beginning of your title with a colon, or sometimes appending the end of your snippet title with the actual domain address the page belongs to).
  24. Beware of repeating keywords unnecessarily, keyword stuffing or using boilerplate text to create your titles. Any duplication that is perceived by Googlebot as manipulation is easily down-ranked by algorithms.

Spammy Title Tags

When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) if this is a spam site or a quality site – such as – have you repeated the keyword 4 times or only once?

I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.

I always aim to keep my HTML page title elements relatively simple, looking as human-generated and unique as possible, although it is easy to end up stuffing keywords in there.

Spammy title tags can also look real ugly. Your audience might not like ugly title tags, whereas another audience might not care.

As I said, page titles are ‘arguably the most important on-page SEO factor to address on any web page’ other than primary content, and as such, they may be used by Google against you if they detect an intent to rank higher by low-quality means.

Optimising your titles was traditionally the first place to go to improve your rank in Google, so it is quite reasonable that Google would use this against you, if it wanted to. Yes – keyword stuffing works and might bring you traffic in the short term – but it may be that over time the benefit you get from this is pegged back, and wider site ‘quality signals’ are impacted, leading to less traffic than you would have got if you hadn’t keyword stuffed your titles.

That doesn’t mean you can’t have long titles. It just means don’t use the CMS and boilerplate methods to generate them. If you do employ long titles, make sure it is not just repeating the same words over and over again, and the title is uniquely relevant to that page.

For now – I would stick to very concise titles on pages and leave all boilerplating out. If it is machine generated, keep it very simple (and that advice probably fits for most of us).

Pay close attention to what Google says in 2017 about how to optimise your title tag:

  1. Make sure every page on your site has a title specified in the <title> tag

  2. Page titles should be descriptive and concise

  3. Avoid keyword stuffing

  4. Avoid repeated or boilerplate titles

  5. Brand your titles, but concisely (I generally wouldn’t bother with branding on most pages on your site – Shaun)
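To make that advice concrete, here is a hypothetical pair of examples – one descriptive and concise, one stuffed with boilerplate keywords:

<!-- Descriptive, concise and unique to the page -->
<title>How To Clean Computer Keyboards Safely | Brand Name</title>

<!-- Keyword-stuffed boilerplate to avoid -->
<title>keyboard cleaning, cleaning keyboards, keyboard cleaners, cheap keyboard cleaning UK</title>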

If you are doing anything other than the above to drive more organic traffic to a page using the age-old tactic of page title optimisation, you might run smack bang into the Google Quality Algorithm.

Dynamic Titles In Google SERP Snippet

Google works VERY differently now from the days when I started publishing results of such SEO tests. I have kept some of the information below on this page to show how SEO learned what Google seemed to like, and how you tested such a thing, in case anyone is interested in the future.

With DYNAMIC page titles, Google is free to ignore the page title you use – and will choose the best title for your search snippet, based on what it thinks is the most relevant text for a search query.

It is now VERY COMMON for Google to create its own search snippet title, all but effectively ignoring the title you specified for the page.

I watched this being tested for months, if not over a year (usually to help repair malformed titles or pages with the same title ‘tag’, as some call it, for instance), but it became very widespread, even for well-formed pages too.

Example (from 2012):

[Two screenshots from 2012: Google displaying different snippet titles for the same page.]

Dynamic snippet titles seem to key off various signals – from anchor text pointing to the page in question, or from the page title itself, or from Headers (h1-h6) – all based on what the searcher actually typed in.

I see a lot of folk asking in forums why their snippet title is different from their page title, and it’s probably because you now can’t ever guarantee what title Google will pick to match to a phrase (unless you control the linking, of course – more on that below).

SEO are used to very dynamic DESCRIPTIONS in the snippet. Google is MUCH more confident at stretching that dynamism to the snippet title these days, and not just using this to ‘repair’ malformed or very unhelpful titles.

Perhaps EVEN more of a reason to mix up the anchor text pointing to a page, and to create unique page titles that are different from H1 headers etc…. Note – there are other reasons your page title is wrong in Google (see below).

 

How Google, Yahoo & Bing Handle Title Snippet If Title Is ‘Malformed’

Google during 2015

I demonstrated a long time ago Google will use the next available Header, be it a H1, H2, H3, H4, H5 or H6.

Today, in 2017, Google is happy to label a page like this ‘untitled’ rather than put that much work into making sense of this specific page:

[Screenshot, March 2015: the test page labelled ‘Untitled’ in Google’s results.]

 

Google during 2012:

This SERP screenshot from 2012 is an example of Google using a H6 to form the snippet title.


I intend to change this back to a H2 to see if it picks it up again….

It’s worth pointing out that Google will even handle this particular word VERY DIFFERENTLY on a different site with more domain authority and/or better kind of page than my test page.

[Screenshot, March 2015: the same word handled differently on a higher-authority site.]

 

… so the point is to expect Google to modify your page titles in your SERP snippet – at some point.

Bing during 2015:

Yahoo and Bing now work much the same as each other (Bing actually powers Yahoo search these days), and have done for years. Bing and Yahoo will display as many characters as possible in the title tag. I would expect these to change to a more similar approach to Google’s in the coming year(s).

[Screenshot: Bing search results for the test page, March 2015.]

 

Bing during 2012:

[Screenshot: Bing search results for the test page, April 2012.]

Yahoo during 2015:

 

[Screenshot: Yahoo search results for the test page, March 2015.]

Yahoo in 2012:

[Screenshot: Yahoo search results for the test page, April 2012.]

NOTE – It’s worth pointing out that you CAN search Bing and Yahoo for the longest word coined in the English language – and that is the word in the ‘malformed’ Page Title Element I am using to test this. It’s far too big to display – and far too big to even search for at Google.

Does Google Penalise Keyword Stuffing of Title Tags?

Some time ago I showed if you had a long spammy page title Google would ‘forgive you’.

At one time in the recent past, Google seemed to count the first 8-12 words (while displaying 70 characters in a search snippet) and then just count the rest of the words as part of the page text, but it would not obviously penalise you for a massive title.

I’ve only ever seen one obvious case of Google ‘maybe’ responding negatively to a very spammy title.

A company contacted me to ask:

I was looking to see why my site has not been getting any hits

I looked at the home page title:

Company Name – xxxxxxx, telephone, health and safety, xxxxxxx, scanner, PC, fax, monitor, xxxxxx, keyboard, office equipment, cleaning, cleaning service‚ cleaning company, computer cleaning, xxxxxxxx cleaning, PC cleaning, dust control, telephone cleaning, computer xxxxxx cleaning, anti static, xxxxxx room, printer cleaning, xxxxxxx display unit, raised access xxxxxx, preventative maintenance, zinc dust, anti static mat‚  xxxxxxx room, anti static flooring, xxxxxxx room, computer cleaning equipment, keyboard cleaning, companyname uk

Google showed no title in the page title link in the SERPs:

[Screenshot: the listing displayed with no title in Google.]

There were H1s on the page (multiples, too) – but perhaps if created correctly, Google would have used an H1 as the snippet title. Unfortunately, the site in this example was using images as H1 tags too.

So, this could have been more about sloppy code, than actual penalty.

Over the course of my career, I’ve not made many observations of Google really penalising for keyword stuffing relevant page title elements. However, this sort of abuse (keyword stuffing titles) may well be rolled into things like the May 2015 Google Quality Algorithm. I don’t know, for sure.

How Many Words In A Page Title Tag?

Way back in 2007, I tested how many keywords Google will read in the title tag / element using a simple test. Here are some of the observations, which were quite surprising.

First – here’s the test title tag I tried to get Google to swallow. And it did. All of it. Even though it was a bit spammy:

HoboA HoboB HoboC HoboD HoboE HoboF HoboG HoboH HoboI HoboJ HoboK HoboL HoboM HoboN HoboO HoboP HoboQ HoboR HoboS HoboT HoboU HoboV HoboW HoboX HoboY Hob10 Hob20 Hob30 Hob40 Hob50 Hob60 Hob70 Hob80 Hob90 Hob11 Hob12 Hob13 Hob14 Hob15 Hob16 Hob17 Hob18 Hob19 Hob1a Hob1b Hob1c Hob1d Hob1e Hob1f Hob1g Hob1h

Using a keyword search – hoboA Hob1h – we were surprised to see Google returned our page. We also tested it using – Hob1g Hob1h – the keywords right at the end of the title – and again our page was returned.

So that’s 51 words, and 255 characters without spaces (305 characters with spaces), at least! It seems clear Google will read a title of just about any length – hence why some SEO do use very long titles.

**************

Update: Qwerty pointed out an interesting fact about the intitle: search operator in Google.

[Screenshot: Google results for an intitle: query against the test title.]

…results as expected. But the next query in the sequence returns the following, unexpected result…

[Screenshot: Google returning no result for an intitle: query beyond the 12th word.]

So what does this tell us? Google seemed to stop at the 12th word, on this page at least, when returning results using the intitle: search operator. Another interesting observation. Thanks Qwerty.

**************

I’m obviously not sure what benefit a title tag with this many keywords in it has for your page, with regard to keyword density / dilution, and “clickability” in the search engine results pages (SERPs).

50 plus words in a title is certainly not best practice.

How Many Characters will Google DISPLAY As A Page Title SERP Snippet?

Desktop Examples from Google.co.uk March 2015:

[Three screenshots: desktop SERP snippet titles of varying lengths, Google.co.uk, March 2015.]

At one time Google showed a maximum of 70 characters in the title – but, as the examples above illustrate: in 2017 it is not possible to lay down an exact number of characters for page snippets in Google:

Google displays as many characters as it can fit into “a block element that’s 512px wide and doesn’t exceed 1 line of text”. So – THERE IS NO AMOUNT OF CHARACTERS any SEO can lay down as exact best practice to GUARANTEE your title will display, in full, in Google, at least. Ultimately – only the characters and words you use will determine if your entire page title will be seen in a Google search snippet.

What it was pre-2012:

This is a common question, and it’s surprising how many people don’t know the answer. Google will display up to 70 characters maximum in a Page Title for Google. How many characters or words Google actually counts in terms of attributing it to the page title in the TITLE element is another story. Best practice – keep your page titles to under 70 characters and keep your important keywords in the first 55 characters or 8 words, if you think, like me, there must be a limit on the number of characters in an anchor text link. If your page title is over 70 characters, Google will truncate your page title when it displays it in the SERPs, leaving you with a page title that doesn’t make any … ;) PS – here are some test pages that illustrate Google displays 70 characters in the page title – and if it encounters 71 characters, it will consider the page title malformed and Google will use the next available H tag as the page title (although sometimes it will use incoming anchor text too).

Home Page Title Is Wrong/Different In Google Search Results

In short, Google will often make up its own snippet title for the page, if it thinks it can create a cleaner one, based on what it knows about the page from internal links, content, HTML mark-up and links from other sites to this page. I thought I would list some of the more obvious reasons your home page title seems wrong in Google search results:

  • If you’re totally new to this, Google will look in your page <title></title> tags in the HEAD for your page title information, to display as the link to your page in search results (SERPs)
  • If you’ve made changes recently to optimise your page title for search engines, it might very well be just that Google doesn’t know yet, because it’s not visited your page since you made the change. So give it time. Check the Google CACHE link under your listing in search engines to see which page Google is “supposedly” using for your page (it’s usually accurate).
  • If this is clearly not the case, it might be because Google is using Open Directory Project (ODP) data (the DMOZ directory) to replace your title with your directory listing data – that is, it’s using the link text in that directory to replace your title because it thinks it’s more relevant to a specific query. If this is a possibility, add <meta name="robots" content="noodp" /> and <meta name="robots" content="noydir" /> – for the same issue in Yahoo – to your home page, and once Google visits the page again, the problem should be resolved. See NOODP and also NOYDIR.
  • If you are not in DMOZ or the Yahoo directory, it may well be your title element (‘My Title’) is malformed in some way – I’ve shown before how search engines can choose to use a Header ‘tag’ as a Page title, or ignore the page title completely if you’re a spammer in the making – so ensure your Page Title is properly marked up, starts with <title> and ends with </title>, is in the HEAD of your document – and there is only one.
  • Another reason is you may be confusing Google in some way from getting to the correct page title, and/or from displaying it in results, using directives in your meta tags or Robots.txt. When Google knows about the page because other pages it CAN read link to your page with descriptive keywords, it might very well use these links on third party pages to determine what that page title should be if it decides to include the URI in its listings, and you CERTAINLY don’t want that. For questions about Robots.txt I go to Sebastian.
  • Google uses many signals to help create a search snippet title these days, and will use elements or attributes of your page, or the information in links to your page, to create a snippet title, based on what a searcher typed in.

An intelligent, well formed page title, that is highly relevant to the page, and not duplicated on other pages of your site, is just about the most important single thing you can do to ‘SEO’ a page on your site – after you’ve got some text to put a title on, of course.

Your Page Title is still INCREDIBLY IMPORTANT

Having valuable keyword phrases in your page title is a must for Google to work out the topic of your page, but it’s also a great opportunity for a call to action.

I would BE VERY WARY in 2017 of repeating keywords using boilerplate techniques throughout multiple pages on a site in an effort to improve SEO, or to make them ‘unique’. If an example title tag is 50% similar to another title, or to many other titles on your site – that might be not what Google would call ‘unique’.

There’s lots of opportunity to be found if you experiment with more laser-focused page titles for search engines, or more engaging titles for humans. However, if you take one thing away from this article today – remember this:

A highly relevant unique page title is no longer enough to rely on if the page itself is ‘thin content’.

Google wants to rank long form, keyword rich text pages (with nice titles) these days – rather than the opposite of that, which is what it used to want to rank.

Single, unique, long form quality content pieces with a well thought out page title perform REALLY well in Google in 2016.

That’s better for users, better for Google’s bottom line and harder for spammers….perfect, for Google.

Check out our Character Counter Tool if you want to count some characters.

Read Next:

My guide to SEO for beginners or my guide to link building for beginners. More experienced SEOs will find my recent Google Panda tips post useful.


How To Submit A Site To Search Engines Like Google, Bing & Yahoo

Why Should I Submit My Website to Google, Yahoo And Bing?

Getting listed in Google and the other popular search engines is one of the most effective ways of directing free (or more accurately unpaid), targeted traffic to your website.

Organic traffic is still the most valuable traffic in the world in 2017, with search engines still rated the most trusted source for finding news and information:

[Chart: search engines rated the most trusted source for finding news and information, 2017.]

 

What Are The Top Search Engines In the UK To Submit My Site?

The most popular search engines in the UK are Google, with around 90% of the market share, Bing, and Yahoo. All search engines serve nationwide results, based on the reputation of a website, and local results, based on the proximity of the user to the business.

Google, for example, has many country-specific engines (e.g. www.google.co.uk) that help Google deliver more accurate results for UK based users.

[Graph: the top 5 search engines in the UK, 2015-2016.]

 

Connect Your Website To Google & Other Search Engines

I remember the first time I had to add my site to Google. That was many years ago now, but I had no idea how to do it either.

I didn’t know if I had to register my site with search engines in some special way. I didn’t know if I had to add my site to Google myself or pay them to do it for me. It’s simple for me now – so I wrote this to give you a better understanding of the process (which costs nothing).

This ‘how to’ guide is a primer about getting your small business website into Google, Yahoo and Bing….fast, and free.

Google is the principal driver of traffic in the UK. When it comes to submitting your site to thousands of search engines – well, don’t bother.  There is only a handful of players in the UK and most partner with the top global search engines Google and Bing.

Getting your site into search engines is one thing; ranking high in Google, for instance, is another story altogether. I go into both scenarios in this article.

How Do I Check If My Site Is In Google?

Just type your website address into the Google search box. If Google knows about your site, it will tell you. If your site doesn’t feature as the number 1 result, you may need to submit your site to Google. Another way to check if a page is in either of the search engines is to lift a piece of unique text from the page and put it into the search box “in quotes”.

Your page should come up if Google is indexing your site.

Consider also using the info: operator to see if Google knows about your page – e.g. info:www.hobo-web.co.uk – or indeed any page from your website: site:www.hobo-web.co.uk (Bing too).

In 2017 – I would just put your homepage URL into Google’s search box to ensure Google is serving that page to users. Just being found, and indexed by Google does not mean that your site appears in the main Google results that everyone sees.

Why Can’t I Find My Site In Google?

Search engines like Google need to find your website before they can crawl it, index it, rate it and display it in their listings (SERPs – or Search Engine Results Pages).

Googlebot (the spider Google uses) accesses your page if it knows your website exists. Your website can only be listed in Google search if it was crawled and indexed by Googlebot in the first place.

Google may not know about your site, yet.

There are many reasons a site does not rank in Google. My SEO report identifies most of them.

Do I Need To Pay to Get Into Google, Bing or Yahoo?

No. You do NOT have to pay a penny to get your site into any of the major search engines.

You can submit your URL to all the main global search engines entirely for free.

How Do You Submit A Site To Search Engines?

You do not need an SEO (search engine optimizer) to present your site to Google, Yahoo or Bing. You don’t pay to get into any of the big search engines’ natural (free or organic) listings (and note, Bing powers Yahoo results).

Google has ways of submitting your web pages directly to their index. Most search engines do.

Submitting your site through the following methods will certainly get you started:

How Do I Connect My Site With Google Search Console?

 

At some point, to rank better in search engines, you are going to have to get other sites to link to it, so you may as well think about that first link on external sites.

In 2017 – that usually means creating useful, accurate and in-depth content that attracts links naturally.

If you want to bypass all that, for now, you can submit your website and verify it in Google Webmaster Tools. The procedure to connect your website is very simple, with a little technical knowledge.

How Do I Verify A Site with Google Webmaster Tools?

You have a number of options to verify that you own your website:

  • Add a meta tag to your home page (proving that you have access to the source files) – see the example after this list. To use this method, you must be able to edit the HTML code of your site’s pages.

  • Upload an HTML file with the name you specify to your server. To use this method, you must be able to upload new files to your server.

  • Verify via your domain name provider. To use this method, you must be able to sign in to your domain name provider (for example, GoDaddy.com or networksolutions.com) or hosting provider and add a new DNS record.

  • Add the Google Analytics code you use to track your site. To use this option, you must be an administrator on the Google Analytics account, and the tracking code must use the new asynchronous snippet.
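For example, the meta tag method means adding the Google-supplied verification tag to the <head> of your home page (the content token below is a placeholder – use the one Google gives you):

<head>
  <meta name="google-site-verification" content="your-unique-token-from-google" />
  <title>Your Home Page</title>
</head>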

 

How Do I Automatically Get New Web Pages Into Google?

You can ping Google (and other engines) using RSS or XML.

If you have a blog, I’ve found in the past that registering a site with FeedBurner is useful for happy, instant indexing of new content on a site.

If your website publishes an XML sitemap, this is even better. It’s amazing how fast Google indexes and ranks pages.
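A minimal XML sitemap, following the sitemaps.org protocol, looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>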

I’ve seen pages get into Google and rank in the top ten for a phrase in less than 1 minute and I recently tested just how fast Google publishes your content if you’re well connected.

WordPress, for example, pings Google when you post new content, which is very handy with a blogging system. Most modern open source content management systems of note do something like this.

You can add your website direct to Google Webmaster Tools, too.

Do I Need Links for Google To Find My Site?

NO. Historically, the best way to get into Google, and the other top search engines, was for the search engine spider to find your site through a link that was already on another website, which Google already includes in its index.

A link from a blog, forum or another website would usually suffice.

While this is still the case, there are other easier and safer ways to submit your site in 2017.

Any link from another website will do, as long as Google sees it as a link (nofollow prevents Google from passing PageRank and other quality signals – but Google has said nofollowed links might still be used for discovery).

When it comes to links, you should think about how relevant the page you want the link from is to your website. Does Google rate that web page highly? If not, a link on that site might be ignored.

Many SEO think relevant links are important. I think it’s more important just to get a link from a real website.

A link from a reputable website will help Google trust your site more. Ranking high, or even number 1, on Google often comes down to reputation and the quality of sites that link to you (Google rates the quality of your website too, in this rank calculation).

If you submit your website to cheap directories or buy links to get into Google, these links can harm your online reputation and rankings in the future. You could be effectively penalised, or removed from Google’s results entirely.

In 2017, the simple truth is, you hardly even need a link for Google to find your website. Google (especially) WILL find your domain within days or weeks, and if your website has unique content on it – it will go into its index quickly.

It will probably rank high for a little while too. Then drop down the rankings a bit. A ‘honeymoon period’. Maybe it’s to give you a taste of Google traffic – or perhaps it is to determine which quality standards your site fails to meet.

Should I Submit My Site To Directories?

You could, of course, think about submitting your site to directories.

There are some good quality directories out there. A link in ONE OR TWO of these directories will probably get your site into search engines.

Unfortunately, there are a LOT of low-quality directories you will probably stumble upon first. That kind of backlink can quickly turn toxic – and too many of them can cause you problems.

I would AVOID all low cost, cheap SEO submission services, especially those based around submitting your site to directories.

Often pointless and often risky.

Which Paid or Free Search Engine Submission Tools and Services Should I Use?

I was talking to someone who recently paid a website submission company to submit their site to lots of different search engines.

It’s easy to forget that many are ignorant of what seems, to the experienced, an obvious waste of money. I was looking at some of the free search engine submission tools available recently, and a great many of them are simply lead generation tools to harvest your email address.

You put your website into the tool along with your email address, and this company has now been alerted that you need SEO services.

If you’re lucky, you’ll only receive the odd unsolicited email.

If you’re unlucky, you’ve just been added to a spam list, to be forever bombarded with low-quality SEO services, directory submission services, Penguin-proof link building strategies and emails telling you about undetectable link networks and the latest traffic-exploding WordPress plugin.

If you are being told about most of these services in an unsolicited manner – there’s nothing private or undetectable about them.

If you do use any of them – point those links at a minisite (and not your main site), because they won’t last forever (if they work at all), and Google is getting serious about penalising sites that buy and sell links using these services.

Search engine submission has not been a priority for most SEOs I mix with, because they know the search engines that matter (and are relevant) do a good job of finding a new site by themselves – and there are many documented ways of getting a page into Google, Yahoo and Bing, for instance. You do not need any company to do this for you – for instance, there is pretty decent documented evidence (and observations made) that Google Plus can be helpful in getting your site into Google.

You can submit your site to search engines yourself, for free, anytime you want. Your site should be plugged into relevant search engine offerings (like Google Webmaster Tools and Bing Webmaster Tools, Google Places and Google Plus, for a start).

I don’t ever recall a decent site not being indexed pretty quickly by all the main search engines, and that’s without bothering even with the above services from the engines.

Invest your money instead in building the online reputation of your website and making your pages richer and more relevant, and seek to keep people on your site when you get them. Don’t submit your website to sites where you do not know where the link will appear, or you are probably in unnatural links territory.

You can use our free SEO tool to quickly audit your website and check against some SEO best practices.

Why Is My Website Not Listed in Google Search WEEKS or MONTHS After Submitting It?

If your website is not in Google (at all) after a few weeks (or months) of doing the above on a few sites (don’t spam), then something might be wrong. Instructions in HTML may be blocking Google from indexing your website, for instance.
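The two usual suspects are a leftover noindex robots meta tag in a page’s HTML, or a blanket disallow in your robots.txt file – sketches of both, with example.com as a placeholder:

<!-- In the <head> of a page: tells Google NOT to index that page -->
<meta name="robots" content="noindex" />

# In robots.txt at https://www.example.com/robots.txt: blocks ALL compliant crawlers from the whole site
User-agent: *
Disallow: /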

You can FETCH THE WEBSITE AS GOOGLE in Google Webmaster Tools to make sure everything is OK.

If Google can access your site, you probably need to review your content and make sure your website complies with Webmaster Guidelines.

Now you’ve learned how to submit a website to search engines, the real fun begins!

Check out our SEO tutorial for beginners here.

Find out how to check your rankings in search engines, and the best SEO tools you should be using.

Can Social Media Links Get My Site Indexed by Search Engines?

Probably.

If you do a lot of it.

Google certainly spiders Twitter, Facebook (when it can) and Google Plus – and links on these websites often spread to other places – so creating a profile on these websites sometimes works to help a website get into these search engines.

I have also read that Bing spiders Facebook pages too (they have an agreement with FB). Social media links are often ‘nofollowed’, though, so they often do not count as a ‘vote’ to your website, so I would not rely on them.

How Do Search Engines Work Out Where My Website Ranks?

Before you try to add your site to the search engines, you should understand what they look for when they decide how to rank your site.

Just because your website is listed in Google doesn’t mean you’ll get traffic. You need to make sure your site is search engine ready, or ‘search engine friendly’.

The general understanding is that most engines use a “formula” to determine intent, keyword relevancy, online reputation and site quality.

The technical term for this is an “algorithm”, and each search engine has its own unique algorithm(s) that it uses to rank pages.

Generally, this ‘magic’ formula consists of your page title, your text content, and the number and quality of links pointing back to your site, among other things.

Some say Google uses over 200 signals to rank a page, all weighted differently, at any one time – and Google likes this flux. It makes manipulating their index more difficult.

It’s important to note that every engine is different.

Some may look at meta tags, some may ignore them and just place emphasis on your body content. These days, most metadata is becoming less and less important, and often ignored.

Because of spam abuse, many search engines no longer use these tags to help rank pages, but you should still include them, because search engines do use them to display information about your site.

For example, whatever is inside the title tag is often, but not always, what the search engine will use as the title of your page in the search results snippet.

The easiest way to get into all the major search engines is to be linked to from a site that already appears in, for instance, Google, Yahoo and Bing – and social signals certainly play a part in the discovery of new pages and sites.
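A quick sketch of the two tags involved, with placeholder text:

<head>
  <!-- Often used as the clickable headline of your search snippet -->
  <title>Widget Repair in Glasgow | Example Company</title>
  <!-- Largely ignored for ranking these days, but often used as the snippet description -->
  <meta name="description" content="Fast, affordable widget repair in Glasgow. Free quotes." />
</head>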

How Do I Improve My Ranking Position in Google?

There are a few ways I can think of to get into Google and improve your rankings.

The best way to improve your rankings is to get a link from a reputable website to your website. Yes, that’s all you need to do. Google likes finding new sites “itself”, and it does this by spidering links from website to website, eventually ranking pages on the quality and number of those links.

If you want to improve your rankings, try and get a link from an “authority” site in your industry, or exchange a few links with other businesses or suppliers in your line of work, perhaps not direct competitors. One link from a very well trusted site in your niche can improve how much trust Google has in your website, but they want you to create ‘buzz’ about your site, rather than build links in a mechanical fashion. Editorial links from media sites also still carry a lot of weight in 2017.

If you cannot think of any way to get a link, you can add your site to Google directly, via the links above, or via Google Webmaster Tools, Google Plus and Google Places.

Remember to learn the search engines’ rules before you start, and read on if you are interested in learning more about SEO.

Will Google Find New Pages on My Site?

Don’t worry about telling Google about new pages on your site once your site is indexed and appearing in Google. Google is very good at finding new content. Google loves fresh content, and if you publish lots of it on your blog, and it’s of a decent quality, Google will visit your site often, and rank your content for free.

A while back I tested what happens when publishing NEW content to a website already IN Google. I know (generally) what to expect, but it’s always good to have a peek every now and again. I use WordPress, RSS and Feedburner (pinging Google Blog Search) to instantly syndicate the Hobo articles, so I expect a few things to happen:

  1. Get into Google SERPs in a few minutes
  2. Disappear for a while again
  3. Come back into the SERPs & stabilise
  4. Get Cached by Google
  5. Rank

So how long did it take Google to index my new page when I pressed publish, and what happened to the page in the SERPs?

  • My Feedreader (instantly – though there can be a delay sometimes)
  • Google Blog Search (15 minutes)
  • Google SERPs (30 minutes – so Google knows about it)
  • 1 hour UK position 25
  • 12 Hours later top 10 UK
  • Cache accessible within about 18 hours – I can access it using the info: command, but the cache is not available in the SERPs yet
  • A few days later Google dropped it from the SERPs as expected. Today it seems to be back on page 2 of Google.
  • During this period, the page is not even returned by a site: command, but it is returned by the info: command.

This is all down to the way Google handles CERTAIN fresh, up-to-date content – see QDF (Query Deserves Freshness). If you get a handle on it, you see why a little online business authority, and being first with the news, is KEY to getting traffic out of Google.

I’ve got 12,000 visits from Google alone in just over 24 hours fiddling with that – but you’ve got to be fast out of the traps. This is also why publishing fresh content on your site is seen to be such an important strategy for SEO success.

Those rankings above stabilised on the first page, and longer-term ranking success will ultimately depend a lot on the ‘authority’ or ‘domain trust’ of this site, the relevance of my page title, the content and, over time, how well linked the page is within this site (which a lot of people forget about), as I don’t really expect too many incoming links to the page. Of course, rankings will also depend on how well ‘optimised’ the competition is for the term.

See here for more Geek stuff on QDF.

Note – In a recent test in October 2014, I saw an article indexed within 1 second (albeit with an incorrect timestamp of 3 hours caused by differences in time formats/geolocation settings)

How To Put Your Company Logo On Google SERPs

You can now put your company logo on search results using simple code, placed on the home page of your website:

Today, we’re launching support for the schema.org mark-up for organization logos, a way to connect your site with an iconic image. We want you to be able to specify which image we use as your logo in Google search results. Using schema.org Organization mark-up, you can indicate to our algorithms the location of your preferred logo. For example, a business whose homepage is www.example.com can add the following mark-up using visible on-page elements on their homepage. This example indicates to Google that this image is designated as the organization’s logo image for the homepage also included in the mark-up, and, where possible, may be used in Google search results. Mark-up like this is a strong signal to our algorithms to show this image in preference over others, for example when we show Knowledge Graph on the right hand side based on users’ queries.

– GOOGLE


Very easy to implement – just add the following code to your home page.

<div itemscope itemtype="http://schema.org/Organization">
 <a itemprop="url" href="https://www.example.com/">Home</a>
 <img itemprop="logo" src="https://www.example.com/logo.png" />
</div>
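If you prefer, the same information can be expressed in JSON-LD, which Google also accepts for structured data – a sketch using the same placeholder URLs:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>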

It’s probably past time to implement some more of this, too, and feed the Google beast:

http://schema.org/Organization

How Do I Submit a URL To Google Local Business Results?

Local SEO rankings are powered (in part) by Google’s local business directory, and can be improved by submitting your site to Google and telling them everything about your company.

Local business results for your business are only shown when searchers are at a location near your business, or are specifically searching for your service in your area. Google will promote the most popular, most connected, most relevant and most trusted businesses in the local area to the top of the SERPs. Sometimes this is in ‘local box’ style listings; other times, Google will merge local results with global results (since the advent of the VENICE algorithm change).

Submit Your Site to Google Maps, Google Places for Business and Google+ Local

Google Local Business listings – now known as Google Places for Business and Google+ Local (leading to lots of confusion) – get your business (and website) featured for various keyword phrases, ESPECIALLY in your local area, as well as for your business name, and include a map to your business and contact details.

It’s FREE, so you’re crazy not to do it.

How To Improve Your Ranking in Google Local Business Results

In my experience, giving Google as much information as possible about your business is just about the best policy. Get listed in as many local business directories as possible. Encouraging positive reviews on your profiles also improves your standing in this index. Local business rankings are calculated differently from natural results, but are often BLENDED with them (if Google can detect your location).
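Another way of handing Google structured information about your business is schema.org LocalBusiness mark-up on your own site – a sketch in JSON-LD, with every detail a placeholder to swap for your real name, address and phone number:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "telephone": "+44 141 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Glasgow",
    "postalCode": "G1 1AA",
    "addressCountry": "GB"
  }
}
</script>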

Best Practices for Google Places (From Google)

  • Only business owners or authorized representatives may claim their business listings on Google Maps.
  • Use a shared, business email account, if multiple users will be updating your business listing.
  • If possible, use an email account with a domain that matches your business URL. For example, if your business website is www.hobo-web.co.uk, a matching email address would be me@hobo-web.co.uk.
  • The business name on Google Maps must be your full legal business name.
  • Do not attempt to manipulate search results by adding extraneous keywords or a description of your business into the business name.
  • Do not include phone numbers or URLs in the business name.
  • The address should look exactly the way you’d write it on a paper mailing envelope. If your business services several areas, you can create one listing under a PO Box.
  • Do not create listings at locations where the business does not physically exist.
  • PO Boxes do not count as physical locations.
  • Do not create more than one listing for each business location, either in a single account or multiple accounts.
  • Businesses that operate in a service area as opposed to a single location should not create a listing for every city they service. Service area businesses should create one listing for the central office of the business only.
  • Businesses with special services, such as law firms and doctors, should not create multiple listings to cover all of their specialties.
  • The precise address for the business must be provided in place of broad city names or cross-streets.
  • A property for rent is not considered a place of business. Please create one listing for the central office that processes the rentals.
  • Provide a phone number that connects to your individual business location as directly as possible. For example, you should provide an individual location phone number in place of a call center.
  • Do not provide phone numbers or URLs that redirect or ‘refer’ users to other landing pages or phone numbers other than those of the actual business.
  • Provide one URL that best identifies your individual business location.
  • Do not provide URLs that redirect or ‘refer’ users to other landing pages or phone numbers other than those of the actual business.
  • Use the description and custom attribute fields to include additional information about your listing. This type of content should never appear in your business’s title, address or category fields.
  • Please see this page of the LBC User Guide for examples of acceptable custom attributes.

If you are a small business or a brand new business just starting out, you should claim your listing. It’s 100% free and gives you a measure of increased visibility on Google if folks are looking for your type of business in your area. Here’s a link to more information on Google Places for Business https://www.google.com/local/business/ and Google Plus https://plus.google.com/. Google will very probably be using its products to find new pages (and that includes the Chrome browser and maybe even Gmail).

SEO Tutorial For Beginners in 2017

What is SEO?

Search Engine Optimisation in 2017 is a technical, analytical and creative process to improve the visibility of a website in search engines. Its primary function is to drive more visits to a site that convert into sales.

The free SEO tips you will read on this page will help you create a successful SEO friendly website yourself.

I have over 15 years’ experience making websites rank in Google. If you need optimisation services – see my SEO audit or small business SEO services.

An Introduction

This article is a beginner’s guide to effective white hat SEO.

I deliberately steer clear of techniques that might be ‘grey hat’, as what is grey today is often ‘black hat’ tomorrow, as far as Google is concerned.

No one-page guide can explore this complex topic in full. What you’ll read here are answers to questions I had when I was starting out in this field.

The ‘Rules.’

Google insists webmasters adhere to their ‘rules’ and aims to reward sites with high-quality content and remarkable ‘white hat’ web marketing techniques with high rankings.

Conversely, it also needs to penalise websites that manage to rank in Google by breaking these rules.

These rules are not ‘laws’, but ‘guidelines’ for ranking in Google, laid down by Google. You should note, however, that some methods of ranking in Google are, in fact, illegal. Hacking, for instance, is illegal in the UK and US.

You can choose to follow and abide by these rules, bend them or ignore them – all with different levels of success (and levels of retribution, from Google’s web spam team).

White hats do it by the ‘rules’; black hats ignore the ‘rules’.

What you read in this article is perfectly within the laws and also within the guidelines and will help you increase the traffic to your website through organic, or natural search engine results pages (SERPs).

Definition

There are a lot of definitions of SEO (spelled Search engine optimisation in the UK, Australia and New Zealand, or search engine optimization in the United States and Canada) but organic SEO in 2017 is mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK):

Graph: The Top 5 Search Engines in the UK from 2015-2016; Date: 2016-04-06

Opportunity

The art of web SEO lies in understanding how people search for things and understanding what type of results Google wants to (or will) display to its users. It’s about putting a lot of things together to look for opportunity.

A good optimiser has an understanding of how search engines like Google generate their natural SERPs to satisfy users’ navigational, informational and transactional keyword queries.

Risk Management

A good search engine marketer has a good understanding of the short term and long term risks involved in optimising rankings in search engines, and an understanding of the type of content and sites Google (especially) WANTS to return in its natural SERPs.

The aim of any campaign is more visibility in search engines and this would be a simple process if it were not for the many pitfalls.

There are rules to be followed or ignored, risks to take, gains to make, and battles to be won or lost.

Free Traffic


A Mountain View spokesman once called the search engine ‘kingmakers‘, and that’s no lie.

Ranking high in Google is VERY VALUABLE – it’s effectively ‘free advertising’ on the best advertising space in the world.

Traffic from Google natural listings is STILL the most valuable organic traffic to a website in the world, and it can make or break an online business.

The state of play, in 2017, is that you can STILL generate highly targeted leads, for FREE, just by improving your website and optimising your content to be as relevant as possible for a buyer looking for your company, product or service.

As you can imagine, there’s a LOT of competition now for that free traffic – even from Google (!) in some niches.

You shouldn’t compete with Google. You should focus on competing with your competitors.

The Process

The process can be practised, successfully, in a bedroom or a workplace, but it has traditionally always involved mastering many skills as they arose, including diverse marketing technologies such as:

  • Website design
  • Accessibility
  • Usability
  • User experience
  • Website development
  • PHP, HTML, CSS, etc.
  • Server management
  • Domain management
  • Copywriting
  • Spreadsheets
  • Backlink analysis
  • Keyword research
  • Social media promotion
  • Software development
  • Analytics and data analysis
  • Information architecture
  • Research
  • Log Analysis
  • Looking at Google for hours on end

It takes a lot, in 2017, to rank a page on merit in Google in competitive niches.

User Experience

The big stick Google is hitting every webmaster with (at the moment, and for the foreseeable future) is the ‘QUALITY USER EXPERIENCE‘ stick.

If you expect to rank in Google in 2017, you’d better have a quality offering, not based entirely on manipulation, or old school tactics.

Is a visit to your site a good user experience?

If not – beware manual ‘Quality Raters’, and beware the Google Panda / Site Quality algorithms that are on the lookout for a poor user experience.

Google raising the ‘quality bar’, year on year, ensures a higher level of quality in online marketing in general (above the very low quality we’ve seen over recent years).

Success online involves investment in higher-quality on-page content, website architecture, usability, a sensible conversion optimisation balance, and promotion.

If you don’t take that route, you’ll find yourself chased down by Google’s algorithms at some point in the coming year.

This ‘what is SEO’ guide (and this entire website) is not about churn-and-burn Google SEO (known as webspam to Google), as that is too risky to deploy on a real business website in 2017.

What Is A Successful Strategy?

Get relevant. Get trusted. Get Popular.

It is no longer just about manipulation in 2017.

It’s about adding quality and often useful content to your website that together meet a PURPOSE that delivers USER SATISFACTION.

If you are serious about getting more free traffic from search engines, get ready to invest time and effort in your website and online marketing.

Quality Signals

Google wants to rank QUALITY documents in its results, and force those who wish to rank high to invest in higher-quality content or great service that attracts editorial links from reputable websites.

If you’re willing to add a lot of great content to your website, and create buzz about your company, Google will rank you high.

If you try to manipulate Google, it will penalise you for a period, and often until you fix the offending issue – which we know can LAST YEARS.

Backlinks in general, for instance, are STILL weighed FAR too positively by Google and they are manipulated to drive a site to the top positions – for a while. That’s why blackhats do it – and they have the business model to do it. It’s the easiest way to rank a site, still today.

If you are a real business who intends to build a brand online – you can’t use black hat methods. Full stop.

Fixing the problems will not necessarily bring organic traffic back as it was before a penalty.

Recovery from a Google penalty is a ‘new growth’ process as much as it is a ‘clean-up’ process.

Google Rankings Are In Constant Ever-Flux

It’s Google’s job to MAKE MANIPULATING SERPs HARD.

So – the people behind the algorithms keep ‘moving the goalposts’, modifying the ‘rules’ and raising ‘quality standards’ for pages that compete for top ten rankings.

In 2017 – we have ever-flux in the SERPs – and that seems to suit Google and keep everybody guessing.

Google is very secretive about its ‘secret sauce’ and offers sometimes helpful and sometimes vague advice – and some say offers misdirection – about how to get more valuable traffic from Google.

Google is on record as saying the engine is intent on ‘frustrating’ search engine optimisers’ attempts to improve the amount of high-quality traffic to a website – at least (but not limited to) where those attempts use low-quality strategies classed as web spam.

At its core, Google search engine optimisation is still about KEYWORDS and LINKS. It’s about RELEVANCE, REPUTATION and TRUST. It is about QUALITY OF CONTENT & VISITOR SATISFACTION.

A Good USER EXPERIENCE is a key to winning – and keeping – the highest rankings in many verticals.

Relevance, Authority & Trust

Web page optimisation is about making a web page relevant and trusted enough to rank for a query.

It’s about ranking for valuable keywords for the long term, on merit. You can play by the ‘white hat’ rules laid down by Google, and aim to build this Authority and Trust naturally, over time, or you can choose to ignore the rules and go full-time ‘black hat’.

MOST SEO tactics still work, for some time, on some level, depending on who’s doing them, and how the campaign is deployed.

Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, then it will class you as a web spammer, and your site will be penalised (you will not rank high for relevant keywords).

These penalties can last years if not addressed, as some penalties expire and some do not – and Google wants you to clean up any violations.

Google does not want you to try and modify where you rank, easily. Critics would say Google would prefer you paid them to do that using Google Adwords.

The problem for Google is – ranking high in Google organic listings is a real social proof for a business, a way to avoid PPC costs and still, simply, the BEST WAY to drive VALUABLE traffic to a site.

It’s FREE, too, once you’ve met the always-increasing criteria it takes to rank top.

‘User Experience’ Matters

Is User Experience A Ranking Factor?

User experience is mentioned 16 times in the main content of the Quality Raters Guidelines (official PDF), but we have been told by Google it is not, per se, a classifiable ‘ranking factor’ on desktop search, at least.

On mobile, sure, since UX is the base of the mobile friendly update. On desktop currently no. (Gary Illyes: Google, May 2015)

While UX, we are told, is not literally a ‘ranking factor’, it is useful to understand exactly what Google calls a ‘poor user experience’ because if any poor UX signals are identified on your website, that is not going to be a healthy thing for your rankings anytime soon.

Matt Cutts’ consistent SEO advice was to focus on a satisfying user experience.

What is Bad UX?

For Google – rating UX, at least from a quality rater’s perspective, revolves around marking the page down for:

  • Misleading or potentially deceptive design
  • Sneaky redirects (cloaked affiliate links)
  • Malicious downloads
  • Spammy user-generated content (unmoderated comments and posts)
  • Low-quality MC (main content of the page)
  • Low-quality SC (supplementary content)

What is SC (supplementary content)?

When it comes to a web page and positive UX, Google talks a lot about the functionality and utility of Helpful Supplementary Content – e.g. helpful navigation links for users (that are not, generally, MC or Ads).

Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.

To summarize, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. We have different standards for small websites which exist to serve their communities versus large websites with a large volume of webpages and content. For some types of “webpages,” such as PDFs and JPEG files, we expect no SC at all.

It is worth remembering that Good SC cannot save Poor MC (“Main Content is any part of the page that directly helps the page achieve its purpose“.) from a low-quality rating.

Good SC seems to certainly be a sensible option. It always has been.

Key Points about SC

  1. Supplementary Content can be a large part of what makes a High-quality page very satisfying for its purpose.
  2. Helpful SC is content that is specifically targeted to the content and purpose of the page.
  3. Smaller websites such as websites for local businesses and community organizations, or personal websites and blogs, may need less SC for their purpose.
  4. A page can still receive a High or even Highest rating with no SC at all.

Here are the specific quotes containing the term SC:

  1. Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.
  2. SC is created by Webmasters and is an important part of the user experience. One common type of SC is navigation links which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.
  3. SC which contributes to a satisfying user experience on the page and website. – (A mark of a high-quality site – this statement was repeated 5 times)
  4. However, we do expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.
  5. However, some pages are deliberately designed to shift the user’s attention from the MC to the Ads, monetized links, or SC. In these cases, the MC becomes difficult to read or use, resulting in a poor user experience. These pages should be rated Low.
  6. Misleading or potentially deceptive design makes it hard to tell that there’s no answer, making this page a poor user experience.
  7. Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these “sneaky redirects.” Sneaky redirects are deceptive and should be rated Lowest.
  8. However, you may encounter pages with a large amount of spammed forum discussions or spammed user comments. We’ll consider a comment or forum discussion to be “spammed” if someone posts unrelated comments which are not intended to help other users, but rather to advertise a product or create a link to a website. Frequently these comments are posted by a “bot” rather than a real person. Spammed comments are easy to recognize. They may include Ads, download, or other links, or sometimes just short strings of text unrelated to the topic, such as “Good,” “Hello,” “I’m new here,” “How are you today,” etc. Webmasters should find and remove this content because it is a bad user experience.
  9. The modifications make it very difficult to read and are a poor user experience. (Lowest quality MC (copied content with little or no time, effort, expertise, manual curation, or added value for users))
  10. Sometimes, the MC of a landing page is helpful for the query, but the page happens to display porn ads or porn links outside the MC, which can be very distracting and potentially provide a poor user experience.
  11. The query and the helpfulness of the MC have to be balanced with the user experience of the page.
  12. Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.

In short, nobody is going to advise you to create a poor UX, on purpose, in light of Google’s algorithms and human quality raters, who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as a frustrating UX – although on certain levels, what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.

Google is still, evidently, more interested in rating the main content of the webpage in question and the reputation of the domain the page is on – relative to your site, and competing pages on other domains.

A satisfying UX can help your rankings, with second-order factors taken into consideration. A poor UX can seriously impact your human-reviewed rating, at least. Google’s punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria, e.g. a lack of reputation, or old-school SEO stuff like keyword stuffing a site.

If you are improving user experience by focusing primarily on the quality of the MC of your pages, and avoiding – even removing – old-school SEO techniques, those certainly are positive steps to getting more traffic from Google in 2017 – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.

Balancing Conversions With Usability & User Satisfaction

Take pop-up windows or pop-unders as an example:

According to usability expert Jakob Nielsen, 95% of website visitors hated unexpected or unwanted pop-up windows, especially those that contain unsolicited advertising.

In fact, Pop-Ups have been consistently voted the Number 1 Most Hated Advertising Technique since they first appeared many years ago.

Accessibility students will also agree:

  • Creating a new browser window should be under the authority of the user.
  • Pop-up new windows should not clutter the user’s screen.
  • All links should open in the same window by default. (An exception, however, may be made for pages containing a links list. It is convenient in such cases to open links in another window, so that the user can come back to the links page easily. Even in such cases, it is advisable to give the user prior notice that links will open in a new window.)
  • Tell visitors they are about to invoke a pop-up window (using the link’s title attribute – see the sketch after this list).
  • Pop-up windows do not work in all browsers.
  • They are disorienting for users.
  • Provide the user with an alternative.
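For the ‘tell visitors’ point above, a sketch of a link that gives that prior notice (the title attribute carries the warning; rel="noopener" is a general safety measure for target="_blank" links):

<a href="https://www.example.com/resources/"
   target="_blank"
   rel="noopener"
   title="Opens in a new window">Resources (opens in a new window)</a>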

It is inconvenient for usability aficionados to hear that pop-ups can be used successfully to vastly increase signup subscription conversions.

EXAMPLE: TEST With Using A Pop Up Window

Pop-ups suck; everybody seems to agree. Here’s the little test I carried out on a subset of pages, an experiment to see if pop-ups work on this site to convert more visitors to subscribers.

I tested it out when I didn’t blog for a few months and traffic was very stable.

Results:

Testing Pop Up Windows Results

  Day   Wk1 (Pop-Up On)   Wk2 (Pop-Up Off)   % Change
  Mon   46                20                 173%
  Tue   48                23                 109%
  Wed   41                15                 173%
  Thu   48                23                 109%
  Fri   52                17                 206%

That’s a fair increase in email subscribers across the board in this small experiment on this site. Using a pop up does seem to have an immediate impact.

I have since tested it on and off for a few months and the results from the small test above have been repeated over and over.

I’ve tested different layouts and different calls to action without pop-ups, and they work too, to some degree, but they typically take a bit longer to deploy than activating a plugin.

I don’t really like pop-ups as they have been an impediment to web accessibility but it’s stupid to dismiss out-of-hand any technique that works. I’ve also not found a client who, if they had that kind of result, would choose accessibility over sign-ups.

I don’t really use the pop up on days I post on the blog, as in other tests, it really seemed to kill how many people share a post in social media circles.

With Google now showing an interest in interstitials, I would be very nervous about employing a pop-up window that obscures the primary reason for visiting the page. If Google detects dissatisfaction, I think that would be very bad news for your rankings.

I am, at the moment, using an exit-strategy pop-up window: hopefully, by the time a user sees this device, they are FIRST satisfied with the content they came to read. I can recommend this as a way to increase your subscribers, at the moment, with a similar conversion rate to pop-ups – if NOT BETTER.

I think, as an optimiser, it is sensible to convert customers without using techniques that potentially negatively impact Google rankings.

Do NOT let conversion get in the way of the PRIMARY reason a visitor is CURRENTLY on ANY PARTICULAR PAGE, or you risk Google detecting relative dissatisfaction with your site, and that is not going to help you as Google’s RankBrain gets better at working out what ‘quality’ actually means.

Google Wants To Rank High-Quality Websites

Google has a history of classifying your site as some type of entity, and whatever that is, you don’t want a low-quality label on it. Put there by algorithm or human. Manual evaluators might not directly impact your rankings, but any signal associated with Google marking your site as low-quality should probably be avoided.

If you are making websites to rank in Google without unnatural practices, you are going to have to meet Google’s expectations in the Quality Raters Guidelines (PDF).

Google says:

Low-quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well.

‘Sufficient Reason’

There is ‘sufficient reason’ in some cases to immediately mark the page down on some areas, and Google directs quality raters to do so:

  • An unsatisfying amount of MC is a sufficient reason to give a page a Low-quality rating.
  • Low-quality MC is a sufficient reason to give a page a Low-quality rating.
  • Lacking appropriate E-A-T is sufficient reason to give a page a Low-quality rating.
  • Negative reputation is sufficient reason to give a page a Low-quality rating.

What are low-quality pages?

When it comes to defining what a low-quality page is, Google is evidently VERY interested in the quality of the Main Content (MC) of a page:

Main Content (MC)

Google says MC should be the ‘main reason a page exists’.

  • The quality of the MC is low.
  • There is an unsatisfying amount of MC for the purpose of the page.
  • There is an unsatisfying amount of website information.

POOR MC & POOR USER EXPERIENCE

  • This content has many problems: poor spelling and grammar, complete lack of editing, inaccurate information. The poor quality of the MC is a reason for the Lowest+ to Low rating. In addition, the popover ads (the words that are double underlined in blue) can make the main content difficult to read, resulting in a poor user experience.
  • Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.

DESIGN FOCUS NOT ON MC

  • If a page seems poorly designed, take a good look. Ask yourself if the page was deliberately designed to draw attention away from the MC. If so, the Low rating is appropriate.
  • The page design is lacking. For example, the page layout or use of space distracts from the MC, making it difficult to use the MC.

MC LACK OF AUTHOR EXPERTISE

  • You should consider who is responsible for the content of the website or content of the page you are evaluating. Does the person or organization have sufficient expertise for the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.
  • There is no evidence that the author has medical expertise. Because this is a YMYL medical article, lacking expertise is a reason for a Low rating.
  • The author of the page or website does not have enough expertise for the topic of the page and/or the website is not trustworthy or authoritative for the topic. In other words, the page/website is lacking E-A-T.

After page content, the following are given the most weight in determining if you have a high-quality page.

POOR SECONDARY CONTENT

  • Unhelpful or distracting SC that benefits the website rather than helping the user is a reason for a Low rating.
  • The SC is distracting or unhelpful for the purpose of the page.
  • The page is lacking helpful SC.
  • For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating

DISTRACTING ADVERTISEMENTS

  • For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells bathing suits, however, an extremely distracting and graphic porn ad may warrant a Low rating.

GOOD HOUSEKEEPING

  • If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.
  • The website is lacking maintenance and updates.

SERP SENTIMENT & NEGATIVE REVIEWS

  • Credible negative (though not malicious or financially fraudulent) reputation is a reason for a Low rating, especially for a YMYL page.
  • The website has a negative reputation.

LOWEST RATING

When it comes to Google assigning your page the lowest rating, you are probably going to have to go some way to hit this, but it gives you a direction you want to ensure you avoid at all costs.

Google says throughout the document, that there are certain pages that…

should always receive the Lowest rating

..and these are presented below. Note – these statements are spread throughout the raters’ document and are not listed the way I have listed them here. I don’t think any context is lost presenting them like this, and it makes it more digestible.

Anyone familiar with Google Webmaster Guidelines will be familiar with most of the following:

  • True lack of purpose pages or websites.
    • Sometimes it is difficult to determine the real purpose of a page.
  • Pages on YMYL (Your Money or Your Life) websites with completely inadequate or no website information.
  • Pages or websites that are created to make money with little to no attempt to help users.
  • Pages with extremely low or lowest quality MC.
    • If a page is deliberately created with no MC, use the Lowest rating. Why would a page exist without MC? Pages with no MC are usually lack of purpose pages or deceptive pages.
    • Webpages that are deliberately created with a bare minimum of MC, or with MC which is completely unhelpful for the purpose of the page, should be considered to have no MC
    • Pages deliberately created with no MC should be rated Lowest.
    • Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
  • Pages on abandoned, hacked, or defaced websites.
  • Pages or websites created with no expertise or pages that are highly untrustworthy, unreliable, unauthoritative, inaccurate, or misleading.
  • Harmful or malicious pages or websites.
    • Websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.
    • Deceptive pages or websites. Deceptive webpages appear to have a helpful purpose (the stated purpose), but are actually created for some other reason. Use the Lowest rating if a webpage is deliberately created to deceive and potentially harm users in order to benefit the website.
    • Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
    • Sometimes, pages just don’t “feel” trustworthy. Use the Lowest rating for any of the following: Pages or websites that you strongly suspect are scams
    • Pages that ask for personal information without a legitimate reason (for example, pages which ask for name, birthdate, address, bank account, government ID number, etc.). Websites that “phish” for passwords to Facebook, Gmail, or other popular online services. Pages with suspicious download links, which may be malware.
  • Use the Lowest rating for websites with extremely negative reputations.

Websites ‘Lacking Care and Maintenance’ Are Rated ‘Low Quality’.

Sometimes a website may seem a little neglected: links may be broken, images may not load, and content may feel stale or outdated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.

“Broken” or Non-Functioning Pages Classed As Low Quality

I touched on 404 pages in my recent post about investigating why a site has lost traffic.

Google gives clear advice on creating useful 404 pages:

  1. Tell visitors clearly that the page they’re looking for can’t be found
  2. Use language that is friendly and inviting
  3. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site.
  4. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page.
  5. Think about providing a way for users to report a broken link.
  6. Make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested (see the sketch after this list)
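For point 6, a minimal sketch assuming an Apache server: the ErrorDocument directive serves your custom page while still returning a true 404 status. (A common mistake is redirecting missing URLs to the home page, which returns a ‘soft’ 200 instead.)

# In .htaccess (or the main Apache config):
# serves /404.html for missing URLs while still returning the 404 status code
ErrorDocument 404 /404.html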

Ratings for Pages with Error Messages or No MC

Google doesn’t want to index pages without a specific purpose or sufficient main content. A good 404 page and proper setup prevents a lot of this from happening in the first place.

Some pages load with content created by the webmaster, but have an error message or are missing MC. Pages may lack MC for various reasons. Sometimes, the page is “broken” and the content does not load properly or at all. Sometimes, the content is no longer available and the page displays an error message with this information. Many websites have a few “broken” or non-functioning pages. This is normal, and those individual non-functioning or broken pages on an otherwise maintained site should be rated Low quality. This is true even if other pages on the website are overall High or Highest quality.

Does Google programmatically look at 404 pages?

We are told NO in a recent hangout – but, in the Quality Raters Guidelines, “users probably care a lot”.

Do 404 Errors in Search Console Hurt My Rankings?

404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. – JOHN MUELLER

It appears this isn’t a one-size-fits-all answer. If you properly deal with mishandled 404 errors that have some link equity, you reconnect equity that was once lost – and this ‘backlink reclamation’ evidently has value.

The issue here is that Google introduces a lot of noise into that Crawl Errors report to make it unwieldy and not very user-friendly.

A lot of broken links Google tells you about can often be totally irrelevant and legacy issues. Google could make it instantly more valuable by telling us which 404s are linked to from only external websites.

Fortunately, you can find your own broken links on site using the myriad of SEO tools available.

I also prefer to use Analytics to look for broken backlinks on a site with some history of migrations, for instance.

John has clarified some of this before, although he is talking specifically (I think) about errors found by Google in Search Console (formerly Google Webmaster Tools):

  1. In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there’s a broken link on your site, in your page’s static HTML, then that’s always worth fixing
  2. What about the funky URLs that are “clearly broken?” When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those “URLs” and find a 404, that’s great and expected. We just don’t want to miss anything important

If you are making websites and want them to rank, the 2015 and 2014 Quality Raters Guidelines document is a great guide for Webmasters to avoid low-quality ratings and potentially avoid punishment algorithms.

Google Is Not Going To Rank Low-Quality Pages When It Has Better Options

If you have exact match instances of key-phrases on low-quality pages, mostly these pages won’t have all the compound ingredients it takes to rank high in Google in 2017.

I was working in this way long before I understood it well enough to write anything about it.

Here are a few examples of taking a standard page that did not rank for years and then turning it into a topic-oriented resource page designed around a user’s intent:

[Screenshots, 26 January 2016: ranking performance of pages reworked into topic-oriented resource pages]

Google, in many instances, would rather send long-tail search traffic, like users using mobile VOICE SEARCH, for instance, to high-quality pages ABOUT a concept/topic that explains relationships and connections between relevant sub-topics FIRST, rather than to only send that traffic to low-quality pages just because they have the exact phrase on the page.

Further Reading

Technical SEO

If you are doing a professional SEO audit for a real business, you are going to have to think like a Google Search Quality Rater AND a Google search engineer to provide real long term value to a client.


Google has a LONG list of technical requirements it advises you meet, on top of all the things it tells you NOT to do to optimise your website.

Meeting Google’s technical guidelines is no magic bullet to success – but failing to meet them can impact your rankings in the long run – and the odd technical issue can actually severely impact your entire site if rolled out across multiple pages.

The benefit of adhering to technical guidelines is often a second order benefit.

You don’t get penalised, or filtered, when others do. When others fall, you rise.

Mostly – individual technical issues will not be the reason you have ranking problems, but they still need to be addressed for any second-order benefit they provide.

When making a site for Google in 2017, you really need to understand that Google has a long list of things it will mark sites down for, and that’s usually old-school SEO tactics which are now classed as ‘web spam’.

Conversely, sites that are not marked down are not demoted and so improve in rankings. Sites with higher rankings pick up more organic links, and this process can float a high-quality page quickly to the top of Google.

So – the sensible thing for any webmaster is to NOT give Google ANY reason to DEMOTE a site. Tick all the boxes Google tells you to tick.

I have used this simple (but longer-term) strategy to rank on page 1, or thereabouts, for ‘SEO’ in the UK over the last few years, and to drive 100,000 relevant organic visitors to this site every month, to only about 70 pages, without building any links over the last few years (and very much working on it part-time):

[Screenshot, 17 March 2016: organic search traffic driven to this site]

What Is Domain Authority?


Domain authority, or as Google calls it, ‘online business authority’, is an important ranking factor in Google. What is domain authority? Well, nobody knows exactly how Google calculates popularity, reputation, intent or trust, outside of Google, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted – all of which can be faked, of course.

Most sites that have domain authority / online business authority have lots of links to them – that’s for sure – hence why link building has traditionally been such a popular tactic – and counting these links is generally how most third-party tools calculate a pseudo domain authority score, too.

Massive domain authority and ranking ‘trust’ was in the past awarded to very successful sites that have gained a lot of links from credible sources, and other online business authorities too.

Amazon has a lot of online business authority…. (Official Google Webmaster Blog)

SEOs more usually talk about domain trust and domain authority based on the number, type and quality of incoming links to a site.

Examples of trusted, authority domains include Wikipedia, the W3C and Apple. How do you become an OBA? Through building a killer online or offline brand or service with, usually, a lot of useful content on your site.

How do you take advantage of being an online business authority? Either you turn the site into an SEO Black Hole (only for the very biggest brands) or you pump out information – all the time, on any subject – because Google will rank it!

EXCEPT – If what you publish is deemed low quality and not suitable for your domain to have visibility on Google.

I think this ‘quality score’ Google has developed could be Google’s answer to this sort of historical domain authority abuse.

Can you (on a smaller scale, in certain niches) mimic an online business authority by recognising what OBAs do for Google, and why Google ranks them high in search results? These sites provide THE service, THE content, THE experience. This takes a lot of work and a lot of time to create, or even mimic.

In fact, as an SEO, I honestly think the content route is the only sustainable way for most businesses to try to achieve OBA, at least in their niche or locale. I concede a little focused link building goes a long way to help, and you have certainly got to get out there and tell others about your site…

Have other relevant sites link to yours. Google Webmaster Guidelines

Brands are how you sort out the cesspool.

“Brands are the solution, not the problem,” Mr. Schmidt said. “Brands are how you sort out the cesspool.”

Google CEO Eric Schmidt said this. Reading between the lines, I’ve long thought this is good SEO advice.

If you are a ‘brand’ in your space, or well-cited site, Google wants to rank your stuff at the top because it trusts you won’t spam it and fill results pages with crap and make Google look stupid.

That’s money just sitting on the table the way Google currently awards massive domain authority and trust to particular sites they rate highly.

Tip – keep content within your topic, unless, of course, you are producing high-quality content (i.e. content where the algorithms detect no unnatural practices).

I am always thinking:

“how do I get links from big KNOWN sites to my site. Where is my next quality link coming from?”

Getting links from ‘Brands’ (or well-cited websites) in niches can mean ‘quality links’.

Easier said than done, for most, of course, but that is the point.

But the aim with your main site should always be to become an online brand.

Does Google Prefer Big Brands In Organic SERPs?

Well, yes. It’s hard to imagine that a system like Google’s was not designed exactly over the last few years to deliver the listings it does today – and it IS filled with a lot of pages that rank high LARGELY because of the domain the content is on.

Big Brands have an inherent advantage in Google’s ecosystem, and that kind of sucks for small businesses. There are more small businesses than big brands for Google to get Adwords bucks out of, too.

That being said – small businesses can still succeed if they focus on a strategy based on depth, rather than breadth regarding how content is