Author: MySEOBlog

Why Content Isn’t The Only Thing That Matters


Content is important for SEO. This is a fact, and it isn’t news. There’s no getting away from it any more, and if you don’t have it there’s very little you can do to get your website ranking for your target terms. However, content is not the only thing you should be focusing on. You need to have a clear strategy behind what you’re doing with your blog posts and webpage text, and you also need to make sure the rest of the elements of your SEO campaign are in place. Content for the sake of content is almost certainly going to do very little for your success. Here are a few points to bear in mind when you’re making your plans.

Great content doesn’t automatically rank

You should be making sure your content is interesting, engaging, relevant, informative, full of different types of media, and all those great things. However, don’t expect it to rank on Google and boost your website just because it’s high quality. Without proper promotion on your website, on social media, via email and so on, progress is going to be extremely slow.

Not all content will be successful

Even if you consistently promote your content and make sure it’s all written to a similar standard, not everything is going to catch on. There’s simply too much competition out there for you to see the same success every time. Over 2 million blog posts are published every single day, and only around a quarter of them will ever earn even a single link back from another website. Just keep at it; when a post finally gets the balance right and takes off, it will make all your efforts worthwhile.

Links are still important

We know that links are still one of the absolute top factors Google looks at when deciding its rankings every time someone runs a query. If you want your content to succeed online and pull in plenty of relevant organic traffic that leads people to explore the rest of your website, it’s important to try to earn links back from other sources.

There are many ways to promote content

You can spread the word about your content through a variety of channels, and in fact this is the best way to get more engagement. Facebook, Twitter and Google+ are all good channels for sharing new posts if you want to boost your SEO. You can also use mailing lists to distribute your content, and use a news feed on your website to announce new posts. When you push out a new post, don’t make the mistake of only mentioning it once. If your content truly offers value to readers, you can share it again and remind your followers to check it out. Visibility drops off very quickly once you’ve shared on social media, so posting a different update with the same link can really help counteract this.

Overall you simply need to ensure you have a good strategy and you’re confident that the content you’re posting is in line with your SEO objectives. With a little effort you should be able to make your posts stand out from the crowd and help your website gain traction.

Continue Reading

Facebook’s New Local SEO Tool


Facebook has been an important platform for local businesses since it exploded in popularity over the last decade. Pages can be designated as local business pages, which allows you to add details about the industry you’re targeting, plus specific details like your opening hours, contact details and so on. This is all very helpful for people visiting your page, and a Facebook page that’s filled in to a high standard can rank well on Google search result pages.

However, until recently people weren’t that likely to find your page within Facebook itself. Its search function still leaves something to be desired, since it tries to take into account people, pages, groups and more while heavily customising the results shown on a personal basis. Facebook business pages also feature customer reviews, but since there was no centralised place to see them all, you were unlikely to use the social network to shop around for different businesses in your area.

Now Facebook has rolled out a tool which aims to tackle this problem, although it’s in the early stages of development according to their spokesperson and it hasn’t had any promotion yet. People have stumbled across the new area, Facebook Professional Services, when trying to optimise their pages. It effectively functions as a business directory, and is being interpreted as an attempt to muscle in on the market for review-based business listing websites such as Yell and Trustpilot.

The system is fairly simple at the moment. You can search for a business or service you’re interested in, but your query is limited to a predefined list of options. These are the categories that business pages can be placed under when you set one up, and you might find that the business you’re looking for (or your own business, if you’re creating a page) doesn’t fit neatly into any of them. You simply have to choose the closest match, an issue we expect to see fixed in a future version of the tool.

Once you select a search term, however, the tool becomes more sophisticated. It takes into account a range of variables when deciding how to rank businesses, such as geographical location, average customer rating and other unknown factors. For every Facebook user, the results will also be personalised depending on your own history with the business pages shown. For example, if you or your connections have previously interacted with the page it is likely to appear higher for you than it would for someone else.

At the moment, Facebook has essentially acknowledged that this tool isn’t ready yet. The fact that it launched without any fanfare was enough of a clue, but now that this has been confirmed we can assume Facebook will keep an eye out for early feedback and make adjustments before promoting the tool nationally or globally. For now it’s a good idea to get your page in order and see if you can encourage people to leave reviews for your business when they’ve had a good experience. Staying active on your company Facebook page is a good way to ensure you stay ahead of the curve, whatever changes the social network chooses to make later.

Continue Reading

Are Keywords Still Important?


Google is always trying to improve its algorithms, so it can more accurately judge the relevance of a particular webpage when someone enters a search query. The algorithm allows Google to instantly and automatically make an informed and calculated decision about what order to display thousands of results in. It’s pretty impressive that it can understand all the information on your website to make this judgement call, right?

Well, the answer to that is mixed. Yes, Google is making an extremely fast decision (and as you’ve probably noticed, it loves to brag about exactly how fast by displaying the time it took to show your personalised search results). However, it is not taking everything on your webpage into account the way a human would. It’s looking at particular things, which we normally refer to as ranking factors. We know that keywords have always been a major ranking factor: the text content on the page you’ve created needs to contain words that match the query a user typed into Google in order for that page to be shown on the search results page for that person.
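As a deliberately naive illustration of that keyword-matching idea (this is nothing like Google’s real algorithm, and the pages and query below are invented), you could imagine ranking a handful of pages simply by how often they contain the query’s words:

```python
# Toy illustration of keyword matching as a ranking factor.
# Nothing like Google's real algorithm: it just counts how many times
# the query's words appear in each page's text.

def keyword_score(query, page_text):
    query_terms = query.lower().split()
    words = page_text.lower().split()
    return sum(words.count(term) for term in query_terms)

pages = {  # invented example pages
    "/seo-guide": "A beginner's guide to SEO and how keywords affect your rankings.",
    "/contact": "Get in touch with our team by phone or email.",
}

query = "seo keywords"
for url in sorted(pages, key=lambda u: keyword_score(query, pages[u]), reverse=True):
    print(url, keyword_score(query, pages[url]))
```

Real ranking factors are far more subtle than this, but it shows why a page that never mentions the query terms struggles to appear for them.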

Continue Reading

What Rankbrain does and how SEO should respond

Google recently revealed it has been secretly testing a new element in its infamous algorithm, the set of codes and calculations that decide what order websites are ranked in when people run searches. It’s been nicknamed Rankbrain and the announcement confirms that the rollout is already complete, having been phased in throughout early 2015.

Now Rankbrain is fully implemented and it’s estimated that it’s involved with rankings for around 80% of search queries. “Google has actually said that Rankbrain is the third most important ranking factor now,” says Diana Esho, Managing Director of prominent Leicester SEO company, 123 Ranking. “Although it refuses to confirm what the top two factors actually are. It has been speculated that these two factors are text content (i.e. the words that are used on the website matching the search term) and inbound links from external sources.”

Continue Reading

Google still heading towards semantic search

Google, the world’s largest and best-known search engine, is soon to undergo yet more changes that will affect the way results are organised in the SERPs. This may wreak havoc in the SEO community, because many webmasters have already been adjusting their websites to the recent updates Google has released. On the other hand, this is something we have heard many times before. Is everything really going to be turned on its head?

In fact, the answer is probably not. Google has made its intentions obvious for several years now, presenting more facts and direct answers relating to the search query that the user performs. More and more different features have been added to the main search page so users are presented with a plethora of different displays when they search for queries that Google has a lot to say about. This was marked some time ago by renaming “Web” search to “All”, which reflects the fact that images, videos, maps and other interesting snippets of information tend to be displayed above organic text-based listings.

This system is making the keyword-based algorithm gradually redundant and less useful. The direction that Google is heading towards is called “Semantic Search.” With old priorities like PageRank becoming almost entirely irrelevant, Semantic Search uses the science of language and context to “understand” (albeit in a clunky, robotic way) queries and produce relevant, intuitive search results. The aim is to get to the desired answer in as few steps as possible, improving user experience and securing Google’s massive market share for the future.

Rankbrain was a major step for Google’s “artificial intelligence” development plans when it was rolled out in 2015, focusing on less literal interpretations of queries and attempting to understand context to provide answers that are not directly related to the words in the question. Rankbrain was built with the intention of feeding it historical information in batches and monitoring the results to ensure accuracy, so it’s still one step away from actually learning dynamically without the help of human test subjects. However, Google’s intention is to delve ever deeper into the field of semantic search and we can only expect this to go further in the next few years.

Continue Reading

Is your site over-optimised?

In the SEO world, backlinks are a great thing. Increasing the number of backlinks pointing toward your site will help improve your site’s visibility in the major search engines, such as Google, Bing, and Yahoo. Even though increasing your backlink count is good practice, where the backlinks come from can be an issue, especially if the backlinks are inorganic. For some time Google’s Penguin update has added considerable weight to backlink origin, so it’s important to check the backlink origins of your website if you want to improve your existing optimisation.

If this is overlooked, the algorithm may decide your website has used unfair techniques to get ranked, and as a result you might suffer the opposite effect – a penalty from Google. Originally Google denied that these penalties existed, but that was several years ago and its stance has long since changed. Search engines want to ensure nobody is cheating the system, and every major update brings them closer to picking up on any illicit tactics being employed by websites.

To check if your site is over-optimised to the point of a penalty, there are a few free tools you can turn to. Some of these tools will e-mail you a report of the number of bad or inorganic links pointing to your website. All you have to do is specify your website’s URL, along with the e-mail address you would like the report sent to, and the software will automatically review your website for you.

You will then be shown a comprehensive list of the inorganic backlinks, along with the anchor text those backlinks use, which will help you decide how to tidy up your current backlink profile. You might find that your website is classed as over-optimised, which is an indication that Google is likely to penalise you for the tactic. Cleaning up any grey or black hat SEO techniques that might have been acceptable a few years ago is always an important priority for websites looking to maintain good positions in the long term.
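If you want a rough sense of whether your anchor text profile looks over-optimised before handing things over to one of these tools, a quick script can help. The sketch below assumes you have already exported your backlinks to a CSV file with an anchor_text column (the file name, column name and 30% threshold are all illustrative assumptions, not figures Google has published):

```python
# Rough sketch: flag a backlink profile where one anchor text dominates.
# Assumes a hypothetical export "backlinks.csv" with an "anchor_text" column.
import csv
from collections import Counter

def anchor_distribution(path):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["anchor_text"].strip().lower()] += 1
    return counts

counts = anchor_distribution("backlinks.csv")
total = sum(counts.values())
for anchor, n in counts.most_common(5):
    share = n / total
    flag = "  <-- possibly over-optimised" if share > 0.3 else ""
    print(f"{anchor}: {n} links ({share:.0%}){flag}")
```

A profile where one exact-match phrase accounts for most of your links is exactly the kind of pattern Penguin was designed to pick up on.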

Continue Reading

The Mysteries Of URL Shorteners Explained

Over the past few years, URL shorteners have become a popular tool, especially with the rise of social networking as an easy way of sharing information that relies on short and snappy content. Many people have wondered how shortening links affects SEO, given the sheer number of links that are shortened on the Internet and the fact that search engines still rely on visible links to make ranking decisions. This question has been answered by Matt Cutts, head of the Webspam team at the search giant Google.

If you are unsure of what URL shorteners are, these are services that truncate lengthy URLs so that social networking users can easily fit a specific URL into the space a particular social network provides (e.g. a status message, link share, etc.). Contrary to popular belief, a shortened URL actually takes up the same number of characters in a Tweet as a URL of any other length. However, for visual and branding purposes it’s much better to tidy up your messages with shorter links. The URL shorteners most popular on the net today are Bit.ly, Goo.gl, and TinyURL.com.
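If you’re ever curious where a shortened link actually leads, you can resolve it yourself by following the redirect chain, which is all a shortener really does. A minimal sketch using only Python’s standard library (the short URL shown is just a placeholder):

```python
# Resolve a shortened URL by following its redirects to the final destination.
# Standard library only; the example short URL is a placeholder.
import urllib.request

def expand(short_url):
    # urlopen follows HTTP redirects (301/302) automatically,
    # so geturl() returns the final landing page's address.
    with urllib.request.urlopen(short_url) as response:
        return response.geturl()

print(expand("https://bit.ly/example"))  # substitute a real shortened link
```

Well-known shorteners like Bit.ly and Goo.gl use permanent (301) redirects, which is why they don’t interfere with how search engines treat the destination page.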

Since URL shorteners have become so popular, there has been a lot of confusion within the SEO world about whether they can help or hurt a site. Fortunately Google’s Matt Cutts gave a solid answer which has proved useful for those in the know.

Firstly, there are no penalties involved with Google rankings if you use URL shorteners. In short, you can use URL shortening tools however much you like and there will be no negative consequences for your positions. However, Google has pointed out that it would not be advisable to shrink links for submitting your business to a directory. For branding purposes, and to make it clear that everything is above board, it’s vital to include your full link for any listing submissions.

Overall the main point is that URL shrinking tools can be highly useful, and mainly for branding purposes when you’re considering a real user looking at your message and making a snap decision on whether to click through or not. From an SEO point of view, this is doing you no harm, but it’s not likely to do much good either, so if your link is for Google’s benefit you are better off not hiding it.

Continue Reading

How To Use Google’s Link Disavow Tool

If you are by any chance a directory editor, you probably receive many communications from directory submitters who would like their listing either updated or removed altogether. A lot of the removal requests are due to historic Google algorithm changes targeting linkspam, which caused a lot of confusion among webmasters in the past. To help clear up the issue, as well as providing a timely way for website owners to disassociate their site from another, the search giant Google added a “Disavow Tool” to its Webmaster Tools interface. If this is something you didn’t know about, you might find it extremely useful.

The reason why Google released the Link Disavow Tool has been explained in the paragraph below from this Google Help Centre article:

“If you’ve done as much work as you can to remove spammy or low-quality links from the web, and are unable to make further progress on getting the links taken down, you can disavow the remaining links. In other words, you can ask Google not to take certain links into account when assessing your site.”

Even though this is a great tool from Google to help you deal with unwanted links, it is highly recommended that you still go through the processes you would normally use to remove a specific link, such as e-mailing the web administrator to take the link down, or taking it down yourself if you have the option to. Only use this tool if you are really sure you need it, because it won’t always be as effective as completely removing an unwanted link through the proper channels. Also, be sure not to disavow links from your own website.
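When you do decide to use the tool, the file it accepts is plain text with one entry per line: either a full URL, or domain: followed by a domain name to disavow every link from that domain, with lines starting with # treated as comments. Here is a minimal sketch that builds such a file from lists you maintain (the domains and URLs are placeholders):

```python
# Build a disavow file in the format Google's tool accepts:
# one full URL or "domain:example.com" per line, "#" lines as comments.
# The domains and URLs below are placeholders for illustration.

spammy_domains = ["spammy-directory.example", "link-farm.example"]
spammy_urls = ["http://blog.example/low-quality-guest-post.html"]

lines = ["# Links we could not get removed through the proper channels"]
lines += [f"domain:{domain}" for domain in spammy_domains]
lines += spammy_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```

The resulting disavow.txt is what you upload through the Disavow Tool itself.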

In addition, it’s good to note that it may take some time for Google to disavow the links. It is recommended that you wait approximately two to three days after submitting a disavow request before submitting a reconsideration request. The information Google gathers from this tool will be incorporated into its index as it continuously recrawls the web, so if everything is done properly your website won’t suffer any penalties or ranking issues as a result of links you no longer want as part of your strategy.

Continue Reading

How to instantly pull up informative stats about a website

The web is filled with all sorts of tools to help you gain SEO-related knowledge about your website; however, I came across the tool below and found that it provides a lot of relevant information that can help your website succeed on the web. For this week’s tool on mySEOblog, I would like to bring your attention to StatMyWeb.

This tool provides up to date and relevant statistics for any site of your choice. StatMyWeb provides a coherent history for a website and detailed stats including the date it was created, information about its hosting provider, its estimated value and its general rankings on search engines.

When performing a lookup of a website on StatMyWeb, this quality website tool will display the following statistics:

  • Overview – Displays general SEO-related statistics such as Alexa Rank, PageRank, average load time, daily visitors, and daily pageviews (see the load-time sketch after this list for a rough do-it-yourself version). Some of these may now be slightly outdated factors, but it gives you an idea of what has been done previously for the SEO on the website.
  • Daily Visitors Country Map Analysis – A geographical visitor map broken down by country. Highlight any specific country to find out how many visitors to your site come from there.
  • Keywords Ranking at Good Position on Google Analysis – This will display a table showing what keywords from Google took organic search users to your site. This table is organised by keyword, impact factor, and query popularity.
  • Homepage Link Analysis – Counts how many internal and external links are on the website that you are evaluating.
  • W3C HTML Validation Analysis – Checks to see if your website has any W3C validation errors or warnings.
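As a rough do-it-yourself version of the load-time figure from the Overview stats above, you can simply time a request to the homepage yourself. This is a single sample rather than an average, and the URL below is only a placeholder:

```python
# Rough sketch: time how long a homepage takes to respond and download.
# One sample only (not an average); standard library only; placeholder URL.
import time
import urllib.request

def load_time(url):
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # include download time, not just the first byte
    return time.perf_counter() - start

print(f"{load_time('https://example.com/'):.2f} seconds")  # substitute your own site
```

Run it a few times and average the results yourself if you want something closer to what the tool reports.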

As we mentioned, it’s not necessarily the best idea to treat all the statistics you find using tools like this as concrete facts. Statistics tell you a lot about what a website has had going on in the past, but this isn’t always an indication of its current or future value. This is especially true if you’re only looking from an SEO point of view, because again we only really know what has helped websites to rank in the past. We can’t be certain which mid-to-low priority factors Google and its rivals are using, so if you’re looking for an advantage over your competition, take any conclusions drawn from tools like this with a pinch of salt, and keep an eye on the latest SEO news to judge which factors matter most today.

Continue Reading

How to review your website’s meta data

When creating a website (especially from scratch), it is always best to insert meta information into the header of your site’s HTML code, if that’s how you’re building it. This way, when search engines crawl your website, their crawlers will look at your meta information first to pick out keywords and phrases for your snippet. Meta information is also important to have when submitting your website to various link directories, so that the information can easily be pulled in for your directory listing.

If you are unsure of what meta information is, it’s the description and keywords placed in the meta tags within your website’s header. For more information about how to add meta tags to your website, please review this information from the W3C.

If you would like to view your website’s header information without having to dig through its HTML code, and you don’t have a CMS or a system that supports this, you can use a tool like MetaHeaders. This service will display the following information pertaining to your website:

  • A summary of your website’s hosting information, such as its IP, datacenter location, and web server software.
  • Meta tags, including robots, content type, keywords, refresh, cache control, pragma, and many more.
  • Headers including date, last modified, content length, content type, and other information.

This tool is also great to use to view your competitors’ meta information. This way, you can see what kind of information your competitor is using to draw in their customers through SEO and potentially take some inspiration, depending on how well they’re actually ranking. Meta data software won’t tell you everything you need to know to go about making your website more search engine friendly, but if you’re stuck you might find it gives you a valuable insight.
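If you’re comfortable with a little Python, you can also pull a page’s title and meta tags yourself using only the standard library. The sketch below covers just the title and meta tags, not the hosting details listed above, and the URL is a placeholder:

```python
# Minimal sketch: fetch a page and print its <title> and <meta> tags.
# Standard library only; the URL below is a placeholder.
import urllib.request
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.metas = []       # each <meta> tag's attributes as a dict
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            self.metas.append(dict(attrs))
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

url = "https://example.com/"  # substitute the site you want to review
with urllib.request.urlopen(url) as response:
    reader = MetaReader()
    reader.feed(response.read().decode("utf-8", errors="replace"))

print("Title:", reader.title.strip())
for meta in reader.metas:
    name = meta.get("name") or meta.get("property") or meta.get("http-equiv")
    print(name, "=", meta.get("content"))
```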

Continue Reading