Author: MySEOBlog

Is your site over-optimised?

In the SEO world, backlinks are a great thing. Increasing the number of backlinks pointing toward your site will help improve its visibility in the common search engines, such as Google, Bing, and Yahoo. Even though increasing your backlink count is good practice, where the backlinks come from can sometimes be an issue, especially if they are inorganic. For some time Google’s Penguin update has added considerable weight to backlink origin, so it’s important to check the backlink origins of your website if you want to improve your existing optimisation.

If this is overlooked, the algorithm may decide your website has used unfair techniques to get ranked, and as a result you might suffer the opposite effect – a penalty from Google. Google originally stated that these penalties did not exist, but that was several years ago and its stance has long since changed. Search engines want to ensure nobody is cheating the system, and every major update brings them closer to picking up on any illicit tactics being employed by websites.

To check if your site is over-optimised to the point of penalty, there are a few free tools you can turn to. Some of these tools will e-mail you a report detailing the number of bad or inorganic links pointing to your website. All you have to do is specify your website’s URL, along with the e-mail address where you would like the report sent, and the software will review your website for you automatically.

Following these steps, you will be shown a comprehensive list of the inorganic backlinks pointing to your site, along with the anchor text they use, which will help you decide how to tidy up your current backlink profile. If you find that your website is classed as over-optimised, this is an indication that Google is likely to penalise you for the tactic. Cleaning up any grey or black hat SEO techniques that might have been acceptable a few years ago should always be a priority for websites looking to maintain good positions in the long term.
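Most backlink tools also let you export the raw link data, and even a quick look at the anchor-text distribution can show whether a site leans too heavily on keyword-rich links. The sketch below is a minimal, hypothetical example: it assumes a CSV export named backlinks.csv with source_url and anchor_text columns, which is not the format of any particular tool.

```python
# Minimal sketch: summarise anchor-text distribution from a backlink export.
# Assumes a hypothetical CSV ("backlinks.csv") with "source_url" and
# "anchor_text" columns, similar to what many backlink tools can export.
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(10):
    # A heavily repeated, keyword-rich anchor can be a sign of over-optimisation.
    print(f"{anchor!r}: {count} links ({count / total:.1%})")
```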


The Mysteries Of URL Shorteners Explained

Over the past few years, URL shorteners have become a popular tool, especially with the rise of social networking as an easy way of sharing information that relies on short and snappy content. Many people have wondered how shortening links affects the SEO world, given the number of links that are shortened on the Internet and search engines’ continued reliance on visible links to make ranking decisions. This question has been answered by Matt Cutts, head of the Webspam team at the search giant Google.

If you are unsure what URL shorteners are, these are services that truncate lengthy URLs so that social networking users can easily fit a specific URL into the allotted space that a particular social network provides (e.g. a status message, link share, etc.). Contrary to popular belief, a shortened URL actually takes up the same number of characters in a Tweet as a URL of any other length. However, for visual and branding purposes it’s much better to tidy up your messages with shorter links. The URL shorteners most popular on the net today are Bit.ly, Goo.gl, and TinyURL.com.
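To make the mechanics a little more concrete, shorteners generally just answer with an HTTP redirect to the original address. The sketch below is a minimal example of following that redirect from Python; it assumes the third-party requests library is installed, and the short link shown is purely a placeholder.

```python
# Minimal sketch: follow a shortened URL's redirect to see where it points.
# Uses the third-party "requests" library; the short URL below is hypothetical.
import requests

short_url = "https://bit.ly/example"  # hypothetical shortened link
resp = requests.head(short_url, allow_redirects=True, timeout=10)

print("Final destination:", resp.url)
for hop in resp.history:
    # Most shorteners answer with a 301/302 redirect to the full URL.
    print(hop.status_code, "->", hop.headers.get("Location"))
```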

Since URL shorteners have become so popular, there has been a lot of confusion within the SEO world about whether using them can help or hurt a site. Fortunately Google’s Matt Cutts gave a solid answer which has proved useful for those in the know.

Firstly, there are no penalties involved with Google rankings if you use URL shorteners. In short, you can use URL shortening tools however much you like and there will be no negative consequences for your positions. However, Google has pointed out that it would not be advisable to shrink links for submitting your business to a directory. For branding purposes, and to make it clear that everything is above board, it’s vital to include your full link for any listing submissions.

Overall, the main point is that URL shrinking tools can be highly useful, mainly for branding purposes: a real user looking at your message makes a snap decision on whether or not to click through. From an SEO point of view, shortening does you no harm, but it’s not likely to do much good either, so if a link is there for Google’s benefit you are better off not hiding it.


How To Use Google’s Link Disavow Tool

If you are by any chance a directory editor, you probably receive many communications from directory submitters who would like their listing either updated or removed altogether. A lot of the removal requests are due to historic Google algorithm changes that target linkspam, which has caused a lot of confusion among webmasters in the past. To help clear up the issue, as well as providing a timely way for website owners to de-associate their site from another, the search giant Google added a “Disavow Tool” to their Webmaster Tools interface. If this is something you didn’t know about, you might find it extremely useful.

Google’s reason for releasing the Link Disavow Tool is explained in this paragraph from its Help Centre article:

“If you’ve done as much work as you can to remove spammy or low-quality links from the web, and are unable to make further progress on getting the links taken down, you can disavow the remaining links. In other words, you can ask Google not to take certain links into account when assessing your site.”

Even though this is a great tool from Google for dealing with unwanted links, it is still highly recommended that you first go through the usual processes for removing a specific link, such as e-mailing the web administrator to ask for the link to be taken down, or removing the link manually yourself if you have that option. Only use this tool if you are really sure you need it, because it won’t always be as effective as completely removing an unwanted link through the proper channels. Also, be sure not to disavow links from your own website.
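If you do decide to use the tool, the file you upload is simply a plain text list: one URL or one "domain:" entry per line, with lines starting with # treated as comments. The sketch below assembles such a file in Python; the domains and URLs are purely hypothetical examples, and the finished file still has to be uploaded through the Disavow Tool itself.

```python
# Minimal sketch: build a disavow file in the plain-text format the tool accepts
# (one URL or "domain:" entry per line; lines starting with "#" are comments).
# The domains and URLs below are purely hypothetical examples.
bad_domains = ["spammydirectory.example", "linkfarm.example"]
bad_urls = ["http://lowquality.example/page-with-link.html"]

lines = ["# Contacted these site owners to request removal; no response received"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```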

In addition, it’s worth noting that it may take some time for Google to disavow the links. It is recommended that you wait approximately two to three days after submitting a disavow request before submitting a reconsideration request. The information that Google gathers from this tool will be incorporated into its index as it continuously recrawls the web, ensuring that, if everything is done properly, your website won’t suffer any penalties or ranking issues as a result of links you no longer want as part of your strategy.


How to instantly pull up informative stats about a website

The web is filled with all sorts of tools to help you gain SEO-related knowledge about your website. I came across the tool below and found that it provides a lot of relevant information that can help your website succeed on the web. For this week’s tool on mySEOblog, I would like to bring your attention to StatMyWeb.

This tool provides up-to-date, relevant statistics for any site of your choice. StatMyWeb provides a coherent history for a website and detailed stats including the date it was created, information about its hosting provider, its estimated value and its general rankings on search engines.

When performing a lookup of a website on StatMyWeb, this quality website tool will display the following statistics:

  • Overview – Displays general SEO-related statistics, such as Alexa Rank, PageRank, average load time, daily visitors, and daily pageviews. Some of these are now slightly outdated factors, but they give you an idea of what has previously been done for the SEO of the website.
  • Daily Visitors Country Map Analysis – A geographical visitor map broken down by country. Highlight any specific country to find out how many visitors to your site come from there.
  • Keywords Ranking at Good Position on Google Analysis – This will display a table showing which Google keywords brought organic search users to your site. The table is organised by keyword, impact factor, and query popularity.
  • Homepage Link Analysis – Counts how many internal and external links are on the website that you are evaluating.
  • W3C HTML Validation Analysis – Checks to see whether your website has any W3C validation errors or warnings (see the sketch after this list for one way to run a similar check yourself).
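If you want to run that last check without a third-party dashboard, the W3C’s own Nu HTML Checker can be queried directly. The sketch below is a minimal example and makes a few assumptions: that the public endpoint at validator.w3.org/nu/ is available and returns its usual JSON message list, and that the third-party requests library is installed; the page URL is a placeholder.

```python
# Minimal sketch: run a page through the W3C Nu HTML Checker and count problems.
# Assumes the public endpoint at validator.w3.org/nu/ and its JSON output;
# uses the third-party "requests" library. The URL below is a placeholder.
import requests

page = "https://example.com/"
resp = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": page, "out": "json"},
    headers={"User-Agent": "meta-check-sketch/0.1"},
    timeout=30,
)
messages = resp.json().get("messages", [])
errors = [m for m in messages if m.get("type") == "error"]
print(f"{len(messages)} messages, of which {len(errors)} are errors")
```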

As mentioned, it’s not necessarily the best idea to take all the statistics you find using tools like this as concrete facts. Statistics tell you a lot about what a website has had going on in the past, but this isn’t always an indication of its current or future value. This is especially true if you’re only looking from an SEO point of view, because again we only really know what has helped websites to rank in the past. We can’t be certain which factors Google and its rivals are using as their medium- and lower-priority signals, so if you’re looking for an advantage over your competition, take any conclusions drawn from tools like this with a pinch of salt, and keep an eye on the latest SEO news to judge which factors matter most today.


How to review your website’s meta data

When creating a website (especially from scratch), it is always best to insert meta information into the header of your website’s HTML code, if that’s how you’re building your site. This way, when search engines crawl your website, their crawlers will read your meta information first and can use its keywords and phrases for your snippet. Meta information is also important when submitting your website to various link directories, so that the details for your directory listing can easily be obtained.

If you are unsure what meta information is, it is the description and keywords that you insert within the meta tags included in your website’s header. For more information about how to insert meta code into your website, please review this information from the W3C.
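As a rough illustration of how those tags can be read in practice, the short sketch below fetches a page and prints its description and keywords meta tags using only the Python standard library; the URL shown is a placeholder.

```python
# Minimal sketch: fetch a page and print its description and keywords meta tags,
# using only the Python standard library. The URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaTagPrinter(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)
        name = (attr.get("name") or "").lower()
        if name in ("description", "keywords"):
            print(f"{name}: {attr.get('content', '')}")

html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
MetaTagPrinter().feed(html)
```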

If you would like to view your website’s header information without having to dig through its HTML code, and you don’t have a CMS or a system that supports this, you can use a tool like MetaHeaders. This service will display the following information pertaining to your website:

  • A summary of your website’s hosting information, such as its IP, datacenter location, and web server software.
  • Meta tags, including robots, content type, keywords, refresh, cache control, pragma, and many more.
  • Headers including date, last modified, content length, content type, and other information.

This tool is also great for viewing your competitors’ meta information. This way, you can see what kind of information a competitor is using to draw in customers through SEO and potentially take some inspiration, depending on how well they’re actually ranking. Meta data software won’t tell you everything you need to know about making your website more search engine friendly, but if you’re stuck you might find it gives you a valuable insight.


Social Mention – The Social Search Engine

Social media and networking play a big part in our day-to-day online activities, helping us find out what is new and happening in the world. In many cases, topics and events are covered by social media much more quickly than by the news we see on television, radio and elsewhere online. With this in mind, a search engine called Social Mention has been developed to help users search for key words and phrases across the various well-known social networks around the web.

Social Mention is essentially a social media search engine as well as an analysis platform that aggregates content posted by users across different networks, then collates and interprets the information. This means you can keep tabs on messages referring to you, your business, your rivals, or anything else, all in real time. Over 100 platforms are taken into account by Social Mention, including Twitter, Facebook, YouTube, Google and more.

Along with providing data from these well-known websites, Social Mention also displays various statistics pertaining to the keyword or phrase that you queried, with graphs for the following elements:

  • Sentiment – Whether the search term you queried has been referred to in a positive, neutral, or negative context.
  • Top Keywords – Displays a list of other keywords related to your searched-for query.
  • Top Users – Displays the users who have been discussing your searched-for term the most.
  • Top Hashtags – The hashtags most closely related to your search term.
  • Sources – The external sources most often used for the social media posts relating to your query.

Similar functions are increasingly being built into individual social networks, and there are a variety of other tools, such as social post scheduling software, that include search features of their own. However, Social Mention is a great way of collating data from all the relevant sources in one place and getting a broad overview. It’s becoming more and more important for online businesses to be tuned into what people are saying on social media and to take advantage of any opportunities that arise, so using Social Mention or a similar tool could prove invaluable if you’re looking for an edge over your competition online.


The Most Important SEO Tools In Google Analytics

If you are a website administrator who is heavily involved in, or curious about, the statistics for your website’s traffic and visitors, you have probably heard of (if not already signed up with) the search giant Google‘s well-known statistics tool, Google Analytics. It’s an extremely well-known and religiously used analytical tool that monitors your website’s visitor statistics and how the site is accessed via Google, all based on a simple tracking code added to your website.

Some time ago, Google added an SEO report section to their well-known analytics service. You need to enable Search Console integration to access this section, although some of the data can be found elsewhere in Google Analytics (or GA). It makes use of some features found in Google’s Webmaster Tools and supplements these with tools of its own. If you want to use GA to draw conclusions about your SEO strategy, you should certainly be looking for these key pieces of information:

  • Queries – The total number of search queries that returned pages from your site in results over the given period. These numbers can be rounded, and may not be exact.
  • Query – A list of the top search queries that returned pages from your site. You will usually find that many are “not provided”, which results from users being logged in and Google choosing to keep their data private. However, the data that is shown can still be useful if you have enough visitors.
  • Impressions – The number of times pages from your site were viewed in search results, and the percentage increase/decrease in the daily average impressions compared to the previous period. (The number of days per period defaults to 30, but you can change it at any time.)
  • Clicks – The number of times your site’s listing was clicked in search results for a particular query, and the percentage increase/decrease in the average daily clicks compared to the previous period.
  • CTR (clickthrough rate) – The percentage of impressions that resulted in a click to your site, and the increase/decrease in the daily average CTR compared to the previous period.
  • Average Position – The average position of your site on the search results page for that query, and the change compared to the previous period. Green indicates that your site’s average position is improving. The calculation takes into account every ranking of your site for a particular query (for example, if a query returns your site as the #1 and #2 result, then the average position would be 1.5). A short worked example follows below.

(Based on information taken directly from Google)
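To make the CTR and Average Position definitions above concrete, here is a small worked example with made-up numbers (these figures are not from Google or from any real report):

```python
# Worked example with made-up numbers, following the definitions above.
impressions = 1200   # times your pages appeared in results for a query
clicks = 42          # times searchers clicked through to your site

ctr = clicks / impressions
print(f"CTR: {ctr:.1%}")            # -> CTR: 3.5%

# Average position: if one query returns your site at positions 1 and 2,
# the average of those rankings is what gets reported.
positions = [1, 2]
print(f"Average position: {sum(positions) / len(positions)}")  # -> 1.5
```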

Using Google Analytics to its full potential can be tricky, but most of the features are easily accessible to the majority of users. It’s an invaluable source of information about a huge range of factors that influence your website’s performance, in terms of SEO as well as user experience and, ultimately, your success as an online business. If you don’t already have your site set up in Analytics, this should really be something you look into.


SEO Administrator – A multi-purpose tool for optimisation

It is always useful to have more tools at your disposal when trying to increase the value of your website in the eyes of search engines. There are so many factors to consider that it is almost impossible to devise a strategy without some sort of software to guide you and collate all the relevant statistical information. Some tools are free and can give you quick, simple insights, while others require a purchase or subscription but offer much more detailed information or more advanced functionality.

One such tool is SEO Administrator, which includes the following quality tools that will help analyse your website’s popularity on Google, Yahoo and Bing.

  • The Ranking Monitor Utility – This SEO tool uses your website’s keywords to obtain the ranking position of your website from a wide range of search engines.
  • The Link Popularity Checker – The number of inbound links to your website is a major factor in successful search engine promotion, and this tool analyzes these links.
  • The Site Indexation Tool – This SEO tool checks for website page indexing. Before being included in keyword-based search results, a web page needs to be indexed by the search engines. SEO Administrator reveals the pages of your site that have been indexed by various search engines. The program supports all major international search engines.
  • The Site Analyser – The Site Analyser module is an SEO software tool that includes a broken link checker, sitemap creation software, and a robots.txt file generator. It finds broken links and images (a minimal sketch of that idea follows this list), checks the Google PageRank value for every page, searches for errors in HTML code, and generates HTML and Google-compatible XML sitemaps for your website.
  • The HTML Analyser – The HTML Analyser module analyses HTML page content. It provides a full report on the weight and density of keywords and keyphrases in your web pages, as well as a preliminary analysis of your own site. It can also be used to analyse your competitors’ sites.
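As an aside, the broken-link-checking part of a tool like this boils down to collecting the links on a page and testing each one. The sketch below is a minimal illustration of that idea rather than anything from SEO Administrator itself; it assumes the third-party requests library, and the page URL is a placeholder.

```python
# Minimal sketch of the broken-link-checking idea (not the SEO Administrator
# tool itself): collect the links on one page and report any that fail.
# Uses the third-party "requests" library; the page URL is a placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = "https://example.com/"
collector = LinkCollector()
collector.feed(requests.get(page, timeout=10).text)

for link in collector.links:
    url = urljoin(page, link)
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == "unreachable" or status >= 400:
        print("Broken:", url, status)
```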

With powerful software like this at your disposal (and there are a huge number of alternatives available too), you might find you have all the tools you need to optimise your website thoroughly. That depends on whether you already have the technical expertise and experience required to interpret the information your tools have unearthed.


Gmail adds more search functionality


Gmail, the search engine giant Google’s web-based e-mail system, has always included features that let you search the text of your messages to find specific e-mails within the various folders of your account. The search features that Gmail includes make it easy for users to navigate their e-mail messages.

Earlier this month, Google announced on the Gmail blog that they have added even more search features to their e-mail service. Christian Kurmann, a Software Engineer at Google, announced the changes:

We’re always looking for ways to make it faster and easier for you to find your messages using search in Gmail. So starting today, you can now search emails by size, more flexible date options, exact match, and more.

One of the new additions to Gmail’s search functionality is the option to find e-mail messages by file size. For example, if you want to find e-mail messages of 4MB or larger, you can input “size:4m” (without quotes). You can also list e-mails older than a specific date by using the older_than: operator; for example, “older_than:1y” returns messages more than a year old.

What do you think about Gmail’s e-mail text search functionality? Does it help you find e-mail messages in a more organized and timely fashion? Please leave your feedback in this post’s comments section.


Google Improves Local Search


Recently, the search engine giant Google published a list of 65 changes that they implemented during August and September to help improve the search experience on their search engine. These changes span various aspects of how searching the web works, such as page quality, the Knowledge Graph, autocomplete, freshness, and local results.

The aspect of “local searching” on Google is a big focus, because a lot of us perform searches looking for local people, places, and events. A list of the local-related improvements that Google has implemented and updated is provided below:

  • #83659. [project “Answers”] We made improvements to display of the local time search feature.
  • nearby. [project “User Context”] We improved the precision and coverage of our system to help you find more relevant local web results. Now we’re better able to identify web results that are local to the user, and rank them appropriately.
  • #83377. [project “User Context”] We made improvements to show more relevant local results.
  • #83406. [project “Query Understanding”] We improved our ability to show relevant Universal Search results by better understanding when a search has strong image intent, local intent, video intent, etc.
  • #81360. [project “Translation and Internationalization”] With this launch, we began showing local URLs to users instead of general homepages where applicable (e.g. blogspot.ch instead of blogspot.com for users in Switzerland). That’s relevant, for example, for global companies where the product pages are the same, but the links for finding the nearest store are country-dependent.

What do you think about these search aspects that have been worked on by Google? Please let us know below in this post’s comments section.
