Content writing is a specific skill, and not everyone can do it. Even fewer great writers can also turn their abilities to search engine optimisation. If you have the formula for writing great, engaging content that’s also optimised to get your site found online, you’re quids in.
But how can you get to the point where you can comfortably balance these objectives and produce something great? Here are a few important tips which could help you reach that stage.
Keep people on your page
This is the number one thing you have to be doing.
However you do it, keeping people on your site for longer and maintaining their interest is crucial for writing like an SEO professional. This is because bounce rates have an impact on positions, and keeping people on your site for several minutes on average could get your ranking significantly higher than if people tend to leave after only a few seconds.
You might do this by creating a clear, simple structure for your writing. Huge paragraphs or vague, meandering articles will cause people to click away quickly. Very short sentences and paragraphs can help to keep things moving.
A search engine needs to be able to understand what your page is about in order for it to rank for desirable keywords and queries.
This might not be clear from just one or two words, so rather than repeating your main phrases over and over with generic fluff in between, you need to make the context of your subject clear. This might mean including many more related search terms, questions and answers, so that Google can see a clear link.
Use unique search queries
If you want to get to the top of a search engine results page, there is one great way to do this: come up with a target query that hasn’t already been taken.
Think outside the box with this one, starting with your own brand or business name. This should be unique so that you can rank at the top for your own name. If you can achieve that, you know people will find you when they hear about you, and all your advertising efforts will eventually help generate natural organic traffic.
Beyond this, start coming up with unique names for any content you want people to find. Give unique and snappy names to your products, services and even the ideas you share. Come up with article titles that answer questions which haven’t been answered yet, even though people may already be searching for help on the matter.
Take a specific online course
Still struggling to balance all your requirements when writing?
The best way to learn a new skill is usually to be taught by an expert. You can do this online, often for free, by enrolling in a course where lessons are delivered in video format and supported by downloadable resources. A writing course could be exactly what you need to polish up your skills.
This month has seen temporary mass hysteria spread throughout the SEO community, as the beginning of March brought with it a new mystery update from Google. Within 24 hours, rankings started to fluctuate significantly enough for most experts to agree that a major algorithm update was upon us.
However, the distinctive thing about this new update was that Google was even more vague than usual when commenting on the subject. Not only did Google employees refuse to confirm an update, but there was no official word on what it concerned, and it wasn’t even given a nickname. A sarcastic tweet from one of Google’s technicians, calling all future updates “Fred”, is all anyone has to work with.
The general consensus on Fred, however, is that it concerns the quality of written content on websites. In the few weeks since the algorithm incorporated the new patch, many different sites have seen their rankings plummet (or increase, in a small number of cases). There are few correlations between the affected sites, although most of those that saw their rankings drop were found to contain content which could be described as advert-heavy or sales-driven.
In other words, Fred seems to be an effort on Google’s part to fight against spam and push it further down the rankings. Although it’s not yet clear exactly what parameters Google is using to decide what content counts as low value, we know that some indicators could include the following:
- Keyword-heavy content purely written for SEO purposes
- High numbers of obvious external links or affiliate advertisements
- Text-heavy articles without additional media such as videos
- Content not focused on answering search queries
Some have suggested that Google may be starting to understand what pages look like to real users, instead of simply interpreting them in code form. This would theoretically allow layouts with little visual appeal and a heavy focus on adverts to be picked up by Google and ranked lower. Although there is no evidence that this is the case, and it’s not known how it would be possible, it is a logical next step if the search engine is not already factoring this in.
Overall, whether changes that could compromise your site’s rankings are happening now or in the near future, it’s important to take action fast. The main point to take away from the Fred roll-out is that high quality content is more important than ever, and the details really do matter when you want to entice users as well as search engines.
You might have had great positions for some of your target keywords up until now, but suddenly things may take a turn for the worse at any moment. Most professional SEO specialists have been there before at some point. It can be very frustrating not knowing why you were suddenly considered less worthy of ranking after you followed all the best advice perfectly. To help you diagnose sudden drops in positions, here is a checklist to run through. Hopefully you can identify one of these potential problem areas as something that applies to your site, and work on it quickly for a swift recovery.
1) New linkage – Even links from good quality websites can temporarily interfere with rankings if the number of them popping up at once appears suspicious. Unnatural links are a dangerous sign as far as Google is concerned, and it will sometimes act fast to penalise a site if your links are sending the wrong message. If you didn’t deliberately get any new links, use a backlink analysis tool to check if there is something you missed.
2) Lost links – Old links disappearing can have even more of an impact than new ones appearing. A handful of top quality links can be enough to support great positions on Google, so sometimes it’s a case of putting too many eggs in one basket. Keep an eye out for disappearing links, and if anything major is gone, work fast to try and replace it.
3) Algorithm updates – There are usually new developments on the horizon when it comes to Google’s automatic ranking system. The algorithms that power it are gradually changed to accommodate new ranking factors, and this will logically affect rankings. Check if you missed a major update which means your website is now considered outdated in some way, or if it’s just a temporary glitch.
4) Manual penalties – If your website has been classed as a problem by Google in terms of suspicious linkage or other misuse of the ranking system, you could be subject to manual penalties that stop you ranking. Any breach of policy can result in this, so ensure this is not the reason before spending time working on SEO factors.
5) Technical problems – Sometimes Google may be getting inaccurate or incomplete information about your site due to some obscure issue with your website’s technical aspects. For example, a server problem could be responsible for problems with caching or crawling. It’s hard to pinpoint the cause of these kinds of issues so you will need to communicate with your tech support team.
SEO in general is not exactly encouraged by Google, because in essence it’s a way of playing the system and presenting your website in a certain way to exploit the ranking algorithm for profit. On the other hand, so-called “white hat” SEO is not really an issue because it’s mostly focused on delivering a better user experience. Whether this is done with the best intentions or not, it will usually have the same results. Google is happy as long as your website is becoming more user-friendly and providing better answers, because this continues to support its own traffic and revenue streams.
However, “black hat” SEO is the flipside. This is what some people immediately think of when it comes to search engine optimisation, and it has been blamed for giving the whole industry a bad name. SEO was originally much easier than it is today, since Google was much simpler, and this was especially true if you were happy to use underhand techniques. In fact, many companies offering SEO to clients still employ some truly bad strategies based on “black hat” advice, and a lot of it doesn’t even work on Google at this point.
Some outdated strategies may be honest mistakes, but in a lot of cases they appear to have been done deliberately to mislead Google. Many of these were once technically acceptable, at least until Google figured out how to detect and punish these crimes against search.
For example, keyword stuffing was once a sure way of getting your webpage noticed for given terms, but it pretty quickly became a sure way of getting penalised and marked as spam. Unfortunately, many website owners still seem to think it’s a brilliant idea, and cram their content with as many instances of the same keyword as they can, before wondering why their positions may be lower than a site with actual quality content.
Hiding keywords on a webpage is an even worse variation on this. The idea is to format text so that it isn’t visible to users but still appears as content in the code of the page, letting search engines pick up on the keywords without the writer having to go to the trouble of weaving them seamlessly into the writing. Since this is inherently misleading, Google started detecting and penalising it long ago, but for many black hat SEO specialists it remains one of their main strategies, which puts their clients at risk.
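To make this concrete, here is a hypothetical example of the kind of hidden-text markup Google penalises. The page content and styling are invented, and this is shown purely as an illustration of the technique, not something to use:

```html
<!-- Text hidden with inline CSS: invisible to visitors, visible to crawlers -->
<p style="display:none">cheap widgets best widgets buy widgets online</p>

<!-- A common variant: white text on a white background -->
<p style="color:#fff; background-color:#fff">cheap widgets free delivery</p>
```

Google’s crawlers have long been able to spot both patterns, which is why this sort of markup invites a penalty rather than a ranking boost.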
Unnatural link building is also a major black hat technique still very much at work today, despite the increasing sophistication of Google’s algorithm and its ability to detect it. Some SEO companies will put very little care and attention into their link construction, basically using their own network of spam sites to link out to their clients, or even placing links on the clients’ own sites to each other. This all gets messy quickly and rarely yields good results.
There are many more issues with SEO techniques that are far from up to date, and in fact we can think of too many to go into! Suffice to say that underhanded techniques are still being sold to clients and used on many websites inappropriately, but we can expect to see Google continue to tighten up on these offenders.
If you have paid any attention to insights and data analysis based on your website traffic (perhaps using official Google tools like Analytics), you will rarely find that your organic traffic is completely stable, or improving at a constant rate. In almost every single case you will see random jumps and drops at least every now and then, and it can be hard to figure out the context behind these sudden changes.
Fluctuations in organic traffic may be normal, but they are always caused by something; it’s just a case of determining whether that factor is within your control or not. If it is, you may need to act fast to rectify a problem and prevent further loss of potential traffic. Here are some examples of common issues that might explain sudden dips in traffic.
1) Indexing – To appear on Google’s search results pages, a webpage needs to be indexed. You can stop pages from being indexed, for various reasons, and sometimes this option may be enabled accidentally. A noindex tag or your robots.txt file may be to blame if your pages are missing from Google’s database.
2) Updates – If you have made major changes on your website, Google may detect this and take some time to go through the process of re-indexing the affected pages according to their new content. Expect some fluctuation in organic positions and traffic following a big overhaul of any page. It’s especially important when making updates to make sure you don’t break valuable links (setting up redirects can preserve these).
3) Google – The main search engine we’re usually all focused on is constantly making changes of its own. It ranks every website for every keyword based entirely on algorithms and calculations (with some rare exceptions), so when these algorithms are changed, rankings may be disrupted for a while. The engineers behind these changes tend to test new ideas quietly in the background, and at first the effects can sometimes be too extreme and require fine tuning. This could explain sudden changes in your organic traffic.
4) Competitors – You can’t control your competition within your market, however much you would love that power. Even if you are doing everything right in terms of SEO, someone else with a higher budget and a better website may spring up ahead of you out of nowhere. Unfortunately this may have a temporary (or permanent) effect on your own traffic, depending on how quickly you can react.
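As a rough illustration of the first two checks in the list above, these are the kinds of directives worth hunting for. All paths and domains below are placeholders, not recommendations:

```text
# robots.txt — a stray Disallow rule can silently block crawling of a section
User-agent: *
Disallow: /blog/

<!-- A leftover noindex tag in a page's <head> keeps it out of Google's index -->
<meta name="robots" content="noindex">

# .htaccess (Apache) — a 301 redirect preserves link value after a URL change
Redirect 301 /old-page https://www.example.com/new-page
```

Finding any of these in the wrong place is a quick win: removing an accidental block or adding a missing redirect usually recovers traffic once the pages are recrawled.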
Although it’s not technically part of search engine optimisation, paid advertising is often considered an important part of digital marketing that can work in conjunction with an organic traffic strategy. There are some similarities between SEO and PPC (pay-per-click), although there are many key differences too. Here are some of the most common mistakes people make when starting out in the world of paid Google listings.
1) Not bidding high enough
Some people are overly cautious about spending too much on clicks, and they limit their spend so much that their competitors get ahead of them. Higher bids mean higher positions, so if you’re saving money on bid amounts you’ll probably be too low down to see good results. If you set your maximum bid above what you are really willing to pay, this will give you a competitive advantage in terms of ad impressions, and you will rarely have to actually pay that much for every click.
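The reason a high maximum bid rarely costs you the full amount is that ad auctions are broadly second-price in nature: the winner pays just enough to beat the runner-up. Google’s real auction also factors in quality scores, so this Python sketch is a simplification, and all the bid figures are invented:

```python
def actual_cpc(my_max_bid, competitor_bids, increment=0.01):
    """Simplified second-price auction: the winner pays just enough
    to beat the next-highest bid, not their own maximum bid."""
    highest_other = max(competitor_bids)
    if my_max_bid > highest_other:
        # Pay a penny more than the runner-up, capped at our own maximum
        return round(min(my_max_bid, highest_other + increment), 2)
    return None  # outbid: the ad doesn't show, so nothing is paid

# A £2.00 maximum bid against competitors bidding £0.80 and £1.10
cost = actual_cpc(2.00, [0.80, 1.10])
print(cost)  # → 1.11, well below the £2.00 maximum
```

The takeaway matches the advice above: a generous maximum bid buys you the impression, while the actual cost per click is set by the competition beneath you.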
2) Lacking focus in AdGroups
A single AdGroup should only apply to people searching for one specific solution. All ads within a group need to be close variations on a theme in terms of what they are promoting, even if the wording is totally different. With organic SEO, you want to have specific landing pages so you can optimise them for particular keywords. The same is true for paid ads, as you will get better results with a narrow focus.
3) Using search and display network together
By default, when you set up a campaign, Google AdWords will assign the setting “Search network and display network” automatically. This is never the best solution, because your text ads on the search network will be aimed at people who are currently searching for your keywords, while your display network ads will be targeting people who are browsing other sites at the time. Since these users are at totally different points in their search, how could you possibly optimise ads for both of them? The answer is you can’t! Run separate campaigns for best results.
4) Ignoring AdWords features
If you are spending time optimising your campaign and you’re serious about getting the best results, you really need to commit to learning about all the additional tools and features on offer beyond the basics. You are very unlikely to get the best possible conversion rates from your paid listings if you only use the traditional text-based ads, while your competitors are exploring other options such as sitelinks, callouts, call-only adverts and so on.
Google is regularly introducing new factors into its ranking algorithms, and in recent years a few newer elements have been rumoured to have a huge impact on search results for the majority of users. However, a few of the top factors have stayed the same since the search engine’s inception, and show no signs of changing. But are links still one of these? Are there any SEO services that can help?
Historically, the more links a website has pointing to it, the better it ranks. This has been the case since the beginning of search engines, as it’s one of the clearest ways to automatically calculate the relevance of a web page. Since Google’s algorithm is all mathematically determined by crawling sites and looking at aggregate data, it’s easy to see why counting the number of links was quickly adopted as an important method.
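The link-counting idea is easiest to see in a toy version of the original PageRank calculation. This is a textbook-style sketch, not Google’s actual modern algorithm, and the three-page example graph is invented:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over a dict of page -> outbound links.
    Assumes every page has at least one outbound link."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page's new score = a small base share plus the link weight
        # flowing in from every page that links to it
        rank = {
            p: (1 - damping) / n
               + damping * sum(rank[q] / len(links[q])
                               for q in pages if p in links[q])
            for p in pages
        }
    return rank

# Page "a" receives links from both "b" and "c", so it ranks highest
graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(graph)
```

Pages that attract links from more (and better-ranked) pages accumulate more weight, which is exactly the aggregate, crawl-based calculation described above.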
Having said that, the sheer number of links is not the only factor at work here. Quality is equally important, if not more so, since just a few high quality links can have a huge impact on overall ranking. This contributes to organic traffic and the total success of a website. Meanwhile, hundreds of low quality links can do nothing to boost SEO scores, and just a handful of particularly bad links can have a noticeable negative effect on rankings. So how can an algorithm assess quality in a similar way to a human?
Firstly, there is the traditional keyword-based way. Ranking for certain keywords is supported by those terms being mentioned alongside links to the target site. If the content on these websites is all somehow relevant, that helps too. The authority of the domain will also have an impact, as well as the chain of links going to that site, and so on.
In more recent years, new additions to Google’s calculations, such as the RankBrain element, can determine more intuitive connections between words, phrases and their meanings in popular colloquial use, so Google is actually starting to understand the context behind simple words based on other information it can gather. All this means that links are becoming more important than ever: search engines are getting more “intelligent” through machine learning, which helps them judge the quality of the sites behind those links, and rankings change accordingly.
It may be hard for us to imagine now, but only a few short years ago Google was still in its early stages. At one point, there were many different search engines still in the game. Of course, there still are now, but none have anything close to the monopolistic power of everyone’s favourite search giant. We feel like it knows us personally, and a lot of that is down to developments in the last few years: an increasing emphasis on local and personalised results, plus the addition of many different modules and widgets on search pages that link to Google Maps and try to connect us with the most geographically suitable answers to our queries.
But what if we suddenly went back to a time when all of this didn’t exist? It wasn’t really so long ago that Google would return basically the same results for anyone searching, regardless of their precise location or even what country they were in. Local competition in terms of SEO is so fierce now that it’s hard to picture things without that specificity. Not only that, but imagine how difficult it would be for search users to make informed decisions! It would become all but impossible for certain industries, including…
1) Healthcare – Currently if you were to search for a doctor’s surgery or a pharmacy, you would almost certainly be presented with a local map widget labelling all your nearest options. If these businesses were not optimised to appear on Google properly, an emergency situation could become a lot more serious.
2) Jobs – Looking for jobs is almost a complete waste of time if you have no way of narrowing them down by location (amazingly, you will still find careers services online where this is the case). Currently Google will assist with this, but if it didn’t, you’d almost certainly miss all the best opportunities and waste a lot of time.
3) Food delivery – This is something many of us enjoy using Google’s local search powers for. If you look for just the word “pizza” in any given location you will usually see red pins appearing all over the place, for example. Without this, it would be frustrating trying to find somewhere that’s within easy reach or delivers to you, plus you have the issue of comparing user reviews to find the best option.
All this information is supplied through Google’s intelligent local pack, thankfully. But it is worth thinking about what would happen if we didn’t have the search features we now take for granted. What will be the next development that Google sees success with and we start to feel like we couldn’t live without?
If you ever use organic search to look for financial help, you may see results popping up for what we commonly refer to as payday loan companies. However, anyone paying close attention to the paid ad listings at the top and bottom of those pages would have recently seen such websites disappear entirely. That’s because Google has decided to put a blanket ban on these companies advertising on their massive pay-per-click network, which could be expected to have a huge impact on traffic to these sites.
After all, Google has been making its ads stand out from the organic results less and less. In the last few months it has taken away the distinctive yellow colour of the “Ad” labels, for example, which means they are sometimes almost indistinguishable from organic results. It is perhaps for reasons like this that misleading or predatory advertisements must be taken more seriously than ever, otherwise Google could be seen as complicit in encouraging search users to make poor financial decisions.
The morality of some of these companies may be debatable, but clearly Google is sending a message here about what kinds of business practices it is willing to be associated with. It has set certain standards to decide what is not permissible on its advertising network, including companies that offer interest rates in excess of 35% (for US advertisers) or require repayments within 60 days of borrowing money from them (applying to advertisers in any country worldwide). These characteristics are typical of so-called payday loan companies offering short term cash for people who are struggling to make ends meet between pay cheques.