
Archive for the ‘Search Engine Optimisation’ Category

Best Practice for Guest Blogging

Monday, July 15, 2013 15:32

The regular changes that Google has been making to its search algorithms to clamp down on poor quality links and content have started to change the focus of many websites' link building strategies. Outsourcing link building to agencies that use bulk link techniques on dubious sites has never worked that well, but now more than ever, an effective link building program should be focused on 'relationship building' rather than simple link building.

One popular way to go about relationship building is to be a guest blogger on a reputable blog. This has always been an incredibly effective means of generating high quality links from popular and relevant web pages, but more recently the over-use and poor implementation of this technique has left many bloggers cringing at inboxes full of poorly written, self-serving pitch requests, and ultimately ignoring the vast majority of would-be 'guest posts'. Just as link request emails started to flood into mailboxes several years ago, the same is now true of guest blog requests, and as a result a number of blogs are closing their doors to guest post submissions.

Furthermore, according to Matt Cutts – the head of Google’s webspam team – “Google is willing to take action if they see spammy, or low quality guest blogging…which is basically putting low quality articles with embedded links on that site”. He goes on to say that “article-spinning, or low quality syndication are the areas in which Google are going to take an interest”. You can hear more about his comments in a video here.

Guest blogging still works, however, and works well, but it has to be approached as genuine relationship building rather than blatant link building. The links will come from building real relationships with the people running the sites, so that a level of trust and respect develops and the guest posts add to the quality and tone of the original blog.

Here are some useful tips on the best practice for guest blogging:

  • #1 Research potential link sources well: Research sources through social media channels, especially Twitter, LinkedIn and Pinterest. Seek out high quality blogs and get to know the blog first, before making contact.
  • #2 Don’t be too direct: The first time you contact a blogger, don’t pitch to them – instead, get to know them. If you are targeting a larger blog with multiple writers, then you might want to seek an introduction first. Most bloggers are happy to help out people they like with a link, but the only way to get that is to focus on the relationship before the link.
  • #3 Approach through social media: Better yet, skip email altogether for the first contact. Instead, make contact through social channels, where you are much more likely to get a response. Twitter is one of the best social networks for finding and connecting with bloggers and should be the first point of contact. Start by following, then tweet directly to them, but don’t ask for a link on the first tweet.
  • #4 Personalise the pitch: What if you don’t know enough about the blogger to make it personal? Then it’s probably too soon to be pitching for a link! Nothing will get your guest post denied quicker than sending a generic pitch.
  • #5 Offer value: The best way to get what you want is to give something back. The primary value you should be offering is excellent content to the blog, so create valuable, unique content to submit to the blogger. Also, offer to promote and share their content on your social networks, bring technical issues to their attention, such as dead links or broken forms, and leave good quality comments and participate in discussions.
  • #6 Maintain the relationship: Often when guest bloggers manage to get a link placement, they don’t continue the relationship with the blog’s owner. So follow up with the blog owner / editor to see if they have any feedback, positive or otherwise. If your content is good, the blogger will be eager to publish more of your submissions in the future. This is particularly useful for agencies that can leverage these relationships with multiple clients.

As outlined above, the process of guest blogging can be time consuming but should reflect the natural process of relationship building rather than a quick link request. If you would like more information about how guest blogging can improve your relationship building (and links), please contact us now for more details.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

The Benefits of Using Google+ For SEO

Saturday, June 15, 2013 15:31

Although Google+ is still struggling to establish itself as a viable social media alternative to Facebook, the number of users is increasing as Google integrates the tool with other services and starts to create a community around the range of features being offered. There are also some SEO advantages to having a well set up Google+ profile with both personal and business pages.

With any link-building strategy, it’s very important to network and build genuine connections and relationships that will help spread your content. Google+ facilitates this by allowing the linking of all of your social media profiles, sites and blogs in an organised manner. It’s also possible to link to sites to which you regularly contribute and, importantly, all of these links are “followed” rather than “nofollow” links, and you’re able to select the anchor text (in your bio).

Google will follow the links in your posts and the more people share them, +1 your posts or link to your profile, the more valuable these links become to you. If your post goes viral or is shared by a high authority profile, the value of the links increases more. Content on Google+ is indexed rapidly – some say almost instantly – so it’s a great way to get posts by you into Google’s index quickly when there is a hot topic.

With many social media sites, you have little to no ability to edit your content once it has been posted. However, Google+ allows you to go back and make edits to posts as you see fit. Furthermore, Google provides the option for you to take ownership of that content and so it’s important to set up an author tag (for an individual claiming content on a page) or a publisher tag (for a business to claim ownership of a site).

The author and publisher tags can’t be used on the same page, and the publisher tag should only be used on the homepage, not internal pages; use the author tag for internal pages with content. It’s still a good idea to use Google’s “rel=publisher” tag, but you won’t get the image in your SERP listing as you do with the authorship tag.
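As a sketch of how these two tags are commonly implemented – the Google+ profile URLs below are placeholders, not real profiles – both are added as ‘link’ elements in the page’s ‘head’ section:

```html
<!-- Author tag: placed on content pages, linking to the writer's
     Google+ profile (the profile ID here is a placeholder) -->
<link rel="author" href="https://plus.google.com/112233445566778899000"/>

<!-- Publisher tag: placed on the homepage only, linking to the
     business's Google+ page (placeholder URL) -->
<link rel="publisher" href="https://plus.google.com/+ExampleBusiness"/>
```

Google’s Rich Snippets Testing Tool can then be used to check that the markup is being read correctly.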

It’s important to remember that a Google+ profile needs to be set up in order to implement the Authorship Markup and take advantage of its benefits. To set up a personal profile, you can go here. Business profiles can be set up here.

The benefits of doing this are:

  • It makes your listing more robust, because it includes a photo, your name and links to more content by you. There are also indicators that your authorship markup may give you a boost in rankings. While some say it doesn’t directly help, others have reported an increase in rankings after implementing it.
  • Your authorship markup also helps you build trust as it establishes you as a real person in the often-anonymous online world.
  • It also allows you to claim your name (you don’t want someone trying to steal your name!) and your content (you’ll be seen as the original and rightful owner of the content and won’t have to worry about a “copy scraper” outranking you).
  • You can improve your click-through rate by playing with your profile image that’s shown in the SERPs. Images that perform best seem to be close-ups where the eyes are looking to the right towards the listing in the SERPs.

Once you’ve completed the profile and you start posting, keep in mind that the first sentence of your Google+ post becomes part of the title tag, which impacts rankings and influences click-through rates. A word of warning is that Google doesn’t tolerate “spammy” practices, so it’s vital not to turn your profile into a link farm!

There are a lot of different ways to connect with influencers in your industry, and networking with them is one of the core activities of Google+ users. It’s fairly easy to do because there are so many ways to do it, but be sure you don’t abuse the privilege: if you become seen as a spammer, it will be very difficult to grow your presence.

While Google+’s numbers aren’t as large as Facebook’s, they are growing, and as with most things, it’s the early adopters that do well in the long run. So it’s worth beginning now to establish your position. Google+ is probably here to stay and is also likely to become more important to your rankings and traffic.

If you’d like more information about Google+, or help with setting up a profile, please contact us now.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Google’s Panda and Penguin Updates – Two Years On

Wednesday, May 15, 2013 15:30

Each year, Google changes its search engine’s algorithm up to 500-600 times. While most of these changes are minor, every few months Google rolls out a “major” algorithmic update that affects search results in significant ways. In this article we look at how Google has continued to evolve its significant “Penguin” and “Panda” updates, how these changes caused some websites’ rankings to decline, and what can be done to prevent this happening to yours.

On February 24th 2011, Google announced its first ever “Panda/Farmer Update”, a ranking penalty that targeted poor website content (what it termed “thin” or “not good enough”), websites built from dubious content farms, and those with a high ad-to-content ratio. Panda is a site-wide penalty: if enough pages are tagged as poor quality, the entire site is subject to it, even though some good quality pages may continue to rank well. The only way to lose the penalty is to remove or improve the poor quality content. This major algorithm update hit some sites hard, affecting up to 12% of search results according to Google.

The Panda update was followed by a series of refinements over the following year, and the “Penguin Update” (aka “Webspam Update”) was released on April 24th 2012. Penguin evaluates the incoming links to a site to determine whether they involve link schemes intended solely to improve rankings; flags are raised automatically, for example when a site’s link profile looks unnatural compared with those of competitors’ sites, which can then lead to a manual investigation by Google. This impacted an estimated 3.1% of English-language search queries.

Subsequent updates were made to Penguin on May 25th and October 5th 2012, and the final standalone release of Panda (#25) was on April 14th 2013. That filter is now becoming part of the core algorithm (“Panda Everflux”). This means that businesses of all sizes need to create websites and pages with quality, relevant content that enhances the user’s experience, and any links pointing to a site need to be genuine ones, rather than being developed just in an attempt to improve rankings.

So the main outcome post-Penguin is that businesses need to take care with link building techniques and, ideally, start earning links through real relationships and useful content. This is not easy for many websites, but Google will reward those that combine good quality webpage content with genuine supporting links, as these are the kinds of sites it deems will benefit its users’ experience.

If you would like details about how we can help your website improve, rather than get penalised in the rankings, contact us now for more information.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Using Canonical Links and avoiding common mistakes

Wednesday, May 15, 2013 15:29

The use of ‘canonical links’ is a helpful tool for webmasters in cases where a website has duplicated pages of content. ‘Canonicalisation’ allows website owners to tell Google and Bing which version of a page should take precedence when there are duplicates of that page on the site. However, there are some common mistakes that need to be avoided when doing this.

It’s common for a site to have several pages listing the same information, or the same set of products if it’s an ecommerce site. For example, one page might display products sorted in alphabetical order, while other pages display the same products listed by price or by rating. If Google knows that these pages have the same content, it may index only one version in the search results, or it may penalise the site for creating duplicate content pages.

Website owners can therefore specify a canonical page (the preferred version of a set of pages with highly similar content) to search engines by adding a ‘link’ element with the attribute rel=”canonical” to the ‘head’ section of each non-canonical version of the page. Adding this link and attribute lets site owners identify sets of identical content and suggest to Google that, of all these pages with identical content, this page is the most useful and should be prioritised in search results.

The use of canonicalisation has to be done carefully however, as there are some common mistakes that can be made and it’s important that it should only be used for pages that are duplicates.

These are the most important points to consider:

  • Verify that most of the main text content of a duplicate page also appears in the canonical page.
  • Check that rel=canonical is only specified once (if at all) and in the ‘head’ of the page.
  • Check that rel=canonical points to an existent URL with good content (i.e., not a 404, or worse, a soft 404).
  • Avoid specifying rel=canonical from landing or category pages to featured articles (as that will make the featured article the preferred URL in search results.)
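As a brief illustration, if the ‘sorted by price’ version of a product listing should defer to the default listing page, the ‘head’ section of the duplicate page might contain a tag like this (the URLs are illustrative):

```html
<!-- In the <head> of http://www.example.com/products?sort=price -->
<link rel="canonical" href="http://www.example.com/products"/>
```

Search engines then treat the default listing URL as the preferred version, consolidating its indexing and link signals there.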

If you would like to know more about how the use of canonical links can improve your website’s indexing of duplicate pages with Google & Bing, contact us now.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Making use of Google’s Search Suggestions

Friday, February 15, 2013 15:26

Since 2008, Google has provided searchers with the ‘Search Suggest’ option, which aims to predict the search terms that users are typing into the search query box. In 2010, this was combined with Google’s Instant Search service, which continuously changes the results list as the user types in their query. The suggested search term function is therefore a valuable tool for users, but also for search marketers.

The drop-down list of search suggestions that appear as users enter a search query on Google will sometimes display up to 10 options, although in most cases, there are 4 suggestions shown, which are continually refined as the query is typed. The aim of Google is to help the user complete their search query faster, by anticipating the search term they might use. These suggestions come from historical data on how people have searched, as well as the content of web pages indexed by Google.

The search popularity is the primary factor in what Google shows as a suggestion, yet the suggestions may also be influenced by a user’s previous search history, or by relevancy factors that are calculated by Google’s complex algorithms. There is also a “freshness layer”, so that if there are terms that suddenly spike in popularity, these can appear as suggestions, even if they haven’t gained long-term popularity.

What’s important from a search marketing point of view is how these suggestions may influence the results shown on the page, and how this information can be used to a business’s advantage. Although there are no figures on usage of these suggestions, many searchers say they will look at the suggestions being shown and are likely to choose a relevant query to save typing in the full query. As a result, this tool can influence the way that people search and could increase the number of times that websites appear in the rankings for selected search queries.

This is important for search engine optimisation (SEO) and for PPC advertising (Google Ads (AdWords)). From an SEO perspective, marketers should see what queries are being suggested for the main search terms they are targeting through their optimisation, and then ensure that the relevant suggestions are also being targeted in their site content.

This is also true for Google Ads (AdWords), so that by targeting the relevant suggestions for the market – either as a phrase or exact match term – marketers can see how often those terms are being used as a search query, and whether they perform well in their campaign. Targeting these suggestions can also help bid pricing on specific queries, and in addition to this, any suggestions that are shown, but are not relevant, provide good information on negative terms that should be added to the campaign.

If you would like more information about Google’s search suggestions and how these can be used for your search marketing activity, contact us now.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Google launches its version of the “Disavow Links” tool

Saturday, December 15, 2012 15:23

Further to our previous article in September 2012, which announced the introduction of a “disavow links” feature in Bing’s Webmaster Center, this month we review Google’s recent release of their own much-anticipated version of this tool. As with Bing’s tool, this feature allows webmasters to protect their sites from malicious link building that could result in their website’s rankings being penalised on the Google search engine.

Google has recently been increasing its focus on targeting bad links, which has consequently affected some business website rankings. The new “disavow links” tool is therefore designed for those websites that have been impacted by Google’s ‘Penguin’ Update, which in particular impacted those websites that may have purchased links or gained them through spamming.

Following the Penguin Update, there was a sense of panic from some SEOs and publishers who wanted a way to ensure that they could discount bad links and start afresh. Others worried that people might point bad links at their sites in an attempt to harm them with “negative SEO”. The situation was compounded when Google released a new set of link warnings that didn’t clarify if publishers really had a problem they needed to fix, or not.

By counting ‘bad links’ as negative votes against a website, Google has now enabled website owners to try to avoid the negative impact on their site via this new tool, which can be accessed through the Webmaster Tools service. It should mainly be used in response to a warning from Google about ‘unnatural links’ pointing to a website, and it enables the person responsible for a business’s website to tell the Google search engine that their site shouldn’t be associated with un-trusted links pointing to it from nominated external websites.

However, this tool should be used with extreme caution, as incorrect use could result in a decrease in a website’s genuine rankings. Website publishers should therefore first try to remove the links they are concerned about by working with the site owners hosting those links, or with companies they may have purchased links through. Google’s blog states that: “in general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you’re worried that some back links might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored”.
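The tool itself accepts a plain text file, with one URL or domain per line; lines beginning with ‘#’ are treated as comments. A short sketch of such a file (the domains and dates are illustrative only):

```text
# Requested removal from the site owner on 1/10/2012 - no response
domain:spammy-directory-example.com

# A single paid link we were unable to have taken down
http://link-seller-example.com/links/page1.html
```

The ‘domain:’ prefix disavows every link from that domain, while a bare URL disavows only links from that specific page.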

If you’d like to know more about how Google’s “Disavow Links” tool may benefit your website’s rankings, you can read more here, or contact us now for more information.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Google introduces an Exact Match Domain filter

Saturday, December 15, 2012 15:23

In another addition to Google’s enhancements of its ranking algorithm filters – Panda and Penguin – it recently introduced a new “EMD” filter. This should be of interest to SEO marketers and webmasters, as it can also affect the way in which their website may rank.

The ‘Panda’ update filtered sites deemed to have too much poor quality or duplicate content, while the ‘Penguin’ update was designed to catch those thought to be spamming its search results, particularly through links or ‘over-optimisation’. “EMD” stands for “Exact Match Domain” – a domain that exactly matches the search terms for which it hopes to be found; these are generally bought and developed by businesses focused on ranking for a particular term. This filter tries to ensure that low-quality sites don’t rise high in Google’s search results simply because they have search terms in their domain names.

Google emphasises that not all exact match domains are being targeted, just those with ‘bad content’. As with those other filters, Google says the EMD filter will be updated on a periodic basis: sites hit by it may escape the next EMD update, while others not hit this time could get caught in the future.

Even if a website’s rankings haven’t been reduced by Panda or Penguin before, if its domain name was bought just in the hope of an “exact match” ranking success, its rankings may well be affected by the EMD filter, so its potential effects are well worth being aware of.

Google admits that sites do get a small boost for having search terms in their domains, but in general it’s a very small one. So the potential benefit of increased rankings through this type of exact match domain-naming strategy is outweighed by the potential decreases, particularly if the site’s content is bad quality.

Contact us now for more information about these Google filters and how they can affect the rankings for your website.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Google Improves the Ranking Results for Local Search

Tuesday, May 15, 2012 15:00

Google has recently introduced a significant algorithm boost to the quality of local search results within the main search listings. Codenamed “Venice”, this new update will help smaller, localised businesses to compete in the rankings against larger, national companies and will benefit them as the search results continue to become increasingly localised.

Local search results are becoming increasingly important, particularly with mobile searchers, and so ranking positions on both Google Places and Google’s main organic results are something that any business with a localised target market needs to consider. Having a localised online marketing strategy, whether it’s for just one business location or multiple locations, is a key factor for search engine marketing, and Google’s recent changes make this more important than ever.

Google’s recent “Venice” update uses signals within the main search results to trigger relevant local results for the searcher, based on the user’s location, so that results are more relevant to that location whether or not a location term has been used in the search query. The location targeting is based on the searcher’s IP address, which is also displayed in the left hand margin of the search results (and can be changed by the searcher if not correct).

This update therefore provides more opportunities for local companies to appear in the search results when relevant to local searchers, and therefore the SEO elements for a website and a Google Places listing become increasingly important. This could benefit local businesses that have previously been disadvantaged by larger firms in terms of their SEO targeting, through some improved optimisation of their sites for localised search terms.

These changes can involve a number of factors, from using the local terms in HTML tags and page content, to including local focused content on the website, and making updates to a website’s pages/architecture and the type of code that is used to micro-format the address. Links to the website that use the local search term in the text link also remain an important factor to support these type of rankings, as does the optimisation of a Google Places listing (which we covered in March 2011).
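As one example of micro-formatting an address, here is a sketch using schema.org microdata (the business details are illustrative, and other formats such as the hCard microformat can achieve the same end):

```html
<!-- A local business address marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Widgets Ltd</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 High Street</span>,
    <span itemprop="addressLocality">York</span>,
    <span itemprop="postalCode">YO1 1AA</span>
  </div>
</div>
```

Markup like this helps search engines associate the page with a specific locality, supporting the localised rankings described above.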

Careful localised optimisation is therefore the key to local business marketing through search, as the Google Venice update shows that this is becoming increasingly important in levelling the playing field between national and local firms. Businesses need to develop a plan now to make sure they stay properly optimised and can take advantage of the opportunities that the updated algorithm has to offer.

If you would like more information about how we can help to optimise your website to take advantage of this opportunity, contact us now for more details.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

How Google’s Recent Ranking Changes Benefit High Quality Websites

Tuesday, May 15, 2012 15:00

At the end of April, Google’s Webmaster Blog announced further changes to their ranking criteria that are designed to target “over optimised” websites and to benefit those that offer unique content and comply with Google’s guidelines. As part of their ongoing efforts to improve the relevancy and quality of results for users, Google’s latest changes could shake up the rankings for some sectors of the market.

Google says that it doesn’t outlaw search engine optimisation as a practice – in fact, effective SEO can make a site easier to index, more accessible and easier to find. “White hat” search engine optimisers – the term often used for ‘ethical’ techniques that comply with Google’s guidelines – are seen to improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines.

However, Google has always targeted “black hat” webspam and continues to do so with this latest update to its ranking criteria. Sites that use these techniques chase higher search rankings, possibly as a short term gain, with methods that don’t benefit users, since the intent is to find shortcuts or loopholes that rank pages higher than they deserve to be ranked. The webspam techniques that Google targets include keyword stuffing and link schemes that attempt to propel sites higher in the rankings.

Google’s success relies on continually providing good results for searchers, so it’s also Google’s policy to reward the “good guys” who make great sites for users, not just for algorithms. The recent “Panda” updates have been focused on returning higher-quality sites in the search results and removing low quality or duplicated content. Google has also introduced a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold” (on the screen before the user needs to scroll down the page).

Google has now just introduced a new algorithm change that targets webspam more intensely. The change will decrease rankings for sites that Google believes are violating their existing quality guidelines. As usual, Google doesn’t divulge the specific signals so there will be much testing and comment by SEO practitioners over the coming weeks as to what factors are being targeted. Google’s advice is still to focus on building high quality websites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

At the Web Search Workshop, we’ve always focused on developing good SEO techniques for websites that won’t fall foul of Google’s penalties. If your site has been affected by these recent changes or you’d like to know more about the likely impact for your business, please contact us now for more information.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.

Google makes personalised search more secure

Tuesday, November 15, 2011 14:49

During October, Google announced that it is going to increase privacy for users of its search engine by encrypting personalised search results. This has caused much consternation in the search engine marketing field, since the end result is that search query data from these searchers will no longer be accessible through web analytics stats.

Google says that this change is important for security and privacy, so that users who sign into their Google account will get their search queries encrypted by default. As the use of the search engine is becoming an increasingly customised experience, the results become tailored towards individual users.

This additional layer of security means that only Google and the web browser itself can see the searches. A third party can’t intercept the search and know what’s being searched on, which is especially important for people searching over an unsecured Internet connection, such as a WiFi hotspot in an Internet cafe.

Google is doing this by securing the results for signed-in users through an encryption protocol called SSL (Secure Sockets Layer). This is the same technology that is used when performing secure credit card transactions, and is evident from the extra “s” in the “https” in the address bar on Google’s homepage when signed in to a user account.

Google may hope that this change will encourage more people to search through a personalised account, which will protect the user’s information but also allow Google to display more relevant results to the user. However, the downside for website owners and marketers is that less information will be available in Google Analytics – or any analytics package – so that although visits from Google’s organic results will still be counted, the individual search terms from logged-in users will be hidden and just displayed as ‘not provided’.

Google says that an aggregated list of the top 1,000 search queries that drove traffic to a site for each of the past 30 days will be available through Google Webmaster Tools and also any AdWords data will still be displayed at the search term level, whether the searcher is logged in to a Google account or not. However, the loss of organic search term data is significant and will become more so over time.

Initially this change will only happen on Google.com, and only relates to those searchers who are logged into a Google account. According to Google’s software engineer Matt Cutts, this is likely to account for only single-digit percentages of all Google searchers on Google.com at this time. However, as more people use Google’s services such as Gmail or Google+ and remain logged in when they search, this percentage is likely to grow and impact the level of data available through web analytics accounts.

We’ll be tracking this issue and reviewing the impact over the coming months, but if you’d like to know more about this and how the change may affect your search referral data in analytics, please contact us now.

This article was written by Web Search Workshop UK, a search engine optimisation and marketing consultancy for UK business websites. Contact us today for a free assessment of your website.