A roadmap for improved web experiences

Roadmap - by Canned Muffins

We are currently living in an era of Digital Darwinism, a time in which evolving technologies – from mobile and real-time to social media and the Internet of Things – are disrupting traditional business models and markets. To adapt effectively, businesses need a digital transformation programme that looks beyond the here and now and focuses on how disruptive technologies are affecting customer behaviour and what impact this has on their business.

The continuous evolution of the website

For most companies, the website is at the centre of their digital ecosystem, and with the explosive growth in mobile and tablet usage the form and nature of the website is changing to keep pace with shifting customer demands and preferences.

Many sites are therefore shifting to either adaptive or responsive designs and it could be suggested that a website redesign, relaunch or migration can be considered an integral part of the managed digital transformation process.

Best practices for effective website migration

As part of a digital transformation initiative, I was recently involved in the migration of an existing, legacy website, built using a fixed design, to a new website, with a responsive design and structure.

I was primarily responsible for the site’s natural search/SEO and social media effectiveness, but my involvement in the project provided me with a 360-degree view of the entire website migration process. I’d therefore like to outline some of the key aspects and learnings from my website migration experience and how they can be applied to other similar projects:

Outline goals and objectives

From the outset, it’s important to be absolutely clear as to why you are migrating your website and the objectives and success criteria associated with this. Remember: all objectives should be SMART.

For the project I was involved in, whilst one of the main reasons was to upgrade to a more sophisticated CMS, the core objective was:

To provide users with a first class customer experience on all browsers and devices

Some of the success criteria that derived from this overarching objective included:

  • Maintain (with a view to improving) existing natural search traffic and ranking performance
  • Enhance the overall website user experience
  • Ensure the website is accessible on all key browsers and devices

Pre-migration preparation

This stage involves establishing a baseline of where you are now, giving you a basis on which to measure and track results and thus the success or failure of the project against its objectives.

For our project, we primarily used this stage to review current and historic website traffic patterns to understand the future impact of change. This activity included:

  • Evaluated trends of views, visits and top landing pages to understand seasonal and promotional variance
  • Reviewed the number of search referrals to the site and recorded the current percentage of traffic coming from organic search
  • Ran crawl tests to assess the accessibility of the site to search engines (primarily Google) and how many pages were currently indexed
  • Measured inbound links and referring domains to key sections of the site and to the website as a whole
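
To make the baseline repeatable, it can help to capture a simple snapshot of every known URL and its current status code before any changes are made. The sketch below is one minimal way of doing this in Python, assuming a plain list of URLs exported from a crawler such as Screaming Frog; the file names are illustrative only.

```python
# Minimal baseline snapshot: record the current status code of every known URL.
# Assumes a plain-text export of URLs (e.g. from Screaming Frog); file names
# here are illustrative. Requires the third-party 'requests' library.
import csv
import requests

with open("old_site_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("baseline_snapshot.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "status_code"])
    for url in urls:
        try:
            # HEAD keeps the check lightweight; swap to GET if a server rejects HEAD
            response = requests.head(url, allow_redirects=False, timeout=10)
            writer.writerow([url, response.status_code])
        except requests.RequestException as exc:
            writer.writerow([url, f"error: {exc}"])
```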

In my experience, setting up a cross-functional website migration team with a range of different skills and expertise was crucial to the success of the project. As Brian Solis points out:

“The outcome of even the smallest investments in change brings together typically disparate groups to work in harmony across the entire customer journey. This allows teams to cooperate, or merge into new groups, in uniting the digital journey to improve engagement; deliver a holistic experience; and eliminate friction, gaps, and overlap”.

Consider your digital toolkit

A number of tools will be necessary throughout the project, as part of the baseline measurement, as well as for checking progress and measuring results once the new site goes live.

Some of the tools I found most useful during our website migration project included:

  • Web analytics – for measuring traffic by source; time on site/engagement; bounce rate
  • Moz Open Site Explorer – to record rankings and review inbound links to the site
  • MajesticSEO – to baseline inbound links to the site
  • Website crawlers – use tools such as Screaming Frog and Xenu to get a list of all URLs and for checking redirects post-migration
  • Google Webmaster Tools – for conducting a site audit and monitoring errors post-migration

Review/audit key content

A thorough content review is essential, whether you’re planning on ‘lifting and shifting’ existing content across to the new site, updating or refining existing content, creating brand new content or a combination of all three.

As I was responsible for all earned media marketing activity, this was a key task of mine and meant working closely with a number of teams and key stakeholders across the business, particularly content, marketing and PR.

Together we audited the site and made a plan as to how content would change and/or be migrated into the structure of the new site. A pre-defined site structure gave us a guide as to how the new site was going to be designed, but this had to remain flexible to ensure we could carry the right content across without losing key information or harming UX or natural search performance.

Some of the questions we asked when reviewing content included:

  • Does the content serve a specific purpose?
  • How much traffic does the page receive?
  • Is the page orphaned?
  • Has the page earned any external links – how many and of what authority?
  • Has the content been shared socially?
  • Is there any other similar or duplicated content?
  • Is the content still relevant?
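
These questions lend themselves to a simple scoring exercise once the data has been pulled together. As a hedged illustration only, the sketch below shows how answers such as traffic, external links and social shares might be combined into a rough keep/review/retire call; the CSV columns and thresholds are hypothetical and should be set against your own baseline.

```python
# Illustrative content-audit triage combining traffic, link and share data.
# The input CSV and thresholds are hypothetical examples, not fixed rules.
import csv

def audit_decision(pageviews, external_links, social_shares):
    """Rough triage based on the audit questions above."""
    if external_links > 0 or pageviews >= 500:
        return "keep / migrate as-is"
    if social_shares > 0 or pageviews >= 50:
        return "review / refresh before migrating"
    return "candidate to retire or consolidate"

with open("content_audit.csv") as f:  # columns: url, pageviews, external_links, social_shares
    for row in csv.DictReader(f):
        decision = audit_decision(int(row["pageviews"]),
                                  int(row["external_links"]),
                                  int(row["social_shares"]))
        print(row["url"], "->", decision)
```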

Define URL structure and redirection plan

If URLs are going to be changing as part of the website migration, it’s essential that a comprehensive redirection plan is created.

URLs can often change and pages moved as part of a website migration due to:

  • Re-branding requiring a new domain
  • Expansion into foreign markets requiring new TLDs
  • Upgrade/redesign leading to a new navigation or URL structure
  • Re-purposing or updating of content on the new website

Depending on the size of your website, a thorough redirection plan can be a complex and time-consuming process but a crucial one nonetheless. From an SEO perspective, restructuring or migrating a website is fraught with danger if not handled correctly so it’s highly recommended and worthwhile working with experienced specialists who can guide you through the process.
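
One way to keep the plan manageable is to maintain a single old-to-new URL mapping file and generate the server rules from it. The sketch below is a minimal example that produces Apache mod_alias ‘Redirect 301’ lines from a hypothetical CSV mapping; the file names and columns are assumptions, and any generated rules should be reviewed by whoever manages your server configuration.

```python
# Minimal sketch: generate Apache 'Redirect 301' rules from an old-to-new URL map.
# The CSV name and columns are assumptions; review the output before deployment.
import csv
from urllib.parse import urlparse

rules = []
with open("redirect_map.csv") as f:  # columns: old_url, new_url
    for row in csv.DictReader(f):
        # Redirect 301 maps the old path (and anything beneath it) to the new URL
        old_path = urlparse(row["old_url"]).path or "/"
        rules.append(f'Redirect 301 {old_path} {row["new_url"]}')

with open("redirects.conf", "w") as out:
    out.write("\n".join(rules) + "\n")

print(f"Wrote {len(rules)} redirect rules to redirects.conf")
```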

Launch and review

Prior to launch, it’s important to carry out final checks, including a content review, a website crawl to pick up errors (e.g. broken links, missing meta data, etc.) and assurance that the correct (301 – permanent) redirects are in place.

Once the new site is launched, it will be time to go back to the baseline and review results over the first few days and weeks. It’s important at this stage to keep calm, be consistent and trust in your preparation and research.

For the project I was involved in, we ensured that we referred back to the baseline we established at the start to look at:

  • Traffic – all sources
  • Rankings
  • Bounce rate
  • Sales funnels, customer journeys and conversion paths
  • Crawl errors and internal links
  • Missing pages and broken URLs (404 errors)
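
Checking redirects by hand quickly becomes impractical on a large site, so it’s worth scripting the verification. The sketch below reuses the hypothetical redirect mapping file from the redirection plan and flags any old URL that doesn’t return a 301 to the expected destination; it’s a simple illustration rather than a complete QA suite.

```python
# Post-launch spot check: does each old URL 301 to the expected new URL?
# Reuses the hypothetical redirect_map.csv; requires the 'requests' library.
import csv
import requests

with open("redirect_map.csv") as f:  # columns: old_url, new_url
    for row in csv.DictReader(f):
        response = requests.get(row["old_url"], allow_redirects=True, timeout=10)
        first_hop = response.history[0].status_code if response.history else None
        landed_on = response.url
        if first_hop != 301 or landed_on.rstrip("/") != row["new_url"].rstrip("/"):
            print(f"CHECK: {row['old_url']} -> {first_hop} -> {landed_on} "
                  f"(expected 301 -> {row['new_url']})")
```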

Set out a clear post-migration communications plan

It’s important that key stakeholders, both inside and outside the organisation, are aware of the new site and what it means to them. What or how you communicate will depend on the type of changes that have been made to the site but it’s usually worth shouting about positive changes and updates.

The website I was working on underwent a substantial re-design, with the introduction of new functionality and a lot of content. It was therefore imperative that we signalled changes to users early on, before we migrated, with sneak previews and guidance on the old site as to what the new site would offer.

Following the launch of the new site, customers and staff were emailed, social channels were updated and key stakeholders were invited to give their feedback. Letting as many people as possible know about the good work that had taken place was a good opportunity to get people talking and to generate new inbound links to the site.


Best tools for effective backlink analysis

Although there is plenty of evidence to suggest that search engines are paying attention to other factors in addition to links when ranking websites, good quality backlinks are still very important and widely considered to be the leading ranking factor:

Ranking factors 2013

Effective link-building will form an integral part of any credible SEO strategy. Before work begins on this type of activity, however, it is highly advisable to conduct a backlink analysis in order to benchmark against competitors and other leading websites, as well as to gain a more thorough understanding of your links, their quality, and the link building strategies and tactics available.

In his backlink analysis post for Search Engine Watch, Jon Ball identifies the following key elements of a comprehensive backlink analysis:

1. Total number of links
2. Number of unique domains
3. Linking domains vs. total links
4. Anchor text usage and variance – branded, keyword rich, etc.
5. Fresh/incoming links
6. Page performance
7. Link quality

There are a number of tools and pieces of software available to assist with backlink analysis. Search specialists and agencies will be familiar with many of these tools and are likely to have access to a variety of them. However, for this post I’m going to look at the process from a client-side/in-house search marketing perspective, where things may be managed a little differently from agencies and where you may want to use a backlink analysis to:

  • Benchmark link volume and quality vs. competitors
  • Conduct a competitor analysis
  • Review SEO agency performance
  • Measure content and social media campaign performance
  • Identify opportunities for blogger/journalist outreach

It’s often advisable to work with at least a couple of tools, which together provide a decent level of detail and are also useful for ‘sense-checking’ data and spotting trends or warning signs (e.g. spammy links or unnatural anchor text).
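
As a simple illustration of that cross-checking, the sketch below compares the referring domains reported by two different tools and lists the domains that only one of them has seen, which are often the ones worth a manual look. The export file names and column name are hypothetical, as real export formats vary by tool and plan.

```python
# Hedged sketch: compare referring domains from two link tools' CSV exports.
# File and column names are hypothetical; requires Python 3.9+ for removeprefix.
import csv

def load_domains(path, column="referring_domain"):
    """Return the set of referring domains in a tool's CSV export."""
    with open(path) as f:
        return {row[column].lower().removeprefix("www.") for row in csv.DictReader(f)}

tool_a = load_domains("majestic_export.csv")
tool_b = load_domains("open_site_explorer_export.csv")

print(f"Tool A: {len(tool_a)} domains, Tool B: {len(tool_b)} domains")
print(f"Seen by both: {len(tool_a & tool_b)}")

# Domains only one index has seen may be fresh, low quality, or simply missed
# by the other crawler - either way they deserve a manual sense-check.
for domain in sorted(tool_a ^ tool_b):
    print("review:", domain)
```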

The backlink analysis tools

Whilst there are a number of tools to choose from, below is a review of three tools that I believe are particularly useful for in-house search marketing teams for backlink analysis and/or competitor benchmarking:

Google Webmaster Tools 


Although Google’s Webmaster Tools can only be used to analyse link information for your own site and not for those of competitors, the ‘Links to Your Site’ report (under the ‘Traffic’ section of the left-hand navigation menu) is still a useful resource for conducting a high-level analysis of your site’s link profile. The report highlights the total number of links to your site, who links the most, your most linked pages, and how your content is linked (i.e. the anchor text used).

Pros:

  • Free
  • Can manage multiple websites
  • Provides a good initial ‘high-level’ view of link profile
  • Linking domains can be downloaded as a CSV or Google Doc file to check and see if a given domain is linking to you

Cons:

  • Can only review your own website and not competitors
  • The ‘Links to Your Site’ feature does not provide a definitive list of links and is not as comprehensive or accurate as other tools such as MajesticSEO and Moz’s Open Site Explorer (reviewed below)

MajesticSEO 


MajesticSEO is widely considered to have one of the most comprehensive link indexes and provides some incredibly detailed link analysis reports. MajesticSEO also has an excellent blog providing SEO tips and best practices, training posts and videos and tool updates.

Pros:

  • Free (though paid options available for more detail and functionality)
  • Intuitive and easy to use
  • Ability to review competitor sites and compare to your own
  • Trend external links over time
  • Detailed anchor text analysis
  • Export to Excel functionality
  • Drill down into links to specific pages
  • Provides both backlink and referring domain counts
  • Separates EDU and GOV links to allow easy analysis of quality links

Cons:

  • Detailed analysis requires a subscription
  • Amount of detail can be overwhelming and requires time to reach proficiency

Moz Open Site Explorer


Moz provides a range of SEO tools, from rank tracking to keyword difficulty and SERP analysis. They also have a superb blog covering everything from SEO to content marketing and UX. However, Moz are well known for their link analysis tool, Open Site Explorer, which provides detailed link data and competitor intelligence capabilities.

Pros:

  • Free (though paid options available for more detail and functionality)
  • Intuitive and engaging interface
  • Provides useful side-by-side competitor comparison and benchmarking
  • Anchor Text analysis available
  • Export to Excel functionality
  • Ability to drill down into links to specific pages

Cons:

  • Detailed analysis requires a subscription
  • Link index not considered to be as comprehensive as MajesticSEO

—–

As suggested above, for an in-house SEO team it’s certainly worth using at least a couple of tools to conduct a backlink analysis, and considering their respective prices, both MajesticSEO and Moz complement each other nicely alongside Google Webmaster Tools.

Whilst all three tools have their downsides, the advantages far outweigh the negatives, and each provides excellent functionality that will give any SEO team powerful, actionable insights.


New insights from Google Keyword Planner

When the Google Keyword Tool was removed and replaced with the Keyword Planner approximately a year ago, the ability for marketers to filter traffic estimates by device was lost. However, with the introduction of nine new features to the Google Keyword Planner Tool last month, users will now have greater insight into mobile device volumes, mobile trends, enhanced location breakdowns and how this data trends over time.

For the purpose of this post, I’m going to focus on the mobile and location enhancements, which offer some really useful data and allow us to consider how we can optimise our sites more effectively for mobile and tablet searchers.

Contribution of search volume for keywords by device

Different businesses will gain something different from each of the nine new features Google have introduced, but one of the most exciting, and one that is likely to interest small and large enterprises alike, is the re-introduction of device search insights:

Breakdown by device - aggregate

This feature provides visual data to enable users to understand the expected search volume contribution from individual devices. As you can see from above, a group of keywords representing a particular theme or topic can be analysed to get an aggregate volume of searches, in total as well as broken down by device type.

This type of data allows one to see how people are searching online for particular terms and phrases. For example, if it’s clear that a relatively equal volume of searches is coming from tablet, mobile devices and desktop computers, the data could be useful for building a business case for SEO investment in a responsive or adaptive website.

Breakdown by device - individual

Alternatively, you can also analyse individual keywords to determine specifically how each of them is performing by device type. In the first image above, we saw that [isa] was driving 49,500 searches per month as part of the aggregate list; the second image shows how those 49,500 [isa] searches are themselves broken down by device type.

This is useful for understanding how keyword groups differ from one another and whether individual content sections or landing pages might need to be tailored for those searching on desktop, mobile or tablet computers.

Analysing mobile trends over time

In addition to looking at overall search volume trends over time for particular keywords, e.g.:

Overall search volume trends

… the Keyword Planner also allows users to see the mobile percentage of these search terms over a period of time:

Mobile search volume trends

This is particularly useful to see how consumer behaviour is changing in relation to how people are using mobile devices to search online:

Mobile search volume trends over time

For the term [weather] (used above), one can clearly see that the percentage of searches conducted on mobile devices has increased steadily over time, from 17.3% in May 2012 to 56.1% in April 2014.

Along with the contribution of search volume for keywords by device which we looked at in the previous section, this level of data allows marketers to look at how content is rendered on different devices, for either a particular search term or group of terms, and can inform how content can be tailored to create a more useful user experience across devices.

Get more accurate data by setting specific dates

In the example above, we looked at the term [isa]. The 49,500 searches per month is an average taken over the last 12 months. For other terms that are not as highly affected by seasonality (e.g. [bank accounts]), this annual average number is still quite useful. However, because an ISA is a very seasonal financial product, it’s worth considering how search volume and demand for this term varies over the course of the year:

Seasonal snapshot - ISA

The seasonal snapshot above shows that there’s a clear escalation in search volume for the term [isa] from January onwards, with a peak in March. It might therefore be worth looking at the average search volume for this term during both ‘ISA season’ (January – April) and the rest of the year:

Seasonal comparison - ISA

By selecting the dates we want to focus on using the ‘date range’ selector in the left-hand navigation, we can compare average monthly searches for two specific date ranges to see how they differ.

As you’ll see from the data above, the graph provides absolute and relative changes, so we can see that there is a roughly +124% change in searches for the term [isa] during ‘ISA season’ vs. the rest of the year. Armed with this information, content, SEO and paid search efforts can be prioritised during key times of the year in order to maximise the seasonal opportunities.
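
For anyone who wants to sanity-check the relative change themselves, it’s a straightforward calculation. The figures below are illustrative averages rather than the actual numbers behind the screenshot.

```python
# Percentage change between two average monthly search volumes (illustrative figures).
isa_season_avg = 74_000    # hypothetical Jan-Apr average monthly searches
rest_of_year_avg = 33_000  # hypothetical May-Dec average monthly searches

percentage_change = (isa_season_avg - rest_of_year_avg) / rest_of_year_avg * 100
print(f"{percentage_change:+.0f}%")  # -> +124% with these example figures
```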

Another handy aspect of the date range feature is the ability to compare search volume year-on-year so that accurate comparisons can be made for search demand:

Comparison on last year

Understand how search trends differ by target locations

Another useful addition to the Keyword Planner is the ability to see a breakdown of search volumes by location:

Location breakdown

This feature allows marketers to analyse the differences in search volume by location and identify segmentation opportunities. For example, although Nottingham is the second largest city by population in the sample used above, it’s driving the fewest searches. Perhaps there is an opportunity to improve content and messaging on the website, or to modify bids in paid search campaigns, in relation to Nottingham.


5 top SEO insights from SMX 2014

SMX London 2014

Earlier this month I attended the search marketing event SMX in London, and as with last year, SMX 2014 offered a huge amount of knowledge and insight from a wide variety of speakers from agencies, search engines, bloggers and business leaders. There was a lot to take in over the two days, so for this post I’ve decided to highlight five areas that really stood out:

1. Focus on UX and engagement as well as links

Whilst links are most certainly not dead, it’s clear that search engines like Google are attempting to look beyond links as a signal when determining the relevance and quality of a website or piece of content. Webmasters and marketers alike therefore need to reconsider not only their thinking around link-building, but other emerging ranking factors too.

In a session on Long-Term SEO: How To Win For Years, Not Days, Tim Grice from Branded3 explained how engagement and popularity will become increasingly important ranking factors and the key to higher rankings. This is something I covered in my post from April about maximising SEO benefit through engagement. In his presentation, Tim provided a case study showing how a site that had been hit by a Google penalty and suffered a drop in visibility was able to regain that visibility not by building links but by creating content as part of a hub to serve users and engage an audience:

In order to provide genuinely useful and engaging content for your audience:

  • Understand your customers. Who are they? What are their interests? What social sites are they connected to? What are they talking about?
  • Use consumer surveys and data from UX
  • Identify poor performance of existing content using:
    • Page views
    • Unique page views
    • Time on site
    • Bounce rate
  • Use Google Webmaster Tools to identify what search terms are driving traffic. Don’t be afraid of removing pages that are driving unnecessary traffic

2. Entities and the semantic search revolution

As search engines become increasingly sophisticated, we’re seeing the continuing emergence of semantic search and the move from keywords to entities. Or, as Google put it with the introduction of the Knowledge Graph in 2012, ‘things, not strings’. This is about search engines giving ‘things’ – or entities – an identity so that they know more about them, as well as the connections and relationships between them and other entities. Justin Briggs explains that an entity is “anything, including real world objects, facts and concepts, that has a number of documents associated with it”. Examples of entities include businesses, products, movies, authors, people, places, events, etc. By creating identities for entities, search engines can form a better understanding of user intent for ambiguous search queries, e.g. [that big clock in london]:

That big clock in london, i.e. Big Ben

The introduction of the Hummingbird algorithm is a demonstration of Google’s intent to move into semantic search and the Knowledge Graph is an example of entity search in action. It makes search results smarter, more relevant and informative and provides a more complete picture by harnessing the collective intelligence of the web.

How can we take advantage of entities and semantic search?

  • Use structured data, such as Schema.org mark-up, to provide additional detail about an entity (a brief sketch follows after this list):

Gone with the Wind

  • Use entities in copy, e.g. ‘15 things you didn’t know about Emma Watson in Bling Ring’. This reinforces a connection between one entity (Emma Watson) and another (Bling Ring):

Emma Watson and Bling Ring
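
As a hedged illustration of what such mark-up might look like, the sketch below builds a minimal Schema.org Movie entity as JSON-LD. The property values are illustrative, and the generated JSON would sit in the page inside a script tag of type application/ld+json.

```python
# Minimal sketch of Schema.org structured data for a movie entity as JSON-LD.
# Values are illustrative; validate real mark-up with Google's testing tools.
import json

movie = {
    "@context": "https://schema.org",
    "@type": "Movie",
    "name": "Gone with the Wind",
    "datePublished": "1939",
    "director": {"@type": "Person", "name": "Victor Fleming"},
    "actor": [
        {"@type": "Person", "name": "Vivien Leigh"},
        {"@type": "Person", "name": "Clark Gable"},
    ],
}

print(json.dumps(movie, indent=2))
```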

3. Semantic mark-up, Schema and rich snippets

If Schema.org is one of the keys to unlocking the potential of semantic search, then what is it and how does it work? And what are other forms of microdata and rich snippets? It could be suggested that search engines are still a little stupid and therefore require webmasters to provide more information about pages and content to make it clear how things are related. Semantic mark-up is therefore important because:

  • You can provide meaning
  • You can increase ‘maintainability’, e.g. by keeping HTML for content and using CSS and JavaScript for presentation and behaviour
  • Semantic mark-up is quality

Click-through rate is influenced by much more than just position nowadays – sitelinks, Universal Search results, authorship integration and video snippets are just some of the factors that affect CTR. So in the face of rich snippets, personalisation and geo-targeting, the traditional CTR distribution curve is becoming redundant:

CTR from SERPs

So why should we take advantage of semantic mark-up?

  • Schema.org can help sites exploit semantic search and the Knowledge Graph (see above)
  • Very few sites are making use of Schema.org:

Schema.org take-up in the US

There is therefore a massive opportunity to get ahead of the game!

4. Key SEO tools

There are so many SEO tools available it’s often difficult to decide on which ones to use and when. And of course, whilst some tools can be accessed for free, others can vary quite dramatically in price range. Depending on whether you’re client or agency-side, the type and breadth of tools will also vary and it’s also important to only use the tools you need to avoid ‘analysis paralysis’. Below is a list of a few tools for budding SEOs or experienced digital marketers to consider:

MajesticSEO

  • Searchmetrics – depending on budget it can be pricey, but it offers lots of useful data and a ‘visibility’ metric (a combination of rankings and associated search volumes)
  • Google Webmaster Tools – whilst link data will not be as up-to-date as other tools (e.g. MajesticSEO), it’s still a useful tool for assessing crawl stats, keywords and additional link data
  • Moz Algorithm History – great for checking traffic against Google algorithm changes
  • Screaming Frog – super tool for technical audits and conducting site crawls, e.g. for site maps

5. Top social tactics for the search marketer

With content being the foundation of good search and social marketing, there are inevitably a number of social tactics that can be used to help improve one’s search marketing efforts.

Identify top domains

  • Look at what domains a company’s Twitter followers are following
  • What are the top social networks Twitter followers are following and talking about?
  • Compare interest across multiple sources:
  • Take backlink data from your company + two or three competitors
  • Take a list of all followers from each
  • Look for domains within those data sets that follow all of the companies’ Twitter accounts but only link to your competitors. These provide an opportunity for outreach (a simple sketch of this comparison follows below)
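
The comparison itself boils down to simple set operations once the follower and backlink data have been exported. The sketch below uses placeholder domain sets to show the logic; in practice the sets would be built from your backlink exports and follower lists.

```python
# Hedged sketch of the outreach comparison: domains that follow all the Twitter
# accounts and link to competitors, but do not yet link to us. Placeholder data.
our_linking_domains = {"exampleblog.com", "newssite.co.uk"}
competitor_linking_domains = {"newssite.co.uk", "industrymag.com", "reviewhub.net"}
domains_following_all_accounts = {"industrymag.com", "reviewhub.net", "randomsite.org"}

outreach_targets = (domains_following_all_accounts & competitor_linking_domains) - our_linking_domains

for domain in sorted(outreach_targets):
    print("outreach candidate:", domain)
```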

Use social insights

  • Look at different post types and the engagement generated by each. What is it? Where are the opportunities that you can leverage?
  • Tools like Birdsong can be used to analyse different social accounts, the content being posted and what is working in terms of likes, comments, traffic, etc.
  • Take advantage of YouTube and Facebook’s native analytics:

Facebook Insights

  • Explore new content formats, e.g. long-form content

Identifying top people and influencers

  • Pick a publisher or microsite (e.g. Guardian Data blog) and look at interesting data
    • Crawl blog
    • Scrape author names
    • Sort authors by social success and topical relevancy
    • Use this to decide what topics you can write about and who to reach out to about them

Three free tools for content marketing analysis

Tools by Josep Ma. Rosell

Part of the process of defining a content strategy for your business will be to evaluate the performance of your competitors’ content marketing efforts. The insight you can glean from what both direct and indirect competitors are doing will allow you to build a picture of the type of content activity, strategies and tactics that are working for others in your industry and what, as a result, might best work for you. In short, a competitor analysis can tell you:

  • Key content trends
  • Content gaps in your plans
  • The content types getting most traction
  • What new ideas you might be missing
  • Competitors’ strengths and weaknesses, and the opportunities and threats from the wider market

3 free tools for analysing competitors’ content marketing

There are a number of helpful tools available, each with differing price ranges, pros and cons, to help you analyse content from across the web. In this post we’re going to look at three free stand-out tools that you can start using today to begin assessing the content performance of your own and your competitors’ websites:

Socialcrawlytics.com

Overview

Socialcrawlytics allows you to analyse the social performance of any given URL. By evaluating how content has been shared across multiple social networks, you can begin to determine what content is getting the most traction on which social network and how it is performing versus objectives. This process enables you to see how competitor content is performing and how it’s being constructed in terms of keywords, themes, style and structure.

Although users have to sign up for Socialcrawlytics.com via their Twitter account, once in you simply enter a URL you wish to analyse and Socialcrawlytics crawls the site and prepares a detailed report outlining how and where content has been shared:

Socialcrawlytics - overview

Key features:

- For any given URL, Socialcrawlytics provides an overview of shares across a range of social networks:

Socialcrawlytics - detailed page report analysis

- Provides a % breakdown of social performance by social network so you can easily see where a site is over- or under-indexing
- Offers a detailed, graphical view of which sections, and associated content, are being shared:

Socialcrawlytics - Sunburst

- Easily compare the level of sharing between your site and competitors’ sites
- Export results and categorise by content to get practical ideas of the types of posts you should be creating to engage your audience:

Socialcrawlytics - page level report

Drawbacks:

It would be good to see some information around backlinks to a URL so that comparisons can be made between social shares and links, as well as some additional detail on who is sharing what content the most.

Also, because the tool conducts an in-depth crawl of a site, it can take a while before full results are returned. If you’re looking for a quicker snapshot of results, the other tools reviewed here are probably a better bet.

However, as the saying goes: good things come to those who wait. And that’s definitely the case with Socialcrawlytics. The insight and level of detail this tool provides is excellent considering it’s free and can provide users with a wealth of knowledge about how a site is performing from a social media perspective.

Quick Sprout

Overview

As with many of the best tools available, Neil Patel’s Quick Sprout is an all-round website analysis solution, but it has a number of features that provide genuinely useful insights into how content is performing.

Key features:

- Great tool for those new to SEO as well as more experienced users
- Clear, simple and succinct visuals
- Data can be downloaded as PDF and/or Excel files
- Provides overview data that can be drilled into to find more detail, e.g.:

Quick Sprout - overview

- Content analysis covers:
  - Speed test
  - Top keywords and phrases
  - Backlink counter

- Social media analysis offers a comprehensive evaluation of which pages have generated shares on key social networks, including Google+, Facebook, Twitter, Pinterest and LinkedIn:

Quick Sprout - social media analysis 1

Quick Sprout - social media analysis 2

- The competitor analysis feature allows users to evaluate their website against up to three competitor sites:

Quick Sprout - competitor comparison

Drawbacks:

Because the tool provides a holistic view of a site, it’s not designed specifically to evaluate content and content marketing effectiveness alone, unlike Socialcrawlytics and BuzzSumo, so some users could be overwhelmed by the amount of data Quick Sprout provides.

Nevertheless, besides some frustrating limitations with the competitor comparison analysis (which I found to work only intermittently), the fact that Quick Sprout does provide an overall website analysis will certainly be an advantage for those looking to make wider SEO and content changes to their websites.

To really understand how content is performing (beyond simply how much it’s being shared), users need to get a clear view as to how it is being rendered on different devices, how quickly pages are loading and where keywords are being neglected, all valuable data that Quick Sprout delivers.

BuzzSumo

Overview

Whilst BuzzSumo provides much of the functionality offered by Socialcrawlytics and Quick Sprout, the tool’s simplicity, ease of use and speedy performance mean it offers some really useful insights for those looking for a quick snapshot of how a website or web page’s content is performing across multiple social channels.

Key features:

- The opening page offers a really quick, handy guided tour of how the tool works
- Provides a clear, simple snapshot of social performance of different pages within a domain:

BuzzSumo overview

- Users can also search by topic
- Data is easily filtered by content type and date
- Provides detail about the Twitter users who have shared the content, all of whom can be sorted by page authority or followers, and filtered further by ‘bloggers’, ‘influencers’, ‘companies’, ‘journalists’ and ‘regular people’:

BuzzSumo - sharer detail

- Search for key influencers by topic or keyword:

BuzzSumo - influencer report

- Searches can be saved and data exported to CSV or Excel files

Drawbacks:

In comparison to Socialcrawlytics and Quick Sprout, BuzzSumo is limited in that it only offers users data for five social networks and doesn’t provide anywhere near the depth of insight that the other two tools offer. It would also be good to see BuzzSumo offer extra detail on who shared content from other social networks (although this may be limited by each user’s and social network’s privacy settings).

However, the current benefits and features of BuzzSumo far outweigh the negatives, and besides, there’s a ‘pro’ version in the pipeline that may well address some of the drawbacks referenced above.

Summary

It’s difficult to pick a ‘winner’ from the tools reviewed above. Whilst all are free, each has its own distinct features and benefits, and each is useful for different reasons depending on the level of insight you’re looking to get.

I would advise readers looking to analyse content, whether it be their own or their competitors’, to use a combination of all three tools. Quick Sprout can provide granular detail regarding on-page content factors such as keywords, page speed and meta data; BuzzSumo provides a great overview of where content is being shared (and by whom); and Socialcrawlytics offers an even greater level of detail, with clear percentage breakdowns that can all be exported so the data can be analysed further.

A version of this post was originally published on Smart Insights on 3rd May 2014. I cross-post it here with all the links and tags for your reading pleasure!


Maximising SEO benefit through engagement

It’s been interesting to see the changes and evolution of Google’s algorithm over the last few years. Ever since the introduction of the first Panda update just over three years ago, Google have continued to emphasise the need for websites to have original, useful, good quality content if they want to rank highly in the SERPs.

What is Google looking for?

In the presentation above from this year’s BrightonSEO event, Tim Grice from Branded3 outlines the importance of looking beyond ‘traditional’ SEO activities (such as keyword research, link building and anchor text optimisation) towards activity focused on improving and maintaining an optimal user experience through comprehensive, targeted and relevant content for users.

Branded3 have demonstrated that, by decreasing bounce rate and increasing time on site, websites can improve their rankings by focusing on great content rather than by building links.

Content activity boosts search visibility

Although I doubt that this approach would work for every site on the web, it nevertheless provides some strong evidence that savvy digital marketers and SEOs now need to not only consider keywords, meta data and links, but also the variety, quality and type of content that works for their users.

Focus on engagement

Great content that’s creative, audience-led, insightful and useful provokes engagement, which is now more important than ever. And whilst I would disagree that traditional SEO activities (particularly link-building) are becoming redundant, it makes sense that Google will consider engagement metrics in addition to links in order to determine how users engage onsite once they click through from the search results.

At the end of Tim’s presentation it states: ’2014 – the year of building better websites’, and I think that sentiment sums up perfectly the approach digital marketers should continue to pursue. In order to maximise SEO in 2014 and beyond, we need to take a holistic view of the websites we’re producing and ensure that we create content that provokes engagement by meeting the needs of the visitor.