Archive: Nov 2013

  1. Battle for conversion


Once you have managed to attract traffic to your website, the next problem arises: how to lead visitors to the goals you want them to complete – in other words, how to convert inbound traffic into sales.

In this article you will find a review of several tools that can be used to increase conversion and make users happy.

The process of conversion improvement consists of several stages. Each of them is equally important, as the information obtained at one stage is used in the next.

It’s not difficult to increase conversion

    You just need to:

    • understand website goals.
Website goals come in at least two types – transitional and final. What a final goal means is clear: for an online store it could be a purchase, for Groupon a sign-up, and so forth. But on the path to this goal there may also be transitional goals.
      Important: your goals and the goals of your users are not always the same. A fancy-looking site and a simple purchase process are not synonyms. Think like a user, even an inexperienced one.
      For example, some users may want to fly to Thailand, while others dream of Paris or Barcelona – they search for tickets many times and compare hotels, but make a purchase only after several months. In this case the path from transitional goal to final goal is really long, and this chain should be tracked.
• define entry points and how they differ.
      Users who arrived from organic search and frequent buyers may require different information from the service.
    • consider all possible paths from entry point to the goal.

“But that is not possible!” you may say. That is why you need to define the most important paths and make them as easy as possible.

    • be flexible, monitor and experiment
The most interesting part: at this stage the first three stages are repeated in a loop while you collect information and study the results.


    Tools review: how to track and adjust the process

So let’s say you are analyzing a website with information about air tickets. The main goals are: the on-site search, newsletter sign-up, and the most important but delayed-in-time goal – the air ticket purchase. This feature – delayed goals – became available not so long ago, and it is a really great option.
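Once a goal of the “JavaScript event” type has been created in the Metrica interface, it can also be reported straight from the page. A minimal sketch – the counter object yaCounterXXXXXX, the form id and the goal identifier NEWSLETTER_SIGNUP are placeholders, not something from this article:

    // Report the newsletter sign-up goal when the form is submitted.
    document.getElementById('newsletter-form').addEventListener('submit', function () {
        if (typeof yaCounterXXXXXX !== 'undefined') {
            yaCounterXXXXXX.reachGoal('NEWSLETTER_SIGNUP');
        }
    });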

After we have set up the goals, we need to gather the initial conversion statistics – this will be our starting point. Depending on the number of goals as well as visit and transaction frequency, the initial data can be collected in a few hours or over several weeks.

The goals you have set up may not only help evaluate efficiency but also detect problems on the website. For example, thanks to the conversion report you may find out that JavaScript isn’t working properly and half of your users are not able to reach the goal.

When the initial data is finally at our disposal we can get started. Let’s look at the tools that may help us with this.

We will begin with Yandex.Metrica
    Yandex.Metrica offers several marvelous tools for in-page analysis, thanks to which you can understand what prevents users from reaching the target page.

Form analysis – one of the major tools for online stores and service-based companies with an unusual registration process. It shows how many users began filling in the form, how many of them made it to the end, and where the majority abandoned the process. An excellent feature that really helps you create forms for people.

Link map – allows you to see where users actually click, what interests them and how they use the navigation.

Scroll map – shows you in seconds down to what depth the page actually “works”. The more content sits in the blind zone, the more you will have to change.

Click path analysis – gives you insight into how a user arrives at your website, where they go next, and how close they are to the final point when they quit the process.
The tools mentioned above are best suited for static analysis, meaning that before performing such analysis you will have to gather enough data over a long period of time – a week or a month.
Without additional settings, the click map is convenient to apply to pages like the main page. But what if your website has lots of pages of a similar type but with different URLs, and you would like to see the overall picture of user behavior? This is where urlFilter comes in handy. With the help of this filter you can group several pages into one. For example, if you wish to group the on-site search pages, you would write the following piece of code:
    yaCounter.clickmap({
        urlFilter: function (urlOrig) {
            var url = window.location.pathname.toLowerCase(),
                prefix = 'http://' + window.location.host;

            // group on-site search pages by the first characters after /search/
            if (url.substr(0, 8) === '/search/') {
                return prefix + url.substr(0, 11);
            }

            // other groups
            // ...

            return urlOrig;
        }
    });
In the report you will see these pages with their full URLs, but when you view the map, the result will be combined for all of the grouped pages.

Webvisor is a unique, free-of-charge function in Yandex.Metrica. Some time ago Webvisor was a separate product, but in 2010 Yandex purchased the code along with the team, and since 2011 this functionality has been part of the Metrica toolset.

Webvisor lets you view user activity in motion: it records their actions – clicks, form filling, mouse movement (a separate and fun bonus which indicates that a user was bored and needed some useful information to keep them engaged) – and then plays it back for you as a video.

Webvisor comes into play when there is a page that users consistently leave and you need to find out why they can’t continue along the path and reach the goal.

    Now let’s talk about Google Analytics
These days Google is transforming Analytics from a purely statistical instrument into a platform for A/B tests and usability experiments. You will find enormous functionality there.

Universal Analytics is the future of Google Analytics. Not all of its features work properly at the moment, as it is still in beta, but, no doubt, it offers great functionality. The new version of Analytics works faster, offers higher limits, allows sending requests from the backend, and so on. Besides, new Google Analytics capabilities will be implemented only in the Universal version.
    We are interested in the Conversions section.

The last three reports (Reverse Goal Path, Funnel Visualization and Goal Flow) allow you to see the overall picture of what is going on on the website, even for really large projects.

If, in addition to these features, you set up events, you will be able to see how users navigate through your website.

For example, the Goal Flow report gives you a visual picture of how visitors arrived at your website and reached the goal, and whether they took the path you expected them to or went through a different funnel.

Now let’s talk about the most advanced and coolest piece of Google Analytics functionality – the ability to send events from the server and track them offline!

For example, the same website offering information about air tickets doesn’t sell anything itself – the purchase is completed on a third-party website, so we can’t track the event because we are not able to place the tracking code there. Over time we receive sales data, but it is still unclear which source contributed the most.

Google helps everyone whose transactions happen offline solve this problem.

Google has unified the interface responsible for interaction with its analytics servers. All requests go to the URL www.google-analytics.com/collect and take the same set of parameters, regardless of whether we send them from the backend or the frontend. The one distinction is that in JavaScript we already have at our disposal a ready-made, useful function with which we can send any events:

    // connecting analytics.js
    ga('create', 'UA-XXXX-Y');
    ga('send', 'event', 'category', 'action', 'label', value);
On the server side, though, the whole request is sent as a POST payload, encoded as form-urlencoded:

    POST /collect HTTP/1.1
    Host: www.google-analytics.com

    v=1
    &tid=UA-XXXX-Y
    &cid=555
    &t=event
    &ec=category
    &ea=action
    &el=label
    &ev=value
You may notice the additional parameter cid — the client ID. This is also a new capability in Universal Analytics: whereas before only Analytics could determine who a visitor is, now we can specify our own ID manually.
Here is the official documentation for anyone who would like to try offline tracking.

This method can be applied to conferences, shops and concerts: you just give users unique coupons and then check in the Google Analytics events whether a user has made a purchase.
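To make the offline part concrete, here is a minimal sketch of sending such a coupon event from a backend, assuming Node.js; the tracking ID, client ID and coupon code are placeholders:

    // Send an offline purchase event to the Measurement Protocol endpoint
    // described above (www.google-analytics.com/collect).
    var https = require('https');
    var querystring = require('querystring');

    function sendOfflineEvent(clientId, couponCode) {
        var payload = querystring.stringify({
            v: 1,                 // protocol version
            tid: 'UA-XXXX-Y',     // your tracking ID (placeholder)
            cid: clientId,        // the client ID you stored for this user
            t: 'event',
            ec: 'offline',        // event category (illustrative)
            ea: 'purchase',       // event action (illustrative)
            el: couponCode,       // event label: the coupon handed to the user
            ev: 1                 // event value (must be an integer)
        });

        var req = https.request({
            host: 'www.google-analytics.com',
            path: '/collect',
            method: 'POST',
            headers: {
                'Content-Type': 'application/x-www-form-urlencoded',
                'Content-Length': Buffer.byteLength(payload)
            }
        });
        req.write(payload);
        req.end();
    }

    sendOfflineEvent('555', 'COUPON-2013-042');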
Working on conversion improvement can be an endless process. As soon as a new section, new functionality or a special offer appears on the website, it is time to start working towards better usability, higher conversion and website simplicity.

    If you are already using offline analytics from Google, please tell us about this in the comments below!


    Opinions expressed in the article are those of the guest author and not necessarily SEMPO.

  2. Common On-Page SEO Pitfalls


    A couple of weeks ago, I spoke at Turkey’s first SEO conference, SEOZone. Since our agency, Ads2people, conducts a large number of on-page audits, from very large and often multilingual corporate sites to regular blogs, I thought it would be helpful to talk about some common on-page pitfalls we see over and over again. This is an exclusive write-up for SEMPO summarizing that presentation. I hope it helps you improve your on-page SEO.

    #1 Crawl Budget

Given the fact that search engines such as Google assign a certain crawl budget per domain (and sub-domain), I’m always surprised at how often site owners simply try to push all of their content into the index. They also often seem to be completely careless about which pages are crawler-accessible at all.

    To assess and fix these problems on your site, a good starting place is Google Webmaster Tools (go to: Crawl > Crawl Stats), which gives a first impression of how a site is doing. A successful graph is slightly increasing – which usually reflects that Google picks up on content being added and therefore returns a bit more frequently. Conversely, if that graph is jumping or massively decreasing, you might have a problem.

There are two ways to control search engine crawlers: using a robots.txt directive or implementing a robots meta tag in the HTML mark-up (or serving it as an X-Robots-Tag HTTP header). However, neither directive on its own is a complete solution to your (potential) crawl-budget issues:

- Robots Meta Tag: Implementing a proper “noindex” does prevent a given page from showing up in search results, but that page will still be crawled – and therefore crawl budget still has to be used.

- robots.txt: Blocking a URL (or folder, etc.) does prevent it from being crawled (and therefore does not waste crawl budget); however, there are massive downsides. One is that pages might still (partially) show up in search results (mainly due to being linked from someplace else), and all inbound link juice will be cut off. In other words, those links do not help your rankings.

Considering those points, you might think about combining the two… but please – don’t! It simply cannot work. If a page is blocked using robots.txt, it won’t be crawled and the meta robots tag therefore can never be read at all!
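For reference, this is roughly what each option looks like on its own (paths and values are illustrative, not taken from a real site):

    # robots.txt – the folder is not crawled at all (saves budget, but cuts off link equity)
    User-agent: *
    Disallow: /internal-search/

    <!-- meta robots – the page is still crawled, but kept out of the index -->
    <meta name="robots" content="noindex, follow">

    # the same directive delivered as an HTTP response header
    X-Robots-Tag: noindex, follow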

    Watch out for things like filters and sorting, pagination, and other potentially useless pages. We see so often that these are simply being pushed to the index but certainly never can or will rank for anything. Don’t waste Google’s resources on that!

    As a rule of thumb: If you want to be sure not to waste crawl-budget, only have pages that really are useful (so don’t create others in the first place).  If you have others you don’t want to show up, I’d go with meta robots to at least utilize the inbound link equity.

    #2 Duplicate Content

I assume everyone is familiar with duplicate content (DC) issues, but it turns out that’s not the case (if you’re not, please read this first). It always surprises me to see how many sites out there are still not performing well due to a lot of internal (partial) DC. Even though most sites these days are OK in handling session IDs and tracking parameters, here are some “classics” I’d like to remind you of: HTTP vs. HTTPS is considered to be DC, products available in multiple categories (and not using a single product URL) cause DC as well, and sub-domains (like staging servers) might get you in trouble.

That said, the rel=”canonical” link element (or the rel=”canonical” HTTP header) can help you fix those issues, but I think it is only the third-best option for solving DC issues. In my mind, it’s really all about efficiency – so the best way to solve it is to make sure that you only serve content from one single (canonicalized) URL and not multiple ones. It’s as simple as that.

I’d generally not rely on something that Google calls “a strong hint” – because it’s a hint that they might or might not consider; it’s not a forcing directive like an HTTP 301 redirect (which they simply have to follow).

    Again it comes down to giving Google as few choices as possible.  Enforce single, unique URLs with amazing content and 301 redirect previously existing ones (e.g., old or multiple versions) to this (new) URL and you won’t suffer from DC issues.
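To illustrate, the canonical hint and the 301 look like this (URLs are made up for the example):

    <!-- inside the <head> of every duplicate variant of the page -->
    <link rel="canonical" href="http://www.example.com/products/blue-widget/">

    # the same hint as an HTTP response header (handy for PDFs and other non-HTML files)
    Link: <http://www.example.com/products/blue-widget/>; rel="canonical"

    # Apache example: permanently redirect a retired URL to the canonical one
    Redirect 301 /old-category/blue-widget/ http://www.example.com/products/blue-widget/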

    #3 Proper Mark-Up

There are quite a few differing opinions on whether and why proper mark-up is important. I won’t jump into that discussion here, but I’m a strong believer that clean and simple mark-up helps. That’s mainly because I really don’t want to take chances that a crawler might have “issues” when trying to extract information from a site. And that’s also why I think doing schema.org mark-up is a good thing: it helps engines (not only crawlers) to actually understand (parts of) the content and make sense of it. In short, to understand its meaning.

Obviously you have to consider which information you can and want to provide to Google (and others), but if you don’t give them your data, they’ll get it elsewhere. So generally speaking, don’t miss out on this. It’s about far more than just gaining CTR from more prominent results – which is great, by the way: if you combine structured data with rel=”author” and/or rel=”publisher”, the benefits are even greater. It’s basically Google moving toward understanding and assigning verified entities to sets of queries, and you surely don’t want to miss out on that. In my opinion, Google is massively moving toward a point where you need to be a verified authority for a given entity and will therefore automatically benefit from all the long-tail traffic that belongs to this entity – which makes a lot of sense given the fact that Google sees a massive ~20% of new queries per day.

So if you’ve not yet played around with Rich Snippet mark-up, I recommend you check out schema.org to see what’s in store for you, get it implemented, and verify your domain and author profile with Google+ to get things started. Good luck!
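As a small illustration, product information marked up with schema.org microdata could look like this (names and values are placeholders):

    <div itemscope itemtype="http://schema.org/Product">
      <h1 itemprop="name">Blue Widget</h1>
      <p itemprop="description">A widget, only blue.</p>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <span itemprop="price">19.99</span>
        <meta itemprop="priceCurrency" content="EUR">
      </div>
    </div>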

    If you’re interested in the slide deck, feel free to check it out on SlideShare.

    About the author:

    Bastian Grimm co-runs Ads2people, a full-service performance marketing agency based in Berlin, Germany, where he heads the SEO department as the VP of Search. Having a passion for software development and everything “Tech,” he loves to challenge IT and marketing departments to come up with outstanding results in search marketing. Bastian is a widely cited authority in SEO having spoken at almost every major search conference including SMXs, ISS, SES, SEOkomm, LAC, BAC, and many more events around the globe.

    Find Bastian on Twitter and Google+ or contact him at bg@ads2people.de or +49 30 720209710.


    Opinions expressed in the article are those of the guest author and not necessarily SEMPO.

  3. SEMPO’s Annual State of Search Survey Extended


SEMPO’s State of Search Report has been an industry stalwart for eight years running, providing valuable data and insights regarding search strategy and tactics. In order to gain as much participation as possible and thereby provide more actionable data, we are extending our 9th Annual Survey period through November 23.

    For those who have not yet taken the survey, we’re excited to announce some new features/topics that have been added while we have retained some of the aspects that have provided unique perspective over the years.  Here’s a quick rundown of what we’ve kept and what’s new:

    - We’ve kept the core strategic questions around channel specific budget increases/decreases, objectives, and metrics so we can compare/contrast with the results from previous studies. That’s one of the advantages of having established benchmarks over the years.

    - We’ve also kept the key survey structure that has similar but separate tracks of questions for advertisers and for agencies.  This is a unique feature of the SEMPO Survey that has consistently generated interesting take-aways and discussion when we analyze and report the survey results.

    - We added Social Media to the survey several years ago as it was becoming clear that social activity was being incorporated into search engine ranking algorithms.  This year we’ve added Mobile and Email Marketing questions to further gauge the expanding integration and evolving role of search with other digital content marketing channels.

    - And we’ve added questions around some of the major developments recently announced (Hummingbird, keywords not provided, etc.) along with our now standard questions to help identify and prioritize other emerging trends and industry challenges.

    The 2013 State of Search survey is open to SEMPO members and non-SEMPO members alike and it’s time for your voices to be heard.  Please take the 10 minutes or so to participate by clicking here.  By participating in the survey, you will receive a complimentary copy of the report as well as a chance to win an iPad 3.  Please also help us spread the word.  The more participation we get, the better the data is for all of us.

    A key finding from SEMPO’s 2012 State of Search Report was that advertisers and agencies had a very different perspective on the need to integrate search and social media activities.

(Chart: SEMPO State of Search 2012 – advertiser vs. agency perspectives on integrating search and social)


  4. Google Not Provided: Privacy Issue or Just a Ploy to Get More AdWords Sales?


Just last week, numerous SEO blogs and news outlets reported that Google is soon going to start encrypting all search activity, both for users who are signed in and for those who are not. The only exception will be clicks on ads, which Google will not encrypt. As you can imagine, this has many marketers up in arms and others simply scratching their heads wondering what comes next. Are there going to be any benefits for marketers, or is this the end of keyword data as we know it?

The Quick Basics: What Does Google “Not Provided” Mean?

    Hubspot reminded us that the discussion of encryption actually started back in October 2011, when Google announced that any users who are logged in to a Google product (Google+, Gmail, YouTube, etc.) would have encrypted search results. Essentially, a marketer would not be allowed to see the keywords someone used before visiting his/her company’s website, so knowing which keywords to optimize for was a struggle. As any good marketer knows, keyword insights open the door not only for optimizing an actual webpage but also for improving content marketing, retargeting, identifying audience, and much more.

The Real Reason Why: Is Google Doing This to Enhance Their AdWords Sales?

    Google is claiming it is for extra protection for searchers—a completely valid reason that makes sense. However, many in the field are a bit skeptical. Marketing Land feels that Google might be attempting to block NSA spying activity, while Search Engine Watch threw out the idea that Google might soon release a new “premium” version of Google Analytics where users would pay a monthly fee in order to get access to full keyword data. A more popular opinion is that it could be to drive more people to use Google AdWords. Since ad clicks are not part of this new announcement, how can we not jump to that conclusion? Many are telling small businesses to use AdWords in order to gather this organic data. Consider some quotes from around the web:

    - QuickSprout: “Even if Google goes with ‘not provided’ for all your data, you can still uncover new keyword opportunities by using keyword research tools or spending money on AdWords.”

    - Search Engine Watch: “At this time advertisers still get full keyword referral data from Google, while there is speculation this could change sometime in the future, there is also the necessity for advertisers to be able to determine conversions from the traffic they are paying for.”

    - Search Engine Roundtable: Coming from a Webmaster World thread, “Go fully broad match on every single keyword and pay AdWords for your data.”

    - Moz: “Optionally, we can use AdWords to bid on branded terms and phrases. When we do that, you might want to have a relatively broad match on your branded terms and phrases so that you can see keyword volume that is branded from impression data.”

    You certainly can’t blame anyone for giving users this advice because it is good advice. In fact, we’d give that advice ourselves. In short, Google’s plan has worked perfectly. It’s clear that AdWords is going to benefit and privacy was just a secondary thought in Google’s mind that happened to work perfectly when informing the public. Nevertheless, for now all we can really do is believe Google and move on to the next part of any announcement—create a new strategy that works.

Your Reaction: What to Do With Google Not Provided

    The first thing to understand is that the new change isn’t going anywhere, so it’s time to react, whether you agree with Google’s decision or not. Fortunately, there are ways to cope without falling into their trap and spending a lot more money on AdWords; there are still things that you can measure using search data that isn’t necessarily keyword data. Consider some of your options below:

    - Other search engines. The keyword trends you will find with search engines such as Bing and Yahoo are very similar to those you would find on Google. These engines have not encrypted their keyword data, so put your focus here and on the keywords that work.

    - Traffic from organic. You might not be able to see the exact keywords people are using to find your website but that doesn’t mean you can’t see your overall organic traffic just like you’ve done in the past. It might take a bit more work, but figure out what you’re doing in the way of keywords and how your traffic is performing and then find correlations.

- Use filters and track landing pages. You might not be able to see the exact keyword someone used, but if you set up a filter on all of the ‘not provided’ traffic and look at which landing pages those visitors arrived on, you can get an idea of what they were searching for when they came to your website (see the configuration sketch after this list).

- Google Webmaster Tools. You can view your top pages and the top search queries for which you get clicks in GWT. Although you can’t see anything past 90 days, it’s still something that can help you keep track of your progress.

    - Google Trends. This will help you see quickly if you are improving or you need to ramp up your efforts.
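As for the filter mentioned above, one widely shared (and entirely unofficial) setup rewrites the ‘(not provided)’ keyword so that it carries the landing page with it. A sketch of the configuration – field labels as they appear in the Google Analytics filter admin, values illustrative:

    Filter Type:               Custom > Advanced
    Field A -> Extract A:      Campaign Term   \(not provided\)
    Field B -> Extract B:      Request URI     (.*)
    Output To -> Constructor:  Campaign Term   np - $B1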

    In the end, this Google update is just something else that will force marketers to adapt, but it isn’t going to take away your job or ruin your chances in the results pages (after all, everyone is in the same boat). Many see this as a positive move for the industry because it will force websites to create great content and put a focus on things that will really produce a great website. As a user, you’re going to be a little bit safer. Do you think this change was for privacy reasons, or do you think Google was more interested in lining their pockets with some increased AdWords sales? What are you going to do in response? Let us know your story and your thoughts in the comments below.


    Photo Credit: lumicall.org

    Amanda DiSilvestro gives small business and entrepreneurs SEO advice ranging from keyword density to recovering from Panda and Penguin updates. She writes for the nationally recognized SEO agency HigherVisibility.com that offers online marketing services to a wide range of companies across the country.


    Opinions expressed in the article are those of the guest author and not necessarily SEMPO.