If you’re looking to set up an affiliate network, or you’ve already got one, there are a couple of important points worth knowing that might change how you set it up.
Physically setting up an affiliate program is quite straightforward, and you have two choices for handling the inbound links and referral tracking:
- Single entry point: everyone links to a single URL on your site with their respective referral code, and that page then shunts the user to the desired page. Using this method, you might end up with links such as http://example.com/?ref=1234 (a hypothetical URL and parameter name, used throughout for illustration).
- Multiple entry points: allowing multiple entry points facilitates deep linking, so affiliates can link to any page directly. Using this method, you might end up with links such as http://example.com/products/widget/?ref=1234.
Both of these methods will work, but which one is better for search engine optimisation? If you use a single entry point, you’ll end up with hundreds or thousands of inbound links pointing at one page. Unfortunately, that page isn’t useful to a search engine for indexing – it simply redirects to another page. You do, however, get the benefit of being able to effortlessly reorganise a web site’s structure, since you only have to update the destination URLs in a single location.
Using multiple entry points allows your marketing team or affiliates to link directly to their intended page with their referral code, which makes a difference on various levels:
- it’s convenient for the people linking to the page
- it’s less error prone, as the linker can simply copy the URL from the browser
- the linked URL will begin to gain inbound links, which is critical for effective search engine optimisation
- the person clicking on the link can hover over it and see where it leads
The last point might seem like something you’d otherwise gloss over; however, as internet users become more savvy, they are becoming acutely aware of their online actions. Letting the user see the destination URL before they click will help build trust between your web site and them, as they will be less inclined to think the link is spam.
My personal preference is towards deep linking – it’s just so convenient. If you allow deep linking, the next problem is that your affiliate links will make their way into the search engine result pages, which is definitely not what you want. Fortunately, through the use of a robots.txt file it is possible to stop the affiliate URLs from being indexed. In the multiple entry point example above, you could stop those URLs from being indexed by including the following line in your robots.txt file:
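As a sketch, assuming the referral code is passed in a hypothetical `ref` query parameter, the robots.txt entry might look like this (note that wildcard patterns are honoured by the major crawlers such as Googlebot, but are not part of the original robots.txt standard):

```
User-agent: *
Disallow: /*?ref=
```

If your referral codes travel in the path rather than the query string, you would adjust the pattern to match that path prefix instead.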
Unfortunately, your work isn’t quite done, as all of the inbound links point to distinct URLs (i.e. with the referral code). As far as a search engine is concerned, these are totally separate web pages which could, or should, have unique content. To get the most out of your inbound links, you want to make sure each link ultimately points to the permanent URL for the content (i.e. without the referral code).
Remembering that you are tracking referral codes, the web site must first do something useful with the referral code. Useful might be placing the referral code in a cookie for later use or storing it in a database, but something generally needs to happen with it. Once the useful action has been completed, you need to send a standard HTTP redirect – a 301 Moved Permanently – to the user agent (browser, bot and so on) to tell it that the content permanently lives at a different URL – in this case the same URL without the referral code. Consult the documentation for your favourite server side language on sending HTTP response codes.
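The steps above can be sketched in a few lines of Python. This is only an illustration, not a complete implementation: `handle_referral` and the `ref` parameter name are assumptions for the example, and storing the code in a cookie or database is left as a comment. The function splits the referral code out of the incoming URL and returns it along with the permanent URL you would issue the 301 redirect to:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def handle_referral(url):
    """Split a referral-tagged URL into (referral_code, canonical_url).

    The 'ref' parameter name is a hypothetical choice for this example;
    substitute whatever parameter your affiliate program actually uses.
    """
    parts = urlsplit(url)
    referral = None
    remaining = []
    for key, value in parse_qsl(parts.query):
        if key == "ref":
            # Do something useful with the code here, e.g. set a
            # cookie or record it in a database.
            referral = value
        else:
            remaining.append((key, value))
    # Rebuild the URL without the referral code; this is the permanent
    # URL to send back in the 301 redirect's Location header.
    canonical = urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(remaining), parts.fragment)
    )
    return referral, canonical
```

For example, `handle_referral("http://example.com/products/widget/?ref=1234")` yields the code `"1234"` and the clean URL `http://example.com/products/widget/`, which is what the user agent is redirected to.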
By implementing these two simple techniques, only a single copy of each of your web pages will be indexed in the search engines, and any inbound referral links will ultimately be attributed to the permanent URL for the actual content.
You can now sleep easily at night knowing you have search engine optimised referral tracking.