Category Archives: Search

Public Speaking

A few months ago I was asked to present at the Online Retailer Roadshow in Brisbane, a single-day conference organised by Reed Exhibitions covering a broad array of topics associated with online marketing.

Historically, I’ve found that a lot of one-day conferences don’t have enough depth or aren’t able to attract really great speakers, but I was completely impressed by the line-up for this conference and was in the esteemed company of people like:

  • Kieran O’Hea
    Chief Digital Officer – Brisbane City
  • Faye Ilhan
    Head of Online – Dan Murphy’s
  • Joshua McNicol
    Head of Marketing – Temple & Webster
  • Steve Tosh
    Director Omni-Channel – Dick Smith

My presentation was on unbeatable SEO tips, or search engine optimisation for the uninitiated, i.e. how to make a website highly visible in Google.

Since the conference was about online retailing, I decided I’d cover a raft of foundation items: technical problems that aren’t handled well on ecommerce websites, tips on building links to their websites, outpacing the competition with great content, and maximising the effect that social media can have on a site. While social media isn’t a ranking factor for Google, it correlates well with rankings because great exposure and awareness for a website generally leads to things that do affect rankings, such as links.

After getting the majority of my presentation organised, I gave myself a practice run-through in the boardroom at work and, despite the fact that the boardroom was empty, I felt really nervous for some reason. Once I got going, everything began to fall into place and I worked my way through the content without a great deal of effort.

During the run-through I thought I was going fast, spending only a limited amount of time on each slide and topic, but despite that I was way over my 30-minute time allotment – so I cut slides out of the presentation.

It turns out I should have cut even more slides out: even with the reduced content, and still feeling as though I was tearing through the topics, I went over the time limit at the event as well. Fortunately the organisers gave me a few extra minutes, which was great.

It’s difficult to find the balance between going too advanced or diving too deep into a topic and not providing enough detail for the audience. I know from conferences I’ve attended that nothing frustrates me more than a presentation I don’t learn something from, so I tried to avoid that by focusing on items that could genuinely improve an online business if the audience took action to address them.

For my first public speaking engagement with an audience of over 100 people, I feel it went really well. There were a lot of people taking notes and photos of my slides while I was presenting, so I think that is an indication that they were finding value in what I was speaking about.

Now that my first one is out of the way, I’ll definitely be keeping my eyes open for more opportunities – it’s a lot of effort, but it’s good fun and you get to meet other business owners and professionals, which I really enjoy.

Nothing Like Spam To Lower Your Opinion Of A Site

Squidoo spam, regarding cat urine removers.

Last year, Seth Godin launched a new service named Squidoo, which aimed to bring the power of recommendation to search. Squidoo has been gaining reasonable momentum since it launched and, as a result, it has now become the next haven for spammers.

I appreciate that it might be difficult to manage the problem, however if a site like Squidoo can’t get it under control then it completely erodes the usefulness of the site in my opinion. The Squidoo spam problem has been happening for quite some time and the owners are aware of it, however it seems that they have yet to find a way to curb it.

It’s a shame really, as I hate spam so much that it has now tainted my opinion of Squidoo as a useful service. Fortunately, I don’t find the site particularly useful anyway, so at least I don’t have to put up with the spam while browsing it.

Dashes Versus Underscores, It Doesn’t Matter

One of the common strategies employed in search engine optimisation has involved placing high-visibility keywords and phrases into the URL. Using this technique led to URLs which looked like:

  • http://domain.com/mysuperdooperproduct.html
  • http://domain.com/my-super-dooper-product.html
  • http://domain.com/my_super_dooper_product.html

For quite some time, search engine optimisers all over the internet pondered whether dashes or underscores performed better. It wasn’t long before it became clear that using a dash as a word separator outperformed the humble underscore.
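
To make the difference concrete, here is a minimal sketch in Python – using a hypothetical slugify helper of my own naming, not something from any particular CMS – of how a content management system might generate a dash-separated URL slug from a product name:

  import re

  def slugify(title):
      # Build a URL slug using dashes as the word separator.
      slug = title.lower()
      slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse anything that isn't a letter or digit into a dash
      return slug.strip("-")

  print(slugify("My Super Dooper Product"))  # prints: my-super-dooper-product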

Matt Cutts, a Google engineer, stated at WordCamp 2007 that Google will very shortly support an underscore as a word separator, in addition to the existing dash. Although the difference between the two seems subtle, for many web sites it has proved a significant thorn in their side, as their content management systems produced URLs which used an underscore rather than the recognised dash.

This news is going to make a lot of people very happy in the near future.

Google Image Search Supports ImgType=Face

Google Image Search has supported the imgtype parameter for a long time and it recently received an upgrade, now accepting an imgtype with a value of face.

At the moment there isn’t an option on the advanced search page to restrict images to that imgtype, so you’ll need to add it to the URL manually. To give you an idea of what it might be useful for, compare two search results for the same query – one normal and one with the face filter.
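
The original result links aren’t reproduced here, but the two queries would have looked something along the lines of the following, with the only difference being the imgtype parameter on the second URL:

  • http://images.google.com/images?q=alistair+lattimore
  • http://images.google.com/images?q=alistair+lattimore&imgtype=face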

The first set of results is anything that has been associated with ‘Alistair Lattimore’, while the second filters the results to images containing faces. In that particular example it isn’t perfect, however it’s quite easy to see where this is going. This new functionality is apparently a by-product of Google acquiring Neven Vision last year, a company that was developing specialist facial recognition software.

Removing Indexed Content From Google The Easy Way

Google are constantly improving their services and during April they updated Google Webmaster Tools; this release relates to removing content that has already been indexed by Google.

Google have supported removing content from their index for a long time, however the process was often slow to take effect. With the recent addition of URL removal to Google Webmaster Tools, it’s now possible to process the removal of a page quite quickly.

As with everything associated with Google Webmaster Tools, the web site to act on first needs to be verified. Once verified, there is now a URL Removals link under the Diagnostics tab. The removal service supports removing URLs in the following ways:

  • individual web pages, images or other files
  • a complete directory
  • a complete web site
  • cached copies of a web site

To remove an individual web page, image or file, the URL must meet at least one of the following conditions:

  • return a standard HTTP 404 (missing) or 410 (gone) response code
  • be blocked by the robots.txt file
  • be blocked by a robots <meta> tag

Removing a directory has fewer options available: it must be blocked using the robots.txt file. Submitting http://mydomain.com/folder/ would remove all objects which reside under that folder, including all web pages, images, documents and files.
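
As a concrete illustration, a minimal robots.txt covering both cases – the /folder/ directory from the example above and an individual page whose file name I’ve simply made up – might look like this:

  User-agent: *
  # Block the directory so it, and everything under it, can be removed
  Disallow: /folder/
  # Block a single page (this file name is just an example)
  Disallow: /old-product-page.html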

To remove an entire domain from the Google index, you need to block it using a robots.txt file and submit the expedited removal request. Google have once more reinforced the point that this option should not be used to remove the wrong ‘version’ of your site from the index, such as a www versus non-www version. To handle this, nominate the preferred domain within the Google Webmaster Tools and optionally redirect the wrong version to the correct version using a standard HTTP 301 redirect.
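
For example, one common way to implement that redirect – assuming an Apache server where you can drop a .htaccess file at the site root, and assuming mydomain.com without the www is the preferred version – is a sketch like this:

  RewriteEngine On
  # Send any request for the www host to the non-www host with a permanent (301) redirect
  RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
  RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]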

Cached copies of web pages can be removed by adding a robots <meta> tag with a noindex value to the given page(s) and submitting the removal request. By using this mechanism, Google will never re-include that URL so long as the robots noindex <meta> tag is present. By removing the robots noindex <meta> tag, you are instructing Google to re-include that URL, so long as it isn’t being blocked by alternate means such as a robots.txt file. If the intention is to simply refresh a given set of web pages, you can also change the content on those pages and submit the URL removal request. Google will fetch a fresh copy of the URLs, compare them against their cached copies and, if they are different, immediately remove the cached copy.
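
The tag itself goes in the <head> of each page you want dropped from the cache, for example:

  <meta name="robots" content="noindex">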

After submitting requests, it’s possible to view their status. Requests are listed as pending until they have been processed, marked as denied if the page does not meet the removal criteria, and once processed they are moved into the ‘Removed Content’ tab. Of course, you can re-include a removed page at any time as well. It should be noted that if you remove a page and don’t manually re-include it afterwards, the removed page(s) will remain excluded for approximately six months, after which they will be automatically re-included.

Being able to remove content from the Google index so quickly is going to come in handy when certain types of content are indexed by accident and need to be removed with priority.