SEO Book Ready For Free Download

I’ve uploaded my new book to the interwebs so you can now download and read through it, if the fancy takes you, completely free of charge (SEO Truth – A Bible For The Next Generation Of Search Engine Optimisation).

There may be a few blank pages because it’s been laid out for print; you can get yourself one of these hard copies from this website here.

Let me know what you think and, by all means, submit some feedback over on Lulu! Cheers.

Edit: The download link works correctly now, oops.


SEO Truth – Apparently I’ve Written A Book On SEO

Hello!

As usual I’ve gone a horrendously long time without writing anything on my blog – and for that I apologise. However, I have spent some of that time writing an SEO (Search Engine Optimisation) handbook, covering the importance of next generation techniques and practices.

I’m sure there are those of you who are all too familiar with the increasingly backwards approaches used by a few ‘special’ SEO agents and individuals out there, and perhaps for you this will merely reinforce what you already knew to be true. For those of you who don’t know what I’m talking about – please read the book and have a good laugh at yourself for being such a silly.

You can order print copies of the book – just not yet… more details on that coming soonly! I’ll be publishing online chapter by chapter (honestly, I have finished writing it, but as an SEO, if I didn’t serialise it then it would look bad).

Enjoy the read and let me know what you think – and if the first edition is terrible and you order it anyway, of course it’s going to be valuable in 200 years!

Preface

First off, I’d like to introduce myself. I’m a Search Engineer, a developer and programmer. I’ve worked with clients throughout the advertising industry at many different companies. My specialty is developing software that works with the search engines of companies like Google, Yahoo and MSN, attempting to influence the rankings of my clients’ websites as well as report on those ranking changes. I’ve never been to a lecture on computer science or read a book on development methodology, and yet I’m in demand. My skills lie in understanding the technology of a search engine and how to capitalise on its ranking algorithms, web crawlers and content filters, and it’s the ideas I generate in this area which have kept me in gainful employment.

SEO (Search Engine Optimisation) used to be a fairly simple task where you’d make sure every page on your client’s site had Meta tags, descriptions and content unique to that page. You might then try to analyse the keyword density of your key terms to keep them somewhere between 4 and 7 percent. More often than not, SEO companies wouldn’t even attempt that.

What most SEO companies would never tell you – and this is the industry’s best-kept secret – is that they’re intrinsically lazy. If you had a good client, with good content and a product of interest, then their SERs (Search Engine Rankings) would climb entirely naturally to the top spots; you’d have nothing to do but sit back and reap the benefits of your lack of work.

This is of course a sad state of affairs which no real SEO company would allow, and part of this book will help you to spot the difference between a professional outfit and rank amateurs, and define the widening gap between the two camps.

As the title suggests, I’m writing about the next generation of SEO. It’s becoming more difficult to increase the rankings of a particular website, and it will only get more difficult to manipulate a website’s ranking without any understanding of how new search engine technology works. Lucky for you, my field is semantics (essentially, how to correlate the relationship between one word and another) and you’re in for a whole chapter on manipulating a semantic index similar to those increasingly used by the major search engine players.
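
To give a flavour of what that chapter will deal with, here’s a toy Python sketch of the kind of measure a semantic index rests on – cosine similarity between word co-occurrence vectors. The words and counts are entirely made up for illustration; real systems build these vectors from huge corpora:

```python
import math

# Toy semantic similarity: words that co-occur with similar
# neighbours get similar vectors, and cosine measures how close.
def cosine(u, v):
    dot = sum(u[k] * v.get(k, 0) for k in u)
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

# Hypothetical co-occurrence counts: how often each word appears
# near 'engine', 'ranking' and 'kitten' in some imagined corpus.
search = {"engine": 9, "ranking": 7, "kitten": 0}
seo    = {"engine": 8, "ranking": 9, "kitten": 1}
cat    = {"engine": 0, "ranking": 1, "kitten": 9}

print(cosine(search, seo))  # high - the two words share contexts
print(cosine(search, cat))  # low  - they hardly co-occur at all
```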


Chapter 1 – The Past

To proceed correctly in the future, the most important lesson is for us to understand what happened historically. There’s no shortage of information on the internet, and amongst SEOs and webmasters, about how Google’s original PageRank system worked. This is in large part thanks to a paper written by Google’s founders, Larry Page and Sergey Brin, whilst they were still studying for their PhDs at Stanford University. Not long after that they received their first investment – from Andy Bechtolsheim, a co-founder of Sun Microsystems – which enabled them to build upon the hardware they had in their university dorm room and create the international phenomenon we know today.

PageRank was essentially a very simple system. It counted each link from one site to another as a vote for the destination site, and by voting for another site the original passed on a share of its own PageRank. The idea came from Salton’s Vector Space Model, a mathematical principle known to most Computer Science graduates today. This simple method of calculating which websites had the most votes, and therefore deserved higher rankings, is key to all search engine algorithms because it’s extremely fast to calculate. The most important factor in any search engine is its speed in returning and ranking results, especially when you’re dealing with an index of billions of pages.
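
To make the voting idea concrete, here’s a minimal PageRank sketch in Python. The three-page link graph is a made-up example, the damping factor of 0.85 is the value suggested in Page and Brin’s original paper, and dangling pages (no outbound links) are simply ignored to keep the sketch short:

```python
# A minimal PageRank sketch: each page's score is split evenly
# across its outbound links, then redistributed iteratively.
def pagerank(links, damping=0.85, iterations=20):
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outbound in links.items():
            for target in outbound:
                # Each link acts as a vote, passing on a share
                # of the linking page's current rank.
                new_rank[target] += damping * rank[page] / len(outbound)
        rank = new_rank
    return rank

# A tiny hypothetical link graph: each key links to the pages listed.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(graph))
```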

[Figure: The Anatomy of a Search Engine, based on the work of Larry Page and Sergey Brin whilst at Stanford.]

Once you understand that all calculations undertaken by a search engine must be as fast as possible, you can draw some logical conclusions:

·       Thinking about a page as a machine would (a machine reads your content but struggles to actually understand it), rather than as a human would, is key to analysing your website’s content for SEO value – see the sketch after this list.

·       Is every single underlined heading, keyword color, font size, image location, keyword relationship and page title length analysed when a page is crawled? It’s highly doubtful that anything too in-depth is going to be indexed when the crawler has another hundred thousand pages to visit and rank as quickly as possible – use some common sense here. Of course, as processor speeds and bandwidth increase, more in-depth analysis will become possible in a shorter space of time.

·       The search engine needs to maximise two things: the speed of its calculations and the quality of its relevancy measure. Occasionally one is going to suffer at the expense of the other: if you had to choose between indexing a page poorly – or not at all – which would you do?
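
To make that first point concrete, here’s a rough Python sketch of what a page looks like from the machine’s side once the markup is stripped away. The HTML snippet is obviously a made-up example:

```python
from html.parser import HTMLParser

# A crude "machine's eye view" of a page: strip the markup and
# keep only the raw text tokens a crawler might index.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_data(self, data):
        self.tokens.extend(data.lower().split())

html = ("<html><head><title>Blue Widgets</title></head>"
        "<body><h1>Blue Widgets</h1><p>Buy blue widgets here.</p></body></html>")
extractor = TextExtractor()
extractor.feed(html)
print(extractor.tokens)  # the flat word list a crawler actually 'sees'
```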

SEOs in the past were able to capitalise on this speed issue by choosing to concentrate on areas of a page such as the Meta tags, description and page title. The content itself gradually became more important as time went on, but was still subject to the speed of indexing. SEOs quickly realised that keyword density (how many times a keyword appears on a page as a proportion of the total number of words) was a very quick way to determine some kind of relevancy – and that the search engines were using it too.

Once the search engines got wise, they implemented filters that stopped SEOs from flooding a page with keywords. Arguments followed in the SEO community over exactly what the ideal keyword density for a term was, and this usually settled somewhere between 4 and 7 percent.
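
For what it’s worth, the calculation everyone was arguing over is trivial – a quick sketch, with a made-up word list:

```python
# Keyword density: occurrences of a term as a percentage of total words.
def keyword_density(words, term):
    return 100.0 * words.count(term) / len(words)

words = "blue widgets are great buy blue widgets today".split()
density = keyword_density(words, "widgets")
print(f"{density:.1f}%")  # 25.0% - far above the 4-7% band SEOs argued over
```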

Of course, the PageRank model meant that agencies were keen to build as many links to their client websites as possible. To make matters worse, they were after links from pages that already had high PageRank values, to gain the maximum ranking boost as quickly as possible, and this spawned a cottage industry of people generating high-PageRank links purely to sell on. Google, of course, were unhappy about this, and their anti-spam team began its work. Blacklisting of websites which ‘farmed links’ became fairly common, and this moved on to other aspects of ‘black hat’ SEO behaviour – where an unfair advantage was being gained by some nefarious companies and individuals.

Most SEO agencies at this stage relied heavily on staff who’d be subjected to some extremely tedious and repetitive labour. Going through page after page of a website, adjusting the number of keywords, and slightly changing each page title and Meta tag was a boring job, and not a well-paid one.

Directors and CEOs didn’t have a whole stack of problems though: if they kept building up link relationships with well-ranking websites and making sure their Meta tags were in place, their job was done. Often enough they’d have clients who already had an interesting product which did most of the work itself, spreading links around the internet as people registered their interest.

This natural traffic increase was exactly what Google was looking for, as they wanted sites which progressed on their own merits rather than sites which tried to beat the system.



Google Shows How Much You’re Worth To Them

I’ve said many times before that the so-called SEOs out there need to stop believing every word that spills from Google’s overactive pen. Google is a business just like any other, and they feed out deliberately misleading information to stop people from gaining an unfair advantage in their search rankings.

It now appears that Google does, in fact, assign an estimated worth to each ranking on their pages. Visible to members of their AdWords sales team, it uses the information from your PPC campaigns and analytics package to figure out whether you’re worth it.

Google’s GG Score Shows How Much You’re Worth To Them In The Search Rankings

I’m sure many of you will draw your own conclusions from this, and in time we may see a Google press release from that department which, again, knows about as much about how their technology actually works as most SEOs do. Take it with a pinch of salt, is my advice, and invest the time to understand how a search engine really works.

This story was broken on the French blog Zorgloob – much credit to them for a brilliant find.


I Can Has Likkle Written Contentz?

Hi Readers!

The internet is an odd place: as I look at wordpress.com right now, I see the top few blogs are I CAN HAS CHEEZBURGER?, passive-aggressive notes from roommates, neighbors, coworkers and strangers and, of course, Scobleizer.

For those of you yet to witness the phenomenon of icanhascheezburger, let me summarise for you by saying it’s a blog filled with cute/demonic pictures of animals, mostly feline in nature, with captions underneath. The passive-aggressive notes blog is exactly as it says in the title: pictures of amusing passive-aggressive notes.

As a further exercise in demonstrating to you the power of this medium let me give you an example of an icanhascheezburger image (taken of my girlfriend’s cat, yesterday):

f**k bucket, has pub

If you haven’t been to the site, you most likely won’t understand. The ‘bucket’ is an in-joke, of the kind these websites often produce. Why exactly, though, is it so popular over the thousands of blogs that produce well-written, quality content?

It’s fast

There are many facets to the speed here. Firstly, it’s very quick for the authors to add a new post: all they need to do is get an image, put it in the wordpress editor, add a couple of lines about the submitter and possibly the humorous content if they can be bothered, and they’re done. This means they can generate hundreds of posts in the time it takes the rest of us to put out one or two (sorry, wasn’t talking about you Scoble, or you Winer). The other quick thing they can do when they add a wordpress post is select categories; this is essentially a very fast way of tagging, and means that as well as quickly refreshed content they also have targeted keywords. Hello, good SEO.

It’s also fast for users; if you don’t get the joke in the first pic you see, it’s a one-click scroll to the next one. You laugh, it’s funny, you whack the link in an email and send it round the office. They even have a lolcats generator that lets you put a caption on your picture of a cat in about 20 seconds AND automatically submit it to the site. Auto-generated content, essentially, which is just gold.

If a site updates less often, the search engines aren’t the only things that return less frequently – the same applies to all your human users as well. They’re far more likely to come back if they think the content updates often, and even more so if they think their cat might appear in the next post.

What next?

I think that very soon you’ll see an abundance of these kinds of websites arriving, if people are smart (often they’re not).

All kinds of non-text media will benefit from this treatment, and a social-voting-style system for it will allow a much faster turnaround on content. You’ve seen it with Digg, and this is one of the reasons they really should add an images section – they’re losing out hugely there.

Other websites have also shown the advantage of fast content generation from any source – Twitter allowing updates by mobile phone, for example. I can upload pictures to blogspot directly from my k800i; it’s a shame I don’t like the blogging software.

Urrr.

I completely lost my train of thought – I went and read some C# documentation and then all my post ideas ran away. I may finish this later when I regain my mind.


The New Science of Inbound Linking : Part II

Generating traffic to your website is often the easy part. Getting that traffic to return frequently, and building an active community for discussion, is where it gets tricky.

Have no fear though: I’ve amassed some wisdom on the subject (stolen from others) and will try to regurgitate and plagiarise it word for word.

In fact, that would be a bad idea. I’ve read so many posts, especially on creating traffic, that are practically carbon copies of every other post on the subject. Please don’t do that any more – it drives me insane. Try these in future:

  • You have to maintain the interest of your users by writing about new thoughts and ideas.
  • If you can’t think of anything to write about related to your subject matter, don’t. Just discuss something more personal – what you’ve done in your week, for example. You’ll often find that as you write, subjects relate back to your tech interests and you’re off again.
  • If you have no new content, then talk about interesting content you’ve seen elsewhere. It won’t kill you to have some outbound links, you know, and you may even get a reciprocal link in return.

Targeting your posts correctly will have a huge effect on maintaining your audience, so don’t try to generate traffic from irrelevant sources – there’s no point wasting your time on it, and it’ll give you the minimum of returning visitors. Instead, find blogs or websites like your own and post comments on topics you’re interested in, with a link to your site if possible. If not, then build yourself a ‘web identity’: consistently use your name or online nickname when you’re answering in comments, and then if someone decides to Google you there’s a better chance they’ll find your blog at the top.

Make sure you keep a close eye on your comments; each and every time a user asks a question within them, give the best answer you can. Don’t enter into slanging matches, but try to keep a level head, answering as honestly and openly as you can. The other users who read your opinion will then have a better view of you as an impartial writer – even if some of them don’t say anything directly.

Allowing users easy access to an RSS feed of your articles is an obvious one that’s been mentioned by many others. What they often fail to mention, though, is an RSS feed of comments specifically. This will create more discussion and, as a result, more returning users.

Using your web statistics package, keep an eye on the top search terms used to access your website organically. Once enough data builds up, you can begin to target more articles towards these terms, further strengthening your position and targeting your site more effectively at the users who actually find it interesting.
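
As an illustration of how little tooling this needs, here’s a rough Python sketch that tallies organic search terms straight from a server access log. It assumes the referrer URL carries the query in a q= parameter (as search engines’ results pages commonly did) and that your log is in the combined format with the referrer in quotes – the file name is just a placeholder:

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Tally search terms from referrer URLs found in an access log.
def top_search_terms(log_path, limit=10):
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            # Pull out a quoted referrer URL containing a 'q=' parameter.
            match = re.search(r'"(https?://[^"]*[?&]q=[^"]*)"', line)
            if match:
                query = parse_qs(urlparse(match.group(1)).query)
                for term in query.get("q", []):
                    counts[term.lower()] += 1
    return counts.most_common(limit)

print(top_search_terms("access.log"))  # hypothetical log file path
```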

Keep an eye on what is not a hit as well as what is. If posts on certain subjects aren’t finding an audience, then either rewrite them in a context your readers can appreciate or, in future, just leave them out. By doing this you’re increasing the percentage of posts your readers will appreciate, and they’ll be more likely to sit in anticipation of the next article’s arrival.

What I’ve written about here, as usual, are simple routes that are often left out in the blog culture of reiterating every point a hundred times across a thousand websites. As those of you who read regularly know, I strive to write originally and usefully, and encourage others to think the same way.

I’d like to hear what you think of the duplication of content and thought that is becoming increasingly present in our online culture, so please let me know.

Digg This!


Monkeys

Sorry I haven’t written much of late – I’ve been going hard at it at work and on personal projects in my own time, meaning very little time to write. For those of you waiting for part 2 of The New Science of Inbound Linking, it will be up this week, I promise (this is probably only going to make any kind of difference to one of you anyway).

Recently I’ve been talking to a fair number of people about the direction SEO is going to be heading in over the next five years or so, as compared to the last five. Like me, many people seem to think that there are going to be some major changes very shortly. Some are even worried that there will be no more SEO – that the engines will completely manage to dispense with the need for it, and for those who put it into practice.

I’m not too worried myself: as long as there is an internet, people will be trying to call attention to themselves. In fact, as long as there are people, they’ll be doing that – so if the industry does end, everyone will be dead. Excellent.

This doesn’t mean we don’t need to keep expanding our skillsets, learning new techniques and increasing our range. There seem to be those around who are all too happy to rely on the same old tricks – which over the years have increasingly come to be seen as black hat. So I for one am pleased that Wikipedia have nofollowed all their outbound links; it should have stayed that way the last time.

Myself, I’m putting a lot of work into semantics, video and rich media formats to keep up with the curve, as well as reading every blog and technical journal I can get my hands on. On top of that, I spend a lot of time on personal projects that often never see the light of day when the industry takes a different turn – but just having them in my portfolio shows the breadth and willingness to keep learning that every developer needs, and should be passionate about.

In short, keep creating.


The New Science of Inbound Linking : Part I

You know those articles that routinely get Dugg to the front page with a headline something akin to: “AMAZING: 10 ways to promote your blog”? Well, this is one of those articles. But this time I’m actually going to give you useful information that, chances are, you won’t have seen all of – and most probably not all in the same place.

Know where you are

Before you even think about trying to do anything at all to build your inbound links, you must above all else get yourself a statistics package that you can:

  1. Install yourself.
  2. Understand.
  3. Really, understand.

It’s imperative that you can actually read the data you’re seeing. If you’re lucky enough to have a wordpress blog, the built-in tracking is excellent for this kind of thing, and simple enough for my granny to read even through the dense fog of swirling smoke that follows her at all times. If you can’t understand it, then you can’t interpret how successful your efforts have been, and you may as well just not bother.

Be social

There are an ever-increasing number of social networking websites out there now, some catering for specific markets and some catering for all comers. What they have in common is the ability to let their users post content and/or links which other users can then peruse, and usually rate. It’s good practice now to keep an account with all of the major social networking players – myself, I’m at Digg, delicious (I refuse to put in the dots and waste my time), wordpress, mybloglog, newsvine, reddit, stumbleupon and a few more.

The challenge is to try and keep each one regularly stocked with content; myself, I try to maintain this by having a central blog and bookmarking out from there. My Digg account is the easiest, I find, to keep on top of, and potentially the most valuable. I probably spend a good hour a day on Digg reading and submitting, but as a reward you get pushed up the rankings and find more friends to Digg stories with.

Stumbleupon I find is brilliant for generating hits, but I’m never very sure if they’re highly relevant ones or if some poor sucker just gets thrown into the middle of one of my semantic search rants when they ‘stumble’. It won’t hurt to use, though, and I personally submit every article I write.

I also submit my blog to pingomatic.com every time I update – this lets blog indexers such as technorati know that your blog has been updated with fresh content, and saves you from notifying each of them manually; which is nice.
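
If you’d rather not visit the site each time, the same ping can be sent programmatically – Ping-O-Matic accepts the standard weblogUpdates XML-RPC ping at rpc.pingomatic.com. A minimal Python sketch, with placeholder blog details:

```python
import xmlrpc.client

# Send a standard weblogUpdates.ping to Ping-O-Matic's XML-RPC
# endpoint so blog indexers learn the site has fresh content.
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
response = server.weblogUpdates.ping(
    "My SEO Blog",              # blog name (placeholder)
    "http://example.com/blog",  # blog URL (placeholder)
)
print(response)  # a dict with an error flag and a status message
```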

With all these systems, every time somebody either votes for your article or leaves a comment, add them as a friend. This is the most important part of social networking, and one which a surprising number of people completely overlook. You actually have to build a social network to gain the maximum benefit; SEO is no longer about using websites as link sources but about finding people with similar interest areas and keeping them in contact with your content or product once you do find them.

That’s all for part 1; part 2 to follow shortly: How to keep your traffic hooked.