A question frequently asked by SEO newcomers is whether it’s okay to use a subdomain instead of a subdirectory for things like blogs. The question usually comes up because there is some technical challenge to using a subdirectory, like trying to get two CMSs working together nicely, or because it simply seems more logical to separate them. The SEO’s answer to this was, is, and most likely always will be that subdirectories beat subdomains. Recent evidence comes from a blog post by Iwantmyname that reached the front page of Hacker News. The tl;dr is that they moved their blog to a subdomain and their page views dropped. They didn’t recover until six months later, when they moved it back.
So what went wrong? Basically, the move resulted in a backlink/authority loss. The fundamental concept of SEO is that the more quality backlinks you get the better: each link improves the page’s PageRank as well as the domain’s overall ‘authority’. Iwantmyname did the right thing by using 301 redirects on the old URLs* and they obviously had a site-wide link from the main site, but the blog still lost the domain’s authority. This is because, for technical reasons, Google considered the blog a new site**. They were basically starting from scratch. While their blog post didn’t mention things like rank drops or traffic sources, the fact that they had lower page views for six months suggests that their search engine traffic suffered, which points to the ‘sandbox effect’. This is an age-based penalty applied to new sites that was introduced nearly a decade ago, when established sites were being knocked out of the top SERPs by new, highly-optimised ones (especially those run by blackhats). The length and severity of this penalty depend on various factors (such as domain age and what industry the site is targeting) and in almost all cases all you can do is wait it out. It can take over a year, and they clearly didn’t want to wait that long :P
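For what it’s worth, a move like theirs is usually paired with a single-hop 301 on the old host so each URL redirects straight to its new home without passing through a chain. A minimal sketch in Nginx might look like this (the hostnames are placeholders, not Iwantmyname’s actual setup):

```nginx
# Hypothetical example: redirect every old subdomain URL to the
# matching subdirectory URL with one 301, avoiding redirect chains.
server {
    listen 80;
    server_name blog.example.com;

    # $request_uri preserves the original path and query string.
    return 301 https://example.com/blog$request_uri;
}
```

The point of the single `return 301` is that each old URL hops directly to its final destination, rather than through an intermediate redirect.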
So what if you have to move part of the site to a subdomain? In my experience, Iwantmyname’s use of Nginx’s proxy feature is the best solution. It’s a simple but very flexible feature where the webserver proxies requests to another site and then serves the result. Users and search engine spiders never see the ‘background’ site, so they don’t know the content is on another server. I’m kind of surprised that it isn’t recommended more often; however, I think that as SEOs become more webserver-savvy over time it will be added to the standard toolkit.
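A minimal sketch of that kind of setup, assuming a separate blog application running on another host (all hostnames here are placeholders, not their actual configuration):

```nginx
# The main site's server block; requests under /blog/ are proxied
# to a separate backend, so visitors and crawlers only ever see
# example.com/blog/ in the URL.
server {
    listen 80;
    server_name example.com;

    location /blog/ {
        # Forward the request to the hidden blog backend.
        proxy_pass http://blog-backend.example.com/;

        # Pass along the original client details for logging.
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Because the trailing slash on `proxy_pass` rewrites the URI, a request for `/blog/some-post` is fetched from the backend as `/some-post`, while the public URL stays on the main domain.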
* While 301 redirects always pass PageRank on, I don’t know if it’s been proven conclusively that they pass all of it. In my opinion it probably depends on other factors, like whether it’s the first 301 or you’re travelling down a chain of them.
** A domain is a DNS record (that points to a webserver), and so all domains are themselves subdomains of something. For example, ‘chillidoor.com’ is a subdomain of ‘com’, ‘bbc.co.uk’ is a subdomain of ‘co.uk’, etc. It is easier for Google to treat every subdomain as a new site than it is to try to figure out the relationships between them.