Hacker News
The Quickest Wins in SEO (segment.io)
116 points by ianstormtaylor on April 9, 2013 | hide | past | favorite | 43 comments


> example.com/sitemap.xml should point search engines to interesting parts of your site.

Really, an XML Sitemap should be used to show Search Engines ALL pages on your site, and should only be used when it can be dynamically updated via a CMS or database that knows of all URLs. Don't even get me started about the idiocy of using a random CRAWLER (e.g. https://www.google.com/search?q=xml%20sitemap%20generator) to generate a sitemap to be used by the best CRAWLER on the internet.
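Since the point is that a sitemap should come from a source that already knows every URL, here's a minimal sketch in Python (the build_sitemap helper and example URLs are my own illustration, not from the article):

```python
# Minimal sketch of a dynamically generated sitemap: pull every URL
# from your CMS or database and emit sitemap.xml from that list.
# (The page list below is a stand-in, not a real site.)
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # Root element with the standard sitemap namespace
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
```

The idea being: regenerate this whenever content changes, rather than pointing a third-party crawler at your own site.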

I know the tone of the post is informal, but this wouldn't be a HN comment thread without some nit-picking.


Yeah, I think if it makes sense for the site, you should focus more on making it fully crawlable. Generally, humans should be able to find all of your content, too.


Also a quick win (ironically): don't use the .io extension unless you're actively targeting the British Indian Ocean Territory, because Google doesn't include it in its list of generic TLDs.

See http://support.google.com/webmasters/bin/answer.py?hl=en&...


This is no longer true.

I've had a .io domain for more than 2 years. 2 years ago, I remember the country for the domain in Google Webmaster Tools was "locked" to British Indian Ocean Territory. But for at least the last 6 months, I've been able to change it to any country or leave it unspecified, which indicates that .io is now treated as a generic TLD by Google.

The article you have quoted is simply out of date.


Have you noticed any effect on your U.S. rankings?


Good point, although I assume this will be fixed by Google pretty soon. I visit lots of .io sites every day and I don't think I've ever hit one that was British Indian Ocean Territory-specific. Seeing as it's all the rage with developer-focused projects, it makes sense that it will join the ranks of .me and .co.


That's the point: almost none of them are. That said, I think it's pretty dangerous to build an SEO strategy around speculation that Google "might" change the rules around something as big as a gTLD.

I'm not saying it won't happen, but if you're specifically hoping to gain traction via SEO efforts, using a ccTLD that Google hasn't explicitly said ranks like a normal gTLD is taking a big, unnecessary risk.


Is this actually an issue? I'm very doubtful.


Yep, and the physical location of the IP counts too. I had the devil of a time getting a .ie site to rank in Google UK a while back.


The physical location of a server is increasingly less and less of an issue (see: CDNs).

As you mention, though, ccTLD choice is another story entirely.


It actually is a real issue. Google says it themselves - they give priority to gTLDs.


How does this affect .ca domains that are relevant in the United States?


I believe they're targeted towards Canada.


Hahaha, too true


I just did a major SEO audit for our site. Changing title tags and h1 tags is insanely important. It literally looked like we were hiding ourselves from Google. I'll probably do a post soon about this on our/my blog.


Would love to read that when you do. I think I understand what the title tags require, but I haven't heard about the h1. Is that just changing it to make sure that the h1 is the title of the page instead of the site?

Also how does the HTML5 heading system (where you can have multiple h1s on a page) affect SEO, do you know?


h1 is an indicator to the search engine that you consider that blurb/content to be the most important on the page. So the h1 should be optimized for the keywords you are trying to rank for. Also, if you have a bunch of h1 tags on a single page, it can actually dilute the weight of your most important heading. I recommend using just one h1 tag on a page to keep Google from having to guess what is most important. I also recommend not using any other h tags on the page for the same reason. Don't try to rank for a bunch of keywords on one page. Instead, concentrate on one keyword (or a set of related keywords) per page and use h tags to signal to the search engine what you think is most important.

Search engines are dumb. Make everything really obvious to them.
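A sketch of what that advice looks like in markup (the keyword and blog name are placeholders of my own, not from the thread):

```html
<head>
  <!-- Title tag carries the target keyword first -->
  <title>Red Widgets in Texas | MyWhateverBlog</title>
</head>
<body>
  <!-- Exactly one h1, matching the keyword you want to rank for -->
  <h1>Red Widgets in Texas</h1>
  <!-- Per the advice above, subordinate blurbs stay as plain markup
       rather than h2/h3 headings -->
  <p class="section-lead">Where to find red widgets...</p>
</body>
```

Note this "single h1, no other h tags" approach is one commenter's technique; the replies below disagree with it.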


Not using any other h* tags seems like crazy advice. I could definitely believe that search engines haven't become privy to the multiple h1 tags that HTML5 proposes, but I assume that will come with time too. Since the idea for multiple h1's makes lots of semantic sense for components, I'm going to keep using them. Keeping track of heading levels inside components is hard/impossible.


Have you seen the effects of not using any other h tags on your site besides the h1?

In my experience, I've seen the opposite. Well-done h2s, in fact, can really boost a site's rankings.


Yes, I have seen the main keyword get boosted in the SERPs when other h* tags are removed. I have seen this on multiple sites. This is not a rare technique and other SEOs do it too.

No doubt your page also can get ranked using the normal h- hierarchy.

The point of this technique is to put as much value on the main keyword as possible. Make it really obvious to the search engines what your page is about. Your page may have other requirements, so use accordingly.


Probably one of the better SEO articles I have read in a while. I wish clients could understand this.

Just want to point out that your Twitter share button should point to the article's page (current page) not your main url.


If you're looking for a free tool to analyze your site for titles/descriptions/broken links/etc, Screaming Frog SEO is great.


Second this, I actually pay for the yearly license and it's saved me far, far more than it's cost in the first full site crawl I ever did with it :)


No, healthy sites should not have 30-50% search traffic. If that's what you have, you have too few regular users and are at high risk to lose a huge chunk of your traffic and consequently business at the next <random animal name> Google update.

Also, bear in mind that Google has acknowledged that keywords in URLs "help a little", so this is a good SEO win that has (arguably) nothing to do with how useful people find your content. It's much easier than putting effort into better content, too.


> No, healthy sites should not have 30-50% search traffic.

I'd argue this really depends on what type of site you're running (and what its purpose is).


I agree. Even well-known brand sites, say BBC or NYT, still get people searching for brand terms like "BBC" or "NYT" because they're too lazy to type in the URL.


bear in mind that Google has acknowledged that keywords in URLs "help a little"

Actually, Google released an EMD update[1] in order to fix that issue.

[1] https://twitter.com/mattcutts/status/251784203597910016


Having keywords in your URLs is different than having an EMD.

The EMD update targeted the people who see the search volume for a specific keyword and then purchase exactkeyword.com to rank for that keyword.

But if I write a blog post about "red widgets in texas", it is still beneficial to have all, or some, of those keywords in my URL.

mywhateverblog.com/red-widgets-in-texas/ is better than mywhateverblog.com/?p=545
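A quick sketch of generating that kind of keyword-bearing URL from a post title (the slugify helper is my own illustration, not from the thread):

```python
# Turn a post title into a lowercase, hyphen-separated URL slug
# so the keywords survive into the URL.
import re

def slugify(title):
    slug = title.lower()
    # Collapse any run of non-alphanumeric characters into one hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

# slugify("Red Widgets in Texas") -> "red-widgets-in-texas"
```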


Actually the EMD update targeted EMD with low-quality backlinks to them. EMD that didn't have low-quality backlinks didn't seem to get hit. The goal of the update wasn't to penalize people with keywords in their domains.


Agreed. I was just simplifying, albeit poorly.

Sure, you can still build a high quality site on an EMD, but the main reason for the update was because it was a spammer's paradise. The majority of people registering them were doing so with the intent of abusing the EMD's power at the time. In many cases, spam will get you to the top of the SERPs. But during EMD's heyday, spam + an EMD almost guaranteed a #1 ranking for the target keyword. People were selling EMD starter packages that would include EMD, 5 articles, and some link spam for good measure. It was fairly easy because of the weight EMDs were getting in the algorithm. I have no data to back this up. Purely my observations from being in the space during that time.

So yeah, I should have clarified a bit. I wasn't trying to imply EMDs were all penalized.


Actually your examples: mywhateverblog.com/red-widgets-in-texas/ and mywhateverblog.com/?p=545 isn't comparing keywords in URLs but Static vs. Dynamic URLS - and static URLs e.g. /red-widgets-in-texas/ is NOT better than /?p=545 - This is something that Google have even said so themselves[1]

[1] http://googlewebmastercentral.blogspot.co.uk/2008/09/dynamic...


I'm still comparing keywords in URLs vs. no keywords in URLs, but with no regard to static or dynamic.

To your point, I could have used static vs. static and dynamic vs. dynamic examples such as:

mywhateverblog.com/page1.html is not as good as mywhateverblog.com/red-widgets-in-texas.html

mywhateverblog.com/?p=595 is not as good as mywhateverblog.com/red-widgets-in-texas?id=595

From Google's own SEO Starter guide [1]:

"Use words in URLs! URLs with words that are relevant to your site's content and structure are friendlier for visitors navigating your site. Visitors remember them better and might be more willing to link to them.

Avoid using lengthy URLs with unnecessary parameters and session IDs; choosing generic page names like "page1.html"; using excessive keywords like"baseball-cards-baseball-cards-baseballcards.htm"

And from Google's Webmaster Tools page regarding URL structure [2]:

"A site's URL structure should be as simple as possible. Consider organizing your content so that URLs are constructed logically and in a manner that is most intelligible to humans (when possible, readable words rather than long ID numbers). For example, if you're searching for information about aviation, a URL like http://en.wikipedia.org/wiki/Aviation will help you decide whether to click that link. A URL like http://www.example.com/index.php?id_sezione=360&sid=3a5e..., is much less appealing to users."

[1] http://static.googleusercontent.com/external_content/untrust...

[2] http://support.google.com/webmasters/bin/answer.py?hl=en&...


Does PageRank mean anything?

HN apparently has a PR of only 6, while for example my blog, which doesn't get a lot of traffic, has a PR of 5 (and even higher on some pages).

PageRank used to be talked about a lot and now it seems it's not even mentioned; is it still relevant?


It is still relevant, but in absolute terms it's a small portion of how pages are ranked.

The PR of the HN homepage is probably pretty irrelevant as I doubt it gets much traffic from organic search for anything other than "hacker news".

The PR of your blog homepage has to be taken in context for the type of search terms it might rank for (and remember that PR is for pages, not domains - the PR of your homepage will be different from the PR of each blog article).

If your blog was about vegan-friendly fondue recipes, a PR of 5 might indicate you had a link profile which meant you ranked highly because there likely isn't a great deal of competition in that particular niche.

If your blog was about dating tips, a PR of 5 wouldn't really indicate anything by itself because it's a really competitive niche.


and apparently "ryan dahl", "how to get stolen domain back" and "overhead walks on two legs" :-p


Not as much since the algorithm has evolved. Personally, I'm more interested in a site's Domain Authority[1] (anything over 30 is good)

[1] http://www.seomoz.org/learn-seo/domain-authority


This is still 'SEO 1.0' stuff. If you have something to sell, i.e. an e-commerce site, or if you have a recipe, or if you have a large collection of flora and fauna, the quickest win is to use semantic markup - 'SEO 2.0'. Let The Search Engine (Google) know what is description, title, price, currency, size, weight and so on with semantic markup tags and you are presenting The Search Engine with information that can be made sense of. Otherwise it is just throwing more needles in the haystack.
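For example, a product marked up with schema.org microdata might look something like this (the product name and price are placeholders of my own):

```html
<!-- Hypothetical product snippet using schema.org microdata, so the
     search engine can parse out name, price, and currency -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Red Widget</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price" content="19.99">$19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
</div>
```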


Micro formats are underrated and simple to achieve: http://microformats.org/wiki/Main_Page


This is a really good point. Google actually has a really good tool for testing this: http://www.google.com/webmasters/tools/richsnippets

Here is an example: http://www.google.com/webmasters/tools/richsnippets?url=http...


Citation needed. To me, all this semantic web stuff sounds exactly like meta tags in the old days - and the big difference between google vs lycos/yahoo/etc. was that it ignored meta tags completely.


The benefits of Schema (schema.org) are pretty well documented. While its existence on a page is not a ranking factor in itself, it can help an SEO campaign and the visibility of a site in innumerable ways, not least organic click-through rate.


Huhwhat? More people will click on a site if it includes a schema in its markup? Seriously?


I have been looking for a way into marketing for a while and I am glad you posted this. I ran my website through HubSpot and I found many holes that, when fixed, could increase my visibility.

Thank you for sharing



