Thursday, June 23, 2011

My 300th blog post

Hi everyone. With this blog entry I have completed 300 posts. It all started in August 2009. I’ve shared lots of information and enjoyed doing so. Thanks to all those who have encouraged me by posting their valuable suggestions and comments on my posts. From the next post onwards I will try to share even better information.



5 Tips for SEO & User-Friendly Copy

You hear it time and time again: content is king. It’s king because it’s what users want, and it’s king because of SEO – i.e. if you don’t have good content, you’re not going to rank.
Of course, not all content is created equal. While over-optimizing content for search will make it less than user-friendly, focusing too much on usability can compromise its searchability.
So how can your content walk that fine line between SEO and usability? Here are 5 easy tips to follow.

1. Length of On-Page Content

The ideal minimum length of page copy for SEO purposes is 250 words. So where your user experience permits, you should have 250 words (or more) of actual inline content – i.e. not headers, not sidebar content.
That being said, 250 words is just a minimum. As a general rule, the more content the better. In fact, I’ve personally noticed that when a page has 1,000+ words, it has a much better chance of ranking for the keywords it’s optimized for.
Of course, there are pages where it doesn’t really make sense to have so much content. In such cases, 250 words of content would disrupt the user-experience and push interactive features below the fold.
Some examples of this are category pages on blogs or ecommerce sites. In these cases, you might want to optimize the categories for a certain niche/vertical of keywords, and adding 250 words of text at the top of the page would help.
But from a usability standpoint, categories exist so that users can navigate/browse products or content within a category. So adding 250 words of text would disrupt the user experience by pushing those product/article links below the fold.
A decent compromise in such cases is to insert a bit of static content at the top of the main-content area of your category pages. Generally, 300 characters (about twice the length of your page’s meta description) is enough to (1) get some descriptive keywords on the page and (2) keep the category links/listings well above the fold.

2. Scannable Paragraphs

Knowing that you want at least 250 words on each page, how do you make that content as usable as possible? That is, how can you make it scannable so that users are not deterred from actually reading it?
As a rule, you should aim for 3-4 line paragraphs (2 sentences). Of course, in some circumstances, a paragraph warrants more than 2 sentences. But by keeping paragraphs within 3-4 lines (5 lines max), you can create a text-experience that users can easily scan, making them more likely to actually consume the content.

3. Headers

Now, what do you do if you have more than 250 words on a page? If you’re talking about more than one product or service? How do you keep the user engaged?
Well, you do that by structuring your content. Basically, you should section off your content using header tags (e.g. <h2> and <h3>). This will create a break in the content that (1) makes it seem easier (i.e. ‘not as long’) to read, and (2) provides visual cues to pull the user’s eye through the content.
From an SEO perspective, moreover, the keywords in headers help clue search engines in to what your content is about. For example, three headers might tell search engines that three different (but related) topics are being discussed on the page.
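As a quick sketch (the topic and headings here are invented purely for illustration), a structured page might look like this:

<h1>Choosing Running Shoes</h1>
<p>Opening copy…</p>
<h2>Trail Running Shoes</h2>
<p>…</p>
<h2>Road Running Shoes</h2>
<p>…</p>
<h3>Road Shoes for Flat Feet</h3>
<p>…</p>

One <h1> for the page topic, <h2>s for the sub-topics, and <h3>s for anything nested beneath them keeps the hierarchy obvious to both readers and crawlers.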

4. Keyword Density

Keyword density is how often a keyword appears on a page. Depending on who you ask, the ideal keyword density is anywhere from 2%-5%.
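To put a number on that: keyword density is simply occurrences divided by total word count. A keyword used 10 times on a 500-word page gives 10 ÷ 500 = 2% – the very bottom of that range.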
The only problem with this is that even at the low end, this can make for unnatural prose. For instance, if a keyword makes up 3% of all words on a page, the user will probably notice that the keyword appears quite often. In fact, it will make your copy seem robotic, uncompelling, and generally spammy.
Where you can fit in these extra keywords, however, is in your header tags. Indeed, by using keyword variation to draw up optimized header tags, you can gain an extra 1% of keyword density without making the actual copy seem contrived and unnatural.

5. Bullet Lists

One of the biggest copy tradeoffs between usability and search is bullet lists. While bullet lists help make content more scannable, search engines regard them as “broken content” – meaning they don’t count as much as other page copy when it comes to keyword density. There are two ways to get around this.
First, you can use a bullet list at the top of the page to outline the page’s content. This will help cue the user into what they can expect as they read through the page, which will help them better navigate the content.
If you do go this route, however, make sure that (1) you have 2 sentences/3 lines preceding the bullet list, (2) you have another 2 sentences/3 lines following it, and (3) there is a minimum of 250 words of “unbroken” content on the page. This will not only put the bullet list in context for the user, but also help ensure that search engines properly index the page.
Alternatively, you can simply place a bullet list further down in the content, as part of one of the subsections. If sufficient content precedes a bullet list, that list is less likely to factor heavily into how the page is indexed.

Content: Kings & Jesters at Court

A general rule for mitigating tradeoffs between search and usability is to develop content for humans, not for search bots. After all, at the end of the day, search engines are out to provide users (real human beings) with the most relevant content available.
If you take every possible step to optimize your site for search, chances are you’ll produce some rather non-user-friendly content. And the paradox there is that the less user-friendly content is, the less engaging it is, and the less likely it will be to attract backlinks or go viral.
So you should always think of the user before you think of the search bots. But always keep in mind that there are some elements you can add to a page that both increase usability, and help optimize your page for SEO purposes.

The Ultimate Guide to On-Page Optimization

We work with a great many talented website designers and developers who create first-rate websites for their clients – aesthetically superb, but not always as search engine friendly as they could be. Overall this can make the project more expensive for the client, because it means that we as the SEOs have to go back under the hood of their shiny new website to make tweaks for better on-page optimization.
This guide is intended to be a cheat sheet that allows you as a designer or developer to bake SEO into the projects you are working on – saving your clients time and money in the long run, making the process much more seamless for them, and making your life a heck of a lot less stressful. Well, at least there will be no more SEOs bugging you to make changes!

Website Factors & Site Architecture Factors

URL Structure – the golden rule when it comes to URLs is to keep ’em short, pretty, and inclusive of the page-level keyword. Never build on a platform that doesn’t allow URL rewriting: ugly URLs aren’t just ugly, they’re also a mistake SEO-wise. Short, sexy, search-friendly URLs are easy for users to share with their social networks or link to – not to mention how much easier a logical URL structure makes website management!
Website structure accessibility – inaccessible navigation is a real headache when it comes to SEO. A navigation wrapped in JavaScript is bad, and a menu made from Flash is worse. Now I bet you are thinking, “JavaScript makes a website more user friendly because it creates things like drop-down menus, helping the user make better sense of the page options.” This might be true, but we need to balance usability with search engine friendliness. Firstly, we shouldn’t forget that a slick-looking menu/navigation bar can render a website unusable on certain devices and in certain browsers (try switching off JavaScript or Flash). But from a strictly SEO perspective, it could mean that pages deep within your site aren’t being indexed, because the only links to them are from a menu wrapped in code that the spiders can’t decipher.
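To make that concrete, a crawlable navigation can be as simple as a plain HTML list styled with CSS (the page names here are placeholders):

<ul id="main-nav">
  <li><a href="/services/">Services</a></li>
  <li><a href="/portfolio/">Portfolio</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>

Spiders can follow these links with no JavaScript or Flash required, and any drop-down behaviour can be layered on top with CSS so the underlying links stay accessible.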
Considered use of JavaScript – following on from the point above: whatever Google says, there is clear evidence that the search engine struggles to handle JavaScript. Reams and reams of unreadable code could mean Googlebot heads somewhere else rather than crawling any deeper into your site. It might also cause other issues, like crawl errors, and damage your website’s crawl rate – neither of which is a good thing!
Canonical URLs – use the rel="canonical" attribute to specify your preferred URL for a page. This is useful in situations where almost identical pages appear at different URLs because of something like a category choice or session ID being added. It is important to tell Google and Bing which page is the one they should index and pass all relevant link juice and authority to. Failure to implement canonical URLs can mean duplicate content issues, but more crucially a loss of rankings, as search engines divide link juice and page authority between the copies of the page – something that could have been avoided if the correct page had been stated in the rel=canonical tag.
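As a minimal sketch (example.com is a placeholder domain), the tag sits in the <head> of each duplicate variant and points at the preferred URL:

<link rel="canonical" href="http://www.example.com/red-widgets/" />

A session-ID variant like /red-widgets/?sessionid=123 carrying this tag tells Google and Bing to consolidate indexing and link juice onto /red-widgets/.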
Unique meta titles and descriptions – to many, on-page optimization is just about changing a meta title here or there; hopefully this list will show you otherwise. Whilst making meta title and description changes might feel like SEO from 1997, in my experience it is still part of the bigger on-page optimization jigsaw. In my mind, it is quite a simple step in the process: a unique title and description for every page, front-loading page-level keywords in a natural, non-spammy way. There are of course other meta tags you could include, e.g. ‘keywords’, and whilst I am sure some people will disagree with me on this, I only see value in optimizing the titles and descriptions – tags like the keywords meta have been abused to the point of being almost a complete waste of time. Google might not always use the title and description you give a page, but at least you’ve told the search engines what the page is about; and if Google does use your title and description, you have some influence over encouraging a user to choose your website over the other choices in the SERPs.
Robots.txt file – a good starting point for robots.txt best practice is this guide from SEOmoz. It is always worthwhile ensuring a robots.txt file doesn’t contain any unwanted directives for the search bots; even if you haven’t added anything, someone or something working on the site before you might have.
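For reference, a deliberately simple robots.txt might look like this (the paths are illustrative):

User-agent: *
Disallow: /admin/
Disallow: /checkout/
Sitemap: http://www.example.com/sitemap.xml

A stray “Disallow: /” in a file like this is all it takes to block an entire site from being crawled – exactly the kind of unwanted directive worth checking for.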
XML Sitemap – fairly common practice nowadays but still worth a mention. An XML sitemap should always be available. It helps make the search engines aware of all the pages on your website and increases the likelihood of faster inclusion in the index for newer pages.
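A bare-bones sitemap following the standard XML sitemap protocol looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/new-page/</loc>
    <lastmod>2011-06-01</lastmod>
  </url>
</urlset>

One <url> entry per page is all it takes, and most platforms can generate this file automatically.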
Website speed – I’m sure this issue is right at the fore of your mind when it comes to building websites, because it is a really hot topic right now. Google recently enabled all webmasters to monitor page loading speed directly from their Google Analytics dashboard; if they’ve made it that easy for you, you can bet they are using this data as part of their calculation of where to rank your website. Google loves to improve user experience, and since a fast-loading page is definitely a better user experience, I can see this playing an increasing role in the SEO of the future, particularly in competitive markets. Also, Amazon.com conducted a study and found that for every 100-millisecond increase in page load time, their sales decreased by 1% – so the reasons for improving page speed go way beyond just SEO! There are multiple ways to improve site speed, so I won’t go through them all here. All I will say is: code responsibly, choose a good host, and set up a CDN (content delivery network) if your client is targeting users worldwide.

Content Factors

I was in two minds as to whether to include this section in the final guide, because as a designer you might have limited control over content factors. But then again, in my experience designers certainly have some responsibility either for the content itself or for formatting and publishing it, so I feel these factors are worth mentioning.
Content language – Google uses the language the text content has been written in as a reference point for relevance to the user making the search query. If you are targeting an English-speaking country, the content should be written in English. Obvious, really, but it does reinforce the need for localized websites if you are helping a client target other countries that speak different languages.
Content uniqueness – one phrase I am sure you are bored of hearing is ‘create unique content’ if you want to do well in the search results. People keep saying it because it is true. Unique content sends the right kinds of quality signals to Google: more users engage with it and talk about it, they share it, and it generates more links. Encourage your clients to invest in useful, unique content that offers real value to the reader – or, if necessary, take responsibility for this yourself.
Amount of content – the recent Google Panda algorithm update has had an impact on what could be considered the right ‘amount’ of content. My suggestion is that you encourage clients to consolidate existing content or target new content creation efforts towards smaller but higher quality hubs of content. Help and advise clients to remove pages that are basically just a carbon copy of another page on the site but with a few different keywords.
Unlinked content density – pages that contain a lot of links particularly to external pages never look good in the eyes of Google. It gives off a link farm/poor quality directory/paid link operation type vibe which is not just damaging to the page but also to the website and to the pages it links to. Whilst there isn’t an optimum density, as a rule of thumb the number of links should feel natural and be well balanced with unlinked text or other types of content. If all the links are absolutely necessary, consider breaking them down into smaller categorized pages to improve the unlinked content density.
Is the content well-written? – there isn’t any direct evidence to suggest Google penalizes a website for poor spelling or grammar. That being said, a badly written page is off-putting for the user and will therefore send the wrong kinds of signals to readers or potential customers; and since Google is incorporating user feedback like bounce rate into its algorithm, keeping the user happy is vital.
Expertise and Depth of content – Google is smart and since it is on a mission to organize the world’s information, I would be willing to bet that it has already hit the mark or is close to it when it comes to understanding how deep a piece of content goes and whether the author is an expert or not. Algorithmically it could probably quite easily detect if key topics within the theme have been discussed and whether there are any factual inaccuracies meaning it is more important than ever to really be the expert.
Keyword location and repetition – it is widely accepted that Google places more emphasis on keywords that appear higher up a page. This is based on the logic that if something is important, it is likely to be included first. My suggestion is always (provided it looks natural) to front-load the heading of the page with the keyword being targeted, then to mention the keyword within the first paragraph, and then, depending on the length of the page, at selected intervals throughout the text. The key is to keep it natural: there’s no optimum keyword density, but there certainly is such a thing as over-optimization and keyword stuffing, both of which will see the page – and possibly the site – subject to a penalty. Interweaving keywords into text so that it works for both user and search engine can be quite challenging, but it is worthwhile.
Spam keyword inclusion – if you run an adult-themed website then of course this is unavoidable, but be vigilant about the quite innocent and accidental inclusion of these keywords on what would ordinarily be a very family-friendly website. This will be a real turn-off for the search engines because of safe-search filtering, and also because they may suspect your website has been compromised by hackers who have injected spam keywords and links.

Internal Linking Factors

Number of internal links – one of the reasons that Wikipedia ranks so well is its internal linking structure. Of course, each of its pages wouldn’t hold so much weight if it weren’t for the overall authority of the website, but the online encyclopedia has still mastered internal linking best practice: it adds a link to another page on the site wherever it feels natural and will be useful to the user, allowing them to flow through the website. You can take this concept and apply it to your client’s website, helping them to increase pages per visit, improve user experience, and ultimately improve page rankings through increased link volume. They may ‘only’ be internal links, but they will still serve to enhance your off-page link building efforts.
Anchor text of internal links – anchor text is still an important factor in link value. It will likely decrease in importance thanks to the abuse of it, but for now anchor text still rules. Use this with care, however, particularly if you are working on a very large website where internal link implementation could result in hundreds if not thousands of links with the same anchor text – something easily detectable by Google, and which may result in a penalty. Just as with off-page link building, it is important to vary anchor text internally. Consider making the header navigation a keyword anchor text link, the breadcrumb a variation of this, and in-content links something like “learn more about our services” – too many identical anchor text links can be overkill.
Source of internal links (content, breadcrumbs, navigation) – when it comes to link building campaigns, it is always advisable to encourage links from a variety of sources, and the same applies to organizing internal links. Take care to ensure that links to internal pages are balanced: too heavy a reliance on, for example, breadcrumb navigation could dilute some of the power of your internal links.

Quality Factors

Google is making leaps and bounds towards making truly high-quality websites more visible in the search results. It is important to ensure you are helping clients give off the right kinds of ‘quality signals’: here are some factors worth considering.
A gorgeous design – Google can’t quite grade the looks of your website just yet, but it can gauge the reaction of visitors. Good-looking websites keep people engaged and stop them clicking away, which keeps the bounce rate low. Google utilizes user feedback metrics like bounce rate, so anything you can do to improve the user experience is going to be a big win in the SEO arena.
Custom(ised) design – it doesn’t have to be a completely custom design, but it is reasonable to assume that Google looks less favorably upon websites that use free or even premium themes and do absolutely nothing to make them their own. I’d imagine Google takes this stance because it is quite reasonable to say that a webmaster who hasn’t bothered to get the basics of a website right is unlikely to be creating something high-quality in the long run. That might be an over-simplification and a sweeping generalization, but Google is trying to crunch vast swathes of data and web pages; it doesn’t have the time to individually review every page out there.
Address, privacy policy, TOS, and other brand signals – Google post-Panda is looking to promote ‘real’ businesses and brands. Adding an address, a privacy policy, and the other basic housekeeping that reputable online operators would have on their website can make all the difference to how well a website performs in the search engines. This Google blog post offers some guidance on building high-quality websites, and one of the rhetorical questions asked is “Would I trust this website with my credit card details?” If the answer is no, it would suggest there are some quality issues that need addressing.


Designing an SEO Friendly Website

Nearly every business – from local plumbers to large blue-chip organizations and even churches – has a website, and each and every one wants to spread the message about its products, services and profile, and what it can offer you as a consumer. That is where SEO comes in. SEO has become a valuable online marketing tool for businesses of all shapes and sizes, but for an SEO campaign to be successful, solid foundations need to be in place. Doing things right from the outset can really reap benefits and save time and money.
Below are my top tips for designing an SEO friendly website from the ground up. These cover the little things that need to be done (or avoided) from the outset, so a website can eventually live a happy life at the top of the search engines.

Keyword Research

Before I even start a website design/build, I establish what the business objectives are. Many businesses will have a clear plan of what they want to achieve in terms of targeting, which may range from the local area to a national or even global audience.
With this in mind, when designing wireframes and deciding on customer journeys and sitemaps, you can build in additional pages of content that target specific keywords based on your findings – or at least have a plan for where these pages may go down the line, once the site has launched.
From a design perspective, there is nothing worse than going back to a website six months later and having to change the structure and internal elements for additional content that may be used for SEO purposes. So my advice would be to plan early.

Search Engine Friendly Navigation

Now, when I say search engine friendly, I mean a navigational system that the search engines can read and follow. One of the many factors of on-page SEO is the internal linking structure, and the navigational system is the backbone of this. Having buttons and links that are text-based is a major plus, and great for accessibility.
I would always avoid Flash-based navigational systems for one simple reason: Google, along with the other search engines, can’t read text embedded in Flash or images. The other thing to consider is that as the web moves more and more to mobile devices, how many of these will support Flash? Apple has already said its devices won’t support Flash elements, and even my BlackBerry doesn’t do a great job with Flash websites. Please don’t think I’m a Flash hater – I do like the concept – but from an SEO point of view it can be a big barrier, especially with navigation.
Instead of Flash, why not use CSS methods or jQuery? These can be just as good, if not better – just make sure the load times for these elements are fast.
Also with navigational systems, it’s important to clearly label the links. If a link says "about", it should take the user (and the search engines) to the “about” page. As mentioned above, your internal link structure needs to be good, if not bullet-proof! Another example: if you have a page about "restaurants in London", label any links that go to this page from the navigation or other internal links with the anchor text "restaurants in London". Make it clear and descriptive for the search engines, adding that little bit of extra internal link juice from an SEO perspective.
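In markup terms, that just means descriptive anchor text rather than something generic (the URL here is a placeholder):

<a href="/restaurants-in-london/">restaurants in London</a>

rather than:

<a href="/restaurants-in-london/">click here</a>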

Website Load Times

Every so often a big update is made to the search engine algorithms, and normally the news is about Google. One SEO factor that was introduced in early 2010 was website load time/speed. I experimented with this and found that speeding up a site’s various elements did have an effect – so much so that a website I experimented on moved up 3 places in Google.
Now when I say various elements, I mean things like:
  • CSS files – remove unwanted/unused code, or place all the code on one line per div/class (see the quick example below).
  • The amount of JavaScript in the code – this can be really slow depending on what you are using it for, so I’d advise using it sparingly.
  • Website image size – see point 4.
  • File size – remove white spaces and any unessential line breaks in code, keep it streamlined.
Making improvements in the areas above will increase speed and generally help with SEO (not forgetting the user experience).
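To illustrate the CSS point (the selector and values are invented), “one line per div/class” just means collapsing each rule:

/* before */
.header {
  color: #333;
  margin: 0;
}

/* after */
.header { color: #333; margin: 0; }

Every stripped space and line break shaves bytes off the download, and across a large stylesheet it adds up.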

Website Images

As a golden rule, a website has about eight seconds to sell the company and/or its products, and there is nothing worse than waiting for a website to load – especially large images and backgrounds. As mentioned above, website load time is a ranking factor. The two best tools I have used to reduce image file sizes are Adobe Photoshop and Adobe Fireworks; admittedly Fireworks did the slightly better job, even if only by a few kilobytes.
Also, if the website is an e-commerce site, create smaller images for the galleries. There is nothing worse than waiting for an image to load that is displayed at only 200px x 200px on screen but has merely been re-sized using HTML code – remember, speed is now a key factor in SEO and can’t be avoided.

Keyword Placement

So, one of the major factors of SEO is telling Google what the page is about. This is done by writing great “user-focused” content. Within this content it’s important to get the keywords in the right position on the page. Here are the best places:
  • Title tag
  • Meta description and keywords
  • Website slogans
  • Navigation
  • Breadcrumb trails
  • H1, H2 and H3 tags
  • Bullet points
  • Alt text
  • Title attribute on links
  • The main website copy
  • Internal links
  • Footer links
  • URLs
  • File / folder names
One thing to remember with the above: don’t overdo it. Google has become heavily focused on the user, so make sure the content is aimed at the user – that way it will also become link-worthy content.
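To pull a few of these placements together, here is a hedged sketch of how one keyword (“wedding dresses”, chosen purely for illustration) might appear on a single page:

<title>Wedding Dresses | Example Bridal Shop</title>
<meta name="description" content="Browse our range of wedding dresses for every season.">
<h1>Wedding Dresses</h1>
<img src="/images/wedding-dress.jpg" alt="Ivory lace wedding dress">
<a href="/wedding-dresses/vintage/" title="Vintage wedding dresses">vintage wedding dresses</a>

Used once in each location, the keyword reads naturally; forced into all fourteen spots at once, it tips into exactly the over-optimization to avoid.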

Add Social Elements

2011 has seen a shift in SEO: social is now a contributing factor. Not only is social a good way to demonstrate to potential customers that the company has a voice, but social networks like Twitter also help towards good rankings.
It’s crucial with the design of the website that social elements are added in on 2 different levels.
  1. Make it clear that the company is available and contactable on social networks with prominent buttons and icons
  2. Adding the Twitter feed on site can also help with keyword placement and regular updates (you must be a daily user of Twitter), and can also speed up the Google cache rate (i.e. how often Google visits the website and checks for updates).

Friendly URLs and image filenames

One major thing I have noticed over the past few years is the number of websites that don’t use friendly URLs. An unfriendly URL may look like this:
http://www.websiteshop.com/products/item1?=20193
A better example for a friendly URL would be:
http://www.websiteshop.com/formula-one/clothing/ferrari-tshirts
As you can see from the two examples, the second option has a good selection of keywords. This will help Google and the other search engines identify what the page is about, and having keywords in the URL is a good SEO method for keyword placement, as mentioned above.
In terms of images, having an appropriate file name is also vital. For example, women looking for a wedding dress will more than likely go to Google Images to find design ideas. Having an image named "img310.jpg" isn’t going to help with the Google Images algorithm, so a better idea would be a file name like "wedding-dress.jpg".

Sitemaps

Sitemaps are purely designed to tell the search engines about all of the content on the website. This ensures that the search engine bots find all of the content that may be 2 or 3 folders deep within the website, so that content has a good shot at ranking for specific keywords and phrases.
One thing I have noticed with large e-commerce websites without sitemaps is the lack of pages indexed in Google. A great example is an e-commerce site I worked on recently that had a catalogue of over 2,000 products. After conducting research, I found that only 500 pages had been indexed in Google. With the introduction of sitemaps, their indexed pages went up to 1,500 in 3 weeks – this also increased their exposure in Google. They then started to gain more long-tail keyword searches, and overall conversions increased off the back of this.
In terms of sitemaps, I always recommend using 4 different types:
  • XML
  • ROR (aka RSS Feed)
  • URL List
  • HTML
This gives the search engines a variety of choice when it comes to locating all of the pages on the site. Another thing worth doing is including links to all 4 sitemaps on every page of the website (usually in the footer) to help the search engines further, especially with buried content that could be 2 or 3 levels (folders) deep.

Google Web Fonts

Creating visually interesting designs usually means using web-unfriendly fonts, and creating text elements with an attractive font normally means using images as a workaround. As mentioned earlier in this post, Google and the other search engines can’t read text in an image, which could cost you really good real estate for on-page keyword placement.
Back at the beginning of last year, Google opened up a new Font Directory (http://www.google.com/webfonts). So instead of using images for text, you now have a large collection of open-source fonts to use on the web, completely free!
So, in a nutshell, you can keep those super-attractive designs while using a readable web font, which means the search engines can read the text and use it as a ranking factor.
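As a quick sketch using the directory linked above (the font choice is arbitrary), embedding a web font takes one stylesheet link and one CSS rule:

<link rel="stylesheet" href="http://fonts.googleapis.com/css?family=Lobster">

h1 { font-family: 'Lobster', cursive; }

The heading remains real, selectable, indexable text, so any keyword inside it still counts.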

301 Redirects

Now, I’m sure that as a designer you have come across the re-design scenario: you have finished the design and launched the new website, and then all of a sudden, rankings drop!
A great way to combat this, especially if you have restructured the website with new file names or moved content is to use 301 redirects in the .htaccess file.
This does 3 things:
  1. Tells the search engines that the page has moved to a new location and needs re-indexing
  2. Tells the search engines the page has been renamed and needs re-indexing
  3. Any links that were pointing to the old page will now flow through to the new page via the redirect. As links are an imperative part of SEO, you can’t afford to lose these valuable links; retaining them means retaining good rankings.
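As a minimal .htaccess sketch (the file names are placeholders), a single moved page needs one line using Apache’s Redirect directive:

Redirect 301 /old-services.html http://www.example.com/services/

For a full restructure you would repeat this – or use mod_rewrite rules – for every old URL that has links or rankings worth preserving.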

Prediction: W3C Validation

Over the past 18 months I have blogged about how validated websites don’t have an impact on search results. However, with Google updates such as "Caffeine" and "Panda", which focus on search quality and user experience, I believe validation may become a factor in the future – so it’s important to look at this area sooner rather than later.

Final note…

The above points are certainly a must, but one thing to remember, especially with search engine optimisation, is that continuous work is required to gain great results.
The above gives you the basics and a fantastic starting block for a successful SEO campaign.

How to choose an SEO-wise domain name

When you’re looking for a domain name, you want to go for one that is easy to remember, easy to type and that gives an idea to the visitor on what your site is about. However, finding that perfect domain name is not an easy task. If you’re part of a very competitive business, you really have to put your creativity at work to come up with a top-notch domain name.
In SEO there are a lot of opinions on how important keywords in a domain name really are. They’re important, but not vital: an optimised domain name is worth nothing if it’s not supported with unique content and quality backlinks.

Common questions

  1. How important are keywords in a domain name?
  2. If I include more keywords will I get more traffic?
  3. Should I use hyphens? How will these affect my domain?
  4. What’s the difference between .com and other domain extensions from a SEO point of view?
  5. If I add a suffix or a prefix to my main keyword, does it make it weaker?
I’ve tried to come up with some answers to clarify part of the existing misconceptions regarding SEO and domain names.

1. How important are keywords in a domain name?

The simple answer is that yes, keywords are important. Having strong keywords in your domain name helps bring some traffic to your site. On the other hand, don’t imagine you’ll get tons of visitors just because you have one or two good keywords in your domain name. You have to work hard to earn traffic no matter what your domain is called, so don’t make keywords the main factor in your decision-making process.
However, if you’re looking for a domain name with branding value, be as creative as you can and don’t worry too much about the keywords in your domain name. It’s better to choose an original name that will differentiate you from the keyword rich domain names available nowadays, rather than stuffing it with keywords for nothing. Go for a name that’s memorable and then put your marketing budget to good use to promote it heavily.

2. If I include more keywords will I get more traffic?

Including more keywords does not guarantee more traffic. More keywords might bring you some traffic, but they can just as easily work against you: people might consider your site spammy and simply not visit it, so you can also lose traffic this way.
Keywords in a domain name are not as important as those in your title tags and site content. If you don’t support your site with unique, optimised content and quality backlinks, the domain is worth nothing. Your domain name should reflect what your site is all about, not just be a combination of keywords in an attempt to attract low-value traffic.

3. Should I use hyphens? How will these affect my domain?

You can include hyphens; it really doesn’t make a big difference from an SEO point of view. Some people prefer to get a domain name with hyphens usually when the non-hyphenated version is no longer available. The main search engine – Google – does not differentiate between hyphenated and non-hyphenated domain names. You should mainly focus on the name and the keywords included.
If you choose a name that includes several words, you can use hyphens. This way you can make it easier to write and remember. As a general rule, try to avoid having a long domain name that includes more than three keywords separated by hyphens because it’s usually considered spammy and it’s also difficult to remember.

4. What’s the difference between .com and other domain extensions from a SEO point of view?

Apart from the tech-savvy, people are most familiar with the .com extension. If they’ve managed to remember your brand’s name, they’ll probably type in .com to reach your site. Dot-com sites have this advantage, so it’s easier for them to get more traffic. If your desired domain name is not available as a .com, then choose an alternative such as .net, .org or .info. These are cheaper, and you can still get traffic if you optimise your site content – although not as fast as you would on a dot-com domain.
However, it also depends on your target audience. If you’re targeting the UK market for instance, it’s better to get a .co.uk domain.

5. If I add a suffix or a prefix to my main keyword, does it make it weaker?

Generally, if you include a non-keyword in a domain name it decreases the keyword density. You can include prefixes or suffixes but only if you really can’t find the domain name that you want.
You can use prefixes such as “my”, “your”, “buy” or suffixes like “online”, “blog”, “search”, etc. It’s usually recommended to use suffixes so you can place the keyword first and the suffix second. A good scenario is when the keyword combined with the prefix or suffix forms a long-tail keyword, such as “buy cars” or “buy used cars”.

Conclusion

When looking for a domain, try to find a combination that is memorable and easy to promote. If it also includes keywords, that’s an advantage. When you buy a hosting package, make sure you’re getting it from a reliable hosting company that can also provide the support you need. Once you’ve bought your domain name, your site should be ‘live’ at all times, 24/7.
How did you choose your domain name? Was including keywords your main focus?

What is Bad in SEO?

Buying Links:


Search engines dislike being tricked into thinking that a website is more relevant than it actually is. That’s why buying paid links from other websites is an issue: it can have an undue influence on the makeup of a search results page.

Imagine what would happen if search engines openly allowed paid links without any punishment. The company with the deepest pockets would always be ranked first! Fortunately for us internet users, search engines do penalize paid links, so relevance is still the deciding factor in determining rankings.

Make no mistake: buying links is an extremely common practice. Chances are that your competitors have tried it, or are doing it right now. But when those links are ultimately identified as paid, search engines devalue them, and the site’s rankings will decrease as a result.

Duplicate Content:


Writing content can be difficult—even on a good day. But writing totally unique, engaging content that ranks highly in search engines is harder. There are thousands of online services that allow you to syndicate content; in effect, allowing you to populate a website with content from another system without having to write a word.

On paper this technique sounds like a brilliant idea: fill the site with content, because the more content, the higher the rankings—right? Wrong. Search engines have sophisticated systems for identifying and devaluing duplicate content. This means that your website will quickly become little more than a repository of useless content.

There is a place for syndicated content: news streams are a perfect example. Once again, though, use your common sense and ask yourself, “Does adding this content provide value to my visitors?” If the answer is yes, adding syndicated content can make sense.

Keyword Stuffing:


If you discuss SEO with anyone who built websites around the year 2000, they’ll often have the opinion that the more keywords you add to a page, the more relevant that page becomes to a search engine. The practice is called keyword stuffing: stuffing the page full of keywords for the sole purpose of tricking search engines into thinking the page is more relevant than it actually is. Today, keyword stuffing often makes your page less relevant for a keyword, rather than more relevant.

All the major search engines employ extremely complicated phrase, sentence, paragraph, and page analysis on every single site that they spider and index. Natural language patterns are analyzed, and keyword-stuffed pages—which bear little resemblance to natural language—are devalued. It’s that simple.


Cloaking:

When a website presents one version of content to a visitor and a different version of the same content to a search engine, it’s called cloaking. There are dozens of ways to cloak web pages, and quite a few of these methods work very successfully, delivering high rankings for those web pages.

But make no mistake: cloaking is on all the major search engines’ blacklists. This is an extremely unethical technique that will result in your site being banned completely from a search engine if your tactics are discovered. And, given that your competitors are likely to have an eye on the tactics you’re employing and can easily report you, discovery is all but inevitable.

Once again, remember the mantra: develop websites that are optimized for the users’ experience as well as the search engines’ spiders.

Automated Link Building:


Hundreds of applications and services on the market today claim to be able to develop thousands of backlinks to your site for next to nothing. This sounds like quite a deal, but as usual, if it sounds too good to be true, it usually is.

Using software or services to automate your link-building efforts is a bad idea. These systems work by submitting your website to tens of thousands of extremely low-quality directories, whose sole purpose is to receive submissions from automated software tools.

You’ll receive the 10,000 links, just like the software promises. However, what those selling the services fail to mention is that one relevant link from a trusted website in your industry can have ten times the impact of those 10,000 links.

Do the math, and it’s not hard to work out that your time is better spent focusing on quality, rather than quantity.

What is White Hat SEO?

White hat SEO is the most ethical form of search optimization. Any practitioner who follows the principles of white hat SEO will engage only in the ethical, long-term optimization of both their own sites and those of their clients.

When you adopt a white hat SEO philosophy, the methods you’ll employ will follow the guidelines set out by the search engines. The fundamentals of white hat SEO involve:

■ conducting thorough keyword research and targeting
■ creating high-quality, unique content
■ improving internal linking and site structure
■ building links from relevant, quality sites

“Well,” you might be thinking, “that sounds easy enough!” Sadly, it isn’t. Each element of a white hat campaign is a time-consuming, grueling test of endurance played out over months or even years.

Time is measured a little differently in the world of SEO than it is in the email marketing or PPC universes, both of which allow you to quickly test, enhance, and deliver campaign results. Those tools are like Formula One cars—quick off the starting blocks, expensive to run, and highly tuned.

Ethical SEO campaigns require months of effort, quite often producing little in the way of results at first. Think of white hat SEO as being like an oil tanker: it takes a lot of energy to get moving, but once it’s moving, it’s highly efficient and carries a lot of momentum.

What you need to learn to do from the outset of your SEO career is to think long term. You need to understand that the return on investment (or ROI) of an SEO campaign will almost always be negative for the first few months. There’s no shortcut to profitability. Without a long-term view of any project—and without setting a client’s expectations accordingly—you’ll be doomed to failure.

Now that you understand these challenges, it is time to discuss the big advantages of adopting the white hat SEO approach.

The long-term benefits start with peace of mind: you’ll know that now, and in the future, you and your clients will avoid the penalties that are applied to sites that breach ethical guidelines. These penalties can be severe—in some cases, all evidence of a site is removed from the search results! At the very least, the relevance of the site is reduced, so it loses ranking position and, ultimately, traffic.

Discriminating between ethical and unethical techniques is relatively simple. Just ask yourself, “Does this optimization improve users’ experience or understanding of my website, or am I simply doing it for an SEO benefit?” Let common sense—and the user experience—prevail in every SEO decision you make, and chances are you’ll stay within the realm of white hat SEO.