Thursday, July 21, 2011

How to Read a Resume


How do you know who is interview worthy?

1.  Reading and Deciphering

The most important question to ask yourself when reading a resume is: is it easy to distill information about this candidate? Candidates can certainly brag about themselves, but discerning whether you are looking at something that is actually good or just a messy mud puddle can be hard. Here are some hawt points and how to read into them:
Hawt Points
Objective. Sometimes software engineers want to be project managers. Know who actually wants the job you’re offering.
Experience. Where have they been and do you like where they have been? Have they included dates of employment? How many jobs have they held in the last year? Relevant experience?
Skills, Languages & Technologies. Are they versed in the capacities listed in the requirement section of your job post? Are they too expert sounding in too many languages? Did they list MS Office Suite?
Projects. Do they include the duration of the projects they have listed, as well as people they probably worked with on these projects?
Education. Did they go to an exceptional school? Complete their degree? If they didn’t, what was their GPA? Honors, awards, scholarships? Achievements?
Outside of Work. Do they list their interests and activities? Do they appear to exhibit a sense of passion for the work you are asking of them?
TIP: The simpler the better. The recruiter’s job is hard enough; how about making it easier on them? Spell checking shows that you pay at least a little attention to detail, as does pretty formatting. But remember to hit the main points: Jobs, Education, Skills, and if it seems applicable, projects (including the duration of the projects) and accomplishments (such as graduating with honors or winning 1st place at the Google Code Jam).

2. Record of Promotion 

If the candidate has moved up the ladder, successfully filling multiple roles while at one company, that’s probably something worth noting.
TIP: If you are creating your résumé and want to know what to show off, let the world know that you have an interest in yourself, and are genuinely, well, awesome! Show you have been promoted. Recruiters like that stuff. It also shows that people like you and what you do. Even if it was merely a change in job title and not an increase in pay. 

3.  Know the Good Companies from the Bad

This might seem obvious, but successful companies usually have strong employees behind them. How did they get those strong employees? Well, they are all American Gladiators… or they have a rigorous hiring process.

If a candidate has worked somewhere you know hires only REALLY good people, they are probably worth checking out.
Also, don’t forget about the little guys! They might be small and new – but they tried! Not sure? Sometimes their mission statements, or job postings showcase the type of employees they hire. We do, and so do lots of other people out there. Next time you are looking at a resume, check out the companies your candidates worked for and see what THEY looked for in a candidate. You can also look at their current employees in similar roles on LinkedIn to get a feel for the caliber of talent they were looking for.
TIP: You will probably be Googled, so you might want to take that Facebook photo down. You know which one I am talking about. Oh, and set up a LinkedIn account and start networking! (If you build it, they will come.) Be sure to fill in as much information as possible, and connect with people you meet in interviews; it lets the recruiter know you are interested (and helps them remember you, *wink*).

4.  School is Cool

Obviously, if you are hiring for a tech position and the candidate went to a top computer science program and earned advanced degrees in Computer Science, Linguistics, Physics, Mathematics…. they are probably pretty dang smart. But don’t overlook those that have high GPAs from the lesser-known schools! A degree is still an accomplishment – especially if they graduated Cum Laude with a GPA higher than 3.7, honors, or additional degrees. Those folks are extremely interview-worthy and these are impressive achievements; it shows they worked hard and take pride in their work. Isn’t that what you want in a candidate?
TIP: Wondering if you should include your GPA on your résumé? If you received below a 3.5 GPA, then you may not want to showcase it on your résumé. But above a 3.5? Heck ya! Show your stuff and pump it up with activities, honors, awards, and personal achievements. That is an accomplishment and there are people who pay attention to these details.
At SEOmoz, we have interviewed candidates with otherwise unimpressive resumes because the candidate’s achievements at school or in their personal lives warranted a phone screen. Their dedication was simply too impressive to pass up!
As an aside, school is way cool, but it isn't necessarily everything a candidate has to offer. If you take all of those hawt points listed above, and they exceed your expectations without even looking at their education, bring them in! Did you know that SEOmoz's own CEO dropped out of college? Some people are just really smart, motivated and super dedicated, and that says a lot about character.

5.  Trophies, Patents, Awards and Certificates

Not everyone receives a fancy award or honor, so those are good to look at, especially fellowships, grants and scholarships. If someone else is willing to pay them for something amazing they did, you might find them worthy of a few peanuts, too.
It also shows that the candidate is willing to go that extra mile to prove they have mad skills. And that’s what you should be looking for, right? Ask yourself, “What do they do outside of work?” Are they involved with a startup incubator as a finalist or member? Have they participated in industry events such as Google Summer of Code or Startup Weekend? Have they attended or presented at conferences? (pssst.... hidden TIPS are in this paragraph!)
TIP: Get yourself awarded? No seriously, if you are amazing, then apply for a grant or enter a contest! It’s not just for mom, it’s for your future (barf!) Show passion and achievement outside of what you get paid to do. “It’s more about the achieving nature of the person than the achievement.” – Kate Matsudaira.

6.  Projects and Mad Skills

Instead of being impressed by a long list of known technologies, an example of how the candidate has used them is way more impressive. Let's pretend we are looking for someone with experience in building a house using a hammer, screwdriver, nails and screws. A better resume would describe how they used the tools, in a manner such as this: "My last project was building a house. For this project I used a hammer, a screwdriver, nails and screws." Savvy?
Also, the cover letter is the best place to discover if a candidate is looking for work vs. looking to fulfill a passion. If a developer is truly passionate about being a developer, then he or she is probably working on side projects or learning a new language. These are things to look for in an application. Interviewees have created webpages especially for their interviews before, and every time it has earned them consideration for the position. What scientist doesn’t like proof?
TIP: Flaunt it if you got it! In a meeting with Andrew Maguire, founder of InternMatch, he referenced a really interesting write up on an innovative way to draw attention to your mad skills. The concept: Kill the Cover Letter. Although this relates to interns looking for internships, it’s really not a bad idea all across the board. You can always write up a traditional cover letter and refer them to your nifty digital “cover letter”. Then you would really be cool, especially if there are multiple Tweets and Likes proving it. It’s a socially driven world, get with it!

7.  Lesser Known and Used Ideas and Strategies

If you don’t use a platform like Jobvite, take advantage of Survey Monkey. Have candidates answer all the generic make-or-break questions you have before contacting them. You can also ask them questions that are geared toward the kind of culture you are trying to maintain, or even create. You know, weed out the pen sniffers and the too secretive, secret Santas.

When you find yourself completely stumped on a candidate, it’s better to err on the side of giving them a chance. In fact, at SEOmoz, we worry about missing good people by depending only on resumes as a filter. This is when the survey comes in handy. You can format it to do the 1st interview for you and keep control of your interview process. Here are some of the questions from our survey:
  1. Where do you want to be in 5 years? How would working at SEOmoz help you meet those goals?
  2. What project do you consider your greatest success and why?
  3. Tell us about a mistake you made recently that you learned a lot from.
  4. What do you love best about being a software engineer?
  5. Why did you apply to SEOmoz? Is there a particular product/feature/technology you want to work on?
  6. Why do you think you'd be a great fit at SEOmoz?
  7. Please provide a link to a page you consider to be the funniest on the internet.

8.  The moral of the story...

The suggestions listed above are derived from practices implemented here at SEOmoz. While we would like to say that our process is flawless, it is in fact organic. Depending on your recruiter, these tips could really scale down the otherwise arduous task of locating that awesome fit! An interesting outcome of utilizing these practices, especially the survey, is that you uncover aspects of a candidate’s personality that don’t come through in the cover letter or resume format – and personality is a big part of whether someone fits on your team!
A Few Red Flags:
When reviewing resumes there are lots of things to look for, but there are also some things that may make you raise your eyebrows. Below are some of the things that can raise doubts in our minds (so if you are writing a resume, be sure to avoid them). Of course, never let one or more of these prevent you from talking to a candidate, since good people do write bad resumes; however, some of them may warrant additional questions and investigation.
The never-ending resume.
No one cares about your high school job unless you just graduated or are still in high school.
TIP: If you have a lot of work experience, only include the most relevant, if not the most recent, positions.
The Expert at everything.
What did Anonymous say? Oh yeah, “An expert is someone who knows more and more about less and less, until eventually he knows everything about nothing.”
TIP: If you suggest that you’re an expert at everything you do, you’re implying you have nothing left to learn. Oftentimes, experts are not who recruiters are looking for. They tend to be, well, know-it-alls.
The job jumper
Unless you are a time bender like Hiro Nakamura, no one should have 20 jobs in 3 years.
TIP: Um, don’t be a job jumper. You’re not only wasting your employer’s time, but your own. Obviously you're not happy with what you are doing, so find something new.
The endless list of accomplishments (liars, freeloaders, scrubs):
Unless you actually saved a litter of puppies from certain death on your way to work while juggling 10 cups of coffee – don’t say you did. It’s lame, and if someone else helped you with that project, it’s good karma to extend those kudos.
TIP: Figure out what you have to offer. If you feel like you don’t have anything to show for yourself, express it as a goal in your cover letter. If you do have a ridiculous amount of accomplishments within a very limited time span, chances are we won’t consider you, because it's next to impossible.

Thursday, June 23, 2011

My 300th blog post

Hi everyone. With this blog entry I have completed 300 blog posts. It all started in August 2009.
I’ve shared lots of information and enjoyed doing so. Thanks to all those who have encouraged me by posting their valuable suggestions and comments on my posts. From the next post onwards, I will try to share even better information.



5 Tips for SEO & User-Friendly Copy

You hear it time and time again: content is king. It’s king because it’s what users want, and it’s king because of SEO – i.e. if you don’t have good content, you’re not going to rank.
Of course, not all content is created equal. While over-optimizing content for search will make it less than user-friendly, focusing too much on usability can compromise its searchability.
So how can your content walk the line between SEO and usability? Here are 5 easy tips you can follow.

1. Length of On-Page Content

The ideal minimum length of page copy for SEO purposes is 250 words. So where your user experience permits, you should have 250 words (or more) of actual inline content – i.e. not headers, not sidebar content.
That being said, 250 words is just a minimum. As a general rule, the more content the better. In fact, I’ve personally noticed that when a page has 1,000+ words, it has a much better chance of ranking for the keywords that it’s optimized for.
Of course, there are pages where it doesn’t really make sense to have so much content. In such cases, 250 words of content would disrupt the user-experience and push interactive features below the fold.
Some examples of this are category pages on blogs or ecommerce sites. In these cases, you might want to optimize the categories for a certain niche/vertical of keywords, and adding 250 words of text at the top of the page would help with that.
But from a usability standpoint, categories exist so that users can navigate/browse products or content within a category. So adding 250 words of text would disrupt the user-experience by pushing those product/article links below the fold.
A decent compromise in such cases is to insert a bit of static content at the top of main-content area of your category pages. Generally, 300 characters (about twice the length of your page’s meta-description) is enough to (1) get some descriptive keywords on the page, but (2) keep the category links/listings well above the fold.

2. Scannable Paragraphs

Knowing that you want at least 250 words on each page, how do you make that content as usable as possible? That is, how can you make it scannable so that users are not deterred from actually reading it?
As a rule, you should aim for 3-4 line paragraphs (2 sentences). Of course, in some circumstances, a paragraph warrants more than 2 sentences. But by keeping paragraphs within 3-4 lines (5 lines max), you can create a text-experience that users can easily scan, making them more likely to actually consume the content.

3. Headers

Now, what do you do if you have more than 250 words on a page? If you’re talking about more than one product or service? How do you keep the user engaged?
Well, you do that by structuring your content. Basically, you should section off your content using header tags (e.g. <h2> and <h3>). This will create a break in the content that (1) makes it seem easier (i.e. ‘not as long’) to read, and (2) provides visual cues to pull the user’s eye through the content.
From an SEO perspective, moreover, the keywords in headers help you clue search engines into what your content is about. For example, three headers might tell search engines that three different (but related) topics are being discussed on the page.
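As a sketch of that structure (the page topic and headings here are invented for illustration), a longer page covering several related topics might be marked up like this:

```html
<h1>Landscaping Services in Seattle</h1>
<p>Opening copy that mentions the main keyword early on…</p>

<h2>Lawn Care and Maintenance</h2>
<p>A few short, scannable paragraphs on this sub-topic…</p>

<h2>Garden Design</h2>
<p>More copy on the second related topic…</p>

<h3>Native Plant Gardens</h3>
<p>A nested sub-topic sitting under Garden Design…</p>
```

Each `<h2>` marks off a distinct but related topic, giving both the reader’s eye and the search engine a map of what the page covers.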

4. Keyword Density

Keyword density is how often a keyword appears on a page. Depending on who you ask, the ideal keyword density is anywhere from 2%-5%.
The only problem with this is that even at the low-end, this can make for unnatural prose. For instance, if a keyword makes up 3% of all words on a page, the user will probably notice that that keyword appears quite often. In fact, it will make your copy seem robotic, non-compelling, and generally spammy.
Where you can fit in these extra keywords, however, is in your header tags. Indeed, by using keyword variation to draw up optimized header tags, you can gain an extra 1% of keyword density without making the actual copy seem contrived and unnatural.
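Since keyword density is just the keyword count divided by the total word count, it is easy to check a draft before publishing. This is a rough sketch – the helper name and the sample copy are invented, and it only counts exact matches of a single word:

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

copy = ("Our blue widgets are built to last. Every widget we sell ships "
        "with a two-year warranty, and widget repairs are always free.")
print(f"{keyword_density(copy, 'widget'):.1%}")  # prints 8.7% - already past the 2%-5% band
```

Even two exact repetitions in a short passage blow past the 2%-5% range, which is exactly why stuffing extra keywords into body copy so quickly reads as robotic.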

5. Bullet Lists

One of the biggest copy tradeoffs between usability and search is bullet lists. While bullet lists help make content more scannable, search engines regard them as “broken content” – meaning that they don’t quite count as much as other page copy when it comes to keyword density. There are two ways to get around this.
First, you can use a bullet list at the top of the page to outline the page’s content. This will help cue the user into what they can expect as they read through the page, which will help them better navigate the content.
If you do go this route, however, make sure that (1) you have 2 sentences/3 lines preceding the bullet list, (2) you have another 2 sentences/3 lines following it, and (3) there is a minimum of 250 words of “unbroken” content on the page. This will not only put the bullet list in context for the user, but it will help ensure that search engines properly index the page.
Alternatively, you can simply place a bullet list further down in the content, as part of one of the subsections. If sufficient content precedes a bullet list, that bullet list is less likely to factor heavily into how the page is indexed.

Content: Kings & Jesters at Court

A general rule for mitigating tradeoffs between search and usability is to develop content for humans, not for search bots. After all, at the end of the day, search engines are out to provide users (real human beings) with the most relevant content available.
If you take every possible step to optimize your site for search, chances are you’ll produce some rather non-user-friendly content. And the paradox there is that the less user-friendly content is, the less engaging it is, and the less likely it will be to attract backlinks or go viral.
So you should always think of the user before you think of the search bots. But always keep in mind that there are some elements you can add to a page that both increase usability, and help optimize your page for SEO purposes.

The Ultimate Guide to On-Page Optimization

We work with a great many very talented website designers and developers who create first-rate websites for their clients – aesthetically superb, but not always as search engine friendly as they could be. Overall it can make the project more expensive for the client, because it means that we as the SEOs have to go back under the hood of their shiny new website to make tweaks for better on-page optimization.
This guide is intended to be a cheat sheet that allows you, as a designer or developer, to really bake SEO into the projects you are working on – saving your clients time and money in the long run, making the process much more seamless for them, and making your life a heck of a lot less stressful. Well, at least there will be no more SEOs bugging you to make changes!

Website Factors & Site Architecture Factors

URL Structure – the golden rule when it comes to URLs is to keep ‘em short, pretty, and inclusive of the page-level keyword. Never ever build on a platform that doesn’t allow for URL rewriting, because not only are ugly URLs, well, ugly, they are also a mistake SEO-wise. Short, sexy and search friendly URLs are easy for users to share with their social networks or link to – not to mention how much easier a logical URL structure makes website management!
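For instance, on an Apache server with mod_rewrite available, a short keyword-bearing URL can be mapped onto the uglier query-string URL the platform actually serves. This is only a sketch – the paths, file name and parameter are invented for illustration:

```apache
# .htaccess sketch: serve /widgets/blue-widget/ from product.php?slug=blue-widget
RewriteEngine On
RewriteRule ^widgets/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]
```

The user and the search engines only ever see the clean `/widgets/blue-widget/` form, while the application keeps working unchanged underneath.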
Website structure accessibility – inaccessible navigation is a real headache when it comes to SEO. A navigation wrapped in JavaScript is bad, and a menu made from Flash is worse. Now I bet you are thinking, “JavaScript makes a website more user friendly because it creates things like drop-down menus, helping the user to make better sense of the page options.” This might be true, but we need to balance usability with search engine friendliness. Firstly, we shouldn’t forget that a slick-looking menu/navigation bar could render a website unusable on certain devices and in certain browsers (try switching off JavaScript or Flash). But from a strictly SEO perspective, it could mean that pages deep within your site aren’t being indexed, because the only links to them are from a menu wrapped in code that the spiders can’t decipher.
Considered use of JavaScript – following on from the point above… whatever Google says, there is clear evidence that the search engine struggles to handle JavaScript. Reams and reams of unreadable code could mean Googlebot heads somewhere else rather than crawling any deeper into your site. It might also be causing other issues, like crawl errors and a damaged crawl rate, neither of which are good things!
Canonical URLs – use the rel="canonical" link element to specify your preferred URL for a page. This is useful in situations where almost identical pages appear at different URLs because of something like a category choice or session ID being added. It is important to tell Google and Bing which page is the one they should index and pass all relevant link juice and authority to. Failure to implement canonical URLs can mean duplicate content issues, but more crucially a loss of rankings, as search engines divide link juice and page authority between the copies of the page – something which could have been avoided if the correct page had been stated in the rel=canonical tag.
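As a minimal sketch (the domain and paths are invented), the tag lives in the `<head>` of each duplicate variant and points at the one preferred URL:

```html
<!-- On /shoes?sessionid=abc123 and /shoes?sort=price alike: -->
<link rel="canonical" href="http://www.example.com/shoes" />
```

Both session-ID and sort-order variants then consolidate their link juice onto the single /shoes page.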
Unique meta titles and descriptions – to many, on-page optimization is just about changing a meta title here or there… hopefully this list will show you otherwise. Whilst making meta title and description changes might feel like SEO from 1997, in my experience it is still a part of the bigger on-page optimization jigsaw. In my mind, it is quite a simple step in the process: a unique title and description for every page, front-loading page-level keywords in a natural, non-spammy way. There are of course other meta tags you can include, e.g. ‘keywords’, and whilst I am sure some people will disagree with me on this, I only see the use in optimizing the titles and descriptions; tags like the keywords meta have been abused to the point of rendering them almost a complete waste of time. Google might not always use the title and description you give a page, but at least you’ve told the search engines what the page is about, and if Google does decide to use them, you have some influence over encouraging a user to come to your website over the other choices in the SERPs.
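A minimal sketch of what that looks like in practice (the company, product and copy are invented for illustration):

```html
<head>
  <title>Blue Widgets with a Two-Year Warranty | Example Co.</title>
  <meta name="description" content="Hand-built blue widgets backed by a
    two-year warranty and free shipping. Browse the full range or read
    our widget buying guide.">
</head>
```

The page-level keyword leads the title, the brand trails it, and the description reads like the ad copy a searcher will actually see in the SERP.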
Robots.txt file – a good starting point for robots.txt best practice is this guide from SEOmoz. It is always worthwhile ensuring a robots.txt file doesn’t contain any unwanted directions for the search bots, even if you haven’t added anything, someone or something working on the site before might have.
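A typical minimal robots.txt might look like this (the disallowed paths are invented examples – check what actually needs blocking on your own site):

```
User-agent: *
Disallow: /checkout/
Disallow: /search

Sitemap: http://www.example.com/sitemap.xml
```

Everything not listed stays crawlable; the Sitemap line is a handy bonus that points every bot at your XML sitemap.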
XML Sitemap – fairly common practice nowadays but still worth a mention. An XML sitemap should always be available. It helps make the search engines aware of all the pages on your website and increases the likelihood of faster inclusion in the index for newer pages.
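For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this, with one `<url>` entry per page (the URL and date are invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/widgets/blue-widget/</loc>
    <lastmod>2011-07-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>` and `<changefreq>` are optional hints to the crawler.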
Website speed – I’m sure this issue is right at the fore of your mind when it comes to building websites, because it is a really hot topic right now. Google recently enabled all webmasters to monitor page loading speed directly from their Google Analytics dashboard; if they’ve made it that easy for you, you can bet they are using this data as part of their calculation of where to rank your website. Google loves to improve user experience, and since a fast-loading page is definitely a better user experience, I can see this playing an increasing role in the SEO of the future, particularly in competitive markets. Also, Amazon.com conducted a study and found that for every 100 millisecond increase in page load time, their sales decreased by 1%. The reasons for improving page speed therefore go way beyond just SEO! There are multiple ways to improve site speed, so I won’t go through them all here; all I will say is code responsibly, choose a good host, and set up a CDN (content delivery network) if your client is targeting users worldwide.

Content Factors

I was in two minds as to whether to include this section in the final guide, because as a designer you might have limited control over content factors. But then again, in my experience, designers certainly have some responsibility for either the content itself or for formatting and publishing it, so I feel it is worthwhile to mention these factors.
Content language – Google uses the language the text content has been written in as a reference point for the relevance to the user making the search query. If you are targeting an English speaking country then content should be written in English. Obvious really but it does reinforce the need for localized websites if you are helping a client to target other countries that speak different languages.
Content uniqueness – one phrase I am sure you are bored of hearing is ‘create unique content’ if you want to do well in the search results. People keep saying it because it is true. Unique content sends the right kinds of quality signals to Google because more users engage with it and talk about it, they share it, and it generates more links. Encourage your clients to invest in useful, unique content that offers real value to the reader, or if necessary take responsibility for this yourself.
Amount of content – the recent Google Panda algorithm update has had an impact on what could be considered the right ‘amount’ of content. My suggestion is that you encourage clients to consolidate existing content or target new content creation efforts towards smaller but higher quality hubs of content. Help and advise clients to remove pages that are basically just a carbon copy of another page on the site but with a few different keywords.
Unlinked content density – pages that contain a lot of links particularly to external pages never look good in the eyes of Google. It gives off a link farm/poor quality directory/paid link operation type vibe which is not just damaging to the page but also to the website and to the pages it links to. Whilst there isn’t an optimum density, as a rule of thumb the number of links should feel natural and be well balanced with unlinked text or other types of content. If all the links are absolutely necessary, consider breaking them down into smaller categorized pages to improve the unlinked content density.
Is the content well-written? – there isn’t any direct evidence that Google penalizes a website for poor spelling or grammar. That being said, a badly written page is off-putting for the user and will therefore send the wrong kinds of signals to readers or potential customers, and since Google is incorporating user feedback like bounce rate into its algorithm, keeping the user happy is vital.
Expertise and Depth of content – Google is smart and since it is on a mission to organize the world’s information, I would be willing to bet that it has already hit the mark or is close to it when it comes to understanding how deep a piece of content goes and whether the author is an expert or not. Algorithmically it could probably quite easily detect if key topics within the theme have been discussed and whether there are any factual inaccuracies meaning it is more important than ever to really be the expert.
Keyword location and repetition – it is widely accepted that Google places more emphasis on keywords that appear higher up a page. This is based on the logic that if something is important, it is likely to be included first. My suggestion is always (provided it looks natural) to front-load the heading of the page with the keyword being targeted, then to mention the keyword within the first paragraph, and then, depending on the length of the page, at selected intervals throughout the text. The key is to keep it natural; there’s no optimum keyword density, but there certainly is such a thing as over-optimization and keyword stuffing, both of which will see the page, and possibly the site, subject to a penalty. Interweaving keywords into text so that it works for both user and search engine can be quite challenging, but it is worthwhile.
Spam keyword inclusion – if you run an adult themed website then of course this is unavoidable but be vigilant of quite innocent and accidental inclusion of these keywords on what would ordinarily be a very family friendly website. This will be a real turn off for the search engines because of safe-search filtering and also because it may suspect your website has been violated by hackers who have injected spam keywords and links.

Internal Linking Factors

Number of internal links – one of the reasons that Wikipedia ranks so well is its internal linking structure. Of course each of the pages wouldn’t hold so much weight if it weren’t for the overall authority of the website, but the online encyclopedia has still mastered internal linking best practices. It adds a link to another page on the site wherever it feels natural and will be useful to the user, allowing them to flow through the website. You can take this concept and apply it to your client’s website, helping them to increase pages per visit, improve user experience and ultimately improve page rankings through increased link volume. They may ‘only’ be internal links, but they will still serve to enhance your off-page link building efforts.
Anchor text of internal links – anchor text is still an important factor in link value. It will likely decrease in importance thanks to the abuse of it but for now it is still a case of anchor text rules. Use this with care however, particularly if you are working on a very large website where internal link implementation could potentially result in hundreds if not thousands of links with the same anchor text which would be easily detectable by Google and may result in a penalty. Just as with off-page linkbuilding, internally, it is also important to vary anchor text. Consider making the header navigation a keyword anchor text link, the breadcrumb a variation of this and in-content links something like “learn more about our services” – too many anchor text links can be overkill.
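To make the variation concrete, the three link locations suggested above might carry anchors like these (the URL and wording are invented for illustration):

```html
<!-- Header navigation: exact keyword anchor -->
<a href="/services/seo/">SEO Services</a>

<!-- Breadcrumb: a close variation of the keyword -->
<a href="/services/seo/">Search Engine Optimization</a>

<!-- In-content link: natural, non-keyword phrasing -->
<a href="/services/seo/">learn more about our services</a>
```

All three point at the same page, but the mixed anchor text looks natural rather than like a sitewide exact-match pattern.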
Source of internal links (content, breadcrumbs, navigation) – when it comes to link building campaigns, it is always advisable to encourage links from a variety of sources, and the same applies to organizing internal links. Take care to ensure that links to internal pages are balanced; too heavy a reliance on, for example, breadcrumb navigation could mitigate some of the power of internal links.

Quality Factors

Google is making leaps and bounds towards making truly high-quality websites more visible in the search results. It is important to ensure you are helping clients give off the right kinds of ‘quality signals’: here are some factors worth considering.
A gorgeous design – Google can’t quite grade the looks of your website just yet, but it can gauge the reaction of visitors. Good-looking websites keep people engaged and stop them from clicking away, which keeps the bounce rate low. Google utilizes user feedback metrics like bounce rate, so anything you can do to improve the user experience is going to be a big win in the SEO arena.
Custom(ised) design – it doesn’t have to be a completely custom design but it is reasonable to assume that Google looks less favorably upon websites that use free or even premium themes but do absolutely nothing to make it their own. I’d imagine that Google takes this stance because it is quite reasonable to say that a webmaster who hasn’t bothered to get the basics of a website right is unlikely to be creating something high-quality in the long run. That might be an over-simplification and a sweeping generalization but Google is trying to crunch vast swathes of data and web pages, it doesn’t have the time to individually review every page out there.
Address, privacy policy, TOS, and other brand signals – post-Panda, Google is looking to promote ‘real’ businesses and brands. Adding an address, a privacy policy, and the other basic housekeeping that reputable online operators have on their websites can make all the difference to how well a website performs in the search engines. This Google blog post offers some guidance on building high-quality websites, and one of the rhetorical questions it asks is “Would I trust this website with my credit card details?” If the answer is no, it suggests there are quality issues that need addressing.


Designing an SEO Friendly Website

Nearly every business – from local plumbers to large blue-chip organizations and even churches – has a website, and each and every one wants to spread the message about its products, services, and profile, and what it can offer you as a consumer. That is where SEO comes in. SEO has become a valuable online marketing tool for businesses of all shapes and sizes, but for an SEO campaign to be successful, solid foundations need to be in place. Doing things right from the outset really can reap benefits and save time and money.
Below are my top tips for designing an SEO friendly website from the ground up. These cover the little things that need to be done (or avoided) from the outset, so a website can eventually live a happy life at the top of the search engines.

Keyword Research

Before I even start a website design/build, I establish what the business objectives are. Many businesses will have a clear plan of what they want to achieve in terms of targeting, which may range from the local area to national or even global markets.
With this in mind, when designing wireframes and deciding on customer journeys and sitemaps, you can plan additional content pages that target specific keywords based on your findings, or at least know where these pages will go once the site has launched.
From a design perspective, there is nothing worse than going back to a website six months later and having to change the structure and internal elements to accommodate additional SEO content. So my advice is to plan early.

Search Engine Friendly Navigation

Now when I say search engine friendly, I mean a navigational system that the search engines can read and follow. One of the many factors of on-page SEO is the internal linking structure and the navigational system is the backbone for this. Having buttons and links which are text-based is a major plus and great for accessibility.
In my opinion, I would always avoid Flash-based navigation for one simple reason: Google, along with the other search engines, can’t read text embedded in Flash or images. The other thing to consider is that, as the web moves more and more to mobile devices, how many of them will support Flash? Apple has already said it won’t support Flash elements, and even my Blackberry doesn’t do a great job with Flash websites. Please don’t think I dislike Flash – I do like the concept – but from an SEO point of view it can be a big barrier, especially for navigation.
Instead of flash, why not use CSS methods or jQuery? These can be just as good if not better, but make sure the load times for these elements are fast.
Also with navigation, it’s important to clearly label the links. If a link says “about”, it should take the user (and the search engines) to the “about” page. As mentioned above, your internal link structure needs to be good, if not bullet-proof! Another example: if you have a page about “restaurants in London”, label any links that point to it – from the navigation or other internal links – with the anchor text “restaurants in London”. Make it clear and descriptive for the search engines, adding that little extra internal link juice from an SEO perspective.
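As a quick sketch of the idea in HTML (the URL and labels here are hypothetical):

```html
<!-- Vague: tells users and search engines nothing about the target page -->
<a href="/restaurants-in-london">Click here</a>

<!-- Descriptive, keyword-rich anchor text -->
<a href="/restaurants-in-london">Restaurants in London</a>

<!-- A varied in-content anchor, so not every link uses identical text -->
<a href="/restaurants-in-london">learn more about restaurants in London</a>
```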

Website Load Times

Every so often a big update is made to the search engine algorithms. The news is usually about Google, and one SEO factor introduced in early 2010 was website load time/speed. I experimented with this and found that speeding up a site’s various elements did have an effect – so much so that a website I experimented on moved up three places in Google.
Now when I say various elements, I mean things like:
  • CSS files – remove unwanted/unused code or place all the code on one line per div/class.
  • The amount of JavaScript in the code – this can be really slow depending on what you are using it for so I’d advise using it sparsely.
  • Website image size – see point 4.
  • File size – remove white spaces and any unessential line breaks in code, keep it streamlined.
Making improvements on the above will increase speed and generally help with SEO (not forgetting the user experience).
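To illustrate the CSS point, here is a hedged before/after sketch – the class name is hypothetical:

```css
/* Before: readable but heavier, with redundant whitespace */
.product-title {
    color: #333333;
    font-size: 16px;
    margin: 0 0 10px 0;
}

/* After: one line per class, whitespace stripped, shorthand values */
.product-title{color:#333;font-size:16px;margin:0 0 10px}
```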

Website Images

As a golden rule, a website has eight seconds to sell the company and/or products, and there is nothing worse than waiting for a website to load, especially large images and backgrounds. As mentioned above, website load time is a ranking factor, and the two best tools I have used to reduce image file sizes are Adobe Photoshop and Adobe Fireworks – admittedly Fireworks did the slightly better job, even if only by a few kilobytes.
Also, if the website is an e-commerce site, create smaller images for the galleries. There is nothing worse than waiting for an image to load that displays at only 200px x 200px on screen because it has been re-sized with HTML rather than saved at the right dimensions – remember, speed is now a key factor in SEO and can’t be avoided.
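For example (filenames hypothetical), compare scaling a large image in HTML with serving a correctly sized thumbnail:

```html
<!-- Slow: the browser downloads the full-size file, then shrinks it on screen -->
<img src="/images/ferrari-tshirt-large.jpg" width="200" height="200" alt="Ferrari t-shirt">

<!-- Fast: a separate 200x200 thumbnail saved at its display size -->
<img src="/images/ferrari-tshirt-200.jpg" width="200" height="200" alt="Ferrari t-shirt">
```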

Keyword Placement

So, one of the major factors of SEO is telling Google what the page is about. This is done by writing great “user-focused” content. Within this content it’s important to get the keywords in the right position on the page. Here are the best places:
  • Title tag
  • Meta description and keywords
  • Website slogans
  • Navigation
  • Breadcrumb trails
  • H1, H2 and H3 tags
  • Bullet points
  • Alt text
  • Title attribute on links
  • The main website copy
  • Internal links
  • Footer links
  • URLs
  • File / folder names
One thing to remember with the above: don’t overdo it. Google has become heavily focused on the user, so make sure the content is written for the user; that is also what makes it link-worthy.
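As a sketch, several of the placements above might look like this on a single (hypothetical) page:

```html
<head>
  <title>Restaurants in London | Example Restaurant Guide</title>
  <meta name="description" content="Our pick of the best restaurants in London.">
</head>
<body>
  <h1>Restaurants in London</h1>
  <img src="/images/london-restaurant.jpg" alt="restaurant in London">
  <p>Browse our guide to
    <a href="/restaurants-in-london" title="Restaurants in London">restaurants in London</a>.
  </p>
</body>
```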

Add Social Elements

2011 has seen a shift in SEO; social is now a contributing factor. Not only is social a good way to show potential customers that the company has a voice, but social networks like Twitter also help towards good rankings.
It’s crucial that the design of the website adds social elements on two different levels:
  1. Make it clear that the company is available and contactable on social networks with prominent buttons and icons
  2. Adding the Twitter feed on site can also help with keyword placement and regular updates (you must be a daily user of Twitter), and can speed up the Google cache rate (i.e. how often Google visits the website and checks for updates).
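For the first level, Twitter’s official follow button is a simple embed (the handle here is hypothetical):

```html
<!-- Twitter's official follow button; widgets.js turns the link into a button -->
<a href="https://twitter.com/yourcompany" class="twitter-follow-button">Follow @yourcompany</a>
<script src="//platform.twitter.com/widgets.js"></script>
```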

Friendly URLs and Image Filenames

One major thing I have noticed over the past few years is the number of websites that don’t use friendly URLs. An unfriendly URL might look like this:
http://www.websiteshop.com/products/item1?=20193
A better example for a friendly URL would be:
http://www.websiteshop.com/formula-one/clothing/ferrari-tshirts
As you can see from the two examples, the second option contains a good selection of keywords. This helps Google and the other search engines identify what the page is about, and having keywords in the URL is a good SEO method for keyword placement, as mentioned above.
In terms of images, having an appropriate file name is also vital. For example, someone looking for a wedding dress will more than likely go to Google Images to find design ideas. An image named "img310.jpg" isn’t going to help with the Google image algorithm; a better idea would be a file name like "wedding-dress.jpg".
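If the underlying system still needs a query-string URL, Apache’s mod_rewrite can map the friendly version onto it – a minimal sketch in an .htaccess file, assuming a hypothetical item.php script:

```apache
RewriteEngine On
# /formula-one/clothing/ferrari-tshirts -> /products/item.php?slug=ferrari-tshirts
RewriteRule ^formula-one/clothing/([a-z0-9-]+)/?$ /products/item.php?slug=$1 [L,QSA]
```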

Sitemaps

Sitemaps are purely designed to tell the search engines about all of the content on the website. This will ensure that the search engine bots find all of the content that may be 2 or 3 folders deep within the website so this content has a good shot at ranking for specific keywords and phrases.
One thing I have noticed with large e-commerce websites without sitemaps is the lack of pages indexed in Google. A great example is an e-commerce site I worked on recently with a catalogue of over 2,000 products. After researching this, I found that only 500 pages had been indexed in Google. Within three weeks of introducing sitemaps, their indexed pages went up to 1,500, which also increased their exposure in Google. They then started to gain more long-tail keyword searches, and overall conversions increased off the back of this.
In terms of sitemaps, I always recommend using four different formats:
  • XML
  • ROR (an RSS-based XML format)
  • URL List
  • HTML
This gives the search engines a variety of choices when it comes to locating all of the pages on the site. I would also include links to all four sitemaps on every page of the website (usually in the footer) to help the search engines further, especially with buried content that may be two or three levels (folders) deep.
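A minimal XML sitemap, using the example URL from above, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.websiteshop.com/formula-one/clothing/ferrari-tshirts</loc>
    <lastmod>2011-07-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <!-- one <url> entry per page, including pages buried 2-3 folders deep -->
</urlset>
```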

Google Web Fonts

Creating visually interesting designs often means wanting fonts that aren’t web-safe, and the usual workaround is to render that text as images. As mentioned in this post, Google and the other search engines can’t read text inside an image, which can cost you really good on-page real estate for keyword placement.
Back at the beginning of last year Google opened up a new Font Directory (http://www.google.com/webfonts). So instead of using images for text, you now have a large collection of open source fonts to use on the web completely free!
So in a nutshell, you can keep those super attractive designs with a readable web font, which in turn results in the search engines being able to read the text and use this as a ranking factor.
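Using a font from the directory takes two lines – one stylesheet link and one CSS rule (the font choice here is just an example):

```html
<link href="http://fonts.googleapis.com/css?family=Lobster" rel="stylesheet" type="text/css">
<style>
  h1 { font-family: 'Lobster', cursive; }  /* real, crawlable text instead of an image */
</style>
```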

301 Redirects

Now I’m sure as a designer you have come across the re-design scenario: you finish the design, launch the new website, and then all of a sudden, rankings drop!
A great way to combat this, especially if you have restructured the website with new file names or moved content, is to use 301 redirects in the .htaccess file.
This does 3 things:
  1. Tells the search engines that the page has moved to a new location and needs re-indexing
  2. Tells the search engines the page has been renamed and needs re-indexing
  3. Any links that were pointing to the old page will now flow through to the new page via the redirect. As links are an imperative part of SEO, you can’t afford to lose these valuable links; the redirect retains them and the good rankings that come with them.
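In the .htaccess file, a 301 looks like this (the old and new paths are hypothetical):

```apache
# Single renamed page
Redirect 301 /old-products.html /formula-one/clothing/ferrari-tshirts

# Or a pattern-based version with mod_rewrite
RewriteEngine On
RewriteRule ^products/item1$ /formula-one/clothing/ferrari-tshirts [R=301,L]
```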

Prediction: W3C Validation

Over the past 18 months I have blogged about how W3C validation doesn’t currently have an impact on search results. But with Google updates such as "Caffeine" and "Panda" focusing on search quality and user experience, I believe validation may become a factor in the future – so it’s important to look at this area sooner rather than later.

Final note…

The above points are certainly a must, but one thing to remember, especially with search engine optimisation, is that continuous work is required to achieve great results.
The above gives you the basics and a fantastic starting block for a successful SEO campaign.