There's been much fuss lately about the need for Search Engine Optimization of the right kind. Some will tell you that to be effective, SEO has to be "organic"; others will swear by the power of using the right software, which happens, coincidentally, to be their software. Then there are those who say that if you find the right "niche market," the world will beat a path to your door and leave its money when it gets there. I even spoke with a fellow yesterday who claimed that if you only launched new sites when the moon was full... well, that's a whole other story.
Where there is less conjecture is in the dire consequences of using the wrong kind of optimization. Horror stories abound of million-dollar investments that simply went down the drain when Google and the rest decided the optimizers had cheated in their enthusiastic rush for the top. The early techniques for getting a website to the top of the search engine results (link farms, cross linking, doorway pages, keyword stuffing, and all the rest) have slowly been made ineffectual by the steadily rising sophistication of the search engine administrators and the algorithms they employ to keep the race fair and ensure relevant search results.
So what is good SEO?
We could, perhaps, define SEO by function; we could talk about the actions which bring about the optimization. First there's research: hour upon hour of it must be performed for each account, covering industry research (what's the competition up to?), keyword research (how big is the market for this product or service?), competition research (how many others are already vying for this market?), and marketing research (who has done this before, and what did they discover that I'd better know about too?). Many hours, days, and sometimes weeks can be spent getting a clear picture of a plan before the actual project can ever begin.
To even start a website without the proper research is a guarantee that:
nobody will ever link to your site (World's Worst Web Sites excluded);
nobody, except maybe your mother, will likely see your website;
your investment and business venture will fail miserably;
you will never be found in the major search engines; and
your results will match your research (0=0).
We could also approach the question of SEO from the perspective of form. Every site is constructed differently, designed differently, and laid out differently; each interacts with its visitors in its own way and targets those visitors differently. These are all factors considered during the research performed prior to laying out a raison d'ĂȘtre for the optimization. Architect Louis Sullivan argued that a building's purpose should determine its design, stating emphatically that "form follows function." Shortly after, his student Frank Lloyd Wright argued that "form and function should be one, joined in a spiritual union." Although Sullivan and Wright were speaking of architecture as it relates to concrete and steel buildings, there's an architecture that goes into the design of a website that, when done well, echoes Wright's observation very nicely.
Good SEO will see to the writing or rewriting of the content on each page to work in all targeted keyword phrases effectively. A professional web copywriter will be able to take the SEO recommendations for keyword usage and incorporate them into existing content in a way that reads naturally (i.e., does not look like you jammed keywords in here and there) and has the ability to convert your visitors into paying customers. This is no small order, and if it is not performed well, the site will remain just another pretty page that nobody ever sees. Or, worse yet, your site will attract lots of visitors, but they will become confused by the copy and fly off to the next site without ever taking the desired actions (sign up, buy, make contact).
Another function of SEO, and one not often spoken of, is cleaning up all the extraneous code on the pages. Code bloat removal, as it's so poetically referred to, is an art form in itself. It requires a thorough understanding of HTML, along with all the rest of the coding and scripting languages that make up a modern web page, as well as the ability to move code into separate files and import it when feasible, or trim it to its bare essentials without changing the look and/or function of the page itself. Eliminating page code bloat can be an incredibly arduous task. Moving styles and JavaScript is only part of the puzzle. Many times, a page has to be almost completely rebuilt because of the excess junk code that gets added by popular "WYSIWYG" page editing software.
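To make the idea concrete, here's a simplified before-and-after sketch of the kind of trimming involved (the file names style.css and menu.js are placeholders for illustration): one page carries all of its styling and scripting inline, the other pulls the same things in from shared external files.

    <!-- Before: every page carries its own copy of the styles and scripts -->
    <head>
      <title>Example Page</title>
      <style>
        /* hundreds of lines of styling pasted into each page */
      </style>
      <script type="text/javascript">
        /* menu and rollover scripts repeated on each page */
      </script>
    </head>

    <!-- After: the same styling and scripting imported from shared files -->
    <head>
      <title>Example Page</title>
      <link rel="stylesheet" type="text/css" href="style.css">
      <script type="text/javascript" src="menu.js"></script>
    </head>

The spider reaches the actual content of the page that much sooner, and the visitor's browser only has to download the shared files once.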
To see what a search engine spider bot "sees" when it visits a web page, go to the View menu on your browser toolbar, move your cursor down to Source (Page Source in Mozilla Firefox), and left-click. What you are looking at is what Backrub (Google), Sidewinder (Infoseek), T-Rex (Lycos), Gulliver (Northern Light), and all the others "look" at when they spider a site. They read it the same way you do, from top to bottom. Notice how much code and formatting sit at the top of the page before you scroll down to find the content (this article). Taking all of that code, paring it down to just what's needed, and then finding ways to trim even that is part of what good SEO is about. Try this on other sites you visit and you will soon understand the situation.
OK, time out! I tried not to mention specific software in this article, but, hey! Have you ever wondered why Microsoft doesn't use FrontPage to create pages on Microsoft.com, even the pages that deal with the FrontPage software? Perhaps they're trying to tell us something. I've spent literally months of my life removing and rewriting the loopy code and nonessential tags produced by FrontPage editors. From an optimization standpoint, using FrontPage to produce a website is akin to shooting yourself in both feet before you start to run a marathon. If Microsoft doesn't use it, why should you?
Now that that's out of the way . . .
After the code bloat removal process, good SEO will address getting all pages on the site to validate to the professional standards set by the W3C. Validation is simply the process of ensuring that the right coding elements are used, and used correctly. This isn't a good guy versus bad guy question, or even a matter of not breaking the rules; it's about being accessible to everyone who uses the web. There is a growing number of blind and visually impaired users who rely on voice readers or text-to-speech software which "speak" the text on the web page. Many of the old tricks and shortcuts that web designers used in the past don't work with these, or with the growing number of other tools designed to make a level playing field of the Internet. While many validation issues are not a big problem in and of themselves, if you find one on one page, it will likely run all through the site (and can take many hours of head scratching and work to clean up effectively).
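As a small illustration (the image name and text are invented), one of the most common validation errors is also the one that matters most to those voice readers: an image with no alt attribute. The first tag below fails validation under the HTML 4.01 and XHTML 1.0 doctypes and gives the software nothing to speak; the second validates and tells the visitor exactly what the image is.

    <!-- Fails validation; a voice reader has nothing to announce -->
    <img src="logo.gif">

    <!-- Validates; the alt text is read aloud in place of the image -->
    <img src="logo.gif" alt="Acme Widgets company logo">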
What You Do - Where You Do It - Who You Are
There are varying opinions about what should and shouldn't be included in a title tag. What is agreed upon is that all of the major search engines give the content of the title tag significant weight in determining what the page is all about. It's my practice to write the title only after everything else on the page has been written, and then with an eye to using at least two (better, three) of the keyword phrases that apply to the page. Unless you're "GE" or "Maxwell House," or intend to spend the kind of money they spent getting to be a well-known brand, there is simply no reason to place your company name in the title tag. Save it for the terms that people will use to find your services and products. I know, you wanted mama to see your company name right up there in the Title Bar. It's ok with me, but it will cost you.
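By way of illustration (the business and keyword phrases below are made up), the difference looks something like this:

    <!-- Title built from the phrases people actually search on -->
    <title>Hand-Tied Fishing Flies, Fly Tying Supplies and Kits</title>

    <!-- Title built for mama; nobody searches for this -->
    <title>Smith &amp; Sons Enterprises, Inc. - Home</title>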
We Don't Need No Stinking Map
Site maps help both search engines and visitors quickly and easily get to the information that is important. It's amazing how simple a matter the design and implementation of a usable site map is, and how many websites either don't have one or have an incomplete or obscure site map, which is an even worse scenario. If you're not sure you need one, build one anyway. Trust me on this one. If I come to your site and can't find what I'm looking for, I'll look for a site map. If I can't find a site map, I'll look somewhere else. Oh yeah, that's how 95% of website visitors are. Get a site map.
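It really can be as simple as one page holding a plain list of links to every section of the site. The page names below are placeholders, but the shape is the whole trick:

    <ul>
      <li><a href="/products.html">Products</a></li>
      <li><a href="/services.html">Services</a></li>
      <li><a href="/about.html">About Us</a></li>
      <li><a href="/contact.html">Contact Us</a></li>
    </ul>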
The robots.txt file is useful for communicating with the search engine spiders about the content they should or should not crawl. This allows the "bot" to focus its time on the good stuff and not on the irrelevant portions of your site.
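The file lives in the root of the site and follows a very simple format; a minimal sketch (the directory names here are placeholders) might be nothing more than:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /print-versions/

The first line addresses all spiders; each Disallow line names a path they're asked to stay out of, leaving everything else open to be crawled.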
Good SEO is all of these things and more. Your site will be off to a great start if you follow the suggestions mentioned here. And hopefully, this article will get you thinking that just maybe those "SEO firms" which offer "Complete SEO, $100" or "Get Your Site To #1 In Google, $295" aren't talking about the same things we've been discussing here. After years in the business, I've yet to give even a "ballpark" figure for an optimization campaign without thoroughly researching the needs of the client, the structure of the site, and the competition for the target keywords. Every situation is different. Be wary of anyone offering a la carte SEO; without research, an individualized plan of attack, and careful implementation, you might as well wait for the next full moon.