Organic Search Engine Optimization (SEO)
Also known as relevancy-based search, organic search is where the search engines find your web site and determine its relevance to certain key phrases.
There are two fundamental parts to SEO. One is on-page optimization, which includes link renaming, internal linking, setting alt attributes of image tags, meta tags, title tags, geographical focusing of pages to target keyword phrases, and copywriting using accepted SEO principles. The other is off-page optimization, which is largely discussed in the Building Back Links page, as linking is the primary component of off-page optimization. There are also such things as page, image and folder renaming, and robots.txt files, which could be considered in either camp.
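As a rough sketch, the on-page elements mentioned above look like this in HTML (the phrases, file names and keywords below are made-up placeholders, not a real recipe):

```html
<head>
  <!-- Title tag carrying the page's target keyword phrase (placeholder) -->
  <title>Widget Repair in Springfield</title>
  <!-- Meta tags: description and keywords -->
  <meta name="description" content="Widget repair services in Springfield.">
  <meta name="keywords" content="widget repair, springfield widgets">
</head>
<body>
  <!-- Internal link back to the home page -->
  <a href="index.html">Widget Repair Home</a>
</body>
```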
It is important to have internal links: a site with a hundred pages will help you if each page is linked to the home page.
Designers may want to include optimization in the design process, as SEO is affected by design decisions. It is part of the cake, not just the frosting.
Diadematus And Google Site Maps
Diadematus for Google is a handy little free tool for creating Site maps. I tried a number of others but they all seemed like hard work or provided invalid output. Google's suggested Python solution is great for those who have Python on their server and are totally comfortable with installing a program there but again, it looked like hard work for the majority of us.
Diadematus is installed on your computer in the MediaCET folder. You can then run it from there to crawl your site and create an XML site map in the folder of your choice. I place it in the folder that mirrors the site on my computer, upload it via FTP, and then submit it to Google.
However, there is a bug in the version that I downloaded, Diadematus 1.01.
The XML tags in the Diadematus-generated site maps say "changefreg"; this should say "changefreq". Google was robust enough to put up with this initially, but lately it has parsed sitemaps containing the incorrect term and spat them out as errors. Changing freg to freq globally seems to solve the problem.
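That global change can be scripted rather than done by hand; here is a minimal Python sketch (the sitemap file name is an assumption, adjust the path to your own copy):

```python
# Fix the Diadematus "changefreg" typo so Google will accept the sitemap.
# The path "sitemap.xml" is an assumption; point it at your own file.
from pathlib import Path

def fix_sitemap(path):
    xml = Path(path).read_text(encoding="utf-8")
    # Replace the misspelled tag everywhere (covers opening and closing tags).
    fixed = xml.replace("changefreg", "changefreq")
    Path(path).write_text(fixed, encoding="utf-8")
    return fixed
```

After running it, re-upload the corrected sitemap and resubmit to Google.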
I e-mailed Mediacet, the makers of Diadematus, in September 2006, about the problem, but never received a response, and it doesn't look like an updated version is available.
Diadematus also doesn't like mailto: URIs, and will say they are invalid while processing your site.
For those of you interested, Diadematus takes its name from the orb-weaving spider, Araneus diadematus.
Other Sitemap tools:
AutoMapIt is a service that can do site maps for websites up to 500 pages for free. It auto-pings Google, Yahoo, and others when your maps are uploaded, and covers Google, HTML, RSS, ROR and more for sitemap formats, with lots of available filters and tweaks to get clean sitemaps. Best of all, it will run your maps and keep them updated, versus needing to start a program or remember to upload them.
SoftPlus GSiteCrawler has been described as having "great results, it is free and will do XML, HTML, text, and robots.txt files among others."
Free Site Map Generator "has some good features. Auto upload new sitemap to your site. Auto ping google when a new sitemap is uploaded."
The tool at http://www.rorweb.com/rormap.htm creates a sitemap in XML that you can name sitemap.xml or ror.xml; you then put a meta tag on your site that tells ALL bots about the existence of this site map. It comes in two free versions: a web service that will do up to 1000 URLs and a downloadable Windows version that will do up to 5000 URLs.
See http://www.sitemaps.org/ for a full explanation.
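For reference, a minimal sitemap in the sitemaps.org XML format looks like this (the URL, date and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```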
META tags are controversial because many of the general public assume, as do some uninformed web developers, that putting your keywords into the META tags is all that is required for a high ranking and relevance on the search engines. This is definitely not true.
Some authorities have stated that Yahoo may place some emphasis on the META description tag when it comes to putting a site into its directory, but not into its search index. The URL, title and description are the only things indexed in the Yahoo directory.
A recent SEO project I did demonstrated that the META keywords tag definitely does help in search rankings on Yahoo and AltaVista. We put a keyword into the tag, and that was the only place on the web site that the keyword existed. In a week we were #4 on Yahoo and #2 on AltaVista for the term! Note that there was almost no competition for the term. Still, the search engines noticed it.
Other engines may also pay attention to the tag: it has been alleged that some search engines penalize web sites if the search terms in the META keywords tag don't appear in the body text of the web page.
Google once suggested the perfect formula for on-page optimization (this is 5-year-old information):
Use the key phrase once in the title, again in the META description tag, twice in the first two paragraphs, and then again in the last sentence on the page.
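As an illustration of that formula, with the made-up phrase "widget repair" as the target:

```html
<title>Widget Repair Guide</title>
<meta name="description" content="Widget repair explained step by step.">
<!-- ... -->
<p>Widget repair starts with finding the fault.</p>
<p>Most widget repair jobs need only simple hand tools.</p>
<!-- ... rest of the page ... -->
<p>And that, in short, is widget repair.</p>
```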
Top Level Domain Names
Do the search engines have any preference against .info, .net, .ca, .tv or whatever other TLD in favor of .com?
According to Roy Montero, no, there isn't.
Google's new algorithms are hurting sales landing pages (the "Google slap"). Google says: "Recently, we have begun incorporating the quality of an ad's landing page into the determination of what ads appear on your site. The quality of the ad's landing page now affects the Quality Score that the ad receives -- this score helps to determine the amount an advertiser must bid to appear on your site. The lower the Quality Score, the more 'expensive' it is for the advertiser to show up."
Concentrate on long-tail phrases, e.g. "pickup truck bed liner" or "ford f100 pickup truck bed liner".
404 errors will result from spiders looking for robots.txt, so add one.
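A minimal robots.txt that allows all crawlers everywhere (the usual default for a public site) is just:

```
User-agent: *
Disallow:
```

The empty Disallow line means nothing is blocked; the spiders get their file and stop generating 404s.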
This should look after 90% of web sites.
404 and Error Pages
You will need a custom 404 page. A 404 code is the one that the server sends out when the client asks for a non-existent page.
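On an Apache server (an assumption; other servers configure this differently), a custom 404 page can be set with one line in .htaccess (the page name is a placeholder):

```apache
ErrorDocument 404 /404.html
```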
301 redirect for URL changes.
You will need to add 301 redirects to pages that you have renamed. I have already added a custom 404 error page that catches any missing pages. If you can, you should keep the original page names, as these are indexed in the Search Engines.
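On Apache (again an assumption about your server), a 301 redirect for a renamed page looks like this in .htaccess; the file names and domain are placeholders:

```apache
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

This tells the search engines the move is permanent, so they transfer the old page's index entry to the new URL.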
If you change a page's extension from .html to .shtml, the page is lost from the search engine index, because the search engines indexed your site under its first extension. You must wait again for the search engine to crawl and index the new extension.
Free Web Hosting
There are several free web hosting services available, such as 50webs.com, bytehost.com, awardspace.com, 110mb.com, fateback.com and pages.google.com.
These may be a good way to get free hosting, and perhaps diversify your IP address. When it comes to calculating the worth of a backlink, the search engines discount the value of links coming from the same C-block IP address.
Other SEO Techniques
Add a glossary page to build up SEO terms. Have a site map, a FAQ and a search option.
Translate sites into another language, e.g. Spanish. This may increase your traffic.
Yahoo uses a modified version of Inktomi for its SERPs. Slurp is the spider.
http://www.seomoz.org/articles/search-ranking-factors.php is an excellent discussion of the factors in SEO.
http://botw.org/ Best of the Web $70/submission
linkcenter.com free submission
Traffic generation from http://www.jimcockrum.com/blog/2006/08/09/jims-top-traffic-generation-tips/
BBBonline and Verisign are examples of the lemming principle: people trust those brands, as they do UPS, Amazon or PayPal.
Put your phone number on the site, as it lets people know that they can call if there is a problem.
http://www.webrankinfo.com/english/tools/google-data-centers.php lets you check your rankings on 17 Google data centers at once
Bruce Clay has a chart of which search engines feed which other engines.
Supplemental results are an index separate from Google's main index. You can find out which pages of your site are in the supplemental results by doing a site: query (e.g. site:example.com) on your site. They will be listed after the main indexed results and labeled as supplemental results.
This may be used in determining SE ranking. You can diversify your hosting with providers such as http://www.cheapandbest.org.
Give images "alt" or "title" attributes which contain your keywords. Google provides image search, which means that the Googlebot CAN read the images at your site.
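A sketch, with a placeholder file name and keywords:

```html
<!-- alt and title attributes carrying the target phrase (placeholders) -->
<img src="bed-liner.jpg"
     alt="pickup truck bed liner"
     title="Pickup truck bed liner">
```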
How Long You Have Your Domain Registered
If you register your domain for 10 years rather than one, then that may give your site a boost in Google's rankings, according to one of their patents.
Matt Cutts says: "I carefully chose keywords for the title and the url (note that I used 'change' in the url and 'changing' in the title)." Using directory URLs, e.g. /whats-an-update/, is strongly recommended.
Frontloading from http://adsense.1scroll.net/front.htm
Frontloading means that you start headlines, paragraphs and links with the most important words. The first words should communicate the subject of the headline, paragraph or link. This is not like writing a novel or a story, where you have time to be coy and not get to the point for a while. You've got about a quarter of a second to grab that user's attention or he won't read the rest of the sentence. Make the most of that opportunity.
If you frontload your writing, especially at the top of the page, users' eyes will easily catch the most important info, and they'll keep reading.
Here are some examples of good frontloading:
- Foo Fighters release new cd
- Barbeque beef ribs recipes everyone will like
- Tom Cruise stars in a new movie
Here are some bad examples that are not frontloaded:
- New cd is being released, it’s by the Foo Fighters
- Everyone will love these great new recipes for barbeque beef ribs
- New movie is coming out and it’ll star Tom Cruise
Never Hide Headers from http://adsense.1scroll.net/never.htm
Remember how I said people look to the upper left? If you’ve been centering your headlines and subheadings, do you still think that’s a good idea? Well, it’s not. Yeah, I know newspapers, magazines and books do it. So do lots of other sites. But that’s just not where people want to look first.
They’ve tested this. Believe it or not, about 10-20 percent of people just literally do not see centered headlines, particularly if they’re in a hurry (and who isn’t these days?) They look in the top left hand corner of the content. And when they do, they see empty space, because the centered headline starts off to the right.
So what do they do? Instead of scanning right, they move their eyes down. And they miss the headlines.
Centered headlines are wasted headlines. If you center them, you’ve hidden them from 10-20% of your readers. Might as well not have them at all. And don’t even think about right-justifying them.
Just left-justify them and don’t ever worry about it again!
A word about tables: the ideal table for online is short, narrow, and only used for data. When a table is too wide or too long, part of it is out of the reader's natural field of vision. When they scan fast, they won't see all of it.
Google, Ask, Yahoo and Microsoft now follow a robots.txt directive that tells the crawlers the name of the sitemap. A disadvantage is that your sitemap may have errors, and webmaster tools can tell you about those.
The directive is Sitemap: followed by the location of the sitemap; per the sitemaps.org protocol it must be the full URL, not just sitemap.xml.
Sitemaps help the crawler figure out which pages are changing, which saves time and processing.
All URLs have to be full paths in the sitemap itself.
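So the robots.txt line looks like this (example.com is a placeholder for your own domain):

```
Sitemap: http://www.example.com/sitemap.xml
```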
There is a sitemap google group.
Yahoo Site Explorer is similar to Google Webmaster Tools. You can even remove URLs (and subpaths) that you don't want in the index.
A good speaker was Eric Papczun from Performics.
Search Submit Pro puts PPC into the organic results.
A solution for domain or URL changes is to put up a sitemap with all the 301 pages. Within two weeks the SEs will know that your site has moved (see mediapost.com, subscription required, or rangeonline.com for details).
Exclude spammy content from your sitemap.
Priority code allows you to highlight relevant pages.
What are Google Sitelinks?
You may see subheadings under some search results. Google derives these from the site architecture.
1. Tables, schmables. I use them all the time on my sites and it has never been an issue with getting indexed top ten on Google, MSN and Yahoo, within a few weeks in some cases. The crawlers are pretty smart, they know that lazy people like me use them and probably just ignore them and look at the content. Otherwise 70% of the relevant sites on the web would not be indexed.
2. Rankings. On the subject of Google's erraticness of late, I've whined here about one of my sites, the one with one of the highest eCPMs unfortunately, which has had a bumpy ride on Google. It was in the sandbox for about ten months, only to be found in the supplemental results, then the last week of May it finally made the main index and was top ten for the esoteric terms I wanted it to be. For one day (June 25) it was top ten for a high traffic term "florida homeowners insurance". I thought I had it made. Four times as much traffic and my biggest ever day of Adsense. The next day and ever since, it is not even in the supplemental results. Who knows what is going on with their algorithms.