
What is SEO?
SEO is the act of modifying a website to increase its ranking in organic (vs. paid), crawler-based listings of search engines.

----------------------------------------------------------------------------------------------------------
Types of Search Engines

(1) Crawler-Based Search Engines
Crawler-based search engines use automated software programs to survey and categorise web pages. The programs used by the search engines to access your web pages are called 'spiders', 'crawlers', 'robots' or 'bots'. A spider finds a web page, downloads it and analyses the information presented on it; this is a seamless process. The web page is then added to the search engine's database. When a user performs a search, the search engine checks its database of web pages for the keywords the user searched on and presents a list of link results. The results (the list of suggested links) are ordered by how 'close' each page is, as judged by the 'bots', to what the user wants to find online. Crawler-based search engines are constantly searching the Internet for new web pages and updating their database with new or altered pages.
Examples of crawler-based search engines are:
Google (www.google.com)
Ask Jeeves (www.ask.com)

(2) Directories
A 'directory' uses human editors who decide what category a site belongs to; they place websites within specific categories in the directory's database. The human editors comprehensively check the website and rank it, based on the information they find, using a pre-defined set of rules. There are two major directories at the time of writing:

Yahoo Directory (www.yahoo.com)
Open Directory (www.dmoz.org)
Note: Since late 2002 Yahoo has provided search results using crawler-based technology as well as its own directory.

(3) Hybrid Search Engines
Hybrid search engines use a combination of both crawler-based results and directory results. More and more search engines these days are moving to a hybrid-based model.
Examples of hybrid search engines are:
Yahoo (www.yahoo.com)
Google (www.google.com)

(4) Meta Search Engines
Meta search engines take the results from all the other search engines' results and combine them into one large listing.
Examples of meta search engines include:
Metacrawler (www.metacrawler.com)
Dogpile (www.dogpile.com)
----------------------------------------------------------------------------------------------------------
Meta
Definition: The <meta> element provides meta-information about your page, such as descriptions and keywords for search engines and refresh rates.

Meta element
Meta elements are HTML or XHTML elements used to provide structured metadata about a Web page. Such elements must be placed as tags in the head section of an HTML or XHTML document. Meta elements can be used to specify page description, keywords and any other metadata not provided through the other head elements and attributes. The meta element has four valid attributes: content, http-equiv, name and scheme. Of these, only content is a required attribute.
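The "refresh rates" mentioned in the definition above are set with the http-equiv form of the meta element. As a brief sketch (the 30-second delay and the target URL below are arbitrary values chosen only for illustration):

<meta http-equiv="refresh" content="30" />
<!-- or redirect the visitor to another page after 5 seconds -->
<meta http-equiv="refresh" content="5; url=http://www.example.com/" />

Meta-refresh redirects have historically been viewed with suspicion by search engines, so this attribute is best used sparingly.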

An example of the use of the meta element
In one form, meta elements can specify HTTP headers which should be sent before the actual content when the HTML page is served from Web server to client. For example:
<meta http-equiv="Content-Type" content="text/html" />
This specifies that the page should be served with an HTTP header called 'Content-Type' that has a value 'text/html'. This is a typical use of the meta element.

In the general form, a meta element specifies name and associated content attributes describing aspects of the HTML page. For example:
<meta name="keywords" content="wikipedia, encyclopedia" />
In this example, the meta element identifies itself as containing the 'keywords' relevant to the document, Wikipedia and encyclopedia.

Meta tags can be used to indicate the location a business serves:
<meta name="zipcode" content="45212, 45208, 45218, ..." />
In this example, geographical information is given according to zip codes.

{ Major search engine robots are more likely to quantify such extant factors as the volume of incoming links from related websites, quantity and quality of content, technical precision of source code, spelling, functional v. broken hyperlinks, volume and consistency of searches and/or viewer traffic, time within website, click-throughs, revisits, page views, advertising revenue yield, freshness, geography, language and other intrinsic characteristics, relevance, uniqueness, redundancy, technical user-features, etc. }
----------------------------------------------------------------------------------------------------------
<!DOCTYPE> Definition
The <!DOCTYPE> declaration is the very first thing in your document, before the <html> tag. This tag tells the browser which HTML or XHTML specification the document uses; it specifies the document type so a client (browser or otherwise) knows what content type to render. HTML 4.01 specifies three document types: Strict, Transitional, and Frameset.
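To make the placement concrete, here is a minimal sketch of an HTML 4.01 Strict document; the title and body text are placeholders, and the only essential point is that the <!DOCTYPE> line comes before everything else, including <html>:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
  <head>
    <title>Example page</title>
  </head>
  <body>
    <p>Page content goes here.</p>
  </body>
</html>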

HTML Strict DTD
Use this when you want clean markup, free of presentational clutter. Use this together with Cascading Style Sheets (CSS):
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">

HTML Transitional DTD
The Transitional DTD includes presentation attributes and elements that W3C expects to move to a style sheet. Use this when you need to use HTML's presentational features because your readers don't have browsers that support Cascading Style Sheets (CSS):
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">

Frameset DTD
The Frameset DTD should be used for documents with frames. The Frameset DTD is equal to the Transitional DTD except that the frameset element replaces the body element:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Frameset//EN" "http://www.w3.org/TR/html4/frameset.dtd">
----------------------------------------------------------------------------------------------------------
What is Spamdexing?
Spamdexing is defined by search engines as all of the improper referencing techniques and methods. It consists of adding keywords that have nothing to do with the page and hiding them from visitors. Indeed, spamdexing is likened to spam because it is seen as deception, which is contrary to the interest of Internet users. Thus, the following techniques are penalised:
• Inserting keywords that are the same colour as the page's background (invisible keywords) (see the illustrative snippet after the cloaking section below)
• Adding keywords that have nothing to do with the page to the meta tags
• Repeating keywords (keyword stuffing)
• Webpage hijacking (pagejacking)
----------------------------------------------------------------------------------------------------------
What is Cloaking?
Cloaking is a technique banned by search engines (i.e. it should be avoided) that consists of generating different HTML content depending on whether it is intended for a visitor or for a search engine. Indeed, it is possible to detect search engine robots through the presence of a specific User-Agent field in the HTTP requests that they send (i.e. part of the work is to identify these requests) and show them different content that includes extra keywords that are not shown to visitors. Nevertheless, if this technique is detected by a search engine (which is easy for them to do), the website runs the risk of no longer being indexed or even of being blacklisted (banished) for several months.
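Purely as an illustration of the first spamdexing technique listed above, and not as something to reproduce, "invisible keywords" usually amount to markup of the following kind, where the keyword text is given the same colour as the page background (the colours and keyword text here are made up for the example):

<body bgcolor="#ffffff">
  <p>Normal page content that visitors actually see.</p>
  <!-- white text on a white background: hidden from visitors, but read by indexing robots -->
  <font color="#ffffff">cheap widgets cheap widgets cheap widgets</font>
</body>

As described above, search engines detect and penalise this kind of markup.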

----------------------------------------------------------------------------------------------------------
Promoting a Website
Promoting a website consists in making it known publicly through several channels in order to, depending on the case, improve traffic, gain a reputation, attract prospective customers or develop sales numbers. "Web marketing" (also called cybermarketing or netmarketing) is any campaign that improves a website's visibility by using the Internet as a marketing channel. The term web marketing is used as opposed to "traditional marketing". Because web marketing and traditional marketing are not necessarily exclusive of each other, a well articulated online advertising campaign combined with a traditional offline advertising campaign will have even more impact.
----------------------------------------------------------------------------------------------------------
Presentation of the robots.txt File
robots.txt is a text file that contains commands for search engine indexing robots that specify the pages that can and cannot be indexed. When a search engine explores a website, it starts by looking for the robots.txt file at the root of the site.

robots.txt File Format
The robots.txt file is an ASCII file found at the root of the site. It can contain the following commands:
User-agent: used to specify the robot that is subject to the following orders. The value * means "all search engines".
Disallow: used to identify the pages to be excluded during indexing. Each page or path that is to be excluded must be on a separate line and must start with /. The value / alone means "all of the website's pages".
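As a short sketch of this format (the excluded paths below are hypothetical, chosen only for the example), a robots.txt file that excludes two directories for all robots would read:

User-agent: *
Disallow: /private/
Disallow: /tmp/

A single line "Disallow: /" would instead exclude the whole website, while an empty Disallow value excludes nothing.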

----------------------------------------------------------------------------------------------------------
What is Web Positionning?
"Web positionning" generally means all the techniques used to improve a website's visibility:
• referencing, which consists of introducing the website into search tools by filling out the search tools' forms
• positionning, which consists of positionning the website, or specific pages of the website, on the first results page for certain keywords
• ranking, whose goal is similar to that of positionning but for more elaborate phrases
----------------------------------------------------------------------------------------------------------
Improving Web Positionning
There are some design techniques that can be used to position webpages more effectively:
• original and attractive content
• an aptly chosen title
• a fitting URL
• a body text that can be read by search engines
• META tags that precisely describe the page's content
• well thought out links
(A short example combining these elements is given after the ALT attributes section below.)

Webpage Content

Page Title
The title must describe the webpage's content as precisely as possible (in under 7 words and 60 characters). The title is all the more important because it will appear in the user's favorites as well as in the search history.

Page URL

Body of the Page

Meta Tags

Hypertext Links

ALT Attributes for Images
A website's images are opaque to search engines, i.e. they are not capable of indexing image content. Therefore, it is a good idea to place an ALT attribute on every image that describes the image's content. The ALT tag is also of utmost importance for the blind who browse the Internet with a Braille terminal.
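As a small sketch of how this advice fits together in practice (the site, title, keywords, image and link below are all invented for the example), a page might combine a concise title, descriptive meta tags, readable body text, ALT text and a clear link as follows:

<html>
  <head>
    <!-- concise, descriptive title well under 60 characters -->
    <title>Handmade Wooden Widgets - Example Widgets</title>
    <meta name="description" content="Handmade wooden widgets, built to order and shipped worldwide." />
    <meta name="keywords" content="widgets, wooden widgets, handmade widgets" />
  </head>
  <body>
    <h1>Handmade Wooden Widgets</h1>
    <p>Body text that search engines can read and index.</p>
    <!-- the ALT attribute describes the image for robots and for visitors using a Braille terminal -->
    <img src="oak-widget.jpg" alt="Hand-carved oak widget on a workbench" />
    <!-- a well thought out link with descriptive anchor text -->
    <a href="catalogue.html">Browse the widget catalogue</a>
  </body>
</html>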

Web Positionning Hinges on the Page
The items that search engines index are webpages. Therefore, when designing a webpage, the above-mentioned advice must be taken into account when structuring each page. It is therefore absolutely imperative to choose an appropriate title, URL and meta tags (etc.) for every page of a website. Most webmasters remember to correctly index their website's home page but neglect the other pages, even though it is the other pages that contain the most interesting information.
----------------------------------------------------------------------------------------------------------
Measuring and Qualifying Website Traffic
Every webmaster's goal is to increase traffic to his or her website, i.e. to increase the number of visits every day. Therefore, it is essential to have indicators that, on the one hand, facilitate the measurement of how website traffic is evolving (which is called both "audience monitoring" and "website metering") and, on the other hand, identify the audience in order to provide content that is closer to what the website's visitors want. Measuring and qualifying a website's traffic are two methods for measuring a website's effectiveness in order to permanently improve its quality. Generally there are thought to be two types of studies:
• Site-Centric Measurement
• User-Centric Measurement, performed mostly with a panel of users

How to Measure a Website's Traffic
There are three solutions for measuring a website's traffic:

(1) Exploiting the web server's logs (log files) by using a specific tool. This involves choosing a tool capable of analysing web server log files and creating a control panel containing the website's main traffic indicators.

(2) Developing an ad-hoc statistics system. It is possible on a website to store visitor information each time a visitor loads a page, in order to use that information at a later date. For websites with high traffic flow, this type of mechanism can cause the processor to become heavily loaded and the disk space to fill up, especially if the collected data are stored in a database management system.

(3) Using a "traffic measurement" service. This system consists of inserting a "marker" or "tag" on each page so that the traffic measuring service can collect the data on a server. The advantage of this type of service is that it conserves material resources, because all of the processing is done on a remote server. What is more, the company offering this service is responsible for upgrading the indicators and control panels so as to be constantly in sync with the evolution of Internet access technology and web browsers. However, the statistics gathered this way will not necessarily be exhaustive because:
• some users stop loading pages before the tag code is downloaded
• intermediate proxy servers are likely to impede the page from loading
• security infrastructures, and firewalls in particular, can block information from being uploaded

Hits
One webpage may be made up of a certain number of files (particularly image files, style sheets, JavaScript files, etc.). Thus, a "hit" is a file loaded by a browser. If a webpage containing three images is loaded, that equals four hits.
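To make the arithmetic concrete, the hypothetical page below is one HTML file that references three images; loading it therefore produces four hits, one for the HTML document itself and one for each image:

<html>
  <head><title>Photo page</title></head>
  <body>
    <!-- 1 hit for this HTML file + 3 hits for the three images = 4 hits -->
    <img src="photo1.jpg" alt="First photo" />
    <img src="photo2.jpg" alt="Second photo" />
    <img src="photo3.jpg" alt="Third photo" />
  </body>
</html>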

net" AlltheWeb: "domain:commentcamarche.net" Voilà: "anchor:commentcamarche. Lycos.net" Gauge of the number of links pointing to a website according to search engines: Altavista: "link:commentcamarche.net" Google: "link:www. Yahoo: "linkdomain:commentcamarche.net" Hotbot.commentcamarche.net" ----------------------------------------------------------------------------------------------------------- .Voilà: "url:commentcamarche.