
Introduction to SEO

INTRODUCTION


What is search?
A search is the organized pursuit of information. Somewhere in a collection of
documents, Web pages, and other sources, there is information that you want to find,
but you have no idea where it is. You perform a search by issuing a query, which is
simply a way of asking a question that will find the information you are looking for.
Searching is usually an iterative process. You submit a query and if the results list does
not contain the information you are looking for, you refine the query until you locate a
page that contains the answer.

What is a Search Engine?
A Web search engine is a tool designed to search for information on the World Wide
Web. The search results are usually presented in a list and are commonly called hits.
The information may consist of web pages, images, and other types of files. Some search engines also mine data available in news sources, books, databases, or open directories. Unlike Web directories, which are maintained by human editors, search engines operate algorithmically or use a mixture of algorithmic and human input.

Types of search engines:
Though the term "search engine" is used for both crawler-based search engines and human-powered directories, there is a significant difference between the two.
1-Crawler-based search engines: A crawler, or spider, crawls web pages and stores them in the search engine's index. This index is sometimes referred to as a catalog that contains all the web pages the crawler has found. If there is any update to a web page, the catalog is updated with the new information. Different sites have their own crawling schedules that determine when a crawler will update the index. Sometimes the user may not get the latest information because a web page may have been crawled, but not yet indexed, when the search is run. The search engine is the program that searches through the millions of pages in the index for matches to the user's query and ranks them in order of what it believes is most relevant.
2-Human-powered directories: The website owner submits a short description of the entire site to the directory. A search then looks for matches only in the submitted descriptions. Updates or changes to the website itself have no effect on the listing.
It used to be the case in the web's early days that a search engine presented results from either a crawler-based index or a human-powered directory. Today, however, it is very common for both types of results to be presented together.
For example, MSN Search is more likely to present human-powered listings from
LookSmart. However, it does also present crawler-based results (as provided by
Inktomi), especially for more obscure queries.
- LookSmart provides search advertising products and services to text advertisers, as well as targeted pay-per-click search and contextual advertising via its Search Advertising Network. It provides directory and listing services to Microsoft.
- Inktomi Corporation was a California company that provided software for Internet service providers and was acquired by Yahoo! in 2002.









How Do Search Engines Work? (Basics)






Search engine optimization: Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or unpaid ("organic" or "algorithmic") search results. In general, the earlier (or higher ranked on the search results page), and the more frequently, a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search, news search, and industry-specific vertical search.

As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms typed into search engines, and which search engines are preferred by the targeted audience. Optimizing a website may involve editing its content, HTML, and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code and content of a site, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for search engine exposure.











Search Results:




Organic Results: Organic describes a search that returns results by indexing pages based on content and keyword relevancy. This is in contrast to listings ranked by who paid the most money to appear at the top, such as those on Overture.com. Sometimes this is called "pure" or "natural" search, as it is supposed to be "untainted" by commercial payments or bids.


Pros and Cons of Organic Search Results: Unlike organic foods at your local grocery store, you aren't required to pay extra to reap the healthy benefits of "organic" search. So what are the benefits of organic over paid search?

Organic Search Pros:
1-Greater Click-through: People trust "organically grown" search results more than they do sponsored results. While the engines' business is supported by paid ads, many consumers prefer the organic search results. Due to the contextual nature of organic search, the listings can be more relevant and offer a greater depth of choices. Therefore, while paid ads can play an important part in your marketing strategy, it is ultimately the organic search results that will yield the greater click-through rates, all other things being equal. It is this type of listing that will maximize the traffic to your site when you climb to the top.

2-Power of Branding: More and more large corporations are investing resources into organic search to gain the marketing benefits of promoting their brand. For example, most consumers would expect to find Dell.com in a search for computers. If your company does not show up for the keyword results in which you'd expect to appear, it can be embarrassing. Consumers may wonder whether Company X is as important as it once was if it doesn't show up in MSN, Yahoo!, or Google. Conversely, placing your brand in the top search results can give the impression that your company is important. Smaller companies can therefore convey big-business importance by securing a better position in organic search than their larger rivals.

3-Greater Trust Equals Greater Conversions: Organic search can, of course, be commercially influenced. However, a recent survey shows that people tend to trust organic results more than sponsored listings. On the whole, you should see more visitors from organic search converting to sales, assuming your rankings were for targeted, relevant keywords. In the business world, ROI, or Return on Investment, is king. Fortunately, organic search can deliver the high ROI you're looking for, or that your boss is demanding.

4-Organic is Free: After all these years, it's still free to submit to Google, arguably the most popular of the organic search engines right now. Google has always been adamant about not charging for inclusion in its index of 4.2 billion pages. Most other organic engines will also index you for free, although some, like Yahoo!, do have paid inclusion options. Paid inclusion simply guarantees that your page will get indexed quickly and stay indexed for as long as you maintain your subscription; it does not promise a particular ranking. However, if you have a Web site with good quality content and links from third-party sites, paid inclusion is only a "nice-to-have". It can be very useful in getting pages indexed or re-indexed quickly, which allows you to quickly test various page designs and to feed news and other time-sensitive content to the search engine as quickly as possible.

Organic Search Cons
1-Organic Rankings are Not Automatic: With organic listings, you cannot simply hand over a certain amount of money and be guaranteed to quickly and automatically achieve any ranking you desire. Instead, achieving positions in organic search requires the proper technology, skill, and know-how. There has always been a cloud of mystery around the process of achieving top rankings. How is it done? Where do I start? That's why resources like this newsletter and products like Web Position Gold are essential to a business's success in search engine marketing.

2-Organic Rankings Require an Investment in Time: The age-old adage that "nothing worthwhile in life ever comes easy" rings true with organic rankings. While they are monetarily free, simply submitting your pages to the search engines is not enough to bring a flood of new visitors to your Web site. Far too many businesses have been fooled into spending $49 or $99 to submit their site to "thousands" of sites, 99% of which are obscure names you've probably never heard of. The key is that someone doing a search on a major search engine must be able to easily find your Web site.

SEO Process:

Though every search engine has its own implementation, a search engine usually has four parts:
Crawling
Indexing
Search Algorithm
User Interface

Crawling: Crawling is the process search engines use to access the contents of websites available on the web and provide those contents to the search engine's indexer. Crawlers can also be used to automate maintenance activities on a website, such as validating reference links and HTML code, or to gather specific types of information from web pages, such as harvesting e-mail addresses (usually for spam). A crawler uses a sitemap file to discover website URLs for crawling.

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee a specific ranking within the search results. Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links.

Search engines use "spiders" for crawling.
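The crawl-and-follow behavior described above can be sketched in a few lines of Python. This is a simplified illustration using only the standard library; the page-fetching step is omitted, and real crawlers add politeness delays, robots.txt checks, and deduplication.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against the page's URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl_step(url, html):
    """One crawl step: parse a fetched page and return the URLs to visit next."""
    parser = LinkExtractor(url)
    parser.feed(html)
    return parser.links
```

A full crawler would pop URLs from a frontier queue, fetch each page, call `crawl_step`, and push any unseen links back onto the queue.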



Spiders:

A search engine "spider", also known as a "crawler", is a software program that search engines such as Google use to find out what is out there on the web. The web is a huge place, so something needs to travel around and see what is offered on it every second of every day, and the spider is it. Once a page is loaded, the spider follows all of the hyperlinks on it. Much like a spider crawls through a web and finds all the insects that get stuck in it, the "spider" on the web crawls around web sites and will eventually find your information. When a spider visits your web page, the content of your page gets loaded into a database (picture a gigantic Excel file the size of your city). After your web page has been retrieved, the search engine loads your content into its index; like drawers and drawers of index cards, your words get organized. In SEO terms, the spider goes out and finds your pages, breaks down all of the words on each page, and feeds all of your URLs back into the search engine's systems.

The first thing a spider does when it visits your site is look for a file called "robots.txt". This is a special file that tells the spider what to index and what not to index; if a page is excluded there, it will be skipped, which is one reason a page may not show up in a search engine. (The robots.txt file is optional; if it is absent, spiders will simply crawl everything they can reach.) A spider finds your pages by following hyperlinks from pages it has already found. Search engines may also offer a URL submission form through which you can request that they add your site to their index, which is a good idea in most cases. One last caution: do not use the sites or purchasable software that promise to submit your site to hundreds of engines; this does not work. Earning more quality links to your site, on the other hand, will also improve rankings.






Indexing: Once the data has been crawled by the web crawler, search engine indexing collects, parses, and stores it to facilitate fast and accurate information retrieval. Popular engines focus on the full-text indexing of online, natural-language documents, though media types such as video, audio, and graphics are also searchable. Cache-based search engines permanently store the index along with the corpus.

Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed by the search engines. The distance of pages from the root directory of a site may be a factor in whether or not pages get crawled. Additionally, search engines sometimes have problems crawling sites with certain kinds of graphic content, Flash files, portable document format (PDF) files, and dynamic content.

To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches.
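The collect-parse-store step can be illustrated with a toy inverted index, the core data structure behind full-text indexing. This is only a sketch, not any engine's actual implementation; real indexers also handle stemming, stop words, positions, and ranking data.

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page URLs containing it (an inverted index)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index, query):
    """Return the pages that contain every word of the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())  # intersect: all words must match
    return results
```

At query time the engine never rescans the pages themselves; it only intersects the precomputed word-to-page sets, which is what makes retrieval fast.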

Search Algorithm: Here lies the trick of the trade. This is the part where every search engine has its own implementation, which is why the same query may give you different results on different search engines. Take, for example, Google's query processor, which is responsible for fetching and providing the results to the end user. The life span of a Google query normally lasts less than half a second, yet involves a number of different steps that must be completed before results can be delivered to a person seeking information.

PageRank is Google's system for ranking web pages. A page with a higher PageRank is deemed more important and is more likely to be listed above a page with a lower PageRank. Google considers over a hundred factors in computing a PageRank and determining which documents are most relevant to a query, including the popularity of the page, the position and size of the search terms within the page, and the proximity of the search terms to one another on the page. Google also applies machine-learning techniques to improve its performance automatically by learning relationships and associations within the stored data. For example, the spelling-correction system uses such techniques to figure out likely alternative spellings. Google closely guards the formulas it uses to calculate relevance; they're tweaked to improve quality and performance. Google shares general facts about its algorithm, but the specifics are a company secret. This helps Google remain competitive with other search engines on the Web and reduces the chance of someone finding out how to abuse the system.

A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to the most important pages may improve their visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization helps for web pages accessible via multiple URLs.
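The link-popularity idea behind PageRank can be sketched with the published power-iteration formulation. Google's production ranking combines this with hundreds of other signals; the toy version below only propagates rank along links with a damping factor.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank over a graph {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}            # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = rank[page] / len(outlinks)  # split rank among outlinks
                for target in outlinks:
                    if target in new_rank:
                        new_rank[target] += damping * share
        rank = new_rank
    return rank
```

Pages with more (and more important) inbound links accumulate higher rank, which matches the intuition that a link acts as a vote for the linked page.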

Some notable points used in crawling and indexing:
Robots.txt File: A robots.txt file on a website functions as a request that specified robots ignore specified files or directories in their search. This might be, for example, out of a preference for privacy from search engine results, the belief that the content of the selected directories might be misleading or irrelevant to the categorization of the site as a whole, or a desire that an application only operate on certain data. For websites with multiple subdomains, each subdomain must have its own robots.txt file. If example.com had a robots.txt file but a.example.com did not, the rules that apply to example.com would not apply to a.example.com. The format and semantics of the "/robots.txt" file are as follows:
- The file consists of one or more records separated by one or more blank lines (terminated by CR, CR/NL, or NL). Each record contains lines of the form "<field>:<optional space><value><optional space>". The field name is case insensitive.
- Comments can be included in the file using UNIX Bourne shell conventions: the '#' character indicates that the preceding space (if any) and the remainder of the line up to the line termination are discarded. Lines containing only a comment are discarded completely and therefore do not indicate a record boundary.
- Each record starts with one or more User-agent lines, followed by one or more Disallow lines, as detailed below. Unrecognized headers are ignored.
- Optionally, the location of the Sitemap can also be included by adding a Sitemap: line (giving the full URL of the sitemap file) to robots.txt.
- The file name must be robots.txt and the file must be placed at the root of the web site, i.e. www.example.com/robots.txt.


The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/", or the URL /PersonalData.html. A Sitemap location is also provided.

# robots.txt for http://www.example.com/
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
Disallow: /PersonalData.html
Sitemap: http://www.example.com/sitemap.xml
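Python's standard library ships a parser for exactly these rules, `urllib.robotparser`. The sketch below applies a rule set modeled on the example above to two hypothetical URLs; the domain and paths are illustrative only.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the example robots.txt above
rules = """\
User-agent: *
Disallow: /tmp/
Disallow: /PersonalData.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler checks each URL against the rules before fetching it
allowed = parser.can_fetch("*", "http://www.example.com/index.html")
blocked = not parser.can_fetch("*", "http://www.example.com/tmp/cache")
```

In a real crawler this check runs once per URL, using a robots.txt fetched from the target site's root.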


Sitemaps: Whereas robots.txt disallows a crawler from retrieving certain information on a website, the Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs on the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol. The webmaster can generate a Sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo!, and Ask now use the same protocol, having a Sitemap lets the biggest search engines have up-to-date page information. Sitemaps supplement, but do not replace, the existing crawl-based mechanisms that search engines already use to discover URLs. By submitting Sitemaps to a search engine, a webmaster is only helping that engine's crawlers do a better job of crawling the site. Using this protocol does not guarantee that web pages will be included in search indexes, nor does it influence the way pages are ranked in search results.
Some more facts about Sitemaps:
- Sitemaps can also be just a plain text file containing a list of URLs.
- Sitemap files have a limit of 50,000 URLs and 10 megabytes per Sitemap.
- Multiple Sitemap files are supported, with a Sitemap index file serving as an entry point for a total of 1,000 Sitemaps.
- The filename should fit within the file-naming restrictions of all common operating systems.
- The filename may otherwise contain any characters, provided it complies with the previous point.


A sample XML Sitemap file:

<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
                            http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
  <url>
    <loc>http://w3c-at.de</loc>
    <lastmod>2006-11-18</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>



A sample plain text Sitemap file:
http://www.example.com/catalog?item=1
http://www.example.com/catalog?item=11
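Because a Sitemap is plain XML, one can be generated programmatically. Below is a minimal sketch using Python's standard `xml.etree` module; the URL and date are illustrative only.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal <urlset> document following the Sitemaps 0.9 protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc          # required: the page URL
        ET.SubElement(url, "lastmod").text = lastmod  # optional: last-modified date
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical site with a single URL
xml_text = build_sitemap([("http://www.example.com/", "2006-11-18")])
```

A real generator would walk the site's page database and may also emit the optional `changefreq` and `priority` elements shown in the sample above.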





How Do Search Engines Rank Web Pages?
How does a search engine go about determining relevancy when confronted with hundreds of millions of web pages to sort through? It follows a set of rules, known as an algorithm. Exactly how a particular search engine's algorithm works is a closely kept trade secret. However, all major search engines follow some general rules to determine page ranking:
- Location - Where the search string appears is very important. A website where the search string is present in the HTML title will be displayed before one where it appears only in the body text. In other words, a search string in the title gets first preference, then one in a header, then one in the first few paragraphs of the body text of the index page, and so on.
- Frequency - Frequency is the other major factor in how search engines determine relevancy. A search engine will analyze how often keywords appear in relation to other words in a web page. Pages with a higher frequency are often deemed more relevant than other web pages.
- Meta Tags - Search engines also use some meta tags to rank, search, and index pages. Every search engine has its own set of tags that are used for this purpose.
The location/frequency method described above is a very generic method that serves as a first step of page ranking in most search engines. Other methods include clickthrough measurement: a search engine may watch which results users select for a particular search, then eventually drop high-ranking pages that are not attracting clicks, while promoting lower-ranking pages that do pull in visitors.
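The location and frequency rules can be combined into a toy scoring function. The weights below are invented for illustration; real engines use far more signals with secret weightings.

```python
def score_page(query, title, body):
    """Toy relevance score: a keyword in the title (location) outweighs
    the same keyword in the body, and each repetition (frequency) adds a little."""
    score = 0.0
    title_words = title.lower().split()
    body_words = body.lower().split()
    for word in query.lower().split():
        score += 10.0 * title_words.count(word)  # location: title hits weigh more
        score += 1.0 * body_words.count(word)    # frequency: each body hit adds less
    return score
```

Under this sketch, a page whose title contains the query term outranks one that only mentions it in the body, matching the location rule above.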
Search engines may also penalize pages or exclude them from the index, if they detect
search engine "spamming." An example is when a word is repeated hundreds or
thousands of times on a page, to increase the frequency and propel the page higher in
the listings. Search engines watch for common spamming methods in a variety of ways,
including following up on complaints from their users.
Beyond the basic page ranking factors above, which are in a way controlled by the web administrator, each search engine has its own algorithm for computing page rank, which is a trade secret, and some search engines may even charge for their page ranking system. So even if a website administrator provides all of the above information for page ranking, there is no guarantee that the page will be displayed in the first few result pages.
Types of SEO:
SEO is categorized into two parts:
On page SEO
Off page SEO

On page SEO: Search engines are constantly improving their algorithms so they can provide more relevant results. A relevant site is one that provides quality content for its readers, so after you have chosen your keyword phrases, build the content of your pages around those keywords. On-page SEO means on-page code and content optimization, so in this phase you should pay special attention to the following:

- Title Tag: The title tag is one of the most important factors in achieving high rankings. A title tag is essentially the HTML code that creates the words that appear in the top bar of your Web browser. Usually, the title tag is the first element in the <head> area of your site, followed by the meta description and meta keywords tags. These are the general rules you should follow when optimizing your title tag: use at most 3 keyword phrases and 100 characters in your title; if possible, don't use stop words like "a", "and", or "or"; and avoid spam - don't repeat the same keyword in your title more than twice, as it's considered spam. Also, some engines penalize the use of all CAPS.
- Meta Description Tag: The meta description tag describes your site's content, giving search engines' spiders an accurate summary filled with multiple keywords organized into a logical sentence. Place the keyword phrase at the beginning of your description to achieve the best possible ranking. Many search engines use this tag to describe your site, so make sure you not only repeat each of your keyword phrases (max 3) at least once, but also make it a true representation of the page the visitor will be viewing, and try to keep it under 255 characters.

- The Meta Keywords Tag: The meta keywords tag allows you to provide additional text for crawler-based search engines to index along with your body copy. How does this help you? Well, for most major crawlers, it doesn't, because most crawlers now ignore the tag. The meta keywords tag is sometimes useful as a way to reinforce the terms you think a page is important for on the few crawlers that support it. For instance, if you had a page about pills - AND you use the word "pills" at various places in your body copy - then mentioning "pills" in the meta keywords tag MIGHT help boost your page a bit higher for that word.
- Body Text: This is what your surfers will actually see when coming to your site. There are many issues to consider when placing keywords in the text of your pages. Most search engines index the full text of each page, so it's vital to place keywords throughout your text. However, each search engine uses a different ranking algorithm. These are general rules that everyone should follow:

1. Make sure your main page has your main keywords - It has a higher chance of being indexed than your other pages, and it will be the only page indexed by some engines. Some engines rank a page highly if it has at least 100 words, so make that your minimum. Directories include pages based on the quality of their content, so make sure your pages aren't simply lists of keywords.
2. The H1, H2...H6 tags are given special relevancy weight, and you should plan to integrate your keywords into your headings. You don't have to go to extremes; just use one H1 for your most important keyword and two H2s - one for each of your secondary keyword phrases.
3. Bolding and italicizing your keywords at least once doesn't hurt and actually gives you a very small boost (recommended, but don't go bolding every keyword on the page).
When creating your content pages, keep the following four concepts in mind: keyword prominence, proximity, density, and frequency.
Keyword prominence - The best place for keywords in the text is at the top of each page, preferably the main page. The closer your keywords are to the start of the page or the start of a sentence, the better. This concept is known as "keyword prominence," and you'll frequently see it used to describe search engines' algorithms. Some engines also say the bottom of the page should contain keywords as well.


Keyword proximity - Some engines, such as Google, use the concept of "keyword proximity" as part of their ranking formulas. As the name suggests, "keyword proximity" means how close keywords are to each other. Put your keywords as close together as possible and make sure your sentences are clear. Here's an example:
a. We have been selling weight loss products for over 3 years.
b. We have been selling products for weight loss for over 3 years.
In this case, if someone searches for "weight loss products", the first sentence will rank higher because its keywords are closer to each other.
- Navigation: The navigation structure of your site is important because it is through your navigation that the search engine spiders are able to access all of your web site's content. For sites with a small number of pages, you should have every page of your site linked to every other page. This can be done by having a left/right side menu or by putting a link to every other page at the bottom of each page. For sites with many pages, it is advisable to group content-related pages into directories. For example, if you sell 10 different types of shoes and you also sell food and clothes, you're probably not going to link the sport shoes page and the leather shoes page from the food page, but rather from the shoes pages.
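Putting the title tag, meta description, and meta keywords guidelines above together, a page's <head> section might look like the following hypothetical example (the site name, keywords, and description are invented for illustration):

```html
<head>
  <!-- Title: main keyword phrases first, under 100 characters, no all-caps -->
  <title>Organic Coffee Beans - Fair Trade Coffee | Example Roasters</title>
  <!-- Description: a true summary of the page, keywords near the front, under 255 characters -->
  <meta name="description" content="Buy organic coffee beans and fair trade coffee, roasted fresh and shipped worldwide by Example Roasters.">
  <!-- Keywords: ignored by most crawlers today, but harmless to include -->
  <meta name="keywords" content="organic coffee beans, fair trade coffee">
</head>
```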


Off-page SEO: Link popularity is the number of relevant, quality inbound links pointing to your website. Most major search engines use link popularity as part of their algorithm, which helps to determine the relevance of your website. If you don't have inbound links, you won't rank well for competitive keywords.

- Ways to build link popularity - link exchange: There are some very important aspects you should consider when exchanging links. The right way to exchange links is by finding quality, relevant sites with which to exchange links. These are the links that search engines care about. Convincing webmasters to exchange links with you isn't easy; it is a time-consuming process.


This is how it works: you search for sites that are in the same general topic area as yours (but not sites that compete directly). After you find a list of suitable link exchange partners, you place a link to their site on your site. Then you email the webmasters of the other sites and ask for a link exchange. Do not send generic copies of the same email to each webmaster, however. Write a personalized email to the webmaster explaining what you liked about the site (be specific) and why you think a link exchange would benefit both parties. Be sure to address the webmaster by name if at all possible. Also, be sure to give him your link exchange information. This should include the title of your site, a short description, and the URL that you want his site to link to (this doesn't have to be the home page). And be sure to include the URL of the page that already has his reciprocal link on it. Be prepared for rejection for various reasons; this is normal, so don't be discouraged - you will also receive emails accepting your proposal. Verify that your reciprocal link is in fact on their sites and then send a "thank you" email. Inbound links built through reciprocal link exchanges are a very powerful website promotional tool. If done correctly, increasing your linkage will increase your traffic significantly and will:

1-improve your visibility in the search engines by raising your link popularity

2-provide an added resource to your website

3-save you a lot of advertising money

You can use software such as Arelis or Opt link to manage your link exchange campaign.
Allow other people to publish your e-zine on their web site.
Include your web site's ad and link in each issue you publish. This may also help you increase the number of people who subscribe to your e-zine.
Create a directory of web sites on a specific topic.
Give people the option of adding the directory to their web site by linking to it. Put your
business ad at the top of the directory's home page.
Offer a free e-book to your web site visitors.
The e-book should be related to your target audience. Allow them to give the e-book to
their own web site visitors by linking directly to your web site.
Exchange content with other web sites.
You could trade articles, top ten lists, etc. Both parties could include a resource box at
the end of the content.


Join or create a web ring.
A web ring is a group of web sites on a similar subject agreeing to link together. To find
a web ring to join, type the keywords "web rings" into your search engine of choice.

O Directory Submission: Directories are different from standard search
engines in that a search engine queries a database of indexed websites before
it produces results, whereas a directory is a database of websites that have
been arranged by subject. Search engines put weight on the links coming from
directories, which is why it is so important that you know how to submit your
site. Remember that your site must be optimized before submitting, not still in
construction phases; otherwise it will be rejected from submission. For a
correct submission, select the most appropriate category related to the subject
matter of the site and then submit, suggesting a title, a description, and any
other information the directory may require. Also make sure you read their
submission guidelines.

SEO Techniques:

Types of Techniques:

White Hat SEO
Black Hat SEO


SEO techniques can be classified into two broad categories: techniques that search
engines recommend as part of good design, and techniques of which search
engines do not approve. The search engines attempt to minimize the effect of the latter,
among them spamdexing. Industry commentators have classified these methods, and
the practitioners who employ them, as either white hat SEO or black hat SEO. White
hats tend to produce results that last a long time, whereas black hats anticipate that
their sites may eventually be banned, either temporarily or permanently, once the
search engines discover what they are doing.



White Hat SEO: An SEO technique is considered white hat if it conforms to the
search engines' guidelines and involves no deception. As the search engine
guidelines are not written as a series of rules or commandments, this is an important
distinction to note. White hat SEO is not just about following guidelines; it is about
ensuring that the content a search engine indexes and subsequently ranks is the same
content a user will see. White hat advice is generally summed up as creating content for
users, not for search engines, and then making that content easily accessible to the
spiders, rather than attempting to trick the algorithm away from its intended purpose.
White hat SEO is in many ways similar to web development that promotes
accessibility, although the two are not identical.

Black Hat SEO: Black hat SEO attempts to improve rankings in ways that are
disapproved of by the search engines, or that involve deception. One black hat
technique uses text that is hidden, either as text colored similarly to the background, in
an invisible div, or positioned off screen. Another method serves a different page
depending on whether the page is being requested by a human visitor or a search
engine, a technique known as cloaking.
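The cloaking decision described above amounts to branching on the visitor's User-Agent header. For illustration only, a minimal sketch of the kind of logic search engines penalize (the crawler signatures and file names are illustrative assumptions, not a real site's configuration):

```python
# Illustration of cloaking logic; this is exactly the behavior that gets sites banned.
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp")  # common crawler names

def is_crawler(user_agent):
    """Crude check: does the User-Agent header look like a search engine crawler?"""
    return any(sig in user_agent.lower() for sig in CRAWLER_SIGNATURES)

def choose_page(user_agent):
    """A cloaking site returns a keyword-stuffed page to crawlers
    and the real page to human visitors."""
    if is_crawler(user_agent):
        return "keyword-stuffed-page.html"
    return "real-page.html"

print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # keyword-stuffed-page.html
print(choose_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))  # real-page.html
```

Note that search engines also crawl from undeclared user agents and IP ranges precisely to detect this mismatch, which is one way cloaked sites are caught.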
Search engines may penalize sites they discover using black hat methods, either by
reducing their rankings or eliminating their listings from their databases altogether. Such
penalties can be applied either automatically by the search engines' algorithms, or by a
manual site review.















A Search Engine Sees: [figure omitted]

A Search Engine Doesn't See: [figure omitted]


Advantages of SEO

Most people on the Internet like to use search engines to search for a specific product
or service. Your product listing should be on the top pages of major search engines.
You can take services from SEO service providers to achieve specific targets.
SEO advantages include targeted traffic, increased brand visibility, an increase in sales,
etc. Keyword optimization plays a significant role in making your site the best choice for
searchers. SEO is a very cost-effective method of marketing.

Other advantages of SEO include better usability, browser compatibility, and
accessibility. Good SEO services will increase sales and strengthen your business
services, affiliate business, and credibility. SEO services can also increase your
confidence and authority. Many search engines list websites for free; however, some of
them charge extra for top listings.

SEO service providers can help you to get into the top listings. If SEO services are
implemented effectively, they can increase targeted traffic to your site and rank your
site well on a long-term basis.

There are many SEO techniques available, and you can choose the service that best
suits you. Any of these methods can be used to rank your website; however, an
ongoing SEO campaign must be used to get the target results from your website.

O You can increase daily and unique visitors on your website by offering
information related to their needs.
O You can target visitors who are looking for items which are exactly related to
your company and increase brand recognition and identity.
O Writing content that is exactly relevant to your product increases public
exposure.
O SEO is necessary to achieve your organization's goals if you are doing online
business.
O It is possible to direct your target visitors towards specific products which are
highly relevant to them.
O You can get measurable results by analyzing search engine reports, website
statistics, visitor conversion rates, and other key indicators.




Disadvantages of SEO:
Unfortunately, the optimization of content can have a negative impact on the Internet
industry as a whole. Although the process really depends on delivering relevant results
to users, it can be manipulated and exploited by unscrupulous practitioners of the trade,
a practice known as black hat search engine optimization. SEO is sometimes used to
increase traffic with no productive outcome: web sites stuff as many keywords as they
can into their content in the hope of a higher rank, despite being unable to deliver what
searchers are looking for. Fortunately, search engines like Google have implemented
measures to counter these unsavory practices. Ultimately, it is still the sites with
valuable results that people tend to choose.













CONCLUSION
SEO is the modification of a web site's build, content, and inbound linking to better
position the site in the natural/organic results of the major search engines. Search
Engine Optimisation is not some mystic black art! It is simply understanding how search
engines rank content and using this understanding to ensure a website or other digital
asset ranks well. A valuable SEO tip: SEO is thinking the same way a search engine
does and abiding by its rules to get your site ranked with white hat SEO techniques.
Why SEO?
O Organic search results drive more clicks than paid search results. On Google's
search result page, paid search picks up around 25% of the click-share,
whereas organic search results account for around 75% of clicks. Consequently,
optimising the organic search listings is of utmost importance.
O Leaked data from AOL showed that the number 1 organic search result has a
click-through rate of 42%, the second result 12%, and the third result 8%. Based on
these figures, if you rank number one, you'll get 3.5x more traffic than if you rank
number 2, and roughly 5 times the traffic than if you rank number 3!
O The search engine market share in the UK is heavily skewed towards Google;
according to Hitwise, Google has over 90% of the UK market share!
O Conclusion: ranking in position 1 in Google UK is the key objective!
O This is because Google applies its ranking algorithms effectively and rewards
well-optimized sites.
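The 3.5x and roughly 5x figures above follow directly from the quoted AOL click-through rates; a quick check:

```python
# Click-through rate by organic rank, from the AOL data quoted above.
ctr = {1: 0.42, 2: 0.12, 3: 0.08}

ratio_vs_2 = ctr[1] / ctr[2]  # about 3.5
ratio_vs_3 = ctr[1] / ctr[3]  # about 5.25, i.e. the "roughly 5 times" quoted

print(round(ratio_vs_2, 2), round(ratio_vs_3, 2))
```

The exact rank-3 ratio is 5.25, which the text rounds down to "5 times".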









BIBLIOGRAPHY
Web Sites:
www.seomoz.org
www.searchenginewatch.com
en.wikipedia.org (Wikipedia)
www.seoarticles.com
www.virtualsplat.com
www.beanstalk.com
Books:
"The ABC of SEO" by David George
