Basics of Search Engine Optimization
SEO makes a website "search engine friendly", which means ensuring that each individual page complies with search engine standards and is ready to be indexed correctly. Every page of the website needs this attention for proper SEO.
Before you buy a domain name, you need a list of targeted keywords that will help your website rank. You'll need to insert them into your pages.
Page Title
The title of the HTML page should be relatively short and describe the page content accurately. Try to include your keywords wherever possible (without distorting the true meaning of the title). For example: <title>SEO Basics</title>
Meta Tags
Use the description and keywords meta tags in the head of each web page. Make these tags different on every page you build.
<meta name="description" content="Basics of Search Engine Optimization."> (The description should be no more than 160 characters.)
<meta name="keywords" content="Important SEO Basics"> (Do not repeat a keyword; use no more than 8 words.)
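Putting the pieces together, the head of a page might look like this (the title and wording are illustrative, carried over from the examples above, not from a real site):

```html
<!-- Illustrative <head> section combining the page title and meta tags -->
<head>
  <title>SEO Basics</title>
  <!-- Description: under 160 characters, unique to this page -->
  <meta name="description" content="Basics of Search Engine Optimization.">
  <!-- Keywords: no repeats, no more than 8 words -->
  <meta name="keywords" content="Important SEO Basics">
</head>
```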
Heading Tags
Use your heading tags, and make sure you include keywords in them. Use one <h1> tag per page with the most important keywords, and use the other heading tags (<h2>, <h3>, etc.) to provide variations and support the main heading.
<h1>SEO Basics</h1>
<h2>Page Title</h2>
<p>The title of the HTML…</p>
<h2>Heading Tags</h2>
<p> Use your heading tags….</p>
Page Text
The text of your web pages needs to contain your keywords near the beginning and near the end. Aim for roughly 1 keyword in every 100 words, but the text must read naturally and use common phrases people might actually search for. The main idea is to spread keywords discreetly throughout the text without making it obvious.
<p>SEO Basics for your "search engine friendly" website.</p>
Remember that text contained within images won't be picked up by search engines; only actual text on the page will be indexed. This is where you need to use the alt attribute on images, e.g. <a href="http://www.basicsofseo.com"><img src="http://www.basicsofseo.com/tools-120x600.jpg" alt="seo image"/></a>
Basic SEO is all about common sense and simplicity, but by itself it will not get you a No. 1 ranking on Google. Basic SEO doesn't require specialized knowledge of algorithms, programming, or taxonomy, but it does require a basic understanding of how search engines work. There are two concepts you need to understand: first, how spiders work; second, how search engines figure out which documents relate to which keywords and phrases.
Analysis of SEO
The search engines collect data about your website by sending an electronic spider to visit the site and copy its content, which it then stores in the search engine’s database. Known as ‘bots’, these spiders are designed to follow links from one document to the next. As they copy and assimilate content from one document, they record links and send other bots to make copies of content on those linked documents. This process is never ending. By sending out spiders and collecting information 24/7, the major search engines have established databases that measure their size in the tens of billions of pages.
Knowing how the spiders read information on a site is the technical end of basic SEO. Spiders are designed to read site content like you and I read a newspaper. Starting in the top left hand corner, a spider will read site content line by line from left to right. If columns are used (as they are in most sites), spiders will follow the left hand column to its conclusion before moving to central and right hand columns. If a spider encounters a link it can follow, it will record that link and send another bot to copy and record data found on the document the link leads to. The spider will proceed through the site until it records everything it can possibly find there.
Because spiders follow links and record everything in their path, if a link to a site exists, a spider will find that site. Webmasters and SEOs no longer need to manually or electronically submit their sites to the major search engines. All you need is a link from an indexed site to your own, and the search spiders will find and index your site. The most valuable incoming links (and the only ones worth pursuing) come from sites that share the same themes.
To help a spider find all the pages you wish it to index, build a text-based sitemap. The easiest way to do this on a blog is to use a sitemap plugin. For a standard website, use Google's webmaster tools to find out how to set up your sitemap. It's important to remember to make your website easy to follow for your readers as well.
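If you build the sitemap file by hand rather than with a plugin, the standard XML format defined by the sitemaps.org protocol looks like this; the URLs shown are placeholders, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the spider to find -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/seo-basics.html</loc>
  </url>
</urlset>
```

Save the file as sitemap.xml in the root of the site so crawlers can discover it.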
Robots
There are some pages that you will not want the spider to index, like your privacy, about, and contact pages. You need the spider to follow the links to these pages but not to index them. You can use a robots.txt file to control this; again, the easy way on a blog is to use a plugin. A bit more skill is needed for a normal site, so you will have to do your homework.
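For a normal site, a hand-written robots.txt file in the site root does the job. A minimal sketch (the paths are hypothetical; adjust them to your own page URLs):

```text
# robots.txt - tells well-behaved spiders which pages to skip
User-agent: *
Disallow: /privacy.html
Disallow: /about.html
Disallow: /contact.html
```

Note that robots.txt stops spiders from crawling the listed pages at all; if you want spiders to still follow links to a page but keep it out of the index, a <meta name="robots" content="noindex"> tag on that page is the more precise tool.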
Offering spiders access to the areas of your site you want them to reach is half the battle. The other half is found in the site content. Search engines try to provide their users with lists of documents that relate to the user's keyword phrases or queries. The search engine must determine which of billions of documents are relevant to a small number of specific words, and to do that it needs to know how your site relates to those words.
Do not cover more than one topic on a page, as this is confusing for spiders and SEOs alike. It is equally important to avoid duplicate content and the temptation to construct doorway pages designed specifically for search placements.
The last on-site element a spider examines when reading the site and relating it to user queries is the anchor text used in internal links. Use relevant keyword phrases in the anchor text. This basic SEO technique is aimed at solidifying the search engine’s perception of the relationship between the documents and the words used to phrase the link.
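For example, an internal link whose anchor text carries a relevant keyword phrase (the link target and wording here are illustrative):

```html
<!-- Keyword-rich anchor text instead of generic "click here" -->
<a href="/seo-basics.html">basics of search engine optimization</a>
```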
The foundation of nearly every successful SEO campaign is simplicity. The goal is to make your site easy to find, easy to follow, and easy to read for search spiders and live visitors alike, with well-written topical content and a fair number of relevant incoming links. While basic SEO can be time consuming in the early stages, the results are almost always worth it and set the stage for more advanced future work.