
7 Basic SEO Techniques All Webmasters Must Know

Technique #1 – Optimize your title tags

Every page on your website should have its own unique title tag. However, not all title tags are created equal!

From an SEO perspective, a good title tag should:

Be no more than 70 characters (including spaces)
Include both product- or service-related keywords and your company’s brand name
Be both intriguing and informative enough to prompt search engine users to click through to your page (title tags typically appear in the snippet displayed in the natural search results)
To see what this all looks like in real life, let’s look at a hypothetical example of how this and the following SEO techniques could be put into practice…

Suppose you run an auto body shop in San Francisco, CA. You’ve recently launched a new website which includes a page on your current specials. After conducting your keyword research, you’ve decided to target the phrase “auto body coupon,” along with location modifiers that put your website in front of consumers in your city.

In this case, a good title tag could be:

“Auto Body Coupons and Discounts | Frank's Auto - San Francisco, CA”
In addition, as you’re crafting your title tags, avoid “keyword stuffing” these important fields. Title tags that are no more than a string of all your website’s target keywords (in this example, something like “Auto Body Coupons, Auto Body Discounts, Auto Body Savings”) frustrate visitors and can lead to over-optimization penalties by the search engines.
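To tie this to the page's actual markup, here is a minimal sketch of how that title tag would sit in the HTML, using the hypothetical shop from the example above:

```html
<head>
  <!-- One unique, descriptive title per page, under 70 characters -->
  <title>Auto Body Coupons and Discounts | Frank's Auto - San Francisco, CA</title>
</head>
```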

Technique #2 – Create compelling meta descriptions
Along with your title tag, the <head> section of every page on your website should contain a customized meta description. These brief page summaries should be no more than 150-160 characters long and should include at least one mention of your page’s target keyword phrase.

Following our previous example, your meta description could read something like this:

“Need quality auto body repair work done at a discount price? Check out the latest auto body coupon codes from Frank's Auto of San Francisco, CA.”
While meta descriptions don’t hold nearly the SEO weight that they used to, their presence in the snippets found on search engine results pages plays an important role in your site’s overall click-through rate (CTR). By writing your meta descriptions in a way that captures search users’ attention, you’ll increase the number of visitors who choose to click on your listing compared to your competitors’.
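In the markup, the meta description sits alongside the title tag in the <head> section. A sketch using the hypothetical example above:

```html
<head>
  <title>Auto Body Coupons and Discounts | Frank's Auto - San Francisco, CA</title>
  <!-- 150-160 characters, with one natural mention of the target keyword -->
  <meta name="description" content="Need quality auto body repair work done at a discount price? Check out the latest auto body coupon codes from Frank's Auto of San Francisco, CA.">
</head>
```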



Technique #3 – Utilize keyword-rich headings
Both the title tag and meta description should be included in the <head> section of every page on your website – but what about the rest of your page’s content?

When it comes to your pages’ body text, one of the best things you can do to improve your overall SEO value is to include heading tags such as <h1> and <h2> containing your target keyword phrases in your content. Not only do the search engines place added weight on the words found in these particular tags, but the visual relief they provide to your website’s readers will help them move more efficiently through your site’s content.
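A sketch of what keyword-rich headings might look like in the body of our hypothetical specials page:

```html
<body>
  <h1>Auto Body Coupons and Discounts in San Francisco</h1>
  <p>...</p>
  <h2>This Month's Auto Body Specials</h2>
  <p>...</p>
</body>
```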

Technique #4 – Add ALT tags to your images
If you choose to add images to your web pages, make sure to add ALT tags to them as well.

ALT tags originally came about to provide visually impaired website visitors using text-to-speech devices with additional information about the content on their screens. And though this initial purpose is still valid, the content found in these fields is also important when it comes to SEO.

Again, you shouldn’t “stuff” your ALT tag full of target keywords and keyword variations. Instead, use your ALT tags to clearly and accurately describe your website’s images, working in keyword phrases as they occur naturally.
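For example, a descriptive (not keyword-stuffed) ALT attribute on a hypothetical image:

```html
<img src="dent-repair-before-after.jpg"
     alt="Before and after photos of a dent repair at Frank's Auto in San Francisco">
```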

Technique #5 – Create a sitemap
In the world of SEO, the search engines’ indexing programs – commonly referred to as “spiders” – play the important role of analyzing new websites (or new content added to existing websites) and adding their content to the lists of pages that can be displayed in response to user queries.

As a result, facilitating the easy movement of these spider programs throughout your own site is an important part of optimizing your content for natural search traffic.

One of the best things you can do to help the spiders index your website is to create a sitemap – a page listing links to all the other pages on your site. There are plenty of different automated tools that can generate these important pages for you, though you can always create a sitemap by hand if your site is small.
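A hand-built sitemap can be as simple as a page of plain links that the spiders can follow (the page names here are hypothetical):

```html
<!-- sitemap.html -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/specials.html">Current Specials</a></li>
  <li><a href="/services.html">Our Services</a></li>
  <li><a href="/contact.html">Contact Us</a></li>
</ul>
```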

Technique #6 – Build internal links between pages

Another way to help the search engine spiders to catalogue all of your site’s pages is to create internal links that connect your different pieces of content.

As an example, instead of simply linking to your “Contact” or “About Us” page from your navigation bar, consider adding text links to these pages from within the body content found on your home page. Doing so is a great way to help your visitors find the information they’re looking for, in addition to providing a major SEO boost to your site’s content.
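For instance, body copy on the home page might work those links in naturally (the page paths are hypothetical):

```html
<p>Stop by the shop or <a href="/contact.html">contact us</a> for a free
estimate, and read more <a href="/about.html">about our team</a>.</p>
```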

Technique #7 – Update your site regularly
One final SEO technique you’ll want to implement is to update your site periodically. Because the search engines’ top priority is serving up the results that will best meet their users’ needs, they prefer to share websites that contain the latest, most up-to-date information. In addition, the more content you post to your site, the more keywords your website will include – increasing your odds of generating search engine traffic.

The easiest way to make regular updating a part of your SEO strategy is to create a company blog or news section. Either one of these tools will help you to connect with your readers, while also appeasing the search engines’ desire for fresh content.

Unfortunately, implementing these seven basic SEO techniques on your website alone isn’t enough to guarantee that your site will reach the coveted top spot in the search engine results pages for your chosen keyword phrases overnight. Realistically, SEO is a process that takes time to deliver results – though many new webmasters find this frustrating.

Instead of getting overwhelmed by the number of different SEO techniques out there, start with these seven basics. Over time and with continued commitment to pleasing both your readers and the search engines, you will start to see results!

How to Optimize Dynamic Websites or Pages



Definition of Dynamic Pages:



Dynamic pages are pages generated "on the fly" from a database; they are also known as database-driven pages. Languages such as PHP, Perl, and ASP are commonly used to generate them.


Dynamic pages are difficult for the search engines to read. Each page is generated when a user selects certain variables, and because search engines cannot select those variables themselves, such pages often go unindexed.


Dynamic URLs contain characters like &, %, and ?, which make it harder for the search crawlers to read the content. Even so, it is possible to get dynamic pages indexed.


Here are a few ways to make a dynamic page readable by the crawlers:



  1. Place links to the dynamic pages on static pages, which can then be submitted to the search engines manually. Most dynamic pages will be indexed this way.
  2. ColdFusion and other software packages can replace special characters like ?, %, and & with alternative text.
  3. With the help of CGI/Perl, the query string in the URL can be converted into suitable text. PATH_INFO and SCRIPT_NAME are environment variables that contain the complete URL, including the query strings of dynamic pages.
  4. Many web hosting companies provide an Apache module that can convert dynamic URLs into URLs that search engines can index. The mod_rewrite module for Apache rewrites requested URLs into search-engine-friendly ones.
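As a sketch of that last approach, a couple of lines in an .htaccess file can map a friendly URL onto a dynamic one (the page and parameter names here are hypothetical):

```apache
RewriteEngine on
# Serve /product-123.html from product.php?id=123
RewriteRule ^product-([0-9]+)\.html$ product.php?id=$1 [L]
```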



Dynamic URLs vs. Static URLs

The Issue at Hand

Websites that utilize databases to insert content into a webpage by way of a dynamic script like PHP or JavaScript are increasingly popular. This type of site is considered dynamic. Many websites choose dynamic content over static content because, when a site has thousands of products or pages, writing or updating each static page by hand is a monumental task.

There are two types of URLs: dynamic and static. A dynamic URL is a page address that results from the search of a database-driven web site or the URL of a web site that runs a script. In contrast to static URLs, in which the contents of the web page stay the same unless the changes are hard-coded into the HTML, dynamic URLs are generated from specific queries to a site's database. The dynamic page is basically only a template in which to display the results of the database query. Instead of changing information in the HTML code, the data is changed in the database.

But there is a risk when using dynamic URLs: search engines don't like them. Those most at risk of losing search engine positioning due to dynamic URLs are e-commerce stores, forums, sites built on content management systems or blogging platforms like Mambo or WordPress, and any other database-driven website. Many times the URL that is generated for the content in a dynamic site looks something like this:

http://www.somesites.com/forums/thread.php?threadid=12345&sort=date

A static URL on the other hand, is a URL that doesn't change, and doesn't have variable strings. It looks like this:

http://www.somesites.com/forums/the-challenges-of-dynamic-urls.htm

Static URLs are typically ranked better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs are also easier for the end-user to view and understand what the page is about. If a user sees a URL in a search engine query that matches the title and description, they are more likely to click on that URL than one that doesn't make sense to them.

A search engine wants to list only unique pages in its index. Search engines often combat duplicate content by cutting off URLs after a specific number of variable-string characters (e.g. ?, &, =).

For example, let's look at three URLs:

http://www.somesites.com/forums/thread.php?threadid=12345&sort=date

http://www.somesites.com/forums/thread.php?threadid=67890&sort=date

http://www.somesites.com/forums/thread.php?threadid=13579&sort=date

All three of these URLs point to three different pages. But if the search engine purges the information after the first offending character, the question mark (?), then all three pages look the same:

http://www.somesites.com/forums/thread.php

http://www.somesites.com/forums/thread.php

http://www.somesites.com/forums/thread.php

Now, you don't have unique pages, and consequently, the duplicate URLs won't be indexed.

Another issue is that dynamic pages generally do not have any keywords in the URL. It is very important to have keyword-rich URLs: highly relevant keywords should appear in the domain name or the page URL. This became clear in a study on how the top three search engines, Google, Yahoo, and MSN, rank websites.

The study involved taking hundreds of highly competitive keyword queries, like travel, cars, and computer software, and comparing factors involving the top ten results. The statistics show that of those top ten, Google has 40-50% of those with the keyword either in the URL or the domain; Yahoo shows 60%; and MSN has an astonishing 85%! What that means is that to these search engines, having your keywords in your URL or domain name could mean the difference between a top ten ranking, and a ranking far down in the results pages.

The Solution

So what can you do about this difficult problem? You certainly don't want to have to go back and recode every single dynamic URL into a static URL. This would be too much work for any website owner.

If you are hosted on a Linux server, you will want to make the most of Apache's mod_rewrite module, which gives you the ability to inconspicuously redirect one URL to another, without the user's (or a search engine's) knowledge. You will need to have this module installed in Apache; for more information, see Apache's mod_rewrite documentation. This module saves you from having to rewrite your URLs manually.

How does this module work? When a request comes in to a server for the new static URL, the Apache module redirects the URL internally to the old, dynamic URL, while still looking like the new static URL. The web server compares the URL requested by the client with the search pattern in the individual rules.

For example, when someone requests this URL:

http://www.somesites.com/forums/the-challenges-of-dynamic-urls-12345.html

The server looks for and compares this static-looking URL to what information is listed in the .htaccess file, such as:

RewriteEngine on

RewriteRule ^(.*)-([0-9]+)\.html$ thread.php?threadid=$2 [L]

It then converts the static URL to the old dynamic URL that looks like this, with no one the wiser:

http://www.somesites.com/forums/thread.php?threadid=12345


You now have a URL that not only will rank better in the search engines, but that your end-users can understand at a glance, while Apache's mod_rewrite handles the conversion for you and the underlying dynamic URL stays in place.

If you are not particularly technical, you may not wish to figure out the complex mod_rewrite syntax, or you simply may not have the time to embark upon a new learning curve. In that case, it is extremely beneficial to have something do it for you. This URL Rewriting Tool can help: it implements the mod_rewrite rules in your .htaccess file to transparently convert one URL to another, such as a dynamic URL to a static one.

With the URL Rewriting Tool, you can opt to rewrite single pages or entire directories. Simply enter the URL into the box, press submit, and copy and paste the generated code into your .htaccess file on the root of your website. You must remember to place any additional rewrite commands in your .htaccess file for each dynamic URL you want Apache to rewrite. Now, you can give out the static URL links on your website without having to alter all of your dynamic URLs manually because you are letting the Mod Rewrite Rule do the conversion for you, without JavaScript, cloaking, or any sneaky tactics.

Another thing you must remember to do is to change all of your links in your website to the static URLs in order to avoid penalties by search engines due to having duplicate URLs. You could even add your dynamic URLs to your Robots Exclusion Standard File (robots.txt) to keep the search engines from spidering the duplicate URLs. Regardless of your methods, after using the URL Rewrite Tool, you should ideally have no links pointing to any of your old dynamic URLs.
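A minimal robots.txt sketch along those lines, using the illustrative forum path from this article:

```
User-agent: *
# Keep spiders off the old dynamic URLs once the static versions are live
Disallow: /forums/thread.php
```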

You have multiple reasons to utilize static URLs in your website whenever possible. When that's not possible, and you need to keep your database-driven content at those old dynamic URLs, you can still give end-users and search engines a static URL to navigate, while behind the scenes they are your dynamic URLs in disguise. When a search engine engineer was asked if this method was considered "cloaking," he responded that it was not, and that in fact search engines prefer you do it this way. The URL Rewrite Tool not only saves you time and energy by transparently mapping static URLs to your dynamic ones, but it can also save your rankings in the search engines.