9 Best Professional Online Networking Websites

1. Network: LinkedIn

Registered Members: LinkedIn has 30 million professional members from around the world representing 150 industries.

2. Network: Ecademy
Registered Members: Membership numbers aren’t disclosed.

3. Network: Xing
Registered Members: 6 million members, mainly Europe (Germany) and China.

4. Network: Ryze
Registered Members: 500,000 members

5. Network: Plaxo
Registered Members: 15 million members

6. Network: Spoke
Registered Members: Over 60 million people

7. Network: Silicon India
Registered Members: 500,000 members

8. Network: Brijj.com
Registered Members: Membership numbers aren’t disclosed.

9. Network: Facebook
Registered Members: 321.1 million people as of 2009.

My Suggestions for Google Top Ranking

 

Create city-wise subdomains, such as pune.yourdomain.com and mumbai.yourdomain.com.

 

On pune.yourdomain.com, either upload a full copy of the website or redirect visitors to the main yourdomain.com site.

Alternatively, upload the whole website but lock the search criteria to the selected city only, i.e. Pune.
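As a sketch of that second option, the shared code base can detect which city subdomain was requested and pre-filter the search accordingly (the host names and helper function below are hypothetical, not part of the current portal):

```php
<?php
// Hypothetical helper: derive the search city from the subdomain, so that
// pune.yourdomain.com serves the same code base with a locked city filter.
function city_from_host($host) {
    $parts = explode('.', strtolower($host));
    // e.g. pune.yourdomain.com -> array('pune', 'yourdomain', 'com')
    if (count($parts) >= 3 && $parts[0] !== 'www') {
        return $parts[0];
    }
    return null; // main site: no city filter
}

$city = city_from_host(isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : 'yourdomain.com');
// if ($city !== null) { lock the search form's city field to $city }
```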

 

Google's crawler re-collects website data periodically. It captures many types of data: the meta description, meta keywords, page information, page names, and page content. But Google gives first preference to the website's domain name over individual page names (page titles).

 

That is why I suggest creating simple city-wise and popular-category-wise subdomains. This is a powerful system that we can implement and improve.

 

Google can then index our subdomains and their different types of data, i.e. specialized (city-wise and category-wise) information about rental activity in that city or category.

 

This is technically possible at any time and is not much work: no database mismatch, no additional database required, and no additional mail accounts required. (However, if we do create addresses like pune@yourdomain.com and mumbai@yourdomain.com, that is even better for Google ranking, and Pune rental enquiries would then arrive only at pune@yourdomain.com.)

 

Our website is currently hosted at GoDaddy.com, and our hosting plan lets us create 10 subdomains.

 

This is a simple, achievable idea for improving our Google ranking.

 

Please read my search engine report, since we use PHP and MySQL technology in our web portal: what does Google do for data collection, and what criteria does it use for ranking and listing websites? I have read many articles and used many software tools and programs for Google ranking.

 

I am glad to say that the techniques I have suggested are very helpful for reaching our goal.

 

These techniques are used by many websites, such as:

 

(pune.magicbricks.com, pune.indiaproperty.com, realestate.virtualpune.com, apartment.magicbricks.com, pune.click.in, Bharatmatrimonial.com, and many more)

 

Search Engine Optimization (SEO) Report for PHP

 

PHP pages have a reputation of being more difficult (or at least different) to SEO than static HTML pages. Here's an overview of the major issues encountered when trying to optimize PHP scripts for search engines. While this focuses on PHP, much of it is still relevant to SEO'ing dynamic pages in general.

PHP Speed

While page size does affect load time, spiders run on servers connected to high-bandwidth networks, so download time is less important than the latency of the PHP script's execution time. If a search engine spider follows a link on a site and is forced to wait too long for the server to process the PHP code behind that page, it may label your page as unresponsive.

The biggest delays in a PHP script are typically database calls and loop code. Avoid SELECT * queries; instead, explicitly name the columns you want to retrieve, and if you are using MySQL, test your queries with the EXPLAIN statement. To optimize loops, consider unrolling loops that only repeat a few times, and hoist invariant values, such as count($array), out of the loop by computing them once beforehand.
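As an illustration of that loop advice, a minimal sketch (the table and column names in the query are assumptions for the example):

```php
<?php
// Hoist loop-invariant work: count($rows) is computed once, before the
// loop, instead of being re-evaluated on every iteration.
function slugify_titles(array $rows) {
    $out = array();
    $n = count($rows); // loop-invariant, evaluated once
    for ($i = 0; $i < $n; $i++) {
        $out[] = strtolower(str_replace(' ', '-', $rows[$i]['title']));
    }
    return $out;
}

// On the query side, name columns explicitly instead of SELECT *
// (hypothetical table/columns), then check the plan with EXPLAIN:
$sql = "SELECT id, title, city FROM listings WHERE city = 'pune' LIMIT 20";
```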

URL Cleanliness

A major goal in SEO'ing your PHP pages is to make them look and act like static pages. If you have a large site you can use Apache to fake static-looking URLs, or, with a smaller site, you can simply keep your GET variables to a useful minimum. In either case, however, never allow a spider to see links with different URLs to the same content. If the URL is different, the page should be too.
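For the large-site option, a hedged .htaccess sketch of the Apache rewrite approach (the URL pattern and the page.php parameter names are assumptions, not the portal's actual setup):

```apache
# Map a static-looking /books/computers.html onto a dynamic script.
# Assumes mod_rewrite is enabled and .htaccess overrides are allowed.
RewriteEngine On
RewriteRule ^([a-z]+)/([a-z0-9-]+)\.html$ /page.php?category=$1&topic=$2 [L,QSA]
```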

One of the major problems most webmasters have with getting their dynamic pages indexed is URL cleanliness. Since many dynamic pages are created with GET variables, lots of pages have URLs that look like:

Page.php?var=lkdjdfhv&var2=345&var3=kfjdks

Most search engines will be able to follow this link because it has three or fewer GET variables (a good rule of thumb is to keep the number of GET variables passed in the URL to three or fewer), but any more than that and you will run into problems. Try using fewer GET variables and make them more relevant: rather than opaque id numbers, use titles and other keyword-rich bits of text. This is an example of a better URL:

Page.php?var=category&var2=topic

If the page requires more variables you may want to consider combining the variables by delimiting them with a hyphen or another unused character, and then splitting the variable in the target page.
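A minimal sketch of that combining/splitting idea (the variable and page names are illustrative):

```php
<?php
// Pack two values into a single GET variable with a hyphen delimiter
// (assumes the first value never contains a hyphen itself).
function pack_vars($category, $topic) {
    return $category . '-' . $topic;
}

function unpack_vars($packed) {
    return explode('-', $packed, 2); // limit 2: split at the first hyphen only
}

// Link side:   Page.php?var=books-computers
// Target side: list($category, $topic) = unpack_vars($_GET['var']);
```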

Disabling trans_sid

Possibly the biggest cause of webmaster frustration when SEO'ing PHP pages is PHP's tendency to append session IDs to links if cookies are rejected by the browser (search engine spiders reject cookies). This happens by default if your PHP installation was compiled with the --enable-trans-sid option (and the feature is compiled in by default from version 4.2 onward), and it creates links with an additional, long, nonsense-looking GET variable. Besides making the links clunky, this gives spiders different URLs with the same content, which makes them less likely to treat the pages individually, and possibly not index them at all. A quick fix, if you have access, is to disable the trans_sid feature in php.ini by setting session.use_trans_sid to false. If you don't have access to php.ini, you can add these lines to the .htaccess file in your root directory:

<IfModule mod_php4.c>
php_flag session.use_trans_sid off
</IfModule>
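If neither php.ini nor .htaccess is editable, the flag can also be turned off at runtime, before session_start() is called (a sketch, assuming your host permits ini_set for these options):

```php
<?php
// Disable URL-based session IDs at runtime, before the session starts.
ini_set('session.use_trans_sid', '0');
ini_set('session.use_only_cookies', '1'); // also refuse SIDs passed in URLs
if (session_status() === PHP_SESSION_NONE) {
    session_start();
}
```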

Masking dynamic URLs

However, the mere presence of a question mark in the URL will introduce a delay in Google's indexing of the page. This from Mike Grehan's interview with Daniel Dulitz of Google:

"So the mere presence of a question mark in the URL doesn't throw us for a loop at all. We will crawl those pages more slowly than we do with pages without question marks in the URL. Purely because they've identified themselves as being dynamic and we certainly don't want to bring anyone's site down because we're hitting the database too hard."

Small sites will not need to worry much about this delay, since it means your server is hit every few minutes rather than a few times a second, but for larger sites this can slow down your site's inclusion in the index.

Making dynamic pages look static without mod_rewrite

A way to mask bulky dynamic page URLs (and avoid the question-mark delay) is Apache's ForceType directive in combination with a little PHP to interpret URLs like http://www.example.com/books/computers.html as referring to a page called "books" which is executed as a PHP script (I usually make a link on the Linux server from "books.php" to "books" to make editing the script easier). The function below returns an array of the additional path values, including the category "computers". This can be accomplished by inserting a line like this into the .htaccess file in the root of your web documents directory:

<Files *Directory Handle Name Here*> ForceType application/x-httpd-php </Files>

*Directory Handle Name Here* should be replaced with the alias name you are giving to the fake directory you are creating. In this example:

http://www.example.com/books/computers.html

your .htaccess line would look like this:

<Files books>ForceType application/x-httpd-php</Files>

You can then log into your server and create a link to the file "books", or whatever directory alias you have chosen. This is done with the Linux command "ln" (link), like this:

$ ln books.php books

This creates a link between the existing PHP script books.php and the non-existent "books". This way all requests for the directory "books" will be given to books.php, i.e. http://www.example.com/books/computers.html would be handled by books.php.

Inside books.php you can use this function to extract values from the static-looking request URIs:


<?php
function processURI() {
    // REQUEST_URI is e.g. "/books/computers.html", so explode()
    // yields array("", "books", "computers.html")
    $request = explode("/", $_SERVER["REQUEST_URI"]);
    $count = count($request);
    $values = array();
    for ($i = 1; $i < $count; $i++) {
        $values["arg" . $i] = $request[$i];
    }
    return $values;
}
?>

So the above example of http://www.example.com/books/computers.html would be processed like this:


<?php
$vals = processURI();
$_GET[$vals["arg1"]] = str_replace(".html", "", $vals["arg2"]);
?>

With this, $_GET["books"] would equal "computers", just as it would for http://www.example.com/index.php?books=computers

PHP's SEO Advantages

A great advantage to using dynamic pages as opposed to static pages is the ability to create content that is constantly changing and updated in real time. RSS headlines, randomly circulating content and other automatically "fresh" content can boost your ranks in Google, and many other engines.

Another advantage to using PHP is that you can make simple modifications to many scripts to create relevant and fresh page titles. Since the title is the most important on-page factor in SEO, special attention should be given to creating title tags that accurately reflect the page's current content. Any HTML template used in PHP pages can be altered to contain this line:

<title><?=$page_title?></title>

With this, $page_title can be set to keyword-rich text describing the page. Title text is also important for improving click-through from SERPs, so be sure the title tag doesn't read like spam, but more like a human-created title.
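For example, $page_title could be assembled from the page's own data (the site name and field values below are hypothetical):

```php
<?php
// Build a keyword-rich but human-readable title from page data.
function make_page_title($city, $category, $site = 'YourDomain') {
    return ucfirst($category) . ' for rent in ' . ucfirst($city) . ' | ' . $site;
}

$page_title = make_page_title('pune', 'apartments');
// The template then echoes $page_title inside its <title> tag.
```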