Nepal and blog action day 08 poverty

Yes, October 15 is the day for "Blog Action Day 2008: Poverty".

  • 45 per cent of the population live below the poverty line. Nepal has among the lowest life expectancy and highest illiteracy rates, especially for women.
  • A Maoist insurgency, general political instability, and a decline in tourism caused a negative growth rate.
  • Political unrest and violence, lack of economic growth, and illiteracy have all contributed in some way to poverty.
  • The Mid-Western and Far-Western regions of Nepal have the highest concentration of poor people.
  • 88 Ways to DO Something About Poverty Right Now


crawl errors sources by GWT

Just as Google talked about good times with inbound links, Google Webmaster Tools (GWT) now shows sources for crawl errors.
The web crawl error reports "Errors for URLs in Sitemaps" and "Not Found" have a new column, "Linked From", added by GWT. The "Linked From" column lists the pages that link to the affected URL.

Here's an example from ( Image Editing and Photo Retouching)

The figure shows 4 pages in the "Linked From" column for a Not Found URL.

After I clicked on the 4 pages under "Linked From", the sources above were shown. Those pages contain the broken links, which is why the URL appears under the Not Found status. Very helpful information provided by GWT's "Linked From".

20.9.08 error filed

In an earlier post I wrote about the small square boxes seen while browsing in Google Chrome ( is Google for Nepal).
Finally, I am happy to see that the error has been filed for further development. Please view it at

Hope to see results soon.


twitter interface redesigned

The very popular social microblogging service Twitter has redesigned its interface. When I logged in this morning, I happened to notice the change in the design ( is my tweet). I then quickly looked at Twitter's official blog to see the updates, and yes, they have posted about the change there.
Here's what my Twitter looks like:

Get more from the twitter blog.



I know I have already talked about sitelinks once, and today I would like to post a few more lines about them.
So what are sitelinks? Google sitelinks are a small set of links (or snippets of links) seen in SERPs just below the website URL. The list ranges from 2 to a maximum of 8 links.

Here's a screenshot from the popular photo collection website from the Netherlands.

The highlighted links are called sitelinks; they are the sitelinks shown for a search for "photoos", which produces the result above. Here are a few points about sitelinks, based on my experience:
  1. The site should rank first for the keyword used as the search query.
  2. The query is a short, high-traffic keyword, usually a brand or domain name.
  3. Sitelinks do not appear for every keyword or keyphrase.
  4. The site has fairly high natural search traffic.
  5. The site has high click-through rates from the search results page.
  6. Sitelinks are usually generated from top-level links, i.e., internal links from the homepage.
  7. The site has easily crawled, structured navigation: relevant, useful page links in plain HTML.
  8. Links are pure HTML, with no JavaScript or Flash.
  9. Images have proper alt text.
  10. Between 2 and 8 sitelinks are displayed.
  11. The site is several years old or older.
  12. There is a high probability that the information on a page will be accessed by a user.
  13. There is a high probability that the information will be useful to a user submitting a search query.

Google says, "Sitelinks are calculated algorithmically and completely automated".
You might also want to visit:



"Twitter is a service for friends, family, and co-workers to communicate and stay connected through the exchange of quick, frequent answers to one simple question: What are you doing?" from
Twitter is a microblogging service that lets you post your own short updates and read others' messages. Users can receive updates through SMS, RSS, or other supporting applications, and can be found at

Now that you know what Twitter is, start microblogging now. Follow me and keep up with my updates at


shortcut keys chrome

Here are some shortcut keys for using Google Chrome:
  1. Alt + Home loads the home page.
  2. Control + Shift + N opens an 'incognito' window.
  3. Control + T opens a new tab. The new tab can be dragged out to make a new window.
  4. Control + Shift + T reopens your most recently closed tab. Pressing Control + Shift + T again reopens the tab closed before that.
  5. Control + Tab cycles through your open tabs, starting from the first tab opened.
  6. Control + Shift + Tab cycles through your tabs beginning from the last tab opened.
  7. Control + 1, Control + 2, Control + 3, and so on switch to the first, second, and third tabs respectively.
  8. Show and hide the bookmarks bar with Control + B.
  9. Control + H is for History.
  10. Control + J shows the Downloads page.
  11. Control + E or Control + K to search from the address bar. A '?' appears in the address bar, ready for your search query.
  12. Shift + Escape opens the Task Manager.

incognito window

If you press Control + Shift + N in Google Chrome, you will see a new window called an Incognito window. Surfing in an Incognito window leaves no trace of cookies or browsing history. When you open a new Incognito window, Chrome displays:

You've gone incognito. Pages you view in this window won't appear in your browser history or search history, and they won't leave other traces, like cookies, on your computer after you close the incognito window. Any files you download or bookmarks you create will be preserved, however.
Going incognito doesn't affect the behavior of other people, servers, or software. Be wary of:
  • Websites that collect or share information about you
  • Internet service providers or employers that track the pages you visit
  • Malicious software that tracks your keystrokes in exchange for free smileys
  • Surveillance by secret agents
  • People standing behind you
You can learn more about the Incognito window from


google chrome screen shots

Having posted about the Google browser on this blog many times, I finally saw the Chrome screen.

I had been waiting eagerly for Google to release its long-awaited browser, Google Chrome, and finally I could take screenshots of it.

First I visited this page, then downloaded Google Chrome, and could finally see the Thank You page as well.

That was a wow moment for me.

Here are some screenshots I took while downloading the Google browser, a.k.a. Chrome.

First Step: Source :

Second Step

Final and Third Step

Finally, I installed the Google browser, Chrome. I had been waiting for it since the afternoon, and it is now 1:43 am NST on the morning of 3rd September '08.




Google is launching its own browser, called Chrome, today, September 2, 2008. Chrome is not available for download yet, as Google's official blog says, but they have still shared information about Google Chrome, presented as a comic book.

Image Source:

The beta will be available for download in 100 countries at first. I am excited to use Google Chrome in Nepal once it becomes available.


hyphens better or underscores

For creating Google-friendly URLs, I was googling articles and forums to see whether hyphens (-) are better than underscores (_).
I found a useful resource from Matt Cutts about dashes and underscores. Read his article at

There's support from Google Webmaster Central as well: Google recommends using hyphens rather than underscores.

Here's the link:


Errors in browsers after installing the Google Analytics code

I am sure those of you who use the Google Analytics tool have noticed a JavaScript error in your browser, whether in IE, Firefox, or any other.

The error seen is "_gat is undefined", and I am sure you must have noticed it.

But no worries, you can find the answers from the

If the browser is unable to access the JavaScript source, it will display the error "_gat is undefined". This happens when firewalls, browser extensions like AdBlock or NoScript, or filters prevent the JavaScript source file from being loaded by the browser.

As a solution, you can guard the tracking call as in the following example:

<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
if (typeof(_gat) == 'object') { // see this line: track only if ga.js actually loaded
var pageTracker = _gat._getTracker("UA-abeen0-X"); // replace with your own tracking code
pageTracker._trackPageview();
}
</script>

That should solve the problem of the error appearing in your browsers.


MSN webmaster tool

I happened to see this tool from MSN.

The MSN webmaster tool is in beta but already provides stats about your websites. It is a very good tool for webmasters looking to drive traffic.



Spreading Firefox - Nepal

Click below to Download Firefox,

Let's explore the strength of Firefox from today and let's be Firefox users.


checking duplicate content

Search engines do not like duplicate content. Duplicate content might sit on two different websites or on two different pages of the same site; try not to have duplicate content.

Here are some tools for checking duplicate content:



list of links are sitelinks

Search engines like Google and MSN show a list of links in their search results page, at the first position of the first page.
The set of links sometimes seen under the main site result are sitelinks. Search engines sometimes show these sitelinks in the results for a site search. Sitelinks help users see a list of links within that site.

The figure above shows the sitelinks of the popular photo collection (sports and events) site from the Netherlands.

Google-generated sitelinks are automated: the site should be well structured enough to satisfy Google's sitelinks algorithm, and at least three sitelinks must be generated for them to appear in search results.


web spider, crawler

A web spider is an automated, methodical program that crawls the internet; each is programmed for a specific task and built for that purpose.
Some web spiders are developed to find email addresses in web pages (these are known as email-harvesting crawlers), to check links in web pages, to find inactive links, or to validate links and HTML.
Spiders are also developed to extract content from web pages and generate statistics, or to create copies of page content. Search engines in particular use web crawling to keep their results up to date.
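As a small illustration of the link-checking spiders mentioned above, the first step is pulling URLs out of fetched HTML. This regex-based sketch is mine and fine for illustration, though real crawlers use proper HTML parsers:

```javascript
// Extract href targets from an HTML string: the first step a
// link-validating spider takes before fetching each URL in turn.
function extractLinks(html) {
  const links = [];
  const re = /href\s*=\s*"([^"]+)"/gi;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

const page = '<a href="http://example.com/">home</a> <a href="/about.html">about</a>';
console.log(extractLinks(page)); // [ 'http://example.com/', '/about.html' ]
```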

Web crawler History:


different results from AWStats and Google Analytics

I found different results from Google Analytics and AWStats web analytics, and had not noticed it before today. I was happy with the numbers from AWStats and, at the same time, worried by those from Google Analytics: AWStats appears to over-count, whereas Google Analytics seemed not to be reporting all the data.

I think the following might be reasons for the difference in statistics between Google Analytics and AWStats:

1. AWStats counts a visitor over a shorter time span, whereas Google Analytics uses a longer session period, thus reducing the visitor count.
2. Automated spam and robots affect AWStats' figures, whereas Google Analytics statistics are not affected.
3. Other JavaScript interfering with the Google tracking code, or cookies being turned off, can prevent Google Analytics from tracking results.
4. Google Analytics counts a view only if the page loads fully enough for the tracking code to run; navigating away before the page has fully loaded means no data is collected for that view.

Also, I got something from Google Analytics Help Center,
here's the link:

Some tips to increase your traffic rank in Alexa

"Alexa computes traffic rankings by analyzing the Web usage of millions of Alexa Toolbar users. The information is sorted, sifted, anonymized, counted, and computed, until, finally, we get the traffic rankings shown in the Alexa service." Source:
  1. Install the Alexa Toolbar and also make your website the homepage.
  2. Ask a few of your friends to install the Alexa Toolbar and surf your website. Also encourage others to use the Alexa Toolbar and get reviews from them.
  3. Participate in webmaster forums where you can submit your website URL and get feedback.
  4. Use Alexa redirects, e.g. ( in place of your URL.
  5. Buy ads, banners, and links for traffic from webmaster websites and forums that drive lots of webmaster traffic to your website, significantly boosting your rank.
  6. Post and advertise articles with tips on how to increase your Alexa traffic rankings.

increase traffic rank in Alexa

  • Alexa computes traffic rankings by analyzing the Web usage of millions of Alexa Toolbar users.
  • The traffic rank is based on three months of aggregated historical traffic data from millions of Alexa Toolbar users and is a combined measure of page views and users (reach).
  • Page views measure the number of pages viewed by Alexa Toolbar users. Multiple page views of the same page made by the same user on the same day are counted only once.
  • Reach measures the number of users.
  • Alexa expresses reach as number of users per million. For example, if a site like has a reach of 28%, this means that if you took random samples of one million Internet users, you would on average find that 280,000 of them visit
  • The daily traffic rank reflects the traffic to the site based on data for a single day. The Trend graph shows you a three-day moving average of the site's daily traffic rank, charted over time.

SEO: few tips

Some basic things to consider for Search Engine Optimization, SEO for short:
Writing rich text Content:
For any web page, content is king, so try to have rich content on your pages and let that content serve the page's purpose.
Selecting correct keywords:
Choosing the right keywords is essential, since those keywords drive the optimization process; the right keywords are how searchers find your site.
Optimizing the Page Title:
Optimizing your HTML page title is very important for search engines. The page title shows up as the first clickable link in the search result, so it should be compelling and should match the landing page it points to on your site.
Meta Description Tags:
Meta Tags, Meta Description Tag and Meta Keywords Tag are invisible text elements.
  • The meta description tag should be informative enough to describe your page.
  • Keep the text in the meta description tag informative and concise.
  • Pair it with the title tag, but do not repeat your title text in the meta description tag.
  • Try including some of your valuable keywords, which can influence ranking, but do not stuff the description tag with a long list of keywords.
Meta Keywords Tag
  • Although the meta keywords tag does not play a big role in search engine optimization, it should not be left out.
  • The meta keywords tag, along with the meta description tag, helps influence search engine ranking.
  • The keywords tag holds words or phrases describing the page document.
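Putting the title and meta tag points above together, here is a sketch of what a page head might look like (the title and values are placeholders, not from a real site):

```html
<head>
  <!-- Compelling, clickable title; shows as the first link in the SERP -->
  <title>Photo Retouching Tips | Example Studio</title>
  <!-- Short, informative description; does not repeat the title verbatim -->
  <meta name="description" content="Practical photo retouching and image editing tips, from basic cleanup to colour correction.">
  <!-- A few relevant phrases only; no keyword stuffing -->
  <meta name="keywords" content="photo retouching, image editing">
</head>
```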
Use of robots.txt
  • The robots.txt file lets spiders know what they can and can't crawl. This is helpful for keeping spiders out of folders that you do not want indexed, like admin or secured folders, or out of content that you don't want spiders to index.
  • Most spiders index any page they come across and follow the links on it, but it is still a good idea to add a robots meta tag with the index and follow directives.
For example, letting spiders index all pages' content (an empty Disallow means nothing is blocked):
User-agent: *
Disallow:
Here's another example that blocks spiders from indexing admin.php, cgi-bin, and admin:
User-agent: *
Disallow: /admin.php
Disallow: /cgi-bin/
Disallow: /admin/

Using title and alt attributes tells search engines the relevance of links and images.
Creating a Site Map Page - since every page on the website is linked from the sitemap, it allows web crawlers (and users) to quickly and easily find the site's content.

Validate HTML and CSS, making sure there are no broken links or images on the pages.

Inbound links send visitors to your site, so link building is important for your websites. Directories also give you a chance to describe your site with well-targeted content for visitors. Thus submitting to the Open Directory Project and the Yahoo! Directory also plays a role in search engine optimization.

QA critical to web development company

Web Quality
Quality is a tough concept when you are dealing with a web site. Web quality refers to measurable characteristics, things we are able to compare to known standards, such as:
HTML standards
HTML Validation
Validating links
CSS Validation
Validating accessibility

In practical terms, this requires a site to validate against a series of checkpoints. These include:
  • Checking for broken links.
  • Checking for missing content, e.g. images.
  • Checking for missing page titles.
  • Checking the spelling and grammar of content.
  • Checking for missing metadata.
  • Checking the file sizes of pages to ensure they are not too large.
  • Checking for browser compatibility.
  • Checking that applications are functioning correctly, e.g. online forms.
  • Checking that any Server Side Scripting or other languages function correctly.
  • Checking that legal and regulatory guidelines are being adhered to, e.g. data protection and privacy.
  • Checking that pages conform to your organisation's Web-Accessibility standard (if any), e.g. missing 'alt-tags'.
  • Checking that the Website Design standard is maintained.