I am pleased to introduce myself as a web promotion expert, and I am confident that I can promote any website. With my promotion techniques, your website can reach the first page of Google results within 7 days.

I look forward to promoting your website.

Wednesday, February 16, 2011

DNS Records (NS, A, MX, CNAME, SPF) Explained!

Your domain name has a DNS zone, which consists of records such as the following:

* NS - specifies which servers are the authoritative DNS servers for your domain;
* A - maps your domain and its subdomains to IP addresses;
* MX - specifies where email for your domain should be delivered;
* CNAME - makes a subdomain an alias for another domain or subdomain;
* SPF - Sender Policy Framework; declares which servers may send email on behalf of your domain, to help control forged email.
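Put together, the records above might look like the following zone-file fragment (a hypothetical sketch for example.com; names and IPs are placeholders, and the SPF policy is published here as a TXT record, which is the usual practice):

```
example.com.        IN  NS      ns1.example-dns.com.
example.com.        IN  NS      ns2.example-dns.com.
example.com.        IN  A       203.0.113.10
www.example.com.    IN  CNAME   example.com.
example.com.        IN  MX 10   mail.example.com.
example.com.        IN  TXT     "v=spf1 mx -all"
```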


Sunday, February 13, 2011

Online Market Is Growing..... Americans watching more videos online

As many as 49 percent of all Americans now incorporate online video into their everyday lives. They are also watching more videos online per day, with 66 percent of respondents indicating that they view more videos than they did 12 months ago.


B2B online marketing proves to be resilient in 2010

With many Americans increasingly turning to online communication channels in 2010, firms are seeing more return on their B2B online marketing initiatives, comScore reports.

According to comScore's 2010 U.S. Digital Year in Review report, several online marketing channels are seeing more use from Americans. For example, one out of every eight minutes spent online is now devoted to social media platforms, with 90 percent of Americans visiting social networks every month.

Further, the number of people who conducted search queries increased by 4 percent in 2010. Meanwhile, the overall volume of searches performed by Americans rose by 12 percent over the course of the year.

"2010 was a very positive year for the digital media industry, highlighted by ... significant innovation and increased demand for online advertising," said comScore chairman Gian Fulgoni. "As we embark on a promising 2011, marketers must have a sound understanding of the digital media landscape and how it is changing if they hope to capitalize on key trends that can drive their business into the future."

While the number of queries conducted in 2010 rose, the search landscape is growing increasingly complex. According to Experian Hitwise, Bing is gaining market share each month, growing 21 percent in January.

Thanks

http://www.rajeshgoutam.com


Friday, February 11, 2011

Online Market Updates!!!

Online marketing industry size to touch Rs 2k crore by 2013...

As the rules of the advertising game change rapidly, the online or digital marketing market size in India is estimated to reach close to Rs 2,000 crore in the next two years, up from about Rs 1,400 crore now, say management experts.


Saturday, February 5, 2011

Hey, want to know more about ports?

FTP: 20, 21
SMTP: 25
DNS: 53
HTTP: 80
POP3: 110
NNTP: 119
IMAP: 143

SSL/TLS port numbers:

HTTPS: 443
SMTPS: 465
POP3S: 995
IMAPS: 993
NNTPS: 563
LDAPS: 636
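As a quick reference, the assignments above can be kept in a small Python mapping (a sketch mirroring the list above; no network access is involved):

```python
# Well-known TCP service ports. The secure (SSL/TLS) variants of each
# protocol run on their own dedicated ports.
PLAIN_PORTS = {
    "ftp-data": 20, "ftp": 21, "smtp": 25, "dns": 53,
    "http": 80, "pop3": 110, "nntp": 119, "imap": 143,
}
TLS_PORTS = {
    "https": 443, "smtps": 465, "nntps": 563,
    "ldaps": 636, "imaps": 993, "pop3s": 995,
}

# Print the TLS ports in ascending numeric order.
for name, port in sorted(TLS_PORTS.items(), key=lambda kv: kv[1]):
    print(f"{name}: {port}")
```

On Unix systems you can cross-check these against the local services database (for example with `socket.getservbyname("smtp")`), though the names available vary by system.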



Shared Web Hosting - The best small business hosting solution

Shared Web Hosting is ideal if:
• You want your site online within 24 hours
• You have little or no technical experience
• You want a low-cost hosting solution

In a nutshell, shared web hosting is a key that can get web startups up and moving in a very short time. Let's find out how it works.

For small businesses entering into the web domain, shared web hosting is proving to be an ideal launch vehicle. We all know that for any startup in any business sector, it is vitally important to find the right partners to get a kickstart.

This is especially true of a web business. If you choose the right web partners, you are likely to get the right solutions at the right time. In fact, if your web partnerships form the right synergy, even a small web-business startup can experience skyrocketing growth and high ROI.

Vital steps towards building up a successful small web business

Right from designing your small-business website to hosting it over the net, the success of your small-business web site depends a lot on how you choose partners at each step of its development.

The right web partners will help you boost your business prospects, be it in terms of building your brand image and presence, search engine optimization or providing a small business hosting solution.

Right from website designing to hosting it over the net, the whole process of building a successful web business is completed in the following stages:
1. Buying a Domain
2. Choosing a Host
3. Designing your website
4. Promoting your website
5. Generating Revenue from your website

Rajesh Goutam....Online Marketing / Web Solution Expert


Friday, February 4, 2011

"Access is Denied" error message when you try to open a folder

To resolve this issue, you must turn off Simple File Sharing, and then take ownership of the folder:

1. Turn off Simple File Sharing:
1. Click Start, and then click My Computer.
2. On the Tools menu, click Folder Options, and then click the View tab.
3. Under Advanced Settings, click to clear the Use simple file sharing (Recommended) check box, and then click OK.
2. Right-click the folder that you want to take ownership of, and then click Properties.
3. Click the Security tab, and then click OK on the Security message, if one appears.
4. Click Advanced, and then click the Owner tab.
5. In the Name list, click your user name; click Administrator if you are logged in as Administrator; or click the Administrators group.

If you want to take ownership of the contents of that folder, click to select the Replace owner on subcontainers and objects check box.
6. Click OK.

You may receive the following error message, where Folder is the name of the folder that you want to take ownership of:
You do not have permission to read the contents of directory Folder. Do you want to replace the directory permissions with permissions granting you Full Control? All permissions will be replaced if you press Yes.
7. Click Yes.
8. Click OK, and then reapply the permissions and security settings that you want for the folder and the folder contents.


Webmaster guidelines

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized. If a site has been penalized, it may no longer show up in results on Google.com or on any of Google's partner sites.


When your site is ready:

Design and content guidelines

  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.
  • Make sure that your elements and ALT attributes are descriptive and accurate.
  • Check for broken links and correct HTML.
  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
  • Keep the links on a given page to a reasonable number (fewer than 100).
  • Review our image guidelines for best practices on publishing images.

With image search, just as with web search, Google's goal is to provide the best and most relevant search results to our users. Following the best practices listed below (as well as our usual webmaster guidelines) will increase the likelihood that your images will be returned in those search results.

Don't embed text inside images

Search engines can't read text embedded in images. If you want search engines to understand your content, keep it in regular HTML.

Tell us as much as you can about the image

Give your images detailed, informative filenames

The filename can give Google clues about the subject matter of the image. Try to make your filename a good description of the subject matter of the image. For example, my-new-black-kitten.jpg is a lot more informative than IMG00023.JPG. Descriptive filenames can also be useful to users: If we're unable to find suitable text in the page on which we found the image, we'll use the filename as the image's snippet in our search results.

Create great alt text

The alt attribute is used to describe the contents of an image file. It's important for several reasons:

  • It provides Google with useful information about the subject matter of the image. We use this information to help determine the best image to return for a user's query.
  • Many people—for example, users with visual impairments, or people using screen readers or who have low-bandwidth connections—may not be able to see images on web pages. Descriptive alt text provides these users with important information.

Not so good:

<img src="puppy.jpg" alt=""/>

Better:

<img src="puppy.jpg" alt="puppy"/>

Best:

<img src="puppy.jpg" alt="Dalmatian puppy playing fetch"/>

To be avoided:

<img src="puppy.jpg" alt="puppy dog pup pups puppies doggies pups litter puppies dog retriever labrador wolfhound setter pointer puppy jack russell terrier puppies dog food cheap dogfood puppy food"/>

Filling alt attributes with keywords ("keyword stuffing") results in a negative user experience, and may cause your site to be perceived as spam. Instead, focus on creating useful, information-rich content that uses keywords appropriately and in context. We recommend testing your content by using a text-only browser such as Lynx.

Anchor text

External anchor text (the text pages use to link to your site) reflects how other people view your pages. While typically webmasters can't control how other sites link to theirs, you can make sure that anchor text you use within your own site is useful, descriptive, and relevant. This improves the user experience and helps the user understand the link's destination. For example, you might link to a page of vacation photos like this: Photos of our June 2008 trip to Ireland.

Provide good context for your image

The page the image is on, and the content around the image (including any captions or image titles), provide search engines with important information about the subject matter of your image. For example, if you have a picture of a polar bear on a page about home-grown tomatoes, you'll be sending a confused message to the search engines about the subject matter of polarbear.jpg.

Wherever possible, it's a good idea to make sure that images are placed near the relevant text. In addition, we recommend providing good, descriptive titles and captions for your images.

Think about the best ways to protect your images

Because images are often copied by users, Google often finds multiple copies of the same image online. We use many different signals to identify the original source of the image, and you can help by providing us with as much information as you can. In addition, the information you give about an image tells us about its content and subject matter.

Webmasters are often concerned about the unauthorized use of their images. If you prevent users from using your images on their site, or linking to your images, you'll prevent people from using your bandwidth, but you are also limiting the potential audience for your images and reducing their discoverability by search engines.

One solution is to allow other people to use your images, but require attribution and a link back to your own site. There are several ways you can do this. For example, you can:

  • Make your images available under a license that requires attribution, such as a Creative Commons license that requires attribution.
  • Provide an HTML snippet that other people can use to embed your image on their page while providing attribution. This snippet can include both the link to the image and a link to the source page on your site.
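As a sketch of such a snippet (all URLs and filenames here are hypothetical), the HTML you offer for copying could wrap the image in a link back to its source page:

```html
<!-- Embed snippet a visitor could copy: the image links back to the
     page it came from, giving attribution to the original site. -->
<a href="https://www.example.com/photos/dalmatian-puppy">
  <img src="https://www.example.com/images/dalmatian-puppy.jpg"
       alt="Dalmatian puppy playing fetch" width="400" height="300">
</a>
<p>Photo courtesy of example.com</p>
```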

Similarly, some people add copyright text, watermarks, or other information to their images. This won't impact your image's performance in search results, but may not provide the kind of user experience you are looking for.

If you don't want search engines to crawl your images, we recommend using a robots.txt file to block access to your images.

Create a great user experience

Great image content is an excellent way to build traffic to your site. We recommend that when publishing images, you think carefully about creating the best user experience you can.

  • Good-quality photos appeal to users more than blurry, unclear images. In addition, other webmasters are much more likely to link to a good-quality image, which can increase visits to your site. Crisp, sharp images will also appear better in the thumbnail versions we display in our search results, and may therefore be more likely to be clicked on by users.
  • Even if your image appears on several pages on your site, consider creating a standalone landing page for each image, where you can gather all its related information. If you do this, be sure to provide unique information—such as descriptive titles and captions—on each page. You could also enable comments, discussions, or ratings for each picture.
  • Not all users scroll to the bottom of a page, so consider putting your images high up on the page where they can be immediately seen.
  • Consider structuring your directories so that similar images are saved together. For example, you might have one directory for thumbnails and another for full-size images; or you could create separate directories for each category of images (for example, you could create separate directories for Hawaii, Ghana, and Ireland under your Travel directory). If your site contains adult images, we recommend storing these in one or more directories separate from the rest of the images on your site.
  • Specify a width and height for all images. A web browser can begin to render a page even before images are downloaded, provided that it knows the dimensions to wrap non-replaceable elements around. Specifying these dimensions can speed up page loading and improve the user experience.

With image search, just as with web search, our goal is to provide the best and most relevant search results to our users. Following the best practices listed above will increase the likelihood that your images will be returned in those search results.

Technical guidelines

  • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
  • Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
  • Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
  • Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://www.robotstxt.org/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
  • If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
  • Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
  • Test your site to make sure that it appears correctly in different browsers.
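Along the lines of the robots.txt advice above, a minimal file might look like this (a hypothetical sketch; paths and user-agent names would depend on your site):

```
# Allow crawling generally, but keep all bots out of auto-generated
# search-result pages and a private directory.
User-agent: *
Disallow: /search
Disallow: /private/

# Additionally block Google's image crawler from a drafts directory.
User-agent: Googlebot-Image
Disallow: /images/drafts/
```

The file lives at the root of the site (e.g. http://www.example.com/robots.txt), and it can be checked with the robots.txt analysis tool in Google Webmaster Tools before going live.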

Quality guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google's quality guidelines, please report that site at https://www.google.com/webmasters/tools/spamreport. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts.

Quality guidelines - basic principles

  • Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"
  • Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.
  • Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.

Quality guidelines - specific guidelines

  • Avoid hidden text or hidden links.
  • Don't use cloaking or sneaky redirects.
Cloaking

    • Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
    • Some examples of cloaking include:
      • Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
      • Serving different content to search engines than to users.
    • If your site contains elements that aren't crawlable by search engines (such as rich media files other than Flash, JavaScript, or images), you shouldn't serve cloaked content to search engines. Instead, consider visitors to your site who are unable to view these elements. For instance:
      • Provide alt text that describes images for visitors with screen readers or with images turned off in their browsers.
      • Provide the textual contents of JavaScript in a noscript tag.
      • Ensure that you provide the same content in both elements (for instance, provide the same text in the JavaScript as in the noscript tag). Including substantially different content in the alternate element may cause Google to take action on the site.

Sneaky JavaScript redirects

    • When Googlebot indexes a page containing JavaScript, it will index that page but it cannot follow or index any links hidden in the JavaScript itself. Use of JavaScript is an entirely legitimate web practice. However, use of JavaScript with the intent to deceive search engines is not. For instance, placing different text in JavaScript than in a noscript tag violates our webmaster guidelines because it displays different content for users (who see the JavaScript-based text) than for search engines (which see the noscript-based text). Along those lines, it violates the webmaster guidelines to embed a link in JavaScript that redirects the user to a different page with the intent to show the user a different page than the search engine sees. When a redirect link is embedded in JavaScript, the search engine indexes the original page rather than following the link, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it displays different content to users and to Googlebot, and can take a visitor somewhere other than where they intended to go.
    • Note that placement of links within JavaScript is alone not deceptive. When examining JavaScript on your site to ensure your site adheres to our guidelines, consider the intent.
    • Keep in mind that since search engines generally can't access the contents of JavaScript, legitimate links within JavaScript will likely be inaccessible to them (as well as to visitors without JavaScript-enabled browsers). You might instead keep links outside of JavaScript or replicate them in a noscript tag.
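The noscript advice above can be illustrated with a hypothetical snippet: the same text appears in both the script and its fallback, so users with and without JavaScript (and search engine crawlers) all see one consistent message.

```html
<!-- The script-generated text and the noscript fallback are identical;
     substantially different content here would violate the guidelines. -->
<script type="text/javascript">
  document.write("Welcome to our Dalmatian puppy gallery.");
</script>
<noscript>
  Welcome to our Dalmatian puppy gallery.
</noscript>
```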

Doorway pages (pages created just for search engines)

    • Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.
    • Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our webmaster guidelines.
    • Google's aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from the Google index.
    • If your site has been removed from our search results, review our webmaster guidelines for more information. Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.
  • Don't send automated queries to Google.
  • Don't load pages with irrelevant keywords.
  • Don't create multiple pages, subdomains, or domains with substantially duplicate content.
  • Don't create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.
  • Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.
  • If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

If you determine that your site doesn't meet these guidelines, you can modify your site so that it does and then submit your site for reconsideration.

Google Basics

When you sit down at your computer and do a Google search, you're almost instantly presented with a list of results from all over the web. How does Google find web pages matching your query, and determine the order of search results?

In the simplest terms, you could think of searching the web as looking in a very large book with an impressive index telling you exactly where everything is located. When you perform a Google search, our programs check our index to determine the most relevant search results to be returned ("served") to you.

The three key processes in delivering search results to you are:

Crawling: Does Google know about your site? Can we find it?

Indexing: Can Google index your site?

Serving: Does the site have good and useful content that is relevant to the user's search?

Crawling

Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

We use a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

Google's crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
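The crawl loop described above can be sketched in Python against a hypothetical in-memory "web" instead of real HTTP fetches: start from seed URLs, follow the links discovered on each page, and note dead links along the way.

```python
from collections import deque

# Hypothetical site: each URL maps to the links found on that page.
WEB = {
    "http://example.com/":  ["http://example.com/a", "http://example.com/b"],
    "http://example.com/a": ["http://example.com/b", "http://example.com/dead"],
    "http://example.com/b": [],
}

def crawl(seeds):
    """Breadth-first crawl: visit each URL once, queueing new links."""
    seen, dead, frontier = set(), set(), deque(seeds)
    while frontier:
        url = frontier.popleft()
        if url in seen:
            continue
        seen.add(url)
        if url not in WEB:          # fetch failed: record a dead link
            dead.add(url)
            continue
        frontier.extend(WEB[url])   # add newly discovered links to the list
    return seen, dead

crawled, dead_links = crawl(["http://example.com/"])
```

A real crawler adds politeness (robots.txt, crawl-rate limits) and scheduling on top of this basic discovery loop.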

Google doesn't accept payment to crawl a site more frequently, and we keep the search side of our business separate from our revenue-generating AdWords service.

Indexing

Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. In addition, we process information included in key content tags and attributes, such as Title tags and ALT attributes. Googlebot can process many, but not all, content types. For example, we cannot process the content of some rich media files or dynamic pages.
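The indexing step can be sketched as building an inverted index: for each crawled page, record every word and the positions where it occurs, keyed word -> page -> positions (pages and text here are hypothetical).

```python
from collections import defaultdict

# Hypothetical crawled pages, already reduced to plain text.
PAGES = {
    "http://example.com/":  "dalmatian puppy playing fetch",
    "http://example.com/a": "puppy food for a dalmatian",
}

def build_index(pages):
    """Map each word to the pages it appears on and its word positions."""
    index = defaultdict(dict)
    for url, text in pages.items():
        for pos, word in enumerate(text.split()):
            index[word].setdefault(url, []).append(pos)
    return index

index = build_index(PAGES)
# Look up every page containing "puppy", with word positions:
print(dict(index["puppy"]))
```

Storing positions (not just page lists) is what lets a search engine match phrases and weigh where on the page a term appears.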

Serving results

When a user enters a query, our machines search the index for matching pages and return the results we believe are the most relevant to the user. Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site's PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content.
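The PageRank idea described above, that a page's importance is fed by the importance of the pages linking to it, can be illustrated with power iteration on a toy three-page graph (the 0.85 damping factor is the value from the original PageRank paper; the graph itself is made up):

```python
GRAPH = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # page -> outgoing links
DAMPING = 0.85

def pagerank(graph, iterations=50):
    """Iteratively redistribute each page's rank along its outgoing links."""
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new = {page: (1 - DAMPING) / n for page in graph}
        for page, links in graph.items():
            share = DAMPING * rank[page] / len(links)
            for target in links:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(GRAPH)
```

Here page "c" ends up ranked above "b" because it receives links from both other pages, while "b" is linked only from "a". This is, of course, just one of the 200+ factors mentioned above.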

In order for your site to rank well in search results pages, it's important to make sure that Google can crawl and index your site correctly. Our Webmaster Guidelines outline some best practices that can help you avoid common pitfalls and improve your site's ranking.

Google's Related Searches, Spelling Suggestions, and Google Suggest features are designed to help users save time by displaying related terms, common misspellings, and popular queries. Like our google.com search results, the keywords used by these features are automatically generated by our web crawlers and search algorithms. We display these suggestions only when we think they might save the user time. If a site ranks well for a keyword, it's because we've algorithmically determined that its content is more relevant to the user's query.

Verify that your site ranks for your domain name: do a Google search for www.[yourdomain].com.

Check that your site is in the Google index: do a search for site:[yourdomain].com.


Rajesh Goutam

www.rajeshgoutam.com

Rajesh Goutam Google Group
