Google Webmaster Tools
With the help of Google Webmaster Tools we can know how our site is performing in search engine results pages (SERPs). Webmaster Tools shows us the site as Google sees it.
1) If Google Webmaster Tools finds any problem in our site, it lets us know about it so we can rectify the problem and resubmit our site or the modified pages to the Google search engine.
2) With Google Webmaster Tools we can also learn about our traffic: who is linking to us and which keywords bring high traffic.
3) We can also ask Google Webmaster Tools to remove indexed pages from search results.
WEBMASTER OVERVIEW:
The Overview page tells us about two things:
1. Indexing and top search queries
2. Web crawl errors
1. Indexing: With this we can know the total number of pages crawled by Google.
Home Page Crawl gives information on the total number of pages/URLs crawled, the number of downloads, and the time spent downloading a page.
Index Status gives information about our indexed pages in Google search: indexed pages, pages that link to us, cached pages, the info Google has about our site, and pages similar to ours.
When we add new pages to our site we should inform Google about them by submitting a sitemap through Google Webmaster Tools. After the sitemap is submitted, Webmaster Tools shows the total number of URLs and the total number of indexed pages.
Top Search Queries: Here Google Webmaster Tools shows the top 20 searches and the number of clicks for each page/keyword. From this we can see which keywords are searched most often, discover whether we have unknowingly blocked any of our pages from crawling, and compare the two top-20 lists to improve our site's content.
Impressions – the top 20 queries in which our site appeared, with the percentage of searches for each.
Traffic – the top 20 queries through which users actually reached our site, with the percentage of clicks for each.
2. Web Crawl Errors: A list of pages Google had problems crawling; those pages won't be indexed and will not appear in search results. The full report is described under Diagnostics below.
SETTINGS:
On the Settings page we provide Google with information about our site and our preferences, which helps Google crawl and index our site better. Settings include geographic target, preferred domain, image search, and crawl rate.
Geographic target: This feature is used when we are targeting clients in a particular location. Here we can select the target location for our site; for example, selecting U.S. means we are targeting users in the United States.
Preferred domain: The preferred domain is the one we would like Google to use when indexing our site's pages. Links may point to our site using both the www and non-www versions of the URL (http://www.example.com and http://example.com); the preferred domain is the version we want used for our site in the search results.
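Alongside this setting, many sites also enforce the preferred domain on the server with a 301 redirect. A minimal sketch for Apache, assuming mod_rewrite is enabled and using example.com as a placeholder:

    RewriteEngine On
    # Send non-www requests to the preferred www version (301 = permanent)
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]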
Image search: To improve the indexing and search quality of our pages we can enable the enhanced image search option for our site. Google may then use tools such as Google Image Labeler to associate the images on our site with labels.
Crawl rate: The crawl rate is the speed at which Googlebot crawls our site, which determines how many pages Google can crawl in a given time. We can change the crawl rate for sites that are at the root level; a new custom crawl rate remains valid for 90 days.
DIAGNOSTICS:
Web Crawl Errors: Googlebot crawls our site by following links from page to page. The web crawl errors report lists the pages Google had problems crawling (errors in pages); as a result those pages won't be indexed and will not appear in search results. We can find the errors, rectify them, and then resubmit the pages to Google for indexing. The report groups errors as: errors for URLs in sitemaps, not found, HTTP errors, unreachable URLs, URLs not followed, URLs that timed out, and URLs restricted by robots.txt.
Mobile Crawl: The Mobile crawl errors page provides details about problems encountered crawling URLs on our mobile website.
Content Analysis: We should make our site Google-friendly by using content analysis to identify possible issues with content such as meta tags and page titles. Data on this page may include the following (a clean example follows the list):
• Title problems: missing or repeated page titles across our pages.
• Meta description problems: duplicate or otherwise problematic meta descriptions.
• Non-indexable content: some rich media files, video, or images that Google cannot index.
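As an illustration, every page should carry its own title and meta description in its <head>; the values below are placeholders:

    <head>
      <!-- A unique, descriptive title for this page (placeholder text) -->
      <title>Blue Widgets - Acme Store</title>
      <!-- A unique description that Google may show as the result snippet -->
      <meta name="description" content="Our range of blue widgets, with sizes, prices and shipping details.">
    </head>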
STATISTICS:
Top Search Queries: This lists the top queries that led users to our site/pages, so we can see which pages and keywords get the most traffic.
What Googlebot Sees: This tells us how Google sees our site, listing our top pages with external links and the top keywords with high traffic.
Index Status: As above, this gives information about our indexed pages in Google search: indexed pages, pages linking to us, cached pages, the info Google has about our site, and similar pages related to our site.
Crawl Stats: This shows Googlebot's activity on our site, such as the number of pages crawled, the amount of data downloaded, and the time spent downloading a page (see Home Page Crawl above).
LINKS: Webmaster Tools reports three types of links:
1. Pages with external links
2. Pages with internal links
3. Sitelinks
Pages with External Links: External links are links given by other sites and directories to our site. Webmaster Tools shows the total number of our pages that have external links and the total number of links pointing to each of those pages.
Pages with Internal Links: We should link our pages in such a way that every page of our site can be reached from the others. Webmaster Tools shows each page together with its total number of internal links.
Sitelinks: We cannot create sitelinks ourselves. Google sometimes generates these additional links from our site's content in order to help users navigate the site, and the list Google creates changes from time to time.
SITEMAPS: A sitemap is a way to inform Google about our site. It contains a list of the pages of our website. There are two types of sitemaps: HTML sitemap pages (for visitors) and XML Sitemaps (for search engines). Creating and submitting a Sitemap to Google Webmaster Tools helps make sure that Google knows about all the pages of our site, including each URL and its status.
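As a sketch, a minimal XML sitemap in the standard sitemaps.org format looks like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One <url> entry per page we want Google to know about -->
        <loc>http://www.example.com/</loc>
        <lastmod>2009-01-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>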
TOOLS:
Analyze robots.txt: We can request that search engines ignore specific files or directories by using a robots.txt file; this tool lets us check how our robots.txt handles specific URLs.
Generate robots.txt: Here we create our robots.txt file to tell crawlers which pages they are allowed or disallowed to crawl.
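As a sketch, a simple robots.txt that blocks one directory for every crawler (the path is a placeholder):

    # Rules for all crawlers
    User-agent: *
    # Keep this directory out of search engines (placeholder path)
    Disallow: /private/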
Manage site verification: Google Webmaster Tools gives us a verification meta tag for our site. Security is the main reason for this tag: Google wants to make sure we are the website owner. It prevents other users from adding our website to their account and getting information about our site. We include the meta tag in our pages, and from it Google knows who the site's owners are.
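The tag goes in the <head> of our pages. A sketch with a placeholder token (the exact tag name depends on the Webmaster Tools version, e.g. google-site-verification):

    <head>
      <!-- Verification token issued by Webmaster Tools (placeholder value) -->
      <meta name="google-site-verification" content="UNIQUE_TOKEN_FROM_GOOGLE" />
    </head>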
Remove URLs: We can ask Google to remove content from search results so that Google and other search engines will no longer show it. To do this, we must ensure that each page returns an HTTP status code of either 404 or 410, or use a robots.txt file or a meta noindex tag to block crawlers from accessing the content. If we are requesting removal of a full site or directory, we must use a robots.txt file to block crawlers from accessing that content.
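A sketch of the meta noindex option, placed in the page's <head>:

    <head>
      <!-- Ask crawlers not to index this page -->
      <meta name="robots" content="noindex" />
    </head>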
Enhance 404 pages: When a visitor requests a page that is missing or broken, the web server sends back a code of 404 to indicate that the page was not found. The page shown with that response is the 404 page, which this feature helps us make more useful to visitors.
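Serving a custom 404 page in the first place is a server-side step rather than a Webmaster Tools feature; as a hedged sketch on Apache (the file path is a placeholder):

    # .htaccess: show our custom page while still returning the 404 status code
    ErrorDocument 404 /404.html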
Gadgets: Gadgets are small objects of dynamic content, powered by Google, that can be placed on any page on the web.
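A gadget is defined by an XML file; as a minimal sketch in the legacy Google Gadgets format (the title and content are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <Module>
      <!-- Basic gadget metadata -->
      <ModulePrefs title="Hello Gadget" />
      <!-- The HTML the gadget renders on the host page -->
      <Content type="html">
        <![CDATA[ Hello, world! ]]>
      </Content>
    </Module>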