
Google Trends Report on Social Media

Look at how interest shifts from one social media platform to another.

This is an analysis of just the past month, taken from Google Trends, and it shows how dramatic the changeover is: interest in one social media platform has been overtaken by another, stronger one.

You can check your own words of interest using this magical tool called “Google Trends”.

You can use it as a keyword research tool as well, since it serves up vast data related to your words/keywords.

Here is its website URL. Go and check out some interesting facts:

https://trends.google.com/trends/?geo=IN
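If you prefer to pull this data programmatically, the unofficial pytrends Python package (not an official Google API) can query Google Trends for you. The sketch below is only illustrative; the keywords, timeframe, and region are assumptions you would replace with your own terms of interest.

```python
# Hedged sketch using the unofficial "pytrends" package (pip install pytrends).
# The keywords, timeframe, and region below are illustrative assumptions.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=330)    # tz is the UTC offset in minutes (330 ~ IST)
pytrends.build_payload(
    kw_list=["Facebook", "Instagram"],     # compare any terms of interest
    timeframe="today 1-m",                 # roughly the "past one month" view
    geo="IN",                              # same region as the Trends URL above
)
interest = pytrends.interest_over_time()   # pandas DataFrame of relative interest (0-100)
print(interest.tail())
```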

Google Sandbox: A Belief

The Google sandbox is believed to be a filter applied to new websites to check whether they are spam, or, put another way, a filter that checks whether a new website is as relevant as older ones, since new sites are not yet as popular (relevant) as established sites. As a result, web content developers can feel exhausted when their site cannot reach the top of Google's SERP while the sandbox is still verifying it.

How long this sandbox effect lasts is thought to depend on the quantity and quality of the keywords your site targets; a site may stay in the sandbox for 6 to 8 months.

Sitemap.xml

A Map of a Website to Help Crawlers: In simple terms, a sitemap is an XML file that contains the URLs of all your webpages. It is like a list of every page on your website. This file should be easy to discover on your site so that search engine crawlers can find it.

A sitemap is usually used to help search engine crawlers follow the links to all of your individual webpages so that they do not miss anything on your site.
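As a rough illustration, here is a minimal sketch of how such a file could be generated with Python's standard library; the page URLs are placeholders, not real addresses.

```python
# Minimal sketch: writing a sitemap.xml that lists every page URL.
# The URLs below are placeholders -- replace them with your own pages.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_urls, out_path="sitemap.xml"):
    """Write an XML sitemap containing one <url><loc> entry per page."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "https://www.yoursite.com/",
    "https://www.yoursite.com/about",
    "https://www.yoursite.com/blog/first-post",
])
```

The resulting file simply lists each page inside its own url/loc entry, which is the "list of every webpage" idea described above.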


As the name implies, it is simply a map of your site: on a single page you show the structure of your site, its sections, the links between them, and so on. Sitemaps make navigating your site easier, and keeping an updated sitemap on your site is good both for your users and for search engines. Sitemaps are an important way of communicating with search engines.


Structural Decisions

Auditing an Existing Site to Identify SEO Problems

Auditing an existing site is one of the most important tasks for an SEO professional. Many web developers, including those who build CMSs, do not design with SEO in mind, so there is a lot of opportunity to find problems when conducting a site audit.

Please check: Elements of Audit

Structural decisions include some key points, as follows:

  1. Target keywords
  2. Cross-link relevant content
  3. Use anchor text
  4. Use breadcrumb navigation
  5. Minimize link depth

Session IDs in URLs: Crawler Confusion

Session IDs are most common on e-commerce sites. They are embedded in a URL so the website can track users or consumers from page to page, for example to keep track of the items in a consumer's shopping cart.

These IDs cause problems for search engine crawlers, however, because they create a large number of links for the spider to crawl, and the search engine can end up indexing essentially the same page over and over. Search engines like Google refer to this as a 'spider trap', which we will discuss later on.

Below is an example of how session IDs can give the appearance of an endless number of pages within a single site. A crawler coming to your website may find a page with the following URL:

http://www.yoursite/shop.cgi?id=dkom2354kle03i

This page gets indexed, but when the spider returns later to look for new content, it finds the following:

http://www.yoursite/shop.cgi?id=hj545jkf93jf4k

This is actually the same page as before, just with a different session ID, but the spider sees it as a new URL. Because of this confusion, search engine spiders are often programmed to avoid pages containing session IDs.
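To see why the two addresses are duplicates, here is a small hedged sketch: it strips the hypothetical "id" query parameter so that both URLs from the example above normalize to the same page address. This only illustrates the duplication; it is not a description of how any particular search engine works.

```python
# Sketch: stripping a hypothetical session parameter ("id") shows that the two
# example URLs above really point at the same page.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonicalize(url, session_params=frozenset({"id"})):
    """Return the URL with session/tracking query parameters removed."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in session_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

a = "http://www.yoursite/shop.cgi?id=dkom2354kle03i"
b = "http://www.yoursite/shop.cgi?id=hj545jkf93jf4k"
print(canonicalize(a) == canonicalize(b))  # True -- same page, different session IDs
```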
