Best On-Page SEO & Off-Page SEO Tutorial (A to Z)


We present you with a complete SEO article covering all the terms related to the subject, so that you can resolve your digital doubts.

Within SEO there are many terms and concepts that we need to be clear about when optimizing and positioning our site or project in the best possible way.

A good understanding of each term is essential. For this reason, in this section we will cover new terms and concepts, all related to SEO, so that you are always up to date.

1. Google algorithm

The Google algorithm is the search engine's method of ranking pages for a given search; in other words, it is what decides whether you appear first, second, or on the second page.

This algorithm changes about 500 times a year, which makes it difficult to keep track of. That is why it is preferable to know the important updates well, such as Panda and Penguin: how they affect SEO and how to recover from them.

Anchor text

Anchor text is the visible text of a link or hyperlink; it provides information about the content we want to direct the user and search engines to.

Search engines have improved over time and increasingly use more factors to create their positioning rankings. One of these metrics or factors is the relevance of a link. The relevance of a link depends both on the authority of the page where that link comes from, and on the visible text of the anchor text.  Of course, the link should always be as natural as possible or Google will understand it as a bad practice.

We can classify anchor text into the following types:

  • Naked or no anchor text. Only a URL is displayed, for example www.40defiebre.com.
  • Generic. Words such as "this blog", "click here", or "this page".
  • Keyword. Depending on which keyword we want to rank for, we use a different anchor text, choosing the terms we want to highlight, for example, "Link Building".
  • Brand name. When it consists of text other than the above and the objective is to link to a brand, a website, etc. The link would be: "amirul-academy".

2. Backlinks

Backlinks are inbound links that point from other pages to your own. The number of backlinks to your page matters because the more relevant the pages that link to you, the more notoriety your website will gain in the eyes of Google. Make sure they are natural, fitting links; always quality before quantity.

Black Hat SEO or negative SEO

Black Hat SEO is the attempt to improve a web page's search engine ranking using unethical techniques or techniques that contradict Google's guidelines; in short, "cheating". These practices are increasingly penalized by Google. Some examples of Black Hat SEO are:

  • Cloaking
  • SPAM in forums and blog comments
  • Keyword stuffing

Keyword cannibalization

Keyword cannibalization occurs when several pages on a website compete for the same keywords. This confuses the search engine, which cannot tell which page is the most relevant for that keyword, causing a loss of ranking.

How is this solved? The easiest way is to focus each page on one or two keywords at most. When that cannot be avoided, create a main product page from which the pages for the different formats can be accessed, and include a canonical tag on those pages pointing to the main product page.

3. Cloaking

Cloaking is a widely used Black Hat SEO technique that consists of displaying different content depending on whether it is a user or a search engine robot that reads it.

Google is very strict about this practice. Although years ago it could have produced results, forget it: it runs counter to what search engines are pursuing with their updates, a more natural, more ethical, and more user-focused SEO.

Duplicate content

Duplicate content occurs when the same content appears at multiple URLs. In principle this is not grounds for a penalty, unless a high percentage of your website is duplicated. Having a few duplicate pages will not make Google angry with us, but avoiding it is a sign that we are on the right track.

Although it does not imply a penalty, it can generate a loss of positioning potential because search engines do not know which are the most relevant pages for a certain search.

CTR

The CTR (Click Through Rate) is the number of clicks a link gets compared to its number of impressions. It is always calculated in percentage, and it is a metric that is normally used to measure the impact that a digital campaign has had.

How to calculate the CTR?

As we said before, the CTR is calculated as a percentage. It is obtained by dividing the number of clicks a link has received by the number of times users have seen it (impressions), then multiplying by 100.

Let's see an example: imagine we have a result in Google that has been seen 2,000 times and has received 30 clicks. Our CTR would be calculated like this:

CTR = (Clicks / Impressions) x 100 = (30/2000) x 100 = 1.5%

CTR = 1.5%
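The calculation above can be sketched as a small function (a minimal sketch; the function name is our own):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-Through Rate as a percentage: (clicks / impressions) * 100."""
    if impressions == 0:
        return 0.0  # a result with no impressions has no meaningful CTR
    return clicks / impressions * 100

# The worked example from the text: 30 clicks out of 2,000 impressions.
print(round(ctr(30, 2000), 2))  # 1.5
```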

4. Keyword density

Keyword density is the percentage of times that a word (or series of words) appears in a text relative to the total number of words.

A few years ago, keyword density was one of the most important factors in SEO positioning, as it was the method used by search engines (Google, Yahoo, Bing) to identify the main topic of a page.

However, SEO has changed, now Google's guidelines recommend writing in the most natural way possible, that is, you have to write for the user instead of for the search engine.

Although there are still people who recommend not exceeding a keyword density of 3%, there is no ideal percentage.
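Keyword density as described above can be computed in a few lines (a minimal sketch; the function and sample text are our own, and this naive version only handles single-word keywords):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in the text that exactly match the keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) * 100

# "seo" appears 3 times in an 8-word sample text.
sample = "seo tips and seo tricks for better seo"
print(round(keyword_density(sample, "seo"), 1))  # 37.5
```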


Canonical tag

The Canonical tag was introduced by Google, Yahoo!, and Bing in 2009 to solve the problem of duplicate or similar content in SEO.

If there is no canonical tag in your code on a set of pages with duplicate or similar content, search engines will have to decide which URL is best suited to what the user is specifically looking for. However, if we introduce this tag, we are the ones who tell Google and other search engines which is our favorite page. This will improve the indexing and positioning process of our website in SERPs.

Canonical tag example: <link rel="canonical" href="http://www.miweb.com/principal" />

Let's see an example: if our website is the platform from which we sell flats in the Chueca neighborhood of Madrid and we have several pages with very similar content, we must choose the URL by which we want to position ourselves as canonical. This may be the one that has brought us the most traffic or the one that brings the greatest benefit.

To use the canonical tag effectively in SEO, just follow these steps:

  1. Choose which is the main or canonical page.
  2. Decide which secondary pages could compete with the main one for ranking.
  3. Add the canonical tag in the secondary pages pointing to the main page between "<head>" and "</head>"
  4. Put the canonical tag on the main page pointing to itself between "<head>" and "</head>"
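The steps above can be sketched in code; assuming hypothetical product URLs, each page (the secondary variants and the main page itself) would carry a tag like this between <head> and </head>:

```python
# Hypothetical URLs: one canonical product page and two near-duplicate variants.
canonical_url = "https://www.example.com/product"
secondary_urls = [
    "https://www.example.com/product?color=red",
    "https://www.example.com/product?color=blue",
]

def canonical_tag(target_url: str) -> str:
    """Build the <link rel="canonical"> tag to place inside <head>."""
    return f'<link rel="canonical" href="{target_url}" />'

# Steps 3 and 4: secondary pages AND the main page all point at the canonical URL.
for page in secondary_urls + [canonical_url]:
    print(page, "->", canonical_tag(canonical_url))
```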

Meta Robots tag

The Meta Robots tag is an HTML tag used to tell search engines to treat a URL in a certain way.

This tag is necessary if we do not want our website to be indexed or ranked in search engines.

This function can also be performed through the Robots.txt file on the page.

The difference between using the Meta Robots tag and the Robots.txt file is as follows:

  • Through the tag, we tell Google that we do not want certain pages indexed, but we do want bots to crawl them.
  • With the Robots.txt file, on the other hand, we tell bots not to enter or crawl certain pages at all.

It is important that you take this difference into account. You will understand it better with an example:

Imagine that you have 2 URLs that you don't want to appear in the Google index.

Url 1: blocked by robots.txt file

This URL will not be crawled nor will it be indexed (a priori, never trust Google 100% :-P).

Url 2: blocked with meta robots tag

This URL, blocked with the meta robots tag, will not be indexed, but it can still be crawled by search engines. All of its content will be analyzed, and therefore search engines can follow the links it contains to other pages.
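The two blocking mechanisms from the example can be written out as follows (a sketch; the /url-1/ path is invented for illustration):

```python
# URL 1: blocked via robots.txt -- bots are told not to crawl the path at all,
# so its content is never analyzed.
robots_txt = """User-agent: *
Disallow: /url-1/
"""

# URL 2: blocked via the meta robots tag placed in the page's <head> -- the
# page may still be crawled (and its links followed), but it stays out of
# the index.
meta_robots = '<meta name="robots" content="noindex, follow" />'

print(robots_txt)
print(meta_robots)
```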

5. Google Panda

Google Panda is a change to Google's algorithm that was released in the United States in February 2011 and in Europe in April of the same year. Upon its release, it affected more than 12% of all search results.

The maxim with Panda, to avoid being penalized, is to make sure that your content is totally original and adds great value to your users, that you keep the page updated, and that you even look for new formats to enrich what you offer the user. The most actionable metrics in this case are the bounce rate, the CTR of your search results, time on page, and the number of page views.

Google Penguin

Penguin is the official name for the Google algorithm update designed to fight webspam. This update was released in April 2012.

It focuses on the off-site factors of a website, rewarding sites whose link profile contains unmanipulated links from high-quality domains, and punishing pages that have violated Google's guidelines: unnatural link profiles, too many links from low-quality sites, etc.

It was Google that, from the beginning, decided that links pointing to a website were a sign that its content was relevant. Hence everyone started generating links galore. With Google Penguin, however, Google backtracked on its own rules.

The improvements implemented by the algorithm include better detection of low-value links: purchased links, links in article networks and directories, and basically any scheme that tries to manipulate the link profile of your website. The best way to make sure you are not penalized by Penguin is to adhere to Google's guidelines and attract links passively through your content.

How does SEO change with Google Penguin?

  • Natural links: links generated passively or through real value. Article syndication, spinning, hidden links, directories (free or paid), promotions in exchange for links, and the like are prohibited.
  • Variety of anchor text: it no longer makes sense to generate links whose anchor text is always the term you want to rank for. If Google detects a pattern it does not consider natural, it can penalize you.
  • Stay in your niche: the most valuable links come from domains and pages in your niche or that cover related topics.
  • Quality, not quantity: it is preferable to generate a few quality links rather than many of little value.

6. Keyword

Keyword refers to the terms through which we want to attract traffic to our website via search engines. You must take into account some factors associated with keywords (abbreviated KW), such as competition, search volume, conversion, or even their potential as a branding tool.

The choice of one keyword or another will determine the strategy, the content of a page, the appearance of that keyword in texts and tags, and other SEO ranking factors.

Keyword stuffing

Keyword stuffing is a Black Hat SEO technique that consists of the excessive use of keywords within a text, with the misguided objective of giving that word more relevance. Google frequently penalizes this type of over-optimization.

To avoid any type of negative action by Google, the texts should always be written to provide value to the user, and in the way that best suits your audience profile. If the text manages to provide useful, original and well-synthesized information, that will be a better indicator for Google than any variation in the number of keywords in the text.

There is no percentage that defines a perfect keyword density and Google recommends above all naturalness.

7. Link baiting

Link baiting is the technique of attracting links organically by creating high-value content. One of the essential factors for search engine ranking is the number of links to a given page.

Link Baiting intends for a large number of users to link to content on our site. To do this, we must create original, relevant, and novel content, such as articles, videos, or infographics that attract the attention of users.

Link building

Link Building is one of the foundations of web positioning or SEO, which seeks to increase the authority of a page as much as possible by generating links to it.

The algorithms of most search engines, such as Google or Bing, are based on on-site and off-site SEO factors, the latter built on the relevance of a website, whose main indicator is the links that point to it, or backlinks. There is another series of factors as well, such as the anchor text of the link, whether the link is follow or nofollow, brand mentions, or links generated on social networks.

It is important to bear in mind that good content is often linked to naturally, so link acquisition happens organically and with less effort than through other methods.

Link juice

Link juice is the authority that a page transmits through a link. Google ranks web pages based on their authority and relevance; this is transferred from one page to another through links, and this transmitted authority is what we call link juice.

To understand it, think of a web page as a large glass of juice in which we make various holes (links) at the base. A glass with one hole will transmit all of its link juice through that single hole. If it has 10, each hole will pass 10% of the total link juice, and so on.
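The glass-of-juice analogy translates directly into arithmetic (a minimal sketch; the equal split is a simplification that ignores details such as nofollow links):

```python
def juice_per_link(page_authority: float, outbound_links: int) -> float:
    """Authority passed through each link when it is split equally."""
    if outbound_links == 0:
        return 0.0  # no holes in the glass, no juice flows out
    return page_authority / outbound_links

# One hole passes everything; ten holes pass 10% each.
print(juice_per_link(100.0, 1))   # 100.0
print(juice_per_link(100.0, 10))  # 10.0
```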

Long-tail

The long tail is a statistical term that refers to the distribution of a population.

Suppose your website attracts traffic through 100,000 keywords and you focus on the 100 that bring the most visitors. Imagine that around 20% of total traffic (depending on the nature of your website) corresponds to those terms; the remaining 80% then corresponds to terms with a very low number of searches each. So the vast majority of the traffic your website attracts comes through terms that you are not analyzing and do not even know about.

This is what we call the long tail: searches with more specific terms that individually generate very little traffic but together are the largest source of visits to the web. The term applies to realities other than online marketing; it was popularized by Chris Anderson in a Wired article, citing examples of companies that have succeeded thanks to the business generated by their long tails, such as Amazon, Netflix, or Apple.

8. Meta Tags

Meta tags are pieces of information included in web pages that are not directly visible to the user. They are used to provide information to browsers and search engines, helping them interpret the page better, and they are written in HTML within the web document itself.

Meta tags have been important at the SEO level due to their ability to affect search engine behavior: providing information about which searches a website should rank for, giving a description of the site, or blocking access to or indexing of the website by search engine robots.

Microformats

Microformats are a simple form of code that gives content semantic meaning so that machines can read it and understand our products or services.

If we add Microformats to our website, Google can read it and display it in search results. This information can include user votes, photo and name of the author, video, audio, etc.

9. Not provided

The term "not provided" is used in Google Analytics to identify all "secure" traffic within Google; in other words, all traffic that comes from users who are logged into their Google account.

What about this data? What do we do with it? There are different options for interpreting it so that you can still make sense of this traffic.


Off-site SEO

It is the part of SEO work that focuses on factors external to the web page we are working on but that still affect our site. It includes external links, social signals, mentions, and other metrics that reinforce the page's authority.

One of the most important off-site SEO tasks is link building: generating links on external websites that point to your page, which leads Google to give it greater relevance.

On-site SEO

On-site SEO or On-page SEO is a set of internal factors that influence the positioning of a web page. They are those aspects that we can change ourselves on our page such as:

  • The meta information, such as the title or the meta description
  • The URL
  • The content
  • The alt attribute in images
  • The web structure
  • The internal linking
  • The HTML code

Optimizing on-site SEO is an essential process that every web page must take care of if it wants to appear in search results.

10. PageRank

PageRank is the way Google measures the importance of a website; the search engine rates the value of websites on a scale of 0 to 10.

When a page links to another website, it transmits a value, and this value depends on the Page Rank of the page that links.

Currently, Google has stopped publicly updating the Page Rank, and now no one can see what score a website has for the search engine.

However, although they continue to use it internally to establish their search results, it has less and less weight within the whole algorithm.

The Page Rank is given by factors such as the number of links and domains pointing to the web, their quality, the age of the domain, etc.

11. Query

The term "query" means a question or request. When we talk about databases, a query or query string is a request for data stored in that database, although generically it can refer to any interaction. When we talk about search engines, a query is the term we type into Google, a request that then leads to a SERP.

12. Search Engine Ranking

Search engine ranking is the position your website occupies on a search results page; that is, the position in which you appear in Google, Yahoo, Bing, etc. when a user performs a search.

To improve our positioning we must use strategies and tools that help us optimize our website, increasing accessibility, usability and content.

13. Schema Markup

Schema markup is a specific vocabulary of tags (or microdata) that you can add to your website's HTML code to provide more relevant information. This helps search engines understand your content better and deliver better results. It also improves the way your page is displayed, with rich snippets that appear below the page title.

Schema.org is the reference website for this type of strategy, where you will find all kinds of hierarchies and ways to organize your content. But wait, what can I structure? Hundreds of things! There is now a wide variety of labels to structure, and surely over time there will be more: places, events, movies, books, recipes, people, etc. Also, to make it much easier, Google created the "Markup Helper". Very useful.
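As an illustration, a minimal JSON-LD snippet using the Schema.org Recipe type could be generated like this (a sketch; all the values are invented):

```python
import json

# Hypothetical recipe data marked up with Schema.org vocabulary.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Tomato Soup",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "120",
    },
}

# The JSON-LD script tag that would go in the page's HTML.
snippet = '<script type="application/ld+json">' + json.dumps(recipe) + "</script>"
print(snippet)
```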

SERP

SERP (Search Engine Results Page) refers to the results page of a search engine, such as Google or Bing.

It is the page that appears after performing a search; it is where the results are displayed in order.

The more a website is optimized according to the quality criteria of the search engines, the more likely it will be to rank better in the SERPs.

Sitemap

A sitemap or website map is an XML document that is sent to search engines. This document gives search engines a complete list of the pages that make up a website, so they can index pages that their robots could not otherwise reach because there are no direct links to them, because they sit behind a form, etc.
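A minimal XML sitemap of this kind can be generated with the standard library (a sketch; the URLs are invented):

```python
import xml.etree.ElementTree as ET

# Hypothetical list of pages to expose to search engine crawlers.
urls = ["https://www.example.com/", "https://www.example.com/about"]

# Root element uses the standard sitemap namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    # Each page gets a <url> entry containing its <loc>.
    loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
    loc.text = u

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```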

Spinning

Spinning is a Black Hat SEO technique that refers to the creation of an article by reusing different original texts.

In this way, the generation of content is accelerated in a simple way. It can be carried out using software that automates the process of modifying the content or manually, making people believe that they are different texts by means of synonyms or changes of order and words.

Although this technique has been widely used, doing it automatically falls within Google's penalty factors. Since its now-famous Penguin, Google detects these practices more frequently.

14. White Hat SEO

White Hat SEO are those ethically correct techniques that meet the guidelines set by search engines to position a website. 

Its objective is to make a page more relevant to search engines. To achieve good White Hat SEO, there are some characteristics you should take into account.

White Hat SEO is the most beneficial way to optimize the positioning of a website in the medium-long term.
