SEO Tutorial for Beginners in 2017



What is SEO?

Search Engine Optimization in 2016 is a technical, analytical, and creative process that improves the visibility of a website in search engines. Its main function is to drive more visitors to a website who convert into sales.

The free SEO tips you read on this page will help you create a successful SEO website.

I have more than five years' experience ranking sites on Google. If you need optimization services, see my SEO services or SEO services for small businesses pages.

An introduction

This article is a beginner’s guide for effective white hat SEO.

It consciously avoids "grey hat" techniques, because what might be grey today is often called "black hat" tomorrow, as far as Google is concerned.

No single guide can explore this complex subject in its entirety. What you will find here are answers to the questions I had when I was starting out in this field.

“The Rules”

Google insists webmasters stay true to its "rules" and aims to reward sites that have quality content and use "white hat" web marketing techniques with high rankings.

In turn, it aims to penalize websites that rank by breaking those rules.

These rules are not "laws" but "guidelines" for ranking on Google, created by Google. You should note, however, that some methods of gaining rankings are, in fact, illegal. Hacking, for example, is illegal in the United Kingdom and the United States.

You can choose to follow and respect these rules, bend them, or ignore them, all with varying levels of success (and varying levels of attention from the Google spam team).

White hats play by the "rules"; black hats ignore the "rules".

What you read in this article is perfectly legal and within the guidelines, and it will help you increase traffic to your website through organic, or natural, search engine results pages (SERPs).


There are many definitions of SEO (spelled search engine optimisation in the UK, Australia, and New Zealand, and search engine optimization in the US and Canada), but organic SEO in 2016 is mostly about getting free traffic from Google, the most popular search engine in the world (and almost the only game in town in the UK):


The art of web SEO lies in understanding how people search for things and understanding what type of results Google wants to (or will) display to its users. It is about bringing a lot of things together.

A good optimizer understands how search engines like Google generate their natural SERPs to satisfy users' navigational, informational, and transactional queries.

Risk Management

A good search marketer has a solid understanding of the short-term and long-term risks involved in optimizing rankings in search engines, and an understanding of the type of content and sites that Google (especially) will reward in its natural SERPs.

The aim of any campaign is more visibility in search engines, and this would be a simple process if it were not for the many pitfalls.

There are rules to be followed or ignored, risks to take, gains to make, and battles to be won or lost.

Free traffic


A Mountain View spokesman once called the search engine a "kingmaker", and that is no lie.

Ranking high in Google is very valuable – it is effectively "free advertising" on the best advertising space in the world.

Traffic from Google's natural listings is still the most valuable organic traffic to a website in the world, and it can make or break an online business.

The state of play in 2016 is that you can still generate highly targeted leads, for free, simply by improving your website and optimizing your content to be as relevant as possible for a buyer looking for your business, product, or service.

As you can imagine, there is now a lot of competition for that free traffic (!) – even from Google itself in some niches.

You should not compete with Google. You should focus on competing with your competitors.


The process

The process can be practiced successfully in a bedroom or a workplace, but it has traditionally always involved mastering many skills as they arose, across various marketing technologies, including but not limited to:

  • Web page design
  • Accessibility
  • Usability
  • User experience
  • Website development
  • PHP, HTML, CSS, etc.
  • Server management
  • Domain management
  • Copywriting
  • Spreadsheets
  • Backlink analysis
  • Keyword research
  • Social media promotion
  • Software development
  • Analytics and data analysis
  • Information architecture
  • Research
  • Log analysis
  • Looking at Google for hours on end

It takes a lot, in 2016, to rank a page on Google in competitive niches.

User Experience

The big stick Google is hitting every webmaster with (at the moment, and for the foreseeable future) is the "quality user experience" stick.

If you expect to rank in Google in 2016, you had better have a quality offering based not entirely on manipulation or old-school tactics.

Is a visit to your website a good user experience?

If not, beware the manual "quality raters", and beware the Google Panda and site-quality algorithms that are looking out for websites offering their users a poor user experience.

Google raising the "quality bar" year on year ensures a higher level of quality in online marketing in general (above the very low quality we have seen in recent years).

Success online comes from investing in higher-quality page content, better website architecture, improved usability, a sensible balance between conversion optimization and user satisfaction, and promotion.

If you do not take that route, you will sooner or later be hunted down by Google's algorithms.

This "What is SEO" guide (and this entire website) is not about the churn-and-burn type of Google SEO (called web spam at Google), as that is too risky to deploy on a real business website in 2016.

What is a successful strategy?

Be relevant. Be trusted. Be popular.

SEO in 2016 is not just about manipulation.

It is about adding quality and often useful content to your website that, together, serves a purpose and delivers user satisfaction.

If you are serious about more free traffic from search engines, be prepared to invest time and effort into your website and online marketing.

Quality signals

Google wants to rank quality documents in its results, and it forces those who want to rank highly to invest in higher-quality content and excellent service that attracts editorial links from reputable websites.

If you are willing to add great content to your site and create buzz about your business, Google will rank you highly.

If you try to manipulate Google, it will penalize you for a period of time, often until you fix the offending problem – which, as we know, can take years.

Backlinks in general, for example, are still weighted far too positively by Google, and they can be manipulated to drive a website to top positions – for a while. That is why black hats do it – and they have the business model to do so. It is still the easiest way to rank a website, even today.

If you are a real business that intends to build a brand online, you cannot use black hat methods. Full stop.

Fixing the problems will not necessarily bring your organic traffic back as it was.

Recovering from a Google penalty is a "new growth" process as much as it is a "clean-up" process.

Google rankings are constantly changing

It is Google's job to MAKE MANIPULATING ITS SERPs HARD.

So Google keeps the "moving parts" of its algorithms hidden, changing the "rules" and raising "quality standards" for pages competing for top-ten rankings.

In 2016, the SERPs are in constant flux – and that seems to suit Google, keeping everyone guessing.

Google is very secretive about its "secret sauce", and offers advice that is sometimes helpful, sometimes vague – and, some would say, sometimes misleading – about how to get the most valuable traffic from Google.

Google is on record as saying it is intent on "frustrating" search engine optimizers who attempt to improve the amount of high-quality traffic to a website using, at least (but not limited to), low-quality strategies classed as web spam.

In essence, Google search engine optimization is still about KEYWORDS and LINKS. It is about RELEVANCE, reputation, and trust. It is about the quality of your content and visitor satisfaction.

A GOOD USER EXPERIENCE is a key to winning – and keeping – the highest rankings in many verticals.

Relevance, authority, and trust

Web page optimization is about making a web page relevant and trusted enough to rank for a query.

It is about ranking for valuable keywords for the long term, on merit. You can play by the "white hat" rules defined by Google and aim to build that trust and authority naturally over time, or you can choose to ignore the rules and go full-on "black hat".

Most SEO tactics still work, for some time, on some level, depending on who is doing them and how the campaign is deployed.

Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, it will class you as a web spammer, and your site will be penalized.

These penalties can last for years if not addressed, as some penalties expire while others do not – and Google wants you to clean up any violations.

Google does not want you to be able to easily modify where you rank. Critics would say Google would rather you paid for that visibility via Google AdWords.

The problem for Google is that a top ranking in its organic results is social proof for a business, a way of avoiding PPC costs, and still the best way to drive valuable traffic to a website.

And it is free, provided you meet Google's ever-rising criteria for ranking highly.

The problems of the “User Experience”

Is the user experience a ranking factor?

User experience is mentioned 16 times in the main content of the quality raters guidelines (official PDF), but Google has told us it is not, per se, a classifiable "ranking factor" – on desktop search, at least.

On mobile, sure, since UX is the base of the mobile-friendly update. On desktop, currently no. (Gary Illyes, Google, May 2015)

While UX, we are told, is not literally a "ranking factor", it is useful to understand exactly what Google calls a "poor user experience", because if any poor-UX signals are identified on your website, that is not going to be a good thing for your rankings any time soon.

Matt Cutts' consistent SEO advice was to focus on a satisfying user experience.

What is Bad UX?

For Google, a poor UX rating – at least from a quality rater's point of view – is based on things such as the following, on the page:

  • Misleading or potentially deceptive design
  • Sneaky redirects (e.g. hidden affiliate links)
  • Malicious downloads
  • Spammy user-generated content (unmoderated comments and posts)
  • Low-quality MC (main content of the page)
  • Low-quality SC (supplementary content)

What is SC (supplementary content /extra content)?

When it comes to a web page and a positive UX, Google talks a lot about the functionality and utility of helpful supplementary content – for example, helpful navigation links for users (which are not, usually, MC or ads).

Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is created by webmasters and is an important part of the user experience. One common type of SC is navigation links, which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.

In short, a lack of helpful SC may be a reason for a Low quality rating, depending on the purpose of the page and the type of website. Google has different standards for small websites that exist to serve their communities versus large websites with a large volume of webpages and content. For some types of "webpages", such as PDFs and JPEG files, no SC is expected at all.

It is worth remembering that good SC cannot save poor MC ("Main Content is any part of the page that directly helps the page achieve its purpose") from a poor rating.

Good SC seems to be a sensible option. It always has been.

Key points of SC

  1. Supplementary content can be a large part of what makes a High quality page very satisfying for its purpose.
  2. Helpful SC is content that is specifically targeted to the content and purpose of the page.
  3. Smaller websites, such as websites for local businesses and community organizations, or personal websites and blogs, may need less SC for their purpose.
  4. A page can still receive a High or even Highest rating with no SC at all.

Specific quotes from the guidelines on SC:

  1. Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose.
  2. SC is created by webmasters and is an important part of the user experience. One common type of SC is navigation links, which allow users to visit other parts of the website. Note that in some cases, content behind tabs may be considered part of the SC of the page.
  3. SC which contributes to a satisfying user experience on the page and website. – (A mark of a high-quality website – this statement was repeated five times)
  4. However, we expect websites of large companies and organizations to put a great deal of effort into creating a good user experience on their website, including having helpful SC. For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.
  5. However, some pages are deliberately designed to shift the user's attention from the MC to the Ads, monetized links, or SC. In these cases, the MC becomes difficult to read or use, resulting in a poor user experience. These pages should be rated Low.
  6. Misleading or potentially deceptive design makes it hard to tell that there is no answer, making this page a poor user experience.
  7. Redirecting is the act of sending a user to a different URL than the one initially requested. There are many good reasons to redirect from one URL to another, for example, when a website moves to a new address. However, some redirects are designed to deceive search engines and users. These are a very poor user experience, and users may feel tricked or confused. We will call these "sneaky redirects". Sneaky redirects are deceptive and should be rated Lowest.
  8. However, you may encounter pages with a large amount of spammed forum discussions or spammed user comments. We will consider a comment or forum discussion to be "spammed" if someone posts unrelated comments which are not intended to help other users, but rather to advertise a product or create a link to a website. Frequently these comments are posted by a "bot" rather than a real person. Spammed comments are easy to recognize. They may include Ads, download, or other links, or sometimes just short strings of text unrelated to the topic, such as "Good", "Hi", "I'm new here", "How are you today", etc. Webmasters should find and remove this content because it is a bad user experience.
  9. The modifications make the page very difficult to read and are a poor user experience. (Lowest quality MC: copied content with little or no time, effort, expertise, manual curation, or added value for users)
  10. Sometimes, the MC of a landing page is helpful for the query, but the page happens to display porn ads or porn links outside the MC, which can be very distracting and potentially provide a poor user experience.
  11. The helpfulness and benefit of the MC has to be balanced against the user experience of the page.
  12. Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.

In short, nobody is going to advise you to create a poor UX on purpose, in light of Google's algorithms and human quality raters showing a clear interest in this stuff. Google is already rating mobile sites on what it classes as frustrating UX, although on certain levels what Google classes as poor UX may be far removed from what a professional UX designer would work with, in the same way Google's mobile testing tools differ from, for example, the W3C's mobile testing tools.

Google is also increasingly interested in rating the quality of your website's content and the reputation of your domain, relative both to your own pages and to competing pages in other domains.

A satisfying UX can help your rankings, second-order factors taken into account. A poor UX can seriously hurt your human-reviewed rating, at least. Google's punishing algorithms probably class pages as something akin to a poor UX if they meet certain detectable criteria, such as a lack of reputation, or old-school SEO tactics like keyword stuffing a site.

If you are improving the user experience by focusing primarily on the quality of your pages' MC and avoiding – or even removing – old-school SEO techniques, these are certainly positive steps towards getting more traffic from Google in 2016 – and the type of content performance Google rewards is, in the end, largely down to a satisfying user experience.

Balancing conversions with user-friendliness and user satisfaction

Take pop-ups or pop-unders as an example:

According to usability expert Jakob Nielsen, 95% of website visitors hated unexpected or unwanted pop-up windows, especially those containing unsolicited advertising.

In fact, pop-up ads have been consistently voted the #1 most hated advertising technique since they first appeared many years ago.

Accessibility students would also agree:

  • Opening a new browser window should be the user's decision
  • New windows should not clutter the user's screen.
  • All links should open in the same window by default. (You can make an exception for pages containing a list of links; in such cases it is convenient to open links in a new window so that the user can easily return to the list. Even then, give the user advance notice that links will open in a new window.)
  • Tell visitors when a link will open in a pop-up window (using the link <title> attribute)
  • Pop-up windows do not work in all browsers.
  • They are disorienting for users.
  • Give the user an alternative.

It is inconvenient for usability fans, then, that pop-ups, used successfully, can dramatically increase subscription conversions.

Example: Test with a pop-up window

Pop-ups suck; everybody seems to agree. Here is a little test I ran on a subset of pages, an experiment to see whether pop-ups on this site would convert more visitors into subscribers.

I tested it at a time when I had not blogged for a few months and traffic was very stable.


Testing Pop-Up Window Results

Day    Wk1 (pop-up on)    Wk2 (pop-up off)    % Change
Mon    46                 20                  173%
Tue    48                 23                  109%
Wed    41                 15                  173%
Thu    48                 23                  109%
Fri    52                 17                  206%
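The "% Change" column above appears to be the relative increase of the pop-up-on week over the pop-up-off week, rounded to the nearest whole percent. As a quick sanity check of that arithmetic (the function name is mine, not part of the original experiment):

```python
def percent_change(with_popup: int, without_popup: int) -> int:
    """Relative increase of the pop-up-on count over the
    pop-up-off count, rounded to the nearest whole percent."""
    return round((with_popup - without_popup) / without_popup * 100)

# Tuesday's figures from the table: 48 subscribers with the pop-up, 23 without.
print(percent_change(48, 23))  # → 109
```

Tuesday, Wednesday, Thursday, and Friday all check out against this formula.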


There was a fair increase in email subscribers overall in this little experiment on this site. Using a pop-up appears to have had an immediate effect.

I tested it for a few months, and the results of the earlier small tests were repeated again and again.

I have tried different layouts and different calls to action without pop-ups, and they also work to some degree, but they usually take a bit longer to deploy than switching on a plugin.

I really do not like pop-ups, as they have been an obstacle to web accessibility, but it is stupid to dismiss out of hand anything that works. I have not yet found a client who, presented with that kind of result, would choose accessibility over subscriptions.

I do not run the pop-up on the days I publish on the blog, as in other tests it really seemed to reduce how many people share a post in social media circles.

With Google now showing an interest in interstitials, I would be nervous about using a pop-up window that obscures the main reason for the visit. If Google detects dissatisfaction, I think that would be bad news for your rankings.

I currently use an exit-intent pop-up, in the expectation that by the time the user sees the device, they are already satisfied with the content they came to read. I can recommend this as a way to gather subscribers as, for now, it converts at a rate similar to the normal pop-up – if not better.

I think, as an optimizer, it makes sense to convert visitors without using techniques that could potentially have a negative impact on your Google rankings.

Do not put conversion devices in the way of the main reason a visitor is on a particular page, or you risk Google detecting relative dissatisfaction with your site – and that will not help you as Google's RankBrain gets better at working out what pages "really mean".

Google wants to rank high-quality websites

Google has a history of classifying your website as a type of entity, and whatever that type is, you do not want a low-quality label on it – whether it is applied by algorithm or by a human. Manual raters may not directly affect your rankings, but any signal associated with Google marking your site as low quality should probably be avoided.

If you build websites without practices Google classes as manipulative, you will meet Google's expectations as laid out in its quality rating guidelines (PDF).

Google said:

Low quality pages are unsatisfying or lacking in some element that prevents them from achieving their purpose well.

“Reason Enough”

In some cases, a single failing is "reason enough" to immediately rate a page low in certain areas, and Google instructs its quality raters to do exactly that:

  • An unsatisfying amount of MC is a sufficient reason to give a page a Low quality rating.
  • Low quality MC is a sufficient reason to give a page a Low quality rating.
  • Lacking appropriate E-A-T is a sufficient reason to give a page a Low quality rating.
  • Negative reputation is a sufficient reason to give a page a Low quality rating.

What are low-quality pages?

When it comes to defining what a low-quality page is, Google is clearly most interested in the quality of the Main Content (MC) of a page:

Main content (MC)

Google says the MC should be the main reason a page exists.

  • The quality of the MC is low.
  • There is an unsatisfying amount of MC for the purpose of the page.
  • There is an unsatisfying amount of website information.

Bad User Experience

  • This content has a lot of problems: poor spelling and grammar, incomplete sentences, and inaccurate information. The poor quality of the MC is a reason for a Low+ to Low rating. In addition, the popover ads (the words with blue underlines) can make the main content difficult to read, resulting in a poor user experience.
  • Pages that provide a poor user experience, such as pages that try to download malicious software, should also receive low ratings, even if they have some images appropriate for the query.


  • If a page seems poorly designed, take a good look. Ask yourself if the page was deliberately designed to draw attention away from the MC. If so, the Low rating is appropriate.
  • The page design is lacking. For example, the page layout or use of space distracts from the MC, making it difficult to use the MC.


  • You should consider who is responsible for the content of the website, or the content of the page you are evaluating. Does the person or organization have sufficient expertise in the topic? If expertise, authoritativeness, or trustworthiness is lacking, use the Low rating.
  • There is no evidence that the author has medical expertise. Because this is a YMYL medical article, lacking expertise is a reason for a Low rating.
  • The author of the page or website does not have enough expertise in the topic of the page and/or the website is not trustworthy or authoritative on the topic. In other words, the page/site is lacking E-A-T.

After the content of the page itself, the following carry the most weight in determining whether you have a high-quality page.


  • Unhelpful or distracting SC that benefits the website rather than helping the user is a reason for a Low rating.
  • The SC is distracting or unhelpful for the purpose of the page.
  • The page is lacking helpful SC.
  • For large websites, SC may be one of the primary ways that users explore the website and find MC, and a lack of helpful SC on large websites with a lot of content may be a reason for a Low rating.

For example, an ad for a model in a revealing bikini is probably acceptable on a site that sells swimsuits, but a very distracting and graphic porn ad may warrant a Low rating.

Take good care of your website

  • If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.
  • The page is lacking in maintenance and updates.

  • Credible negative (though not malicious or financially fraudulent) reputation is a reason for a Low rating, especially for a YMYL page.
  • The page has a negative reputation.

When it comes to Google giving your page the Lowest rating, you would probably have to hit several of these points at once, but it gives you an idea of what you definitely want to avoid.

Google states in the document that there are certain pages that...

should always receive the Lowest rating

...and these are shown below. Note – these statements are spread throughout the raters' document and are not listed in the order I present them here. I do not think any context is lost by presenting them like this, and it makes the whole thing more digestible.

Anyone familiar with the Google Webmaster Guidelines will recognize most of the following:

  • True lack-of-purpose pages or websites.
  1. Sometimes it is difficult to determine the real purpose of a page.
  • Pages on YMYL websites with completely inadequate or no website information.
  • Pages or websites that are created to make money with little to no attempt to help users.
  • Pages with extremely low or lowest-quality MC.
  1. If a page is deliberately created with no MC, use the Lowest rating. Why would a page exist without MC? Pages with no MC are usually lack-of-purpose pages or deceptive pages.
  2. Webpages that are deliberately created with a bare minimum of MC, or with MC which is completely unhelpful for the purpose of the page, should be considered to have no MC.
  3. Pages deliberately created with no MC should be rated Lowest.
  4. Important: The Lowest rating is appropriate if all or almost all of the MC on the page is copied with little or no time, effort, expertise, manual curation, or added value for users. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.
  • Pages on YMYL ("Your Money or Your Life") websites with completely inadequate or no website information.
  • Pages on hacked, defaced, or spammed websites.
  • Pages or websites created by people who have no expertise, or websites that are highly untrustworthy, unreliable, unauthoritative, inaccurate, or misleading.
  1. Harmful or malicious pages or websites.
  2. Websites that have an extremely negative or malicious reputation. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or deceptive behavior warrants the Lowest rating.
  3. Deceptive pages or websites. Deceptive webpages appear to have a helpful purpose (the stated purpose), but are actually created for some other reason. Use the Lowest rating if a webpage is deliberately created to deceive and potentially harm users in order to benefit the website.

  1. Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, and so on. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links, with little or no effort to provide helpful MC.
  2. Sometimes pages just feel "untrustworthy". Use the Lowest rating for any of the following: pages or websites that you strongly suspect are scams
  3. (e.g. pages that ask for personal information such as names, birthdates, addresses, bank account numbers, government ID numbers, etc.). Websites that "phish" for passwords to Facebook, Gmail, or other popular online services. Pages with suspicious download links, which may be malware.
  • Use the Lowest rating for websites with an extremely negative reputation.

Websites that lack care and maintenance are rated low quality.

Sometimes a website may seem a little neglected: links may be broken, images may not load, and the content may feel stale or outdated. If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted.

"Broken" or non-functioning pages rated low quality

I covered 404 pages in my recently published article on investigating why a website lost traffic.

Google gives clear advice on creating useful 404 pages:

  1. Tell visitors clearly that the page they are looking for cannot be found
  2. Use language that is friendly and inviting
  3. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site.
  4. Consider adding links to your most popular articles or posts, as well as a link to your site's home page.
  5. Consider providing a way for users to report a broken link.
  6. Make sure that your web server returns an actual 404 HTTP status code when a genuinely missing page is requested
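That last point, returning a real 404 status code, is the one sites most often get wrong: a friendly "not found" page served with a 200 status is a "soft 404" that Google may try to index. Here is a minimal sketch of getting the status right, using Python's standard WSGI interface purely for illustration (the site paths and page content are invented):

```python
from wsgiref.util import setup_testing_defaults

# The pages this toy site actually has.
PAGES = {"/": "<h1>Home</h1>", "/seo-tutorial": "<h1>SEO Tutorial</h1>"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in PAGES:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [PAGES[path].encode()]
    # A friendly 404 page in the site's own look and feel, but
    # crucially served with a genuine "404 Not Found" status,
    # not a "soft 404" served with 200 OK.
    body = ("<h1>Sorry, that page can't be found</h1>"
            '<p>Try the <a href="/">home page</a> or our '
            '<a href="/seo-tutorial">SEO tutorial</a>.</p>')
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [body.encode()]

# Exercise the app without a network: fake a request for a missing URL.
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/no-such-page"
captured = {}
def start_response(status, headers):
    captured["status"] = status
result = app(environ, start_response)
print(captured["status"])  # → 404 Not Found
```

The same idea applies whatever your stack is; what matters is that the status line says 404 while the body stays helpful.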

Ratings for pages with error messages or no MC

Google does not want to index pages without a specific purpose or sufficient main content. A good 404 page, set up correctly, prevents a lot of this from happening in the first place.

Some pages load with content created by the webmaster, but have an error message or are missing MC. Pages may lack MC for various reasons. Sometimes, the page is "broken" and the content does not load properly, or at all. Sometimes, the content is no longer available and the page displays an error message with this information. Many websites have a few "broken" or non-functioning pages. This is normal, and those individual non-functioning or broken pages should be rated Low quality. This is true even if other pages on the website are overall High or Highest quality.

Does Google programmatically review 404 pages?

We were not told in the recent hangout – but the Quality Rater guidelines do suggest that "users probably care a lot".

Do 404 errors in Search Console hurt my rankings?

"404 errors on invalid URLs do not harm your site's indexing or ranking in any way." – JOHN MUELLER

It seems this is not a one-size-fits-all answer. If you fix 404 errors on URLs that have backlink equity, you reclaim equity that was otherwise lost – and this "backlink reclamation" evidently has value.

The trouble here is that Google has introduced a lot of noise into this crawl-error report, making it difficult to manage and not very user-friendly.

Many of the broken links Google tells you about can often be completely irrelevant legacy issues. Google could make the report immediately more valuable by telling us which 404s are linked to from external websites.

Fortunately, you can find your own broken links on your site using any of the countless SEO tools available.

I also prefer to use analytics to find broken backlinks on a website with a history of migrations, for example.
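If you would rather script the check than use a tool, the first half of the job is simply collecting every href on a page; the second half is requesting each URL and noting any 404 responses. A minimal sketch of the collection step with Python's standard library (the sample page below is invented, and the fetching step is left as a comment since it needs a live site):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

# In practice you would then request each link (e.g. with
# urllib.request.urlopen) and record any that return a 404.
sample = ('<p><a href="/seo-tutorial">tutorial</a> and '
          '<a href="/old-page">an old page</a></p>')
print(extract_links(sample))  # → ['/seo-tutorial', '/old-page']
```

This will not catch links generated by JavaScript, which is one reason a crawler-based tool still earns its keep on larger sites.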

John has found some of these, although it specifically speaks (I think) of the errors found by Google in the webmaster tools (formerly Google Webmaster Tools)

  1. In some cases creep errors may come from a legitimate structural problem within your site or CMS. How do you know? Check the source of the analysis errors. If a link is on your site, in the static HTML code of your page, it always helps to fix it
  2. What about the funky URL, which “clearly broken?” If our algorithms are your site, they can try to find more content in it, for example trying to discover the new URLs in JavaScript. When we tried the “URLs” and found a 404, it’s great and expected. We just want to miss something important

If you build websites and want them to rank, the 2015 and 2014 Quality Rater Guidelines documents are an excellent guide for webmasters on avoiding a low quality rating and, potentially, the algorithmic punishments that come with it.

Google does not rank poor-quality pages above better options

If your exact-match keywords sit on poor-quality pages, most of those pages won't have all the ingredients Google needs to rank them highly in 2016.

I worked on this for a long time before I understood it well enough to write anything about it.

Here's an example of taking a bog-standard page that hadn't ranked for years and redesigning it into a resource built around a user's intent:

In many cases – for long-tail search traffic such as users on mobile using voice search, for example – Google prefers to send visitors FIRST to high-quality pages about a concept/topic that lay out the relationships and connections between relevant sub-topics, rather than to poor-quality pages that merely contain the exact phrase.

Further reading:

  • Identifying pages that get few visitors
  • Dealing with poor-quality pages on a website
  • Example of a high-quality web page
  • How to make high-quality websites

Technical SEO

If you're doing a professional SEO audit for a real business, you have to think like a Google Search quality rater AND a Google search engineer, and deliver long-term, real value to the client.


Google has a long list of technical requirements it advises you to meet, on top of all the things it tells you NOT to do when optimising your site.

Note that meeting Google's technical guidelines is no magic bullet for success – but failing to meet them can hold back your long-term rankings, and a single technical issue can seriously hurt your entire site if it's deployed across multiple pages.

The advantage of respecting the technical guidelines is usually a second-order benefit.

You don't get penalised or filtered when others do. When others fall, you rise.

Above all – individual technical issues are rarely the reason you have ranking problems, but we still want the second-order advantages that come from addressing them.


When you build a website for Google in 2016, you really need to understand that Google has a long list of things it will mark pages down for – and it's usually old-school SEO tactics that are now classified as "web spam".

Conversely, websites that are not marked down don't get worse, and so improve in the rankings. Sites with higher rankings pick up more organic links, and that process can quickly cement a high-quality page at the top of Google.

So – the most sensible thing any webmaster can do is give Google no reason to demote a website. Tick every box Google asks you to tick.

This is the simple (but longer-term) strategy I've used to rank on page 1 for "SEO" in the UK over the last few years, driving over 100,000 relevant organic visitors to this site each month from about 70 pages, without building links in recent years (and working on it part-time):

What is domain authority?

Domain authority – or, as Google has called it, "online business authority" – is an important factor at Google. What is domain authority? Well, nobody outside Google knows exactly how Google calculates popularity, reputation or trust, but when I write about domain authority I usually mean sites that are popular, reputable and trusted – all of which, of course, can be faked.

Most websites with domain authority / online business authority have lots of links pointing at them – sure – which is why link building has traditionally been such a popular tactic – and those links usually count in how third-party tools calculate their pseudo domain-authority scores, too.

Massive domain authority and "trust" rankings have historically been awarded to very successful sites that earned lots of links from credible sources and other online business authorities.

Amazon has a lot of online business authority…. (Official Google Webmaster Blog)

SEOs generally talk about domain trust and domain authority based on the number, type and quality of inbound links to a site.

Examples of trusted, authoritative sites include Wikipedia, the W3C and Apple. How do you become an OBA (online business authority)? By building a killer online (or offline) brand or service, usually with a lot of useful content on your site.

How do you fake online business authority? Either turn the site into an SEO black hole (that only works for big brands) or pump out "information" – constantly, on every topic you want Google to rate you for!

EXCEPT – if you're judged to be publishing poor-quality, unsuitable content, your visibility in Google will suffer.

I think the "quality rating" system Google has developed may be its answer to this kind of historical domain-authority abuse.

Can you (on a smaller scale, in certain niches) imitate what an online business authority is to Google, and earn the high rankings Google gives an OBA in the search results? OBAs provide the service, the content, the experience. All of that takes a lot of work and time to create – or even to imitate.

In fact, as an SEO, I genuinely believe the content route is the only sustainable way for most companies to get anywhere near OBA status in their niche or locale. Some focused link building will help along the way, and you should definitely go out and tell others about your site…

Have other relevant sites link to yours. – Google Webmaster Guidelines

Brands are how you sort out the cesspool.

"Brands are the solution, not the problem," Schmidt said. "Brands are how you sort out the cesspool."

That was Eric Schmidt, CEO of Google. Reading between the lines, I thought that was pretty good SEO advice.

If you're a "brand" in your niche or your city, Google wants to rank your business at the top – because it trusts you not to spam its results pages with crap and make Google look stupid.

That's money sitting on the table, because Google currently hands massive domain authority and trust to certain sites it rates most highly.

Tip: keep the content on your site on-topic, and produce quality content naturally. (That way, for example, the algorithms can't detect any unnatural practices.)

I always think:

"How do I get links from well-known, important sites to my site? Where will my quality links come from?"

Getting links from "brands" (or well-cited sites) in your niche can mean "quality links".

Easier said than done, for the most part, of course – but that's the point.

But the goal with your main website should always be to become an online brand.

Does Google prefer brands in the organic SERPs?

Okay, yes. It's hard to imagine a system like Google's wasn't deliberately developed over the last few years to deliver the listings it serves today – and those listings are full of sites with a high degree of authority in the areas their content covers.

Big brands have an inherent advantage in Google's ecosystem, and that kind of sucks for small businesses. But there are far more small businesses than big brands for Google to collect AdWords money from, too.

That said – small businesses can still succeed if they focus on a strategy built on depth, rather than breadth, in how content is structured page by page across the site.

Is domain age an important factor in Google's rankings?


No, not in isolation.

A decade-old domain that Google knows nothing about is the same as a brand-new domain.

A 10-year-old website that's been cited, year after year, with links from other authoritative and trusted websites? That's valuable.

But it isn't the age of your web address, on its own, that's in play as a ranking factor.

A one-year-old domain cited by authority websites is just as valuable as – if not more valuable than – a ten-year-old domain with no links and no search history.

Perhaps domain age can come into play once other factors are considered – but I think Google works in much the same way at every level, with all ranking factors and all ranking conditions.

I don't think you can talk about "ranking factors" without also considering "ranking conditions".

Other possible ranking factors:

  1. Domain age; (NOT ON ITS OWN)
  2. Length of domain registration; (I can't see much benefit in it. Google has noted that "valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year" – but if someone renews a domain for years simply so nobody else can have it, that's no indication the site will do anything Google cares about)
  3. Domain registration details hidden/anonymous; (PERHAPS – as a human-review signal, if the rest of the site looks spammy)
  4. Site top-level domain (geographical focus, e.g. .co.uk versus .com); (YES)
  5. Site top-level domain (e.g. .com versus .info); (DEPENDS)
  6. Subdomain or root domain? (DEPENDS)
  7. Past domain registrations (how often it has dropped); (DEPENDS)
  8. Past owners of the domain (how often the owner has changed); (DEPENDS)
  9. Keywords in the domain; (DEFINITELY – especially exact-match keyword domains – although Google is quietly filtering the performance of exact-match domains in 2016)
  10. Domain IP address; (DEPENDS – usually not)
  11. Domain IP neighbours; (DEPENDS – usually not)
  12. External mentions of the domain (non-linked); (I have no idea, in 2016)
  13. Geo-targeting settings in Google Webmaster Tools; (YES – of course)

Google penalises unnatural footprints

By 2016, you should be aware that whatever improves your rankings can also get you penalised (faster, and much more noticeably).

In particular, Google's webspam team is waging a PR war on sites that rely on artificial links and other manipulative tactics (and handing out severe penalties when it detects them). And that's on top of the many algorithms already designed to target other manipulative tactics (such as keyword stuffing or spun text across many pages).

Google is making sure it takes longer to see the results of black- and white-hat SEO alike, with the intent of keeping a constant flux in its SERPs – so watching, researching and waiting are now baked into ranking near the top of this search engine.

There are some things you cannot legitimately influence directly to improve your rankings, but there is a lot you CAN do to drive more Google traffic to a web page.

Ranking factors

Google has hundreds of ranking factors, with signals that can change daily, weekly, monthly or yearly, which together determine where your page ranks compared to the competing pages in the SERPs.

You will never discover every ranking factor. Many ranking factors are on-page or on-site; others are off-page or off-site. Some ranking factors are based on where you are, or what you have searched for before.

I've been in online marketing for 15 years. A lot has changed in that time. I've learned to focus on the aspects that offer the greatest return on investment of your labour.

Learn the basics of SEO…

Here are some simple SEO tips for getting started:

  • If you're just starting out, don't think you can fool Google all of the time, about anything. Google has very probably seen your tactics before. So it's best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a healthy, satisfying visitor experience. If you're just starting out, you might as well learn how to do it within Google's Webmaster Guidelines first. Decide early whether you'll follow Google's guidelines or not, and stick to that decision. Don't get caught in the middle with an important project. Don't always follow the herd.
  • If your goal is to trick visitors out of Google in any way possible, Google is not your friend. Google isn't exactly your friend anyway, but you don't want it as your enemy. Google will send you lots of free traffic if you manage to reach the top of the search results, so maybe they're not that bad.
  • Many optimisation techniques that are effective at ranking sites in Google are against Google's guidelines. For example, many links that might once have promoted you to the top of Google can now actively hurt your site and its ability to rank high in Google. Keyword stuffing can hold your page back… You need to be smart and careful when building links to your site, in a way that Google *hopefully* won't have too much of a problem with in the future. Because they will punish it in the future.
  • Don't expect to rank number 1 in any niche without a lot of investment and work. Don't expect results overnight. Expecting too much too fast can get you in trouble with the spam team.
  • You don't pay Google, Yahoo or Bing anything for natural/free listings. It's common for the major search engines to find your site quite quickly by themselves, within a few days. This is made much easier if your CMS actually "pings" the search engines when you update content (via XML sitemaps or RSS, for example).
  • To be listed and rank high in Google and other search engines, you should really consider – and mostly abide by – search engine rules and official guidelines for inclusion. With experience, and a lot of observation, you learn which rules can be bent, and which tactics are short-term and perhaps best avoided.
  • Google ranks sites (relevance aside for a moment) by the number and quality of incoming links from other sites (among hundreds of other metrics). Generally speaking, a link from one page to another is seen as a "vote" for the target page in Google's "eyes". The more votes a page gets, the more trusted it becomes, and the higher Google ranks it – in theory. Rankings are hugely affected by how much Google ultimately trusts the domain the page sits on. Backlinks (links from other websites) trump almost every other signal.
  • I've always thought that if you're serious about rankings – do it with ORIGINAL CONTENT. It's clear – search engines reward good, original content they haven't found before. It indexes blisteringly fast, for a start (within a second, if your site isn't penalised!). So make sure each of your pages has enough original text content that you've written specifically for that page – and you won't have to jump through hoops to get it ranking.
  • If you have original, quality content on a site, you also have a chance of generating quality inbound links (IBLs). If your content is simply found on other websites, it will be hard to get links, and you probably won't rank very well, because Google promotes diversity in its results. If you do have original content of sufficient quality on your site, you can let authoritative websites – those with online business authority – know it exists, and they might link to you. That's called a quality backlink.
  • Search engines need to understand that "a link is a link" it can trust. Links can be marked with the rel nofollow attribute to have search engines ignore them.
  • Search engines can find your website via links from other sites. You can also submit your site to the search engines directly, but I haven't submitted a site to a search engine in the last ten years – you probably don't need to either. If you have a new site, though, register it with Google Webmaster Tools these days.
  • Google and Bing use crawlers (Googlebot and BingBot) that spider the web looking for new links to pages. These bots might find a link to your homepage somewhere on the web, then crawl and index your pages – provided all your pages are linked to each other. If your site has an XML sitemap, Google will use it to include your content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE: Google will crawl and index every page of your site it can find, including pages not listed in the XML sitemap.
  • Many people think Google won't let any new site rank well for competitive terms until the web address "ages" and earns "trust" – I think it actually depends on the quality of the incoming links. Sometimes your site will rank highly for a while, then disappear for months. A "honeymoon period" to give you a taste of Google traffic, no doubt.
  • Google WILL classify your site when it crawls and indexes it – and that classification can have a DRASTIC effect on your rankings. It's important for Google to work out WHAT YOUR ULTIMATE INTENT IS – do you want to be classified as a thin site made for Google, a domain holding page, or a small business website with a real purpose? Make sure you don't confuse Google: be explicit, with every signal you can display on your site, that this is a real business, that your intent is genuine – and, more important than ever today, focused on satisfying the visitor.
  • NOTE – if a page exists only to make money from Google's free traffic, Google calls that spam. I'll come back to this later in this guide.
  • The transparency you provide on your site – in text and links, about who you are, what you do and how you're rated on the web or as a business – is one way Google could "qualify" your website (algorithmically and manually). Bear in mind that Google has a huge army of quality raters, and at some point they WILL be on your site if you get a lot of traffic from Google.
  • To rank for specific keyword searches, you usually need the key phrase, or very relevant words, on your page (not necessarily all together, but it helps) or in links pointing at your page/site.
  • Ultimately, what you have to do to compete depends largely on what the competition is doing for the term you're targeting. You need to at least match how hard they're competing, if a better opportunity is hard to spot.
  • As a result of other quality websites linking to your site, your site now has a certain amount of real PageRank, shared out among all the internal pages that make up your website, which will in future provide a signal helping those pages rank.
  • Yes, you need to build links to your site to acquire more PageRank, or Google "juice" – or what we now call domain authority or trust. Google is a link-based search engine – it doesn't fully understand "good" or "quality" content, but it does understand "popular" content. It can also identify poor, or THIN, content – and it penalises your site for that – or, at least, removes the traffic you once had via an algorithm change. Google doesn't like calling these demotions a penalty – it doesn't look good. They blame your ranking drops on their engineers getting better at identifying quality content and links – or the reverse: low-quality content and unnatural links. If your site did step over the line into paid links, they call it a "manual action" and you'll be notified about it in Webmaster Tools if you've registered.
  • Link building isn't just a numbers game, though. One link from a "trusted authority" site in Google's eyes could be all you need to rank in your niche. Of course, the more "trusted" links you attract, the more Google trusts your site. It's evident you need MULTIPLE trusted links from MULTIPLE trusted websites to rank at the top of Google in 2016.
  • Try to get links from within page text, with relevant keywords in the anchor pointing at your site – or at least natural link text – not, for example, from blogrolls or site-wide links. Try to make sure the links aren't obviously "machine generated", e.g. site-wide links on forums or directories. Get links from pages that in turn have lots of links pointing at them, and you'll soon see the benefit.
  • On-site, you should link to your other pages from within the main text content of your pages. I usually do that where it's relevant – often I'll link two pages together when the keyword appears in the title elements of both. I don't auto-generate links wholesale. Google has penalised sites for using particular auto-linking plugins, for example, so I avoid them.
  • Linking to a page with actual key phrases in the link helps a great deal in all search engines when you want to rank for specific keywords. For example: "SEO Scotland", as opposed to http://hobo-web.co.uk or "click here". That said – in 2016, Google is punishing manipulative anchor text very aggressively, so be sensible – and stick to brand anchors and URL links, which build authority with less risk. I rarely optimise for grammatically incorrect terms these days (especially in links).
  • I think anchor-text links in internal navigation are still valuable – but keep them natural. Google needs links to find and help categorise your pages. Don't underestimate the value of a smart, keyword-rich internal-link architecture, and mind details like how many words Google counts in a link – but don't overdo it. Too many links on a page can be seen as a poor user experience. And avoid any hidden links in your navigation template.
  • Search engines like Google "spider" or "crawl" your entire site by following all the links on it to new pages, much as a human would click on the links to your pages. Google will crawl and index your pages, and within a few days will usually start returning them in SERPs.
  • After a while, Google will know about your pages, and keep the ones it rates as "useful" – pages with original content, or pages with lots of links pointing at them. The rest will be de-indexed. Be careful: too many low-quality pages on your site will affect the site's overall performance in Google. Google has said that the low-quality content on part of a site can drag down rankings for the good content too.
  • Ideally, you'll have unique pages, with unique page titles and unique page meta descriptions, on purpose. Google doesn't appear to use the meta description when ranking your page for specific keyword searches if it's not relevant – and unless you're careful, you're just giving scrapers free original text to lift from your site – so put your important text in the main content of your pages, not only in descriptions. I don't bother with meta keywords these days, as Google and Bing say they either ignore them or use them as a spam signal.
  • Google will take some time to analyse your entire site, examining the text content and links. That process is taking longer and longer these days, but it's ultimately determined by your domain reputation and real PageRank.
  • If you have lots of duplicate, sloppy text that Googlebot has already found on other sites, Google will ignore your page. If your site or page shows signs of spam, Google will penalise it sooner or later. If you have lots of these pages on your site, Google will ignore most of your website.
  • Your text doesn't need to mention the keywords more often than the competition's does in order to beat them.
  • You can optimise a page for more traffic by increasing the frequency of the desired key phrase, related key terms, co-occurring keywords and synonyms in links, page titles and text content. There is no ideal amount of text, and no magic keyword density. Keyword stuffing is a tricky business, though, even these days.
  • I'd make sure I have as many relevant, UNIQUE words on the page as possible, to target the largest number of relevant long-tail topics.
  • If you link out to irrelevant sites, Google may ignore the page too – but again, it depends on the site in question. Who you link to, and HOW you link, REALLY MATTERS – I expect Google to use your linking practices as a potential means of classifying your site. Affiliate sites, for example, don't do well in Google these days without some good backlinks and higher-quality pages.
  • Many in search think that who links to you – and who you link to – helps determine a topical community of pages in any field, or a hub/authority relationship. You want to be at the centre of that community if possible (however unlikely), but at least in it. I like to keep this in mind for the future, as search engines get even better at determining the topical relevance of pages, but I have never seen a granular ranking benefit (for the page in question) from linking out.
  • I link out to other relevant sites (mostly from the deeper pages of my site's architecture) rather than from the pages that hold all your Google juice before it's been "watered down" as it flows through the upper structure of your site. This tactic is old school, but I still follow it. I don't think you need to worry much about it, even in 2016.
  • Original content is king, and it attracts "natural link growth" – in Google's opinion. Too many incoming links too fast might devalue your site, but then again, it depends. I usually err on the side of caution – I always aimed for massive diversity in my links, to make them look more "natural". Honestly, these days, I go for natural links in 2016, full stop, for this site.
  • Google can devalue whole sites, individual pages, template links and individual links if Google deems them "unnecessary" and a "poor user experience".
  • Google knows who links to you, the "quality" of those links, and who you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for the term. Once Google has worked out your domain authority, it sometimes seems to have no problem picking the most relevant page on your site.
  • Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages, and by making sure at least one page among the rest is properly optimised for your desired key phrase. Always remember that Google doesn't want to rank "thin" pages in its results – any page you want to rank should have all the things Google is looking for. That's a lot these days!
  • It's important you spread all that real "PageRank" – or link equity – to your sales keyword/phrase-rich pages, and keep it flowing to the rest of the site's pages too, so Google doesn't drop pages into oblivion – or "the supplemental results", as the old-timers called it back in the day. Again – a bit old school – but it gets me by, even today.
  • Consider linking to the important pages of your site from your homepage, and from other important pages on your site.
  • Focus on RELEVANCE first. Then concentrate your marketing efforts on getting REPUTABLE. This is the key to ranking "legitimately" in Google in 2016.
  • Every few months Google changes its algorithm to punish sloppy optimisation or industrial-scale manipulation. Google Panda and Google Penguin are two such updates, but the important thing is to understand that Google changes its algorithms constantly to control its listings pages (over 600 changes a year, we're told).
  • The art of rank modification is to rank without tripping these algorithms, and without being flagged by a human reviewer – and that's tricky!
  • Focus on improving your site's download speed at every opportunity. The web is changing very fast, and a fast website is a good user experience.
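On the XML-sitemap points above: a sitemap is just a small XML file your CMS can generate and ping to search engines whenever content changes. A minimal stdlib-only sketch (the URLs are hypothetical):

```python
# Build a minimal XML sitemap per the sitemaps.org protocol, using only
# the standard library. URLs here are illustrative placeholders.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string listing the given absolute URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # each <url> needs a <loc>
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/seo-tutorial"])
print(xml)
```

A real sitemap would usually add `<lastmod>` per URL, and your CMS would regenerate and resubmit the file on publish.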
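And on the "unique page titles and unique meta descriptions" point: once you have crawl data, auditing for duplicates is straightforward. A sketch, assuming a hypothetical `{url: (title, meta_description)}` mapping you would get from your own crawler:

```python
# Flag URLs that share the same <title> text across a crawl.
from collections import defaultdict

def find_duplicate_titles(pages):
    """pages: {url: (title, meta_description)} -> {title: [urls]} for dupes."""
    by_title = defaultdict(list)
    for url, (title, _desc) in pages.items():
        by_title[title.strip().lower()].append(url)
    # Keep only titles used by more than one URL.
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/a": ("SEO Services", "Audit and consultancy."),
    "/b": ("SEO Services", "Link-building help."),  # duplicate title
    "/c": ("Contact Us", "Get in touch."),
}
print(find_duplicate_titles(pages))  # {'seo services': ['/a', '/b']}
```

The same grouping applied to the description field catches duplicate meta descriptions.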

Welcome to the tightrope walk that is modern web optimisation.

Read on if you want to learn SEO…

Keyword research

The first step in any professional campaign is keyword research and analysis.

Someone asked me about simple white-hat tactics, and this is probably the easiest thing anyone can do that just about guarantees results.

The chart above (from last year) shows the ranking lifetime of a valuable four-word key phrase. I noticed the page wasn't ranking high in Google, yet I thought it deserved to – and that I could rank it with this simple technique.

I found a simple example to illustrate an aspect of on-page SEO, or "rank modification", that's 100% white hat, 100% Google-friendly and will never cause you a problem with Google.

This "trick" works with any keyword, anywhere, with varying degrees of visible success, depending on the availability of competing pages in the SERPs and the availability of content on your own site to compete with.

The key phrase I tested rankings for isn't on the page, and I didn't sneak in the dominant phrase anywhere else… not in inbound links, nor with technical tricks like redirects or cloaking – but, as you can see in the chart, the opportunity is there if you point Google in the right direction.

You can benefit from this if you know a little about how Google works (or seems to work, based on many years of observation – unless Google is throwing me a bone on synonyms. You can never be 100% certain about anything, at any level, unless the data shows you're wrong, of course.)

What did I do to rank number 1 from nowhere for that key phrase?

I added one keyword to the page, in plain text – because adding the actual key phrase itself would have made my text read like keyword spam for other variations of the main term. This gets interesting when you're working across many pages and many key phrases. The most important thing is to research your keywords – and to know which keywords to add.

This example illustrates a key point: on a page, in many cases, all it takes is ONE keyword.

The exact keyword.

Yes – lots of other things are happening at the same time. It's hard to identify exactly why Google ranks pages at any given moment… but you can count on other things happening, and just keep watching what you see working for you.

With this kind of light-touch optimisation, it's satisfying to win rankings for some terms the simple way – and let others ask you how you did it.

Of course, you can always keyword-stuff a page, or spam your link profile – but this is "light" optimisation, which is what I'm really interested in testing on this site – how to get more with less – and I believe the key is NOT aggressively tripping Google's algorithms.

There are many tools on the web to help with basic keyword research (including the Google Keyword Planner, and there are plenty of useful third-party SEO tools to help you do this too).

You can use many keyword research tools to quickly identify opportunities to get more visitors to a page.

I built my own:

Analytics keywords: "not provided".

Google Analytics used to be the best place to look at keyword opportunities for some sites (especially older ones), but that all changed a few years ago.

Google stopped telling us which keywords send visitors to our pages from its search engine in October 2011, as part of protecting user privacy.

Google's standard user searches are now encrypted, carried out over a secure connection at Google.com. The move to SSL search also means that websites people visit after clicking on Google's results no longer receive "referrer" data revealing what those people searched for – except from ads.

Instead, Google Analytics now shows the keyword as "not provided".

In the new Google system, referrer data is blocked. This means website operators lose valuable data they depend on to understand how visitors arrive at their sites through Google. They can still tell that someone came from a Google search, but they won't know what that search was. – Search Engine Land (an excellent source of industry news about Google)

You can still get some of this data if you sign up for Google Webmaster Tools (and connect it to Google Analytics), but the data is limited and often not the most accurate. The keyword data can still be useful, though – and access to the backlink data is essential these days.

If the site you're working on is an older site, there's probably a lot of historical keyword data in Google Analytics:

Do keywords in bold or italic help?

Some webmasters claim that putting keywords in bold, or in italics, is a positive ranking factor in search engine optimisation terms.

It's essentially impossible to prove that, and I think these days Google could well be using this (and other easily identifiable on-page optimisation efforts) to determine which sites to DEMOTE, not promote, in its SERPs.

Anything you can "optimise" on your page – Google can use against you when filtering its results.

I use bold or italics these days specifically for users.

I only use emphasis where it's natural, and where I genuinely want to emphasise something!

Don't hand Google easy signals it can quietly filter you with.

I think Google treats sites with different levels of trust quite differently, in certain respects.

In other words, highly trusted sites are treated differently from untrusted sites.

Keep it simple, natural, useful and random.

How many words do I need on a page?

I get asked this all the time –

How much text do you put on a page to rank for a particular keyword?

The answer is that there is no optimal amount of text per page; the amount of text you need is based on your domain AUTHORITY, the page’s relevance, how much competition there is for the term, and how COMPETENT that competition is.

Instead of thinking about the quantity of text, think more about the quality of the content on the page. Optimize it with the searcher’s intent in mind. Well, that’s what I do.

I do not think you need a minimum amount of text for a page to rank in Google. I have seen pages with 50 words outrank pages with 250, 500 or 1,000 words. Then again, I have seen pages with no text at all rank on nothing other than incoming links or another “strategy”. In 2016, Google is much better at weeding out those sites, however.

At the moment, I prefer long pages with lots of text, although I still rely heavily on keyword analysis to shape my pages. The advantage of long pages is that they are ideal for ranking on long-tail key phrases.

Creating information-rich, focused pages concentrates the mind when it comes to producing useful, authoritative content.

Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain they are hosted on. For me, the most important thing is to make a page relevant to a user’s search query.

I do not obsess about word count, and I often need to experiment on a site I am unfamiliar with. After a while, you get a feel for how much text you need to rank a page in a particular domain on Google.

One thing to keep in mind: the more text you add to a page, as long as it is unique, keyword-rich and relevant, the more that page will be rewarded with visitors from Google.

There is no optimal number of words on a page for placement in Google. Every site – every page – is different, from what I can see. Do not worry too much about word count if your content is original and informative. Google will probably reward you on some level – at some point – if there is a lot of unique text on all your pages.

Character counter tool

What is the perfect keyword density?


The short answer is – there isn’t one.

There is no single optimal keyword density, no guaranteed percentage that will rank any page at number 1. I do know, however, that you can keyword-stuff a page and trip a spam filter.

Most web optimization professionals agree there is no ideal percentage of keywords in the text for a page to rank #1 in Google. Search engines are not that easy to fool, although the key to success in many fields is simply doing the basics well (or at least better than the competition).

I write natural page copy where possible, always focused on the key concepts – I never calculate density to find the best percentage – there are too many other things to work on. I have studied it. If it reads naturally, that’s fine.

I aim to include related terms, long-tail variants and synonyms in the main content – at least once, as every page warrants.

Optimal keyword density is a myth, although plenty of people will claim otherwise.

“Things, not strings”

Google is getting better at working out what a page – and the intent of a searcher – is about, and it no longer relies solely on the keyword phrases on a page to do so.

The Google Knowledge Graph is being populated and expanded, and Google presents this information directly in the search engine results pages (SERPs).

Google has many options when rewriting a query in a contextual way, based on what you searched for previously, who you are, how you searched, and where you are at the time of the search.

Can I just write naturally and rank high in Google?

Yes, you must write naturally (and succinctly) in 2016, but if you have no idea which keywords to target and no experience of the subject, you will find yourself behind those who do have that experience.

You can “write naturally” and still rank, albeit for fewer keywords than you would have if you had optimized the page.

There are too many sites competing for the top positions for you to leave your content unoptimized.

Of course, how much text you need to write, how much you need to work on it, and ultimately how it ranks, depends on the domain reputation of the site you publish the article on.

Do you need lots of text to rank pages in Google?

User search intent is a way for marketers to describe what a user wants to accomplish when they perform a Google search.

SEOs have broadly grouped user search intent into the following categories, and Moz is an excellent place to start reading on the subject.

  • Transactional – The user wants to do something: buy, sign up, register – complete a task they have in mind.
  • Informational – The user wants to learn something.
  • Navigational – The user knows where they want to go.

Google’s human quality rater guidelines simplify these constructs to:

  • Do
  • Know
  • Go

As long as the page satisfies the user’s main intent, you can use as many words as you need.

You do not need a lot of text to rank in Google.

Optimize for user intent and satisfaction

When it comes to writing SEO-friendly text for Google, we must optimize for user intent, not simply for what a user typed into Google.

Google will serve people looking for information on a topic the highest-quality, most relevant pages it has in its database – often a far cry from how Google used to work, relying on close or exact-match instances of a keyword on a page.

Google is constantly evolving to better understand the context and intent of user behaviour, and it does not mind rewriting the user’s query to serve high-quality pages that fully satisfy user intent – for example, pages that explore topics and concepts in a unique and satisfying way.

Of course, optimizing for user intent, even in this fashion, is something many practitioners were doing long before query rewriting and Google Hummingbird came along.

Optimizing for the “Long Click”

When it comes to rating user satisfaction, there are a few theories doing the rounds at the moment that I think are sensible. Google could be tracking user satisfaction by proxy. When a user searches Google for something, their behaviour from that point on can be a proxy for the relevance and relative quality of the actual SERP.

What is a long click?

A user clicks a result and spends a long time on it, sometimes ending their search there.

What is a short click?

A user clicks a result and quickly bounces back to the SERP, pogo-sticking between other results until a long click is observed. Google has this information if it chooses to use it as a proxy for query satisfaction.

For more on this topic, I recommend this article on the long click.

Optimizing supplementary content on the page

Once you have the content, you need to think about the supplementary content and secondary links that help users on their journey of discovery.

That content can be links to other relevant pages on your own site, but if it genuinely helps a user understand a topic, it should also link out to other helpful resources, e.g. other websites. A website that does not link out to any other site could be interpreted as being, at the very least, self-interested. I cannot think of a single website that is the true endpoint of the web.

  • TASK – On informational pages, link to related pages on other sites and to related pages on your own site where relevant.
  • TASK – On e-commerce pages, ADD related products.
  • TASK – Create in-depth content.
  • TASK – Keep content fresh, prune old announcements, maximize conversions, and monitor for broken or redirected links.
  • TASK – Assign in-depth content to an author with online authority, or someone with visible expertise in the subject.
  • TASK – If you run a blog, clean it up first. To avoid creating pages that might be seen as thin content within six months, plan a broader content strategy. If you have published 30 “thin” pages on different aspects of a topic, you could roll them all into one central page that helps users understand the topic (and whatever it sells).

For more information on optimizing the main content, see: http://www.hobo-web.co.uk/how-to-write-seo-friendly-website-content-for-google/

Page title element


<title>What is the best title tag for Google?</title>

The page title tag (or, more accurately, the HTML title element) is perhaps the most important on-page ranking factor (in terms of web page optimization). Keywords in page titles can undoubtedly help your pages rank higher in Google’s results pages (SERPs). The page title is also often used by Google as the title of the search snippet link in the results pages.

For me, a perfect title tag in Google depends on a number of factors. I will define a few below, though I have expanded on page title advice on a separate page (link below);

  1. A page title that is highly relevant to the page it refers to will maximize its usability, its search engine ranking performance and its click-through satisfaction rate. It will probably be displayed in the title bar of a web browser window and in clickable search snippets in Google, Bing and other search engines. The title element is the “crown” of a web page, with your important key phrase featuring at least once within it.
  2. Most modern search engines have traditionally placed a lot of importance on the words contained in this HTML element. A good page title is made up of valuable keyword phrases with clear user intent.
  3. The last time I looked, Google displayed as many characters as fit into a “block element that is 512px wide and doesn’t expand beyond one line of text”. So – there is no set number of characters any optimizer can lay down as best practice to guarantee a title displays in full in Google, at least as the title of the search snippet. Ultimately, only the characters and words you use will determine whether your entire page title is displayed in a Google search snippet. Google used to show around 70 characters of a title, but that changed in 2011/2012.
  4. If you want your full title tag to appear in the desktop UK version of Google SERPs, stick to a shorter title of about 55 characters, but that does not mean your title tag MUST end at 55 characters – and remember your mobile visitors may see a longer title (in the UK, at least as of March 2015). I have seen “up to” 69 characters displayed (in 2012) – but, as I said, what you see in the SERPs depends on the characters you use. In 2016, I expect what Google displays to keep changing, so I do not obsess over what Google is doing in terms of display.
  5. Google is all about “user experience” and “visitor satisfaction” in 2016, so it is useful to remember that usability studies have shown a good page title is seven or eight words long and fewer than 64 characters. Longer titles are harder to scan in lists, may not display correctly in many browsers and, of course, are likely to be truncated in the SERPs.
  6. Google may INDEX perhaps 1,000 characters in a title… but I do not think anyone knows exactly how many characters or words Google will count AS a title when determining relevance for ranking purposes. It is very difficult to isolate exactly, with all the smoke and mirrors Google uses to hide its “secret sauce”. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
  7. You can probably include up to 12 words that will be counted as part of the page title, with your important keywords in the first eight words. The rest of your page title may be counted as normal text on the page.
  8. NOTE: in 2016, the HTML title element you choose for your page may not be what Google displays in its SERP snippet. The snippet title and description are very much QUERY-dependent these days. Google often chooses what it thinks is the most appropriate title for its search snippet, and it can use information from the page, or from links pointing to that page, to create a very different SERP snippet title.
  9. If you want to optimize a title for as many terms as possible, do not stuff keywords into it. Often, the best bet is to optimize for a particular phrase (or phrases) – and take a long-tail approach. Note that lots of optimized page titles and not enough actual page text could lead to Google Panda or other “user experience” performance issues. A highly relevant title is not enough to cover for thin page content. Google cares too much about the text content of a page today for a better title to carry a thin page on most websites.
  10. Some page titles do better with a call to action – one that reflects exactly the intent of the searcher (e.g. to learn something, to buy something, or to hire something). Remember, the title is the hook in the Google SERP when Google uses your page title in its search snippet, and there are many competing pages out there in 2016.
  11. The perfect title tag for a page is unique to the other pages on the site. In light of Google Panda, an algorithm that targets the “quality” of websites, you really need to make your page titles unique and minimize any duplication, especially on larger sites.
  12. I like to make sure my keywords feature as early as possible in a title tag, but the important thing is to have the important keywords and key phrases in your page title tag SOMEWHERE.
  13. For me, when search engine visibility is more important than branding, the company name goes at the end of the tag, and I use a variety of separators – no particular separator seems to work better than any other. If you have a recognizable brand, there is an argument for putting it at the front of your titles – although Google will often change your title dynamically, sometimes placing your brand name at the front of your snippet title itself.
  14. Keep in mind that Google is very good these days at removing special characters from your page titles – and I would be wary of trying to make your title or meta description stand out using special characters. That is not what Google wants, evidently, and it gives you other ways to make your search snippets stand out, with rich snippets and SCHEMA mark-up.
  15. I like to think I write titles for search engines AND people.
  16. Know that Google tweaks everything regularly – why would the perfect title be any different? So mix it up…
  17. Do not obsess. Natural is probably better, and will only improve as the engines evolve. Optimize for keyword phrases, not just keywords.
  18. I prefer mixed-case page titles, as I find them easier to scan than titles in ALL CAPS or all lowercase.
  19. Generally speaking, the more trust and authority your site has in Google, the easier it is for a new page to rank for something. So bear that in mind. There is only so much you can do with your page titles – your site’s Google rankings are determined much more by OFFSITE factors than ONSITE ones – negative and positive.
  20. Click-through rate is something that is probably measured by Google when ranking pages (Bing has said they use it, and they now power Yahoo), so it is worth considering whether you are optimizing your page titles for click-through rate or simply for more placements in the search engines.
  21. I think keyword stuffing your page titles could be penalized by Google (although I see little evidence of it).
  22. Remember… think “keyword phrase” rather than “keyword, keyword, keyword”… think long tail.
  23. Google will select the best title it wants for your search snippet, and it can take that information from multiple sources, not just your page title element. A short title is often appended with more information about the domain. Sometimes, when Google recognizes the brand name, it will add it itself (often prefixing your snippet title with it and a colon, or sometimes appending the name of the actual site the page belongs to at the end of the snippet title).
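
Pulling several of the points above together, a title element written along these lines might look like the following (the phrase and brand name here are only illustrative):

```html
<head>
  <!-- Key phrase first, brand last, roughly 55 characters or fewer
       so the title displays in full in most desktop SERP snippets -->
  <title>SEO Tutorial for Beginners - Free White Hat Tips | Hobo</title>
</head>
```

One unique title per page, the key phrase once, and a human-readable promise of what the page delivers.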

A note on title tags;

When you write a page title, you have a chance, right at the very beginning of the page, to tell Google (and other search engines) whether this is a spam site or a quality site – for instance, have you repeated the keyword four times, or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term, if possible.

I always aim to keep my HTML page title elements as simple and as unique as possible.

I am forever cleaning up the way I write my titles.

Read more:

Title tags Best Practice

How Google treats the title element and the H1–H6 headings

External links


Meta Keywords Tag


A hallmark of a dubious search engine “optimization” company – the meta keywords tag. Companies that waste time and resources on these elements are wasting their clients’ money – that is a fact:

<meta name="keywords" content="s.e.o., search engine optimization, optimization">

I have one piece of advice for the meta keywords tag, which, like the title tag, goes in the head section of your web page: forget it.

If you rely on meta keyword optimization to rank, you are dead in the water. From what I see, Google and Bing ignore meta keywords – or, at least, place no weight on them when ranking pages. Yahoo may read them, but really, a search engine optimizer has more important things to worry about than this nonsense.

What about the other search engines that do use them? Hang on while I submit my site to those 75,000 other engines first [sarcasm!]. Yes, ten years ago, early search engines liked looking at your meta keywords. I have seen OPs in forums debating the best way to write these tags – with commas, with spaces, limiting the number of characters. Forget the meta keywords tag – it is a pointless waste of time and bandwidth. We could probably save a rainforest with the bandwidth costs we would cut if everyone removed their keyword tags.

Tin foil hat time

So you have a new website. You fill your home page meta tags with the 20 keywords you want to rank for – hey, that’s what optimization is all about, isn’t it? You have just told Google, by the third line of text, what to filter you for. The meta name="keywords" tag was originally intended for words that were not actually on the page, to help classify the document.

Sometimes, competitors might use the information in your keywords tag to determine what you are trying to rank for, too…

If everyone stopped misusing and abusing the meta keywords tag, Google would probably start looking at it again, but that is not how things work in search engines.

I ignore the meta keywords tag, and remove it from pages I work on.

Meta Description Tag


Like the title element, and unlike the meta keywords tag, this one is important, both from a human and a search engine perspective.

<meta name="description" content="Get your site on the first page of Google, Yahoo and Bing. Call us on +8801814302010. A company based in Bangladesh.">

Remember to include your keyword or key phrase in it, make it relevant to the search, and write it for people, not search engines. In a snippet of about 20 words, describe exactly what the page is about; if you want the page to rank for one or two keyword phrases that people actually use in Google, make sure those keywords are present.

I should say that I usually include the keyword in the description because it usually gets used in the SERP snippet.

Google looks at the description, but there is debate about whether it uses the tag when ranking sites. I think it might, at some level, but it is a very weak signal at best. I do not know of a single example that clearly shows a meta description helping a page rank.

Sometimes I will ask a question in my title and answer it in the description; sometimes I will just give a hint.

That is a lot harder in 2016, as search snippets change depending on what Google wants to emphasize for its users.

It is also very important that every page on your site has a unique meta description.

Tin foil hat time

Sometimes I think that if your titles are spammy, your keywords are spammy, and your meta description is spammy, Google might well stop right there – it probably wants to save bandwidth at some point. Putting a keyword in the description will not take a crap site from number 50 to number 1 or leapfrog the competition – so why optimize for a search engine when you can optimize for a person? – I think that is much more worthwhile, especially if you are already in the mix somewhere – that is, on the first page for your keyword.

So is the meta description tag important for your listings in Google, Yahoo, Bing and all the other engines? It is very important to get it right.

Do it for humans.

Oh, and by the way – Google seems to truncate anything after about 156 characters in the meta description, although in 2016 this is determined by pixel width rather than a character count.
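
For instance, a hand-written description that stays under the ~156-character cut-off, includes the key phrase once and reads like a sentence aimed at people (the wording here is purely illustrative):

```html
<meta name="description" content="A free SEO tutorial for beginners covering page titles, meta tags, headings and search engine friendly URLs - all white hat and within Google's guidelines.">
```

One per page, unique, with the keyword present so Google can highlight it in bold in the snippet.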

Generating meta descriptions on large sites

Google has said it is fine to programmatically generate unique meta descriptions based on the content of the page.

Consider this example from Google:

<meta name="description" content="Author: Arif, Illustrator: Mary Grandpre, Category: Books, Price: $17.99, Length: 784 pages">

… and why Google gives this advice:

“There’s no duplication, more information, and everything is clearly tagged and separated. No additional work is required to generate anything of this quality: the price and length are the only new data, and they are already displayed on the site.”

I think it is very important to listen when Google tells you to do something in a very specific way, and Google gives clear guidelines in this area.

Read more:


External links


Robots Meta Tag

Example robots meta tag;

<meta name="robots" content="index, nofollow" />

You can use the robots meta tag above to tell Google to index a page but not to follow any of the links on it, or to keep a page out of Google’s search results entirely, for whatever reason.

“By default, Googlebot will index a page and follow links to it. So there’s no need to tag pages with content values of INDEX or FOLLOW.” – GOOGLE

There are various directives you can use in your robots meta tag, but remember that Google indexes pages and follows links by default, so you do not need to include those as commands – you can leave the robots meta tag off the page entirely – and probably should if you are unsure.

“Googlebot understands any combination of lowercase and uppercase.” – GOOGLE

Valid values for the robots meta tag’s “CONTENT” attribute are: “INDEX”, “NOINDEX”, “FOLLOW” and “NOFOLLOW”.

Example uses:

<meta name="robots" content="NOINDEX, FOLLOW">

<meta name="robots" content="INDEX, NOFOLLOW">

<meta name="robots" content="NOINDEX, NOFOLLOW">

<meta name="robots" content="NOARCHIVE">

<meta name="googlebot" content="NOSNIPPET">

Google will interpret the following robots meta tag values as:

  • NOINDEX – prevents the page from being included in the index.
  • NOFOLLOW – prevents Googlebot from following any links on the page. (Note this is different from the link-level NOFOLLOW attribute, which prevents Googlebot from following an individual link.)
  • NOARCHIVE – prevents a cached copy of the page from being available in the search results.
  • NOSNIPPET – prevents a description from appearing below the page in the search results, and also prevents caching of the page.
  • NOODP – blocks the Open Directory Project description of the page from being used in the description that appears below the page in the search results.
  • NONE – equivalent to “NOINDEX, NOFOLLOW”.
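
Putting the directives together: a page you want kept out of Google’s index, but whose links you still want crawled, could carry something like this in its head (remembering that case does not matter):

```html
<head>
  <!-- Keep this page out of the index, but let Googlebot
       follow the links it finds on it -->
  <meta name="robots" content="noindex, follow">
  <!-- A directive aimed at Google alone can use the googlebot name -->
  <meta name="googlebot" content="nosnippet">
</head>
```

Since INDEX and FOLLOW are the defaults, a page with no robots meta tag at all behaves as "index, follow".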

META Robots Quick Reference

Directive            Googlebot   Slurp   BingBot   Teoma
NOSNIPPET            YES         NO      NO        NO
NOIMAGEINDEX         YES         NO      NO        NO
NOTRANSLATE          YES         NO      NO        NO
UNAVAILABLE_AFTER    YES         NO      NO        NO


I have included the robots meta tag in my tutorial as it is one of the few meta tags / HTML head elements I focus on when it comes to managing Googlebot and BingBot. At page level, it is a powerful way to control whether your pages are returned in the search results.

These meta tags go in the [HEAD] section of an [HTML] page, and these are about the only head tags that interest me for Google. Almost anything else you can put in the [HEAD] of your HTML document is fairly inconsequential, and perhaps even useless (for Google optimization, anyway).

If you want to control which pages Google crawls and indexes, read my beginner’s article on robots.txt.

External links




H1–H6: Page headings


I cannot find any documentation online that says you must use heading tags (H1, H2, H3, H4, H5, H6) to rank in Google, and I have seen pages rank well in Google without them – but I do use them, especially the H1 tag on the page.

For me, it is another piece of a “perfect” page, in the traditional sense, and I try to build a site for both Google and humans.

<h1>This is a page heading</h1>

Typically, I only use one <h1> heading tag on a page, for targeted keywords – I believe this is the way the W3C intended it to be used in HTML4 – and I make sure it sits near the start of the page, is relevant to the page text, and is written with my primary keywords or related key phrases included.

I have never had any problems using CSS to control the appearance of heading tags, making them bigger or smaller.

You can use multiple H1 elements in HTML5, but most of the sites I work on still use HTML4.

I use as many H2–H6 tags as necessary, depending on the size of the page, but mostly just H1, H2 and H3. You can read more on how to use header tags properly here (essentially: keep them in order, and whatever you do, give your users the best user experience).

How many words in the H1 tag? As many as I think makes sense – as short and snappy as possible, and as natural as possible.

I have also noticed that Google will use your heading tags, much like page titles, at a certain level when your title element is faulty.

As always, make sure your heading tags are highly relevant to the content on the page and not too spammy.
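
A minimal sketch of the heading structure described above – one h1 near the top of the page, h2s for the main sections and h3s beneath them (the heading text is just placeholder copy):

```html
<body>
  <h1>Main page heading with the primary key phrase</h1>
  <h2>First sub-topic</h2>
  <h3>A detail under the first sub-topic</h3>
  <h2>Second sub-topic</h2>
</body>
```

Size and weight are styled with CSS; the tags themselves describe the document outline.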

ALT Tags


NOTE: ALT tags are counted by Google (and Bing), but I would be careful about over-optimizing them. I have seen many websites penalized for over-optimizing invisible elements on a page. Do not do it.

ALT tags are very important, and I think a very rewarding area to get right. I always include the main keyword in an ALT once when optimizing a page.

Do not optimize your ALT tags (or, rather, ALT attributes) just for Google!

Use ALT tags (or, rather, the ALT attribute) for descriptive text that helps visitors – and keep them accurate.

Do not obsess. I have run simple tests using ALT attributes to see roughly how many words Google will count. If an image is purely decorative, give it an empty ALT attribute (ALT="", i.e. NULL) so that people using screen readers can still enjoy your site.

Updated 17/11/08 – picked up in this Search Engine Roundtable post on ALT tags:

Google’s JohnMu: “ALT attributes should describe the image. So if you have a picture of a big blue pineapple chair, you should use the ALT tag that best describes it, which is alt=“big blue pineapple chair”. The title attribute should contain information about what will happen when you click on the image. For example, if the image links to a larger version, you could use title=“View a larger version of the big blue pineapple chair photo”.”

Barry continues with a quote:

“As the Googlebot does not see the images directly, we generally concentrate on the information in the “alt” attribute. Feel free to supplement the “alt” attribute with “title” and other attributes if they provide value to your users! For example, if you have a picture of a puppy (the cute kind, of course) playing with a ball, you could use something like “My puppy Betsy playing with a bowling ball” as the alt attribute for the image. If you also have a link to a larger version of the image, you could use “View this image in high resolution” as the title attribute for the link.”
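
Following the advice in the two quotes above, the image and link mark-up might look like this (file names and wording are invented for illustration):

```html
<!-- The alt text describes the image itself; the title on the
     link describes what happens when you click it -->
<a href="puppy-large.jpg" title="View a larger version of this photo">
  <img src="puppy-small.jpg" alt="My puppy Betsy playing with a bowling ball">
</a>
```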

Link Title Attributes, Acronym and ABBR Tags

Does Google count the text in the acronym tag?


Not in my tests. From my observations of a series of test pages – Google ignores keywords in the acronym tag.

My observations from a test page:

  • Link title attribute – no benefit is passed via a link to another page, as far as I can see
  • ABBR (abbreviation tag) – no
  • Image file name – none
  • Wrapping words in SCRIPT tags – sometimes. Google is better at understanding what it renders in 2016.

It is clear that many invisible elements on a page are completely ignored by Google (which matters to those of us who care about SEO).

Some invisible elements are, apparently, still counted:

  • NOFRAMES – yes
  • NOSCRIPT – yes
  • ALT attributes – yes

Unless you are really interested in the finer points, I believe the **P** tag is the most important tag to optimize in 2016.

Search Engine Friendly (SEF) URLs

Clean URLs (or search engine friendly URLs) are just that: clean, easy to read, and simple.

You do not need clean URLs for Google to spider a site successfully (confirmed by Google in 2008), although I do use clean URLs as standard these days, and have done for years.

It is often helpful.

Is there a big ranking difference when using clean, keyword-rich URLs in Google?

No – in my experience it is much more a second- or third-order factor, perhaps even less, when used on its own. However, there is a detectable benefit to keyword-rich URLs.

I think clean URLs are a good idea, and I optimize this way myself.

It is difficult to isolate any benefit from the following factors, though:

Where any benefit is easily detectable is when people (e.g., in forums) link to your site using the URL itself as the link text.

Then it is fair to say you get a boost because the keywords are in the actual anchor text of the link to your site, and I think that is the case – but, again, it depends on the quality of the page linking to your site. That is, whether Google trusts it and passes along PageRank (!) and anchor text benefit.

And, of course, you need people linking to and quoting your site’s content in the first place.

Sometimes I will remove the stop words from a URL and leave the important keywords, as in the page title, because a lot of forums mangle a URL to shorten it. Most forum links will be nofollowed in 2016, to be fair, but some old habits die hard.

Sometimes I prefer to see the exact phrase I am targeting as the name of the URL I am asking Google to rank.

I set up URLs as follows:

  1. hobo-web.co.uk/?p=292 – the default, which is automatically rewritten by the CMS using URL rewriting to
  2. hobo-web.co.uk/websites-clean-search-engine-friendly-URLs/ – which I might then trim down to something like
  3. hobo-web.co.uk/search-engine-friendly-URLs/

Keep in mind that Googlebot can crawl pages with dynamic URLs; it is assumed by many webmasters that there is a greater risk it will give up if the URLs are deemed unimportant, or contain multiple variables and session IDs (a theory).

As standard, I use clean URLs where possible on new sites these days, and I try to keep the URLs as simple as possible without obsessing over it.

That is my goal at all times when I optimize a website to work better in Google – simplicity.

Google does look for keywords in the URL, even at a granular level.

Having a keyword in your URL can be the difference between your site ranking and not ranking – a potentially useful advantage to exploit for long-tail search queries. For more, see: does Google count a keyword in the URI (filename) when ranking a page?

Absolute or relative URLs

My advice would be to be consistent with whichever you choose.

I prefer absolute URLs. That is just a preference. Google will crawl either if your site is set up properly.

A relative URL is relative only to the document the link sits in.

Move that page to another folder and the link will not work.

With an absolute URL, it would.

Subdirectories or files for URL structure

Sometimes I use subfolders and sometimes I use files. I have never been able to decide whether there is a real benefit (in terms of higher rankings) either way. Many CMSs currently use subfolders by default, so I am confident Google can deal with either.

I used to prefer files like .html when I was building a new site from scratch, as they were the “end of the line” for search engines, as I imagined it, and a subfolder (or directory) was a collection of pages.

I used to think a subfolder could take a little longer to earn trust than a single file, and I think that is why I stuck with files on most of the websites I built (back in the day). Once subfolders are trusted, though, it is six of one and half a dozen of the other; placement in Google is usually determined more by how relevant or reputable a page is for the query.

In the past, files were handled differently from subfolders (in my experience).

Subfolders could be trusted less than other subfolders or pages on your site, or ignored entirely. Subfolders *seemed* to take a little longer to be indexed by Google than, say, .html pages.

People talk about trusted domains, but they rarely mention (or do not believe) that parts of a domain can be trusted less. Google treats some subfolders….. differently. Well, it used to – and remembering how Google used to handle things still helps – even in 2016.

Some say you should not go more than four levels of folders deep in a file path. I have not experienced too many problems with that, but you never know.

UPDATE – I think in 2016 there is even less to worry about here. There are far more important things to check.

What is best for Google? PHP, HTML or ASP?

Google does not care. As long as it renders as a browser-compatible document, it appears Google can read it these days.

I prefer PHP these days, even for flat documents, as it is easier to add server-side code to the document later if I want to add some sort of function to the site.

Does W3C valid HTML / CSS help you rank?


Above – a Google video confirming the advice I first shared in 2008.

Does Google rank a page higher because of valid code? The short answer is no, even though I did once test it on a small scale, with different results.

Google does not care if your page is valid HTML and valid CSS. This is clear: check the top ten results in Google for almost any query and you will probably find that most contain invalid HTML or CSS. I love creating accessible websites, but they are a bit of a pain to manage when you have multiple authors or developers working on a site.

If your site is so badly designed, with so much bad code, that even Google and browsers cannot read it, then you have a problem.

Where possible, if commissioning a new website, demand at least minimum website accessibility compliance (there are three priority levels to meet), and aim for valid HTML and CSS. It is actually the law in some countries, although you would not know it, and the work involved helps keep your skill set current.

Valid HTML and CSS are a pillar of best-practice website optimisation, not necessarily a part of professional search engine optimisation. They are a form of optimisation Google will not penalise you for.

Addition – always follow W3C recommendations; they aim to provide a better user experience:

Hypertext links. Use text that makes sense when read out of context. W3C Accessibility Top 10 Tips

Point internal links to relevant pages

I link to relevant internal pages in my content when necessary.

I silo any relevance or trust mainly via links in text content and secondary menu systems, and between pages that are relevant in context to one another.

I do not worry about perfect siloing techniques any more, and I do not worry if I need to link out to another category, because I think the 'boost' many proclaim from strict siloing is minimal on the size of sites I usually manage.

I am not obsessed with site architecture as much as I was in the past… but I always make sure the pages I want indexed are reachable from a crawl of the home page, and I still stress the importance of specific pages by linking to them where relevant. I always try to get the most important exact-match anchor text pointing at a page via internal links – but I avoid abusing internal links, and avoid overtly manipulative internal links that are not grammatically correct, for instance.

There is no set method I have found that works on every site, other than linking to related internal pages often, without overdoing it, and where appropriate.

What are SERP sitelinks?

If Google knows enough about the history or relationships of a website (or web page), it will sometimes display what are called sitelinks (or mega-sitelinks) under the URL of the website in question.

This results in an improved snippet in the SERP.

Sitelinks are usually triggered when Google is confident this is the site you are looking for, based on the search terms you used.

Sitelinks are often reserved for navigational queries with a heavy brand bias – a brand name or a company name, for instance, or the website address.

I have followed the evolution of Google sitelinks in organic listings over the years, and they appear to be selected based on a number of factors.

How do you get Google sitelinks?

The pages that appear in sitelinks are often the popular pages on your site, in terms of internal or external links, or user engagement – or even the most recent posts published on your blog.

Google likes to mix these up a lot, perhaps to offer some variety, and probably to minimise or prevent manipulation of the results.

Sometimes a page is displayed that makes me scratch my head as to why Google selected it.

If you have no sitelinks, have a little patience and focus on other areas of your web marketing, like adding more content, or building PR or social activity around your site.

Google will grant your site sitelinks for certain terms once Google is confident your site is the destination the user wants.

That could take a week or several months, but the more popular the site is, the more likely Google will catch up fast.

Sitelinks cannot be turned on or off, although you can control, to some degree, which pages are selected as Google sitelinks. You can do that in Google Webmaster Tools, AKA Search Console.

Link out to related sites

In terms of on-page SEO best practice, I usually link out to other quality websites where it is relevant and helpful to a human.

I do not link out to other sites from the home page. I want the home page's PageRank shared only with my internal pages. I do not like linking out to other sites from my category pages either, for the same reason.

I link out to other relevant sites (a deep link to an internal page where possible) from individual pages, and I do it often, usually. I do not worry about link equity or 'PR leak' because I control it on a page-to-page level.

This works for me; it allows me to share the link equity I have with other sites while ensuring it is not at the expense of pages on my own domain. It may even help get me into a 'neighbourhood' of relevant sites, especially when some of those begin to link back to my site.

Linking out to other sites, especially using a blog, also helps tell others who might be interested in your content that your page is 'here'. Try it.

I do not abuse anchor text, but I will be considerate, and usually try to link out to a site with keywords these bloggers/site owners would appreciate.

The recently leaked quality raters' guidelines document clearly instructs raters to identify how helpful or useful your SUPPLEMENTARY NAVIGATION options are – whether you link out to other internal pages or to pages on other sites.

Broken links are a waste of link power

The simplest piece of advice I ever read about website optimisation, years before I created this site, is still useful today:

Make sure all your pages link to at least one other page on your site

This advice is still sound today and, in my opinion, the most important advice out there.

Check your pages for broken links. Seriously, broken links are a waste of link power and could hurt your site, drastically in some cases.

Google is a links-based search engine. If your links are broken and your site is littered with 404s, you might not be in the race.
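Checking for broken links can be partly automated. Below is a minimal sketch in Python, using only the standard library, of the first step: extracting every link target from a page's HTML. (The second step – fetching each URL and checking its HTTP status – needs network access, e.g. via `urllib.request`; the class and function names here are my own, purely illustrative.)

```python
# A minimal sketch of the link-extraction step of a broken-link checker.
# Feed it a page's HTML; it returns every href found on <a> tags.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

if __name__ == "__main__":
    sample = '<p><a href="/good-page/">ok</a> <a href="/old-page/">dead?</a></p>'
    print(extract_links(sample))  # ['/good-page/', '/old-page/']
```

Run each extracted URL through an HTTP status check and investigate anything returning 404.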

Here is the second-best piece of advice, in my opinion, seeing as we are talking about website architecture:

Link to your important pages often, internally, with varying anchor text in the navigation and in page text content

Especially if you do not have a lot of PageRank.

Does only the first link on a page count in Google?

Does the second anchor text link on a page count?

One of the more interesting discussions in the webmaster community of late has been about trying to determine which links Google counts as links on pages on your site. Some say the link Google finds higher in the code is the link Google will 'count', if there are two links on a page going to the same destination.

I tested this (a while ago now) by varying the position of the 'first' internal link on a page.

For example (and I am talking internally here – if I have a page and on it are two links, both going to the same page)… (OK – hardly scientific, but you get the idea).

Will Google only 'count' the first link? Or will it read the anchor text of both links, and give my target page the benefit of the text in both links, especially if the anchor text on the two links is different? Will Google ignore the second link?

What is interesting to me is that the answer leaves you with a follow-on question: if your navigation array already links to your main pages, are the links within your text content ignored, or at least devalued?

I think links in body text are invaluable. Does that mean placing the navigation below the copy, to get a wide and varied internal anchor text onto a page?


As I said, I think this is one of the more interesting talking points in the community right now, and perhaps Google works differently with internal links than with external links to other websites.

I think this could probably change from day to day if Google pressed a button, but I would not optimise a website expecting only the first link on a page to count – based on what I see when I test – and I certainly would not repeat the same link from page copy to the same page on client sites, unless it is useful to visitors.

Duplicate content penalty

Webmasters are often confused about getting penalised for duplicate content, which is a natural part of the web landscape, especially at a time when Google claims there is no duplicate content penalty, per se.

The reality in 2016 is that if Google classifies your duplicate content as THIN content, then you DO have a very serious problem that hurts the site's performance, and Google's recommendation is to 'clean up' this 'violation'.

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin…

It is very important to understand that in 2016, as a webmaster, if you republish posts, press releases, news stories or product descriptions found on other sites, then your pages will most certainly struggle to gain traction in Google's SERPs.

Google does not like to use the word 'penalty', but if your entire site is made up of republished content, Google does not want to rank it.

If you have a multi-site strategy selling the same products – you are likely to cannibalise your long-term traffic, rather than dominate a niche, as you formerly might have been able to do.

This is all down to how the search engine deals with duplicate content found on other sites, and the experience Google wants to deliver to its users.

Mess up the handling of duplicate content on a website, and it might look like a penalty from Google, as the end result is the same: important pages that once ranked might no longer rank, and new content might not be picked up quickly.

Your site could even receive a 'manual action' for thin content. The worst-case scenario is your site being hit by Google's Panda algorithm.

A good rule of thumb: do not expect to rank high in Google with content found on other, more trusted sites – and do not expect to rank at all if all you are using is automatically generated pages with no 'value add'.

Tip: Do not repeat text, not even your own, across too many pages of your website.

Read more:

  • Which pages on your website are hurting your rankings?
  • Ranking algorithm change causes drop
  • Google's duplicate content advice


Double or indented listings in Google

How do you get double or indented listings in the Google SERPs? How do you get two listings from the same website into the top ten results in Google, instead of the normal one (with ten different sites)?

Generally, this means you have at least two pages with enough link equity to reach the top ten results – two very relevant pages for the search term. In 2016, however, it might also be a sign of Google testing different sets of results, for instance by merging two indexes where a site ranks differently in both.

You can achieve this with relevant pages, good internal structure and, of course, links from other websites. It is far easier to achieve in less competitive verticals, but in the end it often comes down to domain authority and high relevance for a particular key phrase.

Redirect non-WWW to WWW

Your site probably has canonicalisation issues (especially if you have an e-commerce website), and they start at the domain level.

In short, http://www.hobo-web.co.uk/ can be treated by Google as a different URL from http://hobo-web.co.uk/ even though it is the same page, and it gets even more complicated from there.

It is thought your REAL PageRank can be diluted if Google gets confused about your URLs, and you simply do not want that PR diluted (in theory).

That is why many people, me included, redirect non-www to www (or vice versa) if the site is on a Linux/Apache server (using the .htaccess file):

Options +FollowSymLinks

RewriteEngine On

RewriteCond %{HTTP_HOST} ^hobo-web.co.uk [NC]

RewriteRule ^(.*)$ http://www.hobo-web.co.uk/$1 [L,R=301]

Basically, this redirects all your 'Google juice' to one canonical version of each URL.

In 2016 – this is a must-have best practice.

It keeps things simple when optimising for Google. It should also be noted: it is best not to mix the two types of www/non-www links to your internal pages on the website itself!

Note: in 2016, Google also asks which domain you prefer to set as your canonical domain, in Google Webmaster Tools.

301 redirects are powerful and white hat

Instead of returning a 404 or another response telling Google the page is not here, you can permanently redirect an expired page to a highly similar page, to pool any link equity the old page might have.

My general rule of thumb is to make sure the information (and keywords) from the old page appear on the new page – stay safe.
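For example, a permanent redirect from an expired page to its closest equivalent takes one line in .htaccess. This is a hedged sketch – the paths and domain are hypothetical placeholders:

```apache
# Permanently (301) redirect a retired page to its closest replacement,
# consolidating the old URL's link equity into the new page.
Redirect 301 /old-page/ http://www.example.com/new-page/

# mod_rewrite equivalent, useful when a URL pattern is involved:
# RewriteRule ^old-page/?$ http://www.example.com/new-page/ [L,R=301]
```

A `Redirect 301` directive must stay in the .htaccess file for as long as you want the redirect to keep working.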

Most already know the power of a 301 redirect and how it can be used to power even totally unrelated pages to rank in Google for a while – sometimes for a very long time.

Google seems to treat server-side redirects as perfectly fine, so I use them.

You can change the focus of a redirect, but that is a little too grey hat for me and can be abused – I do not talk about that sort of thing on this blog. It is worth knowing, though, that you need to keep these redirects in place in your .htaccess file for as long as you want them to work.

Redirecting multiple old pages to one new page works for me, as long as the information that ranked the old pages is present on the new page.

NOTE – This tactic is being heavily abused in 2016. Beware of redirect spam. I believe I have seen penalties transferred via 301s. Do not blindly 301 redirect old URLs to your home page, either. It would also be unwise to redirect too many bad-quality links to one URL. If you need a page for old URLs to redirect to, consider your sitemap or contact page. Check the backlinks of every page before you redirect it to an important page.

I still see odd 301s working in 2016 – although they seem to take a little longer to have an effect.

Tip: A good tactic at the moment is to consolidate old, thin pages that Google ignores into bigger, better articles.

Usually, when you 301 all the old pages to a single source, the link equity is consolidated and shared. As long as the intention is to serve users and create something more up to date – Google is fine with it.

The canonical link element is your best friend


When it comes to Google SEO, the rel=canonical link element has become *VERY* important over the years – and it has never mattered more.

It is used by Google, Bing and other search engines to help them specify the page you want to rank out of duplicate and near-duplicate pages found on your site, or on pages elsewhere on the web.

In the video above, Matt Cutts of Google shares tips on the (then new) rel="canonical" tag (more accurately, the canonical link element) that all the top three search engines now support.

Google, Yahoo! and Microsoft agreed to work together in a

"joint effort to help reduce duplicate content for larger, more complex sites, and the result is the new canonical tag".

Example of the canonical tag, from the Google Webmaster Central blog:

<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />

The process is simple. You can put this link tag in the head area of any duplicate-content URLs you have, pointing at the master URL, if you think you need it.

I add a self-referential canonical link element as standard these days – on any site.
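A self-referential canonical simply points at the page's own preferred URL, so any parameter-laden duplicates of the page consolidate to it. A sketch of what that looks like (the URL is a hypothetical placeholder):

```html
<!-- In the <head> of http://www.example.com/widgets/ itself: -->
<head>
  <title>Widgets</title>
  <!-- Self-referential canonical: declares this URL the preferred version,
       so variants like /widgets/?sort=price consolidate to it. -->
  <link rel="canonical" href="http://www.example.com/widgets/" />
</head>
```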

Is rel="canonical" a hint or a directive?

It’s a hint that we honor strongly. We’ll take your preference into account, in conjunction with other signals, when calculating the most relevant page to display in search results.

Can I use a relative path to specify the canonical, such as <link rel="canonical" href="product.php?item=swedish-fish" />?

Yes, relative paths are recognized as expected with the <link> tag. Also, if you include a <base> link in your document, relative paths will resolve according to the base URL.

Is it okay if the canonical is not an exact duplicate of the content?

We allow slight differences, e.g., in the sort order of a table of products. We also recognize that we may crawl the canonical and the duplicate pages at different points in time, so we may occasionally see different versions of your content. All of that is okay with us.

What if the rel="canonical" returns a 404?

We’ll continue to index your content and use a heuristic to find a canonical, but we recommend that you specify existent URLs as canonicals.

What if the rel="canonical" has not yet been indexed?

Like all public content on the web, we strive to discover and crawl a designated canonical URL quickly. As soon as we index it, we’ll immediately reconsider the rel = “canonical” hint.

Can rel="canonical" be a redirect?

Yes, you can specify a URL that redirects to a canonical URL. Google will then process the redirect as usual and try to index it.

What if I have contradictory rel="canonical" designations?

Our algorithm is lenient: We can follow canonical chains, but we strongly recommend that you update links to point to a single canonical page to ensure optimal canonicalization results.

Can this link tag be used to suggest a canonical URL on a completely different domain?

** Update on 12/17/2009: The answer is yes! We now support a cross-domain rel="canonical" link element. **

More reading at http://googlewebmastercentral.blogspot.co.uk/2009/02/specify-your-canonical.html

Do I need an XML Google sitemap for my site?

What is an XML sitemap, and do I need one for my Google SEO?

(The XML sitemap protocol) has wide adoption, including support from Google, Yahoo!, and Microsoft

No – technically you do not need an XML sitemap to optimise a site for Google if you have a sensible navigation system that Google can crawl and index easily.

HOWEVER – in 2016 – you should use a content management system that produces one as a matter of course – and you should submit that sitemap to Google in Google Webmaster Tools. Again – best practice.

Google has recently said that XML and RSS sitemaps are still a very useful discovery method for picking up recently updated content on your site.

An XML sitemap is a file on your server that helps Google easily crawl and index all the pages on your site. This is evidently useful for very large sites that publish lots of new content or update content regularly.

Your web pages will still get into the search results without an XML sitemap, as long as Google can find them by crawling your website, when you:

  1. Make sure all your pages link to at least one other page on your site.
  2. Link to your important pages often (with varying anchor text in the navigation and in page text content, if you want the best results)

Remember: Google needs to find links to all the pages on your site, and links spread PageRank, which helps pages rank – so an XML sitemap is no substitute for a great website architecture.

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs on the site) so that search engines can more intelligently crawl the site.

Most modern CMSs can automatically generate XML sitemaps, and Google does ask that you submit a sitemap in Webmaster Tools – and these days I do.

I prefer to define my important pages manually, through displayed links and depth of content, but an XML sitemap is best practice in 2016 for most websites.
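A minimal sitemap file, following the sitemaps.org protocol described above, looks like this (the URLs and dates are placeholders; only `<loc>` is required per URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/seo-tutorial/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
</urlset>
```

Save it as sitemap.xml in your site root and submit it in Google Webmaster Tools.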

Rich snippets

Schema markup and rich snippets can be intimidating when you are new to them – but important information about your business can actually be added to your site quite easily – by sensibly optimising your website footer, for instance.

It is easy to implement.

A properly optimised website footer can help your site comply with the law, can help search engines better understand your site, and can improve usability and conversions.

Properly optimised footers can also help you get rich snippets for your pages in Google results:

If you are a UK business – your site must meet the legal requirements necessary to comply with the UK Companies Act. It is easy to integrate the required information into your footer.

UK companies must include certain regulatory information on their websites and in their email footers…… or they will breach the Companies Act and risk a fine. OUT-LAW

This is what you need to know about website and email footers to comply with the UK Companies Act (with the important information in bold):

  1. The company name
  2. Physical geographic address (a PO Box is unlikely to suffice as a geographic address; but a registered office address would – if the business is a company, the registered office address must be included.)
    88 Bisil Mirpur,
    Dhaka, BD
  3. The company’s registration number should be given and, under the Companies Act, the place of registration should be stated (e.g.
    ONLINE-ZONE is a company registered in Bangladesh with company number 00******
  4. The email address of the company (It is not sufficient to include a ‘contact us’ form without also providing an email address and geographic address somewhere easily accessible on the site)
  5. The name of the organization with which the customer is contracting must be given. This might differ from the trading name. Any such difference should be explained.
    The domain seoworkweb.bloggspot.com and the ONLINE-ZONE logo and creative are owned by Ariful Islam and licensed to ONLINE-ZONE Marketing of which Miss Tanie is an employed co-founding director.
  6. If your business has a VAT number, it should be stated even if the website is not being used for e-commerce transactions.
    VAT No. 00*****
  7. Prices on the website must be clear and unambiguous. Also, state whether prices are inclusive of tax and delivery costs.
    All ONLINE-ZONE BD prices stated in an email or on the website EXCLUDE VAT



The above information does not need to appear on every page, as long as it appears on a clearly accessible page. However, with Google policing website quality based on expertise, authority and trust (see my recent post on making high-quality websites) – any signal you can send to an algorithm or human reviewer that you are a legitimate business is probably a sensible move (you have nothing to hide, of course).

Note: If the business is a member of a trade or professional association, membership details, including any registration number, should be provided. Consider also the rules on distance selling, which include further information requirements for online businesses that sell to consumers (B2C, as opposed to B2B).

For more information on UK companies legislation:

  • Companies Act 2006 (HTML version)
  • The Companies (Registrar, Languages and Trading Disclosures) Regulations
  • Companies Act 2006 (760-page / 2.8MB PDF)
  • The e-commerce regulations – OUT-LAW's excellent guide

While we have displayed most, if not all, of this information in our email and website footers for some time, I thought it would be useful to lay the information out unambiguously and explain why it is there – and wrap it all up in a (hopefully) informative post.

Dynamic PHP copyright notice in WordPress

Now that you are complying with the law – you will want to make sure your site never looks obviously out of date.

While you are editing your footer – make sure your copyright notice is dynamic and will change from year to year – automatically.

It is easy to display a dynamic date in your footer in WordPress, for instance, so you never need to change your copyright notice on your blog when the year changes.

This little piece of code will display the current year. Just add it to the footer.php of your theme and you can forget about your site looking stale, or giving the impression it is out of date and unused, at the turn of each year.

&copy; Copyright 2004 - <?php echo date('Y'); ?>

A simple and elegant PHP copyright notice for WordPress blogs.

Schema.org markup in the footer

You can take the information above and transform it with Schema.org markup to give even more accurate information to the search engines.

From this….


<p>© Copyright 2006-2016 MBSA Marketing LTD trading as Hobo, Company No. SC536213 | VAT No. 249 1439 90<br>
68 Finnart Street, Greenock, PA16 8HJ, Scotland, UK | TEL: 0800 689 0293<br>
Business hours are 09.00 a.m. to 17.00 p.m. Monday to Friday – Local Time is <span id="time">9:44:36</span> (GMT)
</p>

</div>

… To this.


<div itemscope itemtype="http://schema.org/LocalBusiness">

© Copyright 2006-2016 <span itemprop="name">ONLINE-ZONE</span>

<div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">

<span itemprop="streetAddress">68 Finnart Street</span>,
<span itemprop="addressLocality">Greenock</span>,
<span itemprop="addressRegion">Scotland</span>,
<span itemprop="postalCode">PA16 8HJ</span>,
<span itemprop="addressCountry">GB</span> |
TEL: <span itemprop="telephone">0800 689 0293</span> |
EMAIL: <a href="mailto:info@hobo-web.co.uk" itemprop="email">info@hobo-web.co.uk</a>.

</div>

<span itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
<meta itemprop="latitude" content="000***">
<meta itemprop="longitude" content="000*">
</span>

<span>Company No. 00****</span> |
VAT No. <span itemprop="vatID">000*****</span> |
Business hours are <time itemprop="openingHours" datetime="Mo,Tu,We,Th,Fr 09:00-17:00">09.00 a.m. to 17.00 p.m. Monday to Friday</time>
Local Time is <span id="time">9:46:20</span> (GMT)

</div>

<span class="rating-desc" itemscope itemtype="http://schema.org/Product">
<span itemprop="name">ONLINE-ZONE SEO Services</span>
<span itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">Rated <span itemprop="ratingValue">4.8</span>/5 based on <span itemprop="reviewCount">11</span> reviews. | <a class="ratings" href="https://plus.google.com/b/00****/about/p/pub?review=1">Review Us</a></span>
</span>

</div>


Tip: Look at the code at the end of the example above if you are wondering how to get yellow stars on your listings in the Google SERPs.

I got yellow stars in Google within a few days of adding the code to my site template – directly linking my page to information Google already holds about my business.

In addition, you can modify that plus.google.com link to point directly at your Google Plus REVIEWS page, to encourage visitors to review your business.

You now have a site footer that helps your business comply with UK law, automatically updates the copyright notice year – and helps your website stick out in Google's SERPs.

PRO Tip: Now that you know the basics, consider implementing rich snippets using a much cleaner method called JSON-LD.
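For comparison, here is a sketch of a subset of the same business information expressed as JSON-LD instead of inline microdata. The values are the placeholders used in the example above; validate your own markup before deploying it.

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "ONLINE-ZONE",
  "telephone": "0800 689 0293",
  "email": "info@hobo-web.co.uk",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "68 Finnart Street",
    "addressLocality": "Greenock",
    "addressRegion": "Scotland",
    "postalCode": "PA16 8HJ",
    "addressCountry": "GB"
  },
  "openingHours": "Mo,Tu,We,Th,Fr 09:00-17:00"
}
</script>
```

Because JSON-LD lives in one script block, it does not need to be interleaved with your visible footer HTML – which is what makes it a cleaner method.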

Read more


Keep it simple, stupid

Do not build your site with Flash or HTML frames.

Well… not entirely in Flash, and especially not if you know little about the increasingly important accessibility aspects of web design.

Flash is a proprietary plug-in created by Macromedia (since acquired by Adobe) to infuse (admittedly fantastic-looking) rich media into websites. The W3C advises you avoid using such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML so that everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website.

Flash, in the hands of an inexperienced designer, can cause all sorts of problems at the moment, especially with:

  • Accessibility
  • Search engine robots
  • Users who do not have the plug-in
  • Long download times

Flash does not work at all on some devices, like the Apple iPhone. Note that Google sometimes highlights in its results when your site is not mobile-friendly on some devices. And on the subject of mobile-friendly websites, note that Google has alerted the webmaster community that mobile friendliness is now a search engine ranking factor.

On April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high quality search results that are optimized for their devices. GOOGLE

HTML5 is the preferred option over Flash these days, for most designers. A website built entirely in Flash can produce a poor user experience and can hurt your rankings, particularly in mobile search results. For similar accessibility and user-satisfaction reasons, I would also say do not build a site with website frames.

As with any form of design, do not try to reinvent the wheel when simple solutions will suffice. The KISS philosophy has been around since the dawn of design.

KISS does not mean boring web pages. You can create stunning sites with smashing graphics – but you should build those pages using simple techniques – HTML and CSS, for example. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine on TV – but they only cause problems for website visitors.

Keep your layouts and navigation arrays consistent and simple too. Do not spend time, effort and money (especially if you work in a professional environment) designing a fancy navigation menu if, for example, your new website is a basic information site.

The same goes for website optimisation – keep your documents well structured, keep your page title and content relevant, use heading tags wisely, and try not to leave too much of an optimisation footprint.

  • Google mobile-friendly test
  • http://validator.w3.org/mobile/
  • Best screen size

How fast do you need to upload your website?

“Site speed,” Google told us in a previous video, is a ranking factor. But as with any factor Google confirms is a ranking signal, it is usually a small, “nuanced” one.

A fast website is a good user experience (UX), and a satisfying UX leads to higher conversions.

How fast your site loads is critical, but it is often completely ignored in online businesses that involve search engine marketing and search engine optimization.

Very slow sites are a bad user experience – and Google is all about GOOD UX these days.

How much of a Google ranking factor is “site speed”?

“How much is a slow site a negative ranking factor?” is a more useful interpretation of the claim that “site speed is a Google ranking factor.”

First, I have seen VERY slow sites of 10 seconds and more negatively impacted in Google, and second, we have statements from Googlers:

We do say we have a small factor in there for pages that are really slow to load, where we take that into account. John Mueller, GOOGLE

Google may crawl your site more slowly if you have a slow site. And that is bad, especially if you are adding new content or making changes to it.

We’re seeing an extremely high response time for requests made to your site (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we’ll crawl from your site. John Mueller, GOOGLE

John specifically said 2 seconds disrupts CRAWLING activity, not RANKING ability, but you get the picture.

How fast should your site load in 2016?

Recent research findings are hard to come by, but they point to one answer: as fast as possible.
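The 2-second crawl threshold mentioned above is easy to check for yourself. Below is a minimal sketch (Python standard library only) that times a full page fetch; the local throwaway server and its 0.2-second delay are stand-ins for your real site, so swap in your own URL in practice. Dedicated tools such as Google PageSpeed Insights will give far richer diagnostics.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    """Stand-in for a real site: responds after a fixed delay."""
    delay = 0.2  # simulated server think time, in seconds

    def do_GET(self):
        time.sleep(self.delay)
        body = b"<html><head><title>test</title></head><body>ok</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep output quiet

def measure_load(url):
    """Return (HTTP status, seconds taken) for one full GET of a URL."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
        status = resp.status
    return status, time.perf_counter() - start

# Spin up the throwaway server on a random free port and time one fetch.
server = HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"
status, seconds = measure_load(url)
print(f"{url} -> HTTP {status} in {seconds:.2f}s")
server.shutdown()
```

If a run like this regularly shows multi-second fetches for single URLs on your own server, that is the territory John Mueller describes above, where crawling gets throttled.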

A non-technical SEO strategy Google

Here are some final thoughts:

  • Use common sense – Google is a search engine – it is looking for pages to give searchers in its results pages. 90% of users are searching for information. Google will fill its organic results with information-rich pages. Almost all websites link to relevant information content, so content-rich websites gather a lot of links, especially quality links. Google ranks websites with many links (especially quality links) at the top of its results, so the obvious thing you need to do is get a lot of informative content on your site.
  • I think ranking in the organic results is a lot about trusted links ranking trusted pages. Some sites can pass trust to another site; some pages cannot. Some links can; some cannot. Some trusted links can transfer ranking ability to another page; some cannot. YOU NEED LINKS FROM TRUSTED PAGES if you want to rank and avoid PENALTIES AND FILTERS.
  • Google’s engineers are building an AI, but it is all based on simple human desires to make something happen or to prevent something happening. You can work with Google’s engineers or against them. Engineers need to make money for Google, but unfortunately for them, they need to make the best search engine in the world for users as part of the deal. Build a website that takes advantage of that. What is a Google engineer trying to do with an algorithm? Remember it was an idea before it was an algorithm. What was the idea? Think like a Google engineer and give Google what it wants. What is Google trying to give its users? Align with that. What does Google not want to give its users? Do not look anything like that. THINK LIKE A GOOGLE ENGINEER AND BUILD A SITE THEY WANT TO GIVE HIGH RANKINGS.
  • Google is a links-based search engine. Google does not need content to rank pages, but it needs content to give to its users. Google needs to find content, and it finds content by following links, just as you do when clicking one. So, first, you need to make sure the world knows your site exists by getting other sites to link to it. Do not be fussy about reciprocating links with more powerful or genuine sites – I think it all adds to your domain authority – which is better than ranking for just a few narrow key terms.
  • Everything has its limits. Google has its limits. What are they? How would you find out – by observing, or by testing, breaking them, and getting penalized? This is not a laboratory – you cannot test much 100% accurately, but you can make reasonable assumptions about what a Google engineer would do, and what you would do if Google were yours.
  • The best way for Google to keep its rankings secret, ultimately, is to have a randomness – or at least an apparent randomness in the interface presented to Google users – while keeping something stable underneath. That is probably the easiest way to stop curious optimizers from discovering how it works. Well, that is what I think, and I think this randomness manifests itself in many ways. What works for some other site will not necessarily work for yours – it is definitely not the exact same; perhaps no two sites are the same (the conditions are different, for a start, for any two sites).
  • Google may play dice with the Google multiverse, so be aware of that. It uses multiple results and rotates them, serving different results to different machines and browsers, even on the same computer. Google results are constantly changing – some sites rank steadily because they give Google what it wants in specific areas, or they simply have a greater number and variety of more trusted links than you have.
  • Google has a long memory when it comes to links, pages, and the associations with your site – perhaps an infinite memory profile of your site. It might forgive, but it never forgets. Then again, perhaps it can forget, just as we do, and the penalties or bans mentioned above can be lifted. I think Google probably looks at a number of historic signals for individual sites (depending on the type of site – Google can work out if you have a blog or an e-commerce site). What is your relationship with Google? Whatever it is, do not try to fool Google – we are not smart enough. Be squeaky clean on-site and make sure Google thinks twice before hitting you for any anomalies in your link profile.
  • Google earns trust. Most of our most profitable accounts come from referrals by customers who trust us. Before those customers told them about us, they knew nothing about us. OK, they might have heard of us, but they did not trust us much; the referral rests on the customer’s testimony. Those referrals transfer trust to a certain degree, and that trust grows as we deliver. It is hard to keep building that trust and gaining more, because you do not want to dilute it – you want to gain more and more. Google works in a similar way to human relationships, and search engines have been trying for years to deliver a trusted set of websites based on human desire and searcher intent. EARN GOOGLE’S TRUST.
  • Google will not lose trust in you if a friend betrays you – but depending on what you yourself have done, you can lose trust, and sometimes trust is lost completely. If you do something Google does not like, you will lose some trust, in some cases in specific areas. For example, your pages might still be able to rank, but your links might not be trusted enough to vouch for another site. DO NOT FALL OUT WITH GOOGLE OVER SOMETHING STUPID.
  • If Google trusts you, it is because you have earned its trust – and trust helps you achieve what you need in the fastest and most cost-effective way. You helped Google achieve its goals, and it rewards you for your contribution by ranking the websites it trusts most at the top of its results for particular areas. IF GOOGLE TRUSTS YOU, IT WILL LET YOUR PAGES RANK AND, IN TURN, VOUCH FOR OTHER PAGES, OR “FRIENDS”, GOOGLE MIGHT WANT INFORMATION ON.
  • Google will be lied to and manipulated, just as you can be, but it will give you a kick in the gonads if you break that trust – as you probably would too. Treat Google as it would like to be treated.

Remember, it takes time to build trust… and that is probably one of the reasons why Google is increasing the need for “trust” as a ranking modifier.

I could, of course, be reading far too much into Google, TRUST and TIME – maybe Google just wants us to wait for things to bed in… but consider trust to be a psychological emotion Google is trying to emulate using algorithms based on human ideas.

If you do all the above, you will get more traffic from Google over time.

If you want to rank for specific keywords in very competitive niches, you will need to be a big brand, be picked out by big brands (and linked to), or buy links to fake that trust, or get spammy with it in a smart way you will not be caught for. Easier said than done.

I suppose Google is open to being gamed like any human, if it is indeed based on human traits…

What not to do in website search engine optimization

Google has a very basic organic search engine optimization starter guide in PDF form for webmasters, which they use internally:

Although this guide won’t tell you any secrets that’ll automatically rank your site first for queries in Google (sorry!), following the best practices outlined below will make it easier for search engines to crawl, index and understand your content. Google

It is always a useful read, even if it is very basic, best-practice search engine optimization for your site.

No search engine will EVER tell you what actual keywords to put on your site to improve your rankings and convert more organic traffic – and in Google, that is the most important thing you want to know!

If you want a bigger PDF – try my free SEO ebook.

It has been downloaded by tens of thousands of webmasters and is updated every year or two.

Here is a list of what Google tells you to avoid in that document:

  1. Choosing a title that has no relation to the content on the page
  2. Using default or vague titles like “Untitled” or “New Page 1”
  3. Using a single title tag across all of your site’s pages or a large group of pages
  4. Using extremely lengthy titles that are unhelpful to users
  5. Stuffing unneeded keywords in your title tags
  6. Writing a meta description tag that has no relation to the content on the page
  7. Using generic descriptions like “This is a web page” or “Page about baseball cards”
  8. Filling the description with only keywords
  9. Copying the entire content of the document into the description meta tag
  10. Using a single meta description tag across all of your site’s pages or a large group of pages
  11. Using lengthy URLs with unnecessary parameters and session IDs
  12. Choosing generic page names like “page1.html”
  13. Using excessive keywords like “baseball-cards-baseball-cards-baseball-cards.htm”
  14. Having deeply nested subdirectories like “… /dir1/dir2/dir3/dir4/dir5/dir6/page.html”
  15. Using directory names that have no relation to the content in them
  16. Having pages from subdomains and the root directory (e.g. “domain.com/page.htm” and “sub.domain.com/page.htm”) access the same content
  17. Mixing www. and non-www. versions of URLs in your internal linking structure
  18. Using odd capitalization in URLs (many users expect lower-case URLs and remember them better)
  19. Creating complex webs of navigation links, e.g. linking every page on your site to every other page
  20. Going overboard with slicing and dicing your content (so it takes twenty clicks to reach deep content)
  21. Having navigation based entirely on drop-down menus, images, or animations (many, but not all, search engines can discover such links on a site, but if a user can reach all pages of a site via normal text links, it improves the accessibility of your site)
  22. Letting your HTML sitemap page become out of date with broken links
  23. Creating an HTML sitemap that simply lists pages without organizing them, for example, by subject (Edit Shaun – safe to say, especially for larger sites)
  24. Allowing your 404 pages to be indexed in search engines (make sure your web server is configured to give a 404 HTTP status code when non-existent pages are requested)
  25. Providing only a vague message like “Not found” or “404”, or no 404 page at all
  26. Using a design for your 404 pages that is not consistent with the rest of your site
  27. Writing sloppy text with many spelling and grammatical mistakes
  28. Embedding text in images for textual content (users may want to copy and paste the text, and search engines cannot read it)
  29. Dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation
  30. Rehashing (or even copying) existing content that brings little extra value to users
Pretty simple stuff, but sometimes it is the simple things that are overlooked. Of course, combine the above with Google’s guidelines for webmasters.

Search engine optimization is often about making small modifications to parts of your website. These changes may look like incremental improvements, but when combined with other optimizations, they can have a significant impact on your site’s user experience and performance in the organic search results.

Do not make these simple but dangerous mistakes…

  1. Avoid duplicating content already found on your site. Yes, Google likes content, but it *usually* needs to be well linked to, unique and original to get to the top!
  2. Do not hide text on your website. Google may eventually remove you from the SERPs.
  3. Do not buy 1000 links and think “that will get me to the top!”. Google likes natural link growth and often frowns on mass link buying.
  4. Do not get everybody to link to you using the same “anchor text” or link phrase. This could flag you as a “rank modifier”. You do not want that.
  5. Do not chase Google PR by hunting 100s of links. Think quality of links… not quantity.
  6. Do not buy many keyword-rich domains, fill them with similar content and link them to your site. This is lazy and dangerous and could see you ignored or, worse, banned from Google. It might have worked yesterday, but it certainly does not work today without some grief from Google.
  7. Do not constantly change your site’s page names or move pages around without using redirects. This just screws you up in any search engine.
  8. Do not build a website with JavaScript-only navigation that Google, Yahoo and Bing cannot crawl.
  9. Do not link to everybody who asks you for reciprocal links. Only link out to quality websites you trust.

Do not flag your website with poor website optimization

The main thing with any “rank modification” is not to flag your website as “suspicious” to Google’s algorithms or its web spam team.

I would recommend you forget tricks like links in H1 tags or linking to the same page three times with different anchor text on one page.

Forget “what works best” arguments and keep in mind the things you should not waste your time on.

Every element on a page is a benefit to you until you spam it.

Put a keyword in every tag and you may flag your site as “trying too hard” if you do not have the domain trust to carry it off – and Google’s algorithms will go to work.

Spamming Google is often counter-productive in the long term.


  • Do not point lots of links with the same keyword-rich anchor text at one page.
  • Do not spam your ALT tags or any other tags.
  • Place your keywords carefully.
  • Try to make the site primarily for people, not just for search engines.
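On the keyword-placement point: Google publishes no keyword-density threshold, so the only honest automated check is a rough frequency count to catch obvious over-repetition before a human (or an algorithm) notices it. A minimal sketch, with the thresholds left entirely to your own judgement:

```python
import re
from collections import Counter

def term_frequencies(text, top=5):
    """Rough word-frequency count to sanity-check copy for obvious stuffing.

    Returns (word, count, share-of-all-words) for the `top` most common words.
    There is no 'correct' density figure - this is only a red-flag detector.
    """
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return [(word, count, round(count / total, 3))
            for word, count in counts.most_common(top)]
```

If one commercial keyword dominates the top of this list across a whole page of copy, the page probably reads as spammy to a human too – which is the real test.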

On-page SEO is not as simple as a checklist any more of keyword here, keyword there. Optimizers are up against lots of smart folks at the Googleplex – and they purposely make this practice difficult.

For those who need a checklist, this is the sort that gets me results:

  1. Do keyword research
  2. Identify valuable searcher intent opportunities
  3. Identify the audience and the purpose of your page
  4. Write utilitarian copy – be useful. Use related terms in your content. Use plurals. Use words with searcher intent, like “buy” and “compare”. I like to get a keyword or related term into every paragraph.
  5. Use emphasis sparingly to highlight the important points on the page, whether they are your keywords or not
  6. Pick an intelligent page title that includes your keyword
  7. Write an intelligent meta description, repeating it on the page
  8. Add an image with user-centric ALT attribute text
  9. Link to related pages on your site within the text
  10. Link to related pages on other sites
  11. Give your page a simple, Google-friendly URL
  12. Keep it simple
  13. Share it, and promote it
You can forget everything else.

The continuing evolution of SEO

The “keyword not provided” incident is just one example of Google making ranking in organic listings HARDER – a change for “users” that seems to have the most impact on “marketers” outside of Google’s ecosystem – yes – search engine optimizers.

Now, consultants need to be page-centric (abstract, I know) instead of just keyword-centric when optimizing a web page for Google. There are now plenty of third-party tools that help when researching keywords, but most of us miss the kind of keyword data we used to have access to.

Proper keyword research is still important, because getting a site to the top of Google eventually comes down to your text content on a page and the keywords in external and internal links. Altogether, Google uses those signals to determine where you rank, if you rank at all.

There is no magic bullet for this.

At any one time, your site is probably feeling the influence of some algorithmic filter (for example, Google Panda or Google Penguin) designed to keep spam sites under control and deliver relevant, high-quality results to human visitors.

One filter may be kicking in, keeping a page down in the SERPs, while another filter is pushing a different page up. You may have poor content but excellent inbound links, or vice versa. You may have great content but a very poor technical organization of it.

When trying to determine why Google does not “rate” a specific page higher than the competition, the answer is usually on the page or in the backlinks pointing to it.

  • Do you have very few quality inbound links?
  • Do you have too many low-quality backlinks?
  • Does your page lack descriptive, keyword-rich text?
  • Are you keyword stuffing your text?
  • Do you link out to unrelated sites?
  • Do you have too many advertisements above the fold?
  • Do you have affiliate links on every page of your site, and text found on other websites?
  • Do you have broken links and missing images on the page?
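The last question in that checklist can be partly automated. This sketch extracts every link `href` and image `src` from a page using Python’s standard-library parser; in a real audit you would then request each URL and flag non-200 responses and empty `src` attributes.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every anchor href and image src found in a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.images = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "href" in a:
            self.links.append(a["href"])
        elif tag == "img":
            # an empty string here means the img tag has no src at all -
            # exactly the "missing image" case worth flagging
            self.images.append(a.get("src", ""))

def collect(html):
    """Return ([hrefs], [srcs]) for one page of HTML."""
    p = LinkCollector()
    p.feed(html)
    return p.links, p.images
```

Feed each collected URL to an HTTP client afterwards and anything that does not come back `200 OK` is a broken link to fix.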

Whatever the problems are, identify them and fix them.

Get on the wrong side of Google and your site might well be flagged for a manual review – so optimize your site as if, one day, you will get that review from a Google web spam reviewer.

I think the key to a successful campaign is persuading Google that your page is most relevant to any given search query. You do this with good, keyword-rich text content and by earning quality links to that page.

That is far easier to say than to do these days!

The next time you develop a page, consider what looks spammy to you – it probably looks spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages are emphasized in the site architecture? Which pages would you get rid of?

You can help a site along in various ways (including making sure your page titles and meta tags are unique), but be careful: obvious evidence of “rank modifying” is dangerous.

I prefer simple SEO techniques, and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for a key phrase. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both of those audiences, you are more likely to find success in the organic search results, and you might never need to get into the technical side of things, like redirects and search-engine-friendly URLs.

To beat the competition in an industry where it is hard to earn quality links, you sometimes have to get more “technical” – and in some industries, you have traditionally needed to be 100% black hat to even get into the top 100 results for competitive, transactional searches.

There are no hard-and-fast rules for long-term ranking success, other than developing quality websites with quality content and quality links pointing to them. The less domain authority you have, the more text content you are going to need. The aim is to build a satisfying website and build real authority!

You need to mix it up and learn from experience. Make mistakes and learn from them by observation. I have found that getting penalized is a very good way to learn what not to do.

Remember there are exceptions to nearly every rule, and in an ever-fluctuating landscape, you probably have little chance of determining exactly why you rank in search engines these days. I have been doing this for over 15 years, and every day I am still trying to understand Google, to learn more, and to learn from others’ experiences.

It is important not to obsess over granular ranking detail with little return on investment unless you really have the time to do it! THERE IS USUALLY SOMETHING MORE VALUABLE TO SPEND THAT TIME ON.

That usually means earning good backlinks or creating great content.



The fundamentals of successful optimization have not changed much over the years, although Google does seem better at rewarding sites that show some signs of reputation, quality content, and ease of use.

Google is not interested in rewarding legitimate effort – despite what some claim. If that were the case, I would be a black hat full time. So would everybody else trying to rank in Google.

Most small businesses do not need advanced strategies because their direct competition is not using those tactics either.

I recently took a medium-sized business to the top of Google for some very competitive terms, and I did nothing but ensure the page titles were optimized, rewrite the home page text, and earn one or two links from trusted sites.

The site had a clean record in Google and a few organic links already from trusted websites.

The domain had the authority and the potential to rank for some valuable terms, and all we had to do was make a few on-page changes, improve the depth and focus of the home page content, improve performance, and adjust the page titles.

There was duplicate content to sort out and some belated canonicalization of content to attend to, but none of the actions I implemented would be considered “advanced”.

Many businesses can get more converting visitors from Google simply by following basic principles and best practices:

  1. Always make sure every page is linked to from at least one other page
  2. Link to your important pages often
  3. Link not only from your navigation, but from keyword-rich text links within your text content – keep it natural and useful to visitors
  4. Try to keep each page element and the content as unique as possible
  5. Build a site for visitors, to get visitors – and convert some of those visitors into actual sales, too
  6. Create keyword-rich content on the site
  7. Watch which sites you link to, and from which pages, but do link out
  8. Go and find related, relatively trusted sites and try to get keyword-rich anchor text inbound links from them
  9. Watch the trends, watch the stats
  10. Minimize duplicate or thin content
  11. Bend a rule or two without breaking them and you will probably be fine
Once all that is done, it is time to… add more, and better, content to your site and tell more people about it, if you want more Google love.

OK, you might have to implement the odd 301 redirect, but again, that is hardly “advanced”.
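For completeness, here is what that “odd 301” looks like at the HTTP level – a minimal Python sketch with a hypothetical old/new path pair (real sites would normally configure this in the web server or CMS rather than in application code). A 301 tells search engines the move is permanent, so ranking signals should transfer to the new URL.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old/new paths, for illustration only.
REDIRECTS = {"/old-page.html": "/new-page/"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)  # permanent move: signals should transfer
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            body = b"ok"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep output quiet

# Serve on a random free port, then fetch the old URL; urllib follows
# the 301 automatically, so the response's final URL is the new page.
server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page.html")
status = resp.status
final_url = resp.geturl()
resp.close()
server.shutdown()
```

Browsers and crawlers behave the same way as `urllib` here: they land on the new URL, which is exactly what you want after renaming a page.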

I have seen simple SEO marketing techniques work for years.

It is better to simply do the simple things better and faster than to worry about some of the more advanced techniques you read about on some blogs, I think – it is more productive, more cost-effective for the business, and safer, for most.

Beware of pseudoscience


Pseudoscience is a claim, belief, or practice which is presented as scientific, but does not adhere to a valid scientific method…

Beware of people trying to get “scientific” about SEO. It is not a science while Google controls the “laws” and changes them at will.

You see, this is what I always thought optimization was:

  • Watching the rankings in Google change overnight,
  • Researching keywords
  • Making observations about how your pages and others perform (though not in a controlled environment)
  • Putting the relevant, appropriate words you want to rank for on pages
  • Putting words in links to pages you want to rank for
  • Understanding that what you put in your title is often what you rank best for
  • Getting other sites to link to yours
  • Getting real links that will last, from sites that are trusted enough
  • Publishing lots and lots more content
  • Focusing on the long tail of search!
  • Understanding it will take time to beat all this competition
  • And this is what I always understood could get a website penalized:
  • Having too many links with the same anchor text pointing to one page
  • Keyword stuffing a page
  • Trying to manipulate Google too much on a site
  • Creating a “frustrating user experience”
  • Chasing the algorithm
  • Getting links you should not have
  • Buying links

Not that everything is automatically penalized all the time.

I have always believed that, while I did not understand the maths or science of Google that well, I did understand what Google’s engineers wanted.

The biggest challenge these days is getting trusted sites to link to you, but the rewards are worth it.

To do it, you probably need to invest in some marketable content, or compelling benefits for the linking party (and paying for links is not one of them – somebody else can always pay more). Buying links to improve rankings WORKS, but it is probably THE most hated link-building technique as far as the Google web spam team is concerned.

I was very curious about the science of optimization. I studied what I could, but it left me a little unsatisfied. I learned that building links, creating lots of decent content, and learning how to monetize that content better (without substantially breaking Google’s TOS) would have been a more useful use of my time.

And doing it all better, and faster, would have been more useful still.

There are a lot of problems with blogs, too, including my own.

Misinformation is an obvious one. Sparse results are rarely conclusive, and claims are rarely 100% correct, even if you think a theory holds water at a certain level. I try to update old posts with new information if I think the page is only useful with accurate data.

Keep in mind that most of what you read about how Google works comes from third-party opinion and, as in any other field of knowledge, the “facts” can change with better understanding over time or with a different perspective.

Chase the Algorithm

There is no magic bullet and there are no secret formulas to achieve a fast number 1 ranking in Google for any competitive keyword without spamming Google.

A legitimately earned high position in the search engines takes a lot of hard work.

There are a few less-than-obvious tactics some use better than others to fight the likes of Google Panda, for example, but there are no big secrets (no “white hat” secrets, anyway). Wise strategy and creative solutions, however, are to be found by researching opportunities in your niche.

If Google sees a strategy getting results… it generally ends up “against the guidelines” and something you can be penalized for – so beware of jumping on the latest fad.

The biggest advantage any marketer has over another is experience and resource. Knowing what does NOT work, and what HURTS your site, is often more valuable than knowing what gives a short-lived boost.

Getting to the top of Google is a relatively simple process. One that is constantly in flux. Professional SEO is more a collection of skills, methods, and techniques. It is more a way of doing things than a one-size-fits-all magic formula.

After more than a decade of practice on real campaigns, I am always trying to distil the process down to its simplest and most cost-effective parts.

I think it is sensible to keep things simple.

Good text, simple navigation structure, quality links. Being relevant and reputable takes time, effort, and luck, just like everything else in the real world, and that is the way Google likes it.

If a company promises guaranteed rankings and claims to have a bulletproof strategy, pay close attention.

At the very least, I would check that the strategy does not violate Google’s guidelines.

How long does it take to see results?

Some results can be achieved within weeks, while for some strategies you need to think in months before you see the benefit. Google WANTS this effort to take time. Critics of the search engine giant would say that this conveniently makes Google’s own sponsored AdWords ads the faster option.

Optimization is not a fast process, and a successful campaign is judged in months, if not years. Most techniques that artificially accelerate a website’s rankings turn out to be against Google’s webmaster guidelines – proceed cautiously.

It takes time to build quality, and it is that quality Google aims to reward in 2016.

It takes time to generate the data needed to begin to formulate a campaign, and time to deploy that campaign. Progress also depends on many factors:

  • How old is your site compared to the top 10 sites?
  • How many backlinks do you have compared to theirs?
  • How does the quality of your backlinks compare to theirs?
  • What is the history of people linking to you (what words have people used to link to your site)?
  • How good a resource is your site?
  • Can your site attract natural backlinks (e.g. do you have good content or a great service?), or are you 100% reliant on your agency for backlinks (which is very risky in 2016)?
  • How much unique content do you have?
  • Do you have to pay everybody to link to you (which is risky), or do people have a “natural” reason to link to you?

Google wants to return quality pages in its organic listings, and it takes time to build that quality and have it recognized.

It takes time, too, to balance your content, generate quality backlinks, and manage your disavowed links.

Google knows how valuable organic traffic is, and it wants webmasters to invest a lot of effort in ranking pages.

Critics will point out that the higher the cost of expert SEO, the more attractive AdWords looks – but AdWords will only get more expensive, too. At some point, if you want to compete online, you are going to have to build a quality website, with a unique offering, to get visitors coming back – and the sooner you start, the sooner you will start to see results.

If you start now and are determined to build an online brand – a content-rich website with a good user experience – Google will reward you in the organic search results.


Web optimization is a marketing channel just like any other, and there are no guarantees of success in any of them, for what should be obvious reasons. There are no guarantees in Google AdWords either, except that the cost to compete will increase, of course.

That is why it is so attractive – but like any marketing – it is still a gamble.

At the moment, I do not know you, your business, your website, your resources, your competition, or your product. Even with all that knowledge, calculating ROI is extremely difficult because, ultimately, Google decides who ranks where in its results – sometimes that is the best sites, and sometimes (often) it is the sites breaking the rules above yours.

Nothing is absolute in search engine marketing.

There are no guarantees – despite the claims of some companies. What you make from this investment depends on many things, not least how well suited your site is to converting visitors into sales.

Every site is different.

Big-brand campaigns are very, very different from small-business SEO campaigns that start with no links at all, to give just one example.

It is certainly easier if the brand in question has lots of unlocked domain authority just waiting to be tapped – but that is, of course, a generalization, as big brands also have big-brand competition.

It all depends on the quality of the site in question and the level and quality of the competition, but smaller businesses should probably look to own their niche, even if they are initially limited to their location.

Local SEO is always a good place for small businesses to start.

What is a spam page?


What makes a page spam?

  • Hidden text or links – may be exposed by selecting all page text and scrolling to the bottom (all hidden text is then highlighted), disabling CSS/JavaScript, or viewing the source code
  • Sneaky redirects – redirecting through several URLs, rotating destination domains, cloaking with JavaScript redirects and 100% frames
  • Keyword stuffing – no percentage or keyword density is given; it is up to the raters to judge
  • PPC ads that only serve to make money, not to help users
  • Copied/scraped content plus PPC ads
  • Feeds with PPC ads
  • Doorway pages – multiple landing pages that all direct the user to the same destination
  • Templates and other mass-produced, computer-generated pages, marked by copied content and/or slight keyword variations
  • Copied message boards with no other page content
  • Fake search pages with PPC ads
  • Fake blogs with PPC ads, identified by copied/scraped or nonsensical spun content
  • Thin affiliate sites that only exist to make money, identified by checkout on a different domain, image properties showing origination at another URL, lack of original content, or different WHOIS registrants of the two domains in question
  • Pure PPC pages with little to no content
  • Parked domains
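The first signal in that list, hidden text, can be checked crudely in raw HTML. Below is a minimal sketch; the regex patterns and function name are my own illustrative assumptions, and real quality raters also disable external CSS and JavaScript, which an inline-style check like this cannot cover:

```python
import re

# Crude inline-style signals for hidden text, per the first bullet above.
# Text hidden via external stylesheets or JavaScript needs a real renderer.
HIDDEN_PATTERNS = [
    r'style\s*=\s*"[^"]*display\s*:\s*none',
    r'style\s*=\s*"[^"]*visibility\s*:\s*hidden',
    r'style\s*=\s*"[^"]*font-size\s*:\s*0[^.0-9]',
]

def hidden_text_signals(html):
    """Return the patterns that match, as rough spam signals."""
    lower = html.lower()
    return [p for p in HIDDEN_PATTERNS if re.search(p, lower)]

page = '<div style="display:none">cheap widgets cheap widgets</div>'
print(bool(hidden_text_signals(page)))  # True
```

A page that trips none of these patterns is not necessarily clean, of course; this only mirrors the "view source" step a human rater would take.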

There is more information about this announcement on SEW.



If a page is just to make money, the page is spam. GOOGLE

That statement, found in the quality guidelines Google gives its human quality raters, is a standout – and it should be a heads-up to any webmaster out there who thinks they are going to make quick money from Google organic listings these days.

It should, at the least, make you think about the types of pages you are going to spend your valuable time making.

Add no value for Google’s users and you should not expect to rank.

If the sole purpose of the page you are making today is to make money from it – and especially from Google’s free traffic – you obviously did not get the memo.

Consider this, from a manual reviewer:

…when they DO get to the top, they have to be reviewed with a human eye to make sure the site has quality. PotPieGirl

It should be remembered:

  • If a page is just to make money, the page is spam
  • If a site is just to make money, the site is spam

That is how you will be judged – whether it is fair or not.


It is not – in some cases – a level playing field.

If you are prepared to put the work and passion into a website to:

  • Stand out
  • Be known
  • Add content ONLY to your website
  • GET THE CREDIT as the source of the content
  • HELP USERS (!) in a way that 100 other pages do not

…then you may find that, over time, you have built a great website and also a “brand”.

Google does not care about us SEOs or our websites – but it does care about helping users.

So, if you are helping your visitors – and not just passing them through to a different website – you are probably doing at least something right.

With this in mind – I now build affiliate sites differently.

Doorway pages

Google has announced that it intends to target doorway sites in its next major update. The definition of a doorway page is sure to evolve over the coming years – and that change will begin shortly.

The last time Google announced it was going after doorway pages and doorway sites was in 2015.

Example: in the pictures below (from 2011), all pages on the site appeared to be hit with something like a -50+ ranking penalty for every keyword.

First – the site’s Google rankings for its most important keywords tanked…

…which led to a traffic apocalypse, of course…

…and a nice email from Google WMT arrived:

Google Webmaster Tools notice of detected doorway pages on XXXX XXXX – Dear site owner or webmaster of XXXX XXXX, we have detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages – groups of “cookie cutter” or low-quality pages. Such pages are often of low value to users and are often optimized for single words or phrases in order to channel users to a single location. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you have made these changes, please submit your site for reconsideration in Google’s search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team

What are doorway pages?

Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination. Doorway pages are web pages created for search engine spamming, that is, for spamming the index of a search engine by inserting results for particular phrases with the purpose of sending visitors to a different page. They are also known as bridge pages, portal pages, jump pages, gateway pages, entry pages and by other names. Doorway pages that redirect visitors without their knowledge use some form of cloaking. Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users and are in violation of our webmaster guidelines. Google’s aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing those pages from the Google index. If your site has been removed from our search results, review our webmaster guidelines for more information. Once you have made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.
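The cloaked and sneaky redirects described in that definition often show up in a page’s raw source. A minimal sketch in Python follows; the patterns and names are my own illustrative assumptions, a rough check rather than an exhaustive detector:

```python
import re

# Rough signals of the redirect-style doorway behaviour described above.
REDIRECT_SIGNALS = {
    "meta refresh": r'<meta[^>]+http-equiv\s*=\s*["\']refresh',
    "script redirect": r'(window\.location|location\.href)\s*=',
    "full-size frame": r'<(frameset|iframe[^>]+(width|height)\s*=\s*["\']100%)',
}

def redirect_signals(html):
    """Name the redirect patterns present in a page's HTML source."""
    lower = html.lower()
    return [name for name, pat in REDIRECT_SIGNALS.items()
            if re.search(pat, lower)]

doorway = '<meta http-equiv="refresh" content="0;url=http://other.example/">'
print(redirect_signals(doorway))  # ['meta refresh']
```

Note that legitimate pages use meta refreshes and scripted navigation too; what Google objects to is using them to send visitors somewhere other than the page they selected.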

At the time (2011), the pages in question did not immediately strike me as doorway pages, but they were evidently affected as such. Obviously, Google’s definition of a doorway page changes over time.

When I looked in the Google Webmaster forums, there were many people asking how to fix this at the time – and, as usual, it seemed a bit of a grey area, with lots of theories… and some of the help in the Google forums was clearly questionable.

Many people do not realise they are building what Google classes as doorway pages… and that is significant… because what you intend to do with the traffic Google sends you may itself be a ranking factor, which is not mentioned too often.

You probably do not want to register with GWT if you have several sites with lots of doorway pages on them.

Here is what Google has said recently about this algorithm update:

Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ultimately takes the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.

…with examples of doorways listed as follows:

  • Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
  • Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
  • Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy

Google also said recently:

Here are some questions to ask of pages that could be considered doorway pages:

  • Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are the pages an integral part of your site’s user experience?
  • Are the pages intended to rank on generic terms, yet the content presented on the page is very specific?
  • Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
  • Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
  • Do these pages exist as an “island”? Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?

A real Google-friendly website

At one time, a Google-friendly website simply meant a website built so Googlebot could properly crawl and classify it.

When I think of “Google friendly” these days – I think of a website Google will rank at the top, if it is popular and accessible enough, and one that will not drop like a stone, for no apparent reason, one day, even though I followed the Google SEO starter guide to the letter… just because Google has found something it does not like, or has classified my site as low-quality one day.

It is not just about original content anymore – it is about the function your website provides to Google’s visitors, and it is about your business plan.

I now build websites with the following in mind…

  • Do not be a website Google will not rank – What Google classifies your site as is perhaps the number 1 Google ranking factor not often talked about, whether Google determines this algorithmically or, eventually, manually. That is – whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search – what do you think Google thinks about your website? Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google would bother ranking your website if it is exactly the same, rather than why it would not because it is the same… how can you make yours different? Better.
  • Think that, one day, your website will have to pass a manual review by Google – the better rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google classes even some useful sites as spammy, according to leaked documents. If you want a site to rank high in Google, it had better “do” something other than exist only to link to another site because of a paid commission. Know that, to succeed, your website needs to be USEFUL to the visitors Google sends you – and a useful website is not just a website with the sole commercial intent of sending a visitor from Google on to another site, or a “thin affiliate” as Google classes it.
  • Consider how Google can algorithmically and manually determine the commercial intent of your website – think about the signals that differentiate a real small-business website from a website created just to send visitors to another website with affiliate links on every page, for instance; adverts on your site, above the fold, etc., can be a clear indicator of a webmaster’s particular commercial intent.
  • Google will not thank you for publishing lots of similar pages and duplicate content on your website – expect to invest in original content for every page you want to perform in Google, or at least do not publish content found on other sites
  • Make sure Google knows your website is the origin of any content you produce (typically by simply pinging Google via XML or RSS). I would go as far as to say consider using Google+ to confirm this too… this sort of thing will only become more important as the year goes on
  • Understand and accept why Google ranks your competition above you: they are either:
    • more relevant and more popular,
    • more relevant and more reputable,
    • managing their backlinks better than you, or
    • spamming.
  • Understand that everyone at the top of Google falls into one of those categories, and formulate your own strategy to compete – relying on Google to take action on your behalf is very probably not going to happen.
  • Being “relevant” comes down to keywords and key phrases – in domain names, URLs, title elements, the number of times they are repeated in text on the page, text in image ALT tags, rich markup and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you will probably trigger spam filters. If it is “hidden” in on-page elements, beware of relying on it too much to improve your rankings.
  • The basics of GOOD SEO have not changed for years – though the effectiveness of particular elements has certainly narrowed or changed in usefulness – you should still focus on building a simple website using very simple SEO best practices. Do not sweat the small stuff, while all the time paying attention to the important stuff: add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT. Understand how Google sees your website. Crawl it, like Google does, with (for example) Screaming Frog SEO Spider, and fix malformed links or things that result in server errors (500), broken links (400+) and unnecessary redirects (300+). Each page you want in Google should serve a 200 OK header message.
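The status-code audit in that last point can be sketched with the Python standard library. This is a minimal, hypothetical helper (the function names and bucket labels are my own), not a replacement for a full crawler such as Screaming Frog:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify(status):
    """Bucket an HTTP status code the way the audit above suggests."""
    if status == 200:
        return "OK"            # the header each indexable page should serve
    if 300 <= status < 400:
        return "redirect"      # 300+ – check each redirect is necessary
    if 400 <= status < 500:
        return "broken"        # 400+ – fix or remove the linking page
    return "server error"      # 500 – investigate the server

def audit(urls):
    """Fetch each URL and report its status bucket.

    Note: urlopen() follows redirects automatically, so a 301 chain
    surfaces as the final status; a dedicated crawler reports each hop.
    """
    report = {}
    for url in urls:
        try:
            with urlopen(url, timeout=10) as resp:
                report[url] = classify(resp.status)
        except HTTPError as e:    # 4xx/5xx responses raise HTTPError
            report[url] = classify(e.code)
        except URLError as e:     # DNS failure, refused connection, etc.
            report[url] = "unreachable ({})".format(e.reason)
    return report

print(classify(404))  # broken
```

Running `audit()` over a sitemap’s URL list gives a quick pass at the 500/400+/300+ problems mentioned above before reaching for a dedicated SEO crawler.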

Shaun Anderson… and that is it for now.

This is a complex issue, as I said at the very beginning of this article.

I hope you have enjoyed this free DIY SEO tutorial for beginners. It continues as follows:

Guidelines for Google Webmasters

You do not have to pay to be listed in the search engines, and you do not necessarily have to submit your site to them, but you do need to know their “rules” – especially the rules of Google.

Note: these rules can change. They are Google’s official advice to webmasters, and Google actively demotes the “poor quality” techniques that break them, which will affect your rankings in 2016.

The following is a list of the most important Google webmaster guideline and support pages:

 Rank Google Guideline or Support Documents Source
1 Guidance on building high-quality websites View
2 Main webmaster guidelines View
3 Quality Rater’s Guide 2016 (and previous years!) View
4 Link Schemes Warning View
5 Disavow Backlinks Warning View
6 Auto-Generated Content Warning View
7 Affiliate Program Advice View
8 Report spam paid links or malware View
9 Reconsideration requests View
10 List of common manual actions View
11 Use rel=”nofollow” for specific links View
12 Adding A Site To Google View
13 Browser Compatibility Advice View
14 URL Structure Advice View
15 Learn about Sitemaps View
16 Duplicate Content View
17 Use canonical URLS View
18 Indicate paginated content View
19 Change page URLs with 301 redirects View
20 How Google Deals With AJAX View
21 Review your page titles and snippets View
22 Meta tags that Google understands View
23 Image Publishing Guidelines View
24 Video best practices View
25 Flash and other rich media files View
26 Learn about robots.txt files View
27 Create useful 404 pages View
28 Introduction to Structured Data View
29 Mark Up Your Content Items View
30 Schema Guidelines View
31 Keyword Stuffing Warnings View
32 Cloaking Warning View
33 Sneaky Redirects Warning View
34 Hidden Text & Links Warnings View
35 Doorway Pages Warnings View
36 Scraped Content Warnings View
37 Malicious Behavior Warnings View
38 Hacking Warnings View
39 Switching to Https View
40 User Generated Spam Warnings View
41 Social Engineering View
42 Malware and unwanted software View
43 Developing Mobile Sites View
44 Sneaky mobile redirects View
45 Developing mobile-friendly pages View
46 Use HTTP “Accept” header for mobile View
47 Feature phone Sitemaps View
48 Multi-regional and multilingual sites View
49 Use hreflang for language and regional URLs View
50 Use a sitemap to indicate alternate language View
51 Locale-aware crawling by Googlebot View
52 Remove information from Google View
53 Move your site (no URL changes) View
54 Move your site (URL changes) View
55 How Google crawls, and serves results View
56 Ranking In Google View
57 Search Engine Optimization View
58 Steps to a Google-friendly site View
59 Webmaster FAQ View
60 Check your site’s search performance View


Google’s webmaster channel is also worth subscribing to.

If you have made it this far, you should read my Google Panda post, which will help you understand this process better.

Free SEO EBOOK (2016) PDF

Hobo UK SEO Beginners Guide V3 2015: Shaun Anderson. Congratulations! You have just finished reading the first chapter of my 2016 training guide.

A Hobo UK SEO Guide for Beginners (2016) is a free book in PDF format that you can download from here completely free (2 MB), and which collects my notes on driving increased organic traffic to a website within Google’s guidelines.

I am in the UK, and most of my time is spent looking at Google.co.uk, so the ebook (and blog posts) should be read with that in mind.

Google is BIG – with many different country-specific search engines producing very different results in some cases. I did all my testing on Google.co.uk.

This is a guide based on my 15 years of experience.

I write and publish on my blog to keep track of my thoughts and to get feedback from the industry and peers. Following this strategy, I get about 100,000 visitors a month from Google.

My ebook is rough and ready – I am not a professional writer – but the content in it is largely the information I needed while working on penalised sites to keep their organic Google traffic in 2016.

This is the fourth version of this document I have published in 7 years, and I hope this one and the previous versions show, in some way, an interest in this area that others can learn from.

No guarantee

There is no guarantee – it is a free PDF. This SEO training guide is my opinions, observations and theories that I have put into practice, not advice.

I hope you find it useful, and I hope beginners can get something out of the free ebook, or from the links to other high-quality resources it points to.

Subscribe to our latest SEO tips: click here and subscribe to this blog for free updates.