Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users. The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as “follow” or “nofollow”).
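To see how crawlers interpret these rules in practice, here is a minimal sketch using Python's standard-library robots.txt parser (`urllib.robotparser`). The rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block everything under /private/,
# allow the rest of the site for all user agents.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Ask whether a generic crawler may fetch each URL.
print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

Well-behaved crawlers perform essentially this check before requesting a page; robots.txt is advisory, so it controls polite crawlers rather than enforcing access.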
On-site SEO (also known as on-page SEO) is the practice of optimizing elements on a website (as opposed to links elsewhere on the Internet and other external signals collectively known as "off-site SEO") in order to rank higher and earn more relevant traffic from search engines. On-site SEO refers to optimizing both the content and HTML source code of a page.
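As a concrete illustration of the on-page elements involved, the sketch below pulls a few of them (the `<title>`, meta description, and `<h1>`) out of a page using Python's standard-library `html.parser`. The sample page and the `OnPageAudit` class name are hypothetical:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collect a few on-page SEO elements: <title>, meta description, <h1>."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1 = ""
        self._in = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.meta_description = a.get("content", "")
        elif tag in ("title", "h1"):
            self._in = tag

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1 += data

# A hypothetical page with the elements an on-page audit would inspect.
page = """<html><head><title>Blue Widgets | Acme</title>
<meta name="description" content="Hand-made blue widgets."></head>
<body><h1>Blue Widgets</h1></body></html>"""

audit = OnPageAudit()
audit.feed(page)
print(audit.title)             # Blue Widgets | Acme
print(audit.meta_description)  # Hand-made blue widgets.
print(audit.h1)                # Blue Widgets
```

Real audit tools check many more signals (heading hierarchy, image alt text, internal links), but they start from exactly this kind of HTML inspection.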
A number of websites exist solely to help you determine how fast your website loads, often from many different geographic locations. These tests give you a sense of your overall user experience, which is a major part of SEO in 2018 and beyond.
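The core measurement those services perform can be sketched in a few lines: time how long a full page fetch takes. This toy version, using only the Python standard library, spins up a local test server so it is self-contained; real services measure your live URL from remote locations instead:

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Serves a tiny stand-in page so the example needs no network access."""

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>ok</body></html>")

    def log_message(self, *args):
        pass  # keep test output quiet

def measure_load_time(url: str) -> float:
    """Return seconds from request start to full response body received."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
elapsed = measure_load_time(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(f"Page fetched in {elapsed:.3f}s")
```

A single fetch from one machine is only a rough proxy; the dedicated tools repeat the measurement from multiple regions and break it down into DNS, connect, first-byte, and render times.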
SEO (Search Engine Optimization) helps people find your website through search engines. It brings in more online business by improving your search engine rankings and driving relevant traffic to your website. This article describes how to do end-to-end SEO of a website.
Many local business owners are surprised by the information that appears when they (and their customers) come across their business listings on Google and Bing. Often, incorrect or out-of-date information shows up with no explanation of where it came from.