Each time a web user enters a search into a search engine, a series of events unfolds at blazing speed. First, a link analysis algorithm is triggered, which assigns a numerical value to the elements, predominantly keywords, of hyperlinked documents.
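As a toy illustration of assigning numerical values to a document's keywords, the sketch below scores each query term by its frequency in the document. Real engines combine far richer signals; the document text and query terms here are invented examples.

```python
def keyword_scores(document, query_terms):
    """Score each query term by its relative frequency in the document."""
    words = document.lower().split()
    return {term: words.count(term.lower()) / len(words) for term in query_terms}

# Made-up ten-word document: 'search' and 'pages' each appear twice.
doc = "search engines rank pages so a search returns relevant pages"
scores = keyword_scores(doc, ["search", "pages", "weather"])
# 'search' and 'pages' score 0.2; 'weather' never appears, so it scores 0.
```

A term that appears more often scores higher, which is the simplest possible version of the quantitative weighting described above.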
The objective of this quantitative weighting is to measure the relative relevance of each document to the search the web user entered. The underlying mathematical algorithm operates on the web graph, in which World Wide Web pages form the nodes and hyperlinks the edges, with higher priority placed on authoritative hubs (e.g. cnn.com, the Huffington Post, bbc.com). The value derived, the rank value, indicates the importance or relevance of a particular page, and it is the basis behind search engine optimization.
To break this down into more manageable terms: a hyperlink leading to a page is treated as a vote of support for that page. A page's ranking is based on the number of links pointing to it and on the PageRank of the pages containing those links (its ‘incoming links’). In essence, any page that is linked to by many other pages which themselves have high rankings will automatically receive a high ranking.
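The vote-of-support idea above can be sketched as a small PageRank computation via power iteration. The three-page graph, the damping factor, and the iteration count are illustrative assumptions, not Google's actual parameters.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # Each page passes its rank, in equal shares, along its links:
                # the 'votes of support' described above.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:
                # Dangling page with no outgoing links: spread its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Page C collects incoming links from both A and B, so it ranks highest.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

Because C receives votes from two pages while B receives only half of A's vote, the iteration converges with C ranked above A and A above B.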
The enormous amount of information that has been injected into the web would make manually searching for what users need virtually impossible without a proper ‘filing system’. Google's ranking construct is akin to a librarian who sorts billions of webpages and returns the most relevant or most useful ones within a second. Google's ranking systems are made up of a number of algorithms that examine a variety of factors, such as the words used in the query, the expertise of sources, location and browser settings, and even the usability of the pages found. The weight of relevance applied to each of these factors varies according to the nature of the query (e.g. freshness of content is given more priority for queries about current news than for dictionary definitions).
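That query-dependent weighting can be sketched as a weighted sum of ranking signals, with a different weight profile per query type. All signal values, weights, and the two query types here are invented for illustration.

```python
# Hypothetical weight profiles: freshness dominates for news queries,
# textual relevance dominates for dictionary-style lookups.
WEIGHTS = {
    "news":       {"relevance": 0.4, "authority": 0.2, "freshness": 0.4},
    "definition": {"relevance": 0.6, "authority": 0.3, "freshness": 0.1},
}

def score(signals, query_type):
    """Combine a page's signals using the weights for this query type."""
    weights = WEIGHTS[query_type]
    return sum(weights[name] * signals[name] for name in weights)

# A fresh, reasonably authoritative page scores better for a news query
# than for a definition query, where freshness barely matters.
page = {"relevance": 0.8, "authority": 0.7, "freshness": 0.9}
news_score = score(page, "news")
definition_score = score(page, "definition")
```

The same page receives different scores depending on the query's nature, which is the point the paragraph above makes about freshness and current news.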
Google’s page rank structure and process is rigorous, to say the least, in order to maintain the high search-quality standards of a company at the top of the ‘search engine food chain’. Google performs live tests and also employs thousands of professional Search Quality Raters across the globe who abide by strict guidelines and play a critical role in developing the search algorithms. In essence, the page rank system first examines ‘the meaning of a query’ before seeking out relevant webpages; from there, a quality-based filtration process steps in before the usability of each webpage is gauged. The final search results returned also reflect the user's context, such as his or her browser settings.
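The staged process described above can be sketched as a simple pipeline: interpret the query, retrieve relevant pages, filter on quality, and order the survivors by usability. The page data, URLs, and quality threshold are illustrative assumptions.

```python
# Hypothetical index entries: each page carries its terms plus
# precomputed quality and usability scores (made-up values).
pages = [
    {"url": "a.example", "terms": {"python"}, "quality": 0.9, "usability": 0.8},
    {"url": "b.example", "terms": {"python"}, "quality": 0.3, "usability": 0.9},
    {"url": "c.example", "terms": {"golf"},   "quality": 0.8, "usability": 0.7},
]

def search(query, pages, min_quality=0.5):
    terms = set(query.lower().split())                    # 1. interpret the query
    relevant = [p for p in pages if terms & p["terms"]]   # 2. find relevant pages
    vetted = [p for p in relevant if p["quality"] >= min_quality]  # 3. quality filter
    return sorted(vetted, key=lambda p: p["usability"], reverse=True)  # 4. usability

results = search("python", pages)
# b.example is relevant but fails the quality filter; c.example is
# irrelevant to the query; only a.example survives all stages.
```

Each stage narrows the candidate set before the next one runs, mirroring the filtration order the paragraph describes.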
While this structure is how Google operates, an SEO company that assists in ranking websites needs to follow all relevant guidelines for what Google calls “White Hat” ranking methods. This means ranking sites based on their content and relevance, not using tactics that inflate rankings without providing clear, informative content for each search term.