Understanding Google's Algorithms
An algorithm is defined as: "In mathematics, computing, and related subjects, an algorithm is an effective method for solving a problem using a finite sequence of instructions. Algorithms are used for calculation, data processing, and many other fields."
Obviously, Google's algorithm (or "algo," as I will refer to it from this point on) is a secret and highly protected, though some basic information has been released publicly since the release of their patent. The purpose of this article is to give you a basic idea of what Google is doing on their end when reviewing each website they spider. Although the algo sees slight, frequent changes, the basics have remained the same for years and seem to be a permanent structure. A reminder: this article is not going to put you #1 for your keyword. There is obviously a lot more to SEO than knowing the algo, but using the algo to your benefit will definitely get you on the right track to earning your rankings. Google's considerations when a spider or robot finds your site:
As you can see, and as many of you know, Google takes an extensive look into your site each time it's spidered. SEO is a full-time job for any site, and understanding the things Google looks for each time they visit your site is a good starting point to ranking it. About the Author: Bobby Taylor, also known as 2bet, has spent nearly 11 years in the adult industry. In 2004, he successfully combined gaming and adult through Webmaster Poker Tournaments on 2bet.com. He credits the rapid growth of 2bet.com to successful search engine optimization, and moved solely into SEO in 2007. In 2008, SEO AP was publicly launched, and recently, in 2009, a sister company site, X RATED SEO, emerged. |
:thumbsup
|
Sig spot :)
|
Bobby, thanks for adding to our Educational Series! This topic is sure to warrant much discussion! :)
|
Looks like great info for the beginners.
|
Great read, thanks for taking the time.
|
Very good info Bobby :thumbsup
I'd like to add one other factor, very often overlooked: having valid HTML/CSS code. While I don't have a solid case study, it's been my experience that valid documents rank better. Not to mention, it's just good practice anyway, as having valid code will prevent most cross-browser "surprises" and make your site more accessible. Validation check tools can also alert you to where you may have forgotten alt attributes on images, etc. I recommend using this free validation tool: http://validator.w3.org/ |
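Seconding the missing-alt point above: you can catch that particular issue yourself with a few lines of script. Here's an illustrative sketch using Python's standard-library HTML parser (the function and class names are my own, not from any SEO tool):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the positions of <img> tags that lack an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            # getpos() returns (line, column) of the current tag
            self.missing_alt.append(self.getpos())

def find_missing_alt(html):
    checker = AltChecker()
    checker.feed(html)
    return checker.missing_alt

page = '<p><img src="a.jpg" alt="logo"><img src="b.jpg"></p>'
print(find_missing_alt(page))  # flags the second <img>, which has no alt
```

The W3C validator does this (and much more) for you, but a script like this is handy for batch-checking a large site.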
Good stuff bud :thumbsup
|
Great writeup. I would mention that you should not use <b> tags at all, however; they are deprecated...
<strong> for bold, <em> for emphasized (italics) are the correct standards :P |
Some solid 101(and maybe 101,5) advice there :) This thread will be interesting.
|
Good job.
|
Nice read. Thank you kind sire.
:thumbsup |
I FUCKING own Google:
Domain: 13 factors
1. Domain age;
2. Length of domain registration;
3. Domain registration information hidden/anonymous;
4. Site top-level domain (geographical focus, e.g. .com versus .co.uk);
5. Site top-level domain (e.g. .com versus .info);
6. Subdomain or root domain?
7. Domain past records (how often it changed IP);
8. Domain past owners (how often the owner changed);
9. Keywords in the domain;
10. Domain IP;
11. Domain IP neighbors;
12. Domain external mentions (non-linked);
13. Geo-targeting settings in Google Webmaster Tools

Server-side: 2 factors
1. Server geographical location;
2. Server reliability / uptime

Architecture: 8 factors
1. URL structure;
2. HTML structure;
3. Semantic structure;
4. Use of external CSS / JS files;
5. Website structure accessibility (use of inaccessible navigation, JavaScript, etc.);
6. Use of canonical URLs;
7. "Correct" HTML code (?);
8. Cookie usage

Content: 14 factors
1. Content language;
2. Content uniqueness;
3. Amount of content (text versus HTML);
4. Unlinked content density (links versus text);
5. Pure text content ratio (without links, images, code, etc.);
6. Content topicality / timeliness (for seasonal searches, for example);
7. Semantic information (phrase-based indexing and co-occurring phrase indicators);
8. Content flag for general category (transactional, informational, navigational);
9. Content / market niche;
10. Flagged keyword usage (gambling, dating vocabulary);
11. Text in images (?);
12. Malicious content (possibly added by hackers);
13. Rampant misspelling of words, bad grammar, and 10,000-word screeds without punctuation;
14. Use of absolutely unique / new phrases

Internal Cross Linking: 5 factors
1. Number of internal links to a page;
2. Number of internal links to a page with identical / targeted anchor text;
3. Number of internal links to a page from content (instead of navigation bar, breadcrumbs, etc.);
4. Number of links using the "nofollow" attribute (?);
5. Internal link density

Website factors: 7 factors
1. Website robots.txt file content;
2. Overall site update frequency;
3. Overall site size (number of pages);
4. Age of the site since it was first discovered by Google;
5. XML sitemap;
6. On-page trust flags (contact info (even more important for local search), privacy policy, TOS, and similar);
7. Website type (e.g. blog instead of informational sites in the top 10)

Page-specific factors: 9 factors
1. Page meta robots tags;
2. Page age;
3. Page freshness (frequency of edits and % of page affected (changed) by edits);
4. Content duplication with other pages of the site (internal duplicate content);
5. Page content reading level (?);
6. Page load time (many factors in here);
7. Page type (About Us page versus main content page);
8. Page internal popularity (how many internal links it has);
9. Page external popularity (how many external links it has relative to other pages of this site)

Keyword usage and keyword prominence: 13 factors
1. Keywords in the title of a page;
2. Keywords in the beginning of the page title;
3. Keywords in alt tags;
4. Keywords in anchor text of internal links (internal anchor text);
5. Keywords in anchor text of outbound links (?);
6. Keywords in bold and italic text (?);
7. Keywords in the beginning of the body text;
8. Keywords in body text;
9. Keyword synonyms relating to the theme of the page/site;
10. Keywords in filenames;
11. Keywords in the URL;
12. No "randomness on purpose" (placing "keyword" in the domain, "keyword" in the filename, "keyword" starting the first word of the title, "keyword" in the first word of the first line of the description and keyword tag…);
13. The use (abuse) of keywords in HTML comment tags

Outbound links: 8 factors
1. Number of outbound links (per domain);
2. Number of outbound links (per page);
3. Quality of pages the site links to;
4. Links to bad neighborhoods;
5. Relevancy of outbound links;
6. Links to 404 and other error pages;
7. Links to SEO agencies from a client site;
8. Hot-linked images

Backlink profile: 21 factors
1. Relevancy of sites linking in;
2. Relevancy of pages linking in;
3. Quality of sites linking in;
4. Quality of web pages linking in;
5. Backlinks within a network of sites;
6. Co-citations (which sites have similar backlink sources);
7. Link profile diversity:
   a. Anchor text diversity;
   b. Different IP addresses of linking sites;
   c. Geographical diversity;
   d. Different TLDs;
   e. Topical diversity;
   f. Different types of linking sites (blogs, directories, etc.);
   g. Diversity of link placements;
8. Authority links (CNN, BBC, etc.) per inbound link;
9. Backlinks from bad neighborhoods (absence / presence of backlinks from flagged sites);
10. Reciprocal links ratio (relative to the overall backlink profile);
11. Social media links ratio (links from social media sites versus the overall backlink profile);
12. Backlink trends and patterns (like sudden spikes or drops in backlink numbers);
13. Citations in Wikipedia and DMOZ;
14. Backlink profile historical records (ever caught link buying/selling, etc.);
15. Backlinks from social bookmarking sites

Each separate backlink: 6 factors
1. Authority of the TLD (.com versus .gov);
2. Authority of the domain linking in;
3. Authority of the page linking in;
4. Location of the link (footer, navigation, body text);
5. Anchor text of the link (and alt tag of images linking);
6. Title attribute of the link (?)

Visitor Profile and Behavior: 6 factors
1. Number of visits;
2. Visitors' demographics;
3. Bounce rate;
4. Visitors' browsing habits (which other sites they tend to visit);
5. Visiting trends and patterns (like sudden spikes in incoming traffic);
6. How often the listing is clicked within the SERPs (relative to other listings)

Penalties, Filters and Manipulation: 12 factors
1. Keyword overuse / keyword stuffing;
2. Link buying flag;
3. Link selling flag;
4. Spamming records (comment, forum, and other link spam);
5. Cloaking;
6. Hidden text;
7. Duplicate content (external duplication);
8. History of past penalties for this domain;
9. History of past penalties for this owner;
10. History of past penalties for other properties of this owner (?);
11. Past hacker attack records;
12. 301 flags: double redirects/redirect loops, or redirects ending in a 404 error

More factors: 6 factors
1. Domain registration with Google Webmaster Tools;
2. Domain presence in Google News;
3. Domain presence in Google Blog Search;
4. Use of the domain in Google AdWords;
5. Use of the domain in Google Analytics;
6. Business name / brand name external mentions. |
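One item in the list above, anchor text diversity, is easy to approximate for your own backlink data. Here's a minimal sketch (the scoring choice and function name are mine, not anything Google has published): the share of distinct anchor texts among all backlinks.

```python
from collections import Counter

def anchor_diversity(anchors):
    """Rough diversity score: unique anchor texts / total backlinks.
    1.0 means every link uses different text; a score near 0 means one
    phrase is repeated everywhere, a classic bulk-link-building footprint."""
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return len(counts) / len(anchors)

links = ["porn videos", "porn videos", "porn videos", "2bet.com", "click here"]
print(round(anchor_diversity(links), 2))  # 3 unique anchors / 5 links = 0.6
```

A natural-looking profile mixes branded anchors, bare URLs, and varied phrases, so a very low score on a large sample is worth investigating.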
Great :thumbsup
|
Interesting info, thx for sharing! :)
|
Wow an actual post with relevant information about a topic everyone likes to bullshit about knowing. Good stuff. :thumbsup:thumbsup
|
I think Jill has a crush on that Bobby guy...just saying.... I heard they had a *THING* and that's why hes allowed to post. hehehe
Injustice! |
Good read :)
|
Very nice my friend.
|
Good overview for peeps. :)
|
nice read bobby thanks
what about time spent on site - isn't that a considerable factor these days? |
Quote:
Time spent on site, pages viewed, bounce rate, etc. To the rest of the replies: thank you all for such great responses |
Quote:
|
great info.
|
GFy might be serious about the change.. Thanks for great BIZ thread.
|
Great article, and subsequent reply.
info-tastic. |
Very informative and I am bookmarking this thread!
|
Good stuff bro, keep it coming. ;-)
|
great info there ..
|
hi, thx for the article, could you or someone else elaborate on the 3 keywords max rule? is that 3 uses per keyword max for an entire site? or per blog entry, or?..
thx in advance! |
Quote:
Your index meta Title, Keywords and Description should only contain the same word a max of 3 times, i.e. Porn, Porn Videos, Porn Movies. Once "Porn" is used up, you can use the others a couple more times, like Adult Videos, Adult Movies, etc. While on this topic, Google actually reads "single word comma single word comma" as a mixture of phrases. Example below: adult, porn, sex, movies, videos. The above can be read and related to Adult Porn, Adult Sex, Sex Movies, Sex Porn, Porn Videos, Adult Videos, Videos Sex, Adult Porn Movies, etc. Five related keywords can generate a variety of phrases when spidered. Your Title and Description are actually more effective when keywords are used 2 times instead of 3, and I, as well as many others, have seen more results from using your most important keyword within the first few words. (Bing is also starting to shift focus to keyword placement and usage in your Title/Description and, within the past couple of months, URL.) As for blog posts: I assume you're using All in One SEO or something similar, which adds meta to each individual post. If this is the case, then Google is going to index that page individually, so once again you're able to use up to 3 on each post, most related to your content. Google will not see these keywords while viewing the index of your blog, only when it follows the post and views the single.php page. These pages are what I recommend you deep link to, not only to increase your pages indexed faster, but to move them up in the SERPs for the keywords you're going after. |
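The "mixture of phrases" idea above can be sketched mechanically. Below is a rough Python illustration (my own helper names, and a simplified model of the idea described in the post, not Google's actual parsing) that enumerates the ordered two-word pairings of a comma-separated keyword list, plus a checker for the max-3-uses rule of thumb:

```python
from collections import Counter
from itertools import permutations

def phrase_mixtures(meta_keywords, length=2):
    """All ordered word pairings of a comma-separated keyword list,
    mimicking how the post says keywords can be recombined into phrases."""
    words = [w.strip() for w in meta_keywords.split(",")]
    return [" ".join(p) for p in permutations(words, length)]

def overused_words(text, limit=3):
    """Words repeated more than `limit` times in a title/description,
    per the max-3 rule of thumb discussed in the thread."""
    counts = Counter(text.lower().split())
    return {w: n for w, n in counts.items() if n > limit}

mixes = phrase_mixtures("adult, porn, sex, movies, videos")
print(len(mixes))              # 5 * 4 = 20 ordered two-word phrases
print("porn videos" in mixes)  # True
print(overused_words("porn videos porn movies porn clips porn tube"))  # {'porn': 4}
```

The point of the combinatorics: five well-chosen keywords can cover twenty candidate phrases, which is why repeating the same word past the limit wastes slots you could spend on synonyms.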
nice, thank you again! thread bookmarked.
|
Good stuff 2bet
|
Good stuff, I enjoyed the read
|
educational post :thumbsup
|
bookmarked for future reference
|
domain age isn't a major factor ;)
|
awesome read, thank you very much :)
|
:hi thanks
|
Nice job.
|
cool, thanks
|
Quote:
Oh shit Sleazy is fucked. He will never get his sites listed well in google :1orglaugh :winkwink: |
Thanks for sharing the info. Very helpful.
I have a question. I've assumed that Google will detect how much traffic your site gets and will give more love to sites that get a lot of traffic. If a site gets 10 hits a day, I assume they know this (as they know everything) and won't index you well. How many hits a day do you think a site needs to avoid being penalized for being unpopular? If I buy some good, relevant inbound links but don't get good placement on those sites, and in turn get little traffic from them, I wonder how much traffic I should buy for the site and how good the quality should be (all US, or a mix of traffic). I ask because I am building hundreds of sites and can't go balls to the wall with traffic on all of them. But I can buy more traffic to help make them appear more popular, in addition to whatever traffic I get from hard links. Thanks. |
Thanks for the great post!
|
Quote:
|
in regards to the whois info thing, do you think there is an advantage in keeping whois info private from an SEO point of view? do you think that google looks at that info to find networks of sites belonging to one person? and thanks!!
|
Quote:
|
Excellent thread and posts. Bookmarked.
Thanks 2bet and marketsmart! |
Quote:
BTW cool article. |
Great read Bobby thanks. I need to get with you so we can finish the stuff we chatted about a couple months ago. Just never enough hours in the day it seems.
|
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2024, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc