Search engines use computer programmes called crawlers or spiders to index websites. These automated programmes gather information about your website so that it can be presented to people who use the search engine to look for the product, service or information you provide.
To bring the crawlers and spiders to your website, you can ‘manually’ submit your pages to a search engine by completing its submission form, after which the spiders will index your entire site.
While this type of submission is often seen as a way to promote a website quickly, it generally is not necessary because the major search engines use crawlers and spiders that will eventually find most websites on the internet all by themselves. So, your online work from home business will be found if you have relevant content which is updated regularly.
A spider will read the content on the site itself, the site’s meta tags (code that provides information about a website) and also follow the links on the page. The spider then returns all that information to a central repository, where the data is indexed. It will visit each link you have on your website and index those sites as well. Some spiders will only index a certain number of pages on your site, so be warned if you create a site with hundreds of pages!
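To make this concrete, here is a minimal Python sketch of the parsing step a spider performs: reading meta tags, collecting links and gathering the visible text. It uses only the standard library; the sample HTML page and URL are invented for illustration, and real crawlers are of course far more robust.

```python
from html.parser import HTMLParser

class SpiderParser(HTMLParser):
    """Collects meta tags, links and visible text, roughly as a crawler would."""
    def __init__(self):
        super().__init__()
        self.meta = {}    # meta tag name -> content attribute
        self.links = []   # href values found in <a> tags
        self.text = []    # visible text fragments

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")
        elif tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

# A made-up page, standing in for one the spider has fetched.
page = """
<html><head>
  <meta name="description" content="Tips for an online home business">
  <meta name="keywords" content="home business, online work">
</head><body>
  <h1>Online Home Business Tips</h1>
  <a href="https://example.com/articles">More articles</a>
</body></html>
"""

parser = SpiderParser()
parser.feed(page)
print(parser.meta)   # the meta tags the spider reads
print(parser.links)  # the links it will follow next
```

Everything collected here would be sent back to the repository for indexing, and each entry in `links` would be fetched and parsed in the same way.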
The spider will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the search engine. If you are running an online work from home business, this is why it is important to keep your website content regularly updated.
The index a spider builds is a bit like a book: it contains a table of contents, the actual content, and the links and references for all the websites found during its search. A spider may index up to a million pages a day.
When you ask a search engine to locate information, it is actually searching through the index it has created, not the entire internet. Different engines produce different rankings because they use different algorithms: the formulas search engines use to determine the significance of a web page.
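A simple way to picture searching an index rather than the live internet is an inverted index: a lookup from each word to the pages that contain it. The sketch below builds one in Python from a tiny invented set of pages; real engines store far more than a word-to-page mapping, so treat this purely as an illustration.

```python
# Tiny invented corpus, standing in for pages a spider has already indexed.
pages = {
    "page1": "tips for starting an online home business",
    "page2": "home business ideas and online work",
    "page3": "gardening advice for beginners",
}

# Inverted index: word -> set of page ids containing that word.
index = {}
for page_id, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(page_id)

def search(query):
    """Return the pages containing every query word (intersection of postings)."""
    results = None
    for word in query.split():
        postings = index.get(word, set())
        results = postings if results is None else results & postings
    return sorted(results or [])

print(search("online business"))  # ['page1', 'page2']
```

The query never touches the original pages: it only intersects the postings lists already stored in the index, which is why results come back in a fraction of a second.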
One of the things an algorithm scans for is the frequency and location of keywords on a web page. Keywords are the words or phrases that the website author is targeting, so that search engines can find the page and present it to readers or customers who have searched for that word or phrase. For example, when someone searches for “online home business tips”, they want to find information related to that search.
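Frequency and location can be sketched as a toy scoring function: count how often the keyword appears, and weigh prominent locations (such as the title) more heavily than the body. The weights below are arbitrary assumptions for illustration, not any real engine’s values.

```python
def keyword_score(title, body, keyword):
    """Toy relevance score: a keyword hit in the title (weight 3, an assumed
    value) counts for more than a hit in the body text (weight 1)."""
    kw = keyword.lower()
    title_hits = title.lower().split().count(kw)
    body_hits = body.lower().split().count(kw)
    return 3 * title_hits + 1 * body_hits

# Invented page text for illustration.
score = keyword_score(
    title="Online Home Business Tips",
    body="Running a business from home takes work. A home office helps.",
    keyword="home",
)
print(score)
```

A page whose target keyword appears in the title as well as the body will score higher than one that only mentions it in passing, which is the basic idea behind placing keywords where they matter.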
Algorithms can also detect artificial keyword stuffing, or spamdexing: the deliberate manipulation of search engine indexes, such as repeating unrelated key phrases or words.
Algorithms also analyse the way pages link to other pages on the internet. By checking how pages link to each other, an engine can determine what a page is about and whether the keywords of the linked pages are similar to the keywords on the original page.
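The best-known form of this link analysis is PageRank, where a page gains importance from being linked to by other important pages. Below is a heavily simplified PageRank-style calculation in Python over a tiny hand-made link graph; the graph and the damping value are illustrative assumptions, not any real engine’s data.

```python
# A tiny invented link graph: page -> pages it links to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

damping = 0.85  # standard PageRank damping factor
rank = {page: 1 / len(links) for page in links}  # start with equal scores

# Repeatedly redistribute each page's score along its outgoing links.
for _ in range(50):
    new_rank = {}
    for page in links:
        incoming = sum(rank[src] / len(out)
                       for src, out in links.items() if page in out)
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Page “c” ends up ranked highest because two pages link to it, while “b” has only one inbound link: a page’s importance comes from who links to it, not just from its own content.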