Anyone creating a website will naturally try to make it as easy and clear as possible for a human being to read and navigate through the different pages, but how would a robot see it? At this point you’re thinking, ‘I don’t really care if a robot can read my website or not; they represent a relatively small proportion of my clientele,’ or worse yet, you are the type of person who doesn’t serve “their kind.” To which I would say, first, you are being very closed-minded, and second, perhaps you are confusing robots with cyborgs. I will leave the debate over how intelligent, self-aware cyborgs should be treated in our society to the philosopher-sci-fi novelists and explain what I mean by robots. Google is the prime example. It runs massive networks of servers that scour the Internet day and night, indexing every page they find. These servers are sometimes called robots, or spiders. Indexing means that they save the data from each page they see in a database and rank it against certain keywords found on the page.
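To make "indexing" concrete, here is a toy sketch of the idea in Python. The page URLs and text are made up for illustration, and real search engines use far more sophisticated ranking, but the core data structure is the same: a map from each keyword to the pages that contain it.

```python
from collections import defaultdict

# Hypothetical pages a robot has crawled (URLs and text invented for this example).
pages = {
    "example.com/roses": "fresh roses delivered daily",
    "example.com/tulips": "tulips and roses for any occasion",
}

# Build the index: keyword -> set of pages containing that keyword.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# A search for "roses" now finds every page that mentions it.
print(sorted(index["roses"]))
# ['example.com/roses', 'example.com/tulips']
```

When a user searches for a keyword, the engine looks it up in this index rather than re-reading the whole web, which is why being indexed under the right keywords matters so much.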
These robots see pages differently than the human eye does. For the most part they ignore graphics, pictures, and animation, but they see something that a human visitor to your site doesn’t: meta-data, information embedded in your website’s pages but hidden from the human eye. Making sure that your website is readable and properly represents itself to the search engine robots is called Search Engine Optimization (SEO). SEO is often an afterthought when creating a new website, but you can start with a considerable advantage if you build your new website with SEO in mind from the beginning.
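Here is a small sketch, using only Python’s standard library, of how a robot might pull that hidden meta-data out of a page. The sample HTML and its content are invented for illustration; the point is that the description and keywords live in the page’s head, where no human visitor ever sees them.

```python
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    """Collects <meta name="..." content="..."> tags, which are
    invisible to human visitors but plainly visible to robots."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

# A made-up page: the human sees only the logo image in the body,
# while the robot also reads the meta-data in the head.
page = """<html><head>
<title>Fresh Flowers Delivered</title>
<meta name="description" content="Same-day flower delivery in your city.">
<meta name="keywords" content="flowers, delivery, bouquets">
</head><body><img src="logo.png"></body></html>"""

reader = MetaReader()
reader.feed(page)
print(reader.meta["description"])
# Same-day flower delivery in your city.
```

Writing those tags carefully, so they honestly describe each page, is one of the simplest places to start with SEO.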