A search engine friendly architecture ensures that there are no barriers for the “bots” that Google and Bing send out. These automated programs constantly scour the web for new information to add to their indexes. It is therefore very important that the backend code accommodates the bots and does not generate broken markup that may stop them.
SiteBuilder developers have worked hard over the years to make our architecture and sites search engine friendly. It’s also easy to add search engine optimization code to the sites. This is essential if you want to gain targeted traffic in organic search.
When shopping for a Website Builder, make sure that you do some research in this area. Some of the top Website Builders on the market today are not search engine friendly (although they claim to be). As a search engine marketer, I have examined the code and the backend of many of these sites in an attempt to help them rank. Sometimes, the recommendation is, unfortunately, to move the site to a search-friendly platform.
My advice for anyone starting a new site with an online site builder is to look into the technical aspects first. Don’t get caught up in design and forget the main point of the site: being found in search and driving business to you. A site that won’t rank will not do you any good.
Now let’s assume that you have selected a search-friendly site. Here are some tips to keep it that way.
- One of the worst things you can do is set up an intro screen or “splash” page before visitors reach your main content. This is often Flash, which has no readable content for the search engines to pick up. Your home page is the most relevant page to the search engines; do not waste it on a splash page. If that’s not reason enough, consider this: people want their content fast. Believe me, no one wants to sit through an intro page.
- Using I-Frames. Do not place content in I-Frames. The search engines do not recognize this as on-page content and will not index it.
- Using images instead of written text on your pages. Search engines cannot index text that is in an image. Make sure you do not overuse images where you could use written text that the search engine can index.
- Slow page load times. Google looks at page load times, and heavy pages may be penalized. Therefore, you need to keep the page loading as fast as possible. If you need to put images on the page, make sure they are optimized locally (on your local machine, before uploading).
A JPG should always be compressed. For instance, using Photoshop or Photoshop Elements, a 1.3 MB photo can be compressed down to around 60 KB without a noticeable loss of definition. A quality setting of 60 is a good starting point when compressing JPG images. All good image editing software will have a similar feature. A good image editing program is a must!
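If you would rather automate this step than compress each photo by hand, the same quality-60 re-save can be scripted. Here is a minimal sketch in Python, assuming the third-party Pillow imaging library is installed (`pip install Pillow`); the `compress_jpeg` function name is my own.

```python
# Sketch: re-save a JPEG at a lower quality before uploading it.
# Assumes the third-party Pillow library (pip install Pillow).
from PIL import Image

def compress_jpeg(src_path: str, dst_path: str, quality: int = 60) -> None:
    """Re-save a JPEG at the given quality; 60 is the setting suggested above."""
    img = Image.open(src_path)
    img = img.convert("RGB")  # JPEG has no alpha channel
    img.save(dst_path, "JPEG", quality=quality, optimize=True)

# Example: compress_jpeg("photo.jpg", "photo-web.jpg")
```

The `optimize=True` flag asks Pillow to spend a little extra time producing smaller Huffman tables, which shaves a few more kilobytes off the file at no cost in quality.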
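It is also easy to check your own pages for the I-Frame and image-only-text pitfalls described above. The following sketch uses only Python's standard `html.parser` module; the `SeoAudit` class name and the sample page are illustrative, not part of any particular tool.

```python
# Sketch: scan a page's HTML for iframes and for images missing alt text,
# using only the Python standard library.
from html.parser import HTMLParser

class SeoAudit(HTMLParser):
    """Collects warnings for <iframe> tags and <img> tags without alt text."""
    def __init__(self):
        super().__init__()
        self.warnings = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "iframe":
            self.warnings.append("iframe found: search engines may not index its content")
        elif tag == "img" and not attrs.get("alt"):
            self.warnings.append("img without alt text: " + attrs.get("src", "?"))

# Illustrative sample page with both problems.
page = '<html><body><iframe src="menu.html"></iframe><img src="logo.png"></body></html>'
audit = SeoAudit()
audit.feed(page)
for warning in audit.warnings:
    print(warning)
```

Running a quick scan like this over your pages before publishing catches content that the search engines would otherwise never see.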