Daily Archives: July 9, 2015


Do You Need Google Analytics?

Most likely you could benefit from Google Analytics. Here are just a few of the many questions about your website that you can answer with it:

  • How many people visit my website?
  • Where do my visitors live?
  • Do I need a mobile-friendly website?
  • What websites send traffic to my website?
  • What marketing tactics drive the most traffic to my website?
  • Which pages on my website are the most popular?
  • How many visitors have I converted into leads or customers?
  • Where did my converting visitors come from, and where did they go on my website?
  • How can I improve my website’s speed?
  • What blog content do my visitors like the most?

Browser Compatibility

Users typically view your website using a browser. Each browser interprets your website’s code in a slightly different manner, which means the site may appear differently to visitors using different browsers. In general, you should avoid relying on browser-specific behavior, such as expecting a browser to correctly detect a content-type or encoding that you did not specify. In addition, there are some steps you can take to make sure your site doesn’t behave in unexpected ways.
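For example, declaring the document type and character encoding up front removes two common sources of browser guessing. A minimal sketch (the page content is illustrative):

```html
<!DOCTYPE html> <!-- an explicit doctype keeps browsers out of quirks mode -->
<html lang="en">
  <head>
    <!-- declare the encoding yourself instead of letting each browser guess -->
    <meta charset="utf-8">
    <title>Example Page</title>
  </head>
  <body>
    <p>Content renders consistently when browsers don’t have to guess.</p>
  </body>
</html>
```

Sending the correct `Content-Type` header (e.g. `text/html; charset=utf-8`) from your server accomplishes the same thing at the HTTP level.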
Test your site in as many browsers as possible
Once you’ve created your web design, you should review your site’s appearance and functionality on multiple browsers to make sure that all your visitors are getting the experience you worked so hard to design. Ideally, you should start testing as early in your site development process as possible. Different browsers – and even different versions of the same browser – can see your site differently.


Why Responsive Design?

We recommend using responsive web design because it:

  • Makes it easier for users to share and link to your content with a single URL.
  • Helps Google’s algorithms accurately assign indexing properties to the page rather than needing to signal the existence of corresponding desktop/mobile pages.
  • Requires less engineering time than maintaining multiple pages for the same content.
  • Reduces the possibility of the common mistakes that affect mobile sites.
  • Requires no redirection for users to have a device-optimized view, which reduces load time. Also, user agent-based redirection is error-prone and can degrade your site’s user experience.
  • Saves resources when Googlebot crawls your site. For responsive web design pages, a single Googlebot user agent only needs to crawl your page once, rather than crawling multiple times with different Googlebot user agents to retrieve all versions of the content. This improvement in crawling efficiency can indirectly help Google index more of your site’s content and keep it appropriately fresh.

Responsive Web Design

Responsive web design (RWD) is a setup where the server always sends the same HTML code to all devices and CSS is used to alter the rendering of the page on the device.

Google’s algorithms should be able to automatically detect this setup if all Googlebot user agents are allowed to crawl the page and its assets (CSS, JavaScript, and images).

Responsive design serves all devices with the same code that adjusts for screen size.

Pages optimized for a variety of devices must include a meta viewport element in the head of the document. A meta viewport tag gives the browser instructions on how to control the page’s dimensions and scaling.

When the meta viewport element is absent, mobile browsers default to rendering the page at a desktop screen width (usually about 980px, though this varies across devices). They then try to make the content look better by increasing font sizes and either scaling the content to fit the screen or showing only the part of the content that fits within the screen.
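As a minimal sketch, the same HTML can adapt to any screen with a meta viewport tag plus a CSS media query (class names and the 600px breakpoint are illustrative):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <!-- match the page width to the device width instead of the ~980px default -->
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      /* two columns on wide screens */
      .sidebar { float: left; width: 30%; }
      .main    { float: left; width: 70%; }
      /* stack the columns on screens narrower than 600px */
      @media (max-width: 600px) {
        .sidebar, .main { float: none; width: 100%; }
      }
    </style>
  </head>
  <body>
    <div class="sidebar">Navigation</div>
    <div class="main">Article text</div>
  </body>
</html>
```

Every device receives this same HTML; only the CSS rules that apply change, which is exactly what lets a single URL serve all visitors.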


SEO Optimization

Best practices: Whitehat SEO

These techniques aim to improve a site by focusing on the visitors instead of on ranking higher. Examples of good, whitehat techniques include creating organic, high-quality content and adding the good descriptive tags covered in the previous module. They adhere to Google’s Webmaster Guidelines, which your site should follow to rank well and organically in Google Search. If you’re looking to hire an SEO, make sure the SEO does not use blackhat techniques. Not even the most experienced SEO can guarantee a certain rank for your site. Establish upfront your goals, how the SEO will reach them, and the metrics you’ll use to evaluate success.

Accurately describe the page’s content

Choose a title that effectively communicates the topic of the page’s content.

AVOID:

  • choosing a title that has no relation to the content on the page
  • using default or vague titles like “Untitled” or “New Page 1”

Create unique title tags for each page

Each of your pages should ideally have a unique title tag, which helps Google know how the page is distinct from the others on your site.

AVOID using a single title tag across all of your site’s pages or a large group of pages

Use brief, but descriptive titles

Titles can be both short and informative. If the title is too long, Google will show only a portion of it in the search result.

AVOID:

  • using extremely lengthy titles that are unhelpful to users
  • stuffing unneeded keywords in your title tags

Accurately summarize the page’s content

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result.

AVOID:

  • writing a description meta tag that has no relation to the content on the page
  • using generic descriptions like “This is a web page” or “Page about baseball cards”
  • filling the description with only keywords
  • copying and pasting the entire content of the document into the description meta tag

Use unique descriptions for each page

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (e.g. searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn’t feasible. In this case, you could automatically generate description meta tags based on each page’s content.

AVOID using a single description meta tag across all of your site’s pages or a large group of pages.
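A description meta tag sits in the `<head>` alongside the title; a hand-written one for a single page might look like this (the wording and page are illustrative):

```html
<head>
  <title>2015 Rookie Card Price Guide – Brandon’s Baseball Cards</title>
  <!-- a unique, human-readable summary that Google may show as the result snippet -->
  <meta name="description"
        content="Up-to-date prices for 2015 rookie baseball cards, with grading
                 tips and a week-by-week market summary.">
</head>
```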

Improve the structure of your URLs

Simple-to-understand URLs will convey content information easily

URLs are displayed in search results

Use words in URLs

URLs with words that are relevant to your site’s content and structure are friendlier for visitors navigating your site. Visitors remember them better and might be more willing to link to them.

AVOID:

  • using lengthy URLs with unnecessary parameters and session IDs
  • choosing generic page names like “page1.html”
  • using excessive keywords like “baseball-cards-baseball-cards-baseballcards.htm”

Create a simple directory structure

Use a directory structure that organizes your content well and makes it easy for visitors to know where they are on your site. Try using your directory structure to indicate the type of content found at that URL.

Provide one version of a URL to reach a document
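The same document is often reachable at several URLs (with and without `www`, with tracking parameters appended, and so on). One common way to designate a preferred version, alongside 301-redirecting the alternates, is a canonical link element in the page head (the URL below is a placeholder):

```html
<head>
  <!-- tell search engines which URL is the preferred version of this page -->
  <link rel="canonical"
        href="https://www.example.com/baseball-cards/2015-rookies.html">
</head>
```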

Make your site easier to navigate

The navigation of a website is important in helping visitors quickly find the content they want. It can also help search engines understand what content the webmaster thinks is important. Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.

Plan out your navigation based on your homepage

All sites have a home or “root” page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (e.g. root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?

Make navigation more convenient for users with ‘breadcrumb lists’

A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, left-most link and list the more specific sections out to the right.
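In markup, a breadcrumb is just a short row of links ordered from general to specific; a plain-HTML sketch (the paths and labels are illustrative):

```html
<!-- breadcrumb: most general page first, current page last and unlinked -->
<nav>
  <a href="/">Home</a> &rsaquo;
  <a href="/news/">News</a> &rsaquo;
  2015 rookie card roundup
</nav>
```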

Allow for the possibility of a part of the URL being removed

Some users navigate by chopping off part of the URL in the address bar, hoping to find more general content (e.g. trimming a deep URL back to its parent directory). Serving useful content at those shorter URLs, or at least a helpful 404 page, accommodates this behavior.

Prepare two sitemaps: one for users, one for search engines
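The search-engine sitemap is typically an XML file (e.g. `sitemap.xml`) listing your URLs for crawlers, while the user-facing sitemap can be an ordinary HTML page of organized links, along these lines (the sections are illustrative):

```html
<!-- user-facing site map: every important section, grouped and linked -->
<h1>Site Map</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/news/">News</a>
    <ul>
      <li><a href="/news/2015/">2015 archive</a></li>
    </ul>
  </li>
  <li><a href="/about/">About</a></li>
</ul>
```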

Create a naturally flowing hierarchy

Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and work them effectively into your internal link structure.

AVOID:

  • creating complex webs of navigation links, e.g. linking every page on your site to every other page
  • going overboard with slicing and dicing your content (so that it takes twenty clicks to reach it)