Keyword Grouping
This is where we group the agreed-upon target keywords into sets of 1-3 keywords, which will then be assigned to target pages during the URL Mapping stage. Keywords in a given campaign are grouped based on factors including, but not limited to, keyword similarity, related terms, and geo-targeting.
URL Mapping
If an SEO campaign has 30 keywords, they will not all go on the same page. We usually target 3 keywords on each page, with unique exceptions made for the homepage. URL mapping is the process of determining which page should target which keywords.
Factors such as theme relevance and page rankings come into play when URL mapping is performed. Priority is given to target pages that are likely to convert and/or catch a user’s attention, engaging them and encouraging them to interact with and browse through the site. The homepage, typically the most highly valued page on the site, is always targeted. If a page with a matching theme does not exist for certain keywords, then a new page with fresh content will have to be created.
Target URLs
The Target URLs are determined during keyword URL mapping. These URLs are simply the pages we are primarily targeting with our On Page and Off Page optimizations; more specifically, they are the pages that we intend to rank in the search engines.
Homepage Redirect Analysis
Similar to the Domain Redirect Analysis, we will be checking if the homepage has different URL versions and determining which version should be used for your site in the search results.
Example:
Example.com, Example.com/home, Example.com/welcome, Example.com/index
Homepage Redirecting
If we find duplicate URLs for the homepage and have determined which version to go with, then, provided that we have the necessary access, we will be implementing a 301 homepage redirect to our recommended version.
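As a rough sketch, and assuming the site runs on Apache with .htaccess access (the duplicate paths below mirror the example above and are placeholders), the redirect could look like this:

RewriteEngine On
# Send the duplicate homepage URLs to the canonical root with a 301
RewriteRule ^(home|welcome)/?$ / [R=301,L]
# Redirect explicit requests for index.html, matching the raw request
# line so the server's own internal rewrite to index.html does not loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ / [R=301,L]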
Domain Redirect Analysis
Some websites use both the WWW and the non-WWW versions of their URLs. If this is the case, and the site does not redirect to WWW when non-WWW is used (and vice versa), then the site needs to be redirected to its appropriate URL. Here, we will be determining which version should be used for your site in the search results so that rankings and traffic are not split between the two versions and search engines do not interpret them as duplicate or different pages (http://www.google.com/support/webmasters/bin/answer.py?answer=44231).
Domain Redirecting
If necessary and given the appropriate access, we will be implementing the 301 domain redirect to our recommended version – either WWW or non-WWW.
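For illustration, a minimal .htaccess sketch that forces the WWW version on an Apache server (example.com is a placeholder; swap the host names if the non-WWW version is preferred):

RewriteEngine On
# 301-redirect every non-WWW request to the WWW version of the same URL
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]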
Footer Navigation Creation Analysis
If a campaign has highly competitive keywords, then optimized footers will have to be created for them. Otherwise, this is not necessary.
Footer Navigation Optimization
Footers are used globally across a website. Given this, placing keyword-optimized anchor texts in them is a good way to increase the competitive keywords’ prominence.
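For illustration, an optimized footer might look like the sketch below, where the URLs and anchor texts are placeholders for the campaign's actual target pages and keywords:

<div id="footer">
  <!-- keyword-optimized anchor texts pointing to the target pages -->
  <a href="/web-design">Chicago Web Design</a> |
  <a href="/seo-services">Chicago SEO Services</a> |
  <a href="/contact-us">Contact Us</a>
</div>

Because the footer appears on every page, each of these anchors becomes a sitewide internal link for its keyword.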
IMG Alt Text Analysis, Creation & Optimization
Also known as alternative text or the alt attribute, this piece of HTML markup is used to provide images with a text description in the event images are turned off in a web browser or a person is using assistive technology. In some browsers, the image’s text description is visible when “hovering” over the image.
Used within the <img> HTML tag, alt text gives search engines an important way of understanding what the image is about. Repeating the same phrase in multiple tags on the same page can be a flag for spammy tactics on your site; consequently, each alt text should be unique.
We will be performing alt text optimization on the most prominent images on X amount of pages (as detailed in your proposal), making sure the alt text of each of the most visible images on those selected pages is unique and, if applicable, keyword optimized as well.
Please note that this service includes optimizing 3 images per page plus the website logo.
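Example (the file name and descriptions are placeholders):

<!-- Before: empty or missing alt text -->
<img src="/images/photo1.jpg" alt="">
<!-- After: a unique, keyword-optimized description -->
<img src="/images/photo1.jpg" alt="Chicago web design team at work">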
hCard Implementation
What is an hCard?
While you and I can look at a website and recognize the company address on it, Google and other search engines cannot. Search engines don’t “see” the site, they read it. The hCard takes information that is already on your website and “labels” it so the search engines can tell exactly what it represents. In other words, it’s a special kind of website code (a microformat) which allows search engines to easily distinguish a business’s name, address, and phone number from other information on a webpage. Visitors to your website see only the address itself, but for the search engines it is explicitly labeled as an address.
We will be implementing the hCard script containing your company address on your website.
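As a sketch, the hCard microformat labels the address with standard class names (vcard, fn, org, adr, tel); the company details below are placeholders:

<div class="vcard">
  <span class="fn org">Example Company</span>
  <div class="adr">
    <span class="street-address">123 Main Street</span>,
    <span class="locality">Chicago</span>,
    <abbr class="region" title="Illinois">IL</abbr>
    <span class="postal-code">60601</span>
  </div>
  <span class="tel">(312) 555-0100</span>
</div>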
User Sitemap Analysis
A User Sitemap is an on-site web page that links to all the other pages on your website. This type of sitemap is intended mainly for site visitors but is also useful for spiders/search engines: it helps any spider crawling your site easily and quickly find and index all of your site’s important pages.
We will be checking to see if there is an existing User Sitemap and whether or not our target pages are included in it. If we find that the website has no sitemap, then we will be creating or recommending one (depending on whether or not we have access to do so).
User Sitemap Implementation
If we determine that the site does not have a User Sitemap, a new page will be created for this. The user sitemap containing all the relevant pages of the website will also be added to the menu or footer (whichever is applicable) for users to access.
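A minimal sketch of such a page, with placeholder page names and URLs:

<h1>Sitemap</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/about">About Us</a></li>
  <li><a href="/services">Services</a>
    <ul>
      <li><a href="/services/seo">SEO</a></li>
      <li><a href="/services/web-design">Web Design</a></li>
    </ul>
  </li>
  <li><a href="/contact">Contact</a></li>
</ul>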
No Follow Internal Links Repair
The links on a website are followed by search engines by default unless the link attribute rel="nofollow" is used. Nofollow is an HTML link attribute value used to prevent a link from passing trust, reputation, or PageRank to the linked page or website. Thus, we will be repairing internal links by removing any unnecessary rel="nofollow" attributes on X amount of pages (as detailed in your proposal).
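Example of the repair (the URL and anchor text are placeholders):

<!-- Before: the internal link is blocked from passing value -->
<a href="/seo-services" rel="nofollow">SEO Services</a>
<!-- After: the unnecessary nofollow attribute is removed -->
<a href="/seo-services">SEO Services</a>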
Text-Based Web Browser (Lynx) Compatibility Checking
We utilize the Lynx browser (a text-only web browser that lets you see your website the way search engines see it) to access your site and check for potential On Page problems such as hidden links and invisible text (keywords placed in the HTML source in such a way that they are not visible to users looking at the rendered web page). This step is recommended by Google and is stated in the Google Webmaster Guidelines as well (http://www.google.com/support/webmasters/bin/answer.py?answer=35769#2):
We will be using a text browser such as Lynx to examine your site because most search engine spiders see your site the same way Lynx does. If advanced features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing your entire site in a text browser, then search engine spiders may have trouble crawling your site.
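For reference, the same check can be run from the command line with Lynx's dump mode (the URL below is a placeholder):

lynx -dump http://www.example.com/
lynx -dump -listonly http://www.example.com/

The first command renders the page as the plain text a crawler reads; the second lists every link Lynx finds on the page, which makes hidden links easier to spot.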
Text-Based Web Browser (Lynx) Compatibility Fixing
The removal of hidden links and invisible text from a site is important because their mere presence is considered a Black Hat SEO practice and can cause the website to be penalized or lose rankings. This is in fact part of the Google Webmaster Guidelines: http://www.google.com/support/webmasters/bin/answer.py?answer=66353
The text-based web browser we utilize looks at a site from a search engine’s point of view, so any issue it detects is an opportunity for improvement in the eyes of search engines. Eliminating hidden links within a site can help improve its rankings because you are letting search engines know that the website does not intend to deceive its users.
Robots.TXT & Meta Tags Checking & Fixing
Part of our On Page optimization procedure is to check the website for issues concerning its robots.txt file and/or our target pages’ robot meta tags.
The robots.txt file provides instructions about the site to web robots such as search engine crawlers (http://www.robotstxt.org/robotstxt.html). The special HTML <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> tag can also be used to tell robots not to index the content of a page and/or not to scan it for links to follow (http://www.robotstxt.org).
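For illustration, a minimal robots.txt that allows all crawlers but keeps them out of one directory (the directory name is a placeholder):

User-agent: *
Disallow: /private/

During this step we make sure that no such rule, and no robots meta tag, is unintentionally blocking our target pages from being crawled or indexed.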