Xpirt Design

Blog commenting for strong one-way link building

Posted on: November 11, 2009

Build high-PR, Do Follow links quickly and easily!

By: James Weir, Xpirt Design, search engine optimization los angeles

In this tutorial, I will show you how to effectively build one-way links by finding, analyzing, and commenting on blog sites. Commenting on blogs can yield a large number of high-quality links, but you need to make sure you are commenting on the right blogs and that your comments are worthy of the link. This link building technique has really come in handy for Xpirt Design's SEO efforts and hopefully, it can be just as successful for you.

Plug-ins / Programs needed:

  • You will need the No Do Follow extension for Firefox. When activated, this extension will highlight all links on the web page: red highlights are the No Follow links, and blue highlights are the Do Follow links. Download the No Do Follow extension for Firefox
  • You will need the SEO Quake extension for Firefox. When this is activated, you will see all the SEO-relevant stats for any website or web page (PageRank, inbound links, outbound links, Alexa rank, page age, and other stats). Download the SEO Quake extension for Firefox
  • You will need to download the Back Link Analyzer tool from SEOBook.com. This tool will find the inbound links for a website.
    Download the Back Link Analyzer Tool

    • Set the option to Google (http)
    • Set the option to Yahoo (web service)
    • Set the result limit option to "1000"
  • Use the Back Link Analyzer tool to gather your competitor’s links

    Basically, what you want to do here is analyze the back links of any given competitor. There are many back link analyzer tools out there, but the easiest one (and it's free) is the Back Link Analyzer tool at SEOBook.com. You can also use the link analyzer tool that is offered in the WEB CEO program, but WEB CEO takes a very long time to search and update the results, so for this tutorial I'll be using the Back Link Analyzer tool from SEOBook.com.


    Back Link Analyzer v2.0

    Set up your options

    You will need to set up the options for the Back Link Analyzer Tool.

    Setup the options as shown


    Run the tool. You will soon see a list of back links pointing to your competitor's website. Here's the trick to finding blogs: most, if not all, blogs use a certain URL structure, for instance, www.example.com/2009/11/9/title , or www.example.com/archives/web-design-stuff/ , or something similar. When you find a URL that you think might be a blog, click on it and it should open in your Firefox browser. Make sure your extensions are activated.

    Find and recognize blog-type URLs


    Take a quick look at your SEO Quake stats. Use these stats to determine whether or not this blog is worth commenting on. Try to find blogs with high PageRank on the actual article page itself. Usually, blogs with high PageRank on the domain will have articles that also have high PageRank.

    Analyze the blog posting according to PageRank, outbound links, etc.


    How to know if your comment will be a Do Follow Comment

    Take a look at the comments other people have posted. Most blogs will set up a link with the anchor text of the "name" text field. Make sure your No Do Follow extension is activated; if it's not, go to "tools" and click on "No Do Follow". If the link/name of the commenter is highlighted in blue, the link is a Do Follow link. Also look for the ability to insert a link within the actual comment; some blogs allow this.
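    To make this concrete, here is a hypothetical snippet of the kind of comment markup the extension is checking (the URL and name are placeholders, not taken from any real blog):

    <!-- A No Follow comment link: NoDoFollow highlights this in red, and it passes no link value -->
    <a href="http://www.example.com/" rel="nofollow">James at Example Site</a>

    <!-- A Do Follow comment link: no rel="nofollow" attribute, so it is highlighted in blue -->
    <a href="http://www.example.com/">James at Example Site</a>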

    Make sure the comment links are DoFollow (the link that will be pointing to your site)


    Make sure to leave good comments

    Leaving good comments is the key to getting back links from blogs. Take the time to read the blog entry and leave a constructive and useful comment related to the subject of the entry. If you don't leave a good comment, the moderator will simply treat your comment as spam, you will not get the link, and your time will be wasted. So make sure you read the entry and leave a good comment.
    Some blogs will allow you to use keywords as your "name". On blogs that don't let you use keywords, just use your name or company name. A link is better than no link.

    An example of a spammy blog comment


    An example of an acceptable blog comment


    Screenshot of the input fields for commenting


    Conclusion

    And that's pretty much it! Keep doing this for your competitors, and when you have run out of competitors to find blogs from, you can just do a Google search to find them. Good luck with your link building and feel free to comment on this post, Digg it, etc. And don't forget to subscribe to my RSS Feed!

Web Design and SEO Tips: Starting an online business

Posted on: May 3, 2009

Web Design and SEO Tips:

A few important points to consider when starting an online business:

  1. Research, research, research. You absolutely must research your target niche market. Find out how competitive the market is. Find out how much demand there is.
  2. Research your direct competitors. Knowing the direction your competitors are taking will help you plan out your own strategies.
  3. Web design style: Again, look at your competitors or related websites and try to get ideas for:
    • how you want your site to look.
    • how you want it to function.
    • how you want to structure the navigation and layout.
  4. Find out what types of services you require. Whether it's ecommerce shopping carts, blog services, or whatever, it is always good to read up on the services before you jump into anything.
  5. If you have no website design experience, call a professional web design and SEO company (like Xpirt Design). They can help you properly build your online business and give you valuable suggestions and consultation.
  6. Seriously consider search engine optimization services. It's always a good idea to choose a web design company that also offers affordable but effective SEO services. The reason is that the web designer knows how the website is built, which makes the SEO work easier, less time-consuming, and more effective.
  7. Consider paid advertising or pay-per-click/AdWords-style marketing. A new website will take time to develop natural search engine traffic (SEO). While you are waiting, it is always a good idea to give your website a jumpstart with traffic from paid advertising.
  8. The first impression of your web design is everything! When a visitor comes to your website and sees that it is not professionally designed, more than likely the visitor will leave and continue on his search engine journey.
  9. The way your navigation is structured is also a major factor in a properly designed website. If the visitor has difficulty finding what he's looking for, gets confused or lost within the website, or cannot go back to his previous page, chances are the visitor will simply leave.

Though web design will take a little time to think about, it's the internet marketing and search engine optimization that will no doubt end up taking more than 75% of your time or your hired professional's time. You constantly have to keep yourself up to date on market trends, new marketing opportunities, and new directions you can take your business in. Your website will definitely determine your success, at least online. If your online marketing, combined with a professional web design, is all done properly, efficiently, and effectively, you can have unlimited potential for online business growth.

If you are indeed looking for a professional with tons of web design and SEO/pay-per-click experience, visit Xpirt Design or orange county web design for a free consultation.

Control Googlebot’s Crawl Rate in Webmaster Tools

Posted on: April 21, 2009

Google releases new options in the Webmaster Tools to control Googlebot’s crawl rate. Here is the article taken from the Official Google Webmaster Central Blog.

More control of Googlebot’s crawl rate

We’ve upgraded the crawl rate setting in Webmaster Tools so that webmasters experiencing problems with Googlebot can now provide us more specific information. Crawl rate for your site determines the time used by Googlebot to crawl your site on each visit. Our goal is to thoroughly crawl your site (so your pages can be indexed and returned in search results!) without creating a noticeable impact on your server’s bandwidth. While most webmasters are fine using the default crawl setting (i.e. no changes needed, more on that below), some webmasters may have more specific needs.

Googlebot employs sophisticated algorithms that determine how much to crawl each site it visits. For a vast majority of sites, it’s probably best to choose the “Let Google determine my crawl rate” option, which is the default. However, if you’re an advanced user or if you’re facing bandwidth issues with your server, you can customize your crawl rate to the speed most optimal for your web server(s). The custom crawl rate option allows you to provide Googlebot insight to the maximum number of requests per second and the number of seconds between requests that you feel are best for your environment.
Googlebot determines the range of crawl rate values you’ll have available in Webmaster Tools. This is based on our understanding of your server’s capabilities. This range may vary from one site to another and across time based on several factors. Setting the crawl rate to a lower-than-default value may affect the coverage and freshness of your site in Google’s search results. However, setting it to a higher value than the default won’t improve your coverage or ranking. If you do set a custom crawl rate, the new rate will be in effect for 90 days, after which it resets to Google’s recommended value.
You may use this setting only for root level sites and sites not hosted on a large domain like blogspot.com (we have special settings assigned for them). To check the crawl rate setting, sign in to Webmaster Tools and visit the Settings tab. If you have additional questions, visit the Webmaster Help Center to learn more about how Google crawls your site or post your questions in the Webmaster Help Forum.

Written By Pooja Shah, Software Engineer, Webmaster Tools Team

Orange County Web Design

The Importance of Link Architecture / Internal Linking in Web Design and SEO

Posted on:


Google releases very useful information on the importance of internal linking and how the Googlebot uses links to crawl through websites. This type of information is very helpful when designing or re-designing the navigation links for a website, especially since it comes straight from Google. Internal linking is the third most important factor in SEO, with quality content being the most important and external link building being second.

This article was taken from the Official Google Webmasters Central Blog and reads:

“Link architecture—the method of internal linking on your site—is a crucial step in website design if you want your site indexed by search engines. It plays a critical role in Googlebot’s ability to find your site’s pages and ensures that your visitors can navigate and enjoy your site.

Keep important pages within several clicks from the homepage

Although you may believe that users prefer a search box on your site rather than category navigation, it’s uncommon for search engine crawlers to type into search boxes or navigate via pulldown menus. So make sure your important pages are clickable from the homepage and easy for Googlebot to find throughout your site. It’s best to create a link architecture that’s intuitive for users and crawlable for search engines. Here are more ideas to get started:

Intuitive navigation for users

Create common user scenarios, get “in character,” then try working through your site. For example, if your site is about basketball, imagine being a visitor (in this case a “baller” :) trying to learn the best dribbling technique.

  • Starting at the homepage, if the user doesn’t use the search box on your site or a pulldown menu, can they easily find the desired information (ball handling like a superstar) from the navigation links?
  • Let’s say a user found your site through an external link, but they didn’t land on the homepage. Starting from any (sub-/child) page on your site, make sure they can easily find their way to the homepage and/or other relevant sections. In other words, make sure users aren’t trapped or stuck. Was the “best dribbling technique” easy for your imaginary user to find? Often breadcrumbs such as “Home > Techniques > Dribbling” help users to understand where they are.

Crawlable links for search engines

  • Text links are easily discovered by search engines and are often the safest bet if your priority is having your content crawled. While you're welcome to try the latest technologies, keep in mind that when text-based links are available and easily navigable for users, chances are that search engines can crawl your site as well. This <a href="new-page.html">text link</a> is easy for search engines to find.
  • Sitemap submission is also helpful for major search engines, though it shouldn’t be a substitute for crawlable link architecture. If your site utilizes newer techniques, such as AJAX, see “Verify that Googlebot finds your internal links” below.

Use descriptive anchor text

Writing descriptive anchor text, the clickable words in a link, is a useful signal to help search engines and users alike to better understand your content. The more Google knows about your site—through your content, page titles, anchor text, etc.—the more relevant results we can return for users (and your potential search visitors). For example, if you run a basketball site and you have videos to accompany the textual content, a not-very-optimal way of linking would be:

To see all our basketball videos, <a href="videos.html">click here</a> for the entire listing.

However, instead of the generic “click here,” you could rewrite the anchor text more descriptively as:

Feel free to browse all of our <a href="videos.html">basketball videos</a>.

Verify that Googlebot finds your internal links

For verified site owners, Webmaster Tools has the feature “Links > Pages with internal links” that’s great for verifying that Googlebot finds most of the links you’d expect. This is especially useful if your site uses navigation involving JavaScript (which Googlebot doesn’t always execute)—you’ll want to make sure that Googlebot is finding other internal links as expected.

Here’s an abridged snapshot of our internal links to the introductory post for “404 week at Webmaster Central.” Our internal links are discovered as we had hoped.

Feel free to ask more internal linking questions
Here are some to get you started…

Q: What about using rel=”nofollow” for maximizing PageRank flow in my internal link architecture (such as PageRank sculpting, or PageRank siloing)?

A: It’s not something we, as webmasters who also work at Google, would really spend time or energy on. In other words, if your site already has strong link architecture, it’s far more productive to work on keeping users happy with fresh and compelling content rather than to worry about PageRank sculpting.

Matt Cutts answered more questions about “appropriate uses of nofollow” in our webmaster discussion group.

Q: Let’s say my website is about my favorite hobbies: biking and camping. Should I keep my internal linking architecture “themed” and not cross-link between the two?

A: We haven’t found a case where a webmaster would benefit by intentionally “theming” their link architecture for search engines. And, keep in mind, if a visitor to one part of your site can’t easily reach other parts of your site, that may be a problem for search engines as well.

Perhaps it’s cliche, but at the end of the day, and at the end of this post, :) it’s best to create solid link architecture (making navigation intuitive for users and crawlable for search engines)—implementing what makes sense for your users and their experience on your site.

Thanks for your time today! Information about outbound links will soon be available in Day 3 of links week. And, if you have helpful tips about internal links or questions for our team, please share them in the comments below.



Orange County Web Design

Aside from internal links, external links are a major factor in ranking. Link building services exist that will source appropriate link placements on your behalf to work towards top rankings.

Search engines are susceptible to link popularity, which is why most websites choose to link with other websites.

Google On-Demand XML Sitemaps for Custom Search

Posted on:


Google released another great webmaster tool and, even better, they let us in on a little secret about Google Sitemap submission (XML Sitemaps) and how Google gives special attention to websites that submit sitemaps. So now we know that Google will give a website priority when it comes to new pages or new content being indexed. In most cases, they will actually crawl your website within 24 hours once the XML sitemap is submitted or re-submitted to Google. This is great news, especially for sites that continuously upload new content, or brand new websites that are looking to get indexed as quickly as possible.

Here is the actual article taken from the Official Google Webmaster Central Blog:

Thursday, November 13, 2008 at 10:31 AM

“Since we launched enhanced indexing with the Custom Search platform earlier this year, webmasters who submit Sitemaps to Webmaster Tools get special treatment: Custom Search recognizes the submitted Sitemaps and indexes URLs from these Sitemaps into a separate index for higher quality Custom Search results. We analyze your Custom Search Engines (CSEs), pick up the appropriate Sitemaps, and figure out which URLs are relevant for your engines for enhanced indexing. You get the dual benefit of better discovery for Google.com and more comprehensive coverage in your own CSEs.

Today, we’re taking another step towards improving your experience with Google webmaster services with the launch of On-Demand Indexing in Custom Search. With On-Demand Indexing, you can now tell us about the pages on your websites that are new, or that are important and have changed, and Custom Search will instantly schedule them for crawl, and index and serve them in your CSEs usually within 24 hours, often much faster.

How do you tell us about these URLs? You guessed it… provide a Sitemap to Webmaster Tools, like you always do, and tell Custom Search about it. Just go to the CSE control panel, click on the Indexing tab, select your On-Demand Sitemap, and hit the "Index Now" button. You can tell us which of these URLs are most important to you via the priority and lastmod attributes that you provide in your Sitemap. Each CSE has a number of pages allocated within the On-Demand Index, and with these attributes, you can tell us which are most important for indexing. If you need greater allocation in the On-Demand index, as well as more customization controls, Google Site Search provides a range of options.”

Google Search Screen Shot Image


———————————————————–

End of Article.

Google answers controversial SEO related questions…Finally!

Posted on:

Finally, some decent straight answers pertaining to SEO and Pagerank from Google!

Recently, during a live Q&A chat, Googlers Matt Cutts and Maile Ohye answered questions from webmasters around the world. This isn't the whole chat excerpt, but here are some of the major topics we were all once confused about.

Does the age of a website/domain affect its ranking?

Ohye answered this way: a site's reputation can be an indicator to search engines, but of course, it's not everything. Having a site for a long period of time can establish credibility with users, and as a search engine we also want to reflect this type of credibility. Of course, newer domains can also gain users and credibility. It seems like running a good site is a bit like running a reputable business. So yes, if your domain has been credible for years it can help. If you buy an old domain and put all your content on it in hopes of getting instant rankings, that's not the best idea.

But, when the question was rephrased from another webmaster, Cutts answered: In the majority of cases, it actually doesn’t matter–we want to return the best information, not just the oldest information. Especially if you’re a mom/pop site, we try to find ways to rank your site even if your site is newer or doesn’t have many links. I think it is fair for Google to use that as a signal in some circumstances, and I try never to rule a signal out completely, but I wouldn’t obsess about it.

Official translation: Sometimes, when we say it does.

Do 301 redirects carry over PageRank?

Where appropriate, ranking signals will be transferred across 301 redirects (if the same page has moved from one URL to another). This may take some time, so you should probably leave the redirect in place as long as you have control over the URL.

How many 301 redirects are acceptable?

It’s ok to chain a few together. The HTTP 1.0 standard allows for a maximum of 5 redirects for a URL, so keep it minimal.

Why do pages translated into different languages each have different rankings in their respective engines?

Google looks at content on a URL-by-URL basis, so even if you have translated top content from one language to another, Google might not treat it the same way as they would treat the original content. It’s also possible that the translated content is not as relevant as other original content in that language. Generally speaking, making sure that your content is as unique and compelling as possible for the users in that target market is the best thing to do.

Do backlinks from bad sites negatively affect my PageRank?

Those links might be positively affecting your PageRank (PageRank does not go down from “bad” links like those from adult sites). In general, you don’t have to worry about bad links like that which point to your site that aren’t under your control.

How often does your search algorithm change?

We change the algorithms all the time – last year we had over 450 changes.

Could sharing an IP address with a bad site get my site penalized?

The situations where it would matter are when the server is overloaded (can’t respond to your visitors) and when it’s incorrectly configured (not returning your site to your visitors). But otherwise that is no longer a concern.

Does Google have a problem with rank-checking software?

Rank-checking software is against Google’s Terms of Service and could result in blocking your IP address, and it doesn’t really help, especially when it comes to personalized or geotargeted results.

Is there PageRank boost from .edu or .gov links?

Google’s Answer: You don’t get any PageRank boost from having an .edu link or .gov link automatically. If you get an .edu link and no one is linking to that .edu page, you’re not going to get any PageRank at all because that .edu page doesn’t have any PageRank.

Translation: If the .edu or .gov page is linked to, then yes, because that webpage now has some authority, just like with any (non-.gov or .edu) page.

Does a page load time play a crucial role in Google Page Ranking? If yes how important is it?

Google’s Answer: I think the more important issue here is user experience. If your site loads fast, your users will be happy; if it loads slow, users will be less happy. Make your users happy, right?

Translation: Yes, and as important as 200 other factors.

Aaron D’Souza of the Search Quality team was reported as stating that publishing the same content on two separate geotargeted paths under your domain will not trigger the dupe content filters. Is this correct?

Google’s Answer: In general, in a case like that, we’d try to pick the best page based on various factors, including geotargeting and language choices. If that page is one which is also available for other geotargeting/language choices, we will generally try to pick the version that our algorithms feel makes the most sense.

Translation: Yes, we think.

I have reported sites that clearly have paid links (e.g. the backlink page says “Advertising” above the link), but Google does not seem to take action. Why would that be the case? These are .orgs who are clearly selling their .org juice.

Google’s Answer: While paid links and spam reports are being taken very seriously by Google, the results may not be seen immediately for users or even not at all. This does not mean no action is being taken on the offending sites. Also, the TLD of the sites should not be a factor being taken into account. For this reason reporting both, web spam and PageRank passing link selling makes sense and contributes in an important way to the quality of Google’s index.

Translation, partly based on .gov/.edu response: Google treats all top level domains the same, so a .org would have no more juice than a .com or .info. Further, clearly marked paid links (ones on pages labeled “Advertising”) are not necessarily violations of Google’s guidelines. If the links you reported were found to be nofollow links, then no action would be necessary. But keep trying to sabotage the competition. Business is war.

Is it true that the fewer the links FROM your website, the more influence they have on the sites receiving those links?

Google’s Answer: PageRank is split up over the links from a page, but I would recommend not concentrating on this (as you won’t be able to “measure” and act upon it anyway) and instead making your site as usable as possible for your visitors.

Translation: Yes, the more you link the more the link juice passed on is diluted, but don’t go trying to figure out the formula in order to game the system. We’ll figure you out. We’re Google.

Does getting a lot of comments on a blog help in being well indexed/ranked by Google?

Google's answer: Having a lot of enthusiastic users commenting on your posts, and by doing so generating content on your site, certainly does not harm your rankings :-) Furthermore, a large fan base gives the webmaster a bit of independence from search engine traffic, which is the reason why generating original and compelling content in order to nurture a group of committed users is something I would highly recommend to any blogger.

Translation: Yes.

Recently, you removed this suggestion: “Submit your site to relevant directories such as the Open Directory Project and Yahoo!” from your guidelines. Is there any chance that you will be discounting these kinds of links for ranking value in future?

Google’s Answer: There’s always the chance that we’ll discount directory links in the future. What we were seeing was quite a few novice people would see the “directory” recommendation and go out and just try to submit to a ton of directories, even if some of the directories were lower-quality or even fly-by-night directories that weren’t great for users. Right now we haven’t changed how we’re weighting directory links–we’ve only removed the directory suggestion from the webmaster guidelines.

Translation: Possibly.

Until recently (the last six months or so) a high ranking was achievable by submitting articles to article directories (providing they were 40%-60% unique); it no longer seems to be the case. Have links from article sites been de-valued at all?

Google’s Answer: In my experience, not every article directory site is high-quality. Sometimes you see a ton of articles copied all over the place, and it’s hard to even find original content on the site. The user experience for a lot of those article directory sites can be pretty bad too. So you’d see users landing on those sorts of pages have a bad experience. If you’re thinking of boosting your reputation and getting to be well-known, I might not start as the very first thing with an article directory. Sometimes it’s nice to get to be known a little better before jumping in and submitting a ton of articles as the first thing.

Translation: Yes.

Basically, what it boils down to is this:

1) Stop trying black hat SEO methods. Google will find out for sure.

2) Stop worrying about PageRank and focus on creating a user-friendly, information/content-filled, unique website that users will love.

3) Backlinks are still very important for PageRank; however, PageRank is only one of over 200 factors in deciding which websites show up first in search results.

4) Content is still KING in regards to Google. Focus on creating unique content and all will be good.

Get your website indexed in Google in less than a week!

Posted on:

How to get your website indexed in Google in less than a week:

Once your website is completed, most people get frustrated with the search engines' delay in indexing the website into the search results. This article is written assuming that the new website has had on-page optimization (meta data, titles, content, etc.). If you still need to do some on-page search engine optimization, visit SEO Tips for some great articles and tutorials regarding this issue.

So why is indexing your website so important?

Once the website is finished and uploaded to your web host, the search engines don’t know that your website exists until they find the website (crawl the website). Until the search engines find and crawl your new website, you will not be in any of the search results.

Indexing can take weeks, or even months! But, I am going to show you step by step on how you can get indexed in less than a week.

The reason it normally takes so long to get indexed is that a new website has absolutely no incoming links, no popularity, and no way of letting the search engines know that it's there. And to make the wait longer, popular search engines have millions and billions of web pages to crawl, which keeps them very busy.

So what do we need to do to get indexed faster (Google in particular)?

1) Well first off, what we need to do here is somehow let the search engines know that the website exists.

2) We need to get the website listed in directories as fast as possible (first steps of link popularity).

3) Get as many reciprocal links as possible in a short period (link exchange with other relevant websites).

So let's get down to the details:

Let the search engines know that your website exists:

1) Create an XML Sitemap.

An XML Sitemap is used by search engines as a directory of your website. A sitemap will show all the different directories, sub-directories, and pages that are within any given URL. This is different from an HTML sitemap in that an XML sitemap is written specifically for crawlers, robots, search engines, etc.

To create a FREE XML Sitemap, visit: http://www.xml-sitemaps.com/

Here, you can automatically generate an XML Sitemap. You can either download the file once it's finished, or you can copy and paste the code into your HTML editor, or whatever you use. The file extension for an XML document is ".xml"

The XML Sitemap will look something like this:

XML Sitemap Generation
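If you would rather see it as text than as a screenshot, here is a minimal sketch of the kind of file the generator produces (the URLs and dates are placeholders for your own pages):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-11-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services.html</loc>
    <lastmod>2008-10-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

The optional lastmod, changefreq, and priority elements are hints that tell the crawler when a page last changed and how important it is relative to your other pages.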

Once you have created the XML Sitemap, you need to upload the file to the directory on your web host where your website is located.

Next, you need to sign up for Google Webmaster Tools, located here: http://www.google.com/webmasters/

Once you have signed up, you can add your website URL, add your sitemap, and check statistics, crawling problems, etc. Read the information Google provides and get familiar with this tool because it is very important in tracking the stats and problems of your website.

Google Webmasters Dashboard

Google Webmasters Sitemap Submission


To submit your XML Sitemap, click on “sitemaps” and follow the instructions.

Google now has a record of your sitemap, and when Google gets around to it, Google will crawl and index your website based on what it finds in your XML Sitemap (a directory of your web pages within the website).

2) Submit your site to as many directories as possible:

Typically, good directories get indexed by Google very often. You should submit to the directories that require no reciprocal links and that are free. Most directories will take weeks and sometimes months to actually add your link to the directory, but SOME directories will add you within a few days.

Here is a great website to find a whole lot of good directories to submit to. And, you can keep track of the directories you submit to as well! http://www.onewaytextlink.com/

onewaytextlink.com

Sign up with them, and start submitting your website!

This step will help in speeding up the process of getting indexed.

3) Get reciprocal links from other relevant websites (related websites):

A reciprocal link is basically a link exchange with another website. “I’ll add your link to my website if you add mine.”

This is a good idea in terms of link popularity, but there are some criteria that you should adhere to.

-Don’t exchange links with websites that are irrelevant to your main topic, or subject.

-Don't exchange links with websites that are banned by any search engines. (Visit http://google.com/toolbar to download the Google Toolbar; you can view the PageRank of websites as well as other tools.)

Xpirt Web Design's Pagerank

-Try to exchange links with other websites that have a PageRank of 1 or higher.

There are some easy ways to find reciprocal links. Visit http://link2me.com , http://linkmarket.net , http://metrolink.com

You can also download Web CEO and automatically send out emails to relevant sites and manage them with this program. Visit http://www.webceo.com/ to download the software for FREE.

Web CEO Screenshot:
Web CEO - SEO Tool

4) Submit your website to Super Pages, Google Maps, and Hot Frog

xpirt web design

Final STEP:

Sit back and wait. Well actually, keep adding your site to directories, and keep grabbing link exchanges. Like I said, if you need more info on how to do anything, read up on the rest of my articles here: SEO Tips. If you have no time and would rather have us get you indexed in a week or less, visit Orange County Search Engine Optimization for a free quote on our indexing services.

GOOD LUCK!

Orange County Web Design
Los Angeles Web Design

Using Your Keywords in URL Filenames – Tested with Results!

Posted on:


OK, there have been a lot of questions in the SEO community about whether or not using your keywords in your website file names actually has an effect on Google search engine rankings. Does it help your ranking? Does it hurt the ranking? Does Google even pay attention to the keywords in the file names? Actually, I was confused and wondering about it myself. So, I decided to do some testing of my own with one of my own websites, http://xpirtdesign.com , and I'll tell you some interesting facts that I've found from my own experience.

Test #1:

Here are the results of the test that I did for index.html (homepage) file:

I renamed index.html to another filename using my top 2 key phrases and separated each individual word using hyphens. (Before renaming the file, I was nowhere to be found on Google, at least not in the top 100 sites.)

Next, I submitted my XML sitemap to Google and waited for Google to re-crawl and index my new URL (the new filename).

After a few days or so, Google did find my new URL and did re-index the filenames. However, I was still nowhere to be found within the top 100 sites using my target key phrases and keywords.

After about a month or so of waiting, there was still no improvement in my Google rankings. (Keep in mind that my site is also completely optimized, as far as on-page optimization goes.)

I've concluded that after about 2 months of waiting, renaming the index file to a keyword did not have any effect on my rankings. But keep reading, this gets interesting…

Test #2

The next thing I wanted to find out is whether or not renaming the filenames on my other sub pages (about us, contact, services, etc.) would have any effect on my Google rankings.

Before renaming my filenames, my sub pages did actually show up in the top 100 sites in Google for my target key phrases, or at least a couple of them did. So, I did the same thing as I did with the index.html file and renamed the filenames using my keywords and phrases with hyphens as a separator.

I resubmitted my XML sitemap to Google, then waited to be re-indexed. Oh, and by the way, you need to set up 301 redirects to make sure there are no 404 errors on the old URLs. A few days later, Google re-indexed my new URL filenames.
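For anyone wondering what that redirect step looks like: if your site is hosted on Apache (an assumption, and the filenames below are just placeholders), a couple of lines in your .htaccess file will permanently point the old filenames to the new keyword-rich ones:

# Placeholder filenames: send the old URLs to the new hyphenated, keyword-rich URLs with a 301 (permanent) redirect
Redirect 301 /services.html /orange-county-web-design-services.html
Redirect 301 /portfolio.html /los-angeles-web-design-portfolio.html

This way, anyone (including Google) requesting the old URL gets sent to the new page instead of a 404 error.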

I waited about a week or so to recheck my rankings in Google… But this time, my MAIN PAGE (index.html) zoomed up to the first page of Google (#8 spot with about 500,000 competitors), out of nowhere! Remember, my main page wasn't even in the top 100, or 600 for that matter.

But what about my sub pages?

Well, I immediately checked my sub pages for rankings, and let me tell you, out of about 10 keywords and key phrases (all with over 400,000 competitors for each key phrase) I am either ranking on the 1st page, the 2nd page, or within the top 40 sites! These are the sub pages that are ranking, as well as my main page.

Another thing that I want to add is this:

When searching in Google, you will notice that the keywords are highlighted in the URL file name as well as in the Title tag and content of the page. If Google doesn't take the filenames into account for relevancy, then why would they highlight, or bold, the keywords and phrases that are within the actual filename? Google is obviously taking filenames into consideration.

Test #3:

So how do I know that this is no coincidence?

Well, I went ahead and renamed the file names back to the original names. Within a week or so, every single one of my sub pages, and my main page, disappeared. They pretty much dropped off into Google Space somewhere.

OK. Back to square one. I repeated my previous steps again (renaming the files using keywords and hyphens and resubmitting the new URLs to Google). In about another couple of weeks, my pages showed up again! They didn't show up in the EXACT same order as before, but they did come back to the first 2 pages, with a couple of rankings on the 3rd and 4th pages. And again, Google is using a new cache of my site with the new filenames in the URL.

My conclusion to the testing:

Now, my site is only 3 months old or so. I have a PageRank of 1 on the main page and on 4 other pages (resources pages). I have about 400 backlinks so far. So YES, I have lots of work to do with link popularity and all that. But the point here is this: using your target keywords in your filenames, separated by hyphens, helps your site to be seen by Google as more relevant for your given keywords than a filename using no keywords. This, I believe, is a fact.

Also, keep in mind that my main page remained "index.html." My main page is coming up on the 1st and 2nd pages for my target key phrases, which probably means that my sub pages (using keywords in the filenames) are actually helping my main page increase in rank. My main page did not show up in the rankings before using the keywords; the main page showed up when my sub pages used the keywords, disappeared when I removed the keywords, then reappeared when I re-added the keywords!

As far as hyphens and spam go:

On some pages I used 2 hyphens, on some pages I used 4 or 5 hyphens. These factors did not affect anything, as far as I could see. I'm pretty sure you don't want to use way too many hyphens, and you definitely don't want to spam your keywords like crazy in the URL. Keep that in mind when renaming your files.

And, being only a PR1 site with PR0 sub pages (not including my resources pages), I am ranking higher than many PR2, PR3, PR4, and PR5 sites that are targeting the same keywords. My site is much younger and has far fewer backlinks.

Now don't get me wrong, I am not saying that all you need to do is rename the file names to rank higher. I've worked extensively on analyzing my competitors, keyword density, title tags, content, backlinks, you name it. All of these practices need to be applied without question. I am only saying that using keywords in the filenames definitely helps increase the relevancy of your page for the given keywords you are targeting, in terms of Google.

I hope this article helps you guys with the confusion that's been going on, and I hope you don't just take my word for it. Test it yourself and make your own decisions, like I did.

Stay tuned for more articles coming soon. If you find any of my articles helpful or interesting, please don't hesitate to subscribe to the RSS feed, Digg my post, and leave some comments.

Orange County Search Engine Optimization

Xpirt Design’s Link Building Strategy 101: Low budget link building

Posted on:


Xpirt Design’s Link Building Strategy 101: Low budget link building

In this article, I will be talking about the link building strategy that I've implemented to build Google PageRank for my site. These are proven methods as of September 24th, 2008. I will not be talking about paid link exchanges because 1) it is against Google policy, and 2) this article is for site owners who are on a tight budget.

I have a new website that I am trying to promote, like many of you, and we all know that Google and Yahoo rely heavily on other websites linking to your website in order to calculate PageRank.

ANCHOR TEXT: Very important!

Before we go on, I want to explain the importance of your keyword in the anchor text of your links. The anchor text is the actual text that the viewer sees and clicks on in a link. This text should contain the target keyword or key phrase that you are attempting to optimize your site for. When submitting any link, you should always use keywords in the anchor text of that link. Some exceptions are DMOZ and any other directory or link exchange site that says otherwise. Also, it is important to mix up your anchor text by adding other words like "best" or "cheap" or whatever you choose. The reason is that Google has what's called a "link trap": Google will actually penalize your site if it thinks that you are only optimizing for a certain keyword or key phrase. Mixing up the anchor text gets past this "link trap" while still achieving the goal of optimizing for a certain key phrase.
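To illustrate, here are a few hypothetical links that all point at the same page; each one keeps the key phrase in the anchor text but mixes in different surrounding words (the URL is a placeholder):

<a href="http://www.example.com/">orange county web design</a>
<a href="http://www.example.com/">affordable orange county web design</a>
<a href="http://www.example.com/">best web design company in orange county</a>

The point is simply that the anchor texts vary while the target key phrase stays present.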

There are many different ways to get more links to your site. I will be showing you how to get good one-way links as well as good reciprocal links. Also, I will be explaining the best practices in link building so you don't end up getting penalized by Google. But first, let's talk about submitting your site to the popular link directories.

Link Directories:

The most popular link directories are DMOZ and the Yahoo Directory. DMOZ is free, but it takes forever for them to add your link (several months in most cases). The Yahoo Directory is $300 per year. You should definitely submit your site to DMOZ, but be careful with your site descriptions; they are very strict in their criteria. If you do have the money, the Yahoo Directory is a great candidate as well.

So how do you find more free link directories?

SEARCH for them! The problem with this is that you have no way of keeping track of which directories have added you (in an efficient way, at least). Which brings me to my next point…

How can we keep track of which directories we’ve submitted to? What anchor text did we use for the directory? What date did we submit on? How can we check to see if our link was added?

Here is a website that has these tools to keep track of who, what, where, and when. http://www.onewaytextlink.com/

This is a great site to start with. Keep in mind that the FREE one-way links are of most importance to us right now. So go ahead, go through that whole list of free one-way link directories and submit your link with your mixed-up anchor texts.

Reciprocal Link Exchanges:

Another way to get quality links is reciprocal linking. A faster and more productive way of doing this is to sign up for a good link exchange program like http://link2me.com. It's free, and you can monitor which sites have added your link, the PageRank of those sites, and so on.

There is a lot of controversy over reciprocal links, however. Some people say these links don't count, some people say they do count. One thing is for sure: reciprocal links should only be exchanged with other sites that are relevant to your site. If I have a website design site, then I would want to exchange my link with another web design site. With that said, you should also watch out for sites that have been banned or penalized by Google. If you link to a site that has been banned or penalized, you will be penalized as well. The way to check this is to download the Google Toolbar and use the PageRank option, which shows you the PageRank of any site you go to. If the toolbar is all grey on the homepage of a site, it is because either the site is only a couple of weeks old or the site has been banned. You can also check by going to Google Search and typing in "site:" followed by the URL of the site you are checking. If you were checking my site, you would type "site:xpirtdesign.com"

So, go to link2me.com and sign up for free. Start your link exchanging with relevant sites and start building those links.

Get links from your competitors’ link partners:

My favorite way to find link partners is to use my COMPETITORS' link partners. There is a reason why your competitor ranks higher in the search engines. If you have already done the on-page SEO for your site, the next step is to find your competitors and do a link exchange with the sites that are currently linking to them!

Go to Google and get the URL of the competitor's homepage. Type "link:url" into Google Search. If you wanted to find the websites that are linking to me, you would type "link:xpirtdesign.com"

Go down the list of sites and write an email to each site owner explaining that you are a fairly new site and you would like to participate in a link exchange. You should write your email in a way that would convince the site owner to exchange links with you. Personalize each email with their URL or name.

Try to focus on getting reciprocal links from your competitors' link partners who have PageRank on the actual link pages, or on the page where they will be putting your link. The more back links you have on pages with PageRank, the higher rank you will get from Google in return.

Also, when doing reciprocal links, you should have some criteria of your own.

Check for a "nofollow" attribute within the HTML coding, either in their meta tags or on the links themselves. You do not want to exchange links with sites that do not give you any PageRank love. Avoid sites with the nofollow attributes.
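As a quick reference, these are the two forms of nofollow to look for (the URL is a placeholder):

<!-- Page-wide: a robots meta tag in the <head> telling crawlers not to follow any links on the page -->
<meta name="robots" content="nofollow" />

<!-- Per-link: a rel="nofollow" attribute on the link itself, which passes no link value -->
<a href="http://www.example.com/" rel="nofollow">Example Partner Site</a>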

Only trade links with other sites that are relevant to yours, unless that other site has a good PageRank on their link pages.

What are some things that you should do in order to get sites to agree to a link exchange?

-Get some page rank on your link pages.

-Do not use nofollow attributes on your reciprocal links or pages.

-Do not add more than 30 links to each link page.

Trade links with your competitors themselves:

Another great way to find good relevant links is to exchange links with your competitors themselves. Email them to see if they are interested, but I would use a different anchor text so they won’t be worried about helping their competitors. You can use a secondary key phrase that you are optimizing for to achieve this goal.

This concludes “Link Building Strategies 101 – Low Budget.” Please stay tuned for more great articles from Xpirt Design!

Do you not have the time or resources to start your link building campaign? Do you need professional search engine optimization services?

XPIRT DESIGN CAN HELP! We do search engine optimization, web design, and link building for a living. We can give your site the jump-start it needs to start bringing in the traffic you deserve. Check out these links to find out more!

Orange County Web Design

SEO Tips: Meta tag optimization tutorial

Posted on:


This article is the 3rd in my SEO tutorial series. Though many search engines disregard most meta tags, it is still important to make sure you include meta tags in all of your web pages. The reason is that meta tags are still used by some search engines, and writing these tags properly will also determine how your search result is displayed to the viewer. I believe the Title tag is the most important tag as far as SEO is concerned.

There are 3 tags that I will go over:

1) Title Tag

2) Description Tag

3) Keyword Tag

All of these tags should be inside your <head> tag.

A title HTML tag looks like this: <title>Your Page Title</title>

The title tag is the tag that you will see in the top bar of your internet browser. Search engines use this tag to get a general idea of what your website is about. So if you are trying to rank for a certain keyword, like "Orange County Web Design," then it's very important to have your target keyword in your title tag. In fact, it should be the first set of words in your title tag. Some people like to use their company name, or "home page", or "services", and so on. Do not do that. Use your keyword phrase in your title tag unless you are an established business and people will probably search for your company name on the search engines.

When using keywords in a title tag, you shouldn't use more than 70 characters. Also, if you have a number of keywords you are trying to optimize for, keep in mind that the fewer keywords in your tag, the more weight, or relevance, each keyword will have. 1 or 2 keywords or keyword phrases are a good start for a title tag.

Also, your title tag should be the first thing that the search engine crawlers read, which means you should place your title tag directly underneath your <head> tag. Don't keyword-stuff your title tag either, because search engines will penalize your rank for doing so.

Your Description Tag looks like this: <meta name="description" content="This is my webpage description…" />

Your description should explain what your webpage is about. You should also use your keyword or key phrase here, but in a way that is understandable to the reader. For example: <meta name="description" content="Affordable Website Design, Affordable Search Engine Optimization Company servicing businesses / individuals nationwide." />

Notice how I didn't use words like "and", "or", etc. You want to avoid those words in both your description and title tags. You also want to make sure that the words you use in your description tags happen to be inside your content as well.

Keyword Tags look like this: <meta name="keywords" content="Affordable Website Design, Affordable Search Engine Optimization" />

When writing your keyword tags, just list each keyword once. Again, the fewer keywords you use, the more weight those keywords have. That's pretty much it for keywords. Pretty simple, eh?
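Putting the three tags together, a hypothetical <head> section might look like this (the title, description, and keywords are just the placeholder examples from above):

<head>
<title>Affordable Website Design | Affordable Search Engine Optimization</title>
<meta name="description" content="Affordable Website Design, Affordable Search Engine Optimization Company servicing businesses / individuals nationwide." />
<meta name="keywords" content="Affordable Website Design, Affordable Search Engine Optimization" />
</head>

Notice that the title tag sits directly underneath the <head> tag, as recommended earlier.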

As you can see, on-page SEO (optimization done within your website coding and structure) basically revolves around your target keywords and key phrases. Each page should be optimized uniquely; all your pages shouldn't be targeting the same keywords.

The next tutorial will be about optimizing the actual content of the webpage, using heading tags, using bold, italics, and underlines, and how to structure your design layout for SEO. Stay tuned, and don't forget to subscribe to my RSS feed and feel free to leave comments if you find any of these articles useful. Also, I just added a Digg widget in the navigation. Digg me =] .


Los Angeles Web Design

Orange County Web Design