Showing posts with label seo. Show all posts

Tuesday, November 27, 2007

Squidoo - Why Squidoo? Part 1

Enter Squidoo! Whether you sell on eBay or your own WebStore, a Squidoo lens is an excellent, free, and powerful way to promote your product or service.

Here are the 5 reasons you should make a Squidoo lens today...

1. FREE

You can't beat free! In a short amount of time you can create your lens and quickly begin driving traffic to your eBay listings for free. And if you have an eBay Store, you can get the Store referral credit.

2. LENSES ARE EASY TO CREATE

Squidoo is easy to use and provides you with the tools to create an excellent lens describing your product or service. Just choose the modules that are right for your business and you are on your way.

3. ALL ABOUT COMMUNITY

Squidoo users are part of a community that is committed to helping you succeed. From design ideas and proofreading, to sharing tagging tricks for getting found, to working on cross promotions with another lensmaster, the Squidoo community (and even their tiny staff!) is there for you.

4. HELPS YOU BECOME THE EXPERT

Nobody knows your products as well as you do, so use Squidoo to share that knowledge with potential buyers. Create content that describes your products, explains a process, or provides some history about your product, and then point readers to where they can buy it.

5. HELPS YOU SHARE YOUR PASSION

Share a little about yourself and your business. One of the joys of selling on eBay is the relationship you can build with your customers. Use Squidoo to personalize your business for those customers. This will also help you build a connection so that you can earn return business.

We will carry on with this discussion in the next article.

Tuesday, June 12, 2007

Beating Scraper Sites

I've gotten a few emails recently asking me about scraper sites and how to beat them. I'm not sure anything is 100% effective, but you can probably use them to your advantage (somewhat). If you're unsure about what scraper sites are:

A scraper site is a website that pulls all of its information from other websites using web scraping. In essence, no part of a scraper site is original. A search engine is not an example of a scraper site. Sites such as Yahoo and Google gather content from other websites and index it so you can search the index for keywords. Search engines then display snippets of the original site content which they have scraped in response to your search.

In the last few years, and due to the advent of the Google Adsense web advertising program, scraper sites have proliferated at an amazing rate for spamming search engines. Open content sites such as Wikipedia are a common source of material for scraper sites.

from the main article at Wikipedia.org

Now it should be noted, that having a vast array of scraper sites that host your content may lower your rankings in Google, as you are sometimes perceived as spam. So I recommend doing everything you can to prevent that from happening. You won't be able to stop every one, but you'll be able to benefit from the ones you don't.

Things you can do:

Include links to other posts on your site in your posts.

Include your blog name and a link to your blog in your posts.

Manually whitelist the good spiders (Google, MSN, Yahoo, etc.).

Manually blacklist the bad ones (scrapers).

Automatically block visitors that request many pages all at once.

Automatically block visitors that disobey robots.txt.
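The whitelist/blacklist idea above can be sketched in a few lines. This is a minimal illustration, not a complete solution: the bot name lists are example substrings I've picked (real deployments should verify spiders by IP or reverse DNS, since the User-Agent header can be spoofed).

```python
# Classify an incoming request by its User-Agent string.
# The names below are illustrative examples, not an authoritative list.

GOOD_SPIDERS = ("googlebot", "msnbot", "yahoo! slurp")  # known search engines
BAD_SPIDERS = ("scraperbot", "sitecopier")              # hypothetical scraper names

def classify_agent(user_agent: str) -> str:
    """Return 'allow' for whitelisted spiders, 'block' for blacklisted
    ones, and 'unknown' for everything else (browsers, new bots)."""
    ua = user_agent.lower()
    if any(good in ua for good in GOOD_SPIDERS):
        return "allow"
    if any(bad in ua for bad in BAD_SPIDERS):
        return "block"
    return "unknown"
```

In practice you'd wire this into your server (or express the same rules as .htaccess deny directives) and treat "unknown" agents as normal visitors, subject to the rate-based blocking mentioned above.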

Use a spider trap: you have to be able to block access to your site by IP address; this is done through .htaccess (I do hope you're using a Linux server). Create a new page that will log the IP address of anyone who visits it (don't set up banning yet, if you see where this is going). Then add a rule to your robots.txt disallowing that page. Next you must place the link in one of your pages, but hidden, where a normal user will not click it; use a table set to display:none or something. Now wait a few days, as the good spiders (Google, etc.) have a cache of your old robots.txt and could accidentally ban themselves. Wait until they have the new one before turning on the autobanning, and track the progress on the page that collects IP addresses. When you feel good (and have added all the major search spiders to your whitelist for extra protection), change that page to log and autoban each IP that views it, and redirect them to a dead-end page. That should take care of quite a few of them.
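The bookkeeping behind the spider trap looks roughly like this. This is a sketch of the logic only (the class and names are mine, not from any library); the actual blocking in the post happens via .htaccess, and the two-phase rollout matches the waiting period described above.

```python
# Spider-trap bookkeeping: any IP that requests the hidden trap page is
# logged, and, once autoban is switched on, banned and redirected.
# Whitelisted search-engine IPs are never banned.

class SpiderTrap:
    def __init__(self, whitelist=None, autoban=False):
        self.whitelist = set(whitelist or [])  # major search-spider IPs
        self.autoban = autoban                 # start False: log-only phase
        self.seen = []                         # every IP that hit the trap
        self.banned = set()

    def hit(self, ip: str) -> str:
        """Record a visit to the trap page; ban the IP if autoban is on."""
        self.seen.append(ip)
        if self.autoban and ip not in self.whitelist:
            self.banned.add(ip)
            return "redirect-to-dead-end"
        return "log-only"

    def is_banned(self, ip: str) -> bool:
        return ip in self.banned
```

During the waiting period you'd run with autoban=False and review the `seen` list to confirm no good spiders are still hitting the trap; once they've picked up the new robots.txt, restart with autoban=True.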