Making Money with Parked Domains

Have you ever typed in a domain name and arrived at what looks like a domain registrar’s default page reading, “http://www.thedomainyoutyped.com is under construction”? Well, it is time to stop advertising your domain name registrar’s site and start using that parked domain name to make money.

How do I Make Money with my Parked Domain Names?

The first step is to make sure that your domain registrar offers a “Forwarding” or “Redirection” service. If you do not know what forwarding is, let me explain: forwarding a domain name means that your registrar will send anyone who types in that domain on to another domain. For example, if you set up forwarding from yourdomain.com to yourotherdomain.com, anyone who types in or clicks a link to yourdomain.com will automatically be sent to yourotherdomain.com. That’s it! I use GoDaddy to register domains, and they offer forwarding for free.
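Incidentally, registrar forwarding is just an ordinary HTTP redirect under the hood. If you ever host a parked domain on your own Apache server rather than relying on the registrar, a single .htaccess line gives the same effect (a minimal sketch; both domain names are placeholders):

```apache
# Send every request for this parked domain on to the other domain
Redirect 302 / http://www.yourotherdomain.com/
```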
Finding the right affiliate program.

After you determine that your domain registrar offers forwarding, you will need to find affiliate programs that are directly related to your parked domain names. For example, I have the domain name BuyLasVegasShowTickets.com, so I would look for an affiliate program that sells tickets to Las Vegas shows. It is best to find affiliate programs directly related to your parked domain name; you will find that a higher percentage of affiliate sales are made that way. If a visitor is already interested in the subject of your domain, it does not make sense to send them to something they are not looking for.

I joined an affiliate program…now what?

After joining an affiliate program you will be given a “referring URL” that contains your ID. Next, go to your domain registrar’s site and set up forwarding from your domain to that referring URL. Continuing with our previous example, our fictitious referring URL is http://www.youcanbuyshowticketshere.com/id=me (remember, this is fictitious, for example only). So I go to my domain registrar and have BuyLasVegasShowTickets.com forwarded to http://www.youcanbuyshowticketshere.com/id=me. Now when someone clicks on a link or types in BuyLasVegasShowTickets.com, they are automatically taken to a site where they can buy show tickets, and since my referral ID is in the URL, I get credit for any sales made. It is that simple.
What are you waiting for?

I know there are thousands (if not millions) of you out there who own domain names that sit around collecting dust. Why not put them to use and start making money with them?

Creating custom error pages using .htaccess

No doubt you’ve been frustrated after visiting a web site, clicking on a link, and being presented with the dreaded:

Error 404 – File not found

What’s the first thing you do? In most instances I know I just leave the site altogether.

Lost visitor = lost $$

But where there is a properly structured custom error page, especially one with a search box, I may hang around a while longer – or another offer may grab my attention.

If your web host supports .htaccess files, then with a few minutes work, you can have your own error pages up and running!
What is a .htaccess file?

It basically contains commands that instruct the server how to treat certain requests.

The .htaccess file contains a number of settings to control who can access the contents of a specific directory and how much access they have. It can also be used to create a “URL Redirect”.
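As a quick example of the access-control side, restricting a directory to a single visitor address takes only a few lines (a sketch using the classic Apache 1.3/2.0 syntax; the IP address is a placeholder):

```apache
# Turn everyone away except one trusted IP address
Order Deny,Allow
Deny from all
Allow from 203.0.113.5
```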
How do I implement custom error pages?

Create and publish your custom error pages to your account as you would publish any page. You’ll need one for each of the most common error codes: file not found (404) and unauthorized/forbidden (401, 403). Your custom error pages should contain an apology and a brief blurb about what may have gone wrong (a renamed file, etc.). This explanation should be immediately followed by an invitation for the visitor to try reloading the page or to select a different section (provide suggestions). Ensure that these pages match your site’s look and feel.
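A minimal 404 page following that recipe might look like this (a bare-bones sketch; the file names and sections are hypothetical, so adjust them to match your own site):

```html
<html>
<head><title>Page Not Found - YourSite.com</title></head>
<body>
  <h1>Sorry, we couldn't find that page</h1>
  <p>The file may have been renamed or moved. Please try reloading,
     or start again from one of these sections:</p>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/products.htm">Products</a></li>
    <li><a href="/contact.htm">Contact us</a></li>
  </ul>
</body>
</html>
```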

After publishing the pages, you’ll need to edit the .htaccess file in the root of your document directory of your site. Use the Edit utility (set to ASCII transfer mode) in your FTP software to view the file (it would be wise to also create a backup).

If you have a FrontPage web, be especially cautious, as the .htaccess file contains other important FrontPage configurations.

If you don’t find a .htaccess file in the root of /docs, you can create your own with any text editor; ensure the file is called .htaccess (include the dot).

Add the following lines to the end of the file (adjust the paths to suit). Use local paths rather than full http:// URLs – if you give Apache a full URL, it issues a redirect and the browser receives a 302 status instead of the real error code:

ErrorDocument 404 /404.htm
ErrorDocument 403 /403.htm
ErrorDocument 401 /401.htm

Save the file, test by trying to access a page that doesn’t exist on your site – done!

Custom error pages are very easy to create and help you to retain wayward visitors – remember, that one visitor you may lose through not having custom error pages may be the one who was ready to buy your products.
What if I’m renaming or moving pages?

While you can use a custom error page in these situations, it’s better to use a 301 redirect for a number of reasons. Learn why and how to implement a 301 redirect in our tutorial.

Renaming/moving a page? 301 redirect it!

If you are considering moving a page on your web site into another folder, or simply renaming it, there are a couple of important points to consider. The most important is that if the page you are moving or renaming already has a good search engine ranking, or has been bookmarked by your visitors, all your hard work could be lost.

Bearing in mind that search engines can take months to refresh their listings, any visitor clicking on a search engine result may be severely frustrated when they don’t arrive at the expected page on your site. You could always use a custom 404 error page, but this is still an extra hurdle for visitors to jump, and the rankings you built up for the page in question will be lost.

Wouldn’t it be great if you could set up your site so it was “smart” enough to know that the page had been renamed or moved and then take the visitor to the correct page? Well, you can!
Enter the 301 redirect

One of the simplest ways to redirect visitors is to put up a blank page with what’s known as a “meta-refresh” tag. But this technique is frowned upon by many search engines and definitely won’t save your rankings, so we won’t discuss it any further. If you are currently using meta-refresh tags, it would be wise to change them over to 301 redirects, the most efficient and search-friendly solution: search engine spiders and human visitors alike are seamlessly presented with the correct page whenever the old page is requested.

A 301 redirect is implemented in your .htaccess file.
What is .htaccess ?

.htaccess is a text file that is checked by the web server when a request for a page/item is made by a browser, agent or spider. It contains specific instructions on how to handle specific requests and also plays a role in security.
What’s a 301 redirect?

“301” translates to “moved permanently”. In the directive, the code is followed by the path of the moved or renamed page, then a space, and then the new location and name of the file.
Implementing a 301

First of all, check with your web host that you can use a 301 redirect – not all web servers will be compatible.

You’ll then need to download the .htaccess file from your web site; it can be found in the root of your documents directory via FTP (use ASCII mode). If a .htaccess file isn’t present, create one with Notepad or a similar text editor. Remember the “.” at the beginning of the file name, and do not add a file extension.

If there is a .htaccess file already in existence with lines of code present, be very careful not to change any existing code. It’s probably wise to create a backup of this file in case you make a mistake.

Scroll down to the end of the current code, skip a line, and then create a new line using the following example as a guide.

redirect 301 /current/currentname.htm http://www.you.com/newfolder/newname.htm

That’s all there is to it – save and upload back to the document root directory and then test it out by typing the old address into your browser – you should be seamlessly redirected to the new page name/location.

Note: Do not use “http://www” in the first section of the statement – just add the path from the top level of your site to the page. Also ensure that you leave a single space between these elements:

redirect 301 (the directive that the page has permanently moved)

/currentfolder/currentname.htm (the old path and file name)

http://www.you.com/newfolder/newname.htm (new path and file name)
Moving/renaming many pages?

The basic 301 redirect is a great solution for changes to a few pages, but what about dozens of pages or an entire site? A more powerful set of instructions for URL redirects is contained in the Apache mod_rewrite module. Learn more about it here:

http://httpd.apache.org/docs/misc/rewriteguide.html
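To give a taste of what mod_rewrite can do, a single rule can 301-redirect an entire renamed folder while preserving the individual file names (a sketch; the folder names and domain are hypothetical):

```apache
# Redirect everything under /oldfolder/ to /newfolder/, keeping file names
RewriteEngine On
RewriteRule ^oldfolder/(.*)$ http://www.you.com/newfolder/$1 [R=301,L]
```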

Shopping Cart SEO

I had a few interesting conversations about shopping carts last week. (Note to Yahoo, anyone who has two interesting conversations about shopping carts in the same seven-day period needs better online entertainment options. Please step up the pace.) Actually, shopping carts are fairly interesting, if only because they are so fundamentally important. With the rise in consumer acceptance of e-commerce comes a tidal wave of merchant awareness and in a massive page-by-page process, shopping carts have rapidly expanded the size of the web and will continue to do so until a better way comes along to sell stuff. They are also expanding the sizes of search engine databases with mixed results. Shopping carts are either fast moving or fairly static. In some cases, product listings can change faster than Yahoo updates its cache of the page those products were listed on. There are a number of issues that can be associated with these problems and since each website is unique, general rules tend to be applied as solutions until work-arounds are devised or purchased.

The first “conversation” was actually an email correspondence with a site developer/SEO in the UK named Christine, who wanted to know why Yahoo would ask webmasters to use robots.txt files to prevent Slurp from deep-spidering specific product pages in shopping carts. She was right when she wrote, “I thought it would help (Slurp) as it would see all the products, and be more relevant, and then when someone searches for … say a necklace then they would come up with the necklace shopping cart section.” It stands to reason that relevant results might be found deep within an e-commerce database on a shopping cart page, and excluding these pages from Yahoo’s organic index limits the range of information available to searchers. There are several good reasons Yahoo and other search engines would rather not spider their way deep into the hearts of shopping cart databases. At the same time, there are millions of merchants selling billions of items that consumers are looking to purchase online, who would benefit from Top 10 placements for specific products in their inventories.
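The exclusion Christine asked about is a short plain-text robots.txt file placed in the site root. A sketch, assuming the cart lives under a /cart/ path (your cart’s path will differ):

```text
# Keep Yahoo's spider out of the shopping cart pages
User-agent: Slurp
Disallow: /cart/
```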

The second conversation was with StepForth Placement’s sales manager, Bill Stroll, as he and I were looking at strategies for e-commerce sites. Bill pointed out that StepForth had been receiving a number of recent inquiries regarding shopping carts.

Both conversations got me thinking about shopping carts and the frustration that can come when targeting specific product placement through shopping carts.

Shopping carts are relatively easy to establish and populate. For merchants, opening an e-store is easier today than ever before, with many ISPs offering a basic shopping cart as a web development tool and hundreds of different shopping cart software packages on the market. If you hang out in the IT field, chances are you know someone who has built or worked on someone else’s customized shopping cart. For SEOs, every new shopping cart can pose a steep learning curve if they are intent on achieving specific product placements.

For search engines, the internal pages of a database might provide excellent information, or they might produce stale results left over from the last time the spider visited. Some webmasters haven’t quite mastered the art of their shopping carts. I’ve encountered lost pages, blind links, ancient products that don’t exist anymore, orphaned links, databases created by on-the-fly reference/link generators that mechanically spit out thousands of carbon-copy pages… the list goes on and on. If a guy like me has seen enough to say he’s seen a lot, try to imagine what the search engine spiders see and record. There’s a lot of noise pollution in them there search engines. All of the major engines spit out bad results from time to time, and Yahoo is no exception. Asking webmasters to exclude spiders from shopping carts is one of the ways Yahoo is trying to clean up its SERPs.

Nevertheless, search engines are a marketing tool, and without a listing a product will sell far less than it would if it were easily found in the Top 10 results. There are several ways to go about getting these listings: the first produces very rapid results but can be rather pricey, while the others take a bit longer and cost less. All of them benefit greatly from treatment by an SEO.

The first work-around is establishing a relationship with Yahoo via SiteMatch or SiteXchange. This enters the realm of paid inclusion, but guarantees either consistent spidering by Slurp or, for much larger sites, a constant XML feed to Yahoo, as opposed to reliance on more random spidering by Slurp. Rates are negotiated directly with an Overture sales rep or via an SEO/SEM firm. For smaller online stores, the greatest benefit of SiteMatch is consistent spidering, which practically assures both you and Yahoo that any changes to your merchandise or website will be read and recorded fairly quickly. For the big shopping sites, SiteXchange allows you to establish a direct XML feed straight to Yahoo, which saves them the time and bandwidth of constantly re-spidering huge sites while offering large online sellers the security of constant updating of their site at Yahoo. (Please note: at $5,000 per month and up, SiteXchange is an option only for the largest stores, or ones with a $60,000+ yearly online advertising budget.) While Yahoo offers paid inclusion for online merchants, the promise of rapid and consistent spidering does not guarantee Top 10 placements in the SERPs; it only guarantees that Yahoo’s spider, Slurp, will visit the site more often, or that you have a direct pipe to Yahoo. For strong placements, hiring an SEO firm, or doing the work yourself, is important.

Most online retailers don’t have that kind of money to spend on search engine marketing, but a lack of online advertising dollars does not diminish the need for strong search listings. There is a less expensive work-around that relies on static pages for specific products or product groups. This is an old-fashioned method and not entirely desirable, as it involves a lot of labor from your IT or SEO staff and creates a number of quasi-duplicate pages; however, it is the method that has stood the test of time. Basically, a static page with lots of description should be created for each product or group of similar products in the shopping cart. No spider ever balks when asked to crawl a series of static HTML pages. A direct link to the shopping cart listing can be included on the static product page to allow one-click navigation for customers. Sitemaps also work best when links lead to static pages, and a coherent SEO plan is easier to devise for static pages than for highly dynamic content.
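A static product page of the kind described above needs nothing fancy: a keyword-rich title, plenty of unique descriptive text, and a direct link into the cart (a sketch; the product, file names and cart URL are all hypothetical):

```html
<html>
<head><title>Sterling Silver Necklace - YourStore.com</title></head>
<body>
  <h1>Sterling Silver Necklace</h1>
  <p>Several paragraphs of unique, descriptive copy about the product
     go here; this static text is what the spider reads and ranks.</p>
  <a href="/cart/add.cgi?item=necklace01">Buy this necklace now</a>
</body>
</html>
```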

The third work-around costs less money than the first and offers many of the SEO benefits of the second. There are a few shopping carts out there that have been designed specifically with search engine spiders in mind. The one we have worked with most was developed by Oklahoma-based Internet pioneer Lee Roberts. When developing the ApplePie Shopping Cart <www.applepiecart.com>, Roberts, better known as the host of the syndicated WebDoctor radio show, focused on two major issues. First, he wanted to create a tool that practically anyone could use. Second, he wanted to create a shopping cart that would provide information to search engine spiders in the way spiders best like to read site information. After years of development, Roberts released a successful version of the cart. According to Roberts, “Apple Pie Shopping Cart provides an easily optimizable page even for business owners who do not know how to work with HTML. All you need to do is simply fill in the blanks and Apple Pie builds your site for you. We’ve had business owners finding Google indexed over 1000 pages of their e-commerce site within the first 30 days … and many of those pages receiving top positions in Google’s results.”

This holiday season, the e-commerce sector is expected to drive upwards of 15% of product purchases. With almost every major retailer and most smaller merchants getting involved, online commerce continues to expand exponentially, and so will the demand for shopping cart SEO. There is a long way to go, and new versions of shopping carts are yet to come. As it stands today, most pages produced by shopping carts are simply not going to achieve a strong ranking in search engine listings without a work-around of some sort. While the costs may vary, the goals are always the same: getting the placements and making the sales.

IPower Web Review

iPowerWeb proudly advertises that it serves over 250,000 members with affordable, reliable hosting. They say this for good reason.

For $7.95 per month you get:

Free set-up
2,000 MB of disk space
80 GB of bandwidth
SSL certificate for secure online shopping
MySQL database

Plus you get to choose from these shopping carts:

osCommerce Shopping Cart
Agora Shopping Cart
PayPal Shopping Cart

You also gain access to marketing and promotion tools, web stats, and much more.

iPowerWeb’s Business Pro package has been the #1 selling web hosting solution on the Internet for two years in a row. It is a complete, all-inclusive solution for everyone – beginners, professionals and businesses.

Analytics and Expertise – Trusting the Pros to Know

Recently, a friend of mine bought a new car. Buying a car can be extremely stressful, with an enormous array of important numbers, specifications and comparative measurements to consider before purchase. Now, I was raised in the downtown core of Toronto, part of a megalopolis stretching around the northwestern quarter of Lake Ontario. Growing up with a highly efficient public transit system and a decidedly urban lifestyle, I never even considered the need for a vehicle until I was in my mid-20s. Ten years later I am still at the “this moves me where?” stage in my relationship with vehicles. Taking me to a car dealership is sort of like asking another city kid which mushrooms are safe to eat in the forest. “Hey, that one looks cool…” Things can get pretty Mickey Mouse from here, eh?

Being Mr. Urban-boy, I took a common-sense approach to this problem. In order to appear less dense than I actually am, I did a bit of research on the types of cars my buddy was interested in. There is a lot of information out there, and being an SEO, I have a knack for finding it quickly. I compiled small dossiers on several vehicles my friend mentioned. I can tell you about torque ratios, fuel efficiencies, anti-lock braking systems and all sorts of other stuff about several different models.

I was about to learn a very sad truth about such matters. When it comes to new cars, I don’t really know what I am talking about. I know enough to make conversation with another person but when it really comes down to it, lots gets lost in transmission.

Armed with numbers and knowledge, I felt somewhat comfortable helping my friend avoid getting sharked by a salesperson, at least in my head. In my heart however, I knew I was descending into a realm I’ve never really needed to understand before. While I already understood the basics of internal combustion engines, a glance under the hood of a 2004 model showed a very different design than the V.W. or Slant6 engines I’ve seen over other friends’ shoulders. After about five minutes, I decided my best contribution would be to simply stop asking questions and just watch the person selling the car. Let me tell you, people selling cars can throw numbers around and they sure know a lot about the vehicles they sell. Some of them were really nice folks. Others could have been typecast for their roles as totally scuzzy car dealers. The experience reminded me of a part of the cyber-world that is very close to my heart.

After my experience “helping” my friend find a new car, I thought about the tools that provide businesses with information about their websites and online marketing efforts. There are a number of free SEO analytic tools out there for webmasters and site developers. In many cases, these tools offer a lot of numbers but very little actual analysis, functioning instead as sales devices for SEO or SEM firms.

Being able to access stats regarding the number of incoming links or the number of words found on a page does not necessarily give one the full knowledge needed to practice SEO. Search is a complicated field that has never provided a static environment. That complexity is the primary reason the SEO sector exists. When it comes to structuring an online marketing campaign, having hard facts about your website gives you the ability to make informed decisions, especially when you don’t have the luxury of examining the eyes of the salesperson on the other end of the phone.

Still, knowing all the numbers doesn’t really mean one knows the score. What do the numbers really mean in relation to each other or in relation to a competitor’s site? Here is a basic guide to analytic data you should be looking at.

W3C Compliance:
The World Wide Web Consortium (W3C) is the body that sets technical standards on the web. Being certain your site is W3C compliant helps ensure any search engine spider can read it. Look for a tag at the very top of your source code that looks something like this:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

Title:
There are no firm rules for the length of a page title, but conventional wisdom says the greatest “power” area is the first 40 characters. If your page titles do not have keywords in the first 40 characters, chances are you will want to rephrase them. You will also want to ensure that each page in a site has a unique, topic-specific title.

Meta Description and Meta Keyword Tags:
There are two meta tags that are important to search engine rankings, the description and keyword tags. Of these two tags, the description is the most important but the keyword tag is thought to carry a very small weight on some search tools. Both tags should be kept below 190 characters and have the strongest keywords or phrases as close to the beginning of the tag as possible. It should be noted that alterations to either of these tags would affect another important analytic measure, keyword density.
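Putting the title and meta tag advice together, a page head might look like this (a sketch; the keywords and wording are examples only):

```html
<head>
  <!-- strongest keywords inside the first 40 characters of the title -->
  <title>Las Vegas Show Tickets - Compare and Buy Online</title>
  <!-- both tags kept under 190 characters, strongest phrases up front -->
  <meta name="description" content="Las Vegas show tickets for every major venue. Compare seats and prices, then buy online.">
  <meta name="keywords" content="las vegas show tickets, vegas shows, buy show tickets">
</head>
```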

Use of Heading Tags:
Headings should be used like page headlines. A good analytic tool will tell you how many <H1>, <H2>, or <H3> tags are used on each page analyzed. Search engines tend to give a bit more weight to keywords phrased in heading tags; however, they also penalize sites that misuse them. Knowing the number of times a heading tag is used doesn’t tell you whether that is the optimal way to use such tags. It is also difficult to offer general advice on heading tags, except to suggest that limiting their use is generally wise.

Use of IMG-ALT Text:
A good analytics tool will tell you how many images on a page use ALT text; however, most will not tell you whether the ALT text is used wisely. Image ALT text is the text a browser displays when an image cannot be loaded (some older browsers also show it as a tooltip on mouse-over). It is primarily an accessibility tool, allowing screen readers to describe an image for visually impaired visitors. Some search engine marketers also abuse image ALT text as a spam tool.

HTML Size (or Page Size):
Good analytic tools will tell you the size of each page. Generally, the smaller the number the better, as small pages load faster and are more likely to address one topic per page. If your analytic tool reports a very large page size, that is likely an indication you need to restructure the page.

Keyword Density:
Keyword density refers to the ratio of keywords to non-keywords on a page. This is a touchy area, as many in the SEO community do not believe keyword densities play a factor in organic placements, but it is an often-analyzed page element. Scott Van Achte, head SEO at StepForth Placement, does consider keyword densities; however, he believes that the “optimal” keyword density is directly related to the other pages listed in the Top 10 placements. This is an area you would want to address with a professional SEO firm.

Site Structure:
This refers to how a website is constructed and is a fairly dense area to work through. Please note: given the numerous types of sites, databases and design tools, a general view on site structure is difficult to present, an issue shared by all analytic tools (tech-writers included). If you think your site structure might have an adverse effect on rankings, you should speak to an SEO. That said, here is some general advice on analytic tools and site structure.

The first thing a good online analytic tool will do is tell you whether a search engine spider can read your website. A problem here is that, generally speaking, most search engine spiders are more advanced than the free SEO analytic software. Quite often an older tool will tell a webmaster that their site is not open to search engine spiders when the site is in fact wide open.

Next, a good tool will offer a representation of link paths found within the site. Users should be able to see each link listed, including the anchor text used to phrase the links. The tool should also expose any “dead links”. If there are critical SEO issues posed by the structure of the site, a really good free tool will offer corrective suggestions as well.

An important thing to note is that most analytic tools examine individual web pages not entire websites. You want to be certain you have a full analysis of your entire website before undertaking a major redesign or SEO effort.

Incoming Links:
One of the biggest factors shared by all major search engines is that spiders find pages by following links. Furthermore, the number of links directed to a specific page has an effect on the placement of that page.

A good analytic tool will tell you exactly how many incoming links point to the page being studied. A really good tool will give you an active list of those links; however, most free tools do not generate such lists. Link analysis is an important part of SEO work, but this is another area in which most analytic tools simply cannot offer a full picture of the effectiveness of current incoming links.

Overall, website analysis is very complex, made more difficult by the fact that analysis of competing websites is critical to establishing useful baselines. It is important to remember that most online analytic tools look at pages, not entire sites. Webmasters and marketers are urged to gather as much information as possible before considering search engine optimization, whether in-house or out-sourced to a professional firm. Above all, use the numbers gathered in analytic study of the various pages in your website to quiz whomever you are considering for your SEO effort. If every topic addressed above generates a thoughtful response (even if it challenges what I’ve written), chances are you are talking to a pro.

By the way, my friend has a nice new car. At the end of what turned out to be a very long day, he settled on a great car purchased from the salesman he trusted the most. After I stopped trying to be an expert on cars, I reverted to an expertise we all share: I used my meager knowledge to probe my sense of trust.

Hosting Your Own Web Server: Things to Consider

Are you disgusted or disappointed with your current web host? Have you switched web hosting companies too many times? Have you thought of hosting your own website(s)? Do you have the ambition to control and manage your own web server?

If you answered ‘yes’ to the questions above, then you may be ready to host your own sites. This article will give you things to consider while making the switch.

When you are your own web host, you should be technically inclined: have basic knowledge of operating systems, understand technical terms, know how to set up a server environment (DNS, IIS, Apache, etc.), have basic knowledge of scripting languages and databases (PHP, Perl, MySQL, etc.), be familiar with current technologies, and have a basic understanding of hardware and server components.

You should also weigh the pros and cons. It is one thing to say you want to host your own web server, and another thing to actually do it.

Pros:

Own sense of responsibility
Awareness level raised (you are at the frontline of all server happenings)
No monthly hosting fees/accounts
No more dealing with someone else’s incompetence
Non-shared environment (dedicated server)
Unlimited websites, databases, content, storage, etc.
More bandwidth
No more waiting on someone else’s time
Complete control

Cons:

Exhausting at times
Faced with server/hardware problems
ISP business account (monthly business/broadband expense)
If the server goes down, the website is offline
No technical support team
Software, hardware, and network expenses

There could be many more pros and cons, but I’ve pointed out some of the major ones. Managing a web server starts as a full-time job: you must constantly monitor its performance and security. This can sometimes be exhausting, especially if you have other responsibilities. Still, the control you gain over your website and its performance is rewarding enough. You no longer have to wait for technical support, or for approval to install a script on the server. You can have as many websites and databases as you want, as long as your hardware can handle it. You no longer have to go into discussion forums to search for the best web host, or rant about how much you hate your current host. You can even begin hosting friends’ and family’s personal websites.

Ask yourself: how technically advanced are you? Often you do not have to be a tech guru or anything of the sort, but you must be very resourceful. You must know how to find resolutions and answers to problems quickly and efficiently. This means you must be internet savvy – not the average surfer who surfs aimlessly, but the surfer who can always find what they are looking for. This is key, because with any server environment you are going to run into problems, and finding the answers is best accomplished online, using multiple resources, search techniques, and engines. Sure, you can hire someone to fix your problems, but as we should have learned from web hosting, having someone do it for you isn’t always the best option. Here is a test to see if you are ready to find solutions: I need a solution to a Microsoft Windows 2003 Server event error – “Event ID: 1056”, a DHCP Server error. How would you search? Go ahead, find the solution.

Did you first go to Google? If you did, that was a nice effort and common for most, plus a good place to start, but it is usually best to start at the developer’s website. In this case, microsoft.com would have been the first option. Why? Google would more than likely surface the answer from Microsoft along with other sources, but you don’t want to pick up inaccurate information from those other sources. The developer’s information may not always specifically resolve your problem, but the developer should always be the first place you search for answers. Now search the error again, go to the Microsoft site, and find the solution.

You should have found this link: http://support.microsoft.com/default.aspx?scid=kb;en-us;282001 (Event ID 1056 is Logged after installing DHCP)

What search phrase did you use? It should have been “Event ID: 1056”, because the Event ID is the exact error; it pinpoints your exact problem without broadening your search. Sometimes the error description is also appropriate to search, either by itself or in combination with the Event ID. It depends on your error, your search results, your ability, and your technique. For this example I did not include the error description.

Google or Yahoo! should have been your second option (the two largest search engines). Then search other smaller and niche search engines. A good search site which makes use of Google’s operator tags is www.soople.com. Next you should search within forums and discussion groups. If you are fairly internet savvy and have a few forums and discussion groups which you frequent, you might actually visit those before Google or Yahoo!. You could even visit them before the developer’s site, since they are trusted sources, but I wouldn’t recommend it; I would still go to the developer’s site first. Okay, so now we have planted our feet and familiarized ourselves with being internet (search) savvy. We are ready to purchase a server!

When making a server purchase you need to consider a few things beforehand. What to buy? A top-of-the-line, quad-processor, super-fast turbo server is always ideal, but many times it is not logical or affordable. Therefore, you need to weigh your options (sensibly).

What to buy?

First determine your budget. Be realistic and expect to spend at least $2,500 for a low-end server. For a low-end, quality server with other needed equipment and services I spent a little over $4,500 easily.
Determine your ISP (broadband) provider. Research and speak with several different vendors before deciding which broadband solution best suits your needs; each provider’s plan is different and carries different benefits. Bandwidth should also be taken into consideration when choosing your ISP.
A backup device should be purchased before implementing a server install, with double the server’s storage space. This could be a standalone unit, like an external hard drive or network storage device, or multiple devices such as backup tapes, discs, etc. The reason the backup device should be larger is that you want to keep months’ worth of backups, not just weekly or monthly ones. You should be able to hold at least 24 weeks of backups without worrying about storage space. The backup media should also be external, removable, and portable, so the backups can be stored in a remote location, usually for safe-keeping in the event of a theft or disaster.
Determine your daily traffic goal (the daily traffic which you hope to see within 1 year – be realistic), divide that number by the daily traffic you currently receive, and then multiply that number by 5. That is the total number of GB space you need.

Example:

Daily Traffic Goal (10,000) ÷ Current Daily Traffic (500) × 5 = 100GB

In the example you should purchase a 100GB hard drive, and it is best to buy 2 or more drives as opposed to 1. In this case, since 50GB drives are hard to find, you would buy two 60GB drives, giving you a total of 120GB. 2 or more drives are usually needed in a server to configure the proper RAID option; in some cases 3 or more are needed. Your backup storage space should be a minimum capacity of 200GB (or 240GB, optionally).
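The sizing rule above can be sketched as a quick calculation. This is just a restatement of the article’s rule of thumb (the function name and figures are illustrative, using the example numbers from the text):

```python
# Rule of thumb from the text:
#   disk GB = (daily traffic goal / current daily traffic) * 5
#   backup capacity = at least double the server's storage

def required_disk_gb(traffic_goal, current_traffic, factor=5):
    """Estimate hard drive capacity in GB from the sizing rule above."""
    return (traffic_goal / current_traffic) * factor

disk = required_disk_gb(10_000, 500)  # the worked example: goal 10,000, current 500
backup = disk * 2                     # backup device: double the server's storage

print(disk, backup)  # 100.0 200.0
```

Run it with your own traffic numbers to get a starting figure before you shop for drives.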

Determine your memory. If your web server’s daily traffic goal is 500,000 then I would recommend at least 2GB of memory. If it is a shared server, meaning it also runs other server services, especially a mail server or database server (which is not recommended), then you should have at least 3GB or more. Otherwise you can think small and upgrade as needed; a 1GB memory stick should be fine for starters.
Determine your network components: which NIC performs best under high traffic levels and which router performs best for your LAN / web server. It is best to get a router with a built-in firewall (commonly known as a “hardware firewall”). Your ISP may provide you with a router or hardware firewall; this is how they are able to authorize your traffic on their network, much like a cable box does for cable television. The router also shares your IP address with other clients on your network, which enables you to share your internet connection without having to get a different IP from your ISP. The hardware firewall is simply a router with a built-in firewall, which means it shares your IP address as well as provides added protection to your network. It blocks bad addresses and ports at the forefront, before they can even make it to your computer. It is not recommended to rely only on a hardware firewall for security; this is just the first step. You should also include a software firewall (firewall software which installs on your computer) and continue timely security practices, such as updating and patching your system on a scheduled routine.
Determine your processor speed and power. Regardless of the amount of traffic you are expecting, I would recommend a dual processor or greater. A dual processor or greater is best because if your website unexpectedly takes off you will be well prepared, and if you host other server options or websites on the same machine you will get better performance. At the time of this writing the 64-bit platform is the processor direction, and 3.8GHz is the highest commonly available speed. If your pockets can afford the latest technologies then that is ideal, but keep in mind that at this particular time a 64-bit compatible processor is not necessary, since few applications require or deliver on that platform. That is a lot of speed going nowhere fast. Also remember that the more powerful the platform and the greater the processor speed, the more heat it produces, and therefore the more rapidly and efficiently it must be cooled. At this time a dual 32-bit, 2.8GHz – 3.2GHz processor will suffice (even that is far more than enough). If your pockets can afford it, the latest and greatest is fine and you will be well prepared; otherwise, do like most people and upgrade when the time comes.
Make sure you have a CD/RW drive. A floppy disk drive is not strictly needed, but I do recommend one for making system restore disks. The CD/RW drive is needed because you need some type of removable storage device; you never know when you will need to install a driver from a different location, like an Ethernet driver. An external CD/RW drive is the best option, especially if you have multiple machines.
Choose your operating system carefully, and choose the vendor you are most comfortable with. Do not choose a Mac if you have never used a Mac before. Just because your friend suggests it and says it is a piece of cake does not mean it will be for you. You are trying to get a web server online, not re-learn a whole new system. Stay focused and grounded. If you are comfortable with Microsoft then go with Microsoft, regardless of whether the IT person at your job says Microsoft products are insecure and Linux or Unix is more secure and much better for a web server. If you have never used Linux or (especially) Unix then you will be in for the ride of your life; your web server experience will soon become a nightmare and you will have wasted thousands of dollars on equipment. Go with what you know, not what you are told. Each platform has its pros and cons: Microsoft is the most user-friendly of them all; Mac is the web/graphic developer’s choice of them all; Linux is the open-source developer’s paradise of them all; Unix is the most secure of them all. Each of them can be tweaked to provide a solid, quality platform, so it is best to stick with what you already know.
You should install on your system all the web services (such as Apache, IIS, etc.), applications (such as backup device software and a RAID/array manager), scripting languages (such as PHP, Perl, etc.), CGI, a database (such as MySQL, Microsoft SQL, etc.), and a web log analyzer (such as WebLog Expert, Nihuo, etc.). It is also recommended to perform system updates, security patches, and firewall installs and configurations. It is sometimes best to leave the security installs and configurations until after you have the system functioning properly, to avoid any unanticipated problems. That way, if you install the firewall and something stops working properly, like connecting to your website from outside your network, you can troubleshoot on the assumption that it is the firewall, and would begin by opening port 80 on the firewall. This method can save you a lot of hassle, but can sometimes be considered the insecure method, especially if you don’t have a hardware firewall already in place.

Next, there are a few things to consider before installing and configuring a server. Where to put it? A secluded, cool location is ideal, but if this cannot be accomplished then adjustments can be made.

Where to put it?

Choose a location within your home that is static-free; we do not want to put the server on carpet, near curtains, or anything else that can produce a static reaction. The reason is that you do not want the server’s internal components to receive a static charge and short-circuit.
The location should be a cool environment, preferably below room temperature. The reason for a cool area is that servers (as well as desktop computers) produce large amounts of heat, and the more heat a machine produces the more it needs to cool off. If the server is not kept cool, it can start hanging and eventually crash, with the same result as when a car overheats: it stops running. You can keep a server’s internal components cool by keeping the room cool; additionally, you can add more fans to the server, or position a standalone fan directly at the server’s back panel so it constantly cools the internal components. Depending on how cool your room is and how much heat your server produces, getting more fans for your server may be a must and not an option.
It is suggested that the server sit at a higher level within your house (mid-floor), because if it is in the basement and a flood occurs it could be ruined, and if it is in the attic and the sun beams on the attic for hours the server could overheat. Mid-floor level is usually the coolest place within the home, but these recommendations are optional and you should put the server in the most comfortable and convenient space that follows the rest of the guidelines.
Be sure the area you choose is not damp or wet (no leaks, moisture, near any liquids, or near a window “especially open”). We all know what can happen when electrical components get wet.
The server should sit at least 4 inches off the ground.
Be sure a working electrical outlet is near. It is important to have an uninterruptible power supply (UPS); a good brand is APC. A UPS can save you from electrical outages, blackouts, and brownouts: it helps the server keep its power and remain stable and unaffected when there is an outage, and it also protects your server against electrical shock. If an outage lasts more than 5 minutes, it is recommended to begin backing up anything deemed important and shut down the computer voluntarily, as well as any other attached electrical devices.
Have a LAN line near, and also a phone jack if you plan on using the server for any dial-up services. It is not recommended to run your web service over a dial-up connection; a broadband connection or better is recommended for optimal performance. The LAN line should never come from your wall and plug directly into your server; it should plug into a hardware firewall (usually a router with a built-in firewall).
Depending on the size of the server and its internal components, the server may be extremely noisy and loud, especially with the fans going. It is suggested to keep the server in a secluded location away from any peaceful area of the house, such as a bedroom.

Those are some basic things to consider before hosting your own server. Hosting your own server is not an easy task, but once you have had some time and experience with it, it becomes a breeze. You will have a better web hosting experience, you will be more in tune with and responsive to your customers, and you will gain greater technical aptitude. Hosting your own web server is rewarding, useful, and gives you the opportunity to host unlimited websites, databases, services, etc.

If your web traffic becomes too great for your server, it is recommended to move to an offsite web host, purchasing a dedicated server with a quality provider. If you build a server with the above recommendations, it should handle at least up to 25,000,000 page views per month; it could be greater or less depending on your operating system, server configuration, performance, applications, and services. The point is you should be well positioned to purchase dedicated server space by the time your traffic gets too high, because if you are getting anything close to 1,000,000 page views a month you should have some competitive advertising or ecommerce income. Plus, with the knowledge you gain from the experience, you will be able to support your own server at a dedicated (remote) location, thus cutting more overhead.

The biggest disadvantage of hosting your own web server is uptime reliability. If your web server powers down for any reason, your website is offline. Web hosting companies usually have methods and networks to prevent such a failure from affecting your website: if the system goes down, they have a mirror location from which your site is served, so it doesn’t experience much downtime. Some web hosts do not practice this method or any other fault tolerance. Imagine if a blackout or power outage occurs; your web server could be down for hours or even days, which can affect your business greatly. There are some hosting companies who provide mirror web hosting for a small fee, though I haven’t found one that is reliable yet. Or you could cross your fingers and hope for the best, until you are able to host your server outside your home on a larger network.

Hosting your own server usually is not a money-saving experience; it sometimes costs more, when you total the upkeep and maintenance. The benefit of hosting your own server is usually a better platform for your customers, because it is no longer a shared server and it gives you the ability to make global changes almost instantaneously. When your customers request more performance or specific applications, you will be able to implement them immediately. Customers like to know that you are in control. If you tell your customers that the server will be down for maintenance from 6 a.m. – 7 a.m., then it would be best if that is when your server is down, not from 5 a.m. – 10 a.m. You know how your web host will tell you one thing and then you have to try explaining it to your customer. In the end it will benefit you to host your own web server as a web business; this helps you and your customers.

Tell me what your website does!

You know exactly what your organisation does and what your website offers its users. This information has probably become second nature to you, but first-time visitors to your site won’t know this. As such, make sure you don’t forget to tell them what you do.

As soon as new site visitors arrive at your website, the first thing they need to know, before anything else, is what you do. You can talk all you like about how great you are, but unless you spell out what you actually do, they won’t even know what you’re so great at! This oh-so-overlooked yet basic piece of information can be communicated to your site visitors in a number of different ways:

Page title

Don’t just use the page title to tell me who you are; tell me what you do too. If your company is called Bloggs Ltd don’t only place the words, ‘Bloggs Ltd’ in the page title as there’s plenty of room for more information. If Bloggs Ltd sells widgets, a good page title might be: ‘Bloggs Ltd – Buy widgets online’.

Note in this example, ‘Buy widgets online’ was used to describe what Bloggs Ltd does, and not ‘Widget seller’. When describing what it is you do be sure to speak the language of your users, and don’t talk from your point of view. From your point of view you sell widgets, but from their point of view they want to buy widgets online, so do bear this in mind when authoring the page title.

The page title is the first thing that appears on screen, and especially on dial-up modems can be the only thing that displays for the first 10 seconds or so. For many web users this is the first piece of content they’ll read on your site.

The page title is also very important for search engines, which place more importance on the page title than any other on-page element. Descriptive page titles are also essential for blind web users utilising screen readers, as it’s the first thing that gets read aloud to them upon arriving at the page.
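The page-title advice above can be sketched in a few lines. This is a hypothetical helper (the function name and the Bloggs Ltd example are illustrative) showing the who-plus-what pattern in the visitor’s own language:

```python
# Compose a descriptive <title>: company name plus what the VISITOR does
# there ("Buy widgets online"), not what the company calls itself
# ("Widget seller").

def page_title(company, visitor_task):
    """Join the company name and the visitor's task into one title string."""
    return f"{company} - {visitor_task}"

title = page_title("Bloggs Ltd", "Buy widgets online")
print(f"<title>{title}</title>")  # <title>Bloggs Ltd - Buy widgets online</title>
```

The same who-plus-what string works for the tagline and main heading discussed next.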

Tagline

A good tagline is one of the most important usability features on any website. A good tagline should be explanatory and not vague, clear and informative, and about four to eight words in length. A tagline is different to a company slogan, in that the former describes what the organisation/website does whereas the latter is designed to evoke a certain feeling or create a brand.

‘Priceless’ and ‘I’m loving it’ are slogans by Mastercard and McDonald’s respectively – they differ from taglines because they don’t describe what the organisation does.

Taglines are so important because no matter on what page site visitors enter your website, they’ll always be able to quickly gain an understanding of what your organisation and website offers. This can be especially true for site visitors coming into internal pages from search engines – by telling these site visitors what you do through the tagline, they may be more likely to explore your site beyond the initial page on which they enter.

Taglines are also good for search engine optimisation, as they appear on every page right at the top of the page, an area on to which search engines place importance.

Main heading

The main heading on the homepage is one of the first pieces of text web users notice, especially on clean, well-laid-out websites. Sticking a ‘Welcome to our website’ heading at the top may seem friendly and welcoming to you, but to task-driven site visitors it doesn’t help in any way, shape or form. A quick summary of what you do and/or what the website offers, in just four or five words, can be highly effective (and very search engine friendly too!).

Opening paragraph

Perhaps the most important place on the homepage to tell your site visitors what you do, the opening paragraph must be short, succinct and straight-to-the-point. Just one sentence is enough to put across this most basic yet fundamental of information.

When writing this opening paragraph, remember to front-load the content (this rule actually applies to every paragraph on the website). Front-loading means putting the conclusion first, followed by the when, what, where and how.

Don’t write a story with a start, middle and conclusion – generally speaking on the web, we scan looking for the information that we’re after so put the conclusion first. This way, site visitors can read the conclusion first, which in this case is what your organisation actually does. If they want to know any more, they can then continue reading or jump to another section of the page. (To see front-loading in action, read any newspaper article.)

Exceptions

So, does every website need to tell users what the organisation does in these four different places? Well, not necessarily. We all know what Mastercard and McDonalds do, so it could definitely be argued that websites for household names need not explicitly say what they do. What these sites should do instead is tell us what the website offers, and this message can (and should) be put across in any of the above four ways – how else will site visitors quickly be able to find this out?

Conclusion

People are going to visit your site who don’t know what you do. Before you can even begin selling to them you must tell them what your organisation and website does. In addition to fulfilling site visitors’ immediate need (finding out what you do) you’ll also be boosting your search engine rankings. If your organisation is a household name, then instead of explaining what you do, it may be wise to tell site visitors what they can do on your website.

Sitemaps 101 – Back To SEO School

Sitemaps are without doubt one of the most often ignored and undervalued aspects of search engine optimization. You’ve probably spent a huge amount of time working on pages of original content, keyword density and getting incoming links but never once spared a thought for a sitemap for your new creation.
    
What is a sitemap?

Put simply, it’s a page (or pages) that lists and links to all the other documents on your site. This is useful on two levels:

1. Your visitor can quickly reference all the documents on your site to find exactly what they’re looking for.

2. Search engine spiders can also quickly find and index every single page of your site in the least amount of time. The SEO benefits of using a sitemap are considerable and should not be ignored.

This is a win-win situation for you, your website visitors and the search engines. Put simply you’re nuts if you’ve not included a sitemap as part of your overall website promotion strategy.

The good news is that it’s never too late to start. You can create a sitemap page today but there are some rules to creating an effective sitemap that you need to follow:

Your sitemap must only be linked to from your homepage and no other page. Why? You want the search engine spiders to find this link directly from your homepage and follow it from there. Your sitemap MUST NOT be linked to from every other page of your site. Also from a Google Pagerank point of view only linking to your sitemap from your homepage can also “funnel” PR quickly to pages all over your site.

If you have a large website of 50 pages or more limit the number of pages listed on your sitemap to a maximum of 30. This is to prevent your sitemap from being misinterpreted as a link farm by the search engines. It also makes the sitemap a lot easier for real human visitors to read through. Limiting the number of pages listed on each sitemap to 30 might mean splitting your sitemap over 5, 10 or 20 pages. This has to be done and the long term benefits are worth it. Bear in mind that if you do create a 20 page sitemap you’ve just created an extra 20 pages of content for your website!

Make absolutely sure that each of your sitemap pages links to the next. If you have 10 sitemap pages in total then each of those needs to link to every other sitemap page. Otherwise both visitors and search engine spiders will find a broken link, lose interest and go away.

Test your sitemap thoroughly. Make sure all the links work. Make sure it’s easy to read and navigate through. Your sitemap is there to assist your visitors, not confuse them.

How should you structure your sitemap? The following tips must be adhered to in order for your site to gain the maximum possible benefit from having a sitemap.

1. The title of each sitemap link should be keyword rich and link directly back to the original page.

2. Include 10 – 20 words of text from the original page of content underneath the relevant sitemap link. This creates more content for search engine spiders, and human visitors can see exactly what each page is about in advance of clicking.

3. Ensure that the look and feel of your sitemap page is consistent with the rest of your site. Use the same basic HTML template you used for every other page of your site.
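The rules above (at most 30 links per sitemap page, every sitemap page linking to all the others) can be sketched as a small generator. This is a minimal illustration, not a full HTML generator; the file names and page list are hypothetical:

```python
# Split a site's page list into sitemap chunks of at most 30 links,
# where each sitemap page also links to every other sitemap page.

def build_sitemaps(pages, per_page=30):
    """Return one dict per sitemap page: its links plus cross-links."""
    chunks = [pages[i:i + per_page] for i in range(0, len(pages), per_page)]
    sitemaps = []
    for n, chunk in enumerate(chunks, start=1):
        others = [f"sitemap{m}.html" for m in range(1, len(chunks) + 1) if m != n]
        sitemaps.append({
            "filename": f"sitemap{n}.html",
            "links": chunk,            # keyword-rich links back to each page
            "other_sitemaps": others,  # every sitemap links to all the others
        })
    return sitemaps

site_pages = [f"article{i}.html" for i in range(1, 76)]  # a 75-page site
maps = build_sitemaps(site_pages)
print(len(maps), [len(m["links"]) for m in maps])  # 3 [30, 30, 15]
```

A 75-page site ends up with three sitemap pages of 30, 30, and 15 links, each cross-linked to the other two, matching the splitting advice above.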

So now you understand the importance of building a sitemap for your website. There is work involved, but the long-term benefits for your website far outweigh any effort you have to make right now.

Content Management Systems (CMS)

There is a buzz in the online community about a technology that empowers the average computer user with the ability to create and maintain their very own web presence. In the past, individuals who took interest in having and operating their own websites were burdened with the task of learning HTML, DHTML, and other web-based technologies such as JavaScript and CSS. The only alternative to this was, unfortunately, to pocket the expenses and costs required to pay a web developer to build and maintain it for them.

This dilemma is one of the primary reasons that small to medium-sized businesses did not begin to emerge on the worldwide web for several years following the corporate dot-com rush. Many business owners were terrified at the thought of having to learn complicated programming languages and server languages in order to create ‘do it yourself’ websites. Fortunately, that was then and this is now!

Due to the evolving demand for businesses to have a presence on the web, a new application has emerged in order to help business owners and employees create and sustain a professional-looking site without the worries of coding and technical applications.

What do we call this innovative technology that lessens the gap between the IT professionals and ourselves? Content Management Systems (otherwise known as CMS). Simply put, Content Management Systems are applications that implement easy-to-use web-based tools in combination with a database and web templates in order to effortlessly construct and update a website’s content. Hence the name, Content Management System. Content Management System applications are ideal for businesses and sites that require ongoing updates and additions.

The simplicity lies in the fact that, through web-based tools, Content Management Systems completely separate the updating and creation of the site’s actual content from the site’s design and layout. This allows a person with no knowledge of HTML to alter or add content to the site’s pages without making structural changes to the site’s design.

What are a few of the benefits associated with using a Content Management System over traditional web-design?

Well, to begin with, Content Management Systems are developed in a way so that even a novice user has the power to maintain and update the site. The content input/update areas are designed to offer a very user-friendly interface, appearing much like a common word processing application that so many of us are familiar with. This makes it possible for any person or staff involved in document creation to easily and efficiently maintain the content on the company’s website.

This brings us to yet another factor that makes Content Management Systems so helpful. Because more people have access to updating and maintaining the website, the information is generally much more accurate and recent. Updating older pages manually in order to keep them relevant to changing information can not only be frustrating, but also repetitive and time-consuming. In today’s competitive online market, up-to-date information is crucial to your business’s credibility and success.

A Content Management System un-complicates this task by making global (site-wide) changes through the alteration of a single file. Because all of the site’s information is stored in a central location, when that information is altered, every page on the site which contains that specific info is updated immediately. Just like that! This allows editors or the website’s author to write, edit, and publish information to the website without having to submit all of the material to the Webmaster. No inconsistencies, less proofreading, less work altogether.
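The central-store idea can be illustrated with a toy sketch. This is not how any particular CMS is implemented; the template, the `content` store, and the phone number are all hypothetical, but the sketch shows how one edit in a shared store flows into every rendered page:

```python
# Toy CMS sketch: content lives in one central store, separate from the
# template, so editing a single shared value updates every page that
# uses it on the next render.

TEMPLATE = "<h1>{title}</h1><p>{body}</p><footer>Call us: {phone}</footer>"

content = {"phone": "555-0100"}  # shared, site-wide value (hypothetical)

def render_page(title, body):
    """Merge page-specific content and the shared store into the template."""
    return TEMPLATE.format(title=title, body=body, phone=content["phone"])

before = render_page("About", "We sell widgets.")
content["phone"] = "555-0199"  # one edit in the central store...
after = render_page("About", "We sell widgets.")
# ...and every page rendered from now on carries the new value
```

A real CMS keeps the store in a database rather than a dict, but the separation of content from design works the same way.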

Another tremendous advantage of using a Content Management System is link maintenance. Many times, even Webmasters make the mistake of deleting or moving content that is linked to from multiple areas within a site. When this occurs, the remaining links become broken, because the file to which they refer has been deleted or moved and the reference is invalid. When a person using a Content Management System deletes or updates the location of a file, each of the links to that file is automatically updated, eliminating any risk of orphaned links and that nasty little ‘page not found’ experience for the site’s visitors.

Overall, it is quite obvious that the implementation of a Content Management System is ideal for a wide variety of websites. From news sites to corporate sites— all the way to small business sites and personal web pages, using a Content Management System combines the convenience of ease of use along with the professionalism of clean web design and up-to-date material.