Unix cheat sheet

Telnet

Telnet allows you to login, albeit insecurely, to any remote machine running a telnet server. Telnet will allow you to open a shell and use simple command line unix tools on the remote machine.

>telnet fred

In this example, imagine you are telnetting to a machine named “fred”. Sometimes you may have to use the fully qualified name of the machine (e.g., Fred.arizona.edu) or use Fred’s IP address: 128.196.99.1. You will need to login to Fred and provide your password.


FTP

FTP is the file transfer protocol. Like telnet, it is an old, insecure protocol. It is being replaced by scp, but is still in use on some machines. FTP can operate in text or binary mode, with the prompt on or off. It can get files from the remote machine or put files on the remote machine, either singly or in large batches. By default, ftp operates in text mode with the prompt on; we usually alter these defaults at the beginning of a new ftp session. FTP will allow you to cd between directories, but it may have trouble with listing, copying, moving and removing files and directories. Telnet is better suited for these general unix commands.

To start an ftp connection from a unix machine:

>ftp Fred

As with telnet above, you may sometimes need to use Fred’s fully qualified domain name or IP address and you will need to login.

ftp>bin (this will tell ftp to transfer the data in binary mode instead of text mode. You will typically be transferring image data, so you want to be in binary mode. In fact, it never hurts to be in binary mode, even if you are transferring text files.)

ftp>prompt (this will tell ftp not to ask you about transferring each individual file. If you are about to move dozens of files, you will want to type “prompt”).
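
Putting these pieces together, the start of a typical session might look something like this (the machine name and directory are placeholders; substitute your own):

>ftp Fred
(login with your username and password when prompted)
ftp>bin
ftp>prompt
ftp>cd /data/tmp

The actual transfer commands (get, put, mget, mput) are covered below.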

Local and Remote Machines (Understanding get and put)

In the simplest scenario, I sit down at one machine (e.g., “Mary”) and I ftp to another machine (e.g., “Fred”):

>ftp Fred

In this case, Mary is my local machine and Fred is the remote machine.

However, it can be much more complicated. Suppose I’m sitting at home at my PC and I telnet to Mary. After logging in to Mary, I ftp Fred. Again, Mary is the local machine (the machine where I started the ftp session) and Fred is the remote machine.

Let’s make it even worse. I telnet from home to Mary. Then I telnet from Mary to Fred, and then ftp from Fred to Mary. Now Fred is the local machine and Mary is the remote machine.

To understand when to use “get” or “mget” versus “put” or “mput”, you must understand these abstract concepts of the remote and local machines. However, it does get confusing, so if you try “put” and get back a message like “no such file or directory”, then try “get” instead.

Let’s go back to the simplest case: I ftp from Mary to Fred. Mary is my local machine.

  • I should use “put” or “mput” to transfer files from Mary (local) to Fred (remote).
  • I should use “get” or “mget” to transfer files from Fred (remote) to Mary (local).

You should start your ftp session in the directory on the local machine where files to transfer reside or where you intend to place them.
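
If you forget to do this, most ftp clients also let you change the local directory from inside the session with “lcd” (local cd), and you can run a local shell command with “!” to check where you are, e.g.:

ftp>lcd /home/mary/data
ftp>!pwd

(These commands are common but not universal; see “man ftp” on your system. The /home/mary/data path is just an example.)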

You can use the “cd” command to move around on the remote machine once you have ftp’d there:

ftp>cd /data/tmp

Examples

ftp>put P01000

In this example “put” is used to copy a single specified file from the local machine (specifically, from the directory you started the ftp session in) to the remote machine (the directory you are in on the remote machine).

ftp>mput P*

“mput” [multiple puts] tells ftp to copy all files that meet the criterion, in this case, all files beginning with a capital P, from the current directory on the local machine to the current directory on the remote machine.

ftp>get bird.jpg

Copy a single file “bird.jpg” from the current directory on the remote machine to the current directory on the local machine.

ftp>mget *.jpg

Copy all “*.jpg” files from the current directory on the remote machine to the current directory on the local machine.

ftp>bye

Exits the ftp session

>man ftp

Tells you more about the options and flags available with ftp.


SSH

ssh = secure shell (secure telnet)

To use these programs, they must be installed on both communicating machines. For a machine to receive an ssh or scp request (i.e., for it to answer when you request a connection to it) it must be running an ssh server (sshd). Typically only unix machines will run ssh servers. If you have trouble connecting to a machine with ssh, you should check to see if it is running an ssh server (or daemon):

>which sshd

>ps -ef | grep sshd
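
If sshd is running, the second command should show an sshd process in its output (often /usr/sbin/sshd, though the path varies by system). On many systems you can also check whether anything is listening on the standard ssh port (22):

>netstat -an | grep :22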

Same user on local and remote machines

The commands you are most likely to need:

ssh machinename (where the name of the machine or IP address is substituted for the term “machinename”), e.g.,

>ssh buddy

ssh assumes you want to be the same user on the machine you are sshing TO as you are on the machine you are coming FROM. This can be annoying.

Different user on local and remote machines

If you want to login as a different user, use the following scheme:

ssh -l username machinename

e.g.

>ssh -l joe buddy

or

>ssh -l joe buddy.psych.arizona.edu

(-l = “login as”)

You will be asked for the password.
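
Most ssh clients also accept the username and machine name written together, which does the same thing as -l:

>ssh joe@buddy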


SCP

scp = secure copy (secure binary mode ftp)

Unix: You can use scp at the command line whether or not you have used ssh to connect to another machine.

Windows PC: If you are using the university ssh and scp on a Windows PC, then you have a separate scp program as well as being able to use scp at the command line once you have connected with ssh.

scp moves files to or from your current location. It always uses binary mode. You can work as either the current user on the starting machine or as a different user. It always asks for the user’s password. Here are some examples in which I move the file bird.jpg from one place to another. The first three examples assume you are the same user on the local and remote machines. The last example shows you how to login as someone else on the remote machine:

Same user on local and remote machines

Put a file on a remote machine:

>scp bird.jpg buddy:/data/joe/

Get a file from a remote machine:

>scp buddy:/data/joe/bird.jpg . (the “.” means “here”)

>scp buddy:/data/joe/bird.jpg /home/fred/

(does the same thing, but just substitutes the path for “.”)

You can use scp -r to copy an entire directory at a time:

>scp -r e12345 buddy:/data/joe
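
The -r flag works in the other direction too. For example, to copy the whole e12345 directory from buddy back to the current directory on the local machine (assuming the same names as above):

>scp -r buddy:/data/joe/e12345 .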

Different user on local and remote machines

Log in as someone other than who you are locally, then copy a file from your current directory to a directory (/data/fred/) on the remote machine:

>scp bird.jpg joe@buddy:/data/fred

(you will be asked for joe’s password)

Copy a file (bird.jpg) from a directory (/data/fred/) on a remote machine (buddy) where you will login as someone else (joe) to here (.)

>scp joe@buddy:/data/fred/bird.jpg .

Htaccess Tips and Tricks

1) Stopping hackers
2) Stopping site snagging (offline viewing)
3) Stopping Hotlinking
4) Multiple Domain Names: Shared Members Areas

Section 1) Stopping hackers

The most common way of protecting your members only area is with, as I’m sure you know, a file named .htaccess sitting in your server’s members folder. This file is used by your server to pop up a little box and force people to enter a username and password. It then checks that against a password file located on your server to see if the info is valid. If it is, access is given.

There are, however, many lines that you can add to your .htaccess file that most webmasters don’t really know about. I’ll go through them one by one as well as show you completed .htaccess files that you can start using immediately.

NOTE: You only need to copy the configuration text shown below. It is also important that you use a very basic text editor to save the file. Use Notepad or NoteTab to do it. DO NOT use MS Word! The file will not save correctly! You should also realize that an .htaccess file is just a plain text file with a funny name. The complete file name really is .htaccess, period in front and all.
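
Because the name starts with a period, the file is hidden from a plain directory listing on unix servers. If you upload it and it seems to vanish, it is probably still there; from a shell you can list hidden files with:

>ls -a

(The same goes for your .htpasswd file.)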

Here is the basic .htaccess file that most people use:

AuthUserFile /server/path/to/your/password/file/.htpasswd
AuthGroupFile /dev/null
AuthName "Members Area"
AuthType Basic

<limit GET PUT POST>
require valid-user
</limit>

This file, when placed in your members only folder, will protect all of the subfolders under it. There are, however, some holes here. Once inside the members area, people can still poke around for things you may not want them to see by being creative and typing in URLs. Most of the time this is no big deal. However, they really don’t need to be poking around in there.

Another problem is that some password security programs have to be accessed directly or in a very specific way to work. An older version of the security program I used required a file called index.cgi to be placed in the member’s only folder. When you linked to http://princessmandy.com/members/ it would do two things. First, the .htaccess file would check the username and password to see if they were valid. Second, if approved, it would run my security program to see how many people have used that username and password. If that checked out, they would be sent to the opening page of my member’s area which was actually http://princessmandy.com/members/welcome.htm.

That worked fine as long as no one tried to go directly to the welcome.htm page. Guess what, hackers are smart. By posting a simple link on a password trading site, they could bypass the security program and gain access in one easy step. The link would look like this:

http://username:password@princessmandy.com/members/welcome.htm

Look familiar? If you’ve ever been password traded (and you will) it should look familiar. After that I learned of some code that will stop this and force everyone to use one page to gain access to the member’s area.

AuthUserFile /server/path/to/your/password/file/.htpasswd
AuthGroupFile /dev/null
AuthName "Members Area"
AuthType Basic

<limit GET PUT POST>
require valid-user
</limit>

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*yourdomain.com/ [NC]
RewriteRule /* http://www.yourdomain.com/login.htm [L,R]

The new section activates the RewriteEngine feature of your .htaccess file. This will now only allow access to the member’s area of your site through a link on your page. If they don’t use an actual link on your site they cannot get in. Any URL that you manually type into the address bar of your browser will show up in your log file as having no referrer and will not pass. The only way to satisfy the RewriteCond of this updated .htaccess file is to use a link on your site.

Using this example, you will need a new little web page named login.htm in your free area. On that page you will need a link to your member’s area, whatever link will allow your security program to work right.

The main thing I like about using this is that it keeps people from messing around inside the members area. Since I update with new pics every week, I can upload several sets at a time to the server and have them waiting. I don’t have to worry about anyone finding them before I link to them.

Now remember, if you don’t have any software in place to monitor how many times your usernames and passwords are being used, this won’t help you at all. This method won’t stop shared usernames and passwords from being used. It is only here to channel people into your password sharing software. I personally recommend using Password Sentry. It’s a one time charge and they give you lifetime upgrades and support. It’s also not very expensive. I haven’t found any program out there that I liked any better, at any price. You can find them at http://www.monster-submit.com/sentry/

I actually use their newest version which can stop people from hammering your site with username and password combinations until they get one that works. I was getting at least one person a day running one of those programs on my site trying to get in. I still use an .htaccess file in my members area, but it no longer checks for a username and password. It looks for a temporary cookie that is placed on their system if they are approved by my security program. It’s just as secure but blocks those password hammering programs completely.

——————————————————————————–

2) Stopping Site Snagging

This one pisses me off. There are many programs out there designed for “offline viewing” of web sites. These programs allow a person to download everything on your site to their computer. That works wonderfully in the free area; however, if they have a username and password to your site, they can also download your entire member’s area.

If you don’t have any software protecting you from password traders, this one could be devastating. Not only could everyone in the world get into your members area for free, they could download everything in there in a hurry. If you have 200 MB of stuff in your site and 1000 people get in for free and decide to use one of these programs, you’re looking at 200 gigabytes of transfer in as little as one day. Can you afford that? Those numbers are kind, too. Many of you have much more than 200 MB of stuff. I’ve also been traded in the past and was receiving 4500 people per hour into the members area for free. That could put you out of business in a hurry.

If you don’t think that these programs are a problem check your stats. Many stats programs will tell you the different web browsers that are visiting your site. I have programs like Teleport Pro and Offline Explorer in my top 10 web browsers every single day.

Since we have to pay for bandwidth, which can get expensive as your site grows, this can turn into a major problem. I was surprised at how much bandwidth I saved after adding these lines to an .htaccess file.

Here’s the best part. You can place this .htaccess file in your root public directory. Put it in the same folder as your site’s opening index file and it will protect your entire site.

You’ll notice one major difference about this file. It doesn’t require usernames and passwords to get in. Those lines have simply been removed from the file. It will also not have any effect on the .htaccess file in your member’s folder. That one will check passwords, this one will stop people from snagging your site.

There are actually 3 sections to the file below.

The first section allows you to block specific users’ IP addresses. I have one blocked here: a user who tried hammering my site with around 20,000 username and password combos. This part is optional since most people get a new IP each time they log on. However, users on a cable modem will keep the same IP all of the time, like the one in my example. If I were you I would definitely leave that guy in there.

The second section relates to 404 (page not found) errors. This works well with the way many search engines work. I don’t know how many of them are still linking to pages on my site that no longer exist. If someone clicks on a search engine link that is no longer any good, they would normally just get a blank error page. The errordocument line below forwards those people to another page instead. I forward them to my opening page. That way, if they come to my site using a link that is no longer valid, they end up at my opening page never realizing that the link was bad.

The third section stops the programs that will try and download your site. Since I’m finding more all the time, the list keeps growing. If you discover more, just add them to it. If the program name is actually two words, Teleport Pro for example, you only need to include one word to block it. Notice below that I have a line including Teleport, but not Teleport Pro. I’ve downloaded the program and tested it. This method works perfectly.
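
For example, to block a hypothetical program that identifies itself as NewSnagger, you would add a line like this above the final condition. Note that every condition except the last one (the FileHound line below) carries the [OR] flag:

RewriteCond %{HTTP_USER_AGENT} ^.*NewSnagger.*$ [OR]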

The very last line, the RewriteRule, is where violators will be sent. I have personally chosen a site at geocities that features sewing patterns for gay men’s swimwear. :)

<Limit GET>
order allow,deny
deny from 24.128.16.113
allow from all
</Limit>

errordocument 404 http://www.princessmandy.com/index.htm

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*WebZIP.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Iria.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Stripper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Offline.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Copier.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Crawler.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Snagger.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Teleport.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Reaper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Wget.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Grabber.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Sucker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Downloader.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Siphon.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Collector.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mag-Net.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Widow.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Pockey.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*DA.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Snake.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*BackWeb.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*gotit.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Vacuum.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*SmartDownload.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Pump.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*HMView.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Ninja.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*HTTrack.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*JOC.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*likse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Memo.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*pcBrowser.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*SuperBot.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*leech.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mirror.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Recorder.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*GrabNet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Likse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Navroad.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*attach.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Magnet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Surfbot.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Bandit.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Ants.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Buddy.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Whacker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*DISCo\ Pump.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Drip.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*EirGrabber.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*ExtractorPro.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*EyeNetIE.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FlashGet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*GetRight.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Gets.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Go!Zilla.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Go-Ahead-Got-It.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Grafula.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*IBrowse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*InterGET.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Internet\ Ninja.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*JetCar.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*JustView.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*MIDown\ tool.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mister\ PiX.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*NearSite.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*NetSpider.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Offline\ Explorer.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*PageGrabber.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Papa\ Foto.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Pockey.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*ReGet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Slurp.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*SpaceBison.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*SuperHTTP.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Teleport.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebAuto.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Webcam\ Watcher.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebCopier.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebFetch.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebReaper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FreeLoader.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Clint's\ Webcam.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebCam\ Spy.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*CamEVU.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*iCamMaster.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Cam\ Chaser\ Pro.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FlashIT.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebSauger.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebStripper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebWhacker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebZIP.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Web\ Image\ Collector.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Web\ Sucker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Webster.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Wget.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*eCatch.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*ia_archiver.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*lftp.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*tAkeOut.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FileHound.*$
RewriteRule /* http://www.geocities.com/WestHollywood/Heights/3204/1home.html [L,R]

If you decide to redirect them somewhere else be sure to leave the “[L,R]” at the end of the line. It’s rather important.
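
For example, to send violators to a page on your own site instead (a hypothetical blocked.htm), the last line would become:

RewriteRule /* http://www.yourdomain.com/blocked.htm [L,R]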

Remember to always check your site immediately after uploading a new .htaccess file to your server. If there are any errors in your file, your site will most likely not load at all. In that case, quickly delete the file off of the server until you figure out what went wrong!
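
One quick way to check from a shell, if you have curl installed, is to request your front page and make sure you get a normal response (200 OK) rather than a 500 Internal Server Error:

>curl -I http://www.yourdomain.com/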

——————————————————————————–

3) Stopping Hotlinking

I think I see some of you smiling already. Yes, you can use an .htaccess file to stop people from hotlinking images off of your site. I recently discovered several of my pictures being posted on a messageboard. They had a little message and then my picture would pop up in the message. It was loading directly off of my server with absolutely nothing pointing back to me. I was pissed.

The .htaccess file to prevent this is very similar to some of the ones above. It’s just much shorter since it only performs one function, to stop hotlinking. It does this by checking the referrer. In other words, where the hit is coming from.

I have actually moved all of my images, graphics, games, you name it into a subfolder in the free area. I then just place this .htaccess file into that folder.

I DON’T recommend adding these lines into the .htaccess file above that protects your entire site. Why? Well, when you sign up on someone else’s friends page you have to enter a URL of your ID picture. If you block everything, then all of your ID pictures on all of those friends pages you signed up for will not load. Your ID picture will be a very sexy little red x.

You can stop people from hotlinking your ID pictures if you want, just think it through first. I have my banner farm protected to stop new sign ups from hotlinking. However, I still have a few ID pictures in unprotected areas too. That way I can sign up for new friends and links pages. You also don’t want to block everything if you purposely post pictures at picpost pages. If you block your entire site, none of those picposts will load.

Similar to some of the above files, this one will only allow the picture to load if the referring page comes from princessmandy.com/. Do not include the www. in here; that’s what all of the crap in front of princessmandy.com/ is for (it matches www. and any other subdomain). The referrer can end with anything it likes, as long as it has princessmandy.com/ in it.

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*princessmandy.com/ [NC]
RewriteRule /* http://www.princessmandy.com [L,R]
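
One optional tweak: some browsers, proxies, and privacy tools send no referrer at all, and the file above will bounce those visitors as well. If you would rather let empty referrers through, add one extra condition so the redirect only fires when a referrer is present and it is not yours:

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*princessmandy.com/ [NC]
RewriteRule /* http://www.princessmandy.com [L,R]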

——————————————————————————–

Multiple Domain Names: Shared Members Areas

Here’s a fun one. Many of you may have several web sites but only one credit card account and one password file. How do you get everyone to have access to all of your site’s members areas but only use one account? Easy. Use .htaccess files. This can get a little bit tricky so pay attention.

Let’s say you have three sites: monkeyone.com, monkeytwo.com, and monkeythree.com

Let’s also say that you want anyone joining one site to have access to all three.

Pick one site to house the main entry page. Just like in the above examples, create a page called http://www.monkeyone.com/login.htm in the free area of that site. You can call it whatever you want. Use that page as the entry page for all of your web sites. Just put a link on there saying “click here to enter the member’s area” or something.

Now everywhere on monkeytwo.com and monkeythree.com that says “member’s entrance” should point to http://www.monkeyone.com/login.htm. Understand? Only one entrance page and only one password file. Everyone must enter from the same place.

Now, you’ll need to add the following lines to your .htaccess file in the member’s only folder of monkeyone.com.

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeyone.com/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeytwo.com/members/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeythree.com/members/ [NC]
RewriteRule /* http://www.monkeyone.com/login.htm [L,R]

This will allow entry only from either your main site’s entry page, or from the member’s area of your other sites. This part is tricky to think about but very important.

Your new monkeyone.com member’s only folder .htaccess file will most likely look like this:
AuthUserFile /server/path/to/your/password/file/.htpasswd
AuthGroupFile /dev/null
AuthName "Members Area"
AuthType Basic

<limit GET PUT POST>
require valid-user
</limit>

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeyone.com/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeytwo.com/members/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeythree.com/members/ [NC]
RewriteRule /* http://www.monkeyone.com/login.htm [L,R]

Now here’s the fun part. The members areas of monkeytwo.com and monkeythree.com will no longer check for a valid username and password. They will only check out where the person is coming from. If they aren’t coming from one of three places they will be routed to the login.htm page on monkeyone.com.

This .htaccess file is very small and should be placed in the members only folder at monkeytwo.com and monkeythree.com.

You must include lines for all of your sites in every .htaccess file.

The .htaccess files at monkeytwo.com/members and monkeythree.com/members should look like this:

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeyone.com/members/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeytwo.com/members/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*monkeythree.com/members/ [NC]
RewriteRule /* http://www.monkeyone.com/login.htm [L,R]

That’s it. They’re very short files but they will do the job. These new .htaccess files at monkeytwo and monkeythree will only allow people access if they’re coming from the members only area of one of the other sites. They don’t need to check usernames and passwords too.

I made a new page in my members area that links to all three of my sites. Once they are validated at princessmandy.com they end up on this one page. It’s sort of a “Welcome inside. What site do you want to visit?” type of thing. It works very, very well and allows me to use one password file for as many sites as I want.

This method can also be used to allow two very different sites to share a members area. Each site can be owned and operated by two different people using two different login pages, generating their own revenue, but sharing a members area.

Just allow access from either your own site, or the members only folder of the other site.
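
As a minimal sketch of that setup, using two hypothetical sites siteA.com and siteB.com: each site keeps its own login page and password file in front of its own members folder, and each members folder also accepts visitors referred from the other site’s members folder. The rewrite lines in siteA.com’s members .htaccess would look like this (siteB.com’s file mirrors it with the domains swapped):

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*siteA.com/ [NC]
RewriteCond %{HTTP_REFERER} !^http://([a-z0-9-]+\.)*siteB.com/members/ [NC]
RewriteRule /* http://www.siteA.com/login.htm [L,R]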

The history of Linux


Linux
Mascot: Tux, the penguin
OS family: Unix-like
Latest stable release: 2.6.23.12 (Linux kernel) / 18 December 2007
Kernel type: Monolithic kernel
License: GNU General Public License and others
Working state: Current
Linux (pronunciation: IPA: /ˈlɪnəks/, lin-uks) is a Unix-like computer operating system. Linux is one of the most prominent examples of free software and open source development; typically all underlying source code can be freely modified, used, and redistributed by anyone.

The Linux kernel was first released to the public on 17 September 1991, for the Intel x86 PC architecture. The kernel was augmented with system utilities and libraries from the GNU project to create a usable operating system, which led to an alternative term, GNU/Linux. Linux is packaged for different uses in Linux distributions, which contain the sometimes modified kernel along with a variety of other software packages tailored to different requirements.

Predominantly known for its use in servers, Linux is supported by corporations such as Dell, Hewlett-Packard, IBM, Novell, Oracle Corporation, Red Hat, and Sun Microsystems. It is used as an operating system for a wide variety of computer hardware, including desktop computers, supercomputers, video game systems such as the PlayStation 2 and 3, several arcade games, and embedded devices such as mobile phones and routers.

In 1992, Linus Torvalds explained that he pronounces Linux as /ˈlɪnəks/, though other variations are common.

The Unix operating system was conceived and implemented in the 1960s and first released in 1970. Its wide availability and portability meant that it was widely adopted, copied and modified by academic institutions and businesses, with its design being influential on authors of other systems.

The GNU Project, started in 1984, had the goal of creating a “complete Unix-compatible software system” made entirely of free software. In 1985, Richard Stallman created the Free Software Foundation and developed the GNU General Public License (GNU GPL), in order to spread software freely. Many of the programs required in an OS (such as libraries, compilers, text editors, a Unix shell, and a windowing system) were completed by the early 1990s, although low level elements such as device drivers, daemons, and the kernel were stalled and incomplete. Linus Torvalds has said that if the GNU kernel had been available at the time (1991), he would not have decided to write his own.

MINIX, a Unix-like system intended for academic use, was released by Andrew S. Tanenbaum in 1987. While source code for the system was available, modification and redistribution were restricted. In addition, MINIX’s 16-bit design was not well adapted to the 32-bit design of the increasingly cheap and popular Intel 386 architecture for personal computers.

In 1991, Linus Torvalds began to work on a non-commercial replacement for MINIX while he was attending the University of Helsinki. This eventually became the Linux kernel.

Linux was dependent on the MINIX userspace at first. With code from the GNU system freely available, it was advantageous if this could be used with the fledgling OS. Code licensed under the GNU GPL can be used in other projects, so long as they also are released under the same or a compatible license. In order to make the Linux kernel compatible with the components from the GNU Project, Torvalds initiated a switch from his original license (which prohibited commercial redistribution) to the GNU GPL. Linux and GNU developers worked to integrate GNU components with Linux to make a fully functional and free operating system.

Commercial and popular uptake

Today Linux is used in numerous domains, from embedded systems to supercomputers, and has secured a place in server installations with the popular LAMP application stack. Torvalds continues to direct the development of the kernel. Stallman heads the Free Software Foundation, which in turn supports the GNU components. Finally, individuals and corporations develop third-party non-GNU components. These third-party components comprise a vast body of work and may include both kernel modules and user applications and libraries. Linux vendors and communities combine and distribute the kernel, GNU components, and non-GNU components, with additional package management software in the form of Linux distributions.

The primary difference between Linux and many other popular contemporary operating systems is that the Linux kernel and other components are free and open source software. Linux is not the only such operating system, although it is the best-known and most widely used. Some free and open source software licences are based on the principle of copyleft, a kind of reciprocity: any work derived from a copyleft piece of software must also be copyleft itself. The most common free software license, the GNU GPL, is used for the Linux kernel and many of the components from the GNU project.

As an operating system underdog competing with mainstream operating systems, Linux cannot rely on a monopoly advantage; in order for Linux to be convenient for users, Linux aims for interoperability with other operating systems and established computing standards. Linux systems adhere to POSIX, SUS, ISO, and ANSI standards where possible, although to date only one Linux distribution has been POSIX.1 certified, Linux-FT.

Free software projects, although developed in a collaborative fashion, are often produced independently of each other. However, given that the software licenses explicitly permit redistribution, this provides a basis for larger scale projects that collect the software produced by stand-alone projects and make it available all at once in the form of a Linux distribution.

A Linux distribution, commonly called a “distro”, is a project that manages a remote collection of Linux-based software, and facilitates installation of a Linux operating system. Distributions are maintained by individuals, loose-knit teams, volunteer organizations, and commercial entities. They include system software and application software in the form of packages, and distribution-specific software for initial system installation and configuration as well as later package upgrades and installs. A distribution is responsible for the default configuration of installed Linux systems, system security, and more generally integration of the different software packages into a coherent whole.

Linux is largely driven by its developer and user communities. Some vendors develop and fund their distributions on a volunteer basis, Debian being a well-known example. Others maintain a community version of their commercial distributions, as Red Hat does with Fedora.

In many cities and regions, local associations known as Linux Users Groups (LUGs) seek to promote Linux and by extension free software. They hold meetings and provide free demonstrations, training, technical support, and operating system installation to new users. There are also many Internet communities that seek to provide support to Linux users and developers. Most distributions and open source projects have IRC chatrooms or newsgroups. Online forums are another means for support, with notable examples being LinuxQuestions.org and the Gentoo forums. Linux distributions host mailing lists; commonly there will be a specific topic such as usage or development for a given list.

There are several technology websites with a Linux focus. Linux Weekly News is a weekly digest of Linux-related news; the Linux Journal is an online magazine of Linux articles published monthly; Slashdot is a technology-related news website with many stories on Linux and open source software; Groklaw has written in depth about Linux-related legal proceedings; and there are many articles relevant to the Linux kernel and its relationship with the GNU on the project’s website.

Although Linux is generally available free of charge, several large corporations have established business models that involve selling, supporting, and contributing to Linux and free software. These include Dell, IBM, HP, Sun Microsystems, Novell, and Red Hat. The free software licenses on which Linux is based explicitly accommodate and encourage commercialization; the relationship between Linux as a whole and individual vendors may be seen as symbiotic. One common business model of commercial suppliers is charging for support, especially for business users. A number of companies also offer a specialized business version of their distribution, which adds proprietary support packages and tools to administer higher numbers of installations or to simplify administrative tasks. Another business model is to give away the software in order to sell hardware.

Programming on Linux

Most Linux distributions support dozens of programming languages. The most common collection of utilities for building both Linux applications and operating system programs is found within the GNU toolchain, which includes the GNU Compiler Collection (GCC) and the GNU build system. Amongst others, GCC provides compilers for C, C++, Java, Ada and Fortran. The Linux kernel itself is written to be compiled with GCC.
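
As a minimal illustration of the toolchain, compiling and running a single-file C program with GCC from the shell typically looks like this (the file and program names are placeholders):

>gcc -o hello hello.c
>./hello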

Most also include support for Perl, Ruby, Python and other dynamic languages. Examples of languages that are less common, but still well-supported, are C# via the Mono project, and Scheme. A number of Java Virtual Machines and development kits run on Linux, including the original Sun Microsystems JVM (HotSpot), and IBM’s J2SE RE, as well as many open-source projects like Kaffe. The two main frameworks for developing graphical applications are those of GNOME and KDE. These projects are based on the GTK+ and Qt widget toolkits, respectively, which can also be used independently of the larger framework. Both support a wide variety of languages. There are a number of Integrated development environments available including Anjuta, Code::Blocks, Eclipse, KDevelop, MonoDevelop, NetBeans, and Omnis Studio while the traditional editors Vim and Emacs remain popular.

Although free and open source compilers and tools are widely used under Linux, there are also proprietary solutions available from a range of companies, including the Intel C++ Compiler, PathScale, Micro Focus COBOL, Franz Inc and the Portland Group.

Design

Linux is a modular Unix-like operating system. It derives much of its basic design from principles established in Unix during the 1970s and 1980s. Linux uses a monolithic kernel, the Linux kernel, which handles process control, networking, and peripheral and file system access. Device drivers are integrated directly with the kernel.

Much of Linux’s higher-level functionality is provided by separate projects which interface with the kernel. The GNU userland is an important part of most Linux systems, providing the shell and Unix tools which carry out many basic operating system tasks. Atop these tools graphical user interfaces can be placed, usually running via the X Window System.

User interface

See also: User interface

Linux can be controlled through one or more of a text-based command line interface (CLI), a graphical user interface (GUI), which is usually the default for desktops, or controls on the device itself, which is common on embedded machines.

On desktop machines, KDE, GNOME and Xfce are the most popular user interfaces, though a variety of other user interfaces exist. Most popular user interfaces run on top of the X Window System (X), which provides network transparency, enabling graphical applications running on one machine to be displayed and controlled from another.

Other GUIs include X window managers such as FVWM, Enlightenment and Window Maker. The window manager provides a means to control the placement and appearance of individual application windows, and interacts with the X window system.

As with most platforms there are a number of toolkits. These tend to be themed similarly in order to maintain desktop continuity. For example, although Evolution is based on GTK, Firefox is based on XUL, OpenOffice.org is based on its own toolkit and Azureus is a Java app, each uses the same GTK theme and is similar in appearance.

Linux systems usually provide a CLI of some sort through a shell, which is the traditional way of interacting with Unix systems. Linux distributions specialized for servers may use the CLI as their only interface. “Headless” systems, which run without even a monitor, can be controlled by the command line via a protocol such as SSH or telnet.

Most low-level Linux components, including the GNU Userland, use the CLI exclusively. The CLI is particularly suited for automation of repetitive or delayed tasks, and provides very simple inter-process communication. Graphical terminal emulator programs are often used to access the CLI from a Linux desktop.

Uses

As well as those designed for general purpose use on desktops and servers, distributions may be specialized for different purposes including: computer architecture support, embedded systems, stability, security, localization to a specific region or language, targeting of specific user groups, support for real-time applications, or commitment to a given desktop environment. Furthermore, some distributions deliberately include only free software. Currently, over three hundred distributions are actively developed, with about a dozen distributions being most popular for general-purpose use.

Linux is a widely ported operating system. While the Linux kernel was originally designed only for Intel 80386 microprocessors, it now runs on a more diverse range of computer architectures than any other operating system—from the hand-held ARM-based iPAQ to the mainframe IBM System z9, in devices ranging from supercomputers to mobile phones. Specialized distributions exist for less mainstream architectures. The ELKS kernel fork can run on Intel 8086 or Intel 80286 16-bit microprocessors, while the µClinux kernel may run on systems without a memory management unit. The kernel also runs on architectures that were only ever intended to use a manufacturer-created operating system, such as Macintosh computers, PDAs, video game consoles, portable music players, and mobile phones.

Although in specialized application domains such as desktop publishing and professional audio there may be a lack of commercial quality software, users migrating from Mac OS X and Windows can find equivalent applications for most tasks.

Many free software titles that are popular on Windows are also available, such as Pidgin, Mozilla Firefox, OpenOffice.org, and GIMP, amongst others. A growing amount of proprietary desktop software is also supported under Linux, examples being Adobe Flash Player, Acrobat Reader, Matlab, Nero Burning ROM, Opera, RealPlayer, and Skype. In the field of animation and visual effects, most high-end software, such as Autodesk Maya, Softimage XSI and Apple Shake, is available for Linux, Windows and/or Mac OS X. Additionally, CrossOver is a commercial solution based on the open source Wine project that supports running Windows versions of Microsoft Office and Photoshop.

Linux’s open nature offers the ability for distributed teams to localize Linux distributions for use in locales where localizing proprietary systems would not be cost-effective. For example, the Sinhalese language version of the Knoppix distribution was available for a long time before Microsoft began translating Windows XP into Sinhalese. In this case, the Lanka Linux User Group played a major part in developing the localized system by combining the knowledge of university professors, linguists and local developers.

Servers and supercomputers

Historically, Linux has mainly been used as a server operating system, and has risen to prominence in that area; Netcraft reported in September 2006 that eight of the ten most reliable internet hosting companies run Linux on their web servers. This is due to its relative stability and long uptime, and the fact that desktop software with a graphical user interface is often unneeded. Enterprise and non-enterprise Linux distributions may be found running on servers. Linux is the cornerstone of the LAMP server-software combination (Linux, Apache, MySQL, Perl/PHP/Python) which has achieved popularity among developers, and which is one of the more common platforms for website hosting.

Linux is commonly used as an operating system for supercomputers. As of November 2007, out of the top 500 systems, 426 (85.2%) run Linux.

Embedded devices

Main article: embedded Linux

Due to its low cost and ability to be easily modified, an embedded Linux is often used in embedded systems. Linux has become a major competitor to the proprietary Symbian OS found in many mobile phones — 16.7% of smartphones sold worldwide during 2006 were using Linux — and it is an alternative to the dominant Windows CE and Palm OS operating systems on handheld devices. The popular TiVo digital video recorder uses a customized version of Linux. Several network firewall and router standalone products, including several from Linksys, use Linux internally, using its advanced firewall and routing capabilities. The Korg OASYS and the Yamaha Motif XS music workstations also run Linux.

Many quantitative studies of open source software focus on topics including market share and reliability, with numerous studies specifically examining Linux. The Linux market is growing rapidly, and the revenue of servers, desktops, and packaged software running Linux is expected to exceed $35.7 billion by 2008.

IDC’s report for Q1 2007 says that Linux now holds 12.7% of the overall server market. This estimate was based on the number of Linux servers sold by various companies.

Desktop adoption of Linux is approximately 1%. In comparison, Microsoft operating systems hold more than 90%.

The frictional cost of switching operating systems and lack of support for certain hardware and application programs designed for Microsoft Windows have been two factors that have inhibited adoption. Proponents and analysts attribute the relative success of Linux to its security, reliability, low cost, and freedom from vendor lock-in.

The XO laptop project of One Laptop Per Child is creating a new and potentially much larger Linux community, planned to reach several hundred million schoolchildren and their families and communities in developing countries. Six countries have ordered a million or more units each for delivery in 2007 to distribute to schoolchildren at no charge. Google, Red Hat, and eBay are major supporters of the project.

GNU/Linux

Main article: GNU/Linux naming controversy

The Free Software Foundation views Linux distributions which use GNU software as “GNU variants” and they ask that such operating systems be referred to as GNU/Linux or a Linux-based GNU system. However, the media and population at large refer to this family of operating systems simply as Linux. While some distributors make a point of using the aggregate form, most notably Debian with the Debian GNU/Linux distribution, the term’s use outside of the enthusiast community is limited. The distinction between the Linux kernel and distributions based on it plus the GNU system is a source of confusion to many newcomers, and the naming remains controversial.