Linux2

Leveraging the power of the Linux command line

Introduction

This practical follows Linux1, which introduced the fundamentals of the Linux command line.

During this practical, we will learn how to combine commands to create scripts that perform more complex actions.

Getting the content for this practical

The files for this practical are hosted in a version control system. To obtain them, just type the following command:

$ svn export https://svn.ggy.bris.ac.uk/subversion-open/linux2/trunk linux2

This will fetch all the necessary files and put them in a folder called linux2/. Ignore the cryptic syntax for now; an introduction to version control using Subversion (svn) will be given later on.

Output redirection

In the Linux1 practical, we have discovered a few Linux commands. Some of these commands use input from the keyboard (standard input) and output data to the screen (standard output). It is possible to (a) redirect input and output and (b) link commands together to perform complex actions. The files for this section are in the example1 directory.

$ cd linux2/examples/example1

Redirecting standard input and output

Let's start with a simple example. By default, the diff command outputs to the screen; for instance, try:

$ diff file1 file2

This is not convenient if there is a lot of output. It is easy to redirect the output to a file so that it can be saved for later. This is done by using the sign ">":

$ diff file1 file2 > diff12.txt
$ diff file2 file3 > diff23.txt

You can then look at the respective files in a text editor or by using more or less.

Now imagine we want to put the outputs of the two diff operations into one single file. Using the syntax above with the same filename will not work, as the second call would overwrite the first one. However, it is also possible to append the output of a command to a file. Note the second call below: it uses a double ">>":

$ diff file1 file2 > diff.txt
$ diff file2 file3 >> diff.txt

Just remember that a single ">" will overwrite the content of a file, a double ">>" will append.

Note that we could also concatenate the two initial files into one big file rather easily:

$ cat diff12.txt diff23.txt > diff.txt

In the examples above, we redirected the output to a file. It is also possible to redirect the input, although this is not used as often since most commands accept a file as an argument. For instance, consider the command sort, which can be used to ... sort the lines in a file alphabetically. You can specify which file to use with a "<":

$ sort < file4 

Note that the example above is a bit tedious, as sort file4 would work just as well. However, you will probably encounter input redirection sometimes, so you might as well know how it is done. Also note that you can pass the option -n to sort to use numerical sorting instead of alphabetical.

Both types of redirection can also be combined:

$ sort < file4 > file4-sorted.txt

The syntax above starts to get complex, and leads nicely to the notion of command pipelines, which is explained below.

Important note: there is more than just standard input (stdin) and standard output (stdout); there is also standard error (stderr), which is used by commands to report problems (compiler warnings, errors, etc.). It is also possible to redirect standard error, not necessarily to the same place as standard output. This is largely beyond the scope of this practical.
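
As a small taster: standard error has file descriptor 2, so its redirection sign is "2>". For instance, to send normal output and errors to two separate files (the file names here are just examples):

$ diff file1 file2 > diff12.txt 2> errors.txt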

Pipelines

Most commands we have seen so far are fairly powerful but have a limited scope. This is intentional, as the Linux command line allows you to create a pipeline of commands to achieve complex behaviour. For instance, ls is good at listing things and less is good at displaying things, so let's pipe them together. This is done by using the pipe sign "|".

$ ls -l ~ | less
-> less takes over the window if the output spans more than one screen

This lists the content of your home directory and makes sure the output does not overflow a page. Use space to scroll down and q to quit. You could substitute less with more also.

The uniq command removes duplicate lines from its input (strictly speaking, adjacent duplicate lines, which is why the input is sorted first). Let's combine it with sort to really start to tidy up file4.

$ sort file4 | uniq > file4-sorted-and-cleaned.txt

How many times was "Scene" written in the first act of Hamlet? grep can find the occurrences and wc can count words and lines, so let's combine them:

$ grep -i scene file1 | wc -l
5

5 Scenes, correct!

For the last pipe example, let's learn a useful new command. du calculates the size of the files and folders given as input, sort can sort things numerically and head displays the first few lines of its input, so to find the 3 biggest files or folders inside our directory, we could do:

$ du --human-readable ./* | sort -nr | head -n 3
184K    ./file5
44K     ./file3
44K     ./file2

Yes, file5 is the biggest. It actually contains the entirety of Hamlet! You could use du to find which files are clogging up your file space.

Automating things

Although pipelines can be used to perform complex tasks, they are often difficult to read after a few pipes. To perform more complex tasks, it is possible to put a list of commands in a file and execute this file.

$ cd ../example2

convert is a small utility from the ImageMagick package which allows you to manipulate images at the command line. For instance, to resize an image to 2000 pixels maximum and rename it, you could use:

$ convert image-large.jpg -resize 2000 image-2000.jpg

Now let's say you want to scale an image to five different sizes and zip the whole lot. You could enter each command repeatedly, but if you use them often, you could also put them together in a file. Have a look at the file create_thumbnails:

$ ls -l create_thumbnails
-rwxr-xr-x 1 jp jp 474 2008-02-27 11:58 create_thumbnails

The first thing to notice is that the execute flag is set on this file. If it was not, it could not be executed.
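
If the flag were missing, you could set it yourself with chmod:

$ chmod +x create_thumbnails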

Now look at the content. It starts with the shebang, a line specifying which interpreter will be used. This is not mandatory but you are advised to include it to make sure the right shell is used. We use the bash shell here.

#!/bin/bash

Then the commands are listed one after the other, in sequential order. Note that we could put two commands on one line by separating them with a semicolon.
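
To give a rough idea, create_thumbnails might look something like the sketch below. This is only reconstructed from the output further down, so the exact sizes, messages and clean-up step are assumptions:

#!/bin/bash

echo "Create thumbnails."
convert image-large.jpg -resize 10   image-10.jpg
convert image-large.jpg -resize 100  image-100.jpg
convert image-large.jpg -resize 500  image-500.jpg
convert image-large.jpg -resize 1000 image-1000.jpg

echo "Move thumbnails."
# two commands on one line, separated by a semicolon
mkdir -p thumbnails ; mv image-*0.jpg thumbnails/

echo "Compress thumbnails."
zip -r thumbnails.zip thumbnails

echo "Clean up."
rm -rf thumbnails    # assumed clean-up step

echo "All done."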

Nothing new here except that echo is used to print things to standard output and zip can be used to create a zip file of a folder. Now try to execute the file. We do that by typing the name of the file; the preceding ./ makes sure we run the one in the current directory:

$ ./create_thumbnails 
Create thumbnails.
Move thumbnails.
Compress thumbnails.
updating: thumbnails/ (stored 0%)
updating: thumbnails/image-1000.jpg (deflated 0%)
updating: thumbnails/image-100.jpg (deflated 6%)
updating: thumbnails/image-10.jpg (deflated 8%)
updating: thumbnails/image-500.jpg (deflated 1%)
Clean up.
All done.

Now use unzip -l to look into thumbnails.zip:

$ unzip -l thumbnails.zip
Archive:  thumbnails.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
        0  2013-08-20 11:04   thumbnails/
  1384508  2013-08-20 11:03   thumbnails/image-2000.jpg
    17659  2013-08-20 11:04   thumbnails/image-10.jpg
    23470  2013-08-20 11:04   thumbnails/image-100.jpg
   125454  2013-08-20 11:04   thumbnails/image-500.jpg
   424753  2013-08-20 11:04   thumbnails/image-1000.jpg
---------                     -------
  1975844                     6 files

This is a very simple script but already it shows that a simple batch file like this can perform some complex operations and make your life simpler. Let's go a bit further now.

In the images folder, there are a few images and I want a set of thumbnails for each of them. I could use the supplied script which does the following:

  • copy the first image into image-large.jpg
  • execute the create_thumbnails script from above.
  • rename the zip file appropriately
  • do the same thing for the next image...

This is done by the file create_all_thumbnails. It is very straightforward. One section requires explanation:

../create_thumbnails > /dev/null 2>&1

Here we execute the script create_thumbnails by giving its relative path. The scribble that follows means that both the standard output and standard error from create_thumbnails will be redirected to oblivion: /dev/null. The 2>&1 part reads "send standard error (stream 2) to wherever standard output (stream 1) is currently going". So this script is not too verbose. Try removing the > /dev/null 2>&1 to see the difference.

You see that we really are starting to automate things now. But we could do better. A lot better. For instance, we still had to hardcode the names of the images, and our current script needs to copy data, which could be an expensive operation. It is actually possible to write a script that loops over all the pictures in the directory automatically, but before we get to that, we should really look at script execution and how to control it.

Launching, monitoring and controlling jobs

In this section we will look at which tools exist to control how jobs are running on our Linux machine.

$ cd ../example3

This directory contains a very simple script called infinite_loop. Although the script is very simple, we have not yet covered its fundamental building block: the loop. Nonetheless, accept for now that this script will loop indefinitely. At each iteration of the loop, it executes:

date
sleep 2

So basically, it writes the date and time to standard output with date and then waits two seconds via sleep. It is a very silly thing to do, but it has two characteristics that are useful for this section about controlling the execution of scripts:

  • it will never stop
  • it will clog up the screen with output after a while.
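
For reference, here is a minimal sketch of what infinite_loop might contain (it relies on a while loop, which is covered later in this practical):

#!/bin/bash
# loop forever, printing the date every two seconds
while true; do
  date
  sleep 2
done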

Try it:

$ ./infinite_loop
Thu Feb 28 08:25:57 GMT 2008
Thu Feb 28 08:25:59 GMT 2008
Thu Feb 28 08:26:01 GMT 2008

... the script carries on and on ...

The script is running and it won't stop on its own. How can we stop it?

The easiest way to stop it right now is to press CTRL-C. This sends the interrupt signal SIGINT. If the script behaves well, it should just stop. Try it.

Thu Feb 28 09:54:45 GMT 2008
Thu Feb 28 09:54:47 GMT 2008
Thu Feb 28 09:54:49 GMT 2008
... hit CTRL-C and the script will stop ...
$ 

In the example above, the script was outputting to the screen. We already know how to redirect the output to a file, but one other trick is to run the script directly in the background so that we don't lose control of our terminal:

$ ./infinite_loop > dates.txt &
[1] 5483

Now the script is running but we cannot see it. What we could do is check that the file dates.txt is receiving data:

$ tail -f dates.txt
Thu Feb 28 09:15:15 GMT 2008
Thu Feb 28 09:15:17 GMT 2008
Thu Feb 28 09:15:19 GMT 2008
Thu Feb 28 09:15:21 GMT 2008
Thu Feb 28 09:15:23 GMT 2008
... and new entries keep appearing ...
... hit CTRL-C to stop tail...

So our script is running in the background. To list which scripts are running in the background right now, use the command jobs:

$ jobs -l
[1]+ 5483 Running                 ./infinite_loop > dates.txt &

Note that jobs will only list the scripts started from a given shell window. If you try jobs in another window, it will not list our running script. There are other commands for that, which we will see later.

There are different ways to stop our running script. We could bring it back to the foreground using the command fg and then use CTRL-C to send the interrupt signal SIGINT:

$ fg 1
./infinite_loop > dates.txt

... now hit CTRL-C and the script will stop ...

$

We could also have killed it directly with the command kill:

$ ./infinite_loop > dates.txt &
[1] 5990
$ kill -9 5990
$

Several signals can be sent to a process using kill. Option 9 is for SIGKILL, the strongest. The equivalent of CTRL-C would be kill -2, which also sends SIGINT.
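
To see the full list of signals and their numbers, you can ask kill itself:

$ kill -l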

We have brought a background job back to the foreground. The opposite is also possible, but for this we first need to know how to suspend a job to get back control of the terminal. This is done by using CTRL-Z, which sends the suspend signal SIGTSTP. We then get control of the shell back and can send the job to the background with the command bg:

$ ./infinite_loop > dates.txt
 ... hit CTRL-Z ...
[1]+  Stopped                 ./infinite_loop > dates.txt
$ bg 1
[1]+ ./infinite_loop > dates.txt &
$ jobs -l
[1]+  6030 Running                 ./infinite_loop > dates.txt &
$ fg 1
./infinite_loop > dates.txt
... hit CTRL-C to stop ...

We have come a long way now. There are still a few very useful commands for job control. The first one is top, which gives you a summary of what processes are running on your machine and how many resources they consume. It is very useful when your machine grinds to a near halt and you don't know why. You can then find the PID of the CPU-greedy processes and kill them. Press q to exit.

ps is in a way similar to top except that it only lists processes. Use the syntax ps -u username to see all the processes running under your name. Note that by default ps is limited to the running shell, like jobs; you need the option -u to see all your processes.

Now you can start and stop jobs and also send them to the background so that you can carry on working. However, if you log out or close the shell window (the same thing really), your running jobs will die. This is an issue for jobs that might take days to finish. The trick is to use the command nohup. It makes sure that your script carries on running after you log out. The script will only stop if it finishes, if the machine reboots ... or if it is killed by an admin because it clogs up the machine (this happens too!).

By default, all output is sent to a file nohup.out so use redirection to make sure it is sent somewhere appropriate instead.

Let's start our script via nohup:

$ nohup ./infinite_loop > dates.txt &
[2] 7390
$ nohup: ignoring input and redirecting stderr to stdout
$ 

Now, to check all is well, you could:

  • close all shell windows
  • login again
  • check if your script is still running via ps
  • kill it

If you are going to run jobs in the background, you should also use the nice command, which lowers the priority of your job so that it does not hog the machine:

nice ./infinite_loop > dates.txt &

Now that you can stop jobs and control their behaviour, it is time to learn how to build some more advanced shell scripts.

Shell Scripting

The bash shell allows the creation of complex scripts using conditionals, loops, arithmetic, etc. However, keep in mind that shell scripting should only be done when appropriate. Don't program your whole model in bash; it would probably be slow, inefficient and hard to maintain. Do use shell scripting to drive your numerical models, manage their input and output, etc.

The examples for this section are in the example4 directory.

$ cd ../example4

We could spend hours talking about shell scripts. Instead, as you are already aware of programming concepts, we will simply see how the main building blocks of a program can be written using the bash shell.

One thing to keep in mind is that a shell script is interpreted, not compiled, so you only find problems as you run the script. Be careful: errors in a script can sometimes have consequences...

Variables

The bash shell allows the use of variables. They are not typed as in Fortran, so you just declare them and use them as you go. When recalling a variable, put a dollar sign before its name. For instance:

MYVAR=123
echo $MYVAR

One thing to be aware of at this stage is how quoting is handled. Remember that:

  • double quotes expand variable names: "$MYVAR" is equivalent to 123
  • single quotes do not expand variable names: '$MYVAR' is simply the text string $MYVAR.
  • use backticks to substitute the output of a command: `date`
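
A quick way to see these rules in action:

MYVAR=123
echo "The value is $MYVAR"    # prints: The value is 123
echo 'The value is $MYVAR'    # prints: The value is $MYVAR
echo "Today is `date`"        # prints the current date and time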

Look at the script var.sh and execute it. It illustrates the basic use of variables in a script.

Arithmetic

Now that we can use variables, we can do operations on them. Note that it is not recommended to use the shell to do complex calculations (bash arithmetic is integer-only anyway) ... but you can do basic operations on variables. There are two main ways of manipulating variables:

  • Use the command let:
MYVAR=3
let MYVAR=MYVAR*9
  • Use "arithmetic expansion":
MYVAR=3
MYVAR=$(( MYVAR*9 ))

Both options do exactly the same thing. Note that you might need to use double quotes with let in the case of complex operations.
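
For instance, spaces and characters like * that the shell would otherwise interpret need quoting (a small sketch):

MYVAR=3
let "MYVAR = MYVAR * 9"    # the quotes protect the spaces and the *
echo $MYVAR                # prints 27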

The list of operators which can be used is quite large, for instance:

numerical operations:  =, +, -, *, /, **, %
and their shortcuts:   +=, -=, *=, /=, %=
logical operations:    &&, ||, !

Look at the script arith.sh and execute it. It illustrates the basic use of arithmetic on variables in a script.

Conditionals

We can use variables and perform operations on them. We have seen earlier how to perform some logical operations. bash also allows us to write conditional statements. The syntax is very simple:

if [ condition ] ; then
    do something
else
    do something else
fi

This is quite simple really. The tricky bit is to write the condition properly. bash gives us some useful operators:

-eq  : is equal to
-ne  : is not equal to
-le  : is less than or equal to
-lt  : is less than
-ge  : is greater than or equal to
-gt  : is greater than

Here is a simple example:

if [ 2 -eq 3 ]; then
  echo "this will never get done"
else
  echo "this will always get done"
fi

As well as these "standard" tests, bash provides some very useful tests for data management.

-a : exists
-f : exists and is a file
-d : exists and is a directory

The particularity of these tests is that they take only one operand, for instance:

if [ -d folder ]; then
  cd folder
fi

Look at the script if.sh and execute it. It illustrates the basic use of tests in a shell script.

Loops

bash also supports the notion of loops. They are actually rather powerful and can be used, for instance, to loop over the elements of a directory. The basic syntax is:

for VAR in LIST; do
  do something
done

In the example above, LIST contains the elements to loop over. It can be given in the script or be the result of a command. For instance, the output of ls can be used to loop over all the elements of a given directory. At each iteration, the variable VAR takes the value of an element in the list. It is a standard variable and hence can be accessed at any time via $VAR.
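
For example, here is a loop over all the jpg files in the current directory (this uses the shell's own file name expansion, which is a little safer than parsing the output of ls):

for F in *.jpg; do
  echo "Found image: $F"
done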

It is also possible to create loops based on conditionals. For instance, a loop which stops when a condition is no longer satisfied can be created with while:

while [ condition ]; do
  do something
done

Note that it is also possible to use until.
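
As a small illustration, here is a counting loop written with while:

COUNTER=1
while [ $COUNTER -le 3 ]; do
  echo "iteration $COUNTER"
  let COUNTER=COUNTER+1
done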

Look at the script loop.sh and execute it. It illustrates the basic use of loops in a shell script.

Arguments

Often we want to give some input to a script so that we don't have to rewrite it all the time. Shell scripts can accept arguments, which can then be handled inside the script. The first argument is $1, the second $2, etc. $0 contains the name of the script itself and $# is the number of arguments, which is handy to check before doing operations on the arguments.

Look at the script simple-args.sh. It illustrates the basic use of arguments in a shell script. Run it with some arguments, for instance:

$ ./simple-args.sh foo bar 
./simple-args.sh
foo
bar
foobar
./simple-args.sh expects two arguments: <FILEA> <FILEB>

Of note in this simple example is the notion of a return value: if the script fails, it returns a non-zero value so that the problem can be handled appropriately.
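
Based on the output above, simple-args.sh might look roughly like this (only a sketch; the real script may differ):

#!/bin/bash
echo $0        # the name of the script itself
echo $1        # the first argument
echo $2        # the second argument
echo $1$2      # the two concatenated
echo "$0 expects two arguments: <FILEA> <FILEB>"
if [ $# -ne 2 ]; then
  exit 1       # a non-zero return value signals failure
fi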

Functions

When you are going to perform the same operation many times in a script, it is possible to put this operation in a function and just call the function. Functions are very simple to use; the only thing to be careful about is that the function declaration must come before its use in the script. The general syntax is:

# A simple function
my_function()
{
  whatever the function does
}

# Main script
# Call my function
my_function

Look at the script func.sh and execute it. It illustrates the use of a function inside a shell script.
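
As a concrete illustration, here is a small working example (the function name and messages are made up):

#!/bin/bash

# A function that greets whoever is named in its first argument
greet()
{
  echo "Hello, $1!"
}

# Main script
greet World      # prints: Hello, World!
greet $USER      # greets the current user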

A final example

To finish this section about scripting, have a look at the directory flickr_mossaic. Try to execute the script create_mossaic and give it an argument with the extension .html. Also try to run it more than once.

$ ./create_mossaic test.html

Then look at the html file using a web browser:

$ firefox test.html

Although this script is a bit silly, it combines most things you will ever need to do, namely:

  • check the validity of the arguments (if)
  • loop on a number of items (while, for)
  • download material from the Internet
  • perform operations on that material
  • output some text to a file (cat, redirection)

Environment Variables

The variables we have seen so far lived only inside our shell scripts. It is possible to have persistent variables; they are called environment variables. Actually, there are plenty of environment variables already declared for you. Just type env and be surprised at how many there are.

You can ignore most of them, but some are really important. They can be re-used in shell scripts, for instance:

  • HOME contains the location of your home directory. Very useful in scripts.
  • PWD contains your current location. Very useful in scripts.
  • PATH contains all the locations that will be searched when you type a command. Can make your life easier.
  • LD_LIBRARY_PATH contains the locations of the shared libraries on the system. Best left alone but might need modifying in some cases.
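
You can inspect any of them with echo, exactly as with script variables (the values shown here are just examples; yours will differ):

$ echo $HOME
/home/jp
$ echo $PATH
/usr/local/bin:/usr/bin:/bin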

Most of these variables should be set in a login script. When using bash, the file .bash_profile is executed once when you log in and .bashrc is run every time you start a new shell window. Therefore you could put your configuration in .bashrc. For instance, let's assume you have a directory bin in your home folder in which you put useful scripts that you use very often. You could add this directory to your PATH so that you don't have to re-type the full path of the script all the time. You could put in .bashrc:

# Local bin folder containing all my useful scripts
PATH=$PATH:$HOME/bin
export PATH

So $HOME/bin is appended to PATH, with a colon inserted to separate the directories. Note also the export command, which makes sure that PATH becomes an environment variable, i.e. it is not destroyed when the script finishes.

To test the changes, create the bin folder and put a script in it. Then modify .bashrc accordingly. Finally, either open a new shell window or type . ~/.bashrc (the "." command executes the file in the current shell). The scripts in bin should be available automatically now!

Useful commands not covered in this practical

We could not cover everything in two practicals, but you now know enough to teach yourself a lot about the Linux command line. Below is a list of commands that you might need sometimes, so it's worth having a look at what they do now...

Text Processing

sed, the stream editor, is great for manipulating text. Want to list all the files ending in .jpg but without the extension? Try:

$ ls *.jpg | sed 's/\.jpg//g'

awk is better suited to manipulating column-type data.

cut and paste can also be useful.
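
For instance (two small sketches): awk splits each line into fields, so printing chosen columns of ls -l is a one-liner, and cut slices each line by a delimiter:

$ ls -l | awk '{ print $5, $9 }'     # print the size and name columns
$ cut -d: -f1 /etc/passwd            # print the first ':'-separated field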

Managing Data

Managing data and file space will consume too much of your time. Use df to know how full the hard disks are (df -h prints the sizes in human-readable units).

When you need room, you can delete stuff but you can also compress data. There are a few (!) compression utilities installed on our Linux machines:

  • tar
  • zip
  • gzip

zip is nice as the zipped files can be unzipped under Microsoft Windows very easily. It is also easy to use; note the -r option below, which is needed to recurse into the folder:

zip -r folder folder  -> will compress folder and its content into a zip file called folder.zip
unzip folder.zip   -> will uncompress the zip file folder.zip

To compress data further, use tar and combine it with gzip (option z in the tar arguments):

tar cfvz folder.tar.gz folder -> will compress folder and its content into a file called folder.tar.gz
tar xvfz folder.tar.gz        -> will uncompress the file folder.tar.gz

To go further

The Pragmatic Programming course continues with a practical about more advanced features of the Fortran programming language: Fortran2.

You should now have a solid foundation in the Linux command line. Remember to use man and info to find help.

There is a very extensive bash user guide available at http://steve-parker.org/sh/sh.shtml and remember that for technical issues, Google is your friend...