Linux2

Revision as of 00:04, 28 February 2008

Leveraging the power of the Linux command line

Introduction

Roll call: Jonny, Lauren, Emma, Guy, Tim, Rita, SarahS, Jenny

This practical follows Linux1 which introduced the fundamentals of the Linux command line.

During this practical, we will learn how to combine some commands together to create scripts that perform more complex actions.

Getting the content for this practical

The necessary files for this practical are hosted in a version control system. To obtain them, just type the following command:

$ svn export http://source.ggy.bris.ac.uk/subversion-open/linux2/trunk linux2

This will fetch all the necessary files and put them in a folder called linux2/. Don't worry about the cryptic syntax for now; an introduction to version control using Subversion (svn) will be given later on.

Output redirection

In the Linux1 practical, we discovered a few Linux commands. Some of these commands take input from the keyboard (standard input) and output data to the screen (standard output). It is possible to (a) redirect input and output and (b) link commands together to perform complex actions. The files for this section are in the example1 directory.

Redirecting standard input and output

Let's start with a simple example. By default, the diff command outputs to the screen, for instance try:

$ diff file1 file2

This is not convenient if there is a lot of output. It is easy to redirect the output to a file so that it can be saved for later. This is done using the ">" sign:

$ diff file1 file2 > diff12.txt
$ diff file2 file3 > diff23.txt

You can then look at the respective files in a text editor or by using more or less.

Now imagine we want to put the outputs of the two diff operations into one single file. Using the syntax above with the same filename will not work, as the second call would overwrite the first one. However, it is also possible to append the output of a command to a file. Note the second call below: it uses a double ">>":

$ diff file1 file2 > diff.txt
$ diff file2 file3 >> diff.txt

Just remember that a single ">" will overwrite the content of a file, a double ">>" will append.
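A quick way to convince yourself of the difference is to use echo to generate some throwaway text (the filename notes.txt below is just an arbitrary example):

```shell
$ echo "first line" > notes.txt    # ">" creates (or overwrites) notes.txt
$ echo "second line" >> notes.txt  # ">>" appends to it
$ cat notes.txt
first line
second line
$ echo "third line" > notes.txt    # ">" again: the previous content is gone
$ cat notes.txt
third line
```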

Note that we could also concatenate the two diff outputs into one big file rather easily:

$ cat diff12.txt diff23.txt > diff.txt

In the examples above, we redirected the output to a file. It is also possible to redirect the input, although this is not used as often because most commands accept a file as an argument. For instance, consider the command sort, which can be used to sort the lines of a file alphabetically. You can specify which file to use with a "<":

sort < file4 

Note that the example above is a bit tedious, as sort file4 would work just as well. However, you will probably encounter input redirection from time to time, so you might as well know how it is done. Note that you can give sort the -n option to make it use numerical sorting instead of alphabetical.
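To see why -n matters, compare the two orderings on a few bare numbers (the filename numbers.txt is just an illustrative example):

```shell
$ printf '9\n10\n2\n' > numbers.txt
$ sort numbers.txt        # alphabetical: "10" sorts before "9" because '1' < '9'
10
2
9
$ sort -n numbers.txt     # numerical: 2, 9, 10
2
9
10
```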

Both types of redirection can also be combined:

sort < file4 > file4-sorted.txt

The line above starts to get complex and leads nicely to the notion of a command pipeline, which is explained below.

Important note: there is more than just standard input (stdin) and standard output (stdout); there is also standard error (stderr), which is used by commands to report problems (compiler warnings, errors, etc.). It is also possible to redirect standard error, not necessarily to the same place as standard output. This is beyond the scope of this practical.
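If you are curious, here is a minimal sketch anyway: standard error is redirected with "2>", and asking ls about a file that does not exist is enough to make it complain (no-such-file is just a made-up name):

```shell
$ ls no-such-file > out.txt 2> err.txt
$ cat out.txt      # empty: ls produced no normal output
$ cat err.txt      # the error message went here instead of the screen
```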

Pipelines

Most commands we have seen so far are fairly powerful but have a limited scope. This is intentional, as the Linux command line allows you to create a pipeline of commands to achieve complex behaviour. For instance, ls is good at listing things and more is good at displaying things, so let's pipe them together. This is done using the pipe sign "|".

$ ls -l ~ | more
-> more takes over the window if the output spans more than one screen

This lists the contents of your home directory and makes sure the output does not overflow the page. Use the space bar to scroll down. You could also substitute less for more.

The uniq command removes duplicate lines from its input (strictly speaking, adjacent duplicate lines, which is why its input is usually sorted first). Let's combine it with sort to really start to tidy up file4.

$ sort file4 | uniq > file4-sorted-and-cleaned.txt
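The caveat about adjacency is easy to demonstrate with some made-up lines fed straight into the pipe:

```shell
$ printf 'apple\nbanana\napple\n' | uniq         # "apple" appears twice, but not adjacently
apple
banana
apple
$ printf 'apple\nbanana\napple\n' | sort | uniq  # sorting first makes the duplicates adjacent
apple
banana
```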

How many times was "Scene" written in the first act of Hamlet? grep can find them and wc can count words and lines so let's combine them:

$ grep -i scene file1 | wc -l
5

5 Scenes, correct!

For the last pipe example, let's learn a useful new command. du calculates the size of the files and folders given as input. sort can sort things numerically and head can display the first few lines of its input, so to find the 3 biggest files or folders inside our directory, we could do (note that sort -n only compares the leading digits, which works here because all the sizes happen to be in kilobytes; GNU sort also has a -h option that understands human-readable sizes properly):

$ du --exclude .svn --human-readable ./* | sort -nr | head -n 3
184K    ./file5
44K     ./file3
44K     ./file2

Yes, file5 is the biggest: it actually contains the whole of Hamlet! You could use du to find out which files are clogging up your file space.

Automating things

Although pipelines can be used to perform complex tasks, they often become difficult to read after a few pipes. To perform more complex tasks, it is possible to put a list of commands in a file and execute this file.

$ cd ../example2

convert is a small utility from the ImageMagick suite which allows the manipulation of images at the command line. For instance, to resize an image to a width of 2000 pixels and save it under a new name, you could use:

$ convert image-large.jpg -resize 2000 image-2000.jpg

Now let's say we want to create thumbnails of the image at several sizes, compress them, and clean up afterwards. We can put all the commands in a file like this one:

#!/bin/bash

echo "Create thumbnails."
convert image-large.jpg -resize 2000 image-2000.jpg
convert image-large.jpg -resize 1000 image-1000.jpg
convert image-large.jpg -resize 500 image-500.jpg
convert image-large.jpg -resize 100 image-100.jpg
convert image-large.jpg -resize 10 image-10.jpg

echo "Move thumbnails."
mkdir thumbnails
mv image-*0.jpg thumbnails/

echo "Compress thumbnails."
zip -r thumbnails thumbnails

echo "Clean up."
rm -rf thumbnails

echo "All done."
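Assuming the commands above are saved in a file called, say, make-thumbnails.sh (the name is just an example), the script can be run in two ways:

```shell
$ bash make-thumbnails.sh        # run it through an explicit interpreter
$ chmod +x make-thumbnails.sh    # or make the file executable once...
$ ./make-thumbnails.sh           # ...then run it directly; the "#!/bin/bash" line picks the interpreter
```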

Launching, monitoring and controlling jobs

Background, bg, jobs, fg, nohup, top, kill

Shell Scripting

Variables

Conditionals

For loops

Functions

Arithmetic

Putting everything together in one script.

Environment Variables

SHELL PWD PATH (LD_LIBRARY_PATH)


Text Processing

sed, awk.

Managing Data?

du, df, file. Use of symbolic links. tar, zip, gzip, bzip2