BISMG:SarahS/automated testing

Set up testing on the glimmer-cism2 trunk
Put the GS files on BerliOS at ftp://ftp.berlios.de/pub/glimmer-cism/. Log onto holocene and copy them across:

scp filename sarah_shannon@shell.berlios.de:/home/groups/ftp/pub/glimmer-cism

or

sftp sarah_shannon@shell.berlios.de:/home/groups/ftp/pub/glimmer-cism (user: sarah_shannon, pw: )

To run one of the tests, cd tests/EISMINT/EISMINT-1 and type "make e1-fm.1.nc"; this runs an individual test. Alternatively, type "make data" in the top directory to run all the tests. To run the comparison, type "make compare-e1-fm.1".


 * Step 1. Add a macro to configure.in which defines the path to the golden_std directory

To do this add

AC_MSG_CHECKING([whether location of gold standard files is specified])
AC_ARG_WITH(gold_std,
  AS_HELP_STRING([--with-gold-std],[location of gold_std]),
  [GOLD_STD_PREFIX=$withval])
AC_MSG_RESULT(gold_std location: $GOLD_STD_PREFIX)

and export the new variable by adding the line

AC_SUBST(GOLD_STD_PREFIX)
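The effect of the AC_ARG_WITH macro can be illustrated with a small shell sketch (the function name is hypothetical; it only mimics how configure turns --with-gold-std=/path into GOLD_STD_PREFIX, which AC_SUBST then exports to the Makefiles):

```shell
#!/bin/sh
# Sketch of what AC_ARG_WITH(gold_std, ...) does at configure time:
# scan the arguments for --with-gold-std=... and store the value
# in GOLD_STD_PREFIX.
parse_with_gold_std() {
    for arg in "$@"; do
        case $arg in
            --with-gold-std=*) GOLD_STD_PREFIX=${arg#--with-gold-std=} ;;
        esac
    done
    echo "gold_std location: $GOLD_STD_PREFIX"
}
```

For example, `parse_with_gold_std --with-gold-std=/data/gold` prints `gold_std location: /data/gold`.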

 * Step 2. Add these rules to extra_rules.am

compare-%:	%.nc
	@if [ ! -e $(GOLD_STD_PREFIX)/$(rel_builddir)/$< ]; then \
		echo "Reference file does not exist, download files from ftp.berlios.de/pub/glimmer-cism"; \
		echo "and point configure to this directory --with-gold-std=/path/to/files"; \
		exit -1; \
	fi ;\
	if $(top_builddir)/utils/compare $(GOLD_STD_PREFIX)/$(rel_builddir)/$< $< -a 1e-8; then \
		echo "$< passed"; \
	else \
		echo "$< failed"; \
	fi
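The logic of the compare-% rule can also be sketched as a standalone shell function, which may help when debugging it (a sketch only: the function name is hypothetical, and COMPARE stands in for the utils/compare tool so the sketch can run without it):

```shell
#!/bin/sh
# Sketch of the compare-% rule: check that the reference file exists,
# then run the comparison tool with an absolute tolerance of 1e-8.
COMPARE=${COMPARE:-./utils/compare}

check_against_gold() {
    gold=$1   # reference file under $(GOLD_STD_PREFIX)/$(rel_builddir)
    run=$2    # freshly generated output, e.g. e1-fm.1.nc
    if [ ! -e "$gold" ]; then
        echo "Reference file does not exist, download files from ftp.berlios.de/pub/glimmer-cism"
        echo "and point configure to this directory --with-gold-std=/path/to/files"
        return 1
    fi
    # the comparison tool exits 0 when the files agree to within the tolerance
    if "$COMPARE" "$gold" "$run" -a 1e-8; then
        echo "$run passed"
    else
        echo "$run failed"
    fi
}
```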

check-data:
	@$(foreach data,$(data), make compare-$(basename $(data));)
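What the foreach in check-data expands to can be mimicked in plain shell (a sketch; the file list stands in for $(data), and the echoed lines are the commands the rule would invoke):

```shell
#!/bin/sh
# For each output file, strip the .nc suffix (as $(basename ...) does
# in make) and print the compare target that check-data would invoke.
check_data() {
    for f in "$@"; do
        echo "make compare-${f%.nc}"
    done
}
```

For example, `check_data e1-fm.1.nc e1-fm.2.nc` prints `make compare-e1-fm.1` and `make compare-e1-fm.2`.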

dist-gold-standard::
	@mkdir -p $(top_builddir)/$(PACKAGE)-gold-$(VERSION)/$(rel_builddir)
	$(foreach data,$(data), cp $(data) $(top_builddir)/$(PACKAGE)-gold-$(VERSION)/$(rel_builddir);)

dist-gold-standard::
	@find $(top_builddir)/tests/*/*/*.nc -size +100 -print > test_filenames
	@tar -cvzf $(PACKAGE)-gold-$(VERSION).tgz -T test_filenames
	@rm test_filenames
	@echo "Output of tests compressed to" $(PACKAGE)-gold-$(VERSION).tgz

''Old version previously checked in:''

compare-%:	%.nc
	@if [ ! -e $(top_srcdir)/tests/golden_std/$(shell basename $(CURDIR))/$^ ]; then \
		mkdir -p $(top_srcdir)/tests/golden_std/$(shell basename $(CURDIR)) ;\
		echo "Reference file does not exist, download from berlios"; \
		wget --directory-prefix=$(top_srcdir)/tests/golden_std/$(shell basename $(CURDIR)) ftp://ftp.berlios.de/pub/glimmer-cism/$(shell basename $(CURDIR))/$^ ; \
	fi ;\
	if $(top_builddir)/utils/compare $(top_srcdir)/tests/golden_std/$(shell basename $(CURDIR))/$^ $^ -a 1e-8; then \
		echo "test passed"; \
	else \
		echo "test failed"; \
	fi

Set up the EISMINT-1 and hump tests on the LANL branch using a bash script.
The total size of the checkout is 15.6 MB. For the hump.PBJ.config (Pattyn-Bocek) tests the code must be recompiled with the -DO_RESCALE flag.
 * STEP 1. Generate the golden standard files. Note that the GS generated by glimmer-cism/trunk for e1.mm.1.nc differs from that generated by glimmer-cism/lanl/trunk: the wvel(surface) differs by +/- 0.5 m yr-1.

The GS files must not get much larger than this: hump.out.nc = 122 kB, hump.nc = 7.86 kB, e1.mm.1.nc = 985.5 kB.


 * STEP 2. Add compare.cpp to tests/util/

svn ci -m "ggsrs: Add tool to compare two netCDF files for automated nightly build and standard tests" compare.cpp

svn ci -m "ggsrs: Add EISMINT-1 configuration files for automated nightly build and tests. Files have been modified to reduce the size of the output. See README" EISMINT-1
 * STEP 3. Upload the EISMINT-1 tests to /tests


 * STEP 4. Get autoconf to build compare.cpp
 * In configure.in at line 35 insert: AC_PROG_CXX This finds the C++ compiler
 * In configure.in at line 165 insert: CXXFLAGS="$FCFLAGS $ax_cv_f90_modflag$ac_cv_netcdf_prefix/include" This points compare.cpp at the netCDF include directory
 * In configure.in at line 349 add tests/util/Makefile to the list of AC_CONFIG_FILES
 * In /test/util create a Makefile.am (a Makefile template) which compiles compare.cpp. Makefile.am contains the lines

bin_PROGRAMS = compare
compare_SOURCES = compare.cpp
compare_LDADD = -lnetcdf_c++
 * In /glimmer-prefix/Makefile.am add tests/util to the SUBDIRS list, like this: SUBDIRS = m4macros src tests/util


 * STEP 5. Run autoreconf to regenerate the Makefile.in files


 * STEP 6. Re-run ./configure --prefix=/data/ggsrs/glimmer-test --with-netcdf=/opt/local/CentOS-64/netcdf/4.0/intel_fc_10.1 FC=ifort F77=ifort


 * STEP 7. compare.cpp will now be built when we type make.


 * STEP 8. Get autoconf to create a new target so that typing "make test" invokes the test script. In /glimmer_prefix/Makefile.am add the rule (the recipe goes on a new line, indented with a tab):

test::
	${prefix}/tests/util/test.sh ${prefix} ${FC} ${experiment_type} ${experiment_number}


 * STEP 9. Use the script by typing: make test exp_type=eismint-1 exp_num=1

This feeds the script its arguments on the make command line.


 * EISMINT-1. In these tests the temperature does not interact with the velocity, i.e. the viscosity is held constant, so the temperature code can be tested directly. The subsets to run are the moving-margin, steady-state experiments. Variables to compare are thk, wvel, xvel, yvel and temp. Output every 40 kyr; this reduces the files from 109 MB to 1.93 kB.
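One way to get that reduction is to restrict the output section of each EISMINT-1 config file to the compared variables and a 40 kyr write interval. This is a sketch only: the [CF output] section name and keys are recalled from glimmer config files, the filename is hypothetical, and the frequency units should be checked against the EISMINT-1 configs in the repository.

```ini
[CF output]
name: e1-fm.1.nc
frequency: 40000
variables: thk wvel xvel yvel temp
```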


 * EISMINT-2. In these tests the temperature and velocity interact. T.P. suggests running (a) and (g) as these are steady-state experiments. Compare the same variables as EISMINT-1 plus melt rate. (File size reduces from 1.18 GB to 31 MB.)


 * ISMIP-HOM (a subset of tests A through F, for higher-order models only)


 * Ross ice shelf (for shelf and/or HO models)


 * the "hump" test on the parallel branch


 * one or more analytical tests, as in the Bueler papers. Put this test aside for the time being; it is more a test of the physics of the model than of whether the code has broken


 * a glint-example test case for Greenland, i.e. demonstrate that the code can run for a long time without crashing


 * exact restart tests--very important for climate models. Put this aside for the time being.

Notes: the ISMIP-HOM tests in glimmer-cism/glimmer-cism-lanl/trunk/ did not work. Verify.py created netCDF files of the same size no matter which size was specified (--size=).

The EISMINT-1 tests

Write the test script in bash; keep it in /home/automat

Outline of the test script:

A rule in the makefile would invoke the script so that typing "make test" would run a test. Need help with this!

The script then does the following


 * a. Check out the configuration files for the tests
 * b. Check out the golden standard files for the tests
 * c. Compile a tool to compare two netCDF files.
 * d. Run glimmer for each configuration file in the test.
 * e. Compare the output of the runs to the gold standard and report back if the tests passed or not.
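The steps a-e above can be sketched as a bash outline. Everything here is a placeholder: the repository URL is invented, the paths and make targets are taken from the notes above, and setting RUN=echo turns it into a dry run that only prints the plan.

```shell
#!/bin/sh
# Dry-run sketch of the nightly test script (steps a-e).
# Set RUN=echo to print the plan instead of executing it.
RUN=${RUN:-}
REPO="svn://placeholder/glimmer-cism"           # hypothetical checkout URL
GOLD="ftp://ftp.berlios.de/pub/glimmer-cism"    # gold standard files on BerliOS

run_nightly_tests() {
    $RUN svn checkout "$REPO/tests" tests   # a. configuration files for the tests
    $RUN wget -r "$GOLD"                    # b. golden standard files
    $RUN make -C tests/util compare         # c. build the netCDF comparison tool
    $RUN make data                          # d. run glimmer for each config file
    $RUN make check-data                    # e. compare output to the gold standard
}
```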

The nightly build script checks out the code, builds it and runs the test script.