BISMG:RupertG

Rupert Gladstone's wiki page

=Overview=

Working on 1d adaptive mesh ice sheet modelling with the aim of overcoming model inconsistencies in simulating grounding line migration. Also working on the development and testing of parameterisations for representing grounding line position at sub-grid resolution. These studies should lead to the submission of two papers: an AMI paper, "Grounding Line Migration in an Adaptive Mesh Ice Sheet Model", and a GLP paper, "Parameterising the grounding line in marine ice sheet models".

Investigating southern ocean warming in output from existing models, including HadCM3 and OCCAM. Further development of coupling code for coupled climate/ice sheet simulations.

Funded on NCEO and JCRP.

=Terminology=

 * AMI - my 1d Adaptive Mesh ice sheet model
 * BC1 and BC2 - Blue Crystal phases 1 and 2 (Bristol Uni supercomputing facility)
 * GLP - Grounding Line Parameterisation
 * JCRP - Joint Climate Research Programme (between Met Office and NERC)
 * LHS - Latin Hypercube Sampling
 * NCEO - National Centre for Earth Observation (supports modelling activities too)
 * OAT - One At a Time sampling

=Weekly updates=

Fri 18th December
AMI paper:

Received comments from Andreas and Tony and incorporated them into the paper. Also tidied up the references. Aim to submit this afternoon (Gems was down yesterday, due back today).

GLP paper:

Problems on BC1 - other users writing too much to standard out or to /local on the nodes, causing my jobs to fail. Had to resubmit ~1000 jobs that failed over the weekend. Wrote a python script for this: it checks the output directories for certain files and resubmits any jobs that failed.
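
The check-and-resubmit logic was roughly as follows. This is a minimal sketch: the marker file name (`output.nc`), the batch script name and the `qsub` submission command are hypothetical stand-ins for the real BC1 setup.

```python
import os
import subprocess

def find_failed_jobs(run_root, marker="output.nc"):
    """Return run directories that are missing the expected output file."""
    failed = []
    for name in sorted(os.listdir(run_root)):
        run_dir = os.path.join(run_root, name)
        if os.path.isdir(run_dir) and not os.path.exists(os.path.join(run_dir, marker)):
            failed.append(run_dir)
    return failed

def resubmit(run_dirs, script="run_ami.sh"):
    """Resubmit the batch script found in each failed run directory."""
    for run_dir in run_dirs:
        subprocess.call(["qsub", os.path.join(run_dir, script)])
```

Checking for the presence of an output file is cruder than parsing the scheduler logs, but it is robust to the node crashes seen here, where jobs die without writing anything at all.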

Further problems - the retreat experiments didn't work properly (the initial forcing boost wasn't sufficient to get out of the region of steady states in many cases). Wrote a script to check the retreat experiments. Modified the set up and prescribed forcing for the retreat experiments: increase the acab boost, boost the rate factor too, and return gradually to normal forcing over 10000 years after steady state has been achieved.

Note: I also wrote a python script for topping up the queues - the queue limit is 800 and my ensemble consists of ~7000 runs, so the script checks how many jobs I have queued and fills the queue up to the limit by taking jobs from a list (text file). This could be run using cron, but I'll leave it manual until I get round to putting some error handling in it!
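
The queue top-up works along these lines. The `qstat` parsing and the pending-list file format are guesses at the real setup, not the actual script:

```python
import subprocess

QUEUE_LIMIT = 800  # BC1 per-user queue limit

def count_queued(user):
    """Count this user's jobs currently queued (crude parse of `qstat -u`)."""
    out = subprocess.check_output(["qstat", "-u", user]).decode()
    return sum(1 for line in out.splitlines() if user in line)

def top_up_queue(pending_file, n_queued, limit=QUEUE_LIMIT):
    """Pop enough job scripts from the pending list to fill the queue to `limit`.

    Returns the scripts to submit; the remainder is written back to the file,
    so repeated invocations work through the ~7000-run backlog.
    """
    with open(pending_file) as f:
        pending = [line.strip() for line in f if line.strip()]
    to_submit = pending[: max(0, limit - n_queued)]
    with open(pending_file, "w") as f:
        f.write("\n".join(pending[len(to_submit):]) + "\n")
    return to_submit
```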

The runs all completed, but are currently running again (after modifying retreat experiments). Should all be complete by Tuesday bar the highest res convergence simulations.

Started a very high resolution run (16X higher res than the main LHS and OAT ensemble) which outputs full thickness profiles periodically. Intend to use these for the paper - direct comparison of parameterised thickness profiles against high resolution thickness profile.

Streamlining output file transfer - set up key sharing for passwordless connections from BC1 to dartagnan. Wrote a python script using tar and scp to transfer output files to dartagnan.
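
The transfer script boils down to two steps, sketched below; the remote path on dartagnan is a made-up placeholder, and passwordless `scp` relies on the key sharing mentioned above:

```python
import os
import subprocess
import tarfile

def make_archive(output_dir, archive_name):
    """Bundle one run's output directory into a gzipped tar archive."""
    with tarfile.open(archive_name, "w:gz") as tar:
        tar.add(output_dir, arcname=os.path.basename(output_dir.rstrip("/")))
    return archive_name

def push_to_dartagnan(archive_name, remote="dartagnan:~/ami_output/"):
    """scp the archive across (passwordless via the shared ssh key)."""
    subprocess.check_call(["scp", archive_name, remote])
```

Tarring first means one scp connection per run directory rather than one per file, which matters when each run writes many small output files.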

Fri 11th December
AMI paper:

Awaiting co-authors' comments; basically ready for submission.

GLP paper:

The description of GLPs is complete.

A significant proportion of the LHS, OAT and convergence ensembles need to be redone. This is because of problems on Blue Crystal caused by another user writing huge amounts of standard out locally and crashing the nodes. Also, having looked at the output from the previous runs some changes are needed.

Instead of linear sampling I will sample the rate factor, drag coefficient and bed slope uniformly in log space. The reason is that where linear sampling is used for parameters covering several orders of magnitude, the vast majority of values will be of the highest order of magnitude, whereas in practice we are interested in representing the lower orders of magnitude too.
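
The difference is easy to see in a sketch (standard library only; the rate factor range is taken from the parameter limits listed in the 20th November entry):

```python
import math
import random

def sample_log_uniform(lo, hi, n, rng=random):
    """Draw n values uniformly in log space between lo and hi.

    With plain linear sampling over, say, the rate factor range
    [3.1e-18, 1.7e-16], roughly 90% of draws land in the top decade;
    sampling the exponent uniformly instead spreads the draws evenly
    across the orders of magnitude.
    """
    log_lo, log_hi = math.log(lo), math.log(hi)
    return [math.exp(rng.uniform(log_lo, log_hi)) for _ in range(n)]
```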

A very long cutoff time will be used to end simulations in which the grounding line position is oscillating too much for the termination criteria to be reached. Provisionally, 200kyr will be used.

In all cases where a GLP fails (mainly in the linear extrapolation GLP or the ones that require solving a cubic), linear interpolation will be used instead, and the number of occurrences will be counted and written out. Previously checks were in place for most but not all eventualities (e.g. checking for more than one root but not for zero roots, since that never happened in the test case).
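
The fallback wrapper looks roughly like this. The GLPs themselves are not reproduced here; `glp` is any callable returning the grounded fraction of the partially grounded cell, and the cell representation (flotation criterion values at the two cell edges) is a simplification for illustration:

```python
import collections

# running tally of how often each GLP falls back to linear interpolation
fallback_counts = collections.Counter()

def grounded_fraction(glp, cell, counts=fallback_counts):
    """Apply a GLP; fall back to linear interpolation if it fails.

    `glp` returns the grounded fraction in [0, 1], or raises / returns an
    out-of-range value when it fails (e.g. the cubic has zero roots, or
    more than one root, within the cell).
    """
    try:
        frac = glp(cell)
    except (ValueError, ZeroDivisionError):
        frac = None
    if frac is None or not (0.0 <= frac <= 1.0):
        counts[getattr(glp, "__name__", "glp")] += 1
        # linear interpolation of the flotation criterion across the cell:
        # f_left/f_right are (thickness - flotation thickness) at the edges
        fl, fr = cell["f_left"], cell["f_right"]
        frac = fl / (fl - fr)
    return frac
```

Counting per GLP (rather than a single global tally) makes it obvious in the ensemble output which parameterisations fail most often.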

The number of simulations can be increased, since they are running pretty quickly. The convergence experiments will be increased by one for each GLP, and the sample size for the OAT and LHS experiments will be doubled.

These changes have all been implemented and the samples are running.

Fri 4th December
Monsoon:

Attended Monsoon kick off day. I now have access to this machine via the academic network.

AMI paper:

Nearing completion. Vicky completed her contribution. I've made further very minor revisions (spell checking etc.) and sent it out to co-authors for final comments before submission. Contacted JGR about problems with figure captions and labelling in draft mode; this is now fixed.

GLP runs:

Preliminary results are inconclusive due to heavy bias from a few outliers (either failed runs or runs where the steady state criteria cut in too early).

Enhanced the termination criterion: added thickness rate checking as in MISMIP. Should reduce outliers.

Changed the way bed slope is implemented: it now 'pivots' near the middle of the domain rather than from the left margin, which makes very gentle or very steep slopes more stable for our given domain. Should reduce the number of failed runs.

Added new GLPs: Pattyn and a variant on the harmonic mean.

Problems on BC1 have meant that many runs failed to submit or complete. Contacted HPC support; the problem should be resolved now, but I need to resubmit...

Started writing up GLP descriptions.

Fri 27th November
AMI paper:

Moved to sub directory of amis1d repository for sharing with co-authors (well, Vicky anyway!).

The reruns have completed on Blue Crystal with x_tol = 55km (determines patch size). Transferred the output to Dartagnan, tidied up the plotting scripts to give better-looking output (hopefully font size etc. matches JGR requirements) and recreated all the plots. Re-did the thickness profile plot from scratch; hope it's more reader-friendly now.

Implemented minor corrections suggested by Vicky.

Put all the relevant numbers from the reruns into the text and tidied up the results sections a bit.

Modified the discussion section: quantified convergence and accuracy error a little more formally.

Updated the conclusions.

Fri 20th November
GLP simulations:

Implemented an alternative to simply prescribing the simulation length in advance: a cutoff for the rate of change of grounding line position can be prescribed instead. Note that in retreat simulations, where an initially higher forcing is prescribed, this cutoff is used twice: first to determine completion of the first (high forcing) phase, and second to terminate the simulation. A hard coded minimum period (provisionally 5000 years, though this could change) for each phase is imposed (in case of slow initial response or slow response to the forcing change).
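
The per-phase check can be sketched as below. The rate cutoff value is a hypothetical placeholder (only the 5000-year minimum phase length comes from the text), and the real criterion also checks thickness rates:

```python
def phase_complete(gl_history, dt, phase_start_yr, current_yr,
                   rate_cutoff=1.0e-4, min_phase_yr=5000.0):
    """Decide whether the current forcing phase has reached steady state.

    gl_history    : recent grounding line positions (m), one per timestep
    dt            : timestep (yr)
    rate_cutoff   : max |d(gl)/dt| (m/yr) regarded as steady (assumed value)
    min_phase_yr  : hard coded minimum phase length, guarding against a
                    slow initial response being mistaken for steady state
    """
    if current_yr - phase_start_yr < min_phase_yr:
        return False
    if len(gl_history) < 2:
        return False
    rate = abs(gl_history[-1] - gl_history[-2]) / dt
    return rate < rate_cutoff
```

In a retreat run the same function would be called twice with different `phase_start_yr` values: once to end the high-forcing phase and once to terminate the simulation.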

Modified the code for calculating the semi-analytic solution [Schoof 2007] to run at 100 times higher resolution. This is because we are now including the Schoof solution in the model outputs and therefore want high accuracy. The Schoof solution is included in the outputs because of the LHS sampling - it was previously the same for all simulations but now it will vary.

Some of the outputs have been cancelled, and key outputs are now written out separately: the Schoof solution and the simulator solution for steady state grounding line position. This is to facilitate plotting/processing of ensemble simulations without having to access/manipulate large numbers of large files.

A note on input space. Four simulator inputs are varied: rate factor, slip coefficient, bedrock slope and accumulation. The limits for these parameters are defined in /exports/gpfs/ggrmg/ami/LHS_inputs.txt on BC1. This file is also now in the AMISM1D SVN repository on Source. At the time of writing the ranges are as follows:
 * 'slip_',315569260,315569260000
 * 'A_',3.1e-18,1.7e-16
 * 'dbdx_',0.0007,0.005
 * 'initAcab_',0.2,0.3

The LHS ensemble is being run at dx = 2.4km, dt = 0.2yr resolution. Sample size is 50 (could easily increase this), the same for each GLP. OAT sampling has been added to the ensemble, centred on the mid points given by the parameter ranges above. Sample size is 5 per parameter; resolution and input space are the same as for the LHS ensemble.
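
For reference, a Latin hypercube over these four parameters can be drawn as follows (standard library only; this is an illustrative sketch, not the actual ensemble-generation script):

```python
import random

def latin_hypercube(ranges, n, rng=random):
    """Draw a Latin hypercube sample of size n over the given parameter ranges.

    Each parameter's range is split into n equal strata; each stratum is
    sampled exactly once, and the strata are shuffled independently per
    parameter, so every 1-d projection of the sample is evenly spread.
    """
    columns = {}
    for name, (lo, hi) in ranges.items():
        width = (hi - lo) / n
        strata = list(range(n))
        rng.shuffle(strata)
        columns[name] = [lo + (k + rng.random()) * width for k in strata]
    # transpose the per-parameter columns into a list of n sample points
    return [{name: columns[name][i] for name in ranges} for i in range(n)]

# ranges copied from LHS_inputs.txt above
RANGES = {
    "slip_": (315569260.0, 315569260000.0),
    "A_": (3.1e-18, 1.7e-16),
    "dbdx_": (0.0007, 0.005),
    "initAcab_": (0.2, 0.3),
}
```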

Convergence with resolution simulations have been added to the ensemble (dx = 0.6, 1.2, 2.4, 4.8, 9.6, 19.2km).

In all three ensemble subsets (OAT, LHS, convergence) the exact same sample is used for each GLP (i.e. choice of GLP is not considered a simulator input).

The ensemble is currently running with linear drag law, it might be interesting to repeat (at least some of it) with u^(1/3) drag law.

The ensemble is currently running on BC1.

AMI paper:

Talked to Vicky, gave her access to the paper on svn, both of us to make some minor modifications.

Re-runs are complete and have been transferred to Dartagnan.

Next tasks: re-do the plots, tidy up the new results section, and make the minor modifications suggested by Vicky.

Fri 13th November
Set up this wiki, with Gethin's help.

Setting up an LHS (Latin hypercube sampling) ensemble of experiments to test the new GLPs (grounding line parameterisations) over accumulation/bedrock slope/drag coefficient/rate factor space. Will be run on Blue Crystal: something like 400 simulations at 2km or 4km resolution.

Went to the Met Office (twice!). Learnt a bit about MONSOON, the new computing facility for joint NERC/Met Office ventures. They are keen for us to carry out coupled climate/ice sheet simulations on the machine. It goes live 1st December, but is expected to have a few glitches to be ironed out.

It has 60Tb of fast disk attached, plus 100Tb of slow disk (to be increased). It sits effectively outside the Met Office network, but is connected to MASS-R (their storage system for model output). It is connected to JANET, and users can ssh in from restricted IP addresses (I've given them the Bristol IP address range so we should be ok). There is a dedicated post processing machine with not much more than IDL on it at the moment. We should shout if we need other data processing/analysis packages.