BISMG:RupertG

Rupert Gladstone's wiki page

=Overview=

Working on 1d adaptive mesh ice sheet modelling with the aim of overcoming model inconsistencies in simulating grounding line migration, and on the development and testing of parameterisations for representing the grounding line position at sub-grid resolution. These studies should lead to the submission of two papers: an AMI paper, "Grounding Line Migration in an Adaptive Mesh Ice Sheet Model", and a GLP paper, "Parameterising the grounding line in marine ice sheet models".

Investigating southern ocean warming in output from existing models, including HadCM3 and OCCAM. Further development of coupling code for coupled climate/ice sheet simulations.

Funded on NCEO and JCRP.

=Terminology=

 * AMI - my 1d Adaptive Mesh ice sheet model
 * BC1 and BC2 - Blue Crystal phases 1 and 2 (Bristol Uni supercomputing facility)
 * GCL - glimmer-cism-lanl
 * GLP - Grounding Line Parameterisation
 * JCRP - Joint Climate Research Programme (between Met Office and NERC)
 * LHS - Latin Hypercube Sampling
 * NCEO - National Centre for Earth Observation (supports modelling activities too)
 * OAT - One At a Time sampling

=Weekly updates=

Fri 19th March
Ice2sea conference attendance and presentation on the GLP work.

Fri 12th March
GLP work:

Generated new plots using color to highlight certain GLPs (Pattyn's published one and our best one).

Started preparing .ppt presentation for ice2sea (can also use as basis for EGU).

Ocean coupling:

Struggling with an awkward bug; I'll post more on this later. We have an unsatisfactory workaround and a couple more ideas for fixing it.

Southern ocean warming:

Working with Ian C to extend the analysis code to generate animations of lat/depth sections for all models. The OCCAM arrays are too large for IDL, so a workaround is needed. The OCCAM netcdf files still need their time coords sorting out - will use NCO for this.

Other:

Monsoon: negotiating accounts and UM model versions with Rod Smyth. It doesn't look like accounts for group members not directly involved in or funded by JCRP should be too much of a problem, so long as there is a clear need for them to do collaborative JCRP-related work on Monsoon. A very strong case would be needed to get UM vn4.5 put on Monsoon.

Co-wrote an EPSRC feasibility proposal with Oliver Ray from Computer Science; the outcome is expected by the end of March.

Fri 5th March
on holiday...

Fri 26th Feb
GCM/GCL coupling:

Awkward bug identified in my coupling code. Spent a few entertaining hours with Gethin trying to get this sorted. The code fails at a line like this: if (associated(thing)) deallocate(thing). It passes the 'if associated' check but fails on the deallocate. Note that 'thing' is not directly affected by my code changes, and that this line was present in the old code, which ran ok. Working on it...

Southern ocean warming:

Learning more about UKMO IDL lib. Tweaked OCCAM netcdf files to be more (but not completely) CF compliant. Started generating vertical slices of OCCAM data. Needs merging with JG code at some point.

Other:

Started reviewing a paper for J.Glac.

Started looking at Monsoon with DAGW.

Fri 19th Feb
GLP paper:

Re-runs failed to finish due to issues on BC1. Re-ran the last few, carried out checks, transferred the results to Dartagnan, processed them and generated plots.

Finished writing results and discussion sections, sent out draft.

GCM/GCL coupling:

Coupled code including ocean interface now compiles, currently testing at run time.

Fri 12th Feb
GLP paper:

Re-runs taking longer than expected. Should be finished later today (Friday).

GCM/GCL coupling:

HadCM3/GCL did not compile successfully last week (issues with modset naming). Have now implemented changes to the modset to enable coupling to GCL; it seems to need just a few module name changes.

Have extended the GCL code to allow passing of ocean fields:
 * Enhanced the main derived type to allow an ocean grid and 3d ocean vars (theta and salinity so far).
 * Modified the global grid creation subroutine to allow for the option of 3d grids.
 * Added a new subroutine for initialising the ocean grid and fields.
 * Added a placeholder subroutine for handling the ocean data arrays from the GCM. So far this subroutine merely receives the arrays and prints out a few numbers.
These code changes should not affect existing functionality. They have been checked in to the GCL trunk.

Modified the low level HadCM3 modset routines to pass data to the new GCL subroutines. Currently debugging this.

Fri 5th Feb
GLP paper:

Identified problem with termination criteria. Reset to prescribed very long time (100kyr for advance and 200kyr for retreat simulations) and restarted ensembles on BC1. Re-running LHS and OAT ensembles. Expect to take about a week.

GCM/glimmer-cism coupling:

Updated env vars and default modules on BC2.

Checked out and installed glimmer-cism-lanl (gcl) main trunk.

Compiled HadCM3 with gcl, currently testing at run time.

Passed ocean temp and salin to lowest level in coupling modset (needs mods to gcl now to accept these fields and grid information).

Fri 29th January
Discussions on glimmer-cism developments. See minutes from meeting and BISMG main page.

GLP paper:

Completing 1st draft of plots/results sections.

Fri 22nd January
Southern ocean warming:

Moved OCCAM data to Monsoon (except 1/12th degree monthly, we don't have that as yet).

Set up SVN repository on Monsoon. Imported JG scripts for initial analysis. These are now working (with help from JG).

GLP paper:

Continued making plots and writing up results.

Fri 15th January
Southern ocean warming:

Requested the OCCAM high-res (1/12th degree) data via email and via the data selector. The given email addresses for Beverly de Cuevas and Andrew Coward appear to be broken (perhaps a server at Soton is down?).

GLP paper:

Further failures on BC1 due to other users filling up disk space. BC1 staff have now decreased quotas.

Correction to OAT sample (hadn't updated sample after changing dbdx limit), started running again.

Plotted distribution of LHS sample (just the sample not the outputs), highlighting invalid retreat experiments.

Wrote up description of experimental setup and sampling.

Fri 8th January
Installed R on my PC, started learning about R.

Southern Ocean warming:

Logged on to MONSOON, started transferring OCCAM 1/4 degree files to MONSOON post processing machine. Spoke to JG, received his scripts and a couple of relevant papers. JG to transfer yearly mean ocean temperatures to MONSOON from CMIP3 exps. Me to transfer OCCAM 1/4 and 1/16th deg data there.

GLP work:

Set up further convergence runs: log centre point of input space, also MISMIP exp inputs. Also using non-linear (u^(1/3)) basal drag law.

Generated box plots in R, better showing the distribution of metrics for the GLPs. The gravitational driving stress parameterisation still looks bad and needs looking into.

Christmas
AMI paper was submitted.

GLP work:

Runs completed; created some plots. Strangely, the gravitational driving stress parameterisation appears to cause increased discrepancy between advance and retreat experiments when considering the input space as a whole. This differs from earlier preliminary results with the standard AMI setup (i.e. as in the AMI paper).

Fri 18th December
AMI paper:

Received comments from Andreas and Tony and incorporated these into the paper. Also tidied up the references. Aim to submit this afternoon (Gems was down yesterday, due back today).

GLP paper:

Problems on BC1 - other users writing too much to standard out or to /local on the nodes, causing my jobs to fail. Had to resubmit ~1000 jobs that failed over the weekend. Wrote a Python script for this - it checks the output directories for certain files and resubmits any jobs that failed.
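The resubmission script works roughly along these lines. This is a minimal sketch only: the directory layout, the expected output file names and the use of a PBS-style qsub command are illustrative assumptions, not the actual script.

```python
import os
import subprocess

def find_failed_runs(ensemble_dir, required_files=("steady_state.nc", "run.done")):
    """Return run directories missing any expected output file.

    Directory layout and file names are hypothetical placeholders."""
    failed = []
    for run in sorted(os.listdir(ensemble_dir)):
        run_dir = os.path.join(ensemble_dir, run)
        if not os.path.isdir(run_dir):
            continue
        # A run counts as failed if any expected output file is absent
        if any(not os.path.exists(os.path.join(run_dir, f)) for f in required_files):
            failed.append(run)
    return failed

def resubmit(failed_runs, script_template="run_{0}.sh"):
    """Resubmit each failed job (assumes a PBS-style qsub command)."""
    for run in failed_runs:
        subprocess.call(["qsub", script_template.format(run)])
```

The check-then-resubmit split means the same directory scan can also be reused for plain progress reporting.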

Further problems - the retreat experiments didn't work properly (the initial forcing boost was not sufficient to get out of the region of steady states in many cases). Wrote a script to check the retreat experiments. Modified the setup and prescribed forcing for the retreat experiments: increased the acab boost and boosted the rate factor too, returning gradually to normal forcing over 10000 years after steady state has been achieved.

Note: also wrote a Python script for updating queues - the queue limit is 800 and my ensemble consists of ~7000 runs, so the script checks how many jobs I have queued and fills the queue up to the limit by taking jobs from a list (text file). This could be run using cron, but I'll leave it manual until I get round to putting some error handling in it!
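The queue top-up logic can be sketched as below. Only the 800-job limit comes from the note above; the qstat parsing and the job-list handling are assumptions made for illustration.

```python
import subprocess

QUEUE_LIMIT = 800  # per-user queue limit on BC1 (from the note above)

def count_queued(user):
    """Count this user's queued jobs by parsing qstat output.

    The parsing is a naive placeholder; real qstat formats vary by cluster."""
    out = subprocess.check_output(["qstat", "-u", user]).decode()
    return sum(1 for line in out.splitlines() if user in line)

def fill_queue(pending_jobs, n_queued, limit=QUEUE_LIMIT):
    """Split the pending job list into a batch to submit now and a remainder.

    'pending_jobs' stands in for the text file of not-yet-submitted jobs."""
    n_free = max(0, limit - n_queued)
    return pending_jobs[:n_free], pending_jobs[n_free:]
```

Each invocation submits the returned batch and writes the remainder back to the list file, so repeated (manual or cron) runs drain the ~7000-job backlog.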

The runs all completed, but are currently running again (after modifying retreat experiments). Should all be complete by Tuesday bar the highest res convergence simulations.

Started a very high resolution run (16X higher res than the main LHS and OAT ensemble) which outputs full thickness profiles periodically. Intend to use these for the paper - direct comparison of parameterised thickness profiles against high resolution thickness profile.

Streamlining output file transfer - set up key sharing for passwordless connections from BC1 to Dartagnan. Wrote a Python script using tar and scp to transfer output files to Dartagnan.
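The transfer script amounts to something like the following sketch. The remote host path is illustrative, and the two-step structure (bundle, then copy) is an assumption about how the tar and scp calls are arranged.

```python
import os
import subprocess

def make_tarball(tarball, run_dirs):
    """Bundle completed run directories into one compressed tarball."""
    subprocess.check_call(["tar", "czf", tarball] + list(run_dirs))
    return tarball

def push_to_dartagnan(tarball, remote="dartagnan:/data/ami/"):
    """Copy the tarball across with scp (relies on the passwordless key
    setup described above; the remote path is a placeholder)."""
    subprocess.check_call(["scp", tarball, remote])
    os.remove(tarball)  # remove the local copy once the transfer succeeds
```

Bundling first keeps the number of ssh connections down, which matters when an ensemble produces thousands of small output files.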

Fri 11th December
AMI paper:

Waiting for co-authors' comments; basically ready for submission.

GLP paper:

The description of GLPs is complete.

A significant proportion of the LHS, OAT and convergence ensembles need to be redone. This is because of problems on Blue Crystal caused by another user writing huge amounts of standard out locally and crashing the nodes. Also, having looked at the output from the previous runs some changes are needed.

Instead of linear sampling I will use linear sampling of the log of the rate factor, drag coefficient and bed slope. The reason is that where linear sampling is used for parameters that cover several orders of magnitude, the vast majority of values will be of the highest order of magnitude, whereas in practice we are interested in representing the lower orders of magnitude too.
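The difference can be sketched numerically (this is a generic illustration, not the actual sampling code): with a linear draw over the rate factor range used here (3.1e-18 to 1.7e-16), roughly 96% of values land above 1e-17, i.e. in the top order of magnitude, while a log-space draw spreads values evenly across the decades.

```python
import math
import random

def sample_log_uniform(lo, hi, n, seed=0):
    """Draw n values uniformly in log space between lo and hi.

    Uniform in log10(x) means each order of magnitude gets an equal
    share of the sample, unlike a plain uniform draw on [lo, hi]."""
    rng = random.Random(seed)
    log_lo, log_hi = math.log10(lo), math.log10(hi)
    return [10 ** rng.uniform(log_lo, log_hi) for _ in range(n)]
```

For a two-decade range such as 1e-18 to 1e-16, about half the log-uniform draws fall below 1e-17, exactly as wanted for representing the lower orders of magnitude.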

A very long cutoff time will be used to end simulations in which the grounding line position is oscillating too much for the termination criteria to be reached. Provisionally, 200kyr will be used.

In all cases where GLPs fail (mainly in the linear extrapolation, or in the cases that require solving a cubic), linear interpolation will be used instead, and the number of occurrences will be counted and written out. Previously, checks were in place for most but not all eventualities (e.g. checking for more than one root but not for zero roots, since that never happened in the test case).
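The fallback-and-count logic might look like the following sketch. The GLP call interface, its failure signal, and the flotation-function arguments are invented for illustration; only the fall-back-to-linear-interpolation-and-count behaviour comes from the text.

```python
def linear_interp_gl(x0, x1, f0, f1):
    """Locate the zero crossing of the flotation criterion f between
    grid points x0 and x1 by linear interpolation."""
    return x0 + (x1 - x0) * f0 / (f0 - f1)

class GLPWithFallback:
    """Wrap a grounding line parameterisation so that any failure
    (e.g. zero or multiple usable roots from the cubic) falls back to
    linear interpolation, counting how often this happens."""

    def __init__(self, glp):
        self.glp = glp          # the wrapped GLP callable (placeholder interface)
        self.n_fallbacks = 0    # number of failures, written out at the end

    def __call__(self, x0, x1, f0, f1):
        try:
            return self.glp(x0, x1, f0, f1)
        except ValueError:      # assumed failure signal from the GLP
            self.n_fallbacks += 1
            return linear_interp_gl(x0, x1, f0, f1)
```

Counting the fallbacks makes it possible to report, per GLP, how often the parameterisation itself actually determined the grounding line position.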

The number of simulations can be increased, they are running pretty quickly. The convergence experiments will be increased by one for each GLP and the sample size for the OAT and LHS experiments will be doubled.

These changes have all been implemented and the samples are running.

Fri 4th December
Monsoon:

Attended Monsoon kick off day. I now have access to this machine via the academic network.

AMI paper:

Nearing completion. Vicky completed her contribution. I've made further very minor revisions, spell checking... sent out to co-authors for final comments before submission. Contacted JGR re problems with figure captions and labelling in draft mode, this is now fixed.

GLP runs:

Preliminary results are inconclusive due to heavy bias from a few outliers (either failed runs or steady state criteria cut in too early).

Enhanced the termination criterion: added thickness rate checking as in MISMIP. Should reduce outliers.

Changed the way bed slope is implemented: now 'pivots' near the middle rather than from the left margin, makes very gentle or very steep slopes more stable for our given domain. Should reduce number of failed runs.

Added new GLPs: Pattyn's and a variant on the harmonic mean.

Problems on BC1 have meant that many runs have failed to submit or complete. Contacted hpc, problem should be resolved now, need to resubmit...

Started writing up GLP descriptions.

Fri 27th November
AMI paper:

Moved to sub directory of amis1d repository for sharing with co-authors (well, Vicky anyway!).

The reruns have completed on Blue Crystal with x_tol = 55km (determines patch size). Transferred the output to Dartagnan, tidied up the plotting scripts to give better looking output (hopefully the font size etc. matches JGR requirements), and recreated all the plots. Re-did the thickness profile plot from scratch; hope it's more reader-friendly now.

Implemented minor corrections suggested by Vicky.

Put all the relevant numbers from the reruns into the text, and tidied up the results sections a bit.

Modified the discussion section: quantified convergence and accuracy error a little more formally.

Updated the conclusions.

Fri 20th November
GLP simulations:

Implemented an alternative termination criterion to simply prescribing the simulation length in advance: a cutoff for the rate of change of grounding line position can be prescribed instead. Note that in retreat simulations, where an initially higher forcing is prescribed, this cutoff is used twice: first to determine completion of the first (high forcing) phase, and second to terminate the simulation. A hard coded minimum period (provisionally 5000 years, though this could change) for each phase is imposed, in case of a slow initial response or a slow response to the forcing change.
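The criterion can be sketched as follows. The function interface and the rate tolerance value are placeholders; only the rate-of-change cutoff and the 5000-year minimum phase length come from the text.

```python
def should_terminate(gl_positions, dt, t, rate_tol=1e-3, min_time=5000.0):
    """Phase termination check for the grounding line simulations.

    gl_positions: recent history of grounding line position (one entry
    per timestep); dt: timestep length in years; t: time elapsed in the
    current phase. rate_tol is a placeholder value, min_time is the
    provisional 5000-year minimum per phase."""
    if t < min_time:
        return False  # enforce the minimum phase length
    if len(gl_positions) < 2:
        return False  # not enough history to estimate a rate
    rate = abs(gl_positions[-1] - gl_positions[-2]) / dt
    return rate < rate_tol
```

In a retreat simulation the same check would be applied twice, with the phase clock t reset to zero when the high-forcing phase ends.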

Modified the code for calculating the semi-analytic solution [Schoof 2007] to run at 100 times higher resolution. This is because we are now including the Schoof solution in the model outputs and therefore want high accuracy. The Schoof solution is included in the outputs because of the LHS sampling - it was previously the same for all simulations but now it will vary.

Some of the outputs were cancelled, and key outputs are now written out separately: the Schoof solution and the simulator solution for steady state grounding line position. This is to facilitate plotting/processing of ensemble simulations without having to access/manipulate large numbers of large files.

A note on input space. 4 simulator inputs are varied: rate factor, slip coefficient, bedrock slope and accumulation. The limits for these parameters are defined in /exports/gpfs/ggrmg/ami/LHS_inputs.txt on BC1. This file is also now in the AMISM1D SVN repository on Source. At time of writing the ranges are as follows:
 * 'slip_',315569260,315569260000
 * 'A_',3.1e-18,1.7e-16
 * 'dbdx_',0.0007,0.005
 * 'initAcab_',0.2,0.3

The LHS ensemble is being run at dx = 2.4km, dt = 0.2yr resolution. Sample size is 50 (could easily increase this), the same for each GLP. OAT sampling has been added to the ensemble, centred on the mid points given by the parameter ranges above. Sample size is 5 per parameter; resolution and input space are the same as for the LHS ensemble.
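For reference, a minimal Latin hypercube sampler over the four ranges listed above might look like this. This is a sketch of the standard LHS technique with the ranges copied from the note, not the actual ensemble code.

```python
import random

def latin_hypercube(ranges, n, seed=0):
    """Latin hypercube sample of n points over the given parameter ranges.

    Each parameter's range is split into n equal strata; one value is
    drawn per stratum and the strata are shuffled independently per
    parameter, so every 1-d projection covers the range evenly."""
    rng = random.Random(seed)
    sample = {}
    for name, (lo, hi) in ranges.items():
        strata = list(range(n))
        rng.shuffle(strata)
        width = (hi - lo) / n
        sample[name] = [lo + (s + rng.random()) * width for s in strata]
    return sample

# The four ranges from the note above (linear, as used for this ensemble)
RANGES = {
    "slip_": (315569260.0, 315569260000.0),
    "A_": (3.1e-18, 1.7e-16),
    "dbdx_": (0.0007, 0.005),
    "initAcab_": (0.2, 0.3),
}
```

Because the stratum permutations are independent per parameter, the n points fill the 4-d space far more evenly than n independent uniform draws would for the same sample size.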

Convergence with resolution simulations have been added to the ensemble (dx = 0.6, 1.2, 2.4, 4.8, 9.6, 19.2km).

In all three ensemble subsets (OAT, LHS, convergence) the exact same sample is used for each GLP (i.e. choice of GLP is not considered a simulator input).

The ensemble is currently running with linear drag law, it might be interesting to repeat (at least some of it) with u^(1/3) drag law.

The ensemble is currently running on BC1.

AMI paper:

Talked to Vicky, gave her access to the paper on svn, both of us to make some minor modifications.

Re-runs are complete and have been transferred to Dartagnan.

Next tasks: re-do the plots, tidy up the new results section, and make the minor modifications suggested by Vicky.

Fri 13th November
Set up this wiki, with Gethin's help.

Setting up LHS (latin hypercube sampling) ensemble of experiments to test the new GLPs (grounding line parameterisations) over accumulation/bedrock slope/drag coefficient/rate factor space. Will be run on Blue Crystal, something like 400 simulations at 2km or 4km resolution.

Went to Met Office (twice!). Learnt a bit about MONSOON, new computing facility for joint NERC/Met office ventures. They are keen for us to carry out coupled climate/ice sheet simulations on the machine. Goes live 1st December, but is expected to have a few glitches to be ironed out. Has 60Tb fast disk attached, plus 100Tb slow disk (to be increased). Sits effectively outside the Met office network, but is connected to MASS-R (their storage system for model output). Is connected to JANET, users can ssh in from restricted IP addresses (I've given them the Bristol IP address range so we should be ok). There is a dedicated post processing machine with not much more than IDL on it at the moment. We should shout if we need other data processing/analysis packages.