Alexandru T. Codilean, Henry Munack and their colleagues have just released their **Open and Global Database of Cosmogenic Radionuclide and Luminescence** measurements in fluvial sediment (**OCTOPUS**). This is a great, openly accessible data resource!

The cosmogenic radionuclide (CRN) part of the database consists of Be-10 and Al-26 measurements in fluvial sediment samples along with ancillary geospatial vector and raster layers, including sample sites, basin outlines, digital elevation models, gradient rasters, flow direction and flow accumulation rasters, atmospheric pressure rasters, and CRN production scaling and topographic shielding factor rasters. The database also includes comprehensive metadata and all necessary information and input files for recalculating denudation rates using CAIRN, an open-source program for calculating basin-wide denudation rates from Be-10 and Al-26 data.

The luminescence part of the database consists of thermoluminescence (TL) and optically stimulated luminescence (OSL) measurements in fluvial sediment samples from stratigraphic sections and sediment cores from across the Australian continent and includes ancillary vector and raster geospatial data.

OCTOPUS can be accessed at: https://earth.uow.edu.au

The developers of OCTOPUS have also submitted a manuscript describing the database in detail to the open-access journal **Earth System Science Data (Discussions)**. The paper is now accessible and open for interactive public discussion until 1 May 2018 at:

https://www.earth-syst-sci-data-discuss.net/essd-2018-32/

You are invited to download the data and take part in the discussion.

Here is an example application of this function. I’ll use a DEM that covers one of the world’s steepest topographic gradients, from the Andes down to the Peru-Chile deep-sea trench.

Before you run the function, choose a colormap. Calling ttcmap without input arguments displays a table of the available colormaps and their respective elevation ranges.

ttcmap

    Colormap_Name    Min_Elevation    Max_Elevation
    _____________    _____________    _____________
    'gmtrelief'          -8000            7000
    'france'             -5000            4000
    'mby'                -8000            4000
    'gmtglobe'          -10000            9500

Now here are three colormaps that span the elevation range in our DEM. Note that ttcmap outputs the array zlimits, which is required as input to imageschs so that the land-sea boundary is exactly at zero. ttcmap is similar to the Mapping Toolbox function demcmap, but it sets the land-sea transition more precisely to the zero value.

[cmap,zlimits] = ttcmap(DEM,'cmap','gmtrelief');
imageschs(DEM,[],'caxis',zlimits,'colormap',cmap)

[cmap,zlimits] = ttcmap(DEM,'cmap','mby');
imageschs(DEM,[],'caxis',zlimits,'colormap',cmap)

[cmap,zlimits] = ttcmap(DEM,'cmap','gmtglobe');
imageschs(DEM,[],'caxis',zlimits,'colormap',cmap)

I downloaded the colormaps from cpt-city, a great resource for color schemes of all kinds.

Type

doc

in the command window. MATLAB’s documentation browser will open, and the landing page will show a new entry in the section Supplemental Software. The link takes you to the TopoToolbox documentation.

The documentation features a Getting Started section, numerous user guides, and a function reference (which, however, does not yet link to the individual function help pages).

Moreover, the documentation is now searchable, which will hopefully make it much easier to find the right tools.

The documentation is not yet complete. We are still working on integrating the help sections of each function into the documentation, which will make the functionality even more accessible and browsable.

**** addendum January 15, 2018 ****

After permanently setting the paths, you might need to restart MATLAB in order to be able to view the documentation in the help browser.
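For reference, here is a minimal sketch of permanently setting the paths; the folder name is a placeholder for wherever TopoToolbox lives on your machine:

```matlab
% Add TopoToolbox and all its subfolders to the search path
% (the path below is a placeholder -- adjust it to your installation)
addpath(genpath('C:\MATLAB\topotoolbox'));

% Save the search path so it persists across MATLAB sessions
savepath
```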

To answer this question, I took monthly visitor counts over the last two years (unfortunately, I can only access the last two years). I saved them in an Excel file, which I then imported into MATLAB. My aim was to run a regression analysis to identify possible trends. Since the dependent data are counts, a standard least-squares regression is clearly not advisable. Rather, count data are modelled using Poisson statistics. I thus chose a generalized linear regression model with the counts as a Poisson response and time as the independent variable. I used ‘log’ as the link function, so that the response variable is Y ~ Poisson(exp(b0 + b1*X)).

The function fitglm (which is part of the Statistics and Machine Learning Toolbox) is perfectly suited to solve this regression problem. However, the function takes a frequentist approach, which I didn’t want to follow here. Rather, I wanted to learn how to tackle this problem with Bayesian inference. Fortunately, MATLAB increasingly features tools for Bayesian analysis, such as the Hamiltonian Monte Carlo (HMC) sampler. The only thing I needed was some guidance on how to derive the prior distributions and how to calculate the posterior probability density function. For this, I found this paper by Doss and Narasimhan quite useful.
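For comparison, the frequentist fit is essentially a one-liner with fitglm. Here is a minimal sketch with simulated stand-in data (the actual monthly counts are not reproduced here):

```matlab
% Simulated stand-in data: 24 monthly visitor counts with an upward trend
X = (1:24)';
y = poissrnd(exp(4 + 0.05*X));

% Poisson regression; 'log' is the default link for the Poisson distribution
mdl = fitglm(X,y,'linear','Distribution','poisson');
disp(mdl.Coefficients)
```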

Writing tools for Bayesian analysis is not easy, and the code is quite difficult to debug. But finally I managed to get my analysis working and to produce the following figure:

Clearly, the posterior distribution of beta_1, the slope of the regression, does not include zero, so I can assume with high certainty that we truly observe a trend toward higher visitor numbers. Can this finding be extrapolated? Well, I don’t know. Just keep visiting this blog frequently, and we’ll see.

If you are interested in the technical details, here is the function:

function bayesianpoissonregress(t,y)
%BAYESIANPOISSONREGRESS Bayesian Poisson regression
%
% t --> datetime vector (equally spaced)
% y --> vector with counts

t = t(:);
y = y(:);

% Numeric month indices
X = (1:size(t,1))';
% add offset to design matrix
X = [ones(size(X,1),1) X];

% link function
gamma = @(beta) exp(beta(:)'*X')';

% Prior distributions of the parameters are gaussian
a = [0 0];   % prior means
b = [20 20]; % prior std
d = a./b + sum(X.*y);

%% MCMC sampling
hmc = hmcSampler(@(beta) logpdf(beta),[0 0],'UseNumericalGradient',true);
% Estimate max. a posteriori
map = estimateMAP(hmc);
% tune sampler
hmc = tuneSampler(hmc,'StartPoint',map);
% draw chain
chain = drawSamples(hmc,'StartPoint',map);

%% Plot results
figure
subplot(1,2,1)
[~,ax] = plotmatrix(chain);
ylabel(ax(1,1),'\beta_0');
ylabel(ax(2,1),'\beta_1');
xlabel(ax(2,1),'\beta_0');
xlabel(ax(2,2),'\beta_1');

subplot(1,2,2)
% random set of parameters from chain
sam = randperm(size(chain,1),100);
sam = chain(sam,:);
hold on
for r = 1:size(sam,1)
    chat = gamma(sam(r,:));
    plot(t,chat,'color',[.7 .7 .7]);
end
plot(t,y,'--s')
ylabel('Visitors')
box on
% -- end of function

%% Log pdf
function p = logpdf(beta)
    % posterior distribution according to
    % http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.39.9684&rep=rep1&type=pdf
    % (Eq. 2.2)
    p = - sum(1./(b.^2) .* beta.^2) + sum(d .* beta) - sum(gamma(beta));
end
end

Before this year ends, here is an announcement for an event next year: The **Central European Conference on Geomorphology and Quaternary Sciences** will take place in Giessen in September 2018. The event is jointly organized by the German Working Group on Geomorphology (AK Geomorphologie) and the German Quaternary Association (DEUQUA – Deutsche Quartärvereinigung). Here are the key dates:

| Date | |
|---|---|
| January 8, 2018 | Start of Registration |
| June 15, 2018 | End of Early Registration |
| July 6, 2018 | Deadline for Abstract Submission |
| September 14, 2018 | End of Late Registration Period |
| September 23-27, 2018 | Conference |

You’ll find more information >>here<<.

One difficulty with Tanaka’s method is that coloring the contour lines requires knowing whether higher values lie to the left or the right of a contour line. GDAL’s contouring algorithm behaves consistently in this respect; MATLAB’s doesn’t. My trick is to calculate surface normals and their angle to the light source, and then interpolate the gridded values to the vertex locations of the contour lines.

Now here is my approach which uses some undocumented features of the HG2 graphics. An excellent resource for undocumented features in MATLAB is Yair Altman’s blog.

function tanakacontour(DEM,nlevels)
%TANAKACONTOUR Relief depiction using Tanaka contours
%
% Syntax
%
%     tanakacontour(DEM,nlevels)
%
% See also: contourf, hillshade
%
% Author: Wolfgang Schwanghart (w.schwanghart[at]geo.uni-potsdam.de)
% Date: 8. December, 2017

if nargin == 1; nlevels = 10; end

% calculate solar vector
azimuth   = 315;
azid      = azimuth-90;
azisource = azid/180*pi;
[sx,sy,~] = sph2cart(azisource,0.1,1);

% get coordinate matrices and z values
[Z,X,Y] = GRIDobj2mat(DEM);

% calculate surface normals and illumination values
[Nx,Ny] = surfnorm(Z);
N  = hypot(Nx,Ny);
Nx = Nx./N;
Ny = Ny./N;
H   = GRIDobj(DEM);
H.Z = Nx*sx + Ny*sy;

% draw the filled contours
[~,h] = contourf(X,Y,Z,nlevels);
axis image
drawnow

% colormap
cmap = gray(255)*255;
cval = linspace(0,1,size(cmap,1))';

% the undocumented part
for r = 1:numel(h.EdgePrims)
    % the EdgePrims property contains vertex data
    x = h.EdgePrims(r).VertexData(1,:);
    y = h.EdgePrims(r).VertexData(2,:);
    % interpolate hillshade grayscale values to contour vertices
    c = interp(H,double(x(:)),double(y(:)));
    % interpolate to colormap
    clr = interp1(cval,cmap,c);
    % adjust color data to conform with the EdgePrims ColorData property
    clr = clr';
    clr = uint8([clr;repmat(255,1,size(clr,2))]);
    % set all properties at once
    set(h.EdgePrims(r),'ColorBinding','interpolated','ColorData',clr,'LineWidth',1.2);
end

Ok, now let’s try this on some data. In this code, I have used GRIDobj, the class for gridded, georeferenced data.

[X,Y,Z] = peaks(500);
tanakacontour(GRIDobj(X,Y,Z),10)

And now for the Big Tujunga DEM:

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
tanakacontour(DEM,5)

Now that looks OK, but to be honest I am not entirely convinced by these unrealistic-looking terraces. Moreover, my approach so far doesn’t vary the line widths. Contour lines should actually be thicker on slopes that face directly toward or away from the light source. However, I haven’t figured out how to do this yet. Any ideas?
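One untested idea: inside the loop over h.EdgePrims above, scale LineWidth by the mean absolute illumination along each contour segment, since c already holds the interpolated illumination values. A sketch:

```matlab
% Sketch (untested): thicker lines where the contour faces toward or away
% from the light source, i.e. where the magnitude of the illumination
% value is large. c is the vector of values interpolated to the contour
% vertices in the loop above.
lw = 0.5 + 2*mean(abs(c),'omitnan');
set(h.EdgePrims(r),'LineWidth',lw);
```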

Our paper on uncertainty quantification and smoothing of longitudinal river profiles has now been published in Earth Surface Dynamics. The paper describes a new approach to hydrologically correct and smooth profiles using quantile regression. Smoothing is based on curvature regularization with the constraint that elevation must decrease with increasing flow distance. We thus refer to this technique as constrained regularized smoothing (CRS). We compare CRS-derived profiles to profiles obtained from the common methods of filling and carving, and show that CRS outperforms these methods in terms of accuracy and precision.

Check out the new TopoToolbox functions that accompany the paper:

- STREAMobj/crs
- STREAMobj/crslin
- STREAMobj/crsapp
- STREAMobj/smooth
- STREAMobj/quantcarve
- FLOWobj/quantcarve

Applications of these functions were already described in the posts on An alternative to along-river swath profiles, Steepness derived from smoothed river profiles, Smooth operator….

**References**

Schwanghart, W., Scherler, D., 2017. Bumps in river profiles: uncertainty assessment and smoothing using quantile regression techniques. *Earth Surface Dynamics*, 5, 821-839. [DOI: 10.5194/esurf-5-821-2017]


Output from LEMs is multidimensional. In fact, it is more than four-dimensional: three dimensions in space, one in time, and several variables that change through time and space. This makes comparing and visualizing LEM output quite difficult. In this hackathon we will attempt to find solutions to this challenge. We will organize in teams to develop ideas and to rapidly prototype software.

Interested? Then apply here. We have only 15 seats, and the application deadline is Wednesday, December 20, 2017.

The last weeks have been quite busy as we finished version 2.2, which had been available as a prerelease for a long while. TopoToolbox users who keep their software constantly updated (for example, by using Git) won’t see many changes. Those who have not kept pace with the frequent commits to our repository are encouraged to **do so now**. There are a lot of new functions and modifications. Benjamin Campforts added TTLEM, the TopoToolbox Landscape Evolution Model. The scope of functions for working with river networks (STREAMobj) has increased tremendously, with new plotting functions, low-level chi-analysis tools, and tools for geometric modifications. We added new functions to hydrologically correct and smooth river networks and values measured along them (e.g., constrained regularized smoothing (CRS)). TopoToolbox now supports multiple flow directions, and there are several new functions for working with grids (GRIDobj). In addition, we consolidated the help sections in each function and increased compatibility with older MATLAB versions. Please see the readme file for a complete overview of changes.

With version 2.2, we offer TopoToolbox as a MATLAB^{®} toolbox file (mltbx file). This file makes installation very easy: simply download it, double-click it, and follow the instructions.
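Alternatively, the toolbox file can be installed programmatically; here is a sketch, assuming the downloaded file is named TopoToolbox.mltbx:

```matlab
% Install the downloaded toolbox file from the command line
% (the file name is a placeholder for the actual release file)
tbx = matlab.addons.toolbox.installToolbox('TopoToolbox.mltbx');
disp(tbx.Name)
```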

Dirk and I met this morning in the train (here we are!) …

… and discussed possible directions for a next version. The number of functions has increased a lot, which carries the risk that TopoToolbox might become confusing and even off-putting, particularly for new users. Simply adding new functionality is thus not the way forward. Instead, we decided that a new version should have better documentation, integrated into MATLAB’s documentation browser. To quote John D’Errico, a long-time and excellent contributor of MATLAB code: *Your job as a programmer does not stop when you write the last line of code. If you think so, then you should be fired. You should document your code. Provide help. Otherwise, that code is just a bunch of random bits, useful to nobody else in the world*.

With this in mind, let’s go for 2.3.

Himalayan history is rich with destructive earthquakes. In the last century, ground shaking, collapsing houses, and landslides in the wake of earthquakes killed tens of thousands of people, wreaking havoc on the Himalayan nations. The 2015 Gorkha earthquake was the latest in a series of severe earthquakes to hit Nepal.

Seismic hazard analysis in the Himalayas is based on few instrumental records and a paleoseismic record extending back ~1000 years. Paleoseismology largely relies on rupture histories derived from fault trenches, written accounts, and liquefaction features. Other records derived from e.g. lake sediments are scarce.

In a paper now published in Quaternary Science Reviews, Amelie Stolle et al. document our research in the Pokhara Valley of Nepal. The valley was massively and repeatedly aggraded with several cubic kilometers of debris in the wake of medieval earthquakes in the region. The paper builds on our 2016 paper in Science, offering new radiocarbon dates and detailing the sedimentology of the valley fills. Based on our findings, we argue that valley fills in the Himalayas may add substantial evidence of past earthquakes to the current portfolio of paleoseismological records.

**References**

Stolle, A., Bernhardt, A., Schwanghart, W., Hoelzmann, P., Adhikari, B.R., Fort, M., Korup, O., 2017. Catastrophic valley fills record large Himalayan earthquakes, Pokhara, Nepal. Quaternary Science Reviews, 177, 88-103. [DOI: 10.1016/j.quascirev.2017.10.015]. **<< free link to paper until December 26, 2017 >>**