Better maps with TopoToolbox

Posted on

I am increasingly using MATLAB to create maps, because I find the workflow that combines TopoToolbox with ArcGIS or QGIS cumbersome. This is among the reasons why I implemented a new function in TopoToolbox that helps you color DEMs, particularly those that include both topography and bathymetry. The function is ttcmap and you’ll find it in the colormaps folder.

Here is an example application of this function. I’ll use a DEM that covers one of the world’s steepest topographic gradients from the Andes down to the Peru-Chile deep-sea trench.

Before you run the function, choose a colormap. Calling ttcmap without input arguments displays a table that shows the different colormaps and their respective elevation ranges.


    Colormap_Name    Min_Elevation    Max_Elevation
    _____________    _____________    _____________

    'gmtrelief'       -8000           7000         
    'france'          -5000           4000         
    'mby'             -8000           4000         
    'gmtglobe'       -10000           9500    

Now here are three colormaps that span the elevation range of our DEM. Note that ttcmap also returns the array zlimits, which is required as input to imageschs so that the land-sea boundary falls exactly at zero. ttcmap is similar to the Mapping Toolbox function demcmap, but it sets the land-sea transition to the zero value more precisely.

[cmap,zlimits] = ttcmap(DEM,'cmap','gmtrelief');
[cmap,zlimits] = ttcmap(DEM,'cmap','mby');
[cmap,zlimits] = ttcmap(DEM,'cmap','gmtglobe');
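To display the DEM with one of these colormaps, the zlimits output can be passed on to imageschs (a sketch; the 'colormap' and 'caxis' option names are assumptions based on the current imageschs interface):

```matlab
% color the hillshaded DEM so that the land-sea boundary sits at zero
[cmap,zlimits] = ttcmap(DEM,'cmap','gmtrelief');
imageschs(DEM,DEM,'colormap',cmap,'caxis',zlimits);
```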

I downloaded the colormaps from cpt-city, a great resource for colormaps.


TopoToolbox documentation integrated in MATLAB help browser


TopoToolbox version 2.3 will have improved documentation. With the latest commits to our GitHub repository, you can now take a glance at our work in progress. Just download it, follow the instructions in the readme file, and then type

doc

in the command window. MATLAB’s documentation browser will pop up, and the landing page will have a new entry in the section Supplemental Software. The link will take you to the TopoToolbox documentation.

The MATLAB documentation landing page

The documentation features a Getting Started section, numerous guides, and a function reference (which, however, does not yet link to the individual function help pages).

The TopoToolbox documentation

Moreover, you can now search the documentation which will hopefully make it much easier to find the right tools.
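For reference, MATLAB picks up supplemental documentation via an info.xml file on the search path and a search database created with builddocsearchdb; the setup in the readme boils down to something like this (the folder name below is hypothetical):

```matlab
% register the search database for the custom documentation
% (run once, after adding TopoToolbox to the path)
folder = 'topotoolbox/mdoc/html';   % hypothetical location of the generated help files
builddocsearchdb(folder)            % creates the helpsearch database for the browser
```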

Search the documentation for the term ‘ksn’.

The documentation is not yet complete. We are still working on including the help sections of each function into the documentation, which will make the functionality even more accessible and browsable.

**** addendum January 15, 2018 ****

After permanently setting the paths, you might need to restart MATLAB in order to be able to view the documentation in the help browser.

Bayesian analysis of blog usage


The TopoToolbox blog has now been online since October 2014 and has had about 22k visitors since then. With a total of 8.7k, 2017 was the year with the most visitors. Yet, the monthly statistics show considerable variability over the years. The blog tends to have fewer visitors in December and during the summer months of July and August than during the rest of the year. In addition, there are strong fluctuations between months with many posts and months when I had no time to post. This variability is superimposed on the general trend. We may thus ask whether we truly observe an increase in visitor counts.

To answer this question, I took the monthly visitor counts of the last two years (unfortunately, I can only access the last two years). I saved them in an Excel file, which I then imported into MATLAB. My aim was to run a regression analysis to identify possible trends. Since the dependent data are counts, a standard least-squares regression is clearly not advisable. Rather, count data are modelled using Poisson statistics. I thus chose a generalized linear regression model with the counts as a Poisson response and time as the independent variable. I used 'log' as the link function, so that the response variable is Y ~ Poisson(exp(b1 + b2*X)).

The function fitglm (part of the Statistics and Machine Learning Toolbox) is perfectly suited to solving this regression problem. However, it takes a frequentist approach, which I didn’t want to follow here. Rather, I wanted to learn how to tackle the problem with Bayesian inference. Fortunately, MATLAB increasingly features tools for Bayesian analysis, such as the Hamiltonian Monte Carlo (HMC) sampler. The only thing I needed was some guidance on how to derive the prior distributions and how to calculate the posterior probability density function. For this, I found this paper by Doss and Narasimhan quite useful.
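For comparison, the frequentist version of this model is nearly a one-liner with fitglm (a sketch, assuming y holds the monthly counts):

```matlab
% Poisson GLM with log link: y ~ Poisson(exp(b1 + b2*X))
X   = (1:numel(y))';                     % month index as predictor
mdl = fitglm(X,y,'linear','Distribution','poisson','Link','log');
disp(mdl.Coefficients)                   % b1 (intercept) and b2 (slope)
```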

Writing tools for Bayesian analysis is not easy, and the code is quite difficult to debug. But finally, I managed to get my analysis working and to produce the following figure:

Bayesian analysis of monthly blog statistics.

Clearly, the posterior distribution of b2, the slope of the regression, does not include zero, so I can assume with high certainty that we truly observe a trend towards higher visitor numbers. Can this finding be extrapolated? Well, I don’t know. Just keep on visiting this blog frequently, and we’ll see.

If you are interested in the technical details, here is the function:

function bayesianpoissonregress(t,y)
% Bayesian Poisson regression of counts y against time t

% t --> datetime vector (equally spaced)
% y --> vector with counts

t = t(:);
y = y(:);

% Numeric month indices
X = (1:size(t,1))';
% add offset (intercept) column to the design matrix
X = [ones(size(X,1),1) X];

% inverse link function: expected counts for parameters beta
gamma = @(beta) exp(beta(:)'*X')';

% Prior distributions of the parameters are gaussian
a = [0 0]; % prior means
b = [20 20]; % prior standard deviations

% data-dependent term of the log posterior
d = a./(b.^2) + sum(X.*y);

%% MCMC sampling
hmc = hmcSampler(@(beta) logpdf(beta),[0;0],'UseNumericalGradient',true);
% Estimate maximum a posteriori
map = estimateMAP(hmc);
% tune sampler
hmc = tuneSampler(hmc,'StartPoint',map);
% draw chain
chain = drawSamples(hmc,'StartPoint',map);

%% Plot results
plotmatrix(chain);

% random set of parameters from the chain
sam = randperm(size(chain,1),100);
sam = chain(sam,:);

% plot the sampled rate curves together with the data in a new figure
figure
hold on
for r = 1:size(sam,1)
    chat = gamma(sam(r,:));
    plot(t,chat,'color',[.7 .7 .7]);
end
plot(t,y,'k.')
box on
hold off

%% Log pdf (nested function; shares X, b, d and gamma with the main function)
function p = logpdf(beta)
% log posterior according to Doss and Narasimhan (Eq. 2.2)

beta = beta(:)';
p = - sum(beta.^2./(2*b.^2)) + sum(d .* beta) - sum(gamma(beta));
end
end

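A minimal call could look like this (synthetic counts, just to illustrate the expected inputs):

```matlab
% 24 monthly time stamps and synthetic, slightly increasing counts
t = datetime(2016,1,1) + calmonths(0:23);
y = poissrnd(400*exp(0.02*(1:24)));      % hypothetical visitor numbers
bayesianpoissonregress(t,y)
```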

Central European Conference on Geomorphology and Quaternary Sciences 2018



Before this year ends, here is an announcement for an event next year: The Central European Conference on Geomorphology and Quaternary Sciences will take place in Giessen in September 2018. The event is jointly organized by the German Working Group on Geomorphology (AK Geomorphologie) and the German Quaternary Association (DEUQUA – Deutsche Quartärvereinigung). Here are the key dates:

January 8th: Start of Registration
June 15th: End of Early Registration
July 6th: Deadline for Abstract Submission
September 14th: End of Late Registration
September 23–27: Conference

You’ll find more information >>here<<.

Tanaka contours – a MATLAB solution


Today, I read about the Tanaka method, an approach to depicting relief. The method relies on filled contours and colors the contour lines according to their direction and the origin of the light. I have seen solutions in QGIS (here and here), which made me wonder whether the Tanaka method can also be implemented in MATLAB.

One difficulty with Tanaka’s method is that coloring the contour lines requires knowing whether higher values lie to the left or to the right of a contour line. GDAL’s contouring algorithm behaves consistently in this respect; MATLAB’s doesn’t. My trick is to calculate the surface normals and their angle to the light source, and then interpolate these gridded illumination values to the vertex locations of the contour lines.

Now here is my approach which uses some undocumented features of the HG2 graphics. An excellent resource for undocumented features in MATLAB is Yair Altman’s blog.

function tanakacontour(DEM,nlevels)

%TANAKACONTOUR Relief depiction using Tanaka contours
% Syntax
%     tanakacontour(DEM,nlevels)
% See also: contourf, hillshade
% Author: Wolfgang Schwanghart (w.schwanghart[at]
% Date: 8. December, 2017

if nargin == 1
    nlevels = 10;
end

% calculate solar vector
azimuth = 315;
azid    = azimuth-90;
azisource = azid/180*pi;
[sx,sy,~] = sph2cart(azisource,0.1,1);

% get coordinate matrices and z values
[Z,X,Y] = GRIDobj2mat(DEM);

% calculate surface normals (horizontal components, normalized)
[Nx,Ny] = surfnorm(Z);
N  = hypot(Nx,Ny);
Nx = Nx./N;
Ny = Ny./N;

% illumination: dot product of surface normals and solar vector
H = GRIDobj(DEM);
H.Z = Nx*sx + Ny*sy;

% draw filled contours
[~,h] = contourf(X,Y,Z,nlevels);
axis image

% colormap
cmap = gray(255)*255;
cval = linspace(-1,1,size(cmap,1))';   % illumination values range from -1 to 1

% the undocumented part
drawnow   % make sure the contour primitives have been created
for r = 1:numel(h.EdgePrims)
    % the EdgePrims property contains vertex data
    x = h.EdgePrims(r).VertexData(1,:);
    y = h.EdgePrims(r).VertexData(2,:);
    % interpolate illumination values to contour vertices
    c = interp(H,double(x(:)),double(y(:)));
    c(isnan(c)) = 0;   % flat areas have undefined aspect
    % interpolate to colormap
    clr = interp1(cval,cmap,c);
    % adjust color data to conform with the ColorData property of EdgePrims
    clr = clr';
    clr = uint8([clr;repmat(255,1,size(clr,2))]);
    % set all properties at once
    set(h.EdgePrims(r),'ColorBinding','interpolated','ColorData',clr);
end
end

Ok, now let’s try this on some data. In this code, I have used GRIDobj, the class for gridded, georeferenced data.

[X,Y,Z] = peaks(500);
DEM = GRIDobj(X,Y,Z);
tanakacontour(DEM)

And now for the Big Tujunga DEM:

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
tanakacontour(DEM)

Now that looks ok, but to be honest, I am not entirely convinced by these unrealistic-looking terraces. Moreover, my approach doesn’t yet allow for varying line widths. Contour lines should actually be thicker on slopes that face directly towards or away from the light source. However, I haven’t figured out how to do this yet. Any ideas?

Paper out now: Bumps in river profiles


Different preprocessing techniques and their effect on the longitudinal river profile (Figure 1 from Schwanghart and Scherler (2017)).

Our paper on uncertainty quantification and smoothing of longitudinal river profiles has now been published in Earth Surface Dynamics. The paper describes a new approach to hydrologically correct and smooth profiles using quantile regression. Smoothing is based on curvature regularization with the constraint that elevation must decrease with increasing flow distance. We thus refer to this technique as constrained regularized smoothing (CRS). We compare CRS-derived profiles to profiles obtained from the common methods of filling and carving, and show that CRS outperforms these methods in terms of accuracy and precision.

Check out the new TopoToolbox functions that accompany the paper:

  • STREAMobj/crs
  • STREAMobj/crslin
  • STREAMobj/crsapp
  • STREAMobj/smooth
  • STREAMobj/quantcarve
  • FLOWobj/quantcarve
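As a quick taste, a CRS-smoothed profile can be computed and compared with the raw DEM elevations like this (a sketch; the file name and the values of K and tau are assumptions):

```matlab
DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');   % example DEM
FD  = FLOWobj(DEM,'preprocess','carve');
S   = klargestconncomps(STREAMobj(FD,'minarea',1000));
zs  = crs(S,DEM,'K',2,'tau',0.5);   % smoothness K and quantile tau
plotdz(S,DEM); hold on               % raw profile
plotdz(S,zs);  hold off              % CRS-smoothed profile
```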

Applications of these functions have already been described in the posts An alternative to along-river swath profiles, Steepness derived from smoothed river profiles, and Smooth operator….


Schwanghart, W., Scherler, D., 2017. Bumps in river profiles: uncertainty assessment and smoothing using quantile regression techniques. Earth Surface Dynamics, 5, 821-839. [DOI: 10.5194/esurf-5-821-2017]


Geo.X Hackathon


Are you based in Potsdam or Berlin at an institution of the Geo.X network? Are you a postdoc or PhD student? Are you keen on GIS, software development, and programming? If so, you might like to participate in the hackathon on visualizing the output of landscape evolution models (LEMs).

Output from LEMs is multidimensional. In fact, it has more than four dimensions: three in space, one in time, and several variables that change through time and space. This makes comparing and visualizing the output of LEMs quite difficult. In this hackathon, we will attempt to find solutions to this challenge. We will organize into teams to develop solutions and to rapidly prototype software.

Interested? Then apply here. We have only 15 seats, and the application deadline is Wednesday, December 20, 2017.