```matlab
doc
```

in the command window. MATLAB’s documentation browser will pop up and the landing page will have a new entry in the section Supplemental Software. The link will bring you to the TopoToolbox documentation.

The documentation features a Getting Started section, numerous guides, and a function reference (which, however, does not yet link to the help text of the individual functions).

Moreover, you can now search the documentation which will hopefully make it much easier to find the right tools.

The documentation is not yet complete. We are still working on including the help sections of each function into the documentation, which will make the functionality even more accessible and browsable.

**** addendum January 15, 2018 ****

After permanently setting the paths, you might need to restart MATLAB in order to be able to view the documentation in the help browser.

To answer this question, I took monthly visitor counts for the last two years (unfortunately, I can only access the last two years). I saved them in an Excel file which I then imported into MATLAB. My aim was to run a regression analysis to identify possible trends. Since the dependent data are counts, a standard least squares regression is clearly not advisable. Rather, count data are modelled using Poisson statistics. I thus chose a generalized linear regression model with the counts as a Poisson response and time as the independent variable. I used ‘log’ as the link function, so that the response variable is Y ~ Poisson(exp(b1 + b2·X)).

The function fitglm (which is part of the Statistics and Machine Learning Toolbox) is perfectly suited to solve this regression problem. However, the function takes a frequentist approach which I didn’t want to follow here. Rather, I wanted to learn how to tackle this problem with Bayesian inference. Fortunately, MATLAB increasingly features tools for Bayesian analysis, such as the Hamiltonian Monte Carlo (HMC) sampler. The only thing I needed was some guidance on how to derive the prior distributions and how to calculate the posterior probability density function. For this, I found this paper by Doss and Narasimhan quite useful.
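As a frequentist sanity check, the same model can be fit in a single call to fitglm. This is only a minimal sketch with simulated counts (the variables t and y here are illustrative, not the actual visitor data):

```matlab
% Sketch: frequentist Poisson regression with a log link
% (requires the Statistics and Machine Learning Toolbox)
t = (1:24)';                        % month index (illustrative)
y = poissrnd(exp(0.5 + 0.05*t));    % simulated monthly visitor counts
mdl = fitglm(t, y, 'linear', 'Distribution', 'poisson', 'Link', 'log');
disp(mdl.Coefficients)              % estimates of b1 (intercept) and b2 (slope)
```

The point estimates from fitglm should roughly coincide with the posterior means of the Bayesian analysis below, given the weakly informative priors.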

Writing tools for Bayesian analysis is not easy, and the code is quite difficult to debug. But finally, I managed to get my analysis working and to produce the following figure:

Clearly, the posterior distribution of beta_1, the slope of the regression, does not include zero, so I can assume with high certainty that we are truly observing a trend toward higher visitor numbers. Can this finding be extrapolated? Well, I don’t know. Just keep on visiting this blog frequently, and we’ll see.

If you are interested in the technical details, here is the function:

```matlab
function bayesianpoissonregress(t,y)
%BAYESIANPOISSONREGRESSION
%
% t --> datetime vector (equally spaced)
% y --> vector with counts

t = t(:);
y = y(:);

% Numeric month indices
X = (1:size(t,1))';
% add offset to design matrix
X = [ones(size(X,1),1) X];
% link function
gamma = @(beta) exp(beta(:)'*X')';

% Prior distributions of the parameters are gaussian
a = [0 0];   % prior means
b = [20 20]; % prior std
d = a./b + sum(X.*y);

%% MCMC sampling
hmc = hmcSampler(@(beta) logpdf(beta),[0 0],'UseNumericalGradient',true);
% Estimate max. a posteriori
map = estimateMAP(hmc);
% tune sampler
hmc = tuneSampler(hmc,'StartPoint',map);
% draw chain
chain = drawSamples(hmc,'StartPoint',map);

%% Plot results
figure
subplot(1,2,1)
[~,ax] = plotmatrix(chain);
ylabel(ax(1,1),'\beta_0');
ylabel(ax(2,1),'\beta_1');
xlabel(ax(2,1),'\beta_0');
xlabel(ax(2,2),'\beta_1');

subplot(1,2,2)
% random set of parameters from chain
sam = randperm(size(chain,1),100);
sam = chain(sam,:);
hold on
for r = 1:size(sam,1)
    chat = gamma(sam(r,:));
    plot(t,chat,'color',[.7 .7 .7]);
end
plot(t,y,'--s')
ylabel('Visitors')
box on
% -- end of function

%% Log pdf
function p = logpdf(beta)
    % posterior distribution according to
    % http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.39.9684&rep=rep1&type=pdf
    % (Eq. 2.2)
    p = - sum(1./(b.^2) .* beta.^2) + sum(d .* beta) - sum(gamma(beta));
end
end
```

Before this year ends, here is an announcement for an event next year: The **Central European Conference on Geomorphology and Quaternary Sciences** will take place in Giessen in September 2018. The event is jointly organized by the German Working Group on Geomorphology (AK Geomorphologie) and the German Quaternary Association (DEUQUA – Deutsche Quartärvereinigung). Here are the key dates:

| Date | |
|---|---|
| January 8, 2018 | Start of Registration |
| June 15, 2018 | End of Early Registration |
| July 6, 2018 | Deadline of Abstract Submission |
| September 14, 2018 | End of Late Registration Period |
| September 23–27, 2018 | Conference |

You’ll find more information >>here<<.

One difficulty with Tanaka’s method is that coloring contour lines requires knowing whether higher values are found to the left or the right of a contour line. GDAL’s contouring algorithm behaves consistently in this respect; MATLAB’s algorithm doesn’t. My trick is to calculate surface normals and their angle to the light source, and then interpolate the gridded directions to the vertex locations of the contour lines.

Now here is my approach which uses some undocumented features of the HG2 graphics. An excellent resource for undocumented features in MATLAB is Yair Altman’s blog.

```matlab
function tanakacontour(DEM,nlevels)
%TANAKACONTOUR Relief depiction using Tanaka contours
%
% Syntax
%
%     tanakacontour(DEM,nlevels)
%
% See also: contourf, hillshade
%
% Author: Wolfgang Schwanghart (w.schwanghart[at]geo.uni-potsdam.de)
% Date: 8. December, 2017

if nargin == 1; nlevels = 10; end

% calculate solar vector
azimuth   = 315;
azid      = azimuth-90;
azisource = azid/180*pi;
[sx,sy,~] = sph2cart(azisource,0.1,1);

% get coordinate matrices and z values
[Z,X,Y] = GRIDobj2mat(DEM);

% calculate surface normals
[Nx,Ny] = surfnorm(Z);
N  = hypot(Nx,Ny);
Nx = Nx./N;
Ny = Ny./N;

H   = GRIDobj(DEM);
H.Z = Nx*sx + Ny*sy;

% Get coordinate matrices and Z
[~,h] = contourf(X,Y,Z,nlevels);
axis image
drawnow

% colormap
cmap = gray(255)*255;
cval = linspace(0,1,size(cmap,1))';

% the undocumented part
for r = 1:numel(h.EdgePrims)
    % the EdgePrims property contains vertex data
    x = h.EdgePrims(r).VertexData(1,:);
    y = h.EdgePrims(r).VertexData(2,:);
    % interpolate hillshade grayscale values to contour vertices
    c = interp(H,double(x(:)),double(y(:)));
    % interpolate to colormap
    clr = interp1(cval,cmap,c);
    % adjust color data to conform with EdgePrims ColorData property
    clr = clr';
    clr = uint8([clr; repmat(255,1,size(clr,2))]);
    % set all properties at once
    set(h.EdgePrims(r),'ColorBinding','interpolated','ColorData',clr,'LineWidth',1.2);
end
```

Ok, now let’s try this on some data. In this code, I have used GRIDobj, the class for gridded, georeferenced data.

```matlab
[X,Y,Z] = peaks(500);
tanakacontour(GRIDobj(X,Y,Z),10)
```

And now for the Big Tujunga DEM:

```matlab
DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
tanakacontour(DEM,5)
```

Now that looks ok, but I am not totally convinced by these unrealistic terraces, to be honest. Moreover, my approach so far doesn’t include varying line widths. Contour lines should actually be thicker for slopes that face either directly towards or directly away from the light source. However, I haven’t figured out how to do this yet. Any ideas?

Our paper on uncertainty quantification and smoothing of longitudinal river profiles has now been published in Earth Surface Dynamics. The paper describes a new approach to hydrologically correct and smooth profiles using quantile regression. Smoothing is based on curvature regularization with the constraint that elevation must decrease with increasing flow distance. We thus refer to this technique as constrained regularized smoothing (CRS). We compare CRS-derived profiles to profiles obtained from the common methods of filling and carving, and show that CRS outperforms these methods in terms of accuracy and precision.

Check out the new TopoToolbox functions that accompany the paper:

- STREAMobj/crs
- STREAMobj/crslin
- STREAMobj/crsapp
- STREAMobj/smooth
- STREAMobj/quantcarve
- FLOWobj/quantcarve

Applications of these functions have already been described in the posts An alternative to along-river swath profiles, Steepness derived from smoothed river profiles, and Smooth operator….

**References**

Schwanghart, W., Scherler, D., 2017. Bumps in river profiles: uncertainty assessment and smoothing using quantile regression techniques. *Earth Surface Dynamics*, 5, 821-839. [DOI: 10.5194/esurf-5-821-2017]


Output from LEMs is multidimensional. In fact, it is more than four-dimensional: three dimensions in space, one in time, and several variables that change over time and space. This makes comparing and visualizing output from LEMs quite difficult. In this hackathon we will attempt to find solutions to this challenge. We will organize in teams to develop solutions and to rapidly prototype software.

Interested? Then apply here. We have only 15 seats and the deadline for application is on Wednesday, December 20, 2017.

The last weeks have been quite busy finishing version 2.2, which had been available as a prerelease for a long while. TopoToolbox users who keep their software constantly updated (for example by using Git) won’t see many changes. Those who do not keep pace with the frequent commits to our repository are encouraged to **do so now**. There are a lot of new functions and modifications. Benjamin Campforts added TTLEM, the TopoToolbox Landscape Evolution Model. The scope of functions for working with river networks (STREAMobj) has increased tremendously, with new plotting functions, low-level chi analysis tools, and tools for geometric modifications. We added new functions to hydrologically correct and smooth river networks and values measured along them (e.g. constrained regularized smoothing (CRS)). TopoToolbox now supports multiple flow directions, and there are several new functions for working with grids (GRIDobj). In addition, we consolidated the help sections in each function and increased compatibility with older MATLAB versions. Please see the readme file for a complete overview of changes.

With version 2.2, we offer TopoToolbox as a MATLAB^{®} toolbox file (mltbx-file). This file will make installation very easy. Simply download it, double-click, and follow the instructions.
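If you prefer to script the installation, the Add-On toolbox API can install an mltbx file programmatically. A minimal sketch (the filename and download path are hypothetical):

```matlab
% Sketch: programmatic installation of a downloaded mltbx file
% (the filename 'TopoToolbox.mltbx' is a placeholder for your download)
tbx = matlab.addons.toolbox.installToolbox('TopoToolbox.mltbx');
disp(tbx.Name)    % confirm the installed toolbox name
```

Installing via the toolbox file also means MATLAB manages the path entries for you, so no manual addpath calls are needed.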

Dirk and I met this morning in the train (here we are!) …

… and discussed possible directions for a next version. The number of functions has increased a lot, which entails the threat that TopoToolbox might become confusing and even off-putting, in particular for new users. Simply adding new functionality is thus not the way forward. Instead, we decided that a new version should have better documentation, integrated into MATLAB’s documentation browser. To quote John D’Errico, a long-time and excellent contributor of MATLAB code: *Your job as a programmer does not stop when you write the last line of code. If you think so, then you should be fired. You should document your code. Provide help. Otherwise, that code is just a bunch of random bits, useful to nobody else in the world*.

With this in mind, let’s go for 2.3.

The Himalayan history is rich with a sequence of destructive earthquakes. In the last century, ground shaking, collapsing houses, and landslides in the wake of earthquakes killed tens of thousands of people, wreaking havoc on the Himalayan nations. The 2015 Gorkha Earthquake was the latest in a series of severe earthquakes to hit Nepal.

Seismic hazard analysis in the Himalayas is based on few instrumental records and a paleoseismic record extending back ~1000 years. Paleoseismology largely relies on rupture histories derived from fault trenches, written accounts, and liquefaction features. Other records derived from e.g. lake sediments are scarce.

In a now published paper in Quaternary Science Reviews, Amelie Stolle et al. document our research in the Pokhara Valley in Nepal. The valley was massively and repeatedly aggraded by several cubic kilometers of debris in the wake of medieval earthquakes in the region. The paper extends our 2016 paper in Science, offering new radiocarbon dates and detailing the sedimentology of the infills. Based on our findings, we argue that valley fills in the Himalayas may offer substantial additional evidence of past earthquakes, complementing the current portfolio of paleoseismological records.

**References**

Stolle, A., Bernhardt, A., Schwanghart, W., Hoelzmann, P., Adhikari, B.R., Fort, M., Korup, O., 2017. Catastrophic valley fills record large Himalayan earthquakes, Pokhara, Nepal. Quaternary Science Reviews, 177, 88-103. [DOI: 10.1016/j.quascirev.2017.10.015]. **<< free link to paper until December 26, 2017 >>**

I have mentioned the mn-ratio several times in this blog. Usually, we calculate it using slope-area plots or chi analysis both of which are included in TopoToolbox. However, these methods usually lack consistent ways to express the uncertainties of the mn-ratio. The lack of consistency is due to fitting autocorrelated data which elude a straightforward statistical analysis.

Today, I want to present a new function that uses Bayesian Optimization with cross-validation to find a suitable mn-ratio. While Bayesian Optimization is designed to find optimal values of objective functions involving numerous variables, solving an optimization problem with mn as the only variable nicely illustrates the approach.

Bayesian Optimization finds a minimum of a scalar-valued function in a bounded domain. In a classification problem, this value could be the classification loss, i.e., the price paid for misclassifications. In a regression problem, this value might refer to the sum of squared residuals. The value might also be derived using cross-validation, a common approach to assess the predictive performance of a model. Such cross-validation approaches might take into account only random subsets of data, which entails that the value to be optimized might not be the same for the same set of input parameters. Bayesian Optimization can handle stochastic functions.
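To illustrate, here is a minimal, hypothetical sketch of MATLAB’s bayesopt on a noisy one-dimensional objective. The objective here is a toy stand-in, not the chi-analysis cross-validation loss that mnoptim minimizes:

```matlab
% Sketch: Bayesian Optimization of a stochastic 1-D objective
% (requires the Statistics and Machine Learning Toolbox)
x   = optimizableVariable('mn',[0.1 1]);          % bounded search domain
fun = @(p) (p.mn - 0.45).^2 + 0.01*randn;         % noisy toy objective
res = bayesopt(fun, x, ...
    'IsObjectiveDeterministic', false, ...        % tell bayesopt the function is stochastic
    'MaxObjectiveEvaluations', 20, ...
    'Verbose', 0, 'PlotFcn', []);
bestPoint(res)                                    % estimate of the minimizer
```

Because the objective is stochastic, bayesopt models observation noise explicitly, which is exactly the property that makes it suitable for cross-validated losses computed on random data subsets.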

Now how can we apply Bayesian Optimization to finding the right mn-ratio? The new function mnoptim uses chi analysis to linearize long-river profiles. If there are several river catchments (or drainage network trees), the function will pick a random subset of these trees to fit an mn-ratio and then test it with another set of drainage basins. This allows us to assess how well an mn-ratio derived in one catchment can actually be applied to another catchment. The goal is to derive an mn-ratio that transfers best to other catchments.

Now let’s try this using the new function mnoptim. Here is the code that I’ll use for the entire SRTM-3 DEM of Taiwan. I’ll clip the stream network to the area upstream of the 300 m contour to avoid an influence of the alluvial low-lying reaches.

```matlab
DEM = GRIDobj('taiwan.tif');
FD  = FLOWobj(DEM);
S   = STREAMobj(FD,'minarea',1e6,'unit','map');
C   = griddedcontour(DEM,[300 300],true);
S   = modify(S,'upstreamto',C);
A   = flowacc(FD);
[mn,results] = mnoptim(S,DEM,A,'optvar','mn','crossval',true);
% we can refine the results if we need
results = resume(mn);
% and get an optimal value of mn:
bestPoint(results)

ans =

  table

      mn
    _______

    0.41482
```

This nicely derives the optimal mn-value of 0.415, which is close to the often reported value of 0.45. Moreover, based on the plot, we gain an impression of the uncertainty of this value. In a transient landscape with frequent knickpoints, the uncertainty about the mn-ratio will probably be larger.

Note that mnoptim requires the latest MATLAB release, R2017b, as well as the Statistics and Machine Learning Toolbox. It also runs with R2017a, but won’t be able to use parallel computing then.

Abstract submission for the EGU 2018 has just started and is open until 10 Jan 2018, 13:00 CET. The session **Interactions between tectonics and surface processes from mountain belts to basins**, organized by Dirk Scherler, Alex Whitaker, Taylor Schildgen and me, will address the coupling between tectonics and surface processes. We invite contributions that use geomorphic or sedimentary records to understand tectonic deformation, and we welcome studies that address the interactions and couplings between tectonics and surface processes at a range of spatial and temporal scales. In particular, we encourage coupled catchment-basin studies that take advantage of numerical/physical modeling, geochemical tools for quantifying rates of surface processes (cosmogenic nuclides, low-temperature thermochronology, luminescence dating), and high-resolution digital topographic and subsurface data. We also encourage field or subsurface structural and geomorphic studies of landscape evolution, sedimentary patterns and provenance in deformed settings, and invite contributions that address the role of surface processes in modulating rates of deformation and tectonic style.

Please see further information >>here<<.

We look forward to your contributions. See you at the EGU soon!

Wolfgang, Dirk, Taylor and Alex