
Which tolerance should I choose when using knickpointfinder?


Kai Deng from the GFZ Potsdam commented on my previous post on knickpointfinder and asked the following question about the parameter ‘tol’:

[…] one question on the parameter “tol”. You said that “the value of tol should reflect uncertainties that are inherent in longitudinal river profile data”. However, I’m still confused about how to translate this sentence into code, or in another word, how can I validate my choice of tol by some criteria based on codes?

Now that’s an excellent question. The ‘tol’-parameter determines how many knickpoints the finder will detect and we might not want to miss a knickpoint. At the same time, we know that river profile data often has some erroneous values and we wouldn’t want to pick knickpoints that are merely artefacts of the data. So, what value of ‘tol’ should we choose?

I wrote that “the value of tol should reflect uncertainties that are inherent in longitudinal river profile data”. More specifically, I would argue that one should choose tolerance values that are higher than the maximum expected error between the measured and the true river profile. With lower tolerance values we would likely increase the false positive rate, i.e., the probability that we choose a knickpoint that is due to an artefact.

Now, the problem that we immediately face is that we don’t know this error. Of course, there are quality measures for most common DEM products, but usually these do not apply to values measured along valley bottoms, at least if you take common global DEM products such as SRTM, ASTER GDEM, etc. (Schwanghart and Scherler, 2017). Usually, there are also no repeated measurements. Hence, we cannot estimate the error as we would in the case of measurements in the lab.

But there is another approach: What we know for sure is that river profiles should monotonically decrease downstream. At least, this is true for the water surface, yet not necessarily for bed elevations (e.g. due to the presence of riffle-pool sequences). The elevation of a pixel on the stream network should thus be either lower than or equal to the elevation of its upstream neighbor. If it is not, then the value is probably wrong (for whatever reason). I say ‘probably’, because the upstream pixel might have an erroneously low value, too.

The following code demonstrates how we could determine these errors: through downstream minima imposition (imposemin) and upstream maxima imposition (cummaxupstream). The shaded areas in the resulting plot (here I show a zoomed version) show the differences between these values.

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
FD = FLOWobj(DEM);
S  = STREAMobj(FD,'minarea',1000);
S = klargestconncomps(S);
% upstream maxima imposition and downstream minima imposition
zmax = cummaxupstream(S,DEM);
zmin = imposemin(S,getnal(S,DEM));
% plot the range between both as a shaded area, together with the profile
plotdzshaded(S,[zmin zmax])
hold on
plotdz(S,DEM,'color','k')
Shaded areas show the difference between upstream maxima and downstream minima imposition.

The maximum difference between the elevations is the maximum relative error that we estimate from the profile.

max(zmax-zmin)

ans =

    43

Now we might want to take this elevation difference as tolerance value for the knickpointfinder. However, this value might be very high if our profile is characterized by a few positive or negative outliers. This would result in only a few, if any, knickpoints. Thus, we could come up with a more robust estimate of the error that is based on quantiles. Robust in this sense means: robust to outliers. And the proposed technique is constrained regularized smoothing (CRS) (Schwanghart and Scherler, 2017).

What is constrained regularized smoothing? It is an approach to hydrologically correct DEMs based on quantile regression techniques. While minima or maxima imposition present two extremes of hydrological conditioning – carving and filling, respectively – CRS (and quantile carving, which is similar but doesn’t smooth) is a hybrid technique and returns river longitudinal profiles that run along the tau-th quantile of the data. Why would this be preferred? Simply because we would not want the shape of the resulting profile to be predominantly determined by the errors with the largest magnitude.

Here is how to apply CRS using the quantiles 0.1 and 0.9:

z90 = crs(S,DEM,'K',1,'tau',0.9);
z10 = crs(S,DEM,'K',1,'tau',0.1);
max(z90-z10)

ans =

   41.7629

Of course, the maximum difference is lower than the previous value obtained by minima and maxima imposition. But not very much. Let’s see how well the algorithm performs if we add some additional errors.

% pick 10 random locations on the stream network and add errors of ±100 m
IX = randlocs(S,10);
IX = find(ismember(S.IXgrid,IX));
z = getnal(S,DEM);
z(IX) = z(IX) + sign(randn(numel(IX),1))*100;
plotdz(S,z)


River profile with some additional errors.

Let’s compute the difference between carving and filling.

zmin = imposemin(S,z);
zmax = cummaxupstream(S,z);
max(zmax-zmin)

ans =

110

And what about our uncertainty estimate based on the CRS algorithm?

z90 = crs(S,z,'K',1,'tau',0.9);
z10 = crs(S,z,'K',1,'tau',0.1);
d = max(z90-z10)

d =

41.7629

The CRS result is not affected at all by a few outliers. That means it enables us to derive a rather robust estimate of the uncertainty of the data.

Moreover, we can use the CRS-corrected profile as input to knickpointfinder, because we have filtered the spikes from the data. As elevations are often overestimated, you might directly use the node-attribute list z10.
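A minimal sketch of this workflow, assuming that knickpointfinder accepts a node-attribute list of elevations in place of the DEM (it continues with the variables z10 and d computed above):

% Run knickpointfinder on the CRS-corrected profile and use the robust
% uncertainty estimate d as tolerance (assumes that a node-attribute list
% is accepted in place of a GRIDobj)
[zs,kp] = knickpointfinder(S,z10,'tol',d,'split',false);

% plot the corrected profile together with the detected knickpoints
plotdz(S,z10,'color','k')
hold on
plot(kp.distance,kp.z,'ko','MarkerFaceColor','w')
hold off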

I have shown that you can use an uncertainty estimate derived from your river profile as direct input to knickpointfinder. There are, however, a few things to note:

  1. Don’t carve or fill your data before using CRS. Otherwise, your uncertainty estimate is flawed.
  2. The uncertainty estimate may not reliably reflect the true uncertainty in your data. Always cross-check whether the uncertainty estimate is either too low or too high.
  3. I haven’t talked about the smoothness parameter ‘K’, which also controls the uncertainty estimates. Larger values of K will likely increase the uncertainty estimate.
  4. CRS can take a while for very large river networks. Consider estimating the tolerance value from a subset of the data.
  5. You can also use a node-attribute list of tolerances in knickpointfinder and thus account for spatially varying tolerances. In this case, however, add some offset, because the CRS might indicate areas with zero uncertainty and thus overestimate the true precision of the data (see the sketch below).
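Here is a sketch of point 5. It reuses the CRS results from above to build a node-attribute list of tolerances; the offset of 5 m is an arbitrary value chosen purely for illustration:

% spatially varying tolerances: CRS-derived uncertainty plus a small offset
% so that no node receives a tolerance of zero (5 m chosen arbitrarily)
tol = (z90 - z10) + 5;
[zs,kp] = knickpointfinder(S,DEM,'tol',tol,'split',false);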

References

Schwanghart, W., Scherler, D., 2017. Bumps in river profiles: uncertainty assessment and smoothing using quantile regression techniques. Earth Surface Dynamics, 5, 821-839. [DOI: 10.5194/esurf-5-821-2017]


Working with knickpointfinder


I have learnt from numerous comments here on this blog that the knickpointfinder function seems to be quite popular. However, working with the output of the function often entails some difficulties. So here is a short post about knickpointfinder.

Let’s load and prepare some data first:

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
FD = FLOWobj(DEM);
S  = STREAMobj(FD,'minarea',1000);
S = klargestconncomps(S);

In these first four lines of code, I loaded the DEM of the Big Tujunga catchment, derived flow directions (stored in FD, an instance of FLOWobj), and then extracted the stream network S based on a minimum supporting drainage area of 1000 pixels, which is about 0.9 km². In the last line, I clipped the stream network to its largest connected component, which is in this case the largest drainage basin.

Now we are ready to run knickpointfinder:

[zs,kp] = knickpointfinder(S,DEM,'tol',30,'split',false);
Animated output as displayed by the knickpointfinder function if ‘plot’ is set to true.

The function takes the stream network S, the digital elevation model DEM, and a few parameter name-value pairs. knickpointfinder reconstructs the actual longitudinal profile by iteratively relaxing a curvature constraint at those locations that have the maximum vertical offset to the actual profile. After a few iterations, the vertical offsets decrease, and ‘tol’ sets the minimum vertical offset at which the algorithm stops. The value of tol should reflect uncertainties that are inherent in longitudinal river profile data (Schwanghart and Scherler, 2017).

‘split’ enables running the function in parallel. The function splits the stream network into connected components (see the function STREAMobj2cell) and sends each to an individual worker. Parallel execution, however, is only faster if there are several drainage basins. Since we have only one in this example, I set ‘split’ to false. There are other options to control verbose and visual output, and you can explore these options by typing

help knickpointfinder
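As a quick aside on ‘split’: STREAMobj2cell is the function that separates a network into its connected components, so you can check beforehand whether parallel execution would pay off. A small sketch (in this example it returns a single component, which is why ‘split’ is set to false):

% STREAMobj2cell splits the stream network into its connected components.
% Since S was clipped to its largest connected component above, there is
% only one, and parallel execution brings no benefit here.
CS = STREAMobj2cell(S);
numel(CS)    % returns 1 for this example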

Now, knickpointfinder returned two output variables: zs and kp. zs is a node-attribute list that contains the reconstructed river profile. You can plot it using plotdz.

plotdz(S,DEM)
hold on
plotdz(S,zs)
hold off

kp is a structure array that holds all kind of information.

kp

kp = 

  struct with fields:

           n: 26
           x: [26×1 double]
           y: [26×1 double]
    distance: [26×1 double]
           z: [26×1 double]
      IXgrid: [26×1 double]
       order: [26×1 double]
          dz: [26×1 double]
         nal: [6985×1 logical]

Admittedly, this is a bit messy and there will be some changes in the future. But this will still have to wait a bit. Now what do the structure fields contain? kp.n is 26. Hence, the algorithm has identified 26 knickpoints. kp.x and kp.y contain the coordinates and kp.z the elevations of the knickpoints. The distance from the outlet is stored in kp.distance. kp.IXgrid contains the linear indices into the DEM. TopoToolbox works a lot with linear indices, and thus this output is very crucial. kp.order contains the order in which the knickpoints were identified. However, several knickpoints may have the same order, and thus this information is not very meaningful. kp.dz is the vertical offset between the actual and the modelled profile at the time when the knickpoint was identified. Finally, kp.nal is a logical vector and a node-attribute list. It allows indexing into other node-attribute lists.
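For example, kp.nal lets you extract the values at the knickpoint locations from any other node-attribute list. A small sketch (gradient is an existing STREAMobj method; the variable names are arbitrary):

% node-attribute list of stream gradients ...
g   = gradient(S,DEM);
% ... and the gradients at the 26 knickpoint locations
gkp = g(kp.nal);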

Now, here are a few things you might want to do with the structure array kp.

1. Plot a map

imageschs(DEM)
hold on
plot(S,'k');
plot(kp.x,kp.y,'ko','MarkerFaceColor','w')
hold off

2. Plot a longitudinal river profile

plotdz(S,DEM);
hold on
plot(kp.distance,kp.z,'ko','MarkerFaceColor','w')
hold off

3. Plot a longitudinal river profile in chi space

A = flowacc(FD);
c = chitransform(S,A,'mn',0.45);
plotdz(S,DEM,'distance',c);
hold on
[~,locb] = ismember(kp.IXgrid,S.IXgrid);
ckp = c(locb);
plot(ckp,kp.z,'ko','MarkerFaceColor','w')
hold off
xlabel('\chi [m]')

4. Export as shapefile

MS = STREAMobj2mapstruct(S);
shapewrite(MS,'streamnet');
MKP = struct('X',num2cell(kp.x),'Y',num2cell(kp.y),'Geometry','Point');
shapewrite(MKP,'knickpoints');

Ok, that’s a basic workflow that you can easily apply to your own DEM. Give it a try.

References

Schwanghart, W., Scherler, D., 2017. Bumps in river profiles: uncertainty assessment and smoothing using quantile regression techniques. Earth Surface Dynamics, 5, 821-839. [DOI: 10.5194/esurf-5-821-2017]

χ inferiority


I just need to finish the year with a quick blog entry… And this one is – once again – about chi maps.

I thought about different ways to display chi maps. Chi maps are usually plotted using stream networks. In this context, however, chi maps are not so much about stream networks as about drainage divides and how they might migrate in the future. In addition, the distance of potential divide migration depends on the relative difference of chi values on either side of the divide, not so much on the absolute values. How could we create a map that reflects this information?

Here is an idea: First, we create a chi transform of the river network (see function chitransform). Then we map these chi values to the entire grid. This can be accomplished with the new function mapfromnal, which lets you map values from a stream network to its nearest upstream contributing pixels. Then, we derive fringes of chi values, a few pixels wide, around the divides and calculate their spatial average. Dividing the chi values by these spatial averages then quantifies the degree to which a part of the divide is either a victim or an aggressor. Since victims have higher values, I call this metric the chi-inferiority.

Now here is the code:

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
FD  = FLOWobj(DEM);
S   = STREAMobj(FD,'minarea',500);
S   = removeedgeeffects(S,FD);
[~,zb] = zerobaselevel(S,DEM);
A   = flowacc(FD);

% chi transform; adding the outlets' base-level elevations makes values
% comparable across basins draining to different base levels
c   = zb + chitransform(S,A);
% map the chi values to their nearest upstream contributing pixels
C   = mapfromnal(FD,S,c);
D   = drainagebasins(FD,S);

% retain only a roughly 10-pixel wide fringe of chi values along each
% drainage basin boundary
for r = 1:max(D)
    I = D==r;
    I = erode(I,ones(21)) & I;
    C.Z(I.Z) = nan;
end

% local maxima and minima of the fringe values in a 21x21 neighborhood;
% their mean approximates the average chi on both sides of the divide
C1 = dilate(C,ones(21));
C2 = erode(C,ones(21));

% chi-inferiority: chi divided by the average chi across the divide
C  = C/((C1 + C2)/2);
imageschs(DEM,C,'colormap',ttscm('vik'),...
    'colorbarylabel','\chi-inferiority',...
    'ticklabels','nice');
hold on
plot(S,'k')
hold off
Map of chi-inferiority of the Big Tujunga catchment. Coloring is based on the vik colormap of the scientific colormaps of Fabio Crameri.

Happy New Year everyone!

Smoothing the planform geometry of river networks


In previous posts (here, here, here), I have covered some of the tools and applications of smoothing in the analysis of river networks. There, I focused on the analysis of river profiles (see also our paper here). However, these tools can also be used to smooth and analyse the planform patterns of river networks.

Here is how:

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
FD = FLOWobj(DEM,'preprocess','carve');
S  = STREAMobj(FD,'minarea',1000);
S  = klargestconncomps(S);
% smooth the planform coordinates (x and y) of the stream network
y = smooth(S,S.y,'k',500,'nstribs',true);
x = smooth(S,S.x,'k',500,'nstribs',true);

% obtain NaN-separated coordinate vectors for plotting
[~,~,xs,ys] = STREAMobj2XY(S,x,y);
subplot(2,1,1)
plot(S)
subplot(2,1,2)
plot(xs,ys)
Upper panel: Original river network data. Lower panel: Smoothed river network.

Now what is this good for? First, simplification by smoothing might help in visualization. Second, the smoothed river coordinates can be used to derive metrics such as planform curvature of river networks. These metrics are strongly affected by the circumstance that river networks derived from digital elevation models are forced to follow the diagonal and cardinal directions of the grid. This results in local flow directions in steps of multiples of 45°. Changes in flow direction thus occur in jumps, and accordingly the planform curvature of river networks is highly erratic. These erratic values may hide some of the larger-scale curvature patterns that we might be interested in.

The STREAMobj/curvature function features an example that shows how to accentuate larger-scale curvature patterns using smoothing. Following up on the code above, deriving curvature from smoothed planform coordinates can be done as follows:

y = smooth(S,S.y,'k',100,'nstribs',true);
x = smooth(S,S.x,'k',100,'nstribs',true);
c = curvature(S,x,y);
plotc(S,c)
box on
% center 0-value in colormap
caxis([-1 1]*max(abs(caxis)))
colormap(ttscm('vik'))
h = colorbar;
h.Label.String = 'Curvature';
Planform curvature of the river networks.

Clearly, the amount of smoothing determines the spatial scale of curvature. Here, the chosen smoothing parameter (K=100) accentuates the bends at a scale of a few hundred meters.

There are a number of applications for this kind of analysis. Lateral erosion of rivers may undercut hillslopes, and it may be interesting to investigate whether planform curvature metrics help to detect and predict locations of landsliding. Just an idea…

mappingapp – a tiny tool to map points along river networks


TopoToolbox has a new tool. Ok, it’s a small tool with limited functionality, but it might be helpful if your aim is to map points along river networks. “Mapping points can be easily done in any GIS!”, you might think. True, but as I am currently working on knickpoints in river profiles (see also the automated knickpointfinder), I wrote this tool to quickly map knickpoints both in planform and profile view of the river network.

So, here is mappingapp. Give it a try.

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
FD = FLOWobj(DEM);
S  = STREAMobj(FD,'minarea',1000);
mappingapp(DEM,S)
mappingapp, a tiny tool to map knickpoints along river profiles.

Currently, the GUI is limited to a few basic tools. You can map a single point, which automatically snaps to the river network S. The zoom tools allow you to navigate the DEM. You can add a new point using the + button. This will make the previous point permanent and add a new row to the table. Finally, the table can be exported to the workspace. The table contains the coordinates and z-values as well as a column IXgrid. IXgrid contains the linear indices into the DEM.
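Once exported to the workspace, the table can be used like any other set of point data. Here is a small sketch that assumes the exported table is named T (the name is arbitrary) and uses only the IXgrid column mentioned above:

% convert the linear indices stored in the table to coordinates and
% elevations, and plot the mapped points on top of the stream network
[x,y] = ind2coord(DEM,T.IXgrid);
z     = DEM.Z(T.IXgrid);
imageschs(DEM)
hold on
plot(S,'k')
plot(x,y,'ko','MarkerFaceColor','w')
hold off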

Controls of mappingapp

Multiple colormaps in one axis


I recently mentioned that MATLAB now lets you easily use different colormaps in one figure. The trick is to provide the axes handle as the first input argument to the colormap function call.

DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
ax(1) = subplot(2,1,1);
imagesc(DEM); 
colorbar;

ax(2) = subplot(2,1,2);
imagesc(DEM); 
% Use the axis handle as first input argument
% to colormap
colormap(ax(2),landcolor); 
colorbar

Multiple colormaps in one figure.

However, how do you use multiple colormaps in one and the same axes? That’s a little trickier. Below, you find commented code that shows how to plot a colored and hillshaded DEM together with a stream network colored by ksn values.

% Some data (DEM and ksn of river network)
DEM = GRIDobj('srtm_bigtujunga30m_utm11.tif');
FD  = FLOWobj(DEM);
A   = flowacc(FD);
S   = STREAMobj(FD,'minarea',1000);
k   = smooth(S,ksn(S,DEM,A),'K',1000);

% Create a new figure
hFig = figure;
% Call imageschs which has some parameter/value pairs
% to control colormap and colorbar appearance 
imageschs(DEM,[],'colormap','landcolor',...
                 'colorbar',true,...
                 'colorbarylabel','Elevation');

% Get the current axes
h_ax = gca;
% Create a new axes in the same position as the
% first one, overlaid on top.
h_ax_stream = axes('position', get(h_ax, 'position'),...
                   'Color','none'); 
% Plot the stream network               
plotc(S,k,'LineWidth',2);
% Make axis invisible
set(h_ax_stream,'Color','none');
% Create a colorbar
hc = colorbar(h_ax_stream,'Location','southoutside');
% ... and label it
hc.Label.String = 'ksn';
% Adjust color range
caxis(h_ax_stream,[0 200])
% Perfectly align both axes
set(h_ax_stream,'position', get(h_ax, 'position'),...
                'XLim',xlim(h_ax),...
                'YLim',ylim(h_ax),...
                'DataAspectRatio',[1 1 1]...
                );
% Link both axes so that you can zoom in and out            
linkaxes([h_ax,h_ax_stream],'xy');

% Resizing the figure may screw it all. Using the figure
% property ResizeFcn makes sure both axes remain perfectly
% aligned
hFig.ResizeFcn = @(varargin) ...
    set(h_ax_stream,'Position',get(h_ax, 'position'));

Multiple colormaps in one axis.

The trick is to use additional axes. The difficulty lies in perfectly aligning them and in making the alignment robust against resizing the figure.

Finding knickpoints in river profiles


Did you know that TopoToolbox has a function to identify knickpoints in river profiles? Well, if you don’t know, now you know.

The function is called knickpointfinder. It uses an algorithm that adjusts a strictly concave upward profile to the actual profile. Offsets between the actual and the concave upward profile occur where the actual profile has convexities. Relaxing the concavity constraint where offsets attain a maximum will adjust the concave profile to the actual profile. knickpointfinder adjusts the profile iteratively until offsets fall below a specified tolerance value. Look at the animation below which probably explains more than a thousand words.

Animation of the example found in the knickpointfinder help.

Using knickpointfinder is easy. Just see the function help to run the example whose results are shown above.

Final result with adjusted network and identified knickpoints (red squares). The size of the squares relates to the offset between the actual profile and the strictly upward concave profile.

Let us know how well knickpointfinder suits your needs. Note that this algorithm is not yet published, so please give credit to our TopoToolbox paper if you use it in your work.