Blog from July, 2009

MATLAB source: http://ecco2.jpl.nasa.gov/data1/matlab/netcdf_toolbox/netcdf/@ncvar/resize.m

function theResult = resize(self, newSize)

% ncvar/resize - Resize variable dimensions.
% resize(self, newSize) resizes the dimensions of self,
% an "ncvar" object. All variables related to the
% changed dimensions are similarly affected. The
% newSize must have the same number of dimensions
% as the existing variable. The new self is returned.

% Copyright (C) 1998 Dr. Charles R. Denham, ZYDECO.
% All Rights Reserved.
% Disclosure without explicit written consent from the
% copyright owner does not constitute publication.

% Version of 03-Nov-1998 08:52:22.
% Updated 12-Aug-1999 09:42:01.

if nargin < 1, help(mfilename), return, end
if nargout > 0, theResult = self; end

% Check for no-change.

if isequal(ncsize(self), newSize)
result = self;
if nargout > 0
theResult = result;
else
ncans(result)
end
return
end

theItemName = name(self);

% Check for writeability.

f = parent(self);
thePermission = permission(f);
theSrcName = name(f);

if isequal(thePermission, 'nowrite')
disp([' ## NetCDF source file must be writeable.'])
return
end

% Check request.

d = dim(self);
for i = 1:length(d)
if ~isrecdim(d{i}) & newSize(i) <= 0
disp([' ## Dimension "' name(d{i}) '" size requires positive integer.'])
return
end
end

% Create temporary file.

g = [];

i = 0;
while isempty(g)
i = i + 1;
theTmpName = ['tmp_' int2str(i) '.nc'];
if exist(theTmpName, 'file') ~= 2
g = netcdf(theTmpName, 'noclobber');
end
end

theTmpName = name(g);

% Copy affected dimensions first.

d = dim(self);
for i = 1:length(d)
if isrecdim(d{i})
g(name(d{i})) = 0;
elseif newSize(i) <= 0
error([' ## Dimension "' name(d{i}) '" requires positive integer.'])
else
g(name(d{i})) = newSize(i);
end
end

% Copy other dimensions.

d = dim(f);
for i = 1:length(d)
g(name(d{i})) = ncsize(d{i});
end

% Copy global attributes.

a = att(f);
for i = 1:length(a)
copy(a{i}, g)
end

% Copy variable definitions and attributes.

v = var(f);
for i = 1:length(v)
copy(v{i}, g, 0, 1)
end

% Copy variable data as minimal rectangular array.
% Note that the "()" operator is out-of-context
% inside this method, so we have to build our own
% calls to "ncvar/subsref" and "ncvar/subsasgn".
% It might be easier for us to use "virtual"
% variables instead, which could be transferred
% with the more intelligent "ncvar/copy" method.

v = var(f);
w = var(g);

for i = 1:length(v)
sv = ncsize(v{i});
sw = ncsize(w{i});
if ~isempty(sw)
d = dim(w{i});
if isrecdim(d{1})
if sw(1) == 0
if isequal(name(d{1}), theItemName)
sw(1) = newSize;
else
sw(1) = sv(1);
end
end
end
end
theMinimalSize = min(sv, sw);
if prod(theMinimalSize) > 0
if isequal(sv, sw)
copy(v{i}, g, 1)
else
theIndices = cell(size(theMinimalSize));
for j = 1:length(theIndices)
theIndices{j} = 1:theMinimalSize(j);
end
theStruct.type = '()';
theStruct.subs = theIndices;
theData = subsref(v{i}, theStruct);
w{i} = subsasgn(w{i}, theStruct, theData);
end
end
end

% Close both files.

f = close(f);
g = close(g);

% Delete old file.

delete(theSrcName)

% Rename new file to old file name.

fcopy(theTmpName, theSrcName)
delete(theTmpName)

% Open the new file.

g = netcdf(theSrcName, thePermission);

% Return the resized variable.

result = g{theItemName};

if nargout > 0
theResult = result;
else
ncans(result)
end
Location of all user workshop materials

"P:\sobek\Marketing\User Conferences\2009"

Events and Speed

After today's discussion about how slow events are, I tried to write a small test to see how slow our events actually are. It was almost obvious to me that, because of compile-time code weaving, there should be no big overhead in using aspects to manage events (when they are used properly, of course; it is always easy to misuse things (wink)). The results are below: the first version looked pretty slow, because we had implemented it using Reflection. Since we had not done any optimization at all yet, I did only a minimal one: I removed our custom NotifyPropertyChangeArgs implementation and used the standard System.ComponentModel.PropertyChangedEventArgs instead. As a result, OldValue and NewValue disappeared. If we need them again, we will have to extend the aspect attribute to use PropertyChanging events. We will probably also need something similar to implement IEditableObject (transactional object editing) or the Memento design pattern (Undo / Redo). The main advantage is reuse: less code is always good, and judging from videos and conferences this seems to be the direction programming languages are moving in (DSL, AOP).

// Before optimization
// -------------------

NotifyPropertyChangedAttributeTest.SlowDownBecauseOfPropertyChangeEventsShouldBeLessThan250Percent : Failed
1375 [7] INFO 100 000 changes without NotifyPropertyChanged: 125 milliseconds
1421 [7] INFO 100 000 changes with NotifyPropertyChanged: 750 milliseconds
1421 [7] INFO (warning) >>> 500% slower <<< (warning)

NUnit.Framework.AssertionException: Expected: less than or equal to 40.0d
But was: 500.0d

// After Optimization (remove use of reflection in aspect and remove OldValue / NewValue in aspect - use standard .NET 3.5 implementation)
// ------------------

828 [7] INFO 100 000 changes without NotifyPropertyChanged: 140.625 milliseconds
875 [7] INFO 100 000 changes with NotifyPropertyChanged: 187.5 milliseconds
875 [7] INFO >>> 33% slower <<< ... 15x speedup (smile)

Actually the overhead varies between roughly 10% and 50%. I have no idea why it is sometimes slower or faster, probably some .NET or ReSharper pre-compiled bytecode tricks.

33% is a very good price for the functionality we get in return.
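
For reference, this is roughly the plumbing that the optimized aspect now weaves in for each property, using only the standard .NET 3.5 types (a hand-written sketch; the Branch class and Name property are made up for illustration):

using System.ComponentModel;

// Hypothetical entity; normally the aspect attribute generates this plumbing at compile time.
public class Branch : INotifyPropertyChanged
{
    private string name;

    public event PropertyChangedEventHandler PropertyChanged;

    public string Name
    {
        get { return name; }
        set
        {
            if (name == value) return;
            name = value;
            // Standard PropertyChangedEventArgs: only the property name, no OldValue / NewValue.
            OnPropertyChanged("Name");
        }
    }

    protected virtual void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
        {
            handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}

The whole point of the aspect is that nobody has to write this boilerplate by hand in every entity class.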

I guess we will have to write similar performance tests for the CollectionChanged event.
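
A possible shape for such a test (a sketch only: it assumes NUnit, uses ObservableCollection<T> as a stand-in for whatever evented collection we end up measuring, and the 250% threshold simply mirrors the property-change test above):

using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Diagnostics;
using NUnit.Framework;

[TestFixture]
public class CollectionChangedPerformanceTest
{
    [Test]
    public void SlowDownBecauseOfCollectionChangedEventsShouldBeLessThan250Percent()
    {
        const int count = 100000;

        // Baseline: plain list, no events raised.
        var list = new List<int>();
        var stopwatch = Stopwatch.StartNew();
        for (var i = 0; i < count; i++)
        {
            list.Add(i);
        }
        // Guard against a near-zero baseline on fast machines.
        var baseline = Math.Max(stopwatch.Elapsed.TotalMilliseconds, 1.0);

        // Evented: ObservableCollection raises CollectionChanged on every Add.
        var observable = new ObservableCollection<int>();
        observable.CollectionChanged += delegate { };
        stopwatch = Stopwatch.StartNew();
        for (var i = 0; i < count; i++)
        {
            observable.Add(i);
        }
        var evented = stopwatch.Elapsed.TotalMilliseconds;

        var slowDownPercentage = 100.0 * (evented - baseline) / baseline;
        Assert.LessOrEqual(slowDownPercentage, 250.0);
    }
}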

By the way, the main current bottleneck in the Network classes is EventedListView; we will need to get rid of it ASAP by replacing it with IEnumerable<...>. The only remaining issue is that feature providers will need to be informed somehow when features in the collection change.

Agile Coaching Started

I just created my account and personal homepage. In the coming months I'll be helping out (every Monday) with the introduction of Scrum (release planning and sprint planning) and some engineering practices.

Last week I worked with the team and the product owner to produce a Product Backlog. It's in the team room, on the wall (actually the back of a drawer). It has four columns (yes, we're using Kanban); items move from left to right:

  1. Identified: all items (features, mainly) that have been identified. This is work that was mentioned by any of the stakeholders but is not planned yet.
  2. Prioritized: all items that have been prioritized. Priorities are relative only; items are sorted top to bottom (high priority to low priority).
  3. Estimated: items are estimated after they have been prioritized. Estimates are relative and are done by the team only, using planning poker. The column stays sorted according to priority.
  4. Ready: estimated items that are small enough to be taken into the next sprint.
Speedup of NetworkHelper

It looks like NetworkHelper was very slow (especially on large networks) because of the Distance() calls. We should probably make the expensive Distance() call only after a cheap envelope check has passed:

Old code:

        public static IBranch GetNearestBranch(IEnumerable<IBranch> branches, IBranchFeature branchFeature, double tolerance)
        {
           ...
            var minDistance = double.MaxValue;
            var nearestBranch = (IBranch) null;

            foreach (var branch in branches)
            {
                var distance = branch.Geometry.Distance(branchFeature.Geometry);

                if (distance >= minDistance || distance >= tolerance)
                    continue;
                nearestBranch = branch;
                minDistance = distance;
            }
           ...
        }

New code:

        public static IBranch GetNearestBranch(IEnumerable<IBranch> branches, IBranchFeature branchFeature, double tolerance)
        {
           ...
            var minDistance = double.MaxValue;
            var nearestBranch = (IBranch) null;

            // first select branches where envelope of branch overlaps with branchFeature
            var overlappingBranches = new List<IBranch>();
            foreach (var branch in branches)
            {
                if(branch.Geometry.EnvelopeInternal.Overlaps(branchFeature.Geometry.EnvelopeInternal))
                {
                    overlappingBranches.Add(branch);
                }
            }

            // then find nearest branch using Distance
            foreach (var branch in overlappingBranches)
            {
                var distance = branch.Geometry.Distance(branchFeature.Geometry);

                if (distance >= minDistance || distance >= tolerance)
                    continue;
                nearestBranch = branch;
                minDistance = distance;
            }
           ...
        }
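
For completeness, a hypothetical call site (the network and crossSection variables and the tolerance value are made up; only NetworkHelper.GetNearestBranch itself comes from the code above):

// Hypothetical usage: find the branch closest to a feature, within a 1 m tolerance.
IBranch nearest = NetworkHelper.GetNearestBranch(network.Branches, crossSection, 1.0);
if (nearest == null)
{
    // No branch lies within the tolerance (or no branch envelope overlaps the feature).
}
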
Useful batch files

To reset my DelftShell layout I use a batch file with the following content:

if not exist "%USERPROFILE%\Local Settings\Application Data\Deltares\DelftShell" goto end
rd /s /q "%USERPROFILE%\Local Settings\Application Data\Deltares\DelftShell"
:end

It removes the Deltares directory containing the settings for DelftShell.

Often the network slows down my Windows Explorer, so I sometimes disconnect the network drives.
To toggle my network drive mappings I use the following batch file:

if exist m:\ goto unmap
echo "mapping drives"
echo off
call net use /user:WL\%USERID% M: \\hafilerg.wldelft.nl\win
call net use /user:WL\%USERID% P: \\FILER.wldelft.nl\PROJECT
call net use /user:WL\%USERID% U: \\FILER.wldelft.nl\HOME
call net use /user:WL\%USERID% W: \\WLHOST.wldelft.nl\WL
call net use /user:WL\%USERID% Y: \\WLHOST.wldelft.nl\LIBRARY
goto end
:unmap
echo "unmapping drives"
echo off
call net use  M: /delete
call net use  P: /delete
call net use  U: /delete
call net use  W: /delete
call net use  Y: /delete
:end
echo "goodbye"
Structures in HEC-RAS

New Orleans - reference image for grid projection and netCDF

Best practices for team (draft)

We are trying to formulate best practices for working within the Delta Shell team:

  • limit stand-up meetings to 15 minutes
  • during office hours, only work on issues that are on the scrum board
  • do not delegate responsibility to individual team members; the team should fix problems together
  • create manageable work items that clearly describe what functionality should be implemented
  • issues should be checked and closed after fixing; delegate the checking during stand-up (one team member can check the work of another)
  • do not brainstorm with the whole team (the product owner and lead architect can do this without the team)
  • make sure to have a finished product after a sprint by prioritizing need-to-have vs. nice-to-have
  • all work items should be either in to-do or done status after the sprint (no loose ends!)
  • people should join the team during the sprint only if there is a clear need for them to do so
  • adhere to the definition of done when checking in sources:
    • unit tests should not fail (before check-in)
    • Delta Shell should start (before check-in)
    • UI functionality related to your code should work (before check-in)
    • the build server should not report failing tests
    • saving a project should work (loading and saving!)
    • running the flow model should work with the new entity
    • the property editor should show the new object nicely
Projection systems

Definitions of various projection systems can be found on the website spatialreference.org. Using ogr, some functionality is available to convert projection system information to well-known text or proj.4 format. The code snippet below is an example of translating the information in a .prj file to proj.4 format; note, however, that ogr only provides a limited set of conversions.

Currently we need to project from world coordinates (WGS84) to meters (Amersfoort / RD New). The proj.4 definition of this projection is as follows (see also spatialreference.org):
"+proj=sterea +lat_0=52.15616055555555 +lon_0=5.38763888888889 +k=0.9999079 +x_0=155000 +y_0=463000 +ellps=bessel +units=m +no_defs"

Update: ogr2ogr now supports this conversion directly:
"ogr2ogr -t_srs EPSG:28992 rivers_rdnew.shp rivers.shp"; see the example later on this page.

More examples of using ogr can be found at the California Soil Resource Lab.

     public static string EsriToProj4(string esriProjectionPath)
     {
         // Requires: using System; using OSGeo.OSR; using OSGeo.OGR; (GDAL/OGR C# bindings)
         // Build an empty spatial reference and initialize it from the ESRI .prj file.
         OSGeo.OSR.SpatialReference oSRS = new SpatialReference("");
         if (oSRS.SetFromUserInput(esriProjectionPath) != Ogr.OGRERR_NONE)
         {
             throw new ApplicationException(string.Format("Error occurred translating {0}", esriProjectionPath));
         }

         // Export the spatial reference as a proj.4 definition string.
         string proj4OutputString;
         if (oSRS.ExportToProj4(out proj4OutputString) == Ogr.OGRERR_NONE)
         {
             return proj4OutputString;
         }
         else
         {
             throw new ApplicationException("Export to proj4 failed");
         }
     }

This code snippet is based on the original C++ version by Frank Warmerdam. I have added the projection files that ship with ArcMap, but I don't think we should distribute these.
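
A hypothetical call, just to show how the snippet is meant to be used (the .prj path is made up):

// Hypothetical usage: translate the ESRI .prj next to a shapefile into a proj.4 string.
string proj4Definition = EsriToProj4(@"D:\data\rivers.prj");
Console.WriteLine(proj4Definition);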

proj.4 is a utility for converting coordinates from one system to another. It is written in C and can be downloaded from the Proj.4 website. GDAL/OGR (currently version 1.6.1) depends on proj.4 to convert rasters or shapefiles from one coordinate system to another. To use proj.4 on Windows it is necessary to point an environment variable to the NAD27 grid shift files, for example: set PROJ_LIB=C:\Software\PROJ\NAD (we have to figure out some way to do this in Delta Shell).
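
One candidate approach from managed code is sketched below (assumptions: the grid shift files ship in a PROJ\NAD folder next to the Delta Shell executable, and the native proj.4 library still picks up an environment variable set this late; if it does not, the variable has to be set before Delta Shell starts, e.g. in a launcher batch file):

using System;
using System.IO;

// Point proj.4 to the NAD grid shift files before any GDAL/OGR reprojection is done.
// The folder name is an assumption; in Delta Shell it should come from the application settings.
string projLibPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"PROJ\NAD");
Environment.SetEnvironmentVariable("PROJ_LIB", projLibPath);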

The Shapelib library by Frank Warmerdam supports reprojection of shapefiles. I don't know if the syntax to define projections conforms to the proj.4 syntax.

Example
The following examples project the file rowtest (to row or row3), for instance moving data from State Plane NAD83 zone 1002 to UTM zone 16 in meters:

shpproj rowtest row -i="init=nad83:1002 units=us-ft" -o="proj=utm zone=16 units=m"
shpproj rowtest row3 -o="proj=utm zone=18 units=m" -i="zone=16 proj=utm units=us-ft"
shpproj rowtest row3 -o="proj=utm zone=18 units=m" 
shpproj rowtest row3 -i=myfile.prj -o=geographic
shpproj rowtest row3 -is=myfile.prj

You can try ogr2ogr yourself by executing project_to_rdnew.cmd from the attachment; it takes rivers.shp and projects it from "GCS_WGS_1984" to "Amersfoort / RD New". The command line for this is: "ogr2ogr -t_srs EPSG:28992 rivers_rdnew.shp rivers.shp".

Minimizing VS 2008 releases memory

When your devenv mem usage gets to big. Minimize the main window and it will drop in size (smile)