Please note that for running D-Flow FM from Delft-FEWS only a pre-adapter is needed (a post-adapter is not needed).
D-Flow FM pre-adapter
Model pre-adapter for running D-Flow FM (D-Flow Flexible Mesh) model from Delft-FEWS.
For information about the D-Flow FM model see http://oss.deltares.nl/web/delft3d/d-flow-flexible-mesh
For the D-Flow FM user manual, D-Flow FM technical reference manual and other relevant manuals, see:
http://content.oss.deltares.nl/delft3d/manuals/D-Flow_FM_User_Manual.pdf
http://content.oss.deltares.nl/delft3d/manuals/D-Flow_FM_Technical_Reference.pdf
http://content.oss.deltares.nl/delft3d/manuals/
Usage: DFlowFMPreAdapter <netcdf run file pathname relative to current working directory>
Class name: nl.deltares.dflowfm.DFlowFMPreAdapter
Properties
property | required | description |
---|---|---|
model_id | required | Identifier of the model. This should be the first part of the .mdu file name. This is used to find the relevant .mdu file(s) and restart .nc file(s), as described below. |
mdu_file | deprecated, do not use | Pathname of the mdu file to update. This should be either an absolute path or a path relative to the workDir specified in the netcdf run file. |
input_grid_files_to_convert | optional, deprecated | One or more pathnames of netcdf files with input grid data that should be converted, separated by semi-colons (;). Each pathname should be either an absolute path or a path relative to the workDir specified in the netcdf run file. The latest version of D-Flow FM can directly read grid NetCDF files exported from Delft-FEWS (for the parameters wind, pressure, rainfall, radiation, temperature and humidity), so converting grid files is no longer needed. |
Notes for users
- For all files that are written by this adapter, if the file to be written already exists, then it will be overwritten.
- This program assumes that the model always runs in time zone GMT.
- This program writes log messages to a log file called dflowfm_pre_adapter_log.txt in the workDir specified in the netcdf run file.
- The pre-adapter uses the information in the specified netcdf run file as input for its activities (see below).
pre-adapter activities: Update MDU file(s)
In the found mdu file(s) the following entries will be updated automatically (no tags needed):
entry | value |
---|---|
TStart | Start time of the model run, in units of Tunit relative to RefDate. |
TStop | End time of the model run, in units of Tunit relative to RefDate. |
RestartFile | The pathname of the input state file relative to the MDU file if the input state file is not empty (warm state start); an empty string if the input state file is an empty dummy file of 0 bytes length (cold state start); or an empty string if there is no input state file at all (cold state start). |
RestartDateTime | The restart time; this is set equal to the start time of the model run. |
RstInterval | Interval (in seconds) for writing *_rst.nc restart files. RstInterval can be overridden by a property in the run info file called "restartIntervalForFm". |
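As an illustration, the updated entries in an mdu file might look like the sketch below. The values, file names and RefDate are hypothetical, and the section layout is only an assumption based on typical D-Flow FM mdu files; check your own .mdu for the actual sections.

```
[time]
RefDate         = 20160407        # reference date (not changed by the adapter)
Tunit           = S               # unit of TStart/TStop relative to RefDate
TStart          = 0.              # set by the pre-adapter: start of the run
TStop           = 86400.          # set by the pre-adapter: end of the run

[restart]
RestartFile     = ../instate/gtsm_rst.nc  # set by the pre-adapter: empty string for a cold start
RestartDateTime = 20160407000000          # set by the pre-adapter: equal to the run start time

[output]
RstInterval     = 86400.          # set by the pre-adapter (overridable via restartIntervalForFm)
```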
MDU file name format
The mdu file(s) to update should be in the workDir specified in the netcdf run file. This program only supports:
- a single mdu file with file name format <model_id>.mdu (valid example: gtsm.mdu), or
- (in case of domain decomposition) one mdu file for each partition with file name format <model_id>_<partition_number>.mdu (valid examples: gtsm_0000.mdu, gtsm_0001.mdu, gtsm_0002.mdu)
Restart file name format
In case of a warm state start, there should be exactly one input state file (restart file) for each mdu file. This program only supports a single input state file with file name format
- <model_id>_rst.nc (valid example: gtsm_rst.nc), or
- <model_id>_<timestamp>_rst.nc (valid example: gtsm_20160407_140000_rst.nc)
In case of domain decomposition there should be one input state file for each partition with file name format
- <model_id>_<partition_number>_rst.nc (valid example: gtsm_0001_rst.nc), or
- <model_id>_<partition_number>_<timestamp>_rst.nc (valid example: gtsm_0001_20160407_140000_rst.nc)
pre-adapter activities: Update external forcing files (optional)
The pre-adapter (version Delft-FEWS 2017.01.01 and upwards) converts exported scalar timeseries from xml to external forcing files (i.e. .bc and .tim files). Please note the following:
- The pre-adapter reads the .ext files referenced in the .mdu file, and finds there the files to check for keywords
- See Delft3D adapter with NetCDF run file#Templatefilesandkeywords on how to use template files and keywords.
- The pre-adapter can handle at most two different forcing files, so if you list more in the .mdu file they will be ignored.
- The pre-adapter assumes a timestep in minutes. Originally, the .tim files could only handle minutes. The newer .bc files can use any unit, but the pre-adapter still assumes minutes. The pre-adapter does not interpret the Tunit specified in the .mdu file (implementing this would be an improvement to the pre-adapter.)
- Example config files can be found elsewhere on this wiki page
pre-adapter activities: Convert input grid time series (optional)
The netcdf file(s) specified in the property "input_grid_files_to_convert" will be converted to files in arcinfo/curvi format. Each netcdf file will be converted to a file with the same path and name as the netcdf file but with a different extension (.amu, .amv or .amp). If the property "input_grid_files_to_convert" is not specified, then this step does nothing.
Conversion from variable to file extension
Each netcdf file should contain only one variable with grid data. A netcdf file with multiple variables with grid data results in an error. The extension of the created file depends on the name of the variable in the netcdf file. For example, the original file input/x_wind.nc is converted to input/x_wind.amu.
netcdf variable name | extension |
---|---|
x_wind | .amu |
y_wind | .amv |
air_pressure | .amp |
precipitation | not supported by DFlowFM (for rainfall DFlowFM can use the netcdf file directly) |
any other name | not supported by DFlowFM |
Auxiliary grid file
The format (meteo_on_equidistant_grid/meteo_on_curvilinear_grid) of each of the created files depends on whether there is an auxiliary grid file present for that file. To use an auxiliary grid file for a given netcdf file, it must have the same path and name as the netcdf file, but a different extension (.grd). If an auxiliary grid file is present, then the netcdf file will be converted to a curvi file of type meteo_on_curvilinear_grid that refers to the auxiliary grid file. Otherwise it will be converted to an arcinfo file of type meteo_on_equidistant_grid. For rectangular and curvilinear grids there must always be an auxiliary grid file present, otherwise an error is given. For regular grids no auxiliary grid file is needed.
grid type | auxiliary grid (.grd) file needed | type of created file |
---|---|---|
regular | no | arcinfo file of type meteo_on_equidistant_grid |
rectangular | yes | curvi file of type meteo_on_curvilinear_grid |
curvilinear | yes | curvi file of type meteo_on_curvilinear_grid |
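As a hypothetical illustration of the naming convention described above (file names are examples, not prescribed):

```
input/
  x_wind.nc        # exported from Delft-FEWS
  x_wind.amu       # created: arcinfo (meteo_on_equidistant_grid); no x_wind.grd present
  air_pressure.nc  # exported from Delft-FEWS
  air_pressure.grd # auxiliary grid file supplied by the configurator
  air_pressure.amp # created: curvi (meteo_on_curvilinear_grid), refers to air_pressure.grd
```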
Order of grid cells written
The order of the grid cell values in the arcinfo/curvi grid file created by this adapter depends on the type of grid (regular/rectangular/curvilinear).
- For regular grids (arcinfo meteo_on_equidistant_grid file format) the grid cells are always ordered per row from left to right, starting with the upper row of the grid.
- For rectangular grids (curvi meteo_on_curvilinear_grid file format) the grid cells are always ordered per row from left to right, starting with the upper row of the grid.
- For curvilinear grids (curvi meteo_on_curvilinear_grid file format) the grid cells are always in the same order as in the netcdf file, which depends in turn on the order of the grid cells in the corresponding grid definition in Delft-FEWS (if the file was exported from Delft-FEWS).
If an auxiliary grid (.grd) file is used, then the grid cell coordinates in the .grd file must be in the same order as the grid cell values in the corresponding curvi file. The easiest way to accomplish this is to run the adapter once, check the order of the grid cell values in the created curvi file(s), and then manually make sure that the grid cell coordinates in the corresponding .grd file(s) are in the right order.
Coordinate system used
The coordinate system for the coordinates in an arcinfo meteo_on_equidistant_grid file created by this adapter depends on the coordinate system used in the netcdf file, which depends in turn on the coordinate system (geodatum) in the corresponding grid definition in Delft-FEWS (if the file was exported from Delft-FEWS). You need to manually make sure that this is the same coordinate system as the one used by the model.
D-Flow FM data in Delft-FEWS
3D data: sigma layer vs Z layer
At the moment there is no example yet of a z layer D-Flow FM model connected to Delft-FEWS. All documentation and examples below are related to sigma layer models. Z layers are supported by the model adapter though. An example of import and display of z layers in a Delft-3D model can be found at Delft3D adapter - 4.3 Import and display 3D data (z layers)
Export of 3D data in generalAdapter (z layers) - NETCDF-CF_ZLAYERS
Scalar time series at the same geo point Z but with different X,Y are considered to be a Z-layer. All available Z’s are used to create a Z-axis (layer axis) in the NetCdf file, and the time series values are written to the associated Z element. To export scalar time series as Z_layers in GA with NETCDF-CF_ZLAYERS, use option <exportZLayers>true</exportZLayers> in the exportNetcdfActivity (see config example below).
An example of such a variable: float salinity(time=5, node=26, z=40);
- Values of the Z-axis are stored in meters.
- Per parameter only one Z-axis is allowed. Different parameters may have different Z-axes.
- Z-axis values are sorted in ascending order.
- The number of stations in the nc file equals the number of unique X,Y pairs that are available in the scalar time series.
- The location ids/names associated with the first (lowest) Z are written to the nc file as station ids/names. If parentLocations are configured, then the IdMap can be used to write the parentLocation ids to the nc file.
- By default the long_name attribute of a parameter is equal to the parameter id. This default is overridden by configuring a parameter description in Parameters.xml, which is then used as long_name in the nc file instead.
- In GA the default missing value for time series is -999. You can override it in GA using <missVal>, for example <missVal>NaN</missVal>.
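Putting the points above together, the structure of the exported file can be sketched in CDL notation. Dimension sizes follow the salinity example above; the attribute values shown are illustrative assumptions, not a dump of an actual export:

```
dimensions:
    time = 5 ;
    node = 26 ;   // one station per unique X,Y in the scalar time series
    z = 40 ;      // Z-axis (layer axis) built from all available Z values
variables:
    double time(time) ;
    double z(z) ;
        z:units = "m" ;                 // Z values in meters, ascending
    float salinity(time, node, z) ;
        salinity:long_name = "salinity" ;  // parameter id, or description from Parameters.xml
        salinity:_FillValue = -999.f ;     // GA default missing value, overridable via <missVal>
```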
Use of nl.wldelft.netcdf.NetcdfZLayersPlusTimeSeriesSerializer
If at least one of the parameters to export has parameter description "uxuyadvectionvelocitybnd", then the following extra variable is written to the nc file (length 50 is just an example):
float uxuyadvectionvelocitybnd(z=50);
  :vector = "ux, uy";
Use of this serializer is required for Delft3D-FM to correctly read netcdf advection boundaries.
Import of partitioned 3D data in General Adapter
Partitioned 3D data means that the model domain is divided into several sub-domains to improve model performance. The model results per sub-domain can either be merged before the import back into Delft-FEWS, or the sub-domains are imported individually and merged on the fly. The second option is recommended as it improves the performance of Delft-FEWS. The depth dimension in D-Flow FM can either be indicated as z-level (absolute depth) or sigma coordinate (% of depth). The basis of handling sigma and z-layers is the same in Delft-FEWS. The only difference is the definition of the respective parameter that must be interpreted. Key functionality for the import of partitioned 3D data is the import property layerIndexAsQualifierId in the General Adapter, which makes it possible to link external, partition-based locations to internal, partition- and layer-based locations. Please find an example of the import of partitioned 3D data under the configuration examples.
Sigma Layers
In case of sigma-layer data, a locationSet links the sigma layer/coordinate (% of depth) to a layer index and a parent location which determines the respective sub-domain. This can be organized in the form of a .csv file. To indicate that Delft-FEWS must interpret the sigma layer/coordinate as the third dimension, define the layerSigmaCoordinate element in the locationSet that reads the respective .csv file.
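A minimal sketch of such a .csv file. The ids and sigma values are hypothetical; the columns match those referenced by the sigma-layer locationSet example further down this page:

```
ID,PARENT_ID,LAYER_INDEX,SIGMA_COORDINATE
loc_0000_layer_0,loc_0000,0,1.0
loc_0000_layer_1,loc_0000,1,0.5
loc_0000_layer_2,loc_0000,2,0.0
loc_0001_layer_0,loc_0001,0,1.0
```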
Z-Layers
In case of z-layer data, a locationSet links the z-layer (absolute depth) to a layer index and a parent location which determines the respective sub-domain. This can be organized in the form of a .csv file. To indicate that Delft-FEWS must interpret the z-layer as the third dimension, define the z-coordinate element in the locationSet that reads the respective .csv file.
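A minimal sketch of the z-layer variant of the .csv file. The ids and depths are hypothetical; the columns match those referenced by the z-layer locationSet example further down this page:

```
ID,PARENT_ID,LAYER_INDEX,Z
loc_0000_layer_0,loc_0000,0,-0.5
loc_0000_layer_1,loc_0000,1,-5.0
loc_0000_layer_2,loc_0000,2,-20.0
```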
Display of 2D data for overlapping domains
A common model set-up in D-Flow FM will make use of several models for a project area. For example, you could have a large-scale coarse regional model, with one (or more) finer model(s) zooming in on the area of interest located within the regional model area, i.e. the model areas overlap. When you display all models in one gridPlot, you don't want to see the coarse grid peeking out from underneath the finer local model(s). The example in the figure has 3 different model areas, going from coarse (blue grid), through intermediate (red grid) to a fine resolution (green grid).
In order to hide the coarser grids where a finer grid is available, you'll need to define a shapefile which masks the area of each model domain that you would like visualised in the gridDisplay. These shapefiles should (barely) overlap, see example in the figure on the right. All data is still available in the gridPlot, but only the data within the shapefile contour is displayed in the gridDisplay. When the user double-clicks in the gridDisplay, a timeSeriesDisplay with scalar data for all models with data for that location (in this example up to 3) is displayed, even though the data might be masked in the gridDisplay. Config example below.
Display of 3D data (sigma layers)
In GridDisplay.xml you need to configure an additional sigmaScaleReferenceTimeSeriesSet for sigma layers (see config example below). An on-the-fly transformation allows the user to dynamically interpolate between sigma layers in the grid display. For more information on the GridDisplay visit 01 Grid Display - sigmaScaleReferenceTimeSeriesSet
System requirements
- This program needs Java version 8 or higher.
- This program needs the following Java libraries:
- commons-httpclient-3.0.1.jar
- Delft_Util.jar (revision 60330)
- grib-8.0.jar
- log4j-1.2.14.jar
- netcdf-4.2.jar
- slf4j-api-1.5.6.jar
- slf4j-log4j12-1.5.6.jar
- TimeSeriesImport.jar (revision 60330)
Configuration examples
Update external forcing files
Example of changes to the config when dealing with a D-Flow FM model with scalar boundary conditions.
Excerpt from .mdu file related to external forcing
[external forcing]
ExtForceFileNew = mackay_bnd.ext     # ExtForceFileNew DDB uses new format
ExtForceFile    = mackay_pioneer.ext # *.ext
Example .ext file with reference to .bc file:
[boundary]
quantity     = waterlevelbnd
locationfile = mackay_bnd.pli
forcingfile  = mackay_bnd.bc
With corresponding .pli file:
mackay_bnd
1 2
1.49201410e+02 -2.10467553e+01 mackay_bnd_0001
and corresponding .bc file:
[forcing]
Name               = mackay_bnd_0001
Function           = timeseries
Time-interpolation = linear
Quantity           = time
Unit               = minutes since 2017-01-01 00:00:00
Quantity           = waterlevelbnd
Unit               = m
$(FLOW_TIMESERIES: waterlevelbnd/mackay_bnd_0001)
Example .ext file with .tim file:
QUANTITY=discharge_salinity_temperature_sorsin
FILENAME=mackay_pioneer.pli
FILETYPE=9
METHOD=1
OPERAND=O
AREA=1
With corresponding .pli file:
pioneer
1 2
1.49099320e+02 -2.11482830e+01
and corresponding .tim file:
$(FLOW_TIMESERIES: Q_sim_fcst/pioneer)
Example of exportTimeSeriesActivity in General Adapter config file:
<exportTimeSeriesActivity>
    <exportFile>timeseries.xml</exportFile>
    <timeSeriesSets>
        <timeSeriesSet>
            <moduleInstanceSetId>URBS_Forecast</moduleInstanceSetId>
            <valueType>scalar</valueType>
            <parameterId>Q.sim.fcst</parameterId>
            <locationSetId>DFLOWFM_river.$CATCHMENT$_$SUBCATCHMENT$</locationSetId>
            <timeSeriesType>simulated forecasting</timeSeriesType>
            <timeStep unit="minute" multiplier="15"/>
            <relativeViewPeriod unit="day" start="0" end="3"/>
            <readWriteMode>read only</readWriteMode>
        </timeSeriesSet>
        <timeSeriesSet>
            <moduleInstanceId>ImportROMS</moduleInstanceId>
            <valueType>scalar</valueType>
            <parameterId>H.tidal.fcst</parameterId>
            <locationSetId>DFLOWFM_coastal.$CATCHMENT$_$SUBCATCHMENT$</locationSetId>
            <timeSeriesType>external forecasting</timeSeriesType>
            <timeStep unit="minute" multiplier="30"/>
            <readWriteMode>read complete forecast</readWriteMode>
        </timeSeriesSet>
    </timeSeriesSets>
</exportTimeSeriesActivity>
D-Flow FM model configuration example (single domain, 2D data import)
Example of a FEWS general adapter configuration that uses the DFlow-FM adapter.
<?xml version="1.0" encoding="UTF-8"?>
<generalAdapterRun xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews http://fews.wldelft.nl/schemas/version1.0/generalAdapterRun.xsd">
    <general>
        <rootDir>$REGION_HOME$\Modules\dflowfm</rootDir>
        <workDir>%ROOT_DIR%</workDir>
        <exportDir>%ROOT_DIR%\dflowfm_curacao\input</exportDir>
        <exportDataSetDir>$REGION_HOME$\Modules</exportDataSetDir>
        <exportIdMap>IdExport_DFlowFM</exportIdMap>
        <importDir>%ROOT_DIR%\dflowfm_curacao\output</importDir>
        <importIdMap>IdImport_DFlowFM</importIdMap>
        <dumpFileDir>$GA_DUMPFILEDIR$</dumpFileDir>
        <dumpDir>%ROOT_DIR%\dflowfm_curacao\dump</dumpDir>
        <diagnosticFile>%ROOT_DIR%\dummy.xml</diagnosticFile>
        <missVal>-999.</missVal>
        <!-- Take care this should be the timezone the computer is running in -->
        <timeZone>
            <timeZoneOffset>-04:00</timeZoneOffset>
        </timeZone>
        <endDateTimeFormat>yyyyMMdd_HHmmss</endDateTimeFormat>
    </general>
    <activities>
        <startUpActivities>
            <purgeActivity>
                <filter>%ROOT_DIR%\dflowfm_curacao\output\*.*</filter>
            </purgeActivity>
            <purgeActivity>
                <filter>%ROOT_DIR%\dflowfm_curacao\input\*.*</filter>
            </purgeActivity>
        </startUpActivities>
        <exportActivities>
            <exportStateActivity>
                <moduleInstanceId>DFlowFM_curacao_Historical</moduleInstanceId>
                <stateExportDir>%ROOT_DIR%\dflowfm_curacao\instate</stateExportDir>
                <stateSelection>
                    <warmState>
                        <stateSearchPeriod unit="hour" start="-23" end="-2"/>
                    </warmState>
                </stateSelection>
            </exportStateActivity>
            <exportNetcdfActivity>
                <exportFile>air_pressure.nc</exportFile>
                <timeSeriesSets>
                    <timeSeriesSet>
                        <moduleInstanceId>WFLOW_curacao_GA_Historical</moduleInstanceId>
                        <valueType>grid</valueType>
                        <parameterId>P.specific</parameterId>
                        <locationId>wflow_curacao</locationId>
                        <timeSeriesType>simulated historical</timeSeriesType>
                        <timeStep unit="hour" multiplier="1"/>
                        <relativeViewPeriod unit="hour" end="0"/>
                        <readWriteMode>add originals</readWriteMode>
                    </timeSeriesSet>
                </timeSeriesSets>
            </exportNetcdfActivity>
            <exportNetcdfRunFileActivity>
                <description>This run file is passed as argument to DFLOWFM pre adapter</description>
                <exportFile>%WORK_DIR%\run_info.nc</exportFile>
                <properties>
                    <string key="model_id" value="dflowfm_curacao\curacao"/>
                    <string key="input_grid_files_to_convert" value="%ROOT_DIR%\dflowfm_curacao\input\x_wind.nc;%ROOT_DIR%\dflowfm_curacao\input\air_pressure.nc"/>
                </properties>
            </exportNetcdfRunFileActivity>
        </exportActivities>
        <executeActivities>
            <executeActivity>
                <description>DFlowFM pre adapter</description>
                <command>
                    <className>nl.deltares.dflowfm.DFlowFMPreAdapter</className>
                    <binDir>adapter</binDir>
                </command>
                <arguments>
                    <argument>%WORK_DIR%\run_info.nc</argument>
                </arguments>
                <logFile>
                    <file>%WORK_DIR%\dflowfm_pre_adapter_log.txt</file>
                    <errorLinePattern>*ERROR*</errorLinePattern>
                    <warningLinePattern>*WARN*</warningLinePattern>
                    <infoLinePattern>*INFO*</infoLinePattern>
                    <debugLinePattern>*DEBUG*</debugLinePattern>
                </logFile>
                <timeOut>99999999</timeOut>
                <ignoreDiagnostics>true</ignoreDiagnostics>
            </executeActivity>
            <executeActivity>
                <description>Run DFLOWFM</description>
                <command>
                    <executable>bin\unstruc.exe</executable>
                </command>
                <arguments>
                    <argument>--autostartstop</argument>
                    <argument>dflowfm_curacao\curacao.mdu</argument>
                </arguments>
                <logFile>
                    <file>%WORK_DIR%\dflowfm_curacao\curacao.dia</file>
                    <errorLinePattern>*ERROR*</errorLinePattern>
                    <warningLinePattern>*WARNING*</warningLinePattern>
                    <debugLinePattern>*INFO*</debugLinePattern>
                    <debugLinePattern>*DEBUG*</debugLinePattern>
                </logFile>
                <timeOut>44200000</timeOut>
                <ignoreDiagnostics>true</ignoreDiagnostics>
            </executeActivity>
        </executeActivities>
        <importActivities>
            <importStateActivity>
                <stateFile>
                    <importFile>%WORK_DIR%\dflowfm_curacao\output\curacao_%END_DATE_TIME%_rst.nc</importFile>
                    <relativeExportFile>curacao_%END_DATE_TIME%_rst.nc</relativeExportFile>
                </stateFile>
            </importStateActivity>
            <importNetcdfActivity>
                <importFile>%WORK_DIR%\dflowfm_curacao\output\curacao_map.nc</importFile>
                <timeSeriesSets>
                    <timeSeriesSet>
                        <moduleInstanceId>DFlowFM_curacao_Historical</moduleInstanceId>
                        <valueType>grid</valueType>
                        <parameterId>H.sim</parameterId>
                        <locationId>DFlowFM_Curacao</locationId>
                        <timeSeriesType>simulated historical</timeSeriesType>
                        <timeStep unit="nonequidistant"/>
                        <readWriteMode>add originals</readWriteMode>
                    </timeSeriesSet>
                </timeSeriesSets>
            </importNetcdfActivity>
        </importActivities>
    </activities>
</generalAdapterRun>
Z layers - export D-Flow FM 3D results (NETCDF-CF_ZLAYERS)
<activities>
    <exportActivities>
        <exportNetcdfActivity>
            <exportFile>hycom_boundary.nc</exportFile>
            <exportZLayers>true</exportZLayers>
            <timeSeriesSets>
                <timeSeriesSet>
                    <moduleInstanceId>Interpolate_Boundaries_HYCOM_forecast</moduleInstanceId>
                    <valueType>scalar</valueType>
                    <parameterId>S.simulated</parameterId>
                    <locationSetId>HYCOM.Boundaries_AllLayers</locationSetId>
                    <timeSeriesType>simulated forecasting</timeSeriesType>
                    <timeStep unit="hour" multiplier="3"/>
                    <relativeViewPeriod unit="day" start="0" end="1" endOverrulable="true"/>
                    <readWriteMode>read only</readWriteMode>
                </timeSeriesSet>
            </timeSeriesSets>
            <timeSeriesSets>
                <timeSeriesSet>
                    <moduleInstanceId>Interpolate_Boundaries_HYCOM_forecast</moduleInstanceId>
                    <valueType>scalar</valueType>
                    <parameterId>T.simulated</parameterId>
                    <locationSetId>HYCOM.Boundaries_AllLayers</locationSetId>
                    <timeSeriesType>simulated forecasting</timeSeriesType>
                    <timeStep unit="hour" multiplier="3"/>
                    <relativeViewPeriod unit="day" start="0" end="1" endOverrulable="true"/>
                    <readWriteMode>read only</readWriteMode>
                </timeSeriesSet>
            </timeSeriesSets>
        </exportNetcdfActivity>
    </exportActivities>
</activities>
Import partitioned D-Flow FM 3D results
The import of the partitioned D-Flow FM 3D results requires a specific organization of the locations, as described in the chapter Import of partitioned 3D data in General Adapter. The location names should be consistent across the locationSets as well as the data and grid file names, so that the individual layers and sub-domains can be configured as patterns. In the example "loc" is used as the model name, which is automatically used to name model output such as the _map.nc file.
The partitioned gridded model output is written in the format {ModelName}_{PartitionNumber}_map.nc (e.g. loc_0000_map.nc, loc_0001_map.nc) and serves as the basis for the location, layer and partition (parent) definition in the configuration.
Add NetCDF import to General Adapter
Use the import NetCDF function of the general adapter to import all map.nc output files of the model. Make sure you include the following three functionalities:
- fileNamePatternFilter: loop over all available map files
- fileNameLocationIdPattern: automatically retrieve the external location id (the location id of sub-domains/partitions is going to be the parent id for all model layer locations)
- property layerIndexAsQualifierId: assign a layer index based on the order of the layers as external qualifier to each external sub-location id (parent id) which can be used to map this external id per sub-domain to an internal location per sub-domain and layer
<importNetcdfActivity>
    <folder>%ROOT_DIR%/output</folder>
    <fileNamePatternFilter>*_map.nc</fileNamePatternFilter> <!-- loop over all available map files -->
    <fileNameLocationIdPattern>(.*)\_map.nc</fileNameLocationIdPattern> <!-- automatically retrieve the external location id (location id of sub-domains/partitions is going to be the parent id for all model layer locations) -->
    <timeSeriesSets>
        <timeSeriesSet>
            <moduleInstanceId>$MODULE_INSTANCE_ID$</moduleInstanceId>
            <valueType>grid</valueType>
            <parameterId>TW_hc</parameterId>
            <locationSetId>model_layers</locationSetId>
            <timeSeriesType>simulated historical</timeSeriesType>
            <timeStep unit="minute" multiplier="30"/>
            <readWriteMode>add originals</readWriteMode>
        </timeSeriesSet>
    </timeSeriesSets>
    <properties>
        <bool key="layerIndexAsQualifierId" value="true"/> <!-- assign a layer index based on the order of the layers as external qualifier to each external sub-location id (parent id), which can be used to map this external id per sub-domain to an internal location per sub-domain and layer -->
    </properties>
</importNetcdfActivity>
Define grids of all sub-domains (partitions)
As each partition is based on an individual sub-domain, each of the domains has to be defined as a grid in Delft-FEWS. Delft-FEWS can read the grid definition from a NetCDF file. Make sure that the file name includes the model name and partition number, which is going to be the location id for the parent id (partitions). In D-Flow FM the grid is defined in the map.nc file. It may, however, be necessary to remove variables that have a time dimension from the map.nc file before using it as a grids file in Delft-FEWS. This can for example be done with a short Python script.
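The core of such a script is selecting the variables without a time dimension. A minimal sketch of that selection step in pure Python follows; the variable names are illustrative, and the netCDF4 usage in the comments is an assumption about how you would apply it to a real _map.nc file:

```python
# Sketch: pick the variables to keep when stripping time-dependent data
# from a D-Flow FM *_map.nc file so it can serve as a static grid file.
# With the netCDF4 package you would build var_dims from
# {name: var.dimensions for name, var in Dataset("loc_0000_map.nc").variables.items()}
# and copy only the returned variables to a new file.

def static_variables(var_dims):
    """Return the names of variables that do not depend on the time dimension."""
    return [name for name, dims in var_dims.items() if "time" not in dims]

# Illustrative subset of a _map.nc file's variables -> dimensions:
example = {
    "mesh2d_node_x": ("nmesh2d_node",),
    "mesh2d_node_y": ("nmesh2d_node",),
    "mesh2d_flowelem_domain": ("nmesh2d_face",),
    "mesh2d_s1": ("time", "nmesh2d_face"),  # water level, time-dependent
    "time": ("time",),
}
print(static_variables(example))
# -> ['mesh2d_node_x', 'mesh2d_node_y', 'mesh2d_flowelem_domain']
```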
In the grids.xml you can then read all grid NetCDF files by defining a fileNameLocationIdPattern. Additionally, define the spatialSubdomainNumberVariable and the visibleSpatialSubdomainNumberFileNamePattern.
<netcdfFiles>
    <fileNameLocationIdPattern>(.*)_$INSTITUTION$_$N_PARTITIONS$_partitions_flowgeom\.nc</fileNameLocationIdPattern>
    <meshTopologyVariableName>mesh2d</meshTopologyVariableName>
    <staggerLocation>face</staggerLocation>
    <spatialSubdomainNumberVariable>mesh2d_flowelem_domain</spatialSubdomainNumberVariable>
    <visibleSpatialSubdomainNumberFileNamePattern>.*_(.*)_$INSTITUTION$_$N_PARTITIONS$_partitions_flowgeom\.nc</visibleSpatialSubdomainNumberFileNamePattern>
</netcdfFiles>
Define all sub-domain (partitions) locations in a location set so that they can be used as parent location for the locations per individual layer and sub-domain.
<locationSet id="model_partitions">
    <csvFile>
        <file>FmModelPartitions.csv</file>
        <geoDatum>WGS 1984</geoDatum>
        <id>%ID%</id>
        <name>%NAME%</name>
        <x>0</x>
        <y>0</y>
        <attribute id="EXTERNAL_ID">
            <text>%EXTERNAL_ID%</text>
        </attribute>
    </csvFile>
</locationSet>
Link sub-domains to layers
In a .csv link the parent location which indicates the whole partition/sub-domain to individual layers within each domain. This can either be done based on sigma-layers or z-layers depending on the model. In the locationSets.xml you have to either define the layerSigmaCoordinate or the z-coordinate to indicate to Delft-FEWS which parameter to use for the depth dimension.
location sets for sigma-layers
<locationSet id="model_layers">
    <csvFile>
        <file>FmModelLayers.csv</file>
        <geoDatum>WGS 1984</geoDatum>
        <id>%ID%</id>
        <parentLocationId>%PARENT_ID%</parentLocationId>
        <x>0</x>
        <y>0</y>
        <layerSigmaCoordinate>%SIGMA_COORDINATE%</layerSigmaCoordinate>
        <attribute id="SIGMA_COORDINATE">
            <number>%SIGMA_COORDINATE%</number>
        </attribute>
        <attribute id="LAYER_INDEX">
            <text>%LAYER_INDEX%</text>
        </attribute>
        <attribute id="PARENT_ID">
            <text>%PARENT_ID%</text>
        </attribute>
    </csvFile>
</locationSet>
location sets for z-layers
<locationSet id="model_layers">
    <csvFile>
        <file>FmModelLayers.csv</file>
        <geoDatum>WGS 1984</geoDatum>
        <id>%ID%</id>
        <parentLocationId>%PARENT_ID%</parentLocationId>
        <x>0</x>
        <y>0</y>
        <z>%Z%</z>
        <attribute id="LAYER_INDEX">
            <text>%LAYER_INDEX%</text>
        </attribute>
        <attribute id="PARENT_ID">
            <text>%PARENT_ID%</text>
        </attribute>
    </csvFile>
</locationSet>
Map external and internal locations
The external locations are only based on the respective sub-domain (partition), whereas internally we require locations that are individual per sub-domain (partition) and layer. To be able to map the external locations to the internal requirements, we have prepared the following steps above:
- Assign an external qualifier to each external location that indicates the layer (in the General Adapter).
- Link a parent id that represents the respective sub-domain (partition) to layer-specific locations.
First map the external sub-domain (partition) locations to internal sub-domain (partition) locations (if necessary).
<locationIdFunction internalLocationSet="NeptuneFm_model_partitions" externalLocationFunction="@EXTERNAL_ID@"/>
Second map the external sub-domain (partition) locations to internal sub-domain (partition) and layer locations.
<locationIdFunction internalLocationSet="NeptuneFmWaq_model_layers" externalLocationFunction="@PARENT_ID@" externalQualifierFunction="@LAYER_INDEX@"/>
Sigma layers - display D-Flow FM 3D results in GridDisplay
This example builds on the grid, location and locationSet defined in the example above. It assumes sigma layers for a multi domain model.
<gridDisplay xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews http://fews.wldelft.nl/schemas/version1.0/gridDisplay.xsd">
    <title>title</title>
    <gridPlotGroup id="gridPlotGroupId" name="gridPlotGroupName">
        <gridPlot id="gridPlotId" name="gridPlotName">
            <dataLayer>
                <uTimeSeriesSet>
                    <moduleInstanceId>DFlowFM_HC</moduleInstanceId>
                    <valueType>grid</valueType>
                    <parameterId>parameter</parameterId>
                    <locationSetId>SigmaLayer_0_####</locationSetId> <!-- Configure only the top layer (index = 0). The sibling locations (i.e. other sigma layers) are resolved through the parent of the top layer. -->
                    <timeSeriesType>simulated historical</timeSeriesType>
                    <timeStep unit="nonequidistant"/>
                    <readWriteMode>read complete forecast</readWriteMode>
                </uTimeSeriesSet>
                <vTimeSeriesSet>
                    <moduleInstanceId>DFlowFM_HC</moduleInstanceId>
                    <valueType>grid</valueType>
                    <parameterId>parameter</parameterId>
                    <locationSetId>SigmaLayer_0_####</locationSetId> <!-- Configure only the top layer (index = 0). The sibling locations (i.e. other sigma layers) are resolved through the parent of the top layer. -->
                    <timeSeriesType>simulated historical</timeSeriesType>
                    <timeStep unit="nonequidistant"/>
                    <readWriteMode>read complete forecast</readWriteMode>
                </vTimeSeriesSet>
                <sigmaScaleReferenceTimeSeriesSet> <!-- when this is configured, a vertical slider automatically becomes visible in the GridDisplay to slide through the water column -->
                    <moduleInstanceId>DFlowFM_HC</moduleInstanceId>
                    <valueType>grid</valueType>
                    <parameterId>parameter</parameterId>
                    <locationSetId>sigma.merged</locationSetId> <!-- parentLocations for all domains, linking to all sigma layers -->
                    <timeSeriesType>simulated historical</timeSeriesType>
                    <timeStep unit="nonequidistant"/>
                    <readWriteMode>read complete forecast</readWriteMode>
                </sigmaScaleReferenceTimeSeriesSet>
            </dataLayer>
            <verticalSliderRange start="0" end="100"/>
            <!-- limit the min and max water depth used in the vertical slider in the GridDisplay -->
            <!-- if not configured, the range of the slider is automatically set to cover all available water depths in the grid for the entire period displayed -->
        </gridPlot>
    </gridPlotGroup>
</gridDisplay>
Masking - display D-Flow FM 3D results in GridDisplay (multiple overlapping models)
When you have multiple overlapping models and you want to control which data is displayed in the gridDisplay, you can make use of a masking shapefile. Link this file to a locationSet. The <id> specified in the <esriShapeFile> config can be used in the GridDisplay, which will mask the data shown. For more information see Display of 2D data for overlapping domains above.
<locationSet id="wave_clipper.shp">
    <esriShapeFile>
        <file>wave_clipper.shp</file>
        <id>Wave</id>
        <x>0</x>
        <y>0</y>
    </esriShapeFile>
</locationSet>
<valueTimeSeriesSet>
    <moduleInstanceId>Wave_HC</moduleInstanceId>
    <valueType>grid</valueType>
    <parameterId>Wave.height.simulated</parameterId>
    <locationId>Wave</locationId>
    <timeSeriesType>simulated historical</timeSeriesType>
    <timeStep multiplier="1" unit="hour"/>
    <readWriteMode>read complete forecast</readWriteMode>
</valueTimeSeriesSet>