Continuum
Continuum (Silvestro et al., 2013) is a continuous distributed hydrological model that relies strongly on a morphological approach, based on a novel technique for identifying the drainage network components (Giannoni et al., 2005). The model is a compromise between models with a strong empirical connotation, which are easy to implement but far from reality, and complex physically based models, which try to reproduce the hydrological processes in great detail but introduce a heavy parameterization, with consequent uncertainty and a lack of robust parameter identification. For more information about the Continuum model see http://www.cimafoundation.org/wp-content/uploads/doc/CONTINUUM_english.pdf
Continuum model adapter
Continuum pre-adapter
Model pre-adapter for running the Continuum model from CIMA Foundation.
Usage: python continuum_preadapter.py <netcdf run file pathname relative to current working directory>
Properties
domain_name | (required) | String representing the name of the model domain. This string is used to build output filenames and retrieve domain information files. |
time_resolution | (required) | Output time resolution in hours. |
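The two required properties could be validated at the start of the pre-adapter before any files are written. The following is a minimal sketch of such a check; the function name and the plain-dict input are assumptions for illustration, not the actual adapter code:

```python
def read_adapter_properties(props):
    """Validate the adapter properties from the netcdf run file.

    `props` is a plain dict as it might look after being extracted
    from the run file; both keys below are required.
    """
    if "domain_name" not in props:
        raise ValueError("missing required property: domain_name")
    try:
        resolution = int(props["time_resolution"])
    except (KeyError, TypeError, ValueError):
        raise ValueError("time_resolution must be an integer number of hours")
    if resolution <= 0:
        raise ValueError("time_resolution must be positive")
    return str(props["domain_name"]), resolution

# Example with the properties exported by the GeneralAdapterRun file below
domain, resolution = read_adapter_properties(
    {"domain_name": "boccadimagra", "time_resolution": 1})
```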
Notes for users
For all files that are written by this adapter, if the file to be written already exists, then it will be overwritten.
This program prints log messages on screen.
This program uses the information in the specified netcdf run file as input to perform the following actions:
Convert 2d data fields from the netcdf file to Continuum input binary files.
Create the Continuum configuration file by filling the provided template file. The template is filled with the data according to the table below
domaindir | domain working directory relative path |
domain | domain name |
input_data_dir | input data directory relative path |
numberofsections | number of hydrological sections for the output |
DEMLLLat, DEMLLLon, DEMCellSize, DEMNRows, DEMNCols | Georeference information for the DEM files |
METEOLLLat, METEOLLLon, METEOCellSize, METEONRows, METEONCols | Georeference information for the Continuum input files |
RainfallDuration | length (hours) of the input dataset |
StartDate | Start date string |
EndDate | End date string |
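Template filling of this kind can be done with plain string substitution. A minimal sketch follows; the `$key$` tag delimiter is an assumption for illustration, and the real Continuum template may use a different convention:

```python
def fill_template(template_text, values):
    """Replace $key$ tags in a configuration template with values.

    Assumes tags are written as $key$; the delimiter used by the
    actual Continuum template file may differ.
    """
    for key, value in values.items():
        template_text = template_text.replace("$%s$" % key, str(value))
    return template_text

# Hypothetical template fragment with two of the tags listed above
filled = fill_template(
    "sDomain = $domain$\nsStartDate = $StartDate$",
    {"domain": "boccadimagra", "StartDate": "201801010000"},
)
```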
System requirements
This program needs Python 2.7 or higher.
This program needs the following Python modules:
netCDF4
numpy
Continuum post-adapter
Model post-adapter for running the Continuum model from CIMA Foundation.
Usage: python continuum_postadapter.py <netcdf run file pathname relative to current working directory>
Notes for users
For all files that are written by this adapter, if the file to be written already exists, then it will be overwritten.
This program prints log messages on screen.
This program uses the information in the specified netcdf run file as input to convert the Continuum ascii hydrograph output file to netcdf format.
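Parsing the ascii hydrograph is the first half of that conversion. The sketch below assumes a whitespace-separated layout with one row per time step and one discharge column per hydrological section; the actual Continuum output layout may differ, and the netcdf writing step (via the netCDF4 module) is omitted:

```python
def parse_hydrograph(lines):
    """Parse ascii hydrograph lines into per-timestep rows of floats.

    Assumed format: optional '#' comment lines, then one row per
    time step with one discharge value per hydrological section.
    """
    rows = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment/header lines
        rows.append([float(value) for value in line.split()])
    return rows

# Hypothetical two-section hydrograph, two time steps
rows = parse_hydrograph(["# Q at sections S1 S2", "12.3 4.5", "11.8 4.4"])
```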
System requirements
This program needs Python 2.7 or higher.
This program needs the following Python modules:
netCDF4
numpy
GeneralAdapterRun Example Configuration
The following gives an example of how to set up the GeneralAdapterRun file for Continuum in FEWS using the model pre- and post-adapters. The GeneralAdapterRun file follows the general structure as described here.
General
In this section general information regarding the module, such as version number, file directories, missing values, and time zone, can be specified.
<general>
    <description>Continuum model run for $DOMINIUM$</description>
    <piVersion>1.8</piVersion>
    <rootDir>%REGION_HOME%/Modules/Continuum_$DOMINIUM$</rootDir>
    <workDir>%ROOT_DIR%</workDir>
    <exportDir>%ROOT_DIR%/input</exportDir>
    <exportDataSetDir>%ROOT_DIR%/</exportDataSetDir>
    <exportIdMap>IdExport_continuum</exportIdMap>
    <importDir>%ROOT_DIR%/output</importDir>
    <importIdMap>IdImport_ContinuumOutput</importIdMap>
    <dumpFileDir>$GA_DUMPFILEDIR$</dumpFileDir>
    <dumpDir>%ROOT_DIR%/diagnostics/</dumpDir>
    <diagnosticFile>%ROOT_DIR%/diagnostics/diagnostics.xml</diagnosticFile>
    <missVal>NaN</missVal>
    <timeZone>
        <timeZoneName>GMT+0:00</timeZoneName>
    </timeZone>
</general>
Start-up activities
Clear the model working directory of any previous runs before starting a new run.
<startUpActivities>
    <purgeActivity>
        <filter>%ROOT_DIR%/*</filter>
    </purgeActivity>
</startUpActivities>
Export activities
In this section the data to be exported from FEWS as input to the module is specified. Data to export to Continuum model includes:
Input data (air temperature, wind, relative humidity, rainfall, solar radiation)
Run file
Note: in the current version, input data are obtained from an interpolation procedure on the Bocca di Magra domain. This setting is fixed, but should be changed in order to generalize the adapter.
The run file contains information regarding the input file names, start and stop times, and time step. Additional properties can be passed using the run file as listed above under Properties.
<exportActivities>
    <exportDataSetActivity>
        <moduleInstanceId>Continuum_$DOMINIUM$</moduleInstanceId>
    </exportDataSetActivity>
    <exportNetcdfActivity>
        <exportFile>continuum_input.nc</exportFile>
        <timeSeriesSets>
            <timeSeriesSet>
                <moduleInstanceId>Interpolate_boccadimagra</moduleInstanceId>
                <valueType>grid</valueType>
                <parameterId>P.obs</parameterId>
                <locationId>boccadimagra_regular</locationId>
                <timeSeriesType>external historical</timeSeriesType>
                <timeStep unit="hour" multiplier="1"/>
                <relativeViewPeriod unit="day" start="-3" end="0" startOverrulable="true" endOverrulable="true"/>
                <readWriteMode>read only</readWriteMode>
            </timeSeriesSet>
            <timeSeriesSet>
                <moduleInstanceId>Interpolate_boccadimagra</moduleInstanceId>
                <valueType>grid</valueType>
                <parameterId>T.obs.drybulb</parameterId>
                <locationId>boccadimagra_regular</locationId>
                <timeSeriesType>external historical</timeSeriesType>
                <timeStep unit="hour" multiplier="1"/>
                <relativeViewPeriod unit="day" start="-3" end="0" startOverrulable="true" endOverrulable="true"/>
                <readWriteMode>read only</readWriteMode>
            </timeSeriesSet>
            <timeSeriesSet>
                <moduleInstanceId>Interpolate_boccadimagra</moduleInstanceId>
                <valueType>grid</valueType>
                <parameterId>Sol.obs</parameterId>
                <locationId>boccadimagra_regular</locationId>
                <timeSeriesType>external historical</timeSeriesType>
                <timeStep unit="hour" multiplier="1"/>
                <relativeViewPeriod unit="day" start="-3" end="0" startOverrulable="true" endOverrulable="true"/>
                <readWriteMode>read only</readWriteMode>
            </timeSeriesSet>
            <timeSeriesSet>
                <moduleInstanceId>Interpolate_boccadimagra</moduleInstanceId>
                <valueType>grid</valueType>
                <parameterId>Wind.obs.speed</parameterId>
                <locationId>boccadimagra_regular</locationId>
                <timeSeriesType>external historical</timeSeriesType>
                <timeStep unit="hour" multiplier="1"/>
                <relativeViewPeriod unit="day" start="-3" end="0" startOverrulable="true" endOverrulable="true"/>
                <readWriteMode>read only</readWriteMode>
            </timeSeriesSet>
            <timeSeriesSet>
                <moduleInstanceId>Interpolate_boccadimagra</moduleInstanceId>
                <valueType>grid</valueType>
                <parameterId>RH.obs</parameterId>
                <locationId>boccadimagra_regular</locationId>
                <timeSeriesType>external historical</timeSeriesType>
                <timeStep unit="hour" multiplier="1"/>
                <relativeViewPeriod unit="day" start="-3" end="0" startOverrulable="true" endOverrulable="true"/>
                <readWriteMode>read only</readWriteMode>
            </timeSeriesSet>
        </timeSeriesSets>
    </exportNetcdfActivity>
    <exportNetcdfRunFileActivity>
        <description>This run file is passed as argument to ContinuumPreAdapter</description>
        <exportFile>%WORK_DIR%/run_info.nc</exportFile>
        <properties>
            <string key="domain_name" value="$DOMINIUM$"/>
            <int key="time_resolution" value="1"/>
        </properties>
    </exportNetcdfRunFileActivity>
</exportActivities>
Execute activities
This section calls the Continuum pre- and post-adapters as well as the Continuum executable. The pre- and post-adapters are wrapped inside a shell or batch script for execution.
Note: the run file must be passed as an argument to the Continuum pre- and post-adapters.
<executeActivities>
    <executeActivity>
        <command>
            <executable>%REGION_HOME%/Modules/bin/Continuum_adapter/continuum_preadapter.sh</executable>
        </command>
        <arguments>
            <argument>%WORK_DIR%/run_info.nc</argument>
        </arguments>
        <timeOut>99999999</timeOut>
        <ignoreDiagnostics>true</ignoreDiagnostics>
    </executeActivity>
    <executeActivity>
        <command>
            <executable>%ROOT_DIR%/command_line</executable>
        </command>
        <logFile>
            <file>continuum_error.txt</file>
            <errorLinePattern>*</errorLinePattern>
        </logFile>
        <logFile>
            <file>continuum_warning.txt</file>
            <infoLinePattern>*warning*</infoLinePattern>
        </logFile>
        <logFile>
            <file>continuum_log.txt</file>
            <debugLinePattern>*ERROR*</debugLinePattern>
        </logFile>
        <timeOut>99999999</timeOut>
        <ignoreDiagnostics>true</ignoreDiagnostics>
    </executeActivity>
    <executeActivity>
        <command>
            <executable>%REGION_HOME%/Modules/bin/Continuum_adapter/continuum_postadapter.sh</executable>
        </command>
        <arguments>
            <argument>%WORK_DIR%/run_info.nc</argument>
        </arguments>
        <timeOut>99999999</timeOut>
        <ignoreDiagnostics>true</ignoreDiagnostics>
    </executeActivity>
</executeActivities>
Import activities
In this section the data to be imported into FEWS as output from the module is specified. Only hydrographs are currently imported from Continuum.
<importActivities>
    <importNetcdfActivity>
        <importFile>continuum_out.nc</importFile>
        <timeSeriesSets>
            <timeSeriesSet>
                <moduleInstanceId>Continuum_$DOMINIUM$</moduleInstanceId>
                <valueType>scalar</valueType>
                <parameterId>Q.simulated</parameterId>
                <locationSetId>magra.hydro.sections.locations</locationSetId>
                <timeSeriesType>simulated historical</timeSeriesType>
                <timeStep unit="hour" multiplier="1"/>
                <readWriteMode>read complete forecast</readWriteMode>
                <expiryTime unit="week" multiplier="1"/>
            </timeSeriesSet>
        </timeSeriesSets>
    </importNetcdfActivity>
</importActivities>