
What: a workflow.xml file

Required: no

Description: definition of a sequence of moduleInstances (or workflows) in logical order

Schema location: http://fews.wldelft.nl/schemas/version1.0/workflow.xsd

Introduction

Workflows are used in DELFT-FEWS to define logical sequences of running forecast modules. The workflow itself simply defines the sequence in which the configured modules are to be run. The workflow contains no inherent information or constraints on the role a module plays in delivering the forecasting requirement.

Workflows may contain calls to configured module instances, but may also contain calls to other workflows. The properties of the workflows are defined in the workflowDescriptors configuration described in the Regional Configuration section.

All workflows are defined in the Workflows section of the configuration; when working from a file system this is the WorkflowFiles directory. Each workflow has the same structure and must adhere to the same XML schema definition. Workflows are identified by their names, which are registered to the system through the workflowDescriptors configuration in the Regional Configuration section.

Workflows

Workflows defined may either be available from the Workflows table – when the configuration is loaded into the database – or available in the WorkflowFiles directory when the configuration is available on the file system.

When available on the file system, the name of the XML file for a workflow (e.g. ImportExternal.xml) needs to be registered in the WorkflowDescriptors with Id ImportExternal.
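For orientation, the corresponding entry in the workflowDescriptors configuration could look roughly as sketched below; this is only an indicative sketch, see the Regional Configuration section for the exact workflowDescriptors schema and available attributes:

<workflowDescriptor id="ImportExternal">
    <!-- the description text is illustrative; further attributes and elements are defined by the workflowDescriptors schema -->
    <description>Import of external data</description>
</workflowDescriptor>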

Each processing step in the workflow is referred to as an activity. In defining workflows, several levels of complexity in defining these activities are available:

  • Simple activities
  • Activities with a fallback activity
  • Activities to be run as an ensemble
  • Parallel activities, i.e. a group of activities that can be executed in parallel when multiple cores are available
  • Sequence activities, i.e. a group of activities that have to be executed in sequence

Activities

Each activity can either be a single moduleInstance or another workflow.

A single moduleInstance can have a one-to-one relation to a module config file (i.e. the file name corresponds to the moduleInstanceId), or it can use a unique moduleInstanceId while referencing a module config file name that is shared as a template by various module instances. In the latter case, properties can be defined in the workflow to make the timeSeriesSets used within each moduleInstanceId unique. A property has a key and a value. When a module config file holds a property-key ($key$), the property-value, as defined in e.g. the workflow, will replace the $key$ when the configuration file is read.

As can be seen in the following images, properties can be defined at various levels in a workflow. In general, a property defined at a lower level overrules a property defined at a higher level: a property defined at the module instance level overrules a property defined at the (sub)workflow level, which in turn overrules a property defined at the global level (i.e. the global.properties file).

Figure 142 Elements of the Workflow configuration.

 

Figure 143  Elements of an Activity in the workflow.

activity

Root element for the definition of a workflow activity. Multiple entries can be defined.

runIndependent

Boolean flag to indicate if the activity is considered to be independent of other activities in the workflow. If the flag is set to "false" (default), then failure of this activity will cause the complete workflow to be considered as failed and no further activities will be carried out. If the flag is set to "true", then failure of the activity will not cause the workflow to fail and the next activity in the workflow will be attempted. An indication is given to the user in the Forecast Management display if one or more workflow activities have failed.

defaultFlagSource

Optional element; the flag source that is used to write to the flagSourceColumn in an activity that is able to write a flag. It is used for non-missing values when no other flagSource is applied: the defaultFlagSource is written in case no flag is changed and the data is not missing. Typically this defaultFlagSource is set to 'OK' and is configured in the CustomFlagSources configuration file.

flagSourceColumnId

Optional element; this dictates which flagSourceColumn the flagSources will be written to. A validation step where a flagSourceColumnId has been defined will write the appropriate flagSource in case of a flag change, both to the flagSource column and to the regular flagSource element. In case the flag is not changed, the defaultFlagSource will be written to the flagSource column; the flagSource will then only be written to the regular flagSource element when this is empty. The flagSourceColumnId needs to be defined in the flagSourceColumn.xml configuration file.
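As an indication, an activity running a validation module instance could combine both elements as sketched below; the moduleInstanceId and flagSourceColumnId are hypothetical, and the element order may need adjusting to the workflow schema:

<activity>
    <runIndependent>false</runIndependent>
    <!-- written for non-missing values when no flag is changed -->
    <defaultFlagSource>OK</defaultFlagSource>
    <!-- hypothetical id, to be defined in flagSourceColumn.xml -->
    <flagSourceColumnId>VALIDATION</flagSourceColumnId>
    <!-- hypothetical module instance id -->
    <moduleInstanceId>Validation_Historical</moduleInstanceId>
</activity>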

moduleInstanceId

The ID of the moduleInstance to run as the workflow activity. This module instance must be defined in the moduleInstanceDescriptors (see Regional Configuration) and a suitable module configuration file must be available (see Module configurations). This moduleInstanceId is used to administer the data generated.

moduleConfigFileName

Optional reference to a module config file that holds the processing logic for a module instance. Typically this module config file holds property-keys so that it can act as a template for multiple module instances, each module instance having a unique set of property-values.

workflowId

The ID of the (nested) workflow to run as the workflow activity. This workflow must be defined in the workflowDescriptors (see Regional Configuration) and a suitable workflow must be available.

properties

Root element for the definition of one or more properties. Multiple entries can be defined. Properties can externalize portions of a timeSeriesSet definition (e.g. locationSetId) or parameter values from the module configuration file to the level of the workflow definition.

 

 

properties:string

Property definition for a string data type. Holds a key attribute and a value attribute, where the value replaces the $key$ defined in a module config file.

properties:int

Property definition for an integer data type. Holds a key attribute and a value attribute, where the value replaces the $key$ defined in a module config file.

properties:float

Property definition for a float data type. Holds a key attribute and a value attribute, where the value replaces the $key$ defined in a module config file.

properties:double

Property definition for a double data type. Holds a key attribute and a value attribute, where the value replaces the $key$ defined in a module config file.

properties:bool

Property definition for a boolean data type. Holds a key attribute and a value attribute, where the value replaces the $key$ defined in a module config file.

properties:dateTime

Property definition for a dateTime data type. Holds a key attribute, while the value is built from the combination of a date attribute and a time attribute.
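A minimal sketch of a properties block mixing these data types is given below. The CATCHMENT key is taken from the examples further down; the other keys and all values are purely illustrative:

<properties>
    <string key="CATCHMENT" value="armidale"/>
    <!-- hypothetical keys and values, shown only to illustrate the data types -->
    <int key="WARMUP_DAYS" value="10"/>
    <bool key="EXPORT_STATES" value="true"/>
    <dateTime key="REFERENCE_TIME" date="2020-01-01" time="00:00:00"/>
</properties>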

fallbackActivity

A fallback activity may be defined to run if the activity under which it is defined fails. This can be used to run a simple module if the more complex method used in preference fails, and ensures continuity of the forecasting process. The definition of the fallbackActivity is the same as the definition of an activity.
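A sketch of an activity with a fallback activity is shown below; the module instance Ids are hypothetical and the fallback is defined under the activity it backs up:

<activity>
    <runIndependent>false</runIndependent>
    <!-- preferred, more complex method (hypothetical id) -->
    <moduleInstanceId>Hydrodynamic_Forecast</moduleInstanceId>
    <fallbackActivity>
        <!-- simpler method run only if the activity above fails (hypothetical id) -->
        <runIndependent>false</runIndependent>
        <moduleInstanceId>Routing_Forecast</moduleInstanceId>
    </fallbackActivity>
</activity>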

ensemble

This element is defined to indicate the activity is to run as an ensemble.

ensemble:ensembleId

Id of the ensemble to apply in retrieving data from the database. For all time series sets used as input for the activities running as an ensemble, a request for time series with this Id defined will be made. Ensemble Ids in a sub-workflow will override this ensembleId; a sub-workflow without an ensembleId will make use of this ensembleId.

ensemble:runInLoop

Boolean flag to indicate if the activity is to be run as many times as there are members in the ensemble, or if it is to be run only once, using all members of the ensemble in that single run. If the value is set to "true", then when running the workflow DELFT-FEWS will first establish how many members there are in the ensemble, and then run the activity for each member. If the value is set to "false", then the activity will be run only once; on requesting a time series set within the modules to be run, the database will return all members of that ensemble.

ensembleMemberIndex

Optional field to only run one particular ensemble member. If this field is used only the specified ensemble member will be run.

ensembleMemberIndexRange

Optional field to run a particular range of ensemble members. Processing starts at the member given by start and ends at the member given by end. If end is not specified, processing starts at start and continues to the last member of the ensemble.
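The sketch below loops over a fixed range of members of the COSMO-LEPS ensemble; the start and end values are illustrative, the module instance Id is hypothetical, and the placement of the range element within the ensemble block may need adjusting to the workflow schema:

<activity>
    <runIndependent>true</runIndependent>
    <!-- hypothetical module instance id -->
    <moduleInstanceId>Rhein_SpatialInterpolation_COSMO-LEPS</moduleInstanceId>
    <ensemble>
        <ensembleId>COSMO-LEPS</ensembleId>
        <!-- illustrative range: only these members are processed -->
        <ensembleMemberIndexRange start="1" end="10"/>
        <runInLoop>true</runInLoop>
    </ensemble>
</activity>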

Activities in a workflow or nested workflows may be called only once. This is to avoid mistakes through multiple calls to the same activity in different locations (creating ambiguous results) or circular references in defining fallback activities.

When running activities as an ensemble that request time series sets from the database that are not a part of that ensemble, the default ensembleId should be added to the TimeSeriesSets definition. The default ensemble Id is "main".
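For example, a time series set that an ensemble activity reads from the regular (non-ensemble) part of the database would carry the default ensemble Id, roughly as sketched below. The module instance, parameter and location Ids are illustrative, and the position of the ensembleId element may need adjusting to the timeSeriesSet schema:

<timeSeriesSet>
    <!-- illustrative ids -->
    <moduleInstanceId>ImportObserved</moduleInstanceId>
    <valueType>scalar</valueType>
    <parameterId>H.obs</parameterId>
    <locationId>H-2001</locationId>
    <timeSeriesType>external historical</timeSeriesType>
    <timeStep unit="hour"/>
    <relativeViewPeriod unit="day" start="-2" end="0"/>
    <readWriteMode>read only</readWriteMode>
    <!-- default ensemble Id for data that is not part of the ensemble -->
    <ensembleId>main</ensembleId>
</timeSeriesSet>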

All time series sets written when running in ensemble mode will have the ensembleId as specified in the workflow ensembleId element, unless overruled by a local ensembleId defined in the timeSeriesSet on writing.

workflow:parallel

Grouping to accommodate parallel execution of multiple activities when multiple CPU cores are available. Parallel activities should only be configured when the underlying activities have no data dependencies. If data dependencies exist, the activities should be executed in sequence; this may be forced by embedding them in a sequence group. Since FEWS 2017.01 it is optionally possible to run the parallel activities on multiple forecasting shells. This can be done with multipleForecastingShells=true, or forecastingShellCount=n for ensembles and location loops.

workflow:sequence

Grouping to enforce sequential execution of multiple activities, as they generally have data dependencies. Typically used in workflows where parallel activities are defined, to clarify which activities have to be executed in sequence.
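The sketch below indicates how sequence groups can be combined within a parallel group so that two independent chains of activities run side by side, while the activities within each chain keep their order; the module instance Ids are hypothetical:

<parallel>
    <!-- chain for catchment A: preprocessing must finish before the model runs -->
    <sequence>
        <activity>
            <runIndependent>false</runIndependent>
            <moduleInstanceId>CatchmentA_Preprocessing</moduleInstanceId>
        </activity>
        <activity>
            <runIndependent>false</runIndependent>
            <moduleInstanceId>CatchmentA_Model</moduleInstanceId>
        </activity>
    </sequence>
    <!-- chain for catchment B: independent of catchment A, so it may run in parallel -->
    <sequence>
        <activity>
            <runIndependent>false</runIndependent>
            <moduleInstanceId>CatchmentB_Preprocessing</moduleInstanceId>
        </activity>
        <activity>
            <runIndependent>false</runIndependent>
            <moduleInstanceId>CatchmentB_Model</moduleInstanceId>
        </activity>
    </sequence>
</parallel>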

 

 

Figure 143 Elements of a Parallel Activity in the workflow.

loopLocationSetId

For a workflow, a location-loop can be defined in which each location (site) in a location set is run separately. The $LOOP_LOCATION_ID$ tag in the configuration can be used to point to the corresponding locationID of the current location. This is useful in combination with (related location) Constraints, because then it is possible to filter a locationSet on the basis of the $LOOP_LOCATION_ID$.

Examples

The workflow below runs seven moduleInstances. If the first moduleInstance fails in this example, all other processing is stopped. If any of the other activities fail, the processing will continue.

<workflow version="1.1" xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews
http://fews.wldelft.nl/schemas/version1.0/workflow.xsd">
	<activity>
		<runIndependent>false</runIndependent>
		<moduleInstanceId>Astronomical</moduleInstanceId>
	</activity>
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>BackupPrecipitation_Forecast</moduleInstanceId>
	</activity>
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>PrecipitationGaugeToGrid_Forecast</moduleInstanceId>
	</activity>
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>Spatial_Interpolation_Precipitation_Forecast</moduleInstanceId>
	</activity>
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>MergePrecipitation_Forecast</moduleInstanceId>
	</activity>
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>GridToCatchments_Forecast</moduleInstanceId>
	</activity>
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>Singapore_Sobek_Forecast</moduleInstanceId>
	</activity>
</workflow>

The example below is more complex and includes several modules that are run in ensemble mode.

<workflow xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.wldelft.nl/fews 
http://fews.wldelft.nl/schemas/version1.0/workflow.xsd" version="1.1">
	<!--Run Rhein Interpolation -->
	<activity>
		<runIndependent>true</runIndependent>
		<workflowId>Rhein_Interpolate</workflowId>
	</activity>
	<!--Spatial interpolation from grid to HBV-centroids-->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>Rhein_SpatialInterpolationCOSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Aggregate forecast data for display -->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>Rhein_AggregateForecast_COSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Disaggregate timeseries at HBV-centroids -->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>Rhein_DisaggregateSeriesCOSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Merge timeseries from historical run and forecast run -->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>HBV_Rhein_Merge_COSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Aggregate inputs for display -->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>HBV_Rhein_AggregateInputs_COSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Interpolate timeseries from historical run and forecast run -->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>HBV_Rhein_Interpolate_COSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Run HBV-model for forecast period-->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>HBV_Rhein_COSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
		<!--Run ErrorModule for forecast period-->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>HBV_Rhein_AR_COSMO-LEPS</moduleInstanceId>
		<ensemble>
			<ensembleId>COSMO-LEPS</ensembleId>
			<runInLoop>true</runInLoop>
		</ensemble>
	</activity>
	<!--Calculate Statistics-->
	<activity>
		<runIndependent>true</runIndependent>
		<workflowId>Statistics_COSMO-LEPS</workflowId>
	</activity>
	<!--Export forecast data to wavos format -->
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>Rhein_ExportForecast_COSMO-LEPS</moduleInstanceId>
	</activity>
</workflow>

The example below is complex in a different way, as it uses properties to hand over catchment-specific information to a report.

Usage of properties to add catchment specific information to a web-report
<workflow version="1.1" xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews http://fews.wldelft.nl/schemas/version1.0/workflow.xsd">
    <!--Create Calibration reports-->
    <properties>
        <string key="CATCHMENT" value="armidale"/>
        <string key="CATCHMENT_NAME" value="Armidale"/>
        <string key="BASIN" value="macleay"/>
        <string key="STATE" value="NSW"/>
        <string key="TIMEZONE" value="AET"/>
        <string key="ENSEMBLE" value="OFFICIAL"/>
        <string key="ENSMEMBER" value="URBS Final"/>
    </properties>
    <!--Compute sub-catchment averaged data as sub-catchment data is not archived-->
    <activity>
        <runIndependent>true</runIndependent>
        <moduleInstanceId>Processing_15m_Site_Calibration</moduleInstanceId>
    </activity>
    <activity>
        <runIndependent>true</runIndependent>
        <moduleInstanceId>Performance_15m_Calibration</moduleInstanceId>
    </activity>
    <activity>
        <runIndependent>true</runIndependent>
        <moduleInstanceId>Report_15m_Site_Calibration</moduleInstanceId>
    </activity>
     <activity>
        <runIndependent>true</runIndependent>
        <moduleInstanceId>Report_Catchment_Calibration</moduleInstanceId>
    </activity> 
</workflow>

The following workflow example uses properties to apply the logic encapsulated in a general module config file to a specific catchment. The second code block shows how some of the properties are used to make the timeSeriesSets in the Rainfall_15m_Forecast module config file unique for the locations of a certain catchment.

Workflow defining property-values to apply general processing logic to a specific catchment
<workflow xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews http://fews.wldelft.nl/schemas/version1.0/workflow.xsd" version="1.1">

    <properties>
        <string key="RADAR" value="noradar"/>
        <string key="CATCHMENT" value="armidale"/>
        <string key="BASIN" value="macleay"/>
        <string key="ENSEMBLE" value="IFD"/>
        <string key="ENSMEMBER" value="URBS Local"/>
    </properties>
    <activity>
        <runIndependent>true</runIndependent>
        <moduleInstanceId>armidale_Rainfall_Forecast</moduleInstanceId>
        <moduleConfigFileName>Rainfall_15m_Forecast</moduleConfigFileName>
    </activity>
    <activity>
        <runIndependent>true</runIndependent>
        <workflowId>armidale_URBS_Catchment_Forecast</workflowId>
        <ensemble>
            <ensembleId>IFD</ensembleId>
            <runInLoop>true</runInLoop>
        </ensemble>
    </activity>
</workflow>
Property-keys to allow catchment specific timeseries definitions in the Rainfall_15m_Forecast module config file
    <variable>
        <variableId>ACCESS_G_subarea_3h</variableId>
        <timeSeriesSet>
            <moduleInstanceId>$CATCHMENT$_Rainfall_Forecast</moduleInstanceId>
            <valueType>scalar</valueType>
            <parameterId>P.nwp.fcst</parameterId>
            <qualifierId>ACCESS_G</qualifierId>
            <locationSetId>URBS_subareas.$CATCHMENT$</locationSetId>
            <timeSeriesType>temporary</timeSeriesType>
            <timeStep unit="hour" multiplier="3"/>
            <readWriteMode>read complete forecast</readWriteMode>
        </timeSeriesSet>
    </variable>

Since 2015.01 it is possible to use these property keys in all elements of the timeSeriesSet, including e.g. $TIME_SERIES_TYPE$ and $START_DAYS$.

Additional property-keys
    <variable>
        <variableId>Variable</variableId>
        <timeSeriesSet>
            <moduleInstanceId>import</moduleInstanceId>
            <valueType>scalar</valueType>
            <parameterId>H.obs</parameterId>
            <locationId>H-2001</locationId>
            <timeSeriesType>$TIME_SERIES_TYPE$</timeSeriesType>
            <timeStep unit="day"/>
            <relativeViewPeriod unit="day" start="$START_DAYS$" end="1"/>
            <readWriteMode>read only</readWriteMode>
        </timeSeriesSet>
    </variable>

Example of a sub-workflow with parallel activities using properties defined at a higher level workflow

Sub-workflow using properties defined at top-level workflow
<workflow xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews
http://fews.wldelft.nl/schemas/version1.0/workflow.xsd" version="1.1">
    <parallel>
        <activity logStartedAsDebug="true" logFinishedAsDebug="true">
            <runIndependent>false</runIndependent>
            <moduleInstanceId>RTCTools_KOT_Check_$RUNTYPE$</moduleInstanceId>
            <moduleConfigFileName>RTCTools_Check_Qinc_Fx</moduleConfigFileName>
        </activity>
        <activity logStartedAsDebug="true" logFinishedAsDebug="true">
            <runIndependent>false</runIndependent>
            <moduleInstanceId>RTCTools_KOT_Check_$RUNTYPE$</moduleInstanceId>
            <moduleConfigFileName>RTCTools_Check_QO_Seed</moduleConfigFileName>
        </activity>
        <activity logStartedAsDebug="true" logFinishedAsDebug="true">
            <runIndependent>false</runIndependent>
            <moduleInstanceId>RTCTools_KOT_Check_$RUNTYPE$</moduleInstanceId>
            <moduleConfigFileName>RTCTools_Check_FB</moduleConfigFileName>
        </activity>
    </parallel>
</workflow>

You can loop over a particular locationSet using the following config. In the moduleInstance (e.g. the exportActivity of a GeneralAdapter), the $LOOP_LOCATION_ID$ tag will be replaced in every loop with the locationId of the current location.

Workflow activity looping over the locations of a locationSet
<workflow xmlns="http://www.wldelft.nl/fews" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.wldelft.nl/fews
http://fews.wldelft.nl/schemas/version1.0/workflow.xsd" version="1.1">
	<activity>
		<runIndependent>true</runIndependent>
		<moduleInstanceId>ModuleInstanceId</moduleInstanceId>
		<loopLocationSetId>LocationSetId</loopLocationSetId>
	</activity>
</workflow>
