...

The regression analysis option in HYMOS includes:

  • computation of the correlation matrix, and
  • fitting of the following types of functions: polynomial, simple linear, exponential, power, logarithmic, hyperbolic, multiple linear and stepwise.

The multiple linear functions can be fitted by means of multiple or stepwise regression techniques. The algebraic forms of the functions are presented in the next section.
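
As an illustration of the first of these options, the correlation matrix simply contains the pairwise correlation coefficients of all series involved in the analysis. The short sketch below is outside HYMOS and uses made-up series values; it only shows what such a matrix contains:

    import numpy as np

    # Made-up example series, e.g. water levels at two stations and a discharge;
    # in HYMOS these would be the series selected for the regression analysis.
    h_station1 = np.array([2.1, 2.4, 2.9, 3.3, 3.0, 2.6])
    h_station2 = np.array([1.8, 2.0, 2.6, 3.1, 2.8, 2.2])
    q_outlet   = np.array([14.0, 17.0, 25.0, 33.0, 29.0, 20.0])

    # Matrix of pairwise correlation coefficients (one row/column per series).
    corr_matrix = np.corrcoef(np.vstack([h_station1, h_station2, q_outlet]))
    print(np.round(corr_matrix, 3))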

...

Choose a regression option from the list box by clicking the function. If you select <Polynomial>, the degree of the polynomial has to be entered as well.
If you select <Stepwise>, the following data have to be entered:

  • maximum number of steps in the stepwise regression analysis;
  • whether or not the results of each step in the stepwise analysis have to be printed; if you enter <N>, only the results of the last step will be presented.
  • F-values to enter and delete variables from the regression analysis; these apply only to the free variables (see the note below).
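
The F-values control a partial F-test: when a free variable is considered, the regression is computed with and without that variable and the statistic F = (SSEwithout - SSEwith) / (SSEwith / (n - k - 1)) is formed, where SSE is the sum of squared residuals, n the number of values and k the number of variables in the larger model. Loosely, in the usual stepwise-regression sense, a free variable enters when its F-value exceeds the F-to-enter threshold and is removed again when it falls below the F-to-delete threshold; this is an indicative description rather than a statement of the exact HYMOS convention.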

    Save Relation

    Save the calculated relation for the time period defined in the Relation Validity Period. The start and end time of this period can be changed by double-clicking the time label. The coefficients of the regression equation will be stored in the database. There are, however, some limitations to the number of coefficients that can be stored:
  • in case of polynomial regression, the degree of the polynomial should be ≤ 4;
  • in case of multiple/stepwise regression, the number of independent variables should be ≤ 4.

    Regression equations

    The following types of regression equations are available, with Y the dependent variable and the Xj's the independent variables:

    Type              Equation

    Polynomial        Yi = C0 + C1*Xi + C2*Xi^2 + ... + Cn*Xi^n
                      with: n = degree of polynomial (n ≤ 4), Cj = coefficients

    Simple linear     Yi = A + B*Xi
                      with: A, B = coefficients

    Exponential 1     Yi = A*exp(B*Xi)
                      with: A, B = coefficients

    Exponential 2     Yi = A*exp(B/Xi)
                      with: A, B = coefficients

    Power             Yi = A*Xi^B
                      with: A, B = coefficients

    Logarithmic       Yi = A + B*ln(Xi)
                      with: A, B = coefficients

    Hyperbolic        Yi = A + B/Xi
                      with: A, B = coefficients

    Multiple linear   Yi = C0 + C1*X1,i + C2*X2,i + ... + Cn*Xn,i
                      with: n = number of independent variable series (n ≤ 4), Cj = coefficients
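
    All of the single-variable forms above either are linear or can be reduced to a straight line by transforming X and/or Y, after which the coefficients follow from ordinary least squares. The sketch below uses made-up data and plain numpy; note that fitting the transformed variables is the usual shortcut and minimises the squared errors in the transformed space, which may differ slightly from a fit in the original space:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.7, 4.5, 7.4, 12.2, 20.1])     # made-up, roughly exponential

    # Polynomial: Yi = C0 + C1*Xi + ... + Cn*Xi^n (coefficients returned highest degree first)
    c_poly = np.polyfit(x, y, deg=2)

    # Simple linear: Yi = A + B*Xi
    B_lin, A_lin = np.polyfit(x, y, deg=1)

    # Exponential 1: Yi = A*exp(B*Xi)  ->  ln(Yi) = ln(A) + B*Xi
    B_e1, lnA_e1 = np.polyfit(x, np.log(y), deg=1)
    A_e1 = np.exp(lnA_e1)

    # Exponential 2: Yi = A*exp(B/Xi)  ->  ln(Yi) = ln(A) + B*(1/Xi)
    B_e2, lnA_e2 = np.polyfit(1.0 / x, np.log(y), deg=1)
    A_e2 = np.exp(lnA_e2)

    # Power: Yi = A*Xi^B  ->  ln(Yi) = ln(A) + B*ln(Xi)
    B_pow, lnA_pow = np.polyfit(np.log(x), np.log(y), deg=1)
    A_pow = np.exp(lnA_pow)

    # Logarithmic: Yi = A + B*ln(Xi)
    B_log, A_log = np.polyfit(np.log(x), y, deg=1)

    # Hyperbolic: Yi = A + B/Xi
    B_hyp, A_hyp = np.polyfit(1.0 / x, y, deg=1)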


    Computation procedure

    All coefficients are determined by the least squares method.
    The Multiple Linear equation may be established by multiple or by stepwise regression:
  • In the multiple regression situation all independent variables enter the regression equation at once.
  • In the stepwise regression situation the independent variables enter the regression equation one by one. The order of entry may be (see the sketch after this list):
      • free: the variables to which this option applies are called free variables; their entry is determined by statistical properties: the variable added is the one which gives the greatest improvement in goodness of fit. Among the free variables only the significant variables are included in the final regression equation.
      • predetermined: the variables to which this option applies are called forced variables; their entry is not based on correlation but is solely requested by the user.
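
    A minimal sketch of the forward (entry-only) part of this procedure is given below. It is illustrative only, uses numpy rather than the HYMOS internals, and omits the F-to-delete step:

    import numpy as np

    def stepwise_forward(X, y, forced=(), free=(), f_enter=4.0, max_steps=4):
        """Forward stepwise regression sketch (not the HYMOS implementation).

        X         : (n, p) array of candidate independent variables
        y         : (n,) dependent variable
        forced    : column indices that always stay in the equation
        free      : column indices entered only if statistically significant
        f_enter   : F-value a free variable must exceed to enter
        max_steps : maximum number of entry steps
        """
        n = len(y)

        def fit(cols):
            # Least-squares fit of y on a constant plus the selected columns.
            A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return float(np.sum((y - A @ coef) ** 2)), coef

        selected = list(forced)
        sse_cur, coef = fit(selected)

        for _ in range(max_steps):
            best = None
            for c in free:
                if c in selected:
                    continue
                sse_new, _ = fit(selected + [c])
                df_resid = n - len(selected) - 2          # constant + candidate variable
                f = (sse_cur - sse_new) / (sse_new / df_resid)
                if f > f_enter and (best is None or f > best[0]):
                    best = (f, c, sse_new)
            if best is None:                              # no remaining free variable is significant
                break
            _, c_best, sse_cur = best                     # add the variable giving the best improvement
            selected.append(c_best)

        sse_cur, coef = fit(selected)
        return selected, coef                             # coef[0] is the constant term

    In the plain multiple-regression case the same least-squares fit is simply applied once to all selected variables, without the entry test.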

    Confidence limits

    For the linear regression methods, confidence intervals can be computed. Based on the sampling distributions of the regression parameters, the following estimates and confidence limits hold (see e.g. Kottegoda and Rosso, 1998).
    If the linear regression function is as follows:
    Y = a + b*X + e
    where:
    Y = dependent variable, also called response variable,
    X = independent variable or explanatory variable,
    a, b = regression coefficients,
    e = residual, caused by the imperfect match of the regression function through the measurement points.
    An unbiased estimate of the error variance is given by:
    se² = Σ (yi - a - b*xi)² / (n - 2)

where n is the total number of values used to establish the regression function and xm, ym are the mean values of the two series. Note that n - 2 appears in the denominator to reflect the fact that two degrees of freedom have been lost in estimating (a, b).
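
As a sketch of these computations for the simple linear case (the function below and its use of scipy are illustrative, not part of HYMOS):

    import numpy as np
    from scipy import stats

    def linear_regression_ci(x, y, alpha=0.05):
        """Fit y = a + b*x and return a and b with (1 - alpha) confidence limits."""
        n = len(x)
        x_m, y_m = x.mean(), y.mean()

        # Least-squares estimates of slope b and intercept a.
        s_xx = np.sum((x - x_m) ** 2)
        b = np.sum((x - x_m) * (y - y_m)) / s_xx
        a = y_m - b * x_m

        # Unbiased estimate of the error variance, with n - 2 degrees of freedom.
        s2 = np.sum((y - (a + b * x)) ** 2) / (n - 2)

        # Standard errors and two-sided t-based confidence limits.
        se_b = np.sqrt(s2 / s_xx)
        se_a = np.sqrt(s2 * (1.0 / n + x_m ** 2 / s_xx))
        t = stats.t.ppf(1.0 - alpha / 2.0, df=n - 2)

        return (a, a - t * se_a, a + t * se_a), (b, b - t * se_b, b + t * se_b)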

...