How to sum a sequence of matrices without using Seq.sum

I am using the Math.NET Numerics library for its Matrix and Vector classes and some of its linear algebra routines.

I’m porting a simple single-layer neural network from C# to F# as a way to learn the language. I need to average a sequence of matrices, but the Matrix class does not seem to play well with the Seq.average and Seq.sum functions, and I can't find an explanation of how to sum the matrices without Seq.sum.

let ComputeDerivative (data : Matrix<float>) (theta : Matrix<float>) : Matrix<float> =
    let row     : Vector<float> = data.Row(0)
    let weights : Vector<float> = theta.Column(0)
    let range   : List<int>     = [2 .. row.Count]

    let sum =
        range
            |> Seq.map (fun index -> row.[index] - (weights.[0] * row.[index - 1]) + (weights.[1] * row.[index - 2]) + weights.[2])
            |> Seq.mapi (fun index errorSignal -> Matrix<float>.Build.Dense(3, 1, [| -2.0 * errorSignal * row.[index - 1]; -2.0 * errorSignal * row.[index - 2]; -2.0 * errorSignal;|]))
            |> Seq.sum

    let average = sum / (float range.Length)

    average

The issue is that Seq.sum will not work because “Matrix does not support the operator ‘get_Zero’”, and Seq.average will not work because a Matrix can only be divided by a float, not by the integer count that Seq.average uses.

How can I sum this?
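For context on the first error: Seq.sum is constrained to element types that expose both a static (+) operator and a static Zero member to use as the starting value. A minimal sketch with a throwaway record type (nothing to do with Math.NET, just to illustrate the constraint):

// A type only works with Seq.sum if it supplies both Zero and (+).
type Vec2 =
    { X : float; Y : float }
    static member Zero = { X = 0.0; Y = 0.0 }
    static member (+) (a : Vec2, b : Vec2) = { X = a.X + b.X; Y = a.Y + b.Y }

let total = [ { X = 1.0; Y = 2.0 }; { X = 3.0; Y = 4.0 } ] |> Seq.sum   // { X = 4.0; Y = 6.0 }

Matrix<float> provides (+) but no static Zero, hence the error above.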
The C# code I’m porting is:

private static Matrix<double> ComputeDerivative(in Matrix<double> data, in Matrix<double> theta) {
    int             length          = Math.Max(data.ColumnCount, data.RowCount);
    Matrix<double>  total           = Matrix<double>.Build.Dense(3, 1, 0);
    double[]        predictedDJI    = new double[length],
                    observedDJI     = new double[length];

    for (int index = 2; index < length; index++) {
        predictedDJI[index] = (theta[0, 0] * data[0, index - 1]) + (theta[1, 0] * data[0, index - 2]) + theta[2, 0];
        observedDJI[index]  = data[0, index];

        double          errorSignal = observedDJI[index] - predictedDJI[index];
        Matrix<double>  dcdTheta    = Matrix<double>.Build.Dense(3, 1, new double[] {
                -2 * errorSignal * data[0, index - 1],
                -2 * errorSignal * data[0, index - 2],
                -2 * errorSignal
        });

        total += dcdTheta;
    }

    return total / (length - 2);
}

Edit: I originally posted the wrong C# snippet.

That’s right: Matrix<float> doesn’t define the static Zero member that Seq.sum needs as its starting value. Instead of Seq.sum you can use Seq.fold with an explicit accumulator:

|> Seq.fold (+) (Matrix<float>.Build.Dense(3, 1, 0.0))
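As an aside (just an alternative, the fold above is fine): if the range can never be empty, Seq.reduce uses the first matrix as the accumulator, so no zero matrix is needed at all:

|> Seq.reduce (+)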

Also, note that a list written as [start .. end] includes both endpoints. For example, [1 .. 5] gives you [1; 2; 3; 4; 5], even though you might have expected only [1; 2; 3; 4].

So your range should be [2 .. row.Count - 1] instead of [2 .. row.Count].
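To make the off-by-one concrete (a throwaway five-element vector, just for illustration):

open MathNet.Numerics.LinearAlgebra

// row.Count = 5, so the valid indices are 0 .. 4.
let row = Vector<float>.Build.Dense(5)

let tooFar  = [2 .. row.Count]        // [2; 3; 4; 5] -- row.[5] would throw
let inRange = [2 .. row.Count - 1]    // [2; 3; 4]    -- every index is valid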

One more thing: after the first Seq.map, the index that Seq.mapi passes restarts at 0, so row.[index - 1] and row.[index - 2] no longer refer to the samples you intend (the first iteration would even index off the front of the vector). Computing the error signal and the derivative in a single Seq.map keeps the real index, and parenthesising the predicted value matches the subtraction in the C#:

let computeDerivative (data : Matrix<float>) (theta : Matrix<float>) : Matrix<float> =
    let row     : Vector<float> = data.Row(0)
    let weights : Vector<float> = theta.Column(0)
    let range   : List<int>     = [2 .. row.Count - 1]

    let sum =
        range
            |> Seq.map (fun index ->
                // error = observed - predicted, as in the C# version
                let errorSignal = row.[index] - ((weights.[0] * row.[index - 1]) + (weights.[1] * row.[index - 2]) + weights.[2])
                // derivative of the squared error with respect to the three weights, as a 3 x 1 column
                Matrix<float>.Build.Dense(3, 1, [| -2.0 * errorSignal * row.[index - 1]; -2.0 * errorSignal * row.[index - 2]; -2.0 * errorSignal |]))
            |> Seq.fold (+) (Matrix<float>.Build.Dense(3, 1, 0.0))

    let average = sum / (float range.Length)

    average
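If you want to sanity-check it, here is one way to call it with made-up numbers; the shapes (data as a 1 x n row of observations, theta as a 3 x 1 column of weights) are assumptions taken from how the C# version indexes them:

open MathNet.Numerics.LinearAlgebra

// A single row of six observations (arbitrary values; column-major storage of a 1 x 6 matrix).
let data  = Matrix<float>.Build.Dense(1, 6, [| 1.0; 2.0; 4.0; 3.0; 5.0; 6.0 |])
// Two lag weights plus a bias, as a 3 x 1 column.
let theta = Matrix<float>.Build.Dense(3, 1, [| 0.5; 0.3; 0.1 |])

let gradient = computeDerivative data theta   // a 3 x 1 matrix: the averaged derivative
printfn "%A" gradient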

Thanks a lot. That’s very useful. You’re right about the range and it’s been adjusted.