C# Sum an Array – Performance Analysis

The question of which mechanism to use for a given task comes up in many areas of programming, so I ran an analysis on summing an array of doubles. The goal is to weigh the trade-off between the simplicity of LINQ and the raw performance of a plain foreach loop (LINQ's Sum() is an extension method that works through the IEnumerable&lt;double&gt; interface). I created a test project that measures the time needed to sum arrays of various sizes, writes the results to a CSV file, and then plots a LINQ vs. FOREACH graph to visualize the trade-off.

LINQ

double SumLinq = data.Sum();

FOREACH

double SumCommon = 0;
foreach (double d in data)
     SumCommon += d;

As you can see, LINQ does the job in a single short line of code, whereas the foreach version needs three lines. But when should you use which one? To answer that, I created the following test program.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;

namespace SumArray
{
    class Program
    {
        static void Main(string[] args)
        {
            StreamWriter sw = new StreamWriter(@"d:\Documents\Visual Studio 2008\Projects\SumArray\SumArray\result.csv", false);
            sw.WriteLine("Array Dimension,LINQ,FOREACH");
            for (int step = 10000; step <= 10000000; step += 10000)
            {
                // Build and fill the array with random values; this setup is timed
                // separately (Start/Stop) and is not written to the CSV.
                long Start = DateTime.Now.Ticks;
                double[] data = new double[step];
                Random random = new Random(DateTime.Now.Millisecond);
                for (int i = 0; i < data.Length; i++)
                    data[i] = random.NextDouble();
                long Stop = DateTime.Now.Ticks - Start;

                long StartWatch1 = DateTime.Now.Ticks;
                //LINQ Method
                double SumLinq = data.Sum();
                long StopWatch1 = DateTime.Now.Ticks - StartWatch1;

                long StartWatch2 = DateTime.Now.Ticks;
                //Common Method
                double SumCommon = 0;
                foreach (double d in data)
                    SumCommon += d;
                long StopWatch2 = DateTime.Now.Ticks - StartWatch2;

                Console.WriteLine("{0}\t{1}\t{2}", step, StopWatch1, StopWatch2);
                sw.WriteLine("{0},{1},{2}", step, StopWatch1, StopWatch2);
            }
            sw.Close();

            Console.ReadKey(true);
        }
    }
}
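
For reference, the same two measurements could also be taken with System.Diagnostics.Stopwatch, which offers higher resolution than DateTime.Now. The sketch below is my own adaptation of the inner timing steps, not the code that produced the results discussed next.

using System;
using System.Diagnostics;
using System.Linq;

static class StopwatchTiming
{
    // Sketch: time both summing methods for one array using Stopwatch.
    public static void Measure(double[] data, out double linqMs, out double foreachMs)
    {
        Stopwatch sw = Stopwatch.StartNew();
        double sumLinq = data.Sum();                  // LINQ method
        sw.Stop();
        linqMs = sw.Elapsed.TotalMilliseconds;

        sw = Stopwatch.StartNew();
        double sumCommon = 0;                         // common (foreach) method
        foreach (double d in data)
            sumCommon += d;
        sw.Stop();
        foreachMs = sw.Elapsed.TotalMilliseconds;
    }
}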

The test program records the number of ticks (one tick is 100 nanoseconds) each summing pass needs for every array size. At first I did not think the results would differ much, but when they came out I was amazed: below about 100,000 elements the two methods take roughly the same time, yet beyond 100,000 elements LINQ takes about twice as long as the foreach loop.
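
If you prefer to read the CSV in milliseconds rather than raw ticks, the recorded values can be converted with TimeSpan (10,000 ticks per millisecond). The snippet below is just an illustration with a made-up tick count, not output from the test run.

using System;

class TickConversion
{
    static void Main()
    {
        long elapsedTicks = 2500000;   // example value, as if read from result.csv
        double ms = TimeSpan.FromTicks(elapsedTicks).TotalMilliseconds;
        Console.WriteLine("{0} ticks = {1} ms", elapsedTicks, ms);   // prints: 2500000 ticks = 250 ms
    }
}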

Given these results, we can conclude: when summing fewer than about 100,000 records, use LINQ for its simplicity; above 100,000 records, use a foreach loop for performance.
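
As an illustration of that guideline, a summing routine could switch strategies by array length. The helper name and the hard-coded 100,000 cutoff below are my own choices based on the figures above, not part of the original project.

using System.Linq;

static class SumHelper
{
    // Hypothetical helper: pick the summing strategy by array length,
    // using the ~100,000-element cutoff suggested by the measurements above.
    const int Threshold = 100000;

    public static double SumArray(double[] data)
    {
        if (data.Length < Threshold)
            return data.Sum();            // small array: LINQ keeps the code short

        double sum = 0;                   // large array: foreach avoids the LINQ overhead
        foreach (double d in data)
            sum += d;
        return sum;
    }
}

Calling SumHelper.SumArray(data) then keeps the call site simple while avoiding the LINQ penalty on large inputs.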
