
A library that provides fast, accurate, automatic differentiation (computing derivatives/gradients) of mathematical functions.

AutoDiff provides a simple and intuitive API for computing function gradients/derivatives along with a fast state-of-the-art algorithm for performing the computation. Such computations are mainly useful in numeric optimization scenarios. Numeric programming in .NET has never been simpler.

You can download the library either from this site's download section, or you can use NuGet to install the package in your solution.

```csharp
using System;
using AutoDiff;

class Program
{
    public static void Main(string[] args)
    {
        // define variables
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();

        // define our function
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // prepare arrays needed for evaluation/differentiation
        Variable[] vars = { x, y, z };
        double[] values = { 1, 2, -3 };

        // evaluate func at (1, 2, -3)
        double value = func.Evaluate(vars, values);

        // calculate the gradient at (1, 2, -3)
        double[] gradient = func.Differentiate(vars, values);

        // print results
        Console.WriteLine("The value at (1, 2, -3) is " + value);
        Console.WriteLine("The gradient at (1, 2, -3) is ({0}, {1}, {2})",
            gradient[0], gradient[1], gradient[2]);
    }
}
```
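When the same function must be evaluated and differentiated at many different points, re-walking the term tree each time is wasteful. The following sketch assumes the `Compile` extension method and `ICompiledTerm` interface available in later AutoDiff versions; the exact return type of `Differentiate` (a tuple of gradient and value) is an assumption based on those versions:

```csharp
using System;
using AutoDiff;

class CompiledExample
{
    public static void Main(string[] args)
    {
        var x = new Variable();
        var y = new Variable();

        // f(x, y) = x^2 + e^y
        var func = TermBuilder.Power(x, 2) + TermBuilder.Exp(y);

        // Compile once: the variable ordering is fixed, and subsequent
        // evaluations/differentiations skip the term-tree traversal setup.
        var compiled = func.Compile(x, y);

        // A single reverse-mode pass yields both the gradient and the value.
        var diff = compiled.Differentiate(2, 0);
        double[] gradient = diff.Item1;  // (2x, e^y) at (2, 0)
        double value = diff.Item2;       // 2^2 + e^0 = 5

        Console.WriteLine("value = {0}, gradient = ({1}, {2})",
            value, gradient[0], gradient[1]);
    }
}
```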

The Documentation tab currently contains some basic tutorials, with more under construction. There is also an article on CodeProject. In addition, the binary distribution includes XML comments for all public methods and a help file you can view with your favorite help viewer. Finally, the source control contains some code examples alongside the library's code.

Many open-source and commercial .NET libraries offer numeric optimization among their features (for example, Microsoft Solver Foundation, AlgLib, Microsoft Research SHO, Extreme Optimization, CenterSpace NMath). Most of them require the user to supply code that evaluates both the function and its gradient. This library saves the work of manually deriving the function's gradient and coding it.
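To illustrate how automatic gradients plug into optimization, here is a minimal gradient-descent sketch. The objective function, starting point, and fixed step size are illustrative choices, not part of the library:

```csharp
using System;
using AutoDiff;

class GradientDescentSketch
{
    public static void Main(string[] args)
    {
        // Minimize f(x, y) = (x - 1)^2 + (y + 2)^2; the minimum is at (1, -2).
        var x = new Variable();
        var y = new Variable();
        var func = TermBuilder.Power(x - 1, 2) + TermBuilder.Power(y + 2, 2);

        Variable[] vars = { x, y };
        double[] point = { 0, 0 };    // starting guess
        const double stepSize = 0.1;  // illustrative fixed step size

        for (int i = 0; i < 100; ++i)
        {
            // AutoDiff supplies the gradient; no manual derivation needed.
            double[] gradient = func.Differentiate(vars, point);
            for (int j = 0; j < point.Length; ++j)
                point[j] -= stepSize * gradient[j];
        }

        Console.WriteLine("Approximate minimizer: ({0}, {1})",
            point[0], point[1]);
    }
}
```

A real solver would use a line search or a convergence test instead of a fixed step count, but the pattern of handing `Differentiate` to an optimizer is the same.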

Once the developer defines a function, the AutoDiff library can automatically evaluate and differentiate it at any point. Key features:

- Fast! See the 0.5 vs. 0.3 benchmark and the 0.3 benchmark.
- Composition of functions using arithmetic operators, Exp, Log, Power and user-defined unary and binary functions.
- Function gradient evaluation at specified points
- Function value evaluation at specified points
- Uses Code Contracts for specifying valid parameters and return values
- Computes gradients using the reverse-mode AD algorithm in **linear time**! Yes, it is faster than numeric approximation for multivariate functions
- You get both high accuracy and speed!
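The accuracy claim can be checked directly: reverse-mode AD produces derivatives that are exact up to floating-point rounding, whereas finite differences carry truncation error. A small comparison sketch (the step size `h` is an illustrative choice):

```csharp
using System;
using AutoDiff;

class AccuracySketch
{
    public static void Main(string[] args)
    {
        var x = new Variable();
        var func = TermBuilder.Exp(x);   // f(x) = e^x, so f'(x) = e^x
        Variable[] vars = { x };
        double[] point = { 1.0 };

        // Reverse-mode AD: exact up to floating-point rounding.
        double adDerivative = func.Differentiate(vars, point)[0];

        // Central-difference approximation for comparison.
        double h = 1e-6;
        double numeric = (Math.Exp(1.0 + h) - Math.Exp(1.0 - h)) / (2 * h);

        Console.WriteLine("AD:      {0}", adDerivative);
        Console.WriteLine("Numeric: {0}", numeric);
        Console.WriteLine("True:    {0}", Math.E);
    }
}
```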

Last edited Apr 13, 2012 at 11:14 AM by alexshtf, version 40