Derivatives are required at the core of many numerical algorithms. Unfortunately, they are usually computed inefficiently and approximately by some variant of the finite-difference approach $$ f'(x) \approx \frac{f(x+h) - f(x)}{h}, \quad h \text{ small}. $$ This method is inefficient because it requires \( \Omega(n) \) evaluations of \( f : \mathbb{R}^n \to \mathbb{R} \) to compute the gradient \( \nabla f(x) = \left( \frac{\partial f}{\partial x_1}(x), \cdots, \frac{\partial f}{\partial x_n}(x)\right) \), for example. It is approximate because we have to choose some finite, small value of the step length \( h \), balancing floating-point precision with mathematical approximation error.
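To make the cost and the accuracy trade-off concrete, here is a minimal sketch of the finite-difference gradient (the function name `fd_gradient` is illustrative, not from any package):

```julia
# Hypothetical helper: approximate the gradient of f at x by forward
# differences. One perturbed evaluation of f per coordinate, so n + 1
# evaluations total for f : R^n -> R.
function fd_gradient(f, x, h)
    fx = f(x)
    g = similar(x)
    for i in 1:length(x)
        xp = copy(x)
        xp[i] += h   # step in coordinate i only
        # truncation error is O(h); rounding error grows like eps/h,
        # so h cannot be made arbitrarily small
        g[i] = (f(xp) - fx) / h
    end
    return g
end

f(x) = x[1]^2 + 3x[2]   # exact gradient is (2*x[1], 3)
fd_gradient(f, [1.0, 1.0], 1e-8)
```

With `h = 1e-8` the result is close to the exact gradient `(2.0, 3.0)`, but no choice of `h` makes it exact, and every coordinate costs another full evaluation of `f`.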
One option is to explicitly write down a function which computes the exact derivatives by using the rules that we know from calculus. However, this quickly becomes an error-prone and tedious exercise. There is another way! The field of automatic differentiation provides methods for automatically computing exact derivatives (up to floating-point error) given only the function \( f \) itself. Some methods use many fewer evaluations of \( f \) than would be required when using finite differences. In the best case, the exact gradient of \( f \) can be evaluated for the cost of \( O(1) \) evaluations of \( f \) itself. The caveat is that \( f \) cannot be considered a black box; instead, we require either access to the source code of \( f \) or a way to plug in a special type of number using operator overloading.
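The operator-overloading idea can be sketched in a few lines of Julia. This is a toy dual-number type for illustration only, not the API of any package listed below: a value of the form \( a + b\epsilon \) with \( \epsilon^2 = 0 \) carries a derivative alongside the value.

```julia
# Toy dual number: val is the value of an expression, der its derivative.
struct D
    val::Float64
    der::Float64
end

# Overload arithmetic so derivatives propagate by the usual calculus rules.
Base.:+(x::D, y::D) = D(x.val + y.val, x.der + y.der)                  # sum rule
Base.:*(x::D, y::D) = D(x.val * y.val, x.der * y.val + x.val * y.der)  # product rule

g(x) = x * x + x   # g'(x) = 2x + 1

# Seed der = 1.0 to differentiate with respect to x; evaluate at x = 3.
g(D(3.0, 1.0))     # value 12.0, derivative 7.0 — exact, in one evaluation of g
```

The derivative comes out exact (no step length \( h \) to choose) for the cost of a single, slightly more expensive evaluation of `g`.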
JuliaDiff is an informal organization which aims to unify and document packages written in Julia for evaluating derivatives. The technical features of Julia — namely, multiple dispatch, access to source code via reflection, JIT compilation, and first-class access to expression parsing — make implementing and using techniques from automatic differentiation easier than ever before (in our biased opinion). Packages hosted under the JuliaDiff organization follow the same guidelines as for JuliaOpt; namely, they should be actively maintained, well documented, and have a basic testing suite.
Below we list the packages that are currently included in the JuliaDiff organization and their testing status on the latest Julia release, if available.
DualNumbers
Implements a Dual number type which can be used for forward-mode automatic differentiation of first derivatives via operator overloading.

ForwardDiff
A unified package for forward-mode automatic differentiation, combining both DualNumbers and vector-based gradient accumulation.

HyperDualNumbers
Implements a Hyper number type which can be used for forward-mode automatic differentiation of first and second derivatives via operator overloading.

ReverseDiffSource
Implements reverse-mode automatic differentiation for gradients and higher-order derivatives given user-supplied expressions or generic functions. Accepts a subset of valid Julia syntax, including intermediate assignments.

TaylorSeries
Implements truncated multivariate power series for high-order integration of ODEs and forward-mode automatic differentiation of derivatives of arbitrary order via operator overloading.
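The truncated-power-series approach used by packages like TaylorSeries can be sketched as follows. This is a toy illustration, not the TaylorSeries API: represent \( u(t) = c_1 + c_2 t + c_3 t^2 \) by its coefficients and multiply modulo \( t^3 \), so that the \( k \)-th derivative falls out as \( k! \, c_{k+1} \).

```julia
# Toy truncated series multiply: (a1 + a2*t + a3*t^2)(b1 + b2*t + b3*t^2)
# with all terms of order t^3 and higher discarded.
mult3(a, b) = [a[1]*b[1],
               a[1]*b[2] + a[2]*b[1],
               a[1]*b[3] + a[2]*b[2] + a[3]*b[1]]

x = [2.0, 1.0, 0.0]   # the series of x = 2 + t
x2 = mult3(x, x)       # series of x^2 expanded around x = 2: [4.0, 4.0, 1.0]
# so for f(x) = x^2: f(2) = 4, f'(2) = 4, f''(2) = 2! * 1 = 2 — all exact
```

Carrying more coefficients gives derivatives of correspondingly higher order, which is how operator overloading extends past first and second derivatives.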
These Julia packages also provide differentiation functionality.

Calculus
Provides methods for symbolic differentiation and finite-difference approximations.

PowerSeries
Implements a truncated power series type which can be used for forward-mode automatic differentiation of derivatives of arbitrary order via operator overloading.

ReverseDiffOverload
Implements reverse-mode automatic differentiation by overloading function inputs to extract scalar- and vector-valued expression graphs.

ReverseDiffSparse
Implements reverse-mode automatic differentiation for gradients and sparse Hessian matrices given closed-form expressions.

SymEngine
Implements symbolic differentiation.
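Reverse mode, used by several of the packages above, is what achieves the \( O(1) \)-evaluations gradient mentioned earlier. The following is a toy sketch of the idea only (none of the packages above is implemented this way): record each operation's inputs and local partial derivatives on a graph, then sweep backward once from the output, accumulating adjoints.

```julia
# Toy reverse-mode node: value, adjoint, and (parent, local partial) pairs.
mutable struct Node
    val::Float64
    adj::Float64
    parents::Vector{Tuple{Node,Float64}}
end
Node(v) = Node(v, 0.0, Tuple{Node,Float64}[])

# Each overloaded operation records its local partial derivatives.
Base.:*(a::Node, b::Node) = Node(a.val * b.val, 0.0, [(a, b.val), (b, a.val)])
Base.:+(a::Node, b::Node) = Node(a.val + b.val, 0.0, [(a, 1.0), (b, 1.0)])

# Backward sweep: push adjoints from the output toward the inputs.
# (A real implementation would process nodes in reverse topological order;
# this naive stack suffices for the small example below.)
function backward!(y::Node)
    y.adj = 1.0
    stack = [y]
    while !isempty(stack)
        n = pop!(stack)
        for (p, d) in n.parents
            p.adj += d * n.adj   # chain rule contribution
            push!(stack, p)
        end
    end
end

x1, x2 = Node(2.0), Node(3.0)
y = x1 * x2 + x1   # dy/dx1 = x2 + 1 = 4, dy/dx2 = x1 = 2
backward!(y)
(x1.adj, x2.adj)   # (4.0, 2.0): the whole gradient from one backward pass
```

One forward evaluation plus one backward sweep yields every partial derivative, regardless of the number of inputs — the key contrast with finite differences.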
Packages implementing automatic differentiation techniques are already in use in the broader Julia ecosystem; some expose this functionality through an autodiff=true keyword option. The function must be written to take a generic input vector, e.g., f{T}(x::Vector{T}) or just f(x) instead of f(x::Vector{Float64}).

Discussions on JuliaDiff and its uses may be directed to the julia-users or julia-opt mailing lists. The autodiff.org site serves as a portal for the academic community. For a well-written, simple introduction to reverse-mode automatic differentiation, see Justin Domke's blog post. Finally, automatic differentiation techniques have been implemented in a variety of languages. If you would prefer not to use Julia, see the Wikipedia page for a comprehensive list of available packages.
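The generic-signature requirement matters because the special number types used for operator overloading must be able to flow through the function. A small illustration (function names here are made up for the example):

```julia
# Generically typed: the element type of x is left open, so a package can
# substitute its own number type (dual numbers, series, graph nodes, ...).
f_generic(x) = sum(xi^2 for xi in x)

# Locked to Float64: operator-overloading AD cannot pass its types through.
f_strict(x::Vector{Float64}) = sum(xi^2 for xi in x)

f_generic([1//2, 1//2])    # works even with Rational elements: 1//2
# f_strict([1//2, 1//2])   # MethodError: no method for Vector{Rational{Int}}
```

Writing `f` generically costs nothing for ordinary `Float64` inputs — Julia compiles a specialized method for each element type — while keeping the function usable by the automatic differentiation packages above.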