Maple Share Library: Automatic Differentiation
==============================================

DIFF            > readshare(DIFF, autodiff); (30K)
                Given a function f(x1,...,xm) input as a Maple procedure, DIFF
                computes D[i$n](f), i.e. the n'th order partial derivative of f
                with respect to the i'th parameter (optionally it returns all
                the derivatives).
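                EXAMPLE (a sketch; the calling convention DIFF(f, i, n) is an
                assumption, read as "n'th derivative wrt the i'th parameter"):
                    f := proc(x, y) x^3*y^2 end;
                    g := DIFF(f, 1, 2);
                    # g is a procedure for diff(f, x, x),
                    # i.e. it behaves like proc(x, y) 6*x*y^2 end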

GRADIENT        > readshare(GRADIENT, autodiff); (73K)
                SEE ALSO: autodiff/GRADIENT.tex (41K), autodiff/GRADIENT.ms

                Given a function f(x1,...,xm) input as a Maple procedure
                the output of GRADIENT(f) is a Maple procedure which
                computes the gradient of f, i.e. the vector G where
                    G[i] = diff(f(x1,...,xm),xi) for i=1..m
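                EXAMPLE (a minimal sketch; the procedures and values shown
                are illustrative, not from the package documentation):
                    f := proc(x, y) x^2*y + sin(x) end;
                    g := GRADIENT(f);
                    # g is a procedure computing the gradient
                    #     [2*x*y + cos(x), x^2]
                    # at the point it is called with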

HESSIAN         > readshare(HESSIAN, autodiff);
                (needs GRADIENT to be loaded)
                The output of HESSIAN(f) is a Maple procedure which computes
                the Hessian matrix H of partial derivatives where
                    H[i,j] = diff(f(x1,...,xm),xi,xj)
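                EXAMPLE (a minimal sketch with an illustrative function):
                    f := proc(x, y) x^2*y end;
                    h := HESSIAN(f);
                    # h computes the 2 x 2 symmetric matrix
                    #     [ 2*y  2*x ]
                    #     [ 2*x   0  ]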

JACOBIAN        > readshare(JACOBIAN, autodiff);
                (needs GRADIENT to be loaded)
                Given functions f[i](x1,...,xm) for i = 1..n input as Maple
                procedures, the output of JACOBIAN([f[1],...,f[n]]) is a Maple
                procedure which computes the Jacobian matrix J where
                        J[i,j] = diff(f[i](x1,...,xm),xj)
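                EXAMPLE (a minimal sketch with two illustrative functions):
                    f1 := proc(x, y) x*y end;
                    f2 := proc(x, y) x + y^2 end;
                    j := JACOBIAN([f1, f2]);
                    # j computes the 2 x 2 matrix
                    #     [ y    x  ]
                    #     [ 1  2*y  ]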

TAYLOR          > readshare(TAYLOR, autodiff);
                (needs GRADIENT to be loaded)
                The output of TAYLOR(f,[x1,...,xm],n) is a Maple procedure
                which computes the Taylor series coefficients of f up to
                total degree n.
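                EXAMPLE (a sketch; the expansion point and the format of the
                returned coefficients are assumptions):
                    f := proc(x, y) exp(x)*y end;
                    t := TAYLOR(f, [x, y], 3);
                    # t computes the Taylor coefficients of f about the
                    # origin up to total degree 3, i.e. those of
                    #     y + x*y + x^2*y/2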
                AUTHOR: Walter Neuenschwander, waneu@vision.ethz.ch
                AUTHOR: Michael Monagan, monagan@inf.ethz.ch


