West Coast AD

CS107 Final Project - Group 14

Anita Mahinpei, Erik Adames, Lekshmi Santhosh, Yingchen Liu

Agenda

Introduction

  • Background
  • Automatic Differentiation

Install & Usage

  • Implementation details
  • Software organization
  • How to use

Extension: Optimizer

  • Implementation details
  • How to use Optimizer

Future work

Introduction

Automatic Differentiation

Automatic differentiation is a set of techniques to numerically evaluate the derivative of a function specified by a computer program.

Advantages:

  1. Avoids the drawbacks of symbolic differentiation
    • no expression swell from repeatedly applying the rules of differentiation symbolically
  2. Derivative evaluation is carried out in parallel with function evaluation
    • throughout the whole process, only numbers need to be manipulated
  3. Requires only a small amount of extra storage
  4. Running times are short

Background

Automatic differentiation assumes that we are working with a differentiable function composed of a finite number of elementary functions and operations with known symbolic derivatives. The table below shows some examples of elementary functions and their respective derivatives:

Elementary Function    Derivative
$x^3$                  $3 x^{2}$
$e^x$                  $e^x$
$\sin(x)$              $\cos(x)$
$\ln(x)$               $\frac{1}{x}$

Given a list of elementary functions and their corresponding derivatives, the automatic differentiation process evaluates the derivatives of complex compositions of these elementary functions through repeated application of the chain rule:

$ \frac{\partial}{\partial x}\left[f_1 \circ (f_2 \circ \ldots (f_{n-1} \circ f_n)) \right] = \frac{\partial f_1}{\partial f_2} \frac{\partial f_2}{\partial f_3} \ldots \frac{\partial f_{n-1}}{\partial f_n}\frac{\partial f_n}{\partial x}$

This process can be applied to the evaluation of partial derivatives as well, thus allowing for the computation of derivatives of multivariate and vector-valued functions.
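The chain-rule mechanics above can be sketched with forward-mode dual numbers: each intermediate quantity carries both its value and its derivative, and every elementary operation updates both. This is an illustrative sketch only, not WestCoastAD's actual implementation; the `Dual` class and `sin` helper below are hypothetical names.

```python
import math

class Dual:
    """Carries a value and a derivative through each elementary operation."""
    def __init__(self, value, derivative=0.0):
        self.value = value
        self.derivative = derivative

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value,
                    self.derivative + other.derivative)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.derivative * other.value + self.value * other.derivative)

def sin(x):
    # Chain rule for an elementary function: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.value), math.cos(x.value) * x.derivative)

# d/dx [x*x + sin(x)] at x = 2, seeded with derivative 1
x = Dual(2.0, 1.0)
f = x * x + sin(x)
print(f.value)       # 4 + sin(2)
print(f.derivative)  # 2*2 + cos(2)
```

Note how the derivative is never stored as a symbolic expression; only numbers flow through the computation, which is the advantage highlighted earlier.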

Install & Usage

Implementation details

Software organization

How to use

Installation with GitHub

  1. Clone the package repository:

    git clone https://github.com/West-Coast-Differentiators/cs107-FinalProject.git

  2. If you do not have Anaconda installed, follow the instructions on their website to install it.

  3. Create a new conda environment:

    conda create --name <env_name> python=3.8

  4. Activate your virtual environment:

    conda activate <env_name>

  5. Navigate to the base repository:

    cd cs107-FinalProject

  6. Install the package and its requirements:

    pip install ./

  7. You may check that the installation was successful by running the package tests:

    python -m unittest discover -s WestCoastAD/test -p '*_test.py'

Using the Package

Having installed the package, users can evaluate derivatives of functions written in terms of WestCoastAD variables. For instance, the derivatives of $f(x) = x^3 + 2x^2$ and $f(x) = \sin(x) + \cos(x)$ can be evaluated as follows:

In [10]:
# 1) import WestCoastAD
from WestCoastAD import Variable
# 2) define the variables of your function as WestCoastAD variables 
#    with the value at which you want to evaluate the derivative and 
#    a derivative seed value
x = Variable(value=2, derivative_seed=1) 
# 3) define your function in terms of the WestCoastAD variable objects
f = x**3 + 2* x**2
# 4) access the computed derivative and the value of the function 
print('function: f = x**3 + 2* x**2')
print('Value:', f.value)
print('Derivative:', f.derivative)
function: f = x**3 + 2* x**2
Value: 16
Derivative: 20

For mathematical operations that are not part of the standard Python operators, the function can be called either through numpy or directly as a method on the variable itself.

In [11]:
import numpy as np
x = Variable(value=np.pi/2, derivative_seed=1)
# As an example, below we call sin directly on the variable and use numpy to call cos
f = x.sin() + np.cos(x)
print('function: f = x.sin() + np.cos(x)')
print('Value:', f.value)
print('Derivative:', f.derivative)
function: f = x.sin() + np.cos(x)
Value: 1.0
Derivative: -0.9999999999999999

Passing seed arrays to bind multiple variables

In [12]:
import numpy as np
x = Variable(4, np.array([1, 0]))
y = Variable(1, np.array([0, 1]))
f = x**2*y + np.sin(x-y)
print('function: f =x**2*y + np.sin(x-y)')
print('Value:', f.value)
print('Derivative:', f.derivative)
function: f =x**2*y + np.sin(x-y)
Value: 16.14112000805987
Derivative: [ 7.0100075 16.9899925]

Extension: Optimizer

Implementation details

How to use Optimizer
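The Optimizer's actual interface is not shown on this slide, so as a hedged sketch of the underlying idea only: a gradient-based optimizer repeatedly steps a variable against its derivative until it reaches a minimum. The function, step size, and helper name below are illustrative, not the package's API.

```python
# Plain gradient-descent sketch; `gradient_descent` and its parameters are
# hypothetical and do not reflect WestCoastAD's Optimizer interface.

def gradient_descent(grad, x0, learning_rate=0.1, num_iterations=100):
    """Minimize a scalar function given a callable `grad` for its derivative."""
    x = x0
    for _ in range(num_iterations):
        # Step opposite the derivative to move downhill
        x = x - learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)**2, whose derivative is 2*(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges close to 3.0
```

In the package, the derivative callable would come from automatic differentiation of the objective rather than from a hand-written gradient.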

Future work