Differential Programming in JavaScript

Propel provides a GPU-backed numpy-like infrastructure for scientific computing in JavaScript. JavaScript is a fast, dynamic language that, we think, could be an ideal environment for scientific programmers of all sorts.

Use it in Node:

npm install propel
let pr = require("propel");

Use it in a browser:

<script src="https://unpkg.com/propel@3.2.0"></script>

Run anywhere.

Propel runs both in the browser and natively from Node. In both environments Propel is able to use GPU hardware for computations. In the browser it utilizes WebGL through deeplearn.js, and on Node it uses TensorFlow's C API.

PhD optional.

Propel has an imperative autograd-style API. Computation graphs are traced as you run them; a general-purpose gradient function provides an elegant interface to backpropagation.
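
For example, here is what taking a derivative looks like. Treat this as a sketch: grad and linspace are exports described in the API reference, but the exact op names (tanh here) are assumptions.

let { grad, linspace } = require("propel");

// f(x) = tanh(x). grad(f) traces f as it runs and returns a new
// function that computes df/dx via backpropagation.
let f = x => x.tanh();
let g = grad(f);

let x = linspace(-4, 4, 9);
console.log(g(x)); // ≈ 1 - tanh²(x) at each sample point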

Did somebody say GPUs?

Browsers are great for demos, but they are not an ideal numerics platform. WebGL is a far cry from CUDA. By running Propel outside of the browser, users will be able to target multiple GPUs and make TCP connections. Models developed server-side will be much easier to deploy as HTML demos.

Let's do this.

The basic propel npm package is JavaScript only, without the TensorFlow bindings. To speed things up dramatically, install the package for your platform:

npm install propel_mac
npm install propel_windows
npm install propel_linux
npm install propel_linux_gpu

Neural Networks

[Live demo: a network training in-browser, logging lines like "step 1 loss 1.099"]

What are NNs anyway?

The terminology is a bit misleading: any number of operations and architectures can be considered a neural network. What they have in common is that NN models are built from differentiable operations, which lets gradient descent improve their fitness.
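
To make that concrete, here is a toy gradient descent loop sketched with grad; the tensor constructor and the sub, square, and mul ops are assumptions modeled on the API reference.

let { grad, tensor } = require("propel");

// Fit a single parameter w to minimize f(w) = (w - 3)².
let f = w => w.sub(3).square();
let df = grad(f);

let w = tensor(0.0);
for (let i = 0; i < 100; i++) {
  w = w.sub(df(w).mul(0.1)); // w -= 0.1 * df/dw
}
// After 100 steps, w has descended close to the minimum at 3.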

Propel provides a concise API for specifying, training, and making inferences with neural networks. In the example, a two-layer fully connected ReLU network is being trained on the iris dataset. Iris is a very small dataset in which each training example is just 4 floating-point features, with three possible labels. As is typical for classification problems, we apply softmaxLoss to the labels.
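
Roughly, that training loop looks like the sketch below. softmaxLoss is named above; dataset, experiment, sgd, linear, and relu are assumptions modeled on Propel's published examples, so check the API reference for the exact signatures.

let pr = require("propel");

async function train() {
  // An experiment names the model and holds its parameters.
  let exp = await pr.experiment("iris-demo");

  // 150 examples per batch, looped over the dataset 100 times.
  for (let batch of pr.dataset("iris").batch(150).repeat(100)) {
    let { features, labels } = await batch;
    // One SGD step over a two-layer fully connected ReLU network
    // ending in a softmax loss over the three iris labels.
    exp.sgd({ lr: 0.01 }, (params) =>
      features
        .linear("L1", params, 10).relu()
        .linear("L2", params, 3)
        .softmaxLoss(labels));
  }
}

train();

Each step would report the training loss, producing log lines like the "step 1 loss 1.099" shown above.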