# nBLAS

Node >=4.0 C++ bindings for all single- and double-precision CBLAS (Basic Linear Algebra Subprograms) routines.

```bash
$ npm install nblas
$ npm test
```

Works out of the box on OS X, since CBLAS is included in the standard Accelerate framework. On other operating systems you might have to install a BLAS/LAPACK implementation yourself or build it from source (Linux: `sudo apt-get install libblas-dev`).
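For example, on Debian/Ubuntu the full setup would look roughly like this (a sketch; package names vary across distributions):

```bash
# Install the reference BLAS development package (Debian/Ubuntu)
sudo apt-get install libblas-dev

# Then install the bindings as usual
npm install nblas
```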

```js
var nblas = require('nblas');

var f64a = new Float64Array([1, 2, 3]),
    f64b = new Float64Array([4, 5, 6]);

nblas.dot(f64a, f64b);           // 32
nblas.ddot(3, f64a, 1, f64b, 1); // 32 (equivalent low-level call)

var f32a = new Float32Array([1, 2, 3]),
    f32b = new Float32Array([4, 5, 6]);

nblas.dot(f32a, f32b);           // 32
nblas.sdot(3, f32a, 1, f32b, 1); // 32 (equivalent low-level call)
```

Double-precision functions expect Float64Array vectors; single-precision functions expect Float32Array vectors.
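The other routines follow the same calling convention. As a minimal sketch (assuming the raw bindings keep the standard CBLAS argument order, as `ddot`/`sdot` do above), `daxpy` computes `y = alpha * x + y` in place:

```js
var nblas = require('nblas');

var x = new Float64Array([1, 2, 3]),
    y = new Float64Array([4, 5, 6]);

// cblas_daxpy(n, alpha, x, incx, y, incy): y = alpha * x + y
nblas.daxpy(3, 2.0, x, 1, y, 1);

// y is now Float64Array [ 6, 9, 12 ]
```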