Basic Linear Algebra Subprograms

Basic Linear Algebra Subprograms (BLAS) are standardized application programming interfaces for subroutines that perform basic linear algebra operations such as vector and matrix multiplication. First published in 1979, they are used to build larger packages such as LAPACK. Because they are heavily used in high-performance computing, highly optimized implementations of the BLAS interface have been developed by hardware vendors such as Intel, as well as by other authors (e.g. ATLAS is a portable, self-optimizing BLAS). The LINPACK benchmark relies heavily on DGEMM, a BLAS subroutine, for its performance.

Functionality

The BLAS functionality is divided into three levels: 1, 2 and 3.

Level 1

This level contains vector operations of the form

$\boldsymbol{y} \leftarrow \alpha \boldsymbol{x} + \boldsymbol{y}$

as well as scalar dot products and vector norms, among other things.
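
As a minimal sketch (not part of the standard's text), the Level 1 routines DAXPY, DDOT and DNRM2 can be called through the C (CBLAS) binding of the reference implementation roughly as follows; the cblas.h header is taken from the netlib reference, while the values and the build details are assumptions made only for illustration:

    #include <stdio.h>
    #include <cblas.h>

    int main(void) {
        double alpha = 2.0;
        double x[3] = {1.0, 2.0, 3.0};
        double y[3] = {4.0, 5.0, 6.0};

        /* Level 1 AXPY: y <- alpha*x + y, with unit stride in both vectors */
        cblas_daxpy(3, alpha, x, 1, y, 1);

        /* Level 1 dot product and Euclidean norm */
        double dot = cblas_ddot(3, x, 1, y, 1);
        double nrm = cblas_dnrm2(3, x, 1);

        printf("y = [%g %g %g], dot = %g, ||x|| = %g\n",
               y[0], y[1], y[2], dot, nrm);
        return 0;
    }

The program must be linked against a library providing CBLAS (for example the netlib reference BLAS, ATLAS, or one of the vendor libraries listed below); the exact linker flag depends on the implementation.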

Level 2

This level contains matrix-vector operations of the form

$\boldsymbol{y} \leftarrow \alpha A \boldsymbol{x} + \beta \boldsymbol{y}$

as well as solving $T\boldsymbol{x} = \boldsymbol{y}$ for $\boldsymbol{x}$, with $T$ being triangular, among other things.
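
As an illustrative sketch only, a general matrix-vector product (DGEMV) followed by a triangular solve (DTRSV) might be expressed through CBLAS as follows; the row-major storage order, the 3x3 sizes and the values are arbitrary choices for the example:

    #include <cblas.h>

    /* Level 2 sketch: y <- alpha*A*x + beta*y, then solve the upper-triangular
       system A*z = y in place (the solution overwrites y).
       A is 3x3, row-major, so the leading dimension lda is 3. */
    void level2_example(void) {
        double A[9] = {1.0, 2.0, 3.0,
                       0.0, 4.0, 5.0,
                       0.0, 0.0, 6.0};
        double x[3] = {1.0, 1.0, 1.0};
        double y[3] = {0.0, 0.0, 0.0};

        /* DGEMV: y <- 1.0*A*x + 0.0*y */
        cblas_dgemv(CblasRowMajor, CblasNoTrans, 3, 3, 1.0, A, 3, x, 1, 0.0, y, 1);

        /* DTRSV: solve A*z = y using only the upper triangle of A
           (non-unit diagonal); z overwrites y. */
        cblas_dtrsv(CblasRowMajor, CblasUpper, CblasNoTrans, CblasNonUnit, 3, A, 3, y, 1);
    }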

Level 3

This level contains matrix-matrix operations of the form

$C \leftarrow \alpha A B + \beta C$

as well as solving $B \leftarrow \alpha T^{-1} B$ for triangular matrices $T$, among other things. This level contains the widely used General Matrix Multiply (GEMM) operation.
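
A brief sketch of a DGEMM call through CBLAS, computing $C \leftarrow \alpha A B + \beta C$; the 2x2 row-major matrices and their values are only illustrative:

    #include <cblas.h>

    /* Level 3 GEMM sketch: C <- alpha*A*B + beta*C for 2x2 row-major matrices. */
    void gemm_example(void) {
        double A[4] = {1.0, 2.0,
                       3.0, 4.0};
        double B[4] = {5.0, 6.0,
                       7.0, 8.0};
        double C[4] = {0.0, 0.0,
                       0.0, 0.0};

        /* M = N = K = 2; each leading dimension equals the row length
           in row-major order. */
        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    2, 2, 2, 1.0, A, 2, B, 2, 0.0, C, 2);
        /* C now holds {19, 22, 43, 50}. */
    }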

Implementations

refblas
The official reference implementation from netlib. C and Fortran 77 versions are available.[1]
Accelerate
Apple's framework for Mac OS X, which includes tuned versions of BLAS and LAPACK for both PowerPC and Intel Core processors.[2]
ACML
The AMD Core Math Library, supporting the AMD Athlon and Opteron CPUs under Linux and Windows.[3]
ATLAS
Automatically Tuned Linear Algebra Software, an open source implementation of BLAS APIs for C and Fortran 77.[4]
CUDA SDK
The NVIDIA CUDA SDK includes BLAS functionality for writing C programs that run on GeForce 8 Series graphics cards.
ESSL
IBM's Engineering and Scientific Subroutine Library, supporting the PowerPC architecture under AIX and Linux.[5]
Goto BLAS
Kazushige Goto's implementation of BLAS.[6]
HP MLIB
HP's math library, supporting the IA-64, PA-RISC, x86 and Opteron architectures under HP-UX and Linux.
Intel MKL
The Intel Math Kernel Library, supporting the Intel Pentium and Itanium CPUs under Linux and Windows.[7]
MathKeisan
NEC's math library, supporting the NEC SX architecture under SUPER-UX and Itanium under Linux.[8]
PDLIB/SX
NEC's Public Domain Mathematical Library for the NEC SX-4 system.[9]
SCSL
SGI's Scientific Computing Software Library contains BLAS and LAPACK implementations for SGI's IRIX workstations.[10]
Sun Performance Library
Optimized BLAS and LAPACK for SPARC and AMD64 architectures under Solaris 8, 9, and 10.[11]
uBLAS
A generic C++ template class library providing BLAS functionality. Part of the Boost library.[12]

The Sparse BLAS

Extensions of the (dense) BLAS described above to sparse matrices also exist; see the overview of the Sparse BLAS standard linked below.

See also

  • BLAS homepage on Netlib.org
  • BLAS FAQ
  • BLAS operations from the GNU Scientific Library reference manual
  • BLAS Quick Reference Guide from LAPACK Users' Guide
  • Lawson Oral History: One of the original authors of the BLAS discusses its creation in an oral history interview. Charles L. Lawson, oral history interview by Thomas Haigh, 6 and 7 November 2004, San Clemente, California. Society for Industrial and Applied Mathematics, Philadelphia, PA.
  • Dongarra Oral History: In an oral history interview, Jack Dongarra explores the early relationship of BLAS to LINPACK, the creation of higher-level BLAS versions for new architectures, and his later work on the ATLAS system to automatically optimize BLAS for particular machines. Jack Dongarra, oral history interview by Thomas Haigh, 26 April 2005, University of Tennessee, Knoxville, TN. Society for Industrial and Applied Mathematics, Philadelphia, PA.
  • An Overview of the Sparse Basic Linear Algebra Subprograms: The New Standard from the BLAS Technical Forum [13]