Math Kernel Library

Intel oneAPI Math Kernel Library (Intel oneMKL; formerly Intel Math Kernel Library or Intel MKL) is a library of optimized math routines for science, engineering, and financial applications. Core math functions include BLAS, LAPACK, ScaLAPACK, sparse solvers, fast Fourier transforms, and vector math.[5][6]

Intel oneAPI Math Kernel Library
  • Developer(s): Intel
  • Initial release: May 9, 2003 (2003-05-09)
  • Stable release: 2023.1 / March 29, 2023 (2023-03-29)[1]
  • Written in: C/C++, Intel DPC++ Compiler, Fortran
  • Operating system: Microsoft Windows, Linux, macOS
  • Platform: CPU,[2] GPU
  • Type: Library and framework
  • License: freeware under ISSL[3][4]
  • Website: www.intel.com/content/www/us/en/developer/tools/oneapi/onemkl.html

The library supports Intel CPUs and GPUs[2] and is available for Windows, Linux and macOS operating systems.[5][6][7]

Intel oneAPI Math Kernel Library is not to be confused with oneAPI Math Kernel Library (oneMKL) Interfaces, a piece of open-source glue code that allows Intel MKL routines to be used from Data Parallel C++.[8]

History and licensing

Intel launched the Math Kernel Library on May 9, 2003, and called it blas.lib.[9] The project's development teams are located in Russia and the United States.

The library was available in a standalone form, free of charge, under the terms of the Intel Simplified Software License,[3] which allows redistribution.[10]

In April 2020, MKL became part of oneAPI. Commercial support for oneMKL is available when it is purchased as part of the oneAPI Base Toolkit.

Performance and vendor lock-in

MKL, like other programs generated by the Intel C++ Compiler and the Intel DPC++ Compiler, improves performance with a technique called function multi-versioning: a function is compiled or written for many of the x86 instruction set extensions, and at run time a "master function" uses the CPUID instruction to select the version most appropriate for the current CPU. However, if the master function detects a non-Intel CPU, it almost always chooses the most basic (and slowest) version, regardless of which instruction sets the CPU claims to support. This behavior has earned the dispatcher the nickname "cripple AMD" routine since 2009.[11] As of 2020, Intel's MKL remains the numeric library installed by default along with many pre-compiled mathematical applications on Windows (such as NumPy and SymPy).[12][13] MATLAB, which relies on MKL, implemented a workaround starting with Release 2020a that ensures MKL's full AVX2 support is also used on non-Intel (AMD) CPUs.[14]
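The general dispatch pattern can be illustrated with a short C sketch. This is not MKL's internal code: the kernel names are hypothetical, and the GCC/Clang builtin __builtin_cpu_supports stands in for the CPUID-based detection described above.

    #include <stdio.h>

    /* Baseline version of a kernel, valid on any x86-64 CPU. */
    static double dot_generic(const double *x, const double *y, int n) {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += x[i] * y[i];
        return s;
    }

    /* Version compiled for AVX2/FMA; the compiler may auto-vectorize it. */
    __attribute__((target("avx2,fma")))
    static double dot_avx2(const double *x, const double *y, int n) {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += x[i] * y[i];
        return s;
    }

    /* "Master function": probes the CPU once, then dispatches to the best
       version. The criticism described above is that Intel's dispatcher is
       reported to also check the CPU vendor, falling back to the baseline
       path on non-Intel CPUs; this sketch dispatches on feature flags only. */
    static double dot(const double *x, const double *y, int n) {
        static int use_avx2 = -1;
        if (use_avx2 < 0)
            use_avx2 = __builtin_cpu_supports("avx2");
        return use_avx2 ? dot_avx2(x, y, n) : dot_generic(x, y, n);
    }

    int main(void) {
        double a[] = {1, 2, 3, 4}, b[] = {5, 6, 7, 8};
        printf("dot = %f\n", dot(a, b, 4));
        return 0;
    }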

Details

Functional categories

Intel MKL has the following functional categories:[15]

  • Linear algebra: BLAS routines cover vector-vector (Level 1), matrix-vector (Level 2) and matrix-matrix (Level 3) operations for real and complex single- and double-precision data. LAPACK consists of tuned LU, Cholesky and QR factorizations, together with eigenvalue and least-squares solvers. MKL also includes Sparse BLAS, ScaLAPACK, Sparse Solver, Extended Eigensolver (FEAST, PARDISO), PBLAS and BLACS. At small matrix dimensions, MKL outperforms even libxsmm.
    Since MKL uses the standard interfaces for BLAS and LAPACK, applications that use other implementations can often gain performance on Intel and compatible processors simply by re-linking against the MKL libraries; a minimal matrix-multiplication call through this interface is sketched after this list.
  • MKL includes a variety of Fast Fourier Transforms (FFTs), from 1D to multidimensional, covering complex-to-complex, real-to-complex and real-to-real transforms of arbitrary lengths. Applications written against the open-source FFTW interface can be ported to MKL by linking with the FFTW wrapper libraries provided as part of MKL; a basic call through MKL's native DFTI interface is sketched after this list.
    Cluster versions of LAPACK and the FFTs are also available as part of MKL, taking advantage of MPI parallelism in addition to the single-node parallelism obtained from multithreading.
  • Vector math functions provide computationally intensive core mathematical operations for single- and double-precision real and complex data types. They are similar to the libm functions shipped with compiler runtime libraries, but operate on vectors rather than scalars for better performance. Controls are available for setting the accuracy, error mode and denormalized-number handling of the routines; a small example appears after this list.
  • Statistics functions include random number generators and probability distributions optimized for multicore processors, as well as compute-intensive in-core and out-of-core routines for computing basic statistics, estimating dependencies, and so on; a short random-number example appears after this list.
  • Data fitting functions include splines (linear, quadratic, cubic, look-up, stepwise constant) for 1-dimensional interpolation that can be used in data analytics, geometric modeling and surface approximation applications.
  • Partial Differential Equations
  • Nonlinear Optimization Problem Solvers
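
As an illustration of the standard BLAS interface mentioned above, a minimal double-precision matrix multiplication through MKL's CBLAS header might look as follows. This is a sketch: matrix sizes and values are arbitrary, and the same call works against any CBLAS implementation, which is what makes re-linking possible.

    #include <stdio.h>
    #include <mkl.h>

    int main(void) {
        /* C = alpha*A*B + beta*C, with a 2x3 times 3x2 product, row-major storage. */
        double A[6] = {1, 2, 3,
                       4, 5, 6};
        double B[6] = {7,  8,
                       9, 10,
                      11, 12};
        double C[4] = {0, 0, 0, 0};

        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    2, 2, 3,       /* m, n, k */
                    1.0, A, 3,     /* alpha, A, lda */
                    B, 2,          /* B, ldb */
                    0.0, C, 2);    /* beta, C, ldc */

        printf("%g %g\n%g %g\n", C[0], C[1], C[2], C[3]);
        return 0;
    }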
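
A minimal sketch of a 1D complex-to-complex transform through MKL's DFTI descriptor interface, assuming in-place processing and omitting error-status checks:

    #include <mkl_dfti.h>

    int main(void) {
        MKL_Complex16 x[32];            /* 1D complex data, transformed in place */
        for (int i = 0; i < 32; i++) { x[i].real = i; x[i].imag = 0.0; }

        DFTI_DESCRIPTOR_HANDLE h = NULL;
        /* Describe, commit and compute a forward DFT of length 32. */
        DftiCreateDescriptor(&h, DFTI_DOUBLE, DFTI_COMPLEX, 1, (MKL_LONG)32);
        DftiCommitDescriptor(h);
        DftiComputeForward(h, x);
        DftiFreeDescriptor(&h);
        return 0;
    }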
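
For the vector math functions, a sketch computing the exponential of a small array with vdExp; the vmlSetMode call is included only to show the accuracy control mentioned above (VML_LA selects the "low accuracy" mode):

    #include <stdio.h>
    #include <mkl_vml.h>

    int main(void) {
        double a[4] = {0.0, 1.0, 2.0, 3.0};
        double y[4];

        vmlSetMode(VML_LA);    /* trade a little precision for speed */
        vdExp(4, a, y);        /* y[i] = exp(a[i]) for the whole vector */

        for (int i = 0; i < 4; i++) printf("%f\n", y[i]);
        return 0;
    }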
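
For the statistics functions, a sketch that draws normally distributed random numbers from a Mersenne Twister stream; the seed and sample count are arbitrary:

    #include <stdio.h>
    #include <mkl_vsl.h>

    int main(void) {
        VSLStreamStatePtr stream;
        double r[8];

        /* Create a Mersenne Twister stream, draw 8 N(0,1) samples, clean up. */
        vslNewStream(&stream, VSL_BRNG_MT19937, 777);
        vdRngGaussian(VSL_RNG_METHOD_GAUSSIAN_ICDF, stream, 8, r, 0.0, 1.0);
        vslDeleteStream(&stream);

        for (int i = 0; i < 8; i++) printf("%f\n", r[i]);
        return 0;
    }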

MKL formerly included Deep Neural Network (DNN) functions, but they were removed in version 2020.[16] They were superseded by the Intel oneAPI Deep Neural Network Library (oneDNN).
