Mem (computing)

From Wikipedia, the free encyclopedia

In computing, mem is a unit of measurement for the number of memory accesses used or needed by a process, function, instruction set, algorithm or data structure. The mem is used in computational complexity theory, analyses of computing efficiency, combinatorial optimization, supercomputing, and assessments of computational cost (algorithmic efficiency).

Example usage, from a discussion of the processing time of a search-tree node when counting 10 × 10 Latin squares: "A typical node of the search tree probably requires about 75 mems (memory accesses) for processing, to check validity. Therefore the total running time on a modern computer would be roughly the time needed to perform 2×10^20 mems." (Donald Knuth, 2011, The Art of Computer Programming, Volume 4A, p. 6).
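For scale: at a hypothetical rate of 10^9 memory accesses per second, 2×10^20 mems would correspond to about 2×10^11 seconds, i.e. several thousand years of computation.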

Reducing mems to improve speed and efficiency does not yield a purely linear benefit, because it is typically traded off against an increase in the number of ordinary operations performed.
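As an illustration of how mems are counted in practice, the following Python sketch (hypothetical; the function name and the convention of charging one mem per array-element read are illustrative assumptions, not taken from the literature) instruments a linear search to tally memory accesses separately from ordinary operations:

    def linear_search_with_mems(arr, target):
        """Linear search that tallies memory accesses (mems).

        Each read of an array element is charged as one mem; index
        arithmetic and comparisons are ordinary operations and are
        not tallied here.
        """
        mems = 0
        for i in range(len(arr)):
            value = arr[i]  # one memory access
            mems += 1
            if value == target:
                return i, mems
        return -1, mems

    # Searching 1000 elements for a missing key costs 1000 mems.
    index, mems = linear_search_with_mems(list(range(1000)), -1)
    print(index, mems)  # -1 1000

Under this accounting, an unsuccessful search of n elements always costs n mems, regardless of how cheap or expensive each comparison is.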

PFOR compression

This optimization technique is also called PForDelta.[1]

Although lossless compression methods such as Rice, Golomb and PFOR coding are most often associated with signal-processing codecs, their ability to encode binary integers compactly also makes them relevant to the trade-off between mems and ordinary operations: storing data in compressed form reduces memory traffic at the cost of extra decoding operations. (See Golomb coding for details.)[2]
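As a rough sketch of the idea (a minimal Python illustration under assumed conventions; the function names and layout are hypothetical and do not reproduce the wire format of any real PForDelta implementation), a sorted integer list can be encoded as fixed-width deltas plus a patch list for oversized values:

    def pfor_delta_encode(sorted_ints, bits=4):
        # Store the gap sequence in fixed-width slots; gaps too large
        # to fit become "exceptions" that are patched in on decode.
        deltas = [sorted_ints[0]] + [b - a for a, b in zip(sorted_ints, sorted_ints[1:])]
        limit = 1 << bits
        slots, exceptions = [], []
        for d in deltas:
            if d < limit:
                slots.append(d)
            else:
                slots.append(0)                         # placeholder slot
                exceptions.append((len(slots) - 1, d))  # position, true value
        return slots, exceptions

    def pfor_delta_decode(slots, exceptions):
        deltas = list(slots)
        for pos, value in exceptions:
            deltas[pos] = value  # patch the oversized gaps back in
        out, total = [], 0
        for d in deltas:
            total += d
            out.append(total)
        return out

    data = [3, 5, 6, 20, 21, 300]
    print(pfor_delta_decode(*pfor_delta_encode(data)) == data)  # True

Decoding trades extra ordinary operations (the patch loop and the running sum) for fewer memory accesses to the compressed data, which is precisely the mems-versus-operations trade-off described above.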

References

  1. ^ "on compression" techniques of benchmarking and optimization using compression" (PDF). Archived from the original (PDF) on 2012-12-21. Retrieved 2014-02-13.
  2. ^ "MEMS vs. OOPS". Article including compression codecs.
