31 bit pseudo-random number gen in C, C++ & dsPIC assembly code

Started by ●September 27, 2005

Most heavy research on PRNGs (pseudo-random number generators) is for
cryptographic or simulation applications.  Those generators are generally
too slow, or use too much memory, if all that is required is a long
sequence of random-sounding numbers, such as for noise, dither etc. in
audio DSP.  Finding a reliable, fast algorithm for this purpose is
tricky, since many linear congruential generators have failings.

I have a new page:

   http://www.firstpr.com.au/dsp/rand31/

which provides C, C++ and dsPIC assembly code for a fast, 32 bit integer,
no-division implementation of the well-known Park-Miller (1988) "minimal
standard" pseudo-random number generator.  This follows David G. Carta's
1990 suggestion of how to implement this algorithm without using
division.  The generator produces 31 bit numbers, between 1 and
(2^31 - 2) = 0x7FFFFFFE.

Park and Miller rejected Carta's work in 1993, saying it didn't work.
But it works fine!  I trace the history of this PRNG, look at some
explanations, and give an explanation of why Carta's approach works which
is different from, and much simpler than, Carta's own.

The dsPIC subroutine uses 18 CPU cycles, including the call and return,
so it can generate a million results a second at 20 to 30 MIPS (80 to
120 MHz).  With unoptimised C code, an 800 MHz Pentium III can generate
13 million results a second.

  - Robin

http://www.firstpr.com.au
Melbourne, Australia

--
comp.lang.c.moderated - moderation address: clcm@plethora.net -- you must
have an appropriate newsgroups line in your header for your mail to be
seen, or the newsgroup name in square brackets in the subject line.  Sorry.
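[Editorial note: a minimal C sketch of the generator the opening post describes,
new = (16807 * old) mod (2^31 - 1), with the modulus taken by folding rather
than division.  For clarity this forms the 46-bit product in a uint64_t;
Carta's actual contribution, and the code on the page above, does the
equivalent with 16-bit multiplies on a 32-bit machine, so this is an
illustration of the folding idea, not the page's exact code.]

```c
#include <stdint.h>

/* One step of the Park-Miller "minimal standard" PRNG,
 * new = (16807 * old) mod (2^31 - 1), without any division.
 * *state must be in 1 .. 0x7FFFFFFE.
 */
uint32_t rand31_next(uint32_t *state)
{
    uint64_t p  = (uint64_t)(*state) * 16807;   /* 46-bit product */
    uint32_t lo = (uint32_t)(p & 0x7FFFFFFF);   /* low 31 bits    */
    uint32_t hi = (uint32_t)(p >> 31);          /* high 15 bits   */

    /* 2^31 = 1 (mod 2^31 - 1), so hi*2^31 + lo = hi + lo.  The sum
     * fits in 32 bits; if it overflows 31 bits, fold once more by
     * subtracting 2^31 - 1 (clear bit 31, add 1).
     */
    uint32_t s = lo + hi;
    if (s & 0x80000000)
        s = (s & 0x7FFFFFFF) + 1;
    return *state = s;
}
```

Park and Miller's own published check value can be used to verify any
implementation: starting from a state of 1, the 10,000th result should be
1043618065.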
Reply by ●September 28, 2005
Robin Whittle wrote:
> ... PRNGs ... in audio DSP. ...
>
> http://www.firstpr.com.au/dsp/rand31/

Thank you.  I bookmarked it.

Jerry
--
Engineering is the art of making what you want from things you can get.
Reply by ●September 30, 2005
Robin Whittle wrote:
> Most of heavy research on PRNGs (pseudo-random number generators) is
> for cryptographic or simulation applications.  These generators are
> generally too slow or use too much memory if all that is required is
> a long sequence of random sounding numbers, such as for noise, dither
> etc. in audio DSP.  Trying to find a reliable, fast, algorithm for
> this purpose is tricky, since many linear congruential generators have
> failings.

George Marsaglia has published several very fast PRNG algorithms for
modeling and simulation.  You might Google in "Groups" for his name.

The LFSR PRNG sidesteps many of the problems of the linear congruential
generators and, with modifications, has been used for cryptography.  You
might be interested in the following paper:

  "Generalized Feedback Shift Register Pseudorandom Number Algorithm"
  Journal of the ACM, Vol. 20, No. 3, July 1973, pp. 456-468.

The Lewis and Payne PRNG is an LFSR algorithm that generates pseudorandom
numbers as "bits-in-parallel".  It thus produces a very-high-speed stream
of numbers of arbitrary width with a very long period.  In fact, the
period is readily adjustable to incomprehensible sizes depending on the
parameters that define the LFSR length and its tap.  Most of the work is
done during initialization of the parallel shift registers.

Below is a C implementation of the functional part of the L&P PRNG that
generates 32-bit PRNs.  You can judge for yourself how fast it could be.
If you would like to look at a complete C implementation of the L&P
generator, send me an e-mail.

/**************************************************************\
**                                                            **
**   YEARS*     P     Q                                       **
**   ------    ---   ---                                      **
**   10^17     124    37                                      **
**   10^31     170    23                                      **
**   10^55     250   103                                      **
**   10^94     380    47                                      **
**   10^140    532    37                                      **
**                                                            **
**   * YEARS TO REPEAT CYCLE ASSUMING 1 WORD PER PICOSECOND   **
**     (10^12 WORDS PER SECOND) -- CURRENT (2002) TECHNOLOGY  **
**     CAN'T QUITE YIELD 10^9 WORDS PER SECOND USING THIS     **
**     ALGORITHM.                                             **
**                                                            **
**   According to Lewis and Payne, avoid designs where Q is   **
**   close to zero or close to (P-1)/2.  By these criteria,   **
**   P=124, Q=37 appears to be the best design choice.        **
**                                                            **
\**************************************************************/

#define P 124   /* Polynomial Order */
#define Q  37   /* Polynomial Tap   */

unsigned long RandLP_int(unsigned long m[], int *j)
{
    int k;

    *j += 1;
    if (*j >= P) *j = 0;
    k = (*j + Q);
    if (k >= P) k = k - P;
    return (m[*j] ^= m[k]);
}
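[Editorial note: the post says most of the work is in initializing the P
lagged words, and offers the full code by e-mail.  The sketch below shows
one simple, hypothetical way to get started: fill m[] from an ordinary
32-bit LCG (multiplier 69069 is a common choice) and discard some warm-up
output.  Lewis and Payne's own initialization is more careful about the
linear independence of the starting words; this is only an illustration.
RandLP_int is repeated so the sketch stands alone.]

```c
#define P 124   /* Polynomial Order */
#define Q  37   /* Polynomial Tap   */

/* The L&P step function from the post above, repeated for completeness. */
unsigned long RandLP_int(unsigned long m[], int *j)
{
    int k;
    *j += 1;
    if (*j >= P) *j = 0;
    k = (*j + Q);
    if (k >= P) k = k - P;
    return (m[*j] ^= m[k]);
}

/* Hypothetical initialization: seed the P lagged words from a simple
 * 32-bit LCG, then run the generator for a while to mix the state.
 * Not Lewis and Payne's published scheme -- a sketch only.
 */
void InitLP(unsigned long m[], int *j, unsigned long seed)
{
    unsigned long x = seed ? seed : 1UL;
    int i;

    for (i = 0; i < P; i++) {
        x = (x * 69069UL + 1UL) & 0xFFFFFFFFUL;  /* one LCG step, mod 2^32 */
        m[i] = x;
    }
    *j = -1;                        /* RandLP_int pre-increments *j to 0 */
    for (i = 0; i < 10 * P; i++)
        (void)RandLP_int(m, j);     /* discard warm-up output */
}
```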
Reply by ●October 7, 2005

Is there a measure of the "randomness" or should I say the
"non-randomness" of a signal?

Dave
Reply by ●October 7, 2005
<onyx49@juno.com> wrote in message
news:1128730678.728132.286240@g47g2000cwa.googlegroups.com...
>
> Is there a measure of the "randomness" or should I say
> the "non-randomness" of a signal?
>
> Dave

Well, the answer to that is a definite yes, and no.  ;)

There is a property called "entropy" and it's measured in bits.  It's
unfortunate that this name collides with the thermodynamic property of
the same name, because it confuses a lot of people.

There is another related property, frequently cited by cryptographers
(repost your question in sci.crypt), that is loosely referred to as
"unpredictability".  In many cases, especially when dealing with ciphers
based on pseudo-random numbers (which aren't random at all),
unpredictability is the key property.  In other cases, such as modeling
and simulation, unpredictability may be less important.

In the information theoretic sense, "entropy" is the number of bits
required to enumerate all the possibilities in a population that is
uniformly distributed in the following sense: if you have a million
unique things in a pool, and there is an equal probability of choosing
any one of them (and it is no more *predictable* that you will choose one
over any other), then the choice of any item is worth about 20 bits of
entropy.
Reply by ●October 8, 2005
John E. Hadstate wrote:
> <onyx49@juno.com> wrote in message
> news:1128730678.728132.286240@g47g2000cwa.googlegroups.com...
>
>> Is there a measure of the "randomness" or should I say
>> the "non-randomness" of a signal?
>>
>> Dave
>
> Well, the answer to that is a definite yes, and no.  ;)
>
> There is a property called "entropy" and it's measured in
> bits.  It's unfortunate that this name collides with the
> thermodynamic property of the same name, because it confuses
> a lot of people.

-snip-

It's not an unfortunate name collision, they're the same thing (if
possibly scaled differently).  There's a Richard Feynman book on the
thermodynamics of computation that explains this, and other concepts that
seem really twisted but really aren't after you think hard enough about
them*.

Turns out that you can predict the minimum amount of energy it'll take to
do a certain computation (or forget a certain message) in a certain
amount of time.  At the time that he was writing it the ratio of energy
used vs. theoretical minimum was over 1000:1, so it may be a while before
Intel has to refer to that work on a daily basis.

* Of course you have to think so hard that the objecting brain cells curl
up and die, but you'll be smarter on average once the debris gets washed
down to your liver and kidneys.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by ●October 8, 2005
*** Cross-post to sci.crypt added ***

"Tim Wescott" <tim@seemywebsite.com> wrote in message
news:Y9WdnYKex68X-9reRVn-pw@web-ster.com...
> John E. Hadstate wrote:
>> <onyx49@juno.com> wrote in message
>> news:1128730678.728132.286240@g47g2000cwa.googlegroups.com...
>>
>>> Is there a measure of the "randomness" or should I say
>>> the "non-randomness" of a signal?
>>>
>>> Dave
>>
>> Well, the answer to that is a definite yes, and no.  ;)
>>
>> There is a property called "entropy" and it's measured in
>> bits.  It's unfortunate that this name collides with the
>> thermodynamic property of the same name, because it
>> confuses a lot of people.
>
> -snip-
>
> It's not an unfortunate name collision, they're the same
> thing (if possibly scaled differently).  There's a Richard
> Feynman book on the thermodynamics of computation that
> explains this, and other concepts that seem really twisted
> but really aren't after you think hard enough about them*.

They can't be quite the same thing since thermodynamic entropy measures a
property of physical systems while information theoretic entropy measures
a property of something that is a figment of our imaginations (a useful
figment, to be sure, but "bits" have no physical manifestation and are
not constrained by the laws of physics).  I'm not equipped nor motivated
to elaborate more precisely on the difference.  That's why I cross-posted
the question to sci.crypt, where there are a number of regulars who are
eminently qualified to explain the difference (Doug Gwyn, a physicist
with a good understanding of information theory, comes to mind).

> Turns out that you can predict the minimum amount of
> energy it'll take to do a certain computation (or forget a
> certain message) in a certain amount of time.  At the time
> that he was writing it the ratio of energy used vs.
> theoretical minimum was over 1000:1, so it may be a while
> before Intel has to refer to that work on a daily basis.
>
> * Of course you have to think so hard that the objecting
> brain cells curl up and die, but you'll be smarter on
> average once the debris gets washed down to your liver and
> kidneys.

LOL!  And I thought that only beer would flush out dead brain cells.  Now
I know that I can just sit and ponder esoterica.  Much cheaper, but maybe
not as enjoyable.

> --
> Tim Wescott
> Wescott Design Services
> http://www.wescottdesign.com
Reply by ●October 8, 2005
John E. Hadstate wrote:
>> It's not an unfortunate name collision, they're the same
>> thing (if possibly scaled differently).  There's a Richard
>> Feynman book on the thermodynamics of computation that
>> explains this, and other concepts that seem really twisted
>> but really aren't after you think hard enough about them*.
>
> They can't be quite the same thing since thermodynamic
> entropy measures a property of physical systems while
> information theoretic entropy measures a property of
> something that is a figment of our imaginations (a useful
> figment, to be sure, but "bits" have no physical
> manifestation and are not constrained by the laws of
> physics).

Claude Shannon didn't pick the term 'entropy' by coincidence.
Thermodynamic entropy and cryptographic entropy are both the same
function of the probabilities of a system being in each of its possible
states.

--Mike Amling
Reply by ●October 8, 2005
"Mike Amling" <nospam@nospam.com> wrote in message news:pOW1f.476$Ql1.53@chiapp18.algx.net...> John E. Hadstate wrote: >>> >>>It's not an unfortunate name collision, they're the same >>>thing (if possibly scaled differently). There's a >>>Richard Feynman book on the thermodynamics of computation >>>that explains this, and other concepts that seem really >>>twisted but really aren't after you think hard enough >>>about them*. >> >> >> They can't be quite the same thing since thermodynamic >> entropy measures a property of physical systems while >> information theoretic entropy measures a property of >> something that is a figment of our imaginations (a useful >> figment, to be sure, but "bits" have no physical >> manifestation and are not constrained by the laws of >> physics). > > Claude Shannon didn't pick the term 'entropy' by > coincidence. Thermodynamic entropy and cryptographic > entropy are both the same function of the probabilities of > a system being in each of its possible states. >What does it mean for a continuous, physical system to be "in each of its possible states"? So far as I know, thermodynamics pre-dates quantum theory and, as such, views the physical world as a continuum.






