-----Original Message----- From: Jon Agiato [mailto:JonAgiato@nyc.rr.com]
Thanks again for all the help! I was looking through random-generators.html and see that, in addition to the Mersenne Twister, Linear Congruential, and Lagged Fibonacci, there are five others: const_mod, additive_combined, shuffle_output, rand48, and inversive_congruential. I am assuming these are all good RNGs to use for comparison; am I wrong?
const_mod: not an RNG; it is used as an implementation helper for other RNGs
rand48: uses the Linear Congruential algorithm
additive_combined, shuffle_output: modifications to the Linear Congruential to try to make it better, so these two require Linear Congruential generator(s) as input
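For concreteness, Boost.Random already ships named typedefs built from these pieces; the names below are the ones listed in random-generators.html, and the convenience header <boost/random.hpp> is assumed. A minimal sketch:

#include <iostream>
#include <boost/random.hpp>

int main()
{
    boost::minstd_rand  lcg;       // plain Linear Congruential ("minimal standard")
    boost::rand48       lcg48;     // the drand48/lrand48 Linear Congruential recipe
    boost::ecuyer1988   combined;  // additive_combine of two Linear Congruential engines
    boost::kreutzer1986 shuffled;  // shuffle_output wrapped around a Linear Congruential engine

    // Every generator models the same concept: calling it yields the next value.
    std::cout << lcg() << ' ' << lcg48() << ' '
              << combined() << ' ' << shuffled() << '\n';
    return 0;
}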
And lastly (don't want to wear out my welcome.. lol), do you think there would be any benefit to using the advanced RNG classes in my research?
Actually, I think it might be a good idea to do so for your project. The specializations provided by Boost.Random are well-known RNGs that have been published in research papers and used extensively. There are undoubtedly many other good specializations out there, but few are used (mainly because it is very hard to define what a "good RNG" is).
Anyway, what I was thinking is that you might make some specializations of your own that are known to be bad RNGs, and discuss why they are bad. Here is where my knowledge about RNGs ends; I'm sure there are known bad RNGs, but I don't know what they are.
-Steve
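For what it's worth, one classic example of a known-bad generator is IBM's RANDU, a Linear Congruential generator with a = 65539, c = 0, m = 2^31, whose consecutive triples famously fall on only 15 planes in 3-D space. A hand-rolled sketch of it, written as a plain function object rather than through Boost's own templates so the template details stay out of the way:

#include <iostream>

// RANDU: x(n+1) := 65539 * x(n) mod 2^31, seeded with an odd value.
class randu_generator
{
public:
    typedef unsigned long result_type;

    explicit randu_generator(result_type seed = 1) : x_(seed) {}

    result_type operator()()
    {
        // Unsigned wrap-around at 2^32 is harmless here because 2^31 divides 2^32.
        x_ = (65539UL * x_) % 2147483648UL;
        return x_;
    }

    result_type min() const { return 1; }
    result_type max() const { return 2147483647UL; }

private:
    result_type x_;
};

int main()
{
    randu_generator bad;
    for (int i = 0; i < 5; ++i)
        std::cout << bad() << '\n';
    return 0;
}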
That's a great idea, Steve, thanks! I am rather new to RNG research but agree that a comparison of sorts would be good. Could you or anyone here provide assistance, perhaps a short example of how one would use one of the advanced RNGs Boost provides? I use Boost a lot, and I think one of the things it could really benefit from is more in-depth documentation, especially for those of us trying to put the library to use as quickly as possible. Then again, I know that 1_30_0 should be out soon, so the developers have enough on their plates.
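A minimal usage sketch of the kind asked for here, assuming the <boost/random.hpp> convenience header and the mt19937 typedef, and using the basic integer-seed constructor:

#include <iostream>
#include <boost/random.hpp>

int main()
{
    // Seed the Mersenne Twister with a fixed value so runs are reproducible.
    boost::mt19937 rng(42u);

    // Draw a few raw variates; every Boost.Random engine is a function object.
    for (int i = 0; i < 5; ++i)
        std::cout << rng() << '\n';

    // The engine reports its output range through min() and max().
    std::cout << "range: [" << rng.min() << ", " << rng.max() << "]\n";
    return 0;
}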
Look at the docs again; they explain what is necessary to specialize the
generic algorithms. Note that you should be familiar with the generic RNG
algorithms before attempting to specialize.
For example, looking at the Linear Congruential RNG (which is the only RNG
algorithm I'm familiar with), the algorithm is -- as specified in the
Boost.Random docs:
x(n+1) := (a * x(n) + c) mod m
and boost::random::linear_congruential is declared as:
template