
13 Sep 2005
11:28 a.m.
"Daryle Walker" <darylew@hotmail.com> wrote
> OK. When you say "arbitrary precision," you mean that a precision limit must be set (at run-time) before an operation. Most people use "arbitrary precision" to mean unlimited precision, not your "run-time cut-off" precision.
Are there really libraries that have unlimited precision? What happens when the result of a computation is irrational?

regards
Andy Little
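A small sketch in Python's standard `decimal` module illustrates the point Andy is raising: even a so-called arbitrary-precision library must cut off an irrational result at whatever precision was chosen beforehand, since the exact value has no finite representation. (Python is used here purely for illustration; the thread itself is about C++ libraries.)

```python
from decimal import Decimal, getcontext

# The precision limit is set at run time, before the operation --
# the "run-time cut-off" Daryle describes.
getcontext().prec = 50

# sqrt(2) is irrational, so no library can return it exactly;
# the result is truncated to the first 50 significant digits.
root = Decimal(2).sqrt()
print(root)

# Check: squaring the truncated result does not recover 2 exactly.
print(root * root == Decimal(2))
```

Raising `prec` gives more digits, but the answer is always a finite approximation: "arbitrary" here means the cut-off can be made as large as you like, not that it can be removed.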