Maximum Size of Matrix in Boost

Hi, I have a 10^7 x 10^7 matrix, with each entry a double value like 0.333, all printed to a text file. I want to slurp all of them into a Boost matrix. With 6 GB of RAM, can it handle such a matrix? - Gundala Viswanath, Jakarta - Indonesia

On Wed, Jan 28, 2009 at 12:02:20AM +0900, Gundala Viswanath wrote:
> Hi, I have a 10^7 x 10^7 matrix, with each entry a double value like 0.333, all printed to a text file.
So that is 10 million x 10 million x 8 bytes per double, or 800 million million bytes.
> I want to slurp all of them into a Boost matrix. With 6 GB of RAM, can it handle such a matrix?
Which is 6 thousand million bytes.
> - Gundala Viswanath, Jakarta - Indonesia
I think that 800 million million is larger than 6 thousand million, so I do not think that your matrix will fit into your memory.

Bob
--
40 isn't old. If you're a tree.

AMDG

Bob Wilkinson wrote:
>> I have a 10^7 x 10^7 matrix, with each entry a double value like 0.333, all printed to a text file.
>> I want to slurp all of them into a Boost matrix. With 6 GB of RAM, can it handle such a matrix?
> I think that 800 million million is larger than 6 thousand million, so I do not think that your matrix will fit into your memory.
Not to mention that 800 TB is probably more than will fit on your hard drive.

In Christ,
Steven Watanabe

Steven Watanabe wrote:
> Not to mention that 800 TB is probably more than will fit on your hard drive.
If the 100 trillion values for the matrix are stored in a text file, that file will not be a whole lot smaller than the 800 TB required to hold the matrix in memory, so it seems likely that the matrix is sparse. uBLAS has tools for sparse matrices. I'm not familiar enough with it to know the maximum size, but the documentation is available at http://www.boost.org/doc/libs/1_37_0/libs/numeric/ublas/doc/index.htm

John

I inferred that from the size of the text file required to store 100
trillion values like "0.333" (600 TB, assuming single-byte characters
and single-character separation), and from the fact that he had earlier
asked a question about sparse matrices.
John
________________________________
From: boost-users-bounces@lists.boost.org [mailto:boost-users-bounces@lists.boost.org] On Behalf Of Jonathan Franklin
Sent: Tuesday, January 27, 2009 11:35 AM
To: boost-users@lists.boost.org
Subject: Re: [Boost-users] Maximum Size of Matrix in Boost
On Tue, Jan 27, 2009 at 10:29 AM, John Wilkinson
participants (5)
- Bob Wilkinson
- Gundala Viswanath
- John Wilkinson
- Jonathan Franklin
- Steven Watanabe