Hello, I wrote the eigenvalue code in GSoC 2015, but it had stability issues with one specific test matrix and hence was never merged. If someone wants to take the work forward, I can commit to getting it merge-ready well before the start of this year's GSoC, since I have some time between now and March to work on it.

Rajaditya Mukherjee
6th Year Graduate Student
Dept. of Computer Science and Engineering
The Ohio State University, Columbus, Ohio
Tel: +1-(614)-271-4439
Email: rajaditya.mukherjee@gmail.com, mukherjee.62@osu.edu

On Sat, Jan 20, 2018 at 4:30 PM, SHIKHAR SRIVASTAVA via Boost <boost@lists.boost.org> wrote:
Hi Artyom,
Thank you for the insight. Given those numbers, it certainly looks like implementing those operations directly in uBLAS won't do much good.
I will look into the BLAS/LAPACK backend for uBLAS and try to shape a proposal that can be completed within the GSoC time frame. Then again, there is the question: is there a mentor available for this project who can help refine the requirements?
Regards,
Shikhar Srivastava
On 21-Jan-2018 12:36 AM, "Artyom Beilis via Boost" wrote:

On Fri, Jan 19, 2018 at 8:37 AM, SHIKHAR SRIVASTAVA via Boost wrote:

Hi everyone,
I am a 4th-year undergraduate student pursuing a degree in Computer Science and Engineering. I have strong C++ programming experience from internships, personal projects, and programming events. I wish to be a part of GSoC 2018 under Boost and am particularly interested in the linear algebra library Boost.uBLAS.
The uBLAS library could be made more useful for machine learning applications, such as recommender systems, clustering, classification, and pattern recognition, by adding the operations those applications require. I propose to add advanced matrix operations to uBLAS, including:
1. Triangular factorisation (LU and Cholesky)
2. Orthogonal factorisation (QR and QL)
3. Operations to find singular value lists
4. Eigenvalue algorithms
5. Singular value decomposition (SVD)
6. Jordan decomposition
7. Schur decomposition
8. Hessenberg decomposition
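To give a flavour of what item 1 involves, here is a minimal Doolittle LU factorisation without pivoting in plain C++. The function name and layout are purely illustrative (not an existing uBLAS interface); a production version would at least add partial pivoting for numerical stability:

```cpp
#include <cstddef>
#include <vector>

// Minimal Doolittle LU factorization without pivoting (illustrative only).
// A is a dense n x n matrix in row-major order; on return it holds L strictly
// below the diagonal (unit diagonal implied) and U on and above the diagonal.
void lu_factorize_inplace(std::vector<double>& A, std::size_t n) {
    for (std::size_t k = 0; k < n; ++k) {
        for (std::size_t i = k + 1; i < n; ++i) {
            A[i * n + k] /= A[k * n + k];  // multiplier L(i,k)
            for (std::size_t j = k + 1; j < n; ++j)
                A[i * n + j] -= A[i * n + k] * A[k * n + j];  // eliminate row i
        }
    }
}
```

For example, factorising {{4, 3}, {6, 3}} leaves the multiplier 1.5 in the (1,0) slot and U = {{4, 3}, {0, -1.5}} in the rest of the matrix.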
Hello,
I'm sorry to disappoint you, but uBLAS is nowhere near a useful library for real-world machine learning applications, because it is exceptionally slow compared to the "real" BLAS libraries used for such applications, like OpenBLAS, ATLAS, or the proprietary MKL.
They all give you what you are talking about; they are very well tested and exceptionally fast.
I mean uBLAS is two to three orders of magnitude slower than OpenBLAS or ATLAS, even for small matrices:

- 8x8 GEMM: uBLAS is 50 times slower than OpenBLAS and 30 times slower than ATLAS.
- 128x128 GEMM: uBLAS is 600 times slower than OpenBLAS and 50 times slower than ATLAS.
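To see where that gap comes from, here is the textbook triple-loop GEMM in plain C++ — essentially what a naive matrix library ends up executing. Optimized BLAS implementations add cache blocking, hand-tuned SIMD kernels, and threading on top of this, which accounts for the one to two orders of magnitude. This sketch is mine, not uBLAS code:

```cpp
#include <cstddef>
#include <vector>

// Textbook GEMM: C = A * B for dense row-major n x n matrices.
// The i-k-j loop order streams through B row-by-row, which is already
// friendlier to the cache than the naive i-j-k order, yet still far from
// what a blocked, vectorized BLAS kernel achieves.
std::vector<double> naive_gemm(const std::vector<double>& A,
                               const std::vector<double>& B, std::size_t n) {
    std::vector<double> C(n * n, 0.0);
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t k = 0; k < n; ++k) {
            const double a = A[i * n + k];  // reuse A(i,k) across the j loop
            for (std::size_t j = 0; j < n; ++j)
                C[i * n + j] += a * B[k * n + j];
        }
    return C;
}
```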
So I don't think investing in implementations of algorithms that are already implemented in LAPACK libraries, with far better performance, would actually be helpful for real-world applications.
What you CAN do is provide a BLAS/LAPACK-based backend for uBLAS...
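One way such a backend could look, as a rough sketch: route the heavy kernels through CBLAS when a fast implementation is available at build time, and fall back to a portable loop otherwise. The `HAVE_CBLAS` macro and the wrapper function here are my own hypothetical names, not an existing uBLAS interface:

```cpp
#include <cstddef>

#ifdef HAVE_CBLAS
#include <cblas.h>  // provided by OpenBLAS, ATLAS, or MKL
#endif

// Hypothetical dispatch point: a real uBLAS backend would hook this into the
// expression machinery, but the idea is the same -- one call site, two
// implementations chosen at build time.
void backend_dgemm(std::size_t n, const double* A, const double* B, double* C) {
#ifdef HAVE_CBLAS
    // Delegate C = A * B to the optimized library.
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                (int)n, (int)n, (int)n,
                1.0, A, (int)n, B, (int)n, 0.0, C, (int)n);
#else
    // Portable fallback so the library still works without any BLAS installed.
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < n; ++j) {
            double s = 0.0;
            for (std::size_t k = 0; k < n; ++k)
                s += A[i * n + k] * B[k * n + j];
            C[i * n + j] = s;
        }
#endif
}
```

Users then opt in by defining `HAVE_CBLAS` and linking their preferred BLAS; the numerical results stay the same, only the speed changes.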
Regards, Artyom
_______________________________________________
Unsubscribe & other changes: http://lists.boost.org/mailman/listinfo.cgi/boost