CULA on 3GB Notebook Graphics Card

General CULA Dense (LAPACK & BLAS) support and troubleshooting. Use this forum if you are having a general problem or have encountered a bug.

CULA on 3GB Notebook Graphics Card

Postby scottg » Tue Nov 01, 2011 11:53 am

I am using CULA (Basic) to solve large systems of linear equations on a Dell XPS laptop with a 3 GB GeForce GT 555M graphics card. It works fine up to around a 1 GB matrix, but seems to fail above that. Does anyone know if there is either a hardware limitation (1 GB shared memory?) or a CULA Basic software limitation that might be causing this? Thanks!
scottg
 
Posts: 2
Joined: Tue Nov 01, 2011 11:47 am

Re: CULA on 3GB Notebook Graphics Card

Postby john » Tue Nov 01, 2011 1:17 pm

You can try allocating your matrix with the plain old cudaMalloc function to see how large an allocation will succeed. There is nothing in CULA that limits your allocations. CULA does need to allocate a few more MB on top of that for workspace, but that is small compared to your matrix size.
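
Something along these lines should do it; this is just a sketch using the standard CUDA runtime API (not CULA itself), probing progressively larger single allocations until one fails:

/* Allocation probe: try increasingly large single device allocations
   and report where cudaMalloc first fails. */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    for (size_t mb = 512; mb <= 3072; mb += 128) {
        void *d_ptr = NULL;
        cudaError_t err = cudaMalloc(&d_ptr, mb * 1024 * 1024);
        if (err != cudaSuccess) {
            printf("cudaMalloc failed at %zu MB: %s\n", mb, cudaGetErrorString(err));
            break;
        }
        printf("cudaMalloc succeeded for %zu MB\n", mb);
        cudaFree(d_ptr);
    }
    return 0;
}

The largest size that succeeds tells you how big a single contiguous allocation the card and driver will actually give you, independent of anything CULA does.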
john
Administrator
 
Posts: 587
Joined: Thu Jul 23, 2009 2:31 pm

Re: CULA on 3GB Notebook Graphics Card

Postby scottg » Tue Nov 01, 2011 3:46 pm

Hi John - thanks for the prompt reply. Currently the matrix is allocated and populated in other parts of the code and passed into the routine that calls the CULA solver. I'll write a small piece of sample code and see what I get. The routine I am using is CULA_SGETRF(N, N, A, N, IPVT), where A is a full N x N single-precision matrix; N ~ 16,000 works OK, but N ~ 20,000 does not.
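
For reference, a rough back-of-envelope check of the device footprint for the factor alone (just the N x N single-precision matrix; the pivot array is negligible):

/* Back-of-envelope device-memory footprint for an N x N
   single-precision matrix passed to an LU factorization. */
#include <stdio.h>

int main(void)
{
    const size_t sizes[] = { 16000, 20000 };
    for (int i = 0; i < 2; ++i) {
        size_t n = sizes[i];
        double gib = (double)n * n * sizeof(float) / (1024.0 * 1024.0 * 1024.0);
        printf("N = %zu -> %.2f GiB for the matrix alone\n", n, gib);
    }
    return 0;
}

That works out to roughly 0.95 GiB for N = 16,000 and about 1.5 GiB for N = 20,000, which lines up with the ~1 GB cutoff I am seeing, so the cudaMalloc probe should show whether a single ~1.5 GiB allocation is possible on this card.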

Thanks - Scott
scottg
 
Posts: 2
Joined: Tue Nov 01, 2011 11:47 am

