Visible to Intel only — GUID: GUID-9879A339-33AF-46C7-A1C3-C4CB6B49DF4B
getrf_batch_scratchpad_size (Group Version)
Computes the size of the scratchpad memory required for the getrf_batch (Group Version) function. This routine belongs to the oneapi::mkl::lapack namespace.
Description
Computes the number of elements of type T that the scratchpad memory passed to the getrf_batch (Group Version) function must be able to hold.
API
Syntax
namespace oneapi::mkl::lapack {
  template <typename T>
  std::int64_t getrf_batch_scratchpad_size(sycl::queue &queue, std::int64_t *m, std::int64_t *n, std::int64_t *lda, std::int64_t group_count, std::int64_t *group_sizes)
}
This function supports the following precisions and devices:
T | Devices supported
---|---
float | CPU and GPU
double | CPU and GPU
std::complex<float> | CPU and GPU
std::complex<double> | CPU and GPU
Input Parameters
- queue
  Device queue where calculations will be performed.
- m
  Array of group_count parameters m_g specifying the number of rows in the matrices belonging to group g.
- n
  Array of group_count parameters n_g specifying the number of columns in the matrices belonging to group g.
- lda
  Array of group_count parameters lda_g specifying the leading dimension of the matrices belonging to group g.
- group_count
  Specifies the number of groups of parameters. Must be at least 0.
- group_sizes
  Array of group_count integers. The element with index g specifies the number of problems to solve for group g, so the total number of problems, batch_size, is the sum of all group sizes.
Exceptions
Exception | Description
---|---
mkl::lapack::exception | This exception is thrown when an incorrect argument value is supplied. You can determine the position of the incorrect argument by the info() method of the exception object.
Return Values
The number of elements of type T that the scratchpad memory passed to the getrf_batch (Group Version) function must be able to hold.
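A minimal usage sketch follows. It assumes a working oneMKL/SYCL toolchain and device; the group counts, matrix dimensions, and leading dimensions are illustrative, not prescribed by the API.

```cpp
#include <cstdint>
#include <vector>
#include <sycl/sycl.hpp>
#include <oneapi/mkl/lapack.hpp>

int main() {
    sycl::queue q{sycl::default_selector_v};

    // Two illustrative groups: 3 matrices of size 8x8, 2 matrices of size 16x4.
    std::int64_t group_count = 2;
    std::vector<std::int64_t> m{8, 16};
    std::vector<std::int64_t> n{8, 4};
    std::vector<std::int64_t> lda{8, 16};         // leading dimensions, lda_g >= m_g
    std::vector<std::int64_t> group_sizes{3, 2};  // batch_size = 3 + 2 = 5

    // Query the number of float elements the scratchpad must hold for
    // the subsequent getrf_batch (Group Version) call.
    std::int64_t scratchpad_size =
        oneapi::mkl::lapack::getrf_batch_scratchpad_size<float>(
            q, m.data(), n.data(), lda.data(), group_count,
            group_sizes.data());

    // A buffer of scratchpad_size floats (e.g. allocated with
    // sycl::malloc_device) would then be passed to getrf_batch.
    return scratchpad_size >= 0 ? 0 : 1;
}
```

The same query pattern applies to the other supported precisions by changing the template argument, e.g. getrf_batch_scratchpad_size<std::complex<double>>.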