
Searched refs: use_flash_attention_cpp (1 result)

/aosp_15_r20/external/pytorch/aten/src/ATen/native/transformers/sdp_utils_cpp.cpp
  Line 35: bool use_flash_attention_cpp(sdp_params const& params, bool debug) {    (definition of use_flash_attention_cpp())
  Line 77: if (use_flash_attention_cpp(kernel_params, print_debug)) {    (call in select_sdp_backend_cpp())
  Line 99: use_flash_attention_cpp(kernel_params, print_debug);    (call in select_sdp_backend_cpp())