Searched defs:can_use_flash_attention (Results 1 – 2 of 2) sorted by relevance

/aosp_15_r20/external/pytorch/torch/backends/cuda/__init__.py:357
    def can_use_flash_attention(params: SDPAParams, debug: bool = False) -> bool:
/aosp_15_r20/external/pytorch/aten/src/ATen/native/transformers/cuda/sdp_utils.cpp:553
    bool can_use_flash_attention(sdp_params const& params, bool debug) {
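
Both hits refer to the same check: the Python definition in torch/backends/cuda/__init__.py is the public wrapper, which dispatches to the C++ implementation in sdp_utils.cpp. Below is a minimal usage sketch (an illustration, not part of the search results). It assumes a CUDA build of PyTorch; note that the SDPAParams constructor argument order shown here (query, key, value, attn_mask, dropout, is_causal) has varied across PyTorch releases, with newer versions appending an enable_gqa flag.

    # Sketch: ask PyTorch whether FlashAttention can back a given SDPA call.
    # Assumes a CUDA build; SDPAParams' constructor differs across versions
    # (newer releases append an enable_gqa boolean).
    import torch
    import torch.nn.functional as F
    from torch.backends.cuda import SDPAParams, can_use_flash_attention

    q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)

    # attn_mask=None, dropout=0.0, is_causal=True
    params = SDPAParams(q, k, v, None, 0.0, True)

    # debug=True makes the check warn about the specific failing constraint.
    if can_use_flash_attention(params, debug=True):
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)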