Searched defs:retain_graph (Results 1 – 11 of 11) sorted by relevance

/aosp_15_r20/external/pytorch/torch/csrc/autograd/autograd.cpp
  167: std::optional<bool> retain_graph,  (in backward())
  188: std::optional<bool> retain_graph,  (in grad())
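
These two definitions are the C++ entry points behind the Python torch.autograd.backward and torch.autograd.grad APIs, where retain_graph is optional and, when left unset, defaults to the value of create_graph. A minimal sketch of what the flag controls (tensor names are illustrative, not from the source):

    import torch

    x = torch.tensor(2.0, requires_grad=True)
    y = x * x

    # Keep the graph alive so it can be traversed a second time.
    (g1,) = torch.autograd.grad(y, x, retain_graph=True)

    # Without retain_graph=True above, this second pass over the same
    # graph would raise a RuntimeError because the graph is freed.
    (g2,) = torch.autograd.grad(y, x)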

/aosp_15_r20/external/pytorch/torch/_functorch/eager_transforms.py
  166: outputs, inputs, grad_outputs=None, retain_graph=False, create_graph=True  (argument)
  425: def wrapper(cotangents, retain_graph=True, create_graph=None):  (argument)
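
The wrapper at line 425 looks like the pullback returned by functorch's vjp transform; the retain_graph=True default visible in its signature is what lets the same pullback be applied to multiple cotangents. A usage sketch via the public torch.func API (the function f and its inputs are illustrative):

    import torch
    from torch.func import vjp

    def f(x):
        return x.sin()

    x = torch.randn(3)
    out, pullback = vjp(f, x)

    # Each call computes a vector-Jacobian product; the retain_graph=True
    # default in the wrapper allows calling the pullback repeatedly.
    (grad_x,) = pullback(torch.ones_like(out))
    (grad_x2,) = pullback(torch.full_like(out, 2.0))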

/aosp_15_r20/external/pytorch/torch/csrc/distributed/autograd/autograd.cpp
  14: bool retain_graph) {  (in backward())
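
This is the backend of torch.distributed.autograd.backward, which, unlike the local variant, takes an explicit autograd context id alongside retain_graph. A minimal sketch, assuming RPC has already been initialized on the worker (e.g. via torch.distributed.rpc.init_rpc):

    import torch
    import torch.distributed.autograd as dist_autograd

    # Assumes torch.distributed.rpc.init_rpc(...) ran earlier on this worker.
    with dist_autograd.context() as context_id:
        t = torch.ones(2, 2, requires_grad=True)
        loss = (t * 2).sum()
        # Gradients are accumulated into the context, not into t.grad.
        dist_autograd.backward(context_id, [loss], retain_graph=False)
        grads = dist_autograd.get_gradients(context_id)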

/aosp_15_r20/external/pytorch/torch/csrc/jit/runtime/register_prim_ops_fulljit.cpp
  232: auto retain_graph = pop(stack).toOptional<bool>();  (in __anon1bf66a481902(), local)
  271: auto retain_graph = pop(stack).toOptional<bool>();  (in __anon1bf66a481a02(), local)

/aosp_15_r20/external/pytorch/torch/csrc/jit/runtime/register_distributed_ops.cpp
  237: bool retain_graph = pop(stack).toBool();  (in __anon84498b7a0802(), local)

/aosp_15_r20/external/pytorch/torch/csrc/jit/runtime/register_prim_ops.cpp
  1132: auto retain_graph = pop(stack).toOptional<bool>();  (in __anonbfe5918f4602(), local)
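
These registrations expose backward() to the TorchScript interpreter: each op pops retain_graph off the interpreter stack (as Optional[bool] in the prim-op registrations, as a plain bool in the distributed op). A sketch of a scripted function that exercises this path:

    import torch

    @torch.jit.script
    def run_twice(x: torch.Tensor):
        y = (x * x).sum()
        # Inside TorchScript this dispatches to the registered op, which
        # reads retain_graph from the interpreter stack.
        y.backward(retain_graph=True)
        y.backward()

    x = torch.randn(3, requires_grad=True)
    run_twice(x)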

/aosp_15_r20/external/pytorch/torch/autograd/functional.py
  173: retain_graph=None,  (argument)

/aosp_15_r20/external/pytorch/torch/csrc/distributed/rpc/init.cpp
  493: bool retain_graph) {  (in rpc_init())

/aosp_15_r20/external/pytorch/test/test_decomp.py
  103: outputs, inputs, grad_outputs=None, retain_graph=False, create_graph=True  (argument)

/aosp_15_r20/external/pytorch/torch/_tensor.py
  526: self, gradient=None, retain_graph=None, create_graph=False, inputs=None  (argument)
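
Line 526 is the public Tensor.backward signature; retain_graph=None means "match create_graph". The classic reason to pass it explicitly is a second backward pass over the same graph:

    import torch

    x = torch.tensor(3.0, requires_grad=True)
    y = x ** 2

    y.backward(retain_graph=True)  # keep the graph for a second pass
    y.backward()                   # would raise RuntimeError without the flag
    print(x.grad)                  # gradients accumulate: tensor(12.)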

/aosp_15_r20/external/pytorch/test/functorch/test_ops.py
  74: outputs, inputs, grad_outputs=None, retain_graph=False, create_graph=True  (argument)