# flake8: noqa: F401
r"""Quantized Modules.

This file is in the process of migration to `torch/ao/nn/quantized`, and
is kept here for compatibility while the migration process is ongoing.
If you are adding a new entry/functionality, please add it to the
appropriate file under `torch/ao/nn/quantized/modules`, while adding
an import statement here.
"""

from torch.ao.nn.quantized.modules.activation import (
    ELU,
    Hardswish,
    LeakyReLU,
    MultiheadAttention,
    PReLU,
    ReLU6,
    Sigmoid,
    Softmax,
)