torch.__future__
- torch.__future__.set_overwrite_module_params_on_conversion(value)[source]

  Sets whether to assign new tensors to the parameters instead of changing the existing parameters in-place when converting an nn.Module.

  When enabled, the following methods will assign new parameters to the module:

  - module.{device}() (e.g. nn.Module.cuda()) for moving a module between devices
  - module.{dtype}() (e.g. nn.Module.float()) for converting a module to a different dtype
  - nn.Module.to()
  - nn.Module.to_empty()

  - Parameters
    value (bool) – Whether to assign new tensors or not.
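A minimal sketch of the observable difference (assuming a recent PyTorch release where this flag is available): with the flag off, conversion mutates each parameter's data in place, so an old reference still points at the module's parameter; with it on, the module is given brand-new Parameter objects.

```python
import torch
import torch.nn as nn

# Default behavior: conversions such as .double() change parameter data
# in place, so the Parameter object identity is preserved.
torch.__future__.set_overwrite_module_params_on_conversion(False)
m = nn.Linear(2, 2)
p = m.weight
m.double()
print(p is m.weight)  # True: the same Parameter object was converted in place

# With overwriting enabled, conversion assigns new Parameter objects,
# so previously held references no longer alias the module's parameters.
torch.__future__.set_overwrite_module_params_on_conversion(True)
m2 = nn.Linear(2, 2)
p2 = m2.weight
m2.double()
print(p2 is m2.weight)  # False: the module now holds a new Parameter

# Restore the default so the global flag does not leak into other code.
torch.__future__.set_overwrite_module_params_on_conversion(False)
```

Because the flag is process-wide, it is worth restoring the previous value after use, as done above.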
- torch.__future__.get_overwrite_module_params_on_conversion()[source]

  Returns whether to assign new tensors to the parameters instead of changing the existing parameters in-place when converting a torch.nn.Module. Defaults to False.

  See set_overwrite_module_params_on_conversion() for more information.

  - Return type
    bool
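As a small sketch, the getter simply reads the same process-wide flag that the setter writes:

```python
import torch

# Defaults to False.
print(torch.__future__.get_overwrite_module_params_on_conversion())  # False

# The getter reflects whatever the setter last wrote.
torch.__future__.set_overwrite_module_params_on_conversion(True)
print(torch.__future__.get_overwrite_module_params_on_conversion())  # True

# Restore the default.
torch.__future__.set_overwrite_module_params_on_conversion(False)
```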
- torch.__future__.set_swap_module_params_on_conversion(value)[source]

  Sets whether to use swap_tensors() instead of setting .data to change the existing parameters in-place when converting an nn.Module, and instead of param.copy_(state_dict[key]) when loading a state dict into an nn.Module.

  Note

  This function takes precedence over get_overwrite_module_params_on_conversion().

  When enabled, the following methods will swap the existing parameters in-place:

  - module.{device}() (e.g. nn.Module.cuda()) for moving a module between devices
  - module.{dtype}() (e.g. nn.Module.float()) for converting a module to a different dtype
  - nn.Module.to()
  - nn.Module.to_empty()
  - nn.Module.load_state_dict()

  The semantics for load_state_dict() when this is set are as follows:

  - For each parameter/buffer, its corresponding state_dict['key'] is transformed via module_load() (i.e. res = param.module_load(state_dict['key'])).
  - If necessary, res will be wrapped in a Parameter.
  - The parameter/buffer in the module will be swapped via swap_tensors() with res.

  - Parameters
    value (bool) – Whether to use swap_tensors() or not.
- torch.__future__.get_swap_module_params_on_conversion()[source]

  Returns whether to use swap_tensors() instead of setting .data to change the existing parameters in-place when converting an nn.Module. Defaults to False.

  See set_swap_module_params_on_conversion() for more information.

  - Return type
    bool