scale_layer.py 1.22 KB
Xiaohan Ding committed on 2022-09-23 17:32: refactor and check equivalency
# --------------------------------------------------------
# Re-parameterizing Your Optimizers rather than Architectures (https://arxiv.org/abs/2205.15242)
# Github source: https://github.com/DingXiaoH/RepOptimizers
# Licensed under The MIT License [see LICENSE for details]
# The training script is based on the code of Swin Transformer (https://github.com/microsoft/Swin-Transformer)
# --------------------------------------------------------
import torch
from torch.nn.parameter import Parameter
import torch.nn.init as init


class ScaleLayer(torch.nn.Module):
    """Per-channel learnable scaling (with optional per-channel bias) for NCHW feature maps."""

    def __init__(self, num_features, use_bias=False, scale_init=1.0):
        super(ScaleLayer, self).__init__()
        self.weight = Parameter(torch.Tensor(num_features))
        init.constant_(self.weight, scale_init)
        self.num_features = num_features
        if use_bias:
            self.bias = Parameter(torch.Tensor(num_features))
            init.zeros_(self.bias)
        else:
            self.bias = None

    def forward(self, inputs):
        # Broadcast the per-channel weight (and bias, if present) over the batch and spatial dims.
        if self.bias is None:
            return inputs * self.weight.view(1, self.num_features, 1, 1)
        else:
            return inputs * self.weight.view(1, self.num_features, 1, 1) + self.bias.view(1, self.num_features, 1, 1)
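A minimal usage sketch (not part of the repository file): ScaleLayer multiplies each channel of an NCHW tensor by its learnable scale and, when use_bias=True, adds a per-channel bias. The shapes and values below are illustrative assumptions.

# Usage sketch, assuming the ScaleLayer class defined above is importable in scope.
import torch

layer = ScaleLayer(num_features=8, use_bias=True, scale_init=0.5)
x = torch.randn(2, 8, 16, 16)   # (batch, channels, height, width)
y = layer(x)
# At initialization the weight is 0.5 for every channel and the bias is zero,
# so the output is simply the input scaled by 0.5.
assert torch.allclose(y, x * 0.5)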