wangzeyangyi/DETR
This repository has not declared an open-source license file (LICENSE); check the project description and its upstream dependencies before use.
config.yaml 1.03 KB
wangzeyangyi committed on 2023-02-21 20:47 · debug overflow
# Train params
lr: 0.0001
lr_backbone: 0.00001
lr_drop: 200
weight_decay: 0.0001
clip_max_norm: 0.1
batch_size: 4
start_epoch: 0
epochs: 300
save_num_ckpt: 10
resume: ""
pretrained: "/home/w30005666/ckpt/ms_resnet_50.ckpt"
seed: 42
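For reference, in DETR-style training loops `lr_drop` conventionally marks the epoch at which the learning rate is divided by 10 (a StepLR schedule). A minimal sketch assuming that convention, with the values from this config as defaults (`lr_at_epoch` is a hypothetical helper, not part of the repo):

```python
def lr_at_epoch(epoch: int, lr: float = 0.0001, lr_drop: int = 200,
                gamma: float = 0.1) -> float:
    """Step decay: scale the base lr by gamma once the drop epoch is reached."""
    return lr * gamma if epoch >= lr_drop else lr
```

With these defaults the rate stays at 1e-4 through epoch 199 and becomes 1e-5 from epoch 200 until the configured 300 epochs end.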
# Context
device_id: 0
device_target: "Ascend"
context_mode: "pynative"
coco_path: "/opt/npu/data/coco2017"
output_dir: "./outputs"
mindrecord_dir: "/home/w30005666/coco_mindrecord/"
# Loss
aux_loss: True
# Dataset parameters
dataset_file: "coco"
train_data_type: "train2017"
val_data_type: "val2017"
num_classes: 91
num_parallel_workers: 8
python_multiprocessing: False
max_size: 960
flip_ratio: 0.5
# Backbone
backbone: "resnet50"
# Transformer
enc_layers: 6
dec_layers: 6
dim_feedforward: 2048
hidden_dim: 256
dropout: 0.1
nheads: 8
num_queries: 100
pre_norm: False
# Matcher
set_cost_class: 1
set_cost_bbox: 5
set_cost_giou: 2
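These three `set_cost_*` weights combine the classification, L1 box, and generalized-IoU terms into a single matching cost per (query, target) pair for DETR's bipartite matcher. A dependency-free, brute-force illustration (the reference implementation uses `scipy.optimize.linear_sum_assignment`; the `match` function and its cost matrices here are hypothetical):

```python
from itertools import permutations

SET_COST_CLASS, SET_COST_BBOX, SET_COST_GIOU = 1, 5, 2  # values from this config

def match(cost_class, cost_bbox, cost_giou):
    """Return the query->target assignment minimizing the weighted total cost.
    Each cost_* argument is an NxN nested list (N queries, N targets)."""
    n = len(cost_class)
    total = [[SET_COST_CLASS * cost_class[i][j]
              + SET_COST_BBOX * cost_bbox[i][j]
              + SET_COST_GIOU * cost_giou[i][j]
              for j in range(n)] for i in range(n)]
    # Brute force over all permutations: fine for a sketch, O(n!) in general.
    best = min(permutations(range(n)),
               key=lambda p: sum(total[i][p[i]] for i in range(n)))
    return list(best)
```

The box cost carries the largest weight (5), so localization dominates the assignment unless class probabilities differ sharply.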
# Loss coefficients
dice_loss_coef: 1
bbox_loss_coef: 5
giou_loss_coef: 2
eos_coef: 0.1
# Distributed switch
distributed: 1
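A flat `key: value` file like this one is typically read with `yaml.safe_load`; the dependency-free sketch below (with a hypothetical `load_flat_config` helper) shows the same idea and why duplicate keys are dangerous: a plain dict silently keeps only the last occurrence, so an empty `coco_path: ""` later in the file would overwrite the real path set earlier.

```python
import ast
from types import SimpleNamespace

def load_flat_config(text: str) -> SimpleNamespace:
    """Parse a flat `key: value` YAML fragment into attribute access.
    A real project would call yaml.safe_load; this sketch avoids the dependency."""
    cfg = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        key, _, raw = line.partition(":")
        raw = raw.strip()
        try:
            # Numbers, booleans, and quoted strings parse directly.
            cfg[key.strip()] = ast.literal_eval(raw)
        except (ValueError, SyntaxError):
            cfg[key.strip()] = raw  # bare strings such as train2017
    return SimpleNamespace(**cfg)

cfg = load_flat_config('lr: 0.0001\nbackbone: "resnet50"\naux_loss: True')
```

Attribute access (`cfg.lr`, `cfg.backbone`) then mirrors the argparse-style namespace the original DETR training script expects.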
Clone: https://gitee.com/tomzwang11/detr.git ([email protected]:tomzwang11/detr.git)