slurm-137580.out
1471 lines (1422 loc) · 122 KB
==========================================
SLURM_JOB_ID = 137580
SLURM_NODELIST = gnode57
SLURM_JOB_GPUS = 0,1,2,3
==========================================
Command Line Args: Namespace(config_file='', dist_url='auto', eval_only=False, machine_rank=0, num_gpus=4, num_machines=1, opts=[], resume=False)
[05/06 08:09:34 detectron2]: Rank of current process: 0. World size: 4
[05/06 08:09:34 detectron2]: Environment info:
------------------------ ---------------------------------------------------------------------------------------------
sys.platform linux
Python 3.7.6 (default, Jan 8 2020, 19:59:22) [GCC 7.3.0]
numpy 1.18.1
detectron2 0.1.1 @/home/myfolder/detectron2_set2/detectron2
detectron2 compiler GCC 5.5
detectron2 CUDA compiler 10.2
detectron2 arch flags sm_75
DETECTRON2_ENV_MODULE <not set>
PyTorch 1.4.0+cu100 @/home/myfolder/miniconda3/envs/det_trial/lib/python3.7/site-packages/torch
PyTorch debug build False
CUDA available True
GPU 0,1,2,3 GeForce RTX 2080 Ti
CUDA_HOME /usr/local/cuda
NVCC Cuda compilation tools, release 10.2, V10.2.89
Pillow 6.2.2
torchvision 0.5.0+cu100 @/home/myfolder/miniconda3/envs/det_trial/lib/python3.7/site-packages/torchvision
torchvision arch flags sm_35, sm_50, sm_60, sm_70, sm_75
fvcore 0.1.dev200114
cv2 4.1.2
------------------------ ---------------------------------------------------------------------------------------------
PyTorch built with:
- GCC 7.3
- Intel(R) Math Kernel Library Version 2019.0.4 Product Build 20190411 for Intel(R) 64 architecture applications
- Intel(R) MKL-DNN v0.21.1 (Git Hash 7d2fd500bc78936d1d648ca713b901012f470dbc)
- OpenMP 201511 (a.k.a. OpenMP 4.5)
- NNPACK is enabled
- CUDA Runtime 10.0
- NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_37,code=compute_37
- CuDNN 7.6.3
- Magma 2.5.1
- Build settings: BLAS=MKL, BUILD_NAMEDTENSOR=OFF, BUILD_TYPE=Release, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -fopenmp -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -O2 -fPIC -Wno-narrowing -Wall -Wextra -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Wno-stringop-overflow, DISABLE_NUMA=1, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, USE_CUDA=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_STATIC_DISPATCH=OFF,
[05/06 08:09:34 detectron2]: Command line arguments: Namespace(config_file='', dist_url='auto', eval_only=False, machine_rank=0, num_gpus=4, num_machines=1, opts=[], resume=False)
[05/06 08:09:34 detectron2]: Running with full config:
CUDNN_BENCHMARK: False
DATALOADER:
ASPECT_RATIO_GROUPING: True
FILTER_EMPTY_ANNOTATIONS: True
NUM_WORKERS: 2
REPEAT_THRESHOLD: 0.0
SAMPLER_TRAIN: TrainingSampler
DATASETS:
PRECOMPUTED_PROPOSAL_TOPK_TEST: 1000
PRECOMPUTED_PROPOSAL_TOPK_TRAIN: 2000
PROPOSAL_FILES_TEST: ()
PROPOSAL_FILES_TRAIN: ()
TEST: ('cityscapes_fine_inst_seg_val',)
TRAIN: ('cityscapes_fine_inst_seg_train',)
GLOBAL:
HACK: 1.0
INPUT:
CROP:
ENABLED: False
SIZE: [0.9, 0.9]
TYPE: relative_range
FORMAT: BGR
MASK_FORMAT: polygon
MAX_SIZE_TEST: 2048
MAX_SIZE_TRAIN: 2048
MIN_SIZE_TEST: 1024
MIN_SIZE_TRAIN: (800, 832, 864, 896, 928, 960, 992, 1024)
MIN_SIZE_TRAIN_SAMPLING: choice
MODEL:
ANCHOR_GENERATOR:
ANGLES: [[-90, 0, 90]]
ASPECT_RATIOS: [[0.5, 1.0, 2.0]]
NAME: DefaultAnchorGenerator
OFFSET: 0.0
SIZES: [[32], [64], [128], [256], [512]]
BACKBONE:
FREEZE_AT: 2
NAME: build_resnet_fpn_backbone
DEVICE: cuda
FPN:
FUSE_TYPE: sum
IN_FEATURES: ['res2', 'res3', 'res4', 'res5']
NORM:
OUT_CHANNELS: 256
KEYPOINT_ON: False
LOAD_PROPOSALS: False
MASK_ON: True
META_ARCHITECTURE: GeneralizedRCNN
PANOPTIC_FPN:
COMBINE:
ENABLED: True
INSTANCES_CONFIDENCE_THRESH: 0.5
OVERLAP_THRESH: 0.5
STUFF_AREA_LIMIT: 4096
INSTANCE_LOSS_WEIGHT: 1.0
PIXEL_MEAN: [103.53, 116.28, 123.675]
PIXEL_STD: [1.0, 1.0, 1.0]
PROPOSAL_GENERATOR:
MIN_SIZE: 0
NAME: RPN
RESNETS:
DEFORM_MODULATED: False
DEFORM_NUM_GROUPS: 1
DEFORM_ON_PER_STAGE: [False, False, False, False]
DEPTH: 50
NORM: FrozenBN
NUM_GROUPS: 1
OUT_FEATURES: ['res2', 'res3', 'res4', 'res5']
RES2_OUT_CHANNELS: 256
RES5_DILATION: 1
STEM_OUT_CHANNELS: 64
STRIDE_IN_1X1: True
WIDTH_PER_GROUP: 64
RETINANET:
BBOX_REG_WEIGHTS: (1.0, 1.0, 1.0, 1.0)
FOCAL_LOSS_ALPHA: 0.25
FOCAL_LOSS_GAMMA: 2.0
IN_FEATURES: ['p3', 'p4', 'p5', 'p6', 'p7']
IOU_LABELS: [0, -1, 1]
IOU_THRESHOLDS: [0.4, 0.5]
NMS_THRESH_TEST: 0.5
NUM_CLASSES: 80
NUM_CONVS: 4
PRIOR_PROB: 0.01
SCORE_THRESH_TEST: 0.05
SMOOTH_L1_LOSS_BETA: 0.1
TOPK_CANDIDATES_TEST: 1000
ROI_BOX_CASCADE_HEAD:
BBOX_REG_WEIGHTS: ((10.0, 10.0, 5.0, 5.0), (20.0, 20.0, 10.0, 10.0), (30.0, 30.0, 15.0, 15.0))
IOUS: (0.5, 0.6, 0.7)
ROI_BOX_HEAD:
BBOX_REG_WEIGHTS: (10.0, 10.0, 5.0, 5.0)
CLS_AGNOSTIC_BBOX_REG: True
CONV_DIM: 256
FC_DIM: 1024
NAME: FastRCNNConvFCHead
NORM:
NUM_CONV: 0
NUM_FC: 2
POOLER_RESOLUTION: 7
POOLER_SAMPLING_RATIO: 2
POOLER_TYPE: ROIAlignV2
SMOOTH_L1_BETA: 0.0
TRAIN_ON_PRED_BOXES: False
ROI_HEADS:
BATCH_SIZE_PER_IMAGE: 128
IN_FEATURES: ['p2', 'p3', 'p4', 'p5']
IOU_LABELS: [0, 1]
IOU_THRESHOLDS: [0.5]
NAME: CascadeROIHeads
NMS_THRESH_TEST: 0.5
NUM_CLASSES: 8
POSITIVE_FRACTION: 0.25
PROPOSAL_APPEND_GT: True
SCORE_THRESH_TEST: 0.05
ROI_KEYPOINT_HEAD:
CONV_DIMS: (512, 512, 512, 512, 512, 512, 512, 512)
LOSS_WEIGHT: 1.0
MIN_KEYPOINTS_PER_IMAGE: 1
NAME: KRCNNConvDeconvUpsampleHead
NORMALIZE_LOSS_BY_VISIBLE_KEYPOINTS: True
NUM_KEYPOINTS: 17
POOLER_RESOLUTION: 14
POOLER_SAMPLING_RATIO: 0
POOLER_TYPE: ROIAlignV2
ROI_MASK_HEAD:
CLS_AGNOSTIC_MASK: False
CONV_DIM: 256
NAME: MaskRCNNConvUpsampleHead
NORM:
NUM_CONV: 4
POOLER_RESOLUTION: 14
POOLER_SAMPLING_RATIO: 2
POOLER_TYPE: ROIAlignV2
RPN:
BATCH_SIZE_PER_IMAGE: 256
BBOX_REG_WEIGHTS: (1.0, 1.0, 1.0, 1.0)
BOUNDARY_THRESH: 1000
HEAD_NAME: StandardRPNHead
IN_FEATURES: ['p2', 'p3', 'p4', 'p5', 'p6']
IOU_LABELS: [0, -1, 1]
IOU_THRESHOLDS: [0.3, 0.7]
LOSS_WEIGHT: 1.0
NMS_THRESH: 0.7
POSITIVE_FRACTION: 0.5
POST_NMS_TOPK_TEST: 2000
POST_NMS_TOPK_TRAIN: 2000
PRE_NMS_TOPK_TEST: 2000
PRE_NMS_TOPK_TRAIN: 2000
SMOOTH_L1_BETA: 0.0
SEM_SEG_HEAD:
COMMON_STRIDE: 4
CONVS_DIM: 128
IGNORE_VALUE: 255
IN_FEATURES: ['p2', 'p3', 'p4', 'p5']
LOSS_WEIGHT: 1.0
NAME: SemSegFPNHead
NORM: GN
NUM_CLASSES: 54
WEIGHTS: /ssd_scratch/cvit/myfolder/cityscapes/CMRCNN_model_surgery_coco_to_cityscapes.pth
OUTPUT_DIR: /ssd_scratch/cvit/myfolder/cityscapes/models/
SEED: -1
SOLVER:
BASE_LR: 0.001
BASE_MOMENTUM: 0.8
BIAS_LR_FACTOR: 1.0
CHECKPOINT_PERIOD: 2470
CLIP_GRADIENTS:
CLIP_TYPE: value
CLIP_VALUE: 1.0
ENABLED: False
NORM_TYPE: 2.0
CYCLE_MOMENTUM_SWITCH: True
GAMMA: 0.75
IMS_PER_BATCH: 12
LR_SCHEDULER_NAME: WarmupMultiStepLR
MAX_ITER: 29640
MAX_LR: 0.01
MAX_MOMENTUM: 0.9
MIN_LR: 0.001
MOMENTUM: 0.9
NESTEROV: False
SCALE_MODE: cycle
STEPS: (24700,)
STEP_SIZE_UP: 2000
WARMUP_FACTOR: 0.001
WARMUP_ITERS: 1000
WARMUP_METHOD: linear
WEIGHT_DECAY: 0.0001
WEIGHT_DECAY_BIAS: 0.0001
WEIGHT_DECAY_NORM: 0.0
TEST:
AUG:
ENABLED: False
FLIP: True
MAX_SIZE: 4000
MIN_SIZES: (400, 500, 600, 700, 800, 900, 1000, 1100, 1200)
DETECTIONS_PER_IMAGE: 100
EVAL_PERIOD: 2470
EXPECTED_RESULTS: []
KEYPOINT_OKS_SIGMAS: []
PRECISE_BN:
ENABLED: False
NUM_ITER: 200
VERSION: 2
VIS_PERIOD: 0
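As a reading aid for the SOLVER section above: the lr values printed later in this log (e.g. 0.000247 at iter 247, 0.001000 after iter 1235) follow from the WarmupMultiStepLR settings. The sketch below is an approximation built only from the config values shown here (BASE_LR, WARMUP_FACTOR, WARMUP_ITERS, GAMMA, STEPS), not detectron2's exact scheduler code.

```python
# Approximate the WarmupMultiStepLR schedule implied by the SOLVER config:
# linear warmup from BASE_LR * WARMUP_FACTOR to BASE_LR over WARMUP_ITERS,
# then multiply by GAMMA at each milestone in STEPS.
BASE_LR = 0.001
WARMUP_FACTOR = 0.001
WARMUP_ITERS = 1000
GAMMA = 0.75
STEPS = (24700,)

def lr_at(it):
    if it < WARMUP_ITERS:
        alpha = it / WARMUP_ITERS
        warmup = WARMUP_FACTOR * (1 - alpha) + alpha
        return BASE_LR * warmup
    decays = sum(1 for step in STEPS if it >= step)
    return BASE_LR * GAMMA ** decays

# lr_at(247) is approximately 0.000247 and lr_at(494) approximately 0.000494,
# in line with the training log; past iteration 24700 the LR drops to 0.00075.
```

This reproduces the warmup ramp visible in the first ~1000 iterations of the events log and the single GAMMA decay scheduled at iteration 24700.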
[05/06 08:09:34 detectron2]: Full config saved to /ssd_scratch/cvit/myfolder/cityscapes/models/config.yaml
[05/06 08:09:34 d2.utils.env]: Using a generated random seed 34955930
[05/06 08:09:35 detectron2]: Model:
GeneralizedRCNN(
(backbone): FPN(
(fpn_lateral2): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1))
(fpn_output2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(fpn_lateral3): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1))
(fpn_output3): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(fpn_lateral4): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1))
(fpn_output4): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(fpn_lateral5): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1))
(fpn_output5): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(top_block): LastLevelMaxPool()
(bottom_up): ResNet(
(stem): BasicStem(
(conv1): Conv2d(
3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
)
(res2): Sequential(
(0): BottleneckBlock(
(shortcut): Conv2d(
64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv1): Conv2d(
64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
(conv2): Conv2d(
64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
(conv3): Conv2d(
64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
)
(1): BottleneckBlock(
(conv1): Conv2d(
256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
(conv2): Conv2d(
64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
(conv3): Conv2d(
64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
)
(2): BottleneckBlock(
(conv1): Conv2d(
256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
(conv2): Conv2d(
64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
)
(conv3): Conv2d(
64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
)
)
(res3): Sequential(
(0): BottleneckBlock(
(shortcut): Conv2d(
256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv1): Conv2d(
256, 128, kernel_size=(1, 1), stride=(2, 2), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv2): Conv2d(
128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv3): Conv2d(
128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
)
(1): BottleneckBlock(
(conv1): Conv2d(
512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv2): Conv2d(
128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv3): Conv2d(
128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
)
(2): BottleneckBlock(
(conv1): Conv2d(
512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv2): Conv2d(
128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv3): Conv2d(
128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
)
(3): BottleneckBlock(
(conv1): Conv2d(
512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv2): Conv2d(
128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
)
(conv3): Conv2d(
128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
)
)
(res4): Sequential(
(0): BottleneckBlock(
(shortcut): Conv2d(
512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
(conv1): Conv2d(
512, 256, kernel_size=(1, 1), stride=(2, 2), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv2): Conv2d(
256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv3): Conv2d(
256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
)
(1): BottleneckBlock(
(conv1): Conv2d(
1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv2): Conv2d(
256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv3): Conv2d(
256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
)
(2): BottleneckBlock(
(conv1): Conv2d(
1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv2): Conv2d(
256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv3): Conv2d(
256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
)
(3): BottleneckBlock(
(conv1): Conv2d(
1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv2): Conv2d(
256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv3): Conv2d(
256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
)
(4): BottleneckBlock(
(conv1): Conv2d(
1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv2): Conv2d(
256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv3): Conv2d(
256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
)
(5): BottleneckBlock(
(conv1): Conv2d(
1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv2): Conv2d(
256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
)
(conv3): Conv2d(
256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
)
)
)
(res5): Sequential(
(0): BottleneckBlock(
(shortcut): Conv2d(
1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False
(norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
)
(conv1): Conv2d(
1024, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv2): Conv2d(
512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv3): Conv2d(
512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
)
)
(1): BottleneckBlock(
(conv1): Conv2d(
2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv2): Conv2d(
512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv3): Conv2d(
512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
)
)
(2): BottleneckBlock(
(conv1): Conv2d(
2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv2): Conv2d(
512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
)
(conv3): Conv2d(
512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
(norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
)
)
)
)
)
(proposal_generator): RPN(
(anchor_generator): DefaultAnchorGenerator(
(cell_anchors): BufferList()
)
(rpn_head): StandardRPNHead(
(conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(objectness_logits): Conv2d(256, 3, kernel_size=(1, 1), stride=(1, 1))
(anchor_deltas): Conv2d(256, 12, kernel_size=(1, 1), stride=(1, 1))
)
)
(roi_heads): CascadeROIHeads(
(box_pooler): ROIPooler(
(level_poolers): ModuleList(
(0): ROIAlign(output_size=(7, 7), spatial_scale=0.25, sampling_ratio=2, aligned=True)
(1): ROIAlign(output_size=(7, 7), spatial_scale=0.125, sampling_ratio=2, aligned=True)
(2): ROIAlign(output_size=(7, 7), spatial_scale=0.0625, sampling_ratio=2, aligned=True)
(3): ROIAlign(output_size=(7, 7), spatial_scale=0.03125, sampling_ratio=2, aligned=True)
)
)
(box_head): ModuleList(
(0): FastRCNNConvFCHead(
(fc1): Linear(in_features=12544, out_features=1024, bias=True)
(fc2): Linear(in_features=1024, out_features=1024, bias=True)
)
(1): FastRCNNConvFCHead(
(fc1): Linear(in_features=12544, out_features=1024, bias=True)
(fc2): Linear(in_features=1024, out_features=1024, bias=True)
)
(2): FastRCNNConvFCHead(
(fc1): Linear(in_features=12544, out_features=1024, bias=True)
(fc2): Linear(in_features=1024, out_features=1024, bias=True)
)
)
(box_predictor): ModuleList(
(0): FastRCNNOutputLayers(
(cls_score): Linear(in_features=1024, out_features=9, bias=True)
(bbox_pred): Linear(in_features=1024, out_features=4, bias=True)
)
(1): FastRCNNOutputLayers(
(cls_score): Linear(in_features=1024, out_features=9, bias=True)
(bbox_pred): Linear(in_features=1024, out_features=4, bias=True)
)
(2): FastRCNNOutputLayers(
(cls_score): Linear(in_features=1024, out_features=9, bias=True)
(bbox_pred): Linear(in_features=1024, out_features=4, bias=True)
)
)
(mask_pooler): ROIPooler(
(level_poolers): ModuleList(
(0): ROIAlign(output_size=(14, 14), spatial_scale=0.25, sampling_ratio=2, aligned=True)
(1): ROIAlign(output_size=(14, 14), spatial_scale=0.125, sampling_ratio=2, aligned=True)
(2): ROIAlign(output_size=(14, 14), spatial_scale=0.0625, sampling_ratio=2, aligned=True)
(3): ROIAlign(output_size=(14, 14), spatial_scale=0.03125, sampling_ratio=2, aligned=True)
)
)
(mask_head): MaskRCNNConvUpsampleHead(
(mask_fcn1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(mask_fcn2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(mask_fcn3): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(mask_fcn4): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(deconv): ConvTranspose2d(256, 256, kernel_size=(2, 2), stride=(2, 2))
(predictor): Conv2d(256, 8, kernel_size=(1, 1), stride=(1, 1))
)
)
)
[05/06 08:09:36 fvcore.common.checkpoint]: Loading checkpoint from /ssd_scratch/cvit/myfolder/cityscapes/CMRCNN_model_surgery_coco_to_cityscapes.pth
[05/06 08:09:36 fvcore.common.checkpoint]: Some model parameters are not in the checkpoint:
pixel_mean
pixel_std
[05/06 08:09:38 d2.data.datasets.cityscapes]: 18 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/train'.
[05/06 08:09:38 d2.data.datasets.cityscapes]: Preprocessing cityscapes annotations ...
[05/06 08:16:11 d2.data.datasets.cityscapes]: Loaded 2975 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/train
[05/06 08:16:11 d2.data.build]: Removed 10 images with no usable annotations. 2965 images left.
[05/06 08:16:12 d2.data.build]: Distribution of instances among all 8 categories:
| category | #instances | category | #instances | category | #instances |
|:----------:|:-------------|:----------:|:-------------|:----------:|:-------------|
| person | 17910 | rider | 1778 | car | 26957 |
| truck | 484 | bus | 380 | train | 168 |
| motorcycle | 737 | bicycle | 3674 | | |
| total | 52088 | | | | |
[05/06 08:16:12 d2.data.common]: Serializing 2965 elements to byte tensors and concatenating them all ...
[05/06 08:16:12 d2.data.common]: Serialized dataset takes 67.18 MiB
[05/06 08:16:12 d2.data.detection_utils]: TransformGens used in training: [ResizeShortestEdge(short_edge_length=(800, 832, 864, 896, 928, 960, 992, 1024), max_size=2048, sample_style='choice'), RandomFlip()]
[05/06 08:16:12 d2.data.build]: Using training sampler TrainingSampler
[05/06 08:16:15 detectron2]: Starting training from iteration 0
[05/06 08:19:25 d2.utils.events]: eta: N/A iter: 247 total_loss: 3.623 loss_box_reg_stage0: 0.391 loss_box_reg_stage1: 0.699 loss_box_reg_stage2: 0.788 loss_cls_stage0: 0.395 loss_cls_stage1: 0.363 loss_cls_stage2: 0.435 loss_mask: 0.312 loss_rpn_cls: 0.052 loss_rpn_loc: 0.174 lr: 0.000247 max_mem: 7650M
[05/06 08:22:38 d2.utils.events]: eta: 6:19:07 iter: 494 total_loss: 3.565 loss_box_reg_stage0: 0.400 loss_box_reg_stage1: 0.773 loss_box_reg_stage2: 0.882 loss_cls_stage0: 0.374 loss_cls_stage1: 0.338 loss_cls_stage2: 0.307 loss_mask: 0.305 loss_rpn_cls: 0.041 loss_rpn_loc: 0.172 lr: 0.000494 max_mem: 7650M
[05/06 08:25:52 d2.utils.events]: eta: 6:17:30 iter: 741 total_loss: 3.214 loss_box_reg_stage0: 0.368 loss_box_reg_stage1: 0.733 loss_box_reg_stage2: 0.846 loss_cls_stage0: 0.293 loss_cls_stage1: 0.257 loss_cls_stage2: 0.246 loss_mask: 0.278 loss_rpn_cls: 0.034 loss_rpn_loc: 0.177 lr: 0.000740 max_mem: 7650M
[05/06 08:29:05 d2.utils.events]: eta: 6:13:11 iter: 988 total_loss: 3.272 loss_box_reg_stage0: 0.387 loss_box_reg_stage1: 0.755 loss_box_reg_stage2: 0.881 loss_cls_stage0: 0.287 loss_cls_stage1: 0.255 loss_cls_stage2: 0.238 loss_mask: 0.283 loss_rpn_cls: 0.032 loss_rpn_loc: 0.161 lr: 0.000987 max_mem: 7650M
[05/06 08:32:18 d2.utils.events]: eta: 6:10:02 iter: 1235 total_loss: 3.140 loss_box_reg_stage0: 0.357 loss_box_reg_stage1: 0.745 loss_box_reg_stage2: 0.890 loss_cls_stage0: 0.252 loss_cls_stage1: 0.223 loss_cls_stage2: 0.212 loss_mask: 0.275 loss_rpn_cls: 0.033 loss_rpn_loc: 0.152 lr: 0.001000 max_mem: 7650M
[05/06 08:35:31 d2.utils.events]: eta: 6:06:01 iter: 1482 total_loss: 3.118 loss_box_reg_stage0: 0.370 loss_box_reg_stage1: 0.727 loss_box_reg_stage2: 0.848 loss_cls_stage0: 0.249 loss_cls_stage1: 0.221 loss_cls_stage2: 0.205 loss_mask: 0.263 loss_rpn_cls: 0.028 loss_rpn_loc: 0.163 lr: 0.001000 max_mem: 7650M
[05/06 08:38:44 d2.utils.events]: eta: 6:04:19 iter: 1729 total_loss: 3.201 loss_box_reg_stage0: 0.353 loss_box_reg_stage1: 0.747 loss_box_reg_stage2: 0.871 loss_cls_stage0: 0.255 loss_cls_stage1: 0.221 loss_cls_stage2: 0.215 loss_mask: 0.270 loss_rpn_cls: 0.032 loss_rpn_loc: 0.165 lr: 0.001000 max_mem: 7650M
[05/06 08:41:57 d2.utils.events]: eta: 5:59:56 iter: 1976 total_loss: 3.089 loss_box_reg_stage0: 0.357 loss_box_reg_stage1: 0.715 loss_box_reg_stage2: 0.882 loss_cls_stage0: 0.241 loss_cls_stage1: 0.216 loss_cls_stage2: 0.208 loss_mask: 0.263 loss_rpn_cls: 0.031 loss_rpn_loc: 0.154 lr: 0.001000 max_mem: 7650M
[05/06 08:45:10 d2.utils.events]: eta: 5:56:38 iter: 2223 total_loss: 3.062 loss_box_reg_stage0: 0.365 loss_box_reg_stage1: 0.716 loss_box_reg_stage2: 0.869 loss_cls_stage0: 0.236 loss_cls_stage1: 0.211 loss_cls_stage2: 0.205 loss_mask: 0.274 loss_rpn_cls: 0.026 loss_rpn_loc: 0.160 lr: 0.001000 max_mem: 7650M
[05/06 08:48:22 fvcore.common.checkpoint]: Saving checkpoint to /ssd_scratch/cvit/myfolder/cityscapes/models/model_0002469.pth
[05/06 08:48:23 d2.data.datasets.cityscapes]: 3 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val'.
[05/06 08:48:23 d2.data.datasets.cityscapes]: Preprocessing cityscapes annotations ...
[05/06 08:49:53 d2.data.datasets.cityscapes]: Loaded 500 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val
[05/06 08:49:53 d2.data.build]: Distribution of instances among all 8 categories:
| category | #instances | category | #instances | category | #instances |
|:----------:|:-------------|:----------:|:-------------|:----------:|:-------------|
| person | 3399 | rider | 544 | car | 4656 |
| truck | 93 | bus | 98 | train | 23 |
| motorcycle | 149 | bicycle | 1169 | | |
| total | 10131 | | | | |
[05/06 08:49:53 d2.data.common]: Serializing 500 elements to byte tensors and concatenating them all ...
[05/06 08:49:53 d2.data.common]: Serialized dataset takes 12.85 MiB
[05/06 08:49:53 d2.evaluation.evaluator]: Start inference on 125 images
[05/06 08:49:55 d2.evaluation.cityscapes_evaluation]: Writing cityscapes results to temporary directory /tmp/cityscapes_eval_3lvqf7z7 ...
[05/06 08:50:10 d2.evaluation.evaluator]: Inference done 11/125. 0.1235 s / img. ETA=0:01:46
[05/06 08:50:15 d2.evaluation.evaluator]: Inference done 16/125. 0.1258 s / img. ETA=0:01:47
[05/06 08:50:20 d2.evaluation.evaluator]: Inference done 19/125. 0.1290 s / img. ETA=0:02:00
[05/06 08:50:25 d2.evaluation.evaluator]: Inference done 24/125. 0.1287 s / img. ETA=0:01:51
[05/06 08:50:30 d2.evaluation.evaluator]: Inference done 28/125. 0.1297 s / img. ETA=0:01:50
[05/06 08:50:36 d2.evaluation.evaluator]: Inference done 32/125. 0.1305 s / img. ETA=0:01:49
[05/06 08:50:43 d2.evaluation.evaluator]: Inference done 37/125. 0.1313 s / img. ETA=0:01:45
[05/06 08:50:48 d2.evaluation.evaluator]: Inference done 41/125. 0.1323 s / img. ETA=0:01:41
[05/06 08:50:53 d2.evaluation.evaluator]: Inference done 46/125. 0.1316 s / img. ETA=0:01:33
[05/06 08:50:58 d2.evaluation.evaluator]: Inference done 49/125. 0.1321 s / img. ETA=0:01:32
[05/06 08:51:03 d2.evaluation.evaluator]: Inference done 54/125. 0.1323 s / img. ETA=0:01:25
[05/06 08:51:09 d2.evaluation.evaluator]: Inference done 58/125. 0.1326 s / img. ETA=0:01:21
[05/06 08:51:15 d2.evaluation.evaluator]: Inference done 62/125. 0.1327 s / img. ETA=0:01:17
[05/06 08:51:20 d2.evaluation.evaluator]: Inference done 66/125. 0.1337 s / img. ETA=0:01:13
[05/06 08:51:26 d2.evaluation.evaluator]: Inference done 71/125. 0.1337 s / img. ETA=0:01:07
[05/06 08:51:33 d2.evaluation.evaluator]: Inference done 76/125. 0.1339 s / img. ETA=0:01:00
[05/06 08:51:40 d2.evaluation.evaluator]: Inference done 81/125. 0.1343 s / img. ETA=0:00:55
[05/06 08:51:45 d2.evaluation.evaluator]: Inference done 84/125. 0.1348 s / img. ETA=0:00:52
[05/06 08:51:51 d2.evaluation.evaluator]: Inference done 88/125. 0.1353 s / img. ETA=0:00:47
[05/06 08:51:57 d2.evaluation.evaluator]: Inference done 93/125. 0.1352 s / img. ETA=0:00:41
[05/06 08:52:02 d2.evaluation.evaluator]: Inference done 96/125. 0.1355 s / img. ETA=0:00:37
[05/06 08:52:08 d2.evaluation.evaluator]: Inference done 100/125. 0.1357 s / img. ETA=0:00:32
[05/06 08:52:14 d2.evaluation.evaluator]: Inference done 104/125. 0.1359 s / img. ETA=0:00:27
[05/06 08:52:20 d2.evaluation.evaluator]: Inference done 108/125. 0.1363 s / img. ETA=0:00:22
[05/06 08:52:25 d2.evaluation.evaluator]: Inference done 113/125. 0.1360 s / img. ETA=0:00:15
[05/06 08:52:31 d2.evaluation.evaluator]: Inference done 117/125. 0.1360 s / img. ETA=0:00:10
[05/06 08:52:38 d2.evaluation.evaluator]: Inference done 123/125. 0.1356 s / img. ETA=0:00:02
[05/06 08:52:39 d2.evaluation.evaluator]: Total inference time: 0:02:35.065537 (1.292213 s / img per device, on 4 devices)
[05/06 08:52:39 d2.evaluation.evaluator]: Total inference pure compute time: 0:00:16 (0.135517 s / img per device, on 4 devices)
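The two inference-time summary lines above are mutually consistent if one assumes, as detectron2's evaluator does, that a small warmup (here taken to be 5 images) is excluded when computing the per-image averages. The warmup count and the arithmetic below are an assumption-based sanity check, not output from the run:

```python
# Sanity-check the evaluator's reported per-image timings.
# Assumption: 5 warmup images are excluded from the averages.
total_images = 125
warmup = 5  # assumed warmup count excluded by the evaluator

total_time = 2 * 60 + 35.065537  # 0:02:35.065537 in seconds
per_img_total = total_time / (total_images - warmup)
pure_compute_time = 0.135517 * (total_images - warmup)

print(round(per_img_total, 6))      # 1.292213, matching the log line above
print(round(pure_compute_time))     # about 16 seconds, i.e. the "0:00:16"
```

With 120 non-warmup images, both the 1.292213 s/img wall figure and the ~16 s pure-compute total fall out directly.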
[32m[05/06 08:52:55 d2.evaluation.cityscapes_evaluation]: [0mEvaluating results under /tmp/cityscapes_eval_3lvqf7z7 ...
Creating ground truth instances from png files.
Processing 500 images...
All images processed
Matching 500 pairs of images...
All images processed
##################################################
what : AP AP_50%
##################################################
person : 0.373 0.705
rider : 0.276 0.666
car : 0.535 0.794
truck : 0.348 0.465
bus : 0.587 0.763
train : 0.348 0.617
motorcycle : 0.195 0.431
bicycle : 0.212 0.568
--------------------------------------------------
average : 0.359 0.626
[05/06 09:01:18 detectron2]: Evaluation results for cityscapes_fine_inst_seg_val in csv format:
[05/06 09:01:18 d2.evaluation.testing]: copypaste: Task: segm
[05/06 09:01:18 d2.evaluation.testing]: copypaste: AP,AP50
[05/06 09:01:18 d2.evaluation.testing]: copypaste: 35.9114,62.5980
[05/06 09:01:19 d2.utils.events]: eta: 1 day, 5:36:17 iter: 2470 total_loss: 3.085 loss_box_reg_stage0: 0.365 loss_box_reg_stage1: 0.730 loss_box_reg_stage2: 0.875 loss_cls_stage0: 0.237 loss_cls_stage1: 0.219 loss_cls_stage2: 0.213 loss_mask: 0.264 loss_rpn_cls: 0.034 loss_rpn_loc: 0.148 lr: 0.001000 max_mem: 7650M
[05/06 09:04:27 d2.utils.events]: eta: 5:42:25 iter: 2717 total_loss: 2.896 loss_box_reg_stage0: 0.342 loss_box_reg_stage1: 0.714 loss_box_reg_stage2: 0.839 loss_cls_stage0: 0.226 loss_cls_stage1: 0.200 loss_cls_stage2: 0.195 loss_mask: 0.256 loss_rpn_cls: 0.028 loss_rpn_loc: 0.140 lr: 0.001000 max_mem: 7650M
[05/06 09:07:41 d2.utils.events]: eta: 5:48:23 iter: 2964 total_loss: 3.024 loss_box_reg_stage0: 0.349 loss_box_reg_stage1: 0.707 loss_box_reg_stage2: 0.842 loss_cls_stage0: 0.236 loss_cls_stage1: 0.207 loss_cls_stage2: 0.199 loss_mask: 0.263 loss_rpn_cls: 0.030 loss_rpn_loc: 0.155 lr: 0.001000 max_mem: 7650M
[05/06 09:10:55 d2.utils.events]: eta: 5:46:26 iter: 3211 total_loss: 3.056 loss_box_reg_stage0: 0.362 loss_box_reg_stage1: 0.726 loss_box_reg_stage2: 0.890 loss_cls_stage0: 0.236 loss_cls_stage1: 0.205 loss_cls_stage2: 0.188 loss_mask: 0.264 loss_rpn_cls: 0.029 loss_rpn_loc: 0.141 lr: 0.001000 max_mem: 7650M
[05/06 09:14:08 d2.utils.events]: eta: 5:41:37 iter: 3458 total_loss: 3.089 loss_box_reg_stage0: 0.360 loss_box_reg_stage1: 0.738 loss_box_reg_stage2: 0.896 loss_cls_stage0: 0.229 loss_cls_stage1: 0.211 loss_cls_stage2: 0.202 loss_mask: 0.266 loss_rpn_cls: 0.026 loss_rpn_loc: 0.165 lr: 0.001000 max_mem: 7650M
[05/06 09:17:21 d2.utils.events]: eta: 5:38:00 iter: 3705 total_loss: 3.049 loss_box_reg_stage0: 0.355 loss_box_reg_stage1: 0.742 loss_box_reg_stage2: 0.873 loss_cls_stage0: 0.230 loss_cls_stage1: 0.191 loss_cls_stage2: 0.197 loss_mask: 0.268 loss_rpn_cls: 0.025 loss_rpn_loc: 0.140 lr: 0.001000 max_mem: 7650M
[05/06 09:20:35 d2.utils.events]: eta: 5:36:06 iter: 3952 total_loss: 3.080 loss_box_reg_stage0: 0.353 loss_box_reg_stage1: 0.732 loss_box_reg_stage2: 0.885 loss_cls_stage0: 0.231 loss_cls_stage1: 0.213 loss_cls_stage2: 0.200 loss_mask: 0.259 loss_rpn_cls: 0.027 loss_rpn_loc: 0.158 lr: 0.001000 max_mem: 7650M
[05/06 09:23:48 d2.utils.events]: eta: 5:30:05 iter: 4199 total_loss: 2.931 loss_box_reg_stage0: 0.349 loss_box_reg_stage1: 0.692 loss_box_reg_stage2: 0.823 loss_cls_stage0: 0.212 loss_cls_stage1: 0.200 loss_cls_stage2: 0.185 loss_mask: 0.251 loss_rpn_cls: 0.026 loss_rpn_loc: 0.143 lr: 0.001000 max_mem: 7650M
[05/06 09:27:00 d2.utils.events]: eta: 5:27:52 iter: 4446 total_loss: 2.916 loss_box_reg_stage0: 0.345 loss_box_reg_stage1: 0.694 loss_box_reg_stage2: 0.855 loss_cls_stage0: 0.222 loss_cls_stage1: 0.202 loss_cls_stage2: 0.188 loss_mask: 0.260 loss_rpn_cls: 0.027 loss_rpn_loc: 0.160 lr: 0.001000 max_mem: 7650M
[05/06 09:30:13 d2.utils.events]: eta: 5:24:53 iter: 4693 total_loss: 3.010 loss_box_reg_stage0: 0.350 loss_box_reg_stage1: 0.706 loss_box_reg_stage2: 0.863 loss_cls_stage0: 0.223 loss_cls_stage1: 0.192 loss_cls_stage2: 0.193 loss_mask: 0.261 loss_rpn_cls: 0.026 loss_rpn_loc: 0.166 lr: 0.001000 max_mem: 7650M
[05/06 09:33:25 fvcore.common.checkpoint]: Saving checkpoint to /ssd_scratch/cvit/myfolder/cityscapes/models/model_0004939.pth
[05/06 09:33:26 d2.data.datasets.cityscapes]: 3 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val'.
[05/06 09:33:26 d2.data.datasets.cityscapes]: Preprocessing cityscapes annotations ...
[05/06 09:34:56 d2.data.datasets.cityscapes]: Loaded 500 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val
[05/06 09:34:56 d2.data.common]: Serializing 500 elements to byte tensors and concatenating them all ...
[05/06 09:34:56 d2.data.common]: Serialized dataset takes 12.85 MiB
[05/06 09:34:57 d2.evaluation.evaluator]: Start inference on 125 images
[05/06 09:34:58 d2.evaluation.cityscapes_evaluation]: Writing cityscapes results to temporary directory /tmp/cityscapes_eval_or_0vqvc ...
[05/06 09:35:13 d2.evaluation.evaluator]: Inference done 11/125. 0.1236 s / img. ETA=0:01:42
[05/06 09:35:19 d2.evaluation.evaluator]: Inference done 16/125. 0.1268 s / img. ETA=0:01:48
[05/06 09:35:24 d2.evaluation.evaluator]: Inference done 19/125. 0.1309 s / img. ETA=0:02:01
[05/06 09:35:29 d2.evaluation.evaluator]: Inference done 24/125. 0.1325 s / img. ETA=0:01:54
[05/06 09:35:34 d2.evaluation.evaluator]: Inference done 28/125. 0.1322 s / img. ETA=0:01:53
[05/06 09:35:41 d2.evaluation.evaluator]: Inference done 32/125. 0.1332 s / img. ETA=0:01:53
[05/06 09:35:46 d2.evaluation.evaluator]: Inference done 36/125. 0.1329 s / img. ETA=0:01:49
[05/06 09:35:51 d2.evaluation.evaluator]: Inference done 40/125. 0.1338 s / img. ETA=0:01:45
[05/06 09:35:57 d2.evaluation.evaluator]: Inference done 44/125. 0.1340 s / img. ETA=0:01:41
[05/06 09:36:03 d2.evaluation.evaluator]: Inference done 49/125. 0.1346 s / img. ETA=0:01:36
[05/06 09:36:09 d2.evaluation.evaluator]: Inference done 54/125. 0.1344 s / img. ETA=0:01:28
[05/06 09:36:15 d2.evaluation.evaluator]: Inference done 58/125. 0.1344 s / img. ETA=0:01:24
[05/06 09:36:20 d2.evaluation.evaluator]: Inference done 62/125. 0.1351 s / img. ETA=0:01:19
[05/06 09:36:25 d2.evaluation.evaluator]: Inference done 66/125. 0.1357 s / img. ETA=0:01:15
[05/06 09:36:31 d2.evaluation.evaluator]: Inference done 71/125. 0.1353 s / img. ETA=0:01:08
[05/06 09:36:37 d2.evaluation.evaluator]: Inference done 76/125. 0.1350 s / img. ETA=0:01:02
[05/06 09:36:44 d2.evaluation.evaluator]: Inference done 81/125. 0.1349 s / img. ETA=0:00:55
[05/06 09:36:49 d2.evaluation.evaluator]: Inference done 84/125. 0.1354 s / img. ETA=0:00:52
[05/06 09:36:56 d2.evaluation.evaluator]: Inference done 88/125. 0.1357 s / img. ETA=0:00:48
[05/06 09:37:01 d2.evaluation.evaluator]: Inference done 93/125. 0.1356 s / img. ETA=0:00:41
[05/06 09:37:06 d2.evaluation.evaluator]: Inference done 96/125. 0.1359 s / img. ETA=0:00:37
[05/06 09:37:12 d2.evaluation.evaluator]: Inference done 100/125. 0.1360 s / img. ETA=0:00:32
[05/06 09:37:18 d2.evaluation.evaluator]: Inference done 104/125. 0.1363 s / img. ETA=0:00:27
[05/06 09:37:24 d2.evaluation.evaluator]: Inference done 108/125. 0.1366 s / img. ETA=0:00:22
[05/06 09:37:30 d2.evaluation.evaluator]: Inference done 114/125. 0.1362 s / img. ETA=0:00:14
[05/06 09:37:36 d2.evaluation.evaluator]: Inference done 118/125. 0.1364 s / img. ETA=0:00:09
[05/06 09:37:41 d2.evaluation.evaluator]: Inference done 123/125. 0.1361 s / img. ETA=0:00:02
[05/06 09:37:43 d2.evaluation.evaluator]: Total inference time: 0:02:35.565161 (1.296376 s / img per device, on 4 devices)
[05/06 09:37:43 d2.evaluation.evaluator]: Total inference pure compute time: 0:00:16 (0.135989 s / img per device, on 4 devices)
[05/06 09:38:01 d2.evaluation.cityscapes_evaluation]: Evaluating results under /tmp/cityscapes_eval_or_0vqvc ...
Creating ground truth instances from png files.
Processing 500 images...
All images processed
Matching 500 pairs of images...
All images processed
##################################################
what : AP AP_50%
##################################################
person : 0.379 0.719
rider : 0.283 0.679
car : 0.541 0.799
truck : 0.370 0.509
bus : 0.610 0.808
train : 0.414 0.627
motorcycle : 0.215 0.463
bicycle : 0.223 0.579
--------------------------------------------------
average : 0.379 0.648
[05/06 09:46:32 detectron2]: Evaluation results for cityscapes_fine_inst_seg_val in csv format:
[05/06 09:46:32 d2.evaluation.testing]: copypaste: Task: segm
[05/06 09:46:32 d2.evaluation.testing]: copypaste: AP,AP50
[05/06 09:46:32 d2.evaluation.testing]: copypaste: 37.9359,64.7978
[05/06 09:46:33 d2.utils.events]: eta: 1 day, 3:12:10 iter: 4940 total_loss: 2.912 loss_box_reg_stage0: 0.342 loss_box_reg_stage1: 0.706 loss_box_reg_stage2: 0.870 loss_cls_stage0: 0.219 loss_cls_stage1: 0.189 loss_cls_stage2: 0.181 loss_mask: 0.254 loss_rpn_cls: 0.028 loss_rpn_loc: 0.176 lr: 0.001000 max_mem: 7650M
[05/06 09:49:41 d2.utils.events]: eta: 5:10:37 iter: 5187 total_loss: 2.993 loss_box_reg_stage0: 0.336 loss_box_reg_stage1: 0.706 loss_box_reg_stage2: 0.876 loss_cls_stage0: 0.210 loss_cls_stage1: 0.188 loss_cls_stage2: 0.174 loss_mask: 0.247 loss_rpn_cls: 0.022 loss_rpn_loc: 0.147 lr: 0.001000 max_mem: 7650M
[05/06 09:52:54 d2.utils.events]: eta: 5:15:43 iter: 5434 total_loss: 2.950 loss_box_reg_stage0: 0.344 loss_box_reg_stage1: 0.695 loss_box_reg_stage2: 0.858 loss_cls_stage0: 0.226 loss_cls_stage1: 0.201 loss_cls_stage2: 0.181 loss_mask: 0.264 loss_rpn_cls: 0.026 loss_rpn_loc: 0.150 lr: 0.001000 max_mem: 7650M
[05/06 09:56:08 d2.utils.events]: eta: 5:12:45 iter: 5681 total_loss: 2.838 loss_box_reg_stage0: 0.330 loss_box_reg_stage1: 0.697 loss_box_reg_stage2: 0.836 loss_cls_stage0: 0.209 loss_cls_stage1: 0.195 loss_cls_stage2: 0.183 loss_mask: 0.247 loss_rpn_cls: 0.022 loss_rpn_loc: 0.139 lr: 0.001000 max_mem: 7650M
[05/06 09:59:22 d2.utils.events]: eta: 5:09:59 iter: 5928 total_loss: 3.008 loss_box_reg_stage0: 0.342 loss_box_reg_stage1: 0.705 loss_box_reg_stage2: 0.889 loss_cls_stage0: 0.217 loss_cls_stage1: 0.192 loss_cls_stage2: 0.187 loss_mask: 0.261 loss_rpn_cls: 0.026 loss_rpn_loc: 0.159 lr: 0.001000 max_mem: 7650M
[05/06 10:02:35 d2.utils.events]: eta: 5:05:37 iter: 6175 total_loss: 2.824 loss_box_reg_stage0: 0.321 loss_box_reg_stage1: 0.668 loss_box_reg_stage2: 0.820 loss_cls_stage0: 0.198 loss_cls_stage1: 0.180 loss_cls_stage2: 0.165 loss_mask: 0.238 loss_rpn_cls: 0.023 loss_rpn_loc: 0.134 lr: 0.001000 max_mem: 7650M
[05/06 10:05:48 d2.utils.events]: eta: 5:03:18 iter: 6422 total_loss: 2.881 loss_box_reg_stage0: 0.333 loss_box_reg_stage1: 0.686 loss_box_reg_stage2: 0.876 loss_cls_stage0: 0.207 loss_cls_stage1: 0.191 loss_cls_stage2: 0.180 loss_mask: 0.246 loss_rpn_cls: 0.023 loss_rpn_loc: 0.141 lr: 0.001000 max_mem: 7650M
[05/06 10:09:01 d2.utils.events]: eta: 4:58:25 iter: 6669 total_loss: 2.927 loss_box_reg_stage0: 0.325 loss_box_reg_stage1: 0.703 loss_box_reg_stage2: 0.871 loss_cls_stage0: 0.222 loss_cls_stage1: 0.193 loss_cls_stage2: 0.179 loss_mask: 0.244 loss_rpn_cls: 0.024 loss_rpn_loc: 0.160 lr: 0.001000 max_mem: 7650M
[05/06 10:12:14 d2.utils.events]: eta: 4:56:25 iter: 6916 total_loss: 2.858 loss_box_reg_stage0: 0.335 loss_box_reg_stage1: 0.699 loss_box_reg_stage2: 0.865 loss_cls_stage0: 0.204 loss_cls_stage1: 0.172 loss_cls_stage2: 0.171 loss_mask: 0.252 loss_rpn_cls: 0.021 loss_rpn_loc: 0.140 lr: 0.001000 max_mem: 7650M
[05/06 10:15:27 d2.utils.events]: eta: 4:52:35 iter: 7163 total_loss: 2.659 loss_box_reg_stage0: 0.304 loss_box_reg_stage1: 0.636 loss_box_reg_stage2: 0.775 loss_cls_stage0: 0.193 loss_cls_stage1: 0.183 loss_cls_stage2: 0.165 loss_mask: 0.239 loss_rpn_cls: 0.025 loss_rpn_loc: 0.126 lr: 0.001000 max_mem: 7650M
[05/06 10:18:40 fvcore.common.checkpoint]: Saving checkpoint to /ssd_scratch/cvit/myfolder/cityscapes/models/model_0007409.pth
[05/06 10:18:41 d2.data.datasets.cityscapes]: 3 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val'.
[05/06 10:18:41 d2.data.datasets.cityscapes]: Preprocessing cityscapes annotations ...
[05/06 10:20:12 d2.data.datasets.cityscapes]: Loaded 500 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val
[05/06 10:20:12 d2.data.common]: Serializing 500 elements to byte tensors and concatenating them all ...
[05/06 10:20:12 d2.data.common]: Serialized dataset takes 12.85 MiB
[05/06 10:20:12 d2.evaluation.evaluator]: Start inference on 125 images
[05/06 10:20:12 d2.evaluation.cityscapes_evaluation]: Writing cityscapes results to temporary directory /tmp/cityscapes_eval_3yce6rbu ...
[05/06 10:20:27 d2.evaluation.evaluator]: Inference done 11/125. 0.1234 s / img. ETA=0:01:40
[05/06 10:20:32 d2.evaluation.evaluator]: Inference done 16/125. 0.1248 s / img. ETA=0:01:42
[05/06 10:20:37 d2.evaluation.evaluator]: Inference done 20/125. 0.1284 s / img. ETA=0:01:49
[05/06 10:20:42 d2.evaluation.evaluator]: Inference done 25/125. 0.1292 s / img. ETA=0:01:44
[05/06 10:20:48 d2.evaluation.evaluator]: Inference done 30/125. 0.1301 s / img. ETA=0:01:41
[05/06 10:20:54 d2.evaluation.evaluator]: Inference done 34/125. 0.1313 s / img. ETA=0:01:40
[05/06 10:21:00 d2.evaluation.evaluator]: Inference done 39/125. 0.1315 s / img. ETA=0:01:38
[05/06 10:21:06 d2.evaluation.evaluator]: Inference done 45/125. 0.1309 s / img. ETA=0:01:28
[05/06 10:21:12 d2.evaluation.evaluator]: Inference done 49/125. 0.1313 s / img. ETA=0:01:26
[05/06 10:21:17 d2.evaluation.evaluator]: Inference done 54/125. 0.1310 s / img. ETA=0:01:19
[05/06 10:21:23 d2.evaluation.evaluator]: Inference done 59/125. 0.1310 s / img. ETA=0:01:15
[05/06 10:21:29 d2.evaluation.evaluator]: Inference done 64/125. 0.1312 s / img. ETA=0:01:10
[05/06 10:21:35 d2.evaluation.evaluator]: Inference done 68/125. 0.1316 s / img. ETA=0:01:06
[05/06 10:21:40 d2.evaluation.evaluator]: Inference done 74/125. 0.1310 s / img. ETA=0:00:58
[05/06 10:21:46 d2.evaluation.evaluator]: Inference done 78/125. 0.1312 s / img. ETA=0:00:54
[05/06 10:21:51 d2.evaluation.evaluator]: Inference done 82/125. 0.1313 s / img. ETA=0:00:49
[05/06 10:21:56 d2.evaluation.evaluator]: Inference done 86/125. 0.1319 s / img. ETA=0:00:45
[05/06 10:22:02 d2.evaluation.evaluator]: Inference done 90/125. 0.1324 s / img. ETA=0:00:41
[05/06 10:22:07 d2.evaluation.evaluator]: Inference done 94/125. 0.1326 s / img. ETA=0:00:36
[05/06 10:22:12 d2.evaluation.evaluator]: Inference done 98/125. 0.1326 s / img. ETA=0:00:32
[05/06 10:22:18 d2.evaluation.evaluator]: Inference done 103/125. 0.1325 s / img. ETA=0:00:26
[05/06 10:22:24 d2.evaluation.evaluator]: Inference done 107/125. 0.1330 s / img. ETA=0:00:21
[05/06 10:22:29 d2.evaluation.evaluator]: Inference done 112/125. 0.1328 s / img. ETA=0:00:15
[05/06 10:22:34 d2.evaluation.evaluator]: Inference done 116/125. 0.1328 s / img. ETA=0:00:10
[05/06 10:22:40 d2.evaluation.evaluator]: Inference done 122/125. 0.1328 s / img. ETA=0:00:03
[05/06 10:22:43 d2.evaluation.evaluator]: Total inference time: 0:02:21.252326 (1.177103 s / img per device, on 4 devices)
[05/06 10:22:43 d2.evaluation.evaluator]: Total inference pure compute time: 0:00:15 (0.132562 s / img per device, on 4 devices)
[05/06 10:23:02 d2.evaluation.cityscapes_evaluation]: Evaluating results under /tmp/cityscapes_eval_3yce6rbu ...
Creating ground truth instances from png files.
Processing 500 images...
All images processed
Matching 500 pairs of images...
All images processed
##################################################
what : AP AP_50%
##################################################
person : 0.385 0.723
rider : 0.292 0.682
car : 0.546 0.802
truck : 0.391 0.532
bus : 0.616 0.805
train : 0.407 0.667
motorcycle : 0.218 0.474
bicycle : 0.229 0.591
--------------------------------------------------
average : 0.385 0.660
[05/06 10:30:43 detectron2]: Evaluation results for cityscapes_fine_inst_seg_val in csv format:
[05/06 10:30:43 d2.evaluation.testing]: copypaste: Task: segm
[05/06 10:30:43 d2.evaluation.testing]: copypaste: AP,AP50
[05/06 10:30:43 d2.evaluation.testing]: copypaste: 38.5485,65.9551
[05/06 10:30:43 d2.utils.events]: eta: 22:54:29 iter: 7410 total_loss: 2.667 loss_box_reg_stage0: 0.325 loss_box_reg_stage1: 0.647 loss_box_reg_stage2: 0.782 loss_cls_stage0: 0.194 loss_cls_stage1: 0.171 loss_cls_stage2: 0.164 loss_mask: 0.246 loss_rpn_cls: 0.019 loss_rpn_loc: 0.151 lr: 0.001000 max_mem: 7650M
[05/06 10:33:52 d2.utils.events]: eta: 4:39:57 iter: 7657 total_loss: 2.844 loss_box_reg_stage0: 0.333 loss_box_reg_stage1: 0.685 loss_box_reg_stage2: 0.794 loss_cls_stage0: 0.215 loss_cls_stage1: 0.196 loss_cls_stage2: 0.179 loss_mask: 0.252 loss_rpn_cls: 0.019 loss_rpn_loc: 0.130 lr: 0.001000 max_mem: 7650M
[05/06 10:37:07 d2.utils.events]: eta: 4:45:43 iter: 7904 total_loss: 2.878 loss_box_reg_stage0: 0.345 loss_box_reg_stage1: 0.688 loss_box_reg_stage2: 0.858 loss_cls_stage0: 0.200 loss_cls_stage1: 0.180 loss_cls_stage2: 0.167 loss_mask: 0.252 loss_rpn_cls: 0.019 loss_rpn_loc: 0.145 lr: 0.001000 max_mem: 7650M
[05/06 10:40:20 d2.utils.events]: eta: 4:40:35 iter: 8151 total_loss: 2.862 loss_box_reg_stage0: 0.340 loss_box_reg_stage1: 0.680 loss_box_reg_stage2: 0.845 loss_cls_stage0: 0.199 loss_cls_stage1: 0.181 loss_cls_stage2: 0.169 loss_mask: 0.262 loss_rpn_cls: 0.023 loss_rpn_loc: 0.165 lr: 0.001000 max_mem: 7650M
[05/06 10:43:35 d2.utils.events]: eta: 4:38:56 iter: 8398 total_loss: 2.734 loss_box_reg_stage0: 0.331 loss_box_reg_stage1: 0.664 loss_box_reg_stage2: 0.799 loss_cls_stage0: 0.200 loss_cls_stage1: 0.180 loss_cls_stage2: 0.165 loss_mask: 0.253 loss_rpn_cls: 0.020 loss_rpn_loc: 0.136 lr: 0.001000 max_mem: 7650M
[05/06 10:46:48 d2.utils.events]: eta: 4:33:51 iter: 8645 total_loss: 2.844 loss_box_reg_stage0: 0.323 loss_box_reg_stage1: 0.682 loss_box_reg_stage2: 0.830 loss_cls_stage0: 0.207 loss_cls_stage1: 0.179 loss_cls_stage2: 0.158 loss_mask: 0.251 loss_rpn_cls: 0.020 loss_rpn_loc: 0.176 lr: 0.001000 max_mem: 7650M
[05/06 10:50:02 d2.utils.events]: eta: 4:30:43 iter: 8892 total_loss: 2.578 loss_box_reg_stage0: 0.303 loss_box_reg_stage1: 0.628 loss_box_reg_stage2: 0.782 loss_cls_stage0: 0.188 loss_cls_stage1: 0.174 loss_cls_stage2: 0.166 loss_mask: 0.232 loss_rpn_cls: 0.018 loss_rpn_loc: 0.116 lr: 0.001000 max_mem: 7650M
[05/06 10:53:15 d2.utils.events]: eta: 4:26:54 iter: 9139 total_loss: 2.640 loss_box_reg_stage0: 0.307 loss_box_reg_stage1: 0.628 loss_box_reg_stage2: 0.811 loss_cls_stage0: 0.188 loss_cls_stage1: 0.166 loss_cls_stage2: 0.162 loss_mask: 0.233 loss_rpn_cls: 0.019 loss_rpn_loc: 0.130 lr: 0.001000 max_mem: 7650M
[05/06 10:56:27 d2.utils.events]: eta: 4:22:49 iter: 9386 total_loss: 2.882 loss_box_reg_stage0: 0.326 loss_box_reg_stage1: 0.706 loss_box_reg_stage2: 0.857 loss_cls_stage0: 0.214 loss_cls_stage1: 0.193 loss_cls_stage2: 0.185 loss_mask: 0.247 loss_rpn_cls: 0.020 loss_rpn_loc: 0.163 lr: 0.001000 max_mem: 7650M
[05/06 10:59:40 d2.utils.events]: eta: 4:20:45 iter: 9633 total_loss: 2.763 loss_box_reg_stage0: 0.326 loss_box_reg_stage1: 0.661 loss_box_reg_stage2: 0.853 loss_cls_stage0: 0.197 loss_cls_stage1: 0.177 loss_cls_stage2: 0.170 loss_mask: 0.245 loss_rpn_cls: 0.020 loss_rpn_loc: 0.136 lr: 0.001000 max_mem: 7650M
[05/06 11:02:52 fvcore.common.checkpoint]: Saving checkpoint to /ssd_scratch/cvit/myfolder/cityscapes/models/model_0009879.pth
[05/06 11:02:53 d2.data.datasets.cityscapes]: 3 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val'.
[05/06 11:02:53 d2.data.datasets.cityscapes]: Preprocessing cityscapes annotations ...
[05/06 11:04:24 d2.data.datasets.cityscapes]: Loaded 500 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val
[05/06 11:04:24 d2.data.common]: Serializing 500 elements to byte tensors and concatenating them all ...
[05/06 11:04:24 d2.data.common]: Serialized dataset takes 12.85 MiB
[05/06 11:04:24 d2.evaluation.evaluator]: Start inference on 125 images
[05/06 11:04:25 d2.evaluation.cityscapes_evaluation]: Writing cityscapes results to temporary directory /tmp/cityscapes_eval_qwu60av1 ...
[05/06 11:04:40 d2.evaluation.evaluator]: Inference done 11/125. 0.1253 s / img. ETA=0:01:39
[05/06 11:04:45 d2.evaluation.evaluator]: Inference done 16/125. 0.1254 s / img. ETA=0:01:41
[05/06 11:04:50 d2.evaluation.evaluator]: Inference done 20/125. 0.1281 s / img. ETA=0:01:50
[05/06 11:04:56 d2.evaluation.evaluator]: Inference done 25/125. 0.1295 s / img. ETA=0:01:46
[05/06 11:05:02 d2.evaluation.evaluator]: Inference done 30/125. 0.1294 s / img. ETA=0:01:44
[05/06 11:05:08 d2.evaluation.evaluator]: Inference done 34/125. 0.1304 s / img. ETA=0:01:43
[05/06 11:05:15 d2.evaluation.evaluator]: Inference done 39/125. 0.1311 s / img. ETA=0:01:40
[05/06 11:05:20 d2.evaluation.evaluator]: Inference done 44/125. 0.1307 s / img. ETA=0:01:34
[05/06 11:05:26 d2.evaluation.evaluator]: Inference done 49/125. 0.1306 s / img. ETA=0:01:28
[05/06 11:05:31 d2.evaluation.evaluator]: Inference done 54/125. 0.1307 s / img. ETA=0:01:22
[05/06 11:05:37 d2.evaluation.evaluator]: Inference done 58/125. 0.1312 s / img. ETA=0:01:18
[05/06 11:05:42 d2.evaluation.evaluator]: Inference done 62/125. 0.1313 s / img. ETA=0:01:14
[05/06 11:05:47 d2.evaluation.evaluator]: Inference done 66/125. 0.1318 s / img. ETA=0:01:09
[05/06 11:05:53 d2.evaluation.evaluator]: Inference done 71/125. 0.1315 s / img. ETA=0:01:03
[05/06 11:05:58 d2.evaluation.evaluator]: Inference done 76/125. 0.1313 s / img. ETA=0:00:57
[05/06 11:06:04 d2.evaluation.evaluator]: Inference done 81/125. 0.1314 s / img. ETA=0:00:51
[05/06 11:06:10 d2.evaluation.evaluator]: Inference done 85/125. 0.1317 s / img. ETA=0:00:47
[05/06 11:06:16 d2.evaluation.evaluator]: Inference done 89/125. 0.1320 s / img. ETA=0:00:43
[05/06 11:06:22 d2.evaluation.evaluator]: Inference done 94/125. 0.1322 s / img. ETA=0:00:37
[05/06 11:06:28 d2.evaluation.evaluator]: Inference done 98/125. 0.1323 s / img. ETA=0:00:32
[05/06 11:06:34 d2.evaluation.evaluator]: Inference done 103/125. 0.1323 s / img. ETA=0:00:26
[05/06 11:06:39 d2.evaluation.evaluator]: Inference done 106/125. 0.1327 s / img. ETA=0:00:23
[05/06 11:06:44 d2.evaluation.evaluator]: Inference done 110/125. 0.1326 s / img. ETA=0:00:18
[05/06 11:06:50 d2.evaluation.evaluator]: Inference done 116/125. 0.1326 s / img. ETA=0:00:10
[05/06 11:06:56 d2.evaluation.evaluator]: Inference done 122/125. 0.1324 s / img. ETA=0:00:03
[05/06 11:07:00 d2.evaluation.evaluator]: Total inference time: 0:02:24.951993 (1.207933 s / img per device, on 4 devices)
[05/06 11:07:00 d2.evaluation.evaluator]: Total inference pure compute time: 0:00:15 (0.132331 s / img per device, on 4 devices)
[05/06 11:07:18 d2.evaluation.cityscapes_evaluation]: Evaluating results under /tmp/cityscapes_eval_qwu60av1 ...
Creating ground truth instances from png files.
Processing 500 images...
All images processed
Matching 500 pairs of images...
All images processed
##################################################
what : AP AP_50%
##################################################
person : 0.389 0.729
rider : 0.297 0.681
car : 0.552 0.815
truck : 0.386 0.554
bus : 0.615 0.822
train : 0.449 0.671
motorcycle : 0.243 0.506
bicycle : 0.235 0.601
--------------------------------------------------
average : 0.396 0.672
[05/06 11:15:15 detectron2]: Evaluation results for cityscapes_fine_inst_seg_val in csv format:
[05/06 11:15:15 d2.evaluation.testing]: copypaste: Task: segm
[05/06 11:15:15 d2.evaluation.testing]: copypaste: AP,AP50
[05/06 11:15:15 d2.evaluation.testing]: copypaste: 39.5778,67.2387
[05/06 11:15:15 d2.utils.events]: eta: 20:47:19 iter: 9880 total_loss: 2.740 loss_box_reg_stage0: 0.319 loss_box_reg_stage1: 0.650 loss_box_reg_stage2: 0.816 loss_cls_stage0: 0.202 loss_cls_stage1: 0.166 loss_cls_stage2: 0.154 loss_mask: 0.243 loss_rpn_cls: 0.023 loss_rpn_loc: 0.148 lr: 0.001000 max_mem: 7650M
[05/06 11:18:24 d2.utils.events]: eta: 4:08:34 iter: 10127 total_loss: 2.833 loss_box_reg_stage0: 0.329 loss_box_reg_stage1: 0.671 loss_box_reg_stage2: 0.832 loss_cls_stage0: 0.200 loss_cls_stage1: 0.182 loss_cls_stage2: 0.173 loss_mask: 0.255 loss_rpn_cls: 0.021 loss_rpn_loc: 0.147 lr: 0.001000 max_mem: 7650M
[05/06 11:21:39 d2.utils.events]: eta: 4:12:48 iter: 10374 total_loss: 2.670 loss_box_reg_stage0: 0.304 loss_box_reg_stage1: 0.636 loss_box_reg_stage2: 0.815 loss_cls_stage0: 0.182 loss_cls_stage1: 0.165 loss_cls_stage2: 0.145 loss_mask: 0.233 loss_rpn_cls: 0.019 loss_rpn_loc: 0.142 lr: 0.001000 max_mem: 7650M
[05/06 11:24:54 d2.utils.events]: eta: 4:09:56 iter: 10621 total_loss: 2.672 loss_box_reg_stage0: 0.316 loss_box_reg_stage1: 0.665 loss_box_reg_stage2: 0.790 loss_cls_stage0: 0.199 loss_cls_stage1: 0.171 loss_cls_stage2: 0.159 loss_mask: 0.238 loss_rpn_cls: 0.020 loss_rpn_loc: 0.146 lr: 0.001000 max_mem: 7650M
[05/06 11:28:08 d2.utils.events]: eta: 4:06:19 iter: 10868 total_loss: 2.716 loss_box_reg_stage0: 0.301 loss_box_reg_stage1: 0.636 loss_box_reg_stage2: 0.803 loss_cls_stage0: 0.191 loss_cls_stage1: 0.168 loss_cls_stage2: 0.158 loss_mask: 0.237 loss_rpn_cls: 0.017 loss_rpn_loc: 0.136 lr: 0.001000 max_mem: 7650M
[05/06 11:31:22 d2.utils.events]: eta: 4:02:22 iter: 11115 total_loss: 2.575 loss_box_reg_stage0: 0.303 loss_box_reg_stage1: 0.603 loss_box_reg_stage2: 0.778 loss_cls_stage0: 0.179 loss_cls_stage1: 0.160 loss_cls_stage2: 0.146 loss_mask: 0.236 loss_rpn_cls: 0.019 loss_rpn_loc: 0.148 lr: 0.001000 max_mem: 7650M
[05/06 11:34:36 d2.utils.events]: eta: 3:59:10 iter: 11362 total_loss: 2.580 loss_box_reg_stage0: 0.313 loss_box_reg_stage1: 0.645 loss_box_reg_stage2: 0.766 loss_cls_stage0: 0.180 loss_cls_stage1: 0.159 loss_cls_stage2: 0.146 loss_mask: 0.240 loss_rpn_cls: 0.015 loss_rpn_loc: 0.132 lr: 0.001000 max_mem: 7650M
[05/06 11:37:49 d2.utils.events]: eta: 3:55:11 iter: 11609 total_loss: 2.814 loss_box_reg_stage0: 0.325 loss_box_reg_stage1: 0.675 loss_box_reg_stage2: 0.830 loss_cls_stage0: 0.203 loss_cls_stage1: 0.179 loss_cls_stage2: 0.154 loss_mask: 0.244 loss_rpn_cls: 0.018 loss_rpn_loc: 0.135 lr: 0.001000 max_mem: 7650M
[05/06 11:41:03 d2.utils.events]: eta: 3:52:19 iter: 11856 total_loss: 2.726 loss_box_reg_stage0: 0.318 loss_box_reg_stage1: 0.654 loss_box_reg_stage2: 0.857 loss_cls_stage0: 0.197 loss_cls_stage1: 0.177 loss_cls_stage2: 0.157 loss_mask: 0.241 loss_rpn_cls: 0.019 loss_rpn_loc: 0.135 lr: 0.001000 max_mem: 7650M
[05/06 11:44:16 d2.utils.events]: eta: 3:48:47 iter: 12103 total_loss: 2.691 loss_box_reg_stage0: 0.314 loss_box_reg_stage1: 0.640 loss_box_reg_stage2: 0.814 loss_cls_stage0: 0.184 loss_cls_stage1: 0.164 loss_cls_stage2: 0.149 loss_mask: 0.247 loss_rpn_cls: 0.020 loss_rpn_loc: 0.140 lr: 0.001000 max_mem: 7650M
[05/06 11:47:29 fvcore.common.checkpoint]: Saving checkpoint to /ssd_scratch/cvit/myfolder/cityscapes/models/model_0012349.pth
[05/06 11:47:30 d2.data.datasets.cityscapes]: 3 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val'.
[05/06 11:47:30 d2.data.datasets.cityscapes]: Preprocessing cityscapes annotations ...
[05/06 11:49:01 d2.data.datasets.cityscapes]: Loaded 500 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val
[05/06 11:49:01 d2.data.common]: Serializing 500 elements to byte tensors and concatenating them all ...
[05/06 11:49:01 d2.data.common]: Serialized dataset takes 12.85 MiB
[05/06 11:49:01 d2.evaluation.evaluator]: Start inference on 125 images
[05/06 11:49:01 d2.evaluation.cityscapes_evaluation]: Writing cityscapes results to temporary directory /tmp/cityscapes_eval_8uykt_8i ...
[05/06 11:49:15 d2.evaluation.evaluator]: Inference done 11/125. 0.1217 s / img. ETA=0:01:32
[05/06 11:49:22 d2.evaluation.evaluator]: Inference done 17/125. 0.1258 s / img. ETA=0:01:44
[05/06 11:49:28 d2.evaluation.evaluator]: Inference done 23/125. 0.1298 s / img. ETA=0:01:42
[05/06 11:49:34 d2.evaluation.evaluator]: Inference done 28/125. 0.1306 s / img. ETA=0:01:40
[05/06 11:49:39 d2.evaluation.evaluator]: Inference done 32/125. 0.1318 s / img. ETA=0:01:39
[05/06 11:49:45 d2.evaluation.evaluator]: Inference done 37/125. 0.1320 s / img. ETA=0:01:36
[05/06 11:49:51 d2.evaluation.evaluator]: Inference done 42/125. 0.1321 s / img. ETA=0:01:32
[05/06 11:49:57 d2.evaluation.evaluator]: Inference done 48/125. 0.1311 s / img. ETA=0:01:24
[05/06 11:50:02 d2.evaluation.evaluator]: Inference done 52/125. 0.1320 s / img. ETA=0:01:20
[05/06 11:50:07 d2.evaluation.evaluator]: Inference done 57/125. 0.1315 s / img. ETA=0:01:14
[05/06 11:50:14 d2.evaluation.evaluator]: Inference done 62/125. 0.1315 s / img. ETA=0:01:10
[05/06 11:50:19 d2.evaluation.evaluator]: Inference done 66/125. 0.1323 s / img. ETA=0:01:06
[05/06 11:50:25 d2.evaluation.evaluator]: Inference done 72/125. 0.1318 s / img. ETA=0:00:59
[05/06 11:50:31 d2.evaluation.evaluator]: Inference done 77/125. 0.1316 s / img. ETA=0:00:53
[05/06 11:50:37 d2.evaluation.evaluator]: Inference done 82/125. 0.1318 s / img. ETA=0:00:48
[05/06 11:50:42 d2.evaluation.evaluator]: Inference done 86/125. 0.1321 s / img. ETA=0:00:44
[05/06 11:50:47 d2.evaluation.evaluator]: Inference done 90/125. 0.1324 s / img. ETA=0:00:39
[05/06 11:50:53 d2.evaluation.evaluator]: Inference done 95/125. 0.1325 s / img. ETA=0:00:34
[05/06 11:50:59 d2.evaluation.evaluator]: Inference done 100/125. 0.1324 s / img. ETA=0:00:28
[05/06 11:51:05 d2.evaluation.evaluator]: Inference done 105/125. 0.1324 s / img. ETA=0:00:22
[05/06 11:51:10 d2.evaluation.evaluator]: Inference done 109/125. 0.1326 s / img. ETA=0:00:18
[05/06 11:51:15 d2.evaluation.evaluator]: Inference done 114/125. 0.1325 s / img. ETA=0:00:12
[05/06 11:51:21 d2.evaluation.evaluator]: Inference done 118/125. 0.1327 s / img. ETA=0:00:08
[05/06 11:51:26 d2.evaluation.evaluator]: Inference done 124/125. 0.1323 s / img. ETA=0:00:01
[05/06 11:51:27 d2.evaluation.evaluator]: Total inference time: 0:02:16.815477 (1.140129 s / img per device, on 4 devices)
[05/06 11:51:27 d2.evaluation.evaluator]: Total inference pure compute time: 0:00:15 (0.132186 s / img per device, on 4 devices)
[05/06 11:51:45 d2.evaluation.cityscapes_evaluation]: Evaluating results under /tmp/cityscapes_eval_8uykt_8i ...
Creating ground truth instances from png files.
Processing 500 images...
All images processed
Matching 500 pairs of images...
All images processed
##################################################
what : AP AP_50%
##################################################
person : 0.385 0.723
rider : 0.296 0.689
car : 0.545 0.802
truck : 0.395 0.567
bus : 0.612 0.808
train : 0.430 0.683
motorcycle : 0.228 0.457
bicycle : 0.233 0.595
--------------------------------------------------
average : 0.391 0.665
[05/06 11:59:26 detectron2]: Evaluation results for cityscapes_fine_inst_seg_val in csv format:
[05/06 11:59:26 d2.evaluation.testing]: copypaste: Task: segm
[05/06 11:59:26 d2.evaluation.testing]: copypaste: AP,AP50
[05/06 11:59:26 d2.evaluation.testing]: copypaste: 39.0668,66.5412
[05/06 11:59:26 d2.utils.events]: eta: 17:41:28 iter: 12350 total_loss: 2.769 loss_box_reg_stage0: 0.315 loss_box_reg_stage1: 0.635 loss_box_reg_stage2: 0.823 loss_cls_stage0: 0.198 loss_cls_stage1: 0.168 loss_cls_stage2: 0.159 loss_mask: 0.246 loss_rpn_cls: 0.020 loss_rpn_loc: 0.152 lr: 0.001000 max_mem: 7650M
[05/06 12:02:35 d2.utils.events]: eta: 3:37:47 iter: 12597 total_loss: 2.727 loss_box_reg_stage0: 0.312 loss_box_reg_stage1: 0.656 loss_box_reg_stage2: 0.824 loss_cls_stage0: 0.194 loss_cls_stage1: 0.161 loss_cls_stage2: 0.150 loss_mask: 0.250 loss_rpn_cls: 0.019 loss_rpn_loc: 0.126 lr: 0.001000 max_mem: 7650M
[05/06 12:05:50 d2.utils.events]: eta: 3:40:13 iter: 12844 total_loss: 2.669 loss_box_reg_stage0: 0.300 loss_box_reg_stage1: 0.653 loss_box_reg_stage2: 0.828 loss_cls_stage0: 0.191 loss_cls_stage1: 0.158 loss_cls_stage2: 0.147 loss_mask: 0.241 loss_rpn_cls: 0.018 loss_rpn_loc: 0.140 lr: 0.001000 max_mem: 7650M
[05/06 12:09:05 d2.utils.events]: eta: 3:37:42 iter: 13091 total_loss: 2.610 loss_box_reg_stage0: 0.312 loss_box_reg_stage1: 0.628 loss_box_reg_stage2: 0.807 loss_cls_stage0: 0.178 loss_cls_stage1: 0.154 loss_cls_stage2: 0.150 loss_mask: 0.240 loss_rpn_cls: 0.019 loss_rpn_loc: 0.137 lr: 0.001000 max_mem: 7650M
[05/06 12:12:20 d2.utils.events]: eta: 3:35:05 iter: 13338 total_loss: 2.503 loss_box_reg_stage0: 0.300 loss_box_reg_stage1: 0.609 loss_box_reg_stage2: 0.794 loss_cls_stage0: 0.181 loss_cls_stage1: 0.150 loss_cls_stage2: 0.132 loss_mask: 0.232 loss_rpn_cls: 0.017 loss_rpn_loc: 0.120 lr: 0.001000 max_mem: 7650M
[05/06 12:15:35 d2.utils.events]: eta: 3:30:41 iter: 13585 total_loss: 2.498 loss_box_reg_stage0: 0.298 loss_box_reg_stage1: 0.593 loss_box_reg_stage2: 0.753 loss_cls_stage0: 0.178 loss_cls_stage1: 0.145 loss_cls_stage2: 0.135 loss_mask: 0.233 loss_rpn_cls: 0.018 loss_rpn_loc: 0.124 lr: 0.001000 max_mem: 7650M
[32m[05/06 12:18:48 d2.utils.events]: [0m eta: 3:26:15 iter: 13832 total_loss: 2.612 loss_box_reg_stage0: 0.307 loss_box_reg_stage1: 0.630 loss_box_reg_stage2: 0.802 loss_cls_stage0: 0.179 loss_cls_stage1: 0.154 loss_cls_stage2: 0.150 loss_mask: 0.229 loss_rpn_cls: 0.017 loss_rpn_loc: 0.154 lr: 0.001000 max_mem: 7650M
[32m[05/06 12:22:01 d2.utils.events]: [0m eta: 3:22:50 iter: 14079 total_loss: 2.691 loss_box_reg_stage0: 0.308 loss_box_reg_stage1: 0.628 loss_box_reg_stage2: 0.824 loss_cls_stage0: 0.189 loss_cls_stage1: 0.164 loss_cls_stage2: 0.154 loss_mask: 0.236 loss_rpn_cls: 0.019 loss_rpn_loc: 0.146 lr: 0.001000 max_mem: 7650M
[32m[05/06 12:25:15 d2.utils.events]: [0m eta: 3:20:29 iter: 14326 total_loss: 2.683 loss_box_reg_stage0: 0.317 loss_box_reg_stage1: 0.641 loss_box_reg_stage2: 0.797 loss_cls_stage0: 0.196 loss_cls_stage1: 0.160 loss_cls_stage2: 0.137 loss_mask: 0.245 loss_rpn_cls: 0.018 loss_rpn_loc: 0.152 lr: 0.001000 max_mem: 7650M
[32m[05/06 12:28:29 d2.utils.events]: [0m eta: 3:17:07 iter: 14573 total_loss: 2.477 loss_box_reg_stage0: 0.294 loss_box_reg_stage1: 0.589 loss_box_reg_stage2: 0.741 loss_cls_stage0: 0.177 loss_cls_stage1: 0.146 loss_cls_stage2: 0.139 loss_mask: 0.232 loss_rpn_cls: 0.016 loss_rpn_loc: 0.124 lr: 0.001000 max_mem: 7650M
[32m[05/06 12:31:42 fvcore.common.checkpoint]: [0mSaving checkpoint to /ssd_scratch/cvit/myfolder/cityscapes/models/model_0014819.pth
[32m[05/06 12:31:43 d2.data.datasets.cityscapes]: [0m3 cities found in '/ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val'.
[32m[05/06 12:31:43 d2.data.datasets.cityscapes]: [0mPreprocessing cityscapes annotations ...
[32m[05/06 12:33:15 d2.data.datasets.cityscapes]: [0mLoaded 500 images from /ssd_scratch/cvit/myfolder/cityscapes/leftImg8bit/val
[32m[05/06 12:33:15 d2.data.common]: [0mSerializing 500 elements to byte tensors and concatenating them all ...
[32m[05/06 12:33:15 d2.data.common]: [0mSerialized dataset takes 12.85 MiB
[32m[05/06 12:33:15 d2.evaluation.evaluator]: [0mStart inference on 125 images
[32m[05/06 12:33:15 d2.evaluation.cityscapes_evaluation]: [0mWriting cityscapes results to temporary directory /tmp/cityscapes_eval_l3n0ocuu ...
[32m[05/06 12:33:29 d2.evaluation.evaluator]: [0mInference done 11/125. 0.1252 s / img. ETA=0:01:37
[32m[05/06 12:33:36 d2.evaluation.evaluator]: [0mInference done 17/125. 0.1264 s / img. ETA=0:01:48
[32m[05/06 12:33:43 d2.evaluation.evaluator]: [0mInference done 23/125. 0.1277 s / img. ETA=0:01:47
[32m[05/06 12:33:49 d2.evaluation.evaluator]: [0mInference done 28/125. 0.1283 s / img. ETA=0:01:44
[32m[05/06 12:33:54 d2.evaluation.evaluator]: [0mInference done 32/125. 0.1294 s / img. ETA=0:01:43
[32m[05/06 12:34:01 d2.evaluation.evaluator]: [0mInference done 37/125. 0.1299 s / img. ETA=0:01:40
[32m[05/06 12:34:07 d2.evaluation.evaluator]: [0mInference done 42/125. 0.1307 s / img. ETA=0:01:36
[32m[05/06 12:34:13 d2.evaluation.evaluator]: [0mInference done 48/125. 0.1305 s / img. ETA=0:01:27