Using TensorFlow backend.
WARNING:tensorflow:From /usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
2019-03-15 10:25:35.963055: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
2019-03-15 10:25:35.968772: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2000125000 Hz
2019-03-15 10:25:35.969634: I tensorflow/compiler/xla/service/service.cc:150] XLA service 0x563c1d28a1e0 executing computations on platform Host. Devices:
2019-03-15 10:25:35.969668: I tensorflow/compiler/xla/service/service.cc:158] StreamExecutor device (0): <undefined>, <undefined>
imgs and labels loaded.
compiling
summarizing
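"compiling" and "summarizing" are status prints from the training script itself. A minimal sketch of calls that would produce these status lines and the layer table that follows; the metric name is taken from the training log below, while the optimizer, the loss, and the tiny stand-in model are assumptions, not the author's code:

from keras.models import Sequential
from keras.layers import Dense, Flatten

# Stand-in model so the sketch runs; the real network is the ResNet-style
# graph summarized below (64x64x1 input, 3832-way softmax output).
model = Sequential([Flatten(input_shape=(64, 64, 1)),
                    Dense(3832, activation='softmax')])

print('compiling')
model.compile(optimizer='adam',                  # optimizer is an assumption
              loss='categorical_crossentropy',   # assumed from the one-hot softmax head
              metrics=['categorical_accuracy'])  # metric name appears in the log below
print('summarizing')
model.summary()  # emits a layer/shape/param table like the one that follows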
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, 64, 64, 1) 0
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 64, 64, 1) 4 input_1[0][0]
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 64, 64, 64) 1600 batch_normalization_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 64, 64, 64) 256 conv2d_1[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, 64, 64, 64) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, 31, 31, 64) 0 activation_1[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 31, 31, 64) 256 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, 31, 31, 64) 0 batch_normalization_3[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 31, 31, 64) 4096 activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 31, 31, 64) 256 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, 31, 31, 64) 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 31, 31, 64) 36864 activation_3[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 31, 31, 64) 256 conv2d_3[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, 31, 31, 64) 0 batch_normalization_5[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 31, 31, 256) 16384 activation_4[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 31, 31, 256) 16384 activation_2[0][0]
__________________________________________________________________________________________________
add_1 (Add) (None, 31, 31, 256) 0 conv2d_4[0][0]
conv2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 31, 31, 256) 1024 add_1[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, 31, 31, 256) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 31, 31, 64) 16384 activation_5[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 31, 31, 64) 256 conv2d_6[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, 31, 31, 64) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 31, 31, 64) 36864 activation_6[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 31, 31, 64) 256 conv2d_7[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, 31, 31, 64) 0 batch_normalization_8[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 31, 31, 256) 16384 activation_7[0][0]
__________________________________________________________________________________________________
add_2 (Add) (None, 31, 31, 256) 0 conv2d_8[0][0]
add_1[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 31, 31, 256) 1024 add_2[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, 31, 31, 256) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 31, 31, 64) 16384 activation_8[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 31, 31, 64) 256 conv2d_9[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, 31, 31, 64) 0 batch_normalization_10[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 31, 31, 64) 36864 activation_9[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 31, 31, 64) 256 conv2d_10[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, 31, 31, 64) 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 31, 31, 256) 16384 activation_10[0][0]
__________________________________________________________________________________________________
add_3 (Add) (None, 31, 31, 256) 0 conv2d_11[0][0]
add_2[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 31, 31, 256) 1024 add_3[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, 31, 31, 256) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 31, 31, 128) 32768 activation_11[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 31, 31, 128) 512 conv2d_12[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, 31, 31, 128) 0 batch_normalization_13[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 16, 16, 128) 147456 activation_12[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 16, 16, 128) 512 conv2d_13[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, 16, 16, 128) 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 16, 16, 512) 65536 activation_13[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 16, 16, 512) 131072 activation_11[0][0]
__________________________________________________________________________________________________
add_4 (Add) (None, 16, 16, 512) 0 conv2d_14[0][0]
conv2d_15[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 16, 16, 512) 2048 add_4[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, 16, 16, 512) 0 batch_normalization_15[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 16, 16, 128) 65536 activation_14[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 16, 16, 128) 512 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, 16, 16, 128) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 16, 16, 128) 147456 activation_15[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 16, 16, 128) 512 conv2d_17[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, 16, 16, 128) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 16, 16, 512) 65536 activation_16[0][0]
__________________________________________________________________________________________________
add_5 (Add) (None, 16, 16, 512) 0 conv2d_18[0][0]
add_4[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 16, 16, 512) 2048 add_5[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, 16, 16, 512) 0 batch_normalization_18[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 16, 16, 128) 65536 activation_17[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 16, 16, 128) 512 conv2d_19[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, 16, 16, 128) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 16, 16, 128) 147456 activation_18[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 16, 16, 128) 512 conv2d_20[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, 16, 16, 128) 0 batch_normalization_20[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 16, 16, 512) 65536 activation_19[0][0]
__________________________________________________________________________________________________
add_6 (Add) (None, 16, 16, 512) 0 conv2d_21[0][0]
add_5[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 16, 16, 512) 2048 add_6[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, 16, 16, 512) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 16, 16, 128) 65536 activation_20[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 16, 16, 128) 512 conv2d_22[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, 16, 16, 128) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 16, 16, 128) 147456 activation_21[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 16, 16, 128) 512 conv2d_23[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, 16, 16, 128) 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 16, 16, 512) 65536 activation_22[0][0]
__________________________________________________________________________________________________
add_7 (Add) (None, 16, 16, 512) 0 conv2d_24[0][0]
add_6[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 16, 16, 512) 2048 add_7[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, 16, 16, 512) 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 16, 16, 128) 65536 activation_23[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 16, 16, 128) 512 conv2d_25[0][0]
__________________________________________________________________________________________________
activation_24 (Activation) (None, 16, 16, 128) 0 batch_normalization_25[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, 16, 16, 128) 147456 activation_24[0][0]
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 16, 16, 128) 512 conv2d_26[0][0]
__________________________________________________________________________________________________
activation_25 (Activation) (None, 16, 16, 128) 0 batch_normalization_26[0][0]
__________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, 16, 16, 512) 65536 activation_25[0][0]
__________________________________________________________________________________________________
add_8 (Add) (None, 16, 16, 512) 0 conv2d_27[0][0]
add_7[0][0]
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 16, 16, 512) 2048 add_8[0][0]
__________________________________________________________________________________________________
activation_26 (Activation) (None, 16, 16, 512) 0 batch_normalization_27[0][0]
__________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, 16, 16, 128) 65536 activation_26[0][0]
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 16, 16, 128) 512 conv2d_28[0][0]
__________________________________________________________________________________________________
activation_27 (Activation) (None, 16, 16, 128) 0 batch_normalization_28[0][0]
__________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, 16, 16, 128) 147456 activation_27[0][0]
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 16, 16, 128) 512 conv2d_29[0][0]
__________________________________________________________________________________________________
activation_28 (Activation) (None, 16, 16, 128) 0 batch_normalization_29[0][0]
__________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, 16, 16, 512) 65536 activation_28[0][0]
__________________________________________________________________________________________________
add_9 (Add) (None, 16, 16, 512) 0 conv2d_30[0][0]
add_8[0][0]
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 16, 16, 512) 2048 add_9[0][0]
__________________________________________________________________________________________________
activation_29 (Activation) (None, 16, 16, 512) 0 batch_normalization_30[0][0]
__________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, 16, 16, 128) 65536 activation_29[0][0]
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 16, 16, 128) 512 conv2d_31[0][0]
__________________________________________________________________________________________________
activation_30 (Activation) (None, 16, 16, 128) 0 batch_normalization_31[0][0]
__________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, 16, 16, 128) 147456 activation_30[0][0]
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 16, 16, 128) 512 conv2d_32[0][0]
__________________________________________________________________________________________________
activation_31 (Activation) (None, 16, 16, 128) 0 batch_normalization_32[0][0]
__________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, 16, 16, 512) 65536 activation_31[0][0]
__________________________________________________________________________________________________
add_10 (Add) (None, 16, 16, 512) 0 conv2d_33[0][0]
add_9[0][0]
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 16, 16, 512) 2048 add_10[0][0]
__________________________________________________________________________________________________
activation_32 (Activation) (None, 16, 16, 512) 0 batch_normalization_33[0][0]
__________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, 16, 16, 128) 65536 activation_32[0][0]
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 16, 16, 128) 512 conv2d_34[0][0]
__________________________________________________________________________________________________
activation_33 (Activation) (None, 16, 16, 128) 0 batch_normalization_34[0][0]
__________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, 16, 16, 128) 147456 activation_33[0][0]
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 16, 16, 128) 512 conv2d_35[0][0]
__________________________________________________________________________________________________
activation_34 (Activation) (None, 16, 16, 128) 0 batch_normalization_35[0][0]
__________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, 16, 16, 512) 65536 activation_34[0][0]
__________________________________________________________________________________________________
add_11 (Add) (None, 16, 16, 512) 0 conv2d_36[0][0]
add_10[0][0]
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 16, 16, 512) 2048 add_11[0][0]
__________________________________________________________________________________________________
activation_35 (Activation) (None, 16, 16, 512) 0 batch_normalization_36[0][0]
__________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, 16, 16, 256) 131072 activation_35[0][0]
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 16, 16, 256) 1024 conv2d_37[0][0]
__________________________________________________________________________________________________
activation_36 (Activation) (None, 16, 16, 256) 0 batch_normalization_37[0][0]
__________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, 8, 8, 256) 589824 activation_36[0][0]
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 8, 8, 256) 1024 conv2d_38[0][0]
__________________________________________________________________________________________________
activation_37 (Activation) (None, 8, 8, 256) 0 batch_normalization_38[0][0]
__________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, 8, 8, 1024) 262144 activation_37[0][0]
__________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, 8, 8, 1024) 524288 activation_35[0][0]
__________________________________________________________________________________________________
add_12 (Add) (None, 8, 8, 1024) 0 conv2d_39[0][0]
conv2d_40[0][0]
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 8, 8, 1024) 4096 add_12[0][0]
__________________________________________________________________________________________________
activation_38 (Activation) (None, 8, 8, 1024) 0 batch_normalization_39[0][0]
__________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, 8, 8, 256) 262144 activation_38[0][0]
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 8, 8, 256) 1024 conv2d_41[0][0]
__________________________________________________________________________________________________
activation_39 (Activation) (None, 8, 8, 256) 0 batch_normalization_40[0][0]
__________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, 8, 8, 256) 589824 activation_39[0][0]
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 8, 8, 256) 1024 conv2d_42[0][0]
__________________________________________________________________________________________________
activation_40 (Activation) (None, 8, 8, 256) 0 batch_normalization_41[0][0]
__________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, 8, 8, 1024) 262144 activation_40[0][0]
__________________________________________________________________________________________________
add_13 (Add) (None, 8, 8, 1024) 0 conv2d_43[0][0]
add_12[0][0]
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 8, 8, 1024) 4096 add_13[0][0]
__________________________________________________________________________________________________
activation_41 (Activation) (None, 8, 8, 1024) 0 batch_normalization_42[0][0]
__________________________________________________________________________________________________
conv2d_44 (Conv2D) (None, 8, 8, 256) 262144 activation_41[0][0]
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 8, 8, 256) 1024 conv2d_44[0][0]
__________________________________________________________________________________________________
activation_42 (Activation) (None, 8, 8, 256) 0 batch_normalization_43[0][0]
__________________________________________________________________________________________________
conv2d_45 (Conv2D) (None, 8, 8, 256) 589824 activation_42[0][0]
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 8, 8, 256) 1024 conv2d_45[0][0]
__________________________________________________________________________________________________
activation_43 (Activation) (None, 8, 8, 256) 0 batch_normalization_44[0][0]
__________________________________________________________________________________________________
conv2d_46 (Conv2D) (None, 8, 8, 1024) 262144 activation_43[0][0]
__________________________________________________________________________________________________
add_14 (Add) (None, 8, 8, 1024) 0 conv2d_46[0][0]
add_13[0][0]
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 8, 8, 1024) 4096 add_14[0][0]
__________________________________________________________________________________________________
activation_44 (Activation) (None, 8, 8, 1024) 0 batch_normalization_45[0][0]
__________________________________________________________________________________________________
conv2d_47 (Conv2D) (None, 8, 8, 256) 262144 activation_44[0][0]
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 8, 8, 256) 1024 conv2d_47[0][0]
__________________________________________________________________________________________________
activation_45 (Activation) (None, 8, 8, 256) 0 batch_normalization_46[0][0]
__________________________________________________________________________________________________
conv2d_48 (Conv2D) (None, 8, 8, 256) 589824 activation_45[0][0]
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 8, 8, 256) 1024 conv2d_48[0][0]
__________________________________________________________________________________________________
activation_46 (Activation) (None, 8, 8, 256) 0 batch_normalization_47[0][0]
__________________________________________________________________________________________________
conv2d_49 (Conv2D) (None, 8, 8, 1024) 262144 activation_46[0][0]
__________________________________________________________________________________________________
add_15 (Add) (None, 8, 8, 1024) 0 conv2d_49[0][0]
add_14[0][0]
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 8, 8, 1024) 4096 add_15[0][0]
__________________________________________________________________________________________________
activation_47 (Activation) (None, 8, 8, 1024) 0 batch_normalization_48[0][0]
__________________________________________________________________________________________________
conv2d_50 (Conv2D) (None, 8, 8, 256) 262144 activation_47[0][0]
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 8, 8, 256) 1024 conv2d_50[0][0]
__________________________________________________________________________________________________
activation_48 (Activation) (None, 8, 8, 256) 0 batch_normalization_49[0][0]
__________________________________________________________________________________________________
conv2d_51 (Conv2D) (None, 8, 8, 256) 589824 activation_48[0][0]
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 8, 8, 256) 1024 conv2d_51[0][0]
__________________________________________________________________________________________________
activation_49 (Activation) (None, 8, 8, 256) 0 batch_normalization_50[0][0]
__________________________________________________________________________________________________
conv2d_52 (Conv2D) (None, 8, 8, 1024) 262144 activation_49[0][0]
__________________________________________________________________________________________________
add_16 (Add) (None, 8, 8, 1024) 0 conv2d_52[0][0]
add_15[0][0]
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 8, 8, 1024) 4096 add_16[0][0]
__________________________________________________________________________________________________
activation_50 (Activation) (None, 8, 8, 1024) 0 batch_normalization_51[0][0]
__________________________________________________________________________________________________
conv2d_53 (Conv2D) (None, 8, 8, 256) 262144 activation_50[0][0]
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 8, 8, 256) 1024 conv2d_53[0][0]
__________________________________________________________________________________________________
activation_51 (Activation) (None, 8, 8, 256) 0 batch_normalization_52[0][0]
__________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, 8, 8, 256) 589824 activation_51[0][0]
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 8, 8, 256) 1024 conv2d_54[0][0]
__________________________________________________________________________________________________
activation_52 (Activation) (None, 8, 8, 256) 0 batch_normalization_53[0][0]
__________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, 8, 8, 1024) 262144 activation_52[0][0]
__________________________________________________________________________________________________
add_17 (Add) (None, 8, 8, 1024) 0 conv2d_55[0][0]
add_16[0][0]
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 8, 8, 1024) 4096 add_17[0][0]
__________________________________________________________________________________________________
activation_53 (Activation) (None, 8, 8, 1024) 0 batch_normalization_54[0][0]
__________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, 8, 8, 512) 524288 activation_53[0][0]
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 8, 8, 512) 2048 conv2d_56[0][0]
__________________________________________________________________________________________________
activation_54 (Activation) (None, 8, 8, 512) 0 batch_normalization_55[0][0]
__________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, 4, 4, 512) 2359296 activation_54[0][0]
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 4, 4, 512) 2048 conv2d_57[0][0]
__________________________________________________________________________________________________
activation_55 (Activation) (None, 4, 4, 512) 0 batch_normalization_56[0][0]
__________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, 4, 4, 2048) 1048576 activation_55[0][0]
__________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, 4, 4, 2048) 2097152 activation_53[0][0]
__________________________________________________________________________________________________
add_18 (Add) (None, 4, 4, 2048) 0 conv2d_58[0][0]
conv2d_59[0][0]
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 4, 4, 2048) 8192 add_18[0][0]
__________________________________________________________________________________________________
activation_56 (Activation) (None, 4, 4, 2048) 0 batch_normalization_57[0][0]
__________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, 4, 4, 512) 1048576 activation_56[0][0]
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 4, 4, 512) 2048 conv2d_60[0][0]
__________________________________________________________________________________________________
activation_57 (Activation) (None, 4, 4, 512) 0 batch_normalization_58[0][0]
__________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, 4, 4, 512) 2359296 activation_57[0][0]
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 4, 4, 512) 2048 conv2d_61[0][0]
__________________________________________________________________________________________________
WARNING:tensorflow:From /usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
activation_58 (Activation) (None, 4, 4, 512) 0 batch_normalization_59[0][0]
__________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, 4, 4, 2048) 1048576 activation_58[0][0]
__________________________________________________________________________________________________
add_19 (Add) (None, 4, 4, 2048) 0 conv2d_62[0][0]
add_18[0][0]
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 4, 4, 2048) 8192 add_19[0][0]
__________________________________________________________________________________________________
activation_59 (Activation) (None, 4, 4, 2048) 0 batch_normalization_60[0][0]
__________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, 4, 4, 512) 1048576 activation_59[0][0]
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 4, 4, 512) 2048 conv2d_63[0][0]
__________________________________________________________________________________________________
activation_60 (Activation) (None, 4, 4, 512) 0 batch_normalization_61[0][0]
__________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, 4, 4, 512) 2359296 activation_60[0][0]
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 4, 4, 512) 2048 conv2d_64[0][0]
__________________________________________________________________________________________________
activation_61 (Activation) (None, 4, 4, 512) 0 batch_normalization_62[0][0]
__________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, 4, 4, 2048) 1048576 activation_61[0][0]
__________________________________________________________________________________________________
add_20 (Add) (None, 4, 4, 2048) 0 conv2d_65[0][0]
add_19[0][0]
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 4, 4, 2048) 8192 add_20[0][0]
__________________________________________________________________________________________________
activation_62 (Activation) (None, 4, 4, 2048) 0 batch_normalization_63[0][0]
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, 2, 2, 2048) 0 activation_62[0][0]
__________________________________________________________________________________________________
flatten_1 (Flatten) (None, 8192) 0 average_pooling2d_1[0][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 3832) 31395576 flatten_1[0][0]
__________________________________________________________________________________________________
activation_63 (Activation) (None, 3832) 0 dense_1[0][0]
==================================================================================================
Total params: 56,060,220
Trainable params: 56,008,506
Non-trainable params: 51,714
__________________________________________________________________________________________________
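The to_int32 deprecation warning interleaved with the table above is emitted by TensorFlow's own math_ops module during graph construction, not by the training script; the substitution it recommends looks like this (illustrative snippet, not code from the source):

import tensorflow as tf

x = tf.constant([1.7, 2.3])
y = tf.cast(x, tf.int32)  # replaces the deprecated tf.to_int32(x)

The summary itself describes a ResNet-style network: a batch-normalized 64x64x1 input, a bias-free 5x5, 64-filter stem convolution, max pooling, then 20 bottleneck residual blocks in four stages (3, 8, 6, and 3 blocks at output widths 256, 512, 1024, and 2048), average pooling, and a 3832-way softmax head, for 56,060,220 parameters in total. Below is a minimal Keras sketch of one bottleneck block consistent with the repeating Conv2D/BatchNormalization/Activation/Add pattern in the table; the function name and arguments are illustrative, not taken from the source:

from keras.layers import Conv2D, BatchNormalization, Activation, Add

def bottleneck_block(x, filters, stride=1, project=False):
    # x is the previous block's Add output (or the stem). BN + ReLU are
    # applied at block entry; the identity shortcut bypasses them (in the
    # table, add_2 consumes add_1 directly), while a projecting shortcut
    # is taken from the activated tensor (conv2d_5 consumes activation_2).
    a = BatchNormalization()(x)
    a = Activation('relu')(a)
    y = Conv2D(filters, 1, use_bias=False)(a)  # 1x1 reduce
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(filters, 3, strides=stride, padding='same', use_bias=False)(y)  # 3x3
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(4 * filters, 1, use_bias=False)(y)  # 1x1 expand, no bias
    # The first block of each stage projects the shortcut to the new width
    # and stride (conv2d_5, conv2d_15, conv2d_40, conv2d_59 in the table);
    # later blocks reuse the incoming tensor unchanged.
    shortcut = Conv2D(4 * filters, 1, strides=stride, use_bias=False)(a) if project else x
    return Add()([y, shortcut])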
Train on 105377 samples, validate on 11709 samples
Epoch 1/15
256/105377 [..............................] - ETA: 4:57:49 - loss: 10.9481 - categorical_accuracy: 0.0000e+00
512/105377 [..............................] - ETA: 4:16:50 - loss: 10.9324 - categorical_accuracy: 0.0000e+00
768/105377 [..............................] - ETA: 3:57:12 - loss: 10.9242 - categorical_accuracy: 0.0000e+00
1024/105377 [..............................] - ETA: 3:47:07 - loss: 10.9405 - categorical_accuracy: 0.0000e+00
1280/105377 [..............................] - ETA: 3:43:16 - loss: 10.9400 - categorical_accuracy: 0.0000e+00
1536/105377 [..............................] - ETA: 3:39:03 - loss: 10.9327 - categorical_accuracy: 0.0000e+00
1792/105377 [..............................] - ETA: 3:37:27 - loss: 10.9319 - categorical_accuracy: 0.0000e+00
2048/105377 [..............................] - ETA: 3:34:19 - loss: 10.9311 - categorical_accuracy: 0.0000e+00
2304/105377 [..............................] - ETA: 3:31:43 - loss: 10.9241 - categorical_accuracy: 0.0000e+00
2560/105377 [..............................] - ETA: 3:30:28 - loss: 10.9252 - categorical_accuracy: 0.0000e+00
2816/105377 [..............................] - ETA: 3:28:39 - loss: 10.9254 - categorical_accuracy: 0.0000e+00
3072/105377 [..............................] - ETA: 3:26:59 - loss: 10.9237 - categorical_accuracy: 0.0000e+00
3328/105377 [..............................] - ETA: 3:26:31 - loss: 10.9269 - categorical_accuracy: 0.0000e+00
3584/105377 [>.............................] - ETA: 3:25:02 - loss: 10.9229 - categorical_accuracy: 0.0000e+00
3840/105377 [>.............................] - ETA: 3:24:05 - loss: 10.9259 - categorical_accuracy: 0.0000e+00
4096/105377 [>.............................] - ETA: 3:23:10 - loss: 10.9285 - categorical_accuracy: 0.0000e+00
4352/105377 [>.............................] - ETA: 3:22:41 - loss: 10.9271 - categorical_accuracy: 0.0000e+00
4608/105377 [>.............................] - ETA: 3:22:06 - loss: 10.9272 - categorical_accuracy: 0.0000e+00
4864/105377 [>.............................] - ETA: 3:21:05 - loss: 10.9249 - categorical_accuracy: 0.0000e+00
5120/105377 [>.............................] - ETA: 3:20:01 - loss: 10.9243 - categorical_accuracy: 0.0000e+00
5376/105377 [>.............................] - ETA: 3:19:37 - loss: 10.9219 - categorical_accuracy: 0.0000e+00
5632/105377 [>.............................] - ETA: 3:18:39 - loss: 10.9203 - categorical_accuracy: 0.0000e+00
5888/105377 [>.............................] - ETA: 3:18:13 - loss: 10.9172 - categorical_accuracy: 0.0000e+00
6144/105377 [>.............................] - ETA: 3:17:29 - loss: 10.9141 - categorical_accuracy: 0.0000e+00
6400/105377 [>.............................] - ETA: 3:16:36 - loss: 10.9140 - categorical_accuracy: 0.0000e+00
6656/105377 [>.............................] - ETA: 3:16:06 - loss: 10.9112 - categorical_accuracy: 0.0000e+00
6912/105377 [>.............................] - ETA: 3:15:35 - loss: 10.9086 - categorical_accuracy: 0.0000e+00
7168/105377 [=>............................] - ETA: 3:14:47 - loss: 10.9081 - categorical_accuracy: 0.0000e+00
7424/105377 [=>............................] - ETA: 3:14:22 - loss: 10.9066 - categorical_accuracy: 0.0000e+00
7680/105377 [=>............................] - ETA: 3:13:34 - loss: 10.9027 - categorical_accuracy: 0.0000e+00
7936/105377 [=>............................] - ETA: 3:12:54 - loss: 10.9030 - categorical_accuracy: 0.0000e+00
8192/105377 [=>............................] - ETA: 3:12:19 - loss: 10.9006 - categorical_accuracy: 1.2207e-04
8448/105377 [=>............................] - ETA: 3:11:42 - loss: 10.8998 - categorical_accuracy: 1.1837e-04
8704/105377 [=>............................] - ETA: 3:11:14 - loss: 10.8979 - categorical_accuracy: 1.1489e-04
8960/105377 [=>............................] - ETA: 3:10:29 - loss: 10.8941 - categorical_accuracy: 1.1161e-04
9216/105377 [=>............................] - ETA: 3:09:45 - loss: 10.8928 - categorical_accuracy: 1.0851e-04
9472/105377 [=>............................] - ETA: 3:09:18 - loss: 10.8904 - categorical_accuracy: 1.0557e-04
9728/105377 [=>............................] - ETA: 3:08:56 - loss: 10.8884 - categorical_accuracy: 1.0280e-04
9984/105377 [=>............................] - ETA: 3:08:16 - loss: 10.8885 - categorical_accuracy: 1.0016e-04
10240/105377 [=>............................] - ETA: 3:07:46 - loss: 10.8861 - categorical_accuracy: 9.7656e-05
10496/105377 [=>............................] - ETA: 3:07:04 - loss: 10.8835 - categorical_accuracy: 9.5274e-05
10752/105377 [==>...........................] - ETA: 3:06:34 - loss: 10.8816 - categorical_accuracy: 9.3006e-05
11008/105377 [==>...........................] - ETA: 3:06:00 - loss: 10.8809 - categorical_accuracy: 9.0843e-05
11264/105377 [==>...........................] - ETA: 3:05:20 - loss: 10.8791 - categorical_accuracy: 8.8778e-05
11520/105377 [==>...........................] - ETA: 3:04:52 - loss: 10.8792 - categorical_accuracy: 8.6806e-05
11776/105377 [==>...........................] - ETA: 3:04:13 - loss: 10.8781 - categorical_accuracy: 8.4918e-05
12032/105377 [==>...........................] - ETA: 3:03:35 - loss: 10.8762 - categorical_accuracy: 1.6622e-04
12288/105377 [==>...........................] - ETA: 3:03:22 - loss: 10.8743 - categorical_accuracy: 1.6276e-04
12544/105377 [==>...........................] - ETA: 3:02:43 - loss: 10.8726 - categorical_accuracy: 1.5944e-04
12800/105377 [==>...........................] - ETA: 3:02:13 - loss: 10.8720 - categorical_accuracy: 1.5625e-04
13056/105377 [==>...........................] - ETA: 3:01:36 - loss: 10.8715 - categorical_accuracy: 1.5319e-04
13312/105377 [==>...........................] - ETA: 3:00:58 - loss: 10.8698 - categorical_accuracy: 1.5024e-04
13568/105377 [==>...........................] - ETA: 3:00:34 - loss: 10.8682 - categorical_accuracy: 1.4741e-04
13824/105377 [==>...........................] - ETA: 2:59:58 - loss: 10.8673 - categorical_accuracy: 1.4468e-04
14080/105377 [===>..........................] - ETA: 2:59:21 - loss: 10.8651 - categorical_accuracy: 2.1307e-04
14336/105377 [===>..........................] - ETA: 2:58:55 - loss: 10.8639 - categorical_accuracy: 2.0926e-04
14592/105377 [===>..........................] - ETA: 2:58:19 - loss: 10.8617 - categorical_accuracy: 2.0559e-04
14848/105377 [===>..........................] - ETA: 2:57:48 - loss: 10.8601 - categorical_accuracy: 2.6940e-04
15104/105377 [===>..........................] - ETA: 2:57:16 - loss: 10.8586 - categorical_accuracy: 2.6483e-04
15360/105377 [===>..........................] - ETA: 2:56:40 - loss: 10.8570 - categorical_accuracy: 2.6042e-04
15616/105377 [===>..........................] - ETA: 2:56:13 - loss: 10.8548 - categorical_accuracy: 2.5615e-04
15872/105377 [===>..........................] - ETA: 2:55:38 - loss: 10.8537 - categorical_accuracy: 2.5202e-04
16128/105377 [===>..........................] - ETA: 2:55:03 - loss: 10.8516 - categorical_accuracy: 2.4802e-04
16384/105377 [===>..........................] - ETA: 2:54:41 - loss: 10.8497 - categorical_accuracy: 2.4414e-04
16640/105377 [===>..........................] - ETA: 2:54:05 - loss: 10.8480 - categorical_accuracy: 2.4038e-04
16896/105377 [===>..........................] - ETA: 2:53:34 - loss: 10.8463 - categorical_accuracy: 2.9593e-04
17152/105377 [===>..........................] - ETA: 2:53:02 - loss: 10.8441 - categorical_accuracy: 3.4981e-04
17408/105377 [===>..........................] - ETA: 2:52:27 - loss: 10.8421 - categorical_accuracy: 3.4467e-04
17664/105377 [====>.........................] - ETA: 2:52:09 - loss: 10.8413 - categorical_accuracy: 3.3967e-04
17920/105377 [====>.........................] - ETA: 2:51:34 - loss: 10.8394 - categorical_accuracy: 3.3482e-04
18176/105377 [====>.........................] - ETA: 2:50:59 - loss: 10.8386 - categorical_accuracy: 3.3011e-04
18432/105377 [====>.........................] - ETA: 2:50:32 - loss: 10.8376 - categorical_accuracy: 3.2552e-04
18688/105377 [====>.........................] - ETA: 2:49:57 - loss: 10.8374 - categorical_accuracy: 3.2106e-04
18944/105377 [====>.........................] - ETA: 2:49:30 - loss: 10.8359 - categorical_accuracy: 3.1672e-04
19200/105377 [====>.........................] - ETA: 2:48:59 - loss: 10.8354 - categorical_accuracy: 3.6458e-04
19456/105377 [====>.........................] - ETA: 2:48:24 - loss: 10.8332 - categorical_accuracy: 3.5979e-04
19712/105377 [====>.........................] - ETA: 2:47:55 - loss: 10.8316 - categorical_accuracy: 3.5511e-04
19968/105377 [====>.........................] - ETA: 2:47:21 - loss: 10.8297 - categorical_accuracy: 3.5056e-04
20224/105377 [====>.........................] - ETA: 2:46:53 - loss: 10.8281 - categorical_accuracy: 3.9557e-04
20480/105377 [====>.........................] - ETA: 2:46:26 - loss: 10.8267 - categorical_accuracy: 3.9063e-04
20736/105377 [====>.........................] - ETA: 2:45:51 - loss: 10.8239 - categorical_accuracy: 4.3403e-04
20992/105377 [====>.........................] - ETA: 2:45:19 - loss: 10.8226 - categorical_accuracy: 4.2873e-04
21248/105377 [=====>........................] - ETA: 2:44:50 - loss: 10.8211 - categorical_accuracy: 4.2357e-04
21504/105377 [=====>........................] - ETA: 2:44:20 - loss: 10.8197 - categorical_accuracy: 4.1853e-04
21760/105377 [=====>........................] - ETA: 2:43:51 - loss: 10.8189 - categorical_accuracy: 4.5956e-04
22016/105377 [=====>........................] - ETA: 2:43:17 - loss: 10.8173 - categorical_accuracy: 4.5422e-04
22272/105377 [=====>........................] - ETA: 2:42:43 - loss: 10.8155 - categorical_accuracy: 5.3879e-04
22528/105377 [=====>........................] - ETA: 2:42:16 - loss: 10.8150 - categorical_accuracy: 5.3267e-04
22784/105377 [=====>........................] - ETA: 2:41:43 - loss: 10.8138 - categorical_accuracy: 5.7058e-04
23040/105377 [=====>........................] - ETA: 2:41:15 - loss: 10.8126 - categorical_accuracy: 5.6424e-04
23296/105377 [=====>........................] - ETA: 2:40:47 - loss: 10.8115 - categorical_accuracy: 6.0096e-04
23552/105377 [=====>........................] - ETA: 2:40:14 - loss: 10.8098 - categorical_accuracy: 5.9443e-04
23808/105377 [=====>........................] - ETA: 2:39:44 - loss: 10.8085 - categorical_accuracy: 5.8804e-04
24064/105377 [=====>........................] - ETA: 2:39:13 - loss: 10.8077 - categorical_accuracy: 5.8178e-04
24320/105377 [=====>........................] - ETA: 2:38:39 - loss: 10.8054 - categorical_accuracy: 5.7566e-04
24576/105377 [=====>........................] - ETA: 2:38:10 - loss: 10.8045 - categorical_accuracy: 6.1035e-04
24832/105377 [======>.......................] - ETA: 2:37:36 - loss: 10.8025 - categorical_accuracy: 6.0406e-04
25088/105377 [======>.......................] - ETA: 2:37:04 - loss: 10.8015 - categorical_accuracy: 5.9790e-04
25344/105377 [======>.......................] - ETA: 2:36:36 - loss: 10.8003 - categorical_accuracy: 5.9186e-04
25600/105377 [======>.......................] - ETA: 2:36:08 - loss: 10.7987 - categorical_accuracy: 5.8594e-04
[311 per-batch progress frames elided for epoch 1: over steps 25856-105216 of 105377 the running loss fell steadily from 10.7972 to 10.5508, categorical_accuracy rose from 5.8014e-04 to 0.0026, and the ETA counted down from 2:35:38 to 18s.]
105377/105377 [==============================] - 12664s 120ms/step - loss: 10.5506 - categorical_accuracy: 0.0026 - val_loss: 12.0025 - val_categorical_accuracy: 0.0000e+00
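(The line above is the epoch-end summary, and it is the only kind of line in this log that carries the val_loss / val_categorical_accuracy fields. For readers who want just those summaries without the per-batch frames, here is a minimal parsing sketch; it assumes the line format shown above, and the filename model_output.txt, the regexes, and the epoch_summaries helper are illustrative, not part of the original training run.)

import re

# Minimal sketch for extracting epoch-end summaries from a captured
# Keras verbose=1 log like this one. The path and regexes are assumptions
# made for illustration; they are not from the original training code.
EPOCH_RE = re.compile(r"^Epoch (\d+)/\d+")
# Only epoch-end lines carry the val_* metrics, so requiring them here
# skips the thousands of per-batch progress frames.
SUMMARY_RE = re.compile(
    r"loss: ([\d.e+-]+) - categorical_accuracy: ([\d.e+-]+)"
    r" - val_loss: ([\d.e+-]+) - val_categorical_accuracy: ([\d.e+-]+)"
)

def epoch_summaries(path="model_output.txt"):  # hypothetical filename
    """Yield (epoch, loss, acc, val_loss, val_acc) per completed epoch."""
    epoch = 0  # updated whenever an "Epoch n/15" header line is seen
    with open(path) as fh:
        for line in fh:
            header = EPOCH_RE.match(line)
            if header:
                epoch = int(header.group(1))
            summary = SUMMARY_RE.search(line)
            if summary:
                yield (epoch, *(float(g) for g in summary.groups()))

for row in epoch_summaries():
    print(row)

(On this excerpt it would print a single row for epoch 1: (1, 10.5506, 0.0026, 12.0025, 0.0). Anchoring the filter on the val_* fields is the design choice that makes the per-batch frames fall through.)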
Epoch 2/15
[110 per-batch progress frames elided for epoch 2: from step 256/105377 (loss 10.3552, categorical_accuracy 0.0078) the running loss drifted down to 10.2506 by step 28160/105377 (ETA 2:31:06), with categorical_accuracy fluctuating between roughly 0.007 and 0.010.]