Convolutional Neural Network Architecture - An Overview
All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the input stay unchanged, so all convolutions inside a dense block use stride 1. Pooling layers are inserted between dense blocks for further dimensionality reduction.
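To make this concrete, here is a minimal sketch of a dense block in PyTorch. The growth rate, layer count, and channel sizes are illustrative assumptions, not values taken from any particular DenseNet configuration; the point is that stride-1, padded convolutions preserve height and width so channel-wise concatenation stays valid, and pooling between blocks handles the downsampling.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """One BN -> ReLU -> 3x3 conv unit; stride 1 with padding keeps H and W unchanged."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1, bias=False)

    def forward(self, x):
        out = self.conv(torch.relu(self.bn(x)))
        # Channel-wise concatenation: only valid because H and W match.
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """Stack of dense layers; channel count grows by growth_rate per layer."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        layers = []
        channels = in_channels
        for _ in range(num_layers):
            layers.append(DenseLayer(channels, growth_rate))
            channels += growth_rate
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        return self.block(x)

# Pooling between dense blocks reduces the spatial dimensions
# (sizes here are hypothetical, chosen for the demo).
model = nn.Sequential(
    DenseBlock(in_channels=64, growth_rate=32, num_layers=4),
    nn.AvgPool2d(kernel_size=2, stride=2),  # halves H and W between blocks
    DenseBlock(in_channels=64 + 4 * 32, growth_rate=32, num_layers=4),
)

x = torch.randn(1, 64, 32, 32)
print(model(x).shape)  # torch.Size([1, 320, 16, 16])
```

Note how the channel count grows inside each block (64 becomes 64 + 4 x 32 = 192 after four layers) while height and width stay fixed; only the pooling step between blocks changes the spatial resolution.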