All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the data remain unchanged, so the convolutions inside a dense block all have stride 1. Pooling layers are inserted between dense blocks for downsampling.
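As a minimal sketch of why stride-1 convolutions matter here (NumPy, with hypothetical channel counts and a growth rate of 12): channel-wise concatenation only works when every feature map in the block shares the same height and width, so channels accumulate while spatial dimensions stay fixed.

```python
import numpy as np

def dense_concat(features):
    """Concatenate feature maps along the channel axis (NCHW layout).

    Requires all maps to share the same spatial dimensions, which is
    guaranteed inside a dense block because every convolution has stride 1.
    """
    h, w = features[0].shape[-2:]
    for f in features:
        assert f.shape[-2:] == (h, w), "spatial dims must match for concat"
    return np.concatenate(features, axis=1)

# Hypothetical example: two stride-1 layer outputs on a 32x32 input.
x0 = np.zeros((1, 24, 32, 32))   # block input, 24 channels
x1 = np.zeros((1, 12, 32, 32))   # output of first conv layer (growth rate 12)
x2 = np.zeros((1, 12, 32, 32))   # output of second conv layer

out = dense_concat([x0, x1, x2])
print(out.shape)                 # (1, 48, 32, 32): channels accumulate, H and W unchanged
```

A stride-2 convolution would halve the spatial dimensions and make the concatenation above fail, which is exactly why downsampling is deferred to the pooling layers between blocks.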