LoopyDenseNet: Combining Skip Connections, Dense Connectivity and Loops within a Convolutional Neural Network

Publications: Thesis / Final theses and habilitation treatises › Master's thesis


LoopyDenseNet: Combining Skip Connections, Dense Connectivity and Loops within a Convolutional Neural Network. / Niederl, Peter.
2022.


Bibtex - Download

@mastersthesis{5bd6aa3bb4094ed1923b616f83a6368d,
title = "LoopyDenseNet: Combining Skip Connections, Dense Connectivity and Loops within a Convolutional Neural Network",
abstract = "Convolutional neural networks (CNNs) have achieved remarkable results in visual object recognition. Using convolutional layers, filters are trained to detect distinct features, which enable the network to correctly classify different objects. A traditional CNN follows a hierarchical structure in which every layer is used exactly once. In this work a new network architecture is proposed that uses convolutional layers multiple times by looping them. The subsequent convolutional layers thereby receive more refined feature maps of different origins. It is shown experimentally that looping convolutional operations can have a shifting effect on the detected features, such that the network focuses on certain features in certain regions of the input, depending on the filter. Furthermore, a new type of skip connection is presented, which makes more information available at the flatten layer and strengthens feature propagation. By looping convolutions the network is very parameter-efficient while still being able to create diverse feature maps. To build deeper models with the proposed architecture, methods are given to reduce computational costs and parameter counts. The proposed architecture is compared to the traditional CNN architecture on five datasets (MNIST, Fashion-MNIST, CIFAR-10, Fruits-360, Hand gesture), showing superior or similar results on most datasets at comparable computational cost.",
keywords = "Convolutional Neural Networks, DenseNet, Convolutional loop, Skip Connection, Object classification",
author = "Peter Niederl",
note = "no embargo",
year = "2022",
language = "English",
school = "Montanuniversitaet Leoben (000)",

}

RIS (suitable for import to EndNote) - Download

TY - THES

T1 - LoopyDenseNet

T2 - Combining Skip Connections, Dense Connectivity and Loops within a Convolutional Neural Network

AU - Niederl, Peter

N1 - no embargo

PY - 2022

Y1 - 2022

N2 - Convolutional neural networks (CNNs) have achieved remarkable results in visual object recognition. Using convolutional layers, filters are trained to detect distinct features, which enable the network to correctly classify different objects. A traditional CNN follows a hierarchical structure in which every layer is used exactly once. In this work a new network architecture is proposed that uses convolutional layers multiple times by looping them. The subsequent convolutional layers thereby receive more refined feature maps of different origins. It is shown experimentally that looping convolutional operations can have a shifting effect on the detected features, such that the network focuses on certain features in certain regions of the input, depending on the filter. Furthermore, a new type of skip connection is presented, which makes more information available at the flatten layer and strengthens feature propagation. By looping convolutions the network is very parameter-efficient while still being able to create diverse feature maps. To build deeper models with the proposed architecture, methods are given to reduce computational costs and parameter counts. The proposed architecture is compared to the traditional CNN architecture on five datasets (MNIST, Fashion-MNIST, CIFAR-10, Fruits-360, Hand gesture), showing superior or similar results on most datasets at comparable computational cost.

AB - Convolutional neural networks (CNNs) have achieved remarkable results in visual object recognition. Using convolutional layers, filters are trained to detect distinct features, which enable the network to correctly classify different objects. A traditional CNN follows a hierarchical structure in which every layer is used exactly once. In this work a new network architecture is proposed that uses convolutional layers multiple times by looping them. The subsequent convolutional layers thereby receive more refined feature maps of different origins. It is shown experimentally that looping convolutional operations can have a shifting effect on the detected features, such that the network focuses on certain features in certain regions of the input, depending on the filter. Furthermore, a new type of skip connection is presented, which makes more information available at the flatten layer and strengthens feature propagation. By looping convolutions the network is very parameter-efficient while still being able to create diverse feature maps. To build deeper models with the proposed architecture, methods are given to reduce computational costs and parameter counts. The proposed architecture is compared to the traditional CNN architecture on five datasets (MNIST, Fashion-MNIST, CIFAR-10, Fruits-360, Hand gesture), showing superior or similar results on most datasets at comparable computational cost.

KW - Convolutional Neural Networks

KW - DenseNet

KW - Convolutional loop

KW - Skip Connection

KW - Object classification

M3 - Master's Thesis

ER -
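The core idea from the abstract — reusing one convolution's parameters across several "loop" iterations while keeping every intermediate feature map available, DenseNet-style — can be sketched minimally as follows. This is an illustrative reconstruction, not the thesis's actual implementation: the kernel, activation, and loop count are arbitrary choices, and `conv2d_valid` and `loopy_block` are hypothetical helper names.

```python
import numpy as np

def conv2d_valid(x, k):
    """Minimal 'valid'-mode 2D cross-correlation of x with kernel k."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def loopy_block(x, kernel, loops=3):
    """Apply the SAME kernel repeatedly (a 'convolutional loop') and keep
    every intermediate map (dense-style reuse).  One kernel's parameters
    serve all `loops` passes, so the parameter count stays constant while
    the collection of feature maps grows."""
    maps = []
    fm = x
    for _ in range(loops):
        fm = np.maximum(conv2d_valid(fm, kernel), 0.0)  # conv + ReLU
        maps.append(fm)
    return maps

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))   # toy single-channel input
k = rng.standard_normal((3, 3))     # one 3x3 kernel = 9 parameters total
maps = loopy_block(x, k, loops=3)
print([m.shape for m in maps])      # [(14, 14), (12, 12), (10, 10)]
```

Three feature maps are produced from a single 9-parameter kernel, which is the parameter-efficiency argument the abstract makes; a real network would add channels, padding, and the proposed skip connections feeding the flatten layer.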