Vapnik-Chervonenkis Dimension in Neural Networks
Abstract
This thesis explores the potential of statistical concepts, specifically the
Vapnik-Chervonenkis dimension (VCD) [33], for optimizing neural networks. As
neural networks increasingly take over tasks once performed by human labor,
ensuring the safety and reliability of these systems is a critical concern. The
thesis addresses the question of how to test the safety of neural networks and
how to optimize them using accessible statistical concepts.
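For reference, the standard definition of the VC dimension from statistical
learning theory is sketched below in generic notation; the symbols used here
are illustrative and not taken from the thesis itself.

    % Standard definition; notation is generic, not the thesis's own.
    % A hypothesis class H shatters a finite set S if it realizes every
    % possible binary labeling of S.
    \[
      \mathcal{H} \text{ shatters } S = \{x_1,\dots,x_n\}
      \iff
      \bigl|\{(h(x_1),\dots,h(x_n)) : h \in \mathcal{H}\}\bigr| = 2^{n},
    \]
    \[
      \mathrm{VCdim}(\mathcal{H})
      = \max\{\, n : \text{some } S \text{ with } |S| = n
        \text{ is shattered by } \mathcal{H} \,\}.
    \]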
The thesis presents two case studies to demonstrate the effectiveness of using
the VCD to optimize neural networks. The first case study focuses on optimizing
an autoencoder, a neural network composed of an encoding and a decoding
function, by calculating its VCD. The conclusion suggests that, at the
mathematical level, optimizing the choice of activation function can improve
the accuracy of the autoencoder.
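To make this concrete, the following is a minimal sketch of an autoencoder with
a swappable activation function, written in PyTorch; the layer sizes (784 and
64) and the framework choice are illustrative assumptions, not the model
actually analyzed in the thesis.

    import torch.nn as nn

    # Minimal autoencoder sketch (hypothetical sizes: 784 -> 64 -> 784).
    # The activation is a constructor argument so that different choices
    # (ReLU, Tanh, ...) can be compared, as the first case study suggests.
    class Autoencoder(nn.Module):
        def __init__(self, activation=nn.ReLU):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(784, 64), activation())
            self.decoder = nn.Sequential(nn.Linear(64, 784), nn.Sigmoid())

        def forward(self, x):
            # Reconstruction: decode the encoded input.
            return self.decoder(self.encoder(x))

    # Swapping the activation, e.g. Autoencoder(activation=nn.Tanh),
    # changes the function class and hence its capacity.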
The second case study explores the optimization of the VGG16 network by
comparing it with VGG19 in terms of their ability to process high-density data.
With three additional hidden layers, VGG19 outperforms VGG16 in learning
ability, suggesting that adjusting the number of layers is an effective way to
analyze the capacity of a neural network.
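The three-layer difference can be read directly from the networks' published
configurations. The sketch below counts weight-bearing layers from the standard
VGG "D" (VGG16) and "E" (VGG19) configurations of Simonyan and Zisserman, where
numbers denote convolutional output channels and 'M' marks a weight-free
max-pooling layer:

    # VGG configurations (Simonyan & Zisserman, 2014).
    VGG16_CFG = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M',
                 512, 512, 512, 'M', 512, 512, 512, 'M']
    VGG19_CFG = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 256, 'M',
                 512, 512, 512, 512, 'M', 512, 512, 512, 512, 'M']

    def weight_layers(cfg):
        # Convolutional layers in the config plus the 3 fully connected layers.
        return sum(1 for v in cfg if v != 'M') + 3

    print(weight_layers(VGG16_CFG))  # 16 (13 conv + 3 FC)
    print(weight_layers(VGG19_CFG))  # 19 (16 conv + 3 FC): 3 extra conv layers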
Overall, this thesis proposes that statistical concepts such as the VCD offer a
promising avenue for analyzing neural networks, contributing to the development
of more reliable and efficient machine learning systems. The ultimate vision is
to apply mathematical models judiciously within machine learning and to
establish an idealized framework for building neural networks, allowing their
safe and effective use across industries.