Answer by user201857 for Why are neural networks becoming deeper, but not wider?
Currently, on GPUs we use 32-bit floats, and when combining 512 features, the result already becomes quite imprecise. Going even wider is hence limited by the numerics and precision of 32-bit floats. Another...
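A small sketch of the precision limit this answer alludes to (the setup below is my own illustration, not taken from the answer): sequentially accumulating many small float32 values drifts away from a float64 reference, because once the accumulator is large, each addend is rounded to a coarse grid.

```python
import numpy as np

# Illustrative only: 10 million copies of 0.1 summed sequentially.
x = np.full(10_000_000, 0.1, dtype=np.float32)

# cumsum accumulates sequentially, so rounding error compounds in float32.
s32 = np.cumsum(x, dtype=np.float32)[-1]  # float32 running sum
s64 = np.cumsum(x, dtype=np.float64)[-1]  # float64 reference

print(float(s32), float(s64))  # the float32 sum drifts far from 1,000,000
```

The float64 result stays within a fraction of a unit of the exact value, while the float32 running sum is off by tens of thousands; this is the kind of numeric ceiling that combining many features in 32-bit arithmetic runs into.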
Answer by Ishan Vijay Ghutake for Why are neural networks becoming deeper, but not wider?
I think you can get a detailed answer to this question from the paper named Impact of fully connected layers on performance of convolutional neural networks for image classification, link -...
Answer by Charles Staats for Why are neural networks becoming deeper, but not wider?
For a densely connected neural net of depth $d$ and width $w$, the number of parameters (hence, RAM required to run or train the network) is $O(dw^2)$. Thus, if you only have a limited number of...
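The $O(dw^2)$ count from this answer can be checked with a quick sketch (the helper function and the example depths/widths are my own, chosen only to make the scaling visible):

```python
# Parameter count for a dense net of depth d and width w, ignoring biases
# and the input/output layers for simplicity: each of the d hidden layers
# owns a w-by-w weight matrix, giving d * w^2 parameters in total.
def dense_param_count(depth: int, width: int) -> int:
    return depth * width * width

print(dense_param_count(10, 512))   # baseline
print(dense_param_count(10, 1024))  # doubling width -> 4x the parameters
print(dense_param_count(20, 512))   # doubling depth -> only 2x
```

The asymmetry is the point of the answer: under a fixed RAM budget, extra depth is much cheaper than extra width.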
Answer by Aksakal for Why are neural networks becoming deeper, but not wider?
Adding more features helps, but the benefit quickly becomes marginal after many features have been added. That's one reason why tools like PCA work: a few components capture most of the variance in the features....
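The PCA observation in this answer can be sketched numerically (the latent-factor setup and dimensions below are my own assumptions, used only to illustrate the claim): when many observed features are driven by a few underlying factors, the covariance spectrum is dominated by a few eigenvalues, so a few components capture almost all the variance.

```python
import numpy as np

rng = np.random.default_rng(0)
latent = rng.standard_normal((1000, 3))   # 3 hidden factors, 1000 samples
mixing = rng.standard_normal((3, 50))     # mixed into 50 observed features
X = latent @ mixing + 0.01 * rng.standard_normal((1000, 50))  # small noise

X -= X.mean(axis=0)                       # center before PCA
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]  # descending
explained = eigvals[:3].sum() / eigvals.sum()

print(explained)  # close to 1: the top 3 components carry nearly all variance
```

With real data the cutoff is rarely this sharp, but the same diminishing-returns shape is what makes adding ever more features marginal.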
Answer by Borbei for Why are neural networks becoming deeper, but not wider?
I don't think there is a definite answer to your questions, but I think the conventional wisdom goes as follows: basically, as the hypothesis space of a learning algorithm grows, the algorithm can...
Answer by J. O'Brien Antognini for Why are neural networks becoming deeper, but not wider?
As a disclaimer, I work on neural nets in my research, but I generally use relatively small, shallow neural nets rather than the really deep networks at the cutting edge of research you cite in your...
Why are neural networks becoming deeper, but not wider?
In recent years, convolutional neural networks (or perhaps deep neural networks in general) have become deeper and deeper, with state-of-the-art networks going from 7 layers (AlexNet) to 1000 layers...