Why are neural networks becoming deeper, but not wider? - Cross Validated

Answer by user201857 for Why are neural networks becoming deeper, but not wider?

Currently, on GPUs we use 32-bit floats, and with 512 features being combined, the result is already quite imprecise. Going even wider is therefore limited by the numerics and precision of the 32-bit float. Another...
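As a rough illustration of this precision limit, here is a minimal NumPy sketch; the choice of 512 standard-normal values is an assumption made to mirror the "512 features" above:

```python
import numpy as np

# Accumulating 512 values in float32 vs. a float64 reference:
# the float32 sum drifts because each addition rounds to 24-bit precision.
rng = np.random.default_rng(0)
x = rng.standard_normal(512).astype(np.float32)

sum32 = np.float32(0.0)
for v in x:                         # naive sequential float32 accumulation
    sum32 += v
sum64 = x.astype(np.float64).sum()  # higher-precision reference

print(f"float32 accumulation: {float(sum32):.10f}")
print(f"float64 reference:    {sum64:.10f}")
print(f"absolute error:       {abs(float(sum32) - sum64):.2e}")
```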




Answer by Ishan Vijay Ghutake for Why are neural networks becoming deeper, but not wider?

I think you can get a detailed answer to this question through the paper named Impact of fully connected layers on performance of convolutional neural networks for image classification, link -...


Answer by Charles Staats for Why are neural networks becoming deeper, but not wider?

For a densely connected neural net of depth $d$ and width $w$, the number of parameters (hence, RAM required to run or train the network) is $O(dw^2)$. Thus, if you only have a limited number of...
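That scaling is easy to check by counting weights and biases directly. A minimal sketch, where the helper name and the layer-size choices are illustrative assumptions rather than anything from the answer:

```python
def dense_param_count(depth: int, width: int, n_in: int, n_out: int) -> int:
    """Weights + biases of a fully connected net with `depth` hidden
    layers of size `width`: the hidden-to-hidden term dominates, O(d*w^2)."""
    total = (n_in + 1) * width                  # input -> first hidden
    total += (depth - 1) * (width + 1) * width  # hidden -> hidden
    total += (width + 1) * n_out                # last hidden -> output
    return total

# Doubling the width roughly quadruples the parameter count,
# while doubling the depth only doubles it:
print(dense_param_count(depth=10, width=256, n_in=784, n_out=10))  # ~0.8M
print(dense_param_count(depth=10, width=512, n_in=784, n_out=10))  # ~2.8M
print(dense_param_count(depth=20, width=256, n_in=784, n_out=10))  # ~1.5M
```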


Answer by Aksakal for Why are neural networks becoming deeper, but not wider?

Adding more features helps, but the benefit quickly becomes marginal once many features have been added. That's one reason why tools like PCA work: a few components capture most of the variance in the features....
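A small scikit-learn sketch of that PCA point (assuming scikit-learn is available; the digits dataset is just an illustrative choice, not one from the answer):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X = load_digits().data                      # 64 pixel features per image
cum = np.cumsum(PCA().fit(X).explained_variance_ratio_)

# Smallest component count whose cumulative explained variance reaches 90%.
k = int(np.searchsorted(cum, 0.90)) + 1
print(f"{k} of {X.shape[1]} components explain 90% of the variance")
```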


Answer by Borbei for Why are neural networks becoming deeper, but not wider?

I don't think there is a definitive answer to your questions. But I think the conventional wisdom goes as follows: basically, as the hypothesis space of a learning algorithm grows, the algorithm can...



Answer by J. O'Brien Antognini for Why are neural networks becoming deeper, but not wider?

As a disclaimer, I work on neural nets in my research, but I generally use relatively small, shallow neural nets rather than the really deep networks at the cutting edge of research you cite in your...


Why are neural networks becoming deeper, but not wider?

In recent years, convolutional neural networks (or perhaps deep neural networks in general) have become deeper and deeper, with state-of-the-art networks going from 8 layers (AlexNet) to 1000 layers...
