Citation Relationships


Windisch D (2005) Loading Deep Networks Is Hard: The Pyramidal Case. Neural Comput 17:487-502

References and models cited by this paper

Bartlett P, Ben-David S (2002) Hardness results for neural network approximation problems. Theoretical Computer Science 284:53-66
Blum A, Rivest RL (1992) Training a three-node neural network is NP-complete. Neural Netw 5:117-127
Dasgupta B, Hammer B (2000) On approximate learning by multilayered feedforward circuits. Algorithmic Learning Theory 2000, Arimura H, Jain S, Sharma A, eds. pp. 264
de Souto MCP, de Oliveira WR (1999) The loading problem for pyramidal neural networks. Electronic Journal on Mathematics of Computation
Garey MR, Johnson DS (1979) Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman
Hammer B (1998) Some complexity results for perceptron networks. International Conference on Artificial Neural Networks 98, Niklasson L, Boden M, Ziemke T, eds. pp. 639
Hammer B (1998) Training a sigmoidal network is difficult. European Symposium on Artificial Neural Networks 98, Verleysen M, ed. pp. 255
Höffgen KU, Simon HU, Van Horn KS (1995) Robust trainability of single neurons. J Comput Syst Sci 50:114-125
Judd JS (1990) Neural Network Design and the Complexity of Learning. MIT Press
Muroga S, Toda I, Takasu S (1961) Theory of majority decision elements. Journal of the Franklin Institute 271:376-418
Rumelhart DE, Hinton GE, Williams RJ (1986) Learning representations by back-propagating errors. Nature 323:533-536
Šíma J (1994) Loading deep networks is hard. Neural Comput 6:842-850
Šíma J (1996) Back-propagation is not efficient. Neural Netw 9:1017-1023
Šíma J (2002) Training a single sigmoidal neuron is hard. Neural Comput 14:2709-2728
Vu VH (1998) On the infeasibility of training neural networks with small squared errors. Advances in Neural Information Processing Systems, Jordan MI, Kearns MJ, Solla SA, eds. pp. 371
(15 refs)