Citation Relationships

Jacobsson H (2005) Rule extraction from recurrent neural networks: A taxonomy and review Neural Comput 17:1223-1263

References and models cited by this paper

Alquezar R, Sanfeliu A (1994) A hybrid connectionist symbolic approach to regular grammar inference based on neural learning and hierarchical clustering Proc ICGI94 :203-211
Alquezar R, Sanfeliu A (1994) Inference and recognition of regular grammars by training recurrent neural networks to learn the next-symbol prediction task Advances in pattern recognition and applications: Selected papers from the Vth Spanish Symposium on Pattern Recognition and Image Analysis, Casacuberta F:Sanfeliu A, ed. pp.48
Alquezar R, Sanfeliu A, Sainz M (1997) Experimental assessment of connectionist regular inference from positive and negative examples VII Simposium Nacional De Reconocimiento De Formas Y Analisis De Imagenes 1:49-54
Andrews R, Diederich J, Tickle AB (1995) Survey and critique of techniques for extracting rules from trained artificial neural networks Knowledge Based Systems 8:373-389
Bakker B (2004) The state of mind: Reinforcement learning with recurrent neural networks Unpublished doctoral dissertation
Bakker B, de Jong M (2000) The epsilon state count From animals to animats 6: Proceedings of the Sixth International Conference on Simulation of Adaptive Behavior, Meyer JA:Berthoz A:Floreano D:Roitblat H:Wilson S, ed. pp.51
Barreto Gde A, Araújo AF, Kremer SC (2003) A taxonomy for spatiotemporal connectionist networks revisited: the unsupervised case. Neural Comput 15:1255-320 [Journal] [PubMed]
Bengio Y, Simard P, Frasconi P (1994) Learning long-term dependencies with gradient descent is difficult. IEEE Trans Neural Netw 5:157-66 [Journal] [PubMed]
Blair A, Pollack J (1997) Analysis of dynamical recognizers Neural Comput 9:1127-1142
Blanco A, Delgado M, Pegalajar MC (2000) Extracting rules from a (fuzzy-crisp) recurrent neural network using a self-organizing map Int J Intell Syst 15:595-621
Boden M, Jacobsson H, Ziemke T (2000) Evolving context-free language predictors Proceedings of the Genetic and Evolutionary Computation Conference :1033-1040
Boden M, Wiles J, Tonkes B, Blair A (1999) Learning to predict a context-free language: Analysis of dynamics in recurrent hidden units Proc ICANN99 99:359-364
Bruske J, Sommer G (1995) Dynamic cell structure learns perfectly topology preserving map Neural Comput 7:845-865
Bullinaria JA (1997) Analyzing the internal representations of trained artificial neural networks Neural network analysis, architectures and applications, Browne A, ed. pp.3
Carrasco RC, Forcada ML (2001) Simple strategies to encode tree automata in sigmoid recursive neural networks IEEE Trans Know Data Eng 13:148-156
Carrasco RC, Forcada ML, Valdés-Muñoz MA, Neco RP (2000) Stable encoding of finite-state machines in discrete-time recurrent neural nets with sigmoid units. Neural Comput 12:2129-74 [PubMed]
Casey M (1996) The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction. Neural Comput 8:1135-78 [PubMed]
Cechin AL, Pechmann Simon DR, Stertz K (2003) State automata extraction from recurrent neural nets using k-means and fuzzy clustering XXIII International Conference of the Chilean Computer Science Society :73-78
Cicchello O, Kremer SC (2003) Inducing grammars from sparse data sets: A survey of algorithms and results J Mach Learn Res 4:603-632
Cleeremans A, Servan-Schreiber D, Mcclelland JL (1989) Finite state automata and simple recurrent networks Neural Comput 1:372-381
Craven MW, Shavlik JW (1994) Using sampling and queries to extract rules from trained neural networks Machine learning: Proceedings of the Eleventh International Conference, Cohen WW:Hirsh H, ed.
Craven MW, Shavlik JW (1996) Extracting tree-structured representations of trained networks Advances in neural information processing systems, Touretzky D:Mozer M:Hasselmo M, ed. pp.24
Craven MW, Shavlik JW (1999) Rule extraction: Where do we go from here? Tech. Rep. No. Machine Learning Research Group Working Paper 99-1
Crutchfield JP (1994) The calculi of emergence: Computation, dynamics, and induction Physica D 75:11-54
Crutchfield JP, Young K (1990) Computation at the onset of chaos Complexity, entropy and the physics of information, Zurek W, ed.
Das S, Das R (1991) Induction of discrete-state machine by stabilizing a simple recurrent network using clustering Computer Science And Information 21:35-40
Das S, Giles CL, Sun GZ (1993) Using prior knowledge in a NNPDA to learn context-free languages Advances in neural information processing systems, Hanson SJ:Cowan JD:Giles CL, ed. pp.65
Das S, Mozer MC (1994) A unified gradient-descent-clustering architecture for finite state machine induction Advances in neural information processing systems, Cowan JD:Tesauro G:Alspector J, ed. pp.19
Elman JL (1990) Finding structure in time Cognitive Science 14:179-211
Forcada ML (2002) Neural networks: Automata and formal models of computation An unfinished survey (Available online at: http://www.dlsi.ua.es/~mlf/nnafmc/)
Forcada ML, Carrasco RC (2001) Finite-state computation in analog neural networks: Steps towards biologically plausible models? Emergent computational models based on neuroscience, Wermter S:Austin J:Willshaw D, ed.
Frasconi P, Gori M, Maggini M, Soda G (1996) Representation of finite state automata in recurrent radial basis function networks Mach Learn 23:5-32
Gers FA, Schmidhuber J (2001) LSTM recurrent networks learn simple context-free and context-sensitive languages. IEEE Trans Neural Netw 12:1333-40 [Journal] [PubMed]
Giles CL, Chen D, Miller C, Chen H, Sun G, Lee Y (1991) Second-order recurrent neural networks for grammatical inference Proceedings Of International Joint Conference On Neural Networks 2:273-281
Giles CL, Horne BG, Lin T (1995) Learning a class of large finite state machines with a recurrent neural network Neural Netw 8:1359-1365
Giles CL, Lawrence S, Tsoi A (1997) Rule inference for financial prediction using recurrent neural networks Proceedings of IEEE-IAFE Conference on Computational Intelligence for Financial Engineering (CIFEr) :253-259
Giles CL, Lawrence S, Tsoi AC (2001) Noisy time series prediction using a recurrent neural network and grammatical inference Mach Learn 44:161-183
Giles CL, Miller CB, Chen D, Chen HH, Sun GZ, Lee YC (1992) Learning and extracting finite state automata with second-order recurrent neural networks Neural Comput 4:393-405
Giles CL, Miller CB, Chen D, Sun GZ, Chen HH, Lee YC (1992) Extracting and learning an unknown grammar with recurrent neural networks Advances in neural information processing systems, Moody JE:Hanson SJ:Lippman RP, ed. pp.317
Giles CL, Omlin CW (1993) Insertion and refinement of production rules in recurrent neural networks Connection Science 5:307-377
Giles CL, Omlin CW (1994) Pruning recurrent neural networks for improved generalization performance. IEEE Trans Neural Netw 5:848-51 [Journal] [PubMed]
Golea M (1996) On the complexity of rule extraction from neural networks and network-querying Tech Rep
Gori M, Maggini M, Martinelli E, Soda G (1998) Inductive inference from noisy examples using the hybrid finite state filter. IEEE Trans Neural Netw 9:571-5 [Journal] [PubMed]
Gori M, Maggini M, Soda G (1994) Scheduling of modular architectures for inductive inference of regular grammars ECAI94 Workshop on Combining Symbolic and Connectionist Processing :78-87
Goudreau MW, Giles CL (1995) Using recurrent neural networks to learn the structure of interconnection networks Neural Netw 8:793-804
Goudreau MW, Giles CL, Chakradhar ST, Chen D (1994) First-order vs. second-order single layer recurrent neural networks IEEE Trans Neural Networks 5:511-518
Hammer B, Tino P (2003) Recurrent neural networks with small weights implement definite memory machines Neural Comput 15:1897-1929
Hinton GE (1990) Mapping part-whole hierarchies into connectionist networks Art Intell 46:47-75
Hopcroft J, Ullman J (1979) Introduction to automata theory, languages, and computation
Horne BG, Giles CL (1995) An experimental comparison of recurrent neural networks Advances in neural information processing systems, Tesauro G:Touretzky D:Leen T, ed. pp.697
Horne BG, Hush DR (1994) Bounds on the complexity of recurrent neural network implementations of finite state machines Advances in neural information processing systems, Cowan JD:Tesauro G:Alspector J, ed. pp.359
Jacobsson H, Ziemke T (2003) Improving procedures for evaluation of connectionist context-free language predictors. IEEE Trans Neural Netw 14:963-6 [Journal] [PubMed]
Jacobsson H, Ziemke T (2003) Reducing complexity of rule extraction from prediction RNNs through domain interaction Tech. Rep. No. HS-IDA-TR-03-007
Jaeger H (2003) Adaptive nonlinear system identification with echo state networks Advances in neural information processing systems, Becker S:Thrun S:Obermayer K, ed. pp.593
Jagota A, Plate T, Shastri L, Sun R (1999) Connectionist symbol processing: Dead or alive? Neural Computing Surveys 2:1-40
Jain AK, Murty MN, Flynn PJ (1999) Data clustering: A review ACM Computing Surveys 31:264-323
Kohonen T (1995) Self-organizing Maps
Kolen J, Pollack J (1995) The observers' paradox: Apparent computational complexity in physical systems J Exp Theoret Art Intell 7:253-277
Kolen JF (1993) Fool's gold: Extracting finite state machines from recurrent network dynamics Neural information processing systems, Cowan J:Tesauro G:Alspector J, ed. pp.501
Kolen JF (1994) Exploring the computational capabilities of recurrent neural networks Unpublished doctoral dissertation
Kolen JF, Kremer SC (2001) A field guide to dynamical recurrent networks, Kolen JF:Kremer SC, ed.
Kremer SC (2001) Spatiotemporal connectionist networks: A taxonomy and review Neural Comput 13:248-306
Kuhn TS (1962) The structure of scientific revolutions
Lawrence S, Giles CL, Fong S (2000) Natural language grammatical inference with recurrent neural networks IEEE Transactions On Knowledge And Data Engineering 12:126-140
Lawrence S, Giles CL, Tsoi AC (1998) Symbolic conversion, grammatical inference and rule extraction for foreign exchange rate prediction Neural networks in the capital markets NNCM96, Abu-Mostafa Y:Weigend AS:Refenes P, ed. pp.333
Maggini M (1998) Recursive neural networks and automata Adaptive processing of sequences and data structures, Giles CL:Gori M, ed. pp.248
Manolios P, Fanelli R (1994) First order recurrent neural networks and deterministic finite state automata Neural Comput 6:1155-1173
Mcculloch WS, Pitts W (1943) A logical calculus of the ideas immanent in nervous activity Bull Math Biophysics 5:115-133
Medler D (1998) A brief history of connectionism Neural Computing Surveys 1:61-101
Meeden LA (1996) An incremental approach to developing intelligent neural network controllers for robots. IEEE Trans Syst Man Cybern B Cybern 26:474-85 [Journal] [PubMed]
Miller CB, Giles CL (1993) Experimental comparison of the effect of order in recurrent neural networks Int J Pattern Recogn Art Intell 7:849-872
Mirkin B (1996) Mathematical classification and clustering
Mozer M, Das S (1998) Dynamic On-line Clustering and State Extraction: An Approach to Symbolic Learning. Neural Netw 11:53-64 [PubMed]
Niklasson L, Boden M (1997) Representing structure and structured representations in connectionist networks Neural Network perspectives on cognition and adaptive robotics, Browne A, ed. pp.20
Omlin C, Giles CL (1996) Extraction of rules from discrete-time recurrent neural networks Neural Networks 9:41-51
Omlin CW (2001) Understanding and explaining DRN behaviour A field guide to dynamical recurrent networks, Kolen JF:Kremer SC, ed. pp.207
Omlin CW, Giles C, Miller C (1992) Heuristics for the extraction of rules from discrete-time recurrent neural networks Proceedings Of The International Joint Conference On Neural Networks 1:33-38
Omlin CW, Giles CL (1992) Training second-order recurrent neural networks using hints Proceedings of the Ninth International Conference on Machine Learning, Sleeman D:Edwards P, ed. pp.363
Omlin CW, Giles CL (1996) Constructing deterministic finite-state automata in recurrent neural networks J ACM 43:937-972
Omlin CW, Giles CL (1996) Rule revision with recurrent neural networks Know Data Eng 8:183-188
Omlin CW, Giles CL (2000) Symbolic knowledge representation in recurrent neural networks: Insights from theoretical models of computation Knowledge-based neurocomputing, Cloete I:Zuranda JM, ed.
Omlin CW, Thornber KK, Giles CL (1998) Deterministic fuzzy finite state automata can be deterministically encoded into recurrent neural networks IEEE Trans Fuzzy Systems 6:76-89
Paz A (1971) Introduction to probabilistic automata
Rabin MO (1963) Probabilistic automata Information And Control 6:230-245
Rodriguez P, Wiles J, Elman JL (1999) A recurrent network that learns to count Connection Science 11:5-40
Rodriguez PF (1999) Mathematical foundations of simple recurrent neural networks in language processing Unpublished doctoral dissertation
Sanfeliu A, Alquezar R (1995) Active grammatical inference: A new learning methodology Shape, Structure and Pattern Recognition, 5th IAPR International Workshop on Structural and Syntactic Pattern Recognition :191-200
Schellhammer I, Diederich J, Towsey M, Brugman C (1998) Knowledge extraction and recurrent neural networks: An analysis of an Elman network trained on a natural language learning task Proceedings of the Joint Conference on New Methods in Language Processing and Computational Natural Language Learning: NeMLaP3-CoNLL98, Powers DMW, ed. pp.73
Schmidhuber J (1992) Learning complex, extended sequences using the principle of history compression Neural Comput 4:234-242
Servan-Schreiber D, Cleeremans A, Mcclelland JL (1989) Learning sequential structure in simple recurrent networks Advances in neural information processing systems, Touretzky DS, ed. pp.643
Servan-Schreiber D, Cleeremans A, Mcclelland JL (1991) Graded state machines: The representation of temporal contingencies in simple recurrent networks Mach Learn 7:161-193
Sharkey AJC (1996) Combining artificial neural nets: Ensemble approaches [special issue] Connection Science 8(3)
Sharkey NE, Jackson SA (1995) An internal report for connectionists Computational architectures integrating neural and symbolic processes , Sun R:Bookman LA, ed. pp.223
Siegelmann H, Sontag E (1995) On the computational power of neural nets Journal Of Computer And System Sciences 50:132-150
Sun GZ, Giles CL, Chen HH (1998) The neural network pushdown automaton: Architecture, dynamics and learning Adaptive processing of sequences and data structures, Giles C:Gori M, ed. pp.296
Sun R (2001) Introduction to sequence learning Sequence learning: Paradigms, algorithms, and applications, Sun R:Giles CL, ed. pp.1
Sun R, Peterson T, Sessions C (2001) The extraction of planning knowledge from reinforcement learning neural networks Proceedings of WIRN2001
Tabor W, Tanenhaus M (1999) Dynamical models of sentence processing Cognitive Science 24:491-515
Tickle A, Andrews R, Golea M, Diederich J (1997) Rule extraction from artificial neural networks Neural network analysis, architectures and applications , Browne A, ed. pp.61
Tickle AB, Andrews R, Golea M, Diederich J (1998) The truth will come to light: Directions and challenges in extracting the knowledge embedded within mined artificial neural networks IEEE Transactions On Neural Networks 9:1057-1068
Tino P, Cernanský M, Benusková L (2004) Markovian architectural bias of recurrent neural networks. IEEE Trans Neural Netw 15:6-15 [Journal] [PubMed]
Tino P, Dorffner G, Schittenkopf C (2000) Understanding state space organization in recurrent neural networks with iterative function systems dynamics Hybrid neural symbolic integration, Wermter S:Sun R, ed. pp.256
Tino P, Hammer B (2003) Architectural bias in recurrent neural networks-Fractal analysis Neural Comput 15:1931-1957
Tino P, Horne BG, Giles CL, Collingwood PC (1998) Finite state machines and recurrent neural networks-automata and dynamical systems approaches Neural networks and pattern recognition, Dayhoff JE:Omidvar O, ed. pp.171
Tino P, Köteles M (1999) Extracting finite-state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE Trans Neural Netw 10:284-302 [Journal] [PubMed]
Tino P, Sajda J (1995) Learning and extracting initial mealy machines with a modular neural network model Neural Comput 7:822-844
Tino P, Vojtek V (1998) Extracting stochastic machines from recurrent neural networks trained on complex symbolic sequences Neural Network World 8:517-530
Tomita M (1982) Dynamic construction of finite-state automata from examples using hill-climbing Proceedings of Fourth Annual Cognitive Science Conference :105-108
Tonkes B, Blair A, Wiles J (1998) Inductive bias in context-free language learning Proceedings of the Ninth Australian Conference on Neural Networks :52-56
Tonkes B, Wiles J (1999) Learning a context-free task with a recurrent neural network: An analysis of stability Dynamical Cognitive Science: Proceedings of the Fourth Biennial Conference of the Australasian Cognitive Science Society, Heath R:Hayes B:Heathcote A:Hooker C, ed.
Towell GG, Shavlik JW (1993) The extraction of refined rules from knowledge-based neural networks Mach Learn 13:71-101
Trakhtenbrot BA, Barzdin JM (1973) Finite automata: Behavior and synthesis
Vahed A, Omlin CW (1999) Rule extraction from recurrent neural networks using a symbolic machine learning algorithm Tech. Rep. No. US-CS-TR-4
Vahed A, Omlin CW (2004) A machine learning method for extracting symbolic knowledge from recurrent neural networks. Neural Comput 16:59-71 [PubMed]
Watrous RL, Kuhn GM (1992) Induction of finite-state automata using second-order recurrent networks Advances in neural information processing systems, Moody JE:Hanson SJ:Lippman RP, ed. pp.309
Wiles J, Elman JL (1995) Learning to count without a counter: A case study of dynamics and activation landscapes in recurrent neural networks Proceedings of the Seventeenth Annual Conference of the Cognitive Science Society :482-487
Young K, Crutchfield JP (1993) Fluctuation spectroscopy Chaos, Solitons and Fractals 4:5-39
Zeng Z, Goodman RM, Smyth P (1993) Learning finite state machines with self-clustering recurrent networks Neural Comput 5:976-990
Ziemke T, Thieme M (2002) Neuromodulation of reactive sensorimotor mappings as a short-term memory mechanism in delayed response tasks Adapt Behav 10:185-199
References and models that cite this paper

Jacobsson H (2006) The crystallizing substochastic sequential machine extractor: CrySSMEx. Neural Comput 18:2211-55 [Journal] [PubMed]
Tino P, Mills AJ (2006) Learning beyond finite memory in recurrent networks of spiking neurons. Neural Comput 18:591-613 [Journal] [PubMed]
(121 refs)