29 results for John Cage (1912-1992)


Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/johninnocent00canduoft

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/johnludwigkrapfe00kretiala

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/historyofcatholi00sheaiala

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/jamesevans00maclrich

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/womeninthemissio00telfuoft

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/johnwesleytheman00pikeuoft

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/bibleillustratio00ingluoft

Relevance:

20.00%

Publisher:

Abstract:

Throughout the history of the Church, the Epistle to the Hebrews has been one of the most puzzling letters in the Canon, particularly regarding its implications for understanding the person of Jesus Christ. John Chrysostom, an important patristic writer, is acknowledged to have made significant contributions to the exegesis of this letter, and his thought became the norm for its traditional interpretation in the Middle Ages. Martin Luther's reception of Chrysostom's Homilies on Hebrews presents a unique interpretation that some scholars describe as the "Reformation Discovery" on Hebrews. Tracing Luther's reception and appropriation of Chrysostom's exegesis of the letter reveals a noticeable and significant shift in Christological interpretation. Whether or not these modifications were necessary is a matter of debate; however, they do reflect Luther's contextual and existential questions regarding faith, Christ, and knowledge of God, questions that are evident in his Lectures on Hebrews.

Relevance:

20.00%

Publisher:

Abstract:

A neural model is described of how adaptively timed reinforcement learning occurs. The adaptive timing circuit is suggested to exist in the hippocampus and to involve the convergence of dentate granule cells on CA3 pyramidal cells, as well as NMDA receptors. This circuit forms part of a model neural system for the coordinated control of recognition learning, reinforcement learning, and motor learning, whose properties clarify how an animal can learn to acquire a delayed reward. Behavioral and neural data are summarized in support of each processing stage of the system. The relevant anatomical sites are in the thalamus, neocortex, hippocampus, hypothalamus, amygdala, and cerebellum. Cerebellar influences on motor learning are distinguished from hippocampal influences on the adaptive timing of reinforcement learning. The model simulates how damage to the hippocampal formation disrupts adaptive timing, eliminates attentional blocking, and causes symptoms of medial temporal amnesia. It suggests how normal acquisition of subcortical emotional conditioning can occur after cortical ablation, even though extinction of emotional conditioning is retarded by cortical ablation. The model simulates how increasing the duration of an unconditioned stimulus increases the amplitude of emotional conditioning but does not change adaptive timing, and how an increase in the intensity of a conditioned stimulus "speeds up the clock", whereas an increase in the intensity of an unconditioned stimulus does not. Computer simulations of the model fit parametric conditioning data, including a Weber law property and an inverted-U property. Both primary and secondary adaptively timed conditioning are simulated, as are data concerning conditioning with multiple interstimulus intervals (ISIs), gradually or abruptly changing ISIs, partial reinforcement, and multiple stimuli that lead to time-averaging of responses. Neurobiologically testable predictions are made to facilitate further tests of the model.
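To make the timing mechanism concrete, the following sketch illustrates the spirit of spectrally timed learning; it is a loose illustration, not the paper's circuit, and the unit dynamics, learning rule, and parameters are all assumptions. A population of units with a range of reaction rates responds to CS onset, each peaking at a different delay; units whose peaks coincide with US arrival gain weight, so the summed output becomes adaptively timed and its peak tracks the ISI.

import numpy as np

# Minimal sketch of spectrally timed learning (illustration only; dynamics,
# learning rule, and parameters are assumptions, not the paper's equations).

def spectral_responses(rates, t):
    """Unit i follows x_i(t) = (r_i t) exp(1 - r_i t), which peaks (at value 1)
    when t = 1 / r_i. Rows index units, columns index time steps."""
    rt = np.outer(rates, t)
    return rt * np.exp(1.0 - rt)

def train(rates, isi, n_trials=50, lr=0.2, dt=0.01, t_max=4.0):
    t = np.arange(dt, t_max, dt)
    x = spectral_responses(rates, t)           # unit activations after CS onset
    us = np.exp(-0.5 * ((t - isi) / 0.05)**2)  # brief US centered at the ISI
    w = np.zeros(len(rates))                   # adaptive gains, one per unit
    for _ in range(n_trials):
        # Hebbian-style update: overlap of each unit's activity with the US,
        # with a (1 - w) factor keeping the gains bounded.
        w += lr * (x @ us * dt) * (1.0 - w)
    return t, w @ x                            # adaptively timed output R(t)

rates = np.linspace(0.3, 10.0, 80)             # spectrum of reaction rates
for isi in (0.5, 1.0, 2.0):
    t, r = train(rates, isi)
    print(f"ISI = {isi:.1f}s -> learned response peaks near t = {t[np.argmax(r)]:.2f}s")

Running this for several ISIs prints a response peak that shifts with the ISI, the qualitative signature of adaptive timing; the broadening of the response at longer ISIs is the kind of spread a Weber law property describes.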

Relevance:

20.00%

Publisher:

Abstract:

A new family of neural network architectures is presented. This family of architectures solves the problem of constructing and training minimal neural network classification expert systems by using switching theory. The primary insight that leads to the use of switching theory is that the problem of minimizing the number of rules and the number of IF statements (antecedents) per rule in a neural network expert system can be recast as the problem of minimizing the number of digital gates and the number of connections between digital gates in a Very Large Scale Integrated (VLSI) circuit. The rules that the neural network generates to perform a task are readily extractable from the network's weights and topology. Analysis and simulations on the Mushroom database illustrate the system's performance.
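As a hedged illustration of the gate-to-rule correspondence (a toy example invented here, not the architecture described in the abstract), the sketch below builds a two-layer threshold network that realizes a minimized sum-of-products classifier: each hidden unit acts as an AND gate over a few literals, the output unit acts as an OR gate, and each hidden unit can be read off directly as one IF-THEN rule from the weights and topology, so minimizing gates and connections minimizes rules and antecedents.

import numpy as np

# Toy example (not the abstract's architecture): a threshold network for the
# concept class = 1 iff (A AND NOT B) OR (C AND D), with rules readable from
# the weights and topology.

def threshold(x):
    return (x >= 0).astype(int)

# Hidden weights encode literals: +1 for an input, -1 for its negation, 0 if absent.
W_hidden = np.array([
    [ 1, -1,  0,  0],   # rule 1: A AND NOT B
    [ 0,  0,  1,  1],   # rule 2: C AND D
])
# Bias = -(number of positive literals): the weighted sum reaches the threshold
# only when every positive literal is on and every negated literal is off.
b_hidden = np.array([-1, -2])
W_out = np.array([1, 1])      # OR gate: fire if any rule fires
b_out = -1

def classify(inputs):
    h = threshold(W_hidden @ inputs + b_hidden)
    return threshold(W_out @ h + b_out), h

def extract_rules(W):
    """Read IF-THEN rules straight from the hidden weights."""
    names = ["A", "B", "C", "D"]
    rules = []
    for row in W:
        lits = [n if w > 0 else f"NOT {n}" for n, w in zip(names, row) if w != 0]
        rules.append("IF " + " AND ".join(lits) + " THEN class = 1")
    return rules

for rule in extract_rules(W_hidden):
    print(rule)
x = np.array([1, 0, 0, 1])    # A=1, B=0, C=0, D=1
y, h = classify(x)
print("input", x, "-> class", int(y), "via rule activations", h)

The design choice the example is meant to surface is that each hidden unit corresponds to exactly one conjunction, so pruning a gate in the minimized circuit is the same operation as deleting a rule from the extracted rule set.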