562 results for exemplar
Abstract:
This work seeks above all to identify what it takes to be the systematic dimension of Paul Ricoeur's thought, grounded, it is argued, in the conception of consciousness as a task. It also seeks to understand the ontological and methodological presuppositions of his thought. It further aims to show that, in pursuing a symbolics of multiple meanings, Ricoeur's concrete reflection integrates, starting from what he calls the gift of language, the philosophical logos, the episteme that became science, and the pre-philosophical richness of the symbol. His philosophical hermeneutics seeks an understanding of the self, of reflection upon oneself, through interpretation applied to the signs and symbols of a consciousness that does not know itself at the outset, but only at the end of the detour through its works and its acts. Reflection is the reappropriation of what one is on the basis of what is said to oneself by the signs and symbols of culture and tradition. Paul Ricoeur is also presented as a critic of immediate, narcissistic consciousness. He builds his intellectual itinerary so as to arbitrate the conflict among the various interpretations of human symbolism. Finally, this work seeks to show that the philosopher in fact proposes a new path for reflection, through a two-stage démarche in which reflective thought divests itself of the immediacy of false, common consciousness, in order subsequently to reappropriate the deeper significations manifested in our effort to exist and nourished by our desire to be. This is the historical and exemplary trajectory that this dissertation seeks to bring to light.
Abstract:
Saint Anthony of Lisbon / of Padua lived between 1191 and 1231, a period known as the Late Middle Ages (thirteenth to fifteenth centuries). He studied at the most prominent centres of learning in the Portugal of his day, the Monastery of São Vicente and the Monastery of Santa Cruz in Coimbra, which allowed him to absorb a vast body of knowledge that he would later use in preaching and in combating heretics, above all the Cathars. In an age of religious ferment, in which the faithful demanded greater participation in ecclesiastical life, and of growing criticism, the mendicant movements were Rome's mainstay: the Dominicans through study and preaching, and the Franciscans through preaching above all by the example of their lives. It is also in this period that a medieval art of preaching began to take shape, taking as its reference the preaching of the first centuries of Christianity, based chiefly on Jesus Christ and the apostle Paul; on the Church Fathers, above all Saint Augustine and Gregory the Great; and, finally, on various thirteenth-century preceptors. Saint Anthony drew on all the knowledge acquired in the monasteries through which he passed and on the ars praedicandi of the period, showing himself thoroughly familiar with the questions of his time. He sharply criticizes iniquitous priests, organizes the theology of the Trinity systematically, and sets himself up as a resounding echo of the Fourth Lateran Council. In his sermons one can identify numerous persuasive devices intended to win the listener's goodwill and thereby achieve the highest purpose, in Saint Augustine's words: to instruct in order to convince and to move. To this end he made ample use of clausulae, of the natural sciences, of the Church Fathers, of pagan writers, and of the medieval bestiaries. The last of these was of vital importance above all in preaching against the Cathar heretics, who disregarded nature as something pure from which hidden spiritual precepts could be drawn.
These are the objects, textual and contextual, to be examined in the present dissertation.
Abstract:
People are alarmingly susceptible to manipulations that change both their expectations and experience of the value of goods. Recent studies in behavioral economics suggest such variability reflects more than mere caprice. People commonly judge options and prices in relative terms, rather than absolutely, and display strong sensitivity to exemplar and price anchors. We propose that these findings elucidate important principles about reward processing in the brain. In particular, relative valuation may be a natural consequence of adaptive coding of neuronal firing to optimise sensitivity across large ranges of value. Furthermore, the initial apparent arbitrariness of value may reflect the brain's attempts to optimally integrate diverse sources of value-relevant information in the face of perceived uncertainty. Recent findings in neuroscience support both accounts, and implicate regions in the orbitofrontal cortex, striatum, and ventromedial prefrontal cortex in the construction of value.
Abstract:
New Timber Architecture in Scotland illustrates 90 exemplar projects and demonstrates clearly that there is no single building type unsuited to the use of this adaptable, variable and infinitely renewable material. Too long out of fashion, timber is now widely specified and has become an important design element in some of the most innovative projects being built today. The projects selected for inclusion are not the work of a few superstar architects: they represent the output of a significant percentage of architectural practices in Scotland and illustrate a burgeoning confidence in timber as an exciting, contemporary construction material. New Timber Architecture in Scotland aims to stimulate others to follow their lead.
Abstract:
Traditionally, language speakers are categorised as monolingual, bilingual, or multilingual. It is traditionally assumed in English language education that the ‘lingual’ is something that can be ‘fixed’ in form, written down to be learnt, and taught. Accordingly, the ‘mono’-lingual will have a ‘fixed’ linguistic form. Such a ‘form’ differs according to a number of criteria or influences, including region or ‘type’ of English (for example, World Englishes), but is nevertheless assumed to be a ‘form’. ‘Mono-lingualism’ is traditionally defined and believed to be ‘speaking one language’, wherever and whatever that language may be. In this chapter, grounded in an individual subjective philosophy of language, we question this traditional definition. Viewing language from philosophical perspectives such as those of Bakhtin and Voloshinov, we argue that the prominence of ‘context’ and ‘consciousness’ in language means that to ‘fix’ the form of a language goes against the very spirit of how it is formed and used. We thus challenge the categorisation of ‘mono’-lingualism, proposing that such a categorisation is actually a category error, or a case ‘in which a property is ascribed to a thing that could not possibly have that property’ (Restivo, 2013, p. 175), in this case the property of ‘mono’. Using this proposition as a starting point, we suggest that more time be devoted to language in its context and as per its genuine use as a vehicle for consciousness. We theorise this can be done through a ‘literacy’-based approach which fronts the context of language use rather than the language itself. We outline how we envision this working for teachers, students and materials developers of English Language Education materials in a global setting. To do this we consider Scotland’s Curriculum for Excellence as an exemplar to promote conscious language use in context.
Abstract:
C.G.G. Aitken, Q. Shen, R. Jensen and B. Hayes. The evaluation of evidence for exponentially distributed data. Computational Statistics & Data Analysis, vol. 51, no. 12, pp. 5682-5693, 2007.
Abstract:
This dissertation, an exercise in practical theology, consists of a critical conversation between the evangelistic practice of Campus Crusade for Christ in two American university contexts, Bryan Stone's ecclesiologically grounded theology of evangelism, and William Abraham's eschatologically grounded theology of evangelism. It seeks to provide these evangelizing communities several strategic proposals for a more ecclesiologically and eschatologically grounded practice of evangelism within a university context. The current literature on evangelism is long on evangelistic strategy and activity, but short on theological analysis and reflection. This study focuses on concrete practices, but is grounded in a thick description of two particular contexts (derived from qualitative research methods) and a theological analysis of the ecclesiological and eschatological beliefs embedded within their evangelistic activities. The dissertation provides an historical overview of important figures, ideas, and events that helped mold the practice of evangelism inherited by the two ministries of this study, beginning with the famous Haystack Revival at Williams College in 1806. Both ministries, Campus Crusade for Christ at Bowling Green State University (Ohio) and at Washington State University, inherited an evangelistic practice sorely infected with many of the classic distortions that both Abraham and Stone attempt to correct. Qualitative research methods detail the direction in which Campus Crusade for Christ at Bowling Green State University (Ohio) and Washington State University have taken the practice of evangelism they inherited. Applying the analytical categories that emerge from a detailed summary of Stone and Abraham to qualitative data of these two ministries reveals several ways evangelism has morphed in a manner sympathetic to Stone's insistence that the central logic of evangelism is the embodied witness of the church.
The results of this analysis reveal the subversive and pervasive influence of modernity on these evangelizing communities—an influence that warrants several corrective strategic proposals including: 1) re-situating evangelism within a reading of the biblical narrative that emphasizes the present, social, public, and realized nature of the gospel of the kingdom of God rather than simply its future, personal, private, and unrealized dimensions; 2) clarifying the nature of the evangelizing communities and their relationship to the church; and 3) emphasizing the virtues that characterize a new evangelistic exemplar who is incarnational, intentional, humble, and courageous.
Abstract:
— Consideration of how people respond to the question What is this? has suggested new problem frontiers for pattern recognition and information fusion, as well as neural systems that embody the cognitive transformation of declarative information into relational knowledge. In contrast to traditional classification methods, which aim to find the single correct label for each exemplar (This is a car), the new approach discovers rules that embody coherent relationships among labels which would otherwise appear contradictory to a learning system (This is a car, that is a vehicle, over there is a sedan). This talk will describe how an individual who experiences exemplars in real time, with each exemplar trained on at most one category label, can autonomously discover a hierarchy of cognitive rules, thereby converting local information into global knowledge. Computational examples are based on the observation that sensors working at different times, locations, and spatial scales, and experts with different goals, languages, and situations, may produce apparently inconsistent image labels, which are reconciled by implicit underlying relationships that the network’s learning process discovers. The ARTMAP information fusion system can, moreover, integrate multiple separate knowledge hierarchies, by fusing independent domains into a unified structure. In the process, the system discovers cross-domain rules, inferring multilevel relationships among groups of output classes, without any supervised labeling of these relationships. In order to self-organize its expert system, the ARTMAP information fusion network features distributed code representations which exploit the model’s intrinsic capacity for one-to-many learning (This is a car and a vehicle and a sedan) as well as many-to-one learning (Each of those vehicles is a car). Fusion system software, testbed datasets, and articles are available from http://cns.bu.edu/techlab.
Abstract:
Adaptive Resonance Theory (ART) models are real-time neural networks for category learning, pattern recognition, and prediction. Unsupervised fuzzy ART and supervised fuzzy ARTMAP synthesize fuzzy logic and ART networks by exploiting the formal similarity between the computations of fuzzy subsethood and the dynamics of ART category choice, search, and learning. Fuzzy ART self-organizes stable recognition categories in response to arbitrary sequences of analog or binary input patterns. It generalizes the binary ART 1 model, replacing the set-theoretic intersection (∩) with the fuzzy intersection (∧), or component-wise minimum. A normalization procedure called complement coding leads to a symmetric theory in which the fuzzy intersection and the fuzzy union (∨), or component-wise maximum, play complementary roles. Complement coding preserves individual feature amplitudes while normalizing the input vector, and prevents a potential category proliferation problem. Adaptive weights start equal to one and can only decrease in time. A geometric interpretation of fuzzy ART represents each category as a box that increases in size as weights decrease. A matching criterion controls search, determining how close an input and a learned representation must be for a category to accept the input as a new exemplar. A vigilance parameter (ρ) sets the matching criterion and determines how finely or coarsely an ART system will partition inputs. High vigilance creates fine categories, represented by small boxes. Learning stops when boxes cover the input space. With fast learning, fixed vigilance, and an arbitrary input set, learning stabilizes after just one presentation of each input. A fast-commit slow-recode option allows rapid learning of rare events yet buffers memories against recoding by noisy inputs. Fuzzy ARTMAP unites two fuzzy ART networks to solve supervised learning and prediction problems.
A Minimax Learning Rule controls ARTMAP category structure, conjointly minimizing predictive error and maximizing code compression. Low vigilance maximizes compression but may therefore cause very different inputs to make the same prediction. When this coarse grouping strategy causes a predictive error, an internal match tracking control process increases vigilance just enough to correct the error. ARTMAP automatically constructs a minimal number of recognition categories, or "hidden units," to meet accuracy criteria. An ARTMAP voting strategy improves prediction by training the system several times using different orderings of the input set. Voting assigns confidence estimates to competing predictions given small, noisy, or incomplete training sets. ARPA benchmark simulations illustrate fuzzy ARTMAP dynamics. The chapter also compares fuzzy ARTMAP to Salzberg's Nested Generalized Exemplar (NGE) and to Simpson's Fuzzy Min-Max Classifier (FMMC); and concludes with a summary of ART and ARTMAP applications.
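The fuzzy ART computations the abstract describes (complement coding, the component-wise minimum as fuzzy intersection, category choice, the vigilance matching criterion, and learning) can be sketched compactly. The following is an illustrative sketch only, not the authors' implementation; the function names and the default values for the choice parameter alpha and learning rate beta are assumptions, following the standard formulation.

```python
import numpy as np

def complement_code(a):
    """Complement coding: map input a in [0,1]^M to I = (a, 1 - a)."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def fuzzy_art_step(I, weights, rho=0.75, alpha=0.001, beta=1.0):
    """One fuzzy ART presentation: choice, vigilance-controlled search, learning.

    I: complement-coded input; weights: list of weight vectors w_j.
    Returns (chosen category index, updated weights).
    """
    # Category choice: T_j = |I ∧ w_j| / (alpha + |w_j|), where ∧ is the
    # component-wise minimum and |.| is the city-block norm (sum).
    choices = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in weights]
    # Search categories in order of decreasing choice value.
    for j in np.argsort(choices)[::-1]:
        match = np.minimum(I, weights[j]).sum() / I.sum()
        if match >= rho:  # vigilance (matching) criterion met
            # Learning: w_j <- beta*(I ∧ w_j) + (1 - beta)*w_j
            # (beta = 1 is fast learning; weights can only decrease).
            weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
            return j, weights
    # No committed category passes vigilance: commit a new one.
    # Since uncommitted weights start at one, w_new = I ∧ 1 = I.
    weights.append(I.copy())
    return len(weights) - 1, weights
```

Presenting a second input close to the first (with moderate vigilance) re-selects the same category and shrinks its weight box, matching the geometric "box" interpretation in the abstract.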
Abstract:
The influence of the Essays of Michel de Montaigne on the thought of Friedrich Nietzsche has, hitherto, received scant scholarly attention. The aim of this thesis is to address this lacuna in the literature by making evident the importance of the Essays to the development of Nietzsche’s philosophy. I argue that, in order to fully appreciate Nietzsche’s thought, it must be recognized that, from the beginning to the end of his philosophical life, Montaigne was for him a thinker of the deepest personal and philosophical significance. Against the received scholarly opinion, which would see Montaigne as influential only for Nietzsche’s middle works, I contend that the Essays continue to be a key inspiration for Nietzsche even into his late and final works. Montaigne, with his cheerful affirmation of life, his experimental mode of philosophizing, and his resolutely naturalized perspective, was an exemplar for Nietzsche as a philosopher, psychologist, sceptic and naturalist. The Essays not only stimulated Nietzsche’s thinking on questions to do with morality, epistemology and the nature of the soul but also informed his conception of the ideal philosophical life. Moreover, exploring the Essays from a Nietzschean viewpoint allows the more radical aspects of Montaigne’s thought to be drawn out, while probing Montaigne’s impact on Nietzsche provides insight into the trajectory of Nietzsche’s philosophy as he broke free from romantic pessimism and embraced the naturalism that would guide his works from Human, All Too Human onward.
Cost savings from relaxation of operational constraints on a power system with high wind penetration
Abstract:
Wind energy is predominantly a nonsynchronous generation source. Large-scale integration of wind generation with existing electricity systems, therefore, presents challenges in maintaining system frequency stability and local voltage stability. Transmission system operators have implemented system operational constraints (SOCs) in order to maintain stability with high wind generation, but imposition of these constraints results in higher operating costs. A mixed integer programming tool was used to simulate generator dispatch in order to assess the impact of various SOCs on generation costs. Interleaved day-ahead scheduling and real-time dispatch models were developed to allow accurate representation of forced outages and wind forecast errors, and were applied to the proposed Irish power system of 2020 with a wind penetration of 32%. Savings of at least 7.8% in generation costs and reductions in wind curtailment of 50% were identified when the most influential SOCs were relaxed. The results also illustrate the need to relax local SOCs together with the system-wide nonsynchronous penetration limit SOC, as savings from increasing the nonsynchronous limit beyond 70% were restricted without relaxation of local SOCs. The methodology and results allow the costs of SOCs to be quantified, so that the optimal upgrade path for generation and transmission infrastructure can be determined.
Abstract:
Concepts are mental representations that are the constituents of thought. Edouard Machery claims that psychologists generally understand concepts to be bodies of knowledge, or information-carrying mental states stored in long-term memory, that are used in the higher cognitive competences such as categorization judgments, induction, planning, and analogical reasoning. While most research in the concepts field has generally been on concrete concepts such as LION, APPLE, and CHAIR, this paper will examine abstract moral concepts and whether such concepts may have prototype and exemplar structure. After discussing the philosophical importance of this project and explaining the prototype and exemplar theories, criticisms will be made of philosophers who, without experimental support from the sciences of the mind, contend that moral concepts have prototype and/or exemplar structure. Next, I will scrutinize Mark Johnson's experimentally based argument that moral concepts have prototype structure. Finally, I will show how our moral concepts may indeed have prototype and exemplar structure, and explore the further ethical implications that may be reached by this conclusion about moral concepts. © 2011 Springer Science+Business Media B.V.
Abstract:
Undergraduates were asked to generate a name for a hypothetical new exemplar of a category. They produced names that had the same numbers of syllables, the same endings, and the same types of word stems as existing exemplars of that category. In addition, novel exemplars, each consisting of a nonsense syllable root and a prototypical ending, were accurately assigned to categories. The data demonstrate the abstraction and use of surface properties of words.
Abstract:
This paper presents an investigation into dynamic self-adjustment of task deployment and other aspects of self-management, through the embedding of multiple policies. Non-dedicated loosely-coupled computing environments, such as clusters and grids, are increasingly popular platforms for parallel processing. These abundant systems are highly dynamic environments in which many sources of variability affect the run-time efficiency of tasks. The dynamism is exacerbated by the incorporation of mobile devices and wireless communication. This paper proposes an adaptive strategy for the flexible run-time deployment of tasks, to continuously maintain efficiency despite the environmental variability. The strategy centres on policy-based scheduling which is informed by contextual and environmental inputs such as variance in the round-trip communication time between a client and its workers and the effective processing performance of each worker. A self-management framework has been implemented for evaluation purposes. The framework integrates several policy-controlled, adaptive services with the application code, enabling the run-time behaviour to be adapted to contextual and environmental conditions. Using this framework, an exemplar self-managing parallel application is implemented and used to investigate the extent of the benefits of the strategy.
Abstract:
This short position paper considers issues in developing Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for IoT and the development of the distinction between things and computers. The paper makes a strong argument to avoid reinvention of the wheel, and to reuse approaches to distributed heterogeneous data architectures and the lessons learned from that work, applying them to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture, based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.