745 results for hinder
Filozofia antyczna wobec problemu interpretacji. Rozwój alegorezy od przedsokratyków do Arystotelesa [Ancient Philosophy and the Problem of Interpretation: The Development of Allegoresis from the Presocratics to Aristotle]
Abstract:
The present work examines the beginnings of ancient hermeneutics. More specifically, it discusses the connection between the rise of the practice of allegoresis, on the one hand, and the emergence of the first theory of figurative language, on the other. Thus, this book investigates the specific historical and cultural circumstances that enabled the ancient Greeks not only to discover the possibility of allegorical interpretation, but also to treat figurative language as a philosophical problem. By posing difficulties in understanding the enigmatic sense of various esoteric doctrines, poems, oracles and riddles, figurative language created the context for theoretical reflection on the meaning of these “messages”. Hence, ancient interpreters began to ponder over the nature and functions of figurative (“enigmatic”) language as well as over the techniques of its proper use and interpretation. Although the practice of allegorical interpretation was closely linked to the development of the whole of ancient philosophy, the present work covers only the period from the 6th to the 4th century B.C. It concentrates, then, on the philosophical and cultural consequences of allegoresis in the classical age. The main thesis advocated here has it that the ancient Greeks were inclined to regard allegory as a cognitive problem rather than merely as a stylistic or a literary one. When searching for the hidden meanings of various esoteric doctrines, poems, oracles and riddles, ancient interpreters of these “messages” assumed allegory to be the only tool suitable for articulating certain matters. In other words, it was their belief that the use of figurative language resulted from the necessity of expressing things that were otherwise inexpressible. The present work has been organized in the following manner. The first part contains historical and philological discussions that provide the point of departure for more philosophical considerations. This part consists of two introductory chapters. Chapter one situates the practice of allegorical interpretation at the borderline of two different traditions: the rhetorical-grammatical and the hermeneutical. In order to clearly differentiate between the two, chapter one distinguishes between allegory and allegoresis, on the one hand, and allegoresis and exegesis, on the other. While pointing to the conventionality (and even arbitrariness) of such distinctions, the chapter argues, nevertheless, for their heuristic usefulness. The remaining part of chapter one focuses on a historical and philological reconstruction of the most important conceptual tools of ancient hermeneutics. Discussing the semantics of such terms as allēgoría, hypónoia, ainigma and symbolon proves important for at least two crucial reasons. Firstly, it reveals the mutual affinity between allegoresis and divination, i.e., practices that are inherently connected with the need to discover the latent meaning of the “message” in question (whether poem or oracle). Secondly, these philological analyses bring to light the specificity of the ancient understanding of such concepts as allegory or symbol. It goes without saying that antiquity employed these terms in a manner quite disparate from modernity. Chapter one concludes with a discussion of ancient views on the cognitive value of figurative (“enigmatic”) language. Chapter two focuses on the role that allegoresis played in the process of transforming mythos into logos.
It is suggested here that it was the practice of allegorical interpretation that made it possible to preserve the traditional myths as an important point of reference for the whole of ancient philosophy. Thus, chapter two argues that the existence of a clear opposition between mythos and logos in Preplatonic philosophy is highly questionable in light of the indisputable fact that the Presocratics, Sophists and Cynics were profoundly convinced about the cognitive value of mythos (this conviction was also shared by Plato and Aristotle, but their attitude towards myth was more complex). Consequently, chapter two argues that in Preplatonic philosophy, myth played a function analogous to the concepts discussed in chapter one (i.e., hidden meanings, enigmas and symbols), for in all these cases, ancient interpreters found tools for conveying issues that were otherwise difficult to convey. Chapter two concludes with a classification of various types of allegoresis. Whilst chapters one and two serve as a historical and philological introduction, the second part of this book concentrates on the close relationship between the development of allegoresis, on the one hand, and the flowering of philosophy, on the other. Thus, chapter three discusses the crucial role that allegorical interpretation came to play in Preplatonic philosophy, chapter four deals with Plato’s highly complex and ambivalent attitude to allegoresis, and chapter five is devoted to Aristotle’s original approach to the practice of allegorical interpretation. It is evident that allegoresis was of paramount importance for the ancient thinkers, irrespective of whether they would value it positively (Preplatonic philosophers and Aristotle) or negatively (Plato). Beginning with the 6th century B.C., the ancient practice of allegorical interpretation is motivated by two distinct interests. On the one hand, the practice of allegorical interpretation reflects the more or less “conservative” attachment to the authority of the poet (whether Homer, Hesiod or Orpheus). The purpose of this apologetic allegoresis is to exonerate poetry from the charges leveled at it by the first philosophers and, though to a lesser degree, historians. Generally, these allegorists seek to save the traditional paideia that builds on the works of the poets. On the other hand, the practice of allegorical interpretation also reflects the more or less “progressive” desire to make original use of the authority of the poet (whether Homer, Hesiod or Orpheus) so as to promote a given philosophical doctrine. The objective of this instrumental allegoresis is to exculpate philosophy from the accusations brought against it by the more conservative circles. Needless to say, these allegorists significantly contribute to the gradual replacement of the mythical view of the world with a more philosophical explanation. The present book suggests that it is the philosophy of Aristotle that should be regarded as a sort of acme in the development of ancient hermeneutics. The reasons for this are twofold. On the one hand, the Stagirite positively values the practice of allegoresis, thus rehabilitating the tradition of Preplatonic philosophy against Plato. On the other hand, Aristotle initiates the theoretical reflection on figurative (“enigmatic”) language. Hence, in Aristotle we encounter not only the practice of allegoresis, but also the theory of allegory (although the philosopher does not use the term allēgoría).
This being so, the significance of Aristotle’s work cannot be overestimated. First of all, the Stagirite introduces the concept of metaphor into the philosophical discussions of his time. From that moment onwards, the phenomenon of figurative language becomes an important philosophical issue. After Aristotle, most thinkers would feel obliged to specify the rules for the appropriate use of figurative language and the techniques of its correct interpretation. Furthermore, Aristotle ascribes to metaphor (and to various other “excellent” sayings) the function of increasing and enhancing our knowledge. Thus, according to the Stagirite, figurative language is not only an ornamental device, but it can also have a significant explanatory power. Finally, Aristotle observes that figurative expressions cause words to become ambiguous. In this context, the philosopher notices that ambiguity can enrich the language of a poet, but it can also hinder a dialectical discussion. Accordingly, Aristotle is inclined to value polysemy either positively or negatively. Importantly, however, the Stagirite is perfectly aware of the fact that in natural languages ambiguity is unavoidable. This is why Aristotle initiates a systematic reflection on the phenomenon of ambiguity and distinguishes its various kinds. In Aristotle, ambiguity is, then, both a problem that needs to be identified and a tool that can help in elucidating intricate philosophical issues. This unique approach to ambiguity and figurative (“enigmatic”) language enabled Aristotle to formulate invaluable intuitions that still await appropriate recognition.
Abstract:
Postgraduate project/dissertation presented to Universidade Fernando Pessoa as part of the requirements for the degree of Master in Pharmaceutical Sciences
Abstract:
Background: Achieving the goals set by Roll Back Malaria and the Government of Kenya for the use of insecticide-treated bednets (ITNs) will require that the private retail market for nets and insecticide treatments grow substantially. This paper applies some basic concepts of market structure and pricing to a set of recently collected retail price data from Kenya in order to answer the question, “How well are Kenyan retail markets for ITNs working?” Methods: Data on the availability and prices of ITNs at a wide range of retail outlets throughout Kenya were collected in January 2002, and vendors and manufacturers were interviewed regarding market structure. Findings: Untreated nets are manufactured in Kenya by a number of companies and are widely available in large and medium-sized towns. Availability in smaller villages is limited. There is relatively little geographic price variation, and nets can be found at competitive prices in towns and cities. Marketing margins on prices appear to be within normal ranges. No finished nets are imported. Few pre-treated nets or net+treatment combinations are available, with the exception of the subsidized Supanet/Power Tab combination marketed by a donor-funded social marketing project. Conclusions: Retail markets for untreated nets in Kenya are well established and appear to be competitive. Markets for treated nets and insecticide treatment kits are not well established. The role of subsidized ITN marketing projects should be monitored to ensure that these projects support, rather than hinder, the development of retail markets.
Abstract:
In research areas involving mathematical rigor, there are numerous benefits to adopting a formal representation of models and arguments: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of formal verification tools that can provide these benefits. In earlier work [30] we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. In this report we evaluate our proposed design criteria by using, in the context of novel research, a formal reasoning system that is designed according to these criteria. In particular, we consider how the design and capabilities of the formal reasoning system that we employ influence, aid, or hinder our ability to accomplish a formal reasoning task – the assembly of a machine-verifiable proof pertaining to the NetSketch formalism. NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. It provides capabilities for compositional analysis based on a strongly-typed domain-specific language (DSL) for describing and reasoning about constrained-flow networks and invariants that need to be enforced thereupon. In a companion paper [13] we overview NetSketch, highlight its salient features, and illustrate how it could be used in actual applications. In this paper, we use a machine-readable syntax to define major parts of the formal system underlying the operation of NetSketch, along with its semantics and a corresponding notion of validity. We then provide a proof of soundness for the formalism that can be partially verified using a lightweight formal reasoning system that simulates natural contexts. A traditional presentation of these definitions and arguments can be found in the full report on the NetSketch formalism [12].
Abstract:
The data streaming model provides an attractive framework for one-pass summarization of massive data sets at a single observation point. However, in an environment where multiple data streams arrive at a set of distributed observation points, sketches must be computed remotely and then aggregated through a hierarchy before queries may be conducted. As a result, many sketch-based methods for the single-stream case do not apply directly, either because the error introduced becomes large or because the methods assume that the streams are non-overlapping. These limitations hinder the application of these techniques to practical problems in network traffic monitoring and aggregation in sensor networks. To address this, we develop a general framework for evaluating and enabling robust computation of duplicate-sensitive aggregate functions (e.g., SUM and QUANTILE) over data produced by distributed sources. We instantiate our approach by augmenting the Count-Min and Quantile-Digest sketches to apply in this distributed setting, and analyze their performance. We conclude with an experimental evaluation to validate our analysis.
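To make the hierarchy-of-sketches setting concrete, here is a minimal, illustrative Count-Min sketch in Python with a cell-wise merge, so summaries built at different observation points can be combined before answering point queries. The class, hash-seeding scheme and example values are this sketch's own choices rather than the paper's implementation, and the paper's augmentation for duplicate-sensitive aggregates over overlapping streams requires further machinery (for example, distinct-counting primitives) that is not shown here.

```python
# Illustrative only: a basic mergeable Count-Min sketch. A naive cell-wise
# merge over-counts items observed at more than one site, which is the
# overlapping-streams limitation the abstract describes.
import hashlib


class CountMinSketch:
    def __init__(self, width=1024, depth=4):
        self.width = width
        self.depth = depth
        self.table = [[0] * width for _ in range(depth)]

    def _bucket(self, item, row):
        # Derive a per-row hash by salting the item with the row index.
        digest = hashlib.md5(f"{row}:{item}".encode()).hexdigest()
        return int(digest, 16) % self.width

    def update(self, item, count=1):
        for row in range(self.depth):
            self.table[row][self._bucket(item, row)] += count

    def estimate(self, item):
        # Point query: the minimum over rows bounds the overestimate.
        return min(self.table[row][self._bucket(item, row)]
                   for row in range(self.depth))

    def merge(self, other):
        # Sketches built remotely are combined cell-wise before being
        # passed up an aggregation hierarchy.
        assert self.width == other.width and self.depth == other.depth
        for row in range(self.depth):
            for col in range(self.width):
                self.table[row][col] += other.table[row][col]


# Two observation points summarize their local streams, then merge.
a, b = CountMinSketch(), CountMinSketch()
for flow in ["10.0.0.1", "10.0.0.2", "10.0.0.1"]:
    a.update(flow)
b.update("10.0.0.1")
a.merge(b)
print(a.estimate("10.0.0.1"))  # 3: the flow seen at both sites is counted at each
```

As the final estimate shows, a flow observed at both sites contributes at each of them after the merge, which over-counts whenever the streams overlap; handling that case robustly is precisely what the proposed framework addresses.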
Abstract:
Background: Hospital clinicians are increasingly expected to practice evidence-based medicine (EBM) in order to minimize medical errors and ensure quality patient care, but they experience obstacles to information-seeking. The introduction of a Clinical Informationist (CI) is explored as a possible solution. Aims: This paper investigates the self-perceived information needs, behaviour and skill levels of clinicians in two Irish public hospitals. It also explores clinicians’ perceptions of and attitudes towards the introduction of a CI into their clinical teams. Methods: A questionnaire survey approach was utilised for this study, with 22 clinicians in two hospitals. Data analysis was conducted using descriptive statistics. Results: Analysis showed that clinicians experience diverse information needs for patient care, and that barriers such as time constraints and insufficient access to resources hinder their information-seeking. Findings also showed that clinicians struggle to fit information-seeking into their working day, regularly seeking to answer patient-related queries outside of working hours. Attitudes towards the concept of a CI were predominantly positive. Conclusion: This paper highlights the factors that characterise and limit hospital clinicians’ information-seeking, and suggests the CI as a potentially useful addition to the clinical team, to help clinicians resolve their information needs for patient care.
Abstract:
Gemstone Team No More Needles
Abstract:
Failing to find a tumor in an x-ray scan or a gun in an airport baggage screening can have dire consequences, making it fundamentally important to elucidate the mechanisms that hinder performance in such visual searches. Recent laboratory work has indicated that low target prevalence can lead to disturbingly high miss rates in visual search. Here, however, we demonstrate that misses in low-prevalence searches can be readily abated. When targets are rarely present, observers adapt by responding more quickly, and miss rates are high. Critically, though, these misses are often due to response-execution errors, not perceptual or identification errors: Observers know a target was present, but just respond too quickly. When provided an opportunity to correct their last response, observers can catch their mistakes. Thus, low target prevalence may not be a generalizable cause of high miss rates in visual search.
Abstract:
BACKGROUND: Measurement of CD4+ T-lymphocytes (CD4) is a crucial parameter in the management of HIV patients, particularly in determining eligibility to initiate antiretroviral treatment (ART). A number of technologies exist for CD4 enumeration, with considerable variation in cost, complexity, and operational requirements. We conducted a systematic review of the performance of technologies for CD4 enumeration. METHODS AND FINDINGS: Studies were identified by searching the electronic databases MEDLINE and EMBASE using a pre-defined search strategy. Data on test accuracy and precision included bias and limits of agreement with a reference standard, and misclassification probabilities around CD4 thresholds of 200 and 350 cells/μl over a clinically relevant range. The secondary outcome measure was test imprecision, expressed as the percent coefficient of variation (%CV). Thirty-two studies evaluating 15 CD4 technologies were included, of which fewer than half presented data on bias and misclassification compared to the same reference technology. At CD4 counts <350 cells/μl, bias ranged from -35.2 to +13.1 cells/μl, while at counts >350 cells/μl, bias ranged from -70.7 to +47 cells/μl, compared to the BD FACSCount as a reference technology. Misclassification around the threshold of 350 cells/μl ranged from 1% to 29% for upward classification, resulting in under-treatment, and from 7% to 68% for downward classification, resulting in over-treatment. Fewer than half of these studies reported within-laboratory precision or reproducibility of the CD4 values obtained. CONCLUSIONS: A wide range of bias and percent misclassification around treatment thresholds was reported for the CD4 enumeration technologies included in this review, with few studies reporting assay precision. The lack of standardised methodology for test evaluation, including the use of different reference standards, is a barrier to assessing relative assay performance and could hinder the introduction of new point-of-care assays in countries where they are most needed.
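For reference, the accuracy and precision measures this literature relies on are conventionally defined as below (Bland-Altman bias and limits of agreement for paired test/reference CD4 counts, and the percent coefficient of variation for replicate measurements); this is a general statistical note, not a formula taken from the review itself.

```latex
\text{bias} = \bar{d} = \frac{1}{n}\sum_{i=1}^{n}\bigl(x_i^{\text{test}} - x_i^{\text{ref}}\bigr),
\qquad
\text{limits of agreement} = \bar{d} \pm 1.96\, s_d,
\qquad
\%\text{CV} = 100 \times \frac{s}{\bar{x}}
```

Here s_d is the standard deviation of the paired differences, and s and x̄ are the standard deviation and mean of repeated measurements on the same specimen.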
Abstract:
The rise of private food standards has brought forth an ongoing debate about whether they work as a barrier for smallholders and hinder poverty reduction in developing countries. This paper uses a global value chain approach to explain the relationship between value chain structure and agrifood safety and quality standards and to discuss the challenges and possibilities this entails for the upgrading of smallholders. It maps four potential value chain scenarios depending on the degree of concentration in the markets for agrifood supply (farmers and manufacturers) and demand (supermarkets and other food retailers) and discusses the impact of lead firms and key intermediaries on smallholders in different chain situations. Each scenario is illustrated with case examples. Theoretical and policy issues are discussed, along with proposals for future research in terms of industry structure, private governance, and sustainable value chains.
Abstract:
Delivering a lecture requires confidence, sound knowledge and well-developed teaching skills (Cooper and Simonds, 2007; Quinn and Hughes, 2007). However, practitioners who are new to lecturing large groups in higher education may initially lack the confidence to do so, which can manifest itself in their verbal and non-verbal cues and in the fluency of their teaching skills. This gives rise to the perception that students can identify the confident and the non-confident teacher during a lecture (Street, 2007), potentially contributing to a lecturer’s level of anxiety prior to, and during, a lecture. Therefore, in the current educational climate of consumerisation, with the increased evaluation of teaching by students, the ability to deliver high-quality, informed and interesting lectures assumes greater significance for both lecturers and universities (Carr, 2007; Higher Education Funding Council, 2008; Glass et al., 2006). This paper presents both the quantitative and qualitative data from a two-phase mixed-methods study with 75 nurse lecturers and 62 nursing students in one university in the United Kingdom. The study investigated the notion that lecturing has similarities to acting (Street, 2007). The findings presented here concern how students perceived lecturers’ level of confidence and how lecturers believed they demonstrated confidence. In phase one a specifically designed questionnaire was distributed to both lecturers and students, achieving a response rate of 91% (n=125), while in phase two 12 in-depth semi-structured interviews were conducted with lecturers. Results suggested that students in a lecture could identify whether the lecturer was confident or not by the way they performed a lecture. Students identified 57 manifestations of non-confidence and lecturers identified 85, while 57 manifestations of confidence were identified by students and 88 by lecturers. Overall, these fell into 12 main converse categories, ranging from body language to the use of space within the room. Both students and lecturers ranked body language, vocal qualities, delivery skills, involving the students and the ability to share knowledge as the most evident manifestations of confidence. Elements such as good eye contact, smiling, speaking clearly and being fluent in the use of media resources were all seen as manifestations of confidence; conversely, if these were poorly executed, an impression of under-confidence was conveyed. Furthermore, if the lecturer appeared enthusiastic, this was clearly underpinned by the manifestation of a highly confident lecturer who was secure in their knowledge base and teaching abilities: “Some lecturers do appear enthusiastic but others don’t. I think the ones that do know what they are talking about, you can see it in their voice and in their lively body language. I think they are also good at involving the students even. I think the good ones are able to turn boring subjects into lively and interesting ones.” (Student 50) Significantly more lecturers than students felt the lecturer should appear confident when lecturing. The lecturers stated it was particularly important to do so when they did not feel confident, because they were concerned with appearing capable. It seems that these students and lecturers perceived that expressive and apparently confident lecturers can make a positive impact on student groups in terms of involvement in lectures; the data also suggested the reverse for the under-confident lecturer.
Findings from phase two indicated that these lecturers assumed a persona when lecturing, particularly, but not exclusively, when they were nervous. These lecturers went through a process of assuming and maintaining this persona before and during a lecture as a way of promoting their internal perception of confidence as well as their outward manifestation of confidence. Although assuming a convincing persona may involve a degree of deception, provided the knowledge communicated is accurate, the deception may aid rather than hinder learning, because it enhances the delivery of the lecture. Therefore, the deception of acting a little more confidently than one feels might be justified when the lecturer knows the knowledge they are communicating is correct, unlike the Dr Fox Effect, where the person delivering the lecture is an actor who does not know the subject in any detail or depth and where the deception cannot be justified (Naftulin et al., 1973). In conclusion, these students and lecturers perceive that confident and enthusiastic lecturers communicate their passion for the subject in an interesting and meaningful manner through the use of their voice, body, space and interactions, in a way that shows confidence in their knowledge as well as their teaching abilities. If lecturers can therefore take a step back to consider how they deliver lectures in apparently confident ways, this may increase their ability to engage their students and not only help them to be perceived as good lecturers, but also contribute to the genuine act of education.
Abstract:
Aims: To determine the extent to which clinical nursing practice has adopted research evidence, to identify barriers to the application of research findings in practice, and to propose ways of overcoming these barriers. Background: As early as 1976, nursing and midwifery practice started adopting research evidence. By the 1990s, research evidence had become somewhat more visible in practice, but more could have been done to widen its adoption. Many barriers were identified that could hinder implementation of the evidence in practice, and the effort to remove these remains weak. Evaluation: Twenty-five research articles from across Europe and America were selected and scrutinized, and their recommendations analysed. Findings: Many clinical practitioners report a lack of time, ability and motivation to appraise research reports and adopt findings in practice. The clinical environment was not seen as research-friendly, as there was a general lack of research activities and facilities locally. There was a clear lack of research leadership in practice. Implications for nursing management: This paper reviewed the evidence from several published research papers and provides consultant nurses with practical suggestions on how to enhance the application of research evidence in their practice. It recommends how consultant nurses can make their practice more research-transparent by providing the required leadership, creating a research-friendly organization, developing a clear research agenda and facilitating staff in developing a local framework for reading research and implementing research evidence in their practice.
Abstract:
To seize the potential of serious games, the RAGE project - funded by the Horizon 2020 Programme of the European Commission - will make available an interoperable set of advanced technology components (software assets) that support game studios in serious game development. This paper describes the overall software architecture and the design conditions that are needed for the easy integration and reuse of such software assets in existing game platforms. Based on the component-based software engineering paradigm, the RAGE architecture takes into account the portability of assets to different operating systems, different programming languages and different game engines. It avoids dependencies on external software frameworks and minimizes code that may hinder integration with game engine code. Furthermore, it relies on a limited set of standard software patterns and well-established coding practices. The RAGE architecture has been successfully validated by implementing and testing basic software assets in four major programming languages (C#, C++, Java and TypeScript/JavaScript). A demonstrator implementation of asset integration with an existing game engine was created and validated. The presented RAGE architecture paves the way for large-scale development and application of cross-engine reusable software assets for enhancing the quality and diversity of serious gaming.
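As a hedged illustration of the component-based idea described above, the Python sketch below shows an asset that depends only on a narrow, engine-agnostic interface which the host game engine implements; the names (IDataBridge, DifficultyAsset, ConsoleBridge) and the adaptive-difficulty logic are hypothetical and do not reproduce the actual RAGE asset API.

```python
# Hypothetical sketch: a reusable "software asset" that avoids engine- and
# framework-specific dependencies by delegating platform services to a bridge
# supplied by the host game engine.
from abc import ABC, abstractmethod


class IDataBridge(ABC):
    """Engine-provided services the asset is allowed to rely on."""

    @abstractmethod
    def log(self, message: str) -> None: ...

    @abstractmethod
    def save(self, key: str, value: str) -> None: ...


class DifficultyAsset:
    """A reusable gameplay component containing no engine-specific code."""

    def __init__(self, bridge: IDataBridge):
        self._bridge = bridge
        self._difficulty = 0.5

    def record_result(self, player_won: bool) -> None:
        # Simple adaptive rule; real assets would encapsulate richer logic.
        self._difficulty += 0.1 if player_won else -0.1
        self._difficulty = min(1.0, max(0.0, self._difficulty))
        self._bridge.save("difficulty", str(self._difficulty))
        self._bridge.log(f"difficulty set to {self._difficulty:.2f}")


class ConsoleBridge(IDataBridge):
    """Minimal host-side implementation a game engine might provide."""

    def log(self, message: str) -> None:
        print(f"[engine] {message}")

    def save(self, key: str, value: str) -> None:
        print(f"[engine] stored {key}={value}")


asset = DifficultyAsset(ConsoleBridge())
asset.record_result(player_won=True)
```

The point of the pattern is that the asset itself carries no engine- or framework-specific code, so porting it to another engine (or another language, as RAGE reports for C#, C++, Java and TypeScript/JavaScript) only requires re-implementing the thin bridge.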
Abstract:
Marine diatoms and dinoflagellates play a variety of key ecosystem roles as important primary producers (diatoms and some dinoflagellates) and grazers (some dinoflagellates). Additionally, some are harmful algal bloom (HAB) species, and there is widespread concern that HAB species may be increasing, accompanied by major negative socio-economic impacts, including threats to human health and marine harvesting [1, 2]. Using 92,263 samples from the Continuous Plankton Recorder survey, we generated a 50-year (1960–2009) time series of diatom and dinoflagellate occurrence in the northeast Atlantic and North Sea. Dinoflagellates, including both HAB taxa (for example, Prorocentrum spp.) and non-HAB taxa (for example, Ceratium furca), have declined in abundance, particularly since 2006. In contrast, diatom abundance has not shown this decline, with some common diatoms, including both HAB (for example, Pseudo-nitzschia spp.) and non-HAB (for example, Thalassiosira spp.) taxa, increasing in abundance. Overall, these changes have led to a marked increase in the relative abundance of diatoms versus dinoflagellates. Our analyses, including Granger tests to identify criteria of causality, indicate that this switch is driven by an interaction between increasing sea surface temperatures and increasingly windy conditions in summer.
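As a general note on the method mentioned above (a generic formulation, not the study's exact model), a Granger test of whether a candidate driver x_t, say summer wind speed, helps forecast a response y_t, such as the diatom-to-dinoflagellate ratio, augments an autoregression of y with lags of x and tests whether the lagged x terms are jointly zero:

```latex
y_t = \alpha_0 + \sum_{i=1}^{p} \alpha_i\, y_{t-i} + \sum_{j=1}^{p} \beta_j\, x_{t-j} + \varepsilon_t,
\qquad
H_0 : \beta_1 = \beta_2 = \cdots = \beta_p = 0
```

Rejecting the null (typically with an F-test against the restricted model without the x lags) is read as x "Granger-causing" y, meaning that past values of x improve the forecast of y beyond what y's own history provides.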