991 results for Reading problem
Abstract:
This study examined the effect of explicitly instructing students to use a repertoire of reading comprehension strategies. Specifically, this study examined whether providing students with a "predictive story-frame" which combined the use of prediction and summarization strategies improved their reading comprehension relative to providing students with generic instruction on prediction and summarization. Results were examined in terms of instructional condition and reading ability. Students from 2 grade 4 classes participated in this study. The reading component of the Canadian Achievement Tests, Second Edition (CAT/2) was used to identify students as either "average or above average" or "below average" readers. Students received either strategic prediction and summarization instruction (story-frame) or generic prediction and summarization instruction (notepad). Students were provided with new but comparable stories for each session. For both groups, the researcher modelled the strategic tools and provided guided practice, independent practice, and independent reading sessions. Comprehension was measured with an immediate and 1-week delayed comprehension test for each of the 4 stories. In addition, students participated in a 1-week delayed interview, where they were asked to retell the story and to answer questions about the central elements (character, setting, problem, solution, beginning, middle, and ending events) of each story. There were significant differences, with medium to large effect sizes, in comprehension and recall scores as a function of both instructional condition and reading ability. Students in the story-frame condition outperformed students in the notepad condition, and average to above average readers performed better than below average readers. Students in the story-frame condition outperformed students in the notepad condition on the comprehension tests and on the oral retellings when teacher modelling and guidance were present. In the cued recall sessions, students in the story-frame instructional condition recalled more correct information and generated fewer errors than students in the notepad condition. Average to above average readers performed better than below average readers across comprehension and retelling measures. The majority of students in both instructional conditions reported that they would use their strategic tool again.
Abstract:
The main thrust of this thesis is the re-exploration of Friedrich Nietzsche's "critique of nihilism" through the lenses of Gilles Deleuze. A Deleuzian reading of Nietzsche is motivated by a post-deconstructive style of interpretation, inasmuch as Deleuze goes beyond, or in between, hermeneutics and deconstruction. Deleuze's post-deconstructive reading of Nietzsche is, however, only secondary to the main aim of this thesis. The primary thrust of this study is the critique of a way of thinking characterized by Nietzsche as nihilistic. Therefore, it should be noted that this study is not about Deleuze's reading per se; rather, it is an appraisal of Nietzsche's "critique of nihilism" using Deleuze's experimental reading. We will bring together Nietzsche's critique and Deleuze's post-deconstructive reading in order to appraise Nietzsche's critique itself. Insofar as we have underscored Deleuze's purported experimentation with Nietzschean themes, this study is also an experiment in itself. Through this experimentation, we will find out whether it is possible to partly gloss Nietzsche's critique of nihilism through Deleuzian phraseology. Far from presenting a mere exposition of Nietzsche's text, we are, rather, re-reading, that is, re-evaluating Nietzsche's critique of nihilism through Deleuze's experimentation. This is our way of thinking with Nietzsche. Nihilism is the central problem upon which Nietzsche's philosophical musings are directed; he deems nihilism a cultural experience and, as such, a phenomenon to be reckoned with. In our reconstruction of Nietzsche's critique of nihilism, we locate two related elements which constitute the structure of the prescription of a cure, i.e., the ethics of affirmation and the ontology of becoming. Appraising Nietzsche's ethics and ontology amounts to clarifying what Deleuze thinks of as the movement from the "dogmatic image of thought" to the "new image of thought." Through this new image of thought, Deleuze makes sense of a Nietzschean counterculture, which is a perspective that resists traditional or representational metaphysics. Deleuze takes the reversal of Platonism or the transmutation of values to be the point of departure. We have to, according to Deleuze, abandon our old image of the world in order to free ourselves from the obscurantism of foundationalist or essentialist thinking. It is only through the transmutation of values that we can make sense of Nietzsche's ethics of affirmation and ontology of becoming. We have to think of Nietzsche's ethics as an "ethics" and not a moral philosophy, and we have to think of his ontology as "ontology" and not as metaphysics. Through Deleuze, we are able to avoid reading Nietzsche as a moral philosopher and metaphysician. Rather, we are able to read Nietzsche as one espousing an ethical imperative through the thought of the eternal return and one advocating a theory of existence based on an immanent, as opposed to transcendent, image of the world.
Abstract:
This thesis takes as its point of departure the early Heidegger's idea that the meaning of being must be sought by means of an ontological phenomenology, more precisely through the existential analytic of Dasein and the destruction of the history of ontology; or, as we interpret it, in the transcendence of Dasein and the transcendence of being. The study of the problem of transcendence is carried out through a phenomenological approach, one which pays particular attention to lived experience as such and to the conditions of possibility of that experience, and which rests on a careful and critical reading of Heidegger's works. It is such a phenomenological approach that will allow us to carry out our analysis of the problem of transcendence in the Heideggerian corpus. We will also be in a position to address the debate on this subject between the analytic tradition (more particularly the pragmatist approach) and the continental tradition, our study falling within the latter. We propose here a phenomenology of the problem of transcendence that also serves as a phenomenology of meaning, of possibility, and of normativity. Taking certain contributions from the continental tradition as a starting point, we argue that the meaning of being can be understood as the problem of transcendence. The history of philosophy must be disrupted, deconstructed, and rethought so that the still-unthought path of philosophy can be brought to light. Access to this other beginning must be sought in transcendence as such: from the call of conscience grounded in nullity to the authentic encounter with death and the opening of temporality; from the historial advent of being to, ultimately, the refusal of being and the withdrawal of the nothing. The event (Ereignis) of being is thus understood as a process of self-overcoming from which the transcendence of being, or, as Heidegger puts it, the end of questions, is possible.
Abstract:
Taking into account the study of Luegi (2006), in which the eye movements of 20 Portuguese university students were recorded while reading text passages, in this article we discuss some methodological issues concerning eye-tracking measures used to evaluate reading difficulties. Relating syntactic complexity, grammaticality and ambiguity to eye movements, we discuss the use of several different dependent variables that index immediate and delayed processes in text processing. We propose a new measure, which we call Progression-Path, that permits analyzing, in the critical region, what happens when the reader proceeds through the sentence instead of going backwards to solve a problem that he or she has found (which is the most commonly expected behavior, but not the only one, as some of our examples illustrate).
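For readers less familiar with region-based eye-tracking measures, the sketch below shows how two standard dependent variables of this kind, first-pass time and go-past (regression-path) duration, can be derived from a fixation sequence. It is not the authors' definition of Progression-Path; the data layout and function names are assumptions introduced here purely for illustration.

```python
# Illustrative sketch only: two standard region-based reading-time measures
# computed from a fixation sequence. Data layout and names are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Fixation:
    region: int      # index of the text region the fixation lands in
    duration: int    # fixation duration in milliseconds

def first_pass_time(fixations: List[Fixation], critical: int) -> int:
    """Sum of fixations in the critical region before it is first exited."""
    total, entered = 0, False
    for fix in fixations:
        if fix.region == critical:
            entered = True
            total += fix.duration
        elif entered:
            break  # region left (to the right or the left): first pass ends
    return total

def go_past_time(fixations: List[Fixation], critical: int) -> int:
    """Sum of all fixations from first entering the critical region until a
    region to its right is fixated (so it includes regressive re-reading)."""
    total, entered = 0, False
    for fix in fixations:
        if fix.region == critical:
            entered = True
        if entered:
            if fix.region > critical:
                break  # reader has moved past the critical region
            total += fix.duration
    return total

# Example scanpath over regions 0..3, with region 2 as the critical region.
scanpath = [Fixation(0, 210), Fixation(1, 180), Fixation(2, 250),
            Fixation(1, 190), Fixation(2, 220), Fixation(3, 200)]
print(first_pass_time(scanpath, 2))  # 250
print(go_past_time(scanpath, 2))     # 250 + 190 + 220 = 660
```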
Abstract:
In this paper we consider the 2D Dirichlet boundary value problem for Laplace's equation in a non-locally perturbed half-plane, with data in the space of bounded and continuous functions. We show uniqueness of solution, using standard Phragmén-Lindelöf arguments. The main result is to propose a boundary integral equation formulation, to prove its equivalence with the boundary value problem, and to show that the integral equation is well posed by applying a recent partial generalisation of the Fredholm alternative in Arens et al. [J. Int. Equ. Appl. 15 (2003), pp. 1-35]. This then leads to an existence proof for the boundary value problem. Keywords: boundary integral equation method, water waves, Laplace's equation.
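For orientation, the boundary value problem described above can be stated schematically as follows, where D denotes the non-locally perturbed half-plane, Γ its boundary, and BC(Γ) the bounded continuous functions on Γ (the symbols are ours, not necessarily the paper's notation):

```latex
% Schematic Dirichlet problem for Laplace's equation in the perturbed half-plane D
% with boundary \Gamma (notation ours, not necessarily the paper's).
\Delta u = 0 \ \text{ in } D, \qquad
u = f \ \text{ on } \Gamma, \quad f \in BC(\Gamma), \qquad
\sup_{x \in D} \lvert u(x) \rvert < \infty .
```

Schematically, the boundary integral formulation recasts this as a second-kind equation of the form (I + K)φ = g on Γ, and it is the invertibility of operators of this type that the cited Fredholm-type result is used to establish; the precise kernel and function spaces are those of the paper, not anything implied here.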
Abstract:
The Proceedings of the Ninth Annual Conference of the British Association for Biological Anthropology and Osteoarchaeology (BABAO), held at the University of Reading in 2007. Contents: 1) A life course perspective of growing up in medieval London: evidence of sub-adult health from St Mary Spital (London) (Rebecca Redfern and Don Walker); 2) Preservation of non-adult long bones from an almshouse cemetery in the United States dating to the late nineteenth to the early twentieth centuries (Colleen Milligan, Jessica Zotcavage and Norman Sullivan); 3) Childhood oral health: dental palaeopathology of Kellis 2, Dakhleh, Egypt. A preliminary investigation (Stephanie Shukrum and JE Molto); 4) Skeletal manifestation of non-adult scurvy from early medieval Northumbria: the Black Gate cemetery, Newcastle-upon-Tyne (Diana Mahoney-Swales and Pia Nystrom); 5) Infantile cortical hyperostosis: cases, causes and contradictions (Mary Lewis and Rebecca Gowland); 6) Tuberculosis of the hip in Victorian Britain (Benjamin Clarke and Piers Mitchell); 7) The re-analysis of Iron Age human skeletal material from Winnall Down (Justine Tracey); 8) Can we estimate post-mortem interval from an individual body part? A field study using Sus scrofa (Branka Franicevec and Robert Pastor); 9) The expression of asymmetry in hand bones from the medieval cemetery at Écija, Spain (Lisa Cashmore and Sonia Zakrzewski); 10) Returning remains: a curator's view (Quinton Carroll); 11) Authority and decision making over British human remains: issues and challenges (Piotr Bienkowski and Malcolm Chapman); 12) Ethical dimensions of reburial, retention and repatriation of archaeological human remains: a British perspective (Simon Mays and Martin Smith); 13) The problem of provenance: inaccuracies, changes and misconceptions (Margaret Clegg); 14) Native American human remains in UK collections: implications of NAGPRA to consultation, repatriation, and policy development (Myra J Giesen); 15) Repatriation – a view from the receiving end: New Zealand (Nancy Tayles).
Abstract:
Problem structuring methods, or PSMs, are widely applied across a range of variable but generally small-scale organizational contexts. However, it has been argued that they are seen and experienced less often in areas of wide-ranging and highly complex human activity, specifically those relating to sustainability, environment, democracy and conflict (or SEDC). In an attempt to plan, track and influence human activity in SEDC contexts, the authors of this paper make the theoretical case for a PSM derived from various existing approaches. They show how it could make a contribution in a specific practical context: sustainable coastal development projects around the Mediterranean which have utilized systemic and prospective sustainability analysis or, as it is now known, Imagine. The latter is itself a PSM, but one which is 'bounded' within the limits of the project to help deliver the required 'deliverables' set out in the project blueprint. The authors argue that sustainable development projects would benefit from a deconstruction of process by those engaged in the project and suggest one approach that could be taken: a breakout from a project-bounded PSM to an analysis that embraces the project itself. The paper begins with an introduction to the sustainable development context and literature and then illustrates the issues by grounding the debate within a set of projects facilitated by Blue Plan for Mediterranean coastal zones. It then shows how the analytical framework could be applied and what insights might be generated.
Abstract:
Indicators are commonly recommended as tools for assessing the attainment of development, and the current vogue is for aggregating a number of indicators together into a single index. It is claimed that such indices of development help facilitate maximum impact in policy terms by appealing to those who may not necessarily have technical expertise in data collection, analysis and interpretation. In order to help counter criticisms of over-simplification, those advocating such indices also suggest that the raw data be provided so as to allow disaggregation into component parts and hence facilitate a more subtle interpretation if a reader so desires. This paper examines the problems involved in interpreting indices of development by focusing on the United Nations Development Programme's (UNDP) Human Development Index (HDI), published each year in the Human Development Reports (HDRs). The HDI was intended to provide an alternative to the more economically based indices, such as GDP, commonly used within neo-liberal development agendas. The paper explores the use of the HDI as a gauge of human development by making comparisons between two major political and economic communities in Africa (ECOWAS and SADC). While the HDI did help highlight important changes in human development over the 10 years examined, it is concluded that the HDI and its components are difficult to interpret, as methodologies have changed significantly and the 'averaging' nature of the HDI could hide information unless care is taken. The paper discusses the applicability of alternative models to the HDI, such as the more neo-populist-centred methods commonly advocated for indicators of sustainable development. (C) 2003 Elsevier Ltd. All rights reserved.
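To make the 'averaging' point concrete, the aggregation used for the HDI in the Human Development Reports of the period discussed (the pre-2010 methodology) can be sketched as follows; the exact goalposts and the internal weighting of the education component varied across report years and are omitted here:

```latex
% Pre-2010 HDI aggregation (sketch): each dimension index is a min-max rescaling,
% and the HDI is their unweighted arithmetic mean. Goalposts varied by report year.
I_{\text{dim}} = \frac{x - x_{\min}}{x_{\max} - x_{\min}}, \qquad
\mathrm{HDI} = \tfrac{1}{3}\bigl(I_{\text{life}} + I_{\text{education}} + I_{\text{income}}\bigr).
```

Because the three components enter with equal weight, offsetting movements, such as a fall in the income index compensated by a rise in the education index, leave the headline figure unchanged, which is exactly the kind of information the 'averaging' can hide.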
Abstract:
Pressing global environmental problems highlight the need to develop tools to measure progress towards "sustainability." However, some argue that any such attempt inevitably reflects the views of those creating the tools and only produces highly contested notions of "reality." To explore this tension, we critically assess the Environmental Sustainability Index (ESI), a well-publicized product of the World Economic Forum that is designed to measure 'sustainability' by ranking nations in league tables based on extensive databases of environmental indicators. By recreating this index, and then using statistical tools (principal components analysis) to test relations between various components of the index, we challenge the ways in which countries are ranked in the ESI. Based on this analysis, we suggest (1) that the approach taken to aggregate, interpret and present the ESI creates a misleading impression that Western countries are more sustainable than the developing world; (2) that unaccounted-for methodological biases allowed the authors of the ESI to over-generalize the relative 'sustainability' of different countries; and (3) that this has resulted in simplistic conclusions about the relation between economic growth and environmental sustainability. This criticism should not be interpreted as a call for the abandonment of efforts to create standardized comparable data. Instead, this paper proposes that indicator selection and data collection should draw on a range of voices, including local stakeholders as well as international experts. We also propose that aggregating data into final league ranking tables is too prone to error and creates the illusion of absolute and categorical interpretations. (c) 2004 Elsevier Ltd. All rights reserved.
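As an illustration of the kind of statistical check described above (and not a recreation of the paper's actual ESI analysis), the following minimal sketch runs a principal components analysis over a country-by-indicator matrix; the file name and column layout are assumptions introduced here.

```python
# Minimal sketch: PCA over a country-by-indicator matrix, to see how much
# variance a few components capture and which indicators load on them.
# The file name and column layout are assumptions, not the ESI dataset itself.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Rows: countries; columns: environmental indicators (index column = country name).
indicators = pd.read_csv("esi_indicators.csv", index_col=0)

# Standardize so indicators measured on different scales are comparable.
X = StandardScaler().fit_transform(indicators.values)

pca = PCA(n_components=5)
scores = pca.fit_transform(X)

print("Variance explained:", pca.explained_variance_ratio_.round(3))

# Loadings: which indicators drive each principal component.
loadings = pd.DataFrame(
    pca.components_.T,
    index=indicators.columns,
    columns=[f"PC{i+1}" for i in range(5)],
)
print(loadings.sort_values("PC1", ascending=False).head(10))
```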
Abstract:
The “butterfly effect” is a popularly known paradigm; commonly it is said that when a butterfly flaps its wings in Brazil, it may cause a tornado in Texas. This essentially describes how weather forecasts can be extremely sensitive to small changes in the atmospheric data, or initial conditions, used in computer model simulations. In 1961 Edward Lorenz found, when running a weather model, that small changes in the initial conditions given to the model can, over time, lead to entirely different forecasts (Lorenz, 1963). This discovery highlights one of the major challenges in modern weather forecasting: to provide the computer model with the most accurately specified initial conditions possible. A process known as data assimilation seeks to minimize the errors in the given initial conditions and was, in 1911, described by Bjerknes as “the ultimate problem in meteorology” (Bjerknes, 1911).
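For reference, the sensitivity Lorenz observed arose in the three-variable convection model presented in Lorenz (1963), which can be written as:

```latex
% The three-variable convection model of Lorenz (1963), with the classic parameters.
\begin{aligned}
  \dot{x} &= \sigma\,(y - x),\\
  \dot{y} &= x\,(\rho - z) - y,\\
  \dot{z} &= x\,y - \beta\,z,
\end{aligned}
\qquad \sigma = 10,\ \rho = 28,\ \beta = \tfrac{8}{3}.
```

With these parameter values, two trajectories started from almost identical initial conditions diverge rapidly, which is the numerical counterpart of the forecasting challenge described above and the reason accurate initial conditions, and hence data assimilation, matter so much.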