47 results for Kalin, Maija: Coping with problems of understanding

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

Recent work has shown that the evolution of Drosophila melanogaster resistance to attack by the parasitoid Asobara tabida is constrained by a trade-off with larval competitive ability. However, there are two very important questions that need to be answered. First, is this a general cost, or is it parasitoid specific? Second, does a selected increase in immune response against one parasitoid species result in a correlated change in resistance to other parasitoid species? The answers to both questions will influence the coevolutionary dynamics of these species, and also may have a previously unconsidered, yet important, influence on community structure.

Relevance: 100.00%

Abstract:

Apraxia of speech (AOS) is typically described as a motor-speech disorder with clinically well-defined symptoms, but without a clear understanding of the underlying problems in motor control. A number of studies have compared the speech of subjects with AOS to the fluent speech of controls, but only a few have included speech movement data, and these were primarily restricted to single articulators. If AOS reflects a basic neuromotor dysfunction, this should somehow be evident in the production of both dysfluent and perceptually fluent speech. The current study compared motor control strategies for the production of perceptually fluent speech between a young woman with AOS and Broca’s aphasia and a group of age-matched control speakers, using concepts and tools from articulation-based theories. In addition, to examine the potential role of specific movement variables in gestural coordination, a second part of this study compared fluent and dysfluent speech samples from the speaker with AOS. Movement data from the lips, jaw and tongue were acquired using the AG-100 EMMA system during the reiterated production of multisyllabic nonwords. The findings indicated that although the kinematic parameters of fluent speech in the subject with AOS and Broca’s aphasia were generally similar to those of the age-matched controls, speech task-related differences were observed in upper lip movements and lip coordination. The comparison between fluent and dysfluent speech characteristics suggested that fluent speech was achieved through the use of specific motor control strategies, highlighting the potential association between the stability of coordinative patterns and movement range, as described in Coordination Dynamics theory.
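The coordination measures referred to above (stability of coordinative patterns, movement range) are commonly operationalised as the relative phase between paired articulator signals. The Python sketch below illustrates one such computation on synthetic displacement traces; the sampling rate, signal names and stability metric are assumptions for illustration, not details taken from the paper or its AG-100 EMMA recordings.

```python
# Hypothetical sketch: quantifying inter-articulator coordination from
# kinematic displacement signals, in the spirit of Coordination Dynamics.
# The synthetic traces and sampling rate below are illustrative only.
import numpy as np
from scipy.signal import hilbert

fs = 200.0                      # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)   # two seconds of reiterated syllables

# Synthetic upper-lip and lower-lip displacement traces (mm), roughly periodic
upper_lip = 2.0 * np.sin(2 * np.pi * 4 * t)
lower_lip = 5.0 * np.sin(2 * np.pi * 4 * t + 0.3) + 0.2 * np.random.randn(t.size)

def continuous_relative_phase(x, y):
    """Relative phase between two zero-centred signals via the Hilbert transform."""
    phix = np.angle(hilbert(x - x.mean()))
    phiy = np.angle(hilbert(y - y.mean()))
    return np.angle(np.exp(1j * (phix - phiy)))   # wrapped to [-pi, pi]

rel_phase = continuous_relative_phase(upper_lip, lower_lip)

# Movement range (amplitude) and coordination stability (circular SD of phase)
movement_range = upper_lip.max() - upper_lip.min()
phase_stability = np.sqrt(-2 * np.log(np.abs(np.mean(np.exp(1j * rel_phase)))))

print(f"upper-lip range: {movement_range:.2f} mm")
print(f"mean relative phase: {np.degrees(np.mean(rel_phase)):.1f} deg")
print(f"circular SD of phase (lower = more stable): {np.degrees(phase_stability):.1f} deg")
```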

Relevance: 100.00%

Abstract:

The present study sets out to examine the strategies used by Chinese learners in a predominantly naturalistic environment and how such learner strategy use relates to their proficiency in the second language. Data were collected from four Chinese research students in the UK using semi-structured interviews. Their proficiency in English was assessed with an oral interview and a listening test. The main findings from this study are that the learners used a wide range of strategies overall, including metacognitive, cognitive, social/affective and compensation strategies. The majority of the commonly reported strategies were metacognitive strategies, suggesting that the learners were self-directed and attempting to manage their own learning in an informal context. They also showed idiosyncrasies in their use of learner strategies. Attempts to explain the learners’ strategy use in relation to their levels of proficiency in English and contextual factors, as well as several other factors, are offered. Implications for target-country institutions in terms of the provision of support to Chinese students are discussed.

Relevance: 100.00%

Abstract:

In this paper we describe how to cope with the delays inherent in a real-time control system for a steerable stereo head/eye platform. A purposive and reactive system requires fast vision algorithms to provide the controller with the error signals that drive the platform. A time-critical implementation of these algorithms is necessary, not only to enable short-latency reaction to real-world events, but also to provide sufficiently high-frequency results with small enough delays that the controller remains stable. However, even with precise knowledge of the delay, nonlinearities in the plant make accurate modelling of that plant impossible, thus precluding the use of a Smith Regulator. Moreover, the major delay in the system lies in the feedback loop (image capture and vision processing) rather than the feed-forward (controller) loop. Delays ranging between 40 ms and 80 ms are common for simple 2D processes, but can extend to several hundred milliseconds for more sophisticated 3D processes. The strategy presented gives precise control over the gaze direction of the cameras despite the lack of a priori knowledge of the delays involved. The resulting controller is shown to have a similar structure to the Smith Regulator, but with essential modifications.
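As a point of reference for the Smith-Regulator-like structure mentioned above, the following Python sketch simulates a textbook Smith predictor around a linear first-order plant with a pure feedback delay. The plant model, PI gains and 60 ms delay are illustrative assumptions; the paper's controller handles an unknown delay and a nonlinear plant, which this baseline sketch does not attempt to reproduce.

```python
# Minimal sketch of a Smith-predictor-style loop with delayed feedback.
# First-order plant, gains and delay are illustrative assumptions, not
# the head/eye platform model from the paper.
import numpy as np

dt = 0.01                        # control period (s)
delay_steps = int(0.06 / dt)     # assume ~60 ms of image capture + vision delay
a, b = 0.95, 0.05                # assumed first-order gaze-error plant
kp, ki = 2.0, 0.5                # PI gains (illustrative)

x = 0.0                          # true plant state (gaze position)
xm = 0.0                         # internal model state (no delay)
meas_buf = [0.0] * delay_steps   # pipeline of delayed measurements
model_buf = [0.0] * delay_steps  # matching pipeline of delayed model outputs
integ = 0.0
target = 1.0

for k in range(300):
    y_delayed = meas_buf[0]
    ym_delayed = model_buf[0]
    # Smith-predictor feedback: undelayed model output, corrected by the
    # mismatch between the delayed model output and the delayed measurement
    y_feedback = xm + (y_delayed - ym_delayed)
    err = target - y_feedback
    integ += err * dt
    u = kp * err + ki * integ

    # advance the true plant and the internal model, then shift the delay pipelines
    x = a * x + b * u
    xm = a * xm + b * u
    meas_buf = meas_buf[1:] + [x]
    model_buf = model_buf[1:] + [xm]

print(f"gaze output after {300 * dt:.1f} s: {x:.3f} (target {target})")
```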

Relevance: 100.00%

Abstract:

A university degree is effectively a prerequisite for entering the archaeological workforce in the UK. Archaeological employers consider that new entrants to the profession are insufficiently skilled, and hold university training to blame. University archaeology departments, however, do not consider it their responsibility to deliver fully formed archaeological professionals, but rather to provide an education that can then be applied in different workplaces, within and outside archaeology. The number of individuals studying archaeology at university exceeds the total number working in professional practice, with many more new graduates emerging each year than archaeological jobs advertised. Over-supply of practitioners is also a contributing factor to low pay in archaeology. Steps are being taken to provide opportunities for vocational training, both within and outside the university system, but archaeological training and education within the universities, and subsequently the archaeological labour market, may be adversely affected by the introduction of variable top-up student fees.

Relevance: 100.00%

Abstract:

Rationale: In UK hospitals, all total parenteral nutrition (TPN) products must be prepared in the pharmacy, as TPNs are categorised as high-risk injectables (NPSA/2007/20). The National Aseptic Error Reporting Scheme has been collecting data on pharmacy compounding errors in the UK since August 2003. This study reports on the types of error associated with the preparation of TPNs, including the stage at which these were identified and the potential and actual patient outcomes. Methods: Reports of compounding errors for the period 1/2004 - 3/2007 were analysed in an Excel spreadsheet. Results: Of a total of 3691 compounding error reports, 674 (18%) related to TPN products: 548 adult vs. 126 paediatric. A significantly higher proportion of adult TPNs (28% vs. 13% paediatric) were associated with labelling errors, and a significantly higher proportion of paediatric TPNs (25% vs. 15% adult) were associated with incorrect transcriptions (chi-square test; p<0.005). Labelling errors were identified roughly equally by pharmacists (42%) and technicians (48%), with technicians detecting them mainly at first check and pharmacists at final check. Transcription errors were identified mainly by technicians (65% vs. 27% pharmacists) at first check. Incorrect drug selection (13%) and calculation errors (9%) were associated with adult and paediatric TPN preparations in the same ratio. One paediatric TPN error detected at first check was considered potentially catastrophic; 31 (5%) errors were considered of major and 38 (6%) of moderate potential consequence. Five errors (2 moderate, 1 minor) were identified during or after administration. Conclusions: While recent UK patient safety initiatives are aimed at improving the safety of injectable medicines in clinical areas, the current study highlights safety problems that exist within pharmacy production units. These findings could inform the creation of an error management tool for TPN compounding processes within hospital pharmacies.
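The adult vs. paediatric comparison reported above is a standard chi-square test of independence on a 2x2 table. The Python sketch below reconstructs the labelling-error comparison approximately from the reported percentages (28% of 548 adult vs. 13% of 126 paediatric TPNs); the counts are rounded reconstructions for illustration, not the exact figures from the reporting scheme.

```python
# Illustrative reconstruction of the labelling-error comparison using
# scipy's chi-square test of independence. Counts are approximated from
# the reported percentages and totals, not the original data.
from scipy.stats import chi2_contingency

adult_total, paed_total = 548, 126
adult_label = round(0.28 * adult_total)   # ~153 labelling errors (approx.)
paed_label = round(0.13 * paed_total)     # ~16 labelling errors (approx.)

table = [
    [adult_label, adult_total - adult_label],   # adult: error / no error
    [paed_label, paed_total - paed_label],      # paediatric: error / no error
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```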

Relevance: 100.00%

Abstract:

The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m2 range, rather than the few J/m2 of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954, following which consideration of surface work has been omitted from analyses of metal cutting. The much greater values of specific surface work are not surprising in terms of ductile fracture mechanics, where kJ/m2 values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional ‘plasticity and friction only’ analyses seem to have no quantitative explanation are given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level, as the uncut chip thickness is increased, is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at the very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics. The toughness/strength ratio of a given material will change with rate, temperature and thermomechanical treatment, and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
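The force decomposition implied above can be sketched numerically: a shear-plane (plasticity) term that scales with uncut chip thickness plus a surface-work term that does not, so the intercept of cutting force vs. depth of cut estimates the specific surface work. The sketch below uses the classical Merchant shear-plane expression with an added R*w term and illustrative material values; it is a schematic of the idea, not the paper's full analysis, and the particular non-dimensional grouping R/(tau_y*t) is one consistent way of forming the toughness/strength-depth parameter mentioned in the abstract.

```python
# Schematic cutting-force model: Merchant shear-plane term + surface-work term R*w.
# All material properties and angles are illustrative assumptions.
import numpy as np

tau_y = 400e6            # shear yield stress (Pa), illustrative
R = 30e3                 # specific surface work / fracture toughness (J/m^2)
w = 3e-3                 # width of cut (m)
alpha = np.radians(6)    # rake angle
beta = np.radians(25)    # friction angle
phi = np.radians(28)     # primary shear plane angle (held fixed here for simplicity)

t = np.linspace(5e-6, 200e-6, 50)   # uncut chip thickness (m)

plastic_term = tau_y * t * w * np.cos(beta - alpha) / (np.sin(phi) * np.cos(phi + beta - alpha))
Fc = plastic_term + R * w           # cutting force with the surface-work term added

# A linear fit of force vs. depth of cut recovers the intercept, which estimates R*w
slope, intercept = np.polyfit(t, Fc, 1)
print(f"fitted intercept: {intercept:.2f} N  (R*w = {R * w:.2f} N)")
print(f"toughness/strength group R/(tau_y*t) at t = 25 um: {R / (tau_y * 25e-6):.3f}")
```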

Relevance: 100.00%

Abstract:

This paper uses the social model of disability to examine visually impaired children's experiences of their housing and neighbourhoods, and finds that they did not experience any significant problems with the design of these environments. The problems they did experience arose within these environments and were caused by factors such as the intensity of movement, for example from flows of traffic. We conclude by discussing the social policy implications of these findings.

Relevance: 100.00%

Abstract:

Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts of the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. the primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km and a vertical extension of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5° and 15 vertical levels, covering the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, the exchange of sensible heat and the release of latent heat in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre’s model.
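The "conservation of absolute vorticity" idea behind those very first forecasts can be illustrated in a few lines of numerical code. The sketch below performs a single advection step of d(zeta + f)/dt = 0 on a coarse periodic grid; the grid, wind field, time step and constant Coriolis parameter are assumptions for illustration, and a full barotropic model would also invert the vorticity for a streamfunction to update the winds at each step. It is not the Centre's primitive-equation model.

```python
# Minimal sketch: one forward step of absolute-vorticity advection,
# d(zeta + f)/dt = 0, on a coarse grid. All fields are illustrative.
import numpy as np

nx, ny = 72, 36
dx = dy = 1.5 * 111e3            # ~1.5 degree grid spacing in metres
dt = 1800.0                      # 30-minute time step
f = 1e-4 * np.ones((ny, nx))     # Coriolis parameter (taken constant for simplicity)

# Illustrative initial fields: weak sinusoidal relative vorticity, uniform westerly wind
y, x = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
zeta = 1e-5 * np.sin(2 * np.pi * x / nx) * np.sin(np.pi * y / ny)
u = 10.0 * np.ones((ny, nx))     # m/s
v = np.zeros((ny, nx))

def ddx(a):      # centred difference, periodic in x
    return (np.roll(a, -1, axis=1) - np.roll(a, 1, axis=1)) / (2 * dx)

def ddy(a):      # centred difference, periodic in y (crude, for illustration)
    return (np.roll(a, -1, axis=0) - np.roll(a, 1, axis=0)) / (2 * dy)

# One forward-Euler step of d(zeta)/dt = -u d(zeta+f)/dx - v d(zeta+f)/dy
abs_vort = zeta + f
zeta_new = zeta - dt * (u * ddx(abs_vort) + v * ddy(abs_vort))
print(f"max |zeta| before: {np.abs(zeta).max():.2e}  after one step: {np.abs(zeta_new).max():.2e}")
```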

Relevance: 100.00%

Abstract:

In The Conduct of Inquiry in International Relations, Patrick Jackson situates methodologies in International Relations in relation to their underlying philosophical assumptions. One of his aims is to map International Relations debates in a way that ‘capture[s] current controversies’ (p. 40). This ambition is overstated: whilst Jackson’s typology is useful as a clarificatory tool, (re)classifying existing scholarship in International Relations is more problematic. One problem with Jackson’s approach is that he tends to run together the philosophical assumptions which decisively differentiate his methodologies (by stipulating a distinctive warrant for knowledge claims) and the explanatory strategies that are employed to generate such knowledge claims, suggesting that the latter are entailed by the former. In fact, the explanatory strategies which Jackson associates with each methodology reflect conventional practice in International Relations just as much as they reflect philosophical assumptions. This makes it more difficult to identify each methodology at work than Jackson implies. I illustrate this point through a critical analysis of Jackson’s controversial reclassification of Waltz as an analyticist, showing that whilst Jackson’s typology helps to expose inconsistencies in Waltz’s approach, it does not fully support the proposed reclassification. The conventional aspect of methodologies in International Relations also raises questions about the limits of Jackson’s ‘engaged pluralism’.