965 results for Samson (Biblical judge)
Abstract:
Objective: To introduce a new approach to problem-based learning (PBL) for self-directed learning in renal therapeutics. Design: This 5-week course, designed for large student cohorts using minimal teaching resources, was based on a series of case studies and subsequent pharmaceutical care plans, followed by intensive and regular feedback from the instructor. Assessment: Achievement of the learning outcomes was assessed through weekly graded care plans, a peer-review exercise in which each student judged the contributions of every group member (including their own), and a written case-study-based examination. The pharmaceutical care plan template, designed using a “tick-box” system, significantly reduced the staff time needed for feedback and scoring. Conclusion: The proposed instructional model achieved the desired learning outcomes with appropriate student feedback, while promoting skills that are essential for the students' future careers as health care professionals.
Abstract:
This paper describes experiments relating to the perception of the roughness of simulated surfaces via the haptic and visual senses. Subjects used a magnitude estimation technique to judge the roughness of “virtual gratings” presented via a PHANToM haptic interface device, and a standard visual display unit. It was shown that under haptic perception, subjects tended to perceive roughness as decreasing with increased grating period, though this relationship was not always statistically significant. Under visual exploration, the exact relationship between spatial period and perceived roughness was less well defined, though linear regressions provided a reliable approximation to individual subjects’ estimates.
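As a rough illustration of the analysis style this abstract describes, the sketch below (Python with numpy/scipy; the grating periods and roughness estimates are invented, not the paper's data) fits a per-subject linear regression of magnitude estimates against grating period and reports the slope and its significance.

```python
# Sketch of a magnitude-estimation analysis: regress perceived roughness
# on grating period and test the slope. Data are invented for illustration.
import numpy as np
from scipy import stats

grating_period_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
# One hypothetical subject's roughness magnitude estimates.
roughness_estimates = np.array([8.2, 7.1, 6.4, 5.0, 4.8, 3.9])

result = stats.linregress(grating_period_mm, roughness_estimates)
print(f"slope = {result.slope:.2f} per mm, r^2 = {result.rvalue**2:.2f}, "
      f"p = {result.pvalue:.3f}")
# A negative, significant slope would correspond to the haptic finding that
# perceived roughness decreases as grating period increases.
```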
Abstract:
Discusses the transmission of biblical sapiential sources in a variety of medieval contexts and discourses, ultimately focusing on Dante's works.
Abstract:
We explicitly tested for the first time the ‘environmental specificity’ of traditional 16S rRNA-targeted fluorescence in situ hybridization (FISH) through comparison of the bacterial diversity actually targeted in the environment with the diversity that should be exactly targeted (i.e. without mismatches) according to in silico analysis. To do this, we exploited advances in modern flow cytometry that enable improved detection, and therefore sorting, of sub-micron-sized particles, and used probe PSE1284 (designed to target pseudomonads) applied to Lolium perenne rhizosphere soil as our test system. The 6-carboxyfluorescein (6-FAM)-PSE1284-hybridised population, defined as displaying enhanced green fluorescence in flow cytometry, represented 3.51±1.28% of the total detected population when corrected using a nonsense (NON-EUB338) probe control. Analysis of 16S rRNA gene libraries constructed from fluorescence-activated cell sorting (FACS)-recovered fluorescent populations (n=3) revealed that 98.5% of the total sorted population (Pseudomonas spp. comprising 68.7% and Burkholderia spp. 29.8%) was specifically targeted, as evidenced by the homology of the 16S rRNA sequences to the probe sequence. In silico evaluation of probe PSE1284 with the use of RDP-10 probeMatch justified the presence of Burkholderia spp. among the sorted cells. The lack of novelty in the Pseudomonas spp. sequences uncovered was notable, probably reflecting the well-studied nature of this functionally important genus. To judge the diversity captured within the FACS-sorted population, rarefaction and DGGE analyses were used to evaluate, respectively, the proportion of Pseudomonas diversity uncovered by the sequencing effort and the representativeness of the Nycodenz® method for the extraction of bacterial cells from soil.
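For readers unfamiliar with this kind of in silico evaluation, the toy sketch below (Python; the probe and reference sequences are short placeholders, not the real PSE1284 sequence or RDP-10 data) illustrates the zero-mismatch criterion behind probeMatch-style checks: a sequence counts as exactly targeted if it contains the probe's complementary target site.

```python
# Toy zero-mismatch probe check: list reference 16S sequences containing
# an exact match to a probe's target site. All sequences are placeholders.
from typing import Dict, List

def reverse_complement(seq: str) -> str:
    return seq.translate(str.maketrans("ACGTU", "TGCAA"))[::-1]

def exact_targets(probe: str, references: Dict[str, str]) -> List[str]:
    """IDs of sequences exactly targeted (zero mismatches) by the probe."""
    target = reverse_complement(probe)  # the probe binds the complementary strand
    return [seq_id for seq_id, seq in references.items() if target in seq]

references = {
    "Pseudomonas_sp_A": "AGGTAGCTAATACCGC",   # hypothetical fragments
    "Burkholderia_sp_B": "TGGTAGCTAATACCGT",
    "Bacillus_sp_C": "TGGAAGCTTATACCGT",
}
probe = "CGGTATTAGCTACC"  # hypothetical probe, not PSE1284
hits = exact_targets(probe, references)
print(f"{len(hits)}/{len(references)} sequences exactly targeted: {hits}")
```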
Abstract:
The family of theories dubbed ‘luck egalitarianism’ represents an attempt to infuse egalitarian thinking with a concern for personal responsibility, arguing that inequalities are just when, or to the extent that, they result from choice, but unjust when, or to the extent that, they result from luck. In this essay I argue that luck egalitarians should sometimes seek to limit inequalities even when they have a fully choice-based pedigree (i.e., result only from the choices of agents). I grant that the broad approach is correct but argue that the temporal standpoint from which we judge whether, or to what extent, a person can be held responsible should be radically altered. Instead of asking, as Standard (or Static) Luck Egalitarianism seems to, whether or not, or to what extent, a person was responsible for the choice at the time of choosing, and asking the question of responsibility only once, we should ask whether, or to what extent, they are responsible for the choice at the point at which we are seeking to discover whether, or to what extent, the inequality is just; the question of responsibility is thus not settled once and for all but constantly under review. Such an approach will differ from Standard Luck Egalitarianism only if responsibility for a choice is not set in stone: if responsibility can weaken, then we should not see the boundary between luck and responsibility within a particular action as static. Drawing on Derek Parfit’s illuminating discussions of personal identity, and on the contemporary literature on moral responsibility, I suggest there are good reasons to think that responsibility can weaken – that we are not necessarily fully responsible for a choice forever, even if we were fully responsible at the time of choosing. I call the variant of luck egalitarianism that recognises this shift in temporal standpoint, and the possibility that responsibility can weaken, Dynamic Luck Egalitarianism (DLE). In conclusion, I offer a preliminary discussion of the kinds of policies DLE would support.
Abstract:
This paper examines cyclical behaviour in commercial property values over the period 1956 to 1996, using a structural time series (unobserved components) approach. The influence of the transition to short rent reviews during the late 1960s and the short- and long-term impacts of the 1974 and 1990 property crashes are also incorporated into the analysis, via dummy variables. It is found that once these variables are taken into account, a fairly regular cyclical pattern can be discerned, with a period of about 7.8 years. Furthermore, the 1974 and 1990 property crashes are shown to have had a major long-term impact on property value growth (presumably via their influence on investors' expectations).
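A minimal sketch of this modelling approach, assuming statsmodels' UnobservedComponents class and a simulated annual series in place of the real property-value data; the 1974 and 1990 crashes are simplified here to step-intervention dummies.

```python
# Illustrative unobserved-components (structural time series) model:
# trend + stochastic cycle + crash intervention dummies. The series is
# simulated, not the real commercial property-value data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = pd.period_range("1956", "1996", freq="Y")
t = np.arange(len(years))
# Simulated values: trend + a roughly 7.8-year cycle + noise.
y = pd.Series(0.02 * t + 0.05 * np.sin(2 * np.pi * t / 7.8)
              + rng.normal(0, 0.01, len(t)), index=years.to_timestamp())

# Step dummies capturing the lasting impact of the 1974 and 1990 crashes.
exog = pd.DataFrame({"crash74": (years.year >= 1974).astype(float),
                     "crash90": (years.year >= 1990).astype(float)},
                    index=y.index)

model = sm.tsa.UnobservedComponents(y, level="local linear trend",
                                    cycle=True, stochastic_cycle=True,
                                    damped_cycle=True, exog=exog)
res = model.fit(disp=False)
print(f"estimated cycle period: {2 * np.pi / res.params['frequency.cycle']:.1f} years")
```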
Abstract:
This is a reading of the work of Mary Martha Sherwood, the Victorian evangelist and children’s author (and pupil at the Abbey School, Reading). Based upon research on Sherwood’s private correspondence and diary conducted at UCLA with the aid of Mitzi Myers (before my arrival at Reading), the essay offers a radical reinterpretation of her work. Whereas Sherwood’s writing has previously been understood in terms of a rigid, if self-contradictory and ‘anxious’, Evangelicalism, the essay reads the diary through her little-known Biblical scholarship. Through this I argue that Sherwood grants her own writing the status of Biblical truth precisely because of its contradictions and ‘anxiety’.
Abstract:
High-resolution ensemble simulations (Δx = 1 km) are performed with the Met Office Unified Model for the Boscastle (Cornwall, UK) flash-flooding event of 16 August 2004. Forecast uncertainties arising from imperfections in the forecast model are analysed by comparing the simulation results produced by two types of perturbation strategy. Motivated by the meteorology of the event, one type of perturbation alters relevant physics choices or parameter settings in the model's parametrization schemes. The other type of perturbation is designed to account for representativity error in the boundary-layer parametrization. It makes direct changes to the model state and provides a lower bound against which to judge the spread produced by other uncertainties. The Boscastle simulations are shown to have genuine skill at scales of approximately 60 km, and an ensemble spread that can be estimated to within ∼10% with only eight members. Differences between the model-state perturbation and physics-modification strategies are discussed, the former being more important for triggering and the latter for subsequent cell development, including the average internal structure of convective cells. Despite such differences, the spread in rainfall evaluated at skilful scales is shown to be only weakly sensitive to the perturbation strategy. This suggests that relatively simple strategies for treating model uncertainty may be sufficient for practical, convective-scale ensemble forecasting.
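The sketch below (Python with numpy/scipy; random fields standing in for Unified Model output) illustrates one way spread might be evaluated at a skilful scale rather than at the 1 km grid scale: neighbourhood-average each member's rainfall, then take the across-member standard deviation.

```python
# Minimal sketch: domain-mean ensemble spread of rainfall evaluated at a
# skilful scale. Fields are random stand-ins for real ensemble output.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)
n_members, ny, nx = 8, 200, 200          # 8 members on a 200 x 200 km grid
rain = rng.gamma(shape=0.5, scale=2.0, size=(n_members, ny, nx))

scale_km = 60   # dx = 1 km, so a 60 km neighbourhood is 60 grid points
smoothed = np.stack([uniform_filter(m, size=scale_km) for m in rain])

spread = smoothed.std(axis=0).mean()     # across-member std, domain mean
print(f"domain-mean spread at {scale_km} km scale: {spread:.3f} mm/h")
```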
Abstract:
Some proponents of local knowledge, such as Sillitoe (2010), have expressed second thoughts about its capacity to effect development on the ‘revolutionary’ scale once predicted. Our argument in this article follows a similar route. Recent research into the management of livestock in South Africa makes clear that rural African livestock farmers experience uncertainty in relation to the control of stock diseases. State provision of veterinary services has been significantly reduced over the past decade, and both white and African livestock owners are increasingly left to their own devices. In some areas of animal disease management, African livestock owners have recourse to tried-and-tested local remedies, which are largely plant-based. But especially in the critical sphere of tick control, efficacious treatments are less evident, and livestock owners struggle to find adequate solutions to high tick loads. This is particularly important in South Africa in the early twenty-first century because land reform and the freedom to purchase land in the post-apartheid context afford African stockowners opportunities to expand their livestock holdings. Our research suggests that the limits of local knowledge in dealing with ticks are one of the central problems faced by African livestock owners. We judge this not only in relation to efficacy but also in relation to the perceptions of livestock owners themselves. While confidence and practice vary, and there is increasing resort to chemical acaricides, we were struck by the uncertainty of livestock owners over the best strategies.
The unsteady flow of a weakly compressible fluid in a thin porous layer II: three-dimensional theory
Abstract:
We consider the problem of determining the pressure and velocity fields for a weakly compressible fluid flowing in a three-dimensional layer, composed of an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting and/or extracting fluid. Numerical solution of this three-dimensional evolution problem may be expensive, particularly in the case that the depth scale of the layer h is small compared to the horizontal length scale l, a situation which occurs frequently in the application to oil and gas reservoir recovery and which leads to significant stiffness in the numerical problem. Under the assumption that $\epsilon\propto h/l\ll 1$, we show that, to leading order in $\epsilon$, the pressure field varies only in the horizontal directions away from the wells (the outer region). We construct asymptotic expansions in $\epsilon$ in both the inner (near the wells) and outer regions and use the asymptotic matching principle to derive expressions for all significant process quantities. The only computations required are for the solution of non-stiff, linear, elliptic, two-dimensional boundary-value and eigenvalue problems. This approach, via the method of matched asymptotic expansions, takes advantage of the small aspect ratio of the layer, $\epsilon$, at precisely the stage where full numerical computations become stiff, and also reveals the detailed structure of the dynamics of the flow, both in the neighbourhood of wells and away from wells.
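To make the structure concrete (an illustrative outline only, not the authors' exact scalings): away from the wells the pressure is expanded as $p \sim p_0(x,y,t) + \epsilon p_1 + \cdots$, with the leading-order term $p_0$ independent of the vertical coordinate and determined by a linear, elliptic, two-dimensional problem in the horizontal variables; near each well an inner expansion is constructed in a radial variable stretched by $\epsilon$, and the two expansions are joined by the asymptotic matching principle, which requires the far field of the inner solution to agree with the near-well behaviour of the outer solution.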
Abstract:
We describe a novel method for determining the pressure and velocity fields for a weakly compressible fluid flowing in a thin three-dimensional layer composed of an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting and/or extracting fluid. Our approach uses the method of matched asymptotic expansions to derive expressions for all significant process quantities, the computation of which requires only the solution of linear, elliptic, two-dimensional boundary value and eigenvalue problems. In this article, we provide full implementation details and present numerical results demonstrating the efficiency and accuracy of our scheme.
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers, author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method's authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common set of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best-performing algorithms, followed by the Synonym approach. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters News Corpus. The findings show that authors of Reuters news articles provide good keyphrases when they provide them, but that more often than not they do not provide any keyphrases.
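As a concrete illustration of two of the simpler baselines compared in this study, the sketch below (Python; toy documents rather than the corpora used in the paper) scores candidate terms in one document by Term Frequency and by TF-IDF.

```python
# Toy keyphrase scoring: Term Frequency and TF-IDF over a tiny corpus.
# The documents are invented for illustration.
import math

corpus = [
    "commercial property values show cyclical behaviour in property markets",
    "property crashes influence investor expectations",
    "ensemble forecasting of convective rainfall",
]
docs = [doc.split() for doc in corpus]

def tf(term: str, doc: list) -> float:
    return doc.count(term) / len(doc)

def idf(term: str, docs: list) -> float:
    df = sum(term in doc for doc in docs)
    return math.log(len(docs) / df) if df else 0.0

doc = docs[0]
scores = {term: tf(term, doc) * idf(term, docs) for term in set(doc)}
for term, score in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{term:12s} tf-idf = {score:.3f}")
```

Real evaluations would of course also handle multi-word candidate phrases, as the C-Value and NC-Value methods do.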