69 results for consistency in indexing
Abstract:
As a result of resource limitations, state in branch predictors is frequently shared between uncorrelated branches. This interference can significantly limit prediction accuracy. In current predictor designs, the branches sharing prediction information are determined by their branch addresses and thus branch groups are arbitrarily chosen during compilation. This feasibility study explores a more analytic and systematic approach to classify branches into clusters with similar behavioral characteristics. We present several ways to incorporate this cluster information as an additional information source in branch predictors.
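As a loose illustration of the clustering idea the abstract proposes, branches described by simple behavioural features (e.g. taken rate and transition rate) could be grouped with k-means; the features, algorithm and values below are assumptions for illustration, not the authors' method.

```python
import numpy as np

def cluster_branches(features, k=2, iters=50, seed=0):
    """Group branches by behavioural feature vectors using plain k-means.
    Hypothetical sketch: the paper's actual classification scheme may differ."""
    rng = np.random.default_rng(seed)
    # Initialise centroids from k distinct branches.
    centroids = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign each branch to its nearest centroid (squared Euclidean distance).
        dists = ((features[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # Move each centroid to the mean of its assigned branches.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = features[labels == j].mean(axis=0)
    return labels

# Branches described by (taken rate, transition rate); two clearly
# distinct behavioural groups in this toy example.
feats = np.array([[0.95, 0.05], [0.90, 0.10], [0.10, 0.80], [0.05, 0.85]])
labels = cluster_branches(feats, k=2)
```

Branches with similar dynamic behaviour end up in the same cluster, which could then serve as the additional information source the abstract describes.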
Abstract:
Latent semantic indexing (LSI) is a technique used for intelligent information retrieval (IR). It can be used as an alternative to traditional keyword matching IR and is attractive in this respect because of its ability to overcome problems with synonymy and polysemy. This study investigates various aspects of LSI: the effect of the Haar wavelet transform (HWT) as a preprocessing step for the singular value decomposition (SVD) in the key stage of the LSI process; and the effect of different threshold types in the HWT on the search results. The developed method allows the visualisation and processing of the term-document matrix, generated in the LSI process, using the HWT. The results have shown that precision can be increased by applying the HWT as a preprocessing step, with better results for hard thresholding than soft thresholding, whereas standard SVD-based LSI remains the most effective way of searching in terms of recall value.
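As a rough sketch of the pipeline this abstract describes, the following combines a single-level Haar transform with hard thresholding as a preprocessing step before a truncated-SVD LSI search; the toy matrix, threshold value, and rank are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def haar_hard_threshold(A, thresh):
    """Single-level Haar transform down each column of the term-document
    matrix, hard-threshold the detail coefficients, then invert.
    Assumed depth and threshold; the study may use deeper transforms."""
    approx = (A[0::2] + A[1::2]) / np.sqrt(2)   # Haar averages
    detail = (A[0::2] - A[1::2]) / np.sqrt(2)   # Haar differences
    detail[np.abs(detail) < thresh] = 0.0       # hard thresholding
    out = np.empty_like(A)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

def lsi_scores(A, query, k=2):
    """Cosine-similarity ranking of documents against a query in a
    rank-k LSI space built from a truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Uk, sk, Vk = U[:, :k], s[:k], Vt[:k].T      # truncated SVD
    docs = Vk * sk                              # document coordinates
    q = query @ Uk                              # project query into LSI space
    return docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q) + 1e-12)

# Tiny term-document matrix (4 terms x 3 docs); docs 0 and 1 share vocabulary.
A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 2.],
              [0., 0., 1.]])
scores = lsi_scores(haar_hard_threshold(A, 0.3), np.array([1., 1., 0., 0.]))
```

Documents sharing the query's vocabulary score highly even without exact keyword overlap, which is the synonymy-handling property the abstract attributes to LSI.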
Abstract:
Cross-sectional and longitudinal data consistently indicate that mathematical difficulties are more prevalent in older than in younger children (e.g. Department of Education, 2011). Children’s trajectories can take a variety of shapes such as linear, flat, curvilinear and uneven, and shape has been found to vary within children and across tasks (Jordan et al., 2009). There has been an increase in the use of statistical methods which are specifically designed to study development, and this has greatly improved our understanding of children’s mathematical development. However, the effects of many cognitive and social variables (e.g. working memory and verbal ability) on mathematical development remain unclear. It is likely that greater consistency between studies will be achieved by adopting a componential approach to studying mathematics, rather than treating mathematics as a unitary concept.
Abstract:
In a double-blind crossover study, the efficacies of Agiolax, a combination of fibre and senna pod, and lactulose were compared in 77 long-stay elderly patients with chronic constipation. Mean daily bowel frequency, stool consistency and ease of evacuation were significantly greater with Agiolax than with lactulose. The recommended dose was exceeded more frequently with lactulose than with Agiolax (χ² = 8.38, p < 0.01). Adverse effects did not differ between the two treatments. In long-stay elderly patients with chronic constipation, Agiolax and lactulose were both well tolerated, but Agiolax proved the more effective treatment.
Abstract:
This paper uses harmonized data for the member states of the European Union to analyse household income packaging from a 'welfare regimes' perspective. Using data from the third wave of the ECHP, it looks at how the role of welfare transfers in the income package varies across countries and welfare regimes, and assesses whether this is consistent with the predictions of welfare regime theory, having first elaborated some specific hypotheses in that regard. It finds that when one focuses on averages across countries categorized into regimes, many of these hypotheses about the role of transfers are in broad terms borne out by the evidence. However, when one focuses on individual countries rather than regime averages the picture is a good deal more complex and consistency with the range of hypotheses more limited. It is essential that this variation across countries is taken into account in interpreting and using welfare regime theory and typologies.
Abstract:
Pain assessment in neonates often presents problems. The problem of inadequate or inaccurate assessment is complicated by issues related to the nature, consistency, and variability of the infant's physiologic and behavioral responses; the reliability, validity, specificity, sensitivity, and practicality of existing neonatal pain measures or measurement approaches; ethical questions about pain research in infants; and uncertainty about the responsibilities of health care professionals in managing pain in clinical settings. Despite these many issues, neonates need to be comfortable and as free of pain as possible to grow and develop normally. Valid and reliable assessment of pain is the major prerequisite for attaining this goal. Issues embodied in neonatal pain responses, measurement, ethical, and clinical considerations are explored. Suggestions for resolving some of these problems are presented.
Abstract:
The present study investigated the long-term consistency of individual differences in dairy cattle’s responses in tests of behavioural and hypothalamo–pituitary–adrenocortical (HPA) axis reactivity, as well as the relationship between responsiveness in behavioural tests and the reaction to first milking. Two cohorts of heifer calves, Cohort 1 (N = 25) and Cohort 2 (N = 16), were examined longitudinally from the rearing period until adulthood. Cohort 1 heifers were subjected to open field (OF), novel object (NO), restraint, and response-to-human tests at 7 months of age, and were again observed in an OF test during first pregnancy between 22 and 24 months of age. Subsequently, inhibition of milk ejection and stepping and kicking behaviours were recorded in Cohort 1 heifers during their first machine milking. Cohort 2 heifers were individually subjected to OF and NO tests as well as two HPA axis reactivity tests (determining ACTH and/or cortisol response profiles after administration of exogenous CRH and ACTH, respectively) at 6 months of age and during first lactation at approximately 29 months of age. Principal component analysis (PCA) was used to condense correlated response measures (to behavioural tests and to milking) within ages into independent dimensions underlying heifers’ reactivity. Heifers demonstrated consistent individual differences in locomotion and vocalisation during an OF test from rearing to first pregnancy (Cohort 1) or first lactation (Cohort 2). Individual differences in struggling in a restraint test at 7 months of age reliably predicted those in OF locomotion during first pregnancy in Cohort 1 heifers. Cohort 2 animals with high cortisol responses to OF and NO tests and high avoidance of the novel object at 6 months of age also exhibited enhanced cortisol responses to OF and NO tests at 29 months of age.
Measures of HPA axis reactivity, locomotion, vocalisation and adrenocortical and behavioural responses to novelty were largely uncorrelated, supporting the idea that stress responsiveness in dairy cows is mediated by multiple independent underlying traits. Inhibition of milk ejection and stepping and kicking behaviours during first machine milking were not related to earlier struggling during restraint, locomotor responses to OF and NO tests, or the behavioural interaction with a novel object. Heifers with high rates of OF and NO vocalisation and short latencies to first contact with the human at 7 months of age exhibited better milk ejection during first machine milking. This suggests that low underlying sociality might be implicated in the inhibition of milk ejection at the beginning of lactation in heifers.
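The PCA step the abstract mentions, condensing correlated response measures into independent underlying dimensions, might look like the following in outline; the measure names, cohort size and values are invented for illustration.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Condense correlated response measures (columns of X) into
    independent dimensions via PCA on standardised data.
    Returns component scores and the variance explained per component."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardise each measure
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_components].T            # project onto top components
    explained = (s ** 2) / (s ** 2).sum()       # variance explained fractions
    return scores, explained

# 6 animals x 3 measures: 'locomotion' and 'vocalisation' correlated,
# 'cortisol' independent -- mimicking the multiple-traits finding.
rng = np.random.default_rng(1)
loco = rng.normal(size=6)
X = np.column_stack([loco,
                     loco + 0.1 * rng.normal(size=6),   # correlated measure
                     rng.normal(size=6)])               # independent measure
scores, explained = pca_scores(X)
```

Correlated measures load on one component while the independent measure falls on another, matching the abstract's interpretation of multiple independent underlying traits.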
Abstract:
We investigate the acceleration of particles by Alfvén waves via the second-order Fermi process in the lobes of giant radio galaxies. Such sites are candidates for the accelerators of ultra-high-energy cosmic rays (UHECR). We focus on the nearby Fanaroff-Riley type I radio galaxy Centaurus A. This is motivated by the coincidence of its position with the arrival direction of several of the highest energy Auger events. The conditions necessary for consistency with the acceleration time-scales predicted by quasi-linear theory are reviewed. Test particle calculations are performed in fields which guarantee electric fields with no component parallel to the local magnetic field. The results of quasi-linear theory are found to be accurate to an order of magnitude at low turbulence levels for non-relativistic Alfvén waves, and at both low and high turbulence levels in the mildly relativistic case. We conclude that for pure stochastic acceleration via Alfvén waves to be plausible as the generator of UHECR in Cen A, the baryon number density would need to be several orders of magnitude below currently held upper limits.
Abstract:
Advances in computational and information technologies have facilitated the acquisition of geospatial information for regional and national soil and geology databases. These have been completed for a range of purposes, from geological and soil baseline mapping to economic prospecting and land resource assessment, but have become increasingly used for forensic purposes. On the question of the provenance of a questioned sample, the geologist or soil scientist will invariably draw on prior expert knowledge and available digital map and database sources in a ‘pseudo-Bayesian’ approach. The context of this paper is the debate on whether existing (digital) geology and soil databases are indeed useful and suitable for forensic inferences. Published and new case studies are used to explore issues of completeness, consistency, compatibility and applicability in relation to the use of digital geology and soil databases in environmental and criminal forensics. One key theme that emerges is that, although databases can be neither exhaustive nor precise enough to portray spatial variability at the scene-of-crime scale, when coupled with expert knowledge they play an invaluable role in providing background or reference material in a criminal investigation. Moreover, databases can offer an independent control set of samples.
Abstract:
Context. The VLT-FLAMES Tarantula Survey has an extensive view of the copious number of massive stars in the 30 Doradus (30 Dor) star forming region of the Large Magellanic Cloud. These stars play a crucial role in our understanding of the stellar feedback in more distant, unresolved star forming regions. Aims. The first comprehensive census of hot luminous stars in 30 Dor is compiled within a 10 arcmin (150 pc) radius of its central cluster, R136. We investigate the stellar content and spectroscopic completeness of the early type stars. Estimates were made for both the integrated ionising luminosity and stellar wind luminosity. These values were used to re-assess the star formation rate (SFR) of the region and determine the ionising photon escape fraction. Methods. Stars were selected photometrically and combined with the latest spectral classifications. Spectral types were estimated for stars lacking spectroscopy and corrections were made for binary systems, where possible. Stellar calibrations were applied to obtain their physical parameters and wind properties. Their integrated properties were then compared to global observations from ultraviolet (UV) to far-infrared (FIR) imaging as well as the population synthesis code, Starburst99. Results. Our census identified 1145 candidate hot luminous stars within 150 pc of R136, of which >700 were considered to be genuine early type stars that contribute to feedback. We assess the survey to be spectroscopically complete to 85% in the outer regions (>5 pc) but only 35% complete in the region of the R136 cluster, giving a total of 500 hot luminous stars in the census which had spectroscopy. Only 31 were found to be Wolf-Rayet (W-R) or Of/WN stars, but their contribution to the integrated ionising luminosity and wind luminosity was ~40% and ~50%, respectively. Similarly, stars with M > 100 M⊙ (mostly H-rich WN stars) also showed high contributions to the global feedback, ~25% in both cases.
Such massive stars are not accounted for by the current Starburst99 code, which was found to underestimate the integrated ionising luminosity of R136 by a factor of ~2 and the wind luminosity by a factor of ~9. The census inferred an SFR for 30 Dor of 0.073 ± 0.04 M⊙ yr⁻¹. This was generally higher than that obtained from some popular SFR calibrations, but still showed good consistency with the far-UV luminosity tracer as well as the combined Hα and mid-infrared tracer, although only after correcting for Hα extinction. The global ionising output was also found to exceed that measured from the associated gas and dust, suggesting that ~6% of the ionising photons escape the region. Conclusions. When studying the most luminous star forming regions, it is essential to include their most massive stars if one is to determine a reliable energy budget. Photon leakage becomes more likely after including their large contributions to the ionising output. If 30 Dor is typical of other massive star forming regions, estimates of the SFR will be underpredicted if this escape fraction is not accounted for.
Abstract:
Inductively coupled plasma (ICP) analysis following aqua regia digestion and X-ray fluorescence (XRF) are both geochemical techniques used to determine ‘total’ concentrations of elements in soil. The aim of this study is to compare these techniques, identify elements for which inconsistencies occur and investigate why they arise. A study area (~14,000 km²) with a variety of total concentration controls and a large geochemical dataset (n = 7950) was selected. Principal component analysis determined underlying variance in a dataset composed of both geogenic and anthropogenic elements. Where inconsistencies between the techniques were identified, further numerical and spatial analysis was completed. The techniques are more consistent for elements of geogenic origin and for lead, whereas elements of anthropogenic origin show less consistency within rural samples. XRF is affected by the sample matrix, while the chemical form of an element affects ICP concentrations. Depending on their use in environmental studies, different outcomes would be expected from the techniques employed, suggesting the choice of analytical technique for geochemical analyses may be more critical than realised.
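A minimal version of the kind of consistency check implied here, a per-element comparison of the two techniques across samples, could be sketched as follows; the data are synthetic, and a simple correlation screen stands in for the study's fuller numerical and spatial analysis.

```python
import numpy as np

def technique_consistency(icp, xrf):
    """Pearson correlation between ICP and XRF concentrations of one
    element across samples: a crude first screen for inter-technique
    consistency (the study also used PCA and spatial analysis)."""
    return np.corrcoef(icp, xrf)[0, 1]

rng = np.random.default_rng(0)
true_conc = rng.uniform(10, 100, size=50)            # 'true' totals, mg/kg

# Element measured consistently by both techniques.
xrf = true_conc + rng.normal(0, 2, size=50)
icp_consistent = true_conc + rng.normal(0, 2, size=50)

# Element where aqua regia digestion recovers only part of the total
# that XRF sees (form-of-element effect), with extra scatter.
icp_partial = 0.4 * true_conc + rng.normal(0, 20, size=50)

r_good = technique_consistency(icp_consistent, xrf)
r_poor = technique_consistency(icp_partial, xrf)
```

Elements whose correlation drops well below that of the consistent case would then be flagged for the further numerical and spatial investigation the abstract describes.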
Abstract:
Thermal stability is of major importance in polymer extrusion, where product quality is dependent upon the level of melt homogeneity achieved by the extruder screw. Extrusion is an energy intensive process, and optimisation of process energy usage while maintaining melt stability is necessary in order to produce good quality product at low unit cost. Optimisation of process energy usage is timely as world energy prices have increased rapidly over the last few years. In the first part of this study, the efficiency of an extruder is discussed in general terms. Then, correlations are explored between melt thermal stability and energy demand in polymer extrusion under different process settings and screw geometries. A commodity grade of polystyrene was extruded using a highly instrumented single screw extruder, equipped for energy consumption and melt temperature field measurement. Moreover, the melt viscosity of the experimental material was measured using an off-line rheometer. Results showed that the specific energy demand of the extruder (i.e. the energy required to process a unit mass of polymer) decreased with increasing throughput, whilst fluctuation in energy demand also reduced. However, the relationship between melt temperature and extruder throughput was found to be complex, with temperature varying with radial position across the melt flow. Moreover, the melt thermal stability deteriorated as throughput was increased, meaning that greater efficiency was achieved to the detriment of melt consistency. Extruder screw design also had a significant effect on the relationship between energy consumption and melt consistency. Overall, process energy demand and thermal stability appeared to be negatively correlated, and their relationship was shown to be highly complex in nature.
Moreover, the level of process understanding achieved here can help to inform selection of equipment and setting of operating conditions to optimise both energy and thermal efficiencies in parallel.
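The specific energy demand discussed above is simply the energy used per unit mass of polymer processed, so its decrease with throughput can be shown directly; the power and throughput figures below are invented for illustration, not the study's measurements.

```python
def specific_energy_demand(motor_power_kw, throughput_kg_per_h):
    """Specific energy consumption of an extruder in kWh/kg:
    power drawn divided by mass throughput."""
    return motor_power_kw / throughput_kg_per_h

# Doubling throughput with a less-than-double rise in power lowers the
# specific energy demand, consistent with the trend the study reports.
sec_low = specific_energy_demand(10.0, 20.0)   # 10 kW at 20 kg/h -> 0.5 kWh/kg
sec_high = specific_energy_demand(15.0, 40.0)  # 15 kW at 40 kg/h -> 0.375 kWh/kg
```

The trade-off the abstract highlights is that this gain in energy efficiency came at the cost of melt thermal consistency.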
Abstract:
RATIONALE, AIMS AND OBJECTIVES: Health care services offered to the public should be based on the best available evidence. We aimed to explore pharmacy tutors' and trainees' views on the importance of evidence when making decisions about over-the-counter (OTC) medicines and also to investigate whether the tutor influenced the trainee in practice.
METHODS: Following ethical approval and piloting, semi-structured interviews were conducted with pharmacy graduates (trainees) and pharmacist tutors. Transcribed interview data were entered into the NVivo software package (version 10), coded and analysed via thematic analysis.
RESULTS: Twelve trainees (five males, seven females) and 11 tutors (five males, six females) participated. The main themes that emerged were (in)consistency and contradiction, confidence, acculturation, and continuation and perpetuation. Despite participants' awareness of its importance and potential benefits, an evidence-based approach did not seem to be routinely or consistently implemented in practice. Confidence in products was largely derived from personal use and patient feedback. A lack of discussion about evidence was justified on the basis of not wanting to lessen patient confidence in requested product(s) or possibly negate the placebo effect. Trainees became acculturated to 'real-life' practice; university teaching and evidence were deemed less relevant than meeting customer expectations. The tutor's actions were mirrored by their trainee, resulting in continuation and perpetuation of the same professional attitudes and behaviours.
CONCLUSIONS: Evidence appeared to have limited influence on OTC decision making. The tutor played a key role in the trainee's professional development. Further work is needed to establish how evidence can come to be regarded as relevant and be consistently implemented in practice.
Abstract:
The sustainable control of animal parasitic nematodes requires the development of efficient functional genomics platforms to facilitate target validation and enhance anthelmintic discovery. Unfortunately, the utility of RNA interference (RNAi) for the validation of novel drug targets in nematode parasites remains problematic. Ascaris suum is an important veterinary parasite and a zoonotic pathogen. Here we show that adult A. suum is RNAi competent, and highlight the induction, spread and consistency of RNAi across multiple tissue types. This platform provides a new opportunity to undertake whole organism-, tissue- and cell-level gene function studies to enhance target validation processes for nematode parasites of veterinary/medical significance.