120 results for "Errors in variables models"


Relevance: 100.00%

Abstract:

The frequencies of atmospheric blocking in both winter and summer, and the changes in them from the 20th to the 21st century, as simulated in twelve CMIP5 models are analysed. The RCP8.5 high-emission scenario runs are used to represent the 21st century. The analysis is based on the wave-breaking methodology of Pelly and Hoskins (2003a). It differs from the Tibaldi and Molteni (1990) index in viewing equatorward cut-off lows and poleward blocking highs equally as indicating a disruption to the westerlies. One-dimensional and two-dimensional diagnostics are applied to identify blocking of the mid-latitude storm track and also at higher latitudes. Winter blocking frequency is found to be generally underestimated. The models give a decrease in the European blocking maximum in the 21st century, consistent with the results of other studies. There is a mean 21st-century winter poleward shift of high-latitude blocking, but little agreement between the models on the details. In summer, Eurasian blocking is also underestimated in the models, whereas it is overestimated over the high-latitude ocean basins. A decrease in European blocking frequency in the 21st-century model runs is again found. However, in summer there is a clear eastward shift of blocking over Eastern Europe and Western Russia, in a region close to the blocking that dominated the Russian summer of 2010. While summer blocking decreases in general, the poleward shift of the storm track into the region of frequent high-latitude blocking may mean that the incidence of storms being obstructed by blocks actually increases.
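The one-dimensional Tibaldi and Molteni (1990) index contrasted above can be sketched in a few lines. The latitudes, gradient thresholds and synthetic height field below are the commonly quoted illustrative choices, not the paper's Pelly-Hoskins wave-breaking diagnostic:

```python
import numpy as np

def tm_blocked(z500, lats, phi_n=80.0, phi_0=60.0, phi_s=40.0):
    """1-D Tibaldi-Molteni style blocking test at each longitude.

    z500 : 500 hPa geopotential height, shape (lat, lon), in metres.
    A longitude counts as blocked when the meridional height gradient
    reverses south of phi_0 (GHGS > 0) while remaining strongly
    negative to the north (GHGN < -10 m per degree latitude).
    """
    i_n, i_0, i_s = (int(np.argmin(np.abs(lats - p))) for p in (phi_n, phi_0, phi_s))
    ghgs = (z500[i_0] - z500[i_s]) / (phi_0 - phi_s)  # southern gradient
    ghgn = (z500[i_n] - z500[i_0]) / (phi_n - phi_0)  # northern gradient
    return (ghgs > 0.0) & (ghgn < -10.0)

# Toy field: westerly flow (height falling poleward) plus one blocking ridge.
lats = np.arange(30.0, 85.0, 5.0)
lons = np.arange(0.0, 360.0, 30.0)
z500 = 5600.0 - 10.0 * (lats[:, None] - 30.0) + 0.0 * lons[None, :]
z500[:, 0] += 400.0 * np.exp(-((lats - 60.0) / 10.0) ** 2)  # ridge at lon 0

blocked = tm_blocked(z500, lats)
print(blocked)  # blocked only at the first longitude
```

The index flags the ridge longitude because the added anticyclone reverses the 40-60 degree height gradient there while steepening the gradient to its north.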

Abstract:

Objective To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices. Methods PINCER pharmacists manually recorded patients' demographics, details of interventions recommended, actions undertaken by practice staff, and time taken to manage individual cases of hazardous medicines management. Data were coded and double-entered into SPSS v15, then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data. Key findings Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases, and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed, and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs) but not completed by the end of the pharmacists' placement; the remaining recommendations were rejected or considered not relevant by GPs. Conclusions The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.

Abstract:

Objective To determine the prevalence and nature of prescribing and monitoring errors in general practices in England. Design Retrospective case note review of unique medication items prescribed over a 12 month period to a 2% random sample of patients. Mixed effects logistic regression was used to analyse the data. Setting Fifteen general practices across three primary care trusts in England. Data sources Examination of 6048 unique prescription items prescribed over the previous 12 months for 1777 patients. Main outcome measures Prevalence of prescribing and monitoring errors, and severity of errors, using validated definitions. Results Prescribing and/or monitoring errors were detected in 4.9% (296/6048) of all prescription items (95% confidence interval 4.4% to 5.5%). The vast majority of errors were of mild to moderate severity, with 0.2% (11/6048) of items having a severe error. After adjusting for covariates, patient-related factors associated with an increased risk of prescribing and/or monitoring errors were: age less than 15 years (odds ratio (OR) 1.87, 1.19 to 2.94, p=0.006) or greater than 64 years (OR 1.68, 1.04 to 2.73, p=0.035), and higher numbers of unique medication items prescribed (OR 1.16, 1.12 to 1.19, p<0.001). Conclusion Prescribing and monitoring errors are common in English general practice, although severe errors are unusual. Many factors increase the risk of error. Having identified the most common and important errors, and the factors associated with these, the study findings should inform strategies to prevent future errors.
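The headline prevalence can be sanity-checked with a plain normal-approximation confidence interval for a proportion. This is a sketch only: the study's interval was presumably derived from its mixed-effects analysis, so it differs slightly at the upper bound:

```python
import math

def proportion_ci(k, n, z=1.96):
    """Normal-approximation 95% CI for a proportion k/n.
    Illustrative: the paper's mixed-effects model gives 4.4% to 5.5%."""
    p = k / n
    se = math.sqrt(p * (1.0 - p) / n)
    return p, p - z * se, p + z * se

# 296 items with errors out of 6048 prescription items reviewed
p, lo, hi = proportion_ci(296, 6048)
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}% to {100*hi:.1f}%)")
```

The point estimate reproduces the reported 4.9%, and the simple interval lands within about 0.1 percentage points of the published one.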

Abstract:

Aim: To examine the causes of prescribing and monitoring errors in English general practices and provide recommendations for how they may be overcome. Design: Qualitative interview and focus group study with purposive sampling and thematic analysis informed by Reason's accident causation model. Participants: General practice staff participated in a combination of semi-structured interviews (n=34) and six focus groups (n=46). Setting: Fifteen general practices across three primary care trusts in England. Results: We identified seven categories of high-level error-producing conditions: the prescriber, the patient, the team, the task, the working environment, the computer system, and the primary-secondary care interface. Each of these was further broken down to reveal various error-producing conditions. The prescriber's therapeutic training, drug knowledge and experience, knowledge of the patient, perception of risk, and physical and emotional health were all identified as possible causes. The patient's characteristics and the complexity of the individual clinical case were also found to have contributed to prescribing errors. The importance of feeling comfortable within the practice team was highlighted, as were concerns about the safety of general practitioners (GPs) signing prescriptions generated by nurses when they had not seen the patient themselves. The working environment, with its high workload, time pressures and interruptions, and computer-related issues associated with mis-selecting drugs from electronic pick-lists and overriding alerts, were all highlighted as possible causes of prescribing errors, and these causes were often interconnected. Conclusion: This study has highlighted the complex underlying causes of prescribing and monitoring errors in general practices, several of which are amenable to intervention.

Abstract:

Introduction: Care home residents are at particular risk from medication errors. Our objectives were to determine the prevalence and potential harm of prescribing, monitoring, dispensing and administration errors in UK care homes, and to identify their causes. Methods: A prospective study of a random sample of residents within a purposive sample of homes in three areas. Errors were identified by patient interview, note review, observation of practice, and examination of dispensed items. Causes were explored through observation and theoretically framed interviews with home staff, doctors and pharmacists. Potential harm from errors was assessed by expert judgement. Results: The 256 residents recruited in 55 homes were taking a mean of 8.0 medicines. One hundred and seventy-eight (69.5%) of the residents had one or more errors; the mean number per resident was 1.9 errors. The mean potential harm from prescribing, monitoring, administration and dispensing errors was 2.6, 3.7, 2.1 and 2.0, respectively (0 = no harm, 10 = death). Contributing factors from the 89 interviews included doctors who were not accessible, did not know the residents and lacked information in homes when prescribing; home staff's high workload, lack of medicines training and drug round interruptions; a lack of teamwork among home, practice and pharmacy; inefficient ordering systems; inaccurate medicine records and a reliance on verbal communication; and difficult-to-fill (and difficult-to-check) medication administration systems. Conclusions: That two thirds of residents were exposed to one or more medication errors is of concern. The will to improve exists, but there is a lack of overall responsibility. Action is required from all concerned.

Abstract:

HD (Huntington's disease) is a late onset heritable neurodegenerative disorder that is characterized by neuronal dysfunction and death, particularly in the cerebral cortex and medium spiny neurons of the striatum. This is followed by progressive chorea, dementia and emotional dysfunction, eventually resulting in death. HD is caused by an expanded CAG repeat in the first exon of the HD gene that results in an abnormally elongated polyQ (polyglutamine) tract in its protein product, Htt (Huntingtin). Wild-type Htt is largely cytoplasmic; however, in HD, proteolytic N-terminal fragments of Htt form insoluble deposits in both the cytoplasm and nucleus, provoking the idea that mutHtt (mutant Htt) causes transcriptional dysfunction. While a number of specific transcription factors and co-factors have been proposed as mediators of mutHtt toxicity, the causal relationship between these Htt/transcription factor interactions and HD pathology remains unknown. Previous work has highlighted REST [RE1 (repressor element 1)-silencing transcription factor] as one such transcription factor. REST is a master regulator of neuronal genes, repressing their expression. Many of its direct target genes are known or suspected to have a role in HD pathogenesis, including BDNF (brain-derived neurotrophic factor). Recent evidence has also shown that REST regulates transcription of regulatory miRNAs (microRNAs), many of which are known to regulate neuronal gene expression and are dysregulated in HD. Thus repression of miRNAs constitutes a second, indirect mechanism by which REST can alter the neuronal transcriptome in HD. We will describe the evidence that disruption to the REST regulon brought about by a loss of interaction between REST and mutHtt may be a key contributory factor in the widespread dysregulation of gene expression in HD.

Abstract:

We test the expectations theory of the term structure of U.S. interest rates in nonlinear systems. These models allow the response of the change in short rates to past values of the spread to depend upon the level of the spread. The nonlinear system is tested against a linear system, and the results of testing the expectations theory in both models are contrasted. We find that the results of tests of the implications of the expectations theory depend on the size and sign of the spread. The long maturity spread predicts future changes of the short rate only when it is high.
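The regime-dependent behaviour described above can be illustrated with a toy threshold regression in which the slope of short-rate changes on the lagged spread differs by spread level. The data, the 0.5 threshold, and the coefficients are synthetic assumptions for illustration, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly data in which the short rate responds to the lagged
# spread only when the spread is high, mimicking the paper's finding.
T = 500
spread = rng.normal(0.0, 1.0, T)
dshort = np.where(spread > 0.5, 0.4 * spread, 0.0) + rng.normal(0.0, 0.2, T)

def ols_slope(x, y):
    """Slope coefficient from an OLS regression of y on x with intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

high = spread > 0.5  # high-spread regime
slope_high = ols_slope(spread[high], dshort[high])
slope_low = ols_slope(spread[~high], dshort[~high])
print(f"slope when spread is high: {slope_high:.2f}")
print(f"slope when spread is low:  {slope_low:.2f}")
```

Estimating the two regimes separately recovers a clearly positive slope only in the high-spread regime, which is the qualitative pattern a linear single-equation test would average away.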

Abstract:

The etiology of colorectal cancer (CRC), a common cause of cancer-related mortality globally, has strong associations with diet. There is considerable epidemiological evidence that fruits and vegetables are associated with reduced risk of CRC. This paper reviews the extensive evidence, both from in vitro studies and animal models, that components of berry fruits can modulate biomarkers of DNA damage and that these effects may be potentially chemoprotective, given the likely role that oxidative damage plays in mutation rate and cancer risk. Human intervention trials with berries are generally consistent in indicating a capacity to significantly decrease oxidative damage to DNA, but represent limited evidence for anticarcinogenicity, relying as they do on surrogate risk markers. To understand the effects of berry consumption on colorectal cancer risk, future studies will need to be well controlled, with defined berry extracts, using suitable and clinically relevant end points and considering the importance of the gut microbiota.

Abstract:

Amplified Arctic warming is expected to have a significant long-term influence on the midlatitude atmospheric circulation by the latter half of the 21st century. Potential influences of recent and near future Arctic changes on shorter timescales are much less clear, despite having received much recent attention in the literature. In this letter, climate models from the recent CMIP5 experiment are analysed for evidence of an influence of Arctic temperatures on midlatitude blocking and cold European winters in particular. The focus is on the variability of these features in detrended data and, in contrast to other studies, limited evidence of an influence is found. The occurrence of cold European winters is found to be largely independent of the temperature variability in the key Barents-Kara Sea region. Positive correlations of the Barents-Kara temperatures with Eurasian blocking are found in some models, but significant correlations are limited.

Abstract:

The IEEE 754 standard for floating-point arithmetic is widely used in computing. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. The IEEE infinities are said to have the behaviour of limits. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. We elucidate the transreal tangent and extend real limits to transreal limits. Arguing from this firm foundation, we maintain that there are three category errors in the IEEE 754 standard. Firstly, the claim that IEEE infinities are limits of real arithmetic confuses limiting processes with arithmetic. Secondly, a defence of IEEE negative zero confuses the limit of a function with the value of a function. Thirdly, the definition of IEEE NaNs confuses undefined with unordered. Furthermore, we prove that the tangent function, with the infinities given by geometrical construction, has a period of an entire rotation, not half a rotation as is commonly understood. This illustrates a category error, confusing the limit with the value of a function, in an important area of applied mathematics: trigonometry. We briefly consider the wider implications of this category error. Another paper proposes transreal arithmetic as a basis for floating-point arithmetic; here we take the profound step of proposing transreal arithmetic as a replacement for real arithmetic to remove the possibility of certain category errors in mathematics. Thus we propose both theoretical and practical advantages of transmathematics. In particular, we argue that implementing transreal analysis in trans-floating-point arithmetic would extend the coverage, accuracy and reliability of almost all computer programs that exploit real analysis: essentially all programs in science and engineering and many in finance, medicine and other socially beneficial applications.
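The IEEE behaviours at issue can be observed directly. Python floats follow IEEE 754 double precision on mainstream platforms, so the unordered NaN, the absorbing infinities, and the distinguishable signed zeros can all be demonstrated. This is a sketch of the standard's semantics, not of transreal arithmetic:

```python
import math

nan, inf = float("nan"), float("inf")

# NaN is unordered: every comparison involving NaN is false, even x == x.
assert not (nan == nan) and not (nan < 0.0) and not (nan > 0.0)

# Infinities absorb finite arithmetic, but inf - inf is undefined (NaN),
# which is where the "behaviour of limits" reading breaks down.
assert inf + 1.0 == inf
assert math.isnan(inf - inf)

# The signed zeros compare equal yet are distinguishable, e.g. by atan2,
# because -0.0 encodes the direction from which zero was approached.
assert 0.0 == -0.0
assert math.atan2(0.0, -0.0) == math.pi
assert math.atan2(0.0, 0.0) == 0.0

print("IEEE 754 special-value checks passed")
```

In transreal arithmetic, by contrast, there is a single unordered nullity instead of many NaN payloads, and no negative zero at all.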

Abstract:

Theoretical estimates for the cutoff errors in the Ewald summation method for dipolar systems are derived. Absolute errors in the total energy, forces and torques, for both the real and reciprocal space parts, are considered. The applicability of the estimates is tested and confirmed in several numerical examples. We demonstrate that these estimates can be used easily in determining the optimal parameters of the dipolar Ewald summation, in the sense that they minimize the computation time for a predefined, user-set accuracy.

Abstract:

Experiments with CO2 instantaneously quadrupled and then held constant are used to show that the relationship between the global-mean net heat input to the climate system and the global-mean surface-air-temperature change is nonlinear in Coupled Model Intercomparison Project phase 5 (CMIP5) Atmosphere-Ocean General Circulation Models (AOGCMs). The nonlinearity is shown to arise from a change in strength of climate feedbacks driven by an evolving pattern of surface warming. In 23 out of the 27 AOGCMs examined the climate feedback parameter becomes significantly (95% confidence) less negative, i.e. the effective climate sensitivity increases, as time passes. Cloud feedback parameters show the largest changes. In the AOGCM mean, approximately 60% of the change in feedback parameter comes from the tropics (30°N-30°S). An important region involved is the tropical Pacific, where the surface warming intensifies in the east after a few decades. The dependence of climate feedbacks on an evolving pattern of surface warming is confirmed using the HadGEM2 and HadCM3 atmosphere GCMs (AGCMs). With monthly evolving sea surface temperatures and sea ice prescribed from its AOGCM counterpart, each AGCM reproduces the time-varying feedbacks, but when a fixed pattern of warming is prescribed the radiative response is linear with global temperature change, or nearly so. We also demonstrate that the regression and fixed-SST methods for evaluating effective radiative forcing are in principle different, because rapid SST adjustment when CO2 is changed can produce a pattern of surface temperature change with zero global mean but a non-zero change in net radiation at the top of the atmosphere (approximately -0.5 W m-2 in HadCM3).
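The regression method mentioned in the final sentence is usually implemented by regressing annual-mean net top-of-atmosphere flux N on global-mean warming dT in the abrupt-CO2 run, reading the forcing off the intercept and the feedback parameter off the slope. A minimal sketch on synthetic data; the forcing and feedback values below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Gregory-style regression: N ~ F + lambda * dT in an abrupt-CO2 run.
# F (intercept) estimates the effective radiative forcing and lambda
# (slope) the feedback parameter, negative for a stable climate.
F_true, lam_true = 7.0, -1.0                  # W m-2 and W m-2 K-1, assumed
dT = np.linspace(0.5, 5.0, 150)               # global-mean warming, K
N = F_true + lam_true * dT + rng.normal(0.0, 0.3, dT.size)

lam_hat, F_hat = np.polyfit(dT, N, 1)         # slope, intercept
print(f"lambda ~ {lam_hat:.2f} W m-2 K-1, ERF ~ {F_hat:.2f} W m-2")
```

A time-varying feedback of the kind the paper reports shows up in this framework as curvature in the N-versus-dT scatter, so a single fitted slope is only an effective value.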

Abstract:

This paper evaluates the current status of global modeling of the organic aerosol (OA) in the troposphere and analyzes the differences between models as well as between models and observations. Thirty-one global chemistry transport models (CTMs) and general circulation models (GCMs) have participated in this intercomparison, in the framework of AeroCom phase II. The simulation of OA varies greatly between models in terms of the magnitude of primary emissions, secondary OA (SOA) formation, the number of OA species used (2 to 62), the complexity of OA parameterizations (gas-particle partitioning, chemical aging, multiphase chemistry, aerosol microphysics), and the OA physical, chemical and optical properties. The diversity of the global OA simulation results has increased since earlier AeroCom experiments, mainly due to the increasing complexity of the SOA parameterization in models, and the implementation of new, highly uncertain, OA sources. Diversity of over one order of magnitude exists in the modeled vertical distribution of OA concentrations that deserves a dedicated future study. Furthermore, although the OA / OC ratio depends on OA sources and atmospheric processing, and is important for model evaluation against OA and OC observations, it is resolved only by a few global models. The median global primary OA (POA) source strength is 56 Tg a−1 (range 34–144 Tg a−1) and the median SOA source strength (natural and anthropogenic) is 19 Tg a−1 (range 13–121 Tg a−1). Among the models that take into account the semi-volatile SOA nature, the median source is calculated to be 51 Tg a−1 (range 16–121 Tg a−1), much larger than the median value of the models that calculate SOA in a more simplistic way (19 Tg a−1; range 13–20 Tg a−1, with one model at 37 Tg a−1). The median atmospheric burden of OA is 1.4 Tg (24 models in the range of 0.6–2.0 Tg and 4 between 2.0 and 3.8 Tg), with a median OA lifetime of 5.4 days (range 3.8–9.6 days). 
In models that reported both OA and sulfate burdens, the median value of the OA/sulfate burden ratio is calculated to be 0.77; 13 models calculate a ratio lower than 1, and 9 models higher than 1. For 26 models that reported OA deposition fluxes, the median wet removal is 70 Tg a−1 (range 28–209 Tg a−1), which is on average 85% of the total OA deposition. Fine aerosol organic carbon (OC) and OA observations from continuous monitoring networks and individual field campaigns have been used for model evaluation. At urban locations, the model–observation comparison indicates missing knowledge on anthropogenic OA sources, both strength and seasonality. The combined model–measurements analysis suggests the existence of increased OA levels during summer due to biogenic SOA formation over large areas of the USA that can be of the same order of magnitude as the POA, even at urban locations, and contribute to the measured urban seasonal pattern. Global models are able to simulate the high secondary character of OA observed in the atmosphere as a result of SOA formation and POA aging, although the amount of OA present in the atmosphere remains largely underestimated, with a mean normalized bias (MNB) equal to −0.62 (−0.51) based on the comparison against OC (OA) urban data of all models at the surface, −0.15 (+0.51) when compared with remote measurements, and −0.30 for marine locations with OC data. The mean temporal correlations across all stations are low when compared with OC (OA) measurements: 0.47 (0.52) for urban stations, 0.39 (0.37) for remote stations, and 0.25 for marine stations with OC data. The combination of high (negative) MNB and higher correlation at urban stations when compared with the low MNB and lower correlation at remote sites suggests that knowledge about the processes that govern aerosol processing, transport and removal, on top of their sources, is important at the remote stations. 
There is no clear change in model skill with increasing model complexity with regard to OC or OA mass concentration. However, the complexity is needed in models in order to distinguish between anthropogenic and natural OA as needed for climate mitigation, and to calculate the impact of OA on climate accurately.
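The mean normalized bias quoted above has a standard form. Assuming MNB = mean((model - obs)/obs), which the abstract does not spell out, a value of -0.62 means the models sit on average roughly 62% below the urban observations:

```python
import numpy as np

def mean_normalized_bias(model, obs):
    """MNB = mean((model - obs) / obs). Assumed definition: the
    abstract does not give the exact formula used in AeroCom."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return float(np.mean((model - obs) / obs))

# Toy example: a model that sits at half the observed concentrations
obs = np.array([2.0, 4.0, 8.0])          # e.g. OC in ug m-3
model = 0.5 * obs
print(mean_normalized_bias(model, obs))  # -0.5
```

Because each term is normalized by its own observation, MNB weights stations equally regardless of concentration, which is one reason it is paired with temporal correlations in the evaluation above.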

Abstract:

The implications of polar cap expansions, contractions and movements for empirical models of high-latitude plasma convection are examined. Some of these models have been generated by directly averaging flow measurements from large numbers of satellite passes or radar scans; others have employed more complex means to combine data taken at different times into large-scale patterns of flow. In all cases, the models have implicitly adopted the assumption that the polar cap is in steady state: they have all characterized the ionospheric flow in terms of the prevailing conditions (e.g. the interplanetary magnetic field and/or some index of terrestrial magnetic activity) without allowance for their history. On long enough time scales, the polar cap is indeed in steady state but on time scales shorter than a few hours it is not and can oscillate in size and position. As a result, the method used to combine the data can influence the nature of the convection reversal boundary and the transpolar voltage in the derived model. This paper discusses a variety of effects due to time-dependence in relation to some ionospheric convection models which are widely applied. The effects are shown to be varied and to depend upon the procedure adopted to compile the model.