898 results for 340402 Econometric and Statistical Methods
Abstract:
Objective: Both neurocognitive impairments and a history of childhood abuse are highly prevalent in patients with schizophrenia. Childhood trauma has been associated with memory impairment as well as hippocampal volume reduction in adult survivors. The aim of this study was to examine the contribution of childhood adversity to verbal memory functioning in people with schizophrenia. Methods: Eighty-five outpatients with a Diagnostic and Statistical Manual of Mental Disorders (Fourth Edition) diagnosis of chronic schizophrenia were separated into two groups on the basis of self-reports of childhood trauma. Performance on measures of episodic narrative memory, list learning, and working memory was then compared using multivariate analysis of covariance. Results: Thirty-eight (45%) participants reported moderate to severe levels of childhood adversity, while 47 (55%) reported no or low levels. After controlling for premorbid IQ and current depressive symptoms, the childhood trauma group showed significantly poorer working memory and episodic narrative memory, whereas list learning was similar between groups. Conclusion: Childhood trauma is an important variable that can contribute to specific ongoing memory impairments in schizophrenia.
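The comparison described above is a multivariate analysis of covariance. As an illustration only, the sketch below runs a MANCOVA-style test on simulated data with three memory outcomes, a trauma-group indicator and two covariates; all variable names and values are hypothetical, not the study's data.

```python
# Hypothetical MANCOVA sketch: multivariate test of a group effect on memory
# scores, adjusting for premorbid IQ and depressive symptoms. Data simulated.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 85
df = pd.DataFrame({
    "trauma": rng.integers(0, 2, n),          # 0 = low/no trauma, 1 = moderate/severe
    "premorbid_iq": rng.normal(100, 15, n),
    "depression": rng.normal(10, 4, n),
})
# Simulated outcome scores (working memory, narrative memory, list learning)
df["working_memory"] = 50 - 5 * df["trauma"] + 0.1 * df["premorbid_iq"] + rng.normal(0, 5, n)
df["narrative_memory"] = 45 - 4 * df["trauma"] + 0.1 * df["premorbid_iq"] + rng.normal(0, 5, n)
df["list_learning"] = 40 + 0.1 * df["premorbid_iq"] + rng.normal(0, 5, n)

# Multivariate test of the group term with the two covariates in the model.
mancova = MANOVA.from_formula(
    "working_memory + narrative_memory + list_learning ~ trauma + premorbid_iq + depression",
    data=df,
)
print(mancova.mv_test())
```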
Abstract:
An optimal search theory, the so-called Lévy-flight foraging hypothesis [1], predicts that predators should adopt search strategies known as Lévy flights where prey is sparse and distributed unpredictably, but that Brownian movement is sufficiently efficient for locating abundant prey [2-4]. Empirical studies have generated controversy because the accuracy of statistical methods that have been used to identify Lévy behaviour has recently been questioned [5,6]. Consequently, whether foragers exhibit Lévy flights in the wild remains unclear. Crucially, moreover, it has not been tested whether observed movement patterns across natural landscapes having different expected resource distributions conform to the theory's central predictions. Here we use maximum-likelihood methods to test for Lévy patterns in relation to environmental gradients in the largest animal movement data set assembled for this purpose. Strong support was found for Lévy search patterns across 14 species of open-ocean predatory fish (sharks, tuna, billfish and ocean sunfish), with some individuals switching between Lévy and Brownian movement as they traversed different habitat types. We tested the spatial occurrence of these two principal patterns and found Lévy behaviour to be associated with less productive waters (sparser prey) and Brownian movements to be associated with productive shelf or convergence-front habitats (abundant prey). These results are consistent with the Lévy-flight foraging hypothesis [1,7], supporting the contention [8,9] that organism search strategies naturally evolved in such a way that they exploit optimal Lévy patterns.
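The abstract does not give the fitting procedure in detail, but a minimal sketch of maximum-likelihood model selection between a power-law (Lévy-like) and an exponential (Brownian-like) step-length distribution, assuming simulated step lengths and a known minimum step x_min, might look like this:

```python
# Minimal sketch (not the authors' code): compare power-law vs exponential
# step-length models by maximum likelihood and AIC. Step data are simulated.
import numpy as np

rng = np.random.default_rng(1)
x_min = 1.0
steps = x_min * (1 - rng.random(2000)) ** (-1.0)   # inverse-CDF sample, power law with mu = 2

# Power-law (Pareto) fit: f(x) = (mu - 1)/x_min * (x/x_min)^(-mu), x >= x_min
log_ratio = np.log(steps / x_min)
mu_hat = 1 + steps.size / log_ratio.sum()
loglik_pl = steps.size * np.log(mu_hat - 1) - steps.size * np.log(x_min) - mu_hat * log_ratio.sum()

# Shifted exponential fit: f(x) = lam * exp(-lam * (x - x_min))
lam_hat = steps.size / (steps - x_min).sum()
loglik_exp = steps.size * np.log(lam_hat) - lam_hat * (steps - x_min).sum()

# One free parameter in each model, so AIC = 2 - 2 * loglik
aic_pl, aic_exp = 2 - 2 * loglik_pl, 2 - 2 * loglik_exp
print(f"mu_hat={mu_hat:.2f}  AIC(power law)={aic_pl:.1f}  AIC(exponential)={aic_exp:.1f}")
print("preferred:", "Levy (power law)" if aic_pl < aic_exp else "Brownian (exponential)")
```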
Abstract:
This article gives an extensive overview of the wide range of analytical procedures developed for the detection of amphenicol antibiotic residues (chloramphenicol, thiamphenicol, and florfenicol) in many different types of foodstuffs (milk, meat, eggs, honey, seafood). Screening methods such as microbial inhibition methods, antibody-based immunoassays using conventional and biosensor-based detection systems, and some methods based on alternative recognition systems are described. The relative advantages and disadvantages of these methods are discussed and compared. The current status and future trends and developments in the need for accurate and rapid detection of this group of antimicrobials are also discussed.
Abstract:
Background: Evidence suggests that in prokaryotes sequence-dependent transcriptional pauses affect the dynamics of transcription and translation, as well as of small genetic circuits. So far, a few pause-prone sequences have been identified from in vitro measurements of transcription elongation kinetics.
Results: Using a stochastic model of gene expression at the nucleotide and codon levels with realistic parameter values, we investigate three different but related questions and present statistical methods for their analysis. First, we show that information from in vivo RNA and protein temporal numbers is sufficient to discriminate between models with and without a pause site in their coding sequence. Second, we demonstrate that it is possible to separate a large variety of models from each other, with pauses of various durations and locations in the template, by means of hierarchical clustering and a random forest classifier. Third, we introduce an approximate likelihood function that allows estimation of the location of a pause site.
Conclusions: This method can aid in detecting unknown pause-prone sequences from temporal measurements of RNA and protein numbers at a genome-wide scale and thus elucidate possible roles that these sequences play in the dynamics of genetic networks and phenotype.
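As a loose illustration of the classification step described in this abstract (not the authors' stochastic model), the sketch below simulates crude RNA traces with and without a hypothetical pause-induced slowdown, extracts simple temporal summary statistics, and trains a random forest classifier on them.

```python
# Illustrative sketch only: toy "pause" vs "no pause" RNA traces, summary
# features, and a cross-validated random forest classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def simulate_rna(pause_delay, t_max=200):
    """Toy birth-death RNA trace; pause_delay crudely slows production."""
    rna, trace = 0, []
    for _ in range(t_max):
        rna += rng.poisson(1.0 / (1.0 + pause_delay))   # production
        rna -= rng.binomial(rna, 0.05)                   # degradation
        trace.append(rna)
    return np.array(trace, dtype=float)

def features(trace):
    # Simple temporal summary statistics used as classifier inputs.
    diffs = np.diff(trace)
    return [trace.mean(), trace.var(), diffs.var(),
            np.corrcoef(trace[:-1], trace[1:])[0, 1]]

X = np.array([features(simulate_rna(pause_delay=d))
              for d in [0.0] * 100 + [0.5] * 100])
y = np.array([0] * 100 + [1] * 100)   # 0 = no pause, 1 = pause

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```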
Abstract:
Heterometallic clusters with strong luminescence have been synthesized (see picture: Au(CCPh)2 yellow-red, Ag2 blue, O red) from the metalloligand unit [Au(CCPh)PPh3] (yellow/red bars) by using both standard solvent-based and solvent-free reactions. The aggregates are stabilized only by acetylide–metal or metal–metal interactions, and their nuclearity is controlled through the addition of different donor ligands.
Abstract:
Background - The study of the corneal endothelium by specular microscopy in patients with anterior uveitis has largely been restricted to observations on the endothelial cells. In this prospective study 'keratic precipitates' (KP) in different types of uveitis were examined at different stages of the disease process, and the endothelial changes occurring in the vicinity of the KP were evaluated in comparison with the endothelium of the uninvolved eye. Methods - Thirteen patients with active unilateral uveitis were recruited. The mean age was 42.9 years (range 20-76 years). A Tomey-1100 contact wide-field specular (x10) microscope was used to capture endothelial images and KP until the resolution of uveitis. Data regarding the type of uveitis and the number, size, and nature of KP were recorded. Automated morphometric analysis was performed for cell size, cell density, and coefficient of variation, and statistical comparisons of cell size and cell density were made (Student's t test) between the endothelium in the vicinity of fresh and resolving KP, fresh KP and normal endothelium, and resolving KP and normal endothelium. Results - On specular microscopy, fresh KP were seen as dense, white glistening deposits 5-10 endothelial cells in diameter, while fine KP were widely distributed and one or two endothelial cells in diameter. The KP in Posner-Schlossman syndrome had a distinct and different morphology. With clinical remission of uveitis, the KP were observed to undergo characteristic morphological changes: old KP demonstrated a large, dark halo surrounding a central white deposit, and occasionally a dark shadow or a 'lacuna' replaced the site of the original KP. Endothelial blebs were noted as dark shadows or defects in the endothelial mosaic in patients with recurrent uveitis. There was a statistically significant difference in mean cell size and cell density of endothelial cells in the vicinity of fresh KP compared with the normal endothelium of the opposite eye. Conclusion - This study elucidated the different specular microscopic features of KP in anterior uveitis. Distinct morphological features of large and fine KP were noted, and these features underwent dramatic changes on resolution of uveitis. The endothelium was abnormal in the vicinity of KP and returned to near-normal values on resolution of uveitis.
Abstract:
High-dimensional gene expression data provide a rich source of information because they capture the expression level of genes in dynamic states that reflect the biological functioning of a cell. For this reason, such data are suitable for revealing systems-level properties of a cell, e.g., in order to elucidate the molecular mechanisms of complex diseases like breast or prostate cancer. However, this depends strongly not only on the sample size and the correlation structure of a data set, but also on the statistical hypotheses tested. Many different approaches have been developed over the years to analyze gene expression data in order to (I) identify changes in single genes, (II) identify changes in gene sets or pathways, and (III) identify changes in the correlation structure within pathways. In this paper, we review statistical methods for all three types of approaches, including subtypes, in the context of cancer data, provide links to software implementations and tools, and also address the general problem of multiple hypothesis testing. Further, we provide recommendations for the selection of such analysis methods.
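For approach (I), a common minimal workflow is a per-gene two-sample test followed by a multiple-testing correction. The sketch below, on simulated expression values, uses t tests with Benjamini-Hochberg false-discovery-rate control; it is a generic illustration rather than any specific method reviewed in the paper.

```python
# Generic sketch of single-gene differential expression testing with FDR control.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n_genes, n_cases, n_controls = 5000, 20, 20
cases = rng.normal(0, 1, (n_genes, n_cases))
controls = rng.normal(0, 1, (n_genes, n_controls))
cases[:100] += 1.5                      # 100 genes truly differentially expressed

_, pvals = stats.ttest_ind(cases, controls, axis=1)   # one test per gene
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("genes called significant at FDR 5%:", reject.sum())
```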
Abstract:
The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources. A simple process allowing the calculation of these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) for six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDF). The ECDF method was most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was influenced by the presence of skewed distributions and outliers. In contrast, the ULBL methodology was found to calculate more appropriate TTVs, which were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each element poses in these areas to determine the potential risk to receptors.
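The NBC and ULBL calculations follow specific published methodologies that are not reproduced here; as a generic illustration of the ECDF-based reasoning, the sketch below derives a percentile-style upper threshold per (simulated) geological domain.

```python
# Generic sketch only: inspect an element's ECDF within each domain and read
# off an illustrative upper threshold. Concentrations and domains are simulated.
import numpy as np

rng = np.random.default_rng(4)
domains = {
    "basalt": rng.lognormal(mean=3.0, sigma=0.4, size=500),   # e.g. nickel, mg/kg
    "granite": rng.lognormal(mean=2.2, sigma=0.5, size=500),
}

def ecdf(values):
    x = np.sort(values)
    return x, np.arange(1, x.size + 1) / x.size

for name, conc in domains.items():
    x, f = ecdf(conc)
    upper = x[np.searchsorted(f, 0.95)]   # concentration where the ECDF reaches 0.95
    print(f"{name}: median={np.median(conc):.1f} mg/kg, "
          f"ECDF-based upper threshold={upper:.1f} mg/kg")
```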
Abstract:
In recent years, the issue of life expectancy has become of utmost importance to pension providers, insurance companies and government bodies in the developed world. Significant and consistent improvements in mortality rates and, hence, life expectancy have led to unprecedented increases in the cost of providing for older ages. This has resulted in an explosion of stochastic mortality models forecasting trends in mortality data in order to anticipate future life expectancy and, hence, quantify the costs of providing for future aging populations. Many stochastic models of mortality rates identify linear trends in mortality by time, age and cohort, and forecast these trends into the future using standard statistical methods. Such approaches, however, fail to capture the effects of any structural change in the trend and can therefore produce incorrect forecasts of future mortality rates. In this paper, we examine a range of leading stochastic models of mortality and test for structural breaks in their trend time series.
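As an illustration of testing for a structural break in a linear mortality trend (not the paper's exact procedure), the sketch below applies a Chow-type F test at an assumed break year to a simulated mortality index series.

```python
# Hedged sketch: Chow-type F test for a break at a known year in a linear trend
# fitted to a simulated mortality index; the paper's actual tests may differ.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1961, 2011)
kappa = np.where(years < 1985, -0.01 * (years - 1961),         # slow decline
                 -0.24 - 0.03 * (years - 1985))                # faster decline
kappa = kappa + rng.normal(0, 0.02, years.size)                # noise

def rss_linear(t, y):
    slope, intercept, *_ = stats.linregress(t, y)
    return np.sum((y - (intercept + slope * t)) ** 2)

break_year = 1985
pre, post = years < break_year, years >= break_year
rss_full = rss_linear(years, kappa)
rss_split = rss_linear(years[pre], kappa[pre]) + rss_linear(years[post], kappa[post])

k, n = 2, years.size                   # parameters per segment (intercept, slope)
F = ((rss_full - rss_split) / k) / (rss_split / (n - 2 * k))
p = stats.f.sf(F, k, n - 2 * k)
print(f"Chow F = {F:.2f}, p = {p:.2g}")
```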
Abstract:
The performances of four LC-MS/MS methodologies for the determination of up to eight mycotoxin biomarkers in human urine were compared by three laboratories that analysed common urine samples spiked at two levels of each biomarker. Each laboratory received a calibration solution, spiked urines and the corresponding unspiked urine. The two spiking levels for each biomarker were chosen by considering the levels naturally occurring in human urine and the limits of quantification of the LC-MS/MS methodologies used by the participating laboratories. The results of each laboratory were evaluated by their z-score values. Unsatisfactory z-scores (|z| > 2) were obtained for fumonisin B-1 (7/12 results), ochratoxin A (4/8 results) and alpha-zearalenol (1/8 results). The percentage of satisfactory z-scores increased from 42 to 83% for fumonisin B-1 and from 50 to 62% for ochratoxin A when laboratories 1 and 2 used their own calibrants. Factors that could explain the different results obtained for fumonisin B-1 and ochratoxin A with the provided and the laboratories' own calibration solutions could not be identified in this study and should be carefully investigated in future studies.
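For reference, the z-score evaluation used in this kind of interlaboratory comparison is z = (x_lab - x_assigned) / sigma_p, with |z| <= 2 usually deemed satisfactory. The sketch below applies this to invented numbers; the assigned value and sigma_p are placeholders, not values from the study.

```python
# Minimal z-score evaluation sketch with invented values.
import numpy as np

x_assigned = 2.0        # assigned biomarker concentration (ng/mL, hypothetical)
sigma_p = 0.4           # standard deviation for proficiency assessment (hypothetical)
lab_results = np.array([1.9, 2.6, 1.1, 2.3])   # hypothetical reported values

z = (lab_results - x_assigned) / sigma_p
satisfactory = np.abs(z) <= 2
print("z-scores:", np.round(z, 2))
print("satisfactory:", satisfactory.sum(), "of", z.size)
```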
Abstract:
Objective. To ascertain the goal orientations of pharmacy students and to establish whether goal orientation is associated with academic performance, gender, or year of study. Methods. Goal orientations were assessed using a validated questionnaire. Respondents were categorized as high or low performers based on university grades. Associations and statistical significance were ascertained using parametric and nonparametric tests and linear regression, as appropriate. Results. A response rate of 60.7% was obtained. High performers were more likely to be female than male. The highest mean score was for the mastery-approach orientation; the lowest was for work avoidance. The mean score for work avoidance was significantly greater for low performers than for high performers, and for males than for females. First-year students were most likely to have top scores in mastery and performance approaches. Conclusion. It is encouraging that the highest mean score was for the mastery-approach orientation, as goal orientation may play a role in the academic performance of pharmacy students.
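A minimal sketch of the kinds of comparisons described (parametric t test, nonparametric Mann-Whitney U test, and simple linear regression), run on simulated goal-orientation scores rather than the study's data:

```python
# Hedged sketch: group comparisons and a simple regression on simulated scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
high = rng.normal(2.0, 0.8, 120)     # work-avoidance scores, high performers (simulated)
low = rng.normal(2.5, 0.8, 100)      # work-avoidance scores, low performers (simulated)

t_stat, t_p = stats.ttest_ind(high, low)
u_stat, u_p = stats.mannwhitneyu(high, low, alternative="two-sided")
print(f"t test p={t_p:.3g}, Mann-Whitney p={u_p:.3g}")

scores = np.concatenate([high, low])
grades = 70 - 3 * scores + rng.normal(0, 5, scores.size)   # simulated grades
res = stats.linregress(scores, grades)
print(f"regression slope={res.slope:.2f}, p={res.pvalue:.3g}")
```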
Abstract:
Hulun Lake, China's fifth-largest inland lake, experienced severe declines in water level between 2000 and 2010, prompting concerns that the lake is gradually drying up. A multi-million US dollar engineering project to construct a channel transferring part of the flow of a nearby river to maintain the water level was completed in August 2010. This study aimed to advance the understanding of the key processes controlling the lake's water level variation over the last five decades, and to investigate the impact of the river transfer project on the water level. A water balance model was developed to investigate the lake water level variations over the last five decades, using hydrological and climatic data as well as satellite-based measurements and results from land surface modelling. The investigation reveals that the severe reduction of river discharge (-364±64 mm/yr, ~70% of the five-decade average) into the lake was the key factor behind the decline of the lake water level between 2000 and 2010. The decline of river discharge was due to the reduction of total runoff from the lake watershed, which in turn resulted from the reduction of soil moisture caused by decreasing precipitation (-49±45 mm/yr) over this period. The water budget calculation suggests that the groundwater component from the surrounding lake area, together with surface runoff from the un-gauged area surrounding the lake, contributed a net inflow of ~210 Mm³/yr (equivalent to ~100 mm/yr) to the lake. The results also show that the water diversion project did prevent a further water level decline of over 0.5 m by the end of 2012. Overall, the monthly water balance model gave an excellent prediction of the lake water level fluctuation over the last five decades and can be a useful tool for managing lake water resources in the future.
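A conceptual sketch of a monthly lake water balance of the kind described, assuming a constant lake area and invented placeholder forcings (the real model's terms and parameters are not given in the abstract):

```python
# Conceptual monthly water balance sketch: dV = (P - E) * A + Q_river + Q_ungauged,
# with the level change back-calculated from an assumed constant lake area.
import numpy as np

area_km2 = 2000.0                           # assumed constant lake surface area
area_m2 = area_km2 * 1e6
months = 24
rng = np.random.default_rng(7)

precip_mm = rng.uniform(5, 60, months)      # monthly precipitation on the lake (placeholder)
evap_mm = rng.uniform(20, 120, months)      # monthly open-water evaporation (placeholder)
q_river_mm3 = rng.uniform(0, 120, months)   # river inflow, million m^3 per month (placeholder)
q_ungauged_mm3 = np.full(months, 210 / 12)  # net ungauged/groundwater term, Mm^3 per month

level = np.empty(months + 1)
level[0] = 545.0                            # initial water level, m (placeholder)
for t in range(months):
    dV_m3 = ((precip_mm[t] - evap_mm[t]) / 1000.0) * area_m2 \
            + (q_river_mm3[t] + q_ungauged_mm3[t]) * 1e6
    level[t + 1] = level[t] + dV_m3 / area_m2   # change in stage, m
print("simulated level change over the period: %.2f m" % (level[-1] - level[0]))
```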
Abstract:
BACKGROUND: We conducted a systematic review on the management of psychogenic cough, habit cough, and tic cough to update the recommendations and suggestions of the 2006 guideline on this topic.
METHODS: We followed the American College of Chest Physicians (CHEST) methodologic guidelines and the Grading of Recommendations, Assessment, Development, and Evaluation framework. The Expert Cough Panel based their recommendations on data from the systematic review, patients' values and preferences, and the clinical context. Final grading was reached by consensus according to Delphi methodology.
RESULTS: The results of the systematic review revealed only low-quality evidence to support how to define or diagnose psychogenic or habit cough with no validated diagnostic criteria. With respect to treatment, low-quality evidence allowed the committee to only suggest therapy for children believed to have psychogenic cough. Such therapy might consist of nonpharmacologic trials of hypnosis or suggestion therapy, or combinations of reassurance, counseling, and referral to a psychologist, psychotherapy, and appropriate psychotropic medications. Based on multiple resources and contemporary psychologic, psychiatric, and neurologic criteria (Diagnostic and Statistical Manual of Mental Disorders, 5th edition and tic disorder guidelines), the committee suggests that the terms psychogenic and habit cough are out of date and inaccurate.
CONCLUSIONS: Compared with the 2006 CHEST Cough Guidelines, the major change in suggestions is that the terms psychogenic and habit cough be abandoned in favor of somatic cough syndrome and tic cough, respectively, even though the evidence to do so at this time is of low quality.
Abstract:
Background: High risk medications are commonly prescribed to older US patients. Currently, less is known about high risk medication prescribing in other Western countries, including the UK. We measured trends and correlates of high risk medication prescribing in a subset of the older UK population (community-dwelling and institutionalized) to inform harm-minimization efforts. Methods: Three cross-sectional samples were drawn from primary care electronic clinical records (UK Clinical Practice Research Datalink, CPRD) for fiscal years 2003/04, 2007/08 and 2011/12. This yielded a sample of 13,900 people aged 65 years or over from 504 UK general practices. High risk medications were defined by the 2012 Beers Criteria adapted for the UK. Using descriptive statistical methods and regression modelling, the prevalence and correlates of 'any' (drugs prescribed at least once per year) and 'long-term' (drugs prescribed in all quarters of the year) high risk medication prescribing were determined. Results: While polypharmacy rates have risen sharply, high risk medication prevalence has remained stable across a decade. A third of older (65+) people are exposed to high risk medications, but only half of the total prevalence was long-term (any = 38.4 % [95 % CI: 36.3, 40.5]; long-term = 17.4 % [15.9, 19.9] in 2011/12). Long-term, but not any, high risk medication exposure was associated with older age (85 years or over). Women and people with a higher polypharmacy burden were at greater risk of exposure; lower socio-economic status was not associated. Ten drugs/drug classes accounted for most high risk medication prescribing in 2011/12. Conclusions: High risk medication prescribing has not increased over time against a background of increasing polypharmacy in the UK. Half of patients receiving high risk medications do so for less than a year. Reducing or optimising the use of a limited number of drugs could dramatically reduce high risk medications in older people. Further research is needed to investigate why the oldest old and women are at greater risk. Interventions to reduce high risk medications may need to target short-term and long-term use separately.
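As a hedged illustration of the descriptive and regression steps mentioned (not the study's code), the sketch below computes a Wilson confidence interval for prevalence and fits a logistic regression of exposure on age, sex and polypharmacy, using simulated data:

```python
# Hedged sketch: prevalence with a Wilson CI and a logistic regression of
# high-risk medication exposure on simulated covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.proportion import proportion_confint

rng = np.random.default_rng(8)
n = 13900
df = pd.DataFrame({
    "age": rng.integers(65, 95, n),
    "female": rng.integers(0, 2, n),
    "n_drugs": rng.poisson(6, n),
})
logit_p = -3.0 + 0.01 * (df["age"] - 65) + 0.3 * df["female"] + 0.25 * df["n_drugs"]
df["exposed"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

low, high = proportion_confint(df["exposed"].sum(), n, method="wilson")
print(f"prevalence = {df['exposed'].mean():.1%} (95% CI {low:.1%}-{high:.1%})")

model = smf.logit("exposed ~ age + female + n_drugs", data=df).fit(disp=False)
print(np.exp(model.params))     # odds ratios for each covariate
```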